CN111735479A - Multi-sensor combined calibration device and method - Google Patents
- Publication number
- CN111735479A CN111735479A CN202010881818.4A CN202010881818A CN111735479A CN 111735479 A CN111735479 A CN 111735479A CN 202010881818 A CN202010881818 A CN 202010881818A CN 111735479 A CN111735479 A CN 111735479A
- Authority
- CN
- China
- Prior art keywords
- calibration
- camera
- laser radar
- calibration plate
- plate
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G01C25/00 — Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass (G — Physics; G01 — Measuring; Testing; G01C — Measuring distances, levels or bearings; surveying; navigation; gyroscopic instruments; photogrammetry or videogrammetry)
- G01C25/005 — Initial alignment, calibration or starting-up of inertial devices
- G01S7/497 — Means for monitoring or calibrating (G01S7/48 — details of systems according to group G01S17/00)
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration (G06T7/00 — Image analysis)
- Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention provides a multi-sensor joint calibration device and method, relating to multi-sensor calibration technology, and addresses the problem of jointly calibrating multiple sensors in the prior art. The device comprises a mechanical arm carrying a sensor fusion frame, on which a laser radar, a monocular camera and a computer for data processing are mounted, together with a laser radar-camera joint calibration target made up of four calibration plates, numbered 1 to 4. The centers of calibration plates No. 1 to No. 3 are marked with dots that provide feature points for extrinsic calibration. Embodiments of the invention build a portable multi-sensor fusion frame that is convenient for calibration and secondary development, and use a mechanical-arm-assisted calibration method that enables intelligent and batch calibration.
Description
Technical Field
The invention belongs to the technical field of sensors, and particularly relates to a multi-sensor calibration technology.
Background
Simultaneous Localization and Mapping (SLAM) technology provides environment perception for unmanned driving. Traditional SLAM is divided into laser SLAM and visual SLAM. Laser radar offers high ranging precision and is unaffected by lighting conditions, while cameras are low-cost and capture rich image information. However, single-sensor SLAM has serious limitations: laser radar has a slow update rate, suffers from motion distortion, and cannot provide accurate measurements in severe conditions such as rain and snow, while a camera cannot obtain accurate three-dimensional information and is strongly affected by ambient light.
An inertial navigation system can provide accurate acceleration and angular velocity measurements as an auxiliary tool for pose estimation. Fusing data from laser radar, visual sensors and an inertial navigation system can therefore improve the environment perception capability of SLAM.
Sensor calibration in a SLAM system is divided into intrinsic calibration and extrinsic calibration. Intrinsic calibration mainly refers to computing the intrinsic matrix of the camera and the error coefficients of the inertial navigation system, so that the sensor measurements themselves are accurate. Extrinsic calibration between sensors, i.e., determining the pose transformation between the sensor coordinate systems, is a prerequisite for accurate multi-sensor information fusion.
The traditional extrinsic calibration method takes the vehicle body coordinate system as the reference and seeks the pose transformations from the laser radar, camera and inertial navigation coordinate systems to the vehicle body coordinate system. However, in traditional extrinsic calibration all sensors are fixed to the vehicle body, which is limited in its dimensions of motion; the calibration process is cumbersome, and accurate calibration of the yaw and roll angles is difficult to achieve.
Laser radar has high ranging precision and imposes no special requirement on a calibration reference object, but a camera works from two-dimensional image features and can only be calibrated with a specific calibration target. Existing extrinsic calibration methods for laser radar and camera include methods based on a single chessboard calibration plate, on an L-shaped calibration plate, and on a 3D chessboard target. These differ only in detail: all solve for the extrinsic parameter matrix by matching laser 3D feature points with camera 2D feature points.
Calibration of the laser radar and the inertial navigation system must be carried out under motion, with the inertial navigation system providing accurate acceleration and angular velocity measurements. The traditional approach is hand-eye calibration, but its precision is hard to guarantee; the Baidu Apollo calibration tool drives the vehicle along a figure-eight path to collect sensor data and calibrate the extrinsic parameters.
Multi-sensor joint calibration is one of the most active topics in the field of unmanned driving, and existing calibration techniques suffer from low automation, complicated operation and limited accuracy.
Disclosure of Invention
The invention aims to provide a multi-sensor combined calibration device and a multi-sensor combined calibration method aiming at the problems in the prior art.
The purpose of the invention can be realized by the following technical scheme: a multi-sensor joint calibration device comprises a mechanical arm on which a sensor fusion frame is mounted; a laser radar, a monocular camera and a computer for data processing are arranged on the sensor fusion frame. The device further comprises a laser radar-camera joint calibration target of four calibration plates, numbered 1 to 4. The centers of calibration plates No. 1 to No. 3 are marked with dots that provide feature points for extrinsic calibration; calibration plate No. 4 carries a black-and-white checkerboard pattern used for intrinsic calibration of the camera. The four plates are arranged in the shape of the Chinese character 田 (tian): plates No. 1 and No. 2 are placed side by side, plates No. 3 and No. 4 are placed side by side in front of them, and plates No. 3 and No. 4 are lower than plates No. 1 and No. 2. The angle between the surface normal vector n3 of plate No. 3 and the normal vector n4 of plate No. 4 is greater than 30°, and the angle between the normal vector n1 of plate No. 1 and the normal vector n2 of plate No. 2 is likewise greater than 30°.
A multi-sensor joint calibration method comprises intrinsic calibration of the camera and extrinsic calibration of the laser radar-camera pair. For intrinsic calibration, the computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different poses, and the camera intrinsic matrix K is computed with Zhang Zhengyou's calibration method. The extrinsic calibration method for the laser radar-camera pair comprises the following steps:
step one: the computer unit adjusts the pose of the mechanical arm so that the calibration target appears in the field of view of both the laser radar and the camera; the mechanical arm is held stationary while the laser radar and camera collect data;
step two: analyze the laser radar data and extract point cloud feature points. Record the coordinates of every point, filter out abnormal points, and segment the point cloud with a point cloud segmentation method so that the returns from the four calibration plates fall into four different groups P1, P2, P3 and P4. Extract the cluster center of each group with the K-means method; p_i = (x_i, y_i, z_i) is the three-dimensional coordinate of the center of the i-th calibration plate in the laser radar coordinate system, and these centers are selected as the laser feature points for matching;
step three: analyze the camera data and extract visual feature points. Record the gray value of every pixel and use the FAST keypoint extraction algorithm to detect locations where the local gray level of a calibration plate changes sharply, thereby extracting the center of each plate. For plates No. 1 to No. 3 the dot centers are extracted and their coordinates q1, q2, q3 recorded; the center of plate No. 4 is obtained by analyzing the checkerboard pattern. q_i = (u_i, v_i) is the two-dimensional pixel coordinate of the center of the i-th calibration plate in the camera image, and these centers are selected as the visual feature points for matching;
step four: establish the matching relation between the laser radar feature points p_i and the camera feature points q_i. From the matched pairs (p_i, q_i), build the minimum reprojection error, form the error equation, and solve the resulting least squares problem to obtain the optimal extrinsic parameter matrix T_LC.
In some embodiments, in addition to the feature points at the plate centers, the corner positions of the calibration plates are also collected as feature points.
In some embodiments, intrinsic calibration of the camera uses the checkerboard method: the computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard plate in different poses; corner information is extracted from 20 to 30 pictures, and the camera intrinsic matrix K is computed with Zhang Zhengyou's calibration method.
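The linear core of Zhang Zhengyou's method can be sketched briefly: each view's homography H = K·[r1 r2 t] contributes two linear constraints on B = K^(-T) K^(-1), B is solved by SVD, and K is recovered from B by a triangular factorization. The NumPy sketch below is illustrative only — the function names and the `img_size` conditioning parameter are our assumptions, not the patent's, and it operates on synthetic, noise-free homographies; a real pipeline would first estimate the homographies from detected checkerboard corners and then refine all parameters nonlinearly:

```python
import numpy as np

def _v(H, i, j):
    # Zhang's constraint vector v_ij built from homography columns i and j,
    # ordered to match b = (B11, B12, B22, B13, B23, B33).
    a, b = H[:, i], H[:, j]
    return np.array([a[0]*b[0],
                     a[0]*b[1] + a[1]*b[0],
                     a[1]*b[1],
                     a[2]*b[0] + a[0]*b[2],
                     a[2]*b[1] + a[1]*b[2],
                     a[2]*b[2]])

def intrinsics_from_homographies(Hs, img_size=(640, 480)):
    # Condition the pixel coordinates first (Hartley-style); without this
    # the linear system is badly scaled for realistic focal lengths.
    w, h = img_size
    N = np.array([[2.0/w, 0.0, -1.0], [0.0, 2.0/h, -1.0], [0.0, 0.0, 1.0]])
    V = []
    for H in Hs:
        Hn = N @ H
        V.append(_v(Hn, 0, 1))                  # h1^T B h2 = 0
        V.append(_v(Hn, 0, 0) - _v(Hn, 1, 1))   # h1^T B h1 = h2^T B h2
    b = np.linalg.svd(np.array(V))[2][-1]       # null vector of V
    B = np.array([[b[0], b[1], b[3]],
                  [b[1], b[2], b[4]],
                  [b[3], b[4], b[5]]])
    if B[0, 0] < 0:                             # B is defined only up to sign
        B = -B
    # B = K'^-T K'^-1, so K'^-1 is the transposed Cholesky factor of B.
    L = np.linalg.cholesky(B)
    Kn = np.linalg.inv(L.T)
    Kn /= Kn[2, 2]
    return np.linalg.inv(N) @ Kn                # undo the conditioning
```

With 20 to 30 real views, as the text suggests, this linear estimate would serve only as the starting point for nonlinear refinement of K and the lens distortion coefficients.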
In some embodiments, an inertial navigation system is also arranged on the sensor fusion frame, and the joint calibration method includes a laser radar-inertial navigation joint calibration method. Define t_0 as the moment the mechanical arm starts moving and t_n as the moment it stops; P_k is the laser point cloud scanned at time t_k; L_k is the laser radar coordinate system and I_k the inertial navigation coordinate system at time t_k. The pose transformation matrix of the laser radar from t_0 to t_n is T_L, the pose transformation matrix of the inertial navigation system from t_0 to t_n is T_I, and the extrinsic parameter matrix between the laser radar and the inertial navigation system is T_LI;
The method comprises the following steps:
step one: the mechanical arm moves along a specified trajectory, and during its motion the laser radar and the inertial navigation system acquire data;
step two: stop the mechanical arm and process the collected data. For the laser radar, remove the motion distortion of every laser frame according to a uniform motion model; remove outliers with a point cloud filtering algorithm; and, to reduce computational complexity, downsample every laser frame with a voxel grid method;
step three: compute the pose transformation matrix T_L between the laser radar coordinate system L_0 at the initial time t_0 and the coordinate system L_n at the end of motion t_n. T_L is computed as follows: match the point cloud of frame k−1 against the point cloud of frame k with the iterative closest point method to obtain their correspondence; the pose change of the laser radar from frame k−1 to frame k consists of a rotation matrix R_k and a translation vector t_k, which express the transformation between the two frames of point clouds; construct the error equation, convert it into a least squares problem, and solve for R_k and t_k with the SVD method, giving the per-frame transformation T_k. Multiplying the n per-frame transformation matrices together yields the laser radar pose transformation T_L from t_0 to t_n;
step four: compute the pose transformation matrix T_I between the inertial navigation coordinate system I_0 at the initial time t_0 and the coordinate system I_n at the end of motion t_n. T_I is obtained by integrating the acceleration and angular velocity measurements of the inertial navigation system to get the displacement p and the rotation R, from which the pose transformation T_I of the inertial navigation system from t_0 to t_n is assembled;
step five: through T_L, project the laser point cloud P_n at time t_n into the coordinate system L_0 at time t_0, obtaining the point cloud P'_0;
step six: through T_I and the parameter T_LI to be calibrated, project the laser point cloud P_n at time t_n into the coordinate system at time t_0, obtaining P''_0;
step seven: match the two point clouds P'_0 and P''_0 and optimize the extrinsic matrix T_LI by aligning them. Using the iterative closest point method, register P'_0 and P''_0, which describe the same piece of point cloud, construct and optimize the nearest-neighbor error, and solve the extrinsic parameter matrix T_LI from T_L and T_I.
In some embodiments, under sufficient visual conditions, the observations of the monocular camera are recorded and a visual-IMU calibration tool is used to compute the extrinsic matrix T_CI between the monocular camera and the inertial navigation system. The three determined extrinsic matrices T_LC, T_LI and T_CI are jointly checked for pose consistency, and the transformation parameters in the extrinsic matrices are adjusted, improving the precision of the multi-sensor joint calibration. Parameter optimization uses an online adjustment method that fuses the laser radar, monocular camera and inertial navigation data to adjust T_LC, T_LI and T_CI.
Compared with the prior art, the multi-sensor combined calibration device and method have the following advantages:
the invention is suitable for vision sensors such as 16-line, 32-line, 64-line and other multi-line laser radars, monocular cameras, binocular cameras, RGBD cameras and the like; the embodiment of the invention builds a portable multi-sensor fusion framework, thereby being convenient for calibration and secondary development; the embodiment of the invention uses a mechanical arm auxiliary calibration method, and can realize intelligent calibration and batch calibration.
Description of the drawings:
in the drawings, which are not necessarily drawn to scale, like reference numerals may describe similar components in different views. Like reference numerals having different letter suffixes may represent different examples of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed herein.
FIG. 1 is a schematic view of a robotic arm provided with a sensor fusion frame;
FIG. 2 is a schematic diagram of a multi-sensor fusion framework;
FIG. 3 is a schematic diagram of a position relationship of a combined calibration target of four calibration plates of a laser radar-camera;
FIG. 4 is a schematic view of the spatial arrangement of calibration targets;
FIG. 5 is a schematic diagram of a lidar-camera joint calibration;
FIG. 6 is a schematic diagram of a feature point distribution;
FIG. 7 is a schematic illustration of laser-visual feature matching;
FIG. 8 is a schematic diagram of laser radar pose transformation;
FIG. 9 is a schematic diagram of inertial navigation system pose transformation;
FIG. 10 is a schematic diagram of a calibration flow.
In the figure: 101. a sensor fusion framework; 102. a mechanical arm; 103. a connecting mechanism; 104. a computer unit; 105. an operation table; 201. a laser radar; 202. a monocular camera; 203. an inertial navigation system; 204. a metal frame; 205. a fixing device; 206. a clamp.
Detailed Description
The following are specific examples of the present invention, and the technical solutions of the present invention are further described with reference to the drawings, but the present invention is not limited to these examples, and the following embodiments do not limit the invention according to the claims. Moreover, all combinations of features described in the embodiments are not necessarily essential to the solution of the invention.
Examples
A multi-sensor joint calibration device comprises a multi-sensor fusion frame 101, a mechanical arm 102, a connecting mechanism 103, a computer unit 104 and an operation table 105. Using a multi-sensor combined construction method, the fusion frame 101 fixes a laser radar 201, a monocular camera 202 and an inertial navigation system 203 beneath a freely movable metal frame 204; through a clamp 206 the frame can be mounted on environment-sensing platforms such as unmanned vehicles and unmanned aerial vehicles, making it suitable for secondary development and convenient for extrinsic calibration. The metal frame is 18 cm long, 6 cm wide and 8 cm high, and the sensors are installed pointing along the same coordinate system. The laser radar 201 is mounted at the center of the top of the metal frame 204, with a fixing device 205 inside the frame; the monocular camera 202 is installed 5 cm to the left of the fixing device 205, and the inertial navigation system 203 is installed 5 cm to its right. The mechanical arm 102 is arranged above the operation table 105 and provides translation and rotation along three axes; its end is connected to the fixing device 205 through the connecting mechanism 103, and in the initial state the multi-sensor fusion frame 101 is 140 cm above the ground. The computer unit 104 controls the motion of the mechanical arm 102, controls the sensors to collect data, processes the sensor data, and computes the extrinsic parameter matrices.
The system also comprises a laser radar-camera four-plate joint calibration target composed of four calibration plates of the same size, 30 cm × 30 cm. As shown in figure 3, the centers of plates No. 1 to No. 3 are marked with dots that provide feature points for extrinsic calibration, while plate No. 4 carries a black-and-white checkerboard pattern for intrinsic calibration of the camera. The four plates are arranged in the air in the shape of the Chinese character 田 (tian). To separate the four plates more reliably, they are distributed in the environment at different distances and angles, as shown in figure 4: plates No. 3 and No. 4 are fixed on bases 120 cm high placed on horizontal line one, 130 cm from the multi-sensor fusion frame, with the angle between the normal vector n3 of plate No. 3 and the normal vector n4 of plate No. 4 greater than 30°; plates No. 1 and No. 2 are fixed on bases 160 cm high placed on horizontal line two, 180 cm from the sensor fusion frame, ensuring that plates No. 1 and No. 2 are not occluded, with the angle between the normal vector n1 of plate No. 1 and the normal vector n2 of plate No. 2 greater than 30°.
The computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different poses; corner information is extracted from 20 to 30 pictures, and the camera intrinsic matrix K is computed with Zhang Zhengyou's calibration method.
The laser radar-camera four-calibration-plate combined calibration target is used for calibrating internal parameters of a monocular camera and calibrating external parameters of the laser radar-camera; the combined calibration method of the laser radar and the monocular camera comprises the following steps:
step one: the computer unit adjusts the pose of the mechanical arm, as shown in fig. 5, so that the joint calibration target appears in the field of view of both the laser radar and the camera; the mechanical arm is held stationary while the laser radar and the camera collect data; the computer unit then processes the measurements; finally a least squares problem is constructed to solve the extrinsic parameter matrix T_LC;
step two: from the collected data, the laser radar data and the camera data are processed separately. For the laser radar data, record the coordinates of each point, filter out abnormal points, and segment the point cloud with a point cloud segmentation method so that the returns of the four calibration plates fall into four different groups P1, P2, P3 and P4; extract the cluster center of each group with the K-means method. p_i = (x_i, y_i, z_i) is the three-dimensional coordinate of the center of the i-th calibration plate in the laser radar coordinate system, and these are the laser feature points selected for matching;
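The per-plate cluster centers described in this step can be sketched with a plain Lloyd's-algorithm K-means. The helper below is ours, not the patent's; it uses a farthest-point initialization, which is reliable when the clusters are well separated, as the four plates are arranged to be:

```python
import numpy as np

def kmeans_centers(points, k, iters=100):
    # Farthest-point initialization: start from the first point, then
    # repeatedly add the point farthest from all chosen centers.
    centers = [points[0]]
    for _ in range(k - 1):
        d = np.min(np.linalg.norm(points[:, None, :] - np.asarray(centers)[None, :, :],
                                  axis=2), axis=1)
        centers.append(points[d.argmax()])
    centers = np.asarray(centers, dtype=float)
    # Lloyd iterations: assign each point to its nearest center, then
    # move every center to the centroid of its members.
    for _ in range(iters):
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = centers.copy()
        for j in range(k):
            members = points[labels == j]
            if len(members):
                new[j] = members.mean(axis=0)
        if np.allclose(new, centers):
            break
        centers = new
    return centers
```

For well-separated plate returns each recovered center converges to the centroid of one plate's points, which is exactly the 3D feature point p_i the text selects.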
step three: for the camera data, record the gray value of each pixel and use the FAST keypoint extraction algorithm to detect locations where the local gray level of a calibration plate changes sharply, thereby extracting the center of each plate. For plates No. 1 to No. 3 the dot centers are extracted and their coordinates q1, q2, q3 recorded; for plate No. 4 the center coordinate is solved by analyzing the checkerboard pattern. q_i = (u_i, v_i) is the two-dimensional pixel coordinate of the center of the i-th calibration plate in the camera image, and these are the visual feature points selected for matching;
step four: establish the matching relation between the lidar 3D feature points p_i and the monocular camera 2D feature points q_i: the pairs (p_i, q_i). As shown in fig. 7, each feature point P_i links its laser radar coordinates (x_i, y_i, z_i) with its camera pixel coordinates (u_i, v_i); from this matching relation the reprojection error is minimized and a least squares problem is constructed. The extrinsic parameter matrix of the laser radar and the monocular camera is defined as: T_LC = [R t; 0 1],
where R is the rotation matrix and t is the translation vector. Combining the matching relation with the extrinsic matrix, the error of feature point i can be expressed as: e_i = q_i − (1/s_i) K (R p_i + t), taking the first two components after the perspective division,
where s_i is the depth of the i-th visual feature point. From this error equation a least squares objective is constructed so that the total error is minimized, yielding the optimal extrinsic matrix: T_LC* = argmin over (R, t) of Σ_i ||e_i||².
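As an illustration of recovering R and t from matched 3D points and 2D pixels with a known K, the sketch below uses the linear DLT formulation on synthetic, noise-free data; in practice the result would then be refined by nonlinear least squares on the reprojection error exactly as the text describes. All names here are our assumptions, not the patent's:

```python
import numpy as np

def extrinsics_dlt(pts3d, pix, K):
    # Remove the intrinsics: x ~ [R|t] P in normalized camera coordinates.
    x = (np.linalg.inv(K) @ np.column_stack([pix, np.ones(len(pix))]).T).T
    A = []
    for P, (u, v, _) in zip(pts3d, x):
        Ph = np.append(P, 1.0)
        A.append(np.concatenate([Ph, np.zeros(4), -u * Ph]))
        A.append(np.concatenate([np.zeros(4), Ph, -v * Ph]))
    M = np.linalg.svd(np.array(A))[2][-1].reshape(3, 4)
    # Fix scale and sign: the third row of the rotation part must have unit
    # norm, and the points must lie in front of the camera (positive depth).
    M /= np.linalg.norm(M[2, :3])
    if (M[2, :3] @ pts3d[0] + M[2, 3]) < 0:
        M = -M
    # Project the 3x3 part onto the closest proper rotation matrix.
    U, _, Vt = np.linalg.svd(M[:, :3])
    if np.linalg.det(U @ Vt) < 0:
        Vt[-1] *= -1
    return U @ Vt, M[:, 3]
```

This needs at least six non-coplanar correspondences, which is one reason the text recommends collecting corner positions in addition to the four plate centers.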
Further, optionally, more feature points can be collected for matching: in addition to the plate centers, the corner positions of the calibration plates are collected as feature points. As shown in fig. 6, 5 positions on each calibration plate are selected as feature points, the least squares objective is constructed as above, and the extrinsic matrix is solved. Experiments show that the more feature points are used, the more accurate the computed extrinsic matrix.
The joint calibration method of the laser radar and the inertial navigation system comprises the following steps:
with the assistance of a mechanical arm, adopting a laser radar-inertial navigation system external reference calibration scheme based on point cloud matching; the method comprises the following specific steps:
firstly, the mechanical arm is controlled to move, and the sensors acquire data while moving through space;
secondly, the mechanical arm is stopped and the collected sensor data are processed;
then, the pose transformation matrix T_L between the laser radar coordinate system L_0 at the initial time t_0 and L_n at the end of motion t_n is computed, together with the pose transformation matrix T_I between the inertial navigation coordinate system I_0 at t_0 and I_n at t_n;
Thirdly, byWill be provided withLaser point cloud of timeIs projected toOf time of dayUnder the coordinate system, obtaining point cloud(ii) a By passingAnd the parameter to be calibratedWill beLaser point cloud of timeIs projected toOf time of dayUnder a coordinate system, obtaining(ii) a Finally, matchingAndtwo groups of point clouds, and calculating an external parameter matrix by aligning the two groups of point clouds;
The mechanical arm is controlled to move along a specified trajectory: taking the arm's coordinate axes as reference, it moves 100 cm in the positive X direction, 100 cm in the negative X direction, 100 cm in the positive Y direction, 100 cm in the negative Y direction, 100 cm in the positive Z direction and 100 cm in the negative Z direction; it then rotates 180° clockwise and 180° counterclockwise about each of the X, Y and Z axes. Throughout the motion, the laser radar and the inertial navigation system collect data;
Sensor data processing: for the laser radar, the motion distortion of each laser frame is removed according to a uniform motion model; outliers are removed with a point cloud filtering algorithm; and, to reduce computational complexity, each laser frame is downsampled with a voxel grid method;
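The voxel grid downsampling mentioned here can be sketched as bucketing points into cubic cells and keeping one centroid per cell — a toy version of what point cloud libraries such as PCL provide (the function name and voxel size are our choices):

```python
import numpy as np

def voxel_downsample(points, voxel=0.1):
    # Quantize each point to a cubic voxel index, then replace all points
    # that share a voxel by their centroid.
    idx = np.floor(points / voxel).astype(np.int64)
    uniq, inv, counts = np.unique(idx, axis=0, return_inverse=True,
                                  return_counts=True)
    inv = inv.ravel()
    sums = np.zeros((len(uniq), 3))
    np.add.at(sums, inv, points)       # accumulate per-voxel coordinate sums
    return sums / counts[:, None]      # per-voxel centroids
```

The voxel size trades point density against speed: a larger cell means fewer points per frame and a cheaper ICP step afterwards.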
The pose transformation matrix T_L is computed as follows: the point cloud of frame k−1 is matched against the point cloud of frame k with the iterative closest point method to obtain the correspondence between them. The pose change of the laser radar from frame k−1 to frame k consists of a rotation matrix R_k and a translation vector t_k; the correspondence between matched points x_{k−1}^(j) and x_k^(j) of the two frames of point clouds is represented as: x_k^(j) = R_k x_{k−1}^(j) + t_k.
The error equation is expressed as: E(R_k, t_k) = Σ_i ‖ P_i − (R_k · Q_i + t_k) ‖², and R_k, t_k are obtained by minimizing E in the least-squares sense.
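With known correspondences, this least-squares problem has the classical closed-form SVD (Kabsch) solution that claim 5 refers to; a minimal NumPy sketch (assuming exact one-to-one correspondences, which ICP supplies iteratively):

```python
import numpy as np

def solve_rigid(Q, P):
    """Least-squares R, t such that P_i ≈ R @ Q_i + t, via SVD of the cross-covariance."""
    q0, p0 = Q.mean(axis=0), P.mean(axis=0)
    H = (Q - q0).T @ (P - p0)                            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflections
    R = Vt.T @ D @ U.T
    t = p0 - R @ q0
    return R, t

# sanity check against a known motion (synthetic data)
rng = np.random.default_rng(0)
Q = rng.normal(size=(100, 3))
a = 0.3
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
P = Q @ R_true.T + t_true
R_est, t_est = solve_rigid(Q, P)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
```

The determinant correction keeps the solution a proper rotation rather than a reflection, which matters when the point set is nearly planar.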
The pose transformation matrix T_I is calculated as follows: the acceleration measurement data and angular velocity measurement data of the inertial navigation system are integrated to obtain the displacement data d and the rotation data R_I; the pose transformation matrix of the inertial navigation system from time t0 to time t1 is then expressed as: T_I = [ R_I  d ; 0  1 ].
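The integration step can be sketched as simple dead reckoning (a first-order Euler sketch under assumed conditions: known gravity vector, bias-free measurements, small time steps; a real INS integrates far more carefully):

```python
import numpy as np

def integrate_imu(accels, gyros, dt, gravity=np.array([0.0, 0.0, -9.81])):
    """Dead-reckon orientation R and displacement d from body-frame accel/gyro samples."""
    R = np.eye(3)                 # orientation of the body frame in the start frame
    v = np.zeros(3)               # velocity in the start frame
    d = np.zeros(3)               # displacement in the start frame
    for a, w in zip(accels, gyros):
        # first-order rotation update with the skew-symmetric matrix of the gyro sample
        W = np.array([[0.0, -w[2], w[1]],
                      [w[2], 0.0, -w[0]],
                      [-w[1], w[0], 0.0]])
        R = R @ (np.eye(3) + W * dt)
        a_world = R @ a + gravity  # rotate specific force to start frame, remove gravity
        v = v + a_world * dt
        d = d + v * dt
    return R, d

# stationary, level sensor: measured specific force cancels gravity, so d stays ~0
n = 100
accels = np.tile(np.array([0.0, 0.0, 9.81]), (n, 1))
gyros = np.zeros((n, 3))
R, d = integrate_imu(accels, gyros, dt=0.01)
print(np.allclose(d, 0.0), np.allclose(R, np.eye(3)))  # True True
```

The resulting (R, d) pair is exactly what is stacked into the 4×4 homogeneous matrix T_I above.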
The device projects the laser point cloud P1 of time t1 into the laser radar coordinate system L0 of time t0. For the laser radar, as shown in fig. 8, the coordinate representation of P1 in the L0 coordinate system can be obtained directly through the pose transformation matrix T_L: P' = T_L · P1.
For the inertial navigation system, as shown in FIG. 9, the coordinate representation of P1 in the L0 coordinate system can be obtained through the pose transformation matrix T_I and the extrinsic parameter matrix T_LI, by chaining lidar frame → INS frame → INS frame at t0 → lidar frame: P'' = T_LI⁻¹ · T_I · T_LI · P1.
Computing the extrinsic parameter matrix T_LI means aligning the point cloud P' with the point cloud P''. In theory, P' and P'' describe the same piece of point cloud data, and they coincide in space, i.e.: P' = P''.
Owing to the extrinsic-parameter error, P' and P'' do not coincide exactly. Using the iterative closest point method, the regions of P' and P'' that describe the same piece of point cloud are registered, and the nearest-neighbor error E(T_LI) = Σ_i ‖ p'_i − p''_i ‖² is constructed and minimized over T_LI.
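The identity behind this alignment is the hand-eye relation T_L · T_LI = T_LI · T_I: with the true extrinsic, projecting through the lidar odometry or through the INS chain gives the same result. A small NumPy check under an assumed ground truth (all matrix values here are synthetic illustrations):

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def make_T(R, t):
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# synthetic ground-truth extrinsic (lidar -> INS) and INS motion from t0 to t1
T_LI = make_T(rot_z(0.2), np.array([0.1, 0.0, 0.3]))
T_I  = make_T(rot_z(0.7), np.array([1.0, -0.5, 0.2]))

# the lidar motion consistent with them satisfies T_L = T_LI @ T_I @ inv(T_LI)
T_L = T_LI @ T_I @ np.linalg.inv(T_LI)

# hand-eye relation holds: T_L @ T_LI == T_LI @ T_I
print(np.allclose(T_L @ T_LI, T_LI @ T_I))   # True

# so a homogeneous point projected through either chain lands in the same place
p = np.array([2.0, 1.0, 0.5, 1.0])
print(np.allclose(T_L @ p, T_LI @ T_I @ np.linalg.inv(T_LI) @ p))  # True
```

When T_LI is wrong, the two chains disagree, and the nearest-neighbor error above is exactly a measure of that disagreement over the whole cloud.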
Further, optionally, when visual conditions are adequate, the observation data of the monocular camera are recorded, and the extrinsic parameter matrix T_CI between the monocular camera and the inertial navigation system is calculated with a visual-IMU calibration tool. The pose consistency among the three determined extrinsic parameter matrices T_LC, T_LI and T_CI is then verified jointly; if the verification fails, the transformation parameters in the extrinsic matrices are adjusted until it passes. In this way, when verification fails, the embodiment of the invention optimizes the parameters of T_LC, T_LI and T_CI and improves the accuracy of the multi-sensor joint calibration. The parameter optimization adopts an online adjustment method that fuses laser radar, monocular camera and inertial navigation system data to adjust T_LC and T_LI.
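The joint consistency check can be sketched as a loop-closure test: composing lidar→camera with camera→INS should reproduce the direct lidar→INS extrinsic (the composition convention, the symbol names T_LC/T_CI/T_LI, and the tolerances are assumptions of this sketch; Kalibr is one example of a visual-IMU calibration tool):

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def make_T(R, t):
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def loop_error(T_LC, T_CI, T_LI):
    """Residual of the closure lidar->camera->INS against the direct lidar->INS extrinsic."""
    residual = np.linalg.inv(T_LI) @ (T_CI @ T_LC)   # identity when the three are consistent
    rot_err = np.arccos(np.clip((np.trace(residual[:3, :3]) - 1.0) / 2.0, -1.0, 1.0))
    trans_err = np.linalg.norm(residual[:3, 3])
    return rot_err, trans_err

# synthetic, mutually consistent extrinsics (illustrative values only)
T_LC = make_T(rot_x(0.1), np.array([0.05, 0.0, 0.1]))
T_CI = make_T(rot_x(-0.3), np.array([0.0, 0.2, 0.0]))
T_LI = T_CI @ T_LC
rot_err, trans_err = loop_error(T_LC, T_CI, T_LI)
print(rot_err < 1e-6, trans_err < 1e-9)   # True True
```

A nonzero residual angle or translation flags the inconsistency that triggers the online re-adjustment described above.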
Although some terms are used frequently herein, the possibility of using other terms is not excluded. These terms are used merely to describe and explain the essence of the present invention more conveniently; construing them as imposing any additional limitation would be contrary to the spirit of the present invention. Unless otherwise specified, the operations, steps, and the like of the apparatuses and methods shown in the specification and drawings may be executed in any order, as long as the output of one process is not used by a subsequent process. Descriptions using "first", "next", etc. are for convenience of description only and do not imply that the steps must be performed in this order.
The specific embodiments described herein are merely illustrative of the spirit of the invention. Those skilled in the art may make various modifications or additions to the described embodiments, or substitute alternatives, without departing from the spirit of the invention or the scope defined by the appended claims.
Claims (6)
1. A multi-sensor combined calibration device comprises a mechanical arm, wherein a sensor fusion frame is arranged on the mechanical arm, and a laser radar, a monocular camera and a computer for processing data are arranged on the sensor fusion frame; the device is characterized by further comprising a laser radar-camera four-calibration-plate combined calibration target, wherein the combined calibration target comprises a No. 1 calibration plate, a No. 2 calibration plate, a No. 3 calibration plate and a No. 4 calibration plate; the center positions of the No. 1, No. 2 and No. 3 calibration plates are marked with dots, which provide feature points for extrinsic calibration; a black-and-white checkerboard pattern is arranged on the No. 4 calibration plate for calibrating the intrinsic parameters of the camera; the four calibration plates are arranged in the shape of the Chinese character 田 (tian): the No. 1 and No. 2 calibration plates are arranged in parallel, the No. 3 and No. 4 calibration plates are arranged in parallel in front of the No. 1 and No. 2 calibration plates, and the No. 3 and No. 4 calibration plates are lower than the No. 1 and No. 2 calibration plates; the angle between the normal vector n3 of the No. 3 calibration plate surface and the normal vector n4 of the No. 4 calibration plate surface is greater than 30°; the angle between the normal vector n1 of the No. 1 calibration plate and the normal vector n2 of the No. 2 calibration plate is greater than 30°.
2. A multi-sensor joint calibration method of the multi-sensor joint calibration apparatus according to claim 1, comprising intrinsic calibration of the camera and extrinsic calibration of the laser radar-camera pair; for the intrinsic calibration of the camera, the computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different postures, and the camera intrinsic matrix K is calculated by Zhang Zhengyou's calibration method; the extrinsic calibration method of the laser radar-camera pair comprises the following steps:
Step one: the computer unit adjusts the posture of the mechanical arm so that the calibration plate module appears within the fields of view of the laser radar and the camera; the mechanical arm is controlled to remain stationary, and data are acquired with the laser radar and the camera;
Step two: the laser radar data are analyzed and processed, and point cloud feature points are extracted; the coordinate information of each point cloud is recorded, abnormal points are filtered out, the point cloud data are partitioned with a point cloud segmentation method, the point cloud data of the four calibration plates are divided into four different groups G1, G2, G3 and G4, and the cluster center point of each group is extracted with the K-means method; the three-dimensional coordinate value L_i of the center point of the i-th calibration plate in the laser radar coordinate system is selected as the laser feature point for matching;
Step three: the camera data are analyzed and processed, and visual feature points are extracted; the gray value of each pixel is recorded, and locations where the local pixel gray level of the calibration plate changes markedly are detected with the FAST key-point extraction algorithm, so as to extract the position of the center point of each calibration plate; the centers of the dots of the No. 1, No. 2 and No. 3 calibration plates are extracted and their coordinate values C1, C2 and C3 recorded; the coordinate value C4 of the center point of the No. 4 calibration plate is obtained by analyzing the relationship between the checkerboard squares; the two-dimensional coordinate value C_i of the center point of the i-th calibration plate in the camera coordinate system is selected as the visual feature point for matching;
Step four: a matching relationship L_i ↔ C_i is established between the laser radar feature points L_i and the camera feature points C_i; a minimum reprojection error is established according to the feature-point matching relationship, an error equation is constructed, and a least-squares solution of the error equation yields the optimal extrinsic parameter matrix T_LC.
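The reprojection error of step four can be sketched in NumPy: project each 3D lidar center through a candidate extrinsic [R|t] and the intrinsic matrix K, and sum the squared pixel residuals (K, R, t and the point values here are synthetic illustrations; in practice the minimization would use a nonlinear least-squares solver):

```python
import numpy as np

def reprojection_error(K, R, t, pts_lidar, pts_px):
    """Sum of squared pixel residuals of lidar points projected into the image."""
    cam = pts_lidar @ R.T + t          # lidar frame -> camera frame
    uvw = cam @ K.T                    # pinhole projection (homogeneous pixels)
    uv = uvw[:, :2] / uvw[:, 2:3]      # perspective divide
    return np.sum((uv - pts_px) ** 2)

# synthetic intrinsics and a ground-truth extrinsic
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R_true, t_true = np.eye(3), np.array([0.05, -0.02, 0.1])

pts_lidar = np.array([[0.5, 0.2, 3.0], [-0.4, 0.1, 2.5], [0.1, -0.3, 4.0]])
cam = pts_lidar @ R_true.T + t_true
pts_px = (cam @ K.T)[:, :2] / (cam @ K.T)[:, 2:3]   # exact synthetic observations

print(reprojection_error(K, R_true, t_true, pts_lidar, pts_px))        # ~0 at truth
print(reprojection_error(K, R_true, t_true + 0.1, pts_lidar, pts_px))  # grows off truth
```

The error vanishes at the true extrinsic and grows as [R|t] drifts, which is what makes it a usable least-squares objective.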
3. The multi-sensor joint calibration method according to claim 2, wherein, in addition to the feature points at the center positions of the calibration plates, the corner positions of the calibration plates are collected as feature points.
4. The multi-sensor joint calibration method according to claim 2, wherein the camera is calibrated with a checkerboard calibration method: the computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different postures; corner information is extracted from 20 to 30 pictures, and the camera intrinsic matrix K is calculated by Zhang Zhengyou's calibration method.
5. The multi-sensor joint calibration method according to claim 2, further comprising a joint calibration method for the laser radar and an inertial navigation system arranged on the sensor fusion frame; the start time of the mechanical arm motion is defined as t0, the time at which the mechanical arm finishes moving is t1, the laser point cloud scanned at time t1 is P1, L0 is the laser radar coordinate system at time t0, I0 is the inertial navigation system coordinate system at time t0, L1 is the laser radar coordinate system at time t1, I1 is the inertial navigation system coordinate system at time t1, the pose transformation matrix of the laser radar from time t0 to time t1 is T_L, the pose transformation matrix of the inertial navigation system from time t0 to time t1 is T_I, and the extrinsic parameter matrix between the laser radar and the inertial navigation system is T_LI; the method comprises the following steps:
Step one: the mechanical arm moves along a specified trajectory, and during the motion of the mechanical arm the laser radar and the inertial navigation system acquire data;
Step two: the mechanical arm is controlled to stop moving, and the data acquired by the sensors are processed; for the laser radar, each frame of laser point cloud is de-distorted according to a uniform-motion model, and outliers in the point cloud data are removed with a point cloud filtering algorithm;
Step three: the pose transformation matrix T_L between the laser radar coordinate system L0 at the start time t0 and the coordinate system L1 at the end of the motion is calculated as follows: the k-th frame point cloud and the (k+1)-th frame point cloud are matched with the iterative closest point method to obtain the matching relationship between the k-th frame point cloud and the (k+1)-th frame point cloud; for the pose-transformation rotation matrix R_k and translation vector t_k of the laser radar from frame k to frame k+1, an error equation is constructed, converted into a least-squares problem, and solved for R_k and t_k with the SVD method; from R_k and t_k, the pose transformation matrix T_k of the laser radar from frame k to frame k+1 is obtained; the pose transformation matrices of the n frames are multiplied cumulatively to obtain the pose transformation matrix T_L of the laser radar from time t0 to time t1;
Step four: the pose transformation matrix T_I between the inertial navigation system coordinate system I0 at the start time t0 and the coordinate system I1 at the end time t1 is calculated as follows: the acceleration measurement data and angular velocity measurement data of the inertial navigation system are integrated to obtain the displacement data d and the rotation data R_I; the pose transformation matrix of the inertial navigation system from time t0 to time t1 is then expressed as T_I = [ R_I  d ; 0  1 ];
Step five: through T_L, the laser point cloud P1 of time t1 is projected into the laser radar coordinate system L0 of time t0, obtaining point cloud P';
Step six: through T_I and the extrinsic parameter T_LI to be calibrated, the laser point cloud P1 of time t1 is projected into the laser radar coordinate system L0 of time t0, obtaining point cloud P'';
Step seven: the two point clouds P' and P'' are matched, and the extrinsic matrix T_LI is optimized by aligning them; with the iterative closest point method, the regions of P' and P'' describing the same piece of point cloud are registered, the nearest-neighbor error E(T_LI) = Σ_i ‖ p'_i − p''_i ‖² is constructed and minimized, and the extrinsic parameter matrix T_LI is solved from T_L, T_I and the minimized error.
6. The multi-sensor joint calibration method according to claim 5, wherein, when visual conditions are adequate, the observation data of the monocular camera are recorded, and the extrinsic parameter matrix T_CI between the monocular camera and the inertial navigation system is calculated with a visual-IMU calibration tool; the pose consistency among the three determined extrinsic parameter matrices T_LC, T_LI and T_CI is verified jointly, the transformation parameters in the extrinsic matrices are adjusted, and the accuracy of the multi-sensor joint calibration is improved; the parameter optimization adopts an online adjustment method that fuses laser radar, monocular camera and inertial navigation system data to adjust T_LC and T_LI.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010881818.4A CN111735479B (en) | 2020-08-28 | 2020-08-28 | Multi-sensor combined calibration device and method |
JP2021003139A JP7072759B2 (en) | 2020-08-28 | 2021-01-12 | Composite calibration device and method using multiple sensors |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111735479A true CN111735479A (en) | 2020-10-02 |
CN111735479B CN111735479B (en) | 2021-03-23 |
Family
ID=72658909
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010881818.4A Active CN111735479B (en) | 2020-08-28 | 2020-08-28 | Multi-sensor combined calibration device and method |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7072759B2 (en) |
CN (1) | CN111735479B (en) |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112444798A (en) * | 2020-11-27 | 2021-03-05 | 杭州易现先进科技有限公司 | Multi-sensor equipment space-time external parameter calibration method and device and computer equipment |
CN112509067A (en) * | 2021-02-02 | 2021-03-16 | 中智行科技有限公司 | Multi-sensor combined calibration method and device, electronic equipment and storage medium |
CN112577517A (en) * | 2020-11-13 | 2021-03-30 | 上汽大众汽车有限公司 | Multi-element positioning sensor combined calibration method and system |
CN112790786A (en) * | 2020-12-30 | 2021-05-14 | 无锡祥生医疗科技股份有限公司 | Point cloud data registration method and device, ultrasonic equipment and storage medium |
CN112882000A (en) * | 2021-02-05 | 2021-06-01 | 北京科技大学 | Automatic calibration method of laser radar |
CN112881999A (en) * | 2021-01-25 | 2021-06-01 | 上海西虹桥导航技术有限公司 | Semi-automatic calibration method for multi-line laser radar and vision sensor |
CN112927302A (en) * | 2021-02-22 | 2021-06-08 | 山东大学 | Calibration plate and calibration method for multi-line laser radar and camera combined calibration |
CN113192174A (en) * | 2021-04-06 | 2021-07-30 | 中国计量大学 | Mapping method and device and computer storage medium |
CN113218435A (en) * | 2021-05-07 | 2021-08-06 | 复旦大学 | Multi-sensor time synchronization method |
CN113269107A (en) * | 2021-06-01 | 2021-08-17 | 航天智造(上海)科技有限责任公司 | Interactive intelligent disassembling and assembling system based on deep learning |
CN113298881A (en) * | 2021-05-27 | 2021-08-24 | 中国科学院沈阳自动化研究所 | Monocular camera-IMU-mechanical arm space combined calibration method |
CN113376618A (en) * | 2021-06-22 | 2021-09-10 | 昆明理工大学 | Multi-path side laser radar point cloud registration device and using method |
CN114252099A (en) * | 2021-12-03 | 2022-03-29 | 武汉科技大学 | Intelligent vehicle multi-sensor fusion self-calibration method and system |
CN114643599A (en) * | 2020-12-18 | 2022-06-21 | 沈阳新松机器人自动化股份有限公司 | Three-dimensional machine vision system and method based on point laser and area-array camera |
CN114882115A (en) * | 2022-06-10 | 2022-08-09 | 国汽智控(北京)科技有限公司 | Vehicle pose prediction method and device, electronic equipment and storage medium |
CN114879153A (en) * | 2022-06-08 | 2022-08-09 | 中国第一汽车股份有限公司 | Radar parameter calibration method and device and vehicle |
CN114894116A (en) * | 2022-04-08 | 2022-08-12 | 苏州瀚华智造智能技术有限公司 | Measurement data fusion method and non-contact measurement equipment |
CN115097427A (en) * | 2022-08-24 | 2022-09-23 | 北原科技(深圳)有限公司 | Automatic calibration system and method based on time-of-flight method |
CN115908589A (en) * | 2023-02-23 | 2023-04-04 | 深圳佑驾创新科技有限公司 | Multi-sensor calibration system and method |
CN116038719A (en) * | 2023-04-03 | 2023-05-02 | 广东工业大学 | Method, device and equipment for tracking and measuring pose of tail end of mechanical arm |
CN116630444A (en) * | 2023-07-24 | 2023-08-22 | 中国矿业大学 | Optimization method for fusion calibration of camera and laser radar |
CN117554937A (en) * | 2024-01-08 | 2024-02-13 | 安徽中科星驰自动驾驶技术有限公司 | Error-controllable laser radar and combined inertial navigation external parameter calibration method and system |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114770517B (en) * | 2022-05-19 | 2023-08-15 | 梅卡曼德(北京)机器人科技有限公司 | Method for calibrating robot through point cloud acquisition device and calibration system |
CN114993245B (en) * | 2022-05-31 | 2024-04-05 | 山西支点科技有限公司 | High-precision target calibrating method of target calibrating equipment in movable base platform and external field vibration environment |
CN115026814B (en) * | 2022-06-01 | 2024-04-12 | 中科苏州智能计算技术研究院 | Camera automatic calibration method for mechanical arm movement space reconstruction |
CN115092671B (en) * | 2022-06-08 | 2023-09-26 | 深圳市南科佳安机器人科技有限公司 | Feeding and discharging control method |
CN115122331A (en) * | 2022-07-04 | 2022-09-30 | 中冶赛迪工程技术股份有限公司 | Workpiece grabbing method and device |
CN115153925B (en) * | 2022-07-18 | 2024-04-23 | 杭州键嘉医疗科技股份有限公司 | Automatic drill bit positioning device and method for dental implant operation |
CN115159149B (en) * | 2022-07-28 | 2024-05-24 | 深圳市罗宾汉智能装备有限公司 | Visual positioning-based material taking and unloading method and device |
CN115241110B (en) * | 2022-08-15 | 2023-12-08 | 魅杰光电科技(上海)有限公司 | Wafer motion control method and wafer motion control system |
CN115442584B (en) * | 2022-08-30 | 2023-08-18 | 中国传媒大学 | Multi-sensor fusion type special-shaped surface dynamic projection method |
JP2024066886A (en) * | 2022-11-02 | 2024-05-16 | 京セラ株式会社 | Electronic device, electronic device control method, and program |
CN115712111A (en) * | 2022-11-07 | 2023-02-24 | 北京斯年智驾科技有限公司 | Camera and radar combined calibration method and system, electronic device, computer equipment and storage medium |
CN116000927A (en) * | 2022-12-29 | 2023-04-25 | 中国工程物理研究院机械制造工艺研究所 | Measuring device and method for spatial position guiding precision of robot vision system |
CN115793261B (en) * | 2023-01-31 | 2023-05-02 | 北京东方瑞丰航空技术有限公司 | Visual compensation method, system and equipment for VR glasses |
CN115908121B (en) * | 2023-02-23 | 2023-05-26 | 深圳市精锋医疗科技股份有限公司 | Endoscope registration method, device and calibration system |
CN116358517B (en) * | 2023-02-24 | 2024-02-23 | 杭州宇树科技有限公司 | Height map construction method, system and storage medium for robot |
CN115965760B (en) * | 2023-03-15 | 2023-06-09 | 成都理工大学 | Debris flow alluvial simulation experiment accumulation body surface reconstruction system |
CN116423505B (en) * | 2023-03-30 | 2024-04-23 | 杭州邦杰星医疗科技有限公司 | Error calibration method for mechanical arm registration module in mechanical arm navigation operation |
CN116512286B (en) * | 2023-04-23 | 2023-11-14 | 九众九机器人有限公司 | Six-degree-of-freedom stamping robot and stamping method thereof |
CN116449387B (en) * | 2023-06-15 | 2023-09-12 | 南京师范大学 | Multi-dimensional environment information acquisition platform and calibration method thereof |
CN116563297B (en) * | 2023-07-12 | 2023-10-31 | 中国科学院自动化研究所 | Craniocerebral target positioning method, device and storage medium |
CN117008122A (en) * | 2023-08-04 | 2023-11-07 | 江苏苏港智能装备产业创新中心有限公司 | Method and system for positioning surrounding objects of engineering mechanical equipment based on multi-radar fusion |
CN116687386B (en) * | 2023-08-07 | 2023-10-31 | 青岛市畜牧工作站(青岛市畜牧兽医研究所) | Radar detection system and method for comprehensive calibration of cattle body shape data |
CN116862999B (en) * | 2023-09-04 | 2023-12-08 | 华东交通大学 | Calibration method, system, equipment and medium for three-dimensional measurement of double cameras |
CN116883516B (en) * | 2023-09-07 | 2023-11-24 | 西南科技大学 | Camera parameter calibration method and device |
CN117092625B (en) * | 2023-10-10 | 2024-01-02 | 北京斯年智驾科技有限公司 | External parameter calibration method and system of radar and combined inertial navigation system |
CN117109505B (en) * | 2023-10-24 | 2024-01-30 | 中国飞机强度研究所 | Method for measuring blocking hook posture and determining space deformation data of carrier-based aircraft |
CN117140536B (en) * | 2023-10-30 | 2024-01-09 | 北京航空航天大学 | Robot control method and device and robot |
CN117257459B (en) * | 2023-11-22 | 2024-03-12 | 杭州先奥科技有限公司 | Map expansion method and system in electromagnetic navigation bronchoscopy with respiratory disturbance resistance |
CN117284499B (en) * | 2023-11-24 | 2024-01-19 | 北京航空航天大学 | Monocular vision-laser-based pose measurement method for spatial unfolding mechanism |
CN117433511B (en) * | 2023-12-20 | 2024-03-12 | 绘见科技(深圳)有限公司 | Multi-sensor fusion positioning method |
CN117646828B (en) * | 2024-01-29 | 2024-04-05 | 中国市政工程西南设计研究总院有限公司 | Device and method for detecting relative displacement and water leakage of pipe jacking interface |
CN117968680A (en) * | 2024-03-29 | 2024-05-03 | 西安现代控制技术研究所 | Inertial-radar integrated navigation limited frame measurement variable weight updating method |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107194983A (en) * | 2017-05-16 | 2017-09-22 | 华中科技大学 | A kind of three-dimensional visualization method and system based on a cloud and image data |
CN109118547A (en) * | 2018-11-01 | 2019-01-01 | 百度在线网络技术(北京)有限公司 | Multi-cam combined calibrating system and method |
CN109712189A (en) * | 2019-03-26 | 2019-05-03 | 深兰人工智能芯片研究院(江苏)有限公司 | A kind of method and apparatus of sensor combined calibrating |
CN110599546A (en) * | 2019-08-28 | 2019-12-20 | 贝壳技术有限公司 | Method, system, device and storage medium for acquiring three-dimensional space data |
CN110599541A (en) * | 2019-08-28 | 2019-12-20 | 贝壳技术有限公司 | Method and device for calibrating multiple sensors and storage medium |
CN110686704A (en) * | 2019-10-18 | 2020-01-14 | 深圳市镭神智能系统有限公司 | Pose calibration method, system and medium for laser radar and combined inertial navigation |
CN111127563A (en) * | 2019-12-18 | 2020-05-08 | 北京万集科技股份有限公司 | Combined calibration method and device, electronic equipment and storage medium |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103983961A (en) | 2014-05-20 | 2014-08-13 | 南京理工大学 | Three-dimensional calibration target for joint calibration of 3D laser radar and camera |
US10282591B2 (en) | 2015-08-24 | 2019-05-07 | Qualcomm Incorporated | Systems and methods for depth map sampling |
KR20190126458A (en) | 2017-04-17 | 2019-11-11 | 코그넥스코오포레이션 | High precision calibration system and method |
CN109828262A (en) | 2019-03-15 | 2019-05-31 | 苏州天准科技股份有限公司 | Laser radar and the automatic combined calibrating method of camera based on plane and space characteristics |
CN110322519B (en) | 2019-07-18 | 2023-03-31 | 天津大学 | Calibration device and calibration method for combined calibration of laser radar and camera |
US10726579B1 (en) | 2019-11-13 | 2020-07-28 | Honda Motor Co., Ltd. | LiDAR-camera calibration |
CN112907676B (en) | 2019-11-19 | 2022-05-10 | 浙江商汤科技开发有限公司 | Calibration method, device and system of sensor, vehicle, equipment and storage medium |
CN111627072B (en) | 2020-04-30 | 2023-10-24 | 贝壳技术有限公司 | Method, device and storage medium for calibrating multiple sensors |
Non-Patent Citations (2)
Title |
---|
Wu Yuhan et al., "A lidar/IMU joint calibration method based on point cloud matching", Measurement & Control Technology and Instruments *
Han Dongbin et al., "Extrinsic calibration of 3D lidar based on multiple pairs of point cloud matching", Laser & Optoelectronics Progress *
Also Published As
Publication number | Publication date |
---|---|
CN111735479B (en) | 2021-03-23 |
JP7072759B2 (en) | 2022-05-23 |
JP2022039906A (en) | 2022-03-10 |
Similar Documents
Publication | Title |
---|---|
CN111735479B (en) | Multi-sensor combined calibration device and method |
CN109270534B (en) | Intelligent vehicle laser sensor and camera online calibration method |
CN112396664B (en) | Monocular camera and three-dimensional laser radar combined calibration and online optimization method |
CN111325801B (en) | Combined calibration method for laser radar and camera |
EP3479353A1 (en) | Systems and methods for identifying pose of cameras in a scene |
CN108594245A (en) | Object movement monitoring system and method |
CN112837383B (en) | Camera and laser radar recalibration method and device and computer readable storage medium |
CN110987021B (en) | Inertial vision relative attitude calibration method based on rotary table reference |
CN106625673A (en) | Narrow space assembly system and assembly method |
CN109724586B (en) | Spacecraft relative pose measurement method integrating depth map and point cloud |
US20220230348A1 (en) | Method and apparatus for determining a three-dimensional position and pose of a fiducial marker |
CN114608554B (en) | Handheld SLAM equipment and robot instant positioning and mapping method |
CN112819711B (en) | Monocular vision-based vehicle reverse positioning method utilizing road lane line |
CN111915685B (en) | Zoom camera calibration method |
Luo et al. | Docking navigation method for UAV autonomous aerial refueling |
CN110030979B (en) | Spatial non-cooperative target relative pose measurement method based on sequence images |
CN108257184B (en) | Camera attitude measurement method based on square lattice cooperative target |
CN114777768A (en) | High-precision positioning method and system for satellite rejection environment and electronic equipment |
CN114001651A (en) | Large-scale long and thin cylinder type component pose in-situ measurement method based on binocular vision measurement and prior detection data |
CN117115271A (en) | Binocular camera external parameter self-calibration method and system in unmanned aerial vehicle flight process |
Gao et al. | Altitude information acquisition of UAV based on monocular vision and MEMS |
CN110490934A (en) | Mixing machine vertical blade attitude detecting method based on monocular camera and robot |
CN113405532B (en) | Forward intersection measuring method and system based on structural parameters of vision system |
CN112489118B (en) | Method for quickly calibrating external parameters of airborne sensor group of unmanned aerial vehicle |
CN115100287A (en) | External reference calibration method and robot |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |