CN112964291B - Sensor calibration method, device, computer storage medium and terminal - Google Patents
- Publication number
- CN112964291B (application number CN202110360085.4A)
- Authority
- CN
- China
- Prior art keywords
- pose data
- pose
- coordinate system
- preset
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D18/00—Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4802—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
The embodiment of the invention matches point cloud data from a lidar with a preset map to obtain first pose data; second pose data obtained through an inertial navigation system (INS) are aligned with the first pose data to obtain third pose data, which ensures that the data used for sensor calibration are unified and yields first and third pose data with synchronized timestamps. Because the third pose data are obtained in real time, the sensor is calibrated online, which provides data support for vehicle trajectory estimation and driving-environment construction and improves the accuracy of vehicle navigation.
Description
Technical Field
The present disclosure relates to, but is not limited to, navigation technologies, and in particular, to a method, an apparatus, a computer storage medium, and a terminal for calibrating a sensor.
Background
Sensors are key components of intelligent vehicles, and sensor calibration is a precondition for automatic driving and vehicle interconnection; research on sensor accuracy and road perception is therefore an important topic in automatic driving. To acquire the running environment of the vehicle, the accuracy of sensor calibration must first be ensured so that high-fidelity information can be acquired. Sensor calibration in the related art relies on checkerboard targets and a dedicated calibration room, which incurs considerable labor and material cost.
Sensor calibration is a central topic in the rapid development of autonomous vehicles. The necessary calibration of the inertial measurement unit (IMU) is generally implemented as a hand-eye calibration, in which the two transformation matrices involved in the calibration process (the extrinsic transformation matrix and the matrix determined by a cost function) can be found by a closed-form or iterative solution. An inertial measurement unit is a device that measures the three-axis attitude angles (or angular rates) and accelerations of an object; it typically comprises three single-axis accelerometers and three single-axis gyroscopes. The accelerometers measure the object's acceleration along the three independent axes of the carrier coordinate system, while the gyroscopes measure the carrier's angular velocity relative to the navigation coordinate system; from the accelerations and angular velocities measured in three-dimensional space, the object's attitude can be computed.
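The hand-eye formulation mentioned above is conventionally written as follows (a background sketch in standard notation, not part of the claims; the symbols $A_i$, $B_i$, $X$, $T_{1,k}$, $T_{2,k}$ are generic and not taken from the patent):

```latex
% Classical hand-eye calibration: X is the unknown extrinsic transform,
% A_i and B_i are relative motions measured by the two sensors between
% sample instants k and k+1.
A_i X = X B_i, \qquad
A_i = T_{1,k}^{-1}\, T_{1,k+1}, \qquad
B_i = T_{2,k}^{-1}\, T_{2,k+1}
```

A closed-form or iterative solver recovers $X$ from a set of such motion pairs, as the description notes.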
To address the difficulties of data association and motion compensation, technicians typically use linear interpolation or B-spline parameterization to calibrate a lidar odometer off-line. Off-line calibration, however, cannot be computed continuously and cannot calibrate in real time while the vehicle is moving, so the real-time running environment of the vehicle cannot be obtained, which degrades the accuracy of vehicle navigation.
Disclosure of Invention
The following is a summary of the subject matter described in detail herein. This summary is not intended to limit the scope of the claims.
The embodiment of the invention provides a method and a device for calibrating a sensor, a computer storage medium and a terminal, which can obtain the real-time running environment of a vehicle and improve the accuracy of vehicle navigation.
The embodiment of the invention provides a method for calibrating a sensor, which comprises the following steps:
matching the laser radar point cloud data in the preset duration with a preset map to obtain first pose data;
converting second pose data obtained by an inertial navigation system (INS) into third pose data aligned with the obtained first pose data;
and calibrating the sensor of the vehicle according to the first pose data and the third pose data obtained by conversion.
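The three claimed steps can be sketched as a driver loop. Every name below is illustrative (the patent names no functions), and the matching, interpolation, and solving callables stand in for the concrete procedures described later in the specification:

```python
def run_calibration(lidar_scans, ins_stream, match_fn, interp_fn, solve_fn):
    """Hypothetical driver for the three claimed steps (names are illustrative).

    lidar_scans: iterable of (timestamp, scan) pairs read at the preset interval
    ins_stream:  INS (second) pose data, queried per timestamp via interp_fn
    """
    # Step 1: first pose data -- match each scan against the preset map
    first = [(t, match_fn(scan)) for t, scan in lidar_scans]
    # Step 2: third pose data -- interpolate the INS stream at the lidar
    #         timestamps so both streams share synchronized timestamps
    third = [(t, interp_fn(ins_stream, t)) for t, _ in first]
    # Step 3: calibrate -- solve the extrinsic from the two pose streams
    return solve_fn(first, third)
```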
In another aspect, an embodiment of the present invention further provides a computer storage medium storing a computer program which, when executed by a processor, implements the above sensor calibration method.
In still another aspect, an embodiment of the present invention further provides a terminal, including: a memory and a processor, the memory storing a computer program; wherein:
the processor is configured to execute the computer program in the memory;
the computer program, when executed by the processor, implements a method for sensor calibration as described above.
In still another aspect, an embodiment of the present invention further provides a device for sensor calibration, including a first conversion unit, a second conversion unit, and a calibration unit; wherein:
the first conversion unit is configured to: matching the laser radar point cloud data in the preset duration with a preset map to obtain first pose data;
the second conversion unit is configured to: converting the second pose data obtained by the inertial navigation device INS into third pose data aligned with the obtained first pose data;
the calibration unit is set as: and calibrating the sensor of the vehicle according to the first pose data and the third pose data obtained by conversion.
According to the embodiment of the invention, point cloud data from the lidar are matched with a preset map to obtain first pose data; second pose data obtained through an inertial navigation system (INS) are aligned with the first pose data to obtain third pose data, which ensures that the data used for sensor calibration are unified and yields first and third pose data with synchronized timestamps. Because the third pose data are obtained in real time, the sensor is calibrated online, which provides data support for vehicle trajectory estimation and driving-environment construction and improves the accuracy of vehicle navigation.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate and do not limit the invention.
FIG. 1 is a flow chart of a method of sensor calibration according to an embodiment of the present invention;
FIG. 2 is a block diagram of a sensor calibration device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail hereinafter with reference to the accompanying drawings. It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be arbitrarily combined with each other.
The steps illustrated in the flowchart of the figures may be performed in a computer system, such as a set of computer-executable instructions. Also, while a logical order is depicted in the flowchart, in some cases, the steps depicted or described may be performed in a different order than presented herein.
FIG. 1 is a flowchart of a method for calibrating a sensor according to an embodiment of the present invention, as shown in FIG. 1, including:
step 101, matching laser radar point cloud data in a preset duration with a preset map to obtain first pose data;
in an exemplary embodiment, the preset map of the embodiment of the present invention includes: map with relative error less than preset threshold. In an illustrative example, the map of the preset threshold in an embodiment of the present invention may include a map having a relative error of less than 30 cm; the relative error is a definition of the accuracy assessment known to those skilled in the art, and will not be described in detail herein.
In an exemplary embodiment, the preset map of the embodiment of the present invention includes: high-precision point cloud maps.
In one illustrative example, the high-precision point cloud map of the embodiment of the present invention includes: a high-precision point cloud map with a relative error of less than 20 cm.
In an illustrative example, step 101 matches lidar point cloud data over a preset duration with a preset map, comprising:
reading laser radar point cloud data in a preset time period according to a preset interval;
and matching the read laser radar point cloud data with a preset map through a preset point cloud registration algorithm to obtain first pose data.
In one illustrative example, the point cloud registration algorithm of the present invention includes the normal distributions transform (NDT) or iterative closest point (ICP).
Note that NDT is a probability-based method: the lidar point cloud data are modeled as normal distributions through probability density functions (PDFs) and then matched. The lidar point cloud is divided into grid cells, and for each cell a probability density function consisting of a mean vector and a covariance matrix is computed.

In an exemplary embodiment, the preset interval can be determined by a person skilled in the art according to the loading and computation speed of the server implementing the method; in principle, provided loading and computation keep up, the smaller the preset interval, the more lidar point cloud data can be processed and the more data are available for sensor calibration.
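The per-cell statistics described in the note above can be sketched in a few lines; the cell size and the minimum point count below are illustrative choices, not values from the patent:

```python
import numpy as np

def ndt_cell_stats(points, cell_size=1.0):
    """Per-voxel mean vector and covariance matrix for NDT (illustrative sketch).

    points: (N, 3) array of lidar returns."""
    cells = {}
    # Bucket each point into an axis-aligned grid cell
    for p in points:
        key = tuple(np.floor(p / cell_size).astype(int))
        cells.setdefault(key, []).append(p)
    stats = {}
    for key, pts in cells.items():
        pts = np.asarray(pts)
        if len(pts) < 3:          # too few points for a stable covariance
            continue
        mu = pts.mean(axis=0)     # mean vector of the cell
        cov = np.cov(pts.T)       # 3x3 covariance matrix of the cell
        stats[key] = (mu, cov)
    return stats
```

During matching, a candidate pose is scored by evaluating each transformed scan point under the Gaussian of the cell it falls into.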
In an illustrative example, converting the second pose data obtained by the INS into third pose data aligned with the obtained first pose data includes:
determining, according to the timestamps of the first pose data, the adjacent timestamps of the first pose data in the second pose data and their corresponding poses;
performing pose interpolation on the second pose data according to the determined adjacent timestamps and corresponding poses, to obtain third pose data;
wherein the pose interpolation includes cubic interpolation and/or linear interpolation.
The embodiment of the invention establishes the data association between the first pose data and the second pose data through pose interpolation.
It should be noted that, because the first pose data and the second pose data have different sampling frequencies, the embodiment of the present invention processes them by pose interpolation; the data association between the two can then be established through the third pose data obtained by interpolation.
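As one concrete reading of the linear-interpolation option, the pose at a lidar timestamp can be obtained from the two adjacent INS samples by interpolating the translation linearly and the rotation by slerp on quaternions. The quaternion representation is an assumption here; the patent does not fix a rotation parameterization:

```python
import numpy as np

def slerp(q0, q1, alpha):
    """Spherical linear interpolation between unit quaternions q0 and q1."""
    dot = np.dot(q0, q1)
    if dot < 0.0:                  # take the short arc
        q1, dot = -q1, -dot
    if dot > 0.9995:               # nearly parallel: fall back to lerp
        q = q0 + alpha * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - alpha) * theta) * q0 + np.sin(alpha * theta) * q1) / np.sin(theta)

def interpolate_pose(t, t0, pose0, t1, pose1):
    """Third pose at lidar time t from the two adjacent INS samples.

    A pose is a (quaternion wxyz, translation) pair; translation is
    interpolated linearly, rotation by slerp."""
    alpha = (t - t0) / (t1 - t0)
    q = slerp(np.asarray(pose0[0], float), np.asarray(pose1[0], float), alpha)
    trans = (1 - alpha) * np.asarray(pose0[1], float) + alpha * np.asarray(pose1[1], float)
    return q, trans
```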
It should be noted that the third pose data in the embodiment of the present invention include data obtained by a global navigation satellite system (GNSS) and an inertial measurement unit.
According to the embodiment of the invention, point cloud data from the lidar are matched with a preset map to obtain first pose data; the second pose data obtained through the INS are aligned with the first pose data to obtain third pose data, which ensures that the data used for sensor calibration are unified and yields first and third pose data with synchronized timestamps. Because the third pose data are obtained in real time, the sensor is calibrated online, which provides data support for vehicle trajectory estimation and driving-environment construction and improves the accuracy of vehicle navigation.
In an exemplary embodiment, the method for calibrating the sensor of the vehicle comprises the following steps:
setting a conversion matrix for converting the first pose data into third pose data;
and solving the conversion matrix through a preset cost function to realize the sensor calibration of the vehicle. In one illustrative example, the conversion matrix converting the first pose data into the third pose data satisfies the following formula:

$$T^W_L \, T^L_I = T^W_I$$

where $T^W_L$ is the pose of the lidar in the preset map coordinate system (the first pose data), $T^W_I$ the pose of the inertial navigation system in the preset map coordinate system (the third pose data), and $T^L_I$ the conversion matrix to be solved.
In an illustrative example, an embodiment of the present invention converts first pose data into third pose data, comprising:
expanding the determined conversion matrix through the rotation matrix and the translation vector, the expanded conversion matrix being:

$$\begin{bmatrix} R^W_L & t^W_L \\ 0_{1\times3} & 1 \end{bmatrix} \begin{bmatrix} R^L_I & t^L_I \\ 0_{1\times3} & 1 \end{bmatrix} = \begin{bmatrix} R^W_I & t^W_I \\ 0_{1\times3} & 1 \end{bmatrix};$$
converting the first pose data into third pose data based on the expanded conversion matrix;
wherein R represents pose rotation, t represents pose translation, W represents the coordinate system of the preset map, L the lidar coordinate system, and I the coordinate system of the inertial navigation system; $R^W_L$ represents the pose rotation of the lidar coordinate system relative to the preset map coordinate system, $R^L_I$ the pose rotation of the INS coordinate system relative to the lidar coordinate system, $R^W_I$ the pose rotation of the INS coordinate system relative to the preset map coordinate system, $t^W_L$ the pose translation of the lidar coordinate system relative to the preset map coordinate system, $t^L_I$ the pose translation of the INS coordinate system relative to the lidar coordinate system, and $t^W_I$ the pose translation of the INS coordinate system relative to the preset map coordinate system. The external parameters describe the relative position relationship between two coordinate systems: through them, one coordinate system can be transformed onto the other so that their origins coincide and their coordinate axes point in the same directions. The subscripts $3\times1$ and $3\times3$ in the matrix denote matrix sizes; a matrix carrying such a subscript is identical to the same matrix without it.
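The chaining of the expanded matrices can be checked numerically. This sketch only verifies the block structure ($R^W_I = R^W_L R^L_I$ and $t^W_I = R^W_L t^L_I + t^W_L$); it is not the patent's solver:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def compose(T_W_L, T_L_I):
    """Chain the frames: (map <- lidar) times (lidar <- INS) gives (map <- INS)."""
    return T_W_L @ T_L_I
```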
In an exemplary embodiment, the preset map and the GNSS use the same coordinate system; in an illustrative example, this may be the Universal Transverse Mercator (UTM) grid coordinate system. In an illustrative example, the first pose data in the embodiment of the present invention may include vehicle movement track data generated based on the preset map.
The cost function of the embodiment of the invention can be determined by a person skilled in the art according to the cost function of the off-line sensor calibration in the related art.
In an exemplary embodiment, solving the conversion matrix through the preset cost function includes:
solving the expanded conversion matrix to obtain an equation pair;
converting the cost function into a cost matrix according to the obtained equation pair;
performing parameter adjustment on the conversion matrix according to the obtained cost matrix, thereby solving the conversion matrix.
Here the equation pair is:

$$R^W_L R^L_I = R^W_I, \qquad R^W_L t^L_I + t^W_L = t^W_I$$

and the cost function minimizes the residuals of this pair:

$$\min_{R^L_I,\,t^L_I}\; \big\| R^W_L R^L_I - R^W_I \big\|^2 + \big\| R^W_L t^L_I + t^W_L - t^W_I \big\|^2$$

where $R^W_L$ represents the pose rotation of the lidar coordinate system relative to the preset map coordinate system, $R^L_I$ the pose rotation of the INS coordinate system relative to the lidar coordinate system, $R^W_I$ the pose rotation of the INS coordinate system relative to the preset map coordinate system, $t^W_L$ the pose translation of the lidar coordinate system relative to the preset map coordinate system, $t^L_I$ the pose translation of the INS coordinate system relative to the lidar coordinate system, and $t^W_I$ the pose translation of the INS coordinate system relative to the preset map coordinate system.
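Given synchronized pose pairs, the extrinsic can be estimated in a least-squares fashion. This sketch takes the per-sample closed-form solution and averages it (chordal mean for rotations); it is an illustrative substitute for, not a reproduction of, the patent's cost-matrix procedure:

```python
import numpy as np

def solve_extrinsic(T_WL_list, T_WI_list):
    """Estimate the lidar-to-INS extrinsic T^L_I from synchronized pose pairs.

    T_WL_list: first pose data (map <- lidar, 4x4), one per timestamp
    T_WI_list: third pose data (map <- INS, 4x4), same timestamps."""
    Rs, ts = [], []
    for T_WL, T_WI in zip(T_WL_list, T_WI_list):
        T_LI = np.linalg.inv(T_WL) @ T_WI   # per-sample closed-form solution
        Rs.append(T_LI[:3, :3])
        ts.append(T_LI[:3, 3])
    # Chordal mean of the rotation samples: SVD projection onto SO(3)
    U, _, Vt = np.linalg.svd(np.mean(Rs, axis=0))
    R = U @ Vt
    if np.linalg.det(R) < 0:                # keep a proper rotation
        U[:, -1] *= -1
        R = U @ Vt
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.mean(ts, axis=0)          # average the translation samples
    return T
```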
The embodiment of the invention also provides a computer storage medium, wherein a computer program is stored in the computer storage medium, and the method for calibrating the sensor is realized when the computer program is executed by a processor.
The embodiment of the invention also provides a terminal, including: a memory and a processor, the memory storing a computer program; wherein:
the processor is configured to execute the computer program in the memory;
the computer program when executed by a processor implements a method for calibrating a sensor as described above.
FIG. 2 is a block diagram of a sensor calibration device according to an embodiment of the present invention; as shown in FIG. 2, the device includes a first conversion unit, a second conversion unit, and a calibration unit; wherein:
the first conversion unit is configured to: matching the laser radar point cloud data in the preset duration with a preset map to obtain first pose data;
the second conversion unit is configured to: converting the second pose data obtained by the INS into third pose data aligned with the obtained first pose data;
the calibration unit is configured to: calibrate a sensor of the vehicle according to the first pose data and the third pose data obtained by conversion.
According to the embodiment of the invention, point cloud data from the lidar are matched with a preset map to obtain first pose data; the second pose data obtained through the INS are aligned with the first pose data to obtain third pose data, which ensures that the data used for sensor calibration are unified and yields first and third pose data with synchronized timestamps. Because the third pose data are obtained in real time, the sensor is calibrated online, which provides data support for vehicle trajectory estimation and driving-environment construction and improves the accuracy of vehicle navigation.
In an exemplary embodiment, the first conversion unit of the embodiment of the present invention is configured to:
reading laser radar point cloud data in a preset time period according to a preset interval; and matching the read laser radar point cloud data with a preset map through a preset point cloud registration algorithm to obtain first pose data.
In an exemplary embodiment, the second conversion unit of the embodiment of the present invention is configured to:
determining adjacent time stamps and corresponding poses of the first pose data in the second pose data according to the time stamps of the first pose data;
performing pose interpolation on the second pose data according to the determined adjacent timestamps of the first pose data in the second pose data and their corresponding poses, to obtain third pose data; wherein the pose interpolation includes cubic interpolation and/or linear interpolation.
In an exemplary embodiment, the preset map in the embodiment of the present invention includes: map with relative error less than preset threshold.
In an exemplary embodiment, the preset map in the embodiment of the present invention includes: high-precision point cloud maps.
In an exemplary embodiment, the calibration unit according to the embodiment of the present invention is configured to: setting a conversion matrix for converting the first pose data into third pose data; and solving the conversion matrix through a preset cost function to realize the sensor calibration of the vehicle.
"one of ordinary skill in the art will appreciate that all or some of the steps, systems, functional modules/units in the apparatus, and methods disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between the functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed cooperatively by several physical components. Some or all of the components may be implemented as software executed by a processor, such as a digital signal processor or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as known to those skilled in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. Furthermore, as is well known to those of ordinary skill in the art, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. "
Claims (6)
1. A method of sensor calibration, comprising:
matching the laser radar point cloud data in the preset duration with a preset map to obtain first pose data;
converting the second pose data obtained by the inertial navigation device INS into third pose data aligned with the obtained first pose data;
according to the first pose data and the third pose data obtained by conversion, calibrating a sensor of the vehicle;
the converting the second pose data obtained by the INS into third pose data aligned with the obtained first pose data includes:
determining adjacent time stamps and corresponding poses of the first pose data in the second pose data according to the time stamps of the first pose data;
performing pose interpolation on the second pose data according to the determined adjacent timestamps of the first pose data in the second pose data and their corresponding poses, to obtain third pose data;
wherein the pose interpolation includes cubic interpolation and/or linear interpolation;
the step of matching the laser radar point cloud data in the preset time period with a preset map to obtain first pose data comprises the following steps:
reading laser radar point cloud data in the preset time according to a preset interval;
matching the read laser radar point cloud data with the preset map through a preset point cloud registration algorithm to obtain the first pose data;
the sensor calibration of the vehicle comprises the following steps:
setting a conversion matrix for converting the first pose data into the third pose data;
solving the conversion matrix through a preset cost function to realize the sensor calibration of the vehicle; the solving the conversion matrix through the preset cost function comprises the following steps:
solving the expanded conversion matrix to obtain an equation pair;
converting the cost function into a cost matrix according to the obtained equation pair;
performing parameter adjustment on the conversion matrix according to the obtained cost matrix, thereby solving the conversion matrix;
wherein the equation pair is $R^W_L R^L_I = R^W_I$ and $R^W_L t^L_I + t^W_L = t^W_I$, and the cost function is $\min_{R^L_I,\,t^L_I} \| R^W_L R^L_I - R^W_I \|^2 + \| R^W_L t^L_I + t^W_L - t^W_I \|^2$; $R^W_L$ represents the pose rotation of the lidar coordinate system relative to the preset map coordinate system, $R^L_I$ the pose rotation of the INS coordinate system relative to the lidar coordinate system, $R^W_I$ the pose rotation of the INS coordinate system relative to the preset map coordinate system, $t^W_L$ the pose translation of the lidar coordinate system relative to the preset map coordinate system, $t^L_I$ the pose translation of the INS coordinate system relative to the lidar coordinate system, and $t^W_I$ the pose translation of the INS coordinate system relative to the preset map coordinate system.
2. The method of claim 1, wherein the preset map comprises: map with relative error less than preset threshold.
3. The method of claim 2, wherein the preset map comprises: high-precision point cloud maps.
4. A computer storage medium having stored therein a computer program which, when executed by a processor, implements a method of sensor calibration according to any one of claims 1 to 3.
5. A terminal, comprising: a memory and a processor, the memory storing a computer program; wherein:
the processor is configured to execute the computer program in the memory;
the computer program, when executed by the processor, implements a method of sensor calibration as claimed in any one of claims 1 to 3.
6. A device for sensor calibration, comprising a first conversion unit, a second conversion unit, and a calibration unit; wherein:
the first conversion unit is configured to: matching the laser radar point cloud data in the preset duration with a preset map to obtain first pose data;
the second conversion unit is configured to: converting the second pose data obtained by the inertial navigation device INS into third pose data aligned with the obtained first pose data;
the calibration unit is configured to: calibrate a sensor of the vehicle according to the first pose data and the third pose data obtained by conversion;
the second conversion unit is configured to:
determining, according to the time stamps of the first pose data, the adjacent time stamps of the first pose data in the second pose data and their corresponding poses;
performing attitude interpolation on the second pose data according to the determined adjacent time stamps and their corresponding poses, to obtain the third pose data;
wherein the attitude interpolation comprises: cubic interpolation and/or linear interpolation;
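The interpolation described above can be sketched in code. The following is an illustrative Python sketch, not taken from the patent: it assumes INS attitudes are stored as quaternions, and uses linear interpolation for position with spherical linear interpolation (slerp) for attitude via SciPy; all function and variable names are hypothetical.

```python
# Illustrative sketch of the attitude-interpolation step (names are
# hypothetical, not from the patent): align INS poses (second pose data)
# to the lidar timestamps of the first pose data, using linear
# interpolation for position and slerp for attitude.
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def interpolate_ins_poses(ins_times, ins_quats, ins_trans, lidar_times):
    """Interpolate INS poses at the lidar timestamps.

    ins_times:   (N,) increasing INS timestamps
    ins_quats:   (N, 4) INS attitudes as quaternions (x, y, z, w)
    ins_trans:   (N, 3) INS positions
    lidar_times: (M,) query timestamps inside [ins_times[0], ins_times[-1]]
    """
    slerp = Slerp(ins_times, Rotation.from_quat(ins_quats))
    quats = slerp(lidar_times).as_quat()               # slerp for attitude
    trans = np.stack([np.interp(lidar_times, ins_times, ins_trans[:, k])
                      for k in range(3)], axis=1)      # linear for position
    return quats, trans

# Two INS samples bracketing one lidar timestamp:
ins_t = np.array([0.0, 1.0])
ins_q = Rotation.from_euler("z", [0.0, 90.0], degrees=True).as_quat()
ins_p = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
q, p = interpolate_ins_poses(ins_t, ins_q, ins_p, np.array([0.5]))
# Midpoint: 45-degree yaw, position (0.5, 0, 0)
```

Slerp interpolates along the geodesic on the rotation manifold, which avoids the distortion that component-wise linear interpolation of rotation parameters would introduce between the two INS samples.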
the matching of the laser radar point cloud data within the preset duration with the preset map to obtain the first pose data comprises the following steps:
reading the laser radar point cloud data within the preset duration at a preset interval;
matching the read laser radar point cloud data with the preset map through a preset point cloud registration algorithm to obtain the first pose data;
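The claim only requires "a preset point cloud registration algorithm" (ICP and NDT are common choices). As an illustrative sketch of the core alignment step such algorithms repeat, the closed-form Kabsch/SVD solution below recovers the rigid transform between a lidar scan and matched map points, assuming point correspondences are already known; all names are hypothetical.

```python
# Illustrative sketch (hypothetical names): the closed-form Kabsch/SVD step
# at the core of rigid point cloud registration algorithms such as ICP.
# It assumes correspondences between scan and map points are known; a full
# registration algorithm iterates this step while re-estimating them.
import numpy as np

def rigid_align(src, dst):
    """Return R, t minimizing sum_i ||R @ src[i] + t - dst[i]||^2."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Synthetic scan (lidar frame) and map points (map frame):
rng = np.random.default_rng(0)
scan = rng.normal(size=(50, 3))
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([2.0, 0.5, -1.0])
map_pts = scan @ R_true.T + t_true
R_est, t_est = rigid_align(scan, map_pts)  # recovers the scan pose in the map
```

The recovered (R, t) is exactly the first pose data for that scan: the pose of the lidar coordinate system relative to the map coordinate system.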
the sensor calibration of the vehicle comprises the following steps:
setting a conversion matrix for converting the first pose data into the third pose data;
solving the conversion matrix through a preset cost function, so as to realize the sensor calibration of the vehicle; wherein the solving of the conversion matrix through the preset cost function comprises the following steps:
expanding the conversion matrix to obtain an equation pair;
substituting the obtained equation pair into the cost function to obtain a cost matrix;
adjusting parameters of the conversion matrix according to the obtained cost matrix, so as to solve the conversion matrix;
wherein the expression of the cost function is:

$$J\left(R_I^L, t_I^L\right)=\left\|R_L^M R_I^L-R_I^M\right\|^2+\left\|R_L^M t_I^L+t_L^M-t_I^M\right\|^2$$

the equation pair is:

$$R_I^M=R_L^M R_I^L, \qquad t_I^M=R_L^M t_I^L+t_L^M$$

where $R_L^M$ represents the pose rotation of the laser radar coordinate system relative to the preset map coordinate system, $R_I^L$ represents the pose rotation of the coordinate system of the inertial navigation device relative to the coordinate system of the lidar, $R_I^M$ represents the pose rotation of the coordinate system of the inertial navigation device relative to the preset map coordinate system, $t_L^M$ represents the pose translation of the lidar coordinate system relative to the preset map coordinate system, $t_I^L$ represents the pose translation of the coordinate system of the inertial navigation device relative to the coordinate system of the lidar, and $t_I^M$ represents the pose translation of the coordinate system of the inertial navigation device relative to the preset map coordinate system.
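To illustrate what solving the conversion matrix achieves: with first pose data T_lm (lidar relative to map, from registration) and third pose data T_im (INS relative to map, interpolated to the same timestamps), the conversion matrix T_il (INS relative to lidar) satisfies T_im = T_lm · T_il at every timestamp. The sketch below is not the claim's iterative cost-matrix procedure; it is an illustrative least-squares alternative that averages per-timestamp closed-form estimates, with hypothetical names and an assumed chordal averaging scheme.

```python
# Illustrative extrinsic solve (hypothetical, not the patent's iterative
# procedure): average per-timestamp closed-form estimates of T_il from
# paired 4x4 poses T_lm (lidar in map) and T_im (INS in map).
import numpy as np

def solve_extrinsic(T_lm_list, T_im_list):
    """Estimate T_il such that T_im ~= T_lm @ T_il for every pair."""
    Rs, ts = [], []
    for T_lm, T_im in zip(T_lm_list, T_im_list):
        T_il = np.linalg.inv(T_lm) @ T_im        # per-pair closed form
        Rs.append(T_il[:3, :3])
        ts.append(T_il[:3, 3])
    # Chordal rotation averaging: project the mean matrix back onto SO(3).
    U, _, Vt = np.linalg.svd(np.mean(Rs, axis=0))
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    T_il = np.eye(4)
    T_il[:3, :3] = R
    T_il[:3, 3] = np.mean(ts, axis=0)
    return T_il

# Noiseless check: build poses from a known extrinsic and recover it.
def make_pose(yaw, t):
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    T[:3, 3] = t
    return T

T_il_true = make_pose(0.3, [0.5, -0.2, 1.1])
T_lm = [make_pose(a, [a, 2 * a, 0.0]) for a in (0.1, 0.7, 1.3)]
T_im = [T @ T_il_true for T in T_lm]
T_il_est = solve_extrinsic(T_lm, T_im)
```

With noisy real data, each per-pair estimate differs slightly, and the averaging (or the cost-function minimization described in the claims) reconciles them into a single calibration.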
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110360085.4A CN112964291B (en) | 2021-04-02 | 2021-04-02 | Sensor calibration method, device, computer storage medium and terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112964291A | 2021-06-15 |
CN112964291B | 2023-07-14 |
Family
ID=76280834
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110360085.4A Active CN112964291B (en) | 2021-04-02 | 2021-04-02 | Sensor calibration method, device, computer storage medium and terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112964291B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113465608B * | 2021-07-22 | 2024-05-03 | Suzhou Automotive Research Institute, Tsinghua University (Wujiang) | Road side sensor calibration method and system |
CN113640778B * | 2021-08-12 | 2024-09-03 | Dongfeng Yuexiang Technology Co., Ltd. | Multi-laser radar combined calibration method based on non-overlapping view fields |
CN113759349B * | 2021-09-22 | 2022-10-04 | Apollo Intelligent Technology (Beijing) Co., Ltd. | Calibration method of laser radar and positioning equipment, equipment and autonomous driving vehicle |
CN117367419A * | 2022-06-29 | 2024-01-09 | Shenzhen Hairou Innovation Technology Co., Ltd. | Robot positioning method, apparatus and computer readable storage medium |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2750214B1 (en) * | 1996-06-21 | 1998-09-11 | Thomson Csf | METHOD FOR CALIBRATING POSITIONING ERRORS OF A RADAR AND THE GROUND SPEED DRIFT OF AN INERTIAL POWER PLANT ON BOARD AN AIRCRAFT |
CN109270534B (en) * | 2018-05-07 | 2020-10-27 | 西安交通大学 | Intelligent vehicle laser sensor and camera online calibration method |
CN108765498B (en) * | 2018-05-30 | 2019-08-23 | 百度在线网络技术(北京)有限公司 | Monocular vision tracking, device and storage medium |
CN109901139B (en) * | 2018-12-28 | 2023-07-04 | 文远知行有限公司 | Laser radar calibration method, device, equipment and storage medium |
CN109901138B (en) * | 2018-12-28 | 2023-07-04 | 文远知行有限公司 | Laser radar calibration method, device, equipment and storage medium |
CN111678533A (en) * | 2019-03-11 | 2020-09-18 | 武汉小狮科技有限公司 | Calibration method for inertial navigation equipment and laser radar |
CN111538032B (en) * | 2020-05-19 | 2021-04-13 | 北京数字绿土科技有限公司 | Time synchronization method and device based on independent drawing tracks of camera and laser radar |
CN112051590B (en) * | 2020-08-31 | 2021-06-15 | 广州文远知行科技有限公司 | Detection method and related device for laser radar and inertial measurement unit |
CN112489111B (en) * | 2020-11-25 | 2024-01-30 | 深圳地平线机器人科技有限公司 | Camera external parameter calibration method and device and camera external parameter calibration system |
CN112362054B (en) * | 2020-11-30 | 2022-12-16 | 上海商汤临港智能科技有限公司 | Calibration method, calibration device, electronic equipment and storage medium |
- 2021-04-02: Application CN202110360085.4A filed in China; granted as CN112964291B (status: Active)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112964291B (en) | Sensor calibration method, device, computer storage medium and terminal | |
CN112113574B (en) | Method, apparatus, computing device and computer-readable storage medium for positioning | |
CN110927708B (en) | Calibration method, device and equipment of intelligent road side unit | |
CN109435955B (en) | Performance evaluation method, device and equipment for automatic driving system and storage medium | |
CN111947671A (en) | Method, apparatus, computing device and computer-readable storage medium for positioning | |
CN109059906A (en) | Vehicle positioning method, device, electronic equipment, storage medium | |
CN111552757B (en) | Method, device and equipment for generating electronic map and storage medium | |
US20240255305A1 (en) | Vehicle positioning method and apparatus, computer device, and storage medium | |
CN111469781A (en) | Method and apparatus for outputting information | |
CN113494910B (en) | Vehicle positioning method and device based on UWB positioning and storage medium | |
Xian et al. | Fusing stereo camera and low-cost inertial measurement unit for autonomous navigation in a tightly-coupled approach | |
US11353579B2 (en) | Method for indicating obstacle by smart roadside unit | |
CN112034438A (en) | Radar calibration method and device, electronic equipment and storage medium | |
CN114897942B (en) | Point cloud map generation method and device and related storage medium | |
CN115839718B (en) | Fusion positioning method and device based on motion constraint | |
CN108240807B (en) | Method for estimating space occupation | |
CN110082805A (en) | A kind of 3D locating device and method | |
CN113734198B (en) | Method and device for acquiring relative course of target | |
CN112665593B (en) | Vehicle positioning method and device | |
KR102624644B1 (en) | Method of estimating the location of a moving object using vector map | |
Sferlazza et al. | Localization from inertial data and sporadic position measurements | |
CN112747752A (en) | Vehicle positioning method, device, equipment and storage medium based on laser odometer | |
CN108241365B (en) | Method and apparatus for estimating space occupation | |
CN113551678A (en) | Map construction method, high-precision map construction method and mobile device | |
CN117308925B (en) | Navigation method, device, equipment and medium for spectral map inertial navigation combination |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||