CN112964291A - Sensor calibration method and device, computer storage medium and terminal - Google Patents

Sensor calibration method and device, computer storage medium and terminal

Info

Publication number
CN112964291A
CN112964291A (application CN202110360085.4A; granted as CN112964291B)
Authority
CN
China
Prior art keywords
data
posture
attitude
preset
point cloud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110360085.4A
Other languages
Chinese (zh)
Other versions
CN112964291B (en)
Inventor
杨殿阁
黄琪杰
杨蒙蒙
江昆
温拓朴
张晓龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University
Priority to CN202110360085.4A; granted as patent CN112964291B
Publication of CN112964291A
Application granted
Publication of CN112964291B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01D: MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D18/00: Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass; initial alignment, calibration or starting-up of inertial devices
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S7/4802: Details of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00: Road transport of goods or passengers
    • Y02T10/10: Internal combustion engine [ICE] based vehicles
    • Y02T10/40: Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manufacturing & Machinery (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the invention discloses a sensor calibration method and apparatus, a computer storage medium, and a terminal. The method comprises: matching lidar point cloud data collected within a preset time length against a preset map to obtain first pose data; and aligning second pose data obtained from an inertial navigation system (INS) with the first pose data to obtain third pose data, which ensures that the data used for sensor calibration are unified and that the first and third pose data have synchronized timestamps. Online sensor calibration is achieved from the third pose data obtained in real time, which provides data support for vehicle trajectory estimation and driving-environment construction and improves the accuracy of vehicle navigation.

Description

Sensor calibration method and device, computer storage medium and terminal
Technical Field
The present disclosure relates to, but not limited to, navigation technologies, and more particularly, to a method, an apparatus, a computer storage medium, and a terminal for calibrating a sensor.
Background
Sensors are key components of intelligent vehicles, and sensor calibration is a prerequisite for automated driving and vehicle interconnection. The accuracy of sensors and their perception of the road are therefore important issues for autonomous driving. To obtain the driving environment of a vehicle, the calibration accuracy of the sensors must first be ensured so that high-fidelity information can be acquired. Sensor calibration in the related art relies on checkerboard targets and a dedicated calibration room, which requires substantial labor and material cost.
Sensor calibration is a core topic in the rapid development of autonomous vehicles. The necessary calibration of an inertial measurement unit (IMU) is generally implemented as hand-eye calibration, in which the two matrices involved (the transformation matrix and a matrix determined by a cost function) can be found by a closed-form or iterative solution. An IMU is a device that measures the three-axis attitude angles (or angular rates) and the acceleration of an object; it typically comprises three single-axis accelerometers and three single-axis gyroscopes. The accelerometers measure the acceleration of the object along the three independent axes of the carrier coordinate system, while the gyroscopes measure the angular velocity of the carrier relative to the navigation coordinate system. From the acceleration and angular velocity measured in three-dimensional space, the attitude of the object can be calculated.
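As a concrete illustration of that last point, attitude can be propagated by integrating the gyroscope's angular velocity over time. The sketch below is a non-authoritative example, not the patent's implementation: it uses a rotation-matrix representation and Rodrigues' formula, and the function name `integrate_gyro` is invented for illustration.

```python
import numpy as np

def integrate_gyro(R, omega, dt):
    """One dead-reckoning step: update the attitude rotation matrix R
    from the body-frame angular velocity omega (rad/s) over dt seconds."""
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return R                      # negligible rotation this step
    axis = omega / np.linalg.norm(omega)
    # Skew-symmetric cross-product matrix of the rotation axis
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    # Rodrigues' formula for the incremental rotation
    dR = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
    return R @ dR
```

Calling this once per gyro sample accumulates the attitude; in practice the result drifts without the aiding described later in the document.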
To address the difficulties of data association and motion compensation, practitioners generally use linear interpolation or B-spline parameterization to calibrate the lidar odometry offline. Offline calibration, however, cannot be computed continuously and cannot be performed in real time while the vehicle is moving; as a result the real-time driving environment of the vehicle cannot be obtained, which degrades the accuracy of vehicle navigation.
Disclosure of Invention
The following is a summary of the subject matter described in detail herein. This summary is not intended to limit the scope of the claims.
The embodiments of the invention provide a sensor calibration method and apparatus, a computer storage medium, and a terminal, which make it possible to acquire the real-time driving environment of a vehicle and improve the accuracy of vehicle navigation.
The embodiment of the invention provides a sensor calibration method, which comprises the following steps:
matching lidar point cloud data within a preset time length against a preset map to obtain first pose data;
converting second pose data obtained from the inertial navigation system (INS) into third pose data aligned with the obtained first pose data;
and calibrating the sensor of the vehicle according to the first pose data and the converted third pose data.
On the other hand, an embodiment of the present invention further provides a computer storage medium in which a computer program is stored; when the computer program is executed by a processor, the sensor calibration method described above is implemented.
In another aspect, an embodiment of the present invention further provides a terminal, comprising a memory and a processor, the memory having a computer program stored therein; wherein
the processor is configured to execute the computer program in the memory;
and the computer program, when executed by the processor, implements the sensor calibration method described above.
In another aspect, an embodiment of the present invention further provides a sensor calibration apparatus, comprising a first conversion unit, a second conversion unit, and a calibration unit; wherein
the first conversion unit is configured to match lidar point cloud data within a preset time length against a preset map to obtain first pose data;
the second conversion unit is configured to convert second pose data obtained from the inertial navigation system (INS) into third pose data aligned with the obtained first pose data;
and the calibration unit is configured to calibrate the sensor of the vehicle according to the first pose data and the converted third pose data.
By matching the lidar point cloud data against a preset map, first pose data are obtained; second pose data obtained from the inertial navigation system (INS) are aligned with the first pose data to obtain third pose data, ensuring that the data used for sensor calibration are unified and that the first and third pose data have synchronized timestamps. Online sensor calibration is achieved from the third pose data obtained in real time, which provides data support for vehicle trajectory estimation and driving-environment construction and improves the accuracy of vehicle navigation.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification; they illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention without limiting it.
FIG. 1 is a flow chart of a method of sensor calibration according to an embodiment of the present invention;
fig. 2 is a block diagram of a sensor calibration apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail below with reference to the accompanying drawings. It should be noted that the embodiments and features of the embodiments in the present application may be arbitrarily combined with each other without conflict.
The steps illustrated in the flow charts of the figures may be performed in a computer system such as a set of computer-executable instructions. Also, while a logical order is shown in the flow diagrams, in some cases, the steps shown or described may be performed in an order different than here.
Fig. 1 is a flowchart of a method for calibrating a sensor according to an embodiment of the present invention, as shown in fig. 1, including:
Step 101: matching lidar point cloud data within a preset time length against a preset map to obtain first pose data.
in an exemplary embodiment, the presetting of the map according to the embodiment of the present invention includes: and the relative error is smaller than a preset threshold value. In an exemplary example, the map of the preset threshold in the embodiment of the present invention may include a map with a relative error of less than 30 centimeters; the relative error is a definition of accuracy evaluation well known to those skilled in the art and will not be described herein.
In an exemplary embodiment, the preset map includes a high-precision point cloud map.
In an illustrative example, the high-precision point cloud map has a relative error of less than 20 centimeters.
In an exemplary embodiment, matching the lidar point cloud data within a preset time length against the preset map in step 101 includes:
reading the lidar point cloud data within the preset time length at a preset interval;
and matching the read lidar point cloud data against the preset map through a preset point cloud registration algorithm to obtain the first pose data.
In an illustrative example, the point cloud registration algorithm includes the Normal Distributions Transform (NDT) or the Iterative Closest Point (ICP) algorithm.
NDT is a probability-based method: the lidar point cloud data are modeled as normal distributions through a probability density function (PDF) and then matched. The point cloud is divided into grid cells, and for each cell a probability density function defined by a mean vector and a covariance matrix is computed. In an illustrative example, the preset interval may be chosen by a person skilled in the art according to the loading and computation speed of the server implementing the method; in principle, as long as that speed is guaranteed, a smaller preset interval allows more lidar point cloud data to be processed and yields richer data for sensor calibration.
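The per-cell statistics that NDT relies on can be sketched as follows. This is a minimal illustration of the grid-cell mean and covariance computation, not the patent's implementation; the helper name `ndt_cell_stats` and the default `cell_size` are assumptions.

```python
import numpy as np

def ndt_cell_stats(points, cell_size=1.0):
    """Group 3-D lidar points into grid cells and compute, for each cell,
    the mean vector and covariance matrix of its normal-distribution model."""
    cells = {}
    for p in points:
        key = tuple(np.floor(p / cell_size).astype(int))  # integer cell index
        cells.setdefault(key, []).append(p)
    stats = {}
    for key, pts in cells.items():
        pts = np.asarray(pts)
        if len(pts) < 3:          # too few points for a stable covariance
            continue
        mean = pts.mean(axis=0)
        cov = np.cov(pts.T)       # 3x3 covariance of the cell's points
        stats[key] = (mean, cov)
    return stats
```

A full NDT matcher would then score candidate poses by evaluating the scan points against these per-cell Gaussians and optimize the pose over that score.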
Step 102: converting second pose data obtained from the inertial navigation system (INS) into third pose data aligned with the obtained first pose data.
In an illustrative example, converting the second pose data obtained by the INS into third pose data aligned with the obtained first pose data comprises:
determining, according to the timestamps of the first pose data, the adjacent timestamps in the second pose data and the corresponding poses;
performing pose interpolation on the second pose data according to the determined adjacent timestamps and corresponding poses to obtain the third pose data;
wherein the pose interpolation includes cubic interpolation and/or linear interpolation.
The embodiment of the invention establishes the data association between the first pose data and the second pose data through pose interpolation.
It should be noted that the difference in sampling frequency between the first pose data and the second pose data is handled by pose interpolation, and the data association between them can be established through the interpolated third pose data.
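The interpolation between adjacent INS samples can be sketched as below, assuming linear interpolation for position and spherical linear interpolation (slerp) for attitude quaternions. The function names and the quaternion convention (x, y, z, w) are illustrative choices, not taken from the patent.

```python
import numpy as np

def slerp(q0, q1, u):
    """Spherical linear interpolation between unit quaternions (x, y, z, w)."""
    dot = np.dot(q0, q1)
    if dot < 0.0:               # take the shorter arc on the quaternion sphere
        q1, dot = -q1, -dot
    if dot > 0.9995:            # nearly parallel: fall back to normalized lerp
        q = q0 + u * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - u) * theta) * q0 + np.sin(u * theta) * q1) / np.sin(theta)

def interpolate_pose(t, t0, t1, p0, p1, q0, q1):
    """Interpolate an INS pose at lidar timestamp t between the two
    adjacent INS samples (t0, p0, q0) and (t1, p1, q1)."""
    u = (t - t0) / (t1 - t0)
    position = (1 - u) * p0 + u * p1          # linear position interpolation
    quat = slerp(np.asarray(q0, float), np.asarray(q1, float), u)
    return position, quat
```

Evaluating this at every first-pose timestamp yields third pose data whose timestamps are synchronized with the lidar poses.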
Step 103: calibrating the sensor of the vehicle according to the first pose data and the converted third pose data.
It should be noted that the third pose data in the embodiment of the present invention include data obtained by a global navigation satellite system (GNSS) and an inertial measurement unit.
By matching the lidar point cloud data against a preset map, first pose data are obtained; second pose data obtained from the INS are aligned with the first pose data to obtain third pose data, ensuring that the data used for sensor calibration are unified and that the first and third pose data have synchronized timestamps. Online sensor calibration is achieved from the third pose data obtained in real time, which provides data support for vehicle trajectory estimation and driving-environment construction and improves the accuracy of vehicle navigation.
In one illustrative example, performing sensor calibration of the vehicle comprises:
setting a transformation matrix for converting the first pose data into the third pose data;
and solving the transformation matrix through a preset cost function to calibrate the sensor of the vehicle.
In an illustrative example, the transformation matrix $T^L_I$ that converts the first pose data into the third pose data satisfies the following formula:

$$T^W_L \, T^L_I = T^W_I$$

wherein $T^W_L$ represents the first pose data and $T^W_I$ represents the third pose data.
In one illustrative example, converting the first pose data into the third pose data comprises:
expanding the determined transformation matrices by rotation matrices and translation vectors, the expanded transformation being:

$$\begin{bmatrix} R^W_L & t^W_L \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R^L_I & t^L_I \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} R^W_I & t^W_I \\ 0 & 1 \end{bmatrix}$$

and converting the first pose data into the third pose data based on the expanded transformation;
wherein $R$ denotes pose rotation, $t$ denotes pose translation, $W$ denotes the coordinate system of the preset map, $L$ denotes the lidar coordinate system, and $I$ denotes the coordinate system of the inertial navigation system; $R^W_L$ denotes the pose rotation of the lidar coordinate system relative to the preset map coordinate system, $R^L_I$ the pose rotation of the INS coordinate system relative to the lidar coordinate system, $R^W_I$ the pose rotation of the INS coordinate system relative to the preset map coordinate system, $t^W_L$ the pose translation of the lidar coordinate system relative to the preset map coordinate system, $t^L_I$ the pose translation of the INS coordinate system relative to the lidar coordinate system, and $t^W_I$ the pose translation of the INS coordinate system relative to the preset map coordinate system. The extrinsic parameters describe the relative position relationship between two coordinate systems: through them, one coordinate system can be transformed onto another so that the origins coincide and the coordinate axes point in the same directions. Subscripts $3\times1$ and $3\times3$ indicate matrix sizes; a matrix written without such a subscript is the same as the one written with it.
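The expanded transformation and its chaining can be sketched with 4x4 homogeneous matrices; the helper name `make_T` is illustrative, and the comment spells out the rotation and translation identities that the matrix product implies.

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Chaining: pose of the INS in the map frame equals the pose of the lidar in the
# map frame composed with the INS-to-lidar extrinsic:
#   T_WI = T_WL @ T_LI,  so  R_WI = R_WL @ R_LI  and  t_WI = R_WL @ t_LI + t_WL
```

For example, composing a 90-degree yaw with a translation shows the translation part being rotated before it is added, exactly as the expanded product dictates.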
In an exemplary embodiment, the preset map and the GNSS use the same coordinate system; in an exemplary embodiment, this coordinate system may be the Universal Transverse Mercator (UTM) coordinate system. In an illustrative example, the first pose data may include vehicle trajectory data generated based on the preset map.
The cost function of the embodiments of the present invention may be determined by those skilled in the art with reference to cost functions used for offline sensor calibration in the related art.
In an exemplary embodiment, solving the transformation matrix through the preset cost function includes:
expanding the transformation matrix product to obtain an equation pair;
converting the cost function into a cost matrix according to the obtained equation pair;
and adjusting the parameters of the transformation matrix according to the obtained cost matrix to solve the transformation matrix;
wherein the cost function, accumulated over the synchronized pose pairs indexed by $k$, may be expressed as:

$$J_R = \sum_k \left\| R^W_{L,k} \, R^L_I - R^W_{I,k} \right\|^2$$

$$J_t = \sum_k \left\| R^W_{L,k} \, t^L_I + t^W_{L,k} - t^W_{I,k} \right\|^2$$

and the equation pair is:

$$R^W_L \, R^L_I = R^W_I \quad\text{and}\quad R^W_L \, t^L_I + t^W_L = t^W_I$$

where $R^W_L$ denotes the pose rotation of the lidar coordinate system relative to the preset map coordinate system, $R^L_I$ the pose rotation of the INS coordinate system relative to the lidar coordinate system, $R^W_I$ the pose rotation of the INS coordinate system relative to the preset map coordinate system, $t^W_L$ the pose translation of the lidar coordinate system relative to the preset map coordinate system, $t^L_I$ the pose translation of the INS coordinate system relative to the lidar coordinate system, and $t^W_I$ the pose translation of the INS coordinate system relative to the preset map coordinate system.
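One way the equation pair can be solved in least squares is sketched below: an orthogonal-Procrustes (SVD) solution for the rotation and a linear least-squares solution for the translation. The patent does not specify this particular solver, so `solve_extrinsic` and its closed-form approach are assumptions for illustration only.

```python
import numpy as np

def solve_extrinsic(R_WL_list, t_WL_list, R_WI_list, t_WI_list):
    """Least-squares extrinsic from the pair  R_WL R_LI = R_WI  and
    R_WL t_LI + t_WL = t_WI, accumulated over all synchronized poses."""
    # Rotation: orthogonal-Procrustes solution via SVD of the accumulated matrix.
    M = sum(R_WL.T @ R_WI for R_WL, R_WI in zip(R_WL_list, R_WI_list))
    U, _, Vt = np.linalg.svd(M)
    R_LI = U @ Vt
    if np.linalg.det(R_LI) < 0:          # enforce a proper rotation (det = +1)
        U[:, -1] *= -1
        R_LI = U @ Vt
    # Translation: stack  R_WL t_LI = t_WI - t_WL  and solve in least squares.
    A = np.vstack(R_WL_list)
    b = np.concatenate([t_WI - t_WL for t_WI, t_WL in zip(t_WI_list, t_WL_list)])
    t_LI, *_ = np.linalg.lstsq(A, b, rcond=None)
    return R_LI, t_LI
```

With noise-free synchronized poses this recovers the extrinsic exactly; with real data the residuals of the two equations play the role of the cost function above.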
An embodiment of the present invention further provides a computer storage medium in which a computer program is stored; when the computer program is executed by a processor, the sensor calibration method described above is implemented.
An embodiment of the present invention further provides a terminal, comprising a memory and a processor, the memory having a computer program stored therein; wherein
the processor is configured to execute the computer program in the memory;
the computer program, when executed by a processor, implements a method for sensor calibration as described above.
Fig. 2 is a block diagram of a sensor calibration apparatus according to an embodiment of the present invention; as shown in Fig. 2, the apparatus comprises a first conversion unit, a second conversion unit, and a calibration unit; wherein
the first conversion unit is configured to match lidar point cloud data within a preset time length against a preset map to obtain first pose data;
the second conversion unit is configured to convert second pose data obtained by the INS into third pose data aligned with the obtained first pose data;
and the calibration unit is configured to calibrate the sensor of the vehicle according to the first pose data and the converted third pose data.
By matching the lidar point cloud data against a preset map, first pose data are obtained; second pose data obtained from the INS are aligned with the first pose data to obtain third pose data, ensuring that the data used for sensor calibration are unified and that the first and third pose data have synchronized timestamps. Online sensor calibration is achieved from the third pose data obtained in real time, which provides data support for vehicle trajectory estimation and driving-environment construction and improves the accuracy of vehicle navigation.
In an exemplary embodiment, the first conversion unit is configured to:
read the lidar point cloud data within the preset time length at a preset interval, and match the read lidar point cloud data against the preset map through a preset point cloud registration algorithm to obtain the first pose data.
In an exemplary embodiment, the second conversion unit is configured to:
determine, according to the timestamps of the first pose data, the adjacent timestamps in the second pose data and the corresponding poses;
and perform pose interpolation on the second pose data according to the determined adjacent timestamps and corresponding poses to obtain the third pose data; wherein the pose interpolation includes cubic interpolation and/or linear interpolation.
In an exemplary embodiment, the preset map includes a map whose relative error is smaller than a preset threshold.
In an exemplary embodiment, the preset map includes a high-precision point cloud map.
In an exemplary embodiment, the calibration unit is configured to: set a transformation matrix for converting the first pose data into the third pose data, and solve the transformation matrix through a preset cost function to calibrate the sensor of the vehicle.
"one of ordinary skill in the art will appreciate that all or some of the steps of the methods, systems, functional modules/units in the devices disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the components may be implemented as software executed by a processor, such as a digital signal processor or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as is well known to those of ordinary skill in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can accessed by a computer. 
In addition, communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and include any information delivery media, as is well known to those of ordinary skill in the art.

Claims (10)

1. A sensor calibration method, comprising:
matching lidar point cloud data within a preset time length against a preset map to obtain first pose data;
converting second pose data obtained from an inertial navigation system (INS) into third pose data aligned with the obtained first pose data;
and calibrating a sensor of a vehicle according to the first pose data and the converted third pose data.
2. The method according to claim 1, wherein matching the lidar point cloud data within the preset time length against the preset map to obtain the first pose data comprises:
reading the lidar point cloud data within the preset time length at a preset interval;
and matching the read lidar point cloud data against the preset map through a preset point cloud registration algorithm to obtain the first pose data.
3. The method of claim 1, wherein converting the second pose data obtained by the INS into the third pose data aligned with the obtained first pose data comprises:
determining, according to the timestamps of the first pose data, the adjacent timestamps and corresponding poses in the second pose data;
performing pose interpolation on the second pose data according to the determined adjacent timestamps and corresponding poses to obtain the third pose data;
wherein the pose interpolation comprises cubic interpolation and/or linear interpolation.
4. The method according to any one of claims 1 to 3, wherein the preset map comprises a map whose relative error is smaller than a preset threshold.
5. The method of claim 4, wherein the preset map comprises a high-precision point cloud map.
6. The method according to any one of claims 1 to 3, wherein calibrating the sensor of the vehicle comprises:
setting a transformation matrix for converting the first pose data into the third pose data;
and solving the transformation matrix through a preset cost function to calibrate the sensor of the vehicle.
7. A computer storage medium having stored thereon a computer program which, when executed by a processor, carries out the sensor calibration method according to any one of claims 1 to 6.
8. A terminal, comprising: a memory and a processor, the memory having a computer program stored therein; wherein,
the processor is configured to execute the computer program in the memory;
the computer program, when executed by the processor, implements a method for sensor calibration as claimed in any one of claims 1-6.
9. An apparatus for sensor calibration, comprising: a first conversion unit, a second conversion unit and a calibration unit; wherein,
the first conversion unit is configured to: match lidar point cloud data within a preset time period with a preset map to obtain first pose data;
the second conversion unit is configured to: convert second pose data obtained by the inertial navigation system (INS) into third pose data aligned with the obtained first pose data;
the calibration unit is configured to: calibrate the sensor of the vehicle according to the first pose data and the third pose data obtained by the conversion.
10. The apparatus of claim 9, wherein the second conversion unit is configured to:
determine, according to the timestamps of the first pose data, the adjacent timestamps in the second pose data and their corresponding poses;
perform pose interpolation on the second pose data according to the determined adjacent timestamps and corresponding poses to obtain the third pose data;
wherein the pose interpolation comprises: cubic interpolation and/or linear interpolation.
CN202110360085.4A 2021-04-02 2021-04-02 Sensor calibration method, device, computer storage medium and terminal Active CN112964291B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110360085.4A CN112964291B (en) 2021-04-02 2021-04-02 Sensor calibration method, device, computer storage medium and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110360085.4A CN112964291B (en) 2021-04-02 2021-04-02 Sensor calibration method, device, computer storage medium and terminal

Publications (2)

Publication Number Publication Date
CN112964291A true CN112964291A (en) 2021-06-15
CN112964291B CN112964291B (en) 2023-07-14

Family

ID=76280834

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110360085.4A Active CN112964291B (en) 2021-04-02 2021-04-02 Sensor calibration method, device, computer storage medium and terminal

Country Status (1)

Country Link
CN (1) CN112964291B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5883593A (en) * 1996-06-21 1999-03-16 Thomson-Csf Method for the calibration of the positioning errors of a radar and the drift in ground speed of an inertial unit on board an aircraft
CN108765498A (en) * 2018-05-30 2018-11-06 百度在线网络技术(北京)有限公司 Monocular vision tracking, device and storage medium
CN109270534A (en) * 2018-05-07 2019-01-25 西安交通大学 A kind of intelligent vehicle laser sensor and camera online calibration method
CN109901139A (en) * 2018-12-28 2019-06-18 文远知行有限公司 Laser radar scaling method, device, equipment and storage medium
CN109901138A (en) * 2018-12-28 2019-06-18 文远知行有限公司 Laser radar scaling method, device, equipment and storage medium
CN111538032A (en) * 2020-05-19 2020-08-14 北京数字绿土科技有限公司 Time synchronization method and device based on independent drawing tracks of camera and laser radar
CN111678533A (en) * 2019-03-11 2020-09-18 武汉小狮科技有限公司 Calibration method for inertial navigation equipment and laser radar
CN112051590A (en) * 2020-08-31 2020-12-08 广州文远知行科技有限公司 Detection method and related device for laser radar and inertial measurement unit
CN112362054A (en) * 2020-11-30 2021-02-12 上海商汤临港智能科技有限公司 Calibration method, calibration device, electronic equipment and storage medium
CN112489111A (en) * 2020-11-25 2021-03-12 深圳地平线机器人科技有限公司 Camera external parameter calibration method and device and camera external parameter calibration system

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113465608A (en) * 2021-07-22 2021-10-01 清华大学苏州汽车研究院(吴江) Calibration method and system for roadside sensor
CN113465608B (en) * 2021-07-22 2024-05-03 清华大学苏州汽车研究院(吴江) Road side sensor calibration method and system
CN113640778A (en) * 2021-08-12 2021-11-12 东风悦享科技有限公司 Multi-laser radar combined calibration method based on non-overlapping view field
CN113759349A (en) * 2021-09-22 2021-12-07 阿波罗智能技术(北京)有限公司 Calibration method and device for laser radar and positioning device and automatic driving vehicle
CN113759349B (en) * 2021-09-22 2022-10-04 阿波罗智能技术(北京)有限公司 Calibration method of laser radar and positioning equipment, and autonomous driving vehicle
WO2024001649A1 (en) * 2022-06-29 2024-01-04 深圳市海柔创新科技有限公司 Robot positioning method, apparatus and computing readable storage medium

Also Published As

Publication number Publication date
CN112964291B (en) 2023-07-14

Similar Documents

Publication Publication Date Title
CN112964291B (en) Sensor calibration method, device, computer storage medium and terminal
CN109059906B (en) Vehicle positioning method and device, electronic equipment and storage medium
CN109270545B (en) Positioning true value verification method, device, equipment and storage medium
CN110927708B (en) Calibration method, device and equipment of intelligent road side unit
EP2133662A2 (en) Methods and system of navigation using terrain features
CN112113574A (en) Method, apparatus, computing device and computer-readable storage medium for positioning
EP2175237B1 (en) System and methods for image-based navigation using line features matching
CN112230242A (en) Pose estimation system and method
JP2021524026A (en) Posture judgment system and method
CN112835085B (en) Method and device for determining vehicle position
CN114111775B (en) Multi-sensor fusion positioning method and device, storage medium and electronic equipment
CN110458885B (en) Positioning system and mobile terminal based on stroke perception and vision fusion
CN112967392A (en) Large-scale park mapping and positioning method based on multi-sensor contact
CN111469781B (en) Method and apparatus for outputting information in an information processing system
CN112362054A (en) Calibration method, calibration device, electronic equipment and storage medium
Xian et al. Fusing stereo camera and low-cost inertial measurement unit for autonomous navigation in a tightly-coupled approach
CN112965076B (en) Multi-radar positioning system and method for robot
CN114897942B (en) Point cloud map generation method and device and related storage medium
KR102506411B1 (en) Method and apparatus for estimation of location and pose on vehicle and record medium for this
CN113034538B (en) Pose tracking method and device of visual inertial navigation equipment and visual inertial navigation equipment
US11353579B2 (en) Method for indicating obstacle by smart roadside unit
CN113538699A (en) Positioning method, device and equipment based on three-dimensional point cloud and storage medium
CN112747752A (en) Vehicle positioning method, device, equipment and storage medium based on laser odometer
CN117308925B (en) Navigation method, device, equipment and medium for spectral map inertial navigation combination
CN110082805A (en) A kind of 3 D locating device and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant