CN109901139B - Laser radar calibration method, device, equipment and storage medium


Info

Publication number: CN109901139B (granted patent; application CN201811623241.6A, published as CN109901139A)
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: data, point cloud, laser radar, calibration, inertial navigation
Legal status: Active (assumed from the record; not a legal conclusion)
Inventors: 冯荻, 雷宇苍, 杜杭肯
Original and current assignee: WeRide Corp
Application filed by WeRide Corp; priority to CN201811623241.6A
Application publication: CN109901139A; grant publication: CN109901139B

Classifications

    • Y02A90/00 — Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The invention relates to a laser radar calibration method, device, equipment and storage medium. A terminal acquires point cloud data and inertial navigation system data of a laser radar to be calibrated in a preset calibration scene; obtains map data corresponding to the calibration scene; obtains a position conversion relation through a calibration algorithm according to the point cloud data of the laser radar to be calibrated, the inertial navigation system data and the map data; and determines the calibration result of the laser radar according to the position conversion relation. Because the terminal obtains the position conversion relation between the laser radar and the inertial navigation system automatically through the calibration algorithm, and then determines the calibration result from that relation, the calibration result is obtained automatically rather than by manual measurement, which improves the calibration efficiency of the laser radar.

Description

Laser radar calibration method, device, equipment and storage medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a laser radar calibration method, device, equipment, and storage medium.
Background
With the development of unmanned driving technology, vehicle-mounted sensors are commonly used to acquire position information around a vehicle, and the autonomous vehicle is then planned, decided or controlled according to the information detected by these sensors.
In general, a vehicle may use multiple sensors to acquire position information about its surroundings. However, because the relative poses (relative positions and orientations) of these sensors differ, the sensors must be calibrated so that the position information they collect can be unified into a single coordinate system; planning, decision-making or control of the autonomous vehicle is then performed on the unified data. This calibration process is the process of obtaining the relative poses between the sensors. Existing laser radar calibration methods generally obtain the relative positions of the laser radar and the other sensors by manual physical measurement, and perform marker matching according to these relative positions to determine the relative pose between the laser radar and the other sensors.
With this approach the laser radar is calibrated by manual physical measurement, and the calibration efficiency is low, especially when a large number of laser radars must be calibrated.
Disclosure of Invention
In view of the above, it is necessary to provide a laser radar calibration method, device, equipment and storage medium that address the problem of low laser radar calibration efficiency.
In a first aspect, a laser radar calibration method includes:
acquiring point cloud data and inertial navigation system data of a laser radar to be calibrated in a preset calibration scene, the calibration scene containing a target reference object;
obtaining map data corresponding to the calibration scene;
obtaining the position conversion relation between the laser radar and the inertial navigation system through a calibration algorithm according to the point cloud data of the laser radar to be calibrated, the inertial navigation system data and the map data, the calibration algorithm being used for matching the point cloud data with the map data in a common coordinate system and then converting the matching result and the inertial navigation system data into data under the same coordinate system; and
determining a calibration result of the laser radar according to the position conversion relation.
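The four claimed steps can be sketched, purely for illustration, as the following Python pipeline. All names here (calibrate_lidar, register, solve_extrinsic, the 4x4 homogeneous pose matrices) are assumptions of this sketch, not terms from the patent:

```python
import numpy as np

def calibrate_lidar(point_clouds, ins_poses, map_cloud, register, solve_extrinsic):
    """Return the lidar->INS position conversion as a 4x4 homogeneous matrix.

    point_clouds    : list of (N_i, 3) arrays from the lidar to be calibrated
    ins_poses       : list of 4x4 world-from-INS poses, one per frame
    map_cloud       : (M, 3) array, map data of the calibration scene
    register        : callable matching a frame to the map -> 4x4 world-from-lidar
    solve_extrinsic : callable reducing per-frame estimates -> a single 4x4
    """
    per_frame = []
    for cloud, T_world_ins in zip(point_clouds, ins_poses):
        # Match this frame's point cloud to the map's coordinate system,
        # then express the result in the INS coordinate system of the frame.
        T_world_lidar = register(cloud, map_cloud)
        T_ins_lidar = np.linalg.inv(T_world_ins) @ T_world_lidar
        per_frame.append(T_ins_lidar)
    # The calibration result is the common lidar->INS transform.
    return solve_extrinsic(per_frame)
```

A caller would plug in a real registration routine (e.g. ICP or NDT, as the later embodiments suggest) for `register` and a least-squares reduction for `solve_extrinsic`.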
In one embodiment, the acquiring the point cloud data and the inertial navigation system data of the laser radar to be calibrated in the preset calibration scene includes:
according to the data acquisition rule, respectively acquiring multi-frame point cloud data of the laser radar to be calibrated and multi-frame inertial navigation system data of the inertial navigation system in a preset calibration scene.
In one embodiment, the obtaining, according to the point cloud data of the laser radar to be calibrated, the inertial navigation system data, and the map data, the position conversion relationship between the laser radar and the inertial navigation system through a calibration algorithm includes:
matching the multi-frame point cloud data to a coordinate system corresponding to the map data to obtain multi-frame map point cloud data;
and converting the multi-frame map point cloud data into point cloud data under a coordinate system corresponding to the multi-frame inertial navigation system data, and acquiring the position conversion relation between the laser radar and the inertial navigation system.
In one embodiment, the matching the multi-frame point cloud data to the coordinate system corresponding to the map data to obtain multi-frame map point cloud data includes:
matching the point cloud data to a coordinate system corresponding to the map data through a point cloud algorithm to obtain map point cloud data; the point cloud algorithm includes an ICP algorithm, and/or an NDT algorithm.
In one embodiment, the converting the multi-frame map point cloud data into the point cloud data under the coordinate system corresponding to the multi-frame inertial navigation system data, to obtain the position conversion relationship between the laser radar and the inertial navigation system, includes:
enumerating a plurality of relative pose equations according to the multi-frame inertial navigation system data and the multi-frame map point cloud data, where one frame of map point cloud data and the inertial navigation system data of the same frame form a group of data, and each group of data corresponds to one relative pose equation;
calculating the position conversion parameter in each relative pose equation from that equation and its corresponding group of data, so as to obtain a plurality of position conversion parameters;
and determining the position conversion relation according to the plurality of position conversion parameters and the plurality of relative pose equations.
In one embodiment, the determining the position conversion relationship according to the plurality of position conversion parameters and the plurality of relative pose equations includes:
accumulating the plurality of relative pose equations by a least squares method to obtain a target relative pose equation;
reducing the plurality of position conversion parameters through an enumeration algorithm to obtain a target position conversion parameter;
and substituting the target position conversion parameter into the target relative pose equation to obtain the position conversion relation.
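As one hedged illustration of reducing many per-frame position conversion parameters to a single result in a least-squares sense, the sketch below averages a list of per-frame 4x4 rigid transforms. The chordal (SVD-projection) rotation mean used here is an assumption of this sketch, not necessarily the patent's exact least-squares formulation:

```python
import numpy as np

def average_extrinsics(transforms):
    """Least-squares-style mean of a list of 4x4 rigid transforms."""
    Ts = np.asarray(transforms, dtype=float)
    # Translations: the ordinary least-squares mean.
    t_mean = Ts[:, :3, 3].mean(axis=0)
    # Rotations: project the arithmetic mean back onto SO(3) via SVD,
    # which minimizes the chordal least-squares error.
    M = Ts[:, :3, :3].mean(axis=0)
    U, _, Vt = np.linalg.svd(M)
    R = U @ np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))]) @ Vt
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t_mean
    return T
```

The sign correction on the smallest singular direction keeps the result a proper rotation (determinant +1) even when the averaged matrix drifts off the rotation manifold.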
In one embodiment, the determining the calibration result of the laser radar according to the position conversion relationship includes:
determining the position conversion relation itself as the calibration result of the laser radar;
or,
visualizing the position conversion relation and determining the visualized result as the calibration result of the laser radar.
In one embodiment, the target reference object is a reference object for which the reflected signal strength is greater than a preset threshold after the signal emitted by the laser radar reaches the target reference object.
In one embodiment, the preset calibration scene is a scene determined according to the type of the laser radar, the inertial navigation system and the type of the target reference object.
In a second aspect, a laser radar calibration device, the device comprising:
the first acquisition module is used for acquiring point cloud data of the laser radar to be calibrated and inertial navigation system data in a preset calibration scene; the calibration scene is provided with a target reference object;
the second acquisition module is used for acquiring map data corresponding to the calibration scene;
the conversion module is used for acquiring the position conversion relation between the laser radar and the inertial navigation system through a first calibration algorithm according to the point cloud data of the laser radar to be calibrated, the inertial navigation system data and the map data; the first calibration algorithm is used for converting the matching result and the inertial navigation system data into data under the same coordinate system after the coordinate system matching is carried out on the point cloud data and the map data;
and the calibration module is used for determining the calibration result of the laser radar according to the position conversion relation.
In a third aspect, a computer device comprises a memory storing a computer program and a processor which, when executing the computer program, implements the steps of the laser radar calibration method described above.
In a fourth aspect, a computer readable storage medium has stored thereon a computer program which, when executed by a processor, implements the method steps of the lidar calibration method described above.
According to the laser radar calibration method, device, equipment and storage medium, the terminal acquires point cloud data and inertial navigation system data of the laser radar to be calibrated in a preset calibration scene containing a target reference object; obtains map data corresponding to the calibration scene; obtains the position conversion relation between the laser radar and the inertial navigation system through a calibration algorithm according to the point cloud data of the laser radar to be calibrated, the inertial navigation system data and the map data, the calibration algorithm matching the point cloud data with the map data in a common coordinate system and then converting the matching result and the inertial navigation system data into data under the same coordinate system; and determines the calibration result of the laser radar according to the position conversion relation. Because the terminal obtains the position conversion relation automatically through the calibration algorithm and then automatically determines the calibration result from it, manual measurement is avoided and the calibration efficiency of the laser radar is improved.
Drawings
FIG. 1 is a schematic diagram of a lidar calibration application environment provided by one embodiment;
FIG. 2 is a flow chart of a laser radar calibration method according to one embodiment;
FIG. 3 is a flow chart of a laser radar calibration method according to another embodiment;
FIG. 4 is a flow chart of a laser radar calibration method according to another embodiment;
FIG. 5 is a flow chart of a laser radar calibration method according to another embodiment;
FIG. 6 is a flow chart of a laser radar calibration method according to another embodiment;
FIG. 7 is a schematic diagram of a laser radar calibration apparatus according to an embodiment;
FIG. 8 is a schematic diagram of a laser radar calibration apparatus according to another embodiment;
FIG. 9 is a schematic diagram of a laser radar calibration apparatus according to another embodiment;
FIG. 10 is an internal block diagram of a computer device provided by one embodiment.
Detailed Description
With the development of unmanned driving technology, vehicle-mounted sensors are commonly used to acquire position information around a vehicle, so that the autonomous vehicle can be planned, decided or controlled according to the information detected by these sensors. In general, a vehicle may use multiple sensors to acquire position information about its surroundings. However, because the relative poses (relative positions and orientations) of these sensors differ, the sensors must be calibrated so that the position information they collect can be unified into a single coordinate system; planning, decision-making or control of the autonomous vehicle is then performed on the unified data. This calibration process is the process of obtaining the relative poses between the sensors. The laser radar calibration method, device, equipment and storage medium provided by the present application aim to solve the problem of low calibration efficiency.
It should be noted that, the laser radar calibration method provided by the embodiment of the application not only can be applied to an unmanned scene, but also can be applied to a robot navigation scene, and the embodiment of the application does not limit a specific application scene.
The laser radar calibration method provided by this embodiment is applicable to the application environment shown in fig. 1. As shown in fig. 1, the laser radar 10 and the inertial navigation system 20 may be installed at any position on the vehicle, and the calibration result of the laser radar is determined by obtaining the relative position information between the laser radar 10 and the inertial navigation system 20 through a calibration algorithm.
It should be noted that, in the laser radar calibration method provided in the embodiment of the present application, the execution body may be a laser radar calibration device, and the device may be implemented in a software, hardware or a combination of software and hardware to be part or all of a computer device for laser radar calibration.
To make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described below with reference to the drawings. The described embodiments are some, but not all, of the embodiments of the present application.
FIG. 2 is a flow chart of a laser radar calibration method according to an embodiment. The embodiment relates to a specific process for automatically acquiring the position conversion relation between a laser radar and an inertial navigation system through a calibration algorithm and further automatically determining the calibration result of the laser radar according to the position conversion relation. As shown in fig. 2, the method comprises the steps of:
s101, acquiring point cloud data and inertial navigation system data of a laser radar to be calibrated in a preset calibration scene; the calibration scene has a target reference object therein.
Specifically, the target reference object may be a reference object that gives the laser radar good point-cloud imaging characteristics; it may be a single reference object or several, which is not limited by this embodiment. For example, the target reference object may be a straight, continuous wall or curb. The preset calibration scene may be any scene that contains a target reference object and allows the inertial navigation system to be positioned normally, for example: an outdoor crossroad with a continuous, straight wall surface on the roadside; an L-shaped route with a continuous, flat wall surface on the roadside; an outdoor crossroad with regular buildings on the roadside; an L-shaped route with orderly buildings on the roadside; a non-perpendicular intersection or an L-shaped route with a suitable typical target on the roadside; a parking lot or open space with a suitable typical target; or a combination of several such scenes. For example, the preset calibration scene may include a straight, continuous road curb in a place where the inertial navigation system is normally connected to the Global Positioning System (GPS), so that the inertial navigation system can be positioned normally. The point cloud data are the reflected signals, carrying azimuth, distance and similar information, produced when the laser radar signal strikes an object surface; the point cloud data may include the position information of a plurality of points together with the intensity of the reflected laser radar signal at each position.
In a specific preset calibration scene, the point cloud data of the laser radar may be acquired by transmitting radar signals to the target reference object and collecting the returns directly; or the laser radar signal may be sent to the target reference object and, after the reflected signal is obtained, a point cloud clustering operation may be applied to obtain spatially averaged point cloud data, which is then denoised to yield the final point cloud data; the embodiments of the present application are not limited in this regard.
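The clustering-then-averaging option described above could, for example, be approximated by a voxel-grid spatial average, which merges all returns falling into the same cubic cell. The cell size and the representation of a point cloud as an (N, 3) array are assumptions of this sketch:

```python
import numpy as np

def voxel_average(points, cell=0.2):
    """Spatially average a point cloud on a cubic grid of side `cell` (meters)."""
    pts = np.asarray(points, dtype=float)
    # Assign each point to a voxel by integer cell coordinates.
    keys = np.floor(pts / cell).astype(int)
    _, inverse, counts = np.unique(keys, axis=0,
                                   return_inverse=True, return_counts=True)
    inverse = inverse.ravel()
    # Sum the points in each occupied voxel, then divide by the count.
    sums = np.zeros((counts.size, 3))
    np.add.at(sums, inverse, pts)
    return sums / counts[:, None]
```

The same grouping could also drop voxels whose count falls below a threshold as a simple noise-reduction step; that threshold is likewise an assumption.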
S102, acquiring map data corresponding to the calibration scene.
Specifically, the map data corresponding to the calibration scene may be high-precision map data, which may include position information of a plurality of points and image information corresponding thereto. When the map data corresponding to the calibration scene is specifically obtained, the map data corresponding to the calibration scene stored in the server may be downloaded through the server, or the map data corresponding to the calibration scene may be obtained in real time through a camera or other devices.
S103, acquiring the position conversion relation between the laser radar and the inertial navigation system through a calibration algorithm according to the point cloud data, the inertial navigation system data and the map data of the laser radar to be calibrated; the calibration algorithm is used for converting the matching result and the inertial navigation system data into data under the same coordinate system after the coordinate system matching is carried out on the point cloud data and the map data.
Specifically, the calibration algorithm matches the point cloud data with the map data in a common coordinate system and then converts the matching result and the inertial navigation system data into data under the same coordinate system. It may be an algorithm that converts the point cloud data of the laser radar to be calibrated into the coordinate system of the inertial navigation system data, an algorithm that converts the inertial navigation system data into the coordinate system of the point cloud data of the laser radar to be calibrated, or an algorithm that converts both into a third-party coordinate system. The position conversion relation is the conversion relation between the coordinate system of the laser radar to be calibrated and the coordinate system of the inertial navigation system.
On the basis of the above embodiment, the position conversion relation may be determined by establishing a conversion relation between the position of the target reference object in the point cloud data of the laser radar to be calibrated and its position in the inertial navigation system data. When establishing this conversion relation, the position conversion relation may be determined by enumerating a coordinate system conversion equation, or by listing a plurality of coordinate conversion equations and selecting a target coordinate conversion equation; the embodiments of the present application are not limited in this regard.
S104, determining a calibration result of the laser radar according to the position conversion relation.
Specifically, the calibration result of the laser radar may be the relative position between the laser radar and the inertial navigation system. On the basis of the embodiment, in the specific process of determining the calibration result of the laser radar according to the position conversion relation, after the position conversion relation is determined, optionally, the position conversion relation may be determined as the calibration result of the laser radar; or, visualizing the position conversion relation, and determining the visualized result as a laser radar calibration result; the embodiments of the present application are not limited in this regard.
According to the laser radar calibration method above, the terminal acquires point cloud data and inertial navigation system data of the laser radar to be calibrated in a preset calibration scene containing a target reference object; obtains map data corresponding to the calibration scene; obtains the position conversion relation between the laser radar and the inertial navigation system through the calibration algorithm according to the point cloud data, the inertial navigation system data and the map data; and determines the calibration result of the laser radar according to the position conversion relation. In this embodiment, the terminal obtains the position conversion relation automatically through the calibration algorithm and then automatically determines the calibration result from it, so the process of obtaining the calibration result by manual measurement is avoided and the calibration efficiency of the laser radar is improved.
Optionally, the target reference object is a reference object for which the reflected signal strength is greater than a preset threshold after the signal emitted by the laser radar reaches the target reference object.
Specifically, the preset threshold may be chosen so that, after the signal emitted by the laser radar reaches the target reference object, the strength of the reflected signal is greater than the minimum signal strength the laser radar can identify. In addition, when the target reference object consists of a plurality of reference objects, these objects are positioned so that the reflected signals the laser radar receives from them do not interfere with one another.
Optionally, the preset calibration scene is a scene determined according to the types of the laser radar, the inertial navigation system and the target reference object.
Specifically, the preset calibration scene may be a scene in which the surrounding environment causes no significant signal interference with the target reference object. In the preset calibration scene, the laser radar acquires the point cloud data of the laser radar to be calibrated from the signal reflected by the target reference object, while the inertial navigation system, connected to the GPS, acquires the position information of the surrounding environment from the GPS signal in real time, i.e., the position information corresponding to a plurality of points. For example, because a corner reflector has good scattering characteristics for laser signals, the target reference object for the laser radar may be a corner reflector; and because the inertial navigation system obtains its data from the received GPS signals, it needs an open field where the GPS signal is not disturbed by the external environment. Accordingly, given the types of the laser radar to be calibrated, the inertial navigation system and the target reference object, the preset calibration scene may be determined to be an outdoor field that causes no obvious electromagnetic interference with the signals the laser radar receives from the corner reflector and in which the inertial navigation system can accurately receive GPS signals.
On the basis of the above embodiment, for step S101, "acquiring point cloud data and inertial navigation system data of a laser radar to be calibrated in a preset calibration scene, the calibration scene having a target reference object", the multi-frame point cloud data of the laser radar to be calibrated and the multi-frame inertial navigation system data of the inertial navigation system may be obtained according to a data acquisition rule. One possible implementation includes: according to the data acquisition rule, respectively acquiring multi-frame point cloud data of the laser radar to be calibrated and multi-frame inertial navigation system data of the inertial navigation system in the preset calibration scene.
Specifically, the data acquisition rule specifies how the point cloud data of the laser radar to be calibrated are obtained through the target reference object: the multiple frames may be acquired statically or while moving. When the multi-frame point cloud data are acquired through the target reference object according to the data acquisition rule, the carrier on which the laser radar is mounted may be stationary, or it may be driving slowly; the embodiments of the present application are not limited in this regard. The carrier may be an autonomous vehicle, a driver-assisted vehicle or a robot, which is likewise not limited by this embodiment.
Fig. 3 is a flow chart of a laser radar calibration method according to another embodiment. The embodiment relates to a specific process for obtaining map point cloud data according to point cloud data and map data of a laser radar to be calibrated, and further obtaining a position conversion relation according to the map point cloud data and inertial navigation system data. As shown in fig. 3, the step S103 "a possible implementation manner of obtaining the position conversion relationship between the laser radar and the inertial navigation system through the calibration algorithm according to the point cloud data, the inertial navigation system data and the map data of the laser radar to be calibrated" includes the following steps:
S201, matching the multi-frame point cloud data to a coordinate system corresponding to the map data to obtain multi-frame map point cloud data.
Specifically, the map point cloud data are the point cloud data obtained by projecting the point cloud data of the laser radar to be calibrated into the coordinate system of the map data. In the process of matching the multi-frame point cloud data to the coordinate system of the map data, each frame of point cloud data may be matched to the map data separately to obtain the multi-frame map point cloud data. Alternatively, the frames of point cloud data acquired at the same moment may be accumulated and averaged, so that the signal intensity at each position is accumulated and the averaged point cloud data is obtained by dividing by the number of frames at that moment; the averaged point cloud data at the other moments are obtained in the same way, and the multiple frames of averaged point cloud data are then matched to the coordinate system of the map data to obtain the multi-frame map point cloud data.
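The accumulate-and-average option can be illustrated as follows; representing a frame as a dict with co-located "xyz" positions and per-point "intensity" values is an assumption of this sketch:

```python
import numpy as np

def accumulate_frames(frames):
    """Average multi-frame returns taken at the same moment.

    Positions are assumed identical (co-located) across frames; the
    intensities are accumulated per position and divided by the frame
    count, as in the averaging option described above.
    """
    stack = np.stack([f["intensity"] for f in frames])
    return {"xyz": frames[0]["xyz"],
            "intensity": stack.sum(axis=0) / len(frames)}
```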
Optionally, matching the point cloud data to a coordinate system corresponding to the map data through a point cloud algorithm to obtain map point cloud data; the point cloud algorithm includes an ICP algorithm, and/or an NDT algorithm.
In particular, the iterative closest point (Iterative Closest Point, ICP) algorithm is a surface-fitting algorithm, a quaternion-based method for registering one point set to another. After the set of nearby points corresponding to the measured point set is determined, a new set of nearby points is calculated; this process is iterated until the objective function formed by the sum of squared residuals no longer changes, at which point the iteration ends. The normal distributions transform (Normal Distribution Transform, NDT) algorithm applies a statistical model to the three-dimensional points and uses standard optimization techniques to determine the optimal match between two point clouds; because it does not compute and match features of corresponding points during registration, its matching time is short. For example, after the corresponding set of nearby points in the map data is determined for the point cloud data set, a new set of nearby points is calculated, and the above process is iterated until the objective function formed by the sum of squared residuals no longer changes; the iteration then ends and the map point cloud data is obtained.
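The iterate-until-the-residual-stops-changing loop described above can be sketched as a basic point-to-point ICP. This is a simplified illustration, not the patent's implementation: correspondences are brute-force nearest neighbours, and each iteration solves the rigid alignment in closed form via SVD (the Kabsch method) rather than quaternions:

```python
import numpy as np

def icp(src, dst, iters=20):
    """Point-to-point ICP: align src (N, 3) onto dst (M, 3).

    Returns the accumulated 4x4 rigid transform. Each iteration:
    find nearest-neighbour correspondences, then solve the rigid
    alignment in closed form and apply it.
    """
    T = np.eye(4)
    pts = src.copy()
    for _ in range(iters):
        # nearest neighbour of every source point in the map cloud
        d2 = ((pts[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        q = dst[d2.argmin(axis=1)]
        # closed-form rigid alignment of pts onto q (Kabsch / SVD)
        p0, q0 = pts.mean(axis=0), q.mean(axis=0)
        H = (pts - p0).T @ (q - q0)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:        # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = q0 - R @ p0
        pts = pts @ R.T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T
    return T
```

In practice a spatial index (k-d tree) replaces the brute-force search, and the loop exits early once the residual stops decreasing, matching the "objective function unchanged" stopping rule described above.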
S202, converting the multi-frame map point cloud data into point cloud data under a coordinate system corresponding to the multi-frame inertial navigation system data, and acquiring the position conversion relation between the laser radar and the inertial navigation system.
Specifically, on the basis of the above embodiment, after the multiple frames of map point cloud data are acquired, the map point cloud data acquired at the same time and position may be converted into the coordinate system corresponding to the matching inertial navigation system data. When converting the multiple frames of map point cloud data into point cloud data under the coordinate system corresponding to the multiple frames of inertial navigation system data to obtain the position conversion relationship between the laser radar and the inertial navigation system, the position conversion relationship may be determined by enumerating coordinate system conversion equations; or by listing a plurality of coordinate conversion equations, selecting a target coordinate conversion equation, and determining the position conversion relationship from it. The embodiments of the present application are not limited in this regard.
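The frame-by-frame coordinate conversion can be illustrated with homogeneous transforms. Assuming the inertial navigation system's pose in the map frame is available as a 4x4 matrix for each frame (the symbols and names here are illustrative assumptions, not taken from the patent), map-frame points are expressed in the INS body frame by applying the inverse pose:

```python
import numpy as np

def to_ins_frame(map_points, T_map_ins):
    """Express map-frame points in the INS body frame.

    map_points: (N, 3) points in map coordinates.
    T_map_ins: 4x4 pose of the INS in the map frame.
    Applies p_ins = T_map_ins^{-1} * p_map to every point.
    """
    T_ins_map = np.linalg.inv(T_map_ins)
    hom = np.hstack([map_points, np.ones((len(map_points), 1))])
    return (hom @ T_ins_map.T)[:, :3]

# toy pose: the INS sits at (10, 5, 0) in the map, axes aligned
T = np.eye(4)
T[:3, 3] = [10.0, 5.0, 0.0]
pts = np.array([[10.0, 5.0, 0.0], [11.0, 5.0, 0.0]])
local = to_ins_frame(pts, T)
# → [[0, 0, 0], [1, 0, 0]]: the INS origin and a point 1 m ahead
```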
According to the laser radar calibration method above, the terminal matches the multiple frames of point cloud data to the coordinate system corresponding to the map data to obtain multiple frames of map point cloud data, converts the multiple frames of map point cloud data into point cloud data under the coordinate system corresponding to the multiple frames of inertial navigation system data, and thereby obtains the position conversion relationship between the laser radar and the inertial navigation system. In this embodiment, the terminal can therefore obtain the position conversion relationship automatically and determine the calibration result of the laser radar from it, so that the calibration result is obtained automatically, the process of obtaining the calibration result by manual measurement is avoided, and the calibration efficiency of the laser radar is improved.
Based on the above embodiment, the terminal can obtain the position conversion relationship between the laser radar and the inertial navigation system through a calibration algorithm according to the point cloud data of the laser radar to be calibrated, the inertial navigation system data and the map data. The specific process by which the terminal automatically obtains the position conversion relationship according to the calibration algorithm is described in detail below with reference to fig. 4. A possible implementation of step S202, "converting the multi-frame map point cloud data into point cloud data under a coordinate system corresponding to the multi-frame inertial navigation system data, and acquiring the position conversion relation between the laser radar and the inertial navigation system", includes the following steps:
s301, enumerating a plurality of relative pose equations according to multi-frame inertial navigation system data and multi-frame map point cloud data; the map point cloud data of one frame and the inertial navigation system data in the same frame form a group of data, and each group of data corresponds to a relative pose equation.
Specifically, the relative pose equation may be a coordinate conversion equation obtained from one frame of map point cloud data and the inertial navigation system data of the same frame. One frame of map point cloud data and the inertial navigation system data of the same frame form a group of data; that is, a frame of map point cloud data and a frame of inertial navigation system data acquired at the same moment and position form one group of same-frame data. When enumerating a plurality of relative pose equations from the multiple frames of inertial navigation system data and the multiple frames of map point cloud data, each frame of map point cloud data corresponds to one frame of inertial navigation system data. Because the coordinate information in the map point cloud data and in the inertial navigation system data is expressed differently, a relative pose equation can be enumerated to unify the laser radar data and the other sensor data into one expression.
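Under one common convention (an assumption for illustration, not stated in the patent), each group's relative pose equation can be written as T_map_lidar(k) = T_map_ins(k) · T_ins_lidar, where T_ins_lidar is the unknown lidar-to-INS extrinsic shared by all frames. For a single group of data the unknown then follows directly:

```python
import numpy as np

def extrinsic_from_frame(T_map_lidar, T_map_ins):
    """Solve T_map_lidar = T_map_ins @ T_ins_lidar for one group.

    Both inputs are 4x4 homogeneous poses in the map frame; the
    return value is the lidar pose in the INS body frame.
    """
    return np.linalg.inv(T_map_ins) @ T_map_lidar

# toy group: INS at (2, 0, 0); lidar observed at (2.5, 0.3, 1.2)
T_ins = np.eye(4)
T_ins[:3, 3] = [2.0, 0.0, 0.0]
T_lid = np.eye(4)
T_lid[:3, 3] = [2.5, 0.3, 1.2]
X = extrinsic_from_frame(T_lid, T_ins)
# the lidar sits 0.5 m ahead, 0.3 m left, 1.2 m above the INS:
# X[:3, 3] → [0.5, 0.3, 1.2]
```

With noisy real data, each group yields a slightly different X, which is why the following steps combine the per-group equations instead of trusting any single frame.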
S302, calculating position conversion parameters in the relative pose equations according to each relative pose equation and a group of data corresponding to the relative pose equations to obtain a plurality of position conversion parameters.
Specifically, each relative pose equation corresponds to a set of map point cloud data and inertial navigation system data, the map point cloud data and the inertial navigation system data are substituted into the corresponding relative pose equation, and the position conversion parameters in the relative pose equation are calculated. The map point cloud data comprises multi-frame map point cloud data, the inertial navigation system data comprises multi-frame inertial navigation system data, the multi-frame map point cloud data and the corresponding multi-frame inertial navigation system respectively form multiple groups of data, the multiple groups of data are substituted into the corresponding relative pose equation respectively, and a plurality of position conversion parameters are obtained through calculation.
S303, determining a position conversion relation according to a plurality of position conversion parameters and a plurality of relative pose equations.
Specifically, after the plurality of position conversion parameters are obtained on the basis of the above embodiment, the position conversion relationship may be determined from the plurality of position conversion parameters and the corresponding plurality of relative pose equations: a target relative pose equation may first be determined, the target position conversion parameters then obtained, and the position conversion relationship determined from the target relative pose equation and the target position conversion parameters.
Optionally, one possible implementation method of S303 "determining the position conversion relationship according to the plurality of position conversion parameters and the plurality of relative pose equations" includes the steps in the embodiment shown in fig. 5:
s401, accumulating the plurality of relative pose equations by using a least square method to obtain a target relative pose equation.
Specifically, the idea of the least square method may be adopted to accumulate the plurality of relative pose equations so as to reduce their error; the least square method finds the best function match for the data by minimizing the sum of squared errors. In this embodiment, the plurality of relative pose equations may be accumulated to obtain the optimal relative pose equation with minimized error, i.e., the target relative pose equation.
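One way such an accumulation can look, under the illustrative convention sketched earlier: if the rotation part of the extrinsic is taken as already known, each group contributes a constraint that is linear in the unknown translation, and stacking all groups gives a single least-squares problem whose solution minimizes the summed squared error. All names and the problem structure are assumptions for illustration:

```python
import numpy as np

def solve_extrinsic_translation(R_ins_list, t_ins_list, t_lidar_list):
    """Least-squares translation of the lidar in the INS frame.

    Per group k: t_map_lidar(k) = R_map_ins(k) @ t + t_map_ins(k),
    which is linear in the unknown t. Stacking all groups and
    minimizing the summed squared residual yields one t.
    """
    A = np.vstack(R_ins_list)                               # (3K, 3)
    b = np.concatenate([tl - ti for tl, ti
                        in zip(t_lidar_list, t_ins_list)])  # (3K,)
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    return t

# two groups consistent with the true offset (0.5, 0.3, 1.2)
Rs = [np.eye(3), np.eye(3)]
tis = [np.zeros(3), np.array([2.0, 0.0, 0.0])]
tls = [np.array([0.5, 0.3, 1.2]), np.array([2.5, 0.3, 1.2])]
t = solve_extrinsic_translation(Rs, tis, tls)
# → [0.5, 0.3, 1.2]
```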
S402, summarizing the plurality of position conversion parameters through an enumeration algorithm to obtain the target position conversion parameters.
Specifically, the position conversion parameters may include a plurality of parameters; the target position conversion parameters may be obtained by first determining some of the parameters and then exhausting the rest with an enumeration algorithm. For example, the position conversion parameters may include six parameters a, b, c, d, e, and f. Specific values of a, b, and c are determined first; then, given those three values, candidate values of d, e, and f are exhausted, and the candidates that satisfy the preset requirement are selected as the values of d, e, and f. The position conversion parameters are then determined from the six values a, b, c, d, e, and f.
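The exhaustive step can be sketched as a coarse grid search: with a, b, and c fixed, candidate values of d, e, and f are enumerated over given ranges and the combination with the smallest cost is kept. The cost function below is a toy stand-in for whatever residual the calibration actually minimizes:

```python
import itertools
import numpy as np

def grid_search(cost, fixed, grid, steps=11):
    """Exhaust (d, e, f) over `grid` ranges with (a, b, c) fixed.

    cost: callable taking the six-vector (a, b, c, d, e, f).
    fixed: the already-determined (a, b, c).
    grid: three (lo, hi) ranges, one per exhausted parameter.
    Returns the best (d, e, f) and its cost.
    """
    best, best_params = np.inf, None
    axes = [np.linspace(lo, hi, steps) for lo, hi in grid]
    for d, e, f in itertools.product(*axes):
        c = cost(np.array([*fixed, d, e, f]))
        if c < best:
            best, best_params = c, (d, e, f)
    return best_params, best

# toy cost whose minimum is at d=0.5, e=-0.2, f=1.0
target = np.array([0.0, 0.0, 0.0, 0.5, -0.2, 1.0])
cost = lambda p: float(((p - target) ** 2).sum())
params, err = grid_search(cost, fixed=(0.0, 0.0, 0.0),
                          grid=[(-1, 1), (-1, 1), (0, 2)], steps=21)
```

A real calibration would typically refine this coarse result (finer grid or local optimization) rather than accept the grid point directly.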
S403, substituting the target position conversion parameters into the target relative pose equation to obtain the position conversion relation.
According to the laser radar calibration method above, the terminal enumerates a plurality of relative pose equations from the multiple frames of inertial navigation system data and the multiple frames of map point cloud data, calculates the position conversion parameters in each relative pose equation from that equation and its corresponding group of data to obtain a plurality of position conversion parameters, and then determines the position conversion relationship from the plurality of position conversion parameters and the plurality of relative pose equations. In this embodiment, the calibration result of the laser radar is thus obtained automatically through the calibration algorithm, the process of obtaining the calibration result by manual measurement is avoided, and the calibration efficiency is improved.
Fig. 6 is a flowchart of a laser radar calibration method according to another embodiment, as shown in fig. 6, and based on the above embodiment, a laser radar calibration method includes:
s501, respectively acquiring multi-frame point cloud data of a laser radar to be calibrated and multi-frame inertial navigation system data of an inertial navigation system in a preset calibration scene according to a data acquisition rule.
S502, matching point cloud data to a coordinate system corresponding to map data through a point cloud algorithm to obtain map point cloud data; the point cloud algorithm includes an ICP algorithm, and/or an NDT algorithm.
S503, enumerating a plurality of relative pose equations according to multi-frame inertial navigation data and multi-frame map point cloud data; the map point cloud data of one frame and the inertial navigation system data in the same frame form a group of data, and each group of data corresponds to a relative pose equation.
S504, calculating position conversion parameters in the relative pose equations according to each relative pose equation and a group of data corresponding to the relative pose equations so as to obtain a plurality of position conversion parameters.
S505, accumulating the plurality of relative pose equations by using a least square method to obtain a target relative pose equation.
S506, summarizing the plurality of position conversion parameters through an enumeration algorithm to obtain the target position conversion parameters.
S507, substituting the target position conversion parameters into a target relative pose equation to obtain a position conversion relation.
The technical effects of the laser radar calibration method in this embodiment are similar to those of the embodiments corresponding to the foregoing embodiments, and will not be described herein.
It should be understood that, although the steps in the flowcharts of fig. 2-6 are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps in fig. 2-6 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different moments, and these sub-steps or stages are not necessarily executed in sequence but may be performed in turn or alternately with at least a portion of the sub-steps or stages of other steps.
Fig. 7 is a schematic structural diagram of a lidar calibration device according to an embodiment. As shown in fig. 7, the laser radar calibration device includes: a first acquisition module 10, a second acquisition module 20, a conversion module 30 and a calibration module 40, wherein:
the first acquisition module 10 is used for acquiring point cloud data of the laser radar to be calibrated and inertial navigation system data in a preset calibration scene; the calibration scene is provided with a target reference object;
The second obtaining module 20 is configured to obtain map data corresponding to the calibration scene;
the conversion module 30 is configured to obtain a position conversion relationship between the laser radar and the inertial navigation system through a first calibration algorithm according to the point cloud data of the laser radar to be calibrated, the inertial navigation system data and the map data; the first calibration algorithm is used for converting the matching result and the inertial navigation system data into data under the same coordinate system after the coordinate system matching is carried out on the point cloud data and the map data;
and the calibration module 40 is used for determining the calibration result of the laser radar according to the position conversion relation.
In one embodiment, the first obtaining module 10 is specifically configured to obtain, in a preset calibration scene, multi-frame point cloud data of the laser radar to be calibrated and multi-frame inertial navigation system data of the inertial navigation system according to a data acquisition rule.
The laser radar calibration device provided in the embodiment of the application may execute the above method embodiment, and its implementation principle and technical effects are similar, and will not be described herein.
Fig. 8 is a schematic structural diagram of a lidar calibration device according to another embodiment, and based on the embodiment shown in fig. 7, the conversion module 30 includes: a matching unit 301 and a conversion unit 302, wherein:
A matching unit 301, configured to match the multi-frame point cloud data to a coordinate system corresponding to the map data, so as to obtain multi-frame map point cloud data;
and the conversion unit 302 is configured to convert the multi-frame map point cloud data into point cloud data under a coordinate system corresponding to the multi-frame inertial navigation system data, and obtain a position conversion relationship between the laser radar and the inertial navigation system.
In one embodiment, the matching unit 301 is specifically configured to match the point cloud data to a coordinate system corresponding to the map data by using a point cloud algorithm, so as to obtain map point cloud data; the point cloud algorithm includes an ICP algorithm, and/or an NDT algorithm.
Fig. 9 is a schematic structural diagram of a lidar calibration device according to another embodiment, and the conversion unit 302 includes, based on the embodiment shown in fig. 7 or fig. 8: enumerating subunit 3021, converting subunit 3022, and determining subunit 3023, wherein:
an enumeration subunit 3021 configured to enumerate a plurality of relative pose equations according to the multi-frame inertial navigation data and the multi-frame map point cloud data; the map point cloud data of one frame and the inertial navigation system data in the same frame form a group of data, and each group of data corresponds to a relative pose equation;
A conversion subunit 3022, configured to calculate, according to each relative pose equation and a set of data corresponding to the relative pose equation, a position conversion parameter in the relative pose equation, so as to obtain a plurality of position conversion parameters;
a determining subunit 3023 for determining the positional conversion relationship according to the plurality of positional conversion parameters and the plurality of relative pose equations.
In one embodiment, the determining subunit 3023 is specifically configured to accumulate the plurality of relative pose equations by using a least squares method to obtain the target relative pose equation; summarize the plurality of position conversion parameters through an enumeration algorithm to obtain the target position conversion parameters; and substitute the target position conversion parameters into the target relative pose equation to obtain the position conversion relation.
In one embodiment, the position conversion relation is determined as a calibration result of the laser radar; or visualizing the position conversion relation, and determining the visualized result as the calibration result of the laser radar.
In one embodiment, the target reference object is a reference object such that the intensity of the signal emitted by the laser radar and reflected by the target reference object is greater than a preset threshold value.
In one embodiment, the preset calibration scene is a scene determined according to the type of the laser radar, the inertial navigation system and the type of the target reference object.
The laser radar calibration device provided in the embodiment of the application may execute the above method embodiment, and its implementation principle and technical effects are similar, and will not be described herein.
For a specific limitation of the laser radar calibration device, reference may be made to the limitation of the laser radar calibration method hereinabove, and the description thereof will not be repeated here. The above-mentioned laser radar calibration device may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a terminal, and an internal structure diagram thereof may be as shown in fig. 10. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements a laser radar calibration method. The display screen of the computer device may be a liquid crystal display or an electronic ink display, and the input device may be a touch layer covering the display screen, keys, a track ball or a touch pad arranged on the housing of the computer device, or an external keyboard, touch pad, mouse, or the like.
Those skilled in the art will appreciate that the structure shown in fig. 10 is only a block diagram of the portions of the structure relevant to the disclosed aspects and does not limit the computer device on which the disclosed aspects may be implemented; a particular computer device may include more or fewer components than shown, combine some of the components, or have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory and a processor, the memory having stored therein a computer program, the processor when executing the computer program performing the steps of:
acquiring point cloud data and inertial navigation system data of a laser radar to be calibrated in a preset calibration scene; the calibration scene is provided with a target reference object;
map data corresponding to the calibration scene are obtained;
acquiring the position conversion relation between the laser radar and the inertial navigation system through a calibration algorithm according to the point cloud data of the laser radar to be calibrated, the inertial navigation system data and the map data; the calibration algorithm is used for converting the matching result and the inertial navigation system data into data under the same coordinate system after the coordinate system matching is carried out on the point cloud data and the map data;
And determining a calibration result of the laser radar according to the position conversion relation.
In one embodiment, the processor when executing the computer program further performs the steps of: according to the data acquisition rule, respectively acquiring multi-frame point cloud data of the laser radar to be calibrated and multi-frame inertial navigation system data of the inertial navigation system in a preset calibration scene.
In one embodiment, the processor when executing the computer program further performs the steps of: matching the multi-frame point cloud data to a coordinate system corresponding to the map data to obtain multi-frame map point cloud data; and converting the multi-frame map point cloud data into point cloud data under a coordinate system corresponding to the multi-frame inertial navigation system data, and acquiring the position conversion relation between the laser radar and the inertial navigation system.
In one embodiment, the processor when executing the computer program further performs the steps of: matching the point cloud data to a coordinate system corresponding to the map data through a point cloud algorithm to obtain map point cloud data; the point cloud algorithm includes an ICP algorithm, and/or an NDT algorithm.
In one embodiment, the processor when executing the computer program further performs the steps of: enumerating a plurality of relative pose equations according to the multi-frame inertial navigation data and the multi-frame map point cloud data; the map point cloud data of one frame and the inertial navigation system data in the same frame form a group of data, and each group of data corresponds to a relative pose equation; calculating position conversion parameters in the relative pose equations according to each relative pose equation and a group of data corresponding to the relative pose equations so as to obtain a plurality of position conversion parameters; and determining the position conversion relation according to a plurality of position conversion parameters and a plurality of relative pose equations.
In one embodiment, the processor when executing the computer program further performs the steps of: accumulating the plurality of relative pose equations by adopting a least square method to obtain a target relative pose equation; summarizing the plurality of position conversion parameters through an enumeration algorithm to obtain the target position conversion parameters; substituting the target position conversion parameters into the target relative pose equation to obtain the position conversion relation.
In one embodiment, the processor when executing the computer program further performs the steps of: determining the position conversion relation as a calibration result of the laser radar; or visualizing the position conversion relation, and determining the visualized result as the calibration result of the laser radar.
In one embodiment, the target reference object is a reference object such that the intensity of the signal emitted by the laser radar and reflected by the target reference object is greater than a preset threshold value.
In one embodiment, the preset calibration scene is a scene determined according to the type of the laser radar, the inertial navigation system and the type of the target reference object.
The computer device provided in this embodiment has similar implementation principles and technical effects to those of the above method embodiment, and will not be described herein.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring point cloud data and inertial navigation system data of a laser radar to be calibrated in a preset calibration scene; the calibration scene is provided with a target reference object;
map data corresponding to the calibration scene are obtained;
acquiring the position conversion relation between the laser radar and the inertial navigation system through a calibration algorithm according to the point cloud data of the laser radar to be calibrated, the inertial navigation system data and the map data; the calibration algorithm is used for converting the matching result and the inertial navigation system data into data under the same coordinate system after the coordinate system matching is carried out on the point cloud data and the map data;
and determining a calibration result of the laser radar according to the position conversion relation.
In one embodiment, the computer program when executed by the processor further performs the steps of: according to the data acquisition rule, respectively acquiring multi-frame point cloud data of the laser radar to be calibrated and multi-frame inertial navigation system data of the inertial navigation system in a preset calibration scene.
In one embodiment, the computer program when executed by the processor further performs the steps of: matching the multi-frame point cloud data to a coordinate system corresponding to the map data to obtain multi-frame map point cloud data; and converting the multi-frame map point cloud data into point cloud data under a coordinate system corresponding to the multi-frame inertial navigation system data, and acquiring the position conversion relation between the laser radar and the inertial navigation system.
In one embodiment, the computer program when executed by the processor further performs the steps of: matching the point cloud data to a coordinate system corresponding to the map data through a point cloud algorithm to obtain map point cloud data; the point cloud algorithm includes an ICP algorithm, and/or an NDT algorithm.
In one embodiment, the computer program when executed by the processor further performs the steps of: enumerating a plurality of relative pose equations according to the multi-frame inertial navigation data and the multi-frame map point cloud data; the map point cloud data of one frame and the inertial navigation system data in the same frame form a group of data, and each group of data corresponds to a relative pose equation; calculating position conversion parameters in the relative pose equations according to each relative pose equation and a group of data corresponding to the relative pose equations so as to obtain a plurality of position conversion parameters; and determining the position conversion relation according to a plurality of position conversion parameters and a plurality of relative pose equations.
In one embodiment, the computer program when executed by the processor further performs the steps of: accumulating the plurality of relative pose equations by adopting a least square method to obtain a target relative pose equation; summarizing the plurality of position conversion parameters through an enumeration algorithm to obtain the target position conversion parameters; substituting the target position conversion parameters into the target relative pose equation to obtain the position conversion relation.
In one embodiment, the computer program when executed by the processor further performs the steps of: determining the position conversion relation as a calibration result of the laser radar; or visualizing the position conversion relation, and determining the visualized result as the calibration result of the laser radar.
In one embodiment, the target reference object is a reference object such that the intensity of the signal emitted by the laser radar and reflected by the target reference object is greater than a preset threshold value.
In one embodiment, the preset calibration scene is a scene determined according to the type of the laser radar, the inertial navigation system and the type of the target reference object.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in embodiments provided by the present disclosure may include non-volatile and/or volatile memory. The nonvolatile memory can include Read Only Memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (DDRSDRAM), enhanced SDRAM (ESDRAM), synchronous Link DRAM (SLDRAM), memory bus direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM), among others.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this description.
The above examples illustrate only a few embodiments of the invention, which are described in detail but are not thereby to be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the invention, all of which fall within the scope of protection of the invention. Accordingly, the scope of protection of the present invention shall be determined by the appended claims.

Claims (12)

1. A laser radar calibration method, the method comprising:
acquiring point cloud data and inertial navigation system data of a laser radar to be calibrated in a preset calibration scene; the calibration scene is provided with a target reference object;
map data corresponding to the calibration scene are obtained;
acquiring the position conversion relation between the laser radar and the inertial navigation system through a calibration algorithm according to the point cloud data of the laser radar to be calibrated, the inertial navigation system data and the map data; the calibration algorithm is used for performing coordinate system matching between the point cloud data and the map data, and then converting the matching result and the inertial navigation system data into data under the same coordinate system;
and determining a calibration result of the laser radar according to the position conversion relation.
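The core relationship behind this claim can be sketched numerically. Assuming each sensor pose is written as a 4x4 homogeneous transform in the map coordinate system (the function and variable names below are illustrative, not from the patent), the lidar-to-inertial-navigation extrinsic follows from one matched frame via T_extrinsic = inv(T_ins) · T_lidar:

```python
import numpy as np

def pose(yaw, tx, ty):
    """Build a 4x4 homogeneous transform from a yaw angle and a planar translation."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    T[0, 3], T[1, 3] = tx, ty
    return T

def extrinsic_from_frame(T_ins_in_map, T_lidar_in_map):
    """Extrinsic (lidar pose expressed in the inertial navigation frame):
    T_lidar_in_map = T_ins_in_map @ T_extrinsic  =>  solve for T_extrinsic."""
    return np.linalg.inv(T_ins_in_map) @ T_lidar_in_map

# Simulated data: a ground-truth extrinsic, an INS pose in the map frame,
# and the lidar pose that point-cloud-to-map matching would recover.
T_true = pose(0.1, 0.5, -0.2)
T_ins = pose(0.7, 10.0, 3.0)
T_lidar = T_ins @ T_true

T_est = extrinsic_from_frame(T_ins, T_lidar)
assert np.allclose(T_est, T_true)
```

With noise-free inputs the single-frame solve is exact; the later claims address combining many noisy frames.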
2. The method of claim 1, wherein the obtaining the point cloud data and the inertial navigation system data of the lidar to be calibrated in the predetermined calibration scene comprises:
according to the data acquisition rule, respectively acquiring multi-frame point cloud data of the laser radar to be calibrated and multi-frame inertial navigation system data of the inertial navigation system in a preset calibration scene.
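Collecting multi-frame data from two sensors implies pairing each lidar frame with an inertial navigation sample; a common data acquisition rule (an assumption here, the claim does not specify one) is nearest-timestamp matching:

```python
import numpy as np

def pair_by_timestamp(lidar_ts, ins_ts):
    """For each lidar frame timestamp, return the index of the inertial
    navigation sample closest in time (assumed nearest-neighbor pairing rule)."""
    lidar_ts, ins_ts = np.asarray(lidar_ts), np.asarray(ins_ts)
    idx = np.searchsorted(ins_ts, lidar_ts)        # insertion points
    idx = np.clip(idx, 1, len(ins_ts) - 1)
    left, right = ins_ts[idx - 1], ins_ts[idx]
    choose_left = (lidar_ts - left) < (right - lidar_ts)
    return np.where(choose_left, idx - 1, idx)

ins_ts = np.arange(0.0, 1.0, 0.01)                 # 100 Hz inertial navigation
lidar_ts = np.array([0.104, 0.502, 0.998])         # three lidar frames
print(pair_by_timestamp(lidar_ts, ins_ts))          # → [10 50 99]
```

In practice the inertial navigation pose would also be interpolated between the two bracketing samples rather than snapped to the nearest one.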
3. The method according to claim 2, wherein the obtaining, by a calibration algorithm, the positional conversion relationship between the lidar and the inertial navigation system according to the point cloud data of the lidar to be calibrated, the inertial navigation system data, and the map data includes:
matching the multi-frame point cloud data to a coordinate system corresponding to the map data to obtain multi-frame map point cloud data;
and converting the multi-frame map point cloud data into point cloud data under a coordinate system corresponding to the multi-frame inertial navigation system data, and acquiring the position conversion relation between the laser radar and the inertial navigation system.
4. The method of claim 3, wherein the matching the multi-frame point cloud data to the coordinate system corresponding to the map data to obtain multi-frame map point cloud data includes:
matching the point cloud data to a coordinate system corresponding to the map data through a point cloud algorithm to obtain map point cloud data; the point cloud algorithm includes an ICP (iterative closest point) algorithm and/or an NDT (normal distributions transform) algorithm.
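Production systems would use an ICP or NDT implementation from a point cloud library; as a minimal sketch, the least-squares rigid-alignment step at the heart of each ICP iteration (the Kabsch/SVD solution, shown here with known correspondences) looks like this:

```python
import numpy as np

def rigid_align(src, dst):
    """Best-fit rotation R and translation t mapping src points onto dst points
    with known correspondences: the closed-form step inside each ICP iteration."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t

# Simulated "map" cloud and a scan related to it by a known rigid transform.
rng = np.random.default_rng(0)
map_pts = rng.normal(size=(100, 3))
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([2.0, -1.0, 0.5])
scan_pts = (map_pts - t_true) @ R_true     # rows: R_true.T @ (map - t_true)

R_est, t_est = rigid_align(scan_pts, map_pts)
assert np.allclose(R_est, R_true)
assert np.allclose(t_est, t_true)
```

Full ICP alternates this step with nearest-neighbor correspondence search until the alignment converges; NDT instead matches the scan against a voxelized Gaussian representation of the map.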
5. The method of claim 3, wherein the converting the multi-frame map point cloud data into point cloud data in a coordinate system corresponding to the multi-frame inertial navigation system data, and obtaining the position conversion relationship between the lidar and the inertial navigation system, comprises:
enumerating a plurality of relative pose equations according to the multi-frame inertial navigation system data and the multi-frame map point cloud data; the map point cloud data of one frame and the inertial navigation system data in the same frame form a group of data, and each group of data corresponds to a relative pose equation;
calculating position conversion parameters in the relative pose equations according to each relative pose equation and a group of data corresponding to the relative pose equations so as to obtain a plurality of position conversion parameters;
and determining the position conversion relation according to a plurality of position conversion parameters and a plurality of relative pose equations.
6. The method of claim 5, wherein determining the positional conversion relationship from a plurality of positional conversion parameters and a plurality of relative pose equations comprises:
accumulating the plurality of relative pose equations by adopting a least square method to obtain a target relative pose equation;
summarizing the plurality of position conversion parameters through an enumeration algorithm to obtain target position conversion parameters;
substituting the target position conversion parameters into the target relative pose equation to obtain the position conversion relation.
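The claim does not fix the form of the least-squares accumulation; one common choice (an assumption here, with illustrative names) is to average the per-frame translations and take the chordal mean of the per-frame rotations by projecting their average back onto SO(3) with an SVD:

```python
import numpy as np

def aggregate_extrinsics(Ts):
    """Combine per-frame 4x4 extrinsic estimates in a least-squares sense:
    mean of translations, plus chordal mean of rotations obtained by
    projecting the averaged rotation matrix back onto SO(3) via SVD."""
    Ts = np.asarray(Ts)
    t = Ts[:, :3, 3].mean(axis=0)
    U, _, Vt = np.linalg.svd(Ts[:, :3, :3].mean(axis=0))
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Per-frame estimates: one true extrinsic perturbed by small symmetric noise.
frames = []
for da, dt in [(0.01, 0.02), (-0.01, -0.02), (0.0, 0.0)]:
    T = np.eye(4)
    T[:3, :3] = rot_z(0.3 + da)
    T[:3, 3] = [1.0 + dt, 0.5, 0.2]
    frames.append(T)

T_hat = aggregate_extrinsics(frames)
assert np.allclose(T_hat[:3, 3], [1.0, 0.5, 0.2])
assert np.allclose(T_hat[:3, :3], rot_z(0.3), atol=1e-6)
```

The SVD projection is needed because the entrywise mean of rotation matrices is generally not itself a rotation; the projection returns the closest rotation in the Frobenius norm.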
7. The method according to any one of claims 1 to 6, wherein determining a calibration result of the lidar according to the positional conversion relation comprises:
determining the position conversion relation as a calibration result of the laser radar;
or, alternatively,
and visualizing the position conversion relation, and determining the visualized result as a calibration result of the laser radar.
8. The method of any one of claims 1-6, wherein the target reference object is a reference object that, when struck by the signal transmitted by the laser radar, reflects a signal whose intensity is greater than a predetermined threshold.
9. The method according to any one of claims 1-6, wherein the preset calibration scenario is a scenario determined according to the type of the lidar, the inertial navigation system and the type of the target reference object.
10. A lidar calibration device, the device comprising:
the first acquisition module is used for acquiring point cloud data of the laser radar to be calibrated and inertial navigation system data in a preset calibration scene; the calibration scene is provided with a target reference object;
the second acquisition module is used for acquiring map data corresponding to the calibration scene;
the conversion module is used for acquiring the position conversion relation between the laser radar and the inertial navigation system through a first calibration algorithm according to the point cloud data of the laser radar to be calibrated, the inertial navigation system data and the map data; the first calibration algorithm is used for performing coordinate system matching between the point cloud data and the map data, and then converting the matching result and the inertial navigation system data into data under the same coordinate system;
and the calibration module is used for determining the calibration result of the laser radar according to the position conversion relation.
11. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1-9 when the computer program is executed.
12. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1-9.
CN201811623241.6A 2018-12-28 2018-12-28 Laser radar calibration method, device, equipment and storage medium Active CN109901139B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811623241.6A CN109901139B (en) 2018-12-28 2018-12-28 Laser radar calibration method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811623241.6A CN109901139B (en) 2018-12-28 2018-12-28 Laser radar calibration method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109901139A CN109901139A (en) 2019-06-18
CN109901139B true CN109901139B (en) 2023-07-04

Family

ID=66943512

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811623241.6A Active CN109901139B (en) 2018-12-28 2018-12-28 Laser radar calibration method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109901139B (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112414444B (en) * 2019-08-22 2023-05-30 阿里巴巴集团控股有限公司 Data calibration method, computer equipment and storage medium
CN112558023B (en) * 2019-09-25 2024-03-26 华为技术有限公司 Calibration method and device of sensor
CN110686704A (en) * 2019-10-18 2020-01-14 深圳市镭神智能系统有限公司 Pose calibration method, system and medium for laser radar and combined inertial navigation
CN112684432B (en) * 2019-10-18 2024-04-16 武汉万集光电技术有限公司 Laser radar calibration method, device, equipment and storage medium
CN110837080B (en) * 2019-10-28 2023-09-05 武汉海云空间信息技术有限公司 Rapid calibration method of laser radar mobile measurement system
CN111060132B (en) * 2019-11-29 2022-09-23 苏州智加科技有限公司 Calibration method and device for travelling crane positioning coordinates
CN111161353B (en) * 2019-12-31 2023-10-31 深圳一清创新科技有限公司 Vehicle positioning method, device, readable storage medium and computer equipment
CN111427026B (en) * 2020-02-21 2023-03-21 深圳市镭神智能系统有限公司 Laser radar calibration method and device, storage medium and self-moving equipment
CN113767264A (en) * 2020-03-05 2021-12-07 深圳市大疆创新科技有限公司 Parameter calibration method, device, system and storage medium
CN116106927A (en) * 2020-03-27 2023-05-12 深圳市镭神智能系统有限公司 Two-dimensional grid map construction method, medium and system based on laser radar
CN111458721B (en) * 2020-03-31 2022-07-12 江苏集萃华科智能装备科技有限公司 Exposed garbage identification and positioning method, device and system
CN111390911A (en) * 2020-04-03 2020-07-10 东莞仕达通自动化有限公司 Manipulator position calibration system and calibration method
WO2021253193A1 (en) * 2020-06-15 2021-12-23 深圳市大疆创新科技有限公司 Calibration method and calibration apparatus for external parameters of multiple groups of laser radars, and computer storage medium
CN112068108A (en) * 2020-08-11 2020-12-11 南京航空航天大学 Laser radar external parameter calibration method based on total station
CN112146682B (en) * 2020-09-22 2022-07-19 福建牧月科技有限公司 Sensor calibration method and device for intelligent automobile, electronic equipment and medium
CN112034431B (en) * 2020-09-25 2023-09-12 新石器慧通(北京)科技有限公司 External parameter calibration method and device for radar and RTK
CN112285676B (en) * 2020-10-22 2024-02-09 知行汽车科技(苏州)股份有限公司 Laser radar and IMU external parameter calibration method and device
CN112379353B (en) * 2020-11-10 2022-10-25 上海交通大学 Combined calibration method and system among multiple target laser radars
CN112731358B (en) * 2021-01-08 2022-03-01 奥特酷智能科技(南京)有限公司 Multi-laser-radar external parameter online calibration method
CN112965047B (en) * 2021-02-01 2023-03-14 中国重汽集团济南动力有限公司 Vehicle multi-laser radar calibration method, system, terminal and storage medium
CN112964291B (en) * 2021-04-02 2023-07-14 清华大学 Sensor calibration method, device, computer storage medium and terminal
CN114787015A (en) * 2021-06-15 2022-07-22 华为技术有限公司 Calibration method and device for automatic driving vehicle
CN113534110B (en) * 2021-06-24 2023-11-24 香港理工大学深圳研究院 Static calibration method for multi-laser radar system
CN113848541B (en) * 2021-09-22 2022-08-26 深圳市镭神智能系统有限公司 Calibration method and device, unmanned aerial vehicle and computer readable storage medium
CN114265042A (en) * 2021-12-09 2022-04-01 上海禾赛科技有限公司 Calibration method, calibration device, calibration system and readable storage medium
CN113933820B (en) * 2021-12-16 2022-03-08 中智行科技有限公司 Laser radar external reference calibration method without calibration object
CN114413887B (en) * 2021-12-24 2024-04-02 北京理工大学前沿技术研究院 Sensor external parameter calibration method, device and medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6608913B1 (en) * 2000-07-17 2003-08-19 Inco Limited Self-contained mapping and positioning system utilizing point cloud data
CN104794743A (en) * 2015-04-27 2015-07-22 武汉海达数云技术有限公司 Color point cloud producing method of vehicle-mounted laser mobile measurement system
CN105719284B (en) * 2016-01-18 2018-11-06 腾讯科技(深圳)有限公司 A kind of data processing method, device and terminal
CN107204037B (en) * 2016-03-17 2020-11-24 中国科学院光电研究院 Three-dimensional image generation method based on active and passive three-dimensional imaging system
CN107796370B (en) * 2016-08-30 2020-09-08 北京四维图新科技股份有限公司 Method and device for acquiring conversion parameters and mobile mapping system
SG10201700299QA (en) * 2017-01-13 2018-08-30 Otsaw Digital Pte Ltd Three-dimensional mapping of an environment
CN107421507A (en) * 2017-04-28 2017-12-01 上海华测导航技术股份有限公司 Streetscape data acquisition measuring method
CN107688184A (en) * 2017-07-24 2018-02-13 宗晖(上海)机器人有限公司 A kind of localization method and system
CN107463918B (en) * 2017-08-17 2020-04-24 武汉大学 Lane line extraction method based on fusion of laser point cloud and image data
CN108594245A (en) * 2018-07-04 2018-09-28 北京国泰星云科技有限公司 A kind of object movement monitoring system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Extrinsic calibration of three-dimensional lidar based on multi-pair point cloud matching; Han Dongbin et al.; Laser & Optoelectronics Progress (No. 02); full text *

Also Published As

Publication number Publication date
CN109901139A (en) 2019-06-18

Similar Documents

Publication Publication Date Title
CN109901139B (en) Laser radar calibration method, device, equipment and storage medium
CN109975773B (en) Millimeter wave radar calibration method, device, equipment and storage medium
CN109901138B (en) Laser radar calibration method, device, equipment and storage medium
CN111208492B (en) Vehicle-mounted laser radar external parameter calibration method and device, computer equipment and storage medium
US10240934B2 (en) Method and system for determining a position relative to a digital map
EP3118705A2 (en) Map production method, mobile robot, and map production system
US20150142248A1 (en) Apparatus and method for providing location and heading information of autonomous driving vehicle on road within housing complex
WO2018181974A1 (en) Determination device, determination method, and program
AU2018282302A1 (en) Integrated sensor calibration in natural scenes
US11635762B2 (en) System and method for collaborative sensor calibration
US11682139B2 (en) System and method for trailer pose estimation
US11875682B2 (en) System and method for coordinating collaborative sensor calibration
CN110608746B (en) Method and device for determining the position of a motor vehicle
CN110906939A (en) Automatic driving positioning method and device, electronic equipment, storage medium and automobile
US11852730B2 (en) System and method for collaborative calibration via landmark
KR101704634B1 (en) Apparatus and method for generating driving route of autonomous vehicle and method for controlling driving of autonomous vehicle
CN103843035A (en) Device and method for the geometric calibration of sensor data formed by means of a vehicle sensor system
CN113124880B (en) Map building and positioning method and device based on two sensor data fusion
US20220266825A1 (en) Sourced lateral offset for adas or ad features
JP2017181476A (en) Vehicle location detection device, vehicle location detection method and vehicle location detection-purpose computer program
CN113008248A (en) Method and system for generating and updating digital maps
KR101929681B1 (en) Method and Apparatus for Peripheral Vehicle Location Estimation using V2V and Environment Scanning Sensor
JP2022087821A (en) Data fusion method and device
CN114694111A (en) Vehicle positioning
CN110794434B (en) Pose determination method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant