CN109584183B - Laser radar point cloud distortion removal method and system - Google Patents

Laser radar point cloud distortion removal method and system

Info

Publication number
CN109584183B
CN109584183B
Authority
CN
China
Prior art keywords
point cloud
coordinate system
frame
cloud data
transformation matrix
Prior art date
Legal status
Expired - Fee Related
Application number
CN201811479464.XA
Other languages
Chinese (zh)
Other versions
CN109584183A (en)
Inventor
何磊
文龙
Current Assignee
Jilin University
Original Assignee
Jilin University
Priority date
Filing date
Publication date
Application filed by Jilin University
Priority to CN201811479464.XA
Publication of CN109584183A
Application granted
Publication of CN109584183B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/77 Retouching; Inpainting; Scratch removal
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/08 Projecting images onto non-planar surfaces, e.g. geodetic screens
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a laser radar point cloud distortion removal method and system. The method first acquires the point cloud information of one frame of point cloud data collected by a laser radar; the point cloud information comprises the coordinates of each point in the frame of point cloud data under the vehicle body coordinate system, the heading angle at the frame-head moment, the heading angle at the frame-tail moment, and the speed of the laser-radar-carrying vehicle. A rotation-translation transformation matrix for the distortion removal process is then determined from the point cloud information, and the point cloud data are transformed from the vehicle body coordinate system to the geodetic coordinate system according to this matrix, giving the point cloud data under the geodetic coordinate system. Finally, the point cloud data under the geodetic coordinate system are transformed back into the vehicle body coordinate system to obtain the distortion-free point cloud data. Compared with the traditional method, the laser radar point cloud distortion removal method provided by the invention requires less computation, offers better real-time performance and higher stability, and is particularly suitable for unmanned vehicles travelling at low speed.

Description

Laser radar point cloud distortion removal method and system
Technical Field
The invention relates to the technical field of automobile and environment perception, in particular to a laser radar point cloud distortion removing method and system.
Background
An intelligent vehicle is built by adding advanced sensors (radar, cameras), controllers, actuators and other devices to an ordinary vehicle, giving the vehicle the ability to perceive its surroundings. However, while the laser radar rotates to collect point cloud information of the surrounding environment, the unmanned vehicle is constantly in motion, so a certain amount of motion accumulates between the frame head and the frame tail of every point cloud frame; in other words, the laser points within one frame do not lie in the same local coordinate system, which inevitably distorts the point cloud. The traditional distortion removal method extracts a feature line or feature surface from two consecutive frames of data and obtains the inter-frame motion amount by computing how far the feature line or feature surface moves between the two frames.
Disclosure of Invention
The invention aims to provide a laser radar point cloud distortion removal method and system, so as to improve the stability and real-time performance of the distortion removal result and reduce the computational load of the distortion removal process.
To achieve the above purpose, the invention provides the following scheme:
a lidar point cloud undistorting method, the method comprising:
acquiring point cloud information of a frame of point cloud data acquired by a laser radar; the point cloud information comprises coordinates of each point in the frame of point cloud data under a vehicle body coordinate system, a frame head moment course angle, a frame tail moment course angle and the vehicle speed of the laser radar carrying vehicle;
determining a rotational-translational transformation matrix of a distortion removal process according to the point cloud information;
converting the point cloud data from the vehicle body coordinate system to a geodetic coordinate system according to the rotational translation transformation matrix to obtain point cloud data under the geodetic coordinate system;
and converting the point cloud data under the geodetic coordinate system into a vehicle body coordinate system to obtain the point cloud data after distortion removal.
Optionally, determining the rotation-translation transformation matrix for the distortion removal process according to the point cloud information specifically comprises:
determining the approximate heading angle of the laser-radar-carrying vehicle according to the heading angle at the frame-head moment and the heading angle at the frame-tail moment; and
determining the rotation-translation transformation matrix according to the approximate heading angle, the heading angle at the frame-head moment, the heading angle at the frame-tail moment and the vehicle speed.
Optionally, determining the rotation-translation transformation matrix according to the approximate heading angle, the heading angle at the frame-head moment, the heading angle at the frame-tail moment and the vehicle speed specifically comprises:
determining the rotation-translation transformation matrix as
G_u = \begin{bmatrix} \cos\alpha_i & -\sin\alpha_i & 0 \\ \sin\alpha_i & \cos\alpha_i & 0 \\ \frac{VTiD}{360}\cos\beta & \frac{VTiD}{360}\sin\beta & 1 \end{bmatrix}
where G_u is the rotation-translation transformation matrix, α_i is the heading angle at the i-th point of the frame of point cloud data, V is the speed of the laser-radar-carrying vehicle, T is the acquisition period of the laser radar, D is the angular resolution of the laser radar, and β is the approximate heading angle.
Optionally, transforming the point cloud data from the vehicle body coordinate system to the geodetic coordinate system according to the rotation-translation transformation matrix to obtain the point cloud data under the geodetic coordinate system specifically comprises:
applying the rotation-translation transformation to the i-th point in the frame of point cloud data according to the rotation-translation transformation matrix, transforming the i-th point from the vehicle body coordinate system to the geodetic coordinate system.
Optionally, transforming the point cloud data under the geodetic coordinate system back into the vehicle body coordinate system to obtain the distortion-free point cloud data specifically comprises:
determining the frame-head-moment rotation transformation matrix according to the heading angle at the frame-head moment; and
transforming the i-th point from the geodetic coordinate system into the vehicle body coordinate system according to the frame-head-moment rotation transformation matrix to obtain the distortion-free point cloud data of the i-th point.
A laser radar point cloud distortion removal system, the system comprising:
a point cloud information acquisition module, configured to acquire point cloud information of one frame of point cloud data collected by a laser radar, the point cloud information comprising the coordinates of each point in the frame of point cloud data under the vehicle body coordinate system, the heading angle at the frame-head moment, the heading angle at the frame-tail moment, and the speed of the laser-radar-carrying vehicle;
a rotation-translation transformation matrix determination module, configured to determine a rotation-translation transformation matrix for the distortion removal process according to the point cloud information;
a first coordinate system transformation module, configured to transform the point cloud data from the vehicle body coordinate system to the geodetic coordinate system according to the rotation-translation transformation matrix to obtain the point cloud data under the geodetic coordinate system; and
a second coordinate system transformation module, configured to transform the point cloud data under the geodetic coordinate system back into the vehicle body coordinate system to obtain the distortion-free point cloud data.
Optionally, the rotation-translation transformation matrix determination module specifically comprises:
an approximate heading angle determination unit, configured to determine the approximate heading angle of the laser-radar-carrying vehicle according to the heading angle at the frame-head moment and the heading angle at the frame-tail moment; and
a rotation-translation transformation matrix determination unit, configured to determine the rotation-translation transformation matrix according to the approximate heading angle, the heading angle at the frame-head moment, the heading angle at the frame-tail moment and the vehicle speed.
Optionally, the rotation-translation transformation matrix determination unit specifically comprises:
a rotation-translation transformation matrix determination subunit, configured to determine the rotation-translation transformation matrix according to the approximate heading angle, the heading angle at the frame-head moment, the heading angle at the frame-tail moment and the vehicle speed as
G_u = \begin{bmatrix} \cos\alpha_i & -\sin\alpha_i & 0 \\ \sin\alpha_i & \cos\alpha_i & 0 \\ \frac{VTiD}{360}\cos\beta & \frac{VTiD}{360}\sin\beta & 1 \end{bmatrix}
where G_u is the rotation-translation transformation matrix, α_i is the heading angle at the i-th point of the frame of point cloud data, V is the speed of the laser-radar-carrying vehicle, T is the acquisition period of the laser radar, D is the angular resolution of the laser radar, and β is the approximate heading angle.
Optionally, the first coordinate system transformation module specifically comprises:
a first coordinate system transformation unit, configured to apply the rotation-translation transformation to the i-th point in the frame of point cloud data according to the rotation-translation transformation matrix, transforming the i-th point from the vehicle body coordinate system to the geodetic coordinate system.
Optionally, the second coordinate system transformation module specifically comprises:
a frame-head-moment rotation transformation matrix determination unit, configured to determine the frame-head-moment rotation transformation matrix according to the heading angle at the frame-head moment; and
a second coordinate system transformation unit, configured to transform the i-th point from the geodetic coordinate system into the vehicle body coordinate system according to the frame-head-moment rotation transformation matrix to obtain the distortion-free point cloud data of the i-th point.
According to the specific embodiments provided by the invention, the invention discloses the following technical effects:
The invention provides a laser radar point cloud distortion removal method and system that remove, from the point cloud data collected by a laser radar, the distortion produced by the motion of the laser radar. The method first acquires the point cloud information of one frame of point cloud data collected by the laser radar; the point cloud information comprises the coordinates of each point in the frame of point cloud data under the vehicle body coordinate system, the heading angle at the frame-head moment, the heading angle at the frame-tail moment, and the speed of the laser-radar-carrying vehicle. A rotation-translation transformation matrix for the distortion removal process is then determined from the point cloud information, and the point cloud data are transformed from the vehicle body coordinate system to the geodetic coordinate system according to this matrix, giving the point cloud data under the geodetic coordinate system. Finally, the point cloud data under the geodetic coordinate system are transformed back into the vehicle body coordinate system to obtain the distortion-free point cloud data. Compared with the traditional method, the method requires less computation, offers better real-time performance and higher stability, and is particularly suitable for unmanned vehicles travelling at low speed.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a flow chart of a method for laser radar point cloud distortion removal according to the present invention;
FIG. 2 is a schematic view of the heading angle of the unmanned vehicle provided by the present invention;
FIG. 3 is a schematic diagram of a process for transforming a coordinate system using the method of the present invention;
FIG. 4 is a schematic diagram of approximate course angle solution provided by the present invention;
FIG. 5 is a schematic diagram of a point cloud distortion removal process using the method of the present invention;
fig. 6 is a system structure diagram of a laser radar point cloud distortion removal system provided by the invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art on the basis of the embodiments of the present invention without inventive effort fall within the protection scope of the present invention.
The invention aims to provide a laser radar point cloud distortion removal method and a laser radar point cloud distortion removal system, which are used for improving the stability and the real-time performance of a distortion removal result and reducing the calculation amount of a distortion removal process.
To make the above objects, features and advantages of the present invention more comprehensible, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a flowchart of the laser radar point cloud distortion removal method provided by the present invention. Referring to Fig. 1, the method specifically includes:
step 101: and acquiring point cloud information of one frame of point cloud data acquired by the laser radar.
Under the condition that the laser radar carries the vehicle and the laser radar state information is known, the method can eliminate or reduce the point cloud distortion generated by the movement of the laser radar in the point cloud data, so that the point cloud data can reflect the detected environment information more truly.
The point cloud information collected by the invention comprises coordinates of each point in the frame of point cloud data under the vehicle body coordinate system, and the coordinates of each point in the frame of point cloud data under the vehicle body coordinate system comprise a frame header of a K-th frame in the vehicle bodyCoordinates (X) in a coordinate systemKF u,YKF u) And the coordinate (X) of the frame tail of the K frame under the vehicle body coordinate systemKL u,YKL u) And the coordinates of the ith point data except the frame head and the frame tail in the K frame point cloud data under the vehicle body coordinate system.
The point cloud information further comprises a speed V of the laser radar carrying vehicle, a course angle α of the laser radar carrying vehicle, an acquisition period T of the laser radar and an angular resolution D of the laser radar, wherein the laser radar carrying vehicle is generally an unmanned vehicle, the speed V is the speed of the unmanned vehicle and can be read by a wheel speed sensor on the unmanned vehicle, and the course angle α of the laser radar carrying vehicle comprises a frame header moment course angle αFAnd end of frame time heading angle αLReadable by an inertial navigation system on the unmanned vehicle the frame header time heading angle αFI.e. the heading angle of the unmanned vehicle at the frame head moment, the heading angle α at the frame tail momentLThe heading angle of the unmanned vehicle at the end of frame time is shown in FIG. 2, the heading angle schematic diagram of the unmanned vehicle provided by the invention is shown in FIG. 2, XOY is a geodetic coordinate system, X ' O ' Y ' is a vehicle body coordinate system, as shown in FIG. 2, when the unmanned vehicle turns right, the heading angle α of the unmanned vehicle is a positive value, and when the unmanned vehicle turns left, α is a negative value.
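For illustration only, the per-frame inputs listed above can be grouped in a small container; a minimal Python sketch, in which every identifier (FrameInfo, points, alpha_F, ...) is a hypothetical name rather than anything prescribed by the patent:

```python
# Hypothetical container for the per-frame inputs named in the patent text.
from dataclasses import dataclass
import numpy as np

@dataclass
class FrameInfo:
    points: np.ndarray  # (N, 2) point coordinates in the vehicle body frame
    alpha_F: float      # heading angle at the frame-head moment, in radians
    alpha_L: float      # heading angle at the frame-tail moment, in radians
    V: float            # vehicle speed in m/s, e.g. read from a wheel-speed sensor
    T: float            # laser radar acquisition period in seconds (one full scan)
    D: float            # laser radar angular resolution in degrees per point
```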
Step 102: determining the rotation-translation transformation matrix for the distortion removal process according to the point cloud information.
Fig. 3 is a schematic diagram of the coordinate system transformation process of the method of the present invention. Let the geodetic coordinate system be H (XOY) and the vehicle body local coordinate system be U (X'O'Y'); let the frame-head coordinates of the K-th frame be (X_KF^H, Y_KF^H) under the geodetic coordinate system and (X_KF^u, Y_KF^u) under the vehicle body coordinate system, and let the frame-tail coordinates of the K-th frame be (X_KL^H, Y_KL^H) under the geodetic coordinate system and (X_KL^u, Y_KL^u) under the vehicle body coordinate system.
The frame-tail coordinates (X_KL^u, Y_KL^u) under the vehicle body coordinate system are transformed to coordinates under the geodetic coordinate system:
(X_KL^H, Y_KL^H) = (X_KL^u, Y_KL^u) R_1
where R_1 is the rotation transformation matrix at the frame-tail moment, i.e. the rotation transformation matrix from the vehicle body coordinate system to the geodetic coordinate system:
R_1 = \begin{bmatrix} \cos\alpha_L & -\sin\alpha_L \\ \sin\alpha_L & \cos\alpha_L \end{bmatrix}
where α_L is the heading angle of the unmanned vehicle at the frame-tail moment.
After the frame-tail coordinates (X_KL^u, Y_KL^u) have been transformed from the vehicle body coordinate system to the geodetic coordinate system, the point coordinates can be translated by combining the inter-frame motion amount S with the approximate heading angle β. The distance travelled by the unmanned vehicle within one laser radar scanning period follows from the vehicle speed, i.e. the inter-frame motion amount is S = VT. The translation of each point coordinate combines the inter-frame motion amount and the approximate heading angle with the moment at which that point cloud data point was acquired.
Let the translated frame-tail coordinates be (X_KL^HF, Y_KL^HF); the translated coordinates are then obtained from:
(X_KL^HF, Y_KL^HF, 1) = (X_KL^H, Y_KL^H, 1) M
where M is the translation transformation matrix:
M = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ S\cos\beta & S\sin\beta & 1 \end{bmatrix}
β is the approximate heading angle.
Fig. 4 is a schematic diagram of the approximate heading angle solution provided by the present invention. As shown in Fig. 4, the approximate heading angle of the laser-radar-carrying vehicle is obtained from the heading angles at the frame head and the frame tail. Over the whole laser radar acquisition period, the heading angle of the unmanned vehicle can be approximated as:
\beta = \alpha_F + \frac{\Delta\alpha}{2}
where Δα = α_L - α_F, α_F is the heading angle of the unmanned vehicle at the frame-head moment, and α_L is the heading angle of the unmanned vehicle at the frame-tail moment.
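As a small worked example (the numbers are illustrative, not from the patent): if α_F = 10° and α_L = 14°, then Δα = α_L - α_F = 4° and β = α_F + Δα/2 = 12°, i.e. the approximate heading angle is the mean of the frame-head and frame-tail heading angles.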
Fig. 5 is a schematic diagram of the point cloud distortion removal process of the method of the present invention. Combining the frame-tail-moment rotation transformation matrix R_1 with the translation transformation matrix M yields the rotation-translation transformation matrix G:
G = \begin{bmatrix} \cos\alpha_L & -\sin\alpha_L & 0 \\ \sin\alpha_L & \cos\alpha_L & 0 \\ S\cos\beta & S\sin\beta & 1 \end{bmatrix}
namely:
(X_KL^HF, Y_KL^HF, 1) = (X_KL^u, Y_KL^u, 1) G
The frame-tail coordinates (X_KL^HF, Y_KL^HF) under the geodetic coordinate system are then transformed back into the vehicle body coordinate system, completing the whole distortion removal process:
(X_KL^UF, Y_KL^UF) = (X_KL^HF, Y_KL^HF) R_2
where R_2 is the rotation transformation matrix at the frame-head moment, i.e. the rotation transformation matrix from the geodetic coordinate system to the vehicle body coordinate system:
R_2 = \begin{bmatrix} \cos\alpha_F & \sin\alpha_F \\ -\sin\alpha_F & \cos\alpha_F \end{bmatrix}
where α_F is the heading angle of the unmanned vehicle at the frame-head moment.
The point cloud coordinates at the frame-tail moment have thus been projected into the vehicle body coordinate system of the frame-head moment; the whole transformation process is shown in Fig. 5.
The above process projects the frame-tail coordinates into the vehicle body coordinate system of the frame-head moment. In the same way, for any i-th point between the frame head and the frame tail, the motion amount S_i of the unmanned vehicle at the moment the laser radar acquires the i-th point is determined first. Since the laser radar rotates uniformly, the time t_i elapsed when the i-th point is acquired is known:
t_i = \frac{iD}{360} T
Obviously:
S_i = V t_i = \frac{VTiD}{360}
assuming that the heading angle is uniformly changed in the acquisition period when the unmanned vehicle turns, the heading angle at the ith point is:
\alpha_i = \alpha_F + \frac{iD}{360} \Delta\alpha
where Δα = α_L - α_F.
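For concreteness, another illustrative computation (numbers assumed, not from the patent): with angular resolution D = 0.2°, acquisition period T = 0.1 s, vehicle speed V = 5 m/s and point index i = 900, we have iD/360 = 0.5, hence t_i = 0.05 s, S_i = V t_i = 0.25 m and α_i = α_F + 0.5Δα; a point halfway through the scan is corrected with half the inter-frame motion and half the heading change.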
The final rotation-translation transformation matrix G_u, applicable to any i-th point, is then obtained:
G_u = \begin{bmatrix} \cos\alpha_i & -\sin\alpha_i & 0 \\ \sin\alpha_i & \cos\alpha_i & 0 \\ \frac{VTiD}{360}\cos\beta & \frac{VTiD}{360}\sin\beta & 1 \end{bmatrix}
where G_u is the rotation-translation transformation matrix, α_i is the heading angle at the i-th point of the frame of point cloud data, V is the speed of the laser-radar-carrying vehicle, T is the acquisition period of the laser radar, D is the angular resolution of the laser radar, and β is the approximate heading angle.
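A minimal numerical sketch of assembling G_u is given below, assuming angles in radians, the row-vector convention used above ((x, y, 1) multiplied on the right by the matrix), and the sign conventions of the matrices as reconstructed here; the function name and argument list are illustrative, not from the patent:

```python
import numpy as np

def rotation_translation_matrix(i, alpha_F, alpha_L, V, T, D):
    """Per-point rotation-translation matrix G_u (row-vector convention).

    i: index of the point within the frame; alpha_F, alpha_L: heading angles
    at the frame-head and frame-tail moments (rad); V: vehicle speed (m/s);
    T: acquisition period (s); D: angular resolution (deg per point).
    """
    frac = i * D / 360.0                             # fraction of the scan completed at point i
    beta = alpha_F + (alpha_L - alpha_F) / 2.0       # approximate heading angle
    alpha_i = alpha_F + (alpha_L - alpha_F) * frac   # heading angle at point i
    S_i = V * T * frac                               # motion amount when point i is acquired
    c, s = np.cos(alpha_i), np.sin(alpha_i)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [S_i * np.cos(beta), S_i * np.sin(beta), 1.0]])
```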
Step 103: transforming the point cloud data from the vehicle body coordinate system to the geodetic coordinate system according to the rotation-translation transformation matrix to obtain the point cloud data under the geodetic coordinate system.
After the approximate heading angle has been obtained, the coordinates of every point between the frame head and the frame tail undergo the rotation-translation transformation, i.e. each point is multiplied by the matrix G_u to obtain the point cloud data under the geodetic coordinate system. Specifically:
the rotation-translation transformation is applied to the i-th point in the frame of point cloud data according to the rotation-translation transformation matrix, i.e. the coordinates of the i-th point under the vehicle body coordinate system are multiplied by the rotation-translation transformation matrix G_u to obtain the coordinates of the i-th point under the geodetic coordinate system, thereby transforming the point cloud data under the vehicle body coordinate system into point cloud data under the geodetic coordinate system.
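Continuing the illustrative sketch introduced after step 102 (all names and numbers remain hypothetical), the multiplication of step 103 for a single body-frame point reads:

```python
import numpy as np

# Illustrative values only: heading going from 10 deg to 14 deg over the frame,
# 5 m/s, 0.1 s scan, 0.2 deg/point, point index 900, body-frame point (12, 3).
alpha_F, alpha_L = np.radians(10.0), np.radians(14.0)
G_u = rotation_translation_matrix(900, alpha_F, alpha_L, V=5.0, T=0.1, D=0.2)
x_geo, y_geo, _ = np.array([12.0, 3.0, 1.0]) @ G_u  # body frame -> geodetic frame
```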
Step 104: transforming the point cloud data under the geodetic coordinate system back into the vehicle body coordinate system to obtain the distortion-free point cloud data.
Once the point cloud data under the vehicle body coordinate system have been transformed into point cloud data under the geodetic coordinate system, the rotation and translation of the point cloud data are complete. The transformed coordinates of each point are then transformed back into the vehicle body coordinate system, i.e. multiplied by the matrix R_2, to obtain the distortion-free point cloud data. Specifically:
the frame-head-moment rotation transformation matrix is determined according to the heading angle at the frame-head moment:
R_2 = \begin{bmatrix} \cos\alpha_F & \sin\alpha_F \\ -\sin\alpha_F & \cos\alpha_F \end{bmatrix}
The i-th point under the geodetic coordinate system is then transformed into the vehicle body coordinate system according to the frame-head-moment rotation transformation matrix R_2, i.e. the coordinates of the i-th point under the geodetic coordinate system are multiplied by R_2 to obtain the coordinates of the i-th point under the vehicle body coordinate system, yielding the distortion-free point cloud data.
The point cloud coordinates obtained after the above steps are the distortion-free point cloud coordinates, which reflect the surrounding environment more truthfully than the original point cloud data.
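Putting steps 101 to 104 together, the following end-to-end sketch vectorizes the per-point correction over one frame. It assumes the matrices as reconstructed above (signs included), angles in radians and points ordered by acquisition time; every name in it is illustrative rather than from the patent:

```python
import numpy as np

def undistort_frame(points, alpha_F, alpha_L, V, T, D):
    """Remove motion distortion from one frame of 2-D laser radar points.

    points: (N, 2) coordinates in the vehicle body coordinate system, ordered
    by acquisition; alpha_F, alpha_L in radians; V in m/s; T in s; D in degrees.
    Returns the corrected (N, 2) coordinates in the frame-head body frame.
    """
    n = points.shape[0]
    beta = alpha_F + (alpha_L - alpha_F) / 2.0      # approximate heading angle
    frac = np.arange(n) * D / 360.0                 # scan fraction at each point
    alpha_i = alpha_F + (alpha_L - alpha_F) * frac  # per-point heading angle
    s_i = V * T * frac                              # per-point motion amount

    # Step 103: vehicle body frame -> geodetic frame (rotate by alpha_i, then
    # translate by s_i along the approximate heading beta).
    x, y = points[:, 0], points[:, 1]
    x_h = x * np.cos(alpha_i) + y * np.sin(alpha_i) + s_i * np.cos(beta)
    y_h = -x * np.sin(alpha_i) + y * np.cos(alpha_i) + s_i * np.sin(beta)

    # Step 104: geodetic frame -> vehicle body frame at the frame-head moment (R_2).
    c_f, s_f = np.cos(alpha_F), np.sin(alpha_F)
    x_u = x_h * c_f - y_h * s_f
    y_u = x_h * s_f + y_h * c_f
    return np.column_stack([x_u, y_u])
```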
It can thus be seen that the method transforms the coordinates of each point in one frame of point cloud data from the vehicle body coordinate system into the geodetic coordinate system, corrects the coordinates of each point according to the inter-frame motion amount, and then transforms the corrected coordinates from the geodetic coordinate system back into the vehicle body coordinate system, completing the distortion removal of the point cloud data. In this way the distortion produced by the motion of the laser radar is removed from the point cloud data it collects; compared with the traditional method, the method requires less computation, offers better real-time performance and higher stability, and is particularly suitable for unmanned vehicles travelling at low speed.
In accordance with the laser radar point cloud distortion removal method described above, the invention further provides a laser radar point cloud distortion removal system. Fig. 6 is a structure diagram of the laser radar point cloud distortion removal system provided by the invention. Referring to Fig. 6, the system comprises:
a point cloud information acquisition module 601, configured to acquire point cloud information of one frame of point cloud data collected by a laser radar, the point cloud information comprising the coordinates of each point in the frame of point cloud data under the vehicle body coordinate system, the heading angle at the frame-head moment, the heading angle at the frame-tail moment, and the speed of the laser-radar-carrying vehicle;
a rotation-translation transformation matrix determination module 602, configured to determine the rotation-translation transformation matrix for the distortion removal process according to the point cloud information;
a first coordinate system transformation module 603, configured to transform the point cloud data from the vehicle body coordinate system to the geodetic coordinate system according to the rotation-translation transformation matrix to obtain the point cloud data under the geodetic coordinate system; and
a second coordinate system transformation module 604, configured to transform the point cloud data under the geodetic coordinate system back into the vehicle body coordinate system to obtain the distortion-free point cloud data.
The rotation-translation transformation matrix determination module 602 specifically comprises:
an approximate heading angle determination unit, configured to determine the approximate heading angle of the laser-radar-carrying vehicle according to the heading angle at the frame-head moment and the heading angle at the frame-tail moment; and
a rotation-translation transformation matrix determination unit, configured to determine the rotation-translation transformation matrix according to the approximate heading angle, the heading angle at the frame-head moment, the heading angle at the frame-tail moment and the vehicle speed.
The rotation-translation transformation matrix determination unit specifically comprises:
a rotation-translation transformation matrix determination subunit, configured to determine the rotation-translation transformation matrix according to the approximate heading angle, the heading angle at the frame-head moment, the heading angle at the frame-tail moment and the vehicle speed as
G_u = \begin{bmatrix} \cos\alpha_i & -\sin\alpha_i & 0 \\ \sin\alpha_i & \cos\alpha_i & 0 \\ \frac{VTiD}{360}\cos\beta & \frac{VTiD}{360}\sin\beta & 1 \end{bmatrix}
where G_u is the rotation-translation transformation matrix, α_i is the heading angle at the i-th point of the frame of point cloud data, V is the speed of the laser-radar-carrying vehicle, T is the acquisition period of the laser radar, D is the angular resolution of the laser radar, and β is the approximate heading angle.
The first coordinate system transformation module 603 specifically comprises:
a first coordinate system transformation unit, configured to apply the rotation-translation transformation to the i-th point in the frame of point cloud data according to the rotation-translation transformation matrix, transforming the i-th point from the vehicle body coordinate system to the geodetic coordinate system.
The second coordinate system transformation module 604 specifically comprises:
a frame-head-moment rotation transformation matrix determination unit, configured to determine the frame-head-moment rotation transformation matrix according to the heading angle at the frame-head moment; and
a second coordinate system transformation unit, configured to transform the i-th point from the geodetic coordinate system into the vehicle body coordinate system according to the frame-head-moment rotation transformation matrix to obtain the distortion-free point cloud data of the i-th point.
The embodiments in this specification are described in a progressive manner: each embodiment focuses on its differences from the other embodiments, and the parts that are the same or similar across embodiments can be referred to one another. Since the system disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively brief, and the relevant points can be found in the description of the method.
Specific examples have been used herein to explain the principles and implementation of the present invention; the above description of the embodiments is only intended to help understand the method and its core idea. Meanwhile, a person skilled in the art may, following the idea of the present invention, make changes to the specific implementation and the scope of application. In summary, the content of this specification should not be construed as limiting the invention.

Claims (6)

1. A laser radar point cloud distortion removal method, the method comprising:
acquiring point cloud information of one frame of point cloud data collected by a laser radar, the point cloud information comprising the coordinates of each point in the frame of point cloud data under the vehicle body coordinate system, the heading angle at the frame-head moment, the heading angle at the frame-tail moment, and the speed of the laser-radar-carrying vehicle;
determining a rotation-translation transformation matrix for the distortion removal process according to the point cloud information, which specifically comprises:
determining the approximate heading angle of the laser-radar-carrying vehicle according to the heading angle at the frame-head moment and the heading angle at the frame-tail moment; and
determining the rotation-translation transformation matrix according to the approximate heading angle, the heading angle at the frame-head moment, the heading angle at the frame-tail moment and the vehicle speed, which specifically comprises:
determining the rotation-translation transformation matrix as
G_u = \begin{bmatrix} \cos\alpha_i & -\sin\alpha_i & 0 \\ \sin\alpha_i & \cos\alpha_i & 0 \\ \frac{VTiD}{360}\cos\beta & \frac{VTiD}{360}\sin\beta & 1 \end{bmatrix}
where G_u is the rotation-translation transformation matrix, α_i is the heading angle at the i-th point of the frame of point cloud data, V is the speed of the laser-radar-carrying vehicle, T is the acquisition period of the laser radar, D is the angular resolution of the laser radar, and β is the approximate heading angle;
transforming the point cloud data from the vehicle body coordinate system to the geodetic coordinate system according to the rotation-translation transformation matrix to obtain the point cloud data under the geodetic coordinate system; and
transforming the point cloud data under the geodetic coordinate system back into the vehicle body coordinate system to obtain the distortion-free point cloud data.
2. The laser radar point cloud distortion removal method according to claim 1, wherein transforming the point cloud data from the vehicle body coordinate system to the geodetic coordinate system according to the rotation-translation transformation matrix to obtain the point cloud data under the geodetic coordinate system comprises:
applying the rotation-translation transformation to the i-th point in the frame of point cloud data according to the rotation-translation transformation matrix, transforming the i-th point from the vehicle body coordinate system to the geodetic coordinate system.
3. The laser radar point cloud distortion removal method according to claim 2, wherein transforming the point cloud data under the geodetic coordinate system back into the vehicle body coordinate system to obtain the distortion-free point cloud data comprises:
determining the frame-head-moment rotation transformation matrix according to the heading angle at the frame-head moment; and
transforming the i-th point from the geodetic coordinate system into the vehicle body coordinate system according to the frame-head-moment rotation transformation matrix to obtain the distortion-free point cloud data of the i-th point.
4. A laser radar point cloud distortion removal system, the system comprising:
a point cloud information acquisition module, configured to acquire point cloud information of one frame of point cloud data collected by a laser radar, the point cloud information comprising the coordinates of each point in the frame of point cloud data under the vehicle body coordinate system, the heading angle at the frame-head moment, the heading angle at the frame-tail moment, and the speed of the laser-radar-carrying vehicle;
a rotation-translation transformation matrix determination module, configured to determine a rotation-translation transformation matrix for the distortion removal process according to the point cloud information;
wherein the rotation-translation transformation matrix determination module specifically comprises:
an approximate heading angle determination unit, configured to determine the approximate heading angle of the laser-radar-carrying vehicle according to the heading angle at the frame-head moment and the heading angle at the frame-tail moment; and
a rotation-translation transformation matrix determination unit, configured to determine the rotation-translation transformation matrix according to the approximate heading angle, the heading angle at the frame-head moment, the heading angle at the frame-tail moment and the vehicle speed;
wherein the rotation-translation transformation matrix determination unit specifically comprises:
a rotation-translation transformation matrix determination subunit, configured to determine the rotation-translation transformation matrix according to the approximate heading angle, the heading angle at the frame-head moment, the heading angle at the frame-tail moment and the vehicle speed as
G_u = \begin{bmatrix} \cos\alpha_i & -\sin\alpha_i & 0 \\ \sin\alpha_i & \cos\alpha_i & 0 \\ \frac{VTiD}{360}\cos\beta & \frac{VTiD}{360}\sin\beta & 1 \end{bmatrix}
where G_u is the rotation-translation transformation matrix, α_i is the heading angle at the i-th point of the frame of point cloud data, V is the speed of the laser-radar-carrying vehicle, T is the acquisition period of the laser radar, D is the angular resolution of the laser radar, and β is the approximate heading angle;
a first coordinate system transformation module, configured to transform the point cloud data from the vehicle body coordinate system to the geodetic coordinate system according to the rotation-translation transformation matrix to obtain the point cloud data under the geodetic coordinate system; and
a second coordinate system transformation module, configured to transform the point cloud data under the geodetic coordinate system back into the vehicle body coordinate system to obtain the distortion-free point cloud data.
5. The laser radar point cloud distortion removal system according to claim 4, wherein the first coordinate system transformation module comprises:
a first coordinate system transformation unit, configured to apply the rotation-translation transformation to the i-th point in the frame of point cloud data according to the rotation-translation transformation matrix, transforming the i-th point from the vehicle body coordinate system to the geodetic coordinate system.
6. The laser radar point cloud distortion removal system according to claim 5, wherein the second coordinate system transformation module comprises:
a frame-head-moment rotation transformation matrix determination unit, configured to determine the frame-head-moment rotation transformation matrix according to the heading angle at the frame-head moment; and
a second coordinate system transformation unit, configured to transform the i-th point from the geodetic coordinate system into the vehicle body coordinate system according to the frame-head-moment rotation transformation matrix to obtain the distortion-free point cloud data of the i-th point.
CN201811479464.XA 2018-12-05 2018-12-05 Laser radar point cloud distortion removal method and system Expired - Fee Related CN109584183B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811479464.XA CN109584183B (en) 2018-12-05 2018-12-05 Laser radar point cloud distortion removal method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811479464.XA CN109584183B (en) 2018-12-05 2018-12-05 Laser radar point cloud distortion removal method and system

Publications (2)

Publication Number Publication Date
CN109584183A (en) 2019-04-05
CN109584183B (en) 2020-05-29

Family

ID=65927282

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811479464.XA Expired - Fee Related CN109584183B (en) 2018-12-05 2018-12-05 Laser radar point cloud distortion removal method and system

Country Status (1)

Country Link
CN (1) CN109584183B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110200552B (en) * 2019-06-20 2020-11-13 小狗电器互联网科技(北京)股份有限公司 Method for removing distortion of measuring end point of laser radar and sweeper
CN110824496B (en) * 2019-09-18 2022-01-14 北京迈格威科技有限公司 Motion estimation method, motion estimation device, computer equipment and storage medium
CN112649815B (en) * 2019-10-10 2023-04-11 华为技术有限公司 Method and device for processing data
CN110888120B (en) * 2019-12-03 2023-04-07 华南农业大学 Method for correcting laser radar point cloud data motion distortion based on integrated navigation system
CN111798397A (en) * 2020-07-08 2020-10-20 上海振华重工电气有限公司 Jitter elimination and rain and fog processing method for laser radar data
WO2022061850A1 (en) * 2020-09-28 2022-03-31 深圳市大疆创新科技有限公司 Point cloud motion distortion correction method and device
CN114372914B (en) * 2022-01-12 2022-09-13 吉林大学 Mechanical laser radar point cloud preprocessing method applied to mining electric shovel
CN116359938B (en) * 2023-05-31 2023-08-25 未来机器人(深圳)有限公司 Object detection method, device and carrying device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102509098A (en) * 2011-10-08 2012-06-20 天津大学 Fisheye image vehicle identification method
CN103439977A (en) * 2013-08-23 2013-12-11 西安应用光学研究所 High-speed target tracking control method applied to photoelectric tracker
CN104952107A (en) * 2015-05-18 2015-09-30 湖南桥康智能科技有限公司 Three-dimensional bridge reconstruction method based on vehicle-mounted LiDAR point cloud data
CN106504275A (en) * 2016-10-12 2017-03-15 杭州深瞳科技有限公司 A kind of inertial positioning and the real-time three-dimensional method for reconstructing of point cloud registering coupling and complementing
CN106997614A (en) * 2017-03-17 2017-08-01 杭州光珀智能科技有限公司 A kind of large scale scene 3D modeling method and its device based on depth camera
CN108932736A (en) * 2018-05-30 2018-12-04 南昌大学 Two-dimensional laser radar Processing Method of Point-clouds and dynamic robot pose calibration method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107123156A (en) * 2017-03-10 2017-09-01 西北工业大学 A kind of active light source projection three-dimensional reconstructing method being combined with binocular stereo vision
CN108020826B (en) * 2017-10-26 2019-11-19 厦门大学 Multi-line laser radar and multichannel camera mixed calibration method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102509098A (en) * 2011-10-08 2012-06-20 天津大学 Fisheye image vehicle identification method
CN103439977A (en) * 2013-08-23 2013-12-11 西安应用光学研究所 High-speed target tracking control method applied to photoelectric tracker
CN104952107A (en) * 2015-05-18 2015-09-30 湖南桥康智能科技有限公司 Three-dimensional bridge reconstruction method based on vehicle-mounted LiDAR point cloud data
CN106504275A (en) * 2016-10-12 2017-03-15 杭州深瞳科技有限公司 A kind of inertial positioning and the real-time three-dimensional method for reconstructing of point cloud registering coupling and complementing
CN106997614A (en) * 2017-03-17 2017-08-01 杭州光珀智能科技有限公司 A kind of large scale scene 3D modeling method and its device based on depth camera
CN108932736A (en) * 2018-05-30 2018-12-04 南昌大学 Two-dimensional laser radar Processing Method of Point-clouds and dynamic robot pose calibration method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LOAM: Lidar Odometry and Mapping in Real-time; Ji Zhang et al.; ResearchGate; 2014-01-31; full text *
Map construction and obstacle detection based on vehicle-mounted multiple laser radars; Lin Hui; China Master's Theses Full-text Database, Information Science and Technology; 2018-01-15; full text *

Also Published As

Publication number Publication date
CN109584183A (en) 2019-04-05

Similar Documents

Publication Publication Date Title
CN109584183B (en) Laser radar point cloud distortion removal method and system
CN108257161B (en) Multi-camera-based vehicle environment three-dimensional reconstruction and motion estimation system and method
CN107255476B (en) Indoor positioning method and device based on inertial data and visual features
WO2019105044A1 (en) Method and system for lens distortion correction and feature extraction
CN107341831B (en) IMU (inertial measurement Unit) -assisted visual feature robust tracking method and device
CN107590827A (en) A kind of indoor mobile robot vision SLAM methods based on Kinect
JP7391701B2 (en) Method, device, storage medium, and program for removing steady lateral deviation
CN105976353A (en) Spatial non-cooperative target pose estimation method based on model and point cloud global matching
CN104318561A (en) Method for detecting vehicle motion information based on integration of binocular stereoscopic vision and optical flow
CN111210478B (en) Common-view-free multi-camera system external parameter calibration method, medium and system
CN111862234A (en) Binocular camera self-calibration method and system
CN110232418B (en) Semantic recognition method, terminal and computer readable storage medium
CN104346833A (en) Vehicle restructing algorithm based on monocular vision
CN115343722A (en) Laser radar SLAM method based on loop detection in large-range scene
CN103413325A (en) Vehicle speed identification method based on vehicle body feature point positioning
CN113240813A (en) Three-dimensional point cloud information determination method and device
CN114842093A (en) Automatic calibration system and calibration method for external parameters of vehicle-mounted monocular camera based on key points
CN106408589A (en) Vehicle-mounted overlooking camera based vehicle movement measurement method
JP6768554B2 (en) Calibration device
CN113076988A (en) Mobile robot vision SLAM key frame self-adaptive screening method based on neural network
CN114219852A (en) Multi-sensor calibration method and device for automatic driving vehicle
CN112419411A (en) Method for realizing visual odometer based on convolutional neural network and optical flow characteristics
CN114897967B (en) Material form identification method for autonomous operation of excavating equipment
CN108665448B (en) Obstacle detection method based on binocular vision
CN116442707A (en) System and method for estimating vertical and pitching motion information of vehicle body based on binocular vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200529