CN109584183A - Laser radar point cloud de-distortion method and system - Google Patents

Laser radar point cloud de-distortion method and system

Info

Publication number
CN109584183A
CN109584183A
Authority
CN
China
Prior art keywords
point cloud
course angle
frame
cloud data
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811479464.XA
Other languages
Chinese (zh)
Other versions
CN109584183B (en)
Inventor
何磊
文龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jilin University
Original Assignee
Jilin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jilin University
Priority to CN201811479464.XA
Publication of CN109584183A
Application granted
Publication of CN109584183B
Active legal status
Anticipated expiration

Links

Classifications

    • G06T5/77
    • G06T3/08
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Landscapes

  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a laser radar point cloud de-distortion method and system. The method first obtains the point cloud information of one frame of point cloud data acquired by the laser radar; the point cloud information includes the coordinates of each point of the frame under the vehicle body coordinate system, the frame-head moment heading angle and frame-tail moment heading angle of the frame, and the speed of the vehicle carrying the laser radar. A rotation-translation transformation matrix for the de-distortion process is then determined according to the point cloud information; the point cloud data are transformed from the body coordinate system to the earth coordinate system according to this matrix, yielding the point cloud data under the earth coordinate system. Finally, the point cloud data under the earth coordinate system are transformed back to the body coordinate system, yielding the de-distorted point cloud data. Compared with conventional methods, the de-distortion method provided by the invention requires less computation, offers better real-time performance and higher stability, and is particularly suitable for unmanned vehicles running at low speed.

Description

Laser radar point cloud de-distortion method and system
Technical field
The present invention relates to the technical field of automobiles and environmental perception, and more particularly to a laser radar point cloud de-distortion method and system.
Background technique
A so-called "intelligent vehicle" is a common vehicle augmented with advanced sensors (radar, cameras), controllers, actuators and other devices that give the vehicle the ability to perceive its surrounding environment. However, when the laser radar rotates to acquire point cloud information of the surrounding environment, the unmanned vehicle is itself in constant motion, so there is a certain amount of motion between the head and the tail of each point cloud frame collected while the vehicle is driving. In other words, the laser points within one frame are not expressed in the same local coordinate system, which inevitably distorts the point cloud. Conventional de-distortion methods extract feature lines or feature planes from two consecutive frames and obtain the inter-frame motion by computing how far the feature lines or planes have moved between the frames. However, such methods are affected by the extraction accuracy of the feature lines and planes, so the results are unstable; moreover, because every point of the cloud must be traversed, the computation is very large and the real-time requirement sometimes cannot be met.
Summary of the invention
The object of the present invention is to provide a laser radar point cloud de-distortion method and system, so as to improve the stability and real-time performance of the de-distortion result while reducing the computation required by the de-distortion process.
To achieve the above object, the present invention provides the following scheme:
A laser radar point cloud de-distortion method, the method comprising:
obtaining the point cloud information of one frame of point cloud data acquired by a laser radar, the point cloud information including the coordinates of each point of the frame under the vehicle body coordinate system, the frame-head moment heading angle and frame-tail moment heading angle of the frame, and the speed of the vehicle carrying the laser radar;
determining a rotation-translation transformation matrix for the de-distortion process according to the point cloud information;
transforming the point cloud data from the body coordinate system to the earth coordinate system according to the rotation-translation transformation matrix, to obtain the point cloud data under the earth coordinate system; and
transforming the point cloud data under the earth coordinate system back to the body coordinate system, to obtain the de-distorted point cloud data.
Optionally, determining the rotation-translation transformation matrix for the de-distortion process according to the point cloud information specifically includes:
determining the approximate heading angle of the vehicle carrying the laser radar according to the frame-head moment heading angle and the frame-tail moment heading angle; and
determining the rotation-translation transformation matrix according to the approximate heading angle, the frame-head moment heading angle, the frame-tail moment heading angle and the speed.
Optionally, determining the rotation-translation transformation matrix according to the approximate heading angle, the frame-head moment heading angle, the frame-tail moment heading angle and the speed specifically includes:
determining the rotation-translation transformation matrix
$$G_u=\begin{bmatrix}\cos\alpha_i&\sin\alpha_i&0\\-\sin\alpha_i&\cos\alpha_i&0\\\frac{iDVT}{360}\cos\beta&\frac{iDVT}{360}\sin\beta&1\end{bmatrix}$$
where $G_u$ is the rotation-translation transformation matrix; $\alpha_i$ is the heading angle at the i-th point of the frame; $V$ is the speed of the vehicle carrying the laser radar; $T$ is the acquisition period of the laser radar; $D$ is the angular resolution of the laser radar; and $\beta$ is the approximate heading angle.
Optionally, transforming the point cloud data from the body coordinate system to the earth coordinate system according to the rotation-translation transformation matrix, to obtain the point cloud data under the earth coordinate system, specifically includes:
performing the rotation-translation transformation on the i-th point of the frame according to the rotation-translation transformation matrix, so as to transform the i-th point under the body coordinate system into the i-th point under the earth coordinate system.
Optionally, transforming the point cloud data under the earth coordinate system back to the body coordinate system, to obtain the de-distorted point cloud data, specifically includes:
determining the frame-head moment rotation transformation matrix according to the frame-head moment heading angle; and
transforming the i-th point under the earth coordinate system back to the body coordinate system according to the frame-head moment rotation transformation matrix, to obtain the de-distorted i-th point cloud data.
A laser radar point cloud de-distortion system, the system comprising:
a point cloud information acquisition module, for obtaining the point cloud information of one frame of point cloud data acquired by a laser radar, the point cloud information including the coordinates of each point of the frame under the vehicle body coordinate system, the frame-head moment heading angle and frame-tail moment heading angle of the frame, and the speed of the vehicle carrying the laser radar;
a rotation-translation transformation matrix determination module, for determining the rotation-translation transformation matrix for the de-distortion process according to the point cloud information;
a first coordinate system transformation module, for transforming the point cloud data from the body coordinate system to the earth coordinate system according to the rotation-translation transformation matrix, to obtain the point cloud data under the earth coordinate system; and
a second coordinate system transformation module, for transforming the point cloud data under the earth coordinate system back to the body coordinate system, to obtain the de-distorted point cloud data.
Optionally, the rotation-translation transformation matrix determination module specifically includes:
an approximate heading angle determination unit, for determining the approximate heading angle of the vehicle carrying the laser radar according to the frame-head moment heading angle and the frame-tail moment heading angle; and
a rotation-translation transformation matrix determination unit, for determining the rotation-translation transformation matrix according to the approximate heading angle, the frame-head moment heading angle, the frame-tail moment heading angle and the speed.
Optionally, the rotation-translation transformation matrix determination unit specifically includes:
a rotation-translation transformation matrix determination subunit, for determining, according to the approximate heading angle, the frame-head moment heading angle, the frame-tail moment heading angle and the speed, the rotation-translation transformation matrix
$$G_u=\begin{bmatrix}\cos\alpha_i&\sin\alpha_i&0\\-\sin\alpha_i&\cos\alpha_i&0\\\frac{iDVT}{360}\cos\beta&\frac{iDVT}{360}\sin\beta&1\end{bmatrix}$$
where $G_u$ is the rotation-translation transformation matrix; $\alpha_i$ is the heading angle at the i-th point of the frame; $V$ is the speed of the vehicle carrying the laser radar; $T$ is the acquisition period of the laser radar; $D$ is the angular resolution of the laser radar; and $\beta$ is the approximate heading angle.
Optionally, the first coordinate system transformation module specifically includes:
a first coordinate system transformation unit, for performing the rotation-translation transformation on the i-th point of the frame according to the rotation-translation transformation matrix, so as to transform the i-th point under the body coordinate system into the i-th point under the earth coordinate system.
Optionally, the second coordinate system transformation module specifically includes:
a frame-head moment rotation transformation matrix determination unit, for determining the frame-head moment rotation transformation matrix according to the frame-head moment heading angle; and
a second coordinate system transformation unit, for transforming the i-th point under the earth coordinate system back to the body coordinate system according to the frame-head moment rotation transformation matrix, to obtain the de-distorted i-th point cloud data.
According to the specific embodiments provided by the present invention, the invention discloses the following technical effects:
The present invention provides a laser radar point cloud de-distortion method and system, which can remove from the point cloud data collected by a laser radar the distortion generated by the motion of the laser radar itself. The method first obtains the point cloud information of one frame of point cloud data acquired by the laser radar; the point cloud information includes the coordinates of each point of the frame under the body coordinate system, the frame-head moment heading angle and frame-tail moment heading angle of the frame, and the speed of the vehicle carrying the laser radar. A rotation-translation transformation matrix for the de-distortion process is then determined according to the point cloud information; the point cloud data are transformed from the body coordinate system to the earth coordinate system according to this matrix, yielding the point cloud data under the earth coordinate system. Finally, the point cloud data under the earth coordinate system are transformed back to the body coordinate system, yielding the de-distorted point cloud data. Compared with conventional methods, the de-distortion method provided by the invention requires less computation, offers better real-time performance and higher stability, and is particularly suitable for unmanned vehicles running at low speed.
Detailed description of the invention
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from these drawings without creative labor.
Fig. 1 is a flowchart of the laser radar point cloud de-distortion method provided by the present invention;
Fig. 2 is a schematic diagram of the heading angle of the unmanned vehicle provided by the present invention;
Fig. 3 is a schematic diagram of the coordinate system transformation process of the method of the present invention;
Fig. 4 is a schematic diagram of solving the approximate heading angle provided by the present invention;
Fig. 5 is a schematic diagram of the point cloud de-distortion process of the method of the present invention;
Fig. 6 is a structural diagram of the laser radar point cloud de-distortion system provided by the present invention.
Specific embodiment
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative labor shall fall within the protection scope of the present invention.
The object of the present invention is to provide a laser radar point cloud de-distortion method and system, so as to improve the stability and real-time performance of the de-distortion result while reducing the computation required by the de-distortion process.
In order to make the above objects, features and advantages of the present invention clearer and easier to understand, the present invention is described in further detail below with reference to the drawings and specific embodiments.
Fig. 1 is a flowchart of the laser radar point cloud de-distortion method provided by the present invention. Referring to Fig. 1, the method specifically includes:
Step 101: obtain the point cloud information of one frame of point cloud data acquired by the laser radar.
Given the status information of the vehicle carrying the laser radar and of the laser radar itself, the method of the invention can eliminate or reduce the point cloud distortion generated in the point cloud data by the motion of the laser radar itself, so that the point cloud data more truly reflect the detected environmental information.
The point cloud information acquired by the present invention includes the coordinates of each point of the frame under the body coordinate system: for the k-th frame, the coordinates (X_kF^u, Y_kF^u) of the frame head under the body coordinate system, the coordinates (X_kL^u, Y_kL^u) of the frame tail under the body coordinate system, and the coordinates under the body coordinate system of each i-th point of the frame other than the frame head and the frame tail.
The point cloud information further includes the speed V of the vehicle carrying the laser radar, the heading angle α of the vehicle, the acquisition period T of the laser radar, and the angular resolution D of the laser radar. The vehicle carrying the laser radar is usually an unmanned vehicle; V is the speed of the unmanned vehicle and can be read from the wheel speed sensors on the unmanned vehicle. The heading angle α of the vehicle includes the frame-head moment heading angle αF and the frame-tail moment heading angle αL, which can be read from the inertial navigation system on the unmanned vehicle. The frame-head moment heading angle αF is the heading angle of the unmanned vehicle at the frame-head moment, and the frame-tail moment heading angle αL is the heading angle of the unmanned vehicle at the frame-tail moment. Fig. 2 is a schematic diagram of the heading angle of the unmanned vehicle, in which XOY is the earth coordinate system and X'O'Y' is the body coordinate system. As shown in Fig. 2, the heading angle α is positive when the unmanned vehicle turns right and negative when it turns left.
Step 102: determine the rotation-translation transformation matrix for the de-distortion process according to the point cloud information.
Fig. 3 is a schematic diagram of the coordinate system transformation process of the method. Let the earth coordinate system be H (XOY) and the body local coordinate system be U (X'O'Y'). For the k-th frame, the coordinates of the frame head are (X_kF^H, Y_kF^H) under the earth coordinate system and (X_kF^u, Y_kF^u) under the body coordinate system; the coordinates of the frame tail are (X_kL^H, Y_kL^H) under the earth coordinate system and (X_kL^u, Y_kL^u) under the body coordinate system.
The coordinates (X_kL^u, Y_kL^u) of the frame tail under the body coordinate system are first transformed into coordinates under the earth coordinate system:
(X_kL^H, Y_kL^H) = (X_kL^u, Y_kL^u) R_1
where R_1 is the frame-tail moment rotation transformation matrix, i.e. the rotation from the body coordinate system to the earth coordinate system,
$$R_1=\begin{bmatrix}\cos\alpha_L&\sin\alpha_L\\-\sin\alpha_L&\cos\alpha_L\end{bmatrix}$$
and αL is the heading angle of the unmanned vehicle at the frame-tail moment.
After the frame-tail coordinates (X_kL^u, Y_kL^u) have been transformed from the body coordinate system to the earth coordinate system, they are translated using the inter-frame motion S and the approximate heading angle β. The distance travelled by the unmanned vehicle within one laser radar scanning period follows from the vehicle speed: the inter-frame motion is S = VT. The coordinates collected at each moment are then translated according to the inter-frame motion and the approximate heading angle.
Let the translated frame-tail coordinates be (X_kL^HF, Y_kL^HF); they are obtained from
(X_kL^HF, Y_kL^HF, 1) = (X_kL^H, Y_kL^H, 1) M
where M is the translation transformation matrix
$$M=\begin{bmatrix}1&0&0\\0&1&0\\S\cos\beta&S\sin\beta&1\end{bmatrix}$$
and β is the approximate heading angle.
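The frame-tail transformation above (rotation R_1 from the body coordinate system to the earth coordinate system, followed by the translation M over the inter-frame motion S = VT) can be sketched in NumPy. The row-vector homogeneous convention and the matrix sign conventions are illustrative assumptions, since the patent's original matrix figures are not reproduced in this text:

```python
import numpy as np

def rot_body_to_earth(alpha):
    """Homogeneous rotation for row vectors (x, y, 1) by the heading angle
    alpha; the sign convention is an assumption for illustration."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[ c,   s,   0.0],
                     [-s,   c,   0.0],
                     [0.0, 0.0, 1.0]])

def translation(S, beta):
    """Translate a row vector by the inter-frame motion S along the
    approximate heading angle beta."""
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [S * np.cos(beta), S * np.sin(beta), 1.0]])

# Frame-tail point in the body coordinate system; heading, speed, scan
# period and approximate heading angle are all illustrative values.
p_u = np.array([2.0, 1.0, 1.0])
alpha_L, V, T, beta = 0.1, 5.0, 0.1, 0.05
p_h = p_u @ rot_body_to_earth(alpha_L)    # body -> earth (matrix R1)
p_hf = p_h @ translation(V * T, beta)     # shift by inter-frame motion S = VT
```

With these conventions, rotating preserves the distance of the point from the origin, and the translation shifts the point by exactly S along the direction β.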
Fig. 4 is a schematic diagram of solving the approximate heading angle. As shown in Fig. 4, the approximate heading angle of the vehicle carrying the laser radar can be calculated from the heading angles at the frame head and the frame tail. Over the entire laser radar acquisition period, the heading angle of the unmanned vehicle can be approximated as
$$\beta=\alpha_F+\frac{\Delta\alpha}{2}$$
where Δα = αL − αF, αF is the heading angle of the unmanned vehicle at the frame-head moment, and αL is the heading angle of the unmanned vehicle at the frame-tail moment.
Fig. 5 is a schematic diagram of the point cloud de-distortion process of the method. Combining the frame-tail moment rotation transformation matrix R_1 with the translation yields the rotation-translation transformation matrix G:
$$G=\begin{bmatrix}\cos\alpha_L&\sin\alpha_L&0\\-\sin\alpha_L&\cos\alpha_L&0\\S\cos\beta&S\sin\beta&1\end{bmatrix}$$
that is:
(X_kL^HF, Y_kL^HF, 1) = (X_kL^u, Y_kL^u, 1) G
The coordinates (X_kL^HF, Y_kL^HF) of the frame tail under the earth coordinate system are then transformed back to the body coordinate system, which completes the whole de-distortion process:
(X_kL^uF, Y_kL^uF) = (X_kL^HF, Y_kL^HF) R_2
where R_2 is the frame-head moment rotation transformation matrix, i.e. the rotation from the earth coordinate system to the body coordinate system,
$$R_2=\begin{bmatrix}\cos\alpha_F&-\sin\alpha_F\\\sin\alpha_F&\cos\alpha_F\end{bmatrix}$$
and αF is the heading angle of the unmanned vehicle at the frame-head moment, as shown in Fig. 5.
At this point, the point cloud coordinates of the frame-tail moment have been projected into the body coordinate system of the frame-head moment; the whole transformation process is shown in Fig. 5.
The above process projects the coordinates of the frame-tail moment into the body coordinate system of the frame-head moment. For any i-th point between the frame head and the frame tail, the motion S_i of the unmanned vehicle up to the moment the laser radar collects the i-th point is found first. Since the laser radar rotates at a uniform speed, the time t_i elapsed when the i-th point is collected is
$$t_i=\frac{iD}{360}T$$
and obviously
$$S_i=Vt_i=\frac{iDVT}{360}$$
Assuming that the heading angle varies uniformly within the acquisition period when the unmanned vehicle turns, the heading angle at the i-th point is
$$\alpha_i=\alpha_F+\frac{iD}{360}\Delta\alpha$$
where Δα = αL − αF.
At this point the final rotation-translation transformation matrix G_u, applicable to any i-th point, is obtained:
$$G_u=\begin{bmatrix}\cos\alpha_i&\sin\alpha_i&0\\-\sin\alpha_i&\cos\alpha_i&0\\\frac{iDVT}{360}\cos\beta&\frac{iDVT}{360}\sin\beta&1\end{bmatrix}$$
where G_u is the rotation-translation transformation matrix; α_i is the heading angle at the i-th point of the frame; V is the speed of the vehicle carrying the laser radar; T is the acquisition period of the laser radar; D is the angular resolution of the laser radar; and β is the approximate heading angle.
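The per-point quantities t_i, S_i and α_i and the matrix G_u can be computed as below. The closed forms follow the uniform-rotation and uniform-turning assumptions stated in the text; the exact sign conventions are an assumption, as the patent's matrix figures are not reproduced here:

```python
import numpy as np

def point_transform(i, alpha_F, alpha_L, V, T, D):
    """Per-point rotation-translation matrix G_u for the i-th point of a
    frame (row-vector homogeneous convention; signs are an assumption)."""
    t_i = i * D / 360.0 * T                       # elapsed time (uniform spin)
    S_i = V * t_i                                 # vehicle motion up to point i
    d_alpha = alpha_L - alpha_F
    alpha_i = alpha_F + i * D / 360.0 * d_alpha   # uniformly varying heading
    beta = alpha_F + d_alpha / 2.0                # approximate heading angle
    c, s = np.cos(alpha_i), np.sin(alpha_i)
    return np.array([[ c,  s, 0.0],
                     [-s,  c, 0.0],
                     [S_i * np.cos(beta), S_i * np.sin(beta), 1.0]])
```

At i = 0 the matrix reduces to a pure rotation by the frame-head heading (no motion yet); at i·D = 360° (the frame tail) the heading reaches αL and the translation magnitude reaches the full inter-frame motion S = VT.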
Step 103: transform the point cloud data from the body coordinate system to the earth coordinate system according to the rotation-translation transformation matrix, to obtain the point cloud data under the earth coordinate system.
After the approximate heading angle has been obtained, the coordinates of each point from the frame head to the frame tail undergo the rotation-translation transformation, i.e. each point is multiplied by the matrix G_u, giving the point cloud data under the earth coordinate system. Specifically:
the rotation-translation transformation is performed on the i-th point of the frame according to the rotation-translation transformation matrix, i.e. the coordinates of the i-th point under the body coordinate system are multiplied by G_u to obtain the coordinates of the i-th point under the earth coordinate system, which completes the conversion of the point cloud data under the body coordinate system into point cloud data under the earth coordinate system.
Step 104: transform the point cloud data under the earth coordinate system back to the body coordinate system, to obtain the de-distorted point cloud data.
Once the point cloud data under the body coordinate system have been converted into point cloud data under the earth coordinate system, the rotation and translation transformations of the point cloud data are complete; the coordinates of each transformed point are then transformed back to the body coordinate system, i.e. multiplied by the matrix R_2, giving the de-distorted point cloud data. Specifically:
the frame-head moment rotation transformation matrix R_2 is determined according to the frame-head moment heading angle; and
the i-th point under the earth coordinate system is transformed back to the body coordinate system according to the frame-head moment rotation transformation matrix R_2, i.e. the coordinates of the i-th point under the earth coordinate system are multiplied by R_2 to obtain the coordinates of the i-th point under the body coordinate system, giving the de-distorted point cloud data.
The point cloud coordinates obtained after the above steps are the de-distorted point cloud coordinates, which reflect the surrounding environmental information more truly than the original point cloud data.
It can thus be seen that the present invention transforms the coordinates of each point of one frame of point cloud data from the body coordinate system to the earth coordinate system, corrects the coordinates of each point according to the inter-frame motion, and then transforms the corrected coordinates from the earth coordinate system back to the body coordinate system, completing the de-distortion of the point cloud data and removing from the collected point cloud data the distortion generated by the motion of the laser radar itself. Compared with conventional methods, this method requires less computation, offers better real-time performance and higher stability, and is particularly suitable for unmanned vehicles running at low speed.
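The complete pipeline of steps 101 through 104 can be sketched end to end. This is an illustrative implementation under the conventions assumed above (row-vector rotations, linearly interpolated heading), not the patent's verbatim code; `deskew_frame` is a hypothetical helper name:

```python
import numpy as np

def deskew_frame(points_u, alpha_F, alpha_L, V, T, D):
    """De-distort one frame: project each point into the earth coordinate
    system via its per-point rotation-translation (matrix G_u), then back
    into the frame-head body coordinate system (matrix R2)."""
    points_u = np.asarray(points_u, dtype=float)
    d_alpha = alpha_L - alpha_F
    beta = alpha_F + d_alpha / 2.0               # approximate heading angle
    cF, sF = np.cos(alpha_F), np.sin(alpha_F)    # frame-head heading terms
    out = np.empty_like(points_u)
    for i, (x, y) in enumerate(points_u):
        t_i = i * D / 360.0 * T                  # elapsed time at point i
        alpha_i = alpha_F + i * D / 360.0 * d_alpha
        ci, si = np.cos(alpha_i), np.sin(alpha_i)
        # body -> earth at the point's own heading, translated by S_i = V*t_i
        xe = x * ci - y * si + V * t_i * np.cos(beta)
        ye = x * si + y * ci + V * t_i * np.sin(beta)
        # earth -> body at the frame-head heading (matrix R2)
        out[i, 0] = xe * cF + ye * sF
        out[i, 1] = -xe * sF + ye * cF
    return out
```

A useful sanity check of the design: when the vehicle is stationary and does not turn (V = 0, αF = αL), the forward rotation and the inverse rotation cancel and the frame comes back unchanged; likewise the frame-head point (i = 0) is always left fixed, since it already lives in the frame-head body coordinate system.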
Based on the laser radar point cloud de-distortion method provided by the present invention, the present invention also provides a laser radar point cloud de-distortion system. Fig. 6 is a structural diagram of the laser radar point cloud de-distortion system. Referring to Fig. 6, the system includes:
a point cloud information acquisition module 601, for obtaining the point cloud information of one frame of point cloud data acquired by the laser radar, the point cloud information including the coordinates of each point of the frame under the body coordinate system, the frame-head moment heading angle and frame-tail moment heading angle of the frame, and the speed of the vehicle carrying the laser radar;
a rotation-translation transformation matrix determination module 602, for determining the rotation-translation transformation matrix for the de-distortion process according to the point cloud information;
a first coordinate system transformation module 603, for transforming the point cloud data from the body coordinate system to the earth coordinate system according to the rotation-translation transformation matrix, to obtain the point cloud data under the earth coordinate system; and
a second coordinate system transformation module 604, for transforming the point cloud data under the earth coordinate system back to the body coordinate system, to obtain the de-distorted point cloud data.
The rotation-translation transformation matrix determination module 602 specifically includes:
an approximate heading angle determination unit, for determining the approximate heading angle of the vehicle carrying the laser radar according to the frame-head moment heading angle and the frame-tail moment heading angle; and
a rotation-translation transformation matrix determination unit, for determining the rotation-translation transformation matrix according to the approximate heading angle, the frame-head moment heading angle, the frame-tail moment heading angle and the speed.
The rotation-translation transformation matrix determination unit specifically includes:
a rotation-translation transformation matrix determination subunit, for determining, according to the approximate heading angle, the frame-head moment heading angle, the frame-tail moment heading angle and the speed, the rotation-translation transformation matrix
$$G_u=\begin{bmatrix}\cos\alpha_i&\sin\alpha_i&0\\-\sin\alpha_i&\cos\alpha_i&0\\\frac{iDVT}{360}\cos\beta&\frac{iDVT}{360}\sin\beta&1\end{bmatrix}$$
where G_u is the rotation-translation transformation matrix; α_i is the heading angle at the i-th point of the frame; V is the speed of the vehicle carrying the laser radar; T is the acquisition period of the laser radar; D is the angular resolution of the laser radar; and β is the approximate heading angle.
The first coordinate system transformation module 603 specifically includes:
a first coordinate system transformation unit, for performing the rotation-translation transformation on the i-th point of the frame according to the rotation-translation transformation matrix, so as to transform the i-th point under the body coordinate system into the i-th point under the earth coordinate system.
The second coordinate system transformation module 604 specifically includes:
a frame-head moment rotation transformation matrix determination unit, for determining the frame-head moment rotation transformation matrix according to the frame-head moment heading angle; and
a second coordinate system transformation unit, for transforming the i-th point under the earth coordinate system back to the body coordinate system according to the frame-head moment rotation transformation matrix, to obtain the de-distorted i-th point cloud data.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the identical or similar parts of the embodiments may refer to one another. As the system disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively simple, and the relevant points can be found in the description of the method.
Specific examples are used herein to illustrate the principle and implementation of the present invention; the above description of the embodiments is intended only to help understand the method of the present invention and its core idea. Meanwhile, for those skilled in the art, there will be changes in the specific implementation and application scope according to the idea of the present invention. In conclusion, the content of this specification shall not be construed as a limitation of the present invention.

Claims (10)

1. A laser radar point cloud de-distortion method, characterized in that the method comprises:
obtaining the point cloud information of one frame of point cloud data acquired by a laser radar, the point cloud information including the coordinates of each point of the frame under the vehicle body coordinate system, the frame-head moment heading angle and frame-tail moment heading angle of the frame, and the speed of the vehicle carrying the laser radar;
determining a rotation-translation transformation matrix for the de-distortion process according to the point cloud information;
transforming the point cloud data from the body coordinate system to the earth coordinate system according to the rotation-translation transformation matrix, to obtain the point cloud data under the earth coordinate system; and
transforming the point cloud data under the earth coordinate system back to the body coordinate system, to obtain the de-distorted point cloud data.
2. The laser radar point cloud de-distortion method according to claim 1, characterized in that determining the rotation-translation transformation matrix for the de-distortion process according to the point cloud information specifically includes:
determining the approximate heading angle of the vehicle carrying the laser radar according to the frame-head moment heading angle and the frame-tail moment heading angle; and
determining the rotation-translation transformation matrix according to the approximate heading angle, the frame-head moment heading angle, the frame-tail moment heading angle and the speed.
3. The laser radar point cloud de-distortion method according to claim 2, characterized in that determining the rotation-translation transformation matrix according to the approximate course angle, the frame-head-moment course angle, the frame-tail-moment course angle, and the speed specifically comprises:
determining the rotation-translation transformation matrix G_u according to the approximate course angle, the frame-head-moment course angle, the frame-tail-moment course angle, and the speed, wherein G_u is the rotation-translation transformation matrix; α_i is the course angle at the i-th point in the frame of point cloud data; v is the speed of the vehicle carrying the laser radar; T is the acquisition period of the laser radar; d is the angular resolution of the laser radar; and β is the approximate course angle.
4. The laser radar point cloud de-distortion method according to claim 3, characterized in that converting the point cloud data from the vehicle-body coordinate system into the geodetic coordinate system according to the rotation-translation transformation matrix, to obtain the point cloud data in the geodetic coordinate system, specifically comprises:
performing a rotation-translation transformation on the i-th point in the frame of point cloud data according to the rotation-translation transformation matrix, to transform the i-th point in the vehicle-body coordinate system into the i-th point in the geodetic coordinate system.
5. The laser radar point cloud de-distortion method according to claim 4, characterized in that converting the point cloud data in the geodetic coordinate system back into the vehicle-body coordinate system, to obtain the de-distorted point cloud data, specifically comprises:
determining a frame-head-moment rotational transformation matrix according to the frame-head-moment course angle;
transforming the i-th point in the geodetic coordinate system back into the vehicle-body coordinate system according to the frame-head-moment rotational transformation matrix, to obtain the i-th point of the de-distorted point cloud data.
6. A laser radar point cloud de-distortion system, characterized in that the system comprises:
a point cloud information acquisition module, configured to obtain point cloud information of a frame of point cloud data acquired by a laser radar, the point cloud information comprising the coordinates of each point of the frame of point cloud data in a vehicle-body coordinate system, the frame-head-moment course angle and the frame-tail-moment course angle of the frame of point cloud data, and the speed of the vehicle carrying the laser radar;
a rotation-translation transformation matrix determination module, configured to determine a rotation-translation transformation matrix for the de-distortion process according to the point cloud information;
a first coordinate system transformation module, configured to convert the point cloud data from the vehicle-body coordinate system into a geodetic coordinate system according to the rotation-translation transformation matrix, to obtain the point cloud data in the geodetic coordinate system;
a second coordinate system transformation module, configured to convert the point cloud data in the geodetic coordinate system back into the vehicle-body coordinate system, to obtain the de-distorted point cloud data.
7. The laser radar point cloud de-distortion system according to claim 6, characterized in that the rotation-translation transformation matrix determination module specifically comprises:
an approximate course angle determination unit, configured to determine the approximate course angle of the vehicle carrying the laser radar according to the frame-head-moment course angle and the frame-tail-moment course angle;
a rotation-translation transformation matrix determination unit, configured to determine the rotation-translation transformation matrix according to the approximate course angle, the frame-head-moment course angle, the frame-tail-moment course angle, and the speed.
8. The laser radar point cloud de-distortion system according to claim 7, characterized in that the rotation-translation transformation matrix determination unit specifically comprises:
a rotation-translation transformation matrix determination subunit, configured to determine, according to the approximate course angle, the frame-head-moment course angle, the frame-tail-moment course angle, and the speed, the rotation-translation transformation matrix G_u, wherein G_u is the rotation-translation transformation matrix; α_i is the course angle at the i-th point in the frame of point cloud data; v is the speed of the vehicle carrying the laser radar; T is the acquisition period of the laser radar; d is the angular resolution of the laser radar; and β is the approximate course angle.
9. The laser radar point cloud de-distortion system according to claim 8, characterized in that the first coordinate system transformation module specifically comprises:
a first coordinate system transformation unit, configured to perform a rotation-translation transformation on the i-th point in the frame of point cloud data according to the rotation-translation transformation matrix, to transform the i-th point in the vehicle-body coordinate system into the i-th point in the geodetic coordinate system.
10. The laser radar point cloud de-distortion system according to claim 9, characterized in that the second coordinate system transformation module specifically comprises:
a frame-head-moment rotational transformation matrix determination unit, configured to determine a frame-head-moment rotational transformation matrix according to the frame-head-moment course angle;
a second coordinate system transformation unit, configured to transform the i-th point in the geodetic coordinate system back into the vehicle-body coordinate system according to the frame-head-moment rotational transformation matrix, to obtain the i-th point of the de-distorted point cloud data.
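The method of claims 1 through 5 can be sketched in code. The sketch below is illustrative only: the patent's exact matrix G_u is not reproduced in this text, so a common planar homogeneous rotation-translation form is assumed, along with linear interpolation of the per-point course angle α_i between the frame-head and frame-tail course angles and β taken as their mean. All names (`undistort_frame`, parameter names) are hypothetical, not from the patent.

```python
import math

def undistort_frame(points, yaw_head, yaw_tail, speed, scan_period, ang_res_deg):
    """De-distort one LiDAR frame following the four claimed steps.

    points        : list of (x, y) coordinates in the vehicle-body frame
    yaw_head/tail : course angle (rad) at the frame-head / frame-tail moment
    speed         : vehicle speed v (m/s)
    scan_period   : LiDAR acquisition period T (s)
    ang_res_deg   : LiDAR angular resolution d (degrees per point)
    """
    beta = 0.5 * (yaw_head + yaw_tail)  # assumed approximate course angle
    ground = []
    for i, (x, y) in enumerate(points):
        # Assumed linear interpolation of the course angle over the sweep.
        frac = i * ang_res_deg / 360.0
        a_i = yaw_head + (yaw_tail - yaw_head) * frac
        t_i = scan_period * frac  # assumed elapsed time at point i
        # Steps 2-3: rotation-translation (assumed form of G_u),
        # vehicle-body frame -> geodetic frame.
        gx = math.cos(a_i) * x - math.sin(a_i) * y + speed * t_i * math.cos(beta)
        gy = math.sin(a_i) * x + math.cos(a_i) * y + speed * t_i * math.sin(beta)
        ground.append((gx, gy))
    # Step 4 (claim 5): rotate back with the single frame-head course angle
    # so the whole frame shares one undistorted vehicle-body frame.
    c, s = math.cos(yaw_head), math.sin(yaw_head)
    return [(c * gx + s * gy, -s * gx + c * gy) for gx, gy in ground]
```

A quick sanity check on this sketch: for a stationary vehicle (zero speed, identical frame-head and frame-tail course angles) every point is rotated into the geodetic frame and rotated straight back, so the transform reduces to the identity.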
CN201811479464.XA 2018-12-05 2018-12-05 Laser radar point cloud distortion removal method and system Active CN109584183B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811479464.XA CN109584183B (en) 2018-12-05 2018-12-05 Laser radar point cloud distortion removal method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811479464.XA CN109584183B (en) 2018-12-05 2018-12-05 Laser radar point cloud distortion removal method and system

Publications (2)

Publication Number Publication Date
CN109584183A true CN109584183A (en) 2019-04-05
CN109584183B CN109584183B (en) 2020-05-29

Family

ID=65927282

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811479464.XA Active CN109584183B (en) 2018-12-05 2018-12-05 Laser radar point cloud distortion removal method and system

Country Status (1)

Country Link
CN (1) CN109584183B (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102509098A (en) * 2011-10-08 2012-06-20 天津大学 Fisheye image vehicle identification method
CN103439977A (en) * 2013-08-23 2013-12-11 西安应用光学研究所 High-speed target tracking control method applied to photoelectric tracker
CN104952107A (en) * 2015-05-18 2015-09-30 湖南桥康智能科技有限公司 Three-dimensional bridge reconstruction method based on vehicle-mounted LiDAR point cloud data
CN106504275A (en) * 2016-10-12 2017-03-15 杭州深瞳科技有限公司 A kind of inertial positioning and the real-time three-dimensional method for reconstructing of point cloud registering coupling and complementing
CN106997614A (en) * 2017-03-17 2017-08-01 杭州光珀智能科技有限公司 A kind of large scale scene 3D modeling method and its device based on depth camera
CN107123156A (en) * 2017-03-10 2017-09-01 西北工业大学 A kind of active light source projection three-dimensional reconstructing method being combined with binocular stereo vision
CN108020826A (en) * 2017-10-26 2018-05-11 厦门大学 Multi-line laser radar and multichannel camera mixed calibration method
CN108932736A (en) * 2018-05-30 2018-12-04 南昌大学 Two-dimensional laser radar Processing Method of Point-clouds and dynamic robot pose calibration method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JI ZHANG et al.: "LOAM: Lidar Odometry and Mapping in Real-time", ResearchGate *
LIN HUI: "Map Construction and Obstacle Detection Based on Vehicle-mounted Multiple Laser Radars", China Master's Theses Full-text Database, Information Science and Technology Series *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110200552A (en) * 2019-06-20 2019-09-06 小狗电器互联网科技(北京)股份有限公司 The measurement terminals of laser radar are gone with the method and sweeper of distortion
CN110824496A (en) * 2019-09-18 2020-02-21 北京迈格威科技有限公司 Motion estimation method, motion estimation device, computer equipment and storage medium
CN112649815A (en) * 2019-10-10 2021-04-13 华为技术有限公司 Method and device for processing data
CN110888120A (en) * 2019-12-03 2020-03-17 华南农业大学 Method for correcting laser radar point cloud data motion distortion based on integrated navigation system
CN110888120B (en) * 2019-12-03 2023-04-07 华南农业大学 Method for correcting laser radar point cloud data motion distortion based on integrated navigation system
CN111798397A (en) * 2020-07-08 2020-10-20 上海振华重工电气有限公司 Jitter elimination and rain and fog processing method for laser radar data
WO2022061850A1 (en) * 2020-09-28 2022-03-31 深圳市大疆创新科技有限公司 Point cloud motion distortion correction method and device
CN114372914A (en) * 2022-01-12 2022-04-19 吉林大学 Mechanical laser radar point cloud preprocessing method applied to mining electric shovel
CN116359938A (en) * 2023-05-31 2023-06-30 未来机器人(深圳)有限公司 Object detection method, device and carrying device
CN116359938B (en) * 2023-05-31 2023-08-25 未来机器人(深圳)有限公司 Object detection method, device and carrying device

Also Published As

Publication number Publication date
CN109584183B (en) 2020-05-29

Similar Documents

Publication Publication Date Title
CN109584183A (en) Laser radar point cloud de-distortion method and system
US10867409B2 (en) Methods and systems to compensate for vehicle calibration errors
CN109166140B (en) Vehicle motion track estimation method and system based on multi-line laser radar
CN111986506B (en) Mechanical parking space parking method based on multi-vision system
CN110243358A (en) The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion
CN105172793B (en) The pose evaluation method of autonomous driving vehicle
CN104061899B (en) A kind of vehicle side inclination angle based on Kalman filtering and angle of pitch method of estimation
CN111907516B (en) Full-automatic parking method and system
CN109579847A (en) Extraction method of key frame, device and smart machine in synchronous superposition
US10929995B2 (en) Method and apparatus for predicting depth completion error-map for high-confidence dense point-cloud
JP2021049969A (en) Systems and methods for calibrating steering wheel neutral position
CN110307850A (en) Reckoning localization method and automated parking system
CN113819914A (en) Map construction method and device
CN104159078A (en) Plotting-type independent module dynamic trajectory reversing aid system and method thereof
CN107664504A (en) A kind of path planning apparatus
CN110569602B (en) Data acquisition method and system for unmanned vehicle
CN105987697B (en) The wheeled AGV navigation locating method of Mecanum and system under a kind of quarter bend
CN102774380A (en) Method for judging running state of vehicle
JP2007309670A (en) Vehicle position detector
JP2010089698A (en) Automatic driving system and automatic driving method
CN112099378B (en) Front vehicle lateral motion state real-time estimation method considering random measurement time lag
CN113252022A (en) Map data processing method and device
CN115540850A (en) Unmanned vehicle mapping method combining laser radar and acceleration sensor
CN110901638A (en) Driving assistance method and system
CN113405555B (en) Automatic driving positioning sensing method, system and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant