CN112083433A - Laser radar distortion removal method applied to two-wheeled mobile robot - Google Patents


Info

Publication number
CN112083433A
Authority
CN
China
Prior art keywords
data
queue
time
distortion removal
array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010704463.1A
Other languages
Chinese (zh)
Other versions
CN112083433B (en)
Inventor
董辉
袁登鹏
董浩
吴祥
吴宇航
童涛
夏启剑
周俊阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University of Technology ZJUT
Original Assignee
Zhejiang University of Technology ZJUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University of Technology ZJUT
Priority to CN202010704463.1A
Publication of CN112083433A
Application granted
Publication of CN112083433B
Active legal status
Anticipated expiration legal status

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/10: Navigation by using measurements of speed or acceleration
    • G01C 21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165: Inertial navigation combined with non-inertial navigation instruments
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 22/00: Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/93: Lidar systems specially adapted for specific applications for anti-collision purposes

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a lidar distortion removal method applied to a two-wheeled mobile robot, which comprises the following steps: acquiring the original pose data of every laser point in one lidar frame; acquiring the first attitude data of the odometer over a first time period; acquiring the acceleration data and the Z-axis angular velocity data of the IMU over a second time period; performing a first distortion removal on the original pose data using the Z-axis angular velocity data of the IMU; compensating the first attitude data of the odometer with the acceleration data of the IMU to obtain compensation data; and performing a second distortion removal on the once-corrected data using the compensation data to obtain the final distortion-removed lidar data. The method accurately reflects the motion of the lidar and decouples the lidar's distortion removal from state estimation.

Description

Laser radar distortion removal method applied to two-wheeled mobile robot
Technical Field
The application belongs to the technical field of robot mapping and positioning, and particularly relates to a laser radar distortion removal method applied to a two-wheeled mobile robot.
Background
As a new type of transport tool, the mobile robot offers great advantages: it reduces labor intensity and improves production efficiency, and it frees people from dangerous, harsh, and heavy working environments. Research into mobile robots began in many countries in the 1980s.
Lidar plays an important role in robot mapping and pose estimation in unknown environments. However, high-scanning-frequency and multi-line lidars are expensive, which greatly limits the development and application of indoor service mobile robots. Single-line, low-scanning-frequency lidars are inexpensive, technologically mature, and widely used, but when mounted on a two-wheeled mobile robot their scans become distorted. The distortion has three causes: because of the finite laser scanning frequency, the points of a frame are not acquired instantaneously; the laser frame rate is low, so the robot's displacement during a scan cannot be neglected; and the two-wheeled robot shakes while the laser points are being measured.
The odometer has a very high update frequency (serial communication can reach 200 Hz) and accurately reflects the robot's motion over a period of time, while the IMU accurately measures the robot's X-axis acceleration and therefore captures the fore-and-aft shake of a two-wheeled mobile robot. However, when the odometer alone is used to remove distortion from the lidar data, bumps corrupt its readings and the distortion-removal accuracy is low.
Disclosure of Invention
The application aims to provide a laser radar distortion removal method applied to a two-wheeled mobile robot, which can accurately reflect the motion condition of the laser radar and can realize distortion removal and state estimation decoupling of the laser radar.
In order to achieve the purpose, the technical scheme adopted by the application is as follows:
A lidar distortion removal method applied to a two-wheeled mobile robot, wherein the two-wheeled mobile robot is provided with a lidar, an odometer, and an IMU, the method comprising the following steps:
acquiring the original pose data of every laser point in one lidar frame, wherein the time of the first laser point in the frame is t_s and the time of the last laser point is t_e;
acquiring the first attitude data of the odometer over a first time period, wherein the first time period starts at t_a and ends at t_b, with t_a < t_s < t_e < t_b;
acquiring the acceleration data and the Z-axis angular velocity data of the IMU over a second time period, wherein the second time period starts at t_m and ends at t_n, with t_m < t_a < t_b < t_n;
Carrying out primary distortion removal on the original pose data by using the angular velocity data of the Z axis of the IMU;
compensating the first attitude data of the odometer by using the acceleration data of the IMU to obtain compensation data;
and performing secondary distortion removal on the data subjected to the primary distortion removal by using the compensation data to obtain final distortion removal data of the laser radar.
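As an illustrative aid (not part of the original disclosure), the nesting of the three acquisition windows required by the steps above can be checked with a few comparisons; the function name and arguments are ours:

```python
def windows_valid(ts, te, ta, tb, tm, tn):
    """Check the window nesting the method requires:
    the odometry window [ta, tb] must cover the laser frame [ts, te],
    and the IMU window [tm, tn] must cover the odometry window,
    i.e. tm < ta < ts < te < tb < tn."""
    return tm < ta < ts < te < tb < tn
```

If the comparison fails, the buffered sensor data cannot bracket the laser frame and the two-stage correction has no valid data to interpolate from.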
Several preferred options are provided below. They are not additional limitations on the general solution above, but further additions or preferences; each option may be combined with the general solution, or with other options, individually, provided no technical or logical contradiction arises.
Preferably, the original pose data are stored in a queue Lidarlist1 in chronological order: the pose datum acquired at time t_s is stored at the head of Lidarlist1 and recorded as (x_s, y_s, σ_s); the pose datum acquired at time t_e is stored at the tail of Lidarlist1 and recorded as (x_e, y_e, σ_e). The time interval between adjacent laser points within one frame is Δt_r, and the length of Lidarlist1 is l_rl.
Preferably, the first attitude data of the odometer acquired over the first time period are stored in a queue Odomlist1 in chronological order: the first attitude datum collected at time t_a is stored at the head of Odomlist1 and recorded as (x_a, y_a, θ_a); the first attitude datum collected at time t_b is stored at the tail of Odomlist1 and recorded as (x_b, y_b, θ_b). The time interval between two adjacent first attitude data is Δt_o, and the length of Odomlist1 is l_ol.
Preferably, the acceleration data of the IMU acquired over the second time period are stored in an array Accarray1 in chronological order: the first element of Accarray1 stores the acceleration collected at time t_m, the last element stores the acceleration collected at time t_n, the time interval between two adjacent acceleration data is Δt_u, and the length of Accarray1 is l_al.
The Z-axis angular velocity data of the IMU acquired over the second time period are stored in an array Warray1 in chronological order: the first element of Warray1 stores the angular velocity collected at time t_m, the last element stores the angular velocity collected at time t_n, the time interval between two adjacent angular velocity data is Δt_u, the length of Warray1 is l_al, and the relation l_ol > l_al > l_rl holds.
Preferably, performing the first distortion removal on the original pose data using the Z-axis angular velocity data of the IMU comprises:
taking the array Warray1 and multiplying each angular velocity datum in Warray1 by the time interval Δt_u to obtain the angle increments ω_k·Δt_u, which are stored in an array Warray2, where ω_k denotes the (k+1)-th angular velocity datum in Warray1 and 0 ≤ k ≤ (l_al − 1);
updating the original pose data in the queue Lidarlist1 with the array Warray2: the azimuth angle σ_k in the (k+1)-th original pose datum Lidarlist1[k] is updated with the angle increments of Warray2 accumulated up to the sampling time of that laser point;
after σ_k is updated, each datum in the queue Lidarlist1, taken as a column vector consisting of the x-axis coordinate, the y-axis coordinate, and the azimuth angle, is multiplied by the matrix R, and the result is recorded as a queue Lidarlist2, completing the first distortion removal; R is the known coordinate transformation matrix between the mounting position of the lidar and the mounting position of the IMU.
Preferably, compensating the first attitude data of the odometer with the acceleration data of the IMU to obtain the compensation data comprises:
taking the array Accarray1, processing its data by linear interpolation, and storing the processed data in an array Accarray2 of length l_ol;
superposing the first attitude data in the queue Odomlist1 with the corresponding elements of the array Accarray2 in sequence:
(x_i, y_i, θ_i) → (x_i + (1/2)·a_i·Δt_o²·cos θ_i, y_i + (1/2)·a_i·Δt_o²·sin θ_i, θ_i)
where (x_i, y_i, θ_i) is the (i+1)-th first attitude datum in the queue Odomlist1, a_i is the (i+1)-th acceleration datum in the array Accarray2, 0 ≤ i ≤ (l_ol − 1), and Δt_o is the time interval between two adjacent first attitude data;
storing the compensation data obtained after the superposition in a queue Odomlist2, in which the time interval between two adjacent compensation data is Δt_o and whose length is l_ol; the head of Odomlist2 stores the compensation datum at time t_a and the tail stores the compensation datum at time t_b.
Preferably, performing the second distortion removal on the once-corrected data using the compensation data to obtain the final distortion-removed lidar data comprises:
letting t_g < t_s < t_p, where t_g and t_p are the sampling times in the queue Odomlist2 that bracket t_s (no compensation datum corresponds to t_s itself), and forcing the compensation data at times t_g and t_p equal to the data after the first distortion removal, i.e. Odomlist2(t_g) = Lidarlist2(t_g) and Odomlist2(t_p) = Lidarlist2(t_p), where Lidarlist2(t_g) and Lidarlist2(t_p) are the data corresponding to times t_g and t_p in the queue Lidarlist2 obtained after the first distortion removal, and Odomlist2(t_g) and Odomlist2(t_p) are the compensation data corresponding to times t_g and t_p in the queue Odomlist2;
calculating the interpolation point P(t_s) at time t_s according to the formula
P(t_s) = Lidarlist2(t_g) + (Lidarlist2(t_p) − Lidarlist2(t_g)) · (t_s − t_g) / (t_p − t_g);
calculating the interpolation point P(t_e) at time t_e by the same method, and calculating three further interpolation points at times between t_s and t_e;
interpolating between each pair of adjacent points among these 5 interpolation points using linear interpolation, and storing all the data generated by the interpolation in a queue Lidarlist3 as the final distortion-removed lidar data, completing the second distortion removal.
According to the lidar distortion removal method applied to the two-wheeled mobile robot, the first distortion removal is performed on the original pose data using the Z-axis angular velocity data of the IMU. Because the IMU is sensitive to the heading angle, using it for the first distortion removal yields a preliminary correction of the data and enables a better correction result later. The odometer is then compensated with the IMU's acceleration data, reducing the odometry distortion caused by position drift and bumping as the robot shakes back and forth. Finally, the once-corrected data are corrected a second time with the more accurate compensated data, so the lidar distortion removal is completed effectively and the motion of the lidar is reflected accurately.
Drawings
Fig. 1 is a flowchart of a lidar distortion removal method applied to a two-wheeled mobile robot according to the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
In one embodiment, a lidar distortion removal method applied to a two-wheeled mobile robot is provided, which performs distortion-removal correction on the data collected by the lidar so as to facilitate mapping and localization by the two-wheeled mobile robot (hereinafter, the robot). The robot is equipped with at least one lidar, at least one odometer, and at least one IMU. This embodiment focuses on removing the distortion of the lidar data and does not strictly limit the number of lidars, odometers, or IMUs.
In this embodiment there is one lidar, one odometer, and one IMU. If several of each are present, the lidar data can be de-distorted either by treating one lidar, one odometer, and one IMU as a set of hardware, or by using the mean of the data from several odometers or several IMUs.
As shown in fig. 1, the laser radar distortion removing method applied to the two-wheeled mobile robot in the embodiment includes the following steps:
and step S1, data acquisition.
(1) Acquire the original pose data of every laser point in one lidar frame, where the time of the first laser point in the frame is t_s and the time of the last laser point is t_e.
In this embodiment, the acquired pose data are stored in a queue Lidarlist1 in chronological order: the pose datum acquired at time t_s is stored at the head of Lidarlist1 and recorded as (x_s, y_s, σ_s); the pose datum acquired at time t_e is stored at the tail and recorded as (x_e, y_e, σ_e). The time interval between adjacent laser points within one frame is Δt_r, so the length of Lidarlist1 is l_rl. Inside the lidar, the laser transmitter rotates at constant speed and emits once per small angular step; one complete frame of data is produced after the transmitter has rotated through a certain angle.
It is easy to understand that, because a pose datum includes an x-axis coordinate, a y-axis coordinate, and an azimuth angle, this embodiment stores the pose data as a queue; however, a queue is not the only possible form of storage. For example, the pose data could instead be split into 3 independent arrays.
(2) Acquire the first attitude data of the odometer over a first time period, where the first time period starts at t_a and ends at t_b with t_a < t_s < t_e < t_b; that is, the time range over which odometer data are acquired covers the duration of one laser frame, which guarantees the validity of the subsequent data processing.
The acquired first attitude data can be stored in various forms: a queue, an array, an index pointer, and so on. To reduce storage pressure while keeping the data read/write speed high, this embodiment adopts the following preferred form:
The first attitude data of the odometer acquired over the first time period are stored in a queue Odomlist1 in chronological order: the first attitude datum collected at time t_a is stored at the head of Odomlist1 and recorded as (x_a, y_a, θ_a); the first attitude datum collected at time t_b is stored at the tail and recorded as (x_b, y_b, θ_b). The time interval between two adjacent first attitude data is Δt_o, so the length of Odomlist1 is l_ol.
(3) Acquire the acceleration data of the IMU (inertial measurement unit, a device that measures an object's three-axis attitude angles or angular velocities and accelerations) and the Z-axis angular velocity data of the IMU over a second time period, where the second time period starts at t_m and ends at t_n with t_m < t_a < t_b < t_n; that is, the time range over which IMU data are acquired covers the time range over which the odometer data are acquired.
Because the IMU acquires acceleration and angular velocity simultaneously, this embodiment collects both at the same instants and then splits the acquired data for storage.
This embodiment provides a preferred way of storing the acceleration data: the acceleration data of the IMU acquired over the second time period are stored in an array Accarray1 in chronological order; the first element of Accarray1 stores the acceleration collected at time t_m, the last element stores the acceleration collected at time t_n, the time interval between two adjacent acceleration data is Δt_u, and the length of Accarray1 is l_al.
For storing the angular velocity data, this embodiment provides a preferred way: the Z-axis angular velocity data of the IMU acquired over the second time period are stored in an array Warray1 in chronological order; the first element of Warray1 stores the angular velocity collected at time t_m, the last element stores the angular velocity collected at time t_n, the time interval between two adjacent angular velocity data is Δt_u, and the length of Warray1 is l_al.
Similarly, the storage form of the acceleration data and the angular velocity data is not limited to the above; it is merely one preferred storage method.
Considering hardware cost and data volume, the output frequency of the odometry data is higher than that of the IMU data, and the output frequency of the IMU data is higher than that of the lidar data, so the lengths of the data acquired for the first distortion removal satisfy l_ol > l_al > l_rl.
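To make the length relation concrete, here is a toy calculation; the 200 Hz odometry rate comes from the background section, while the IMU rate, the laser-point interval, and the window widths are illustrative assumptions only:

```python
# Toy numbers chosen to illustrate the relation l_ol > l_al > l_rl.
dt_o = 1 / 200           # odometry sample interval (200 Hz serial link)
dt_u = 1 / 100           # IMU sample interval (assumed 100 Hz)
dt_r = 0.01              # interval between laser points (toy value)

frame = 0.10             # one laser frame [t_s, t_e] at a 10 Hz scan rate
window_odom = 0.12       # [t_a, t_b], slightly wider than the laser frame
window_imu = 0.14        # [t_m, t_n], slightly wider than the odometry window

l_rl = round(frame / dt_r)          # laser points per frame
l_al = round(window_imu / dt_u)     # IMU samples in [t_m, t_n]
l_ol = round(window_odom / dt_o)    # odometry samples in [t_a, t_b]
assert l_ol > l_al > l_rl           # 24 > 14 > 10
```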
Step S2: perform the first distortion removal on the original pose data using the Z-axis angular velocity data of the IMU. Because the IMU is sensitive to the heading angle, using it for the first distortion removal yields a preliminary correction of the data and enables a better correction result later.
A common distortion removal could be performed by linear interpolation alone, but to guarantee the strength of the first distortion removal and avoid over-correction or an insignificant effect, a preferred method is provided as follows:
Take the array Warray1 and multiply each angular velocity datum in Warray1 by the time interval Δt_u to obtain the angle increments ω_k·Δt_u, which are stored in an array Warray2; ω_k denotes the (k+1)-th angular velocity datum in Warray1, 0 ≤ k ≤ (l_al − 1).
Update the original pose data in the queue Lidarlist1 with the array Warray2: the azimuth angle σ_k in the (k+1)-th original pose datum Lidarlist1[k] is updated with the angle increments of Warray2 accumulated up to the sampling time of that laser point.
After σ_k is updated, each datum in the queue Lidarlist1, taken as a column vector consisting of the x-axis coordinate, the y-axis coordinate, and the azimuth angle, is multiplied by the matrix R, and the result is recorded as a queue Lidarlist2, completing the first distortion removal; R is the known 3 × 3 coordinate transformation matrix between the mounting position of the lidar and the mounting position of the IMU.
In the first distortion removal, converting the queue Lidarlist1 (after the azimuth angles σ_k have been updated) with the R matrix further improves the effectiveness of the distortion removal and avoids a poor result caused by the mounting positions of the components.
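A minimal sketch of the first distortion removal, under our assumptions (the original formula images are not reproduced here): each azimuth is corrected by the gyro angle increments accumulated up to that laser point's sampling time, with a uniform-spacing mapping between laser points and gyro samples. All names are ours:

```python
import numpy as np

def first_deskew(lidar_pts, gyro_z, dt_u, R):
    """lidar_pts: (N, 3) rows of (x, y, azimuth), i.e. Lidarlist1.
    gyro_z: Z-axis angular velocities (Warray1) sampled every dt_u seconds.
    R: known 3x3 lidar-to-IMU mounting transform.
    Returns the once-corrected (N, 3) array, i.e. Lidarlist2."""
    incr = np.asarray(gyro_z, dtype=float) * dt_u   # Warray2: angle increments
    cum = np.cumsum(incr)                           # rotation accumulated since t_s
    pts = np.array(lidar_pts, dtype=float)
    n = len(pts)
    # Map laser point k to a gyro sample index (uniform spacing assumed).
    idx = np.minimum((np.arange(n) * len(incr)) // max(n, 1), len(incr) - 1)
    pts[:, 2] += cum[idx]                           # update azimuth sigma_k
    return pts @ R.T                                # apply R to each column vector
```

With a zero gyro signal and R equal to the identity, the data pass through unchanged, which matches the intent of the step.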
Step S3: compensate the first attitude data of the odometer with the acceleration data of the IMU to obtain the compensation data. The odometer suffers position drift and bumping caused by the robot's fore-and-aft shaking, while the IMU can measure at any time the acceleration of the lidar as the body sways; this embodiment therefore adds the IMU and uses its acceleration data to compensate the odometer, reducing the distortion caused by drift and bumping.
The data could be compensated proportionally, by simple superposition, and so on. Combining the characteristics of the IMU and the odometer, this embodiment provides the following preferred compensation method:
Take the array Accarray1, process its data by linear interpolation, and store the processed data in an array Accarray2, whose length is l_ol.
Superpose the first attitude data in the queue Odomlist1 with the corresponding elements of the array Accarray2 in sequence:
(x_i, y_i, θ_i) → (x_i + (1/2)·a_i·Δt_o²·cos θ_i, y_i + (1/2)·a_i·Δt_o²·sin θ_i, θ_i)
where (x_i, y_i, θ_i) is the (i+1)-th first attitude datum in the queue Odomlist1, a_i is the (i+1)-th acceleration datum in the array Accarray2, 0 ≤ i ≤ (l_ol − 1), and Δt_o is the time interval between two adjacent first attitude data.
Store the compensation data obtained after the superposition in a queue Odomlist2: the time interval between two adjacent compensation data is Δt_o, the length of Odomlist2 is l_ol, the head of Odomlist2 stores the compensation datum at time t_a, and the tail stores the compensation datum at time t_b.
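A sketch of the compensation step under our assumption (the original superposition formula is an image and is not reproduced): each odometry pose is shifted by the displacement (1/2)·a·Δt_o² of the measured fore-and-aft acceleration, projected along the current heading. All names are ours:

```python
import numpy as np

def compensate_odometry(odom, acc, dt_o):
    """odom: (l_ol, 3) rows of (x, y, theta), i.e. Odomlist1.
    acc: X-axis accelerations already interpolated to length l_ol (Accarray2).
    Returns the compensated queue, i.e. Odomlist2 (assumed formula)."""
    odom = np.array(odom, dtype=float)
    a = np.asarray(acc, dtype=float)
    d = 0.5 * a * dt_o ** 2            # displacement contributed by the shake
    odom[:, 0] += d * np.cos(odom[:, 2])
    odom[:, 1] += d * np.sin(odom[:, 2])
    return odom
```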
Step S4: perform the second distortion removal on the once-corrected data using the compensation data to obtain the final distortion-removed lidar data.
The lidar data after the first distortion removal are only partially corrected; correcting them a second time with the more accurate compensated data effectively completes the distortion removal and accurately reflects the motion of the lidar.
In this embodiment, the second distortion removal is implemented by linear interpolation, and the specific process is as follows:
Let t_g < t_s < t_p, where t_g and t_p are the sampling times in the queue Odomlist2 that bracket t_s (no compensation datum corresponds to t_s itself), and force the compensation data at times t_g and t_p equal to the data after the first distortion removal, i.e. Odomlist2(t_g) = Lidarlist2(t_g) and Odomlist2(t_p) = Lidarlist2(t_p), where Lidarlist2(t_g) and Lidarlist2(t_p) denote the data corresponding to times t_g and t_p in the queue Lidarlist2 obtained after the first distortion removal, and Odomlist2(t_g) and Odomlist2(t_p) denote the compensation data corresponding to times t_g and t_p in the queue Odomlist2. In this data lookup, the entries of the queue Odomlist2 at times t_g and t_p are thus forced equal to the corresponding entries of the queue Lidarlist2.
Calculate the interpolation point P(t_s) at time t_s according to the formula
P(t_s) = Lidarlist2(t_g) + (Lidarlist2(t_p) − Lidarlist2(t_g)) · (t_s − t_g) / (t_p − t_g).
Calculate the interpolation point P(t_e) at time t_e by the same method, and calculate three further interpolation points at times between t_s and t_e. Taking the computation of P(t_s) as the example, the other interpolation points are computed by the same process and are not described again in this embodiment.
Interpolate between each pair of adjacent points among the 5 interpolation points using linear interpolation, and store all the data generated by the interpolation in a queue Lidarlist3 as the final distortion-removed lidar data, completing the second distortion removal.
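The interpolation-point computation can be sketched as standard linear interpolation between the bracketing samples (the exact formula image is not reproduced in the text, so the conventional form is assumed; names are ours):

```python
import numpy as np

def interp_point(t, tg, pg, tp, pp):
    """Interpolation point at time t (e.g. t_s) from the bracketing data at
    tg < t < tp, after the Odomlist2 entries at tg and tp have been forced
    equal to the first-deskew (Lidarlist2) data pg and pp."""
    w = (t - tg) / (tp - tg)
    return (1.0 - w) * np.asarray(pg, dtype=float) + w * np.asarray(pp, dtype=float)
```

Calling this at t_s, at t_e, and three times in between yields the 5 interpolation points, which are then densified pairwise by the same linear rule to fill Lidarlist3.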
Interpolation points at 5 moments are calculated during the second distortion removal because the lidar used in this embodiment is a Slamtec A2 whose scanning frequency is 10 Hz. The linear interpolation raises the effective data rate of the lidar considerably (interpolating 5 points per frame corresponds to an equivalent rate of about 100 Hz), so the residual error can be assumed to be 0.
The linear interpolation used here is the standard method: a mean value is inserted between two adjacent points, with both the times and the data averaged. Details of linear interpolation not covered in this embodiment can be implemented by reference to the prior art, and this embodiment places no further limitation on them.
It is easy to understand that performing the second distortion removal by linear interpolation increases the smoothness of the de-distorted data, avoids excessively large jumps, and makes the data more reasonable; in other embodiments, quadratic or trigonometric function fitting could be adopted instead, achieving the second distortion removal through least squares and the like.
The distortion-removed data finally obtained are laser-point pose data of higher accuracy; the pose datum of each laser point is converted into angle and distance data, which are then packaged and published as a laser scan for use in robot mapping, localization, and the like.
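The final conversion of each corrected laser point back to the angle-and-distance form used when republishing the scan is ordinary Cartesian-to-polar conversion; this small helper (names ours) illustrates it:

```python
import math

def to_range_bearing(x, y):
    """Convert a de-distorted laser point's (x, y) coordinates into the
    (distance, angle) pair packaged into the republished laser scan."""
    return math.hypot(x, y), math.atan2(y, x)
```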
It should be noted that, for convenience of description, the present embodiment adopts two methods, that is, describing data with reference to time and describing data with reference to number, but since the collected data are stored in time sequence, the data at each time and the number of data are corresponding to each other. E.g. taThe first attitude data collected at the moment is stored at the head of the queue Odomlist1 and recorded as
Figure BDA0002594155980000091
which, as the datum stored at the head of the queue, can also be understood as the 1st datum of the queue Odomlist1, i.e.
Figure BDA0002594155980000092
Figure BDA0002594155980000093
The other notations are understood similarly.
It should be understood that, although the steps in the flowchart of fig. 1 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least a portion of the steps in fig. 1 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times; these sub-steps or stages are not necessarily performed sequentially, and may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
The technical features of the embodiments described above may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described, but any such combination should be considered within the scope of this specification as long as it contains no contradiction.
The above-mentioned embodiments express only several implementations of the present application; their description is specific and detailed but is not to be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (7)

1. A lidar distortion removal method applied to a two-wheeled mobile robot, the two-wheeled mobile robot being provided with a lidar, an odometer, and an IMU (inertial measurement unit), characterized by comprising the following steps:
acquiring original pose data of each laser point in one frame of the lidar, wherein the time of the initial laser point in the frame is t_s and the time of the last laser point is t_e;
acquiring first attitude data of the odometer in a first time period, wherein the start time of the first time period is t_a, the end time is t_b, and t_a < t_s < t_e < t_b;
acquiring acceleration data of the IMU and angular velocity data of the Z axis of the IMU in a second time period, wherein the start time of the second time period is t_m, the end time is t_n, and t_m < t_a < t_b < t_n;
performing first distortion removal on the original pose data using the angular velocity data of the Z axis of the IMU;
compensating the first attitude data of the odometer using the acceleration data of the IMU to obtain compensation data; and
performing second distortion removal on the data after the first distortion removal using the compensation data, to obtain final distortion removal data of the lidar.
2. The lidar distortion removal method applied to a two-wheeled mobile robot of claim 1, wherein the original pose data are stored in a queue Lidarlist1 in time order; the original pose data acquired at time t_s are stored at the head of the queue Lidarlist1 and recorded as
Figure FDA0002594155970000011
the original pose data acquired at time t_e are stored at the tail of the queue Lidarlist1 and recorded as
Figure FDA0002594155970000012
and the time interval between adjacent laser points in one frame of the lidar is Δt_r; the length of the queue Lidarlist1 is l_rl.
3. The lidar distortion removal method applied to a two-wheeled mobile robot of claim 2, wherein the first attitude data of the odometer acquired in the first time period are stored in a queue Odomlist1 in time order; the first attitude datum collected at time t_a is stored at the head of the queue Odomlist1 and recorded as
Figure FDA0002594155970000013
the first attitude datum collected at time t_b is stored at the tail of the queue Odomlist1 and recorded as
Figure FDA0002594155970000014
and the time interval between two adjacent first attitude data is Δt_o; the length of the queue Odomlist1 is l_ol.
4. The lidar distortion removal method applied to a two-wheeled mobile robot of claim 3, wherein the acceleration data of the IMU acquired in the second time period are stored in an array Accarray1 in time order; the first element of the array Accarray1 stores the acceleration datum collected at time t_m, the last element stores the acceleration datum collected at time t_n, the time interval between two adjacent acceleration data is Δt_u, and the length of the array Accarray1 is l_al;
the angular velocity data of the Z axis of the IMU acquired in the second time period are stored in an array Warray1 in time order; the first element of the array Warray1 stores the angular velocity datum collected at time t_m, the last element stores the angular velocity datum collected at time t_n, the time interval between two adjacent angular velocity data is Δt_u, the length of the array Warray1 is l_al, and the relationship l_ol > l_al > l_rl holds.
5. The lidar distortion removal method applied to a two-wheeled mobile robot of claim 4, wherein performing the first distortion removal on the original pose data using the angular velocity data of the Z axis of the IMU comprises:
taking the array Warray1 and multiplying each angular velocity datum in the array Warray1 by the time interval Δt_u to obtain
Figure FDA0002594155970000021
which are stored in the array Warray2, wherein
Figure FDA0002594155970000022
represents the (k+1)-th angular velocity datum in the array Warray1, for 0 ≤ k ≤ (l_al − 1);
updating the original pose data in the queue Lidarlist1 with the array Warray2: the azimuth angle σ_k in the (k+1)-th original pose datum Lidarlist1[k] of the queue Lidarlist1 is updated to
Figure FDA0002594155970000023
and, after σ_k is updated, multiplying each datum in the queue Lidarlist1, in the form of a column vector consisting of the x-axis coordinate, the y-axis coordinate, and the azimuth angle, by an R matrix, and recording the result as a queue Lidarlist2, thereby completing the first distortion removal, wherein the R matrix is the known coordinate transformation matrix between the installation position of the lidar and the installation position of the IMU.
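The first distortion removal of claim 5 can be sketched as follows. The exact azimuth-update formula is given only as an image in the source, so the cumulative-sum correction below, along with all function and variable names, is an assumption for illustration only:

```python
import numpy as np

def first_distortion_removal(lidarlist1, warray1, dt_u, R):
    """Sketch of claim 5: each Z-axis angular velocity is multiplied by
    dt_u to give a per-step yaw increment (Warray2); the azimuth of each
    pose is corrected by the accumulated increment (assumed form); each
    corrected pose [x, y, azimuth]^T is then left-multiplied by the
    3x3 lidar-to-IMU transformation matrix R."""
    warray2 = np.asarray(warray1) * dt_u   # per-step yaw increments
    yaw_corr = np.cumsum(warray2)          # accumulated yaw correction (assumption)
    lidarlist2 = []
    for k, (x, y, sigma) in enumerate(lidarlist1):
        sigma_k = sigma + yaw_corr[min(k, len(yaw_corr) - 1)]
        lidarlist2.append(R @ np.array([x, y, sigma_k]))
    return lidarlist2
```

With R set to the identity, the sketch reduces to a pure azimuth correction, which makes the role of the R matrix easy to isolate when testing.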
6. The lidar distortion removal method applied to a two-wheeled mobile robot of claim 5, wherein compensating the first attitude data of the odometer using the acceleration data of the IMU to obtain the compensation data comprises:
taking the array Accarray1, processing the data in the array Accarray1 by linear interpolation, and storing the processed data in an array Accarray2, the length of the array Accarray2 being l_ol;
superposing the first attitude data in the queue Odomlist1 with the corresponding elements in the array Accarray2 in order:
Figure FDA0002594155970000024
wherein (x_i, y_i, θ_i) is the (i+1)-th first attitude datum in the queue Odomlist1, a_i is the (i+1)-th acceleration datum in the array Accarray2, 0 ≤ i ≤ (l_ol − 1), and Δt_o is the time interval between two adjacent first attitude data; and
storing the compensation data obtained after superposition in a queue Odomlist2, wherein the time interval between two adjacent compensation data is Δt_o, the length of the queue Odomlist2 is l_ol, the head of the queue Odomlist2 stores the compensation datum at time t_a, and the tail of the queue Odomlist2 stores the compensation datum at time t_b.
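The superposition of claim 6 can be sketched as below. The exact superposition formula is given only as an image in the source; here the standard kinematic correction 0.5 · a_i · Δt_o², applied along the heading θ_i, is assumed, and all names are illustrative:

```python
import math

def compensate_odometry(odomlist1, accarray2, dt_o):
    """Sketch of claim 6: add to each odometer pose (x, y, theta) the
    displacement implied by the interpolated IMU acceleration over one
    sampling interval dt_o (assumed form of the superposition)."""
    odomlist2 = []
    for (x, y, theta), a in zip(odomlist1, accarray2):
        d = 0.5 * a * dt_o ** 2            # displacement from acceleration
        odomlist2.append((x + d * math.cos(theta),   # project along heading
                          y + d * math.sin(theta),
                          theta))
    return odomlist2
```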
7. The lidar distortion removal method applied to a two-wheeled mobile robot of claim 6, wherein performing the second distortion removal on the data after the first distortion removal using the compensation data to obtain the final distortion removal data of the lidar comprises:
letting t_g < t_s < t_p, wherein time t_s has no corresponding compensation datum, and setting the compensation data corresponding to times t_g and t_p equal to the corresponding data after the first distortion removal, i.e.
Figure FDA0002594155970000031
Figure FDA0002594155970000032
is the datum corresponding to time t_g in the queue Lidarlist2 obtained after the first distortion removal,
Figure FDA0002594155970000033
is the datum corresponding to time t_p in the queue Lidarlist2 obtained after the first distortion removal,
Figure FDA0002594155970000034
is the compensation datum corresponding to time t_g in the queue Odomlist2, and
Figure FDA0002594155970000035
is the compensation datum corresponding to time t_p in the queue Odomlist2;
calculating the interpolation point at time t_s
Figure FDA0002594155970000036
according to the following formula:
Figure FDA0002594155970000037
calculating the interpolation point at time t_e in the same manner
Figure FDA0002594155970000038
and calculating the other 3 interpolation points at moments between t_s and t_e;
interpolating, using linear interpolation, between each two adjacent ones of the 5 interpolation points
Figure FDA0002594155970000039
and storing all the data generated after interpolation in a queue Lidarlist3 as the final distortion removal data of the lidar, completing the second distortion removal.
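The interpolation-point computation of claim 7 reduces to time-weighted linear interpolation between the two bracketing samples at t_g and t_p. A sketch under that assumption follows (the exact formula is given only as an image in the source, and the function name is hypothetical):

```python
def interpolate_at(t, t_g, v_g, t_p, v_p):
    """Linearly interpolate at time t between bracketing samples
    (t_g, v_g) and (t_p, v_p), with t_g < t < t_p, applied
    component-wise to each pose tuple."""
    w = (t - t_g) / (t_p - t_g)           # time weight in [0, 1]
    return tuple(g + w * (p - g) for g, p in zip(v_g, v_p))
```

Applying this at t_s and t_e, and at the 3 moments between them, yields the 5 interpolation points described above.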
CN202010704463.1A 2020-07-21 2020-07-21 Laser radar distortion removal method applied to two-wheeled mobile robot Active CN112083433B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010704463.1A CN112083433B (en) 2020-07-21 2020-07-21 Laser radar distortion removal method applied to two-wheeled mobile robot


Publications (2)

Publication Number Publication Date
CN112083433A true CN112083433A (en) 2020-12-15
CN112083433B CN112083433B (en) 2023-06-13

Family

ID=73735353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010704463.1A Active CN112083433B (en) 2020-07-21 2020-07-21 Laser radar distortion removal method applied to two-wheeled mobile robot

Country Status (1)

Country Link
CN (1) CN112083433B (en)


Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180074176A1 (en) * 2016-09-14 2018-03-15 Beijing Baidu Netcom Science And Technology Co., Ltd. Motion compensation method and apparatus applicable to laser point cloud data
US10043076B1 (en) * 2016-08-29 2018-08-07 PerceptIn, Inc. Visual-inertial positional awareness for autonomous and non-autonomous tracking
CN109709801A (en) * 2018-12-11 2019-05-03 智灵飞(北京)科技有限公司 A kind of indoor unmanned plane positioning system and method based on laser radar
CN109975792A (en) * 2019-04-24 2019-07-05 福州大学 Method based on Multi-sensor Fusion correction multi-line laser radar point cloud motion distortion
CN110243358A (en) * 2019-04-29 2019-09-17 武汉理工大学 The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion
CN110879400A (en) * 2019-11-27 2020-03-13 炬星科技(深圳)有限公司 Method, equipment and storage medium for fusion positioning of laser radar and IMU
CN110888120A (en) * 2019-12-03 2020-03-17 华南农业大学 Method for correcting laser radar point cloud data motion distortion based on integrated navigation system
CN111045017A (en) * 2019-12-20 2020-04-21 成都理工大学 Method for constructing transformer substation map of inspection robot by fusing laser and vision
CN111199578A (en) * 2019-12-31 2020-05-26 南京航空航天大学 Unmanned aerial vehicle three-dimensional environment modeling method based on vision-assisted laser radar
CN111207774A (en) * 2020-01-17 2020-05-29 山东大学 Method and system for laser-IMU external reference calibration
CN111398984A (en) * 2020-03-22 2020-07-10 华南理工大学 Self-adaptive laser radar point cloud correction and positioning method based on sweeping robot


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113311411A (en) * 2021-04-19 2021-08-27 杭州视熵科技有限公司 Laser radar point cloud motion distortion correction method for mobile robot
CN113219973A (en) * 2021-05-08 2021-08-06 浙江工业大学 Efficient local path control method for mobile robot
CN113219973B (en) * 2021-05-08 2022-06-24 浙江工业大学 Local path control method of mobile robot

Also Published As

Publication number Publication date
CN112083433B (en) 2023-06-13

Similar Documents

Publication Publication Date Title
CN109975792B (en) Method for correcting point cloud motion distortion of multi-line laser radar based on multi-sensor fusion
CN113311411B (en) Laser radar point cloud motion distortion correction method for mobile robot
CN110243358B (en) Multi-source fusion unmanned vehicle indoor and outdoor positioning method and system
CN112083726A (en) Park-oriented automatic driving double-filter fusion positioning system
CN112697138B (en) Bionic polarization synchronous positioning and composition method based on factor graph optimization
CN112083433B (en) Laser radar distortion removal method applied to two-wheeled mobile robot
CN110530361B (en) Steering angle estimator based on agricultural machinery double-antenna GNSS automatic navigation system
CN111238469B (en) Unmanned aerial vehicle formation relative navigation method based on inertia/data chain
CN108387236B (en) Polarized light SLAM method based on extended Kalman filtering
CN112284384A (en) Cooperative positioning method of clustered multi-deep-sea submersible vehicle considering measurement abnormity
CN112326990A (en) Method and system for measuring speed of working vehicle
CN109827580A (en) A kind of automobile attitude data collection system
CN113639722B (en) Continuous laser scanning registration auxiliary inertial positioning and attitude determination method
CN116448145A (en) Navigation attitude determination method based on polarization vector space difference
CN112284381B (en) Visual inertia real-time initialization alignment method and system
CN113008229B (en) Distributed autonomous integrated navigation method based on low-cost vehicle-mounted sensor
JP2007538231A (en) Interferometric sensing system
JP4846784B2 (en) Vehicle trajectory measuring device
CN116167919A (en) Laser point cloud data de-distortion method based on kernel ridge regression
CN114897942B (en) Point cloud map generation method and device and related storage medium
CN116338719A (en) Laser radar-inertia-vehicle fusion positioning method based on B spline function
CN114543793B (en) Multi-sensor fusion positioning method of vehicle navigation system
CN112857366B (en) Optical fiber strapdown inertial navigation system attitude calculation method based on compression structure
CN115097481A (en) Point cloud motion compensation method and device and electronic equipment
CN114440881A (en) Unmanned vehicle positioning method integrating multi-source sensor information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant