CN113960622A - Real-time positioning method and device fusing laser radar and IMU sensor information

Info

Publication number: CN113960622A
Application number: CN202111245119.1A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: time, pose, laser radar, IMU, point cloud
Inventors: 方浩 (Fang Hao), 班超 (Ban Chao), 窦丽华 (Dou Lihua)
Current and original assignee: Beijing Institute of Technology (BIT)
Filing date: 2021-10-26; publication date: 2022-01-21
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S7/4802: using analysis of echo signal for target characterisation; Target signature; Target cross-section

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Navigation (AREA)

Abstract

The invention provides a real-time positioning method and device fusing laser radar and IMU sensor information. Using unscented Kalman filtering as the framework, high-frequency inertial measurement unit (IMU) data are used to predict the motion of an unmanned platform, and the data are cached. After timestamped laser radar data are received, a point cloud matching module takes the predicted pose as prior information and performs matching against a known point cloud map based on the normal distributions transform (NDT), obtaining global positioning information that arrives with a delay. This positioning information updates the unscented Kalman filter at the moment of the laser radar timestamp and calibrates the prediction error of the inertial measurement unit; the cached high-frequency IMU data are then replayed from the updated estimate to re-predict the current state, achieving a low-delay, high-frequency positioning effect. The method realizes high-frequency, low-delay real-time positioning and yields a better control effect.

Description

Real-time positioning method and device fusing laser radar and IMU sensor information
Technical Field
The invention relates to the field of robot sensing and navigation, and in particular to a real-time positioning method and device fusing laser radar and IMU sensor information.
Background
Positioning is a basic capability underlying a robot's autonomous navigation and planning, and is of great significance for mobile robot platforms. Robot positioning is classified as positioning in a known environment or positioning in an unknown environment; the former is also referred to as the relocalization problem, i.e., the robot determines its position in the environment given an environment map or measurement data. Common robot positioning technologies include satellite positioning, Wi-Fi positioning, and positioning based on vision and laser sensors. Unlike satellite or Wi-Fi positioning, positioning based on sensors such as laser radar and cameras can determine the robot's position in the environment from its own perception data, without external measurements, and thus can help the robot localize autonomously in environments without satellite positioning. At present, positioning methods based on laser sensors achieve good accuracy both indoors and outdoors and are widely applied in the mobile robot field. However, the problems of low positioning frequency, high computational overhead, and high delay, caused by the large data volume and the characteristics of the laser radar sensor, still need to be solved.
For the relocalization problem of a robot on a known map, the existing solutions are as follows. The literature (S. Pang, D. Kent, X. Cai, H. Al-Qassab, D. Morris and H. Radha, "3D Scan Registration Based Localization for Autonomous Vehicles - A Comparison of NDT and ICP under Realistic Conditions," IEEE Vehicular Technology Conference (VTC-Fall), 2018) describes two localization methods based on point cloud matching: localization based on the iterative closest point (ICP) algorithm and localization based on the normal distributions transform (NDT). Compared with ICP matching, which computes distance relations between corresponding points, NDT computes the normal distribution of the point cloud, places lower requirements on the initial value, and has better robustness and positioning accuracy. In the prior art, however, the NDT approach suffers from low positioning frequency and positioning delay and is suitable only for low-speed motion, so the NDT positioning result cannot be used directly in the control loop.
Disclosure of Invention
To solve the above technical problems, the invention provides a real-time positioning method and device fusing laser radar and IMU sensor information, addressing the prior-art problems that the NDT positioning frequency is low, that the positioning delay makes NDT suitable only for low-speed motion, and that the NDT positioning result cannot be used directly in the control loop.
According to a first aspect of the present invention, there is provided a real-time positioning method fusing laser radar and IMU sensor information, the method comprising the following steps:
step S100: in an offline stage, respectively modeling an observation model of the laser radar and a prediction model of the IMU (inertial measurement unit), wherein the observation model of the laser radar is used for determining the laser radar's observation of the robot pose state, and the prediction model of the IMU is used for determining the predicted pose of the robot from the IMU sensor measurement data;
step S101: in the real-time information acquisition stage, the laser radar and the IMU sensor each begin acquiring information from the initial moment; for the time series $Time = \{t_{k-m}, t_{k-m+1}, \ldots, t_k\}$, IMU measurement data are obtained once at each time point and laser radar observation data are obtained once every m time points, the laser radar observation data being point cloud data, where $t_k$ is the current time and $t_{k-m}$ is the time of the last optimal pose estimation, whose result is recorded as $\hat{pose}_{k-m}$; the laser radar obtains observation data at $t_k$, by which time the IMU sensor has obtained from the prediction model the predicted pose at each time point from $t_{k-m}$ to $t_k$, $\{pose2_{k-m}, pose2_{k-m+1}, \ldots, pose2_k\}$; the raw measurement data acquired by the IMU at each time point are stored in the memory of a computer;
step S102: obtaining the predicted robot pose $pose2_k$ of the IMU sensor corresponding to time $t_k$; using the predicted pose $pose2_k$ as the prior pose, obtaining the pose observation $pose1_k$ of the laser radar at time $t_k$ based on NDT matching; and, based on the predicted pose $pose2_k$ and the observed pose $pose1_k$ at time $t_k$, calculating the optimal estimate $\hat{pose}_k$ of the robot pose at that moment using unscented Kalman filtering;
Step S103: the NDT matching causes a time lag τ, τ ═ tk'-tk,tk'Time after NDT match, and tk'<tk+mBased on the prediction model and the tkOptimal estimation pose at a time
Figure BDA0003320685330000032
Predicting pose { pos 2 for IMU sensor robotk+1,…,pose2k'Predicting again to obtain an updated IMU sensor robot predicted pose set { pos 2'k+1,…,pose2'k'};
After the update is finished, if real-time IMU and laser radar measurement data are still being obtained, go to step S101; if no measurement data are input, the method ends.
According to a second aspect of the present invention, there is provided a real-time positioning apparatus fusing laser radar and IMU sensor information, the apparatus comprising:
a modeling module: configured to model, in an offline stage, an observation model of the laser radar and a prediction model of the IMU (inertial measurement unit) respectively, wherein the observation model of the laser radar is used for determining the laser radar's observation of the robot pose state, and the prediction model of the IMU is used for determining the predicted pose of the robot from the IMU sensor measurement data;
a prediction information acquisition module: configured so that, in the real-time information acquisition stage, the laser radar and the IMU sensor each begin acquiring information from the initial moment; for the time series $Time = \{t_{k-m}, t_{k-m+1}, \ldots, t_k\}$, IMU measurement data are obtained once at each time point and laser radar observation data are obtained once every m time points, the laser radar observation data being point cloud data, where $t_k$ is the current time and $t_{k-m}$ is the time of the last optimal pose estimation, whose result is recorded as $\hat{pose}_{k-m}$; the laser radar obtains observation data at $t_k$, by which time the IMU sensor has obtained from the prediction model the predicted pose at each time point from $t_{k-m}$ to $t_k$, $\{pose2_{k-m}, pose2_{k-m+1}, \ldots, pose2_k\}$; the raw measurement data acquired by the IMU at each time point are stored in the memory of a computer;
an optimal estimation determination module: configured to obtain the predicted robot pose $pose2_k$ of the IMU sensor corresponding to time $t_k$; using the predicted pose $pose2_k$ as the prior pose, obtain the pose observation $pose1_k$ of the laser radar at time $t_k$ based on NDT matching; and, based on the predicted pose $pose2_k$ and the observed pose $pose1_k$ at time $t_k$, calculate the optimal estimate $\hat{pose}_k$ of the robot pose at that moment using unscented Kalman filtering;
an update module: configured to handle the time lag $\tau = t_{k'} - t_k$ caused by the NDT matching, where $t_{k'}$ is the time at which the NDT match completes and $t_{k'} < t_{k+m}$; based on the prediction model and the optimal estimated pose $\hat{pose}_k$ at time $t_k$, the module re-predicts the predicted robot poses $\{pose2_{k+1}, \ldots, pose2_{k'}\}$ of the IMU sensor to obtain the updated predicted pose set $\{pose2'_{k+1}, \ldots, pose2'_{k'}\}$; after the update is finished, if real-time IMU and laser radar measurement data are still being obtained, the prediction information acquisition module is triggered; if no measurement data are input, the positioning ends.
According to a third aspect of the present invention, there is provided a real-time positioning system fusing laser radar and IMU sensor information, comprising:
a processor for executing a plurality of instructions;
a memory to store a plurality of instructions;
wherein the instructions are stored in the memory, and are loaded by the processor and executed to perform the above real-time positioning method fusing laser radar and IMU sensor information.
According to a fourth aspect of the present invention, there is provided a computer-readable storage medium having a plurality of instructions stored therein, the instructions being for loading by a processor and executing the above real-time positioning method fusing laser radar and IMU sensor information.
According to the above scheme, unscented Kalman filtering is used as the fusion framework: the high-frequency IMU performs pose state prediction, the accurate pose obtained by laser point cloud matching serves as the observation for the pose state update, and the predicted pose state serves as prior information for point cloud matching, improving matching accuracy. A data caching and re-prediction mechanism is introduced to suppress the delay caused by the point cloud matching computation, so the real-time performance and robustness of the robot pose estimation can be guaranteed. The invention provides a method of fusing laser radar and IMU based on unscented Kalman filtering, which realizes high-frequency and robust pose state estimation with the IMU performing the state prediction and the laser radar matching pose performing the state update. The IMU-predicted state serves as prior information for laser radar matching, reducing the nonlinearity of the optimization problem in point cloud matching and improving the accuracy of the matching algorithm and its robustness to fast rotation. The data caching and re-prediction mechanism caches the IMU data obtained since the last update, so that for observation data with time lag, the state over the period from the observation's acquisition time to the current time can be re-predicted, reducing the time lag of the pose estimation result.
The foregoing description is only an overview of the technical solutions of the present invention, and in order to make the technical solutions of the present invention more clearly understood and to implement them in accordance with the contents of the description, the following detailed description is given with reference to the preferred embodiments of the present invention and the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention. In the drawings:
FIG. 1 is a flow chart of a real-time method of fusing lidar and IMU sensor information in accordance with one embodiment of the present invention;
FIG. 2 is a schematic diagram of a delay compensation real-time positioning framework based on unscented Kalman filtering multi-sensor fusion according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating the effect of a method for fusing lidar information and IMU sensor information according to an embodiment of the present invention;
FIG. 4 is a flow chart illustrating an NDT matching method according to an embodiment of the present invention;
FIG. 5 is a logic diagram of a data buffering and delay compensation mechanism according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a real-time positioning device fusing information of a lidar and an IMU sensor according to an embodiment of the present invention.
Detailed Description
First, a real-time method flow for fusing information of a laser radar and an IMU sensor according to an embodiment of the present invention is described with reference to fig. 1 to 3, where the method includes the following steps:
step S100: in an offline stage, respectively modeling an observation model of the laser radar and a prediction model of the IMU (inertial measurement unit), wherein the observation model of the laser radar is used for determining, from the laser radar point cloud matching result, the laser radar's observation of the robot pose state; the prediction model of the IMU is used for determining the predicted pose of the robot from the IMU sensor measurement data;
step S101: in the real-time information acquisition stage, the laser radar and the IMU sensor each begin acquiring information from the initial moment; for the time series $Time = \{t_{k-m}, t_{k-m+1}, \ldots, t_k\}$, IMU measurement data are obtained once at each time point and laser radar observation data are obtained once every m time points, the laser radar observation data being point cloud data, where $t_k$ is the current time and $t_{k-m}$ is the time of the last optimal pose estimation, whose result is recorded as $\hat{pose}_{k-m}$; the laser radar obtains observation data at $t_k$, by which time the IMU sensor has obtained from the prediction model the predicted pose at each time point from $t_{k-m}$ to $t_k$, $\{pose2_{k-m}, pose2_{k-m+1}, \ldots, pose2_k\}$; the raw measurement data acquired by the IMU at each time point are stored in the memory of a computer;
step S102: obtaining the predicted robot pose $pose2_k$ of the IMU sensor corresponding to time $t_k$; using the predicted pose $pose2_k$ as the prior pose, obtaining the pose observation $pose1_k$ of the laser radar at time $t_k$ based on NDT matching; and, based on the predicted pose $pose2_k$ and the observed pose $pose1_k$ at time $t_k$, calculating the optimal estimate $\hat{pose}_k$ of the robot pose at that moment using unscented Kalman filtering;
Step S103: the NDT matching causes a time lag τ, τ ═ tk'-tk,tk'Time after NDT match, and tk'<tk+mBased on the prediction model and the tkOptimal estimation pose at a time
Figure BDA0003320685330000072
Predicting pose { pos 2 for IMU sensor robotk+1,…,pose2k'Predicting again to obtain an updated IMU sensor robot predicted pose set{pose2'k+1,…,pose2'k'};
After the update is finished, if real-time IMU and laser radar measurement data are still being obtained, go to step S101; if no measurement data are input, the method ends.
In step S100, the prediction model of the IMU and the observation model of the laser radar are respectively:

$$x_k = f(x_{k-1}) + w_{k-1}, \qquad w_k \sim \mathcal{N}(0, Q_k)$$
$$y_k = h(x_k) + n_k, \qquad n_k \sim \mathcal{N}(0, V_k)$$

where $x_k$ is the state variable of the robot acquired by the IMU at time $t_k$; $w_k$ is the prediction noise at time $t_k$, obeying Gaussian noise with mean 0 and covariance $Q_k$; $f$ denotes the state recurrence formula that predicts time $t_k$ from time $t_{k-1}$; $x_{k-1}$ is the robot state variable at time $t_{k-1}$, and $w_{k-1}$ is the prediction noise at time $t_{k-1}$; $y_k$ represents the observation of the environment by the laser radar at time $t_k$; $n_k$ is the observation noise at time $t_k$, obeying Gaussian noise with mean 0 and covariance $V_k$; and $h$ expresses the observation data $y_k$ at the current moment as a function of the current state $x_k$.
Further, the prediction equation of the IMU and the observation equation of the laser radar are determined in step S100. The state variable $x_k$ includes the position $p_k$ of the robot in the map coordinate system, the velocity $v_k$ of the robot in the map coordinate system, and the attitude $R_k$ of the robot in the map coordinate system, the attitude being represented by a rotation matrix.

The prediction equation of the IMU is as follows:

$$p_k = p_{k-1} + v_{k-1}\,\Delta t + \frac{1}{2}\left(R_{k-1}\,a_k + g\right)\Delta t^2$$
$$v_k = v_{k-1} + \left(R_{k-1}\,a_k + g\right)\Delta t$$
$$R_k = R_{k-1}\,\mathrm{Exp}\!\left(\omega_k\,\Delta t\right)$$

where $a_k$ and $\omega_k$ are the linear acceleration and angular velocity measured by the IMU at time $t_k$, $p_{k-1}$ is the position of the robot in the map coordinate system at time $t_{k-1}$, $v_{k-1}$ is the velocity of the robot in the map coordinate system at time $t_{k-1}$, $\Delta t$ is the interval between two adjacent IMU measurements, $R_{k-1}$ is the attitude of the robot in the map coordinate system at time $t_{k-1}$, $\mathrm{Exp}(\cdot)$ maps a rotation vector to a rotation matrix, and $g$ is the gravitational acceleration.
The observation equation of the laser radar is as follows:

$$y_k = \left[\hat{p}_k,\ \hat{v}_k,\ \hat{R}_k\right], \qquad \hat{v}_k = \frac{\hat{p}_k - \hat{p}_{k-1}}{\Delta t}$$

where $y_k$ is the observation data at time $t_k$, comprising the position observation $\hat{p}_k$ and the attitude observation $\hat{R}_k$ of the robot pose obtained from point cloud matching; the velocity observation $\hat{v}_k$ of the robot is obtained from the position difference between times $t_k$ and $t_{k-1}$.
In view of the fact that the probability-distribution-based optimal estimation used in this embodiment would fail in subsequent steps if the nonlinear prediction model were applied to the state distribution directly, the prediction model is not used to propagate the distribution directly; instead an approximation is adopted: this embodiment selects 2N+1 points to approximate the distribution of the state, where N is the dimension of the state. Specifically, the unscented Kalman filtering prediction algorithm is adopted, and its flow is as follows.

Input: the mean $x_{k-1}$ and covariance $P_{k-1}$ of the state at the previous moment, and the IMU measurements $a_k$, $\omega_k$.

First, the 2N+1 sigma points that approximate the state distribution are calculated from the mean and covariance of the state at the previous moment:

$$\mathcal{X}_{k-1}^{(0)} = x_{k-1}, \qquad \mathcal{X}_{k-1}^{(i)} = x_{k-1} + \left(\sqrt{(N+\lambda)\,P_{k-1}}\right)_{i}, \quad i = 1, \ldots, N,$$
$$\mathcal{X}_{k-1}^{(i)} = x_{k-1} - \left(\sqrt{(N+\lambda)\,P_{k-1}}\right)_{i-N}, \quad i = N+1, \ldots, 2N,$$

where $(\cdot)_i$ denotes the $i$-th column of the matrix square root and $\lambda$ is a scaling parameter. After the sigma points of the previous moment are calculated, the sigma points of the current moment are predicted with the motion model of the IMU:

$$\mathcal{X}_{k}^{(i)} = f\!\left(\mathcal{X}_{k-1}^{(i)},\ a_k,\ \omega_k\right), \quad i = 0, \ldots, 2N.$$

Then the mean and covariance of the predicted sigma points are calculated:

$$\check{x}_k = \sum_{i=0}^{2N} W^{(i)}\,\mathcal{X}_k^{(i)}, \qquad \check{P}_k = \sum_{i=0}^{2N} W^{(i)}\left(\mathcal{X}_k^{(i)} - \check{x}_k\right)\left(\mathcal{X}_k^{(i)} - \check{x}_k\right)^{\top} + Q_k,$$

where $W^{(i)}$ are the sigma-point weights, $\check{x}_k$ is the mean of the predicted state, and $\check{P}_k$ is the covariance of the predicted state; $\check{x}_k$ corresponds to the predicted pose $pose2_k$.
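The following Python sketch mirrors this prediction step for a generic vector state. It is a minimal sketch assuming standard weights governed by a single scaling parameter lambda, with a toy constant-velocity model at the end; the patent's actual state contains a rotation matrix, whose manifold treatment is omitted here, and none of the names come from the patent.

    import numpy as np

    def sigma_points(x, P, lam=2.0):
        """2N+1 sigma points and weights approximating a Gaussian N(x, P)."""
        N = x.size
        S = np.linalg.cholesky((N + lam) * P)   # matrix square root; columns are offsets
        pts = np.vstack([x, x + S.T, x - S.T])  # rows: x, x + col_i(S), x - col_i(S)
        w = np.full(2 * N + 1, 1.0 / (2.0 * (N + lam)))
        w[0] = lam / (N + lam)
        return pts, w

    def ukf_predict(x, P, f, Q, lam=2.0):
        """Predicted mean and covariance from sigma points pushed through f."""
        pts, w = sigma_points(x, P, lam)
        pred = np.array([f(s) for s in pts])    # propagate each sigma point
        x_pred = w @ pred                       # weighted mean
        d = pred - x_pred
        P_pred = (w[:, None] * d).T @ d + Q     # weighted covariance plus process noise
        return x_pred, P_pred

    # Toy check: constant-velocity model, state = [position, velocity], dt = 0.1 s.
    f = lambda s: np.array([s[0] + 0.1 * s[1], s[1]])
    x_pred, P_pred = ukf_predict(np.array([0.0, 1.0]), 0.01 * np.eye(2), f, 1e-4 * np.eye(2))
    print(x_pred)  # ~[0.1, 1.0]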
As shown in fig. 4, in this embodiment, after the observation data of the laser radar at a certain moment are obtained, the pose of the current point cloud in the point cloud map needs to be calculated by a point cloud registration method, so as to obtain the laser radar pose $pose1_k$ of the robot at time $t_k$.
Step S102, obtaining the predicted robot pose $pose2_k$ of the IMU sensor corresponding to time $t_k$ and, with $pose2_k$ as the prior pose, obtaining the pose observation $pose1_k$ of the laser radar at time $t_k$ based on NDT matching, comprises the following steps:
step S1021: the laser radar obtains the observation data at time $t_k$; determine the original three-dimensional point cloud $P_k$ measured at time $t_k$ and down-sample it to obtain the filtered point cloud set $X_k = \{x_1, x_2, \ldots, x_{N_x}\}$, where $x_i$ denotes the three-dimensional coordinates of a laser point in the current laser radar coordinate system $F_k$, $i = 1 \ldots N_x$, and $N_x$ is the number of points after filtering; from the point cloud map, acquire the three-dimensional point cloud set within the laser radar measuring range at the predicted pose $pose2_k$ corresponding to time $t_k$, $Y = \{y_1, y_2, \ldots, y_{N_y}\}$, where $y_j$ denotes the three-dimensional coordinates of a map point in the map coordinate system $F_M$, $j = 1 \ldots N_y$, and $N_y$ is the number of points extracted from the point cloud map; the original three-dimensional point cloud is the point cloud acquired by the laser radar at time $t_k$, the point cloud map is a pre-established point cloud map, and the three-dimensional point cloud within the laser radar measuring range is the portion of the point cloud map inside the laser radar measuring range;
step S1022: determine the probability distribution $(\mu, \Sigma)$ of the point cloud in the point cloud map, where $\mu$ is the mean of the point cloud probability distribution and $\Sigma$ is its covariance:

$$\mu = \frac{1}{N_y}\sum_{j=1}^{N_y} y_j, \qquad \Sigma = \frac{1}{N_y}\sum_{j=1}^{N_y} \left(y_j - \mu\right)\left(y_j - \mu\right)^{\top}$$
Step S1023: obtaining tkIMU sensor robot predicted pose position 2 corresponding to momentkDeposit 2kAs
Figure BDA0003320685330000102
At an initial value of XkThe point cloud in the map is subjected to three-dimensional transformation and converted into a map coordinate system
Figure BDA0003320685330000103
In, memory
Figure BDA0003320685330000104
Figure BDA0003320685330000105
x′iIs xiIn the map coordinate system
Figure BDA0003320685330000106
A lower three-dimensional coordinate;
Figure BDA0003320685330000107
as laser minesCoordinate system of
Figure BDA0003320685330000108
To the map coordinate system
Figure BDA0003320685330000109
Three-dimensional coordinate transformation of (2);
step S1024: determine the probability of each point in $X'_k$:

$$g(Y, x'_i) = \exp\!\left(-\frac{\left(x'_i - \mu\right)^{\top} \Sigma^{-1} \left(x'_i - \mu\right)}{2}\right)$$

where $g(Y, x'_i)$ is the formula for calculating the probability of each point in $X'_k$;
step S1025: construct the optimization objective $\max \Psi$,

$$\Psi = \sum_{i=1}^{N_x} g\!\left(Y,\ T_k(x_i)\right)$$

where $T_k(x_i)$ transforms point $x_i$ from the laser radar coordinate system $F_k$ to the map coordinate system $F_M$; the optimization objective is solved by the standard Gauss-Newton method, and the current optimal transformation $T_k^{*}$ is worked out iteratively; $T_k^{*}$ is taken as the point cloud matching result and as the laser radar pose $pose1_k$ of the robot at time $t_k$.
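The following Python sketch evaluates the objective $\Psi$ for a candidate transformation. It is a minimal sketch with two simplifications that are mine, not the patent's: a single Gaussian is fitted to the whole local map cloud (production NDT fits one Gaussian per voxel), and a coarse grid search over the translation stands in for the Gauss-Newton solver so the example stays short and runnable.

    import numpy as np

    def fit_gaussian(Y):
        """Mean and covariance of the map points (step S1022)."""
        mu = Y.mean(axis=0)
        d = Y - mu
        return mu, d.T @ d / len(Y)

    def ndt_score(X, Rm, t, mu, cov):
        """Psi: summed point likelihoods after transforming X into the map frame."""
        Xp = X @ Rm.T + t                         # x'_i = T_k(x_i), step S1023
        d = Xp - mu
        m = np.einsum('ij,jk,ik->i', d, np.linalg.inv(cov), d)
        return np.exp(-0.5 * m).sum()             # steps S1024 and S1025

    # Toy data: the scan is the map shifted by (0.5, 0, 0); search the shift back.
    rng = np.random.default_rng(0)
    Y = rng.normal(size=(200, 3))                 # stand-in map points
    X = Y - np.array([0.5, 0.0, 0.0])             # stand-in scan points
    mu, cov = fit_gaussian(Y)
    best_t = max((np.array([dx, 0.0, 0.0]) for dx in np.linspace(0.0, 1.0, 21)),
                 key=lambda t: ndt_score(X, np.eye(3), t, mu, cov))
    print(best_t)  # close to [0.5, 0, 0]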
In step S102, based on the predicted pose $pose2_k$ and the observed pose $pose1_k$ at time $t_k$, the optimal estimate $\hat{pose}_k$ of the robot pose at that moment is calculated using unscented Kalman filtering. That is, the covariance matrices of the predicted pose and the observed pose are calculated respectively; the covariance matrices represent the confidence of $pose2_k$ and $pose1_k$, and the optimal pose estimate $\hat{pose}_k$ is determined from these two confidences.
In this embodiment, the optimal estimate of the robot pose at time $t_k$ is calculated according to the unscented Kalman filtering equations, using the following unscented Kalman filtering optimal estimation algorithm.

Algorithm input: the mean $\check{x}_k$ and covariance $\check{P}_k$ of the predicted state at this moment, and the point cloud matching pose $pose1_k$.

First, calculate the predicted observations together with their mean and covariance:

$$\mathcal{Y}_k^{(i)} = h\!\left(\mathcal{X}_k^{(i)}\right), \qquad \check{y}_k = \sum_{i=0}^{2N} W^{(i)}\,\mathcal{Y}_k^{(i)},$$
$$P_y = \sum_{i=0}^{2N} W^{(i)}\left(\mathcal{Y}_k^{(i)} - \check{y}_k\right)\left(\mathcal{Y}_k^{(i)} - \check{y}_k\right)^{\top} + V_k.$$

Then, from the predicted and observed sigma-point distributions and covariances, calculate the cross covariance of $x$ and $y$ and the Kalman gain $K_k$:

$$P_{xy} = \sum_{i=0}^{2N} W^{(i)}\left(\mathcal{X}_k^{(i)} - \check{x}_k\right)\left(\mathcal{Y}_k^{(i)} - \check{y}_k\right)^{\top}, \qquad K_k = P_{xy}\,P_y^{-1}.$$

Finally, calculate the final optimal estimate $\hat{x}_k$ and covariance $\hat{P}_k$ from the Kalman gain:

$$\hat{x}_k = \check{x}_k + K_k\left(y_k - \check{y}_k\right), \qquad \hat{P}_k = \check{P}_k - K_k\,P_y\,K_k^{\top}.$$
Specifically, the input data for the unscented Kalman filtering optimal estimation are the predicted state mean $\check{x}_k$ and covariance $\check{P}_k$ of the current moment and the point cloud matching pose $pose1_k$. First, the mean $\check{y}_k$ and covariance $P_y$ of the observations are calculated from the observation equation; the covariances $\check{P}_k$ and $P_y$ represent the confidence of the two estimates. From the observed data and the predicted state distribution, the cross covariance $P_{xy}$ of the two estimates is calculated and the Kalman gain $K_k$ is obtained; the larger the gain, the more confidence is placed in the observed state relative to the predicted state. The optimal estimate $\hat{x}_k$ and its covariance $\hat{P}_k$ are thus determined from the predicted pose $pose2_k$ and the observed pose $pose1_k$, where $\hat{x}_k$ corresponds to $\hat{pose}_k$. Compared with the most common extended Kalman filtering, unscented Kalman filtering approximates the nonlinear functions better and requires no Jacobian-based linearization of the models, so a better state estimation result can be obtained.
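The update step can be sketched in Python as follows. It is a minimal sketch for a generic vector state: the sigma-point helper is repeated from the prediction sketch so the block runs on its own, V plays the role of the observation noise covariance, and the toy observation model at the end simply reads out the position component, as the NDT pose observation does for the position; none of the names come from the patent.

    import numpy as np

    def sigma_points(x, P, lam=2.0):
        """Same construction as in the prediction sketch, repeated so this block runs alone."""
        N = x.size
        S = np.linalg.cholesky((N + lam) * P)
        pts = np.vstack([x, x + S.T, x - S.T])
        w = np.full(2 * N + 1, 1.0 / (2.0 * (N + lam)))
        w[0] = lam / (N + lam)
        return pts, w

    def ukf_update(x_pred, P_pred, y, h, V, lam=2.0):
        """Fuse the predicted state with an observation y of model h (noise covariance V)."""
        pts, w = sigma_points(x_pred, P_pred, lam)
        ypts = np.array([h(s) for s in pts])
        y_pred = w @ ypts                      # predicted observation mean
        dy = ypts - y_pred
        dx = pts - x_pred
        Py = (w[:, None] * dy).T @ dy + V      # innovation covariance
        Pxy = (w[:, None] * dx).T @ dy         # cross covariance of x and y
        K = Pxy @ np.linalg.inv(Py)            # Kalman gain
        x_hat = x_pred + K @ (y - y_pred)      # optimal estimate
        P_hat = P_pred - K @ Py @ K.T          # updated covariance
        return x_hat, P_hat

    # Toy example: state [p, v]; the matched pose observes p directly.
    h = lambda s: s[:1]
    x_hat, P_hat = ukf_update(np.array([0.1, 1.0]), 0.04 * np.eye(2),
                              np.array([0.0]), h, 0.01 * np.eye(1))
    print(x_hat)  # position pulled toward the observation 0.0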
In step S103, because the point cloud data volume of the laser radar is huge, the laser radar pose in the map is solved by iterative matching at considerable computational cost, and the resulting time lag affects the subsequent filtering fusion. Therefore, a data caching and delay compensation mechanism is designed, whose principle is shown in fig. 5. That is, because of the large overhead of the NDT matching computation, by the time the match completes the time has already moved from $t_k$ to $t_{k'}$, with $t_{k'} < t_{k+m}$, i.e., the next frame of laser radar point cloud has not yet been observed. Thus, point cloud matching introduces a time lag $\tau = t_{k'} - t_k$. Since the state estimate at time $t_k$ has changed, based on the prediction model and the optimal estimated pose $\hat{pose}_k$ at time $t_k$, the predicted robot poses $\{pose2_{k+1}, \ldots, pose2_{k'}\}$ of the IMU sensor are re-predicted to obtain the updated predicted pose set $\{pose2'_{k+1}, \ldots, pose2'_{k'}\}$;
After the update is finished, if real-time IMU and laser radar measurement data are still being obtained, go to step S101; if no measurement data are input, the method ends.
In this embodiment, according to the unscented Kalman filtering equations, the point cloud matching result $pose1_k$ updates the state variable at time $t_k$. The predicted states within the period $(t_k, t_{k'})$ are affected by this state update, so the IMU data in memory are recalled to re-predict the state variables and obtain updated real-time predicted states, thereby suppressing the effect of the point cloud matching time lag $\tau$ on the state estimation.
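The caching and re-prediction mechanism can be sketched as follows. It is a minimal Python sketch assuming a one-dimensional constant-velocity state so the replay logic stays visible; the sample layout (timestamp, acceleration, interval), the deque buffer, and the function names are illustrative, not the patent's data structures.

    from collections import deque

    imu_buffer = deque(maxlen=1000)        # raw IMU samples cached in step S101

    def predict(state, sample):
        """Toy prediction: integrate one (timestamp, acceleration, interval) sample."""
        p, v = state
        t, acc, dt = sample
        return (p + v * dt + 0.5 * acc * dt ** 2, v + acc * dt)

    def on_imu(state, sample):
        """High-frequency path: cache the raw sample, then predict forward."""
        imu_buffer.append(sample)
        return predict(state, sample)

    def on_delayed_update(state_hat_k, t_k):
        """The NDT result stamped t_k arrives at t_k' > t_k: replay cached samples."""
        state = state_hat_k                # corrected estimate at the lidar timestamp
        for sample in imu_buffer:
            if sample[0] > t_k:            # only samples newer than the update time
                state = predict(state, sample)
        return state                       # low-latency estimate at the current time

    # Usage: feed 100 Hz samples, then fuse a lagged correction stamped t_k = 0.05.
    state = (0.0, 0.0)
    for i in range(10):
        state = on_imu(state, (i * 0.01, 0.2, 0.01))
    state = on_delayed_update((0.001, 0.02), 0.05)
    print(state)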
The embodiment of the present invention further provides a real-time positioning device fusing information of a laser radar and an IMU sensor, as shown in fig. 6, the device includes:
a modeling module: configured to model, in an offline stage, an observation model of the laser radar and a prediction model of the IMU (inertial measurement unit) respectively, wherein the observation model of the laser radar is used for determining the laser radar's observation of the robot pose state, and the prediction model of the IMU is used for determining the predicted pose of the robot from the IMU sensor measurement data;
a prediction information acquisition module: configured so that, in the real-time information acquisition stage, the laser radar and the IMU sensor each begin acquiring information from the initial moment; for the time series $Time = \{t_{k-m}, t_{k-m+1}, \ldots, t_k\}$, IMU measurement data are obtained once at each time point and laser radar observation data are obtained once every m time points, the laser radar observation data being point cloud data, where $t_k$ is the current time and $t_{k-m}$ is the time of the last optimal pose estimation, whose result is recorded as $\hat{pose}_{k-m}$; the laser radar obtains observation data at $t_k$, by which time the IMU sensor has obtained from the prediction model the predicted pose at each time point from $t_{k-m}$ to $t_k$, $\{pose2_{k-m}, pose2_{k-m+1}, \ldots, pose2_k\}$; the raw measurement data acquired by the IMU at each time point are stored in the memory of a computer;
an optimal estimation determination module: configured to obtain the predicted robot pose $pose2_k$ of the IMU sensor corresponding to time $t_k$; using the predicted pose $pose2_k$ as the prior pose, obtain the pose observation $pose1_k$ of the laser radar at time $t_k$ based on NDT matching; and, based on the predicted pose $pose2_k$ and the observed pose $pose1_k$ at time $t_k$, calculate the optimal estimate $\hat{pose}_k$ of the robot pose at that moment using unscented Kalman filtering;
an update module: configured to handle the time lag $\tau = t_{k'} - t_k$ caused by the NDT matching, where $t_{k'}$ is the time at which the NDT match completes and $t_{k'} < t_{k+m}$; based on the prediction model and the optimal estimated pose $\hat{pose}_k$ at time $t_k$, the module re-predicts the predicted robot poses $\{pose2_{k+1}, \ldots, pose2_{k'}\}$ of the IMU sensor to obtain the updated predicted pose set $\{pose2'_{k+1}, \ldots, pose2'_{k'}\}$;
After the update is finished, if real-time IMU and laser radar measurement data are still being obtained, the prediction information acquisition module is triggered; if no measurement data are input, the positioning ends.
The embodiment of the invention further provides a real-time positioning system fusing laser radar and IMU sensor information, which comprises:
a processor for executing a plurality of instructions;
a memory to store a plurality of instructions;
wherein the instructions are stored in the memory, and are loaded by the processor and executed to perform the above real-time positioning method fusing laser radar and IMU sensor information.
The embodiment of the invention further provides a computer-readable storage medium having a plurality of instructions stored therein, the instructions being for loading by a processor and executing the above real-time positioning method fusing laser radar and IMU sensor information.
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a physical server, or a network cloud server, etc., running a Windows or Windows Server operating system) to perform some of the steps of the methods according to various embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention in any way; any simple modification, equivalent change, or refinement made to the above embodiment in accordance with the technical essence of the present invention still falls within the scope of the technical solution of the present invention.

Claims (7)

1. A real-time positioning method fusing laser radar and IMU sensor information, characterized by comprising the following steps:
step S100: in an offline stage, respectively modeling an observation model of the laser radar and a prediction model of the IMU (inertial measurement unit), wherein the observation model of the laser radar is used for determining, from the laser radar point cloud matching result, the laser radar's observation of the robot pose state, and the prediction model of the IMU is used for determining the predicted pose of the robot from the IMU sensor measurement data;
step S101: in the real-time information acquisition stage, the laser radar and the IMU sensor each begin acquiring information from the initial moment; for the time series $Time = \{t_{k-m}, t_{k-m+1}, \ldots, t_k\}$, IMU measurement data are obtained once at each time point and laser radar observation data are obtained once every m time points, the laser radar observation data being point cloud data, where $t_k$ is the current time and $t_{k-m}$ is the time of the last optimal pose estimation, whose result is recorded as $\hat{pose}_{k-m}$; the laser radar obtains observation data at $t_k$, by which time the IMU sensor has obtained from the prediction model the predicted pose at each time point from $t_{k-m}$ to $t_k$, $\{pose2_{k-m}, pose2_{k-m+1}, \ldots, pose2_k\}$; the raw measurement data acquired by the IMU at each time point are stored in the memory of a computer;
step S102: obtaining the predicted robot pose $pose2_k$ of the IMU sensor corresponding to time $t_k$; using the predicted pose $pose2_k$ as the prior pose, obtaining the pose observation $pose1_k$ of the laser radar at time $t_k$ based on NDT matching; and, based on the predicted pose $pose2_k$ and the observed pose $pose1_k$ at time $t_k$, calculating the optimal estimate $\hat{pose}_k$ of the robot pose at that moment using unscented Kalman filtering;
step S103: the NDT matching causes a time lag $\tau = t_{k'} - t_k$, where $t_{k'}$ is the time at which the NDT match completes and $t_{k'} < t_{k+m}$; based on the prediction model and the optimal estimated pose $\hat{pose}_k$ at time $t_k$, re-predicting the predicted robot poses $\{pose2_{k+1}, \ldots, pose2_{k'}\}$ of the IMU sensor to obtain the updated predicted pose set $\{pose2'_{k+1}, \ldots, pose2'_{k'}\}$;
after the update is finished, if real-time IMU and laser radar measurement data are still being obtained, going to step S101; if no measurement data are input, the method ends.
2. The method of claim 1, wherein in step S100 the prediction model of the IMU and the observation model of the laser radar are respectively:

$$x_k = f(x_{k-1}) + w_{k-1}, \qquad w_k \sim \mathcal{N}(0, Q_k)$$
$$y_k = h(x_k) + n_k, \qquad n_k \sim \mathcal{N}(0, V_k)$$

where $x_k$ is the state variable of the robot acquired by the IMU at time $t_k$; $w_k$ is the prediction noise at time $t_k$, obeying Gaussian noise with mean 0 and covariance $Q_k$; $f$ denotes the state recurrence formula that predicts time $t_k$ from time $t_{k-1}$; $x_{k-1}$ is the robot state variable at time $t_{k-1}$, and $w_{k-1}$ is the prediction noise at time $t_{k-1}$; $y_k$ represents the observation of the environment by the laser radar at time $t_k$; $n_k$ is the observation noise at time $t_k$, obeying Gaussian noise with mean 0 and covariance $V_k$; and $h$ expresses the observation data $y_k$ at the current moment as a function of the current state $x_k$.
3. The method of claim 2, wherein step S102, obtaining the predicted robot pose $pose2_k$ of the IMU sensor corresponding to time $t_k$ and, with $pose2_k$ as the prior pose, obtaining the pose observation $pose1_k$ of the laser radar at time $t_k$ based on NDT matching, comprises:
step S1021: the laser radar obtains the observation data at time $t_k$; the original three-dimensional point cloud $P_k$ measured at time $t_k$ is determined and down-sampled to obtain the filtered point cloud set $X_k = \{x_1, x_2, \ldots, x_{N_x}\}$, where $x_i$ denotes the three-dimensional coordinates of a laser point in the current laser radar coordinate system $F_k$, $i = 1 \ldots N_x$, and $N_x$ is the number of points after filtering; from the point cloud map, the three-dimensional point cloud set within the laser radar measuring range at the predicted pose $pose2_k$ corresponding to time $t_k$ is acquired as $Y = \{y_1, y_2, \ldots, y_{N_y}\}$, where $y_j$ denotes the three-dimensional coordinates of a map point in the map coordinate system $F_M$, $j = 1 \ldots N_y$, and $N_y$ is the number of points extracted from the point cloud map; the original three-dimensional point cloud is the point cloud acquired by the laser radar at time $t_k$, the point cloud map is a pre-established point cloud map, and the three-dimensional point cloud within the laser radar measuring range is the portion of the point cloud map inside the laser radar measuring range;
step S1022: the probability distribution $(\mu, \Sigma)$ of the point cloud in the point cloud map is determined, where $\mu$ is the mean of the point cloud probability distribution and $\Sigma$ is its covariance:

$$\mu = \frac{1}{N_y}\sum_{j=1}^{N_y} y_j, \qquad \Sigma = \frac{1}{N_y}\sum_{j=1}^{N_y} \left(y_j - \mu\right)\left(y_j - \mu\right)^{\top}$$

step S1023: the predicted robot pose $pose2_k$ of the IMU sensor corresponding to time $t_k$ is obtained and taken as the initial value of the transformation $T_k$; the point cloud $X_k$ is three-dimensionally transformed into the map coordinate system $F_M$ and recorded as $X'_k = \{x'_1, x'_2, \ldots, x'_{N_x}\}$ with $x'_i = T_k(x_i)$, where $x'_i$ is the three-dimensional coordinate of $x_i$ in the map coordinate system $F_M$ and $T_k$ is the three-dimensional coordinate transformation from the laser radar coordinate system $F_k$ to the map coordinate system $F_M$;
step S1024: the probability of each point in $X'_k$ is determined as

$$g(Y, x'_i) = \exp\!\left(-\frac{\left(x'_i - \mu\right)^{\top} \Sigma^{-1} \left(x'_i - \mu\right)}{2}\right)$$

where $g(Y, x'_i)$ is the formula for calculating the probability of each point in $X'_k$;
step S1025: the optimization objective $\max \Psi$ is constructed, with

$$\Psi = \sum_{i=1}^{N_x} g\!\left(Y,\ T_k(x_i)\right)$$

where $T_k(x_i)$ transforms point $x_i$ from the laser radar coordinate system $F_k$ to the map coordinate system $F_M$; the optimization objective is solved by the standard Gauss-Newton method, the current optimal transformation $T_k^{*}$ is worked out iteratively, and $T_k^{*}$ is taken as the point cloud matching result and as the laser radar pose $pose1_k$ of the robot at time $t_k$.
4. The method of claim 3, wherein in step S102, based on the predicted pose $pose2_k$ and the observed pose $pose1_k$ at time $t_k$, the optimal estimate $\hat{pose}_k$ of the robot pose at that moment is calculated using unscented Kalman filtering; that is, the covariance matrices of the predicted pose and the observed pose are calculated respectively, the covariance matrices representing the confidence of $pose2_k$ and $pose1_k$, and the optimal pose estimate $\hat{pose}_k$ is determined from the two confidences.
5. A real-time positioning device fusing laser radar and IMU sensor information, characterized by comprising:
a modeling module: configured to model, in an offline stage, an observation model of the laser radar and a prediction model of the IMU (inertial measurement unit) respectively, wherein the observation model of the laser radar is used for determining the laser radar's observation of the robot pose state, and the prediction model of the IMU is used for determining the predicted pose of the robot from the IMU sensor measurement data;
a prediction information acquisition module: configured so that the laser radar and the IMU sensor each acquire information from the initial moment; for the time series $Time = \{t_{k-m}, t_{k-m+1}, \ldots, t_k\}$, IMU measurement data are obtained once at each time point and laser radar observation data are obtained once every m time points, the laser radar observation data being point cloud data, where $t_k$ is the current time and $t_{k-m}$ is the time of the last optimal pose estimation, whose result is recorded as $\hat{pose}_{k-m}$; the laser radar obtains observation data at $t_k$, by which time the IMU sensor has obtained from the prediction model the predicted pose at each time point from $t_{k-m}$ to $t_k$, $\{pose2_{k-m}, pose2_{k-m+1}, \ldots, pose2_k\}$; the raw measurement data acquired by the IMU at each time point are stored in the memory of a computer;
an optimal estimation determination module: configured to obtain the predicted robot pose $pose2_k$ of the IMU sensor corresponding to time $t_k$; using the predicted pose $pose2_k$ as the prior pose, obtain the pose observation $pose1_k$ of the laser radar at time $t_k$ based on NDT matching; and, based on the predicted pose $pose2_k$ and the observed pose $pose1_k$ at time $t_k$, calculate the optimal estimate $\hat{pose}_k$ of the robot pose at that moment using unscented Kalman filtering;
an update module: configured to handle the time lag $\tau = t_{k'} - t_k$ caused by the NDT matching, where $t_{k'}$ is the time at which the NDT match completes and $t_{k'} < t_{k+m}$; based on the prediction model and the optimal estimated pose $\hat{pose}_k$ at time $t_k$, the module re-predicts the predicted robot poses $\{pose2_{k+1}, \ldots, pose2_{k'}\}$ of the IMU sensor to obtain the updated predicted pose set $\{pose2'_{k+1}, \ldots, pose2'_{k'}\}$; after the update is finished, if real-time IMU and laser radar measurement data are still being obtained, the prediction information acquisition module is triggered; if no measurement data are input, the positioning ends.
6. A real-time positioning system fusing laser radar and IMU sensor information, comprising:
a processor for executing a plurality of instructions;
a memory to store a plurality of instructions;
wherein the plurality of instructions are stored in the memory, and are loaded by the processor and executed to perform the method of any one of claims 1-4.
7. A computer-readable storage medium having a plurality of instructions stored therein, the plurality of instructions being for loading by a processor and performing the method of any one of claims 1-4.
CN202111245119.1A (filed 2021-10-26): Real-time positioning method and device fusing laser radar and IMU sensor information. Status: Pending.

Priority Applications (1)

CN202111245119.1A (priority date and filing date 2021-10-26): Real-time positioning method and device fusing laser radar and IMU sensor information

Publications (1)

CN113960622A, published 2022-01-21

Family

ID=79466943 (one family application: CN202111245119.1A, status pending, country CN)

Cited By (8)

(* cited by examiner, † cited by third party)

  • CN114608569A (*), priority 2022-02-22, published 2022-06-10: Three-dimensional pose estimation method, system, computer device and storage medium
  • CN114608569B (*), priority 2022-02-22, granted 2024-03-01: Three-dimensional pose estimation method, system, computer equipment and storage medium
  • CN114660589A (*), priority 2022-03-25, published 2022-06-24: Method, system and device for positioning underground tunnel
  • CN114660589B (*), priority 2022-03-25, granted 2023-03-10: Method, system and device for positioning underground tunnel
  • CN114897942A (*), priority 2022-07-15, published 2022-08-12: Point cloud map generation method and device and related storage medium
  • CN114897942B (*), priority 2022-07-15, granted 2022-10-28: Point cloud map generation method and device and related storage medium
  • CN116295345A (*), priority 2023-03-16, published 2023-06-23: Positioning method, device, equipment and medium in optical interference environment
  • CN116295345B (*), priority 2023-03-16, granted 2024-01-19: Positioning method, device, equipment and medium in optical interference environment

Legal Events

  • PB01: Publication
  • SE01: Entry into force of request for substantive examination