CN108387236B - Polarized light SLAM method based on extended Kalman filtering - Google Patents
Polarized light SLAM method based on extended Kalman filtering
- Publication number: CN108387236B
- Application number: CN201810128657.4A
- Authority: CN (China)
- Legal status: Active
Classifications
- G01C21/20 — Instruments for performing navigational calculations
- G06F17/16 — Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
Abstract
The invention discloses a polarized light SLAM method based on extended Kalman filtering, belonging to the field of unmanned aerial vehicle autonomous navigation. The method establishes a state model of the unmanned aerial vehicle and measurement models based on a laser radar sensor and a polarized light sensor, and determines the position of the unmanned aerial vehicle and constructs a map of the surrounding environment by means of distributed extended Kalman filtering, namely the EKF algorithm. By exploiting the complementarity of polarized light information and laser radar information and their immunity to external interference, the method improves the accuracy of the unmanned aerial vehicle's simultaneous localization and mapping.
Description
Technical Field
The invention relates to the field of unmanned aerial vehicle autonomous navigation, in particular to a polarized light SLAM method based on extended Kalman filtering, which aims to solve the problem of how to determine the position of an unmanned aerial vehicle and sense the external environment.
Background
SLAM is an abbreviation of Simultaneous Localization and Mapping. It is the process of computing the position of a moving body and constructing a map of the environment from sensor information. At present, SLAM technology is applied in fields such as unmanned aerial vehicles, autonomous driving, robots, AR, and smart homes.
SLAM research focuses on using filter theory to minimize the noise in the moving body's pose and in the landmark points of the map; typically, the output of an odometer serves as the input to the prediction step and the output of a laser sensor as the input to the update step. The extended Kalman filter (EKF) is the filtering algorithm used by most current SLAM methods. In recent years unmanned aerial vehicles have developed very rapidly. Applying SLAM to an unmanned aerial vehicle addresses the problems that the vehicle's position is uncertain and that position errors accumulate in dead reckoning: the sensors carried by the vehicle repeatedly detect features in the environment, so that the positions of the vehicle and of the features are corrected while an environment map is constructed, without requiring prior map information or external auxiliary equipment. However, the positioning and mapping accuracy of this SLAM scheme is not high, and its robustness is poor.
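As an illustrative sketch (not the patent's implementation), the prediction and update steps of an extended Kalman filter mentioned above can be written as follows; all function and variable names are illustrative, and the linearized matrices F and H stand in for the Jacobians of the motion and measurement models:

```python
import numpy as np

def ekf_predict(x, P, F, Q):
    """One EKF prediction step: propagate state estimate and covariance."""
    x_pred = F @ x                 # linearized state propagation
    P_pred = F @ P @ F.T + Q       # covariance propagation with process noise Q
    return x_pred, P_pred

def ekf_update(x_pred, P_pred, z, H, R):
    """One EKF measurement update with measurement z and measurement noise R."""
    S = H @ P_pred @ H.T + R                   # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x_pred + K @ (z - H @ x_pred)          # corrected state
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred # corrected covariance
    return x, P
```

The update always shrinks the covariance, which is what lets repeated feature observations bound the dead-reckoning drift described above.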
In recent years polarized light has been studied increasingly. In 1871 the physicist Rayleigh proposed the Rayleigh scattering law, revealing the scattering characteristics of light; on this basis, an all-sky atmospheric polarization distribution pattern was later obtained. This polarization pattern is relatively stable and contains rich navigation information, and many living creatures in nature use sky polarized light to navigate or to assist navigation. Polarization navigation is a very effective navigation means: it is passive, emits no radiation, and offers good concealment, and it can provide a new solution for navigation tasks in complex environments.
The polarized light is used on the SLAM of the unmanned aerial vehicle, so that the accuracy of determining the position of the unmanned aerial vehicle and constructing a map of the surrounding environment can be improved, and the problem of low robustness is solved.
Disclosure of Invention
The invention mainly solves the following problem: how to apply polarization information to unmanned aerial vehicle SLAM, so as to overcome the difficulty of simultaneous localization, the poor environmental adaptability, and the insufficient mapping accuracy of existing schemes.
The technical scheme adopted by the invention is as follows: a polarized light SLAM method based on extended Kalman filtering comprises the following steps:
(1) selecting the attitude, the speed, the position and the landmark points of the unmanned aerial vehicle as a system state, and establishing a dynamic model of the unmanned aerial vehicle;
(2) establishing a measurement model of the laser radar;
(3) establishing a measurement model of the polarized light sensor;
(4) initializing a system and a map;
(5) matching landmark points;
(6) estimating the position, the speed, the attitude and the position of the landmark point of the unmanned aerial vehicle by using laser radar data and polarized light sensor data of the landmark point through a distributed extended Kalman filter;
(7) updating the map;
The attitude, velocity, position and landmark points of the unmanned aerial vehicle are selected as the system state, and a dynamic model of the unmanned aerial vehicle is established. A world coordinate system (the w system) is established with the starting position of the unmanned aerial vehicle as its origin: due north is the positive direction of its x axis, due west is the positive direction of its y axis, and the positive direction of its z axis is determined by the right-hand rule. A body coordinate system (the b system) is established with the center of the unmanned aerial vehicle body as its origin: the direction parallel to the longitudinal axis of the body and pointing to the nose is the positive direction of its x axis, the direction parallel to the transverse axis of the body and pointing to the left is the positive direction of its y axis, and the positive direction of its z axis is determined by the right-hand rule. The position, velocity and attitude of the unmanned aerial vehicle and the positions of the landmark points are selected as the system state. The system state transition equation is as follows:
wherein
$x_{k-1}$ represents the state of the system at time $k-1$; $p^w_{k-1}=[p^w_{x,k-1}\ p^w_{y,k-1}\ p^w_{z,k-1}]^T$ represents the coordinates of the unmanned aerial vehicle in the world coordinate system at time $k-1$, its components being the coordinates along the x, y and z axes; $v^w_{k-1}=[v^w_{x,k-1}\ v^w_{y,k-1}\ v^w_{z,k-1}]^T$ represents the velocity of the unmanned aerial vehicle in the world coordinate system at time $k-1$, its components being the velocities along the x, y and z axes; $\phi_{k-1}=[\gamma\ \theta\ \psi]^T$ represents the attitude of the unmanned aerial vehicle in Euler-angle form, the components being the roll, pitch and yaw angles respectively; $L^w_{k-1}=[L^w_{1,k-1}\ \cdots\ L^w_{m,k-1}]^T$ denotes the coordinates of the $m$ landmark points in the world coordinate system at time $k-1$, where $L^w_{i,k-1}$, $i=1,2,\dots,m$, denotes the coordinates of the $i$-th landmark point; $f^b_{k-1}$ is the specific-force acceleration in the body coordinate system at time $k-1$; $\omega^b_{k-1}$ is the angular velocity in the body coordinate system at time $k-1$; $g^w_{k-1}$ is the gravitational acceleration in the world coordinate system at time $k-1$; $\Delta t$ is the sampling time interval; $C^w_{b,k-1}$ is the transformation matrix from the body coordinate system to the world coordinate system at time $k-1$; $n_{k-1}$ is the system noise, set as Gaussian white noise; $Q_{k-1}$ is the system noise covariance matrix;
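The state propagation just described can be sketched as a first-order discretization. The function names, the ZYX Euler convention, and the simplified Euler-rate integration are assumptions, since the patent's original equations are not reproduced in this text:

```python
import numpy as np

def rot_body_to_world(roll, pitch, yaw):
    """ZYX Euler-angle rotation matrix C_b^w (body frame -> world frame)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def propagate(p, v, euler, landmarks, f_b, w_b, g_w, dt):
    """First-order propagation over one sampling interval dt.

    p, v, euler: UAV position, velocity, attitude in the world frame;
    f_b, w_b: IMU specific force and angular rate in the body frame;
    g_w: gravity in the world frame. Landmarks are static, so they pass through.
    """
    C = rot_body_to_world(*euler)
    p_next = p + v * dt
    v_next = v + (C @ f_b + g_w) * dt
    euler_next = euler + w_b * dt   # simplified Euler-rate integration (assumption)
    return p_next, v_next, euler_next, landmarks
```

For a hovering vehicle (thrust exactly cancelling gravity, zero angular rate) this propagation leaves the state unchanged, as expected.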
In step (2), a measurement model of the laser radar is established: the distances and angles measured by the laser radar are selected as measured values, and the measurement model of the landmark points given by the laser radar data is:
wherein
$h_{k,i}(\cdot)$, $i=1,2,\dots,m$, represents the nonlinear function in the measurement equation of the $i$-th landmark point at time $k$; $y_{k,i,lidar}=[d_{k,i}\ \theta_{k,i}\ \lambda_{k,i}]^T$ represents the measured value of the system at time $k$; $d_{k,i}$ denotes the distance between the centroid of the unmanned aerial vehicle and the $i$-th landmark point measured by the laser radar at time $k$; $\theta_{k,i}$ denotes the pitch angle from the centroid of the unmanned aerial vehicle to the $i$-th landmark point measured by the laser radar at time $k$; $\lambda_{k,i}$ denotes the azimuth angle from the centroid of the unmanned aerial vehicle to the $i$-th landmark point measured by the laser radar at time $k$; $u_{k,lidar}$ represents the laser radar measurement noise, set as Gaussian white noise; $R_{k,lidar}$ is the measurement noise covariance matrix;
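A minimal sketch of such a range/pitch/azimuth measurement function follows. For simplicity the angles are expressed directly in the world frame (attitude compensation, which the patent's full model would include, is omitted here):

```python
import numpy as np

def lidar_measurement(p_uav, landmark):
    """Range d, pitch angle theta and azimuth angle lam from the UAV centroid
    to one landmark, both given as 3-vectors in the world frame."""
    dx, dy, dz = landmark - p_uav
    d = np.sqrt(dx**2 + dy**2 + dz**2)   # range
    theta = np.arcsin(dz / d)            # pitch (elevation) angle
    lam = np.arctan2(dy, dx)             # azimuth angle
    return d, theta, lam
```

Linearizing this function with respect to the UAV position and the landmark position yields the Jacobian blocks used in the EKF measurement update.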
In step (3), a measurement model of the polarized light sensor is established: the angle between the longitudinal axis of the body and the solar meridian, as measured by the polarized light sensor, is selected as the measured value, and the observation model of the polarized light measurement data is as follows:
wherein $\varphi_{pol}$ is the angle between the longitudinal axis of the body and the solar meridian acquired by the polarized light sensor; $\alpha_s$ is the solar azimuth angle; $h_s$ is the solar altitude angle; $\delta$ is the solar declination; $\varphi$ is the latitude of the observation point; $t$ is the solar hour angle; $u_{k,polar}$ represents the measurement noise of the polarized light sensor, set as Gaussian white noise; $R_{k,polar}$ is the measurement noise covariance matrix;
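The solar geometry this observation model relies on can be sketched with the standard spherical-astronomy relations (sin h = sin φ sin δ + cos φ cos δ cos t, and the azimuth derived from it). The sign convention tying the measured polarization angle to the yaw angle is an assumption here:

```python
import numpy as np

def solar_altitude(lat, decl, hour_angle):
    """Solar altitude h: sin(h) = sin(lat)sin(decl) + cos(lat)cos(decl)cos(t)."""
    return np.arcsin(np.sin(lat) * np.sin(decl)
                     + np.cos(lat) * np.cos(decl) * np.cos(hour_angle))

def solar_azimuth(lat, decl, hour_angle):
    """Solar azimuth measured from north, via the standard relation
    cos(A) = (sin(decl) - sin(h)sin(lat)) / (cos(h)cos(lat))."""
    h = solar_altitude(lat, decl, hour_angle)
    cos_a = (np.sin(decl) - np.sin(h) * np.sin(lat)) / (np.cos(h) * np.cos(lat))
    a = np.arccos(np.clip(cos_a, -1.0, 1.0))
    # morning sun lies east of the meridian, afternoon sun west
    return a if hour_angle <= 0 else 2 * np.pi - a

def yaw_from_polarization(angle_to_meridian, lat, decl, hour_angle):
    """Recover yaw as solar azimuth minus the measured body-axis angle to the
    solar meridian (the sign convention is an assumption)."""
    return solar_azimuth(lat, decl, hour_angle) - angle_to_meridian
```

At solar noon (hour angle 0) and zero declination, an observer in the northern hemisphere sees the sun due south, which the azimuth function reproduces.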
In step (4), the system is initialized: the coordinate system of the unmanned aerial vehicle is transformed into the world coordinate system, the system state of the unmanned aerial vehicle at the initial moment is set to $x_0$ with initial covariance matrix $P_0$, and a global map is established with the starting position of the unmanned aerial vehicle as its origin;
In step (5), landmark points are matched: the data measured by the laser radar at time k are matched against the landmark points stored in the global map using the iterative closest point (ICP) algorithm; landmark points that match successfully are sent to their corresponding sub-filters for subsequent processing; for landmark points that fail to match, corresponding sub-filters are established in preparation for the next iteration; and the sub-filters corresponding to landmark points that can no longer be observed are deleted;
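As a simplified stand-in for the ICP-based association described above, a gated nearest-neighbour match illustrates the split into matched landmarks (routed to existing sub-filters) and unmatched ones (which spawn new sub-filters); the gate threshold is an assumption:

```python
import numpy as np

def match_landmarks(measured, map_points, gate=1.0):
    """Associate each measured point with its nearest map landmark.

    Returns (matches, unmatched): matches is a list of (meas_idx, map_idx)
    pairs within the gate distance; unmatched measured indices would spawn
    new sub-filters / map entries.
    """
    matches, unmatched = [], []
    for i, m in enumerate(measured):
        if len(map_points) == 0:
            unmatched.append(i)
            continue
        dists = np.linalg.norm(map_points - m, axis=1)
        j = int(np.argmin(dists))
        if dists[j] < gate:
            matches.append((i, j))
        else:
            unmatched.append(i)
    return matches, unmatched
```

Full ICP additionally estimates the rigid transform between the scan and the map before associating points; the gating logic, however, is the part that decides which sub-filters are kept, created, or deleted.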
In step (6), the position, velocity and attitude of the unmanned aerial vehicle and the positions of the landmark points are estimated by a distributed extended Kalman filter using the laser radar data and the polarized light sensor data. First, the estimated state and covariance matrix of each sub-filter at time k-1 are combined with the IMU output data at time k-1, and a one-step recursion yields the IMU-predicted position of the unmanned aerial vehicle and the state transition matrix. Then, starting from the predicted value, the laser sensor information and the polarized light information are added for the measurement update calculation: each sub-filter outputs a corrected value of the unmanned aerial vehicle's position and an updated value of the landmark point, together with the corresponding covariance matrix. The estimation result of each sub-filter is input into the main filter, the weight of each sub-filter in the main filter is determined from its covariance and normalized, and the main filter outputs the position, velocity and attitude of the unmanned aerial vehicle and the positions of the landmark points. The specific steps are as follows:
Time update:
Measurement update:
Weighted averaging of the main filter:
where $\hat{x}_k$ represents the system state estimate at time $k$; $P_k$ represents the covariance matrix at time $k$; $F_k$ represents the Jacobian matrix of the state equation; $H_k$ represents the Jacobian matrix of the observation equation; $\eta_i$ are the weighting coefficients; $P_g$ represents the covariance matrix of the system after weighted averaging; $\hat{x}_g$ represents the system state estimate after weighted averaging;
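The covariance-weighted combination in the main filter can be sketched as information-form fusion, where each sub-filter's weight is its inverse covariance and the weights are normalized by construction; this is a sketch of the principle, not the patent's exact weighting:

```python
import numpy as np

def federated_fusion(states, covs):
    """Fuse sub-filter estimates by inverse-covariance weighting:
    P_g = (sum_i P_i^-1)^-1,  x_g = P_g * sum_i P_i^-1 x_i.
    A sub-filter with a smaller covariance thus receives a larger weight."""
    infos = [np.linalg.inv(P) for P in covs]
    P_g = np.linalg.inv(sum(infos))
    x_g = P_g @ sum(I @ x for I, x in zip(infos, states))
    return x_g, P_g
```

Two equally confident sub-filters simply average, and the fused covariance is smaller than either input, reflecting the information gained by combining them.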
In step (7), the map is updated: for landmark points already present in the global map, the corresponding filtered landmark points are taken as the latest values; landmark points that were not successfully matched in step (5) are added directly to the global map;
Compared with the prior art, the invention has the following advantages:
The invention discloses a polarized light SLAM method based on extended Kalman filtering, belonging to the field of unmanned aerial vehicle autonomous navigation. By establishing a state model of the unmanned aerial vehicle and measurement models based on the laser radar sensor and the polarized light sensor, and by using the distributed EKF algorithm, the method determines the position of the unmanned aerial vehicle and constructs a map of the surrounding environment; it effectively exploits the complementarity of polarization information and laser radar information and their immunity to external interference, improving the accuracy of the unmanned aerial vehicle's position estimate. With the laser radar measurement information unchanged, the advantage that the polarized light measurement error does not accumulate over time is used to reduce the accumulation of yaw-angle error over long durations and to improve the accuracy of the global pose. The positioning and mapping accuracy of the unmanned aerial vehicle is higher, and its adaptability to unknown environments is enhanced.
Drawings
FIG. 1 is a flow chart of a polarized light SLAM method based on extended Kalman filtering;
fig. 2 is a diagram of a simulation experiment result of a polarized light SLAM method based on extended kalman filtering.
Detailed Description
The invention is further described with reference to the following figures and detailed description.
As shown in fig. 1, the method for polarized light SLAM based on extended kalman filter according to the present invention includes the following steps:
(1) selecting the attitude, the speed, the position and the landmark points of the unmanned aerial vehicle as a system state, and establishing a dynamic model of the unmanned aerial vehicle;
The attitude, velocity, position and landmark points of the unmanned aerial vehicle are selected as the system state, and a dynamic model of the unmanned aerial vehicle is established. A world coordinate system (the w system) is established with the starting position of the unmanned aerial vehicle as its origin: due north is the positive direction of its x axis, due west is the positive direction of its y axis, and the positive direction of its z axis is determined by the right-hand rule. A body coordinate system (the b system) is established with the center of the unmanned aerial vehicle body as its origin: the direction parallel to the longitudinal axis of the body and pointing to the nose is the positive direction of its x axis, the direction parallel to the transverse axis of the body and pointing to the left is the positive direction of its y axis, and the positive direction of its z axis is determined by the right-hand rule. The position, velocity and attitude of the unmanned aerial vehicle and the positions of the landmark points are selected as the system state. The system state transition equation is as follows:
wherein
$x_{k-1}$ represents the state of the system at time $k-1$; $p^w_{k-1}=[p^w_{x,k-1}\ p^w_{y,k-1}\ p^w_{z,k-1}]^T$ represents the coordinates of the unmanned aerial vehicle in the world coordinate system at time $k-1$, its components being the coordinates along the x, y and z axes; $v^w_{k-1}=[v^w_{x,k-1}\ v^w_{y,k-1}\ v^w_{z,k-1}]^T$ represents the velocity of the unmanned aerial vehicle in the world coordinate system at time $k-1$, its components being the velocities along the x, y and z axes; $\phi_{k-1}=[\gamma\ \theta\ \psi]^T$ represents the attitude of the unmanned aerial vehicle in Euler-angle form, the components being the roll, pitch and yaw angles respectively; $L^w_{k-1}=[L^w_{1,k-1}\ \cdots\ L^w_{m,k-1}]^T$ denotes the coordinates of the $m$ landmark points in the world coordinate system at time $k-1$, where $L^w_{i,k-1}$, $i=1,2,\dots,m$, denotes the coordinates of the $i$-th landmark point; $f^b_{k-1}$ is the specific-force acceleration in the body coordinate system at time $k-1$; $\omega^b_{k-1}$ is the angular velocity in the body coordinate system at time $k-1$; $g^w_{k-1}$ is the gravitational acceleration in the world coordinate system at time $k-1$; $\Delta t$ is the sampling time interval; $C^w_{b,k-1}$ is the transformation matrix from the body coordinate system to the world coordinate system at time $k-1$; $n_{k-1}$ is the system noise, set as Gaussian white noise; $Q_{k-1}$ is the system noise covariance matrix;
(2) establishing a measurement model of the laser radar;
Step (2) establishes a measurement model of the laser radar: the distances and angles measured by the laser radar are selected as measured values, and the measurement model of the landmark points given by the laser radar data is:
wherein
$h_{k,i}(\cdot)$, $i=1,2,\dots,m$, represents the nonlinear function in the measurement equation of the $i$-th landmark point at time $k$; $y_{k,i,lidar}=[d_{k,i}\ \theta_{k,i}\ \lambda_{k,i}]^T$ represents the measured value of the system at time $k$; $d_{k,i}$ denotes the distance between the centroid of the unmanned aerial vehicle and the $i$-th landmark point measured by the laser radar at time $k$; $\theta_{k,i}$ denotes the pitch angle from the centroid of the unmanned aerial vehicle to the $i$-th landmark point measured by the laser radar at time $k$; $\lambda_{k,i}$ denotes the azimuth angle from the centroid of the unmanned aerial vehicle to the $i$-th landmark point measured by the laser radar at time $k$; $u_{k,lidar}$ represents the laser radar measurement noise, set as Gaussian white noise; $R_{k,lidar}$ is the measurement noise covariance matrix;
(3) establishing a measurement model of the polarized light sensor;
Step (3) establishes a measurement model of the polarized light sensor: the angle between the longitudinal axis of the body and the solar meridian, as measured by the polarized light sensor, is selected as the measured value, and the observation model of the polarized light measurement data is as follows:
wherein $\varphi_{pol}$ is the angle between the longitudinal axis of the body and the solar meridian acquired by the polarized light sensor; $\alpha_s$ is the solar azimuth angle; $h_s$ is the solar altitude angle; $\delta$ is the solar declination; $\varphi$ is the latitude of the observation point; $t$ is the solar hour angle; $u_{k,polar}$ represents the measurement noise of the polarized light sensor, set as Gaussian white noise; $R_{k,polar}$ is the measurement noise covariance matrix;
(4) initializing a system and a map;
Step (4) is system initialization: the coordinate system of the unmanned aerial vehicle is transformed into the world coordinate system, the system state of the unmanned aerial vehicle at the initial moment is set to $x_0$ with initial covariance matrix $P_0$, and a global map is established with the starting position of the unmanned aerial vehicle as its origin;
(5) matching landmark points;
Step (5) is landmark point matching: the data measured by the laser radar at time k are matched against the landmark points stored in the global map using the iterative closest point (ICP) algorithm; landmark points that match successfully are sent to their corresponding sub-filters for subsequent processing; for landmark points that fail to match, corresponding sub-filters are established in preparation for the next iteration; and the sub-filters corresponding to landmark points that can no longer be observed are deleted;
(6) estimating the position, the speed, the attitude and the position of the landmark point of the unmanned aerial vehicle by using laser radar data and polarized light sensor data of the landmark point through a distributed extended Kalman filter;
The position, velocity and attitude of the unmanned aerial vehicle and the positions of the landmark points are estimated by a distributed extended Kalman filter using the laser radar data and the polarized light sensor data. First, the estimated state and covariance matrix of each sub-filter at time k-1 are combined with the IMU output data at time k-1, and a one-step recursion yields the IMU-predicted position of the unmanned aerial vehicle and the state transition matrix. Then, starting from the predicted value, the laser sensor information and the polarized light information are added for the measurement update calculation: each sub-filter outputs a corrected value of the unmanned aerial vehicle's position and an updated value of the landmark point, together with the corresponding covariance matrix. The estimation result of each sub-filter is input into the main filter, the weight of each sub-filter in the main filter is determined from its covariance and normalized, and the main filter outputs the position, velocity and attitude of the unmanned aerial vehicle and the positions of the landmark points. The specific steps are as follows:
Time update:
Measurement update:
Weighted averaging of the main filter:
where $\hat{x}_k$ represents the system state estimate at time $k$; $P_k$ represents the covariance matrix of the system at time $k$; $F_k$ represents the Jacobian matrix of the state equation; $H_k$ represents the Jacobian matrix of the observation equation; $\eta_i$ are the weighting coefficients; $P_g$ represents the covariance matrix of the system after weighted averaging; $\hat{x}_g$ represents the system state estimate after weighted averaging;
(7) updating the map;
Step (7) is map updating: for landmark points already present in the global map, the corresponding filtered landmark points are taken as the latest values; landmark points that were not successfully matched in step (5) are added directly to the global map;
as shown in fig. 2, the simulation experiment results show that the dot is the true pose of the drone, the triangle dot is the pose of the drone estimated by the proposed method, and the five-star dot is the set landmark point. Through many times of experiments, the average error of the unmanned aerial vehicle on the x axis is 0.1812 meters, the average error of the y axis is 0.2081 meters, the average error of the z axis is 0.2532 meters, and the average error of the position is 0.4323 meters.
Claims (5)
1. A polarized light SLAM method based on extended Kalman filtering is characterized in that: the method comprises the following steps:
(1) selecting the attitude, the speed, the position and the landmark points of the unmanned aerial vehicle as a system state, and establishing a dynamic model of the unmanned aerial vehicle;
(2) establishing a measurement model of the laser radar;
(3) establishing a measurement model of the polarized light sensor;
(4) initializing a system and a map;
(5) matching landmark points;
(6) estimating the position, the speed, the attitude and the position of the landmark point of the unmanned aerial vehicle by using laser radar data and polarized light sensor data of the landmark point through a distributed extended Kalman filter;
(7) updating the map;
The attitude, velocity, position and landmark points of the unmanned aerial vehicle are selected as the system state, and a dynamic model of the unmanned aerial vehicle is established. A world coordinate system (the w system) is established with the starting position of the unmanned aerial vehicle as its origin: due north is the positive direction of its x axis, due west is the positive direction of its y axis, and the positive direction of its z axis is determined by the right-hand rule. A body coordinate system (the b system) is established with the center of the unmanned aerial vehicle body as its origin: the direction parallel to the longitudinal axis of the body and pointing to the nose is the positive direction of its x axis, the direction parallel to the transverse axis of the body and pointing to the left is the positive direction of its y axis, and the positive direction of its z axis is determined by the right-hand rule. The position, velocity and attitude of the unmanned aerial vehicle and the positions of the landmark points are selected as the system state. The system state transition equation is as follows:
wherein
$x_{k-1}$ represents the state of the system at time $k-1$; $p^w_{k-1}=[p^w_{x,k-1}\ p^w_{y,k-1}\ p^w_{z,k-1}]^T$ represents the coordinates of the unmanned aerial vehicle in the world coordinate system at time $k-1$, its components being the coordinates along the x, y and z axes; $v^w_{k-1}=[v^w_{x,k-1}\ v^w_{y,k-1}\ v^w_{z,k-1}]^T$ represents the velocity of the unmanned aerial vehicle in the world coordinate system at time $k-1$, its components being the velocities along the x, y and z axes; $\phi_{k-1}=[\gamma\ \theta\ \psi]^T$ represents the attitude of the unmanned aerial vehicle in Euler-angle form, the components being the roll, pitch and yaw angles respectively; $L^w_{k-1}=[L^w_{1,k-1}\ \cdots\ L^w_{m,k-1}]^T$ denotes the coordinates of the $m$ landmark points in the world coordinate system at time $k-1$, where $L^w_{i,k-1}$, $i=1,2,\dots,m$, denotes the coordinates of the $i$-th landmark point; $f^b_{k-1}$ is the specific-force acceleration in the body coordinate system at time $k-1$; $\omega^b_{k-1}$ is the angular velocity in the body coordinate system at time $k-1$; $g^w_{k-1}$ is the gravitational acceleration in the world coordinate system at time $k-1$; $\Delta t$ is the sampling time interval; $C^w_{b,k-1}$ is the transformation matrix from the body coordinate system to the world coordinate system at time $k-1$; $n_{k-1}$ is the system noise, set as Gaussian white noise;
the step (2) is to establish a measurement model of the laser radar; selecting the distance and angle measured by the laser radar as measured values, and giving a measuring model of the landmark points according to the data of the laser radar:
wherein
$h_{k,i}(\cdot)$, $i=1,2,\dots,m$, represents the nonlinear function in the measurement equation of the $i$-th landmark point at time $k$; $y_{k,i,lidar}=[d_{k,i}\ \theta_{k,i}\ \lambda_{k,i}]^T$ represents the measured value of the system at time $k$; $d_{k,i}$ denotes the distance between the centroid of the unmanned aerial vehicle and the $i$-th landmark point measured by the laser radar at time $k$; $\theta_{k,i}$ denotes the pitch angle from the centroid of the unmanned aerial vehicle to the $i$-th landmark point measured by the laser radar at time $k$; $\lambda_{k,i}$ denotes the azimuth angle from the centroid of the unmanned aerial vehicle to the $i$-th landmark point measured by the laser radar at time $k$; $u_{k,lidar}$ represents the laser radar measurement noise, set as Gaussian white noise; $R_{k,lidar}$ is the measurement noise covariance matrix;
the step (3) is to establish the measurement model of the polarized light sensor: the angle between the longitudinal axis of the body and the solar meridian measured by the polarized light sensor is selected as the measured value, and the observation model of the polarized light measurement data is:
where φ is the angle between the longitudinal axis of the body and the solar meridian acquired by the polarized light sensor, α_s is the solar azimuth angle, h_s is the solar altitude angle, δ is the solar declination, L is the latitude of the observation point, t is the solar hour angle, and u_{k,polar} represents the measurement noise of the polarized light sensor, set as Gaussian white noise.
2. The polarized light SLAM method based on extended Kalman filtering of claim 1, characterized in that: the step (4) is system initialization: the body coordinate system of the unmanned aerial vehicle is aligned to the world coordinate system, the system state of the unmanned aerial vehicle at the initial time is given as X_0 with initial covariance matrix P_0, and a global map is established with the starting position of the unmanned aerial vehicle as the origin.
3. The polarized light SLAM method based on extended Kalman filtering of claim 1, characterized in that: the step (5) is landmark point matching: the data measured by the laser radar at time k are matched against the landmark points stored in the global map using the iterative closest point (ICP) algorithm; landmark points that match successfully are sent to their corresponding sub-filters for subsequent operation; for landmark points that fail to match, new sub-filters are established in preparation for the next iteration; and the sub-filters corresponding to landmark points that can no longer be observed are deleted.
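The association logic of step (5) can be sketched as a nearest-neighbour correspondence search, which is the matching core of ICP. The gate threshold and all names below are illustrative assumptions, not the patent's parameters.

```python
import numpy as np

def match_landmarks(scan_points, map_points, gate=1.0):
    """Match each lidar-derived point to the closest stored map landmark.

    Returns (matches, new_points): matches is a list of (scan_idx, map_idx)
    pairs sent to existing sub-filters; new_points lists scan indices with
    no landmark within `gate`, for which new sub-filters would be created.
    """
    matches, new_points = [], []
    for j, s in enumerate(scan_points):
        if len(map_points) == 0:
            new_points.append(j)
            continue
        dists = np.linalg.norm(map_points - s, axis=1)
        i = int(np.argmin(dists))
        if dists[i] < gate:
            matches.append((j, i))     # existing landmark: update its sub-filter
        else:
            new_points.append(j)       # unmatched: candidate new landmark
    return matches, new_points
```

A full ICP would iterate this correspondence step with a pose refinement; here only the association needed to route points to sub-filters is shown.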
4. The polarized light SLAM method based on extended Kalman filtering of claim 1, characterized in that: the step (6) estimates the position, velocity and attitude of the unmanned aerial vehicle and the positions of the landmark points through distributed extended Kalman filtering, using the laser radar data of the landmark points and the polarized light sensor data. First, the estimated state and covariance matrix of each sub-filter at time k-1, together with the IMU output data at time k-1, are propagated one step through the state transition matrix to obtain the IMU-predicted position of the unmanned aerial vehicle. Then, on the basis of this predicted value, the laser sensor information and the polarized light information are added for the measurement update calculation; each sub-filter outputs a corrected value of the unmanned aerial vehicle position and an updated value of its landmark point, together with the corresponding covariance matrix. The estimation result of each sub-filter is input to the main filter, the weight of each sub-filter in the main filter is determined according to its covariance and normalized, and the position, velocity and attitude of the unmanned aerial vehicle and the positions of the landmark points are output. Specifically:
time update:

X̂_{k|k-1} = f(X̂_{k-1}, u_{k-1})
P_{k|k-1} = F_k P_{k-1} F_k^T + Q_{k-1}

measurement update:

K_k = P_{k|k-1} H_k^T (H_k P_{k|k-1} H_k^T + R_k)^{-1}
X̂_k = X̂_{k|k-1} + K_k (y_k - h_k(X̂_{k|k-1}))
P_k = (I - K_k H_k) P_{k|k-1}

weighted averaging of the main filter:

X̄_k = Σ_i η_i X̂_{k,i},  P̄_k = Σ_i η_i P_{k,i},  Σ_i η_i = 1

where i = 1, 2, ..., m; X̂_k represents the system state estimate at time k; P_k represents the covariance matrix of the system at time k; F_k is the Jacobian matrix of the state equation; H_k is the Jacobian matrix of the observation equation; η_i are the weighting coefficients; P̄_k represents the covariance matrix of the system after weighted averaging; X̄_k represents the system state estimate after weighted averaging; and Q_{k-1} is the system noise covariance matrix.
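The measurement update and the main-filter fusion can be sketched as follows. The update is the standard EKF form; the trace-based weighting in fuse() is one plausible reading of "determined according to the covariance of each sub-filter" and is an assumption, not confirmed by the patent text.

```python
import numpy as np

def ekf_update(x_pred, P_pred, y, h, H, R):
    """Standard EKF measurement update for one sub-filter."""
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x_pred + K @ (y - h(x_pred))          # corrected state
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P

def fuse(states, covs):
    """Main-filter weighted averaging of sub-filter estimates.

    Weights are inversely proportional to the trace of each sub-filter
    covariance and normalised so that they sum to one.
    """
    w = np.array([1.0 / np.trace(P) for P in covs])
    eta = w / w.sum()
    x_bar = sum(e * x for e, x in zip(eta, states))
    P_bar = sum(e * P for e, P in zip(eta, covs))
    return x_bar, P_bar, eta
```

Two sub-filters with equal covariances receive equal weights, so the fused state is the simple average of their estimates.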
5. The polarized light SLAM method based on extended Kalman filtering of claim 1, characterized in that: the step (7) is map updating: for landmark points already present in the global map, the corresponding filtered landmark points are taken as the latest landmark points, and the landmark points that were not successfully matched in step (5) are added directly to the global map.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810128657.4A CN108387236B (en) | 2018-02-08 | 2018-02-08 | Polarized light SLAM method based on extended Kalman filtering |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108387236A CN108387236A (en) | 2018-08-10 |
CN108387236B true CN108387236B (en) | 2021-05-07 |
Family
ID=63075502
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810128657.4A Active CN108387236B (en) | 2018-02-08 | 2018-02-08 | Polarized light SLAM method based on extended Kalman filtering |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108387236B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020049468A1 (en) * | 2018-09-04 | 2020-03-12 | Mazdak Hooshang | Navigating a vehicle through a predefined path |
CN110779514B (en) * | 2019-10-28 | 2021-04-06 | 北京信息科技大学 | Hierarchical Kalman fusion method and device for auxiliary attitude determination of bionic polarization navigation |
CN114077245A (en) * | 2020-08-21 | 2022-02-22 | 苏州三六零机器人科技有限公司 | SLAM method and device for multiple data sources, sweeping robot and readable medium |
CN112697138B (en) * | 2020-12-07 | 2022-09-27 | 北方工业大学 | Bionic polarization synchronous positioning and composition method based on factor graph optimization |
CN113739795B (en) * | 2021-06-03 | 2023-10-20 | 东北电力大学 | Underwater synchronous positioning and mapping method based on polarized light/inertia/vision integrated navigation |
CN115235475B (en) * | 2022-09-23 | 2023-01-03 | 成都凯天电子股份有限公司 | MCC-based EKF-SLAM back-end navigation path optimization method |
CN115574816B (en) * | 2022-11-24 | 2023-03-14 | 东南大学 | Bionic vision multi-source information intelligent perception unmanned platform |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2981448A1 (en) * | 2011-10-18 | 2013-04-19 | Floch Albert Le | DEVICE FOR DETECTING THE DIRECTION OF A CACHE SUN, BASED ON A BIREFRINGENT DEPOLARIZER |
CN105606104A (en) * | 2016-03-17 | 2016-05-25 | 北京工业大学 | Robot autonomous navigation method based on heading-assisting distributed type SLAM (Simultaneous Localization and Mapping) |
CN105737832A (en) * | 2016-03-22 | 2016-07-06 | 北京工业大学 | Distributed SLAM (simultaneous localization and mapping) method on basis of global optimal data fusion |
CN106197432A (en) * | 2016-08-30 | 2016-12-07 | 北京航空航天大学 | A kind of UAV Landing method based on FastSLAM algorithm |
CN106886030A (en) * | 2017-03-24 | 2017-06-23 | 黑龙江硅智机器人有限公司 | It is applied to the synchronous mode map structuring and alignment system and method for service robot |
Non-Patent Citations (1)
Title |
---|
Research on SLAM Methods Based on Sky Polarized Light (基于天空偏振光的SLAM方法的研究); Wang Daobin; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2015-03-15; pp. 24-25, 75-89 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108362288B (en) | Polarized light SLAM method based on unscented Kalman filtering | |
CN108387236B (en) | Polarized light SLAM method based on extended Kalman filtering | |
CN108731670B (en) | Inertial/visual odometer integrated navigation positioning method based on measurement model optimization | |
CN106840179B (en) | Intelligent vehicle positioning method based on multi-sensor information fusion | |
CN110243358B (en) | Multi-source fusion unmanned vehicle indoor and outdoor positioning method and system | |
CN112697138B (en) | Bionic polarization synchronous positioning and composition method based on factor graph optimization | |
CN104075715B (en) | A kind of underwater navigation localization method of Combining with terrain and environmental characteristic | |
CN104061899B (en) | A kind of vehicle side inclination angle based on Kalman filtering and angle of pitch method of estimation | |
CN104655131B (en) | Inertial navigation Initial Alignment Method based on ISTSSRCKF | |
CN101846734B (en) | Agricultural machinery navigation and position method and system and agricultural machinery industrial personal computer | |
CN107015238A (en) | Unmanned vehicle autonomic positioning method based on three-dimensional laser radar | |
Mu et al. | End-to-end navigation for autonomous underwater vehicle with hybrid recurrent neural networks | |
WO2014042710A2 (en) | Pose estimation | |
CN112798021B (en) | Inertial navigation system inter-travelling initial alignment method based on laser Doppler velocimeter | |
CN105333869A (en) | Unmanned reconnaissance aerial vehicle synchronous positioning and picture compositing method based on self-adaption EKF | |
CN112923931A (en) | Feature map matching and GPS positioning information fusion method based on fixed route | |
CN115574816B (en) | Bionic vision multi-source information intelligent perception unmanned platform | |
CN111025229B (en) | Underwater robot pure orientation target estimation method | |
CN115200578A (en) | Polynomial optimization-based inertial-based navigation information fusion method and system | |
CN113503872B (en) | Low-speed unmanned aerial vehicle positioning method based on fusion of camera and consumption-level IMU | |
CN109387198A (en) | A kind of inertia based on sequential detection/visual odometry Combinated navigation method | |
CN111220151B (en) | Inertia and milemeter combined navigation method considering temperature model under load system | |
CN112325878A (en) | Ground carrier combined navigation method based on UKF and air unmanned aerial vehicle node assistance | |
CN111693044A (en) | Fusion positioning method | |
CN111256708A (en) | Vehicle-mounted integrated navigation method based on radio frequency identification |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||