CN108387236B - Polarized light SLAM method based on extended Kalman filtering - Google Patents

Polarized light SLAM method based on extended Kalman filtering Download PDF

Info

Publication number
CN108387236B
CN108387236B CN201810128657.4A CN201810128657A CN108387236B CN 108387236 B CN108387236 B CN 108387236B CN 201810128657 A CN201810128657 A CN 201810128657A CN 108387236 B CN108387236 B CN 108387236B
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
coordinate system
polarized light
moment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810128657.4A
Other languages
Chinese (zh)
Other versions
CN108387236A (en
Inventor
杜涛
白鹏飞
郭雷
王华锋
刘万泉
王月海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
North China University of Technology
Beihang University
Original Assignee
North China University of Technology
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by North China University of Technology, Beihang University filed Critical North China University of Technology
Priority to CN201810128657.4A priority Critical patent/CN108387236B/en
Publication of CN108387236A publication Critical patent/CN108387236A/en
Application granted granted Critical
Publication of CN108387236B publication Critical patent/CN108387236B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/16Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Algebra (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a polarized light SLAM method based on extended Kalman filtering, and belongs to the field of unmanned aerial vehicle autonomous navigation. The method realizes the determination of the position of the unmanned aerial vehicle and the construction of a surrounding environment map by establishing a state model of the unmanned aerial vehicle and a measurement model based on a laser radar sensor and a polarized light sensor and by means of distributed extended Carl filtering, namely an EKF algorithm, and improves the accuracy of the simultaneous positioning and composition of the unmanned aerial vehicle by utilizing the characteristics of matching and complementation of polarized light information and laser radar information and no interference from other outside.

Description

Polarized light SLAM method based on extended Kalman filtering
Technical Field
The invention relates to the field of unmanned aerial vehicle autonomous navigation, in particular to a polarized light SLAM method based on extended Kalman filtering, which aims to solve the problem of how to determine the position of an unmanned aerial vehicle and sense the external environment.
Background
SLAM is an abbreviation for Simultaneous Localization and Mapping, meaning "Simultaneous Localization and Mapping". The method is a process of calculating the position of a moving object and constructing an environment map according to the information of a sensor. At present, SLAM technology has been applied to the fields of unmanned aerial vehicles, unmanned driving, robots, AR, smart home, and the like.
SLAM research focuses on minimizing noise at the moving body pose and landmark points of a map using filter theory, and typically uses the input of a odometer as input to the prediction process and the output of a laser sensor as input to the update process. Extended Kalman Filter (EKF) filtering algorithm is the filtering algorithm used by most current SLAM methods. In recent years, unmanned aerial vehicles develop very rapidly, the SLAM is used for the unmanned aerial vehicles, the problems that the positions of the unmanned aerial vehicles are uncertain, and position errors are accumulated in dead reckoning can be solved, the characteristics in the environment are repeatedly detected by using sensors carried by the unmanned aerial vehicles, so that the correction of the positions of the unmanned aerial vehicles and the positions of the characteristics is completed, meanwhile, an environment map is constructed, the construction of the position information of the unmanned aerial vehicles and the surrounding environment map can be completed without predicting map information or depending on external auxiliary equipment, but the positioning accuracy and the composition accuracy of the SLAM scheme are not high, and the robustness is poor.
In recent years, polarized light is increasingly researched, in 1871, a famous physicist Rayleigh proposes Rayleigh scattering law, and reveals light scattering characteristics, and then people obtain a whole-day-domain atmospheric polarization distribution mode based on the Rayleigh scattering law. The atmospheric polarization distribution mode is relatively stable, rich navigation information is contained in the atmospheric polarization distribution mode, and many living beings in nature can utilize sky polarized light to navigate or assist in navigation. The polarization navigation mechanism is a very effective navigation means, has the characteristics of being passive, free of radiation, good in concealment and the like, and can provide a new solution for navigation tasks in complex environments.
The polarized light is used on the SLAM of the unmanned aerial vehicle, so that the accuracy of determining the position of the unmanned aerial vehicle and constructing a map of the surrounding environment can be improved, and the problem of low robustness is solved.
Disclosure of Invention
The invention mainly solves the problems that: how to apply polarization information to unmanned aerial vehicle SLAM in order to solve unmanned aerial vehicle location simultaneously and to confirm the problem difficult, that environmental suitability is poor, the composition is not accurate enough in the composition that exists.
The technical scheme adopted by the invention is as follows: a polarized light SLAM method based on extended Kalman filtering comprises the following steps:
(1) selecting the attitude, the speed, the position and the landmark points of the unmanned aerial vehicle as a system state, and establishing a dynamic model of the unmanned aerial vehicle;
(2) establishing a measurement model of the laser radar;
(3) establishing a measurement model of the polarized light sensor;
(4) initializing a system and a map;
(5) matching road sign points;
(6) estimating the position, the speed, the attitude and the position of the landmark point of the unmanned aerial vehicle by using laser radar data and polarized light sensor data of the landmark point through a distributed extended Kalman filter;
(7) updating the map;
selecting the attitude, the speed, the position and the landmark points of the unmanned aerial vehicle as system states, and establishing a dynamic model of the unmanned aerial vehicle; the method comprises the following steps that a world coordinate system with an unmanned aerial vehicle starting position as an origin, namely a w system, the positive north direction is the positive direction of an x axis of the world coordinate system, the positive west direction is the positive direction of a y axis of the world coordinate system, and the positive direction of a z axis of the world coordinate system is determined according to a right-hand rule; establishing a body coordinate system with the center of the body of the unmanned aerial vehicle as an original point, namely a system b, taking the direction parallel to the longitudinal axis of the body and pointing to the head as the positive direction of the x axis of the body coordinate system, taking the direction parallel to the transverse axis of the body and pointing to the left as the positive direction of the y axis of the body coordinate system, and determining the positive direction of the z axis of the body coordinate system according to the right hand rule; selecting the position, speed and attitude of the unmanned aerial vehicle and the position of a landmark point as a system state; the system state transition equation is as follows:
Figure BDA0001574277440000021
wherein
Figure BDA0001574277440000022
Figure BDA0001574277440000023
Representing the state of the system at time k-1,
Figure BDA0001574277440000024
representing the coordinates of the unmanned plane at the moment k-1 in the world coordinate system,
Figure BDA0001574277440000025
respectively representing the coordinates of the unmanned aerial vehicle in the directions of x, y and z axes under a world coordinate system at the moment of k-1,
Figure BDA0001574277440000026
representing the speed of the unmanned plane at the moment k-1 in the world coordinate system,
Figure BDA0001574277440000027
respectively represents the speed of the unmanned plane pointing to the directions of the three axes of x, y and z under the world coordinate system at the moment of k-1,
Figure BDA0001574277440000031
representing the pose of the drone, in the form of euler angles,
Figure BDA0001574277440000032
respectively a roll angle, a pitch angle and a yaw angle,
Figure BDA0001574277440000033
the coordinates of m landmark points in the world coordinate system at the moment of k-1 are shown,
Figure BDA0001574277440000034
it is indicated that the (i) th,i1, 2., the coordinates of the m landmark points in the world coordinate system,
Figure BDA0001574277440000035
is the specific force acceleration under the coordinate system of the machine body at the moment of k-1,
Figure BDA0001574277440000036
is the angular velocity under the coordinate system of the machine body at the moment of k-1,
Figure BDA0001574277440000037
is the weight acceleration under the world coordinate at the moment of k-1, delta t is the sampling time interval,
Figure BDA0001574277440000038
for the transformation matrix from the body coordinate system to the world coordinate system at time k-1, nk-1For system noise, the noise is set to white Gaussian noise, Qk-1Is a system noise covariance matrix;
wherein, the step (2) establishes a measurement model of the laser radar; selecting the distance and angle measured by the laser radar as measured values, and giving a measuring model of the landmark points according to the data of the laser radar:
Figure BDA0001574277440000039
wherein
Figure BDA00015742774400000310
Figure BDA00015742774400000311
Figure BDA00015742774400000312
i 1, 2.. m represents a non-linear function in the measurement equation of the ith landmark point at the time k, and yk,i,lidar=[dk,i θk,i λk,i]TRepresenting the measured value of the system at time k, dk,iIndicate the distance theta between the unmanned aerial vehicle centroid measured by the laser radar at the moment k and the ith landmark pointk,iIndicate unmanned aerial vehicle barycenter and ith road mark point's that laser radar measured at moment k range angle of pitch, lambdak,iIndicating the azimuth angle u of the centroid and ith landmark point of the unmanned aerial vehicle measured by the laser radar at the moment kk,lidarRepresenting the lidar measurement noise, the noise is set to white Gaussian noise, Rk,lidarMeasuring a noise covariance matrix;
wherein, the step (3) establishes a measurement model of the polarized light sensor: selecting the included angle between the longitudinal axis of the machine body and the solar meridian measured by the polarized light sensor as a measured value, wherein an observation model of the measured data given by the polarized light is as follows:
Figure BDA0001574277440000041
Figure BDA0001574277440000042
wherein
Figure BDA0001574277440000043
Is the included angle between the long axis of the machine body and the solar meridian acquired by the polarized light sensor,
Figure BDA0001574277440000044
the azimuth angle of the sun is taken as the azimuth angle,
Figure BDA0001574277440000045
is the altitude angle of the sun,
Figure BDA0001574277440000046
the number of the solar declination is,
Figure BDA0001574277440000047
as to the latitude of the observation point,
Figure BDA0001574277440000048
is the angle of the sun's time uk,polarRepresenting the measurement noise of the polarized light sensor, the noise being set to white Gaussian noise, Rk,polarMeasuring a noise covariance matrix;
wherein, the step (4) is system initialization; converting the coordinate system of the unmanned aerial vehicle into a world coordinate system and giving the unmanned aerial vehicle a system state at an initial moment
Figure BDA00015742774400000413
The initial covariance matrix is P0Establishing a global map with the starting position of the unmanned aerial vehicle as an origin;
wherein, the step (5) road sign point matching; matching the data measured by the laser radar at the time k with the landmark points stored in the global map by using an Iterative Closest Point (ICP) algorithm, sending the landmark points to corresponding sub-filters for subsequent operation for the landmark points successfully matched, establishing corresponding sub-filters for the landmark points failed in matching, preparing for next iteration, and deleting the corresponding sub-filters for the landmark points which cannot be observed;
the position, the speed, the attitude and the position of the landmark point of the unmanned aerial vehicle are estimated by using the laser radar data and the polarized light sensor data of the landmark point through a distributed extended Kalman filter in the step (6); firstly, the estimation state and covariance matrix of each sub-filter at the k-1 moment
Figure BDA0001574277440000049
And combining the output data of the IMU at the moment of k-1, performing one-step recursion to obtain the position of the robot deduced by the IMU and the state transition matrix, then adding laser sensor information and polarized light information according to a predicted value to perform measurement updating calculation, outputting a corrected value of each sub-filter to the position of the unmanned aerial vehicle and an updated value of the road mark point, simultaneously outputting a corresponding covariance matrix, inputting the estimation result of each sub-filter into a main filter, determining the proportion of each sub-filter in the main filter according to the covariance of each sub-filter, and normalizing, thereby outputting the position, the speed, the attitude and the position of the road mark point of the unmanned aerial vehicle. The method comprises the following specific steps:
and (3) time updating:
Figure BDA00015742774400000410
Figure BDA00015742774400000411
Figure BDA00015742774400000412
measurement updating:
Figure BDA0001574277440000051
Figure BDA0001574277440000052
Figure BDA0001574277440000053
Figure BDA0001574277440000054
Figure BDA0001574277440000055
Figure BDA0001574277440000056
weighted averaging of the main filter:
Figure BDA0001574277440000057
Figure BDA0001574277440000058
Figure BDA0001574277440000059
wherein
Figure BDA00015742774400000510
Representing the system state estimate at the time of the k-th instant,
Figure BDA00015742774400000511
represents the covariance matrix at time k, FkJacobian matrix, H, representing the equation of statekJacobian matrix, η, representing an observation equationiIn order to be the weighting coefficients,
Figure BDA00015742774400000512
represents the covariance matrix of the system after weighted averaging,
Figure BDA00015742774400000513
representing the system state estimation after weighted average;
wherein the step (7) of map updating; regarding the existing landmark points in the global map, taking the corresponding filtered landmark points as the latest landmark points, and directly adding the landmark points which are not successfully matched in the step (5) into the global map;
compared with the prior art, the invention has the advantages that:
the invention discloses a polarized light SLAM method based on extended Kalman filtering, and belongs to the field of unmanned aerial vehicle autonomous navigation. According to the method, the unmanned aerial vehicle state model and the measurement model based on the laser radar sensor and the polarized light sensor are established, the distributed EKF algorithm is utilized to determine the position of the unmanned aerial vehicle and construct the surrounding environment map, the characteristics that the polarization information and the laser radar information are complementary and are not interfered by other external factors are effectively utilized, and the precision of the unmanned aerial vehicle position estimation is improved. Under the condition of keeping the measurement information of the laser radar unchanged, the advantage that the measurement error of the polarized light is not accumulated along with time is utilized, the accumulation of the error of the yaw angle under the long-time condition is reduced, and the accuracy of the global pose is improved. The positioning and composition precision of the unmanned aerial vehicle in the position environment is higher, and the adaptability to unknown environment is enhanced.
Drawings
FIG. 1 is a flow chart of a polarized light SLAM method based on extended Kalman filtering;
fig. 2 is a diagram of a simulation experiment result of a polarized light SLAM method based on extended kalman filtering.
Detailed Description
The invention is further described with reference to the following figures and detailed description.
As shown in fig. 1, the method for polarized light SLAM based on extended kalman filter according to the present invention includes the following steps:
(1) selecting the attitude, the speed, the position and the landmark points of the unmanned aerial vehicle as a system state, and establishing a dynamic model of the unmanned aerial vehicle;
selecting the attitude, the speed, the position and the landmark points of the unmanned aerial vehicle as system states, and establishing a dynamic model of the unmanned aerial vehicle; the method comprises the following steps that a world coordinate system with an unmanned aerial vehicle starting position as an origin, namely a w system, the positive north direction is the positive direction of an x axis of the world coordinate system, the positive west direction is the positive direction of a y axis of the world coordinate system, and the positive direction of a z axis of the world coordinate system is determined according to a right-hand rule; establishing a body coordinate system with the center of the body of the unmanned aerial vehicle as an original point, namely a system b, taking the direction parallel to the longitudinal axis of the body and pointing to the head as the positive direction of the x axis of the body coordinate system, taking the direction parallel to the transverse axis of the body and pointing to the left as the positive direction of the y axis of the body coordinate system, and determining the positive direction of the z axis of the body coordinate system according to the right hand rule; selecting the position, speed and attitude of the unmanned aerial vehicle and the position of a landmark point as a system state; the system state transition equation is as follows:
Figure BDA0001574277440000061
wherein
Figure BDA0001574277440000062
Figure BDA0001574277440000063
Representing the state of the system at time k-1,
Figure BDA0001574277440000064
representing the coordinates of the unmanned plane at the moment k-1 in the world coordinate system,
Figure BDA0001574277440000065
respectively representing the coordinates of the unmanned aerial vehicle in the directions of x, y and z axes under a world coordinate system at the moment of k-1,
Figure BDA0001574277440000066
representing the speed of the unmanned plane at the moment k-1 in the world coordinate system,
Figure BDA0001574277440000067
respectively represents the speed of the unmanned plane pointing to the directions of the three axes of x, y and z under the world coordinate system at the moment of k-1,
Figure BDA0001574277440000068
representing the pose of the drone, in the form of euler angles,
Figure BDA0001574277440000069
respectively a roll angle, a pitch angle and a yaw angle,
Figure BDA00015742774400000610
the coordinates of m landmark points in the world coordinate system at the moment of k-1 are shown,
Figure BDA00015742774400000611
denotes the coordinates of the i, i-th 1, 2., m landmark points in the world coordinate system,
Figure BDA00015742774400000612
is the specific force acceleration under the coordinate system of the machine body at the moment of k-1,
Figure BDA00015742774400000613
is the angular velocity under the coordinate system of the machine body at the moment of k-1,
Figure BDA00015742774400000614
is the weight acceleration under the world coordinate at the moment of k-1, delta t is the sampling time interval,
Figure BDA0001574277440000071
for the transformation matrix from the body coordinate system to the world coordinate system at time k-1, nk-1For system noise, the noise is set to white Gaussian noise, Qk-1Is a system noise covariance matrix;
(2) establishing a measurement model of the laser radar;
the step (2) is to establish a measurement model of the laser radar; selecting the distance and angle measured by the laser radar as measured values, and giving a measuring model of the landmark points according to the data of the laser radar:
Figure BDA0001574277440000072
wherein
Figure BDA0001574277440000073
Figure BDA0001574277440000074
Figure BDA0001574277440000075
i 1, 2.. m represents a non-linear function in the measurement equation of the ith landmark point at the time k, and yk,i,lidar=[dk,i θk,i λk,i]TRepresenting the measured value of the system at time k, dk,iUnmanned aerial vehicle mass center and second mass center representing laser radar measurement at k momentDistance of i waypoints, θk,iIndicate unmanned aerial vehicle barycenter and ith road mark point's that laser radar measured at moment k range angle of pitch, lambdak,iIndicating the azimuth angle u of the centroid and ith landmark point of the unmanned aerial vehicle measured by the laser radar at the moment kk,lidarRepresenting the lidar measurement noise, the noise is set to white Gaussian noise, Rk,lidarMeasuring a noise covariance matrix;
(3) establishing a measurement model of the polarized light sensor;
the step (3) is to establish a measurement model of the polarized light sensor: selecting the included angle between the longitudinal axis of the machine body and the solar meridian measured by the polarized light sensor as a measured value, wherein an observation model of the measured data given by the polarized light is as follows:
Figure BDA0001574277440000076
Figure BDA0001574277440000081
wherein
Figure BDA0001574277440000082
Is the included angle between the long axis of the machine body and the solar meridian acquired by the polarized light sensor,
Figure BDA0001574277440000083
the azimuth angle of the sun is taken as the azimuth angle,
Figure BDA0001574277440000084
is the altitude angle of the sun,
Figure BDA0001574277440000085
the number of the solar declination is,
Figure BDA0001574277440000086
is the latitude of the observation point, t is the solar time angle, uk,polarRepresenting the measurement noise of the polarized light sensor, the noise being set to white Gaussian noise, Rk,polarFor measuring noiseAn acoustic covariance matrix;
(4) initializing a system and a map;
the step (4) is system initialization; converting the coordinate system of the unmanned aerial vehicle into a world coordinate system and giving the unmanned aerial vehicle a system state at an initial moment
Figure BDA00015742774400000811
Initial covariance matrix P0Establishing a global map with the starting position of the unmanned aerial vehicle as an origin;
(5) matching road sign points;
the step (5) of road sign point matching; matching the data measured by the laser radar at the time k with the landmark points stored in the global map by using an Iterative Closest Point (ICP) algorithm, sending the landmark points to corresponding sub-filters for subsequent operation for the landmark points successfully matched, establishing corresponding sub-filters for the landmark points failed in matching, preparing for next iteration, and deleting the corresponding sub-filters for the landmark points which cannot be observed;
(6) estimating the position, the speed, the attitude and the position of the landmark point of the unmanned aerial vehicle by using laser radar data and polarized light sensor data of the landmark point through a distributed extended Kalman filter;
estimating the position, the speed and the attitude of the unmanned aerial vehicle and the position of the landmark point by using the laser radar data and the polarized light sensor data of the landmark point through a distributed extended Kalman filter; firstly, the estimation state and covariance matrix of each sub-filter at the k-1 moment
Figure BDA0001574277440000087
Combining the output data of the IMU at the moment of k-1, performing one-step recursion to obtain the robot position calculated by the IMU and the state transition matrix, then adding laser sensor information and polarized light information to perform measurement updating calculation according to a predicted value, outputting a corrected value of each sub-filter to the position of the unmanned aerial vehicle and an updated value of a landmark point, simultaneously outputting a corresponding covariance matrix, and inputting the estimation result of each sub-filter to a main filterIn the device, the proportion of each sub-filter in the main filter is determined according to the covariance of each sub-filter and normalized, so that the position, the speed, the attitude and the position of the landmark point of the unmanned aerial vehicle are output. The method comprises the following specific steps:
and (3) time updating:
Figure BDA0001574277440000088
Figure BDA0001574277440000089
Figure BDA00015742774400000810
measurement updating:
Figure BDA0001574277440000091
Figure BDA0001574277440000092
Figure BDA0001574277440000093
Figure BDA0001574277440000094
Figure BDA0001574277440000095
Figure BDA0001574277440000096
weighted averaging of the main filter:
Figure BDA0001574277440000097
Figure BDA0001574277440000098
Figure BDA0001574277440000099
wherein
Figure BDA00015742774400000910
Representing the system state estimate at the time of the k-th instant,
Figure BDA00015742774400000911
representing the covariance matrix of the system at time k, FkJacobian matrix, H, representing the equation of statekJacobian matrix, η, representing an observation equationiIn order to be the weighting coefficients,
Figure BDA00015742774400000912
represents the covariance matrix of the system after weighted averaging,
Figure BDA00015742774400000913
representing the system state estimation after weighted average;
(7) updating the map;
the step (7) of map updating; regarding the existing landmark points in the global map, taking the corresponding filtered landmark points as the latest landmark points, and directly adding the landmark points which are not successfully matched in the step (5) into the global map;
as shown in fig. 2, the simulation experiment results show that the dot is the true pose of the drone, the triangle dot is the pose of the drone estimated by the proposed method, and the five-star dot is the set landmark point. Through many times of experiments, the average error of the unmanned aerial vehicle on the x axis is 0.1812 meters, the average error of the y axis is 0.2081 meters, the average error of the z axis is 0.2532 meters, and the average error of the position is 0.4323 meters.

Claims (5)

1. A polarized light SLAM method based on extended Kalman filtering is characterized in that: the method comprises the following steps:
(1) selecting the attitude, the speed, the position and the landmark points of the unmanned aerial vehicle as a system state, and establishing a dynamic model of the unmanned aerial vehicle;
(2) establishing a measurement model of the laser radar;
(3) establishing a measurement model of the polarized light sensor;
(4) initializing a system and a map;
(5) matching road sign points;
(6) estimating the position, the speed, the attitude and the position of the landmark point of the unmanned aerial vehicle by using laser radar data and polarized light sensor data of the landmark point through a distributed extended Kalman filter;
(7) updating the map;
selecting the attitude, the speed, the position and the landmark points of the unmanned aerial vehicle as system states, and establishing a dynamic model of the unmanned aerial vehicle; the method comprises the following steps that a world coordinate system with an unmanned aerial vehicle starting position as an origin, namely a w system, the positive north direction is the positive direction of an x axis of the world coordinate system, the positive west direction is the positive direction of a y axis of the world coordinate system, and the positive direction of a z axis of the world coordinate system is determined according to a right-hand rule; establishing a body coordinate system with the center of the body of the unmanned aerial vehicle as an original point, namely a system b, taking the direction parallel to the longitudinal axis of the body and pointing to the head as the positive direction of the x axis of the body coordinate system, taking the direction parallel to the transverse axis of the body and pointing to the left as the positive direction of the y axis of the body coordinate system, and determining the positive direction of the z axis of the body coordinate system according to the right hand rule; selecting the position, speed and attitude of the unmanned aerial vehicle and the position of a landmark point as a system state; the system state transition equation is as follows:
Figure FDA0002960311650000011
wherein
Figure FDA0002960311650000012
Figure FDA0002960311650000013
Representing the state of the system at time k-1,
Figure FDA0002960311650000014
representing the coordinates of the unmanned plane at the moment k-1 in the world coordinate system,
Figure FDA0002960311650000015
respectively representing the coordinates of the unmanned aerial vehicle in the directions of x, y and z axes under a world coordinate system at the moment of k-1,
Figure FDA0002960311650000016
representing the speed of the unmanned plane at the moment k-1 in the world coordinate system,
Figure FDA0002960311650000017
respectively represents the speed of the unmanned plane pointing to the directions of the three axes of x, y and z under the world coordinate system at the moment of k-1,
Figure FDA0002960311650000018
representing the pose of the drone, in the form of euler angles,
Figure FDA0002960311650000019
respectively a roll angle, a pitch angle and a yaw angle,
Figure FDA0002960311650000021
the coordinates of m landmark points in the world coordinate system at the moment of k-1 are shown,
Figure FDA0002960311650000022
denotes the coordinates of the ith landmark point in the world coordinate system, where i is 1, 2.. multidot.m,
Figure FDA0002960311650000023
is the specific force acceleration under the coordinate system of the machine body at the moment of k-1,
Figure FDA0002960311650000024
is the angular velocity under the coordinate system of the machine body at the moment of k-1,
Figure FDA0002960311650000025
is the gravitational acceleration under the world coordinate at the moment of k-1, delta t is the sampling time interval,
Figure FDA0002960311650000026
for the transformation matrix from the body coordinate system to the world coordinate system at time k-1, nk-1The noise is set as white Gaussian noise as system noise;
the step (2) is to establish a measurement model of the laser radar; selecting the distance and angle measured by the laser radar as measured values, and giving a measuring model of the landmark points according to the data of the laser radar:
Figure FDA0002960311650000027
wherein
Figure FDA0002960311650000028
Figure FDA0002960311650000029
Figure FDA00029603116500000210
i 1, 2.. m represents a non-linear function in the measurement equation of the ith landmark point at the time k, and yk,lidar=[dk,i θk,i λk,i]TIndicating the time of kMeasured value of the system, dk,iIndicate the distance theta between the unmanned aerial vehicle centroid measured by the laser radar at the moment k and the ith landmark pointk,iIndicate unmanned aerial vehicle barycenter and ith road mark point's that laser radar measured at moment k range angle of pitch, lambdak,iIndicating the azimuth angle u of the centroid and ith landmark point of the unmanned aerial vehicle measured by the laser radar at the moment kk,lidarRepresenting the lidar measurement noise, the noise is set to white Gaussian noise, Rk,lidarMeasuring a noise covariance matrix;
the step (3) is to establish a measurement model of the polarized light sensor: selecting the included angle between the longitudinal axis of the machine body and the solar meridian measured by the polarized light sensor as a measured value, wherein an observation model of the measured data given by the polarized light is as follows:
Figure FDA0002960311650000031
Figure FDA0002960311650000032
wherein
Figure FDA0002960311650000033
Is the included angle between the long axis of the machine body and the solar meridian acquired by the polarized light sensor,
Figure FDA0002960311650000034
the azimuth angle of the sun is taken as the azimuth angle,
Figure FDA0002960311650000035
is the altitude angle of the sun,
Figure FDA0002960311650000036
the number of the solar declination is,
Figure FDA0002960311650000037
as to the latitude of the observation point,
Figure FDA0002960311650000038
is the angle of the sun's time uk,polarThe measurement noise of the polarized light sensor is represented, and the noise is set to be white Gaussian noise.
2. The polarized light SLAM method based on extended Kalman filtering of claim 1, characterized in that: the step (4) is system initialization; converting the coordinate system of the unmanned aerial vehicle into a world coordinate system and giving the unmanned aerial vehicle a system state at an initial moment
Figure FDA0002960311650000039
Initial covariance matrix of P0And establishing a global map with the starting position of the unmanned aerial vehicle as the origin.
3. The polarized light SLAM method based on extended Kalman filtering of claim 1, characterized in that: the step (5) of road sign point matching; and matching the data measured by the laser radar at the time k with the landmark points stored in the global map by using an iterative closest point algorithm, sending the landmark points to corresponding sub-filters for subsequent operation for the landmark points successfully matched, establishing corresponding sub-filters for the landmark points failed in matching, preparing for next iteration, and deleting the corresponding sub-filters for the landmark points which cannot be observed.
4. The polarized light SLAM method based on extended Kalman filtering of claim 1, characterized in that: estimating the position, the speed and the attitude of the unmanned aerial vehicle and the position of the landmark point by using the laser radar data and the polarized light sensor data of the landmark point through a distributed extended Kalman filter; firstly, the estimation state and covariance matrix of each sub-filter at the k-1 moment
Figure FDA00029603116500000310
Combining IMU outputs at time k-1Data, a position of the unmanned aerial vehicle calculated by the IMU and the state transition matrix is obtained by one-step recursion, then, according to a predicted value, the laser sensor information and the polarized light information are added for measurement and update calculation, a corrected value of each sub-filter to the position of the unmanned aerial vehicle and an updated value of a road mark point are output, a corresponding covariance matrix is simultaneously output, an estimation result of each sub-filter is input into a main filter, the proportion of each sub-filter in the main filter is determined according to the covariance of each sub-filter, normalization is carried out, and therefore the position, the speed, the attitude and the position of the road mark point of the unmanned aerial vehicle are output, and the method specifically comprises the following steps:
and (3) time updating:
Figure FDA00029603116500000311
Figure FDA00029603116500000312
Figure FDA00029603116500000313
measurement updating:
Figure FDA0002960311650000041
Figure FDA0002960311650000042
Figure FDA0002960311650000043
Figure FDA0002960311650000044
Figure FDA0002960311650000045
Figure FDA0002960311650000046
weighted averaging of the main filter:
Figure FDA0002960311650000047
Figure FDA0002960311650000048
Figure FDA0002960311650000049
wherein, i is 1, 2.. times, m,
Figure FDA00029603116500000410
representing the system state estimate at the time of the k-th instant,
Figure FDA00029603116500000411
representing the covariance matrix of the system at time k, FkJacobian matrix, H, representing the equation of statekJacobian matrix, η, representing an observation equationiIn order to be the weighting coefficients,
Figure FDA00029603116500000412
represents the covariance matrix of the system after weighted averaging,
Figure FDA00029603116500000413
representing the system state estimate after weighted averaging, Qk-1Is the system noise covariance matrix.
5. The polarized light SLAM method based on extended Kalman filtering of claim 1, characterized in that: the step (7) of map updating; and (5) regarding the existing landmark points in the global map, taking the corresponding filtered landmark points as the latest landmark points, and directly adding the landmark points which are not successfully matched in the step (5) into the global map.
CN201810128657.4A 2018-02-08 2018-02-08 Polarized light SLAM method based on extended Kalman filtering Active CN108387236B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810128657.4A CN108387236B (en) 2018-02-08 2018-02-08 Polarized light SLAM method based on extended Kalman filtering

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810128657.4A CN108387236B (en) 2018-02-08 2018-02-08 Polarized light SLAM method based on extended Kalman filtering

Publications (2)

Publication Number Publication Date
CN108387236A CN108387236A (en) 2018-08-10
CN108387236B true CN108387236B (en) 2021-05-07

Family

ID=63075502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810128657.4A Active CN108387236B (en) 2018-02-08 2018-02-08 Polarized light SLAM method based on extended Kalman filtering

Country Status (1)

Country Link
CN (1) CN108387236B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020049468A1 (en) * 2018-09-04 2020-03-12 Mazdak Hooshang Navigating a vehicle through a predefined path
CN110779514B (en) * 2019-10-28 2021-04-06 北京信息科技大学 Hierarchical Kalman fusion method and device for auxiliary attitude determination of bionic polarization navigation
CN114077245A (en) * 2020-08-21 2022-02-22 苏州三六零机器人科技有限公司 SLAM method and device for multiple data sources, sweeping robot and readable medium
CN112697138B (en) * 2020-12-07 2022-09-27 北方工业大学 Bionic polarization synchronous positioning and composition method based on factor graph optimization
CN113739795B (en) * 2021-06-03 2023-10-20 东北电力大学 Underwater synchronous positioning and mapping method based on polarized light/inertia/vision integrated navigation
CN115235475B (en) * 2022-09-23 2023-01-03 成都凯天电子股份有限公司 MCC-based EKF-SLAM back-end navigation path optimization method
CN115574816B (en) * 2022-11-24 2023-03-14 东南大学 Bionic vision multi-source information intelligent perception unmanned platform

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2981448A1 (en) * 2011-10-18 2013-04-19 Floch Albert Le DEVICE FOR DETECTING THE DIRECTION OF A CACHE SUN, BASED ON A BIREFRINGENT DEPOLARIZER
CN105606104A (en) * 2016-03-17 2016-05-25 北京工业大学 Robot autonomous navigation method based on heading-assisting distributed type SLAM (Simultaneous Localization and Mapping)
CN105737832A (en) * 2016-03-22 2016-07-06 北京工业大学 Distributed SLAM (simultaneous localization and mapping) method on basis of global optimal data fusion
CN106197432A (en) * 2016-08-30 2016-12-07 北京航空航天大学 A kind of UAV Landing method based on FastSLAM algorithm
CN106886030A (en) * 2017-03-24 2017-06-23 黑龙江硅智机器人有限公司 It is applied to the synchronous mode map structuring and alignment system and method for service robot

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2981448A1 (en) * 2011-10-18 2013-04-19 Floch Albert Le DEVICE FOR DETECTING THE DIRECTION OF A CACHE SUN, BASED ON A BIREFRINGENT DEPOLARIZER
CN105606104A (en) * 2016-03-17 2016-05-25 北京工业大学 Robot autonomous navigation method based on heading-assisting distributed type SLAM (Simultaneous Localization and Mapping)
CN105737832A (en) * 2016-03-22 2016-07-06 北京工业大学 Distributed SLAM (simultaneous localization and mapping) method on basis of global optimal data fusion
CN106197432A (en) * 2016-08-30 2016-12-07 北京航空航天大学 A kind of UAV Landing method based on FastSLAM algorithm
CN106886030A (en) * 2017-03-24 2017-06-23 黑龙江硅智机器人有限公司 It is applied to the synchronous mode map structuring and alignment system and method for service robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
基于天空偏振光的SLAM方法的研究;王道斌;《中国博士学位论文全文数据库信息科技辑》;20150315;第24-25,75-89页 *

Also Published As

Publication number Publication date
CN108387236A (en) 2018-08-10

Similar Documents

Publication Publication Date Title
CN108362288B (en) Polarized light SLAM method based on unscented Kalman filtering
CN108387236B (en) Polarized light SLAM method based on extended Kalman filtering
CN108731670B (en) Inertial/visual odometer integrated navigation positioning method based on measurement model optimization
CN106840179B (en) Intelligent vehicle positioning method based on multi-sensor information fusion
CN110243358B (en) Multi-source fusion unmanned vehicle indoor and outdoor positioning method and system
CN112697138B (en) Bionic polarization synchronous positioning and composition method based on factor graph optimization
CN104075715B (en) A kind of underwater navigation localization method of Combining with terrain and environmental characteristic
CN104061899B (en) A kind of vehicle side inclination angle based on Kalman filtering and angle of pitch method of estimation
CN104655131B (en) Inertial navigation Initial Alignment Method based on ISTSSRCKF
CN101846734B (en) Agricultural machinery navigation and position method and system and agricultural machinery industrial personal computer
CN107015238A (en) Unmanned vehicle autonomic positioning method based on three-dimensional laser radar
Mu et al. End-to-end navigation for autonomous underwater vehicle with hybrid recurrent neural networks
WO2014042710A2 (en) Pose estimation
CN112798021B (en) Inertial navigation system inter-travelling initial alignment method based on laser Doppler velocimeter
CN105333869A (en) Unmanned reconnaissance aerial vehicle synchronous positioning and picture compositing method based on self-adaption EKF
CN112923931A (en) Feature map matching and GPS positioning information fusion method based on fixed route
CN115574816B (en) Bionic vision multi-source information intelligent perception unmanned platform
CN111025229B (en) Underwater robot pure orientation target estimation method
CN115200578A (en) Polynomial optimization-based inertial-based navigation information fusion method and system
CN113503872B (en) Low-speed unmanned aerial vehicle positioning method based on fusion of camera and consumption-level IMU
CN109387198A (en) A kind of inertia based on sequential detection/visual odometry Combinated navigation method
CN111220151B (en) Inertia and milemeter combined navigation method considering temperature model under load system
CN112325878A (en) Ground carrier combined navigation method based on UKF and air unmanned aerial vehicle node assistance
CN111693044A (en) Fusion positioning method
CN111256708A (en) Vehicle-mounted integrated navigation method based on radio frequency identification

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant