CN114440881A - Unmanned vehicle positioning method integrating multi-source sensor information - Google Patents
- Publication number
- CN114440881A (application number CN202210109679.2A)
- Authority
- CN
- China
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G01C21/16 — Navigation by integrating acceleration or speed, i.e. inertial navigation (dead reckoning executed aboard the object being navigated)
- G01C21/165 — Inertial navigation combined with non-inertial navigation instruments
- G01C21/20 — Instruments for performing navigational calculations
- Y02T10/40 — Engine management systems (climate change mitigation technologies related to transportation)
Abstract
The invention relates to an unmanned vehicle positioning method fusing multi-source sensor information, comprising the following steps: step 1: acquire the GPS position information and IMU information of the vehicle to be controlled, and take the centimetre-level positioning obtained by combining the vehicle's inertial navigation system with the GPS signal as the ground truth; step 2: establish a linear model; step 3: obtain the unmanned vehicle positioning algorithm fusing multi-source sensor information under the linear model, and the corresponding fusion positioning result; step 4: establish a nonlinear model; step 5: obtain the fusion positioning algorithm under the nonlinear (kinematic) model, and the corresponding fusion positioning result; step 6: establish a bicycle model; step 7: obtain the fusion positioning algorithm under the bicycle model, and the corresponding fusion positioning result. Compared with the prior art, the method improves unmanned vehicle positioning accuracy and has high applicability.
Description
Technical Field
The invention relates to the technical field of automatic driving, in particular to an unmanned vehicle positioning method fusing multi-source sensor information.
Background
With China's economic growth in recent years, the automobile is no longer merely a convenient means of transport: intelligent, networked vehicles have become a further public demand, so unmanned vehicles have attracted wide attention from researchers and industry. Positioning is a fundamental module of unmanned vehicle research, and different perception sensors call for different research methods. Schemes currently on the market that rely on GPS alone work in all weather over a large coverage area, but produce large errors when the satellite signal is disturbed; schemes that rely on the IMU alone are a relative positioning technology whose error accumulates and grows over time.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide an unmanned vehicle positioning method fusing multi-source sensor information.
The purpose of the invention can be realized by the following technical scheme:
an unmanned vehicle positioning method fusing multi-source sensor information comprises the following steps:
step 1: acquiring GPS position information and IMU information of a vehicle to be controlled, and taking centimeter-level precision positioning obtained by combining the vehicle to be controlled with an inertial navigation system and a GPS signal as a true value to verify the fused positioning precision;
step 2: establishing a linear model according to the GPS position information and the IMU information;
step 3: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the linear model, and obtaining the fusion positioning result under the linear model;
step 4: establishing a nonlinear model according to the GPS position information and the IMU information;
step 5: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the nonlinear model, and obtaining the fusion positioning result under the nonlinear model;
step 6: further deepening the nonlinearity of the nonlinear model and modelling again to obtain a bicycle model;
step 7: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the bicycle model, and obtaining the fusion positioning result under the bicycle model.
In step 1, the GPS information is collected through a GNSS antenna installed on the vehicle roof and the IMU information through an inertial navigation unit. The GPS information comprises the longitude and latitude of the vehicle, which can be converted into the abscissa and ordinate of the vehicle; the IMU information comprises the three-axis angular velocity and acceleration of the vehicle.
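The patent does not state which longitude/latitude-to-plane conversion it uses. As a minimal sketch, for a small operating area an equirectangular approximation around a reference point is common (the function name, reference-point handling and Earth radius constant below are illustrative assumptions, not from the patent):

```python
import math

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius, metres

def latlon_to_local_xy(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
    """Equirectangular approximation, valid near the reference point.

    Returns (x, y) in metres: x grows eastward, y grows northward.
    """
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    x = EARTH_RADIUS_M * (lon - ref_lon) * math.cos(ref_lat)
    y = EARTH_RADIUS_M * (lat - ref_lat)
    return x, y
```

For city-scale trajectories the approximation error is negligible compared with GPS noise; a full projection (e.g. UTM) would be the more rigorous choice.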
In step 2, the linear model includes a constant velocity model and a constant acceleration model. The state vector of the constant velocity model is:

$\mathbf{x}(t) = [x,\; y,\; \theta,\; v]^{T}$

wherein x, y, θ and v are respectively the abscissa, ordinate, heading angle and speed of the vehicle at time t;

the state equation and observation equation of the constant velocity model are respectively:

$\mathbf{x}(t+\Delta t) = \begin{bmatrix} x + v\cos\theta\,\Delta t \\ y + v\sin\theta\,\Delta t \\ \theta \\ v \end{bmatrix}, \qquad \mathbf{z}(t) = \begin{bmatrix} x_{gps} \\ y_{gps} \end{bmatrix}$

wherein Δt is the time interval, x(t+Δt) is the state vector after Δt has elapsed, z(t) is the observation obtained by GPS, and x_gps and y_gps are respectively the abscissa and ordinate of the vehicle position obtained by GPS;
the expression of the state vector of the constant acceleration model is:
wherein s and v are respectively the displacement and the speed of the vehicle at the moment t;
the expressions of the state equation and the observation equation of the constant acceleration model are respectively as follows:
wherein, a is the acceleration of the vehicle,is the state vector after the time of delta t, s (t) is the displacement of the vehicle at the time t, v (t) is the speed of the vehicle at the time t,the observed value is the displacement measured by the GPS;
the constant acceleration model takes the values measured by the accelerometer as system inputs:
u(k)=a(k)
where u (k) is the system input and a (k) is the value measured by the accelerometer.
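As a sketch, the state propagation of the two linear models can be written directly from the model definitions above (NumPy; Δt and the accelerometer input a are supplied by the caller, function names are illustrative):

```python
import numpy as np

def cv_predict(state, dt):
    """Constant velocity model, state = [x, y, theta, v]."""
    x, y, theta, v = state
    return np.array([x + v * np.cos(theta) * dt,
                     y + v * np.sin(theta) * dt,
                     theta,
                     v])

def ca_predict(state, a, dt):
    """Constant acceleration model, state = [s, v].

    The accelerometer reading a is the system input u(k) = a(k).
    """
    s, v = state
    return np.array([s + v * dt + 0.5 * a * dt ** 2,
                     v + a * dt])
```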
In the step 3, the unmanned vehicle positioning method fusing multi-source sensor information under the linear model specifically comprises the following steps:
step 301: obtaining more accurate unmanned vehicle position information according to the Kalman filtering algorithm, which comprises five steps — state prediction, error covariance prediction, Kalman gain update, state update and error covariance update:

$\hat{x}_k^- = A\hat{x}_{k-1} + Bu_{k-1}$
$P_k^- = AP_{k-1}A^{T} + Q$
$K_k = P_k^- H^{T}\,(HP_k^- H^{T} + R)^{-1}$
$\hat{x}_k = \hat{x}_k^- + K_k\,(z_k - H\hat{x}_k^-)$
$P_k = (I - K_k H)\,P_k^-$

wherein $\hat{x}_k^-$ represents the predicted state vector at time k, $\hat{x}_k$ the optimal estimate of the state vector at time k, $P_k^-$ the predicted error covariance at time k, $P_k$ the optimal estimate of the error covariance at time k, and $K_k$ the Kalman gain at time k; Q is the process noise covariance matrix, R the observation noise covariance matrix, $u_{k-1}$ the system input at time k−1, A the state transition matrix of the system, H the observation matrix of the system, I the identity matrix, and B the control input matrix;
for the constant velocity model, the expressions of an observation matrix H and a state transition matrix A obtained according to the state equation and the observation equation of the constant velocity model are respectively as follows:
wherein, delta t is a time interval, and theta is a course angle of the vehicle;
initializing a covariance matrix P into an identity matrix, and initializing a process noise matrix Q and an observation noise matrix R:
for the constant acceleration model, the expressions of a state transition matrix A, an observation matrix H and a control input matrix B obtained according to the state equation, the observation value and the system input of the constant acceleration model are respectively as follows:
H=[1 0]
wherein Δ t is a time interval;
initializing the covariance matrix P as an identity matrix, and initializing a process noise matrix Q and an observation noise matrix R as follows:
R=1;
step 302: and comparing the obtained optimal estimation value with the true value, and calculating the variance before and after filtering.
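The five-step cycle of step 301 can be sketched in a few lines of NumPy; the matrices below instantiate it for the constant acceleration model (variable names are illustrative, and Q, R, P follow the simple initializations described in the text):

```python
import numpy as np

def kalman_step(x_est, P, z, u, A, B, H, Q, R):
    """One cycle of the five Kalman filtering steps."""
    # 1. state prediction
    x_pred = A @ x_est + B @ u
    # 2. error covariance prediction
    P_pred = A @ P @ A.T + Q
    # 3. Kalman gain update
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    # 4. state update
    x_new = x_pred + K @ (z - H @ x_pred)
    # 5. error covariance update
    P_new = (np.eye(len(x_est)) - K @ H) @ P_pred
    return x_new, P_new

# Constant acceleration model matrices for a time interval dt = 1
dt = 1.0
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[dt ** 2 / 2], [dt]])
H = np.array([[1.0, 0.0]])
```

Looping `kalman_step` over the GPS displacement observations (with `u` the accelerometer reading) yields the filtered trajectory whose variance can then be compared with the raw one, as in step 302.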
In step 4, the nonlinear model is a kinematic model based on constant acceleration and constant angular velocity, whose state vector is:

$\mathbf{x}_k = [x_k,\; y_k,\; \psi_k,\; \omega_k,\; v_k,\; a_k]^{T}$

wherein x_k, y_k, ψ_k, ω_k, v_k and a_k are respectively the abscissa, ordinate, heading angle, angular velocity, speed and acceleration of the vehicle at time k;

the state equation and observation equation of the kinematic model are respectively:

$\mathbf{x}_k = \begin{bmatrix} x_{k-1} + v_{k-1}\cos\psi_{k-1}\,dt \\ y_{k-1} + v_{k-1}\sin\psi_{k-1}\,dt \\ \psi_{k-1} + \omega_{k-1}\,dt \\ \omega_{k-1} \\ v_{k-1} + a_{k-1}\,dt \\ a_{k-1} \end{bmatrix}, \qquad \mathbf{z}_k = \begin{bmatrix} x_{g,k} \\ y_{g,k} \\ a_{a,k} \\ \omega_{g,k} \end{bmatrix}$

wherein $\mathbf{x}_{k-1}$ is the state vector of the vehicle at time k−1, $v_{k-1}$ is the speed of the vehicle at time k−1, dt is the time interval, $\mathbf{z}_k$ is the observation, $x_{g,k}$ and $y_{g,k}$ are the abscissa and ordinate of the vehicle observed by GPS at time k, $a_{a,k}$ is the acceleration obtained by the accelerometer, and $\omega_{g,k}$ is the angular velocity measured by the gyroscope.
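The kinematic model described above propagates position with the current heading and speed, while angular velocity and acceleration stay constant between steps. A minimal sketch (function names are illustrative):

```python
import numpy as np

def kinematic_predict(state, dt):
    """Constant acceleration / constant angular velocity model,
    state = [x, y, psi, omega, v, a]."""
    x, y, psi, omega, v, a = state
    return np.array([x + v * np.cos(psi) * dt,
                     y + v * np.sin(psi) * dt,
                     psi + omega * dt,
                     omega,
                     v + a * dt,
                     a])

def kinematic_observe(state):
    """Observation [x_g, y_g, a_a, omega_g]: GPS position, accelerometer
    acceleration and gyroscope angular rate."""
    x, y, psi, omega, v, a = state
    return np.array([x, y, a, omega])
```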
In the step 5, the process of obtaining the unmanned vehicle positioning method fusing multi-source sensor information under the kinematics model specifically comprises the following steps:
step 501: performing fusion positioning by respectively adopting an extended Kalman algorithm, an unscented Kalman algorithm and a particle filter algorithm;
step 502: comprehensively comparing the three filtering algorithms, calculating the RMS error, and selecting the optimal positioning algorithm, the RMS error being calculated as:

$error_{rms} = |X - X_{est}|$

wherein $error_{rms}$ is the filtered error, X is the true value, comprising the true values of the abscissa and ordinate of the vehicle, and $X_{est}$ comprises the optimal estimates of the abscissa and ordinate of the filtered vehicle;
step 503: and selecting a corresponding filtering algorithm with the minimum RMS error, and acquiring a fusion positioning result under the nonlinear model based on the filtering algorithm.
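The patent's formula reads as a per-sample absolute error; over a whole trajectory the conventional per-axis RMS (assumed here as the concrete aggregation) and the minimum-error selection of step 503 can be sketched as:

```python
import numpy as np

def rms_error(truth, estimate):
    """Per-axis RMS error between ground truth and the filtered estimate.

    truth and estimate are (T, d) trajectories (e.g. d = 2 for x, y).
    """
    diff = np.asarray(truth, dtype=float) - np.asarray(estimate, dtype=float)
    return np.sqrt(np.mean(diff ** 2, axis=0))

def select_best(errors_by_algorithm):
    """Pick the algorithm whose summed RMS error is smallest."""
    return min(errors_by_algorithm, key=lambda name: np.sum(errors_by_algorithm[name]))
```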
In step 501, the extended Kalman filtering algorithm replaces the matrices A and H of the linear Kalman filter with the Jacobian matrices of the state equation and the observation equation:

wherein $F_{k-1}$ and $H_k$ are respectively the Jacobian matrices of the state equation and the observation equation, $u_k$ is the system input at time k, and $x_{k-1}$ is the state vector at time k−1;

the Jacobian matrices $F_{k-1}$ and $H_k$ are respectively:

$F_{k-1} = \begin{bmatrix} 1 & 0 & -v_{k-1}\sin\psi_{k-1}\,dt & 0 & \cos\psi_{k-1}\,dt & 0 \\ 0 & 1 & v_{k-1}\cos\psi_{k-1}\,dt & 0 & \sin\psi_{k-1}\,dt & 0 \\ 0 & 0 & 1 & dt & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & dt \\ 0 & 0 & 0 & 0 & 0 & 1 \end{bmatrix}, \qquad H_k = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 & 0 \end{bmatrix}$

wherein $\psi_{k-1}$ is the heading angle of the vehicle at time k−1, $v_{k-1}$ is the speed of the vehicle at time k−1, and dt is the time interval;
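An extended Kalman step for the six-state kinematic model — nonlinear prediction plus linearized covariance propagation through the Jacobians — can be sketched as follows (a minimal illustration, not the patent's exact implementation):

```python
import numpy as np

def ekf_step(x_est, P, z, dt, Q, R):
    """One EKF cycle for state [x, y, psi, omega, v, a] with
    observation [x_g, y_g, a_a, omega_g]."""
    x, y, psi, omega, v, a = x_est
    # nonlinear state prediction
    x_pred = np.array([x + v * np.cos(psi) * dt,
                       y + v * np.sin(psi) * dt,
                       psi + omega * dt, omega, v + a * dt, a])
    # Jacobian of the state equation
    F = np.eye(6)
    F[0, 2] = -v * np.sin(psi) * dt; F[0, 4] = np.cos(psi) * dt
    F[1, 2] = v * np.cos(psi) * dt;  F[1, 4] = np.sin(psi) * dt
    F[2, 3] = dt; F[4, 5] = dt
    # Jacobian of the observation equation (linear in the state here)
    H = np.zeros((4, 6))
    H[0, 0] = H[1, 1] = 1.0; H[2, 5] = 1.0; H[3, 3] = 1.0
    P_pred = F @ P @ F.T + Q
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(6) - K @ H) @ P_pred
    return x_new, P_new
```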
the specific filtering process of the unscented Kalman filtering algorithm is as follows:
step 1 a: initialization state vector x0State estimation error covariance P0A process noise matrix Q and an observation noise matrix R, for a kinematic model, the initial values of the state vectorsIs true value, the process noise matrix Q, the observation noise matrix R and the initial error covariance matrix P0The initialized expression is:
step 2 a: according to the state vector x at the previous momentkSum error covariance matrix PkSelecting a sampling point at the current moment and distributing weight;
step 3 a: selecting 2n +1 sampling points, calculating the mean value and covariance of all the sampling points at the current moment through a state equation of the system, and finishing time updating;
step 4 a: measuring and updating the sampling points through an observation equation of the system;
step 5 a: calculating Kalman gain through prediction and measurement updating, and finishing filtering updating;
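The core of step 2a is the generation of 2n+1 sigma points and their weights. A minimal sketch under the simple κ-parameterisation of the unscented transform (the patent does not specify which parameterisation it uses):

```python
import numpy as np

def sigma_points(x, P, kappa=0.0):
    """Generate 2n+1 sigma points and weights for the unscented transform."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # columns of the lower Cholesky factor satisfy sum_i S[:,i] S[:,i]^T = (n+kappa) P
    S = np.linalg.cholesky((n + kappa) * P)
    pts = [x]
    for i in range(n):
        pts.append(x + S[:, i])
        pts.append(x - S[:, i])
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w
```

Propagating these points through the state and observation equations and re-computing weighted means and covariances gives the time and measurement updates of steps 3a and 4a.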
the particle filter algorithm adopts a Monte Carlo method to solve the problem of nonlinear filtering, and the filtering process of the particle filter algorithm is as follows:
and step 2 b: updating particlesCalculating likelihood probability of each particleWeighting value of each particleIs updated toAnd carrying out normalization processing, wherein the weight value updating formula of the particles is as follows:
wherein ,the weight of the particles is the weight of the particles,is the likelihood probability of a particle, zkAs an observed value of the system at time k,is the output value, R, of the ith particle observation equation at time kkAn observed noise covariance matrix at time k;
and step 3 b: to pairResampling, and calculating effective particle number NeffThereby obtainingNovel particle swarmAnd the weight of the particlesIs 1/N;
and 4 b: and calculating the filtering state estimation and the variance estimation of the current moment by adopting the weight and the state of the resampled particles.
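The steps above can be sketched as one particle-filter cycle; the Gaussian likelihood and the N_eff < N/2 resampling threshold are common choices assumed here, not stated in the patent:

```python
import numpy as np

def particle_filter_step(particles, weights, z, predict_fn, observe_fn, R, rng):
    """One cycle: propagate, weight by Gaussian likelihood, normalise,
    resample when the effective particle number is low, then estimate."""
    particles = np.asarray(particles, dtype=float)
    weights = np.array(weights, dtype=float)
    n = len(particles)
    # step 2b: propagate each particle and update its weight
    particles = np.array([predict_fn(p, rng) for p in particles])
    Rinv = np.linalg.inv(R)
    for i in range(n):
        innov = z - observe_fn(particles[i])
        weights[i] *= np.exp(-0.5 * innov @ Rinv @ innov)
    weights /= weights.sum()  # normalisation
    # step 3b: resample when N_eff drops below half the particle count
    n_eff = 1.0 / np.sum(weights ** 2)
    if n_eff < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles = particles[idx]
        weights = np.full(n, 1.0 / n)
    # step 4b: state estimate as the weighted mean of the particles
    estimate = weights @ particles
    return particles, weights, estimate
```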
In step 6, the relationship between the speed and the angular velocity in the bicycle model is:

$\dot{x} = v\cos(\psi + \beta), \qquad \dot{y} = v\sin(\psi + \beta), \qquad \dot{\psi} = \frac{v\cos\beta}{l_f + l_r}\,(\tan\delta_f - \tan\delta_r)$

wherein β is the slip angle, $\dot{x}$ is the speed in the horizontal direction, $\dot{y}$ is the speed in the vertical direction, $\dot{\psi}$ is the angular velocity of the heading angle, v is the centroid speed, ψ is the heading angle, $l_r$ and $l_f$ are respectively the distances from the centroid to the rear and front axles, and $\delta_f$ and $\delta_r$ are respectively the front and rear wheel deflection angles;

the slip angle β is calculated as:

$\beta = \arctan\!\left(\frac{l_f\tan\delta_r + l_r\tan\delta_f}{l_f + l_r}\right)$

wherein $l_r$ and $l_f$ are respectively the distances from the centroid to the rear and front axles, and $\delta_f$ and $\delta_r$ are respectively the front and rear wheel deflection angles;
the expression of the state vector of the bicycle model is:
wherein ,xk、yk、ψk、r、b、N、δr,k and ωkRespectively representing the abscissa, the ordinate, the course angle, the wheel radius, the vehicle wheelbase, the gear ratio, the front wheel deflection angle and the wheel rotation angular speed of the vehicle at the moment k, keeping the wheel radius, the vehicle wheelbase and the gear ratio unchanged, and taking the front wheel deflection angle and the wheel rotation angular speed of the vehicle as system input, dt is a time interval;
the equation of state of the bicycle model is as follows:
the observed values of the bicycle model are:
wherein ,xg,k and yg,kRespectively an abscissa and an ordinate of the vehicle observed by the GPS at the moment k;
the observation equation of the bicycle model is as follows:
in step 7, the process of obtaining the unmanned vehicle positioning algorithm fusing multi-source sensor information under the single vehicle model specifically comprises the following steps:
step 701: respectively obtaining the optimal estimation values of the vehicle under the bicycle model by adopting an extended Kalman algorithm, an unscented Kalman algorithm and a particle filter algorithm based on a public data set;
step 702: carrying out error analysis on the coordinates and heading angle of the vehicle and calculating the RMS errors corresponding to the abscissa, ordinate and heading angle, the RMS error being calculated as:

$error_{rms} = |X - X_{est}|$

wherein $error_{rms}$ is the filtered error, X is the true value, comprising the true values of the abscissa, ordinate and heading angle of the vehicle, and $X_{est}$ comprises the optimal estimates of the abscissa, ordinate and heading angle of the filtered vehicle;
step 703: carrying out mean value processing on each RMS error to realize normalization;
step 704: adding the three normalized RMS errors of each filtering algorithm to form a score, evaluating the performance of the three filtering algorithms by comparing their scores, and selecting the algorithm with the minimum score as the optimal positioning algorithm;
step 705: and obtaining a fusion positioning result under the bicycle model based on the optimal positioning algorithm.
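Steps 702–705 amount to normalising each error component by its mean across the algorithms and summing; a minimal sketch (function name and data layout are illustrative):

```python
import numpy as np

def score_algorithms(rms_by_algorithm):
    """rms_by_algorithm maps name -> [rms_x, rms_y, rms_heading].

    Each component is divided by its mean across algorithms (normalisation),
    the three normalised errors are summed into a score, and the algorithm
    with the lowest score is selected.
    """
    names = list(rms_by_algorithm)
    errs = np.array([rms_by_algorithm[name] for name in names], dtype=float)
    normalised = errs / errs.mean(axis=0)
    scores = normalised.sum(axis=1)
    best = names[int(np.argmin(scores))]
    return best, dict(zip(names, scores))
```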
In step 701, when the bicycle model is filtered with each filtering algorithm, the initial state vector takes the true value, the wheel radius r = 0.425, the vehicle wheel base b = 0.8 and the gear ratio N = 5 are used, and the process noise matrix Q, the observation noise matrix R and the initial error covariance matrix $P_0$ are initialized.
compared with the prior art, the invention has the following advantages:
1. according to the invention, GPS information and IMU information are fused, and the complementarity of the two sensor information is utilized, so that a more accurate positioning result of the unmanned vehicle is obtained, the positioning precision of automatic driving in a simple environment is improved, and the technical effect of controlling the safe driving of the vehicle is further realized;
2. the invention selects the optimal algorithm according to the nonlinear degree of the system to obtain the optimal positioning result, thereby realizing the technical effect of enhancing the applicability of the unmanned vehicle positioning method.
Drawings
Fig. 1 is a linear model diagram used in the unmanned vehicle positioning method with multi-source sensor information fused according to the present invention.
Fig. 2 is an enlarged view of a fusion result of the unmanned vehicle positioning method for fusing multi-source sensor information, which is realized by the method and aims at a linear model.
Fig. 3 is a kinematic model diagram used in the unmanned vehicle positioning method with multi-source sensor information fused according to the present invention.
Fig. 4 is an enlarged view of a fusion result of the kinematics model implemented by the unmanned vehicle positioning method for fusing multi-source sensor information according to the present invention.
Fig. 5 is a comparison graph of filtering effects of three different filtering algorithms for a kinematic model, which is implemented by the unmanned vehicle positioning method with integration of multi-source sensor information provided by the invention.
Fig. 6 is a diagram of a single vehicle model used in the unmanned vehicle positioning method with multi-source sensor information fused according to the present invention.
Fig. 7 is a fusion result diagram for a single vehicle model, which is realized by the unmanned vehicle positioning method for fusing multi-source sensor information according to the present invention.
Fig. 8 is an enlarged view showing comparison of filtering effects of three different filtering algorithms for a single vehicle model, which is realized by the unmanned vehicle positioning method fusing multi-source sensor information according to the present invention.
FIG. 9 is a schematic flow chart of the method of the present invention.
Detailed Description
The invention is described in detail below with reference to the figures and the specific embodiments.
Examples
An unmanned vehicle positioning method fusing multi-source sensor information comprises the following steps:
step 1: acquiring GPS position information and IMU information of a vehicle to be controlled, and taking centimeter-level precision positioning obtained by combining the vehicle to be controlled with an inertial navigation system and a GPS signal as a true value to verify the fused positioning precision;
step 2: establishing a linear model according to the GPS position information and the IMU information;
step 3: obtaining the unmanned vehicle positioning method fusing multi-source sensor information under the linear model, and obtaining the fusion positioning result under the linear model;
step 4: establishing a nonlinear model according to the GPS position information and the IMU information;
step 5: obtaining the unmanned vehicle positioning method fusing multi-source sensor information under the kinematic model according to the nonlinear model, and obtaining the fusion positioning result under the nonlinear model;
step 6: further deepening the nonlinearity of the nonlinear model and modelling again, completing the establishment of the bicycle model;
step 7: obtaining the unmanned vehicle positioning method fusing multi-source sensor information under the bicycle model, and obtaining the fusion positioning result under the bicycle model.
In step 1, GPS information is collected by a GNSS antenna mounted on the roof of the vehicle, and angular velocity and acceleration information is collected by an inertial navigation unit.
In step 2, the linear model includes a constant velocity model and a constant acceleration model.
In step 3, the method for positioning the unmanned vehicle by fusing multi-source sensor information under the linear model specifically comprises the following steps:
the method comprises the steps that a linear model, IMU information and GPS information are used as input, the GPS position information and the IMU information are fused according to a Kalman filtering algorithm to obtain more accurate unmanned vehicle position information, and the more accurate unmanned vehicle position information is compared with a true value;
in step 4, the nonlinear model is a kinematic model of constant acceleration and angular velocity.
In step 5, the unmanned vehicle positioning method fusing multi-source sensor information under the kinematic model specifically comprises the following steps:
and completing fusion positioning by respectively adopting an extended Kalman filtering algorithm, an unscented Kalman filtering algorithm and a particle filtering algorithm, performing comprehensive comparison, and selecting an optimal positioning algorithm.
In step 6, the single vehicle model has stronger nonlinearity than the kinematic model and is more similar to the real-life vehicle model.
In step 7, the unmanned vehicle positioning method fusing multi-source sensor information under the single vehicle model specifically comprises the following steps:
aiming at the public data set, the influence of the nonlinearity degree of the bicycle model on the result accuracy of the unmanned vehicle fusion positioning algorithm is analyzed by adopting an extended Kalman filtering algorithm, an unscented Kalman filtering algorithm and a particle filtering algorithm, and the optimal positioning algorithm is obtained.
According to the invention, data are acquired from the vehicle to be controlled through the GPS and IMU. The system is first modelled linearly, and a more accurate positioning result is obtained by Kalman filtering fusion. The system is then modelled nonlinearly and tested on a real vehicle, where extended Kalman filtering, unscented Kalman filtering and particle filtering markedly improve the positioning accuracy of the unmanned vehicle. Finally, the nonlinearity of the model is further extended, a bicycle model is adopted for experimental simulation, and the optimal filtering algorithm is selected.
As shown in fig. 1 to 8, an unmanned vehicle positioning method fusing multi-source sensor information includes the following steps. The GPS information and IMU information of the vehicle to be controlled are obtained, and the centimetre-level positioning obtained by combining the vehicle's inertial navigation system with the GPS signal is taken as the true value against which the fusion positioning result is compared. A linear constant acceleration model is established; the fusion of GPS position information and IMU information is realized with the Kalman filtering algorithm, the result is compared with the true value, and the variances before and after filtering are calculated. A kinematic model of constant acceleration and angular velocity is then established; the fusion of GPS and IMU information is realized with the extended Kalman filtering, unscented Kalman filtering and particle filtering algorithms, the RMS errors are calculated, and the relative merits of the three algorithms are judged. The nonlinearity is further deepened to establish a bicycle model; the fusion of GPS and IMU information is completed on a public data set, and error analysis is carried out on the coordinate values and the heading angle.
The invention lets the GPS and IMU information complement each other's strengths and weaknesses, achieving more accurate unmanned vehicle positioning in simple environments. By comparing the effects of the three filtering algorithms under a nonlinear system, the most accurate positioning is obtained: in weakly nonlinear environments the extended Kalman filtering algorithm is adopted, while as the nonlinearity of the environment strengthens, the unscented Kalman filtering and particle filtering algorithms obtain better results. The invention can therefore achieve a good fusion positioning effect under different degrees of nonlinearity and has strong applicability.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. An unmanned vehicle positioning method fusing multi-source sensor information is characterized by comprising the following steps:
step 1: acquiring GPS position information and IMU information of a vehicle to be controlled, and taking centimeter-level precision positioning obtained by combining the vehicle to be controlled with an inertial navigation system and a GPS signal as a true value to verify the fused positioning precision;
step 2: establishing a linear model according to the GPS position information and the IMU information;
step 3: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the linear model, and obtaining the fusion positioning result under the linear model;
step 4: establishing a nonlinear model according to the GPS position information and the IMU information;
step 5: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the nonlinear model, and obtaining the fusion positioning result under the nonlinear model;
step 6: further deepening the nonlinearity of the nonlinear model and modelling again to obtain a bicycle model;
step 7: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the bicycle model, and obtaining the fusion positioning result under the bicycle model.
2. The method as claimed in claim 1, wherein in step 1, GPS information is collected through a GNSS antenna installed on a roof of a vehicle, and IMU information is collected through an inertial navigation unit, wherein the GPS information includes longitude and latitude information of the vehicle, and the longitude and latitude information of the vehicle is converted into an abscissa and an ordinate of the vehicle, and the IMU information includes three-axis angular velocity and acceleration of the vehicle.
3. The method according to claim 1, wherein in step 2, the linear model comprises a constant velocity model and a constant acceleration model, and the state vector of the constant velocity model is:

x(t) = [x, y, θ, v]^T

wherein x, y, θ and v are respectively the abscissa, ordinate, heading angle and speed of the vehicle at time t;
the state equation and the observation equation of the constant velocity model are respectively:

x(t+Δt) = x(t) + v·cos θ·Δt
y(t+Δt) = y(t) + v·sin θ·Δt
θ(t+Δt) = θ(t)
v(t+Δt) = v(t)

z(t) = [x_gps, y_gps]^T

wherein Δt is the time interval, the left-hand sides form the state vector after time Δt has elapsed, z(t) is the observation obtained by GPS, x_gps is the abscissa of the vehicle position obtained by GPS, and y_gps is the ordinate of the vehicle position obtained by GPS;
the state vector of the constant acceleration model is:

x(t) = [s, v]^T

wherein s and v are respectively the displacement and speed of the vehicle at time t;
the state equation and the observation equation of the constant acceleration model are respectively:

s(t+Δt) = s(t) + v(t)·Δt + ½·a·Δt²
v(t+Δt) = v(t) + a·Δt

z(t) = s_gps

wherein a is the acceleration of the vehicle, [s(t+Δt), v(t+Δt)]^T is the state vector after time Δt, s(t) is the displacement of the vehicle at time t, v(t) is the speed of the vehicle at time t, and the observed value z(t) is the displacement s_gps measured by GPS;
the constant acceleration model takes the values measured by the accelerometer as system inputs:
u(k)=a(k)
where u (k) is the system input and a (k) is the value measured by the accelerometer.
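The published claims supply their equations as images; as an illustrative sketch only (function names and the heading-dependent form of the constant-velocity update are our assumptions, consistent with the matrices referenced in claim 4), the two linear motion models above can be written as:

```python
import numpy as np

def cv_predict(state, dt):
    """Propagate the constant-velocity state [x, y, theta, v] over dt.

    Heading theta and speed v are held constant; position advances along
    the heading. Illustrative sketch, not the patent's exact formulation.
    """
    x, y, theta, v = state
    return np.array([x + v * np.cos(theta) * dt,
                     y + v * np.sin(theta) * dt,
                     theta,
                     v])

def ca_predict(state, a, dt):
    """Propagate the constant-acceleration state [s, v] with accelerometer input a."""
    s, v = state
    return np.array([s + v * dt + 0.5 * a * dt**2,
                     v + a * dt])
```

With dt = 0.5 s, a vehicle at the origin heading east at 2 m/s moves to x = 1 m; the accelerometer value a plays the role of the system input u(k) from the claim.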
4. The unmanned vehicle positioning method fusing multi-source sensor information according to claim 3, wherein in step 3, the unmanned vehicle positioning algorithm fusing multi-source sensor information under the linear model specifically comprises:
step 301: obtaining more accurate unmanned vehicle position information according to the Kalman filtering algorithm, which comprises five steps: state prediction, error covariance prediction, Kalman gain update, state update and error covariance update:

x̂⁻_k = A·x̂_{k-1} + B·u_{k-1}
P⁻_k = A·P_{k-1}·Aᵀ + Q
K_k = P⁻_k·Hᵀ·(H·P⁻_k·Hᵀ + R)⁻¹
x̂_k = x̂⁻_k + K_k·(z_k − H·x̂⁻_k)
P_k = (I − K_k·H)·P⁻_k

wherein x̂⁻_k denotes the predicted value of the state vector at time k, x̂_k denotes the optimal estimate of the state vector at time k, P⁻_k denotes the predicted value of the error covariance at time k, P_k denotes the optimal estimate of the error covariance at time k, K_k denotes the Kalman gain at time k, Q is the process noise covariance matrix, R is the observation noise covariance matrix, u_{k-1} denotes the system input at time k-1, A denotes the state transition matrix of the system, H denotes the observation matrix of the system, I denotes the identity matrix, and B is the control input matrix;
for the constant velocity model, the observation matrix H and the state transition matrix A obtained from its state equation and observation equation are respectively:

A = [[1, 0, 0, cos θ·Δt], [0, 1, 0, sin θ·Δt], [0, 0, 1, 0], [0, 0, 0, 1]]
H = [[1, 0, 0, 0], [0, 1, 0, 0]]

wherein Δt is the time interval and θ is the heading angle of the vehicle;
initializing a covariance matrix P into an identity matrix, and initializing a process noise matrix Q and an observation noise matrix R:
for the constant acceleration model, the state transition matrix A, the observation matrix H and the control input matrix B obtained from its state equation, observed value and system input are respectively:

A = [[1, Δt], [0, 1]]
H = [1 0]
B = [Δt²/2, Δt]^T

wherein Δt is the time interval;
initializing the covariance matrix P as an identity matrix, and initializing a process noise matrix Q and an observation noise matrix R as follows:
R=1;
step 302: and comparing the obtained optimal estimation value with the true value, and calculating the variance before and after filtering.
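The five-step cycle of claim 4 can be condensed into a single function; this is a generic textbook Kalman filter written against the symbols defined in the claim (matrix shapes and the test model are our assumptions, not the patent's implementation):

```python
import numpy as np

def kalman_step(x, P, u, z, A, B, H, Q, R):
    """One cycle of the five-step Kalman filter described in claim 4."""
    # 1. state prediction
    x_pred = A @ x + B @ u
    # 2. error covariance prediction
    P_pred = A @ P @ A.T + Q
    # 3. Kalman gain update
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    # 4. state update with the innovation z - H x_pred
    x_new = x_pred + K @ (z - H @ x_pred)
    # 5. error covariance update
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

For the constant acceleration model the matrices would be A = [[1, Δt], [0, 1]], B = [Δt²/2, Δt]ᵀ and H = [1 0], with u the accelerometer reading and z the GPS displacement.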
5. The method according to claim 1, wherein in step 4, the nonlinear model is a kinematic model based on constant acceleration and constant angular velocity, and the state vector of the kinematic model is:

x_k = [x_k, y_k, ψ_k, ω_k, v_k, a_k]^T

wherein x_k, y_k, ψ_k, ω_k, v_k and a_k are respectively the abscissa, ordinate, heading angle, angular velocity, speed and acceleration of the vehicle at time k;
the state equation and the observation equation of the kinematic model are respectively:

x_k = x_{k-1} + v_{k-1}·cos(ψ_{k-1})·dt
y_k = y_{k-1} + v_{k-1}·sin(ψ_{k-1})·dt
ψ_k = ψ_{k-1} + ω_{k-1}·dt
ω_k = ω_{k-1}
v_k = v_{k-1} + a_{k-1}·dt
a_k = a_{k-1}

z_k = [x_{g,k}, y_{g,k}, a_{a,k}, ω_{g,k}]^T

wherein the subscript k-1 terms form the state vector of the vehicle at time k-1, v_{k-1} is the speed of the vehicle at time k-1, dt is the time interval, z_k is the observed value, x_{g,k} and y_{g,k} are respectively the abscissa and ordinate of the vehicle observed by GPS at time k, and a_{a,k} and ω_{g,k} are respectively the acceleration obtained by the accelerometer and the angular velocity measured by the gyroscope.
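A minimal sketch of one prediction step of this kinematic model (a first-order Euler discretization; the patent's own equations are published as images, so the exact form is an assumption):

```python
import math

def ctra_predict(state, dt):
    """Propagate [x, y, psi, omega, v, a] one step under constant
    acceleration and constant angular velocity (Euler discretization)."""
    x, y, psi, omega, v, a = state
    return [x + v * math.cos(psi) * dt,   # position advances along heading
            y + v * math.sin(psi) * dt,
            psi + omega * dt,             # heading integrates the gyro rate
            omega,                        # angular velocity held constant
            v + a * dt,                   # speed integrates the acceleration
            a]                            # acceleration held constant
```

The GPS supplies the position components of the observation, while the accelerometer and gyroscope observe a and ω directly.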
6. The unmanned vehicle positioning method fusing multi-source sensor information according to claim 1, wherein in step 5, the process of obtaining the unmanned vehicle positioning algorithm fusing multi-source sensor information under the kinematic model specifically comprises the following steps:
step 501: performing fusion positioning by respectively adopting an extended Kalman algorithm, an unscented Kalman algorithm and a particle filter algorithm;
step 502: comprehensively comparing the three filtering algorithms, calculating the RMS error, and selecting the optimal positioning algorithm, wherein the calculation formula of the RMS error is:

error_rms = |X − X_est|

wherein error_rms is the filtered error, X is the true value and comprises the true values of the abscissa and ordinate of the vehicle, and X_est comprises the optimal estimates of the abscissa and ordinate of the filtered vehicle;
step 503: and selecting a corresponding filtering algorithm with the minimum RMS error, and acquiring a fusion positioning result under the nonlinear model based on the filtering algorithm.
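Steps 502-503 amount to scoring each filter's output against the truth and keeping the minimum; a sketch under our assumptions (the claim gives only the per-sample absolute error, so reducing it to a single RMS scalar is our choice of summary):

```python
import numpy as np

def rms_error(truth, estimate):
    """Per-sample absolute error |X - X_est| as in claim 6, reduced to a scalar RMS."""
    err = np.abs(np.asarray(truth, dtype=float) - np.asarray(estimate, dtype=float))
    return float(np.sqrt(np.mean(err ** 2)))

def pick_best(truth, estimates):
    """Return the name of the filter whose trajectory has the smallest RMS error.

    estimates: mapping of filter name -> estimated trajectory.
    """
    return min(estimates, key=lambda name: rms_error(truth, estimates[name]))
```

For example, if the UKF trajectory deviates least from the ground truth, `pick_best` returns its key, and that filter supplies the fusion result of step 503.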
7. The unmanned vehicle positioning method fusing multi-source sensor information according to claim 6, wherein in step 501, the specific formulas of the extended Kalman filtering algorithm comprise:
wherein F_{k-1} and H_k are respectively Jacobian matrices, u_k is the system input at time k, and x_{k-1} is the state vector at time k-1;
the Jacobian matrices F_{k-1} and H_k are respectively:
wherein ψ_{k-1} is the heading angle of the vehicle at time k-1, v_{k-1} is the speed of the vehicle at time k-1, and dt is the time interval;
the specific filtering process of the unscented Kalman filtering algorithm is as follows:
step 1a: initializing the state vector x_0, the state estimation error covariance P_0, the process noise matrix Q and the observation noise matrix R; for the kinematic model, the initial value of the state vector is taken as the true value, and the process noise matrix Q, the observation noise matrix R and the initial error covariance matrix P_0 are initialized as:
step 2a: selecting sampling points at the current time and assigning their weights according to the state vector x_{k-1} and the error covariance matrix P_{k-1} at the previous time;
step 3a: selecting 2n+1 sampling points, calculating the mean and covariance of all sampling points at the current time through the state equation of the system, and completing the time update;
step 4a: carrying out the measurement update of the sampling points through the observation equation of the system;
step 5a: calculating the Kalman gain from the prediction and measurement updates, and completing the filter update;
the particle filter algorithm adopts the Monte Carlo method to solve the nonlinear filtering problem, and the filtering process of the particle filter algorithm is as follows:
step 2b: updating the particles, calculating the likelihood probability of each particle, updating the weight of each particle accordingly, and carrying out normalization processing, wherein the weight update formula of the particles is:

w_k^i = w_{k-1}^i · p(z_k | x_k^i)

wherein w_k^i is the weight of the particle, p(z_k | x_k^i) is the likelihood probability of the particle, z_k is the observed value of the system at time k, ẑ_k^i is the output value of the i-th particle's observation equation at time k, and R_k is the observation noise covariance matrix at time k;
step 3b: resampling the particles and calculating the effective particle number N_eff, thereby obtaining a new particle set in which the weight of each particle is 1/N;
step 4b: calculating the filtering state estimate and variance estimate at the current time by adopting the weights and states of the resampled particles.
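The particle-filter weight update and resampling of steps 2b-3b can be sketched as follows. The Gaussian likelihood, the scalar observation, and systematic resampling are our assumptions; the claim does not fix the likelihood form or the resampling scheme:

```python
import numpy as np

def pf_update(particles, weights, z, h, R):
    """Weight update of step 2b: likelihood of observation z given each
    particle's predicted measurement h(particle), then normalization.
    Assumes a scalar observation with Gaussian noise variance R."""
    likes = np.array([np.exp(-0.5 * (z - h(p)) ** 2 / R) for p in particles])
    w = weights * likes
    return w / w.sum()

def systematic_resample(particles, weights, rng):
    """Step 3b: systematic resampling; returns N particles with equal weights 1/N."""
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n       # stratified positions in [0, 1)
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx], np.full(n, 1.0 / n)
```

A degenerate weight distribution (small effective particle number N_eff = 1 / Σ(w_i²)) is the usual trigger for invoking the resampling step.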
8. The unmanned vehicle positioning method fusing multi-source sensor information according to claim 1, wherein in step 6, the relation satisfied between the speed and the angular velocity in the bicycle model is as follows:
ẋ = v·cos(ψ + β)
ẏ = v·sin(ψ + β)
ψ̇ = v·cos β·(tan δ_f − tan δ_r)/(l_f + l_r)

wherein β is the slip angle, ẋ is the speed in the horizontal direction, ẏ is the speed in the vertical direction, ψ̇ is the angular velocity of the heading angle, v is the centroid speed, ψ is the heading angle, l_r and l_f are respectively the rear overhang length and the front overhang length, and δ_f and δ_r are respectively the front wheel deflection angle and the rear wheel deflection angle;
the slip angle β is calculated as:

β = arctan((l_f·tan δ_r + l_r·tan δ_f)/(l_f + l_r))

wherein l_r and l_f are respectively the rear overhang length and the front overhang length, and δ_f and δ_r are respectively the front wheel deflection angle and the rear wheel deflection angle;
the expression of the state vector of the bicycle model is:
wherein x_k, y_k, ψ_k, r, b, N, δ_{r,k} and ω_k are respectively the abscissa, ordinate and heading angle of the vehicle at time k, the wheel radius, the vehicle wheel base, the gear ratio, the front wheel deflection angle and the wheel rotation angular speed; the wheel radius, wheel base and gear ratio remain constant, the front wheel deflection angle and wheel rotation angular speed serve as system inputs, and dt is the time interval;
the equation of state of the bicycle model is as follows:
the observed values of the bicycle model are:

z_k = [x_{g,k}, y_{g,k}]^T

wherein x_{g,k} and y_{g,k} are respectively the abscissa and ordinate of the vehicle observed by GPS at time k;
the observation equation of the bicycle model is as follows:
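An illustrative propagation step in the spirit of claim 8 (the claim's state equation is published as an image, so the discretization below, and the relation v = ω·r/N converting wheel speed through the gear ratio to centroid speed, are our assumptions):

```python
import math

def bicycle_predict(state, delta_f, omega_wheel, dt, r=0.425, b=0.8, N=5.0):
    """One Euler step of a kinematic bicycle model with pose state [x, y, psi].

    Inputs are the front wheel deflection angle delta_f and the wheel
    rotation angular speed omega_wheel; wheel radius r, wheel base b and
    gear ratio N are the constants given in claim 10.
    """
    x, y, psi = state
    v = omega_wheel * r / N                  # centroid speed from wheel speed
    x += v * math.cos(psi) * dt              # advance along the heading
    y += v * math.sin(psi) * dt
    psi += v * math.tan(delta_f) / b * dt    # heading rate from steering angle
    return [x, y, psi]
```

GPS supplies the observations x_{g,k}, y_{g,k} of the first two state components, matching the observation equation above.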
9. The method according to claim 8, wherein in step 7, the process of obtaining the unmanned vehicle positioning algorithm fusing multi-source sensor information under the bicycle model specifically comprises the following steps:
step 701: respectively obtaining the optimal estimation values of the vehicle under the bicycle model by adopting an extended Kalman algorithm, an unscented Kalman algorithm and a particle filter algorithm based on a public data set;
step 702: carrying out error analysis on the coordinates and heading angle of the vehicle, and calculating the RMS errors corresponding to the abscissa, ordinate and heading angle of the vehicle, wherein the calculation formula of the RMS error is:

error_rms = |X − X_est|

wherein error_rms is the filtered error, X is the true value and comprises the true values of the abscissa, ordinate and heading angle of the vehicle, and X_est comprises the optimal estimates of the abscissa, ordinate and heading angle of the filtered vehicle;
step 703: normalizing each RMS error by its mean value;
step 704: adding the three normalized RMS errors of each filtering algorithm to form a score, evaluating the performance and effect of the three filtering algorithms by comparing their scores, and selecting the algorithm with the minimum score as the optimal positioning algorithm;
step 705: and obtaining a fusion positioning result under the bicycle model based on the optimal positioning algorithm.
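Steps 703-704 can be sketched as follows; normalizing each error channel by its mean across the three filters (our reading of "mean value processing") puts the position and heading errors on comparable scales before summing them into a score:

```python
import numpy as np

def score_filters(rms_table):
    """Normalize each RMS error channel by its mean across filters,
    sum the three normalized errors per filter, and pick the minimum.

    rms_table: mapping of filter name -> [rms_x, rms_y, rms_heading].
    Returns (best_filter_name, {name: score}).
    """
    names = list(rms_table)
    errs = np.array([rms_table[n] for n in names], dtype=float)
    normalized = errs / errs.mean(axis=0)   # mean-normalize each error channel
    scores = normalized.sum(axis=1)         # one scalar score per filter
    return names[int(np.argmin(scores))], dict(zip(names, scores))
```

The winning filter then produces the fusion positioning result of step 705.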
10. The method of claim 9, wherein in step 701, when the bicycle model is filtered by a filtering algorithm, the initial state vector is taken as the true value, the wheel radius r is taken as 0.425, the vehicle wheel base b as 0.8 and the gear ratio N as 5, and the process noise matrix Q, the observation noise matrix R and the initial error covariance matrix P_0 are initialized respectively as:
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210109679.2A CN114440881B (en) | 2022-01-29 | 2022-01-29 | Unmanned vehicle positioning method integrating multi-source sensor information |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114440881A true CN114440881A (en) | 2022-05-06 |
CN114440881B CN114440881B (en) | 2023-05-30 |
Family
ID=81372107
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210109679.2A Active CN114440881B (en) | 2022-01-29 | 2022-01-29 | Unmanned vehicle positioning method integrating multi-source sensor information |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114440881B (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2716597A1 (en) * | 2010-10-08 | 2012-04-08 | Canadian Space Agency | Apparatus and method for driftless attitude determination and reliable localization of vehicles |
JP2012173190A (en) * | 2011-02-23 | 2012-09-10 | Seiko Epson Corp | Positioning system and positioning method |
CN103439731A (en) * | 2013-08-29 | 2013-12-11 | 北京空间飞行器总体设计部 | GPS/INS integrated navigation method based on unscented Kalman filtering |
CN108931799A (en) * | 2018-07-18 | 2018-12-04 | 兰州交通大学 | Train combined positioning method and system based on the search of recurrence fast orthogonal |
CN110243358A (en) * | 2019-04-29 | 2019-09-17 | 武汉理工大学 | The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion |
CN110602647A (en) * | 2019-09-11 | 2019-12-20 | 江南大学 | Indoor fusion positioning method based on extended Kalman filtering and particle filtering |
CN110956665A (en) * | 2019-12-18 | 2020-04-03 | 中国科学院自动化研究所 | Vehicle turning track bidirectional calculation method, system and device |
CN113063429A (en) * | 2021-03-18 | 2021-07-02 | 苏州华米导航科技有限公司 | Self-adaptive vehicle-mounted combined navigation positioning method |
CN113252033A (en) * | 2021-06-29 | 2021-08-13 | 长沙海格北斗信息技术有限公司 | Positioning method, positioning system and robot based on multi-sensor fusion |
Non-Patent Citations (3)
Title |
---|
LI XU et al.: "Multi-sensor fusion methodology for enhanced land vehicle positioning", Information Fusion *
VICENT GIRBÉS-JUAN et al.: "Asynchronous Sensor Fusion of GPS, IMU and CAN-Based Odometry for Heavy-Duty Vehicles", IEEE Transactions on Vehicular Technology *
ZHOU Qun et al.: "Research on a GPS/UWB/MARG cooperative positioning system based on fusion algorithms", Modern Electronics Technique *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116222544A (en) * | 2023-05-09 | 2023-06-06 | 浙江大学湖州研究院 | Automatic navigation and positioning method and device for feeding vehicle facing to feeding farm |
CN116222544B (en) * | 2023-05-09 | 2023-08-04 | 浙江大学湖州研究院 | Automatic navigation and positioning method and device for feeding vehicle facing to feeding farm |
Also Published As
Publication number | Publication date |
---|---|
CN114440881B (en) | 2023-05-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112083726B (en) | Park-oriented automatic driving double-filter fusion positioning system | |
CN111307162B (en) | Multi-sensor fusion positioning method for automatic driving scene | |
CN111426318B (en) | Low-cost AHRS course angle compensation method based on quaternion-extended Kalman filtering | |
CN108731670B (en) | Inertial/visual odometer integrated navigation positioning method based on measurement model optimization | |
CN106840179B (en) | Intelligent vehicle positioning method based on multi-sensor information fusion | |
CN104061899B (en) | A kind of vehicle side inclination angle based on Kalman filtering and angle of pitch method of estimation | |
CN111595592B (en) | Performance evaluation method of adaptive cruise control system | |
CN110307836B (en) | Accurate positioning method for welt cleaning of unmanned cleaning vehicle | |
CN112505737B (en) | GNSS/INS integrated navigation method | |
CN111426320B (en) | Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter | |
CN104198765A (en) | Coordinate system transformation method for detection of vehicle motion acceleration | |
CN113063429B (en) | Self-adaptive vehicle-mounted integrated navigation positioning method | |
CN108362288A (en) | Polarized light S L AM method based on unscented Kalman filtering | |
CN110702113B (en) | Method for preprocessing data and calculating attitude of strapdown inertial navigation system based on MEMS sensor | |
CN107600073B (en) | A kind of vehicle centroid side drift angle estimating system and method based on Multi-source Information Fusion | |
CN113340298B (en) | Inertial navigation and dual-antenna GNSS external parameter calibration method | |
CN108387236A (en) | Polarized light S L AM method based on extended Kalman filtering | |
CN114966629A (en) | Vehicle body laser radar external reference calibration method based on EKF algorithm framework | |
CN111751857A (en) | Vehicle pose estimation method, device, storage medium and system | |
CN103123487B (en) | A kind of spacecraft attitude determination method | |
CN114440881B (en) | Unmanned vehicle positioning method integrating multi-source sensor information | |
CN113008229B (en) | Distributed autonomous integrated navigation method based on low-cost vehicle-mounted sensor | |
CN114935345A (en) | Vehicle-mounted inertial navigation mounting angle error compensation method based on pattern recognition | |
CN108507587B (en) | Three-dimensional vehicle-mounted positioning navigation method based on coordinate transformation | |
CN113029173A (en) | Vehicle navigation method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||