CN114440881B - Unmanned vehicle positioning method integrating multi-source sensor information - Google Patents


Info

Publication number: CN114440881B
Application number: CN202210109679.2A
Authority: CN (China)
Prior art keywords: vehicle, model, positioning, matrix, algorithm
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN114440881A
Inventors: 张卫东, 刘馨泽, 王咸鹏, 张永辉, 黄梦醒, 沈重
Current Assignee: Hainan University
Original Assignee: Hainan University
Application filed by Hainan University; priority to CN202210109679.2A
Publication of CN114440881A; application granted; publication of CN114440881B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/10: Navigation by using measurements of speed or acceleration
    • G01C 21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C 21/20: Instruments for performing navigational calculations
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The invention relates to a method for positioning an unmanned vehicle by fusing multi-source sensor information, which comprises the following steps: step 1: acquiring GPS position information and IMU information of a vehicle to be controlled, and taking the centimeter-level precision positioning obtained by combining an inertial navigation system and GPS signals of the vehicle to be controlled as a true value; step 2: establishing a linear model; step 3: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the linear model, and acquiring a fused positioning result under the linear model; step 4: establishing a nonlinear model; step 5: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the nonlinear model, and acquiring a fused positioning result under the nonlinear model; step 6: establishing a bicycle model; step 7: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the bicycle model, and acquiring a fused positioning result under the bicycle model. Compared with the prior art, the invention has the advantages of improved unmanned vehicle positioning accuracy and strong applicability.

Description

Unmanned vehicle positioning method integrating multi-source sensor information
Technical Field
The invention relates to the technical field of automatic driving, in particular to an unmanned vehicle positioning method integrating multisource sensor information.
Background
With the recent growth of the Chinese economy, automobiles are no longer merely convenient means of transport, and intelligent, networked automobiles have become a further demand, so unmanned vehicles have attracted wide attention from researchers and industry for their portability and expandability. Unmanned vehicle positioning is a basic module of research in the unmanned vehicle field, and different research methods need to be adopted for different perception sensors. A scheme that uses the GPS alone to complete the positioning work has the characteristics of all-weather operation and large coverage area, but produces large errors when the satellite signal is interfered with.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide an unmanned vehicle positioning method integrating multi-source sensor information.
The aim of the invention can be achieved by the following technical scheme:
a unmanned vehicle positioning method integrating multisource sensor information comprises the following steps:
step 1: acquiring GPS position information and IMU information of a vehicle to be controlled, and taking centimeter-level precision positioning obtained by combining an inertial navigation system and GPS signals of the vehicle to be controlled as a true value to verify the positioning precision after fusion;
step 2: establishing a linear model according to the GPS position information and the IMU information;
step 3: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under a linear model, and acquiring a fused positioning result under the linear model;
step 4: establishing a nonlinear model according to GPS position information and IMU information;
step 5: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under a nonlinear model, and acquiring a fused positioning result under the nonlinear model;
step 6: further deepening nonlinearity of the nonlinear model and re-modeling to obtain a bicycle model;
step 7: and acquiring an unmanned vehicle positioning algorithm for fusing the multi-source sensor information under the bicycle model, and acquiring a fused positioning result under the bicycle model.
In the step 1, GPS information is collected through a GNSS antenna arranged on the roof of the vehicle, IMU information is collected through an inertial navigation unit, the GPS information comprises longitude and latitude information of the vehicle, the longitude and latitude information of the vehicle can be converted into an abscissa and an ordinate of the vehicle, and the IMU information comprises three-axis angular velocity and acceleration of the vehicle.
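The longitude/latitude-to-coordinate conversion mentioned above is not spelled out in the text; a minimal sketch of one common choice, a small-area equirectangular projection around a reference point (the Earth radius constant and the function name are assumptions for illustration), is:

```python
import math

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius (assumed convention)

def latlon_to_xy(lat, lon, ref_lat, ref_lon):
    """Convert latitude/longitude in degrees to local x/y in metres
    relative to a reference point (small-area equirectangular approximation)."""
    x = math.radians(lon - ref_lon) * EARTH_RADIUS_M * math.cos(math.radians(ref_lat))
    y = math.radians(lat - ref_lat) * EARTH_RADIUS_M
    return x, y

# a point 0.001 degrees north of the reference maps to roughly 111.3 m
x, y = latlon_to_xy(20.001, 110.0, 20.0, 110.0)
print(round(y, 1))  # 111.3
```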
In the step 2, the linear model includes a constant velocity model and a constant acceleration model, where the expression of the state vector of the constant velocity model is:
$$X(t) = \begin{bmatrix} x & y & \theta & v \end{bmatrix}^T$$

wherein x, y, θ and v are respectively the abscissa, the ordinate, the course angle and the speed of the vehicle at the moment t;

the expressions of the state equation and the observation equation of the constant velocity model are respectively:

$$X(t+\Delta t) = \begin{bmatrix} x + v\cos\theta\,\Delta t \\ y + v\sin\theta\,\Delta t \\ \theta \\ v \end{bmatrix}$$

$$Z(t) = \begin{bmatrix} x_{gps} \\ y_{gps} \end{bmatrix}$$

wherein Δt is the time interval, X(t+Δt) is the state vector after the time Δt has elapsed, Z(t) is the observation value obtained by the GPS, x_gps is the abscissa and y_gps the ordinate of the vehicle position obtained by the GPS;
the expression of the state vector of the constant acceleration model is:
$$X(t) = \begin{bmatrix} s & v \end{bmatrix}^T$$

wherein s and v are respectively the displacement and the speed of the vehicle at the moment t;

the expressions of the state equation and the observation equation of the constant acceleration model are respectively:

$$X(t+\Delta t) = \begin{bmatrix} s(t) + v(t)\,\Delta t + \tfrac{1}{2}a\,\Delta t^2 \\ v(t) + a\,\Delta t \end{bmatrix}$$

$$Z(t) = s_{gps}$$

wherein a is the acceleration of the vehicle, X(t+Δt) is the state vector after the time Δt, s(t) is the displacement of the vehicle at the moment t, v(t) is the speed of the vehicle at the moment t, and Z(t) is the observation value, namely the displacement obtained by GPS measurement;
the constant acceleration model takes the value measured by the accelerometer as the system input:
u(k)=a(k)
where u (k) is the system input and a (k) is the value measured by the accelerometer.
In the step 3, the unmanned vehicle positioning method for fusing the multi-source sensor information under the linear model specifically comprises the following steps:
step 301: more accurate unmanned vehicle position information is obtained according to a Kalman filtering algorithm, and the specific process of the Kalman filtering algorithm is divided into five links, namely state prediction, error covariance prediction, Kalman gain update, state update and error covariance update:

$$\hat{x}_k^- = A\hat{x}_{k-1} + Bu_{k-1}$$

$$P_k^- = AP_{k-1}A^T + Q$$

$$K_k = P_k^- H^T \left( HP_k^- H^T + R \right)^{-1}$$

$$\hat{x}_k = \hat{x}_k^- + K_k\left( z_k - H\hat{x}_k^- \right)$$

$$P_k = \left( I - K_k H \right) P_k^-$$

wherein x̂_k^- represents the predictive value of the state vector at the moment k, x̂_k represents the optimal estimate of the state vector at the moment k, P_k^- represents the predictive value of the error covariance at the moment k, P_k represents the optimal estimation value of the error covariance at the moment k, K_k is the Kalman gain at the moment k, Q is the process noise covariance matrix, R is the observation noise covariance matrix, u_{k-1} represents the system input at the moment k-1, A represents the state transition matrix of the system, H represents the observation matrix of the system, I represents the identity matrix, and B represents the control input matrix;
for the constant velocity model, the expressions of the observation matrix H and the state transition matrix A obtained according to the state equation and the observation equation of the constant velocity model are respectively:

$$H = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}$$

$$A = \begin{bmatrix} 1 & 0 & 0 & \cos\theta\,\Delta t \\ 0 & 1 & 0 & \sin\theta\,\Delta t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

wherein Δt is the time interval and θ is the heading angle of the vehicle;

the covariance matrix P is initialized as an identity matrix, and the process noise matrix Q and the observation noise matrix R are likewise initialized;
for the constant acceleration model, the expressions of the state transition matrix A, the observation matrix H and the control input matrix B obtained according to the state equation, the observation value and the system input of the constant acceleration model are respectively:

$$A = \begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix}, \qquad H = \begin{bmatrix} 1 & 0 \end{bmatrix}, \qquad B = \begin{bmatrix} \tfrac{1}{2}\Delta t^2 \\ \Delta t \end{bmatrix}$$

wherein Δt is the time interval;

the covariance matrix P is initialized as an identity matrix, the process noise matrix Q is initialized, and the observation noise matrix R is initialized as R = 1;
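The five links of step 301 applied to the constant acceleration model can be sketched as follows; the time step, process noise value and simulated trajectory are illustrative assumptions (only the identity initial covariance and R = 1 follow the text):

```python
import numpy as np

def kalman_filter_ca(z, u, dt=0.1, q=0.01):
    """Five-link Kalman filter for the constant acceleration model
    x = [s, v]^T with accelerometer input u(k) = a(k)."""
    A = np.array([[1.0, dt], [0.0, 1.0]])              # state transition matrix
    B = np.array([[0.5 * dt ** 2], [dt]])              # control input matrix
    H = np.array([[1.0, 0.0]])                         # GPS observes displacement
    Q = q * np.eye(2)                                  # process noise (assumed value)
    R = np.array([[1.0]])                              # observation noise, R = 1
    x, P = np.zeros((2, 1)), np.eye(2)                 # initial state and covariance
    estimates = []
    for zk, uk in zip(z, u):
        x = A @ x + B * uk                             # state prediction
        P = A @ P @ A.T + Q                            # error covariance prediction
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain update
        x = x + K @ (np.array([[zk]]) - H @ x)         # state update
        P = (np.eye(2) - K @ H) @ P                    # error covariance update
        estimates.append(x[0, 0])
    return np.array(estimates)

# simulate constant acceleration a = 0.5 m/s^2 with noisy GPS displacement
rng = np.random.default_rng(0)
n, dt, a = 200, 0.1, 0.5
t = np.arange(1, n + 1) * dt
s_true = 0.5 * a * t ** 2
z = s_true + rng.normal(0.0, 1.0, n)                   # 1 m GPS noise (assumed)
s_est = kalman_filter_ca(z, np.full(n, a), dt=dt)
print(np.std(s_est - s_true) < np.std(z - s_true))     # filtering reduces the error
```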
step 302: and comparing the obtained optimal estimated value with the true value, and calculating the variance before and after filtering.
In the step 4, the nonlinear model is a kinematic model based on constant acceleration and angular velocity, and the expression of the state vector of the kinematic model is:

$$X_k = \begin{bmatrix} x_k & y_k & \psi_k & \omega_k & v_k & a_k \end{bmatrix}^T$$

wherein x_k, y_k, ψ_k, ω_k, v_k and a_k are respectively the abscissa, the ordinate, the course angle, the angular velocity, the speed and the acceleration of the vehicle at the moment k;

the expressions of the state equation and the observation equation of the kinematic model are respectively:

$$X_k = \begin{bmatrix} x_{k-1} + v_{k-1}\cos\psi_{k-1}\,dt \\ y_{k-1} + v_{k-1}\sin\psi_{k-1}\,dt \\ \psi_{k-1} + \omega_{k-1}\,dt \\ \omega_{k-1} \\ v_{k-1} + a_{k-1}\,dt \\ a_{k-1} \end{bmatrix}$$

$$Z_k = \begin{bmatrix} x_{g,k} & y_{g,k} & a_{a,k} & \omega_{g,k} \end{bmatrix}^T$$

wherein X_{k-1} is the state vector of the vehicle at the moment k-1, v_{k-1} is the speed of the vehicle at the moment k-1, dt is the time interval, Z_k is the observation value, and x_{g,k}, y_{g,k}, a_{a,k} and ω_{g,k} are respectively the vehicle abscissa and ordinate obtained by GPS observation at the moment k, the acceleration measured by the accelerometer, and the angular velocity measured by the gyroscope.
In the step 5, the process of acquiring the unmanned vehicle positioning algorithm fusing multi-source sensor information under the kinematic model specifically comprises the following steps:
step 501: respectively adopting an extended Kalman algorithm, an unscented Kalman algorithm and a particle filter algorithm to carry out fusion positioning;
step 502: the three filtering algorithms are comprehensively compared, the RMS error is calculated to select the optimal positioning algorithm, and the calculation formula of the RMS error is as follows:
$$\mathrm{error}_{rms} = \left| X - X_{est} \right|$$

wherein error_rms is the filtering error, X is the true value, including the true values of the abscissa and the ordinate of the vehicle, and X_est contains the optimal estimated values of the abscissa and the ordinate of the vehicle after filtering;
step 503: and selecting a corresponding filtering algorithm with the minimum RMS error, and acquiring a fusion positioning result under the nonlinear model based on the filtering algorithm.
In the step 501, the specific formulas of the extended Kalman filtering algorithm include:

$$\hat{x}_k^- = f\left( \hat{x}_{k-1}, u_k \right)$$

$$P_k^- = F_{k-1} P_{k-1} F_{k-1}^T + Q$$

wherein F_{k-1} and H_k are respectively Jacobian matrices, u_k is the system input at the moment k, and x_{k-1} is the state vector at the moment k-1;

the expressions of the Jacobian matrices F_{k-1} and H_k are respectively:

$$F_{k-1} = \begin{bmatrix} 1 & 0 & -v_{k-1}\sin\psi_{k-1}\,dt & 0 & \cos\psi_{k-1}\,dt & 0 \\ 0 & 1 & v_{k-1}\cos\psi_{k-1}\,dt & 0 & \sin\psi_{k-1}\,dt & 0 \\ 0 & 0 & 1 & dt & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & dt \\ 0 & 0 & 0 & 0 & 0 & 1 \end{bmatrix}$$

$$H_k = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 & 0 \end{bmatrix}$$

wherein ψ_{k-1} is the heading angle of the vehicle at the moment k-1, v_{k-1} is the speed of the vehicle at the moment k-1, and dt is the time interval;
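As a sanity check on the Jacobian of the kinematic model, a sketch comparing the analytic Jacobian of the state equation of step 4 with a central finite-difference approximation (the test point and step sizes are arbitrary illustrative assumptions):

```python
import numpy as np

def f(x, dt=0.1):
    """State transition of the kinematic model, state = [x, y, psi, omega, v, a]."""
    px, py, psi, om, v, a = x
    return np.array([px + v * np.cos(psi) * dt,
                     py + v * np.sin(psi) * dt,
                     psi + om * dt,
                     om,
                     v + a * dt,
                     a])

def jacobian_F(x, dt=0.1):
    """Analytic Jacobian F of f with respect to the state."""
    _, _, psi, _, v, _ = x
    F = np.eye(6)
    F[0, 2], F[0, 4] = -v * np.sin(psi) * dt, np.cos(psi) * dt
    F[1, 2], F[1, 4] = v * np.cos(psi) * dt, np.sin(psi) * dt
    F[2, 3] = dt
    F[4, 5] = dt
    return F

# compare with a central finite-difference approximation at an arbitrary point
x0, dt, eps = np.array([1.0, 2.0, 0.3, 0.1, 5.0, 0.2]), 0.1, 1e-6
F_num = np.empty((6, 6))
for j in range(6):
    d = np.zeros(6)
    d[j] = eps
    F_num[:, j] = (f(x0 + d, dt) - f(x0 - d, dt)) / (2 * eps)
print(np.allclose(jacobian_F(x0, dt), F_num, atol=1e-6))  # True
```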
the specific filtering process of the unscented Kalman filtering algorithm is as follows:
step 1a: initializing the state vector x_0, the state estimation error covariance P_0, the process noise matrix Q and the observation noise matrix R; for the kinematic model, the initial value of the state vector x_0 is taken as the true value, and the process noise matrix Q, the observation noise matrix R and the initial error covariance matrix P_0 are initialized;
step 2a: according to the state vector x of the last moment k And error covariance matrix P k Selecting a sampling point at the current moment and distributing weights;
step 3a: 2n+1 sampling points are selected, the mean value and covariance of all sampling points at the current moment are calculated through a state equation of the system, and time updating is completed;
step 4a: measuring and updating the sampling points through an observation equation of the system;
step 5a: calculating Kalman gain through prediction and measurement updating to finish filtering updating;
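Steps 2a and 3a, selecting 2n+1 sampling points and assigning weights, can be sketched with the standard scaled unscented transform; the scaling parameters alpha, beta and kappa are conventional assumed values, not details given in the text:

```python
import numpy as np

def sigma_points(x, P, alpha=0.3, beta=2.0, kappa=0.0):
    """Generate 2n+1 sigma points and their mean/covariance weights
    (scaled unscented transform; parameter values are illustrative)."""
    n = x.size
    lam = alpha ** 2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)        # matrix square root
    pts = np.vstack([x, x + S.T, x - S.T])       # 2n+1 sampling points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)                      # weight of the centre point
    wc[0] = wm[0] + (1.0 - alpha ** 2 + beta)
    return pts, wm, wc

x0 = np.zeros(6)                                 # kinematic-model state size n = 6
pts, wm, wc = sigma_points(x0, np.eye(6))
print(pts.shape)            # (13, 6): 2n+1 sampling points
print(round(wm.sum(), 6))   # mean weights sum to 1
```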
the particle filtering algorithm solves the nonlinear filtering problem by adopting a Monte Carlo method, and the filtering process of the particle filtering algorithm is as follows:
step 1b: initializing N particles x_0^i and the weights of the particles w_0^i, which are uniformly set to 1/N;

step 2b: updating the particles x_k^i, calculating the likelihood probability q^i of each particle, updating the weight of each particle from w_{k-1}^i to w_k^i, and carrying out normalization processing, wherein the weight updating formula of the particles is:

$$w_k^i = w_{k-1}^i \, q^i, \qquad q^i = \exp\!\left( -\tfrac{1}{2} \left( z_k - \hat{z}_k^i \right)^T R_k^{-1} \left( z_k - \hat{z}_k^i \right) \right)$$

wherein w_k^i is the weight of the particle, q^i is the likelihood probability of the particle, z_k is the observed value of the system at the moment k, ẑ_k^i is the output value of the observation equation of the i-th particle at the moment k, and R_k is the observed noise covariance matrix at the moment k;

step 3b: resampling x_k^i and calculating the effective particle number N_eff, thereby obtaining a new particle group whose particle weights are reset to 1/N;

step 4b: calculating the filter state estimation and variance estimation of the current moment by adopting the weights and the states of the particles after resampling.
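Steps 1b to 4b can be sketched as a bootstrap particle filter on a one-dimensional toy system; the Gaussian likelihood and the N_eff < N/2 resampling threshold are common conventions assumed here rather than details given in the text:

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter_step(particles, weights, z, R, q_std):
    """One bootstrap particle-filter iteration for a 1-D random-walk state:
    propagate, weight by a Gaussian likelihood, normalize, resample."""
    N = particles.size
    particles = particles + rng.normal(0.0, q_std, N)     # step 2b: propagate
    likelihood = np.exp(-0.5 * (z - particles) ** 2 / R)  # q^i = p(z_k | x_k^i)
    weights = weights * likelihood
    weights /= weights.sum()                              # normalize
    n_eff = 1.0 / np.sum(weights ** 2)                    # effective particle number
    if n_eff < N / 2:                                     # step 3b: resample
        idx = np.minimum(np.searchsorted(np.cumsum(weights),
                                         (rng.random() + np.arange(N)) / N),
                         N - 1)
        particles, weights = particles[idx], np.full(N, 1.0 / N)
    return particles, weights

# step 1b: N particles with uniform weights 1/N
N = 500
particles = rng.normal(0.0, 5.0, N)
weights = np.full(N, 1.0 / N)
for _ in range(50):
    z = 3.0 + rng.normal(0.0, 0.5)                        # noisy observation of 3.0
    particles, weights = particle_filter_step(particles, weights, z,
                                              R=0.25, q_std=0.05)
estimate = np.sum(weights * particles)                    # step 4b: state estimate
print(abs(estimate - 3.0) < 0.5)
```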
In the step 6, the relationships among the speeds and the angular speed in the bicycle model are as follows:

$$\dot{x} = v\cos\left( \psi + \beta \right)$$

$$\dot{y} = v\sin\left( \psi + \beta \right)$$

$$\dot{\psi} = \frac{v\cos\beta}{l_f + l_r}\left( \tan\delta_f - \tan\delta_r \right)$$

wherein β is the slip angle, ẋ is the speed in the horizontal direction, ẏ is the speed in the vertical direction, ψ̇ is the angular velocity of the course angle, v is the centroid speed, ψ is the course angle, l_r and l_f are respectively the rear suspension length and the front suspension length, and δ_f and δ_r are respectively the front wheel deflection angle and the rear wheel deflection angle;

the slip angle β is calculated as:

$$\beta = \arctan\!\left( \frac{l_f \tan\delta_r + l_r \tan\delta_f}{l_f + l_r} \right)$$
the expression of the state vector of the bicycle model is:
Figure BDA00034947291100000711
wherein ,xk 、y k 、ψ k 、r、b、N、δ r,k and ωk Respectively determining the abscissa, the ordinate, the course angle, the wheel radius, the vehicle wheelbase, the gear ratio, the front wheel deflection angle and the wheel rotation angular speed of the vehicle at the moment k, wherein the wheel radius, the vehicle wheelbase and the gear ratio are kept unchanged, the front wheel deflection angle and the wheel rotation angular speed of the vehicle are input by a system, and dt is a time interval;
the state equation of the bicycle model is:
Figure BDA00034947291100000712
the observed values of the bicycle model are:
Figure BDA0003494729110000081
wherein ,xg,k and yg,k The abscissa and the ordinate of the vehicle observed by the GPS at the time k are respectively;
the observation equation of the bicycle model is as follows:
Figure BDA0003494729110000082
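One propagation step of a kinematic bicycle model of this kind can be sketched as follows, using the constant parameters named in the text (wheel radius 0.425, wheelbase 0.8, gear ratio 5); the wheel-speed-to-vehicle-speed relation v = r·ω/N and the heading update via the tangent of the steering angle are modelling assumptions for illustration:

```python
import numpy as np

def bicycle_step(state, omega, delta, r=0.425, b=0.8, N=5.0, dt=0.1):
    """Propagate state = [x, y, psi] one step; inputs are the wheel rotation
    angular speed omega and the front wheel deflection angle delta.
    v = r * omega / N converts wheel speed to vehicle speed (assumed relation)."""
    x, y, psi = state
    v = r * omega / N
    return np.array([x + v * np.cos(psi) * dt,
                     y + v * np.sin(psi) * dt,
                     psi + v / b * np.tan(delta) * dt])

s = np.array([0.0, 0.0, 0.0])
for _ in range(10):                      # zero steering: straight-line motion
    s = bicycle_step(s, omega=10.0, delta=0.0)
print(round(s[2], 6))                    # 0.0: heading unchanged
for _ in range(10):                      # positive steering turns the vehicle
    s = bicycle_step(s, omega=10.0, delta=0.1)
print(s[2] > 0.0)                        # True
```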
in the step 7, the process of acquiring the unmanned vehicle positioning algorithm fusing the multi-source sensor information under the bicycle model specifically comprises the following steps:
step 701: based on the public data set, adopting an extended Kalman algorithm, an unscented Kalman algorithm and a particle filtering algorithm to respectively obtain the optimal estimation value of the vehicle under the bicycle model;
step 702: error analysis is carried out on coordinates and course angles of the vehicle, and RMS errors corresponding to the abscissa, the ordinate and the course angles of the vehicle are calculated, wherein the calculation formula of the RMS errors is as follows:
$$\mathrm{error}_{rms} = \left| X - X_{est} \right|$$

wherein error_rms is the filtering error, X is the true value, including the true values of the abscissa, the ordinate and the course angle of the vehicle, and X_est contains the optimal estimated values of the abscissa, the ordinate and the course angle of the vehicle after filtering;
step 703: each RMS error is subjected to mean value processing to achieve normalization;
step 704: adding the three normalized RMS errors of each filtering algorithm to form a score, and comparing the scores of the filtering algorithms to evaluate the performance and effect of the three filtering algorithms, thereby selecting the optimal positioning algorithm with the minimum score;
step 705: and obtaining a fusion positioning result under the bicycle model based on an optimal positioning algorithm.
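Steps 702 to 704 (per-dimension RMS errors, mean normalization, and summing into a score) can be sketched on synthetic estimates as follows; the noise levels and algorithm labels are illustrative assumptions:

```python
import numpy as np

def rms_error(truth, est):
    """Root-mean-square error for one dimension (abscissa, ordinate or heading)."""
    return np.sqrt(np.mean((truth - est) ** 2))

def score_algorithms(truth, estimates):
    """Steps 703/704: normalize each error dimension by its mean across the
    algorithms, then add the three normalized errors into one score
    (the lower the score, the better the algorithm)."""
    names = list(estimates)
    errs = np.array([[rms_error(truth[:, d], estimates[name][:, d])
                      for d in range(3)] for name in names])
    normalized = errs / errs.mean(axis=0)
    return dict(zip(names, normalized.sum(axis=1)))

rng = np.random.default_rng(2)
truth = np.zeros((100, 3))                       # [x, y, psi] ground truth
estimates = {"EKF": truth + rng.normal(0.0, 0.30, truth.shape),
             "UKF": truth + rng.normal(0.0, 0.10, truth.shape),
             "PF":  truth + rng.normal(0.0, 0.20, truth.shape)}
scores = score_algorithms(truth, estimates)
best = min(scores, key=scores.get)
print(best)  # the lowest-error algorithm gets the smallest score: UKF
```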
In the step 701, when the bicycle model is filtered by using a filtering algorithm, the initial state vector x_0 is taken as the true value, the wheel radius r is taken as 0.425, the vehicle wheelbase b is taken as 0.8, the gear ratio N is taken as 5, and the process noise matrix Q, the observation noise matrix R and the initial error covariance matrix P_0 are initialized.
Compared with the prior art, the invention has the following advantages:
1. The invention fuses the GPS information and the IMU information and uses the complementarity of the two kinds of sensor information to obtain a more accurate unmanned vehicle positioning result, so that the automatic driving positioning precision in a simple environment is improved, achieving the technical effect of controlling the vehicle to drive safely.
2. The invention selects the optimal algorithm according to the degree of nonlinearity of the system to obtain the optimal positioning result, achieving the technical effect of enhancing the applicability of the unmanned vehicle positioning method.
Drawings
Fig. 1 is a linear model diagram used in the unmanned vehicle positioning method with multi-source sensor information fusion according to the present invention.
Fig. 2 is an enlarged view of a fusion result for a linear model, which is achieved by the unmanned vehicle positioning method for fusing multi-source sensor information.
Fig. 3 is a diagram of a kinematic model used in the method for locating an unmanned vehicle with multi-source sensor information fusion according to the present invention.
Fig. 4 is an enlarged view of a fusion result aiming at a kinematic model, which is achieved by the unmanned vehicle positioning method for fusing multi-source sensor information.
Fig. 5 is a comparison chart of filtering effects of three different filtering algorithms for a kinematic model, which is realized by the unmanned vehicle positioning method based on the multi-source sensor information.
Fig. 6 is a schematic diagram of a bicycle model used in the method for positioning an unmanned vehicle by fusing multi-source sensor information according to the present invention.
Fig. 7 is a diagram of a fusion result of a bicycle model, which is achieved by the unmanned vehicle positioning method for fusing multi-source sensor information.
Fig. 8 is a comparison and amplification diagram of filtering effects of three different filtering algorithms for a bicycle model, which is realized by the unmanned vehicle positioning method based on the multi-source sensor information.
FIG. 9 is a schematic flow chart of the method of the present invention.
Detailed Description
The invention will now be described in detail with reference to the drawings and specific examples.
Examples
An unmanned vehicle positioning method integrating multi-source sensor information comprises the following steps:
step 1: acquiring GPS position information and IMU information of a vehicle to be controlled, and taking centimeter-level precision positioning obtained by combining an inertial navigation system and GPS signals of the vehicle to be controlled as a true value to verify the positioning precision after fusion;
step 2: establishing a linear model according to the GPS position information and the IMU information;
step 3: acquiring an unmanned vehicle positioning method for fusing multi-source sensor information according to the linear model, and acquiring a fused positioning result under the linear model;
step 4: establishing a nonlinear model according to GPS position information and IMU information;
step 5: acquiring an unmanned vehicle positioning method for fusing multisource sensor information under a kinematic model according to a nonlinear model, and acquiring a fused positioning result under the nonlinear model;
step 6: the nonlinear model is further deepened and modeled again according to the nonlinear model, so that the establishment of the bicycle model is completed;
step 7: and acquiring the unmanned vehicle positioning method for fusing the multisource sensor information under the bicycle model according to the bicycle model, and acquiring a fused positioning result under the bicycle model.
In step 1, GPS information is collected by a GNSS antenna mounted on the roof of the vehicle, and angular velocity and acceleration information is collected by an inertial navigation unit.
In step 2, the linear model includes a constant velocity model and a constant acceleration model.
In step 3, the unmanned vehicle positioning method for fusing the multi-source sensor information under the linear model specifically comprises the following steps:
taking the linear model, the IMU information and the GPS information as inputs, fusing the GPS position information and the IMU information according to a Kalman filtering algorithm, obtaining more accurate unmanned vehicle position information, and comparing with a true value;
in step 4, the nonlinear model is a kinematic model of constant acceleration and angular velocity.
In step 5, the unmanned vehicle positioning method based on the fusion of the multisource sensor information under the kinematic model specifically comprises the following steps:
and respectively adopting an extended Kalman filtering algorithm, an unscented Kalman filtering algorithm and a particle filtering algorithm to finish fusion positioning, comprehensively comparing, and selecting an optimal positioning algorithm.
In step 6, the bicycle model has stronger nonlinearity than the kinematic model and is more similar to the real-life vehicle model.
In step 7, the unmanned vehicle positioning method for fusing the multi-source sensor information under the bicycle model specifically comprises the following steps:
aiming at the public data set, an extended Kalman filtering algorithm, an unscented Kalman filtering algorithm and a particle filtering algorithm are adopted to analyze the influence of the nonlinearity degree of the bicycle model on the accuracy of the result of the unmanned vehicle fusion positioning algorithm, and an optimal positioning algorithm is obtained.
According to the invention, data acquisition is carried out on the vehicle to be controlled by the GPS and the IMU. The system is first modeled as a linear system, and a Kalman filtering algorithm is adopted for fusion to obtain a more accurate positioning result. The system is then modeled nonlinearly and tested on a real vehicle, and the unmanned vehicle positioning precision is obviously improved by adopting extended Kalman filtering, unscented Kalman filtering and particle filtering under the nonlinear system. The nonlinearity of the model is further enlarged, a bicycle model is adopted for further experimental simulation, and an optimal filtering algorithm is selected. By fusing the GPS information and the IMU information, the technical problem of how to improve the positioning precision in automatic driving under a simple environment so as to control the vehicle running is solved, and because the optimal solution can be selected in systems with different degrees of nonlinearity, the method has strong applicability.
As shown in fig. 1 to 8, an unmanned vehicle positioning method integrating multi-source sensor information comprises the following steps. The GPS information and IMU information of the vehicle to be controlled are obtained, and the centimeter-level precision positioning of the vehicle to be controlled, obtained by combining an inertial navigation system with GPS signals, is taken as a true value for comparison with the fusion positioning result. A linear constant acceleration model is established, the GPS position information and the IMU information are fused by adopting a Kalman filtering algorithm, the result is compared with the true value, and the variance before and after filtering is calculated. A kinematic model of constant acceleration and angular velocity is then established, the fusion of the GPS information and the IMU information is realized by adopting an extended Kalman filtering algorithm, an unscented Kalman filtering algorithm and a particle filtering algorithm, the RMS error is calculated, and the advantages and disadvantages of the three algorithms are judged. Nonlinearity is further deepened, a bicycle model is established, the fusion of the GPS information and the IMU information is completed on a public data set, and error analysis is carried out on the coordinate values and the heading angle.
According to the invention, the GPS information and the IMU information complement each other in quality, so that a more accurate unmanned vehicle positioning effect can be obtained in a simple environment. By comparing the effects of the three filtering algorithms in a nonlinear system, the most accurate positioning is obtained: the extended Kalman filtering algorithm is adopted in an environment with weak nonlinearity, and when the degree of nonlinearity of the environment increases, the unscented Kalman filtering algorithm and the particle filtering algorithm perform better, so that the invention can realize a good fusion positioning effect in environments with different degrees of nonlinearity.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art will readily recognize that various equivalent modifications and substitutions are possible within the scope of the present invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (7)

1. The unmanned vehicle positioning method integrating the multi-source sensor information is characterized by comprising the following steps of:
step 1: acquiring GPS position information and IMU information of a vehicle to be controlled, and taking centimeter-level precision positioning obtained by combining an inertial navigation system and GPS signals of the vehicle to be controlled as a true value to verify the positioning precision after fusion;
step 2: establishing a linear model according to the GPS position information and the IMU information;
step 3: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under a linear model, and acquiring a fused positioning result under the linear model;
step 4: establishing a nonlinear model according to GPS position information and IMU information;
step 5: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under a nonlinear model, and acquiring a fused positioning result under the nonlinear model;
step 6: further increasing the degree of nonlinearity of the nonlinear model and re-modeling to obtain a bicycle model;
step 7: acquiring an unmanned vehicle positioning algorithm for fusing multi-source sensor information under a bicycle model, and acquiring a fused positioning result under the bicycle model;
in the step 2, the linear model includes a constant velocity model and a constant acceleration model, where the expression of the state vector of the constant velocity model is:
X(t) = [x, y, θ, v]ᵀ

wherein x, y, θ and v are respectively the abscissa, the ordinate, the course angle and the speed of the vehicle at time t;

the expressions of the state equation and the observation equation of the constant velocity model are respectively:

X(t+Δt) = [x + v·cos(θ)·Δt, y + v·sin(θ)·Δt, θ, v]ᵀ

Z(t) = [x_gps, y_gps]ᵀ

wherein Δt is the time interval, X(t+Δt) is the state vector after the time Δt has elapsed, Z(t) is the observation value obtained by GPS, x_gps is the abscissa of the vehicle position obtained by GPS, and y_gps is the ordinate of the vehicle position obtained by GPS;
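The constant velocity propagation and GPS observation described above can be sketched as follows (a minimal illustration; the function names and the use of NumPy are mine, not from the patent):

```python
import numpy as np

def cv_predict(state, dt):
    """Propagate the constant velocity state [x, y, theta, v] over dt."""
    x, y, theta, v = state
    return np.array([
        x + v * np.cos(theta) * dt,  # abscissa advances along the heading
        y + v * np.sin(theta) * dt,  # ordinate advances along the heading
        theta,                       # heading is constant in this model
        v,                           # speed is constant in this model
    ])

def cv_observe(state):
    """GPS observes only position: [x_gps, y_gps]."""
    return state[:2]
```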
the expression of the state vector of the constant acceleration model is:
X(t) = [s, v]ᵀ

wherein s and v are the displacement and the speed of the vehicle at time t respectively;

the expressions of the state equation and the observation equation of the constant acceleration model are respectively:

X(t+Δt) = [s + v·Δt + ½·a·Δt², v + a·Δt]ᵀ

Z(t) = s_gps

wherein a is the acceleration of the vehicle, X(t+Δt) is the state vector after the time Δt, s(t) is the displacement of the vehicle at time t, v(t) is the speed of the vehicle at time t, and Z(t) is the observation value, i.e. the displacement measured by GPS;
the constant acceleration model takes the value measured by the accelerometer as the system input:
u(k)=a(k)
where u (k) is the system input and a (k) is the value measured by the accelerometer;
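A minimal sketch of one prediction step of the constant acceleration model, with the accelerometer value as the system input u(k) (the function name is illustrative, not from the patent):

```python
import numpy as np

def ca_predict(state, a, dt):
    """Propagate [s, v] over dt with accelerometer reading a as input u(k)."""
    s, v = state
    return np.array([
        s + v * dt + 0.5 * a * dt**2,  # displacement under constant acceleration
        v + a * dt,                    # speed grows linearly with a
    ])
```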
in the step 4, the nonlinear model is a kinematic model based on constant acceleration and angular velocity, and the expression of the state vector of the kinematic model is:
X_k = [x_k, y_k, ψ_k, ω_k, v_k, a_k]ᵀ

wherein x_k, y_k, ψ_k, ω_k, v_k and a_k are respectively the abscissa, the ordinate, the course angle, the angular velocity, the speed and the acceleration of the vehicle at time k;

the expressions of the state equation and the observation equation of the kinematic model are respectively:

X_k = [x_{k-1} + v_{k-1}·cos(ψ_{k-1})·dt, y_{k-1} + v_{k-1}·sin(ψ_{k-1})·dt, ψ_{k-1} + ω_{k-1}·dt, ω_{k-1}, v_{k-1} + a_{k-1}·dt, a_{k-1}]ᵀ

Z_k = [x_g,k, y_g,k, a_a,k, ω_g,k]ᵀ

wherein X_{k-1} is the state vector of the vehicle at time k-1, v_{k-1} is the speed of the vehicle at time k-1, dt is the time interval, Z_k is the observation value, and x_g,k, y_g,k, a_a,k and ω_g,k are respectively the vehicle abscissa and ordinate obtained by GPS observation at time k, the acceleration measured by the accelerometer and the angular velocity measured by the gyroscope.
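One common Euler discretization of this constant acceleration and angular velocity model can be sketched as below. The exact discretization in the patent is given only as an image, so this standard form is an assumption, and the function name is illustrative:

```python
import numpy as np

def kinematic_predict(state, dt):
    """Euler step of the kinematic model over state [x, y, psi, omega, v, a]."""
    x, y, psi, omega, v, a = state
    return np.array([
        x + v * np.cos(psi) * dt,  # position advances along the heading
        y + v * np.sin(psi) * dt,
        psi + omega * dt,          # heading driven by constant angular velocity
        omega,                     # angular velocity held constant
        v + a * dt,                # speed driven by constant acceleration
        a,                         # acceleration held constant
    ])
```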
2. The method for locating an unmanned vehicle with integrated multi-source sensor information according to claim 1, wherein in the step 1, GPS information is collected through a GNSS antenna installed on the vehicle roof and IMU information is collected through an inertial navigation unit; the GPS information comprises the longitude and latitude of the vehicle, which can be converted into the abscissa and ordinate of the vehicle, and the IMU information comprises the three-axis angular velocity and acceleration of the vehicle.
3. The method for positioning the unmanned vehicle by fusing the multi-source sensor information according to claim 1, wherein in the step 3, the method for positioning the unmanned vehicle by fusing the multi-source sensor information under the linear model is specifically as follows:
step 301: more accurate unmanned vehicle position information is obtained according to a Kalman filtering algorithm, and the specific process of the Kalman filtering algorithm is divided into five links, namely state prediction, error covariance prediction, kalman gain update, state update and error covariance update:
x̂_k⁻ = A·x̂_{k-1} + B·u_{k-1}

P_k⁻ = A·P_{k-1}·Aᵀ + Q

K_k = P_k⁻·Hᵀ·(H·P_k⁻·Hᵀ + R)⁻¹

x̂_k = x̂_k⁻ + K_k·(z_k − H·x̂_k⁻)

P_k = (I − K_k·H)·P_k⁻

wherein x̂_k⁻ represents the predicted value of the state vector at time k, x̂_k represents the optimal estimate of the state vector at time k, P_k⁻ represents the predicted value of the error covariance at time k, P_k represents the optimal estimate of the error covariance at time k, K_k is the Kalman gain at time k, z_k is the observation at time k, Q is the process noise covariance matrix, R is the observation noise covariance matrix, u_{k-1} represents the system input at time k-1, A represents the state transition matrix of the system, H represents the observation matrix of the system, I represents the identity matrix, and B represents the control input matrix;
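The five links above map directly onto one filter cycle. A minimal sketch (illustrative names, NumPy assumed; not the patent's own implementation):

```python
import numpy as np

def kalman_step(x, P, z, u, A, B, H, Q, R):
    """One cycle of the five links: state prediction, error covariance
    prediction, Kalman gain update, state update, error covariance update."""
    x_pred = A @ x + B @ u                      # state prediction
    P_pred = A @ P @ A.T + Q                    # error covariance prediction
    S = H @ P_pred @ H.T + R                    # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)         # Kalman gain update
    x_new = x_pred + K @ (z - H @ x_pred)       # state update
    P_new = (np.eye(len(x)) - K @ H) @ P_pred   # error covariance update
    return x_new, P_new
```

Running one step with the constant acceleration matrices shows the covariance trace shrinking after the measurement update, as expected.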
for the constant velocity model, the expressions of the observation matrix H and the state transition matrix A obtained according to the state equation and the observation equation of the constant velocity model are respectively:

H = [1 0 0 0
     0 1 0 0]

A = [1 0 0 cos(θ)·Δt
     0 1 0 sin(θ)·Δt
     0 0 1 0
     0 0 0 1]

wherein Δt is the time interval and θ is the heading angle of the vehicle;
initializing a covariance matrix P as an identity matrix, and initializing a process noise matrix Q and an observation noise matrix R:
Figure FDA00041892145100000311
Figure FDA00041892145100000312
for the constant acceleration model, the expressions of the state transition matrix A, the observation matrix H and the control input matrix B are obtained according to the state equation, the observation value and the system input of the constant acceleration model, and are respectively:

A = [1 Δt
     0 1]

H = [1 0]

B = [Δt²/2
     Δt]
wherein Δt is the time interval;
initializing a covariance matrix P as an identity matrix, and initializing a process noise matrix Q and an observation noise matrix R as follows:
Figure FDA0004189214510000043
R=1;
step 302: and comparing the obtained optimal estimated value with the true value, and calculating the variance before and after filtering.
4. The method for positioning the unmanned vehicle by fusing the multi-source sensor information according to claim 1, wherein in the step 5, the process of acquiring the unmanned vehicle positioning method by fusing the multi-source sensor information under the kinematic model specifically comprises the following steps:
step 501: respectively adopting an extended Kalman algorithm, an unscented Kalman algorithm and a particle filter algorithm to carry out fusion positioning;
step 502: the three filtering algorithms are comprehensively compared, the RMS error is calculated to select the optimal positioning algorithm, and the calculation formula of the RMS error is as follows:
error_rms = |X − X_est|

wherein error_rms is the filtering error, X is the true value, comprising the true values of the abscissa and the ordinate of the vehicle, and X_est comprises the optimal estimated values of the abscissa and the ordinate of the vehicle after filtering;
step 503: and selecting a corresponding filtering algorithm with the minimum RMS error, and acquiring a fusion positioning result under the nonlinear model based on the filtering algorithm.
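Steps 502 and 503 can be sketched as below. Note that the claimed "RMS error" is a per-axis absolute error; the function names are illustrative, not from the patent:

```python
import numpy as np

def filtering_error(truth, estimate):
    """Per-axis absolute error |X - X_est| as defined in the claim."""
    return np.abs(np.asarray(truth) - np.asarray(estimate))

def best_algorithm(truth, estimates):
    """Pick the filtering algorithm whose mean error over all axes is smallest."""
    scores = {name: filtering_error(truth, est).mean()
              for name, est in estimates.items()}
    return min(scores, key=scores.get)
```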
5. The method for locating an unmanned vehicle with multi-source sensor information integrated according to claim 1, wherein in step 6, the relationship between the speed and the angular velocity in the bicycle model is:
ẋ = v·cos(ψ + β)

ẏ = v·sin(ψ + β)

ψ̇ = (v·cos(β)/(l_f + l_r))·(tan(δ_f) − tan(δ_r))

wherein β is the slip angle, ẋ is the speed in the horizontal direction, ẏ is the speed in the vertical direction, ψ̇ is the angular velocity of the course angle, v is the centroid speed, ψ is the course angle, l_r and l_f are the rear suspension length and the front suspension length, and δ_f and δ_r are respectively the front wheel deflection angle and the rear wheel deflection angle;

the slip angle β is calculated as:

β = arctan((l_r·tan(δ_f) + l_f·tan(δ_r))/(l_f + l_r))

wherein l_r and l_f are the rear suspension length and the front suspension length, and δ_f and δ_r are respectively the front wheel deflection angle and the rear wheel deflection angle;
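A sketch of the slip angle and the kinematic bicycle rates, assuming the standard front-and-rear-steering form (the patent's own formulas are given only as images, and the function names are illustrative):

```python
import numpy as np

def slip_angle(delta_f, delta_r, l_f, l_r):
    """Slip angle beta from wheel deflection angles and suspension lengths."""
    return np.arctan((l_r * np.tan(delta_f) + l_f * np.tan(delta_r))
                     / (l_f + l_r))

def bicycle_rates(v, psi, beta, delta_f, delta_r, l_f, l_r):
    """Time derivatives (x_dot, y_dot, psi_dot) of the kinematic bicycle model."""
    x_dot = v * np.cos(psi + beta)
    y_dot = v * np.sin(psi + beta)
    psi_dot = (v * np.cos(beta) / (l_f + l_r)) * (np.tan(delta_f)
                                                  - np.tan(delta_r))
    return x_dot, y_dot, psi_dot
```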
the expression of the state vector of the bicycle model is:
Figure FDA0004189214510000055
wherein x_k, y_k, ψ_k, r, b, N, δ_r,k and ω_k are respectively the abscissa, the ordinate and the course angle of the vehicle at time k, the wheel radius, the vehicle wheelbase, the gear ratio, the front wheel deflection angle and the wheel rotation angular velocity; the wheel radius, the vehicle wheelbase and the gear ratio are kept unchanged, the front wheel deflection angle and the wheel rotation angular velocity of the vehicle are the system inputs, and dt is the time interval;
the state equation of the bicycle model is:
Figure FDA0004189214510000056
the observed values of the bicycle model are:
Figure FDA0004189214510000057
wherein ,xg,k and yg,k The abscissa and the ordinate of the vehicle observed by the GPS at the time k are respectively;
the observation equation of the bicycle model is as follows:
Figure FDA0004189214510000058
wherein B is a control input matrix.
6. The method for locating the unmanned vehicle by fusing the multi-source sensor information according to claim 5, wherein in the step 7, the process of obtaining the unmanned vehicle locating algorithm by fusing the multi-source sensor information under the bicycle model specifically comprises the following steps:
step 701: based on the public data set, adopting an extended Kalman algorithm, an unscented Kalman algorithm and a particle filtering algorithm to respectively obtain the optimal estimation value of the vehicle under the bicycle model;
step 702: error analysis is carried out on coordinates and course angles of the vehicle, and RMS errors corresponding to the abscissa, the ordinate and the course angles of the vehicle are calculated, wherein the calculation formula of the RMS errors is as follows:
error_rms = |X − X_est|

wherein error_rms is the filtering error, X is the true value, comprising the true values of the abscissa, the ordinate and the course angle of the vehicle, and X_est comprises the optimal estimated values of the abscissa, the ordinate and the course angle of the vehicle after filtering;
step 703: each RMS error is subjected to mean value processing to achieve normalization;
step 704: adding the three normalized RMS errors of each filtering algorithm to form a score, and comparing the scores of the filtering algorithms to evaluate the performance and effect of the three filtering algorithms, thereby selecting the optimal positioning algorithm with the minimum score;
step 705: and obtaining a fusion positioning result under the bicycle model based on an optimal positioning algorithm.
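Steps 703 and 704 above (mean normalization of each RMS error, then summing into one score per algorithm) can be sketched as follows (illustrative names, NumPy assumed):

```python
import numpy as np

def score(rms_errors_by_algorithm):
    """Normalize each error component by its mean across algorithms, then sum
    the normalized components into one score per algorithm (lower is better)."""
    algs = list(rms_errors_by_algorithm)
    errs = np.array([rms_errors_by_algorithm[a] for a in algs], dtype=float)
    normalized = errs / errs.mean(axis=0)  # mean-value processing per component
    totals = normalized.sum(axis=1)        # x + y + heading contributions
    return dict(zip(algs, totals))
```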
7. The method for locating an unmanned vehicle with multi-source sensor information integrated according to claim 6, wherein in the step 701, when the bicycle model is filtered by a filtering algorithm, the initial state vector X_0 takes the true value, the wheel radius r is taken as 0.425, the vehicle wheelbase b as 0.8 and the gear ratio N as 5; the process noise matrix Q, the observation noise matrix R and the initial error covariance matrix P_0 are initialized with the expressions respectively:
Figure FDA0004189214510000062
Figure FDA0004189214510000063
Figure FDA0004189214510000071
CN202210109679.2A 2022-01-29 2022-01-29 Unmanned vehicle positioning method integrating multi-source sensor information Active CN114440881B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210109679.2A CN114440881B (en) 2022-01-29 2022-01-29 Unmanned vehicle positioning method integrating multi-source sensor information

Publications (2)

Publication Number Publication Date
CN114440881A CN114440881A (en) 2022-05-06
CN114440881B (en) 2023-05-30

Family

ID=81372107

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210109679.2A Active CN114440881B (en) 2022-01-29 2022-01-29 Unmanned vehicle positioning method integrating multi-source sensor information

Country Status (1)

Country Link
CN (1) CN114440881B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116222544B (en) * 2023-05-09 2023-08-04 浙江大学湖州研究院 Automatic navigation and positioning method and device for feeding vehicle facing to feeding farm
CN117804452A (en) * 2023-12-07 2024-04-02 盐城中科高通量计算研究院有限公司 Monte Carlo algorithm-based charging platform vehicle positioning method

Citations (9)

Publication number Priority date Publication date Assignee Title
CA2716597A1 (en) * 2010-10-08 2012-04-08 Canadian Space Agency Apparatus and method for driftless attitude determination and reliable localization of vehicles
JP2012173190A (en) * 2011-02-23 2012-09-10 Seiko Epson Corp Positioning system and positioning method
CN103439731A (en) * 2013-08-29 2013-12-11 北京空间飞行器总体设计部 GPS/INS integrated navigation method based on unscented Kalman filtering
CN108931799A (en) * 2018-07-18 2018-12-04 兰州交通大学 Train combined positioning method and system based on the search of recurrence fast orthogonal
CN110243358A (en) * 2019-04-29 2019-09-17 武汉理工大学 The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion
CN110602647A (en) * 2019-09-11 2019-12-20 江南大学 Indoor fusion positioning method based on extended Kalman filtering and particle filtering
CN110956665A (en) * 2019-12-18 2020-04-03 中国科学院自动化研究所 Vehicle turning track bidirectional calculation method, system and device
CN113063429A (en) * 2021-03-18 2021-07-02 苏州华米导航科技有限公司 Self-adaptive vehicle-mounted combined navigation positioning method
CN113252033A (en) * 2021-06-29 2021-08-13 长沙海格北斗信息技术有限公司 Positioning method, positioning system and robot based on multi-sensor fusion


Non-Patent Citations (3)

Title
Vicent Girbés-Juan et al., "Asynchronous Sensor Fusion of GPS, IMU and CAN-Based Odometry for Heavy-Duty Vehicles", IEEE Transactions on Vehicular Technology, Vol. 70, No. 9, pp. 8617-8626 *
Li Xu et al., "Multi-sensor fusion methodology for enhanced land vehicle positioning", Information Fusion, Vol. 46, No. 3, pp. 51-62 *
Zhou Qun et al., "Research on a GPS/UWB/MARG cooperative positioning system based on fusion algorithms", Modern Electronics Technique, Vol. 41, No. 6, pp. 82-86 *

Also Published As

Publication number Publication date
CN114440881A (en) 2022-05-06

Similar Documents

Publication Publication Date Title
CN112083726B (en) Park-oriented automatic driving double-filter fusion positioning system
CN111307162B (en) Multi-sensor fusion positioning method for automatic driving scene
CN114440881B (en) Unmanned vehicle positioning method integrating multi-source sensor information
CN104061899B (en) A kind of vehicle side inclination angle based on Kalman filtering and angle of pitch method of estimation
CN113819914B (en) Map construction method and device
CN113386781B (en) Intelligent vehicle track tracking control method based on data-driven vehicle dynamics model
CN111595592B (en) Performance evaluation method of adaptive cruise control system
CN109466558B (en) Road adhesion coefficient estimation method based on EKF (extended Kalman Filter) and BP (Back propagation) neural network
CN111426318A (en) Low-cost AHRS course angle compensation method based on quaternion-extended Kalman filtering
CN112505737B (en) GNSS/INS integrated navigation method
CN113340298B (en) Inertial navigation and dual-antenna GNSS external parameter calibration method
CN104215262A (en) On-line dynamic inertia sensor error identification method of inertia navigation system
CN113063429A (en) Self-adaptive vehicle-mounted combined navigation positioning method
CN114966629A (en) Vehicle body laser radar external reference calibration method based on EKF algorithm framework
CN111751857A (en) Vehicle pose estimation method, device, storage medium and system
CN112099378B (en) Front vehicle lateral motion state real-time estimation method considering random measurement time lag
CN114475581B (en) Automatic parking positioning method based on wheel speed pulse and IMU Kalman filtering fusion
CN114912061B (en) Accurate assessment method for lane keeping auxiliary system of commercial vehicle
CN103395435A (en) Real-time positioning system method of high-precision and high-speed train
CN115060257A (en) Vehicle lane change detection method based on civil-grade inertia measurement unit
CN113008229B (en) Distributed autonomous integrated navigation method based on low-cost vehicle-mounted sensor
CN109033017B (en) Vehicle roll angle and pitch angle estimation method under packet loss environment
CN113029173A (en) Vehicle navigation method and device
CN108413923B (en) Vehicle roll angle and pitch angle estimation method based on robust hybrid filtering
CN114543793B (en) Multi-sensor fusion positioning method of vehicle navigation system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant