CN114440881A - Unmanned vehicle positioning method integrating multi-source sensor information - Google Patents


Info

Publication number
CN114440881A
CN114440881A (publication) · CN202210109679.2A (application)
Authority
CN
China
Prior art keywords
vehicle
model
matrix
algorithm
time
Prior art date
Legal status
Granted
Application number
CN202210109679.2A
Other languages
Chinese (zh)
Other versions
CN114440881B (en)
Inventor
张卫东
刘馨泽
王咸鹏
张永辉
黄梦醒
沈重
Current Assignee
Hainan University
Original Assignee
Hainan University
Priority date
Filing date
Publication date
Application filed by Hainan University filed Critical Hainan University
Priority to CN202210109679.2A priority Critical patent/CN114440881B/en
Publication of CN114440881A publication Critical patent/CN114440881A/en
Application granted granted Critical
Publication of CN114440881B publication Critical patent/CN114440881B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10 Navigation by using measurements of speed or acceleration
    • G01C21/12 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Inertial navigation combined with non-inertial navigation instruments
    • G01C21/20 Instruments for performing navigational calculations
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The invention relates to an unmanned vehicle positioning method fusing multi-source sensor information, which comprises the following steps. Step 1: acquiring GPS position information and IMU information of the vehicle to be controlled, and taking the centimeter-level positioning obtained by combining the inertial navigation system with the GPS signal as the true value; step 2: establishing a linear model; step 3: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the linear model, and obtaining a fusion positioning result under the linear model; step 4: establishing a nonlinear model; step 5: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the kinematic model, and obtaining a fusion positioning result under the nonlinear model; step 6: establishing a bicycle model; step 7: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the bicycle model, and obtaining a fusion positioning result under the bicycle model. Compared with the prior art, the invention has the advantages of improved unmanned vehicle positioning accuracy, strong applicability and the like.

Description

Unmanned vehicle positioning method integrating multi-source sensor information
Technical Field
The invention relates to the technical field of automatic driving, in particular to an unmanned vehicle positioning method fusing multi-source sensor information.
Background
With China's economic growth in recent years, automobiles are no longer merely convenient means of transport; their intelligent networking has become a further demand, so unmanned vehicles have attracted wide attention from researchers and industry for their portability and extensibility. Unmanned vehicle positioning is a basic module of research in the unmanned-vehicle field, and different perception sensors require different research methods. Schemes on the market that rely on GPS alone work in all weather and cover a large area, but produce large errors when the satellite signal is interfered with; schemes that rely on the IMU alone are a relative positioning technique whose error accumulates and grows over time.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide an unmanned vehicle positioning method fusing multi-source sensor information.
The purpose of the invention can be realized by the following technical scheme:
An unmanned vehicle positioning method fusing multi-source sensor information comprises the following steps:
Step 1: acquiring GPS position information and IMU information of the vehicle to be controlled, and taking the centimeter-level positioning obtained by combining the inertial navigation system with the GPS signal as the true value for verifying the fused positioning accuracy;
Step 2: establishing a linear model according to the GPS position information and the IMU information;
Step 3: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the linear model, and obtaining a fusion positioning result under the linear model;
Step 4: establishing a nonlinear model according to the GPS position information and the IMU information;
Step 5: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the nonlinear model, and obtaining a fusion positioning result under the nonlinear model;
Step 6: further deepening the nonlinearity of the nonlinear model and modeling again to obtain a bicycle model;
Step 7: acquiring an unmanned vehicle positioning algorithm fusing multi-source sensor information under the bicycle model, and obtaining a fusion positioning result under the bicycle model.
In step 1, the GPS information is collected by a GNSS antenna mounted on the roof of the vehicle and the IMU information is collected by an inertial navigation unit; the GPS information comprises the longitude and latitude of the vehicle, which are converted into the abscissa and ordinate of the vehicle, and the IMU information comprises the three-axis angular velocity and acceleration of the vehicle.
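The text does not state which projection is used to turn longitude and latitude into plane coordinates. A minimal sketch, assuming a local equirectangular (flat-earth) approximation about a reference point and the WGS-84 equatorial radius; the function name and the reference-point convention are ours, not the patent's:

```python
import math

EARTH_R = 6378137.0  # WGS-84 equatorial radius in meters (assumed choice)

def latlon_to_xy(lat, lon, lat0, lon0):
    # Local equirectangular approximation: valid for small areas around
    # the reference point (lat0, lon0); returns (abscissa, ordinate) in meters.
    x = math.radians(lon - lon0) * EARTH_R * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * EARTH_R
    return x, y
```

For centimeter-level work over larger areas a proper projected coordinate system (e.g. a UTM zone) would be used instead, but the flat-earth form shows the conversion the text refers to.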
In step 2, the linear model includes a constant velocity model and a constant acceleration model, wherein the expression of the state vector of the constant velocity model is as follows:
$$\mathbf{x}(t) = \begin{bmatrix} x & y & \theta & v \end{bmatrix}^T$$
wherein x, y, theta and v are respectively an abscissa, an ordinate, a vehicle course angle and a speed of the vehicle position at the time t;
the expressions of the state equation and the observation equation of the constant speed model are respectively as follows:
$$\mathbf{x}(t+\Delta t) = \begin{bmatrix} x + v\cos\theta\,\Delta t \\ y + v\sin\theta\,\Delta t \\ \theta \\ v \end{bmatrix}, \qquad \mathbf{z}(t) = \begin{bmatrix} x_{gps} \\ y_{gps} \end{bmatrix}$$
wherein Δt is the time interval, x(t+Δt) is the state vector after the time Δt has elapsed, z(t) is the observation obtained by the GPS, and x_gps and y_gps are respectively the abscissa and ordinate of the vehicle position obtained by the GPS;
the expression of the state vector of the constant acceleration model is:
$$\mathbf{x}(t) = \begin{bmatrix} s & v \end{bmatrix}^T$$
wherein s and v are respectively the displacement and the speed of the vehicle at the moment t;
the expressions of the state equation and the observation equation of the constant acceleration model are respectively as follows:
$$\begin{bmatrix} s(t+\Delta t) \\ v(t+\Delta t) \end{bmatrix} = \begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix} \begin{bmatrix} s(t) \\ v(t) \end{bmatrix} + \begin{bmatrix} \Delta t^2/2 \\ \Delta t \end{bmatrix} a, \qquad z(t) = s_{gps}$$
wherein a is the acceleration of the vehicle, the left-hand side is the state vector after the time Δt has elapsed, s(t) and v(t) are the displacement and speed of the vehicle at time t, and the observed value z(t) is the displacement s_gps measured by the GPS;
the constant acceleration model takes the values measured by the accelerometer as system inputs:
u(k)=a(k)
where u (k) is the system input and a (k) is the value measured by the accelerometer.
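The two linear models above reduce to simple one-step propagation functions; a minimal illustration (variable names are ours, not the patent's):

```python
import math

def cv_step(x, y, theta, v, dt):
    # Constant-velocity model: heading theta and speed v stay fixed,
    # the position advances along the heading over the interval dt.
    return x + v * math.cos(theta) * dt, y + v * math.sin(theta) * dt

def ca_step(s, v, a, dt):
    # Constant-acceleration model: s' = s + v*dt + a*dt^2/2, v' = v + a*dt,
    # with the accelerometer reading a as the system input u(k).
    return s + v * dt + 0.5 * a * dt * dt, v + a * dt
```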
In the step 3, the unmanned vehicle positioning method fusing multi-source sensor information under the linear model specifically comprises the following steps:
step 301: obtaining more accurate unmanned vehicle position information according to a Kalman filtering algorithm, wherein the Kalman filtering algorithm comprises five steps of state prediction, error covariance prediction, Kalman gain updating, state updating and error covariance updating:
$$\hat{x}_k^- = A\hat{x}_{k-1} + Bu_{k-1}$$
$$P_k^- = AP_{k-1}A^T + Q$$
$$K_k = P_k^-H^T\left(HP_k^-H^T + R\right)^{-1}$$
$$\hat{x}_k = \hat{x}_k^- + K_k\left(z_k - H\hat{x}_k^-\right)$$
$$P_k = (I - K_kH)P_k^-$$
wherein \(\hat{x}_k^-\) denotes the predicted state vector at time k, \(\hat{x}_k\) the optimal state estimate at time k, \(P_k^-\) the predicted error covariance at time k, \(P_k\) the optimal error covariance estimate at time k, \(K_k\) the Kalman gain at time k, Q the process noise covariance matrix, R the observation noise covariance matrix, \(u_{k-1}\) the system input at time k-1, A the state transition matrix of the system, H the observation matrix of the system, I the identity matrix, and B the control input matrix;
for the constant velocity model, the observation matrix H and the state transition matrix A obtained from its state equation and observation equation are respectively:
$$H = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}, \qquad A = \begin{bmatrix} 1 & 0 & 0 & \Delta t\cos\theta \\ 0 & 1 & 0 & \Delta t\sin\theta \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
wherein Δt is the time interval and θ is the course angle of the vehicle;
the covariance matrix P is initialized to the identity matrix, and the process noise matrix Q and the observation noise matrix R are initialized to fixed matrices (their numeric values appear only as figures in the original);
for the constant acceleration model, the state transition matrix A, observation matrix H and control input matrix B obtained from its state equation, observed value and system input are respectively:
$$A = \begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix}, \qquad H = \begin{bmatrix} 1 & 0 \end{bmatrix}, \qquad B = \begin{bmatrix} \Delta t^2/2 \\ \Delta t \end{bmatrix}$$
wherein Δt is the time interval;
the covariance matrix P is initialized to the identity matrix, the process noise matrix Q is initialized to a fixed matrix (its numeric values appear only as a figure in the original), and the observation noise is initialized as R = 1;
step 302: comparing the obtained optimal estimate with the true value, and calculating the variance before and after filtering.
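The five-step recursion of step 301, specialized to the constant-acceleration model, can be sketched as follows. This is a minimal pure-Python illustration; the noise values q and r are placeholder assumptions, since the patent's Q and R appear only as figures:

```python
# State x = [s, v]; input u = accelerometer reading; observation z = GPS displacement.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def madd(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def msub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def kf_step(x, P, u, z, dt, q=0.01, r=1.0):
    A = [[1.0, dt], [0.0, 1.0]]           # state transition matrix
    B = [[0.5 * dt * dt], [dt]]           # control input matrix
    H = [[1.0, 0.0]]                      # observe displacement only
    Q = [[q, 0.0], [0.0, q]]              # assumed process noise
    # state prediction and error covariance prediction
    xp = madd(matmul(A, x), [[B[0][0] * u], [B[1][0] * u]])
    Pp = madd(matmul(matmul(A, P), transpose(A)), Q)
    # Kalman gain update (innovation covariance S is scalar here)
    S = matmul(matmul(H, Pp), transpose(H))[0][0] + r
    K = [[Pp[0][0] / S], [Pp[1][0] / S]]
    # state update and error covariance update
    y = z - xp[0][0]
    xn = [[xp[0][0] + K[0][0] * y], [xp[1][0] + K[1][0] * y]]
    IKH = msub([[1.0, 0.0], [0.0, 1.0]], matmul(K, H))
    return xn, matmul(IKH, Pp)

x, P = [[0.0], [0.0]], [[1.0, 0.0], [0.0, 1.0]]
x, P = kf_step(x, P, u=1.0, z=0.6, dt=1.0)
```

One step with u = 1 and z = 0.6 pulls the predicted displacement 0.5 part-way toward the measurement, which is exactly the gain-weighted correction of the fourth recursion equation.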
In the step 4, the nonlinear model is a kinematic model based on constant acceleration and angular velocity, and the expression of the state vector of the kinematic model is as follows:
$$\mathbf{x}_k = \begin{bmatrix} x_k & y_k & \psi_k & \omega_k & v_k & a_k \end{bmatrix}^T$$
wherein x_k, y_k, ψ_k, ω_k, v_k and a_k are respectively the abscissa, ordinate, course angle, angular velocity, speed and acceleration of the vehicle at time k;
the state equation and observation equation of the kinematic model are respectively:
$$\mathbf{x}_k = \begin{bmatrix} x_{k-1} + v_{k-1}\cos(\psi_{k-1})\,dt \\ y_{k-1} + v_{k-1}\sin(\psi_{k-1})\,dt \\ \psi_{k-1} + \omega_{k-1}\,dt \\ \omega_{k-1} \\ v_{k-1} + a_{k-1}\,dt \\ a_{k-1} \end{bmatrix}, \qquad \mathbf{z}_k = \begin{bmatrix} x_{g,k} \\ y_{g,k} \\ a_{a,k} \\ \omega_{g,k} \end{bmatrix}$$
wherein \(\mathbf{x}_{k-1}\) is the state vector of the vehicle at time k-1, \(v_{k-1}\) is the speed of the vehicle at time k-1, dt is the time interval, \(\mathbf{z}_k\) is the observed value, \(x_{g,k}\) and \(y_{g,k}\) are respectively the abscissa and ordinate of the vehicle observed by the GPS at time k, \(a_{a,k}\) is the acceleration obtained by the accelerometer, and \(\omega_{g,k}\) is the angular velocity measured by the gyroscope.
In the step 5, the process of obtaining the unmanned vehicle positioning method fusing multi-source sensor information under the kinematics model specifically comprises the following steps:
step 501: performing fusion positioning by respectively adopting an extended Kalman algorithm, an unscented Kalman algorithm and a particle filter algorithm;
step 502: comprehensively comparing the three filtering algorithms by calculating the RMS error, wherein the RMS error is calculated as:
$$error_{rms} = |X - X_{est}|$$
wherein error_rms is the filtered error, X is the true value, comprising the true abscissa and ordinate of the vehicle, and X_est comprises the corresponding optimal estimates of the abscissa and ordinate of the filtered vehicle;
step 503: selecting the filtering algorithm with the smallest RMS error, and obtaining the fusion positioning result under the nonlinear model based on that filtering algorithm.
In step 501, the extended Kalman filter replaces the linear matrices A and H of the Kalman filter with the Jacobian matrices of the nonlinear state and observation functions; its prediction equations are:
$$\hat{x}_k^- = f(\hat{x}_{k-1}, u_k)$$
$$P_k^- = F_{k-1}P_{k-1}F_{k-1}^T + Q$$
wherein \(F_{k-1}\) and \(H_k\) are respectively the Jacobian matrices of the state and observation equations, \(u_k\) is the system input at time k, and \(x_{k-1}\) is the state vector at time k-1;
for the kinematic model, the Jacobian matrices \(F_{k-1}\) and \(H_k\) are respectively:
$$F_{k-1} = \begin{bmatrix} 1 & 0 & -v_{k-1}\sin(\psi_{k-1})\,dt & 0 & \cos(\psi_{k-1})\,dt & 0 \\ 0 & 1 & v_{k-1}\cos(\psi_{k-1})\,dt & 0 & \sin(\psi_{k-1})\,dt & 0 \\ 0 & 0 & 1 & dt & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & dt \\ 0 & 0 & 0 & 0 & 0 & 1 \end{bmatrix}, \qquad H_k = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 & 0 \end{bmatrix}$$
wherein \(\psi_{k-1}\) is the heading angle of the vehicle at time k-1, \(v_{k-1}\) is the speed of the vehicle at time k-1, and dt is the time interval;
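The Jacobian of the kinematic state equation can be checked against a finite-difference approximation; a sketch with the state ordering [x, y, ψ, ω, v, a] used above (function names and tolerances are ours):

```python
import math

def f(s, dt):
    # Kinematic transition for state s = [x, y, psi, omega, v, a].
    x, y, psi, om, v, a = s
    return [x + v * math.cos(psi) * dt,
            y + v * math.sin(psi) * dt,
            psi + om * dt,
            om,
            v + a * dt,
            a]

def jacobian_F(s, dt):
    # Analytic Jacobian of f, matching the matrix F_{k-1} in the text.
    _, _, psi, _, v, _ = s
    F = [[1.0 if i == j else 0.0 for j in range(6)] for i in range(6)]
    F[0][2] = -v * math.sin(psi) * dt
    F[0][4] = math.cos(psi) * dt
    F[1][2] = v * math.cos(psi) * dt
    F[1][4] = math.sin(psi) * dt
    F[2][3] = dt
    F[4][5] = dt
    return F

def numeric_F(s, dt, eps=1e-6):
    # Forward finite-difference approximation, used to verify jacobian_F.
    base = f(s, dt)
    J = [[0.0] * 6 for _ in range(6)]
    for j in range(6):
        sp = list(s)
        sp[j] += eps
        fp = f(sp, dt)
        for i in range(6):
            J[i][j] = (fp[i] - base[i]) / eps
    return J
```

Agreement of the two matrices to within the finite-difference error confirms that the linearization matches the state equation it claims to linearize.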
the specific filtering process of the unscented Kalman filtering algorithm is as follows:
step 1a: initializing the state vector \(x_0\), the state estimation error covariance \(P_0\), the process noise matrix Q and the observation noise matrix R; for the kinematic model, the initial value of the state vector \(\hat{x}_0\) is taken as the true value, and Q, R and the initial error covariance matrix \(P_0\) are initialized to fixed matrices (their numeric values appear only as figures in the original);
step 2a: according to the state vector \(x_{k-1}\) and error covariance matrix \(P_{k-1}\) at the previous moment, selecting the sampling points at the current moment and assigning their weights;
step 3 a: selecting 2n +1 sampling points, calculating the mean value and covariance of all the sampling points at the current moment through a state equation of the system, and finishing time updating;
step 4 a: measuring and updating the sampling points through an observation equation of the system;
step 5 a: calculating Kalman gain through prediction and measurement updating, and finishing filtering updating;
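Steps 2a and 3a (selecting the 2n+1 sigma points and recovering the mean from their weights) can be sketched for the simple case of a diagonal covariance, where the matrix square root is element-wise. The κ = 3 − n weighting is Julier's classical choice, assumed here because the patent does not state one:

```python
import math

def sigma_points_diag(x, P_diag, kappa=None):
    # Unscented-transform sampling for a diagonal covariance P.
    # Returns 2n+1 sigma points and their weights.
    n = len(x)
    lam = (3 - n) if kappa is None else kappa
    pts = [list(x)]
    w = [lam / (n + lam)]
    for i in range(n):
        d = math.sqrt((n + lam) * P_diag[i])
        plus, minus = list(x), list(x)
        plus[i] += d
        minus[i] -= d
        pts += [plus, minus]
        w += [1.0 / (2 * (n + lam))] * 2
    return pts, w

pts, w = sigma_points_diag([1.0, 2.0], [0.04, 0.09])
# The weighted mean of the sigma points reproduces the original mean.
mean = [sum(wi * p[j] for wi, p in zip(w, pts)) for j in range(2)]
```

In the full filter these points are pushed through the state and observation equations (steps 3a and 4a), and the same weighted sums give the predicted mean and covariance without computing any Jacobian.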
the particle filter algorithm adopts a Monte Carlo method to solve the problem of nonlinear filtering, and the filtering process of the particle filter algorithm is as follows:
step 1b: initializing N particles \(\{x_0^i,\ i = 1, \dots, N\}\) and uniformly setting the weight \(w_0^i\) of each particle to 1/N;
step 2b: propagating the particles \(\{x_k^i\}\) through the state equation, calculating the likelihood probability \(q(x_k^i)\) of each particle, updating the weight of each particle from \(w_{k-1}^i\) to \(w_k^i\) and normalizing, wherein the weight update formula of the particles is:
$$w_k^i = w_{k-1}^i\,q(x_k^i), \qquad q(x_k^i) \propto \exp\left(-\tfrac{1}{2}\left(z_k - \hat{z}_k^i\right)^T R_k^{-1}\left(z_k - \hat{z}_k^i\right)\right)$$
wherein \(w_k^i\) is the weight of the i-th particle, \(q(x_k^i)\) is its likelihood probability, \(z_k\) is the observed value of the system at time k, \(\hat{z}_k^i\) is the output value of the observation equation for the i-th particle at time k, and \(R_k\) is the observation noise covariance matrix at time k;
step 3b: resampling \(\{x_k^i, w_k^i\}\) according to the effective particle number \(N_{eff}\), thereby obtaining a new particle swarm whose particle weights are all 1/N;
and 4 b: and calculating the filtering state estimation and the variance estimation of the current moment by adopting the weight and the state of the resampled particles.
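Steps 2b and 3b can be sketched as a weight update with a Gaussian likelihood (an assumed form consistent with the quantities defined above, since the original formula is a figure) followed by the effective-particle-number computation:

```python
import math

def update_weights(weights, particles, z, h, r_var):
    # Scale each particle's weight by its Gaussian likelihood for a
    # scalar observation z with observation function h and variance r_var,
    # then normalize so the weights sum to one.
    new_w = []
    for w, x in zip(weights, particles):
        innov = z - h(x)
        new_w.append(w * math.exp(-0.5 * innov * innov / r_var))
    total = sum(new_w)
    return [w / total for w in new_w]

def effective_n(weights):
    # N_eff = 1 / sum(w_i^2): resampling is triggered when this drops
    # below a threshold (commonly N/2; the patent does not fix one).
    return 1.0 / sum(w * w for w in weights)
```

With four particles of which two sit on the observation and two far from it, the far particles' weights collapse and N_eff falls to about 2, signaling that resampling is needed.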
In step 6, the relationship between the speed and the angular velocity in the bicycle model is as follows:
$$\dot{x} = v\cos(\psi + \beta)$$
$$\dot{y} = v\sin(\psi + \beta)$$
$$\dot{\psi} = \frac{v\cos\beta}{l_f + l_r}\left(\tan\delta_f - \tan\delta_r\right)$$
wherein β is the slip angle, \(\dot{x}\) is the speed in the horizontal direction, \(\dot{y}\) is the speed in the vertical direction, \(\dot{\psi}\) is the angular velocity of the course angle, v is the centroid speed, ψ is the course angle, \(l_r\) and \(l_f\) are respectively the rear overhang length and front overhang length, and \(\delta_f\) and \(\delta_r\) are respectively the front wheel deflection angle and rear wheel deflection angle;
the slip angle β is calculated as:
$$\beta = \arctan\left(\frac{l_r\tan\delta_f + l_f\tan\delta_r}{l_f + l_r}\right)$$
wherein \(l_r\), \(l_f\), \(\delta_f\) and \(\delta_r\) are as defined above;
the expression of the state vector of the bicycle model is:
$$\mathbf{x}_k = \begin{bmatrix} x_k & y_k & \psi_k \end{bmatrix}^T$$
wherein \(x_k\), \(y_k\), \(\psi_k\), r, b, N, \(\delta_{r,k}\) and \(\omega_k\) respectively represent the abscissa, ordinate, course angle, wheel radius, vehicle wheelbase, gear ratio, front wheel deflection angle and wheel rotation angular speed of the vehicle at time k; the wheel radius, vehicle wheelbase and gear ratio are kept constant, the front wheel deflection angle and wheel rotation angular speed are taken as system inputs, and dt is the time interval;
the state equation of the bicycle model is:
$$\begin{bmatrix} x_k \\ y_k \\ \psi_k \end{bmatrix} = \begin{bmatrix} x_{k-1} + \frac{r\,\omega_k}{N}\cos(\psi_{k-1})\,dt \\ y_{k-1} + \frac{r\,\omega_k}{N}\sin(\psi_{k-1})\,dt \\ \psi_{k-1} + \frac{r\,\omega_k}{N\,b}\tan(\delta_{r,k})\,dt \end{bmatrix}$$
the observed values of the bicycle model are:
$$\mathbf{z}_k = \begin{bmatrix} x_{g,k} \\ y_{g,k} \end{bmatrix}$$
wherein \(x_{g,k}\) and \(y_{g,k}\) are respectively the abscissa and ordinate of the vehicle observed by the GPS at time k;
the observation equation of the bicycle model is:
$$\mathbf{z}_k = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} \mathbf{x}_k$$
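A one-step sketch of the bicycle-model propagation with the constants used later (r = 0.425, b = 0.8, N = 5). The discrete form is a standard kinematic-bicycle reconstruction, since the original state equation is an image:

```python
import math

def bicycle_step(x, y, psi, delta, omega, dt, r=0.425, b=0.8, N=5.0):
    # Wheel angular speed omega and gear ratio N give the ground speed
    # v = r * omega / N; the wheel deflection delta drives the heading change.
    v = r * omega / N
    return (x + v * math.cos(psi) * dt,
            y + v * math.sin(psi) * dt,
            psi + (v / b) * math.tan(delta) * dt)
```

With zero steering the vehicle travels straight at v = 0.425 · ω / 5; any nonzero delta curves the heading through the tan(δ)/b term, which is where the stronger nonlinearity of this model comes from.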
in step 7, the process of obtaining the unmanned vehicle positioning algorithm fusing multi-source sensor information under the single vehicle model specifically comprises the following steps:
step 701: respectively obtaining the optimal estimation values of the vehicle under the bicycle model by adopting an extended Kalman algorithm, an unscented Kalman algorithm and a particle filter algorithm based on a public data set;
step 702: performing error analysis on the coordinates and heading angle of the vehicle, and calculating the RMS errors corresponding to the abscissa, ordinate and heading angle, wherein the RMS error is calculated as:
$$error_{rms} = |X - X_{est}|$$
wherein error_rms is the filtered error, X is the true value, comprising the true abscissa, ordinate and heading angle of the vehicle, and X_est comprises the corresponding optimal estimates of the abscissa, ordinate and heading angle of the filtered vehicle;
step 703: normalizing each RMS error by its mean value;
step 704: adding the three normalized RMS errors of each filtering algorithm to form a score, evaluating the performance of the three filtering algorithms by comparing their scores, and selecting the algorithm with the smallest score as the optimal positioning algorithm;
step 705: and obtaining a fusion positioning result under the bicycle model based on the optimal positioning algorithm.
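Steps 702 to 704 (mean-normalizing each RMS error component across the algorithms, then summing the three normalized components into one score per algorithm) can be sketched as:

```python
def score_algorithms(errors):
    # errors: {algorithm_name: (rms_x, rms_y, rms_heading)}.
    # Each component is divided by its mean over all algorithms so that
    # position and heading errors contribute on a comparable scale,
    # then the three normalized components are summed into a score.
    n = len(next(iter(errors.values())))
    means = [sum(e[j] for e in errors.values()) / len(errors) for j in range(n)]
    scores = {alg: sum(e[j] / means[j] for j in range(n))
              for alg, e in errors.items()}
    return min(scores, key=scores.get), scores
```

The algorithm names and error tuples below are illustrative inputs, not results from the patent; the function simply returns whichever algorithm attains the smallest summed normalized error.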
In step 701, when the bicycle model is filtered by a filtering algorithm, the initial state vector \(\hat{x}_0\) is taken as the true value, the wheel radius is taken as r = 0.425, the vehicle wheelbase as b = 0.8 and the gear ratio as N = 5, and the process noise matrix Q, the observation noise matrix R and the initial error covariance matrix \(P_0\) are initialized to fixed matrices (their numeric values appear only as figures in the original).
Compared with the prior art, the invention has the following advantages:
1. The invention fuses GPS information and IMU information and exploits the complementarity of the two sensors, thereby obtaining a more accurate positioning result for the unmanned vehicle, improving the positioning accuracy of automatic driving in simple environments and enabling safe control of the vehicle;
2. The invention selects the optimal algorithm according to the degree of nonlinearity of the system to obtain the optimal positioning result, thereby enhancing the applicability of the unmanned vehicle positioning method.
Drawings
Fig. 1 is a linear model diagram used in the unmanned vehicle positioning method with multi-source sensor information fused according to the present invention.
Fig. 2 is an enlarged view of a fusion result of the unmanned vehicle positioning method for fusing multi-source sensor information, which is realized by the method and aims at a linear model.
Fig. 3 is a kinematic model diagram used in the unmanned vehicle positioning method with multi-source sensor information fused according to the present invention.
Fig. 4 is an enlarged view of a fusion result of the kinematics model implemented by the unmanned vehicle positioning method for fusing multi-source sensor information according to the present invention.
Fig. 5 is a comparison graph of filtering effects of three different filtering algorithms for a kinematic model, which is implemented by the unmanned vehicle positioning method with integration of multi-source sensor information provided by the invention.
Fig. 6 is a diagram of a single vehicle model used in the unmanned vehicle positioning method with multi-source sensor information fused according to the present invention.
Fig. 7 is a fusion result diagram for a single vehicle model, which is realized by the unmanned vehicle positioning method for fusing multi-source sensor information according to the present invention.
Fig. 8 is an enlarged view showing comparison of filtering effects of three different filtering algorithms for a single vehicle model, which is realized by the unmanned vehicle positioning method fusing multi-source sensor information according to the present invention.
FIG. 9 is a schematic flow chart of the method of the present invention.
Detailed Description
The invention is described in detail below with reference to the figures and the specific embodiments.
Examples
An unmanned vehicle positioning method fusing multi-source sensor information comprises the following steps:
step 1: acquiring GPS position information and IMU information of a vehicle to be controlled, and taking centimeter-level precision positioning obtained by combining the vehicle to be controlled with an inertial navigation system and a GPS signal as a true value to verify the fused positioning precision;
step 2: establishing a linear model according to the GPS position information and the IMU information;
step 3: obtaining the unmanned vehicle positioning method fusing multi-source sensor information under the linear model, and obtaining the fusion positioning result under the linear model;
step 4: establishing a nonlinear model according to the GPS position information and the IMU information;
step 5: obtaining the unmanned vehicle positioning method fusing multi-source sensor information under the kinematic model according to the nonlinear model, and obtaining the fusion positioning result under the nonlinear model;
step 6: further deepening the nonlinearity of the nonlinear model and modeling again to complete the establishment of the bicycle model;
step 7: obtaining the unmanned vehicle positioning method fusing multi-source sensor information under the bicycle model, and obtaining the fusion positioning result under the bicycle model.
In step 1, GPS information is collected by a GNSS antenna mounted on the roof of the vehicle, and angular velocity and acceleration information is collected by an inertial navigation unit.
In step 2, the linear model includes a constant velocity model and a constant acceleration model.
In step 3, the method for positioning the unmanned vehicle by fusing multi-source sensor information under the linear model specifically comprises the following steps:
the method comprises the steps that a linear model, IMU information and GPS information are used as input, the GPS position information and the IMU information are fused according to a Kalman filtering algorithm to obtain more accurate unmanned vehicle position information, and the more accurate unmanned vehicle position information is compared with a true value;
in step 4, the nonlinear model is a kinematic model of constant acceleration and angular velocity.
In step 5, the unmanned vehicle positioning method fusing multi-source sensor information under the kinematic model specifically comprises the following steps:
and completing fusion positioning by respectively adopting an extended Kalman filtering algorithm, an unscented Kalman filtering algorithm and a particle filtering algorithm, performing comprehensive comparison, and selecting an optimal positioning algorithm.
In step 6, the single vehicle model has stronger nonlinearity than the kinematic model and is more similar to the real-life vehicle model.
In step 7, the unmanned vehicle positioning method fusing multi-source sensor information under the single vehicle model specifically comprises the following steps:
aiming at the public data set, the influence of the nonlinearity degree of the bicycle model on the result accuracy of the unmanned vehicle fusion positioning algorithm is analyzed by adopting an extended Kalman filtering algorithm, an unscented Kalman filtering algorithm and a particle filtering algorithm, and the optimal positioning algorithm is obtained.
According to the invention, data are collected from the vehicle to be controlled by the GPS and the IMU. The system is first modeled linearly, and a more accurate positioning result is obtained by Kalman-filter fusion. The system is then modeled nonlinearly and tested on a real vehicle; under the nonlinear system, extended Kalman filtering, unscented Kalman filtering and particle filtering markedly improve the positioning accuracy of the unmanned vehicle. The nonlinearity of the model is further extended, and a bicycle model is adopted for experimental simulation and selection of the optimal filtering algorithm.
As shown in fig. 1 to 8, an unmanned vehicle positioning method fusing multi-source sensor information includes the following steps. GPS information and IMU information of the vehicle to be controlled are acquired, and the centimeter-level positioning obtained by combining the inertial navigation system with the GPS signal is taken as the true value and compared with the fusion positioning result. A linear constant-acceleration model is established, the fusion of the GPS position information and the IMU position information is realized by the Kalman filtering algorithm, the result is compared with the true value, and the variance before and after filtering is calculated. A kinematic model of constant acceleration and angular velocity is established, the fusion of GPS and IMU information is realized by the extended Kalman filtering algorithm, the unscented Kalman filtering algorithm and the particle filtering algorithm, the RMS errors are calculated, and the effects of the three algorithms are compared. The nonlinearity is then further deepened, a bicycle model is established, the fusion of GPS and IMU information is completed on a public data set, and error analysis is performed on the coordinate values and the course angle.
The invention allows the advantages and disadvantages of GPS information and IMU information to complement each other, realizing a more accurate unmanned vehicle positioning effect in simple environments. By comparing the effects of the three filtering algorithms under a nonlinear system, the most accurate positioning is obtained: the extended Kalman filtering algorithm is adopted in environments with weak nonlinearity, while the unscented Kalman filtering algorithm and the particle filtering algorithm obtain better results as the degree of nonlinearity of the environment increases. The invention therefore achieves a good fusion positioning effect in environments with different degrees of nonlinearity and has strong applicability.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. An unmanned vehicle positioning method fusing multi-source sensor information is characterized by comprising the following steps:
step 1: acquiring GPS position information and IMU information of a vehicle to be controlled, and taking centimeter-level precision positioning obtained by combining the vehicle to be controlled with an inertial navigation system and a GPS signal as a true value to verify the fused positioning precision;
step 2: establishing a linear model according to the GPS position information and the IMU information;
step 3: acquiring an unmanned vehicle positioning algorithm for fusing multi-source sensor information under the linear model, and obtaining a fusion positioning result under the linear model;
step 4: establishing a nonlinear model according to the GPS position information and the IMU information;
step 5: acquiring an unmanned vehicle positioning algorithm for fusing multi-source sensor information under the nonlinear model, and obtaining a fusion positioning result under the nonlinear model;
step 6: further deepening nonlinearity of the nonlinear model and modeling again to obtain a bicycle model;
step 7: acquiring an unmanned vehicle positioning algorithm for fusing multi-source sensor information under the single vehicle model, and obtaining a fusion positioning result under the single vehicle model.
2. The method as claimed in claim 1, wherein in step 1, GPS information is collected through a GNSS antenna installed on a roof of a vehicle, and IMU information is collected through an inertial navigation unit, wherein the GPS information includes longitude and latitude information of the vehicle, and the longitude and latitude information of the vehicle is converted into an abscissa and an ordinate of the vehicle, and the IMU information includes three-axis angular velocity and acceleration of the vehicle.
3. The method according to claim 1, wherein in step 2, the linear model comprises a constant velocity model and a constant acceleration model, and the expression of the state vector of the constant velocity model is as follows:
x(t) = [x, y, θ, v]^T
wherein x, y, theta and v are respectively an abscissa, an ordinate, a vehicle course angle and a speed of the vehicle position at the time t;
the expressions of the state equation and the observation equation of the constant speed model are respectively as follows:
x(t+Δt) = [x + v·cosθ·Δt,  y + v·sinθ·Δt,  θ,  v]^T
z(t) = [x_gps, y_gps]^T
wherein Δt is the time interval, x(t+Δt) is the state vector after the time Δt has elapsed, z(t) is the observation obtained by the GPS, and x_gps and y_gps are respectively the abscissa and the ordinate of the vehicle position obtained by the GPS;
the expression of the state vector of the constant acceleration model is:
x(t) = [s(t), v(t)]^T
wherein s and v are respectively the displacement and the speed of the vehicle at the time t;
the expressions of the state equation and the observation equation of the constant acceleration model are respectively as follows:
x(t+Δt) = [s(t) + v(t)·Δt + a·Δt²/2,  v(t) + a·Δt]^T
z(t) = s_gps(t)
wherein a is the acceleration of the vehicle, x(t+Δt) is the state vector after the time Δt, s(t) is the displacement of the vehicle at time t, v(t) is the speed of the vehicle at time t, and the observed value z(t) is the displacement measured by the GPS;
the constant acceleration model takes the values measured by the accelerometer as system inputs:
u(k)=a(k)
where u (k) is the system input and a (k) is the value measured by the accelerometer.
4. The unmanned vehicle positioning method fused with multi-source sensor information according to claim 3, wherein in the step 3, the unmanned vehicle positioning algorithm fusing multi-source sensor information under the linear model specifically comprises:
step 301: obtaining more accurate unmanned vehicle position information according to a Kalman filtering algorithm, wherein the Kalman filtering algorithm comprises five steps of state prediction, error covariance prediction, Kalman gain updating, state updating and error covariance updating:
x̂_k⁻ = A·x̂_{k-1} + B·u_{k-1}
P_k⁻ = A·P_{k-1}·Aᵀ + Q
K_k = P_k⁻·Hᵀ·(H·P_k⁻·Hᵀ + R)⁻¹
x̂_k = x̂_k⁻ + K_k·(z_k − H·x̂_k⁻)
P_k = (I − K_k·H)·P_k⁻
wherein x̂_k⁻ represents the predicted value of the state vector at time k, x̂_k represents the optimal estimate of the state vector at time k, P_k⁻ represents the predicted value of the error covariance at time k, P_k represents the optimal estimate of the error covariance at time k, K_k represents the Kalman gain at time k, Q is the process noise covariance matrix, R is the observation noise covariance matrix, u_{k-1} represents the system input at time k-1, z_k is the observation at time k, A represents the state transition matrix of the system, H represents the observation matrix of the system, I represents the identity matrix, and B is the control input matrix;
for the constant speed model, the expressions of an observation matrix H and a state transition matrix A obtained according to the state equation and the observation equation of the constant speed model are respectively as follows:
H =
[1  0  0  0]
[0  1  0  0]
A =
[1  0  0  cosθ·Δt]
[0  1  0  sinθ·Δt]
[0  0  1  0]
[0  0  0  1]
wherein Δt is the time interval and θ is the heading angle of the vehicle;
initializing a covariance matrix P into an identity matrix, and initializing a process noise matrix Q and an observation noise matrix R:
Figure FDA0003494729100000037
Figure FDA0003494729100000038
for the constant acceleration model, the expressions of a state transition matrix A, an observation matrix H and a control input matrix B obtained according to the state equation, the observation value and the system input of the constant acceleration model are respectively as follows:
A =
[1  Δt]
[0  1]
H=[1 0]
B =
[Δt²/2]
[Δt]
wherein Δ t is a time interval;
initializing the covariance matrix P as an identity matrix, and initializing a process noise matrix Q and an observation noise matrix R as follows:
Figure FDA00034947291000000311
R=1;
step 302: comparing the obtained optimal estimate with the true value and calculating the variance before and after filtering.
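The five Kalman filtering steps of this claim can be sketched for the constant-acceleration model as follows; this is an illustrative non-claim sketch, with the process noise Q and the step values chosen arbitrarily (the claim initializes P as the identity matrix and R = 1):

```python
import numpy as np

dt = 0.1                                   # time interval Δt (assumed)
A = np.array([[1.0, dt], [0.0, 1.0]])      # state transition matrix
B = np.array([[0.5 * dt ** 2], [dt]])      # control input matrix
H = np.array([[1.0, 0.0]])                 # GPS observes displacement only
Q = np.eye(2) * 1e-4                       # process noise (assumed value)
R = np.array([[1.0]])                      # observation noise, R = 1 as in the claim

def kf_step(x, P, u, z):
    x_pred = A @ x + B * u                                   # 1) state prediction
    P_pred = A @ P @ A.T + Q                                 # 2) error covariance prediction
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)   # 3) Kalman gain update
    x_new = x_pred + K @ (z - H @ x_pred)                    # 4) state update
    P_new = (np.eye(2) - K @ H) @ P_pred                     # 5) error covariance update
    return x_new, P_new

x, P = np.zeros((2, 1)), np.eye(2)         # P initialized as the identity matrix
x, P = kf_step(x, P, u=0.0, z=np.array([[1.0]]))
```

Each call fuses one accelerometer input u with one GPS displacement measurement z.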
5. The method according to claim 1, wherein in step 4, the nonlinear model is a kinematic model based on constant acceleration and angular velocity, and the expression of the state vector of the kinematic model is as follows:
x_k = [x_k, y_k, ψ_k, ω_k, v_k, a_k]^T
wherein x_k, y_k, ψ_k, ω_k, v_k and a_k are respectively the abscissa, the ordinate, the heading angle, the angular velocity, the speed and the acceleration of the vehicle at time k;
the expressions of the state equation and the observation equation of the kinematic model are respectively as follows:
x_k = [x_{k-1} + v_{k-1}·cosψ_{k-1}·dt,  y_{k-1} + v_{k-1}·sinψ_{k-1}·dt,  ψ_{k-1} + ω_{k-1}·dt,  ω_{k-1},  v_{k-1} + a_{k-1}·dt,  a_{k-1}]^T
z_k = [x_{g,k}, y_{g,k}, a_{a,k}, ω_{g,k}]^T
wherein x_{k-1} is the state vector of the vehicle at time k-1, v_{k-1} is the speed of the vehicle at time k-1, dt is the time interval, z_k is the observed value, x_{g,k} and y_{g,k} are respectively the abscissa and the ordinate of the vehicle observed by the GPS at time k, a_{a,k} is the acceleration obtained by the accelerometer, and ω_{g,k} is the angular velocity measured by the gyroscope.
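One propagation step of this kinematic model can be sketched as follows (an illustrative non-claim sketch; state order as in the claim, all numeric values assumed):

```python
import math

def propagate(x, y, psi, omega, v, a, dt):
    """One step of the constant-acceleration / constant-angular-velocity model."""
    return (x + v * math.cos(psi) * dt,    # abscissa
            y + v * math.sin(psi) * dt,    # ordinate
            psi + omega * dt,              # heading angle
            omega,                         # angular velocity (held constant)
            v + a * dt,                    # speed
            a)                             # acceleration (held constant)

state = propagate(0.0, 0.0, 0.0, 0.1, 5.0, 0.5, dt=0.1)  # assumed values
```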
6. The unmanned vehicle positioning method fusing multi-source sensor information according to claim 1, wherein in the step 5, the process of obtaining the unmanned vehicle positioning algorithm fusing multi-source sensor information under the kinematic model specifically comprises the following steps:
step 501: performing fusion positioning by respectively adopting an extended Kalman algorithm, an unscented Kalman algorithm and a particle filter algorithm;
step 502: comprehensively comparing the three filtering algorithms, calculating RMS error, and selecting the optimal positioning algorithm, wherein the RMS error has the calculation formula:
error_rms = |X − X_est|
wherein error_rms is the filtered error, X is the true value, including the true values of the abscissa and the ordinate of the vehicle, and X_est is respectively the optimal estimate of the abscissa and the ordinate of the filtered vehicle;
step 503: and selecting a corresponding filtering algorithm with the minimum RMS error, and acquiring a fusion positioning result under the nonlinear model based on the filtering algorithm.
7. The unmanned vehicle positioning method fusing multi-source sensor information according to claim 6, wherein in step 501, the specific formulas of the extended Kalman filtering algorithm comprise:
F_{k-1} = ∂f(x, u_k)/∂x |_{x = x_{k-1}}
H_k = ∂h(x)/∂x |_{x = x̂_k⁻}
wherein F_{k-1} and H_k are respectively the Jacobian matrices of the state equation f and the observation equation h, u_k is the system input at time k, and x_{k-1} is the state vector at time k-1; the Jacobian matrices F_{k-1} and H_k are respectively:
F_{k-1} =
[1  0  −v_{k-1}·sinψ_{k-1}·dt  0  cosψ_{k-1}·dt  0]
[0  1   v_{k-1}·cosψ_{k-1}·dt  0  sinψ_{k-1}·dt  0]
[0  0  1  dt  0  0]
[0  0  0  1   0  0]
[0  0  0  0  1  dt]
[0  0  0  0  0  1]
H_k =
[1  0  0  0  0  0]
[0  1  0  0  0  0]
[0  0  0  0  0  1]
[0  0  0  1  0  0]
wherein ψ_{k-1} is the heading angle of the vehicle at time k-1, v_{k-1} is the speed of the vehicle at time k-1, and dt is the time interval;
the specific filtering process of the unscented Kalman filtering algorithm is as follows:
step 1 a: initializing the state vector x_0, the state estimation error covariance P_0, the process noise matrix Q and the observation noise matrix R; for the kinematic model, the initial value of the state vector x_0 is taken as the true value, and the process noise matrix Q, the observation noise matrix R and the initial error covariance matrix P_0 are initialized as:
Figure FDA0003494729100000056
Figure FDA0003494729100000057
Figure FDA0003494729100000061
step 2 a: according to the state vector x_{k-1} and the error covariance matrix P_{k-1} at the previous moment, selecting the sampling points at the current moment and assigning their weights;
step 3 a: selecting 2n +1 sampling points, calculating the mean value and covariance of all the sampling points at the current moment through a state equation of the system, and finishing time updating;
step 4 a: measuring and updating the sampling points through an observation equation of the system;
step 5 a: calculating Kalman gain through prediction and measurement updating, and finishing filtering updating;
the particle filter algorithm adopts a Monte Carlo method to solve the problem of nonlinear filtering, and the filtering process of the particle filter algorithm is as follows:
step 1 b: initializing N particles x_0^i (i = 1, …, N) and uniformly setting the weight w_0^i of each particle to 1/N;
step 2 b: updating the particles x_k^i, calculating the likelihood probability p(z_k|x_k^i) of each particle, updating the weight w_{k-1}^i of each particle to w_k^i, and carrying out normalization processing, wherein the weight update formula of the particles is:
w_k^i = w_{k-1}^i · p(z_k|x_k^i)
wherein w_k^i is the weight of the particle, p(z_k|x_k^i) is the likelihood probability of the particle, z_k is the observed value of the system at time k, ẑ_k^i is the output value of the observation equation of the i-th particle at time k, and R_k is the observation noise covariance matrix at time k, the likelihood being evaluated from the innovation z_k − ẑ_k^i under the observation noise R_k;
step 3 b: resampling the particle set {x_k^i, w_k^i} and calculating the effective particle number N_eff, thereby obtaining a new particle group in which the weight of each particle is 1/N;
step 4 b: calculating the filtering state estimate and the variance estimate at the current moment by using the weights and states of the resampled particles.
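The particle-filter steps 1 b through 4 b can be sketched for a one-dimensional toy state as follows; this is an illustrative non-claim sketch, and the Gaussian likelihood, the process noise and the N_eff < N/2 resampling threshold are assumptions, not taken from the claims:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100

# step 1b: initialize N particles and set each weight uniformly to 1/N
particles = rng.normal(0.0, 1.0, N)
weights = np.full(N, 1.0 / N)

def pf_step(particles, weights, z, R=1.0):
    # step 2b: propagate particles, weight by the likelihood p(z | x_i), normalize
    particles = particles + rng.normal(0.0, 0.1, particles.size)
    weights = weights * np.exp(-0.5 * (z - particles) ** 2 / R)
    weights /= weights.sum()
    # step 3b: resample when the effective particle number N_eff drops too low
    n_eff = 1.0 / np.sum(weights ** 2)
    if n_eff < particles.size / 2:
        idx = rng.choice(particles.size, particles.size, p=weights)
        particles = particles[idx]
        weights = np.full(particles.size, 1.0 / particles.size)
    # step 4b: state estimate from the (re)weighted particles
    return particles, weights, np.sum(weights * particles)

particles, weights, est = pf_step(particles, weights, z=0.5)
```

Systematic or stratified resampling is often preferred over this simple multinomial draw, but the structure of the four steps is the same.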
8. The unmanned vehicle positioning method fused with multi-source sensor information as claimed in claim 1, wherein in step 6, the relations satisfied between the velocity and the angular velocity in the single vehicle model are as follows:
ẋ = v·cos(ψ + β)
ẏ = v·sin(ψ + β)
ψ̇ = v·cosβ·(tanδ_f − tanδ_r)/(l_f + l_r)
wherein β is the slip angle, ẋ is the speed in the horizontal direction, ẏ is the speed in the vertical direction, ψ̇ is the angular velocity of the heading angle, v is the centroid speed, ψ is the heading angle, l_r and l_f are respectively the rear overhang length and the front overhang length, and δ_f and δ_r are respectively the front wheel deflection angle and the rear wheel deflection angle;
the slip angle β is calculated as:
β = arctan((l_r·tanδ_f + l_f·tanδ_r)/(l_f + l_r))
wherein l_r and l_f are respectively the rear overhang length and the front overhang length, and δ_f and δ_r are respectively the front wheel deflection angle and the rear wheel deflection angle;
the expression of the state vector of the bicycle model is:
x_k = [x_k, y_k, ψ_k]^T
wherein x_k, y_k, ψ_k, r, b, N, δ_{r,k} and ω_k are respectively the abscissa, the ordinate and the heading angle of the vehicle at time k, the wheel radius, the vehicle wheel base, the gear ratio, the front wheel deflection angle and the wheel rotation angular speed; the wheel radius, the vehicle wheel base and the gear ratio remain unchanged, the front wheel deflection angle and the wheel rotation angular speed serve as the system input, and dt is the time interval;
the equation of state of the bicycle model is as follows:
Figure FDA0003494729100000079
the observed values of the bicycle model are:
z_k = [x_{g,k}, y_{g,k}]^T
wherein x_{g,k} and y_{g,k} are respectively the abscissa and the ordinate of the vehicle observed by the GPS at time k;
the observation equation of the bicycle model is as follows:
z_k = [x_k, y_k]^T
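The velocity relations of this bicycle model can be sketched as one integration step; an illustrative non-claim sketch in which the slip angle follows the claim while the wheel parameters, steering inputs and step size are assumed values:

```python
import math

def bicycle_step(x, y, psi, v, delta_f, delta_r, l_f, l_r, dt):
    # slip angle β from the front and rear wheel deflection angles
    beta = math.atan((l_r * math.tan(delta_f) + l_f * math.tan(delta_r))
                     / (l_f + l_r))
    x += v * math.cos(psi + beta) * dt          # horizontal velocity
    y += v * math.sin(psi + beta) * dt          # vertical velocity
    psi += (v * math.cos(beta)                  # heading-angle rate
            * (math.tan(delta_f) - math.tan(delta_r)) / (l_f + l_r)) * dt
    return x, y, psi

x, y, psi = bicycle_step(0.0, 0.0, 0.0, v=5.0, delta_f=0.05, delta_r=0.0,
                         l_f=0.4, l_r=0.4, dt=0.1)
```

With the rear deflection angle set to zero this reduces to the common front-steered kinematic bicycle model.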
9. The method according to claim 8, wherein the step 7 of obtaining the unmanned vehicle positioning algorithm fused with the multi-source sensor information under the single vehicle model specifically comprises the following steps:
step 701: respectively obtaining the optimal estimation values of the vehicle under the bicycle model by adopting an extended Kalman algorithm, an unscented Kalman algorithm and a particle filter algorithm based on a public data set;
step 702: and (3) carrying out error analysis aiming at the coordinates and the heading angle of the vehicle, and calculating RMS errors corresponding to the abscissa, the ordinate and the heading angle of the vehicle, wherein the RMS error has a calculation formula as follows:
error_rms = |X − X_est|
wherein error_rms is the filtered error, X is the true value, including the true values of the abscissa, the ordinate and the heading angle of the vehicle, and X_est is respectively the optimal estimate of the abscissa, the ordinate and the heading angle of the filtered vehicle;
step 703: carrying out mean value processing on each RMS error to realize normalization;
step 704: adding the three normalized RMS errors of each filtering algorithm to form a score, evaluating the performance of the three filtering algorithms by comparing their scores, and selecting the algorithm with the smallest score as the optimal positioning algorithm;
step 705: and obtaining a fusion positioning result under the bicycle model based on the optimal positioning algorithm.
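The normalization and scoring of steps 703 and 704 can be sketched as follows; an illustrative non-claim sketch in which the RMS error values are made-up placeholders used only to show the computation:

```python
import numpy as np

# RMS errors per filter: [abscissa, ordinate, heading angle] (placeholder values)
rms = {"EKF": np.array([0.30, 0.28, 0.020]),
       "UKF": np.array([0.25, 0.24, 0.015]),
       "PF":  np.array([0.26, 0.25, 0.016])}

errors = np.vstack(list(rms.values()))
normalized = errors / errors.mean(axis=0)        # step 703: mean normalization
scores = dict(zip(rms, normalized.sum(axis=1)))  # step 704: sum into one score
best = min(scores, key=scores.get)               # smallest score = optimal algorithm
```

Dividing each error channel by its mean across the filters puts the coordinate errors (meters) and the heading error (radians) on a comparable scale before summing.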
10. The method of claim 9, wherein in step 701, when the single vehicle model is filtered by each filtering algorithm, the initial state vector x_0 is taken as the true value, the wheel radius r is taken as 0.425, the vehicle wheel base b is taken as 0.8, the gear ratio N is taken as 5, and the initialized process noise matrix Q, observation noise matrix R and initial error covariance matrix P_0 are respectively:
Figure FDA0003494729100000083
Figure FDA0003494729100000084
Figure FDA0003494729100000091
CN202210109679.2A 2022-01-29 2022-01-29 Unmanned vehicle positioning method integrating multi-source sensor information Active CN114440881B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210109679.2A CN114440881B (en) 2022-01-29 2022-01-29 Unmanned vehicle positioning method integrating multi-source sensor information


Publications (2)

Publication Number Publication Date
CN114440881A true CN114440881A (en) 2022-05-06
CN114440881B CN114440881B (en) 2023-05-30

Family

ID=81372107

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210109679.2A Active CN114440881B (en) 2022-01-29 2022-01-29 Unmanned vehicle positioning method integrating multi-source sensor information

Country Status (1)

Country Link
CN (1) CN114440881B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116222544A (en) * 2023-05-09 2023-06-06 浙江大学湖州研究院 Automatic navigation and positioning method and device for feeding vehicle facing to feeding farm

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2716597A1 (en) * 2010-10-08 2012-04-08 Canadian Space Agency Apparatus and method for driftless attitude determination and reliable localization of vehicles
JP2012173190A (en) * 2011-02-23 2012-09-10 Seiko Epson Corp Positioning system and positioning method
CN103439731A (en) * 2013-08-29 2013-12-11 北京空间飞行器总体设计部 GPS/INS integrated navigation method based on unscented Kalman filtering
CN108931799A (en) * 2018-07-18 2018-12-04 兰州交通大学 Train combined positioning method and system based on the search of recurrence fast orthogonal
CN110243358A (en) * 2019-04-29 2019-09-17 武汉理工大学 The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion
CN110602647A (en) * 2019-09-11 2019-12-20 江南大学 Indoor fusion positioning method based on extended Kalman filtering and particle filtering
CN110956665A (en) * 2019-12-18 2020-04-03 中国科学院自动化研究所 Vehicle turning track bidirectional calculation method, system and device
CN113063429A (en) * 2021-03-18 2021-07-02 苏州华米导航科技有限公司 Self-adaptive vehicle-mounted combined navigation positioning method
CN113252033A (en) * 2021-06-29 2021-08-13 长沙海格北斗信息技术有限公司 Positioning method, positioning system and robot based on multi-sensor fusion


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
LI XU等: "Multi-sensor fusion methodology for enhanced land vehicle positioning", 《INFORMATION FUSION》 *
VICENT GIRBÉS-JUAN等: "Asynchronous Sensor Fusion of GPS, IMU and CAN-Based Odometry for Heavy-Duty Vehicles", 《IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY 》 *
ZHOU Qun et al.: "Research on a GPS/UWB/MARG cooperative positioning system based on fusion algorithms", 《现代电子技术》 (Modern Electronics Technique) *


Also Published As

Publication number Publication date
CN114440881B (en) 2023-05-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant