CN111693044A - Fusion positioning method - Google Patents

Fusion positioning method Download PDF

Info

Publication number
CN111693044A
CN111693044A
Authority
CN
China
Prior art keywords
motion model
vehicle
model
longitudinal
transverse
Prior art date
Legal status
Pending
Application number
CN202010565147.0A
Other languages
Chinese (zh)
Inventor
宋翔
赵婷
谢军
张磊
蒋慧琳
Current Assignee
Nanjing Xiaozhuang University
Original Assignee
Nanjing Xiaozhuang University
Priority date
Filing date
Publication date
Application filed by Nanjing Xiaozhuang University filed Critical Nanjing Xiaozhuang University
Priority to CN202010565147.0A
Publication of CN111693044A
Legal status: Pending

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation by using measurements of speed or acceleration
    • G01C21/12 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent

Abstract

The invention provides a fusion positioning method. Information acquired by radio-frequency technology and by vehicle-mounted sensors is used as observation information, and a vehicle longitudinal motion model and a vehicle transverse motion model are established as state equations. Extended Kalman filter recursion is carried out on each of the two models, and a BP (Back Propagation) neural network model, taking real-time vehicle motion state information as input, determines the model probabilities of the two models. The estimates of the multiple models are combined according to these probabilities, forming an intelligent multi-model fusion positioning method. The longitudinal and transverse motion models are thereby switched in real time according to the vehicle motion state, so that the state equations match the actual motion of the vehicle, and accurate, reliable positioning is achieved for vehicles travelling in sheltered environments such as tunnels.

Description

Fusion positioning method
Technical Field
The invention relates to the technical field of automobile navigation positioning, in particular to a fusion positioning method.
Background
In recent decades, urbanization has accelerated in many countries, the number of vehicles in cities keeps growing, and underground tunnels and multi-layer overpasses are increasingly built to relieve urban traffic pressure.
Currently, vehicle navigation generally relies on the Global Positioning System (GPS), which can provide three-dimensional position, speed and time information for a mobile carrier in real time and in all weather, and is the most widely used positioning technology. However, when a vehicle runs in a tunnel or another sheltered environment, the GPS signal is blocked, positioning becomes inaccurate or even fails, and the GPS cannot provide an accurate positioning service.
Disclosure of Invention
In view of this, the present invention provides a fusion positioning method, which can provide accurate positioning service in an environment with occlusion.
In order to achieve the above purpose, the invention provides the following specific technical scheme:
a fusion localization method, comprising:
establishing a longitudinal motion model of the vehicle according to a state vector of the vehicle with an acceleration of the vehicle as an input vector, and establishing a lateral motion model of the vehicle according to the state vector of the vehicle with a yaw rate of the vehicle as an input vector, the state vector representing position information of the vehicle;
establishing an observation equation by taking the distances between the vehicle-mounted RFID reader and the plurality of RFID tags on the two sides of the road, the longitudinal linear velocity in the vehicle body coordinate system and the heading angle as observation vectors;
respectively carrying out extended Kalman filtering recursion on the longitudinal motion model and the transverse motion model by using the observation equation to obtain an estimation error variance matrix of the longitudinal motion model and an estimation error variance matrix of the transverse motion model;
inputting steering wheel angle information, wheel speed information, longitudinal and transverse acceleration information and yaw rate information into a BP artificial neural network model to obtain the model probability of the longitudinal motion model and the model probability of the transverse motion model;
constructing a combined model variance matrix based on the estimation error variance matrix of the longitudinal motion model, the estimation error variance matrix of the transverse motion model, the model probability of the longitudinal motion model and the model probability of the transverse motion model;
calculating to obtain a positioning result of the vehicle by taking the minimum error of the variance matrix of the combined model as an optimization target;
the method comprises the following steps of establishing an observation equation by taking the distance between a vehicle-mounted RFID reader and a plurality of RFID labels on two sides of a road, the longitudinal linear velocity and the course angle under a vehicle body coordinate system as observation vectors, wherein the observation equation comprises the following steps:
acquiring the signal strength value of each RFID tag on the two sides of the road read by the vehicle-mounted RFID reader, and the position coordinates of each RFID tag in the geographic coordinate system;
determining the distance between the vehicle-mounted RFID reader and each RFID tag on the two sides of the road according to the relationship between signal strength and transmission distance and the signal strength values read by the reader;
establishing the observation equation by taking the distances between the vehicle-mounted RFID reader and each RFID tag on the two sides of the road, the longitudinal linear velocity in the vehicle body coordinate system and the heading angle as observation vectors;
wherein the performing of extended Kalman filter recursion on the longitudinal and transverse motion models respectively using the observation equation, to obtain the estimation error variance matrix of each model, includes:
performing a time update on the longitudinal and transverse motion models respectively using the observation equation, to obtain the state one-step prediction equations and one-step prediction error variance matrices of the two models;
performing a measurement update on the longitudinal and transverse motion models respectively using their one-step prediction error variance matrices, to obtain the filter gain matrices and state estimation equations of the two models and thereby the estimation error variance matrices;
the establishing of the BP artificial neural network model comprises the following steps:
acquiring steering wheel angles, wheel speeds, longitudinal and transverse accelerations and yaw rates under different working conditions in a real-vehicle test;
carrying out position estimation with the estimation error variance matrices of the longitudinal and transverse motion models respectively, so as to obtain the vehicle positions given by the longitudinal motion model and by the transverse motion model under the different working conditions of the real-vehicle test;
calculating a first error between the vehicle position of the longitudinal motion model under different working conditions in the real vehicle test and a vehicle position reference value obtained by the high-precision differential GPS, and a second error between the vehicle position of the transverse motion model under different working conditions in the real vehicle test and the vehicle position reference value obtained by the high-precision differential GPS;
establishing a total error function according to the first error, the second error, the model probability of the longitudinal motion model and the model probability of the transverse motion model;
calculating model probabilities of the longitudinal motion model and the transverse motion model under different working conditions in the real vehicle test according to the total error function by taking the minimum total error as an optimization target to obtain a training sample consisting of steering wheel rotation angles, wheel speeds, longitudinal and transverse accelerations, yaw angular velocities, model probabilities of the longitudinal motion model and model probabilities of the transverse motion model under different working conditions in the real vehicle test;
and training the BP artificial neural network model by using the training sample to obtain the network weight coefficient and the threshold coefficient of the hidden layer of the BP artificial neural network model and the network weight coefficient and the threshold coefficient of the output layer.
Compared with the prior art, the invention has the following beneficial effects:
the invention discloses a fusion positioning method, which utilizes information acquired by a radio frequency technology and a vehicle-mounted sensor as observation information, respectively establishes a vehicle longitudinal motion model and a vehicle transverse motion model as state equations, respectively carries out extended Kalman filtering recursion aiming at the two models, further establishes a BP (Back propagation) neural network model by taking real-time vehicle motion state information as input to determine model probabilities of the two models, realizes the estimation combination of multiple models according to the model probabilities, and forms an intelligent multi-model fusion positioning method, thereby realizing the real-time switching of the longitudinal motion model and the transverse motion model according to the vehicle motion state, ensuring that the state equations can accord with the actual vehicle motion state, and realizing the accurate and reliable positioning of vehicles running in the environment with shielding such as a tunnel.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only embodiments of the present invention, and those skilled in the art can obtain other drawings from the provided drawings without creative effort.
Fig. 1 is a schematic flow chart of a fusion positioning method according to an embodiment of the present invention;
fig. 2 is a schematic view of a scene for acquiring distances between a vehicle-mounted RFID reader and a plurality of RFID tags on both sides of a road according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a BP artificial neural network disclosed in the embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present invention.
The invention provides an intelligent multi-model fusion positioning method that uses a neural network for model probability calculation. It realizes fusion positioning of RFID and vehicle-mounted sensors in sheltered environments such as tunnels, adapts to the driving characteristics of the vehicle, and obtains accurate and reliable positioning results.
Specifically, referring to fig. 1, the fusion positioning method disclosed in this embodiment includes the following steps:
s101: establishing a longitudinal motion model of the vehicle according to a state vector of the vehicle with an acceleration of the vehicle as an input vector, and establishing a lateral motion model of the vehicle according to the state vector of the vehicle with a yaw rate of the vehicle as an input vector, the state vector representing position information of the vehicle;
the state vector of the vehicle is: x (k) ═ pe(k) pn(k) ve(k) vn(k) ψ(k) ω(k)]′
where the superscript ′ denotes the transpose of a matrix, p_e(k) and p_n(k) respectively represent the east and north positions of the vehicle's center of mass in the global navigation coordinate system, v_e(k) and v_n(k) represent the east and north velocities of the vehicle, ψ(k) and ω(k) represent the heading angle and yaw rate of the vehicle, respectively, and k denotes the discrete time step.
The longitudinal motion model describes the longitudinal motion of the vehicle with an improved Constant Acceleration (CA) model. Unlike the common CA model, this embodiment uses the vehicle acceleration obtained by the accelerometer as the input vector, so that changes in acceleration are described accurately. With the acceleration of the vehicle as the input vector, the longitudinal motion model of the vehicle is established from the state vector of the vehicle as follows:
X(k) = f_1(X(k-1), U_1(k-1), W_1, γ_1)   (1)
where f_1(·) represents the state transition function of the model, T represents the sampling interval, X_1(k, k-1) = X(k, k-1) and U_1(k) = [a_x a_y]′ respectively represent the state vector and the input vector; W_1 represents the zero-mean Gaussian white system-noise vector, with corresponding system-noise covariance matrix Q_1; γ_1 represents the zero-mean Gaussian white noise vector of the external input of the system, with corresponding external-input-noise covariance matrix Γ_1.
Unlike the conventional CT model, this embodiment uses the yaw rate of the vehicle obtained by the angular rate gyro as the input vector, so that changes in the turn rate are described accurately. With the yaw rate of the vehicle as the input vector, the transverse motion model of the vehicle is established from the state vector of the vehicle as follows:
X(k) = f_2(X(k-1), U_2(k-1), W_2, γ_2)   (2)
where f_2(·) represents the state transition function of the model, X_2(k, k-1) = X(k, k-1) and U_2(k) = [ω_z]′ respectively represent the state vector and the input vector; W_2 represents the zero-mean Gaussian white system-noise vector, with corresponding system-noise covariance matrix Q_2; γ_2 represents the zero-mean Gaussian white noise vector of the external input of the system, with corresponding external-input-noise covariance matrix Γ_2.
S102: establishing an observation equation by taking the distances between the vehicle-mounted RFID reader and the plurality of RFID tags on the two sides of the road, the longitudinal linear velocity in the vehicle body coordinate system and the heading angle as observation vectors;
First, the distances between the vehicle-mounted RFID reader and the RFID tags on the two sides of the road are obtained. The RFID layout is shown in Fig. 2. Active RFID tags are adopted: at unobstructed positions along the two sides of the top of the tunnel, active radio-frequency identification tags are arranged at equal intervals of 6-9 m. A geographic coordinate system xoy is established with east as the ox direction and north as the oy direction, the origin o being a fixed point on the earth's surface; the coordinate position of each tag in this geographic coordinate system can be obtained in advance. The RFID reader is mounted on the roof of the running vehicle and moves with it, so the position coordinates of the reader in the geographic coordinate system can be regarded as the position coordinates of the vehicle, and the tags and the reader can be treated as lying approximately in the same plane. At each discrete time k (k = 1, 2, 3, …), the reader can stably read the information of the s tags nearest to it; to ensure positioning accuracy, s ≥ 4 in this embodiment. The information the reader can read includes the received signal strength RSS_i(k) (i = 1, 2, …, s) of each tag and the tag's position coordinates (X_i(k), Y_i(k)) (i = 1, 2, …, s) in the geographic coordinate system.
The relationship between signal strength and transmission distance is:
RSS = (P_t G_t G_r λ²) / ((4π)² L r²)
where P_t and RSS are the wireless signal transmission power and the received signal strength, G_t and G_r are respectively the gains of the transmitting and receiving antennas, L is the transmission loss factor of the system, λ is the wavelength of the radio-frequency signal, and r is the distance between the transmitting and receiving antennas. According to this transmission formula, if the received signal strength RSS_i(k) (i = 1, 2, …, s) is known, the distance r_i(k) between tag i and the RFID reader can be calculated, i.e. the values of r_1, r_2, …, r_s at time k.
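The inversion of the transmission formula above for the range r can be sketched in a few lines. This is a hedged illustration only: the function name, antenna gains, wavelength, transmit power and RSS values are all assumptions for the sketch, not values taken from the patent.

```python
import math

def distance_from_rss(rss_w, pt_w=0.1, gt=1.0, gr=1.0, lam=0.33, loss=1.0):
    """Invert RSS = Pt*Gt*Gr*lam^2 / ((4*pi)^2 * L * r^2) for r.

    All powers are in watts (linear scale); parameter defaults are
    illustrative placeholders, not values from the patent.
    """
    return math.sqrt(pt_w * gt * gr * lam ** 2
                     / ((4 * math.pi) ** 2 * loss * rss_w))

# The reader returns the RSS of the s >= 4 nearest tags; convert each to a range.
rss_values = [2.1e-6, 1.4e-6, 0.9e-6, 0.6e-6]            # watts, illustrative
ranges = [distance_from_rss(rss) for rss in rss_values]  # r_1 ... r_s in metres
```

A weaker received signal maps to a larger range, so the list of ranges grows as the RSS values shrink.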
On this basis, the distances between the vehicle-mounted RFID reader and each RFID tag on the two sides of the road, the longitudinal linear velocity in the vehicle body coordinate system and the heading angle are taken as the observation vector:
Z = [r_1 … r_s v_wheel ψ_compass]′
where v_wheel represents the longitudinal linear velocity in the vehicle body coordinate system measured by the wheel speed sensor, ψ_compass represents the heading angle observed by the electronic compass, and r_1, r_2, …, r_s represent the distances between the vehicle-mounted RFID reader and the s RFID tags whose information can be received.
The observation equation is then established with the distances between the vehicle-mounted RFID reader and each RFID tag on the two sides of the road, the longitudinal linear velocity in the vehicle body coordinate system and the heading angle as observation vectors:
Z(k) = h(X(k)) + V(k)   (3)
where h represents the observation function, n_v and n_ψ respectively represent the observation noise vectors of the wheel speed sensor and the electronic compass, and n_ri (i = 1, 2, …, s) represent the observation noise vectors of the s tags; the observation white-noise vector is then V = [n_r1 … n_rs n_v n_ψ]′, with corresponding observation-noise variance matrix R.
S103: respectively carrying out extended Kalman filtering recursion on the longitudinal motion model and the transverse motion model by using the observation equation to obtain an estimation error variance matrix of the longitudinal motion model and an estimation error variance matrix of the transverse motion model;
For the longitudinal and transverse motion models established by formulas (1) and (2) and the observation equation established by formula (3), standard extended Kalman filter recursion is carried out for each model according to extended Kalman filter theory. The recursion consists of a time update and a measurement update; the filtering process of the jth model (j = 1, 2, denoting the longitudinal and transverse motion models respectively) is as follows:
Time update:
one-step prediction equation of state:
X_j(k, k-1) = f_j(X_j(k-1), U_j(k-1), 0, 0)   (4)
one-step prediction error variance matrix:
P_j(k, k-1) = A_j P_j(k-1) A_j′ + B_j Γ_j B_j′ + Q_j   (5)
where A_j and B_j are respectively the Jacobian matrices of partial derivatives of the system state function vector f_j with respect to the state vector X_j and the external input vector U_j.
Measurement update:
a filter gain matrix:
K_j(k) = P_j(k, k-1) (H(k-1))′ (S_j(k))^(-1)   (6)
S_j(k) = H(k-1) P_j(k, k-1) (H(k-1))′ + R(k-1)   (7)
State estimation:
X_j(k) = X_j(k, k-1) + K_j(k) (Z(k) - h(X_j(k, k-1)))   (8)
estimating an error variance matrix:
P_j(k) = P_j(k, k-1) - K_j(k) S_j(k) (K_j(k))′   (9)
where H is the Jacobian matrix of partial derivatives of h(·) with respect to the state vector X.
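One recursion of equations (4) to (9) can be sketched in NumPy. This is a minimal illustration, assuming the one-step prediction covariance of (5) combines the system-noise and input-noise terms through the Jacobians A_j and B_j as reconstructed above; the function name and the 1-D toy model in the usage are assumptions, not part of the patent.

```python
import numpy as np

def ekf_step(x, P, u, z, f, A, B, h, H, Q, Gamma, R):
    """One recursion of equations (4)-(9) for model j.

    x, P : previous state estimate and estimation error variance matrix
    u, z : input vector and observation at time k
    f, h : state-transition and observation functions
    A, B : Jacobians of f w.r.t. the state and the input
    H    : Jacobian of h w.r.t. the state
    Q, Gamma, R : system-, input- and observation-noise covariances
    """
    # Time update: equations (4)-(5)
    x_pred = f(x, u)
    P_pred = A @ P @ A.T + B @ Gamma @ B.T + Q
    # Measurement update: equations (6)-(9)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = P_pred - K @ S @ K.T
    return x_new, P_new

# Toy usage: 1-D constant-velocity model with acceleration input,
# position observed directly (illustrative only).
T = 0.1
A = np.array([[1.0, T], [0.0, 1.0]])
B = np.array([[0.5 * T ** 2], [T]])
H = np.array([[1.0, 0.0]])
f = lambda x, u: A @ x + B @ u
h = lambda x: H @ x
Q = 1e-4 * np.eye(2)
Gamma = np.array([[1e-2]])
R = np.array([[1e-2]])
x, P = np.zeros(2), np.eye(2)
x, P = ekf_step(x, P, np.array([1.0]), np.array([0.05]), f, A, B, h, H, Q, Gamma, R)
```

The updated covariance P stays symmetric, and observing the position sharply reduces its position-variance entry relative to the prior.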
S104: inputting steering wheel angle information, wheel speed information, longitudinal and transverse acceleration information and yaw rate information into a BP artificial neural network model to obtain the model probability of the longitudinal motion model and the model probability of the transverse motion model;
First, the BP artificial neural network model is established. Unlike the pseudo-Bayesian method used by the traditional interacting multiple-model algorithm to calculate model probabilities, this embodiment estimates the probabilities of the longitudinal and transverse motion models in real time with the BP (Back Propagation) artificial neural network shown in Fig. 3.
as shown in fig. 3, the artificial neural network 1 includes an input layer 11, a hidden layer 12, and an output layer 13.
The input layer 11 receives the input vector required for calculating the model probabilities; the input vector is a 5-dimensional vector whose components are the steering wheel angle U1, the wheel speed U2, the longitudinal acceleration U3, the lateral acceleration U4 and the yaw rate U5.
The hidden layer 12 comprises 8 neuron nodes 121. Each neuron node 121 computes an output value from the input vector of the input layer 11, and these output values serve as the inputs from which the output layer 13 calculates the model probabilities. The function computed by the neuron nodes 121 from the input vector is the tansig function, tansig(x) = 1/(1 + e^(-x)); specifically, the expression of the tansig function is as follows:
m_q = 1 / (1 + e^(-(Σ_n w_n p_n + b_q)))   (10)
In formula (10), n = 1, 2, 3, 4, 5 indexes the input vector dimensions and p_n is the nth input vector; q = 1, 2, …, 8 indexes the hidden-layer neurons, m_q is the output value of the qth hidden-layer neuron, w_n is the network weight coefficient for p_n within the hidden layer, and b_q is the qth threshold coefficient of the hidden layer;
The output layer 13 includes two output-layer neuron nodes 131, which calculate the model probabilities of the longitudinal and transverse motion models from the result computed by the hidden layer 12. The output-layer neuron nodes 131 use the purelin function, purelin(x) = kx; specifically, the output-layer computation is shown in the following formula:
c_v = Σ_q w_q m_q + b_v   (11)
where m_q is the output value of the qth hidden-layer neuron, w_q is the network weight coefficient for m_q in the output layer, b_v is the vth threshold coefficient of the output layer, and c_v (v = 1, 2) is the output value of the vth output-layer neuron, i.e. the output vector;
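A forward pass through formulas (10) and (11) can be sketched as follows, assuming the 8-neuron hidden layer implied by formula (10) and the newff call, and the sigmoid expression given above. The weight matrices here are random placeholders, not trained values, and all names are assumptions for the sketch.

```python
import numpy as np

def tansig(x):
    # Sigmoid as given in the description: tansig(x) = 1 / (1 + e^(-x)).
    return 1.0 / (1.0 + np.exp(-x))

def forward(p, W_hid, b_hid, W_out, b_out):
    """Forward pass of the 5-8-2 network: equations (10) and (11).

    p     : 5-dim input (steering angle, wheel speed, ax, ay, yaw rate)
    W_hid : (8, 5) hidden weights, b_hid : (8,) hidden thresholds
    W_out : (2, 8) output weights, b_out : (2,) output thresholds
    Returns the 2-dim output vector (the two model probabilities).
    """
    m = tansig(W_hid @ p + b_hid)   # hidden layer, eq. (10)
    c = W_out @ m + b_out           # purelin output layer, eq. (11)
    return c

rng = np.random.default_rng(0)
p = np.array([0.1, 15.0, 0.3, 0.05, 0.01])   # illustrative sensor inputs
c = forward(p, rng.normal(size=(8, 5)), rng.normal(size=8),
            rng.normal(size=(2, 8)), rng.normal(size=2))
```

Because the output layer is linear, a trained network's two outputs would still need the constraint μ_CA + μ_CT = 1 enforced at training time, as the patent does through its total-error formulation.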
the establishment of the whole BP artificial neural network adopts matlab function newff, which is shown as the following formula:
net=newff(minmax(p),[8,2],{'tansig','purlin'},'trainlm') (12)
In formula (12), net is the neural network, p is the input vector, minmax is the MATLAB function that obtains the extreme values of a vector, and trainlm is the learning algorithm.
The training of the artificial neural network model requires system inputs and reference output values, and the training data are collected in real-vehicle tests. The data are collected according to the input vector p of the network and comprise measurements of the steering wheel angle, wheel speed, longitudinal acceleration, lateral acceleration and yaw rate under different conditions. The reference output value m, i.e. the model probability, can be estimated from the distance errors of position estimates obtained with the longitudinal and transverse motion models used separately; the distance error of each model is defined as the difference between the position information computed by extended Kalman filter recursion using that model alone and the reference position information:
e_CA = √((X_CA - X_ref)² + (Y_CA - Y_ref)²),  e_CT = √((X_CT - X_ref)² + (Y_CT - Y_ref)²)   (13)
where X_ref and Y_ref respectively represent the reference values of the vehicle position obtained by high-precision differential GPS, X_CA and Y_CA represent the vehicle position estimated by extended Kalman filter recursion using the longitudinal motion model alone, and X_CT and Y_CT represent the vehicle position estimated using the transverse motion model alone. Defining μ_CA and μ_CT as the model probabilities of the longitudinal and transverse motion models respectively, the total error is e = μ_CA e_CA + μ_CT e_CT, subject to μ_CA + μ_CT = 1 and 0 ≤ μ_CA, μ_CT ≤ 1. The values of μ_CA and μ_CT that minimize the total error e, found with an optimization algorithm, are the reference output values used in training;
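One reading of the optimization step above: since e = μ_CA e_CA + μ_CT e_CT is linear in μ_CA (with μ_CT = 1 - μ_CA), its constrained minimum places all weight on the model with the smaller distance error. The patent does not spell out its optimization algorithm, so the following is a hedged sketch of formula (13) and this closed-form minimizer; the function name and coordinates are illustrative assumptions.

```python
import math

def model_probabilities(x_ca, y_ca, x_ct, y_ct, x_ref, y_ref):
    """Distance errors of formula (13) and the minimising probabilities.

    Because the total error is linear in mu_CA on [0, 1], the minimum
    is attained at an endpoint: all weight goes to the model whose
    position estimate is closer to the differential-GPS reference.
    """
    e_ca = math.hypot(x_ca - x_ref, y_ca - y_ref)  # longitudinal-model error
    e_ct = math.hypot(x_ct - x_ref, y_ct - y_ref)  # transverse-model error
    mu_ca = 1.0 if e_ca <= e_ct else 0.0
    return mu_ca, 1.0 - mu_ca

# Illustrative positions (metres): CA estimate closer to the reference.
mu_ca, mu_ct = model_probabilities(10.2, 4.9, 10.8, 5.6, 10.0, 5.0)
```

Under this reading the training targets are hard 0/1 labels per working condition; the trained network then produces soft probabilities between those extremes at run time.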
Using the reference output values m and the input vectors p acquired by the sensors in real time, the BP artificial neural network is trained off-line to obtain the network weight coefficients w_n and threshold coefficients b_q of the hidden layer and the network weight coefficients w_q and threshold coefficients b_v of the output layer.
The main purpose of BP artificial neural network training is to obtain the network weight coefficients w_n, w_q and the threshold coefficients b_q, b_v such that the output vector meets a given accuracy. The BP artificial neural network is trained off-line directly with the MATLAB function train, as shown in the following formula:
net=train(net,p,t) (14)
In formula (14), net is the BP artificial neural network, p is the training-sample input vector and t is the training-sample output vector. Substituting the trained network weight coefficients w_n, w_q and threshold coefficients b_q, b_v into the neural network, the model probabilities μ_i (i = CA, CT) of the longitudinal and transverse motion models at time k can be calculated from the steering wheel angle, wheel speed, longitudinal acceleration, lateral acceleration and yaw rate values acquired in real time at discrete time k;
S105: constructing a combined model variance matrix based on the estimation error variance matrix of the longitudinal motion model, the estimation error variance matrix of the transverse motion model, the model probability of the longitudinal motion model and the model probability of the transverse motion model;
s106: and calculating to obtain a positioning result of the vehicle by taking the minimum error of the variance matrix of the combined model as an optimization target.
With the estimated model probabilities of the longitudinal and transverse motion models and their respective filtering results, the combined state estimate can be expressed as:
X(k) = Σ_i μ_i(k) X_i(k),  P(k) = Σ_i μ_i(k) [P_i(k) + (X_i(k) - X(k))(X_i(k) - X(k))′]  (i = CA, CT)   (15)
In formula (15), P(k) is the variance matrix of the combined model and P_i(k) (i = CA, CT) are the covariance matrices of the extended Kalman filters run with the longitudinal and transverse motion models individually. X(k) can thus be solved; it is the final positioning result of the intelligent multi-model positioning of this embodiment.
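The combination step can be sketched as follows, assuming the usual multiple-model mixing form: a probability-weighted state plus a covariance term that accounts for the spread of the individual estimates. The function name and all numbers are illustrative assumptions.

```python
import numpy as np

def combine(x_ca, P_ca, x_ct, P_ct, mu_ca, mu_ct):
    """Probability-weighted combination of the two filters' outputs.

    x_ca, x_ct : state estimates of the CA and CT filters
    P_ca, P_ct : their estimation error variance matrices
    mu_ca, mu_ct : model probabilities from the BP network (sum to 1)
    """
    x = mu_ca * x_ca + mu_ct * x_ct            # combined state
    d_ca = (x_ca - x).reshape(-1, 1)           # spread-of-means terms
    d_ct = (x_ct - x).reshape(-1, 1)
    P = (mu_ca * (P_ca + d_ca @ d_ca.T) +
         mu_ct * (P_ct + d_ct @ d_ct.T))       # combined variance matrix
    return x, P

# Illustrative 2-D position estimates from the two filters.
x_ca = np.array([10.2, 4.9])
x_ct = np.array([10.8, 5.6])
x, P = combine(x_ca, 0.1 * np.eye(2), x_ct, 0.2 * np.eye(2), 0.7, 0.3)
```

With μ_CA = 0.7 the combined position lands closer to the longitudinal model's estimate, and the combined variance matrix remains symmetric.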
It can be seen that the fusion positioning method disclosed in this embodiment uses information acquired by radio-frequency technology and by vehicle-mounted sensors as observation information, establishes a vehicle longitudinal motion model and a vehicle transverse motion model as state equations, and performs extended Kalman filter recursion on each of the two models. A BP (Back Propagation) neural network model, taking real-time vehicle motion state information as input, then determines the model probabilities of the two models, and the estimates of the multiple models are combined according to these probabilities, forming an intelligent multi-model fusion positioning method.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (1)

1. A fusion localization method, comprising:
establishing a longitudinal motion model of the vehicle from a state vector of the vehicle with the acceleration of the vehicle as the input vector, and establishing a transverse motion model of the vehicle from the same state vector with the yaw rate of the vehicle as the input vector, the state vector representing position information of the vehicle;
establishing an observation equation taking, as the observation vector, the distances between a vehicle-mounted RFID reader and a plurality of RFID tags on both sides of the road, together with the longitudinal linear velocity and the heading angle in the vehicle-body coordinate system;
performing extended Kalman filter recursion on the longitudinal motion model and the transverse motion model respectively using the observation equation, to obtain an estimation error variance matrix for each of the two models;
inputting steering-wheel angle, wheel speed, longitudinal and transverse acceleration, and yaw-rate information into a BP artificial neural network model to obtain the model probability of the longitudinal motion model and the model probability of the transverse motion model;
constructing a combined-model variance matrix from the estimation error variance matrices and the model probabilities of the longitudinal motion model and the transverse motion model;
calculating the positioning result of the vehicle, taking the minimum error of the combined-model variance matrix as the optimization target;
the method comprises the following steps of establishing an observation equation by taking the distance between a vehicle-mounted RFID reader and a plurality of RFID labels on two sides of a road, the longitudinal linear velocity and the course angle under a vehicle body coordinate system as observation vectors, wherein the observation equation comprises the following steps:
acquiring a signal intensity value of each RFID label at two sides of a road read by the vehicle-mounted RFID reader-writer and a position coordinate of each RFID label in a geographic coordinate system;
determining the distance between the vehicle-mounted RFID reader-writer and each RFID tag on the two sides of the road according to the calculation relationship between the signal strength and the transmission distance and the signal strength value of each RFID tag on the two sides of the road read by the vehicle-mounted RFID reader-writer;
establishing the observation equation by taking the distance between the vehicle-mounted RFID reader and each RFID label on two sides of a road, the longitudinal linear velocity and the course angle under a vehicle body coordinate system as observation vectors;
wherein, the using the observation equation to perform extended kalman filter recursion on the longitudinal motion model and the transverse motion model respectively to obtain an estimation error variance matrix of the longitudinal motion model and an estimation error variance matrix of the transverse motion model includes:
respectively carrying out time updating on the longitudinal motion model and the transverse motion model by using the observation equation to obtain a state one-step prediction equation and a one-step prediction error variance matrix of the longitudinal motion model and the transverse motion model;
respectively measuring and updating the longitudinal motion model and the transverse motion model by using the one-step prediction error variance arrays of the longitudinal motion model and the transverse motion model to obtain a filter gain matrix and a state estimation equation of the longitudinal motion model and the transverse motion model so as to obtain an estimation error variance array;
wherein establishing the BP artificial neural network model comprises:
acquiring steering-wheel angles, wheel speeds, longitudinal and transverse accelerations, and yaw rates under different working conditions in a real-vehicle test;
performing position estimation with the estimation error variance matrices of the longitudinal motion model and the transverse motion model respectively, to obtain the vehicle positions given by each of the two models under the different working conditions of the real-vehicle test;
calculating a first error between the vehicle position of the longitudinal motion model under the different working conditions and a vehicle-position reference value obtained from a high-precision differential GPS, and a second error between the vehicle position of the transverse motion model under the different working conditions and the same reference value;
establishing a total error function from the first error, the second error, the model probability of the longitudinal motion model, and the model probability of the transverse motion model;
calculating, with the minimum total error as the optimization target, the model probabilities of the longitudinal motion model and the transverse motion model under the different working conditions from the total error function, to obtain a training sample consisting of the steering-wheel angles, wheel speeds, longitudinal and transverse accelerations, yaw rates, and model probabilities of the two models under the different working conditions of the real-vehicle test;
and training the BP artificial neural network model with the training sample to obtain the network weight coefficients and threshold coefficients of the hidden layer and of the output layer of the BP artificial neural network model.
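The signal-strength-to-distance step in the claim can be illustrated with the common log-distance path-loss model. This is a sketch under assumed calibration values: `rssi_d0` (RSSI at the reference distance) and `path_loss_n` (path-loss exponent) are hypothetical parameters, not taken from the patent.

```python
def rssi_to_distance(rssi_dbm, rssi_d0=-40.0, d0=1.0, path_loss_n=2.5):
    """Estimate reader-to-tag distance from received signal strength.

    Log-distance path-loss model:
        RSSI(d) = RSSI(d0) - 10 * n * log10(d / d0)
    solved for d. rssi_d0 is the RSSI measured at reference distance d0,
    and path_loss_n is the environment-dependent path-loss exponent.
    """
    return d0 * 10 ** ((rssi_d0 - rssi_dbm) / (10.0 * path_loss_n))
```

In practice both parameters would be calibrated against tags at known positions before the resulting distances enter the observation vector.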
CN202010565147.0A 2020-06-19 2020-06-19 Fusion positioning method Pending CN111693044A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010565147.0A CN111693044A (en) 2020-06-19 2020-06-19 Fusion positioning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010565147.0A CN111693044A (en) 2020-06-19 2020-06-19 Fusion positioning method

Publications (1)

Publication Number Publication Date
CN111693044A true CN111693044A (en) 2020-09-22

Family

ID=72482142

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010565147.0A Pending CN111693044A (en) 2020-06-19 2020-06-19 Fusion positioning method

Country Status (1)

Country Link
CN (1) CN111693044A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102156476A (en) * 2011-04-14 2011-08-17 山东大学 Intelligent space and nurse robot multi-sensor system and information fusion method of intelligent space and nurse robot multi-sensor system
CN102928816A (en) * 2012-11-07 2013-02-13 东南大学 High-reliably integrated positioning method for vehicles in tunnel environment
CN103206954A (en) * 2013-04-25 2013-07-17 佳木斯大学 Multi-sensor information fusion method for mobile robot based on UKF (Unscented Kalman Filter)
CN109144076A (en) * 2018-10-31 2019-01-04 吉林大学 A kind of more vehicle transverse and longitudinals coupling cooperative control system and control method
CN208393354U (en) * 2018-06-22 2019-01-18 南京航空航天大学 Line operating condition automatic Pilot steering system is moved based on BP neural network and safe distance
CN109872415A (en) * 2018-12-28 2019-06-11 北京理工大学 A kind of vehicle speed estimation method neural network based and system


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Xiang Song, et al., "A fusion strategy for reliable vehicle positioning utilizing RFID and in-vehicle sensors", Elsevier *
Zhang Lieping, Wang Shoufeng, "Research on Node Localization Technology for Wireless Sensor Networks", China Atomic Energy Press, 31 May 2019 *
Li Shengyi, Wu Xuezhong, et al., "Multi-Sensor Fusion Theory and Its Application in Intelligent Manufacturing Systems", 30 November 1988 *
Gao Huaikun, "Research on Vehicle RFID Fusion Positioning Technology for Complex Urban Environments", China Master's Theses Full-text Database, Information Science and Technology series *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112429044A (en) * 2020-11-27 2021-03-02 株洲中车时代软件技术有限公司 Method and system for positioning running train position based on nonlinear compensation
CN112950812A (en) * 2021-02-04 2021-06-11 南京航空航天大学 Vehicle state fault-tolerant estimation method based on long-time and short-time memory neural network
CN112950812B (en) * 2021-02-04 2022-07-26 南京航空航天大学 Vehicle state fault-tolerant estimation method based on long-time and short-time memory neural network

Similar Documents

Publication Publication Date Title
CN102928816B (en) High-reliably integrated positioning method for vehicles in tunnel environment
CN106840179B (en) Intelligent vehicle positioning method based on multi-sensor information fusion
CN105628026B (en) A kind of positioning and orientation method and system of mobile object
CN109946731B (en) Vehicle high-reliability fusion positioning method based on fuzzy self-adaptive unscented Kalman filtering
CN108362288B (en) Polarized light SLAM method based on unscented Kalman filtering
CN104061899B (en) A kind of vehicle side inclination angle based on Kalman filtering and angle of pitch method of estimation
JP2022019642A (en) Positioning method and device based upon multi-sensor combination
CN108387236B (en) Polarized light SLAM method based on extended Kalman filtering
CN103777220B (en) Based on the accurate position and orientation estimation method in real time of optical fibre gyro, speed pickup and GPS
CN100533066C (en) Inertia compensation method used for earth-based vehicle GPS navigation
JP5162849B2 (en) Fixed point position recorder
Li et al. Simultaneous registration and fusion of multiple dissimilar sensors for cooperative driving
CN105509738B (en) Vehicle positioning orientation method based on inertial navigation/Doppler radar combination
CN110208842A (en) Vehicle high-precision locating method under a kind of car networking environment
CN101846734B (en) Agricultural machinery navigation and position method and system and agricultural machinery industrial personal computer
CN113819914A (en) Map construction method and device
CN107274721B (en) Multi-vehicle cooperative positioning method in intelligent transportation system
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN109781098B (en) Train positioning method and system
CN104180818A (en) Monocular vision mileage calculating device
CN112147651B (en) Asynchronous multi-vehicle cooperative target state robust estimation method
CN112923931A (en) Feature map matching and GPS positioning information fusion method based on fixed route
CN105333869A (en) Unmanned reconnaissance aerial vehicle synchronous positioning and picture compositing method based on self-adaption EKF
CN112967392A (en) Large-scale park mapping and positioning method based on multi-sensor contact
CN111693044A (en) Fusion positioning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200922