CN117516533A - Vehicle fusion positioning method based on TCN and LSTM


Info

Publication number
CN117516533A
Authority
CN
China
Prior art keywords
vehicle
gnss
tcn
lstm
imu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311426556.2A
Other languages
Chinese (zh)
Inventor
张伟
张健
赵奉奎
曹伟
董洲
吴天余
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Special Equipment Safety Supervision Inspection Institute of Jiangsu Province
Original Assignee
Special Equipment Safety Supervision Inspection Institute of Jiangsu Province
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Special Equipment Safety Supervision Inspection Institute of Jiangsu Province
Priority to CN202311426556.2A
Publication of CN117516533A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39: Determining a navigation solution using signals from a system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42: Determining position
    • G01S19/45: Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47: Determining position with the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial

Abstract

The invention relates to a vehicle fusion positioning method based on TCN and LSTM, comprising the steps of obtaining vehicle data output by the vehicle IMU, obtaining positioning information from the vehicle GNSS, and fusing the IMU vehicle data and the GNSS positioning information through an extended Kalman filter. When the GNSS positioning information of the vehicle is lost in the short term, the IMU vehicle data are input into the TCN; the GNSS vehicle positioning information predicted by the TCN and the IMU vehicle data are then input into the extended Kalman filter, which finally outputs the position data required for vehicle positioning. When the GNSS positioning information of the vehicle is lost for a long time, the IMU vehicle data are instead input into the LSTM, and the LSTM output and the IMU vehicle data are input into the extended Kalman filter. The vehicle fusion positioning method based on TCN and LSTM can still reliably position the vehicle when satellite signals are lost.

Description

Vehicle fusion positioning method based on TCN and LSTM
Technical Field
The invention relates to the technical field of vehicle positioning, in particular to a vehicle fusion positioning method based on TCN and LSTM.
Background
When positioning and tracking a vehicle, existing positioning methods, such as the one disclosed in Chinese patent publication CN116817927A, "Dual-filter combined navigation positioning and attitude measuring method, electronic equipment and medium", generally use Kalman filtering to process inertial measurement unit and satellite data so as to position and track the target vehicle in real time.
Although this vehicle positioning method has high positioning accuracy, it relies on the real-time transmission of satellite data. In practical applications, vehicles often lose satellite signals temporarily or for long periods because of the environment, terrain, weather and the like. Once the satellite signal is lost, Kalman filtering can no longer combine satellite positioning data with the inertial measurement unit data to predict and track the vehicle's position in real time, which reduces the reliability of vehicle positioning.
Disclosure of Invention
To solve this technical problem, the invention aims to provide a vehicle fusion positioning method based on TCN and LSTM that can still reliably position a vehicle when satellite signals are lost.
The invention relates to a vehicle fusion positioning method based on TCN and LSTM, which comprises the following steps:
acquiring vehicle data output by an IMU (inertial measurement unit) of a vehicle;
acquiring positioning information of a vehicle GNSS, wherein the GNSS is a global navigation satellite system;
fusing the vehicle data output by the IMU and the positioning information of the GNSS through an extended Kalman filter, and outputting the position data required by the vehicle positioning through the extended Kalman filter;
when the GNSS positioning information of the vehicle is lost, acquiring the position data required for vehicle positioning through the following steps:
judging whether the GNSS positioning information of the vehicle is lost in a short term or a long term;
if the GNSS positioning information of the vehicle is lost in a short period, inputting the vehicle data output by the vehicle IMU into a TCN neural network, inputting the GNSS vehicle positioning information predicted by the TCN neural network and the vehicle data output by the vehicle IMU into an extended Kalman filter, and finally outputting the position data required by vehicle positioning by the extended Kalman filter;
if the GNSS positioning information of the vehicle is lost for a long time, inputting the vehicle data output by the vehicle IMU into the LSTM neural network, inputting the GNSS vehicle positioning information predicted by the LSTM neural network and the vehicle data output by the vehicle IMU into the extended Kalman filter, and finally outputting the position data required by vehicle positioning by the extended Kalman filter.
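The steps above amount to a single dispatch function. The sketch below is illustrative only: `tcn_model`, `lstm_model` and `ekf` are placeholder callables standing in for the trained networks and the extended Kalman filter, and cases the source leaves undefined are routed to the LSTM branch here as an assumption.

```python
def fuse_position(imu_data, gnss_fix, gnss_lost, predicted_step,
                  loss_tcn, loss_lstm, tcn_model, lstm_model, ekf):
    """One positioning cycle of the method described above (sketch;
    tcn_model, lstm_model and ekf are hypothetical placeholders)."""
    if not gnss_lost:
        # Normal operation: fuse IMU data with the real GNSS fix.
        return ekf(imu_data, gnss_fix)
    if loss_tcn < loss_lstm and predicted_step < 100:
        predicted_fix = tcn_model(imu_data)   # short-term outage -> TCN
    else:
        # Long-term outage -> LSTM (cases the source does not define
        # also default to this branch in this sketch).
        predicted_fix = lstm_model(imu_data)
    return ekf(imu_data, predicted_fix)
```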
Furthermore, in the vehicle fusion positioning method based on TCN and LSTM, the vehicle data output by the IMU are filtered and denoised by a CAE before being input into the extended Kalman filter, and the CAE-denoised IMU vehicle data are input into the extended Kalman filter.
Furthermore, in the vehicle fusion positioning method based on TCN and LSTM, the TCN and LSTM neural networks can be trained as follows:
the CAE-denoised IMU vehicle data are input into the TCN and LSTM neural networks for training, with the denoised IMU vehicle data set as the network input and the corresponding GNSS vehicle information set as the target output; training on these pairs finally yields the trained TCN and LSTM neural network models.
Further, in the vehicle fusion positioning method based on TCN and LSTM, whether the GNSS positioning information of the vehicle is lost in the short term or the long term can be judged as follows:
if L_TCN < L_LSTM and the predicted time step t < 100 when the GNSS signal is lost, the GNSS signal is judged to be lost in the short term;
if L_TCN > L_LSTM and the predicted time step t > 100 when the GNSS signal is lost, the GNSS signal is judged to be lost in the long term.
According to the vehicle fusion positioning method based on TCN and LSTM, the trained TCN or LSTM neural network outputs predicted GNSS vehicle positioning information, which replaces the vehicle position information lost during the GNSS outage and, combined with the vehicle data output by the IMU, serves as the input of the extended Kalman filter to obtain the position data required for vehicle positioning. Because the TCN and LSTM neural networks are trained on the IMU vehicle data and the GNSS positioning information, the vehicle can still obtain accurate and reliable position data through the trained TCN or LSTM network model even if the GNSS positioning information is lost in the short term or the long term, giving higher reliability than existing vehicle positioning methods.
The foregoing is merely an overview of the technical solution of the present invention; in order to understand its technical means more clearly, it is described in detail below by way of example.
Drawings
FIG. 1 is an overall technical route diagram of the vehicle fusion positioning method based on TCN and LSTM.
FIG. 2 is a system block diagram of a TCN and LSTM based vehicle fusion positioning method.
FIG. 3 is a flow chart of a vehicle fusion positioning method based on TCN and LSTM.
Fig. 4 is a structural diagram of CAE.
Fig. 5 is a structural diagram of TCN.
FIG. 6 is a block diagram of a TCN having N causal convolutional layers.
Fig. 7 is a training diagram of a TCN neural network and LSTM neural network.
Fig. 8 is a block diagram of a bi-directional LSTM.
Detailed Description
The following describes in further detail the embodiments of the present invention with reference to the drawings and examples. The following examples are illustrative of the invention and are not intended to limit the scope of the invention.
Referring to fig. 1 to 8, the vehicle fusion positioning method based on TCN and LSTM of the present embodiment includes:
acquiring vehicle data output by an IMU (inertial measurement unit) of a vehicle;
acquiring positioning information of a vehicle GNSS, wherein the GNSS is a global navigation satellite system;
fusing the vehicle data output by the IMU and the positioning information of the GNSS through an extended Kalman filter, and outputting the position data required by the vehicle positioning through the extended Kalman filter;
when the GNSS positioning information of the vehicle is lost, acquiring the position data required for vehicle positioning through the following steps:
judging whether the GNSS positioning information of the vehicle is lost in a short term or a long term;
if the GNSS positioning information of the vehicle is lost in a short period, inputting the vehicle data output by the vehicle IMU into a TCN neural network, inputting the GNSS vehicle positioning information predicted by the TCN neural network and the vehicle data output by the vehicle IMU into an extended Kalman filter, and finally outputting the position data required by vehicle positioning by the extended Kalman filter;
if the GNSS positioning information of the vehicle is lost for a long time, inputting the vehicle data output by the vehicle IMU into the LSTM neural network, inputting the GNSS vehicle positioning information predicted by the LSTM neural network and the vehicle data output by the vehicle IMU into the extended Kalman filter, and finally outputting the position data required by vehicle positioning by the extended Kalman filter.
According to the vehicle fusion positioning method based on TCN and LSTM, the trained TCN or LSTM neural network outputs predicted GNSS vehicle positioning information, which replaces the vehicle position information lost during the GNSS outage and, combined with the vehicle data output by the IMU, serves as the input of the extended Kalman filter to obtain the position data required for vehicle positioning. Because the TCN and LSTM neural networks are trained on the IMU vehicle data and the GNSS positioning information, the vehicle can still obtain accurate and reliable position data through the trained TCN or LSTM network model even if the GNSS positioning information is lost in the short term or the long term, giving higher reliability than existing vehicle positioning methods.
The IMU (inertial measurement unit) outputs vehicle data comprising the three-axis angular velocity and linear acceleration of the travelling vehicle; the GNSS positioning information of the vehicle includes its longitude, latitude and altitude.
EKF (Extended Kalman Filter) extends Kalman filter theory to the nonlinear domain, enabling state estimation and data updating. This is related art; specific details can be found in the relevant literature and are not repeated here.
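For reference, one generic EKF predict/update cycle can be sketched as below. The state model `f`, measurement model `h` and their Jacobians `F`, `H` are placeholders, not the patent's specific vehicle model.

```python
import numpy as np

def ekf_step(x, P, u, z, f, F, h, H, Q, R):
    """One predict/update cycle of an extended Kalman filter (generic
    sketch). x: state estimate, P: covariance, u: control/IMU input,
    z: measurement (GNSS fix or network-predicted fix)."""
    # Predict: propagate the state through the (possibly nonlinear) model.
    x_pred = f(x, u)
    F_k = F(x, u)
    P_pred = F_k @ P @ F_k.T + Q
    # Update: correct the prediction with the measurement.
    H_k = H(x_pred)
    y = z - h(x_pred)                      # innovation
    S = H_k @ P_pred @ H_k.T + R           # innovation covariance
    K = P_pred @ H_k.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H_k) @ P_pred
    return x_new, P_new
```

With a large initial covariance, a single update pulls the position estimate close to the measurement, as expected from the gain formula.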
Preferably, the vehicle data output by the IMU may be filtered and denoised by a CAE (convolutional autoencoder) before being input to the extended Kalman filter. Inputting the CAE-denoised IMU vehicle data to the extended Kalman filter further improves the accuracy and reliability of the vehicle positioning data.
The encoder portion of the CAE includes an input layer, a convolutional layer, a pooling layer and an encoding layer; the decoder portion includes a decoding layer, an unpooling layer, a deconvolution layer and an output layer. Together they reduce and reconstruct the vehicle data acquired by the IMU by mapping the input data to a low-dimensional space (encoder) and then restoring it to the original data (decoder). The learned optimal filters minimize the reconstruction error of the IMU vehicle data, thereby filtering and denoising it.
The CAE is trained with a stochastic gradient descent (SGD) optimizer; the training process includes forward propagation, loss computation, backpropagation to optimize the weights and parameters, and evaluation of model performance on a test set after each epoch.
The loss function for CAE training is the mean squared error:

L_CAE = (1/N) · Σ_{i=1}^{N} (X_i − X̂_i)²

where N is the number of samples in the training dataset, X_i is the actual observation of the vehicle information acquired by the IMU, and X̂_i is the reconstructed value of the IMU vehicle data output by the decoder.
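The loss above is plain mean squared error and can be computed directly:

```python
import numpy as np

def mse_loss(x, x_hat):
    """Mean squared error over N samples:
    L = (1/N) * sum_i (x_i - x_hat_i)^2."""
    x, x_hat = np.asarray(x, dtype=float), np.asarray(x_hat, dtype=float)
    return float(np.mean((x - x_hat) ** 2))
```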
Whether the GNSS positioning information of the vehicle is lost in the short term or the long term is judged as follows:
if L_TCN < L_LSTM and the predicted time step t < 100 when the GNSS signal is lost, the GNSS signal is judged to be lost in the short term; in this case, the IMU vehicle data are used as input to the TCN neural network, which outputs predicted GNSS position information to replace the vehicle position information lost during the short-term GNSS outage.
Specifically, the TCN neural network takes the CAE-denoised IMU vehicle data as its input. The initial position and the predicted position increments are added to obtain the output of the TCN neural network:

P(T) = P_0 + Σ_{t=1}^{T} ΔP_t

where P_0 is the last position fix obtained before the GNSS signal was lost, used as the initial position; P(T) is the position information output by the neural network; T is the current time step; ΔP_t is the position increment of the current time step relative to the previous one; and E_t, the CAE output at the current time step, is the network input.
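The accumulation P(T) = P_0 + Σ ΔP_t is straightforward to implement; the sketch below assumes the per-step increments have already been predicted by the network.

```python
def reconstruct_position(p0, increments):
    """Accumulate predicted per-step position increments onto the last
    GNSS fix before the outage: P(T) = P0 + sum_{t=1..T} dP_t.
    Returns the reconstructed trajectory, one entry per time step."""
    p = p0
    trajectory = []
    for dp in increments:
        p = p + dp
        trajectory.append(p)
    return trajectory
```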
Wherein L is TCN Is the mean square error loss function value of the TCN neural network, L LSTM The loss function value of the LSTM neural network is calculated, and t is the predicted time step of the neural network;
if L TCN > L LSTM And predicted time step t when GNSS signal is lost>100, determining that the GNSS signal is lost for a long time, at this time, using the vehicle data output by the IMU as an input to the LSTM neural network, and outputting predicted GNSS position information by the LSTM neural network, instead of the vehicle position information in which the GNSS signal is lost for a long time.
Specifically, the LSTM neural network takes the output of the CAE encoder as its input; the initial position and the predicted position increments are likewise added to obtain the output of the LSTM neural network:

P(T) = P_0 + Σ_{t=1}^{T} ΔP_t
in practical application, the TCN neural network has better prediction effect when GNSS signals are lost in a short period, and the LSTM neural network has better prediction effect when GNSS signals are lost in a long period.
When the GNSS signals are lost, the GNSS predicted position information output by the neural network and the IMU vehicle data are input into the extended Kalman filter for filtering fusion, and finally the final position information required by vehicle positioning is output.
Under the default condition, the vehicle data output by the IMU is input to the extended Kalman filter through the TCN neural network and is fused with the IMU vehicle data, and finally the vehicle positioning position information is formed.
Among them, temporal Convolutional Networks (TCN) is a convolutional neural network for processing sequence data. It is able to capture the time dependence in the sequence by applying a convolution operation in the time dimension. TCN has two main features: firstly, processing the sequence data with any length can be carried out during training; and secondly, the method has excellent parallel computing capability.
The TCN neural network model comprises an input layer, causal dilated convolution layers, activation functions, residual connections, a pooling layer and an output layer. The mean squared error loss function of the TCN is:

L_TCN = (1/N) · Σ_{i=1}^{N} (P_i − P̂_i)²

where P_i is the actual observation of the GNSS vehicle position information and P̂_i is the predicted value of the GNSS vehicle position information output by the neural network;
the trained optimizer selects random gradient descent (SGD), and the training process includes forward propagation, loss function calculation, back propagation of optimization weights and parameters, and model performance evaluation through a test set after each epoch is finished.
LSTM is a commonly used recurrent neural network structure that can memorize historical information and use that information to generate new outputs.
The LSTM neural network model comprises an input layer, an LSTM layer and an output layer. The LSTM layer comprises several LSTM units, each containing a memory cell and a hidden state. A single LSTM unit determines its state through three gates, namely a forget gate, an input gate and an output gate:

f_t = σ(W_f · [h_{t−1}, x_t] + b_f)
i_t = σ(W_i · [h_{t−1}, x_t] + b_i)
C̃_t = tanh(W_c · [h_{t−1}, x_t] + b_c)
C_t = f_t ⊙ C_{t−1} + i_t ⊙ C̃_t
o_t = σ(W_o · [h_{t−1}, x_t] + b_o)
h_t = o_t ⊙ tanh(C_t)

where x_t is the input of the LSTM unit; f_t, i_t and o_t are the output information of the forget gate, input gate and output gate respectively; C_t is the memory-cell output and C_{t−1} the memory-cell output of the previous time step; h_t is the hidden-unit output (also the output of the LSTM unit) and h_{t−1} the hidden-unit output of the previous time step; C̃_t is the candidate memory-cell output; tanh is an activation function and σ is the sigmoid activation function; t is the time step; W_f, W_i, W_c, W_o are weight matrices; b_f, b_i, b_c, b_o are bias values obtained during model training; and ⊙ denotes element-wise multiplication of two vectors;
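A single forward step of such an LSTM unit can be written out directly from the gate equations. Shapes and weights below are illustrative placeholders, not the patent's trained parameters.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell(x_t, h_prev, c_prev, W, b):
    """One LSTM unit step implementing the gate equations above.
    W maps gate names 'f', 'i', 'c', 'o' to weight matrices applied to
    the concatenation [h_prev, x_t]; b holds the bias vectors."""
    z = np.concatenate([h_prev, x_t])
    f_t = sigmoid(W["f"] @ z + b["f"])      # forget gate
    i_t = sigmoid(W["i"] @ z + b["i"])      # input gate
    c_tilde = np.tanh(W["c"] @ z + b["c"])  # candidate memory cell
    c_t = f_t * c_prev + i_t * c_tilde      # new memory cell
    o_t = sigmoid(W["o"] @ z + b["o"])      # output gate
    h_t = o_t * np.tanh(c_t)                # hidden state / unit output
    return h_t, c_t
```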
the mean square error loss function of LSTM is:
similarly, its trained optimizer selects random gradient descent (SGD), and the training process includes forward propagation, calculation of a loss function, back propagation of optimization weights and parameters, and evaluation of model performance by a test set after each epoch is completed.
The TCN and LSTM neural networks may be trained as follows:
the CAE-denoised IMU vehicle data are input into the TCN and LSTM neural networks for training, with the denoised IMU vehicle data set as the network input and the corresponding GNSS vehicle information set as the target output; training on these pairs finally yields the trained TCN and LSTM neural network models.
For the TCN neural network, in order to better prevent overfitting and to capture as much of the original feature information in the data as possible, the TCN used for training and prediction employs several causal dilated convolution layers of different sizes. The model's input data are fed into each of the causal dilated convolution layers, which respectively output sequences Y_1, Y_2, …, Y_n of equal length; Y_1, Y_2, …, Y_n are then input into a fully connected layer to obtain the output of the causal dilated convolution stage,
where Y_1, Y_2, …, Y_n are the output data of the causal dilated convolution layers, indexed by the layer number n.
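A one-channel causal dilated convolution, the building block of each TCN branch, can be sketched as follows (kernel and dilation values are illustrative; left zero padding keeps the output the same length as the input, matching the equal-length branch outputs Y_1, …, Y_n described above).

```python
import numpy as np

def causal_dilated_conv(x, kernel, dilation):
    """1-D causal dilated convolution: the output at step t depends only
    on inputs at t, t-d, t-2d, ... (never on future steps)."""
    k = len(kernel)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    return np.array([
        sum(kernel[j] * xp[pad + t - j * dilation] for j in range(k))
        for t in range(len(x))
    ])
```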
for LSTM neural networks, to improve accuracy of predictions, bi-directional information features are better captured, training and predictions are performed using bi-directional LSTM:
wherein w is 1 And w 2 Weight vector, w, belonging to weight matrix in forward layer 3 And w 5 Weight vector, w, in weight matrix in backward layer 4 And w 6 Weight vector belonging to weight matrix in output layer, h t Is the output of the hidden unit of the forward propagation layer,is the output of the hidden unit of the counter propagation layer, H t And g and f are activating functions for outputting results for the output layer.
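The bidirectional structure reduces to a forward pass, a backward pass, and a weighted combination. A scalar sketch with f = tanh and g = identity (choices made here for illustration, not specified in the source):

```python
import numpy as np

def bidirectional_rnn(xs, w1, w2, w3, w4, w5, w6):
    """Scalar bidirectional RNN matching the equations above:
    forward  h_t  = tanh(w1*x_t + w2*h_{t-1}),
    backward h'_t = tanh(w3*x_t + w5*h'_{t+1}),
    output   H_t  = w4*h_t + w6*h'_t."""
    n = len(xs)
    h_fwd, h_bwd = np.zeros(n), np.zeros(n)
    h = 0.0
    for t in range(n):                  # forward pass over the sequence
        h = np.tanh(w1 * xs[t] + w2 * h)
        h_fwd[t] = h
    h = 0.0
    for t in reversed(range(n)):        # backward pass over the sequence
        h = np.tanh(w3 * xs[t] + w5 * h)
        h_bwd[t] = h
    return w4 * h_fwd + w6 * h_bwd      # combined output H_t
```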
The above is only a preferred embodiment of the present invention, provided to help those skilled in the art implement the corresponding technical solution, and is not intended to limit the scope of protection, which is defined by the appended claims. It should be noted that, on the basis of the technical solution of the present invention, those skilled in the art can make several improvements and equivalent modifications, and these should also be regarded as falling within the scope of protection. Meanwhile, although the present disclosure describes the above embodiments, not every embodiment contains only one independent technical solution; the description is organized this way merely for clarity, and those skilled in the art should consider the disclosure as a whole, since the technical solutions of the embodiments may be combined appropriately to form other embodiments.

Claims (4)

1. A vehicle fusion positioning method based on TCN and LSTM, comprising:
acquiring vehicle data output by an IMU (inertial measurement unit) of a vehicle;
acquiring positioning information of a vehicle GNSS, wherein the GNSS is a global navigation satellite system;
fusing the vehicle data output by the IMU and the positioning information of the GNSS through an extended Kalman filter, and outputting the position data required by the vehicle positioning through the extended Kalman filter;
the method is characterized in that:
when the GNSS positioning information of the vehicle is lost, acquiring the position data required for vehicle positioning through the following steps:
judging whether the GNSS positioning information of the vehicle is lost in a short term or a long term;
if the GNSS positioning information of the vehicle is lost in a short period, inputting the vehicle data output by the vehicle IMU into a TCN neural network, inputting the GNSS vehicle positioning information predicted by the TCN neural network and the vehicle data output by the vehicle IMU into an extended Kalman filter, and finally outputting the position data required by vehicle positioning by the extended Kalman filter;
if the GNSS positioning information of the vehicle is lost for a long time, inputting the vehicle data output by the vehicle IMU into the LSTM neural network, inputting the GNSS vehicle positioning information predicted by the LSTM neural network and the vehicle data output by the vehicle IMU into the extended Kalman filter, and finally outputting the position data required by vehicle positioning by the extended Kalman filter.
2. The TCN and LSTM based vehicle fusion localization method of claim 1, wherein: the vehicle data output by the IMU are filtered and denoised by a CAE before being input into the extended Kalman filter, and the CAE-denoised IMU vehicle data are input into the extended Kalman filter.
3. The TCN and LSTM based vehicle fusion localization method of claim 2, wherein the TCN and LSTM neural networks are trained as follows:
the CAE-denoised IMU vehicle data are input into the TCN and LSTM neural networks for training, with the denoised IMU vehicle data set as the network input and the corresponding GNSS vehicle information set as the target output; training on these pairs finally yields the trained TCN and LSTM neural network models.
4. The TCN and LSTM based vehicle fusion localization method of claim 1, wherein:
the judgment of whether the GNSS positioning information of the vehicle is lost in a short term or a long term can be realized by the following method;
if L TCN < L LSTM And predicted time step t when GNSS signal is lost<100, judging that the GNSS signals are lost in a short period;
if L TCN > L LSTM And predicted time step t when GNSS signal is lost>100, it is determined that the GNSS signal is lost for a long period of time.
CN202311426556.2A 2023-10-31 2023-10-31 Vehicle fusion positioning method based on TCN and LSTM Pending CN117516533A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311426556.2A CN117516533A (en) 2023-10-31 2023-10-31 Vehicle fusion positioning method based on TCN and LSTM


Publications (1)

Publication Number Publication Date
CN117516533A 2024-02-06

Family

ID=89754105



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination