CN112183196B - Traffic intersection vehicle state estimation method based on adaptive fusion filter - Google Patents
- Publication number: CN112183196B (application CN202010844156.3A)
- Authority: CN (China)
- Prior art keywords: vehicle, noise, matrix, ufir, time
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
      - G06V20/00—Scenes; Scene-specific elements
        - G06V20/50—Context or environment of the image
          - G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
      - G06V10/00—Arrangements for image or video recognition or understanding
        - G06V10/20—Image preprocessing
          - G06V10/30—Noise filtering
        - G06V10/40—Extraction of image or video features
          - G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
            - G06V10/443—Local feature extraction by matching or filtering
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a traffic intersection vehicle state estimation method based on a KF/UFIR adaptive fusion filter, which comprises the following steps: first, a vehicle driving kinematics model is established; second, the parameter matrices of the individual KF and UFIR filters are determined; next, noise statistical characteristic information is obtained by looking up a table with the measured environmental factors, the variance of the looked-up values is calculated to read the corresponding fusion factors, the vehicle motion information observed by the current camera is read, and historical vehicle motion information over the optimal UFIR sliding-window length is continuously accumulated and stored; finally, the fusion factors are input into the fusion filter, the KF and UFIR estimation results are fused, and the optimized vehicle state estimate is output and sent to the other vehicles at the traffic intersection. Compared with a single filter, the KF/UFIR adaptive fusion filter adopted by the invention has good robustness and optimality, outputs a more accurate estimation result in environments with changing noise characteristics, and helps improve driving safety.
Description
Technical Field
The invention belongs to the field of intelligent traffic and state estimation, and particularly relates to a traffic intersection vehicle state estimation method based on a KF/UFIR adaptive fusion filter.
Background
Traffic intersections are a major component of the vehicle driving environment and an important guide for the development and application of Intelligent Transportation System (ITS) and intelligent connected vehicle technologies. Statistics show that traffic intersections account for a large share of traffic accident scenarios and have a high accident probability; owing to blind zones, occlusion, and the limited accuracy of on-board sensing systems, a vehicle at an intersection may fail to detect the vehicles travelling around it. Estimating the motion states of the different vehicles therefore has important application value: the data processed by the algorithm are sent to the corresponding vehicles through intelligent networking technology, and the vehicles perform relevant operations such as braking, which can effectively reduce the incidence of traffic intersection accidents and improve driving safety.
At a traffic intersection, the motion state of a vehicle is sensed by a camera sensor, and Kalman filtering (KF) is commonly adopted as the estimation algorithm. The accuracy of the KF depends mainly on the nature and statistical characteristics of the relevant noise — for example, whether the noise is white Gaussian and whether its covariance matrix is known and accurate. The observation noise of a traffic intersection camera changes continuously with environmental factors such as weather, temperature, and time, and when the noise information deviates significantly, the KF estimation result also deviates significantly. The Unbiased Finite Impulse Response (UFIR) filter is often adopted when the noise statistical information is inaccurate: it does not require the statistical characteristics of the noise during estimation, estimates the current state from historical observation data, performs well in a variety of environments, and is more robust than KF, but is less accurate than KF when the noise statistics are accurately known.
Disclosure of Invention
The invention aims to solve the problem that the Kalman filtering algorithm used alone produces large estimation errors when the noise statistical characteristics are uncertain, while the UFIR algorithm is less accurate than the Kalman filtering algorithm when the noise statistical characteristics are known, and provides a traffic intersection vehicle state estimation method based on a KF/UFIR adaptive fusion filter.
The invention relates to a traffic intersection vehicle state estimation method based on a self-adaptive fusion filter, which comprises the following specific steps of:
step one, constructing a vehicle kinematic model;
step two, respectively determining the parameter matrices related to the single KF and UFIR filters according to the basic parameters of the vehicle kinematic model;
step three, acquiring environmental information at the traffic intersection through a network and measuring devices, looking up the noise statistical characteristic information under the corresponding conditions, reading the fusion factor from the calculated variance, and simultaneously reading and storing in real time the vehicle historical position information corresponding to the optimal UFIR sliding-window length;
and step four, inputting the fusion factor into a KF/UFIR fusion filter, carrying out state estimation on the acquired vehicle state information data by the filter to obtain the position and speed information of the vehicle at the next moment, and sending the estimation information to other vehicles.
The invention has the advantages that:
(1) The traffic intersection vehicle state estimation method based on the adaptive fusion filter fuses the KF and UFIR algorithms for vehicle state estimation so that their advantages and disadvantages complement each other; a more accurate vehicle state estimate is achieved whether or not accurate noise statistical characteristics are available, with better robustness and optimality.
(2) The traffic intersection vehicle state estimation method based on the adaptive fusion filter lets the UFIR filter dominate when the noise statistical characteristics cannot be obtained accurately; the UFIR algorithm has stronger robustness and universality and still performs well in an environment where the noise signal changes continuously.
(3) The traffic intersection vehicle state estimation method based on the adaptive fusion filter proposes a noise statistical characteristic look-up-table method.
Drawings
FIG. 1 is a flow chart of a traffic intersection vehicle state estimation method based on an adaptive fusion filter according to the present invention.
FIG. 2 is a schematic flow chart of a KF/UFIR adaptive fusion filtering algorithm in the traffic intersection vehicle state estimation method based on the adaptive fusion filter of the present invention.
FIG. 3 is a schematic diagram of a fusion factor arrangement table adopted by the traffic intersection vehicle state estimation method based on the adaptive fusion filter of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
The invention discloses a traffic intersection vehicle state estimation method based on a self-adaptive fusion filter, which comprises the following specific steps as shown in figure 1:
step one, constructing a vehicle kinematic model:

$$X_{k+1} = A X_k + B w_k, \qquad Z_k = C X_k + v_k \tag{1}$$

In formula (1), $X_k \in R^4$ is the system state variable at time k and represents the real motion state of the vehicle; $Z_k \in R^2$ is the observation variable at time k and represents the observed value of the vehicle motion state. $X_k$ and $Z_k$ are specifically defined as:

$$X_k = [x(k) \;\; y(k) \;\; v_x(k) \;\; v_y(k)]^T, \qquad Z_k = [x_z(k) \;\; y_z(k)]^T \tag{2}$$

where x(k) and y(k) respectively represent the lateral and longitudinal positions of the vehicle at time k; $x_z(k)$ and $y_z(k)$ respectively represent the observed values of the lateral and longitudinal positions; $v_x(k)$ and $v_y(k)$ are the lateral and longitudinal velocities of the vehicle at time k.

A, B, and C are the parameter matrices, namely the state transition matrix, the process-noise coefficient matrix, and the observation transition matrix, with $A \in R^{4\times 4}$, $B \in R^{4\times 2}$, $C \in R^{2\times 4}$. With sampling period T, the parameter matrices A, B, C are specifically defined as:

$$A = \begin{bmatrix} 1 & 0 & T & 0 \\ 0 & 1 & 0 & T \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad B = \begin{bmatrix} T^2/2 & 0 \\ 0 & T^2/2 \\ T & 0 \\ 0 & T \end{bmatrix}, \quad C = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix} \tag{3}$$

$w_k \in R^2$ is the process noise with covariance matrix Q; in the vehicle motion model the vehicle acceleration is assumed to be a random variable and is taken as the process noise. $v_k \in R^2$ is the measurement noise with covariance matrix R. The specific definitions are:

$$w_k = [a_x(k) \;\; a_y(k)]^T \tag{4}$$

$$E(w_k w_k^T) = Q, \qquad E(v_k v_k^T) = R \tag{5}$$

where $a_x(k)$ and $a_y(k)$ respectively represent the acceleration values of the vehicle in the lateral and longitudinal directions, and E(·) denotes mathematical expectation.

The measurement-noise covariance matrix is obtained from the camera sensor manufacturer and actual measurement experiments, and the process-noise covariance matrix can be obtained through the corresponding calculation. The initial process and measurement noises are both assumed to have zero mean:

$$E(w_k) = 0, \qquad E(v_k) = 0 \tag{6}$$
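The model of step one can be sketched in code. This is a minimal illustration, not part of the patent: the matrix entries follow the standard constant-velocity discretization assumed in the reconstruction above, with the state ordered as [x, y, v_x, v_y].

```python
import numpy as np

def make_model_matrices(T):
    """Build A, B, C for the constant-velocity intersection model.

    State X_k = [x, y, v_x, v_y]^T, observation Z_k = [x_z, y_z]^T,
    process noise w_k = [a_x, a_y]^T (accelerations). The exact entries
    are an assumption of this sketch (standard discretization with
    sampling period T).
    """
    A = np.array([[1, 0, T, 0],
                  [0, 1, 0, T],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)   # state transition matrix
    B = np.array([[T**2 / 2, 0],
                  [0, T**2 / 2],
                  [T,        0],
                  [0,        T]], dtype=float)   # process-noise coefficient matrix
    C = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)    # observation transition matrix
    return A, B, C

A, B, C = make_model_matrices(T=0.1)
```

Propagating a state through A advances each position by its velocity times T, which is exactly formula (1) with zero noise.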
step two, respectively determining the parameter matrices related to the KF and UFIR filters according to the vehicle kinematic model obtained in step one:
The adaptive fusion filter is built on the KF and UFIR algorithms. The calculation flows of the two algorithms are essentially the same, each divided into an a priori prediction process and an a posteriori update process; the specific parameter matrices in the KF and UFIR are determined from the vehicle kinematic model of step one.
Specifically, the determination and calculation process of the Kalman filter parameter matrices is as follows:

$$\hat{X}_k^- = A \hat{X}_{k-1}^+$$
$$P_k^- = A P_{k-1}^+ A^T + B Q B^T$$
$$K_k = P_k^- C^T (C P_k^- C^T + R)^{-1}$$
$$\hat{X}_k^+ = \hat{X}_k^- + K_k (Z_k - C \hat{X}_k^-)$$
$$P_k^+ = (I - K_k C) P_k^- \tag{7}$$

where $\hat{X}_k^-$ and $\hat{X}_k^+$ are respectively the a priori prediction estimate and the a posteriori update estimate of the vehicle motion state by the Kalman filter at time k; $\hat{X}_{k-1}^+$ is the a posteriori update estimate at time k−1; $P_k^-$ and $P_k^+$ are respectively the a priori prediction error covariance matrix and the a posteriori update error covariance matrix at time k; $P_{k-1}^+$ is the a posteriori update error covariance matrix at time k−1; I is the identity matrix; and $K_k$ is the Kalman gain at time k, which minimizes the a posteriori update error covariance matrix.
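One prediction/update cycle of formula (7) can be sketched as follows; a minimal illustration assuming the matrix names used in the text (Q the process-noise covariance, R the measurement-noise covariance), not the patent's own implementation.

```python
import numpy as np

def kf_step(x_post, P_post, Z, A, B, C, Q, R):
    """One Kalman filter cycle: a priori prediction, then a posteriori update."""
    # a priori prediction of state and error covariance
    x_prior = A @ x_post
    P_prior = A @ P_post @ A.T + B @ Q @ B.T
    # Kalman gain minimizing the a posteriori error covariance
    K = P_prior @ C.T @ np.linalg.inv(C @ P_prior @ C.T + R)
    # a posteriori update with the innovation Z - C x_prior
    x_post = x_prior + K @ (Z - C @ x_prior)
    P_post = (np.eye(A.shape[0]) - K @ C) @ P_prior
    return x_post, P_post
```

With a very small R, the gain drives the estimated position almost exactly onto the measurement, which is a quick sanity check of the update step.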
The UFIR filter is mainly divided into an iterative form and a batch form; in order to fuse with the KF, the invention adopts the iterative form of calculation. The specific calculation process is as follows.

The iterative initial value needs to be obtained by batch processing:

$$G_s = (H_{m,s}^T H_{m,s})^{-1}, \qquad \hat{X}_s = G_s H_{m,s}^T Z_{m,s} \tag{8}$$

Assume the current time is k, the optimal UFIR sliding-window length is N, and the batch initialization length is L. In formula (8), m is the starting point of the batch formula, m = k − N + 1; s is the end point, s = k − N + L; $G_s$ is the generalized noise power gain matrix at time s; $H_{m,s}$ is the iteration parameter matrix from time m to s; $\hat{X}_s$ is the batch a posteriori state estimate at time s, i.e. the initial value of the iteration; and $Z_{m,s}$ is the historical observation information matrix from time m to s.

The iteration parameter matrix $H_{m,k}$ is expressed as:

$$H_{m,k} = h_{m,k} F_{m,k}, \qquad h_{m,k} = \mathrm{diag}(C_m, C_{m+1}, \ldots, C_k), \qquad F_{m,k} = \begin{bmatrix} (\mathcal{A}_{m+1}^{k})^{-1} \\ (\mathcal{A}_{m+2}^{k})^{-1} \\ \vdots \\ (\mathcal{A}_{k}^{k})^{-1} \\ I \end{bmatrix} \tag{9}$$

where $F_{m,k}$ is the state-transition accumulation matrix; $h_{m,k}$ is the coefficient accumulation matrix; $C_x$ is the observation transition matrix C at time x, x = m, m+1, …, k; $A_x$ is the system state transition matrix A at time x, x = m+1, m+2, …, k; and $\mathcal{A}_{g}^{r}$ is the accumulated state transition matrix from time g to r, generally defined as:

$$\mathcal{A}_{g}^{r} = A_r A_{r-1} \cdots A_g$$

where $A_r, A_{r-1}, \ldots, A_g$ are respectively the system state transition matrices at times r, r−1, …, g.

Further, the iterative calculation formula, for r = s+1, s+2, …, k, is as follows:

$$G_r = [C^T C + (A G_{r-1} A^T)^{-1}]^{-1}$$
$$D_r = G_r C^T$$
$$\hat{X}_r = A \hat{X}_{r-1} + D_r (Z_r - C A \hat{X}_{r-1}) \tag{10}$$

where r starts from the initial point of the iteration, i.e. r = s+1 with $G_{r-1} = G_s$ and $\hat{X}_{r-1} = \hat{X}_s$, and $D_r$ is the bias-correction gain matrix at time r; the estimate obtained at r = k is the UFIR output.
Throughout the calculation, the UFIR filter needs no noise statistical characteristic information, but to guarantee the accuracy of the result the optimal sliding-window length $N_{opt}$ must be determined, i.e. the number of historical data points the entire UFIR filter needs for state estimation. There are many feasible methods for determining $N_{opt}$; in the invention it is mainly obtained by calculating the derivative of the trace of the measurement residual covariance matrix and finding its minimum, with the specific formula as follows:
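The batch-seeded iteration of formulas (8)–(10) can be sketched as below. This is a hedged reconstruction, not the patent's code: the seeding length L = ceil(n/p) (the shortest full-rank batch) is an assumption, and note that no noise statistics Q or R appear anywhere.

```python
import numpy as np

def ufir_estimate(Z_hist, A, C):
    """UFIR estimate from a window of observations Z_m..Z_k (newest last).

    Batch stage seeds x_hat and the generalized noise power gain G at
    time s = m + L - 1; the iterative stage then runs r = s+1 .. k.
    """
    n, p = A.shape[0], C.shape[0]
    L = int(np.ceil(n / p))                       # batch initialization length
    A_inv = np.linalg.inv(A)
    # Batch stage, formula (8): express Z_m..Z_s in terms of the state at s
    H = np.vstack([C @ np.linalg.matrix_power(A_inv, L - 1 - i)
                   for i in range(L)])
    G = np.linalg.inv(H.T @ H)                    # generalized noise power gain G_s
    x_hat = G @ H.T @ np.asarray(Z_hist[:L]).reshape(-1)
    # Iterative stage, formula (10)
    for Z_r in Z_hist[L:]:
        G = np.linalg.inv(C.T @ C + np.linalg.inv(A @ G @ A.T))
        x_pred = A @ x_hat
        x_hat = x_pred + G @ C.T @ (Z_r - C @ x_pred)   # D_r = G_r C^T
    return x_hat
```

On noiseless constant-velocity data the recursion reproduces the true state exactly, since every innovation is zero after the batch seed.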
step three, acquiring the environmental information and vehicle running information at the traffic intersection through a network and measuring devices, looking up the noise statistical characteristic information under the corresponding conditions, reading the fusion factor from the calculated variance, and simultaneously reading and storing in real time the vehicle historical position information corresponding to the optimal UFIR sliding-window length:
The environmental factors that mainly affect the camera observation result comprise the weather condition, the illumination intensity and brightness, the outdoor temperature, and the observation time, and each of the four factors is graded. The weather condition is graded mainly by the weather states issued by the meteorological department, such as cloudy, light rain, and clear; the outdoor temperature is graded every 1 °C between the local historical minimum and maximum temperatures; the illumination intensity and brightness are graded on 10 equal levels between the measured extreme values; the observation time is graded every half hour, although it mainly plays a reference role because time itself generally has little influence on the result. A single-variable experiment is performed for every grade of the four factors, and the statistical characteristics of the camera observation noise are measured and recorded under the different conditions. Assuming the four factors have $n_1$, $n_2$, $n_3$, and $n_4$ grades respectively, a total of $n = n_1 n_2 n_3 n_4$ noise statistics $MEA_x$ (x = 1, 2, …, n) can be obtained; with the support of big-data technology, repeating the measurement many times while keeping a single variable yields relatively accurate data results.
For the process noise, the running vehicle can send its own lateral and longitudinal acceleration values to the estimation processing centre at the traffic intersection through technologies such as DSRC or LTE-V. Although the transmission of this information has a certain delay, the vehicle acceleration remains basically unchanged over the delay time, so the invention focuses mainly on the influence of the observation noise on the KF estimation result.
The traffic intersection observation equipment acquires the weather condition, the illumination intensity and brightness, and the outdoor temperature at the intersection position through the network, and reads the corresponding observation-noise statistical characteristics by the observation time. However, because the four factors are discretely graded, the data acquired by the equipment will, with high probability, not fall exactly on a grading point, so a fuzzy-mean approach is adopted. For example, suppose the weather condition is clear, the illumination intensity and brightness lie between grades 3 and 4, the outdoor temperature is 21.5 °C, and the observation time is 9:48; the measurements of the three environmental factors other than the weather condition are then not at precise grading points. Selecting the grading points on the two adjacent sides of each such measurement yields a total of 1 × 2 × 2 × 2 = 8 results, denoted $c_1, c_2, \ldots, c_8$, whose mean is taken as the final result:

$$MEA = \frac{1}{8}\sum_{i=1}^{8} MEA_{c_i} \tag{13}$$

where $MEA_{c_1}$ to $MEA_{c_8}$ are the 8 noise statistical characteristic results corresponding to the above case.
Finally, the variance of the 8 results $MEA_{c_1}, MEA_{c_2}, \ldots, MEA_{c_8}$ about their mean MEA is calculated, and the KF/UFIR fusion factors a and b are determined by table look-up, where a + b = 1:

$$\sigma^2 = \frac{1}{8}\sum_{i=1}^{8} (MEA_{c_i} - MEA)^2 \tag{14}$$

where $\sigma^2$ is the variance of the looked-up noise statistical characteristics, and $c_i$ (i = 1, 2, …, 8) denotes the 8 environmental-factor combinations obtained above.
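The fuzzy-mean lookup of formulas (13)–(14) can be sketched as follows. The variance thresholds and a-values in `TABLE` are hypothetical, since the patent's fusion-factor table (Fig. 3) is not reproduced in the text.

```python
import numpy as np

def fusion_factor(neighbor_stats, table):
    """Fuzzy-mean lookup of the fusion factors a and b (a + b = 1).

    neighbor_stats: the 8 noise statistics MEA_c1..MEA_c8 read at the
    grading points adjacent to the measured environmental factors.
    table: (variance threshold, a) pairs, smallest threshold first.
    """
    stats = np.asarray(neighbor_stats, dtype=float)
    mea = stats.mean()                        # fuzzy mean, formula (13)
    sigma2 = ((stats - mea) ** 2).mean()      # variance, formula (14)
    for threshold, a in table:
        if sigma2 <= threshold:
            return a, 1.0 - a
    return table[-1][1], 1.0 - table[-1][1]

# Hypothetical table: small variance -> trust KF (large a), large -> trust UFIR.
TABLE = [(0.01, 0.9), (0.1, 0.6), (1.0, 0.3), (np.inf, 0.1)]
a, b = fusion_factor([0.20, 0.21, 0.19, 0.20, 0.22, 0.18, 0.20, 0.21], TABLE)
```

Here the 8 neighboring statistics are tightly clustered, so the variance is small and the lookup returns a large a, i.e. confidence in the KF.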
Further, the observation data of the running vehicle are obtained by processing the camera images. Only the measured value at the current time is needed for the KF, but the UFIR must store historical observation data according to the current optimal sliding-window length $N_{opt}$:

$$Z_{m,k} = [Z_m^T \;\; Z_{m+1}^T \;\; \cdots \;\; Z_k^T]^T \tag{15}$$

where k is the current time and $Z_{m,k}$ is the historical observation data parameter matrix corresponding to $N_{opt}$, with m = k − $N_{opt}$ + 1.
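The rolling storage of the last $N_{opt}$ observations can be kept in a bounded buffer; a small sketch, where the value N_opt = 25 is purely illustrative (the patent derives it from the residual covariance in step two).

```python
from collections import deque

# Ring buffer for Z_{m,k}: holds the N_opt most recent observations only.
N_opt = 25
history = deque(maxlen=N_opt)

def on_camera_observation(Z):
    """Append each new observation; the oldest falls out automatically."""
    history.append(Z)
    return len(history) == N_opt   # True once the UFIR window is full
```

Using `deque(maxlen=...)` avoids manual trimming: once 25 entries are stored, every append silently discards the oldest one.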
And step four, inputting the fusion factor into a KF/UFIR fusion filter, carrying out state estimation on the acquired vehicle state information data by the filter to obtain the position and speed information of the vehicle at the next moment, and sending the estimation information to other vehicles.
For the KF algorithm, once the initial values of the system-related variables are input, the algorithm can iterate. The UFIR algorithm, however, has a limitation: before historical observation data of the optimal sliding-window length $N_{opt}$ described in step three have been accumulated, the iterative calculation cannot be carried out. Therefore, until enough observation data for $N_{opt}$ are available, the output of the KF filter is used as the output of the fusion filter.
The fusion factors obtained in step three represent the confidence of the fusion filter in the KF and the UFIR. The larger the value of the fusion factor a, the more accurate the noise statistical characteristics and the more the output relies on the result calculated by the KF; conversely, the smaller a, the larger the deviation of the noise statistical characteristics and the more the output relies on the UFIR filter, which is more robust to the noise. The formula is:

$$\hat{X}_k = a \hat{X}_k^{KF} + b \hat{X}_k^{UFIR} \tag{16}$$

where $\hat{X}_k$ is the final vehicle state estimation result of the fusion filter at time k; a and b are the fusion factors; $\hat{X}_k^{KF}$ is the KF estimation result at time k; and $\hat{X}_k^{UFIR}$ is the UFIR estimation result at time k.
By repeating the above steps, no matter whether accurate noise statistical characteristics are available as the noise keeps changing, the proportions of the KF and UFIR calculation results in the fusion-filter output can be adjusted adaptively by looking up the corresponding fusion factors, a more accurate vehicle motion state estimate is obtained, and the estimate is sent to the other vehicles at the traffic intersection.
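Step four can be sketched as a single fusion function; a minimal illustration combining formula (16) with the window-filling fallback described above (names are illustrative).

```python
def fuse(x_kf, x_ufir, a, b, window_full):
    """Adaptive fusion output: element-wise a*KF + b*UFIR with a + b = 1.

    Falls back to the KF estimate while the UFIR history window is still
    filling, as step four specifies.
    """
    if not window_full:                       # not enough data for UFIR yet
        return list(x_kf)
    return [a * xk + b * xu for xk, xu in zip(x_kf, x_ufir)]
```

With a = b = 0.5 the output is simply the midpoint of the two estimates; with the window not yet full, the KF estimate passes through unchanged.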
Claims (6)
1. The traffic intersection vehicle state estimation method based on the adaptive fusion filter is characterized by being realized through the following steps:
step one, constructing a vehicle kinematic model;
step two, respectively determining the parameter matrices related to the single KF and UFIR filters according to the basic parameters of the vehicle kinematic model;
step three, acquiring environmental information at the traffic intersection through a network and measuring devices, looking up the noise statistical characteristic information under the corresponding conditions, reading the fusion factor from the calculated variance, and simultaneously reading and storing in real time the vehicle historical position information corresponding to the optimal UFIR sliding-window length; the specific method is as follows:
the environmental factors influencing the camera observation result comprise the weather condition, the illumination intensity and brightness, the outdoor temperature, and the observation time, and the four factors are graded: the weather condition is graded according to the weather states issued by the meteorological department; the outdoor temperature is graded every 1 °C between the local historical minimum and maximum temperatures; the illumination intensity and brightness are graded on 10 equal levels between the measured extreme values; the observation time is graded every half hour; a single-variable experiment is performed for every grade of the four factors, and the statistical characteristics of the camera observation noise under the different conditions are measured and recorded; letting the four factors have $n_1$, $n_2$, $n_3$, and $n_4$ grades respectively, a total of $n = n_1 n_2 n_3 n_4$ noise statistics $MEA_x$ (x = 1, 2, …, n) can be obtained; with the support of big-data technology, the measurement is repeated many times while keeping a single variable, yielding relatively accurate data results;
for the process noise, the running vehicle sends its own lateral and longitudinal acceleration values to the estimation processing centre at the traffic intersection through DSRC (dedicated short-range communication) or LTE-V (long term evolution for vehicles) technology, and the influence of the observation noise on the KF (Kalman filter) estimation result is mainly considered;
the traffic intersection observation equipment acquires the weather condition, the illumination intensity and brightness, and the outdoor temperature at the intersection position through the network, and reads the corresponding observation-noise statistical characteristics by the observation time; a fuzzy-mean mode is adopted: the grading points on the two adjacent sides of each off-point environmental-factor measurement result are selected, so that a total of 1 × 2 × 2 × 2 = 8 results, $c_1, c_2, \ldots, c_8$, can be obtained, and their mean is taken as the final result:

$$MEA = \frac{1}{8}\sum_{i=1}^{8} MEA_{c_i} \tag{13}$$

where $MEA_{c_1}$ to $MEA_{c_8}$ are the 8 noise statistical characteristic results corresponding to the above situation;
finally, the variance of the 8 results $MEA_{c_1}, MEA_{c_2}, \ldots, MEA_{c_8}$ about their mean MEA is calculated, and the KF/UFIR fusion factors a and b are determined by table look-up, where a + b = 1:

$$\sigma^2 = \frac{1}{8}\sum_{i=1}^{8} (MEA_{c_i} - MEA)^2 \tag{14}$$

where $\sigma^2$ is the variance of the looked-up noise statistical characteristics, and $c_i$ (i = 1, 2, …, 8) denotes the 8 environmental-factor combinations described above;
further, the observation data of the running vehicle are obtained by processing the camera images; only the measured value at the current time is needed for the KF, but the UFIR must store historical observation data according to the current optimal sliding-window length $N_{opt}$:

$$Z_{m,k} = [Z_m^T \;\; Z_{m+1}^T \;\; \cdots \;\; Z_k^T]^T \tag{15}$$

where k is the current time and $Z_{m,k}$ is the historical observation data parameter matrix corresponding to $N_{opt}$;
and step four, inputting the fusion factor into a KF/UFIR fusion filter, carrying out state estimation on the acquired vehicle state information data by the filter to obtain the position and speed information of the vehicle at the next moment, and sending the estimation information to other vehicles.
2. The adaptive fusion filter-based traffic intersection vehicle state estimation method of claim 1, wherein: in step one, the vehicle kinematic model is:

$$X_{k+1} = A X_k + B w_k, \qquad Z_k = C X_k + v_k \tag{1}$$

in the formula, $X_k \in R^4$ is the system state variable at time k and represents the real motion state of the vehicle; $Z_k \in R^2$ is the observation variable at time k and represents the observed value of the vehicle motion state; $X_k$ and $Z_k$ are specifically defined as:

$$X_k = [x(k) \;\; y(k) \;\; v_x(k) \;\; v_y(k)]^T, \qquad Z_k = [x_z(k) \;\; y_z(k)]^T \tag{2}$$

where x(k) and y(k) respectively represent the lateral and longitudinal positions of the vehicle at time k; $x_z(k)$ and $y_z(k)$ respectively represent the observed values of the lateral and longitudinal positions; $v_x(k)$ and $v_y(k)$ are the lateral and longitudinal velocities of the vehicle at time k;

A, B, and C are respectively the state transition matrix, the process-noise coefficient matrix, and the observation transition matrix, with $A \in R^{4\times 4}$, $B \in R^{4\times 2}$, $C \in R^{2\times 4}$; with sampling period T, the parameter matrices A, B, C are specifically defined as:

$$A = \begin{bmatrix} 1 & 0 & T & 0 \\ 0 & 1 & 0 & T \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad B = \begin{bmatrix} T^2/2 & 0 \\ 0 & T^2/2 \\ T & 0 \\ 0 & T \end{bmatrix}, \quad C = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix} \tag{3}$$

$w_k \in R^2$ is the process noise with covariance matrix Q; in the vehicle motion model the vehicle acceleration is assumed to be a random variable and is taken as the process noise; $v_k \in R^2$ is the measurement noise with covariance matrix R; the specific definitions are:

$$w_k = [a_x(k) \;\; a_y(k)]^T \tag{4}$$

$$E(w_k w_k^T) = Q, \qquad E(v_k v_k^T) = R \tag{5}$$

where $a_x(k)$ and $a_y(k)$ respectively represent the acceleration values of the vehicle in the lateral and longitudinal directions;

the measurement-noise covariance matrix is obtained from the camera sensor manufacturer and actual measurement experiments, and the process-noise covariance matrix can be obtained through the corresponding calculation; the initial process and measurement noises are both assumed to have zero mean:

$$E(w_k) = 0, \qquad E(v_k) = 0 \tag{6}$$
3. The adaptive fusion filter-based traffic intersection vehicle state estimation method of claim 1, wherein: the KF filter parameter matrices are determined and calculated as follows:

$$\hat{X}_k^- = A \hat{X}_{k-1}^+$$
$$P_k^- = A P_{k-1}^+ A^T + B Q B^T$$
$$K_k = P_k^- C^T (C P_k^- C^T + R)^{-1}$$
$$\hat{X}_k^+ = \hat{X}_k^- + K_k (Z_k - C \hat{X}_k^-)$$
$$P_k^+ = (I - K_k C) P_k^- \tag{7}$$

where $\hat{X}_k^-$ and $\hat{X}_k^+$ are respectively the a priori prediction estimate and the a posteriori update estimate of the vehicle motion state by the Kalman filter at time k; $\hat{X}_{k-1}^+$ is the a posteriori update estimate at time k−1; $P_k^-$ and $P_k^+$ are respectively the a priori prediction error covariance matrix and the a posteriori update error covariance matrix at time k; $P_{k-1}^+$ is the a posteriori update error covariance matrix at time k−1; I is the identity matrix; and $K_k$ is the Kalman gain at time k, which minimizes the a posteriori update error covariance matrix.
4. The adaptive fusion filter-based traffic intersection vehicle state estimation method of claim 1, wherein: the UFIR filter adopts the iterative form of calculation, the specific calculation process being as follows:

the iterative initial value needs to be obtained by batch processing:

$$G_s = (H_{m,s}^T H_{m,s})^{-1}, \qquad \hat{X}_s = G_s H_{m,s}^T Z_{m,s} \tag{8}$$

assuming the current time is k, the optimal UFIR sliding-window length is N, and the batch initialization length is L; in formula (8), m is the starting point of the batch formula, m = k − N + 1; s is the end point, s = k − N + L; $G_s$ is the generalized noise power gain matrix at time s; $H_{m,s}$ is the iteration parameter matrix from time m to s; $\hat{X}_s$ is the batch a posteriori state estimate at time s, i.e. the initial value of the iteration; $Z_{m,s}$ is the historical observation information matrix from time m to s;

the iteration parameter matrix $H_{m,k}$ is expressed as:

$$H_{m,k} = h_{m,k} F_{m,k}, \qquad h_{m,k} = \mathrm{diag}(C_m, C_{m+1}, \ldots, C_k), \qquad F_{m,k} = \begin{bmatrix} (\mathcal{A}_{m+1}^{k})^{-1} \\ (\mathcal{A}_{m+2}^{k})^{-1} \\ \vdots \\ (\mathcal{A}_{k}^{k})^{-1} \\ I \end{bmatrix} \tag{9}$$

where $F_{m,k}$ is the state-transition accumulation matrix; $h_{m,k}$ is the coefficient accumulation matrix; $C_x$ is the observation transition matrix C at time x, x = m, m+1, …, k; $A_x$ is the system state transition matrix A at time x, x = m+1, m+2, …, k; and $\mathcal{A}_{g}^{r} = A_r A_{r-1} \cdots A_g$ is the accumulated state transition matrix from time g to r;

further, the iterative calculation formula, for r = s+1, s+2, …, k, is as follows:

$$G_r = [C^T C + (A G_{r-1} A^T)^{-1}]^{-1}$$
$$D_r = G_r C^T$$
$$\hat{X}_r = A \hat{X}_{r-1} + D_r (Z_r - C A \hat{X}_{r-1}) \tag{10}$$

where r starts from the initial point of the iteration, i.e. r = s+1 with $G_{r-1} = G_s$, and $D_r$ is the bias-correction gain matrix at time r;

the UFIR filter needs no noise statistical characteristic information throughout the calculation, but to guarantee the accuracy of the calculation result the optimal sliding-window length $N_{opt}$ must be determined, i.e. the number of historical data points the entire UFIR filter needs for state estimation.
5. The adaptive fusion filter-based traffic intersection vehicle state estimation method of claim 4, wherein: the optimal sliding window length N_opt is obtained by minimizing the residual covariance matrix via its derivative, according to the following formula:
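The claim's formula for N_opt appears in the patent drawings and is not reproduced here; as a data-driven stand-in, the window length can be selected by sweeping candidate values and minimizing the mean squared one-step measurement residual, a proxy for minimizing the trace of the residual covariance matrix. The `predict` callable is an assumed interface, not part of the patent:

```python
import numpy as np

def select_window_length(Z, predict, n_min, n_max):
    """Empirical N_opt selection: for each candidate window length N,
    accumulate one-step residuals over the data and keep the N with the
    smallest mean squared residual. `predict(window)` returns the
    predicted next measurement from a window of past measurements."""
    best_N, best_cost = n_min, float("inf")
    for N in range(n_min, n_max + 1):
        # start at n_max so every candidate is scored on the same samples
        residuals = [Z[k] - predict(Z[k - N:k]) for k in range(n_max, len(Z))]
        cost = float(np.mean(np.square(residuals)))
        if cost < best_cost:
            best_N, best_cost = N, cost
    return best_N
```

For example, with a moving-average predictor on a linear ramp the residual grows with N, so the sweep returns the smallest candidate.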
6. The adaptive fusion filter-based traffic intersection vehicle state estimation method of claim 1, wherein: in step 3, because the noise statistical characteristic information differs under different environmental factors, classification points are set for each environmental factor in a fuzzy-mean manner and the noise statistical characteristics are measured at those points; the two classification points adjacent to the current environmental-factor measurement result are selected, and the mean of their noise statistics is taken as the noise statistical characteristic information under the current environmental factor.
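The fuzzy-mean lookup described in this claim can be sketched as follows; the scalar-variance representation, the clamping at the extremes, and the function names are illustrative assumptions:

```python
import numpy as np

def noise_stats_for_factor(value, class_points, stats):
    """Fuzzy-mean lookup of noise statistics for one environmental factor
    (e.g. an illumination reading).

    class_points: sorted factor values at which noise statistics were
    calibrated by experiment; stats: the noise statistic recorded at each
    classification point. A measurement falling between two adjacent
    classification points yields the mean of their statistics."""
    class_points = np.asarray(class_points)
    if value <= class_points[0]:          # below the calibrated range
        return stats[0]
    if value >= class_points[-1]:         # above the calibrated range
        return stats[-1]
    i = np.searchsorted(class_points, value)
    if class_points[i] == value:          # exactly on a calibration point
        return stats[i]
    # between points i-1 and i: fuzzy mean of the two adjacent statistics
    return 0.5 * (stats[i - 1] + stats[i])
```

For instance, with calibration points at illumination levels 0, 10, 20 and measured variances 1.0, 2.0, 4.0, a reading of 15 yields the mean 3.0 of the two adjacent statistics.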
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010844156.3A CN112183196B (en) | 2020-08-20 | 2020-08-20 | Traffic intersection vehicle state estimation method based on adaptive fusion filter |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112183196A CN112183196A (en) | 2021-01-05 |
CN112183196B true CN112183196B (en) | 2021-08-27 |
Family
ID=73924145
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010844156.3A Active CN112183196B (en) | 2020-08-20 | 2020-08-20 | Traffic intersection vehicle state estimation method based on adaptive fusion filter |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112183196B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112859126B (en) * | 2021-01-19 | 2024-06-11 | 智驾汽车科技(宁波)有限公司 | GNSS positioning drift processing method based on UFIR filter |
CN113436442B (en) * | 2021-06-29 | 2022-04-08 | 西安电子科技大学 | Vehicle speed estimation method using multiple geomagnetic sensors |
CN113472318B (en) * | 2021-07-14 | 2024-02-06 | 青岛杰瑞自动化有限公司 | Hierarchical self-adaptive filtering method and system considering observation model errors |
CN116127406B (en) * | 2022-12-09 | 2023-10-17 | 聊城大学 | Data fusion method based on hybrid H-infinity self-adaptive Kalman filtering |
CN117909772A (en) * | 2024-01-22 | 2024-04-19 | 邢台医学高等专科学校 | Sport equipment motion data monitoring system |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110572139A (en) * | 2019-08-16 | 2019-12-13 | 上海智驾汽车科技有限公司 | fusion filtering implementation method and device for vehicle state estimation, storage medium and vehicle |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1863656B1 (en) * | 2005-03-18 | 2018-01-10 | Gatekeeper Systems, Inc. | Power generation systems and methods for wheeled objects |
CN107966142A (en) * | 2017-11-14 | 2018-04-27 | 济南大学 | A kind of adaptive UFIR data fusion methods of indoor pedestrian based on multiwindow |
CN110422175B (en) * | 2019-07-31 | 2021-04-02 | 上海智驾汽车科技有限公司 | Vehicle state estimation method and device, electronic device, storage medium, and vehicle |
CN110414173B (en) * | 2019-08-06 | 2023-04-18 | 上海智驾汽车科技有限公司 | Intersection vehicle state estimation method based on UFIR filter |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112183196B (en) | Traffic intersection vehicle state estimation method based on adaptive fusion filter | |
CN113313947B (en) | Road condition evaluation method of short-term traffic prediction graph convolution network | |
EP3206411A1 (en) | Arrangement and method for predicting road friction within a road network | |
CN110047291B (en) | Short-term traffic flow prediction method considering diffusion process | |
CN104616290A (en) | Target detection algorithm in combination of statistical matrix model and adaptive threshold | |
CN112731436B (en) | Multi-mode data fusion travelable region detection method based on point cloud up-sampling | |
CN117278643B (en) | Vehicle-mounted cloud calibration data transmission system based on cloud edge cooperation | |
CN111723929A (en) | Numerical prediction product correction method, device and system based on neural network | |
CN113033687A (en) | Target detection and identification method under rain and snow weather condition | |
WO2022242465A1 (en) | Method and apparatus for fusing data of multiple sensors | |
CN113538357B (en) | Shadow interference resistant road surface state online detection method | |
CN114758178A (en) | Hub real-time classification and air valve hole positioning method based on deep learning | |
CN117291443B (en) | Intelligent paying-off system based on multidimensional sensing technology | |
CN115116013A (en) | Online dense point cloud semantic segmentation system and method integrating time sequence features | |
CN110986946B (en) | Dynamic pose estimation method and device | |
Prasad | Adaptive traffic signal control system with cloud computing based online learning | |
CN110414173B (en) | Intersection vehicle state estimation method based on UFIR filter | |
CN114972429B (en) | Target tracking method and system for cloud edge cooperative self-adaptive reasoning path planning | |
CN114495494B (en) | Traffic situation assessment method based on traffic flow parameter prediction | |
CN109886126A (en) | A kind of region traffic density estimation method based on dynamic sampling mechanism and RBF neural | |
CN112994656A (en) | Distributed k-order filter design method for communication topology random switching | |
CN112241748A (en) | Data dimension reduction method and device based on multi-source information entropy difference | |
CN112419362A (en) | Moving target tracking method based on prior information feature learning | |
CN111583245B (en) | Industrial automation monitoring method for multi-feature coupling and target detection | |
CN116542374B (en) | Public transportation arrival time prediction method based on neural network and Kalman filtering |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||