CN115571156B - Front vehicle transverse and longitudinal motion state joint estimation method based on sensor fusion - Google Patents

Info

Publication number: CN115571156B (granted publication of application CN202211164025.6A; earlier publication CN115571156A)
Authority: CN (China)
Prior art keywords: front vehicle, state, vehicle, longitudinal
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Original language: Chinese (zh)
Inventors: 徐利伟, 董锋威, 殷国栋
Original and current assignee: Southeast University
Application filed by Southeast University; priority to CN202211164025.6A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00274 Planning or execution of driving tasks using trajectory prediction for other traffic participants, considering possible movement changes
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0018 Method for the design of a control system
    • B60W2050/0019 Control system elements or transfer functions
    • B60W2050/0028 Mathematical models, e.g. for simulation
    • B60W2050/0031 Mathematical model of the vehicle
    • B60W2050/0052 Filtering, filters (signal treatments, identification of variables or parameters, parameter estimation or state estimation)
    • B60W2552/30 Road curve radius
    • B60W2554/4041 Position (characteristics of dynamic objects)
    • B60W2556/65 Data transmitted between vehicles

Abstract

The invention discloses a front vehicle transverse and longitudinal motion state joint estimation method based on sensor fusion. It relates to the technical field of unmanned driving and intelligent control, and addresses the low accuracy of existing front vehicle transverse and longitudinal motion state estimation.

Description

Front vehicle transverse and longitudinal motion state joint estimation method based on sensor fusion
Technical Field
The application relates to the technical field of unmanned driving and intelligent control, in particular to an asynchronous sensor-fusion motion state estimation method based on heterogeneous communication modes, and more particularly to a front vehicle transverse and longitudinal motion state joint estimation method based on sensor fusion.
Background
Advanced driver-assistance technology for intelligent connected vehicles can effectively improve ride comfort and driving safety. Such technology generally needs to perceive the pose and motion state of surrounding vehicles in real time so that subsequent decision making, planning and control can be performed. However, the motion state of a surrounding vehicle is difficult to measure directly, and even when sensors such as radar and vision are equipped, their measurement accuracy varies with the environment. Estimation algorithms for the motion state of the front vehicle have appeared, but current algorithms still have the following problems:
1) They mainly estimate the lateral state of the front vehicle, while the longitudinal state is assumed unchanged or transmitted via V2V (vehicle-to-vehicle) communication; this ignores the coupling between the longitudinal and lateral motion of the vehicle system.
2) The vehicle state is estimated using only a single information source. Using only V2V information inevitably brings communication delay and packet loss that degrade estimation accuracy, and the low communication rate cannot guarantee real-time estimation; using only on-board sensors, the measurement accuracy is susceptible to environmental changes.
3) In multi-source fusion estimation, the problem that different information sources transmit at different rates, i.e., asynchronous information fusion, is not considered.
Disclosure of Invention
The application provides a front vehicle transverse and longitudinal motion state joint estimation method based on sensor fusion, which estimates the front vehicle transverse and longitudinal motion state by fusing two information sources, V2V and on-board sensors, thereby improving estimation accuracy and the adaptability of the estimation method to the environment.
The technical aim of the application is achieved through the following technical scheme:
a front vehicle transverse and longitudinal motion state joint estimation method based on sensor fusion comprises the following steps:
s1: acquiring front vehicle state quantity and parameter information through a V2V and a vehicle sensor;
s2: designing a front vehicle transverse and longitudinal state joint estimator based on the front vehicle state quantity and parameter information acquired by V2V;
s3: designing a front vehicle lateral state estimator based on the front vehicle state quantity and parameter information acquired by the vehicle sensor;
s4: and fusing the front vehicle transverse and longitudinal state joint estimator with the front vehicle lateral state estimator based on distributed fusion Kalman filtering, and carrying out joint estimation on the front vehicle transverse and longitudinal motion state after fusion.
The beneficial effects of this application lie in: 1) the method comprehensively considers both the longitudinal and lateral motion of the front vehicle; compared with the traditional approaches of assuming an unchanged longitudinal speed or relying on V2V transmission, it better matches real driving scenarios and overcomes the low V2V communication rate; 2) the method estimates the motion state of the front vehicle using both V2V information and on-board sensor information, which not only mitigates the loss of estimation accuracy caused by the low V2V communication rate but also improves the adaptability of the estimation method to the environment, while solving the asynchronous estimation problem arising from inconsistent sensor sampling rates.
Drawings
FIG. 1 is a schematic diagram of a host vehicle acquiring front vehicle information via V2V and a host vehicle sensor;
FIG. 2 is a time schematic diagram of asynchronous sampling rates of two information sources, V2V and an on-vehicle sensor;
FIG. 3 is a schematic diagram of a method for jointly estimating lateral and longitudinal motion states of a front vehicle based on sensor fusion according to the present application;
FIG. 4 is a schematic diagram of a three degree of freedom dynamics model used in the design process of each estimator of the present application;
FIG. 5 is a schematic diagram of a three-degree-of-freedom kinematic model of the front vehicle relative to the road, incorporating the Serret-Frenet equations.
Detailed Description
The technical scheme of the application will be described in detail below with reference to the accompanying drawings.
As shown in fig. 3, the method for jointly estimating the lateral and longitudinal motion states of the front vehicle based on sensor fusion described in the application includes:
s1: the front vehicle state quantity and parameter information are acquired through the V2V and the vehicle sensor, and fig. 2 is a time schematic diagram of asynchronous sampling rates of two information sources, namely the V2V and the vehicle sensor.
Specifically, as shown in fig. 1, the front vehicle state quantity acquired based on V2V includes a lateral acceleration, a longitudinal acceleration, and a front wheel rotation angle of the front vehicle, and the parameter information includes a mass and a wheelbase of the front vehicle; the position of the front vehicle under the geodetic coordinate system, the pose relative to the road and the road curvature are acquired by using the vehicle sensor.
S2: and designing a front vehicle transverse and longitudinal state joint estimator based on the V2V acquired front vehicle state quantity and parameter information. Comprising the following steps: the method comprises the steps of constructing a first front vehicle three-degree-of-freedom dynamics model based on front vehicle state quantity and parameter information acquired through V2V communication, and designing a front vehicle transverse and longitudinal state joint estimator through the first front vehicle three-degree-of-freedom dynamics model, wherein the specific process is as follows:
1) Front three-degree-of-freedom dynamics model: with the three degree of freedom dynamics model taking into account the longitudinal, transverse and yaw directions of the vehicle, as shown in fig. 4, the first front three degree of freedom dynamics model is expressed as:
wherein r represents the yaw rate of the preceding vehicle; v x Representing the longitudinal speed of the front vehicle; a, a x Indicating the acceleration of the front vehicle; v y Indicating the lateral speed of the front vehicle; a, a y Representing the lateral acceleration of the front vehicle; m represents the mass of the front vehicle; i z Representing the moment of inertia of the front truck around the Z axis; a represents the distance from the center of mass of the front vehicle to the front axle; b represents the distance from the center of mass of the front vehicle to the rear axle; k (k) f Representing the cornering stiffness equivalent to the front axle of the front vehicle; k (k) r Representing the cornering stiffness equivalent to the rear axle of the front vehicle; delta represents the front wheel rotation angle of the front vehicle.
Let x 1,k =[v x ,v y ,r] T ,z 1,k =[a y ] T ,u k =[a x ,δ] T The discretization equation of the three-degree-of-freedom dynamics model of the front vehicle is expressed as follows:
wherein w is k And v k Respectively representing the process noise and the measurement noise of the model, and conforming to Gaussian distribution; q represents the covariance matrix of the process noise; r represents the covariance matrix of the measurement noise; f (f) 1 (-) represents a state transfer function; h is a 1 (-) represents the output function.
In order to improve the estimation accuracy, the input amount a is set during the V2V sampling interval x And delta is estimated by adopting a constant change rate method.
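A minimal sketch of this constant-rate-of-change extrapolation between V2V samples (the function name and numeric values are illustrative, not from the patent):

```python
def extrapolate_input(u_prev, u_curr, dt_v2v, t_since):
    """Extrapolate a V2V input (e.g. a_x or delta) between samples by
    holding its rate of change constant.

    u_prev, u_curr: the two most recent V2V samples of the input
    dt_v2v:         V2V sampling interval (s)
    t_since:        time elapsed since u_curr was received (s)
    """
    rate = (u_curr - u_prev) / dt_v2v   # assumed-constant rate of change
    return u_curr + rate * t_since

# example: a_x rose from 0.4 to 0.6 m/s^2 over a 100 ms V2V interval;
# 40 ms after the last sample the extrapolated value is 0.6 + 2.0*0.04 = 0.68
ax_hat = extrapolate_input(0.4, 0.6, 0.10, 0.04)
```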
Linearizing equation (2), the linearized equation is expressed as:

x_{1,k} = A_{1,k-1} x_{1,k-1} + B_{1,k-1} u_{k-1} + w_{k-1}
z_{1,k} = H_{1,k} x_{1,k} + Γ_{1,k} u_k + v_k    (3)

where A_{1,k-1} denotes the Jacobian matrix of f_1 with respect to x_1; B_{1,k-1} the Jacobian matrix of f_1 with respect to u; H_{1,k} the Jacobian matrix of h_1 with respect to x_1; and Γ_{1,k} the Jacobian matrix of h_1 with respect to u.
2) Design of the V2V-based front vehicle transverse and longitudinal state joint estimator. Since the longitudinal acceleration, lateral acceleration and front wheel steering angle of the front vehicle can be transmitted to the host vehicle via V2V, the joint estimator of longitudinal speed, lateral speed and yaw rate can be designed from the first front vehicle three-degree-of-freedom dynamics model.
To facilitate subsequent information fusion, an extended Kalman filter in information-filter form is adopted to design the front vehicle transverse and longitudinal state joint estimator; the estimation steps comprise a time update and a measurement update.
The time update (the equation images are rendered here in standard EKF notation consistent with the symbols below) includes:

\hat{x}_{1,k/k-1} = f_1(\hat{x}_{1,k-1}, u_{k-1})
P_{1,k/k-1} = A_{1,k-1} P_{1,k-1} A_{1,k-1}^T + Q    (4)

The measurement update, in information form, includes:

P_{1,k}^{-1} = P_{1,k/k-1}^{-1} + H_{1,k}^T R^{-1} H_{1,k}
\hat{x}_{1,k} = \hat{x}_{1,k/k-1} + P_{1,k} H_{1,k}^T R^{-1} \left(z_{1,k} - h_1(\hat{x}_{1,k/k-1}, u_k)\right)    (5)

where \hat{x}_{1,k/k-1} denotes the a priori estimate at time k; P_{1,k/k-1} the a priori error covariance matrix at time k; A_{1,k-1} the Jacobian matrix of f_1 with respect to the system state vector x_1; \hat{x}_{1,k} the state estimate at time k; P_{1,k-1} the error covariance matrix at time k-1; and H_{1,k} the Jacobian matrix of h_1 with respect to the system state vector x_1.
S3: the front vehicle lateral state estimator is designed based on the front vehicle state quantity and parameter information acquired by the vehicle sensor.
Specifically, step S3 includes:
s31: and constructing a front vehicle longitudinal motion model based on the front vehicle state quantity and parameter information acquired by the vehicle sensor, and designing a front vehicle longitudinal speed estimator according to the front vehicle longitudinal motion model.
1) Building a longitudinal movement model of the front vehicle: the front vehicle longitudinal motion model is a discretized CTRA model, and the CTRA model can describe the change of the motion state of the front vehicle when the front vehicle changes lanes. The state of the CTRA model is expressed as: respectively representing a longitudinal coordinate value and a transverse coordinate value of the front vehicle in the geodetic coordinates and a direction angle; r, v x And a x The yaw rate, the longitudinal speed, and the longitudinal acceleration of the preceding vehicle are respectively indicated.
Let the sampling time be T d And (3) acquiring the pose of the front vehicle by using a vehicle-mounted detection module such as LiDAR and a vision system, wherein the discretized CTRA model is expressed as follows:
wherein, a k the longitudinal acceleration of the front vehicle at the moment k is represented; v k The longitudinal speed of the front vehicle at the moment k is represented; r is (r) k The yaw rate of the preceding vehicle at time k is represented; x is x k A preceding vehicle state vector representing the time k; f (f) CTRA Representing a state transfer function of the CTRA model; h is a CTRA Representing the output function of the CTRA model.
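Under the standard CTRA assumptions (yaw rate and longitudinal acceleration held constant over the sampling interval), the state propagation has a textbook closed form. The patent's equation images are not reproduced in this extraction, so the sketch below uses that textbook form:

```python
import math

def ctra_step(state, T):
    """Propagate a CTRA state one sampling interval T.

    state = (x, y, theta, r, v, a): position, heading, yaw rate,
    longitudinal speed, longitudinal acceleration (r and a held constant).
    """
    x, y, th, r, v, a = state
    if abs(r) < 1e-9:                      # degenerate case: straight-line CA motion
        d = v * T + 0.5 * a * T * T
        return (x + d * math.cos(th), y + d * math.sin(th), th, r, v + a * T, a)
    th2 = th + r * T
    x2 = x + ((v * r + a * r * T) * math.sin(th2) + a * math.cos(th2)
              - v * r * math.sin(th) - a * math.cos(th)) / (r * r)
    y2 = y + (-(v * r + a * r * T) * math.cos(th2) + a * math.sin(th2)
              + v * r * math.cos(th) - a * math.sin(th)) / (r * r)
    return (x2, y2, th2, r, v + a * T, a)

# straight-ahead sanity check: v = 10 m/s, a = 0, r = 0, T = 0.1 s -> 1 m forward
s = ctra_step((0.0, 0.0, 0.0, 0.0, 10.0, 0.0), 0.1)
```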
2) Design of the front vehicle longitudinal speed estimator based on on-board sensors. The longitudinal speed of the front vehicle is estimated from its pose, measured in real time by the on-board radar and vision system, using the cubature Kalman filtering principle.
In the joint estimation architecture based on on-board sensors, the longitudinal speed of the front vehicle must be estimated first, because the estimation of the front vehicle lateral motion state also requires it. Since cubature Kalman filtering is well suited to state estimation of high-dimensional nonlinear dynamic systems and has high estimation accuracy, it is used here to estimate the longitudinal speed of the front vehicle. The specific steps, written in standard cubature-Kalman-filter notation, are as follows:
(1) initializing: the estimated value of zero time isError covariance matrixThe volume point is calculated according to equation (7), expressed as:
where n is the dimension of the state and c is the number of volume points.
(2) And (5) updating time: decomposing the error covariance matrix at the previous moment by adopting a singular value decomposition method, and then sequentially calculating and transmitting volume points, wherein the volume points are expressed as follows:
wherein U is k-1 、S k-1 Andthe matrix is obtained by singular value decomposition of the error covariance matrix at the moment k-1; x-shaped articles i,k-1 Representing an ith volume point; />A state estimation value representing the time k-1; />Representing the volume point after the state equation iteration.
State prediction value updated for elapsed timeAnd error covariance matrix P k/k-1 Estimation is performed, expressed as:
(3) measurement update: decomposing the prior error covariance matrix at the current moment by adopting a singular value decomposition method, and then sequentially calculating and transmitting volume points, wherein the volume points are expressed as:
predicted value for measured valueEstimation is carried out, and the innovation variance P is calculated zz,k/k-1 And cross covariance P xz,k/k-1 Expressed as:
combining equations (7) to (11) and filter gain K k Completion of the state quantityAnd error covariance matrix P k Is expressed as:
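The patent applies the cubature filter to the six-state CTRA model with SVD-based covariance factorization. As a minimal illustration of the cubature-point mechanics only, here is the one-dimensional case, where the 2n = 2 cubature points reduce to x̂ ± √P and no SVD is needed (the system functions are invented for illustration):

```python
import math

def ckf_step_1d(x_est, P, z, Q, R, f, h):
    """One scalar cubature-Kalman-filter cycle (n = 1, so c = 2 cubature points).

    f: state-transition function, h: output function (both may be nonlinear).
    """
    # time update: propagate the cubature points x_est +/- sqrt(P) through f
    s = math.sqrt(P)
    pts = [f(x_est + s), f(x_est - s)]
    x_pred = sum(pts) / 2.0
    P_pred = sum((p - x_pred) ** 2 for p in pts) / 2.0 + Q
    # measurement update: fresh points from the prior covariance, pushed through h
    s = math.sqrt(P_pred)
    xs = [x_pred + s, x_pred - s]
    zs = [h(p) for p in xs]
    z_pred = sum(zs) / 2.0
    P_zz = sum((zz - z_pred) ** 2 for zz in zs) / 2.0 + R      # innovation variance
    P_xz = sum((xx - x_pred) * (zz - z_pred)
               for xx, zz in zip(xs, zs)) / 2.0                # cross covariance
    K = P_xz / P_zz                                            # filter gain
    return x_pred + K * (z - z_pred), P_pred - K * P_zz * K

# illustrative: estimate a slowly varying speed from noisy direct measurements
f = lambda x: x              # random-walk speed model
h = lambda x: x
x, P = 0.0, 4.0
for z in (9.8, 10.1, 10.0):
    x, P = ckf_step_1d(x, P, z, Q=0.01, R=0.25, f=f, h=h)
```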
s32: and combining the first front vehicle three-degree-of-freedom dynamics model with a Serset-Frenet equation to obtain a second front vehicle three-degree-of-freedom dynamics model.
Specifically, in order to compensate for the disadvantage of the reduction of estimation accuracy due to the low V2V communication rate, the lateral state of the preceding vehicle is estimated by using sensors such as vehicle vision and radar. Thus, the Serset-Frenet equation is used to describe the pose change of the lead vehicle on the road. As shown in fig. 5, when the front vehicle runs on the standardized road, there is a lateral displacement deviation e from the road center line, and meanwhile, there is a heading angle deviation ψ from the tangential direction of the road center line, and the longitudinal speed, the lateral speed and the road curvature of the front vehicle are considered to obtain the following serset-Frenet equation of the vehicle and the road, expressed as:
then a discretized second front vehicle three-degree-of-freedom dynamics model is obtained, which is expressed as follows:
wherein x is 2,k =[v x ,v y ,r,e,ψ,C] T ,z 2,k =[v x ,e,ψ,C] T ,u k =[a x ,δ] T . V in the measured value x From the front vehicle longitudinal speed estimator based on the vehicle sensor, the remaining measurements are from the vehicle's vision sensor and radar.
The equation after linearization of the second front vehicle three-degree-of-freedom dynamics model is expressed as follows:
wherein e represents the lateral position deviation of the front vehicle relative to the road, ψ represents the included angle of the heading of the front vehicle relative to the tangential direction of the road center line, and C represents the road curvature. f (f) 2 (-) represents a state transfer function; h is a 2 (-) represents the output function. A is that 2,k-1 Represents f 2 Relative to x 2 Jacobian matrix of (a); b (B) 2,k-1 Represents f 2 Jacobian matrix relative to u; h 2,k Represents h 2 Relative to x 2 Jacobian matrix of (a); Γ -shaped structure 2,k Represents h 2 Jacobian matrix relative to u.
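A minimal Euler-discretized sketch of this road-relative propagation, in the common small-angle Serret-Frenet form above (the exact published equations are image-only and may differ in detail):

```python
import math

def serret_frenet_step(e, psi, vx, vy, r, C, T):
    """Euler-discretized road-relative pose propagation.

    e:   lateral offset from the lane centerline (m)
    psi: heading error w.r.t. the centerline tangent (rad)
    vx, vy, r: front vehicle longitudinal speed, lateral speed, yaw rate
    C:   road curvature (1/m); T: sampling time (s)
    """
    e_dot = vx * math.sin(psi) + vy * math.cos(psi)   # drift relative to centerline
    psi_dot = r - C * vx                              # yaw rate minus road turning rate
    return e + e_dot * T, psi + psi_dot * T

# on a straight road (C = 0), with psi = 0, vy = 0 and r = 0,
# the lateral offset stays constant
e2, psi2 = serret_frenet_step(0.3, 0.0, 15.0, 0.0, 0.0, 0.0, 0.01)
```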
To improve estimation accuracy, the inputs a_x and δ are again extrapolated during the V2V sampling interval by assuming a constant rate of change.
S33: and on the basis of the front vehicle longitudinal speed estimator, designing the front vehicle lateral state estimator according to the second front vehicle three-degree-of-freedom dynamics model degree.
Specifically, the estimation steps of the front vehicle lateral state estimator are the same as those of the front vehicle lateral and longitudinal state joint estimator based on V2V, and will not be described here.
S4: and fusing the front vehicle transverse and longitudinal state joint estimator with the front vehicle lateral state estimator based on distributed fusion Kalman filtering, and carrying out joint estimation on the front vehicle transverse and longitudinal motion state after fusion.
Specifically, the two estimation approaches above, one using V2V information and one using on-board sensor information, each have advantages and disadvantages. For the former, V2V information accurately reflects the running state of the vehicle, but the V2V communication rate is low and estimation accuracy suffers accordingly. For the latter, the sampling rate of the on-board sensors is faster than the V2V communication rate, but estimating the front vehicle state from kinematic and geometric relations places high accuracy demands on the machine vision algorithms, and sensor accuracy is easily affected by the external environment. To combine the advantages of the two estimation methods and improve estimation accuracy, the asynchronous fusion problem must be considered.
In addition, conventional fusion algorithms assume that the model states of the sub-estimators are identical, whereas the two sub-estimators established in this application are heterogeneous models in which only part of the states coincide, so the fusion problem of heterogeneous models must also be solved.
First, the front vehicle transverse and longitudinal state joint estimator and the front vehicle lateral state estimator are fused through the state transformation matrices Φ_j, expressed as:

x_{j,k} = Φ_j x_k, j = 1, 2    (16)

where Φ_1 and Φ_2 are both state transformation matrices and I_{3×3} denotes the 3×3 identity matrix; with the fusion center state ordered as x_2, a consistent choice is Φ_1 = [I_{3×3}  0_{3×3}] and Φ_2 = I_{6×6}. That is: according to the distributed fusion Kalman filtering principle, the model adopted by the fusion center contains all of the states, and the fusion center state x is identical to x_2. The fused joint estimation step comprises a time update and a measurement update, as follows:
the time update is expressed as:
the measurement update is expressed as:
wherein A is k-1 =A 2,k-1 ,B k-1 =B 2,k-1Representing a priori estimated state calculated by the fusion center at the moment k;representing the state of the fusion center estimation at the moment k-1; p (P) k/k-1 Representing a priori error covariance matrix calculated by the fusion center at the moment k; p (P) k Representing an error covariance matrix calculated by the fusion center at the moment k-1; when j=1, each matrix is a state transfer function f 1 (-) corresponding respective matrices; when j=2, each matrix is a state transfer function f 2 (-) corresponding respective matrix.
By combining equations (16) and (18), the estimation results of the sub-estimators are fused to produce the final estimate.
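The information-form fusion of step S4 can be illustrated in the scalar case, i.e. for one shared state such as v_x. This is a sketch of the standard distributed-fusion rule (each sub-estimator contributes the new information it gained this cycle), not the patent's exact matrices:

```python
def fuse_information(prior, subs):
    """Scalar information-form fusion of sub-estimator results for one shared state.

    prior: (x_prior, P_prior) from the fusion center's own time update
    subs:  list of (x_post, P_post, x_sub_prior, P_sub_prior) per sub-estimator
    """
    x0, P0 = prior
    Y = 1.0 / P0                 # information "matrix" (scalar)
    y = x0 / P0                  # information vector (scalar)
    for x_post, P_post, x_pri, P_pri in subs:
        # add only the information gained by this sub-filter's measurement update
        Y += 1.0 / P_post - 1.0 / P_pri
        y += x_post / P_post - x_pri / P_pri
    P = 1.0 / Y
    return P * y, P

# two sub-estimators that each tightened their estimate of v_x this cycle
x_f, P_f = fuse_information(
    prior=(15.0, 1.0),
    subs=[(15.2, 0.5, 15.0, 1.0),    # V2V-based joint estimator
          (14.9, 0.4, 15.0, 1.0)],   # on-board-sensor-based estimator
)
```

Because information contributions add, the fused variance P_f is smaller than any single source's, which is the point of fusing the two heterogeneous estimators.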
The foregoing is an exemplary embodiment of the present application, the scope of which is defined by the claims and their equivalents.

Claims (9)

1. The front vehicle transverse and longitudinal motion state joint estimation method based on sensor fusion is characterized by comprising the following steps:
S1: acquiring front vehicle state quantities and parameter information through V2V and on-board sensors;
S2: designing a front vehicle transverse and longitudinal state joint estimator based on the front vehicle state quantities and parameter information acquired via V2V;
S3: designing a front vehicle lateral state estimator based on the front vehicle state quantities and parameter information acquired by the on-board sensors;
S4: fusing the front vehicle transverse and longitudinal state joint estimator with the front vehicle lateral state estimator based on distributed fusion Kalman filtering, and jointly estimating the front vehicle transverse and longitudinal motion state after fusion;
wherein in S4, the front vehicle transverse and longitudinal state joint estimator and the front vehicle lateral state estimator are fused through state transformation matrices Φ_j, expressed as:

x_{j,k} = Φ_j x_k, j = 1, 2    (16)

wherein Φ_1 and Φ_2 are both state transformation matrices, and I_{3×3} denotes the 3×3 identity matrix;
the fusion center state x is the same as x_2, and the fused joint estimation step comprises a time update and a measurement update, the time update being expressed as:

\hat{x}_{k/k-1} = f_2(\hat{x}_{k-1}, u_{k-1}), P_{k/k-1} = A_{k-1} P_{k-1} A_{k-1}^T + Q_{k-1}    (17)

the measurement update being expressed as:

P_k^{-1} = P_{k/k-1}^{-1} + \sum_{j=1}^{2} Φ_j^T (P_{j,k}^{-1} - P_{j,k/k-1}^{-1}) Φ_j,
P_k^{-1} \hat{x}_k = P_{k/k-1}^{-1} \hat{x}_{k/k-1} + \sum_{j=1}^{2} Φ_j^T (P_{j,k}^{-1} \hat{x}_{j,k} - P_{j,k/k-1}^{-1} \hat{x}_{j,k/k-1})    (18)

wherein A_{k-1} = A_{2,k-1}, B_{k-1} = B_{2,k-1}; A_{2,k-1} denotes the Jacobian matrix of the state transition function f_2 with respect to x_{2,k-1}; B_{2,k-1} denotes the Jacobian matrix of f_2 with respect to u_{k-1}; \hat{x}_{k/k-1} denotes the a priori state calculated by the fusion center at time k; \hat{x}_{k-1} denotes the state estimated by the fusion center at time k-1; P_{k/k-1} denotes the a priori error covariance matrix calculated by the fusion center at time k; P_k denotes the error covariance matrix calculated by the fusion center at time k; Q_{k-1} denotes the covariance matrix of the process noise at time k-1; when j = 1, the matrices are those corresponding to the state transition function f_1(·); when j = 2, the matrices are those corresponding to the state transition function f_2(·); u_{k-1} denotes the input at time k-1; H_{j,k} denotes the Jacobian matrix of h_j with respect to x_{j,k}; h_j(·) denotes the output function; R_{j,k} denotes the covariance matrix of the measurement noise corresponding to each front vehicle three-degree-of-freedom dynamics model at time k; \hat{x}_{j,k} denotes the state estimated by the sub-estimator corresponding to each front vehicle three-degree-of-freedom dynamics model at time k; P_{j,k} denotes the error covariance matrix corresponding to each front vehicle three-degree-of-freedom dynamics model at time k; z_{j,k} denotes the output corresponding to each front vehicle three-degree-of-freedom dynamics model at time k.
2. The method of claim 1, wherein the front vehicle state quantity acquired based on V2V includes a lateral acceleration, a longitudinal acceleration, and a front wheel rotation angle of the front vehicle, and the parameter information includes a mass and a wheelbase of the front vehicle; the position of the front vehicle under the geodetic coordinate system, the pose relative to the road and the road curvature are acquired by using the vehicle sensor.
3. The method according to claim 2, wherein the step S2 includes: constructing a first front vehicle three-degree-of-freedom dynamics model based on front vehicle state quantity and parameter information acquired by V2V communication, and designing a front vehicle transverse and longitudinal state joint estimator through the first front vehicle three-degree-of-freedom dynamics model;
the first front vehicle three-degree-of-freedom dynamics model is expressed as:
wherein r represents the yaw rate of the front vehicle; v_x represents the longitudinal speed of the front vehicle; a_x represents the longitudinal acceleration of the front vehicle; v_y represents the lateral speed of the front vehicle; a_y represents the lateral acceleration of the front vehicle; m represents the mass of the front vehicle; I_z represents the moment of inertia of the front vehicle about the Z axis; a represents the distance from the center of mass of the front vehicle to the front axle; b represents the distance from the center of mass of the front vehicle to the rear axle; k_f represents the equivalent cornering stiffness of the front axle of the front vehicle; k_r represents the equivalent cornering stiffness of the rear axle of the front vehicle; δ represents the front-wheel steering angle of the front vehicle;
let x_{1,k} = [v_x, v_y, r]^T, z_{1,k} = [a_y]^T, u_k = [a_x, δ]^T; the discretized equation of the first front vehicle three-degree-of-freedom dynamics model is expressed as:
wherein w_k and v_k represent the process noise and the measurement noise of the model, respectively, both following Gaussian distributions; Q represents the covariance matrix of the process noise; R represents the covariance matrix of the measurement noise; f_1(·) represents the state transfer function; h_1(·) represents the output function;
linearizing equation (2), the linearized equation is expressed as:
wherein A_{1,k-1} represents the Jacobian matrix of f_1 with respect to x_{1,k-1}; B_{1,k-1} represents the Jacobian matrix of f_1 with respect to u_{k-1}; H_{1,k} represents the Jacobian matrix of h_1 with respect to x_{1,k}; Γ_{1,k} represents the Jacobian matrix of h_1 with respect to u_k.
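When the analytic Jacobians A_{1,k-1}, B_{1,k-1} and H_{1,k} of the three-degree-of-freedom model are cumbersome to derive by hand, they can be approximated numerically. The helper below is a generic central-difference sketch (not part of the claims; the name is an assumption):

```python
import numpy as np

def numeric_jacobian(f, x, eps=1e-6):
    """Central-difference Jacobian of f with respect to x — one generic
    way to obtain matrices such as A_{1,k-1} or H_{1,k} by perturbing
    each state component in turn."""
    n = len(x)
    fx = np.asarray(f(x))
    J = np.zeros((len(fx), n))
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        J[:, i] = (np.asarray(f(x + dx)) - np.asarray(f(x - dx))) / (2 * eps)
    return J
```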
4. The method according to claim 3, wherein, based on A_1 and H_1, the front vehicle transverse and longitudinal state joint estimator is designed as an extended Kalman filter in information-filtering form; the estimation steps of the front vehicle transverse and longitudinal state joint estimator comprise a time update and a measurement update;
the time update includes:
the measurement update includes:
wherein x_{1,k/k-1} represents the a priori estimate at time k; P_{1,k/k-1} represents the a priori error covariance matrix at time k; A_{1,k-1} represents the Jacobian matrix of f_1 with respect to the system state vector x_{1,k-1}; x_{1,k} represents the state estimate at time k; P_{1,k-1} represents the error covariance matrix at time k-1; H_{1,k} represents the Jacobian matrix of h_1 with respect to the system state vector x_{1,k}.
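The "information filtering mode" of claim 4 can be illustrated with a minimal measurement update in inverse-covariance form. The function below is a generic sketch (its name and structure are assumptions) that is algebraically equivalent to the standard Kalman-gain update via the matrix inversion lemma:

```python
import numpy as np

def ekf_info_update(x_prior, P_prior, z, h, H, R):
    """EKF measurement update in information (inverse-covariance) form:
    fusing independent measurements becomes a simple sum of
    information matrices H^T R^-1 H."""
    Ri = np.linalg.inv(R)
    Y = np.linalg.inv(P_prior) + H.T @ Ri @ H   # updated information matrix
    P = np.linalg.inv(Y)                        # posterior covariance
    x = x_prior + P @ H.T @ Ri @ (z - h(x_prior))
    return x, P
```

A quick way to validate such an update is to check it against the standard gain form K = P_{k/k-1} H^T (H P_{k/k-1} H^T + R)^{-1} on random data; the two must agree to numerical precision.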
5. The method of claim 4, wherein the step S3 includes:
constructing a front vehicle longitudinal motion model based on front vehicle state quantity and parameter information acquired by a vehicle sensor, and designing a front vehicle longitudinal speed estimator according to the front vehicle longitudinal motion model;
combining the first front vehicle three-degree-of-freedom dynamics model with the Serret-Frenet equation to obtain a second front vehicle three-degree-of-freedom dynamics model;
and on the basis of the front vehicle longitudinal speed estimator, designing the front vehicle lateral state estimator according to the second front vehicle three-degree-of-freedom dynamics model.
6. The method according to claim 5, wherein the front vehicle longitudinal motion model is a discretized CTRA model, the discretized CTRA model being expressed as:
wherein T_d represents the sampling time; X, Y and θ represent the longitudinal coordinate value, the lateral coordinate value and the direction angle of the front vehicle in geodetic coordinates, respectively; a_k represents the longitudinal acceleration of the front vehicle at time k; v_k represents the longitudinal speed of the front vehicle at time k; r_k represents the yaw rate of the front vehicle at time k; x_k represents the front vehicle state vector at time k; f_CTRA represents the state transfer function of the CTRA model; h_CTRA represents the output function of the CTRA model.
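One discretized CTRA propagation step can be sketched as follows. The state ordering [X, Y, θ, v, r, a] and the exact-integration formulas below are the standard CTRA ones, used here as an assumed illustration rather than quoted from the patent:

```python
import math

def ctra_step(x, T):
    """One step of the CTRA (constant turn rate and acceleration) model.
    State: [X, Y, theta, v, r, a] — position, direction angle,
    longitudinal speed, yaw rate, longitudinal acceleration.
    Uses the exact integral over the sampling time T when r != 0."""
    X, Y, th, v, r, a = x
    if abs(r) > 1e-6:
        th1 = th + r * T
        X1 = X + ((v * r + a * r * T) * math.sin(th1) + a * math.cos(th1)
                  - v * r * math.sin(th) - a * math.cos(th)) / r ** 2
        Y1 = Y + ((-v * r - a * r * T) * math.cos(th1) + a * math.sin(th1)
                  + v * r * math.cos(th) - a * math.sin(th)) / r ** 2
    else:  # straight-line limit as r -> 0
        th1 = th
        X1 = X + (v * T + 0.5 * a * T * T) * math.cos(th)
        Y1 = Y + (v * T + 0.5 * a * T * T) * math.sin(th)
    return [X1, Y1, th1, v + a * T, r, a]
```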
7. The method according to claim 6, wherein the front vehicle longitudinal speed estimator is designed based on the front vehicle longitudinal motion model and estimates the longitudinal speed of the front vehicle by cubature Kalman filtering, the estimation steps comprising:
1) Initialization: the estimate at time zero is x_0 with error covariance matrix P_0; the volume points are calculated according to equation (7), expressed as:
wherein n is the dimension of the state and c is the number of volume points;
2) Time update: the error covariance matrix at the previous time is decomposed by singular value decomposition, and the volume points are then calculated and propagated in turn, expressed as:
wherein U_{k-1}, S_{k-1} and V_{k-1} are the matrices obtained by singular value decomposition of the error covariance matrix at time k-1; χ_{i,k-1} represents the i-th volume point; x_{k-1} represents the state estimate at time k-1; χ*_{i,k/k-1} represents the volume point after iteration through the state equation;
the time-updated state prediction x_{k/k-1} and error covariance matrix P_{k/k-1} are then estimated, expressed as:
3) Measurement update: the a priori error covariance matrix at the current time is decomposed by singular value decomposition, and the volume points are then calculated and propagated in turn, expressed as:
the predicted measurement z_{k/k-1} is estimated, and the innovation variance P_{zz,k/k-1} and the cross covariance P_{xz,k/k-1} are calculated, expressed as:
combining equations (7) to (11) and the filter gain K_k, the estimation of the state quantity x_k and the error covariance matrix P_k is completed, expressed as:
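The SVD-based volume-point generation and time update of claim 7 (steps 1 and 2 above) can be sketched in numpy. Function names are assumptions; for a linear state transfer function the cubature prediction reproduces the exact Kalman prediction, which makes it easy to test:

```python
import numpy as np

def cubature_points(x, P):
    """Generate 2n volume (cubature) points using an SVD square root of
    the covariance; the claim uses SVD rather than a Cholesky factor,
    which is more robust when P is near-singular."""
    n = len(x)
    U, s, _ = np.linalg.svd(P)
    S = U @ np.diag(np.sqrt(s))                   # S @ S.T == P
    xi = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])
    return x[:, None] + S @ xi                    # shape (n, 2n)

def ckf_predict(x, P, f, Q):
    """CKF time update: push every volume point through the state
    transfer function f, then recover the prior mean and covariance."""
    pts = cubature_points(x, P)
    prop = np.column_stack([f(pts[:, i]) for i in range(pts.shape[1])])
    x_prior = prop.mean(axis=1)
    P_prior = prop @ prop.T / prop.shape[1] - np.outer(x_prior, x_prior) + Q
    return x_prior, P_prior
```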
8. The method according to claim 7, wherein the construction of the second front vehicle three-degree-of-freedom dynamics model comprises:
the Serret-Frenet equation is expressed as:
according to the Serret-Frenet equation and the first front vehicle three-degree-of-freedom dynamics model, the second front vehicle three-degree-of-freedom dynamics model is obtained by discretization and is expressed as:
the equation after linearization of the second front vehicle three-degree-of-freedom dynamics model is expressed as follows:
wherein x_{2,k} = [v_x, v_y, r, e, ψ, C]^T, z_{2,k} = [v_x, e, ψ, C]^T, u_k = [a_x, δ]^T; e represents the lateral position deviation of the front vehicle relative to the road, ψ represents the angle between the heading of the front vehicle and the road direction, and C represents the road curvature; v_x in the measurement vector is provided by the front vehicle longitudinal speed estimator based on the on-board sensors, and the remaining measurements come from the vehicle's vision sensor and radar;
f_2(·) represents the state transfer function; h_2(·) represents the output function;
A_{2,k-1} represents the Jacobian matrix of f_2 with respect to x_{2,k-1}; B_{2,k-1} represents the Jacobian matrix of f_2 with respect to u_{k-1}; H_{2,k} represents the Jacobian matrix of h_2 with respect to x_{2,k}; Γ_{2,k} represents the Jacobian matrix of h_2 with respect to u_k.
9. The method according to claim 1, wherein during a V2V sampling interval the inputs a_x and δ of each estimator are estimated using a constant change rate method, wherein a_x represents the longitudinal acceleration of the front vehicle and δ represents the front-wheel steering angle of the front vehicle.
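The "constant change rate method" named in claim 9 can be illustrated as follows; the exact extrapolation formula is an assumption for illustration, holding the rate observed over the last V2V interval until the next packet arrives:

```python
def extrapolate_input(u_prev, u_curr, dt_v2v, t):
    """Constant-change-rate extrapolation of an estimator input (a_x or
    delta) between two V2V packets: u_prev and u_curr are the inputs in
    the last two packets received dt_v2v seconds apart, and t is the
    elapsed time since the latest packet."""
    rate = (u_curr - u_prev) / dt_v2v
    return u_curr + rate * t
```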
CN202211164025.6A 2022-09-23 2022-09-23 Front vehicle transverse and longitudinal motion state joint estimation method based on sensor fusion Active CN115571156B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211164025.6A CN115571156B (en) 2022-09-23 2022-09-23 Front vehicle transverse and longitudinal motion state joint estimation method based on sensor fusion


Publications (2)

Publication Number Publication Date
CN115571156A CN115571156A (en) 2023-01-06
CN115571156B (en) 2023-12-26

Family

ID=84581758

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211164025.6A Active CN115571156B (en) 2022-09-23 2022-09-23 Front vehicle transverse and longitudinal motion state joint estimation method based on sensor fusion

Country Status (1)

Country Link
CN (1) CN115571156B (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106515740A (en) * 2016-11-14 2017-03-22 江苏大学 Distributed electrically driven automobile travelling status parameter estimation algorithm based on ICDKF
CN107315413A (en) * 2017-07-12 2017-11-03 北京航空航天大学 Under a kind of truck traffic environment consider vehicle between relative position many car co-located algorithms
CN108128308A (en) * 2017-12-27 2018-06-08 长沙理工大学 A kind of vehicle state estimation system and method for distributed-driving electric automobile
EP3416151A1 (en) * 2017-06-15 2018-12-19 Veoneer Sweden AB Detection of non-v2v vehicles
CN110816526A (en) * 2019-11-29 2020-02-21 苏州智加科技有限公司 Acceleration control method and device for automatically driving vehicle to avoid threat and storage medium
CN110861651A (en) * 2019-12-02 2020-03-06 吉林大学 Method for estimating longitudinal and lateral motion states of front vehicle
KR102110848B1 (en) * 2019-01-24 2020-05-13 (주)이노시뮬레이션 Method for estimating load information based on modular type motion platform and apparatus thereof
CN111780981A (en) * 2020-05-21 2020-10-16 东南大学 Intelligent vehicle formation lane change performance evaluation method
CN112046468A (en) * 2020-09-16 2020-12-08 吉林大学 Vehicle transverse and longitudinal coupling stability control method based on T-S fuzzy
CN112484725A (en) * 2020-11-23 2021-03-12 吉林大学 Intelligent automobile high-precision positioning and space-time situation safety method based on multi-sensor fusion
AU2021102801A4 (en) * 2021-05-24 2021-07-22 Chakrabarti, Prasun DR Velocity estimation from cycle pedaling by using extended kalman filter algorithm
CN113460056A (en) * 2021-08-03 2021-10-01 吉林大学 Vehicle road surface adhesion coefficient estimation method based on Kalman filtering and least square method
CN113460088A (en) * 2021-07-26 2021-10-01 南京航空航天大学 Unmanned vehicle path tracking control method based on nonlinear tire and driver model
CN113511194A (en) * 2021-04-29 2021-10-19 无锡物联网创新中心有限公司 Longitudinal collision avoidance early warning method and related device
CN113859232A (en) * 2021-10-30 2021-12-31 重庆长安汽车股份有限公司 Vehicle automatic driving potential target prediction alarm method and system
CN114518119A (en) * 2020-11-19 2022-05-20 三星电子株式会社 Positioning method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11287524B2 (en) * 2018-12-11 2022-03-29 Hyundai Motor Company System and method for fusing surrounding V2V signal and sensing signal of ego vehicle


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
An Integrated Longitudinal and Lateral Vehicle Following Control System With Radar and Vehicle-to-Vehicle Communication; Shouyang Wei; IEEE Transactions on Vehicular Technology; full text *
A spatio-temporal fusion estimation algorithm for multi-sensor data; Guo Li; Ma Yanheng; Zhang Xien; Systems Engineering and Electronics (12); full text *
Research on front vehicle detection and forward collision warning algorithms based on OpenCV; Liu Jun; Gao Xueting; Wang Liming; Yan Xiaojuan; Automobile Technology (Issue 06); full text *
Research on forward target recognition technology for intelligent vehicles based on information fusion; Yan Sining; China Masters' Theses Full-text Database (Information Science and Technology); pp. D46-52 *
Research on multi-sensor integrated navigation for intelligent vehicles based on federated filtering; Li Xu; Zhang Weigong; China Mechanical Engineering (12); full text *
A review of motion planning and control for intelligent vehicles; Cai Guoshun; Journal of Automotive Safety and Energy; full text *


Similar Documents

Publication Publication Date Title
Guo et al. Vehicle dynamic state estimation: State of the art schemes and perspectives
CN112733270B (en) System and method for predicting vehicle running track and evaluating risk degree of track deviation
US6522956B2 (en) Method and device for estimating a transverse acceleration at an axle of a semitrailer or a trailer of a vehicle combination
CN112083726B (en) Park-oriented automatic driving double-filter fusion positioning system
Ding et al. Event-triggered vehicle sideslip angle estimation based on low-cost sensors
Lian et al. Cornering stiffness and sideslip angle estimation based on simplified lateral dynamic models for four-in-wheel-motor-driven electric vehicles with lateral tire force information
CN113819914A (en) Map construction method and device
CN109900490B (en) Vehicle motion state detection method and system based on autonomous and cooperative sensors
Liu et al. Slip-aware motion estimation for off-road mobile robots via multi-innovation unscented Kalman filter
CN111189454A (en) Unmanned vehicle SLAM navigation method based on rank Kalman filtering
CN113247004A (en) Joint estimation method for vehicle mass and road transverse gradient
CN116552550A (en) Vehicle track tracking control system based on parameter uncertainty and yaw stability
Brunker et al. GNSS-shortages-resistant and self-adaptive rear axle kinematic parameter estimator (SA-RAKPE)
CN114912061A (en) Accurate evaluation method for lane keeping auxiliary system of commercial vehicle
Welte et al. Four-wheeled dead-reckoning model calibration using RTS smoothing
CN109900295B (en) Method and system for detecting vehicle motion state based on autonomous sensor
Hashemi et al. Real-time road bank estimation with disturbance observers for vehicle control systems
JP2001134320A (en) Lane follow-up controller
CN115571156B (en) Front vehicle transverse and longitudinal motion state joint estimation method based on sensor fusion
Mori et al. Simultaneous estimation of vehicle position and data delays using gaussian process based moving horizon estimation
CN114475581B (en) Automatic parking positioning method based on wheel speed pulse and IMU Kalman filtering fusion
Han et al. Hybrid state observer design for estimating the hitch angles of tractor-multi unit trailer
CN113978476B (en) Wire-controlled automobile tire lateral force estimation method considering sensor data loss
CN113060143B (en) System and method for determining road adhesion coefficient
CN115366889A (en) Multi-working-condition pavement adhesion coefficient estimation method and system based on particle filtering

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant