CN113870316A - Front vehicle path reconstruction method under scene without GPS vehicle following - Google Patents


Publication number
CN113870316A
Authority
CN
China
Prior art keywords
vehicle
path
representing
time
coordinate system
Prior art date
Legal status
Granted
Application number
CN202111215298.4A
Other languages
Chinese (zh)
Other versions
CN113870316B (en)
Inventor
李惠乾
黄晋
贾一帆
苏炎召
Current Assignee
Qingdao Dezhi Automobile Technology Co ltd
Original Assignee
Qingdao Dezhi Automobile Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Qingdao Dezhi Automobile Technology Co., Ltd.
Priority to CN202111215298.4A
Publication of CN113870316A
Application granted
Publication of CN113870316B
Legal status: Active

Classifications

    • G06T 7/251: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving models
    • G06T 7/277: Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T 7/75: Determining position or orientation of objects or cameras using feature-based methods involving models
    • G06T 2207/30244: Camera pose (indexing scheme for image analysis or enhancement)
    • G06T 2207/30248: Vehicle exterior or interior (indexing scheme for image analysis or enhancement)
    • Y02T 10/40: Engine management systems (climate change mitigation technologies related to road transport)

Abstract

The invention provides a method for reconstructing the path of a front vehicle in a GPS-free vehicle-following scene, comprising the following steps. Step 1: generate a reference path based on the relative positions of the front and rear vehicles at the initial moment. Step 2: fuse the sensor relative-positioning information with the odometer information to obtain an optimal estimate of the relative pose. Step 3: construct a kinematic model from the optimal relative-pose estimate, initialize and screen the rear vehicle's path, and generate an effective path. The method does not depend on absolute positioning information or other auxiliary infrastructure and is suitable for following-driving scenes on unstructured roads; at the same time, by fusing multi-source information from the sensors and the odometer, it further improves the relative positioning accuracy and finally generates a reliable reference path.

Description

Front vehicle path reconstruction method under scene without GPS vehicle following
Technical Field
The invention relates to the technical field of automatic driving, and in particular to a method for reconstructing the path of a front vehicle in a GPS-free vehicle-following scene.
Background
With the rapid development of autonomous vehicle technology and computer technology, the degree of driving automation continues to improve and the range of applicable scenarios keeps growing. Platooning and vehicle-following scenarios have been widely studied in recent years owing to their advantages of increased road capacity and reduced fuel consumption.
Most existing implementations of automatic driving in platooning and vehicle-following scenarios rely on GPS (Global Positioning System) equipment to provide accurate position information for both vehicles, so that the rear vehicle can follow the historical trajectory of the front vehicle. This approach tracks well, but GPS equipment is expensive and performs unreliably in occluded areas, making large-scale deployment difficult. In addition, following methods based only on the instantaneous relative position of the two vehicles cannot make the rear vehicle accurately follow the front vehicle's historical path, and corner-cutting may occur during turns.
Disclosure of Invention
The invention aims to provide, for an automatic-driving following scene without GPS signals, a method in which the historical path of the front vehicle is reconstructed from relative positioning and vehicle information acquired through vehicle-to-vehicle communication equipment and used as the reference path of the rear vehicle.
Specifically, the method for reconstructing the path of the front vehicle in a GPS-free vehicle-following scene comprises the following steps:
Step 1: generate a reference path based on the relative positions of the front and rear vehicles at the initial moment.
Step 2: fuse the sensor relative-positioning information with the odometer information to obtain an optimal estimate of the relative pose.
Step 3: construct a kinematic model from the optimal relative-pose estimate, initialize and screen the rear vehicle's path, and generate an effective path.
Furthermore, in step 1, according to the relative pose at the initial time, a vehicle coordinate system is established with the centroid of the rear vehicle as the origin and the heading of the vehicle as the positive direction of the x-axis, and cubic curve fitting is performed according to the relative position and heading angle of the front vehicle:

f(x) = a·x³ + b·x² + c·x + d

where a, b, c and d respectively represent the coefficients of the reference path curve. Since the curve passes through the origin of the coordinate system and the slope of the tangent at the origin is 0, c = d = 0; and since the front vehicle, at relative position (Δx, Δy) with relative heading angle Δθ, is on the curve at the initial moment, the following conditions are satisfied:

a·Δx³ + b·Δx² = Δy
3a·Δx² + 2b·Δx = tan(Δθ)

Solving this system of equations gives the values of a and b:

a = (Δx·tan(Δθ) − 2Δy) / Δx³
b = (3Δy − Δx·tan(Δθ)) / Δx²

Setting the number of path points contained in the initialized path to N₀, equidistant interpolation of the cubic curve yields the position information (x_i⁰, y_i⁰), i = 1, 2, …, N₀, of the path points in the initialized path sequence; the slope of the curve at each point gives the pose information of the initial path sequence, θ_i⁰ = arctan(f′(x_i⁰)).
Further, in step 2, a kinematic equation is established and discretized:

Δx_k = Δx_{k−1} + T·(v_x1·cos(Δθ_{k−1}) − v_x2 + ω₂·Δy_{k−1})
Δy_k = Δy_{k−1} + T·(v_x1·sin(Δθ_{k−1}) − ω₂·Δx_{k−1})
Δθ_k = Δθ_{k−1} + T·(ω₁ − ω₂)

where Δx represents the longitudinal relative position of the front vehicle's centroid in the rear-vehicle coordinate system; Δy represents the lateral relative position of the front vehicle's centroid in the rear-vehicle coordinate system; Δθ represents the heading angle of the front vehicle relative to the rear vehicle; ω₁ represents the yaw rate of the front vehicle; ω₂ represents the yaw rate of the rear vehicle; v_x1 represents the longitudinal speed of the front vehicle; v_x2 represents the longitudinal speed of the rear vehicle; and T is the sampling time.
Further, in step 2, using the extended Kalman filter algorithm, a prior estimate of the system state at time k is obtained from the system state at time k−1 via the state equation, expressed as:

X̂_k⁻ = f(X̂_{k−1})

where X̂_{k−1} is the posterior estimate at time k−1. The covariance corresponding to the prior estimate of the state at time k can be expressed as:

P_k⁻ = J_f·P_{k−1}·J_fᵀ + Q

where P_k⁻ represents the covariance corresponding to the prior estimate of the state at time k, P_{k−1} represents the covariance of the estimate at time k−1, and J_f represents the Jacobian matrix of the state function.
Further, in step 2, the Jacobian matrix of the state function is expressed as:

J_f = [ 1, T·ω₂, −T·v_x1,k−1·sin(Δθ_{k−1});
        −T·ω₂, 1, T·v_x1,k−1·cos(Δθ_{k−1});
        0, 0, 1 ]

where v_x1,k−1 represents the longitudinal speed of the front vehicle at time k−1.
Further, in step 2, the optimal estimate X̂_k of the system at that moment is:

X̂_k = X̂_k⁻ + Θ_k·K_k·(Z_k − h(X̂_k⁻))

where X̂_k⁻ represents the predicted value at time k; Z_k represents the measured value at time k; K_k represents the Kalman gain at time k; and Θ_k represents the availability of the perception-system data: Θ_k = 0 when the perception information is unavailable, and Θ_k = 1 when the perception-system data is available.

The optimal estimate is recorded as X̂_k = (Δx̂_k, Δŷ_k, Δθ̂_k)ᵀ, whose components respectively represent the longitudinal relative position, the lateral relative position, and the relative heading angle of the front vehicle's centroid in the rear-vehicle coordinate system.

The Kalman gain at time k can be expressed as:

K_k = P_k⁻·J_hᵀ·(J_h·P_k⁻·J_hᵀ + R)⁻¹

where J_h, the Jacobian matrix of the measurement function, is a unit diagonal matrix.
Further, in step 2, for the extended Kalman filter algorithm to be continuously updated, the covariance of the system's optimal estimated state needs to be updated:

P_k = (I − K_k·J_h)·P_k⁻

where I represents the identity matrix. The optimal estimate of the relative pose at time k is thus obtained, recorded as q̂_k = (Δx̂_k, Δŷ_k, Δθ̂_k), whose components respectively represent the longitudinal relative position, the lateral relative position, and the relative heading angle of the front vehicle's centroid in the rear-vehicle coordinate system.
Further, in step 3, when the system sampling time T is sufficiently small, within the interval [(k−1)T, kT] the pose change of the rear vehicle can be expressed as δq = (δx, δy, δθ)ᵀ. According to the single-track kinematic model of the vehicle:

β = arctan(l_r·tan(δ_f)/(l_f + l_r))
δx = v_x·T
δy = v_x·T·tan(β)
δθ = ω·T

where v_x represents the longitudinal speed of the rear vehicle; l_f and l_r respectively represent the distances from the rear vehicle's front axle and rear axle to its centroid; δ_f represents the front-wheel steering angle of the rear vehicle; ω represents the yaw rate of the rear vehicle; and β is the sideslip angle at the centroid. The pose information in the vehicle coordinate system at time k−1 is converted into the vehicle coordinate system at time k:

x_i^k = cos(δθ)·(x_i^{k−1} − δx) + sin(δθ)·(y_i^{k−1} − δy)
y_i^k = −sin(δθ)·(x_i^{k−1} − δx) + cos(δθ)·(y_i^{k−1} − δy)
θ_i^k = θ_i^{k−1} − δθ
further, in step 3, it is assumed that at the time t ═ kT, the path sequence length is
Figure BDA0003310540660000046
After coordinate system transformation, a positive integer i exists0Satisfy the requirement of
Figure BDA0003310540660000047
Figure BDA0003310540660000048
Then the elimination of waypoints can be performed using the following equation:
Figure BDA0003310540660000049
Figure BDA00033105406600000410
Figure BDA0003310540660000051
wherein ,
Figure BDA0003310540660000052
the sequence of path points at this time is recorded as
Figure BDA0003310540660000053
Further, in step 3, a newly added path point must satisfy the following inequality relationship:

d_min ≤ √((Δx̂_k − x_{N_k′}^k)² + (Δŷ_k − y_{N_k′}^k)²) ≤ d_max

where d_min represents the lower limit of the designed path-point jump distance range, d_max represents the upper limit of the designed path-point jump distance range, N_k′ is the number of path points remaining after elimination, and (x_{N_k′}^k, y_{N_k′}^k) is the position of the last remaining path point. A path point satisfying the inequality is added to the path sequence:

p_{N_k′+1}^k = (Δx̂_k, Δŷ_k, Δθ̂_k, v̂_x1,k, ω̂_1,k)
N_k = N_k′ + 1

where v̂_x1,k and ω̂_1,k are respectively the estimates of the front vehicle's longitudinal speed and yaw rate at that moment; the path is then represented as M_k = {p_1^k, …, p_{N_k}^k}. If the inequality relationship is not satisfied, no new path point is added, N_k = N_k′, and the path is represented as M_k = {p_1^k, …, p_{N_k′}^k}.
The beneficial effects of the invention include:
The method for reconstructing the front vehicle's path in a GPS-free automatic-driving following scene does not depend on absolute positioning information or other auxiliary infrastructure, and is suitable for following-driving scenes on unstructured roads.
The method fuses multi-source information from the sensors and the odometer, further improves the relative positioning accuracy, and finally generates a reliable reference path.
Drawings
Fig. 1 is a schematic flow chart of a method for reconstructing a route of a preceding vehicle in a scene without a GPS following vehicle according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a relative motion model of a front vehicle and a rear vehicle in a method for reconstructing a path of the front vehicle in a scene without a GPS following vehicle according to an embodiment of the present invention.
Detailed Description
The technical solution of the present invention is described in more detail below with reference to the accompanying drawings; the present invention includes, but is not limited to, the following embodiments.
As shown in fig. 1, the invention provides a method for reconstructing the path of a front vehicle in a GPS-free vehicle-following scene, comprising the following steps.
Step 1: when the system starts to operate, a reference path is generated based on the relative positions of the front and rear vehicles at the initial moment, serving as the basis for subsequent path reconstruction.
The invention concerns a two-vehicle platooning or following scene. The two vehicles, referred to as the front vehicle and the rear vehicle, carry no GPS equipment but are equipped with vehicle-to-vehicle communication devices, enabling signal transmission from the front vehicle to the rear vehicle. Each vehicle carries sensors that can measure states such as longitudinal speed, longitudinal acceleration, and yaw rate. The relative pose of the two vehicles, q = (Δx, Δy, Δθ), is acquired either through perception sensors such as lidar and cameras, or through a relative positioning module such as a UWB positioning module. In addition, the rear vehicle should be equipped with drive-by-wire, brake-by-wire, and steer-by-wire systems. In actual operation, a controller can be designed so that the rear vehicle automatically follows the front vehicle along the reconstructed path.
The reconstructed path is the historical path of the front vehicle in the rear vehicle's coordinate system, consisting of a number of path points in the interval from just ahead of the rear vehicle's centroid, along the driving direction, to the current position of the front vehicle's centroid. At time t = kT (k = 0, 1, 2, …), where T is the system sampling time, the path can be represented as a path-point sequence M_k of length N_k, with initial length N₀:

M_k = {p_1^k, p_2^k, …, p_{N_k}^k}

The information of the i-th path point p_i^k can be described as:

p_i^k = (x_i^k, y_i^k, θ_i^k, v_i^k, ω_i^k)

where (x_i^k, y_i^k, θ_i^k) is the pose of the front vehicle at the moment the point was recorded, expressed in the rear-vehicle coordinate system at time k; v_i^k is the longitudinal speed of the front vehicle at that moment; and ω_i^k is the yaw rate of the front vehicle at that moment.
According to the relative pose at the initial moment, a vehicle coordinate system is established with the centroid of the rear vehicle as the origin and the heading of the vehicle as the positive direction of the x-axis, and cubic curve fitting is performed according to the relative position and heading angle of the front vehicle:

f(x) = a·x³ + b·x² + c·x + d

where a, b, c and d respectively represent the coefficients of the reference path curve. Since the curve passes through the origin of the coordinate system and the slope of the tangent at the origin is 0, c = d = 0; and since the front vehicle, at relative position (Δx, Δy) with relative heading angle Δθ, is on the curve at the initial moment, the following conditions are satisfied:

a·Δx³ + b·Δx² = Δy
3a·Δx² + 2b·Δx = tan(Δθ)

Solving this system of equations gives the values of a and b:

a = (Δx·tan(Δθ) − 2Δy) / Δx³
b = (3Δy − Δx·tan(Δθ)) / Δx²

Setting the number of path points contained in the initialized path to N₀, equidistant interpolation of the cubic curve yields the position information (x_i⁰, y_i⁰), i = 1, 2, …, N₀, of the path points in the initialized path sequence; the slope of the curve at each point gives the pose information of the initial path sequence, θ_i⁰ = arctan(f′(x_i⁰)).
Step 2: fuse the sensor relative-positioning information with the odometer information to improve the relative positioning accuracy; when the sensor relative positioning fails, the odometer information provides the relative positioning.
Step 21: first, according to the relative motion relationship shown in fig. 2, a kinematic equation is established and discretized, which can be expressed as:

Δx_k = Δx_{k−1} + T·(v_x1·cos(Δθ_{k−1}) − v_x2 + ω₂·Δy_{k−1})
Δy_k = Δy_{k−1} + T·(v_x1·sin(Δθ_{k−1}) − ω₂·Δx_{k−1})
Δθ_k = Δθ_{k−1} + T·(ω₁ − ω₂)

where Δx represents the longitudinal relative position of the front vehicle's centroid in the rear-vehicle coordinate system; Δy represents the lateral relative position of the front vehicle's centroid in the rear-vehicle coordinate system; Δθ represents the heading angle of the front vehicle relative to the rear vehicle; ω₁ represents the yaw rate of the front vehicle; ω₂ represents the yaw rate of the rear vehicle; v_x1 represents the longitudinal speed of the front vehicle; and v_x2 represents the longitudinal speed of the rear vehicle.
Step 22: the perception information and the odometer information are fused by the extended Kalman filter algorithm.
Denoting the state vector as X_k = [Δx_k, Δy_k, Δθ_k]ᵀ, the nonlinear state equation can be written compactly as:

X_k = f(X_{k−1}) + w_{k−1}

where w_{k−1} is the process noise at time k−1, obeying the Gaussian distribution w_{k−1} ~ N(0, Q), and Q is the process-noise covariance. Denoting the sensor's relative-positioning measurement at time k as Z_k, the measurement equation of the system can be expressed as:

Z_k = h(X_k) + ψ_k

where h(·) denotes the measurement function; since the state variables can be measured directly by the sensors, h(X_k) = H·X_k with H = diag(1, 1, 1). ψ_k is the measurement noise at time k, obeying ψ_k ~ N(0, R), where the magnitude of R depends on the relative-positioning measurement accuracy of the sensor.
In the concrete implementation of the extended Kalman filter algorithm, first, according to the system state at time k−1, a prior estimate of the system state at time k is obtained using the state equation, expressed as:

X̂_k⁻ = f(X̂_{k−1})

where X̂_{k−1} is the posterior estimate of the state X at time k−1. The covariance corresponding to the prior estimate of the state at time k can be expressed as:

P_k⁻ = J_f·P_{k−1}·J_fᵀ + Q

where P_k⁻ represents the covariance corresponding to the prior estimate of the state at time k, P_{k−1} represents the covariance of the estimate at time k−1, and J_f, the Jacobian matrix of the state function, can be obtained by taking partial derivatives of the state equation:

J_f = [ 1, T·ω₂, −T·v_x1,k−1·sin(Δθ_{k−1});
        −T·ω₂, 1, T·v_x1,k−1·cos(Δθ_{k−1});
        0, 0, 1 ]

where v_x1,k−1 represents the longitudinal speed of the front vehicle at time k−1.
The Kalman gain K_k of the system can be expressed as:

K_k = P_k⁻·J_hᵀ·(J_h·P_k⁻·J_hᵀ + R)⁻¹

where J_h, the Jacobian matrix of the measurement function, is a unit diagonal matrix.
Finally, the covariance of the system's optimal state estimate is updated so that the extended Kalman filter algorithm can run continuously:

P_k = (I − K_k·J_h)·P_k⁻

where I represents the identity matrix.
According to the predicted value X̂_k⁻ at time k, the measured value Z_k, and the Kalman gain K_k, the optimal estimate of the system at that moment is obtained as:

X̂_k = X̂_k⁻ + Θ_k·K_k·(Z_k − h(X̂_k⁻))
To cope with loss of the relative-positioning perception information, a parameter Θ_k is introduced to characterize its availability. When the perception information is unavailable, Θ_k = 0, meaning prediction uses only the odometer information; when the perception-system data is available, Θ_k = 1 and the two sources of information are fused.
Through the above steps, the optimal estimate of the relative pose at time k is obtained and recorded as q̂_k = (Δx̂_k, Δŷ_k, Δθ̂_k), whose components respectively represent the longitudinal relative position, the lateral relative position, and the relative heading angle of the front vehicle's centroid in the rear-vehicle coordinate system.
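Steps 21 and 22 can be condensed into a hedged EKF sketch. The sampling time, noise covariances, and function names are assumptions; the availability flag Θ_k appears as `theta_k`, and here it gates both the state and the covariance correction (a design choice of this sketch, not stated in the patent):

```python
import numpy as np

T = 0.1  # sampling time [s] (assumed value)

def f(X, vx1, vx2, w1, w2):
    """Odometry-based propagation of the relative pose X = (dx, dy, dth)."""
    dx, dy, dth = X
    return np.array([
        dx + T * (vx1 * np.cos(dth) - vx2 + w2 * dy),
        dy + T * (vx1 * np.sin(dth) - w2 * dx),
        dth + T * (w1 - w2),
    ])

def jacobian_f(X, vx1, w2):
    """J_f = df/dX, evaluated at the previous estimate."""
    dth = X[2]
    return np.array([
        [1.0,     T * w2,  -T * vx1 * np.sin(dth)],
        [-T * w2, 1.0,      T * vx1 * np.cos(dth)],
        [0.0,     0.0,      1.0],
    ])

def ekf_step(X, P, Q, R, Z, theta_k, vx1, vx2, w1, w2):
    """One predict/correct cycle. h is the identity (J_h = I); theta_k = 0
    keeps the pure odometry prediction, theta_k = 1 fuses the measurement Z."""
    X_prior = f(X, vx1, vx2, w1, w2)
    Jf = jacobian_f(X, vx1, w2)
    P_prior = Jf @ P @ Jf.T + Q
    K = P_prior @ np.linalg.inv(P_prior + R)  # Kalman gain with J_h = I
    X_post = X_prior + theta_k * (K @ (Z - X_prior))
    P_post = (np.eye(3) - theta_k * K) @ P_prior
    return X_post, P_post
```

When the relative-positioning sensor drops out, calling `ekf_step` with `theta_k=0` degrades gracefully to dead reckoning on the odometer, matching the Θ_k = 0 branch described above.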
Step 3: screen the path points and generate an effective path.
Step 31: transform the coordinate system of the historical path points according to the kinematic relation. When the system sampling time T is sufficiently small, within the interval [(k−1)T, kT] the pose change of the rear vehicle can be expressed as δq = (δx, δy, δθ)ᵀ. According to the single-track kinematic model of the vehicle:

β = arctan(l_r·tan(δ_f)/(l_f + l_r))
δx = v_x·T
δy = v_x·T·tan(β)
δθ = ω·T

where v_x represents the longitudinal speed of the rear vehicle; l_f and l_r respectively represent the distances from the rear vehicle's front axle and rear axle to its centroid; δ_f represents the front-wheel steering angle of the rear vehicle; ω represents the yaw rate of the rear vehicle; and β is the sideslip angle at the centroid.
The pose information in the vehicle coordinate system at time k−1 is converted into the vehicle coordinate system at time k through the following coordinate transformation:

x_i^k = cos(δθ)·(x_i^{k−1} − δx) + sin(δθ)·(y_i^{k−1} − δy)
y_i^k = −sin(δθ)·(x_i^{k−1} − δx) + cos(δθ)·(y_i^{k−1} − δy)
θ_i^k = θ_i^{k−1} − δθ
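This frame change can be applied to the whole stored waypoint array at once; the function name and array layout (columns x, y, θ, …) are assumptions for illustration:

```python
import numpy as np

def transform_path(points, dq):
    """Re-express waypoints stored in the rear vehicle's frame at time k-1
    in its frame at time k, given the rear vehicle's own pose change
    dq = (dx, dy, dtheta) over one sampling interval. `points` has columns
    (x, y, theta, ...); extra columns such as speed are passed through."""
    dx, dy, dth = dq
    c, s = np.cos(dth), np.sin(dth)
    R = np.array([[c, s], [-s, c]])  # rotation by -dtheta
    out = points.astype(float).copy()
    out[:, :2] = (points[:, :2] - np.array([dx, dy])) @ R.T
    out[:, 2] = points[:, 2] - dth
    return out
```

For example, after the rear vehicle advances 1 m straight ahead, a waypoint 1 m in front of it lands exactly on the new origin.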
Step 32: along the vehicle's heading direction, path points behind the centroid are meaningless for subsequent path tracking and are therefore removed from the sequence. Suppose that at time t = kT the path sequence length is N_{k−1}; after the coordinate system transformation there exists a positive integer i₀ satisfying

x_{i₀}^k < 0 ≤ x_{i₀+1}^k

Then the elimination of path points can be performed using the following equations:

p_i^k = p_{i+i₀}^k,  i = 1, 2, …, N_{k−1} − i₀
N_k′ = N_{k−1} − i₀

where N_k′ is the sequence length after elimination; the sequence of path points at this time is recorded as M_k′ = {p_1^k, p_2^k, …, p_{N_k′}^k}.
Meanwhile, for newly added path-point data, in order to avoid path points piling up when the front vehicle is stationary and to avoid sudden changes in path-point position caused by abnormal perception data, a lower limit d_min and an upper limit d_max are designed for the path-point jump distance. A new path point is added when it satisfies the following inequality relationship:

d_min ≤ √((Δx̂_k − x_{N_k′}^k)² + (Δŷ_k − y_{N_k′}^k)²) ≤ d_max

where N_k′ is the number of path points remaining after the elimination in step 32 and (x_{N_k′}^k, y_{N_k′}^k) is the position of the last remaining path point. The path point is then added to the path sequence:

p_{N_k′+1}^k = (Δx̂_k, Δŷ_k, Δθ̂_k, v̂_x1,k, ω̂_1,k)
N_k = N_k′ + 1

where v̂_x1,k and ω̂_1,k are respectively the estimates of the front vehicle's longitudinal speed and yaw rate at that moment. The path is then represented as M_k = {p_1^k, …, p_{N_k}^k}. If the inequality relationship is not satisfied, no new path point is added and the path is represented as M_k = {p_1^k, …, p_{N_k′}^k} with N_k = N_k′.
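The elimination and addition rules of step 3 can be sketched together as follows; the bounds `D_MIN`/`D_MAX`, the function name, and the five-column point layout (x, y, θ, v, ω) are assumptions for illustration:

```python
import numpy as np

D_MIN, D_MAX = 0.2, 2.0  # assumed waypoint-jump distance bounds [m]

def update_path(points, q_hat, v1_hat, w1_hat):
    """Drop waypoints behind the rear vehicle's centroid (x < 0) after the
    frame change, then append the newly estimated front-vehicle pose
    q_hat = (dx, dy, dtheta) only if its distance to the last kept waypoint
    lies in [D_MIN, D_MAX]. Assumes at least one waypoint remains ahead."""
    kept = points[points[:, 0] >= 0.0]
    d = np.hypot(q_hat[0] - kept[-1, 0], q_hat[1] - kept[-1, 1])
    if D_MIN <= d <= D_MAX:
        new_pt = np.array([q_hat[0], q_hat[1], q_hat[2], v1_hat, w1_hat])
        kept = np.vstack([kept, new_pt])
    return kept
```

The distance gate rejects both near-duplicate points (a stationary front vehicle) and implausibly large jumps (abnormal perception data), as described above.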
The present invention is not limited to the above embodiments; those skilled in the art can implement the present invention in various other embodiments according to the disclosed content and drawings. Therefore, all designs that can be readily changed or modified using the design structure and ideas of the present invention fall within the protection scope of the present invention.

Claims (10)

1. A method for reconstructing the path of a front vehicle in a GPS-free vehicle-following scene, characterized by comprising the following steps:
step 1: generating a reference path based on the relative positions of the front and rear vehicles at the initial moment;
step 2: fusing the sensor relative-positioning information with the odometer information to obtain an optimal estimate of the relative pose;
step 3: constructing a kinematic model from the optimal relative-pose estimate, initializing and screening the rear vehicle's path, and generating an effective path.
2. The front vehicle path reconstruction method according to claim 1, wherein in step 1, according to the relative pose at the initial time, a vehicle coordinate system is established with the centroid of the rear vehicle as the origin of the coordinate system and the heading of the vehicle as the positive direction of the x-axis, and cubic curve fitting is performed according to the relative position and heading angle of the front vehicle:

f(x) = a·x³ + b·x² + c·x + d

where a, b, c and d respectively represent the coefficients of the reference path curve; since the curve passes through the origin of the coordinate system and the slope of the tangent at the origin is 0, c = d = 0; and since the front vehicle, at relative position (Δx, Δy) with relative heading angle Δθ, is on the curve at the initial moment, the following conditions are satisfied:

a·Δx³ + b·Δx² = Δy
3a·Δx² + 2b·Δx = tan(Δθ)

Solving this system of equations gives the values of a and b:

a = (Δx·tan(Δθ) − 2Δy) / Δx³
b = (3Δy − Δx·tan(Δθ)) / Δx²

Setting the number of path points contained in the initialized path to N₀, equidistant interpolation of the cubic curve yields the position information (x_i⁰, y_i⁰), i = 1, 2, …, N₀, of the path points in the initialized path sequence; the slope of the curve at each point gives the pose information of the initial path sequence, θ_i⁰ = arctan(f′(x_i⁰)).
3. The front vehicle path reconstruction method according to claim 1, wherein in step 2, a kinematic equation is established and discretized:

Δx_k = Δx_{k−1} + T·(v_x1·cos(Δθ_{k−1}) − v_x2 + ω₂·Δy_{k−1})
Δy_k = Δy_{k−1} + T·(v_x1·sin(Δθ_{k−1}) − ω₂·Δx_{k−1})
Δθ_k = Δθ_{k−1} + T·(ω₁ − ω₂)

where Δx represents the longitudinal relative position of the front vehicle's centroid in the rear-vehicle coordinate system; Δy represents the lateral relative position of the front vehicle's centroid in the rear-vehicle coordinate system; Δθ represents the heading angle of the front vehicle relative to the rear vehicle; ω₁ represents the yaw rate of the front vehicle; ω₂ represents the yaw rate of the rear vehicle; v_x1 represents the longitudinal speed of the front vehicle; v_x2 represents the longitudinal speed of the rear vehicle; and T is the sampling time.
4. The front vehicle path reconstruction method according to claim 3, wherein in step 2, using the extended Kalman filter algorithm, a prior estimate of the system state at time k is obtained from the system state at time k−1 via the state equation, expressed as:

X̂_k⁻ = f(X̂_{k−1})

where X̂_{k−1} is the posterior estimate at time k−1. The covariance corresponding to the prior estimate of the state at time k can be expressed as:

P_k⁻ = J_f·P_{k−1}·J_fᵀ + Q

where P_k⁻ represents the covariance corresponding to the prior estimate of the state at time k, P_{k−1} represents the covariance of the estimate at time k−1, and J_f represents the Jacobian matrix of the state function.
5. The front vehicle path reconstruction method according to claim 4, wherein in step 2, the Jacobian matrix of the state function is expressed as:

J_f = [ 1, T·ω₂, −T·v_x1,k−1·sin(Δθ_{k−1});
        −T·ω₂, 1, T·v_x1,k−1·cos(Δθ_{k−1});
        0, 0, 1 ]

where v_x1,k−1 represents the longitudinal speed of the front vehicle at time k−1.
6. The front vehicle path reconstruction method according to claim 4, wherein in step 2, the optimal estimate X̂_k of the system at that moment is:

X̂_k = X̂_k⁻ + Θ_k·K_k·(Z_k − h(X̂_k⁻))

where X̂_k⁻ represents the predicted value at time k; Z_k represents the measured value at time k; K_k represents the Kalman gain at time k; and Θ_k represents the availability of the perception-system data: Θ_k = 0 when the perception information is unavailable, and Θ_k = 1 when the perception-system data is available.

The optimal estimate is recorded as X̂_k = (Δx̂_k, Δŷ_k, Δθ̂_k)ᵀ, whose components respectively represent the longitudinal relative position, the lateral relative position, and the relative heading angle of the front vehicle's centroid in the rear-vehicle coordinate system.

The Kalman gain at time k can be expressed as:

K_k = P_k⁻·J_hᵀ·(J_h·P_k⁻·J_hᵀ + R)⁻¹

where J_h, the Jacobian matrix of the measurement function, is a unit diagonal matrix.
7. The method for reconstructing the leading-vehicle path according to claim 6, wherein in step 2, as the extended Kalman filter algorithm is updated continuously, the covariance of the system's optimal estimated state must also be updated:

$$P_k = \left(I - K_k J_h\right) P_k^-$$

wherein $I$ represents an identity matrix;

the optimal estimated value of the relative position at time k is thereby obtained and recorded as $\hat{q}_k = \left(\Delta\hat{x}_k, \Delta\hat{y}_k, \Delta\hat{\theta}_k\right)^T$, whose components respectively represent the longitudinal relative position, the transverse relative position and the relative heading angle of the front-vehicle centre of mass in the rear-vehicle coordinate system.
8. The method according to claim 6, wherein in step 3, the system sampling time T is sufficiently small that, within the time interval [(k-1)T, kT], the pose change of the rear vehicle can be expressed as $\delta q = (\delta x, \delta y, \delta\theta)^T$; according to the single-track kinematic model of the vehicle:

$$\omega = \frac{v_x \tan\delta_f}{l_f + l_r}, \qquad \delta q = \begin{pmatrix} \delta x \\ \delta y \\ \delta\theta \end{pmatrix} = \begin{pmatrix} v_x T \\ \omega\, l_r T \\ \omega T \end{pmatrix}$$

wherein $v_x$ represents the longitudinal speed of the rear vehicle; $l_f$ and $l_r$ respectively represent the distances from the front axle and the rear axle of the rear vehicle to its centre of mass; $\delta_f$ represents the front-wheel steering angle of the rear vehicle; $\omega$ represents the yaw rate of the rear vehicle; the pose information in the vehicle coordinate system at time k-1 is then converted into the vehicle coordinate system at time k:

$$\begin{pmatrix} x_k \\ y_k \\ \theta_k \end{pmatrix} = \begin{pmatrix} \cos\delta\theta\,(x_{k-1}-\delta x) + \sin\delta\theta\,(y_{k-1}-\delta y) \\ -\sin\delta\theta\,(x_{k-1}-\delta x) + \cos\delta\theta\,(y_{k-1}-\delta y) \\ \theta_{k-1} - \delta\theta \end{pmatrix}$$
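A small-T implementation of claim 8's pose increment and frame change might look like the sketch below. The lateral term $\omega l_r T$ is one plausible single-track reading of the claim (lateral velocity of the centre of mass under a no-slip rear axle); treat the exact increment as an assumption:

```python
import numpy as np

def pose_increment(v_x, delta_f, l_f, l_r, T):
    """Rear-vehicle pose change dq = (dx, dy, dtheta) over one sampling
    period T, from a kinematic single-track model (assumed small-T form;
    the lateral term omega*l_r*T is the centre-of-mass sideslip)."""
    omega = v_x * np.tan(delta_f) / (l_f + l_r)   # yaw rate
    return np.array([v_x * T, omega * l_r * T, omega * T])

def to_frame_k(q_prev, dq):
    """Convert a pose q_prev = (x, y, theta), expressed in the vehicle
    frame at time k-1, into the vehicle frame at time k, where the
    vehicle itself moved by dq between the two instants."""
    dx, dy, dth = dq
    c, s = np.cos(dth), np.sin(dth)
    return np.array([
        c * (q_prev[0] - dx) + s * (q_prev[1] - dy),
        -s * (q_prev[0] - dx) + c * (q_prev[1] - dy),
        q_prev[2] - dth,
    ])
```

For straight driving (zero steering angle) the transform reduces to a pure longitudinal shift, which is a quick sanity check on both functions.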
9. The leading-vehicle path reconstruction method according to claim 8, wherein in step 3, it is assumed that at time t = kT the path sequence has length given by [equation image FDA0003310540650000042]; after the coordinate-system transformation there exists a positive integer $i_0$ satisfying the condition of [equation image FDA0003310540650000043] (the first $i_0$ path points have been passed by the rear vehicle); the elimination of waypoints can then be performed using the equations of [equation images FDA0003310540650000044–FDA0003310540650000046], wherein [equation image FDA0003310540650000047]; the path-point sequence at this time is recorded as [equation image FDA0003310540650000048].
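Claim 9's elimination of passed waypoints can be sketched as follows. The criterion "longitudinal coordinate x ≤ 0 in the current rear-vehicle frame" is an assumed reading of the $i_0$ condition, since the claim's equation images are not recoverable from this text:

```python
from itertools import dropwhile

def prune_passed_points(path):
    """Remove the leading path points the rear vehicle has already passed.

    `path` holds (x, y) points in the CURRENT rear-vehicle frame, ordered
    nearest-first; a point with x <= 0 is assumed to lie behind the
    vehicle (an assumed reading of the claim's i_0 condition).
    """
    return list(dropwhile(lambda p: p[0] <= 0.0, path))
```

Because the points are ordered along the path, only the leading run of passed points is dropped, which matches eliminating indices 1 through $i_0$ and re-indexing the remainder.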
10. The leading-vehicle path reconstruction method according to claim 9, wherein in step 3, a newly added path point must satisfy the inequality relationship of [equation image FDA0003310540650000049], wherein $d_{min}$ represents the lower limit of the designed path-point hop-distance range and $d_{max}$ represents the upper limit of the designed path-point hop-distance range;

a path point satisfying the inequality is added to the path sequence according to [equation images FDA00033105406500000410–FDA00033105406500000412], wherein [equation images FDA00033105406500000413 and FDA00033105406500000414] are respectively the estimated values of the longitudinal speed and the yaw rate at that moment; the path is then represented as [equation image FDA00033105406500000415];

if the inequality relationship is not satisfied, no new path point is added, and the path is represented as [equation image FDA00033105406500000416].
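The gating rule of claim 10 can be sketched with a distance test between the new front-vehicle position estimate and the last stored waypoint. The interpretation of the inequality as a Euclidean hop distance in [d_min, d_max] is an assumption, as the claim's equation images are not reproduced here:

```python
import math

def maybe_append_point(path, p_new, d_min, d_max):
    """Append the newly estimated front-vehicle position p_new = (x, y)
    to the path only if its distance to the last stored point lies in the
    designed hop range [d_min, d_max]: d_min suppresses near-duplicate
    points, d_max rejects implausible jumps (assumed reading of the
    claim's inequality). Returns True if the point was added."""
    if not path:
        path.append(p_new)
        return True
    last = path[-1]
    d = math.hypot(p_new[0] - last[0], p_new[1] - last[1])
    if d_min <= d <= d_max:
        path.append(p_new)
        return True
    return False
```

The two rejection cases keep the path both compact (no near-duplicate points during slow following) and robust to perception outliers.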
CN202111215298.4A 2021-10-19 2021-10-19 Front vehicle path reconstruction method under GPS-free following scene Active CN113870316B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111215298.4A CN113870316B (en) 2021-10-19 2021-10-19 Front vehicle path reconstruction method under GPS-free following scene


Publications (2)

Publication Number Publication Date
CN113870316A true CN113870316A (en) 2021-12-31
CN113870316B CN113870316B (en) 2023-08-15

Family

ID=79000236


Country Status (1)

Country Link
CN (1) CN113870316B (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107015238A (en) * 2017-04-27 2017-08-04 睿舆自动化(上海)有限公司 Unmanned vehicle autonomic positioning method based on three-dimensional laser radar
CN108519773A (en) * 2018-03-07 2018-09-11 西安交通大学 The paths planning method of automatic driving vehicle under a kind of structured environment
CN108646763A (en) * 2018-07-18 2018-10-12 扬州大学 A kind of autonomous driving trace tracking and controlling method
CN109668569A (en) * 2018-12-08 2019-04-23 华东交通大学 Path rapid generation in a kind of intelligent driving
CN109991972A (en) * 2017-12-29 2019-07-09 长城汽车股份有限公司 Control method, apparatus, vehicle and the readable storage medium storing program for executing of vehicle driving
CN110806215A (en) * 2019-11-21 2020-02-18 北京百度网讯科技有限公司 Vehicle positioning method, device, equipment and storage medium
CN111231965A (en) * 2020-01-14 2020-06-05 北京小马智行科技有限公司 Method and device for adjusting vehicle control mode and unmanned vehicle
US20200180611A1 (en) * 2018-12-07 2020-06-11 Volkswagen Aktiengesellschaft Driver Assistance System For A Motor Vehicle, Motor Vehicle And Method For Operating A Motor Vehicle
CN111338340A (en) * 2020-02-21 2020-06-26 天津大学 Model prediction-based unmanned automobile local path planning method
CN111707272A (en) * 2020-06-28 2020-09-25 湖南大学 Underground garage automatic driving laser positioning system
CN111750887A (en) * 2020-06-11 2020-10-09 上海交通大学 Unmanned vehicle trajectory planning method and system for reducing accident severity
CN112130559A (en) * 2020-08-21 2020-12-25 同济大学 Indoor pedestrian following and obstacle avoiding method based on UWB and laser radar
CN112904849A (en) * 2021-01-18 2021-06-04 北京科技大学 Integrated automatic driving automobile lane change track planning and tracking control method and system
CN113386795A (en) * 2021-07-05 2021-09-14 西安电子科技大学芜湖研究院 Intelligent decision-making and local track planning method for automatic driving vehicle and decision-making system thereof
CN113433947A (en) * 2021-07-15 2021-09-24 天津大学 Intersection trajectory planning and control method based on obstacle vehicle estimation and prediction


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
HANZHANG XUE et al.: "IMU-Aided High-Frequency Lidar Odometry for Autonomous Driving", Applied Sciences, vol. 9, pages 1-20 *
化祖旭: "A Review of Path Tracking Control Algorithms for Autonomous Vehicles", Equipment Manufacturing Technology, no. 5, pages 100-103 *
肖军浩 et al.: "Extrinsic Calibration of a Monocular Camera and 3D LiDAR with Fused Odometry", Robot, vol. 43, no. 1, pages 17-28 *
谭光兴 et al.: "Vehicle Driving State Estimation Based on Extended Kalman Filtering", Journal of Guangxi University of Science and Technology, vol. 31, no. 1, pages 18-24 *
赵凌峰: "Research on Lateral and Longitudinal Control Algorithms for High-Speed Unmanned Vehicles", China Master's Theses Full-text Database, Engineering Science and Technology II, no. 06, pages 035-200 *

Also Published As

Publication number Publication date
CN113870316B (en) 2023-08-15

Similar Documents

Publication Publication Date Title
CN106476728B (en) Motion compensation for vehicle-mounted vehicle sensors
EP3678110B1 (en) Method for correcting positional error and device for correcting positional error in a drive-assisted vehicle
US9645250B2 (en) Fail operational vehicle speed estimation through data fusion of 6-DOF IMU, GPS, and radar
Son et al. Robust multirate control scheme with predictive virtual lanes for lane-keeping system of autonomous highway driving
US20210403032A1 (en) Two-level path planning for autonomous vehicles
Kang et al. Comparative evaluation of dynamic and kinematic vehicle models
CN112083726B (en) Park-oriented automatic driving double-filter fusion positioning system
CN112486156B (en) Automatic tracking control system and control method for vehicle
JP6594589B1 (en) Travel plan generation device and automatic driving system
Xia et al. Advancing estimation accuracy of sideslip angle by fusing vehicle kinematics and dynamics information with fuzzy logic
KR102024986B1 (en) Apparatus and method for lane Keeping control
Liu et al. Slip-aware motion estimation for off-road mobile robots via multi-innovation unscented Kalman filter
CN112433531A (en) Trajectory tracking method and device for automatic driving vehicle and computer equipment
CN115923839A (en) Vehicle path planning method
Zindler et al. Real-time ego-motion estimation using Lidar and a vehicle model based Extended Kalman Filter
Katriniok Optimal vehicle dynamics control and state estimation for a low-cost GNSS-based collision avoidance system
CN113870316A (en) Front vehicle path reconstruction method under scene without GPS vehicle following
CN117146838A (en) Path planning method and device and related products
Berntorp et al. Bayesian sensor fusion of GNSS and camera with outlier adaptation for vehicle positioning
EP2364891B1 (en) Method for threat assessment in a vehicle
Zhou et al. Vision-based control of an industrial vehicle in unstructured environments
Han et al. The Leader-Follower Formation Control of Nonholonomic Vehicle With Follower-Stabilizing Strategy
CN113048987A (en) Vehicle navigation system positioning method
KR102271913B1 (en) Apparatus for determining position of vehicle and method thereof
CN115571156B (en) Front vehicle transverse and longitudinal motion state joint estimation method based on sensor fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant