CN113870316A - Front vehicle path reconstruction method under scene without GPS vehicle following - Google Patents
- Publication number
- CN113870316A CN113870316A CN202111215298.4A CN202111215298A CN113870316A CN 113870316 A CN113870316 A CN 113870316A CN 202111215298 A CN202111215298 A CN 202111215298A CN 113870316 A CN113870316 A CN 113870316A
- Authority
- CN
- China
- Prior art keywords
- vehicle
- path
- representing
- time
- coordinate system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 24
- 238000012216 screening Methods 0.000 claims abstract description 4
- 239000011159 matrix material Substances 0.000 claims description 14
- 238000013461 design Methods 0.000 claims description 10
- 238000005259 measurement Methods 0.000 claims description 7
- 230000009191 jumping Effects 0.000 claims description 5
- 238000005070 sampling Methods 0.000 claims description 4
- 230000009466 transformation Effects 0.000 claims description 4
- 206010034719 Personality change Diseases 0.000 claims description 3
- 230000008030 elimination Effects 0.000 claims description 3
- 238000003379 elimination reaction Methods 0.000 claims description 3
- 238000001914 filtration Methods 0.000 description 3
- 238000004891 communication Methods 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 230000002159 abnormal effect Effects 0.000 description 1
- 230000001133 acceleration Effects 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000009795 derivation Methods 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 239000000446 fuel Substances 0.000 description 1
- 230000008054 signal transmission Effects 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/251—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75—Determining position or orientation of objects or cameras using feature-based methods involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Navigation (AREA)
Abstract
The invention provides a method for reconstructing the path of a front vehicle in a GPS-free vehicle-following scene, comprising the following steps. Step 1: generate a reference path based on the relative positions of the front and rear vehicles at the initial moment. Step 2: fuse the sensor's relative positioning information with the odometer information to obtain an optimal estimate of the relative pose. Step 3: construct a kinematic model from the optimal relative-pose estimate, initialize and screen the rear vehicle's path, and generate an effective path. The method does not depend on absolute positioning information or other auxiliary infrastructure, so it suits following scenarios on unstructured roads; by fusing multi-source information from the sensor and the odometer, it further improves relative-positioning accuracy and ultimately generates a reliable reference path.
Description
Technical Field
The invention relates to the technical field of automated driving, and in particular to a method for reconstructing the path of a front vehicle in a GPS-free vehicle-following scene.
Background
With the rapid development of automotive and computer technology, the degree of automation in driving continues to improve and the range of applicable scenarios keeps growing. Platooning and vehicle-following scenarios have been widely studied in recent years because they increase road capacity and reduce fuel consumption.
Most existing implementations of automated platooning and vehicle-following rely on GPS (Global Positioning System) equipment to provide accurate position information for both vehicles, so that the rear vehicle can track the front vehicle's historical trajectory. This approach tracks well, but GPS equipment is expensive and unstable in occluded areas, making large-scale deployment difficult. In addition, following methods based only on the instantaneous relative position of the two vehicles ignore the front vehicle's history, so the rear vehicle cannot accurately travel along the front vehicle's historical path and may cut the inside of curves when steering.
Disclosure of Invention
The invention aims at an automated following scenario without GPS signals, in which the historical path of the front vehicle is reconstructed from relative positioning and vehicle information acquired through vehicle-to-vehicle communication equipment and used as the reference path of the rear vehicle.
Specifically, the method for reconstructing the path of a front vehicle in a GPS-free vehicle-following scene comprises the following steps:
Step 1: generating a reference path based on the relative positions of the front and rear vehicles at the initial moment;
Step 2: fusing the sensor's relative positioning information with the odometer information to obtain an optimal estimate of the relative pose;
Step 3: constructing a kinematic model from the optimal relative-pose estimate, initializing and screening the rear vehicle's path, and generating an effective path.
Furthermore, in step 1, according to the relative pose at the initial moment, a vehicle coordinate system is established with the rear vehicle's center of mass as the origin and the heading of the vehicle as the positive x-axis, and a cubic curve is fitted to the front vehicle's relative position and heading angle:

f(x) = ax³ + bx² + cx + d

wherein a, b, c and d denote the coefficients of the reference path curve; since the curve passes through the origin of the coordinate system and the slope of the tangent at the origin is 0, c = d = 0; and since the front vehicle, at relative pose (Δx₀, Δy₀, Δθ₀), lies on the curve at the initial moment, the following conditions are satisfied:

aΔx₀³ + bΔx₀² = Δy₀
3aΔx₀² + 2bΔx₀ = tan Δθ₀

Solving this system of equations gives the values of a and b:

a = (Δx₀ tan Δθ₀ − 2Δy₀)/Δx₀³, b = (3Δy₀ − Δx₀ tan Δθ₀)/Δx₀²

Setting the number of path points contained in the initialized path to N₀ and performing equidistant interpolation on the cubic curve yields the position information of the path points in the initialized path sequence; the slope of the curve at each point gives the pose information of the initial path sequence.
Further, in step 2, a kinematic equation is established and discretized; in continuous form the relative-motion model is:

Δẋ = vₓ₁ cos Δθ − vₓ₂ + ω₂Δy
Δẏ = vₓ₁ sin Δθ − ω₂Δx
Δθ̇ = ω₁ − ω₂

wherein Δx denotes the longitudinal relative position of the front vehicle's center of mass in the rear vehicle's coordinate system; Δy denotes the lateral relative position of the front vehicle's center of mass in the rear vehicle's coordinate system; Δθ denotes the heading angle of the front vehicle relative to the rear vehicle; ω₁ denotes the yaw rate of the front vehicle; ω₂ denotes the yaw rate of the rear vehicle; vₓ₁ denotes the longitudinal speed of the front vehicle; and vₓ₂ denotes the longitudinal speed of the rear vehicle.
Further, in step 2, using the extended Kalman filter algorithm, a prior estimate of the system state at time k is obtained from the system state at time k−1 via the state equation, expressed as:

X̂ₖ⁻ = f(X̂ₖ₋₁)

wherein X̂ₖ₋₁ is the posterior estimate at time k−1; the covariance corresponding to the prior state estimate at time k can be expressed as:

Pₖ⁻ = J_f Pₖ₋₁ J_fᵀ + Q

wherein Pₖ⁻ denotes the covariance corresponding to the prior state estimate at time k, Pₖ₋₁ denotes the estimate covariance at time k−1, and J_f denotes the Jacobian matrix of the state function.
Further, in step 2, the Jacobian matrix J_f of the state function is obtained by taking partial derivatives of the state equation, wherein vₓ,ₖ₋₁ denotes the vehicle's longitudinal speed at time k−1.

The optimal estimate of the system at time k is:

X̂ₖ = X̂ₖ⁻ + ΘₖKₖ(Zₖ − X̂ₖ⁻)

wherein X̂ₖ⁻ denotes the predicted value at time k; Zₖ denotes the measured value at time k; Kₖ denotes the Kalman gain at time k; Θₖ denotes the availability of sensing-system data: Θₖ = 0 when sensing information is unavailable, and Θₖ = 1 when sensing-system data are available.

The optimal estimate, recorded as (Δx̂ₖ, Δŷₖ, Δθ̂ₖ)ᵀ, denotes the longitudinal relative position, the lateral relative position, and the relative heading angle of the front vehicle's center of mass in the rear vehicle's coordinate system.

The Kalman gain at time k can be expressed as:

Kₖ = Pₖ⁻ J_hᵀ(J_h Pₖ⁻ J_hᵀ + R)⁻¹

wherein J_h, the Jacobian matrix of the measurement function, is a unit diagonal matrix.
Further, in step 2, for the extended Kalman filter algorithm to update continuously, the covariance of the system's optimal state estimate must be updated:

Pₖ = (I − ΘₖKₖJ_h)Pₖ⁻

wherein I denotes the identity matrix;
this yields the optimal estimate of the relative pose at time k, recorded as (Δx̂ₖ, Δŷₖ, Δθ̂ₖ)ᵀ, denoting the longitudinal relative position, the lateral relative position, and the relative heading angle of the front vehicle's center of mass in the rear vehicle's coordinate system.
Further, in step 3, when the system sampling time T is sufficiently small, the pose change of the rear vehicle within the time interval [(k−1)T, kT] can be expressed as δq = (δx, δy, δθ)ᵀ according to the single-track kinematic model of the vehicle, wherein vₓ denotes the longitudinal speed of the rear vehicle, l_f and l_r denote the distances from the rear vehicle's front axle and rear axle, respectively, to its center of mass, δ_f denotes the front-wheel steering angle of the rear vehicle, and ω denotes the yaw rate of the rear vehicle; the pose information in the vehicle coordinate system at time k−1 is converted into the vehicle coordinate system at time k:

xₖ = (xₖ₋₁ − δx)cos δθ + (yₖ₋₁ − δy)sin δθ
yₖ = −(xₖ₋₁ − δx)sin δθ + (yₖ₋₁ − δy)cos δθ
θₖ = θₖ₋₁ − δθ
further, in step 3, it is assumed that at the time t ═ kT, the path sequence length isAfter coordinate system transformation, a positive integer i exists0Satisfy the requirement of
Then the elimination of waypoints can be performed using the following equation:
Further, in step 3, a newly added path point must satisfy the inequality relation

d_min ≤ d ≤ d_max

wherein d is the distance between the newly added point and the last point in the sequence, d_min denotes the lower limit of the designed path-point jump distance range, and d_max denotes the upper limit of the designed path-point jump distance range.
A path point satisfying the inequality relation is added to the path sequence together with the estimates of the front vehicle's longitudinal speed and yaw rate at that moment, and the path is then represented as Mₖ.
If the inequality relation is not satisfied, no new path point is added.
The beneficial effects of the invention include:
The method for reconstructing the front vehicle's path in a GPS-free automated following scene does not depend on absolute positioning information or other auxiliary infrastructure, and is suitable for following scenarios on unstructured roads.
The method fuses multi-source information from the sensor and the odometer, further improves relative-positioning accuracy, and finally generates a reliable reference path.
Drawings
Fig. 1 is a schematic flow chart of a method for reconstructing a route of a preceding vehicle in a scene without a GPS following vehicle according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a relative motion model of a front vehicle and a rear vehicle in a method for reconstructing a path of the front vehicle in a scene without a GPS following vehicle according to an embodiment of the present invention.
Detailed Description
The technical solution of the present invention will be described in more detail with reference to the accompanying drawings, and the present invention includes, but is not limited to, the following embodiments.
As shown in Fig. 1, the invention provides a method for reconstructing the path of a front vehicle in a GPS-free vehicle-following scene, comprising the following steps:
Step 1: when the system starts to operate, a reference path is generated based on the relative positions of the front and rear vehicles at the initial moment, serving as the basis for subsequent path reconstruction.
The invention concerns a two-vehicle platoon or following scenario. The two vehicles, referred to as the front vehicle and the rear vehicle, are not equipped with GPS devices but carry vehicle-to-vehicle communication equipment that transmits signals from the front vehicle to the rear vehicle. Each vehicle carries sensors that measure states such as longitudinal speed, longitudinal acceleration, and yaw rate. The two vehicles obtain their relative pose q = (Δx, Δy, Δθ) through a sensor, such as a lidar or camera, or through a relative positioning module, such as a UWB positioning module. In addition, the rear vehicle should be equipped with drive-by-wire throttle, brake, and steering systems. In actual operation, a controller can be designed so that the rear vehicle automatically follows the front vehicle along the reconstructed path.
The reconstructed path is the historical path of the front vehicle in the rear vehicle's coordinate system, consisting of a number of path points in the interval from just ahead of the rear vehicle's center of mass, along the driving direction, to the current position of the front vehicle's center of mass. At time t = kT (k = 0, 1, 2, …), where T is the system sampling time, the path can be represented as a sequence Mₖ of Nₖ path points; the initial length of the sequence is N₀. Each path point contains the pose of the front vehicle at time k, the longitudinal speed of the front vehicle at time k, and the yaw rate of the front vehicle at time k.
According to the relative pose at the initial moment, a vehicle coordinate system is established with the rear vehicle's center of mass as the origin and the heading of the vehicle as the positive x-axis, and a cubic curve is fitted to the front vehicle's relative position and heading angle:

f(x) = ax³ + bx² + cx + d

wherein a, b, c and d denote the coefficients of the reference path curve; since the curve passes through the origin of the coordinate system and the slope of the tangent at the origin is 0, c = d = 0; and since the front vehicle, at relative pose (Δx₀, Δy₀, Δθ₀), lies on the curve at the initial moment, the following conditions are satisfied:

aΔx₀³ + bΔx₀² = Δy₀
3aΔx₀² + 2bΔx₀ = tan Δθ₀

Solving this system of equations gives the values of a and b:

a = (Δx₀ tan Δθ₀ − 2Δy₀)/Δx₀³, b = (3Δy₀ − Δx₀ tan Δθ₀)/Δx₀²

Setting the number of path points contained in the initialized path to N₀ and performing equidistant interpolation on the cubic curve yields the position information of the path points in the initialized path sequence; the slope of the curve at each point gives the pose information of the initial path sequence.
Step 2: fuse the sensor's relative positioning information with the odometer information to improve relative-positioning accuracy; when the sensor's relative positioning fails, the odometer information provides the relative pose.
Step 21: first, according to the relative-motion relationship shown in Fig. 2, a kinematic equation is established and discretized; in continuous form it can be expressed as:

Δẋ = vₓ₁ cos Δθ − vₓ₂ + ω₂Δy
Δẏ = vₓ₁ sin Δθ − ω₂Δx
Δθ̇ = ω₁ − ω₂

wherein Δx denotes the longitudinal relative position of the front vehicle's center of mass in the rear vehicle's coordinate system; Δy denotes the lateral relative position of the front vehicle's center of mass in the rear vehicle's coordinate system; Δθ denotes the heading angle of the front vehicle relative to the rear vehicle; ω₁ denotes the yaw rate of the front vehicle; ω₂ denotes the yaw rate of the rear vehicle; vₓ₁ denotes the longitudinal speed of the front vehicle; and vₓ₂ denotes the longitudinal speed of the rear vehicle.
Step 22: fuse the sensing information and the odometer information with the extended Kalman filter algorithm.
Writing the state vector as Xₖ = [Δxₖ, Δyₖ, Δθₖ]ᵀ, the nonlinear state equation can be simplified and expressed as:

Xₖ = f(Xₖ₋₁) + wₖ₋₁

wherein wₖ₋₁ is the process noise at time k−1, obeying the Gaussian distribution wₖ₋₁ ~ N(0, Q), and Q denotes the covariance of the process noise. Let Zₖ be the sensor's relative-positioning measurement at time k; the measurement equation of the system can then be expressed as:

Zₖ = h(Xₖ) + ψₖ

wherein h(·) denotes the measurement function; since the state variables can be measured directly by the sensor, h(Xₖ) = HXₖ with H = diag([1, 1, 1]); ψₖ is the measurement noise at time k, obeying ψₖ ~ N(0, R), and the magnitude of R depends on the sensor's relative-positioning measurement accuracy.
In the concrete implementation of the extended Kalman filter algorithm, a prior estimate of the system state at time k is first obtained from the system state at time k−1 via the state equation:

X̂ₖ⁻ = f(X̂ₖ₋₁)

wherein X̂ₖ₋₁ is the posterior estimate of the state X at time k−1. The covariance corresponding to the prior state estimate at time k can be expressed as:

Pₖ⁻ = J_f Pₖ₋₁ J_fᵀ + Q

wherein Pₖ⁻ denotes the covariance corresponding to the prior state estimate at time k, and Pₖ₋₁ denotes the estimate covariance at time k−1; J_f, the Jacobian matrix of the state function, is obtained by taking partial derivatives of the state equation, wherein vₓ,ₖ₋₁ denotes the vehicle's longitudinal speed at time k−1.

The Kalman gain Kₖ of the system can be expressed as:

Kₖ = Pₖ⁻ J_hᵀ(J_h Pₖ⁻ J_hᵀ + R)⁻¹

wherein J_h, the Jacobian matrix of the measurement function, is a unit diagonal matrix.

Finally, the covariance of the system's optimal state estimate is updated so that the extended Kalman filter algorithm can continue to iterate:

Pₖ = (I − ΘₖKₖJ_h)Pₖ⁻

wherein I denotes the identity matrix.

From the predicted value X̂ₖ⁻ at time k, the measured value Zₖ, and the Kalman gain Kₖ, the optimal estimate of the system at that moment is:

X̂ₖ = X̂ₖ⁻ + ΘₖKₖ(Zₖ − X̂ₖ⁻)

To guard against loss of the relative-positioning sensing information, the parameter Θₖ is introduced to characterize the availability of the sensing information: when the sensing information is unavailable, Θₖ = 0, meaning that only odometer information is used for prediction; when sensing-system data are available, Θₖ = 1 and the two sources of information are fused.

Through the above steps, the optimal estimate of the relative pose at time k is obtained and recorded as (Δx̂ₖ, Δŷₖ, Δθ̂ₖ)ᵀ, denoting the longitudinal relative position, the lateral relative position, and the relative heading angle of the front vehicle's center of mass in the rear vehicle's coordinate system.
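As a concrete illustration of step 2, the following Python sketch performs one predict/update cycle of the gated extended Kalman filter described above. The discretized relative-motion model and its Jacobian used here are the standard leader-follower forms assumed from the variable definitions in the text, not necessarily the patent's exact equations; function and parameter names are illustrative.

```python
import numpy as np

def ekf_step(x_post, P, z, T, v1, v2, w1, w2, Q, R, theta_k):
    """One predict/update cycle fusing odometry (state equation) with the
    sensor's relative-pose measurement; theta_k in {0, 1} gates the update."""
    dx, dy, dth = x_post
    # Predict with the discretized relative-motion model (odometry only).
    x_prior = np.array([
        dx + T * (v1 * np.cos(dth) - v2 + w2 * dy),
        dy + T * (v1 * np.sin(dth) - w2 * dx),
        dth + T * (w1 - w2),
    ])
    # Jacobian J_f of the state function w.r.t. (dx, dy, dth).
    Jf = np.array([
        [1.0,     T * w2, -T * v1 * np.sin(dth)],
        [-T * w2, 1.0,     T * v1 * np.cos(dth)],
        [0.0,     0.0,     1.0],
    ])
    P_prior = Jf @ P @ Jf.T + Q
    Jh = np.eye(3)                      # direct measurement: h(X) = X
    K = P_prior @ Jh.T @ np.linalg.inv(Jh @ P_prior @ Jh.T + R)
    x_post = x_prior + theta_k * (K @ (z - Jh @ x_prior))
    P = (np.eye(3) - theta_k * (K @ Jh)) @ P_prior
    return x_post, P
```

With theta_k = 0 the function reduces to pure odometry prediction, which is exactly the fallback behavior the text specifies when the sensing information is unavailable.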
Step 3: screen the path points and generate an effective path.
Step 31: convert the coordinate system of the historical path points according to the kinematic relation. When the system sampling time T is sufficiently small, the pose change of the rear vehicle within the time interval [(k−1)T, kT] can be expressed as δq = (δx, δy, δθ)ᵀ according to the single-track kinematic model of the vehicle, wherein vₓ denotes the longitudinal speed of the rear vehicle, l_f and l_r denote the distances from the rear vehicle's front axle and rear axle, respectively, to its center of mass, δ_f denotes the front-wheel steering angle of the rear vehicle, and ω denotes the yaw rate of the rear vehicle.
The pose information in the vehicle coordinate system at time k−1 is converted into the vehicle coordinate system at time k through the coordinate-system transformation:

xₖ = (xₖ₋₁ − δx)cos δθ + (yₖ₋₁ − δy)sin δθ
yₖ = −(xₖ₋₁ − δx)sin δθ + (yₖ₋₁ − δy)cos δθ
θₖ = θₖ₋₁ − δθ
Step 32: along the vehicle's heading direction, path points behind the center of mass are meaningless for subsequent path tracking, so they are removed from the sequence. Suppose that at time t = kT the path sequence length is Nₖ₋₁; after the coordinate-system transformation there exists a positive integer i₀ marking the first path point whose longitudinal coordinate satisfies x ≥ 0, and the i₀ − 1 path points behind the center of mass are eliminated from the sequence, which is then recorded as Mₖ′. Meanwhile, for newly added path-point data, to avoid path points piling up when the front vehicle is stationary and to avoid sudden jumps in path-point position caused by abnormal sensing data, lower and upper limits d_min and d_max are designed for the path-point jump distance. When a newly added path point satisfies the inequality relation

d_min ≤ d ≤ d_max

wherein d is the distance between the new point and the last point in the sequence, the point is added to the path sequence together with the estimates of the front vehicle's longitudinal speed and yaw rate at that moment, and the path is then represented as Mₖ. If the inequality relation is not satisfied, no new path point is added.
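The screening logic of steps 31-32 can be sketched as follows, assuming for simplicity that each path point is stored as an (x, y, θ) tuple (the patent's sequence additionally carries the front vehicle's speed and yaw rate); the SE(2) change of frame and the [d_min, d_max] jump-distance gate follow the description above, while the function and parameter names are illustrative.

```python
import numpy as np

def update_path(path, delta_q, new_point, d_min, d_max):
    """Transform waypoints into the new rear-vehicle frame, drop points behind
    the centroid (x < 0), and append the new front-vehicle point only if its
    jump distance from the last waypoint lies within [d_min, d_max]."""
    dx, dy, dth = delta_q
    c, s = np.cos(dth), np.sin(dth)
    out = []
    for x, y, th in path:
        xr, yr = x - dx, y - dy          # rigid-body SE(2) change of frame
        out.append((c * xr + s * yr, -s * xr + c * yr, th - dth))
    out = [p for p in out if p[0] >= 0.0]  # eliminate points behind the centroid
    if out:
        jump = np.hypot(new_point[0] - out[-1][0], new_point[1] - out[-1][1])
        if d_min <= jump <= d_max:       # gate against stacking and sensing jumps
            out.append(tuple(new_point))
    return out
```

The lower bound suppresses point stacking when the front vehicle is stationary; the upper bound rejects position jumps caused by abnormal sensing data, as described in step 32.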
The present invention is not limited to the above embodiments, and those skilled in the art can implement the present invention in other various embodiments according to the disclosure of the embodiments and the drawings, and therefore, all designs that can be easily changed or modified by using the design structure and thought of the present invention fall within the protection scope of the present invention.
Claims (10)
1. A method for reconstructing the path of a front vehicle in a GPS-free vehicle-following scene, characterized by comprising the following steps:
Step 1: generating a reference path based on the relative positions of the front and rear vehicles at the initial moment;
Step 2: fusing the sensor's relative positioning information with the odometer information to obtain an optimal estimate of the relative pose;
Step 3: constructing a kinematic model from the optimal relative-pose estimate, initializing and screening the rear vehicle's path, and generating an effective path.
2. The front vehicle path reconstruction method according to claim 1, wherein in step 1, according to the relative pose at the initial moment, a vehicle coordinate system is established with the rear vehicle's center of mass as the origin and the heading of the vehicle as the positive x-axis, and a cubic curve is fitted to the front vehicle's relative position and heading angle:

f(x) = ax³ + bx² + cx + d

wherein a, b, c and d denote the coefficients of the reference path curve; since the curve passes through the origin of the coordinate system and the slope of the tangent at the origin is 0, c = d = 0; and since the front vehicle, at relative pose (Δx₀, Δy₀, Δθ₀), lies on the curve at the initial moment, the following conditions are satisfied:

aΔx₀³ + bΔx₀² = Δy₀
3aΔx₀² + 2bΔx₀ = tan Δθ₀

solving this system of equations gives the values of a and b:

a = (Δx₀ tan Δθ₀ − 2Δy₀)/Δx₀³, b = (3Δy₀ − Δx₀ tan Δθ₀)/Δx₀²

setting the number of path points contained in the initialized path to N₀ and performing equidistant interpolation on the cubic curve yields the position information of the path points in the initialized path sequence, and the slope of the curve at each point gives the pose information of the initial path sequence.
3. The front vehicle path reconstruction method according to claim 1, wherein in step 2 a kinematic equation is established and discretized; in continuous form the relative-motion model is:

Δẋ = vₓ₁ cos Δθ − vₓ₂ + ω₂Δy
Δẏ = vₓ₁ sin Δθ − ω₂Δx
Δθ̇ = ω₁ − ω₂

wherein Δx denotes the longitudinal relative position of the front vehicle's center of mass in the rear vehicle's coordinate system; Δy denotes the lateral relative position of the front vehicle's center of mass in the rear vehicle's coordinate system; Δθ denotes the heading angle of the front vehicle relative to the rear vehicle; ω₁ denotes the yaw rate of the front vehicle; ω₂ denotes the yaw rate of the rear vehicle; vₓ₁ denotes the longitudinal speed of the front vehicle; and vₓ₂ denotes the longitudinal speed of the rear vehicle.
4. The front vehicle path reconstruction method according to claim 3, wherein in step 2, using the extended Kalman filter algorithm, a prior estimate of the system state at time k is obtained from the system state at time k−1 via the state equation, expressed as:

X̂ₖ⁻ = f(X̂ₖ₋₁)

wherein X̂ₖ₋₁ is the posterior estimate at time k−1; the covariance corresponding to the prior state estimate at time k can be expressed as:

Pₖ⁻ = J_f Pₖ₋₁ J_fᵀ + Q
6. The front vehicle path reconstruction method according to claim 4, wherein in step 2 the optimal estimate of the system at time k is:

X̂ₖ = X̂ₖ⁻ + ΘₖKₖ(Zₖ − X̂ₖ⁻)

wherein X̂ₖ⁻ denotes the predicted value at time k; Zₖ denotes the measured value at time k; Kₖ denotes the Kalman gain at time k; Θₖ denotes the availability of sensing-system data: Θₖ = 0 when sensing information is unavailable, and Θₖ = 1 when sensing-system data are available;
the optimal estimate, recorded as (Δx̂ₖ, Δŷₖ, Δθ̂ₖ)ᵀ, denotes the longitudinal relative position, the lateral relative position, and the relative heading angle of the front vehicle's center of mass in the rear vehicle's coordinate system;
the Kalman gain at time k can be expressed as:

Kₖ = Pₖ⁻ J_hᵀ(J_h Pₖ⁻ J_hᵀ + R)⁻¹

wherein J_h, the Jacobian matrix of the measurement function, is a unit diagonal matrix.
7. The front vehicle path reconstruction method according to claim 6, wherein in step 2, for the extended Kalman filter algorithm to update continuously, the covariance of the system's optimal state estimate is updated:

Pₖ = (I − ΘₖKₖJ_h)Pₖ⁻

wherein I denotes the identity matrix;
this yields the optimal estimate of the relative pose at time k, recorded as (Δx̂ₖ, Δŷₖ, Δθ̂ₖ)ᵀ, denoting the longitudinal relative position, the lateral relative position, and the relative heading angle of the front vehicle's center of mass in the rear vehicle's coordinate system.
8. The front vehicle path reconstruction method according to claim 6, wherein in step 3, when the system sampling time T is sufficiently small, the pose change of the rear vehicle within the time interval [(k−1)T, kT] can be expressed as δq = (δx, δy, δθ)ᵀ according to the single-track kinematic model of the vehicle, wherein vₓ denotes the longitudinal speed of the rear vehicle, l_f and l_r denote the distances from the rear vehicle's front axle and rear axle, respectively, to its center of mass, δ_f denotes the front-wheel steering angle of the rear vehicle, and ω denotes the yaw rate of the rear vehicle; the pose information in the vehicle coordinate system at time k−1 is converted into the vehicle coordinate system at time k:

xₖ = (xₖ₋₁ − δx)cos δθ + (yₖ₋₁ − δy)sin δθ
yₖ = −(xₖ₋₁ − δx)sin δθ + (yₖ₋₁ − δy)cos δθ
θₖ = θₖ₋₁ − δθ
9. The front vehicle path reconstruction method according to claim 8, wherein in step 3 it is assumed that at time t = kT the path sequence length is Nₖ₋₁; after the coordinate-system transformation there exists a positive integer i₀ marking the first path point whose longitudinal coordinate satisfies x ≥ 0, and the elimination of waypoints is then performed by removing the i₀ − 1 path points that lie behind the rear vehicle's center of mass from the sequence.
10. The front vehicle path reconstruction method according to claim 9, wherein in step 3 a newly added path point satisfies the inequality relation

d_min ≤ d ≤ d_max

wherein d is the distance between the newly added point and the last point in the sequence, d_min denotes the lower limit of the designed path-point jump distance range, and d_max denotes the upper limit of the designed path-point jump distance range;
a path point satisfying the inequality relation is added to the path sequence together with the estimates of the front vehicle's longitudinal speed and yaw rate at that moment, and the path is then represented as Mₖ; if the inequality relation is not satisfied, no new path point is added.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111215298.4A CN113870316B (en) | 2021-10-19 | 2021-10-19 | Front vehicle path reconstruction method under GPS-free following scene |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111215298.4A CN113870316B (en) | 2021-10-19 | 2021-10-19 | Front vehicle path reconstruction method under GPS-free following scene |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113870316A true CN113870316A (en) | 2021-12-31 |
CN113870316B CN113870316B (en) | 2023-08-15 |
Family
ID=79000236
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111215298.4A Active CN113870316B (en) | 2021-10-19 | 2021-10-19 | Front vehicle path reconstruction method under GPS-free following scene |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113870316B (en) |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107015238A (en) * | 2017-04-27 | 2017-08-04 | 睿舆自动化(上海)有限公司 | Unmanned vehicle autonomic positioning method based on three-dimensional laser radar |
CN108519773A (en) * | 2018-03-07 | 2018-09-11 | 西安交通大学 | The paths planning method of automatic driving vehicle under a kind of structured environment |
CN108646763A (en) * | 2018-07-18 | 2018-10-12 | 扬州大学 | A kind of autonomous driving trace tracking and controlling method |
CN109668569A (en) * | 2018-12-08 | 2019-04-23 | 华东交通大学 | Path rapid generation in a kind of intelligent driving |
CN109991972A (en) * | 2017-12-29 | 2019-07-09 | 长城汽车股份有限公司 | Control method, apparatus, vehicle and the readable storage medium storing program for executing of vehicle driving |
CN110806215A (en) * | 2019-11-21 | 2020-02-18 | 北京百度网讯科技有限公司 | Vehicle positioning method, device, equipment and storage medium |
CN111231965A (en) * | 2020-01-14 | 2020-06-05 | 北京小马智行科技有限公司 | Method and device for adjusting vehicle control mode and unmanned vehicle |
US20200180611A1 (en) * | 2018-12-07 | 2020-06-11 | Volkswagen Aktiengesellschaft | Driver Assistance System For A Motor Vehicle, Motor Vehicle And Method For Operating A Motor Vehicle |
CN111338340A (en) * | 2020-02-21 | 2020-06-26 | 天津大学 | Model prediction-based unmanned automobile local path planning method |
CN111707272A (en) * | 2020-06-28 | 2020-09-25 | 湖南大学 | Underground garage automatic driving laser positioning system |
CN111750887A (en) * | 2020-06-11 | 2020-10-09 | 上海交通大学 | Unmanned vehicle trajectory planning method and system for reducing accident severity |
CN112130559A (en) * | 2020-08-21 | 2020-12-25 | 同济大学 | Indoor pedestrian following and obstacle avoiding method based on UWB and laser radar |
CN112904849A (en) * | 2021-01-18 | 2021-06-04 | 北京科技大学 | Integrated automatic driving automobile lane change track planning and tracking control method and system |
CN113386795A (en) * | 2021-07-05 | 2021-09-14 | 西安电子科技大学芜湖研究院 | Intelligent decision-making and local track planning method for automatic driving vehicle and decision-making system thereof |
CN113433947A (en) * | 2021-07-15 | 2021-09-24 | 天津大学 | Intersection trajectory planning and control method based on obstacle vehicle estimation and prediction |
Non-Patent Citations (5)
Title |
---|
HANZHANG XUE et al.: "IMU-Aided High-Frequency Lidar Odometry for Autonomous Driving", Applied Sciences, vol. 9, pages 1-20 *
HUA Zuxu: "A Survey of Path-Tracking Control Algorithms for Autonomous Vehicles", Equipment Manufacturing Technology, no. 5, pages 100-103 *
XIAO Junhao et al.: "Extrinsic Calibration of a Monocular Camera and 3D LiDAR with Fused Odometry", Robot, vol. 43, no. 1, pages 17-28 *
TAN Guangxing et al.: "Vehicle Driving State Estimation Based on Extended Kalman Filtering", Journal of Guangxi University of Science and Technology, vol. 31, no. 1, pages 18-24 *
ZHAO Lingfeng: "Research on Lateral and Longitudinal Control Algorithms for High-Speed Unmanned Vehicles", China Masters' Theses Full-text Database, Engineering Science & Technology II, no. 06, pages 035-200 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106476728B (en) | Motion compensation for vehicle-mounted vehicle sensors | |
US20210403032A1 (en) | Two-level path planning for autonomous vehicles | |
EP3678110B1 (en) | Method for correcting positional error and device for correcting positional error in a drive-assisted vehicle | |
Son et al. | Robust multirate control scheme with predictive virtual lanes for lane-keeping system of autonomous highway driving | |
US9645250B2 (en) | Fail operational vehicle speed estimation through data fusion of 6-DOF IMU, GPS, and radar | |
Kang et al. | Comparative evaluation of dynamic and kinematic vehicle models | |
CN112486156B (en) | Automatic tracking control system and control method for vehicle | |
CN110525436A (en) | Vehicle lane-changing control method, device, vehicle and storage medium | |
Xia et al. | Advancing estimation accuracy of sideslip angle by fusing vehicle kinematics and dynamics information with fuzzy logic | |
KR102024986B1 (en) | Apparatus and method for lane Keeping control | |
Liu et al. | Slip-aware motion estimation for off-road mobile robots via multi-innovation unscented Kalman filter | |
CN112433531A (en) | Trajectory tracking method and device for automatic driving vehicle and computer equipment | |
CN112099378B (en) | Front vehicle lateral motion state real-time estimation method considering random measurement time lag | |
CN114912061B (en) | Accurate assessment method for lane keeping auxiliary system of commercial vehicle | |
CN115923839A (en) | Vehicle path planning method | |
CN115303265A (en) | Vehicle obstacle avoidance control method and device and vehicle | |
Zindler et al. | Real-time ego-motion estimation using Lidar and a vehicle model based Extended Kalman Filter | |
Berntorp et al. | Bayesian sensor fusion of GNSS and camera with outlier adaptation for vehicle positioning | |
Katriniok | Optimal vehicle dynamics control and state estimation for a low-cost GNSS-based collision avoidance system | |
Zhou et al. | Vision-based control of an industrial vehicle in unstructured environments | |
CN113870316A (en) | Front vehicle path reconstruction method under scene without GPS vehicle following | |
KR102271913B1 (en) | Apparatus for determining position of vehicle and method thereof | |
KR101733880B1 (en) | Apparatus and method for estimating position of vehicle using nonlinear tire model | |
CN117146838A (en) | Path planning method and device and related products | |
EP2364891B1 (en) | Method for threat assessment in a vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||