CN109885066B - Motion trail prediction method and device

Info

Publication number: CN109885066B
Application number: CN201910232415.4A
Authority: CN (China)
Prior art keywords: obstacle, target, track, lane, information
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN109885066A
Inventors: 葛彦悟, 万国强, 朱明�, 刘祥
Current/Original Assignee: Beijing Jingwei Hirain Tech Co Ltd
Application filed by Beijing Jingwei Hirain Tech Co Ltd; priority to CN201910232415.4A; publication of CN109885066A; application granted; publication of CN109885066B

Abstract

The invention provides a motion trajectory prediction method and device. After obstacle motion information and lane line information are obtained, a short-time driving track of a target obstacle within a first preset time is determined based on the obstacle motion information; a long-time driving track of the target obstacle within a second preset time is then determined from the obstacle motion information and the lane line information; finally, the short-time driving track and the long-time driving track are fused to obtain a predicted driving track of the target obstacle. Because the predicted driving track of the target obstacle is known during automatic driving control, the obstacle can be avoided in advance under working conditions such as a close-range vehicle cut-in.

Description

Motion trail prediction method and device
Technical Field
The invention relates to the field of automatic driving, in particular to a motion trail prediction method and device.
Background
A vehicle with an Advanced Driving Assistance System (ADAS) collects information about obstacles on its driving road in real time through a camera, a radar, or other sensors mounted on the vehicle.
However, when an obstacle is detected, the vehicle only acquires the obstacle's current relative position. Automatic driving control based on the current relative position alone tends to cause overly hard braking, or even braking failure and collision, under some working conditions such as a close-range vehicle cut-in.
Disclosure of Invention
In view of this, the present invention provides a motion trajectory prediction method and device to solve the following problem: the vehicle only collects the current relative position of the obstacle, and automatic driving control based on that position alone tends to cause overly hard braking, or even braking failure and collision, under some working conditions such as a close-range vehicle cut-in.
In order to solve the technical problems, the invention adopts the following technical scheme:
a motion trajectory prediction method includes:
carrying out information fusion on initial obstacle motion information of a target obstacle, which is respectively acquired by a millimeter wave radar and a camera, so as to obtain obstacle motion information;
carrying out information fusion on the road lane line information detected by the camera and the road lane line information output by the high-precision map to obtain the lane line information;
determining a short-time running track of the target obstacle within a first preset time based on the obstacle motion information;
determining a long-time running track of the target obstacle within a second preset time according to the obstacle motion information and the lane line information; the second preset time is longer than the first preset time;
and carrying out track fusion on the short-time running track and the long-time running track to obtain a predicted running track of the target obstacle.
Preferably, performing track fusion on the short-time driving track and the long-time driving track to obtain the predicted driving track of the target obstacle includes:
the predicted driving track Trajectory_final(t) is calculated as:
Trajectory_final(t) = f(t) · Trajectory_model(t) + (1 − f(t)) · Trajectory_maneuver(t)
where f(t) is the fusion weight coefficient, Trajectory_model(t) is the short-time driving track, and Trajectory_maneuver(t) is the long-time driving track.
Preferably, determining the short-time driving track of the target obstacle within the first preset time based on the obstacle motion information includes:
obtaining a vehicle kinematic model, the vehicle kinematic model being used to predict the driving track of the target obstacle; and
determining the short-time driving track Trajectory_model(t) from the obstacle motion information and the vehicle kinematic model:
Trajectory_model(t) = (x_mdl(t), y_mdl(t))
where x_mdl(t) is the predicted longitudinal distance of the short-time driving track and y_mdl(t) is the predicted lateral distance of the short-time driving track.
Preferably, determining a long-time driving track of the target obstacle within a second preset time according to the obstacle motion information and the lane line information includes:
determining driving track state information X_path of the target obstacle and a target point on a target lane line closest to the target obstacle, based on the lane line information and the obstacle motion information:
X_path = [Dis2LineRatio_obj, θ_obj, γ_obj]^T
where Dis2LineRatio_obj is the ratio of the distance between the target obstacle and the center line of its current lane to the lane width; θ_obj is the heading of the target obstacle; and T denotes the transpose of a vector/matrix;
the trajectory curvature γ_obj is the ratio of the angular velocity ω_obj of the target obstacle to its velocity v_obj:
γ_obj = ω_obj / v_obj;
determining target lane line state information X_lane = [0, θ_lane, γ_lane]^T at the target point on the target lane line, where θ_lane is the heading at the target point, γ_lane is the curvature at the target point, and T denotes the transpose of a vector/matrix;
determining a current track deviation value of a current driving track of the target obstacle and a central line of a current lane where the target obstacle is located according to the state information of the target lane line and the state information of the driving track;
determining a target driving lane of the target obstacle based on the current trajectory deviation value;
and determining a long-time driving track of the target obstacle based on the target driving lane and the obstacle motion information.
Preferably, determining a target driving lane of the target obstacle based on the current trajectory deviation value includes:
acquiring a historical track deviation value;
calculating the deviation sum of the current trajectory deviation value and the historical trajectory deviation value;
if the sum of the deviations is smaller than or equal to a first preset threshold value, determining that a target driving lane of the target obstacle is a current lane where the target obstacle is located;
if the sum of the deviations is larger than the first preset threshold value, determining the change trend of the track deviation based on the historical track deviation value and the current track deviation value;
if the change trend is continuously reduced, determining that the target driving lane of the target obstacle is the current lane where the vehicle is located;
and if the change trend is continuously increased, determining that the target driving lane of the target obstacle is a lane with a smaller deviation value with the current driving track of the target obstacle in the adjacent lanes of the current lane where the target obstacle is located.
Preferably, determining the long-time driving track of the target obstacle based on the target driving lane and the target obstacle motion information includes:
determining a plurality of long-time driving tracks within the second preset time based on the target driving lane and the target obstacle motion information; and
screening out, from the plurality of long-time driving tracks, a long-time driving track Trajectory_maneuver(t) that satisfies a preset track screening rule:
Trajectory_maneuver(t) = (x_mane(t), y_mane(t))
where x_mane(t) is the predicted longitudinal distance of the long-time track and y_mane(t) is the predicted lateral distance of the long-time track.
Preferably, the information fusion is performed on initial obstacle motion information of the target obstacle detected by the millimeter wave radar and the camera respectively to obtain obstacle motion information, and the method includes:
acquiring historical obstacle motion information respectively acquired by the millimeter wave radar and the camera;
calculating the similarity of the target obstacle in the motion information of each target obstacle respectively detected by the millimeter wave radar and the camera; the target obstacle movement information includes the initial obstacle movement information and the historical obstacle movement information;
when all the similarities are within a preset threshold, performing weighted fusion on the initial obstacle motion information respectively detected by the millimeter wave radar and the camera, to obtain the obstacle motion information X:
X = [x, y, v_x, v_y, a_x, a_y]^T
where x and y are the longitudinal and lateral distances from the target obstacle to the host vehicle, v_x and v_y are the longitudinal and lateral relative velocities of the target obstacle with respect to the host vehicle, a_x and a_y are the longitudinal and lateral relative accelerations of the target obstacle with respect to the host vehicle, and T denotes the transpose of a vector/matrix.
Preferably, performing information fusion on the road lane line information detected by the camera and the road lane line information output by the high-precision map to obtain the lane line information includes:
the lane line information is calculated as:

FusionLanes = CamLanes, if CamQuality > Const1
FusionLanes = HDMapLanes, if CamQuality ≤ Const1 and PosQuality > Const2

where FusionLanes represents the fused lane line information; CamLanes represents the road lane line information detected by the camera; CamQuality represents the lane line detection quality value of the camera; HDMapLanes represents the road lane line information output by the high-precision map; PosQuality represents the lane line positioning quality value of the high-precision map; and Const1 and Const2 are constants.
A motion trajectory prediction apparatus comprising:
the movement information determining module is used for carrying out information fusion on initial obstacle movement information of the target obstacle, which is respectively acquired by the millimeter wave radar and the camera, so as to obtain obstacle movement information;
the lane line information determining module is used for carrying out information fusion on the lane line information detected by the camera and the lane line information output by the high-precision map to obtain the lane line information;
the short-time track prediction module is used for determining a short-time running track of the target obstacle in first preset time based on the obstacle motion information;
the long-time track prediction module is used for determining a long-time running track of the target obstacle within second preset time according to the obstacle motion information and the lane line information; the second preset time is longer than the first preset time;
and the track fusion module is used for carrying out track fusion on the short-time running track and the long-time running track to obtain the predicted running track of the target obstacle.
Preferably, the trajectory fusion module is configured to perform trajectory fusion on the short-time travel trajectory and the long-time travel trajectory to obtain a predicted travel trajectory of the target obstacle, and specifically configured to:
the predicted driving track Trajectory_final(t) is calculated as:
Trajectory_final(t) = f(t) · Trajectory_model(t) + (1 − f(t)) · Trajectory_maneuver(t)
where f(t) is the fusion weight coefficient, Trajectory_model(t) is the short-time driving track, and Trajectory_maneuver(t) is the long-time driving track.
Compared with the prior art, the invention has the following beneficial effects:
After the obstacle motion information and the lane line information are obtained, the short-time driving track of the target obstacle within the first preset time is first determined based on the obstacle motion information; the long-time driving track of the target obstacle within the second preset time is then determined from the obstacle motion information and the lane line information; finally, the two tracks are fused to obtain the predicted driving track of the target obstacle. Because the predicted driving track of the target obstacle is known during automatic driving control, the obstacle can be avoided in advance under working conditions such as a close-range vehicle cut-in.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. The drawings in the following description show only embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for predicting a motion trajectory according to an embodiment of the present invention;
fig. 2 is a flowchart of another method for predicting a motion trajectory according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a variation of fusion weighting coefficients according to an embodiment of the present invention;
fig. 4 is a flowchart of a method for predicting a motion trajectory according to another embodiment of the present invention;
fig. 5 is a schematic structural diagram of a motion trajectory prediction apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention. The described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
The embodiment of the invention provides a motion trail prediction method, which can be applied to an automatic driving controller, and can comprise the following steps of:
and S11, performing information fusion on initial obstacle motion information of the target obstacle respectively acquired by the millimeter wave radar and the camera to obtain obstacle motion information.
The target obstacle may be a movable obstacle such as a vehicle or a pedestrian, but is preferably a vehicle. The obstacle motion information of the target obstacle is data in a vehicle coordinate system with the vehicle as an origin. The obstacle motion information may include position, velocity, acceleration, and the like.
Specifically, the millimeter wave radar senses the running environment of the automatic driving vehicle in real time and outputs the initial obstacle movement information of the detected target obstacle.
It should be noted that the millimeter wave radar can detect the obstacle motion information of a plurality of obstacles, including the target obstacle. If obstacles are detected to the front-left, to the front-right, and directly ahead of the vehicle, initial obstacle motion information is determined for each of them.
The camera senses the running environment of the automatic driving vehicle in real time and outputs the detected initial obstacle motion information of the target obstacle and the road lane line information of the road where the vehicle is located.
In a preferred implementation manner of the present invention, referring to fig. 2, step S11 may include:
and S21, acquiring historical obstacle motion information respectively acquired by the millimeter wave radar and the camera.
Specifically, the millimeter wave radar and the camera acquire obstacle motion information in real time; the information acquired at the current moment is used as the initial obstacle motion information, and the information acquired historically is used as the historical obstacle motion information.
And S22, calculating the similarity of the target obstacles in the motion information of each target obstacle detected by the millimeter wave radar and the camera respectively.
The target obstacle motion information includes the initial obstacle motion information and the historical obstacle motion information.
S23, when all the similarities are within a preset threshold, perform weighted fusion on the initial obstacle motion information respectively detected by the millimeter wave radar and the camera, to obtain the obstacle motion information X:
X = [x, y, v_x, v_y, a_x, a_y]^T
where x and y are the longitudinal and lateral distances from the target obstacle to the host vehicle, v_x and v_y are the longitudinal and lateral relative velocities of the target obstacle with respect to the host vehicle, a_x and a_y are the longitudinal and lateral relative accelerations of the target obstacle with respect to the host vehicle, and T denotes the transpose of a vector/matrix.
Specifically, both the millimeter wave radar and the camera can detect obstacle motion information for a plurality of obstacles, including the target obstacle. However, an obstacle detected by the camera and an obstacle detected by the millimeter wave radar may or may not be the same physical obstacle. For example, the camera may detect seven obstacles while the millimeter wave radar detects five, and the target obstacle whose driving track is to be predicted must be identified from among them.
Specifically, the process of determining the target obstacle from the plurality of obstacles is as follows:
and respectively comparing the similarity of the position, the speed, the acceleration and other information of the obstacles of the current frame and the first preset number of historical frames acquired by the camera and the millimeter wave radar, wherein the similarity measurement mode comprises an Euclidean distance, a Mahalanobis distance and the like. That is, the similarity between the information such as the position, speed, acceleration, etc. of each obstacle in each frame acquired by the camera and the information such as the position, speed, acceleration, etc. of each obstacle in each frame acquired by the millimeter wave radar is compared. If the similarity of the obstacles in each frame is larger, if the similarity is within a preset threshold value, the same obstacle is determined and is used as the target obstacle.
After the target obstacle is determined, fusing initial obstacle information detected by the millimeter wave radar and the camera, for example, weighting fusion may be performed, that is, setting a weight value of initial obstacle motion information detected by the millimeter wave radar and the camera, and then performing weighting summation.
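As an illustrative sketch of the association and weighted-fusion steps above (the function names, the Euclidean distance metric, and the weight value are assumptions, not taken from the patent):

```python
import math

def state_distance(a, b):
    """Euclidean distance between two obstacle state vectors
    [x, y, vx, vy, ax, ay]; a Mahalanobis distance is an alternative."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def same_obstacle(cam_frames, radar_frames, threshold=2.0):
    """Decide whether a camera track and a radar track describe the same
    obstacle: their states must stay similar (within the threshold) over
    the current frame and a first preset number of historical frames."""
    return all(state_distance(c, r) < threshold
               for c, r in zip(cam_frames, radar_frames))

def fuse_motion(cam_state, radar_state, w_radar=0.6):
    """Weighted fusion of the two sensors' current measurements into the
    obstacle motion information X. The weight is illustrative; in practice
    it would reflect each sensor's accuracy."""
    return [w_radar * r + (1.0 - w_radar) * c
            for c, r in zip(cam_state, radar_state)]
```

With per-axis weights instead of a single scalar, the fusion could favor the radar longitudinally and the camera laterally.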
It should be noted that, when only one of the millimeter wave radar and the camera is installed, the initial obstacle motion information collected by that sensor may be used directly as the obstacle motion information, i.e., without information fusion.
And S12, carrying out information fusion on the road lane line information detected by the camera and the road lane line information output by the high-precision map to obtain the lane line information.
Specifically, as described above, the camera may detect the lane line information of the road on which the host vehicle is located, and the lane line information is generally the position of the lane line of one, two, or three lanes.
The high-precision map system stores lane line information of different roads, and the lane line information of the road where the automatic driving vehicle is located can be output in real time.
In a preferred implementation manner of the present invention, step S12 may include:
the formula for calculating the lane line information is as follows:
Figure BDA0002007143030000081
wherein, fusion lanes represents the fused lane line information; CamLanes represents road lane line information detected by the camera, and Camquality represents a lane line detection quality value of the camera;
HDMapLanes represents the road and lane line information output by the high-precision map; PosQuality represents a lane line positioning quality value of the high-precision map; const1 and Const2 are constants. CamQuality and PosQuality may be obtained in advance.
Specifically, when the lane line detection quality value CamQuality of the camera is greater than Const1, the lane line information detected by the camera is used as the lane line information. When CamQuality is less than or equal to Const1 and the lane line positioning quality value PosQuality of the high-precision map system is greater than Const2, the lane line information output by the high-precision map is used instead.
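A minimal sketch of this quality-gated selection (the function name, the threshold values, and the None fallback when neither source is trustworthy are illustrative assumptions):

```python
def fuse_lanes(cam_lanes, cam_quality, hdmap_lanes, pos_quality,
               const1=0.8, const2=0.9):
    """Select the lane-line source according to the quality gating:
    camera lanes when CamQuality > Const1, otherwise high-precision map
    lanes when PosQuality > Const2; None when neither is reliable."""
    if cam_quality > const1:
        return cam_lanes
    if pos_quality > const2:
        return hdmap_lanes
    return None
```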
And S13, determining the short-time running track of the target obstacle in the first preset time based on the obstacle motion information.
And S14, determining the long-time running track of the target obstacle in a second preset time according to the obstacle motion information and the lane line information.
The second preset time is greater than the first preset time. The first preset time and the second preset time are both set in advance by technicians according to the use scene.
And S15, carrying out track fusion on the short-time running track and the long-time running track to obtain the predicted running track of the target obstacle.
Specifically, in the embodiment, the driving tracks of two different time periods are predicted, and the two driving tracks are subjected to track fusion, so that the accuracy of the predicted track is further ensured.
In this embodiment, after obtaining the obstacle motion information and the lane line information, the short-time travel track of the target obstacle in the first preset time may be determined based on the obstacle motion information, the long-time travel track of the target obstacle in the second preset time may be determined according to the obstacle motion information and the lane line information, and finally the short-time travel track and the long-time travel track are subjected to track fusion to obtain the predicted travel track of the target obstacle. The invention can know the predicted running track of the target obstacle during automatic driving control, and avoid the obstacle in advance when the target obstacle performs working conditions such as close-distance vehicle cut-in and the like.
Optionally, on the basis of any of the foregoing embodiments, step S15 may include:
The predicted driving track Trajectory_final(t) is calculated as:
Trajectory_final(t) = f(t) · Trajectory_model(t) + (1 − f(t)) · Trajectory_maneuver(t)
where f(t) is the fusion weight coefficient, whose form is shown in fig. 3; Trajectory_model(t) is the short-time driving track; and Trajectory_maneuver(t) is the long-time driving track.
The short-time and long-time driving tracks are fused to obtain the final predicted track Trajectory_final. The fusion criterion is: within [0, T1], the short-time driving track is given the larger weight, which gradually decreases; within [T1, T2], the long-time driving track is given the larger weight, which gradually increases.
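The fusion rule can be sketched as follows. The exact shape of f(t) in fig. 3 is not reproduced here, so the piecewise-linear form below is only a plausible assumption satisfying the stated criterion (weight above 0.5 and decreasing on [0, T1], below 0.5 on [T1, T2]):

```python
def fusion_weight(t, t1, t2):
    """An assumed piecewise-linear fusion weight f(t): 1.0 at t=0,
    0.5 at t=t1, 0.0 at t=t2 (the real form is shown in fig. 3)."""
    if t <= t1:
        return 1.0 - 0.5 * t / t1
    return 0.5 * max(0.0, (t2 - t) / (t2 - t1))

def fuse_trajectories(traj_model, traj_maneuver, t1, t2):
    """Trajectory_final(t) = f(t)*Trajectory_model(t)
    + (1 - f(t))*Trajectory_maneuver(t), where each trajectory maps a
    time t to an (x, y) position."""
    def traj_final(t):
        f = fusion_weight(t, t1, t2)
        xm, ym = traj_model(t)
        xv, yv = traj_maneuver(t)
        return (f * xm + (1 - f) * xv, f * ym + (1 - f) * yv)
    return traj_final
```

At t = T1 the two tracks contribute equally; beyond T2 only the long-time track remains.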
Optionally, on the basis of this embodiment, step S13 may include:
obtaining a vehicle kinematic model, and determining the short-time driving track Trajectory_model(t) from the obstacle motion information and the vehicle kinematic model:
Trajectory_model(t) = (x_mdl(t), y_mdl(t))
The vehicle kinematic model is used to predict the driving track of the target obstacle, and may be a constant-velocity model, a constant-acceleration model, or the like. x_mdl(t) is the predicted longitudinal distance of the short-time driving track, and y_mdl(t) is the predicted lateral distance.
Specifically, short-time trajectory prediction is performed based on the obstacle motion information and the vehicle kinematic model, whose effective time is the first preset time T1. The predicted short-time trajectory output Trajectory_model(t) is a series of positions of the target obstacle at future times.
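A sketch of short-time prediction under a constant-acceleration kinematic model, one of the model choices the text names (the sampling step dt is an assumption):

```python
def predict_short_term(state, t1, dt=0.1):
    """Short-time prediction with a constant-acceleration model.
    `state` is the fused obstacle motion vector [x, y, vx, vy, ax, ay];
    returns the series of future positions (x_mdl(t), y_mdl(t)) for
    t = dt, 2*dt, ..., T1."""
    x, y, vx, vy, ax, ay = state
    traj = []
    t = dt
    while t <= t1 + 1e-9:
        traj.append((x + vx * t + 0.5 * ax * t * t,
                     y + vy * t + 0.5 * ay * t * t))
        t += dt
    return traj
```

Setting ax = ay = 0 recovers the constant-velocity model as a special case.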
Optionally, on the basis of this embodiment, referring to fig. 4, step S14 may include:
S31, determining the driving track state information X_path of the target obstacle and a target point on the target lane line closest to the target obstacle, based on the lane line information and the obstacle motion information.
X_path = [Dis2LineRatio_obj, θ_obj, γ_obj]^T
where Dis2LineRatio_obj is the ratio of the distance between the target obstacle and the center line of its current lane to the lane width; θ_obj is the heading of the target obstacle; and T denotes the transpose of a vector/matrix.
The trajectory curvature γ_obj is the ratio of the angular velocity ω_obj of the target obstacle to its velocity v_obj:
γ_obj = ω_obj / v_obj
Specifically, lane line data of the current lane of the target obstacle, such as the lane width and the center line position, can be obtained from the lane line information, and the target point on the target lane line closest to the target obstacle can be determined from the position of the target obstacle and the lane line information.
S32, determining the target lane line state information X_lane = [0, θ_lane, γ_lane]^T at the target point on the target lane line, where θ_lane is the heading at the target point, γ_lane is the curvature at the target point, and T denotes the transpose of a vector/matrix.
Specifically, the target lane line state information of the target point may be acquired from the lane line information.
And S33, determining a current track deviation value between the current driving track of the target obstacle and the central line of the current lane where the target obstacle is located according to the state information of the target lane line and the state information of the driving track.
Specifically, the target driving lane of the target obstacle can be determined from the deviation of the obstacle's current driving track from the center line of its current lane. The deviation comprises three components: the ratio Dis2LineRatio_obj of the distance between the target obstacle and the center line of its current lane to the lane width; the heading difference between the heading of the target obstacle and the heading at the target point; and the curvature difference between the trajectory curvature of the target obstacle and the curvature at the target point. The heading difference is obtained by subtracting the heading of the target obstacle from the heading at the target point, and the curvature difference by subtracting the trajectory curvature of the target obstacle from the curvature at the target point.
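The three-component deviation between X_path and X_lane can be sketched as follows, using the Euclidean distance form the text mentions as one option (the function name is illustrative):

```python
import math

def trajectory_deviation(x_path, x_lane):
    """Deviation between the obstacle's driving track state
    X_path = [Dis2LineRatio_obj, theta_obj, gamma_obj] and the lane state
    X_lane = [0, theta_lane, gamma_lane] at the nearest target point,
    expressed as a Euclidean distance over the three component
    differences (a Mahalanobis distance is an alternative)."""
    d_ratio = x_path[0] - x_lane[0]   # lateral-offset ratio (X_lane[0] is 0)
    d_theta = x_lane[1] - x_path[1]   # heading difference
    d_gamma = x_lane[2] - x_path[2]   # curvature difference
    return math.sqrt(d_ratio ** 2 + d_theta ** 2 + d_gamma ** 2)
```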
And S34, determining a target driving lane of the target obstacle based on the current track deviation value.
Specifically, the obstacle behavior is identified based on the obstacle motion information and the lane line information. The behavior to be recognized includes lane keeping or lane changing. The essence of behavior recognition is to predict the target driving lane of an obstacle. The target driving lane of the target obstacle is determined, that is, the predicted driving track of the target obstacle can be determined.
Optionally, on the basis of this embodiment, step S34 may include:
1) Acquire a historical track deviation value.
2) Calculate the sum of the current track deviation value and the historical track deviation values.
Specifically, whether the target obstacle is about to leave its current lane is judged from the current track deviation value of the target obstacle and a second preset number of historical track deviation values; the deviation can be expressed as a Euclidean distance, a Mahalanobis distance, or the like.
3) If the sum of the deviations is less than or equal to a first preset threshold D_threshold, determine that the target driving lane of the target obstacle is its current lane, and judge that the current driving behavior of the target obstacle is lane keeping.
4) If the sum of the deviations is greater than the first preset threshold, determine the change trend of the track deviation based on the historical track deviation values and the current track deviation value.
5) If the change trend is continuously decreasing, determine that the target driving lane of the target obstacle is the current lane of the host vehicle; that is, the target obstacle has just entered the host vehicle's lane from a side lane.
6) If the change trend is continuously increasing, determine that the target obstacle is changing lanes, and that its target driving lane is whichever of the two lanes adjacent to its current lane has the smaller deviation from its current driving track.
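The decision logic above can be sketched as a small routine (the return labels, the strict-monotonicity trend test, and the default branch are illustrative assumptions):

```python
def decide_target_lane(history, current, d_threshold):
    """Decide the target driving lane from the deviation history.
    `history` holds the last N deviation values (oldest first) and
    `current` the newest; returns 'keep', 'entering_host_lane', or
    'changing_lane'."""
    values = list(history) + [current]
    if sum(values) <= d_threshold:
        return "keep"                    # lane keeping
    decreasing = all(a > b for a, b in zip(values, values[1:]))
    if decreasing:
        return "entering_host_lane"      # just cut in from a side lane
    increasing = all(a < b for a, b in zip(values, values[1:]))
    if increasing:
        return "changing_lane"           # heading for the nearer adjacent lane
    return "keep"                        # no clear trend: keep by default
```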
And S35, determining the long-term driving track of the target obstacle based on the target driving lane and the obstacle motion information.
Optionally, on the basis of this embodiment, step S35 may include:
determining a plurality of candidate long-time driving tracks within the second preset time based on the target driving lane and the target obstacle motion information, and screening out a long-time driving track Trajectory_maneuver(t) that satisfies a preset track screening rule;

Trajectory_maneuver(t) = (x_mane(t), y_mane(t))

wherein x_mane is the predicted longitudinal distance of the long-time track; y_mane is the predicted lateral distance of the long-time track.
Specifically, the long-time driving trajectory is a trajectory from the current position of the target obstacle into the target driving lane, covering a second preset time. The maximum duration of a behavior such as lane keeping or lane changing is set as the second preset time T2, and the second preset time is longer than the first preset time. Within the second preset time, a polynomial track is generated at every interval Δt, yielding a long-time travel track set Trajectory_set.
From Trajectory_set, an optimal long-time driving track Trajectory_maneuver is selected according to a preset track screening rule, such as time length, track curvature, or track lateral acceleration.
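The generate-and-screen step might look like the following sketch, assuming cubic lateral polynomials (zero lateral velocity at both endpoints) and a weighted time/peak-lateral-acceleration cost. The patent names the screening criteria (time length, curvature, lateral acceleration) but not the polynomial order or the weights, so those are assumptions.

```python
def candidate_trajectory(y0, y_target, duration, dt=0.1):
    """Cubic lateral track from y0 to y_target over `duration` seconds,
    with zero lateral velocity at both endpoints."""
    points = []
    for i in range(int(duration / dt) + 1):
        s = (i * dt) / duration              # normalized time in [0, 1]
        y = y0 + (y_target - y0) * (3 * s**2 - 2 * s**3)
        points.append((i * dt, y))
    return points

def peak_lateral_accel(y0, y_target, duration):
    # For the cubic above, |y''(t)| is largest at the endpoints: 6|dy|/T^2.
    return 6.0 * abs(y_target - y0) / duration**2

def screen_trajectories(y0, y_target, durations, w_time=1.0, w_accel=2.0):
    """Pick the candidate minimizing a weighted cost of duration and peak
    lateral acceleration (a stand-in for the preset screening rule)."""
    best = min(durations, key=lambda T: w_time * T
               + w_accel * peak_lateral_accel(y0, y_target, T))
    return best, candidate_trajectory(y0, y_target, best)
```

With these weights, a slower lane change is preferred when it reduces lateral acceleration enough to outweigh the longer duration.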
In this embodiment, the target driving lane of the target obstacle is determined and its predicted driving track is computed, so that target obstacle information can be updated in advance and output to a downstream system such as adaptive cruise control. A target obstacle that is about to enter or leave the ego lane can thus be recognized earlier, and the host vehicle can be controlled earlier to brake, decelerate, or accelerate in response to a cutting-in obstacle, improving the safety and comfort of the autonomous vehicle on real roads.
Optionally, on the basis of the above embodiment of the motion trajectory prediction method, another embodiment of the present invention provides a motion trajectory prediction apparatus, and with reference to fig. 5, the motion trajectory prediction apparatus may include:
the movement information determining module 101 is configured to perform information fusion on initial obstacle movement information of the target obstacle, which is acquired by the millimeter wave radar and the camera, respectively, to obtain obstacle movement information;
the lane line information determining module 102 is configured to perform information fusion on the lane line information detected by the camera and the lane line information output by the high-precision map to obtain lane line information;
a short-time track prediction module 103, configured to determine a short-time driving track of the target obstacle within a first preset time based on the obstacle motion information;
a long-time trajectory prediction module 104, configured to determine a long-time driving trajectory of the target obstacle within a second preset time according to the obstacle motion information and the lane line information; the second preset time is longer than the first preset time;
and a track fusion module 105, configured to perform track fusion on the short-time travel track and the long-time travel track to obtain a predicted travel track of the target obstacle.
Optionally, on the basis of this embodiment, the motion information determining module 101 may include:
the information acquisition submodule is used for acquiring historical barrier motion information respectively acquired by the millimeter wave radar and the camera;
the similarity calculation sub-module is used for calculating the similarity of the target obstacle in the target obstacle motion information respectively detected by the millimeter wave radar and the camera; the target obstacle motion information includes the initial obstacle motion information and the historical obstacle motion information;
the fusion sub-module is used for performing weighted fusion on initial obstacle motion information respectively detected by the millimeter wave radar and the camera to obtain obstacle motion information X when all the similarities are within a preset threshold value;
X = [x, y, v_x, v_y, a_x, a_y]^T

wherein x and y are respectively the longitudinal and lateral distances from the target obstacle to the host vehicle; v_x and v_y are respectively the longitudinal and lateral relative velocities of the target obstacle relative to the host vehicle; a_x and a_y are respectively the longitudinal and lateral relative accelerations of the target obstacle relative to the host vehicle; and T represents the transpose of a vector/matrix.
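The similarity gate and weighted fusion of the radar and camera states might look like the sketch below. The component weights and gate thresholds are placeholders, since the patent does not give them; in practice they would reflect each sensor's per-component accuracy (e.g. radar trusted more for range and velocity, camera more for lateral position).

```python
import numpy as np

def similar(x_radar, x_camera, gates):
    """Similarity gate: accept the radar/camera pair only when every
    state component differs by no more than its gate threshold
    (a simple stand-in for the patent's unspecified similarity test)."""
    return all(abs(a - b) <= g for a, b, g in zip(x_radar, x_camera, gates))

def fuse_states(x_radar, x_camera, w_radar=0.6, w_camera=0.4):
    """Weighted fusion of the 6-D obstacle state
    X = [x, y, v_x, v_y, a_x, a_y]^T from radar and camera.
    The scalar weights here are illustrative placeholders."""
    return (w_radar * np.asarray(x_radar, float)
            + w_camera * np.asarray(x_camera, float))
```

The gate would be applied first; only pairs that pass it are fused, matching the "all similarities within a preset threshold" condition above.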
Optionally, on the basis of this embodiment, the lane line information determining module 102 is configured to perform information fusion on the lane line information detected by the camera and the lane line information output by the high-precision map, and when obtaining the lane line information, is specifically configured to:
the formula for calculating the lane line information is as follows:
[Formula omitted: rendered only as an image in the original publication. FusionLanes is computed from CamLanes, CamQuality, HDMapLanes, PosQuality, Const1, and Const2, defined below.]
wherein FusionLanes represents the fused lane line information; CamLanes represents the road lane line information detected by the camera; CamQuality represents the lane line detection quality value of the camera; HDMapLanes represents the road lane line information output by the high-precision map; PosQuality represents the lane line positioning quality value of the high-precision map; and Const1 and Const2 are constants.
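Since the fusion formula itself survives only as an image reference, the sketch below shows one plausible normalized quality-weighted blend using the same inputs; the exact functional form in the patent may differ.

```python
def fuse_lanes(cam_lanes, cam_quality, hdmap_lanes, pos_quality,
               const1=1.0, const2=1.0):
    """Blend camera and HD-map lane-line coefficients by their quality
    values. This normalized weighting is an assumption: the patent's
    actual formula is not reproduced in the text, only its inputs
    (CamLanes, CamQuality, HDMapLanes, PosQuality, Const1, Const2)."""
    w_cam = const1 * cam_quality
    w_map = const2 * pos_quality
    total = w_cam + w_map
    if total == 0.0:
        raise ValueError("both quality values are zero")
    return [(w_cam * c + w_map * m) / total
            for c, m in zip(cam_lanes, hdmap_lanes)]
```

With this form, a lane source with zero quality contributes nothing, and equal qualities yield a simple average of the two coefficient sets.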
In this embodiment, after obtaining the obstacle motion information and the lane line information, the short-time travel track of the target obstacle in the first preset time may be determined based on the obstacle motion information, the long-time travel track of the target obstacle in the second preset time may be determined according to the obstacle motion information and the lane line information, and finally the short-time travel track and the long-time travel track are subjected to track fusion to obtain the predicted travel track of the target obstacle. The invention can know the predicted running track of the target obstacle during automatic driving control, and avoid the obstacle in advance when the target obstacle performs working conditions such as close-distance vehicle cut-in and the like.
It should be noted that, for the working processes of each module and sub-module in this embodiment, please refer to the corresponding description in the above embodiments, which is not described herein again.
Optionally, on the basis of any of the foregoing embodiments, the trajectory fusion module 105 is configured to perform trajectory fusion on the short-time travel trajectory and the long-time travel trajectory to obtain a predicted travel trajectory of the target obstacle, and specifically configured to:
the predicted travel track Trajectory_final(t) is calculated as:

Trajectory_final(t) = f(t)·Trajectory_model(t) + (1 − f(t))·Trajectory_maneuver(t);

wherein f(t) is a fusion weight coefficient; Trajectory_model(t) is the short-time driving track; and Trajectory_maneuver(t) is the long-time driving track.
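Applied point-by-point, the fusion formula above might be sketched as follows. The patent does not specify f(t); an exponential decay exp(−t/τ) is assumed here so the kinematic (short-time) track dominates early and the maneuver (long-time) track dominates later.

```python
import math

def fuse_trajectories(short_traj, long_traj, times, tau=1.0):
    """Blend the two tracks point-by-point:
    Trajectory_final(t) = f(t)*Trajectory_model(t)
                          + (1 - f(t))*Trajectory_maneuver(t),
    with the assumed weight f(t) = exp(-t / tau)."""
    fused = []
    for t, (xs, ys), (xl, yl) in zip(times, short_traj, long_traj):
        f = math.exp(-t / tau)
        fused.append((f * xs + (1 - f) * xl,
                      f * ys + (1 - f) * yl))
    return fused
```

At t = 0 the fused point equals the short-time (kinematic) prediction; for t much larger than τ it converges to the long-time (maneuver) prediction.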
Optionally, on the basis of this embodiment, the short-time trajectory prediction module 103 is configured to, when determining the short-time travel trajectory of the target obstacle within the first preset time based on the obstacle motion information, specifically:
obtaining a vehicle kinematic model, and determining the short-time driving track Trajectory_model(t) according to the obstacle motion information and the vehicle kinematic model.
Wherein the vehicle kinematic model is used for predicting a driving track of the target obstacle;
Trajectory_model(t) = (x_mdl(t), y_mdl(t))

wherein x_mdl is the predicted longitudinal distance of the short-time driving track; y_mdl is the predicted lateral distance of the short-time driving track.
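A minimal sketch of the short-time prediction, assuming a constant-acceleration point-mass model applied to the fused state X = [x, y, v_x, v_y, a_x, a_y]. The patent only says "a vehicle kinematic model"; a point-mass model is used here for brevity (a bicycle model would be a common alternative).

```python
def predict_short_track(state, t_horizon=1.0, dt=0.1):
    """Short-time track (x_mdl(t), y_mdl(t)) over the first preset time
    t_horizon, stepping every dt seconds, under constant acceleration."""
    x, y, vx, vy, ax, ay = state
    track = []
    for i in range(int(t_horizon / dt) + 1):
        t = i * dt
        track.append((x + vx * t + 0.5 * ax * t * t,
                      y + vy * t + 0.5 * ay * t * t))
    return track
```

Each returned pair is one (longitudinal, lateral) sample of the short-time driving track.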
Optionally, on the basis of this embodiment, the long-term trajectory prediction module 104 may include:
a data determination submodule, configured to determine driving track state information X_path of the target obstacle and a target point on a target lane line closest to the target obstacle, based on the lane line information and the obstacle motion information;

X_path = [Dis2LineRatio_obj, θ_obj, γ_obj]^T

wherein Dis2LineRatio_obj is the ratio of the distance between the target obstacle and the center line of its current lane to the lane width; θ_obj is the heading of the target obstacle; and T denotes the transpose of a vector/matrix;

the trajectory curvature γ_obj is the ratio of the angular velocity ω_obj of the target obstacle to its velocity v_obj:

γ_obj = ω_obj / v_obj

an information determination submodule, configured to determine target lane line state information X_lane = [0, θ_lane, γ_lane]^T of the target point on the target lane line, wherein θ_lane is the heading at the target point, γ_lane is the curvature at the target point, and T denotes the transpose of a vector/matrix;
the deviation value determining submodule is configured to determine a current trajectory deviation value between the current driving trajectory of the target obstacle and the center line of its current lane, according to the target lane line state information and the driving track state information;
the lane determining submodule is used for determining a target driving lane of the target obstacle based on the current track deviation value;
and the track determining submodule is used for determining the long-time running track of the target obstacle based on the target running lane and the obstacle motion information.
Optionally, on the basis of this embodiment, the lane determining sub-module may include:
the deviation value acquisition unit is used for acquiring a history track deviation value;
the deviation calculation unit is used for calculating the sum of the deviation of the current track deviation value and the deviation of the historical track deviation value;
a first determining unit, configured to determine that a target driving lane of the target obstacle is a current lane in which the target obstacle is located, if the sum of the deviations is smaller than or equal to a first preset threshold;
the trend determining unit is used for determining the change trend of the track deviation based on the historical track deviation value and the current track deviation value if the sum of the deviations is larger than the first preset threshold value;
the second determining unit is configured to determine, if the change trend is continuously decreasing, that the target driving lane of the target obstacle is the current lane in which the host vehicle is located;
and the third determining unit is configured to determine, if the change trend is continuously increasing, that the target driving lane of the target obstacle is whichever of the lanes adjacent to the obstacle's current lane has the smaller deviation from the obstacle's current driving trajectory.
Optionally, on the basis of this embodiment, the trajectory determination submodule, when determining the long-time travel trajectory of the target obstacle based on the target travel lane and the obstacle motion information, is specifically configured to:
determining a plurality of long-term driving tracks in the second preset time based on the target driving lane and the target obstacle movement information;
screening out a long-time driving track Trajectory_maneuver(t) that satisfies a preset track screening rule from the candidate long-time driving tracks;

Trajectory_maneuver(t) = (x_mane(t), y_mane(t))

wherein x_mane is the predicted longitudinal distance of the long-time track; y_mane is the predicted lateral distance of the long-time track.
In this embodiment, the target driving lane of the target obstacle is determined and its predicted driving track is computed, so that target obstacle information can be updated in advance and output to a downstream system such as adaptive cruise control. A target obstacle that is about to enter or leave the ego lane can thus be recognized earlier, and the host vehicle can be controlled earlier to brake, decelerate, or accelerate in response to a cutting-in obstacle, improving the safety and comfort of the autonomous vehicle on real roads.
It should be noted that, for the working processes of each module, sub-module, and unit in this embodiment, please refer to the corresponding description in the above embodiments, which is not described herein again.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A motion trajectory prediction method is characterized by comprising the following steps:
carrying out information fusion on initial obstacle motion information of a target obstacle, which is respectively acquired by a millimeter wave radar and a camera, so as to obtain obstacle motion information;
carrying out information fusion on the road lane line information detected by the camera and the road lane line information output by the high-precision map to obtain the lane line information;
determining a short-time running track of the target obstacle within a first preset time based on the obstacle motion information;
determining a long-time running track of the target obstacle within a second preset time according to the obstacle motion information and the lane line information; the second preset time is longer than the first preset time;
and carrying out track fusion on the short-time running track and the long-time running track to obtain a predicted running track of the target obstacle.
2. The method for predicting a motion trajectory according to claim 1, wherein the performing trajectory fusion on the short-term travel trajectory and the long-term travel trajectory to obtain the predicted travel trajectory of the target obstacle includes:
the predicted travel track Trajectory_final(t) is calculated as:

Trajectory_final(t) = f(t)·Trajectory_model(t) + (1 − f(t))·Trajectory_maneuver(t);

wherein f(t) is a fusion weight coefficient; Trajectory_model(t) is the short-time driving track; and Trajectory_maneuver(t) is the long-time driving track.
3. The method according to claim 2, wherein the determining a short-time travel track of the target obstacle within a first preset time based on the obstacle motion information includes:
obtaining a vehicle kinematic model; the vehicle kinematic model is used for predicting the running track of the target obstacle;
determining the short-time driving track Trajectory_model(t) according to the obstacle motion information and the vehicle kinematic model;
Trajectory_model(t) = (x_mdl(t), y_mdl(t))

wherein x_mdl is the predicted longitudinal distance of the short-time driving track; y_mdl is the predicted lateral distance of the short-time driving track.
4. The method for predicting the movement track according to claim 2, wherein the determining the long-term travel track of the target obstacle within a second preset time according to the obstacle movement information and the lane line information comprises:
determining driving track state information X_path of the target obstacle and a target point on a target lane line closest to the target obstacle, based on the lane line information and the obstacle movement information;

X_path = [Dis2LineRatio_obj, θ_obj, γ_obj]^T

wherein Dis2LineRatio_obj is the ratio of the distance between the target obstacle and the center line of its current lane to the lane width; θ_obj is the heading of the target obstacle; and T is the transpose of a vector/matrix;

the trajectory curvature γ_obj is the ratio of the angular velocity ω_obj of the target obstacle to its velocity v_obj;

γ_obj = ω_obj / v_obj

determining target lane line state information X_lane = [0, θ_lane, γ_lane]^T of a target point on the target lane line, wherein θ_lane is the heading at the target point, γ_lane is the curvature at the target point, and T is the transpose of a vector/matrix;
determining a current track deviation value of a current driving track of the target obstacle and a central line of a current lane where the target obstacle is located according to the state information of the target lane line and the state information of the driving track;
determining a target driving lane of the target obstacle based on the current trajectory deviation value;
and determining a long-time driving track of the target obstacle based on the target driving lane and the obstacle motion information.
5. The motion trajectory prediction method according to claim 4, wherein the determining a target driving lane of the target obstacle based on the current trajectory deviation value includes:
acquiring a historical track deviation value;
calculating the deviation sum of the current trajectory deviation value and the historical trajectory deviation value;
if the sum of the deviations is smaller than or equal to a first preset threshold value, determining that a target driving lane of the target obstacle is a current lane where the target obstacle is located;
if the sum of the deviations is larger than the first preset threshold value, determining the change trend of the track deviation based on the historical track deviation value and the current track deviation value;
if the change trend is continuously decreasing, determining that the target driving lane of the target obstacle is the current lane in which the host vehicle is located;
and if the change trend is continuously increasing, determining that the target driving lane of the target obstacle is whichever of the lanes adjacent to the obstacle's current lane has the smaller deviation from the obstacle's current driving trajectory.
6. The motion trail prediction method according to claim 4, wherein determining the long-time travel trail of the target obstacle based on the target travel lane and the target obstacle motion information comprises:
determining a plurality of long-term driving tracks in the second preset time based on the target driving lane and the target obstacle movement information;
screening out a long-time driving track Trajectory_maneuver(t) that satisfies a preset track screening rule from the candidate long-time driving tracks;

Trajectory_maneuver(t) = (x_mane(t), y_mane(t))

wherein x_mane is the predicted longitudinal distance of the long-time track; y_mane is the predicted lateral distance of the long-time track.
7. The method for predicting the motion trail according to claim 1, wherein the step of performing information fusion on initial obstacle motion information of a target obstacle respectively detected by a millimeter wave radar and a camera to obtain the obstacle motion information comprises:
acquiring historical barrier movement information respectively acquired by the millimeter wave radar and the camera;
calculating the similarity of the target obstacle in the motion information of each target obstacle respectively detected by the millimeter wave radar and the camera; the target obstacle movement information includes the initial obstacle movement information and the historical obstacle movement information;
when all the similarity degrees are within a preset threshold value, performing weighted fusion on initial obstacle motion information respectively detected by the millimeter wave radar and the camera to obtain obstacle motion information X;
X = [x, y, v_x, v_y, a_x, a_y]^T

wherein x and y are respectively the longitudinal and lateral distances from the target obstacle to the host vehicle; v_x and v_y are respectively the longitudinal and lateral relative velocities of the target obstacle relative to the host vehicle; a_x and a_y are respectively the longitudinal and lateral relative accelerations of the target obstacle relative to the host vehicle; and T represents the transpose of a vector/matrix.
8. The method for predicting the movement track according to claim 1, wherein the step of performing information fusion on the road lane line information detected by the camera and the road lane line information output by the high-precision map to obtain the road lane line information comprises:
the formula for calculating the lane line information is as follows:
[Formula omitted: rendered only as an image in the original publication. FusionLanes is computed from CamLanes, CamQuality, HDMapLanes, PosQuality, Const1, and Const2, defined below.]
wherein FusionLanes represents the fused lane line information; CamLanes represents the road lane line information detected by the camera; CamQuality represents the lane line detection quality value of the camera; HDMapLanes represents the road lane line information output by the high-precision map; PosQuality represents the lane line positioning quality value of the high-precision map; and Const1 and Const2 are constants.
9. A motion trajectory prediction apparatus, comprising:
the movement information determining module is used for carrying out information fusion on initial obstacle movement information of the target obstacle, which is respectively acquired by the millimeter wave radar and the camera, so as to obtain obstacle movement information;
the lane line information determining module is used for carrying out information fusion on the lane line information detected by the camera and the lane line information output by the high-precision map to obtain the lane line information;
the short-time track prediction module is used for determining a short-time running track of the target obstacle in first preset time based on the obstacle motion information;
the long-time track prediction module is used for determining a long-time running track of the target obstacle within second preset time according to the obstacle motion information and the lane line information; the second preset time is longer than the first preset time;
and the track fusion module is used for carrying out track fusion on the short-time running track and the long-time running track to obtain the predicted running track of the target obstacle.
10. The movement trajectory prediction device according to claim 9, wherein the trajectory fusion module is configured to perform trajectory fusion on the short-time travel trajectory and the long-time travel trajectory to obtain the predicted travel trajectory of the target obstacle, and is specifically configured to:
the predicted travel track Trajectory_final(t) is calculated as:

Trajectory_final(t) = f(t)·Trajectory_model(t) + (1 − f(t))·Trajectory_maneuver(t);

wherein f(t) is a fusion weight coefficient; Trajectory_model(t) is the short-time driving track; and Trajectory_maneuver(t) is the long-time driving track.
CN201910232415.4A 2019-03-26 2019-03-26 Motion trail prediction method and device Active CN109885066B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910232415.4A CN109885066B (en) 2019-03-26 2019-03-26 Motion trail prediction method and device


Publications (2)

Publication Number Publication Date
CN109885066A CN109885066A (en) 2019-06-14
CN109885066B true CN109885066B (en) 2021-08-24

Family

ID=66934282

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910232415.4A Active CN109885066B (en) 2019-03-26 2019-03-26 Motion trail prediction method and device

Country Status (1)

Country Link
CN (1) CN109885066B (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112286049A (en) * 2019-07-27 2021-01-29 华为技术有限公司 Motion trajectory prediction method and device
CN110400490B (en) * 2019-08-08 2022-02-25 腾讯科技(深圳)有限公司 Trajectory prediction method and apparatus
CN110705388B (en) * 2019-09-16 2022-04-01 清华大学 Target vehicle lane change identification method for auxiliary driving based on prediction feedback
CN112712729B (en) * 2019-10-26 2023-01-06 华为技术有限公司 Method and system for predicting motion trajectory
CN110794406B (en) * 2019-11-12 2022-08-02 北京经纬恒润科技股份有限公司 Multi-source sensor data fusion system and method
CN112904331A (en) * 2019-11-19 2021-06-04 杭州海康威视数字技术股份有限公司 Method, device and equipment for determining movement track and storage medium
CN113261035B (en) * 2019-12-30 2022-09-16 华为技术有限公司 Trajectory prediction method and related equipment
WO2021134354A1 (en) * 2019-12-30 2021-07-08 深圳元戎启行科技有限公司 Path prediction method and apparatus, computer device, and storage medium
CN111081046B (en) * 2020-01-03 2022-01-14 阿波罗智能技术(北京)有限公司 Method, device, electronic equipment and medium for automatically changing lane of driving vehicle
WO2021174445A1 (en) * 2020-03-04 2021-09-10 华为技术有限公司 Method and device for predicting exit for vehicle
CN111076739B (en) * 2020-03-25 2020-07-03 北京三快在线科技有限公司 Path planning method and device
CN111595352B (en) * 2020-05-14 2021-09-28 陕西重型汽车有限公司 Track prediction method based on environment perception and vehicle driving intention
CN111399523B (en) * 2020-06-02 2020-12-01 北京三快在线科技有限公司 Path planning method and device
CN111857134B (en) * 2020-06-29 2022-09-16 江苏大学 Target obstacle vehicle track prediction method based on Bayesian network
CN112327848A (en) * 2020-11-05 2021-02-05 北京京东乾石科技有限公司 Robot obstacle avoidance method and device, storage medium and electronic equipment
CN112665590B (en) * 2020-12-11 2023-04-21 国汽(北京)智能网联汽车研究院有限公司 Vehicle track determination method and device, electronic equipment and computer storage medium
CN114692289A (en) * 2020-12-31 2022-07-01 华为技术有限公司 Automatic driving algorithm testing method and related equipment
CN113104041B (en) * 2021-05-08 2022-11-04 地平线(上海)人工智能技术有限公司 Driving track prediction method and device, electronic equipment and storage medium
CN113335276A (en) * 2021-07-20 2021-09-03 中国第一汽车股份有限公司 Obstacle trajectory prediction method, obstacle trajectory prediction device, electronic device, and storage medium
CN113386144B (en) * 2021-08-17 2021-11-09 深圳市创能亿科科技开发有限公司 Remote experiment control device and method
CN113844446B (en) * 2021-10-14 2023-08-15 安徽江淮汽车集团股份有限公司 Vehicle track prediction method integrating long and short ranges
CN113879337A (en) * 2021-10-29 2022-01-04 北京触达无界科技有限公司 Trajectory prediction method and device, storage medium and electronic equipment
CN114312840B (en) * 2021-12-30 2023-09-22 重庆长安汽车股份有限公司 Automatic driving obstacle target track fitting method, system, vehicle and storage medium
CN114228746A (en) * 2022-01-17 2022-03-25 北京经纬恒润科技股份有限公司 Method and device for predicting vehicle motion trail
CN114475593B (en) * 2022-01-18 2023-12-19 上汽通用五菱汽车股份有限公司 Travel track prediction method, vehicle, and computer-readable storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013218497A1 (en) * 2013-09-16 2015-03-19 Bayerische Motoren Werke Aktiengesellschaft Prediction of driving paths of a vehicle
CN108803617B (en) * 2018-07-10 2020-03-20 深圳大学 Trajectory prediction method and apparatus
CN108801286B (en) * 2018-08-01 2021-11-30 奇瑞汽车股份有限公司 Method and device for determining a driving trajectory
CN109376906B (en) * 2018-09-21 2021-11-19 中国科学院深圳先进技术研究院 Travel time prediction method and system based on multi-dimensional trajectory and electronic equipment

Also Published As

Publication number Publication date
CN109885066A (en) 2019-06-14

Similar Documents

Publication Publication Date Title
CN109885066B (en) Motion trail prediction method and device
CN108692734B (en) Path planning method and device
Göhring et al. Radar/lidar sensor fusion for car-following on highways
CN112334368A (en) Vehicle control system and control method for controlling vehicle motion
WO2020061214A1 (en) Collision prediction and avoidance for vehicles
CN108334077B (en) Method and system for determining unity gain for speed control of an autonomous vehicle
US11433885B1 (en) Collision detection for vehicles
US20200148215A1 (en) Method and system for generating predicted occupancy grid maps
CN113243029B (en) Other vehicle behavior prediction method and other vehicle behavior prediction device
KR101874186B1 (en) Method for predicting driving path of object vehicle by maching learning and method for controlling vehicle using the same
Jeong et al. Bidirectional long shot-term memory-based interactive motion prediction of cut-in vehicles in urban environments
CN114391088A (en) Trajectory planner
WO2022132416A1 (en) Object determination in an occluded region
EP4020113A1 (en) Dynamic model evaluation package for autonomous driving vehicles
US11860634B2 (en) Lane-attention: predicting vehicles' moving trajectories by learning their attention over lanes
JP2018063476A (en) Apparatus, method and computer program for driving support
CN113412212A (en) Other-vehicle behavior prediction method and other-vehicle behavior prediction device
CN113942524B (en) Vehicle running control method, system and computer readable storage medium
JP5078727B2 (en) Object detection device
JP6943005B2 (en) Lane change judgment method and lane change judgment device
Choi et al. Radar-based lane estimation with deep neural network for lane-keeping system of autonomous highway driving
CN111959482A (en) Autonomous driving device and method
CN115092183B (en) Active obstacle avoidance control method and system for vehicle based on potential field force
Chung et al. Collision detection system for lane change on multi-lanes using convolution neural network
CN112644487A (en) Automatic driving method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 4 / F, building 1, No.14 Jiuxianqiao Road, Chaoyang District, Beijing 100020

Applicant after: Beijing Jingwei Hengrun Technology Co., Ltd

Address before: 8 / F, block B, No. 11, Anxiang Beili, Chaoyang District, Beijing 100101

Applicant before: Beijing Jingwei HiRain Technologies Co.,Ltd.

GR01 Patent grant