CN114596712A - Vehicle following control method and system - Google Patents

Vehicle following control method and system

Info

Publication number: CN114596712A (granted as CN114596712B)
Application number: CN202210484862.0A
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: vehicle, following, distance, driver, target
Inventors: 郑建颖, 杨泽, 郁树梅, 孙荣川
Applicant and current assignee: Suzhou University
Application filed by Suzhou University; priority to CN202210484862.0A
Related application: PCT/CN2022/109003, published as WO2023213018A1
Legal status: Granted; active

Classifications

    • G PHYSICS; G08 SIGNALLING; G08G TRAFFIC CONTROL SYSTEMS; G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled; G08G 1/0104 Measuring and analysing of parameters relative to traffic conditions; G08G 1/0125 Traffic data processing
    • G08G 1/0137 Measuring and analysing of parameters relative to traffic conditions for specific applications
    • G08G 1/052 Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
    • G08G 1/123 Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a vehicle-following control method and system that take the driver's overtaking intention into account in urban road intersection scenes, so as to mitigate the influence of personalized driving styles on traffic safety at intersections. The method comprises the following steps: 1) collecting vehicle running data at an urban road intersection, and identifying and extracting vehicle-following behaviors; 2) acquiring high-resolution microscopic information of the following process by constructing a 2D bounding box model; 3) analyzing the running states, speeds and distance changes of a target vehicle and a leading vehicle in the same lane, and judging whether the target vehicle has the overtaking intention; 4) controlling the acceleration of the vehicle in real time based on following models for the different driving states. By analyzing the traffic information acquired by roadside sensors to make vehicle-following control decisions, the invention helps avoid traffic accidents, improves the passing efficiency of urban road intersections, and ensures driving safety during following at intersections.

Description

Vehicle following control method and system
Technical Field
The application relates to the field of intelligent transportation technology and vehicle control methods, and in particular to a vehicle-following control method and system that consider the driver's overtaking intention in urban road intersection scenes.
Background
With the rapid development of modern cities, the number of motor vehicles in cities keeps increasing, and road congestion and safety problems are becoming increasingly serious; information technology, represented by intelligent transportation systems, has become a new means of alleviating these congestion and safety problems. Vehicle-following control is one of the key technologies of intelligent transportation and is of great significance for revealing the intrinsic mechanism of traffic flow evolution, relieving congestion and guaranteeing the running safety of vehicles.
As important junctions of the urban road network, urban road intersections are both dense areas of traffic conflicts and high-incidence areas of traffic accidents. At the same time, the driving styles of different drivers affect the accuracy of following control, causing the following vehicle to make erroneous control decisions and seriously affecting driving safety and comfort. Therefore, analyzing the driving behavior of vehicles at urban road intersections and taking the personalized driving style into account during vehicle-following control helps improve driving safety, comfort and traffic efficiency at intersections.
Since Pipes first proposed a vehicle-following model in 1952, more and more traffic researchers have studied the vehicle-following problem, and following models such as the GM model, the Optimal Velocity (OV) model, the Full Velocity Difference (FVD) model and the Generalized Force (GF) model have been proposed in succession. With the continuous progress of computer science, navigation technology and video extraction technology since the beginning of the 21st century, researchers have obtained a large amount of real microscopic vehicle trajectory data, further promoting the rapid development of vehicle-following model research. A review of the related work shows that, although more and more extended models have been proposed, existing studies rarely consider the driver's driving style and do not consider the driver's overtaking intention during the following process.
Therefore, against this background and for urban road intersection scenes, there is an urgent need to consider the driver's overtaking intention, improve the traditional following model, and provide a better vehicle-following control method.
Disclosure of Invention
The invention provides a vehicle-following control method and system that consider the driver's overtaking intention at road intersections, addressing the background problems above. By analyzing the traffic information acquired by roadside sensors and taking the driving parameters of the front and rear vehicles into account, the applicability of the GM model under different driving states is optimized, prompting the vehicle to adopt a better following control strategy.
To this end, the technical solution provided by the invention is a vehicle-following control method that considers the driver's overtaking intention at road intersections, which specifically comprises the following steps:
S1: collecting traffic point cloud data at urban road intersections, performing background filtering of the original point cloud data with a background difference method, completing vehicle target detection with a clustering algorithm, obtaining vehicle trajectory information with a multi-frame fusion method on this basis, and extracting following behaviors from the vehicle trajectories according to the rule conditions for following behavior.
S2: constructing a 2D bounding box model of the vehicle, estimating the vehicle size through the bounding box model, and determining the position of the vehicle head; comparing the position change of the vehicle in adjacent frames to calculate the instantaneous speed of the vehicle; and comparing the positional relationship of the front and rear vehicles during following to calculate the following distance.
S3: analyzing the running states of the target vehicle and the leading vehicle in the same lane according to the high-resolution microscopic traffic data obtained in S2, and analyzing the curves of the vehicle speed and the following distance over time, to judge whether the target vehicle has the overtaking intention.
S4: if the driver of the target vehicle is judged to have the overtaking intention, controlling the speed of the target vehicle with a following model that considers the driver's overtaking intention; otherwise adopting the vehicle-following control strategy for the normal state. Different control methods are adopted for different following states, so that the vehicle can pass through the intersection safely and quickly.
Further, S1 includes five steps: (1) traffic data collection; (2) background filtering of the point cloud data; (3) traffic target detection; (4) vehicle travel trajectory acquisition; (5) following behavior identification and extraction.
Further, in the data collection of S1, a lidar is deployed at the roadside of an urban road intersection to collect traffic data of vehicles running through the intersection in this scene.
Further, the background filtering in S1 includes a background construction method and a background subtraction method. A background model is constructed with a multi-frame data superposition method: a large number of data frames of the same scene are superimposed, the distances returned at each azimuth of the lidar and the occurrence frequencies of different distances are counted, and the distance d_m with the highest occurrence frequency is found; the points obtained from the statistics of all azimuths are then gathered together, which constitutes the background model of the scene. The background difference method differences the point cloud data of the current frame against the background model: a threshold a_m is set, the distance of each azimuth point of the current frame is differenced with the distance d_m of that azimuth in the background model, and all points whose difference is smaller than the threshold a_m are filtered out, which completes the background filtering.
Further, the target detection in S1 uses the DBSCAN clustering algorithm. First, appropriate DBSCAN parameters Eps and MinPts are selected according to the characteristics of the lidar point cloud data and the spatial arrangement of traffic target point clouds, and targets are detected; traffic targets in areas outside the road space are then removed according to the coordinate relationship; vehicle targets are then identified and extracted from all targets according to the point cloud characteristic differences of different traffic targets, and the position of each vehicle target in the road space is recorded.
Further, in S1 a traffic target tracking method based on the fusion of historical frame data is used to obtain the vehicle movement trajectory. The spatio-temporal correlation of the point cloud is exploited, and the spatio-temporal correlation of the target point cloud is increased through the fusion of the current frame data with historical frame data, thereby improving the tracking accuracy.
Further, the following behavior identification and extraction rules in S1 are: (1) the following vehicle and the leading vehicle run in the same lane; (2) the angle difference between the driving directions of the front and rear vehicles is less than 5 degrees; (3) the gap distance between the front and rear vehicles is less than 30 m.
Further, S2 includes three steps: (1) construction of the vehicle 2D bounding box model; (2) extraction of the vehicle speed; (3) extraction of the vehicle-following distance.
Further, in the construction of the 2D bounding box model in S2, the processed target point cloud is projected onto the XOY plane, simplifying the 3D problem into a 2D problem; the convex hull points of the target point cloud are then extracted, and the 2D bounding box model of the vehicle is constructed according to the minimum-area method principle.
Further, the vehicle speed in S2 is calculated as follows:
v(t) = \frac{\sqrt{(x_{t+\Delta t} - x_t)^2 + (y_{t+\Delta t} - y_t)^2}}{\Delta t}    (1)
where (x_t, y_t) are the head position coordinates of the vehicle at time t and (x_{t+\Delta t}, y_{t+\Delta t}) are the head position coordinates of the vehicle at time t + \Delta t.
Further, the following distance S in S2 is calculated as follows:
D = \sqrt{(x_L - x_F)^2 + (y_L - y_F)^2}    (2)
S = D - L_{lead}    (3)
where (x_L, y_L) and (x_F, y_F) are the head position coordinates of the leading vehicle and the following vehicle respectively, D is the head-to-head distance, and L_{lead} is the length of the leading vehicle.
Further, the specific method for determining in S3 whether the target vehicle has the overtaking intention is as follows: first, the curve of the following distance over time in the initial stage is collected; the change rule is then analyzed; if the distance between the following vehicle and the leading vehicle decreases over time, the driver of the following vehicle is considered to have the overtaking intention, otherwise the driver is considered to have no overtaking intention.
A parameter P is used to indicate whether the driver has the overtaking intention: P = 1 indicates that the driver has the overtaking intention, and P = 0 indicates that the driver does not.
Further, S4 includes two control methods: a vehicle-following control strategy under normal following, and a vehicle-following control strategy that considers the driver's overtaking intention. For different driver intentions, the corresponding following control method is used to control the acceleration of the vehicle.
Based on the classical GM model and considering the driver's overtaking intention, an improved following model is proposed:
a_n(t+T) = \alpha_P \cdot \frac{[v_n(t+T)]^m}{[\Delta x_n(t)]^l} \cdot \Delta v_n(t)    (4)
where a_n(t+T) is the acceleration of the nth vehicle at time t+T, v_n(t+T) is the speed of the nth vehicle at time t+T, \Delta v_n(t) is the speed difference between the nth vehicle and the (n-1)th vehicle at time t, and \Delta x_n(t) is the spacing between the two vehicles at time t. The sensitivity coefficient \alpha_P depends on the intention flag P: for P = 0 it takes the value \alpha, the model parameter of the vehicle-following control method under normal following, and for P = 1 it takes the value \alpha', the model parameter of the vehicle-following control method that considers the driver's overtaking intention.
\Delta x_n(t) = x_{n-1}(t) - x_n(t)    (5)
\Delta v_n(t) = v_{n-1}(t) - v_n(t)    (6)
where x_{n-1}(t) and x_n(t) are the coordinates of the (n-1)th vehicle and the nth vehicle at time t, and v_{n-1}(t) and v_n(t) are the speeds of the (n-1)th vehicle and the nth vehicle at time t.
Further, vehicle-following control under normal following is performed directly with the improved following model.
Further, the vehicle-following control that considers the driver's overtaking intention additionally introduces two distance parameters while controlling through the model: the minimum safe distance S_min and the lane-change distance S_lane. First, the following distance S is compared with S_lane. If S > S_lane, the vehicle-following control strategy considering the driver's overtaking intention continues to be executed. If S < S_lane, it is judged whether the left lane satisfies the conditions for overtaking by lane change; if the conditions are satisfied, the vehicle executes the lane-change overtaking control strategy; if not, the following distance S is compared with S_min. If S > S_min, the vehicle-following control strategy considering the driver's overtaking intention continues to be executed; if S < S_min, the vehicle temporarily abandons the overtaking intention and executes the vehicle-following control strategy under normal following until S > S_min is satisfied again, at which point the vehicle-following control strategy considering the driver's overtaking intention is resumed.
Compared with the prior art, the technical scheme of the invention has the following advantages:
1. The data used for model parameter calibration and decision analysis in the invention come from real traffic scenes and can reflect real traffic conditions.
2. The invention can collect new data and calibrate new model parameters for different scenes, and is therefore applicable to various intersections.
3. The invention considers the personalized driving style of the driver, i.e. the driver's overtaking intention is taken into account in the following control process, which greatly improves driving safety and comfort.
Drawings
In the drawings, like reference numerals refer to the same or similar parts or elements throughout the several views unless otherwise specified. The figures are not necessarily to scale. It is appreciated that these drawings depict only some embodiments in accordance with the disclosure and are therefore not to be considered limiting of its scope.
Fig. 1 is a general flowchart of a vehicle-following control method in consideration of a driver's intention to overtake.
Fig. 2 is a schematic diagram of original point cloud data in an intersection scene.
Fig. 3 is a graph of the results after background filtering.
Fig. 4 is a diagram showing a result of detection of a vehicle target.
Fig. 5 is a driving track diagram of a vehicle in an intersection scene.
Fig. 6 is a schematic diagram of the vehicle position relationship during following.
Fig. 7 is a schematic diagram of a vehicle 2D bounding box model.
Fig. 8 is a diagram of the position coordinates of the front and rear vehicles during the following process.
Fig. 9 is a schematic diagram of a high resolution microscopic database of a vehicle-following process.
Fig. 10(a) is a graph of the variation of the following distance with vehicle speed during normal following, and Fig. 10(b) is a graph of the variation of the following distance with vehicle speed during following when the driver has the overtaking intention.
Fig. 11 is a graph showing a change in speed of a vehicle passing through an intersection.
Fig. 12 is a flow chart of the vehicle-following control strategy that considers the driver's overtaking intention.
Fig. 13(a) is a graph comparing a speed change curve outputted from the model with a real speed curve, and fig. 13(b) is a graph comparing a displacement change curve outputted from the model with a real displacement curve.
Fig. 14 shows a configuration diagram of a vehicle-following control system according to an embodiment of the present application.
Fig. 15 shows a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 16 is a schematic diagram of a storage medium provided in an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Referring to Fig. 1, a vehicle-following control method considering the driver's overtaking intention at an urban road intersection according to an embodiment of the present invention includes four steps:
step S1: traffic data are collected and processed, and vehicle following behaviors are identified and extracted.
Step S2: and constructing a 2D boundary box model to obtain high-resolution microscopic traffic information.
Step S3: and judging whether the target vehicle has the overtaking willingness or not.
Step S4: and aiming at different driver intentions, adopting a corresponding car following control method.
The details of step S1 are described in S101-S105 below:
S101: Traffic data collection. In the method, a lidar is deployed at the roadside to acquire traffic data in the urban road intersection scene. The lidar is placed at a corner of the intersection, close to one side of the vehicle stop line, where it can scan the surrounding environment through 360 degrees and can therefore acquire the complete trajectory of a vehicle passing through the intersection. The lidar is mounted on a tripod 1.8 m high at the roadside, which prevents occlusion between traffic targets caused by mounting it too low, and also prevents the reduction in the number of laser beams that can scan the target objects, and hence in the number of target points, caused by mounting it too high.
S102: and filtering the background of the point cloud data. The invention uses a method of multi-frame data superposition to construct the background. Firstly, a large number of frame data under the same scene are overlapped, and then each azimuth laser beam is counted
Figure 440425DEST_PATH_IMAGE029
The returned distance values and the frequency of occurrence of different distances, and finding the distance with the most frequency of occurrence
Figure 877223DEST_PATH_IMAGE030
. Since the position of the traffic object is changing in the traffic scene, the position of the traffic background remains unchanged. Over a long period of time, the background data is necessarily much more than the foreground data (in the art, foreground data is often referred to as traffic objects) in the large amount of data measured by the lidar at a fixed location. Thus the distances statistically obtained from all the orientations
Figure 707775DEST_PATH_IMAGE031
The points are collected together, and the collection of the points is the constructed background model. Differentiating the point cloud data of the current frame with the background model by using a background differentiation method, setting a threshold value to be 0.05m, and differentiating the distance of each square point of the current frame
Figure 763587DEST_PATH_IMAGE032
Distance from the orientation in the background model
Figure 363196DEST_PATH_IMAGE031
And carrying out difference, and filtering out all points with the difference smaller than the threshold value of 0.05m, thereby completing background filtering. Referring to fig. 2 and fig. 3, the original point cloud data and the result after background filtering in the intersection scene are shown.
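As an illustration of the two steps above, the following Python sketch builds a per-azimuth background from superimposed frames and then filters the current frame. It is a minimal sketch, assuming each frame is stored as one range return per azimuth; the function names and array layout are not from the patent.
```python
import numpy as np

def build_background(frames, bin_size=0.05):
    """Multi-frame superposition: for each azimuth, take the most frequent
    (binned) return distance d_m as the background distance.
    `frames` has shape (n_frames, n_azimuths), one range return per azimuth."""
    n_frames, n_az = frames.shape
    background = np.zeros(n_az)
    for az in range(n_az):
        d = frames[:, az]
        d = d[np.isfinite(d)]
        bins = np.round(d / bin_size).astype(int)   # quantize the ranges
        background[az] = np.bincount(bins).argmax() * bin_size
    return background

def foreground_mask(frame, background, threshold=0.05):
    """Background difference: keep only points whose range differs from the
    background distance of the same azimuth by at least `threshold` (metres)."""
    return np.abs(frame - background) >= threshold
```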
S103: and detecting the traffic target. The background filtered point cloud data contains various traffic objects (motor vehicles, non-motor vehicles and pedestrians) and some unfiltered noise points. As the point clouds of all traffic targets are distributed in a centralized manner, the DBSCAN clustering algorithm is selected to complete target detection based on the characteristics of the point cloud data. Because the laser radar returns signals through the rotation of 16 beams of laser light by 360 degrees, all points of the laser radar are closer in the X direction and the Y direction and farther in the Z direction for a traffic target, and the Euclidean distance formula in the DBSCAN algorithm is improved based on the characteristic that:
Figure 95528DEST_PATH_IMAGE033
(7)
wherein
Figure 413377DEST_PATH_IMAGE034
In the form of the euclidean distance,
Figure 131935DEST_PATH_IMAGE035
the coordinate difference of the two points on the traffic target in the X, Y, Z three directions is shown.
Selecting proper DBSCAN algorithm parameters according to the characteristics of laser radar point cloud data and the spatial arrangement rule of traffic target point cloud: the clustering radius Eps and the minimum number of clustering points MinPts. If the Eps is too large, different targets can be clustered into one target, if the Eps is too small, the same target can be clustered into a plurality of targets, if the MinPts is too large, the traffic target can be identified as a noise point, if the MinPts is too small, the noise point can be identified as a traffic target, therefore, the parameter clustering radius Eps selected through experiments is 1.2m, and the minimum clustering point number MinPts is 5. And then clearing traffic targets in areas outside all road spaces according to the coordinate relation, identifying and extracting vehicle targets from all the targets according to point cloud characteristic differences of different traffic targets, and recording the position of each vehicle target in the road space. The target detection result is shown with reference to fig. 4.
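A possible realization of this detection step is sketched below with scikit-learn's DBSCAN, using the Eps = 1.2 m and MinPts = 5 values given above. Scaling the Z coordinate before clustering stands in for the modified distance of equation (7); the weight value is an assumption, not a figure from the patent.
```python
import numpy as np
from sklearn.cluster import DBSCAN

def detect_targets(points, eps=1.2, min_pts=5, z_weight=0.25):
    """Cluster background-filtered lidar points (an (N, 3) XYZ array) into
    traffic targets. Scaling Z by sqrt(z_weight) makes ordinary Euclidean
    DBSCAN behave like d^2 = dx^2 + dy^2 + z_weight * dz^2."""
    pts = np.asarray(points, dtype=float)
    scaled = pts.copy()
    scaled[:, 2] *= np.sqrt(z_weight)
    labels = DBSCAN(eps=eps, min_samples=min_pts).fit_predict(scaled)
    # label -1 marks noise; every other label is one detected target
    return [pts[labels == lab] for lab in set(labels) if lab != -1]
```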
S104: and acquiring a vehicle running track. The method comprises the steps of obtaining a vehicle driving track by using a traffic target tracking method based on historical frame data fusion, fusing two adjacent frames of data (n-1 and n frames), clustering point clouds of the same vehicle target in the two frames of data into a cluster by using a DBSCAN clustering method, identifying the point clouds as the same target, additionally selecting two adjacent frames of data (n and n +1 frames), fusing, clustering and identifying by using the same method until all the frames of data appearing in the target vehicle are processed, fusing the targets of the frames together, and obtaining the running track of the target vehicle. According to the method, the temporal-spatial correlation of point clouds is utilized, and the temporal-spatial correlation degree of the target point clouds is increased through fusion of current frame data and historical frame data, so that the target tracking accuracy is improved, and the extracted intersection vehicle running track is shown in a reference figure 5. In addition, the track is broken due to shielding during vehicle running, the running track of any vehicle target can be divided into a complete track, a broken track and an error track, the broken track and the error track are removed, and all extracted complete vehicle tracks are used for subsequent processing.
S105: and identifying and extracting the following behavior. When the motor vehicle is in a non-free flow stage, the driving behavior of the rear vehicle in the same lane is stimulated and restrained by the instantaneous movement of the front vehicle, namely the following behavior of the vehicle. Firstly, recognizing an acquired vehicle track, dividing the vehicle running track into a straight running track, a lane changing track and a turning track according to the position change of an X-Y coordinate of the vehicle, and then dividing the straight running vehicle into a free flow state and a following state. Because the invention aims at the urban road intersection scene, the extraction rule for making the following behavior of the vehicles according to the characteristics of small driving speed and small gap distance of the vehicles at the intersection comprises the following contents: (1) the following vehicle and the leading vehicle run on the same lane; (2) the angle difference between the driving directions of the front and the rear vehicles is less than 5 degrees; (3) the distance from the front bumper of the following vehicle to the rear bumper of the front vehicle is less than 30m (i.e. the following distance is less than 30 m). Fig. 6 is a schematic diagram showing the positional relationship of the vehicle during the following process.
The details of step S2 are described in S201-S203 below:
s201: and constructing a vehicle 2D boundary box model. In an actual traffic scene, a vehicle runs on the ground, and the motion track of the vehicle is parallel to the ground plane, so that the processed target point cloud is projected to an XOY plane, a 3D problem is simplified into a 2D problem, then the target point cloud is subjected to convex hull point extraction, and a 2D boundary box model of the vehicle is constructed according to the principle of a minimum area method. Referring to fig. 7, the size and the head position coordinates of the vehicle are calculated by obtaining four vertex coordinates of the vehicle bounding box model.
S202: and calculating the running speed of the vehicle. The target vehicle speed information may be calculated from position information of adjacent frames before and after the target,
Figure 726995DEST_PATH_IMAGE002
and
Figure 505595DEST_PATH_IMAGE004
are respectively vehicles
Figure 310740DEST_PATH_IMAGE003
Time of day and
Figure 223202DEST_PATH_IMAGE005
and (4) calculating the instantaneous speed of the vehicle in the running process by the following formula according to the locomotive position coordinates at the moment.
Figure 531823DEST_PATH_IMAGE036
(8)
S203: the vehicle-following distance is calculated. The distance information during the car following process can be calculated by the position information of the front and rear vehicles in the same frame, as shown in reference to fig. 8,
Figure 356691DEST_PATH_IMAGE009
and
Figure 383553DEST_PATH_IMAGE010
respectively the head position coordinates of the leading vehicle and the following vehicle,
Figure 975071DEST_PATH_IMAGE037
the vehicle head distance is obtained by respectively calculating the vehicle length of the leading vehicle through the following formula
Figure 262833DEST_PATH_IMAGE011
And distance between heel and heel
Figure 648815DEST_PATH_IMAGE006
Figure 162973DEST_PATH_IMAGE038
(9)
Figure 165039DEST_PATH_IMAGE039
(10)
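Equations (8)-(10) translate directly into code; the sketch below assumes head positions are given as (x, y) pairs and that the frame interval dt is known.
```python
import numpy as np

def instantaneous_speed(head_prev, head_curr, dt):
    """Eq. (8): displacement of the vehicle head between adjacent frames
    divided by the frame interval dt."""
    return np.linalg.norm(np.subtract(head_curr, head_prev)) / dt

def following_distance(lead_head, follow_head, lead_length):
    """Eqs. (9)-(10): head-to-head distance minus the leading vehicle's
    length gives the bumper-to-bumper following distance S."""
    return np.linalg.norm(np.subtract(lead_head, follow_head)) - lead_length
```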
The details of step S3 are described in S301-S303 below:
S301: The high-resolution microscopic traffic information obtained in step S2 is integrated, and a data table of vehicle-following behavior information at the urban road intersection is established for judging whether the driver has the overtaking intention and for making appropriate control decisions. Referring to Fig. 9, the high-resolution microscopic data of the vehicle-following process mainly include the information in Table 1.
Table 1: High-resolution microscopic data of the vehicle-following process.
S302: based on high-resolution microscopic information of intersection vehicle following behavior, the relationship between the following Distance change and the following vehicle speed change in a following process is statistically analyzed by taking the instantaneous speed (FV-Velocity) of a following vehicle as an abscissa and the following Distance (Gap Distance) as an ordinate. It was found by data analysis that throughout the travel of the vehicle through the intersection, there were only two types of following behavior, one in which the following interval increased with an increase in the speed of the following vehicle, and the other in which the following interval decreased with an increase in the speed of the following vehicle, as shown with reference to fig. 10(a) and 10(b), respectively.
S303: and judging whether the target vehicle has the overtaking willingness or not. Referring to fig. 11, through statistical findings on the vehicle speed, when the vehicle travels through the intersection, the vehicle is accelerated, which indicates that the driver always desires to pass through the intersection at a faster speed while ensuring traffic safety. Analysis is carried out on a relation curve between the instantaneous speed of the target vehicle and the following distance between two vehicles before the target vehicle and the vehicle, if the following distance is reduced along with the increase of the speed, namely the following vehicle continuously catches up with the leading vehicle in the process that the two vehicles run through the intersection, and the distance between the two vehicles is continuously reduced, the behavior is a preparation work for preparing the target vehicle for the overtaking, a driver wants to pull in the distance between the target vehicle and the leading vehicle at a higher speed so as to quickly change the lane at the moment of having the overtaking condition and complete the overtaking, therefore, the driver can be considered to have the overtaking intention in the following process before the target vehicle makes a specific overtaking behavior (namely the vehicle starts to change the lane), and the behavior is called the overtaking behavior that the driver has the overtaking intention. Conversely, if the target vehicle-following distance increases with increasing speed, the driver does not have the intention to overtake.
A binary variable P is introduced to indicate whether the driver has the overtaking intention:
P = 1 if the driver has the overtaking intention (the following distance decreases as the following vehicle's speed increases), and P = 0 otherwise.    (11)
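A simple way to turn the qualitative rule of equation (11) into code is to fit the trend of the following distance against the follower's speed over the observed segment; using a least-squares slope as the trend estimator is an assumption made for this sketch.
```python
import numpy as np

def overtaking_intention(fv_speed, gap_distance):
    """Return P = 1 when the following distance shrinks as the following
    vehicle's speed increases, P = 0 otherwise."""
    slope = np.polyfit(np.asarray(fv_speed, float),
                       np.asarray(gap_distance, float), 1)[0]
    return 1 if slope < 0 else 0
```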
The details of step S4 are described in S401-S403 below:
S401: First, the data acquired and analyzed by the roadside detection equipment are transmitted to the target vehicle through ZigBee network communication, so that the vehicle control system can perform following control according to this information, realizing vehicle-road cooperation.
S402: The following behavior of the vehicle is controlled based on the improved model:
a_n(t+T) = \alpha_P \cdot \frac{[v_n(t+T)]^m}{[\Delta x_n(t)]^l} \cdot \Delta v_n(t)    (12)
where a_n(t+T) is the acceleration of the following vehicle at time t+T, which is the output of the model, v_n(t+T) is the speed of the following vehicle at time t+T, \Delta v_n(t) is the speed difference between the following vehicle and the leading vehicle at time t, and \Delta x_n(t) is the spacing between the two vehicles at time t. The sensitivity coefficient \alpha_P takes the value \alpha, the model parameter of the vehicle-following control method under normal following, when P = 0, and the value \alpha', the model parameter of the vehicle-following control method considering the driver's overtaking intention, when P = 1.
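The acceleration update of equation (12) can be sketched as follows. Since the exact functional form in the patent is reproduced only as an image, the classical GM structure alpha * v^m / dx^l * dv with a parameter set switched by the intention flag P is assumed here.
```python
def gm_acceleration(v_follow, dv, dx, p, params_normal, params_overtake,
                    min_gap=1e-6):
    """Improved GM-type acceleration of the following vehicle.
    params_normal and params_overtake are (alpha, m, l) tuples; p is the
    overtaking-intention flag from eq. (11); dv and dx are the speed
    difference and spacing at time t."""
    alpha, m, l = params_overtake if p == 1 else params_normal
    return alpha * (v_follow ** m) / (max(dx, min_gap) ** l) * dv
```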
For different driver intentions, the acceleration of the vehicle is controlled with the corresponding following control method; two cases are distinguished: Example 1 is the vehicle-following control method under normal following, and Example 2 is the vehicle-following control method when the driver has the overtaking intention.
Example 1
Since the reaction time of a driver is usually less than 1 s, in order to simplify parameter calibration, representative values of T are selected evenly within the range 0-1 s and the selected reaction time is treated as a known parameter.
A genetic algorithm is used to calibrate the model parameters, set up as follows: real-number encoding, with the model parameters as the gene fragment; selection operator: roulette-wheel selection; crossover operator: uniform crossover with a crossover rate of 0.8; mutation operator: normally distributed mutation with a mutation rate of 0.1. 500 groups of sample data are used for calibration; the calibration results are given in Table 2.
Table 2: Calibration results of the model parameters.
Then, based on the acceleration output by the model, the speed and displacement of the vehicle controlled by the model are calculated, and the prediction error of the model is analyzed from both aspects, as shown in Table 3.
Table 3: Model prediction errors.
The error is smallest when the reaction time T is 0.1 s, so the corresponding calibrated values are taken as the optimal result of model parameter calibration.
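A minimal real-coded genetic algorithm matching the settings listed above (roulette-wheel selection, uniform crossover with rate 0.8, normally distributed mutation with rate 0.1) is sketched below. The fitness function, bounds and population/generation sizes are assumptions; the patent reports the resulting calibration and error tables only as images.
```python
import numpy as np

def calibrate_parameters(samples, bounds, pop_size=50, generations=200,
                         crossover_rate=0.8, mutation_rate=0.1, seed=0):
    """Calibrate (alpha, m, l) against observed following data.
    `samples` is a list of (v_follow, dv, dx, a_observed) tuples already
    shifted by the chosen reaction time T; `bounds` is [(lo, hi)] * 3.
    Fitness is based on the mean squared acceleration error."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))

    def mse(params):
        alpha, m, l = params
        err = [(alpha * v**m / max(dx, 1e-6)**l * dv - a)**2
               for v, dv, dx, a in samples]
        return float(np.mean(err))

    for _ in range(generations):
        fitness = np.array([1.0 / (1.0 + mse(ind)) for ind in pop])
        probs = fitness / fitness.sum()
        parents = pop[rng.choice(pop_size, size=pop_size, p=probs)]  # roulette
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):                 # uniform crossover
            if rng.random() < crossover_rate:
                mask = rng.random(len(bounds)) < 0.5
                children[i, mask] = parents[i + 1, mask]
                children[i + 1, mask] = parents[i, mask]
        mutate = rng.random(children.shape) < mutation_rate  # Gaussian mutation
        noise = rng.normal(0.0, 0.1 * (hi - lo), children.shape)
        pop = np.clip(children + mutate * noise, lo, hi)
    best = min(pop, key=mse)
    return tuple(best), mse(best)
```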
example 2
The same procedure as in example 1 was used to calibrate the model parameters to obtain
Figure 320842DEST_PATH_IMAGE047
Parameters of the situation
Figure 937768DEST_PATH_IMAGE022
The optimum result of (2). When the driver has the intention of overtaking in the following process, the speed of the following vehicle is controlled to be greater than that of the front vehicle, so that the distance between the following vehicle and the front vehicle can be continuously reduced to meet the overtaking condition.
When performing vehicle-following control that takes into account the driver's willingness to override, the following distance of the target vehicle from the preceding vehicle also influences the control decision. Two distance parameters are introduced here: minimum safe distance
Figure 637390DEST_PATH_IMAGE027
And lane change distance
Figure 314359DEST_PATH_IMAGE028
When following the car
Figure 221135DEST_PATH_IMAGE006
<
Figure 500807DEST_PATH_IMAGE028
And if the left lane meets the condition of lane change, namely no vehicle influence exists in front of the left lane, the control strategy of the vehicle is switched from the following control to the lane change overtaking control, and the vehicle is controlled to change lanes and overtake so as to meet the overtaking intention of the driver. And if the left lane does not meet the lane change condition, the vehicle continues to carry out the following control until the lane change condition is met.
When following the distance between wheels
Figure 442218DEST_PATH_IMAGE006
<
Figure 24509DEST_PATH_IMAGE027
When the vehicle is in a sudden situation, the vehicle is easy to collide with the front vehicle, and traffic accidents occur. Thus, when the following distance is smaller than the minimum safety distance and the override condition is not satisfied, the vehicle will be controlled to temporarily abandon the override, i.e. the normal following control method is used until when the following distance is smaller than the minimum safety distance
Figure 293947DEST_PATH_IMAGE006
>
Figure 252676DEST_PATH_IMAGE027
And then the following control method with the overtaking intention is recovered. Referring to fig. 12, a flow chart of a vehicle-following control strategy is shown that takes into account the willingness of the driver to override.
Through experiments and statistical analysis, the minimum safe distance is obtained under the scene of urban road intersection
Figure 173227DEST_PATH_IMAGE048
Distance of lane change
Figure 191999DEST_PATH_IMAGE049
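The branching logic of Fig. 12 can be summarized in a small selector function; the mode names are made up for this sketch, and the numeric values of S_min and S_lane are those calibrated above (given only as images in the original).
```python
def choose_control_mode(gap, p, s_min, s_lane, left_lane_clear):
    """Select the control branch of Fig. 12 from the current following
    distance `gap`, the intention flag `p`, the calibrated distances and
    whether the left lane is free for a lane change."""
    if p == 0:
        return "normal_following"
    if gap < s_lane and left_lane_clear:
        return "lane_change_overtake"            # overtaking condition met
    if gap < s_min:
        return "normal_following"                # temporarily abandon overtaking
    return "following_with_overtaking_intention"
```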
S403: with the entire travel information (including position and speed information) of the leading vehicle and the initial information of the following vehicle as known information, the input of the model can be obtained by calculation:
Figure 73367DEST_PATH_IMAGE019
difference in velocity at time
Figure 445574DEST_PATH_IMAGE018
Figure 95998DEST_PATH_IMAGE019
Distance between following and following of time
Figure 285671DEST_PATH_IMAGE020
Following the vehicle
Figure 44549DEST_PATH_IMAGE016
Velocity of time of day
Figure 79501DEST_PATH_IMAGE017
. Outputting instantaneous acceleration of following vehicle
Figure 459798DEST_PATH_IMAGE050
The following speed of the vehicle is calculated by controlling the acceleration change of the vehicle, thereby controlling the running of the vehicle.
Fig. 13(a) and 13(b) are prediction results of the following model, where fig. 13(a) is a predicted following vehicle speed curve and fig. 13(b) is a predicted vehicle displacement curve. The uppermost curve in the figure is the true curve for the lead vehicle, the middle curve is the true curve for the following vehicle, and the lower curve is the predicted curve for the model control output.
An application embodiment provides a vehicle-following control system, which is configured to execute the vehicle-following control method according to the foregoing embodiment, as shown in fig. 14, and includes:
the acquisition and identification module 501 is used for acquiring and processing traffic data at a road intersection and identifying and extracting a car following behavior;
the microscopic traffic information acquisition module 502 is used for constructing a 2D boundary box model of the vehicle and acquiring high-resolution microscopic traffic information, wherein the high-resolution microscopic traffic information comprises the instantaneous speed and the following distance of the vehicle;
an overtaking intention judging module 503, configured to analyze the running states of the target vehicle and the lead vehicle on the same lane according to the high-resolution microscopic traffic information, and analyze a curve of an instantaneous speed and a time-dependent change of a following distance of the vehicle, and judge whether a driver of the target vehicle has an overtaking intention;
and the following control module 504 is configured to control the vehicle to run by using a corresponding vehicle following control method according to different driver intentions.
The car-following control system provided by the above embodiment of the present application and the car-following control method provided by the embodiment of the present application have the same inventive concept and have the same beneficial effects as methods adopted, operated or implemented by application programs stored in the car-following control system.
The embodiment of the present application further provides an electronic device corresponding to the car-following control method provided in the foregoing embodiment, so as to execute the car-following control method. The embodiments of the present application are not limited.
Referring to fig. 15, a schematic diagram of an electronic device provided in some embodiments of the present application is shown. As shown in fig. 15, the electronic device 20 includes: a processor 200, a memory 201, a bus 202 and a communication interface 203, wherein the processor 200, the communication interface 203 and the memory 201 are connected through the bus 202; the memory 201 stores a computer program that can be executed on the processor 200, and the processor 200 executes the vehicle-following control method provided in any of the foregoing embodiments when executing the computer program.
The Memory 201 may include a high-speed Random Access Memory (RAM) and may further include a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. The communication connection between the network element of the system and at least one other network element is realized through at least one communication interface 203 (which may be wired or wireless), and the internet, a wide area network, a local network, a metropolitan area network, and the like can be used.
Bus 202 can be an ISA bus, PCI bus, EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. The memory 201 is configured to store a program, and the processor 200 executes the program after receiving an execution instruction, where the method for controlling vehicle-following disclosed in any embodiment of the present application may be applied to the processor 200, or implemented by the processor 200.
The processor 200 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or instructions in the form of software in the processor 200. The Processor 200 may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), an off-the-shelf programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in ram, flash memory, rom, prom, or eprom, registers, etc. storage media as is well known in the art. The storage medium is located in the memory 201, and the processor 200 reads the information in the memory 201 and completes the steps of the method in combination with the hardware thereof.
The electronic device provided by the embodiment of the application and the car-following control method provided by the embodiment of the application have the same inventive concept and have the same beneficial effects as the method adopted, operated or realized by the electronic device.
The present embodiment further provides a computer-readable storage medium corresponding to the vehicle-following control method provided in the foregoing embodiment, please refer to fig. 16, which illustrates a computer-readable storage medium, which is an optical disc 30 and stores a computer program (i.e., a program product), where the computer program, when executed by a processor, executes the vehicle-following control method provided in any of the foregoing embodiments.
It should be noted that examples of the computer-readable storage medium may also include, but are not limited to, a phase change memory (PRAM), a Static Random Access Memory (SRAM), a Dynamic Random Access Memory (DRAM), other types of Random Access Memories (RAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a flash memory, or other optical and magnetic storage media, which are not described in detail herein.
The computer-readable storage medium provided by the above-mentioned embodiment of the present application and the car-following control method provided by the embodiment of the present application have the same beneficial effects as the method adopted, run, or implemented by the application program stored in the computer-readable storage medium.
It should be noted that:
the algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose systems may also be used with the teachings herein. The required structure for constructing such a system will be apparent from the description above. Moreover, this application is not intended to refer to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the present application as described herein, and any descriptions of specific languages are provided above to disclose the best modes of the present application.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the application, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the application and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be construed to reflect the intent: this application is intended to cover such departures from the present disclosure as come within known or customary practice in the art to which this invention pertains. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the present application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components in a virtual machine creation system according to embodiments of the present application. The present application may also be embodied as apparatus or system programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present application may be stored on a computer readable medium or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several systems, several of these systems may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive various changes or substitutions within the technical scope of the present application, and these should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method for vehicle-following control, the method comprising the steps of:
s1: collecting and processing road intersection traffic data, and identifying and extracting a car following behavior;
s2: constructing a vehicle 2D boundary box model, and acquiring high-resolution microscopic traffic information, wherein the high-resolution microscopic traffic information comprises the instantaneous speed and the following distance of a vehicle;
s3: analyzing the running states of a target vehicle and a leading vehicle on the same lane according to the high-resolution microscopic traffic information, analyzing the curves of the instantaneous speed and the following distance of the vehicles changing along with time, and judging whether the driver of the target vehicle has the intention of overtaking or not;
s4: and controlling the vehicle to run by adopting a corresponding vehicle following control method aiming at different driver intentions.
2. The vehicle-following control method according to claim 1, wherein step S1 includes: the method comprises the steps of traffic data acquisition, point cloud data background filtering, traffic target detection, vehicle running track extraction and following behavior identification and extraction.
3. The vehicle-following control method according to claim 2, wherein the traffic target detection method is a DBSCAN clustering algorithm, and includes selecting a preset clustering radius and a minimum number of clustering points according to characteristics of laser radar point cloud data and a spatial arrangement rule of traffic target point clouds, performing target detection, removing traffic targets in regions other than all road spaces according to a coordinate relationship, recognizing and extracting vehicle targets from all targets according to point cloud characteristic differences of different traffic targets, and recording positions of each vehicle target in the road spaces.
4. The vehicle-following control method according to claim 1, wherein the method for constructing a 2D bounding box model of a vehicle of step S2 comprises: firstly, projecting vehicle target point cloud to an XOY plane, then extracting convex points of the target point cloud, and finally constructing a boundary box model according to a minimum area method principle.
5. The vehicle-following control method according to claim 1, wherein the method for acquiring the high-resolution microscopic traffic information in step S2 comprises:
estimating the size of the vehicle through the bounding box model and determining the position of the vehicle head; comparing the position change of the vehicle head in adjacent frames to calculate the instantaneous speed of the vehicle; and comparing the positions of the leading vehicle and the following vehicle during the following process to calculate the following distance.
6. The vehicle-following control method according to claim 5, wherein the instantaneous speed v of the vehicle and the following distance d_follow are calculated as follows:

v = \sqrt{(x_{t+1} - x_t)^2 + (y_{t+1} - y_t)^2} / \Delta t    (1)

d_{head} = \sqrt{(x_l - x_f)^2 + (y_l - y_f)^2}    (2)

d_{follow} = d_{head} - L    (3)

wherein v is the instantaneous speed of the target vehicle, (x_t, y_t) and (x_{t+1}, y_{t+1}) are the head position coordinates of the vehicle at time t and at time t+1 respectively, \Delta t is the time interval between adjacent frames, d_head and d_follow are respectively the distance between the heads of the two vehicles and the following distance, (x_l, y_l) and (x_f, y_f) are respectively the head position coordinates of the leading vehicle and the following vehicle, and L is the length of the leading vehicle.
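Under the reconstruction of formulas (1)-(3) above, the per-frame computation of claim 6 amounts to two small helpers; the sketch below assumes an externally supplied frame interval dt, and the coordinates in the usage example are made up.

```python
import math

def instantaneous_speed(head_prev, head_curr, dt):
    """Formula (1): displacement of the head position between adjacent
    frames divided by the frame interval dt (e.g. 0.1 s for a 10 Hz lidar)."""
    dx = head_curr[0] - head_prev[0]
    dy = head_curr[1] - head_prev[1]
    return math.hypot(dx, dy) / dt

def following_distance(lead_head, follow_head, lead_length):
    """Formulas (2)-(3): head-to-head distance minus the leader's length."""
    d_head = math.hypot(lead_head[0] - follow_head[0],
                        lead_head[1] - follow_head[1])
    return d_head - lead_length

# Usage with illustrative coordinates (metres) and a 0.1 s frame interval.
v = instantaneous_speed((12.0, 3.5), (13.1, 3.6), dt=0.1)            # ~11.05 m/s
d = following_distance((25.0, 3.6), (13.1, 3.6), lead_length=4.5)    # ~7.4 m
```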
7. The vehicle-following control method according to claim 1, wherein the method for judging whether the driver of the target vehicle has the intention to overtake in step S3 is: statistically analyzing how the following distance changes with the speed of the vehicle; if the following distance decreases as the speed increases, the driver has the intention to overtake; if the following distance increases as the speed increases, the driver has no intention to overtake.
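Claim 7 only requires that the dependence of the following distance on speed be statistically analysed; one plausible (but assumed) realisation is to test the sign of a least-squares slope over a recent sample window, as sketched below. The window contents and slope threshold are the caller's choice.

```python
import numpy as np

def has_overtaking_intention(speeds, distances, slope_threshold=0.0):
    """Return True if the following distance shrinks as speed grows.

    speeds, distances: equally long samples of the target vehicle's
    instantaneous speed and its following distance over a recent window.
    A negative least-squares slope (distance decreasing with speed) is read
    as an intention to overtake; a positive slope as normal following.
    """
    speeds = np.asarray(speeds, dtype=float)
    distances = np.asarray(distances, dtype=float)
    slope = np.polyfit(speeds, distances, deg=1)[0]
    return slope < slope_threshold
```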
8. The vehicle-following control method according to claim 7, wherein controlling vehicle operation by adopting the corresponding vehicle-following control method for the different driver intentions comprises: if the driver of the target vehicle is judged to have the intention to overtake, controlling the speed of the target vehicle with the following model that considers the driver's willingness to overtake; otherwise, adopting the vehicle-following control strategy under the normal following state.
9. The vehicle-following control method according to claim 8, wherein the following model considering the driver's willingness to overtake is expressed by formula (4) (reproduced only as an image in the original publication), wherein a_n(t+T) is the acceleration of the nth following vehicle at time t+T, v_n(t+T) is the speed of the nth following vehicle at time t+T, Δv_n(t) is the speed difference between the nth following vehicle and the (n-1)th vehicle at time t, Δx_n(t) is the separation distance between the two vehicles at time t, α denotes the model parameters of the vehicle-following control method under normal following, and β denotes the model parameters of the vehicle-following control method considering the driver's willingness to overtake.
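Formula (4) is available only as an image in the original publication, so the sketch below should not be read as the patent's model; it merely illustrates, with an assumed GHR-style stimulus-response form and made-up parameter values, how a following model could switch between a normal parameter set and an overtaking-intention parameter set.

```python
def ghr_style_acceleration(v_n, dv, dx, params):
    """Stimulus-response car-following acceleration (GHR-style form, assumed
    here purely for illustration): a = c * v_n**m * dv / dx**l.

    v_n: follower speed at the response instant,
    dv:  speed of vehicle n-1 minus speed of vehicle n at the stimulus instant,
    dx:  spacing between the two vehicles at the stimulus instant,
    params: (c, m, l) model parameters.
    """
    c, m, l = params
    return c * (v_n ** m) * dv / (dx ** l)

# Hypothetical parameter sets: one calibrated for normal following and one
# used when the driver is judged to intend to overtake (values are made up).
NORMAL_PARAMS = (1.1, 0.9, 1.6)
OVERTAKE_PARAMS = (1.6, 0.9, 1.2)

def following_acceleration(v_n, dv, dx, overtaking_intention):
    params = OVERTAKE_PARAMS if overtaking_intention else NORMAL_PARAMS
    return ghr_style_acceleration(v_n, dv, dx, params)
```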
10. The vehicle-following control method according to claim 8, wherein the vehicle-following control considering the driver's intention to overtake is governed by the model and additionally introduces two distance parameters: a minimum safe distance d_min and a lane-change distance d_change; first, the following distance d_follow is compared with d_change: if d_follow > d_change, the vehicle continues to execute the vehicle-following control strategy considering the driver's intention to overtake; if d_follow < d_change, it is judged whether the left lane satisfies the conditions for overtaking by lane change; if the conditions are satisfied, the vehicle executes the lane-change overtaking control strategy; if the conditions are not satisfied, d_follow is compared with d_min: if d_follow > d_min, the vehicle continues to execute the vehicle-following control strategy considering the driver's intention to overtake; if d_follow < d_min, the vehicle abandons the intention to overtake and executes the vehicle-following control strategy under normal following until d_follow > d_min is satisfied again, whereupon the vehicle-following control strategy considering the driver's intention to overtake is resumed.
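The threshold logic of claim 10 maps directly onto a small decision function; the sketch below follows the comparisons in the claim, with the left-lane feasibility check supplied by the caller and the strategy names chosen for illustration.

```python
from enum import Enum

class FollowingStrategy(Enum):
    OVERTAKE_AWARE_FOLLOWING = "follow, keeping the overtaking intention"
    LANE_CHANGE_OVERTAKE = "change lane and overtake"
    NORMAL_FOLLOWING = "normal following (intention abandoned)"

def choose_strategy(d_follow, d_change, d_min, left_lane_available):
    """Decision logic of claim 10.

    d_follow: current following distance,
    d_change: lane-change distance threshold,
    d_min:    minimum safe distance,
    left_lane_available: True if the left lane currently permits an
        overtaking lane change.
    """
    if d_follow > d_change:
        return FollowingStrategy.OVERTAKE_AWARE_FOLLOWING
    if left_lane_available:
        return FollowingStrategy.LANE_CHANGE_OVERTAKE
    if d_follow > d_min:
        return FollowingStrategy.OVERTAKE_AWARE_FOLLOWING
    # d_follow <= d_min: give up overtaking until d_follow exceeds d_min again.
    return FollowingStrategy.NORMAL_FOLLOWING
```

Because the claim keeps the vehicle in normal following until d_follow exceeds d_min again, a full implementation would evaluate this decision every frame, so the switch back to overtake-aware following emerges from the repeated comparison.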
CN202210484862.0A 2022-05-06 2022-05-06 Vehicle following control method and system Active CN114596712B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210484862.0A CN114596712B (en) 2022-05-06 2022-05-06 Vehicle following control method and system
PCT/CN2022/109003 WO2023213018A1 (en) 2022-05-06 2022-07-29 Car following control method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210484862.0A CN114596712B (en) 2022-05-06 2022-05-06 Vehicle following control method and system

Publications (2)

Publication Number Publication Date
CN114596712A true CN114596712A (en) 2022-06-07
CN114596712B CN114596712B (en) 2022-07-19

Family

ID=81820421

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210484862.0A Active CN114596712B (en) 2022-05-06 2022-05-06 Vehicle following control method and system

Country Status (2)

Country Link
CN (1) CN114596712B (en)
WO (1) WO2023213018A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117246333B (en) * 2023-11-16 2024-01-16 北京航空航天大学 Vehicle driving braking demand prediction method based on near-field predictive information
CN117994987B (en) * 2024-04-07 2024-06-11 东南大学 Traffic parameter extraction method and related device based on target detection technology

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006259948A (en) * 2005-03-16 2006-09-28 Clarion Co Ltd Safe vehicle following distance displaying device
CN1719354A (en) * 2005-05-08 2006-01-11 上海交通大学 Acceleration control method of vehicle follow gallop sports
CN108682184A (en) * 2018-04-25 2018-10-19 江苏大学 A kind of vehicle cut-ins auxiliary control method and system applied to two-way two track
CN110299004A (en) * 2019-07-31 2019-10-01 山东理工大学 The following-speed model of intersection turning vehicle is established and its method for analyzing stability
CN113487874B (en) * 2021-05-27 2022-07-01 中汽研(天津)汽车工程研究院有限公司 System and method for collecting, identifying and classifying following behavior scene data
CN114596712B (en) * 2022-05-06 2022-07-19 苏州大学 Vehicle following control method and system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106926844A (en) * 2017-03-27 2017-07-07 西南交通大学 A kind of dynamic auto driving lane-change method for planning track based on real time environment information
CN110597245A (en) * 2019-08-12 2019-12-20 北京交通大学 Automatic driving track-changing planning method based on quadratic planning and neural network
CN110941901A (en) * 2019-11-26 2020-03-31 北方工业大学 Autonomous driving method and system
CN111645692A (en) * 2020-06-02 2020-09-11 中国科学技术大学先进技术研究院 Hybrid strategy game-based driver overtaking intention identification method and system
CN111994088A (en) * 2020-09-02 2020-11-27 中国科学技术大学 Driver lane change intention identification method and system based on hybrid strategy game

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023213018A1 (en) * 2022-05-06 2023-11-09 苏州大学 Car following control method and system
CN115188184A (en) * 2022-06-20 2022-10-14 海信集团控股股份有限公司 Vehicle speed limit processing method, equipment and device
CN115188184B (en) * 2022-06-20 2024-03-19 海信集团控股股份有限公司 Vehicle speed limit processing method, device and apparatus
CN114822044A (en) * 2022-06-29 2022-07-29 山东金宇信息科技集团有限公司 Driving safety early warning method and device based on tunnel
CN115168810A (en) * 2022-09-08 2022-10-11 南京慧尔视智能科技有限公司 Traffic data generation method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2023213018A1 (en) 2023-11-09
CN114596712B (en) 2022-07-19

Similar Documents

Publication Publication Date Title
CN114596712B (en) Vehicle following control method and system
US10832063B2 (en) Systems and methods for detecting an object
CN110843789B (en) Vehicle lane change intention prediction method based on time sequence convolution network
CN112242069B (en) Method and device for determining vehicle speed
US10761201B2 (en) Object detection device and recording medium
WO2017002258A1 (en) Route prediction device
CN116685874A (en) Camera-laser radar fusion object detection system and method
CN113568416B (en) Unmanned vehicle trajectory planning method, device and computer readable storage medium
US20220171975A1 (en) Method for Determining a Semantic Free Space
US20220314979A1 (en) Apparatus and Method for Controlling Driving of Vehicle
CN110119751B (en) Laser radar point cloud target segmentation method, target matching method, device and vehicle
CN113432615A (en) Detection method and system based on multi-sensor fusion drivable area and vehicle
KR20210077043A (en) Apparatus and method for identificating driving lane in vehicle
CN115431945A (en) Target screening method, device, equipment, medium and vehicle
CN115257801A (en) Trajectory planning method and device, server and computer readable storage medium
Lai et al. Sensor fusion of camera and MMW radar based on machine learning for vehicles
CN113753038A (en) Trajectory prediction method and apparatus, electronic device and storage medium
US20240020964A1 (en) Method and device for improving object recognition rate of self-driving car
CN115675472B (en) Ramp port determining method and device, electronic equipment and storage medium
WO2024090388A1 (en) Information processing device and program
EP4113165A1 (en) Methods and systems for radar data processing
CN118025155A (en) Method, device, vehicle and program product for determining following target
JP2024062323A (en) Information processing device and program
KR20240011068A (en) Method for improving object detecting ratio of self-driving car and apparatus thereof
Dey et al. Machine Learning for Efficient Perception in Automotive Cyber-Physical Systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant