CN114782926B - Driving scene recognition method, device, equipment, storage medium and program product - Google Patents


Info

Publication number: CN114782926B
Application number: CN202210694071.0A
Authority: CN (China)
Prior art keywords: driving, behavior, vehicle, crossing, primitive
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Other versions: CN114782926A (Chinese)
Inventors: 赵树廉, 段剑犁, 吴思宇, 王红, 高博麟, 李克强
Original and current assignee: Tsinghua University
Events: application filed by Tsinghua University; priority to CN202210694071.0A; publication of CN114782926A; application granted; publication of CN114782926B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology

Abstract

The method first acquires driving data of a target vehicle within a preset time period, determines the driving behavior primitives of the target vehicle within that period according to the driving data, and then determines the driving scene of the target vehicle within that period according to those driving behavior primitives and a preset driving scene model. With this method, multiple kinds of driving scenes can be recognized automatically and in a unified way.

Description

Driving scene recognition method, device, equipment, storage medium and program product
Technical Field
The present application relates to the field of vehicle automatic driving technologies, and in particular, to a driving scenario recognition method, apparatus, device, storage medium, and program product.
Background
With the progress of science and technology, vehicles are becoming increasingly intelligent, and automatic driving technology is gradually being pushed to market. Before market entry, an automatic driving system must be adequately tested and verified to ensure that the autonomous vehicle can operate safely.
The driving scene, as the final input that reflects the real running environment of an autonomous vehicle, is the core element supporting autonomous-vehicle development and test-evaluation technology. However, existing driving scene recognition methods can only recognize scenes of a single function and can hardly adapt to the complex real driving environment of an autonomous vehicle.
Therefore, how to realize automatic identification of multiple driving scenes is an urgent problem to be solved.
Disclosure of Invention
In view of the above technical problems, it is necessary to provide a driving scene recognition method, apparatus, device, storage medium, and program product capable of automatically recognizing multiple driving scenes.
In a first aspect, the present application provides a driving scenario recognition method, including:
acquiring driving data of a target vehicle within a preset time period;
determining driving behavior primitives of the target vehicle within the preset time period according to the driving data, where a driving behavior primitive represents a behavior characteristic common to the vehicle's driving process across multiple driving scenes;
determining the driving scene of the target vehicle within the preset time period according to the driving behavior primitives of the target vehicle within the preset time period and a preset driving scene model, where the driving scene model includes the correspondence between driving behavior primitive groups and driving scenes.
In one embodiment, determining the driving behavior primitive of the target vehicle within the preset time period according to the driving data comprises:
dividing a preset time period into a plurality of time windows according to the time sequence;
and according to the time sequence of each time window, performing behavior primitive identification on the driving data in each time window to obtain driving behavior primitives of the target vehicle in a preset time period.
In one embodiment, the behavior primitive recognition includes straight-going behavior primitive recognition, and the driving behavior primitives include a straight-going behavior primitive;
performing behavior primitive recognition on the driving data in each time window to obtain the driving behavior primitives of the target vehicle within the preset time period includes:
for any time window, performing straight-going behavior primitive recognition on the driving data in the time window, and if the driving data in the time window belongs to a straight-going behavior primitive, continuing with straight-going behavior primitive recognition on the driving data in the next time window;
and if the driving data in the time window does not belong to a straight-going behavior primitive, deleting the driving data at the starting moment of the time window, re-dividing the time window, and performing straight-going behavior primitive recognition on the driving data in the re-divided time window.
In one embodiment, the driving data includes a lateral distance from the target vehicle to the lane line and a driving angle between the target vehicle and the tangent of the lane line;
performing straight-going behavior primitive recognition on the driving data in a time window includes:
acquiring the lateral distance and the driving angle corresponding to each moment in the time window;
determining, for each moment in the time window, the distance difference between the lateral distance at that moment and the lateral distance at the starting moment of the time window;
and if the absolute value of the distance difference at each moment in the time window is smaller than or equal to a straight-going behavior distance threshold, and the absolute value of the driving angle at each moment in the time window is smaller than or equal to a straight-going behavior angle threshold, determining that the driving data in the time window belongs to a straight-going behavior primitive.
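The straight-going check above can be sketched as follows. The function name and the threshold values (0.3 m, 5 degrees) are illustrative assumptions, not values given in the patent.

```python
def is_straight_going(lateral_dists, headings,
                      dist_threshold=0.3, angle_threshold=5.0):
    """Decide whether the driving data in one time window belongs to a
    straight-going behavior primitive.

    lateral_dists: lateral distance to the lane line at each moment (m)
    headings:      driving angle between the vehicle and the lane-line
                   tangent at each moment (degrees)
    Threshold values are illustrative, not taken from the patent.
    """
    d0 = lateral_dists[0]  # lateral distance at the window's starting moment
    within_distance = all(abs(d - d0) <= dist_threshold for d in lateral_dists)
    within_angle = all(abs(a) <= angle_threshold for a in headings)
    return within_distance and within_angle
```

A window qualifies only when every moment stays within both thresholds, matching the "at each moment" wording above.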
In one embodiment, the driving data includes a left lateral distance from the target vehicle to the left lane line and a left driving angle between the target vehicle and the tangent direction of the left lane line;
performing behavior primitive recognition on the driving data in each time window to obtain the driving behavior primitives of the target vehicle within the preset time period includes:
sequentially acquiring the variations of the left lateral distance of the target vehicle between adjacent moments within the preset time period;
determining the time period of the crossing behavior of the target vehicle according to the left lateral distance variations;
and acquiring the driving behavior primitive of the target vehicle in the crossing behavior time period.
In one embodiment, determining the time period of the crossing behavior of the target vehicle according to the left lateral distance variations includes:
obtaining, among all left lateral distance variations, those whose absolute value is greater than a lane width threshold, as candidate left lateral distance variations;
determining the earlier of the two adjacent moments corresponding to each candidate left lateral distance variation as a crossing moment, each crossing moment corresponding to one crossing behavior of the target vehicle;
and determining the crossing time period of each crossing behavior of the target vehicle according to the crossing moments.
In one embodiment, determining the crossing time period of each crossing behavior of the target vehicle according to the crossing moments includes:
for any crossing moment, determining the moment that satisfies a preset condition before the crossing moment as the starting moment of the crossing behavior corresponding to that crossing moment, and determining the moment that satisfies the preset condition after the crossing moment as the ending moment of that crossing behavior; the preset condition is that the absolute value of the left lateral distance variation relative to the starting moment of the preset time period is smaller than or equal to a crossing behavior distance threshold, and the absolute value of the left driving angle is smaller than or equal to a crossing behavior angle threshold;
and determining the time period between the starting moment and the ending moment as the time period of the crossing behavior corresponding to that crossing moment.
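A minimal sketch of the crossing-moment detection described above. The lane-width threshold value and the function name are assumptions; the idea is that crossing a lane line makes the measured left lateral distance jump by roughly one lane width between adjacent samples, and the earlier moment of that sample pair is taken as the crossing moment.

```python
def find_crossing_moments(times, left_dists, lane_width_threshold=3.0):
    """Detect the moments at which the target vehicle crosses a lane line.

    times:      timestamps of the driving data samples
    left_dists: left lateral distance to the left lane line at each moment (m)
    A variation between adjacent samples whose magnitude exceeds the
    lane-width threshold marks a crossing.
    """
    crossings = []
    for i in range(len(left_dists) - 1):
        delta = left_dists[i + 1] - left_dists[i]
        if abs(delta) > lane_width_threshold:
            crossings.append(times[i])  # earlier moment of the adjacent pair
    return crossings
```

The crossing time period around each detected moment would then be bounded by the nearest moments before and after it that satisfy the steadiness condition stated in the preceding paragraph.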
In one embodiment, the driving behavior primitives include a left crossing behavior primitive and a right crossing behavior primitive;
acquiring the driving behavior primitive of the target vehicle in the crossing behavior time period includes:
acquiring a reference left lateral distance variation of the target vehicle in the crossing behavior time period;
if the reference left lateral distance variation is greater than the lane width threshold, determining that the driving behavior primitive of the target vehicle in the crossing behavior time period is a right crossing behavior primitive;
and if the reference left lateral distance variation is smaller than the lane width threshold, determining that the driving behavior primitive of the target vehicle in the crossing behavior time period is a left crossing behavior primitive.
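A sketch of the left/right classification above. Reading "smaller than the lane width threshold" as a signed change below the negative threshold is this sketch's assumption (a leftward move reduces the distance to the left lane line), as are the threshold value and names.

```python
def classify_crossing(ref_left_delta, lane_width_threshold=3.0):
    """Classify a crossing behavior from the reference change of the left
    lateral distance over the crossing time period.

    ref_left_delta: signed change of the left lateral distance (m); a
    rightward lane change increases the distance to the left lane line,
    a leftward one decreases it (sketch assumption).
    """
    if ref_left_delta > lane_width_threshold:
        return "right_crossing"
    if ref_left_delta < -lane_width_threshold:
        return "left_crossing"
    return None  # no full lane crossing in this period
```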
In one embodiment, determining the driving scene of the target vehicle within the preset time period according to the driving behavior primitives of the target vehicle within the preset time period and the preset driving scene model includes:
splicing the driving behavior primitives within the preset time period according to their time order to obtain multiple candidate spliced primitive groups of the target vehicle within the preset time period;
matching each candidate spliced primitive group against the driving behavior primitive groups in the driving scene model;
and determining the driving scene corresponding to a successfully matched driving behavior primitive group in the driving scene model as the driving scene corresponding to that candidate spliced primitive group; the driving scenes of all candidate spliced primitive groups are the driving scenes of the target vehicle within the preset time period.
In one embodiment, the splicing processing of the driving behavior primitives within the preset time period includes:
splicing the time-associated driving behavior primitives of the target vehicle within the preset time period, where time association means temporal adjacency or temporal overlap;
and splicing the time-associated driving behavior primitives of the target vehicle within the preset time period with candidate behavior primitives of vehicles around the target vehicle that have time association and driving behavior association.
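The time-association test and the splicing of the target vehicle's own primitives can be sketched as follows. Representing a primitive as a dict with `name` and `interval` fields is an illustrative choice, and the cross-vehicle splicing step is omitted for brevity.

```python
def time_associated(p, q, gap=0.0):
    """Two primitives are time-associated when their time intervals
    overlap or are adjacent ('temporal adjacency or temporal overlap')."""
    (s1, e1), (s2, e2) = p["interval"], q["interval"]
    return s2 <= e1 + gap and s1 <= e2 + gap

def splice(primitives):
    """Splice time-associated primitives in time order into candidate
    spliced primitive groups: consecutive associated primitives are
    chained into one group (a minimal sketch)."""
    prims = sorted(primitives, key=lambda p: p["interval"][0])
    groups, current = [], []
    for p in prims:
        if current and not time_associated(current[-1], p):
            groups.append(current)
            current = []
        current.append(p)
    if current:
        groups.append(current)
    return groups
```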
In one embodiment, the construction process of the driving scene model comprises the following steps:
acquiring multiple sample driving scenes, each sample driving scene including the driving behaviors of the vehicle to which it belongs and of the vehicles around that vehicle that are associated with it in time and driving behavior;
performing driving behavior primitive analysis on each sample driving scene and determining the driving behavior primitive group corresponding to each driving scene;
and determining the correspondence between each driving behavior primitive group and its corresponding driving scene as the driving scene model.
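The construction steps above can be sketched as a mapping from an ordered driving behavior primitive group to its driving scene label. Representing groups as tuples and scenes as strings is an assumption of this sketch.

```python
def build_scene_model(samples):
    """Build a driving scene model from analyzed sample scenes.

    samples: list of (primitive_group, scene_label) pairs, where
    primitive_group is the ordered list of primitive names obtained by
    primitive analysis of one sample driving scene.
    Returns a dict mapping each primitive group (as a tuple) to its scene.
    """
    model = {}
    for group, scene in samples:
        model[tuple(group)] = scene
    return model
```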
In a second aspect, the present application further provides a driving scenario recognition apparatus, including:
an acquisition module, configured to acquire driving data of the target vehicle within a preset time period;
a first determination module, configured to determine the driving behavior primitives of the target vehicle within the preset time period according to the driving data, where a driving behavior primitive represents a behavior characteristic common to the vehicle's driving process across multiple driving scenes;
and a second determination module, configured to determine the driving scene of the target vehicle within the preset time period according to the driving behavior primitives of the target vehicle within the preset time period and a preset driving scene model, where the driving scene model includes the correspondence between driving behavior primitive groups and driving scenes.
In a third aspect, an embodiment of the present application provides a computer device, including a memory and a processor, where the memory stores a computer program, and the processor implements the steps of any one of the methods provided in the embodiments of the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of any one of the methods provided in the embodiments of the first aspect.
In a fifth aspect, the present application provides a computer program product, which includes a computer program that, when executed by a processor, implements the steps of any one of the methods provided in the embodiments of the first aspect.
According to the driving scene recognition method, apparatus, device, storage medium, and program product, the driving data of the target vehicle within a preset time period is first acquired, the driving behavior primitives of the target vehicle within the preset time period are determined according to the driving data, and the driving scene of the target vehicle within the preset time period is then determined according to those driving behavior primitives and a preset driving scene model. In this method, the driving behavior primitives represent behavior characteristics common to the vehicle's driving process across multiple driving scenes, and the primitives determined from the driving data of the target vehicle are used to recognize its driving scenes, enabling comprehensive research on typical driving scenes. Because the driving behavior primitives are behavior characteristics common to multiple driving scenes, the method is applicable to the driving scene recognition of any vehicle and is universal, achieving automatic identification and extraction of driving scenes with strong universality and ease of use.
Drawings
FIG. 1 is a diagram of an application environment of a driving scenario recognition method in one embodiment;
FIG. 2 is a flow diagram illustrating a driving scenario identification method in one embodiment;
FIG. 3 is a flow chart illustrating a driving scenario recognition method according to another embodiment;
FIG. 4 is a flow chart illustrating a driving scenario recognition method according to another embodiment;
FIG. 5 is a flowchart illustrating a driving scenario recognition method according to another embodiment;
FIG. 6 is a flow chart illustrating a driving scenario recognition method according to another embodiment;
FIG. 7 is a schematic diagram of a coordinate system of a driving scenario recognition method in one embodiment;
FIG. 8 is a flowchart illustrating a driving scenario recognition method according to another embodiment;
FIG. 9 is a flowchart illustrating a driving scenario recognition method according to another embodiment;
FIG. 10 is a flowchart illustrating a driving scenario recognition method according to another embodiment;
FIG. 11 is a flowchart illustrating a driving scenario recognition method according to another embodiment;
FIG. 12 is a schematic diagram of a vehicle position of a driving scenario identification method in one embodiment;
FIG. 13 is a flowchart illustrating a driving scenario recognition method according to another embodiment;
FIG. 14 is a flow chart illustrating a driving scenario identification method in another embodiment;
FIG. 15 is a block diagram showing the construction of a driving scene recognition apparatus according to an embodiment;
FIG. 16 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more clearly understood, the present application is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The driving scene recognition method provided by the embodiment of the application can be applied to the application environment shown in fig. 1. Where the target vehicle 102 communicates with the server 104 over a network. The data storage system may store data that the server 104 needs to process. The data storage system may be integrated on the server 104, or may be located on the cloud or other network server. The target vehicle 102 includes various autonomous vehicles and manually driven vehicles, among others. The server 104 may be implemented as a stand-alone server or as a server cluster comprised of multiple servers.
The target vehicle may be provided with a vehicle-mounted system that acquires driving data of the target vehicle during driving and uploads it to the server; after receiving the driving data, the server performs driving scene recognition on it.
At present, many researchers have proposed various methods for extracting typical driving scenes. In one approach, a general standard for extracting dangerous driving scenes is determined through research on dangerous driving scenes. In another, characteristic data of drivers are obtained through real-road experiments and an abnormal-driving-behavior scale and analyzed, so as to derive an extraction standard for continuous car-following conditions. In yet another, a method for identifying the cut-in conditions of adjacent vehicles is derived by analyzing lane-change cut-in behaviors in natural driving data.
However, these research works all adopt relatively fixed classification rules to classify driving scene data and are limited to the extraction of single-function scenes. Fixed extraction rules for single-function scenes can hardly adapt to the increasingly complex real driving environment of autonomous vehicles and do not help to form a standardized, integrated scene-extraction framework that supports comprehensive research on typical natural driving scenes.
Based on this, the embodiments of the present application provide a driving scene recognition method, apparatus, device, storage medium, and program product that can automatically recognize multiple driving scenes and enable comprehensive research on driving scenes.
The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, of the embodiments of the present application.
In an embodiment, a driving scene recognition method is provided, described by taking its application in the environment of fig. 1 as an example. This embodiment relates to the specific process of acquiring the driving data of a target vehicle within a preset time period, determining the driving behavior primitives of the target vehicle within the preset time period according to the driving data, and then determining the corresponding driving scene according to a preset driving scene model. As shown in fig. 2, the embodiment includes the following steps:
s201, driving data of the target vehicle in a preset time period is acquired.
The target vehicle is the vehicle whose driving scene is to be recognized, the preset time period is the period over which its driving scene is to be recognized, and the driving data represent the data of the target vehicle during natural driving within the preset time period.
The driving data may include data of the target vehicle and of vehicles around the target vehicle, including but not limited to driving speed data, lane line data, vehicle position data, vehicle driving direction data, and traffic environment data. The lane line data include, but are not limited to, the distances of the host vehicle from the two lane lines and from the lane center; the traffic environment data include, but are not limited to, road type, weather, speed limit, road congestion degree, and traffic lights. The driving data are data capable of reflecting the driving scene of the target vehicle.
In this embodiment, a vehicle-mounted platform may be installed on the target vehicle, and the vehicle-mounted platform is provided with various sensors, for example: cameras, radars, inertial measurement units, etc.
During the driving of the target vehicle, its raw driving data are acquired in real time by the various sensors on the vehicle-mounted platform; the acquisition may sample the driving data at a preset fixed frequency.
When the driving scene of the target vehicle is identified, the scene of the target vehicle may be identified in a time period manner, for example, driving data of the target vehicle within a preset time period is obtained, where the preset time period may be any time period during which the target vehicle is driving.
It should be noted that the driving data is natural driving data, that is, the driving data of the target vehicle in a preset time period is the driving data of the target vehicle in any time period during natural driving.
S202, determining the driving behavior primitives of the target vehicle within a preset time period according to the driving data; a driving behavior primitive represents a behavior characteristic common to the vehicle's driving process across multiple driving scenes.
A driving behavior primitive is a local behavior within a driving scene and a basic structural unit of the driving scene; when the driving scene of the target vehicle within a preset time period is recognized, the driving behavior primitives of the target vehicle within the preset time period may be determined first.
Alternatively, the driving behavior primitive of the target vehicle within the preset time period may be determined according to a preset neural network model, specifically, the driving data of the target vehicle within the preset time period is input into the preset neural network model, and the driving behavior primitive of the target vehicle within the preset time period is output through analysis of the neural network model.
S203, determining a driving scene of the target vehicle in a preset time period according to the driving behavior primitive of the target vehicle in the preset time period and a preset driving scene model; the driving scene model comprises the corresponding relation between the driving behavior primitive group and the driving scene.
Since the driving behavior primitives are behavior characteristics common to the vehicle across multiple driving scenes, combining the driving behavior primitives yields the corresponding driving scenes; that is, driving behavior primitive groups have a correspondence with driving scenes.
The driving scene model comprises the corresponding relation between the driving behavior primitive group and the driving scene, so that the driving scene of the target vehicle in the preset time period can be determined according to the driving behavior primitive of the target vehicle in the preset time period and the driving scene model.
Specifically, the driving behavior primitives of the target vehicle within the preset time period may be combined according to a preset combination rule to obtain driving behavior primitive groups, and the driving scene of the target vehicle within the preset time period may then be determined from those primitive groups and the correspondence between primitive groups and driving scenes in the driving scene model.
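Assuming the driving scene model is represented as a mapping from ordered primitive groups to scene labels, the matching step reduces to a lookup; the function name and the group representation are illustrative.

```python
def recognize_scenes(candidate_groups, scene_model):
    """Match each candidate spliced primitive group against the primitive
    groups in the scene model; each successful match yields the driving
    scene corresponding to that group.

    candidate_groups: list of ordered primitive-name lists
    scene_model:      dict mapping primitive-group tuples to scene labels
    """
    scenes = []
    for group in candidate_groups:
        scene = scene_model.get(tuple(group))
        if scene is not None:  # successful match
            scenes.append(scene)
    return scenes
```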
According to this driving scene recognition method, the driving data of the target vehicle within a preset time period is first acquired, the driving behavior primitives of the target vehicle within the preset time period are determined according to the driving data, and the driving scene of the target vehicle within the preset time period is then determined according to those driving behavior primitives and a preset driving scene model. In this method, the driving behavior primitives represent behavior characteristics common to the vehicle's driving process across multiple driving scenes, and the primitives determined from the driving data of the target vehicle are used to recognize its driving scenes, enabling comprehensive research on typical driving scenes. Because the driving behavior primitives are behavior characteristics common to multiple driving scenes, the method is applicable to the driving scene recognition of any vehicle and is universal, achieving automatic identification and extraction of driving scenes with strong universality and ease of use.
Based on the above embodiments, how to obtain the driving behavior primitive of the target vehicle within the preset time period is described below by an embodiment, in one embodiment, as shown in fig. 3, determining the driving behavior primitive of the target vehicle within the preset time period according to the driving data includes the following steps:
s301, dividing a preset time period into a plurality of time windows according to the time sequence.
When driving scene recognition is performed on a driving process of a target vehicle in a preset time period, the preset time period can be firstly divided into a plurality of time windows, and then scene recognition is performed on driving data in each time window according to a preset recognition rule.
The manner of dividing the plurality of time windows may be to divide the time windows by a preset time length according to a time sequence, or to divide a preset time period into a plurality of time windows on average, and it should be noted that the number of the time windows is not limited in this application.
Optionally, when the driving data are actually sampled, the sampling intervals may be inconsistent, that is, the time intervals between adjacent driving data samples differ, while each sample corresponds to a moment within the preset time period. The time windows may therefore also be divided according to the number of samples within the preset time period; for example, if there are 1000 driving data samples in the preset time period, every 100 consecutive samples (in order of sampling time) form one time window.
In one embodiment, the preset time period is divided using a sliding time window of width T (in seconds). For any time window with starting time t_j and ending time t_k:

t_k - t_j = T    (1)

where t_k and t_j are the moments corresponding to the k-th and j-th driving data samples, respectively.
Optionally, T = 3 s.
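Equation (1) can be applied to non-uniformly sampled data by cutting windows on timestamps rather than sample counts; since an exact T-second boundary may not fall on a sample, this sketch ends each window at the first sample at least T seconds after its start. The index-pair representation of a window is an illustrative choice.

```python
def divide_windows(times, T=3.0):
    """Divide a recording into consecutive time windows of width T seconds
    (equation (1): t_k - t_j = T), returning (start_index, end_index)
    pairs over the sample timestamps. Sampling intervals need not be
    uniform, so windows are cut by timestamp, not by sample count."""
    windows = []
    j = 0
    for k in range(len(times)):
        if times[k] - times[j] >= T:
            windows.append((j, k))
            j = k  # the next window starts where this one ended
    return windows
```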
s302, according to the time sequence of each time window, performing behavior primitive identification on the driving data in each time window to obtain driving behavior primitives of the target vehicle in a preset time period.
Since the time windows are divided in time order, they have a temporal sequence; behavior primitive recognition is performed on the driving data in each time window in that order, finally yielding the driving behavior primitives of the target vehicle within the preset time period.
The manner of performing behavior primitive recognition on the driving data in each time window may be to perform behavior primitive recognition on the driving data in each time window according to a preset recognition algorithm, specifically, to use the driving data in each time window as an input of the recognition algorithm, and to obtain driving behavior primitives in each time window by running the recognition algorithm.
According to this driving scene recognition method, the preset time period is divided into multiple time windows in time order, and behavior primitive recognition is performed on the driving data in each time window in sequence to obtain the driving behavior primitives of the target vehicle within the preset time period. Recognizing the behavior primitives window by window ensures the effectiveness and accuracy of the subsequent driving scene recognition of the target vehicle within the preset time period.
In one embodiment, as shown in fig. 4, the behavior primitive recognition is performed on the driving data in each time window to obtain the driving behavior primitive of the target vehicle in the preset time period, and the method includes the following steps:
S401, for any time window, straight-going behavior primitive recognition is performed on the driving data in the time window; if the driving data in the time window belongs to a straight-going behavior primitive, straight-going behavior primitive recognition continues on the driving data in the next time window.
Based on the plurality of time windows into which the preset time period is divided, straight-going behavior primitive recognition is first performed, in chronological order, on the driving data in the earliest time window. If that driving data belongs to a straight-going behavior primitive, recognition continues on the driving data in the second time window, and so on, until the driving data in every time window has undergone straight-going behavior primitive recognition.
For example, suppose there are 5 time windows in the preset time period, numbered 1 to 5 in chronological order, so that window No. 1 is the earliest. Straight-going behavior primitive recognition is first performed on the driving data in window No. 1. If that data belongs to a straight-going behavior primitive, recognition proceeds to the driving data in window No. 2; if that also belongs to a straight-going behavior primitive, recognition proceeds to window No. 3, and so on through windows No. 4 and No. 5.
S402, if the driving data in the time window does not belong to the straight-going behavior primitives, deleting the driving data at the initial moment of the time window, then dividing the time window again, and identifying the straight-going behavior primitives of the driving data in the newly divided time window.
That is, if the driving data in the time window does not belong to a straight-going behavior primitive, the driving data at the initial time of the window is removed, the time windows are re-divided starting from the time after that initial time, and straight-going behavior primitive recognition is then performed on the driving data in the newly divided time windows.
Continuing the above example with time windows 1 to 5: if the driving data in window No. 1 belongs to a straight-going behavior primitive, recognition continues on the driving data in window No. 2. If the driving data in window No. 2 does not belong to a straight-going behavior primitive, the driving data corresponding to the start time of window No. 2 is removed, and the data of windows 2 to 5 is re-divided to determine new time windows. The dividing manner may be the same as the original manner of dividing the time windows, and is not described here again.
Then, the driving data in each of the subdivided time windows is continuously subjected to the straight-going behavior primitive recognition, and the recognition mode is the same as the mode of performing the straight-going behavior primitive recognition on the driving data in each of the time windows in the above embodiment, which is not described herein again.
In the above driving scene recognition method, for any time window, straight-going behavior primitive recognition is performed on the driving data in the window. If the data belongs to a straight-going behavior primitive, recognition continues on the driving data in the next window; if it does not, the driving data at the start time of the window is deleted, the time windows are re-divided, and recognition is performed on the driving data in the newly divided windows. By recognizing the straight-going behavior primitives of the target vehicle in each time window within the preset time period, the method guarantees the accuracy of the straight-going behavior primitives obtained for that period.
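As an illustrative sketch (not the patent's exact implementation), the scanning loop of steps S401 and S402 can be expressed over a flat list of samples with a fixed window length; `is_straight` is a hypothetical predicate standing in for the straight-going behavior primitive recognition described in the later embodiments, and dropping the start-time sample is modeled by advancing the window start by one sample:

```python
def find_straight_segments(samples, window_len, is_straight):
    """Sketch of S401-S402: slide a fixed-length window over the samples;
    if the window is recognized as a straight-going primitive, continue
    with the next window; otherwise drop the sample at the window's start
    time and re-form the window from the following sample."""
    segments = []
    start = 0
    while start + window_len <= len(samples):
        window = samples[start:start + window_len]
        if is_straight(window):
            segments.append((start, start + window_len))
            start += window_len   # S401: continue with the next window
        else:
            start += 1            # S402: delete start-time data, re-divide
    return segments
```

For example, with a predicate that accepts all-zero windows, a run of non-straight data is skipped one sample at a time until a full straight window is found again.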
The above embodiment performs straight-going behavior primitive recognition on the driving data in a time window. When the driving data includes the lateral distance from the target vehicle to the lane line and the driving angle between the target vehicle and the tangent of the lane line, the recognition can proceed as follows. In one embodiment, as shown in fig. 5, performing straight-going behavior primitive recognition on the driving data in a time window includes the following steps:
s501, acquiring a transverse distance corresponding to each time and a driving included angle corresponding to each time in a time window.
When the driving data in the time window is subjected to straight-ahead behavior primitive recognition, firstly, the transverse distance from the target vehicle to the lane line and the driving included angle between the target vehicle and the lane line tangent line corresponding to each moment in the time window need to be acquired.
During driving, the target vehicle acquires its driving data in real time through sensors and stores the data in a database, with each piece of driving data corresponding to one time. The lateral distance from the target vehicle to the lane line and the driving angle between the target vehicle and the tangent of the lane line at each time in the corresponding time window can therefore be obtained directly from the database.
When acquiring the lateral distance from the target vehicle to the lane line and the angle between the target vehicle and the tangent of the lane line, the lateral distance to the left lane line and the corresponding angle may be used; the driving angle represents the angle between the heading of the target vehicle and the tangential direction of the lane line.
And S502, determining a distance difference between the transverse distance of each moment in the time window and the transverse distance of the starting moment of the time window according to the transverse distance corresponding to each moment in the time window.
According to the lateral distance from the target vehicle to the lane line at each time in the time window, the distance difference between the lateral distance at each time and the lateral distance at the start time of the window is calculated, as expressed by formula (2):

Δd_i = d_i - d_j, i ∈ [j, k] (2)

where j and k are respectively the start time and the end time of the time window, i is any time within the window, d_i is the lateral distance from the target vehicle to the lane line at time i, d_j is the lateral distance from the target vehicle to the lane line at the start time of the window, and Δd_i is the distance difference between the lateral distance at time i and the lateral distance at the start time of the window.
And S503, if the absolute value of the distance difference value at each moment in the time window is smaller than or equal to the straight-going behavior distance threshold value, and the absolute value of the driving included angle at each moment in the time window is smaller than or equal to the straight-going behavior angle threshold value, determining that the driving data in the time window belongs to a straight-going behavior primitive.
If the absolute value of the distance difference at every time in the window is smaller than or equal to the straight-going behavior distance threshold, and the absolute value of the driving angle at every time in the window is smaller than or equal to the straight-going behavior angle threshold, then the heading deviation between each time in the window and the initial time of the window remains small, and the driving data in the window is determined to belong to a straight-going behavior primitive.
It should be noted that the straight-ahead behavior primitive in the present embodiment represents a dynamic driving behavior of the vehicle without lane change on a straight road and a curve.
Based on the above embodiment, the condition for satisfying the straight-going behavior primitive is:

|Δd_i| ≤ δ_d and |θ_i| ≤ δ_θ, for every time i in the window (3)

where θ_i is the angle between the target vehicle and the tangent of the lane line at time i in the window, δ_d is the straight-going behavior distance threshold in the straight-going behavior primitive, and δ_θ is the straight-going behavior angle threshold in the straight-going behavior primitive.

Optionally, δ_d = 0.3 m; δ_θ = [angle value garbled in the source].
according to the driving scene identification method, the transverse distance corresponding to each time in the time window and the driving included angle corresponding to each time are obtained, the distance difference between the transverse distance of each time in the time window and the transverse distance of the starting time of the time window is determined according to the transverse distance corresponding to each time in the time window, and if the absolute value of the distance difference of each time in the time window is smaller than or equal to the straight-going behavior distance threshold value and the absolute value of the driving included angle of each time in the time window is smaller than or equal to the straight-going behavior angle threshold value, it is determined that the driving data in the time window belongs to straight-going behavior primitive identification. The method determines the conditions meeting the straight-going behavior elements, and improves the accuracy of straight-going behavior element identification.
The following describes how to determine the driving behavior primitives of the target vehicle within the preset time period according to the left lateral distance from the target vehicle to the left lane line. In one embodiment, as shown in fig. 6, performing behavior primitive recognition on the driving data in each time window to obtain the driving behavior primitives of the target vehicle within the preset time period includes the following steps:
s601, sequentially obtaining the left transverse distance variable quantity of the target vehicle at each adjacent moment in a preset time period.
The driving data of the target vehicle in the preset time period includes the left transverse distance from the target vehicle to the left lane line at each moment in the preset time period, so that the left transverse distance variation from the target vehicle to the left lane line at each adjacent moment in the preset time period can be sequentially calculated, as shown in formula (4).
Δd_i = d_{i+1} - d_i (4)

where d_{i+1} is the left lateral distance from the target vehicle to the left lane line at time i+1, d_i is the left lateral distance from the target vehicle to the left lane line at time i, and Δd_i is the change in that left lateral distance between time i and time i+1.
Alternatively, the left lateral distance of the vehicle from the left lane line may be collected by establishing a rectangular coordinate system according to the left-hand rule with the position of the camera sensor on the vehicle as the origin of the coordinate system, as shown in fig. 7.
The adjacent times corresponding to each left lateral distance variation should satisfy that the time difference between them is smaller than the minimum time-difference judgment threshold; this ensures that the interval between adjacent times is not too long, which in turn ensures the accuracy of the scene recognition algorithm, as shown in formulas (5) and (6):

Δt_i = t_{i+1} - t_i (5)

Δt_i < ΔT_min (6)

where t_{i+1} denotes time i+1, t_i denotes time i, Δt_i is the time difference between time i+1 and time i, and ΔT_min denotes the minimum time-difference judgment threshold.
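Formulas (4) to (6) can be sketched together: compute the change in left lateral distance for each adjacent pair of samples, keeping only pairs whose time difference is below the minimum time-difference judgment threshold. The threshold value here is an assumed placeholder, since no numeric value for ΔT_min is given in the source:

```python
def left_distance_changes(times, left_distances, min_dt_threshold=0.5):
    """Formula (4): delta_d[i] = d[i+1] - d[i]; formulas (5)-(6): keep the
    pair only when the time difference between adjacent samples is below
    the minimum time-difference judgment threshold (assumed value)."""
    changes = []
    for i in range(len(times) - 1):
        dt = times[i + 1] - times[i]           # formula (5)
        if dt < min_dt_threshold:              # formula (6)
            changes.append((i, left_distances[i + 1] - left_distances[i]))
    return changes
```

Pairs separated by too long an interval are simply discarded rather than contributing a spurious distance change.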
And S602, determining the time period of the crossing behavior of the target vehicle according to the variable quantity of each left transverse distance.
According to the obtained left transverse distance variation between the adjacent moments, the time period of the crossing behavior of the target vehicle can be determined, and the crossing behavior represents the behavior of the target vehicle crossing the lane line.
In one embodiment, as shown in fig. 8, the determining the time period of the target vehicle having the crossing behavior according to the left lateral distance variation includes the following steps:
s801, obtaining left lateral distance variation with an absolute value larger than a lane width threshold value in each left lateral distance variation, and obtaining candidate left lateral distance variation.
Among the left lateral distance variations obtained in the above embodiments, those whose absolute value is greater than the lane width threshold are determined as candidate left lateral distance variations, as expressed by formula (7):

|Δd_i| > D (7)

where D is the lane width threshold. Optionally, the value of D may be determined according to the width of the lane; alternatively, D = 3 m.
S802, the earlier of the two adjacent times corresponding to each candidate left lateral distance variation is determined as a crossing time.
And each crossing moment corresponds to one crossing behavior of the target vehicle.
In the above embodiment, the earlier of the two adjacent times corresponding to a candidate left lateral distance variation is determined as the crossing time. For example, if the adjacent times corresponding to any candidate left lateral distance variation are t_{i+1} and t_i, then t_i is determined as the crossing time, representing the time at which the target vehicle crosses the lane line.
It should be noted that there may be a plurality of candidate left lateral distance variations, and the target vehicle may exhibit crossing behavior a plurality of times; that is, there may be more than one crossing time.
And S803, determining a crossing time period for the target vehicle to generate crossing behaviors each time according to each crossing time.
According to each crossing time, i.e. each time at which the target vehicle crosses the lane line, the time period of each crossing behavior can be determined, namely from the time the target vehicle starts to cross the lane line to the time it finishes crossing.
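Steps S801 and S802 with condition (7) can be sketched as follows: variations whose absolute value exceeds the lane width threshold D (optionally 3 m) are the candidates, and the earlier of the two adjacent instants is recorded as the crossing time:

```python
def crossing_times(times, left_distances, lane_width_threshold=3.0):
    """Condition (7): a change in left lateral distance between adjacent
    instants whose absolute value exceeds D marks a candidate crossing;
    the earlier adjacent instant t_i is recorded as the crossing time."""
    result = []
    for i in range(len(times) - 1):
        if abs(left_distances[i + 1] - left_distances[i]) > lane_width_threshold:
            result.append(times[i])
    return result
```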
S603, acquiring the driving behavior primitives of the target vehicle within the crossing behavior time periods.
And further determining a driving behavior primitive in the crossing behavior time period based on the determined time period of the crossing behavior of the target vehicle in the preset time period.
The driving behavior primitive of the target vehicle within the time period of a crossing behavior may be determined according to a preset recognition algorithm: the driving data of the target vehicle within the crossing behavior time period is used as the input of the recognition algorithm, and running the algorithm outputs the driving behavior primitive for that period.
In the above driving scene recognition method, the changes in the left lateral distance of the target vehicle between adjacent times within the preset time period are acquired in turn, the time periods in which the target vehicle exhibits crossing behavior are determined from these changes, and the driving behavior primitives within those periods are then acquired. By determining the crossing behavior time periods from the left lateral distance changes and then determining the driving behavior primitives within them, the method realizes automatic recognition of the driving behavior primitives of the target vehicle within the preset time period.
Further, the driving data also includes the left driving angle between the target vehicle and the tangential direction of the left lane line. The following embodiment describes in detail how the crossing time period of each crossing behavior is further determined according to this left driving angle. In one embodiment, as shown in fig. 9, determining the crossing time period of each crossing behavior of the target vehicle according to each crossing time includes:
s901, for any crossing time, determining a time before the crossing time and meeting a preset condition as a start time of a crossing behavior corresponding to the crossing time, and determining a time after the crossing time and meeting the preset condition as an end time of the crossing behavior corresponding to the crossing time.
The preset condition is that the absolute value of the change in left lateral distance relative to the start time of the preset time period is smaller than or equal to the crossing behavior distance threshold, and the absolute value of the left driving angle is smaller than or equal to the crossing behavior angle threshold.
When determining the time period of a crossing behavior based on each crossing time obtained within the preset time period (the crossing time being the time at which the target vehicle crosses the lane line), the start time of the crossing behavior is determined by backtracking forward from the crossing time, and the end time is determined by iterating backward from the crossing time.
Specifically, backtracking proceeds forward from the crossing time until a time satisfying the preset condition is found; that time is determined as the start time of the crossing behavior corresponding to the crossing time. In other words, the time closest to the crossing time at which the absolute value of the change in left lateral distance relative to the start time of the preset time period is smaller than or equal to the crossing behavior distance threshold, and the absolute value of the left driving angle is smaller than or equal to the crossing behavior angle threshold, is determined as the start time of the crossing behavior.
Likewise, iteration proceeds backward from the crossing time until a time satisfying the preset condition is found; that time is determined as the end time of the crossing behavior corresponding to the crossing time, i.e. the time after the crossing time that is closest to it and satisfies the same distance and angle conditions.
The preset condition is shown in formula (8):

|d_i - d_j| ≤ ε_d and |θ_i| ≤ ε_θ (8)

where both conditions must be satisfied simultaneously, d_j is the left lateral distance at the start time of the preset time period, d_i is the left lateral distance at time i within the preset time period (i.e. at the candidate start or end time of the crossing behavior), θ_i is the left driving angle of the target vehicle at time i, and ε_d and ε_θ are respectively the crossing behavior distance threshold and the crossing behavior angle threshold in the driving behavior primitive.

Optionally, ε_d = 0.1 m; ε_θ = [angle value garbled in the source].
and S902, determining the time period between the starting time and the ending time as the time period when the target vehicle has the crossing behavior.
And determining the time period between the starting time and the ending time of the crossing behavior corresponding to each crossing time as the time period of the crossing behavior corresponding to each crossing time of the target vehicle.
In the above driving scene recognition method, for any crossing time, the time before it that satisfies the preset condition is determined as the start time of the corresponding crossing behavior, and the time after it that satisfies the preset condition is determined as the end time. The preset condition is that the absolute value of the change in left lateral distance relative to the start time of the preset time period is smaller than or equal to the crossing behavior distance threshold, and the absolute value of the left driving angle is smaller than or equal to the crossing behavior angle threshold; the period between the start and end times is determined as the time period of the crossing behavior. By further specifying how the crossing behavior time period is determined, the method improves the accuracy of driving behavior primitive recognition and thus ensures the accuracy of driving scene recognition for the target vehicle within the preset time period.
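Step S901 with condition (8) can be sketched as follows. `cross_idx` is the sample index of the crossing time; the 0.1 m distance threshold is the value given above, while the 5-degree angle threshold is an assumed placeholder (the source's angle value is garbled):

```python
def crossing_period(times, left_distances, left_angles, cross_idx,
                    dist_thresh=0.1, angle_thresh=5.0):
    """Walk backwards from the crossing index to the nearest earlier
    instant satisfying condition (8) for the start time, and forwards
    for the end time. Condition (8) is evaluated relative to the left
    lateral distance at the start of the preset time period."""
    d0 = left_distances[0]          # left lateral distance at the period start

    def satisfies(i):               # condition (8)
        return (abs(left_distances[i] - d0) <= dist_thresh
                and abs(left_angles[i]) <= angle_thresh)

    start = cross_idx
    while start > 0 and not satisfies(start):
        start -= 1                  # backtrack forward in time order
    end = cross_idx + 1
    while end < len(times) - 1 and not satisfies(end):
        end += 1                    # iterate backward (later in time)
    return times[start], times[end]
```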
Further, the driving behavior primitives include a left crossing behavior primitive and a right crossing behavior primitive, and in one embodiment, as shown in fig. 10, acquiring the driving behavior primitives of the target vehicle within the time period of the crossing behavior includes the following steps:
s1001, a reference left lateral distance variation amount of the target vehicle in the time period of the crossing behavior is acquired.
The reference left lateral distance variation amount of the target vehicle within the time period of the crossing behavior represents the left lateral distance variation amount at the corresponding crossing time.
The left lateral distance variation corresponding to the crossing time is acquired directly from the database and determined as the reference left lateral distance variation.
And S1002, if the reference left transverse distance variation is larger than the lane width threshold, determining that the driving behavior primitive of the target vehicle in the crossing behavior time period belongs to the right crossing behavior primitive.
And S1003, if the reference left transverse distance variation is smaller than the lane width threshold, determining that the driving behavior primitive of the target vehicle in the crossing behavior time period belongs to a left crossing behavior primitive.
The reference left lateral distance variation corresponds to the difference between the left lateral distance from the target vehicle to the left lane line at the time immediately after the crossing time within the crossing behavior time period and that at the crossing time itself.
The left lateral distance is collected in a rectangular coordinate system established according to the left-hand rule with the position of the camera sensor as the origin; the left lateral distance from the target vehicle to the left lane line is the distance from the vehicle to the left lane line of the lane it occupies.
For example, if the lane width is 3.5 m, then when the target vehicle exhibits a left crossing behavior, its left lateral distance changes from 0 to -3.5 and the reference left lateral distance variation is -3.5; when the target vehicle exhibits a right crossing behavior, its left lateral distance changes from -3.5 to 0 and the reference left lateral distance variation is 3.5.
Therefore, if the reference left lateral distance variation is larger than the lane width threshold, that is, a positive value, it is determined that the driving behavior primitive of the target vehicle within the time period of the crossing behavior belongs to the right crossing behavior primitive; if the reference left lateral distance variation is smaller than the lane width threshold value, namely a negative value, it is determined that the driving behavior primitive of the target vehicle within the crossing behavior time period belongs to the left crossing behavior primitive.
It should be noted that, in this embodiment of the application, the left lateral distance from the target vehicle to the left lane line is taken as a negative value, so that a reference left lateral distance variation greater than the lane width threshold corresponds to a right crossing behavior primitive and one smaller than the threshold corresponds to a left crossing behavior primitive. In practical applications, however, the embodiments of the present application do not limit the sign convention.
In the driving scene recognition method, a reference left transverse distance variation of the target vehicle in the time period of the crossing behavior is obtained, if the reference left transverse distance variation is larger than a lane width threshold, it is determined that a driving behavior primitive of the target vehicle in the time period of the crossing behavior belongs to a right crossing behavior primitive, and if the reference left transverse distance variation is smaller than the lane width threshold, it is determined that the driving behavior primitive of the target vehicle in the time period of the crossing behavior belongs to a left crossing behavior primitive. The method determines whether the driving behavior primitive of the target vehicle in the crossing behavior time period belongs to the left crossing behavior primitive or the right crossing behavior primitive, and ensures the diversity of the driving scene of the target vehicle.
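Under this convention, steps S1002 and S1003 reduce to a sign test, since a candidate variation already satisfies |Δd| > D; a small sketch:

```python
def classify_crossing(reference_change):
    """Under the embodiment's convention (left lateral distance taken as
    a negative value), a positive reference change indicates a right
    crossing behavior primitive and a negative one a left crossing
    behavior primitive. Assumes the absolute value of reference_change
    already exceeds the lane width threshold."""
    return "right_crossing" if reference_change > 0 else "left_crossing"
```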
Based on the driving behavior primitives of the target vehicle obtained above within the preset time period, which include the straight-going behavior primitive, the left crossing behavior primitive and the right crossing behavior primitive, in one embodiment, as shown in fig. 11, the driving scene of the target vehicle within the preset time period is determined according to these driving behavior primitives and a preset driving scene model, including the following steps:
and S1101, splicing the driving behavior primitives in the preset time period according to the time sequence of the driving behavior primitives in the preset time period to obtain a plurality of candidate spliced primitive groups of the target vehicle in the preset time period.
And splicing the driving behavior elements according to the time sequence of the driving behavior elements in a preset time period, and determining the spliced driving behavior elements as candidate splicing element groups.
In one embodiment, splicing processing is carried out on each driving behavior primitive in a preset time period, and the driving behavior primitives with time correlation of a target vehicle in the preset time period are spliced; temporal associations represent temporal adjacencies or temporal overlaps.
Temporal association includes temporal adjacency or temporal overlap. Specifically, a primitive-combination time-difference threshold ΔT_c is defined. For any two temporally adjacent driving behavior primitives within the preset time period, if, in chronological order, the absolute value of the time difference between the end time of the former primitive and the start time of the latter primitive is smaller than ΔT_c, that is,

|t_end - t_start| < ΔT_c

then the adjacent driving behavior primitives are determined to have a temporal association, and the two primitives are spliced in combination order, where ΔT_c is the primitive-combination time-difference threshold that satisfies the splicing condition.
Therefore, the driving behavior primitives with time correlation in all the driving behavior primitives in the preset time period are spliced, and a plurality of candidate spliced primitive groups can be obtained.
For example, if the driving behavior primitives of the target vehicle within the preset time period include straight-going behavior primitive No. 1, left crossing behavior primitive No. 1 and straight-going behavior primitive No. 2, with a temporal association between the first two and between the last two, then the three primitives are spliced in time order to obtain a candidate spliced primitive group.
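A sketch of the temporal-association splicing of step S1101; primitives are represented here as hypothetical (label, start_time, end_time) tuples in time order, and the 1.0 s combination threshold is an assumed value, since no numeric ΔT_c is given in the source:

```python
def splice_primitives(primitives, combine_threshold=1.0):
    """Splice adjacent primitives whose gap |start(next) - end(prev)| is
    below the primitive-combination time-difference threshold; each
    resulting group is a candidate spliced primitive group."""
    groups = [[primitives[0]]]
    for prev, nxt in zip(primitives, primitives[1:]):
        if abs(nxt[1] - prev[2]) < combine_threshold:
            groups[-1].append(nxt)      # temporally associated: same group
        else:
            groups.append([nxt])        # gap too large: start a new group
    return groups
```

With the three primitives of the example above, all adjacent gaps fall below the threshold, so a single candidate spliced primitive group results.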
In another embodiment, the driving behavior primitives with time correlation of the target vehicle within the preset time period are spliced with the candidate behavior primitives with time correlation and driving behavior correlation in the peripheral vehicles of the target vehicle.
Optionally, when the driving data of the target vehicle is obtained, the driving data further includes driving data of vehicles around the target vehicle, as shown in fig. 12, fig. 12 is a position distribution diagram of the target vehicle and its surrounding vehicles, that is, numbers 1 to 8 in fig. 12 are surrounding vehicles.
Therefore, when obtaining the driving behavior primitives of the target vehicle in the preset time period, the driving behavior primitives of the surrounding vehicles in the same period are also determined. When splicing the temporally associated primitives of the target vehicle, candidate behavior primitives of the surrounding vehicles are spliced in as well, where a candidate behavior primitive is a primitive of a surrounding vehicle that has both temporal association and driving behavior association with a primitive of the target vehicle.
A surrounding-vehicle primitive with driving behavior association is one that can influence the driving behavior of the target vehicle; for example, for the target vehicle, the driving behaviors of surrounding vehicles No. 1, No. 2, No. 3, No. 4 and No. 5 need to be considered. The description of temporal association is the same as in the above embodiments and is not repeated here.
For example, with continued reference to fig. 12, consider whether there is a vehicle directly in front of the target vehicle and its driving state, i.e., the driving behavior primitive of vehicle No. 2, which is in the same lane as and ahead of the target vehicle. If the target vehicle has a straight-going primitive (denoted straight-going primitive No. 1) in the preset time period, and among the primitives of vehicle No. 2 there is a candidate behavior primitive that is temporally associated with straight-going primitive No. 1 and is itself a straight-going primitive, then straight-going primitive No. 1 and that candidate straight-going primitive are spliced to obtain a candidate spliced primitive group.
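A minimal sketch of pairing a target-vehicle primitive with temporally associated primitives of a surrounding vehicle (such as vehicle No. 2 ahead); the function names and the interval-overlap test used here for temporal association are illustrative assumptions.

```python
# Primitives are (label, t_start, t_end) tuples (assumed layout).

def overlaps(a, b):
    """Temporal association sketched as interval overlap (or touching)."""
    return a[1] <= b[2] and b[1] <= a[2]

def candidate_group(target_prim, surrounding_prims, wanted="straight"):
    """Pair a target-vehicle primitive with temporally associated primitives
    of a surrounding vehicle of the wanted behavior type."""
    cands = [p for p in surrounding_prims
             if p[0] == wanted and overlaps(target_prim, p)]
    return [target_prim] + cands

ego = ("straight", 0.0, 10.0)                      # target vehicle, straight No. 1
veh2 = [("straight", 1.0, 9.0), ("left_cross", 9.5, 12.0)]  # vehicle No. 2
group = candidate_group(ego, veh2)
```

Only the straight-going primitive of vehicle No. 2 overlapping the target's straight-going primitive joins the candidate group; the later left-crossing primitive is excluded.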
And S1102, matching each candidate splicing primitive group with a driving behavior primitive group in the driving scene model respectively.
Since the driving scene model contains the correspondence between driving behavior primitive groups and driving scenes, each obtained candidate spliced primitive group can be matched against the driving behavior primitive groups in the driving scene model, to detect whether the model contains a driving behavior primitive group identical to the candidate group.
S1103, determining the driving scene corresponding to the driving behavior primitive group successfully matched in the driving scene model as the driving scene corresponding to the candidate splicing primitive group; and the driving scene of each candidate splicing primitive group is the driving scene of the target vehicle in a preset time period.
If a candidate spliced primitive group is successfully matched with a driving behavior primitive group in the driving scene model, the driving scene corresponding to the matched primitive group is determined as the driving scene of that candidate group, i.e., a driving scene of the target vehicle within the preset time period.
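The matching in S1102 and S1103 can be sketched as a simple lookup; the scene-model contents and the tuple-key layout here are illustrative assumptions, not the patent's full model.

```python
# Hypothetical scene model: a primitive-group key maps to a scene name.
# Each key is a tuple of (actor, primitive) pairs in combination order.
SCENE_MODEL = {
    (("ego", "straight"), ("ego", "left_cross"), ("ego", "straight")): "left_lane_change",
    (("ego", "straight"), ("ego", "right_cross"), ("ego", "straight")): "right_lane_change",
}

def match_scene(candidate_group, model=SCENE_MODEL):
    """Return the driving scene whose primitive group equals the candidate
    spliced primitive group, or None if no group in the model matches."""
    return model.get(tuple(candidate_group))

cand = (("ego", "straight"), ("ego", "left_cross"), ("ego", "straight"))
scene = match_scene(cand)
```

A candidate group with no identical primitive group in the model yields no scene, mirroring the "successfully matched" condition in the text.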
According to the above driving scene identification method, the driving behavior primitives within the preset time period are spliced according to their chronological order to obtain a plurality of candidate spliced primitive groups of the target vehicle within the preset time period. Each candidate spliced primitive group is then matched against the driving behavior primitive groups in the driving scene model, and the driving scene corresponding to a successfully matched primitive group is determined as the driving scene of that candidate group; the driving scene of each candidate spliced primitive group is a driving scene of the target vehicle within the preset time period. By splicing the driving behavior primitives of the target vehicle and then determining, based on the preset driving scene model, the driving scene corresponding to each spliced candidate group, the method realizes automatic identification of various types of driving scenes.
The following embodiment describes the construction process of the driving scene model in detail. In one embodiment, as shown in fig. 13, the construction process of the driving scene model includes the following steps:
S1301, obtaining a plurality of sample driving scenes; the driving scenes include scenes in which the sample vehicle to which a sample driving scene belongs has temporal association and driving behavior association with the surrounding vehicles of the sample vehicle.
Before a driving scene model is constructed, a plurality of sample driving scenes are required to be obtained, wherein the sample driving scenes comprise a free-running scene, a car following scene, a lane changing scene, an overtaking scene, a vehicle cut-in scene and a vehicle cut-out scene.
The driving scenes include scenes in which the vehicle to which the sample driving scene belongs has temporal association and driving behavior association with surrounding vehicles. For example, the car-following scene is a scene in which the speed and longitudinal position of the vehicle are influenced by a vehicle ahead, i.e., in the car-following scene a vehicle drives straight ahead in the lane to which the sample vehicle belongs. The vehicle cut-in scene indicates that, during straight-ahead driving of the sample vehicle, a surrounding vehicle enters the lane to which the sample vehicle belongs, in front of the sample vehicle; the vehicle cut-out scene indicates that a surrounding vehicle in the same lane as, and in front of, the sample vehicle switches to another lane.
And S1302, respectively performing driving behavior primitive analysis on each driving scene, and determining a driving behavior primitive group corresponding to each driving scene.
The common behavior characteristics of the driving scenes are analyzed and mined to determine the driving behavior primitives, namely the straight-going, left-crossing and right-crossing behavior primitives. Each driving scene is then analyzed in terms of these primitives to determine the driving behavior primitives corresponding to it, and the primitives corresponding to each driving scene are combined and spliced to determine the driving behavior primitive group corresponding to each driving scene.
Optionally, different driving scenarios correspond to different driving behavior primitive groups, e.g., a free-driving scenario is mainly composed of straight-going behavior primitives of sample vehicles.
The car following scene mainly comprises straight-going behavior elements of sample vehicles and straight-going behavior elements of surrounding vehicles of the sample vehicles, wherein the transverse positions of the surrounding vehicles are lanes to which the sample vehicles belong, and the longitudinal positions of the surrounding vehicles are relative to the front of the sample vehicles.
The lane changing scene comprises a left lane changing scene and a right lane changing scene, wherein the left lane changing scene mainly comprises a straight-going behavior element of a sample vehicle, a left crossing behavior element of the sample vehicle and a straight-going behavior element of the sample vehicle; the right lane changing scene mainly comprises a straight-going behavior element of a sample vehicle, a right crossing behavior element of the sample vehicle and a straight-going behavior element of the sample vehicle.
The overtaking scenes comprise a left overtaking scene and a right overtaking scene.
The left overtaking scene mainly consists of five primitives of the sample vehicle: a straight-going primitive, a left-crossing primitive, a straight-going primitive, a right-crossing primitive and a straight-going primitive. The first straight-going primitive co-occurs with a straight-going primitive of a surrounding vehicle located in the sample vehicle's lane, ahead of the sample vehicle; the second straight-going primitive co-occurs with a straight-going primitive of a surrounding vehicle located in the right-adjacent lane, level with the sample vehicle; and the third straight-going primitive co-occurs with a straight-going primitive of a surrounding vehicle located in the sample vehicle's lane, behind the sample vehicle.
The right overtaking scene mainly consists of five primitives of the sample vehicle: a straight-going primitive, a right-crossing primitive, a straight-going primitive, a left-crossing primitive and a straight-going primitive. The first straight-going primitive co-occurs with a straight-going primitive of a surrounding vehicle located in the sample vehicle's lane, ahead of the sample vehicle; the second straight-going primitive co-occurs with a straight-going primitive of a surrounding vehicle located in the left-adjacent lane, level with the sample vehicle; and the third straight-going primitive co-occurs with a straight-going primitive of a surrounding vehicle located in the sample vehicle's lane, behind the sample vehicle.
The vehicle cut-out scene includes a left cut-out scene and a right cut-out scene.
The left cut-out scene mainly consists of a straight-going primitive of the sample vehicle, a left-crossing primitive of a surrounding vehicle of the sample vehicle and a straight-going primitive of that surrounding vehicle. Before the left-crossing primitive, the surrounding vehicle's lateral position is the lane in which the sample vehicle is located and its longitudinal position is ahead of the sample vehicle; before the subsequent straight-going primitive, its lateral position is the left-adjacent lane of the lane to which the sample vehicle belongs and its longitudinal position is level with the sample vehicle.
The right cut-out scene mainly consists of a straight-going primitive of the sample vehicle, a right-crossing primitive of a surrounding vehicle of the sample vehicle and a straight-going primitive of that surrounding vehicle. Before the right-crossing primitive, the surrounding vehicle's lateral position is the lane in which the sample vehicle is located and its longitudinal position is ahead of the sample vehicle; before the subsequent straight-going primitive, its lateral position is the right-adjacent lane of the lane to which the sample vehicle belongs and its longitudinal position is level with the sample vehicle.
The vehicle cut-in scenario includes a left cut-in scenario and a right cut-in scenario.
The left cut-in scene mainly consists of a straight-going primitive of the sample vehicle, a right-crossing primitive of a surrounding vehicle of the sample vehicle and a straight-going primitive of that surrounding vehicle. Before the right-crossing primitive, the surrounding vehicle's lateral position is the left-adjacent lane of the lane to which the sample vehicle belongs and its longitudinal position is level with the sample vehicle; before the subsequent straight-going primitive, its lateral position is the lane in which the sample vehicle is located and its longitudinal position is ahead of the sample vehicle.
The right cut-in scene mainly consists of a straight-going primitive of the sample vehicle, a left-crossing primitive of a surrounding vehicle of the sample vehicle and a straight-going primitive of that surrounding vehicle. Before the left-crossing primitive, the surrounding vehicle's lateral position is the right-adjacent lane of the lane to which the sample vehicle belongs and its longitudinal position is level with the sample vehicle; before the subsequent straight-going primitive, its lateral position is the lane in which the sample vehicle is located and its longitudinal position is ahead of the sample vehicle.
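The primitive groups enumerated above can be condensed into an illustrative table; labels such as `"front"`, `"left"` and `"right"` (surrounding vehicles named by relative position) and the exact group contents are simplified assumptions, and the overtaking scenes, which involve five ego primitives plus co-occurring surrounding primitives, are omitted for brevity.

```python
# Illustrative condensation of the scene-to-primitive-group correspondence.
# "ego" = sample vehicle; "front"/"left"/"right" = surrounding vehicles ahead,
# in the left-adjacent lane, and in the right-adjacent lane (assumed labels).
SCENE_PRIMITIVES = {
    "free_driving":      [("ego", "straight")],
    "car_following":     [("ego", "straight"), ("front", "straight")],
    "left_lane_change":  [("ego", "straight"), ("ego", "left_cross"), ("ego", "straight")],
    "right_lane_change": [("ego", "straight"), ("ego", "right_cross"), ("ego", "straight")],
    "left_cut_out":      [("ego", "straight"), ("front", "left_cross"), ("front", "straight")],
    "right_cut_out":     [("ego", "straight"), ("front", "right_cross"), ("front", "straight")],
    "left_cut_in":       [("ego", "straight"), ("left", "right_cross"), ("left", "straight")],
    "right_cut_in":      [("ego", "straight"), ("right", "left_cross"), ("right", "straight")],
}
```

Note the symmetry: in a cut-in the surrounding vehicle crosses toward the sample vehicle's lane (a left-adjacent vehicle performs a right crossing), whereas in a cut-out the leading vehicle crosses away from it.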
And S1303, determining the corresponding relation between each driving behavior primitive group and the corresponding driving scene as a driving scene model.
Optionally, the driving scene can be defined by three key elements of vehicle attributes, vehicle positions and driving behavior primitives, and mathematical and symbolic theoretical models of the driving scene and the driving behavior primitives are established by combining the definitions; specifically, the vehicle attributes can be divided into a sample vehicle and a surrounding vehicle of the sample vehicle, the vehicle position can be determined according to the relative positions of the sample vehicle and the surrounding vehicle and the lane, and the driving behavior primitive comprises three basic behavior processes of a straight behavior primitive, a left crossing behavior primitive and a right crossing behavior primitive.
The correspondence between each driving behavior primitive combination and its driving scene is determined as the driving scene model, which can be represented by formula (9):

$$Scene = \mathop{\cup/\cap}\limits_{i} \big( attr_i,\; (long_i,\; lateral_i),\; flag_i,\; \{P_{i,k}\}_{k=1}^{n_i} \big) \tag{9}$$

where $Scene$ represents a driving scene; $\cup$ represents a union and $\cap$ an intersection; the vehicle attribute $attr \in \{E, S\}$, with $E$ and $S$ respectively denoting the sample vehicle and a surrounding vehicle of the sample vehicle; $long$ takes the values $-1$, $0$ and $1$, representing three longitudinal relative positions: behind the sample vehicle, level with the sample vehicle and ahead of the sample vehicle; $lateral$ takes the values $-1$, $0$ and $1$, representing three lateral relative positions: the left-adjacent lane of the sample vehicle's lane, the same lane as the sample vehicle and the right-adjacent lane of the sample vehicle's lane; the pair $(long, lateral)$ thus describes the longitudinal and lateral position relative to the sample vehicle; the driving behavior primitive $P \in \{P_{straight}, P_{left}, P_{right}\}$, representing the three basic behaviors of going straight, crossing left and crossing right; $flag$ takes the value $0$ or $1$, indicating respectively that no vehicle or a sample/surrounding vehicle exists at the position; $P_{i,k}$ represents the $k$-th driving behavior primitive of the surrounding vehicle with attribute $S_i$ in the scene, and $n_i$ represents the total number of driving behavior primitives of that surrounding vehicle.
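The (vehicle attribute, vehicle position, driving behavior primitive, existence flag) element described for formula (9) can be rendered as a small data structure; the class and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SceneElement:
    """One element of a driving scene, per the key elements of formula (9)."""
    attr: str         # "ego" (sample vehicle) or "surrounding"
    long: int         # -1 behind, 0 level with, 1 ahead of the sample vehicle
    lateral: int      # -1 left-adjacent lane, 0 same lane, 1 right-adjacent lane
    primitives: tuple # ordered driving behavior primitives, e.g. ("straight",)
    flag: int = 1     # 0: no vehicle at this position, 1: vehicle present

# The sample vehicle's own primitive sequence in a left lane change.
ego_llc = SceneElement("ego", 0, 0, ("straight", "left_cross", "straight"))
```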
In one embodiment, a left lane-change scene is taken as an example and analyzed according to the three key elements of vehicle attribute, vehicle position and driving behavior primitive. In the left lane-change scene, the relevant vehicle attribute is mainly the sample vehicle $E$; the vehicle position is the sample vehicle's own relative position, $long = 0$, $lateral = 0$; and the driving behavior primitives consist of a straight-going primitive $P_{straight}$, a left-crossing primitive $P_{left}$ and a straight-going primitive $P_{straight}$. Since the left lane-change scene can be determined from the driving behavior of the sample vehicle alone, whether other positions contain surrounding vehicles does not affect the extraction result, and the distribution of surrounding vehicles at different positions can be handled by the union. In this embodiment, assuming there are no surrounding vehicles around the sample vehicle, the left lane-change scene can be expressed as:

$$Scene_{LLC} = \big( E,\; (0,\, 0),\; 1,\; \{P_{straight},\; P_{left},\; P_{straight}\} \big) \tag{10}$$
In practical application, after the driving behavior primitives of the target vehicle are extracted, they are combined and spliced into candidate spliced primitive groups; if a candidate group satisfies the driving scene model of the left lane-change scene, it is determined to be a left lane-change scene.
The left lane-change scene comprises three temporally continuous driving behavior primitives: a straight-going primitive of the target vehicle, a left-crossing primitive of the target vehicle and a straight-going primitive of the target vehicle. When the time difference between the end time of one primitive and the start time of the next satisfies $\left| t_{start}^{\,i+1} - t_{end}^{\,i} \right| < \Delta t_{comb}$, the adjacent driving behavior primitives are spliced in the primitive combination order. Optionally, $\Delta t_{comb} = 2\,\mathrm{s}$.
In the above driving scene identification method, a plurality of sample driving scenes are obtained, including scenes in which the sample vehicle has temporal association and driving behavior association with its surrounding vehicles; driving behavior primitive analysis is performed on each scene to determine the corresponding driving behavior primitive group; and the correspondence between each primitive group and its driving scene is determined as the driving scene model. By modeling the driving scenes and determining the relation between driving scenes and driving behavior primitives, the method realizes automatic identification and extraction of typical driving scenes that are high-coverage, general and easy to use, oriented to the function development and testing of autonomous vehicles.
In one embodiment, the application provides a driving scene identification method based on driving behavior primitives: primitives with temporal association and driving behavior association are spliced, and, with the driving scene model, automatic identification and extraction on large-scale natural driving data can be realized. On a collected partial natural driving data set, the extraction accuracy of the method for the free-driving, car-following, lane-change, overtaking, vehicle cut-in and vehicle cut-out scenes reaches over 90%. The automatic identification and extraction process can be realized in two typical modes. The first mode is to download the typical driving scene extraction algorithm to a real-time processor on the autonomous vehicle to realize driving scene recognition for the target vehicle.
Optionally, the method may also perform real-time identification and extraction on an NI PXI real-time simulator or a virtual simulation platform, for example in an integrated environment of Virtual Test Drive (VTD) and Matlab/Simulink; offline identification and extraction may likewise be performed on data collected by an acquisition device or in the simulation environment.
In the embodiments of the application, driving scenes can be identified and extracted through driving behavior primitives. Specifically, through several key means, namely typical driving scene analysis, driving behavior primitive modeling, driving behavior primitive combination and driving scene extraction, automatic identification and extraction of high-coverage, general and easy-to-use typical driving scenes oriented to autonomous-vehicle function development and testing can be realized.
In one embodiment, as shown in fig. 14, the embodiment includes:
S1401, analyzing the common behavior characteristics of driving scenes under natural driving conditions to obtain driving behavior primitives;
the driving scene at least comprises free running, car following, lane changing, overtaking, vehicle cut-in and vehicle cut-out, and the driving behavior elements comprise straight behavior elements, left cross-lane behavior elements and right cross-lane behavior elements.
And S1402, defining driving scenes according to the driving behavior primitives.
And S1403, determining the combination relation of driving behavior elements corresponding to the driving scenes according to the driving characteristics and the definitions of the driving scenes of different types, and establishing a mathematical and symbolic theoretical model of a typical driving scene and the driving behavior elements, namely a driving scene model.
And S1404, modeling each driving behavior element according to the characteristics of the driving behavior element to obtain each driving behavior element model.
The modeling process of the generalized straight-ahead driving behavior element comprises the following steps:
Straight-going behavior primitives in the natural driving data are judged with a preset time window. Starting from the first time window of the natural driving data, it is judged whether, at every moment in the window, the absolute value of the difference between the lane-line distance at that moment and the lane-line distance at the window's start is less than or equal to the straight-behavior distance threshold, and whether the angle between the vehicle heading and the lane-line tangent at that moment is less than or equal to the straight-behavior angle threshold. If both conditions hold, the window is judged to satisfy a straight-going primitive and the data of the next window is judged; otherwise, new time windows are constructed and judged in turn, until all the natural driving data have been judged.
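The per-window straight-behavior test described above can be sketched as follows; parameter names and the threshold values in the example are illustrative assumptions.

```python
def window_is_straight(lane_dists, headings, d_thresh, a_thresh):
    """Straight-behavior test for one time window: every sample's lane-line
    distance stays within d_thresh of the window's first sample, and every
    heading-to-lane-tangent angle stays within a_thresh."""
    d0 = lane_dists[0]
    return all(abs(d - d0) <= d_thresh for d in lane_dists) and \
           all(abs(a) <= a_thresh for a in headings)

# Lane-line distances in metres, heading angles in radians (assumed units).
ok = window_is_straight([1.50, 1.52, 1.48], [0.01, -0.02, 0.0],
                        d_thresh=0.2, a_thresh=0.05)
```

A window whose lateral drift or heading angle exceeds either threshold fails the test, which triggers the re-windowing described in the text.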
The modeling process of the left-crossing and right-crossing behavior primitives is as follows:
First, it is judged whether the time difference between adjacent moments is smaller than the minimum time-difference threshold and whether the absolute value of the change in distance to the lane line is larger than the lane-width threshold; if so, the moment at which the vehicle crosses the lane line is determined.
Second, backtracking from the crossing moment determines the start time of the vehicle's lane-crossing behavior: the absolute value of the change in lane-line distance between the start time of the lane-crossing behavior and the start time of the natural driving data is less than or equal to the crossing-behavior distance threshold, and the absolute value of the angle between the vehicle heading at that start time and the lane-line tangent is less than or equal to the crossing-behavior angle threshold.
Then, iterating forward in time from the crossing moment determines the end time of the lane-line crossing: the distance difference between the end time and the start time of the natural driving data must be less than or equal to the crossing-behavior distance threshold, and the angle with the lane-line tangent at the end time must be less than or equal to the crossing-behavior angle threshold.
Finally, if the change in distance to the lane line is greater than the lane-width threshold, the period from the start time to the end time of the lane-crossing behavior is judged to be a right-crossing primitive; otherwise, it is judged to be a left-crossing primitive.
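A simplified sketch of locating the crossing moment and classifying its direction; it covers only the jump-detection and left/right classification steps (the backtracking of the start and end times is omitted), and the sign convention for the lane-line distance is an assumption.

```python
def find_crossing(times, lane_dists, lane_w):
    """Locate the crossing moment: the earlier of two adjacent samples whose
    lane-line distance changes by more than the lane-width threshold.
    The sign of the change classifies the direction (assumed convention:
    a positive jump means a right crossing)."""
    for i in range(1, len(times)):
        delta = lane_dists[i] - lane_dists[i - 1]
        if abs(delta) > lane_w:
            direction = "right_cross" if delta > 0 else "left_cross"
            return times[i - 1], direction
    return None

# Distance-to-lane-line trace with a large negative jump between t=1 and t=2.
res = find_crossing([0, 1, 2, 3], [1.4, 1.5, -2.0, -2.1], lane_w=3.0)
```

The jump of about -3.5 m exceeds the 3 m lane-width threshold, so the crossing moment is t=1 and the behavior is classified as a left crossing under the assumed sign convention.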
S1405, according to the driving behavior primitive model and the driving scene model, recognizing and extracting the driving data of the vehicle, and determining the driving scene of the vehicle.
For specific limitations of the driving scenario recognition method provided in this embodiment, reference may be made to the step limitations of each embodiment in the driving scenario recognition method, which is not described herein again.
It should be understood that, although the steps in the flowcharts of the above embodiments are displayed in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the steps need not be performed strictly in the order shown and may be performed in other orders. Moreover, at least some of the steps in those flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments; their execution order is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Based on the same inventive concept, the embodiment of the application also provides a driving scene recognition device for realizing the driving scene recognition method. The implementation scheme for solving the problem provided by the device is similar to the implementation scheme recorded in the method, so specific limitations in one or more embodiments of the driving scene recognition device provided below can be referred to the limitations of the driving scene recognition method in the foregoing, and details are not repeated here.
In one embodiment, as shown in fig. 15, there is provided a driving scenario recognition apparatus 1500 including: an acquisition module 1501, a first determination module 1502, and a second determination module 1503, wherein:
the acquisition module 1501 is configured to acquire driving data of a target vehicle within a preset time period;
a first determining module 1502, configured to determine, according to the driving data, a driving behavior primitive of the target vehicle within a preset time period; the driving behavior primitives represent common behavior characteristics of the vehicle in the driving process of a plurality of driving scenes;
the second determination module 1503 is used for determining a driving scene of the target vehicle within a preset time period according to the driving behavior primitive of the target vehicle within the preset time period and the preset driving scene model; the driving scene model comprises the corresponding relation between the driving behavior primitive group and the driving scene.
In one embodiment, the first determination module 1502 includes:
the dividing unit is used for dividing the preset time period into a plurality of time windows according to the time sequence;
and the identification unit is used for performing behavior primitive identification on the driving data in each time window according to the time sequence of each time window to obtain the driving behavior primitive of the target vehicle in the preset time period.
In one embodiment, the identification unit comprises:
the first identification subunit is used for identifying the straight-going behavior primitive of the driving data in any time window, and if the driving data in the time window belongs to the straight-going behavior primitive, continuing to identify the straight-going behavior primitive of the driving data in the next time window of the time window;
and the second identification subunit is used for deleting the driving data at the starting moment of the time window, then subdividing the time window and identifying the straight-ahead behavior primitive of the driving data in the subdivided time window if the driving data in the time window does not belong to the straight-ahead behavior primitive.
In one embodiment, the first identification subunit comprises:
the first acquiring subunit is used for acquiring the transverse distance corresponding to each moment in the time window and the driving included angle corresponding to each moment;
the first determining subunit is used for determining a distance difference between the transverse distance of each moment in the time window and the transverse distance of the starting moment of the time window according to the transverse distance corresponding to each moment in the time window;
and the second determining subunit is used for determining that the driving data in the time window belongs to the straight-going behavior primitive if the absolute value of the distance difference at each moment in the time window is less than or equal to the straight-going behavior distance threshold and the absolute value of the driving included angle at each moment in the time window is less than or equal to the straight-going behavior angle threshold.
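A minimal sketch of this per-window check, assuming per-moment samples of lateral distance and driving included angle; the 0.3 m and 0.05 rad thresholds are illustrative placeholders, not thresholds stated in this disclosure:

```python
def is_straight_primitive(lateral_dists, headings,
                          dist_thresh=0.3, angle_thresh=0.05):
    """Straight-going check for one time window: every moment's lateral
    distance must stay within dist_thresh of the window's starting moment,
    and every moment's driving included angle must stay within angle_thresh.

    lateral_dists: lateral distance to the lane line at each moment
    headings: included angle to the lane-line tangent at each moment
    """
    start = lateral_dists[0]
    return (all(abs(d - start) <= dist_thresh for d in lateral_dists)
            and all(abs(a) <= angle_thresh for a in headings))
```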
In one embodiment, the identification unit comprises:
the second acquiring subunit is used for sequentially acquiring the left transverse distance variable quantity of the target vehicle at each adjacent moment in a preset time period;
the third determining subunit is used for determining the time period in which the crossing behavior of the target vehicle occurs according to each left transverse distance variation;
and the third acquisition subunit is used for acquiring the driving behavior primitives of the target vehicle in the time period of the crossing behavior.
In one embodiment, the third determining subunit includes:
the fourth determining subunit is configured to obtain a left lateral distance variation of which an absolute value is greater than a lane width threshold in each left lateral distance variation, and obtain a candidate left lateral distance variation;
a fifth determining subunit, configured to determine, as crossing moments, the earlier moment of each pair of adjacent moments corresponding to a candidate left lateral distance variation; each crossing moment corresponds to one occurrence of a crossing behavior by the target vehicle;
and the sixth determining subunit is used for determining the crossing time period of the target vehicle, which is subjected to the crossing behavior each time, according to each crossing time.
In one embodiment, the sixth determining subunit includes:
a seventh determining subunit, configured to determine, for any crossing moment, a moment that satisfies a preset condition before the crossing moment as the start moment of the crossing behavior corresponding to the crossing moment, and a moment that satisfies the preset condition after the crossing moment as the end moment of the crossing behavior corresponding to the crossing moment; the preset condition is that the absolute value of the left transverse distance variation relative to the starting moment of the preset time period is less than or equal to the crossing behavior distance threshold, and the absolute value of the left driving included angle is less than or equal to the crossing behavior angle threshold;
and the eighth determining subunit is used for determining the time period between the starting time and the ending time as the time period when the target vehicle has the crossing behavior.
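The crossing-moment detection and the expansion into a crossing time period can be sketched as follows. The 3.0 m lane width, the stability thresholds, and the adjacent-sample reading of the "preset condition" are assumptions for illustration only:

```python
def find_crossing_moments(left_dists, lane_width_thresh=3.0):
    """Crossing moments: the earlier of each pair of adjacent sample moments
    whose left lateral distance differs by more than the lane width threshold
    (the reference lane line switches when a lane line is crossed)."""
    return [t for t in range(len(left_dists) - 1)
            if abs(left_dists[t + 1] - left_dists[t]) > lane_width_thresh]

def crossing_period(left_dists, left_angles, t_cross,
                    dist_thresh=0.2, angle_thresh=0.05):
    """Expand a crossing moment into the [start, end] period of the crossing
    behavior. A moment is 'stable' when its left lateral distance change from
    the previous sample and its left driving included angle are both within
    the thresholds; start is the last stable moment before the crossing, end
    the first stable moment after it. This adjacent-sample stability test is
    one plausible interpretation of the preset condition above."""
    def stable(t):
        return (abs(left_dists[t] - left_dists[t - 1]) <= dist_thresh
                and abs(left_angles[t]) <= angle_thresh)
    start = t_cross
    while start > 1 and not stable(start):
        start -= 1
    end = t_cross + 1
    while end < len(left_dists) - 1 and not stable(end):
        end += 1
    return start, end
```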
In one embodiment, the third acquisition subunit includes:
a fourth acquisition subunit configured to acquire a reference left lateral distance variation amount of the target vehicle over a time period of the crossing behavior;
a ninth determining subunit, configured to determine that the driving behavior primitive of the target vehicle within the time period of the crossing behavior belongs to the right crossing behavior primitive if the reference left lateral distance variation is greater than the lane width threshold;
a tenth determining subunit, configured to determine that a driving behavior primitive of the target vehicle within the time period of the crossing behavior belongs to the left crossing behavior primitive if the reference left lateral distance variation is smaller than the lane width threshold.
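A sketch of this left/right classification by the reference left lateral distance variation; the threshold value and the returned labels are illustrative placeholders:

```python
def classify_crossing(ref_delta_left, lane_width_thresh=3.0):
    """Classify a detected crossing by the reference change in left lateral
    distance over the crossing period, following the rule above: a change
    greater than the lane width threshold is a right crossing, a smaller
    change a left crossing. Detection already guarantees the magnitude
    exceeds the threshold, so 'smaller' amounts to a negative jump (the
    reference switches to the next lane line on the left)."""
    if ref_delta_left > lane_width_thresh:
        return "right_crossing_primitive"
    return "left_crossing_primitive"
```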
In one embodiment, the second determination module 1503 includes:
the splicing unit is used for splicing the driving behavior primitives in the preset time period according to the time sequence of the driving behavior primitives in the preset time period to obtain a plurality of candidate splicing primitive groups of the target vehicle in the preset time period;
the matching unit is used for matching each candidate splicing primitive group with a driving behavior primitive group in the driving scene model respectively;
the determining unit is used for determining the driving scene corresponding to the driving behavior primitive group successfully matched in the driving scene model as the driving scene corresponding to the candidate splicing primitive group; and the driving scene of each candidate splicing primitive group is the driving scene of the target vehicle in a preset time period.
In one embodiment, the splicing unit includes:
the first splicing subunit is used for splicing the driving behavior elements with time correlation of the target vehicle within a preset time period; temporal associations represent temporal adjacency or temporal overlap;
and the second splicing subunit is used for splicing the driving behavior primitive with time correlation of the target vehicle in the preset time period with the candidate behavior primitive with time correlation and driving behavior correlation in the peripheral vehicles of the target vehicle.
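The splicing of temporally associated primitives and the subsequent lookup against the scene model can be sketched as follows, assuming each primitive is encoded as a `(label, t_start, t_end)` triple and the model maps label tuples to scene names (both encodings are assumptions for illustration):

```python
def splice_primitives(primitives, gap=0.0):
    """Group temporally associated primitives (adjacent or overlapping in
    time) into candidate splicing primitive groups.
    primitives: list of (label, t_start, t_end) sorted by t_start."""
    groups, current = [], [primitives[0]]
    for p in primitives[1:]:
        if p[1] <= current[-1][2] + gap:  # overlaps or abuts the group so far
            current.append(p)
        else:
            groups.append(current)
            current = [p]
    groups.append(current)
    return groups

def match_scene(group, scene_model):
    """Look up a candidate splicing primitive group in the model's
    primitive-group -> scene correspondence; None when no group matches."""
    key = tuple(label for label, _, _ in group)
    return scene_model.get(key)
```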
In one embodiment, the apparatus 1500 includes:
the sample module is used for acquiring a plurality of sample driving scenes; each sample driving scene comprises a driving scene in which the sample vehicle to which the sample driving scene belongs has time correlation and driving behavior correlation with the surrounding vehicles of the sample vehicle;
the analysis module is used for respectively carrying out driving behavior primitive analysis on each driving scene and determining a driving behavior primitive group corresponding to each driving scene;
and the corresponding module is used for determining the corresponding relation between each driving behavior primitive group and the corresponding driving scene as a driving scene model.
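The correspondence built by these modules can be sketched as a simple mapping from analysed primitive groups to scene labels; representing a primitive group as a tuple of labels is an assumption for illustration:

```python
def build_scene_model(samples):
    """Derive the primitive-group -> driving-scene correspondence from
    analysed sample scenes.
    samples: iterable of (scene_label, primitive_group) pairs, where a
    primitive group is a tuple of driving behavior primitive labels."""
    return {tuple(group): scene for scene, group in samples}
```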
The various modules in the driving scene recognition device can be wholly or partially implemented by software, hardware, or a combination thereof. The modules can be embedded in hardware form in, or independent of, a processor in the computer device, or stored in software form in a memory of the computer device, so that the processor can call and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal, and whose internal structure may be as shown in fig. 16. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the running of the operating system and the computer program in the non-volatile storage medium. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless communication can be realized through Wi-Fi, a mobile cellular network, NFC (near field communication) or other technologies. The computer program, when executed by the processor, implements a driving scene recognition method. The display screen of the computer device can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer device can be a touch layer covering the display screen, a key, a trackball or a touchpad arranged on the housing of the computer device, or an external keyboard, touchpad or mouse.
Those skilled in the art will appreciate that the architecture shown in fig. 16 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is further provided, which includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
The implementation principle and technical effect of each step implemented by the processor in this embodiment are similar to the principle of the driving scene recognition method, and are not described herein again.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In the present embodiment, the implementation principle and the technical effect of each step implemented when the computer program is executed by the processor are similar to the principle of the driving scene recognition method, and are not described herein again.
In an embodiment, a computer program product is provided, comprising a computer program which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In the present embodiment, the implementation principle and the technical effect of each step implemented when the computer program is executed by the processor are similar to the principle of the driving scene recognition method, and are not described herein again.
It should be noted that, the user information (including but not limited to user device information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory can include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the various embodiments provided herein may be, without limitation, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, or the like.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features contains no contradiction, it should be considered within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present application, and their description is specific and detailed, but should not therefore be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (15)

1. A driving scenario recognition method, characterized in that the method comprises:
acquiring driving data of a target vehicle within a preset time period;
determining a driving behavior primitive of the target vehicle in the preset time period according to the driving data; the driving behavior primitive represents the common behavior characteristics of the vehicle in the driving process of a plurality of driving scenes; the driving behavior primitives comprise a straight-going behavior primitive, a left crossing behavior primitive and a right crossing behavior primitive;
determining a driving scene of the target vehicle in the preset time period according to the driving behavior primitive of the target vehicle in the preset time period and a preset driving scene model; the driving scene model comprises a corresponding relation between a driving behavior primitive group and a driving scene; the driving scenario model includes:
S = ⋃ ( ⋂ₖ B(attr, long, lateral, pₖ, eₖ) )
wherein S represents a driving scene; ⋃ represents set union; ⋂ represents set intersection; the vehicle attribute attr takes two values that respectively indicate the sample vehicle and a surrounding vehicle of the sample vehicle; long = -1, 0, 1 respectively represent three different longitudinal relative positions: behind the sample vehicle, level with the sample vehicle, and ahead of the sample vehicle; lateral = -1, 0, 1 respectively represent three different transverse relative positions: the lane adjacent to the left side of the lane in which the sample vehicle is located, the same lane as the sample vehicle, and the lane adjacent to the right side of the lane in which the sample vehicle is located; (long, lateral) represents the longitudinal and transverse relative positional relationship with the sample vehicle; the driving behavior primitive p takes three values that respectively represent the straight-going behavior, the left crossing behavior and the right crossing behavior; e = 0, 1 respectively indicate that no vehicle exists at the corresponding position and that the sample vehicle or a surrounding vehicle of the sample vehicle exists there; k denotes the k-th driving behavior primitive of the surrounding vehicle with the corresponding attribute in the driving scene, and n denotes the total number of driving behavior primitives of that surrounding vehicle.
2. The method of claim 1, wherein the determining a driving behavior primitive of the target vehicle within the preset time period according to the driving data comprises:
dividing the preset time period into a plurality of time windows according to the time sequence;
and performing behavior primitive recognition on the driving data in each time window according to the time sequence of each time window to obtain driving behavior primitives of the target vehicle in the preset time period.
3. The method of claim 2, wherein the behavior primitive identification comprises straight-going behavior primitive identification; the driving behavior primitives comprise straight-going behavior primitives;
the performing behavior primitive identification on the driving data in each time window to obtain the driving behavior primitive of the target vehicle in the preset time period includes:
for any time window, performing straight-going behavior primitive identification on the driving data in the time window, and if the driving data in the time window belongs to a straight-going behavior primitive, continuing performing straight-going behavior primitive identification on the driving data in the next time window of the time window;
and if the driving data in the time window does not belong to the straight-going behavior primitives, deleting the driving data at the starting moment of the time window, subdividing the time window, and identifying the straight-going behavior primitives of the driving data in the subdivided time window.
4. The method of claim 3, wherein the driving data includes a lateral distance of the target vehicle to a lane line and an included angle of travel between the target vehicle and a lane line tangent;
the performing of the straight-ahead behavior primitive recognition on the driving data in the time window includes:
acquiring the transverse distance corresponding to each moment in the time window and the driving included angle corresponding to each moment;
determining a distance difference between the transverse distance of each moment in the time window and the transverse distance of the starting moment of the time window according to the transverse distance corresponding to each moment in the time window;
and if the absolute value of the distance difference value at each moment in the time window is less than or equal to the straight-going behavior distance threshold value, and the absolute value of the driving included angle at each moment in the time window is less than or equal to the straight-going behavior angle threshold value, determining that the driving data in the time window belongs to the straight-going behavior elements.
5. The method according to any one of claims 2-4, wherein the driving data includes a left lateral distance of the target vehicle to a left lane line;
the behavior primitive recognition of the driving data in each time window is performed to obtain the driving behavior primitive of the target vehicle in the preset time period, and the method comprises the following steps:
sequentially acquiring left transverse distance variable quantities of the target vehicle at each adjacent moment in the preset time period;
determining a time period of the crossing behavior of the target vehicle according to the left transverse distance variable quantity;
and acquiring a driving behavior primitive of the target vehicle in the crossing behavior time period.
6. The method according to claim 5, wherein the determining the time period during which the target vehicle has the crossing behavior according to each of the left transverse distance variations comprises:
obtaining left transverse distance variation of which the absolute value is greater than a lane width threshold value in each left transverse distance variation to obtain candidate left transverse distance variation;
determining, as a crossing time, the earlier time of each pair of adjacent times corresponding to a candidate left transverse distance variation; each crossing time corresponds to one occurrence of a crossing behavior by the target vehicle;
and determining a crossing time period of the target vehicle for each crossing behavior according to each crossing time.
7. The method of claim 6, wherein the driving data further comprises a left included angle of travel of the target vehicle from a left lane line tangent direction; the step of determining the crossing time period of the target vehicle for each crossing behavior according to each crossing time comprises the following steps:
for any crossing moment, determining a moment that satisfies a preset condition before the crossing moment as the starting moment of the crossing behavior corresponding to the crossing moment, and determining a moment that satisfies the preset condition after the crossing moment as the ending moment of the crossing behavior corresponding to the crossing moment; the preset condition is that the absolute value of the left transverse distance variation relative to the starting moment of the preset time period is less than or equal to a crossing behavior distance threshold, and the absolute value of the left driving included angle is less than or equal to a crossing behavior angle threshold;
and determining the time period between the starting time and the ending time as the time period of the crossing behavior of the target vehicle.
8. The method of claim 5, wherein the driving behavior primitives comprise a left crossing behavior primitive and a right crossing behavior primitive;
the acquiring of the driving behavior primitive of the target vehicle in the crossing behavior time period includes:
acquiring a reference left lateral distance variation of the target vehicle in the crossing behavior time period;
if the reference left transverse distance variation is larger than a lane width threshold value, determining that a driving behavior primitive of the target vehicle in the crossing behavior time period belongs to a right crossing behavior primitive;
and if the reference left lateral distance variation is smaller than the lane width threshold, determining that a driving behavior primitive of the target vehicle in the crossing behavior time period belongs to a left crossing behavior primitive.
9. The method according to any one of claims 1-4, wherein the determining the driving scene of the target vehicle within the preset time period according to the driving behavior primitive of the target vehicle within the preset time period and a preset driving scene model comprises:
splicing the driving behavior primitives in the preset time period according to the time sequence of the driving behavior primitives in the preset time period to obtain a plurality of candidate splicing primitive groups of the target vehicle in the preset time period;
matching each candidate splicing primitive group with a driving behavior primitive group in the driving scene model respectively;
determining the driving scene corresponding to the successfully matched driving behavior primitive group in the driving scene model as the driving scene corresponding to the candidate splicing primitive group; and the driving scene of each candidate splicing primitive group is the driving scene of the target vehicle in the preset time period.
10. The method according to claim 9, wherein the stitching processing of the driving behavior primitives within the preset time period comprises:
splicing the driving behavior primitives with time correlation of the target vehicle within the preset time period; the temporal associations represent temporal adjacency or temporal overlap;
and splicing the driving behavior primitives with time association of the target vehicle in the preset time period with the candidate behavior primitives with time association and driving behavior association in the surrounding vehicles of the target vehicle.
11. The method of claim 1, wherein the construction process of the driving scenario model comprises:
obtaining a plurality of sample driving scenes; the driving scenes comprise driving scenes in which a sample vehicle to which the sample driving scene belongs has time correlation with surrounding vehicles of the sample vehicle and driving behavior correlation;
respectively carrying out driving behavior primitive analysis on each driving scene, and determining a driving behavior primitive group corresponding to each driving scene;
and determining the corresponding relation between each driving behavior primitive group and the corresponding driving scene as the driving scene model.
12. A driving scenario recognition apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring driving data of the target vehicle within a preset time period;
the first determination module is used for determining driving behavior primitives of the target vehicle in the preset time period according to the driving data; the driving behavior primitive represents the common behavior characteristics of the vehicle in the driving process of a plurality of driving scenes; the driving behavior primitives comprise a straight-going behavior primitive, a left crossing behavior primitive and a right crossing behavior primitive;
the second determination module is used for determining the driving scene of the target vehicle in the preset time period according to the driving behavior elements of the target vehicle in the preset time period and a preset driving scene model; the driving scene model comprises a corresponding relation between a driving behavior primitive group and a driving scene; the driving scenario model includes:
S = ⋃ ( ⋂ₖ B(attr, long, lateral, pₖ, eₖ) )
wherein S represents a driving scene; ⋃ represents set union; ⋂ represents set intersection; the vehicle attribute attr takes two values that respectively indicate the sample vehicle and a surrounding vehicle of the sample vehicle; long = -1, 0, 1 respectively represent three different longitudinal relative positions: behind the sample vehicle, level with the sample vehicle, and ahead of the sample vehicle; lateral = -1, 0, 1 respectively represent three different transverse relative positions: the lane adjacent to the left side of the lane in which the sample vehicle is located, the same lane as the sample vehicle, and the lane adjacent to the right side of the lane in which the sample vehicle is located; (long, lateral) represents the longitudinal and transverse relative positional relationship with the sample vehicle; the driving behavior primitive p takes three values that respectively represent the straight-going behavior, the left crossing behavior and the right crossing behavior; e = 0, 1 respectively indicate that no vehicle exists at the corresponding position and that the sample vehicle or a surrounding vehicle of the sample vehicle exists there; k denotes the k-th driving behavior primitive of the surrounding vehicle with the corresponding attribute in the driving scene, and n denotes the total number of driving behavior primitives of that surrounding vehicle.
13. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor realizes the steps of the method of any one of claims 1 to 11 when executing the computer program.
14. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 11.
15. A computer program product comprising a computer program, characterized in that the computer program realizes the steps of the method of any one of claims 1 to 11 when executed by a processor.
CN202210694071.0A 2022-06-17 2022-06-17 Driving scene recognition method, device, equipment, storage medium and program product Active CN114782926B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210694071.0A CN114782926B (en) 2022-06-17 2022-06-17 Driving scene recognition method, device, equipment, storage medium and program product


Publications (2)

Publication Number Publication Date
CN114782926A CN114782926A (en) 2022-07-22
CN114782926B true CN114782926B (en) 2022-08-26

Family

ID=82421100

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210694071.0A Active CN114782926B (en) 2022-06-17 2022-06-17 Driving scene recognition method, device, equipment, storage medium and program product

Country Status (1)

Country Link
CN (1) CN114782926B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116467483A (en) * 2022-12-19 2023-07-21 北京百度网讯科技有限公司 Automatic driving scene mining method, device, equipment and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1170652A1 (en) * 2000-07-06 2002-01-09 Renault Process of adaptive regulation of the distance between two mobile vehicles
WO2019047651A1 (en) * 2017-09-05 2019-03-14 百度在线网络技术(北京)有限公司 Driving behavior prediction method and device, and unmanned vehicle
CN109491375A (en) * 2017-09-13 2019-03-19 百度(美国)有限责任公司 The path planning based on Driving Scene for automatic driving vehicle
CN109520744A (en) * 2018-11-12 2019-03-26 百度在线网络技术(北京)有限公司 The driving performance test method and device of automatic driving vehicle
JP2019182425A (en) * 2018-09-07 2019-10-24 百度在線網絡技術(北京)有限公司 Control method and control device for automatic drive vehicle, and computer readable storage medium
CN111645670A (en) * 2020-05-23 2020-09-11 东南大学 Heavy-duty vehicle roll state detection method based on support vector machine
CN112525551A (en) * 2020-12-10 2021-03-19 北京百度网讯科技有限公司 Drive test method, device, equipment and storage medium for automatic driving vehicle
CN113997940A (en) * 2021-12-20 2022-02-01 镁佳(北京)科技有限公司 Driving behavior monitoring method and device
WO2022056564A1 (en) * 2020-09-15 2022-03-24 Avl List Gmbh Method and system for testing a driver assistance system
WO2022078077A1 (en) * 2020-10-15 2022-04-21 腾讯科技(深圳)有限公司 Driving risk early warning method and apparatus, and computing device and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2733037B1 (en) * 2012-11-16 2016-09-21 Honda Motor Co., Ltd. Driver state estimation device
WO2019169031A1 (en) * 2018-02-27 2019-09-06 Nauto, Inc. Method for determining driving policy
JP7207227B2 (en) * 2019-08-08 2023-01-18 トヨタ自動車株式会社 DRIVING ACTION EVALUATION DEVICE, DRIVING ACTION EVALUATION METHOD, AND DRIVING ACTION EVALUATION PROGRAM
US20220121213A1 (en) * 2020-10-21 2022-04-21 Automotive Research & Testing Center Hybrid planning method in autonomous vehicle and system thereof

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1170652A1 (en) * 2000-07-06 2002-01-09 Renault Process of adaptive regulation of the distance between two mobile vehicles
WO2019047651A1 (en) * 2017-09-05 2019-03-14 Baidu Online Network Technology (Beijing) Co., Ltd. Driving behavior prediction method and device, and unmanned vehicle
CN109491375A (en) * 2017-09-13 2019-03-19 Baidu USA LLC Driving-scene-based path planning for autonomous driving vehicles
JP2019182425A (en) * 2018-09-07 2019-10-24 Baidu Online Network Technology (Beijing) Co., Ltd. Control method and control device for autonomous vehicle, and computer-readable storage medium
CN109520744A (en) * 2018-11-12 2019-03-26 Baidu Online Network Technology (Beijing) Co., Ltd. Driving performance test method and device for autonomous vehicles
CN111645670A (en) * 2020-05-23 2020-09-11 Southeast University Heavy-duty vehicle roll state detection method based on support vector machine
WO2022056564A1 (en) * 2020-09-15 2022-03-24 Avl List Gmbh Method and system for testing a driver assistance system
WO2022078077A1 (en) * 2020-10-15 2022-04-21 Tencent Technology (Shenzhen) Co., Ltd. Driving risk early warning method and apparatus, and computing device and storage medium
CN112525551A (en) * 2020-12-10 2021-03-19 Beijing Baidu Netcom Science and Technology Co., Ltd. Drive test method, device, equipment and storage medium for autonomous vehicles
CN113997940A (en) * 2021-12-20 2022-02-01 Megatronix (Beijing) Technology Co., Ltd. Driving behavior monitoring method and device

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Automatic Virtual Test Technology for Intelligent Driving Systems Considering Both Coverage and Efficiency; Feng Gao et al.; IEEE Transactions on Vehicular Technology; 20201231; Full text *
Review on the research of motion control for intelligent vehicles; Guo J H et al.; Journal of Automotive Safety and Energy; 20160731; Full text *
Research on driver car-following model based on deep reinforcement learning; Guo Jinghua et al.; Automotive Engineering; 20211231; Vol. 43, No. 4; Full text *
Key technologies for semantic analysis of driving behavior in intelligent vehicle decision-making; Li Guofa et al.; Journal of Automotive Safety and Energy; 20191215; No. 04; Full text *
Effects of experience factors and lane type on drivers' lane-change decisions; Huo Dongchao et al.; Chinese Journal of Ergonomics; 20200220; No. 01; Full text *
Research and implementation of an intelligent driving behavior model for virtual autonomous vehicles; Lou Yan et al.; Journal of System Simulation; 20051220; No. 12; Full text *

Also Published As

Publication number Publication date
CN114782926A (en) 2022-07-22

Similar Documents

Publication Publication Date Title
EP3186662B1 (en) Measuring traffic speed in a road network
US20230138650A1 (en) Test method for automatic driving, and electronic device
CN109685898B (en) Layering method and device of point cloud data, computer equipment and storage medium
CN111476079A (en) Comprehensive and efficient method of merging map features for object detection with LIDAR
CN114514524A (en) Multi-agent simulation
WO2018058888A1 (en) Street view image recognition method and apparatus, server and storage medium
WO2022156553A1 (en) Method and apparatus for presenting road information
CN114782926B (en) Driving scene recognition method, device, equipment, storage medium and program product
US20220227388A1 (en) Method and apparatus for determining green wave speed, and storage medium
EP4202365A1 (en) Method, apparatus, and computer program product for identifying and correcting lane geometry in map data
US20190303647A1 (en) Movement intelligence using satellite imagery
CN109085764B (en) Method and device for optimizing creation of unmanned simulation scene
CN114625744A (en) Updating method and device of electronic map
US11435202B2 (en) Trajectory sampling using spatial familiarity
CN114743395B (en) Signal lamp detection method, device, equipment and medium
EP3872669A1 (en) Systems and methods for reconstructing a trajectory from anonymized data
EP3872519A1 (en) Systems and methods for reconstructing a trajectory from anonymized data
CN114872718A (en) Vehicle trajectory prediction method, vehicle trajectory prediction device, computer equipment and storage medium
JP7070531B2 (en) Urban structure design equipment and urban structure design program
CN114216467A (en) Road positioning method, device, computer equipment and storage medium
CN115392391A (en) Training method of road type recognition model and road type recognition method
CN116010538A (en) Traffic thermodynamic diagram generation method, device, equipment, storage medium and program product
CN116303866B (en) Data processing method, device, electronic equipment and storage medium
US20230194298A1 (en) Method, apparatus, and computer program product for correcting lane geometry in map data
US20230196760A1 (en) Method, apparatus, and computer program product for identifying and correcting intersection lane geometry in map data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant