CN114064656A - Automatic driving scene recognition and conversion method based on road end sensing system - Google Patents


Info

Publication number
CN114064656A
CN114064656A (application number CN202111400766.5A)
Authority
CN
China
Prior art keywords
vehicle
slave
data
main
master
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111400766.5A
Other languages
Chinese (zh)
Other versions
CN114064656B (en)
Inventor
郑玲
曾杰
李以农
余颖弘
张迪思
屈顺娇
裴健宏
杨崇辉
杨显通
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University
Original Assignee
Chongqing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University
Priority to CN202111400766.5A
Publication of CN114064656A
Application granted
Publication of CN114064656B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/22 Indexing; Data structures therefor; Storage structures
    • G06F16/2291 User-Defined Types; Storage management thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/25 Integrating or interfacing systems involving database management systems
    • G06F16/252 Integrating or interfacing systems involving database management systems between a Database Management System and a front-end application
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 Registering performance data

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides an automatic driving scene recognition and conversion method based on a road-end sensing system, which comprises the following steps: acquiring driving data collected by a road-end sensing system; traversing all vehicles that can serve as the master vehicle and, taking each such vehicle in turn as the master vehicle, dividing the driving data into a number of segments to obtain independent data segments of the master vehicle and the slave vehicles; in each independent data segment, coordinate-converting the motion information of the slave vehicle and projecting it into the coordinate system of the master vehicle; judging whether the master vehicle and the slave vehicle are in the direct interaction interval, and evaluating the collision time according to the judgment result; and, when the collision time is greater than or equal to 0 and less than or equal to the TTC threshold, slicing the driving data to obtain function-marked slice data and configuring a description file. The invention can recognize and convert the large amount of driving data collected by the road-end sensing system, output slice data with function marks, and provide a data source for building a large-volume database for automatic driving training.

Description

Automatic driving scene recognition and conversion method based on road end sensing system
Technical Field
The invention relates to the technical field of automatic driving, in particular to an automatic driving scene recognition and conversion method based on a road end sensing system.
Background
With the continuous development of society and the economy, transportation infrastructure and road conditions can no longer meet current traffic demands, and traffic safety and congestion have become stumbling blocks to further improving the efficiency of social production activities. Automatic driving vehicles have broader environment perception and distinctive path planning capabilities compared with human drivers, and can greatly reduce the traffic accident rate and relieve congestion pressure. Therefore, the rapid development of automatic driving vehicles is of great practical significance for relieving and even solving traffic safety and congestion problems. However, no large-volume database for automatic driving scene recognition currently exists, which seriously restricts the popularization and development of related technology in the field of automatic driving scene recognition.
Therefore, an automatic driving scene recognition and conversion method is needed, so that unmanned driving scenes can be recognized in the large amount of collected data, data segments with function or action labels can be output, and a data source can be provided for building a large-volume database for automatic driving training.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides an automatic driving scene recognition and conversion method based on a road-end sensing system, to solve the technical problem that such a method is urgently needed for recognizing and converting unmanned driving scenes in massive collected data and providing a data source for building a large-volume database for automatic driving training.
The invention adopts the technical scheme that an automatic driving scene recognition and conversion method based on a road end perception system comprises the following steps:
acquiring driving data acquired by a road end sensing system;
traversing all vehicles which can be used as main vehicles according to the driving data, and dividing the driving data into a plurality of segments by taking different vehicles as the main vehicles to obtain independent data segments of the main vehicles and the subordinate vehicles;
in the independent data segment of the master vehicle and the slave vehicle, the motion information of the slave vehicle is subjected to coordinate conversion and projected into a coordinate system of the master vehicle;
judging whether the master vehicle and the slave vehicle are in the direct interaction interval or not, and evaluating the collision time according to the judgment result;
when the collision time is greater than or equal to 0 and less than or equal to the TTC threshold, performing data slicing on the driving data to obtain functional marker slice data;
and configuring a description file for the functional marker slice data.
With reference to the first implementable manner, in a second implementable manner, the driving data includes data collected on expressways, national roads, provincial roads and urban roads: the longitudinal position, lateral position and heading angle of all traffic participants within the detection range of the road-end sensing system, in the coordinate system of the road-end sensing system; longitudinal and lateral speed; and longitudinal and lateral acceleration.
With reference to the first implementable manner, in a third implementable manner, traversing all vehicles that can serve as a host vehicle according to the driving data, and dividing the driving data into a plurality of segments by taking different vehicles as the host vehicle, includes:
and in the data segment determined by taking time as a scale, taking different vehicle targets as a master vehicle, traversing slave vehicles which are in space-time association with the master vehicle, reestablishing independent data segments of the slave vehicles and the master vehicle, and naming the independent data segments of the master vehicle and the slave vehicles according to the ID of the master vehicle and the ID of the traversed slave vehicles.
In combination with the first implementable manner, in a fourth implementable manner,
the method for converting the coordinates of the motion information of the slave vehicle and projecting the motion information to the coordinate system of the master vehicle comprises the following steps:
calculating the relative longitudinal distance D_f and the relative lateral distance D_l between the master vehicle and the slave vehicle:
D_f = (X_obj - X_ego) × cos(-α_2) + (Y_obj - Y_ego) × sin(-α_2)
D_l = -((Y_obj - Y_ego) × cos(-α_2) - (X_obj - X_ego) × sin(-α_2))
in the above formulas, D_f is the relative longitudinal distance, D_l is the relative lateral distance, X_ego, Y_ego are the abscissa and ordinate of the master vehicle, X_obj, Y_obj are the abscissa and ordinate of the slave vehicle, and α_2 is the heading angle of the master vehicle;
in the coordinate system of the master vehicle, the longitudinal and lateral velocity projections of the slave vehicle are:
V_Of = V_Ef × cos(-β_2) + V_El × sin(-β_2)
V_Ol = -V_El × cos(-β_2) - V_Ef × sin(-β_2)
in the above formulas, V_Of is the longitudinal velocity projection, V_Ol is the lateral velocity projection, V_Ef and V_El are the longitudinal and lateral speeds of the master vehicle relative to the slave vehicle, and β_2 is the centroid slip angle.
In combination with the first implementable manner, in a fifth implementable manner,
judging whether the master vehicle and the slave vehicle are in a direct interaction interval, and evaluating the collision time according to the judgment result, wherein the judgment comprises the following steps:
if |D_l| ≤ (W_E + W_o)/2, calculating the collision time directly;
if |D_l| > (W_E + W_o)/2, evaluating the lateral collision crisis first and then calculating the collision time;
in the above conditions, |D_l| is the relative lateral distance between the slave vehicle and the master vehicle, W_E is the width of the master vehicle, and W_o is the width of the slave vehicle;
the time to collision TTC is calculated as:
TTC = -D_f / V_Rf
in the above formula, D_f is the longitudinal distance from the slave vehicle to the master vehicle, and V_Rf is the relative longitudinal speed of the slave vehicle with respect to the master vehicle.
With reference to the fifth implementable manner, in a sixth implementable manner,
if |D_l| > (W_E + W_o)/2, evaluating the lateral collision crisis first and then calculating the longitudinal collision time comprises the following steps:
calculating the relative speed of the slave vehicle relative to the master vehicle by taking the master vehicle as an observation coordinate system;
judging a transverse approaching trend according to the relative speed of the slave vehicle relative to the master vehicle;
calculating the transverse meeting time of the slave vehicle and the master vehicle according to the transverse distance and the transverse relative speed of the slave vehicle and the master vehicle through the transverse approaching trend judgment result;
calculating the traveling distance of the master vehicle in the transverse meeting time of the slave vehicle and the master vehicle according to the transverse meeting time and the longitudinal relative speed of the slave vehicle and the master vehicle;
calculating the diagonal radius of the slave vehicle from its length and width to obtain the collision area of the master vehicle and the slave vehicle;
and judging whether a collision crisis exists between the master vehicle and the slave vehicle according to the longitudinal distance from the slave vehicle to the master vehicle, the collision area of the master vehicle and the slave vehicle, and the distance the master vehicle can travel within the lateral meeting time of the slave vehicle and the master vehicle, and evaluating the collision time accordingly.
In combination with the first implementable manner, in a seventh implementable manner,
data slicing driving data, comprising:
performing right-of-way contention scene data slicing on the driving data;
marking functions on the right-of-way contention scene slice data based on functional characteristics, according to a rule threshold judgment method;
and generating function-marked slice data from the function marks.
With reference to the seventh implementable manner, in an eighth implementable manner,
the function marking includes:
AEB automatic emergency braking: the minimum deceleration of the master vehicle is 4 m/s², the minimum speed of the master vehicle is 15 km/h, and the TTC threshold is 2.4 s;
ACC adaptive cruise: the minimum speed of the master vehicle is 15 km/h, the maximum deceleration of the master vehicle is 4 m/s², and the TTC threshold is 3.0 s;
BSD blind zone monitoring: the master vehicle is in a lane-change scene, and the TTC threshold is 6.0 s;
FCW front collision warning: triggered before the AEB emergency braking phase, with a TTC threshold of 3.0 s;
TJA traffic jam assist: the master vehicle speed range is 0-20 km/h;
LCA lane change assist: the minimum running speed of the master vehicle is 60 km/h, the lane-change characteristic of the master vehicle is an LCA master-vehicle lateral speed of 1 m/s, and the slave-vehicle detection-zone characteristics are an LCA minimum longitudinal detection distance of 100 m and an LCA minimum lateral detection distance of 5 m.
With reference to the eighth implementable manner, in a ninth implementable manner, the BSD blind area monitoring includes:
line B is parallel to the rear edge of the master vehicle and is located 3.0 m behind it;
line C is parallel to the front edge of the master vehicle and passes through the center of the ninety-fifth percentile eye ellipse;
line F is parallel to the centerline of the master vehicle, on the left of the outermost left edge of the master vehicle body, 0.5 m from that edge;
line G is parallel to the centerline of the master vehicle, on the left of the outermost left edge of the master vehicle body, 3.0 m from that edge;
line K is parallel to the centerline of the master vehicle, on the right of the outermost right edge of the master vehicle body, 0.5 m from that edge;
line L is parallel to the centerline of the master vehicle, on the right of the outermost right edge of the master vehicle body, 3.0 m from that edge;
two determinations of the slave vehicle entering the blind zone:
any part of the slave vehicle is positioned in front of the line B, the slave vehicle is positioned completely behind the line C, the slave vehicle is positioned completely on the left side of the line F, and any part of the slave vehicle is positioned on the right side of the line G;
any part of the slave car is positioned in front of the line B, the slave car is positioned completely behind the line C, the slave car is positioned completely on the right side of the line K, and any part of the slave car is positioned on the left side of the line L.
In combination with the first implementable manner, in a tenth implementable manner,
before the driving data are divided into a plurality of segments, normal distribution kernel fitting is carried out on the data set by adopting a three-sigma rule, and abnormal data are removed;
repairing a default portion of the driving data by linear interpolation fitting;
and performing smooth filtering processing on the independent data segments of the master vehicle and the slave vehicle after the coordinate conversion of the motion information of the slave vehicle.
According to the technical scheme, the beneficial technical effects of the invention are as follows:
the road end sensing system can acquire a large amount of driving data to be recognized and converted, slice data with function marks are output, and a data source is provided for the construction of a large-volume database of automatic driving training.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. Throughout the drawings, like elements or portions are generally identified by like reference numerals. In the drawings, elements or portions are not necessarily drawn to scale.
Fig. 1 is a flowchart of a scene recognition and transformation method according to embodiment 1 of the present invention;
fig. 2(a) and 2(b) are schematic diagrams of coordinate transformation of a master vehicle and a slave vehicle according to embodiment 1 of the present invention;
FIG. 3 is a schematic diagram showing the relative positions of the master vehicle and the slave vehicles at a sampling time according to embodiment 1 of the present invention;
FIG. 4 is a schematic illustration showing monitoring of BSD dead zone in embodiment 1 of the present invention;
fig. 5 is a flowchart of a driving data preprocessing method according to embodiment 2 of the present invention.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings. The following examples are only for illustrating the technical solutions of the present invention more clearly, and therefore are only examples, and the protection scope of the present invention is not limited thereby.
It is to be noted that, unless otherwise specified, technical or scientific terms used herein shall have the ordinary meaning as understood by those skilled in the art to which the invention pertains.
Example 1
The embodiment provides an automatic driving scene recognition and conversion method based on a road-end perception system, as shown in fig. 1, the method includes the following steps:
s1, acquiring driving data acquired by road end sensing system
In this embodiment, the road-end sensing system is a sensing system integrating a 64-line rotary lidar and a camera. The driving data includes data collected on highways, national roads, provincial roads and urban roads: the longitudinal position, lateral position and heading angle of all traffic participants within the detection range of the road-end sensing system, in the coordinate system of the road-end sensing system; longitudinal and lateral speed; and longitudinal and lateral acceleration.
S2, traversing all vehicles which can be used as main vehicles, and dividing driving data into a plurality of segments by using different vehicles as main vehicles to obtain independent data segments of main vehicles and auxiliary vehicles
In a specific embodiment, within the data segments determined on a time scale, each vehicle target is taken in turn as the master vehicle, all slave vehicles that are spatio-temporally associated with it are traversed, and an independent data segment is re-established for each traversed slave vehicle and the master vehicle, named after the master-vehicle ID and the traversed slave-vehicle ID. For example, if the vehicle target with ID 34 is defined as the master vehicle and another moving target with ID 72 is temporally and spatially associated with it, then all data covering the interaction of the master and slave vehicles is intercepted to form an independent data segment. Traversing all vehicles that can serve as the master vehicle yields a number of independent data segments, as the sketch below illustrates.
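As an illustration, this segmentation step can be expressed in a few lines of code. The following is a minimal Python sketch, assuming the driving data arrives as a pandas DataFrame with columns frame, id, x and y; the column names and the 100 m association radius are assumptions of the sketch, not part of the patent:

```python
import pandas as pd

ASSOC_RADIUS = 100.0  # assumed spatio-temporal association radius in metres

def build_master_slave_segments(df: pd.DataFrame) -> dict:
    """Take every vehicle ID in turn as the master vehicle, traverse the
    slave vehicles that co-occur with it in time and space, and cut out
    one independent data segment per (master, slave) pair, named after
    the two IDs."""
    segments = {}
    ids = df["id"].unique()
    for ego_id in ids:
        ego = df[df["id"] == ego_id]
        for obj_id in ids:
            if obj_id == ego_id:
                continue
            obj = df[df["id"] == obj_id]
            # temporal association: keep only the common frames
            merged = ego.merge(obj, on="frame", suffixes=("_ego", "_obj"))
            if merged.empty:
                continue
            # spatial association: keep frames where the two are close
            dist = ((merged["x_obj"] - merged["x_ego"]) ** 2 +
                    (merged["y_obj"] - merged["y_ego"]) ** 2) ** 0.5
            merged = merged[dist <= ASSOC_RADIUS]
            if not merged.empty:
                segments[f"ego{ego_id}_obj{obj_id}"] = merged
    return segments
```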
S3, in the independent data segment of the master-slave vehicle, the motion information of the slave vehicle is subjected to coordinate conversion and projected into the coordinate system of the master vehicle
In a specific embodiment, when the motion information of the slave vehicle is coordinate-converted, as shown in fig. 2(a) and 2(b), the heading angle is obtained from the arctangent of the y-direction and x-direction displacements:
alpha_velo = abs(arctan(direction_y / direction_x))
The absolute distance d between the master vehicle and the slave vehicle is calculated from the center-point positions (x, y) in the driving data:
d = √((X_obj - X_ego)² + (Y_obj - Y_ego)²)
the spatial position relationship between the master vehicle and the slave vehicles is shown in figure 2, and the relative longitudinal distance D between the master vehicle and the slave vehicles is calculated according to the following formulafAnd relative lateral distance Dl
Df=(Xobj-Xego)×cos(-α2)+(Yobj-Yego)×sin(-α2)
Dl=-((Yobj-Yego)×cos(-α2)-(Xobj-Xego)×sin(-α2))
In the above formula, DfAt a relative longitudinal distance, DlIs a relative lateral distance, Xego,YegoRespectively the abscissa and ordinate values, X, of the main vehicleobj,YobjRespectively the horizontal and vertical coordinate values of the slave car, alpha2Is the main vehicle heading angle.
Under the coordinate system of the main vehicle, the projection of the longitudinal speed and the projection of the transverse speed of the auxiliary vehicle are as follows:
VOf=VEf×cos(-β2)+VEl×sin(-β2)
VOl=-VEl×cos(-β2)-VEf×sin(-β2)
in the above formula, VofFor longitudinal velocity projection, VolFor transverse velocity projection, VEf、VElLongitudinal, transverse speed of the master vehicle relative to the slave vehicle, beta2Is the centroid slip angle.
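The projection formulas above translate directly into code. A minimal sketch, assuming angles are given in radians; the function name is illustrative:

```python
import math

def project_into_master_frame(x_ego, y_ego, x_obj, y_obj, alpha2,
                              v_ef, v_el, beta2):
    """Project the slave vehicle's position and speed into the master
    vehicle's coordinate system (step S3). alpha2 is the master heading
    angle and beta2 the centroid slip angle, both in radians."""
    # relative longitudinal and lateral distance
    d_f = ((x_obj - x_ego) * math.cos(-alpha2)
           + (y_obj - y_ego) * math.sin(-alpha2))
    d_l = -((y_obj - y_ego) * math.cos(-alpha2)
            - (x_obj - x_ego) * math.sin(-alpha2))
    # longitudinal and lateral velocity projections of the slave vehicle
    v_of = v_ef * math.cos(-beta2) + v_el * math.sin(-beta2)
    v_ol = -v_el * math.cos(-beta2) - v_ef * math.sin(-beta2)
    return d_f, d_l, v_of, v_ol
```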
S4, judging whether the master vehicle and the slave vehicle are in the direct interaction interval or not, and evaluating the collision time according to the judgment result
Judging whether the master vehicle and the slave vehicle are in a direct interaction interval, and judging whether the right of way is contended, wherein the judging method specifically comprises the following steps:
as shown in FIG. 3, EGO represents the master vehicle, OBI represents the slave vehicle, and the relative lateral distance between the slave vehicle and the master vehicle is | Dl|:
If it is
Figure BDA0003364381520000072
The fact that the master vehicle and the slave vehicle are basically located in the same lane at the moment is shown, the collision is basically longitudinal collision at the moment, and the collision time TTC can be directly calculated; in the above formula, WEIs the width and W of the main frameoFrom the vehicle width.
If it is
Figure BDA0003364381520000081
The main vehicle and the secondary vehicle are not in the same lane at the moment, the cross collision crisis needs to be evaluated first, and then the collision time TTC is calculated; this is achieved byThe collision frame can be set in a smaller space range, and the judgment of the longitudinal collision and the calculation of the collision time are carried out. The specific method comprises the following steps:
S4-1, calculating the relative speed of the slave vehicle relative to the master vehicle, taking the master vehicle as the observation coordinate system:
longitudinal relative speed: V_Rf = V_Of - V_Ef
lateral relative speed: V_Rl = V_Ol - V_El
where V_Of is the longitudinal speed of the slave vehicle, V_Ol the lateral speed of the slave vehicle, V_Ef the longitudinal speed of the master vehicle, and V_El the lateral speed of the master vehicle.
S4-2, judging the transverse and longitudinal approaching trend according to the relative speed of the slave vehicle to the master vehicle:
if: df·VRf<0, the longitudinal approaching trend exists between the master vehicle and the slave vehicle.
If: dl·VRl<0, the main vehicle and the auxiliary vehicle have a transverse approaching trend.
Wherein D isfLongitudinal distance of the slave car from the master car, DlIs the lateral distance between the slave car and the master car.
S4-3, calculating the transverse meeting time of the slave vehicle and the master vehicle according to the transverse distance and the transverse relative speed of the slave vehicle and the master vehicle through the transverse approaching trend judgment result
When the slave vehicle and the master vehicle have a lateral approaching trend, the time t at which they meet laterally (i.e. the lateral distance becomes 0) is:
t = |D_l / V_Rl|
In the above formula, D_l is the lateral distance from the slave vehicle to the master vehicle, and V_Rl is the lateral relative speed of the slave vehicle and the master vehicle.
S4-4, calculating the distance that the master vehicle can travel in the transverse meeting time of the slave vehicle and the master vehicle according to the transverse meeting time and the longitudinal relative speed of the slave vehicle and the master vehicle
Within this time, the distance D′ that the master vehicle can travel is:
D′ = t × |V_Rf|
In the above formula, t is the time at which the slave vehicle and the master vehicle meet laterally, and V_Rf is the longitudinal relative speed of the slave vehicle and the master vehicle.
S4-5, calculating the diagonal radius of the slave vehicle from its length and width to obtain the collision area R_T of the master vehicle and the slave vehicle:
R_T = √(L² + W²) / 2
In the above formula, L is the length of the slave vehicle and W is the width of the slave vehicle.
S4-6, judging whether the main vehicle and the auxiliary vehicle have collision crisis or not according to the longitudinal distance between the auxiliary vehicle and the main vehicle, the collision area between the main vehicle and the auxiliary vehicle and the distance capable of being traveled by the main vehicle in the transverse meeting time of the auxiliary vehicle and the main vehicle, and evaluating the collision time
If |D_f| - R_T ≤ D′ ≤ |D_f| + R_T, it is judged that a collision crisis exists between the master vehicle and the slave vehicle.
If D′ > |D_f| + R_T or D′ < |D_f| - R_T, it is judged that no collision crisis exists between the master vehicle and the slave vehicle.
In the above formulas, D_f is the longitudinal distance from the slave vehicle to the master vehicle, R_T is the collision area of the master vehicle and the slave vehicle, and D′ is the distance the master vehicle can travel within the lateral meeting time of the slave vehicle and the master vehicle.
The estimated time to collision TTC is calculated as:
TTC = -D_f / V_Rf
In the above formula, D_f is the longitudinal distance from the slave vehicle to the master vehicle, and V_Rf is the relative longitudinal speed of the slave vehicle with respect to the master vehicle.
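Steps S4-1 through S4-6 combine into a single decision routine. The following Python sketch mirrors the formulas above; the sign convention TTC = -D_f/V_Rf (positive while the vehicles approach, consistent with the approaching-trend criterion), the zero-speed guard and all parameter names are assumptions of the sketch:

```python
import math

def evaluate_collision(d_f, d_l, v_of, v_ol, v_ef, v_el,
                       w_e, w_o, length_o, width_o):
    """Step S4: check the direct interaction interval and estimate the
    time to collision (TTC); returns None when no collision crisis exists."""
    v_rf = v_of - v_ef                    # longitudinal relative speed
    v_rl = v_ol - v_el                    # lateral relative speed
    if v_rf == 0:
        return None
    if abs(d_l) <= (w_e + w_o) / 2:
        # essentially the same lane: a purely longitudinal conflict
        return -d_f / v_rf if d_f * v_rf < 0 else None
    # different lanes: evaluate the lateral collision crisis first
    if d_l * v_rl >= 0:                   # no lateral approaching trend
        return None
    t = abs(d_l / v_rl)                   # S4-3: lateral meeting time
    d_prime = t * abs(v_rf)               # S4-4: master's travel distance
    r_t = math.hypot(length_o, width_o) / 2   # S4-5: slave diagonal radius
    if abs(d_f) - r_t <= d_prime <= abs(d_f) + r_t:   # S4-6
        return -d_f / v_rf                # collision crisis: estimate TTC
    return None
```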
S5, comparing the collision time with the TTC threshold value, and slicing the driving data according to the comparison result
In a specific embodiment, slicing starts when the collision time is greater than or equal to 0 and less than or equal to the TTC threshold, and the data slice is closed when the collision time exceeds the TTC threshold, as in the sketch below. The TTC threshold is set according to the actual situation and is described separately below.
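A minimal sketch of this TTC-window slicing, assuming a per-frame TTC sequence with None wherever no collision crisis was found:

```python
def slice_by_ttc(ttc_series, ttc_threshold):
    """Open a slice while 0 <= TTC <= threshold holds and close it when
    the condition stops holding. ttc_series: list of (frame, ttc) pairs."""
    slices, start = [], None
    for frame, ttc in ttc_series:
        active = ttc is not None and 0 <= ttc <= ttc_threshold
        if active and start is None:
            start = frame                  # open a new slice
        elif not active and start is not None:
            slices.append((start, frame))  # close the current slice
            start = None
    if start is not None:                  # the last slice runs to the end
        slices.append((start, ttc_series[-1][0]))
    return slices
```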
The data slicing is divided into two steps: first the right-of-way contention scene data slicing, and then the function-marked data slicing. The slicing method is not limited, and any realizable method in the prior art can be adopted; in this embodiment, slicing data based on a time-window method is preferred, as follows:
S5-1, performing right-of-way contention scene data slicing on the driving data
Because the data measured by the road-end sensing system is continuous 24-hour data, there are long periods in which no vehicle passes through the monitored road section, and such data contributes nothing to subsequent automatic driving scene analysis. Right-of-way contention scene slicing is therefore performed first; it quickly eliminates the vehicle-free periods and keeps the practically useful data.
S5-2, marking functions on the right-of-way contention scene slice data based on functional characteristics, according to the rule threshold judgment method
In the right-of-way contention scene slice data, function marks are applied according to the functional characteristics of the vehicles, identifying the intelligent driving function and the driving action. In a specific embodiment, the rule threshold judgment method is as follows (a code sketch of these threshold checks follows the list):
1) AEB automatic emergency braking
The minimum deceleration of the master vehicle is 4 m/s².
The minimum speed of the main vehicle is 15 km/h.
TTC threshold: 2.4 s.
2) ACC adaptive cruise
The minimum speed of the main vehicle is 15 km/h.
The maximum deceleration of the master vehicle is 4 m/s².
TTC threshold: 3.0 s.
3) BSD blind zone monitoring
The master vehicle is in a lane-change scene.
TTC threshold: 6.0 s.
The slave vehicle enters the blind zone, which is the hatched range shown in fig. 4. Line B is parallel to the rear edge of the master vehicle and is located 3.0 m behind it. Line C is parallel to the front edge of the master vehicle and passes through the center of the ninety-fifth percentile eye ellipse. Line F is parallel to the centerline of the master vehicle, to the left of the outermost left edge of the master vehicle body, 0.5 m from that edge. Line G is parallel to the centerline of the master vehicle, to the left of the outermost left edge of the master vehicle body, 3.0 m from that edge. Line K is parallel to the centerline of the master vehicle, to the right of the outermost right edge of the master vehicle body, 0.5 m from that edge. Line L is parallel to the centerline of the master vehicle, to the right of the outermost right edge of the master vehicle body, 3.0 m from that edge.
Two determinations of the slave vehicle entering the blind zone:
1. any part of the slave car is positioned in front of the line B, the slave car is positioned completely behind the line C, the slave car is positioned completely on the left side of the line F, and any part of the slave car is positioned on the right side of the line G.
2. Any part of the slave car is positioned in front of the line B, the slave car is positioned completely behind the line C, the slave car is positioned completely on the right side of the line K, and any part of the slave car is positioned on the left side of the line L.
4) FCW front collision warning
Triggered prior to the AEB emergency braking phase.
TTC threshold: 3.0 s.
5) TJA traffic congestion assistance
The host vehicle speed range: 0-20 km/h.
6) LCA safety lane change aid
The lowest running speed of the main vehicle is 60 km/h.
The lane-change characteristic of the master vehicle: the lateral speed of the LCA master vehicle is 1 m/s.
The slave vehicle entering the detection zone:
the LCA has a minimum detection distance of 100m in the longitudinal direction.
The minimum detection distance of LCA in the transverse direction is 5 m.
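To make the rule list concrete, here is a Python sketch of the threshold checks, operating on aggregated features of one right-of-way contention slice; every dictionary key (speeds in km/h, decelerations in m/s²) is an illustrative assumption rather than part of the patent:

```python
def mark_functions(seg: dict) -> list:
    """Rule-threshold function marking for one scene slice."""
    labels = []
    if seg["decel_max"] >= 4.0 and seg["speed_min"] >= 15 and seg["ttc_min"] <= 2.4:
        labels.append("AEB")     # automatic emergency braking
    if seg["speed_min"] >= 15 and seg["decel_max"] <= 4.0 and seg["ttc_min"] <= 3.0:
        labels.append("ACC")     # adaptive cruise
    if seg["lane_change"] and seg["ttc_min"] <= 6.0 and seg["slave_in_blind_zone"]:
        labels.append("BSD")     # blind zone monitoring
    if seg["pre_aeb_phase"] and seg["ttc_min"] <= 3.0:
        labels.append("FCW")     # front collision warning
    if seg["speed_max"] <= 20:
        labels.append("TJA")     # traffic jam assist, 0-20 km/h
    if (seg["speed_min"] >= 60 and seg["lat_speed_max"] >= 1.0
            and seg["obj_long_dist"] <= 100 and seg["obj_lat_dist"] <= 5):
        labels.append("LCA")     # lane change assist
    return labels
```

The BSD blind-zone test itself can be sketched geometrically. Here the master frame is assumed to have its origin at the vehicle centre with x pointing forward and y positive to the left, and eye_offset (the distance of the ninety-fifth percentile eye-ellipse centre behind the front edge) is a placeholder value:

```python
def in_blind_zone(xs, ys, l_e, w_e, eye_offset=1.0):
    """xs, ys: slave-corner coordinates in the master frame;
    l_e, w_e: master vehicle length and width in metres."""
    line_b = -l_e / 2 - 3.0              # 3.0 m behind the rear edge
    line_c = l_e / 2 - eye_offset        # line through the eye-ellipse centre
    if not (max(xs) > line_b and max(xs) < line_c):
        return False                     # longitudinal window (lines B and C)
    f, g = w_e / 2 + 0.5, w_e / 2 + 3.0
    left = min(ys) > f and min(ys) < g   # fully left of F, partly right of G
    right = max(ys) < -f and max(ys) > -g  # fully right of K, partly left of L
    return left or right
```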
S5-3, generating function-marked slice data from the function marks
The function-marked slice data generated in this step is obtained by further subdividing the right-of-way contention scene data slices according to the data measured in different automatic driving scenes, so that the data can be classified and stored when the large database is subsequently established.
S6, configuring a description file for the function-marked slice data
In this embodiment, the generated function-marked slice data is preferably in the xls format, which is convenient for direct reading on a Windows system and for calling by other professional data analysis software.
Configuring the description file specifically includes: annotating the physical meaning of the data features corresponding to the data slice, explaining and unifying the units of the data slice, and adding a text description of the automatic-driving function label; finally, segment data carrying automatic driving function and driving action labels is output. The function-marked slice data together with its configured description file is stored as a data source for the large-volume database, for example as sketched below.
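As an illustration, one function-marked slice and its description file could be persisted as follows; pandas (with an Excel writer such as openpyxl installed) and the JSON description format are assumptions of this sketch, not requirements of the patent, which only prefers the xls format:

```python
import json
import pandas as pd

def save_slice(df: pd.DataFrame, labels: list, features: dict, stem: str):
    """Write the slice data and a description file next to it."""
    df.to_excel(f"{stem}.xlsx", index=False)   # slice data in Excel format
    description = {
        "features": features,        # physical meaning and unit of each column
        "function_labels": labels,   # e.g. ["AEB", "FCW"]
    }
    with open(f"{stem}_desc.json", "w", encoding="utf-8") as fh:
        json.dump(description, fh, ensure_ascii=False, indent=2)
```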
By adopting the technical scheme of the embodiment, a large amount of driving data collected by the road end sensing system can be identified and converted, slice data with functional markers are output, and a data source is provided for the construction of a large-volume database for automatic driving training.
Example 2
The measured data collected by the road-end sensing system may contain abnormal values introduced during collection and transmission; for example, clutter interference can produce sudden peak glitches. Such abnormal data should not be used as a data source for building a large-volume database.
In order to solve the above technical problem, as shown in fig. 5, the following technical solutions are adopted:
1. before the driving data are divided into a plurality of segments, the abnormal data are removed by adopting a three-sigma rule and carrying out normal distribution kernel fitting on a data set
In a specific embodiment, data falling outside three sigma has an extremely low probability of occurring and is, in general, abnormal data that can be eliminated directly, as in the sketch below.
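A minimal sketch of the three-sigma rejection; the normal kernel is reduced here to the sample mean and standard deviation:

```python
import numpy as np

def drop_three_sigma_outliers(x):
    """Keep only samples within mean +/- 3*sigma (three-sigma rule)."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    return x[np.abs(x - mu) <= 3 * sigma]
```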
2. Patching default portions of driving data by linear interpolation fitting
The frame-dropping problem breaks the time series of the interacting vehicles so that they can no longer be completely matched. Using the values and pointers of the data before and after the dropped frames, the values corresponding to the blank pointers are filled in by linear interpolation fitting, repairing the missing data. In this embodiment, the linear interpolation fitting is performed after the abnormal data has been eliminated and before the driving data is divided into segments; a sketch follows.
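A sketch of the dropped-frame repair, assuming missing samples are marked as None:

```python
import numpy as np

def patch_dropped_frames(values):
    """Fill each missing sample by linear interpolation between its
    valid neighbours."""
    v = np.array([np.nan if x is None else x for x in values], dtype=float)
    idx = np.arange(len(v))
    ok = ~np.isnan(v)
    v[~ok] = np.interp(idx[~ok], idx[ok], v[ok])
    return v
```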
3. Carrying out smooth filtering processing on independent data segments of the master vehicle and the slave vehicle after coordinate conversion of motion information of the slave vehicle
When the road-end and vehicle-end sensors acquire data, jitter causes jumps and zero drift in the data waveform. In a specific embodiment, the data waveform is filtered with a moving-average filtering function (smooth). The smoothing is applied to the independent data segments of the master vehicle and the slave vehicle after the motion information of the slave vehicle has been coordinate-converted and projected into the coordinate system of the master vehicle, as the sketch below shows.
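A sketch of the moving-average smoothing, comparable to MATLAB's smooth(); the window length of 5 samples is an assumed value:

```python
import numpy as np

def smooth(x, window=5):
    """Moving-average filter over a 1-D signal."""
    kernel = np.ones(window) / window
    return np.convolve(np.asarray(x, dtype=float), kernel, mode="same")
```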
By adopting the technical solution of this embodiment, irregular data is further preprocessed before data slicing, so that the subsequently obtained slice data is more reliable and the data source for building the large database is more accurate.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the present invention, and they should be construed as being included in the following claims and description.

Claims (10)

1. An automatic driving scene recognition and conversion method based on a road end perception system is characterized by comprising the following steps:
acquiring driving data acquired by a road end sensing system;
traversing all vehicles which can be used as main vehicles according to the driving data, and dividing the driving data into a plurality of segments by taking different vehicles as the main vehicles to obtain independent data segments of the main vehicles and the auxiliary vehicles;
in the independent data segment of the master vehicle and the slave vehicle, the motion information of the slave vehicle is subjected to coordinate conversion and projected into a coordinate system of the master vehicle;
judging whether the master vehicle and the slave vehicle are in the direct interaction interval or not, and evaluating the collision time according to the judgment result;
when the collision time is greater than or equal to 0 and less than or equal to the TTC threshold, performing data slicing on the driving data to obtain functional marker slice data;
and configuring a description file for the functional marker slice data.
2. The method of claim 1, wherein the driving data includes data collected on expressways, national roads, provincial roads and urban roads: the longitudinal position, lateral position and heading angle of all traffic participants within the detection range of the road-end sensing system, in the coordinate system of the road-end sensing system; longitudinal and lateral speed; and longitudinal and lateral acceleration.
3. The method for recognizing and transforming the automatic driving scene based on the road-end perception system according to claim 1, wherein traversing all vehicles which can be used as the host vehicle according to the driving data, and dividing the driving data into a plurality of segments by using different vehicles as the host vehicle comprises:
and in the data segment determined by taking time as a scale, taking different vehicle targets as a master vehicle, traversing slave vehicles which are in space-time association with the master vehicle, reestablishing independent data segments of the slave vehicles and the master vehicle, and naming the independent data segments of the master vehicle and the slave vehicles according to the ID of the master vehicle and the ID of the traversed slave vehicles.
4. The method for recognizing and converting the automatic driving scene based on the road-end sensing system as claimed in claim 1, wherein the step of performing coordinate conversion on the motion information of the slave vehicle and projecting the motion information into the coordinate system of the master vehicle comprises the steps of:
calculating the relative longitudinal distance D_f and the relative lateral distance D_l between the master vehicle and the slave vehicle:
D_f = (X_obj - X_ego) × cos(-α_2) + (Y_obj - Y_ego) × sin(-α_2)
D_l = -((Y_obj - Y_ego) × cos(-α_2) - (X_obj - X_ego) × sin(-α_2))
in the above formulas, D_f is the relative longitudinal distance, D_l is the relative lateral distance, X_ego, Y_ego are the abscissa and ordinate of the master vehicle, X_obj, Y_obj are the abscissa and ordinate of the slave vehicle, and α_2 is the heading angle of the master vehicle;
in the coordinate system of the master vehicle, the longitudinal and lateral velocity projections of the slave vehicle are:
V_Of = V_Ef × cos(-β_2) + V_El × sin(-β_2)
V_Ol = -V_El × cos(-β_2) - V_Ef × sin(-β_2)
in the above formulas, V_Of is the longitudinal velocity projection, V_Ol is the lateral velocity projection, V_Ef and V_El are the longitudinal and lateral speeds of the master vehicle relative to the slave vehicle, and β_2 is the centroid slip angle.
5. The method for recognizing and converting the automatic driving scene based on the road-end sensing system as claimed in claim 1, wherein judging whether the master vehicle and the slave vehicle are in the direct interaction zone, and estimating the collision time according to the judgment result comprises:
if |D_l| ≤ (W_E + W_o)/2, calculating the collision time directly;
if |D_l| > (W_E + W_o)/2, evaluating the lateral collision crisis first and then calculating the collision time;
in the above conditions, |D_l| is the relative lateral distance between the slave vehicle and the master vehicle, W_E is the width of the master vehicle, and W_o is the width of the slave vehicle;
the time to collision TTC is calculated as:
TTC = -D_f / V_Rf
in the above formula, D_f is the longitudinal distance from the slave vehicle to the master vehicle, and V_Rf is the relative longitudinal speed of the slave vehicle with respect to the master vehicle.
6. The method for automatic driving scene recognition and conversion based on the road-end sensing system as claimed in claim 5, wherein, if |D_l| > (W_E + W_o)/2, evaluating the lateral collision crisis first and then calculating the longitudinal collision time comprises the following steps:
calculating the relative speed of the slave vehicle relative to the master vehicle by taking the master vehicle as an observation coordinate system;
judging a transverse approaching trend according to the relative speed of the slave vehicle relative to the master vehicle;
calculating the transverse meeting time of the slave vehicle and the master vehicle according to the transverse distance and the transverse relative speed of the slave vehicle and the master vehicle through the transverse approaching trend judgment result;
calculating the traveling distance of the master vehicle in the transverse meeting time of the slave vehicle and the master vehicle according to the transverse meeting time and the longitudinal relative speed of the slave vehicle and the master vehicle;
calculating the diagonal radius of the slave vehicle from its length and width to obtain the collision area of the master vehicle and the slave vehicle;
and judging whether a collision crisis exists between the master vehicle and the slave vehicle according to the longitudinal distance from the slave vehicle to the master vehicle, the collision area of the master vehicle and the slave vehicle, and the distance the master vehicle can travel within the lateral meeting time of the slave vehicle and the master vehicle, and evaluating the collision time accordingly.
7. The method for recognizing and converting the automatic driving scene based on the road-end perception system according to claim 1, wherein the data slicing of the driving data comprises:
performing right-of-way contention scene data slicing on the driving data;
marking functions on the right-of-way contention scene slice data based on functional characteristics, according to a rule threshold judgment method;
and generating function-marked slice data from the function marks.
8. The automatic driving scene recognition and conversion method based on the road-end sensing system as claimed in claim 7, wherein the rule threshold determination method comprises:
AEB automatic emergency braking: the minimum deceleration of the master vehicle is 4 m/s², the minimum speed of the master vehicle is 15 km/h, and the TTC threshold is 2.4 s;
ACC adaptive cruise: the minimum speed of the master vehicle is 15 km/h, the maximum deceleration of the master vehicle is 4 m/s², and the TTC threshold is 3.0 s;
BSD blind zone monitoring: the master vehicle is in a lane-change scene, and the TTC threshold is 6.0 s;
FCW front collision warning: triggered before the AEB emergency braking phase, with a TTC threshold of 3.0 s;
TJA traffic jam assist: the master vehicle speed range is 0-20 km/h;
LCA lane change assist: the minimum running speed of the master vehicle is 60 km/h, the lane-change characteristic of the master vehicle is an LCA master-vehicle lateral speed of 1 m/s, and the slave-vehicle detection-zone characteristics are an LCA minimum longitudinal detection distance of 100 m and an LCA minimum lateral detection distance of 5 m.
9. The method of claim 8, wherein the BSD blind zone monitoring comprises:
line B is parallel to the rear edge of the master vehicle and is located 3.0 m behind it;
line C is parallel to the front edge of the master vehicle and passes through the center of the ninety-fifth percentile eye ellipse;
line F is parallel to the centerline of the master vehicle, on the left of the outermost left edge of the master vehicle body, 0.5 m from that edge;
line G is parallel to the centerline of the master vehicle, on the left of the outermost left edge of the master vehicle body, 3.0 m from that edge;
line K is parallel to the centerline of the master vehicle, on the right of the outermost right edge of the master vehicle body, 0.5 m from that edge;
line L is parallel to the centerline of the master vehicle, on the right of the outermost right edge of the master vehicle body, 3.0 m from that edge;
two determinations of the slave vehicle entering the blind zone:
any part of the slave vehicle is positioned in front of the line B, the slave vehicle is positioned completely behind the line C, the slave vehicle is positioned completely on the left side of the line F, and any part of the slave vehicle is positioned on the right side of the line G;
any part of the slave car is positioned in front of the line B, the slave car is positioned completely behind the line C, the slave car is positioned completely on the right side of the line K, and any part of the slave car is positioned on the left side of the line L.
10. The automatic driving scene recognition and transformation method based on the road-end perception system according to claim 1, characterized in that:
before the driving data are divided into a plurality of segments, normal distribution kernel fitting is carried out on the data set by adopting a three-sigma rule, and abnormal data are removed;
repairing a default portion of the driving data by linear interpolation fitting;
and performing smooth filtering processing on the independent data segments of the master vehicle and the slave vehicle after the coordinate conversion of the motion information of the slave vehicle.
CN202111400766.5A 2021-11-19 2021-11-19 Automatic driving scene recognition and conversion method based on road end perception system Active CN114064656B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111400766.5A CN114064656B (en) 2021-11-19 2021-11-19 Automatic driving scene recognition and conversion method based on road end perception system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111400766.5A CN114064656B (en) 2021-11-19 2021-11-19 Automatic driving scene recognition and conversion method based on road end perception system

Publications (2)

Publication Number Publication Date
CN114064656A (en) 2022-02-18
CN114064656B CN114064656B (en) 2024-05-14

Family

ID=80276745

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111400766.5A Active CN114064656B (en) 2021-11-19 2021-11-19 Automatic driving scene recognition and conversion method based on road end perception system

Country Status (1)

Country Link
CN (1) CN114064656B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116022081A (en) * 2023-01-05 2023-04-28 中国第一汽车股份有限公司 Anti-collision control method and device, vehicle and storage medium
CN117593892A (en) * 2024-01-19 2024-02-23 福思(杭州)智能科技有限公司 Method and device for acquiring true value data, storage medium and electronic equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150104757A1 (en) * 2013-10-15 2015-04-16 Mbfarr, Llc Driving assessment and training method and apparatus
CN109946688A (en) * 2019-03-18 2019-06-28 中国汽车工程研究院股份有限公司 Lane-change contextual data extracting method, device and server
US20200310428A1 (en) * 2019-03-28 2020-10-01 Baidu Online Network Technology (Beijing) Co., Ltd. Lane changing method, device for driverless vehicle and computer-readable storage medium
US20210089039A1 (en) * 2019-09-19 2021-03-25 Caterpillar Inc. System and method for avoiding contact between autonomous and manned vehicles caused by loss of traction
CN113487874A (en) * 2021-05-27 2021-10-08 中汽研(天津)汽车工程研究院有限公司 System and method for collecting, identifying and classifying following behavior scene data
CN113561974A (en) * 2021-08-25 2021-10-29 清华大学 Collision risk prediction method based on vehicle behavior interaction and road structure coupling
CN113568416A (en) * 2021-09-26 2021-10-29 智道网联科技(北京)有限公司 Unmanned vehicle trajectory planning method, device and computer readable storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150104757A1 (en) * 2013-10-15 2015-04-16 Mbfarr, Llc Driving assessment and training method and apparatus
CN109946688A (en) * 2019-03-18 2019-06-28 中国汽车工程研究院股份有限公司 Lane-change contextual data extracting method, device and server
US20200310428A1 (en) * 2019-03-28 2020-10-01 Baidu Online Network Technology (Beijing) Co., Ltd. Lane changing method, device for driverless vehicle and computer-readable storage medium
US20210089039A1 (en) * 2019-09-19 2021-03-25 Caterpillar Inc. System and method for avoiding contact between autonomous and manned vehicles caused by loss of traction
CN113487874A (en) * 2021-05-27 2021-10-08 中汽研(天津)汽车工程研究院有限公司 System and method for collecting, identifying and classifying following behavior scene data
CN113561974A (en) * 2021-08-25 2021-10-29 清华大学 Collision risk prediction method based on vehicle behavior interaction and road structure coupling
CN113568416A (en) * 2021-09-26 2021-10-29 智道网联科技(北京)有限公司 Unmanned vehicle trajectory planning method, device and computer readable storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
BORJA BOVCON et al.: "Improving vision-based obstacle detection on USV using inertial sensor", Proceedings of the 10th International Symposium on Image and Signal Processing and Analysis, 19 October 2017 (2017-10-19), pages 1-9 *
QIN JIAXIANG: "Research on Active Collision Avoidance Based on Hazardous Collision Scenario Modeling", China Master's Theses Full-text Database, Engineering Science and Technology II, 15 August 2020 (2020-08-15), pages 035-408 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116022081A (en) * 2023-01-05 2023-04-28 中国第一汽车股份有限公司 Anti-collision control method and device, vehicle and storage medium
CN117593892A (en) * 2024-01-19 2024-02-23 福思(杭州)智能科技有限公司 Method and device for acquiring true value data, storage medium and electronic equipment
CN117593892B (en) * 2024-01-19 2024-04-09 福思(杭州)智能科技有限公司 Method and device for acquiring true value data, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN114064656B (en) 2024-05-14

Similar Documents

Publication Publication Date Title
CN107346612B (en) Vehicle anti-collision method and system based on Internet of vehicles
CN106240458B (en) A kind of vehicular frontal impact method for early warning based on vehicle-mounted binocular camera
CN103295424B (en) Automobile active safety system based on video recognition and vehicle ad-hoc network
DE102016217645B4 (en) Method for providing information about a probable driving intention of a vehicle
CN110379203B (en) Driving steering collision early warning method
CN110155046A (en) Automatic emergency brake hierarchical control method and system
CN110723141B (en) Vehicle active collision avoidance system and collision avoidance mode switching method thereof
CN106448190B (en) Real-time monitoring and early warning device and method for traffic flow around self-vehicle on highway
CN109859513A (en) Road junction roadway air navigation aid and device
US11912286B2 (en) Driving risk identification model calibration method and system
CN114064656A (en) Automatic driving scene recognition and conversion method based on road end sensing system
CN113744563B (en) Road-vehicle risk real-time estimation method based on track data
CN111402626B (en) Safe following distance control system and control method based on vehicle-road cooperation
CN113192331B (en) Intelligent early warning system and early warning method for riding safety in internet environment
CN112462381B (en) Multi-laser radar fusion method based on vehicle-road cooperation
CN101101333A (en) Apparatus and method for producing assistant information of driving vehicle for driver
CN113487874A (en) System and method for collecting, identifying and classifying following behavior scene data
CN105632203B (en) A kind of traffic security early warning method of traffic control and system
CN105788360A (en) Vehicle anti-collision method, device and system
CN116071933B (en) Intelligent road early warning system based on vehicle-road cooperation
CN103101558A (en) Train collision avoidance system based on global position system (GPS) positioning
CN113428180A (en) Method, system and terminal for controlling single-lane running speed of unmanned vehicle
CN107564336B (en) Signalized intersection left turn conflict early warning system and early warning method
CN115775378A (en) Vehicle-road cooperative target detection method based on multi-sensor fusion
CN115691223A (en) Cloud edge-end cooperation-based collision early warning method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant