WO2021196144A1 - Method for identifying abnormal driving behavior - Google Patents

一种异常驾驶行为识别方法 (Method for identifying abnormal driving behavior)

Info

Publication number
WO2021196144A1
WO2021196144A1 (Application PCT/CN2020/083072)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
driving behavior
driving
data
abnormal
Prior art date
Application number
PCT/CN2020/083072
Other languages
English (en)
French (fr)
Inventor
马红占
俞佳伟
王改良
姜军
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to PCT/CN2020/083072 (WO2021196144A1)
Priority to EP20929078.2A (EP4120215A4)
Priority to CN202080004340.5A (CN112512890B)
Publication of WO2021196144A1
Priority to US17/959,066 (US20230025414A1)

Classifications

    • G08G1/0129 — Traffic data processing for creating historical data or processing based on historical data
    • B60W40/04 — Estimation of driving parameters related to ambient conditions: traffic conditions
    • B60W40/06 — Estimation of driving parameters related to ambient conditions: road conditions
    • B60W40/105 — Estimation of driving parameters related to vehicle motion: speed
    • B60W50/0098 — Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • B60W60/00188 — Planning or execution of driving tasks adapted for safety by employing degraded modes, related to detected security violation of control systems, e.g. hacking of a moving vehicle
    • G06N3/02 — Neural networks
    • G08G1/0133 — Traffic data processing for classifying traffic situation
    • B60W2520/06 — Input parameters relating to overall vehicle dynamics: direction of travel
    • B60W2520/10 — Input parameters relating to overall vehicle dynamics: longitudinal speed
    • B60W2552/53 — Input parameters relating to infrastructure: road markings, e.g. lane marker or crosswalk
    • B60W2554/4042 — Characteristics of dynamic objects: longitudinal speed
    • B60W2554/4044 — Characteristics of dynamic objects: direction of movement, e.g. backwards
    • B60W2554/4046 — Characteristics of dynamic objects: behavior, e.g. aggressive or erratic
    • G08G1/0141 — Measuring and analyzing of parameters relative to traffic conditions for traffic information dissemination
    • G08G1/096716 — Transmission of highway information where the received information does not generate an automatic action on the vehicle control
    • G08G1/096758 — Transmission of highway information where no selection takes place on the transmitted or the received information
    • G08G1/096775 — Transmission of highway information where the origin of the information is a central station
    • G08G1/096791 — Transmission of highway information where the origin of the information is another vehicle

Definitions

  • This application relates to the field of automatic driving, and in particular to a method for identifying abnormal driving behavior, an automatic driving system, and an intelligent driving vehicle adopting the automatic driving system.
  • With the rapid development of 5G communication and Internet of Vehicles technology, autonomous driving technology has become a research hotspot.
  • The core technologies in the field of automatic driving include intelligent environment perception, automatic navigation and positioning, driving behavior decision-making, and intelligent path planning and control.
  • Autonomous driving behavior decision-making technology is an important guarantee of the smooth and reliable operation of unmanned vehicles.
  • Reasonable evaluation of intelligent driving behavior decision-making is the basis of correct driving behavior decision-making.
  • Driving behavior decision-making and the recognition of abnormal driving behavior are likewise important parts of autonomous driving theory.
  • Abnormal driving behaviors include non-smooth driving behaviors such as speeding, rapid deceleration, rapid acceleration, and frequent lane changes. Providing a reliable and efficient method for identifying abnormal driving behaviors is of great practical significance for enhancing the user's driving experience and improving existing driving behavior decision-making.
  • However, the abnormal driving behavior identified in the prior art is only a suspicious abnormal driving behavior, rather than a genuinely abnormal driving behavior.
  • FIG. 1 shows three different situations. In the left figure, there is no other vehicle in the lane of vehicle 101; if the vehicle brakes (including sudden braking), this is abnormal braking behavior. In the middle figure, a vehicle 102 is moving slowly ahead of vehicle 101 in the same lane; in this case, braking (including sudden braking) is normal braking behavior. In the right figure, another vehicle 103 cuts in ahead of vehicle 101 at low speed; in this case, braking (including sudden braking) is likewise normal braking behavior.
  • Various embodiments of the present application provide a method for recognizing abnormal driving behaviors, a system, a non-transitory storage medium, and vehicles using the system.
  • The technical solution of the present application combines vehicle driving behavior data with vehicle driving scene data and, based on at least these two kinds of data, uses algorithm logic set for the specific scene to comprehensively determine whether the vehicle is in an abnormal driving state, thereby solving the problem that the recognition of abnormal driving behavior in the prior art is not accurate enough.
  • In a first aspect, an embodiment of the present application provides a method for identifying abnormal driving of a vehicle.
  • The method may specifically include: obtaining vehicle driving behavior data through an ECU (Electronic Control Unit) or an on-board control system, and using these data to determine whether the vehicle is in a state of suspicious abnormal driving behavior; obtaining current vehicle driving scene data through on-board devices such as lidar, millimeter-wave radar, cameras, and ultrasonic radar; and determining, based on the vehicle driving behavior data and the current vehicle driving scene data, whether the suspicious abnormal driving behavior is an actual abnormal driving behavior.
  • The above method combines the driving behavior data of the vehicle with the current driving scene data and judges whether the vehicle is in a state of abnormal driving behavior based on both kinds of information. This overcomes the inaccuracy of the prior art, which judges abnormal driving behavior from the driving behavior data alone, and also improves the safety of automatic driving.
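  • As an illustration, the two-stage flow above (behavior data flags a *suspicious* behavior; scene data then confirms or dismisses it) can be sketched in Python. Every function name, field name, and threshold below is an assumption made for illustration, not an interface from this application.

```python
# Illustrative two-stage pipeline: behavior data alone flags *suspicious*
# abnormal driving; scene data then confirms or dismisses the flag.
# All names and thresholds here are hypothetical sketches.

def is_suspected_abnormal(behavior):
    # Stage 1: crude stand-in for the clustering-based detector --
    # flag hard braking (strongly negative acceleration).
    return behavior["acceleration"] < -4.0  # m/s^2, assumed threshold

def confirm_abnormal(behavior, scene):
    # Stage 2: scene context decides whether the suspicious behavior is
    # genuinely abnormal (cf. FIG. 1: braking behind a slow lead vehicle
    # or a cut-in vehicle is normal).
    if not is_suspected_abnormal(behavior):
        return False
    obstacle_ahead = scene.get("lead_vehicle") or scene.get("cut_in")
    return not obstacle_ahead  # hard braking with a clear lane is abnormal

behavior = {"acceleration": -5.2}
print(confirm_abnormal(behavior, {"lead_vehicle": False, "cut_in": False}))  # True
print(confirm_abnormal(behavior, {"lead_vehicle": True}))                    # False
```

In this sketch, the same braking maneuver is classified differently depending on the scene, which is the core idea of the application.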
  • The driving behavior data of the vehicle may include: vehicle speed, vehicle acceleration, vehicle heading angle, and the deviation of the vehicle from the lane line. These driving behavior data can be obtained directly, or calculated indirectly from vehicle control parameters, and they are a direct manifestation of the driving behavior of the vehicle.
  • In some embodiments, the analysis may employ principal component analysis (PCA), kernel principal component analysis (KPCA), locally linear embedding (LLE), or the Laplacian eigenmap (LE).
  • The current driving scene data of the vehicle include: own-vehicle information parameters, other-vehicle information parameters, traffic signal parameters, lane line parameters, and road information parameters. This information characterizes the specific driving scene in which the vehicle is located. Some of these data are discrete and some are continuous; the continuous data can be discretized to facilitate subsequent processing and calculation.
  • In some embodiments, a neural network can be used to classify the discretized current vehicle driving scene data to determine the current driving scene.
  • The neural network includes at least one of the following: a convolutional neural network (CNN) and an extreme learning machine (ELM).
  • The determined current driving scene includes at least one of the following: deceleration at an intersection, deceleration on a road section, and riding on lane lines.
  • The algorithm logic corresponding to the current driving scene is then determined and used to judge whether the vehicle is in a state of abnormal driving behavior in that scene.
  • The algorithm logic can be a predetermined rule, or it can be adjusted according to actual judgment needs. Different current driving scenes each have their own algorithm logic.
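  • One way to organize "a distinct algorithm logic per scene" is a dispatch table mapping each recognized scene to its own rule function. The scene labels and rules below are illustrative assumptions, not rules taken from the application.

```python
# Hypothetical sketch: each recognized driving scene has its own judgment
# rule; a dispatch table selects the rule. Returns True if the suspicious
# behavior is judged genuinely abnormal in that scene.

def intersection_rule(scene):
    # Braking at an intersection is normal when a car ahead, an obstacle,
    # or a non-green left-turn light explains the deceleration.
    return not (scene.get("car_ahead") or scene.get("obstacle") or
                scene.get("red_left_turn_light"))

def road_section_rule(scene):
    # On an open road section, braking with nothing ahead is abnormal.
    return not scene.get("car_ahead")

SCENE_RULES = {
    "intersection_deceleration": intersection_rule,
    "road_section_deceleration": road_section_rule,
}

def is_abnormal(scene_label, scene):
    rule = SCENE_RULES.get(scene_label)
    return rule(scene) if rule else False  # unknown scene: do not flag

print(is_abnormal("intersection_deceleration", {"car_ahead": True}))  # False
print(is_abnormal("road_section_deceleration", {}))                   # True
```

New scenes can be supported by adding entries to the table, matching the text's note that technicians may adjust the logic per actual needs.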
  • In a second aspect, an embodiment of the present application provides an automatic driving assistance system, which includes: a first device for acquiring vehicle driving behavior data; a second device for acquiring current vehicle driving scene data; and a processor communicatively connected with the first device and the second device.
  • The processor is configured to determine whether the vehicle is in an abnormal driving state according to the acquired vehicle driving behavior data and the current vehicle driving scene data.
  • The first device may include an ECU (Electronic Control Unit).
  • The second device may include lidar, millimeter-wave radar, ultrasonic radar, and a digital camera.
  • In a third aspect, an embodiment of the present application provides an intelligent driving vehicle, which includes the automatic driving assistance system of the second aspect described above.
  • In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium including an instruction set, which can be executed by a processor to implement the method described in any one of the foregoing implementations of the first aspect.
  • Various embodiments of the present application provide an abnormal driving behavior recognition method, an automatic driving assistance system, a non-transitory storage medium, and an intelligent driving vehicle including the automatic driving assistance system.
  • The current driving scene information is introduced into the recognition of the vehicle's abnormal driving behavior, thereby solving the problem of misrecognition caused by the prior art's lack of consideration of driving scene information.
  • The solution of this application sets up multiple algorithm logics for specific driving scenes and determines the vehicle's abnormal driving behavior through logic operations over this family of algorithm logics; technicians can also extend or adjust these logics based on actual needs.
  • The technical solution of this application can be widely applied to automatic driving solutions, systems, and vehicles at different automation levels.
  • FIG. 1 is a schematic diagram of an abnormal driving recognition application scenario.
  • FIG. 2 is a diagram of an abnormal driving behavior recognition framework provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a suspicious abnormal driving behavior identification process provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a neural network for driving scene recognition provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of scene-based abnormal driving behavior recognition provided by an embodiment of the present application.
  • FIG. 6-1 is a schematic diagram of algorithm logic in a scenario provided by an embodiment of the present application.
  • FIG. 6-2 is a schematic diagram of algorithm logic in a scenario provided by an embodiment of the present application.
  • FIG. 6-3 is a schematic diagram of algorithm logic in a scenario provided by an embodiment of the present application.
  • FIG. 6-4 is a schematic diagram of algorithm logic in a scenario provided by an embodiment of the present application.
  • FIG. 6-5 is a schematic diagram of algorithm logic in a scenario provided by an embodiment of the present application.
  • FIG. 7-1 is a schematic diagram of algorithm logic in a scenario provided by an embodiment of the present application.
  • FIG. 7-2 is a schematic diagram of algorithm logic in a scenario provided by an embodiment of the present application.
  • FIG. 8-1 is a schematic diagram of algorithm logic in a scenario provided by an embodiment of the present application.
  • FIG. 8-2 is a schematic diagram of algorithm logic in a scenario provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of a non-transitory storage medium provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of an automatic driving assistance system provided by an embodiment of the present application.
  • The embodiments of the present application provide an abnormal driving recognition method that introduces the driving scene (the vehicle's surrounding environment information) into the recognition of abnormal driving behaviors, so that abnormal driving behaviors can be identified in different scenes, providing a way to identify abnormal driving behaviors accurately.
  • FIG. 2 shows a schematic flowchart 200 of abnormal driving behavior recognition based on an embodiment of the present application, including:
  • The vehicle driving behavior data mainly include quantifiable vehicle control parameters, which can be obtained from the on-board computer (ECU) or the vehicle control system.
  • The vehicle control parameters include: (1) sudden braking behavior parameter: vehicle acceleration; (2) sudden acceleration behavior parameter: vehicle acceleration; (3) lane deviation behavior parameter: the deviation of the vehicle from the center line of the lane; (4) speed anomaly behavior parameter (vehicle speed lower or higher than the traffic flow speed): vehicle speed; (5) turning behavior parameter: vehicle heading angle.
  • These driving behavior data can be obtained directly, or calculated indirectly from vehicle control parameters, and they are a direct manifestation of the driving behavior of the vehicle. It should be understood that the vehicle driving behavior described herein may include manual driving or automated driving at various levels (for example, SAE levels L0-L5).
  • In step 202, the driving behavior data obtained in step 201 are analyzed to identify suspicious abnormal driving behaviors.
  • In step 301, driving behavior data are obtained for data modeling and analysis; in step 303, statistical feature values of the driving behavior data are extracted, such as the mean, root mean square, maximum, minimum, peak, and kurtosis factor; in step 304, cluster analysis is performed on the data obtained in step 303; in step 305, after the clustering is completed, the proportions of the data in the normal-behavior cluster and in the suspicious-abnormal-behavior cluster relative to the total clustered data are determined, and the boundary between the normal cluster and the suspicious-abnormal-driving-behavior cluster is determined; in step 306, suspicious abnormal driving behaviors are identified based on the result of step 305.
  • The aforementioned cluster analysis methods may include (but are not limited to): principal component analysis (PCA), kernel principal component analysis (KPCA), locally linear embedding (LLE), and the Laplacian eigenmap (LE).
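  • The feature-extraction and clustering pipeline described above can be sketched as follows. The synthetic data, window sizes, and the use of plain PCA with 2-means clustering are assumptions for illustration; the application leaves the choice of methods open.

```python
import numpy as np

# Illustrative sketch of the pipeline: extract statistical features (mean,
# RMS, max, min, peak, kurtosis factor) from windows of acceleration data,
# reduce with PCA, and cluster into "normal" vs "suspicious abnormal".

def features(window):
    w = np.asarray(window, dtype=float)
    mean = w.mean()
    rms = np.sqrt((w ** 2).mean())
    peak = np.abs(w).max()
    kurtosis = ((w - mean) ** 4).mean() / (w.std() ** 4 + 1e-12)
    return np.array([mean, rms, w.max(), w.min(), peak, kurtosis])

def pca(X, k=2):
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T  # project onto top-k principal components

def two_means(X, iters=20):
    # Minimal 2-cluster k-means; the smaller cluster is treated as the
    # suspicious-abnormal cluster, per the cluster-proportion idea above.
    centers = X[[0, len(X) - 1]]  # deterministic init, one point per end
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = np.argmin(d, axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in (0, 1)])
    return labels

rng = np.random.default_rng(1)
normal = [rng.normal(0.0, 0.3, 50) for _ in range(40)]   # gentle speed changes
braking = [rng.normal(-5.0, 0.5, 50) for _ in range(5)]  # hard decelerations
X = np.array([features(w) for w in normal + braking])

labels = two_means(pca(X))
minority = np.argmin(np.bincount(labels))
suspected = np.where(labels == minority)[0]
print(sorted(suspected.tolist()))  # [40, 41, 42, 43, 44] -- the braking windows
```

The minority cluster plays the role of the suspicious-abnormal-behavior data cluster; its boundary against the normal cluster is implied by the cluster assignment.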
  • Acquiring current vehicle driving scene data mainly means determining the driving scene of the vehicle based on the vehicle's surrounding environment information.
  • Five types of information are mainly involved: (1) other-vehicle information parameters, such as (but not limited to) the number of surrounding vehicles and their speeds, accelerations, heading angles, and lateral and longitudinal position coordinates; (2) own-vehicle information parameters, such as the vehicle's lateral and longitudinal position coordinates and heading angle; (3) traffic light parameters, such as whether the left-turn light is red/yellow/green; (4) lane line information parameters, such as lane line position, lane width, and number of lanes; (5) road information parameters, such as road type and whether the location is an intersection.
  • The current vehicle driving scene data can be obtained through on-board devices such as lidar, millimeter-wave radar, cameras, and ultrasonic radar. Alternatively, a control center can notify the vehicle of the driving scene data around it through communication between the vehicle and the control center, or other vehicles can notify the vehicle of the surrounding driving scene data through vehicle-to-vehicle communication, or any combination of the above may be used; this application does not limit the acquisition method.
  • Some of the aforementioned parameters are continuous and some are discrete. In some embodiments, the continuous parameters may be discretized to facilitate subsequent identification of the current driving scene of the vehicle.
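  • Discretizing a continuous scene parameter can be as simple as binning it against fixed boundaries. The parameter choice (own-vehicle speed), bin edges, and labels below are assumptions for illustration, not values from this application.

```python
from bisect import bisect_right

# Illustrative discretization of a continuous scene parameter (here,
# own-vehicle speed in m/s) into labeled bins.

speed_edges = [0.5, 5.0, 15.0]                       # assumed bin boundaries (m/s)
speed_labels = ["stopped", "slow", "urban", "fast"]  # one label per bin

def discretize_speed(v):
    # bisect_right returns the index of the bin that v falls into
    return speed_labels[bisect_right(speed_edges, v)]

print([discretize_speed(v) for v in (0.0, 3.2, 12.0, 25.0)])
# ['stopped', 'slow', 'urban', 'fast']
```

Discrete values of all scene parameters can then be concatenated into a fixed-length vector for the scene classifier.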
  • Machine learning methods are used to identify the current driving scene of the vehicle from the current driving scene data.
  • In some embodiments, a neural network may be used to identify the driving scene of the vehicle.
  • The neural network may use an extreme learning machine (ELM) model.
  • Figure 4 shows a schematic 400 of using an extreme learning machine to recognize the current driving scene.
  • The extreme learning machine includes an input layer 403, a hidden layer 404, and an output layer 405. The current vehicle driving scene data 401 are input to the input layer 403, pass through the (single) hidden layer 404 to the output layer 405, and the classification result (scene recognition result 402) is output. It should be understood that the extreme learning machine needs to go through a training stage before use.
  • That is, the extreme learning machine is trained with vehicle data from different scenes so that it can recognize those scenes, and it can be used once training is complete.
  • In some embodiments, the extreme learning machine model is trained using discretized scene parameters, and combinations of the discretized scene parameters constitute the scenes.
  • For example, the following scenes may be distinguished at an intersection: (1) there is a car within the forward range of the vehicle crossing the intersection; (2) a U-turn scene; (3) the vehicle is located in the left-turn waiting area and the traffic light is not green; (4) the speed of the vehicle entering the intersection is too high; (5) there is an obstacle within the vehicle's safety range at the intersection.
  • Any suitable classifier in the field of machine learning may also be used, for example, a convolutional neural network.
  • As noted above, a suspicious abnormal driving behavior is not necessarily an actual abnormal driving behavior.
  • To decide, the current driving scene information of the vehicle must be considered. Once the current driving scene information is added, the algorithm logic corresponding to the driving scene determines which suspicious abnormal driving behaviors are actual abnormal driving behaviors and which are normal driving behaviors.
  • FIG. 5 shows a schematic diagram 500 of a scenario-based abnormal driving behavior recognition process provided by an embodiment of the present application, including:
  • In step 507, the abnormal driving behaviors among the suspicious abnormal driving behaviors are determined based on step 506;
  • In step 508, the normal driving behaviors among the suspicious abnormal driving behaviors are determined based on step 506;
  • FIGS. 6-1 to 6-5 show the recognition algorithm and judgment logic for abnormal driving behavior (deceleration/braking) in the intersection scene according to the embodiment.
  • The intersection scene parameters include other-vehicle information (number of other vehicles, their speeds, accelerations, positions relative to the own vehicle, absolute positions, and heading angles), road information (zebra crossing, whether there is a turn-waiting area, whether a U-turn is involved, stop line), own-vehicle information (mainly vehicle position, straight/left-turn/right-turn status, etc.), and traffic light information (straight-light color, straight-light remaining time, left-turn-light color, left-turn-light remaining time, etc.).
  • The intersection scene includes the following five types:
  • In that case, the braking is not an abnormal driving behavior but a normal driving behavior.
  • Acceleration is used as the behavior parameter for the deceleration behavior.
  • A positive acceleration indicates that the vehicle is accelerating, and a change in acceleration from positive (or zero) to negative indicates that the vehicle is braking and decelerating.
  • Kalman filtering can be used to filter out changes in vehicle speed due to road bumps or small changes in accelerator force (the speed fluctuations that occur in this case do not represent the vehicle.
  • other suitable filtering methods or data analysis methods can also be used to filter out the data that does not represent the actual deceleration.
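The filtering step above can be illustrated with a minimal one-dimensional Kalman filter applied to a series of speed samples. The noise parameters `q` and `r` and the constant-state model are illustrative assumptions, not values from the embodiment:

```python
# A minimal 1-D Kalman filter for smoothing noisy vehicle-speed samples,
# so that bump-induced fluctuations do not register as deceleration.
# q (process noise) and r (measurement noise) are assumed example values.
def kalman_smooth(speeds, q=0.01, r=0.5):
    """Return a smoothed copy of `speeds` (m/s) using a constant-state model."""
    x = speeds[0]   # state estimate, initialised from the first sample
    p = 1.0         # estimate covariance
    out = []
    for z in speeds:
        p += q                   # predict: covariance grows by process noise
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # update toward the measurement z
        p *= (1 - k)
        out.append(x)
    return out
```

A short bump-laden speed trace run through this filter stays close to the true cruising speed, so the derived acceleration does not flip sign on every bump.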
  • after the acceleration parameter data are obtained, statistical values such as the mean, root mean square, maximum, minimum, peak value, and kurtosis factor are extracted from them; feature extraction is performed on these data, and the feature-extracted behavior parameter data are then clustered. After clustering is complete, the proportions of the normal-behavior data cluster and the suspicious-abnormal-behavior data cluster in the total clustered data are determined, and the boundary between the normal data cluster and the suspicious abnormal braking-behavior data cluster is determined to identify suspicious braking behavior.
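The statistics-plus-clustering pipeline described above can be sketched roughly as follows. The stdlib-only two-means routine and the choice of exactly two clusters (normal vs. suspicious) are simplifying assumptions standing in for whatever clustering method the embodiment uses:

```python
import math

def window_features(acc):
    """Statistical features of one window of acceleration samples (m/s^2):
    mean, RMS, max, min, peak-to-peak, and kurtosis factor."""
    n = len(acc)
    mean = sum(acc) / n
    rms = math.sqrt(sum(a * a for a in acc) / n)
    var = sum((a - mean) ** 2 for a in acc) / n
    # kurtosis factor: 4th central moment normalised by variance squared
    kurt = (sum((a - mean) ** 4 for a in acc) / n) / (var ** 2) if var > 0 else 0.0
    return {"mean": mean, "rms": rms, "max": max(acc), "min": min(acc),
            "peak_to_peak": max(acc) - min(acc), "kurtosis": kurt}

def two_means_1d(values, iters=20):
    """Tiny 1-D 2-means: split a feature (e.g. window minimum acceleration)
    into two clusters and return (boundary, labels)."""
    c0, c1 = min(values), max(values)
    for _ in range(iters):
        g0 = [v for v in values if abs(v - c0) <= abs(v - c1)]
        g1 = [v for v in values if abs(v - c0) > abs(v - c1)]
        if g0:
            c0 = sum(g0) / len(g0)
        if g1:
            c1 = sum(g1) / len(g1)
    boundary = (c0 + c1) / 2          # decision boundary between the clusters
    labels = [int(abs(v - c0) > abs(v - c1)) for v in values]
    return boundary, labels
```

Windows whose features fall on the harsh-deceleration side of the boundary would then be flagged as suspicious braking.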
  • a trained extreme learning machine may be used to identify, from the vehicle driving scene data, which specific intersection scene the vehicle is currently in; the vehicle driving scene data may include processed, discretized data.
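An extreme learning machine of the kind mentioned above can be sketched as a fixed random hidden layer whose output weights are solved in closed form by least squares. The layer sizes, seed, and toy data are illustrative assumptions; the embodiment presumes an already-trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

class ELM:
    """Minimal extreme learning machine classifier: random hidden layer,
    output weights obtained by a closed-form least-squares solve."""
    def __init__(self, n_in, n_hidden, n_out):
        self.W = rng.normal(size=(n_in, n_hidden))   # fixed random input weights
        self.b = rng.normal(size=n_hidden)           # fixed random biases
        self.beta = np.zeros((n_hidden, n_out))      # learned output weights

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y_onehot):
        H = self._hidden(X)
        self.beta = np.linalg.pinv(H) @ y_onehot     # least-squares solution
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)
```

In this setting, each row of `X` would be a discretized scene-feature vector and each output class one of the candidate intersection scenes.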
  • FIG. 6-1 shows the algorithm logic 61 of the vehicle given in the embodiment in the above scenario (1), including:
  • the driving behavior data of the vehicle is processed by, for example, the aforementioned Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA), Locally Linear Embedding (LLE), or Laplacian Eigenmap (LE) to identify suspicious abnormal driving behaviors and obtain the starting time of the vehicle's deceleration;
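Of the four dimensionality-reduction candidates named above, PCA is the simplest and can be sketched as a plain eigen-decomposition of the covariance matrix. The embodiment only names the methods, so the following is a generic illustration, not the patent's implementation:

```python
import numpy as np

def pca_reduce(X, k=1):
    """Project feature windows X (n_samples x n_features) onto the top-k
    principal components; points far out along the components can then be
    screened as suspicious behavior windows."""
    Xc = X - X.mean(axis=0)                       # centre the data
    cov = np.cov(Xc, rowvar=False)                # feature covariance matrix
    vals, vecs = np.linalg.eigh(cov)              # eigenvalues ascending
    top = vecs[:, np.argsort(vals)[::-1][:k]]     # top-k eigenvectors
    return Xc @ top
```

KPCA, LLE, and LE would replace the linear projection with a kernelized or manifold-based one, but the screening idea is the same.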
  • Obstacles may include (but are not limited to): vehicles, pedestrians, animals, roadblocks, or any other objects/people/animals in the lane that may hinder driving.
  • 614 If the judgment result of 613 is yes, there is an obstacle in the collision area in front of the vehicle; if the vehicle did not brake, a collision would be highly likely. In this case, the vehicle's deceleration/braking can therefore be judged not to be abnormal driving behavior but normal driving behavior;
  • 615 If the judgment result of 613 is no, there are no obstacles in the collision area in front of the vehicle. In this case, the deceleration/braking may be abnormal driving behavior but may also be normal driving behavior; the driving state of the vehicle is left pending, and all five scenarios above need to be judged to obtain the final result.
  • FIG. 6-2 shows the algorithm logic 62 of the vehicle given in the embodiment in the above scenario (2), including:
  • the driving behavior data of the vehicle is processed by, for example, the aforementioned Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA), Locally Linear Embedding (LLE), or Laplacian Eigenmap (LE) to identify suspicious abnormal driving behaviors and obtain the starting time of the vehicle's deceleration/braking;
  • 624 If the judgment result of the preceding step is no, the vehicle has not made a U-turn since the moment of deceleration. In this case, the deceleration/braking may be abnormal driving behavior but may also be normal driving behavior; the driving state of the vehicle is left pending, and all five scenarios above need to be judged to obtain the final result.
  • FIG. 6-3 shows the algorithm logic 63 of the vehicle given in the embodiment in the above scenario (3), including:
  • the driving behavior data of the vehicle is processed by, for example, the aforementioned Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA), Locally Linear Embedding (LLE), or Laplacian Eigenmap (LE) to identify suspicious abnormal driving behaviors and obtain the starting time of the vehicle's deceleration/braking;
  • 633 If the judgment result of 632 is yes, the vehicle's speed when entering the intersection is high. According to common sense, if the speed entering the intersection is too high, the brake should generally be applied; therefore, in this case the vehicle's braking can be determined not to be abnormal driving but normal driving behavior;
  • 634 If the judgment result of 632 is no, the vehicle's speed when entering the intersection is low and, according to common sense, the vehicle does not need to brake when entering the intersection. The deceleration/braking in this case may therefore be abnormal driving behavior but may also be normal driving behavior; the driving state of the vehicle is left pending, and all five scenarios above need to be judged to obtain the final result.
  • FIG. 6-4 shows the algorithm logic 64 of the vehicle given in the embodiment in the above scenario (4), including:
  • the driving behavior data of the vehicle is processed by, for example, the aforementioned Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA), Locally Linear Embedding (LLE), or Laplacian Eigenmap (LE) to identify suspicious abnormal driving behaviors and obtain the starting time of the vehicle's deceleration/braking;
  • FIG. 6-5 shows the algorithm logic 65 of the vehicle given in the embodiment in the above scenario (5), including:
  • the driving behavior data of the vehicle is processed by, for example, the aforementioned Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA), Locally Linear Embedding (LLE), or Laplacian Eigenmap (LE) to identify suspicious abnormal driving behaviors and obtain the starting time of the vehicle's deceleration/braking;
  • Obstacles may include (but are not limited to): vehicles, pedestrians, animals, roadblocks, or any other objects/people/animals in the lane that may hinder driving.
  • 655 Determine whether the trajectories of the obstacle and the vehicle intersect. Still taking the obstacle being a vehicle as an example, based on the vehicle trajectory obtained at 653 (trajectory 1) and the obstacle trajectory obtained at 654 (trajectory 2), determine whether trajectory 1 and trajectory 2 intersect (including on their extensions); the two trajectories can be analyzed and judged in a common earth coordinate system. If the judgment result of 655 is no, that is, the trajectories of the vehicle and the obstacle do not intersect, there is no possibility of a collision between them, and the logic returns to the previous step 654 and continues with the next obstacle (for example, a bicycle in motion). In some embodiments, lidar, millimeter-wave radar, ultrasonic radar, or a combination of them can be used to detect obstacles and determine when an obstacle passes the intersection point; in other embodiments, the vehicle's driving data can be used to determine when the vehicle passes the intersection point. It should be pointed out that the obstacle may also be stationary, such as a roadblock.
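The trajectory-intersection test of step 655, performed in a common earth coordinate frame, can be sketched with the standard 2-D orientation test. Treating each trajectory as a straight segment between sampled points is a simplifying assumption:

```python
def segments_intersect(p1, p2, p3, p4):
    """True if segment p1-p2 properly crosses segment p3-p4 (2-D points in a
    common earth frame). Collinear-overlap cases are ignored for brevity."""
    def cross(o, a, b):
        # z-component of (a - o) x (b - o): sign gives b's side of line o->a
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1 = cross(p3, p4, p1)
    d2 = cross(p3, p4, p2)
    d3 = cross(p1, p2, p3)
    d4 = cross(p1, p2, p4)
    # proper crossing: each segment's endpoints straddle the other segment
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))
```

A curved trajectory would be checked piecewise, segment against segment, and "including the extended intersection" would extend the endpoint segments along their headings.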
  • T can be calculated from the following parameters: the trajectory length S (the length of the vehicle's trajectory between its position at the moment deceleration began and the intersection point), the instantaneous speed V of the vehicle at the moment deceleration began, and the instantaneous acceleration a0 at the adjacent time before the deceleration occurred. For example, if the vehicle starts to decelerate at time Tc, and Tc is, say, 12:05:10 (12 o'clock, 5 minutes, 10 seconds), then the adjacent time Tc-1 before Tc can be 12:05:09 (12 o'clock, 5 minutes, 9 seconds).
  • Tc is the time when the deceleration starts, so the acceleration a0 at time Tc-1 must be non-negative. When a0 is positive, T can be calculated from the kinematic relation S = V·T + ½·a0·T². Adding T to the time at which the deceleration occurred gives the time T1 at which the vehicle would reach the intersection point without braking. For example, if the deceleration occurs at 12:05:10 and the time required for the vehicle to reach the intersection point without braking is 10 seconds, then T1 is 12:05:20. It should be understood that in some embodiments the time accuracy is calculated in seconds, but those skilled in the art can adjust the time-unit accuracy according to the actual situation without departing from the spirit of this application. It should also be understood that the above embodiment calculates T using the formula for uniformly accelerated motion; when a0 is 0, the formula for uniform motion can be used instead, that is, T = S / V.
  • 3 seconds is selected as the safety time difference: if the vehicle and the obstacle pass the intersection point more than 3 seconds apart, the situation is considered safe; otherwise it is considered unsafe, that is, the vehicle may collide with the obstacle. It should be understood that the 3 seconds here is a value determined based on some embodiments, and those skilled in the art can adjust this value according to actual needs without departing from the concept of this application;
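The travel-time calculation and the 3-second safety comparison above can be sketched as follows, using the uniform-acceleration relation S = V·T + ½·a0·T² with a fallback to uniform motion (T = S/V) when a0 is 0; the function names are illustrative:

```python
import math

def time_to_intersection(S, V, a0):
    """Time T (s) for the vehicle to cover trajectory length S (m) starting
    at speed V (m/s) with pre-braking acceleration a0 (m/s^2), from
    S = V*T + 0.5*a0*T^2; a0 == 0 reduces to uniform motion."""
    if a0 == 0:
        return S / V
    # positive root of 0.5*a0*T^2 + V*T - S = 0
    return (-V + math.sqrt(V * V + 2 * a0 * S)) / a0

def is_safe(t_vehicle, t_obstacle, margin=3.0):
    """True if the vehicle and the obstacle pass the trajectory intersection
    more than `margin` seconds apart (3 s is the example threshold)."""
    return abs(t_vehicle - t_obstacle) > margin
```

T1 is then obtained by adding `time_to_intersection(...)` to the clock time at which the deceleration began.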
  • 659 If the judgment result of 658 is yes, the vehicle and the obstacle reach the intersection point within the safe time difference; if the vehicle did not brake, a collision with the obstacle would be likely, so according to common sense the vehicle should brake to avoid a collision. In this case it can therefore be determined that the braking does not belong to abnormal driving, that is, it is normal driving behavior;
  • the vehicle does not need to brake.
  • the braking that occurs in this case may be abnormal braking, but it may also be normal braking.
  • the driving state of the vehicle is pending, and it is necessary to judge all the above five scenarios to get the final result.
  • the vehicle system will run all of the above algorithm logics 61-65. If branch 615 of algorithm logic 61 is tentatively determined as "may belong to normal braking", the system will continue to judge algorithm logics 62-65. If the branch results obtained in algorithm logics 62-65 are all "may belong to normal braking", then it can be determined that, in the intersection scene, the vehicle's deceleration/braking does not belong to any of the above five scenarios, so it can be confirmed that the braking is an abnormal driving behavior.
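The all-scenarios-pending rule described above can be sketched as a small combiner. The string verdicts are an illustrative encoding of the branch results, not the patent's actual data structures:

```python
def scenario1_judgment(obstacle_in_collision_zone):
    """Branch of logic 61: braking with an obstacle in the collision area is
    definitively normal; otherwise the verdict stays pending."""
    return "normal" if obstacle_in_collision_zone else "pending"

def classify_braking(per_scenario_results):
    """Combine the verdicts of logics 61-65: if any scenario explains the
    braking it is normal driving; only if all five remain pending is the
    braking confirmed as abnormal driving behavior."""
    return "normal" if "normal" in per_scenario_results else "abnormal"
```

The same combiner shape applies to the road-segment logics (71-72) and the lane-line logics (81-82), just with fewer scenarios.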
  • Figures 7-1 to 7-2 show the algorithm logic for judging abnormal driving behavior (brake) in a road segment scene based on the embodiment.
  • FIG. 7-1 shows the algorithm logic 71 of the vehicle given in the embodiment in the above scenario (1), including:
  • the driving behavior data of the vehicle is processed by, for example, the aforementioned Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA), Locally Linear Embedding (LLE), or Laplacian Eigenmap (LE) to initially identify suspicious abnormal driving behaviors and obtain the starting time of the vehicle's deceleration;
  • Obstacles can include (but are not limited to): vehicles, pedestrians, roadblocks, or any other objects in the lane that may hinder the traffic;
  • FIG. 7-2 shows the algorithm logic 72 of the vehicle given in the embodiment in the above scenario (2), including:
  • the driving behavior data of the vehicle is processed by, for example, the aforementioned Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA), Locally Linear Embedding (LLE), or Laplacian Eigenmap (LE) to initially identify suspicious abnormal driving behaviors and obtain the starting time of the vehicle's deceleration;
  • the control center can also notify the vehicle of the obstacle information around it through communication between the vehicle and the control center, or another vehicle can notify the vehicle of the obstacle information around it through communication between the vehicle and other vehicles, or a combination of the above methods may be used; this application does not limit this.
  • Obstacles may include (but are not limited to): vehicles, pedestrians, animals, roadblocks, or any other objects/people/animals in the lane that may hinder driving.
  • T can be calculated from the following parameters: the trajectory length S (the length of the vehicle's trajectory between its position at the moment deceleration began and the intersection point), the instantaneous speed V of the vehicle at the moment deceleration began, and the instantaneous acceleration a0 at the adjacent time before the deceleration occurred.
  • the acceleration a0 at time Tc-1 must be non-negative. When a0 is positive, T can be calculated from the kinematic relation S = V·T + ½·a0·T². Adding T to the time at which the deceleration occurred gives the time T1 at which the vehicle would reach the intersection point without braking. For example, if the deceleration occurs at 12:05:10 and the time required for the vehicle to reach the intersection point without braking is 10 seconds, then T1 is 12:05:20.
  • 7210 If the judgment result of 728 is no, it means that the vehicle and the obstacle have reached the intersection outside the safe time difference range. In this case, the possibility of collision between the vehicle and the obstacle is very small. According to common sense, the vehicle does not need to brake.
  • the braking that occurs in this case may be an abnormal driving behavior, but it may also be a normal driving behavior. At this time, the driving state of the vehicle is pending, and it is necessary to judge all the above five scenarios to get the final result.
  • the vehicle system will run all of the above algorithm logics 71-72. If branch 715 of algorithm logic 71 is tentatively determined as "may belong to normal braking", the system will continue to judge algorithm logic 72. If the branch results obtained in algorithm logic 72 are all "may belong to normal braking", then it can be determined that, in the road-segment scene, the vehicle's deceleration/braking does not belong to either of the above two scenarios, so it can be confirmed that the braking is an abnormal driving behavior.
  • Figures 8-1 to 8-2 show the algorithm logic for determining abnormal driving behavior (pressing lane lines) in a road scene based on the embodiment.
  • the deviation value of the vehicle from the center line of the lane is used as the behavior parameter of the lane line pressing behavior, and the value of the parameter is determined by the distance value from the center point of the rear axle of the vehicle to the left and right lane lines.
  • the deviation value between the vehicle and the lane centerline is obtained.
  • statistical values such as the mean, root mean square, maximum, minimum, peak value, and kurtosis factor are extracted from the deviation-value data between the vehicle and the lane centerline; feature extraction is then performed on these data, and cluster analysis is performed on the feature-extracted behavior parameter data.
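The deviation parameter defined above, derived from the distances of the rear-axle center point to the left and right lane lines, can be computed with a small helper. The sign convention and the default vehicle width are assumptions for illustration only:

```python
def centerline_deviation(d_left, d_right):
    """Signed deviation (m) of the rear-axle center point from the lane
    center line, given its distances to the left and right lane lines;
    0 means centered, positive means offset toward the left line."""
    return (d_right - d_left) / 2.0

def presses_lane_line(d_left, d_right, vehicle_width=1.8):
    """True if either lane line lies under the vehicle body (lane pressing),
    assuming the distances are measured to the vehicle's center; the 1.8 m
    default width is an assumed example value."""
    half = vehicle_width / 2.0
    return d_left < half or d_right < half
```

Windows of `centerline_deviation` values would then feed the same statistics-and-clustering pipeline used for the braking parameter.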
  • a trained extreme learning machine can be used to identify which specific road scene the vehicle is currently in through the vehicle driving scene data.
  • FIG. 8-1 shows the algorithm logic 81 of the vehicle given in the embodiment in the above scenario (1), including:
  • the driving behavior data of the vehicle is processed by, for example, the aforementioned Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA), Locally Linear Embedding (LLE), or Laplacian Eigenmap (LE) to initially identify suspicious abnormal driving behaviors and obtain the starting time of the vehicle's lane-line pressing;
  • the logics of 812 and 813 indicate that the vehicle's lane-line pressing behavior is accompanied by the vehicle passing through the intersection.
  • the lane-line pressing behavior in this case may be an abnormal driving behavior but may also be a normal driving behavior; at this time the driving state of the vehicle is left pending, and both of the above two scenarios need to be judged to obtain the final result.
  • FIG. 8-2 shows the algorithm logic 82 of the vehicle given in the embodiment in the above scenario (2), including:
  • the vehicle driving behavior data is processed by, for example, the aforementioned Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA), Locally Linear Embedding (LLE), or Laplacian Eigenmap (LE) to initially identify suspicious abnormal driving behaviors and obtain the starting time of the vehicle's lane-line pressing;
  • the lane pressing behavior in this case may be an abnormal driving behavior, but it may also be a normal driving behavior. At this time, the driving state of the vehicle is pending, and it is necessary to judge both of the above two scenarios to get the final result.
  • the lane ID of the vehicle is obtained, and whether the vehicle has changed lanes is determined based on it;
  • the technician can also use other appropriate means to obtain the lane information, for example, obtaining the vehicle's lane information from image information through machine learning methods.
  • the system will continue to judge logic 82. If the branch results obtained in logics 81-82 are all "may belong to normal lane-line pressing", it can be determined that, in the road scene, the lane-line pressing does not belong to either of the above two scenarios, so it can be confirmed that the lane-line pressing is an abnormal driving behavior.
  • a computer-readable storage medium 901 is provided, which includes an instruction set 903 that can be executed by the processor 902 to: obtain vehicle driving behavior data; obtain vehicle driving scene data;
  • the aforementioned algorithm logic 61-65, 71-72, 81-82 is executed to determine whether the vehicle is in an abnormal driving state.
  • an automatic driving assistance system is provided, and the automatic driving assistance system includes:
  • the vehicle driving behavior data mainly includes quantifiable vehicle control parameters, which can be obtained from the trip computer ECU or the on-board control system. The vehicle control parameters include: (1) sudden braking behavior parameter: vehicle acceleration; (2) sudden acceleration behavior parameter: vehicle acceleration; (3) deviation value of the vehicle from the lane centerline; (4) vehicle speed lower or higher than the speed of the traffic flow: vehicle speed; (5) turning behavior parameter: vehicle heading angle.
  • These driving behavior data can be obtained directly or indirectly calculated by vehicle control parameters, and they are a direct manifestation of the driving behavior of the vehicle. It should be understood that the vehicle driving behavior described herein may include manual driving or automatic driving under various levels (for example, SAE specifications L0-L5).
  • a second device 1002 for acquiring current vehicle driving scene data may include but is not limited to: lidar, millimeter wave radar, camera, ultrasonic radar, communication device, etc.
  • the current vehicle driving scene data mainly includes: (1) other-vehicle information parameters, such as the number of surrounding vehicles and their speed, acceleration, heading angle, and horizontal and vertical position coordinates (including but not limited to these); (2) ego-vehicle information parameters, such as the vehicle's horizontal and vertical position coordinates and heading angle; (3) traffic-light parameters, such as whether the left-turn light is red/yellow/green; (4) lane-line information parameters, such as lane-line position, lane width, and number of lanes; (5) road information parameters, such as road type and intersections.
  • the vehicle driving scene data can be obtained through the above-mentioned lidar, millimeter-wave radar, camera, ultrasonic radar, and other on-board devices; the control center can also notify the vehicle of the driving scene data around it through the communication device between the vehicle and the control center, or another vehicle can notify the vehicle of the surrounding driving scene data through communication between the vehicle and other vehicles, or a combination of the above methods may be used, which is not limited in this application.
  • Some of the above-mentioned parameters are continuous parameters, and some are discrete parameters. In some embodiments, the continuous parameters may be discretized to facilitate subsequent driving scene recognition.
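The discretization of continuous parameters mentioned above can be sketched with a simple binning helper. The bin boundaries used in the test below (for example, speed bands of 5, 15, and 30 m/s) are assumed example values, not parameters from the embodiment:

```python
def discretize(value, bins):
    """Map a continuous parameter value to a bin index, where `bins` is an
    ascending list of upper bounds; values above all bounds fall in the
    last, open-ended bin."""
    for i, upper in enumerate(bins):
        if value < upper:
            return i
    return len(bins)
```

Each continuous scene parameter would be binned this way before being fed to the scene-recognition model.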
  • an intelligent driving vehicle is provided, which includes an automatic driving assistance system, and the automatic driving assistance system includes:
  • the driving behavior data mainly includes quantifiable vehicle control parameters, which can be obtained from the trip computer ECU or the on-board control system.
  • Vehicle control parameters include: (1) Sudden braking behavior parameter: vehicle acceleration; (2) Sudden acceleration behavior parameter: vehicle acceleration; (3) Vehicle distance from the lane centerline deviation value; (4) Vehicle speed Lower or higher than traffic speed: vehicle speed; (5) turning behavior parameter: vehicle heading angle.
  • These driving behavior data can be obtained directly or indirectly calculated by vehicle control parameters, and they are a direct manifestation of the driving behavior of the vehicle. It should be understood that the vehicle driving behavior described herein may include manual driving or automatic driving under various levels (for example, SAE specifications L0-L5).
  • a second device 1002 for acquiring current vehicle driving scene data may include but is not limited to: lidar, millimeter wave radar, camera, ultrasonic radar, communication device, etc.
  • the current driving scene data mainly includes: (1) other-vehicle information parameters, such as the number of surrounding vehicles and their speed, acceleration, heading angle, and horizontal and vertical position coordinates (including but not limited to these); (2) ego-vehicle information parameters, such as the vehicle's horizontal and vertical position coordinates and heading angle; (3) traffic-light parameters, such as whether the left-turn light is red/yellow/green; (4) lane-line information parameters, such as lane-line position, lane width, and number of lanes; (5) road information parameters, such as road type and intersections.
  • the vehicle driving scene data can be obtained through the above-mentioned lidar, millimeter-wave radar, camera, ultrasonic radar, and other on-board devices; the control center can also notify the vehicle of the driving scene data around it through the communication device between the vehicle and the control center, or another vehicle can notify the vehicle of the surrounding driving scene data through communication between the vehicle and other vehicles, or a combination of the above methods may be used, which is not limited in this application.
  • Some of the above-mentioned parameters are continuous parameters, and some are discrete parameters. In some embodiments, the continuous parameters may be discretized to facilitate subsequent driving scene recognition.
  • the processor 1003 is communicatively connected with the first device and the second device; the processor 1003 determines whether the vehicle is in a state of abnormal driving behavior at least according to the acquired vehicle driving behavior data and vehicle driving scene data. In some embodiments, the processor may determine whether the vehicle is in a state of abnormal driving behavior based on vehicle driving behavior data, current vehicle driving scene data, and algorithm logic 61-65, 71-72, 81-82.
  • Various embodiments of the present application provide a method for identifying abnormal driving behavior, an automatic driving assistance system, a non-transitory storage medium, and a vehicle including the automatic driving assistance system.
  • by introducing the driving scene information into the process of recognizing the vehicle's abnormal driving behavior, the technical solution of the present application solves the problem of misrecognition of abnormal driving behavior caused by the lack of consideration of driving scene information in the prior art;
  • the solution of this application applies multiple algorithm logics corresponding to a specific driving scene, and determines the recognition of the vehicle's abnormal driving behavior through the logical operation of this family of algorithm logics. The technician can also extend the algorithm logic according to actual needs.
  • on the one hand, the technical solution of this application has good scalability; on the other hand, the amount of code in its algorithm logic is small, so the technical solution can conveniently be deployed directly on the local vehicle, which has economic advantages.
  • the technical solution of this application can be widely applied to different levels of automatic driving solutions, systems, and vehicles.
  • the disclosed system, device, and method can be implemented in other ways.
  • the device embodiments described above are merely illustrative; for example, the division of units is only a division by logical function, and there may be other divisions in actual implementation: multiple units or components can be combined or integrated into another system, or some features can be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be realized in the form of hardware or of a software functional unit.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the technical solution of the present application in essence, or the part that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions to make a computer device (which can be a personal computer, a server, or a network device, etc.) execute all or part of the steps of the methods in the various embodiments of the present application.
  • the aforementioned storage media include: USB flash drives, mobile hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, optical disks, and other media that can store program code.
  • the services described in this application can be implemented by hardware, software, firmware, or any combination thereof.
  • these services can be stored in a computer-readable medium or transmitted as one or more instructions or codes on the computer-readable medium.
  • the computer-readable medium includes a computer storage medium and a communication medium, where the communication medium includes any medium that facilitates the transfer of a computer program from one place to another.
  • the storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.


Abstract

This application relates to the field of autonomous driving and discloses a method for identifying abnormal driving behavior, a system, and a vehicle including the system. The method includes: obtaining vehicle driving behavior data, and determining, based on the vehicle driving behavior data, whether the vehicle is in a state of suspected abnormal driving behavior; if the vehicle is in a state of suspected abnormal driving behavior, obtaining current vehicle driving scenario data; and determining, based on the vehicle driving behavior data and the current vehicle driving scenario data, whether the suspected abnormal driving behavior is abnormal driving behavior. In the technical solution of this application, current driving scenario information is introduced into the identification of the vehicle's abnormal driving behavior, which improves the accuracy of abnormal driving behavior identification.

Description

Method for identifying abnormal driving behavior

Technical Field

This application relates to the field of autonomous driving, and specifically to a method for identifying abnormal driving behavior, an autonomous driving system, and an intelligent driving vehicle using the autonomous driving system.
Background

With the rapid development of 5G communication and vehicle-to-everything technology, autonomous driving has become a research hotspot. Core technologies in this field include intelligent environment perception, automatic navigation and positioning, driving behavior decision-making, and intelligent path planning and control. Driving behavior decision-making is an important guarantee of the smooth and reliable operation of a driverless vehicle, and reasonable evaluation of intelligent driving behavior decisions is the basis for making correct ones. Decision-making on driving behavior and the identification of abnormal driving behavior are, in turn, important parts of autonomous driving theory. Abnormal driving behavior typically includes non-smooth behaviors such as speeding, hard braking, hard acceleration, and frequent lane changes. A reliable and efficient method for identifying abnormal driving behavior is of significant practical value for improving the user's driving experience and for improving existing driving behavior decision-making.
In many cases, the abnormal driving behavior identified by the prior art is only suspected abnormal driving behavior, not genuinely abnormal driving behavior. Refer to the scenario 100 shown in FIG. 1, which shows three different situations. In the left figure, there is no other vehicle in the lane of vehicle 101; if the vehicle brakes (including hard braking), this is abnormal braking behavior. In the middle figure, a vehicle 102 is traveling slowly ahead in the lane of vehicle 101; braking (including hard braking) in this situation is normal braking behavior. In the right figure, another vehicle 103 cuts in ahead of vehicle 101 at a low speed; braking (including hard braking) in this situation is also normal braking behavior. It can be seen that among these three situations, only the first is genuinely abnormal driving behavior, while the latter two are normal braking (to avoid a collision with the vehicle ahead); the prior art, however, treats all hard braking as abnormal driving behavior, which makes its identification of abnormal driving inaccurate.
On this basis, a new method that can identify abnormal driving behavior more accurately is needed.
Summary

To solve the problem of low accuracy in identifying abnormal driving behavior in the prior art, embodiments of this application provide a method for identifying abnormal driving behavior, a system, a non-transitory storage medium, and a vehicle using the system. The technical solution of this application combines two kinds of information data, vehicle driving behavior and vehicle driving scenario, and, based at least on these two kinds of data, uses algorithm logic set for the specific scenario to comprehensively determine whether the vehicle is in an abnormal driving state, thereby solving the insufficient accuracy of abnormal driving behavior identification in the prior art.
As one aspect of this application, an embodiment provides a method for identifying abnormal vehicle driving, which may specifically include: obtaining vehicle driving behavior data through an electronic control unit (ECU) or an on-board control system, and determining, based on the vehicle driving behavior data, whether the vehicle is in a state of suspected abnormal driving behavior; obtaining current vehicle driving scenario data through on-board devices such as lidar, millimeter-wave radar, cameras, and ultrasonic radar; and determining, based on the vehicle driving behavior data and the current vehicle driving scenario data, whether the suspected abnormal driving behavior is abnormal driving behavior. The above method combines the vehicle's driving behavior data and the current driving scenario data and uses both kinds of information to determine whether the vehicle is in a state of abnormal driving behavior, thereby overcoming the inaccuracy of the prior art, which judges abnormal driving behavior based only on vehicle driving behavior data, and thus also improving the safety of autonomous driving.
In a possible design, the vehicle driving behavior data may include: vehicle speed, vehicle acceleration, vehicle heading angle, and the vehicle's deviation from the lane line. These driving behavior data can be obtained directly or computed indirectly from vehicle control parameters; they directly reflect the vehicle's driving behavior. Cluster analysis is performed on the vehicle driving behavior data to obtain (pending) abnormal driving behavior; well-known techniques may be used for the cluster analysis, for example: principal component analysis (PCA), kernel principal component analysis (KPCA), locally linear embedding (LLE), and Laplacian eigenmaps (LE).
In a possible design, the vehicle's current driving scenario data includes: ego-vehicle information parameters, other-vehicle information parameters, traffic signal parameters, lane line parameters, and road information parameters. This information characterizes the specific driving scenario the vehicle is in. Some of these data are discrete and some are continuous; the continuous data may be discretized to facilitate subsequent processing and computation. A neural network may be used to classify the discretized current vehicle driving scenario data to determine the current driving scenario; the neural network includes at least one of the following: a convolutional neural network (CNN) or an extreme learning machine (ELM).
In a possible design, the determined current driving scenario includes at least one of the following: deceleration at an intersection, deceleration on a road segment, and lane line crossing. These scenarios occur frequently in actual driving, and in them, judging whether the vehicle is in an abnormal driving state based only on driving behavior data is prone to misjudgment.
In a possible design, after the current driving scenario is determined, algorithm logic corresponding to it is determined based on the current driving scenario; the algorithm logic may be rules determined in advance or may be adjusted according to the actual judgment situation, so as to determine whether the vehicle is in a state of abnormal driving behavior in the current driving scenario. Different current driving scenarios correspond to different algorithm logic.
In a second aspect, an autonomous driving assistance system is provided, including a first apparatus for obtaining vehicle driving behavior data; a second apparatus for obtaining current vehicle driving scenario data; and a processor communicatively connected to the first apparatus and the second apparatus. The processor is configured to determine, based on the obtained vehicle driving behavior data and current vehicle driving scenario data, whether the vehicle is in an abnormal driving state. The first apparatus may include an on-board electronic control unit (ECU); the second apparatus may include: lidar, millimeter-wave radar, ultrasonic radar, and digital cameras.
It can be understood that the system provided in the second aspect corresponds to the method provided in the first aspect; for the implementations of the second aspect and the technical effects achieved, refer to the related descriptions of the implementations of the first aspect.
In a third aspect, an intelligent driving vehicle is provided, which includes the autonomous driving assistance system of the foregoing second aspect.
In a fourth aspect, a computer-readable storage medium is provided, including an instruction set executable by a processor to implement the method of any implementation of the foregoing first aspect.
Embodiments of this application provide a method for identifying abnormal driving behavior, an autonomous driving assistance system, a non-transitory storage medium, and an intelligent driving vehicle including the autonomous driving assistance system. In the technical solution of this application, current driving scenario information is introduced into the identification of the vehicle's abnormal driving behavior, which solves the misidentification problem caused in the prior art by insufficient consideration of driving scenario information. Moreover, this solution sets multiple pieces of algorithm logic matched to specific driving scenarios and determines the identification of the vehicle's abnormal driving behavior through logical operations over the family of algorithms; engineers can also adjust the algorithm logic reasonably according to actual needs, so the solution is highly extensible. In addition, the algorithm logic of this solution has a small code footprint, so it can be conveniently applied directly on the vehicle side, which is economically advantageous. In summary, the technical solution of this application is broadly applicable to autonomous driving solutions, systems, and vehicles of different levels.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of an application scenario for abnormal driving identification;
FIG. 2 is an architecture diagram of abnormal driving behavior identification according to an embodiment of this application;
FIG. 3 is a schematic flowchart of suspected abnormal driving behavior identification according to an embodiment of this application;
FIG. 4 is a schematic diagram of a neural network for driving scenario identification according to an embodiment of this application;
FIG. 5 is a schematic diagram of scenario-based abnormal driving behavior identification according to an embodiment of this application;
FIG. 6-1 is a schematic diagram of algorithm logic in one scenario according to an embodiment of this application;
FIG. 6-2 is a schematic diagram of algorithm logic in one scenario according to an embodiment of this application;
FIG. 6-3 is a schematic diagram of algorithm logic in one scenario according to an embodiment of this application;
FIG. 6-4 is a schematic diagram of algorithm logic in one scenario according to an embodiment of this application;
FIG. 6-5 is a schematic diagram of algorithm logic in one scenario according to an embodiment of this application;
FIG. 7-1 is a schematic diagram of algorithm logic in one scenario according to an embodiment of this application;
FIG. 7-2 is a schematic diagram of algorithm logic in one scenario according to an embodiment of this application;
FIG. 8-1 is a schematic diagram of algorithm logic in one scenario according to an embodiment of this application;
FIG. 8-2 is a schematic diagram of algorithm logic in one scenario according to an embodiment of this application;
FIG. 9 is a schematic diagram of a non-transitory storage medium according to an embodiment of this application;
FIG. 10 is a schematic diagram of an autonomous driving assistance system according to an embodiment of this application.
Detailed Description

An embodiment of this application provides a method for identifying abnormal driving, which introduces the driving scenario (information about the vehicle's surrounding environment) into the identification of abnormal driving behavior so as to identify abnormal driving behavior in different scenarios. It provides a new solution for accurately identifying abnormal driving behavior and supports improving the reliability of autonomous driving and optimizing the user's driving experience.
Refer to FIG. 2, which shows a schematic flow 200 of abnormal driving behavior identification according to an embodiment of this application, including:
201: Obtain vehicle driving behavior data. The vehicle driving behavior data mainly includes quantifiable vehicle control parameters, which may be obtained from the on-board ECU or an on-board control system. The vehicle control parameters include: (1) hard-braking behavior parameter: vehicle acceleration; (2) hard-acceleration behavior parameter: vehicle acceleration; (3) the vehicle's deviation from the lane center line; (4) speed below or above the traffic flow speed: vehicle speed; (5) turning behavior parameter: vehicle heading angle. These driving behavior data can be obtained directly or computed indirectly from vehicle control parameters; they directly reflect the vehicle's driving behavior. It should be understood that the vehicle driving behavior described here may include both manual driving and autonomous driving at various levels (for example SAE levels L0-L5).
202: Analyze the driving behavior data obtained in 201 to identify suspected abnormal driving behavior. In some embodiments, refer to the flow 300 shown in FIG. 3, which includes: at step 301, obtain driving behavior data; at step 302, perform data modeling analysis on the obtained driving behavior data, including extracting statistics of the driving behavior data such as the mean, root mean square, maximum, minimum, peak, and kurtosis factor; at step 303, extract the statistical feature values of the driving behavior data; at step 304, perform cluster analysis on the data obtained in step 303; at step 305, after the data clustering is completed, determine the boundary between the normal data cluster and the suspected abnormal driving behavior data cluster according to the proportions of the data in the normal behavior data cluster and in the suspected abnormal behavior data cluster within the total clustered data; at step 306, identify the suspected abnormal driving behavior based on the result of step 305. The above cluster analysis methods may include (but are not limited to): principal component analysis (PCA), kernel principal component analysis (KPCA), locally linear embedding (LLE), and Laplacian eigenmaps (LE).
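As a concrete illustration of steps 302-305 (statistics extraction, clustering, and drawing the boundary between the normal cluster and the suspected abnormal cluster), the following is a minimal NumPy sketch. It is not the patented implementation: the window size, feature set, PCA-distance score, and 90th-percentile boundary are all illustrative assumptions.

```python
import numpy as np

def window_features(acc, win=20):
    """Per-window statistics of an acceleration trace (mean, RMS, max, min,
    peak-to-peak, kurtosis factor), as in steps 302-303."""
    feats = []
    for i in range(0, len(acc) - win + 1, win):
        w = acc[i:i + win]
        rms = np.sqrt(np.mean(w ** 2))
        kurt = np.mean((w - w.mean()) ** 4) / (w.std() ** 4 + 1e-12)
        feats.append([w.mean(), rms, w.max(), w.min(), w.max() - w.min(), kurt])
    return np.asarray(feats)

def suspicious_windows(feats, n_components=2, quantile=90.0):
    """Project features onto the leading principal axes and flag the windows
    farthest from the cluster center as 'suspected abnormal' (step 305)."""
    x = feats - feats.mean(axis=0)
    _, _, vt = np.linalg.svd(x, full_matrices=False)   # principal axes
    scores = x @ vt[:n_components].T                   # PC-space coordinates
    dist = np.linalg.norm(scores, axis=1)
    return dist > np.percentile(dist, quantile)        # outside the boundary

# mostly smooth driving with one hard-deceleration burst
rng = np.random.default_rng(0)
acc = rng.normal(0.0, 0.1, 400)
acc[200:220] = -4.0                                    # hard braking
flags = suspicious_windows(window_features(acc))
print(flags.sum(), "suspected window(s) out of", flags.size)
```

Here the hard-deceleration window is the point farthest from the dominant cluster of normal windows, so it falls outside the learned boundary; in the flow above, such windows would be passed on as suspected abnormal braking (step 306), to be confirmed or rejected against the driving scenario.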
203: Obtain current vehicle driving scenario data. The vehicle's driving scenario is determined mainly from information about the vehicle's surroundings. Five kinds of information are mainly involved: (1) other-vehicle information parameters, for example (without limitation) the number of surrounding vehicles and their speeds, accelerations, heading angles, and lateral and longitudinal position coordinates; (2) ego-vehicle information parameters, for example the vehicle's lateral and longitudinal position coordinates and heading angle; (3) traffic light parameters, for example a red/yellow/green left-turn light; (4) lane line information parameters, for example lane line position, lane width, and number of lanes; (5) road information parameters, for example road type and whether there is an intersection. It should be understood that the current vehicle driving scenario data may be obtained through on-board devices such as lidar, millimeter-wave radar, cameras, and ultrasonic radar; the surrounding driving scenario data may also be reported to the vehicle by a control center through vehicle-to-center communication, or by other vehicles through inter-vehicle communication, or by a combination of the above methods; this application does not limit this. Some of the above parameters are continuous and some are discrete; in some embodiments, the continuous parameters may be discretized to facilitate the subsequent identification of the vehicle's current driving scenario.
204: After the current vehicle driving scenario data is obtained, vehicle driving scenario identification is performed based on it. Through driving scenario identification, it can be determined which specific scenarios the vehicle is currently in, for example (without limitation): Is the vehicle at an intersection? Is there a vehicle ahead? Is it in a left-turn waiting area? In some embodiments, machine learning is used to identify the current driving scenario from the current vehicle driving scenario data; in some embodiments, a neural network is used to identify the driving scenario; in some embodiments, the neural network may be an extreme learning machine (ELM) model. FIG. 4 shows a schematic 400 of using an extreme learning machine for current driving scenario identification. The extreme learning machine includes an input layer 403, a hidden layer 404, and an output layer 405. After the current vehicle driving scenario data is input at 401 into the input layer 403, it passes through the (single) hidden layer 404 to the output layer 405, which outputs the classification result (scenario identification result 402). It should be understood that the extreme learning machine must be trained before use: it is trained with the vehicle's data in different scenarios so that it can identify the different scenarios, after which it can be used. In some embodiments, the extreme learning machine model is trained on discretized scenario parameters, whose value combinations form scenarios, for example the following intersection scenarios: (1) a vehicle is in the range ahead of a vehicle passing through the intersection; (2) a U-turn scenario; (3) the vehicle is in the left-turn waiting area and the traffic light is not green; (4) the vehicle's speed entering the intersection is too high; (5) an obstacle is within the vehicle's safety range at the intersection. It should also be understood that although the embodiment gives an extreme learning machine as an example, any suitable machine-learning classifier (for example, a convolutional neural network) can be used to implement driving scenario identification.
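The single-hidden-layer extreme learning machine of FIG. 4 (random input weights, output weights solved in closed form by least squares) can be sketched as follows. The three-bit discretized scenario vectors and the label set are toy assumptions for illustration, not the parameters of the embodiment.

```python
import numpy as np

class ELM:
    """Minimal single-hidden-layer extreme learning machine: random, fixed
    input weights; output weights solved in one least-squares step."""
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(size=(n_in, n_hidden))   # random input weights
        self.b = rng.normal(size=n_hidden)           # random hidden biases
        self.beta = np.zeros((n_hidden, n_out))      # learned output weights

    def _hidden(self, x):
        return np.tanh(x @ self.w + self.b)

    def fit(self, x, y_onehot):
        # beta = pinv(H) @ T : the standard ELM training step
        self.beta = np.linalg.pinv(self._hidden(x)) @ y_onehot
        return self

    def predict(self, x):
        return np.argmax(self._hidden(x) @ self.beta, axis=1)

# toy discretized scenario vectors: [at_intersection, car_ahead, left_turn_zone]
x = np.array([[1, 1, 0], [1, 0, 0], [0, 1, 0],
              [1, 0, 1], [0, 0, 0], [1, 1, 1]], dtype=float)
y = np.array([0, 1, 2, 3, 2, 0])      # hypothetical scenario class labels
elm = ELM(n_in=3, n_hidden=32, n_out=4).fit(x, np.eye(4)[y])
print(elm.predict(x))
```

Because only the output weights are learned (no backpropagation), training is a single pseudo-inverse, which is the main reason ELMs are attractive for lightweight on-vehicle classifiers.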
205: Determine abnormal driving behavior based at least on the identified suspected abnormal driving behavior and the current driving scenario. In other words, suspected abnormal driving behavior is not necessarily genuinely abnormal driving behavior; the vehicle's current driving scenario information must be considered. With the current driving scenario information added, the algorithm logic corresponding to the driving scenario determines which suspected abnormal driving behaviors are genuinely abnormal driving behaviors and which are normal driving behaviors.
Refer to FIG. 5, which shows a schematic flow 500 of scenario-based abnormal driving behavior identification according to an embodiment of this application, including:
501: start;
502: obtain vehicle driving behavior data;
503: obtain current vehicle driving scenario data;
504: identify suspected abnormal driving behavior based on the vehicle driving behavior data;
505: identify the current driving scenario based on the current driving scenario data;
506: determine, according to the algorithm and judgment logic for the current scenario, whether the suspected abnormal driving behavior in the current scenario is abnormal driving behavior;
507: determine, based on step 506, the abnormal driving behavior among the suspected abnormal driving behavior;
508: determine, based on step 506, the normal driving behavior among the suspected abnormal driving behavior;
509: end.
FIG. 6-1 to FIG. 6-5 show the algorithms and judgment logic for identifying abnormal driving behavior (deceleration/braking) in the intersection scenario according to embodiments. Intersection scenario parameters include other-vehicle information (number of other vehicles, their speeds, accelerations, positions relative to the vehicle, absolute positions, and heading angles), road information (zebra crossing, whether there is a left-turn waiting area, whether there is a U-turn, stop line), ego-vehicle information (mainly vehicle position and whether the vehicle is going straight, turning left, or turning right), and traffic light information (color and remaining time of the straight-ahead light, color and remaining time of the left-turn light, etc.). In some embodiments, the intersection scenarios include the following five:
(1) there is an obstacle in the range ahead of a vehicle passing through the intersection;
(2) the vehicle is making a U-turn;
(3) the vehicle's speed entering the intersection is too high;
(4) the vehicle is in the left-turn waiting area and the traffic light is not green;
(5) there is an obstacle within the vehicle's safety range at the intersection.
If the vehicle decelerates/brakes at the intersection and is in any of the above five scenarios, the braking is not abnormal driving behavior but normal driving behavior.
In some embodiments, acceleration is used as the behavior parameter for deceleration behavior: positive acceleration means the vehicle is accelerating, and acceleration turning from positive (or 0) to negative means the vehicle is braking/decelerating. It should be noted that in some embodiments, Kalman filtering may be used to filter out data from small speed fluctuations caused by road bumps or slight throttle changes (speed fluctuations occurring in such cases do not mean the vehicle has genuinely accelerated or decelerated). It should be understood that besides the above Kalman filtering, other suitable filtering or data analysis methods may also be used to filter out such data that does not represent genuine deceleration.
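A minimal sketch of the Kalman-filtering idea described above: estimating acceleration from noisy speed readings so that road-bump jitter does not register as braking while a genuine deceleration does. The state model and the noise levels `q` and `r`, as well as the signal shape, are illustrative assumptions rather than values from the embodiment.

```python
import numpy as np

def kalman_accel(z, dt=0.1, q=2.0, r=0.04):
    """1-D Kalman filter over noisy speed readings with state
    [speed, acceleration]; returns the filtered acceleration estimate."""
    F = np.array([[1.0, dt], [0.0, 1.0]])                      # state transition
    H = np.array([[1.0, 0.0]])                                 # we observe speed
    Q = q * np.array([[dt**4 / 4, dt**3 / 2], [dt**3 / 2, dt**2]])
    R = np.array([[r]])
    x = np.array([[z[0]], [0.0]])
    P = np.eye(2)
    out = []
    for zk in z:
        x = F @ x                                              # predict
        P = F @ P @ F.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)           # gain
        x = x + K @ (np.array([[zk]]) - H @ x)                 # update
        P = (np.eye(2) - K @ H) @ P
        out.append(x[1, 0])
    return np.array(out)

rng = np.random.default_rng(1)
t = np.arange(0.0, 10.0, 0.1)
v = 15.0 + rng.normal(0.0, 0.2, t.size)    # jittery cruising, no real braking
v[40:] = 15.0 - 3.0 * (t[40:] - t[40])     # genuine braking from t = 4 s
a = kalman_accel(v)
print("cruise accel ~", round(a[20], 2), "| braking accel ~", round(a[-1], 2))
```

During cruising the filtered acceleration stays near zero despite the jitter, while during the genuine deceleration it settles toward the true value, which is the filtering behavior the paragraph above relies on.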
In some embodiments, after the acceleration parameter data is obtained, statistics of the acceleration data such as the mean, root mean square, maximum, minimum, peak, and kurtosis factor are extracted, features are extracted from these data, and cluster analysis is then performed on the feature-extracted behavior parameter data. After the clustering is completed, the proportions of the data in the normal behavior data cluster and in the suspected abnormal behavior data cluster within the total clustered data are determined, and the boundary between the normal data cluster and the suspected abnormal braking behavior data cluster is determined, thereby identifying suspected braking driving behavior.
In some embodiments, a trained extreme learning machine can be used to identify, from the vehicle driving scenario data, which specific intersection scenario the vehicle is currently in; the vehicle driving scenario data may include processed, discretized data.
The algorithms and judgment logic for these five intersection scenarios are described in detail below with reference to FIG. 6-1 to FIG. 6-5.
Refer to FIG. 6-1, which shows the algorithm logic 61 for the vehicle in scenario (1) above according to an embodiment, including:
611: Obtain data at the moment deceleration begins. When the vehicle brakes, suspected abnormal driving behavior can be identified from the vehicle driving behavior data through, for example, the above principal component analysis (PCA), kernel principal component analysis (KPCA), locally linear embedding (LLE), or Laplacian eigenmaps (LE), and the moment the vehicle began decelerating is obtained;
612: Obtain the obstacles around the vehicle and their positions at the moment deceleration begins. This information may be obtained through on-board devices such as lidar, millimeter-wave radar, ultrasonic radar, and cameras; it may also be reported to the vehicle by a control center through vehicle-to-center communication, or by other vehicles through inter-vehicle communication, or by a combination of the above methods; this application does not limit this. Obstacles may include (without limitation): vehicles, pedestrians, animals, roadblocks, or any other object/person/animal in the lane that may obstruct driving.
613: Determine whether each obstacle is within 8-12 meters longitudinally ahead of the vehicle and 3-4 meters laterally to its left or right. This range is the area the vehicle will soon pass through along its current route and can be called the collision domain. In other words, if there is an obstacle in the collision domain and the vehicle does not brake, a collision is very likely to occur soon. Obstacle detection within the collision domain may be implemented with devices such as ultrasonic radar and millimeter-wave radar.
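The collision-domain test of branch 613 amounts to a rectangle check in the ego frame. A small sketch follows; the concrete bounds (10 m ahead, ±3.5 m laterally) are picked inside the 8-12 m and 3-4 m ranges stated above and are otherwise assumptions.

```python
import math

def in_collision_domain(obstacle_xy, ego_xy, ego_heading,
                        ahead=10.0, half_width=3.5):
    """Transform an obstacle into the ego frame (x forward, y left) and test
    whether it lies inside the rectangular collision domain ahead of the
    vehicle."""
    dx = obstacle_xy[0] - ego_xy[0]
    dy = obstacle_xy[1] - ego_xy[1]
    c, s = math.cos(-ego_heading), math.sin(-ego_heading)   # world -> ego
    fx = c * dx - s * dy      # longitudinal offset in the ego frame
    fy = s * dx + c * dy      # lateral offset in the ego frame
    return 0.0 < fx <= ahead and abs(fy) <= half_width

# ego heading due north (+y): a pedestrian 6 m straight ahead is inside
print(in_collision_domain((0.0, 6.0), (0.0, 0.0), math.pi / 2))   # True
# an object 8 m to the side is outside the domain
print(in_collision_domain((8.0, 0.0), (0.0, 0.0), math.pi / 2))   # False
```

If any detected obstacle satisfies this test at the moment deceleration begins, branch 614 applies and the braking is judged normal.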
614: If the result of 613 is yes, there is an obstacle in the collision domain ahead of the vehicle, and if the vehicle does not brake a collision is very likely to occur immediately; in this case the vehicle's deceleration/braking can be judged not to be abnormal driving behavior, i.e., it is normal driving behavior;
615: If the result of 613 is no, there is no obstacle in the collision domain ahead of the vehicle; the deceleration/braking occurring in this case may be abnormal driving behavior but may also be normal. The vehicle's driving state is then pending, and all five scenarios above must be checked before the final result can be obtained.
Continuing, refer to FIG. 6-2, which shows the algorithm logic 62 for the vehicle in scenario (2) above according to an embodiment, including:
621: Obtain data at the moment deceleration begins. When the vehicle brakes, suspected abnormal driving behavior can be identified from the vehicle driving behavior data through, for example, the above principal component analysis (PCA), kernel principal component analysis (KPCA), locally linear embedding (LLE), or Laplacian eigenmaps (LE), and the moment the vehicle began decelerating/braking is obtained;
622: Determine whether the vehicle made a U-turn during the deceleration period, which can be obtained by acquiring and analyzing the driving data from the moment deceleration began;
623: If the result of 622 is yes, the vehicle made a U-turn since the moment deceleration began. Since a U-turn is generally necessarily accompanied by deceleration, in this case the vehicle's deceleration/braking can be judged not to be abnormal driving, i.e., it is normal driving behavior;
624: If the result of 622 is no, the vehicle did not make a U-turn since the moment deceleration began; the deceleration/braking occurring in this case may be abnormal driving behavior but may also be normal. The vehicle's driving state is then pending, and all five scenarios above must be checked before the final result can be obtained.
Continuing, refer to FIG. 6-3, which shows the algorithm logic 63 for the vehicle in scenario (3) above according to an embodiment, including:
631: Obtain data at the moment deceleration begins. When the vehicle brakes, suspected abnormal driving behavior can be identified from the vehicle driving behavior data through, for example, the above PCA, KPCA, LLE, or LE, and the moment the vehicle began decelerating/braking is obtained;
632: Determine whether the vehicle's speed at the moment deceleration began was greater than 20 km/h. It should be understood that 20 km/h here is a value given based on some embodiments; those skilled in the art may adjust this value appropriately according to actual needs without departing from the conception of this application;
633: If the result of 632 is yes, the vehicle's speed entering the intersection was high; as a matter of common sense, a vehicle entering an intersection too fast should generally brake, so in this case the vehicle's braking can be judged not to be abnormal driving, i.e., it is normal driving behavior;
634: If the result of 632 is no, the vehicle's speed entering the intersection was low; as a matter of common sense, in this situation there is no need for the vehicle to brake when entering the intersection, so the deceleration/braking occurring in this case may be abnormal driving behavior but may also be normal. The vehicle's driving state is then pending, and all five scenarios above must be checked before the final result can be obtained.
Continuing, refer to FIG. 6-4, which shows the algorithm logic 64 for the vehicle in scenario (4) above according to an embodiment, including:
641: Obtain data at the moment deceleration begins. When the vehicle brakes, suspected abnormal driving behavior can be identified from the vehicle driving behavior data through, for example, the above PCA, KPCA, LLE, or LE, and the moment the vehicle began decelerating/braking is obtained;
642: Determine whether the vehicle was turning left at the moment deceleration occurred;
643: If the result of 642 is yes, continue at this step to obtain all traffic light information at the moment deceleration began;
644: Determine whether a left-turn light exists;
645: If the result of 644 is yes, continue to determine whether the left-turn light is not green;
646: If the result of 645 is yes, the vehicle's deceleration at the intersection was accompanied by a non-green left-turn signal; as a matter of common sense, the vehicle should brake, so in this case the vehicle's braking can be judged not to be abnormal driving, i.e., it is normal driving behavior;
647: If any of the results of 642, 644, and 645 is no, the vehicle at the moment of deceleration at the intersection is in one of the following three situations: (1) it was not turning left; (2) it was turning left but there is no left-turn light; (3) it was turning left, a left-turn light exists, and the left-turn light is green. None of these generally necessarily involves deceleration, so the deceleration/braking occurring in this case may be abnormal driving behavior but may also be normal. The vehicle's driving state is then pending, and all five scenarios above must be checked before the final result can be obtained.
Continuing, refer to FIG. 6-5, which shows the algorithm logic 65 for the vehicle in scenario (5) above according to an embodiment, including:
651: Obtain data at the moment deceleration begins. When the vehicle brakes, suspected abnormal driving behavior can be identified from the vehicle driving behavior data through, for example, the above PCA, KPCA, LLE, or LE, and the moment the vehicle began decelerating/braking is obtained;
652: Obtain the obstacles around the vehicle and their positions at the moment deceleration begins. This information may be obtained through on-board devices such as lidar, millimeter-wave radar, ultrasonic radar, and cameras; it may also be reported to the vehicle by a control center through vehicle-to-center communication, or by other vehicles through inter-vehicle communication, or by a combination of the above methods; this application does not limit this. Obstacles may include (without limitation): vehicles, pedestrians, animals, roadblocks, or any other object/person/animal in the lane that may obstruct driving.
653: Obtain the vehicle's trajectory during the deceleration period, which can be obtained by acquiring and analyzing the vehicle's driving data from the moment deceleration began;
654: Obtain in turn each obstacle's trajectory during the vehicle's deceleration period, for example (without limitation): if the first obstacle to be analyzed is a vehicle, obtain that vehicle's trajectory during the deceleration period;
655: Determine whether the obstacle's trajectory and the vehicle's trajectory have a crossing point. Still taking the obstacle being a vehicle as an example, determine from the trajectory obtained in 653 (trajectory 1) and the trajectory obtained in 654 (trajectory 2) whether trajectory 1 and trajectory 2 have a crossing point (including an extended crossing point); the analysis and judgment of the two trajectories may be performed in an ordinary earth coordinate system. If the result of 655 is no, i.e., there is no crossing point between the vehicle's and the obstacle's trajectories, the vehicle and that obstacle cannot collide, and the logic returns to the previous step 654 to obtain the next obstacle (for example, a bicycle being ridden) for analysis. In some embodiments, obstacles can be detected and the time at which an obstacle passed the crossing point can be determined through, for example, lidar, millimeter-wave radar, ultrasonic radar, or a combination of them; in other embodiments, the time at which the vehicle passed the crossing point is determined by selecting the vehicle's driving data between two braking events. It should be noted that if the obstacle is stationary, for example a roadblock, its trajectory is a stationary point (with the earth coordinate system as the reference frame), and this situation is equivalent to the scenario to which judgment logic 61 applies (an obstacle in the range ahead of a vehicle passing through the intersection);
656: If the result of 655 is yes, i.e., a crossing point exists between the vehicle's trajectory and the obstacle's trajectory, continue to compute the time the vehicle would need to reach the crossing point if it did not brake (from the moment deceleration began). This time can be denoted T and can be computed from the following parameters: the trajectory length S (the length of the vehicle's trajectory between the vehicle's position at the moment deceleration began and the crossing point), the vehicle's instantaneous speed V at the moment deceleration began, and the vehicle's instantaneous acceleration a0 at the adjacent moment before deceleration began. For example: if the vehicle begins to decelerate at moment Tc, and Tc is for instance 12:05:10, then the adjacent preceding moment Tc-1 may be 12:05:09, and the vehicle's acceleration at that moment is taken as a0. Because Tc is the moment deceleration begins, the acceleration a0 at moment Tc-1 must be non-negative. When a0 is positive, T can be computed from the kinematic formula

S = V·T + (1/2)·a0·T²

and T1, the time at which the vehicle would reach the crossing point if it did not brake, is obtained by adding T to the moment deceleration began. For example: if deceleration began at 12:05:10 and the vehicle, without braking from that moment, would need 10 seconds to reach the crossing point, then T1 is 12:05:20. It should be understood that in some embodiments time is computed with a precision of seconds, but those skilled in the art may adjust the time precision according to the actual situation without departing from the spirit of this application. It should also be understood that the above computation of T uses the formula for uniformly accelerated motion; T may also be computed with the uniform-velocity formula (i.e., taking the acceleration at the adjacent moment before deceleration as 0), or estimated/predicted by statistical or machine-learning methods; engineers may reasonably choose how to obtain T according to the actual situation without departing from the spirit of this application;
657: After 656, compute the time at which the obstacle reaches the crossing point, which can be denoted T2;
658: Determine whether the difference between the times at which the vehicle and the obstacle reach the crossing point, i.e., the difference between T1 and T2, is less than 3 seconds. Here 3 seconds is chosen as the safety time difference: if the vehicle and the obstacle pass the crossing point more than 3 seconds apart, the situation is considered safe; otherwise it is considered unsafe, i.e., the vehicle and the obstacle may collide. It should be understood that 3 seconds here is a value determined based on some embodiments; those skilled in the art may adjust this value appropriately according to actual needs without departing from the conception of this application;
659: If the result of 658 is yes, the vehicle and the obstacle reach the crossing point within the safety time difference, and a collision is fairly likely if the vehicle does not brake; as a matter of common sense, the vehicle should brake to avoid the collision, so in this case the vehicle's braking can be judged not to be abnormal driving, i.e., it is normal driving behavior;
6510: If the result of 658 is no, the vehicle and the obstacle reach the crossing point outside the safety time difference; in this situation a collision between the vehicle and the obstacle is very unlikely and, as a matter of common sense, the vehicle need not brake. The braking occurring in this case may therefore be abnormal braking but may also be normal braking; the vehicle's driving state is then pending, and all five scenarios above must be checked before the final result can be obtained.
The five specific intersection scenarios and their corresponding algorithm logics 61-65 are described above. In the embodiments, each of these algorithm logics has a branch judged as "normal braking" and a branch judged as "possibly normal braking". In any one of the logics, if the final judgment is "normal braking", the vehicle's braking can be directly judged not to be abnormal driving behavior, i.e., it is normal driving behavior. For the branches judged as "possibly normal braking", the union of algorithm logics 61-65 must be executed before "possibly normal braking" can be ruled out and the braking confirmed as "abnormal braking". For example, in the intersection scenario, if suspected braking occurs, the vehicle system runs all of the above algorithm logics 61-65; if the braking is tentatively judged "possibly normal braking" in branch 615 of logic 61, the system continues to evaluate logics 62-65; if the branch results obtained in logics 62-65 are all "possibly normal braking", it can be concluded that, in the intersection scenario, the vehicle's deceleration/braking does not belong to any of the above five scenarios, and the braking can therefore be confirmed as abnormal driving behavior.
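The union over logics 61-65 described above reduces to: the braking is confirmed abnormal only if no scenario check explains it. A minimal sketch with hypothetical branch results:

```python
def classify_braking(scenario_checks):
    """Combine per-scenario branch results (logics 61-65): if any scenario
    explains the braking it is normal; only when every check comes back
    'possibly normal' (False here) is the braking confirmed abnormal."""
    for check in scenario_checks:
        if check():              # True => this scenario explains the braking
            return "normal"
    return "abnormal"

# hypothetical branch results for one braking event at an intersection
checks = [
    lambda: False,  # 61: no obstacle in the collision domain
    lambda: False,  # 62: no U-turn during the deceleration
    lambda: True,   # 63: entry speed above 20 km/h
    lambda: False,  # 64: not waiting on a non-green left-turn light
    lambda: False,  # 65: no trajectory crossing within the safety margin
]
print(classify_braking(checks))   # normal: scenario (3) explains the braking
```

Because any single "normal braking" branch short-circuits the decision, the per-scenario checks can be evaluated in any order, which keeps the on-vehicle logic small.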
It should be understood that although the above embodiments describe five specific intersection scenarios, engineers may reasonably adjust the number of intersection scenarios in specific situations without departing from the spirit of this application.
FIG. 7-1 and FIG. 7-2 show the algorithm logic for judging abnormal driving behavior (braking) in the road segment scenario according to embodiments.
In some embodiments, the following two scenarios are considered in the road segment scenario:
(1) there is an obstacle in the range ahead of the vehicle;
(2) there is an obstacle within the vehicle's safety range.
Refer to FIG. 7-1, which shows the algorithm logic 71 for the vehicle in scenario (1) above according to an embodiment, including:
711: Obtain data at the moment deceleration begins. When the vehicle brakes, suspected abnormal driving behavior can be preliminarily identified from the vehicle driving behavior data through, for example, the above PCA, KPCA, LLE, or LE, and the moment the vehicle began decelerating is obtained;
712: Obtain the obstacles around the vehicle and their positions at the moment deceleration begins. This information may be obtained through on-board devices such as lidar, millimeter-wave radar, cameras, and ultrasonic radar; it may also be reported to the vehicle by a control center through vehicle-to-center communication, or by other vehicles through inter-vehicle communication, or by a combination of the above methods; this application does not limit this. Obstacles may include (without limitation): vehicles, pedestrians, roadblocks, or any other object in the lane that may obstruct driving;
713: Determine whether each obstacle is within 12-18 meters longitudinally ahead of the vehicle and 2-5 meters laterally to its left or right. This range is the area the vehicle will soon pass through along its current route and can be called the collision domain; in other words, if there is an obstacle in the collision domain and the vehicle does not brake, a collision is very likely to occur soon. Obstacle detection within the collision domain may be implemented with devices such as ultrasonic radar and millimeter-wave radar. Further, compared with branch 613 of algorithm logic 61, the collision domain in branch 713 is wider and deeper than that in branch 613, because vehicles generally travel faster on road segments than at intersections, so a wider and deeper collision domain is needed to suit the road segment scenario. It should also be understood that the above collision domain ranges are based on some embodiments; engineers may reasonably change the width and depth of the collision domain according to actual needs without departing from the idea of this application;
714: If the result of 713 is yes, there is an obstacle in the collision domain ahead of the vehicle, and if the vehicle does not brake a collision is very likely to occur immediately; in this case the vehicle's braking can be judged not to be abnormal driving behavior, i.e., it is normal driving behavior;
715: If the result of 713 is no, there is no obstacle in the collision domain ahead of the vehicle; the braking occurring in this case may be abnormal driving behavior but may also be normal. The vehicle's driving state is then pending, and both scenarios above must be checked before the final result can be obtained.
Refer to FIG. 7-2, which shows the algorithm logic 72 for the vehicle in scenario (2) above according to an embodiment, including:
721: Obtain data at the moment deceleration begins. When the vehicle brakes, suspected abnormal driving behavior can be preliminarily identified from the vehicle driving behavior data through, for example, the above PCA, KPCA, LLE, or LE, and the moment the vehicle began decelerating is obtained;
722: Obtain the obstacles around the vehicle and their positions at the moment deceleration begins. This information may be obtained through on-board devices such as lidar, millimeter-wave radar, ultrasonic radar, and cameras; it may also be reported to the vehicle by a control center through vehicle-to-center communication, or by other vehicles through inter-vehicle communication, or by a combination of the above methods; this application does not limit this. Obstacles may include (without limitation): vehicles, pedestrians, animals, roadblocks, or any other object/person/animal in the lane that may obstruct driving;
723: Obtain the vehicle's trajectory during the deceleration period, which can be obtained by acquiring and analyzing the driving data from the moment deceleration began;
724: Obtain in turn each obstacle's trajectory during the vehicle's deceleration period, for example (without limitation): if the first obstacle to be analyzed is a vehicle, obtain that vehicle's trajectory during the deceleration period;
725: Determine whether the obstacle's trajectory and the vehicle's trajectory have a crossing point. Still taking the obstacle being a vehicle as an example, determine from the trajectory obtained in 723 (trajectory 1) and the trajectory obtained in 724 (trajectory 2) whether trajectory 1 and trajectory 2 have a crossing point (including an extended crossing point); the analysis and judgment of the two trajectories may be performed in an ordinary earth coordinate system. If the result of 725 is no, i.e., there is no crossing point between the vehicle's and the obstacle's trajectories, the vehicle and that obstacle cannot collide, and the logic returns to the previous step 724 to obtain the next obstacle (for example, a bicycle being ridden) for analysis. In some embodiments, obstacles can be detected and the time at which an obstacle passed the crossing point can be determined through, for example, lidar, millimeter-wave radar, ultrasonic radar, or a combination of them; in other embodiments, the time at which the vehicle passed the crossing point is determined by selecting the vehicle's driving data between two braking events. It should be noted that if the obstacle is stationary, for example a roadblock, its trajectory is a stationary point (with the earth coordinate system as the reference frame), and this situation is equivalent to the scenario to which judgment logic 71 applies (an obstacle in the range ahead of the vehicle on a road segment);
726: If the result of 725 is yes, i.e., a crossing point exists between the vehicle's trajectory and the obstacle's trajectory, continue to compute the time the vehicle would need to reach the crossing point if it did not brake (from the moment deceleration began). This time can be denoted T and can be computed from the following parameters: the trajectory length S (the length of the vehicle's trajectory between the vehicle's position at the moment deceleration began and the crossing point), the vehicle's instantaneous speed V at the moment deceleration began, and the vehicle's instantaneous acceleration a0 at the adjacent moment before deceleration began. For example: if the vehicle begins to decelerate at moment Tc, and Tc is for instance 12:05:10, then the adjacent preceding moment Tc-1 may be 12:05:09, and the vehicle's acceleration at that moment is taken as a0. Because Tc is the moment deceleration begins, the acceleration a0 at moment Tc-1 must be non-negative. When a0 is positive, T can be computed from the kinematic formula

S = V·T + (1/2)·a0·T²

and T1, the time at which the vehicle would reach the crossing point if it did not brake, is obtained by adding T to the moment deceleration began. For example: if deceleration began at 12:05:10 and the vehicle, without braking from that moment, would need 10 seconds to reach the crossing point, then T1 is 12:05:20. It should be understood that in some embodiments time is computed with a precision of seconds, but those skilled in the art may adjust the time precision according to the actual situation without departing from the spirit of this application. It should also be understood that the above computation of T uses the formula for uniformly accelerated motion; T may also be computed with the uniform-velocity formula (i.e., taking the acceleration at the adjacent moment before deceleration as 0), or estimated/predicted by statistical or machine-learning methods; engineers may reasonably choose how to obtain T according to the actual situation without departing from the spirit of this application;
727: After 726, continue to compute the time at which the obstacle reaches the crossing point, which can be denoted T2;
728: After 727, determine whether the difference between the times at which the vehicle and the obstacle reach the crossing point, i.e., the difference between T1 and T2, is less than 2 seconds. Here 2 seconds is chosen as the safety time difference: if the vehicle and the obstacle pass the crossing point more than 2 seconds apart, the situation is considered safe; otherwise it is considered unsafe, i.e., the vehicle and the obstacle may collide. It should be understood that 2 seconds here is a value determined based on some embodiments; those skilled in the art may adjust this value appropriately according to actual needs without departing from the conception of this application. Further, compared with branch 658 of algorithm logic 65, the time difference in branch 728 (2 s) is smaller than that in branch 658 (3 s), because vehicles generally travel faster on road segments than at intersections, so a shorter safety time difference is needed to suit the road segment scenario.
729: If the result of 728 is yes, the vehicle and the obstacle reach the crossing point within the safety time difference, i.e., a collision is fairly likely if the vehicle does not brake; as a matter of common sense, the vehicle should brake to avoid the collision, so in this case the vehicle's braking can be judged not to be abnormal driving, i.e., it is normal driving behavior;
7210: If the result of 728 is no, the vehicle and the obstacle reach the crossing point outside the safety time difference; in this situation a collision is very unlikely and, as a matter of common sense, the vehicle need not brake. The braking occurring in this case may therefore be abnormal driving behavior but may also be normal. The vehicle's driving state is then pending, and both scenarios above must be checked before the final result can be obtained.
The two specific road segment scenarios and their corresponding algorithm logics 71-72 are described above. In the embodiments, each of these algorithm logics has a branch judged as "normal braking" and a branch judged as "possibly normal braking". In either logic, if the final judgment is "normal braking", the vehicle's braking can be directly judged not to be abnormal driving behavior, i.e., it is normal driving behavior. For the branches judged as "possibly normal braking", the union of algorithm logics 71-72 must be executed before "possibly normal braking" can be ruled out and the braking confirmed as "abnormal braking". For example, in the road segment scenario, if suspected braking occurs, the vehicle system runs both of the above algorithm logics 71-72; if the braking is tentatively judged "possibly normal braking" in branch 715 of logic 71, the system continues to evaluate logic 72; if the branch result obtained in logic 72 is also "possibly normal braking", it can be concluded that, in the road segment scenario, the vehicle's deceleration/braking does not belong to either of the two scenarios above, and the braking can therefore be confirmed as abnormal driving behavior.
It should be understood that although the above embodiments describe two specific road segment scenarios, engineers may reasonably adjust the number of road scenarios in specific situations without departing from the spirit of this application.
FIG. 8-1 and FIG. 8-2 show the algorithm logic for judging abnormal driving behavior (lane line crossing) in the road scenario according to embodiments.
In some embodiments, the following two scenarios are considered in the road segment scenario:
(1) the vehicle passed through an intersection;
(2) the vehicle changed lanes.
In some embodiments, the vehicle's deviation from the lane center line is used as the behavior parameter for lane-line-crossing behavior; the value of this parameter is determined from the distances from the center point of the vehicle's rear axle to the left and right lane lines.
In some embodiments, after the vehicle's deviation from the lane center line is obtained, statistics of the deviation data such as the mean, root mean square, maximum, minimum, peak, and kurtosis factor are extracted, features are then extracted from these data, and cluster analysis is performed on the feature-extracted behavior parameter data. After the clustering is completed, the proportions of the data in the normal behavior data cluster and in the suspected abnormal behavior data cluster within the total clustered data are determined empirically, and the boundary between the normal data cluster and the suspected abnormal lane-line-crossing data cluster is determined, thereby identifying suspected lane-line-crossing driving behavior.
In some embodiments, a trained extreme learning machine can be used to identify, from the vehicle driving scenario data, which specific road scenario the vehicle is currently in.
Refer to FIG. 8-1, which shows the algorithm logic 81 for the vehicle in scenario (1) above according to an embodiment, including:
811: Obtain data at the moment lane line crossing begins. When the vehicle crosses a lane line, suspected abnormal driving behavior can be preliminarily identified from the vehicle driving behavior data through, for example, the above PCA, KPCA, LLE, or LE, and the moment the vehicle began crossing the lane line is obtained;
812: Determine whether the smaller of the distances from the vehicle to the left lane line and to the right lane line at the current moment is less than 0.9 meters; if the result is no, return and obtain the data at the next moment for judgment;
813: If the result of 812 is yes: since the width of an ordinary passenger vehicle is within 1.8 meters, a distance of less than 0.9 meters from the vehicle center to the left or right lane line indicates a high probability that the vehicle has crossed onto the lane line (the distance from the rear-axle center point to the lane line is less than half the vehicle's width). It should be understood that 0.9 meters here is a value chosen in some embodiments; engineers may reasonably adjust this value for different vehicle types without departing from the spirit of this application. At 813, continue to determine whether the vehicle is currently within 20 meters before or after an intersection;
814: If the result of 813 is yes, the logic of 812 and 813 indicates that the vehicle's lane line crossing accompanied the vehicle passing through an intersection (branch 812 having determined that the vehicle currently deviates substantially from the lane center line). Generally, a lane change (lane line crossing) is a high-probability behavior when a vehicle passes through an intersection: if the vehicle turns left or right at the intersection, both its lane and its direction change; if it goes straight, its lane is still fairly likely to change after passing the intersection. Taking the above together, the lane line crossing occurring in this case is not abnormal driving behavior but normal driving behavior;
815: If the result of 813 is no, the vehicle crossed a lane line without passing through an intersection; the lane line crossing occurring in this case may be abnormal driving behavior but may also be normal. The vehicle's driving state is then pending, and both scenarios above must be checked before the final result can be obtained.
Refer to FIG. 8-2, which shows the algorithm logic 82 for the vehicle in scenario (2) above according to an embodiment, including:
821: Obtain data at the moment of the vehicle's lane-line-crossing behavior. When the vehicle crosses a lane line, suspected abnormal driving behavior can be preliminarily identified from the vehicle driving behavior data through, for example, the above PCA, KPCA, LLE, or LE, and the moment the vehicle began crossing the lane line is obtained;
822: Obtain the ID of the lane the vehicle is in at the current moment;
823: Obtain the IDs of the lanes the vehicle was in during the 4 seconds before the current moment;
824: Obtain the IDs of the lanes the vehicle is in during the 4 seconds after the current moment;
825: Determine whether the lane IDs within the 4 seconds before and after the current moment are all the same;
826: If the result of 825 is no, the vehicle changed lanes within the 4 seconds before and after the current moment; since a lane change is necessarily accompanied by crossing a lane line, the lane line crossing occurring in this case is not abnormal driving behavior but normal driving behavior;
827: If the result of 825 is yes, the vehicle did not change lanes within the 4 seconds before and after the current moment; the lane line crossing occurring in this case may be abnormal driving behavior but may also be normal. The vehicle's driving state is then pending, and both scenarios above must be checked before the final result can be obtained.
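The lane-ID comparison of steps 822-825 can be sketched as a window check over a logged lane-ID sequence; the sampling rate and the sequence contents below are illustrative assumptions.

```python
def lane_change_within_window(lane_ids, idx, fps=10, window_s=4):
    """Check whether the lane ID differs anywhere in the +-4 s window around
    the lane-line-touching moment `idx`; a differing ID means the touch
    belongs to a lane change and is normal (branch 826)."""
    half = window_s * fps
    lo = max(0, idx - half)
    hi = min(len(lane_ids), idx + half + 1)
    return len(set(lane_ids[lo:hi])) > 1    # more than one ID => lane change

ids = [3] * 60 + [4] * 60                   # lane 3 -> lane 4 at sample 60
print(lane_change_within_window(ids, 58))   # True: change inside the window
print(lane_change_within_window(ids, 10))   # False: still in lane 3 throughout
```

A lane-line touch whose window contains only one lane ID falls through to branch 827 and stays pending until logic 81 has also been evaluated.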
It should be understood that although in this embodiment the vehicle's lane information is obtained by obtaining the lane ID and lane changes are judged accordingly, engineers may also use other suitable means to obtain lane information, for example obtaining the vehicle's lane information from image data through machine-learning methods.
The two specific lane-line-crossing scenarios and their corresponding algorithm logics 81-82 are described above. In the embodiments, each of these algorithm logics has a branch judged as "normal lane line crossing" and a branch judged as "possibly normal lane line crossing". For the branches judged as "possibly normal lane line crossing", the union of algorithm logics 81-82 must be executed before "possibly normal lane line crossing" can be ruled out and the behavior confirmed as "abnormal lane line crossing". Specifically, in the road scenario, if suspected lane line crossing occurs, the system runs both of the above algorithm logics 81-82; if the behavior is tentatively judged "possibly normal lane line crossing" in branch 815 of logic 81, the system continues to evaluate logic 82; if the branch results obtained in logics 81-82 are all "possibly normal lane line crossing", it can be concluded that, in the road scenario, the lane line crossing does not belong to either of the two scenarios above, and the lane line crossing can therefore be confirmed as abnormal driving behavior.
In some embodiments, refer to FIG. 9: a computer-readable storage medium 901 is provided, including an instruction set 903 executable by a processor 902 to: obtain vehicle driving behavior data; obtain vehicle driving scenario data; and execute the above algorithm logics 61-65, 71-72, and 81-82 to determine whether the vehicle is in an abnormal driving state.
In some embodiments, refer to FIG. 10: an autonomous driving assistance system is provided, which includes:
A first apparatus 1001 for obtaining vehicle driving behavior data. The first apparatus 1001 may include, without limitation, an on-board ECU, on-board sensors, and an on-board control system. The vehicle driving behavior data mainly includes quantifiable vehicle control parameters, which may be obtained from the on-board ECU or an on-board control system. The vehicle control parameters include: (1) hard-braking behavior parameter: vehicle acceleration; (2) hard-acceleration behavior parameter: vehicle acceleration; (3) the vehicle's deviation from the lane center line; (4) speed below or above the traffic flow speed: vehicle speed; (5) turning behavior parameter: vehicle heading angle. These driving behavior data can be obtained directly or computed indirectly from vehicle control parameters; they directly reflect the vehicle's driving behavior. It should be understood that the vehicle driving behavior described here may include both manual driving and autonomous driving at various levels (for example SAE levels L0-L5).
A second apparatus 1002 for obtaining current vehicle driving scenario data. The second apparatus may include, without limitation, lidar, millimeter-wave radar, cameras, ultrasonic radar, and communication devices. The current vehicle driving scenario data mainly includes: (1) other-vehicle information parameters, for example (without limitation) the number of surrounding vehicles and their speeds, accelerations, heading angles, and lateral and longitudinal position coordinates; (2) ego-vehicle information parameters, for example the vehicle's lateral and longitudinal position coordinates and heading angle; (3) traffic light parameters, for example a red/yellow/green left-turn light; (4) lane line information parameters, for example lane line position, lane width, and number of lanes; (5) road information parameters, for example road type and whether there is an intersection. It should be understood that the vehicle driving scenario data may be obtained through the above on-board devices such as lidar, millimeter-wave radar, cameras, and ultrasonic radar; it may also, via the communication devices, be reported to the vehicle by a control center through vehicle-to-center communication, or by other vehicles through inter-vehicle communication, or by a combination of the above methods; this application does not limit this. Some of these parameters are continuous and some are discrete; in some embodiments, the continuous parameters may be discretized to facilitate subsequent driving scenario identification.
A processor communicatively connected to the first apparatus and the second apparatus. The processor determines, based at least on the obtained vehicle driving behavior data and current vehicle driving scenario data, whether the vehicle is in a state of abnormal driving behavior. In some embodiments, the processor may make this determination based on the vehicle driving behavior data, the current vehicle driving scenario data, and algorithm logics 61-65, 71-72, and 81-82.
In some embodiments, an intelligent driving vehicle is provided, including an autonomous driving assistance system. The autonomous driving assistance system includes:
A first apparatus 1001 for obtaining vehicle driving behavior data. The first apparatus 1001 may include, without limitation, an on-board ECU, on-board sensors, and an on-board control system. The driving behavior data mainly includes quantifiable vehicle control parameters, which may be obtained from the on-board ECU or an on-board control system. The vehicle control parameters include: (1) hard-braking behavior parameter: vehicle acceleration; (2) hard-acceleration behavior parameter: vehicle acceleration; (3) the vehicle's deviation from the lane center line; (4) speed below or above the traffic flow speed: vehicle speed; (5) turning behavior parameter: vehicle heading angle. These driving behavior data can be obtained directly or computed indirectly from vehicle control parameters; they directly reflect the vehicle's driving behavior. It should be understood that the vehicle driving behavior described here may include both manual driving and autonomous driving at various levels (for example SAE levels L0-L5).
A second apparatus 1002 for obtaining current vehicle driving scenario data. The second apparatus 1002 may include, without limitation, lidar, millimeter-wave radar, cameras, ultrasonic radar, and communication devices. The current driving scenario data mainly includes: (1) other-vehicle information parameters, for example (without limitation) the number of surrounding vehicles and their speeds, accelerations, heading angles, and lateral and longitudinal position coordinates; (2) ego-vehicle information parameters, for example the vehicle's lateral and longitudinal position coordinates and heading angle; (3) traffic light parameters, for example a red/yellow/green left-turn light; (4) lane line information parameters, for example lane line position, lane width, and number of lanes; (5) road information parameters, for example road type and whether there is an intersection. It should be understood that the vehicle driving scenario data may be obtained through the above on-board devices such as lidar, millimeter-wave radar, cameras, and ultrasonic radar; it may also, via the communication devices, be reported to the vehicle by a control center through vehicle-to-center communication, or by other vehicles through inter-vehicle communication, or by a combination of the above methods; this application does not limit this. Some of these parameters are continuous and some are discrete; in some embodiments, the continuous parameters may be discretized to facilitate subsequent driving scenario identification.
A processor 1003 communicatively connected to the first apparatus and the second apparatus. The processor 1003 determines, based at least on the obtained vehicle driving behavior data and vehicle driving scenario data, whether the vehicle is in a state of abnormal driving behavior. In some embodiments, the processor may make this determination based on the vehicle driving behavior data, the current vehicle driving scenario data, and algorithm logics 61-65, 71-72, and 81-82.
Embodiments of this application provide a method for identifying abnormal driving behavior, an autonomous driving assistance system, a non-transitory storage medium, and a vehicle including the autonomous driving assistance system. In the technical solution of this application, by introducing driving scenario information into the identification of the vehicle's abnormal driving behavior, the misidentification problem caused in the prior art by insufficient consideration of driving scenario information is solved. Moreover, this solution matches specific driving scenarios with multiple pieces of corresponding algorithm logic and determines the identification of the vehicle's abnormal driving behavior through logical operations over the family of algorithms; engineers can also adjust the algorithm logic reasonably according to actual needs, so the solution is highly extensible. In addition, the algorithm logic of this solution has a small code footprint, so it can be conveniently applied directly on the vehicle locally, which is economically advantageous. In summary, the technical solution of this application is broadly applicable to autonomous driving solutions, systems, and vehicles of different levels.
The terms "first", "second", "third", "fourth", and so on (if any) in the specification, claims, and drawings of this application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data so termed are interchangeable in appropriate circumstances, so that the embodiments described here can be implemented in orders other than those illustrated or described here. Moreover, the terms "include" and "have" and any variants of them are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a list of steps or units is not necessarily limited to those steps or units expressly listed, but may include other steps or units not expressly listed or inherent to the process, method, product, or device.
Those skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; for example, the division into units is merely a logical functional division, and there may be other divisions in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
When the integrated unit is implemented as a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of this application, in essence, or the part contributing over the prior art, or all or part of the solution, may be embodied as a software product stored in a storage medium and including several instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods of the embodiments of this application. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Those skilled in the art should be aware that, in the one or more examples above, the functions described in this application may be implemented by hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored in a computer-readable medium or transmitted as one or more instructions or code on a computer-readable medium. Computer-readable media include computer storage media and communication media, where a communication medium includes any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium accessible to a general-purpose or special-purpose computer.
The specific implementations above further describe the objectives, technical solutions, and beneficial effects of this application in detail; it should be understood that the above are merely specific implementations of this application.
The above embodiments are merely intended to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments or make equivalent replacements of some of the technical features thereof, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of this application.

Claims (18)

  1. A method for identifying abnormal driving behavior, comprising:
    obtaining vehicle driving behavior data, and determining, based on the vehicle driving behavior data, whether a vehicle is in a state of suspected abnormal driving behavior;
    if the vehicle is in a state of suspected abnormal driving behavior, obtaining current vehicle driving scenario data;
    determining, based at least on the vehicle driving behavior data and the current vehicle driving scenario data, whether the suspected abnormal driving behavior is abnormal driving behavior.
  2. The method according to claim 1, wherein:
    cluster analysis is performed on the vehicle driving behavior data to obtain the suspected abnormal driving behavior; the vehicle driving behavior data comprises at least one of the following: vehicle speed, vehicle acceleration, vehicle heading angle, and the vehicle's deviation from a lane line.
  3. The method according to claim 2, wherein:
    the cluster analysis comprises at least one of the following methods: principal component analysis (PCA), kernel principal component analysis (KPCA), locally linear embedding (LLE), or Laplacian eigenmaps (LE).
  4. The method according to any one of claims 2-3, further comprising:
    classifying the current vehicle driving scenario data using a neural network to determine a current driving scenario; the current vehicle driving scenario data comprises at least one of the following: ego-vehicle information parameters, other-vehicle information parameters, traffic signal parameters, lane line parameters, and road information parameters;
    wherein the determining, based at least on the vehicle driving behavior data and the current vehicle driving scenario data, whether the suspected abnormal driving behavior is abnormal driving behavior comprises: determining, based at least on the suspected abnormal driving behavior and the current driving scenario, whether the suspected abnormal driving behavior is abnormal driving behavior.
  5. The method according to claim 4, wherein:
    the neural network comprises at least one of the following: a convolutional neural network (CNN) or an extreme learning machine (ELM).
  6. The method according to claim 4, wherein:
    the current driving scenario comprises at least one of the following: deceleration at an intersection, deceleration on a road segment, or lane line crossing.
  7. The method according to claim 6, further comprising:
    after the current driving scenario is determined, determining algorithm logic corresponding to the current driving scenario, and determining, according to the algorithm logic, whether the vehicle's suspected abnormal driving behavior in the current driving scenario is abnormal driving behavior.
  8. An autonomous driving assistance system, comprising:
    a first apparatus for obtaining vehicle driving behavior data;
    a second apparatus for obtaining current vehicle driving scenario data;
    a processor communicatively connected to the first apparatus and the second apparatus, the processor being configured to: determine, based on the vehicle driving behavior data, whether a vehicle is in a state of suspected abnormal driving behavior, and if the vehicle is in a state of suspected abnormal driving behavior, determine, based at least on the vehicle driving behavior data and the current vehicle driving scenario data, whether the suspected abnormal driving behavior is abnormal driving behavior.
  9. The system according to claim 8, wherein:
    the first apparatus comprises an on-board electronic control unit (ECU).
  10. The system according to any one of claims 8-9, wherein:
    the second apparatus comprises at least one of the following: lidar, millimeter-wave radar, ultrasonic radar, or a digital camera.
  11. The system according to claim 8, wherein:
    cluster analysis is performed on the vehicle driving behavior data to obtain the suspected abnormal driving behavior; the vehicle driving behavior data comprises at least one of the following: vehicle speed, vehicle acceleration, vehicle heading angle, and the vehicle's deviation from a lane line.
  12. The system according to claim 11, wherein:
    the cluster analysis comprises at least one of the following methods: principal component analysis (PCA), kernel principal component analysis (KPCA), locally linear embedding (LLE), or Laplacian eigenmaps (LE).
  13. The system according to any one of claims 11-12, wherein:
    a neural network is used to classify the current vehicle driving scenario data to determine a current driving scenario; the current vehicle driving scenario data comprises at least one of the following: ego-vehicle information parameters, other-vehicle information parameters, traffic signal parameters, lane line parameters, and road information parameters;
    the determining, based at least on the vehicle driving behavior data and the current vehicle driving scenario data, whether the suspected abnormal driving behavior is abnormal driving behavior comprises: determining, based at least on the suspected abnormal driving behavior and the current driving scenario, whether the suspected abnormal driving behavior is abnormal driving behavior.
  14. The system according to claim 13, wherein:
    the neural network comprises at least one of the following: a convolutional neural network (CNN) or an extreme learning machine (ELM).
  15. The system according to any one of claims 13-14, wherein:
    the current driving scenario comprises at least one of the following: deceleration at an intersection, deceleration on a road segment, or lane line crossing.
  16. The system according to claim 15, wherein:
    after the current driving scenario is determined, algorithm logic corresponding to the current driving scenario is determined, and whether the suspected abnormal driving behavior is abnormal driving behavior is determined according to the algorithm logic.
  17. An intelligent driving vehicle, characterized in that it comprises the system according to any one of claims 8-16.
  18. A computer-readable storage medium, comprising an instruction set executable by a processor to implement the method according to any one of claims 1-7.
PCT/CN2020/083072 2020-04-02 2020-04-02 一种异常驾驶行为识别方法 WO2021196144A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/CN2020/083072 WO2021196144A1 (zh) 2020-04-02 2020-04-02 一种异常驾驶行为识别方法
EP20929078.2A EP4120215A4 (en) 2020-04-02 2020-04-02 PROCEDURE FOR IDENTIFYING ABNORMAL DRIVING BEHAVIOR
CN202080004340.5A CN112512890B (zh) 2020-04-02 2020-04-02 一种异常驾驶行为识别方法
US17/959,066 US20230025414A1 (en) 2020-04-02 2022-10-03 Method for identifying abnormal driving behavior

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/083072 WO2021196144A1 (zh) 2020-04-02 2020-04-02 一种异常驾驶行为识别方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/959,066 Continuation US20230025414A1 (en) 2020-04-02 2022-10-03 Method for identifying abnormal driving behavior

Publications (1)

Publication Number Publication Date
WO2021196144A1 true WO2021196144A1 (zh) 2021-10-07

Family

ID=74953153

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/083072 WO2021196144A1 (zh) 2020-04-02 2020-04-02 一种异常驾驶行为识别方法

Country Status (4)

Country Link
US (1) US20230025414A1 (zh)
EP (1) EP4120215A4 (zh)
CN (1) CN112512890B (zh)
WO (1) WO2021196144A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114407860A (zh) * 2022-01-07 2022-04-29 所托(杭州)汽车智能设备有限公司 一种自动制动系统误触发判断方法、装置、设备及介质
CN114407918A (zh) * 2021-12-30 2022-04-29 广州文远知行科技有限公司 接管场景分析方法、装置、设备及存储介质
CN114495483A (zh) * 2021-12-14 2022-05-13 江苏航天大为科技股份有限公司 一种基于毫米波雷达的车辆异常行驶行为识别方法
CN115205797A (zh) * 2022-09-19 2022-10-18 上海伯镭智能科技有限公司 一种无人驾驶车辆工作状态监控方法及装置

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022231715A2 (en) * 2021-03-15 2022-11-03 Motional Ad Llc Trajectory checker
CN115527076B (zh) * 2021-06-08 2023-05-26 河北雄安京德高速公路有限公司 一种营运车辆异常驾驶行为识别模型的构建方法及系统
CN113538844A (zh) * 2021-07-07 2021-10-22 中科院成都信息技术股份有限公司 一种智能视频分析系统及方法
CN113859250B (zh) * 2021-10-14 2023-10-10 泰安北航科技园信息科技有限公司 一种基于驾驶行为异常识别的智能网联汽车信息安全威胁检测系统
CN114518741A (zh) * 2022-02-18 2022-05-20 北京小马易行科技有限公司 自动驾驶车辆的监控方法、监控装置以及监控系统
CN114475614A (zh) * 2022-03-21 2022-05-13 中国第一汽车股份有限公司 一种危险目标的筛选方法、装置、介质及设备
CN115565397A (zh) * 2022-08-19 2023-01-03 清智汽车科技(苏州)有限公司 应用于adas产品的风险容错方法和装置
CN115805948B (zh) * 2022-09-30 2023-10-31 北京百度网讯科技有限公司 车辆异常行驶行为检测方法、装置、电子设备和存储介质
CN116881778B (zh) * 2023-09-06 2023-11-28 交通运输部公路科学研究所 智能辅助教学系统安全预警与防护功能测试系统与方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105788250A (zh) * 2014-12-24 2016-07-20 中国电信股份有限公司 车辆驾驶行为处理方法和装置
CN107826118A (zh) * 2017-11-01 2018-03-23 南京阿尔特交通科技有限公司 一种判别异常驾驶行为的方法及装置
CN108773373A (zh) * 2016-09-14 2018-11-09 北京百度网讯科技有限公司 用于操作自动驾驶车辆的方法和装置

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105774810B (zh) * 2014-12-24 2019-05-07 中国电信股份有限公司 车辆驾驶行为处理方法和装置
CN106853830A (zh) * 2016-06-24 2017-06-16 乐视控股(北京)有限公司 异常驾驶行为识别方法、装置及终端设备
CN107272687A (zh) * 2017-06-29 2017-10-20 深圳市海梁科技有限公司 一种自动驾驶公交车辆的驾驶行为决策系统
CN107609602A (zh) * 2017-09-28 2018-01-19 吉林大学 一种基于卷积神经网络的驾驶场景分类方法
CN108438001A (zh) * 2018-03-15 2018-08-24 东南大学 一种基于时间序列聚类分析的异常驾驶行为判别方法
CN108764111B (zh) * 2018-05-23 2022-03-01 长安大学 一种车辆异常驾驶行为的检测方法
CN110733509A (zh) * 2018-07-18 2020-01-31 阿里巴巴集团控股有限公司 驾驶行为分析方法、装置、设备以及存储介质
CN109606284B (zh) * 2018-11-27 2021-08-24 北京千方科技股份有限公司 一种不良驾驶行为检测方法及装置
CN110264825A (zh) * 2019-07-31 2019-09-20 交通运输部公路科学研究所 一种驾驶模拟安全评价方法、装置及系统

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105788250A (zh) * 2014-12-24 2016-07-20 中国电信股份有限公司 车辆驾驶行为处理方法和装置
CN108773373A (zh) * 2016-09-14 2018-11-09 北京百度网讯科技有限公司 用于操作自动驾驶车辆的方法和装置
CN107826118A (zh) * 2017-11-01 2018-03-23 南京阿尔特交通科技有限公司 一种判别异常驾驶行为的方法及装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4120215A4 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114495483A (zh) * 2021-12-14 2022-05-13 江苏航天大为科技股份有限公司 一种基于毫米波雷达的车辆异常行驶行为识别方法
WO2023108932A1 (zh) * 2021-12-14 2023-06-22 江苏航天大为科技股份有限公司 一种基于毫米波雷达的车辆异常行驶行为识别方法
CN114407918A (zh) * 2021-12-30 2022-04-29 广州文远知行科技有限公司 接管场景分析方法、装置、设备及存储介质
CN114407860A (zh) * 2022-01-07 2022-04-29 所托(杭州)汽车智能设备有限公司 一种自动制动系统误触发判断方法、装置、设备及介质
CN115205797A (zh) * 2022-09-19 2022-10-18 上海伯镭智能科技有限公司 一种无人驾驶车辆工作状态监控方法及装置
CN115205797B (zh) * 2022-09-19 2023-01-03 上海伯镭智能科技有限公司 一种无人驾驶车辆工作状态监控方法及装置

Also Published As

Publication number Publication date
EP4120215A1 (en) 2023-01-18
US20230025414A1 (en) 2023-01-26
CN112512890B (zh) 2021-12-07
CN112512890A (zh) 2021-03-16
EP4120215A4 (en) 2023-03-22

Similar Documents

Publication Publication Date Title
WO2021196144A1 (zh) 一种异常驾驶行为识别方法
WO2022105579A1 (zh) 一种基于自动驾驶的控制方法、装置、车辆以及相关设备
US11840239B2 (en) Multiple exposure event determination
Chen et al. Deepdriving: Learning affordance for direct perception in autonomous driving
US10186150B2 (en) Scene determination device, travel assistance apparatus, and scene determination method
US10509408B2 (en) Drive planning device, travel assistance apparatus, and drive planning method
US10112614B2 (en) Drive planning device, travel assistance apparatus, and drive planning method
CN105809130B (zh) 一种基于双目深度感知的车辆可行驶区域计算方法
US10366608B2 (en) Scene determination device, travel assistance apparatus, and scene determination method
US9767368B2 (en) Method and system for adaptive ray based scene analysis of semantic traffic spaces and vehicle equipped with such system
JP5580852B2 (ja) 交通参加者の動きを予測する方法、システム、プログラム及び車両
EP3861289A1 (en) Trajectory prediction on top-down scenes
US20170364083A1 (en) Local trajectory planning method and apparatus for smart vehicles
JP2017535873A (ja) 路上シーン認識のための連続オクルージョンモデル
CN108133484B (zh) 基于场景分割的自动驾驶处理方法及装置、计算设备
CN111332296B (zh) 其他车辆的车道变换的预测
CN114829185A (zh) 定量驾驶评估和交通工具安全性限制
CN115618932A (zh) 基于网联自动驾驶的交通事件预测方法、装置及电子设备
JP2021082286A (ja) 車線変更の検出を改良するためのシステム、非一時的コンピュータ可読媒体および方法
US10074275B2 (en) Scene determination device, travel assistance apparatus, and scene determination method
US11429843B2 (en) Vehicle operation labeling
CN113536973B (zh) 一种基于显著性的交通标志检测方法
Yan [Retracted] Vehicle Safety‐Assisted Driving Technology Based on Computer Artificial Intelligence Environment
CN112232312A (zh) 基于深度学习的自动驾驶方法、装置和电子设备
Dopfer et al. What can we learn from accident videos?

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20929078

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020929078

Country of ref document: EP

Effective date: 20221013

NENP Non-entry into the national phase

Ref country code: DE