WO2022226434A1 - Self-driving vehicle evaluation using real-world data - Google Patents

Self-driving vehicle evaluation using real-world data

Info

Publication number
WO2022226434A1
Authority
WO
WIPO (PCT)
Prior art keywords
driving
episode
self
real
world
Prior art date
Application number
PCT/US2022/070037
Other languages
French (fr)
Inventor
Huidong Gao
Jiangsheng Yu
Original Assignee
Futurewei Technologies, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Futurewei Technologies, Inc. filed Critical Futurewei Technologies, Inc.
Priority to PCT/US2022/070037 priority Critical patent/WO2022226434A1/en
Publication of WO2022226434A1 publication Critical patent/WO2022226434A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3692Test management for test results analysis

Definitions

  • the present disclosure is related to self-driving vehicles, and in particular to systems and methods for evaluating operation of self-driving vehicles.
  • Self-driving vehicles also referred to as autonomous vehicles
  • Self-driving vehicles are becoming more commonplace.
  • testing the safety and reliability of self-driving vehicles is of growing importance.
  • Makers of self-driving vehicles e.g., self-driving cars, trucks, etc.
  • a computer-implemented method of identifying improper operational behavior of a self-driving vehicle comprises converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node; obtaining driving data of the self-driving vehicle; identifying a particular driving episode of the self-driving vehicle in the driving data and classifying the particular driving episode as corresponding to a particular leaf node of the decision tree; and determining a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
  • the method further comprises generating an alert for the particular driving episode based on the self-driving behavior metric.
  • the determining the self-driving behavior metric includes determining a behavioral realness metric for the particular driving episode using a probability that the particular driving episode represents real-world driving behavior; and the generating the alert includes generating the alert using the behavioral realness metric for the particular driving episode.
  • the converting the real-world driving data into the decision tree includes identifying real-world driving episodes in the real-world driving data; and grouping similar identified real-world driving episodes into a corresponding leaf node of the decision tree according to a presence of specified factors in the real-world driving episodes.
  • the identifying the real-world driving episodes in the real-world driving data comprises determining trajectories of vehicles in the real-world driving data; and identifying a determined trajectory as a driving episode.
  • the determining the self-driving behavior metric for the particular driving episode includes determining an improper operation metric, including determining a speed of the self-driving vehicle in the particular driving episode; comparing the speed of the self-driving vehicle to computed probability distributions for average speed, minimum speed, and maximum speed for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the speed of the self-driving vehicle to at least one of the computed probability distributions.
  • the determining the self-driving behavior metric includes determining an improper operation metric including determining, in the particular driving episode, a distance of the self-driving vehicle to objects; comparing the distance to probability distributions for average distance and minimum distance for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the distance to at least one of the probability distributions of the leaf node.
  • the method further comprising storing the computed probability distributions in memory as density histograms.
  • the computing the probability distributions further comprising calculating a probability distribution for one or more of average speed, minimum speed, maximum speed, average distance, and minimum distance for the identified driving episodes of the leaf node; determining a speed of the self-driving vehicle in the driving episode and a distance of the self-driving vehicle to objects in the driving episode of the self-driving vehicle; calculating at least one improper operation metric for the driving episode using at least one of the calculated probability distributions and one or both of the determined speed and distance of the self-driving vehicle; and generating the alert using the at least one improper operation metric.
  • a self-driving vehicle evaluation system for a self-driving vehicle comprising a non-transitory memory storing real-world driving data and instructions; and a processor in communication with the memory.
  • the processor is configured, upon execution of the instructions, to perform the following steps: convert real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; compute episode evaluation data for the real-world driving episodes of the multiple leaf nodes and compute probability distributions for the episode evaluation data for each leaf node; obtain driving data of the self-driving vehicle; identify a particular driving episode of the self-driving vehicle in the driving data and classify the particular driving episode as corresponding to a particular leaf node of the decision tree; and determine a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
  • the self-driving vehicle evaluation system including a user interface operatively coupled to the processing circuitry; wherein the converting further comprises the processor further executes the instructions to receive selections of objects of interest via the user interface; and group the identified real-world driving episodes into the multiple leaf nodes of the decision tree according to a presence of the objects of interest in the real-world driving episodes.
  • the processor further executes the instructions to receive the episode parameter for identifying a real-world driving episode in the real-world driving data via the user interface; and identify one or more real-world driving episodes in the real-world driving data using the episode parameter.
  • the processor further executes the instructions to present an alert according to the self-driving behavior metric, and the processor presenting the alert using the user interface.
  • the processor further executes the instructions to determine a behavioral realness metric for the particular driving episode using a probability that the particular driving episode represents real-world driving behavior; and present the behavioral realness metric for the driving episode using the user interface.
  • the determining the self-driving behavior metric for the particular driving episode includes determining an improper operation metric, including determining a speed of the self-driving vehicle in the particular driving episode; comparing the speed of the self-driving vehicle to computed probability distributions for average speed, minimum speed, and maximum speed for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the speed of the self-driving vehicle to at least one of the computed probability distributions.
  • the determining the self-driving behavior metric includes determining an improper operation metric including determining, in the particular driving episode, a distance of the self-driving vehicle to objects; comparing the distance to probability distributions for average distance and minimum distance for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the distance to at least one of the probability distributions of the leaf node.
  • the processor further executes the instructions to store the probability distributions of the multiple leaf nodes in memory as density histograms.
  • the computing the probability distributions further comprising calculating a probability distribution for one or more of average speed, minimum speed, maximum speed, average distance, and minimum distance for the identified driving episodes of the leaf node; determining a speed of the self-driving vehicle in the driving episode and a distance of the self-driving vehicle to objects in the driving episode of the self-driving vehicle; calculating at least one improper operation metric for the driving episode using at least one of the calculated probability distributions and one or both of the determined speed and distance of the self-driving vehicle; and generating the alert using the at least one improper operation metric.
  • a non-transitory computer-readable storage medium storing computer instructions that when executed by one or more processors, cause the one or more processors to perform the steps of converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node; obtaining driving data of the self-driving vehicle; identifying a particular driving episode of the self-driving vehicle in the driving data and classifying the particular driving episode as corresponding to a particular leaf node of the decision tree; and determining a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
  • the computer-readable storage medium further storing instructions that when executed by the one or more processors perform the step of generating an alert for the particular driving episode based on the self-driving behavior metric.
  • the determining the self-driving behavior metric includes determining a behavioral realness metric for the particular driving episode using a probability that the particular driving episode represents real-world driving behavior; and the generating the alert includes generating the alert using the behavioral realness metric for the particular driving episode.
  • the converting the real-world driving data into the decision tree includes identifying real-world driving episodes in the real-world driving data; and grouping similar identified real-world driving episodes into a corresponding leaf node of the decision tree according to a presence of specified factors in the real-world driving episodes.
  • the identifying the real-world driving episodes in the real-world driving data comprising determining trajectories of vehicles in the real-world driving data; and identifying a determined trajectory as a driving episode.
  • the determining the self-driving behavior metric for the particular driving episode includes determining an improper operation metric, including determining a speed of the self-driving vehicle in the particular driving episode; comparing the speed of the self-driving vehicle to computed probability distributions for average speed, minimum speed, and maximum speed for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the speed of the self-driving vehicle to at least one of the computed probability distributions.
  • the determining the self-driving behavior metric includes determining an improper operation metric including determining, in the particular driving episode, a distance of the self-driving vehicle to objects; comparing the distance to probability distributions for average distance and minimum distance for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the distance to at least one of the probability distributions of the leaf node.
  • the computer-readable storage medium further storing instructions that when executed by the one or more processors perform the step of storing the computed probability distributions in memory as density histograms.
  • the computer-readable storage medium further storing instructions that when executed by the one or more processors perform the steps of calculating a probability distribution for one or more of average speed, minimum speed, maximum speed, average distance, and minimum distance for the identified driving episodes of the leaf node; determining a speed of the self-driving vehicle in the driving episode and a distance of the self-driving vehicle to objects in the driving episode of the self-driving vehicle; calculating at least one improper operation metric for the driving episode using at least one of the calculated probability distributions and one or both of the determined speed and distance of the self-driving vehicle; and generating the alert using the at least one improper operation metric.
  • a self-driving vehicle evaluation system for a self-driving vehicle comprising a decision tree conversion module for converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; an episode evaluation module for computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node; a driving data module for obtaining driving data of the self-driving vehicle; a classification module for identifying a particular driving episode of the self-driving vehicle in the driving data and classifying the particular driving episode as corresponding to a particular leaf node of the decision tree; and a behavior module for determining a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
  • FIG. 1 is an illustration of a self-driving vehicle scenario including self-driving vehicles.
  • FIG. 2 is a flow diagram of a computer-implemented method of identifying improper operational behavior of a self-driving vehicle according to an embodiment.
  • FIG. 3 illustrates a decision tree constructed from real-world driving data according to an example embodiment.
  • FIGS. 4A-4E show example probability distributions for driving episodes.
  • FIG. 5 is an illustration of using a density histogram to calculate self-driving behavior metrics.
  • FIG. 6 is a block diagram of a system for identifying improper operational behavior of a self-driving vehicle according to an embodiment.
  • the functions or algorithms described herein may be implemented in software in one embodiment.
  • the software may consist of computer executable instructions stored on computer readable media or computer readable storage device such as one or more non-transitory memories or other type of hardware-based storage devices, either local or networked.
  • modules which may be software, hardware, firmware, or any combination thereof
  • Multiple functions may be performed in one or more modules as desired, and the embodiments described are merely examples.
  • the software may be executed on a digital signal processor, application specific integrated circuit (ASIC), microprocessor, or other type of processor operating on a computer system, such as a personal computer, server, or other computer system, turning such computer system into a specifically programmed machine.
  • ASIC application specific integrated circuit
  • The functionality can be configured to perform an operation using, for instance, software, hardware, firmware, or the like.
  • the phrase “configured to” can refer to a logic circuit structure of a hardware element that is to implement the associated functionality.
  • the phrase “configured to” can also refer to a logic circuit structure of a hardware element that is to implement the coding design of associated functionality of firmware or software.
  • the term “module” refers to a structural element that can be implemented using any suitable hardware (e.g., a processor, among others), software (e.g., an application, among others), firmware, or any combination of hardware, software, and firmware.
  • the term “logic” encompasses any functionality for performing a task.
  • each operation illustrated in the flowcharts corresponds to logic for performing that operation.
  • An operation can be performed using software, hardware, firmware, or the like.
  • the terms “component,” “system,” and the like may refer to computer-related entities, hardware, and software in execution, firmware, or combination thereof.
  • a component may be a process running on a processor, an object, an execution, a program, a function, a subroutine, a computer, or a combination of software and hardware.
  • processor and “processing circuitry” may refer to a hardware component, such as a processing unit of a computer system.
  • the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computing device to implement the disclosed subject matter.
  • article of manufacture is intended to encompass a computer program accessible from any computer-readable storage device or media.
  • Computer-readable storage media can include, but are not limited to, magnetic storage devices, e.g., hard disk, floppy disk, magnetic strips, optical disk, compact disk (CD), digital versatile disk (DVD), smart cards, flash memory devices, among others. In contrast, computer-readable media (i.e., not limited to storage media) may additionally include communication media such as transmission media for wireless signals and the like.
  • This disclosure provides routines and data for evaluating an operational behavior of a self-driving vehicle.
  • the evaluation can be used to assess a potential performance and safety of a self-driving vehicle. Further, the evaluation can provide identification of improper operational behavior by the self-driving vehicle.
  • the identification comprises identification of improper operational behavior performed by the self-driving vehicle, not by a human occupant or human driver. It should be understood that the identification includes identification of dangerous or potentially dangerous behavior by the self-driving vehicle operating in a self-driving mode.
  • the danger includes danger to the vehicle and occupants. The danger further includes danger to pedestrians and/or danger to property.
  • Gathered real-world driving data is assembled into a collection of data that can be used to test and analyze a self-driving vehicle in multiple ways, including comparing a performance of the self-driving vehicle without taking the self-driving vehicle onto public roadways.
  • the real-world driving data provides many possible driving scenarios that can be used to thoroughly test the self-driving vehicle, but without endangering humans or property.
  • the identification is performed on real-world driving data to ensure that the self-driving vehicle does not exhibit any improper or dangerous self-driving actions.
  • the identification can comprise identification of improper operational behavior in recorded real-world driving data (i.e., previously-captured operational behavior data) of the self-driving vehicle.
  • the identification can comprise identification of improper operational behavior in real time by analyzing generated driving data as the self-driving vehicle is moving and/or operating. The identification of improper operational behavior by the self-driving vehicle can subsequently be used to improve self-driving vehicle performance.
  • testing the safety of operation of self-driving vehicles is important. It is desirable for the safety evaluation to consider real-world data and take different driving scenarios into account. For example, a velocity of 5 meters per second (5 m/s) could be considered as not dangerous if the vehicle is operating on a residential street, but the velocity could pose danger in a scenario where pedestrians are present. Alternatively, it could comprise dangerous self-driving behavior when the self-driving vehicle is on a freeway or other high-speed vehicle roadway.
  • a universal evaluation method is desirable to evaluate the performance of self-driving vehicles.
  • the self-driving vehicle evaluation can be performed as part of a design process or design evaluation process.
  • the self-driving vehicle evaluation can be performed to ensure all possible self-driving scenarios have been considered and handled.
  • The self-driving vehicle evaluation can be performed to automate testing.
  • the self-driving vehicle evaluation can be performed to provide consistent testing of self-driving vehicles over time.
  • the self-driving vehicle evaluation can be performed to ensure compliance with relevant self-driving vehicle regulations.
  • the self-driving vehicle evaluation can be performed to provide self-driving performance testing and tracking that is consistent for different self-driving vehicles.
  • the self-driving vehicle evaluation can be performed to provide testing and tracking that can be expanded upon over time and as needed, and as self-driving vehicles become more accepted and more prevalent.
  • the basic concept comprises the use of gathered real-world self- driving vehicle information to evaluate self-driving vehicles.
  • the real-world self-driving information can be obtained once, or can be repeatedly collected (or added to) over time, creating a larger and larger database of real-world self- driving information.
  • the gathered real-world self-driving information can be used for testing all manner of self-driving vehicles.
  • the real-world self-driving vehicle information in some embodiments comprises data previously gathered from self-driving vehicles and is used to build a self-driving operational behavior database of self-driving scenarios.
  • the self-driving operational behavior database is used for testing self-driving vehicle designs. As a result, the database accumulates self-driving vehicle data and examples, enabling more intensive and detailed testing of newer self-driving vehicle designs. The database therefore enables continuous improvement in the testing and evaluation of self-driving vehicles.
  • the real-world self-driving vehicle information can include recorded data for all self-driving regimes and scenarios.
  • the real-world self-driving information can include data for different road and street types.
  • the real-world self-driving information can include data for different driving speeds.
  • the real-world self-driving information can include data for different vehicle operational laws and rules.
  • the real-world self-driving information can include data for different traffic types and levels.
  • the real-world self-driving information can include data for different types of passengers/cargo (i.e., personal transport versus commercial transport, emergency vehicles, et cetera).
  • the real-world self-driving information can include data for different types of weather and/or different driving conditions.
  • the real-world self-driving information can include data for various obstacles or road hazards.
  • the real-world self-driving information can include data for the presence of and/or anticipated and unanticipated actions by pedestrians.
  • the real-world self-driving information can include data for the presence of extreme weather conditions.
  • the real-world self-driving information can include data for wildlife and/or livestock on or near the roadway.
  • the real-world self-driving information can include data for interacting with and avoiding various other types of drivers on the road.
  • the real-world self-driving information can include data for operation in the presence of emergency vehicles.
  • the real-world self-driving information can include data for operation in the presence of road problems, road repair, and/or construction.
  • the real-world self-driving information can include data for operation in the proximity of wrecks, roadway blockages, or other unforeseen or emergency situations.
  • the real-world self-driving information can include data for various other types of drivers on the road. Other self-driving scenarios are contemplated and are within the scope of the discussion and claims.
  • the real-world self-driving information comprises a knowledge repository that will grow over time.
  • the real-world self-driving information will ensure that testing of self-driving vehicles is thorough.
  • the real-world self-driving information will ensure the testing of self-driving vehicles is consistent.
  • the real-world self-driving information can be used to identify improper self-driving operation.
  • The real-world self-driving information can be used to identify potentially dangerous self-driving operation.
  • the real-world self-driving information can be used to identify potentially dangerous self-driving operation in a variety of situations and environments, such as detecting potentially dangerous operation that is acceptable in good weather conditions, but might lead to loss of vehicular driving control under adverse weather such as rain, snow, or ice, for example.
  • FIG. 1 is an illustration of a self-driving vehicle scenario 100 including self-driving vehicles 110 and 120.
  • the self-driving vehicles 110 and 120 can generate real-world driving data.
  • the self-driving vehicles 110 and 120 can contribute self-driving data to an aggregated accumulated real-world driving data.
  • Self-driving vehicle 110 includes an onboard computer 112 and sensors 115 in communication with the onboard computer 112.
  • the onboard computer 112 receives signals and/or measurements from the sensors 115, processes this data, and uses the data as part of operating the self-driving car 110.
  • self-driving car 120 includes an onboard computer 122.
  • Each vehicle 110 and 120 includes sensors 115 and 125 such as position sensors, time-of-flight sensors, accelerometers, radar, lidar (light detection and ranging or 3D laser scanning), ultrasonic and/or sonar detectors, one or more video cameras, inertial navigation devices, GPS devices, etc., to collect real-world driving data.
  • Sensors such as radar, lidar, and ultrasonic devices can be used for terrain mapping, object detection, and object distance measuring, for example.
  • The driving data generated by a self-driving vehicle is used to determine operational behavior metrics to evaluate the performance of the self-driving vehicle.
  • The operational behavior metrics can be used to determine a potential danger in the operation of a self-driving vehicle.
  • the self-driving vehicles 110 and 120 include onboard computers 112 and 122.
  • the self-driving vehicles 110 and 120 further include sensors 115 and 125, where the onboard computers 112 and 122 are in communication with the sensors 115 and 125 and receive sensor data.
  • the onboard computers 112 and 122 process the sensor data during operation of the self-driving vehicles 110 and 120. Further, the onboard computers 112 and 122 can use the sensor data in conjunction with previously-obtained real-world driving data to assess the operational behavior of the self-driving vehicles 110 and 120.
  • FIG. 2 is a flow diagram of a computer-implemented method 200 of identifying improper operational behavior of a self-driving vehicle according to an embodiment.
  • the method 200 begins with processing naturalistic driving data (NDD) or real-world driving data.
  • the driving data set includes information on the behavior of multiple real-world vehicles (e.g., speed, direction, etc.) as well as information on the surroundings of the vehicles for different scenarios (e.g., objects near the vehicles, whether the scenario is for a highway, weather, road conditions, etc.).
  • the multiple real-world vehicles can include self-driving vehicles and/or human-operated vehicles.
  • the data can be collected using vehicles such as those shown in FIG. 1.
  • the data in the real-world driving data set is converted into a decision tree having multiple leaf nodes.
  • the real-world driving data may be stored in memory of a computing device (e.g., a computer, laptop computer, tablet computer, smartphone, etc.), or the real-world driving data may be downloaded from the cloud (e.g., from a cloud server).
  • the converting of the data is performed by processing circuitry (e.g., one or more hardware processors) of the computing device.
  • The real-world driving data can be obtained in the form of raw or previously-processed data. The data can be obtained in the form of organized data, including organized in a database, neural network model, or other suitable form or suitable data structure.
  • a driving episode of a vehicle may be defined using a trajectory of the vehicle (e.g., a predetermined number of meters travelled by the vehicle) or by a duration of driving time (e.g., a predetermined number of seconds of driving time).
  • a driving episode may be comprised of one or more trajectories.
  • the trajectory can be defined as an operation or operational phase of the self-driving vehicle.
  • the trajectory can be defined as operation: at a substantially consistent speed or speed range, in a substantially consistent direction, or from a source point to a destination point.
  • a trajectory can be defined in other ways, and all such trajectories are within the scope of this disclosure.
  • a driving episode may be identified and/or categorized in other ways.
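  • As an illustration of the episode-identification step just described, the following sketch cuts a recorded trajectory into driving episodes using a travelled-distance or elapsed-time threshold. It is a minimal sketch under stated assumptions: the patent does not prescribe an implementation, and the function name, the trajectory representation, and the threshold values are illustrative only.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TrajectoryPoint:
    t: float  # timestamp in seconds
    x: float  # east position in meters
    y: float  # north position in meters

def split_into_episodes(points: List[TrajectoryPoint],
                        max_distance_m: float = 200.0,
                        max_duration_s: float = 30.0) -> List[List[TrajectoryPoint]]:
    """Cut a vehicle trajectory into driving episodes.

    An episode is closed once the travelled distance or the elapsed time
    exceeds the configured episode parameter (threshold values are illustrative).
    """
    episodes, current = [], []
    travelled, start_t, prev = 0.0, None, None
    for p in points:
        if start_t is None:
            start_t = p.t
        if prev is not None:
            travelled += ((p.x - prev.x) ** 2 + (p.y - prev.y) ** 2) ** 0.5
        current.append(p)
        prev = p
        if travelled >= max_distance_m or (p.t - start_t) >= max_duration_s:
            episodes.append(current)
            current, travelled, start_t, prev = [], 0.0, None, None
    if current:
        episodes.append(current)
    return episodes
```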
  • FIG. 3 illustrates a decision tree 300 constructed from real-world driving data according to an example embodiment.
  • the decision tree 300 in the example shown includes a root node 305, a first node level 310, a second node level 315, and a third node level 320.
  • the third node level 320 in this example comprises a leaf node level comprising multiple leaf nodes.
  • the example only includes three levels of decisions, but an actual implementation may have many levels of decisions.
  • the decisions in the example of FIG. 3 are binary decisions.
  • the first decision level 305 divides the driving episodes into two groups: those episodes in which the operated vehicle only interacts with other vehicles (V), and episodes in which the vehicle interacts with non-vehicle objects (e.g., pedestrians, cyclists, etc.) as well as vehicles.
  • V vehicles
  • non-vehicle objects e.g., pedestrians, cyclists, etc.
  • the second decision level 310 further divides the two groups output from decision level 305 into those driving episodes with vehicles encountering speed bumps (SB) and those driving episodes that do not have speed bumps.
  • the output from the second decision level is four groups: i) those driving episodes that include only vehicles and include speed bumps, ii) those driving episodes that include only vehicles and do not include speed bumps, iii) those driving episodes that include non-vehicle objects, vehicles, and speed bumps, and iv) those driving episodes that include non-vehicle objects, vehicles, and do not include speed bumps.
  • the third decision level 315 divides the four groups that are outputted from decision level 310 into those driving episodes with vehicles encountering a crosswalk (CW) and those episodes that do not have crosswalks. Eight leaf nodes 320 are generated and the individual driving episodes identified in the driving data are classified in the appropriate leaf node.
  • CW crosswalk
  • the eight leaf nodes include: i) driving episodes that include only vehicles and include speed bumps and crosswalks (V_SB_CW), ii) driving episodes that include only vehicles, include speed bumps, and do not include crosswalks (V_SB_nonCW), iii) driving episodes that include only vehicles, do not include speed bumps, and include crosswalks (V_nonSB_CW), iv) driving episodes that include only vehicles and do not include speed bumps or crosswalks (V_nonSB_nonCW), v) driving episodes that include non-vehicle objects and include speed bumps and crosswalks (nonV_SB_CW), vi) driving episodes that include non-vehicle objects, include speed bumps, and do not include crosswalks (nonV_SB_nonCW), vii) driving episodes that include non-vehicle objects, do not include speed bumps, and include crosswalks (nonV_nonSB_CW), and viii) driving episodes that include non-vehicle objects and do not include speed bumps or crosswalks (nonV_nonSB_nonCW).
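  • As a concrete illustration of the leaf-node labels enumerated above, the sketch below maps the three binary factors of the example decision tree (vehicles only, speed bump, crosswalk) to one of the eight leaf labels. The function name and boolean flags are assumptions; a real implementation would derive these factors from the perception data of each episode.

```python
def leaf_node_label(vehicles_only: bool,
                    has_speed_bump: bool,
                    has_crosswalk: bool) -> str:
    """Map the three binary factors of the example decision tree of FIG. 3
    to one of the eight leaf-node labels."""
    return "_".join([
        "V" if vehicles_only else "nonV",
        "SB" if has_speed_bump else "nonSB",
        "CW" if has_crosswalk else "nonCW",
    ])

# Example: an episode with pedestrians present, no speed bump, and a crosswalk.
assert leaf_node_label(False, False, True) == "nonV_nonSB_CW"
```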
  • evaluation data is computed for each of the driving episodes in each leaf node of the multiple leaf nodes.
  • Some examples of the evaluation data include: average speed (v_avg) of the monitored vehicle in each driving episode in the leaf node, minimum speed (v_min) of the vehicle in each driving episode in the leaf node, maximum speed (v_max) of the vehicle in each driving episode in the leaf node, average distance between objects (vehicles and non-vehicle objects, if any) pairwise (l_avg) in each driving episode in the leaf node, and minimum distance between objects pairwise (l_min) in each driving episode in the leaf node.
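  • A minimal sketch of computing this evaluation data per episode follows, assuming each episode supplies a sampled speed series for the monitored vehicle and the pairwise object distances observed during the episode; the function name and dictionary keys are illustrative, not taken from the patent.

```python
from statistics import mean

def episode_evaluation_data(speeds_mps, pairwise_distances_m):
    """Compute the per-episode statistics named in the description: average,
    minimum, and maximum speed of the monitored vehicle, and the average and
    minimum pairwise distance between objects in the episode."""
    return {
        "v_avg": mean(speeds_mps),
        "v_min": min(speeds_mps),
        "v_max": max(speeds_mps),
        "l_avg": mean(pairwise_distances_m),
        "l_min": min(pairwise_distances_m),
    }

# Example: one short episode sampled at four time steps.
stats = episode_evaluation_data(
    speeds_mps=[4.8, 5.1, 5.0, 4.9],
    pairwise_distances_m=[12.0, 9.5, 8.7, 10.2],
)
```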
  • Distributions for the evaluation data are then calculated and stored for each leaf node.
  • FIGS. 4A-4E show example probability distributions for self-driving episodes.
  • The figures show example distributions of the computed average speed (in m/s), minimum speed, maximum speed, average distance (in meters), and minimum distance, respectively, for driving episodes.
  • the driving episodes in the example are based on a split between scenarios with only vehicles (V) and scenarios with non-vehicle objects present (nonV).
  • the example distributions are density histograms. In a density histogram the areas of all the vertical columns add up to one for all the driving episodes in the distribution. For example, in FIG. 4A, the probability that the average speed of a vehicle is 5 m/s is 0.12.
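  • The density-histogram storage can be sketched as follows. This is an assumption-laden illustration using NumPy; the bin width and helper names are not from the patent. Reading the density at 5 m/s from an average-speed histogram would correspond to the 0.12 example of FIG. 4A.

```python
import numpy as np

def build_density_histogram(values, bin_width=1.0):
    """Build a density histogram (bin areas sum to 1) for one criterion,
    e.g. the average speeds of all real-world episodes in a leaf node."""
    values = np.asarray(values, dtype=float)
    n_bins = max(1, int(np.ceil((values.max() - values.min()) / bin_width)))
    density, edges = np.histogram(values, bins=n_bins, density=True)
    return density, edges

def density_at(value, density, edges):
    """Look up the probability density of the histogram bin containing `value`."""
    idx = np.clip(np.searchsorted(edges, value, side="right") - 1, 0, len(density) - 1)
    return float(density[int(idx)])
```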
  • the decision tree, leaf nodes, and evaluation data distributions provide a database against which the performance of a self-driving vehicle can be evaluated.
  • driving data of a self-driving vehicle to be evaluated is obtained.
  • Driving data is collected by running the self-driving vehicle and collecting sensor data generated by the sensors (the data can further include other data, including data from sources other than the self-driving vehicle).
  • the self-driving vehicle includes sensors and an onboard computer to collect the driving data, such as for the vehicles shown in FIG. 1.
  • the driving data can be collected and stored in memory of the evaluating computer system, or the driving data for the self-driving vehicle may be collected and streamed to the cloud and stored and later downloaded by the evaluating computer system.
  • driving episodes for the self-driving vehicle being evaluated are identified in the new driving data, and classified into leaf nodes using the decision tree.
  • the driving episodes identified for the self-driving vehicle are classified into the eight leaf nodes.
  • the evaluation data e.g., average speed, minimum speed, maximum speed, average distance, and minimum distance
  • one or more operational behavior metrics can be calculated for the driving episodes of the self-driving vehicle being evaluated. Different metrics can be calculated by comparing the evaluation data for the self-driving vehicle against the distributions for the real-world self-driving data. For example, the average speed for a driving episode scenario for the self-driving vehicle can be compared to the average speed distribution for real-world self-driving data for the corresponding scenario leaf node that the driving episode was classified into.
  • An example of a metric to evaluate the self-driving vehicle is a self-driving behavior metric.
  • the self-driving behavior metric is related to a rareness of an event, and the value(s) of the criteria of the self-driving behavior metric are calculated.
  • the probability density of this criterion indicates the rareness of the calculated value.
  • the rarer the calculated value the more unusual (and potentially improper) is the behavior.
  • a smaller criteria value e.g., smaller minimum distance
  • the self-driving behavior metric for each criterion can be defined to be the inverse of the probability density and the actual criterion value.
  • the behavior realness metric can be defined by the probability density function.
  • the behavior realness metric in some embodiments assigns a weight, value, or probability to a self-driving behavior metric, with the behavior realness metric indicating the likelihood of the self-driving behavior metric occurring in self-driving operation of the self-driving vehicle. Therefore, a higher value in the behavior realness metric indicates the performance of the self-driving vehicle in the scenario is more similar to the real-world vehicle performance, and thus more realistic.
  • FIG. 5 is an illustration of using a density histogram to calculate self-driving behavior metrics.
  • the self-driving behavior metrics are subsequently used for evaluating operation of a self-driving vehicle.
  • the example involves a self-driving car episode classified into the leaf node corresponding to scenarios with only vehicles present, and speed bumps and crosswalks present (V_SB_CW).
  • the minimum distance of the self-driving vehicle is weighed against the minimum distances of the real-world data in the V_SB_CW leaf node using the distribution 505 for the minimum distance.
  • Behavior_realness_metric_lmin = P(L_min ≤ l_0) (2). In the minimum distance distribution in FIG. 5, the behavioral realness metric is the shaded area 510.
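  • Both metrics can be sketched from a stored density histogram. This is only one reading of the description: the self-driving behavior metric is taken here as the reciprocal of the product of the bin density and the observed criterion value (so rarer and smaller values score higher), and the behavioral realness metric is the cumulative probability P(L_min ≤ l_0) of equation (2), i.e. the shaded area 510 of FIG. 5. The function names and the epsilon guard are assumptions.

```python
import numpy as np

def behavior_metric(value, density, edges, eps=1e-9):
    """Illustrative self-driving behavior metric: inverse of the probability
    density at the observed criterion value times the value itself, so that
    rare and small criterion values (e.g. a small minimum distance) score high."""
    idx = int(np.clip(np.searchsorted(edges, value, side="right") - 1,
                      0, len(density) - 1))
    return 1.0 / max(float(density[idx]) * value, eps)

def behavior_realness_metric(value, density, edges):
    """Cumulative probability P(criterion <= value) under the density histogram,
    i.e. the shaded area up to the observed value (cf. equation (2))."""
    total = 0.0
    for d, left, right in zip(density, edges[:-1], edges[1:]):
        if value >= right:
            total += d * (right - left)   # whole bin lies below the value
        elif value > left:
            total += d * (value - left)   # partial bin up to the value
            break
        else:
            break
    return total
```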
  • the evaluation system may flag a certain scenario result for the user.
  • the system may generate an alert for the driving episode based on the determined self-driving behavior metric indicating a rare event (e.g., for when the minimum distance in the scenario is small).
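  • A deliberately minimal sketch of this alert step follows, assuming a user-configurable threshold on the self-driving behavior metric; the threshold value and function name below are illustrative only.

```python
def maybe_alert(behavior_metric_value: float, threshold: float = 100.0):
    """Return an alert message when the self-driving behavior metric of an
    episode indicates a rare (and potentially improper) event, else None.
    The threshold is an illustrative, user-configurable value."""
    if behavior_metric_value >= threshold:
        return ("ALERT: rare driving behavior detected "
                f"(metric={behavior_metric_value:.2f}, threshold={threshold})")
    return None
```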
  • The evaluation system implementing the method 200 of FIG. 2 may be a simulation system used to weigh the performance of a self-driving vehicle. A user of the simulation system may want to change parameters of the evaluation.
  • FIG. 6 is a block diagram of a system 600 for identifying improper operational behavior of a self-driving vehicle according to an embodiment.
  • the computer system 600 executes software or instructions that configure the system 600 for performing methods and algorithms according to example embodiments. All components need not be used in various embodiments.
  • One example is a computing device that may include processing circuitry (e.g., a processing unit 602), memory 603, removable storage 610, and non-removable storage 612.
  • processing circuitry e.g., a processing unit 602
  • memory 603, removable storage 610 e.g., RAM
  • the computing device may be in different forms in different embodiments.
  • the computing device may be a server, a router, or a virtual router.
  • the storage may also or alternatively include cloud-based storage accessible via a network, such as the Internet or server-based storage.
  • a network such as the Internet or server-based storage.
  • an SSD may include a processor on which the parser may be run, allowing transfer of parsed, filtered data through I/O channels between the SSD and main memory.
  • Memory 603 may include volatile memory 614 and non-volatile memory 608.
  • Computer 600 may include - or have access to a computing environment that includes - a variety of computer-readable media, such as volatile memory 614 and non-volatile memory 608, removable storage 610 and non-removable storage 612.
  • Computer storage includes random-access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM) or electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD ROM), Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium capable of storing computer-readable instructions.
  • RAM random-access memory
  • ROM read-only memory
  • EPROM erasable programmable read-only memory
  • EEPROM electrically erasable programmable read-only memory
  • flash memory or other memory technologies, compact disc read-only memory (CD ROM), Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium capable of storing computer-readable instructions.
  • CD ROM compact disc read-only memory
  • DVD Digital Versatile Disks
  • Computer 600 may include or have access to a computing environment that includes input interface 606 (or user interface), output interface 604, and a communication interface 616.
  • Output interface 604 may include a display device, such as a touchscreen, that also may serve as an input device.
  • the input interface 606 may include one or more of a touchscreen, touchpad, mouse, keyboard, camera, one or more device-specific buttons, one or more sensors integrated within or coupled via wired or wireless data connections to the computer 600, and other input devices.
  • the computer 600 may operate in a networked environment using a communication connection to connect to one or more remote computers, such as database servers.
  • the remote computer may include a personal computer (PC), server, router, network PC, a peer device or other common data flow network switch, or the like.
  • the communication connection may include a Local Area Network (LAN), a Wide Area Network (WAN), cellular, Wi-Fi, Bluetooth, or other networks.
  • the various components of computer 600 are connected with a system bus 620.
  • Computer-readable instructions stored on a computer-readable medium are executable by the processing unit 602 of the computer 600, such as a program 618.
  • the program 618 in some embodiments comprises software to implement one or more methods described herein.
  • a hard drive, CD-ROM, and RAM are some examples of articles including a non-transitory computer-readable medium, such as a storage device.
  • the terms computer-readable medium and storage device do not include carrier waves to the extent carrier waves are deemed too transitory.
  • Storage can also include networked storage, such as a storage area network (SAN).
  • Computer program 618 may be used to cause processing unit 602 to perform one or more methods or algorithms described herein.
  • the computer 600 can be configured to comprise all or a portion of a simulation system to implement the method 200 of FIG. 2.
  • a user e.g., a manufacturer of self-driving vehicles
  • the customization may increase or decrease the number of leaf nodes to classify the driving episodes (of both the real-world driving episodes and the self-driving vehicle being evaluated) using a finer classification or coarser classification of the driving episode.
  • the user may change the number or types of objects of interest in the driving episodes to change the classification in the leaf nodes.
  • the user may also specify other non-object factors to classify episodes, such as weather, road conditions, etc.
  • the processing circuitry of the computer 600 adds or removes decisions of the decision tree based on the changes. In some examples, the customized decision-tree data is sent to the onboard computer of the vehicle (e.g., using the cloud). Driving data for the self-driving vehicle is collected by the self-driving vehicle and analyzed against the decision tree using the onboard computer. In some examples, the computer 600 is implemented using the onboard computer of the self-driving vehicle.
  • the user may change how a driving episode is defined.
  • the processing circuitry may receive one or more episode parameters (e.g., a threshold distance travelled for a trajectory to be called a driving episode, or a time duration for a trajectory to be called a driving episode) and change the way driving episodes are identified in the real-world data and the self-driving vehicle data using the changed episode parameters.
  • episode parameters e.g., a threshold distance travelled for a trajectory to be called a driving episode, or a time duration for a trajectory to be called a driving episode
  • the user may change the threshold value for a metric (e.g., the self-driving behavior metric) before an alert is generated and presented (e.g., displayed) to the user.
  • a metric e.g., the self-driving behavior metric
  • the computer 600 includes a decision tree conversion module for converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes, an episode evaluation module for computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node, a driving data module for obtaining driving data of the self-driving vehicle, a classification module for identifying a particular driving episode of the self-driving vehicle in the driving data and classifying the particular driving episode as corresponding to a particular leaf node of the decision tree, and a behavior module for determining a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
  • a decision tree conversion module for converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes
  • an episode evaluation module for computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node
  • the computer 600 may include other or additional modules for performing any one of or combination of steps described in the embodiments. Further, any of the additional or alternative embodiments or aspects of the method, as shown in any of the figures or recited in any of the claims, are also contemplated to include similar modules.
  • the methods, systems and devices described herein provide techniques for evaluating self-driving vehicles. Operation of the self-driving vehicles is evaluated using self-driving behavior metrics that change with the driving scenarios being evaluated. Driving data is applied to a decision tree structure to improve classification of driving conditions by the evaluating system.
  • the self-driving behavior metrics determined for the self-driving vehicles are determined and interpreted in a probabilistic way rather than merely comparing a single valued result to a fixed single value. These techniques result in evaluation of self-driving vehicles that is realistic and practical.
  • the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results.
  • Other steps may be provided, or steps may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems.
  • Other embodiments may be within the scope of the following claims.

Abstract

A computer-implemented method of identifying improper operational behavior of a self-driving vehicle is provided. The method includes converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node; obtaining driving data of the self-driving vehicle; identifying a particular driving episode of the self-driving vehicle in the driving data and classifying the particular driving episode as corresponding to a particular leaf node of the decision tree; and determining a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.

Description

SELF-DRIVING VEHICLE EVALUATION USING REAL-WORLD DATA
TECHNICAL FIELD
[0001] The present disclosure is related to self-driving vehicles, and in particular to systems and methods for evaluating operation of self-driving vehicles.
BACKGROUND
[0002] Self-driving vehicles (also referred to as autonomous vehicles) are becoming more commonplace. As a result, testing the safety and reliability of self-driving vehicles is of growing importance. Makers of self-driving vehicles (e.g., self-driving cars, trucks, etc.) need to show the safety of their vehicles and need to show they have performed safety testing, as safe operation of self-driving vehicles is of great importance.
[0003] However, there is currently no universally accepted way to evaluate driving scenarios and driving safety for self-driving vehicles. Current approaches calculate evaluation metrics based on relative velocity and distance in general. Although some approaches compare the calculated evaluation metrics to real-world data metrics, they do not consider that the metric values could change in different driving conditions. A safety evaluation for an autonomous vehicle should not only consider real-world data, but should also take different driving conditions into account.
SUMMARY
[0004] A computer-implemented method of identifying improper operational behavior of a self-driving vehicle is provided. The method comprises converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node; obtaining driving data of the self-driving vehicle; identifying a particular driving episode of the self-driving vehicle in the driving data and classifying the particular driving episode as corresponding to a particular leaf node of the decision tree; and determining a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
[0005] In some embodiments, the method further comprises generating an alert for the particular driving episode based on the self-driving behavior metric. [0006] In some embodiments, the determining the self-driving behavior metric includes determining a behavioral realness metric for the particular driving episode using a probability that the particular driving episode represents real-world driving behavior; and the generating the alert includes generating the alert using the behavioral realness metric for the particular driving episode.
[0007] In some embodiments, the converting the real-world driving data into the decision tree includes identifying real-world driving episodes in the real-world driving data; and grouping similar identified real-world driving episodes into a corresponding leaf node of the decision tree according to a presence of specified factors in the real-world driving episodes.
[0008] In some embodiments, the identifying the real-world driving episodes in the real-world driving data comprises determining trajectories of vehicles in the real-world driving data; and identifying a determined trajectory as a driving episode.
[0009] In some embodiments, the determining the self-driving behavior metric for the particular driving episode includes determining an improper operation metric, including determining a speed of the self-driving vehicle in the particular driving episode; comparing the speed of the self-driving vehicle to computed probability distributions for average speed, minimum speed, and maximum speed for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the speed of the self-driving vehicle to at least one of the computed probability distributions.
[0010] In some embodiments, the determining the self-driving behavior metric includes determining an improper operation metric including determining, in the particular driving episode, a distance of the self-driving vehicle to objects; comparing the distance to probability distributions for average distance and minimum distance for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the distance to at least one of the probability distributions of the leaf node.
[0011] In some embodiments, the method further comprising storing the computed probability distributions in memory as density histograms.
[0012] In some embodiments, the computing the probability distributions further comprising calculating a probability distribution for one or more of average speed, minimum speed, maximum speed, average distance, and minimum distance for the identified driving episodes of the leaf node; determining a speed of the self-driving vehicle in the driving episode and a distance of the self-driving vehicle to objects in the driving episode of the self-driving vehicle; calculating at least one improper operation metric for the driving episode using at least one of the calculated probability distributions and one or both of the determined speed and distance of the self-driving vehicle; and generating the alert using the at least one improper operation metric.
[0013] A self-driving vehicle evaluation system for a self-driving vehicle is provided. The system comprising a non-transitory memory storing real-world driving data and instructions; and a processor in communication with the memory. The processor is configured, upon execution of the instructions, to perform the following steps: convert real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; compute episode evaluation data for the real-world driving episodes of the multiple leaf nodes and compute probability distributions for the episode evaluation data for each leaf node; obtain driving data of the self-driving vehicle; identify a particular driving episode of the self-driving vehicle in the driving data and classify the particular driving episode as corresponding to a particular leaf node of the decision tree; and determine a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
[0014] In some embodiments, the self-driving vehicle evaluation system includes a user interface operatively coupled to the processing circuitry, and the processor further executes the instructions to receive selections of objects of interest via the user interface; and group the identified real-world driving episodes into the multiple leaf nodes of the decision tree according to a presence of the objects of interest in the real-world driving episodes.
[0015] In some embodiments, the processor further executes the instructions to receive the episode parameter for identifying a real-world driving episode in the real-world driving data via the user interface; and identify one or more real-world driving episodes in the real-world driving data using the episode parameter.
[0016] In some embodiments, the processor further executes the instructions to present an alert according to the self-driving behavior metric, the alert being presented using the user interface.
[0017] In some embodiments, the processor further executes the instructions to determine a behavioral realness metric for the particular driving episode using a probability that the particular driving episode represents real-world driving behavior; and present the behavioral realness metric for the driving episode using the user interface.
[0018] In some embodiments, the determining the self-driving behavior metric for the particular driving episode includes determining an improper operation metric, including determining a speed of the self-driving vehicle in the particular driving episode; comparing the speed of the self-driving vehicle to computed probability distributions for average speed, minimum speed, and maximum speed for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the speed of the self-driving vehicle to at least one of the computed probability distributions.
[0019] In some embodiments, the determining the self-driving behavior metric includes determining an improper operation metric including determining, in the particular driving episode, a distance of the self-driving vehicle to objects; comparing the distance to probability distributions for average distance and minimum distance for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the distance to at least one of the probability distributions of the leaf node. [0020] In some embodiments, the processor further executes the instructions to store the probability distributions of the multiple leaf nodes in memory as density histograms.
[0021] In some embodiments, the computing the probability distributions further comprises calculating a probability distribution for one or more of average speed, minimum speed, maximum speed, average distance, and minimum distance for the identified driving episodes of the leaf node; determining a speed of the self-driving vehicle in the driving episode and a distance of the self-driving vehicle to objects in the driving episode of the self-driving vehicle; calculating at least one improper operation metric for the driving episode using at least one of the calculated probability distributions and one or both of the determined speed and distance of the self-driving vehicle; and generating the alert using the at least one improper operation metric.
[0022] A non-transitory computer-readable storage medium is provided. The computer-readable storage medium stores computer instructions that, when executed by one or more processors, cause the one or more processors to perform the steps of: converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node; obtaining driving data of the self-driving vehicle; identifying a particular driving episode of the self-driving vehicle in the driving data and classifying the particular driving episode as corresponding to a particular leaf node of the decision tree; and determining a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
[0023] In some embodiments, the computer-readable storage medium further storing instructions that when executed by the one or more processors perform the step of generating an alert for the particular driving episode based on the self-driving behavior metric.
[0024] In some embodiments, the determining the self-driving behavior metric includes determining a behavioral realness metric for the particular driving episode using a probability that the particular driving episode represents real-world driving behavior; and the generating the alert includes generating the alert using the behavioral realness metric for the particular driving episode. [0025] In some embodiments, the converting the real-world driving data into the decision tree includes identifying real-world driving episodes in the real-world driving data; and grouping similar identified real-world driving episodes into a corresponding leaf node of the decision tree according to a presence of specified factors in the real-world driving episodes.
[0026] In some embodiments, the identifying the real-world driving episodes in the real-world driving data comprising determining trajectories of vehicles in the real-world driving data; and identifying a determined trajectory as a driving episode.
[0027] In some embodiments, the determining the self-driving behavior metric for the particular driving episode includes determining an improper operation metric, including determining a speed of the self-driving vehicle in the particular driving episode; comparing the speed of the self-driving vehicle to computed probability distributions for average speed, minimum speed, and maximum speed for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the speed of the self-driving vehicle to at least one of the computed probability distributions.
[0028] In some embodiments, the determining the self-driving behavior metric includes determining an improper operation metric including determining, in the particular driving episode, a distance of the self-driving vehicle to objects; comparing the distance to probability distributions for average distance and minimum distance for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the distance to at least one of the probability distributions of the leaf node.
[0029] In some embodiments, the computer-readable storage medium further storing instructions that when executed by the one or more processors perform the step of storing the computed probability distributions in memory as density histograms. [0030] In some embodiments, the computer-readable storage medium further storing instructions that when executed by the one or more processors perform the steps of calculating a probability distribution for one or more of average speed, minimum speed, maximum speed, average distance, and minimum distance for the identified driving episodes of the leaf node; determining a speed of the self-driving vehicle in the driving episode and a distance of the self-driving vehicle to objects in the driving episode of the self-driving vehicle; calculating at least one improper operation metric for the driving episode using at least one of the calculated probability distributions and one or both of the determined speed and distance of the self-driving vehicle; and generating the alert using the at least one improper operation metric.
[0031] A self-driving vehicle evaluation system for a self-driving vehicle is provided. The system comprises a decision tree conversion module for converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; an episode evaluation module for computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node; a driving data module for obtaining driving data of the self-driving vehicle; a classification module for identifying a particular driving episode of the self-driving vehicle in the driving data and classifying the particular driving episode as corresponding to a particular leaf node of the decision tree; and a behavior module for determining a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
BRIEF DESCRIPTION OF THE DRAWINGS [0032] Some figures illustrating example embodiments are included with the text in the detailed description.
[0033] FIG. 1 is an illustration of a self-driving vehicle scenario including self-driving vehicles.
[0034] FIG. 2 is a flow diagram of a computer-implemented method of identifying improper operational behavior of a self-driving vehicle according to an embodiment. [0035] FIG. 3 illustrates a decision tree constructed from real-world driving data according to an example embodiment.
[0036] FIGS. 4A-4E show example probability distributions for driving episodes.
[0037] FIG. 5 is an illustration of using a density histogram to calculate self-driving behavior metrics.
[0038] FIG. 6 is a block diagram of a system for identifying improper operational behavior of a self-driving vehicle according to an embodiment.
DETAILED DESCRIPTION
[0039] In the following description, reference is made to the accompanying drawings that form a part hereof, and in which are shown, by way of illustration, specific embodiments that may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized, and that structural, logical, and electrical changes may be made without departing from the scope of the present invention. The following description of example embodiments is, therefore, not to be taken in a limited sense, and the scope of the present invention is defined by the appended claims.
[0040] The functions or algorithms described herein may be implemented in software in one embodiment. The software may consist of computer executable instructions stored on computer readable media or a computer readable storage device such as one or more non-transitory memories or other type of hardware-based storage devices, either local or networked. Further, such functions correspond to modules, which may be software, hardware, firmware, or any combination thereof. Multiple functions may be performed in one or more modules as desired, and the embodiments described are merely examples. The software may be executed on a digital signal processor, application specific integrated circuit (ASIC), microprocessor, or other type of processor operating on a computer system, such as a personal computer, server, or other computer system, turning such computer system into a specifically programmed machine.
[0041] The functionality can be configured to perform an operation using, for instance, software, hardware, firmware, or the like. For example, the phrase “configured to” can refer to a logic circuit structure of a hardware element that is to implement the associated functionality. The phrase “configured to” can also refer to a logic circuit structure of a hardware element that is to implement the coding design of associated functionality of firmware or software. The term “module” refers to a structural element that can be implemented using any suitable hardware (e.g., a processor, among others), software (e.g., an application, among others), firmware, or any combination of hardware, software, and firmware. The term “logic” encompasses any functionality for performing a task. For instance, each operation illustrated in the flowcharts corresponds to logic for performing that operation. An operation can be performed using software, hardware, firmware, or the like. The terms “component,” “system,” and the like may refer to computer-related entities, hardware, and software in execution, firmware, or combination thereof. A component may be a process running on a processor, an object, an execution, a program, a function, a subroutine, a computer, or a combination of software and hardware. The terms “processor” and “processing circuitry” may refer to a hardware component, such as a processing unit of a computer system.
[0042] Furthermore, the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computing device to implement the disclosed subject matter. The term “article of manufacture,” as used herein, is intended to encompass a computer program accessible from any computer-readable storage device or media. Computer-readable storage media can include, but are not limited to, magnetic storage devices, e.g., hard disk, floppy disk, magnetic strips, optical disk, compact disk (CD), digital versatile disk (DVD), smart cards, flash memory devices, among others. In contrast, computer-readable media (i.e., not limited to storage media) may additionally include communication media such as transmission media for wireless signals and the like.
[0043] Due to the increase in self-driving vehicles (also called autonomous vehicles), testing is needed to ensure their safety, as the public must feel that self-driving vehicles will operate in a safe manner in all possible scenarios. This disclosure provides routines and data for evaluating an operational behavior of a self-driving vehicle. The evaluation can be used to assess a potential performance and safety of a self-driving vehicle. Further, the evaluation can provide identification of improper operational behavior by the self-driving vehicle. The identification comprises identification of improper operational behavior performed by the self-driving vehicle, not by a human occupant or human driver. It should be understood that the identification includes identification of dangerous or potentially dangerous behavior by the self-driving vehicle operating in a self-driving mode. The danger includes danger to the vehicle and occupants. The danger further includes danger to pedestrians and/or danger to property.
[0044] Gathered real-world driving data is assembled into a collection of data that can be used to test and analyze a self-driving vehicle in multiple ways, including comparing a performance of the self-driving vehicle without taking the self-driving vehicle onto public roadways. The real-world driving data provides many possible driving scenarios that can be used to thoroughly test the self-driving vehicle, but without endangering humans or property.
[0045] The identification is performed on real-world driving data to ensure that the self-driving vehicle does not exhibit any improper or dangerous self-driving actions. The identification can comprise identification of improper operational behavior in recorded real-world driving data (i.e., previously-captured operational behavior data) of the self-driving vehicle. Alternatively, or in addition, the identification can comprise identification of improper operational behavior in real time by analyzing generated driving data as the self-driving vehicle is moving and/or operating. The identification of improper operational behavior by the self-driving vehicle can subsequently be used to improve self-driving vehicle performance.
[0046] As noted previously herein, testing the safety of operation of self-driving vehicles is important. It is desirable for the safety evaluation to consider real-world data and take different driving scenarios into account. For example, a velocity of 5 meters per second (5 m/s) could be considered as not dangerous if the vehicle is operating on a residential street, but the velocity could pose danger in a scenario where pedestrians are present. Alternatively, it could comprise dangerous self-driving behavior when the self-driving vehicle is on a freeway or other high-speed vehicle roadway. A universal evaluation method is desirable to evaluate the performance of self-driving vehicles. [0047] The self-driving vehicle evaluation can be performed as part of a design process or design evaluation process. The self-driving vehicle evaluation can be performed to ensure all possible self-driving scenarios have been considered and handled. The self-driving vehicle evaluation can be performed to automate testing. The self-driving vehicle evaluation can be performed to provide consistent testing of self-driving vehicles over time. The self-driving vehicle evaluation can be performed to ensure compliance with relevant self-driving vehicle regulations. The self-driving vehicle evaluation can be performed to provide self-driving performance testing and tracking that is consistent for different self-driving vehicles. The self-driving vehicle evaluation can be performed to provide testing and tracking that can be expanded upon over time and as needed, and as self-driving vehicles become more accepted and more prevalent.
[0048] The basic concept comprises the use of gathered real-world self-driving vehicle information to evaluate self-driving vehicles. The real-world self-driving information can be obtained once, or can be repeatedly collected (or added to) over time, creating a larger and larger database of real-world self-driving information. The gathered real-world self-driving information can be used for testing all manner of self-driving vehicles. The real-world self-driving vehicle information in some embodiments comprises data previously gathered from self-driving vehicles and is used to build a self-driving operational behavior database of self-driving scenarios. The self-driving operational behavior database is used for testing self-driving vehicle designs. As a result, the database accumulates self-driving vehicle data and examples, enabling more intensive and detailed testing of newer self-driving vehicle designs. The database therefore enables continuous improvement in the testing and evaluation of self-driving vehicles.
[0049] The real-world self-driving vehicle information can include recorded data for all self-driving regimes and scenarios. The real-world self-driving information can include data for different road and street types. The real-world self-driving information can include data for different driving speeds. The real-world self-driving information can include data for different vehicle operational laws and rules. The real-world self-driving information can include data for different traffic types and levels. The real-world self-driving information can include data for different types of passengers/cargo (i.e., personal transport versus commercial transport, emergency vehicles, et cetera). The real-world self-driving information can include data for different types of weather and/or different driving conditions. The real-world self-driving information can include data for various obstacles or road hazards. The real-world self-driving information can include data for the presence of and/or anticipated and unanticipated actions by pedestrians. The real-world self-driving information can include data for the presence of extreme weather conditions.
The real-world self-driving information can include data for wildlife and/or livestock on or near the roadway. The real-world self-driving information can include data for interacting with and avoiding various other types of drivers on the road. The real-world self-driving information can include data for operation in the presence of emergency vehicles. The real-world self-driving information can include data for operation in the presence of road problems, road repair, and/or construction. The real-world self-driving information can include data for operation in the proximity of wrecks, roadway blockages, or other unforeseen or emergency situations. The real-world self-driving information can include data for various other types of drivers on the road. Other self-driving scenarios are contemplated and are within the scope of the discussion and claims.
[0050] The real-world self-driving information comprises a knowledge repository that will grow over time. The real-world self-driving information will ensure that testing of self-driving vehicles is thorough. The real-world self-driving information will ensure the testing of self-driving vehicles is consistent. [0051] The real-world self-driving information can be used to identify improper self-driving operation. The real-world self-driving information can be used to identify potentially dangerous self-driving operation. The real-world self-driving information can be used to identify potentially dangerous self-driving operation in a variety of situations and environments, such as detecting potentially dangerous operation that is acceptable in good weather conditions, but might lead to loss of vehicular driving control under adverse weather such as rain, snow, or ice, for example.
[0052] FIG. 1 is an illustration of a self-driving vehicle scenario 100 including self-driving vehicles 110 and 120. The self-driving vehicles 110 and 120 can generate real-world driving data. The self-driving vehicles 110 and 120 can contribute self-driving data to an aggregated, accumulated real-world driving data set. Self-driving vehicle 110 includes an onboard computer 112 and sensors 115 in communication with the onboard computer 112. The onboard computer 112 receives signals and/or measurements from the sensors 115, processes this data, and uses the data as part of operating the self-driving car 110. Likewise, self-driving car 120 includes an onboard computer 122 and sensors 125, wherein the onboard computer 122 uses data from the sensors 125 as part of operating the self-driving car 120. Further, the onboard computers 112 and 122, along with the sensors 115 and 125, generate real-world driving data. Each vehicle 110 and 120 includes sensors 115 and 125 such as position sensors, time-of-flight sensors, accelerometers, radar, lidar (light detection and ranging or 3D laser scanning), ultrasonic and/or sonar detectors, one or more video cameras, inertial navigation devices, GPS devices, etc., to collect real-world driving data. Sensors such as radar, lidar, and ultrasonic devices can be used for terrain mapping, object detection, and object distance measuring, for example. [0053] The driving data collected by the self-driving vehicles 110 and
120 can be at least partially processed by the onboard computers 112 and 122. Alternatively, the driving data can be transmitted to a remote facility or facilities for processing. The driving data generated by a self-driving vehicle is used to determine operational behavior metrics to evaluate the performance of the self-driving vehicle. The operational behavior metrics can be used to determine a potential danger in the operation of a self-driving vehicle.
[0054] Two self-driving vehicles 110 and 120 are shown. The self-driving vehicles 110 and 120 include onboard computers 112 and 122. The self-driving vehicles 110 and 120 further include sensors 115 and 125, where the onboard computers 112 and 122 are in communication with the sensors 115 and 125 and receive sensor data. The onboard computers 112 and 122 process the sensor data during operation of the self-driving vehicles 110 and 120. Further, the onboard computers 112 and 122 can use the sensor data in conjunction with previously-obtained real-world driving data to assess the operational behavior of the self-driving vehicles 110 and 120.
[0055] FIG. 2 is a flow diagram of a computer-implemented method 200 of identifying improper operational behavior of a self-driving vehicle according to an embodiment. The method 200 begins with processing naturalistic driving data (NDD) or real-world driving data. The driving data set includes information on the behavior of multiple real-world vehicles (e.g., speed, direction, etc.) as well as information on the surroundings of the vehicles for different scenarios (e.g., objects near the vehicles, whether the scenario is for a highway, weather, road conditions, etc.). The multiple real-world vehicles can include self-driving vehicles and/or human-operated vehicles. The data can be collected using vehicles such as those shown in FIG. 1.
[0056] At block 205, the data in the real-world driving data set is converted into a decision tree having multiple leaf nodes. The real-world driving data may be stored in memory of a computing device (e.g., a computer, laptop computer, tablet computer, smartphone, etc.), or the real-world driving data may be downloaded from the cloud (e.g., from a cloud server). The converting of the data is performed by processing circuitry (e.g., one or more hardware processors) of the computing device. The real-world driving data can be obtained in the form of raw or previously-processed data. The data can be obtained in the form of organized data, including data organized in a database, neural network model, or other suitable form or suitable data structure.
[0057] To generate the decision tree, the real-world driving data is processed to identify driving episodes of each vehicle in the real-world driving data. A driving episode of a vehicle may be defined using a trajectory of the vehicle (e.g., a predetermined number of meters travelled by the vehicle) or by a duration of driving time (e.g., a predetermined number of seconds of driving time). As a consequence, a driving episode may be comprised of one or more trajectories. The trajectory can be defined as an operation or operational phase of the self-driving vehicle. The trajectory can be defined as operation: at a substantially consistent speed or speed range, in a substantially consistent direction, or from a source point to a destination point. It should be understood that a trajectory can be defined in other ways, and all such trajectories are within the scope of this disclosure. However, it should be understood that the above are merely examples and a driving episode may be identified and/or categorized in other ways. There may be thousands of driving episodes identified in the driving data. Different levels or tiers of decisions are applied to the identified driving episodes to group similar driving episodes together into the multiple leaf nodes of the decision tree.
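The following Python sketch is offered only as a non-limiting illustration of how such an episode parameter might be applied to segment a vehicle's trajectory into driving episodes; the record layout and names (TrajectoryPoint, split_into_episodes) and the 200-meter and 30-second defaults are assumptions, not part of the disclosure.

```python
# Illustrative sketch only: segmenting one vehicle's trajectory into driving
# episodes using an episode parameter expressed as meters travelled or
# seconds of driving time. Record layout and thresholds are hypothetical.
from dataclasses import dataclass
from math import hypot
from typing import List

@dataclass
class TrajectoryPoint:
    t: float      # timestamp, seconds
    x: float      # position, meters
    y: float
    speed: float  # meters per second

def split_into_episodes(points: List[TrajectoryPoint],
                        max_meters: float = 200.0,
                        max_seconds: float = 30.0) -> List[List[TrajectoryPoint]]:
    """Cut a trajectory into episodes whenever the accumulated distance
    travelled or the elapsed time exceeds the episode parameter."""
    episodes, current, dist, t0 = [], [], 0.0, None
    for prev, cur in zip(points, points[1:]):
        if t0 is None:          # start a new episode at this point
            t0 = prev.t
            current.append(prev)
        dist += hypot(cur.x - prev.x, cur.y - prev.y)
        current.append(cur)
        if dist >= max_meters or (cur.t - t0) >= max_seconds:
            episodes.append(current)
            current, dist, t0 = [], 0.0, None
    if current:
        episodes.append(current)
    return episodes
```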
[0058] FIG. 3 illustrates a decision tree 300 constructed from real-world driving data according to an example embodiment. The decision tree 300 in the example shown includes a root node 305, a first node level 310, a second node level 315, and a third node level 320. The third node level 320 in this example comprises a leaf node level comprising multiple leaf nodes. For simplicity, the example only includes three levels of decisions, but an actual implementation may have many levels of decisions. The decisions in the example of FIG. 3 are binary decisions. The first decision level 305 divides the driving episodes into two groups: those episodes in which the operated vehicle only interacts with other vehicles (V), and episodes in which the vehicle interacts with non-vehicle objects (e.g., pedestrians, cyclists, etc.) as well as vehicles.
[0059] The second decision level 310 further divides the two groups output from decision level 305 into those driving episodes with vehicles encountering speed bumps (SB) and those driving episodes that do not have speed bumps. The output from the second decision level is four groups: i) those driving episodes that include only vehicles and include speed bumps, ii) those driving episodes that include only vehicles and do not include speed bumps, iii) those driving episodes that include non-vehicle objects, vehicles, and speed bumps, and iv) those driving episodes that include non-vehicle objects, vehicles, and do not include speed bumps.
[0060] The third decision level 315 divides the four groups that are outputted from decision level 310 into those driving episodes with vehicles encountering a crosswalk (CW) and those episodes that do not have crosswalks. Eight leaf nodes 320 are generated and the individual driving episodes identified in the driving data are classified in the appropriate leaf node.
[0061] The eight leaf nodes include: i) driving episodes that include only vehicles and include speed bumps and crosswalks (V_SB_CW), ii) driving episodes that include only vehicles, include speed bumps, and do not include crosswalks (V_SB_nonCW), iii) driving episodes that include only vehicles, do not include speed bumps, and include crosswalks (V_nonSB_CW), iv) driving episodes that include only vehicles and do not include speed bumps or crosswalks (V_nonSB_nonCW), v) driving episodes that include non-vehicle objects and include speed bumps and crosswalks (nonV_SB_CW), vi) driving episodes that include non-vehicle objects, include speed bumps, and do not include crosswalks (nonV_SB_nonCW), vii) driving episodes that include non-vehicle objects, do not include speed bumps, and include crosswalks (nonV_nonSB_CW), and viii) driving episodes that include non-vehicle objects and do not include speed bumps or crosswalks (nonV_nonSB_nonCW).
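As a non-limiting illustration, the following Python sketch shows one way the three binary decisions of FIG. 3 could map an episode to one of the eight leaf-node labels above; the function name and the boolean episode attributes are assumptions, not part of the disclosure.

```python
# Illustrative sketch only: classifying an episode into one of the eight
# example leaf nodes of FIG. 3 using three binary decisions.
def classify_episode(has_non_vehicle_objects: bool,
                     has_speed_bump: bool,
                     has_crosswalk: bool) -> str:
    """Return a leaf-node label such as 'V_SB_CW' or 'nonV_nonSB_nonCW'."""
    parts = [
        "nonV" if has_non_vehicle_objects else "V",
        "SB" if has_speed_bump else "nonSB",
        "CW" if has_crosswalk else "nonCW",
    ]
    return "_".join(parts)

# Example: an episode with only vehicles, a speed bump, and no crosswalk.
assert classify_episode(False, True, False) == "V_SB_nonCW"
```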
[0062] Returning to FIG. 2, at block 210, evaluation data is computed for each of the driving episodes in each leaf node of the multiple leaf nodes. Some examples of the evaluation data include: average speed (v_avg) of the monitored vehicle in each driving episode in the leaf node, minimum speed (v_min) of the vehicle in each driving episode in the leaf node, maximum speed (v_max) of the vehicle in each driving episode in the leaf node, average distance between objects (vehicles and non-vehicle objects, if any) pairwise (l_avg) in each driving episode in the leaf node, and minimum distance between objects pairwise (l_min) in each driving episode in the leaf node. Distributions for the evaluation data are then calculated and stored for each leaf node.
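As a non-limiting illustration (assuming NumPy is available), the following Python sketch shows one way the per-episode evaluation data and the per-leaf-node probability distributions could be computed as density histograms; the dictionary keys mirror the notation above, and the function names are assumptions, not part of the disclosure.

```python
# Illustrative sketch only: per-episode evaluation data and per-leaf-node
# density histograms for the quantities named above.
import numpy as np

def episode_evaluation(speeds, pairwise_distances):
    """speeds: monitored-vehicle speeds over the episode (m/s);
    pairwise_distances: distances between object pairs over the episode (m)."""
    return {
        "v_avg": float(np.mean(speeds)),
        "v_min": float(np.min(speeds)),
        "v_max": float(np.max(speeds)),
        "l_avg": float(np.mean(pairwise_distances)),
        "l_min": float(np.min(pairwise_distances)),
    }

def leaf_node_distributions(evaluations, bins=20):
    """Build a density histogram for each evaluation quantity of a leaf node.
    `evaluations` is a list of dicts produced by episode_evaluation()."""
    dists = {}
    for key in ("v_avg", "v_min", "v_max", "l_avg", "l_min"):
        values = np.array([e[key] for e in evaluations])
        density, edges = np.histogram(values, bins=bins, density=True)
        dists[key] = (density, edges)  # bar areas sum to one across the bins
    return dists
```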
[0063] FIGS. 4A-4E show example probability distributions for self-driving episodes. The figures show example distributions of the computed average speed (in m/s), minimum speed, maximum speed, average distance (in meters), and minimum distance, respectively, for driving episodes. The driving episodes in the example are based on a split between scenarios with only vehicles (V) and scenarios with non-vehicle objects present (nonV). The example distributions are density histograms. In a density histogram, the areas of all the vertical columns add up to one for all the driving episodes in the distribution. For example, in FIG. 4A, the probability that the average speed of a vehicle is 5 m/s is 0.12.
[0064] The decision tree, leaf nodes, and evaluation data distributions provide a database against which the performance of a self-driving vehicle can be evaluated. Returning to FIG. 2, at block 215, driving data of a self-driving vehicle to be evaluated is obtained. Driving data is collected by running the self-driving vehicle and collecting sensor data generated by the sensors (the data can further include other data, including data from sources other than the self-driving vehicle). The self-driving vehicle includes sensors and an onboard computer to collect the driving data, such as for the vehicles shown in FIG. 1. The driving data can be collected and stored in memory of the evaluating computer system, or the driving data for the self-driving vehicle may be collected and streamed to the cloud and stored and later downloaded by the evaluating computer system. [0065] At block 220, driving episodes for the self-driving vehicle being evaluated are identified in the new driving data and classified into leaf nodes using the decision tree. For the example decision tree 300 in FIG. 3, the driving episodes identified for the self-driving vehicle are classified into the eight leaf nodes. The evaluation data (e.g., average speed, minimum speed, maximum speed, average distance, and minimum distance) is computed for the driving episodes.
[0066] At block 225, one or more operational behavior metrics can be calculated for the driving episodes of the self-driving vehicle being evaluated. Different metrics can be calculated by comparing the evaluation data for the self-driving vehicle against the distributions for the real-world self-driving data. For example, the average speed for a driving episode scenario for the self-driving vehicle can be compared to the average speed distribution for real-world self-driving data for the corresponding scenario leaf node that the driving episode was classified into.
[0067] An example of a metric to evaluate the self-driving vehicle is a self-driving behavior metric. The self-driving behavior metric is related to the rareness of an event, and one or more values of the criteria of the self-driving behavior metric are calculated. For each criterion in a particular self-driving behavior metric, the probability density of this criterion indicates the rareness of the calculated value. The rarer the calculated value, the more unusual (and potentially improper) the behavior. Also, a smaller criterion value (e.g., a smaller minimum distance) may indicate a more improper self-driving behavior. Thus, the self-driving behavior metric for each criterion can be defined as the inverse of the product of the probability density and the actual criterion value.
[0068] Another metric useful to evaluate the self-driving vehicle is a behavioral realness metric. The behavioral realness metric can be defined by the probability density function. The behavioral realness metric in some embodiments assigns a weight, value, or probability to a self-driving behavior metric, with the behavioral realness metric indicating the likelihood of the self-driving behavior metric occurring in self-driving operation of the self-driving vehicle. Therefore, a higher value of the behavioral realness metric indicates that the performance of the self-driving vehicle in the scenario is more similar to the real-world vehicle performance, and thus more realistic.
[0069] FIG. 5 is an illustration of using a density histogram to calculate self-driving behavior metrics. The self-driving behavior metrics are subsequently used for evaluating operation of a self-driving vehicle. The example involves a self-driving car episode classified into the leaf node corresponding to scenarios with only vehicles present, and speed bumps and crosswalks present (V_SB_CW). The minimum distance of the self-driving vehicle is weighed against the minimum distances of the real-world data in the V_SB_CW leaf node using the distribution 505 for the minimum distance.
[0070] Assume the driving episode of the self-driving vehicle to be evaluated is classified into the V_SB_CW leaf node and the minimum distance of the self-driving vehicle in this episode is 10 meters (l_min = 10 m). The self-driving behavior metric for l_min is determined as

Self_driving_behavior_metric_lmin = 1 / (P(L_MIN = 10) × l_min)    (1)
and the behavioral realness metric for l_min is determined as

Behavior_realness_metric_lmin = P(L_MIN = 10)    (2)

In the minimum distance distribution in FIG. 5, the behavioral realness metric is the shaded area 510.
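As a non-limiting illustration (assuming NumPy and a density histogram stored as per-bin densities with bin edges, as in the earlier sketch), the following Python sketch shows one way equations (1) and (2) could be evaluated for a criterion value such as l_min = 10 m; the helper names and the epsilon guard are assumptions, not part of the disclosure.

```python
# Illustrative sketch only: evaluating equations (1) and (2) against a
# leaf-node density histogram (density values plus bin edges).
import numpy as np

def bin_probability(value, density, edges):
    """Probability mass of the bin containing `value` (bar area =
    density * bin width); returns 0.0 if the value falls outside the bins."""
    idx = int(np.searchsorted(edges, value, side="right")) - 1
    if idx < 0 or idx >= len(density):
        return 0.0
    return float(density[idx] * (edges[idx + 1] - edges[idx]))

def behavioral_realness_metric(value, density, edges):
    # Equation (2): e.g., P(L_MIN = 10) read from the minimum-distance histogram.
    return bin_probability(value, density, edges)

def self_driving_behavior_metric(value, density, edges, eps=1e-9):
    # Equation (1): inverse of the probability (read from the histogram as a
    # bar area) multiplied by the criterion value itself, so rarer and smaller
    # values (e.g., a small minimum distance) produce a larger metric.
    p = bin_probability(value, density, edges)
    return 1.0 / max(p * value, eps)  # eps avoids division by zero
```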
[0071] Based on the metrics, the evaluation system may flag a certain scenario result for the user. Returning to FIG. 2, at block 230, the system may generate an alert for the driving episode based on the determined self-driving behavior metric indicating a rare event (e.g., for when the minimum distance in the scenario is small). In some embodiments, this includes presenting the alert to a user via a user interface. [0072] The evaluation system implementing the method 200 of FIG. 2 may be a simulation system used to weigh the performance of a self-driving vehicle. A user of the simulation system may want to change parameters of the evaluation. [0073] FIG. 6 is a block diagram of a system 600 for identifying improper operational behavior of a self-driving vehicle according to an embodiment. The computer system 600 executes software or instructions that configure the system 600 for performing methods and algorithms according to example embodiments. All components need not be used in various embodiments.
[0074] One example is a computing device that may include processing circuitry (e.g., a processing unit 602), memory 603, removable storage 610, and non-removable storage 612. Although the example computing device is illustrated and described as computer 600, the computing device may be in different forms in different embodiments. For example, the computing device may be a server, a router, or a virtual router.
[0075] Although the various data storage elements are illustrated as part of the computer 600, the storage may also or alternatively include cloud-based storage accessible via a network, such as the Internet, or server-based storage. Note also that an SSD may include a processor on which the parser may be run, allowing transfer of parsed, filtered data through I/O channels between the SSD and main memory.
[0076] Memory 603 may include volatile memory 614 and non-volatile memory 608. Computer 600 may include - or have access to a computing environment that includes - a variety of computer-readable media, such as volatile memory 614 and non-volatile memory 608, removable storage 610 and non-removable storage 612. Computer storage includes random-access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM) or electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD ROM), Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium capable of storing computer-readable instructions.
[0077] Computer 600 may include or have access to a computing environment that includes input interface 606 (or user interface), output interface 604, and a communication interface 616. Output interface 604 may include a display device, such as a touchscreen, that also may serve as an input device. The input interface 606 may include one or more of a touchscreen, touchpad, mouse, keyboard, camera, one or more device-specific buttons, one or more sensors integrated within or coupled via wired or wireless data connections to the computer 600, and other input devices. The computer 600 may operate in a networked environment using a communication connection to connect to one or more remote computers, such as database servers. The remote computer may include a personal computer (PC), server, router, network PC, a peer device or other common data flow network switch, or the like. The communication connection may include a Local Area Network (LAN), a Wide Area Network (WAN), cellular, Wi-Fi, Bluetooth, or other networks. According to one embodiment, the various components of computer 600 are connected with a system bus 620.
[0078] Computer-readable instructions stored on a computer-readable medium are executable by the processing unit 602 of the computer 600, such as a program 618. The program 618 in some embodiments comprises software to implement one or more methods described herein. A hard drive, CD-ROM, and RAM are some examples of articles including a non-transitory computer-readable medium, such as a storage device. The terms computer-readable medium and storage device do not include carrier waves to the extent carrier waves are deemed too transitory. Storage can also include networked storage, such as a storage area network (SAN). Computer program 618 may be used to cause processing unit 602 to perform one or more methods or algorithms described herein.
[0079] The computer 600 can be configured to comprise all or a portion of a simulation system to implement the method 200 of FIG. 2. A user (e.g., a manufacturer of self-driving vehicles) may customize the leaf nodes of the decision tree by changing the decisions in the decision tree. The customization may increase or decrease the number of leaf nodes to classify the driving episodes (of both the real-world driving episodes and the self-driving vehicle being evaluated) using a finer classification or coarser classification of the driving episode. For instance, the user may change the number or types of objects of interest in the driving episodes to change the classification in the leaf nodes. The user may also specify other non-object factors to classify episodes, such as weather, road conditions, etc. The processing circuitry of the computer 600 adds or removes decisions of the decision tree based on the changes. In some examples, the customized decision-tree data is sent to the onboard computer of the vehicle (e.g., using the cloud). Driving data for the self-driving vehicle is collected by the self-driving vehicle and analyzed against the decision tree using the onboard computer. In some examples, the computer 600 is implemented using the onboard computer of the self-driving vehicle.
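By way of a non-limiting illustration, one possible way to represent a user-configurable set of binary decisions, so that objects of interest or non-object factors can be added or removed to refine or coarsen the leaf nodes, is sketched below in Python; the factor registry and the episode dictionary keys are assumptions, not part of the disclosure.

```python
# Illustrative sketch only: a user-editable list of binary decisions used to
# build leaf-node labels; adding or removing entries changes the tree depth.
FACTORS = [
    # (label if present, label if absent, predicate on an episode dict)
    ("nonV", "V",     lambda ep: ep.get("has_non_vehicle_objects", False)),
    ("SB",   "nonSB", lambda ep: ep.get("has_speed_bump", False)),
    ("CW",   "nonCW", lambda ep: ep.get("has_crosswalk", False)),
    # A user could append, e.g.: ("RAIN", "nonRAIN", lambda ep: ep.get("raining", False))
]

def classify_with_factors(episode: dict, factors=FACTORS) -> str:
    """Build the leaf-node label from whatever decisions are configured."""
    return "_".join(yes if pred(episode) else no for yes, no, pred in factors)

# Example: only vehicles, speed bump present, no crosswalk -> 'V_SB_nonCW'
print(classify_with_factors({"has_speed_bump": True}))
```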
[0080] In another example, the user may change how a driving episode is defined. The processing circuitry may receive one or more episode parameters (e.g., a threshold distance travelled for a trajectory to be called a driving episode, or a time duration for a trajectory to be called a driving episode) and change the way driving episodes are identified in the real-world data and the self-driving vehicle data using the changed episode parameters.
[0081] In a further example, the user may change the threshold value for a metric (e.g., the self-driving behavior metric) before an alert is generated and presented (e.g., displayed) to the user. These changes allow the user to emphasize different aspects of the performance of the self-driving vehicle in the evaluation.
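A non-limiting Python sketch of how such a user-adjustable alert threshold might be applied to the computed metrics follows; the threshold value and names are assumptions, not part of the disclosure.

```python
# Illustrative sketch only: flag episodes whose self-driving behavior metric
# exceeds a user-adjustable threshold and report them as alerts.
def generate_alerts(episode_metrics, threshold=100.0):
    """episode_metrics: iterable of (episode_id, behavior_metric) pairs."""
    return [
        f"ALERT: episode {eid} behavior metric {metric:.1f} exceeds {threshold}"
        for eid, metric in episode_metrics
        if metric > threshold
    ]
```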
[0082] In an example embodiment, the computer 600 includes a decision tree conversion module for converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes, an episode evaluation module for computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node, a driving data module for obtaining driving data of the self-driving vehicle, a classification module for identifying a particular driving episode of the self-driving vehicle in the driving data and classifying the particular driving episode as corresponding to a particular leaf node of the decision tree, and a behavior module for determining a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree. In some embodiments, the computer 600 may include other or additional modules for performing any one of or combination of steps described in the embodiments. Further, any of the additional or alternative embodiments or aspects of the method, as shown in any of the figures or recited in any of the claims, are also contemplated to include similar modules. [0083] The methods, systems, and devices described herein provide techniques for evaluating self-driving vehicles. Operation of the self-driving vehicles is evaluated using self-driving behavior metrics that change with the driving scenarios being evaluated. Driving data is applied to a decision tree structure to improve classification of driving conditions by the evaluating system. The self-driving behavior metrics determined for the self-driving vehicles are determined and interpreted in a probabilistic way rather than merely comparing a single-valued result to a fixed single value. These techniques result in evaluation of self-driving vehicles that is realistic and practical. [0084] Although a few embodiments have been described in detail above, other modifications are possible. For example, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. Other steps may be provided, or steps may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Other embodiments may be within the scope of the following claims.

Claims

CLAIMS What is claimed is:
1. A computer-implemented method of identifying improper operational behavior of a self-driving vehicle, the method comprising: converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node; obtaining driving data of the self-driving vehicle; identifying a particular driving episode of the self-driving vehicle in the driving data and classifying the particular driving episode as corresponding to a particular leaf node of the decision tree; and determining a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
2. The method of claim 1, the method further comprising generating an alert for the particular driving episode based on the self-driving behavior metric.
3. The method of any of claims 1-2, wherein the determining the self-driving behavior metric includes: determining a behavioral realness metric for the particular driving episode using a probability that the particular driving episode represents real-world driving behavior; and the generating the alert includes generating the alert using the behavioral realness metric for the particular driving episode.
4. The method of any of claims 1-3, wherein the converting the real-world driving data into the decision tree includes: identifying real-world driving episodes in the real-world driving data; and grouping similar identified real-world driving episodes into a corresponding leaf node of the decision tree according to a presence of specified factors in the real-world driving episodes.
5. The method of claim 4, wherein the identifying the real-world driving episodes in the real-world driving data comprising: determining trajectories of vehicles in the real-world driving data; and identifying a determined trajectory as a driving episode.
6. The method of any of claims 1-5, wherein the determining the self-driving behavior metric for the particular driving episode includes determining an improper operation metric, including: determining a speed of the self-driving vehicle in the particular driving episode; comparing the speed of the self-driving vehicle to computed probability distributions for average speed, minimum speed, and maximum speed for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the speed of the self-driving vehicle to at least one of the computed probability distributions.
7. The method of any of claims 1-6, wherein the determining the self-driving behavior metric includes determining an improper operation metric including: determining, in the particular driving episode, a distance of the self-driving vehicle to objects; comparing the distance to probability distributions for average distance and minimum distance for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the distance to at least one of the probability distributions of the leaf node.
8. The method of any of claims 1-7, further comprising storing the computed probability distributions in memory as density histograms.
9. The method of any of claims 1-8, wherein the computing the probability distributions further comprising: calculating a probability distribution for one or more of average speed, minimum speed, maximum speed, average distance, and minimum distance for the identified driving episodes of the leaf node; determining a speed of the self-driving vehicle in the driving episode and a distance of the self-driving vehicle to objects in the driving episode of the self-driving vehicle; calculating at least one improper operation metric for the driving episode using at least one of the calculated probability distributions and one or both of the determined speed and distance of the self-driving vehicle; and generating the alert using the at least one improper operation metric.
10. A self-driving vehicle evaluation system for a self-driving vehicle, the system comprising: a non-transitory memory storing real-world driving data and instructions; and a processor in communication with the memory, the processor configured, upon execution of the instructions, to perform the following steps: convert real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; compute episode evaluation data for the real-world driving episodes of the multiple leaf nodes and compute probability distributions for the episode evaluation data for each leaf node; obtain driving data of the self-driving vehicle; identify a particular driving episode of the self-driving vehicle in the driving data and classify the particular driving episode as corresponding to a particular leaf node of the decision tree; and determine a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
11. The system of claim 10, including: a user interface operatively coupled to the processing circuitry; wherein the converting further comprises the processor further executes the instructions to: receive selections of objects of interest via the user interface; and group the identified real-world driving episodes into the multiple leaf nodes of the decision tree according to a presence of the objects of interest in the real-world driving episodes.
12. The system of any of claims 10-11, wherein the processor further executes the instructions to: receive the episode parameter for identifying a real-world driving episode in the real-world driving data via the user interface; and identify one or more real-world driving episodes in the real-world driving data using the episode parameter.
13. The system of any of claims 10-12, wherein the processor further executes the instructions to present an alert according to the self-driving behavior metric, and the processor presenting the alert using the user interface.
14. The system of any of claims 10-13, wherein the processor further executes the instructions to: determine a behavioral realness metric for the particular driving episode using a probability that the particular driving episode represents real-world driving behavior; and present the behavioral realness metric for the driving episode using the user interface.
15. The system of any of claims 10-14, wherein the determining the self-driving behavior metric for the particular driving episode includes determining an improper operation metric, including: determining a speed of the self-driving vehicle in the particular driving episode; comparing the speed of the self-driving vehicle to computed probability distributions for average speed, minimum speed, and maximum speed for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the speed of the self-driving vehicle to at least one of the computed probability distributions.
16. The system of any of claims 10-15, wherein the determining the self-driving behavior metric includes determining an improper operation metric including: determining, in the particular driving episode, a distance of the self-driving vehicle to objects; comparing the distance to probability distributions for average distance and minimum distance for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the distance to at least one of the probability distributions of the leaf node.
17. The system of any of claims 10-16, wherein the processor further executes the instructions to store the probability distributions of the multiple leaf nodes in memory as density histograms.
18. The system of any of claims 10-17, wherein the computing the probability distributions further comprising: calculating a probability distribution for one or more of average speed, minimum speed, maximum speed, average distance, and minimum distance for the identified driving episodes of the leaf node; determining a speed of the self-driving vehicle in the driving episode and a distance of the self-driving vehicle to objects in the driving episode of the self-driving vehicle; calculating at least one improper operation metric for the driving episode using at least one of the calculated probability distributions and one or both of the determined speed and distance of the self-driving vehicle; and generating the alert using the at least one improper operation metric.
19. A non-transitory computer-readable storage medium storing computer instructions that when executed by one or more processors, cause the one or more processors to perform steps comprising: converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node; obtaining driving data of the self-driving vehicle; identifying a particular driving episode of the self-driving vehicle in the driving data and classifying the particular driving episode as corresponding to a particular leaf node of the decision tree; and determining a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
20. The computer-readable medium of claim 19, the computer-readable storage medium further storing instructions that when executed by the one or more processors perform the step of generating an alert for the particular driving episode based on the self-driving behavior metric.
21. The computer-readable medium of any of claims 19-20, wherein the determining the self-driving behavior metric includes: determining a behavioral realness metric for the particular driving episode using a probability that the particular driving episode represents real- world driving behavior; and the generating the alert includes generating the alert using the behavioral realness metric for the particular driving episode.
22. The computer-readable medium of any of claims 19-21, wherein the converting the real-world driving data into the decision tree includes: identifying real-world driving episodes in the real-world driving data; and grouping similar identified real-world driving episodes into a corresponding leaf node of the decision tree according to a presence of specified factors in the real-world driving episodes.
23. The computer-readable medium of claim 22, wherein the identifying the real-world driving episodes in the real-world driving data comprising: determining trajectories of vehicles in the real-world driving data; and identifying a determined trajectory as a driving episode.
24. The computer-readable medium of any of claims 19-23, wherein the determining the self-driving behavior metric for the particular driving episode includes determining an improper operation metric, including: determining a speed of the self-driving vehicle in the particular driving episode; comparing the speed of the self-driving vehicle to computed probability distributions for average speed, minimum speed, and maximum speed for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the speed of the self-driving vehicle to at least one of the computed probability distributions.
25. The computer-readable medium of any of claims 19-24, wherein the determining the self-driving behavior metric includes determining an improper operation metric including: determining, in the particular driving episode, a distance of the self-driving vehicle to objects; comparing the distance to probability distributions for average distance and minimum distance for the identified driving episodes of the leaf node; and calculating the improper operation metric for the particular driving episode using a comparison of the distance to at least one of the probability distributions of the leaf node.
26. The computer-readable medium of any of claims 19-25, further storing instructions that, when executed by the one or more processors, cause the one or more processors to perform the step of storing the computed probability distributions in memory as density histograms.
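Claim 26 stores the computed probability distributions as density histograms. A minimal sketch of that step using NumPy; the statistic names and bin count are assumptions:

```python
import numpy as np

def density_histograms(leaf_episodes, stats=("avg_speed", "max_speed", "min_distance"), bins=32):
    """Per-leaf density histograms of the episode evaluation data, ready to persist."""
    out = {}
    for name in stats:
        values = np.asarray([ep[name] for ep in leaf_episodes], dtype=float)
        density, edges = np.histogram(values, bins=bins, density=True)
        out[name] = (density, edges)          # density integrates to 1 over the bins
    return out
```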
27. The computer-readable storage medium of any of claims 19-26, including instructions that cause the one or more processors to perform operations including: calculating a probability distribution for one or more of average speed, minimum speed, maximum speed, average distance, and minimum distance for the identified driving episodes of the leaf node; determining a speed of the self-driving vehicle in the driving episode and a distance of the self-driving vehicle to objects in the driving episode of the self-driving vehicle; calculating at least one improper operation metric for the driving episode using at least one of the calculated probability distributions and one or both of the determined speed and distance of the self-driving vehicle; and generating the alert using the at least one improper operation metric.
28. A self-driving vehicle evaluation system for a self-driving vehicle, the system comprising: a decision tree conversion module for converting real-world driving data into a decision tree having multiple leaf nodes, the multiple leaf nodes including real-world driving episodes; an episode evaluation module for computing episode evaluation data for the real-world driving episodes of the multiple leaf nodes and computing probability distributions for the episode evaluation data for each leaf node; a driving data module for obtaining driving data of the self-driving vehicle; a classification module for identifying a particular driving episode of the self-driving vehicle in the driving data and classifying the particular driving episode as corresponding to a particular leaf node of the decision tree; and a behavior module for determining a self-driving behavior metric for the particular driving episode using the probability distributions in the particular leaf node of the decision tree.
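Claim 28 recites the same pipeline as a set of cooperating modules. Purely as an illustration of that decomposition (every interface and method name below is an assumption, not claimed structure), the modules could be wired together like this:

```python
# Illustrative wiring of the claimed modules; every interface name is assumed.
class SelfDrivingVehicleEvaluationSystem:
    def __init__(self, tree_converter, episode_evaluator, driving_data_source,
                 classifier, behavior_scorer):
        self.tree_converter = tree_converter            # decision tree conversion module
        self.episode_evaluator = episode_evaluator      # episode evaluation module
        self.driving_data_source = driving_data_source  # driving data module
        self.classifier = classifier                    # classification module
        self.behavior_scorer = behavior_scorer          # behavior module

    def evaluate(self, real_world_data):
        tree = self.tree_converter.convert(real_world_data)
        self.episode_evaluator.compute_distributions(tree)
        driving_data = self.driving_data_source.obtain()
        episode, leaf = self.classifier.classify(tree, driving_data)
        return self.behavior_scorer.score(episode, leaf)
```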
PCT/US2022/070037 (priority date 2022-01-05; filing date 2022-01-05): Self-driving vehicle evaluation using real-world data, published as WO2022226434A1 (en)

Priority Applications (1)

Application Number: PCT/US2022/070037
Priority Date: 2022-01-05
Filing Date: 2022-01-05
Title: Self-driving vehicle evaluation using real-world data

Publications (1)

Publication Number: WO2022226434A1 (en)

Family ID: 80218606

Country Status (1)

Country: WO
Document: WO2022226434A1 (en)

Legal Events

121 EP: The EPO has been informed by WIPO that EP was designated in this application
Ref document number: 22702841
Country of ref document: EP
Kind code of ref document: A1