WO2022133090A1 - Adaptive generation and assessment of autonomous vehicle critical scenarios

Adaptive generation and assessment of autonomous vehicle critical scenarios

Info

Publication number
WO2022133090A1
Authority
WO
WIPO (PCT)
Prior art keywords
autonomous driving
scenario
operations
parameters
testing
Prior art date
Application number
PCT/US2021/063814
Other languages
English (en)
Inventor
Qianying Zhu
Lidan ZHANG
Xiangbin WU
Xinxin Zhang
Fei Li
Zhigang Wang
Ping Guo
Original Assignee
Intel Corporation
Priority date
Filing date
Publication date
Application filed by Intel Corporation
Priority to CN202180077564.3A (published as CN116490858A)
Publication of WO2022133090A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W 60/001 Planning or execution of driving tasks
    • B60W 60/0015 Planning or execution of driving tasks specially adapted for safety
    • B60W 30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W 30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W 30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W 30/095 Predicting travel path or likelihood of collision

Definitions

  • Embodiments described herein generally relate to automated driving or driver assisted driving technology and more specifically to approaches for assessment and generation of autonomous driving scenarios in simulation and testing environments.
  • FIG. 1 illustrates an autonomous vehicle system, implementing autonomous driving software, according to an example.
  • FIG. 2 illustrates an example data flow for adaptive generation and testing of critical scenarios, according to an example.
  • FIG. 3 illustrates time-to-collision calculations for autonomous driving cases, used with adaptive generation and testing of critical scenarios, according to an example.
  • FIG. 4 illustrates a time-to-collision distribution, having a search space defined using a time-to-collision input, according to an example.
  • FIG. 5 illustrates a flowchart of a method for adaptive generation and use of critical scenarios, according to an example.
  • FIG. 6 illustrates an example of a moving vehicle scenario demonstrating an autonomous vehicle operation safety model, according to an example.
  • FIG. 7 illustrates an example workflow for operating a safety complexity assessment on an autonomous driving scenario, according to an example.
  • FIG. 8 illustrates vehicle positions in a test scenario map, according to an example.
  • FIGS. 9A and 9B illustrate test scenario maps providing an adjustment of section range, according to an example.
  • FIG. 10 illustrates a test scenario map providing normalization of a scenario, according to an example.
  • FIGS. 11A, 11B, and 11C illustrate an example of test scenario assessment in a straight road environment, according to an example.
  • FIGS. 12A and 12B illustrate an example of test scenario assessment in an intersection environment, according to an example.
  • FIG. 13 illustrates an example of a process flow for scenario complexity assessment, based on vehicle operation safety rules, according to an example.
  • FIG. 14 illustrates a flowchart of an example method for generating and evaluating a scenario used for testing autonomous driving operations, according to an example.
  • FIG. 15 illustrates a machine in the example form of a computer system, to perform any one of the methodologies discussed herein, according to an example.
  • the following includes mechanisms and techniques which generally relate to identifying and testing scenarios used for validation of AD operations performed by AD software and logic. Specifically, the following describes an adaptive scenario generation system and method, used in a robust testing process, which may iteratively generate more critical (and more dangerous) test cases than in a previous round of testing by adjusting specific parameters.
  • the following provides normalized and unified assessment methods for autonomous driving scenarios to improve the test efficiency and help find critical cases. Such methods can be used not only on simulation scenarios but also on real world field/road test scenarios. As a result, the following approaches may assist the application and usage of AV safety models and the resulting safety improvements provided by such models.
  • an adaptive scenario generation process is used to speed-up testing time required for AD testing by removing or avoiding use of redundant or low-complexity testing samples. Further, this scenario generation process can generate and explore samples which have critical scenarios, yet are not fully covered in a pre-collected scenario dataset.
  • the adaptive scenario generation process provides a closed loop process to progressively increase complexity of the test scenarios and adaptively generate samples and scenarios. The adjustment of various parameters of a scenario can be identified based on critical measurements useful for testing, such as time to collision (TTC), and the adjustment of these parameters can be divided into different risk levels.
  • Other examples are also disclosed for the use of specific search algorithms to modify states, and the identification of different conditions with statistical models.
  • this adaptive scenario generation approach can be easily used as a plugin or extension to any scenario-based AD testing framework, enabling even existing testing systems to provide more diverse and more challenging testing use cases.
  • Other use cases and applications of the present techniques to AD testing and AD performance will also be apparent.
  • AD and advanced driver assistance systems (ADAS) have both incorporated various safety features. One formalization of such safety features is a vehicle operation safety model (VOSM), such as the Responsibility-Sensitive Safety (RSS) model.
  • A VOSM defines several parameters of individual vehicles and uses these parameters to model, among other things, a set of distances which determine whether a vehicle is safe or not.
  • the safe distances address sufficient longitudinal distance and sufficient lateral distance. This is discussed below with reference to FIG. 6, which explains the relation of these distances in a safety model.
  • VOSMs provide a mathematical model for safety assurance during autonomous driving.
  • VOSMs specifically formulate a set of safety standards, such as the minimum distance between vehicles to avoid collisions. Multiple parameters are used to calculate the formulation, such as the response time ρ, the minimum braking deceleration a_min,brake, and the maximum acceleration a_max,accel of the vehicle. If all requirements are satisfied, the vehicle passes the VOSM standard and is believed to be safe; otherwise the vehicle is not safe.
  • Various forms of safety requirements are being developed to standardize such approaches. For example, the China ITS standard “Technical Requirement of Safety Assurance of AV Decision Making” (T/ITS 0116-2019) establishes an RSS-based safety standard for use in China.
  • testing standards are also being developed, such as the China ITS standard “Test Procedures and Evaluation Rules for Safety Assurance of AV Decision Making” (T/ITS 0116-2019). It is expected that a testing standard will require execution of three parts: simulation test, field test and road test.
  • the following therefore also provides a more detailed scenario complexity assessment method, based on VOSM rules for autonomous driving.
  • an assessment method is provided to evaluate the velocity changes along a virtual ego path and the time cost relative to a fastest route.
  • the route can be dynamically calculated by VOSM rules, combined with a section range which defines a drivable area.
  • the evaluation will be normalized by the same scene with all actors cleared.
  • the scenario may be further adjusted or modified (e.g., to increase the complexity of testing scenarios, and identify new scenario variations, as noted above).
  • In AD systems, a wide variety of AD and ADAS models, logic, and algorithms use vehicle sensor data to control, or help control (e.g., via driver prompts, partial steering input, emergency braking, etc.), the vehicle.
  • AD systems can fully control the vehicle without driver assistance, whereas ADAS systems augment a driver’s control of the vehicle or automate, adapt, or enhance vehicle systems to increase safety and provide better driving.
  • In ADAS systems, safety features are designed to avoid collisions and accidents by offering technologies that alert the driver to potential problems, or to avoid collisions by implementing safeguards and taking over control of the vehicle.
  • the following provides an overview of testing and evaluation of scenarios relevant to testing of AD models, but it will be understood that such testing and evaluation operations are equally relevant to ADAS models.
  • FIG. 1 is a schematic drawing illustrating an example AD system 100 implemented within an autonomous vehicle.
  • This system 100 and its included data processing platform 102 are depicted in a simplistic manner, as a number of features are not shown or described for purposes of simplicity.
  • the particular software, logic, or functional AD operations performed by the data processing platform 102 may be dependent on a version of software, logic, or data, which may be programmed, reprogrammed, or updated at regular intervals by an authorized entity (e.g., a vehicle manufacturer or service provider).
  • the present techniques for scenario verification may be used to perform tests on such software, logic, or data, to ensure that the AD features correctly and safely operate.
  • FIG. 1 includes a data processing platform 102 incorporated into the vehicle 104.
  • the data processing platform 102 includes a sensor array interface 106, processing circuitry 108, data identification circuitry 110, and a vehicle interface 112.
  • the operation of the processing circuitry 108 or the data identification circuitry 110 may be adapted using programmed software, logic, or data which is tested and verified using the techniques discussed herein.
  • the vehicle 104 which may also be referred to as an “ego vehicle”, “subject vehicle”, or “host vehicle”, may be any type of vehicle, such as a commercial vehicle, a consumer vehicle, a recreation vehicle, a car, a truck, a motorcycle, a boat, a drone, a robot, an airplane, a hovercraft, or any mobile craft able to operate at least partially in an autonomous mode.
  • the vehicle 104 may operate at some times in a manual mode where the driver operates the vehicle 104 conventionally using pedals, a steering wheel, or other controls. At other times, the vehicle 104 may operate in a fully autonomous mode, where the vehicle 104 operates without user intervention.
  • the vehicle 104 may operate in a semi-autonomous mode, where the vehicle 104 controls many of the aspects of driving, but the driver may intervene or influence the operation using conventional (e.g., steering wheel) and non-conventional inputs (e.g., voice control).
  • the vehicle may operate at the same or different times among any number of driving automation levels, defined from Level 1 to Level 5 (e.g., as defined by SAE International J3016: Level 1, Driver Assistance; Level 2, Partial Driving Automation; Level 3, Conditional Driving Automation; Level 4, High Driving Automation; Level 5, Full Driving Automation).
  • the vehicle may also operate using combinations or variations of these levels.
  • the vehicle may operate according to a new concept, L2+ (Level 2 Plus), which describes a type of hybrid ADAS/AV to facilitate enhanced driving experience and boost safety without the need to provide a fully autonomous control system.
  • the sensor array interface 106 may be used to provide input or output signaling to the data processing platform 102 from one or more sensors of a sensor array installed on the vehicle 104.
  • sensors include, but are not limited to: forward, side, or rearward facing cameras; radar; LiDAR; ultrasonic distance measurement sensors; or other sensors.
  • Forward-facing or front-facing is used in this document to refer to the primary direction of travel, the direction the seats are arranged to face, the direction of travel when the transmission is set to drive, or the like.
  • rear-facing or rearward-facing is used to describe sensors that are directed in a roughly opposite direction than those that are forward or front-facing. It is understood that some front-facing cameras may have a relatively wide field of view, even up to 180-degrees.
  • a rear-facing camera that is directed at an angle (perhaps 60 degrees off center) to detect traffic in adjacent traffic lanes may also have a relatively wide field of view, which may overlap the field of view of the front-facing camera.
  • Side-facing sensors are those that are directed outward from the sides of the vehicle 104. Cameras in the sensor array may include infrared or visible light cameras, able to focus at long-range or short-range with narrow or large fields of view.
  • the vehicle 104 may also include various other sensors, such as driver identification sensors (e.g., a seat sensor, an eye tracking and identification sensor, a fingerprint scanner, a voice recognition module, or the like), occupant sensors, or various environmental sensors to detect wind velocity, outdoor temperature, barometer pressure, rain/moisture, or the like.
  • Sensor data is used to determine the vehicle’s operating context, environmental information, road conditions, travel conditions, or the like.
  • the sensor array interface 106 may communicate with another interface, such as an onboard navigation system, of the vehicle 104 to provide or obtain sensor data.
  • Components of the data processing platform 102 may communicate with components internal to the data processing platform 102 or components that are external to the platform 102 using a network, which may include local-area networks (LAN), wide-area networks (WAN), wireless networks (e.g., IEEE 802.11 (Wi-Fi) or cellular (e.g., 4G/5G) network), ad hoc networks, personal area networks (e.g., Bluetooth), vehicle-based networks (e.g., Controller Area Network (CAN) BUS), or other combinations or permutations of network protocols and network types.
  • the network may include a single local area network (LAN) or wide-area network (WAN), or combinations of LANs or WANs, such as the Internet.
  • the various devices coupled to the network may be coupled to the network via one or more wired or wireless connections.
  • the data processing platform 102 may communicate with a vehicle control platform 118 using a vehicle data interface 112, to receive and obtain vehicle data.
  • the vehicle control platform 118 may be a component of a larger architecture that controls various aspects of the vehicle’s operation.
  • the vehicle control platform 118 may have interfaces to autonomous driving control systems (e.g., steering, braking, acceleration, etc.), comfort systems (e.g., heat, air conditioning, seat positioning, etc.), navigation interfaces (e.g., maps and routing systems, positioning systems, etc.), collision avoidance systems, communication systems, security systems, vehicle status monitors (e.g., tire pressure monitor, oil level sensor, battery level sensor, speedometer, etc.), and the like.
  • the vehicle control platform 118 may control or monitor one or more subsystems, and communicate data from such subsystems to the data processing platform 102.
  • features of the sensor array interface 106 and the vehicle data interface 112 are integrated into a same or coordinated data collection interface, to receive data from at least one sensing component of the vehicle. Such data may be provided via the interface(s) during autonomous operation of the vehicle, and such data may be automatically logged or monitored.
  • Such data may include sensor data such as braking, throttle, and speed data signals, among other data signal types.
  • the data identification circuitry 110 may implement various rules, algorithms, or logic, including one of several types of machine learning, such as artificial neural networks (ANN), support vector machines (SVM), Gaussian mixture model (GMM), deep learning, or the like.
  • the processing circuitry 108 may initiate one or more responsive data processing, logging, or communication activities.
  • Other autonomous vehicle and data processing actions may be monitored, coordinated, or initiated depending on the type, severity, location, or other aspects of an event detected with the data processing platform 102. It will be understood that any number of the models, logic, and algorithms used for AV and data processing operations may be simulated, tested, and validated using the techniques discussed herein.
  • the following proposes to generate more critical scenarios inputted to an AD testing system, thus greatly reducing testing time and cost.
  • the following describes a sampling process to generate riskier (and more complicated) scenarios by considering the feedback measured from previous testing results.
  • different metrics can be used, such as TTC (Time-To-Collision).
  • the following discusses a search-based method and an improved method of conditional probabilistic sampling to generate more challenging cases and inputs for AD software testing.
  • FIG. 2 depicts a system architecture for AD testing. Specifically, FIG. 2 depicts a scenario-based AD testing framework, involving sampling operations 202, a test engine 206, an evaluation decision 208, and an output 210. The following extends this AD testing framework with the use of an adaptive scenario/sample generation system 212, which improves testing speed by generating more challenging/greater risk scenarios (than previous rounds) for the test engine 206.
  • In a conventional flow, a sampling operation would extract one scenario from a pre-collected dataset (e.g., with sampling operation 202 extracting a sample from scenario database 204) and provide this input to the test engine 206, repeating until enough steps are performed or until some criterion is met by the evaluation decision 208 (such as enough failed cases, etc.).
  • the extracted inputs are simply a subset of a pre-collected scenario dataset from the database 204.
  • the pre-collected scenario dataset must be large enough to cover all cases. However, because the next-round testing sample does not consider the previous testing results, this may lead to testing with redundant samples, which further increases the time needed for testing.
  • This testing framework is extended with the adaptive scenario/sample generation system 212, which uses further approaches for test case sampling.
  • a sampling approach may apply different sampling mechanisms, such as random sampling, Monte Carlo sampling, importance-based sampling, etc., to reduce testing time.
  • the adaptive scenario/sample generation system 212 considers the last-round testing results and generates more challenging/riskier cases for the test engine.
  • a result collection process 214 of the system 212 operates to collect current testing results, and calculate a risk metric (e.g., TTC) relevant to identifying further critical scenarios.
  • the adaptive sample generation process 216 of the system 212 operates to adaptively generate more challenging scenarios, based on results from one or more previous rounds of testing. These generated scenarios are then provided to the test engine 206 for further testing and evaluation.
  • the presently described system and technique can be used as a module or plug-in to currently available testing systems.
  • existing systems can be adapted to also implicitly or explicitly generate more critical scenarios (e.g., new scenarios) based on the testing of even one input scenario.
  • adaptive generation may generate more critical/dangerous cases than a last round of testing by adjusting specific parameters.
  • the following addresses three problems: 1) how to parameterize and represent a scenario; 2) how to measure a critical level/degree of a testing scenario; 3) how to generate more critical scenarios by adjusting scenario parameters.
  • a scenario consists of descriptions of all traffic participants (Vehicle_i) and the road layout (W_lane), where the subscript i refers to the i-th traffic participant.
  • For each participant (e.g., a vehicle), a parameterized trajectory (e.g., x_i(t; θ)) may be defined.
  • TTC is a popular safety indicator, which is defined as the time required for two vehicles to collide if they keep their current heading path and motion model. The underlying assumption is a constant-velocity motion model.
  • a momentary TTC of a VUT (vehicle under test) and a VT (vehicle target) may be defined as TTC = D_rel / V_rel, where D_rel and V_rel represent the relative distance and speed between the VUT and the VT, respectively.
  • high-risk events tend to correspond to small TTC values.
  • FIG. 3 provides a TTC calculation for side, rear end, and head on cases 302, 304, 306.
  • TTC can be used as a risk assessment metric
  • four risk levels can be defined by predefining several thresholds (e.g., given a statistical analysis approach) on TTC as shown in Table 1.
  • the scenario generation problem then becomes generating more critical scenarios (with smaller TTC values, in the range of risk level II or III) compared with the last-round input scenario samples.
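  • As a concrete illustration of how the momentary TTC and risk-level classification described above might be computed, the following Python sketch implements the TTC = D_rel / V_rel relation and a threshold-based risk label. The threshold values and level names are hypothetical placeholders (Table 1 is not reproduced in this text), and the function names are illustrative rather than part of the described system.

```python
import math

def momentary_ttc(d_rel: float, v_rel: float) -> float:
    """Momentary time-to-collision: relative distance divided by closing speed.

    d_rel: relative distance between VUT and VT (meters).
    v_rel: closing speed (m/s); positive when the vehicles are approaching.
    Returns +inf when the vehicles are not closing (no projected collision).
    """
    if v_rel <= 0.0:
        return math.inf
    return d_rel / v_rel

# Hypothetical thresholds in seconds; the actual Table 1 values are not shown in this text.
# Per the description, smaller TTC corresponds to greater risk.
RISK_THRESHOLDS = [(1.5, "level-IV"), (3.0, "level-III"), (5.0, "level-II")]

def risk_level(ttc: float) -> str:
    """Map a TTC value onto one of four risk levels (assumed ordering and cutoffs)."""
    for threshold, level in RISK_THRESHOLDS:
        if ttc <= threshold:
            return level
    return "level-I"
```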
  • it is expected to adjust the scenario parameters to generate different elements in the scenario tuple, including initial states, dynamic trajectories, and road layouts.
  • examples are provided on how to adaptively generate scenarios with different initial states and dynamic trajectories. It will be understood that other approaches may be provided.
  • grid-search-based scenario generation may be performed, such as scenario generation by adaptation of initial states (x^0).
  • the following grid-search algorithm may be used to generate more critical scenarios for testing.
  • In other examples, grid-search-based scenario generation may be performed by adaptation of trajectories (x_i(t; θ)). In particular, this involves changing x_i(t; θ) for new scenarios. This scenario generation approach first samples a time-step, keeps the states before that moment, and re-generates new scenarios after the sampled time-point. TABLE 3
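  • A minimal sketch of grid-search-based adaptation of initial states follows (the actual algorithms of Tables 2 and 3 are not reproduced in this text). It assumes a hypothetical simulate_scenario() helper that runs the test engine and returns the minimum observed TTC, a dictionary-style scenario with illustrative keys, and it simply keeps candidate perturbations that yield a smaller TTC than the last round. The same loop structure could be applied to trajectory adaptation by sampling a time-step and re-generating states after it.

```python
import copy
import itertools

def grid_search_initial_states(scenario, simulate_scenario, last_round_ttc,
                               speed_deltas=(-2.0, 0.0, 2.0),
                               gap_deltas=(-5.0, 0.0, 5.0)):
    """Search a grid of perturbations on a target vehicle's initial speed and gap.

    scenario: dict-like scenario description (illustrative structure, not the patent's format).
    simulate_scenario: callable(scenario) -> minimum observed TTC (hypothetical test-engine hook).
    last_round_ttc: TTC measured for the last-round input scenario.
    Returns candidate scenarios that are more critical (smaller TTC) than last round.
    """
    critical_candidates = []
    for d_speed, d_gap in itertools.product(speed_deltas, gap_deltas):
        candidate = copy.deepcopy(scenario)
        candidate["target_initial_speed"] += d_speed    # perturb initial speed (m/s)
        candidate["initial_gap"] += d_gap               # perturb initial longitudinal gap (m)
        ttc = simulate_scenario(candidate)
        if ttc < last_round_ttc:                        # keep only more critical scenarios
            critical_candidates.append((ttc, candidate))
    critical_candidates.sort(key=lambda pair: pair[0])  # most critical first
    return critical_candidates
```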
  • efficient scenario generation may be performed by probabilistic modeling.
  • a probabilistic distribution of TTC given any NDS (Naturalistic Driving Study) dataset, can be represented.
  • FIG. 4 depicts an example TTC distribution 400 with TTC input as condition for identifying a search space.
  • the evaluation becomes: given any input TTC_in value (calculated from one scenario), generate new samples with a smaller TTC value.
  • any conditional Monte Carlo based generation method can be used to sample the search space with TTC < TTC_in, such as naive Monte Carlo or importance-based sampling methods.
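  • The conditional sampling step can be sketched as follows, assuming an empirical TTC distribution drawn from an NDS dataset; this is a naive (rejection-style) Monte Carlo sketch conditioned on TTC < TTC_in, and the function and parameter names are illustrative only. Importance-based sampling would replace the uniform draw with a proposal distribution biased toward small TTC values.

```python
import random

def sample_conditional_ttc(ttc_samples, ttc_in, num_samples=10, max_tries=10000):
    """Naive conditional Monte Carlo: draw TTC values from an empirical distribution,
    keeping only draws smaller than the input TTC (i.e., more critical than the
    last-round scenario).

    ttc_samples: list of TTC values observed in an NDS dataset (empirical distribution).
    ttc_in: TTC calculated from the current input scenario.
    """
    accepted = []
    tries = 0
    while len(accepted) < num_samples and tries < max_tries:
        candidate = random.choice(ttc_samples)  # draw from the empirical TTC distribution
        tries += 1
        if candidate < ttc_in:                  # condition: TTC < TTC_in
            accepted.append(candidate)
    return accepted
```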
  • FIG. 5 depicts a flowchart 500 of a method for adaptive generation and use of critical scenarios, used for implementing the present techniques.
  • Operation 502 involves identifying an initial scenario sample for testing, such as from a scenario dataset.
  • This initial scenario sample may involve testing for some critical scenario condition for AD operation, based on known conditions or modeling.
  • Operation 504 involves testing the initial scenario with AD operations, according to various AD test approaches discussed above. Data from the testing of the initial scenario is collected and used for further analysis in the following operations. Such data may be parameterized and measured as discussed above.
  • Operation 506 includes identification of a risk assessment metric, from the testing of the initial scenario, as discussed above.
  • this risk assessment metric may be provided by measurement of a time-to-collision value from the test of the initial scenario.
  • Operation 508 includes adjustment of the parameters of the initial scenario, based on the risk assessment metric, as discussed above. From these adjustments, a subsequent test scenario with new parameters (e.g., with different initial states and dynamic trajectories) may be generated.
  • Operation 510 includes testing of the subsequent scenario. Operations 506-510 may be repeated, using the results of the subsequent scenario for further risk assessment, adjustment, and generation of additional subsequent scenarios.
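  • Taken together, operations 502-510 can be viewed as a closed loop; the minimal sketch below assumes the hypothetical helpers from the preceding examples (a simulate_scenario callable for testing and risk measurement, and a generate_more_critical callable for the parameter adjustment of operation 508) and iterates for a fixed round budget.

```python
def adaptive_generation_loop(initial_scenario, simulate_scenario, generate_more_critical,
                             max_rounds=5):
    """Closed-loop adaptive scenario generation (operations 502-510, illustrative only).

    simulate_scenario: callable(scenario) -> risk metric (e.g., minimum TTC).
    generate_more_critical: callable(scenario, risk_metric) -> new scenario with adjusted
        parameters (e.g., via grid search or conditional sampling).
    """
    scenario = initial_scenario                     # operation 502: initial scenario sample
    history = []
    for round_index in range(max_rounds):
        risk_metric = simulate_scenario(scenario)   # operations 504/506: test and measure risk
        history.append((round_index, risk_metric, scenario))
        scenario = generate_more_critical(scenario, risk_metric)  # operation 508: adjust parameters
    return history                                  # operation 510 repeats via the loop
```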
  • FIG. 6 illustrates an example of a moving vehicle scenario demonstrating the application of an autonomous vehicle operation safety model.
  • an ego vehicle 605 shares the road with two target vehicles, the sedan 620 and the truck 625.
  • ego vehicle refers to the vehicle from which a certain perspective is obtained and a target vehicle is a subject within that perspective.
  • a VOSM can provide a mathematical model for safety assurance during autonomous driving.
  • a VOSM typically formulates a set of safety standards, such as the minimum distance d_min between vehicles to avoid collisions. Multiple parameters are used to calculate the formulation, such as the response time ρ, the minimum braking deceleration a_min,brake, and the maximum acceleration a_max,accel of the vehicle. If all requirements are satisfied, the vehicle passes the standards and is believed to be safe; otherwise the vehicle is not safe.
  • a VOSM (or “safety model” as used herein) defines a safe longitudinal distance 610 and a safe lateral distance 615 for the ego vehicle 605. These distances create a zone, shell, bubble, or shield around the ego vehicle 605, also illustrated around the sedan 620 and the truck 625. Generally, violation of these safe distances (e.g., intersection or overlap 630) indicates that the ego vehicle 605 is not safe and should take corrective action. Note that the intersection 630 need not result in a collision, merely that, according to the safety model, an unsafe situation has arisen.
  • the safety model may provide a representation of safe longitudinal distance (minimum longitudinal safe distance), referred to below as equation (4).
  • the model also provides a representation of safe lateral distance (minimum lateral safe distance), referred to below as equation (5).
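  • The equations themselves are not reproduced in this extract; for reference, a commonly cited RSS-style form of the minimum safe longitudinal distance is sketched below. This is an assumption about the general shape of equation (4), not necessarily the exact expression used in this description; the minimum safe lateral distance (equation (5)) has an analogous form using lateral acceleration and braking parameters plus a small lateral fluctuation margin μ.

```latex
% Standard RSS-style minimum safe longitudinal distance (assumed form).
% v_r and v_f are the rear and front vehicle speeds, \rho is the response time,
% and [x]_+ = \max(x, 0).
d_{\min}^{\mathrm{long}} =
  \left[ v_r \rho
       + \tfrac{1}{2}\, a_{\max,\mathrm{accel}}\, \rho^2
       + \frac{\left(v_r + \rho\, a_{\max,\mathrm{accel}}\right)^2}{2\, a_{\min,\mathrm{brake}}}
       - \frac{v_f^2}{2\, a_{\max,\mathrm{brake}}}
  \right]_{+}
```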
  • ρ_1 and ρ_2 are the response times of the ego vehicle 605 (c_1) and a target vehicle (c_2), such as the truck 625.
  • the remaining parameters are, respectively, the maximum acceleration rate and minimum braking rate of c_1 and of c_2.
  • When the ego vehicle 605 detects that it is closer than either the minimum safe longitudinal distance or the minimum safe lateral distance, the ego vehicle 605 is expected to implement a corrective action. Such corrective actions may include braking or turning to increase the distance between the ego vehicle 605 and the target vehicle 625 or other object until the minimum safe longitudinal distance and the minimum safe lateral distance are restored.
  • Equations (4) and (5) above illustrate the parameterization of the safety model to response times of the ego vehicle 605 and the target vehicle 625, maximum lateral or longitudinal acceleration of the target vehicle and minimum braking (e.g., deceleration) of the ego vehicle.
  • maximum acceleration is the greatest acceleration a vehicle is capable of, and minimum braking is the deceleration a vehicle can guarantee will be applied when executing a maneuver.
  • the maximum and minimum braking may be the same.
  • the minimum braking for the ego vehicle 605 is reduced from the maximum braking based on the brake wear.
  • Equations (4) and (5) generally assume a worst case scenario in which the ego vehicle 605 is underperforming (thus the use of the minimum braking for the ego vehicle) and the target vehicle 625 is at peak performance (thus the use of maximum acceleration for the target vehicle 625), even though it is more likely that the ego vehicle 605 will outperform its minimum braking and the target vehicle 625 will underperform its maximum acceleration.
  • the danger zone is defined around the ego vehicle 605. As noted above, when another object interferes with this zone, or is projected to interfere with the zone, then the ego vehicle 605 is expected to take action. Because the velocities of both the ego vehicle 605 and the target vehicle 625 are parameters of equations (1) and (2), the danger zone is constantly changing based on the detected movement of the ego vehicle 605 and the target vehicle 625.
  • Other VOSMs will generally follow the safety model template and requirements described above by defining relevant parameters and providing for acceptable vehicle interactions based on those parameters.
  • FIG. 7 illustrates an example workflow for operating a safety complexity assessment on an autonomous driving scenario.
  • a safety area is dynamically calculated as per the surrounding vehicles’ speed and the speed at Pn, using safety model rules for longitudinal and lateral distance.
  • a workflow includes adjusting position along a trajectory, based on safety model rules; this is followed by identifying a safety area and positions along a trajectory, at operation 704. Further iterations may be followed by additional adjustment and positioning.
  • FIG. 8 provides an illustration of vehicle positions in a test scenario map 802, demonstrating movement of a test vehicle along designated points (P0, P1, P2, P3).
  • the start point is P0 and the target point is the destination.
  • Vehicle A/B/C are other actors in the scenario and will follow the actions and route as pre-defined by the scenario.
  • a safety area 804 (at the current time point) is calculated with safety model rules which are related to the surrounding vehicles’ speed.
  • R 806 is system defined and represents the section range. R 806 can be a constant or variable by section.
  • each waypoint Pn matches the following requirements to be added into the waypoint list: (a) all candidate points are located on the boundary of the safety circle with Pn-1 as its center and within the safety area given by the safety model rules (safety area 804); (b) Pn should be the candidate point which has the shortest distance to the target point; (c) at some critical moments there is no safety area for driving, in which case the waypoint should remain the same as the previous one.
  • waypoint selection may be provided by the following waypoint selection algorithm. TABLE 4
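  • The waypoint selection algorithm of Table 4 is not reproduced in this extract; the Python sketch below implements requirements (a)-(c) as described, assuming the candidate circle radius equals the section range R and assuming a hypothetical in_safety_area() predicate supplied by the safety model rules.

```python
import math

def select_next_waypoint(prev_point, target_point, section_range, in_safety_area,
                         num_candidates=36):
    """Select the next waypoint Pn per requirements (a)-(c) described above.

    prev_point, target_point: (x, y) tuples.
    section_range: radius R of the candidate circle around the previous waypoint.
    in_safety_area: callable((x, y)) -> bool, defined by the safety model rules.
    Returns the previous waypoint unchanged when no safe candidate exists (requirement (c)).
    """
    px, py = prev_point
    tx, ty = target_point
    best_point, best_distance = None, math.inf
    for k in range(num_candidates):
        angle = 2.0 * math.pi * k / num_candidates
        candidate = (px + section_range * math.cos(angle),
                     py + section_range * math.sin(angle))     # (a): on the circle boundary
        if not in_safety_area(candidate):                       # (a): inside the safety area
            continue
        distance = math.hypot(candidate[0] - tx, candidate[1] - ty)
        if distance < best_distance:                            # (b): closest to the target point
            best_point, best_distance = candidate, distance
    return best_point if best_point is not None else prev_point  # (c): keep the previous waypoint
```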
  • Section Range (e.g., R 806) can be selected, defined, or adjusted.
  • section range R can be system defined as a requirement. It can be a constant for all sections or be variable by section through the whole route. Section range R should be small enough to fit the complex road environment. The value can be initialized based on expert knowledge, and then adjusted based on the safety area.
  • section range selection may be provided by the following section range adjustment algorithm. TABLE 5
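  • The section range adjustment algorithm of Table 5 is likewise not shown; one plausible reading, sketched below under that assumption, is to start from an expert-initialized R and shrink it until the resulting trajectory segment stays inside the safety area (i.e., is not occluded by a moving actor). The helper names are hypothetical.

```python
def adjust_section_range(prev_point, next_point_for, segment_is_safe,
                         initial_range, min_range=1.0, shrink_factor=0.8):
    """Decrease the section range R until the trajectory segment is not occluded.

    next_point_for: callable(prev_point, R) -> candidate next waypoint (e.g., select_next_waypoint).
    segment_is_safe: callable(prev_point, next_point) -> bool, per the safety model rules.
    """
    section_range = initial_range          # initialized by expert knowledge
    while section_range > min_range:
        candidate = next_point_for(prev_point, section_range)
        if segment_is_safe(prev_point, candidate):
            return section_range, candidate
        section_range *= shrink_factor     # adjust (decrease) R to avoid occlusion
    return min_range, prev_point           # no safe segment found at the minimum range
```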
  • FIGS. 9A and 9B provide additional illustration of examples of maps 902A, 902B for adjusting section range.
  • the trajectory from P0 to P1 will be occluded by the vehicle C, which is moving rightwards during that time.
  • the section range needs to be adjusted or decreased to guarantee that the trajectory is not occluded.
  • cost functions may also be considered in this approach.
  • the total cost includes time cost and maneuver cost.
  • Time cost is an integration of each section’s routing time; for each section, the same velocity value is assigned, set to the maximum speed limit, for a fastest routing.
  • a maneuver cost is designed to evaluate the steering changes, which can reflect the complexity of a maneuver. As with the time cost, in some situations there is no next waypoint, so the velocity will be zero.
  • normalization may also be considered. Because each scenario has a different road net which is hard to compare (for example, an intersection compared with a curved road), a normalization step will be applied in the pipeline. All actors will be removed from the scenario and the total cost will be recalculated as a baseline for normalization.
  • FIG. 10 provides an example of an intersection in a map 1002.
  • the dashed lines 1004, 1006 refer to the trajectories of actors A and C.
  • the dashed line 1008 with nodes refers to the trajectory and waypoints with individual actors A, B, C in this scenario.
  • the dashed line 1010 with nodes refers to the trajectory and waypoints without actors A, B, C.
  • the dashed line 1010 is smooth and reflects the base forward direction.
  • Cost_Time' and Cost_Maneuver' are the corresponding time and maneuver costs under the road-clear situation.
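  • A minimal sketch of the cost and normalization computation described above follows; it assumes the waypoint list and per-section lengths are already available and that the same quantities can be recomputed for the road-clear baseline (Cost_Time' and Cost_Maneuver'). Whether the time and maneuver terms are summed or weighted separately is not specified here; the unweighted sum below is only one plausible choice.

```python
import math

def time_cost(section_lengths, speed_limit):
    """Time cost: sum of routing times per section at the maximum speed limit."""
    return sum(length / speed_limit for length in section_lengths)

def maneuver_cost(waypoints):
    """Maneuver cost: accumulated heading (steering) change along the waypoint list."""
    cost = 0.0
    for a, b, c in zip(waypoints, waypoints[1:], waypoints[2:]):
        heading_1 = math.atan2(b[1] - a[1], b[0] - a[0])
        heading_2 = math.atan2(c[1] - b[1], c[0] - b[0])
        delta = (heading_2 - heading_1 + math.pi) % (2.0 * math.pi) - math.pi  # wrap to [-pi, pi]
        cost += abs(delta)
    return cost

def normalized_complexity(cost_time, cost_maneuver, cost_time_clear, cost_maneuver_clear):
    """Normalize the scenario cost by the same scene with all actors removed (the baseline)."""
    baseline = cost_time_clear + cost_maneuver_clear
    return (cost_time + cost_maneuver) / baseline if baseline > 0 else math.inf
```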
  • Operations may be adapted for other case examples, such as those shown in the maps 1102A, 1102B, 1102C in FIGS. 11A, 11B, and 11C, illustrating times T0, T1, and T2, respectively, for a straight road scenario assessment.
  • vehicles A, B, and C are actors in the scenario and will change lanes and speed forward right.
  • Another case example is shown in the maps 1202A, 1202B in FIGS. 12A and 12B, illustrating times T0 and T1, respectively, for an intersection scenario.
  • actors A, C will follow the trajectories 1204, 1206 and actor B will remain at its location.
  • the virtual ego path starts, at time T0, at P0 with the initial section range R = R1 from safety area 1208 (in range 1210), with P1 being the next waypoint, as depicted in FIG. 12A.
  • the trajectory from P1 to P2 will be occluded by actor C, which is moving.
  • P2 can serve as the next waypoint.
  • FIG. 13 depicts a flowchart 1300 of a method for scenario complexity assessment, based on vehicle operation safety rules, according to the present techniques.
  • this method may be used for assessing some scenario (e.g., a scenario used for testing or verifying vehicle control system software) with a VOSM, in a test setting.
  • RSS or other safety models may be tested and applied as the relevant VOSM.
  • Operation 1302 includes calculating a safety area based on VOSM rules, such as for evaluation of longitudinal and lateral safe distances based on other vehicle speeds.
  • Operation 1304 includes selecting waypoints for a safety evaluation, based on the safety area. As noted above, these waypoints may be selected at the edge of a known safety area. Algorithms for waypoint selection are discussed above.
  • Operation 1306 includes adjusting the drivable section range of the test scenario, for evaluation with the safety rules, so that a critical scenario can be invoked. Algorithms for section range adjustment are discussed above.
  • Operation 1308 involves evaluating cost functions for the adjusted drivable section range, with use of cost functions including time cost and maneuver cost. Details of such cost functions are discussed above.
  • Operation 1310 involves producing an assessment of the scenario, based on the safety area, the adjusted section range, and the cost functions. From this, the difficulty of a test scenario can be quantified or represented.
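  • Operations 1302-1310 can be combined into a single assessment pass; the sketch below strings together the hypothetical helpers from the preceding examples and is illustrative only, not the algorithm of the description.

```python
def assess_scenario_complexity(waypoint_planner, cost_evaluator, baseline_evaluator):
    """Scenario complexity assessment pipeline (operations 1302-1310, illustrative only).

    waypoint_planner: callable() -> waypoint list produced with safety-area and
        section-range handling (operations 1302-1306).
    cost_evaluator: callable(waypoints) -> (time_cost, maneuver_cost) for the scenario
        with actors present (operation 1308).
    baseline_evaluator: callable() -> (time_cost, maneuver_cost) for the same scene
        with all actors removed (normalization baseline).
    """
    waypoints = waypoint_planner()
    cost_time, cost_maneuver = cost_evaluator(waypoints)
    base_time, base_maneuver = baseline_evaluator()
    baseline = base_time + base_maneuver
    assessment = (cost_time + cost_maneuver) / baseline if baseline else float("inf")
    return assessment                      # operation 1310: quantified scenario difficulty
```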
  • FIG. 14 illustrates a flowchart 1400 of a method for generating and evaluating a scenario used for testing autonomous driving operations.
  • the operations of this method may be performed in a testing simulator or platform, a device integrated within or as part of a simulation computing engine, service, machine, device, or as part of instructions from a computing machine- or device-readable storage medium which are executed by circuitry of the computing machine or device.
  • The method begins by identifying data of an autonomous driving scenario to be evaluated. This autonomous driving scenario may be a first (initial) autonomous driving scenario in an iterative flow of multiple analyzed scenarios.
  • the autonomous driving scenario is defined by a first plurality of scenario parameters.
  • the scenario parameters specify characteristics of a roadway, participants on the roadway, and a trajectory for each of the participants, or other scenario characteristics relevant to testing and simulation as noted above.
  • operations are optionally performed, to cause testing of the autonomous driving scenario, using safety operations of an autonomous driving model.
  • At least a portion of the first autonomous driving scenario may be tested on an autonomous driving simulation platform, as the testing of the autonomous driving scenario applies the safety operations to the autonomous driving scenario.
  • a (first) risk assessment metric is produced based on results from the testing of the autonomous driving scenario.
  • operations are performed to identify or determine a risk assessment metric for the autonomous driving scenario.
  • This risk assessment metric is based on one or more safety operations used by an autonomous driving model in the driving scenario.
  • One or more values for the risk assessment metric or metrics may be produced based on an outcome of the safety operations from testing (at operation 1404), or other evaluative processes, consistent with the examples discussed above.
  • operations are performed to change or otherwise determine new parameters which produce greater difficulty of the safety operations and an increased risk assessment metric.
  • the first plurality of scenario parameters from the first scenario are used as a starting point, and such parameters are modified to produce a second plurality of scenario parameters.
  • the second plurality of scenario parameters are determined by performing a grid-search for scenario parameters, or by performing probabilistic modeling to identify a search space for scenario parameters, as discussed above.
  • the second risk assessment metric may be used to indicate or identify a greater difficulty for the safety operations than the first risk assessment metric.
  • the risk assessment metrics are associated with a Time-to-Collision (TTC) calculation.
  • such a TTC calculation produces a value within one of a plurality of risk levels.
  • risk levels can be used to identify more difficult scenarios and scenario parameters, such as when the second risk assessment metric is associated with (or provides) a TTC calculation that produces a value within an equal or a greater risk level than the first risk assessment metric.
  • operations are optionally performed, to produce an assessment (e.g., score, rating, metric) of the autonomous driving scenario, based on testing or evaluative outcomes of the safety operations in the relevant scenarios.
  • this assessment may be based on outcomes in testing, simulation, or in risk assessment measurements relating to the first autonomous driving scenario (with the first set of scenario parameters) and the second autonomous driving scenario (with the second/changed set of scenario parameters).
  • the outcomes or assessments of other scenarios or sets of scenarios may also be considered.
  • the difficulty assessment may be based on a safety area, a section range, or cost function, for a maneuver (or multiple maneuvers) associated with the safety operations that are performed with testing the autonomous driving scenario.
  • the cost function may relate to a cost in time or a complexity cost for a maneuver.
  • the section range may relate to waypoints used in a simulation for a maneuver.
  • the flowchart concludes with the generation of a second (new) autonomous driving scenario, from the second (new) set of scenario parameters.
  • This second autonomous driving scenario is made available for use in further simulation, testing, or evaluation.
  • operations of testing and generating subsequent autonomous driving scenarios, including the second autonomous driving scenario may be used during multiple rounds of testing (e.g., returning to operation 1402 for the second scenario, third scenario, or n-th scenario).
  • Such iterative operations may be used to progressively increase a respective risk assessment metric produced for a scenario, which indicates a greater difficulty for the safety operations in each of the subsequent autonomous driving scenarios.
  • the iterative operations may continue until some condition is identified or met.
  • Embodiments may be implemented in one or a combination of hardware, firmware, and software. Embodiments may also be implemented as instructions stored on a machine- readable storage device, which may be read and executed by at least one processor to perform the operations described herein.
  • a machine-readable storage device may include any non-transitory mechanism or medium for storing information in a form readable by a machine (e.g., a computer).
  • a machine-readable storage device may include read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, and other storage devices and media.
  • a processor subsystem may be used to execute the instructions on the machine-readable medium.
  • the processor subsystem may include one or more processors, each with one or more cores. Additionally, the processor subsystem may be disposed on one or more physical devices.
  • the processor subsystem may include one or more specialized processors, such as a graphics processing unit (GPU), a digital signal processor (DSP), a field programmable gate array (FPGA), or a fixed function processor.
  • Examples, as described herein, may include, or may operate on, logic or a number of components, modules, mechanisms, or units (collectively, “components”). Such components may be hardware, software, or firmware communicatively coupled to one or more processors in order to carry out the operations described herein.
  • Components may be hardware components, and as such components may be considered tangible entities capable of performing specified operations and may be configured or arranged in a certain manner.
  • circuits may be arranged (e.g., internally or with respect to external entities such as other circuits) in a specified manner as a component.
  • the whole or part of one or more computer systems (e.g., a standalone, client, or server computer system) or one or more hardware processors may be configured by firmware or software (e.g., instructions, an application portion, or an application) as a component that operates to perform specified operations.
  • the software may reside on a machine-readable medium.
  • the software when executed by the underlying hardware of the component, causes the hardware to perform the specified operations.
  • the term hardware component, module, mechanism, or unit is understood to encompass a tangible entity, be that an entity that is physically constructed, specifically configured (e.g., hardwired), or temporarily (e.g., transitorily) configured (e.g., programmed) to operate in a specified manner or to perform part or all of any operation described herein.
  • each of the components need not be instantiated at any one moment in time.
  • Where the components comprise a general-purpose hardware processor configured using software, the general-purpose hardware processor may be configured as respective different components at different times.
  • Software may accordingly configure a hardware processor, for example, to constitute a particular component at one instance of time and to constitute a different component at a different instance of time.
  • Components may also be software or firmware modules, which operate to perform the methodologies described herein.
  • Circuitry or circuits may comprise, for example, singly or in any combination, hardwired circuitry, programmable circuitry such as computer processors comprising one or more individual instruction processing cores, state machine circuitry, and/or firmware that stores instructions executed by programmable circuitry.
  • the circuits, circuitry, or modules may, collectively or individually, be embodied as circuitry that forms part of a larger system, for example, an integrated circuit (IC), system on-chip (SoC), desktop computers, laptop computers, tablet computers, servers, smart phones, etc.
  • logic may refer to firmware and/or circuitry configured to perform any of the aforementioned operations.
  • Firmware may be embodied as code, instructions or instruction sets and/or data that are hard-coded (e.g., nonvolatile) in memory devices and/or circuitry.
  • Circuitry as used in any embodiment herein, may comprise, for example, singly or in any combination, hardwired circuitry, programmable circuitry, state machine circuitry, logic and/or firmware that stores instructions executed by programmable circuitry.
  • the circuitry may be embodied as an integrated circuit, such as an integrated circuit chip.
  • the circuitry may be formed, at least in part, by the processor circuitry executing code and/or instruction sets (e.g., software, firmware, etc.) corresponding to the functionality described herein, thus transforming a general-purpose processor into a specific-purpose processing environment to perform one or more of the operations described herein.
  • the processor circuitry may be embodied as a stand-alone integrated circuit or may be incorporated as one of several components on an integrated circuit.
  • the various components and circuitry of the node or other systems may be combined in a system-on-a-chip (SoC) architecture.
  • the processing circuitry may be embodied or provided by a data processing unit (DPU), infrastructure processing unit (IPU), acceleration circuitry, or combinations of graphical processing units (GPUs) or programmed FPGAs.
  • Circuitry (e.g., processing circuitry) is a collection of circuits implemented in tangible entities that include hardware.
  • Circuitry membership may be flexible over time.
  • Circuitries include members that may, alone or in combination, perform specified operations when operating.
  • hardware of the circuitry may be immutably designed to carry out a specific operation (e.g., hardwired).
  • the hardware of the circuitry may include variably connected physical components (e.g., execution units, transistors, simple circuits, etc.) including a machine readable medium physically modified (e.g., magnetically, electrically, moveable placement of invariant massed particles, etc.) to encode instructions of the specific operation.
  • the instructions enable embedded hardware (e.g., the execution units or a loading mechanism) to create members of the circuitry in hardware via the variable connections to carry out portions of the specific operation when in operation.
  • the machine readable medium elements are part of the circuitry or are communicatively coupled to the other components of the circuitry when the device is operating.
  • any of the physical components may be used in more than one member of more than one circuitry.
  • execution units may be used in a first circuit of a first circuitry at one point in time and reused by a second circuit in the first circuitry, or by a third circuit in a second circuitry at a different time. Additional examples of these components with respect to a computer system follow.
  • FIG. 15 is a block diagram illustrating a machine in the example form of a computer system 1500, within which a set or sequence of instructions may be executed to cause the machine to perform any one of the methodologies discussed herein, according to an embodiment.
  • the machine operates as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine may operate in the capacity of either a server or a client machine in server-client network environments, or it may act as a peer machine in peer-to-peer (or distributed) network environments.
  • the machine may be a vehicle subsystem or vehicle on-board computer, a personal computer (PC), a tablet PC, a hybrid tablet, a mobile telephone or smartphone, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • the term “processor-based system” shall be taken to include any set of one or more machines that are controlled by or operated by a processor circuitry (e.g., a computer) or other circuitry to individually or jointly execute instructions to perform any one or more of the methodologies discussed herein.
  • Example computer system 1500 includes at least one hardware processor 1502 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both, processor cores, compute nodes, etc.), a main memory 1504 and a static memory 1506 (e.g., memory or storage for firmware, microcode, a basic-input-output (BIOS), unified extensible firmware interface (UEFI), etc.), which communicate with each other via a link 1530 (e.g., bus or interconnect).
  • the computer system 1500 may further include a display unit 1510, an alphanumeric input device 1512 (e.g., a keyboard), and a user interface (UI) navigation device 1514 (e.g., a mouse).
  • the display unit 1510, input device 1512 and UI navigation device 1514 are incorporated into a touch screen display.
  • the computer system 1500 may additionally include a storage device 1516 (e.g., a drive unit), a signal generation device 1518 (e.g., a speaker), a network interface device 1520, and one or more sensors (not shown), such as a global positioning system (GPS) sensor, compass, accelerometer, gyrometer, magnetometer, or other sensor.
  • the machine 1500 may include an output controller 1528, such as a serial (e.g., universal serial bus (USB), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.).
  • Registers of the processor 1502, the main memory 1504, the static memory 1506, or the mass storage 1508 may be, or include, a machine-readable medium 1522 on which is stored one or more sets of data structures or instructions 1524 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein.
  • the instructions 1524 may also reside, completely or at least partially, within any of registers of the main memory 1504, static memory 1506, and/or within the processor 1502 during execution thereof by the computer system 1500.
  • one or any combination of the hardware processor 1502, the main memory 1504, the static memory 1506, and the mass storage 1508 may constitute machine-readable media.
  • While the machine-readable medium 1522 is illustrated in an example embodiment to be a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions 1524.
  • the term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions.
  • the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media.
  • machine-readable media include non-volatile memory, including but not limited to, by way of example, semiconductor memory devices (e.g., electrically programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM)) and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the instructions 1524 may further be transmitted or received over a communications network 1526 using a transmission medium via the network interface device 1520 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).
  • Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, plain old telephone (POTS) networks, and wireless data networks (e.g., Bluetooth, Wi-Fi, 3G, and 4G LTE/LTE-A, 5G, DSRC, or like networks).
  • the term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
  • information stored or otherwise provided on a machine-readable medium may be representative of instructions, such as instructions themselves or a format from which the instructions may be derived.
  • This format from which the instructions may be derived may include source code, encoded instructions (e.g., in compressed or encrypted form), packaged instructions (e.g., split into multiple packages), or the like.
  • the information representative of the instructions in the machine-readable medium may be processed by processing circuitry into the instructions to implement any of the operations discussed herein.
  • deriving the instructions from the information may include: compiling (e.g., from source code, object code, etc.), interpreting, loading, organizing (e.g., dynamically or statically linking), encoding, decoding, encrypting, unencrypting, packaging, unpackaging, or otherwise manipulating the information into the instructions.
  • the derivation of the instructions may include assembly, compilation, or interpretation of the information (e.g., by the processing circuitry) to create the instructions from some intermediate or preprocessed format provided by the machine-readable medium.
  • the information when provided in multiple parts, may be combined, unpacked, and modified to create the instructions.
  • the information may be in multiple compressed source code packages (or object code, or binary executable code, etc.) on one or several remote servers.
  • the source code packages may be encrypted when in transit over a network and decrypted, uncompressed, assembled (e.g., linked) if necessary, and compiled or interpreted (e.g., into a library, stand-alone executable, etc.) at a local machine, and executed by the local machine.
  • the functional units or capabilities described in this specification may have been referred to or labeled as components or modules, in order to more particularly emphasize their implementation independence. Such components may be embodied by any number of software or hardware forms.
  • a component or module may be implemented as a hardware circuit comprising custom very-large-scale integration (VLSI) circuits or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, or other discrete components.
  • a component or module may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices, or the like.
  • Components or modules may also be implemented in software for execution by various types of processors.
  • An identified component or module of executable code may, for instance, comprise one or more physical or logical blocks of computer instructions, which may, for instance, be organized as an object, procedure, or function. Nevertheless, the executables of an identified component or module need not be physically located together but may comprise disparate instructions stored in different locations which, when joined logically together, comprise the component or module and achieve the stated purpose for the component or module.
  • a component or module of executable code may be a single instruction, or many instructions, and may even be distributed over several different code segments, among different programs, and across several memory devices or processing systems.
  • some aspects of the described process (such as code rewriting and code analysis) may take place on a different processing system (e.g., in a computer in a data center) than that in which the code is deployed (e.g., in a computer embedded in a sensor or robot).
  • operational data may be identified and illustrated herein within components or modules and may be embodied in any suitable form and organized within any suitable type of data structure.
  • the operational data may be collected as a single data set or may be distributed over different locations including over different storage devices, and may exist, at least partially, merely as electronic signals on a system or network.
  • the components or modules may be passive or active, including agents operable to perform desired functions.
  • Example 1 is a method, comprising: obtaining a first autonomous driving scenario for use in an autonomous driving simulation, the first autonomous driving scenario defined by a first plurality of scenario parameters; identifying a first risk assessment metric of the first autonomous driving scenario based on safety operations used by an autonomous driving model in the first autonomous driving scenario; determining a second plurality of scenario parameters from changes to the first plurality of scenario parameters that produce a second risk assessment metric, wherein the second risk assessment metric indicates a greater difficulty for the safety operations than the first risk assessment metric; and generating a second autonomous driving scenario from the second plurality of scenario parameters, for use in the autonomous driving simulation.
  • Example 2 the subject matter of Example 1 optionally includes testing at least a portion of the first autonomous driving scenario on an autonomous driving simulation platform, wherein the testing of the first autonomous driving scenario applies the safety operations to the first autonomous driving scenario, and wherein the first risk assessment metric is based on results of the testing of the first autonomous driving scenario; and repeating operations of testing and generating subsequent autonomous driving scenarios, during multiple rounds of testing, using operations that progressively increase a respective risk assessment metric which indicates a greater difficulty for the safety operations in the subsequent autonomous driving scenarios.
  • Example 3 the subject matter of Example 2 optionally includes producing an assessment of the autonomous driving model based on outcomes of the safety operations used by the autonomous driving model in one or more of: the first autonomous driving scenario, the second autonomous driving scenario, or the subsequent autonomous driving scenarios.
  • Example 4 the subject matter of any one or more of Examples 2-3 optionally include wherein the autonomous driving model is one of a plurality of autonomous driving models associated with an autonomous driving or advanced driver assistance system, and wherein the autonomous driving simulation platform is a test simulation computing system comprising hardware configured to evaluate the plurality of autonomous driving models.
  • Example 5 the subject matter of any one or more of Examples 1-4 optionally include wherein the first and second plurality of scenario parameters specify characteristics of a roadway, participants on the roadway, and a trajectory for each of the participants.
  • Example 6 the subject matter of any one or more of Examples 1-5 optionally include wherein the second plurality of scenario parameters used in the second autonomous driving scenario specify initial states and trajectories which differ from the first plurality of scenario parameters used in the first autonomous driving scenario.
  • Example 7 the subject matter of any one or more of Examples 1-6 optionally include wherein the first risk assessment metric is associated with a Time-to-Collision (TTC) calculation, wherein the TTC calculation produces a value within one of a plurality of risk levels, and wherein the second risk assessment metric is associated with a TTC calculation that produces a value within an equal or a greater risk level than the first risk assessment metric. (An illustrative TTC sketch appears after this list of Examples.)
  • Example 8 the subject matter of any one or more of Examples 1-7 optionally include wherein determining the second plurality of scenario parameters includes performing a grid-search for scenario parameters to be used by the second plurality of scenario parameters. (An illustrative grid-search sketch appears after this list of Examples.)
  • Example 9 the subject matter of any one or more of Examples 1-8 optionally include wherein determining the second plurality of scenario parameters includes performing probabilistic modeling to identify a search space for scenario parameters to be used by the second plurality of scenario parameters.
  • Example 10 the subject matter of any one or more of Examples 1-9 optionally include the operations further comprising: producing a difficulty assessment of the first autonomous driving scenario, the difficulty assessment based on a safety area, a section range, or cost function, for a maneuver associated with the safety operations that is performed with testing the first autonomous driving scenario; wherein generating the second autonomous driving scenario and determining the second plurality of scenario parameters are further based on the difficulty assessment. (An illustrative difficulty-scoring sketch appears after this list of Examples.)
  • Example 11 the subject matter of Example 10 optionally includes wherein the cost function relates to a cost in time and a complexity cost for the maneuver.
  • Example 12 the subject matter of any one or more of Examples 10-11 optionally include wherein the section range relates to waypoints used in the autonomous driving simulation for the maneuver.
  • Example 13 is at least one machine-readable storage medium comprising instructions that, when executed by processing circuitry of a machine, cause the processing circuitry to perform any of the methods of Examples 1 to 12.
  • Example 14 is a computing system for autonomous driving test scenario evaluation, comprising: memory configured to store data of an autonomous driving model; and processing circuitry configured to: identify a first autonomous driving scenario for use in an autonomous driving simulation, the first autonomous driving scenario defined by a first plurality of scenario parameters; identify a first risk assessment metric of the first autonomous driving scenario based on safety operations used by the autonomous driving model in the first autonomous driving scenario; determine a second plurality of scenario parameters from changes to the first plurality of scenario parameters that produce a second risk assessment metric, wherein the second risk assessment metric indicates a greater difficulty for the safety operations than the first risk assessment metric; and generate a second autonomous driving scenario from the second plurality of scenario parameters, for use in the autonomous driving simulation.
  • Example 15 the subject matter of Example 14 optionally includes the processing circuitry further configured to: test at least a portion of the first autonomous driving scenario on an autonomous driving simulation platform, wherein testing of the first autonomous driving scenario applies the safety operations to the first autonomous driving scenario, and wherein the first risk assessment metric is based on results of the testing of the first autonomous driving scenario; and repeat operations to test and generate subsequent scenario parameters and subsequent autonomous driving scenarios, during multiple rounds of testing, using operations that progressively increase a respective risk assessment metric which indicates a greater difficulty for the safety operations in the subsequent autonomous driving scenarios.
  • Example 16 the subject matter of Example 15 optionally includes the processing circuitry further configured to: produce an assessment of the autonomous driving model based on outcomes of the safety operations used by the autonomous driving model in one or more of: the first autonomous driving scenario, the second autonomous driving scenario, or the subsequent autonomous driving scenarios.
  • Example 17 the subject matter of any one or more of Examples 15-16 optionally include wherein the autonomous driving model is one of a plurality of autonomous driving models associated with an autonomous driving or advanced driver assistance system, and wherein the autonomous driving simulation platform is a test simulation computing system comprising hardware configured to evaluate the plurality of autonomous driving models.
  • Example 18 the subject matter of any one or more of Examples 14-17 optionally include wherein the first and second plurality of scenario parameters specify characteristics of a roadway, participants on the roadway, and a trajectory for each of the participants.
  • Example 19 the subject matter of any one or more of Examples 14-18 optionally include wherein the second plurality of scenario parameters used in the second autonomous driving scenario specify initial states and trajectories which differ from the first plurality of scenario parameters used in the first autonomous driving scenario.
  • Example 20 the subject matter of any one or more of Examples 14-19 optionally include wherein the first risk assessment metric is associated with a Time-to-Collision (TTC) calculation, wherein the TTC calculation produces a value within one of a plurality of risk levels, and wherein the second risk assessment metric is associated with a TTC calculation that produces a value within an equal or a greater risk level than the first risk assessment metric.
  • Example 21 the subject matter of any one or more of Examples 14-20 optionally include wherein operations to determine the second plurality of scenario parameters include operations to perform a grid-search for scenario parameters to be used by the second plurality of scenario parameters.
  • Example 22 the subject matter of any one or more of Examples 14-21 optionally include wherein operations to determine the second plurality of scenario parameters include operations to perform probabilistic modeling that identifies a search space for scenario parameters to be used by the second plurality of scenario parameters.
  • Example 23 the subject matter of any one or more of Examples 14-22 optionally include the processing circuitry further configured to: produce a difficulty assessment of the first autonomous driving scenario, the difficulty assessment based on a safety area, a section range, or cost function, for a maneuver associated with the safety operations that is performed with testing the first autonomous driving scenario; wherein operations to generate the second autonomous driving scenario and determine the second plurality of scenario parameters are further based on the difficulty assessment.
  • Example 24 the subject matter of Example 23 optionally includes wherein the cost function relates to a cost in time and a complexity cost for the maneuver.
  • Example 25 the subject matter of any one or more of Examples 23-24 optionally include wherein the section range relates to waypoints used in the autonomous driving simulation for the maneuver.
  • Example 26 is an apparatus, comprising: means for identifying a first autonomous driving scenario for use in an autonomous driving simulation, the first autonomous driving scenario defined by a first plurality of scenario parameters; means for identifying a first risk assessment metric of the first autonomous driving scenario based on safety operations used by an autonomous driving model in the first autonomous driving scenario; means for determining a second plurality of scenario parameters from changes to the first plurality of scenario parameters that produce a second risk assessment metric, wherein the second risk assessment metric indicates a greater difficulty for the safety operations than the first risk assessment metric; and means for generating a second autonomous driving scenario from the second plurality of scenario parameters, for use in the autonomous driving simulation.
  • Example 27 the subject matter of Example 26 optionally includes means for testing at least a portion of the first autonomous driving scenario on an autonomous driving simulation platform, wherein the testing of the first autonomous driving scenario applies the safety operations to the first autonomous driving scenario, and wherein the first risk assessment metric is based on results of the testing of the first autonomous driving scenario; and means for producing an assessment of the autonomous driving model based on outcomes of the safety operations used by the autonomous driving model.
  • Example 28 the subject matter of Example 27 optionally includes means for repeating operations of testing and generating subsequent autonomous driving scenarios, during multiple rounds of testing, using operations that progressively increase a respective risk assessment metric which indicates a greater difficulty for the safety operations in the subsequent autonomous driving scenarios.
  • Example 29 the subject matter of any one or more of Examples 26-28 optionally include means for defining the first plurality of scenario parameters, the first plurality of scenario parameters to specify characteristics of a roadway, participants on the roadway, and a trajectory for each of the participants.
  • Example 30 the subject matter of any one or more of Examples 26-29 optionally include wherein the second plurality of scenario parameters used in the second autonomous driving scenario specify initial states and trajectories which differ from the first plurality of scenario parameters used in the first autonomous driving scenario.
  • Example 31 the subject matter of any one or more of Examples 26-30 optionally include means for generating a Time-to-Collision (TTC) calculation, wherein the first risk assessment metric is associated with the calculation, wherein the TTC calculation produces a value within one of a plurality of risk levels, and wherein the second risk assessment metric is associated with a TTC calculation that produces a value within an equal or a greater risk level than the first risk assessment metric.
  • Example 32 the subject matter of any one or more of Examples 26-31 optionally include means for performing a grid-search for scenario parameters to be used by the second plurality of scenario parameters.
  • Example 33 the subject matter of any one or more of Examples 26-32 optionally include means for performing probabilistic modeling to identify a search space for scenario parameters to be used by the second plurality of scenario parameters.
  • Example 34 the subject matter of any one or more of Examples 26-33 optionally include means for producing a difficulty assessment of the first autonomous driving scenario, the difficulty assessment based on a safety area, a section range, or cost function, for a maneuver associated with the safety operations that is performed with testing the first autonomous driving scenario; wherein the second autonomous driving scenario and the second plurality of scenario parameters are further based on the difficulty assessment; wherein the cost function relates to a cost in time and a complexity cost for the maneuver; and wherein the section range relates to waypoints used in a simulation for the maneuver.
  • Example 35 the subject matter of any one or more of Examples 26-34 optionally include wherein the autonomous driving model is one of a plurality of autonomous driving models associated with a means for performing autonomous driving or a means for performing advanced driver assistance.
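To illustrate the Time-to-Collision (TTC) risk assessment metric recited in Examples 7, 20, 25, and 31, the following Python sketch computes a constant-velocity TTC between the ego vehicle and another scenario participant and maps it onto one of a plurality of risk levels. The function names, the numeric thresholds, and the constant-velocity assumption are illustrative assumptions only; the publication does not specify them.

    import math

    # Hypothetical risk-level thresholds in seconds; illustrative only, since the
    # publication does not give numeric boundaries for its risk levels.
    RISK_LEVELS = [
        (1.5, "high"),    # TTC below 1.5 s
        (3.0, "medium"),  # TTC below 3.0 s
        (6.0, "low"),     # TTC below 6.0 s
    ]

    def time_to_collision(ego_pos, ego_vel, other_pos, other_vel):
        """Constant-velocity TTC; returns math.inf if the participants are not closing."""
        rel_px, rel_py = other_pos[0] - ego_pos[0], other_pos[1] - ego_pos[1]
        rel_vx, rel_vy = other_vel[0] - ego_vel[0], other_vel[1] - ego_vel[1]
        distance = math.hypot(rel_px, rel_py)
        # Closing speed: component of relative velocity along the range vector.
        closing_speed = -(rel_px * rel_vx + rel_py * rel_vy) / max(distance, 1e-6)
        if closing_speed <= 0.0:
            return math.inf
        return distance / closing_speed

    def risk_level(ttc_seconds):
        """Map a TTC value onto one of a plurality of discrete risk levels."""
        for threshold, label in RISK_LEVELS:
            if ttc_seconds < threshold:
                return label
        return "negligible"

    # Ego at the origin moving at 15 m/s toward a stopped vehicle 30 m ahead:
    ttc = time_to_collision((0.0, 0.0), (15.0, 0.0), (30.0, 0.0), (0.0, 0.0))
    print(ttc, risk_level(ttc))  # 2.0 seconds -> "medium"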
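The grid-search of Examples 8 and 21 can be sketched, under assumptions, as an exhaustive sweep over perturbed scenario parameters that keeps any parameter set whose risk assessment metric exceeds the baseline scenario's metric. The parameter names, value ranges, and the toy simulate_and_score surrogate below are illustrative assumptions; in practice the score would come from running the candidate scenario on the autonomous driving simulation platform.

    from itertools import product

    # Illustrative cut-in scenario parameters; the publication does not fix these
    # names or ranges.
    PARAMETER_GRID = {
        "cut_in_distance_m": [40.0, 30.0, 20.0, 12.0],
        "cut_in_speed_mps": [8.0, 12.0, 16.0],
        "ego_speed_mps": [15.0, 20.0, 25.0],
    }

    def simulate_and_score(params):
        """Placeholder for executing the scenario and returning a risk metric
        (higher means more difficult for the safety operations)."""
        # Toy surrogate: shorter cut-in gaps and higher speeds raise the score.
        return (params["ego_speed_mps"] + params["cut_in_speed_mps"]) / params["cut_in_distance_m"]

    def grid_search(baseline_params, baseline_score, score_fn=simulate_and_score):
        """Keep every candidate parameter set scoring above the baseline scenario."""
        harder = []
        keys = sorted(PARAMETER_GRID)
        for values in product(*(PARAMETER_GRID[k] for k in keys)):
            candidate = dict(baseline_params, **dict(zip(keys, values)))
            score = score_fn(candidate)
            if score > baseline_score:
                harder.append((score, candidate))
        return sorted(harder, key=lambda item: item[0], reverse=True)

    baseline = {"cut_in_distance_m": 40.0, "cut_in_speed_mps": 8.0, "ego_speed_mps": 15.0}
    ranked = grid_search(baseline, simulate_and_score(baseline))
    print(ranked[0][1])  # highest-scoring candidate: a basis for the second scenario

The probabilistic modeling of Examples 9 and 22 could replace the exhaustive sweep above with sampling from a distribution fitted to previously observed high-risk parameter sets, narrowing the search space; that variant is not shown here.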
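Examples 10 to 12 describe a difficulty assessment derived from a safety area, a section range (expressed in waypoints), or a cost function combining a time cost and a complexity cost for a maneuver. A minimal sketch, assuming hypothetical field names and weights that are not taken from the publication, is:

    from dataclasses import dataclass

    @dataclass
    class ManeuverRecord:
        """Illustrative record of a safety maneuver observed during scenario testing."""
        duration_s: float        # time cost of the maneuver
        steering_reversals: int  # one possible proxy for complexity cost
        brake_events: int        # another complexity proxy
        waypoints_used: int      # section range, in simulation waypoints

    def difficulty_assessment(m, time_weight=1.0, complexity_weight=0.5, range_weight=0.1):
        """Combine a time cost, a complexity cost, and a section range into one score."""
        complexity_cost = m.steering_reversals + m.brake_events
        return (time_weight * m.duration_s
                + complexity_weight * complexity_cost
                + range_weight * m.waypoints_used)

    # A 3.2 s evasive maneuver with two steering reversals, one brake event,
    # spanning 18 waypoints:
    score = difficulty_assessment(ManeuverRecord(3.2, 2, 1, 18))
    print(round(score, 2))  # 3.2 + 0.5*3 + 0.1*18 = 6.5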

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

Various aspects of methods, systems, and use cases for the adaptive generation and modification of autonomous driving scenarios, for testing in an autonomous driving simulation platform, are described. In an example, an approach for autonomous driving scenario evaluation includes: identifying a first scenario for use in a simulation, the scenario being defined by a first set of scenario parameters; identifying a first risk assessment metric of the first scenario based on safety operations used by an autonomous driving model in the first scenario; determining a second (modified) set of scenario parameters from changes to the first set of scenario parameters based on a second risk assessment metric; and generating a second scenario based on the second (modified) set of scenario parameters. For instance, the second autonomous driving scenario may present a greater difficulty for the safety operations than indicated by the first risk assessment metric, enabling an increase in testing complexity.
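A minimal sketch, assuming hypothetical callables, of how the workflow summarized above could be strung together: run_on_platform stands in for executing a scenario on the simulation platform and returning its risk assessment metric, and search_harder stands in for the parameter search (for example, a grid search over perturbed scenario parameters). Nothing in this sketch is taken from the publication's actual implementation.

    def adaptive_scenario_evaluation(initial_params, run_on_platform, search_harder, rounds=5):
        """Test a scenario, measure how hard its safety operations were, then
        regenerate a harder scenario and repeat for several rounds."""
        params = initial_params
        metric = run_on_platform(params)              # first risk assessment metric
        history = [(params, metric)]
        for _ in range(rounds):
            candidate = search_harder(params, metric)     # modified scenario parameters
            candidate_metric = run_on_platform(candidate)
            if candidate_metric <= metric:                # no harder scenario found
                break
            params, metric = candidate, candidate_metric
            history.append((params, metric))              # progressively harder scenarios
        return history  # basis for an assessment of the autonomous driving model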
PCT/US2021/063814 2020-12-17 2021-12-16 Génération et évaluation adaptatives de scénarios critiques de véhicule autonome WO2022133090A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202180077564.3A CN116490858A (zh) 2020-12-17 2021-12-16 自主交通工具关键场景的自适应生成和评估

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN2020137373 2020-12-17
CNPCT/CN2020/137372 2020-12-17
CN2020137372 2020-12-17
CNPCT/CN2020/137373 2020-12-17

Publications (1)

Publication Number Publication Date
WO2022133090A1 (fr)

Family

ID=82058686

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/063814 WO2022133090A1 (fr) 2020-12-17 2021-12-16 Génération et évaluation adaptatives de scénarios critiques de véhicule autonome

Country Status (2)

Country Link
CN (1) CN116490858A (fr)
WO (1) WO2022133090A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012030659A (ja) * 2010-07-29 2012-02-16 Denso Corp 状況適合型運転支援装置
KR20170031913A (ko) * 2015-09-14 2017-03-22 주식회사 만도 운전지원장치 및 운전지원방법
WO2018147871A1 (fr) * 2017-02-10 2018-08-16 Nissan North America, Inc. Gestion opérationnelle de véhicule autonome
US20200089247A1 (en) * 2018-09-14 2020-03-19 Florian Shkurti Iterative generation of adversarial scenarios
US20200290619A1 (en) * 2019-03-14 2020-09-17 GM Global Technology Operations LLC Automated driving systems and control logic using maneuver criticality for vehicle routing and mode adaptation

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220324464A1 (en) * 2021-03-30 2022-10-13 Motional Ad Llc Selecting testing scenarios for evaluating the performance of autonomous vehicles
US11932260B2 (en) * 2021-03-30 2024-03-19 Motional Ad Llc Selecting testing scenarios for evaluating the performance of autonomous vehicles
CN117111640A (zh) * 2023-10-24 2023-11-24 中国人民解放军国防科技大学 基于风险态度自调整的多机避障策略学习方法及装置
CN117111640B (zh) * 2023-10-24 2024-01-16 中国人民解放军国防科技大学 基于风险态度自调整的多机避障策略学习方法及装置
CN117196262A (zh) * 2023-11-06 2023-12-08 中船凌久高科(武汉)有限公司 一种基于状态编码优化的测试场车辆与场景匹配调度方法
CN117196262B (zh) * 2023-11-06 2024-02-13 中船凌久高科(武汉)有限公司 一种基于状态编码优化的测试场车辆与场景匹配调度方法

Also Published As

Publication number Publication date
CN116490858A (zh) 2023-07-25

Similar Documents

Publication Publication Date Title
US11308391B2 (en) Offline combination of convolutional/deconvolutional and batch-norm layers of convolutional neural network models for autonomous driving vehicles
O'Kelly et al. Scalable end-to-end autonomous vehicle testing via rare-event simulation
US11137762B2 (en) Real time decision making for autonomous driving vehicles
US11475770B2 (en) Electronic device, warning message providing method therefor, and non-transitory computer-readable recording medium
WO2022133090A1 (fr) Génération et évaluation adaptatives de scénarios critiques de véhicule autonome
US10997729B2 (en) Real time object behavior prediction
US10671077B2 (en) System and method for full-stack verification of autonomous agents
US11465650B2 (en) Model-free reinforcement learning
US11704554B2 (en) Automated training data extraction method for dynamic models for autonomous driving vehicles
CN111613091A (zh) 利用外部驾驶员数据增强移动设备操作
Erdogan et al. Real-world maneuver extraction for autonomous vehicle validation: A comparative study
JP2022511968A (ja) 開放された車両ドアを検出するための分類器のトレーニング
Luo et al. Stealthy tracking of autonomous vehicles with cache side channels
US11810460B2 (en) Automatic generation of pedestrians in virtual simulation of roadway intersections
US11772666B2 (en) Vehicle operation safety model test system
CN113022581A (zh) 用于确定用于自主车辆的配置的方法和设备
CN112784867A (zh) 利用合成图像训练深度神经网络
CN113379654A (zh) 动态路由的块鉴别器
US20180157770A1 (en) Geometric proximity-based logging for vehicle simulation application
US11767030B1 (en) Scenario simulation execution within a truncated parameter space
EP4295341A1 (fr) Simulation d'événement rare lors d'une planification de mouvement de véhicule autonome
US11940793B1 (en) Vehicle component validation using adverse event simulation
US20220289233A1 (en) Trace graph as a surrogate for operating/driving scenarios
US11814080B2 (en) Autonomous driving evaluation using data analysis
US20240166222A1 (en) Measuring simulation realism

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21907818

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180077564.3

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21907818

Country of ref document: EP

Kind code of ref document: A1