WO2020060480A1 - System and method for generating a scenario template - Google Patents

System and method for generating a scenario template

Info

Publication number
WO2020060480A1
Authority
WO
WIPO (PCT)
Prior art keywords
scenario
under test
criteria
scenario template
vehicle under
Prior art date
Application number
PCT/SG2018/050640
Other languages
French (fr)
Inventor
Oliver Michael GRIMM
Henning HASEMANN
Intakhab Mehboob KHAN
Xiang YUANBO
Original Assignee
Sixan Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sixan Pte Ltd filed Critical Sixan Pte Ltd
Publication of WO2020060480A1 publication Critical patent/WO2020060480A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B13/00 Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion
    • G05B13/02 Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
    • G05B13/0265 Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric the criterion being a learning criterion
    • G05B13/027 Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric the criterion being a learning criterion using neural networks only
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B13/00 Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion
    • G05B13/02 Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
    • G05B13/04 Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric involving the use of models or simulators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 Simulators for teaching or training purposes
    • G09B9/02 Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/04 Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles

Definitions

  • Various embodiments of the present invention generally relate to systems, methods and computer-implemented methods, for the generation of a scenario template, and more specifically, but not exclusively, in driving simulators for the development of autonomous vehicle systems.
  • Driving simulators offer advantages such as efficient mileage data collection, road conditions dataset diversity and sensor corresponding data accuracy. They also allow an efficient, continuous and unlimited data collection in a virtual environment with relatively low operational costs, all of which have helped to speed up the development of autonomous vehicle technologies.
  • the quality of the data collected from driving simulators needs to be as realistic as possible, so that it can be used for improving the performance of the autonomous vehicle system that controls the vehicle.
  • Driving simulators are often developed using rule-based systems which require extensive manual input and configuration.
  • a computing system comprising: one or more processors;
  • a memory device coupled to the one or more processors
  • a driving simulator system stored in the memory device and configured to be executed by the one or more processors, the driving simulator system comprising instructions for:
  • each of the at least one predetermined measurement criteria corresponds to a predetermined parameter with a threshold value defining a condition for the incident event.
  • each of the at least one predetermined measurement criteria is associated with a safety feature for an advanced driver assistance function.
  • each of the at least one predetermined measurement criteria is associated with a comfort feature for an advanced driver assistance function.
  • detecting the incident event includes generating a timestamp-based signal tincident, the timestamp-based signal stored in a scenario database.
  • the incident event is defined as a near miss or a collision with the traffic agent or the road infrastructure element.
  • the at least one traffic agent is trained by a traffic agent learning system for emulating driving competencies of real-world drivers.
  • the computing system further comprises the step of assigning a complexity level to the extracted sequence of scenes, wherein the complexity level corresponds to an indication of the complexity of the incident.
  • the first scenario template is created using a formal scenario description language.
  • the computing system further comprises the step of assigning a success or fail criteria to the scenario template, wherein the success or fail criteria is determined based on one or more predetermined criteria.
  • the computing system further comprises the following steps: generating a second scenario template extracted from one or more public databases, wherein the second scenario template is distinct from the scenario template;
  • a computer-implemented method for use in a driving simulation system comprising the steps of: introducing at least one traffic agent and a plurality of road infrastructure elements for interacting with a vehicle under test, detecting an incident event between the vehicle under test and the at least one traffic agent or one of the plurality of road infrastructure elements based upon the vehicle under test exceeding at least one predetermined measurement criteria; extracting a sequence of scenes from a predetermined time period before the incident event occurred to the time period the incident event occurred;
  • each of the at least one predetermined measurement criteria corresponds to a predetermined parameter with a threshold value defining a condition for the incident event.
  • each of the at least one predetermined measurement criteria is associated with a safety feature for an advanced driver assistance function.
  • each of the at least one predetermined measurement criteria is associated with a comfort feature for an advanced driver assistance function.
  • detecting the incident event includes generating a timestamp-based signal tincident, the timestamp-based signal stored in a scenario database.
  • the incident event is defined as a near miss or a collision with the traffic agent or the road infrastructure element.
  • the at least one traffic agent is trained by a traffic agent learning system for emulating driving competencies of real-world drivers.
  • the computer-implemented method further comprises the step of assigning a complexity level to the extracted sequence of scenes, wherein the complexity level corresponds to an indication of the complexity of the incident.
  • the first scenario template is created using a formal scenario description language.
  • the computer-implemented method further comprises the step of assigning a success or fail criteria to the scenario template, wherein the success or fail criteria is determined based on one or more predetermined criteria.
  • the computer-implemented method further comprises the following steps: generating a second scenario template extracted from one or more public databases, wherein the second scenario template is distinct from the scenario template;
  • FIG. 1 depicts a block diagram of an exemplary computing system architecture to generate a scenario template according to various embodiments
  • FIG. 2 depicts a flow chart diagram of an exemplary method to generate a scenario template according to various embodiments
  • FIG. 3 depicts a diagram of exemplary measurement criteria according to various embodiments
  • FIG. 4 depicts a diagram illustrating the exemplary operating principles of the incident detector and extractor modules according to various embodiments
  • FIG. 5 depicts a flow chart diagram of an exemplary method to generate scenario templates from public databases according to various embodiments
  • FIG. 6 depicts a flow chart diagram of an exemplary method to test scenario templates according to various embodiments
  • FIG. 7a depicts diagrams of exemplary scenario templates illustrating the interaction between a traffic agent and a vehicle under test (VUT) according to various embodiments;
  • FIG. 7b depicts a diagram of exemplary scenario templates that are fused according to various embodiments.
  • FIG. 8 illustrates the relationship between traffic agents, vehicle under test, conditions and fused scenario states.
  • the term “configured” herein may be understood as in connection with systems and computer program components.
  • for a system of one or more computers to be configured to perform particular operations or actions means that software, firmware, hardware, or a combination of them is installed on the system that in operation causes the system to perform the operations or actions.
  • for one or more computer programs to be configured to perform particular operations or actions means that the one or more programs include instructions that, when executed by a data processing apparatus, cause the apparatus to perform the operations or actions.
  • the present disclosure is directed to systems and methods that make use of computer hardware and software to automatically generate a scenario template of a vehicle under test in a driving simulator or a simulation environment.
  • Driving simulators provide significant advantages in speeding up the development of autonomous vehicle systems.
  • the quality of data collected from driving simulators needs to be as realistic as possible, so that it can be used for improving the performance of the autonomous vehicle system that controls the vehicle.
  • driving simulators are often developed using rule-based systems which require extensive manual input and configuration; this restricts the ability to capture potentially limitless driving scenarios and does not provide the complexity and depth of real-world urban driving conditions.
  • the present disclosure describes a unique approach to automatically create and store a scenario template with multiple traffic agents, either untrained or trained by machine learning algorithms or techniques.
  • multiple traffic agents can cooperate to create multiparty scenarios.
  • trained traffic agents and/or traffic agents trained for a specific functionality can be introduced into the 3D simulation environment.
  • Each scenario template that is automatically created undergoes further processing upon occurrence of a risky traffic scenario or incident and is thereafter stored into a scenario database.
  • the stored scenario template can be replayed for a vehicle under test at any time and can be transposed to similar locations in the 3D simulation environment and replayed under different weather conditions.
  • Once a virtual traffic agent has been trained in a way that it replicates human driving behavior for a predetermined geographical area, the goal is to inject one or more trained virtual traffic agents into a simulation environment representing the predetermined geographical area where they can interact with, cooperate with and challenge an autonomous vehicle system controlling an autonomous vehicle under test.
  • the overall goal is to test the limitations and weaknesses of the autonomous vehicle system, especially in adverse and dangerous traffic scenarios that are attributed to assertive or aggressive driving behaviors.
  • the disclosed systems and methods of generating scenario templates have a technical effect and benefit of providing an improvement to driving simulation systems and in turn the performance of autonomous vehicle systems.
  • driving simulation systems can avoid the rule-based system or hand-crafted rule system which may be less effective and flexible for decisions made by driving simulation systems for the testing of autonomous vehicle systems.
  • Traffic agent learning systems significantly reduce the hardware resources and human resources that are required for training, evaluating and testing autonomous vehicle systems.
  • Another advantage of using driving simulators or simulation environments is the prevention of damages, accidents and loss of lives during the test process. Risky scenarios that would potentially cause damages, accidents and in the worst case loss of lives in the real-world can be tested safely in a simulation environment.
  • Various embodiments are provided for systems, and various embodiments are provided for methods. It will be understood that basic properties of the systems also hold for the methods and vice versa. Therefore, for sake of brevity, duplicate description of such properties may be omitted.
  • FIG. 1 illustrates a block diagram of an exemplary architecture of a driving simulation system 10 to generate a scenario template according to various embodiments.
  • the driving simulation system 10 is configured to implement simulated missions of an autonomous vehicle and includes a server 11 that communicates with one or more client devices 12 via a communications network (not shown) which operably communicates with an autonomous vehicle system 110 that in turn controls the operational controllers of an autonomous vehicle (not shown).
  • the driving simulation system 10 is configured to test the autonomous operation of the autonomous vehicle in a simulated manner.
  • the client device 12 may comprise a personal computer, a portable computing device such as a laptop, a television, a mobile phone, or any other appropriate storage and/or communication device to exchange data via a web browser and/or communications network.
  • the driving simulation system 10 includes a central processing unit (CPU) or a processor 15 which executes instructions contained in programs such as a driving simulator system and stored in storage devices (not shown).
  • the processor 15 may provide the central processing unit (CPU) functions of a computing device on one or more integrated circuits.
  • processor broadly refers to and is not limited to a single or multi-core general purpose processor, a special purpose processor, a conventional processor, a graphical processing unit, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, one or more application-specific integrated circuits (ASICs), one or more field-programmable gate array circuits (FPGA), any other type of integrated circuit, a system on a chip (SOC), and/or a state machine.
  • one or more client devices 12 may exchange information via any communication network, such as a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), a proprietary network, and/or internet protocol (IP) network such as the Internet, an intranet or an extranet.
  • Each client device 12, module or component within the system may be connected over a network or directly to each other.
  • the terms ‘network’, ‘computer network’ and ‘online’ may be used interchangeably and do not imply a particular network embodiment.
  • any type of network may be used to implement the online or computer networked embodiment of the present invention.
  • the network may be maintained by a server or a combination of servers or the network may be serverless.
  • any type of protocol may be used, for example HTTP, FTP, ICMP, UDP, WAP, SIP, H.323, NDMP or TCP/IP.
  • the driving simulation system 10 includes a memory 16 for storing data and software instructions which are executed by the processor 15 and may control the operation of various aspects of the computing system 10.
  • the memory 16 is configured for storing data associated with the simulation of the autonomous vehicle. It can also be configured to store a variety of other data and data files, such as logs of simulated missions of the autonomous vehicle.
  • the memory 16 can include one or more databases 101 and image processing software, as well as a driving simulation system as described herein, which further includes a neural network or a deep neural network.
  • the memory 16 used in the embodiments may include a volatile memory, for example a DRAM (Dynamic Random Access Memory) or a non-volatile memory, for example a PROM (Programmable Read Only Memory), an EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), or a flash memory, e.g., a floating gate memory, a charge trapping memory, an MRAM (Magneto-Resistive Random Access Memory) or a PCRAM (Phase Change Random Access Memory).
  • the computing system also includes a solid state drive (SSD) 13 that stores data and software instructions to be executed by the processor 15.
  • a solid state drive 13 is a data storage device that uses a NAND-based flash memory to store data and is a form of non-volatile memory. Once data is written to the flash memory, and if no power is supplied to the flash memory, the data is still retained in the flash memory.
  • the memory 16 and/or the SSD 13 include one or more databases 101 for the storage of at least one traffic agent database 120, a measurement catalog database 130 and a scenario database 140.
  • the traffic agent database 120 stores traffic agents that are trained by a traffic agent learning system which produces traffic agents capable of emulating driving competencies or driving behavior of real-world drivers.
  • the traffic agent database 120 is configured to receive inputs from a traffic service module 153.
  • Traffic agents can be trained based on geographical locations, environmental conditions, defensive and/or aggressive driving behaviors.
  • the traffic agents can be trained by the traffic agent learning system as disclosed in the priority patent application.
  • the measurement catalog database 130 stores a set of predetermined parameters with threshold values or predefined qualitative criteria that define trigger conditions for likely or imminent critical events occurring during active simulation of a vehicle under test. These predetermined parameters are based on a domain knowledge associated with developing safety or comfort features for advanced driver assistance functions, which will be described in more detail below.
  • the scenario database 140 stores sensor data streams and/or scenarios recorded or extracted from real-world drivers or generated from driving simulation systems.
  • the real-world vehicle scenarios can be acquired from various public, government or domain knowledge databases.
  • the scenario database 140 is configured to provide inputs to the scenario executor 171 to execute scenario templates stored in the scenario database for the testing of the vehicle under test in the driving simulation system.
  • the test coverage monitor module 190 is configured to provide and receive inputs from the scenario database 140 and ensures that all the scenario templates are executed and successfully passed by the driving simulation system 10.
  • the driving simulation system 10 includes a user interface (not shown) on the client device 12 that is configured to facilitate control inputs to provide control commands to the autonomous vehicle and to facilitate simulation inputs associated with simulating the operation of the autonomous vehicle.
  • the user interface is configured to monitor the simulation of the autonomous vehicle, to provide inputs to the autonomous vehicle system 110 during an associated simulation mission.
  • the driving simulation system 10 also includes a simulation controller 170 configured to provide simulation signals to and receive simulation feedback signals from the external component or the autonomous vehicle system 110.
  • the simulation controller 170 provides the ‘world information’ required for the autonomous vehicle system to drive.
  • The ‘world information’ comprises sensor signals from the camera, lidar and/or radar.
  • the simulation controller 170 is configured to mimic the controller of an autonomous vehicle system which controls one or more aspects of the vehicle operation based on automated driving commands received from the processor. For example, the simulation controller 170 can provide simulation signals representative of one or more actuator systems or one or more indicator systems in a vehicle.
  • the simulation controller 170 operably communicates with the incident detector module 150, the incident extractor module 160 and the scenario executor 171 and is configured to provide and to receive inputs from each of these modules which in turn provide simulation signals to the autonomous vehicle system 110. Each of the modules will be described in more detail below.
  • FIG. 2 illustrates a flow diagram of an exemplary method to generate a scenario template according to embodiments of the present disclosure.
  • one or more trained traffic agents from the traffic agent database 120 are introduced into the driving simulation system 10 which includes a simulation environment.
  • the trained traffic agents fall into two categories.
  • the first category of traffic agents in step 200 includes those that have been trained by the traffic agent learning system to emulate the driving competencies or driving behavior of real-world drivers which are further categorized according to geographical locations, environmental conditions, defensive and/or aggressive driving behavior.
  • Traffic agents in the second category in step 210 are those that have been trained specifically for a functional scenario.
  • An example of a functional scenario is that a vehicle is performing a cut-in to another vehicle.
  • either traffic agents in step 200 or 210 or both can be introduced in the simulation environment.
  • the simulation environment can be user-configured and can include any of a variety of geographical locations that corresponds to real locations.
  • step 220 on commencement of the test, the vehicle under test, the traffic agents and road infrastructure elements are allowed to interact with one another in the simulation environment.
  • the one or more traffic agents introduced in step 200 proceed to navigate through the simulation environment with the objective to execute driving behavior that each has been trained for, whether it is a defensive or an aggressive driving style respectively or an American or Chinese driving style. For example, how a vehicle is driven in America may differ from a vehicle in China due to differing driving cultures and unique road rules.
  • Road infrastructure elements are included in the simulation environment to provide more complexity, depth and realism to the generated scenario templates.
  • Road infrastructure elements include stationary and movable road elements and road users. Whereas stationary road elements include for example traffic signs, poles, road barriers, buildings, monuments, structures, natural objects, vegetation and/or a terrain surface, movable road elements include ground vehicles, pedestrians, animals, and the like.
  • an incident is defined as any behavior of the ego or surrounding vehicles that is considered interesting to the user. This can include dangerous situations and situations where the vehicle under test may trigger a hard brake or other extreme controls, even if they did not cause immediate danger to the other surrounding vehicles. For example, a traffic agent with an aggressive style of driving may cause a risky scenario in which an incident with the vehicle under test is likely to occur or actually occurs.
  • a set of measurement criteria which was previously configured by the user is applied to the vehicle under test during the active simulation.
  • the set of measurement criteria is comprised of one or more predetermined parameters with threshold values or predefined qualitative criteria that define trigger conditions for imminent critical events or accidents. These predetermined parameters are based on a domain knowledge associated with developing safety or comfort features for advanced driver assistance functions.
  • the measurement criteria can be predetermined parameters associated with safety features such as speed of the vehicle under test or safety distance of other objects to the vehicle under test.
  • the predetermined parameters can be associated with the type of acceleration and deceleration of the vehicle under test: lateral deceleration, longitudinal deceleration, lateral acceleration and longitudinal acceleration.
  • the set of measurement criteria is stored in the measurement catalog database and operably communicates with the incident detector module in the driving simulation system.
  • One or more measurement criteria may be applied to the vehicle under test simultaneously during active simulation. The following example shall illustrate this.
  • the incident detector module is always active and applies the measurement criteria retrieved from the measurement catalog database. Should the vehicle under test exceed the road speed limit or breach the safety distance to another vehicle that was set as measurement criteria, the incident detector module is activated to generate a time-stamp based signal of an incident event that is provided by means of a clock (not shown).
  • the vehicle under test may come very close to colliding with or actually collides with the traffic agent.
  • the incident detector module is activated to generate a time-stamp based signal provided by the clock at time tincident to indicate the time the incident event occurred.
  • the time stamp tincident will be logged and stored in the scenario database.
  • an incident event is defined as a near miss or a collision with a traffic agent or a road infrastructure element.
  • the incident detector module is configured to detect the incident event in real-time or near real-time. This module is configured to detect the incident event based on information conveyed by output signals which have been generated by the simulation controller or visual output data collected by sensors on the vehicle under test in the simulation environment.
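  • By way of illustration only, the threshold check performed by the incident detector module could be sketched as follows; the criterion names, frame field names and threshold values are assumptions for this sketch and are not taken from the disclosure:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class MeasurementCriterion:
    name: str                                 # e.g. "speed_limit" or "safety_distance"
    threshold: float                          # threshold value from the measurement catalog
    measure: Callable[[dict], float]          # extracts the measured value from a simulation frame
    exceeds: Callable[[float, float], bool]   # condition defining the incident trigger

class IncidentDetector:
    """Stays active during simulation and raises a timestamped incident signal."""

    def __init__(self, criteria):
        self.criteria = criteria

    def check(self, frame: dict, sim_time: float) -> Optional[dict]:
        for c in self.criteria:
            value = c.measure(frame)
            if c.exceeds(value, c.threshold):
                # timestamp-based signal tincident, to be logged in the scenario database
                return {"t_incident": sim_time, "criterion": c.name, "value": value}
        return None

# Illustrative criteria: road speed limit and minimum safety distance to the nearest agent
criteria = [
    MeasurementCriterion("speed_limit", 50.0,
                         lambda f: f["vut_speed_kmh"], lambda v, t: v > t),
    MeasurementCriterion("safety_distance", 2.0,
                         lambda f: f["gap_to_nearest_agent_m"], lambda v, t: v < t),
]
detector = IncidentDetector(criteria)
```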
  • the incident extractor module is activated and goes back in time by a predetermined time period denoted as temerging.
  • the incident extractor module is configured to extract a recording of the sequence of scenes to capture the near miss or collision with the traffic agent for the time period from temerging to tincident.
  • the sequence of scenes includes the environment information and the position of the vehicle under test and the traffic agent and/or the road infrastructure elements. This extraction process is based on information conveyed by output signals which are generated by the simulation controller or visual output data collected by sensors on the vehicle under test in the simulation environment.
  • the incident extractor module extracts a record of the sequence of scenes for the relevant time period
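  • A minimal sketch of this extraction step, assuming simulation frames (environment information, VUT and agent positions) are buffered continuously and that temerging marks a fixed look-back window; the buffer length and frame rate below are illustrative:

```python
from collections import deque

class IncidentExtractor:
    """Buffers recent scenes and extracts the window from temerging to tincident."""

    def __init__(self, window_seconds: float = 30.0, frame_rate_hz: float = 10.0):
        self.window = window_seconds
        self.frames = deque(maxlen=int(window_seconds * frame_rate_hz))

    def record(self, sim_time: float, frame: dict) -> None:
        self.frames.append((sim_time, frame))

    def extract(self, t_incident: float) -> list:
        # sequence of scenes from t_emerging = t_incident - window up to t_incident
        t_emerging = t_incident - self.window
        return [(t, f) for (t, f) in self.frames if t_emerging <= t <= t_incident]
```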
  • the extracted record of sequence of events undergoes further processing to be evaluated, categorized, and classified.
  • the extracted record may be assigned a complexity level that indicates the complexity of the recorded near miss or incident.
  • the vehicle under test may be challenged by scenarios that have a low complexity level, such as a traffic agent braking hard in front of the vehicle under test.
  • the vehicle under test will subsequently be challenged by scenarios that have a higher complexity level.
  • the extracted record will be automatically assigned a complexity level, in which a lower number corresponds to a scenario of lower complexity and a higher number to a scenario of higher complexity.
  • the extracted record may also be evaluated to assess driving behavior risks using parameters such as the time to collision and the distance of vehicles. These parameters can directly assess the level of driving behavior risk of the extracted record of sequence of events.
  • the extracted record of sequence of events may be classified according to types of vehicle maneuvers. The classification of the extracted record offers the advantage of building special scenario clusters. For example, one cluster may contain all cases where a traffic agent performs a cut-in maneuver from the front right side of the vehicle under test.
  • the extracted record may also be categorized according to the different types of traffic agents involved in the incident, such as cars, motorbikes, bicycles, etc., road infrastructure elements involved such as bridges, intersections, tunnels, hilly drive, steep decline, curvy road, etc., and sensor disturbances including occlusion of field of view, sun glare for example.
  • the objective of categorizing the extracted records of sequences of events is to have them stored in one multi-dimensional, filterable database in which scenarios can be stored and retrieved easily.
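  • One possible record layout for such a multi-dimensional, filterable scenario database is sketched below; the field names and category values are assumptions made purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class ScenarioRecord:
    scenes: list                                              # extracted sequence of scenes
    maneuver: str                                             # e.g. "cut_in_front_right"
    agent_types: list = field(default_factory=list)           # e.g. ["car", "motorbike"]
    infrastructure: list = field(default_factory=list)        # e.g. ["intersection", "tunnel"]
    sensor_disturbances: list = field(default_factory=list)   # e.g. ["occlusion", "sun_glare"]
    complexity_level: int = 1                                  # higher number = more complex scenario
    risk_level: float = 0.0                                    # e.g. derived from time to collision

def filter_records(records, **conditions):
    """Simple multi-dimensional filter, e.g. filter_records(db, maneuver="cut_in_front_right")."""
    def matches(record, key, value):
        attr = getattr(record, key)
        return value in attr if isinstance(attr, list) else attr == value
    return [r for r in records if all(matches(r, k, v) for k, v in conditions.items())]
```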
  • the processor 15 executes automatically each of the processing steps which are classification, categorization, and assignment of complexity level and level of risk.
  • the driving simulation system 10 is configured to create a scenario template corresponding to the extracted record of scenes of the incident event, which is then stored in a scenario database 140.
  • the scenario templates are created using a domain specific language or a formal scenario description language (SDL) which enables a scenario to be enumerated and to be described at a high level so that only useful scenarios are instantiated or have the ability to be repeated deterministically with different software or hardware versions of the autonomous vehicle system under test.
  • the scenarios can be transposed to different geographical locations and repeated under varying weather conditions.
  • the scenario templates can also be modified in a way that the parameters of the scenario’s participants’ movements are changed. This allows the template to be instantiated with a multitude of modified parameters, with the possibility to lower or increase the levels of complexity or risk.
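  • The disclosure uses a formal scenario description language for this purpose; purely to illustrate the idea of a transposable, parameterizable template, a simplified structure might look as follows (the structure and parameter names are assumptions, not the actual SDL):

```python
from dataclasses import dataclass

@dataclass
class ParticipantSpec:
    role: str               # e.g. "traffic_agent" or "vut"
    start_lane: str         # e.g. "left", "center", "right"
    maneuver: str           # e.g. "cut_in"
    trigger_gap_m: float    # parameter that can be varied to raise or lower complexity/risk

@dataclass
class ScenarioTemplate:
    name: str
    participants: list
    location: str = "generic_highway"   # transposable to other geographical locations
    weather: str = "clear"              # repeatable under varying weather conditions

    def instantiate(self, location=None, weather=None, trigger_gap_m=None):
        """Create a concrete, deterministically repeatable scenario from the template."""
        participants = [
            ParticipantSpec(p.role, p.start_lane, p.maneuver,
                            trigger_gap_m if trigger_gap_m is not None else p.trigger_gap_m)
            for p in self.participants
        ]
        return ScenarioTemplate(self.name, participants,
                                location or self.location, weather or self.weather)
```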
  • one or more success and/or fail criteria are assigned to the specific scenario template.
  • Users of the driving simulation system 10 may have differing definitions on determining if a scenario template is considered a success or a failure.
  • Car manufacturers, for example, or organizations that establish or enforce car safety standards such as the New Car Assessment Program (NCAP) may have specific criteria for how an accident or incorrect driving behavior is to be defined.
  • predetermined criteria are defined so that each scenario template will be assigned accordingly based on the predetermined criteria. Examples of predetermined criteria are safety distance to another car and maximum speed at point of collision or time to collision of vehicles. Predetermined criteria may also be more specialized for scenarios that are more complicated.
  • a success criteria is assigned if the following conditions are achieved: (i) at least 2 meters of drivable space to the left of the vehicle under test, (ii) the vehicle under test is stopped (or its speed is less than or equal to 1 km/h), and (iii) the distance from the vehicle under test to the vehicle in front is less than 0.5 m.
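  • A sketch of the example success criteria listed above, assuming the final simulation state is available as a simple record (the field names are hypothetical):

```python
def is_success(state: dict) -> bool:
    return (state["drivable_space_left_m"] >= 2.0         # (i) at least 2 m drivable space to the left
            and state["vut_speed_kmh"] <= 1.0              # (ii) vehicle under test has stopped
            and state["gap_to_front_vehicle_m"] < 0.5)     # (iii) close to the vehicle in front
```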
  • the scenario template is stored in a scenario database 140.
  • FIG. 4 illustrates the workings of the incident detector module and the incident extractor module.
  • the vehicle under test may be exposed to scenarios in which an incident event may occur. For example, a traffic agent with an aggressive style of driving may cause a risky scenario in which an incident with the vehicle under test is likely to occur or occurs.
  • One or more measurement criteria are configured by the user and are applied by the incident detector module to the vehicle under test during active simulation.
  • the set of measurement criteria is one or more predetermined parameters with threshold values or predefined qualitative criteria that define trigger conditions for an incident event occurring during active simulation of a vehicle under test.
  • the incident detector module is activated to generate a time stamp-based signal of an incident event, provided by a clock (not shown), in which the time stamp tincident will be logged in the scenario database to be associated with the time the incident event occurs.
  • the incident detector module is configured to detect the incident event in real-time or near real-time.
  • an incident event is defined as a near miss or a collision between the vehicle under test and a traffic agent and/or a road infrastructure element.
  • the incident detector module is configured to detect the imminent incident based on information conveyed by output signals generated by the simulation controller or visual output data collected by sensors on the vehicle under test in the simulation environment.
  • Upon occurrence of the incident event, the incident extractor module is activated to extract a recording of the sequence of scenes prior to the time the incident event occurred, going back by a predetermined time period denoted as temerging.
  • the incident extractor module is configured to extract a recording of the sequence of scenes that captures scenes prior to the near miss or collision with the traffic agent for the time period from temerging to tincident.
  • the sequence of scenes includes the environment information and the position of the vehicle under test and the traffic agent and/or the road infrastructure element. This extraction process is based on information conveyed by output signals which are generated by the simulation controller or visual output data collected by sensors on the vehicle under test in the simulation environment.
  • the extracted recording of the sequence of scenes is subsequently processed as mentioned beforehand.
  • FIG. 5 illustrates a flow chart diagram of an exemplary method to generate scenario templates whose data is derived from a plurality of public databases.
  • the databases comprise data that is obtained from real vehicles in the real world such as vehicles participating in normal public road traffic, which may or may not be autonomously controlled or controlled by human drivers.
  • the database comprising NCAP test cases for advanced driver assistance system (ADAS) functions may comprise test cases which include data obtained from autonomously controlled vehicles.
  • a domain knowledge database may contain test cases that comprise a combination of autonomous and non-autonomously controlled vehicles which may be obtained from car manufacturers or other driving simulation systems.
  • the driving simulation system 10 is configured to create a scenario template corresponding to the extracted data from the plurality of public databases.
  • the scenario templates are created using a domain specific language or a formal scenario description language (SDL) which enables a scenario to be enumerated and to be described at a high level so that only useful scenarios are instantiated or have the ability to be repeated deterministically with different software or hardware versions of the autonomous vehicle system under test.
  • the scenarios can be transposed to different geographical locations and repeated under varying weather conditions.
  • the scenario templates can also be modified in a way that the parameters of the scenario’s participants’ movements are modified. This allows the template to be instantiated with a multitude of modified parameters.
  • a success or fail label is assigned to the specific scenario template, as was done for the extracted record of the sequence of scenes for the vehicle under test after its processing into a scenario template.
  • Users of the driving simulation system 10 may have differing definitions on determining if the outcome of an executed scenario is considered a success or a failure.
  • Car manufacturers, for example, or organizations that establish or enforce car safety standards such as the NCAP may have specific criteria for how an accident or incorrect driving behavior is to be defined.
  • predetermined criteria are defined so that each scenario template will be assigned accordingly based on the predetermined criteria. For example, a predetermined criteria can be the safety distance to another car, the maximum speed at the point of collision, or the time to collision of vehicles.
  • predetermined criteria may also be more specialized for scenarios that are more complicated and the attribution of a success or fail criteria can include more than one predetermined criteria.
  • FIGS. 2 and 5 illustrate two different ways of populating the scenario database 140.
  • the flow diagram in FIG. 2 describes the method of generating scenario templates from the interaction of traffic agents, road infrastructure elements and a vehicle under test in a simulation environment
  • FIG. 5 describes the method of generating scenario templates from extracting scenario data from public databases and generating scenario templates by converting the scenario data into scenario templates using a scenario description language.
  • Generating scenario templates in a common language using both methods and storing them in the scenario database provides the benefit of numerous scenario templates created by the combination of specially trained traffic agents that create challenging scenarios (as provided in FIG. 2) and scenario data from public databases. This provides an additional layer of complexity and depth to the realism of the generated scenario templates.
  • the driving simulation system 10 includes a data analytics module 400 of the scenario templates which are stored in the scenario database.
  • the data analytics module 400 provides an overview of the coverage of the scenario templates and also highlights the scenarios that are not covered.
  • the data analytics module 400 can provide statistics and numbers on the various categories of scenarios.
  • the categories of scenario templates may cover one or more of the following areas: geometry dependent driving behavior, interaction with other road users, interaction with road infrastructure elements, sensor degradation, or behavioral competencies.
  • the data analytics module will also identify the types of scenario templates that are not yet present in the scenario database. This helps the user to identify scenario templates that have not been covered.
  • Uncovered scenario templates can then be obtained either through public databases or through using trained traffic agents to simulate the uncovered scenarios.
  • the data analytics module 400 also evaluates the risk of the scenario templates.
  • the scenario risk is a measure of how high the chance is that an accident (i.e. a collision between the vehicle under test (VUT) and a traffic agent) might occur. In general, it is assumed that this scenario risk is of interest for classification of scenarios independently of the driving behavior risks assigned for scenario templates as mentioned beforehand. Scenario risk can be assessed in a qualitative way, which however is not well-suited for automation. Several measures that quantify risk more objectively exist, such as Time-To-Collision value computation, from which various derivations exist that account for arbitrary motion or individual vs. collective risk.
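  • As one simple quantitative risk measure, a constant-velocity Time-To-Collision estimate for a lead-vehicle situation could be computed as below; this is a minimal sketch, and more elaborate derivations exist for arbitrary motion:

```python
def time_to_collision(gap_m: float, vut_speed_mps: float, lead_speed_mps: float) -> float:
    closing_speed = vut_speed_mps - lead_speed_mps
    if closing_speed <= 0.0:
        return float("inf")   # not closing in: no collision predicted
    return gap_m / closing_speed

# Example: 20 m gap, VUT at 15 m/s, lead vehicle at 10 m/s -> TTC = 4 s
assert abs(time_to_collision(20.0, 15.0, 10.0) - 4.0) < 1e-9
```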
  • the complexity of a scenario is a measure of the computational effort, necessary reaction times and planning capabilities a VUT must possess. This measure tends to be related to risk, as situations that are complex can often be risky. Counterexamples for this relationship can however be found. Complexity can be defined from various perspectives: an actor-centric complexity metric would take into account the number and maneuvers of all the involved actors and possibly also the road geometry. Complexity would then be defined as high for many existing actors with complex conditions / motion plans and/or a complex or winding road geometry.
  • High jerk values in a VUT trajectory indicate sudden changes in acceleration and thus execution of difficult driving maneuvers.
  • This approach can be detailed to incorporate more differentiated features such as jerk, acceleration, speed and distance, separated with respect to whether they are observed along the driving direction (longitudinal) or perpendicular to it (lateral).
  • a useful weighting of these features can be found with expert knowledge or based on existing statistical methods.
  • To compensate for freedom in the behavior of the VUT, several runs can be sampled from the scenario with varying behavior, aggregating the results to approximate the total complexity for all possible scenario runs. After the (fully automated) computation of risk and complexity values for a number of scenarios, a clustering algorithm can automatically identify a manageable number of groups of risk/complexity combinations.
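  • A VUT-centric complexity score along these lines might be sketched as a weighted combination of longitudinal and lateral features, averaged over several sampled runs; the feature names and weights below are placeholders to be set by expert knowledge or statistical methods:

```python
def run_complexity(features: dict, weights: dict) -> float:
    """Weighted sum of per-run features such as jerk, acceleration, speed and inverse distance."""
    return sum(weights[name] * abs(value) for name, value in features.items())

def scenario_complexity(sampled_runs: list, weights: dict) -> float:
    """Aggregate over several runs sampled with varying VUT behavior."""
    return sum(run_complexity(run, weights) for run in sampled_runs) / len(sampled_runs)

weights = {"jerk_long": 1.0, "jerk_lat": 1.5, "accel_long": 0.5,
           "accel_lat": 0.8, "speed": 0.2, "inv_min_distance": 1.2}
```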
  • the driving simulation system 10 also includes a scenario fusion module 410 which allows two different scenario templates to be combined. Combining two different scenario templates enhances the complexity and the coverage of the scenario templates.
  • the scenario fusion module 410 is configured to merge two or more scenario templates extracted from the scenario database into a more complex scenario that combines their properties.
  • a fused scenario template is at least as challenging to a vehicle under test as each of its individual component scenario templates.
  • the process of merging two or more scenario templates comprises the following steps: (i) shallow merging, (ii) condition graph merging, (iii) computation of implicit conditions and (iv) offset adjustment.
  • the scenario database 140 is populated with scenario templates generated from endurance testing of a vehicle under test in a simulation environment, scenario templates generated from public databases (for example, accident databases, expert or domain knowledge database, or NCAP test protocols) and scenario templates that are generated as a result of fusing two or more scenario templates.
  • the scenario templates generated from fusing two or more scenario templates may be obtained from the combination of scenario templates generated from endurance testing of the vehicle under test in a simulation environment and scenario templates generated from public databases or on its own.
  • each of the scenario templates can be executed by the vehicle under test in the simulation environment for the purpose of testing the autonomous vehicle system at step 420.
  • the driving simulation system 10 will monitor the testing progress by storing the scenario outcomes with attributes passed or failed.
  • Each scenario template is accompanied by a success or fail criteria. If the outcome of an executed scenario template is assigned as a fail, the scenario is not passed. An example of a fail criteria is when a collision occurs.
  • the driving simulation system 10 also identifies gaps in the test coverage of the scenario templates by identifying scenario types that have not yet been executed successfully. By doing so, the weaknesses of the vehicle under test can be emphasized and worked on.
  • FIG. 7a illustrates diagrams of exemplary scenario templates illustrating the interaction between a traffic agent and a vehicle under test according to various embodiments.
  • the figure on the left illustrates a scenario template of a traffic agent A1 attempting to cut into the lane of a vehicle under test (VUT) from the left.
  • the figure on the right illustrates a scenario template with a traffic agent A2 attempting to cut into the lane of a vehicle under test from the right.
  • Both scenario templates are obtained from the scenario database 140.
  • FIG. 7b illustrates an example of a fused scenario template obtained as a result of combining both scenarios as depicted in the two diagrams of FIG. 7a.
  • FIG. 8 illustrates the relationship between traffic agents A1 and A2, vehicle under test, conditions and fused scenario states. Considering the component scenarios individually, each would be structured in the following manner:
  • Traffic agent (A1 or A2) is on its lane (left/right of VUT);
  • FIG. 8 exemplifies how this set of states and conditions must change when two scenarios are fused: an order must be defined for states and conditions and there must be a state that connects the scenarios (here: the state after one vehicle has changed lane but before the other has, so in this particular case the last state of one scenario and the first of the other). To honor the implicit conditions (see above), new conditions might need to be created, in particular for guarding these connecting state(s). If that were not the case, the individual conditions for the two component scenarios might cause both agents to change to the center lane at the same time and cause an (unwanted) collision.
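  • Purely as an illustration of the fused condition graph described above, the connecting state can be guarded by an additional condition so that A1 and A2 do not change to the center lane at the same time; the state and condition names below are hypothetical:

```python
fused_scenario = {
    "states": ["A1_and_A2_on_side_lanes", "A1_on_center_lane", "A1_and_A2_on_center_lane"],
    "transitions": [
        {"from": "A1_and_A2_on_side_lanes", "to": "A1_on_center_lane",
         "conditions": ["gap_to_vut_below_threshold(A1)"]},
        {"from": "A1_on_center_lane", "to": "A1_and_A2_on_center_lane",
         # guard added during fusion: A2 may only start once A1 has completed its lane change
         "conditions": ["gap_to_vut_below_threshold(A2)", "lane_change_completed(A1)"]},
    ],
}
```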

Abstract

The present invention discloses a computing system comprising one or more processors, a memory device coupled to the one or more processors, and a driving simulator system stored in the memory device and configured to be executed by the one or more processors. The driving simulator system comprises instructions for introducing at least one traffic agent and a plurality of road infrastructure elements for interacting with a vehicle under test, detecting an incident event between the vehicle under test and the at least one traffic agent or one of the plurality of road infrastructure elements based on the vehicle under test exceeding at least one predetermined measurement criteria, extracting a sequence of scenes from a predetermined time period before the incident event occurred to the time period the incident event occurred, and creating a first scenario template based on the extracted sequence of scenes for storing into a scenario database.

Description

SYSTEM AND METHOD FOR GENERATING A SCENARIO TEMPLATE
Cross-Reference to Related Applications
[0001] This application claims the priority of PCT Application No. PCT/SG2018/050477 filed on 18 September 2018, the entirety of which is hereby incorporated by reference.
Technical Field
[0002] Various embodiments of the present invention generally relate to systems, methods and computer-implemented methods, for the generation of a scenario template, and more specifically, but not exclusively, in driving simulators for the development of autonomous vehicle systems.
Background
[0003] Autonomous vehicle systems are tested and validated under real-world conditions in which a physical vehicle controlled by the autonomous vehicle system drives on physical roads. Performing such testing might be difficult, expensive and at times even dangerous to other road users. There are limitations to testing an autonomous vehicle in a real-world environment since an extremely large number of driving hours may need to be accumulated in order to properly train and evaluate autonomous vehicle systems. It has been recommended that autonomous vehicles should log 11 billion miles of road test data to reach an acceptable safety threshold.
[0004] With the emergence of autonomous vehicle technologies, there is a need for multiple and diversified eco-systems to train, evaluate and validate autonomous vehicle systems that control autonomous vehicles. One developing area to test and validate autonomous vehicle systems involves the development and deployment of driving simulators. Driving simulators offer advantages such as efficient mileage data collection, road conditions dataset diversity and sensor corresponding data accuracy. They also allow an efficient, continuous and unlimited data collection in a virtual environment with relatively low operational costs, all of which have helped to speed up the development of autonomous vehicle technologies. However, the quality of the data collected from driving simulators needs to be as realistic as possible, so that it can be used for improving the performance of the autonomous vehicle system that controls the vehicle. Driving simulators are often developed using rule-based systems which require extensive manual input and configuration. Extensive manual input and configuration restrict the ability of capturing potentially limitless driving scenarios and do not reflect the complexity and depth of real-world urban driving conditions. Thus, there is a need to address the problems in existing driving simulators as described above in order to improve the performance of autonomous vehicle systems.
Summary of the Invention
[0005] Throughout this document, unless otherwise indicated to the contrary, the terms “comprising”, “consisting of”, and the like, are to be construed as non-exhaustive, or in other words, as meaning “including, but not limited to”.
[0006] In accordance with a first embodiment of the invention, there is a computing system comprising: one or more processors;
a memory device coupled to the one or more processors;
a driving simulator system stored in the memory device and configured to be executed by the one or more processors, the driving simulator system comprising instructions for:
introducing at least one traffic agent and a plurality of road infrastructure elements for interacting with a vehicle under test, detecting an incident event between the vehicle under test and the at least one traffic agent or one of the plurality of road infrastructure elements based on the vehicle under test exceeding at least one predetermined measurement criteria;
extracting a sequence of scenes from a predetermined time period before the incident event occurred to the time period the incident event occurred; creating a first scenario template based on the extracted sequence of scenes for storing into a scenario database.
[0007] Preferably, each of the at least one predetermined measurement criteria corresponds to a predetermined parameter with a threshold value defining a condition for the incident event.
[0008] Preferably, each of the at least one predetermined measurement criteria is associated with a safety feature for an advanced driver assistance function.
[0009] Preferably, each of the at least one predetermined measurement criteria is associated with a comfort feature for an advanced driver assistance function.
[0010] Preferably, detecting the incident event includes generating a timestamp-based signal tincident, the timestamp-based signal stored in a scenario database.
[0011] Preferably, the incident event is defined as a near miss or a collision with the traffic agent or the road infrastructure element. [0012] Preferably, the at least one traffic agent is trained by a traffic agent learning system for emulating driving competencies of real-world drivers.
[0013] Preferably, the computing system further comprises the step of assigning a complexity level to the extracted sequence of scenes, wherein the complexity level corresponds to an indication of the complexity of the incident.
[0014] Preferably, the first scenario template is created using a formal scenario description language.
[0015] Preferably, the computing system further comprises the step of assigning a success or fail criteria to the scenario template, wherein the success or fail criteria is determined based on one or more predetermined criteria.
[0016] Preferably, the computing system further comprises the following steps: generating a second scenario template extracted from one or more public databases, wherein the second scenario template is distinct from the scenario template;
fusing the first and the second scenario templates to generate a third scenario template wherein the third scenario template is a combination of the properties of the first and second scenario templates.
[0017] In accordance with a second embodiment of the invention, there is a computer-implemented method for use in a driving simulation system comprising the steps of: introducing at least one traffic agent and a plurality of road infrastructure elements for interacting with a vehicle under test, detecting an incident event between the vehicle under test and the at least one traffic agent or one of the plurality of road infrastructure elements based upon the vehicle under test exceeding at least one predetermined measurement criteria; extracting a sequence of scenes from a predetermined time period before the incident event occurred to the time period the incident event occurred;
creating a scenario template based on the recorded sequence of scenes for storing into a scenario database.
[0018] Preferably, each of the at least one predetermined measurement criteria corresponds to a predetermined parameter with a threshold value defining a condition for the incident event.
[0019] Preferably, each of the at least one predetermined measurement criteria is associated with a safety feature for an advanced driver assistance function.
[0020] Preferably, each of the at least one predetermined measurement criteria is associated with a comfort feature for an advanced driver assistance function.
[0021] Preferably, detecting the incident event includes generating a timestamp-based signal t_incident, the timestamp-based signal stored in a scenario database.
[0022] Preferably, the incident event is defined as a near miss or a collision with the traffic agent or the road infrastructure element.
[0023] Preferably, the at least one traffic agent is trained by a traffic agent learning system for emulating driving competencies of real-world drivers.
[0024] Preferably, the computer-implemented method further comprises the step of assigning a complexity level to the extracted sequence of scenes, wherein the complexity level corresponds to an indication of the complexity of the incident.
[0025] Preferably, the first scenario template is created using a formal scenario description language.
[0026] Preferably, the computer-implemented method further comprises the step of assigning a success or fail criteria to the scenario template, wherein the success or fail criteria is determined based on one or more predetermined criteria.
[0027] Preferably, the computer-implemented method further comprises the following steps: generating a second scenario template extracted from one or more public databases, wherein the second scenario template is distinct from the scenario template;
fusing the first and the second scenario templates to generate a third scenario template wherein the third scenario template is a combination of the properties of the first and second scenario templates.
Brief Description of the Drawings
[0028] In the drawings, like reference characters generally refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. The dimensions of the various features or elements may be arbitrarily expanded or reduced for clarity. In the following description, various embodiments of the invention are described with reference to the following drawings, in which:
[0029] FIG. 1 depicts a block diagram of an exemplary computing system architecture to generate a scenario template according to various embodiments;
[0030] FIG. 2 depicts a flow chart diagram of an exemplary method to generate a scenario template according to various embodiments;
[0031] FIG. 3 depicts a diagram of exemplary measurement criteria according to various embodiments;
[0032] FIG. 4 depicts a diagram illustrating the exemplary operating principles of the incident detector and extractor modules according to various embodiments;
[0033] FIG. 5 depicts a flow chart diagram of an exemplary method to generate scenario templates from public database data according to various embodiments;
[0034] FIG. 6 depicts a flow chart diagram of an exemplary method to test scenario templates according to various embodiments;
[0035] FIG. 7a depicts diagrams of exemplary scenario templates illustrating the interaction between a traffic agent and a vehicle under test (VUT) according to various embodiments;
[0036] FIG. 7b depicts a diagram of exemplary scenario templates that are fused according to various embodiments; and
[0037] FIG. 8 illustrates the relationship between traffic agents, vehicle under test, conditions and fused scenario states.
Detailed Description
[0038] The following detailed description refers to the accompanying drawings that show, by way of illustration, specific details and embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized, and structural and logical changes may be made, without departing from the scope of the invention. The various embodiments are not necessarily mutually exclusive, as some embodiments can be combined with one or multiple embodiments to form new embodiments.
[0039] In the specification the term “comprising” shall be understood to have a broad meaning similar to the term “including” and will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps. This definition also applies to variations on the term “comprising” such as “comprise” and “comprises”.
[0040] In order that the invention may be readily understood and put into practical effect, particular embodiments will now be described by way of examples and not limitations, and with reference to the figures. It will be understood that any property described herein for a specific system may also hold for any system described herein. It will be understood that any property described herein for a specific method may also hold for any method described herein. Furthermore, it will be understood that for any system or method described herein, not necessarily all the components or steps described must be enclosed in the system or method, but only some (but not all) components or steps may be enclosed.
[0041] The term “configured” herein may be understood in connection with systems and computer program components. For a system of one or more computers to be configured to perform particular operations or actions, it means that software, firmware, hardware, or a combination of them is installed on the system that in operation causes the system to perform the operations or actions. For one or more computer programs to be configured to perform particular operations or actions means that the one program or multiple programs include instructions that, when executed by a data processing apparatus, cause the apparatus to perform the operations or actions.
[0042] To achieve the stated features, advantages and objects, the present disclosure is directed to systems and methods that make use of computer hardware and software to automatically generate a scenario template of a vehicle under test in a driving simulator or a simulation environment. Driving simulators provide significant advantages in speeding up the development of autonomous vehicle systems. However, the quality of data collected from driving simulators needs to be as realistic as possible, so that it can be used for improving the performance of the autonomous vehicle system that controls the vehicle. Since driving simulators are often developed using rule-based systems which require extensive manual input and configuration, they are restricted in their ability to capture potentially limitless driving scenarios and do not provide the complexity and depth of real-world urban driving conditions. The present disclosure describes a unique approach to automatically create and store a scenario template with multiple traffic agents, either untrained or trained by machine learning algorithms or techniques. To increase the complexity of a traffic scenario, multiple traffic agents can cooperate to create multiparty scenarios. In an endurance test involving a vehicle under test in a driving simulator, trained traffic agents and/or traffic agents trained for a specific functionality can be introduced into the 3D simulation environment. Each scenario template that is automatically created undergoes further processing upon occurrence of a risky traffic scenario or incident and is thereafter stored into a scenario database. The stored scenario template can be replayed for a vehicle under test at any time and can be transposed to similar locations in the 3D simulation environment and replayed under different weather conditions.
[0043] Once a virtual traffic agent has been trained in a way that it replicates human driving behavior for a predetermined geographical area, the goal is to inject one or more trained virtual traffic agents into a simulation environment representing the predetermined geographical area where they can interact, cooperate with and challenge an autonomous vehicle system controlling an autonomous vehicle under test. The overall goal is to test the limitations and weaknesses of the autonomous vehicle system, especially from adverse and dangerous traffic scenarios that are attributed to assertive or aggressive driving behaviors.
[0044] The disclosed systems and methods of generating scenario templates have a technical effect and benefit of providing an improvement to driving simulation systems and in turn the performance of autonomous vehicle systems. For example, by utilizing the disclosed systems and methods, driving simulation systems can avoid the rule-based system or hand-crafted rule system which may be less effective and flexible for decisions made by driving simulation systems for the testing of autonomous vehicle systems. By providing the ability to extend the coverage of traffic scenarios, which are generated by the interaction of traffic agents among one another and one or more vehicle(s), a broader coverage of scenarios can be achieved, compared to developers constructing scenarios by hand. This greatly reduces the research time needed relative to the development of hand-crafted rules. For example, a designer would need to exhaustively derive several models of how different vehicles would need to react in likely or imminent vehicular incidents, which can be challenging given all the possible scenarios that an autonomous vehicle may encounter.
[0045] Additionally, there is the benefit of significant scalability and customizability since a plurality of likely or imminent vehicular incidents can be easily simulated for a plurality of geographical locations or a desired geographical location with geographically-trained traffic agents without moving real vehicles in the real world. For example, an autonomous vehicle driving around trained traffic agents may react differently in Germany as opposed to other countries such as China due to differing driving cultures and country-specific traffic rules. By using the traffic agent learning systems disclosed in the priority application and as incorporated herein, trained traffic agents for a desired geographical location can be introduced into the disclosed exemplary systems and methods. Additionally, trained traffic agents that are specifically trained for a specific functional scenario may also be introduced to add to the complexity and depth of the scenario templates generated. Traffic agent learning systems significantly reduce the hardware resources and human resources that are required for training, evaluating and testing autonomous vehicle systems. Another advantage of using driving simulators or simulation environments is the prevention of damages, accidents and loss of lives during the test process. Risky scenarios that would potentially cause damages, accidents and in the worst case loss of lives in the real world can be tested safely in a simulation environment.
[0046] Various embodiments are provided for systems, and various embodiments are provided for methods. It will be understood that basic properties of the systems also hold for the methods and vice versa. Therefore, for sake of brevity, duplicate description of such properties may be omitted.
[0047] FIG. 1 illustrates a block diagram of an exemplary architecture of a driving simulation system 10 to generate a scenario template according to various embodiments. The driving simulation system 10 is configured to implement simulated missions of an autonomous vehicle and includes a server 11 that communicates with one or more client devices 12 via a communications network (not shown) which operably communicates with an autonomous vehicle system 110 that in turn controls the operational controllers of an autonomous vehicle (not shown). The driving simulation system 10 is configured to test the autonomous operation of the autonomous vehicle in a simulated manner. The client device 12 may comprise a personal computer, a portable computing device such as a laptop, a television, a mobile phone, or any other appropriate storage and/or communication device to exchange data via a web browser and/or communications network. The driving simulation system 10 includes a central processing unit (CPU) or a processor 15 which executes instructions contained in programs such as a driving simulator system and stored in storage devices (not shown). The processor 15 may provide the central processing unit (CPU) functions of a computing device on one or more integrated circuits. As used herein, the term ‘processor’ broadly refers to and is not limited to a single or multi-core general purpose processor, a special purpose processor, a conventional processor, a graphical processing unit, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, one or more application-specific integrated circuits (ASICs), one or more field-programmable gate array circuits (FPGA), any other type of integrated circuit, a system on a chip (SOC), and/or a state machine.
[0048] As used herein, one or more client devices 12 may exchange information via any communication network, such as a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), a proprietary network, and/or internet protocol (IP) network such as the Internet, an intranet or an extranet. Each client device 12, module or component within the system may be connected over a network or directly to each other. A person skilled in the art will recognize that the terms ‘network’, ‘computer network’ and ‘online’ may be used interchangeably and do not imply a particular network embodiment. In general, any type of network may be used to implement the online or computer networked embodiment of the present invention. The network may be maintained by a server or a combination of servers or the network may be serverless. Additionally, any type of protocol (for example, HTTP, FTP, ICMP, UDP, WAP, SIP, H.323, NDMP, TCP/IP) may be used to communicate across the network. The devices as described herein may communicate via one or more such communication networks.
[0049] The driving simulation system 10 includes a memory 16 for storing data and software instructions which are executed by the processor 15 and may control the operation of various aspects of the computing system 10. The memory 16 is configured for storing data associated with the simulation of the autonomous vehicle. It can also be configured to store a variety of other data and data files, such as logs of simulated missions of the autonomous vehicle. The memory 16 can include one or more databases 101 and image processing software, as well as a driving simulation system as described herein, which further includes a neural network or a deep neural network. The memory 16 used in the embodiments may include a volatile memory, for example a DRAM (Dynamic Random Access Memory), or a non-volatile memory, for example a PROM (Programmable Read Only Memory), an EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), or a flash memory, e.g., a floating gate memory, a charge trapping memory, an MRAM (Magneto-Resistive Random Access Memory) or a PCRAM (Phase Change Random Access Memory). As an alternative to, or in operable communication with, the memory 16, the computing system also includes a solid state drive (SSD) 13 that stores data and software instructions to be executed by the processor 15. A solid state drive 13 is a data storage device that uses a NAND-based flash memory to store data and is a form of non-volatile memory. Once data is written to the flash memory, the data is retained even if no power is supplied to the flash memory. The memory 16 and/or the SSD 13 include one or more databases 101 for the storage of at least one traffic agent database 120, a measurement catalog database 130 and a scenario database 140.
[0050] The traffic agent database 120 stores traffic agents that are trained by a traffic agent learning system which produces traffic agents capable of emulating driving competencies or driving behavior of real-world drivers. The traffic agent database 120 is configured to receive inputs from a traffic service module 153. Traffic agents can be trained based on geographical locations, environmental conditions, defensive and/or aggressive driving behaviors. For example, the traffic agents can be trained by the traffic agent learning system as disclosed in the priority patent application. The measurement catalog database 130 stores a set of predetermined parameters with threshold values or predefined qualitative criteria that define trigger conditions for likely or imminent critical events occurring during active simulation of a vehicle under test. These predetermined parameters are based on a domain knowledge associated with developing safety or comfort features for advanced driver assistance functions, which will be described in more detail below. The scenario database 140 stores sensor data streams and/or scenarios recorded or extracted from real-world drivers or generated from driving simulation systems. The real-world vehicle scenarios can be acquired from various public, government or domain knowledge databases. The scenario database 140 is configured to provide inputs to the scenario executor 171 to execute scenario templates stored in the scenario database for the testing of the vehicle under test in the driving simulation system. The test coverage monitor module 190 is configured to provide and receive inputs from the scenario database 140 and ensures that all the scenario templates are executed and successfully passed by the driving simulation system 10.
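As a concrete illustration of the kind of entries the measurement catalog database 130 may hold, the following is a minimal sketch. The parameter names, threshold values and the split into safety and comfort criteria are illustrative assumptions made for this example, not values prescribed by the present disclosure.

```python
# Hypothetical entries for the measurement catalog database 130. Each criterion
# names a monitored parameter, whether it is a safety or a comfort criterion,
# and the minimum/maximum threshold beyond which an incident is flagged.
MEASUREMENT_CATALOG = [
    {"name": "speed",              "type": "safety",  "bound": "max", "threshold": 13.9},  # m/s, illustrative urban limit
    {"name": "safety_distance",    "type": "safety",  "bound": "min", "threshold": 2.0},   # m to the nearest object
    {"name": "longitudinal_decel", "type": "comfort", "bound": "max", "threshold": 3.0},   # m/s^2
    {"name": "lateral_accel",      "type": "comfort", "bound": "max", "threshold": 2.5},   # m/s^2
]

def exceeds(criterion, value):
    """Return True if a measured value breaches the criterion's threshold."""
    if criterion["bound"] == "max":
        return value > criterion["threshold"]
    return value < criterion["threshold"]
```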
[0051] The driving simulation system 10 includes a user interface (not shown) on the client device 12 that is configured to facilitate control inputs to provide control commands to the autonomous vehicle and to facilitate simulation inputs associated with simulating the operation of the autonomous vehicle. The user interface is configured to monitor the simulation of the autonomous vehicle, to provide inputs to the autonomous vehicle system 110 during an associated simulation mission. The driving simulation system 10 also includes a simulation controller 170 configured to provide simulation signals to and receive simulation feedback signals from the external component or the autonomous vehicle system 110. The simulation controller 170 provides the ‘world information’ required by the autonomous vehicle system to drive. The ‘world information’ comprises sensor signals from the camera, lidar and/or radar. The simulation controller 170 is configured to mimic the controller of an autonomous vehicle system which controls one or more aspects of the vehicle operation based on automated driving commands received from the processor. For example, the simulation controller 170 can provide simulation signals representative of one or more actuator systems or one or more indicator systems in a vehicle. The simulation controller 170 operably communicates with the incident detector module 150, the incident extractor module 160 and the scenario executor 171 and is configured to provide and to receive inputs from each of these modules which in turn provide simulation signals to the autonomous vehicle system 110. Each of the modules will be described in more detail below.
FIG. 2 illustrates a flow diagram of an exemplary method to generate a scenario template according to embodiments of the present disclosure. At steps 200 and 210, one or more trained traffic agents from the traffic agent database 120 are introduced into the driving simulation system 10 which includes a simulation environment. The trained traffic agents fall into two categories. The first category of traffic agents in step 200 includes those that have been trained by the traffic agent learning system to emulate the driving competencies or driving behavior of real-world drivers, which are further categorized according to geographical locations, environmental conditions, defensive and/or aggressive driving behavior. Traffic agents in the second category in step 210 are those that have been trained specifically for a functional scenario. An example of a functional scenario is a vehicle performing a cut-in in front of another vehicle. Depending on the goals of the test, either traffic agents in step 200 or 210 or both can be introduced in the simulation environment. The simulation environment can be user-configured and can include any of a variety of geographical locations that correspond to real locations.
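Purely as an illustration of steps 200 and 210, the sketch below selects trained traffic agents from a traffic agent database by geographical profile, driving style or functional scenario before they are injected into the simulation environment. The query fields, identifiers and function names are assumptions made for this example, not an interface defined by the disclosure.

```python
def select_traffic_agents(agent_db, region=None, style=None, functional_scenario=None):
    """Filter trained traffic agents (steps 200/210) by region, driving style
    or a specific functional scenario such as a cut-in manoeuvre."""
    selected = []
    for agent in agent_db:
        if region and agent.get("region") != region:
            continue
        if style and agent.get("style") != style:
            continue
        if functional_scenario and agent.get("functional_scenario") != functional_scenario:
            continue
        selected.append(agent)
    return selected

# Example: aggressive agents trained on Chinese traffic plus one cut-in specialist.
agent_db = [
    {"id": "a1", "region": "CN", "style": "aggressive"},
    {"id": "a2", "region": "DE", "style": "defensive"},
    {"id": "a3", "functional_scenario": "cut_in"},
]
agents = (select_traffic_agents(agent_db, region="CN", style="aggressive")
          + select_traffic_agents(agent_db, functional_scenario="cut_in"))
```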
[0052] At step 220, on commencement of the test, the vehicle under test, the traffic agents and road infrastructure elements are allowed to interact with one another in the simulation environment. The one or more traffic agents introduced in step 200 proceed to navigate through the simulation environment with the objective to execute the driving behavior that each has been trained for, whether that is a defensive or an aggressive driving style, or an American or a Chinese driving style. For example, how a vehicle is driven in America may differ from a vehicle in China due to differing driving cultures and unique road rules. Road infrastructure elements are included in the simulation environment to provide more complexity, depth and realism to the generated scenario templates. Road infrastructure elements include stationary and movable road elements and road users. Whereas stationary road elements include, for example, traffic signs, poles, road barriers, buildings, monuments, structures, natural objects, vegetation and/or a terrain surface, movable road elements include ground vehicles, pedestrians, animals, and the like.
[0053] At step 230, as the vehicle under test navigates through the simulation environment during the test, it may be exposed to scenarios where imminent incidents or incidents may occur. An incident is defined as any behavior of the ego or surrounding vehicles that is considered interesting to the user. This can include dangerous situations and situations where the vehicle under test may trigger a hard brake or other extreme controls, even if they did not cause immediate danger to the other surrounding vehicles. For example, a traffic agent with an aggressive style of driving may cause a risky scenario in which an incident with the vehicle under test is likely to occur or actually occurs. A set of measurement criteria which was previously configured by the user is applied to the vehicle under test during the active simulation. The set of measurement criteria comprises one or more predetermined parameters with threshold values or predefined qualitative criteria that define trigger conditions for imminent critical events or accidents. These predetermined parameters are based on a domain knowledge associated with developing safety or comfort features for advanced driver assistance functions. For example, as illustrated in Figure 3, the measurement criteria can be predetermined parameters associated with safety features such as the speed of the vehicle under test or the safety distance of other objects to the vehicle under test. For measurement criteria associated with comfort-related features, the predetermined parameters can be associated with the type of acceleration and deceleration of the vehicle under test: lateral deceleration, longitudinal deceleration, lateral acceleration and longitudinal acceleration. Each of the aforementioned predetermined parameters will be assigned a minimum or maximum value, depending on the criteria, deemed as safe or comfortable, beyond which the incident extractor will be activated. The set of measurement criteria is stored in the measurement catalog database and operably communicates with the incident detector module in the driving simulation system. One or more measurement criteria may be applied to the vehicle under test simultaneously during active simulation. The following example shall illustrate this. The incident detector module is always active and applies the measurement criteria retrieved from the measurement catalog database. Should the vehicle under test exceed the road speed limit or breach the safety distance to another vehicle that was set as a measurement criteria, the incident detector module is activated to generate a time-stamp based signal of an incident event that is provided by means of a clock (not shown). The vehicle under test may come very close to colliding with or actually collides with the traffic agent. In either case, the incident detector module is activated to generate a time-stamp based signal provided by the clock at time t_incident to indicate the time the incident event occurred. The time stamp t_incident will be logged and stored in the scenario database. In the context of the present disclosure, an incident event is defined as a near miss or a collision with a traffic agent or a road infrastructure element. The incident detector module is configured to detect the incident event in real-time or near real-time.
This module is configured to detect the incident event based on information conveyed by output signals which have been generated by the simulation controller or visual output data collected by sensors on the vehicle under test in the simulation environment.
[0054] At step 240, the incident extractor module is activated and goes back in time by a predetermined time period denoted as t_emerging. The incident extractor module is configured to extract a recording of the sequence of scenes to capture the near miss or collision with the traffic agent for the time period from t_emerging to t_incident. The sequence of scenes includes the environment information and the position of the vehicle under test and the traffic agent and/or the road infrastructure elements. This extraction process is based on information conveyed by output signals which are generated by the simulation controller or visual output data collected by sensors on the vehicle under test in the simulation environment.
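A minimal sketch of the interplay between the incident detector (step 230) and the incident extractor (step 240) is given below. The frame/scene dictionary layout, the 30-second value assumed for the t_emerging window, and the class and method names are illustrative assumptions rather than the disclosed implementation; the threshold check mirrors the catalog sketch shown earlier.

```python
from collections import deque

T_EMERGING = 30.0  # assumed length (seconds) of the t_emerging window before an incident


class IncidentDetector:
    """Applies the configured measurement criteria to every simulation frame and
    emits a timestamped incident event as soon as any threshold is breached (step 230)."""

    def __init__(self, catalog):
        # catalog entries: {"name": ..., "bound": "max"/"min", "threshold": ...}
        self.catalog = catalog

    def check(self, frame):
        for c in self.catalog:
            value = frame["measurements"].get(c["name"])
            if value is None:
                continue
            breached = value > c["threshold"] if c["bound"] == "max" else value < c["threshold"]
            if breached:
                return {"t_incident": frame["t"], "criterion": c["name"]}
        return None


class IncidentExtractor:
    """Keeps a rolling buffer of scenes and, once an incident has been detected,
    extracts the sequence from t_emerging before the incident up to t_incident (step 240)."""

    def __init__(self, horizon=T_EMERGING):
        self.horizon = horizon
        self.buffer = deque()

    def record(self, scene):
        self.buffer.append(scene)
        while self.buffer and scene["t"] - self.buffer[0]["t"] > self.horizon:
            self.buffer.popleft()

    def extract(self, incident):
        t_start = incident["t_incident"] - self.horizon
        return [s for s in self.buffer if t_start <= s["t"] <= incident["t_incident"]]
```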
[0055] Once the incident extractor module extracts a record of the sequence of scenes for the relevant time period, at step 250, the extracted record of the sequence of events undergoes further processing to be evaluated, categorized, and classified. For example, the extracted record may be assigned a complexity level that indicates the complexity of the recorded near miss or incident. At the beginning of the test, the vehicle under test may be challenged by scenarios that have a low complexity level, such as a traffic agent braking hard in front of the vehicle under test. However, as the test continues and the maturity and capabilities of its autonomous driving system increase, the vehicle under test will subsequently be challenged by scenarios that have a higher complexity level. The extracted record will be automatically assigned a complexity level, in which a lower number corresponds to a scenario of lower complexity and a higher number to a scenario of higher complexity. The extracted record may also be evaluated to assess driving behavior risks using parameters such as the time to collision and the distance of vehicles. These parameters can directly assess the level of driving behavior risk of the extracted record of the sequence of events. Additionally, the extracted record of the sequence of events may be classified according to types of vehicle maneuvers. The classification of the extracted record offers the advantage of building special scenario clusters. For example, one cluster may contain all cases where a traffic agent performs a cut-in maneuver from the front right side of the vehicle under test. The extracted record may also be categorized according to the different types of traffic agents involved in the incident, such as cars, motorbikes, bicycles, etc., the road infrastructure elements involved, such as bridges, intersections, tunnels, hilly drives, steep declines, curvy roads, etc., and sensor disturbances, including occlusion of the field of view or sun glare for example. The objective of categorizing the extracted records of sequences of events is to have them stored in one multi-dimensional, filterable database in which scenarios can be stored and retrieved easily. The processor 15 automatically executes each of the processing steps, which are classification, categorization, and assignment of the complexity level and the level of risk.
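The post-processing at step 250 (assigning a complexity level, a driving-behaviour risk score and a manoeuvre class to the extracted record) might, under the assumptions below, look like the following sketch. The scoring rules (actor count as a complexity proxy, inverse minimum time-to-collision as a risk proxy) and the field names are illustrative, not the classification actually used by the disclosed system.

```python
def process_extracted_record(scenes):
    """Assign complexity, risk and a manoeuvre cluster to an extracted sequence of scenes."""
    agents = {a["id"] for s in scenes for a in s.get("agents", [])}
    min_ttc = min((s["ttc"] for s in scenes if "ttc" in s), default=float("inf"))
    min_gap = min((s["distance_to_nearest"] for s in scenes if "distance_to_nearest" in s),
                  default=float("inf"))

    complexity = len(agents)                           # crude proxy: more actors, higher level
    risk = 1.0 / min_ttc if min_ttc > 0 else float("inf")
    manoeuvre = scenes[-1].get("manoeuvre", "unknown") # e.g. "cut_in_front_right"

    return {
        "complexity_level": complexity,
        "risk_score": risk,
        "min_gap_m": min_gap,
        "manoeuvre_cluster": manoeuvre,
    }
```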
[0056] At step 260, once the extracted record has undergone processing, the driving simulation system 10 is configured to create a scenario template corresponding to the extracted record of scenes of the incident event, which is stored in a scenario database 140. The scenario templates are created using a domain specific language or a formal scenario description language (SDL) which enables a scenario to be enumerated and to be described at a high level so that only useful scenarios are instantiated or have the ability to be repeated deterministically with different software or hardware versions of the autonomous vehicle system under test. For example, by creating scenario templates using a scenario description language, the scenarios can be transposed to different geographical locations and repeated under varying weather conditions. The scenario templates can also be modified in a way that the parameters of the scenario’s participants’ movements are changed. This allows the template to be instantiated with a multitude of modified parameters, with the possibility to lower or increase the levels of complexity or risk.
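The disclosure does not reproduce the formal scenario description language itself; purely as an assumed illustration, a scenario template produced at step 260 could be serialised along the following lines, keeping location, weather and trigger parameters free so the template can be transposed and re-instantiated. All field names and values here are hypothetical.

```python
# Hypothetical, simplified scenario template serialisation; not a specific formal SDL.
scenario_template = {
    "id": "cut_in_front_right_0001",
    "parameters": {                      # free parameters for re-instantiation
        "location": "urban_4lane_any",   # transposable to similar locations
        "weather": ["clear", "rain", "fog"],
        "trigger_distance_m": {"min": 8.0, "max": 20.0},
    },
    "actors": {
        "vut": {"initial_lane": 2, "initial_speed_mps": 13.9},
        "agent_A1": {"initial_lane": 1, "behaviour": "aggressive_cut_in"},
    },
    "events": [
        {"when": "gap(vut, agent_A1) <= trigger_distance_m",
         "do": "agent_A1.lane_change(right)"},
    ],
    "complexity_level": 2,
    "risk_score": 0.45,
}
```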
[0057] At step 270, one or more success and/or fail criteria are assigned to the specific scenario template. Users of the driving simulation system 10 may have differing definitions on determining if a scenario template is considered a success or a failure. Car manufacturers, for example, or organizations that establish or enforce car safety standards such as the New Car Assessment Program (NCAP) may have specific criteria for how an accident or incorrect driving behavior is to be defined. In order to assign a success and fail criteria to the scenario template, predetermined criteria are defined so that each scenario template will be assigned accordingly based on the predetermined criteria. Examples of predetermined criteria are the safety distance to another car and the maximum speed at the point of collision or the time to collision of vehicles. Predetermined criteria may also be more specialized for scenarios that are more complicated. For example, in a scenario which involves creating a rescue lane on a German highway during a traffic jam, a success criteria is assigned if the following conditions are achieved: (i) at least 2 meters of drivable space to the left of the vehicle under test, (ii) the vehicle under test is stopped (or its speed is less than or equal to 1 km/h), and (iii) the distance from the vehicle under test to the vehicle in front is less than 0.5 m. At step 280, the scenario template is stored in a scenario database 140.
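The rescue-lane example above translates naturally into a machine-checkable pass/fail predicate. The sketch below is one assumed encoding of those three conditions; the state field names are hypothetical and the thresholds are taken directly from the example, not from any normative NCAP criterion.

```python
def rescue_lane_success(state):
    """Success criterion for the rescue-lane example:
    (i)   at least 2 m of drivable space to the left of the vehicle under test,
    (ii)  the vehicle under test is stopped (speed <= 1 km/h), and
    (iii) the gap to the vehicle in front is less than 0.5 m."""
    return (state["drivable_space_left_m"] >= 2.0
            and state["vut_speed_kmh"] <= 1.0
            and state["gap_to_front_vehicle_m"] < 0.5)

# Example: evaluate the criterion on the final state of an executed scenario run.
final_state = {"drivable_space_left_m": 2.3, "vut_speed_kmh": 0.0, "gap_to_front_vehicle_m": 0.4}
outcome = "success" if rescue_lane_success(final_state) else "fail"
```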
[0058] FIG. 4 illustrates the workings of the incident detector module and the incident extractor module. During active simulation, the vehicle under test may be exposed to scenarios in which an incident event may occur. For example, a traffic agent with an aggressive style of driving may cause a risky scenario in which an incident with the vehicle under test is likely to occur or occurs. One or more measurement criteria are configured by the user and are applied by the incident detector module to the vehicle under test during active simulation. The set of measurement criteria is one or more predetermined parameters with threshold values or predefined qualitative criteria that define trigger conditions for an incident event occurring during active simulation of a vehicle under test. As mentioned beforehand, should the vehicle under test exceed the threshold value of a measurement criteria, the incident detector module is activated to generate a time stamp-based signal of an incident event that is provided by a clock (not shown), in which the time stamp t_incident will be logged in the scenario database and associated with the time the incident event occurs. The incident detector module is configured to detect the incident event in real-time or near real-time. In the context of the present disclosure, an incident event is defined as a near miss or a collision between the vehicle under test and a traffic agent and/or a road infrastructure element. The incident detector module is configured to detect the imminent incident based on information conveyed by output signals generated by the simulation controller or visual output data collected by sensors on the vehicle under test in the simulation environment.
[0059] Upon occurrence of the incident event, the incident extractor module is activated to extract a recording of the sequence of scenes prior to the time the incident event occurred, going back by a predetermined time period denoted as t_emerging. The incident extractor module is configured to extract a recording of the sequence of scenes that captures the scenes prior to the near miss or collision with the traffic agent for the time period from t_emerging to t_incident. The sequence of scenes includes the environment information and the position of the vehicle under test and the traffic agent and/or the road infrastructure element. This extraction process is based on information conveyed by output signals which are generated by the simulation controller or visual output data collected by sensors on the vehicle under test in the simulation environment. The extracted recording of the sequence of scenes is subsequently processed as mentioned beforehand.
[0060] FIG. 5 illustrates a flow chart diagram of an exemplary method to generate scenario templates whose data is derived from a plurality of public databases. The databases comprise data that is obtained from real vehicles in the real world, such as vehicles participating in normal public road traffic, which may or may not be autonomously controlled or controlled by human drivers. The database comprising NCAP test cases for advanced driver assistance system (ADAS) functions may comprise test cases which include data obtained from autonomously controlled vehicles. A domain knowledge database may contain test cases that comprise a combination of autonomously and non-autonomously controlled vehicles which may be obtained from car manufacturers or other driving simulation systems. At step 340, the driving simulation system 10 is configured to create a scenario template corresponding to the extracted data from the plurality of public databases. As mentioned beforehand, the scenario templates are created using a domain specific language or a formal scenario description language (SDL) which enables a scenario to be enumerated and to be described at a high level so that only useful scenarios are instantiated or have the ability to be repeated deterministically with different software or hardware versions of the autonomous vehicle system under test. For example, by creating scenario templates using a scenario description language, the scenarios can be transposed to different geographical locations and repeated under varying weather conditions. The scenario templates can also be modified in a way that the parameters of the scenario’s participants’ movements are modified. This allows the template to be instantiated with a multitude of modified parameters.
[0061] At step 350, a success and fail label is assigned to the specific scenario template, as was done for the extracted record of the sequence of scenes for the vehicle under test after its processing into a scenario template. Users of the driving simulation system 10 may have differing definitions on determining if the outcome of an executed scenario is considered a success or a failure. Car manufacturers, for example, or organizations that establish or enforce car safety standards such as the NCAP may have specific criteria for how an accident or incorrect driving behavior is to be defined. In order to assign one or more success and/or fail criteria to the scenario template, predetermined criteria are defined so that each scenario template will be assigned accordingly based on the predetermined criteria. For example, a predetermined criteria can be the safety distance to another car, the maximum speed at the point of collision, or the time to collision of vehicles. As also mentioned beforehand, predetermined criteria may also be more specialized for scenarios that are more complicated, and the attribution of a success or fail criteria can include more than one predetermined criteria. Once a success and fail label has been applied to each scenario template, the scenario template can now be repeated deterministically with concretely defined movement of other road users.
[0062] FIGS. 2 and 5 illustrate two different ways of populating the scenario database 140. Whereas the flow diagram in FIG. 2 describes the method of generating scenario templates from the interaction of traffic agents, road infrastructure elements and a vehicle under test in a simulation environment, FIG. 5 describes the method of generating scenario templates by extracting scenario data from public databases and converting the scenario data into scenario templates using a scenario description language. Having scenario templates generated in a common language by applying both methods and stored in the scenario database provides the benefit of generating numerous scenario templates that are created by the combination of using specially trained traffic agents to create challenging scenarios (as provided in FIG. 2) and using scenario data from public databases. This provides an additional layer of complexity and depth to the realism of the scenario templates that are generated.
[0063] FIG. 6 illustrates a flow chart diagram of an exemplary method of executing scenario templates according to various embodiments. The driving simulation system 10 includes a data analytics module 400 for the scenario templates which are stored in the scenario database. The data analytics module 400 provides an overview of the coverage of the scenario templates and also highlights the scenarios that are not covered. For example, the data analytics module 400 can provide statistics and numbers on the various categories of scenarios. The categories of scenario templates may cover one or more of the following areas: geometry dependent driving behavior, interaction with other road users, interaction with road infrastructure elements, sensor degradation, or behavioral competencies. Based on the predetermined categories, the data analytics module will also identify the types of scenario templates that were not extracted by the scenario database. This helps the user to identify scenario templates that have not been covered. Uncovered scenario templates can then be obtained either through public databases or through using trained traffic agents to simulate the uncovered scenarios. The data analytics module 400 also evaluates the risk of the scenario templates. The scenario risk is a measure of how high the chance is that an accident (i.e. a collision between the vehicle under test (VUT) and a traffic agent) might occur. In general, it is assumed that this scenario risk is of interest for the classification of scenarios independently of the driving behavior risks assigned for scenario templates as mentioned beforehand. Scenario risk can be assessed in a qualitative way, which however is not well-suited for automation. Several measures that quantify risk more objectively exist, such as Time-To-Collision value computation, from which various derivations exist that account for arbitrary motion or individual vs. collective risk. These metrics however assume known trajectories for all involved vehicles, which also includes the VUT whose behavior is the subject of the test, and thus in itself only applies to a particular test run. To handle this limitation and derive a risk assessment that describes the scenario rather than its concrete execution, certain assumptions about the VUT have to be made: for instance, it could be assumed that the behavior of the VUT is sufficiently similar to some behavioral model that exists in the simulation.
For better accuracy of the scenario risk value, we can vary the behavior of such a vehicle over many explorative runs and aggregate the observed risk values. A more direct approach would be to analyze scenario conditions with respect to occurring speeds, accelerations and distances (e.g. for cut-ins) as to whether dangerous values occur. This requires a more informal analysis of the scenario and could potentially misclassify some newly discovered risky scenarios as non-risky. The complexity of a scenario is a measure of the computational effort, necessary reaction times and planning capabilities a VUT must possess. This measure tends to be related to risk, as situations that are complex can often be risky. Counterexamples for this relationship can however be found. Complexity can be defined from various perspectives: an actor-centric complexity metric would take into account the number and maneuvers of all the involved actors and possibly also the road geometry. Complexity would then be defined as high for many existing actors with complex conditions / motion plans and/or a complex or winding road geometry. As for the case of risk, a potentially more natural definition of complexity would be oriented on the VUT: high jerk values in a VUT trajectory indicate sudden changes in acceleration and thus the execution of difficult driving maneuvers. This approach can be detailed to incorporate more differentiated features such as jerk, acceleration, speed and distance, separated with respect to whether they are observed along the driving direction (longitudinal) or perpendicular to it (lateral). A useful weighting of these features can be found with expert knowledge or based on existing statistical methods. As for the case of risk, to compensate for freedom in the behavior of the VUT, several runs can be sampled from the scenario with varying behavior, aggregating the results to approximate the total complexity for all possible scenario runs. After the (fully automated) computation of risk and complexity values for a number of scenarios, a clustering algorithm can automatically identify a manageable number of groups of risk/complexity combinations.
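A hedged sketch of the risk/complexity computation and the subsequent clustering is given below. It assumes NumPy and scikit-learn as tooling, inverse minimum time-to-collision as the per-run risk measure, a weighted mean of longitudinal and lateral jerk as the per-run complexity measure, and five clusters; all of these are illustrative choices, not parameters fixed by the disclosure.

```python
import numpy as np
from sklearn.cluster import KMeans  # assumed dependency for the clustering step

def run_risk(ttc_values):
    """Risk of one scenario run: higher when the minimum time-to-collision is small."""
    return 1.0 / max(min(ttc_values), 1e-3)

def run_complexity(times, v_long, v_lat, w_long=1.0, w_lat=2.0):
    """Complexity of one run from longitudinal and lateral jerk (weights are assumed)."""
    jerk_long = np.gradient(np.gradient(v_long, times), times)
    jerk_lat = np.gradient(np.gradient(v_lat, times), times)
    return w_long * np.abs(jerk_long).mean() + w_lat * np.abs(jerk_lat).mean()

def scenario_metrics(runs):
    """Aggregate risk/complexity over several explorative runs with varied VUT behavior."""
    risks = [run_risk(r["ttc"]) for r in runs]
    comps = [run_complexity(r["t"], r["v_long"], r["v_lat"]) for r in runs]
    return {"risk": float(np.mean(risks)), "complexity": float(np.mean(comps))}

def cluster_scenarios(metrics, n_groups=5):
    """Group scenarios into a manageable number of risk/complexity combinations."""
    X = np.array([[m["risk"], m["complexity"]] for m in metrics])
    return KMeans(n_clusters=n_groups, n_init=10).fit_predict(X)
```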
[0064] The driving simulation system 10 also includes a scenario fusion module 410 which allows two different scenario templates to be combined. Combining two different scenario templates enhances the complexity and the coverage of the scenario templates. The scenario fusion module 410 is configured to merge two or more scenario templates extracted from the scenario database into a more complex scenario that combines their properties. In general, a fused scenario template is at least as challenging to a vehicle under test as each of its individual component scenario templates. The process of merging two or more scenario templates comprises the following steps: (i) shallow merging, (ii) condition graph merging, (iii) computation of implicit conditions and (iv) offset adjustment.
[0065] The scenario database 140 is populated with scenario templates generated from endurance testing of a vehicle under test in a simulation environment, scenario templates generated from public databases (for example, accident databases, expert or domain knowledge databases, or NCAP test protocols) and scenario templates that are generated as a result of fusing two or more scenario templates. The scenario templates generated from fusing two or more scenario templates may be obtained from the combination of scenario templates generated from endurance testing of the vehicle under test in a simulation environment and scenario templates generated from public databases, or from either source on its own. With the rich diversity of scenario templates in the scenario database 140, each of the scenario templates can be executed by the vehicle under test in the simulation environment for the purpose of testing the autonomous vehicle system at step 420. Once each of the scenario templates has been executed, and at step 430, the driving simulation system 10 will monitor the testing progress by storing the scenario outcomes with the attributes passed or failed. Each scenario template is accompanied by a success or fail criteria. If the outcome of an executed scenario template is assigned as a fail, the scenario is not passed. An example of a fail criteria is when a collision occurs. At step 440, the driving simulation system 10 also identifies gaps in the test coverage of the scenario templates by identifying scenario types that have not yet been executed successfully. By doing so, the weaknesses of the vehicle under test can be emphasized and worked on.
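One possible shape of the coverage monitoring at steps 420 to 440 is sketched below; the pass/fail bookkeeping, the category field and the gap report are assumptions made for illustration, not the interface of the test coverage monitor module 190.

```python
def monitor_coverage(scenario_db, results, required_categories):
    """Store outcomes per executed scenario template and report coverage gaps.

    scenario_db: {scenario_id: {"category": ...}}, results: {scenario_id: bool}."""
    outcomes = {sid: ("passed" if ok else "failed") for sid, ok in results.items()}
    executed = {scenario_db[sid]["category"] for sid in results}
    passed = {scenario_db[sid]["category"] for sid, ok in results.items() if ok}
    return {
        "outcomes": outcomes,
        "not_executed": sorted(set(required_categories) - executed),
        "not_yet_passed": sorted(set(required_categories) - passed),
    }
```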
[0066] FIG. 7a illustrates diagrams of exemplary scenario templates illustrating the interaction between a traffic agent and a vehicle under test according to various embodiments. The figure on the left illustrates a scenario template of a traffic agent A1 attempting to cut in to the lane of a vehicle under test (VUT) from the left. The figure on the right illustrates a scenario template with a traffic agent A2 attempting to cut in to the lane of a vehicle under test from the right. Both scenario templates are obtained from the scenario database 140. FIG. 7b illustrates an example of a fused scenario template obtained as a result of combining both scenarios as depicted in the two diagrams of FIG. 7a. Referring to FIG. 7b, the explicit conditions are those mentioned in the component scenarios (in this example, lane change from left and lane change from right). These would typically contain a condition such as: when the distance between the VUT and the traffic agent is <= X meters, trigger the lane change. Those conditions are usually chosen by the scenario author such that behaviors are valid under all possible VUT behaviors. For example, traffic agents do not crash into each other or into the side of the VUT (in which case it would have little chance to avoid the crash). These “valid behaviors” form the implicit conditions. In the example of the two lane change scenarios: traffic agents should not crash into each other and thus must obey certain timings/distances. This condition is not explicitly formulated in any of the component scenario definitions (as neither “knows” about the other traffic agents) but needs to be taken into account during the fusion of the scenarios.
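A simplified sketch of the fusion idea follows, assuming scenario templates are represented as dictionaries of actors, ordered states and trigger conditions. It illustrates only the shallow merge and the insertion of an implicit guard condition on the connecting state so that the two agents do not change lane simultaneously; it does not reproduce the full merging procedure (shallow merging, condition graph merging, computation of implicit conditions, offset adjustment), and all names and the 5.0 m gap value are hypothetical.

```python
def fuse_templates(left_cut_in, right_cut_in):
    """Fuse two single-agent lane-change templates into one multi-agent template."""
    fused = {
        "actors": {**left_cut_in["actors"], **right_cut_in["actors"]},  # shallow merge
        "states": [],
        "conditions": [],
    }
    # Order the states: the first agent completes its lane change, then the second,
    # joined by a connecting state (last state of one scenario, first of the other).
    fused["states"] = left_cut_in["states"] + ["connecting_state"] + right_cut_in["states"]
    # Keep the explicit trigger conditions of both component scenarios.
    fused["conditions"] = left_cut_in["conditions"] + right_cut_in["conditions"]
    # Implicit condition guarding the connecting state: the second agent may only
    # start its lane change once the first has left the centre lane and is far enough away.
    fused["conditions"].append(
        {"guard": "connecting_state",
         "when": "agent_A1.lane_change_complete and gap(agent_A1, agent_A2) >= 5.0"}
    )
    return fused
```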
[0067] FIG. 8 illustrates the relationship between traffic agents A1 and A2, vehicle under test, conditions and fused scenario states. Considering the component scenarios individually, each would be structured in the following manner:
1. State: Traffic agent is on lane (A1 or A2) (left/right of VUT);
2. Condition: Under a certain condition (traffic agent<->VUT distance) the traffic agent initiates a lane change;
3. State: After the lane change is complete, the traffic agent is in front of the VUT in the same lane.
[0068] FIG. 8 exemplifies how this set of states and conditions must change when two scenarios are fused: an order must be defined for the states and conditions, and there must be a state that connects the scenarios (here: the state after one vehicle has changed lane but before the other has, so in this particular case the last state of one scenario and the first of the other). To honor the implicit conditions (see above), new conditions might need to be created, in particular for guarding these connecting state(s). If that were not the case, the individual conditions for the two component scenarios might cause both agents to change to the center lane at the same time and cause an (unwanted) collision.
[0069] While the invention has been particularly shown and described with reference to specific embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. The scope of the invention is thus indicated by the appended claims and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced.

Claims

1. A computing system comprising:
one or more processors;
a memory device coupled to the one or more processors;
a driving simulator system stored in the memory device and configured to be executed by the one or more processors, the driving simulator system comprising instructions for:
introducing at least one traffic agent and a plurality of road infrastructure elements for interacting with a vehicle under test,
detecting an incident event between the vehicle under test and the at least one traffic agent or one of the plurality of road infrastructure elements based on the vehicle under test exceeding at least one predetermined measurement criteria;
extracting a sequence of scenes from a predetermined time period before the incident event occurred to the time period the incident event occurred; creating a first scenario template based on the extracted sequence of scenes for storing into a scenario database.
2. The computing system according to claim 1, wherein each of the at least one predetermined measurement criteria corresponds to a predetermined parameter with a threshold value defining a condition for the incident event.
3. The computing system according to claim 1, wherein each of the at least one predetermined measurement criteria is associated with a safety feature for an advanced driver assistance function.
4. The computing system according to claim 1, wherein each of the at least one predetermined measurement criteria is associated with a comfort feature for an advanced driver assistance function.
5. The computing system according to claim 1, wherein detecting the incident event includes generating a timestamp-based signal t_incident, the timestamp-based signal stored in a scenario database.
6. The computing system according to claim 1, wherein the incident event is defined as a near miss or a collision with the traffic agent or the road infrastructure element.
7. The computing system according to claim 1, wherein the at least one traffic agent is trained by a traffic agent learning system for emulating driving competencies of real-world drivers.
8. The computing system according to claim 1, further comprising the step of assigning a complexity level to the extracted sequence of scenes, wherein the complexity level corresponds to an indication of the complexity of the incident.
9. The computing system according to claim 1, wherein the first scenario template is created using a formal scenario description language.
10. The computing system according to claim 1 , further comprising the step of assigning a success or fail criteria to the scenario template, wherein the success or fail criteria is determined based on one or more predetermined criteria.
11. The computing system according to claim 1, further comprising the following steps:
generating a second scenario template extracted from one or more public databases, wherein the second scenario template is distinct from the first scenario template;
fusing the first and the second scenario templates to generate a third scenario template wherein the third scenario template is a combination of the properties of the first and second scenario templates.
12. A computer-implemented method for use in a driving simulation system comprising the steps of:
introducing at least one traffic agent and a plurality of road infrastructure elements for interacting with a vehicle under test,
detecting an incident event between the vehicle under test and the at least one traffic agent or one of the plurality of road infrastructure elements based upon the vehicle under test exceeding at least one predetermined measurement criteria; extracting a sequence of scenes from a predetermined time period before the incident event occurred to the time period the incident event occurred;
creating a scenario template based on the recorded sequence of scenes for storing into a scenario database.
13. The computer-implemented method according to claim 12, wherein each of the at least one predetermined measurement criteria corresponds to a predetermined parameter with a threshold value defining a condition for the incident event.
14. The computer-implemented method according to claim 12, wherein each of the at least one predetermined measurement criteria is associated with a safety feature for an advanced driver assistance function.
15. The computer-implemented method according to claim 12, wherein each of the at least one predetermined measurement criteria is associated with a comfort feature for an advanced driver assistance function.
16. The computer-implemented method according to claim 12, wherein detecting the incident event includes generating a timestamp-based signal t_incident, the timestamp-based signal stored in a scenario database.
17. The computer-implemented method according to claim 12, wherein the incident event is defined as a near miss or a collision with the traffic agent or the road infrastructure element.
18. The computer-implemented method according to claim 12, wherein the at least one traffic agent is trained by a traffic agent learning system for emulating driving competencies of real-world drivers.
19. The computer-implemented method according to claim 12, further comprising the step of assigning a complexity level to the extracted sequence of scenes, wherein the complexity level corresponds to an indication of the complexity of the incident.
20. The computer-implemented method according to claim 12, wherein the first scenario template is created using a formal scenario description language.
21. The computer-implemented method according to claim 12, further comprising the step of assigning a success or fail criteria to the scenario template, wherein the success or fail criteria is determined based on one or more predetermined criteria.
22. The computer-implemented method according to claim 12, further comprising the following steps:
generating a second scenario template extracted from one or more public databases, wherein the second scenario template is distinct from the scenario template;
fusing the first and the second scenario templates to generate a third scenario template wherein the third scenario template is a combination of the properties of the first and second scenario templates.
Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
PCT/SG2018/050477 WO2020060478A1 (en) 2018-09-18 2018-09-18 System and method for training virtual traffic agents
SGPCT/SG2018/050477 2018-09-18

ELROFAI H. ET AL.: "Scenario-based safety validation of connected and automated driving", STREETWISE, TNO, 31 July 2018 (2018-07-31), XP055695183, Retrieved from the Internet <URL:https://publications.tno.nl/publication/34626550/AyT8Zc/TNO-2018-streetwise.pdf> [retrieved on 20190607] *

Also Published As

Publication number Publication date
WO2020060478A1 (en) 2020-03-26

Similar Documents

Publication Publication Date Title
WO2020060480A1 (en) System and method for generating a scenario template
US20230376037A1 (en) Autonomous vehicle simulation system for analyzing motion planners
US20190164007A1 (en) Human driving behavior modeling system using machine learning
US20190130056A1 (en) Deterministic Simulation Framework for Autonomous Vehicle Testing
CN114514524A (en) Multi-agent simulation
CN112654933A (en) Computer-implemented simulation method and apparatus for testing control devices
Ponn et al. An optimization-based method to identify relevant scenarios for type approval of automated vehicles
Viswanadha et al. Addressing the IEEE AV test challenge with Scenic and VerifAI
CN110874610B (en) Human driving behavior modeling system and method using machine learning
Erdogan et al. Parametrized end-to-end scenario generation architecture for autonomous vehicles
Ramakrishna et al. Anti-CARLA: An adversarial testing framework for autonomous vehicles in CARLA
Singh et al. Simulation driven design and test for safety of ai based autonomous vehicles
KR102157587B1 Simulation method for autonomous vehicle linked game servers
Hildebrandt et al. World-in-the-loop simulation for autonomous systems validation
Fremont et al. Safety in autonomous driving: Can tools offer guarantees?
Li A scenario-based development framework for autonomous driving
Balakrishnan et al. Transfer reinforcement learning for autonomous driving: From WiseMove to WiseSim
Levermore et al. Test framework and key challenges for virtual verification of automated vehicles: the VeriCAV project
Shah et al. A simulation-based benchmark for behavioral anomaly detection in autonomous vehicles
JP2024507997A (en) Method and system for generating scenario data for testing vehicle driver assistance systems
Pietruch et al. An overview and review of testing methods for the verification and validation of ADAS, active safety systems, and autonomous driving
Themann et al. Database of relevant traffic scenarios as a tool for the development and validation of automated driving
CN112329908A (en) Image generation method for neural network model test
Thompson Testing the intelligence of unmanned autonomous systems
US20220289217A1 (en) Vehicle-in-virtual-environment (vve) methods and systems for autonomous driving system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18933797

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18933797

Country of ref document: EP

Kind code of ref document: A1