CN108290579A - Simulation system and method for autonomous vehicle - Google Patents

Simulation system and method for autonomous vehicle

Info

Publication number
CN108290579A
CN108290579A (Application No. CN201680064648.2A)
Authority
CN
China
Prior art keywords: data, autonomous vehicle, simulation, dynamic object, partially
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201680064648.2A
Other languages
Chinese (zh)
Other versions
CN108290579B
Inventor
J. S. Levinson
G. T. Sibley
A. G. Rege
Current Assignee
Zoox, Inc.
Original Assignee
Zoox, Inc.
Priority date
Filing date
Publication date
Priority claimed from US 14/756,993 (patent US9878664B2)
Priority claimed from US 14/932,952 (patent US10745003B2)
Priority claimed from US 14/932,958 (patent US9494940B1)
Priority claimed from US 14/756,991 (patent US9720415B2)
Priority claimed from US 14/932,959 (patent US9606539B1)
Priority claimed from US 14/932,963 (patent US9612123B1)
Priority claimed from US 14/756,996 (patent US9916703B2)
Priority claimed from US 14/756,995 (patent US9958864B2)
Priority claimed from US 14/932,966 (patent US9507346B1)
Priority claimed from US 14/932,948 (patent US9804599B2)
Priority claimed from US 14/932,940 (patent US9734455B2)
Priority claimed from US 14/756,992 (patent US9910441B2)
Priority claimed from US 14/932,954 (patent US9517767B1)
Priority claimed from US 14/757,016 (patent US10496766B2)
Priority to CN202210276163.7A (CN114643995A)
Priority claimed from PCT/US2016/060030 (WO2017079229A1)
Application filed by Zoox, Inc.
Publication of CN108290579A
Publication of CN108290579B
Application granted
Legal status: Active
Anticipated expiration

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/015 Electrical circuits for triggering passive safety arrangements including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use
    • B60R21/01512 Passenger detection systems
    • B60R21/01544 Passenger detection systems detecting seat belt parameters, e.g. length, tension or height-adjustment
    • B60R21/01546 Passenger detection systems detecting seat belt parameters using belt buckle sensors
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W30/095 Predicting travel path or likelihood of collision
    • B60W30/0953 Predicting travel path or likelihood of collision, the prediction being responsive to vehicle dynamic parameters
    • B60W30/0956 Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0097 Predicting future conditions
    • B60W50/0098 Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • B60W60/0015 Planning or execution of driving tasks specially adapted for safety
    • B60W60/0016 Planning or execution of driving tasks specially adapted for safety of the vehicle or its occupants
    • B60W60/0017 Planning or execution of driving tasks specially adapted for safety of other traffic participants
    • B60W60/0018 Planning or execution of driving tasks specially adapted for safety by employing degraded modes, e.g. reducing speed, in response to suboptimal conditions
    • B60W60/0027 Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • B60W60/00272 Planning or execution of driving tasks using trajectory prediction for other traffic participants relying on extrapolation of current movement
    • B60W60/00274 Planning or execution of driving tasks using trajectory prediction for other traffic participants considering possible movement changes
    • B60W60/007 Emergency override
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011 Control of position, course or altitude associated with a remote control arrangement
    • G05D1/0027 Control of position, course or altitude associated with a remote control arrangement involving a plurality of vehicles, e.g. fleet or convoy travelling
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 Simulators for teaching or training purposes
    • B60W2050/0001 Details of the control system
    • B60W2050/0002 Automatic control, details of type of controller or control system architecture
    • B60W2050/0018 Method for the design of a control system
    • B60W2050/0062 Adapting control system settings
    • B60W2400/00 Indexing codes relating to detected, measured or calculated conditions or factors
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/20 Static objects
    • B60W2554/40 Dynamic objects, e.g. animals, windblown objects
    • B60W2554/404 Characteristics
    • B60W2554/80 Spatial relation or speed relative to objects
    • B60W2720/00 Output or target parameters relating to overall vehicle dynamics
    • B60W2720/10 Longitudinal speed

Abstract

Various embodiments relate generally to autonomous vehicles and associated mechanical, electrical, and electronic hardware, computer software and systems, and wired and wireless network communications to provide a fleet of autonomous vehicles as a service. More particularly, systems, devices, and methods are configured to simulate the navigation of autonomous vehicles in various simulated environments. In particular, a method may include receiving data representing characteristics of a dynamic object, calculating a classification of the dynamic object to identify a classified dynamic object, identifying data representing a predicted range of motion associated with dynamics-related characteristics of the classified dynamic object, forming a data model of the classified dynamic object, simulating the classified dynamic object in a simulated environment to form a simulated dynamic object, and simulating a predicted response of data representing a simulated autonomous vehicle.
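The pipeline described in the abstract — classify a dynamic object, attach a class-dependent predicted range of motion, form a data model, simulate the object, and simulate the autonomous vehicle's predicted response — can be sketched roughly as follows. This is a minimal illustrative sketch only: the class labels, speed thresholds, and the distance-based response rule are assumptions chosen for demonstration and are not taken from the patent.

```python
from dataclasses import dataclass


@dataclass
class DynamicObject:
    """Observed characteristics of a dynamic object (e.g., a pedestrian).

    Field names here are illustrative assumptions, not the patent's terms.
    """
    position: tuple  # (x, y) in meters
    speed: float     # current speed, m/s


def classify(obj: DynamicObject) -> str:
    """Toy classifier: assign a class label from the object's speed."""
    if obj.speed < 3.0:
        return "pedestrian"
    elif obj.speed < 10.0:
        return "cyclist"
    return "vehicle"


# Predicted range of motion (assumed max speed, m/s) for each class.
MOTION_RANGE = {"pedestrian": 3.0, "cyclist": 10.0, "vehicle": 30.0}


def build_data_model(obj: DynamicObject) -> dict:
    """Form a data model of the classified dynamic object, including the
    class-dependent predicted range of motion."""
    label = classify(obj)
    return {"class": label, "max_speed": MOTION_RANGE[label],
            "position": obj.position, "speed": obj.speed}


def simulate_step(model: dict, dt: float = 0.1) -> dict:
    """Advance the simulated dynamic object one time step, clamping its
    speed to the predicted motion range for its class."""
    speed = min(model["speed"], model["max_speed"])
    x, y = model["position"]
    new_model = dict(model)
    new_model["position"] = (x + speed * dt, y)
    return new_model


def predicted_response(av_position: tuple, model: dict,
                       radius: float = 5.0) -> str:
    """Simulated autonomous vehicle's predicted response: brake when the
    simulated object falls within an assumed safety radius."""
    ox, oy = model["position"]
    ax, ay = av_position
    dist = ((ox - ax) ** 2 + (oy - ay) ** 2) ** 0.5
    return "brake" if dist < radius else "proceed"
```

A simulation run would repeatedly call `simulate_step` on each object's data model and feed the results to `predicted_response`, logging the simulated vehicle's behavior for evaluation.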

Description

Simulation system and method for autonomous vehicle
Cross reference to related applications
This PCT International Application is a continuation of U.S. Application No. 14/757,016, filed November 5, 2015; U.S. Patent Application No. 14/932,959, filed November 4, 2015, entitled "AUTONOMOUS VEHICLE FLEET SERVICE AND SYSTEM"; U.S. Patent Application No. 14/932,963, filed November 4, 2015, entitled "ADAPTIVE MAPPING TO NAVIGATE AUTONOMOUS VEHICLES RESPONSIVE TO PHYSICAL ENVIRONMENT CHANGES"; U.S. Patent Application No. 14/932,966, filed November 4, 2015, entitled "TELEOPERATION SYSTEM AND METHOD FOR TRAJECTORY MODIFICATION OF AUTONOMOUS VEHICLES"; U.S. Patent Application No. 14/932,940, filed November 4, 2015, entitled "AUTOMATED EXTRACTION OF SEMANTIC INFORMATION TO ENHANCE INCREMENTAL MAPPING MODIFICATIONS FOR ROBOTIC VEHICLES"; U.S. Patent Application No. 14/756,995, filed November 4, 2015, entitled "COORDINATION OF DISPATCHING AND MAINTAINING FLEET OF AUTONOMOUS VEHICLES"; U.S. Patent Application No. 14/756,992, filed November 4, 2015, entitled "ADAPTIVE AUTONOMOUS VEHICLE PLANNER LOGIC"; U.S. Patent Application No. 14/756,994, filed November 4, 2015, entitled "SYSTEM OF CONFIGURING ACTIVE LIGHTING TO INDICATE DIRECTIONALITY OF AN AUTONOMOUS VEHICLE"; U.S. Patent Application No. 14/756,993, filed November 4, 2015, entitled "METHOD FOR ROBOTIC VEHICLE COMMUNICATION WITH AN EXTERNAL ENVIRONMENT VIA ACOUSTIC BEAM FORMING"; U.S. Patent Application No. 14/756,991, filed November 4, 2015, entitled "SENSOR-BASED OBJECT-DETECTION OPTIMIZATION FOR AUTONOMOUS VEHICLES"; U.S. Patent Application No. 14/756,996, filed November 4, 2015, entitled "CALIBRATION FOR AUTONOMOUS VEHICLE OPERATION"; U.S. Patent Application No. 14/932,948, filed November 4, 2015, entitled "ACTIVE LIGHTING CONTROL FOR COMMUNICATING A STATE OF AN AUTONOMOUS VEHICLE TO ENTITIES IN A SURROUNDING ENVIRONMENT"; U.S. Patent Application No. 14/932,952, filed November 4, 2015, entitled "RESILIENT SAFETY SYSTEM FOR A ROBOTIC VEHICLE"; U.S. Patent Application No. 14/932,954, filed November 4, 2015, entitled "INTERNAL SAFETY SYSTEMS FOR ROBOTIC VEHICLES"; U.S. Patent Application No. 14/932,958, filed November 4, 2015, entitled "QUADRANT CONFIGURATION OF ROBOTIC VEHICLES"; and U.S. Patent Application No. 14/932,962, filed November 4, 2015, entitled "ROBOTIC VEHICLE ACTIVE SAFETY SYSTEMS AND METHODS". This application is also related to U.S. Patent Application No. 14/757,015, filed November 5, 2015, entitled "INDEPENDENT STEERING, POWER, TORQUE CONTROL AND TRANSFER IN AUTONOMOUS VEHICLES". All of these U.S. patent applications are hereby incorporated by reference in their entirety for all purposes.
Technical field
Various embodiments relate generally to autonomous vehicles and associated mechanical, electrical, and electronic hardware, computer software and systems, and wired and wireless network communications to provide a fleet of autonomous vehicles as a service. More specifically, systems, devices, and methods are configured to simulate the navigation of autonomous vehicles in various simulated environments.
Background
A variety of approaches to developing driverless vehicles focus predominantly on automating conventional vehicles (e.g., manually-driven automobiles), with the aim of producing driverless vehicles for consumer purchase. For example, a number of automotive companies and affiliates are modifying conventional automobiles and control mechanisms, such as steering, to provide consumers with the ability to own a vehicle that may operate without a driver. In some approaches, a conventional driverless vehicle performs safety-critical driving functions under some conditions, but requires the driver to assume control (e.g., steering, etc.) should the vehicle controller fail to resolve certain issues that might jeopardize the safety of the occupants.
Although functional, conventional driverless vehicles typically have a number of drawbacks. For example, a large number of driverless cars under development have evolved from vehicles requiring manual (i.e., human-controlled) steering and other like automotive functions. Consequently, most driverless cars are based on a paradigm in which a vehicle is designed to accommodate a licensed driver, for whom a specific seat or position is reserved within the vehicle. As such, driverless vehicles are designed sub-optimally and generally forego opportunities to simplify vehicle design and conserve resources (e.g., reducing the cost of producing a driverless vehicle). Other drawbacks are also present in conventional driverless vehicles.
Other disadvantages exist in conventional transportation service, due to providing Conventional transport and taking the common of shared service Scheme and cause conventional transportation service to be poorly suited for initiatively being managed the inventory of such as vehicle.In routine In scheme, it is desirable that passenger accesses mobile application with by assigning human driver and vehicle (for example, privately owned to passenger In the case of) centralization service and ask transportation service.In the case of the vehicle possessed using different people, private vehicle and The maintenance of security system can not usually be checked.In another conventional scheme, some entities can be by allowing to be used as It is shared to taking for one group of vehicle to realize that the driver that member recruits enters the vehicle shared between member.When driver needs Will when specific position carries shared vehicle and gets off (in urban environment, such situation is rare) from shared vehicle, The program is poorly suited for providing conventional transportation service, and require to enter fairly expensive real estate (that is, parking lot) with It is parked at this and takes shared vehicle.In conventional scheme described above, from the perspective of inventory, once driver from It opens, since vehicle becomes motionless, so the conventional truck for providing transportation service generally can not be utilized well.In addition, It takes secret sharing (and personal vehicle transport service possessed) to be usually poorly suited for rebalancing inventory, to match transport The demand of service, to adapt to use and typical driving mode.It shall yet further be noted that some routinely describe have limited self-driving from The vehicle of kinetic force is also less suited to rebalance inventory, because usually may require human driver.According to U.S. 
transportation The regulation of portion National Highway Traffic administration (" NHTSA ") safely, the example of the vehicle with limited self-driving automatic ability be by Labeled as the vehicle of 3 grades of (" L3 ") vehicles.
Another drawback is that typical driverless vehicle approaches are generally ill-suited to detecting and navigating vehicles relative to interactions (e.g., social interactions) between a vehicle in travel and drivers of other vehicles or individuals. For example, some conventional approaches are not sufficiently able to identify pedestrians, cyclists, etc., and the associated interactions, such as eye contact, gesturing, and the like, for the purpose of addressing safety risks to the occupants of a driverless vehicle, as well as to drivers of other vehicles, pedestrians, etc.
Thus, what is needed is a solution for implementing autonomous vehicles without the limitations of conventional techniques.
Brief Description of the Drawings
Various embodiments or examples ("examples") of the invention are disclosed in the following detailed description and the accompanying drawings:
FIG. 1 is a diagram depicting an implementation of a fleet of autonomous vehicles that are communicatively networked to an autonomous vehicle service platform, according to some embodiments;
FIG. 2 is an example of a flow diagram to monitor a fleet of autonomous vehicles, according to some embodiments;
FIG. 3A is a diagram depicting examples of sensors and other autonomous vehicle components, according to some examples;
FIGs. 3B to 3E are diagrams depicting examples of sensor field redundancy and autonomous vehicle adaption to a loss of a sensor field, according to some examples;
FIG. 4 is a functional block diagram depicting a system including an autonomous vehicle service platform that is communicatively coupled via a communication layer to an autonomous vehicle controller, according to some examples;
FIG. 5 is an example of a flow diagram to control an autonomous vehicle, according to some embodiments;
FIG. 6 is a diagram depicting an example of an architecture for an autonomous vehicle controller, according to some embodiments;
FIG. 7 is a diagram depicting an example of an autonomous vehicle service platform implementing redundant communication channels to maintain reliable communications with a fleet of autonomous vehicles, according to some embodiments;
FIG. 8 is a diagram depicting an example of a messaging application configured to exchange data among various applications, according to some embodiments;
FIG. 9 is a diagram depicting types of data for facilitating teleoperations using a communications protocol described in FIG. 8, according to some examples;
FIG. 10 is a diagram illustrating an example of a teleoperator interface with which a teleoperator may influence path planning, according to some embodiments;
FIG. 11 is a diagram depicting an example of a planner configured to invoke teleoperations, according to some examples;
FIG. 12 is an example of a flow diagram configured to control an autonomous vehicle, according to some embodiments;
FIG. 13 depicts an example in which a planner may generate a trajectory, according to some examples;
FIG. 14 is a diagram depicting another example of an autonomous vehicle service platform, according to some embodiments;
FIG. 15 is an example of a flow diagram to control an autonomous vehicle, according to some embodiments;
FIG. 16 is a diagram of an example of an autonomous vehicle fleet manager implementing a fleet optimization manager, according to some examples;
FIG. 17 is an example of a flow diagram to manage a fleet of autonomous vehicles, according to some embodiments;
FIG. 18 is a diagram illustrating an autonomous vehicle fleet manager implementing an autonomous vehicle communications link manager, according to some embodiments;
FIG. 19 is an example of a flow diagram to determine actions for autonomous vehicles during an event, according to some embodiments;
FIG. 20 is a diagram depicting an example of a localizer, according to some embodiments;
FIG. 21 is an example of a flow diagram to generate local pose data based on integrated sensor data, according to some embodiments;
FIG. 22 is a diagram depicting another example of a localizer, according to some embodiments;
FIG. 23 is a diagram depicting an example of a perception engine, according to some embodiments;
FIG. 24 is an example of a flow chart to generate perception engine data, according to some embodiments;
FIG. 25 is an example of a segmentation processor, according to some embodiments;
FIG. 26A is a diagram depicting examples of an object tracker and a classifier, according to various embodiments;
FIG. 26B is a diagram depicting another example of an object tracker, according to at least some examples;
FIG. 27 is an example of a front-end processor for a perception engine, according to some examples;
FIG. 28 is a diagram depicting a simulator configured to simulate an autonomous vehicle in a synthetic environment, according to various embodiments;
FIG. 29 is an example of a flow chart to simulate various aspects of an autonomous vehicle, according to some embodiments;
FIG. 30 is an example of a flow chart to generate map data, according to some embodiments;
FIG. 31 is a diagram depicting an architecture of a mapping engine, according to some embodiments;
FIG. 32 is a diagram depicting an autonomous vehicle application, according to some examples; and
FIGs. 33 to 35 illustrate examples of various computing platforms configured to provide various functionalities to components of an autonomous vehicle service, according to various embodiments;
FIG. 36 is a diagram depicting a simulator configured to simulate one or more functions of a simulated autonomous vehicle in a simulated environment, according to some examples;
FIG. 37 depicts a vehicle modeler, according to some examples;
FIG. 38 is a diagram depicting an example of a sensor modeler, according to some examples;
FIG. 39 is a diagram depicting an example of a dynamic object data modeler, according to some examples;
FIG. 40 is a flow chart illustrating an example of generating a simulated environment, according to some examples; and
FIG. 41 illustrates examples of various computing platforms configured to provide various simulator-related functionalities and/or structures to simulate an autonomous vehicle service, according to various embodiments.
Detailed Description
Various embodiments or examples may be implemented in numerous ways, including as a system, a process, an apparatus, a user interface, or a computer-readable medium such as a computer-readable storage medium, in which a sequence of program instructions is sent over a computer network via optical, electronic, or wireless communication links. In general, operations of the disclosed processes may be performed in any order, unless otherwise provided in the claims.
A detailed description of one or more examples is provided below along with accompanying figures. The detailed description is provided in connection with such examples, but is not limited to any particular example. The scope is limited only by the claims, and numerous alternatives, modifications, and equivalents thereof. Numerous specific details are set forth in the following description in order to provide a thorough understanding. These details are provided for the purpose of example, and the described techniques may be practiced according to the claims without some or all of these specific details. For clarity, technical material that is known in the technical fields related to the examples has not been described in detail, to avoid unnecessarily obscuring the description.
FIG. 1 is a diagram depicting an implementation of a fleet of autonomous vehicles that are communicatively networked to an autonomous vehicle service platform, according to some embodiments. Diagram 100 depicts a fleet of autonomous vehicles 109 (e.g., one or more of autonomous vehicles 109a to 109e) operating as a service, each autonomous vehicle 109 being configured to self-drive on a road network 110 and to establish a communication link 192 with an autonomous vehicle service platform 101. In an example in which the fleet of autonomous vehicles 109 constitutes a service, a user 102 may transmit a request 103 for autonomous transportation to autonomous vehicle service platform 101 via one or more networks 106. In response, autonomous vehicle service platform 101 may dispatch one of autonomous vehicles 109 to transport user 102 autonomously from geographic location 119 to geographic location 111. Autonomous vehicle service platform 101 may dispatch an autonomous vehicle from a station 190 to geographic location 119, or may divert an autonomous vehicle 109c already in transit (e.g., without occupants) to service the transportation request of user 102. Autonomous vehicle service platform 101 may further be configured to divert an autonomous vehicle 109c in transit, with passengers, responsive to a request from user 102 (e.g., as a passenger). In addition, autonomous vehicle service platform 101 may be configured to reserve an autonomous vehicle 109c in transit, with passengers, for diversion to service the request of user 102 after the existing passengers have departed. Note that multiple autonomous vehicle service platforms 101 (not shown) and one or more stations 190 may be implemented to service one or more autonomous vehicles 109 in connection with road network 110. One or more stations 190 may be configured to store, service, manage, and/or maintain an inventory of autonomous vehicles 109 (e.g., station 190 may include one or more computing devices implementing autonomous vehicle service platform 101).
According to some examples, at least some of autonomous vehicles 109a to 109c are configured as bidirectional autonomous vehicles, such as bidirectional autonomous vehicle ("AV") 130. Bidirectional autonomous vehicle 130 may be configured to travel in either direction principally along, but not limited to, a longitudinal axis 131. Accordingly, bidirectional autonomous vehicle 130 may be configured to implement active lighting external to the vehicle to alert others in the adjacent vicinity of the direction in which bidirectional autonomous vehicle 130 is traveling (e.g., other drivers, pedestrians, cyclists, etc.). For example, active sources of light 136 may be implemented as active lights 138a when traveling in a first direction, or as active lights 138b when traveling in a second direction. Active lights 138a may be implemented using a first subset of one or more colors, with optional animation (e.g., light patterns of variable intensities, or colors that may change over time). Similarly, active lights 138b may be implemented using a second subset of one or more colors and light patterns that may differ from those of active lights 138a. For example, active lights 138a may be implemented using white-colored lights as "headlights," whereas active lights 138b may be implemented using red-colored lights as "taillights." Active lights 138a and 138b, or portions thereof, may be configured to provide other light-related functionalities, such as "turn signal indication" functions (e.g., using yellow light). According to various examples, logic in autonomous vehicle 130 may be configured to adapt active lights 138a and 138b to comply with the various safety requirements and traffic regulations or laws of any number of jurisdictions.
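The direction-dependent role swap of the active lights described above can be sketched as follows. This is a minimal illustration under stated assumptions: the function name, direction labels, and role strings are hypothetical, not from the patent.

```python
def active_light_roles(travel_direction):
    """Assign light roles for a bidirectional vehicle based on travel direction.

    For a vehicle that can drive either way along its longitudinal axis,
    whichever end currently leads acts as the "headlight" end (white),
    and the trailing end acts as the "taillight" end (red).
    """
    if travel_direction not in ("first", "second"):
        raise ValueError("unknown travel direction")
    leading = "138a" if travel_direction == "first" else "138b"
    trailing = "138b" if travel_direction == "first" else "138a"
    return {leading: "white_headlight", trailing: "red_taillight"}
```

Reversing the travel direction swaps the roles of lights 138a and 138b, which is the behavior the bidirectional maneuver in FIG. 3E relies on.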
In some embodiments, bidirectional autonomous vehicle 130 may be configured to have similar structural elements and components in each quad portion (e.g., quad portion 194). The quad portions are depicted, at least in this example, as portions of bidirectional autonomous vehicle 130 defined by the intersection of planes 132 and 134, both of which pass through the vehicle to form two similar halves on each side of planes 132 and 134. Further, bidirectional autonomous vehicle 130 may include an autonomous vehicle controller 147 that includes logic (e.g., hardware or software, or a combination thereof) configured to control a predominant number of vehicle functions, including driving control (e.g., propulsion, steering, etc.) and active sources of light 136, among other functions. Bidirectional autonomous vehicle 130 also includes a number of sensors 139 disposed at various locations on the vehicle (other sensors are not shown).
Autonomous vehicle controller 147 may further be configured to determine a local pose (e.g., local position) of an autonomous vehicle 109 and to detect external objects relative to the vehicle. For example, consider that bidirectional autonomous vehicle 130 is traveling in direction 119 in road network 110. A localizer (not shown) of autonomous vehicle controller 147 can determine the local pose at geographic location 111. As such, the localizer may use acquired sensor data, such as sensor data associated with the surfaces of buildings 115 and 117, which can be compared against reference data, such as map data (e.g., 3D map data, including reflectance data), to determine the local pose. Further, a perception engine (not shown) of autonomous vehicle controller 147 may be configured to detect, classify, and predict the behavior of external objects, such as external object 112 (a "tree") and external object 114 (a "pedestrian"). Classification of such external objects may broadly categorize objects as static objects, such as external object 112, and dynamic objects, such as external object 114. The localizer and the perception engine, as well as other components of AV controller 147, operate in collaboration to enable autonomous vehicles 109 to drive autonomously.
According to some examples, autonomous vehicle service platform 101 may be configured to provide teleoperator services should an autonomous vehicle 109 request teleoperation. For example, consider that an autonomous vehicle controller 147 in autonomous vehicle 109d detects, at point 191, an object 126 obscuring a path 124 on roadway 122 (as depicted in inset 120). If autonomous vehicle controller 147 cannot ascertain, with a relatively high degree of certainty, a path or trajectory over which vehicle 109d may safely travel, then autonomous vehicle controller 147 may transmit a request message 105 for teleoperation services. In response, a teleoperator computing device 104 may receive instructions from a teleoperator 108 to perform a course of action to successfully (and safely) negotiate obstacle 126. Response data 107 can then be transmitted back to autonomous vehicle 109d, for example, to cause the vehicle to safely cross a set of double lines as it transits along alternate path 121. In some examples, teleoperator computing device 104 may generate a response identifying geographic areas to exclude from path planning. In particular, rather than providing a path to follow, teleoperator 108 may define areas or locations that the autonomous vehicle must avoid.
In view of the foregoing, the structures and/or functionalities of autonomous vehicle 130 and/or autonomous vehicle controller 147, as well as their components, can perform real-time (or near real-time) trajectory calculations through autonomy-related operations, such as localization and perception, to enable autonomous vehicles 109 to self-drive.
In some cases, the bidirectional nature of bidirectional autonomous vehicle 130 provides for a vehicle having quad portions 194 (or any other number of symmetric portions) that are similar or substantially similar to each other. Such symmetry reduces design complexity and relatively decreases the number of unique components or structures, thereby reducing inventory and manufacturing complexities. For example, a drivetrain and wheel system may be disposed in any of the quad portions 194. Further, autonomous vehicle controller 147 may be configured to invoke teleoperation services to reduce the likelihood that an autonomous vehicle 109 is delayed in transit while resolving an event or issue that may otherwise affect the safety of the occupants. In some cases, the visible portion of road network 110 depicts a geo-fenced region that may limit, or otherwise control, the movement of autonomous vehicles 109 to the road network shown in FIG. 1. According to various examples, autonomous vehicle 109, and a fleet thereof, may be configurable to operate as a Level 4 ("full self-driving automation," or L4) vehicle that can provide on-demand transportation with the convenience and privacy of point-to-point personal mobility while providing the efficiency of shared vehicles. In some examples, autonomous vehicle 109, or any autonomous vehicle described herein, may be configured to omit a steering wheel or any other mechanical means of providing manual (i.e., human-controlled) steering for autonomous vehicle 109. Further, autonomous vehicle 109, or any autonomous vehicle described herein, may be configured to omit a seat or location reserved within the vehicle for an occupant to engage a steering wheel or any mechanical interface for a steering system.
FIG. 2 is an example of a flow diagram to monitor a fleet of autonomous vehicles, according to some embodiments. At 202, flow 200 begins with monitoring a fleet of autonomous vehicles. At least one autonomous vehicle includes an autonomous vehicle controller configured to cause the vehicle to autonomously transit from a first geographic region to a second geographic region. At 204, data representing an event associated with a calculated confidence level for the vehicle is detected. An event may be a condition or situation affecting, or potentially affecting, operation of an autonomous vehicle. The event may be internal to the autonomous vehicle, or external to it. For example, an obstacle obscuring a roadway may be viewed as an event, as may a degradation or loss of communication. An event may include traffic conditions or congestion, as well as an unexpected or unusual number or type of external objects (or tracks) perceived by a perception engine. An event may include weather-related conditions (e.g., loss of friction due to ice or rain) or the angle at which the sun is shining (e.g., at sunset), such as a low angle to the horizon that causes the sun to shine brightly into the eyes of human drivers of other vehicles. These and other conditions may be viewed as events that cause invocation of the teleoperator service, or cause the vehicle to execute a safe-stop trajectory.
At 206, data representing a subset of candidate trajectories may be received from the autonomous vehicle responsive to the detection of the event. For example, a planner of an autonomous vehicle controller may calculate and evaluate large numbers of trajectories (e.g., thousands or more) per unit time (e.g., per second). In some embodiments, the candidate trajectories are a subset of the trajectories that provide relatively high confidence levels that the autonomous vehicle may move forward safely in view of the event (e.g., using an alternate path provided by a teleoperator). Note that some candidate trajectories may be ranked, or associated with higher degrees of confidence, relative to other candidate trajectories. According to some examples, subsets of candidate trajectories may originate from any number of sources, such as a planner, a teleoperator computing device (e.g., a teleoperator may determine and provide an approximate path), etc., and may be combined as a superset of candidate trajectories. At 208, path guidance data may be identified at one or more processors. The path guidance data may be configured to assist a teleoperator in selecting a guided trajectory from one or more of the candidate trajectories. In some instances, the path guidance data specifies a value indicative of a confidence level, or a probability indicating the degree of certainty, that a particular candidate trajectory may reduce or negate the probability that the event may impair the operation of the autonomous vehicle. At 210, a guided trajectory, as a selected candidate trajectory, may be received responsive to input from a teleoperator (e.g., a teleoperator may select at least one candidate trajectory from a group of differently-ranked candidate trajectories as the guided trajectory). The selection may be made, for example, via an operator interface listing a number of candidate trajectories in order from highest confidence level to lowest confidence level. At 212, the selection of the candidate trajectory as the guided trajectory may be transmitted to the vehicle, which, in turn, implements the guided trajectory for resolving the condition by causing the vehicle to perform the teleoperator-specified maneuver. As such, the autonomous vehicle may transition out of a non-normative operational state.
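The ranking-and-selection step described above can be sketched as a simple ordering of candidate trajectories by confidence, from which a teleoperator picks a guided trajectory. This is a hypothetical sketch; the field names and data shapes are illustrative and not taken from the patent.

```python
def rank_candidates(candidates):
    """Order candidate trajectories from highest to lowest confidence,
    as they would be listed on a teleoperator interface."""
    return sorted(candidates, key=lambda c: c["confidence"], reverse=True)


def select_guided_trajectory(candidates, operator_choice=0):
    """Return the trajectory the teleoperator selects from the ranked list
    (default: the top-ranked candidate)."""
    ranked = rank_candidates(candidates)
    return ranked[operator_choice]
```

The selected entry would then be transmitted back to the vehicle (step 212) as the guided trajectory.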
FIG. 3A is a diagram depicting examples of sensors and other autonomous vehicle components, according to some examples. Diagram 300 depicts an interior view of bidirectional autonomous vehicle 330, which includes sensors, a signal router 345, drivetrains 349, removable batteries 343, audio generators 344 (e.g., speakers or transducers), and autonomous vehicle ("AV") control logic 347. The sensors shown in diagram 300 include image capture sensors 340 (e.g., light capture devices or cameras of any type), audio capture sensors 242 (e.g., microphones of any type), radar devices 348, sonar devices 341 (or other like sensors, including ultrasonic sensors or acoustic-related sensors), and lidar devices 346, among other sensor types and modalities (some of which are not shown, such as inertial measurement units ("IMUs"), global positioning system ("GPS") sensors, sonar sensors, etc.). Note that quad portion 350 is representative of the symmetry of each of the four "quad portions" of bidirectional autonomous vehicle 330 (e.g., beyond those depicted, each quad portion 350 may include a wheel, a drivetrain 349, similar steering mechanisms, similar structural support and members, etc.). As depicted in FIG. 3A, similar sensors may be placed in similar locations in each quad portion 350; however, any other configuration may be implemented. Each wheel may be steerable individually and independently of the other wheels. Note, too, that removable batteries 343 may be configured to facilitate being swapped in and out, rather than charged in situ, thereby ensuring that downtime due to the necessity of charging batteries 343 is reduced or negligible. While autonomous vehicle controller 347a is depicted as being used in bidirectional autonomous vehicle 330, autonomous vehicle controller 347a is not so limited and may be implemented in unidirectional autonomous vehicles or any other type of vehicle, whether on land, in the air, or at sea. Note that the depicted and described positions, locations, orientations, quantities, and types of sensors shown in FIG. 3A are not intended to be limiting; as such, there may be any number and any type of sensor, and any sensor may be located and oriented anywhere on autonomous vehicle 330.
According to some embodiments, portions of autonomous vehicle ("AV") control logic 347 may be implemented using clusters of graphics processing units ("GPUs") implementing a framework and programming model suitable for programming the clusters of GPUs. For example, a compute unified device architecture ("CUDA")-compatible programming language and application programming interface ("API") model may be used to program the GPUs. CUDA™ is produced and maintained by NVIDIA of Santa Clara, California. Note that other programming languages may be implemented, such as OpenCL, or any other parallel programming language.
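The appeal of GPU clusters here is that much of the perception and localization workload is data-parallel: the same small operation is applied independently to thousands of sensor returns, which maps directly onto one GPU thread per element in a CUDA or OpenCL kernel. The pure-Python sketch below is only a CPU stand-in for that style (not an actual CUDA kernel, and not the patent's implementation): transforming each lidar return into the map frame is per-element work that a kernel launch would perform over all points at once.

```python
import math


def transform_point(point, pose):
    """Rotate a 2D lidar return by the vehicle yaw and translate it into
    the map frame -- the per-element work a GPU kernel would perform."""
    x, y = point
    px, py, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    return (px + c * x - s * y, py + s * x + c * y)


def transform_scan(points, pose):
    # On a GPU, this map would be a single kernel launch over all points,
    # with each point handled by its own thread.
    return [transform_point(p, pose) for p in points]
```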
According to some embodiments, autonomous vehicle control logic 347 may be implemented, using hardware and/or software, as autonomous vehicle controller 347a, which is shown to include a motion controller 362, a planner 364, a perception engine 366, and a localizer 368. As shown, autonomous vehicle controller 347a is configured to receive camera data 340a, lidar data 346a, and radar data 348a, or any other range-sensing or localization data, including sonar data 341a and the like. Autonomous vehicle controller 347a is also configured to receive positioning data, such as GPS data 352, IMU data 354, and other position-sensing data (e.g., wheel-related data, such as steering angles, angular velocity, etc.). Further, autonomous vehicle controller 347a may receive any other sensor data 356, as well as reference data 339. In some cases, reference data 339 includes map data (e.g., 3D map data, 2D map data, or 4D map data (e.g., including time determination)) and route data (e.g., road network data, including, but not limited to, RNDF data (or similar data), MDF data (or similar data), etc.).
Localizer 368 is configured to receive sensor data from one or more sources, such as GPS data 352, wheel data, IMU data 354, lidar data 346a, camera data 340a, radar data 348a, and the like, as well as reference data 339 (e.g., 3D map data and route data). Localizer 368 integrates the data (e.g., fuses the sensor data) and analyzes it by comparing the sensor data against the map data to determine a local pose (or position) of bidirectional autonomous vehicle 330. According to some embodiments, localizer 368 may generate or update the pose or position of any autonomous vehicle in real time or near real time. Note that localizer 368 and its functionality need not be limited to "bidirectional" vehicles, and may be implemented in any vehicle of any type. Therefore, localizer 368 (as well as other components of AV controller 347a) may be implemented in a "unidirectional" vehicle or any non-autonomous vehicle. According to some embodiments, data describing the local pose may include one or more of an x-coordinate, a y-coordinate, and a z-coordinate (or any coordinate of any coordinate system, including polar or cylindrical coordinate systems, and the like), a yaw value, a roll value, a pitch value (e.g., an angular value), a rate (e.g., velocity), an altitude, and the like.
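The pose fields enumerated above can be collected in a small data structure. This is an illustrative sketch under the assumption of a Cartesian frame; the class and field names are not from the patent.

```python
from dataclasses import dataclass


@dataclass
class LocalPose:
    """Local pose as enumerated above: position coordinates,
    orientation angles (yaw/roll/pitch), rate, and altitude."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    yaw: float = 0.0
    roll: float = 0.0
    pitch: float = 0.0
    velocity: float = 0.0
    altitude: float = 0.0
```

Such a record is what the localizer would emit (and update in real time or near real time) for downstream consumers like the planner.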
Perception engine 366 is configured to receive sensor data from one or more sources, such as lidar data 346a, camera data 340a, radar data 348a, etc., as well as local pose data. Perception engine 366 may be configured to determine the locations of external objects based on sensor data and other data. External objects, for instance, may be objects that are not part of a drivable surface. For example, perception engine 366 may detect external objects and classify them as pedestrians, bicyclists, dogs, other vehicles, etc. (e.g., perception engine 366 may be configured to classify the objects in accordance with a type of classification, which may be associated with semantic information, including a label). Based on these classifications, the external objects may be labeled as dynamic objects or static objects. For example, an external object classified as a tree may be labeled as a static object, while an external object classified as a pedestrian may be labeled as a dynamic object. External objects labeled as static may or may not be described in the map data. Examples of external objects likely to be labeled as static include traffic cones, cement barriers arranged across a roadway, lane-closure signs, newly-placed mailboxes or trash cans adjacent to a roadway, etc. Examples of external objects likely to be labeled as dynamic include bicyclists, pedestrians, animals, other vehicles, etc. If an external object is labeled as dynamic, further data about the external object may indicate a typical level of activity and velocity, as well as behavior patterns associated with the classification type. Further data about the external object may be generated by tracking it. As such, the classification type can be used to predict, or otherwise determine, the likelihood that an external object may, for example, interfere with an autonomous vehicle traveling along a planned path. For example, an external object classified as a pedestrian may be associated with some maximum speed, as well as an average speed (e.g., based on tracking data). The velocity of the pedestrian relative to the velocity of the autonomous vehicle can be used to determine whether a collision is likely. Further, perception engine 364 may determine a level of uncertainty associated with the current and future states of objects. In some examples, the level of uncertainty may be expressed as an estimated value (or probability).
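The static/dynamic labeling and the class-based reachability reasoning described above can be sketched as follows. The class lists and speed numbers are illustrative assumptions (the patent gives examples of classes but no numeric priors), and the collision check is a deliberately crude stand-in for the probabilistic reasoning a real perception engine would perform.

```python
# Classification types, following the static/dynamic examples above.
STATIC_TYPES = {"tree", "traffic_cone", "cement_barrier", "mailbox"}
DYNAMIC_TYPES = {"pedestrian", "bicyclist", "animal", "vehicle"}

# Hypothetical per-class maximum-speed priors (m/s); numbers are made up.
MAX_SPEED = {"pedestrian": 3.0, "bicyclist": 12.0}


def label_object(classification):
    """Label a classified external object as 'static' or 'dynamic'."""
    if classification in STATIC_TYPES:
        return "static"
    if classification in DYNAMIC_TYPES:
        return "dynamic"
    return "unknown"


def collision_possible(classification, gap_m, time_horizon_s, av_speed_mps):
    """Crude reachability check: could the object close the gap to the
    vehicle within the horizon, given its class's maximum speed prior?"""
    closing_speed = MAX_SPEED.get(classification, 0.0) + av_speed_mps
    return closing_speed * time_horizon_s >= gap_m
```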
Planner 364 is configured to receive perception data from perception engine 366, and may also receive localizer data from localizer 368. According to some examples, the perception data may include an obstacle map specifying the static and dynamic objects located in the vicinity of the autonomous vehicle, whereas the localizer data may include a local pose or position. In operation, planner 364 generates numerous trajectories based on at least the location of the autonomous vehicle relative to the locations of external dynamic and static objects, and evaluates those trajectories. Planner 364 selects an optimal trajectory based on a variety of criteria by which to direct the autonomous vehicle in a way that provides collision-free travel. In some examples, planner 364 may be configured to calculate the trajectories as probabilistically-determined trajectories. Further, planner 364 may transmit steering and propulsion commands (as well as deceleration or braking commands) to motion controller 362. Motion controller 362 may subsequently convert any of the commands (e.g., a steering command, a throttle or propulsion command, or a braking command) into control signals (e.g., for application to actuators or other mechanical interfaces) to implement changes in steering or wheel angle 351 and/or velocity 353.
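The generate-evaluate-select loop above can be sketched as choosing, among the collision-free candidates, the one with the lowest cost. This is a simplified illustration: a real planner evaluates thousands of probabilistic trajectories per second, and the evaluation functions here are hypothetical stand-ins for the planner's criteria.

```python
def select_optimal_trajectory(trajectories, in_collision, cost):
    """Pick the lowest-cost trajectory among those predicted collision-free.

    `in_collision` and `cost` are caller-supplied evaluation functions,
    standing in for the planner's criteria (clearance, comfort, progress).
    Returns None if no safe trajectory exists, in which case the vehicle
    might instead invoke teleoperation or a safe-stop trajectory.
    """
    safe = [t for t in trajectories if not in_collision(t)]
    if not safe:
        return None
    return min(safe, key=cost)
```

The selected trajectory would then be turned into steering/propulsion commands by the motion controller.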
FIGs. 3B to 3E are diagrams depicting examples of sensor field redundancy and autonomous vehicle adaption to a loss of a sensor field, according to some examples. Diagram 391 of FIG. 3B depicts a sensor field 301a in which sensor 310a detects objects (e.g., for determining range or distance, or other information). While sensor 310a may implement any type of sensor or sensor modality, sensor 310a and similarly-described sensors (such as sensors 310b, 310c, and 310d) may include lidar devices. Therefore, sensor fields 301a, 301b, 301c, and 301d each include a field into which lasers extend. Diagram 392 of FIG. 3C depicts four overlapping sensor fields, each generated by a corresponding lidar sensor 310 (not shown). As shown, portions 301 of the sensor fields include no overlapping sensor fields (e.g., a single lidar field), portions 302 of the sensor fields include two overlapping sensor fields, and portions 303 include three overlapping sensor fields; as such, the sensors provide multiple levels of redundancy should a lidar sensor fail.
FIG. 3D depicts a loss of a sensor field due to failed operation of lidar 309, according to some examples. Sensor field 302 of FIG. 3C is transformed into a single sensor field 305, one of sensor fields 301 of FIG. 3C is lost to gap 304, and three of sensor fields 303 of FIG. 3C are transformed into sensor fields 306 (i.e., limited to two overlapping fields). Should autonomous car 330c travel in the direction of travel 396, the sensor field in front of the moving autonomous vehicle is less robust than the one at the trailing end portion. According to some examples, an autonomous vehicle controller (not shown) is configured to leverage the bidirectional nature of autonomous vehicle 330c to address the loss of sensor field at the leading area in front of the vehicle. FIG. 3E depicts a bidirectional maneuver for restoring a certain robustness of the sensor field in front of autonomous vehicle 330d. As shown, a more robust sensor field 302 is disposed at the rear of vehicle 330d, coextensive with taillights 348. When convenient, autonomous vehicle 330d performs a bidirectional maneuver by pulling into driveway 397 and switching its directionality, such that taillights 348 actively switch to the other side (e.g., the trailing end) of autonomous vehicle 330d. As shown, autonomous vehicle 330d restores the robust sensor field 302 in front of the vehicle as it travels along direction of travel 398. Further, the above-described bidirectional maneuver obviates the need for a more complicated maneuver that would require backing up into a busy roadway.
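The overlap levels in FIGs. 3C and 3D can be thought of as coverage counts: the number of working sensors whose fields cover a given bearing around the vehicle. The toy model below uses made-up angular sectors purely for illustration; the patent does not specify sensor geometry this way.

```python
def coverage_level(bearing_deg, sensor_sectors):
    """Count how many working sensors cover a bearing (in degrees).

    `sensor_sectors` maps a sensor name to a (start_deg, end_deg) arc in
    [0, 360), where arcs may wrap past 0. A coverage of 1 means no
    redundancy: losing that one sensor opens a gap, as in FIG. 3D.
    """
    b = bearing_deg % 360
    count = 0
    for start, end in sensor_sectors.values():
        s, e = start % 360, end % 360
        inside = (s <= b <= e) if s <= e else (b >= s or b <= e)
        if inside:
            count += 1
    return count
```

Removing a failed sensor's entry from `sensor_sectors` and re-counting models the transformation of the FIG. 3C fields into the degraded FIG. 3D fields.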
FIG. 4 is a functional block diagram depicting a system including an autonomous vehicle service platform that is communicatively coupled via a communication layer to an autonomous vehicle controller, according to some examples. Diagram 400 depicts an autonomous vehicle ("AV") controller 447 disposed in an autonomous vehicle 430, which, in turn, includes a number of sensors 470 coupled to autonomous vehicle controller 447. Sensors 470 include one or more LIDAR devices 472, one or more cameras 474, one or more radars 476, one or more global positioning system ("GPS") data receiver-sensors, one or more inertial measurement units ("IMUs") 475, one or more odometry sensors 477 (e.g., wheel encoder sensors, wheel speed sensors, and the like), and any other suitable sensors 478, such as infrared cameras or sensors, hyperspectral-capable sensors, ultrasonic sensors (or any other acoustic-energy-based sensor), radio-frequency-based sensors, and the like. In some cases, wheel angle sensors configured to sense steering angles of wheels may be included as odometry sensors 477 or suitable sensors 478. In a non-limiting example, autonomous vehicle controller 447 may include four or more LIDARs 472, sixteen or more cameras 474, and four or more radar units 476. Further, sensors 470 may be configured to provide sensor data to components of autonomous vehicle controller 447 and to elements of autonomous vehicle service platform 401. As shown in diagram 400, autonomous vehicle controller 447 includes a planner 464, a motion controller 462, a localizer 468, a perception engine 466, and a local map generator 440. Note that elements depicted in diagram 400 of FIG. 4 may include structures and/or functions that are the same as similarly-named elements described in connection with one or more other drawings.
Localizer 468 is configured to localize the autonomous vehicle (i.e., determine a local pose) relative to reference data, which may include map data, route data (e.g., road network data, such as RNDF-like data), and the like. In some cases, localizer 468 is configured to identify, for example, a point in space that may represent a location of autonomous vehicle 430 relative to features of a representation of an environment. Localizer 468 is shown to include a sensor data integrator 469, which may be configured to integrate multiple subsets of sensor data (e.g., of different sensor modalities) to reduce uncertainties related to each individual type of sensor. According to some examples, sensor data integrator 469 is configured to fuse sensor data (e.g., LIDAR data, camera data, radar data, etc.) to form integrated sensor data values for determining a local pose. According to some examples, localizer 468 retrieves reference data originating from a reference data repository 405, which includes a map data repository 405a for storing 2D map data, 3D map data, 4D map data, and the like. Localizer 468 may be configured to identify at least a subset of features in the environment to match against map data to identify, or otherwise confirm, a pose of autonomous vehicle 430. According to some examples, localizer 468 may be configured to identify any amount of features in an environment, such that a set of features may include one or more features, or all features. In a specific example, any amount of LIDAR data (e.g., most or substantially all LIDAR data) may be compared against data representing a map for purposes of localization. Generally, non-matched objects resulting from the comparison of environmental features and map data may be dynamic objects, such as vehicles, persons, cyclists, and the like. Note that detection of dynamic objects, including obstacles, may be performed with or without map data. In particular, dynamic objects may be detected and tracked independently of map data (i.e., in the absence of map data). In some instances, 2D map data and 3D map data may be viewed as "global map data" or map data that has been validated by autonomous vehicle service platform 401 at a point in time. As map data in map data repository 405a may be updated and/or validated periodically, a deviation may exist between the map data and an actual environment in which the autonomous vehicle is positioned. Therefore, localizer 468 may retrieve locally-derived map data generated by local map generator 440 to enhance localization. Local map generator 440 is configured to generate local map data in real-time or near real-time. Optionally, local map generator 440 may receive static and dynamic object map data to enhance the accuracy of locally-generated maps by, for example, disregarding dynamic objects in localization. According to at least some embodiments, local map generator 440 may be integrated with, or formed as part of, localizer 468. In at least one case, local map generator 440, either individually or in collaboration with localizer 468, may be configured to generate map and/or reference data based on simultaneous localization and mapping ("SLAM") or the like. Note that localizer 468 may implement a "hybrid" approach to using map data, whereby logic in localizer 468 may be configured to select various amounts of map data from either map data repository 405a or local map data from local map generator 440, depending on the degree of reliability of each source of map data. Therefore, localizer 468 may still use out-of-date map data in view of locally-generated map data.
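The "hybrid" selection between global (validated) map data and locally-generated map data, based on the degree of reliability of each source, can be sketched as follows. This is a minimal illustrative sketch: the function name, reliability scores, and threshold are assumptions, not an API disclosed by the specification.

```python
# Hypothetical sketch of the localizer's "hybrid" map-source selection:
# pick the validated global map unless its reliability has decayed
# (e.g., the environment has drifted since the last platform validation),
# in which case fall back to the locally-generated map.

def select_map_source(global_reliability: float,
                      local_reliability: float,
                      threshold: float = 0.5) -> str:
    """Return which map source to trust for localization."""
    if global_reliability >= threshold and global_reliability >= local_reliability:
        return "global"   # out-of-date but validated data may still win
    return "local"        # locally-derived map from the local map generator
```

In this sketch, the reliability scores would be produced elsewhere (e.g., from detected deviations between stored map data and the sensed environment); how those scores are computed is not specified here.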
Perception engine 466 is configured to, for example, assist planner 464 in planning routes and generating trajectories by identifying objects of interest in a surrounding environment in which autonomous vehicle 430 is transiting. Further, probabilities may be associated with each of the objects of interest, whereby a probability may represent a likelihood that an object of interest may be a threat to safe travel (e.g., a fast-moving motorcycle may require enhanced tracking, rather than a person sitting at a bus stop bench reading a newspaper). As shown, perception engine 466 includes an object detector 442 and an object classifier 444. Object detector 442 is configured to distinguish objects relative to other features in the environment, and object classifier 444 may be configured to classify objects as either dynamic or static objects and track the locations of the dynamic and the static objects relative to autonomous vehicle 430 for planning purposes. Further, perception engine 466 may be configured to assign an identifier to a static or dynamic object that specifies whether the object is (or has the potential to become) an obstacle that may impede path planning at planner 464. Although not shown in FIG. 4, note that perception engine 466 may also perform other perception-related functions, such as segmentation and tracking, examples of which are described below.
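The idea that each object of interest carries a threat probability that drives how much tracking attention it receives can be illustrated with a small sketch. All names, weights, and probabilities below are assumptions for illustration only; the specification does not define this computation.

```python
# Illustrative sketch: rank detected objects for tracking effort by
# combining the static/dynamic classification with an assumed threat
# probability (a fast-moving motorcycle outranks a seated newspaper reader).

def tracking_priority(is_dynamic: bool, threat_probability: float) -> float:
    """Higher values mean the object warrants enhanced tracking."""
    base = 1.0 if is_dynamic else 0.2   # static objects need less attention
    return base * threat_probability

objects = [
    {"label": "motorcycle",    "dynamic": True,  "threat": 0.9},
    {"label": "seated_person", "dynamic": False, "threat": 0.05},
]
ranked = sorted(
    objects,
    key=lambda o: tracking_priority(o["dynamic"], o["threat"]),
    reverse=True,
)
```

A planner-facing perception engine would additionally attach the obstacle identifier discussed above; that bookkeeping is omitted here.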
Planner 464 is configured to generate a number of candidate trajectories for accomplishing a goal of reaching a destination via a number of paths or routes that are available. A trajectory evaluator 465 is configured to evaluate candidate trajectories and identify which subsets of candidate trajectories are associated with higher degrees of confidence levels of providing collision-free paths to the destination. As such, trajectory evaluator 465 may select an optimal trajectory based on relevant criteria for causing commands to generate control signals for vehicle components 450 (e.g., actuators or other mechanisms). Note that the relevant criteria may include any number of factors that define optimal trajectories, the selection of which need not be limited to reducing collisions. For example, the selection of trajectories may be made to optimize user experience (e.g., user comfort) as well as collision-free trajectories that comply with traffic regulations and laws. User experience may be optimized by moderating accelerations in various linear and angular directions (e.g., to reduce jerky-like travel or other unpleasant motion). In some cases, at least a portion of the relevant criteria may specify which of the other criteria to override or supersede. For example, legal restrictions may be temporarily lifted or deemphasized when generating trajectories in limited situations (e.g., crossing double yellow lines to go around a cyclist, or traveling at speeds higher than the posted speed limit to match traffic flows). As such, the control signals are configured to cause propulsion and directional changes at the drivetrain and/or wheels. In this example, motion controller 462 is configured to convert commands into control signals (e.g., velocity, wheel angles, etc.) for controlling the mobility of autonomous vehicle 430. In the event that trajectory evaluator 465 has insufficient information to ensure a confidence level high enough to provide collision-free, optimized travel, planner 464 may generate a request to teleoperator 404 for teleoperator support.
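The selection logic described above, which is to pick the highest-confidence collision-free candidate, or fall back to a teleoperator request when no candidate clears a confidence threshold, can be sketched compactly. The data shapes and the threshold value are assumptions for illustration; the actual evaluator would weigh many more criteria (comfort, legality, overrides).

```python
# Minimal sketch of the trajectory evaluator's fallback behavior:
# execute the best candidate, or request teleoperator support when
# no candidate offers sufficient confidence of collision-free travel.

def choose_trajectory(candidates, min_confidence=0.8):
    """candidates: list of (trajectory_id, confidence) tuples."""
    viable = [c for c in candidates if c[1] >= min_confidence]
    if not viable:
        # Insufficient confidence: planner escalates to the teleoperator.
        return ("REQUEST_TELEOPERATOR", None)
    best = max(viable, key=lambda c: c[1])
    return ("EXECUTE", best[0])
```

In practice the returned trajectory would be converted by the motion controller into low-level control signals (velocity, wheel angles); that stage is outside this sketch.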
Autonomous vehicle service platform 401 includes teleoperator 404 (e.g., a teleoperator computing device), a reference data repository 405, a map updater 406, a vehicle data controller 408, a calibrator 409, and an off-line object classifier 410. Note that each element of autonomous vehicle service platform 401 may be independently located or distributed and in communication with other elements of autonomous vehicle service platform 401. Further, elements of autonomous vehicle service platform 401 may independently communicate with autonomous vehicle 430 via communication layer 402. Map updater 406 is configured to receive map data (e.g., from local map generator 440, sensors 460, or any other component of autonomous vehicle controller 447), and is further configured to detect deviations, for example, of map data in map data repository 405a from the locally-generated map. Vehicle data controller 408 may cause map updater 406 to update reference data within repository 405 and facilitate updates to 2D, 3D, and/or 4D map data. In some cases, vehicle data controller 408 may control the rate at which local map data is received into autonomous vehicle service platform 401, as well as the frequency at which map updater 406 performs updating of the map data.
Calibrator 409 is configured to perform calibration of various sensors of the same or different types. Calibrator 409 may be configured to determine the relative poses of the sensors (e.g., in Cartesian space (x, y, z)) and the orientations of the sensors (e.g., roll, yaw, and pitch). The pose and orientation of a sensor, such as a camera, LIDAR sensor, or radar sensor, may be calibrated relative to other sensors, as well as globally relative to the vehicle's reference frame. Off-line self-calibration may also calibrate or estimate other parameters, such as the vehicle inertia tensor, wheel base, wheel radius, or surface road friction. Calibration may also be performed online to detect parameter change, according to some examples. Note, too, that calibration by calibrator 409 may include intrinsic parameters of the sensors (e.g., optical distortion, beam angles, etc.) and extrinsic parameters. In some cases, calibrator 409 may be performed by, for example, maximizing a correlation between depth discontinuities in 3D laser data and edges of image data. Off-line object classification 410 is configured to receive data, such as sensor data, from sensors 470 or any other component of autonomous vehicle controller 447. According to some embodiments, an off-line classification pipeline of off-line object classification 410 may be configured to pre-collect and annotate objects (e.g., manually by a human and/or automatically using an offline labeling algorithm), and may further be configured to train an on-line classifier (e.g., object classifier 444), which can provide real-time classification of object types during on-line autonomous operation.
FIG. 5 is an example of a flow diagram to control an autonomous vehicle, according to some embodiments. At 502, flow 500 begins when sensor data originating from sensors of multiple modalities at an autonomous vehicle is received, for example, by an autonomous vehicle controller. One or more subsets of the sensor data may be integrated to generate fused data, for example, to improve estimation. In some examples, sensor streams of one or more sensors (e.g., of the same or different modalities) may be fused at 504 to form fused sensor data. In some examples, subsets of LIDAR sensor data and camera sensor data may be fused at 504 to facilitate localization. At 506, data representing objects based on at least two subsets of sensor data may be derived at a processor. For example, data identifying static objects or dynamic objects may be derived (e.g., at a perception engine) from at least LIDAR and camera data. At 508, a determination is made that the detected objects impact a planned path, and a subset of trajectories is evaluated (e.g., at a planner) responsive to the detected objects at 510. At 512, a determination is made that a level of confidence exceeds a range of acceptable confidence levels associated with normative operation of an autonomous vehicle. Therefore, in this case, the confidence level may be such that a selection of an optimized path is less likely to be determined, whereby an optimized path may be determined as a function of the probability of facilitating collision-free travel, complying with traffic laws, providing a comfortable user experience (e.g., a comfortable ride), and/or any other factor with which to generate candidate trajectories. As such, a request for an alternate path may be transmitted to a teleoperator computing device at 514. Thereafter, the teleoperator computing device may provide the planner with an optimal trajectory over which the autonomous vehicle may travel. In some cases, the vehicle may also determine that executing a safe-stop maneuver is the best course of action (e.g., safely and automatically bringing the autonomous vehicle to a stop at a location having relatively low probabilities of danger). Note that the order depicted in this flow diagram and other flow diagrams herein is not intended to imply a requirement to linearly implement the various functions, as each portion of a flow diagram may be performed serially, or in parallel with, as well as independently of or dependently on, any one or more other portions of the flow diagram.
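The flow above (502 through 514) can be sketched as a single control step; the stage names, the toy fusion, and the confidence model are illustrative assumptions only, chosen to make the escalation branch visible.

```python
# Hedged sketch of flow 500: fuse sensor streams (504), derive object
# data (506), check path impact (508), and escalate to a teleoperator
# when confidence falls outside the normative range (512-514).

def control_step(lidar, camera, min_confidence=0.8):
    fused = lidar + camera                           # 504: fuse streams (toy fusion)
    objects = [o for o in fused if o["object"]]      # 506: derive object data
    affected = any(o["on_path"] for o in objects)    # 508: does any object impact the path?
    confidence = 0.6 if affected else 0.95           # 512: assumed confidence model
    if confidence < min_confidence:
        return "request_teleoperator"                # 514: ask for an alternate path
    return "continue"
```

A real implementation would evaluate a subset of candidate trajectories (510) before deciding; that evaluation is collapsed into the single assumed confidence value here.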
FIG. 6 is a diagram depicting an example of an architecture for an autonomous vehicle controller, according to some embodiments. Diagram 600 depicts a number of processes, including a motion controller process 662, a planner process 664, a perception process 666, a mapping process 640, and a localization process 668, some of which may generate or receive data relative to other processes. Other processes, such as processes 670 and 650, may facilitate interactions with one or more mechanical components of an autonomous vehicle. For example, perception process 666, mapping process 640, and localization process 668 are configured to receive sensor data from sensors 670, whereas planner process 664 and perception process 666 are configured to receive guidance data 606, which may include route data, such as road network data. Further to diagram 600, localization process 668 is configured to receive map data 605a (i.e., 2D map data), map data 605b (i.e., 3D map data), and local map data 642, among other types of map data. For example, localization process 668 may also receive other forms of map data, such as 4D map data, which may include, for example, a time determination. Localization process 668 is configured to generate local position data 641 representing a local pose. Local position data 641 is provided to motion controller process 662, planner process 664, and perception process 666. Perception process 666 is configured to generate static and dynamic object map data 667, which, in turn, may be transmitted to planner process 664. In some examples, static and dynamic object map data 667 may be transmitted along with other data, such as semantic classification information and predicted object behavior. Planner process 664 is configured to generate trajectory data 665, which describes a number of trajectories generated by planner 664. Motion controller process 662 uses trajectory data 665 to generate low-level commands or control signals for application to actuators 650 to cause changes in steering angles and/or velocity.
FIG. 7 is a diagram depicting an example of an autonomous vehicle service platform implementing redundant communication channels to maintain reliable communications with a fleet of autonomous vehicles, according to some embodiments. Diagram 700 depicts an autonomous vehicle service platform 701 including a reference data generator 705, a vehicle data controller 702, an autonomous vehicle fleet manager 703, a teleoperator manager 707, a simulator 740, and a policy manager 742. Reference data generator 705 is configured to generate and modify map data and route data (e.g., RNDF data). Further, reference data generator 705 may be configured to access 2D data in a 2D map data repository 720, access 3D data in a 3D map data repository 722, and access route data in a route data repository 724. Other map representation data and repositories may be implemented in some examples, such as 4D map data including time determination. Vehicle data controller 702 may be configured to perform a variety of operations. For example, vehicle data controller 702 may be configured to change a rate at which data is exchanged between a fleet of autonomous vehicles and platform 701 based on quality levels of communication over channels 770. During bandwidth-constrained periods, for example, data communications may be prioritized such that teleoperation requests from autonomous vehicle 730 are prioritized highly to ensure delivery. Further, variable levels of data abstraction may be transmitted per vehicle over channels 770, depending on the bandwidth available for a particular channel. For example, in the presence of a robust network connection, full LIDAR data (e.g., substantially all LIDAR data, but also possibly less) may be transmitted, whereas in the presence of a degraded or low-speed connection, simpler or more abstract depictions of the data may be transmitted (e.g., bounding boxes with associated metadata, etc.). Autonomous vehicle fleet manager 703 is configured to coordinate the dispatching of autonomous vehicles 730 to optimize multiple variables, including efficient use of battery power, times of travel, whether or not an air-conditioning unit in an autonomous vehicle 730 may be used during low charge states of a battery, etc., any or all of which may be monitored in view of optimizing cost functions associated with operating an autonomous vehicle service. An algorithm may be implemented to analyze a variety of variables with which to minimize costs or times of travel for a fleet of autonomous vehicles. Further, autonomous vehicle fleet manager 703 maintains an inventory of autonomous vehicles, as well as parts for accommodating a service schedule, in view of maximizing up-time of the fleet.
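The bandwidth-dependent abstraction policy described above (full LIDAR over a robust link, bounding boxes over a degraded one) can be sketched as a simple dispatcher. The threshold and payload labels are assumptions for illustration; the specification does not fix numeric cutoffs.

```python
# Hedged sketch of per-vehicle variable data abstraction: choose the
# telemetry representation based on the bandwidth currently available
# on the channel.

def telemetry_payload(bandwidth_mbps: float, lidar_points, bounding_boxes):
    """Select full LIDAR data or a terser abstraction for transmission."""
    if bandwidth_mbps >= 10.0:   # assumed threshold for a "robust" link
        return {"type": "full_lidar", "data": lidar_points}
    # Degraded or low-speed connection: send the abstract depiction.
    return {"type": "bounding_boxes", "data": bounding_boxes}
```

A fuller implementation would also apply the priority rule mentioned above, keeping teleoperation requests at the head of the transmit queue regardless of bandwidth.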
Teleoperator manager 707 is configured to manage a number of teleoperator computing devices 704 with which teleoperators 708 provide input. Simulator 740 is configured to simulate operation of one or more autonomous vehicles 730, as well as the interactions between teleoperator manager 707 and an autonomous vehicle 730. Simulator 740 may also simulate operation of a number of sensors disposed in autonomous vehicle 730 (including the introduction of simulated noise). Further, an environment, such as a city, may be simulated such that a simulated autonomous vehicle may be introduced into the synthetic environment, whereby simulated sensors may receive simulated sensor data, such as simulated laser returns. Simulator 740 may provide other functions as well, including validating software updates and/or map data. Policy manager 742 is configured to maintain data representing policies or rules by which an autonomous vehicle ought to behave in view of a variety of conditions or events that an autonomous vehicle may encounter while traveling in a network of roadways. In some cases, updated policies and/or rules may be simulated in simulator 740 to confirm safe operation of the fleet of autonomous vehicles in view of changes to a policy. Some of the above-described elements of autonomous vehicle service platform 701 are described further below.
Communication channels 770 are configured to provide networked communication links among the fleet of autonomous vehicles 730 and autonomous vehicle service platform 701. For example, communication channel 770 includes a number of different types of networks 771, 772, 773, and 774, with corresponding subnetworks (e.g., 771a to 771n), to ensure a certain level of redundancy for operating an autonomous vehicle service reliably. For example, the different types of networks in communication channel 770 may include different cellular network providers, different types of data networks, etc., to ensure sufficient bandwidth in the event of reduced or lost communications due to outages in one or more networks 771, 772, 773, and 774.
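The redundancy idea above, where heterogeneous networks back each other up so a single provider outage does not sever the fleet link, can be sketched as a failover loop. The interface (`send_fn` callables, `ConnectionError` as the failure signal) is an assumption for illustration.

```python
# Hedged sketch of channel redundancy: attempt transmission over several
# heterogeneous networks in priority order, degrading to the next one on
# failure, so one network outage does not interrupt fleet communications.

def send_with_failover(message, networks):
    """networks: list of (name, send_fn) pairs; returns the network that succeeded."""
    for name, send_fn in networks:
        try:
            send_fn(message)
            return name
        except ConnectionError:
            continue  # this network is down; try the next redundant one
    raise RuntimeError("all redundant channels failed")
```

Subnetworks (e.g., 771a to 771n) could be flattened into the same priority-ordered list in this sketch.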
FIG. 8 is a diagram depicting an example of a messaging application configured to exchange data among various applications, according to some embodiments. Diagram 800 depicts a teleoperator application 801 disposed in a teleoperator manager, and an autonomous vehicle application 830 disposed in an autonomous vehicle, whereby teleoperator application 801 and autonomous vehicle application 830 exchange message data via a protocol that facilitates communications over a variety of networks, such as networks 871, 872, and other networks 873. According to some examples, the communication protocol is a middleware protocol implemented as a Data Distribution Service™ having a specification maintained by the Object Management Group consortium. In accordance with the communications protocol, teleoperator application 801 and autonomous vehicle application 830 may each include a message router 854 disposed in a message domain, the message router being configured to interface with a teleoperator API 852. In some examples, message router 854 is a routing service. In some examples, message domain 850a in teleoperator application 801 may be identified by a teleoperator identifier, whereas message domain 850b may be identified as a domain associated with a vehicle identifier. Teleoperator API 852 in teleoperator application 801 is configured to interface with teleoperator processes 803a to 803c, whereby teleoperator process 803b is associated with an autonomous vehicle identifier 804, and teleoperator process 803c is associated with an event identifier 806 (e.g., an identifier specifying an intersection that may be problematic for collision-free path planning). Teleoperator API 852 in autonomous vehicle application 830 is configured to interface with an autonomous vehicle operating system 840, which includes a sensing application 842, a perception application 844, a localization application 846, and a control application 848. In view of the foregoing, the above-described communications protocol may facilitate data exchanges to facilitate teleoperations as described herein. Further, the above-described communications protocol may be adapted to provide secure data exchanges among one or more autonomous vehicles and one or more autonomous vehicle service platforms. For example, message routers 854 may be configured to encrypt and decrypt messages to provide secured interactions between, for example, a teleoperator process 803 and an autonomous vehicle operating system 840.
FIG. 9 is a diagram depicting the types of data that facilitate teleoperations using the communications protocol described in FIG. 8, according to some examples. Diagram 900 depicts a teleoperator 908 interfacing with a teleoperator computing device 904 coupled to a teleoperator application 901, which is configured to exchange data via a data-centric messaging bus 972 implemented in one or more networks 971. Data-centric messaging bus 972 provides a communication link between teleoperator application 901 and autonomous vehicle application 930. Teleoperator API 962 of teleoperator application 901 is configured to receive message service configuration data 964 and route data 960, such as road network data (e.g., RNDF-like data), mission data (e.g., MDF data), and the like. Similarly, a messaging service bridge 932 is also configured to receive messaging service configuration data 934. Messaging service configuration data 934 and 964 provide configuration data to configure the messaging services between teleoperator application 901 and autonomous vehicle application 930. An example of messaging service configuration data 934 and 964 includes quality of service ("QoS") configuration data implemented to configure a Data Distribution Service™ application.
An example of a data exchange for facilitating teleoperations via the communications protocol is described as follows. Consider that obstacle data 920 is generated by a perception system of an autonomous vehicle controller. Further, planner options data 924 is generated by a planner to notify a teleoperator of a subset of candidate trajectories, and position data 926 is generated by the localizer. Obstacle data 920, planner options data 924, and position data 926 are transmitted to messaging service bridge 932, which, in accordance with message service configuration data 934, generates telemetry data 940 and query data 942, both of which are transmitted via data-centric messaging bus 972 into teleoperator application 901 as telemetry data 950 and query data 952. Teleoperator API 962 receives telemetry data 950 and query data 952, which, in turn, are processed in view of route data 960 and message service configuration data 964. The resultant data is subsequently presented to teleoperator 908 via teleoperator computing device 904 and/or a collaborative display (e.g., a dashboard display visible to a group of collaborating teleoperators 908). Teleoperator 908 reviews the candidate trajectory options that are presented on the display of teleoperator computing device 904, and selects a guided trajectory, which generates command data 982 and query response data 980, both of which are passed through teleoperator API 962 as query response data 954 and command data 956. In turn, query response data 954 and command data 956 are transmitted via data-centric messaging bus 972 into autonomous vehicle application 930 as query response data 944 and command data 946. Messaging service bridge 932 receives query response data 944 and command data 946 and generates teleoperator command data 928, which is configured to generate a teleoperator-selected trajectory for implementation by the planner. Note that the above-described messaging processes are not intended to be limiting, and other messaging protocols may be implemented as well.
Figure 10 is to show that teleoperator in accordance with some embodiments can use it to influence the remote control of path planning and grasp The exemplary diagram of work person's interface.Diagram 1000 describes the autonomous vehicle 1030 communicated with autonomous vehicle service platform 1001 Example, autonomous vehicle service platform 1001 include the teleoperator's manager 1007 for being configured as promoting straighforward operation. In one example, teleoperator's manager 1007 receives data, and preferentially observation is close latent by data demand teleoperator 1008 In the path of the autonomous vehicle in barrier or the region of low planner confidence level, so that teleoperator 1008 can be advance It solves the problems, such as.In order to illustrate, it is believed that can be by the close crossroad of autonomous vehicle labeled as problematic.Then, User interface 1010 shows the expression 1014 for the corresponding autonomous vehicle 1030 advanced along path 1012, is generated by planner more A track has predicted path 1012.Equally it is shown that other vehicles 1011 and dynamic object 1013 (for example, pedestrian), this It may lead to the larger puzzlement at planner, therefore it is required that straighforward operation is supported.User interface 1010 is also to teleoperator 1008 are presented charge volume 1026 current in present speed 1022, rate limitation 1024 and battery.According to some examples, use Family interface 1010 can show other data, such as the sensing data acquired from autonomous vehicle 1030.In the second example, recognize Multiple tracks that the path 1044 generated with planner is in the same space are generated for planner 1064, but regardless of being examined How is the object 1046 not identified surveyed.Planner 1064 can also generate the subset of candidate tracks 1040, but in this example, If given current confidence level, planner cannot continue.If planner 1064 can not determine alternative route, can To 
emit straighforward operation request.In this case, teleoperator can select one of candidate tracks 1040, to promote Autonomous Vehicles 1030 traveling consistent with the path 1042 based on teleoperator.
Figure 11 is the exemplary diagram depicted according to some exemplary planners for being configured as calling straighforward operation.Show Figure 110 0 describes planner 1164, and planner 1164 includes landform manager 1110, route manager 1112, path generator 1114, track estimator 1120 and track tracker 1128.Landform manager 1110 is configured as receiving map datum, example Such as 3D data or other similar map data of specified features of terrain.Landform manager 1110 is further configured to based on extremely Path candidate is identified with the relevant feature of landform on the path of destination.According to various examples, landform manager 1110 connects Receive the 3D maps that sensor associated with one or more of fleet autonomous vehicle is generated.Route manager 1112 by with Be set to and receive environmental data 1103, environmental data 1103 may include with can be selected as to one of the path of destination or Multiple routes are associated with the relevant information of traffic.Path generator 1114 is from landform manager 1110 and route manager 1112 receive data, and generate one or more paths or the route segment for being suitable for that autonomous vehicle is oriented to destination.Table Show that the data of one or more paths or route segment are launched into track estimator 1120.
Trajectory evaluator 1120 includes a state and event manager 1122, which in turn may include a confidence level generator 1123. Trajectory evaluator 1120 further includes a guided trajectory generator 1126 and a trajectory generator 1124. Further, planner 1164 is configured to receive policy data 1130, perception engine data 1132, and localizer data 1134.
According to some examples, policy data 1130 may include criteria that planner 1164 uses to determine a path that has a sufficient confidence level with which to generate trajectories. Examples of policy data 1130 include a policy that specifies that trajectory generation is bounded by stand-off distances to external objects (e.g., maintaining a safety buffer of 3 feet from a cyclist, if possible), or a policy that requires that trajectories must not cross a center double yellow line, or a policy that requires trajectories to be limited to a single lane in a four-lane roadway (e.g., based on past events, such as the lane nearest a bus stop typically being congested), and any other like criteria specified by policies. Perception engine data 1132 includes maps of the locations of static objects and dynamic objects of interest, and localizer data 1134 includes at least a local pose or position.
State and event manager 1122 may be configured to probabilistically determine an operational state for the autonomous vehicle. For example, a first operational state (i.e., "normative operation") may describe a situation in which trajectories are collision-free, whereas a second operational state (i.e., "non-normative operation") may describe another situation in which the confidence levels associated with possible trajectories are insufficient to guarantee collision-free travel. According to some examples, state and event manager 1122 is configured to use perception data 1132 to determine whether a state of the autonomous vehicle is normative or non-normative. Confidence level generator 1123 may be configured to analyze perception data 1132 to determine the state of the autonomous vehicle. For example, confidence level generator 1123 may use semantic information associated with static and dynamic objects, as well as associated probabilistic estimations, to enhance a degree of certainty with which planner 1164 determines safe courses of action. For example, planner 1164 may use perception engine data 1132 that specifies a probability that an object is either a person or not a person to determine whether planner 1164 is operating safely (e.g., planner 1164 may receive a degree of certainty that an object has a 98% probability of being a person and a 2% probability of not being a person).
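The normative/non-normative determination described above can be sketched as a thresholding of per-object classification probabilities. The function name, threshold value, and data layout below are illustrative assumptions, not the patent's implementation:

```python
# Hedged sketch: decide the operational state from perception-engine
# classification certainties (names and the 0.95 threshold are assumed).

def operational_state(classification_probs, threshold=0.95):
    """Return 'normative' when every object's most likely classification
    meets the certainty threshold; otherwise 'non-normative'."""
    for probs in classification_probs:
        if max(probs.values()) < threshold:
            return "non-normative"
    return "normative"

# Example: one object is 98% likely a person, 2% likely not a person.
print(operational_state([{"person": 0.98, "not_person": 0.02}]))  # normative
```

A fleet operator would tune the threshold against recorded perception data rather than fix it a priori.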
Upon determining that a confidence level (e.g., based on statistics and probabilistic determinations) is below a threshold required for predicted safe operation, a relatively low confidence level (e.g., a single probability score) may trigger planner 1164 to transmit a request 1135 for teleoperation support to autonomous vehicle service platform 1101. In some cases, telemetry data and a set of candidate trajectories may accompany the request. Examples of telemetry data include sensor data, localization data, perception data, and the like. A teleoperator 1108 may transmit, via teleoperator computing device 1104, a selected trajectory 1137 to guided trajectory generator 1126. As such, selected trajectory 1137 is a trajectory formed with the guidance of the teleoperator. Upon confirming that there is no change in state (e.g., a non-normative state is pending), guided trajectory generator 1126 passes data to trajectory generator 1124, which in turn causes trajectory tracker 1128, as a trajectory tracking controller, to use the teleoperator-specified trajectory to generate control signals 1170 (e.g., steering angles, velocities, etc.). Note that planner 1164 may trigger transmission of a request 1135 for teleoperation support before the state transitions to a non-normative state. In particular, an autonomous vehicle controller and/or its components can predict that a distant obstacle may be problematic and preemptively cause planner 1164 to invoke teleoperation before the autonomous vehicle reaches the obstacle. Otherwise, the autonomous vehicle may incur a delay by transitioning to a safe state upon encountering the obstacle or scenario (e.g., pulling over to the side of the road). In another example, teleoperation may be automatically invoked prior to an autonomous vehicle approaching a particular location that is known to be difficult to navigate. This determination may optionally take into consideration other factors, including the time of day and the position of the sun, if such a situation is likely to interfere with the reliability of sensor readings, as well as traffic or accident data derived from various sources.
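The trigger logic described here — a sub-threshold confidence score causing a request that bundles telemetry and candidate trajectories — can be sketched as follows. The payload shape and names are assumptions for illustration only:

```python
def maybe_request_teleoperation(confidence, threshold,
                                candidate_trajectories, telemetry):
    """Emit a teleoperation-support request (with telemetry data and
    candidate trajectories attached) when planner confidence falls
    below the safety threshold; otherwise return None."""
    if confidence < threshold:
        return {"type": "teleoperation_request",
                "candidates": list(candidate_trajectories),
                "telemetry": telemetry}
    return None

# A low single probability score (0.4 < 0.7) triggers the request.
request = maybe_request_teleoperation(0.4, 0.7, ["t1", "t2"], {"speed": 5.0})
```

The preemptive case in the text corresponds to evaluating this trigger on a *predicted* confidence for a distant obstacle rather than the current one.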
Figure 12 is an example of a flow chart configured to control an autonomous vehicle, according to some embodiments. At 1202, flow 1200 begins. Data representing a subset of objects is received at a planner in an autonomous vehicle, the subset of objects including at least one object associated with data representing a degree of certainty for a classification type. For example, perception engine data may include metadata associated with objects, whereby the metadata specifies a degree of certainty associated with a specific classification type. For instance, a dynamic object may be classified as a "young pedestrian" with an 85% confidence level that the classification is correct. At 1204, localizer data may be received (e.g., at the planner). The localizer data may include map data generated locally within the autonomous vehicle. The local map data may specify a degree of certainty (including a degree of uncertainty) that an event may occur in a geographic region. An event may be a condition or situation that affects, or potentially affects, operation of the autonomous vehicle. The events may be internal to the autonomous vehicle (e.g., a failed or impaired sensor), or external (e.g., a roadway obstruction). Examples of events are described herein, such as in Figure 2 as well as in other figures and passages. At 1206, a path coextensive with the geographic region of interest may be determined. For example, consider that the event is the sun being positioned in the sky at a time of day at which the intensity of the sunlight impairs the vision of drivers during rush-hour traffic. As such, it is expected or predicted that traffic may slow down responsive to the bright sunlight. Accordingly, a planner may preemptively invoke teleoperation if an alternate path that avoids the event is less likely to exist. At 1208, a local position is determined at the planner based on local pose data. At 1210, an operational state of the autonomous vehicle may be determined (e.g., probabilistically) based, for example, on the degree of certainty for the classification type and the degree of certainty for the event, which may be based on any number of factors, such as speed, position, and other state information. To illustrate, consider an example in which a young pedestrian is detected by the autonomous vehicle during the event in which the vision of other drivers is likely to be impaired by the sun, thereby placing the young pedestrian in an unsafe situation. Accordingly, a relatively unsafe situation may be detected as a probabilistic event that may occur (i.e., an unsafe situation for which teleoperation may be invoked). At 1212, a likelihood that the operational state is in a normative condition is determined, and based on that determination, a message is transmitted to a teleoperator computing device requesting teleoperation to preempt a transition to a next operational state (e.g., to preempt a transition from a normative operational state to a non-normative operational state, such as an unsafe operational state).
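One way to combine the two certainties named at 1210 — the classification certainty and the event certainty — is a simple product model. Treating the two probabilities as independent is purely an illustrative assumption, as are the names and the threshold:

```python
def probabilistic_state(p_classification, p_event, threshold=0.9):
    """Combine the certainty of an object classification with the
    certainty that a disruptive event (e.g., sun glare) occurs.
    Independence of the two probabilities is assumed only for
    illustration; returns the state and the combined safety score."""
    p_safe = p_classification * (1.0 - p_event)
    state = "normative" if p_safe >= threshold else "non-normative"
    return state, p_safe

# 85% certain "young pedestrian" during a 50%-likely sun-glare event
# yields a low combined score, warranting a teleoperation request.
state, score = probabilistic_state(0.85, 0.5)
```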
Figure 13 depicts an example in which a planner may generate a trajectory, according to some examples. Diagram 1300 includes a trajectory evaluator 1320 and a trajectory generator 1324. Trajectory evaluator 1320 includes a confidence level generator 1322 and a teleoperator query messenger 1329. As shown, trajectory evaluator 1320 is coupled to a perception engine 1366 to receive static map data 1301, as well as current and predicted object state data 1303. Trajectory evaluator 1320 also receives local pose data 1305 from a localizer 1368, and plan data 1307 from a global planner 1369. In one operational state (e.g., non-normative), confidence level generator 1322 receives static map data 1301 and current and predicted object state data 1303. Based on this data, confidence level generator 1322 may determine that a detected trajectory is associated with unacceptable confidence level values. As such, confidence level generator 1322 transmits detected trajectory data 1309 (e.g., data including candidate trajectories) to notify a teleoperator via teleoperator query messenger 1329, which, in turn, transmits a request 1370 for teleoperator assistance.
In another operational state (e.g., a normative state), static map data 1301, current and predicted object state data 1303, local pose data 1305, and plan data 1307 (e.g., global planning data) are received into trajectory calculator 1325, which is configured to calculate trajectories (e.g., iteratively) to determine an optimal one or more paths. Next, at least one path is selected, and data representing the selected path 1311 is transmitted. According to some embodiments, trajectory calculator 1325 is configured to implement re-planning of trajectories, as an example. Nominal driving trajectory generator 1327 is configured to generate trajectories in a refined manner, such as by generating trajectories based on receding horizon control techniques. Nominal driving trajectory generator 1327 may subsequently transmit nominal driving trajectory path data 1372 to, for example, a trajectory tracker or a vehicle controller to implement physical changes in steering, acceleration, and other components.
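The receding-horizon idea mentioned above — repeatedly optimizing over a short look-ahead window but committing only to the first action before re-planning — can be shown with a deliberately tiny 1-D toy. The discrete action set, horizon length, and tie-breaking rule are assumptions chosen to keep the sketch self-contained, not the patent's control scheme:

```python
from itertools import product

def receding_horizon_plan(state, goal, horizon=3, steps=8):
    """Toy 1-D receding-horizon loop: exhaustively score short action
    sequences over the horizon, apply only the first action of the best
    sequence, then replan from the new state. Ties are broken in favor
    of first actions that make immediate progress toward the goal."""
    actions = (-1, 0, 1)
    for _ in range(steps):
        best = min(product(actions, repeat=horizon),
                   key=lambda seq: (abs(goal - (state + sum(seq))),
                                    abs(goal - (state + seq[0]))))
        state += best[0]
        if state == goal:
            break
    return state
```

A real nominal-trajectory generator would optimize continuous controls against vehicle dynamics and obstacle costs; the structure of the loop, however, is the same.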
Figure 14 is a diagram depicting another example of an autonomous vehicle service platform, according to some embodiments. Diagram 1400 depicts an autonomous vehicle service platform 1401, which includes a teleoperator manager 1407 configured to manage interactions and/or communications among teleoperators 1408, teleoperator computing devices 1404, and other components of autonomous vehicle service platform 1401. Further to diagram 1400, autonomous vehicle service platform 1401 includes a simulator 1440, a repository 1441, a policy manager 1442, a reference data updater 1438, a 2D map data repository 1420, a 3D map data repository 1422, and a route data repository 1424. Other map data, such as 4D map data (e.g., including a time determinant), may be implemented and stored in a repository (not shown).
Teleoperator action recommendation controller 1412 includes logic configured to receive and/or control teleoperation service requests via autonomous vehicle ("AV") planner data 1472, which may include requests for teleoperator assistance as well as telemetry data and other data. As such, planner data 1472 may include recommended candidate trajectories or paths from which a teleoperator 1408 may select via teleoperator computing device 1404. According to some examples, teleoperator action recommendation controller 1412 may be configured to access other sources of recommended candidate trajectories from which to select an optimum trajectory. For example, candidate trajectories contained in autonomous vehicle planner data 1472 may, in parallel, be introduced into simulator 1440, which is configured to simulate the event or condition being experienced by the autonomous vehicle requesting teleoperator assistance. Simulator 1440 can access the map data and other data necessary to perform a simulation on the set of candidate trajectories, whereby simulator 1440 need not exhaustively reiterate simulations to confirm sufficiency. Rather, simulator 1440 may either provide a confirmation of the appropriateness of the candidate trajectories, or may otherwise alert a teleoperator to be cautious in their selection.
Teleoperator interaction capture analyzer 1416 may be configured to capture numerous teleoperator transactions or interactions for storage in repository 1441, which, for example, may accumulate data relating to a number of teleoperator transactions for analysis and policy generation, at least in some cases. According to some embodiments, repository 1441 may also be configured to store policy data for access by policy manager 1442. Further, teleoperator interaction capture analyzer 1416 may apply machine learning techniques to empirically determine how best to respond to events or conditions causing requests for teleoperator assistance. In some cases, policy manager 1442 may be configured to update a particular policy or generate a new policy responsive to analyzing the large set of teleoperator interactions (e.g., subsequent to applying machine learning techniques). Policy manager 1442 manages policies that may be viewed as the rules or guidelines with which an autonomous vehicle controller and its components operate to comply with autonomous operation of a vehicle. In some cases, a modified or updated policy may be applied to simulator 1440 to confirm the efficacy of permanently releasing or implementing such policy changes.
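The empirical step described above — learning the best response from accumulated teleoperator interactions — can be illustrated in its simplest possible form as a frequency count over a transaction log. This is a stand-in for whatever machine-learning technique an implementation would actually apply; the log format and names are assumptions:

```python
from collections import Counter

def best_response(interaction_log, event_type):
    """Empirically pick the most frequent teleoperator action recorded
    for a given event type -- a frequency-count stand-in for the
    machine-learning analysis described above."""
    actions = [action for event, action in interaction_log
               if event == event_type]
    return Counter(actions).most_common(1)[0][0] if actions else None

log = [("blocked_lane", "reroute"), ("blocked_lane", "reroute"),
       ("blocked_lane", "wait"), ("sun_glare", "slow_down")]
recommendation = best_response(log, "blocked_lane")  # "reroute"
```

A policy manager could promote such an empirically dominant response into a candidate policy and validate it in the simulator before release.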
Simulator interface controller 1414 is configured to provide an interface between simulator 1440 and teleoperator computing devices 1404. For example, consider that sensor data from a fleet of autonomous vehicles is applied to reference data updater 1438 via autonomous vehicle ("AV") fleet data 1470, whereby reference data updater 1438 is configured to generate updated map and route data 1439. In some implementations, updated map and route data 1439 may be preliminarily released as an update to the data in map data repositories 1420 and 1422, or as an update to the data in route data repository 1424. In this case, such data may be tagged as being a "beta version," in which, for example, a lower threshold for requesting teleoperator service may be implemented when a map tile including the preliminarily updated information is used by an autonomous vehicle. Further, updated map and route data 1439 may be introduced into simulator 1440 for purposes of validating the updated map data. Upon full release (e.g., at the close of beta testing), the previously lowered threshold for requesting teleoperator service related to the map tiles is canceled. User interface graphics controller 1410 provides rich graphics to teleoperators 1408, whereby a fleet of autonomous vehicles may be simulated within simulator 1440 and accessed via teleoperator computing device 1404 as if the simulated fleet of autonomous vehicles were real.
Figure 15 is an example of a flow chart to control an autonomous vehicle, according to some embodiments. At 1502, flow 1500 begins. Message data may be received at a teleoperator computing device for managing a fleet of autonomous vehicles. In the context of a planned path for an autonomous vehicle, the message data may indicate event attributes associated with a non-normative operational state. For example, an event may be characterized as a particular intersection that becomes problematic due to, for example, a large number of pedestrians hurriedly crossing the street against the traffic light. The event attributes describe characteristics of the event, such as the number of people crossing the street, the traffic delays resulting from the increased number of pedestrians, and the like. At 1504, a teleoperation repository may be accessed to retrieve a first subset of recommendations based on simulated operations of aggregated data associated with a group of autonomous vehicles. In this case, a simulator may be a source of recommendations that a teleoperator may implement. Further, the teleoperation repository may also be accessed responsive to similar event attributes to retrieve a second subset of recommendations based on an aggregation of teleoperator interactions. In particular, a teleoperator interaction capture analyzer may apply machine learning techniques to empirically determine how best to respond to events having similar attributes, based on previous requests for teleoperation assistance. At 1506, the first subset and the second subset of recommendations are combined to form a set of recommended courses of action for the autonomous vehicle. At 1508, representations of the set of recommended courses of action may be presented visually on a display of the teleoperator computing device. At 1510, data signals representing a selection (e.g., by a teleoperator) of a recommended course of action may be detected.
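The combining step at 1506 amounts to merging two ordered recommendation subsets while removing duplicates. A minimal sketch, with the ordering preference (simulator-derived results first) chosen purely as an illustrative assumption:

```python
def combine_recommendations(simulated, interaction_derived):
    """Merge the simulator-derived and interaction-derived subsets into
    one ordered, de-duplicated set of recommended courses of action."""
    seen, combined = set(), []
    for rec in list(simulated) + list(interaction_derived):
        if rec not in seen:
            seen.add(rec)
            combined.append(rec)
    return combined

# "reroute" appears in both subsets but is listed once.
courses = combine_recommendations(["slow", "reroute"], ["reroute", "wait"])
```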
Figure 16 is a diagram of an example of an autonomous vehicle fleet manager implementing a fleet optimization manager, according to some examples. Diagram 1600 depicts an autonomous vehicle fleet manager configured to manage a fleet of autonomous vehicles 1630 transiting within a road network 1650. Autonomous vehicle fleet manager 1603 is coupled to a teleoperator 1608 via a teleoperator computing device 1604, and is also coupled to a fleet management data repository 1646. Autonomous vehicle fleet manager 1603 is configured to receive policy data 1602 and environmental data 1606, as well as other data. Further to diagram 1600, fleet optimization manager 1620 is shown to include a transit request processor 1631, which in turn includes a fleet data extractor 1632 and an autonomous vehicle dispatch optimization calculator 1634. Transit request processor 1631 is configured to process transit requests, such as from a user 1688 who is requesting autonomous vehicle service. Fleet data extractor 1632 is configured to extract data relating to the autonomous vehicles in the fleet. Data associated with each autonomous vehicle is stored in repository 1646. For example, data for each vehicle may describe maintenance issues, scheduled service calls, daily usage, battery charge and discharge rates, and any other data, which may be updated in real time and may be used to optimize the fleet of autonomous vehicles so as to minimize downtime. Autonomous vehicle dispatch optimization calculator 1634 is configured to analyze the extracted data and to calculate optimized usage of the fleet so as to ensure that the next vehicle dispatched (e.g., from station 1652) provides the least travel time and/or the least overall cost for the autonomous vehicle service.
Fleet optimization manager 1620 is depicted as including a hybrid autonomous vehicle/non-autonomous vehicle processor 1640, which in turn includes an AV/non-AV optimization calculator 1642 and a non-AV selector 1644. According to some examples, hybrid autonomous vehicle/non-autonomous vehicle processor 1640 is configured to manage a hybrid fleet of autonomous vehicles and human-driven vehicles (e.g., driven by independent contractors). As such, the autonomous vehicle service may use non-autonomous vehicles to meet excess demand, or in areas that may be beyond a geo-fenced region (e.g., non-AV service region 1690), or in areas of poor communication coverage. AV/non-AV optimization calculator 1642 is configured to optimize usage of the autonomous fleet and to invite non-AV drivers into the transportation service (e.g., with minimal or no detriment to the autonomous vehicle service). Non-AV selector 1644 includes logic for selecting a number of non-AV drivers to assist, based on the calculations derived by AV/non-AV optimization calculator 1642.
Figure 17 is an example of a flow chart for managing a fleet of autonomous vehicles, according to some embodiments. At 1702, flow 1700 begins. At 1702, policy data is received. The policy data may include parameters defining how best to select an autonomous vehicle for servicing a transit request. At 1704, fleet management data may be extracted from a repository. The fleet management data includes subsets of data for a pool of autonomous vehicles (e.g., data describing the readiness of vehicles to service a transportation request). At 1706, data representing a transit request is received. For exemplary purposes, the transit request may be for transportation from a first geographic location to a second geographic location. At 1708, attributes based on the policy data are calculated to determine a subset of autonomous vehicles that are available to service the request. For example, the attributes may include a battery charge level and the time until the next scheduled maintenance. At 1710, an autonomous vehicle is selected as the transportation from the first geographic location to the second geographic location, and data is generated to dispatch the autonomous vehicle to a third geographic location associated with the origination of the transit request.
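The selection at 1708–1710 — filter the pool by policy attributes such as battery charge and scheduled maintenance, then dispatch the best remaining candidate — can be sketched as below. The field names, thresholds, and use of estimated arrival time as the dispatch criterion are illustrative assumptions:

```python
def select_vehicle(fleet, min_charge=0.3, min_minutes_to_maintenance=60):
    """Pick the policy-eligible vehicle with the shortest estimated
    arrival time; field names and thresholds are assumed for the sketch."""
    eligible = [v for v in fleet
                if v["charge"] >= min_charge
                and v["minutes_to_maintenance"] >= min_minutes_to_maintenance]
    return min(eligible, key=lambda v: v["eta_min"]) if eligible else None

fleet = [
    {"id": "av1", "charge": 0.9, "minutes_to_maintenance": 300, "eta_min": 7},
    {"id": "av2", "charge": 0.2, "minutes_to_maintenance": 300, "eta_min": 2},
    {"id": "av3", "charge": 0.8, "minutes_to_maintenance": 15, "eta_min": 3},
    {"id": "av4", "charge": 0.7, "minutes_to_maintenance": 200, "eta_min": 5},
]
# av2 fails the charge policy, av3 is due for maintenance,
# so av4 (eta 5 min) beats av1 (eta 7 min).
chosen = select_vehicle(fleet)
```

A production dispatch calculator would optimize aggregate fleet cost rather than a single request, but the per-request filter-then-minimize step looks the same.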
Figure 18 is a diagram illustrating an autonomous vehicle fleet manager implementing an autonomous vehicle communications link manager, according to some embodiments. Diagram 1800 depicts an autonomous vehicle fleet manager configured to manage a fleet of autonomous vehicles 1830 transiting within a road network 1850 that coincides with a communication outage at an area identified as a "reduced communication region" 1880. Autonomous vehicle fleet manager 1803 is coupled to a teleoperator 1808 via a teleoperator computing device 1804. Autonomous vehicle fleet manager 1803 is configured to receive policy data 1802 and environmental data 1806, as well as other data. Further to diagram 1800, an autonomous vehicle communications link manager 1820 is shown to include an environment event detector 1831, a policy adaption determinator 1832, and a transit request processor 1834. Environment event detector 1831 is configured to receive environmental data 1806 specifying a change within the environment in which the autonomous vehicle service is implemented. For example, environmental data 1806 may specify that region 1880 has degraded communication services, which may affect the autonomous vehicle service. Policy adaption determinator 1832 may specify parameters with which to apply when receiving transit requests during such an event (e.g., during a loss of communications). Transit request processor 1834 is configured to process transit requests in view of the degraded communications. In this example, a user 1888 is requesting autonomous vehicle service. Further, transit request processor 1834 includes logic to apply an adapted policy for modifying the way autonomous vehicles are dispatched so as to avoid complications due to poor communications.
Communication event detector 1840 includes a policy download manager 1842 and a communications-configured ("COMM-configured") AV dispatcher 1844. Policy download manager 1842 is configured to provide autonomous vehicles 1830 with an updated policy in view of reduced communication region 1880, whereby the updated policy may specify routes to quickly exit region 1880 should an autonomous vehicle enter that region. For example, autonomous vehicle 1864 may receive the updated policy moments before driving into region 1880. Upon loss of communications, autonomous vehicle 1864 implements the updated policy and selects route 1866 to drive quickly out of region 1880. COMM-configured AV dispatcher 1844 may be configured to identify points 1865 at which to park autonomous vehicles that are configured as relays for establishing a peer-to-peer network over region 1880. As such, COMM-configured AV dispatcher 1844 is configured to dispatch autonomous vehicles 1862 (without passengers) to park at locations 1865 for purposes of operating as communication towers in a peer-to-peer ad hoc network.
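Dispatching a relay as described above implies choosing, among the passenger-free vehicles, one near the designated parking point. A minimal sketch under assumed field names and a flat 2-D distance metric:

```python
def choose_relay_vehicle(fleet, parking_point):
    """Pick the passenger-free vehicle closest to the designated parking
    point to serve as a static peer-to-peer communication relay.
    Field names and the planar distance metric are assumptions."""
    candidates = [v for v in fleet if not v["has_passengers"]]
    if not candidates:
        return None

    def squared_distance(v):
        dx = v["x"] - parking_point[0]
        dy = v["y"] - parking_point[1]
        return dx * dx + dy * dy

    return min(candidates, key=squared_distance)
```

A real dispatcher would also weigh battery charge (the relay may idle for hours) and radio coverage of the candidate point.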
Figure 19 is an example of a flow chart to determine actions for autonomous vehicles during an event, such as degraded or lost communications, according to some embodiments. At 1901, flow 1900 begins. Policy data is received, whereby the policy data defines parameters with which to apply to transit requests in a geographical region during the event. At 1902, one or more of the following actions may be implemented: (1) dispatch a subset of autonomous vehicles to geographic locations in the portion of the geographic region, the subset of autonomous vehicles being configured either to park at specific geographic locations, each serving as a static communication relay, or to transit in the geographic region, each serving as a mobile communication relay; (2) implement peer-to-peer communications among a portion of the pool of autonomous vehicles associated with the portion of the geographic region; (3) provide to the autonomous vehicles an event policy that describes routes to egress the portion of the geographic region during the event; (4) invoke teleoperations; and (5) recalculate paths so as to avoid the geographic portion. Subsequent to implementing the action, the fleet of autonomous vehicles is monitored at 1914.
Figure 20 is a diagram depicting an example of a localizer, according to some embodiments. Diagram 2000 includes a localizer 2068 configured to receive sensor data from sensors 2070, such as lidar data 2072, camera data 2074, radar data 2076, and other data 2078. Further, localizer 2068 is configured to receive reference data 2020, such as 2D map data 2022, 3D map data 2024, and 3D local map data. According to some examples, other map data, such as 4D map data 2025 and semantic map data (not shown), including corresponding data structures and repositories, may also be implemented. Further to diagram 2000, localizer 2068 includes a positioning system 2010 and a localization system 2012, both of which are configured to receive reference data 2020 as well as sensor data from sensors 2070. A localization data integrator 2014 is configured to receive data from positioning system 2010 and data from localization system 2012, whereby localization data integrator 2014 is configured to integrate or fuse the sensor data from the multiple sensors to form local pose data 2052.
Figure 21 is an example of a flow chart to generate local pose data based on integrated sensor data, according to some embodiments. At 2101, flow 2100 begins. At 2102, reference data is received, the reference data including three-dimensional map data. In some examples, reference data, such as 3D or 4D map data, may be received via one or more networks. At 2104, localization data from one or more localization sensors is received and placed into a localization system. At 2106, positioning data from one or more positioning sensors is received into a positioning system. At 2108, the localization data and the positioning data are integrated. At 2110, the localization data and positioning data are integrated to form local position data specifying a geographic position of an autonomous vehicle.
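The integration step at 2108–2110 can be illustrated with standard inverse-variance weighting of two independent position estimates (one from the localization system, one from the positioning system). The 1-D scalar form is a deliberate simplification of the multi-dimensional fusion an actual localizer performs:

```python
def fuse_estimates(x1, var1, x2, var2):
    """Inverse-variance fusion of two independent 1-D position
    estimates -- a minimal sketch of the integration step that forms
    local position data from localization and positioning outputs."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused_x = (w1 * x1 + w2 * x2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused_x, fused_var

# Two equally certain estimates at 10 m and 12 m fuse to 11 m,
# with half the variance of either input.
position, variance = fuse_estimates(10.0, 1.0, 12.0, 1.0)
```

Note that the fused variance is always smaller than the better input's variance, which is why integrating the two subsystems sharpens the local pose.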
Figure 22 is a diagram depicting another example of a localizer, according to some embodiments. Diagram 2200 includes a localizer 2268, which in turn includes a localization system 2210 and a relative localization system 2212 to generate localization-based data 2250 and relative localization-based data 2251, respectively. Localization system 2210 includes a projection processor 2254a for processing GPS data 2273, a GPS datum 2211, and 3D map data 2222, among other optional data (e.g., 4D map data). Localization system 2210 also includes an odometry processor 2254b to process wheel data 2275 (e.g., wheel speed), vehicle model data 2213, and 3D map data 2222, among other optional data. Further yet, localization system 2210 includes an integrator processor 2254c to process IMU data 2257, vehicle model data 2215, and 3D map data 2222, among other optional data. Similarly, relative localization system 2212 includes a lidar localization processor 2254d for processing lidar data 2272, 2D tile map data 2220, 3D map data 2222, and 3D local map data 2223, among other optional data. Relative localization system 2212 also includes a visual registration processor 2254e to process camera data 2274, 3D map data 2222, and 3D local map data 2223, among other optional data. Further yet, relative localization system 2212 includes a radar return processor 2254f to process radar data 2276, 3D map data 2222, and 3D local map data 2223, among other optional data. Note that in various examples, other types of sensor data and sensors or processors, such as sonar data and the like, may be implemented.
Further to diagram 2200, localization-based data 2250 and relative localization-based data 2251 may be fed into data integrator 2266a and localization data integrator 2266, respectively. Data integrator 2266a and localization data integrator 2266 may be configured to fuse corresponding data, whereby localization-based data 2250 may be fused at data integrator 2266a prior to being fused with relative localization-based data 2251 at localization data integrator 2266. According to some embodiments, data integrator 2266a is formed as part of localization data integrator 2266, or is absent. Regardless, localization-based data 2250 and relative localization-based data 2251 can both be fed into localization data integrator 2266 for purposes of fusing data to generate local position data 2252. Localization-based data 2250 may include unary-constrained data (and uncertainty values) from projection processor 2254a, as well as binary-constrained data (and uncertainty values) from odometry processor 2254b and integrator processor 2254c. Relative localization-based data 2251 may include unary-constrained data (and uncertainty values) from lidar localization processor 2254d and visual registration processor 2254e, and optionally from radar return processor 2254f. According to some embodiments, localization data integrator 2266 may implement a non-linear smoothing function, such as a Kalman filter (e.g., a gated Kalman filter), a relative bundle adjuster, pose-graph relaxation, a particle filter, a histogram filter, or the like.
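Of the smoothing options listed above, the gated Kalman filter can be sketched in one dimension: a standard measurement update, preceded by a validation gate that rejects measurements whose normalized innovation is implausibly large. The 3-sigma gate width and scalar form are assumptions made for the sketch:

```python
def gated_kalman_update(x, p, z, r, gate=3.0):
    """1-D Kalman measurement update with a validation gate: the
    measurement z (with noise variance r) is discarded when its
    normalized innovation exceeds `gate` standard deviations.
    A sketch of the 'gated Kalman filter' smoothing option above."""
    s = p + r                        # innovation covariance
    innovation = z - x
    if innovation * innovation / s > gate * gate:
        return x, p                  # reject outlier measurement
    k = p / s                        # Kalman gain
    return x + k * innovation, (1.0 - k) * p

# A consistent measurement is fused; a wild outlier (100 m away)
# fails the gate and leaves the estimate untouched.
fused = gated_kalman_update(0.0, 1.0, 1.0, 1.0)
rejected = gated_kalman_update(0.0, 1.0, 100.0, 1.0)
```

Gating keeps a single corrupted lidar or visual registration result from dragging the fused local pose off the map.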
Figure 23 is a diagram depicting an example of a perception engine, according to some embodiments. Diagram 2300 includes a perception engine 2366, which in turn includes a segmentation processor 2310, an object tracker 2330, and a classifier 2360. Further, perception engine 2366 is configured to receive, for example, local position data 2352, lidar data 2372, camera data 2374, and radar data 2376. Note that other sensor data, such as sonar data, may be accessed to provide the functionality of perception engine 2366. Segmentation processor 2310 is configured to extract ground-plane data and/or to segment portions of an image to distinguish objects from each other and from static imagery (e.g., background). In some cases, 3D blobs may be segmented to distinguish them from each other. In some examples, a blob may refer to a set of features identifying an object in a spatially reproduced environment, and may be composed of elements having similar characteristics, such as intensity and color (e.g., pixels of camera data, points of laser return data, etc.). In some examples, a blob may also refer to a point cloud (e.g., composed of colored laser return data) or other elements constituting an object. Object tracker 2330 is configured to perform frame-to-frame estimation of the motion of blobs or other segmented image portions. Further, data association is used to associate a blob at one location in a first frame at time t1 with a blob at a different location in a second frame at time t2. In some examples, object tracker 2330 is configured to perform real-time probabilistic tracking of 3D objects, such as blobs. Classifier 2360 is configured to identify an object and to classify that object by classification type (e.g., as a pedestrian, cyclist, etc.) and by energy/activity (e.g., whether the object is dynamic or static), whereby the data representing the classification is described by a semantic label. According to some embodiments, probabilistic estimation of object categories may be performed, such as classifying an object as a vehicle, cyclist, pedestrian, etc., each with a different level of confidence. Perception engine 2366 is configured to determine perception engine data 2354, which may include static object maps and/or dynamic object maps, as well as semantic information, so that, for example, a planner may use this information to enhance path planning. According to various examples, one or more of segmentation processor 2310, object tracker 2330, and classifier 2360 may apply machine-learning techniques to generate perception engine data 2354.
Figure 24 is an example of a flow chart to generate perception engine data, according to some embodiments. Flow chart 2400 begins at 2402, at which data representing a local position of an autonomous vehicle is retrieved. At 2404, localization data is received from one or more localization sensors, and features of the environment in which the autonomous vehicle is disposed are segmented at 2406 to form segmented objects. One or more portions of a segmented object are tracked spatially at 2408 to form at least one tracked object having a motion (e.g., an estimated motion). At 2410, a tracked object is classified at least as being either a static object or a dynamic object. In some cases, the static object or the dynamic object may be associated with a classification type. At 2412, data identifying the classified object is generated. For example, the data identifying the classified object may include semantic information.
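The final steps of flow chart 2400 can be sketched as a toy classification pass. The object representation and the speed threshold below are assumptions chosen for illustration; the patent does not specify them.

```python
def classify_tracked_objects(tracked, speed_threshold=0.2):
    """Step 2410: label each tracked object static or dynamic from its
    estimated motion, then emit identifying data with a semantic label
    (step 2412). `speed_threshold` is in m/s and purely illustrative."""
    results = []
    for obj in tracked:
        motion = "dynamic" if obj["speed"] > speed_threshold else "static"
        results.append({
            "id": obj["id"],
            "motion": motion,
            "semantic_label": f'{motion} {obj["type"]}',  # step 2412
        })
    return results

tracked = [
    {"id": 1, "type": "pedestrian", "speed": 1.4},
    {"id": 2, "type": "tree", "speed": 0.0},
]
labels = classify_tracked_objects(tracked)
# → the pedestrian is labeled "dynamic pedestrian", the tree "static tree"
```

A planner consuming such labels can treat static and dynamic objects differently, which is the point of the static/dynamic split at 2410.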
Figure 25 is an example of a segmentation processor, according to some embodiments. Diagram 2500 depicts a segmentation processor 2510 receiving lidar data from one or more lidars 2572 and camera image data from one or more cameras 2574. Local pose data 2552, lidar data, and camera image data are received into a meta-spin generator 2521. In some examples, the meta-spin generator is configured to partition an image into distinguishable regions (e.g., clusters or groups of a point cloud) based on various attributes (e.g., color, intensity, etc.), at least two or more of which may be updated at the same time or at nearly the same time. Meta-spin data 2522 is used to perform object segmentation and ground segmentation at segmentation processor 2523, whereby both the meta-spin data 2522 and segmentation-related data from segmentation processor 2523 are applied to a scanned differencing processor 2513. Scanned differencing processor 2513 is configured to predict motion and/or relative velocity of segmented image portions, which can be used to identify dynamic objects at 2517. Data indicating objects with detected velocities at 2517 may optionally be transmitted to the planner to enhance planning decisions. Further, data from scanned differencing processor 2513 may be used to approximate the locations of objects to form a mapping of such objects (as well as optionally identifying a level of motion). In some examples, an occupancy grid map 2515 may be generated. Data representing occupancy grid map 2515 may be transmitted to the planner to further enhance path planning decisions (e.g., by reducing uncertainties). Referring again to diagram 2500, image camera data from one or more cameras 2574 is used to classify blobs in blob classifier 2520, which also receives blob data 2524 from segmentation processor 2523. Segmentation processor 2510 may also receive raw radar return data 2512 from one or more radars 2576 to perform segmentation at a radar segmentation processor 2514, which generates radar-related blob data 2516. Further to Figure 25, segmentation processor 2510 may also receive and/or generate tracked blob data 2518 related to the radar data. Blob data 2516, tracked blob data 2518, data from blob classifier 2520, and blob data 2524 may be used to track objects or portions thereof. According to some examples, one or more of the following may be optional: scanned differencing processor 2513, blob classifier 2520, and data from radar 2576.
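An occupancy grid map like element 2515 can be sketched as a 2D array whose cells accumulate occupancy evidence across scans. The log-odds update below is one common formulation and is an assumption for illustration, not a detail taken from the patent.

```python
import math

class OccupancyGrid:
    """Minimal log-odds occupancy grid: cells accumulate evidence from
    repeated detections, letting a planner reason under reduced uncertainty."""

    def __init__(self, width, height, l_occ=0.85, l_free=-0.4):
        self.cells = [[0.0] * width for _ in range(height)]  # log-odds, 0 = unknown
        self.l_occ, self.l_free = l_occ, l_free

    def update(self, x, y, occupied):
        """Add one observation's evidence to cell (x, y)."""
        self.cells[y][x] += self.l_occ if occupied else self.l_free

    def probability(self, x, y):
        """Convert the cell's log-odds back to an occupancy probability."""
        return 1.0 - 1.0 / (1.0 + math.exp(self.cells[y][x]))

grid = OccupancyGrid(10, 10)
for _ in range(3):                   # three lidar hits land in the same cell
    grid.update(4, 2, occupied=True)
# probability(4, 2) is now well above 0.5, flagging a likely obstacle,
# while untouched cells remain at the "unknown" probability of 0.5
```

Because evidence accumulates, a single spurious return barely moves a cell, while consistent returns drive it confidently toward occupied, which is how the grid "reduces uncertainty" for the planner.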
Figure 26A is a diagram depicting examples of an object tracker and a classifier, according to various embodiments. Object tracker 2630 of diagram 2600 is configured to receive blob data 2516, tracked blob data 2518, data from blob classifier 2520, blob data 2524, and camera image data from one or more cameras 2676. Image tracker 2633 is configured to receive the camera image data from one or more cameras 2676 to generate tracked image data, which in turn may be provided to data association processor 2632. As shown, data association processor 2632 is configured to receive blob data 2516, tracked blob data 2518, data from blob classifier 2520, blob data 2524, and the tracked image data from image tracker 2633, and is further configured to identify one or more associations among the above-described data types. For example, data association processor 2632 is configured to track various blob data from one frame to the next to, for example, estimate motion, among other things. Further, data generated by data association processor 2632 may be used by track updater 2634 to update one or more tracks, or tracked objects. In some examples, track updater 2634 may implement a Kalman filter, or the like, to form updated data for tracked objects, which may be stored online in track database ("DB") 2636. Feedback data may be exchanged via path 2699 between data association processor 2632 and track database 2636. In some examples, image tracker 2633 may be optional and may be excluded. Object tracker 2630 may also use other sensor data, such as radar or sonar, as well as any other types of sensor data.
Figure 26B is a diagram depicting another example of an object tracker, according to at least some examples. Diagram 2601 includes an object tracker 2631, which may include structures and/or functions identical to similarly-named elements described in connection with one or more other figures (e.g., Figure 26A). As shown, object tracker 2631 includes an optional registration portion 2699, which includes a processor 2696 configured to perform object scan registration and data fusion. Processor 2696 is further configured to store the resultant data in 3D object database 2698.
Referring back to Figure 26A, diagram 2600 also includes a classifier 2660, which may include a track classification engine 2662 for generating static obstacle data 2672 and dynamic obstacle data 2674, both of which may be transmitted to the planner for path planning purposes. In at least one example, track classification engine 2662 is configured to determine whether an obstacle is static or dynamic, as well as another classification type for the object (e.g., whether the object is a vehicle, pedestrian, tree, cyclist, dog, cat, paper bag, etc.). Static obstacle data 2672 may be formed as part of an obstacle map (e.g., a 2D occupancy map), and dynamic obstacle data 2674 may be formed to include bounding boxes with data indicative of velocity and classification type. Dynamic obstacle data 2674, at least in some cases, includes 2D dynamic obstacle map data.
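Dynamic obstacle data 2674 is described as bounding boxes plus velocity and classification type. A minimal record for such data might look like the following; the field names and the constant-velocity prediction helper are assumptions for illustration, not structures defined in the patent.

```python
from dataclasses import dataclass

@dataclass
class DynamicObstacle:
    """One entry of dynamic obstacle data: a 2D axis-aligned bounding box
    plus a velocity vector and a classification type for the planner."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float
    vx: float    # m/s, east component
    vy: float    # m/s, north component
    label: str   # e.g. "cyclist", "pedestrian"

    def predict_box(self, dt):
        """Propagate the box forward dt seconds at constant velocity,
        a cheap way for a planner to anticipate where the obstacle goes."""
        dx, dy = self.vx * dt, self.vy * dt
        return (self.x_min + dx, self.y_min + dy,
                self.x_max + dx, self.y_max + dy)

cyclist = DynamicObstacle(0.0, 0.0, 1.0, 2.0, vx=3.0, vy=0.0, label="cyclist")
predicted = cyclist.predict_box(0.5)   # box shifted 1.5 m east
```

Carrying velocity with each box is what lets the planner treat dynamic obstacle data differently from the static 2D occupancy map.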
Figure 27 is an example of a front-end processor for a perception engine, according to some examples. Diagram 2700 includes a ground segmentation processor 2723a for performing ground segmentation and an over-segmentation processor 2723b for performing "over-segmentation," according to various examples. Processors 2723a and 2723b are configured to receive optionally colored lidar data 2775. Over-segmentation processor 2723b generates data 2710 of a first blob type (e.g., relatively small blobs), which is provided to an aggregation classification and segmentation engine 2712 that generates data 2714 of a second blob type. Data 2714 is provided to data association processor 2732, which is configured to detect whether data 2714 resides in track database 2736. A determination is made at 2740 as to whether data 2714 of the second blob type (e.g., relatively large blobs, which may include one or more smaller blobs) constitutes a new track. If so, a track is initialized at 2742; if not, the tracked object data is stored in track database 2736 and the track may be extended or updated by track updater 2742. Track classification engine 2762 is coupled to track database 2736 to identify and update/modify tracks by, for example, adding, removing, or modifying track-related data.
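The decision at 2740, whether an incoming blob is a new track or an extension of one already in the database, can be sketched with a nearest-neighbor gate. The gating distance, the dictionary-backed track store, and the 2D positions are illustrative assumptions, not the patent's data association method.

```python
def associate_blob(track_db, blob_pos, next_id, gate=2.0):
    """Return (track_id, is_new): match the blob to the nearest existing
    track within `gate` meters, else initialize a new track (step 2742)."""
    best_id, best_d2 = None, gate * gate
    for tid, pos in track_db.items():
        d2 = (pos[0] - blob_pos[0]) ** 2 + (pos[1] - blob_pos[1]) ** 2
        if d2 < best_d2:
            best_id, best_d2 = tid, d2
    if best_id is None:              # no track within the gate: new track
        track_db[next_id] = blob_pos
        return next_id, True
    track_db[best_id] = blob_pos     # track updater: extend the matched track
    return best_id, False

db = {1: (0.0, 0.0)}
tid, is_new = associate_blob(db, (0.5, 0.5), next_id=2)      # near track 1
tid2, is_new2 = associate_blob(db, (50.0, 50.0), next_id=2)  # far away: new
```

Real systems gate in a velocity-aware, uncertainty-weighted space (e.g., Mahalanobis distance under a Kalman filter), but the new-versus-existing branching is the same.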
Figure 28 is a diagram depicting a simulator configured to simulate an autonomous vehicle in a synthetic environment, according to various embodiments. Diagram 2800 includes a simulator 2840 configured to generate a simulated environment 2803. As shown, simulator 2840 is configured to use reference data 2822 (e.g., 3D map data and/or other map or route data, including RNDF data or similar road network data) to generate simulated geometries, such as simulated surfaces 2892a and 2892b, within simulated environment 2803. Simulated surfaces 2892a and 2892b may simulate walls or front sides of buildings adjacent to a roadway. Simulator 2840 may also use pre-generated or procedurally-generated dynamic object data 2825 to simulate dynamic agents in the synthetic environment. An example of a dynamic agent is simulated dynamic object 2801, which represents a simulated cyclist having a velocity. The simulated dynamic agents may optionally respond to other static and dynamic agents in the simulated environment, including the simulated autonomous vehicle. For example, simulated object 2801 may slow down for other obstacles in simulated environment 2803 rather than follow a preset trajectory, thereby creating a more realistic simulation of the actual dynamic environments that exist in the real world.
Simulator 2840 may be configured to generate a simulated autonomous vehicle controller 2847, which includes synthetic adaptations of a perception engine 2866, a localizer 2868, a motion controller 2862, and a planner 2864, each of which may have the functionality described herein within simulated environment 2803. Simulator 2840 may also generate simulated interfaces ("I/F") 2849 to simulate data exchanges with different sensor modalities and different sensor data formats. As such, simulated interface 2849 may simulate a software interface for packetized data from, for example, a simulated lidar sensor 2872. Further, simulator 2840 may also be configured to generate a simulated autonomous vehicle 2830 that implements simulated AV controller 2847. Simulated autonomous vehicle 2830 includes simulated lidar sensors 2872, simulated camera or image sensors 2874, and simulated radar sensors 2876. In the example shown, simulated lidar sensor 2872 may be configured to generate a simulated laser consistent with ray trace 2892, which causes generation of simulated sensor return 2891. Note that simulator 2840 may simulate the addition of noise or other environmental effects on the sensor data (e.g., added diffusion or reflections that affect simulated sensor return 2891, etc.). Further still, simulator 2840 may be configured to simulate a variety of sensor defects, including sensor failure, sensor miscalibration, intermittent data outages, and the like.
Simulator 2840 includes a physics processor 2850 for simulating the mechanical, static, dynamic, and kinematic aspects of an autonomous vehicle for use in simulating the behavior of simulated autonomous vehicle 2830. For example, physics processor 2850 includes a contact mechanics module 2851 for simulating contact mechanics, a collision detection module 2852 for simulating interactions between simulated bodies, and a multibody dynamics module 2854 to simulate interactions between simulated mechanical elements.
Simulator 2840 also includes a simulator controller 2856 configured to control the simulation to adapt the functionalities of any synthetically-generated element of simulated environment 2803 to determine cause-and-effect relationships, among other things. Simulator 2840 includes a simulator evaluator 2858 to evaluate the performance of the synthetically-generated elements of simulated environment 2803. For example, simulator evaluator 2858 may analyze simulated vehicle commands 2880 (e.g., simulated steering angles and simulated velocities) to determine whether such commands are an appropriate response to the simulated activities within simulated environment 2803. Further, simulator evaluator 2858 may evaluate interactions of a teleoperator 2808 with simulated autonomous vehicle 2830 via teleoperator computing device 2804. Simulator evaluator 2858 may evaluate the effects of updated reference data 2827, including updated map tiles and route data, which may be added to guide the responses of simulated autonomous vehicle 2830. Simulator evaluator 2858 may also evaluate the responses of simulated AV controller 2847 when policy data 2829 is updated, deleted, or added. The above description of simulator 2840 is not intended to be limiting. As such, simulator 2840 is configured to perform a variety of different simulations of an autonomous vehicle relative to a simulated environment, including both static and dynamic features. For example, simulator 2840 may be used to validate changes in software versions to ensure reliability. Simulator 2840 may also be used to determine vehicle dynamics properties for calibration purposes. Further, simulator 2840 may be used to explore the space of applicable controls and resulting trajectories so as to effect learning by self-simulation.
Figure 29 is an example of a flow chart to simulate various aspects of an autonomous vehicle, according to some embodiments. Flow chart 2900 begins at 2902, at which reference data including three-dimensional map data is received into a simulator. Dynamic object data defining motion patterns for a classified object may be retrieved at 2904. At 2906, a simulated environment is formed based on at least the three-dimensional ("3D") map data and the dynamic object data. The simulated environment may include one or more simulated surfaces. At 2908, an autonomous vehicle is simulated, including a simulated autonomous vehicle controller that forms part of the simulated environment. The autonomous vehicle controller may include a simulated perception engine and a simulated localizer configured to receive sensor data. At 2910, simulated sensor data is generated based on data for at least one simulated sensor return, and at 2912 simulated vehicle commands are generated to cause motion (e.g., vectored propulsion) of the simulated autonomous vehicle in the synthetic environment. At 2914, the simulated vehicle commands are evaluated to determine whether the behavior of the simulated autonomous vehicle is consistent with expected behaviors (e.g., consistent with a policy).
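The loop implied by steps 2910 to 2914 can be sketched as: generate sensor data, produce a command, then have an evaluator check the command against policy limits. The toy controller, the policy fields, and the numeric limits below are all assumptions for illustration, not values from the patent.

```python
def evaluate_command(cmd, policy):
    """Simulator-evaluator check (step 2914): does the simulated vehicle
    command stay within the policy's steering and speed envelopes?"""
    return (abs(cmd["steering_angle"]) <= policy["max_steering_angle"]
            and 0.0 <= cmd["velocity"] <= policy["max_velocity"])

policy = {"max_steering_angle": 0.6, "max_velocity": 15.0}
violations = []
for step in range(3):                        # tiny simulation loop
    sensor_range = 20.0 - 8.0 * step         # a simulated obstacle closing in
    cmd = {"steering_angle": 0.1,
           "velocity": min(15.0, sensor_range)}   # naive slow-for-obstacle rule
    if not evaluate_command(cmd, policy):    # step 2914
        violations.append(step)
# violations stays empty: every command respected the policy envelope
```

The same evaluator can be re-run after a software change, a map-tile update, or a policy edit, which is how the simulator supports the regression-style validation described above.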
Figure 30 is an example of a flow chart to generate map data, according to some embodiments. Flow chart 3000 begins at 3002, at which trajectory data is retrieved. The trajectory data may include trajectories captured over a duration of time (e.g., logged trajectories). At 3004, at least localization data may be received. The localization data may be captured over a duration of time (e.g., logged localization data). At 3006, a camera or other image sensor may be implemented to generate a subset of the localization data. As such, the retrieved localization data may include image data. At 3008, subsets of the localization data are aligned to identify a global position (e.g., a global pose). At 3010, three-dimensional ("3D") map data is generated based on the global position, and at 3012, the three-dimensional map data is available for implementation by, for example, a manual route data editor (e.g., including a manual road network data editor, such as an RNDF editor), an automated route data generator (e.g., including an automatic road network generator, including an automatic RNDF generator), a fleet of autonomous vehicles, a simulator, a teleoperator computing device, and any other component of an autonomous vehicle service.
Figure 31 is a diagram depicting an architecture of a mapping engine, according to some embodiments. Diagram 3100 includes a 3D mapping engine configured to receive trajectory log data 3140, lidar log data 3172, camera log data 3174, radar log data 3176, and other optional logged sensor data (not shown). Logic 3141 includes a loop-closure detector 3150 configured to detect whether sensor data indicates that a nearby point in space has been previously visited. Logic 3141 also includes a registration controller 3152 for aligning map data, in some cases including 3D map data, relative to one or more registration points. Further, logic 3141 provides data 3142 representing the states of loop closures for use by a global pose graph generator 3143, which is configured to generate pose graph data 3145. In some examples, pose graph data 3145 may also be generated based on data from a registration refinement module 3146. Logic 3144 includes a 3D mapper 3154 and a lidar self-calibration unit 3156. Further, logic 3144 receives sensor data and pose graph data 3145 to generate 3D map data 3120 (or other map data, such as 4D map data). In some examples, logic 3144 may implement a truncated signed distance function ("TSDF") to fuse sensor data and/or map data to form optimal three-dimensional maps. Further, logic 3144 is configured to include texture and reflectance properties. 3D map data 3120 may be released for use by a manual route data editor 3160 (e.g., an editor for manipulating route data or other types of route or reference data), an automated route data generator 3162 (e.g., logic configured to generate route data or other types of road network or reference data), a fleet of autonomous vehicles 3164, a simulator 3166, a teleoperator computing device 3168, and any other component of an autonomous vehicle service. Mapping engine 3110 may capture semantic information from manual annotation or automatically-generated annotation, as well as from other sensors or instrumented environments (e.g., sonar, or smart stop-lights).
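The TSDF fusion mentioned above can be illustrated in one dimension: each voxel keeps a weighted running average of its truncated signed distance to the observed surface, so noisy scans average toward the true geometry. The uniform per-observation weighting and the 1D layout are simplifying assumptions, not the patent's implementation.

```python
def tsdf_update(voxels, weights, surface, trunc=1.0):
    """Fuse one range observation into a 1D TSDF: each voxel stores a
    weighted average of its truncated signed distance to the surface."""
    for i in range(len(voxels)):
        d = max(-trunc, min(trunc, surface - i))   # signed, then truncated
        voxels[i] = (voxels[i] * weights[i] + d) / (weights[i] + 1)
        weights[i] += 1

voxels = [0.0] * 8
weights = [0] * 8
for obs in (4.9, 5.1, 5.0):     # three noisy scans of a wall near x = 5
    tsdf_update(voxels, weights, obs)
# the zero crossing of `voxels` sits at the fused surface estimate, x ≈ 5:
# voxels in front of the wall stay positive, voxels behind it negative
```

Extracting the zero-level set of the fused field (in 3D, via marching cubes or similar) yields the map surface; truncation keeps each observation's influence local to the surface it actually measured.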
Figure 32 is a diagram depicting an autonomous vehicle application, according to some examples. Diagram 3200 depicts a mobile computing device 3203 including an autonomous service application 3240 configured to contact an autonomous vehicle service platform 3201 to arrange transportation of a user 3202 via an autonomous vehicle 3230. As shown, autonomous service application 3240 may include a transportation controller 3242, which may be a software application residing on a computing device (e.g., a mobile phone 3203, etc.). Transportation controller 3242 is configured to receive, schedule, select, or perform operations related to autonomous vehicles and/or autonomous vehicle fleets for which user 3202 may arrange transportation from the user's location to a destination. For example, user 3202 may open the application to request vehicle 3230. The application may display a map, and user 3202 may drop a pin to indicate their destination within, for example, a geo-fenced region. Alternatively, the application may display a list of nearby pre-specified pick-up locations, or provide the user with a text entry field in which to type a destination, either by address or by name.
Further to the example shown, autonomous vehicle application 3240 may also include a user identification controller 3246, which may be configured to detect that user 3202 is in a geographic region in the vicinity of, or near, autonomous vehicle 3230 as the vehicle approaches. In some situations, user 3202 may not readily perceive or identify vehicle 3230 as it approaches for use by user 3202 (e.g., due to various other vehicles, including trucks, cars, and taxis, and other obstructions that are typical in urban environments). In one example, autonomous vehicle 3230 may establish a wireless communication link 3262 (e.g., via a radio frequency ("RF") signal, such as WiFi or Bluetooth®, including BLE, or the like) for communicating and/or determining a spatial location of user 3202 relative to autonomous vehicle 3230 (e.g., using the relative direction and signal strength of the RF signals). In some cases, autonomous vehicle 3230 may detect an approximate geographic location of user 3202 using, for example, GPS data. A GPS receiver (not shown) of mobile computing device 3203 may be configured to provide GPS data to autonomous vehicle service application 3240. Then, user identification controller 3246 may provide the GPS data via link 3260 to autonomous vehicle service platform 3201, which, in turn, may provide that location via link 3261 to autonomous vehicle 3230. Subsequently, autonomous vehicle 3230 may determine a relative distance and/or direction of user 3202 by comparing the user's GPS data to the vehicle's GPS-derived location.
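The final comparison, computing the user's relative distance and bearing from two GPS fixes, might be done with the haversine formula. This sketch is a standard spherical-Earth approximation offered as an illustration; the patent does not specify the computation.

```python
import math

def gps_offset(vehicle, user):
    """Return (distance_m, bearing_deg) from the vehicle's (lat, lon) GPS fix
    to the user's, using the haversine formula on a sphere of R = 6371 km."""
    lat1, lon1 = map(math.radians, vehicle)
    lat2, lon2 = map(math.radians, user)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    distance = 2 * 6371000.0 * math.asin(math.sqrt(a))
    bearing = math.degrees(math.atan2(
        math.sin(dlon) * math.cos(lat2),
        math.cos(lat1) * math.sin(lat2)
        - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))) % 360.0
    return distance, bearing

# Vehicle roughly one city block (0.001 deg of latitude) south of the user:
# about 111 m away, bearing 0 degrees (due north).
d, b = gps_offset((37.7740, -122.4194), (37.7750, -122.4194))
```

At pick-up ranges, consumer GPS error (several meters or more) dominates, which is why the text pairs this coarse estimate with RF signal strength and direction for the final approach.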
Autonomous vehicle 3230 may also include additional logic to identify the presence of user 3202, whereby the logic is configured to perform face detection algorithms to detect user 3202 generally, or to specifically identify the identity (e.g., name, phone number, etc.) of user 3202 based on the user's unique facial characteristics. Further, autonomous vehicle 3230 may include logic to detect codes for identifying user 3202. Examples of such codes include: specialized visual codes, such as QR codes, color codes, etc.; specialized audio codes, such as voice-activated or recognized codes, etc.; and others. In some cases, a code may be an encoded security key that may be transmitted via link 3262 to autonomous vehicle 3230 to ensure secure ingress and/or egress. Further, one or more of the above-described techniques for identifying user 3202 may be used as a secured means of granting ingress and egress privileges to user 3202, so as to prevent others from entering the secured compartment of autonomous vehicle 3230 (e.g., to ensure that third-party persons cannot enter an unoccupied autonomous vehicle prior to its arriving at user 3202). According to various examples, any other means for identifying user 3202 and providing secured ingress and egress may also be implemented in one or more of autonomous vehicle service application 3240, autonomous vehicle service platform 3201, and autonomous vehicle 3230.
To assist user 3202 in identifying the arrival of the requested transportation, autonomous vehicle 3230 may be configured to notify, or otherwise alert, user 3202 to the presence of autonomous vehicle 3230 as it approaches. For example, autonomous vehicle 3230 may activate one or more light-emitting devices 3280 (e.g., LEDs) in accordance with specific light patterns. In particular, certain light patterns may be created so that user 3202 may readily perceive that autonomous vehicle 3230 is reserved to service the transportation needs of user 3202. As an example, autonomous vehicle 3230 may generate light patterns 3290 that may be perceived by user 3202 as a "wink," or other animations of its exterior and interior lights in a visual and temporal manner. The light patterns 3290 may be generated with or without patterns of sound to identify to user 3202 that this vehicle is the one they booked.
According to some embodiments, an autonomous vehicle user controller 3244 may implement a software application configured to control various functions of the autonomous vehicle. Further, the application may be configured to redirect or reroute the autonomous vehicle during transit to its initial destination. Additionally, autonomous vehicle user controller 3244 may be configured to cause on-board logic to modify the interior lighting of autonomous vehicle 3230, for example, to effect mood lighting. Controller 3244 may also control a source of audio (e.g., an external source such as Spotify, or audio stored locally on mobile computing device 3203), select a type of ride (e.g., modify desired acceleration and braking aggressiveness, modify active suspension parameters to select a set of "road-handling" characteristics to implement aggressive driving characteristics, including vibrations, or to select "soft-ride" qualities with vibrations dampened for comfort), and the like. For example, mobile computing device 3203 may also be configured to control HVAC functions, such as ventilation and temperature.
Figures 33 to 35 illustrate examples of various computing platforms configured to provide various functionalities to components of an autonomous vehicle service, according to various embodiments. In some examples, computing platform 3300 may be used to implement computer programs, applications, methods, processes, algorithms, or other software to perform any of the techniques described herein.
Note that the various structures and/or functionalities of Figure 33 are also applicable to Figures 34 and 35, and, as such, some elements of those two figures may be discussed in the context of Figure 33.
In some cases, computing platform 3300 can be disposed in any device, such as a computing device 3390a, which may be disposed in one or more computing devices of an autonomous vehicle service platform, an autonomous vehicle 3391, and/or a mobile computing device 3391.
Computing platform 3300 includes a bus 3302 or other communication mechanism for communicating information, which interconnects subsystems and devices, such as a processor 3304, a system memory 3306 (e.g., RAM), a storage device 3308 (e.g., ROM, etc.), an in-memory cache (which may be implemented in RAM 3306 or other portions of computing platform 3300), and a communication interface 3313 (e.g., an Ethernet or wireless controller, a Bluetooth controller, NFC logic, etc.) to facilitate communications via a port on communication link 3321, for example, to communicate with a computing device, including mobile computing and/or communication devices having processors. Processor 3304 can be implemented with one or more graphics processing units ("GPUs"), with one or more central processing units ("CPUs") (e.g., commercially manufactured CPUs), or with one or more virtual processors, as well as any combination of CPUs and virtual processors. Computing platform 3300 exchanges data representing inputs and outputs via input-and-output devices 3301, including, but not limited to, keyboards, mice, audio inputs (e.g., speech-to-text devices), user interfaces, displays, monitors, cursors, touch-sensitive displays, LCD or LED displays, and other I/O-related devices.
According to some examples, computing platform 3300 performs specific operations by processor 3304 executing one or more sequences of instructions stored in system memory 3306, and computing platform 3300 can be implemented in a client-server arrangement, a peer-to-peer arrangement, or as any mobile computing device, including a smartphone or the like. Such instructions or data may be read into system memory 3306 from another computer-readable medium, such as storage device 3308. In some examples, hard-wired circuitry may be used in place of, or in combination with, software instructions for an implementation. Instructions may be embedded in software or firmware. The term "computer-readable medium" refers to any tangible medium that participates in providing instructions to processor 3304 for execution. Such a medium may take many forms, including, but not limited to, non-volatile media and volatile media. Non-volatile media includes, for example, optical or magnetic disks and the like. Volatile media includes dynamic memory, such as system memory 3306.
Common forms of computer-readable media include, for example, floppy disks, flexible disks, hard disks, magnetic tape, any other magnetic medium, CD-ROMs, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read. Instructions may further be transmitted or received using a transmission medium. The term "transmission medium" may include any tangible or intangible medium that is capable of storing, encoding, or carrying instructions for execution by a machine, and includes digital or analog communications signals or other intangible media to facilitate communication of such instructions. Transmission media includes coaxial cables, copper wire, and fiber optics, including the wires that comprise bus 3302 for transmitting a computer data signal.
In some examples, execution of the sequences of instructions may be performed by computing platform 3300. According to some examples, computing platform 3300 can be coupled by communication link 3321 (e.g., a wired network, such as a LAN or PSTN, or any wireless network, including WiFi of various standards and protocols, Bluetooth, NFC, Zig-Bee, etc.) to any other processor to perform the sequences of instructions in coordination with (or asynchronously to) one another. Computing platform 3300 may transmit and receive messages, data, and instructions, including program code (e.g., application code), through communication link 3321 and communication interface 3313. Received program code may be executed by processor 3304 as it is received, and/or stored in memory 3306 or other non-volatile storage for later execution.
In the example shown, system memory 3306 can include various modules that include executable instructions to implement functionalities described herein. System memory 3306 may include an operating system ("O/S") 3332, as well as an application 3336 and/or logic module(s) 3359. In the example shown in FIG. 33, system memory 3306 includes an autonomous vehicle ("AV") controller module 3350 and/or its components (e.g., a perception engine module, a localization module, a planner module, and/or a motion controller module), any of which, or one or more portions of which, can be configured to facilitate an autonomous vehicle service by implementing one or more functions described herein.
Referring to the example shown in FIG. 34, system memory 3306 includes an autonomous vehicle service platform module 3450 and/or its components (e.g., a teleoperator manager, a simulator, etc.), any of which, or one or more portions of which, can be configured to facilitate an autonomous vehicle service by implementing one or more functions described herein.
Referring to the example shown in FIG. 35, system memory 3306 includes an autonomous vehicle ("AV") module and/or its components for use in, for example, a mobile computing device. One or more portions of module 3550 can be configured to facilitate delivery of an autonomous vehicle service by implementing one or more functions described herein.
Referring back to FIG. 33, the structures and/or functions of any of the features described herein or incorporated by reference can be implemented in software, hardware, firmware, circuitry, or a combination thereof. Note that the structures and constituent elements above, as well as their functionality, may be aggregated with one or more other structures or elements. Alternatively, the elements and their functionality may be subdivided into constituent sub-elements, if any. As software, the techniques described herein or incorporated by reference may be implemented using various types of programming or formatting languages, frameworks, syntax, applications, protocols, objects, or techniques. As hardware and/or firmware, the techniques described herein or incorporated by reference may be implemented using various types of programming or integrated circuit design languages, including hardware description languages, such as any register transfer language ("RTL") configured to design field-programmable gate arrays ("FPGAs"), application-specific integrated circuits ("ASICs"), or any other type of integrated circuit. According to some embodiments, the term "module" can refer, for example, to an algorithm, or a portion thereof, and/or logic implemented in either hardware circuitry or software, or a combination thereof. These can be varied and are not limited to the examples or descriptions provided.
In some embodiments, module 3350 of FIG. 33, module 3450 of FIG. 34, and module 3550 of FIG. 35, or one or more of their components, or any process or device described herein, can be in communication with a mobile device (e.g., a mobile phone or a computing device), or can be disposed therein.
In some cases, a mobile device, or any networked computing device, in communication with one or more modules 3359 (module 3350 of FIG. 33, module 3450 of FIG. 34, and module 3550 of FIG. 35), or one or more of their components (or any process or device described herein), can provide at least some of the structures and/or functions of any of the features described herein. As depicted in the figures described herein or incorporated by reference, the structures and/or functions of any of the features can be implemented in software, hardware, firmware, circuitry, or any combination thereof. Note that the structures and constituent elements above, as well as their functionality, may be aggregated or combined with one or more other structures or elements. Alternatively, the elements and their functionality may be subdivided into constituent sub-elements, if any. As software, at least some of the techniques described herein or incorporated by reference may be implemented using various types of programming or formatting languages, frameworks, syntax, applications, protocols, objects, or techniques. For example, at least one of the elements depicted in any of the figures can represent one or more algorithms. Alternatively, at least one of the elements can represent a portion of logic including a portion of hardware configured to provide constituent structures and/or functionalities.
For example, module 3350 of FIG. 33, module 3450 of FIG. 34, and module 3550 of FIG. 35, or one or more of their components, or any process or device described herein, can be implemented in one or more computing devices (i.e., any mobile computing device, such as a wearable device, an audio device (e.g., headphones or a headset), or a mobile phone, whether worn or carried) that include one or more processors configured to execute one or more algorithms in memory. Thus, at least some of the elements in the figures described herein or incorporated by reference can represent one or more algorithms. Alternatively, at least one of the elements can represent a portion of logic including a portion of hardware configured to provide constituent structures and/or functionalities. These can be varied and are not limited to the examples or descriptions provided.
As hardware and/or firmware, the structures and/or techniques described herein or incorporated by reference can be implemented using various types of programming or integrated circuit design languages, including hardware description languages, such as any register transfer language ("RTL") configured to design field-programmable gate arrays ("FPGAs"), application-specific integrated circuits ("ASICs"), multi-chip modules, or any other type of integrated circuit.
For example, module 3350 of FIG. 33, module 3450 of FIG. 34, and module 3550 of FIG. 35, or one or more of their components, or any process or device described herein, can be implemented in one or more computing devices that include one or more circuits. Thus, at least some of the elements in the figures described herein or incorporated by reference can represent one or more components of hardware. Alternatively, at least one of the elements can represent a portion of logic including a portion of a circuit configured to provide constituent structures and/or functionalities.
According to some embodiments, the term "circuit" can refer, for example, to any system including a number of components through which current flows to perform one or more functions, the components including discrete and complex components. Examples of discrete components include transistors, resistors, capacitors, inductors, diodes, and the like, and examples of complex components include memories, processors, analog circuits, digital circuits, and the like, including field-programmable gate arrays ("FPGAs") and application-specific integrated circuits ("ASICs"). Thus, a circuit can include a system of electronic components and logic components (e.g., logic configured to execute instructions, such that a group of executable instructions of an algorithm, for example, is thereby a component of a circuit). According to some embodiments, the term "module" can refer, for example, to an algorithm, or a portion thereof, and/or logic implemented in either hardware circuitry or software, or a combination thereof (i.e., a module can be implemented as a circuit). In some embodiments, algorithms and/or the memory in which the algorithms are stored are "components" of a circuit. Thus, the term "circuit" can also refer, for example, to a system of components, including algorithms. These can be varied and are not limited to the examples or descriptions provided.
FIG. 36 is a diagram depicting a simulator configured to simulate one or more functions of an autonomous vehicle in a simulated environment, according to some examples. Diagram 3600 depicts a simulator 3640 configured to synthesize a simulated world or simulated environment 3603 in which the operation of an autonomous vehicle 3630 (and any of its components, such as sensors) can be simulated, for example, to determine the efficacy of hardware and software, or a combination thereof, for a fleet 3630a of one or more autonomous vehicles that may constitute an autonomous vehicle service. Further, simulator 3640 can be configured, for example, to simulate vehicle dynamics as simulated autonomous vehicle 3630 travels under different driving conditions or scenarios. For example, simulator 3640 can simulate the operation of autonomous vehicle 3630 in driving scenarios that include driving over terrain that may be city-specific (e.g., the hills of San Francisco, the urban traffic of New York City, etc.) and driving during different driving conditions (e.g., simulating reduced wheel friction due to rain, ice, etc.). In the example shown, simulated environment 3603 includes a roadway segment having an increased grade or slope in the direction of travel of simulated autonomous vehicle 3630 (i.e., vehicle 3630 is traveling uphill, as denoted by symbol 3686), and is also depicted as having a formation of ice 3684 over which simulated autonomous vehicle 3630 may travel.
Further, simulator 3640 can also generate simulated environment 3603 based on the synthesis of laser and camera data, as well as any other data (e.g., radar data, sonar data, etc.). For example, simulated environment 3603 can be based on 3D map data that includes 3D point clouds generated using laser scanner returns in combination with camera vision/image data (or any other sensor data). Examples of sensor data include, but are not limited to, lidar data, image or camera data, GPS data, inertial measurement unit ("IMU") data, acoustic data, odometry data, wheel angle data, battery charge levels, drive currents of one or more powertrain devices or motors, thermal energy data (e.g., temperatures) of any component, acceleration or deceleration data, brake pressures or forces that may be applied to one or more wheels, etc., as well as other sensor data described herein or otherwise incorporated by reference. Simulator 3640 can also simulate sensor return data that may be detected by sensors disposed in or on simulated autonomous vehicle 3630. Examples of simulated sensor return data include a simulated lidar return 3671 reflected from a portion 3672 of a surface (e.g., a surface of a building adjacent to the roadway).
Diagram 3600 depicts a data modeler 3620 and simulator 3640, which can operate cooperatively to generate simulated environment 3603 including dynamic objects 3680 and 3682a, as well as a simulated roadway condition (e.g., ice 3684). Simulator 3640 can include a physics processor 3650 configured to simulate the mechanical, static, dynamic, and kinematic aspects of the autonomous vehicle for use in simulating the behavior of simulated autonomous vehicle 3630. For example, physics processor 3650 can include simulated contact mechanics and simulated interactions between simulated bodies and/or simulated mechanical interactions. Simulator 3640 can also include a simulator controller 3656 configured to control the simulation to adapt the functionalities of any synthetically-generated element of simulated environment 3603, among other things, to determine and evaluate causation. Note that elements depicted in diagram 3600 of FIG. 36 may include structures and/or functions that are the same as similarly-named elements described in connection with one or more other figures described herein or incorporated by reference.
In some examples, data modeler 3620 receives log file data from fleet 3630a of autonomous vehicles, the log file data providing various types of data, including, but not limited to, object data 3631, map data 3633, sensor data 3635, vehicle component data 3637, and any other data that may describe the structures and/or functions of one or more autonomous vehicles of fleet 3630a. In some examples, the above-described data may be recorded while driving a number of miles under a variety of different conditions, at different times of the year and during various seasons. The recorded data may be generated from any number of autonomous vehicles that may travel in a road network. Further, data modeler 3620 may include logic to characterize roadways (e.g., features such as grade, roughness (e.g., due to bumps, potholes, etc.), angle of slope toward the roadside, and typical or expected friction values or surface types), dynamic objects probabilistically expected at one or more portions of the road network (e.g., a simulated autonomous vehicle 3630 in a school zone at the end of a school day may expect to encounter numerous children as dynamic objects), and any other characteristics. Data modeler 3620 may also include logic to characterize roadways under various weather conditions. Thus, data modeler 3620 may use the data representing characterized roadways to form a data model of the road network, or of any other path or roadway segment. In some cases, the logic in data modeler 3620 may aggregate or fuse data (e.g., sensor data) based on hundreds of thousands to millions of miles of travel (e.g., as driven by fleet 3630a) over the same or different roadways, or any other amount of data recorded over lesser or greater distances.
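To make the road-characterization idea above concrete, the following is a minimal sketch, with invented class and field names (none appear in the patent), of the kind of roadway segment data model a data modeler might form from logged fleet data, including a friction value that can be rescaled for simulated weather:

```python
from dataclasses import dataclass

# Hypothetical sketch of a characterized road segment; field names and the
# weather scaling factors are illustrative assumptions, not from the patent.
@dataclass
class RoadSegmentModel:
    segment_id: str
    grade_pct: float        # positive = uphill in the direction of travel
    roughness: float        # 0 (smooth) .. 1 (severe bumps/potholes)
    friction_coeff: float   # expected dry tire-road friction

    def friction_under_weather(self, weather: str) -> float:
        """Scale the dry friction value for a simulated weather condition."""
        scale = {"dry": 1.0, "rain": 0.7, "ice": 0.15}[weather]
        return self.friction_coeff * scale

# An uphill San Francisco-style segment: icy conditions sharply reduce friction.
seg = RoadSegmentModel("sf_hill_01", grade_pct=12.0, roughness=0.2, friction_coeff=0.9)
assert seg.friction_under_weather("dry") == 0.9
assert abs(seg.friction_under_weather("ice") - 0.135) < 1e-9
```

A simulator consuming such a model could, for instance, place the ice 3684 patch on a segment by lowering its effective friction for the affected portion.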
As shown, data modeler 3620 includes a dynamic object data modeler 3621, an environment modeler 3623, a sensor modeler 3625, and a vehicle modeler 3627, or any other hardware and/or software implementations thereof, to generate one or more data models with which simulator 3640 can generate one or more portions of simulated environment 3603. Dynamic object data modeler 3621 can be configured to receive data (e.g., logged data) representing the characteristics of objects in one or more environments from which fleet 3630a of autonomous vehicles obtained characteristic data. Such data may include 3D point clouds, or another data representation with which, for example, classes of objects can be visually defined (e.g., pedestrians, pets or animals, bicyclists, automobiles, etc.), whereby a classified object can be associated with a certain level of dynamism and/or a predicted range of motion (e.g., per unit time); thus, at least in some examples, the predicted range of motion may also describe a predicted direction of motion (e.g., represented by a predicted motion vector). According to some examples, the predicted range of motion can describe a probability that an object may change from a static object into a dynamic object, and/or velocities or accelerations of the dynamic object.
In view of the foregoing, dynamic object data modeler 3621 can be configured to identify any number of objects, and can further be configured to classify the objects into any number of classes. According to some examples, dynamic object data modeler 3621 can also identify static objects and classify them, including objects that may be static at one moment and dynamic at another (e.g., a pet dog 3682b sitting by the roadside at one moment may suddenly bolt at another moment and run into the roadway). To illustrate, consider that dynamic object data modeler 3621 can classify object 3682b as a dog, and can associate dynamic characteristics and/or predicted ranges of motion with dogs. Further, dynamic object data modeler 3621 can generate a data model describing the predicted motion of object 3682b in relation to interactions with other dynamic objects, such as dynamic object 3680 or dynamic object 3682a, which is depicted as a dog in motion. Absent dynamic object 3682a, dog 3682b can be associated with a first probability of engaging in an activity (e.g., jumping forward and running). However, should dog 3682b encounter dog 3682a, or interact with (or chase) dog 3682a (which has a predicted range of motion 3683), the probability that dog 3682b engages in the activity may increase suddenly. For example, the probability that dog 3682b jumps forward and instinctively chases dog 3682a may increase from about 10% to about 85% (e.g., based on logged data). Based on this data model, simulator 3640 can use the modeled behavior derived from dynamic object data modeler 3621 to generate simulated environment 3603 including two (2) dynamic objects 3682a and 3682b to account for in navigation and planning, rather than one (1) dynamic object 3682a.
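The interaction-dependent probability in the dog example above can be sketched as a tiny conditional behavior model. This is purely illustrative: the class, method names, and sampling scheme are assumptions; only the 10%/85% figures come from the passage.

```python
import random

# Hedged sketch: a dynamic-object behavior model in which the probability of
# a rare action (a dog bolting) depends on whether another dog is within the
# predicted motion range, per the 10% -> 85% example in the text.
class DogBehaviorModel:
    BASE_CHASE_PROB = 0.10          # no other dog present
    INTERACTION_CHASE_PROB = 0.85   # another dog within predicted motion range

    def chase_probability(self, other_dog_in_range: bool) -> float:
        """Probability that the dog jumps forward and chases."""
        return self.INTERACTION_CHASE_PROB if other_dog_in_range else self.BASE_CHASE_PROB

    def sample_action(self, other_dog_in_range: bool, rng: random.Random) -> str:
        """Randomly sample 'chase' or 'stay' from the modeled probability."""
        p = self.chase_probability(other_dog_in_range)
        return "chase" if rng.random() < p else "stay"

model = DogBehaviorModel()
assert model.chase_probability(False) == 0.10
assert model.chase_probability(True) == 0.85
assert model.sample_action(True, random.Random(0)) in ("chase", "stay")
```

A simulator sampling from such a model would sometimes generate the "dog 3682b chases dog 3682a" scenario, forcing the planner under test to handle both dynamic objects.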
Dynamic object data modeler 3621 can generate data models describing the classification of any object, along with motion-related data (e.g., predicted ranges of motion, rates of motion, predicted paths of motion, etc.). Looking further at diagram 3600, dynamic object data modeler 3621 can generate a data model with which simulator 3640 can generate simulated dynamic object 3680 (e.g., a jogger) associated with a predicted range of motion (e.g., including a direction of motion within a crosswalk) and rates of motion. Other dynamic objects can be classified and, in some cases, further sub-classified. For example, a roadway segment may be adjacent to several bars or nightclubs; accordingly, a classified dynamic object (e.g., a young adult pedestrian) may have a first predicted behavior or motion during the daytime, but may be predicted to exhibit other predicted behaviors (or unpredictable behaviors) appearing after, say, 2:00 am bar or nightclub closing times. Simulator 3640 can use the dynamic object data models generated by modeler 3621 to provide greater precision in simulated environment 3603 relative to the physical environment through which autonomous vehicle 3630 physically drives.
Environment modeler 3623 can be configured to generate various portions of simulated environment 3603, which, in some examples, are static portions. In the example shown, environment modeler 3623 can receive map data 3633 to generate an environment model describing the geometry of the physical external environment. Simulator 3640 can use the environment model data generated by environment modeler 3623 to generate simulated environment 3603 based on, for example, 3D map data 3633. Note that, in some cases, environment modeler 3623 may include, or be similar in structure and/or function to, one or more portions of a mapping engine or mapping device (as described herein or incorporated by reference), among others, to generate a 3D (or 4D) simulated environment 3603.
Sensor modeler 3625 is configured to generate data models representing various functions of one or more of various types of sensors, based on sensor data 3635 extracted as logged data from fleet 3630a of autonomous vehicles. For example, sensor data 3635 may include one or more subsets of one or more types of sensor data, such as, but not limited to, lidar data, radar data, sonar data, image/camera data, audio data, ultrasound data, IMU-related data, odometry data, wheel angle data, and any other type of sensor data. Simulator 3640 can use the data generated by sensor modeler 3625 to model any number of sensors implemented in simulated autonomous vehicle 3630. For example, consider that an autonomous vehicle controller (not shown) that can be modeled may be configured to identify a pose 3670 of simulated autonomous vehicle 3630, or of a simulated lidar sensor (which is configured for ray-traced laser scanning), at least one return of which is depicted as laser return 3671 reflected from surface portion 3672. Further, the autonomous vehicle controller can access the 3D map data to identify the external geometry of 3672 (and the range or position of such geometry), and can also be configured to identify one or more of an x-coordinate, a y-coordinate, a z-coordinate, a roll value, a pitch value, and a yaw value to describe the pose of the simulated lidar sensor. In some examples, simulator controller 3656 of simulator 3640 can be configured to compare simulated values and measurements (e.g., intensity, range, reflectivity, etc.) for simulated laser return 3671 against empirically-obtained lidar data (e.g., sensor data 3635) to determine the accuracy of the simulation.
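The comparison that simulator controller 3656 performs between a simulated laser return and empirically logged values can be sketched as a simple per-field error score. The data structure, field names, and error metric below are illustrative assumptions, not the patent's implementation:

```python
import math
from dataclasses import dataclass

# Hedged sketch: score how closely a simulated lidar return matches an
# empirically logged one across intensity, range, and reflectivity.
@dataclass
class LidarReturn:
    range_m: float       # measured distance to the surface portion
    intensity: float     # return intensity
    reflectivity: float  # estimated surface reflectivity

def return_error(simulated: LidarReturn, empirical: LidarReturn) -> float:
    """Root-mean-square of per-field relative errors between two returns."""
    fields = ("range_m", "intensity", "reflectivity")
    errs = [
        (getattr(simulated, f) - getattr(empirical, f)) / getattr(empirical, f)
        for f in fields
    ]
    return math.sqrt(sum(e * e for e in errs) / len(errs))

sim = LidarReturn(range_m=20.2, intensity=0.48, reflectivity=0.30)
emp = LidarReturn(range_m=20.0, intensity=0.50, reflectivity=0.30)
# A small aggregate error suggests the simulated sensor tracks the real one.
assert return_error(sim, emp) < 0.05
```

In practice such a score would be aggregated over many rays and poses before declaring the simulated sensor model accurate.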
Vehicle modeler 3627 can be configured to receive data representing various types and values associated with various vehicle components, and can be further configured to generate data representing the structural and/or functional characteristics of the mechanical and/or electrical components of an autonomous vehicle. Further, vehicle modeler 3627 can generate vehicle component data models describing the dynamic and kinematic characteristics of the autonomous vehicle, as well as the functions of its electrical and mechanical components.
Data modeler 3620 can also be configured to associate one or more subsets of one type of data with one or more subsets of other types of data. According to various examples, data modeler 3620 can implement deep-learning techniques, either offline or online, to determine various states of an autonomous vehicle. Simulator 3640 and physics processor 3650 can use the state data to evaluate various logic modules to determine whether the simulated responses of simulated autonomous vehicle 3630 are appropriate. For example, data modeler 3620 can model the depicted roadway segment having uphill slope 3686, and can also include regulated ranges of acceleration and/or torque applied to the wheels for uphill travel. Thus, simulated autonomous vehicle 3630 may encounter an "event," such as a loss of wheel friction (i.e., wheel spin) upon encountering ice 3684. In this case, simulator controller 3656 may expect an increased rate of angular velocity of the wheels, thereby confirming the accuracy of the simulation, and may take appropriate courses of action. As another example, consider that simulated autonomous vehicle 3630 is decelerating to come to a stop before a sidewalk. Simulator controller 3656 then expects to detect a rate of deceleration and a certain level (e.g., of braking force or pressure); otherwise, an issue may be flagged. As yet another example, consider that simulator controller 3656 determines that the steering of simulated autonomous vehicle 3630 is biased to the right, or is drifting off of the roadway. Simulator 3640 can then simulate roadway conditions in which the roadway is slanted to the right (e.g., based on map data 3633), simulate a leaking tire, or simulate misaligned tires.
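The wheel-spin "event" described above amounts to checking whether a wheel's angular acceleration exceeds what rolling without slip would produce. The sketch below illustrates that check; the wheel radius, slip threshold, and function names are invented for illustration:

```python
# Hedged sketch: detect a loss-of-friction event (wheel spin on ice) by
# comparing measured wheel angular acceleration against the no-slip value
# implied by the vehicle body's acceleration. Thresholds are assumptions.

WHEEL_RADIUS_M = 0.35  # assumed wheel radius

def expected_wheel_accel(vehicle_accel_mps2: float) -> float:
    """Angular acceleration (rad/s^2) of a wheel rolling without slip."""
    return vehicle_accel_mps2 / WHEEL_RADIUS_M

def wheel_spin_detected(measured_wheel_accel: float,
                        vehicle_accel_mps2: float,
                        slip_factor_threshold: float = 1.5) -> bool:
    """Flag an event when the wheel spins up much faster than the body accelerates."""
    expected = expected_wheel_accel(vehicle_accel_mps2)
    return measured_wheel_accel > slip_factor_threshold * expected

# Normal traction: wheel acceleration is consistent with body acceleration.
assert not wheel_spin_detected(measured_wheel_accel=5.8, vehicle_accel_mps2=2.0)
# On ice: the wheel spins up while the body barely accelerates -> event.
assert wheel_spin_detected(measured_wheel_accel=20.0, vehicle_accel_mps2=2.0)
```

A simulator controller could use such a check both ways: to confirm that simulated ice produces the expected spin, and to flag a physics-model inaccuracy when it does not.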
According to some examples, simulator 3640 is configured to simulate predicted responses, as represented by data, of one or more functions of simulated autonomous vehicle 3630 based on range of motion 3683 of classified dynamic object 3682a, which is depicted as a running dog. Simulator 3640 can be further configured to compute a rate of change of a distance 3685 between simulated autonomous vehicle 3630 and predicted range of motion 3683 of classified dynamic object 3682a; if a threshold is exceeded (e.g., in terms of position, distance, time, etc.), simulator 3640 is configured to cause simulated autonomous vehicle 3630 to avoid simulated dynamic object 3682a in simulated environment 3603 based on the computed rate of change of distance 3685. In some cases, autonomous vehicle 3630 may cease driving to avoid a collision with dynamic object 3682a. In some other examples, simulator 3640 may implement a safety system simulator 3690 to simulate the use of one or more simulated on-board safety systems of autonomous vehicle 3630. Examples of safety systems include beam-formed directed acoustics and/or vehicle lighting to alert dynamic object 3682a, and external and/or internal safety systems may be implemented. Examples of safety systems that can be simulated by simulator 3640 are described in U.S. Patent Application No. 14/756,994 (Attorney Docket No. Z00-012), filed November 4, 2015, entitled "SYSTEM OF CONFIGURING ACTIVE LIGHTING TO INDICATE DIRECTIONALITY OF AN AUTONOMOUS VEHICLE"; U.S. Patent Application No. 14/756,993 (Attorney Docket No. Z00-013), filed November 4, 2015, entitled "METHOD FOR ROBOTIC VEHICLE COMMUNICATION WITH AN EXTERNAL ENVIRONMENT VIA ACOUSTIC BEAM FORMING"; U.S. Patent Application No. 14/932,948 (Attorney Docket No. Z00-017), filed November 4, 2015, entitled "ACTIVE LIGHTING CONTROL FOR COMMUNICATING A STATE OF AN AUTONOMOUS VEHICLE TO ENTITIES IN A SURROUNDING ENVIRONMENT"; U.S. Patent Application No. 14/932,952 (Attorney Docket No. Z00-018), filed November 4, 2015, entitled "RESILIENT SAFETY SYSTEM FOR A ROBOTIC VEHICLE"; U.S. Patent Application No. 14/932,954 (Attorney Docket No. Z00-019), filed November 4, 2015, entitled "INTERNAL SAFETY SYSTEMS FOR ROBOTIC VEHICLES"; and U.S. Patent Application No. 14/932,962 (Attorney Docket No. Z00-022), filed November 4, 2015; the entire contents of these U.S. patent applications are hereby incorporated by reference.
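The rate-of-change-of-distance check can be sketched as a simple closing-rate computation over successive simulation steps. The threshold value and function names below are invented for illustration; only the idea (trigger avoidance on the computed rate of change of the distance) comes from the text:

```python
# Hedged sketch: compute the rate of change of the distance between the
# simulated vehicle and a dynamic object's predicted motion range, and
# trigger avoidance when an (assumed) closing-rate threshold is exceeded.

def closing_rate(prev_distance_m: float, curr_distance_m: float, dt_s: float) -> float:
    """Rate of change of distance in m/s; negative means the gap is shrinking."""
    return (curr_distance_m - prev_distance_m) / dt_s

def should_avoid(prev_distance_m: float, curr_distance_m: float,
                 dt_s: float, closing_threshold_mps: float = -4.0) -> bool:
    """Trigger avoidance when the gap closes faster than the threshold."""
    return closing_rate(prev_distance_m, curr_distance_m, dt_s) < closing_threshold_mps

# Gap shrinks from 12 m to 10.5 m over 0.25 s -> closing at 6 m/s: avoid.
assert should_avoid(12.0, 10.5, 0.25)
# Gap shrinks by only 0.5 m over 0.25 s (2 m/s): no avoidance yet.
assert not should_avoid(12.0, 11.5, 0.25)
```

Avoidance here could mean ceasing to drive, replanning a path, or engaging the simulated safety systems (acoustic beams, lighting) discussed above.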
According to some examples, simulator 3640 can generate and use tagged (e.g., semantically tagged) "ground truth" data against which the above-described algorithms can be tested or validated (e.g., to validate software changes, changes to map data 3633, or any modeled data changes). Further, simulator 3640 can be used for classifier training, for example, based on computer vision classifiers and deep neural networks (e.g., implementing Bayesian or probabilistic inference algorithms, among other similar techniques), to identify dynamic objects or agents in simulated environment 3603, such as dynamic objects 3680 and 3682a. Note that while the techniques implementing data modeler 3620 may be used in the context of simulation, data modeler 3620 may also be implemented in any portion described herein or incorporated by reference. For example, a perception engine or system may include or implement one or more portions of data modeler 3620, as well as any other structures and/or functions described in the context of simulator 3640. Note that any portion shown in FIG. 36, described herein or incorporated by reference, may be implemented in hardware or software, or a combination thereof.
FIG. 37 depicts a vehicle modeler, according to some examples. Diagram 3700 depicts a portion 3700 of an autonomous vehicle, including a propulsion unit 3732 comprising a drive mechanism (e.g., an electric motor) 3733, an axle 3735, and a wheel 3737, as well as a braking unit 3720 and a steering unit 3734, some examples of which are described in U.S. Patent Application No. 14/932,958 (Attorney Docket No. Z00-021), filed November 4, 2015, entitled "QUADRANT CONFIGURATION OF ROBOTIC VEHICLES," which is hereby incorporated by reference. Vehicle modeler 3727 is configured to receive vehicle component data 3799, which includes data modeling the operability of the vehicle components in portion 3700, including sensor data (e.g., wheel angle 3711, angular velocity 3713 of a tire, etc.). Vehicle modeler 3727 is configured to receive data representing one or more components of an autonomous vehicle, and to identify data representing component characteristics (e.g., motor currents of motor 3733) associated with the one or more components of the autonomous vehicle. Vehicle modeler 3727 generates data models of the one or more components based on the component characteristics; with these data models, a simulator can be configured to simulate the operation of the one or more components (e.g., propulsion unit 3732, braking unit 3720, steering unit 3734, etc.) to predict the behavior of a simulated autonomous vehicle. In some cases, the simulator can be configured to access an event data model, including data representing event characteristics (e.g., an event model describing the characteristics of a portion of a roadway covered in ice). The simulator can then be configured to simulate the event (e.g., a patch of ice) in the simulated environment based on the event characteristic data (e.g., reduced friction).
FIG. 38 is a diagram depicting an example of a sensor modeler, according to some examples. Diagram 3800 includes a sensor modeler 3825, which, in turn, is shown to include a sensor type modeler 3803 and a sensor error modeler 3805 to generate simulated sensor type data 3806 based on sensor data 3801. Thus, sensor modeler 3825 is configured to receive different types and different amounts of sensor data 3835 to generate corresponding simulated sensor data 3837. According to some examples, sensor modeler 3825 can be configured to receive data derived from sensor data 3801 representing an environment in which one or more autonomous vehicles travel, and sensor modeler 3825 can be further configured to use sensor type modeler 3803 to model a subset of the sensor data to characterize a sensor device (e.g., a lidar sensor) to form characteristic sensor data. Sensor type modeler 3803 can then generate data 3806 representing a simulated sensor device based on the characteristic sensor data.
In some cases, sensor error modeler 3805 may be configured to model data representing a subset of measurement deviations (e.g., errors) associated with a sensor device. Sensor type modeler 3803 may be configured to adjust the data representing the simulated sensor device based on the subset of measurement deviations (e.g., as generated by sensor error modeler 3805). As an example, sensor type modeler 3803 may be configured to model a subset of lidar sensor data to characterize a lidar sensor, thereby forming characterized lidar sensor data, and to generate data representing a simulated lidar device. Further, a subset of lidar measurement deviations or errors may be modeled and used to adjust the data representing the simulated lidar device based on the subset of lidar measurement deviations.
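As a rough illustration of adjusting simulated sensor data by a modeled measurement deviation (the bias and noise magnitudes below are assumptions, not parameters from the patent), a simulated lidar return can be produced by perturbing ideal ranges with a per-device bias plus random noise:

```python
import random

def simulate_lidar_ranges(true_ranges, bias=0.03, sigma=0.02, seed=0):
    """Adjust ideal ranges (meters) with a modeled measurement deviation:
    a fixed per-device bias plus zero-mean Gaussian noise (assumed values)."""
    rng = random.Random(seed)  # seeded for reproducible simulation runs
    return [r + bias + rng.gauss(0.0, sigma) for r in true_ranges]

ideal = [5.0, 10.0, 20.0]
simulated = simulate_lidar_ranges(ideal)
assert len(simulated) == len(ideal)
assert all(abs(s - r) < 0.5 for s, r in zip(simulated, ideal))  # small deviations only
```

Fitting `bias` and `sigma` from logged physical-sensor data would correspond to modeling the "subset of measurement deviations" described above, and the adjusted output plays the role of the data representing the simulated sensor device.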
Figure 39 is a diagram depicting an example of a dynamic object data modeler, according to some examples. Diagram 3900 includes a dynamic object data modeler 3921 configured to receive object data 3931 for generating simulated object data 3941. In this example, dynamic object data modeler 3921 includes an object data classifier 3922 (e.g., a Bayesian classifier, etc.) configured to identify a classification of a dynamic object and to identify data representing a set of characteristics (or predicted behaviors) associated with the classification. A simulator may then be configured to use the set of characteristics to simulate a predicted range of motion of a simulated dynamic object in a simulated environment. As shown, object data classifier 3922 classifies dynamic objects 3932, 3933, 3934, and 3938 as a first animal dynamic object, a second animal dynamic object, a pedestrian dynamic object, and a skateboarder dynamic object, respectively. Based on the classifications, object data characterizers 3951, 3952, 3953, and 3959 are configured to provide data representing, for example, predicted ranges of motion based on the identified dynamic objects. In some cases, an object data characterizer may implement randomized data based on probabilities associated with the predicted ranges of motion. Based on the randomized data, the simulator is able to simulate possibly rare object behaviors, such as a dog randomly bolting into the street (e.g., chasing a ball).
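The classification-then-characterization step can be sketched as a lookup from object class to motion priors, with randomization to cover rare behaviors; all class names and numeric priors here are illustrative assumptions, not values from the patent:

```python
import random

# Assumed class -> (max speed m/s, probability of darting) priors; illustrative only.
MOTION_PRIORS = {
    "pedestrian":   (2.0, 0.05),
    "animal":       (6.0, 0.30),  # e.g., a dog may suddenly chase a ball
    "skateboarder": (8.0, 0.10),
}

def sample_predicted_motion(obj_class, horizon_s=2.0, seed=None):
    """Sample a randomized predicted range of motion (radius in meters) for a
    classified dynamic object; rare 'darting' behavior doubles the radius."""
    rng = random.Random(seed)
    max_speed, p_dart = MOTION_PRIORS[obj_class]
    radius = max_speed * horizon_s
    if rng.random() < p_dart:
        radius *= 2.0  # rare, abrupt behavior (e.g., bolting into the street)
    return radius

r = sample_predicted_motion("animal", seed=1)
assert r in (12.0, 24.0)  # nominal or doubled radius, depending on the sample
```

Sampling many such radii per object would let the simulator occasionally present the rare, aggressive motion that a purely deterministic prediction would never generate.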
Figure 40 is a flow chart illustrating an example of generating a simulated environment, according to some examples. Flow 4000 begins at 4002, where data representing characteristics of dynamic objects in an environment is received from one or more autonomous vehicles. At 4004, a classification of a dynamic object is identified. At 4006, data representing dynamics-related characteristics associated with the classified dynamic object is identified. At 4008, a data model of the classified dynamic object is formed based on the dynamics-related characteristics of the classified data object. At 4010, a predicted range of motion of the classified dynamic object is simulated in a simulated environment to form a simulated dynamic object. At 4012, a predicted response, which may be represented as data for one or more functions of a simulated autonomous vehicle, is simulated. Note that the order depicted in this and other flow charts herein is not intended to imply a requirement to perform the various functions linearly, as each portion of a flow chart may be performed serially or in parallel with any one or more other portions of the flow chart, and independently of or dependent upon other portions of the flow chart.
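Flow 4000 can be summarized as a small pipeline; the function and field names below are hypothetical stand-ins for the patent's modules, and the two-second motion horizon is an assumed parameter:

```python
def generate_simulated_environment(observations):
    """Minimal sketch of flow 4000: classify each observed dynamic object,
    build a data model from its dynamics-related characteristics, and emit
    a predicted range of motion (names and values are illustrative)."""
    simulated_objects = []
    for obs in observations:
        cls = obs["class"]     # 4004: identify the classification
        speed = obs["speed"]   # 4006: dynamics-related characteristic (m/s)
        model = {"class": cls, "max_speed": speed}     # 4008: form the data model
        model["predicted_range_m"] = speed * 2.0       # 4010: 2 s horizon (assumed)
        simulated_objects.append(model)
    return simulated_objects

objs = generate_simulated_environment([{"class": "pedestrian", "speed": 1.5}])
assert objs[0]["predicted_range_m"] == 3.0
```

The simulated objects produced this way are what step 4012 would run the simulated autonomous vehicle's functions against to obtain a predicted response.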
Figure 41 illustrates examples of various computing platforms configured to provide various simulator-related functionalities and/or structures to simulate an autonomous vehicle service, according to various embodiments. In some examples, computing platform 3300 may be used to implement computer programs, applications, methods, processes, algorithms, or other software to perform the above-described techniques. Note that the various structures and/or functionalities of Figure 33 are applicable to Figure 41, and, as such, some elements in those figures may be discussed in the context of Figure 33. Note also that elements depicted in diagram 4100 of Figure 41 may include structures and/or functions identical or similar to elements of the same names described in connection with one or more other figures herein.
Referring to the example shown in Figure 41, system memory 3306 includes an autonomous vehicle services platform module 4150 and/or its components (e.g., a data modeler module 4152, a simulation module 4154, etc.), any of which, or one or more portions thereof, can be configured to facilitate simulation of an autonomous vehicle service by implementing one or more functions described herein. In some cases, computing platform 3300 can be disposed in any device, such as in computing device 3390a, which may be disposed in an autonomous vehicle services platform, autonomous vehicle 3391, and/or mobile computing device 3390b.
Figures 33 to 35 illustrate examples of various computing platforms configured to provide various functionalities to components of an autonomous vehicle service, according to various embodiments. In some examples, computing platform 3300 may be used to implement computer programs, applications, methods, processes, algorithms, or other software to perform any of the techniques described herein.
Note that the various structures and/or functionalities of Figure 33 are applicable to Figures 34 and 35, and, as such, some elements in those figures may be discussed in the context of Figure 33.
In some cases, computing platform 3300 can be disposed in any device, such as computing device 3390a, which may be disposed in one or more computing devices of an autonomous vehicle services platform, in autonomous vehicle 3391, and/or in mobile computing device 3390b.
Computing platform 3300 includes a bus 3302 or other communication mechanism for communicating information, which interconnects subsystems and devices, such as processor 3304, system memory 3306 (e.g., RAM, etc.), storage device 3308 (e.g., ROM, etc.), an in-memory cache (which may be implemented in RAM 3306 or other portions of computing platform 3300), and a communication interface 3313 (e.g., an Ethernet or wireless controller, a Bluetooth controller, NFC logic, etc.) to facilitate communications via a port on communication link 3321 to communicate, for example, with a computing device, including mobile computing and/or communication devices with processors. Processor 3304 can be implemented with one or more graphics processing units ("GPUs"), with one or more central processing units ("CPUs") (e.g., those manufactured by Intel® Corporation), or with one or more virtual processors, as well as any combination of CPUs and virtual processors. Computing platform 3300 exchanges data representing inputs and outputs via input-and-output devices 3301, including, but not limited to, keyboards, mice, audio inputs (e.g., speech-to-text devices), user interfaces, displays, monitors, cursors, touch-sensitive displays, LCD or LED displays, and other I/O-related devices.
According to some examples, computing platform 3300 performs specific operations by processor 3304 executing one or more sequences of one or more instructions stored in system memory 3306, and computing platform 3300 can be implemented in a client-server arrangement, a peer-to-peer arrangement, or as any mobile computing device, including smartphones and the like. Such instructions or data may be read into system memory 3306 from another computer-readable medium, such as storage device 3308. In some examples, hard-wired circuitry may be used in place of or in combination with software instructions for implementation. Instructions may be embedded in software or firmware. The term "computer-readable medium" refers to any tangible medium that participates in providing instructions to processor 3304 for execution. Such a medium may take many forms, including but not limited to non-volatile media and volatile media. Non-volatile media includes, for example, optical or magnetic disks and the like. Volatile media includes dynamic memory, such as system memory 3306.
Common forms of computer-readable media include, for example, floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read. Instructions may further be transmitted or received using a transmission medium. The term "transmission medium" may include any tangible or intangible medium that is capable of storing, encoding, or carrying instructions for execution by a machine, and includes digital or analog communications signals or other intangible media to facilitate communication of such instructions. Transmission media includes coaxial cables, copper wire, and fiber optics, including wires that comprise bus 3302 for transmitting a computer data signal.
In some examples, execution of the sequences of instructions may be performed by computing platform 3300. According to some examples, computing platform 3300 can be coupled by communication link 3321 (e.g., a wired network, such as a LAN, PSTN, or any wireless network, including WiFi of various standards and protocols, Bluetooth®, NFC, Zig-Bee, etc.) to any other processor to perform the sequences of instructions in coordination with (or asynchronously to) one another. Computing platform 3300 may transmit and receive messages, data, and instructions, including program code (e.g., application code), through communication link 3321 and communication interface 3313. Received program code may be executed by processor 3304 as it is received, and/or stored in memory 3306 or other non-volatile storage for later execution.
In the example shown, system memory 3306 can include various modules that include executable instructions to implement functionalities described herein. System memory 3306 may include an operating system ("O/S") 3332, as well as an application 3336 and/or logic module(s) 3359. In the example shown in Figure 33, system memory 3306 includes an autonomous vehicle ("AV") controller module 3350 and/or its components (e.g., a perception engine module, a localization module, a planner module, and/or a motion controller module), any of which, or one or more portions thereof, can be configured to facilitate an autonomous vehicle service by implementing one or more functions described herein.
Referring to the example shown in Figure 34, system memory 3306 includes an autonomous vehicle services platform module 3450 and/or its components (e.g., a teleoperator manager, a simulator, etc.), any of which, or one or more portions thereof, can be configured to facilitate an autonomous vehicle service by implementing one or more functions described herein.
Referring to the example shown in Figure 35, system memory 3306 includes an autonomous vehicle ("AV") module and/or its components for use, for example, in a mobile computing device. One or more portions of module 3550 can be configured to facilitate delivery of an autonomous vehicle service by implementing one or more functions described herein.
Referring back to Figure 33, the structures and/or functions of any of the features described herein or incorporated by reference can be implemented in software, hardware, firmware, circuitry, or a combination thereof. Note that the structures and constituent elements above, as well as their functionality, may be aggregated with one or more other structures or elements. Alternatively, the elements and their functionality may be subdivided into constituent sub-elements, if any. As software, the techniques described herein or incorporated by reference may be implemented using various types of programming or formatting languages, frameworks, syntax, applications, protocols, objects, or techniques. As hardware and/or firmware, the techniques described herein or incorporated by reference may be implemented using various types of programming or integrated circuit design languages, including hardware description languages, such as any register transfer language ("RTL") configured to design field-programmable gate arrays ("FPGAs"), application-specific integrated circuits ("ASICs"), or any other type of integrated circuit. According to some embodiments, the term "module" can refer, for example, to an algorithm or a portion thereof, and/or to logic implemented in either hardware circuitry or software, or a combination thereof. These can be varied and are not limited to the examples or descriptions provided.
In some embodiments, module 3350 of Figure 33, module 3450 of Figure 34, and module 3550 of Figure 35, or one or more of their components, or any process or device described herein, can be in communication (e.g., wired or wirelessly) with a mobile device, such as a mobile phone or computing device, or can be disposed therein.
In some cases, a mobile device, or any networked computing device (not shown), in communication with one or more modules 3359 (module 3350 of Figure 33, module 3450 of Figure 34, and module 3550 of Figure 35) or one or more of their components (or any process or device described herein), can provide at least some of the structures and/or functions of any of the features described herein. As depicted herein or in the figures incorporated by reference, the structures and/or functions of any of the described features can be implemented in software, hardware, firmware, circuitry, or any combination thereof. Note that the structures and constituent elements above, as well as their functionality, may be aggregated or combined with one or more other structures or elements. Alternatively, the elements and their functionality may be subdivided into constituent sub-elements, if any. As software, at least some of the techniques described herein or incorporated by reference may be implemented using various types of programming or formatting languages, frameworks, syntax, applications, protocols, objects, or techniques. For example, at least one of the elements depicted in any of the figures can represent one or more algorithms. Alternatively, at least one of the elements can represent a portion of logic including a portion of hardware configured to provide constituent structures and/or functionalities.
For example, module 3350 of Figure 33, module 3450 of Figure 34, and module 3550 of Figure 35, or one or more of their components, or any process or device described herein, can be implemented in one or more computing devices (i.e., any mobile computing device, such as a wearable device, an audio device (such as headphones or a headset), or a mobile phone, whether worn or carried) that include one or more processors configured to execute one or more algorithms in memory. Thus, at least some of the elements described herein or in the figures incorporated by reference can represent one or more algorithms. Alternatively, at least one of the elements can represent a portion of logic including a portion of hardware configured to provide constituent structures and/or functionalities. These can be varied and are not limited to the examples or descriptions provided.
As hardware and/or firmware, the structures and/or techniques described herein or incorporated by reference can be implemented using various types of programming or integrated circuit design languages, including hardware description languages, such as any register transfer language ("RTL") configured to design field-programmable gate arrays ("FPGAs"), application-specific integrated circuits ("ASICs"), multi-chip modules, or any other type of integrated circuit.
For example, module 3350 of Figure 33, module 3450 of Figure 34, and module 3550 of Figure 35, or one or more of their components, or any process or device described herein, can be implemented in one or more computing devices that include one or more circuits. Thus, at least some of the elements described herein or in the figures incorporated by reference can represent one or more components of hardware. Alternatively, at least one of the elements can represent a portion of logic including a portion of a circuit configured to provide constituent structures and/or functionalities.
According to some embodiments, the term "circuit" can refer, for example, to any system including a number of components through which current flows to perform one or more functions, the components including discrete and complex components. Examples of discrete components include transistors, resistors, capacitors, inductors, diodes, and the like, and examples of complex components include memories, processors, analog circuits, digital circuits, and the like, including field-programmable gate arrays ("FPGAs") and application-specific integrated circuits ("ASICs"). Therefore, a circuit can include a system of electronic components and logic components (e.g., logic configured to execute instructions, such that a group of executable instructions of an algorithm, for example, is thus a component of a circuit). According to some embodiments, the term "module" can refer, for example, to an algorithm or a portion thereof, and/or to logic implemented in either hardware circuitry or software, or a combination thereof (i.e., a module can be implemented as a circuit). In some embodiments, algorithms and/or the memory in which the algorithms are stored are "components" of a circuit. Thus, the term "circuit" can also refer, for example, to a system of components, including algorithms. These can be varied and are not limited to the examples or descriptions provided. Accordingly, any of the above-described structures and/or functions can be implemented in hardware, including circuits.
Although the foregoing examples have been described in some detail for purposes of clarity of understanding, the above-described inventive techniques are not limited to the details provided. There are many alternative ways of implementing the above-described inventive techniques. The disclosed examples are illustrative and not restrictive.

Claims (23)

1. A method, comprising:
identifying first data representing characteristics of one or more dynamic objects in one or more of a simulated environment or a physical environment;
determining a classification of a dynamic object based at least in part on the first data;
identifying second data representing dynamics-related characteristics associated with the dynamic object;
generating a data model of the dynamic object based at least in part on one or more of the first data or the second data;
simulating a predicted range of motion of the dynamic object in a simulated environment; and
simulating, based at least in part on the predicted range of motion of the dynamic object, a predicted response of one or more functions of a simulated autonomous vehicle.
2. The method of claim 1, wherein simulating the predicted response comprises:
calculating a rate of change of a distance between the simulated autonomous vehicle and the predicted range of motion of the dynamic object; and
causing, based at least in part on the calculated rate of change of the distance, the simulated autonomous vehicle to avoid the dynamic object in the simulated environment.
3. The method of claim 1, further comprising: validating a change to the one or more functions associated with a physical autonomous vehicle, based at least on simulating the predicted response of the one or more functions of the simulated autonomous vehicle.
4. The method of claim 1, wherein determining the classification of the dynamic object comprises determining that the dynamic object is one of an animal dynamic object, a pedestrian dynamic object, or an electric vehicle dynamic object, and wherein the predicted range of motion is selected based at least in part on the classification of the dynamic object.
5. The method of claim 1, further comprising:
providing data associated with simulating the predicted response to a computing device associated with a teleoperator;
receiving teleoperator data from the computing device associated with the teleoperator; and
controlling one or more operations of a physical autonomous vehicle or the simulated autonomous vehicle based at least in part on the teleoperator data.
6. The method of claim 4, further comprising: randomizing the predicted range of motion based at least in part on the classification of the dynamic object.
7. The method of claim 1, further comprising:
modeling sensor data to characterize a sensor device to form characterized sensor data, wherein the sensor data is modeled using one or more of data obtained by physical sensors in one or more physical environments or simulated sensor data; and
generating data representing a simulated sensor device based at least in part on the characterized sensor data.
8. The method of claim 7, wherein modeling the sensor data comprises: adjusting the data representing the simulated sensor device based at least in part on a measurement deviation.
9. The method of claim 7, wherein modeling the sensor data comprises:
modeling lidar sensor data to characterize a lidar sensor to form characterized lidar data;
generating data representing a simulated lidar device based at least in part on the characterized lidar data;
modeling data representing a lidar measurement deviation associated with the lidar device; and
adjusting the data representing the simulated lidar device based at least in part on the lidar measurement deviation.
10. The method of claim 1, further comprising:
identifying component characteristics associated with one or more components of the autonomous vehicle;
generating one or more data models of the one or more components based at least in part on the component characteristics; and
simulating operation of the one or more components based at least in part on the one or more data models to predict behavior of the simulated autonomous vehicle.
11. The method of claim 10, further comprising:
accessing an event data model including data representing event characteristics associated with an event;
simulating the event in the simulated environment based at least in part on the event characteristics; and
simulating another predicted response of the one or more functions of the simulated autonomous vehicle.
12. A system, comprising:
one or more computing devices including one or more processors, wherein the one or more computing devices are configured to:
receive first data representing characteristics of a dynamic object;
determine a classification of the dynamic object based at least in part on the first data to identify a classified dynamic object;
identify second data representing dynamics-related characteristics associated with the classified dynamic object;
generate a data model of the classified dynamic object based at least in part on the second data representing the dynamics-related characteristics of the classified dynamic object;
simulate a predicted range of motion of the classified dynamic object in a simulated environment; and
simulate, based at least in part on the predicted range of motion of the classified dynamic object, a predicted response represented by data of one or more functions of a simulated autonomous vehicle.
13. The system of claim 12, wherein the one or more computing devices are further configured to:
execute simulation instructions to cause the simulated autonomous vehicle to perform a simulated maneuver based at least in part on the predicted range of motion of the simulated dynamic object;
generate data associated with the simulated autonomous vehicle performing the simulated maneuver; and
provide the data to one or more teleoperators.
14. The system of claim 12, wherein the one or more computing devices are further configured to:
model sensor data to characterize a sensor device to form characterized sensor data; and
generate third data representing a simulated sensor device based at least in part on the characterized sensor data.
15. The system of claim 12, wherein the one or more computing devices are further configured to:
identify component characteristics associated with components of an autonomous vehicle;
generate a data model of one or more of the components based at least in part on the component characteristics; and
simulate operation of the one or more components based at least in part on the data model to predict behavior of the simulated autonomous vehicle.
16. The system of claim 12, further comprising:
receiving, via a computing device associated with a teleoperator, data associated with controlling a simulated maneuver;
executing simulation instructions to cause the simulated autonomous vehicle to perform the simulated maneuver;
generating data associated with the simulated autonomous vehicle performing the simulated maneuver; and
analyzing the data to determine compliance with one or more policies.
17. The system of claim 12, further comprising providing the predicted response to a computing device associated with a physical autonomous vehicle.
18. A non-transitory computer-readable storage medium having computer-executable instructions stored thereupon which, when executed by a computer, cause the computer to perform actions comprising:
determining a classification of a dynamic object based at least in part on data representing characteristics of one or more dynamic objects in one or more environments;
identifying dynamics-related characteristics associated with the dynamic object;
generating a data model of the dynamic object based at least in part on the dynamics-related characteristics; and
simulating, based at least in part on a predicted range of motion of the dynamic object, one or more events associated with a simulated autonomous vehicle in a simulated environment.
19. The non-transitory computer-readable storage medium of claim 18, wherein simulating the one or more events comprises:
calculating a rate of change of a distance between the simulated autonomous vehicle and the predicted range of motion of the dynamic object; and
calculating, based at least in part on the calculated rate of change of the distance, one or more trajectories that cause the simulated autonomous vehicle to avoid the dynamic object in the simulated environment.
20. The non-transitory computer-readable storage medium of claim 19, wherein the actions further comprise:
providing the one or more trajectories to a computing device associated with a teleoperator;
receiving, from the computing device associated with the teleoperator, a selection of one of the one or more trajectories; and
performing one or more of the following operations: simulating movement of the simulated autonomous vehicle using the selection of the one or more trajectories; or maneuvering a physical autonomous vehicle using the selection of the one or more trajectories.
21. The non-transitory computer-readable storage medium of claim 18, wherein the actions further comprise:
receiving, via a computing device associated with a teleoperator, data associated with controlling a simulated maneuver of the simulated autonomous vehicle; and
executing simulation instructions to cause the simulated autonomous vehicle to perform the simulated maneuver.
22. The non-transitory computer-readable storage medium of claim 21, wherein the actions further comprise:
generating data associated with the simulated autonomous vehicle performing the simulated maneuver; and
analyzing the data to determine compliance with one or more policies.
23. The non-transitory computer-readable storage medium of claim 18, wherein the actions further comprise:
modeling sensor data to characterize a sensor device to form characterized sensor data; and
generating data representing a simulated sensor device based at least in part on the characterized sensor data.
CN201680064648.2A 2015-11-04 2016-11-02 Simulation system and method for autonomous vehicle Active CN108290579B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210276163.7A CN114643995A (en) 2015-11-04 2016-11-02 Simulation system and method for autonomous vehicle

Applications Claiming Priority (33)

Application Number Priority Date Filing Date Title
US14/932,963 US9612123B1 (en) 2015-11-04 2015-11-04 Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes
US14/932,940 2015-11-04
US14/756,996 US9916703B2 (en) 2015-11-04 2015-11-04 Calibration for autonomous vehicle operation
US14/932,962 US9630619B1 (en) 2015-11-04 2015-11-04 Robotic vehicle active safety systems and methods
US14/756,991 2015-11-04
US14/756,995 US9958864B2 (en) 2015-11-04 2015-11-04 Coordination of dispatching and maintaining fleet of autonomous vehicles
US14/932,954 US9517767B1 (en) 2015-11-04 2015-11-04 Internal safety systems for robotic vehicles
US14/756,994 2015-11-04
US14/932,948 US9804599B2 (en) 2015-11-04 2015-11-04 Active lighting control for communicating a state of an autonomous vehicle to entities in a surrounding environment
US14/756,996 2015-11-04
US14/756,993 US9878664B2 (en) 2015-11-04 2015-11-04 Method for robotic vehicle communication with an external environment via acoustic beam forming
US14/932,958 US9494940B1 (en) 2015-11-04 2015-11-04 Quadrant configuration of robotic vehicles
US14/932,966 US9507346B1 (en) 2015-11-04 2015-11-04 Teleoperation system and method for trajectory modification of autonomous vehicles
US14/756,995 2015-11-04
US14/756,992 US9910441B2 (en) 2015-11-04 2015-11-04 Adaptive autonomous vehicle planner logic
US14/932,962 2015-11-04
US14/756,993 2015-11-04
US14/932,954 2015-11-04
US14/932,952 2015-11-04
US14/932,940 US9734455B2 (en) 2015-11-04 2015-11-04 Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
US14/932,959 US9606539B1 (en) 2015-11-04 2015-11-04 Autonomous vehicle fleet service and system
US14/756,992 2015-11-04
US14/932,963 2015-11-04
US14/932,958 2015-11-04
US14/932,966 2015-11-04
US14/932,952 US10745003B2 (en) 2015-11-04 2015-11-04 Resilient safety system for a robotic vehicle
US14/756,991 US9720415B2 (en) 2015-11-04 2015-11-04 Sensor-based object-detection optimization for autonomous vehicles
US14/932,959 2015-11-04
US14/756,994 US9701239B2 (en) 2015-11-04 2015-11-04 System of configuring active lighting to indicate directionality of an autonomous vehicle
US14/932,948 2015-11-04
US14/757,016 US10496766B2 (en) 2015-11-05 2015-11-05 Simulation system and methods for autonomous vehicles
US14/757,016 2015-11-05
PCT/US2016/060030 WO2017079229A1 (en) 2015-11-04 2016-11-02 Simulation system and methods for autonomous vehicles

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210276163.7A Division CN114643995A (en) 2015-11-04 2016-11-02 Simulation system and method for autonomous vehicle

Publications (2)

Publication Number Publication Date
CN108290579A true CN108290579A (en) 2018-07-17
CN108290579B CN108290579B (en) 2022-04-12

Family

ID=62817166

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201680064648.2A Active CN108290579B (en) 2015-11-04 2016-11-02 Simulation system and method for autonomous vehicle
CN202210276163.7A Pending CN114643995A (en) 2015-11-04 2016-11-02 Simulation system and method for autonomous vehicle

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202210276163.7A Pending CN114643995A (en) 2015-11-04 2016-11-02 Simulation system and method for autonomous vehicle

Country Status (2)

Country Link
JP (2) JP7036732B2 (en)
CN (2) CN108290579B (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110673636A (en) * 2019-09-30 2020-01-10 上海商汤临港智能科技有限公司 Unmanned simulation test system and method, and storage medium
CN110766793A (en) * 2019-10-08 2020-02-07 北京地平线机器人技术研发有限公司 Map construction method and device based on semantic point cloud
CN111221334A (en) * 2020-01-17 2020-06-02 清华大学 Environmental sensor simulation method for simulating automatic driving automobile
CN111259545A (en) * 2020-01-15 2020-06-09 吉利汽车研究院(宁波)有限公司 Intelligent driving virtual simulation cloud platform
CN111291447A (en) * 2018-12-07 2020-06-16 沃尔沃汽车公司 Evaluating simulated vehicle functional features
CN111307166A (en) * 2018-12-11 2020-06-19 北京图森智途科技有限公司 Method, device and processing equipment for constructing occupied grid map
CN111324945A (en) * 2020-01-20 2020-06-23 北京百度网讯科技有限公司 Sensor scheme determination method, device, equipment and storage medium
TWI703065B (en) * 2018-12-18 2020-09-01 大陸商北京航跡科技有限公司 Systems and methods for determining driving action in autonomous driving
CN112590871A (en) * 2020-12-23 2021-04-02 交控科技股份有限公司 Train safety protection method, device and system
CN112612269A (en) * 2020-12-14 2021-04-06 北京理工大学 Hidden attack strategy acquisition method for Mecanum wheel trolley
CN112639888A (en) * 2018-08-09 2021-04-09 祖克斯有限公司 Programmed world generation
CN113048995A (en) * 2019-12-27 2021-06-29 动态Ad有限责任公司 Long term object tracking to support autonomous vehicle navigation
CN113119749A (en) * 2018-12-07 2021-07-16 纳恩博(北京)科技有限公司 Driving method of electric scooter and electric scooter
CN113257073A (en) * 2021-06-24 2021-08-13 成都运达科技股份有限公司 Train driving simulation stability analysis method, system, terminal and medium
US20210397179A1 (en) * 2020-06-22 2021-12-23 The Boeing Company Method and system for vehicle engagement control
CN114120252A (en) * 2021-10-21 2022-03-01 阿波罗智能技术(北京)有限公司 Method and device for identifying state of automatic driving vehicle, electronic equipment and vehicle
CN114390990A (en) * 2019-06-28 2022-04-22 祖克斯有限公司 System and method for determining target vehicle speed
CN114475653A (en) * 2021-12-28 2022-05-13 广州文远知行科技有限公司 Vehicle emergency steering simulation scene configuration method and device
CN114743010A (en) * 2022-06-13 2022-07-12 山东科技大学 Ultrahigh voltage power transmission line point cloud data semantic segmentation method based on deep learning
CN115179978A (en) * 2022-07-18 2022-10-14 内蒙古工业大学 Obstacle-avoiding early warning system for shuttle vehicle based on stereo earphones
CN116324662A (en) * 2020-09-24 2023-06-23 埃尔构人工智能有限责任公司 System for performing structured testing across an autonomous fleet of vehicles
US20230241995A1 (en) * 2019-10-09 2023-08-03 Carnegie Mellon University Method for managing electric vehicles
CN116324662B (en) * 2020-09-24 2024-04-19 埃尔构人工智能有限责任公司 System for performing structured testing across an autonomous fleet of vehicles

Families Citing this family (13)

Publication number Priority date Publication date Assignee Title
US20220161853A1 (en) * 2019-04-12 2022-05-26 Continental Automotive Systems, Inc. Autonomous Truck-Trailer Maneuvering and Parking
CN110239518B (en) * 2019-05-20 2023-09-01 福瑞泰克智能系统有限公司 Vehicle transverse position control method and device
KR102228516B1 (en) * 2019-08-05 2021-03-16 엘지전자 주식회사 Autonomous vehicle for carrying user group with multiple users, method and control server for controlling the autonomous driving vehicle
KR102255595B1 (en) * 2019-10-23 2021-05-26 국민대학교산학협력단 Device and method for providing automated driving information from the user perspective
CN111026873B (en) * 2019-10-24 2023-06-20 中国人民解放军军事科学院国防科技创新研究院 Unmanned vehicle and navigation method and device thereof
KR102139513B1 (en) * 2019-11-28 2020-08-12 국민대학교산학협력단 Autonomous driving control apparatus and method based on ai vehicle in the loop simulation
EP3869843A1 (en) * 2020-02-19 2021-08-25 Volkswagen Ag Method for invoking a teleoperated driving session, apparatus for performing the steps of the method, vehicle and computer program
US20210302981A1 (en) * 2020-03-31 2021-09-30 Gm Cruise Holdings Llc Proactive waypoints for accelerating autonomous vehicle testing
CN111553844B (en) * 2020-04-29 2023-08-29 阿波罗智能技术(北京)有限公司 Method and device for updating point cloud
JP7303153B2 (en) * 2020-05-18 2023-07-04 トヨタ自動車株式会社 Vehicle driving support device
CN112764984B (en) * 2020-12-25 2023-06-02 际络科技(上海)有限公司 Automatic driving test system and method, electronic equipment and storage medium
CN114834475A (en) * 2021-01-15 2022-08-02 郑州宇通客车股份有限公司 Vehicle output torque control method and device
CN113222335B (en) * 2021-04-06 2022-10-14 同济大学 Risk assessment utility-based security unmanned vehicle group construction method

Citations (5)

Publication number Priority date Publication date Assignee Title
CN103370249A (en) * 2010-10-05 2013-10-23 谷歌公司 System and method for predicting behaviors of detected objects
CN104094177A (en) * 2012-01-30 2014-10-08 谷歌公司 Vehicle control based on perception uncertainty
US8996224B1 (en) * 2013-03-15 2015-03-31 Google Inc. Detecting that an autonomous vehicle is in a stuck condition
US20150185034A1 (en) * 2007-01-12 2015-07-02 Raj V. Abhyanker Driverless vehicle commerce network and community
WO2015134152A1 (en) * 2014-03-03 2015-09-11 Google Inc. Remote assistance for autonomous vehicles in predetermined situations

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
JP3281188B2 (en) * 1994-08-09 2002-05-13 ヤマハ発動機株式会社 Unmanned car
FR2889882B1 (en) 2005-08-19 2009-09-25 Renault Sas METHOD AND SYSTEM FOR PREDICTING IMPACT BETWEEN A VEHICLE AND A PIETON
JP2011248855A (en) * 2010-04-30 2011-12-08 Denso Corp Vehicle collision warning apparatus
US9656667B2 (en) * 2014-01-29 2017-05-23 Continental Automotive Systems, Inc. Method for minimizing automatic braking intrusion based on collision confidence

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
US20150185034A1 (en) * 2007-01-12 2015-07-02 Raj V. Abhyanker Driverless vehicle commerce network and community
CN103370249A (en) * 2010-10-05 2013-10-23 谷歌公司 System and method for predicting behaviors of detected objects
CN104094177A (en) * 2012-01-30 2014-10-08 谷歌公司 Vehicle control based on perception uncertainty
US8996224B1 (en) * 2013-03-15 2015-03-31 Google Inc. Detecting that an autonomous vehicle is in a stuck condition
WO2015134152A1 (en) * 2014-03-03 2015-09-11 Google Inc. Remote assistance for autonomous vehicles in predetermined situations

Cited By (36)

Publication number Priority date Publication date Assignee Title
CN112639888A (en) * 2018-08-09 2021-04-09 祖克斯有限公司 Programmed world generation
US11861790B2 (en) 2018-08-09 2024-01-02 Zoox, Inc. Procedural world generation using tertiary data
CN113119749A (en) * 2018-12-07 2021-07-16 纳恩博(北京)科技有限公司 Driving method of electric scooter and electric scooter
CN111291447B (en) * 2018-12-07 2023-11-21 沃尔沃汽车公司 Evaluating simulated vehicle functional characteristics
CN111291447A (en) * 2018-12-07 2020-06-16 沃尔沃汽车公司 Evaluating simulated vehicle functional features
CN111307166B (en) * 2018-12-11 2023-10-03 北京图森智途科技有限公司 Method and device for constructing occupied grid map and processing equipment
CN111307166A (en) * 2018-12-11 2020-06-19 北京图森智途科技有限公司 Method, device and processing equipment for constructing occupied grid map
US11155264B2 (en) 2018-12-18 2021-10-26 Beijing Voyager Technology Co., Ltd. Systems and methods for determining driving action in autonomous driving
TWI703065B (en) * 2018-12-18 2020-09-01 大陸商北京航跡科技有限公司 Systems and methods for determining driving action in autonomous driving
CN114390990A (en) * 2019-06-28 2022-04-22 祖克斯有限公司 System and method for determining target vehicle speed
CN110673636A (en) * 2019-09-30 2020-01-10 上海商汤临港智能科技有限公司 Unmanned simulation test system and method, and storage medium
CN110766793A (en) * 2019-10-08 2020-02-07 北京地平线机器人技术研发有限公司 Map construction method and device based on semantic point cloud
US20230241995A1 (en) * 2019-10-09 2023-08-03 Carnegie Mellon University Method for managing electric vehicles
CN113048995A (en) * 2019-12-27 2021-06-29 动态Ad有限责任公司 Long term object tracking to support autonomous vehicle navigation
CN111259545B (en) * 2020-01-15 2023-08-08 吉利汽车研究院(宁波)有限公司 Intelligent driving virtual simulation cloud platform
CN111259545A (en) * 2020-01-15 2020-06-09 吉利汽车研究院(宁波)有限公司 Intelligent driving virtual simulation cloud platform
CN111221334A (en) * 2020-01-17 2020-06-02 清华大学 Environmental sensor simulation method for simulating automatic driving automobile
US11953605B2 (en) 2020-01-20 2024-04-09 Beijing Baidu Netcom Science Technology Co., Ltd. Method, device, equipment, and storage medium for determining sensor solution
CN111324945A (en) * 2020-01-20 2020-06-23 北京百度网讯科技有限公司 Sensor scheme determination method, device, equipment and storage medium
CN111324945B (en) * 2020-01-20 2023-09-26 阿波罗智能技术(北京)有限公司 Sensor scheme determining method, device, equipment and storage medium
US11586200B2 (en) * 2020-06-22 2023-02-21 The Boeing Company Method and system for vehicle engagement control
US20210397179A1 (en) * 2020-06-22 2021-12-23 The Boeing Company Method and system for vehicle engagement control
CN116324662A (en) * 2020-09-24 2023-06-23 埃尔构人工智能有限责任公司 System for performing structured testing across an autonomous fleet of vehicles
CN116324662B (en) * 2020-09-24 2024-04-19 埃尔构人工智能有限责任公司 System for performing structured testing across an autonomous fleet of vehicles
CN112612269B (en) * 2020-12-14 2021-11-12 北京理工大学 Hidden attack strategy acquisition method for Mecanum wheel trolley
CN112612269A (en) * 2020-12-14 2021-04-06 北京理工大学 Hidden attack strategy acquisition method for Mecanum wheel trolley
CN112590871A (en) * 2020-12-23 2021-04-02 交控科技股份有限公司 Train safety protection method, device and system
CN113257073A (en) * 2021-06-24 2021-08-13 成都运达科技股份有限公司 Train driving simulation stability analysis method, system, terminal and medium
CN114120252B (en) * 2021-10-21 2023-09-01 阿波罗智能技术(北京)有限公司 Automatic driving vehicle state identification method and device, electronic equipment and vehicle
CN114120252A (en) * 2021-10-21 2022-03-01 阿波罗智能技术(北京)有限公司 Method and device for identifying state of automatic driving vehicle, electronic equipment and vehicle
CN114475653A (en) * 2021-12-28 2022-05-13 广州文远知行科技有限公司 Vehicle emergency steering simulation scene configuration method and device
CN114475653B (en) * 2021-12-28 2024-03-15 广州文远知行科技有限公司 Vehicle emergency steering simulation scene configuration method and device
CN114743010B (en) * 2022-06-13 2022-08-26 山东科技大学 Ultrahigh voltage power transmission line point cloud data semantic segmentation method based on deep learning
CN114743010A (en) * 2022-06-13 2022-07-12 山东科技大学 Ultrahigh voltage power transmission line point cloud data semantic segmentation method based on deep learning
CN115179978A (en) * 2022-07-18 2022-10-14 内蒙古工业大学 Obstacle-avoiding early warning system for shuttle vehicle based on stereo earphones
CN115179978B (en) * 2022-07-18 2023-05-16 内蒙古工业大学 Shuttle car obstacle avoidance early warning system based on stereo earphone

Also Published As

Publication number Publication date
JP2022046646A (en) 2022-03-23
CN114643995A (en) 2022-06-21
JP2019504800A (en) 2019-02-21
JP7036732B2 (en) 2022-03-15
JP7330259B2 (en) 2023-08-21
CN108290579B (en) 2022-04-12

Similar Documents

Publication Publication Date Title
CN108290579A (en) Simulation system and method for autonomous vehicle
US11796998B2 (en) Autonomous vehicle fleet service and system
US11061398B2 (en) Machine-learning systems and techniques to optimize teleoperation and/or planner decisions
US11067983B2 (en) Coordination of dispatching and maintaining fleet of autonomous vehicles
US11106218B2 (en) Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes
JP7316789B2 (en) Adaptive mapping for navigating autonomous vehicles in response to changes in the physical environment
US11301767B2 (en) Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
JP7195143B2 (en) Adaptive Autonomous Vehicle Planner Logic
US10496766B2 (en) Simulation system and methods for autonomous vehicles
US9734455B2 (en) Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
CN108369775A (en) Change in response to physical environment is adaptively charted to navigate to autonomous vehicle
CN108475406A (en) Software application for asking and controlling autonomous vehicle service
US20240028031A1 (en) Autonomous vehicle fleet service and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant