CN114643995A - Simulation system and method for autonomous vehicle - Google Patents
Simulation system and method for autonomous vehicle
- Publication number
- CN114643995A (application number CN202210276163.7A)
- Authority
- CN
- China
- Prior art keywords
- data
- autonomous vehicle
- vehicle
- simulated
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
- B60R21/01512—Passenger detection systems
- B60R21/01546—Passenger detection systems detecting seat belt parameters using belt buckle sensors
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/09—Taking automatic action to avoid collision, e.g. braking and steering
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0953—Prediction responsive to vehicle dynamic parameters
- B60W30/0956—Prediction responsive to traffic or environmental parameters
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters, e.g. by using mathematical models
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
- B60W50/0097—Predicting future conditions
- B60W50/0098—Details of control systems ensuring comfort, safety or stability not otherwise provided for
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0015—Planning or execution of driving tasks specially adapted for safety
- B60W60/0016—Safety of the vehicle or its occupants
- B60W60/0017—Safety of other traffic participants
- B60W60/0018—Safety by employing degraded modes, e.g. reducing speed, in response to suboptimal conditions
- B60W60/0027—Planning or execution of driving tasks using trajectory prediction for other traffic participants
- B60W60/00272—Trajectory prediction relying on extrapolation of current movement
- B60W60/00274—Trajectory prediction considering possible movement changes
- B60W60/007—Emergency override
- G05D1/0027—Control of position or course of vehicles associated with a remote control arrangement involving a plurality of vehicles, e.g. fleet or convoy travelling
- G06F18/24—Classification techniques
- G09B9/00—Simulators for teaching or training purposes
- B60W2050/0018—Method for the design of a control system
- B60W2050/0062—Adapting control system settings
- B60W2400/00—Indexing codes relating to detected, measured or calculated conditions or factors
- B60W2554/20—Static objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/404—Characteristics
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2720/10—Longitudinal speed
Abstract
Various embodiments relate generally to autonomous vehicles and associated mechanical, electrical, and electronic hardware, computer software and systems, and wired and wireless network communications to provide a fleet of autonomous vehicles as a service. More particularly, systems, devices, and methods are configured to simulate navigation of an autonomous vehicle in various simulated environments. In particular, the method may include receiving data representative of a characteristic of a dynamic object, calculating a classification of the dynamic object to identify the classified dynamic object, identifying data representative of a dynamics-related characteristic associated with the classified dynamic object, forming a data model of the classified dynamic object, simulating a predicted range of motion of the classified dynamic object in a simulated environment to form a simulated dynamic object, and simulating a predicted response of a data representation of a simulated autonomous vehicle based at least in part on the predicted range of motion.
Description
This application is a divisional application of Chinese patent application No. CN201680064648.2, filed on November 2, 2016, and entitled "Simulation system and method for autonomous vehicle".
Cross Reference to Related Applications
This PCT international application is related to U.S. patent application No. 14/757,016, filed on November 5, 2015; U.S. patent application No. 14/932,959, entitled "AUTONOMOUS VEHICLE FLEET SERVICE AND SYSTEM," filed on November 4, 2015; U.S. patent application No. 14/932,963, entitled "ADAPTIVE MAPPING TO NAVIGATE AUTONOMOUS VEHICLES RESPONSIVE TO PHYSICAL ENVIRONMENT CHANGES," filed on November 4, 2015; U.S. patent application No. 14/932,966, entitled "TELEOPERATION SYSTEM AND METHOD FOR TRAJECTORY MODIFICATION OF AUTONOMOUS VEHICLES," filed on November 4, 2015; U.S. patent application No. 14/756,992, entitled "ADAPTIVE AUTONOMOUS VEHICLE PLANNER LOGIC," filed on November 4, 2015; U.S. patent application No. 14/756,994, entitled "SYSTEM OF CONFIGURING ACTIVE LIGHTING TO INDICATE DIRECTIONALITY OF AN AUTONOMOUS VEHICLE," filed on November 4, 2015; U.S. patent application No. 14/756,993, entitled "METHOD FOR ROBOTIC VEHICLE COMMUNICATION WITH EXTERNAL ENVIRONMENTS VIA ACOUSTIC BEAM FORMING," filed on November 4, 2015; U.S. patent application No. 14/756,996, entitled "SENSOR-BASED OBJECT-DETECTION OPTIMIZATION FOR AUTONOMOUS VEHICLES," filed on November 4, 2015; U.S. patent application No. 14/932,952, entitled "RESILIENT SAFETY SYSTEM FOR A ROBOTIC VEHICLE," filed on November 4, 2015; U.S. patent application No. 14/932,954, entitled "INTERNAL SAFETY SYSTEMS FOR ROBOTIC VEHICLES," filed on November 4, 2015; U.S. patent application No. 14/932,958, entitled "QUADRANT CONFIGURATION OF ROBOTIC VEHICLES," filed on November 4, 2015; U.S. patent application No. 14/932,962, entitled "ROBOTIC VEHICLE ACTIVE SAFETY SYSTEMS AND METHODS," filed on November 4, 2015; and U.S. patent application No. 14/757,015, entitled "INDEPENDENT STEERING, POWER, TORQUE CONTROL AND TRANSFER IN AUTONOMOUS VEHICLES," filed on November 5, 2015; all of which are hereby incorporated by reference in their entirety for all purposes.
Technical Field
Various embodiments relate generally to autonomous vehicles and associated mechanical, electrical, and electronic hardware, computer software and systems, and wired and wireless network communications to provide a fleet of autonomous vehicles as a service. More specifically, systems, devices, and methods are configured to simulate navigation of an autonomous vehicle in various simulation environments.
Background
Various solutions for developing unmanned vehicles have focused primarily on automating conventional vehicles (e.g., manually driven automotive vehicles) with the intent of producing unmanned vehicles for consumer purchase. For example, many automotive companies and their affiliates are modifying conventional automobiles and control mechanisms, such as steering, to provide consumers with the ability to own a vehicle that can be operated without a driver. In some scenarios, conventional unmanned vehicles perform safety-critical driving functions under some conditions, but require the driver to take control (e.g., steering, etc.) if the vehicle controller is unable to resolve certain issues that might endanger the safety of the occupants.
Although functional, conventional unmanned vehicles generally have a number of disadvantages. For example, a large number of the unmanned cars under development have evolved from vehicles that require manual (i.e., human-controlled) steering and other similar automotive functions. Therefore, most unmanned cars are based on a paradigm in which the vehicle is designed to accommodate a licensed driver, with a dedicated seat or location reserved for the driver in the vehicle. As a result, such unmanned vehicles are designed sub-optimally and often forego opportunities to simplify vehicle design and conserve resources (e.g., to reduce the cost of producing unmanned vehicles). Other disadvantages are also present in conventional unmanned vehicles.
Other drawbacks are also present in conventional transportation services, which, because of the usual approaches to providing conventional transportation and ride-sharing services, are not well suited for actively managing an inventory of vehicles. In one conventional approach, passengers are required to access a mobile application to request transportation services via a centralized service that dispatches a human driver and a vehicle (e.g., under private ownership) to the passenger. With vehicles owned by different people, maintenance of private vehicles and their safety systems often goes unchecked. In another conventional approach, some entities enable ride sharing for a group of vehicles by allowing drivers, enrolled as members, to access vehicles that are shared among the members. This approach is less suitable for providing regular transportation services because drivers need to pick up and drop off the shared vehicles at specific locations, which are rare in urban environments, and it requires access to rather expensive real estate (i.e., parking lots) at which to park and retrieve the shared vehicles. In the conventional approaches described above, the conventional vehicles used to provide transportation services are generally under-utilized from an inventory perspective, since the vehicles are immobilized once the driver departs. Additionally, ride-sharing approaches (and individually owned vehicle transportation services) are generally not well suited to rebalancing inventory to match demand for the transportation service and to accommodate usage and typical travel patterns. It should also be noted that some conventionally described vehicles with limited self-driving automation capabilities are also not well suited to rebalancing inventory, since a human driver may typically be required. An example of a vehicle with limited self-driving automation capabilities is a vehicle designated as a Level 3 ("L3") vehicle according to the United States Department of Transportation National Highway Traffic Safety Administration ("NHTSA").
Another disadvantage is that typical unmanned vehicle solutions are generally less suitable for detecting and navigating the vehicle with respect to interactions (e.g., social interactions) between the vehicle in motion and drivers of other vehicles or other individuals. For example, some conventional approaches cannot adequately identify pedestrians, cyclists, etc., and the associated interactions, such as eye contact and gesturing, for purposes of addressing safety risks to occupants of the unmanned vehicle, as well as to drivers of other vehicles, pedestrians, etc.
What is needed, therefore, is a solution for implementing an autonomous vehicle that does not have the limitations of conventional techniques.
Disclosure of Invention
One aspect of the invention relates to a method for simulating navigation of an autonomous vehicle, comprising: identifying first data representative of characteristics of one or more dynamic objects in one or more of the simulated environment or the physical environment; determining a classification of a dynamic object based at least in part on the first data; identifying second data representative of a dynamically related characteristic associated with the dynamic object; generating a data model of the dynamic object based at least in part on one or more of the first data or the second data; simulating a predicted range of motion of the dynamic object in a simulation environment; and simulating a predicted response that simulates one or more functions of an autonomous vehicle based at least in part on the predicted range of motion of the dynamic object.
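The claimed flow (classify a dynamic object, attach dynamics-related characteristics, build a data model, then simulate both the object's motion envelope and the vehicle's response) can be pictured with a short sketch. The Python below is only an illustration of that sequence; the class and function names (DynamicObjectModel, classifier, dynamics_db, sim_env, sim_av) are hypothetical assumptions and do not come from the patent.

```python
from dataclasses import dataclass

@dataclass
class DynamicObjectModel:
    classification: str   # e.g., "pedestrian", "cyclist", "dog"
    max_speed: float      # dynamics-related characteristic (m/s)
    max_accel: float      # dynamics-related characteristic (m/s^2)

def simulate_navigation(sensor_frames, classifier, dynamics_db, sim_env, sim_av):
    """Hypothetical end-to-end sketch of the claimed simulation flow."""
    for frame in sensor_frames:                               # first data: object characteristics
        label = classifier.classify(frame)                    # determine a classification
        max_speed, max_accel = dynamics_db.lookup(label)      # second data: dynamics-related characteristics
        model = DynamicObjectModel(label, max_speed, max_accel)  # generate a data model
        motion_range = sim_env.predict_motion_range(model)    # predicted range of motion
        sim_av.respond_to(motion_range)                       # simulate the AV's predicted response
```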
Another aspect of the invention relates to a system for simulating navigation of an autonomous vehicle, comprising: one or more computing devices comprising one or more processors, wherein the one or more computing devices are configured to: receiving first data representing a property of a dynamic object; determining a classification of a dynamic object based at least in part on the first data to identify a classified dynamic object; identifying second data representative of dynamic correlation properties associated with the classified dynamic object; generating a data model of the classified dynamic object based at least in part on the second data representative of the dynamically-relevant characteristics of the classified dynamic object; simulating a predicted range of motion of the classified dynamic object in a simulation environment; and simulating a predicted response of a data representation simulating one or more functions of an autonomous vehicle based at least in part on the predicted range of motion of the classified dynamic object.
Yet another aspect of the invention relates to a non-transitory computer-readable storage medium having computer-executable instructions stored thereon that, when executed by a computer, cause the computer to perform acts comprising: determining a classification of the dynamic object based at least in part on data representative of characteristics of one or more dynamic objects in the one or more environments; identifying a dynamic correlation property associated with the dynamic object; generating a data model of the dynamic object based at least in part on the dynamic correlation characteristic; and simulating one or more events associated with a simulated autonomous vehicle in a simulated environment based at least in part on the predicted range of motion of the dynamic object.
Yet another aspect of the invention relates to a method for controlling an autonomous vehicle, comprising: receiving first data representing a dynamic object in a physical environment; determining, by a vehicle controller, a classification of the dynamic object based at least in part on the first data; identifying second data representative of a dynamic correlation property associated with the dynamic object; generating a data model of the dynamic object based at least in part on at least one of the first data, the second data, or the classification; simulating, by the vehicle controller and based at least in part on the data model of the dynamic object, a predicted range of motion of the dynamic object in a simulated environment; generating, by the vehicle controller and based at least in part on the predicted range of motion of the dynamic object, a command to control the autonomous vehicle; and controlling the autonomous vehicle based at least in part on the command.
Yet another aspect of the invention relates to a system for controlling an autonomous vehicle, comprising: one or more processors; and one or more computer-readable media storing instructions executable by the one or more processors, wherein the instructions, when executed, cause the system to: receiving first data representing a dynamic object in a physical environment; determining a classification of the dynamic object based at least in part on the first data; identifying second data representative of dynamic correlation properties associated with the classified dynamic object; generating a data model of the dynamic object based at least in part on at least one of the first data or the second data; simulating a predicted range of motion of the classified dynamic object in a simulation environment; generating a command for controlling the autonomous vehicle in the physical environment based at least in part on the predicted range of motion; and controlling the autonomous vehicle based at least in part on the command.
Yet another aspect of the invention relates to a non-transitory computer-readable storage medium having computer-executable instructions stored thereon that, when executed by a computer, cause the computer to perform acts comprising: receiving sensor data associated with a dynamic object in a physical environment; determining a classification of the dynamic object based at least in part on the sensor data; identifying, based at least in part on the sensor data, a dynamics-related characteristic associated with the dynamic object; generating a data model of the dynamic object based at least in part on at least one of the dynamics-related characteristic or the classification; simulating an event associated with an autonomous vehicle in the physical environment based at least in part on a predicted range of motion of the dynamic object; and generating a command for controlling the autonomous vehicle based at least in part on at least one of the event or the data model.
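For the on-vehicle control aspects above, the same modeling and motion-range prediction feed a command generator rather than a pure simulation. The following Python sketch is a hypothetical single control cycle under that reading; none of the names are taken from the patent.

```python
def control_cycle(sensor_data, classifier, dynamics_db, simulator, planner, vehicle):
    """Hypothetical on-vehicle cycle: model a dynamic object, predict its motion,
    and derive a command for the physical autonomous vehicle."""
    label = classifier.classify(sensor_data)               # classification from sensor data
    model = dynamics_db.build_model(label, sensor_data)    # data model of the dynamic object
    motion_range = simulator.predict_motion_range(model)   # simulated predicted range of motion
    if planner.path_intersects(motion_range):              # predicted overlap with the planned path
        command = planner.slow_or_stop()                   # e.g., reduce speed or execute a safe stop
    else:
        command = planner.keep_trajectory()
    vehicle.execute(command)                               # control the AV based on the command
    return command
```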
Drawings
Various embodiments or examples ("examples") of the invention are disclosed in the following detailed description and accompanying drawings:
FIG. 1 is a diagram depicting an implementation of a fleet of autonomous vehicles communicatively networked with an autonomous vehicle service platform, in accordance with some embodiments;
FIG. 2 is an example of a flow chart for monitoring a fleet of autonomous vehicles, according to some embodiments;
FIG. 3A is a diagram depicting examples of sensors and other autonomous vehicle components, according to some examples;
FIGS. 3B-3E are diagrams depicting examples of sensor field redundancy and adaptation of autonomous vehicles to a loss of sensor field, according to some examples;
FIG. 4 is a functional block diagram depicting a system including an autonomous vehicle service platform communicatively coupled to an autonomous vehicle controller via a communication layer, according to some examples;
FIG. 5 is one example of a flow chart for controlling an autonomous vehicle according to some embodiments;
FIG. 6 is a diagram depicting an example of an architecture of an autonomous vehicle controller, according to some embodiments;
FIG. 7 is a diagram depicting an example of an autonomous vehicle service platform implementing redundant communication channels for maintaining reliable communication with a fleet of autonomous vehicles, in accordance with some embodiments;
FIG. 8 is a diagram depicting an example of a message processing application configured to exchange data between various applications, in accordance with some embodiments;
FIG. 9 is a diagram depicting types of data that facilitate remote control operations using the communication protocol depicted in FIG. 8, according to some examples;
FIG. 10 is a diagram illustrating an example of a teleoperator interface that a teleoperator may use to affect path planning, according to some embodiments;
FIG. 11 is a diagram depicting an example of a planner configured to invoke a teleoperation, according to some examples;
FIG. 12 is an example of a flow chart configured to control an autonomous vehicle, according to some embodiments;
FIG. 13 depicts an example in which a planner may generate a trajectory, according to some examples;
FIG. 14 is a diagram depicting another example of an autonomous vehicle service platform, in accordance with some embodiments;
FIG. 15 is an example of a flow chart for controlling an autonomous vehicle according to some embodiments;
FIG. 16 is a diagram of an example of an autonomous vehicle fleet manager implementing a fleet optimization manager, according to some examples;
FIG. 17 is an example of a flow chart for managing a fleet of autonomous vehicles, according to some embodiments;
FIG. 18 is a diagram illustrating an autonomous vehicle fleet manager implementing an autonomous vehicle communication link manager, according to some embodiments;
FIG. 19 is an example of a flow chart for determining an action of an autonomous vehicle during an event, according to some embodiments;
FIG. 20 is a diagram depicting an example of a locator according to some embodiments;
FIG. 21 is an example of a flow chart for generating local pose data based on integrated sensor data, according to some embodiments;
FIG. 22 is a diagram depicting another example of a locator according to some embodiments;
FIG. 23 is a diagram depicting an example of a perception engine, in accordance with some embodiments;
FIG. 24 is an example of a flow diagram for generating perception engine data according to some embodiments;
FIG. 25 is an example of a segmentation processor according to some embodiments;
FIG. 26A is a diagram depicting an example of an object tracker and classifier in accordance with various embodiments;
FIG. 26B is a diagram depicting another example of an object tracker in accordance with at least some examples;
FIG. 27 is an example of a front-end processor for a perception engine, according to some examples;
FIG. 28 is a diagram depicting a simulator configured to simulate an autonomous vehicle in a synthetic environment, in accordance with various embodiments;
FIG. 29 is an example of a flow chart for simulating various aspects of an autonomous vehicle, according to some embodiments;
FIG. 30 is an example of a flow chart for generating map data according to some embodiments;
FIG. 31 is a diagram depicting the architecture of a mapping engine in accordance with some embodiments;
FIG. 32 is a diagram depicting an autonomous vehicle application, according to some examples; and
FIGS. 33-35 illustrate examples of various computing platforms configured to provide various functionality to components of an autonomous vehicle service, in accordance with various embodiments;
FIG. 36 is a diagram depicting a simulator configured to simulate one or more functions of a simulated autonomous vehicle in a simulation environment, according to some examples;
FIG. 37 depicts a vehicle modeler, according to some examples;
FIG. 38 is a diagram depicting an example of a sensor modeler, according to some examples;
FIG. 39 is a diagram depicting an example of a dynamic object data modeler, according to some examples;
FIG. 40 is a flow diagram illustrating an example of generating a simulation environment, according to some examples; and
FIG. 41 illustrates an example of various computing platforms configured to provide various simulator-related functions and/or structures to simulate autonomous vehicle services, in accordance with various embodiments.
Detailed Description
Various embodiments or examples may be implemented in numerous ways, including as a system, a process, an apparatus, a user interface, or as a series of program instructions on a computer readable medium such as a computer readable storage medium or a computer network wherein the program instructions are sent over optical, electronic, or wireless communication links. In general, the operations of the disclosed processes may be performed in any order, unless otherwise specified in the claims.
One or more example implementations are provided below along with accompanying figures. The detailed description is provided in connection with such examples, but is not limited to any particular example. The scope is limited only by the claims, and numerous alternatives, modifications, and equivalents thereof are encompassed. In the following description, numerous specific details are set forth in order to provide a thorough understanding. These details are provided for the purpose of example, and the described techniques may be practiced according to the claims without some or all of these specific details. For the sake of brevity, technical material that is known in the technical fields related to the examples has not been described in detail so as not to unnecessarily obscure the description.
FIG. 1 is a diagram depicting an implementation of a fleet of autonomous vehicles communicatively networked with an autonomous vehicle service platform, in accordance with some embodiments. The diagram 100 depicts a fleet of autonomous vehicles 109 (e.g., one or more of the autonomous vehicles 109a to 109e) operating as a service, each autonomous vehicle 109 configured to self-drive on a road network 110 and establish a communication link 192 with an autonomous vehicle service platform 101. In examples where the fleet of autonomous vehicles 109 constitutes a service, the user 102 may transmit a request 103 for autonomous transportation to the autonomous vehicle service platform 101 via one or more networks 106. In response, the autonomous vehicle service platform 101 may send one of the autonomous vehicles 109 from the geographic location 119 to the geographic location 111 to autonomously transport the user 102. The autonomous vehicle service platform 101 may dispatch an autonomous vehicle from a station 190 to the geographic location 119, or may divert an autonomous vehicle 109c (e.g., one that is unoccupied) already in transit to service the transportation request of the user 102. The autonomous vehicle service platform 101 may also be configured to divert an autonomous vehicle 109c (with a passenger) en route in response to a request from the user 102 (e.g., as a passenger). Additionally, the autonomous vehicle service platform 101 may also be configured to reserve an autonomous vehicle 109c (with a passenger) en route for diversion to service the request of the user 102 after the existing passenger departs. Note that multiple autonomous vehicle service platforms 101 (not shown) and one or more stations 190 may be implemented to service one or more autonomous vehicles 109 connected to the road network 110. One or more stations 190 may be configured to store, service, manage, and/or maintain an inventory of autonomous vehicles 109 (e.g., a station 190 may include one or more computing devices implementing the autonomous vehicle service platform 101).
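One way to picture the dispatch-or-divert behavior described above is the small policy sketch below. It is a hypothetical illustration only; the request, fleet, and station objects and their methods are assumptions, not part of the patent.

```python
def assign_vehicle(request, fleet, stations):
    """Hypothetical dispatch policy: prefer an unoccupied in-transit vehicle near the
    pickup location; otherwise dispatch a vehicle from the nearest station."""
    unoccupied = [v for v in fleet if v.in_transit and not v.occupied]
    if unoccupied:
        vehicle = min(unoccupied, key=lambda v: v.distance_to(request.pickup))
        vehicle.divert_to(request.pickup)        # divert a vehicle already on the road
    else:
        station = min(stations, key=lambda s: s.distance_to(request.pickup))
        vehicle = station.dispatch()             # pull a stored vehicle from the station
        vehicle.navigate_to(request.pickup)
    return vehicle
```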
According to some examples, at least some of the autonomous vehicles 109a to 109c are configured as bidirectional autonomous vehicles, e.g., bidirectional autonomous vehicle ("AV") 130. The bidirectional autonomous vehicle 130 may be configured to travel primarily, but not exclusively, in either direction along the longitudinal axis 131. Accordingly, the bidirectional autonomous vehicle 130 may be configured to implement active lighting located on the exterior of the vehicle to alert others (e.g., other drivers, pedestrians, cyclists, etc.) nearby and in the direction in which the bidirectional autonomous vehicle 130 is traveling. For example, the active light source 136 may be implemented as active lights 138a when traveling in a first direction, or as active lights 138b when traveling in a second direction. The active lights 138a may be implemented using a first subset of one or more colors, with optional animation (e.g., light patterns of variable intensity or color that may change over time). Similarly, the active lights 138b may be implemented using a second subset of one or more colors and light patterns that may differ from those of the active lights 138a. For example, the active lights 138a may be implemented using white lights as "headlights," and the active lights 138b may be implemented using red lights as "taillights." The active lights 138a and 138b, or portions thereof, may be configured to provide other light-related functions, such as providing a "turn signal indication" function (e.g., using yellow lights). According to various examples, logic in the autonomous vehicle 130 may be configured to adjust the active lights 138a and 138b to comply with various safety requirements and traffic regulations or laws of any number of jurisdictions.
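A direction-dependent lighting configuration of the kind described here could be expressed as a simple lookup, as in the hypothetical sketch below; the configuration table and the jurisdiction_rules hook are illustrative assumptions rather than details from the patent.

```python
# Hypothetical mapping from travel direction to active-light configuration for a
# bidirectional vehicle: the leading end shows white "headlights", the trailing end red "taillights".
LIGHT_CONFIG = {
    "first_direction":  {"end_a": ("white", "headlight"), "end_b": ("red", "taillight")},
    "second_direction": {"end_a": ("red", "taillight"),   "end_b": ("white", "headlight")},
}

def configure_active_lights(direction_of_travel, jurisdiction_rules):
    """Select the light configuration for the current direction and adapt it to local rules."""
    config = LIGHT_CONFIG[direction_of_travel]
    return jurisdiction_rules.apply(config)   # adjust colors/patterns to local traffic regulations
```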
In some embodiments, bi-directional autonomous vehicle 130 may be configured to have similar structural elements and components in each quadrilateral portion (e.g., quadrilateral portion 194). In at least this example, the quadrilateral portion is described as the portion of the bi-directional autonomous vehicle 130 defined by the intersection of the plane 132 and the plane 134, where the plane 132 and the plane 134 both pass through the vehicle to form two similar halves on each side of the plane 132 and the plane 134. Additionally, the bi-directional autonomous vehicle 130 may also include an autonomous vehicle controller 147, the autonomous vehicle controller 147 including logic (e.g., hardware or software, or a combination thereof) configured to control most vehicle functions including drive control (e.g., propulsion, steering, etc.) and active light sources 136, among other functions. The bi-directional autonomous vehicle 130 may also include a plurality of sensors 139 (other sensors not shown) disposed at various locations of the vehicle.
The autonomous vehicle controller 147 may also be configured to determine a local pose (e.g., a local position) of the autonomous vehicle 109 and to detect external objects relative to the vehicle. For example, consider that the bidirectional autonomous vehicle 130 is traveling in the direction 119 within the road network 110. A localizer (not shown) of the autonomous vehicle controller 147 may determine the local pose at the geographic location 111. The localizer may use collected sensor data, such as sensor data associated with the surfaces of the buildings 115 and 117, which may be compared against reference data, such as map data (e.g., 3D map data, including reflectance data), to determine the local pose. Additionally, a perception engine (not shown) of the autonomous vehicle controller 147 may be configured to detect, classify, and predict the behavior of external objects, such as the external object 112 (a "tree") and the external object 114 (a "pedestrian"). Classification of such external objects may broadly categorize objects as static objects, such as the external object 112, and dynamic objects, such as the external object 114.
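The coarse static/dynamic split performed by the perception engine might be sketched as follows. This is an illustrative assumption about one possible heuristic (a speed threshold plus a label set), not the patent's actual classifier.

```python
def split_static_dynamic(tracked_objects):
    """Hypothetical coarse split of perceived objects, e.g. trees as static objects
    and pedestrians as dynamic objects."""
    static_objects, dynamic_objects = [], []
    for obj in tracked_objects:
        if obj.speed < 0.1 and obj.label in {"tree", "building", "traffic_cone"}:
            static_objects.append(obj)
        else:
            dynamic_objects.append(obj)   # pedestrians, cyclists, animals, other vehicles
    return static_objects, dynamic_objects
```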
According to some examples, the autonomous vehicle service platform 101 may be configured to provide teleoperator services if a teleoperation is requested by an autonomous vehicle 109. For example, consider that the autonomous vehicle controller 147 in the autonomous vehicle 109d detects that an object 126 obstructs the path 124 on the road 122 at point 191 (as depicted in the inset 120). If the autonomous vehicle controller 147 cannot ascertain, with a relatively high degree of certainty, a path or trajectory over which the vehicle 109d may travel safely, the autonomous vehicle controller 147 may transmit a request message 105 for teleoperation services. In response, a teleoperator computing device 104 may receive instructions from a teleoperator 108 to perform a course of action to successfully (and safely) negotiate the obstacle 126. Response data 107 may then be transmitted back to the autonomous vehicle 109d, for example, to enable the vehicle to safely cross a set of double lines as it travels along the alternate path 121. In some examples, the teleoperator computing device 104 may generate a response identifying a geographic area to be excluded from path planning. In particular, rather than providing a path to follow, the teleoperator 108 may define areas or locations that the autonomous vehicle must avoid.
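The request/response exchange with the teleoperator, including a response that excludes geographic regions from planning, could be modeled roughly as below. The message fields and method names are hypothetical assumptions, not the patent's interfaces.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TeleopRequest:                  # hypothetical message sent when confidence is too low
    vehicle_id: str
    description: str
    candidate_paths: List[list]

@dataclass
class TeleopResponse:                 # hypothetical teleoperator guidance
    approved_path: list
    excluded_regions: List[list] = field(default_factory=list)   # areas the AV must avoid

def request_teleoperation(vehicle, planner, teleop_service):
    request = TeleopRequest(vehicle.id, "object obstructing path", planner.candidate_paths())
    response = teleop_service.send(request)             # teleoperator reviews and responds
    planner.exclude_regions(response.excluded_regions)  # honor regions the AV must avoid
    vehicle.follow(response.approved_path)
```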
In view of the above description, the autonomous vehicle 130 and/or the autonomous vehicle controller 147, and the structures and/or functions of their components, may perform real-time (or near real-time) trajectory calculations through operations related to autonomy (e.g., localization and perception) to enable the autonomous vehicle 109 to drive itself.
In some cases, the bi-directional nature of bi-directional autonomous vehicle 130 provides a vehicle with quadrilateral portions 194 (or any other number of symmetric portions) that are similar or substantially similar to each other. Such symmetry reduces the complexity of the design and relatively reduces the number of unique components or structures, thereby reducing inventory and manufacturing complexity. For example, a power train and wheel system may be provided in any of the quadrilateral portions 194. Additionally, the autonomous vehicle controller 147 may also be configured to invoke teleoperation services to reduce the likelihood of the autonomous vehicle 109 being delayed en route while addressing events or issues that may otherwise affect the safety of the occupants. In some cases, the visible portion of road network 110 depicts a geographically delimited area that may restrict, or otherwise control, the movement of the autonomous vehicles 109 to the road network shown in fig. 1. According to various examples, the autonomous vehicle 109 and its fleet may be configured to operate as level 4 ("full self-driving automation," or L4) vehicles that may provide the convenience and privacy of point-to-point personal mobility for on-demand transportation while providing the efficiency of shared vehicles. In some examples, the autonomous vehicle 109, or any of the autonomous vehicles described herein, may be configured to omit a steering wheel or any other mechanical device that provides manual (i.e., human-controlled) steering for the autonomous vehicle 109. Additionally, the autonomous vehicle 109, or any of the autonomous vehicles described herein, may also be configured to omit a seat or position reserved in the vehicle for an occupant to engage a steering wheel or any other mechanical interface to a steering system.
Fig. 2 is an example of a flow diagram for monitoring a fleet of autonomous vehicles, according to some embodiments. At 202, the process 200 begins by monitoring a fleet of autonomous vehicles. At least one autonomous vehicle includes an autonomous vehicle controller configured to autonomously cause the vehicle to travel from a first geographic area to a second geographic area. At 204, data representing an event associated with a calculated confidence level of the vehicle is detected. An event may be a condition or situation that affects, or may affect, the operation of an autonomous vehicle. The event may be internal or external to the autonomous vehicle. For example, an obstacle obstructing the road may be regarded as an event, as may a degradation or loss of communication. Events may include traffic conditions or congestion conditions, as well as unpredictable or unusual numbers or types of external objects (or tracks) perceived by the perception engine. Events may include weather-related conditions (e.g., loss of friction due to ice or rain) or the angle of the sun (e.g., at sunset), such as a low angle relative to the horizon that causes the sun to glare in the eyes of human drivers of other vehicles. These or other conditions may be considered events that cause a remote operator service to be invoked or cause the vehicle to execute a safe-stop trajectory.
At 206, data representing a subset of candidate trajectories may be received from the autonomous vehicle in response to detecting the event. For example, a planner of an autonomous vehicle controller may calculate and estimate a large number (e.g., thousands or more) of trajectories per unit time (e.g., per second). In some embodiments, the candidate trajectories are a subset of trajectories that provide a fairly high confidence level that the autonomous vehicle can safely move forward (e.g., using an alternate path provided by a teleoperator) in view of the event. Note that some candidate trajectories may be ranked or associated with a higher confidence than other candidate trajectories. According to some examples, the subset of candidate trajectories may originate from any number of sources, such as a planner, a teleoperator computing device (e.g., a teleoperator may determine and provide an approximate path), etc., and may be combined as a superset of candidate trajectories. At 208, path guidance data may be identified at one or more processors. The path guidance data may be configured to assist a teleoperator in selecting a guide trajectory from one or more of the candidate trajectories. In some instances, the path guidance data specifies a value indicative of a confidence level, or a probability indicative of a degree of certainty, that a particular candidate trajectory may reduce or negate the probability that the event may impair operation of the autonomous vehicle. At 210, a guide trajectory may be received as a selected candidate trajectory in response to an input from the remote operator (e.g., the remote operator may select at least one candidate trajectory from a set of differently ranked candidate trajectories to be selected as the guide trajectory). This selection may be made, for example, via an operator interface listing a plurality of candidate trajectories in order from highest confidence level to lowest confidence level. At 212, the selection of the candidate trajectory as the guide trajectory may be transmitted to the vehicle, and the guide trajectory may then be implemented to resolve the condition by causing the vehicle to perform the teleoperator-specified maneuver. In this way, the autonomous vehicle may transition out of the non-normative operating state.
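By way of a non-limiting illustration of the ranking and selection described above, the following Python sketch shows one way candidate trajectories might be ordered by confidence level and a teleoperator's choice accepted as the guide trajectory. The class and field names are assumptions made for illustration only and are not part of the disclosed system.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class CandidateTrajectory:
    trajectory_id: str
    waypoints: List[Tuple[float, float]]  # (x, y) points describing the path
    confidence: float                     # probability the trajectory safely resolves the event

def rank_candidates(candidates: List[CandidateTrajectory]) -> List[CandidateTrajectory]:
    """Order candidate trajectories from highest to lowest confidence,
    as they might be listed on a teleoperator interface."""
    return sorted(candidates, key=lambda c: c.confidence, reverse=True)

def select_guide_trajectory(candidates: List[CandidateTrajectory],
                            teleoperator_choice: str) -> CandidateTrajectory:
    """Return the candidate the teleoperator selected as the guide trajectory."""
    ranked = rank_candidates(candidates)
    for candidate in ranked:
        if candidate.trajectory_id == teleoperator_choice:
            return candidate
    # Fall back to the highest-confidence candidate if the choice is unknown.
    return ranked[0]

# Example usage with made-up values.
candidates = [
    CandidateTrajectory("t1", [(0.0, 0.0), (5.0, 1.0)], confidence=0.72),
    CandidateTrajectory("t2", [(0.0, 0.0), (5.0, -1.0)], confidence=0.91),
]
guide = select_guide_trajectory(candidates, teleoperator_choice="t2")
print(guide.trajectory_id, guide.confidence)
```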
Fig. 3A is a diagram depicting examples of sensors and other autonomous vehicle components, according to some examples. Diagram 300 depicts an interior view of a bidirectional autonomous vehicle 330, the bidirectional autonomous vehicle 330 including sensors, a signal router 345, a drivetrain 349, a detachable battery 343, an audio generator 344 (e.g., a speaker or transducer), and autonomous vehicle ("AV") control logic 347. The sensors shown in diagram 300 include an image capture sensor 340 (e.g., any type of light capture device or camera), an audio capture sensor 342 (e.g., any type of microphone), a radar device 348, a sonar device 341 (or other similar sensors, including ultrasonic sensors or sound-related sensors), and a lidar device 346, as well as other sensor types and forms (some of which are not shown, e.g., inertial measurement units ("IMUs"), global positioning system ("GPS") sensors, sonar sensors, etc.). Note that quadrilateral portions 350 represent the symmetry of each of the four "quadrilateral portions" of bi-directional autonomous vehicle 330 (e.g., each quadrilateral portion 350 may include wheels, a drivetrain 349, similar steering mechanisms, similar structural supports and components, etc., in addition to those described). As depicted in fig. 3A, similar sensors may be placed at similar locations in each quadrilateral portion 350; however, any other configuration may be implemented. Each wheel can be steered individually and independently of the other wheels. It should also be noted that the removable battery 343 may be configured to facilitate insertion and removal, rather than in-situ charging, to ensure that downtime due to the necessity of charging the battery 343 is reduced or negligible. Although the autonomous vehicle controller 347a is described as being used in a bi-directional autonomous vehicle 330, the autonomous vehicle controller 347a is not so limited and may be implemented in a uni-directional autonomous vehicle or any other type of vehicle, whether on land, in the air, or at sea. Note that the depicted and described positioning, location, orientation, quantity, and type of sensors shown in fig. 3A are not intended to be limiting, and thus, any number and any type of sensors may be present and may be positioned and oriented anywhere on autonomous vehicle 330.
According to some embodiments, portions of the autonomous vehicle ("AV") control logic 347 may be implemented using clusters of graphics processing units ("GPUs") that implement a framework and programming model suitable for programming such GPU clusters. For example, the GPUs may be programmed using a compute unified device architecture ("CUDA")-compatible programming language and application programming interface ("API") model. CUDA™ is produced and maintained by NVIDIA of Santa Clara, Calif. Note that other programming languages, such as OpenCL, or any other parallel programming language may also be implemented.
According to some embodiments, the autonomous vehicle control logic 347 may be implemented in hardware and/or software as an autonomous vehicle controller 347a, which is depicted as including a motion controller 362, a planner 364, a perception engine 366, and a locator 368. As shown, autonomous vehicle controller 347a is configured to receive camera data 340a, lidar data 346a, and radar data 348a, or any other range-sensing or localization data, including sonar data 341a, or the like. The autonomous vehicle controller 347a is also configured to receive positioning data, such as GPS data 352, IMU data 354, and other position-sensing data (e.g., data related to the wheels, such as steering angle, angular velocity, etc.). Additionally, the autonomous vehicle controller 347a may receive any other sensor data 356, as well as the reference data 339. In some cases, the reference data 339 includes map data (e.g., 3D map data, 2D map data, 4D map data (e.g., including a time determination)) and route data (e.g., road network data, including, but not limited to, RNDF data (or similar data), MDF data (or similar data), and the like).
The locator 368 is configured to receive sensor data, such as GPS data 352, wheel data, IMU data 354, lidar data 346a, camera data 340a, radar data 348a, and the like, as well as reference data 339 (e.g., 3D map data and route data), from one or more sources. Locator 368 integrates the data (e.g., fuses the sensor data) and analyzes it by comparing the sensor data to map data to determine a local pose (or position) of bi-directional autonomous vehicle 330. According to some embodiments, the locator 368 may generate or update the pose or position of any autonomous vehicle in real time or near real time. Note that the locator 368 and its functionality need not be limited to a "bi-directional" vehicle, but may be implemented in any vehicle of any type. Thus, the locator 368 (and other components of the AV controller 347a) may be implemented in a "uni-directional" vehicle or any non-autonomous vehicle. According to some embodiments, the data describing the local pose may include one or more of x-coordinates, y-coordinates, z-coordinates (or any coordinates of any coordinate system, including polar or cylindrical coordinate systems, and the like), yaw values, roll values, pitch values (e.g., angle values), velocity (e.g., speed), altitude, and/or the like.
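As a rough, hypothetical illustration of the local pose data described above, the sketch below shows one way such a pose could be represented and how position estimates from several sensors might be fused with a simple inverse-variance weighting. The names, fields, and weighting scheme are assumptions made for illustration only; an actual locator would likely use more sophisticated probabilistic fusion (e.g., Bayesian filtering).

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LocalPose:
    # Position in any consistent coordinate system.
    x: float
    y: float
    z: float
    # Orientation angles.
    roll: float
    pitch: float
    yaw: float
    velocity: float = 0.0
    altitude: float = 0.0

def fuse_position_estimates(estimates: List[Tuple[float, float, float]]) -> Tuple[float, float]:
    """Fuse (x, y, variance) position estimates from different sources
    (e.g., GPS, lidar-to-map matching, wheel odometry) by inverse-variance weighting."""
    weights = [1.0 / var for (_, _, var) in estimates]
    total = sum(weights)
    x = sum(w * e[0] for w, e in zip(weights, estimates)) / total
    y = sum(w * e[1] for w, e in zip(weights, estimates)) / total
    return x, y

# Example: noisy GPS, precise lidar map matching, moderate wheel odometry.
fused_x, fused_y = fuse_position_estimates([
    (10.2, 4.9, 2.0),
    (10.05, 5.01, 0.05),
    (10.4, 5.2, 0.5),
])
pose = LocalPose(x=fused_x, y=fused_y, z=0.0, roll=0.0, pitch=0.0, yaw=1.57)
print(pose)
```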
The perception engine 366 is configured to receive sensor data, such as lidar data 346a, camera data 340a, radar data 348a, etc., as well as local pose data, from one or more sources. The perception engine 366 may be configured to determine the locations of external objects based on the sensor data and other data. For example, an external object may be an object that is not part of a drivable road surface. For example, the perception engine 366 can detect external objects and can classify them as pedestrians, cyclists, dogs, other vehicles, and the like (e.g., the perception engine 366 can be configured to classify objects according to a classification type and can associate them with semantic information, including labels). Based on the classification of these external objects, the external objects may be labeled as dynamic objects or static objects. For example, an external object classified as a tree may be labeled as a static object, while an external object classified as a pedestrian may be labeled as a dynamic object. External objects marked as static may or may not be described in the map data. Examples of external objects that may be labeled as static include traffic cones, cement barriers placed across roads, lane closure signs, newly placed mailboxes or trash cans adjacent to roads, and the like. Examples of external objects that may be labeled as dynamic include cyclists, pedestrians, animals, other vehicles, and so forth. If an external object is labeled as dynamic, other data about the external object may indicate a typical level of activity and speed, as well as behavior patterns associated with the classification type. Other data about the external object may be generated by tracking the external object. The classification type may then be used to predict or otherwise determine the likelihood that the external object may, for example, interfere with an autonomous vehicle traveling along the planned path. For example, an external object classified as a pedestrian may be associated with some maximum speed, as well as an average speed (e.g., based on tracking data). The speed of the pedestrian relative to the speed of the autonomous vehicle may be used to determine whether a collision is likely. Additionally, the perception engine 366 may determine a level of uncertainty associated with the current and future states of objects. In some examples, the uncertainty level may be represented as an estimated value (or probability).
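The static/dynamic labeling and interference prediction described above can be sketched as follows. This is a simplified, assumed illustration (the classification lists, typical speeds, thresholds, and function names are made up for this example) of how a classification type might be mapped to a static or dynamic label and used to crudely estimate whether a dynamic object could interfere with the planned path.

```python
from dataclasses import dataclass
from typing import Tuple

# Assumed example classification types for illustration only.
STATIC_TYPES = {"tree", "traffic cone", "cement barrier", "lane closure sign", "mailbox"}
DYNAMIC_TYPES = {"pedestrian", "cyclist", "dog", "other vehicle"}

# Assumed typical maximum speeds (m/s) per dynamic classification type.
TYPICAL_MAX_SPEED = {"pedestrian": 2.5, "cyclist": 10.0, "dog": 8.0, "other vehicle": 40.0}

@dataclass
class TrackedObject:
    classification: str
    position: Tuple[float, float]  # (x, y) relative to the vehicle, meters
    speed: float                   # observed speed, m/s

def tag_object(obj: TrackedObject) -> str:
    """Label an external object as static or dynamic based on its classification type."""
    return "dynamic" if obj.classification in DYNAMIC_TYPES else "static"

def may_interfere(obj: TrackedObject, vehicle_speed: float,
                  time_horizon: float = 3.0) -> bool:
    """Crude check of whether a dynamic object could reach the planned path within
    a short time horizon, using its typical maximum speed and the vehicle's speed."""
    if tag_object(obj) != "dynamic":
        return False
    max_speed = TYPICAL_MAX_SPEED.get(obj.classification, obj.speed)
    reachable = max_speed * time_horizon
    closing = vehicle_speed * time_horizon
    distance = (obj.position[0] ** 2 + obj.position[1] ** 2) ** 0.5
    return distance <= reachable + closing

pedestrian = TrackedObject("pedestrian", position=(12.0, 3.0), speed=1.4)
print(tag_object(pedestrian), may_interfere(pedestrian, vehicle_speed=5.0))
```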
Fig. 3B-3E are diagrams depicting examples of sensor field redundancy and adaptation of an autonomous vehicle to the loss of a sensor field, according to some examples. Diagram 391 of FIG. 3B depicts sensor field 301a, in which sensor 310a detects objects (e.g., for determining range or distance, or other information). Although sensor 310a may implement any type or form of sensor, sensor 310a and the similarly described sensors (e.g., sensors 310b, 310c, and 310d) may include lidar devices. Thus, sensor fields 301a, 301b, 301c, and 301d each include a field into which the laser extends. Diagram 392 of fig. 3C depicts four overlapping sensor fields, each of which is generated by a respective lidar sensor 310 (not shown). As shown, portions 301 of the sensor fields include non-overlapping sensor fields (e.g., a single lidar field), portions 302 of the sensor fields include two overlapping sensor fields, and portions 303 include three overlapping sensor fields; such an arrangement thus provides multiple levels of redundancy should a lidar sensor fail.
Fig. 3D depicts the loss of a sensor field due to faulty operation of lidar 309, according to some examples. The sensor field 302 of fig. 3C is transformed into a single sensor field 305, one of the sensor fields 301 of fig. 3C is lost at the gap 304, and three of the sensor fields 303 of fig. 3C are transformed into sensor fields 306 (i.e., limited to two overlapping fields). If the autonomous car 330c is traveling in the direction of travel 396, the sensor field in front of the moving autonomous vehicle is less robust than the sensor field at the trailing portion. According to some examples, an autonomous vehicle controller (not shown) is configured to use the bi-directional characteristics of the autonomous vehicle 330c to address the loss of sensor field in the leading area ahead of the vehicle. Fig. 3E depicts a bi-directional maneuver for restoring a robust sensor field in front of the autonomous vehicle 330d. As shown, the more robust sensor field 302 is disposed at the rear of the vehicle 330d, coextensive with the tail lights 348. When convenient, the autonomous vehicle 330d performs the bi-directional maneuver by pulling into lane 397 and switching its directionality such that the tail lights 348 actively switch to the other end (e.g., the trailing end) of the autonomous vehicle 330d. As shown, the autonomous vehicle 330d restores the robust sensor field 302 in front of the vehicle as it travels in the direction of travel 398. In addition, the bi-directional maneuver described above eliminates the need for a more complex maneuver that would require backing up into a busy roadway.
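The redundancy accounting and direction-switching decision described above might be sketched as follows, assuming a hypothetical mapping of lidar devices to coverage regions around the vehicle. The sensor names, regions, and decision rule are illustrative assumptions only, not part of the disclosure.

```python
from typing import Dict, List

# Hypothetical sensor coverage: which regions around the vehicle each lidar covers.
SENSOR_COVERAGE: Dict[str, List[str]] = {
    "lidar_front_left":  ["front", "left"],
    "lidar_front_right": ["front", "right"],
    "lidar_rear_left":   ["rear", "left"],
    "lidar_rear_right":  ["rear", "right"],
}

def redundancy_by_region(healthy_sensors: List[str]) -> Dict[str, int]:
    """Count how many healthy sensor fields overlap each region."""
    counts = {"front": 0, "rear": 0, "left": 0, "right": 0}
    for sensor in healthy_sensors:
        for region in SENSOR_COVERAGE.get(sensor, []):
            counts[region] += 1
    return counts

def should_switch_directionality(healthy_sensors: List[str]) -> bool:
    """Suggest a bi-directional switch when the trailing end of the vehicle
    has more overlapping sensor fields than the leading end."""
    counts = redundancy_by_region(healthy_sensors)
    return counts["rear"] > counts["front"]

# Example: one front lidar has failed.
healthy = ["lidar_front_right", "lidar_rear_left", "lidar_rear_right"]
print(redundancy_by_region(healthy))
print("switch directionality:", should_switch_directionality(healthy))
```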
Fig. 4 is a functional block diagram depicting a system including an autonomous vehicle service platform communicatively coupled to an autonomous vehicle controller via a communication layer, according to some examples. Diagram 400 depicts an autonomous vehicle ("AV") controller 447 disposed in an autonomous vehicle 430, the autonomous vehicle 430 in turn including a plurality of sensors 470 coupled to the autonomous vehicle controller 447. The sensors 470 include one or more lidar devices 472, one or more cameras 474, one or more radars 476, one or more global positioning system ("GPS") data receiver sensors, one or more inertial measurement units ("IMUs") 475, one or more odometer sensors 477 (e.g., wheel encoder sensors, wheel speed sensors, etc.), as well as any other suitable sensors 478 (e.g., infrared cameras or sensors, hyperspectral-capable sensors, ultrasonic sensors (or any other acoustic-energy-based sensors), radio-frequency-based sensors, etc.). In some cases, a wheel angle sensor configured to sense the steering angle of the wheels may be included as an odometer sensor 477 or a suitable sensor 478. In a non-limiting example, the autonomous vehicle controller 447 may include four or more lidar devices 472, sixteen or more cameras 474, and four or more radar units 476. Additionally, sensors 470 may also be configured to provide sensor data to components of the autonomous vehicle controller 447 as well as to elements of the autonomous vehicle service platform 401. As shown in diagram 400, the autonomous vehicle controller 447 includes a planner 464, a motion controller 462, a locator 468, a perception engine 466, and a local map generator 440. Note that elements described in diagram 400 of FIG. 4 may include the same structure and/or functionality as similarly named elements described in connection with one or more other diagrams.
The locator 468 is configured to localize the autonomous vehicle (i.e., determine a local pose) relative to reference data, which may include map data, route data (e.g., road network data, such as RNDF-like data), and the like. In some cases, for example, locator 468 is configured to identify a point in space that may represent the location of autonomous vehicle 430 relative to representative features of the environment. Locator 468 is shown to include a sensor data integrator 469, which may be configured to integrate multiple subsets of sensor data (e.g., of different sensor formats) to reduce the uncertainty associated with each individual type of sensor. According to some examples, sensor data integrator 469 is configured to fuse sensor data (e.g., lidar data, camera data, radar data, etc.) to form integrated sensor data values that are used to determine the local pose. According to some examples, the locator 468 retrieves reference data from a reference data repository 405, the repository 405 including a map data repository 405a for storing 2D map data, 3D map data, 4D map data, and the like. Locator 468 may be configured to identify at least a subset of features in the environment to match against map data in order to identify or otherwise confirm the pose of autonomous vehicle 430. According to some examples, the locator 468 may be configured to identify any quantity of features in the environment, such that a set of features may include one or more features, or all features. In a particular example, any amount of lidar data (e.g., most or substantially all of the lidar data) may be compared to data representing a map for localization purposes. Generally, non-matching objects resulting from the comparison of the environmental features and the map data may be dynamic objects, such as vehicles, cyclists, pedestrians, or the like. Note that the detection of dynamic objects (including obstacles) may be performed with or without map data. In particular, dynamic objects may be detected and tracked independently of (i.e., without) map data. In some examples, the 2D map data and the 3D map data may be considered "global map data," or map data that the autonomous vehicle service platform 401 has verified at some point in time. Because the map data in the map data repository 405a may be only periodically updated and/or verified, there may be a discrepancy between the map data and the actual environment in which the autonomous vehicle is located. Thus, the locator 468 can retrieve locally obtained map data generated by the local map generator 440 to enhance localization. The local map generator 440 is configured to generate local map data in real time or near real time. Optionally, for example, the local map generator 440 may receive static and dynamic object map data to enhance the accuracy of the locally generated map by disregarding dynamic objects in the localization. In accordance with at least some embodiments, the local map generator 440 may be integrated with the locator 468 or formed as part of the locator 468. In at least one instance, the local map generator 440, alone or in cooperation with the locator 468, can be configured to generate map and/or reference data based on simultaneous localization and mapping ("SLAM") or the like.
Note that the locator 468 may implement a "hybrid" approach to using map data, whereby logic in the locator 468 may be configured to select various amounts of map data from the map data repository 405a, or local map data from the local map generator 440, depending on the degree of reliability of each source of map data. Note, therefore, that the locator 468 may use outdated map data in conjunction with locally generated map data.
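The "hybrid" selection of map data sources can be illustrated with the short sketch below, which assumes that a reliability score is available for each map data source; the thresholds, weighting rule, and function names are illustrative assumptions, not values from the disclosure.

```python
from typing import Tuple

def select_map_source(global_map_reliability: float,
                      local_map_reliability: float,
                      blend_margin: float = 0.1) -> str:
    """Pick which map data to favor for localization based on the estimated
    reliability of each source; blend when the two are comparable."""
    if global_map_reliability >= local_map_reliability + blend_margin:
        return "global"
    if local_map_reliability >= global_map_reliability + blend_margin:
        return "local"
    return "blend"

def map_data_weights(global_reliability: float,
                     local_reliability: float) -> Tuple[float, float]:
    """Return normalized weights for global vs. locally generated map data."""
    total = global_reliability + local_reliability
    if total == 0:
        return 0.5, 0.5
    return global_reliability / total, local_reliability / total

# Example: the verified global map is slightly out of date near a construction zone.
print(select_map_source(global_map_reliability=0.6, local_map_reliability=0.85))
print(map_data_weights(0.6, 0.85))
```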
The perception engine 466 is configured to, for example, assist the planner 464 in planning routes and generating trajectories by identifying objects of interest in the surrounding environment in which the autonomous vehicle 430 is traveling. Additionally, a probability may also be associated with each object of interest, whereby the probability may indicate the likelihood that the object of interest poses a threat to safe driving (e.g., a fast-moving motorcycle may require intensive tracking, rather than a person sitting on a bus stop bench reading a newspaper). As shown, perception engine 466 includes an object detector 442 and an object classifier 444. The object detector 442 is configured to distinguish objects from other features in the environment, and the object classifier 444 may be configured to classify objects as dynamic or static objects and track the positions of the dynamic and static objects relative to the autonomous vehicle 430 for planning purposes. Additionally, the perception engine 466 may be configured to assign an identifier to a static or dynamic object that specifies whether the object is (or may become) an obstacle that may compromise the path planning at the planner 464. Although not shown in fig. 4, note that the perception engine 466 may also perform other perception-related functions, such as segmentation and tracking, examples of which are described below.
The planner 464 is configured to generate a plurality of candidate trajectories for achieving the goal of reaching the destination via the plurality of available paths or routes. The trajectory estimator 465 is configured to estimate the candidate trajectories and identify which subsets of the candidate trajectories are associated with a higher degree of confidence of providing collision-free paths to the destination. The trajectory estimator 465 may then select an optimal trajectory based on relevant criteria to cause commands to generate control signals for the vehicle components 450 (e.g., actuators or other mechanisms). Note that the relevant criteria may include any number of factors that define an optimal trajectory, and the selection of the optimal trajectory need not be limited to reducing collisions. For example, the selection of trajectories may optimize the user experience (e.g., user comfort) as well as provide collision-free trajectories that comply with traffic regulations and laws. The user experience may be optimized by moderating accelerations in the various linear and angular directions (e.g., to reduce jerky driving or other unpleasant movements). In some cases, at least a portion of the relevant criteria may specify which of the other criteria are to be overridden or superseded. For example, legal restrictions may be temporarily lifted or weakened when generating trajectories in restricted situations (e.g., crossing a double yellow line to bypass a cyclist, or driving at a speed above the posted speed limit to match traffic flow). The control signals are then configured to induce changes in propulsion and direction at the drivetrain and/or wheels. In this example, motion controller 462 is configured to translate the commands into control signals (e.g., speed, wheel angle, etc.) for controlling the movement of autonomous vehicle 430. In the event that trajectory estimator 465 has insufficient information to ensure a confidence level high enough to provide collision-free optimized driving, planner 464 may generate a request to teleoperator 404 for teleoperator support.
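One plausible form of the multi-criteria trajectory selection described above is sketched below. The cost weights, the collision-probability threshold, and the returned None (signaling that teleoperator support should be requested) are assumptions made for illustration and are not the disclosed implementation.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Trajectory:
    trajectory_id: str
    collision_probability: float   # estimated probability of collision
    max_lateral_accel: float       # m/s^2, proxy for ride comfort
    violates_traffic_rules: bool

def trajectory_cost(t: Trajectory,
                    w_collision: float = 10.0,
                    w_comfort: float = 1.0,
                    w_legal: float = 5.0) -> float:
    """Weighted cost combining safety, comfort, and legality criteria."""
    return (w_collision * t.collision_probability
            + w_comfort * t.max_lateral_accel
            + w_legal * (1.0 if t.violates_traffic_rules else 0.0))

def select_optimal(trajectories: List[Trajectory],
                   max_collision_probability: float = 0.05) -> Optional[Trajectory]:
    """Return the lowest-cost trajectory whose collision probability is acceptable;
    return None to indicate that teleoperator support should be requested."""
    safe = [t for t in trajectories
            if t.collision_probability <= max_collision_probability]
    if not safe:
        return None
    return min(safe, key=trajectory_cost)

candidates = [
    Trajectory("a", 0.02, 1.8, False),
    Trajectory("b", 0.01, 3.5, False),
    Trajectory("c", 0.20, 0.5, False),
]
best = select_optimal(candidates)
print(best.trajectory_id if best else "request teleoperator support")
```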
The autonomous vehicle service platform 401 includes a remote operator 404 (e.g., a remote operator computing device), a reference data repository 405, a map updater 406, a vehicle data controller 408, a calibrator 409, and an offline object classifier 410. Note that each element of autonomous vehicle service platform 401 may be independently located or distributed and in communication with other elements of the autonomous vehicle service platform 401. Additionally, elements of the autonomous vehicle service platform 401 may independently communicate with the autonomous vehicle 430 via the communication layer 402. The map updater 406 is configured to receive map data (e.g., from the local map generator 440, the sensors 470, or any other component of the autonomous vehicle controller 447), and is further configured to detect deviations of the map data, for example, in the map data repository 405a, from the locally generated map. The vehicle data controller 408 may cause the map updater 406 to update the reference data in the repository 405 and facilitate updating of the 2D, 3D, and/or 4D map data. In some cases, the vehicle data controller 408 may control the rate at which local map data is received into the autonomous vehicle service platform 401 and the frequency at which the map updater 406 performs updates to the map data.
The calibrator 409 is configured to perform calibration of various sensors of the same or different types. The calibrator 409 may be configured to determine the relative poses of the sensors (e.g., in Cartesian space (x, y, z)) and the orientations of the sensors (e.g., roll, yaw, and pitch). The pose and orientation of sensors such as cameras, lidar sensors, radar sensors, etc. may be calibrated relative to other sensors, as well as globally relative to the reference frame of the vehicle. Offline self-calibration may also calibrate or estimate other parameters, such as the vehicle inertia tensor, wheelbase, wheel radius, or surface road friction. According to some examples, calibration may also be performed online to detect parameter changes. It should also be noted that the calibration performed by the calibrator 409 may include intrinsic parameters of the sensors (e.g., optical distortion, beam angle, etc.) as well as extrinsic parameters. In some cases, for example, calibration may be performed by maximizing the correlation between depth discontinuities in the 3D laser data and edges in the image data. The offline object classifier 410 is configured to receive data, e.g., sensor data, from the sensors 470 or from any other component of the autonomous vehicle controller 447. According to some embodiments, the offline classification pipeline of the offline object classifier 410 may be configured to pre-collect and annotate objects (e.g., manually by a person and/or automatically using an offline labeling algorithm), and may also be configured to train an online classifier (e.g., object classifier 444), which may provide real-time classification of object types during online autonomous operation.
Fig. 5 is an example of a flow chart for controlling an autonomous vehicle, according to some embodiments. At 502, the flow 500 begins when sensor data originating from sensors of multiple forms at an autonomous vehicle is received, for example, by an autonomous vehicle controller. One or more subsets of the sensor data may be integrated together to generate fused data, for example, to improve an estimate. In some examples, at 504, sensor streams of one or more sensors (e.g., of the same or different forms) may be fused to form fused sensor data. In some examples, at 504, subsets of the lidar sensor data and the camera sensor data may be fused to facilitate localization. At 506, data representing objects based on at least two subsets of the sensor data may be derived at a processor. For example, data identifying static or dynamic objects may be derived (e.g., at a perception engine) from at least the lidar and camera data. At 508, it is determined that the detected objects affect the planned path, and at 510 a subset of trajectories is estimated (e.g., at the planner) in response to the detected objects. At 512, it is determined that the confidence level exceeds the range of acceptable confidence levels associated with the normative operation of the autonomous vehicle. In this case, therefore, the confidence level may indicate that the certainty with which an optimal path can be selected is insufficient, whereby an optimal path would otherwise be determined as a function of the probability of facilitating collision-free travel, complying with traffic regulations, providing a comfortable user experience (e.g., a comfortable ride), and/or any other factor used in generating candidate trajectories. Thus, at 514, a request for an alternate path may be transmitted to the teleoperator computing device. Thereafter, the teleoperator computing device may provide the planner with an optimal trajectory on which to drive the autonomous vehicle. In some cases, the vehicle may also determine that executing a safe-stop maneuver is the optimal course of action (e.g., safely and autonomously stopping the autonomous vehicle at a location with a relatively low probability of danger). Note that the order in which the various functions are performed in this flowchart, and in the other flowcharts herein, is not intended to imply a requirement to perform the functions linearly, as each portion of a flowchart may be performed serially or in parallel with, and independently of or dependent on, any one or more other portions of the flowchart.
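The decision logic of flow 500 can be summarized with the following sketch, which maps a confidence level onto one of three coarse actions. The specific thresholds are illustrative assumptions; the disclosure does not fix numerical values.

```python
from enum import Enum

class Action(Enum):
    FOLLOW_OPTIMAL_TRAJECTORY = "follow optimal trajectory"
    REQUEST_ALTERNATE_PATH = "request alternate path from teleoperator"
    EXECUTE_SAFE_STOP = "execute safe-stop maneuver"

def decide_action(confidence: float,
                  normative_min: float = 0.8,
                  safe_stop_max: float = 0.3,
                  teleoperator_reachable: bool = True) -> Action:
    """Map a planner confidence level onto the coarse decision described in flow 500:
    proceed, ask a teleoperator for an alternate path, or stop safely."""
    if confidence >= normative_min:
        return Action.FOLLOW_OPTIMAL_TRAJECTORY
    if confidence > safe_stop_max and teleoperator_reachable:
        return Action.REQUEST_ALTERNATE_PATH
    return Action.EXECUTE_SAFE_STOP

for conf in (0.95, 0.6, 0.1):
    print(conf, decide_action(conf).value)
```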
FIG. 6 is a diagram depicting an example of an architecture of an autonomous vehicle controller, according to some embodiments. Diagram 600 depicts a number of processes, including a motion controller process 662, a planner process 664, a perception process 666, a mapping process 640, and a localization process 668, some of which may generate or receive data related to other processes. Other processes, such as processes 670 and 650, may facilitate interaction with one or more mechanical components of the autonomous vehicle. For example, the perception process 666, mapping process 640, and localization process 668 are configured to receive sensor data from sensors 670, while the planner process 664 and the perception process 666 are configured to receive guidance data 606, which may include route data, such as road network data. Further to diagram 600, the localization process 668 is configured to receive map data 605a (i.e., 2D map data), map data 605b (i.e., 3D map data), and local map data 642, as well as other types of map data. For example, the localization process 668 may also receive other forms of map data, such as 4D map data, which may include, for example, a time determination. The localization process 668 is configured to generate local position data 641 representing the local pose. The local position data 641 is provided to the motion controller process 662, the planner process 664, and the perception process 666. The perception process 666 is configured to generate static and dynamic object map data 667, which, in turn, may be transmitted to the planner process 664. In some examples, static and dynamic object map data 667 may be transmitted along with other data (e.g., semantic classification information and predicted object behavior). The planner process 664 is configured to generate trajectory data 665, the trajectory data 665 describing a plurality of trajectories generated by the planner 664. The motion controller process 662 uses the trajectory data 665 to generate low-level commands or control signals that are applied to actuators 650 to cause changes in steering angle and/or velocity.
FIG. 7 is a diagram depicting an example of an autonomous vehicle service platform implementing redundant communication channels to maintain reliable communication with a fleet of autonomous vehicles, in accordance with some embodiments. Diagram 700 depicts an autonomous vehicle service platform 701, the autonomous vehicle service platform 701 including a reference data generator 705, a vehicle data controller 702, an autonomous vehicle fleet manager 703, a teleoperator manager 707, a simulator 740, and a policy manager 742. The reference data generator 705 is configured to generate and modify map data and route data (e.g., RNDF data). In addition, the reference data generator 705 may be configured to access 2D data in the 2D map data repository 720, access 3D data in the 3D map data repository 722, and access route data in the route data repository 724. In some examples, other map representation data and repositories may also be implemented, including, for example, 4D map data with a time determination. The vehicle data controller 702 may be configured to perform various operations. For example, vehicle data controller 702 may be configured to vary the rate at which data is exchanged between the fleet of autonomous vehicles 730 and the platform 701 based on the quality level of communications over channel 770. During bandwidth-limited periods, for example, data communications may be prioritized such that teleoperation requests from the autonomous vehicles 730 are given high priority to ensure delivery. In addition, varying levels of data abstraction may be transmitted per vehicle over channel 770, depending on the bandwidth available on a particular channel. For example, in the presence of a robust network connection, full lidar data may be transmitted (e.g., substantially all of the lidar data, but possibly less), whereas in the presence of a degraded or low-rate connection, a simpler or more abstract description of the data may be transmitted (e.g., bounding boxes with associated metadata, etc.). The autonomous vehicle fleet manager 703 is configured to coordinate the dispatching of autonomous vehicles 730 so as to optimize a number of variables, including the efficient use of battery power, travel time, whether an air conditioning unit in an autonomous vehicle 730 may be used during a low-battery condition, etc., any or all of which may be monitored in order to optimize a cost function associated with operating the autonomous vehicle service. Algorithms may be implemented to analyze various variables across the fleet of autonomous vehicles to minimize travel cost or time. Additionally, to maximize the uptime of the fleet, the autonomous vehicle fleet manager 703 maintains an inventory of autonomous vehicles, as well as parts, for accommodating the service schedule.
The communication channel 770 is configured to provide a network communication link between the fleet of autonomous vehicles 730 and the autonomous vehicle service platform 701. For example, the communication channel 770 includes a plurality of different types of networks 771, 772, 773, and 774 having corresponding sub-networks (e.g., 771a through 771n) to ensure a level of redundancy for reliably operating autonomous vehicle services. For example, different types of networks in the communication channel 770 may include different cellular network providers, different types of data networks, etc., to ensure sufficient bandwidth in the event of a loss or degradation of communication due to a failure in one or more of the networks 771, 772, 773, and 774.
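The bandwidth-dependent behavior described above (prioritizing teleoperation requests and varying the level of data abstraction transmitted over channel 770) might look roughly like the following sketch; the thresholds, message fields, and abstraction labels are assumptions chosen only to make the example concrete.

```python
from typing import Dict, List

def select_data_abstraction(bandwidth_kbps: float) -> str:
    """Choose what level of sensor detail to transmit given the currently
    available bandwidth (thresholds are illustrative only)."""
    if bandwidth_kbps > 50_000:        # robust connection
        return "full lidar point clouds"
    if bandwidth_kbps > 1_000:         # degraded connection
        return "downsampled point clouds"
    return "bounding boxes with metadata"

def prioritize_messages(messages: List[Dict[str, str]],
                        bandwidth_kbps: float) -> List[Dict[str, str]]:
    """During bandwidth-limited periods, send teleoperation requests first."""
    if bandwidth_kbps < 1_000:
        return sorted(messages,
                      key=lambda m: 0 if m["type"] == "teleop_request" else 1)
    return list(messages)

queue = [{"type": "telemetry"}, {"type": "teleop_request"}, {"type": "map_update"}]
print(select_data_abstraction(500))
print([m["type"] for m in prioritize_messages(queue, 500)])
```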
Fig. 8 is a diagram depicting an example of a message processing application configured to exchange data between various applications, in accordance with some embodiments. Diagram 800 depicts a teleoperator application 801 disposed at a teleoperator manager and an autonomous vehicle application 830 disposed in an autonomous vehicle, whereby the teleoperator application 801 and the autonomous vehicle application 830 exchange message data via protocols that facilitate communications over various networks (e.g., networks 871 and 872, as well as other networks 873). According to some examples, the communication protocol is a middleware protocol implemented as a Data Distribution Service™ having a specification maintained by the Object Management Group consortium. According to the communication protocol, the teleoperator application 801 and the autonomous vehicle application 830 may include a message router 854 disposed in a message domain, the message router configured to interface with the teleoperator API 852. In some examples, the message router 854 is a routing service. In some examples, the message domain 850a in the teleoperator application 801 may be identified by a teleoperator identifier, while the message domain 850b may be identified as a domain associated with a vehicle identifier. The teleoperator API 852 in the teleoperator application 801 is configured to interface with the teleoperator processes 803a to 803c, whereby the teleoperator process 803b is associated with an autonomous vehicle identifier 804 and the teleoperator process 803c is associated with an event identifier 806 (e.g., an identifier specifying an intersection where collision-free path planning may be problematic). The teleoperator API 852 in the autonomous vehicle application 830 is configured to interface with an autonomous vehicle operating system 840, the autonomous vehicle operating system 840 including a sensing application 842, a perception application 844, a localization application 846, and a control application 848. In view of the above, the communication protocols described above may facilitate data exchange to facilitate the remote control operations described herein. Additionally, the communication protocols described above may be adapted to provide secure data exchange between one or more autonomous vehicles and one or more autonomous vehicle service platforms. For example, the message router 854 may be configured to encrypt and decrypt messages to provide secure interaction between, for example, a teleoperator process 803 and the autonomous vehicle operating system 840.
Fig. 9 is a diagram depicting types of data for facilitating teleoperations using the communication protocol described in fig. 8, according to some examples. Diagram 900 depicts a teleoperator 908, the teleoperator 908 interfacing with a teleoperator computing device 904 coupled to a teleoperator application 901, the teleoperator application 901 configured to exchange data via a data-centric message processing bus 972 implemented in one or more networks 971. The data-centric message processing bus 972 provides a communication link between the teleoperator application 901 and the autonomous vehicle application 930. The teleoperator API 962 of the teleoperator application 901 is configured to receive message service configuration data 964 and route data 960, such as road network data (e.g., RNDF-like data), task data (e.g., MDF data), and the like. Similarly, a message processing service bridge 932 is also configured to receive message processing service configuration data 934. Message processing service configuration data 934 and 964 provide configuration data to configure the message processing service between the teleoperator application 901 and the autonomous vehicle application 930. Examples of message processing service configuration data 934 and 964 include quality of service ("QoS") configuration data implemented to configure a Data Distribution Service™ application.
An example of a data exchange for facilitating teleoperation via a communication protocol is described below. The obstacle data 920 may be considered to be generated by the perception system of the autonomous vehicle controller. Additionally, planner option data 924 is generated by the planner to inform the teleoperator of the subset of candidate trajectories and location data 926 is generated by the locator. Obstacle data 920, planner option data 924, and location data 926 are transmitted to a message processing service bridge 932, which generates telemetry 940 and query 942 from message service configuration data 934, which are transmitted as telemetry 950 and query 952 to teleoperator application 901 via a data-centric message processing bus 972. Teleoperator API 962 receives telemetry data 950 and query data 952 and processes telemetry data 950 and query data 952 in view of route data 960 and message service configuration data 964. The resulting data is then presented to teleoperator 908 via teleoperator computing device 904 and/or a cooperative display (e.g., a dashboard display visible to a set of cooperating teleoperators 908). The teleoperator 908 reviews the candidate trajectory options presented on the display of the teleoperator computing device 904 and selects a guide trajectory that generates command data 982 and query response data 980, both of which are communicated through the teleoperator API 962 as query response data 954 and command data 956. In turn, the query response data 954 and the command data 956 are transmitted as query response data 944 and command data 946 into the autonomous vehicle application 930 via the data-centric message processing bus 972. The message processing service bridge 932 receives the query response data 944 and the command data 946 and generates teleoperator command data 928 configured to generate a teleoperator selected trajectory implemented by the planner. Note that the message handling process described above is not limiting and other message handling protocols may also be implemented.
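The telemetry/query/command exchange described above can be mimicked with a toy publish/subscribe bus, as sketched below. This is not the Data Distribution Service API; the MessageBus class, topic names, and payloads are hypothetical stand-ins used only to show the direction of data flow between the vehicle side and the teleoperator side.

```python
from collections import defaultdict
from typing import Callable, Dict, List

class MessageBus:
    """Toy stand-in for a data-centric message bus: topics with subscribers
    that receive published payloads."""
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(payload)

bus = MessageBus()

# Teleoperator side: receive telemetry and queries, answer a query with a command.
def on_telemetry(payload: dict) -> None:
    print("teleoperator received telemetry:", payload)

def on_query(payload: dict) -> None:
    print("teleoperator received query:", payload)
    bus.publish("command", {"guide_trajectory_id": "t2"})

# Vehicle side: act on the returned command.
def on_command(payload: dict) -> None:
    print("vehicle received command:", payload)

bus.subscribe("telemetry", on_telemetry)
bus.subscribe("query", on_query)
bus.subscribe("command", on_command)

# Vehicle publishes telemetry and a query derived from obstacle/planner/locator data.
bus.publish("telemetry", {"pose": (10.0, 5.0, 1.57), "speed": 4.2})
bus.publish("query", {"candidate_trajectories": ["t1", "t2"], "event": "obstructed path"})
```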
Fig. 10 is a diagram illustrating an example of a teleoperator interface that a teleoperator may use to influence path planning, according to some embodiments. Diagram 1000 depicts an example of an autonomous vehicle 1030 communicating with an autonomous vehicle service platform 1001, the autonomous vehicle service platform 1001 including a teleoperator manager 1007 configured to facilitate teleoperations. In a first example, teleoperator manager 1007 receives data that requires teleoperator 1008 to preemptively view the path of an autonomous vehicle approaching a potential obstacle or an area of low planner confidence level, so that teleoperator 1008 may be able to resolve an issue in advance. To illustrate, consider that an intersection that the autonomous vehicle is approaching may be flagged as problematic. Thus, the user interface 1010 displays a representation 1014 of the respective autonomous vehicle 1030 traveling along a path 1012, the path 1012 having been forecast from a plurality of trajectories generated by the planner. Also displayed are other vehicles 1011 and dynamic objects 1013 (e.g., pedestrians), which may be a source of significant confusion at the planner and thus require teleoperation support. The user interface 1010 also presents the current speed 1022, the speed limit 1024, and the current charge 1026 of the battery to the teleoperator 1008. According to some examples, user interface 1010 may display other data, such as sensor data collected from autonomous vehicle 1030. In a second example, consider that the planner 1064 has generated a plurality of trajectories that are coextensive with a planner-generated path 1044, regardless of a detected unidentified object 1046. The planner 1064 may also generate a subset of candidate trajectories 1040, but in this example the planner cannot proceed given the current confidence level. If the planner 1064 is unable to determine an alternate path, a teleoperation request may be transmitted. In this case, the remote operator may select one of the candidate trajectories 1040 to facilitate travel of the autonomous vehicle 1030 consistent with the remote operator-based path 1042.
Fig. 11 is a diagram depicting an example of a planner configured to invoke a teleoperation according to some examples. Diagram 1100 depicts a planner 1164, the planner 1164 including a terrain manager 1110, a route manager 1112, a path generator 1114, a trajectory estimator 1120, and a trajectory tracker 1128. The terrain manager 1110 is configured to receive map data, such as 3D data or other similar map data specifying terrain features. The terrain manager 1110 is further configured to identify candidate paths based on terrain-related features on the path to the destination. According to various examples, terrain manager 1110 receives a 3D map generated by sensors associated with one or more autonomous vehicles in a fleet of vehicles. The route manager 1112 is configured to receive environment data 1103, the environment data 1103 may comprise traffic related information associated with one or more routes that can be selected as paths to the destination. The path generator 1114 receives data from the terrain manager 1110 and the route manager 1112 and generates one or more paths or path segments suitable for directing the autonomous vehicle to a destination. Data representing one or more paths or path segments is transmitted to trajectory estimator 1120.
According to some examples, the policy data 1130 may include criteria with which the planner 1164 determines a path that has a sufficient confidence level with which to generate trajectories. Examples of policy data 1130 include a policy that specifies that trajectories be generated at a certain distance away from external objects (e.g., to maintain a safety buffer of 3 feet from a cyclist, where possible), a policy that requires that trajectories must not cross the center double yellow line, or a policy that requires that trajectories be limited to a single lane on a 4-lane road (e.g., based on past events, such as the lane closest to a bus stop often being quite congested), as well as any other similar criteria specified by the policies. Perception engine data 1132 includes maps of the locations of static and dynamic objects of interest, and locator data 1134 includes at least a local pose or position.
The state and event manager 1122 may be configured to determine an operating state of the autonomous vehicle based on the probability. For example, a first operational state (i.e., "canonical operation") may describe a situation in which the trajectory is collision-free, while a second operational state (i.e., "non-canonical operation") may describe another situation in which the confidence level associated with the possible trajectory is insufficient to warrant collision-free travel. According to some examples, state and event manager 1122 is configured to determine whether the state of the autonomous vehicle is normative or non-normative using awareness data 1132. The confidence level generator 1123 may be configured to analyze the perception data 1132 to determine a state of the autonomous vehicle. For example, the confidence level generator 1123 may use semantic information associated with static and dynamic objects and associated probability estimates to enhance the certainty with which the planner 1164 determines safety action guidelines. For example, the planner 1164 may use the awareness engine data 1132 that specifies the probability of the object being a person or not being a person to determine whether the planner 1164 is operating safely (e.g., the planner 1164 may receive a determination that the object has a 98% probability of being a person and the object has a 2% probability of not being a person).
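A minimal sketch of how classification certainty and a collision-free probability might be combined into a confidence level, and how that level could map onto the canonical and non-canonical states, is given below. The combination rule, threshold, and function names are illustrative assumptions, not the disclosed method.

```python
from typing import Iterable

def plan_confidence(object_probabilities: Iterable[float],
                    collision_free_probability: float) -> float:
    """Combine the certainty of object classifications (e.g., P(object is a person) = 0.98)
    with the planner's estimate that a trajectory is collision-free into one score."""
    classification_certainty = 1.0
    for p in object_probabilities:
        classification_certainty *= max(p, 1.0 - p)   # certainty of the more likely label
    return classification_certainty * collision_free_probability

def operational_state(confidence: float, threshold: float = 0.9) -> str:
    """Map the confidence level onto the canonical / non-canonical operating states."""
    return "canonical" if confidence >= threshold else "non-canonical"

conf = plan_confidence([0.98, 0.85], collision_free_probability=0.93)
print(round(conf, 3), operational_state(conf))
```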
When it is determined that the confidence level (e.g., based on statistical and probabilistic determinations) is below a threshold required for predicted safe operation, a relatively low confidence level (e.g., a single probability score) may trigger the planner 1164 to transmit a request 1135 for teleoperation support to the autonomous vehicle service platform 1101. In some cases, telemetry data and a set of candidate trajectories may accompany the request. Examples of telemetry data include sensor data, localization data, perception data, and the like. The teleoperator 1108 may transmit the selected trajectory 1137 to the guide trajectory generator 1126 via the teleoperator computing device 1104. Thus, the selected trajectory 1137 is a trajectory formed with the guidance of the teleoperator. When no state change is confirmed (e.g., a non-canonical state is pending), the guide trajectory generator 1126 transmits data to the trajectory generator 1124, which, in turn, causes the trajectory tracker 1128 (as a trajectory tracking controller) to generate control signals 1170 (e.g., steering angle, speed, etc.) using the trajectory specified by the teleoperator. Note that the planner 1164 may trigger transmission of a request 1135 for teleoperation support before the state transitions to a non-canonical state. In particular, the autonomous vehicle controller and/or its components can predict that a distant obstacle may be problematic and preferably have the planner 1164 invoke teleoperation before the autonomous vehicle reaches the obstacle. Otherwise, the autonomous vehicle may incur a delay by transitioning to a safe state (e.g., pulling over to the side of the road) upon encountering the obstacle or scenario. In another example, teleoperation may be automatically invoked when an autonomous vehicle approaches a particular location that is known to be difficult to navigate. This determination may optionally take into account other factors, including the time of day and the position of the sun, if such a situation is likely to interfere with the reliability of sensor readings, as well as traffic or accident data obtained from various sources.
Fig. 12 is an example of a flow chart for controlling an autonomous vehicle, according to some embodiments. At 1202, flow 1200 begins. Data representing a subset of objects is received at a planner in the autonomous vehicle, the subset of objects including at least one object associated with data representing a degree of certainty of a classification type. For example, the perception engine data may include metadata associated with the object, whereby the metadata specifies a degree of certainty associated with a particular classification type. For example, a dynamic object may be classified as a "young pedestrian" with an 85% confidence level that the classification is correct. At 1204, locator data may be received (e.g., at the planner). The locator data may include map data generated locally within the autonomous vehicle. The local map data may specify a degree of certainty (including uncertainty) that an event may occur within a geographic area. An event may be a condition or situation that affects, or may affect, the operation of the autonomous vehicle. The event may be internal to the autonomous vehicle (e.g., a faulty or damaged sensor) or external to the autonomous vehicle (e.g., a road blockage). Examples of events are described herein, for example, in fig. 2 and in other figures and paragraphs. At 1206, a path coextensive with the geographic area of interest may be determined. For example, consider an event in which the position of the sun in the sky is such that the intensity of sunlight impairs drivers' vision during rush-hour traffic. Thus, it is expected or predicted that traffic may slow in response to the bright sunlight. Accordingly, if an alternate path to avoid the event is unlikely to exist, the planner may preemptively invoke teleoperation. At 1208, a local position is determined at the planner based on local pose data. At 1210, an operating state of the autonomous vehicle may be determined (e.g., probabilistically) based on the degree of certainty of the classification type and the degree of certainty of the event, which may be based on any number of factors, such as speed, position, and other state information. To illustrate, consider an example in which the autonomous vehicle detects a young pedestrian during an event in which the vision of other drivers may be impaired by the sun, thereby placing the young pedestrian in an unsafe situation. Thus, a relatively unsafe situation may be detected as a probabilistically likely event (i.e., an unsafe situation for which teleoperation may be invoked). At 1212, a likelihood that the operating state is a canonical state is determined, and based on the determination, a message is transmitted to a teleoperator computing device requesting teleoperation to preemptively transition to a next operating state (e.g., a preemptive transition from the canonical operating state to a non-canonical operating state, such as an unsafe operating state).
Fig. 13 depicts an example in which a planner may generate a trajectory, according to some examples. Diagram 1300 includes a trajectory estimator 1320 and a trajectory generator 1324. The trajectory estimator 1320 includes a confidence level generator 1322 and a teleoperator query messenger 1329. As shown, the trajectory estimator 1320 is coupled to the perception engine 1366 to receive static map data 1301, as well as current and predicted object state data 1303. The trajectory estimator 1320 also receives local pose data 1305 from the localizer 1368 and planning data 1307 from the global planner 1369. In one operational state (e.g., non-canonical), the confidence level generator 1322 receives the static map data 1301 as well as the current and predicted object state data 1303. Based on this data, the confidence level generator 1322 may determine that the detected trajectories are associated with unacceptable confidence level values. The confidence level generator 1322 then transmits the detected trajectory data 1309 (e.g., data comprising candidate trajectories) to notify the teleoperator via the teleoperator query messenger 1329, which in turn transmits a request 1370 for teleoperator assistance.
In another operational state (e.g., a normative state), the static map data 1301, the current and predicted object state data 1303, the local pose data 1305, and the planning data 1307 (e.g., global planning data) are received into a trajectory calculator 1325, the trajectory calculator 1325 being configured to calculate trajectories (e.g., iteratively) to determine an optimal path or paths. At least one path is then selected and transmitted as selected path data 1311. According to some embodiments, the trajectory calculator 1325 is configured to implement re-planning of trajectories, as one example. The nominal driving trajectory generator 1327 is configured to generate trajectories in a refined manner, for example by generating trajectories based on a receding horizon (rolling time-domain) control technique. The nominal driving trajectory generator 1327 may then transmit the nominal driving trajectory path data 1372 to a trajectory tracker or vehicle controller to effect, for example, steering, acceleration, and other physical changes to components.
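The receding horizon (rolling time-domain) idea mentioned above, in which a trajectory is re-planned at each step but only the first planned state is applied, can be illustrated in one dimension as follows; the toy planner and its gain are assumptions made purely for illustration.

```python
from typing import List

def plan_horizon(position: float, target: float, horizon_steps: int) -> List[float]:
    """Very simplified 1-D plan over a short horizon: move a fraction of the
    remaining distance at each step (a stand-in for an optimization-based planner)."""
    trajectory = []
    x = position
    for _ in range(horizon_steps):
        x += 0.2 * (target - x)
        trajectory.append(x)
    return trajectory

def receding_horizon_drive(start: float, target: float,
                           steps: int = 20, horizon_steps: int = 5) -> List[float]:
    """Re-plan over the horizon at every step but apply only the first planned
    state, as in receding-horizon (rolling time-domain) control."""
    x = start
    history = [x]
    for _ in range(steps):
        trajectory = plan_horizon(x, target, horizon_steps)
        x = trajectory[0]          # apply only the first planned state, then re-plan
        history.append(x)
    return history

path = receding_horizon_drive(start=0.0, target=10.0)
print([round(p, 2) for p in path[:6]])
```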
FIG. 14 is a diagram depicting another example of an autonomous vehicle service platform, according to some embodiments. Diagram 1400 depicts an autonomous vehicle service platform 1401, the autonomous vehicle service platform 1401 including a teleoperator manager 1407, the teleoperator manager 1407 configured to manage interactions and/or communications among a teleoperator 1408, a teleoperator computing device 1404, and other components of the autonomous vehicle service platform 1401. With further reference to diagram 1400, the autonomous vehicle service platform 1401 includes a simulator 1440, a repository 1441, a policy manager 1442, a reference data updater 1438, a 2D map data repository 1420, a 3D map data repository 1422, and a route data repository 1424. Other map data, such as 4D map data (e.g., including a time determination), may be implemented and stored in a repository (not shown).
Teleoperator action recommendation controller 1412 includes a processor configured to receive and/or control teleoperation service requests via autonomous vehicle ("AV") planner data 1472, which may include requests for teleoperator assistance as well as telemetry data and other data. Thus, the autonomous vehicle planner data 1472 may include recommended candidate trajectories or paths from which the teleoperator 1408 may select via the teleoperator computing device 1404. According to some examples, teleoperator action recommendation controller 1412 may be configured to access other sources of recommended candidate trajectories from which an optimal trajectory may be selected. For example, candidate trajectories contained in the autonomous vehicle planner data 1472 may be introduced in parallel to the simulator 1440, the simulator 1440 being configured to simulate the event or condition experienced by the autonomous vehicle requesting teleoperator assistance. The simulator 1440 may access map data and other data needed to perform a simulation on the set of candidate trajectories, and thus the simulator 1440 need not exhaustively repeat simulations to confirm sufficiency. Rather, the simulator 1440 may either provide confirmation of the appropriateness of the candidate trajectories or may alert the teleoperator to be cautious in selecting a trajectory.
Teleoperator interaction capture analyzer 1416 may be configured to capture a large number of teleoperator transactions or interactions for storage in repository 1441, e.g., repository 1441 may accumulate data related to a plurality of teleoperator transactions for analysis and generation of policies, at least in some cases. According to some embodiments, repository 1441 may also be configured to store policy data for access by policy manager 1442. Additionally, teleoperator interaction capture analyzer 1416 may apply machine learning techniques to empirically determine how best to respond to events or conditions that result in a request for teleoperator assistance. In some cases, policy manager 1442 may be configured to update a particular policy or generate a new policy in response to analysis of a large set of teleoperator interactions (e.g., after applying machine learning techniques). Policy manager 1442 manages policies that may be considered rules or guidelines with which the autonomous vehicle controller and its components operate to comply during autonomous operation of the vehicle. In some cases, modified or updated policies may be applied to the simulator 1440 to confirm the effectiveness of such policy changes before they are permanently published or implemented.
Simulator interface controller 1414 is configured to provide an interface between simulator 1440 and teleoperator computing device 1404. For example, consider that sensor data from a fleet of autonomous vehicles is applied to reference data updater 1438 via autonomous vehicle ("AV") fleet data 1470, whereby reference data updater 1438 is configured to generate updated map and route data 1439. In some implementations, the updated map and route data 1439 may be initially published as an update to data in the map data repositories 1420 and 1422, or as an update to data in the route data repository 1424. In this case, such data may be labeled as a "beta version," for which a lower threshold for requesting teleoperation services may be implemented, for example, when an autonomous vehicle uses map tiles that include preliminary updated information. Additionally, the updated map and route data 1439 may be introduced to the simulator 1440 to verify the updated map data. When fully released (e.g., at the close of beta testing), the previously lowered threshold for requesting teleoperator service associated with those map tiles is cancelled. User interface graphics controller 1410 provides rich graphics to teleoperator 1408 so that a fleet of autonomous vehicles may be simulated within simulator 1440 and accessed via teleoperator computing device 1404 as if the simulated fleet of autonomous vehicles were real.
FIG. 15 is an example of a flow chart for controlling an autonomous vehicle, according to some embodiments. At 1502, the flow 1500 begins. Message data may be received at a teleoperator computing device to manage a fleet of autonomous vehicles. In the context of a planned path for an autonomous vehicle, the message data may indicate event attributes associated with a non-normative operating state. For example, an event may be characterized as a problem at a particular intersection, for example, due to a large number of pedestrians hurriedly crossing the road against the traffic light. The event attributes describe characteristics of the event, such as the number of pedestrians crossing the road, traffic delay due to an increased number of pedestrians, and the like. At 1504, a teleoperation repository may be accessed to retrieve a first subset of recommendations based on simulated operation of aggregated data associated with a set of autonomous vehicles. In this case, the simulator may be a source of recommendations that the teleoperator may implement. Additionally, the teleoperation repository may also be accessed in response to similar event attributes to retrieve a second subset of recommendations based on an aggregation of teleoperator interactions. In particular, the teleoperator interaction capture analyzer may apply machine learning techniques to empirically determine how best to respond to events with similar attributes based on previous requests for teleoperation assistance. At 1506, the first and second subsets of recommendations are combined to form a set of recommended courses of action for the autonomous vehicle. At 1508, a representation of the set of recommended courses of action may be visually presented on a display of the teleoperator computing device. At 1510, a data signal can be detected that represents a selection of a recommended course of action (e.g., by the teleoperator).
Fig. 16 is a diagram of an example of an autonomous vehicle fleet manager implementing a fleet optimization manager, according to some examples. Diagram 1600 depicts an autonomous vehicle fleet manager configured to manage a fleet of autonomous vehicles 1630 traveling within a road network 1650. Autonomous vehicle fleet manager 1603 is coupled to teleoperators 1608 via teleoperator computing devices 1604 and is also coupled to a fleet management data repository 1646. Autonomous vehicle fleet manager 1603 is configured to receive policy data 1602 and environmental data 1606, as well as other data. With further reference to the diagram 1600, the fleet optimization manager 1620 is shown to include a travel request processor 1631, which travel request processor 1631 in turn includes a fleet data extractor 1632 and an autonomous vehicle dispatch optimization calculator 1634. The travel request processor 1631 is configured to process travel requests, such as travel requests from a user 1688 requesting autonomous vehicle service. The fleet data extractor 1632 is configured to extract data related to autonomous vehicles in the fleet. Data associated with each autonomous vehicle is stored in the repository 1646. For example, the data for each vehicle may describe maintenance issues, scheduled service calls, daily usage, battery charge and discharge rates, and any other data that may be updated in real time and may be used to optimize the fleet of autonomous vehicles to minimize downtime. The autonomous vehicle dispatch optimization calculator 1634 is configured to analyze the extracted data and calculate an optimal use of the fleet to ensure that the next vehicle dispatched (e.g., from the station 1652) provides the least travel time and/or total cost of autonomous vehicle service.
The fleet optimization manager 1620 is depicted as including a hybrid autonomous vehicle/non-autonomous vehicle processor 1640, which in turn includes an AV/non-AV optimization calculator 1642 and a non-AV selector 1644. According to some examples, hybrid autonomous vehicle/non-autonomous vehicle processor 1640 is configured to manage a hybrid fleet of autonomous vehicles and human-driven vehicles (e.g., driven by independent contractors). The autonomous vehicle service may then use the non-autonomous vehicles to meet excess demand, or use the non-autonomous vehicles in areas that may be beyond a geographically bounded region (e.g., non-AV service area 1690), or in areas with poor communication coverage. The AV/non-AV optimization calculator 1642 is configured to optimize usage of the autonomous fleet and to invite non-AV drivers to participate in the transportation service (e.g., with little or no detriment to the autonomous vehicle service). The non-AV selector 1644 includes logic for selecting a number of non-AV drivers to assist based on the calculations obtained by the AV/non-AV optimization calculator 1642.
FIG. 17 is an example of a flow chart for managing a fleet of autonomous vehicles, according to some embodiments. At 1702, the flow 1700 begins. At 1702, policy data is received. The policy data may include parameters that define how best to select an autonomous vehicle to service a travel request. At 1704, fleet management data may be extracted from a repository. The fleet management data includes subsets of data for a pool of autonomous vehicles (e.g., data describing the readiness of vehicles to service a transportation request). At 1706, data representing a travel request is received. For exemplary purposes, the travel request may be for transportation from a first geographic location to a second geographic location. At 1708, attributes based on the policy data are computed to determine a subset of autonomous vehicles available to service the request. For example, the attributes may include battery charge level and time until the next scheduled maintenance. At 1710, an autonomous vehicle is selected for transportation from the first geographic location to the second geographic location, and data is generated for dispatching the autonomous vehicle to a third geographic location associated with the origin of the travel request.
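The following sketch illustrates, under assumed field names and thresholds, the attribute-based filtering and selection suggested at 1708 through 1710; the straight-line distance stands in for a travel-time estimate and is not part of the disclosure.

```python
# A sketch of the attribute-based selection at 1708-1710; field names, the
# policy thresholds, and the straight-line travel-time proxy are assumptions.
from dataclasses import dataclass
import math


@dataclass
class FleetVehicle:
    vehicle_id: str
    location: tuple            # (lat, lon) in degrees
    battery_pct: float
    hours_to_maintenance: float


def eligible(v: FleetVehicle, min_battery=30.0, min_hours=4.0) -> bool:
    """Policy attributes: enough charge and enough time before maintenance."""
    return v.battery_pct >= min_battery and v.hours_to_maintenance >= min_hours


def select_vehicle(fleet, pickup, min_battery=30.0, min_hours=4.0):
    """Pick the eligible vehicle closest to the pickup (travel-time proxy)."""
    def dist(v):
        return math.hypot(v.location[0] - pickup[0], v.location[1] - pickup[1])
    candidates = [v for v in fleet if eligible(v, min_battery, min_hours)]
    return min(candidates, key=dist, default=None)
```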
FIG. 18 is a diagram illustrating an autonomous vehicle fleet manager implementing an autonomous vehicle communication link manager, according to some embodiments. Diagram 1800 depicts an autonomous vehicle fleet manager configured to manage a fleet of autonomous vehicles 1830 traveling within a road network 1850, the road network 1850 overlapping an area of degraded communication identified as a "communication-impaired area" 1880. The autonomous vehicle fleet manager 1803 is coupled to a teleoperator 1808 via a teleoperator computing device 1804. The autonomous vehicle fleet manager 1803 is configured to receive policy data 1802 and environmental data 1806, as well as other data. With further reference to diagram 1800, autonomous vehicle communication link manager 1820 is shown to include an environmental event detector 1831, a policy adaptation determiner 1832, and a travel request processor 1834. The environmental event detector 1831 is configured to receive environmental data 1806, the environmental data 1806 specifying changes within the environment in which autonomous vehicle service is implemented. For example, environmental data 1806 may specify that area 1880 has degraded communication services, which may affect autonomous vehicle service. The policy adaptation determiner 1832 may specify parameters to be applied when a travel request is received during such an event (e.g., during a loss of communication). The travel request processor 1834 is configured to process travel requests in view of the degraded communication. In this example, user 1888 is requesting autonomous vehicle service. In addition, the travel request processor 1834 includes logic to apply an adjusted policy for modifying the manner in which autonomous vehicles are dispatched, so as to avoid complications due to poor communication.
The communication event detector 1840 includes a policy download manager 1842 and a communication-configured ("COMM-configured") AV dispatcher 1844. The policy download manager 1842 is configured to provide an updated policy to the autonomous vehicles 1830 in view of the communication-impaired area 1880, whereby the updated policy may specify a route to quickly exit the area if an autonomous vehicle enters area 1880. For example, at a time prior to entering area 1880, autonomous vehicle 1864 may receive an updated policy. When communication is lost, the autonomous vehicle 1864 implements the updated policy and selects route 1866 to quickly exit area 1880. The COMM-configured AV dispatcher 1844 may be configured to identify a point 1865 at which to park an autonomous vehicle, the point 1865 configured to serve as a relay station to establish a peer-to-peer network over the area 1880. Accordingly, COMM-configured AV dispatcher 1844 is configured to dispatch an autonomous vehicle 1862 (without passengers) to park at location 1865 for operation as a communication tower in a peer-to-peer mobile ad hoc network.
Fig. 19 is an example of a flow chart for determining actions of autonomous vehicles during an event (e.g., degraded or lost communication), according to some embodiments. At 1901, flow 1900 begins, and policy data is received that defines parameters to be applied to travel requests in a geographic area during an event. At 1902, one or more of the following actions may be implemented: (1) dispatch a subset of autonomous vehicles to geographic locations in a portion of the geographic area, the subset of autonomous vehicles configured either to park at particular geographic locations and each act as a static communication relay, or to travel in the geographic area and each act as a mobile communication relay, (2) conduct peer-to-peer communication among portions of the pool of autonomous vehicles associated with the portion of the geographic area, (3) provide the autonomous vehicles with an event policy describing routes for exiting the portion of the geographic area during the event, (4) invoke teleoperation, and (5) recalculate paths to avoid the affected geographic portion. After the actions are performed, the fleet of autonomous vehicles is monitored at 1914.
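A purely illustrative sketch of turning such an event policy into concrete dispatch actions follows; the event and vehicle dictionaries, the relay-spacing heuristic, and the action names are assumptions that merely mirror items (1), (3), and (5) above.

```python
# Hypothetical sketch of applying an event policy during a communication-
# degraded event; dictionary keys, thresholds, and action names are
# illustrative assumptions, not part of the disclosure.
def plan_comm_event_actions(event, fleet, relay_spacing_km=1.0):
    actions = []
    if event["type"] == "comm_degraded":
        # (1) park a subset of passenger-free vehicles as static communication relays
        n_relays = max(1, int(event["area_km2"] ** 0.5 / relay_spacing_km))
        idle = [v for v in fleet if not v["has_passengers"]]
        relays = idle[:n_relays]
        actions += [("park_as_relay", v["id"]) for v in relays]
        # (3) push an exit-route policy to vehicles already inside the area
        actions += [("send_exit_route_policy", v["id"]) for v in fleet if v["in_area"]]
        # (5) re-route the remaining vehicles around the affected geographic portion
        actions += [("recalculate_path", v["id"]) for v in fleet
                    if not v["in_area"] and v not in relays]
    return actions
```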
FIG. 20 is a diagram depicting an example of a localizer, according to some embodiments. Diagram 2000 includes a localizer 2068, the localizer 2068 configured to receive sensor data, such as lidar data 2072, camera data 2074, radar data 2076, and other data 2078, from sensors 2070. In addition, the localizer 2068 is configured to receive reference data 2020, such as 2D map data 2022, 3D map data 2024, and 3D local map data. Other map data, such as 4D map data 2025 and semantic map data (not shown), including corresponding data structures and repositories, may also be implemented according to some examples. With further reference to diagram 2000, the localizer 2068 includes a positioning system 2010 and a localization system 2012, both configured to receive the reference data 2020 and the sensor data from the sensors 2070. The localization data integrator 2014 is configured to receive data from the positioning system 2010 and to receive data from the localization system 2012, whereby the localization data integrator 2014 is configured to integrate or fuse sensor data from the plurality of sensors to form the local pose data 2052.
Fig. 21 is an example of a flow chart for generating local pose data based on integrated sensor data, according to some embodiments. At 2101, flow 2100 begins. At 2102, reference data is received, the reference data including three-dimensional map data. In some examples, the reference data, such as 3D or 4D map data, may be received via one or more networks. At 2104, localization data from one or more localization sensors is received into a localization system. At 2106, positioning data from one or more positioning sensors is received into a positioning system. At 2108, the localization data and the positioning data are integrated. At 2110, the integrated localization and positioning data form local position data that specifies a geographic location of the autonomous vehicle.
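As a minimal illustration of the integration step at 2108 and 2110, the sketch below fuses a positioning fix with a localization estimate by inverse-variance weighting; the two-dimensional state and the variance values are assumptions for this example only.

```python
# Minimal sketch of the integration at 2108-2110: fuse a GPS-style position
# fix with a map-relative localization estimate by inverse-variance weighting.
def fuse_position(positioning_xy, positioning_var, localization_xy, localization_var):
    """Return the fused (x, y) local position and its variance."""
    w_p = 1.0 / positioning_var          # weight of the positioning estimate
    w_l = 1.0 / localization_var         # weight of the localization estimate
    fused = tuple((w_p * p + w_l * l) / (w_p + w_l)
                  for p, l in zip(positioning_xy, localization_xy))
    return fused, 1.0 / (w_p + w_l)


# Example: a noisy positioning fix (variance 4.0) and a tighter localization
# estimate (variance 1.0) yield a fused local position closer to the latter.
local_pose, pose_var = fuse_position((10.2, 4.1), 4.0, (10.6, 4.3), 1.0)
```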
FIG. 22 is a diagram depicting another example of a localizer, according to some embodiments. Diagram 2200 includes a localizer 2268, which in turn includes a localization system 2210 and a relative localization system 2212 to generate localization-based data 2250 and relative localization-based data 2251, respectively. The localization system 2210 includes a projection processor 2254a for processing GPS data 2273, GPS fiducial points 2211, and 3D map data 2222, as well as other optional data (e.g., 4D map data). The localization system 2210 also includes a ranging processor 2254b to process wheel data 2275 (e.g., wheel speed), vehicle model data 2213, and 3D map data 2222, as well as other optional data. Further, the localization system 2210 includes an integrator processor 2254c to process IMU data 2257, vehicle model data 2215, and 3D map data 2222, as well as other optional data. Similarly, the relative localization system 2212 includes a lidar localization processor 2254d for processing lidar data 2272, 2D tile map data 2220, 3D map data 2222, and 3D local map data 2223, as well as other optional data. The relative localization system 2212 also includes a visual registration processor 2254e to process the camera data 2274, the 3D map data 2222, and the 3D local map data 2223, as well as other optional data. Further, the relative localization system 2212 includes a radar echo processor 2254f to process radar data 2276, 3D map data 2222, and 3D local map data 2223, as well as other optional data. Note that in various examples, other types of sensor data and sensors or processors may be implemented, such as sonar data and the like.
With further reference to diagram 2200, the localization-based data 2250 and the relative localization-based data 2251 may be fed to a data integrator 2266a and a localization data integrator 2266, respectively. The data integrator 2266a and the localization data integrator 2266 may be configured to fuse the respective data, whereby the localization-based data 2250 may be fused at the data integrator 2266a before being fused with the relative localization-based data 2251 at the localization data integrator 2266. According to some embodiments, the data integrator 2266a is formed as part of the localization data integrator 2266, or is absent. In any event, the localization-based data 2250 and the relative localization-based data 2251 may be fed into the localization data integrator 2266 for use in fusing the data to generate local position data 2252. The localization-based data 2250 may include unary constraint data (and uncertainty values) from the projection processor 2254a, and binary constraint data (and uncertainty values) from the ranging processor 2254b and the integrator processor 2254c. The relative localization-based data 2251 may include unary constraint data (and uncertainty values) from the lidar localization processor 2254d and the visual registration processor 2254e, and optionally from the radar echo processor 2254f. According to some embodiments, the localization data integrator 2266 may implement a non-linear smoothing function, such as a Kalman filter (e.g., a gated Kalman filter), relative bundle adjustment, pose-graph relaxation, a particle filter, a histogram filter, or the like.
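The following one-dimensional sketch illustrates how binary (e.g., odometry) and unary (e.g., lidar localization) constraints might be fused with a Kalman-style predict/update; a production localizer would operate on full poses or a pose graph, so this is only a conceptual example with assumed noise values.

```python
# Compact 1-D illustration of fusing a binary (odometry) constraint with a
# unary (map-relative) constraint via a Kalman-style predict/update step.
def predict(x, var, delta, delta_var):
    """Binary constraint: propagate the state by an odometry increment."""
    return x + delta, var + delta_var


def update(x, var, z, z_var):
    """Unary constraint: correct the state with an absolute measurement."""
    k = var / (var + z_var)               # Kalman gain
    return x + k * (z - x), (1.0 - k) * var


x, var = 0.0, 1.0
x, var = predict(x, var, delta=1.2, delta_var=0.25)   # e.g., wheel odometry
x, var = update(x, var, z=1.05, z_var=0.1)            # e.g., lidar localization
```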
FIG. 23 is a diagram depicting an example of a perception engine, according to some embodiments. Diagram 2300 includes a perception engine 2366, which perception engine 2366 in turn includes a segmentation processor 2310, an object tracker 2330, and a classifier 2360. Additionally, perception engine 2366 is configured to receive local position data 2352, lidar data 2372, camera data 2374, and radar data 2376, for example. Note that other sensor data, such as sonar data, may be accessed to provide the functionality of the perception engine 2366. The segmentation processor 2310 is configured to extract ground plane data and/or portions of segmented images to distinguish objects from each other and from static imagery (e.g., background). In some cases, 3D blobs may be segmented to distinguish them from each other. In some examples, a blob may refer to a set of features that identify an object in a spatially reproduced environment, and may be composed of elements (e.g., pixels of camera data, points of laser echo data, etc.) having similar characteristics (e.g., intensity and color). In some examples, a blob may also refer to a point cloud (e.g., composed of colored laser echo data) or other elements that make up an object. Object tracker 2330 is configured to perform frame-to-frame estimation of the motion of blobs or other segmented image portions. In addition, data association is used to associate a blob at one location in a first frame at time t1 with a blob at a different location in a second frame at time t2. In some examples, object tracker 2330 is configured to perform real-time probabilistic tracking of 3D objects (e.g., blobs). The classifier 2360 is configured to identify an object and classify the object by classification type (e.g., pedestrian, cyclist, etc.) and by energy/activity (e.g., whether the object is dynamic or static), whereby the data representing the classification is described by a semantic label. According to some embodiments, probabilistic estimation of object categories may be performed, e.g., classifying an object as a vehicle, cyclist, pedestrian, etc., each object category having a different confidence. The perception engine 2366 is configured to determine perception engine data 2354, which perception engine data 2354 may include static object maps and/or dynamic object maps, as well as semantic information such that, for example, a planner may use the information to enhance path planning. According to various examples, one or more of segmentation processor 2310, object tracker 2330, and classifier 2360 may apply machine learning techniques to generate perception engine data 2354.
Fig. 24 is an example of a flow diagram for generating perception engine data, according to some embodiments. Flowchart 2400 begins at 2402, at which data representing a local position of an autonomous vehicle is retrieved. At 2404, localization data is received from one or more localization sensors, and at 2406, features of the environment in which the autonomous vehicle is disposed are segmented and labeled to form segmented objects. At 2408, one or more portions of the segmented objects are spatially tracked to form at least one tracked object having motion (e.g., estimated motion). At 2410, at least the tracked object is classified as a static object or a dynamic object. In some cases, a static object or a dynamic object may be associated with a classification type. At 2412, data identifying the classified object is generated. For example, the data identifying the classified object may include semantic information.
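A minimal sketch of the flow at 2406 through 2412 is given below, attaching a static/dynamic label and semantic information to tracked objects; the 0.5 m/s speed threshold and the dictionary layout are assumptions made only for this illustration.

```python
# Hedged sketch of 2406-2412: label each tracked, segmented object as static
# or dynamic and attach semantic information; thresholds are assumptions.
def perceive(segmented_objects, tracked_speeds, speed_threshold_mps=0.5):
    """Return perception-engine-style records for the planner."""
    perception_data = []
    for obj_id, obj in segmented_objects.items():
        speed = tracked_speeds.get(obj_id, 0.0)        # estimated motion (2408)
        perception_data.append({
            "id": obj_id,
            "classification": obj.get("class", "unknown"),   # e.g., pedestrian
            "dynamic": speed > speed_threshold_mps,           # 2410
            "estimated_speed_mps": speed,                     # semantic info (2412)
        })
    return perception_data
```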
FIG. 25 is an example of a segmentation processor, according to some embodiments. Diagram 2500 depicts a segmentation processor 2510, where segmentation processor 2510 receives lidar data from one or more lidars 2572 and camera image data from one or more cameras 2574. Local pose data 2552, the lidar data, and the camera image data are received into a meta-rotation generator 2521. In some examples, the meta-rotation generator is configured to divide the image into distinguishable regions (e.g., clusters or groups of points of a point cloud) based on various attributes (e.g., color, intensity, etc.), at least two or more of which may be updated at or about the same time. Object segmentation and ground segmentation are performed at segmentation processor 2523 using meta-rotation data 2522, whereby both the meta-rotation data 2522 and data relating to the segmentation from the segmentation processor 2523 are applied to a scan difference processor 2513. The scan difference processor 2513 is configured to predict motion and/or relative velocity of the segmented image portions, which may be used to identify dynamic objects at 2517. Data indicative of objects with detected velocity at 2517 is optionally transmitted to the planner to enhance planning decisions. In addition, data from the scan difference processor 2513 may be used to approximate the positions of objects to form a map of such objects (and optionally identify their level of motion). In some examples, an occupancy grid map 2515 may be generated. Data representing the occupancy grid map 2515 may be transmitted to the planner to further enhance path planning decisions (e.g., by reducing uncertainty). With further reference to diagram 2500, the image camera data from the one or more cameras 2574 is used to classify blobs in a blob classifier 2520, which blob classifier 2520 also receives blob data 2524 from the segmentation processor 2523. The segmentation processor 2510 may also receive raw radar echo data 2512 from one or more radars 2576 to perform segmentation at a radar segmentation processor 2514, the radar segmentation processor 2514 generating radar-related blob data 2516. With further reference to fig. 25, the segmentation processor 2510 may also receive and/or generate tracked blob data 2518 that correlates to the radar data. The blob data 2516, the tracked blob data 2518, the data from the blob classifier 2520, and the blob data 2524 may be used to track objects or portions thereof. According to some examples, one or more of the following may be optional: the scan difference processor 2513, the blob classifier 2520, and data from the radar 2576.
Fig. 26A is a diagram depicting an example of an object tracker and classifier, according to various embodiments. The object tracker 2630 of diagram 2600 is configured to receive blob data 2516, tracked blob data 2518, data from the blob classifier 2520, blob data 2524, and camera image data from one or more cameras 2676. The image tracker 2633 is configured to receive the camera image data from the one or more cameras 2676 to generate tracked image data, which in turn may be provided to the data association processor 2632. As shown, the data association processor 2632 is configured to receive the blob data 2516, the tracked blob data 2518, the data from the blob classifier 2520, the blob data 2524, and the tracked image data from the image tracker 2633, and the data association processor 2632 is further configured to identify one or more associations between the data types described above. For example, the data association processor 2632 is configured to track various blob data from one frame to the next, e.g., to estimate motion and the like. Additionally, the tracking updater 2634 may update one or more tracked objects using data generated by the data association processor 2632. In some examples, the tracking updater 2634 may implement a Kalman filter or the like to form updated data for the tracked objects, which may be stored in the tracking database ("DB") 2636. Feedback data may be exchanged via a path 2699 between the data association processor 2632 and the tracking database 2636. In some examples, the image tracker 2633 is optional and may be excluded. The object tracker 2630 may also use other sensor data, such as radar or sonar, as well as any other type of sensor data.
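For illustration, the sketch below performs the kind of frame-to-frame data association described for data association processor 2632, matching blobs at time t2 to tracks from time t1 by nearest centroid within a gating distance; the gate value and centroid representation are assumptions.

```python
# Sketch of frame-to-frame data association by nearest centroid with a gating
# distance; the gate value and 2D centroid representation are assumptions.
import math


def associate(prev_tracks, blobs, gate_m=2.0):
    """Match each blob at time t2 to the nearest track from time t1."""
    matches, new_blobs = {}, []
    for blob_id, (bx, by) in blobs.items():
        best, best_d = None, gate_m
        for track_id, (tx, ty) in prev_tracks.items():
            d = math.hypot(bx - tx, by - ty)
            if d < best_d:
                best, best_d = track_id, d
        if best is None:
            new_blobs.append(blob_id)      # spawns a new track
        else:
            matches[blob_id] = best        # updates an existing track
    return matches, new_blobs


tracks_t1 = {"t1": (0.0, 0.0), "t2": (5.0, 1.0)}
blobs_t2 = {"b1": (0.4, 0.1), "b2": (9.0, 9.0)}
print(associate(tracks_t1, blobs_t2))      # b1 -> t1, b2 starts a new track
```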
FIG. 26B is a diagram depicting another example of an object tracker, in accordance with at least some examples. Diagram 2601 includes an object tracker 2631, which may include the same structures and/or functions as similarly-named elements described in connection with one or more other diagrams (e.g., fig. 26A). As shown, the object tracker 2631 includes an optimized registration portion 2699 that includes a processor 2696, the processor 2696 configured to perform object scan registration and data fusion. The processor 2696 is also configured to store the resulting data in a 3D object database 2698.
Referring back to fig. 26A, diagram 2600 also includes a classifier 2660, which classifier 2660 may include a tracking classification engine 2662 for generating static obstacle data 2672 and dynamic obstacle data 2674, both of which may be transmitted to the planner for path planning purposes. In at least one example, the tracking classification engine 2662 is configured to determine whether an obstacle is static or dynamic, as well as another classification type of the object (e.g., whether the object is a vehicle, pedestrian, tree, cyclist, dog, cat, paper bag, etc.). Static obstacle data 2672 may be formed as part of an obstacle map (e.g., a 2D occupancy map), and dynamic obstacle data 2674 may be formed to include bounding boxes together with data indicating velocity and classification type. Dynamic obstacle data 2674 includes, at least in some cases, 2D dynamic obstacle map data.
Fig. 27 is an example of a front-end processor for a perception engine, according to some examples. Diagram 2700 includes a ground segmentation processor 2723a for performing ground segmentation, and an over-segmentation processor 2723b for performing "over-segmentation," according to various examples. Processors 2723a and 2723b are configured to optionally receive colored lidar data 2775. The over-segmentation processor 2723b generates data 2710 of a first blob type (e.g., relatively small blobs), which is provided to an aggregate classification and segmentation engine 2712, which aggregate classification and segmentation engine 2712 generates data 2714 of a second blob type. Data 2714 is provided to data association processor 2732, the data association processor 2732 being configured to detect whether data 2714 resides in tracking database 2736. A determination is made at 2740 as to whether the data 2714 of the second blob type (e.g., a relatively large blob, which may include one or more smaller blobs) represents a new track. If so, a track is initiated at 2742; if not, the tracked object data is stored in tracking database 2736 and a tracking updater 2742 may extend or update the track. The tracking classification engine 2762 is coupled to the tracking database 2736, for example, to identify and update/modify tracks by adding, removing, or modifying track-related data.
FIG. 28 is a diagram depicting a simulator configured to simulate an autonomous vehicle in a synthetic environment, in accordance with various embodiments. Diagram 2800 includes a simulator 2840 configured to generate a simulated environment 2803. As shown, simulator 2840 is configured to use reference data 2822 (e.g., 3D map data and/or other map or route data, including RNDF data or similar road network data) to generate simulated geometries, such as simulated surfaces 2892a and 2892b, within simulated environment 2803. Simulated surfaces 2892a and 2892b may simulate walls or front sides of buildings adjacent to a roadway. Simulator 2840 may also simulate dynamic agents in the synthetic environment using pre-generated or programmatically generated dynamic object data 2825. An example of a dynamic agent is simulated dynamic object 2801, the simulated dynamic object 2801 representing a simulated cyclist having a velocity. The simulated dynamic agents may optionally respond to other static and dynamic agents in the simulated environment, including the simulated autonomous vehicle. For example, the simulated object 2801 may slow down because of other obstacles in the simulated environment 2803, rather than following a preset trajectory, thereby creating a more realistic simulation of the actual dynamic environment that exists in the real world.
FIG. 29 is an example of a flow chart for simulating various aspects of an autonomous vehicle, according to some embodiments. Flowchart 2900 begins at 2902, at which reference data including three-dimensional map data is received into a simulator. At 2904, dynamic object data defining motion patterns for classified objects may be retrieved. At 2906, a simulated environment is formed based at least on the three-dimensional ("3D") map data and the dynamic object data. The simulated environment may include one or more simulated surfaces. At 2908, a simulation of an autonomous vehicle is conducted, which includes a simulated autonomous vehicle controller that forms part of the simulated environment. The autonomous vehicle controller may include a simulated perception engine configured to receive sensor data, and a simulated localizer. At 2910, simulated sensor data is generated based on data returned for at least one simulated sensor, and at 2912, simulated vehicle commands are generated to move (e.g., via vectored propulsion) the simulated autonomous vehicle in the synthetic environment. At 2914, the simulated vehicle commands are evaluated to determine whether the behavior of the simulated autonomous vehicle is consistent with a desired behavior (e.g., consistent with a policy).
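A hedged sketch of the evaluation at 2914 is given below, checking simulated vehicle commands against a policy envelope; the command fields and acceleration limits are illustrative assumptions rather than values from the disclosure.

```python
# Illustrative check corresponding to 2912-2914: evaluate whether simulated
# vehicle commands stay within an assumed policy envelope.
def evaluate_commands(simulated_commands, max_accel=2.5, max_lat_accel=3.0):
    """Return (consistent_with_policy, violations) for a command sequence."""
    violations = []
    for step, cmd in enumerate(simulated_commands):
        if abs(cmd["accel_mps2"]) > max_accel:
            violations.append((step, "longitudinal acceleration"))
        if abs(cmd["lateral_accel_mps2"]) > max_lat_accel:
            violations.append((step, "lateral acceleration"))
    return len(violations) == 0, violations


commands = [{"accel_mps2": 1.2, "lateral_accel_mps2": 0.8},
            {"accel_mps2": 3.4, "lateral_accel_mps2": 0.5}]   # second step too harsh
print(evaluate_commands(commands))
```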
Fig. 30 is an example of a flow chart for generating map data, according to some embodiments. Flow 3000 begins at 3002, at which trajectory data is retrieved. The trajectory data may include trajectories (e.g., recorded trajectories) captured over a period of time. At 3004, at least localization data can be received. The localization data (e.g., recorded localization data) can be captured over a period of time. At 3006, a camera or other image sensor may be implemented to generate a subset of the localization data; thus, the retrieved localization data may include image data. At 3008, subsets of the localization data are aligned to identify a global position (e.g., a global pose). At 3010, three-dimensional ("3D") map data is generated based on the global position, and at 3012, the three-dimensional map data is made available for implementation by, for example, a manual route data editor (e.g., including a manual road network data editor, such as an RNDF editor), an automatic route data generator (e.g., including an automatic road network generator, such as an automatic RNDF generator), a fleet of autonomous vehicles, a simulator, a teleoperator computing device, and any other component of an autonomous vehicle service.
FIG. 31 is a diagram depicting the architecture of a mapping engine, according to some embodiments. Diagram 3100 includes a 3D mapping engine 3110 configured to receive trajectory record data 3140, lidar record data 3172, camera record data 3174, radar record data 3176, and other optionally recorded sensor data (not shown). The logic 3141 includes a loop closure detector 3150, the loop closure detector 3150 configured to detect whether the sensor data indicates that a nearby point in space has been previously visited, and the like. The logic 3141 also includes an alignment controller 3152, the alignment controller 3152 to align map data, in some cases including 3D map data, relative to one or more alignment points. Additionally, logic 3141 provides data 3142 representing the state of loop closure for use by global pose graph generator 3143, the global pose graph generator 3143 configured to generate pose graph data 3145. In some examples, pose graph data 3145 may also be generated based on data from registration refinement module 3146. Logic 3144 includes a 3D mapper 3154 and a lidar self-calibration unit 3156. Additionally, logic 3144 receives sensor data and pose graph data 3145 to generate 3D map data 3120 (or other map data, such as 4D map data). In some examples, the logic 3144 may implement a truncated signed distance function ("TSDF") to fuse the sensor data and/or map data to form an optimal three-dimensional map. Additionally, the logic 3144 is configured to include texture and reflection characteristics. The 3D map data 3120 may be published for use by a manual route data editor 3160 (e.g., an editor that manipulates route data or other types of route or reference data), an automatic route data generator 3162 (e.g., logic configured to generate route data or other types of road network or reference data), a fleet of autonomous vehicles 3164, a simulator 3166, a teleoperator computing device 3168, and any other component of an autonomous vehicle service. Mapping engine 3110 may capture semantic information from manual annotations or automatically generated annotations, as well as from other sensors, such as sonar, or from instrumented environments (e.g., smart stop lights).
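Because logic 3144 may implement a truncated signed distance function, the following single-voxel sketch shows the standard weighted TSDF update; voxel indexing, ray casting, and the truncation distance shown are assumptions and omissions of this illustration, not details of the disclosure.

```python
# Minimal, assumption-laden sketch of a truncated signed distance function
# ("TSDF") update for one voxel, of the kind that could be used to fuse depth
# measurements into a 3D map; bookkeeping of the voxel grid is omitted.
def tsdf_update(voxel, signed_distance, truncation=0.3, obs_weight=1.0):
    """voxel is a dict with running 'tsdf' and 'weight'; returns updated voxel."""
    d = max(-truncation, min(truncation, signed_distance))   # truncate the distance
    w_old, w_new = voxel["weight"], voxel["weight"] + obs_weight
    voxel["tsdf"] = (voxel["tsdf"] * w_old + d * obs_weight) / w_new
    voxel["weight"] = w_new
    return voxel


v = {"tsdf": 0.0, "weight": 0.0}
for measurement in (0.12, 0.10, 0.15):      # signed distances from three scans
    v = tsdf_update(v, measurement)
print(v)                                     # running weighted average of the scans
```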
FIG. 32 is a diagram depicting an autonomous vehicle application, according to some examples. Diagram 3200 depicts a mobile computing device 3203, mobile computing device 3203 including an autonomous service application 3240, autonomous service application 3240 configured to contact an autonomous vehicle service platform 3201 to arrange for transportation of a user 3202 via an autonomous vehicle 3230. As shown, autonomous service applications 3240 may include a transport controller 3242, which transport controller 3242 may be a software application resident on a computing device (e.g., mobile phone 3203, etc.). Transport controller 3242 is configured to receive, schedule, select, or perform operations related to an autonomous vehicle and/or a fleet of autonomous vehicles for which user 3202 may arrange for transport from a location of the user to a destination. For example, user 3202 may open an application to request vehicle 3230. The application may display a map, and user 3202 may place a pin indicating their destination within, for example, a geo-fenced area. Alternatively, the application may display a list of nearby pre-designated pick-up locations, or provide the user with a text entry field into which to enter the destination by address or name.
With further reference to the illustrated example, autonomous vehicle application 3240 may also include a user identification controller 3246, which user identification controller 3246 may be configured to detect whether user 3202 is in a geographic region in the vicinity of, or near, autonomous vehicle 3230 as the vehicle approaches. In some cases, when autonomous vehicle 3230 is near and available to user 3202, user 3202 may not readily perceive or identify vehicle 3230 (e.g., due to various other vehicles, including trucks, cars, taxis, and other obstacles typical of urban environments). In one example, autonomous vehicle 3230 may establish a wireless communication link 3262 (e.g., via radio frequency ("RF") signals, such as WiFi or Bluetooth®, including BLE, etc.) for communicating with user 3202 and/or determining the spatial location of user 3202 relative to autonomous vehicle 3230 (e.g., using the relative direction and signal strength of the RF signals). In some cases, autonomous vehicle 3230 may detect the approximate geographic location of user 3202, e.g., using GPS data or the like. A GPS receiver (not shown) of mobile computing device 3203 may be configured to provide GPS data to autonomous vehicle service application 3240. Thus, user identification controller 3246 may provide the GPS data to autonomous vehicle service platform 3201 via link 3260, which in turn may provide the location to autonomous vehicle 3230 via link 3261. Autonomous vehicle 3230 may then determine the relative distance and/or direction of user 3202 by comparing the user's GPS data to the vehicle's GPS-derived location.
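As an illustration of how relative distance and direction might be derived from the two GPS fixes, the sketch below computes a haversine distance and an initial bearing; a real system would additionally fuse the RF direction and signal-strength cues mentioned above, so this is only a partial, assumed calculation.

```python
# Sketch of estimating the user's relative distance and bearing from two GPS
# fixes (haversine distance, initial bearing); RF-based refinement omitted.
import math


def distance_and_bearing(vehicle, user):
    """vehicle/user are (lat, lon) in degrees; returns (meters, degrees)."""
    r = 6371000.0                                   # mean Earth radius, meters
    lat1, lon1, lat2, lon2 = map(math.radians, (*vehicle, *user))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    dist = 2 * r * math.asin(math.sqrt(a))          # great-circle distance
    y = math.sin(dlon) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing


print(distance_and_bearing((37.7749, -122.4194), (37.7755, -122.4180)))
```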
To assist user 3202 in identifying the arrival of their requested transportation, autonomous vehicle 3230 may be configured to notify or otherwise alert user 3202 of the presence of autonomous vehicle 3230 as it approaches user 3202. For example, the autonomous vehicle 3230 may activate one or more light-emitting devices 3280 (e.g., LEDs) according to a particular light pattern. In particular, certain light patterns may be created so that user 3202 may readily perceive that autonomous vehicle 3230 has been reserved to service the transportation needs of user 3202. As an example, autonomous vehicle 3230 may generate a light pattern 3290 that user 3202 may perceive as a "blinking" or other animation of the vehicle's exterior and interior lights in a visual and temporal manner. The light pattern 3290 may be generated with or without a sound pattern to allow user 3202 to recognize that the vehicle is the one they reserved.
According to some embodiments, the autonomous vehicle user controller 3244 may implement a software application configured to control various functions of the autonomous vehicle. Additionally, the application may be configured to redirect or reroute the autonomous vehicle during travel to its initial destination. Further, the autonomous vehicle user controller 3244 may be configured to cause on-board logic to modify the interior lighting of autonomous vehicle 3230, e.g., to implement mood lighting. The controller 3244 may also control audio sources (e.g., external sources such as Spotify, or audio stored locally on the mobile computing device 3203), select a ride type (e.g., modify desired acceleration and braking aggressiveness, modify active suspension parameters to select a set of "road-handling" characteristics to implement aggressive driving characteristics, including a harsher ride, or select a "soft-ride" quality in which vibration is dampened for comfort), and so forth. For example, the mobile computing device 3203 may also be configured to control HVAC functions, such as ventilation and temperature.
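A hypothetical ride-preference payload that the user controller might hand to on-board logic is sketched below; the parameter names, ranges, and defaults are illustrative assumptions and do not reflect an actual interface of the disclosure.

```python
# Hypothetical ride-preference payload a user controller might send to
# on-board logic; names, ranges, and defaults are illustrative assumptions.
from dataclasses import dataclass, asdict


@dataclass
class RidePreferences:
    ride_type: str = "soft"            # "soft" dampens vibration for comfort
    max_accel_mps2: float = 1.5        # acceleration/braking aggressiveness
    suspension_stiffness: float = 0.3  # 0.0 (plush) .. 1.0 (firm) road handling
    interior_lighting: str = "mood"
    audio_source: str = "local"        # e.g., locally stored audio


payload = asdict(RidePreferences(ride_type="aggressive",
                                 max_accel_mps2=3.0,
                                 suspension_stiffness=0.8))
print(payload)
```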
Fig. 33-35 illustrate examples of various computing platforms configured to provide various functionality to components of an autonomous vehicle service, in accordance with various embodiments. In some examples, a computer program, application, method, process, algorithm, or other software may be implemented using computing platform 3300 to perform any of the techniques described herein.
Note that the various structures and/or functions of fig. 33 also apply to fig. 34 and 35, and thus some elements of these two figures may be discussed in the context of fig. 33.
In some cases, computing platform 3300 may be disposed in any device, such as computing device 3390a, computing device 3390a may be disposed in one or more computing devices in an autonomous vehicle service platform, an autonomous vehicle 3391, and/or a mobile computing device 3391.
The computing platform 3300 includes a bus 3302 or other communication mechanism for communicating information, which interconnects subsystems and devices such as the processor 3304, system memory 3306 (e.g., RAM, etc.), storage device 3308 (e.g., ROM, etc.), an in-memory cache (which may be implemented in RAM 3306 or other portions of the computing platform 3300), and a communication interface 3313 (e.g., an Ethernet or wireless controller, a Bluetooth controller, NFC logic, etc.) to facilitate communication via a port on a communication link 3321, e.g., to communicate with a computing device, including a mobile computing and/or communication device having a processor. The processor 3304 may be implemented with one or more graphics processing units ("GPUs"), one or more central processing units ("CPUs"), or one or more virtual processors, as well as any combination of CPUs and virtual processors. Computing platform 3300 exchanges data representing input and output via input and output devices 3301, including but not limited to keyboards, mice, audio inputs (e.g., speech-to-text devices), user interfaces, displays, monitors, cursors, touch-sensitive displays, LCD or LED displays, and other I/O-related devices.
According to some examples, computing platform 3300 performs certain operations by processor 3304, processor 3304 executes one or more sequences of one or more instructions stored in system memory 3306, and computing platform 3300 may be implemented as a client-server arrangement, a peer-to-peer arrangement, or as any mobile computing device, including a smartphone or the like. Such instructions or data can be read into system memory 3306 from another computer-readable medium, such as storage device 3308. In some examples, hardwired circuitry may be used in place of or in combination with software instructions for implementation. The instructions may be embedded in software or firmware. The term "computer-readable medium" refers to any tangible medium that participates in providing instructions to processor 3304 for execution. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media includes, for example, optical or magnetic disks, and the like. Volatile media includes dynamic memory, such as system memory 3306.
Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read. Transmission media may also be used to transmit or receive instructions. The term "transmission medium" may include any tangible or intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such instructions. Transmission media includes coaxial cables, copper wire, and fiber optics, including the wires that comprise bus 3302 for transmitting computer data signals.
In some examples, execution of the sequences of instructions may be accomplished by computing platform 3300. According to some examples, computing platform 3300 may be coupled by communication link 3321 (e.g., a wired network such as a LAN, PSTN, etc., or any wireless network of various standards and protocols, including WiFi, Bluetooth®, NFC, Zig-Bee, etc.) to any other processor to execute the sequences of instructions in coordination with (or asynchronously to) one another. Computing platform 3300 may transmit and receive messages, data, and instructions, including program code (e.g., application code), through communication link 3321 and communication interface 3313. Received program code may be executed by processor 3304 as it is received, and/or stored in memory 3306 or other non-volatile storage for later execution.
In the example shown, system memory 3306 may include various modules containing executable instructions for implementing the functions described herein. The system memory 3306 may include an operating system ("O/S") 3332, as well as applications 3336 and/or logic module(s) 3359. In the example shown in fig. 33, the system memory 3306 includes an autonomous vehicle ("AV") controller module 3350 and/or components thereof (e.g., a perception engine module, a localization module, a planner module, and/or a motion controller module), any of which, or one or more portions thereof, may be configured to facilitate autonomous vehicle service by implementing one or more of the functions described herein.
Referring to the example shown in fig. 34, the system memory 3306 includes an autonomous vehicle services platform module 3450 and/or components thereof (e.g., a teleoperator manager, a simulator, etc.), any of which, or one or more portions thereof, may be configured to facilitate autonomous vehicle services by implementing one or more of the functions described herein.
Referring to the example shown in fig. 35, the system memory 3306 includes an autonomous vehicle ("AV") module 3550 and/or components thereof for use in, for example, a mobile computing device. One or more portions of the module 3550 may be configured to facilitate delivery of autonomous vehicle services by implementing one or more of the functions described herein.
Referring back to fig. 33, the structure and/or function of any of the features described or incorporated by reference herein may be implemented in software, hardware, firmware, circuitry, or a combination thereof. Note that the above-described structures and constituent elements, and their functions, may be aggregated with one or more other structures or elements. Alternatively, the elements and their functions may be subdivided into constituent sub-elements, if present. As software, the techniques described or incorporated by reference herein may be implemented using various types of programming or formatting languages, architectures, syntax, applications, protocols, objects, or techniques. As hardware and/or firmware, the techniques described herein or incorporated by reference may be implemented using various types of programming or integrated circuit design languages, including a hardware description language such as any register transfer language ("RTL") configured to design field programmable gate arrays ("FPGAs"), application specific integrated circuits ("ASICs"), or any other type of integrated circuit. According to some embodiments, the term "module" may refer, for example, to an algorithm or a portion of an algorithm, and/or logic implemented using hardware circuitry or software, or a combination thereof. These may vary and are not limited to the examples or descriptions provided.
In some embodiments, the module 3350 of fig. 33, the module 3450 of fig. 34, and the module 3550 of fig. 35, or one or more of their components, or any process or device described herein, can communicate with a mobile device (e.g., a mobile phone or computing device), or can be disposed therein.
In some cases, a mobile device, or any networked computing device, in communication with one or more of modules 3359 (module 3350 of fig. 33, module 3450 of fig. 34, and module 3550 of fig. 35) or one or more of their components (or any process or device described herein) is capable of providing at least some of the structure and/or functionality of any of the features described herein. As described herein or shown in the figures incorporated by reference, the structure and/or function of any of the features described herein or incorporated by reference may be implemented in software, hardware, firmware, circuitry, or any combination thereof. Note that the above-described structures and constituent elements, and their functions, may be aggregated or combined with one or more other structures or elements. Alternatively, the elements and their functions may be subdivided into constituent sub-elements, if present. As software, at least some of the techniques described or incorporated by reference herein may be implemented using various types of programming or formatting languages, architectures, syntax, applications, protocols, objects, or technologies. For example, at least one of the elements described in any of the figures may represent one or more algorithms. Alternatively, at least one of the elements may represent a portion of logic including a portion of hardware configured to provide constituent structures and/or functions.
For example, module 3350 of fig. 33, module 3450 of fig. 34, and module 3550 of fig. 35, or one or more of their components, or any process or device described herein, can be implemented in one or more computing devices (i.e., any mobile computing device, e.g., a wearable device, an audio device (e.g., headphones or a headset), or a mobile phone, whether worn or carried) that include one or more processors configured to execute one or more algorithms in memory. Thus, at least some of the elements in the figures described herein or incorporated by reference can represent one or more algorithms. Alternatively, at least one of the elements may represent a portion of logic including a portion of hardware configured to provide constituent structures and/or functions. These may vary and are not limited to the examples or descriptions provided.
As hardware and/or firmware, the structures and/or techniques described or incorporated by reference herein may be implemented using various types of programming or integrated circuit design languages, including a hardware description language such as any register transfer language ("RTL") configured to design field programmable gate arrays ("FPGAs"), application specific integrated circuits ("ASICs"), multi-chip modules, or any other type of integrated circuit.
For example, the module 3350 of fig. 33, the module 3450 of fig. 34, and the module 3550 of fig. 35, or one or more of their components, or any process or device described herein, can be implemented in one or more computing devices that include one or more circuits. Thus, at least some of the elements in the figures described herein or incorporated by reference can represent one or more components of hardware. Alternatively, at least one of the elements may represent a portion including logic configured to provide a portion of circuitry forming a structure and/or function.
According to some embodiments, the term "circuitry" may refer, for example, to any system comprising a plurality of components, including discrete and composite components, through which current flows to perform one or more functions. Examples of discrete components include transistors, resistors, capacitors, inductors, diodes, and the like, and examples of composite components include memories, processors, analog circuits, digital circuits, and the like, including field programmable gate arrays ("FPGAs"), application specific integrated circuits ("ASICs"). Thus, a circuit may comprise a system of electronic components and logic components (e.g., logic configured to execute instructions such as a set of executable instructions of an algorithm, and thus be a component of a circuit). According to some embodiments, the term "module" may refer, for example, to an algorithm or a portion of an algorithm implemented in hardware circuitry or software, or a combination thereof, and/or logic (i.e., a module may be implemented as circuitry). In some embodiments, the algorithm and/or the memory storing the algorithm is a "component" of the circuit. Thus, for example, the term "circuitry" may also refer to a system of components, including an algorithm. These may vary and are not limited to the examples or descriptions provided.
Fig. 36 is a diagram depicting a simulator configured to simulate one or more functions of a simulated autonomous vehicle in a simulated environment, according to some examples. Diagram 3600 depicts a simulator 3640, the simulator 3640 configured to synthesize a simulated world or simulated environment 3603 in which the operation of a simulated autonomous vehicle 3630 (and any of its components, such as sensors) may be simulated to determine the efficacy of the hardware and software, or a combination thereof, of, for example, one or more autonomous vehicles of a fleet 3630a that may constitute an autonomous vehicle service. Additionally, for example, simulator 3640 may be configured to simulate vehicle dynamics as the simulated autonomous vehicle 3630 travels under different driving conditions or scenarios. For example, the simulator 3640 may simulate operation of the autonomous vehicle 3630 in driving scenarios that include driving on unique terrain that may be city specific (e.g., the hills of San Francisco, city traffic in New York City, etc.), and driving during different driving conditions (e.g., simulation of reduced wheel friction due to rain, ice, etc.). In the illustrated example, simulated environment 3603 includes a road segment having an inclination or slope that increases in the direction of travel of simulated autonomous vehicle 3630 (i.e., vehicle 3630 is traveling uphill, as indicated by symbol 3686), and is also depicted as having a formation of ice 3684 that simulated autonomous vehicle 3630 may traverse.
Additionally, simulator 3640 may also generate simulated environment 3603 based on the synthesis of laser and camera data, as well as any other data (e.g., radar data, sonar data, etc.). For example, the simulated environment 3603 may be based on 3D map data including a 3D point cloud generated using a laser scanner in conjunction with camera vision/image data (or any other sensor data). Examples of sensor data include, but are not limited to, lidar data, image or camera data, GPS data, inertial measurement unit ("IMU") data, acoustic data, ranging data, wheel angle data, battery charge level, drive current of one or more power transmissions or motors, thermal energy data (e.g., temperature) of any component, acceleration or deceleration data, brake pressure or force that may be applied to one or more wheels, etc., as well as other sensor data described herein or otherwise incorporated by reference. Simulator 3640 may also simulate sensor return data that may be detected by sensors disposed on or in simulated autonomous vehicle 3630. An example of simulated sensor return data includes simulated lidar returns 3671 reflected from portions 3672 of external features (e.g., appearance patterns of buildings adjacent to a road).
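The following point-mass sketch illustrates how reduced wheel friction on an uphill grade (e.g., the ice 3684 noted above) might be modeled in such a simulation; the mass, grade, drive force, and friction coefficients are arbitrary illustrative values, not parameters from the disclosure.

```python
# Simplified longitudinal-dynamics step illustrating reduced wheel friction on
# an uphill grade; the point-mass model and coefficients are assumptions.
import math


def accel_on_grade(drive_force_n, mass_kg, grade_rad, friction_coeff, g=9.81):
    """Net acceleration (m/s^2) of a point-mass vehicle on an inclined surface."""
    normal = mass_kg * g * math.cos(grade_rad)
    traction_limit = friction_coeff * normal           # ice lowers this limit
    tractive = min(drive_force_n, traction_limit)      # wheels cannot exceed grip
    return (tractive - mass_kg * g * math.sin(grade_rad)) / mass_kg


dry = accel_on_grade(6000.0, 2000.0, math.radians(6), friction_coeff=0.9)
icy = accel_on_grade(6000.0, 2000.0, math.radians(6), friction_coeff=0.15)
print(dry, icy)   # icy << dry illustrates the loss of tractive acceleration
```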
Diagram 3600 depicts a data modeler 3620 and a simulator 3640 that may cooperate to generate a simulated environment 3603 that includes dynamic objects 3680 and 3682a, as well as simulated road conditions (e.g., ice 3684). The simulator 3640 may include a physics processor 3650, the physics processor 3650 being configured to simulate mechanical, static, dynamic, and kinematic aspects of the autonomous vehicle for use in simulating the behavior of the simulated autonomous vehicle 3630. For example, the physics processor 3650 may include simulated contact mechanics, as well as simulated interactions between simulated bodies and/or simulated mechanical interactions. Simulator 3640 may also include a simulator controller 3656, the simulator controller 3656 configured to control the simulation to adjust the function of any synthetically generated element of the simulated environment 3603 to determine and estimate causal relationships and the like. Note that elements described in diagram 3600 of fig. 36 may include the same structure and/or functionality as similarly-named elements described in connection with one or more other diagrams described herein or incorporated by reference.
In some examples, data modeler 3620 receives log file data from a fleet 3630a of autonomous vehicles, the log file data providing various types of data including, but not limited to, object data 3631, map data 3633, sensor data 3635, vehicle component data 3637, and any other data that may describe the structure and/or function of one or more autonomous vehicles in fleet 3630a. In some examples, the data described above may be recorded at different times and seasons of the year, and over miles driven under a variety of different conditions. The logged data may be generated from any number of autonomous vehicles traveling in the road network any number of times. Additionally, data modeler 3620 may include logic to characterize the road (e.g., features such as inclination, roughness (e.g., due to bumps, potholes, etc.), grade angle toward the curb, etc., as well as typical or expected friction values or surface types) and to characterize dynamic objects probabilistically expected at one or more portions of the road network (e.g., simulated autonomous vehicle 3630 may expect to encounter many children as dynamic objects near a school at the end of the school day), among any other characteristics. Data modeler 3620 may also include logic to characterize the road under various weather conditions. Thus, the data modeler 3620 may use the data representative of the roads to form a data model of a road network or any other path or segment. In some cases, logic in data modeler 3620 may aggregate or fuse data (e.g., sensor data) based on hundreds of thousands to millions of miles driven (e.g., driven by fleet 3630a), or on any other amount of data recorded over lesser or greater distances on the same or different roads.
As shown, data modeler 3620 includes dynamic object data modeler 3621, environment modeler 3623, sensor modeler 3625, and vehicle modeler 3627, or any other hardware and/or software implementation thereof, to generate one or more data models that simulator 3640 may use to generate one or more portions of simulated environment 3603. Dynamic object data modeler 3621 may be configured to receive data (e.g., logged data) representative of characteristics of one or more objects in the environment from which the fleet 3630a of autonomous vehicles acquired the characteristic data. Such data may include a 3D point cloud or any other data representation capable of visually defining an object, as well as a classification (e.g., pedestrian, pet or animal, bicyclist, car, etc.), whereby a classified object may be associated with a certain level of dynamism and/or a predicted range of motion (e.g., per unit time). In at least some examples, the predicted range of motion may also describe a predicted direction of motion (e.g., represented by a predicted motion vector). According to some examples, the predicted range of motion may describe a probability that an object may transition from a static object to a dynamic object, and/or a velocity or acceleration associated with motion of the object.
In view of the above, dynamic object data modeler 3621 may be configured to identify any number of objects and may be further configured to classify objects into any number of classes. According to some examples, dynamic object data modeler 3621 may also identify and classify static objects, including objects that may be static at one time and dynamic at another (e.g., pet dog 3682b sitting beside the road at one moment may jump up and run into the road at another). For purposes of illustration, consider that dynamic object data modeler 3621 may classify object 3682b as a dog and may associate dynamic characteristics and/or a predicted range of motion with the dog. Additionally, dynamic object data modeler 3621 may generate a data model describing the predicted motion of object 3682b as it relates to interaction with other dynamic objects, such as dynamic object 3680 or dynamic object 3682a, which is depicted as a dog in motion. In the absence of dynamic object 3682a, dog 3682b may be associated with a first probability of engaging in an activity (e.g., jumping up and running). However, in the event that dog 3682b encounters or interacts with (e.g., chases) dog 3682a, which has predicted range of motion 3683, the probability of dog 3682b engaging in that activity may increase precipitously. For example, the probability that dog 3682b jumps up and instinctively chases dog 3682a may increase from about 10% (e.g., based on logged data) to about 85%. Based on the data model, and on modeled behavior derived from dynamic object data modeler 3621, the simulator may generate a simulated environment 3603 that includes two (2) dynamic objects 3682a and 3682b to be considered in navigation and planning, rather than one (1) dynamic object 3682a.
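The interaction effect described above can be sketched as a simple conditional transition probability. The following Python snippet is a hypothetical illustration, not the patent's implementation; the 10%/85% values echo the example in the text, while the interaction radius is an assumption.

```python
# Minimal sketch of the interaction effect: the probability that a static
# object (the sitting dog 3682b) becomes dynamic rises sharply when another
# dynamic object (the running dog 3682a) is nearby. The probabilities and the
# distance threshold are illustrative assumptions.
import math
import random

def transition_probability(base_p, interacting, boosted_p=0.85):
    """P(static -> dynamic) for a classified object, given possible interaction."""
    return boosted_p if interacting else base_p

def is_interacting(pos_a, pos_b, radius_m=10.0):
    return math.dist(pos_a, pos_b) <= radius_m

random.seed(7)
dog_b = (0.0, 0.0)          # sitting dog beside the road
dog_a = (4.0, 3.0)          # running dog, within the interaction radius
p = transition_probability(base_p=0.10, interacting=is_interacting(dog_a, dog_b))
becomes_dynamic = random.random() < p
print(f"P(transition)={p:.2f}, simulate as dynamic object: {becomes_dynamic}")
```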
Dynamic object data modeler 3621 may generate data models describing the classification of any object, as well as motion-related data (e.g., predicted range of motion, velocity, predicted path of motion, etc.). Further to diagram 3600, dynamic object data modeler 3621 may generate a data model that simulator 3640 may use to generate a velocity of simulated dynamic object 3680 (e.g., a running person) associated with a predicted range of motion (e.g., a direction of motion within a crosswalk). Other dynamic objects may be classified and, in some cases, further sub-classified. For example, a road segment may be adjacent to several bars or night clubs; thus, a classified dynamic object (e.g., a young adult pedestrian) may exhibit a first predicted behavior or pattern of movement during the day, but may appear around 2:00 in the morning after the bars or night clubs close and exhibit other predicted behaviors (or unpredictable behaviors). Using the dynamic object data models generated by modeler 3621, simulator 3640 may simulate environment 3603 with greater fidelity to the physical environment through which autonomous vehicle 3630 physically travels.
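One hypothetical way to capture such time-of-day sub-classification is to condition the appearance probability of a classified pedestrian on the hour and on proximity to nightlife; the probabilities and hours in this Python sketch are assumptions for illustration only.

```python
# Illustrative sketch of sub-classifying a dynamic object by context: a
# pedestrian near a bar district is given different appearance probabilities
# by hour of day. The hours and probabilities are assumptions.
def pedestrian_appearance_probability(hour, near_nightlife):
    """Probability that a pedestrian dynamic object appears on this segment."""
    if near_nightlife and (hour >= 23 or hour < 3):
        return 0.6          # closing time: elevated, less predictable foot traffic
    if 7 <= hour <= 19:
        return 0.3          # ordinary daytime activity
    return 0.05             # late night, quiet segment

for hour in (14, 2):
    p = pedestrian_appearance_probability(hour, near_nightlife=True)
    print(f"hour={hour:02d}:00  P(pedestrian present)={p:.2f}")
```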
Environment modeler 3623 may be configured to generate various portions of simulated environment 3603, e.g., static portions in some examples. In the example shown, environment modeler 3623 may receive map data 3633 to generate an environment model describing the geometry of the physical external environment. Simulator 3640 may generate simulated environment 3603 using the environment model data generated by environment modeler 3623, e.g., based on 3D map data 3633. Note that in some cases, environment modeler 3623 may include, or may be similar to, one or more portions of a mapping engine or cartographer structure and/or functionality (as described herein or incorporated by reference), or the like, to generate the 3D (or 4D) simulated environment 3603.
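As a simplified illustration of how an environment modeler might derive static geometry from 3D map data, the sketch below collapses a point cloud into a coarse height grid that a simulator could ray-cast against; the cell size and point format are assumptions, not the patent's map representation.

```python
# Minimal sketch of an environment modeler that turns 3D map points into a
# coarse static geometry grid the simulator can query. The grid resolution
# and point format are assumptions.
from collections import defaultdict

def build_height_grid(points_xyz, cell_size_m=1.0):
    """Collapse a 3D point cloud into a 2D grid of maximum surface heights."""
    grid = defaultdict(float)
    for x, y, z in points_xyz:
        cell = (int(x // cell_size_m), int(y // cell_size_m))
        grid[cell] = max(grid[cell], z)
    return dict(grid)

map_points = [(0.2, 0.3, 0.0), (0.8, 0.1, 4.5), (5.4, 2.2, 3.0), (5.6, 2.9, 7.5)]
print(build_height_grid(map_points))
```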
Sensor modeler 3625 is configured to generate data models representing functions of one or more of various types of sensors based on sensor data 3635 extracted as logged data from the fleet 3630a of autonomous vehicles. For example, sensor data 3635 may include one or more subsets of one or more types of sensor data, such as, but not limited to, lidar data, radar data, sonar data, image/camera data, sound data, ultrasound data, IMU-related data, ranging data, wheel angle data, and any other type of sensor data. Simulator 3640 may use data generated by sensor modeler 3625 to model any number of sensors implemented in simulated autonomous vehicle 3630. For example, a simulated autonomous vehicle controller (not shown) may be configured to identify a pose 3670 of simulated autonomous vehicle 3630 or of a simulated lidar sensor configured to perform ray-traced laser scans, at least one of which is depicted as laser return 3671 reflected from surface portion 3672. Additionally, the autonomous vehicle controller may access 3D map data to identify external geometry 3672 (and the range or location of such geometry), and may also be configured to identify one or more of an x-coordinate, a y-coordinate, a z-coordinate, a roll value, a yaw value, and a pitch value to describe the pose of the simulated lidar sensor. In some examples, simulator controller 3656 of simulator 3640 may be configured to compare simulated values and measurements (e.g., intensity, range, reflectivity, etc.) for simulated laser return 3671 against empirically obtained lidar data (e.g., sensor data 3635) to determine the accuracy of the simulation.
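A minimal sketch of the ray-traced lidar comparison might look like the following, where a single simulated beam is intersected with a planar facade standing in for surface portion 3672 and the resulting range is checked against an empirically logged range; the plane model, recorded value, and tolerance are illustrative assumptions.

```python
# Sketch of simulating a single lidar return by intersecting a ray from the
# simulated sensor pose with a planar building facade, then comparing the
# simulated range against a recorded range to score simulation accuracy.
import numpy as np

def ray_plane_range(origin, direction, plane_point, plane_normal):
    """Distance along the ray to a plane, or None if the ray never hits it."""
    d = np.asarray(direction, float) / np.linalg.norm(direction)
    denom = np.dot(plane_normal, d)
    if abs(denom) < 1e-9:
        return None
    t = np.dot(np.asarray(plane_point) - np.asarray(origin), plane_normal) / denom
    return t if t > 0 else None

sensor_origin = np.array([0.0, 0.0, 1.8])         # pose (x, y, z) of the simulated sensor
beam_dir = np.array([1.0, 0.0, 0.0])              # one beam of the scan pattern
facade_pt, facade_n = np.array([12.0, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0])

simulated_range = ray_plane_range(sensor_origin, beam_dir, facade_pt, facade_n)
recorded_range = 11.94                            # e.g., from logged sensor data
error_m = abs(simulated_range - recorded_range)
print(f"simulated={simulated_range:.2f} m, recorded={recorded_range:.2f} m, "
      f"within tolerance: {error_m < 0.10}")
```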
According to some examples, the simulator is configured to simulate a predicted response of a data representation of one or more functions of simulated autonomous vehicle 3630 based on range of motion 3683 of classified dynamic object 3682a, which is depicted as a running dog. Simulator 3640 may be further configured to calculate a rate of change of distance 3685 between simulated autonomous vehicle 3630 and predicted range of motion 3683 of classified dynamic object 3682a. If a threshold is exceeded (e.g., in terms of location, distance, time, etc.), simulator 3640 is configured to cause simulated autonomous vehicle 3630 to avoid simulated dynamic object 3682a in simulated environment 3603 based on the calculated rate of change of distance 3685. In some cases, simulated autonomous vehicle 3630 may stop driving to avoid a collision with dynamic object 3682a. In some other examples, simulator 3640 may implement safety system simulator 3690 to simulate the use of one or more on-board safety systems of simulated autonomous vehicle 3630. Examples of safety systems include external safety systems, such as directing sound via beam forming and/or activating vehicle lights to alert dynamic object 3682a, as well as internal safety systems. Examples of safety systems that may be simulated by simulator 3640 are described in U.S. patent application No. 14/756,994 ("System of Configuring Active Lighting to Indicate Directionality of an Autonomous Vehicle"), U.S. patent application No. 14/756,993 ("Method for Robotic Vehicle Communication with an External Environment via Acoustic Beam Forming"), U.S. patent application No. 14/932,954 ("Internal Safety Systems for Robotic Vehicles"), and U.S. patent application No. 14/932,962 ("Robotic Vehicle Active Safety Systems and Methods"), each filed November 4, 2015, the entire contents of which are hereby incorporated by reference.
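The rate-of-change-of-distance check can be illustrated with a small time-to-contact sketch; the sampling interval, threshold, and action labels below are assumptions, not the patent's control logic.

```python
# Minimal sketch of the avoidance check: estimate the rate of change of
# distance between the simulated vehicle and the predicted range of motion of
# a dynamic object, and trigger an avoidance action when a time-to-contact
# threshold is crossed. Thresholds are illustrative assumptions.
def closing_rate(d_prev, d_curr, dt):
    """Negative values mean the gap is shrinking (objects are closing)."""
    return (d_curr - d_prev) / dt

def avoidance_action(d_curr, rate, ttc_threshold_s=2.0):
    if rate >= 0:
        return "maintain trajectory"
    time_to_contact = d_curr / -rate
    return "brake/steer to avoid" if time_to_contact < ttc_threshold_s else "monitor"

d_prev, d_curr, dt = 14.0, 11.0, 0.5              # distance sampled at two times
rate = closing_rate(d_prev, d_curr, dt)           # -6.0 m/s: closing fast
print(avoidance_action(d_curr, rate))             # -> "brake/steer to avoid"
```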
According to some examples, simulator 3640 may generate and use "ground truth" data for labels (e.g., semantic labels), and may test or validate algorithms (e.g., validating software changes, as well as changes to map data 3633 or any modeled data). Moreover, simulator 3640 can be used for classifier training, e.g., for computer vision classifiers and deep neural networks (e.g., implementing Bayesian or probabilistic inference algorithms, and other similar techniques), to identify dynamic objects or agents, such as dynamic objects 3680 and 3682a, in simulated environment 3603. Note that although techniques implementing data modeler 3620 may be used in the context of simulation, data modeler 3620 may also be implemented in any of the components described herein or incorporated by reference. For example, a perception engine or system may include or implement one or more portions of data modeler 3620, as well as any other structure and/or functionality described in the context of simulator 3640. Note that any of the components shown in fig. 36, described herein, or incorporated by reference may be implemented in hardware or software or a combination thereof.
FIG. 37 depicts a vehicle modeler, according to some examples. Diagram 3700 depicts a portion 3700 of an autonomous vehicle, including propulsion unit 3732 with drive train (e.g., electric motor) 3733, axles 3735 and wheels 3737, brake unit 3720, and steering unit 3734, some examples OF which are described in U.S. patent application No. 14/932,958, entitled "QUADRANT CONFIGURATION OF ROBOTIC VEHICLES," filed 11/4/2015 (attorney docket No. Z00-021). The vehicle modeler 3727 is configured to receive vehicle component data 3799, the vehicle component data 3799 including model data describing operability of vehicle components in portion 3700, including sensed data (e.g., wheel angle 3711, angular velocity of tire 3713, etc.). The vehicle modeler 3727 is configured to receive data representative of one or more components of the autonomous vehicle and to identify data representative of a component characteristic (e.g., a motor current of motor 3733) associated with the one or more components. The vehicle modeler 3727 generates data models of the one or more components based on the component characteristics, and a simulator may be configured to simulate operation of the one or more components (e.g., propulsion unit 3732, brake unit 3720, steering unit 3734, etc.) based on those data models to predict behavior of the simulated autonomous vehicle. In some cases, the simulator may be configured to access an event data model including data representative of event characteristics (e.g., an event model describing characteristics of a portion of a road covered by ice). The simulator may then be configured to simulate an event (e.g., an icy patch) in the simulated environment based on the event characteristic data (e.g., reduced friction).
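A hypothetical component-level sketch of the ideas above follows: a motor model driven by logged motor current and an event model that reduces available friction on an icy patch; the torque constant and friction coefficients are assumed values.

```python
# Illustrative vehicle-component sketch: a motor model derived from logged
# motor current, plus an event model that reduces surface friction for an icy
# patch. The torque constant, g, and friction values are assumptions.
def motor_torque(current_a, torque_constant_nm_per_a=0.45):
    """Simple motor model: torque proportional to measured drive current."""
    return torque_constant_nm_per_a * current_a

def achievable_decel(mu, g=9.81):
    """Maximum braking deceleration limited by tire/road friction."""
    return mu * g

dry_mu, ice_event_mu = 0.8, 0.15                  # event model: icy patch
print(f"motor torque at 120 A: {motor_torque(120):.1f} N*m")
print(f"max decel dry: {achievable_decel(dry_mu):.1f} m/s^2, "
      f"on ice: {achievable_decel(ice_event_mu):.1f} m/s^2")
```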
Fig. 38 is a diagram depicting an example of a sensor modeler, according to some examples. Diagram 3800 includes a sensor modeler 3825, which in turn is illustrated as including a sensor type modeler 3803 and a sensor error modeler 3805 to generate simulated sensor type data 3806 based on sensor data 3801. Accordingly, sensor modeler 3825 is configured to receive different types and different amounts of sensor data 3835 to generate corresponding simulated sensor data 3837. According to some examples, sensor modeler 3825 may be configured to receive data representative of sensor data 3801 derived in an environment in which one or more autonomous vehicles are traveling, and sensor modeler 3825 may be further configured to model a subset of the sensor data using sensor type modeler 3803 to characterize a sensor device (e.g., a lidar sensor) to form sensor data having a characteristic. Accordingly, sensor type modeler 3803 may generate data 3806 representative of the simulated sensor device based on the sensor data having the characteristic.
In some cases, the sensor error modeler 3805 may be configured to model data representative of a subset of measurement deviations (e.g., errors) associated with the sensor device. The sensor type modeler 3803 may be configured to adjust data representative of the simulated sensor device based on a subset of the measurement deviations (e.g., generated by the sensor error modeler 3805). As an example, sensor type modeler 3803 may be configured to model a subset of the lidar sensor data to characterize the lidar sensor to form lidar sensor data having characteristics, and to generate data representative of a simulated lidar device. In addition, a subset of lidar measurement offsets or errors may be modeled and used to adjust data representing a simulated lidar device based on the subset of lidar measurement offsets.
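One plausible shape for such a sensor error modeler is sketched below: measurement deviations from logged data are summarized as a bias and noise level, which are then applied to ideal simulated ranges; the deviation values are illustrative assumptions.

```python
# Minimal sketch of a sensor error modeler adjusting ideal simulated lidar
# ranges with a bias and Gaussian noise estimated from recorded measurement
# deviations. The deviation values are illustrative assumptions.
import random
from statistics import mean, stdev

def fit_error_model(deviations_m):
    """Characterize measurement deviations (simulated minus recorded ranges)."""
    return {"bias_m": mean(deviations_m), "sigma_m": stdev(deviations_m)}

def apply_error_model(ideal_ranges_m, model, rng):
    return [r + model["bias_m"] + rng.gauss(0.0, model["sigma_m"])
            for r in ideal_ranges_m]

rng = random.Random(0)
recorded_deviations = [0.04, 0.06, 0.03, 0.07, 0.05]      # from logged sensor data
model = fit_error_model(recorded_deviations)
print(apply_error_model([10.0, 25.0, 60.0], model, rng))
```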
FIG. 39 is a diagram depicting an example of a dynamic object data modeler, according to some examples. Diagram 3900 includes a dynamic object data modeler 3921 configured to receive object data 3931 for generating simulated object data 3941. In this example, dynamic object data modeler 3921 includes an object data classifier 3922 (e.g., a Bayesian classifier, etc.) configured to identify a classification of a dynamic object and to identify data representative of a set of characteristics (or predicted behaviors) associated with the classification. The simulator may then use the set of characteristics to simulate the predicted range of motion of the simulated dynamic object in the simulated environment. As shown, object data classifier 3922 classifies dynamic objects 3932, 3933, 3934, and 3938 as a first animal dynamic object, a second animal dynamic object, a pedestrian dynamic object, and a skateboarder dynamic object, respectively. Based on the classification, object data characterizers 3951, 3952, 3953, and 3959 are configured to provide data representing, for example, a predicted range of motion for the identified dynamic object. In some cases, an object data characterizer may introduce randomized data based on probabilities associated with the predicted range of motion. Based on the randomized data, the simulator can simulate relatively rare behaviors of an object, such as a dog that unexpectedly jumps up and runs into the street (e.g., chasing a ball).
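The classifier/characterizer pairing might be sketched as follows, with a toy classification rule and per-class speed ranges used to randomize a predicted range of motion; the class names, feature rule, and speed ranges are assumptions.

```python
# Sketch of a dynamic object data modeler pairing a coarse classifier with
# per-class characterizers that emit a randomized predicted range of motion.
import random

CLASS_SPEED_RANGES_MPS = {       # assumed peak speeds per classification
    "pedestrian": (0.5, 2.0),
    "dog": (0.5, 8.0),
    "skateboarder": (2.0, 6.0),
}

def classify(obj_features):
    """Toy stand-in for a Bayesian/vision classifier."""
    if obj_features["height_m"] < 0.8:
        return "dog"
    return "skateboarder" if obj_features["has_board"] else "pedestrian"

def predicted_range_of_motion(classification, horizon_s, rng):
    lo, hi = CLASS_SPEED_RANGES_MPS[classification]
    speed = rng.uniform(lo, hi)                  # randomized per simulation run
    return {"class": classification, "radius_m": speed * horizon_s}

rng = random.Random(3)
obj = {"height_m": 0.5, "has_board": False}
print(predicted_range_of_motion(classify(obj), horizon_s=2.0, rng=rng))
```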
Fig. 40 is a flow diagram illustrating an example of generating a simulation environment, according to some examples. Flow 4000 begins at 4002, where data representative of characteristics of a dynamic object in an environment is received from one or more autonomous vehicles. At 4004, the classified dynamic object is identified. At 4006, data representative of dynamic-related characteristics associated with the classified dynamic object is identified. At 4008, a data model of the classified dynamic object is formed based on the dynamic-related characteristics of the classified dynamic object. At 4010, a predicted range of motion of the classified dynamic object is simulated in the simulation environment to form a simulated dynamic object. At 4012, a predicted response of a data representation of one or more functions of the simulated autonomous vehicle may be simulated. Note that the order depicted in this flowchart, as well as in the other flowcharts herein, is not intended to imply that the various functions must be performed linearly, as each portion of the flowchart may be performed in series or in parallel with any one or more other portions, and may be independent of or dependent on other portions of the flowchart.
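Flow 4000 can be summarized in code as a short pipeline; the sketch below is a hypothetical reduction of steps 4002-4012 to plain functions, and all numeric values are illustrative assumptions.

```python
# End-to-end sketch of flow 4000 reduced to plain functions. The object
# fields, horizon, and yield rule are illustrative assumptions.
import random

def flow_4000(observed_objects, rng):
    simulated_objects, responses = [], []
    for obj in observed_objects:                       # 4002: received characteristics
        cls = obj["class"]                             # 4004: classified dynamic object
        dyn = {"speed_mps": obj["speed_mps"]}          # 4006: dynamic-related traits
        model = {"class": cls, **dyn}                  # 4008: data model
        radius = dyn["speed_mps"] * 2.0 * rng.uniform(0.8, 1.2)
        simulated_objects.append({**model, "predicted_radius_m": radius})  # 4010
        responses.append("yield" if radius > obj["gap_m"] else "proceed")  # 4012
    return simulated_objects, responses

rng = random.Random(1)
objs = [{"class": "pedestrian", "speed_mps": 1.4, "gap_m": 2.0},
        {"class": "dog", "speed_mps": 5.0, "gap_m": 12.0}]
print(flow_4000(objs, rng))
```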
FIG. 41 illustrates an example of various computing platforms configured to provide various simulator-related functions and/or structures to simulate autonomous vehicle services, in accordance with various embodiments. In some examples, computing platform 3300 may be used to implement computer programs, applications, methods, procedures, algorithms, or other software for performing the techniques described above. Note that the various structures and/or functions of fig. 33 also apply to fig. 41, and thus, some elements of these two figures may be discussed in the context of fig. 33. It should also be noted that elements described in diagram 4100 of fig. 41 may include the same structure and/or functionality as similarly-named elements described in connection with one or more other diagrams described herein.
Referring to the example shown in fig. 41, the system memory 3306 includes an autonomous vehicle services platform module 4150 and/or components thereof (e.g., data modeler module 4152, simulator module 4154, etc.), any of which, or one or more portions thereof, may be configured to facilitate simulation of autonomous vehicle services by implementing one or more functions described herein. In some cases, computing platform 3300 may be disposed in any device, e.g., in computing device 3390a, which may be disposed in an autonomous vehicle service platform, in autonomous vehicle 3391, and/or in mobile computing device 3390b.
Fig. 33-35 illustrate examples of various computing platforms configured to provide various functionality to components of an autonomous vehicle service, in accordance with various embodiments. In some examples, a computer program, application, method, process, algorithm, or other software may be implemented using computing platform 3300 to perform any of the techniques described herein.
Note that the various structures and/or functions of fig. 33 also apply to fig. 34 and 35, and thus some elements of these two figures may be discussed in the context of fig. 33.
In some cases, computing platform 3300 may be disposed in any device, such as computing device 3390a, which may be disposed in one or more computing devices of an autonomous vehicle service platform, in autonomous vehicle 3391, and/or in mobile computing device 3390b.
According to some examples, computing platform 3300 performs certain operations by processor 3304 executing one or more sequences of one or more instructions stored in system memory 3306; computing platform 3300 may be implemented in a client-server arrangement, a peer-to-peer arrangement, or as any mobile computing device, including a smartphone or the like. Such instructions or data may be read into system memory 3306 from another computer-readable medium, such as storage device 3308. In some examples, hard-wired circuitry may be used in place of, or in combination with, software instructions for implementation. The instructions may be embedded in software or firmware. The term "computer-readable medium" refers to any tangible medium that participates in providing instructions to processor 3304 for execution. Such a medium may take many forms, including but not limited to non-volatile media and volatile media. Non-volatile media include, for example, optical or magnetic disks, and the like. Volatile media include dynamic memory, such as system memory 3306.
Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read. Transmission media may also be used to transmit or receive instructions. The term "transmission medium" may include any tangible or intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such instructions. Transmission media includes coaxial cables, copper wire, and fiber optics, including the wires that comprise bus 3302 for transmitting computer data signals.
In some examples, execution of the sequences of instructions may be accomplished by computing platform 3300. According to some examples, computing platform 3300 may be coupled by communication link 3321 (e.g., a wired network such as a LAN or PSTN, or any suitable wireless network, NFC, Zig-Bee, etc.) to any other processor to execute the sequences of instructions in coordination with (or asynchronously to) one another. Computing platform 3300 may transmit and receive messages, data, and instructions, including program code (e.g., application code), through communication link 3321 and communication interface 3313. Received program code may be executed by processor 3304 as it is received, and/or stored in memory 3306 or other non-volatile storage for later execution.
In the example shown, system memory 3306 may include various modules containing executable instructions for implementing the functions described herein. System memory 3306 may include an operating system ("O/S") 3332, as well as applications 3336 and/or logic module(s) 3359. In the example shown in fig. 33, system memory 3306 includes an autonomous vehicle ("AV") controller module 3350 and/or components thereof (e.g., a perception engine module, a localization module, a planner module, and/or a motion controller module), any of which, or one or more portions thereof, may be configured to facilitate autonomous vehicle services by implementing one or more of the functions described herein.
Referring to the example shown in fig. 34, the system memory 3306 includes an autonomous vehicle services platform module 3450 and/or components thereof (e.g., a teleoperator manager, a simulator, etc.), any of which, or one or more portions thereof, may be configured to facilitate autonomous vehicle services by implementing one or more of the functions described herein.
Referring to the example shown in fig. 35, the system memory 3306 includes an autonomous vehicle ("AV") module 3550 and/or components thereof for use in, for example, a mobile computing device. One or more portions of module 3550 may be configured to facilitate delivery of autonomous vehicle services by implementing one or more of the functions described herein.
Referring back to fig. 33, the structure and/or function of any of the features described or incorporated by reference herein may be implemented in software, hardware, firmware, circuitry, or a combination thereof. Note that the above-described structures and constituent elements, and their functions, may be aggregated with one or more other structures or elements. Alternatively, the elements and their functions may be subdivided into constituent sub-elements, if present. As software, the techniques described or incorporated by reference herein may be implemented using various types of programming or formatting languages, architectures, syntax, applications, protocols, objects, or techniques. As hardware and/or firmware, the techniques described herein or incorporated by reference may be implemented using various types of programming or integrated circuit design languages, including a hardware description language such as any register transfer language ("RTL") configured to design field programmable gate arrays ("FPGAs"), application specific integrated circuits ("ASICs"), or any other type of integrated circuit. According to some embodiments, the term "module" may refer, for example, to an algorithm or a portion of an algorithm, and/or logic implemented using hardware circuitry or software, or a combination thereof. These may vary and are not limited to the examples or descriptions provided.
In some embodiments, the module 3350 of fig. 33, the module 3450 of fig. 34, and the module 3550 of fig. 35, or one or more of their components, or any process or device described herein, is capable of communicating (wired or wireless) with or may be disposed in a mobile device (e.g., a mobile phone or computing device).
In some cases, a mobile device, or any networked computing device (not shown), in communication with one or more of modules 3359 (module 3350 of fig. 33, module 3450 of fig. 34, and module 3550 of fig. 35) or one or more of their components (or any process or device described herein) can provide at least some of the structure and/or functionality of any of the features described herein. As described herein or shown in the figures incorporated by reference, the structure and/or function of any of the features described herein or incorporated by reference may be implemented in software, hardware, firmware, circuitry, or any combination thereof. Note that the above-described structures and constituent elements, and their functions, may be aggregated or combined with one or more other structures or elements. Alternatively, the elements and their functions may be subdivided into constituent sub-elements, if present. As software, at least some of the techniques described or incorporated by reference herein may be implemented using various types of programming or formatting languages, architectures, syntax, applications, protocols, objects, or technologies. For example, at least one of the elements described in any of the figures may represent one or more algorithms. Alternatively, at least one of the elements may represent a portion of logic including a portion of hardware configured to provide constituent structures and/or functions.
For example, module 3350 of fig. 33, module 3450 of fig. 34, and module 3550 of fig. 35, or one or more of their components, or any process or device described herein, can be implemented in one or more computing devices (i.e., any mobile computing device, such as a wearable device, an audio device (e.g., headphones or a headset), or a mobile phone, whether worn or carried) that include one or more processors configured to execute one or more algorithms in memory. Thus, at least some of the elements in the figures described herein or incorporated by reference can represent one or more algorithms. Alternatively, at least one of the elements may represent a portion of logic including a portion of hardware configured to provide constituent structures and/or functions. These may vary and are not limited to the examples or descriptions provided.
As hardware and/or firmware, the structures and/or techniques described or incorporated by reference herein may be implemented using various types of programming or integrated circuit design languages, including a hardware description language such as any register transfer language ("RTL") configured to design field programmable gate arrays ("FPGAs"), application specific integrated circuits ("ASICs"), multi-chip modules, or any other type of integrated circuit.
For example, the module 3350 of fig. 33, the module 3450 of fig. 34, and the module 3550 of fig. 35, or one or more of their components, or any process or device described herein, can be implemented in one or more computing devices that include one or more circuits. Thus, at least some of the elements in the figures described herein or incorporated by reference can represent one or more components of hardware. Alternatively, at least one of the elements may represent a portion including logic configured to provide a portion of circuitry forming a structure and/or function.
According to some embodiments, the term "circuitry" may refer, for example, to any system comprising a plurality of components, including discrete and composite components, through which current flows to perform one or more functions. Examples of discrete components include transistors, resistors, capacitors, inductors, diodes, and the like, and examples of composite components include memories, processors, analog circuits, digital circuits, and the like, including field programmable gate arrays ("FPGAs"), application specific integrated circuits ("ASICs"). Thus, a circuit may comprise a system of electronic components and logic components (e.g., logic configured to execute instructions such as a set of executable instructions of an algorithm, and thus be a component of a circuit). According to some embodiments, the term "module" may refer, for example, to an algorithm or a portion of an algorithm implemented in hardware circuitry or software, or a combination thereof, and/or logic (i.e., a module may be implemented as circuitry). In some embodiments, the algorithm and/or the memory storing the algorithm is a "component" of a circuit. Thus, for example, the term "circuitry" may also refer to a system of components, including algorithms. These may vary and are not limited to the examples or descriptions provided. Thus, any of the above structures and/or functions may be implemented in hardware (including circuitry).
According to an aspect of the invention, there is provided a method comprising: identifying first data representative of characteristics of one or more dynamic objects in one or more of the simulated environment or the physical environment; determining a classification of a dynamic object based at least in part on the first data; identifying second data representative of a dynamic correlation property associated with the dynamic object; generating a data model of the dynamic object based at least in part on one or more of the first data or the second data; simulating a predicted range of motion of the dynamic object in a simulation environment; and simulating a predicted response that simulates one or more functions of an autonomous vehicle based at least in part on the predicted range of motion of the dynamic object.
In the method of an embodiment of the present invention, simulating the predicted response comprises: calculating a rate of change of distance between the simulated autonomous vehicle and the predicted range of motion of the dynamic object; and causing the simulated autonomous vehicle to avoid the dynamic object in the simulated environment based at least in part on the calculated rate of change of distance.
In an embodiment of the present invention, the method further includes: verifying one or more changes in a function associated with a physical autonomous vehicle based at least on simulating the predicted response to the one or more functions of the simulated autonomous vehicle.
In the method of an embodiment of the present invention, determining the classification of the dynamic object comprises determining that the dynamic object is one of an animal dynamic object, a pedestrian dynamic object, and an electric vehicle dynamic object, and wherein the predicted range of motion is selected based at least in part on the classification of the dynamic object.
In the method of an embodiment of the present invention, the method further comprises: providing data associated with simulating the predicted response to a computing device associated with a teleoperator; receiving teleoperator data from the computing device associated with the teleoperator; and controlling operation of one or more of a physical autonomous vehicle or the simulated autonomous vehicle based at least in part on the teleoperator data.
In the method of an embodiment of the present invention, the method further includes: randomizing the predicted range of motion based at least in part on the classification of the dynamic object.
In the method of an embodiment of the present invention, the method further includes: modeling sensor data to characterize a sensor device to form sensor data having a characteristic, wherein the sensor data is modeled using one or more of data obtained by physical sensors in one or more physical environments or simulated sensor data; and generating data representative of the analog sensor device based at least in part on the sensor data having the characteristic.
In a method of an embodiment of the invention, modeling the sensor data comprises: adjusting the data representative of the analog sensor device based at least in part on a measured deviation.
In a method of an embodiment of the invention, modeling the sensor data comprises: modeling the lidar sensor data to characterize the lidar sensor to form lidar data having characteristics; generating data representative of a simulated lidar device based at least in part on the lidar data having the characteristic; modeling data representative of lidar measurement offsets associated with the lidar device; and adjusting the data representative of the simulated lidar device based at least in part on the lidar measurement offset.
In the method of an embodiment of the present invention, the method further includes: identifying component characteristics associated with the one or more components of the autonomous vehicle; generating one or more data models for the one or more components based at least in part on the component characteristics; and simulating operation of the one or more components based at least in part on the one or more data models to predict behavior of the simulated autonomous vehicle.
In the method of an embodiment of the present invention, the method further includes: accessing an event data model that includes data representative of event characteristics associated with an event; simulating the event in the simulation environment based at least in part on the event characteristics; and simulating another predicted response of the one or more functions of the simulated autonomous vehicle.
According to another aspect of the invention, there is provided a system comprising: one or more computing devices comprising one or more processors, wherein the one or more computing devices are configured to: receiving first data representing a property of a dynamic object; determining a classification of a dynamic object based at least in part on the first data to identify a classified dynamic object; identifying second data representative of dynamic correlation characteristics associated with the classified dynamic object; generating a data model of the classified dynamic object based at least in part on the second data representative of the dynamically-relevant characteristics of the classified dynamic object; simulating a predicted range of motion of the classified dynamic object in a simulation environment; and simulating a predicted response of a data representation simulating one or more functions of an autonomous vehicle based at least in part on the predicted range of motion of the classified dynamic object.
In the system of an embodiment of the invention, the one or more computing devices are further configured to: executing simulated instructions to cause the simulated autonomous vehicle to perform a simulated maneuver based at least in part on the predicted range of motion of the simulated dynamic object; generating data associated with the simulated autonomous vehicle performing the simulated maneuver; and providing the data to one or more of the remote operators.
In the system of an embodiment of the invention, the one or more computing devices are further configured to: modeling the sensor data to characterize the sensor device to form sensor data having a characteristic; and generating third data representative of an analog sensor device based at least in part on the sensor data having the characteristic.
In the system of an embodiment of the invention, the one or more computing devices are further configured to: identifying a component characteristic associated with a component of the autonomous vehicle; generating a data model for one or more components based at least in part on the component characteristics; and simulating operation of the one or more components based at least in part on the data model to predict behavior of the simulated autonomous vehicle.
In the system of an embodiment of the present invention, the system further includes: receiving, via a computing device associated with a teleoperator, data associated with controlling a simulated maneuver; executing simulated instructions to cause the simulated autonomous vehicle to perform the simulated maneuver; generating data associated with the simulated autonomous vehicle performing the simulated maneuver; and analyzing the data to determine compliance with one or more policies.
In the system of an embodiment of the invention, the system further comprises providing the predicted response to a computing device associated with a physical autonomous vehicle.
According to another aspect of the invention, there is provided a non-transitory computer-readable storage medium having stored thereon computer-executable instructions that, when executed by a computer, cause the computer to perform acts comprising: determining a classification of the dynamic object based at least in part on data representative of characteristics of one or more dynamic objects in the one or more environments; identifying a dynamic correlation property associated with the dynamic object; generating a data model of the dynamic object based at least in part on the dynamic correlation characteristic; and simulating one or more events associated with a simulated autonomous vehicle in a simulated environment based at least in part on the predicted range of motion of the dynamic object.
In a non-transitory computer readable storage medium of an embodiment of the invention, simulating the one or more events comprises: calculating a rate of change of distance between the simulated autonomous vehicle and the predicted range of motion of the dynamic object; and calculating one or more trajectories based at least in part on the calculated rate of change of distance, the one or more trajectories causing the simulated autonomous vehicle to avoid the dynamic object in the simulated environment.
In a non-transitory computer-readable storage medium of an embodiment of the invention, the acts further comprise: providing the one or more trajectories to a computing device associated with a teleoperator; receiving, from the computing device associated with the remote operator, a selection of one of the one or more trajectories; and performing one or more of the following: simulating movement of the simulated autonomous vehicle using the selection of the one or more trajectories; or cause a physical autonomous vehicle to utilize the selection of the one or more trajectories.
In a non-transitory computer-readable storage medium of an embodiment of the invention, the acts further comprise: receiving, via a computing device associated with a remote operator, data associated with controlling a simulated maneuver of the simulated autonomous vehicle; and executing simulation instructions to cause the simulated autonomous vehicle to perform the simulated maneuver.
In a non-transitory computer-readable storage medium of an embodiment of the invention, the acts further comprise: generating data associated with the simulated autonomous vehicle performing the simulated maneuver; and analyzing the data to determine compliance with one or more policies.
In a non-transitory computer-readable storage medium of an embodiment of the invention, the acts further comprise: modeling the sensor data to characterize the sensor device to form sensor data having a characteristic; and generating data representative of the analog sensor device based at least in part on the sensor data having the characteristic.
Although the foregoing examples have been described in some detail for purposes of clarity of understanding, the inventive techniques described above are not limited to the details provided. There are many alternative ways of implementing the inventive techniques described above. The disclosed examples are illustrative and not restrictive.
Claims (20)
1. A method, comprising:
receiving sensor data from a sensor associated with a vehicle, the sensor data indicative of an object in a physical environment;
determining a classification associated with the object based at least in part on the sensor data;
determining at least one of a heading, a velocity, or an acceleration of the object based at least in part on the sensor data;
generating a data model associated with the object based at least in part on the sensor data, the classification, or at least one of the heading, the speed, or the acceleration;
generating a simulation based at least in part on the data model and the at least one of the heading, velocity, or acceleration of the object;
determining a probability associated with at least one of a velocity, an acceleration, a position, or an action of the object in a simulated environment based at least in part on the simulation; and
controlling the vehicle based at least in part on the probability.
2. The method of claim 1, wherein the data model comprises one or more of a dynamic object data modeler, an environmental modeler, a sensor modeler, or a vehicle modeler.
3. The method of claim 1, wherein determining the probability is further based at least in part on a second probability determined based at least in part on simulating an additional object based at least in part on an additional data model associated with the additional object.
4. The method of claim 1, wherein the probability indicates a likelihood that the object will transition from a static state to a dynamic state or interfere with a planned path of the vehicle.
5. The method of claim 1, further comprising receiving an indication of an event in the physical environment surrounding the vehicle,
wherein generating the simulation is further based on the event.
6. The method of claim 1, further comprising simulating a maneuver of the vehicle based at least in part on the probability, and wherein controlling the vehicle is further based at least in part on simulating the maneuver.
7. A system, comprising:
one or more processors;
a memory storing processor-executable instructions that, when executed by the one or more processors, cause the system to perform operations comprising:
receiving sensor data representing objects in an environment from sensors associated with a vehicle;
determining a classification associated with the object based at least in part on the sensor data;
determining at least one of a heading, a speed, or an acceleration of the object;
generating a data model associated with the object based at least in part on the sensor data, the classification, or at least one of the heading, the speed, or the acceleration;
generating a simulation based at least in part on the data model and the at least one of the heading, velocity, or acceleration of the object;
determining a probability associated with at least one of a velocity, an acceleration, a position, or an action of the object in a simulated environment based at least in part on the simulation; and
controlling the vehicle based at least in part on the probability.
8. The system of claim 7, wherein the data model comprises one or more of a dynamic object data modeler, an environmental modeler, a sensor modeler, or a vehicle modeler.
9. The system of claim 7, wherein determining the probability is further based at least in part on a second probability determined based at least in part on simulating an additional object based at least in part on an additional data model associated with the additional object.
10. The system of claim 7, wherein the probability indicates a likelihood that the object will transition from a static state to a dynamic state.
11. The system of claim 7, wherein the operations further comprise receiving an indication of an event in an environment surrounding the vehicle,
wherein generating the simulation is further based on the event.
12. The system of claim 7, wherein the operations further comprise simulating a maneuver of the vehicle based at least in part on the probability, and wherein controlling the vehicle is further based at least in part on simulating the maneuver.
13. The system of claim 12, wherein simulating the maneuver is based at least in part on:
generating one or more simulated candidate trajectories based at least in part on the probabilities; and
instantiating the one or more simulated candidate trajectories in the simulation, and
wherein controlling the vehicle is based at least in part on determining that at least one of the simulated candidate trajectories is a collision-free operation.
14. One or more non-transitory computer-readable media storing processor-executable instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising:
receiving, from a sensor associated with a vehicle, sensor data representing objects in an environment surrounding the vehicle;
determining, by a vehicle controller and based at least in part on the sensor data, a classification associated with the object;
determining at least one of a heading, a velocity, or an acceleration of the object;
generating a data model associated with the object based at least in part on the sensor data, the classification, or at least one of the heading, the speed, or the acceleration;
generating a simulation based at least in part on the data model and the at least one of the heading, velocity, or acceleration of the object;
determining a probability associated with at least one of a velocity, an acceleration, a position, or an action of the object in a simulated environment based at least in part on the simulation; and
generating a command for controlling the vehicle based at least in part on the probability.
15. The one or more non-transitory computer-readable media of claim 14, wherein the data model comprises one or more of a dynamic object data modeler, an environmental modeler, a sensor modeler, or a vehicle modeler.
16. The one or more non-transitory computer-readable media of claim 14, wherein determining the probability is further based at least in part on a second probability determined based at least in part on simulating an additional object based at least in part on an additional data model associated with the additional object.
17. The one or more non-transitory computer-readable media of claim 14, wherein the probability indicates a likelihood that the object will transition from a static state to a dynamic state.
18. The one or more non-transitory computer-readable media of claim 14, wherein the operations further comprise receiving an indication of an event in an environment surrounding the vehicle,
wherein generating the simulation is further based on the event.
19. The one or more non-transitory computer-readable media of claim 14, wherein the operations further comprise simulating a maneuver of the vehicle based at least in part on the probability, and wherein generating the command is further based at least in part on simulating the maneuver.
20. The one or more non-transitory computer-readable media of claim 19, wherein simulating the maneuver is based at least in part on:
generating one or more simulated candidate trajectories based at least in part on the probabilities; and
instantiating the one or more simulated candidate trajectories in the simulation, and
wherein generating the command is based at least in part on determining that at least one of the simulated candidate trajectories is a collision-free operation.
Applications Claiming Priority (34)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/932,954 | 2015-11-04 | ||
US14/756,996 US9916703B2 (en) | 2015-11-04 | 2015-11-04 | Calibration for autonomous vehicle operation |
US14/932,952 US10745003B2 (en) | 2015-11-04 | 2015-11-04 | Resilient safety system for a robotic vehicle |
US14/756,991 US9720415B2 (en) | 2015-11-04 | 2015-11-04 | Sensor-based object-detection optimization for autonomous vehicles |
US14/932,958 US9494940B1 (en) | 2015-11-04 | 2015-11-04 | Quadrant configuration of robotic vehicles |
US14/756,994 US9701239B2 (en) | 2015-11-04 | 2015-11-04 | System of configuring active lighting to indicate directionality of an autonomous vehicle |
US14/932,959 US9606539B1 (en) | 2015-11-04 | 2015-11-04 | Autonomous vehicle fleet service and system |
US14/932,940 | 2015-11-04 | ||
US14/756,993 | 2015-11-04 | ||
US14/932,962 | 2015-11-04 | ||
US14/932,966 US9507346B1 (en) | 2015-11-04 | 2015-11-04 | Teleoperation system and method for trajectory modification of autonomous vehicles |
US14/756,992 US9910441B2 (en) | 2015-11-04 | 2015-11-04 | Adaptive autonomous vehicle planner logic |
US14/932,948 US9804599B2 (en) | 2015-11-04 | 2015-11-04 | Active lighting control for communicating a state of an autonomous vehicle to entities in a surrounding environment |
US14/756,993 US9878664B2 (en) | 2015-11-04 | 2015-11-04 | Method for robotic vehicle communication with an external environment via acoustic beam forming |
US14/932,940 US9734455B2 (en) | 2015-11-04 | 2015-11-04 | Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles |
US14/932,966 | 2015-11-04 | ||
US14/756,992 | 2015-11-04 | ||
US14/932,959 | 2015-11-04 | ||
US14/756,995 US9958864B2 (en) | 2015-11-04 | 2015-11-04 | Coordination of dispatching and maintaining fleet of autonomous vehicles |
US14/932,963 US9612123B1 (en) | 2015-11-04 | 2015-11-04 | Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes |
US14/756,991 | 2015-11-04 | ||
US14/932,952 | 2015-11-04 | ||
US14/756,995 | 2015-11-04 | ||
US14/932,963 | 2015-11-04 | ||
US14/932,962 US9630619B1 (en) | 2015-11-04 | 2015-11-04 | Robotic vehicle active safety systems and methods |
US14/756,996 | 2015-11-04 | ||
US14/756,994 | 2015-11-04 | ||
US14/932,958 | 2015-11-04 | ||
US14/932,954 US9517767B1 (en) | 2015-11-04 | 2015-11-04 | Internal safety systems for robotic vehicles |
US14/932,948 | 2015-11-04 | ||
US14/757,016 | 2015-11-05 | ||
US14/757,016 US10496766B2 (en) | 2015-11-05 | 2015-11-05 | Simulation system and methods for autonomous vehicles |
PCT/US2016/060030 WO2017079229A1 (en) | 2015-11-04 | 2016-11-02 | Simulation system and methods for autonomous vehicles |
CN201680064648.2A CN108290579B (en) | 2015-11-04 | 2016-11-02 | Simulation system and method for autonomous vehicle |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201680064648.2A Division CN108290579B (en) | 2015-11-04 | 2016-11-02 | Simulation system and method for autonomous vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114643995A true CN114643995A (en) | 2022-06-21 |
Family
ID=62817166
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210276163.7A Pending CN114643995A (en) | 2015-11-04 | 2016-11-02 | Simulation system and method for autonomous vehicle |
CN201680064648.2A Active CN108290579B (en) | 2015-11-04 | 2016-11-02 | Simulation system and method for autonomous vehicle |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201680064648.2A Active CN108290579B (en) | 2015-11-04 | 2016-11-02 | Simulation system and method for autonomous vehicle |
Country Status (2)
Country | Link |
---|---|
JP (2) | JP7036732B2 (en) |
CN (2) | CN114643995A (en) |
Families Citing this family (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11138350B2 (en) | 2018-08-09 | 2021-10-05 | Zoox, Inc. | Procedural world generation using tertiary data |
CN109532511B (en) * | 2018-12-07 | 2021-06-08 | 纳恩博(北京)科技有限公司 | Driving method of electric scooter and electric scooter |
EP3663942B1 (en) * | 2018-12-07 | 2023-04-26 | Volvo Car Corporation | Evaluation of a simulated vehicle functionality feature |
CN111307166B (en) * | 2018-12-11 | 2023-10-03 | 北京图森智途科技有限公司 | Method and device for constructing occupied grid map and processing equipment |
CN111413957B (en) | 2018-12-18 | 2021-11-02 | 北京航迹科技有限公司 | System and method for determining driving actions in autonomous driving |
US11299154B2 (en) * | 2019-04-11 | 2022-04-12 | Hyundai Motor Company | Apparatus and method for providing user interface for platooning in vehicle |
US11987287B2 (en) | 2019-04-12 | 2024-05-21 | Continental Autonomous Mobility US, LLC | Autonomous truck-trailer maneuvering and parking |
CN110239518B (en) * | 2019-05-20 | 2023-09-01 | 福瑞泰克智能系统有限公司 | Vehicle transverse position control method and device |
US20200406894A1 (en) * | 2019-06-28 | 2020-12-31 | Zoox, Inc. | System and method for determining a target vehicle speed |
KR102228516B1 (en) * | 2019-08-05 | 2021-03-16 | 엘지전자 주식회사 | Autonomous vehicle for carrying user group with multiple users, method and control server for controlling the autonomous driving vehicle |
CN110673636B (en) * | 2019-09-30 | 2023-01-31 | 上海商汤临港智能科技有限公司 | Unmanned simulation test system and method, and storage medium |
CN110766793B (en) * | 2019-10-08 | 2023-06-30 | 北京地平线机器人技术研发有限公司 | Map construction method and device based on semantic point cloud |
US11325492B2 (en) * | 2019-10-09 | 2022-05-10 | Carnegie Mellon University | Method for determining optimal placement of electric vehicle chargers |
KR102255595B1 (en) * | 2019-10-23 | 2021-05-26 | 국민대학교산학협력단 | Device and method for providing automated driving information from the user perspective |
CN111026873B (en) * | 2019-10-24 | 2023-06-20 | 中国人民解放军军事科学院国防科技创新研究院 | Unmanned vehicle and navigation method and device thereof |
KR102139513B1 (en) * | 2019-11-28 | 2020-08-12 | 국민대학교산학협력단 | Autonomous driving control apparatus and method based on ai vehicle in the loop simulation |
US11675366B2 (en) * | 2019-12-27 | 2023-06-13 | Motional Ad Llc | Long-term object tracking supporting autonomous vehicle navigation |
CN111259545B (en) * | 2020-01-15 | 2023-08-08 | 吉利汽车研究院(宁波)有限公司 | Intelligent driving virtual simulation cloud platform |
CN111221334B (en) * | 2020-01-17 | 2021-09-21 | 清华大学 | Environmental sensor simulation method for simulating automatic driving automobile |
CN111324945B (en) | 2020-01-20 | 2023-09-26 | 阿波罗智能技术(北京)有限公司 | Sensor scheme determining method, device, equipment and storage medium |
EP3869843B1 (en) * | 2020-02-19 | 2024-06-19 | Volkswagen Ag | Method for invoking a teleoperated driving session, apparatus for performing the steps of the method, vehicle and computer program |
US20210302981A1 (en) * | 2020-03-31 | 2021-09-30 | Gm Cruise Holdings Llc | Proactive waypoints for accelerating autonomous vehicle testing |
CN111553844B (en) | 2020-04-29 | 2023-08-29 | 阿波罗智能技术(北京)有限公司 | Method and device for updating point cloud |
JP7303153B2 (en) * | 2020-05-18 | 2023-07-04 | トヨタ自動車株式会社 | Vehicle driving support device |
US11586200B2 (en) * | 2020-06-22 | 2023-02-21 | The Boeing Company | Method and system for vehicle engagement control |
US11755469B2 (en) * | 2020-09-24 | 2023-09-12 | Argo AI, LLC | System for executing structured tests across a fleet of autonomous vehicles |
CN112612269B (en) * | 2020-12-14 | 2021-11-12 | 北京理工大学 | Hidden attack strategy acquisition method for Mecanum wheel trolley |
CN112590871B (en) * | 2020-12-23 | 2022-09-02 | 交控科技股份有限公司 | Train safety protection method, device and system |
CN112764984B (en) * | 2020-12-25 | 2023-06-02 | 际络科技(上海)有限公司 | Automatic driving test system and method, electronic equipment and storage medium |
CN114834475B (en) * | 2021-01-15 | 2024-05-31 | 宇通客车股份有限公司 | Vehicle output torque control method and device |
CN113222335B (en) * | 2021-04-06 | 2022-10-14 | 同济大学 | Risk assessment utility-based security unmanned vehicle group construction method |
CN113257073B (en) * | 2021-06-24 | 2021-09-21 | 成都运达科技股份有限公司 | Train driving simulation stability analysis method, system, terminal and medium |
CN114120252B (en) * | 2021-10-21 | 2023-09-01 | 阿波罗智能技术(北京)有限公司 | Automatic driving vehicle state identification method and device, electronic equipment and vehicle |
CN114475653B (en) * | 2021-12-28 | 2024-03-15 | 广州文远知行科技有限公司 | Vehicle emergency steering simulation scene configuration method and device |
CN115158346B (en) * | 2022-01-19 | 2024-10-01 | 江苏徐工工程机械研究院有限公司 | Human-like driving system and control method for unmanned mining trucks |
US20230278589A1 (en) | 2022-03-07 | 2023-09-07 | Woven By Toyota, Inc. | Autonomous driving sensor simulation |
TWI832203B (en) * | 2022-04-08 | 2024-02-11 | 富智捷股份有限公司 | Verification system and verification method |
US12091001B2 (en) * | 2022-04-20 | 2024-09-17 | Gm Cruise Holdings Llc | Safety measurement of autonomous vehicle driving in simulation |
CN114743010B (en) * | 2022-06-13 | 2022-08-26 | 山东科技大学 | Ultrahigh voltage power transmission line point cloud data semantic segmentation method based on deep learning |
CN115179978B (en) * | 2022-07-18 | 2023-05-16 | 内蒙古工业大学 | Shuttle car obstacle avoidance early warning system based on stereo earphone |
CN118259316B (en) * | 2024-05-31 | 2024-07-30 | 国家海洋环境监测中心 | Sea area ship track prediction method and system based on Beidou system |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3281188B2 (en) * | 1994-08-09 | 2002-05-13 | ヤマハ発動機株式会社 | Unmanned car |
FR2889882B1 (en) * | 2005-08-19 | 2009-09-25 | Renault Sas | METHOD AND SYSTEM FOR PREDICTING IMPACT BETWEEN A VEHICLE AND A PEDESTRIAN |
US9459622B2 (en) * | 2007-01-12 | 2016-10-04 | Legalforce, Inc. | Driverless vehicle commerce network and community |
JP2011248855A (en) | 2010-04-30 | 2011-12-08 | Denso Corp | Vehicle collision warning apparatus |
US8509982B2 (en) * | 2010-10-05 | 2013-08-13 | Google Inc. | Zone driving |
US20130197736A1 (en) | 2012-01-30 | 2013-08-01 | Google Inc. | Vehicle control based on perception uncertainty |
US8996224B1 (en) * | 2013-03-15 | 2015-03-31 | Google Inc. | Detecting that an autonomous vehicle is in a stuck condition |
US9656667B2 (en) | 2014-01-29 | 2017-05-23 | Continental Automotive Systems, Inc. | Method for minimizing automatic braking intrusion based on collision confidence |
US9720410B2 (en) * | 2014-03-03 | 2017-08-01 | Waymo Llc | Remote assistance for autonomous vehicles in predetermined situations |
2016
- 2016-11-02 CN CN202210276163.7A patent/CN114643995A/en active Pending
- 2016-11-02 JP JP2018544031A patent/JP7036732B2/en active Active
- 2016-11-02 CN CN201680064648.2A patent/CN108290579B/en active Active
2021
- 2021-12-24 JP JP2021210801A patent/JP7330259B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
JP7036732B2 (en) | 2022-03-15 |
JP2022046646A (en) | 2022-03-23 |
CN108290579A (en) | 2018-07-17 |
JP2019504800A (en) | 2019-02-21 |
CN108290579B (en) | 2022-04-12 |
JP7330259B2 (en) | 2023-08-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108290579B (en) | Simulation system and method for autonomous vehicle | |
US11796998B2 (en) | Autonomous vehicle fleet service and system | |
CN108292474B (en) | Coordination of a fleet of dispatching and maintaining autonomous vehicles | |
US11301767B2 (en) | Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles | |
JP7195143B2 (en) | Adaptive Autonomous Vehicle Planner Logic | |
JP7316789B2 (en) | Adaptive mapping for navigating autonomous vehicles in response to changes in the physical environment | |
US11106218B2 (en) | Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes | |
US20200074024A1 (en) | Simulation system and methods for autonomous vehicles | |
US9734455B2 (en) | Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles | |
CN108369775B (en) | Adaptive mapping to navigate an autonomous vehicle in response to changes in a physical environment | |
US9720415B2 (en) | Sensor-based object-detection optimization for autonomous vehicles | |
US9632502B1 (en) | Machine-learning systems and techniques to optimize teleoperation and/or planner decisions | |
WO2017079229A1 (en) | Simulation system and methods for autonomous vehicles | |
JP2022137160A (en) | Machine learning system and technique for optimizing remote control and/or planner determination | |
US20240028031A1 (en) | Autonomous vehicle fleet service and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||