WO2017079460A2 - Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes - Google Patents

Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes

Info

Publication number
WO2017079460A2
WO2017079460A2 (PCT application PCT/US2016/060368)
Authority
WO
WIPO (PCT)
Prior art keywords
data
autonomous vehicle
vehicle
map
sensor data
Prior art date
Application number
PCT/US2016/060368
Other languages
French (fr)
Other versions
WO2017079460A8 (en)
WO2017079460A3 (en)
Inventor
Jesse Sol LEVINSON
Gabriel Thurston SIBLEY
Original Assignee
Zoox, Inc.
Priority date
Filing date
Publication date
Priority claimed from US14/756,996 (US9916703B2)
Priority claimed from US14/756,992 (US9910441B2)
Priority claimed from US14/756,991 (US9720415B2)
Priority claimed from US14/756,995 (US9958864B2)
Priority claimed from US14/932,940 (US9734455B2)
Priority claimed from US14/932,959 (US9606539B1)
Priority to CN201680064836.5A (CN108369775B)
Priority to CN202111033039.XA (CN113721629B)
Application filed by Zoox, Inc.
Priority to EP16862985.5A (EP3371797A4)
Priority to JP2018543270A (JP7316789B2)
Priority to CN202410296946.0A (CN118192555A)
Publication of WO2017079460A2
Publication of WO2017079460A8
Publication of WO2017079460A3
Priority to JP2022020682A (JP2022065083A)

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0038Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60LPROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
    • B60L3/00Electric devices on electrically-propelled vehicles for safety purposes; Monitoring operating variables, e.g. speed, deceleration or energy consumption
    • B60L3/0007Measures or means for preventing or attenuating collisions
    • B60L3/0015Prevention of collisions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/503Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking using luminous text or symbol displays in or on the vehicle, e.g. static text
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/507Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking specific to autonomous vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/508Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking specific to vehicles driving in fleets or convoys
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q5/00Arrangement or adaptation of acoustic signal devices
    • B60Q5/005Arrangement or adaptation of acoustic signal devices automatically actuated
    • B60Q5/006Arrangement or adaptation of acoustic signal devices automatically actuated indicating risk of collision between vehicles or with pedestrians
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B60Q9/008Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09Taking automatic action to avoid collision, e.g. braking and steering
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • B60W30/0956Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • B60W60/0011Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • B60W60/0027Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3446Details of route searching algorithms, e.g. Dijkstra, A*, arc-flags, using precalculated routes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3635Guidance using 3D or perspective road maps
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804Creation or updating of map data
    • G01C21/3833Creation or updating of map data characterised by the source of data
    • G01C21/3848Data obtained from both position sensors and additional sensors
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863Structures of map data
    • G01C21/3867Geometry of map features, e.g. shape points, polygons or for simplified maps
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863Structures of map data
    • G01C21/387Organisation of map data, e.g. version management or database structures
    • G01C21/3881Tile-based structures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87Combinations of systems using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/875Combinations of systems using electromagnetic waves other than radio waves for determining attitude
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • G01S7/4972Alignment of sensor
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0044Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0278Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0291Fleet control
    • G05D1/0297Fleet control by controlling means in a control room
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20Control system inputs
    • G05D1/22Command input arrangements
    • G05D1/221Remote-control arrangements
    • G05D1/222Remote-control arrangements operated by humans
    • G05D1/224Output arrangements on the remote controller, e.g. displays, haptics or speakers
    • G05D1/2244Optic
    • G05D1/2245Optic providing the operator with a purely computer-generated representation of the environment of the vehicle, e.g. virtual reality
    • G05D1/2246Optic providing the operator with a purely computer-generated representation of the environment of the vehicle, e.g. virtual reality displaying a map of the environment
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20Control system inputs
    • G05D1/22Command input arrangements
    • G05D1/221Remote-control arrangements
    • G05D1/222Remote-control arrangements operated by humans
    • G05D1/224Output arrangements on the remote controller, e.g. displays, haptics or speakers
    • G05D1/2244Optic
    • G05D1/2247Optic providing the operator with simple or augmented images from one or more cameras
    • G05D1/2249Optic providing the operator with simple or augmented images from one or more cameras using augmented reality
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20Control system inputs
    • G05D1/22Command input arrangements
    • G05D1/221Remote-control arrangements
    • G05D1/226Communication links with the remote-control arrangements
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60Intended control result
    • G05D1/69Coordinated control of the position or course of two or more vehicles
    • G05D1/698Control allocation
    • G05D1/6987Control allocation by centralised control off-board any of the vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/164Centralised systems, e.g. external to vehicles
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/20Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles
    • G08G1/205Indicating the location of the monitored vehicles as destination, e.g. accidents, stolen, rental
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10KSOUND-PRODUCING DEVICES; METHODS OR DEVICES FOR PROTECTING AGAINST, OR FOR DAMPING, NOISE OR OTHER ACOUSTIC WAVES IN GENERAL; ACOUSTICS NOT OTHERWISE PROVIDED FOR
    • G10K11/00Methods or devices for transmitting, conducting or directing sound in general; Methods or devices for protecting against, or for damping, noise or other acoustic waves in general
    • G10K11/18Methods or devices for transmitting, conducting or directing sound
    • G10K11/26Sound-focusing or directing, e.g. scanning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60LPROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
    • B60L2240/00Control parameters of input or output; Target parameters
    • B60L2240/60Navigation input
    • B60L2240/62Vehicle position
    • B60L2240/622Vehicle position by satellite navigation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60LPROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
    • B60L2260/00Operating Modes
    • B60L2260/20Drive modes; Transition between modes
    • B60L2260/32Auto pilot mode
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/549Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking for expressing greetings, gratitude or emotions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R2021/01204Actuation parameters of safety arrangents
    • B60R2021/01252Devices other than bags
    • B60R2021/01265Seat belts
    • B60R2021/01272Belt tensioners
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/20Static objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/40Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402Type
    • B60W2554/4029Pedestrians
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/40Dynamic objects, e.g. animals, windblown objects
    • B60W2554/406Traffic density
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865Combination of radar systems with lidar systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/87Combinations of radar systems, e.g. primary radar and secondary radar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9316Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles combined with communication equipment with other vehicles or with base stations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9322Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using additional data, e.g. driver condition, road state or weather data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/0009Transmission of position information to remote stations
    • G01S5/0018Transmission from mobile station to base station
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2105/00Specific applications of the controlled vehicles
    • G05D2105/20Specific applications of the controlled vehicles for transportation
    • G05D2105/22Specific applications of the controlled vehicles for transportation of humans
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2107/00Specific environments of the controlled vehicles
    • G05D2107/10Outdoor regulated spaces
    • G05D2107/13Spaces reserved for vehicle traffic, e.g. roads, regulated airspace or regulated waters
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2109/00Types of controlled vehicles
    • G05D2109/10Land vehicles
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/20Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles

Definitions

  • Various embodiments relate generally to autonomous vehicles and associated mechanical, electrical and electronic hardware, computer software and systems, and wired and wireless network communications to provide an autonomous vehicle fleet as a service. More specifically, systems, devices, and methods are configured to provide updates to maps, such as three-dimensional ("3D") maps, either locally (e.g., in-situ at autonomous vehicles) or remotely, or both, for navigating one or more of the vehicles adapted to changes in environments through which the vehicles traverse.
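  • To make the map-update idea above concrete, the following is a minimal, hypothetical sketch (not taken from the patent) of merging a locally observed 3D map tile into a reference tile where the environment appears to have changed; the voxel-grid representation, function names, and threshold are assumptions.

```python
# Minimal sketch, assuming 3D map tiles are stored as dense voxel occupancy grids.
# All names and thresholds are hypothetical; the patent does not prescribe this form.
import numpy as np

def update_tile(reference_tile: np.ndarray,
                local_tile: np.ndarray,
                change_threshold: float = 0.3) -> np.ndarray:
    """Blend a locally observed tile into the reference map where they disagree."""
    disagreement = np.abs(reference_tile - local_tile)
    changed = disagreement > change_threshold      # voxels where the world moved
    updated = reference_tile.copy()
    updated[changed] = local_tile[changed]         # adopt the local observation
    return updated

# Example usage: a 32 x 32 x 8 voxel occupancy tile.
ref = np.zeros((32, 32, 8), dtype=np.float32)
obs = ref.copy()
obs[10:12, 10:12, :2] = 1.0                        # e.g., a newly parked object
new_tile = update_tile(ref, obs)
```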
  • a variety of approaches to developing driverless vehicles focus predominantly on automating conventional vehicles (e.g., manually-driven automotive vehicles) with an aim toward producing driverless vehicles for consumer purchase.
  • a number of automotive companies and affiliates are modifying conventional automobiles and control mechanisms, such as steering, to provide consumers with an ability to own a vehicle that may operate without a driver.
  • a conventional driverless vehicle performs safety-critical driving functions in some conditions, but requires a driver to assume control (e.g., steering, etc.) should the vehicle controller fail to resolve certain issues that might jeopardize the safety of the occupants.
  • driverless vehicles typically have a number of drawbacks.
  • a large number of driverless cars under development have evolved from vehicles requiring manual (i.e., human-controlled) steering and other like automotive functions. Therefore, a majority of driverless cars are based on a paradigm that a vehicle is to be designed to accommodate a licensed driver, for which a specific seat or location is reserved within the vehicle.
  • driverless vehicles are designed sub-optimally and generally forego opportunities to simplify vehicle design and conserve resources (e.g., reducing costs of producing a driverless vehicle).
  • Other drawbacks are also present in conventional driverless vehicles.
  • L3 Level 3
  • NHTSA National Highway Traffic Safety Administration
  • typical approaches to driverless vehicles are generally not well-suited to detect and navigate vehicles relative to interactions (e.g., social interactions) between a vehicle-in-travel and other drivers of vehicles or individuals.
  • some conventional approaches are not sufficiently able to identify pedestrians, cyclists, etc., and associated interactions, such as eye contact, gesturing, and the like, for purposes of addressing safety risks to occupants of a driverless vehicle, as well as drivers of other vehicles, pedestrians, etc.
  • FIG. 1 is a diagram depicting implementation of a fleet of autonomous vehicles that are communicatively networked to an autonomous vehicle service platform, according to some embodiments;
  • FIG. 2 is an example of a flow diagram to monitor a fleet of autonomous vehicles, according to some embodiments
  • FIG. 3A is a diagram depicting examples of sensors and other autonomous vehicle components, according to some examples.
  • FIGs. 3B to 3E are diagrams depicting examples of sensor field redundancy and autonomous vehicle adaption to a loss of a sensor field, according to some examples
  • FIG. 4 is a functional block diagram depicting a system including an autonomous vehicle service platform that is communicatively coupled via a communication layer to an autonomous vehicle controller, according to some examples;
  • FIG. 5 is an example of a flow diagram to control an autonomous vehicle, according to some embodiments.
  • FIG. 6 is a diagram depicting an example of an architecture for an autonomous vehicle controller, according to some embodiments.
  • FIG. 7 is a diagram depicting an example of an autonomous vehicle service platform implementing redundant communication channels to maintain reliable communications with a fleet of autonomous vehicles, according to some embodiments;
  • FIG. 8 is a diagram depicting an example of a messaging application configured to exchange data among various applications, according to some embodiments
  • FIG. 9 is a diagram depicting types of data for facilitating teleoperations using a communications protocol described in FIG. 8, according to some examples.
  • FIG. 10 is a diagram illustrating an example of a teleoperator interface with which a teleoperator may influence path planning, according to some embodiments;
  • FIG. 11 is a diagram depicting an example of a planner configured to invoke teleoperations, according to some examples;
  • FIG. 12 is an example of a flow diagram configured to control an autonomous vehicle, according to some embodiments.
  • FIG. 13 depicts an example in which a planner may generate a trajectory, according to some examples
  • FIG. 14 is a diagram depicting another example of an autonomous vehicle service platform, according to some embodiments.
  • FIG. 15 is an example of a flow diagram to control an autonomous vehicle, according to some embodiments.
  • FIG. 16 is a diagram of an example of an autonomous vehicle fleet manager implementing a fleet optimization manager, according to some examples
  • FIG. 17 is an example of a flow diagram for managing a fleet of autonomous vehicles, according to some embodiments.
  • FIG. 18 is a diagram illustrating an autonomous vehicle fleet manager implementing an autonomous vehicle communications link manager, according to some embodiments.
  • FIG. 19 is an example of a flow diagram to determine actions for autonomous vehicles during an event, according to some embodiments.
  • FIG. 20 is a diagram depicting an example of a localizer, according to some embodiments.
  • FIG. 21 is an example of a flow diagram to generate local pose data based on integrated sensor data, according to some embodiments.
  • FIG. 22 is a diagram depicting another example of a localizer, according to some embodiments.
  • FIG. 23 is a diagram depicting an example of a perception engine, according to some embodiments.
  • FIG. 24 is an example of a flow chart to generate perception engine data, according to some embodiments.
  • FIG. 25 is an example of a segmentation processor, according to some embodiments.
  • FIG. 26A is a diagram depicting examples of an object tracker and a classifier, according to various embodiments.
  • FIG. 26B is a diagram depicting another example of an object tracker according to at least some examples.
  • FIG. 27 is an example of a front-end processor for a perception engine, according to some examples;
  • FIG. 28 is a diagram depicting a simulator configured to simulate an autonomous vehicle in a synthetic environment, according to various embodiments;
  • FIG. 29 is an example of a flow chart to simulate various aspects of an autonomous vehicle, according to some embodiments.
  • FIG. 30 is an example of a flow chart to generate map data, according to some embodiments.
  • FIG. 31 is a diagram depicting an architecture of a mapping engine, according to some embodiments
  • FIG. 32 is a diagram depicting an autonomous vehicle application, according to some examples.
  • FIGs. 33 to 35 illustrate examples of various computing platforms configured to provide various functionalities to components of an autonomous vehicle service, according to various embodiments
  • FIG. 36 is a diagram depicting a mapping engine configured to generate mapping data adaptively for autonomous vehicles responsive to changes in physical environments, according to some examples
  • FIG. 37 is a diagram depicting an example of an autonomous vehicle controller implementing updated map data, according to some examples.
  • FIG. 38 is a flow chart illustrating an example of generating map data, according to some examples.
  • FIG. 39 is a diagram depicting an example of a localizer configured to implement map data and locally-generated map data, according to some examples
  • FIG. 40 is a diagram depicting an example of a localizer configured to vary transmission rates or amounts of locally-generated sensor and/or map data, according to some examples
  • FIG. 41 is a flow diagram depicting an example using various amounts of locally-generated map data to localize an autonomous vehicle, according to some examples.
  • FIGs. 42 to 43 illustrate examples of various computing platforms configured to provide various mapping-related functionalities to components of an autonomous vehicle service, according to various embodiments.
  • Various embodiments or examples may be implemented in numerous ways, including as a system, a process, an apparatus, a user interface, or a series of program instructions on a computer readable medium such as a computer readable storage medium or a computer network where the program instructions are sent over optical, electronic, or wireless communication links.
  • a computer readable medium such as a computer readable storage medium or a computer network where the program instructions are sent over optical, electronic, or wireless communication links.
  • operations of disclosed processes may be performed in an arbitrary order, unless otherwise provided in the claims.
  • FIG. 1 is a diagram depicting an implementation of a fleet of autonomous vehicles that are communicatively networked to an autonomous vehicle service platform, according to some embodiments.
  • Diagram 100 depicts a fleet of autonomous vehicles 109 (e.g., one or more of autonomous vehicles 109a to 109e) operating as a service, each autonomous vehicle 109 being configured to self-drive a road network 110 and establish a communication link 192 with an autonomous vehicle service platform 101.
  • a user 102 may transmit a request 103 for autonomous transportation via one or more networks 106 to autonomous vehicle service platform 101.
  • autonomous vehicle service platform 101 may dispatch one of autonomous vehicles 109 to transport user 102 autonomously from geographic location 119 to geographic location 111.
  • Autonomous vehicle service platform 101 may dispatch an autonomous vehicle from a station 190 to geographic location 119, or may divert an autonomous vehicle 109c, already in transit (e.g., without occupants), to service the transportation request for user 102.
  • Autonomous vehicle service platform 101 may be further configured to divert an autonomous vehicle 109c in transit, with passengers, responsive to a request from user 102 (e.g., as a passenger).
  • autonomous vehicle service platform 101 may be configured to reserve an autonomous vehicle 109c in transit, with passengers, for diverting to service a request of user 102 subsequent to dropping off existing passengers.
  • stations 190 may be implemented to service one or more autonomous vehicles 109 in connection with road network 110.
  • One or more stations 190 may be configured to store, service, manage, and/or maintain an inventory of autonomous vehicles 109 (e.g., station 190 may include one or more computing devices implementing autonomous vehicle service platform 101).
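  • The dispatch behavior described above can be pictured with the hypothetical sketch below, which picks either a stationed vehicle or an unoccupied vehicle already in transit; the data classes and straight-line distance metric are illustrative assumptions, not the platform's actual logic.

```python
# Hypothetical dispatch sketch; field names and the straight-line distance
# metric are illustrative assumptions rather than the platform's implementation.
from dataclasses import dataclass
import math

@dataclass
class Vehicle:
    vehicle_id: str
    x: float
    y: float
    in_transit: bool
    has_occupants: bool

def distance(v: Vehicle, px: float, py: float) -> float:
    return math.hypot(v.x - px, v.y - py)

def dispatch(fleet: list[Vehicle], pickup_x: float, pickup_y: float) -> Vehicle | None:
    # Eligible: vehicles parked at a station, or in transit without occupants
    # (mirroring the "divert an autonomous vehicle already in transit" case).
    eligible = [v for v in fleet if not v.in_transit or not v.has_occupants]
    if not eligible:
        return None
    return min(eligible, key=lambda v: distance(v, pickup_x, pickup_y))
```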
  • bidirectional autonomous vehicle 130 may be configured to travel in either direction principally along, but not limited to, a longitudinal axis 131. Accordingly, bidirectional autonomous vehicle 130 may be configured to implement active lighting external to the vehicle to alert others (e.g., other drivers, pedestrians, cyclists, etc.) in the adjacent vicinity of the direction in which bidirectional autonomous vehicle 130 is traveling.
  • active sources of light 136 may be implemented as active lights 138a when traveling in a first direction, or may be implemented as active lights 138b when traveling in a second direction.
  • Active lights 138a may be implemented using a first subset of one or more colors, with optional animation (e.g., light patterns of variable intensities of light or color that may change over time).
  • active lights 138b may be implemented using a second subset of one or more colors and light patterns that may be different than those of active lights 138a.
  • active lights 138a may be implemented using white-colored lights as "headlights," whereas active lights 138b may be implemented using red-colored lights as "taillights."
  • Active lights 138a and 138b, or portions thereof, may be configured to provide other light-related functionalities, such as provide "turn signal indication" functions (e.g., using yellow light).
  • logic in autonomous vehicle 130 may be configured to adapt active lights 138a and 138b to comply with various safety requirements and traffic regulations or laws for any number of jurisdictions.
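  • As a purely illustrative sketch of the direction-dependent lighting described above (the enum values and color assignments are assumptions mirroring the headlight/taillight example):

```python
# Hypothetical sketch of direction-dependent active lighting; the enum values
# and color assignments are illustrative assumptions only.
from enum import Enum

class Direction(Enum):
    FIRST = 1    # end A leading
    SECOND = 2   # end B leading

def active_light_patterns(direction: Direction) -> dict:
    headlight = {"color": "white", "animation": "steady"}   # leading end
    taillight = {"color": "red", "animation": "steady"}     # trailing end
    if direction is Direction.FIRST:
        return {"end_a": headlight, "end_b": taillight}
    return {"end_a": taillight, "end_b": headlight}

# Example: when the vehicle reverses its travel direction, the same light
# sources swap roles rather than the vehicle turning around.
print(active_light_patterns(Direction.SECOND))
```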
  • bidirectional autonomous vehicle 130 may be configured to have similar structural elements and components in each quad portion, such as quad portion 194.
  • the quad portions are depicted, at least in this example, as portions of bidirectional autonomous vehicle 130 defined by the intersection of a plane 132 and a plane 134, both of which pass through the vehicle to form two similar halves on each side of planes 132 and 134.
  • bidirectional autonomous vehicle 130 may include an autonomous vehicle controller 147 that includes logic (e.g., hardware or software, or a combination thereof) that is configured to control a predominate number of vehicle functions, including driving control (e.g., propulsion, steering, etc.) and active sources 136 of light, among other functions.
  • Bidirectional autonomous vehicle 130 also includes a number of sensors 139 disposed at various locations on the vehicle (other sensors are not shown).
  • Autonomous vehicle controller 147 may be further configured to determine a local pose (e.g., local position) of an autonomous vehicle 109 and to detect external objects relative to the vehicle. For example, consider that bidirectional autonomous vehicle 130 is traveling in the direction 119 in road network 110. A localizer (not shown) of autonomous vehicle controller 147 can determine a local pose at the geographic location 111. As such, the localizer may use acquired sensor data, such as sensor data associated with surfaces of buildings 115 and 117, which can be compared against reference data, such as map data (e.g., 3D map data, including reflectance data), to determine a local pose.
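  • The following is a deliberately simplified, hypothetical sketch of the kind of map-relative localization a localizer might perform, scoring candidate pose offsets by how well observed reflectance data agrees with a reference 3D map; the grid representation and brute-force search are assumptions, not the patent's method.

```python
# Minimal localization sketch: score candidate 2D pose offsets against a
# reference reflectance grid. The representation and search are illustrative.
import numpy as np

def score_pose(reference: np.ndarray, observed: np.ndarray, dx: int, dy: int) -> float:
    """Higher is better: negative mean absolute reflectance disagreement."""
    shifted = np.roll(np.roll(observed, dx, axis=0), dy, axis=1)
    return -float(np.mean(np.abs(reference - shifted)))

def localize(reference: np.ndarray, observed: np.ndarray, search: int = 5):
    best, best_score = (0, 0), -np.inf
    for dx in range(-search, search + 1):
        for dy in range(-search, search + 1):
            s = score_pose(reference, observed, dx, dy)
            if s > best_score:
                best, best_score = (dx, dy), s
    return best, best_score
```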
  • a perception engine (not shown) of autonomous vehicle controller 147 may be configured to detect, classify, and predict the behavior of external objects, such as external object 112 (a "tree") and external object 114 (a "pedestrian"). Classification of such external objects may broadly classify objects as static objects, such as external object 112, and dynamic objects, such as external object 114.
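  • As a rough, hypothetical illustration of the static/dynamic split described above (not the patent's classifier), the sketch below labels tracked objects by their observed motion over recent frames; the track structure and speed threshold are assumptions.

```python
# Hypothetical sketch: classify tracked objects as static or dynamic from
# recent position history. Track fields and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class Track:
    track_id: int
    positions: list[tuple[float, float]]   # recent (x, y) observations
    dt: float                              # seconds between observations

def classify(track: Track, speed_threshold: float = 0.2) -> str:
    if len(track.positions) < 2:
        return "unknown"
    (x0, y0), (x1, y1) = track.positions[0], track.positions[-1]
    elapsed = track.dt * (len(track.positions) - 1)
    speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / max(elapsed, 1e-6)
    return "dynamic" if speed > speed_threshold else "static"

# e.g., a pedestrian track moving ~1 m/s would be labeled "dynamic",
# while a tree's stationary track would be labeled "static".
```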
  • autonomous vehicle service platform 101 is configured to provide teleoperator services should an autonomous vehicle 109 request teleoperation.
  • an autonomous vehicle controller 147 in autonomous vehicle 109d detects an object 126 obscuring a path 124 on roadway 122 at point 191, as depicted in inset 120. If autonomous vehicle controller 147 cannot ascertain a path or trajectory over which vehicle 109d may safely transit with a relatively high degree of certainty, then autonomous vehicle controller 147 may transmit request message 105 for teleoperation services.
  • a teleoperator computing device 104 may receive instructions from a teleoperator 108 to perform a course of action to successfully (and safely) negotiate obstacles 126.
  • Response data 107 then can be transmitted back to autonomous vehicle 109d to cause the vehicle to, for example, safely cross a set of double lines as it transits along the alternate path 121.
  • teleoperator computing device 104 may generate a response identifying geographic areas to exclude from planning a path.
  • a teleoperator 108 may define areas or locations that the autonomous vehicle must avoid.
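  • The message exchange above might be pictured with the following hypothetical sketch of a teleoperation request carrying candidate trajectories and a response carrying a guided trajectory plus excluded regions; the field names are assumptions, not a protocol defined by the patent.

```python
# Hypothetical teleoperation request/response payloads; field names are
# illustrative assumptions rather than a protocol defined by the patent.
from dataclasses import dataclass, field

@dataclass
class TeleopRequest:
    vehicle_id: str
    event: str                                   # e.g., "obstacle_blocking_path"
    candidate_trajectories: list[list[tuple[float, float]]]
    confidences: list[float]                     # one per candidate

@dataclass
class TeleopResponse:
    vehicle_id: str
    guided_trajectory: list[tuple[float, float]]
    excluded_regions: list[list[tuple[float, float]]] = field(default_factory=list)

def apply_response(resp: TeleopResponse) -> list[tuple[float, float]]:
    # The vehicle adopts the teleoperator-selected trajectory and treats the
    # excluded regions as areas its planner must avoid.
    return resp.guided_trajectory
```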
  • autonomous vehicle 130 and/or autonomous vehicle controller 147 can perform real-time (or near real-time) trajectory calculations through autonomous-related operations, such as localization and perception, to enable autonomous vehicles 109 to self-drive.
  • bidirectional nature of bidirectional autonomous vehicle 130 provides for a vehicle that has quad portions 194 (or any other number of symmetric portions) that are similar or are substantially similar to each other. Such symmetry reduces complexity of design and relatively decreases the number of unique components or structures, thereby reducing inventory and manufacturing complexities.
  • a drivetrain and wheel system may be disposed in any of the quad portions.
  • autonomous vehicle controller 147 is configured to invoke teleoperation services to reduce the likelihood that an autonomous vehicle 109 is delayed in transit while resolving an event or issue that may otherwise affect the safety of the occupants.
  • the visible portion of road network 110 depicts a geo-fenced region that may limit or otherwise control the movement of autonomous vehicles 109 to the road network shown in FIG. 1.
  • autonomous vehicle 109 and a fleet thereof, may be configurable to operate as a level 4 ("full self-driving automation," or L4) vehicle that can provide transportation on demand with the convenience and privacy of point-to-point personal mobility while providing the efficiency of shared vehicles.
  • autonomous vehicle 109, or any autonomous vehicle described herein may be configured to omit a steering wheel or any other mechanical means of providing manual (i.e., human-controlled) steering for autonomous vehicle 109.
  • autonomous vehicle 109, or any autonomous vehicle described herein may be configured to omit a seat or location reserved within the vehicle for an occupant to engage a steering wheel or any mechanical steering system.
  • FIG. 2 is an example of a flow diagram to monitor a fleet of autonomous vehicles, according to some embodiments.
  • flow 200 begins when a fleet of autonomous vehicles are monitored.
  • At least one autonomous vehicle includes an autonomous vehicle controller configured to cause the vehicle to autonomously transit from a first geographic region to a second geographic region.
  • data representing an event associated with a calculated confidence level for a vehicle is detected.
  • An event may be a condition or situation affecting operation, or potentially affecting operation, of an autonomous vehicle.
  • the events may be internal to an autonomous vehicle, or external. For example, an obstacle obscuring a roadway may be viewed as an event, as well as a reduction or loss of communication.
  • An event may include traffic conditions or congestion, as well as unexpected or unusual numbers or types of external objects (or tracks) that are perceived by a perception engine.
  • An event may include weather-related conditions (e.g., loss of friction due to ice or rain) or the angle at which the sun is shining (e.g., at sunset), such as a low angle to the horizon that causes the sun to shine brightly in the eyes of human drivers of other vehicles. These and other conditions may be viewed as events that cause invocation of the teleoperator service or cause the vehicle to execute a safe-stop trajectory.
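  • A hypothetical sketch of this decision is shown below: an event lowers a computed confidence level, and the vehicle either proceeds, requests teleoperation, or executes a safe-stop trajectory; the thresholds and action names are illustrative assumptions.

```python
# Hypothetical event-handling sketch; thresholds and action names are
# illustrative assumptions, not values specified by the patent.
def handle_event(confidence: float,
                 teleop_threshold: float = 0.7,
                 safe_stop_threshold: float = 0.3) -> str:
    """Map a confidence level for continued operation to an action."""
    if confidence >= teleop_threshold:
        return "continue_planned_trajectory"
    if confidence >= safe_stop_threshold:
        return "request_teleoperation"
    return "execute_safe_stop_trajectory"

# Example: dense fog plus an obstructed lane might yield confidence 0.5,
# prompting a teleoperation request rather than an immediate safe stop.
print(handle_event(0.5))  # request_teleoperation
```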
  • data representing a subset of candidate trajectories may be received from an autonomous vehicle responsive to the detection of the event.
  • a planner of an autonomous vehicle controller may calculate and evaluate large numbers of trajectories (e.g., thousands or greater) per unit time, such as a second.
  • candidate trajectories are a subset of the trajectories that provide for relatively higher confidence levels that an autonomous vehicle may move forward safely in view of the event (e.g., using an alternate path provided by a teleoperator). Note that some candidate trajectories may be ranked or associated with higher degrees of confidence than other candidate trajectories.
  • subsets of candidate trajectories may originate from any number of sources, such as a planner, a teleoperator computing device (e.g., teleoperators can determine and provide approximate paths), etc., and may be combined as a superset of candidate trajectories.
  • path guidance data may be identified at one or more processors. The path guidance data may be configured to assist a teleoperator in selecting a guided trajectory from one or more of the candidate trajectories. In some instances, the path guidance data specifies a value indicative of a confidence level or probability that indicates the degree of certainty that a particular candidate trajectory may reduce or negate the probability that the event may impact operation of an autonomous vehicle.
  • a guided traj ectory, as a selected candidate trajectory, may be received at 210, responsive to input from a teleoperator (e.g., a teleoperator may select at least one candidate traj ectory as a guided traj ectory from a group of differently -ranked candidate traj ectories).
  • the selection may be made via an operator interface that lists a number of candidate traj ectories, for example, in order from highest confidence levels to lowest confidence levels.
  • the selection of a candidate trajectory as a guided trajectory may be transmitted to the vehicle, which, in turn, implements the guided trajectory to resolve the condition by causing the vehicle to perform a teleoperator-specified maneuver.
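  • As a purely illustrative sketch (not part of the disclosure), the ranking and selection described above might look like the following, where `CandidateTrajectory`, `rank_candidates`, and `select_guided_trajectory` are hypothetical names:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class CandidateTrajectory:
    trajectory_id: str
    waypoints: List[tuple]   # (x, y) points making up the candidate path
    confidence: float        # probability the trajectory avoids the event

def rank_candidates(candidates: List[CandidateTrajectory]) -> List[CandidateTrajectory]:
    """Order candidates from highest to lowest confidence for display."""
    return sorted(candidates, key=lambda c: c.confidence, reverse=True)

def select_guided_trajectory(candidates, chosen_index: int) -> CandidateTrajectory:
    """Return the teleoperator-selected candidate as the guided trajectory."""
    ranked = rank_candidates(candidates)
    return ranked[chosen_index]

# Example: the teleoperator picks the top-ranked candidate (index 0).
candidates = [
    CandidateTrajectory("a", [(0, 0), (5, 1)], confidence=0.72),
    CandidateTrajectory("b", [(0, 0), (5, -1)], confidence=0.91),
]
guided = select_guided_trajectory(candidates, chosen_index=0)
print(guided.trajectory_id)  # "b" is transmitted back to the vehicle
```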
  • Thereafter, the autonomous vehicle may transition out of the non-normative operational state.
  • FIG. 3A is a diagram depicting examples of sensors and other autonomous vehicle components, according to some examples.
  • Diagram 300 depicts an interior view of a bidirectional autonomous vehicle 330 that includes sensors, signal routers 345, drive trains 349, removable batteries 343, audio generators 344 (e.g., speakers or transducers), and autonomous vehicle (“AV”) control logic 347.
  • Sensors shown in diagram 300 include image capture sensors 340 (e.g., light capture devices or cameras of any type), audio capture sensors 342 (e.g., microphones of any type), radar devices 348, sonar devices 341 (or other like sensors, including ultrasonic sensors or acoustic-related sensors), and LIDAR devices 346, among other sensor types and modalities (some of which are not shown, such as inertial measurement units, or "IMUs," global positioning system ("GPS") sensors, sonar sensors, etc.).
  • quad portion 350 is representative of the symmetry of each of four "quad portions" of bidirectional autonomous vehicle 330 (e.g., each quad portion 350 may include a wheel, a drivetrain 349, similar steering mechanisms, similar structural support and members, etc.).
  • although autonomous vehicle controller 347a is depicted as being used in a bidirectional autonomous vehicle 330, autonomous vehicle controller 347a is not so limited and may be implemented in unidirectional autonomous vehicles or any other type of vehicle, whether on land, in air, or at sea. Note that the depicted and described positions, locations, orientations, quantities, and types of sensors shown in FIG. 3A are not intended to be limiting, and, as such, there may be any number and type of sensor, and any sensor may be located and oriented anywhere on autonomous vehicle 330.
  • portions of the autonomous vehicle (“AV") control logic 347 may be implemented using clusters of graphics processing units (“GPUs”) implementing a framework and programming model suitable for programming the clusters of GPUs.
  • a compute unified device architecture ("CUDA™") compatible programming language and application programming interface ("API") model may be used to program the GPUs.
  • CUDA™ is produced and maintained by NVIDIA of Santa Clara, California. Note that other programming languages may be implemented, such as OpenCL, or any other parallel programming language.
  • autonomous vehicle control logic 347 may be implemented in hardware and/or software as autonomous vehicle controller 347a, which is shown to include a motion controller 362, a planner 364, a perception engine 366, and a localizer 368.
  • autonomous vehicle controller 347a is configured to receive camera data 340a, LIDAR data 346a, and radar data 348a, or any other range-sensing or localization data, including sonar data 341a or the like.
  • Autonomous vehicle controller 347a is also configured to receive positioning data, such as GPS data 352, IMU data 354, and other position-sensing data (e.g., wheel-related data, such as steering angles, angular velocity, etc.).
  • autonomous vehicle controller 347a may receive any other sensor data 356, as well as reference data 339.
  • reference data 339 includes map data (e.g., 3D map data, 2D map data, 4D map data (e.g., including epoch determination)) and route data (e.g., road network data, including, but not limited to, RNDF data (or similar data), MDF data (or similar data), etc.).
  • Localizer 368 is configured to receive sensor data from one or more sources, such as GPS data 352, wheel data, IMU data 354, LIDAR data 346a, camera data 340a, radar data 348a, and the like, as well as reference data 339 (e.g., 3D map data and route data). Localizer 368 integrates (e.g., fuses the sensor data) and analyzes the data by comparing sensor data to map data to determine a local pose (or position) of bidirectional autonomous vehicle 330. According to some examples, localizer 368 may generate or update the pose or position of any autonomous vehicle in real-time or near real-time. Note that localizer 368 and its functionality need not be limited to "bi-directional" vehicles and can be implemented in any vehicle of any type.
  • localizer 368 (as well as other components of AV controller 347a) may be implemented in a "uni-directional" vehicle or any non-autonomous vehicle.
  • data describing a local pose may include one or more of an x-coordinate, a y-coordinate, a z-coordinate (or any coordinate of any coordinate system, including polar or cylindrical coordinate systems, or the like), a yaw value, roll value, a pitch value (e.g., an angle value), a rate (e.g., velocity), altitude, and the like.
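  • The following is a minimal sketch, assuming a simple fixed-weight blend of two pose estimates, of how local pose data of this kind might be represented and combined; a production localizer would instead use probabilistic filtering, and all names here are illustrative:

```python
import math
from dataclasses import dataclass

@dataclass
class LocalPose:
    x: float        # meters, in a map-aligned frame
    y: float
    z: float
    yaw: float      # radians
    roll: float = 0.0
    pitch: float = 0.0
    velocity: float = 0.0

def fuse_pose(gps_pose: LocalPose, lidar_pose: LocalPose,
              w_gps: float = 0.3, w_lidar: float = 0.7) -> LocalPose:
    """Blend two position estimates with fixed weights.

    A real localizer would fuse many modalities with a probabilistic
    filter; this only illustrates combining redundant estimates of the
    same pose.
    """
    blend = lambda a, b: w_gps * a + w_lidar * b
    return LocalPose(
        x=blend(gps_pose.x, lidar_pose.x),
        y=blend(gps_pose.y, lidar_pose.y),
        z=blend(gps_pose.z, lidar_pose.z),
        yaw=math.atan2(
            blend(math.sin(gps_pose.yaw), math.sin(lidar_pose.yaw)),
            blend(math.cos(gps_pose.yaw), math.cos(lidar_pose.yaw)),
        ),  # blend angles on the unit circle to avoid wrap-around errors
    )

print(fuse_pose(LocalPose(10.0, 4.0, 0.0, 0.10), LocalPose(10.4, 4.2, 0.0, 0.15)))
```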
  • Perception engine 366 is configured to receive sensor data from one or more sources, such as LIDAR data 346a, camera data 340a, radar data 348a, and the like, as well as local pose data. Perception engine 366 may be configured to determine locations of external objects based on sensor data and other data. External objects, for instance, may be objects that are not part of a drivable surface. For example, perception engine 366 may be able to detect and classify external objects as pedestrians, bicyclists, dogs, other vehicles, etc.
  • perception engine 366 is configured to classify the objects in accordance with a type of classification, which may be associated with semantic information (including a label). Based on the classification of these external objects, the external objects may be labeled as dynamic objects or static objects. For example, an external object classified as a tree may be labeled as a static object, while an external object classified as a pedestrian may be labeled as a dynamic object. External objects labeled as static may or may not be described in map data.
  • Examples of external objects likely to be labeled as static include traffic cones, cement barriers arranged across a roadway, lane closure signs, newly-placed mailboxes or trash cans adjacent to a roadway, etc.
  • Examples of external objects likely to be labeled as dynamic include bicyclists, pedestrians, animals, other vehicles, etc. If the external object is labeled as dynamic, further data about the external object may indicate a typical level of activity and velocity, as well as behavior patterns associated with the classification type. Further data about the external object may be generated by tracking the external object.
  • the classification type can be used to predict or otherwise determine the likelihood that an external object may, for example, interfere with an autonomous vehicle traveling along a planned path.
  • an external object that is classified as a pedestrian may be associated with some maximum speed, as well as an average speed (e.g., based on tracking data).
  • the velocity of the pedestrian relative to the velocity of an autonomous vehicle can be used to determine if a collision is likely.
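  • A minimal sketch of this idea follows, assuming a constant-velocity motion model in the vehicle frame; names such as `TrackedObject` and `collision_likely` are hypothetical, and a per-class maximum speed (as described above) could additionally bound the velocity estimate:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    classification: str
    position: tuple      # (x, y) in the vehicle frame, meters
    velocity: tuple      # (vx, vy) relative to the vehicle, m/s

def time_to_closest_approach(obj: TrackedObject) -> float:
    """Time at which the object is nearest the vehicle origin, assuming
    constant relative velocity (a deliberately crude motion model)."""
    px, py = obj.position
    vx, vy = obj.velocity
    speed_sq = vx * vx + vy * vy
    if speed_sq < 1e-6:
        return float("inf")                 # effectively stationary
    return max(0.0, -(px * vx + py * vy) / speed_sq)

def collision_likely(obj: TrackedObject, horizon_s: float = 4.0,
                     radius_m: float = 2.0) -> bool:
    """Flag the object if it comes within radius_m of the vehicle within the horizon."""
    t = min(time_to_closest_approach(obj), horizon_s)
    px, py = obj.position
    vx, vy = obj.velocity
    cx, cy = px + vx * t, py + vy * t
    return (cx * cx + cy * cy) ** 0.5 < radius_m

ped = TrackedObject("pedestrian", position=(6.0, -1.0), velocity=(-1.5, 0.3))
print(collision_likely(ped))  # True: the pedestrian converges on the vehicle
```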
  • perception engine 366 may determine a level of uncertainty associated with a current and future state of objects. In some examples, the level of uncertainty may be expressed as an estimated value (or probability).
  • Planner 364 is configured to receive perception data from perception engine 366, and may also receive localizer data from localizer 368.
  • the perception data may include an obstacle map specifying static and dynamic objects located in the vicinity of an autonomous vehicle, whereas the localizer data may include a local pose or position.
  • planner 364 generates numerous trajectories, and evaluates the trajectories, based on at least the location of the autonomous vehicle against relative locations of external dynamic and static objects.
  • Planner 364 selects an optimal trajectory based on a variety of criteria over which to direct the autonomous vehicle in a way that provides for collision-free travel.
  • planner 364 may be configured to calculate the trajectories as probabilistically-determined trajectories.
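  • A toy sketch of trajectory selection is shown below; the cost terms and weights are assumptions for illustration only and are not taken from the disclosure:

```python
from typing import List, NamedTuple

class Trajectory(NamedTuple):
    trajectory_id: str
    collision_probability: float  # from the probabilistic evaluation
    max_lateral_accel: float      # m/s^2, proxy for passenger comfort
    travel_time: float            # seconds to the goal

# Illustrative weights; a production planner would tune or learn these.
WEIGHTS = {"collision": 100.0, "comfort": 2.0, "time": 1.0}

def trajectory_cost(t: Trajectory) -> float:
    """Lower is better: heavily penalize collision risk, then comfort, then time."""
    return (WEIGHTS["collision"] * t.collision_probability
            + WEIGHTS["comfort"] * t.max_lateral_accel
            + WEIGHTS["time"] * t.travel_time)

def select_optimal(trajectories: List[Trajectory]) -> Trajectory:
    return min(trajectories, key=trajectory_cost)

candidates = [
    Trajectory("swerve", collision_probability=0.02, max_lateral_accel=3.5, travel_time=9.0),
    Trajectory("slow",   collision_probability=0.01, max_lateral_accel=0.8, travel_time=12.0),
]
print(select_optimal(candidates).trajectory_id)  # "slow" wins on risk and comfort
```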
  • planner 364 may transmit steering and propulsion commands (as well as decelerating or braking commands) to motion controller 362.
  • Motion controller 362 subsequently may convert any of the commands, such as a steering command, a throttle or propulsion command, and a braking command, into control signals (e.g., for application to actuators or other mechanical interfaces) to implement changes in steering or wheel angles 351 and/or velocity 353.
  • FIGs. 3B to 3E are diagrams depicting examples of sensor field redundancy and autonomous vehicle adaption to a loss of a sensor field, according to some examples.
  • Diagram 391 of FIG. 3B depicts a sensor field 301a in which sensor 310a detects objects (e.g., for determining range or distance, or other information). While sensor 310a may implement any type of sensor or sensor modality, sensor 310a and similarly-described sensors, such as sensors 310b, 310c, and 310d, may include LIDAR devices. Therefore, sensor fields 301a, 301b, 301c, and 301d each includes a field into which lasers extend.
  • FIG. 3C depicts four overlapping sensor fields, each of which is generated by a corresponding LIDAR sensor 310 (not shown). As shown, portions 301 of the sensor fields include no overlapping sensor fields (e.g., a single LIDAR field), portions 302 of the sensor fields include two overlapping sensor fields, and portions 303 include three overlapping sensor fields, whereby such sensors provide for multiple levels of redundancy should a LIDAR sensor fail.
  • FIG. 3D depicts a loss of a sensor field due to failed operation of LIDAR 309, according to some examples.
  • Sensor field 302 of FIG. 3C is transformed into a single sensor field 305, one of sensor fields 301 of FIG. 3C is lost to a gap 304, and three of sensor fields 303 of FIG. 3C are transformed into sensor fields 306 (i.e., limited to two overlapping fields).
  • an autonomous vehicle controller (not shown) is configured to leverage the bidirectional nature of autonomous vehicle 330c to address the loss of sensor field at the leading area in front of the vehicle.
  • FIG. 3E depicts a bidirectional maneuver for restoring a certain robustness of the sensor field in front of autonomous vehicle 330d.
  • a more robust sensor field 302 is disposed at the rear of the vehicle 330d coextensive with taillights 348.
  • autonomous vehicle 330d performs a bidirectional maneuver by pulling into a driveway 397 and switches its directionality such that taillights 348 actively switch to the other side (e.g., the trailing edge) of autonomous vehicle 330d.
  • autonomous vehicle 330d restores a robust sensor field 302 in front of the vehicle as it travels along direction of travel 398.
  • the above-described bidirectional maneuver obviates a requirement for a more complicated maneuver that requires backing up into a busy roadway.
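  • A simplified sketch of the directionality decision follows, assuming a hypothetical two-ended LIDAR layout; the sensor names and coverage counts are illustrative:

```python
def leading_coverage(active_lidars: set, heading: str) -> int:
    """Count LIDAR units covering the current leading edge of the vehicle.

    Hypothetical layout: two units nominally face each end of the
    bidirectional vehicle ("first" and "second" ends).
    """
    facing = {"first": {"lidar_f1", "lidar_f2"}, "second": {"lidar_r1", "lidar_r2"}}
    return len(facing[heading] & active_lidars)

def choose_heading(active_lidars: set, heading: str) -> str:
    """If the leading sensor field is degraded and the trailing field is
    healthier, switch directionality (e.g., after pulling into a driveway)."""
    other = "second" if heading == "first" else "first"
    if leading_coverage(active_lidars, heading) < leading_coverage(active_lidars, other):
        return other   # lights swap ends; travel resumes leading with the healthier field
    return heading

# A LIDAR at the first end has failed; the vehicle elects to lead with the second end.
print(choose_heading({"lidar_f2", "lidar_r1", "lidar_r2"}, heading="first"))  # "second"
```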
  • FIG. 4 is a functional block diagram depicting a system including an autonomous vehicle service platform that is communicatively coupled via a communication layer to an autonomous vehicle controller, according to some examples.
  • Diagram 400 depicts an autonomous vehicle ("AV") controller 447 disposed in an autonomous vehicle 430, which, in turn, includes a number of sensors 470 coupled to autonomous vehicle controller 447.
  • Sensors 470 include one or more LIDAR devices 472, one or more cameras 474, one or more radars 476, one or more global positioning system (“GPS") data receiver-sensors, one or more inertial measurement units (“IMUs”) 475, one or more odometry sensors 477 (e.g., wheel encoder sensors, wheel speed sensors, and the like), and any other suitable sensors 478, such as infrared cameras or sensors, hyperspectral-capable sensors, ultrasonic sensors (or any other acoustic energy-based sensor), radio frequency-based sensors, etc.
  • wheel angle sensors configured to sense steering angles of wheels may be included as odometry sensors 477 or suitable sensors 478.
  • autonomous vehicle controller 447 may include four or more LIDARs 472, sixteen or more cameras 474, and four or more radar units 476. Further, sensors 470 may be configured to provide sensor data to components of autonomous vehicle controller 447 and to elements of autonomous vehicle service platform 401. As shown in diagram 400, autonomous vehicle controller 447 includes a planner 464, a motion controller 462, a localizer 468, a perception engine 466, and a local map generator 440. Note that elements depicted in diagram 400 of FIG. 4 may include structures and/or functions as similarly-named elements described in connection with one or more other drawings.
  • Localizer 468 is configured to localize the autonomous vehicle (i.e., determine a local pose) relative to reference data, which may include map data, route data (e.g., road network data, such as RNDF-like data), and the like. In some cases, localizer 468 is configured to identify, for example, a point in space that may represent a location of autonomous vehicle 430 relative to features of a representation of an environment. Localizer 468 is shown to include a sensor data integrator 469, which may be configured to integrate multiple subsets of sensor data (e.g., of different sensor modalities) to reduce uncertainties related to each individual type of sensor.
  • sensor data integrator 469 is configured to fuse sensor data (e.g., LIDAR data, camera data, radar data, etc.) to form integrated sensor data values for determining a local pose.
  • localizer 468 retrieves reference data originating from a reference data repository 405, which includes a map data repository 405a for storing 2D map data, 3D map data, 4D map data, and the like.
  • Localizer 468 may be configured to identify at least a subset of features in the environment to match against map data to identify, or otherwise confirm, a pose of autonomous vehicle 430.
  • localizer 468 may be configured to identify any amount of features in an environment, such that a set of features can include one or more features, or all features.
  • any amount of LIDAR data may be compared against data representing a map for purposes of localization.
  • non-matched objects resulting from the comparison of the environment features and map data may be dynamic objects, such as vehicles, bicyclists, pedestrians, etc.
  • detection of dynamic objects, including obstacles, may be performed with or without map data.
  • dynamic objects may be detected and tracked independently of map data (i.e., in the absence of map data).
  • 2D map data and 3D map data may be viewed as "global map data" or map data that has been validated at a point in time by autonomous vehicle service platform 401.
  • because map data in map data repository 405a may be updated and/or validated only periodically, a deviation may exist between the map data and an actual environment in which the autonomous vehicle is positioned. Therefore, localizer 468 may retrieve locally-derived map data generated by local map generator 440 to enhance localization.
  • Local map generator 440 is configured to generate local map data in real-time or near real-time.
  • local map generator 440 may receive static and dynamic object map data to enhance the accuracy of locally generated maps by, for example, disregarding dynamic objects in localization.
  • local map generator 440 may be integrated with, or formed as part of, localizer 468.
  • local map generator 440 may be configured to generate map and/or reference data based on simultaneous localization and mapping ("SLAM") or the like.
  • localizer 468 may implement a "hybrid" approach to using map data, whereby logic in localizer 468 may be configured to select various amounts of map data from either map data repository 405a or local map data from local map generator 440, depending on the degrees of reliability of each source of map data. Therefore, localizer 468 may still use out-of-date map data in view of locally-generated map data.
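  • One way such a hybrid selection could be sketched is shown below, where the reliability scores and the proportional blending rule are assumptions made for illustration:

```python
from typing import NamedTuple

class MapSource(NamedTuple):
    name: str
    reliability: float   # 0..1, e.g., fraction of features currently matching sensor data
    age_days: float      # time since the tiles were validated

def blend_weights(global_map: MapSource, local_map: MapSource) -> dict:
    """Apportion trust between globally validated tiles and the locally
    generated map in proportion to their reliability scores."""
    total = global_map.reliability + local_map.reliability
    if total == 0:
        return {global_map.name: 0.5, local_map.name: 0.5}
    return {global_map.name: global_map.reliability / total,
            local_map.name: local_map.reliability / total}

# A stale global tile still contributes, but the fresher local map dominates.
weights = blend_weights(MapSource("global_3d", reliability=0.55, age_days=42.0),
                        MapSource("local_slam", reliability=0.90, age_days=0.0))
print(weights)
```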
  • Perception engine 466 is configured to, for example, assist planner 464 in planning routes and generating trajectories by identifying objects of interest in a surrounding environment in which autonomous vehicle 430 is transiting. Further, probabilities may be associated with each of the objects of interest, whereby a probability may represent a likelihood that an object of interest may be a threat to safe travel (e.g., a fast-moving motorcycle may require enhanced tracking rather than a person sitting at a bus stop bench while reading a newspaper). As shown, perception engine 466 includes an object detector 442 and an object classifier 444.
  • Object detector 442 is configured to distinguish objects relative to other features in the environment, and object classifier 444 may be configured to classify objects as either dynamic or static objects and track the locations of the dynamic and the static objects relative to autonomous vehicle 430 for planning purposes. Further, perception engine 466 may be configured to assign an identifier to a static or dynamic object that specifies whether the object is (or has the potential to become) an obstacle that may impact path planning at planner 464. Although not shown in FIG. 4, note that perception engine 466 may also perform other perception-related functions, such as segmentation and tracking, examples of which are described below.
  • Planner 464 is configured to generate a number of candidate trajectories for accomplishing a goal of reaching a destination via a number of paths or routes that are available.
  • Trajectory evaluator 465 is configured to evaluate candidate trajectories and identify which subsets of candidate trajectories are associated with higher confidence levels of providing collision-free paths to the destination. As such, trajectory evaluator 465 can select an optimal trajectory based on relevant criteria for causing commands to generate control signals for vehicle components 450 (e.g., actuators or other mechanisms). Note that the relevant criteria may include any number of factors that define optimal trajectories, the selection of which need not be limited to reducing collisions.
  • the selection of trajectories may be made to optimize user experience (e.g., user comfort) as well as collision-free trajectories that comply with traffic regulations and laws.
  • User experience may be optimized by moderating accelerations in various linear and angular directions (e.g., to reduce jerking-like travel or other unpleasant motion).
  • at least a portion of the relevant criteria can specify which of the other criteria to override or supersede, while maintaining optimized, collision-free travel.
  • legal restrictions may be temporarily lifted or deemphasized when generating trajectories in limited situations (e.g., crossing double yellow lines to go around a cyclist or travelling at higher speeds than the posted speed limit to match traffic flows).
  • control signals are configured to cause propulsion and directional changes at the drivetrain and/or wheels.
  • motion controller 462 is configured to transform commands into control signals (e.g., velocity, wheel angles, etc.) for controlling the mobility of autonomous vehicle 430.
  • planner 464 can generate a request to teleoperator 404 for teleoperator support.
  • Autonomous vehicle service platform 401 includes teleoperator 404 (e.g., a teleoperator computing device), reference data repository 405, a map updater 406, a vehicle data controller 408, a calibrator 409, and an off-line object classifier 410. Note that each element of autonomous vehicle service platform 401 may be independently located or distributed and in communication with other elements in autonomous vehicle service platform 401. Further, elements of autonomous vehicle service platform 401 may independently communicate with the autonomous vehicle 430 via the communication layer 402.
  • Map updater 406 is configured to receive map data (e.g., from local map generator 440, sensors 470, or any other component of autonomous vehicle controller 447), and is further configured to detect deviations, for example, of map data in map data repository 405a from a locally-generated map.
  • Vehicle data controller 408 can cause map updater 406 to update reference data within repository 405 and facilitate updates to 2D, 3D, and/or 4D map data.
  • vehicle data controller 408 can control the rate at which local map data is received into autonomous vehicle service platform 401 as well as the frequency at which map updater 406 performs updating of the map data.
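  • A minimal sketch of deviation detection with a rate-limited update, under assumed tile and threshold representations, might look like this:

```python
import time

def tile_deviation(repo_tile: dict, local_tile: dict) -> float:
    """Fraction of cells that disagree between the stored tile and the
    locally generated tile (tiles are modeled here as {cell: occupancy})."""
    cells = set(repo_tile) | set(local_tile)
    if not cells:
        return 0.0
    diffs = sum(1 for c in cells if repo_tile.get(c) != local_tile.get(c))
    return diffs / len(cells)

class MapUpdateController:
    """Queue tile updates when deviation is large, at a bounded rate."""
    def __init__(self, deviation_threshold=0.15, min_interval_s=600.0):
        self.deviation_threshold = deviation_threshold
        self.min_interval_s = min_interval_s
        self._last_update = float("-inf")

    def maybe_update(self, repo_tile, local_tile) -> bool:
        now = time.monotonic()
        if now - self._last_update < self.min_interval_s:
            return False                      # rate-limited by the data controller
        if tile_deviation(repo_tile, local_tile) < self.deviation_threshold:
            return False                      # repository tile still matches reality
        self._last_update = now
        return True                           # hand the tile to the map updater

ctrl = MapUpdateController()
print(ctrl.maybe_update({(0, 0): 1, (0, 1): 0}, {(0, 0): 1, (0, 1): 1, (1, 1): 1}))
```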
  • Calibrator 409 is configured to perform calibration of various sensors of the same or different types. Calibrator 409 may be configured to determine the relative poses of the sensors (e.g., in Cartesian space (x, y, z)) and orientations of the sensors (e.g., roll, pitch and yaw). The pose and orientation of a sensor, such as a camera, LIDAR sensor, radar sensor, etc., may be calibrated relative to other sensors, as well as globally relative to the vehicle's reference frame. Off-line self-calibration can also calibrate or estimate other parameters, such as vehicle inertial tensor, wheel base, wheel radius, or surface road friction. Calibration can also be done online to detect parameter change, according to some examples.
  • calibration by calibrator 409 may include intrinsic parameters of the sensors (e.g., optical distortion, beam angles, etc.) and extrinsic parameters. In some cases, calibration by calibrator 409 may be performed by maximizing a correlation between depth discontinuities in 3D laser data and edges of image data, as an example.
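  • A heavily simplified sketch of that correlation-maximization idea follows; the projection function, the grid search over a single yaw offset, and the toy edge map are all assumptions made only for illustration:

```python
def edge_score(points_px, edge_map) -> float:
    """Count projected depth-discontinuity points that land on image edges."""
    h, w = len(edge_map), len(edge_map[0])
    return sum(1 for (u, v) in points_px
               if 0 <= v < h and 0 <= u < w and edge_map[v][u])

def refine_yaw(discontinuity_pts, edge_map, project_to_image, yaw_candidates):
    """Pick the yaw offset whose projection best agrees with image edges.

    project_to_image(points, yaw) stands in for the full camera/LIDAR
    extrinsic and intrinsic projection, which is assumed to exist elsewhere.
    """
    return max(yaw_candidates,
               key=lambda yaw: edge_score(project_to_image(discontinuity_pts, yaw),
                                          edge_map))

# Toy projection: a yaw offset simply shifts points horizontally by whole pixels.
toy_project = lambda pts, yaw: [(u + int(round(yaw * 100)), v) for (u, v) in pts]
edges = [[0, 0, 1, 0], [0, 0, 1, 0], [0, 0, 1, 0]]            # vertical edge at column 2
pts = [(1, 0), (1, 1), (1, 2)]                                # currently one column off
print(refine_yaw(pts, edges, toy_project, [-0.01, 0.0, 0.01]))  # -> 0.01
```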
  • Off-line object classification 410 is configured to receive data, such as sensor data, from sensors 470 or any other component of autonomous vehicle controller 447.
  • an off-line classification pipeline of off-line object classification 410 may be configured to pre-collect and annotate objects (e.g., manually by a human and/or automatically using an offline labeling algorithm), and may further be configured to train an online classifier (e.g., object classifier 444), which can provide real-time classification of object types during online autonomous operation.
  • FIG. 5 is an example of a flow diagram to control an autonomous vehicle, according to some embodiments.
  • flow 500 begins when sensor data originating from sensors of multiple modalities at an autonomous vehicle is received, for example, by an autonomous vehicle controller.
  • One or more subsets of sensor data may be integrated for generating fused data to improve, for example, estimates.
  • a sensor stream of one or more sensors (e.g., of same or different modalities) may be fused to generate fused sensor data.
  • subsets of LIDAR sensor data and camera sensor data may be fused at 504 to facilitate localization.
  • data representing objects based on at least two subsets of sensor data may be derived at a processor.
  • data identifying static objects or dynamic objects may be derived (e.g., at a perception engine) from at least LIDAR and camera data.
  • a detected object is determined to affect a planned path, and a subset of trajectories are evaluated (e.g., at a planner) responsive to the detected object at 510.
  • a confidence level is determined at 512 to exceed a range of acceptable confidence levels associated with normative operation of an autonomous vehicle.
  • a confidence level may be such that a certainty of selecting an optimized path is less likely, whereby an optimized path may be determined as a function of the probability of facilitating collision-free travel, complying with traffic laws, providing a comfortable user experience (e.g., a comfortable ride), and/or generating candidate trajectories based on any other factor.
  • a request for an alternate path may be transmitted to a teleoperator computing device at 514. Thereafter, the teleoperator computing device may provide a planner with an optimal trajectory over which an autonomous vehicle may travel.
  • the vehicle may also determine that executing a safe-stop maneuver is the best course of action (e.g., safely and automatically causing an autonomous vehicle to a stop at a location of relatively low probabilities of danger).
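  • The confidence check and fallback described in flow 500 can be sketched as follows, with placeholder callables standing in for the planner, teleoperation link, and vehicle interfaces:

```python
ACCEPTABLE_CONFIDENCE = 0.85   # illustrative lower bound for normative operation

def control_step(trajectories, request_teleop, execute, safe_stop):
    """One pass of the confidence check sketched above.

    trajectories is a list of (trajectory, confidence) pairs already
    produced by fusion, perception, and planning; the three callables are
    placeholders for the planner, teleoperation, and vehicle interfaces.
    """
    best_trajectory, confidence = max(trajectories, key=lambda tc: tc[1])
    if confidence >= ACCEPTABLE_CONFIDENCE:
        execute(best_trajectory)              # normative operation continues
    elif (alternate := request_teleop(trajectories)) is not None:
        execute(alternate)                    # teleoperator supplied a path
    else:
        safe_stop()                           # no guidance: stop at a low-risk spot

control_step(
    trajectories=[("keep_lane", 0.62), ("nudge_left", 0.70)],
    request_teleop=lambda t: None,            # e.g., link unavailable
    execute=lambda traj: print("executing", traj),
    safe_stop=lambda: print("executing safe-stop maneuver"),
)
```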
  • FIG. 6 is a diagram depicting an example of an architecture for an autonomous vehicle controller, according to some embodiments.
  • Diagram 600 depicts a number of processes including a motion controller process 662, a planner process 664, a perception process 666, a mapping process 640, and a localization process 668, some of which may generate or receive data relative to other processes.
  • Other processes, such as processes 670 and 650, may facilitate interactions with one or more mechanical components of an autonomous vehicle.
  • perception process 666, mapping process 640, and localization process 668 are configured to receive sensor data from sensors 670
  • planner process 664 and perception process 666 are configured to receive guidance data 606, which may include route data, such as road network data.
  • localization process 668 is configured to receive map data 605a (i.e., 2D map data), map data 605b (i.e., 3D map data), and local map data 642, among other types of map data.
  • localization process 668 may also receive other forms of map data, such as 4D map data, which may include, for example, an epoch determination.
  • Localization process 668 is configured to generate local position data 641 representing a local pose. Local position data 641 is provided to motion controller process 662, planner process 664, and perception process 666.
  • Perception process 666 is configured to generate static and dynamic object map data 667, which, in turn, may be transmitted to planner process 664.
  • static and dynamic object map data 667 may be transmitted with other data, such as semantic classification information and predicted object behavior.
  • Planner process 664 is configured to generate trajectories data 665, which describes a number of trajectories generated by planner 664.
  • Motion controller process 662 uses trajectories data 665 to generate low-level commands or control signals for application to actuators 650 to cause changes in steering angles and/or velocity.
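  • A stubbed sketch of this process pipeline, with hypothetical data shapes standing in for local position data 641, object map data 667, and trajectories data 665, is shown below:

```python
def localization_process(sensor_data, map_data):
    """Produce local position data (cf. 641) from sensors and map layers."""
    return {"x": 12.3, "y": 4.5, "yaw": 0.08}          # stubbed local pose

def perception_process(sensor_data, local_position):
    """Produce static/dynamic object map data (cf. 667)."""
    return {"static": ["cone"], "dynamic": ["pedestrian"]}

def planner_process(object_map, local_position, guidance):
    """Produce trajectories data (cf. 665)."""
    return [{"name": "lane_keep", "confidence": 0.93}]

def motion_controller_process(trajectories):
    """Convert the chosen trajectory into low-level actuator commands."""
    best = max(trajectories, key=lambda t: t["confidence"])
    return {"steering_angle": 0.02, "velocity": 8.0, "source": best["name"]}

sensor_data, map_data, guidance = {}, {}, {"route": "road_network"}
pose = localization_process(sensor_data, map_data)
objects = perception_process(sensor_data, pose)
trajectories = planner_process(objects, pose, guidance)
print(motion_controller_process(trajectories))
```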
  • FIG. 7 is a diagram depicting an example of an autonomous vehicle service platform implementing redundant communication channels to maintain reliable communications with a fleet of autonomous vehicles, according to some embodiments.
  • Diagram 700 depicts an autonomous vehicle service platform 701 including a reference data generator 705, a vehicle data controller 702, an autonomous vehicle fleet manager 703, a teleoperator manager 707, a simulator 740, and a policy manager 742.
  • Reference data generator 705 is configured to generate and modify map data and route data (e.g., RNDF data). Further, reference data generator 705 may be configured to access 2D maps in 2D map data repository 720, access 3D maps in 3D map data repository 722, and access route data in route data repository 724.
  • Vehicle data controller 702 may be configured to perform a variety of operations. For example, vehicle data controller 702 may be configured to change a rate at which data is exchanged between a fleet of autonomous vehicles and platform 701 based on quality levels of communication over channels 770. During bandwidth-constrained periods, for example, data communications may be prioritized such that teleoperation requests from autonomous vehicle 730 are prioritized highly to ensure delivery. Further, variable levels of data abstraction may be transmitted per vehicle over channels 770, depending on bandwidth available for a particular channel.
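  • A minimal sketch of such bandwidth-aware prioritization follows, using assumed message kinds and priorities:

```python
import heapq

# Lower number = higher priority; teleoperation requests go first.
PRIORITY = {"teleoperation_request": 0, "telemetry": 1, "map_update": 2, "log_upload": 3}

def drain_queue(messages, available_bytes):
    """Send as many messages as the channel budget allows, highest priority first.

    Each message is (kind, size_bytes, payload); anything that does not fit
    is deferred until bandwidth recovers.
    """
    heap = [(PRIORITY[kind], size, kind, payload) for kind, size, payload in messages]
    heapq.heapify(heap)
    sent, deferred = [], []
    while heap:
        prio, size, kind, payload = heapq.heappop(heap)
        if size <= available_bytes:
            available_bytes -= size
            sent.append(kind)
        else:
            deferred.append(kind)
    return sent, deferred

msgs = [("log_upload", 5_000_000, "..."), ("teleoperation_request", 20_000, "..."),
        ("telemetry", 80_000, "...")]
print(drain_queue(msgs, available_bytes=150_000))
# (['teleoperation_request', 'telemetry'], ['log_upload'])
```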
  • Autonomous vehicle fleet manager 703 is configured to coordinate the dispatching of autonomous vehicles 730 to optimize multiple variables, including an efficient use of battery power, times of travel, whether or not an air-conditioning unit in an autonomous vehicle 730 may be used during low charge states of a battery, etc., any or all of which may be monitored in view of optimizing cost functions associated with operating an autonomous vehicle service.
  • An algorithm may be implemented to analyze a variety of variables with which to minimize costs or times of travel for a fleet of autonomous vehicles. Further, autonomous vehicle fleet manager 703 maintains an inventory of autonomous vehicles as well as parts for accommodating a service schedule in view of maximizing up-time of the fleet.
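  • A toy sketch of a dispatch cost function over such variables is shown below; the specific terms and weights are illustrative assumptions, not the disclosed algorithm:

```python
from typing import List, NamedTuple

class FleetVehicle(NamedTuple):
    vehicle_id: str
    battery_pct: float
    minutes_to_pickup: float
    due_for_service: bool

def dispatch_cost(v: FleetVehicle, trip_minutes: float) -> float:
    """Illustrative cost: prefer quick pickups, avoid low batteries and
    vehicles that should head to a service station instead."""
    if v.due_for_service or v.battery_pct < 15.0:
        return float("inf")                       # not dispatchable
    return v.minutes_to_pickup + 0.2 * trip_minutes + (100.0 - v.battery_pct) * 0.1

def dispatch(fleet: List[FleetVehicle], trip_minutes: float) -> str:
    best = min(fleet, key=lambda v: dispatch_cost(v, trip_minutes))
    return best.vehicle_id

fleet = [FleetVehicle("av-01", 82.0, 4.0, False),
         FleetVehicle("av-02", 12.0, 2.0, False),
         FleetVehicle("av-03", 95.0, 9.0, True)]
print(dispatch(fleet, trip_minutes=20.0))        # "av-01"
```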
  • Teleoperator manager 707 is configured to manage a number of teleoperator computing devices 704 with which teleoperators 708 provide input.
  • Simulator 740 is configured to simulate operation of one or more autonomous vehicles 730, as well as the interactions between teleoperator manager 707 and an autonomous vehicle 730.
  • Simulator 740 may also simulate operation of a number of sensors (including the introduction of simulated noise) disposed in autonomous vehicle 730.
  • an environment such as a city, may be simulated such that a simulated autonomous vehicle can be introduced to the synthetic environment, whereby simulated sensors may receive simulated sensor data, such as simulated laser returns.
  • Simulator 740 may provide other functions as well, including validating software updates and/or map data.
  • Policy manager 742 is configured to maintain data representing policies or rules by which an autonomous vehicle ought to behave in view of a variety of conditions or events that an autonomous vehicle encounters while traveling in a network of roadways. In some cases, updated policies and/or rules may be simulated in simulator 740 to confirm safe operation of a fleet of autonomous vehicles in view of changes to a policy.
  • Communication channels 770 are configured to provide networked communication links among a fleet of autonomous vehicles 730 and autonomous vehicle service platform 701.
  • communication channel 770 includes a number of different types of networks 771, 772, 773, and 774, with corresponding subnetworks (e.g., 771a to 771n), to ensure a certain level of redundancy for operating an autonomous vehicle service reliably.
  • the different types of networks in communication channels 770 may include different cellular network providers, different types of data networks, etc., to ensure sufficient bandwidth in the event of reduced or lost communications due to outages in one or more networks 771, 772, 773, and 774.
  • FIG. 8 is a diagram depicting an example of a messaging application configured to exchange data among various applications, according to some embodiments.
  • Diagram 800 depicts a teleoperator application 801 disposed in a teleoperator manager, and an autonomous vehicle application 830 disposed in an autonomous vehicle, whereby teleoperator application 801 and autonomous vehicle application 830 exchange message data via a protocol that facilitates communications over a variety of networks, such as network 871, 872, and other networks 873.
  • the communication protocol is a middleware protocol implemented as a Data Distribution Service™ having a specification maintained by the Object Management Group consortium.
  • teleoperator application 801 and autonomous vehicle application 830 may include a message router 854 disposed in a message domain, the message router being configured to interface with the teleoperator API 852.
  • message router 854 is a routing service.
  • message domain 850a in teleoperator application 801 may be identified by a teleoperator identifier, whereas message domain 850b may be identified as a domain associated with a vehicle identifier.
  • Teleoperator API 852 in teleoperator application 801 is configured to interface with teleoperator processes 803a to 803c, whereby teleoperator process 803b is associated with an autonomous vehicle identifier 804, and teleoperator process 803c is associated with an event identifier 806 (e.g., an identifier that specifies an intersection that may be problematic for collision-free path planning).
  • Teleoperator API 852 in autonomous vehicle application 830 is configured to interface with an autonomous vehicle operating system 840, which includes a sensing application 842, a perception application 844, a localization application 846, and a control application 848.
  • the above-described communications protocol may facilitate data exchanges to facilitate teleoperations as described herein.
  • the above-described communications protocol may be adapted to provide secure data exchanges among one or more autonomous vehicles and one or more autonomous vehicle service platforms.
  • message routers 854 may be configured to encrypt and decrypt messages to provide for secured interactions between, for example, a teleoperator process 803 and an autonomous vehicle operation system 840.
  • FIG. 9 is a diagram depicting types of data for facilitating teleoperations using a communications protocol described in FIG. 8, according to some examples.
  • Diagram 900 depicts a teleoperator 908 interfacing with a teleoperator computing device 904 coupled to a teleoperator application 901, which is configured to exchange data via a data-centric messaging bus 972 implemented in one or more networks 971.
  • Data-centric messaging bus 972 provides a communication link between teleoperator application 901 and autonomous vehicle application 930.
  • Teleoperator API 962 of teleoperator application 901 is configured to receive message service configuration data 964 and route data 960, such as road network data (e.g., RNDF-like data), mission data (e.g., MDF-data), and the like.
  • a messaging service bridge 932 is also configured to receive messaging service configuration data 934.
  • Messaging service configuration data 934 and 964 provide configuration data to configure the messaging service between teleoperator application 901 and autonomous vehicle application 930.
  • An example of messaging service configuration data 934 and 964 includes quality of service ("QoS") configuration data implemented to configure a Data Distribution Service™ application.
  • obstacle data 920 is generated by a perception system of an autonomous vehicle controller.
  • planner options data 924 is generated by a planner to notify a teleoperator of a subset of candidate traj ectories, and position data 926 is generated by the localizer.
  • Obstacle data 920, planner options data 924, and position data 926 are transmitted to a messaging service bridge 932, which, in accordance with message service configuration data 934, generates telemetry data 940 and query data 942, both of which are transmitted via data-centric messaging bus 972 into teleoperator application 901 as telemetry data 950 and query data 952.
  • Teleoperator API 962 receives telemetry data 950 and query data 952, which, in turn, are processed in view of route data 960 and message service configuration data 964. The resultant data is subsequently presented to a teleoperator 908 via teleoperator computing device 904 and/or a collaborative display (e.g., a dashboard display visible to a group of collaborating teleoperators 908). Teleoperator 908 reviews the candidate trajectory options that are presented on the display of teleoperator computing device 904, and selects a guided trajectory, which generates command data 982 and query response data 980, both of which are passed through teleoperator API 962 as query response data 954 and command data 956.
  • query response data 954 and command data 956 are transmitted via data-centric messaging bus 972 into autonomous vehicle application 930 as query response data 944 and command data 946.
  • Messaging service bridge 932 receives query response data 944 and command data 946 and generates teleoperator command data 928, which is configured to generate a teleoperator-selected trajectory for implementation by a planner. Note that the above-described messaging processes are not intended to be limiting, and other messaging protocols may be implemented as well.
  • FIG. 10 is a diagram illustrating an example of a teleoperator interface with which a teleoperator may influence path planning, according to some embodiments.
  • Diagram 1000 depicts examples of an autonomous vehicle 1030 in communication with an autonomous vehicle service platform 1001, which includes a teleoperator manager 1007 configured to facilitate teleoperations.
  • teleoperator manager 1007 receives data that requires teleoperator 1008 to preemptively view a path of an autonomous vehicle approaching a potential obstacle or an area of low planner confidence levels so that teleoperator 1008 may be able to address an issue in advance.
  • an intersection that an autonomous vehicle is approaching may be tagged as being problematic.
  • user interface 1010 displays a representation 1014 of a corresponding autonomous vehicle 1030 transiting along a path 1012, which has been predicted by a number of trajectories generated by a planner. Also displayed are other vehicles 1011 and dynamic objects 1013, such as pedestrians, that may cause sufficient confusion at the planner, thereby requiring teleoperation support. User interface 1010 also presents to teleoperator 1008 a current velocity 1022, a speed limit 1024, and an amount of charge 1026 presently in the batteries. According to some examples, user interface 1010 may display other data, such as sensor data as acquired from autonomous vehicle 1030.
  • planner 1064 has generated a number of trajectories that are coextensive with a planner-generated path 1044 regardless of a detected unidentified object 1046.
  • Planner 1064 may also generate a subset of candidate trajectories 1040, but in this example, the planner is unable to proceed given present confidence levels. If planner 1064 fails to determine an alternative path, a teleoperation request may be transmitted. In this case, a teleoperator may select one of candidate trajectories 1040 to facilitate travel by autonomous vehicle 1030 that is consistent with teleoperator-based path 1042.
  • FIG. 11 is a diagram depicting an example of a planner configured to invoke teleoperations, according to some examples.
  • Diagram 1100 depicts planner 1164 including a topography manager 1110, a route manager 1112, a path generator 1114, a trajectory evaluator 1120, and a trajectory tracker 1128.
  • Topography manager 1110 is configured to receive map data, such as 3D map data or other like map data that specifies topographic features.
  • Topography manager 1110 is further configured to identify candidate paths based on topographic-related features on a path to a destination.
  • topography manager 1110 receives 3D maps generated by sensors associated with one or more autonomous vehicles in the fleet.
  • Route manager 1112 is configured to receive environmental data 1103, which may include traffic-related information associated with one or more routes that may be selected as a path to the destination.
  • Path generator 1114 receives data from topography manager 1110 and route manager 1112, and generates one or more paths or path segments suitable to direct the autonomous vehicle toward a destination. Data representing one or more paths or path segments is transmitted into trajectory evaluator 1120.
  • Trajectory evaluator 1120 includes a state and event manager 1122, which, in turn, may include a confidence level generator 1123. Trajectory evaluator 1120 further includes a guided trajectory generator 1126 and a trajectory generator 1124. Further, planner 1164 is configured to receive policy data 1130, perception engine data 1132, and localizer data 1134.
  • Policy data 1130 may include criteria that planner 1164 uses to determine a path that has a sufficient confidence level with which to generate trajectories, according to some examples.
  • Examples of policy data 1130 include policies that specify that trajectory generation is bounded by stand-off distances to external objects (e.g., maintaining a safety buffer of 3 feet from a cyclist, as possible), or policies that require that trajectories must not cross a center double yellow line, or policies that require trajectories to be limited to a single lane in a 4-lane roadway (e.g., based on past events, such as typically congregating at a lane closest to a bus stop), and any other similar criteria specified by policies.
  • Perception engine data 1132 includes maps of locations of static objects and dynamic objects of interest, and localizer data 1134 includes at least a local pose or position.
  • State and event manager 1122 may be configured to probabilistically determine a state of operation for an autonomous vehicle. For example, a first state of operation (i.e., "normative operation") may describe a situation in which trajectories are collision-free, whereas a second state of operation (i.e., "non-normative operation") may describe another situation in which the confidence levels associated with possible trajectories are insufficient to guarantee collision-free travel.
  • state and event manager 1122 is configured to use perception data 1132 to determine a state of the autonomous vehicle that is either normative or non-normative.
  • Confidence level generator 1123 may be configured to analyze perception data 1132 to determine a state for the autonomous vehicle.
  • confidence level generator 1123 may use semantic information associated with static and dynamic objects, as well as associated probabilistic estimations, to enhance a degree of certainty that planner 1164 is determining a safe course of action.
  • planner 1164 may use perception engine data 1132 that specifies a probability that an object is either a person or not a person to determine whether planner 1164 is operating safely (e.g., planner 1164 may receive a degree of certainty that an object has a 98% probability of being a person, and a probability of 2% that the object is not a person).
  • a relatively low confidence level (e.g., single probability score) may trigger planner 1164 to transmit a request 1135 for teleoperation support to autonomous vehicle service platform 1101.
  • telemetry data and a set of candidate trajectories may accompany the request. Examples of telemetry data include sensor data, localization data, perception data, and the like.
  • a teleoperator 1108 may transmit via teleoperator computing device 1104 a selected trajectory 1137 to guided trajectory generator 1126.
  • selected trajectory 1137 is a trajectory formed with guidance from a teleoperator.
  • guided trajectory generator 1126 passes data to trajectory generator 1124, which, in turn, causes trajectory tracker 1128, as a trajectory tracking controller, to use the teleop-specified trajectory for generating control signals 1170 (e.g., steering angles, velocity, etc.).
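  • A minimal sketch of assembling such a confidence-triggered request, with an assumed confidence threshold and hypothetical message fields, might look like this:

```python
import json
import time

CONFIDENCE_FLOOR = 0.80   # illustrative threshold for invoking teleoperation

def build_teleop_request(confidence, candidate_trajectories, local_pose, tracked_objects):
    """Bundle telemetry and the candidate trajectory set with the request,
    ready for transmission to the service platform."""
    return {
        "type": "teleoperation_request",
        "timestamp": time.time(),
        "confidence": confidence,
        "candidate_trajectories": candidate_trajectories,
        "telemetry": {
            "local_pose": local_pose,
            "perception": tracked_objects,
        },
    }

def maybe_request_teleop(confidence, **telemetry):
    if confidence >= CONFIDENCE_FLOOR:
        return None                            # normative operation; no request sent
    return json.dumps(build_teleop_request(confidence, **telemetry))

msg = maybe_request_teleop(
    confidence=0.42,
    candidate_trajectories=[{"id": "hold", "confidence": 0.42},
                            {"id": "creep_forward", "confidence": 0.38}],
    local_pose={"x": 101.2, "y": -6.7, "yaw": 1.57},
    tracked_objects=[{"class": "person", "p": 0.98}],
)
print(msg is not None)   # True: request transmitted to the service platform
```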
  • planner 1164 may trigger transmission of a request 1135 for teleoperation support prior to a state transitioning to a non-normative state.
  • an autonomous vehicle controller and/or its components can predict that a distant obstacle may be problematic and preemptively cause planner 1164 to invoke teleoperations prior to the autonomous vehicle reaching the obstacle. Otherwise, the autonomous vehicle may cause a delay by transitioning to a safe state upon encountering the obstacle or scenario (e.g., pulling over and off the roadway).
  • teleoperations may be automatically invoked prior to an autonomous vehicle approaching a particular location that is known to be difficult to navigate. This determination may optionally take into consideration other factors, including the time of day, the position of the sun (e.g., if such a situation is likely to cause a disturbance to the reliability of sensor readings), and traffic or accident data derived from a variety of sources.
  • FIG. 12 is an example of a flow diagram configured to control an autonomous vehicle, according to some embodiments.
  • flow 1200 begins.
  • Data representing a subset of objects is received at a planner in an autonomous vehicle, the subset of objects including at least one object associated with data representing a degree of certainty for a classification type.
  • perception engine data may include metadata associated with objects, whereby the metadata specifies a degree of certainty associated with a specific classification type. For instance, a dynamic object may be classified as a "young pedestrian" with an 85% confidence level of being correct.
  • localizer data may be received (e.g., at a planner).
  • the localizer data may include map data that is generated locally within the autonomous vehicle.
  • the local map data may specify a degree of certainty (including a degree of uncertainty) that an event at a geographic region may occur.
  • An event may be a condition or situation affecting operation, or potentially affecting operation, of an autonomous vehicle.
  • the events may be internal (e.g., failed or impaired sensor) to an autonomous vehicle, or external (e.g., roadway obstruction). Examples of events are described herein, such as in FIG. 2 as well as in other figures and passages.
  • a path coextensive with the geographic region of interest may be determined at 1206. For example, consider that the event is the positioning of the sun in the sky at a time of day in which the intensity of sunlight impairs the vision of drivers during rush hour traffic.
  • a planner may preemptively invoke teleoperations if an alternate path to avoid the event is less likely.
  • a local position is determined at a planner based on local pose data.
  • a state of operation of an autonomous vehicle may be determined (e.g., probabilistically), for example, based on a degree of certainty for a classification type and a degree of certainty of the event, which may be based on any number of factors, such as speed, position, and other state information.
  • a young pedestrian is detected by the autonomous vehicle during the event in which other drivers' vision likely will be impaired by the sun, thereby causing an unsafe situation for the young pedestrian.
  • a relatively unsafe situation can be detected as a probabilistic event that may be likely to occur (i.e., an unsafe situation for which teleoperations may be invoked).
  • a likelihood that the state of operation is in a normative state is determined, and based on the determination, a message is transmitted to a teleoperator computing device requesting teleoperations to preempt a transition to a next state of operation (e.g., preempt transition from a normative to non-normative state of operation, such as an unsafe state of operation).
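  • The state determination might be sketched as below; treating the classification certainty and the event certainty as independent probabilities is a simplifying assumption made only for illustration:

```python
def operation_state(p_classification: float, p_event: float,
                    normative_threshold: float = 0.90) -> str:
    """Combine the certainty of a safety-relevant classification (e.g.,
    "young pedestrian") with the certainty of an event (e.g., sun glare
    impairing other drivers) into a normative/non-normative call.

    Independence of the two probabilities is assumed only for this sketch.
    """
    p_unsafe = p_classification * p_event
    return "non-normative" if p_unsafe > (1.0 - normative_threshold) else "normative"

def preempt_if_needed(p_classification, p_event, send_to_teleoperator):
    state = operation_state(p_classification, p_event)
    if state == "non-normative":
        send_to_teleoperator({"reason": "predicted unsafe interaction",
                              "p_classification": p_classification,
                              "p_event": p_event})
    return state

# 85% certain the object is a young pedestrian, 60% certain of glare conditions.
print(preempt_if_needed(0.85, 0.60, send_to_teleoperator=print))
```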
  • FIG. 13 depicts an example in which a planner may generate a trajectory, according to some examples.
  • Diagram 1300 includes a trajectory evaluator 1320 and a trajectory generator 1324.
  • Trajectory evaluator 1320 includes a confidence level generator 1322 and a teleoperator query messenger 1329.
  • trajectory evaluator 1320 is coupled to a perception engine 1366 to receive static map data 1301, and current and predicted object state data 1303.
  • Trajectory evaluator 1320 also receives local pose data 1305 from localizer 1368 and plan data 1307 from a global planner 1369.
  • confidence level generator 1322 receives static map data 1301 and current and predicted object state data 1303. Based on this data, confidence level generator 1322 may determine that detected trajectories are associated with unacceptable confidence level values. As such, confidence level generator 1322 transmits detected trajectory data 1309 (e.g., data including candidate trajectories) to notify a teleoperator via teleoperator query messenger 1329, which, in turn, transmits a request 1370 for teleoperator assistance.
  • In another state of operation (e.g., a normative state), static map data 1301, current and predicted object state data 1303, local pose data 1305, and plan data 1307 (e.g., global plan data) are received into trajectory calculator 1325, which is configured to calculate (e.g., iteratively) trajectories to determine an optimal one or more paths. Next, at least one path is selected and is transmitted as selected path data 1311. According to some embodiments, trajectory calculator 1325 is configured to implement re-planning of trajectories, as an example.
  • Nominal driving trajectory generator 1327 is configured to generate trajectories in a refined approach, such as by generating trajectories based on receding horizon control techniques. Nominal driving trajectory generator 1327 subsequently may transmit nominal driving trajectory path data 1372 to, for example, a trajectory tracker or a vehicle controller to implement physical changes in steering, acceleration, and other components.
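  • A toy receding-horizon loop over a 1-D point mass is sketched below to illustrate the general technique only; it is not the generator described here:

```python
import itertools

def simulate(x, v, accels, dt=0.5):
    """Roll a 1-D state (position, velocity) forward under an acceleration sequence."""
    for a in accels:
        v += a * dt
        x += v * dt
    return x, v

def horizon_cost(x, v, accels, x_goal):
    xf, vf = simulate(x, v, accels)
    return abs(x_goal - xf) + 0.1 * abs(vf) + 0.01 * sum(a * a for a in accels)

def receding_horizon_step(x, v, x_goal, horizon=3):
    """Pick the best acceleration sequence over the horizon, apply only its
    first element, and let the next cycle re-plan from updated state."""
    options = itertools.product((-1.0, 0.0, 1.0), repeat=horizon)
    best = min(options, key=lambda seq: horizon_cost(x, v, seq, x_goal))
    return best[0]

x, v = 0.0, 0.0
for _ in range(10):                     # drive a toy point mass toward x_goal = 5 m
    a = receding_horizon_step(x, v, x_goal=5.0)
    x, v = simulate(x, v, [a])
print(round(x, 2), round(v, 2))
```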
  • FIG. 14 is a diagram depicting another example of an autonomous vehicle service platform, according to some embodiments.
  • Diagram 1400 depicts an autonomous vehicle service platform 1401 including a teleoperator manager 1407 that is configured to manage interactions and/or communications among teleoperators 1408, teleoperator computing devices 1404, and other components of autonomous vehicle service platform 1401.
  • autonomous vehicle service platform 1401 includes a simulator 1440, a repository 1441, a policy manager 1442, a reference data updater 1438, a 2D map data repository 1420, a 3D map data repository 1422, and a route data repository 1424.
  • Other map data such as 4D map data (e.g., using epoch determination), may be implemented and stored in a repository (not shown).
  • Teleoperator action recommendation controller 1412 includes logic configured to receive and/or control a teleoperation service request via autonomous vehicle ("AV") planner data 1472, which can include requests for teleoperator assistance as well as telemetry data and other data.
  • planner data 1472 may include recommended candidate trajectories or paths from which a teleoperator 1408 via teleoperator computing device 1404 may select.
  • teleoperator action recommendation controller 1412 may be configured to access other sources of recommended candidate trajectories from which to select an optimum trajectory.
  • candidate trajectories contained in autonomous vehicle planner data 1472 may, in parallel, be introduced into simulator 1440, which is configured to simulate an event or condition being experienced by an autonomous vehicle requesting teleoperator assistance.
  • Simulator 1440 can access map data and other data necessary for performing a simulation on the set of candidate trajectories, whereby simulator 1440 need not exhaustively reiterate simulations to confirm sufficiency. Rather, simulator 1440 may either confirm the appropriateness of the candidate trajectories, or may otherwise alert a teleoperator to be cautious in their selection.
  • Teleoperator interaction capture analyzer 1416 may be configured to capture numerous teleoperator transactions or interactions for storage in repository 1441, which, for example, may accumulate data relating to a number of teleoperator transactions for analysis and generation of policies, at least in some cases.
  • repository 1441 may also be configured to store policy data for access by policy manager 1442.
  • teleoperator interaction capture analyzer 1416 may apply machine learning techniques to empirically determine how best to respond to events or conditions causing requests for teleoperation assistance.
  • policy manager 1442 may be configured to update a particular policy or generate a new policy responsive to analyzing the large set of teleoperator interactions (e.g., subsequent to applying machine learning techniques).
  • Policy manager 1442 manages policies that may be viewed as rules or guidelines with which an autonomous vehicle controller and its components operate under to comply with autonomous operations of a vehicle. In some cases, a modified or updated policy may be applied to simulator 1440 to confirm the efficacy of permanently releasing or implementing such policy changes.
  • Simulator interface controller 1414 is configured to provide an interface between simulator 1440 and teleoperator computing devices 1404. For example, consider that sensor data from a fleet of autonomous vehicles is applied to reference data updater 1438 via autonomous vehicle ("AV") fleet data 1470, whereby reference data updater 1438 is configured to generate updated map and route data 1439.
  • updated map and route data 1439 may be preliminarily released as an update to data in map data repositories 1420 and 1422, or as an update to data in route data repository 1424.
  • such data may be tagged as being a "beta version" in which a lower threshold for requesting teleoperator service may be implemented when, for example, a map tile including preliminarily updated information is used by an autonomous vehicle.
  • updated map and route data 1439 may be introduced to simulator 1440 for validating the updated map data.
  • the previously lowered threshold for requesting a teleoperator service related to map tiles is canceled.
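  • A minimal sketch of tracking per-tile thresholds, with assumed threshold values, follows; note that the lowered bar for requesting teleoperator service on beta tiles is modeled here as a higher confidence cut-off, so requests trigger sooner:

```python
class TeleopThresholdPolicy:
    """Track per-tile confidence thresholds for requesting teleoperator help.

    Tiles carrying preliminarily updated ("beta") map data cause the vehicle
    to ask for help sooner; once the simulator validates the tile, the
    normal threshold is restored.
    """
    NORMAL_THRESHOLD = 0.80
    BETA_THRESHOLD = 0.90       # request support at higher confidence levels too

    def __init__(self):
        self._beta_tiles = set()

    def mark_beta(self, tile_id: str):
        self._beta_tiles.add(tile_id)

    def mark_validated(self, tile_id: str):
        self._beta_tiles.discard(tile_id)       # cancel the lowered threshold

    def should_request_teleop(self, tile_id: str, planner_confidence: float) -> bool:
        threshold = (self.BETA_THRESHOLD if tile_id in self._beta_tiles
                     else self.NORMAL_THRESHOLD)
        return planner_confidence < threshold

policy = TeleopThresholdPolicy()
policy.mark_beta("tile_042")
print(policy.should_request_teleop("tile_042", 0.85))   # True while in beta
policy.mark_validated("tile_042")
print(policy.should_request_teleop("tile_042", 0.85))   # False after validation
```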
  • User interface graphics controller 1410 provides rich graphics to teleoperators 1408, whereby a fleet of autonomous vehicles may be simulated within simulator 1440 and may be accessed via teleoperator computing device 1404 as if the simulated fleet of autonomous vehicles were real.
  • FIG. 15 is an example of a flow diagram to control an autonomous vehicle, according to some embodiments.
  • flow 1500 begins.
  • Message data may be received at a teleoperator computing device for managing a fleet of autonomous vehicles.
  • the message data may indicate event attributes associated with a non-normative state of operation in the context of a planned path for an autonomous vehicle.
  • an event may be characterized as a particular intersection that becomes problematic due to, for example, a large number of pedestrians hurriedly crossing the street against a traffic light.
  • the event attributes describe the characteristics of the event, such as, for example, the number of people crossing the street, the traffic delays resulting from an increased number of pedestrians, etc.
  • a teleoperation repository may be accessed to retrieve a first subset of recommendations based on simulated operations of aggregated data associated with a group of autonomous vehicles.
  • a simulator may be a source of recommendations with which a teleoperator may implement.
  • the teleoperation repository may also be accessed to retrieve a second subset of recommendations based on an aggregation of teleoperator interactions responsive to similar event attributes.
  • a teleoperator interaction capture analyzer may apply machine learning techniques to empirically determine how best to respond to events having similar attributes based on previous requests for teleoperation assistance.
  • the first subset and the second subset of recommendations are combined to form a set of recommended courses of action for the autonomous vehicle.
  • representations of the set of recommended courses of action may be presented visually on a display of a teleoperator computing device.
  • data signals representing a selection (e.g., by teleoperator) of a recommended course of action may be detected.
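For illustration only, the following Python sketch shows one way the two subsets of recommendations described in the flow above might be merged into a single ranked set of courses of action. The data structure, scoring weights, and corroboration bonus are hypothetical and not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    action: str          # e.g., "proceed slowly", "reroute", "hold position"
    confidence: float    # 0.0 .. 1.0, as estimated by its source

def combine_recommendations(simulated, historical, sim_weight=0.5):
    """Merge simulator-derived and teleoperator-history-derived
    recommendations into one ranked list of courses of action.

    Hypothetical scoring: a weighted average of source confidences,
    with actions suggested by both sources ranked above single-source ones.
    """
    scores = {}
    for rec in simulated:
        scores.setdefault(rec.action, []).append(("sim", rec.confidence))
    for rec in historical:
        scores.setdefault(rec.action, []).append(("hist", rec.confidence))

    ranked = []
    for action, entries in scores.items():
        sim_conf = max((c for src, c in entries if src == "sim"), default=0.0)
        hist_conf = max((c for src, c in entries if src == "hist"), default=0.0)
        score = sim_weight * sim_conf + (1.0 - sim_weight) * hist_conf
        if sim_conf and hist_conf:          # corroborated by both sources
            score += 0.1
        ranked.append((score, action))
    ranked.sort(reverse=True)
    return [action for _, action in ranked]

# Example usage with hypothetical recommendations
sim_recs = [Recommendation("proceed slowly", 0.7), Recommendation("reroute", 0.4)]
hist_recs = [Recommendation("proceed slowly", 0.8), Recommendation("hold position", 0.3)]
print(combine_recommendations(sim_recs, hist_recs))
```

A teleoperator computing device could then present the returned list as the visual set of recommended courses of action described above.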
  • FIG. 16 is a diagram of an example of an autonomous vehicle fleet manager implementing a fleet optimization manager, according to some examples.
  • Diagram 1600 depicts an autonomous vehicle fleet manager that is configured to manage a fleet of autonomous vehicles 1630 transiting within a road network 1650.
  • Autonomous vehicle fleet manager 1603 is coupled to a teleoperator 1608 via a teleoperator computing device 1604, and is also coupled to a fleet management data repository 1646.
  • Autonomous vehicle fleet manager 1603 is configured to receive policy data 1602 and environmental data 1606, as well as other data.
  • fleet optimization manager 1620 is shown to include a transit request processor 1631, which, in turn, includes a fleet data extractor 1632 and an autonomous vehicle dispatch optimization calculator 1634.
  • Transit request processor 1631 is configured to process transit requests, such as from a user 1688 who is requesting autonomous vehicle service.
  • Fleet data extractor 1632 is configured to extract data relating to autonomous vehicles in the fleet. Data associated with each autonomous vehicle is stored in repository 1646. For example, data for each vehicle may describe maintenance issues, scheduled service calls, daily usage, battery charge and discharge rates, and any other data; such data, which may be updated in real time, may be used to optimize the fleet of autonomous vehicles and minimize downtime.
  • Autonomous vehicle dispatch optimization calculator 1634 is configured to analyze the extracted data and calculate optimized usage of the fleet so as to ensure that the next vehicle dispatched, such as from station 1652, provides for the least travel times and/or costs, in the aggregate, for the autonomous vehicle service.
  • Fleet optimization manager 1620 is shown to include a hybrid autonomous vehicle/non- autonomous vehicle processor 1640, which, in turn, includes an AV/non-AV optimization calculator 1642 and a non-AV selector 1644.
  • hybrid autonomous vehicle/non-autonomous vehicle processor 1640 is configured to manage a hybrid fleet of autonomous vehicles and human-driven vehicles (e.g., as independent contractors).
  • autonomous vehicle service may employ non-autonomous vehicles to meet excess demand, or in areas, such as non-AV service region 1690, that may be beyond a geo-fence or in areas of poor communication coverage.
  • Non-AV selector 1644 includes logic for selecting a number of non-AV drivers to assist based on calculations derived by AV/non-AV optimization calculator 1642.
  • FIG. 17 is an example of a flow diagram to manage a fleet of autonomous vehicles, according to some embodiments.
  • flow 1700 begins.
  • policy data is received.
  • the policy data may include parameters that define how best to select an autonomous vehicle for servicing a transit request.
  • fleet management data from a repository may be extracted.
  • the fleet management data includes subsets of data for a pool of autonomous vehicles (e.g., the data describes the readiness of vehicles to service a transportation request).
  • data representing a transit request is received. For exemplary purposes, the transit request could be for transportation from a first geographic location to a second geographic location.
  • attributes based on the policy data are calculated to determine a subset of autonomous vehicles that are available to service the request.
  • attributes may include a battery charge level and time until next scheduled maintenance.
  • an autonomous vehicle is selected as transportation from the first geographic location to the second geographic location, and data is generated to dispatch the autonomous vehicle to a third geographic location associated with the origination of the transit request.
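As a minimal sketch of the selection step in this flow, the Python below filters a pool of vehicles using hypothetical policy attributes (battery charge and time until next scheduled maintenance) and picks the eligible vehicle closest to the pickup location. The thresholds, field names, and planar distance metric are assumptions for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Vehicle:
    vehicle_id: str
    battery_pct: float            # remaining charge, 0..100
    hours_to_maintenance: float   # time until next scheduled service
    location: tuple               # (x, y) in arbitrary planar units

def select_vehicle(fleet, pickup, min_battery=30.0, min_maint_hours=2.0):
    """Pick the available vehicle closest to the pickup location,
    subject to hypothetical policy thresholds on battery charge and
    time until the next scheduled maintenance."""
    eligible = [v for v in fleet
                if v.battery_pct >= min_battery
                and v.hours_to_maintenance >= min_maint_hours]
    if not eligible:
        return None  # e.g., fall back to a non-AV driver or defer the request
    return min(eligible,
               key=lambda v: math.dist(v.location, pickup))

fleet = [Vehicle("av-1", 82.0, 10.0, (0.0, 1.0)),
         Vehicle("av-2", 25.0, 40.0, (0.2, 0.1)),   # filtered out: low battery
         Vehicle("av-3", 64.0, 5.0, (3.0, 4.0))]
print(select_vehicle(fleet, pickup=(0.0, 0.0)).vehicle_id)  # -> "av-1"
```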
  • FIG. 18 is a diagram illustrating an autonomous vehicle fleet manager implementing an autonomous vehicle communications link manager, according to some embodiments.
  • Diagram 1800 depicts an autonomous vehicle fleet manager that is configured to manage a fleet of autonomous vehicles 1830 transiting within a road network 1850 that coincides with a communication outage at an area identified as "reduced communication region" 1880.
  • Autonomous vehicle fleet manager 1803 is coupled to a teleoperator 1808 via a teleoperator computing device 1804.
  • Autonomous vehicle fleet manager 1803 is configured to receive policy data 1802 and environmental data 1806, as well as other data.
  • an autonomous vehicle communications link manager 1820 is shown to include an environment event detector 1831, a policy adaption determinator 1832, and a transit request processor 1834.
  • Environment event detector 1831 is configured to receive environmental data 1806 specifying a change within the environment in which autonomous vehicle service is implemented. For example, environmental data 1806 may specify that region 1880 has degraded communication services, which may affect the autonomous vehicle service. Policy adaption determinator 1832 may specify parameters to apply when receiving transit requests during such an event (e.g., during a loss of communications).
  • Transit request processor 1834 is configured to process transit requests in view of the degraded communications. In this example, a user 1888 is requesting autonomous vehicle service. Further, transit request processor 1834 includes logic to apply an adapted policy for modifying the way autonomous vehicles are dispatched so as to avoid complications due to poor communications.
  • Communication event detector 1840 includes a policy download manager 1842 and communications-configured ("COMM-configured") AV dispatcher 1844.
  • Policy download manager 1842 is configured to provide autonomous vehicles 1830 an updated policy in view of reduced communications region 1880, whereby the updated policy may specify routes to quickly exit region 1880 if an autonomous vehicle enters that region.
  • COMM-configured AV dispatcher 1844 may be configured to identify points 1865 at which to park autonomous vehicles that are configured as relays for establishing a peer-to-peer network over region 1880. As such, COMM-configured AV dispatcher 1844 is configured to dispatch autonomous vehicles 1862 (without passengers) to park at locations 1865.
  • FIG. 19 is an example of a flow diagram to determine actions for autonomous vehicles during an event, such as degraded or lost communications, according to some embodiments.
  • flow 1900 begins. Policy data is received, whereby the policy data defines parameters with which to apply to transit requests in a geographical region during an event.
  • one or more of the following actions may be implemented: (1) dispatch a subset of autonomous vehicles to geographic locations in the portion of the geographic region, the subset of autonomous vehicles being configured to either park at specific geographic locations and each serve as a static communication relay, or transit in a geographic region to each serve as a mobile communication relay, (2) implement peer-to-peer communications among a portion of the pool of autonomous vehicles associated with the portion of the geographic region, (3) provide to the autonomous vehicles an event policy that describes a route to egress the portion of the geographic region during an event, (4) invoke teleoperations, and (5) recalculate paths so as to avoid the geographic portion. Subsequent to implementing the action, the fleet of autonomous vehicles is monitored at 1914.
  • FIG. 20 is a diagram depicting an example of a localizer, according to some embodiments.
  • Diagram 2000 includes a localizer 2068 configured to receive sensor data from sensors 2070, such as LIDAR data 2072, camera data 2074, radar data 2076, and other data 2078. Further, localizer 2068 is configured to receive reference data 2020, such as 2D map data 2022, 3D map data 2024, and 3D local map data. According to some examples, other map data, such as 4D map data 2025 and semantic map data (not shown), including corresponding data structures and repositories, may also be implemented. Further to diagram 2000, localizer 2068 includes a positioning system 2010 and a localization system 2012, both of which are configured to receive sensor data from sensors 2070 as well as reference data 2020. Localization data integrator 2014 is configured to receive data from positioning system 2010 and data from localization system 2012, whereby localization data integrator 2014 is configured to integrate or fuse sensor data from multiple sensors to form local pose data 2052.
  • FIG. 21 is an example of a flow diagram to generate local pose data based on integrated sensor data, according to some embodiments.
  • flow 2100 begins.
  • reference data is received, the reference data including three dimensional map data.
  • reference data, such as 3D or 4D map data, may be received via one or more networks.
  • localization data from one or more localization sensors is received and placed into a localization system.
  • positioning data from one or more positioning sensors is received into a positioning system.
  • the localization and positioning data are integrated.
  • the localization data and positioning data are integrated to form local position data specifying a geographic position of an autonomous vehicle.
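As a minimal sketch of integrating the two data streams described in this flow, the following Python fuses a positioning-system estimate and a localization-system estimate of vehicle position by inverse-covariance weighting. The 2D state, covariances, and example values are assumptions; the actual integrator may use a far richer state.

```python
import numpy as np

def fuse_position_estimates(pos, pos_cov, loc, loc_cov):
    """Fuse a positioning-system estimate and a localization-system
    estimate of vehicle position into a single local position estimate.

    A minimal inverse-covariance (information-form) fusion sketch.
    """
    info = np.linalg.inv(pos_cov) + np.linalg.inv(loc_cov)
    fused_cov = np.linalg.inv(info)
    fused = fused_cov @ (np.linalg.inv(pos_cov) @ pos + np.linalg.inv(loc_cov) @ loc)
    return fused, fused_cov

# Hypothetical 2D estimates (meters) with uncertainties
pos = np.array([10.0, 4.0]);  pos_cov = np.diag([0.5, 0.5])   # GPS/odometry-based
loc = np.array([10.4, 3.8]);  loc_cov = np.diag([0.1, 0.1])   # LIDAR-map-based
fused, cov = fuse_position_estimates(pos, pos_cov, loc, loc_cov)
print(fused)   # weighted toward the lower-uncertainty localization estimate
```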
  • FIG. 22 is a diagram depicting another example of a localizer, according to some embodiments.
  • Diagram 2200 includes a localizer 2268, which, in turn, includes a localization system 2210 and a relative localization system 2212 to generate positioning-based data 2250 and local location-based data 2251, respectively.
  • Localization system 2210 includes a projection processor 2254a for processing GPS data 2273, a GPS datum 2211, and 3D Map data 2222, among other optional data (e.g., 4D map data).
  • Localization system 2210 also includes an odometry processor 2254b to process wheel data 2275 (e.g., wheel speed), vehicle model data 2213 and 3D map data 2222, among other optional data.
  • localization system 2210 includes an integrator processor 2254c to process IMU data 2257, vehicle model data 2215, and 3D map data 2222, among other optional data.
  • relative localization system 2212 includes a LIDAR localization processor 2254d for processing LIDAR data 2272, 2D tile map data 2220, 3D map data 2222, and 3D local map data 2223, among other optional data.
  • Relative localization system 2212 also includes a visual registration processor 2254e to process camera data 2274, 3D map data 2222, and 3D local map data 2223, among other optional data.
  • relative localization system 2212 includes a radar return processor 2254f to process radar data 2276, 3D map data 2222, and 3D local map data 2223, among other optional data.
  • other types of sensor data and sensors or processors may be implemented, such as sonar data and the like.
  • localization-based data 2250 and relative localization-based data 2251 may be fed into data integrator 2266a and localization data integrator 2266, respectively.
  • Data integrator 2266a and localization data integrator 2266 may be configured to fuse corresponding data, whereby localization-based data 2250 may be fused at data integrator 2266a prior to being fused with relative localization-based data 2251 at localization data integrator 2266.
  • data integrator 2266a may be formed as part of localization data integrator 2266, or may be absent. Regardless, localization-based data 2250 and relative localization-based data 2251 can both be fed into localization data integrator 2266 for purposes of fusing data to generate local position data 2252.
  • Localization-based data 2250 may include unary-constrained data (and uncertainty values) from projection processor 2254a, as well as binary-constrained data (and uncertainty values) from odometry processor 2254b and integrator processor 2254c.
  • Relative localization-based data 2251 may include unary-constrained data (and uncertainty values) from localization processor 2254d and visual registration processor 2254e, and optionally from radar return processor 2254f.
  • localization data integrator 2266 may implement non-linear smoothing functionality, such as a Kalman filter (e.g., a gated Kalman filter), a relative bundle adjuster, pose-graph relaxation, particle filter, histogram filter, or the like.
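To illustrate the gated filtering mentioned above, the following Python sketch performs one gated Kalman-style measurement update: a relative-localization measurement is accepted only if its Mahalanobis distance to the prediction falls inside a chi-square gate. The 2D state, direct-observation model, and gate value are assumptions for illustration, not the disclosed implementation.

```python
import numpy as np

def gated_update(x_pred, P_pred, z, R, gate=9.21):
    """One gated measurement update on a 2D position state.

    The measurement z (e.g., relative-localization data with uncertainty R)
    is discarded if its Mahalanobis distance to the prediction exceeds the
    gate (9.21 ~ 99% chi-square bound for 2 DOF); otherwise a standard
    Kalman update is applied.
    """
    H = np.eye(2)                       # measurement observes position directly
    innovation = z - H @ x_pred
    S = H @ P_pred @ H.T + R            # innovation covariance
    d2 = innovation.T @ np.linalg.inv(S) @ innovation
    if d2 > gate:
        return x_pred, P_pred, False    # outlier: keep the prediction
    K = P_pred @ H.T @ np.linalg.inv(S) # Kalman gain
    x_new = x_pred + K @ innovation
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new, True

x_pred = np.array([5.0, 2.0]); P_pred = np.diag([0.4, 0.4])
z_good = np.array([5.2, 2.1]); R = np.diag([0.2, 0.2])
print(gated_update(x_pred, P_pred, z_good, R)[2])   # True: accepted
z_bad = np.array([9.0, -3.0])
print(gated_update(x_pred, P_pred, z_bad, R)[2])    # False: gated out
```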
  • FIG. 23 is a diagram depicting an example of a perception engine, according to some embodiments.
  • Diagram 2300 includes a perception engine 2366, which, in turn, includes a segmentation processor 2310, an object tracker 2330, and a classifier 2360. Further, perception engine 2366 is configured to receive local position data 2352, LIDAR data 2372, camera data 2374, and radar data 2376, for example. Note that other sensor data, such as sonar data, may be accessed to provide functionalities of perception engine 2366.
  • Segmentation processor 2310 is configured to extract ground plane data and/or to segment portions of an image to distinguish objects from each other and from static imagery (e.g., background). In some cases, 3D blobs may be segmented to distinguish them from each other.
  • a blob may refer to a set of features that identify an object in a spatially-reproduced environment and may be composed of elements (e.g., pixels of camera data, points of laser return data, etc.) having similar characteristics, such as intensity and color.
  • a blob may also refer to a point cloud (e.g., composed of colored laser return data) or other elements constituting an object.
  • Object tracker 2330 is configured to perform frame-to-frame estimations of motion for blobs, or other segmented image portions.
  • object tracker 2330 is configured to perform real-time probabilistic tracking of 3D objects, such as blobs.
  • Classifier 2360 is configured to identify an object and to classify that object by classification type (e.g., as a pedestrian, cyclist, etc.) and by energy/activity (e.g., whether the object is dynamic or static), whereby data representing classification is described by a semantic label.
  • probabilistic estimations of object categories may be performed, such as classifying an object as a vehicle, bicyclist, pedestrian, etc. with varying confidences per object class.
  • Perception engine 2366 is configured to determine perception engine data 2354, which may include static object maps and/or dynamic object maps, as well as semantic information so that, for example, a planner may use this information to enhance path planning.
  • one or more of segmentation processor 2310, object tracker 2330, and classifier 2360 may apply machine learning techniques to generate perception engine data 2354.
  • FIG. 24 is an example of a flow chart to generate perception engine data, according to some embodiments.
  • Flow chart 2400 begins at 2402, at which data representing a local position of an autonomous vehicle is retrieved.
  • localization data from one or more localization sensors is received, and features of an environment in which the autonomous vehicle is disposed are segmented at 2406 to form segmented objects.
  • One or more portions of the segmented object are tracked spatially at 2408 to form at least one tracked object having a motion (e.g., an estimated motion).
  • a tracked object is classified at least as either being a static object or a dynamic object.
  • a static object or a dynamic object may be associated with a classification type.
  • data identifying a classified object is generated.
  • the data identifying the classified object may include semantic information.
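As a toy illustration of the static/dynamic classification step in this flow, the sketch below labels a tracked object from its estimated speed over recent frames. The data structure and speed threshold are hypothetical; the disclosed classifier may instead compute probabilistic estimates per object class.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    track_id: int
    positions: list          # [(t, x, y), ...] recent frame-to-frame estimates
    label: str = "unknown"   # e.g., "pedestrian", "cyclist", "vehicle"

def classify_motion(obj, speed_threshold=0.3):
    """Label a tracked object as static or dynamic from its estimated speed.

    A hypothetical rule-of-thumb: average speed over the track's recent
    history compared against a threshold in meters per second.
    """
    if len(obj.positions) < 2:
        return "static"
    (t0, x0, y0), (t1, x1, y1) = obj.positions[0], obj.positions[-1]
    dt = max(t1 - t0, 1e-6)
    speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
    return "dynamic" if speed > speed_threshold else "static"

obj = TrackedObject(7, [(0.0, 1.0, 1.0), (0.5, 1.0, 1.8)], label="pedestrian")
print(classify_motion(obj))   # -> "dynamic" (about 1.6 m/s)
```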
  • FIG. 25 is an example of a segmentation processor, according to some embodiments.
  • Diagram 2500 depicts a segmentation processor 2510 receiving LIDAR data from one or more LIDARs 2572 and camera image data from one or more cameras 2574.
  • Local pose data 2552, LIDAR data, and camera image data are received into meta spin generator 2521.
  • meta spin generator 2521 is configured to partition an image based on various attributes (e.g., color, intensity, etc.) into distinguishable regions (e.g., clusters or groups of a point cloud), two or more of which may be updated at the same time or about the same time.
  • Meta spin data 2522 is used to perform object segmentation and ground segmentation at segmentation processor 2523, whereby both meta spin data 2522 and segmentation-related data from segmentation processor 2523 are applied to a scanned differencing processor 2513.
  • Scanned differencing processor 2513 is configured to predict motion and/or relative velocity of segmented image portions, which can be used to identify dynamic objects at 2517. Data indicating objects with detected velocity at 2517 are optionally transmitted to the planner to enhance path planning decisions. Additionally, data from scanned differencing processor 2513 may be used to approximate locations of objects to form mapping of such objects (as well as optionally identifying a level of motion). In some examples, an occupancy grid map 2515 may be generated.
  • Data representing an occupancy grid map 2515 may be transmitted to the planner to further enhance path planning decisions (e.g., by reducing uncertainties).
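For illustration of the occupancy grid idea referenced above, the following sketch rasterizes segmented (non-ground) points into a simple 2D grid centered on the vehicle. The binary-count cells, grid size, and resolution are assumptions; a deployed grid would typically track occupancy probabilities and uncertainties for the planner.

```python
import numpy as np

def build_occupancy_grid(points, resolution=0.5, size_m=20.0):
    """Rasterize segmented obstacle points into a 2D occupancy grid
    centered on the vehicle. Each cell simply records whether any
    obstacle point falls in it."""
    cells = int(size_m / resolution)
    grid = np.zeros((cells, cells), dtype=np.uint8)
    half = size_m / 2.0
    for x, y in points:
        if -half <= x < half and -half <= y < half:
            i = int((x + half) / resolution)
            j = int((y + half) / resolution)
            grid[i, j] = 1
    return grid

# Hypothetical obstacle points (meters, vehicle frame)
points = [(2.3, -1.1), (2.4, -1.0), (7.9, 6.2)]
grid = build_occupancy_grid(points)
print(grid.sum(), "occupied cells")   # -> 3 occupied cells
```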
  • image camera data from one or more cameras 2574 are used to classify blobs in blob classifier 2520, which also receives blob data 2524 from segmentation processor 2523.
  • Segmentation processor 2510 also may receive raw radar returns data 2512 from one or more radars 2576 to perform segmentation at a radar segmentation processor 2514, which generates radar-related blob data 2516. Further to FIG. 25, segmentation processor 2510 may also receive and/or generate tracked blob data 2518 related to radar data.
  • Blob data 2516, tracked blob data 2518, data from blob classifier 2520, and blob data 2524 may be used to track objects or portions thereof. According to some examples, one or more of the following may be optional: scanned differencing processor 2513, blob classification 2520, and data from radar 2576.
  • FIG. 26A is a diagram depicting examples of an object tracker and a classifier, according to various embodiments.
  • Object tracker 2630 of diagram 2600 is configured to receive blob data 2516, tracked blob data 2518, data from blob classifier 2520, blob data 2524, and camera image data from one or more cameras 2676.
  • Image tracker 2633 is configured to receive camera image data from one or more cameras 2676 to generate tracked image data, which, in turn, may be provided to data association processor 2632.
  • data association processor 2632 is configured to receive blob data 2516, tracked blob data 2518, data from blob classifier 2520, blob data 2524, and tracked image data from image tracker 2633, and is further configured to identify one or more associations among the above-described types of data.
  • Data association processor 2632 is configured to track, for example, various blob data from one frame to a next frame to, for example, estimate motion, among other things. Further, data generated by data association processor 2632 may be used by track updater 2634 to update one or more tracks, or tracked objects.
  • track updater 2634 may implement a Kalman Filter, or the like, to form updated data for tracked objects, which may be stored online in track database ("DB") 2636. Feedback data may be exchanged via path 2699 between data association processor 2632 and track database 2636.
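To illustrate the frame-to-frame association step described above, the sketch below greedily matches new blob detections to existing tracks by nearest neighbor within a gate. This is a simplification standing in for the disclosed data association processor; production systems commonly use gating plus a global assignment (e.g., the Hungarian algorithm) followed by a Kalman-style track update.

```python
import math

def associate_detections(tracks, detections, max_dist=2.0):
    """Greedy nearest-neighbor association of new blob detections to
    existing tracks. Returns (track_id -> detection index, unmatched list)."""
    assignments = {}
    used = set()
    for track_id, (tx, ty) in tracks.items():
        best, best_d = None, max_dist
        for idx, (dx, dy) in enumerate(detections):
            if idx in used:
                continue
            d = math.hypot(tx - dx, ty - dy)
            if d < best_d:
                best, best_d = idx, d
        if best is not None:
            assignments[track_id] = best
            used.add(best)
    unmatched = [i for i in range(len(detections)) if i not in used]
    return assignments, unmatched   # unmatched detections may seed new tracks

tracks = {1: (0.0, 0.0), 2: (5.0, 5.0)}
detections = [(0.3, -0.2), (5.1, 5.2), (9.0, 9.0)]
print(associate_detections(tracks, detections))
# -> ({1: 0, 2: 1}, [2])
```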
  • image tracker 2633 may be optional and may be excluded.
  • Object tracker 2630 may also use other sensor data, such as radar or sonar, as well as any other types of sensor data, for example.
  • FIG. 26B is a diagram depicting another example of an object tracker, according to at least some examples.
  • Diagram 2601 includes an object tracker 2631 that may include structures and/or functions similar to those of similarly-named elements described in connection with one or more other drawings (e.g., FIG. 26A).
  • object tracker 2631 includes an optional registration portion 2699 that includes a processor 2696 configured to perform object scan registration and data fusion.
  • Processor 2696 is further configured to store the resultant data in 3D object database 2698.
  • Referring back to FIG. 26A, diagram 2600 also includes classifier 2660, which may include a track classification engine 2662 for generating static obstacle data 2672 and dynamic obstacle data 2674, both of which may be transmitted to the planner for path planning purposes.
  • track classification engine 2662 is configured to determine whether an obstacle is static or dynamic, as well as another classification type for the object (e.g., whether the object is a vehicle, pedestrian, tree, cyclist, dog, cat, paper bag, etc.).
  • Static obstacle data 2672 may be formed as part of an obstacle map (e.g., a 2D occupancy map), and dynamic obstacle data 2674 may be formed to include bounding boxes with data indicative of velocity and classification type.
  • Dynamic obstacle data 2674, at least in some cases, includes 2D dynamic obstacle map data.
  • FIG. 27 is an example of a front-end processor for a perception engine, according to some examples.
  • Diagram 2700 includes a ground segmentation processor 2723a for performing ground segmentation, and an over segmentation processor 2723b for performing "over-segmentation," according to various examples.
  • Processors 2723a and 2723b are configured to receive optionally colored LIDAR data 2775.
  • Over segmentation processor 2723b generates data 2710 of a first blob type (e.g., a relatively small blob), which is provided to an aggregation classification and segmentation engine 2712 that generates data 2714 of a second blob type.
  • Data 2714 is provided to data association processor 2732, which is configured to detect whether data 2714 resides in track database 2736.
  • Track classification engine 2762 is coupled to track database 2736 to identify and update/modify tracks by, for example, adding, removing or modifying track-related data.
  • FIG. 28 is a diagram depicting a simulator configured to simulate an autonomous vehicle in a synthetic environment, according to various embodiments.
  • Diagram 2800 includes a simulator 2840 that is configured to generate a simulated environment 2803.
  • simulator 2840 is configured to use reference data 2822 (e.g., 3D map data and/or other map or route data including RNDF data or similar road network data) to generate simulated geometries, such as simulated surfaces 2892a and 2892b, within simulated environment 2803.
  • Simulated surfaces 2892a and 2892b may simulate walls or front sides of buildings adjacent a roadway.
  • Simulator 2840 may also use pre-generated or procedurally generated dynamic object data 2825 to simulate dynamic agents in a synthetic environment.
  • a dynamic agent is simulated dynamic object 2801, which is representative of a simulated cyclist having a velocity.
  • the simulated dynamic agents may optionally respond to other static and dynamic agents in the simulated environment, including the simulated autonomous vehicle.
  • simulated object 2801 may slow down for other obstacles in simulated environment 2803 rather than follow a preset trajectory, thereby creating a more realistic simulation of actual dynamic environments that exist in the real world.
  • Simulator 2840 may be configured to generate a simulated autonomous vehicle controller 2847, which includes synthetic adaptations of a perception engine 2866, a localizer 2868, a motion controller 2862, and a planner 2864, each of which may have functionalities described herein within simulated environment 2803. Simulator 2840 may also generate simulated interfaces ("I/F") 2849 to simulate the data exchanges with different sensors modalities and different sensor data formats. As such, simulated interface 2849 may simulate a software interface for packetized data from, for example, a simulated LIDAR sensor 2872. Further, simulator 2840 may also be configured to generate a simulated autonomous vehicle 2830 that implements simulated AV controller 2847.
  • Simulated autonomous vehicle 2830 includes simulated LIDAR sensors 2872, simulated camera or image sensors 2874, and simulated radar sensors 2876.
  • simulated LIDAR sensor 2872 may be configured to generate a simulated laser consistent with ray trace 2892, which causes generation of simulated sensor return 2891.
  • simulator 2840 may simulate the addition of noise or other environmental effects on sensor data (e.g., added diffusion or reflections that affect simulated sensor return 2891, etc.). Further yet, simulator 2840 may be configured to simulate a variety of sensor defects, including sensor failure, sensor miscalibration, intermittent data outages, and the like.
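As an illustrative sketch of the sensor noise and defect simulation described above, the Python below corrupts simulated LIDAR range returns with additive Gaussian noise, random dropouts (lost returns), and a constant bias standing in for miscalibration. The defect models and parameter values are assumptions, not the simulator's actual models.

```python
import random

def corrupt_lidar_returns(ranges, noise_std=0.02, dropout_prob=0.01,
                          bias_m=0.0, seed=None):
    """Apply hypothetical defect models to simulated LIDAR range returns:
    additive Gaussian noise, random dropouts (None = no return), and a
    constant bias approximating miscalibration."""
    rng = random.Random(seed)
    corrupted = []
    for r in ranges:
        if rng.random() < dropout_prob:
            corrupted.append(None)                    # lost return
        else:
            corrupted.append(r + bias_m + rng.gauss(0.0, noise_std))
    return corrupted

clean = [4.98, 5.01, 5.00, 12.40, 12.43]
print(corrupt_lidar_returns(clean, noise_std=0.05, dropout_prob=0.2, seed=3))
```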
  • Simulator 2840 includes a physics processor 2850 for simulating the mechanical, static, dynamic, and kinematic aspects of an autonomous vehicle for use in simulating behavior of simulated autonomous vehicle 2830.
  • physics processor 2850 includes a contact mechanics module 2851 for simulating contact mechanics, a collision detection module 2852 for simulating the interaction between simulated bodies, and a multibody dynamics module 2854 to simulate interactions among multiple simulated mechanical bodies.
  • Simulator 2840 also includes a simulator controller 2856 configured to control the simulation to adapt the functionalities of any synthetically-generated element of simulated environment 2803 to determine cause-effect relationships, among other things.
  • Simulator 2840 includes a simulator evaluator 2858 to evaluate the performance of synthetically-generated elements of simulated environment 2803.
  • simulator evaluator 2858 may analyze simulated vehicle commands 2880 (e.g., simulated steering angles and simulated velocities) to determine whether such commands are an appropriate response to the simulated activities within simulated environment 2803.
  • simulator evaluator 2858 may evaluate interactions of a teleoperator 2808 with the simulated autonomous vehicle 2830 via teleoperator computing device 2804.
  • Simulator evaluator 2858 may evaluate the effects of updated reference data 2827, including updated map tiles and route data, which may be added to guide the responses of simulated autonomous vehicle 2830. Simulator evaluator 2858 may also evaluate the responses of simulator AV controller 2847 when policy data 2829 is updated, deleted, or added.
  • the above description of simulator 2840 is not intended to be limiting. As such, simulator 2840 is configured to perform a variety of different simulations of an autonomous vehicle relative to a simulated environment, which include both static and dynamic features. For example, simulator 2840 may be used to validate changes in software versions to ensure reliability. Simulator 2840 may also be used to determine vehicle dynamics properties and for calibration purposes. Further, simulator 2840 may be used to explore the space of applicable controls and resulting trajectories so as to effect learning by self-simulation.
  • FIG. 29 is an example of a flow chart to simulate various aspects of an autonomous vehicle, according to some embodiments.
  • Flow chart 2900 begins at 2902, at which reference data including three dimensional map data is received into a simulator. Dynamic object data defining motion patterns for a classified object may be retrieved at 2904.
  • a simulated environment is formed based on at least three dimensional ("3D") map data and the dynamic object data.
  • the simulated environment may include one or more simulated surfaces.
  • an autonomous vehicle is simulated that includes a simulated autonomous vehicle controller that forms part of a simulated environment.
  • the autonomous vehicle controller may include a simulated perception engine and a simulated localizer configured to receive sensor data.
  • simulated sensor data are generated based on data for at least one simulated sensor return, and simulated vehicle commands are generated at 2912 to cause motion (e.g., vectored propulsion) by a simulated autonomous vehicle in a synthetic environment.
  • simulated vehicle commands are evaluated to determine whether the simulated autonomous vehicle behaved consistent with expected behaviors (e.g., consistent with a policy).
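To illustrate this evaluation step, the sketch below checks a sequence of simulated vehicle commands against hypothetical policy limits on steering angle, speed, and deceleration, and reports violations that a simulator evaluator might surface. The command schema and limit values are assumptions for illustration.

```python
def evaluate_commands(commands, max_steer_deg=35.0, max_speed_mps=15.0,
                      max_decel_mps2=4.0):
    """Check simulated vehicle commands against hypothetical policy limits.
    Returns a list of (step index, violation description)."""
    violations = []
    prev_speed = None
    for step, cmd in enumerate(commands):
        steer, speed, dt = cmd["steer_deg"], cmd["speed_mps"], cmd["dt"]
        if abs(steer) > max_steer_deg:
            violations.append((step, "steering angle exceeds limit"))
        if speed > max_speed_mps:
            violations.append((step, "speed exceeds limit"))
        if prev_speed is not None and (prev_speed - speed) / dt > max_decel_mps2:
            violations.append((step, "deceleration exceeds limit"))
        prev_speed = speed
    return violations

commands = [{"steer_deg": 5.0, "speed_mps": 10.0, "dt": 0.1},
            {"steer_deg": 8.0, "speed_mps": 9.0, "dt": 0.1},
            {"steer_deg": 40.0, "speed_mps": 8.8, "dt": 0.1}]
print(evaluate_commands(commands))
# -> [(1, 'deceleration exceeds limit'), (2, 'steering angle exceeds limit')]
```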
  • FIG. 30 is an example of a flow chart to generate map data, according to some embodiments.
  • Flow chart 3000 begins at 3002, at which trajectory data is retrieved.
  • the trajectory data may include trajectories captured over a duration of time (e.g., as logged trajectories).
  • at 3004 at least localization data may be received.
  • the localization data may be captured over a duration of time (e.g., as logged localization data).
  • a camera or other image sensor may be implemented to generate a subset of the localization data.
  • the retrieved localization data may include image data.
  • subsets of localization data are aligned to identify a global position (e.g., a global pose).
  • three dimensional ("3D") map data is generated based on the global position, and at 3012, the three dimensional map data is made available for implementation by, for example, a manual route data editor (e.g., including a manual road network data editor, such as an RNDF editor), an automated route data generator (e.g., including an automatic road network generator, including an automatic RNDF generator), a fleet of autonomous vehicles, a simulator, a teleoperator computing device, and any other component of an autonomous vehicle service.
  • FIG. 31 is a diagram depicting an architecture of a mapping engine, according to some embodiments.
  • Diagram 3100 includes a 3D mapping engine that is configured to receive trajectory log data 3140, LIDAR log data 3172, camera log data 3174, radar log data 3176, and other optional logged sensor data (not shown).
  • Logic 3141 includes a loop-closure detector 3150 configured to detect whether sensor data indicates a nearby point in space has been previously visited, among other things.
  • Logic 3141 also includes a registration controller 3152 for aligning map data, including 3D map data in some cases, relative to one or more registration points. Further, logic 3141 provides data 3142 representing states of loop closures for use by a global pose graph generator 3143, which is configured to generate pose graph data 3145.
  • pose graph data 3145 may also be generated based on data from registration refinement module 3146.
  • Logic 3144 includes a 3D mapper 3154 and a LIDAR self-calibration unit 3156. Further, logic 3144 receives sensor data and pose graph data 3145 to generate 3D map data 3120 (or other map data, such as 4D map data). In some examples, logic 3144 may implement a truncated signed distance function ("TSDF") to fuse sensor data and/or map data to form optimal three-dimensional maps. Further, logic 3144 is configured to include texture and reflectance properties.
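To illustrate the TSDF fusion idea mentioned above, the following deliberately simplified sketch integrates noisy 1-D range measurements into a truncated signed distance field by running-average fusion; the surface lies near the zero crossing of the fused field. The voxel line, truncation distance, and weighting are assumptions, not the disclosed 3D implementation, which fuses full 3D rays, poses, and per-voxel weights.

```python
import numpy as np

def tsdf_update(tsdf, weights, origin_idx, hit_idx, trunc=3):
    """Integrate a single 1-D range measurement into a truncated signed
    distance field by running-average fusion."""
    for v in range(origin_idx, min(hit_idx + trunc, len(tsdf))):
        sd = hit_idx - v                      # signed distance in voxels
        sd = max(-trunc, min(trunc, sd))      # truncate
        w_old = weights[v]
        tsdf[v] = (tsdf[v] * w_old + sd) / (w_old + 1)
        weights[v] = w_old + 1
    return tsdf, weights

n = 20
tsdf = np.zeros(n); weights = np.zeros(n)
for hit in (12, 12, 13):                      # noisy hits around one surface
    tsdf, weights = tsdf_update(tsdf, weights, origin_idx=0, hit_idx=hit)
# The zero crossing of the fused TSDF approximates the surface location.
print(np.round(tsdf[10:16], 2))
```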
  • 3D map data 3120 may be released for usage by a manual route data editor 3160 (e.g., an editor to manipulate route data or other types of route or reference data), an automated route data generator 3162 (e.g., logic configured to generate route data or other types of road network or reference data), a fleet of autonomous vehicles 3164, a simulator 3166, a teleoperator computing device 3168, and any other component of an autonomous vehicle service.
  • Mapping engine 3110 may capture semantic information from manual annotation or automatically-generated annotation as well as other sensors, such as sonar or instrumented environment (e.g., smart stop-lights).
  • FIG. 32 is a diagram depicting an autonomous vehicle application, according to some examples.
  • Diagram 3200 depicts a mobile computing device 3203 including an autonomous service application 3240 that is configured to contact an autonomous vehicle service platform 3201 to arrange transportation of user 3202 via an autonomous vehicle 3230.
  • autonomous service application 3240 may include a transportation controller 3242, which may be a software application residing on a computing device (e.g., a mobile phone 3203, etc.).
  • Transportation controller 3242 is configured to receive, schedule, select, or perform operations related to autonomous vehicles and/or autonomous vehicle fleets for which a user 3202 may arrange transportation from the user's location to a destination. For example, user 3202 may open up an application to request vehicle 3230.
  • the application may display a map and user 3202 may drop a pin to indicate their destination within, for example, a geo-fenced region.
  • the application may display a list of nearby pre- specified pick-up locations, or provide the user with a text entry field in which to type a destination either by address or by name.
  • autonomous vehicle application 3240 may also include a user identification controller 3246 that may be configured to detect that user 3202 is in a geographic region, or vicinity, near autonomous vehicle 3230, as the vehicle approaches. In some situations, user 3202 may not readily perceive or identify autonomous vehicle 3230 as it approaches for use by user 3202 (e.g., due to various other vehicles, including trucks, cars, taxis, and other obstructions that are typical in city environments).
  • autonomous vehicle 3230 may establish a wireless communication link 3262 (e.g., via a radio frequency ("RF") signal, such as WiFi or Bluetooth®, including BLE, or the like) for communicating and/or determining a spatial location of user 3202 relative to autonomous vehicle 3230 (e.g., using relative direction of RF signal and signal strength).
  • autonomous vehicle 3230 may detect an approximate geographic location of user 3202 using, for example, GPS data or the like.
  • a GPS receiver (not shown) of mobile computing device 3203 may be configured to provide GPS data to autonomous vehicle service application 3240.
  • user identification controller 3246 may provide GPS data via link 3260 to autonomous vehicle service platform 3201, which, in turn, may provide that location to autonomous vehicle 3230 via link 3261.
  • autonomous vehicle 3230 may determine a relative distance and/or direction of user 3202 by comparing the user's GPS data to the vehicle's GPS-derived location.
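As one way to compute such a relative distance and direction from GPS coordinates, the sketch below applies the haversine formula and an initial-bearing calculation. This is shown only for illustration; the coordinates are hypothetical and the disclosure does not specify a particular formula.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (meters) and initial bearing (degrees) from
    point 1 (e.g., the vehicle's GPS-derived location) to point 2 (e.g.,
    the user's reported GPS location), via the haversine formula."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    y = math.sin(dlmb) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing

# Hypothetical coordinates a short distance apart
print(distance_and_bearing(37.7749, -122.4194, 37.7755, -122.4180))
```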
  • Autonomous vehicle 3230 may also include additional logic to identify the presence of user 3202, such as logic configured to perform face detection algorithms to detect either user 3202 generally, or to specifically identify the identity (e.g., name, phone number, etc.) of user 3202 based on the user's unique facial characteristics. Further, autonomous vehicle 3230 may include logic to detect codes for identifying user 3202. Examples of such codes include specialized visual codes, such as QR codes, color codes, etc., specialized audio codes, such as voice activated or recognized codes, etc., and the like. In some cases, a code may be an encoded security key that may be transmitted digitally via link 3262 to autonomous vehicle 3230 to ensure secure ingress and/or egress.
  • one or more of the above-identified techniques for identifying user 3202 may be used as a secured means to grant ingress and egress privileges to user 3202 so as to prevent others from entering autonomous vehicle 3230 (e.g., to ensure third party persons do not enter an unoccupied autonomous vehicle prior to arriving at user 3202).
  • any other means for identifying user 3202 and providing secured ingress and egress may also be implemented in one or more of autonomous vehicle service application 3240, autonomous vehicle service platform 3201, and autonomous vehicle 3230.
  • autonomous vehicle 3230 may be configured to notify or otherwise alert user 3202 to the presence of autonomous vehicle 3230 as it approaches user 3202.
  • autonomous vehicle 3230 may activate one or more light-emitting devices 3280 (e.g., LEDs) in accordance with specific light patterns.
  • certain light patterns are created so that user 3202 may readily perceive that autonomous vehicle 3230 is reserved to service the transportation needs of user 3202.
  • autonomous vehicle 3230 may generate light patterns 3290 that may be perceived by user 3202 as a "wink," or other animation of its exterior and interior lights in such a visual and temporal way.
  • the patterns of light 3290 may be generated with or without patterns of sound to identify to user 3202 that this vehicle is the one that they booked.
  • autonomous vehicle user controller 3244 may implement a software application that is configured to control various functions of an autonomous vehicle. Further, an application may be configured to redirect or reroute the autonomous vehicle during transit to its initial destination. Further, autonomous vehicle user controller 3244 may be configured to cause on-board logic to modify interior lighting of autonomous vehicle 3230 to effect, for example, mood lighting.
  • Controller 3244 may also control a source of audio (e.g., an external source such as Spotify, or audio stored locally on the mobile computing device 3203), select a type of ride (e.g., modify desired acceleration and braking aggressiveness, modify active suspension parameters to select a set of "road-handling" characteristics to implement aggressive driving characteristics, including vibrations, or to select “soft-ride” qualities with vibrations dampened for comfort), and the like.
  • mobile computing device 3203 may be configured to control HVAC functions as well, like ventilation and temperature.
  • FIGs. 33 to 35 illustrate examples of various computing platforms configured to provide various functionalities to components of an autonomous vehicle service, according to various embodiments.
  • computing platform 3300 may be used to implement computer programs, applications, methods, processes, algorithms, or other software to perform the above- described techniques.
  • Note that various structures and/or functionalities of FIG. 33 are applicable to FIGs. 34 and 35, and, as such, some elements in those figures may be discussed in the context of FIG. 33.
  • computing platform 3300 can be disposed in any device, such as a computing device 3390a, which may be disposed in one or more computing devices in an autonomous vehicle service platform, an autonomous vehicle 3391, and/or mobile computing device 3390b.
  • Computing platform 3300 includes a bus 3302 or other communication mechanism for communicating information, which interconnects subsystems and devices, such as processor 3304, system memory 3306 (e.g., RAM, etc.), storage device 3308 (e.g., ROM, etc.), an in-memory cache (which may be implemented in RAM 3306 or other portions of computing platform 3300), a communication interface 3313 (e.g., an Ethernet or wireless controller, a Bluetooth controller, NFC logic, etc.) to facilitate communications via a port on communication link 3321 to communicate, for example, with a computing device, including mobile computing and/or communication devices with processors.
  • Processor 3304 can be implemented with one or more graphics processing units (“GPUs”), with one or more central processing units (“CPUs”), such as those manufactured by Intel® Corporation, or one or more virtual processors, as well as any combination of CPUs and virtual processors.
  • Computing platform 3300 exchanges data representing inputs and outputs via input-and-output devices 3301, including, but not limited to, keyboards, mice, audio inputs (e.g., speech-to-text devices), user interfaces, displays, monitors, cursors, touch-sensitive displays, LCD or LED displays, and other I/O-related devices.
  • computing platform 3300 performs specific operations by processor 3304 executing one or more sequences of one or more instructions stored in system memory 3306, and computing platform 3300 can be implemented in a client-server arrangement, peer-to-peer arrangement, or as any mobile computing device, including smart phones and the like.
  • Such instructions or data may be read into system memory 3306 from another computer readable medium, such as storage device 3308.
  • hard-wired circuitry may be used in place of or in combination with software instructions for implementation. Instructions may be embedded in software or firmware.
  • the term "computer readable medium” refers to any tangible medium that participates in providing instructions to processor 3304 for execution. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media.
  • Non-volatile media includes, for example, optical or magnetic disks and the like.
  • Volatile media includes dynamic memory, such as system memory 3306.
  • Common forms of computer readable media include, for example, floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read. Instructions may further be transmitted or received using a transmission medium.
  • transmission medium may include any tangible or intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such instructions.
  • Transmission media includes coaxial cables, copper wire, and fiber optics, including wires that comprise bus 3302 for transmitting a computer data signal.
  • execution of the sequences of instructions may be performed by computing platform 3300.
  • computing platform 3300 can be coupled by communication link 3321 (e.g., a wired network, such as LAN, PSTN, or any wireless network, including WiFi of various standards and protocols, Bluetooth®, NFC, Zig-Bee, etc.) to any other processor to perform the sequence of instructions in coordination with (or asynchronous to) one another.
  • Computing platform 3300 may transmit and receive messages, data, and instructions, including program code (e.g., application code) through communication link 3321 and communication interface 3313.
  • Received program code may be executed by processor 3304 as it is received, and/or stored in memory 3306 or other non-volatile storage for later execution.
  • system memory 3306 can include various modules that include executable instructions to implement functionalities described herein.
  • System memory 3306 may include an operating system (“O/S”) 3332, as well as an application 3336 and/or logic module(s) 3359.
  • system memory 3306 includes an autonomous vehicle (“AV") controller module 3350 and/or its components (e.g., a perception engine module, a localization module, a planner module, and/or a motion controller module), any of which, or one or more portions of which, can be configured to facilitate an autonomous vehicle service by implementing one or more functions described herein.
  • system memory 3306 includes an autonomous vehicle service platform module 3450 and/or its components (e.g., a teleoperator manager, a simulator, etc.), any of which, or one or more portions of which, can be configured to facilitate managing an autonomous vehicle service by implementing one or more functions described herein.
  • system memory 3306 includes an autonomous vehicle (“AV”) module and/or its components for use, for example, in a mobile computing device.
  • One or more portions of module 3550 can be configured to facilitate delivery of an autonomous vehicle service by implementing one or more functions described herein.
  • the structures and/or functions of any of the above-described features can be implemented in software, hardware, firmware, circuitry, or a combination thereof.
  • the structures and constituent elements above, as well as their functionality may be aggregated with one or more other structures or elements.
  • the elements and their functionality may be subdivided into constituent sub-elements, if any.
  • the above-described techniques may be implemented using various types of programming or formatting languages, frameworks, syntax, applications, protocols, objects, or techniques.
  • module can refer, for example, to an algorithm or a portion thereof, and/or logic implemented in either hardware circuitry or software, or a combination thereof. These can be varied and are not limited to the examples or descriptions provided.
  • module 3350 of FIG. 33, module 3450 of FIG. 34, and module 3550 of FIG. 35, or one or more of their components, or any process or device described herein can be in communication (e.g., wired or wirelessly) with a mobile device, such as a mobile phone or computing device, or can be disposed therein.
  • a mobile device or any networked computing device (not shown) in communication with one or more modules 3359 (module 3350 of FIG. 33, module 3450 of FIG. 34, and module 3550 of FIG. 35), or one or more of their components (or any process or device described herein), can provide at least some of the structures and/or functions of any of the features described herein.
  • the structures and/or functions of any of the above-described features can be implemented in software, hardware, firmware, circuitry, or any combination thereof.
  • the structures and constituent elements above, as well as their functionality may be aggregated or combined with one or more other structures or elements. Alternatively, the elements and their functionality may be subdivided into constituent sub-elements, if any.
  • at least some of the above-described techniques may be implemented using various types of programming or formatting languages, frameworks, syntax, applications, protocols, objects, or techniques.
  • at least one of the elements depicted in any of the figures can represent one or more algorithms.
  • at least one of the elements can represent a portion of logic including a portion of hardware configured to provide constituent structures and/or functionalities.
  • module 3350 of FIG. 33, module 3450 of FIG. 34, and module 3550 of FIG. 35 can be implemented in one or more computing devices (i.e., any mobile computing device, such as a wearable device, an audio device (such as headphones or a headset) or mobile phone, whether worn or carried) that include one or more processors configured to execute one or more algorithms in memory.
  • the elements in the above-described figures can represent one or more algorithms.
  • at least one of the elements can represent a portion of logic including a portion of hardware configured to provide constituent structures and/or functionalities.
  • such hardware implementations may be expressed using a register transfer language ("RTL") and realized in field-programmable gate arrays ("FPGAs"), application-specific integrated circuits ("ASICs"), multi-chip modules, or any other type of integrated circuit.
  • module 3350 of FIG. 33, module 3450 of FIG. 34, and module 3550 of FIG. 35, or one or more of its components, or any process or device described herein, can be implemented in one or more computing devices that include one or more circuits.
  • at least one of the elements in the above-described figures can represent one or more components of hardware.
  • at least one of the elements can represent a portion of logic including a portion of a circuit configured to provide constituent structures and/or functionalities.
  • the term "circuit” can refer, for example, to any system including a number of components through which current flows to perform one or more functions, the components including discrete and complex components.
  • discrete components include transistors, resistors, capacitors, inductors, diodes, and the like
  • complex components include memory, processors, analog circuits, digital circuits, and the like, including field-programmable gate arrays ("FPGAs") and application-specific integrated circuits ("ASICs"). Therefore, a circuit can include a system of electronic components and logic components (e.g., logic configured to execute instructions, such as a group of executable instructions of an algorithm, which, thus, is a component of a circuit).
  • the term “module” can refer, for example, to an algorithm or a portion thereof, and/or logic implemented in either hardware circuitry or software, or a combination thereof (i.e., a module can be implemented as a circuit).
  • algorithms and/or the memory in which the algorithms are stored are “components” of a circuit.
  • the term "circuit" can also refer, for example, to a system of components, including algorithms. These can be varied and are not limited to the examples or descriptions provided.
  • FIG. 36 is a diagram depicting a mapping engine configured to generate mapping data adaptively for autonomous vehicles responsive to changes in physical environments, according to some examples.
  • Diagram 3600 depicts a mapping engine 3654 disposed in an autonomous vehicle service platform 3601 communicatively coupled via a communication layer (not shown) to one or more autonomous vehicles 3630.
  • Mapping engine 3654 is configured to generate map data, and to modify map data adaptively responsive to changes in physical environments in which autonomous vehicle 3630 transits.
  • mapping engine 3654 may generate mapping data based on sensor data received from autonomous vehicle 3630, which is depicted as having any number of sensors or sensor devices 3604a, 3604b, and 3604c, of a sensor type 3602a, a sensor type 3602b, and a sensor type 3602c, respectively.
  • Autonomous vehicle 3630 may include any number of other sensors or sensor devices 3604n having any other sensor types 3602n.
  • Sensors 3604a, 3604b, 3604c, and 3604n respectively generate sensor data 3607a, 3607b, 3607c, and 3607n, one or more of which may be received into mapping engine 3654 for generating map data 3659 (e.g., 2D, 3D, and/or 4D map data).
  • Map data 3659 may be transmitted to autonomous vehicle 3630 for storage in map repository 3605a and for use to facilitate localization as well as other functionalities.
  • autonomous vehicle 3630 may include a localizer (not shown) that uses map data in map repository 3605a to determine a location and/or local pose of the autonomous vehicle at any time, including during transit.
  • mapping engine 3654 can facilitate the generation of "self-healing" maps and map data by, for example, detecting variations in portions of map data over time, and generating updated maps (i.e., updated map data) that include the variations or changes to the physical environment in which autonomous vehicle 3630 travels.
  • mapping engine 3654 may generate an adaptive three-dimensional model of a cityscape adjacent to networks of paths and roadways over which a fleet of autonomous vehicles travel.
  • a 3D model of a portion of the cityscape can be derived by identifying data that represent surfaces (and other surface attributes, such as shape, size, texture, color, etc., of the surfaces) that constitute the facade or exterior surfaces of objects, such as buildings (including commercial signage), trees, guard rails, barriers, street lamps, traffic signs and signals, and any other physical feature that may be detected by sensors 3604a, 3604b, 3604c, and 3604n.
  • mapping engine 3654 may be configured to detect an object (or the absence of the object) associated with a portion of map data, as well as changes in the object (e.g., changes in color, size, etc.), and may be further configured to incorporate changes to an object into map data to adaptively form (e.g., automatically) an updated portion of the map data. Therefore, an updated portion of map data may be stored in map repository 3605a so as to enhance, among other things, the accuracy of localization functions for autonomous vehicle 3630 (as well as other autonomous vehicle controller functions, including planning and the like).
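As a hypothetical change detector illustrating how variations in a portion of map data might trigger such a "self-healing" update, the sketch below compares a stored map tile against a newly observed tile (both as 2-D arrays, e.g., height or occupancy values) and reports the fraction of cells that differ beyond a threshold. The tile representation and threshold are assumptions for illustration.

```python
import numpy as np

def detect_tile_changes(stored_tile, observed_tile, threshold=0.2):
    """Return the fraction and mask of map-tile cells whose values differ
    by more than a threshold between the stored and observed tiles."""
    diff_mask = np.abs(stored_tile - observed_tile) > threshold
    changed_fraction = float(diff_mask.mean())
    return changed_fraction, diff_mask

stored = np.zeros((4, 4))
observed = stored.copy()
observed[1:3, 1:3] = 1.0          # e.g., a newly placed barrier
frac, mask = detect_tile_changes(stored, observed)
print(frac)                        # -> 0.25 of cells changed; flag tile for update
```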
  • map data 3659 generated by mapping engine 3654 may be used in combination with locally-generated map data (not shown), as generated by a local map generator (not shown) in autonomous vehicle 3630.
  • an autonomous vehicle controller (not shown) may detect that one or more portions of map data in map repository 3605a varies from one or more portions of locally-generated map data.
  • Logic in the autonomous vehicle controller can analyze the differences in map data (e.g., variation data) to identify a change in the physical environment (e.g., the addition, removal, or change of a static object).
  • the term "variation data” may refer to the differences between remotely-generated and locally- generated map data.
  • the autonomous vehicle controller may implement varying proportional amounts of map data in map repository 3605a and locally-generated map data to optimize localization. For example, an autonomous vehicle controller may generate hybrid map data composed of both remotely-generated map data and locally- generated map data to optimize the determination of the location or local pose of autonomous vehicle 3630. Further, an autonomous vehicle controller, upon detecting variation data, may cause transmission (at various bandwidths or data rates) of varying amounts of sensor-based data or other data to autonomous vehicle service platform 3601. For example, autonomous vehicle service platform 3601 may receive different types of data at different data rates based on, for instance, the criticality of receiving guidance from a teleoperator.
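The proportional blending described above might look like the following sketch for a single map element: the larger the variation data, the more weight shifts toward the locally-generated (fresher) observation. The weighting rule and values are purely illustrative assumptions; the disclosure does not specify a particular blending formula.

```python
def blend_map_confidence(remote_value, local_value, variation, max_variation=1.0):
    """Blend remotely-generated and locally-generated map values for one
    map element in proportion to how much they disagree (variation data)."""
    w_local = min(abs(variation) / max_variation, 1.0)
    return (1.0 - w_local) * remote_value + w_local * local_value

# Hypothetical occupancy values for one cell
remote, local = 0.1, 0.9          # stored map says free; sensors say occupied
print(blend_map_confidence(remote, local, variation=abs(remote - local)))
# -> 0.74: weight shifted toward the local observation
```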
  • subsets of sensor data 3607a, 3607b, 3607c, and 3607n may be transmitted (e.g., at appropriate data rates) to, for example, modify map data to form various degrees of updated map data in real-time (or near real-time), and to further perform one or more of the following: (1) evaluate and characterize differences in map data, (2) propagate updated portions of map data to other autonomous vehicles in the fleet, (3) generate a notification responsive to detecting map data differences to a teleoperator computing device, (4) generate a depiction of the environment (and the changed portion thereof), as sensed by various sensor devices 3604a, 3604b, 3604c, and 3604n, to display at any sufficient resolution in a user interface of a teleoperator computing device.
  • one or more of these functions may be facilitated by mapping engine 3654 in view of detected changes in physical environments relative to map data.
  • sensor type 3602a, sensor type 3602b, and sensor type 3602c may include laser-based sensors, image-based sensors, and radar-based sensors, respectively.
  • sensors 3604a, 3604b, and 3604c may include Lidars, cameras, and radar devices, respectively.
  • each Lidar 3604a may be disposed at different locations on autonomous vehicle 3630, and each may be oriented differently (see FIGs. 3A and 3C, both of which depict different Lidars having different views and sensor fields).
  • mapping engine 3654 and/or components of autonomous vehicle services platform 3601 may be configured to align, map, transform, or correlate the laser returns of different Lidars 3604a for common points of laser returns from a surface in the environment. Mapping engine 3654 and/or components of autonomous vehicle services platform 3601 may also process sensor data 3607b and sensor data 3607c similarly.
  • one or more sensors 3604n may include various different sensor types ("n") 3602n to generate various different subsets of sensor data 3607n.
  • sensors 3604n include positioning sensors, such as one or more global positioning system ("GPS") data receiver-sensors, one or more inertial measurement units ("IMUs"), one or more odometry sensors (e.g., wheel encoder sensors, wheel speed sensors, and the like), one or more wheel angle sensors, and the like to provide autonomous vehicle position and pose data.
  • Such pose data may include one or more coordinates (e.g., an x-coordinate, a y-coordinate, and/or a z-coordinate), a yaw value, a roll value, a pitch value (e.g., an angle value), a rate (e.g., velocity), an altitude, and the like
  • a log data repository 3609 in autonomous vehicle services platform 3601 is configured to receive and store subsets of sensor data 3607a, 3607b, 3607c, and 3607n, which, in at least one example, include raw LIDAR data, raw camera data, raw radar data, and other raw sensor data, respectively.
  • subsets of sensor data 3607a, 3607b, and 3607c may be stored or logged at a common point in time or during a common interval of time as data set ("1") 3610a, data set ("2") 3610b, and data set ("n") 3610n, or as any number of data sets.
  • Data sets 3610a, 3610b, and 3610n may be stored in data structures of log files, according to some examples.
  • sensor data 3607n, which may be sensed contemporaneously with subsets of sensor data 3607a, 3607b, and 3607c, may also be stored as part of log files for data sets 3610a, 3610b, and 3610n.
  • Alignment controller 3640 may be configured to receive one or more of sensor data 3607a, 3607b, 3607c, and 3607n, as well as other data 3603m. Alignment controller 3640 may also be configured to generate data representing aligned subsets of sensor data 3607a, 3607b, 3607c, and 3607n.
  • sensor data 3607 may include a subset of sensor data 3607n that includes positioning data (e.g., sensor data 3607n may include GPS, IMU, and odometry data).
  • examples of data representing aligned subsets of sensor data include data representing at least aligned Lidar data and aligned camera data.
  • alignment controller 3640 may be configured to implement a registration algorithm to align sensor data by identifying "registration" points at which to register portions or frames of Lidar sensor data and to register portions or frames of camera data. For example, alignment controller 3640 may map or relate laser returns from one Lidar to other Lidars, and may map or relate pixel data from one camera to other cameras. Further, alignment controller 3640 may generate positioning map data; such data may be stored in a data structure based on a pose graph model in which data specifying individual poses (e.g., local poses) may be interrelated spatially based on positioning sensor data collected from sensors 3604n (e.g., GPS data, IMU data, odometry data, etc.). A minimal pose-graph sketch appears after this list.
  • Mapping engine 3654 may be configured to receive the above-described aligned sensor data (e.g., registered sensor data) and positioning map data (e.g., pose graph-related data) to generate a high definition ("HD") three-dimensional model of a cityscape adjacent to a network of roadways based on the integration of the subsets of sensor data 3607a, 3607b, 3607c, and 3607n.
  • mapping engine 3654 may include one or more of the following: an integrator 3651 to integrate sensor data, a calibrator 3652 to calibrate sensor data, a data change detector 3653 to detect changes in portions of map data, a tile generator 3656 to generate formatted map data, and a data change manager 3657 to manage implementation of changed map data, according to various examples.
  • Integrator 3651 may be configured to integrate multiple subsets of sensor data (e.g., of the same and different sensor modalities) to generate high-resolution (e.g., relatively high-resolution) imagery data as a 3D model of an environment in which autonomous vehicles travel, and may be further configured to reduce errors related to the individual types of sensors. According to some examples, integrator 3651 is configured to fuse sensor data (e.g., LIDAR data, camera data, radar data, etc.) to form integrated sensor data.
  • raw sensor data sets 3610a, 3610b, and 3610n may be received from one or more autonomous vehicles 3630 so as to fuse the aggregation of one or more subsets of sensor data of one or more sensor modalities from a fleet of autonomous vehicles 3630.
  • integrator 3651 may generate 3D data sets that include fused sensor data, such as data set ("1") 3655a and data set ("2") 3655b.
  • Integrator 3651 may integrate or otherwise fuse at least two types of sensor data, including the subsets of laser return data and the subsets of image data.
  • the fusing of laser and image data may include correlating pixel data of subsets of image data to subsets of laser return data.
  • integrator 3651 may associate pixel data of one or more pixels to one or more laser returns, whereby the laser data may be associated with a portion of a surface in the three-dimensional tile data.
  • pixel data may specify one or more surface characteristics including texture, color, reflectivity, transparency, etc.
  • integrator 3651 may implement a Kalman filtering process or a variant thereof (e.g., an extended Kalman filtering process), or any other process with which to fuse sensor data (a minimal fusion sketch appears after this list).
  • Integrator 3651 may also include logic to extract or otherwise determine surfaces of objects (e.g., buildings, trees, parked automobiles, etc.) or features, as well as surface characteristics, relative to a pose of an autonomous vehicle at which sensor data may be acquired.
  • Integrator 3651 may be configured to use sensor data sets 3655a and 3655b to extract surface-related data of physical objects in an environment of an autonomous vehicle.
  • Data set 3655a and data set 3655b, as well as others not shown, may include fused sensor data representing a three-dimensional model relative to different points in time or different intervals of time. Therefore, data sets 3655 may be used to detect whether there are changes to the physical environment, or portions thereof, over time.
  • integrator 3651 may also implement a distance transform, such as a signed distance function ("SDF"), to determine one or more surfaces external to an autonomous vehicle.
  • a truncated signed distance function ("TSDF") may be implemented to identify one or more points on a surface relative to a reference point (e.g., one or more distances to points on a surface of an external object relative to a local pose); a minimal TSDF sketch appears after this list.
  • Integrator 3651 may be configured to generate 3D models of a cityscape (or any external object feature) as probabilistic maps, whereby map data may represent probability distributions over one or more environmental properties.
  • the probabilistic map may be formed using laser intensity (e.g., average laser intensity or reflectivity) and variances of infrared remittance value at distances or points in space relative to a pose of an autonomous vehicle.
  • a data structure for storing map data may include a number of cells that include, for example, an intensity average value and a variance value (see the map-cell sketch following this list).
  • this or any other data structure may also include a number of cells for storing 3D map data, such as color data (e.g., RGB values or other color space values), texture data, reflectance data, or any other surface characteristic or attribute data (e.g., specular data).
  • a cell that is configured to store map-related data may be implemented as a voxel or as a 3D tile, according to some examples.
  • mapping engine 3654 and/or integrator 3651, as well as other components of mapping engine 3654, may be configured to generate 3D map data in an "offline" mode of operation.
  • mapping engine 3654 may implement algorithms (e.g., machine learning, including deep-learning algorithms) that analyze data sets 3655 based on logged data sets (e.g., static data) to generate map data.
  • mapping engine 3654 may not be limited to off-line map generation, but may also implement "online” map generation techniques in which one or more portions of raw sensor data may be received in real-time (or nearly real-time) to generate map data or identify changes thereto.
  • Mapping engine 3654 may implement logic configured to perform simultaneous localization and mapping ("SLAM”) or any suitable mapping technique.
  • Data change detector 3653 is configured to detect changes in data sets 3655a and 3655b, which are examples of any number of data sets of 3D map data. Data change detector 3653 also is configured to generate data identifying a portion of map data that has changed, as well as optionally identifying or classifying an object associated with the changed portion of map data. In the example shown, a number of data sets, including data set 3655a, includes map data used to generate a three-dimensional model, which is conceptually depicted as 3D model data 3660 (e.g., a roadway at time T1, including portions of map data 3664).
  • data change detector 3653 may detect that another number of data sets, including data set 3655b, includes data representing the presence of external objects in portions of map data 3665 of 3D model data 3661, whereby portions of map data 3665 coincide with portions of map data 3664 at different times. Therefore, data change detector 3653 may detect changes in map data, and may further adaptively modify map data to include the changed map data (e.g., as updated map data). According to some examples, data change detector 3653 is configured to perform one or more statistical change detection algorithms to detect changes in physical environments (a minimal change-detection sketch appears after this list). Multi-temporal analysis techniques or other suitable algorithms may also be used.
  • the structures of data sets 3655a and 3655b may be implemented as cumulative data structures with which to index sensor data (e.g., measurements thereof) stored in a 3D map data structure.
  • a statistical change detection algorithm may be configured to detect portions of map data that change by identifying boundaries over one or more iterations of a deep-learning computation.
  • data change detector 3653 may be configured to detect boundaries of map data portions 3664 and 3665 over time, such as over two or more data sets (e.g., over one or more passes, or epochs, of application of data sets to statistical change detection algorithms or deep-learning algorithms). Epoch determination may also be applied to, for example, construct 4D maps and associated 4D map data.
  • data change detector 3653 may classify portions of map data, as well as an object therein, to identify whether an object is static or dynamic. In some cases, dynamic objects may be filtered out from map data generation.
  • Mapping engine 3654 is configured to provide map data 3659 to map data repository 3605a in reference data repository 3605.
  • Mapping engine 3654 may be configured to apply the change in map data to form updated three-dimensional ("3D") map data as reference data for transmission to reference data stores (i.e., repositories) in a fleet of autonomous vehicles.
  • the change in data may be representative of a state change of an environment at which various types of sensor data are sensed.
  • the state change of the environment, therefore, may be indicative of a change in the state of an object located therein (e.g., inclusion of data representing the presence or absence of one or more objects).
  • data change manager 3657 may be configured to identify or otherwise specify (e.g., via identifier or indicator data 3658) that a portion of map data includes changed map data 3658 (or an indication thereof).
  • map data 3692 stored in map repository 3605a is associated with, or linked to, indication data ("delta data") 3694 indicating that an associated portion of map data has changed.
  • indication data 3694 may identify a set of traffic cones, as changed portions of map data 3665, disposed in a physical environment associated with 3D model 3661 through which an autonomous vehicle travels.
  • a tile generator 3656 may be configured to generate two-dimensional or three-dimensional map tiles based on map data from data sets 3655a and 3655b.
  • the map tiles may be transmitted for storage in map repository 3605a.
  • Tile generator 3656 may generate map tiles that include indicator data for indicating a portion of the map is an updated portion of map data. Further, an updated map portion may be incorporated into a reference data repository 3605 in an autonomous vehicle. Therefore, consider an example in which an autonomous vehicle 3630 travels through the physical environment and plans on traveling near a recently-added object (e.g., traffic cones) in an environment.
  • a localizer (not shown) may access map data that is associated with a changed portion of map data (e.g., an updated portion of map data) to localize the autonomous vehicle.
  • logic may invoke additional processing to ensure that updated map data is used effectively and safely to navigate an autonomous vehicle 3630. For example, when a map tile including changed map data is accessed or implemented during localization, a request for teleoperator monitoring or assistance may be generated (see the tile-status sketch following this list). Note that in some examples, changed portions of map data may also refer to temporary map data, as such data may be used in fewer situations than, for example, validated map data.
  • changed portions of map data may also be validated for integration into map data, whereby the status of the changed map data is transitioned from "temporary" to "validated.”
  • a change in map data may be exported, as updated three-dimensional map data, to a simulator computing device.
  • the simulator computing device may then simulate performance of a portion of the fleet of autonomous vehicles in a simulated environment based on the updated three-dimensional map data.
  • the changed map portions may be incorporated to form new three-dimensional map data.
  • New three-dimensional map data may be viewed as three-dimensional map data that may be relied upon, such that indications of changed map data (i.e., indications of changed map data 3694) may be removed, as may invocations of requests (e.g., automatic requests) for teleoperator assistance.
  • mapping engine 3654 may include, or be implemented as, a 3D mapping engine and/or a mapper as shown in FIG. 31. Further, components of mapping engine 3654 may be combined or otherwise distributed within or without mapping engine 3654. Mapping engine 3654 and any of its components may be implemented in hardware or software, or a combination thereof. Moreover, mapping engine 3654 may include any functionality and/or structure described herein, including one or more components of a perception engine to perform object detection, segmentation, and/or classification.
  • alignment controller 3640 may include one or more components of mapping engine 3110 of FIG. 31.
  • alignment controller 3640 may include a loop-closure detector 3150, a registration controller 3152, a global pose generator 3143, and a registration refinement module 3146.
  • autonomous vehicle service platform 3601 may implement, as part of alignment controller 3640, loop-closure detector 3150 of FIG. 31 that may be configured to detect one or more portions of pose graphs at which autonomous vehicle 3630 of FIG. 36 has previously traversed (e.g., loop-closure detector 3150 of FIG. 31 may perform one or more loop-closure processes to identify a closed loop).
  • Registration controller 3152 may be configured to align or register multiple portions or frames of the same or different sensor data. For example, one or more data sets of image data may be transformed or otherwise mapped to each other, as well as to one or more data sets of laser return data and/or radar return data. Registration controller 3152 may be configured to align subsets of laser return data, subsets of image data, and the like based on trajectory data representing position data to identify a relative coordinate of the global coordinate system. Examples of trajectory data include GPS data, IMU data, odometry data, etc.
  • Global pose graph generator 3143 may be configured to generate pose graph data 3145 to specify a pose of autonomous vehicle 3630 of FIG. 36 relative to a global coordinate system.
  • global pose graph generator 3143 of FIG. 31 may be configured to form a global pose graph referenced to a global coordinate system.
  • a global pose graph may be formed based on a first type of sensor data (e.g., subsets of laser return data) and a second type of sensor data (e.g., subsets of image data), as well as other optional sensor data (e.g., subsets of radar data).
  • global pose graph generator 3143 may also be configured to align the subsets of laser return data and the subsets of image data to a location relative to a coordinate of a global coordinate system.
  • Registration refinement module 3146 is configured to refine the registration of one or more of captured image data, captured laser return data, or other captured sensor data, such as radar data and the like. In some examples, registration refinement module 3146 is configured to reduce or eliminate artifacts of map data (e.g., blurring artifacts or the like) subsequent to, for example, the projection of color data onto 3D mapped surfaces.
  • FIG. 37 is a diagram depicting an example of an autonomous vehicle controller implementing updated map data, according to some examples.
  • Diagram 3700 depicts a mapping engine 3754 configured to generate map data 3759, which may be implemented as three- dimensional map tiles.
  • map data 3759 may also include changed map data 3758 that either includes a portion of changed map data (e.g., updated portions of map data for use with unchanged portions of map data) or an indication (e.g., indicator data or a pointer) that identifies an updated portion of changed map data, or both.
  • an autonomous vehicle service platform 3701 may be configured to transmit map data 3786 and changed map data 3788 via network 3702.
  • An autonomous vehicle controller 3747 uses map data 3786 and/or changed map data 3788 to localize autonomous vehicle 3730.
  • autonomous vehicle controller 3747 may detect that changed map data 3788 is being accessed during localization.
  • autonomous vehicle controller 3747 may generate teleoperator request data 3770 to request teleoperator assistance.
  • Teleoperator request data 3770 may also be configured to request that a teleoperator at least monitor performance of autonomous vehicle 3730 during localization in which updated portions of map data are accessed or implemented (or when autonomous vehicle 3730 approaches or travels near a physical location associated with an updated portion of map data).
  • mapping data generated by mapping engine 3754 may be used to generate other reference data, such as route data (e.g., road network data), such as RNDF-like data, mission data, such as MDF-like data, and other reference data that may be used to navigate a fleet of autonomous vehicles.
  • route data generator 3780 may be configured to generate route data 3782 based on unchanged and/or validated map data. Further, route data generator 3780 may be configured to generate changed route data 3784, which may be generated using changed and/or non-validated map data.
  • autonomous vehicle controller 3747 may generate teleoperator request data 3770 responsive to detecting the use of changed route data 3784. Therefore, changed route data 3784 (e.g., non-validated, or temporary, map data) may be used to navigate an autonomous vehicle, with or without assistance of guidance data generated by a teleoperator.
  • FIG. 38 is a flow chart illustrating an example of generating map data, according to some examples.
  • Flow 3800 begins at 3802.
  • Subsets of multiple types of sensor data are accessed at 3802 (e.g., in a data store or repository that may include log files).
  • the subsets of multiple types of sensor data may correspond to groups of multiple sensors or sensor devices.
  • subsets of LIDAR sensor data may correspond to a group of different LIDAR sensors from which laser return data is received.
  • sensor data may be aligned relative to a global coordinate system to form aligned sensor data.
  • a registration process or algorithm may be configured to align or register the sensor data.
  • data sets of three-dimensional map data may be generated based on the aligned sensor data.
  • a change in map data may be detected relative to at least two data sets of three-dimensional map data.
  • a change in map data may be applied at 3810 to form updated three-dimensional map data.
  • the one or more updated portions of 3D map data may be formatted, as reference data, for transmission to one or more vehicles in a fleet of autonomous vehicles.
  • updated (e.g., changed) three-dimensional map data may be transmitted to at least one autonomous vehicle; an end-to-end sketch of this flow appears after this list. Note that the order depicted in this and other flow charts herein is not intended to imply a requirement to perform the various functions linearly, as each portion of a flow chart may be performed serially or in parallel with any one or more other portions of the flow chart, as well as independently of, or dependent on, other portions of the flow chart.
  • FIG. 39 is a diagram depicting an example of a localizer configured to implement map data and locally-generated map data, according to some examples.
  • localizer 3968 of an autonomous vehicle (“AV") controller 3947 may be configured to generate local pose data 3920 based on either locally-generated map data 3941 or map data 3943, or a combination thereof.
  • Local pose data 3920 may include data describing a local position of an autonomous vehicle 3930, and map data 3943 may be generated at mapping engine 3954 of an autonomous vehicle service platform 3901. Therefore, localizer 3968 may use map data 3943 to perform localization in view of changes, deviations, or variances between the locally-generated map data 3941 and map data 3943.
  • Diagram 3900 depicts an autonomous vehicle 3930, which includes autonomous vehicle controller 3947, a local map generator 3940, and a reference data repository 3905.
  • Diagram 3900 also depicts an autonomous vehicle service platform 3901 including a mapping engine 3954 and a teleoperator computing device 3904.
  • Reference data repository 3905 includes a map store 3905a configured to store three-dimensional map data 3943 and a route data store 3905b, which may be a data repository for storing route data (e.g., with or without an indication that a portion of route data, or road network data, is associated with changed road network data or updated road network data).
  • Local map generator 3940 may be configured to receive multiple amounts and types of sensor data, such as sensor data from sensor types 3902a, 3902b, and 3902c. According to various examples, local map generator 3940 may be configured to generate map data (e.g., three- dimensional map data) locally in real-time (or nearly in real-time) based on sensor data from sensor types 3902a, 3902b, and 3902c (e.g., from groups of LIDAR sensors, groups of cameras, groups of radars, etc.). Local map generator 3940 may implement logic configured to perform simultaneous localization and mapping ("SLAM”) or any suitable mapping technique.
  • local map generator 3940 may implement "online" map generation techniques in which one or more portions of raw sensor data from sensor types 3902a to 3902c may be received in real-time (or nearly real-time) to generate map data (or identify changes thereto) with which to navigate autonomous vehicle 3930.
  • Local map generator 3940 may also implement a distance transform, such as signed distance function (“SDF”), to determine surfaces external to an autonomous vehicle.
  • a truncated signed distance function ("TSDF"), or equivalent, may be implemented to identify one or more points on a surface relative to a reference point (e.g., one or more distances to points on a surface of an external object), whereby the TSDF function may be used to fuse sensor data and surface data to form three-dimensional local map data 3941.
  • Localizer 3968 may be configured to receive sensor data, as well as locally-generated map data 3941 and map data 3943, to localize autonomous vehicle 3930 relative to a coordinate of the global coordinate system associated with three-dimensional map data 3943 (or any other reference data). Also, localizer 3968 is shown to include a variant detector 3969a and a hybrid map selection controller 3969b. Variant detector 3969a is configured to compare locally-generated map data 3941 to map data 3943 to determine whether portions of map data associated with a specific surface or point in space varies. In particular, variant detector 3969a may detect that data (e.g., variance data) representing one or more map portions of local map data 3941 varies from the three-dimensional map data 3943.
  • Localizer 3968, upon detecting varying map data portions or variance data, may be configured to localize autonomous vehicle 3930 using hybrid map data from locally-generated map data 3941 and map data 3943.
  • hybrid map selection controller 3969b is configured to control whether locally-generated map data 3941 or map data 3943, or a combination thereof, may be used for localization.
  • different amounts of locally-generated map data 3941 and map data 3943 may be used based on, for example, corresponding probability distributions that may indicate the reliability or accuracy of each.
  • hybrid map selection controller 3969b may be configured to characterize the difference between the one or more map portions of map data 3943 and one or more portions of local map data 3941 to form variation data.
  • hybrid map selection controller 3969b may be configured to determine priorities of using local map data 3941 and priorities of using map data 3943, and may be further configured to cause localizer 3968 to use a first prioritized amount of local map data 3941 and a second prioritized amount of three-dimensional map data 3943, based on the variation data (a minimal selection sketch appears after this list).
  • variant detector 3969a detects variance data for several portions of map data 3943 that vary from corresponding portions of local map data 3941.
  • local map data 3941 is determined to be more accurate for most portions of variance data.
  • at least one portion of local map data 3941 has a relatively lower probability of being accurate than a corresponding portion of map data 3943.
  • hybrid map selection controller 3969b may rely more on local map data 3941 for localization (with some reliance on map data 3943), but may also rely more on a specific portion of map data 3943 (e.g., having a higher priority) for localization than the corresponding portion of local map data 3941 (e.g., having a lower priority).
  • FIG. 40 is a diagram depicting an example of a localizer configured to vary transmission rates or amounts of locally-generated sensor and/or map data, according to some examples.
  • Diagram 4000 depicts a number of autonomous vehicles, including autonomous vehicles 4030a, 4030b, 4030c, and 4030n, and diagram 4000 also depicts an autonomous vehicle service platform 4001 including a mapping engine 4054 and teleoperator logic 4004, which is implemented in association with a teleoperator computing device 4006 that accepts data signals (e.g., user inputs) from a teleoperator 4008.
  • Teleoperator logic 4004 may be disposed in a server computing device (not shown) or teleoperator computing device 4006.
  • autonomous vehicle 4030a may include an autonomous vehicle controller 4047, a reference data repository 4005 (e.g., including a map store or repository 4005a for storing map data 4046, and a route data store or repository 4005b), and a transceiver 4044 that is configured to exchange data between autonomous vehicle 4030a and autonomous vehicle service platform 4001.
  • autonomous vehicle controller 4047 may include a local map generator 4040, which may be configured to generate local map data 4041 based on sensor data from different types of sensors 4002a to 4002c.
  • Autonomous vehicle controller 4047 is shown to also include a localizer 4068, which is shown to include a variant detector 4069a and a communication controller 4069b, for generating local pose data 4020.
  • elements depicted in diagram 4000 of FIG. 40 may include structures and/or functions as similarly-named elements described in connection to one or more other drawings, such as FIG. 39, among others.
  • communication controller 4069b may be configured to control transceiver 4044, as well as the types or amounts of data transmitted to autonomous vehicle service platform 4001. Therefore, communication controller 4069b is configured to provide sufficient data for teleoperator logic 4004 and/or teleoperator 4008 to select an optimal set of guidance data to resolve detected map data variations, according to various examples. Communication controller 4069b is configured to provide optimal amounts of data or data rates so as to conserve bandwidth (a minimal rate-selection sketch appears after this list). To illustrate operation of communication controller 4069b, consider that variant detector 4069a detects a relatively small amount of difference between map data 4043 and local map data 4041.
  • communication controller 4069b may transmit relatively low amounts of data to provide an alert to teleoperator 4008 to urge the teleoperator to at least monitor autonomous vehicle 4030a as it travels through an environment that includes a minor change.
  • simpler or more abstract depictions of the data may be transmitted (e.g., bounding boxes with associated metadata, etc.) rather than greater amounts of data.
  • variant detector 4069a detects relatively moderate amounts of differences between map data 4043 and local map data 4041.
  • communication controller 4069b may be configured to increase the transmission bandwidth of transceiver 4044 to transmit one or more portions of local map data 4041 to autonomous vehicle service platform 4001 for evaluation by teleoperator logic 4004.
  • variant detector 4069a detects relatively large amounts of differences between map data 4043 and local map data 4041.
  • communication controller 4069b may be configured to further increase the transmission bandwidth by transceiver 4044 to transmit one or more portions of high resolution sensor data 4047 to autonomous vehicle service platform 4001 for visual presentation of the physical environment on a display 4009.
  • Lidar data may be transmitted in full; however, any amount less than all of the Lidar data may be transmitted.
  • Sensor-based data 4002 may be used to generate a three-dimensional view in real-time (or nearly in real-time) so that teleoperator 4008 may identify changes in map data visually.
  • recently-placed traffic cones 4011 are identified as being a cause of variance data, or the differences between portions of map data 4043 and local map data 4041. Note that the above-described implementations are just a few examples of any number of implementations of the elements shown in diagram 4000, and, as such, the above description of diagram 4000 is not intended to be limiting.
  • FIG. 41 is a flow diagram depicting an example process of using various amounts of locally-generated map data to localize an autonomous vehicle, according to some examples.
  • Flow 4100 begins at 4102, which includes localizing an autonomous vehicle relative to a coordinate of a global coordinate system in association with three-dimensional map data.
  • variance data may be detected. That is, data representing one or more map portions of three-dimensional map data that varies from sensed data (e.g., LIDAR data, camera data, etc.) generated by multiple sensor types may be detected.
  • hybrid map data may be implemented using map data from a local map and from a three-dimensional map.
  • flow 4100 may implement hybrid map data from a local map and a three-dimensional map; a teleoperator request may be generated at 4108.
  • a difference between the three-dimensional map data and the sensed data (e.g., data used to generate local map data) may be characterized, for example, to form variation data.
  • the rate of transmitting sensor-related data (e.g., raw sensor data, local map data, etc.) to an autonomous vehicle service platform may be varied based on the magnitude of the detected differences.
  • a three-dimensional representation of an environment is generated at which the autonomous vehicle acquires the data to depict an event on a display of a teleoperator computing device. Therefore, the addition or absence of an object that causes a difference between map data and locally-generated map data may be visually presented to a teleoperator.
  • FIGs. 42 to 43 illustrate examples of various computing platforms configured to provide various mapping-related functionalities to components of an autonomous vehicle service, according to various embodiments.
  • computing platform 3300 may be used to implement computer programs, applications, methods, processes, algorithms, or other software to perform the above-described techniques.
  • FIG. 33 may be applicable to FIGs. 42 and 43, and, as such, some elements in those figures may be discussed in the context of FIG. 33.
  • elements depicted in diagram 4200 of FIG. 42 and diagram 4300 of FIG. 43 may include structures and/or functions as similarly -named elements described in connection to one or more other drawings, such as FIGs. 33 to 35, among others.
  • system memory 3306 includes an autonomous vehicle service platform module 4250 and/or its components (e.g., a mapping engine module 4252, etc.), any of which, or one or more portions of which, can be configured to facilitate navigation for an autonomous vehicle service by implementing one or more functions described herein.
  • system memory 3306 includes an autonomous vehicle (“AV") module 4350 and/or its components (e.g., a local map generator module 4352, a hybrid map selection control module 4354, a communication control module 4356, etc.) may be implemented, for example, in an autonomous vehicle 4391.
  • system memory 3306 or a portion thereof may be disposed in mobile computing device 4390a.
  • One or more portions of module 4350 can be configured to facilitate navigation of an autonomous vehicle service by implementing one or more functions described herein.
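
The pose-graph sketch referenced above is one plausible way, under stated assumptions, to structure the positioning map data described for alignment controller 3640 and global pose graph generator 3143: nodes hold local poses and edges hold spatial constraints derived from odometry, GPS fixes, or loop closures. The class and field names are illustrative only and do not appear in the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class Pose:
    """A planar local pose; a full implementation would carry x, y, z, roll, pitch, yaw."""
    x: float
    y: float
    yaw: float

@dataclass
class PoseGraph:
    nodes: dict = field(default_factory=dict)   # node id -> Pose (local poses along a trajectory)
    edges: list = field(default_factory=list)   # (from_id, to_id, relative Pose, information weight)

    def add_node(self, node_id: int, pose: Pose) -> None:
        self.nodes[node_id] = pose

    def add_constraint(self, from_id: int, to_id: int, relative: Pose, information: float = 1.0) -> None:
        # A spatial constraint between two poses, e.g., from odometry/IMU integration,
        # a GPS fix, or a loop closure detected when the vehicle revisits a place.
        self.edges.append((from_id, to_id, relative, information))

# Example: two consecutive poses linked by an odometry constraint.
graph = PoseGraph()
graph.add_node(0, Pose(0.0, 0.0, 0.0))
graph.add_node(1, Pose(1.2, 0.1, 0.02))
graph.add_constraint(0, 1, Pose(1.2, 0.1, 0.02), information=10.0)
```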
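
As a minimal sketch of the Kalman-style fusion mentioned for integrator 3651, the snippet below fuses two scalar range measurements of the same surface point (say, one from a Lidar and one from a radar) weighted by their variances. A real implementation would operate on full state vectors and covariance matrices; the numeric values here are purely illustrative.

```python
def kalman_fuse(estimate, variance, measurement, meas_variance):
    """One scalar Kalman update: blend a prior estimate with a new measurement."""
    gain = variance / (variance + meas_variance)        # Kalman gain
    fused = estimate + gain * (measurement - estimate)  # corrected estimate
    fused_variance = (1.0 - gain) * variance            # uncertainty shrinks after fusion
    return fused, fused_variance

# Illustrative values: a Lidar range (low variance) fused with a radar range (higher variance).
lidar_range, lidar_var = 25.30, 0.02
radar_range, radar_var = 25.55, 0.25
fused_range, fused_var = kalman_fuse(lidar_range, lidar_var, radar_range, radar_var)
print(f"fused range: {fused_range:.2f} m, variance: {fused_var:.3f}")
```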
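
The TSDF sketch below shows one conventional way a truncated signed distance function can fold a single laser return into a voxel, in the spirit of the surface extraction described for integrator 3651 and local map generator 3940; the truncation band and update rule are assumptions rather than details taken from the disclosure.

```python
import numpy as np

TRUNCATION = 0.3  # metres; signed distances are clamped to this band around the surface

def update_tsdf(tsdf, weight, voxel_center, sensor_origin, surface_point):
    """Fold one laser return into a voxel's truncated signed distance and weight."""
    measured_depth = np.linalg.norm(surface_point - sensor_origin)
    voxel_depth = np.linalg.norm(voxel_center - sensor_origin)
    sdf = measured_depth - voxel_depth           # positive in front of the surface, negative behind
    if sdf < -TRUNCATION:
        return tsdf, weight                      # voxel lies far behind the surface (occluded); skip
    observed = min(1.0, sdf / TRUNCATION)        # truncate to [-1, 1]
    new_weight = weight + 1.0
    new_tsdf = (tsdf * weight + observed) / new_weight   # weighted running average
    return new_tsdf, new_weight

# Example: a voxel near a wall updated with one laser hit.
voxel = np.array([10.0, 2.0, 0.5])
origin = np.array([0.0, 0.0, 1.5])
hit = np.array([10.2, 2.0, 0.5])
distance, weight = update_tsdf(0.0, 0.0, voxel, origin, hit)
```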
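
The map-cell sketch below is one plausible layout, assuming a Welford-style online update, for the cells described above that store an intensity average and variance alongside other surface attributes; the field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class MapCell:
    """One cell (e.g., a voxel or 3D tile) of a probabilistic map."""
    count: int = 0
    intensity_mean: float = 0.0          # average laser intensity / reflectivity
    _intensity_m2: float = 0.0           # running sum of squared deviations
    rgb: tuple = (0, 0, 0)               # most recent projected camera color (illustrative)

    def add_return(self, intensity: float) -> None:
        # Welford's online algorithm: keep mean and variance without storing every return.
        self.count += 1
        delta = intensity - self.intensity_mean
        self.intensity_mean += delta / self.count
        self._intensity_m2 += delta * (intensity - self.intensity_mean)

    @property
    def intensity_variance(self) -> float:
        return self._intensity_m2 / (self.count - 1) if self.count > 1 else 0.0

# Example: three reflectivity returns accumulated into one cell.
cell = MapCell()
for reading in (0.61, 0.58, 0.64):
    cell.add_return(reading)
print(cell.intensity_mean, cell.intensity_variance)
```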
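
The change-detection sketch below compares two map epochs cell by cell with a simple z-score test, which is only one of many possible statistical change detection approaches; the threshold and the reuse of the MapCell sketch above are assumptions.

```python
def detect_changed_cells(map_t1: dict, map_t2: dict, z_threshold: float = 3.0) -> set:
    """Flag cells whose intensity statistics shifted between two map data sets.

    map_t1 / map_t2: dicts keyed by cell index -> MapCell (see the map-cell sketch).
    Cells present in only one epoch (an added or removed object) are always flagged.
    """
    changed = set()
    for key in map_t1.keys() | map_t2.keys():
        a, b = map_t1.get(key), map_t2.get(key)
        if a is None or b is None:                 # surface appeared or disappeared
            changed.add(key)
            continue
        pooled_variance = a.intensity_variance + b.intensity_variance + 1e-6
        z = abs(a.intensity_mean - b.intensity_mean) / pooled_variance ** 0.5
        if z > z_threshold:                        # statistically significant shift
            changed.add(key)
    return changed
```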
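
The tile-status sketch below shows one way updated map tiles could carry the "temporary" versus "validated" indication described above and trigger a teleoperator monitoring request when a temporary tile is touched during localization; the enum values and callback are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable

class TileStatus(Enum):
    UNCHANGED = "unchanged"
    TEMPORARY = "temporary"      # changed map data awaiting validation
    VALIDATED = "validated"      # change has been validated and integrated

@dataclass
class MapTile:
    tile_id: str
    payload: bytes               # serialized 3D map data (placeholder)
    status: TileStatus = TileStatus.UNCHANGED

def on_tile_access(tile: MapTile, request_monitoring: Callable[[str], None]) -> None:
    """Ask for teleoperator monitoring whenever localization touches a temporary tile."""
    if tile.status is TileStatus.TEMPORARY:
        request_monitoring(tile.tile_id)

# Example usage with a stand-in request function.
on_tile_access(MapTile("tile-42", b"", TileStatus.TEMPORARY),
               lambda tile_id: print(f"monitor request for {tile_id}"))
```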
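
The end-to-end sketch below strings the steps of flow 3800 together as plain function calls; every callable is a placeholder standing in for a component described above (log store, alignment, map generation, change detection, and tile publication), not an API from the disclosure.

```python
def update_fleet_maps(read_logged_subsets, align, build_map_dataset, detect_changes,
                      apply_changes, publish_tiles):
    """Offline map-update pipeline: access logs, align, build, diff, update, distribute."""
    raw_subsets = read_logged_subsets()                          # multiple sensor types, multiple vehicles
    aligned = align(raw_subsets)                                 # register to a global coordinate system
    datasets = [build_map_dataset(epoch) for epoch in aligned]   # 3D map data per time interval
    changes = detect_changes(datasets)                           # compare at least two data sets
    updated = apply_changes(datasets[-1], changes)               # form updated 3D map data
    publish_tiles(updated)                                       # format as tiles and push to the fleet
    return updated
```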
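
The selection sketch below illustrates one way hybrid map selection controller 3969b's prioritization could be realized: for each map portion, whichever source (remote reference map or locally generated map) carries the higher estimated probability of being accurate receives the larger weight. The probability inputs and weighting scheme are assumptions, not taken from the disclosure.

```python
def select_hybrid_map(remote_ids: set, local_ids: set,
                      p_remote: dict, p_local: dict) -> dict:
    """Return, per map portion id, the preferred source and its normalized weight.

    p_remote / p_local: estimated probabilities that each source is accurate for a
    portion (how the localizer estimates these is outside the scope of this sketch).
    """
    hybrid = {}
    for portion in remote_ids & local_ids:
        pr = p_remote.get(portion, 0.5)
        pl = p_local.get(portion, 0.5)
        total = (pr + pl) or 1.0
        source = "remote" if pr >= pl else "local"
        hybrid[portion] = (source, max(pr, pl) / total)
    return hybrid

# Example: the local map is trusted for most portions, but one remote portion wins.
weights = select_hybrid_map({1, 2, 3}, {1, 2, 3},
                            p_remote={1: 0.4, 2: 0.3, 3: 0.9},
                            p_local={1: 0.8, 2: 0.7, 3: 0.5})
```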
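
The rate-selection sketch below mirrors the tiered behaviour described for communication controller 4069b: small variations trigger only an alert, moderate variations add local map portions, and large variations add high-resolution sensor data. The score and thresholds are purely illustrative.

```python
def choose_uplink_payload(variance_score: float) -> dict:
    """Scale what is sent to the service platform with the size of the detected variation."""
    payload = {"alert": True}                          # minor change: alert the teleoperator only
    if variance_score >= 0.1:
        payload["local_map_portions"] = True           # moderate change: send local map deltas
    if variance_score >= 0.5:
        payload["high_resolution_sensor_data"] = True  # large change: stream rich sensor data
    return payload

# Example: a moderate variation sends the alert plus local map portions.
print(choose_uplink_payload(0.3))
```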

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Transportation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Geometry (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Sustainable Development (AREA)
  • Sustainable Energy (AREA)
  • Power Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Game Theory and Decision Science (AREA)
  • Medical Informatics (AREA)
  • Navigation (AREA)

Abstract

Various embodiments relate generally to autonomous vehicles and associated mechanical, electrical and electronic hardware, computer software and systems, and wired and wireless network communications to provide map data for autonomous vehicles. In particular, a method may include accessing subsets of multiple types of sensor data, aligning subsets of sensor data relative to a global coordinate system based on the multiple types of sensor data to form aligned sensor data, and generating datasets of three-dimensional map data. The method further includes detecting a change in data relative to at least two datasets of the three-dimensional map data and applying the change in data to form updated three-dimensional map data. The change in data may be representative of a state change of an environment at which the sensor data is sensed. The state change of the environment may be related to the presence or absence of an object located therein.

Description

ADAPTIVE MAPPING TO NAVIGATE AUTONOMOUS VEHICLES RESPONSIVE TO PHYSICAL ENVIRONMENT CHANGES
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This PCT international application is a continuation of U.S. Application No. 14/932,963 filed November 4, 2015 entitled "AUTONOMOUS VEHICLE FLEET SERVICE AND SYSTEM" and U.S. Patent Application No. 14/932,959 filed November 4, 2015 entitled "AUTONOMOUS VEHICLE FLEET SERVICE AND SYSTEM," and is related to U.S. Patent Application No. 14/932,966 filed November 4, 2015, entitled "TELEOPERATION SYSTEM AND METHOD FOR TRAJECTORY MODIFICATION OF AUTONOMOUS VEHICLES," U.S. Patent Application No. 14/932,940 filed November 4, 2015, entitled "AUTOMATED EXTRACTION OF SEMANTIC INFORMATION TO ENHANCE INCREMENTAL MAPPING MODIFICATIONS FOR ROBOTIC VEHICLES," U.S. Patent Application No. 14/756,995 filed November 4, 2015, entitled "COORDINATION OF DISPATCHING AND MAINTAINING FLEET OF AUTONOMOUS VEHICLES," U.S. Patent Application No. 14/756,992 filed November 4, 2015, entitled "ADAPTIVE AUTONOMOUS VEHICLE PLANNER LOGIC," U.S. Patent Application No. 14/756,991 filed November 4, 2015, entitled "SENSOR-BASED OBJECT-DETECTION OPTIMIZATION FOR AUTONOMOUS VEHICLES," and U.S. Patent Application No. 14/756,996 filed November 4, 2015, entitled "CALIBRATION FOR AUTONOMOUS VEHICLE OPERATION," all of which are hereby incorporated by reference in their entirety for all purposes.
FIELD
[0002] Various embodiments relate generally to autonomous vehicles and associated mechanical, electrical and electronic hardware, computer software and systems, and wired and wireless network communications to provide an autonomous vehicle fleet as a service. More specifically, systems, devices, and methods are configured to provide updates to maps, such as three-dimensional ("3D") maps, either locally (e.g., in-situ at autonomous vehicles) or remotely, or both, for navigating one or more of these vehicles adapted to changes in environments through which the vehicles traverse.
BACKGROUND
[0003] A variety of approaches to developing driverless vehicles focus predominately on automating conventional vehicles (e.g., manually-driven automotive vehicles) with an aim toward producing driverless vehicles for consumer purchase. For example, a number of automotive companies and affiliates are modifying conventional automobiles and control mechanisms, such as steering, to provide consumers with an ability to own a vehicle that may operate without a driver. In some approaches, a conventional driverless vehicle performs safety-critical driving functions in some conditions, but requires a driver to assume control (e.g., steering, etc.) should the vehicle controller fail to resolve certain issues that might jeopardize the safety of the occupants.
[0004] Although functional, conventional driverless vehicles typically have a number of drawbacks. For example, a large number of driverless cars under development have evolved from vehicles requiring manual (i.e., human-controlled) steering and other like automotive functions. Therefore, a majority of driverless cars are based on a paradigm that a vehicle is to be designed to accommodate a licensed driver, for which a specific seat or location is reserved within the vehicle. As such, driverless vehicles are designed sub-optimally and generally forego opportunities to simplify vehicle design and conserve resources (e.g., reducing costs of producing a driverless vehicle). Other drawbacks are also present in conventional driverless vehicles.
[0005] Other drawbacks are also present in conventional transportation services, which are not well-suited for managing, for example, inventory of vehicles effectively due to the common approaches of providing conventional transportation and ride-sharing services. In one conventional approach, passengers are required to access a mobile application to request transportation services via a centralized service that assigns a human driver and vehicle (e.g., under private ownership) to a passenger. With the use of differently-owned vehicles, maintenance of private vehicles and safety systems generally go unchecked. In another conventional approach, some entities enable ride-sharing for a group of vehicles by allowing drivers, who enroll as members, access to vehicles that are shared among the members. This approach is not well-suited to provide for convenient transportation services as drivers need to pick up and drop off shared vehicles at specific locations, which typically are rare and sparse in city environments, and require access to relatively expensive real estate (i.e., parking lots) at which to park ride-shared vehicles. In the above-described conventional approaches, the traditional vehicles used to provide transportation services are generally under-utilized, from an inventory perspective, as the vehicles are rendered immobile once a driver departs. Further, ride-sharing approaches (as well as individually-owned vehicle transportation services) generally are not well-suited to rebalance inventory to match demand of transportation services to accommodate usage and typical travel patterns. Note, too, that some conventionally-described vehicles having limited self-driving automation capabilities also are not well-suited to rebalance inventories as a human driver generally may be required. Examples of vehicles having limited self-driving automation capabilities are vehicles designated as Level 3 ("L3") vehicles, according to the U.S. Department of Transportation's National Highway Traffic Safety Administration ("NHTSA").
[0006] As another drawback, typical approaches to driverless vehicles are generally not well-suited to detect and navigate vehicles relative to interactions (e.g., social interactions) between a vehicle-in-travel and other drivers of vehicles or individuals. For example, some conventional approaches are not sufficiently able to identify pedestrians, cyclists, etc., and associated interactions, such as eye contact, gesturing, and the like, for purposes of addressing safety risks to occupants of a driverless vehicle, as well as drivers of other vehicles, pedestrians, etc.
[0007] Thus, what is needed is a solution for facilitating an implementation of autonomous vehicles, without the limitations of conventional techniques.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] Various embodiments or examples ("examples") of the invention are disclosed in the following detailed description and the accompanying drawings:
[0009] FIG. 1 is a diagram depicting implementation of a fleet of autonomous vehicles that are communicatively networked to an autonomous vehicle service platform, according to some embodiments;
[0010] FIG. 2 is an example of a flow diagram to monitor a fleet of autonomous vehicles, according to some embodiments;
[0011] FIG. 3A is a diagram depicting examples of sensors and other autonomous vehicle components, according to some examples;
[0012] FIGs. 3B to 3E are diagrams depicting examples of sensor field redundancy and autonomous vehicle adaption to a loss of a sensor field, according to some examples;
[0013] FIG. 4 is a functional block diagram depicting a system including an autonomous vehicle service platform that is communicatively coupled via a communication layer to an autonomous vehicle controller, according to some examples;
[0014] FIG. 5 is an example of a flow diagram to control an autonomous vehicle, according to some embodiments;
[0015] FIG. 6 is a diagram depicting an example of an architecture for an autonomous vehicle controller, according to some embodiments;
[0016] FIG. 7 is a diagram depicting an example of an autonomous vehicle service platform implementing redundant communication channels to maintain reliable communications with a fleet of autonomous vehicles, according to some embodiments;
[0017] FIG. 8 is a diagram depicting an example of a messaging application configured to exchange data among various applications, according to some embodiment;
[0018] FIG. 9 is a diagram depicting types of data for facilitating teleoperations using a communications protocol described in FIG. 8, according to some examples;
[0019] FIG. 10 is a diagram illustrating an example of a teleoperator interface with which a teleoperator may influence path planning, according to some embodiments; [0020] FIG. 11 is a diagram depicting an example of a planner configured to invoke teleoperations, according to some examples;
[0021] FIG. 12 is an example of a flow diagram configured to control an autonomous vehicle, according to some embodiments;
[0022] FIG. 13 depicts an example in which a planner may generate a traj ectory, according to some examples;
[0023] FIG. 14 is a diagram depicting another example of an autonomous vehicle service platform, according to some embodiments;
[0024] FIG. 15 is an example of a flow diagram to control an autonomous vehicle, according to some embodiments;
[0025] FIG. 16 is a diagram of an example of an autonomous vehicle fleet manager implementing a fleet optimization manager, according to some examples;
[0026] FIG. 17 is an example of a flow diagram for managing a fleet of autonomous vehicles, according to some embodiments;
[0027] FIG. 18 is a diagram illustrating an autonomous vehicle fleet manager implementing an autonomous vehicle communications link manager, according to some embodiments;
[0028] FIG. 19 is an example of a flow diagram to determine actions for autonomous vehicles during an event, according to some embodiments;
[0029] FIG. 20 is a diagram depicting an example of a localizer, according to some embodiments;
[0030] FIG. 21 is an example of a flow diagram to generate local pose data based on integrated sensor data, according to some embodiments;
[0031] FIG. 22 is a diagram depicting another example of a localizer, according to some embodiments;
[0032] FIG. 23 is a diagram depicting an example of a perception engine, according to some embodiments;
[0033] FIG. 24 is an example of a flow chart to generate perception engine data, according to some embodiments;
[0034] FIG. 25 is an example of a segmentation processor, according to some embodiments;
[0035] FIG. 26A is a diagram depicting examples of an object tracker and a classifier, according to various embodiments;
[0036] FIG. 26B is a diagram depicting another example of an object tracker according to at least some examples;
[0037] FIG. 27 is an example of front-end processor for a perception engine, according to some examples; [0038] FIG. 28 is a diagram depicting a simulator configured to simulate an autonomous vehicle in a synthetic environment, according to various embodiments;
[0039] FIG. 29 is an example of a flow chart to simulate various aspects of an autonomous vehicle, according to some embodiments;
[0040] FIG. 30 is an example of a flow chart to generate map data, according to some embodiments;
[0041] FIG. 31 is a diagram depicting an architecture of a mapping engine, according to some embodiments
[0042] FIG. 32 is a diagram depicting an autonomous vehicle application, according to some examples; and
[0043] FIGs. 33 to 35 illustrate examples of various computing platforms configured to provide various functionalities to components of an autonomous vehicle service, according to various embodiments;
[0044] FIG. 36 is a diagram depicting a mapping engine configured to generate mapping data adaptively for autonomous vehicles responsive to changes in physical environments, according to some examples;
[0045] FIG. 37 is a diagram depicting an example of an autonomous vehicle controller implementing updated map data, according to some examples;
[0046] FIG. 38 is a flow chart illustrating an example of generating map data, according to some examples;
[0047] FIG. 39 is a diagram depicting an example of a localizer configured to implement map data and locally-generated map data, according to some examples;
[0048] FIG. 40 is a diagram depicting an example of a localizer configured to vary transmission rates or amounts of locally-generated sensor and/or map data, according to some examples;
[0049] FIG. 41 is a flow diagram depicting an example using various amounts of locally-generated map data to localize an autonomous vehicle, according to some examples; and
[0050] FIGs. 42 to 43 illustrate examples of various computing platforms configured to provide various mapping-related functionalities to components of an autonomous vehicle service, according to various embodiments.
DETAILED DESCRIPTION
[0051] Various embodiments or examples may be implemented in numerous ways, including as a system, a process, an apparatus, a user interface, or a series of program instructions on a computer readable medium such as a computer readable storage medium or a computer network where the program instructions are sent over optical, electronic, or wireless communication links. In general, operations of disclosed processes may be performed in an arbitrary order, unless otherwise provided in the claims.
[0052] A detailed description of one or more examples is provided below along with accompanying figures. The detailed description is provided in connection with such examples, but is not limited to any particular example. The scope is limited only by the claims, and numerous alternatives, modifications, and equivalents thereof. Numerous specific details are set forth in the following description in order to provide a thorough understanding. These details are provided for the purpose of example and the described techniques may be practiced according to the claims without some or all of these specific details. For clarity, technical material that is known in the technical fields related to the examples has not been described in detail to avoid unnecessarily obscuring the description.
[0053] FIG. 1 is a diagram depicting an implementation of a fleet of autonomous vehicles that are communicatively networked to an autonomous vehicle service platform, according to some embodiments. Diagram 100 depicts a fleet of autonomous vehicles 109 (e.g., one or more of autonomous vehicles 109a to 109e) operating as a service, each autonomous vehicle 109 being configured to self-drive a road network 110 and establish a communication link 192 with an autonomous vehicle service platform 101. In examples in which a fleet of autonomous vehicles 109 constitutes a service, a user 102 may transmit a request 103 for autonomous transportation via one or more networks 106 to autonomous vehicle service platform 101. In response, autonomous vehicle service platform 101 may dispatch one of autonomous vehicles 109 to transport user 102 autonomously from geographic location 119 to geographic location 111. Autonomous vehicle service platform 101 may dispatch an autonomous vehicle from a station 190 to geographic location 119, or may divert an autonomous vehicle 109c, already in transit (e.g., without occupants), to service the transportation request for user 102. Autonomous vehicle service platform 101 may be further configured to divert an autonomous vehicle 109c in transit, with passengers, responsive to a request from user 102 (e.g., as a passenger). In addition, autonomous vehicle service platform 101 may be configured to reserve an autonomous vehicle 109c in transit, with passengers, for diverting to service a request of user 102 subsequent to dropping off existing passengers. Note that multiple autonomous vehicle service platforms 101 (not shown) and one or more stations 190 may be implemented to service one or more autonomous vehicles 109 in connection with road network 110. One or more stations 190 may be configured to store, service, manage, and/or maintain an inventory of autonomous vehicles 109 (e.g., station 190 may include one or more computing devices implementing autonomous vehicle service platform 101).
[0054] According to some examples, at least some of autonomous vehicles 109a to 109e are configured as bidirectional autonomous vehicles, such as bidirectional autonomous vehicle ("AV") 130. Bidirectional autonomous vehicle 130 may be configured to travel in either direction principally along, but not limited to, a longitudinal axis 131. Accordingly, bidirectional autonomous vehicle 130 may be configured to implement active lighting external to the vehicle to alert others (e.g., other drivers, pedestrians, cyclists, etc.) in the adjacent vicinity, and a direction in which bidirectional autonomous vehicle 130 is traveling. For example, active sources of light 136 may be implemented as active lights 138a when traveling in a first direction, or may be implemented as active lights 138b when traveling in a second direction. Active lights 138a may be implemented using a first subset of one or more colors, with optional animation (e.g., light patterns of variable intensities of light or color that may change over time). Similarly, active lights 138b may be implemented using a second subset of one or more colors and light patterns that may be different than those of active lights 138a. For example, active lights 138a may be implemented using white-colored lights as "headlights," whereas active lights 138b may be implemented using red-colored lights as "taillights." Active lights 138a and 138b, or portions thereof, may be configured to provide other light-related functionalities, such as provide "turn signal indication" functions (e.g., using yellow light). According to various examples, logic in autonomous vehicle 130 may be configured to adapt active lights 138a and 138b to comply with various safety requirements and traffic regulations or laws for any number of jurisdictions.
[0055] In some embodiments, bidirectional autonomous vehicle 130 may be configured to have similar structural elements and components in each quad portion, such as quad portion 194. The quad portions are depicted, at least in this example, as portions of bidirectional autonomous vehicle 130 defined by the intersection of a plane 132 and a plane 134, both of which pass through the vehicle to form two similar halves on each side of planes 132 and 134. Further, bidirectional autonomous vehicle 130 may include an autonomous vehicle controller 147 that includes logic (e.g., hardware or software, or a combination thereof) that is configured to control a predominant number of vehicle functions, including driving control (e.g., propulsion, steering, etc.) and active sources 136 of light, among other functions. Bidirectional autonomous vehicle 130 also includes a number of sensors 139 disposed at various locations on the vehicle (other sensors are not shown).
[0056] Autonomous vehicle controller 147 may be further configured to determine a local pose (e.g., local position) of an autonomous vehicle 109 and to detect external objects relative to the vehicle. For example, consider that bidirectional autonomous vehicle 130 is traveling in the direction 119 in road network 110. A localizer (not shown) of autonomous vehicle controller 147 can determine a local pose at the geographic location 111. As such, the localizer may use acquired sensor data, such as sensor data associated with surfaces of buildings 115 and 117, which can be compared against reference data, such as map data (e.g., 3D map data, including reflectance data) to determine a local pose. Further, a perception engine (not shown) of autonomous vehicle controller 147 may be configured to detect, classify, and predict the behavior of external objects, such as external object 112 (a "tree") and external object 114 (a "pedestrian"). Classification of such external objects may broadly classify objects as static objects, such as external object 112, and dynamic objects, such as external object 114. The localizer and the perception engine, as well as other components of the AV controller 147, collaborate to cause autonomous vehicles 109 to drive autonomously.
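For illustration only, the following Python sketch suggests how a localizer of the kind described above might score candidate poses by counting how many acquired range returns coincide with surfaces recorded in map data; the occupancy grid, candidate poses, and scoring rule are assumptions introduced for this sketch rather than elements of the described system.

```python
import math

# Hypothetical occupancy grid derived from map data (cells containing a mapped
# surface such as a building facade); resolution and extent are assumptions.
MAP_GRID = {
    (5, 0), (5, 1), (5, 2), (5, 3),   # wall of one building
    (0, 6), (1, 6), (2, 6), (3, 6),   # wall of another building
}

def transform(point, pose):
    """Transform a sensor-frame point (x, y) into the map frame using pose (x, y, yaw)."""
    px, py, yaw = pose
    x, y = point
    return (px + x * math.cos(yaw) - y * math.sin(yaw),
            py + x * math.sin(yaw) + y * math.cos(yaw))

def score_pose(scan_points, pose, grid=MAP_GRID):
    """Fraction of range returns that land on mapped surfaces for a candidate pose."""
    hits = 0
    for pt in scan_points:
        mx, my = transform(pt, pose)
        if (round(mx), round(my)) in grid:
            hits += 1
    return hits / max(len(scan_points), 1)

def localize(scan_points, candidate_poses):
    """Return the candidate pose that best explains the scan (a crude local pose)."""
    return max(candidate_poses, key=lambda p: score_pose(scan_points, p))

# Example: returns expressed in the vehicle frame, compared against the map.
scan = [(5.0, 0.0), (5.0, 1.0), (5.0, 2.0), (0.0, 6.0), (1.0, 6.0)]
candidates = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 0.3)]
print(localize(scan, candidates))   # the first pose matches all returns
```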
[0057] According to some examples, autonomous vehicle service platform 101 is configured to provide teleoperator services should an autonomous vehicle 109 request teleoperation. For example, consider that an autonomous vehicle controller 147 in autonomous vehicle 109d detects an object 126 obscuring a path 124 on roadway 122 at point 191, as depicted in inset 120. If autonomous vehicle controller 147 cannot ascertain a path or trajectory over which vehicle 109d may safely transit with a relatively high degree of certainty, then autonomous vehicle controller 147 may transmit request message 105 for teleoperation services. In response, a teleoperator computing device 104 may receive instructions from a teleoperator 108 to perform a course of action to successfully (and safely) negotiate obstacle 126. Response data 107 then can be transmitted back to autonomous vehicle 109d to cause the vehicle to, for example, safely cross a set of double lines as it transits along the alternate path 121. In some examples, teleoperator computing device 104 may generate a response identifying geographic areas to exclude from planning a path. In particular, rather than provide a path to follow, a teleoperator 108 may define areas or locations that the autonomous vehicle must avoid.
[0058] In view of the foregoing, the structures and/or functionalities of autonomous vehicle 130 and/or autonomous vehicle controller 147, as well as their components, can perform real-time (or near real-time) trajectory calculations through autonomy-related operations, such as localization and perception, to enable autonomous vehicles 109 to self-drive.
[0059] In some cases, the bidirectional nature of bidirectional autonomous vehicle 130 provides for a vehicle that has quad portions 194 (or any other number of symmetric portions) that are similar or are substantially similar to each other. Such symmetry reduces complexity of design and relatively decreases the number of unique components or structures, thereby reducing inventory and manufacturing complexities. For example, a drivetrain and wheel system may be disposed in any of the quad portions. Further, autonomous vehicle controller 147 is configured to invoke teleoperation services to reduce the likelihood that an autonomous vehicle 109 is delayed in transit while resolving an event or issue that may otherwise affect the safety of the occupants. In some cases, the visible portion of road network 110 depicts a geo-fenced region that may limit or otherwise control the movement of autonomous vehicles 109 to the road network shown in FIG. 1. According to various examples, autonomous vehicle 109, and a fleet thereof, may be configurable to operate as a level 4 ("full self-driving automation," or L4) vehicle that can provide transportation on demand with the convenience and privacy of point-to-point personal mobility while providing the efficiency of shared vehicles. In some examples, autonomous vehicle 109, or any autonomous vehicle described herein, may be configured to omit a steering wheel or any other mechanical means of providing manual (i.e., human-controlled) steering for autonomous vehicle 109. Further, autonomous vehicle 109, or any autonomous vehicle described herein, may be configured to omit a seat or location reserved within the vehicle for an occupant to engage a steering wheel or any mechanical steering system.
[0060] FIG. 2 is an example of a flow diagram to monitor a fleet of autonomous vehicles, according to some embodiments. At 202, flow 200 begins when a fleet of autonomous vehicles is monitored. At least one autonomous vehicle includes an autonomous vehicle controller configured to cause the vehicle to autonomously transit from a first geographic region to a second geographic region. At 204, data representing an event associated with a calculated confidence level for a vehicle is detected. An event may be a condition or situation affecting operation, or potentially affecting operation, of an autonomous vehicle. The events may be internal to an autonomous vehicle, or external. For example, an obstacle obscuring a roadway may be viewed as an event, as well as a reduction or loss of communication. An event may include traffic conditions or congestion, as well as unexpected or unusual numbers or types of external objects (or tracks) that are perceived by a perception engine. An event may include weather-related conditions (e.g., loss of friction due to ice or rain) or the angle at which the sun is shining (e.g., at sunset), such as a low angle to the horizon that causes the sun to shine brightly in the eyes of human drivers of other vehicles. These and other conditions may be viewed as events that cause invocation of the teleoperator service or cause the vehicle to execute a safe-stop trajectory.
[0061] At 206, data representing a subset of candidate trajectories may be received from an autonomous vehicle responsive to the detection of the event. For example, a planner of an autonomous vehicle controller may calculate and evaluate large numbers of trajectories (e.g., thousands or greater) per unit time, such as a second. In some embodiments, candidate trajectories are a subset of the trajectories that provide for relatively higher confidence levels that an autonomous vehicle may move forward safely in view of the event (e.g., using an alternate path provided by a teleoperator). Note that some candidate trajectories may be ranked or associated with higher degrees of confidence than other candidate trajectories. According to some examples, subsets of candidate trajectories may originate from any number of sources, such as a planner, a teleoperator computing device (e.g., teleoperators can determine and provide approximate paths), etc., and may be combined as a superset of candidate trajectories. At 208, path guidance data may be identified at one or more processors. The path guidance data may be configured to assist a teleoperator in selecting a guided trajectory from one or more of the candidate trajectories. In some instances, the path guidance data specifies a value indicative of a confidence level or probability that indicates the degree of certainty that a particular candidate trajectory may reduce or negate the probability that the event may impact operation of an autonomous vehicle. A guided trajectory, as a selected candidate trajectory, may be received at 210, responsive to input from a teleoperator (e.g., a teleoperator may select at least one candidate trajectory as a guided trajectory from a group of differently-ranked candidate trajectories). The selection may be made via an operator interface that lists a number of candidate trajectories, for example, in order from highest confidence levels to lowest confidence levels. At 212, the selection of a candidate trajectory as a guided trajectory may be transmitted to the vehicle, which, in turn, implements the guided trajectory for resolving the condition by causing the vehicle to perform a teleoperator-specified maneuver. As such, the autonomous vehicle may transition from a non-normative operational state.
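For illustration only, a minimal Python sketch of combining candidate trajectories from multiple sources into a superset and ordering them by confidence level, as might be presented on an operator interface, is shown below; the data structure, source names, and confidence values are assumptions of the sketch.

```python
from dataclasses import dataclass

@dataclass
class CandidateTrajectory:
    source: str          # e.g., "planner" or "teleoperator" (assumed labels)
    confidence: float    # estimated probability of safely negotiating the event
    waypoints: list      # simplified representation of the trajectory geometry

def rank_candidates(planner_set, teleoperator_set):
    """Combine candidate trajectories from multiple sources into a superset and
    order them from highest to lowest confidence for presentation to a teleoperator."""
    superset = list(planner_set) + list(teleoperator_set)
    return sorted(superset, key=lambda t: t.confidence, reverse=True)

planner_set = [
    CandidateTrajectory("planner", 0.72, [(0, 0), (10, 2)]),
    CandidateTrajectory("planner", 0.55, [(0, 0), (10, -2)]),
]
teleoperator_set = [CandidateTrajectory("teleoperator", 0.64, [(0, 0), (8, 4)])]

for t in rank_candidates(planner_set, teleoperator_set):
    print(f"{t.source:12s} confidence={t.confidence:.2f}")
```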
[0062] FIG. 3A is a diagram depicting examples of sensors and other autonomous vehicle components, according to some examples. Diagram 300 depicts an interior view of a bidirectional autonomous vehicle 330 that includes sensors, signal routers 345, drive trains 349, removable batteries 343, audio generators 344 (e.g., speakers or transducers), and autonomous vehicle ("AV") control logic 347. Sensors shown in diagram 300 include image capture sensors 340 (e.g., light capture devices or cameras of any type), audio capture sensors 342 (e.g., microphones of any type), radar devices 348, sonar devices 341 (or other like sensors, including ultrasonic sensors or acoustic-related sensors), and LIDAR devices 346, among other sensor types and modalities (some of which are not shown, such as inertial measurement units, or "IMUs," global positioning system ("GPS") sensors, sonar sensors, etc.). Note that quad portion 350 is representative of the symmetry of each of four "quad portions" of bidirectional autonomous vehicle 330 (e.g., each quad portion 350 may include a wheel, a drivetrain 349, similar steering mechanisms, similar structural support and members, etc. beyond that which is depicted). As depicted in FIG. 3A, similar sensors may be placed in similar locations in each quad portion 350; however, any other configuration may be implemented. Each wheel may be steerable individually and independently of the others. Note, too, that removable batteries 343 may be configured to facilitate being swapped in and swapped out rather than charging in situ, thereby ensuring reduced or negligible downtimes due to the necessity of charging batteries 343. While autonomous vehicle controller 347a is depicted as being used in a bidirectional autonomous vehicle 330, autonomous vehicle controller 347a is not so limited and may be implemented in unidirectional autonomous vehicles or any other type of vehicle, whether on land, in air, or at sea. Note that the depicted and described positions, locations, orientations, quantities, and types of sensors shown in FIG. 3A are not intended to be limiting, and, as such, there may be any number and type of sensor, and any sensor may be located and oriented anywhere on autonomous vehicle 330.
[0063] According to some embodiments, portions of the autonomous vehicle ("AV") control logic 347 may be implemented using clusters of graphics processing units ("GPUs") implementing a framework and programming model suitable for programming the clusters of GPUs. For example, a compute unified device architecture ("CUDA™") compatible programming language and application programming interface ("API") model may be used to program the GPUs. CUDA™ is produced and maintained by NVIDIA of Santa Clara, California. Note that other programming languages may be implemented, such as OpenCL, or any other parallel programming language.
[0064] According to some embodiments, autonomous vehicle control logic 347 may be implemented in hardware and/or software as autonomous vehicle controller 347a, which is shown to include a motion controller 362, a planner 364, a perception engine 366, and a localizer 368. As shown, autonomous vehicle controller 347a is configured to receive camera data 340a, LIDAR data 346a, and radar data 348a, or any other range-sensing or localization data, including sonar data 341a or the like. Autonomous vehicle controller 347a is also configured to receive positioning data, such as GPS data 352, IMU data 354, and other position-sensing data (e.g., wheel-related data, such as steering angles, angular velocity, etc.). Further, autonomous vehicle controller 347a may receive any other sensor data 356, as well as reference data 339. In some cases, reference data 339 includes map data (e.g., 3D map data, 2D map data, 4D map data (e.g., including Epoch Determination)) and route data (e.g., road network data, including, but not limited to, RNDF data (or similar data), MDF data (or similar data), etc.).
[0065] Localizer 368 is configured to receive sensor data from one or more sources, such as GPS data 352, wheel data, IMU data 354, LIDAR data 346a, camera data 340a, radar data 348a, and the like, as well as reference data 339 (e.g., 3D map data and route data). Localizer 368 integrates (e.g., fuses the sensor data) and analyzes the data by comparing sensor data to map data to determine a local pose (or position) of bidirectional autonomous vehicle 330. According to some examples, localizer 368 may generate or update the pose or position of any autonomous vehicle in real-time or near real-time. Note that localizer 368 and its functionality need not be limited to "bi-directional" vehicles and can be implemented in any vehicle of any type. Therefore, localizer 368 (as well as other components of AV controller 347a) may be implemented in a "uni-directional" vehicle or any non-autonomous vehicle. According to some embodiments, data describing a local pose may include one or more of an x-coordinate, a y-coordinate, a z-coordinate (or any coordinate of any coordinate system, including polar or cylindrical coordinate systems, or the like), a yaw value, a roll value, a pitch value (e.g., an angle value), a rate (e.g., velocity), altitude, and the like.

[0066] Perception engine 366 is configured to receive sensor data from one or more sources, such as LIDAR data 346a, camera data 340a, radar data 348a, and the like, as well as local pose data. Perception engine 366 may be configured to determine locations of external objects based on sensor data and other data. External objects, for instance, may be objects that are not part of a drivable surface. For example, perception engine 366 may be able to detect and classify external objects as pedestrians, bicyclists, dogs, other vehicles, etc. (e.g., perception engine 366 is configured to classify the objects in accordance with a type of classification, which may be associated with semantic information, including a label). Based on the classification of these external objects, the external objects may be labeled as dynamic objects or static objects. For example, an external object classified as a tree may be labeled as a static object, while an external object classified as a pedestrian may be labeled as a dynamic object. External objects labeled as static may or may not be described in map data. Examples of external objects likely to be labeled as static include traffic cones, cement barriers arranged across a roadway, lane closure signs, newly-placed mailboxes or trash cans adjacent a roadway, etc. Examples of external objects likely to be labeled as dynamic include bicyclists, pedestrians, animals, other vehicles, etc. If the external object is labeled as dynamic, further data about the external object may indicate a typical level of activity and velocity, as well as behavior patterns associated with the classification type. Further data about the external object may be generated by tracking the external object. As such, the classification type can be used to predict or otherwise determine the likelihood that an external object may, for example, interfere with an autonomous vehicle traveling along a planned path. For example, an external object that is classified as a pedestrian may be associated with some maximum speed, as well as an average speed (e.g., based on tracking data). The velocity of the pedestrian relative to the velocity of an autonomous vehicle can be used to determine if a collision is likely.
Further, perception engine 366 may determine a level of uncertainty associated with a current and future state of objects. In some examples, the level of uncertainty may be expressed as an estimated value (or probability).
[0067] Planner 364 is configured to receive perception data from perception engine 366, and may also receive localizer data from localizer 368. According to some examples, the perception data may include an obstacle map specifying static and dynamic objects located in the vicinity of an autonomous vehicle, whereas the localizer data may include a local pose or position. In operation, planner 364 generates numerous trajectories, and evaluates the trajectories, based on at least the location of the autonomous vehicle against relative locations of external dynamic and static objects. Planner 364 selects an optimal trajectory based on a variety of criteria over which to direct the autonomous vehicle in a way that provides for collision-free travel. In some examples, planner 364 may be configured to calculate the trajectories as probabilistically-determined trajectories. Further, planner 364 may transmit steering and propulsion commands (as well as decelerating or braking commands) to motion controller 362. Motion controller 362 subsequently may convert any of the commands, such as a steering command, a throttle or propulsion command, and a braking command, into control signals (e.g., for application to actuators or other mechanical interfaces) to implement changes in steering or wheel angles 351 and/or velocity 353.
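For illustration only, the following Python sketch shows one possible conversion of planner-level commands into bounded control signals by a motion controller; the saturation limits and the linear torque mapping are assumptions of the sketch, not parameters of the described vehicle.

```python
def commands_to_control_signals(steering_cmd_rad, throttle_cmd, brake_cmd,
                                max_wheel_angle_rad=0.61, max_torque_nm=1200.0):
    """Convert planner-level commands into bounded actuator signals.

    The saturation limits and the linear mapping below are illustrative
    assumptions, not values taken from the described vehicle.
    """
    wheel_angle = max(-max_wheel_angle_rad, min(max_wheel_angle_rad, steering_cmd_rad))
    # Propulsion and braking are mutually exclusive; braking wins if both are requested.
    if brake_cmd > 0.0:
        drive_torque = -min(1.0, brake_cmd) * max_torque_nm
    else:
        drive_torque = min(1.0, max(0.0, throttle_cmd)) * max_torque_nm
    return {"wheel_angle_rad": wheel_angle, "drive_torque_nm": drive_torque}

# A steering request beyond the assumed limit is saturated to the maximum wheel angle.
print(commands_to_control_signals(steering_cmd_rad=0.8, throttle_cmd=0.3, brake_cmd=0.0))
```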
[0068] FIGs. 3B to 3E are diagrams depicting examples of sensor field redundancy and autonomous vehicle adaptation to a loss of a sensor field, according to some examples. Diagram 391 of FIG. 3B depicts a sensor field 301a in which sensor 310a detects objects (e.g., for determining range or distance, or other information). While sensor 310a may implement any type of sensor or sensor modality, sensor 310a and similarly-described sensors, such as sensors 310b, 310c, and 310d, may include LIDAR devices. Therefore, sensor fields 301a, 301b, 301c, and 301d each include a field into which lasers extend. Diagram 392 of FIG. 3C depicts four overlapping sensor fields, each of which is generated by a corresponding LIDAR sensor 310 (not shown). As shown, portions 301 of the sensor fields include no overlapping sensor fields (e.g., a single LIDAR field), portions 302 of the sensor fields include two overlapping sensor fields, and portions 303 include three overlapping sensor fields, whereby such sensors provide for multiple levels of redundancy should a LIDAR sensor fail.
[0069] FIG. 3D depicts a loss of a sensor field due to failed operation of LIDAR 309, according to some examples. Sensor field 302 of FIG. 3C is transformed into a single sensor field 305, one of sensor fields 301 of FIG. 3C is lost to a gap 304, and three of sensor fields 303 of FIG. 3C are transformed into sensor fields 306 (i.e., limited to two overlapping fields). Should autonomous car 330c be traveling in the direction of travel 396, the sensor field in front of the moving autonomous vehicle may be less robust than the one at the trailing end portion. According to some examples, an autonomous vehicle controller (not shown) is configured to leverage the bidirectional nature of autonomous vehicle 330c to address the loss of sensor field at the leading area in front of the vehicle. FIG. 3E depicts a bidirectional maneuver for restoring a certain robustness of the sensor field in front of autonomous vehicle 330d. As shown, a more robust sensor field 302 is disposed at the rear of the vehicle 330d coextensive with taillights 348. When convenient, autonomous vehicle 330d performs a bidirectional maneuver by pulling into a driveway 397 and switches its directionality such that taillights 348 actively switch to the other side (e.g., the trailing edge) of autonomous vehicle 330d. As shown, autonomous vehicle 330d restores a robust sensor field 302 in front of the vehicle as it travels along direction of travel 398. Further, the above-described bidirectional maneuver obviates a requirement for a more complicated maneuver that requires backing up into a busy roadway.

[0070] FIG. 4 is a functional block diagram depicting a system including an autonomous vehicle service platform that is communicatively coupled via a communication layer to an autonomous vehicle controller, according to some examples. Diagram 400 depicts an autonomous vehicle ("AV") controller 447 disposed in an autonomous vehicle 430, which, in turn, includes a number of sensors 470 coupled to autonomous vehicle controller 447. Sensors 470 include one or more LIDAR devices 472, one or more cameras 474, one or more radars 476, one or more global positioning system ("GPS") data receiver-sensors, one or more inertial measurement units ("IMUs") 475, one or more odometry sensors 477 (e.g., wheel encoder sensors, wheel speed sensors, and the like), and any other suitable sensors 478, such as infrared cameras or sensors, hyperspectral-capable sensors, ultrasonic sensors (or any other acoustic energy-based sensor), radio frequency-based sensors, etc. In some cases, wheel angle sensors configured to sense steering angles of wheels may be included as odometry sensors 477 or suitable sensors 478. In a non-limiting example, autonomous vehicle controller 447 may include four or more LIDARs 472, sixteen or more cameras 474, and four or more radar units 476. Further, sensors 470 may be configured to provide sensor data to components of autonomous vehicle controller 447 and to elements of autonomous vehicle service platform 401. As shown in diagram 400, autonomous vehicle controller 447 includes a planner 464, a motion controller 462, a localizer 468, a perception engine 466, and a local map generator 440. Note that elements depicted in diagram 400 of FIG. 4 may include structures and/or functions as similarly-named elements described in connection to one or more other drawings.
[0071] Localizer 468 is configured to localize the autonomous vehicle (i.e., determine a local pose) relative to reference data, which may include map data, route data (e.g., road network data, such as RNDF-like data), and the like. In some cases, localizer 468 is configured to identify, for example, a point in space that may represent a location of autonomous vehicle 430 relative to features of a representation of an environment. Localizer 468 is shown to include a sensor data integrator 469, which may be configured to integrate multiple subsets of sensor data (e.g., of different sensor modalities) to reduce uncertainties related to each individual type of sensor. According to some examples, sensor data integrator 469 is configured to fuse sensor data (e.g., LIDAR data, camera data, radar data, etc.) to form integrated sensor data values for determining a local pose. According to some examples, localizer 468 retrieves reference data originating from a reference data repository 405, which includes a map data repository 405a for storing 2D map data, 3D map data, 4D map data, and the like. Localizer 468 may be configured to identify at least a subset of features in the environment to match against map data to identify, or otherwise confirm, a pose of autonomous vehicle 430. According to some examples, localizer 468 may be configured to identify any amount of features in an environment, such that a set of features can include one or more features, or all features. In a specific example, any amount of LIDAR data (e.g., most or substantially all LIDAR data) may be compared against data representing a map for purposes of localization. Generally, non-matched objects resulting from the comparison of the environment features and map data may be dynamic objects, such as vehicles, bicyclists, pedestrians, etc. Note that detection of dynamic objects, including obstacles, may be performed with or without map data. In particular, dynamic objects may be detected and tracked independently of map data (i.e., in the absence of map data). In some instances, 2D map data and 3D map data may be viewed as "global map data" or map data that has been validated at a point in time by autonomous vehicle service platform 401. As map data in map data repository 405a may be updated and/or validated periodically, a deviation may exist between the map data and an actual environment in which the autonomous vehicle is positioned. Therefore, localizer 468 may retrieve locally-derived map data generated by local map generator 440 to enhance localization. Local map generator 440 is configured to generate local map data in real-time or near real-time. Optionally, local map generator 440 may receive static and dynamic object map data to enhance the accuracy of locally generated maps by, for example, disregarding dynamic objects in localization. According to at least some embodiments, local map generator 440 may be integrated with, or formed as part of, localizer 468. In at least one case, local map generator 440, either individually or in collaboration with localizer 468, may be configured to generate map and/or reference data based on simultaneous localization and mapping ("SLAM") or the like. Note that localizer 468 may implement a "hybrid" approach to using map data, whereby logic in localizer 468 may be configured to select various amounts of map data from either map data repository 405a or local map data from local map generator 440, depending on the degrees of reliability of each source of map data.
Therefore, localizer 468 may still use out-of-date map data in view of locally-generated map data.
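For illustration only, the following Python sketch shows one way such a "hybrid" selection between validated global map data and locally-generated map data might be expressed, with a per-tile reliability estimate deciding which source is used; the tile structure, reliability field, and threshold are assumptions of the sketch.

```python
def blend_map_sources(global_tiles, local_tiles, reliability_threshold=0.8):
    """For each map tile, prefer the validated global map when its estimated
    reliability is high, and fall back to locally-generated map data otherwise.

    Tiles are dictionaries keyed by tile id with 'reliability' and 'data' fields;
    the keys, threshold, and selection rule are illustrative assumptions.
    """
    selected = {}
    for tile_id, global_tile in global_tiles.items():
        local_tile = local_tiles.get(tile_id)
        if global_tile["reliability"] >= reliability_threshold or local_tile is None:
            selected[tile_id] = ("global", global_tile["data"])
        else:
            selected[tile_id] = ("local", local_tile["data"])
    return selected

global_tiles = {"t1": {"reliability": 0.95, "data": "validated tile"},
                "t2": {"reliability": 0.40, "data": "stale tile"}}   # deviating region
local_tiles = {"t2": {"reliability": 0.90, "data": "locally generated tile"}}
print(blend_map_sources(global_tiles, local_tiles))
```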
[0072] Perception engine 466 is configured to, for example, assist planner 464 in planning routes and generating trajectories by identifying objects of interest in a surrounding environment in which autonomous vehicle 430 is transiting. Further, probabilities may be associated with each of the objects of interest, whereby a probability may represent a likelihood that an object of interest may be a threat to safe travel (e.g., a fast-moving motorcycle may require enhanced tracking rather than a person sitting at a bus stop bench while reading a newspaper). As shown, perception engine 466 includes an object detector 442 and an object classifier 444. Object detector 442 is configured to distinguish objects relative to other features in the environment, and object classifier 444 may be configured to classify objects as either dynamic or static objects and track the locations of the dynamic and the static objects relative to autonomous vehicle 430 for planning purposes. Further, perception engine 466 may be configured to assign an identifier to a static or dynamic object that specifies whether the object is (or has the potential to become) an obstacle that may impact path planning at planner 464. Although not shown in FIG. 4, note that perception engine 466 may also perform other perception-related functions, such as segmentation and tracking, examples of which are described below.
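For illustration only, the Python sketch below labels detected objects as dynamic or static and flags potential obstacles in the manner described; the classification-to-label mapping and the obstacle rule are simplified assumptions of the sketch.

```python
# Classification types assumed to be labeled dynamic; the mapping is illustrative.
DYNAMIC_TYPES = {"pedestrian", "bicyclist", "animal", "vehicle"}

def classify_and_label(detections):
    """Assign each detected object a dynamic/static label and an obstacle flag
    indicating whether it may impact path planning (a simplified, assumed rule)."""
    labeled = []
    for det in detections:
        is_dynamic = det["type"] in DYNAMIC_TYPES
        labeled.append({
            "type": det["type"],
            "label": "dynamic" if is_dynamic else "static",
            # Static objects on the drivable surface can still be obstacles.
            "obstacle": is_dynamic or det.get("on_drivable_surface", False),
            "position": det["position"],
        })
    return labeled

detections = [
    {"type": "pedestrian", "position": (12.0, 3.5)},
    {"type": "tree", "position": (20.0, -6.0)},
    {"type": "traffic_cone", "position": (15.0, 0.5), "on_drivable_surface": True},
]
for obj in classify_and_label(detections):
    print(obj)
```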
[0073] Planner 464 is configured to generate a number of candidate trajectories for accomplishing a goal of reaching a destination via a number of paths or routes that are available. Trajectory evaluator 465 is configured to evaluate candidate trajectories and identify which subsets of candidate trajectories are associated with higher degrees of confidence levels of providing collision-free paths to the destination. As such, trajectory evaluator 465 can select an optimal trajectory based on relevant criteria for causing commands to generate control signals for vehicle components 450 (e.g., actuators or other mechanisms). Note that the relevant criteria may include any number of factors that define optimal trajectories, the selection of which need not be limited to reducing collisions. For example, the selection of trajectories may be made to optimize user experience (e.g., user comfort) as well as collision-free trajectories that comply with traffic regulations and laws. User experience may be optimized by moderating accelerations in various linear and angular directions (e.g., to reduce jerking-like travel or other unpleasant motion). In some cases, at least a portion of the relevant criteria can specify which of the other criteria to override or supersede, while maintaining optimized, collision-free travel. For example, legal restrictions may be temporarily lifted or deemphasized when generating trajectories in limited situations (e.g., crossing double yellow lines to go around a cyclist or traveling at higher speeds than the posted speed limit to match traffic flows). As such, the control signals are configured to cause propulsion and directional changes at the drivetrain and/or wheels. In this example, motion controller 462 is configured to transform commands into control signals (e.g., velocity, wheel angles, etc.) for controlling the mobility of autonomous vehicle 430. In the event that trajectory evaluator 465 has insufficient information to ensure a confidence level high enough to provide collision-free, optimized travel, planner 464 can generate a request to teleoperator 404 for teleoperator support.
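For illustration only, the following Python sketch combines several relevant criteria into a single trajectory cost and shows how one criterion (here, legal restrictions) might be temporarily deemphasized in limited situations; the metric names, weights, and relaxation rule are assumptions of the sketch.

```python
def trajectory_cost(traj, relax_legal=False,
                    w_collision=10.0, w_comfort=1.0, w_legal=5.0):
    """Combine relevant criteria into a single cost; lower is better.

    'traj' is a dict with pre-computed metrics; the metric names, weights, and
    the relaxation rule are illustrative assumptions.
    """
    legal_weight = 0.5 if relax_legal else w_legal   # e.g., crossing a double yellow
    return (w_collision * traj["collision_risk"]
            + w_comfort * traj["peak_jerk"]
            + legal_weight * traj["legal_violations"])

def select_trajectory(trajectories, relax_legal=False):
    return min(trajectories, key=lambda t: trajectory_cost(t, relax_legal))

trajectories = [
    {"name": "stay_in_lane", "collision_risk": 0.30, "peak_jerk": 0.2, "legal_violations": 0},
    {"name": "cross_double_yellow", "collision_risk": 0.02, "peak_jerk": 0.4, "legal_violations": 1},
]
print(select_trajectory(trajectories)["name"])                    # normal weighting
print(select_trajectory(trajectories, relax_legal=True)["name"])  # e.g., cyclist blocking the lane
```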
[0074] Autonomous vehicle service platform 401 includes teleoperator 404 (e.g., a teleoperator computing device), reference data repository 405, a map updater 406, a vehicle data controller 408, a calibrator 409, and an off-line object classifier 410. Note that each element of autonomous vehicle service platform 401 may be independently located or distributed and in communication with other elements in autonomous vehicle service platform 401. Further, each element of autonomous vehicle service platform 401 may independently communicate with the autonomous vehicle 430 via the communication layer 402. Map updater 406 is configured to receive map data (e.g., from local map generator 440, sensors 460, or any other component of autonomous vehicle controller 447), and is further configured to detect deviations, for example, of map data in map data repository 405a from a locally-generated map. Vehicle data controller 408 can cause map updater 406 to update reference data within repository 405 and facilitate updates to 2D, 3D, and/or 4D map data. In some cases, vehicle data controller 408 can control the rate at which local map data is received into autonomous vehicle service platform 401 as well as the frequency at which map updater 406 performs updating of the map data.
[0075] Calibrator 409 is configured to perform calibration of various sensors of the same or different types. Calibrator 409 may be configured to determine the relative poses of the sensors (e.g., in Cartesian space (x, y, z)) and orientations of the sensors (e.g., roll, pitch and yaw). The pose and orientation of a sensor, such as a camera, LIDAR sensor, radar sensor, etc., may be calibrated relative to other sensors, as well as globally relative to the vehicle's reference frame. Off-line self-calibration can also calibrate or estimate other parameters, such as vehicle inertial tensor, wheelbase, wheel radius, or road surface friction. Calibration can also be done online to detect parameter change, according to some examples. Note, too, that calibration by calibrator 409 may include intrinsic parameters of the sensors (e.g., optical distortion, beam angles, etc.) and extrinsic parameters. In some cases, calibration may be performed by maximizing a correlation between depth discontinuities in 3D laser data and edges of image data, as an example. Off-line object classification 410 is configured to receive data, such as sensor data, from sensors 470 or any other component of autonomous vehicle controller 447. According to some embodiments, an off-line classification pipeline of off-line object classification 410 may be configured to pre-collect and annotate objects (e.g., manually by a human and/or automatically using an offline labeling algorithm), and may further be configured to train an online classifier (e.g., object classifier 444), which can provide real-time classification of object types during online autonomous operation.
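For illustration only, the Python sketch below reduces the edge-correlation idea to a one-dimensional search over a single yaw offset that maximizes the number of laser depth discontinuities landing near image edges; the projection model, search range, and tolerance are assumptions of the sketch, and an actual calibration would search over all extrinsic degrees of freedom.

```python
import math

def project_bearing_to_column(bearing_rad, yaw_offset_rad, image_width=640,
                              horizontal_fov_rad=math.radians(90)):
    """Project a laser bearing into an image column under a simple pinhole-like model."""
    angle = bearing_rad + yaw_offset_rad
    return int(round((0.5 - angle / horizontal_fov_rad) * image_width))

def calibration_score(discontinuity_bearings, edge_columns, yaw_offset_rad, tol=1):
    """Count laser depth discontinuities that land within 'tol' pixels of an image edge."""
    score = 0
    for b in discontinuity_bearings:
        col = project_bearing_to_column(b, yaw_offset_rad)
        if any(abs(col - e) <= tol for e in edge_columns):
            score += 1
    return score

def estimate_yaw_offset(discontinuity_bearings, edge_columns):
    """Coarse grid search over candidate yaw offsets (illustrative only)."""
    candidates = [math.radians(d / 10.0) for d in range(-50, 51)]  # -5.0 to +5.0 deg
    return max(candidates,
               key=lambda y: calibration_score(discontinuity_bearings, edge_columns, y))

# Synthetic example: image edges lie where discontinuities project under a ~2 deg offset.
true_offset = math.radians(2.0)
bearings = [math.radians(a) for a in (-20.0, -5.0, 10.0, 30.0)]
edges = [project_bearing_to_column(b, true_offset) for b in bearings]
print(math.degrees(estimate_yaw_offset(bearings, edges)))  # prints a value close to 2.0
```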
[0076] FIG. 5 is an example of a flow diagram to control an autonomous vehicle, according to some embodiments. At 502, flow 500 begins when sensor data originating from sensors of multiple modalities at an autonomous vehicle is received, for example, by an autonomous vehicle controller. One or more subsets of sensor data may be integrated for generating fused data to improve, for example, estimates. In some examples, a sensor stream of one or more sensors (e.g., of same or different modalities) may be fused to form fused sensor data at 504. In some examples, subsets of LIDAR sensor data and camera sensor data may be fused at 504 to facilitate localization. At 506, data representing objects based on at least two subsets of sensor data may be derived at a processor. For example, data identifying static objects or dynamic objects may be derived (e.g., at a perception engine) from at least LIDAR and camera data. At 508, a detected object is determined to affect a planned path, and a subset of trajectories is evaluated (e.g., at a planner) responsive to the detected object at 510. A confidence level is determined at 512 to fall outside a range of acceptable confidence levels associated with normative operation of an autonomous vehicle. Therefore, in this case, a confidence level may be such that a certainty of selecting an optimized path is less likely, whereby an optimized path may be determined as a function of the probability of facilitating collision-free travel, complying with traffic laws, providing a comfortable user experience (e.g., a comfortable ride), and/or generating candidate trajectories based on any other factor. As such, a request for an alternate path may be transmitted to a teleoperator computing device at 514. Thereafter, the teleoperator computing device may provide a planner with an optimal trajectory over which an autonomous vehicle may travel. In some situations, the vehicle may also determine that executing a safe-stop maneuver is the best course of action (e.g., safely and automatically causing an autonomous vehicle to come to a stop at a location of relatively low probabilities of danger). Note that the order depicted in this and other flow charts herein is not intended to imply a requirement to linearly perform various functions, as each portion of a flow chart may be performed serially or in parallel with any one or more other portions of the flow chart, as well as independently of, or dependent on, other portions of the flow chart.
[0077] FIG. 6 is a diagram depicting an example of an architecture for an autonomous vehicle controller, according to some embodiments. Diagram 600 depicts a number of processes including a motion controller process 662, a planner process 664, a perception process 666, a mapping process 640, and a localization process 668, some of which may generate or receive data relative to other processes. Other processes, such as processes 670 and 650, may facilitate interactions with one or more mechanical components of an autonomous vehicle. For example, perception process 666, mapping process 640, and localization process 668 are configured to receive sensor data from sensors 670, whereas planner process 664 and perception process 666 are configured to receive guidance data 606, which may include route data, such as road network data. Further to diagram 600, localization process 668 is configured to receive map data 605a (i.e., 2D map data), map data 605b (i.e., 3D map data), and local map data 642, among other types of map data. For example, localization process 668 may also receive other forms of map data, such as 4D map data, which may include, for example, an epoch determination. Localization process 668 is configured to generate local position data 641 representing a local pose. Local position data 641 is provided to motion controller process 662, planner process 664, and perception process 666. Perception process 666 is configured to generate static and dynamic object map data 667, which, in turn, may be transmitted to planner process 664. In some examples, static and dynamic object map data 667 may be transmitted with other data, such as semantic classification information and predicted object behavior. Planner process 664 is configured to generate trajectories data 665, which describes a number of trajectories generated by planner 664. Motion controller process 662 uses trajectories data 665 to generate low-level commands or control signals for application to actuators 650 to cause changes in steering angles and/or velocity.
[0078] FIG. 7 is a diagram depicting an example of an autonomous vehicle service platform implementing redundant communication channels to maintain reliable communications with a fleet of autonomous vehicles, according to some embodiments. Diagram 700 depicts an autonomous vehicle service platform 701 including a reference data generator 705, a vehicle data controller 702, an autonomous vehicle fleet manager 703, a teleoperator manager 707, a simulator 740, and a policy manager 742. Reference data generator 705 is configured to generate and modify map data and route data (e.g., RNDF data). Further, reference data generator 705 may be configured to access 2D maps in 2D map data repository 720, access 3D maps in 3D map data repository 722, and access route data in route data repository 724. Other map representation data and repositories may be implemented in some examples, such as 4D map data including Epoch Determination. Vehicle data controller 702 may be configured to perform a variety of operations. For example, vehicle data controller 702 may be configured to change a rate at which data is exchanged between a fleet of autonomous vehicles and platform 701 based on quality levels of communication over channels 770. During bandwidth-constrained periods, for example, data communications may be prioritized such that teleoperation requests from autonomous vehicle 730 are prioritized highly to ensure delivery. Further, variable levels of data abstraction may be transmitted per vehicle over channels 770, depending on bandwidth available for a particular channel. For example, in the presence of a robust network connection, full LIDAR data (e.g., substantially all LIDAR data, but also may be less) may be transmitted, whereas in the presence of a degraded or low-speed connection, simpler or more abstract depictions of the data may be transmitted (e.g., bounding boxes with associated metadata, etc.). Autonomous vehicle fleet manager 703 is configured to coordinate the dispatching of autonomous vehicles 730 to optimize multiple variables, including an efficient use of battery power, times of travel, whether or not an air-conditioning unit in an autonomous vehicle 730 may be used during low charge states of a battery, etc., any or all of which may be monitored in view of optimizing cost functions associated with operating an autonomous vehicle service. An algorithm may be implemented to analyze a variety of variables with which to minimize costs or times of travel for a fleet of autonomous vehicles. Further, autonomous vehicle fleet manager 703 maintains an inventory of autonomous vehicles as well as parts for accommodating a service schedule in view of maximizing up-time of the fleet.
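For illustration only, the following Python sketch selects a level of data abstraction for transmission based on measured channel bandwidth, in the spirit of the variable abstraction described above; the thresholds and payload shapes are assumptions of the sketch.

```python
def choose_payload(full_lidar_points, bounding_boxes, measured_bandwidth_mbps,
                   full_data_threshold_mbps=50.0, abstract_threshold_mbps=2.0):
    """Select how much sensor detail to send over a channel given measured bandwidth.

    Thresholds and payload shapes are illustrative assumptions; the described
    platform only requires that richer data be sent when the connection is robust.
    """
    if measured_bandwidth_mbps >= full_data_threshold_mbps:
        return {"kind": "full_lidar", "points": full_lidar_points}
    if measured_bandwidth_mbps >= abstract_threshold_mbps:
        return {"kind": "bounding_boxes", "boxes": bounding_boxes}
    # Severely degraded link: send only high-priority metadata (e.g., with a teleoperation request).
    return {"kind": "metadata_only", "object_count": len(bounding_boxes)}

points = [(1.0, 2.0, 0.1)] * 100000
boxes = [{"center": (12.0, 3.0), "size": (0.6, 0.6), "label": "pedestrian"}]
print(choose_payload(points, boxes, measured_bandwidth_mbps=80.0)["kind"])  # full_lidar
print(choose_payload(points, boxes, measured_bandwidth_mbps=5.0)["kind"])   # bounding_boxes
print(choose_payload(points, boxes, measured_bandwidth_mbps=0.5)["kind"])   # metadata_only
```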
[0079] Teleoperator manager 707 is configured to manage a number of teleoperator computing devices 704 with which teleoperators 708 provide input. Simulator 740 is configured to simulate operation of one or more autonomous vehicles 730, as well as the interactions between teleoperator manager 707 and an autonomous vehicle 730. Simulator 740 may also simulate operation of a number of sensors (including the introduction of simulated noise) disposed in autonomous vehicle 730. Further, an environment, such as a city, may be simulated such that a simulated autonomous vehicle can be introduced to the synthetic environment, whereby simulated sensors may receive simulated sensor data, such as simulated laser returns. Simulator 740 may provide other functions as well, including validating software updates and/or map data. Policy manager 742 is configured to maintain data representing policies or rules by which an autonomous vehicle ought to behave in view of a variety of conditions or events that an autonomous vehicle encounters while traveling in a network of roadways. In some cases, updated policies and/or rules may be simulated in simulator 740 to confirm safe operation of a fleet of autonomous vehicles in view of changes to a policy. Some of the above-described elements of autonomous vehicle service platform 701 are further described hereinafter.
[0080] Communication channels 770 are configured to provide networked communication links among a fleet of autonomous vehicles 730 and autonomous vehicle service platform 701. For example, communication channel 770 includes a number of different types of networks 771, 772, 773, and 774, with corresponding subnetworks (e.g., 771a to 771n), to ensure a certain level of redundancy for operating an autonomous vehicle service reliably. For example, the different types of networks in communication channels 770 may include different cellular network providers, different types of data networks, etc., to ensure sufficient bandwidth in the event of reduced or lost communications due to outages in one or more networks 771, 772, 773, and 774.
[0081] FIG. 8 is a diagram depicting an example of a messaging application configured to exchange data among various applications, according to some embodiments. Diagram 800 depicts a teleoperator application 801 disposed in a teleoperator manager, and an autonomous vehicle application 830 disposed in an autonomous vehicle, whereby teleoperator application 801 and autonomous vehicle application 830 exchange message data via a protocol that facilitates communications over a variety of networks, such as networks 871, 872, and other networks 873. According to some examples, the communication protocol is a middleware protocol implemented as a Data Distribution Service™ having a specification maintained by the Object Management Group consortium. In accordance with the communications protocol, teleoperator application 801 and autonomous vehicle application 830 may include a message router 854 disposed in a message domain, the message router being configured to interface with the teleoperator API 852. In some examples, message router 854 is a routing service. In some examples, message domain 850a in teleoperator application 801 may be identified by a teleoperator identifier, whereas message domain 850b may be identified as a domain associated with a vehicle identifier. Teleoperator API 852 in teleoperator application 801 is configured to interface with teleoperator processes 803a to 803c, whereby teleoperator process 803b is associated with an autonomous vehicle identifier 804, and teleoperator process 803c is associated with an event identifier 806 (e.g., an identifier that specifies an intersection that may be problematic for collision-free path planning). Teleoperator API 852 in autonomous vehicle application 830 is configured to interface with an autonomous vehicle operating system 840, which includes a sensing application 842, a perception application 844, a localization application 846, and a control application 848. In view of the foregoing, the above-described communications protocol may facilitate data exchanges to facilitate teleoperations as described herein. Further, the above-described communications protocol may be adapted to provide secure data exchanges among one or more autonomous vehicles and one or more autonomous vehicle service platforms. For example, message routers 854 may be configured to encrypt and decrypt messages to provide for secured interactions between, for example, a teleoperator process 803 and an autonomous vehicle operating system 840.
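For illustration only, the Python sketch below imitates domain-scoped message routing with a placeholder standing in for encryption; it is a schematic stand-in and does not use or represent the Data Distribution Service™ API, and the topic names, domain identifiers, and encoding are assumptions of the sketch.

```python
import json

def placeholder_encrypt(plaintext: str) -> str:
    """Stand-in for real encryption (reversible hex encoding only); an actual
    router would use a vetted cryptographic scheme."""
    return plaintext.encode("utf-8").hex()

def placeholder_decrypt(ciphertext: str) -> str:
    return bytes.fromhex(ciphertext).decode("utf-8")

class MessageRouter:
    """Schematic message router scoped to a message domain (e.g., a teleoperator
    identifier or a vehicle identifier); not an implementation of DDS."""
    def __init__(self, domain_id: str):
        self.domain_id = domain_id
        self.handlers = {}   # topic -> callable

    def subscribe(self, topic, handler):
        self.handlers[topic] = handler

    def deliver(self, topic, encrypted_payload):
        message = json.loads(placeholder_decrypt(encrypted_payload))
        if message.get("domain") == self.domain_id and topic in self.handlers:
            self.handlers[topic](message["body"])

def publish(router, topic, domain_id, body):
    payload = placeholder_encrypt(json.dumps({"domain": domain_id, "body": body}))
    router.deliver(topic, payload)

vehicle_router = MessageRouter(domain_id="vehicle-109d")   # hypothetical domain id
vehicle_router.subscribe("teleop/command", lambda body: print("command received:", body))
publish(vehicle_router, "teleop/command", "vehicle-109d", {"action": "follow_guided_trajectory"})
```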
[0082] FIG. 9 is a diagram depicting types of data for facilitating teleoperations using a communications protocol described in FIG. 8, according to some examples. Diagram 900 depicts a teleoperator 908 interfacing with a teleoperator computing device 904 coupled to a teleoperator application 901, which is configured to exchange data via a data-centric messaging bus 972 implemented in one or more networks 971. Data-centric messaging bus 972 provides a communication link between teleoperator application 901 and autonomous vehicle application 930. Teleoperator API 962 of teleoperator application 901 is configured to receive message service configuration data 964 and route data 960, such as road network data (e.g., RNDF-like data), mission data (e.g., MDF-data), and the like. Similarly, a messaging service bridge 932 is also configured to receive messaging service configuration data 934. Messaging service configuration data 934 and 964 provide configuration data to configure the messaging service between teleoperator application 901 and autonomous vehicle application 930. An example of messaging service configuration data 934 and 964 includes quality of service ("QoS") configuration data implemented to configure a Data Distribution Service™ application.
[0083] An example of a data exchange for facilitating teleoperations via the communications protocol is described as follows. Consider that obstacle data 920 is generated by a perception system of an autonomous vehicle controller. Further, planner options data 924 is generated by a planner to notify a teleoperator of a subset of candidate trajectories, and position data 926 is generated by the localizer. Obstacle data 920, planner options data 924, and position data 926 are transmitted to a messaging service bridge 932, which, in accordance with message service configuration data 934, generates telemetry data 940 and query data 942, both of which are transmitted via data-centric messaging bus 972 into teleoperator application 901 as telemetry data 950 and query data 952. Teleoperator API 962 receives telemetry data 950 and query data 952, which, in turn, are processed in view of route data 960 and message service configuration data 964. The resultant data is subsequently presented to a teleoperator 908 via teleoperator computing device 904 and/or a collaborative display (e.g., a dashboard display visible to a group of collaborating teleoperators 908). Teleoperator 908 reviews the candidate trajectory options that are presented on the display of teleoperator computing device 904, and selects a guided trajectory, which generates command data 982 and query response data 980, both of which are passed through teleoperator API 962 as query response data 954 and command data 956. In turn, query response data 954 and command data 956 are transmitted via data-centric messaging bus 972 into autonomous vehicle application 930 as query response data 944 and command data 946. Messaging service bridge 932 receives query response data 944 and command data 946 and generates teleoperator command data 928, which is configured to generate a teleoperator-selected trajectory for implementation by a planner. Note that the above-described messaging processes are not intended to be limiting, and other messaging protocols may be implemented as well.
[0084] FIG. 10 is a diagram illustrating an example of a teleoperator interface with which a teleoperator may influence path planning, according to some embodiments. Diagram 1000 depicts examples of an autonomous vehicle 1030 in communication with an autonomous vehicle service platform 1001, which includes a teleoperator manager 1007 configured to facilitate teleoperations. In a first example, teleoperator manager 1007 receives data that requires teleoperator 1008 to preemptively view a path of an autonomous vehicle approaching a potential obstacle or an area of low planner confidence levels so that teleoperator 1008 may be able to address an issue in advance. To illustrate, consider that an intersection that an autonomous vehicle is approaching may be tagged as being problematic. As such, user interface 1010 displays a representation 1014 of a corresponding autonomous vehicle 1030 transiting along a path 1012, which has been predicted by a number of trajectories generated by a planner. Also displayed are other vehicles 1011 and dynamic objects 1013, such as pedestrians, that may cause sufficient confusion at the planner, thereby requiring teleoperation support. User interface 1010 also presents to teleoperator 1008 a current velocity 1022, a speed limit 1024, and an amount of charge 1026 presently in the batteries. According to some examples, user interface 1010 may display other data, such as sensor data as acquired from autonomous vehicle 1030. In a second example, consider that planner 1064 has generated a number of trajectories that are coextensive with a planner-generated path 1044 regardless of a detected unidentified object 1046. Planner 1064 may also generate a subset of candidate trajectories 1040, but in this example, the planner is unable to proceed given present confidence levels. If planner 1064 fails to determine an alternative path, a teleoperation request may be transmitted. In this case, a teleoperator may select one of candidate trajectories 1040 to facilitate travel by autonomous vehicle 1030 that is consistent with teleoperator-based path 1042.
[0085] FIG. 11 is a diagram depicting an example of a planner configured to invoke teleoperations, according to some examples. Diagram 1100 depicts planner 1164 including a topography manager 1110, a route manager 1112, a path generator 1114, a trajectory evaluator 1120, and a trajectory tracker 1128. Topography manager 1110 is configured to receive map data, such as 3D map data or other like map data that specifies topographic features. Topography manager 1110 is further configured to identify candidate paths based on topographic-related features on a path to a destination. According to various examples, topography manager 1110 receives 3D maps generated by sensors associated with one or more autonomous vehicles in the fleet. Route manager 1112 is configured to receive environmental data 1103, which may include traffic-related information associated with one or more routes that may be selected as a path to the destination. Path generator 1114 receives data from topography manager 1110 and route manager 1112, and generates one or more paths or path segments suitable to direct the autonomous vehicle toward a destination. Data representing one or more paths or path segments is transmitted into trajectory evaluator 1120.
[0086] Trajectory evaluator 1120 includes a state and event manager 1122, which, in turn, may include a confidence level generator 1123. Trajectory evaluator 1120 further includes a guided trajectory generator 1126 and a trajectory generator 1124. Further, planner 1164 is configured to receive policy data 1130, perception engine data 1132, and localizer data 1134.
[0087] Policy data 1130 may include criteria that planner 1164 uses to determine a path that has a sufficient confidence level with which to generate trajectories, according to some examples. Examples of policy data 1130 include policies that specify that trajectory generation is bounded by stand-off distances to external objects (e.g., maintaining a safety buffer of 3 feet from a cyclist, where possible), or policies that require that trajectories must not cross a center double yellow line, or policies that require trajectories to be limited to a single lane in a 4-lane roadway (e.g., based on past events, such as typically congregating at a lane closest to a bus stop), and any other similar criteria specified by policies. Perception engine data 1132 includes maps of locations of static objects and dynamic objects of interest, and localizer data 1134 includes at least a local pose or position.
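For illustration only, the following Python sketch checks a candidate trajectory against two of the example policies above (a stand-off buffer from a cyclist and not crossing the center double yellow line); the lane geometry and thresholds are assumptions of the sketch.

```python
def satisfies_policies(trajectory, detected_objects, standoff_m=0.9, lane_width_m=3.7):
    """Check a candidate trajectory against simple policy rules: maintain a
    stand-off distance from cyclists and do not cross the center double yellow line.

    The geometry (lateral offsets from the lane center) and rule set are
    illustrative assumptions; actual policy data may encode many more criteria.
    """
    for x, lateral_offset in trajectory:
        # Rule: never cross the center double yellow line (assumed at +lane_width_m/2).
        if lateral_offset > lane_width_m / 2.0:
            return False
        # Rule: keep a stand-off buffer (roughly 3 feet) from cyclists, where possible.
        for obj in detected_objects:
            if obj["type"] == "cyclist" and abs(obj["x"] - x) < 2.0:
                if abs(obj["lateral_offset"] - lateral_offset) < standoff_m:
                    return False
    return True

trajectory = [(0.0, 0.0), (5.0, 0.3), (10.0, 0.3), (15.0, 0.0)]
objects = [{"type": "cyclist", "x": 10.0, "lateral_offset": 1.6}]
print(satisfies_policies(trajectory, objects))   # True: about 1.3 m of lateral clearance
```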
[0088] State and event manager 1122 may be configured to probabilistically determine a state of operation for an autonomous vehicle. For example, a first state of operation (i.e., "normative operation") may describe a situation in which trajectories are collision-free, whereas a second state of operation (i.e., "non-normative operation") may describe another situation in which the confidence level associated with possible trajectories is insufficient to guarantee collision-free travel. According to some examples, state and event manager 1122 is configured to use perception data 1132 to determine a state of the autonomous vehicle that is either normative or non-normative. Confidence level generator 1123 may be configured to analyze perception data 1132 to determine a state for the autonomous vehicle. For example, confidence level generator 1123 may use semantic information associated with static and dynamic objects, as well as associated probabilistic estimations, to enhance a degree of certainty that planner 1164 is determining a safe course of action. For example, planner 1164 may use perception engine data 1132 that specifies a probability that an object is either a person or not a person to determine whether planner 1164 is operating safely (e.g., planner 1164 may receive a degree of certainty that an object has a 98% probability of being a person, and a probability of 2% that the object is not a person).
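For illustration only, the Python sketch below derives a normative or non-normative state of operation from per-object classification probabilities and a trajectory confidence value; the thresholds and the conjunctive rule are assumptions of the sketch.

```python
def operational_state(object_probabilities, collision_free_confidence,
                      classification_threshold=0.9, trajectory_threshold=0.85):
    """Determine a normative or non-normative state of operation from perception
    certainty and trajectory confidence; the thresholds and the conjunctive rule
    are illustrative assumptions."""
    perception_ok = all(max(p.values()) >= classification_threshold
                        for p in object_probabilities)
    trajectory_ok = collision_free_confidence >= trajectory_threshold
    return "normative" if (perception_ok and trajectory_ok) else "non-normative"

# e.g., a 98% probability that a detected object is a person, 2% that it is not.
objects = [{"person": 0.98, "not_person": 0.02}]
print(operational_state(objects, collision_free_confidence=0.95))  # normative
print(operational_state(objects, collision_free_confidence=0.60))  # non-normative
```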
[0089] Upon determining that a confidence level (e.g., based on statistics and probabilistic determinations) is below a threshold required for predicted safe operation, a relatively low confidence level (e.g., single probability score) may trigger planner 1164 to transmit a request 1135 for teleoperation support to autonomous vehicle service platform 1101. In some cases, telemetry data and a set of candidate trajectories may accompany the request. Examples of telemetry data include sensor data, localization data, perception data, and the like. A teleoperator 1108 may transmit via teleoperator computing device 1104 a selected trajectory 1137 to guided trajectory generator 1126. As such, selected trajectory 1137 is a trajectory formed with guidance from a teleoperator. Upon confirming there is no change in the state (e.g., a non-normative state is pending), guided trajectory generator 1126 passes data to trajectory generator 1124, which, in turn, causes trajectory tracker 1128, as a trajectory tracking controller, to use the teleop-specified trajectory for generating control signals 1170 (e.g., steering angles, velocity, etc.). Note that planner 1164 may trigger transmission of a request 1135 for teleoperation support prior to a state transitioning to a non-normative state. In particular, an autonomous vehicle controller and/or its components can predict that a distant obstacle may be problematic and preemptively cause planner 1164 to invoke teleoperations prior to the autonomous vehicle reaching the obstacle. Otherwise, the autonomous vehicle may cause a delay by transitioning to a safe state upon encountering the obstacle or scenario (e.g., pulling over and off the roadway). In another example, teleoperations may be automatically invoked prior to an autonomous vehicle approaching a particular location that is known to be difficult to navigate. This determination may optionally take into consideration other factors, including the time of day, the position of the sun, whether such a situation is likely to cause a disturbance to the reliability of sensor readings, and traffic or accident data derived from a variety of sources.
[0090] FIG. 12 is an example of a flow diagram configured to control an autonomous vehicle, according to some embodiments. At 1202, flow 1200 begins when data representing a subset of objects is received at a planner in an autonomous vehicle, the subset of objects including at least one object associated with data representing a degree of certainty for a classification type. For example, perception engine data may include metadata associated with objects, whereby the metadata specifies a degree of certainty associated with a specific classification type. For instance, a dynamic object may be classified as a "young pedestrian" with an 85% confidence level of being correct. At 1204, localizer data may be received (e.g., at a planner). The localizer data may include map data that is generated locally within the autonomous vehicle. The local map data may specify a degree of certainty (including a degree of uncertainty) that an event at a geographic region may occur. An event may be a condition or situation affecting operation, or potentially affecting operation, of an autonomous vehicle. The events may be internal (e.g., a failed or impaired sensor) to an autonomous vehicle, or external (e.g., a roadway obstruction). Examples of events are described herein, such as in FIG. 2 as well as in other figures and passages. A path coextensive with the geographic region of interest may be determined at 1206. For example, consider that the event is the positioning of the sun in the sky at a time of day in which the intensity of sunlight impairs the vision of drivers during rush hour traffic. As such, it is expected or predicted that traffic may slow down responsive to the bright sunlight. Accordingly, a planner may preemptively invoke teleoperations if an alternate path to avoid the event is less likely to be available. At 1208, a local position is determined at a planner based on local pose data. At 1210, a state of operation of an autonomous vehicle may be determined (e.g., probabilistically), for example, based on a degree of certainty for a classification type and a degree of certainty of the event, which may be based on any number of factors, such as speed, position, and other state information. To illustrate, consider an example in which a young pedestrian is detected by the autonomous vehicle during the event in which other drivers' vision likely will be impaired by the sun, thereby causing an unsafe situation for the young pedestrian. Therefore, a relatively unsafe situation can be detected as a probabilistic event that may be likely to occur (i.e., an unsafe situation for which teleoperations may be invoked). At 1212, a likelihood that the state of operation is in a normative state is determined, and based on the determination, a message is transmitted to a teleoperator computing device requesting teleoperations to preempt a transition to a next state of operation (e.g., preempt a transition from a normative to a non-normative state of operation, such as an unsafe state of operation).
[0091] FIG. 13 depicts an example in which a planner may generate a trajectory, according to some examples. Diagram 1300 includes a trajectory evaluator 1320 and a trajectory generator 1324. Trajectory evaluator 1320 includes a confidence level generator 1322 and a teleoperator query messenger 1329. As shown, trajectory evaluator 1320 is coupled to a perception engine 1366 to receive static map data 1301, and current and predicted object state data 1303. Trajectory evaluator 1320 also receives local pose data 1305 from localizer 1368 and plan data 1307 from a global planner 1369. In one state of operation (e.g., non-normative), confidence level generator 1322 receives static map data 1301 and current and predicted object state data 1303. Based on this data, confidence level generator 1322 may determine that detected trajectories are associated with unacceptable confidence level values. As such, confidence level generator 1322 transmits detected trajectory data 1309 (e.g., data including candidate trajectories) to notify a teleoperator via teleoperator query messenger 1329, which, in turn, transmits a request 1370 for teleoperator assistance. [0092] In another state of operation (e.g., a normative state), static map data 1301, current and predicted object state data 1303, local pose data 1305, and plan data 1307 (e.g., global plan data) are received into trajectory calculator 1325, which is configured to calculate (e.g., iteratively) trajectories to determine one or more optimal paths. Next, at least one path is selected and is transmitted as selected path data 1311. According to some embodiments, trajectory calculator 1325 is configured to implement re-planning of trajectories, as an example. Nominal driving trajectory generator 1327 is configured to generate trajectories using a refined approach, such as by generating trajectories based on receding horizon control techniques. Nominal driving trajectory generator 1327 subsequently may transmit nominal driving trajectory path data 1372 to, for example, a trajectory tracker or a vehicle controller to implement physical changes in steering, acceleration, and other components.
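For illustration only, a small Python sketch of cost-based selection among candidate trajectories, loosely in the spirit of a trajectory calculator choosing an optimal path; the cost terms, the nominal 2 m clearance, and the weighting are assumptions rather than details of the disclosure.

```python
import math

def trajectory_cost(trajectory, obstacles, comfort_weight=1.0):
    """Illustrative cost: path length plus a penalty for passing within a
    nominal 2 m of predicted object positions; lower is better."""
    length = sum(math.dist(a, b) for a, b in zip(trajectory, trajectory[1:]))
    clearance_penalty = 0.0
    for point in trajectory:
        for obstacle in obstacles:
            d = math.dist(point, obstacle)
            if d < 2.0:
                clearance_penalty += (2.0 - d)
    return length + comfort_weight * clearance_penalty

def select_path(candidates, obstacles):
    """Pick the lowest-cost candidate trajectory."""
    return min(candidates, key=lambda t: trajectory_cost(t, obstacles))

candidates = [
    [(0, 0), (1, 0), (2, 0), (3, 0)],   # straight through a detected object's position
    [(0, 0), (1, 1), (2, 1), (3, 0)],   # slight swerve around it
]
print(select_path(candidates, obstacles=[(2.0, 0.0)]))
# [(0, 0), (1, 1), (2, 1), (3, 0)] -- the swerving candidate has the lower cost
```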
[0093] FIG. 14 is a diagram depicting another example of an autonomous vehicle service platform, according to some embodiments. Diagram 1400 depicts an autonomous vehicle service platform 1401 including a teleoperator manager 1407 that is configured to manage interactions and/or communications among teleoperators 1408, teleoperator computing devices 1404, and other components of autonomous vehicle service platform 1401. Further to diagram 1400, autonomous vehicle service platform 1401 includes a simulator 1440, a repository 1441, a policy manager 1442, a reference data updater 1438, a 2D map data repository 1420, a 3D map data repository 1422, and a route data repository 1424. Other map data, such as 4D map data (e.g., using epoch determination), may be implemented and stored in a repository (not shown).
[0094] Teleoperator action recommendation controller 1412 includes logic configured to receive and/or control a teleoperation service request via autonomous vehicle ("AV") planner data 1472, which can include requests for teleoperator assistance as well as telemetry data and other data. As such, planner data 1472 may include recommended candidate trajectories or paths from which a teleoperator 1408, via teleoperator computing device 1404, may select. According to some examples, teleoperator action recommendation controller 1412 may be configured to access other sources of recommended candidate trajectories from which to select an optimum trajectory. For example, candidate trajectories contained in autonomous vehicle planner data 1472 may, in parallel, be introduced into simulator 1440, which is configured to simulate an event or condition being experienced by an autonomous vehicle requesting teleoperator assistance. Simulator 1440 can access map data and other data necessary for performing a simulation on the set of candidate trajectories, whereby simulator 1440 need not exhaustively reiterate simulations to confirm sufficiency. Rather, simulator 1440 may either confirm the appropriateness of the candidate trajectories or otherwise alert a teleoperator to be cautious in their selection. [0095] Teleoperator interaction capture analyzer 1416 may be configured to capture large numbers of teleoperator transactions or interactions for storage in repository 1441, which, for example, may accumulate data relating to a number of teleoperator transactions for analysis and generation of policies, at least in some cases. According to some embodiments, repository 1441 may also be configured to store policy data for access by policy manager 1442. Further, teleoperator interaction capture analyzer 1416 may apply machine learning techniques to empirically determine how best to respond to events or conditions causing requests for teleoperation assistance. In some cases, policy manager 1442 may be configured to update a particular policy or generate a new policy responsive to analyzing the large set of teleoperator interactions (e.g., subsequent to applying machine learning techniques). Policy manager 1442 manages policies, which may be viewed as rules or guidelines under which an autonomous vehicle controller and its components operate to comply with autonomous operation of the vehicle. In some cases, a modified or updated policy may be applied to simulator 1440 to confirm the efficacy of permanently releasing or implementing such policy changes.
[0096] Simulator interface controller 1414 is configured to provide an interface between simulator 1440 and teleoperator computing devices 1404. For example, consider that sensor data from a fleet of autonomous vehicles is applied to reference data updater 1438 via autonomous vehicle ("AV") fleet data 1470, whereby reference data updater 1438 is configured to generate updated map and route data 1439. In some implementations, updated map and route data 1439 may be preliminarily released as an update to data in map data repositories 1420 and 1422, or as an update to data in route data repository 1424. In this case, such data may be tagged as being a "beta version" in which a lower threshold for requesting teleoperator service may be implemented when, for example, a map tile including preliminarily updated information is used by an autonomous vehicle. Further, updated map and route data 1439 may be introduced to simulator 1440 for validating the updated map data. Upon full release (e.g., at the close of beta testing), the previously lowered threshold for requesting teleoperator service related to map tiles is canceled. User interface graphics controller 1410 provides rich graphics to teleoperators 1408, whereby a fleet of autonomous vehicles may be simulated within simulator 1440 and may be accessed via teleoperator computing device 1404 as if the simulated fleet of autonomous vehicles were real.
[0097] FIG. 15 is an example of a flow diagram to control an autonomous vehicle, according to some embodiments. At 1502, flow 1500 begins. Message data may be received at a teleoperator computing device for managing a fleet of autonomous vehicles. The message data may indicate event attributes associated with a non-normative state of operation in the context of a planned path for an autonomous vehicle. For example, an event may be characterized as a particular intersection that becomes problematic due to, for example, a large number of pedestrians hurriedly crossing the street against a traffic light. The event attributes describe the characteristics of the event, such as, for example, the number of people crossing the street, the traffic delays resulting from an increased number of pedestrians, etc. At 1504, a teleoperation repository may be accessed to retrieve a first subset of recommendations based on simulated operations of aggregated data associated with a group of autonomous vehicles. In this case, a simulator may be a source of recommendations that a teleoperator may implement. Further, the teleoperation repository may also be accessed to retrieve a second subset of recommendations based on an aggregation of teleoperator interactions responsive to similar event attributes. In particular, a teleoperator interaction capture analyzer may apply machine learning techniques to empirically determine how best to respond to events having similar attributes based on previous requests for teleoperation assistance. At 1506, the first subset and the second subset of recommendations are combined to form a set of recommended courses of action for the autonomous vehicle. At 1508, representations of the set of recommended courses of action may be presented visually on a display of a teleoperator computing device. At 1510, data signals representing a selection (e.g., by a teleoperator) of a recommended course of action may be detected.
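As an illustrative sketch only, the following Python snippet merges a simulator-derived subset and an interaction-derived subset of recommendations into a single ranked set of courses of action; the scoring scheme and recommendation strings are hypothetical.

```python
def combine_recommendations(simulated_recs, interaction_recs):
    """Merge simulator-derived and interaction-derived recommendations into a
    ranked, de-duplicated set of courses of action (highest combined score first)."""
    scores = {}
    for rec, score in simulated_recs + interaction_recs:
        scores[rec] = scores.get(rec, 0.0) + score
    return sorted(scores, key=scores.get, reverse=True)

first_subset = [("slow and proceed", 0.6), ("hold position", 0.3)]       # from simulation
second_subset = [("slow and proceed", 0.5), ("reroute one block", 0.4)]  # from past interactions
print(combine_recommendations(first_subset, second_subset))
# ['slow and proceed', 'reroute one block', 'hold position']
```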
[0098] FIG. 16 is a diagram of an example of an autonomous vehicle fleet manager implementing a fleet optimization manager, according to some examples. Diagram 1600 depicts an autonomous vehicle fleet manager that is configured to manage a fleet of autonomous vehicles 1630 transiting within a road network 1650. Autonomous vehicle fleet manager 1603 is coupled to a teleoperator 1608 via a teleoperator computing device 1604, and is also coupled to a fleet management data repository 1646. Autonomous vehicle fleet manager 1603 is configured to receive policy data 1602 and environmental data 1606, as well as other data. Further to diagram 1600, fleet optimization manager 1620 is shown to include a transit request processor 1631, which, in turn, includes a fleet data extractor 1632 and an autonomous vehicle dispatch optimization calculator 1634. Transit request processor 1631 is configured to process transit requests, such as from a user 1688 who is requesting autonomous vehicle service. Fleet data extractor 1632 is configured to extract data relating to autonomous vehicles in the fleet. Data associated with each autonomous vehicle is stored in repository 1646. For example, data for each vehicle may describe maintenance issues, scheduled service calls, daily usage, battery charge and discharge rates, and any other data; such data, which may be updated in real time, may be used for purposes of optimizing the fleet of autonomous vehicles to minimize downtime. Autonomous vehicle dispatch optimization calculator 1634 is configured to analyze the extracted data and calculate optimized usage of the fleet so as to ensure that the next vehicle dispatched, such as from station 1652, provides for the least travel times and/or costs, in the aggregate, for the autonomous vehicle service. [0099] Fleet optimization manager 1620 is shown to include a hybrid autonomous vehicle/non-autonomous vehicle processor 1640, which, in turn, includes an AV/non-AV optimization calculator 1642 and a non-AV selector 1644. According to some examples, hybrid autonomous vehicle/non-autonomous vehicle processor 1640 is configured to manage a hybrid fleet of autonomous vehicles and human-driven vehicles (e.g., as independent contractors). As such, the autonomous vehicle service may employ non-autonomous vehicles to meet excess demand, or in areas, such as non-AV service region 1690, that may be beyond a geo-fence or in areas of poor communication coverage. AV/non-AV optimization calculator 1642 is configured to optimize usage of the fleet of autonomous vehicles and to invite non-AV drivers into the transportation service (e.g., with minimal or no detriment to the autonomous vehicle service). Non-AV selector 1644 includes logic for selecting a number of non-AV drivers to assist based on calculations derived by AV/non-AV optimization calculator 1642.
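For illustration only, a minimal Python sketch of a dispatch-optimization-style selection that penalizes vehicles with low battery charge or imminent scheduled maintenance; the field names, weights, and thresholds are hypothetical and not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class FleetVehicle:
    vehicle_id: str
    minutes_to_pickup: float
    battery_pct: float
    minutes_to_next_service: float

def dispatch_score(v, min_battery=20.0, min_service_margin=60.0):
    """Lower is better: estimated travel time, penalized heavily when the
    vehicle is low on charge or close to its next scheduled service."""
    penalty = 0.0
    if v.battery_pct < min_battery:
        penalty += 1000.0
    if v.minutes_to_next_service < min_service_margin:
        penalty += 1000.0
    return v.minutes_to_pickup + penalty

def select_vehicle(fleet):
    """Stand-in for a dispatch optimization calculator choosing the next vehicle."""
    return min(fleet, key=dispatch_score)

fleet = [
    FleetVehicle("av-01", minutes_to_pickup=4.0, battery_pct=15.0, minutes_to_next_service=300),
    FleetVehicle("av-02", minutes_to_pickup=7.0, battery_pct=80.0, minutes_to_next_service=300),
]
print(select_vehicle(fleet).vehicle_id)   # av-02: the nearer vehicle is excluded by low charge
```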
[0100] FIG. 17 is an example of a flow diagram to manage a fleet of autonomous vehicles, according to some embodiments. At 1702, flow 1700 begins. At 1702, policy data is received. The policy data may include parameters that define how best to select an autonomous vehicle for servicing a transit request. At 1704, fleet management data from a repository may be extracted. The fleet management data includes subsets of data for a pool of autonomous vehicles (e.g., the data describes the readiness of vehicles to service a transportation request). At 1706, data representing a transit request is received. For exemplary purposes, the transit request could be for transportation from a first geographic location to a second geographic location. At 1708, attributes based on the policy data are calculated to determine a subset of autonomous vehicles that are available to service the request. For example, attributes may include a battery charge level and time until next scheduled maintenance. At 1710, an autonomous vehicle is selected as transportation from the first geographic location to the second geographic location, and data is generated to dispatch the autonomous vehicle to a third geographic location associated with the origination of the transit request.
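Illustratively (with hypothetical policy parameters and data fields), a short Python sketch of determining the available subset of vehicles from policy attributes and generating dispatch data toward the trip's origination:

```python
def available_subset(fleet, policy):
    """Filter the pool to vehicles satisfying policy attributes
    (battery charge level and time until next scheduled maintenance)."""
    return [v for v in fleet
            if v["battery_pct"] >= policy["min_battery_pct"]
            and v["minutes_to_next_service"] >= policy["min_service_margin_min"]]

def dispatch_data(vehicle, pickup_location):
    """Generate data dispatching the selected vehicle to the transit request's origination."""
    return {"vehicle_id": vehicle["id"], "go_to": pickup_location}

fleet = [
    {"id": "av-01", "battery_pct": 18, "minutes_to_next_service": 240},
    {"id": "av-02", "battery_pct": 91, "minutes_to_next_service": 240},
]
policy = {"min_battery_pct": 30, "min_service_margin_min": 120}
candidates = available_subset(fleet, policy)
print(dispatch_data(candidates[0], pickup_location=(37.77, -122.41)))
# {'vehicle_id': 'av-02', 'go_to': (37.77, -122.41)}
```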
[0101] FIG. 18 is a diagram illustrating an autonomous vehicle fleet manager implementing an autonomous vehicle communications link manager, according to some embodiments. Diagram 1800 depicts an autonomous vehicle fleet manager that is configured to manage a fleet of autonomous vehicles 1830 transiting within a road network 1850 that coincides with a communication outage at an area identified as "reduced communication region" 1880. Autonomous vehicle fleet manager 1803 is coupled to a teleoperator 1808 via a teleoperator computing device 1804. Autonomous vehicle fleet manager 1803 is configured to receive policy data 1802 and environmental data 1806, as well as other data. Further to diagram 1800, an autonomous vehicle communications link manager 1820 is shown to include an environment event detector 1831, a policy adaption determinator 1832, and a transit request processor 1834. Environment event detector 1831 is configured to receive environmental data 1806 specifying a change within the environment in which autonomous vehicle service is implemented. For example, environmental data 1806 may specify that region 1880 has degraded communication services, which may affect the autonomous vehicle service. Policy adaption determinator 1832 may specify parameters to apply when receiving transit requests during such an event (e.g., during a loss of communications). Transit request processor 1834 is configured to process transit requests in view of the degraded communications. In this example, a user 1888 is requesting autonomous vehicle service. Further, transit request processor 1834 includes logic to apply an adapted policy for modifying the way autonomous vehicles are dispatched so as to avoid complications due to poor communications.
[0102] Communication event detector 1840 includes a policy download manager 1842 and communications-configured ("COMM-configured") AV dispatcher 1844. Policy download manager 1842 is configured to provide autonomous vehicles 1830 an updated policy in view of reduced communications region 1880, whereby the updated policy may specify routes to quickly exit region 1880 if an autonomous vehicle enters that region. For example, autonomous vehicle
1864 may receive an updated policy moments before driving into region 1880. Upon loss of communications, autonomous vehicle 1864 implements the updated policy and selects route 1866 to drive out of region 1880 quickly. COMM-configured AV dispatcher 1844 may be configured to identify points 1865 at which to park autonomous vehicles that are configured as relays for establishing a peer-to-peer network over region 1880. As such, COMM-configured AV dispatcher 1844 is configured to dispatch autonomous vehicles 1862 (without passengers) to park at locations
1865 for the purposes of operating as communication towers in a peer-to-peer ad hoc network.
[0103] FIG. 19 is an example of a flow diagram to determine actions for autonomous vehicles during an event, such as degraded or lost communications, according to some embodiments. At 1901, flow 1900 begins. Policy data is received, whereby the policy data defines parameters to apply to transit requests in a geographical region during an event. At 1902, one or more of the following actions may be implemented: (1) dispatch a subset of autonomous vehicles to geographic locations in the portion of the geographic region, the subset of autonomous vehicles being configured to either park at specific geographic locations and each serve as a static communication relay, or transit in a geographic region to each serve as a mobile communication relay, (2) implement peer-to-peer communications among a portion of the pool of autonomous vehicles associated with the portion of the geographic region, (3) provide to the autonomous vehicles an event policy that describes a route to egress the portion of the geographic region during an event, (4) invoke teleoperations, and (5) recalculate paths so as to avoid the geographic portion. Subsequent to implementing the action, the fleet of autonomous vehicles is monitored at 1914. [0104] FIG. 20 is a diagram depicting an example of a localizer, according to some embodiments. Diagram 2000 includes a localizer 2068 configured to receive sensor data from sensors 2070, such as LIDAR data 2072, camera data 2074, radar data 2076, and other data 2078. Further, localizer 2068 is configured to receive reference data 2020, such as 2D map data 2022, 3D map data 2024, and 3D local map data. According to some examples, other map data, such as 4D map data 2025 and semantic map data (not shown), including corresponding data structures and repositories, may also be implemented. Further to diagram 2000, localizer 2068 includes a positioning system 2010 and a localization system 2012, both of which are configured to receive sensor data from sensors 2070 as well as reference data 2020. Localization data integrator 2014 is configured to receive data from positioning system 2010 and data from localization system 2012, whereby localization data integrator 2014 is configured to integrate or fuse sensor data from multiple sensors to form local pose data 2052.
[0105] FIG. 21 is an example of a flow diagram to generate local pose data based on integrated sensor data, according to some embodiments. At 2101, flow 2100 begins. At 2102, reference data is received, the reference data including three dimensional map data. In some examples, reference data, such as 3D or 4D map data, may be received via one or more networks. At 2104, localization data from one or more localization sensors is received into a localization system. At 2106, positioning data from one or more positioning sensors is received into a positioning system. At 2108, the localization and positioning data are integrated. At 2110, the integrated localization data and positioning data are used to form local position data specifying a geographic position of an autonomous vehicle.
[0106] FIG. 22 is a diagram depicting another example of a localizer, according to some embodiments. Diagram 2200 includes a localizer 2268, which, in turn, includes a localization system 2210 and a relative localization system 2212 to generate positioning-based data 2250 and local location-based data 2251, respectively. Localization system 2210 includes a projection processor 2254a for processing GPS data 2273, a GPS datum 2211, and 3D Map data 2222, among other optional data (e.g., 4D map data). Localization system 2210 also includes an odometry processor 2254b to process wheel data 2275 (e.g., wheel speed), vehicle model data 2213 and 3D map data 2222, among other optional data. Further yet, localization system 2210 includes an integrator processor 2254c to process IMU data 2257, vehicle model data 2215, and 3D map data 2222, among other optional data. Similarly, relative localization system 2212 includes a LIDAR localization processor 2254d for processing LIDAR data 2272, 2D tile map data 2220, 3D map data 2222, and 3D local map data 2223, among other optional data. Relative localization system 2212 also includes a visual registration processor 2254e to process camera data 2274, 3D map data 2222, and 3D local map data 2223, among other optional data. Further yet, relative localization system 2212 includes a radar return processor 2254f to process radar data 2276, 3D map data 2222, and 3D local map data 2223, among other optional data. Note that in various examples, other types of sensor data and sensors or processors may be implemented, such as sonar data and the like.
[0107] Further to diagram 2200, localization-based data 2250 and relative localization-based data 2251 may be fed into data integrator 2266a and localization data integrator 2266, respectively. Data integrator 2266a and localization data integrator 2266 may be configured to fuse corresponding data, whereby localization-based data 2250 may be fused at data integrator 2266a prior to being fused with relative localization-based data 2251 at localization data integrator 2266. According to some embodiments, data integrator 2266a is formed as part of localization data integrator 2266, or is absent. Regardless, localization-based data 2250 and relative localization-based data 2251 can both be fed into localization data integrator 2266 for purposes of fusing data to generate local position data 2252. Localization-based data 2250 may include unary-constrained data (and uncertainty values) from projection processor 2254a, as well as binary-constrained data (and uncertainty values) from odometry processor 2254b and integrator processor 2254c. Relative localization-based data 2251 may include unary-constrained data (and uncertainty values) from localization processor 2254d and visual registration processor 2254e, and optionally from radar return processor 2254f. According to some embodiments, localization data integrator 2266 may implement non-linear smoothing functionality, such as a Kalman filter (e.g., a gated Kalman filter), a relative bundle adjuster, pose-graph relaxation, a particle filter, a histogram filter, or the like.
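Purely as a sketch, and not the disclosed non-linear smoother, the following Python example fuses several position estimates and their uncertainty values by inverse-variance weighting, which conveys the basic idea behind integrating estimates from multiple sources into a single local position; the estimate values and variances are hypothetical.

```python
def fuse_estimates(estimates):
    """Inverse-variance (information-weighted) fusion of independent position
    estimates; a simplified stand-in for Kalman-filter-style integration.

    estimates: list of (value, variance) pairs, e.g., an x coordinate from GPS
    projection, from odometry integration, and from LIDAR registration."""
    information = sum(1.0 / var for _, var in estimates)
    fused_value = sum(val / var for val, var in estimates) / information
    return fused_value, 1.0 / information       # fused value and fused variance

estimates = [(12.4, 4.0), (12.9, 1.0), (12.7, 0.25)]   # hypothetical measurements
x, var = fuse_estimates(estimates)
print(round(x, 3), round(var, 3))    # 12.724 0.19 -> more certain than any single source
```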
[0108] FIG. 23 is a diagram depicting an example of a perception engine, according to some embodiments. Diagram 2300 includes a perception engine 2366, which, in turn, includes a segmentation processor 2310, an object tracker 2330, and a classifier 2360. Further, perception engine 2366 is configured to receive local position data 2352, LIDAR data 2372, camera data 2374, and radar data 2376, for example. Note that other sensor data, such as sonar data, may be accessed to provide functionalities of perception engine 2366. Segmentation processor 2310 is configured to extract ground plane data and/or to segment portions of an image to distinguish objects from each other and from static imagery (e.g., background). In some cases, 3D blobs may be segmented to distinguish them from each other. In some examples, a blob may refer to a set of features that identify an object in a spatially-reproduced environment and may be composed of elements (e.g., pixels of camera data, points of laser return data, etc.) having similar characteristics, such as intensity and color. In some examples, a blob may also refer to a point cloud (e.g., composed of colored laser return data) or other elements constituting an object. Object tracker 2330 is configured to perform frame-to-frame estimations of motion for blobs, or other segmented image portions. Further, data association is used to associate a blob at one location in a first frame at time t1 to a blob in a different position in a second frame at time t2. In some examples, object tracker 2330 is configured to perform real-time probabilistic tracking of 3D objects, such as blobs. Classifier 2360 is configured to identify an object and to classify that object by classification type (e.g., as a pedestrian, cyclist, etc.) and by energy/activity (e.g., whether the object is dynamic or static), whereby data representing the classification is described by a semantic label. According to some embodiments, probabilistic estimations of object categories may be performed, such as classifying an object as a vehicle, bicyclist, pedestrian, etc., with varying confidences per object class. Perception engine 2366 is configured to determine perception engine data 2354, which may include static object maps and/or dynamic object maps, as well as semantic information so that, for example, a planner may use this information to enhance path planning. According to various examples, one or more of segmentation processor 2310, object tracker 2330, and classifier 2360 may apply machine learning techniques to generate perception engine data 2354.
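As an illustrative sketch of per-class probabilistic classification (not the disclosed classifier), the following Python snippet converts hypothetical per-class scores into confidences via a softmax and attaches them to a semantic label:

```python
import math

def classify(scores):
    """Convert raw per-class scores into a probability per classification type
    (softmax), returning the top label and its confidence."""
    exps = {label: math.exp(s) for label, s in scores.items()}
    total = sum(exps.values())
    probs = {label: e / total for label, e in exps.items()}
    label = max(probs, key=probs.get)
    return label, probs[label], probs

# Hypothetical raw scores for one tracked blob.
label, confidence, probs = classify({"pedestrian": 2.1, "cyclist": 0.4, "vehicle": -1.0})
semantic_label = {"type": label, "confidence": round(confidence, 2), "dynamic": True}
print(semantic_label)   # {'type': 'pedestrian', 'confidence': 0.81, 'dynamic': True}
```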
[0109] FIG. 24 is an example of a flow chart to generate perception engine data, according to some embodiments. Flow chart 2400 begins at 2402, at which data representing a local position of an autonomous vehicle is retrieved. At 2404, localization data from one or more localization sensors is received, and features of an environment in which the autonomous vehicle is disposed are segmented at 2406 to form segmented objects. One or more portions of the segmented objects are tracked spatially at 2408 to form at least one tracked object having a motion (e.g., an estimated motion). At 2410, a tracked object is classified at least as either being a static object or a dynamic object. In some cases, a static object or a dynamic object may be associated with a classification type. At 2412, data identifying a classified object is generated. For example, the data identifying the classified object may include semantic information.
[0110] FIG. 25 is an example of a segmentation processor, according to some embodiments. Diagram 2500 depicts a segmentation processor 2510 receiving LIDAR data from one or more LIDARs 2572 and camera image data from one or more cameras 2574. Local pose data 2552, LIDAR data, and camera image data are received into meta spin generator 2521. In some examples, meta spin generator 2521 is configured to partition an image based on various attributes (e.g., color, intensity, etc.) into distinguishable regions (e.g., clusters or groups of a point cloud), at least two or more of which may be updated at the same time or about the same time. Meta spin data 2522 is used to perform object segmentation and ground segmentation at segmentation processor 2523, whereby both meta spin data 2522 and segmentation-related data from segmentation processor 2523 are applied to a scanned differencing processor 2513. Scanned differencing processor 2513 is configured to predict motion and/or relative velocity of segmented image portions, which can be used to identify dynamic objects at 2517. Data indicating objects with detected velocity at 2517 are optionally transmitted to the planner to enhance path planning decisions. Additionally, data from scanned differencing processor 2513 may be used to approximate locations of objects to form a mapping of such objects (as well as optionally identifying a level of motion). In some examples, an occupancy grid map 2515 may be generated. Data representing an occupancy grid map 2515 may be transmitted to the planner to further enhance path planning decisions (e.g., by reducing uncertainties). Further to diagram 2500, image camera data from one or more cameras 2574 are used to classify blobs in blob classifier 2520, which also receives blob data 2524 from segmentation processor 2523. Segmentation processor 2510 also may receive raw radar returns data 2512 from one or more radars 2576 to perform segmentation at a radar segmentation processor 2514, which generates radar-related blob data 2516. Further to FIG. 25, segmentation processor 2510 may also receive and/or generate tracked blob data 2518 related to radar data. Blob data 2516, tracked blob data 2518, data from blob classifier 2520, and blob data 2524 may be used to track objects or portions thereof. According to some examples, one or more of the following may be optional: scanned differencing processor 2513, blob classifier 2520, and data from radar 2576.
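For illustration only, a minimal Python sketch of building a coarse occupancy grid map from segmented object points; the grid size, cell size, and point coordinates are hypothetical.

```python
import numpy as np

def occupancy_grid(points, grid_size=(20, 20), cell_m=0.5):
    """Accumulate segmented object points (x, y in metres, vehicle at the grid
    centre) into a coarse occupancy grid for use in path planning."""
    grid = np.zeros(grid_size, dtype=np.uint8)
    half_x = grid_size[0] * cell_m / 2.0
    half_y = grid_size[1] * cell_m / 2.0
    for x, y in points:
        i = int((x + half_x) / cell_m)
        j = int((y + half_y) / cell_m)
        if 0 <= i < grid_size[0] and 0 <= j < grid_size[1]:
            grid[i, j] = 1                       # mark the cell as occupied
    return grid

grid = occupancy_grid([(1.2, 0.3), (1.4, 0.4), (-2.0, 3.1)])   # hypothetical points
print(int(grid.sum()), "occupied cells")                       # 2 occupied cells
```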
[0111] FIG. 26A is a diagram depicting examples of an object tracker and a classifier, according to various embodiments. Object tracker 2630 of diagram 2600 is configured to receive blob data 2516, tracked blob data 2518, data from blob classifier 2520, blob data 2524, and camera image data from one or more cameras 2676. Image tracker 2633 is configured to receive camera image data from one or more cameras 2676 to generate tracked image data, which, in turn, may be provided to data association processor 2632. As shown, data association processor 2632 is configured to receive blob data 2516, tracked blob data 2518, data from blob classifier 2520, blob data 2524, and tracked image data from image tracker 2633, and is further configured to identify one or more associations among the above-described types of data. Data association processor 2632 is configured to track, for example, various blob data from one frame to a next frame to, for example, estimate motion, among other things. Further, data generated by data association processor 2632 may be used by track updater 2634 to update one or more tracks, or tracked objects. In some examples, track updater 2634 may implement a Kalman filter, or the like, to form updated data for tracked objects, which may be stored online in track database ("DB") 2636. Feedback data may be exchanged via path 2699 between data association processor 2632 and track database 2636. In some examples, image tracker 2633 may be optional and may be excluded. Object tracker 2630 may also use other sensor data, such as radar or sonar, as well as any other types of sensor data, for example.
[0112] FIG. 26B is a diagram depicting another example of an object tracker according to at least some examples. Diagram 2601 includes an object tracker 2631 that may include structures and/or functions similar to those of similarly-named elements described in connection with one or more other drawings (e.g., FIG. 26A). As shown, object tracker 2631 includes an optional registration portion 2699 that includes a processor 2696 configured to perform object scan registration and data fusion. Processor 2696 is further configured to store the resultant data in 3D object database 2698. [0113] Referring back to FIG. 26A, diagram 2600 also includes classifier 2660, which may include a track classification engine 2662 for generating static obstacle data 2672 and dynamic obstacle data 2674, both of which may be transmitted to the planner for path planning purposes. In at least one example, track classification engine 2662 is configured to determine whether an obstacle is static or dynamic, as well as another classification type for the object (e.g., whether the object is a vehicle, pedestrian, tree, cyclist, dog, cat, paper bag, etc.). Static obstacle data 2672 may be formed as part of an obstacle map (e.g., a 2D occupancy map), and dynamic obstacle data 2674 may be formed to include bounding boxes with data indicative of velocity and classification type. Dynamic obstacle data 2674, at least in some cases, includes 2D dynamic obstacle map data.
[0114] FIG. 27 is an example of a front-end processor for a perception engine, according to some examples. Diagram 2700 includes a ground segmentation processor 2723a for performing ground segmentation, and an over segmentation processor 2723b for performing "over-segmentation," according to various examples. Processors 2723a and 2723b are configured to receive optionally colored LIDAR data 2775. Over segmentation processor 2723b generates data 2710 of a first blob type (e.g., a relatively small blob), which is provided to an aggregation classification and segmentation engine 2712 that generates data 2714 of a second blob type. Data 2714 is provided to data association processor 2732, which is configured to detect whether data 2714 resides in track database 2736. A determination is made at 2740 whether data 2714 of the second blob type (e.g., a relatively large blob, which may include one or more smaller blobs) is a new track. If so, a track is initialized at 2742; otherwise, the data is associated with the tracked object data stored in track database 2736, and the track may be extended or updated by track updater 2742. Track classification engine 2762 is coupled to track database 2736 to identify and update/modify tracks by, for example, adding, removing, or modifying track-related data.
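As a sketch of the initialize-or-update decision (not the disclosed data association processor), the following Python example associates an incoming blob centroid to the nearest stored track within a distance gate, updating that track, or otherwise initializing a new one; the gate value and data layout are assumptions.

```python
import math

class TrackDatabase:
    """Minimal stand-in for a track database: an incoming blob centroid is
    associated to the nearest existing track within a gate; otherwise a new
    track is initialized."""
    def __init__(self, gate_m=1.5):
        self.tracks = {}          # track_id -> last known centroid (x, y)
        self.gate_m = gate_m
        self._next_id = 0

    def associate(self, centroid):
        best_id, best_d = None, float("inf")
        for track_id, last in self.tracks.items():
            d = math.dist(centroid, last)
            if d < best_d:
                best_id, best_d = track_id, d
        if best_id is not None and best_d <= self.gate_m:
            self.tracks[best_id] = centroid       # extend/update an existing track
            return best_id, "updated"
        self._next_id += 1                        # new track is initialized
        self.tracks[self._next_id] = centroid
        return self._next_id, "initialized"

db = TrackDatabase()
print(db.associate((10.0, 2.0)))   # (1, 'initialized')
print(db.associate((10.4, 2.2)))   # (1, 'updated') -- within the 1.5 m gate
print(db.associate((30.0, 9.0)))   # (2, 'initialized')
```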
[0115] FIG. 28 is a diagram depicting a simulator configured to simulate an autonomous vehicle in a synthetic environment, according to various embodiments. Diagram 2800 includes a simulator 2840 that is configured to generate a simulated environment 2803. As shown, simulator 2840 is configured to use reference data 2822 (e.g., 3D map data and/or other map or route data, including RNDF data or similar road network data) to generate simulated geometries, such as simulated surfaces 2892a and 2892b, within simulated environment 2803. Simulated surfaces 2892a and 2892b may simulate walls or front sides of buildings adjacent to a roadway. Simulator 2840 may also use pre-generated or procedurally generated dynamic object data 2825 to simulate dynamic agents in a synthetic environment. An example of a dynamic agent is simulated dynamic object 2801, which is representative of a simulated cyclist having a velocity. The simulated dynamic agents may optionally respond to other static and dynamic agents in the simulated environment, including the simulated autonomous vehicle. For example, simulated object 2801 may slow down for other obstacles in simulated environment 2803 rather than follow a preset trajectory, thereby creating a more realistic simulation of actual dynamic environments that exist in the real world.
[0116] Simulator 2840 may be configured to generate a simulated autonomous vehicle controller 2847, which includes synthetic adaptations of a perception engine 2866, a localizer 2868, a motion controller 2862, and a planner 2864, each of which may have the functionalities described herein within simulated environment 2803. Simulator 2840 may also generate simulated interfaces ("I/F") 2849 to simulate the data exchanges with different sensor modalities and different sensor data formats. As such, simulated interface 2849 may simulate a software interface for packetized data from, for example, a simulated LIDAR sensor 2872. Further, simulator 2840 may also be configured to generate a simulated autonomous vehicle 2830 that implements simulated AV controller 2847. Simulated autonomous vehicle 2830 includes simulated LIDAR sensors 2872, simulated camera or image sensors 2874, and simulated radar sensors 2876. In the example shown, simulated LIDAR sensor 2872 may be configured to generate a simulated laser consistent with ray trace 2892, which causes generation of simulated sensor return 2891. Note that simulator 2840 may simulate the addition of noise or other environmental effects on sensor data (e.g., added diffusion or reflections that affect simulated sensor return 2891, etc.). Further yet, simulator 2840 may be configured to simulate a variety of sensor defects, including sensor failure, sensor miscalibration, intermittent data outages, and the like.
[0117] Simulator 2840 includes a physics processor 2850 for simulating the mechanical, static, dynamic, and kinematic aspects of an autonomous vehicle for use in simulating behavior of simulated autonomous vehicle 2830. For example, physics processor 2850 includes a contact mechanics module 2851 for simulating contact mechanics, a collision detection module 2852 for simulating the interaction between simulated bodies, and a multibody dynamics module 2854 to simulate interactions among simulated mechanical bodies.
[0118] Simulator 2840 also includes a simulator controller 2856 configured to control the simulation to adapt the functionalities of any synthetically-generated element of simulated environment 2803 to determine cause-effect relationships, among other things. Simulator 2840 includes a simulator evaluator 2858 to evaluate the performance of synthetically-generated elements of simulated environment 2803. For example, simulator evaluator 2858 may analyze simulated vehicle commands 2880 (e.g., simulated steering angles and simulated velocities) to determine whether such commands are an appropriate response to the simulated activities within simulated environment 2803. Further, simulator evaluator 2858 may evaluate interactions of a teleoperator 2808 with the simulated autonomous vehicle 2830 via teleoperator computing device 2804. Simulator evaluator 2858 may evaluate the effects of updated reference data 2827, including updated map tiles and route data, which may be added to guide the responses of simulated autonomous vehicle 2830. Simulator evaluator 2858 may also evaluate the responses of simulated AV controller 2847 when policy data 2829 is updated, deleted, or added. The above description of simulator 2840 is not intended to be limiting. As such, simulator 2840 is configured to perform a variety of different simulations of an autonomous vehicle relative to a simulated environment, which includes both static and dynamic features. For example, simulator 2840 may be used to validate changes in software versions to ensure reliability. Simulator 2840 may also be used to determine vehicle dynamics properties and for calibration purposes. Further, simulator 2840 may be used to explore the space of applicable controls and resulting trajectories so as to effect learning by self-simulation.
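For illustration only, a short Python sketch of an evaluator-style check that flags simulated vehicle commands falling outside hypothetical policy limits on steering angle and velocity:

```python
def evaluate_commands(commands, policy):
    """Flag simulated vehicle commands (steering angle in radians, velocity in
    m/s) that fall outside policy limits."""
    violations = []
    for step, (steering, velocity) in enumerate(commands):
        if abs(steering) > policy["max_steering_rad"]:
            violations.append((step, "steering", steering))
        if velocity > policy["max_velocity_mps"]:
            violations.append((step, "velocity", velocity))
    return violations

policy = {"max_steering_rad": 0.5, "max_velocity_mps": 15.0}
commands = [(0.1, 10.0), (0.6, 9.0), (0.2, 16.5)]
print(evaluate_commands(commands, policy))
# [(1, 'steering', 0.6), (2, 'velocity', 16.5)]
```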
[0119] FIG. 29 is an example of a flow chart to simulate various aspects of an autonomous vehicle, according to some embodiments. Flow chart 2900 begins at 2902, at which reference data including three dimensional map data is received into a simulator. Dynamic object data defining motion patterns for a classified object may be retrieved at 2904. At 2906, a simulated environment is formed based on at least three dimensional ("3D") map data and the dynamic object data. The simulated environment may include one or more simulated surfaces. At 2908, an autonomous vehicle is simulated that includes a simulated autonomous vehicle controller that forms part of the simulated environment. The autonomous vehicle controller may include a simulated perception engine and a simulated localizer configured to receive sensor data. At 2910, simulated sensor data are generated based on data for at least one simulated sensor return, and simulated vehicle commands are generated at 2912 to cause motion (e.g., vectored propulsion) by a simulated autonomous vehicle in a synthetic environment. At 2914, simulated vehicle commands are evaluated to determine whether the simulated autonomous vehicle behaved consistently with expected behaviors (e.g., consistently with a policy).
[0120] FIG. 30 is an example of a flow chart to generate map data, according to some embodiments. Flow chart 3000 begins at 3002, at which trajectory data is retrieved. The trajectory data may include trajectories captured over a duration of time (e.g., as logged trajectories). At 3004, at least localization data may be received. The localization data may be captured over a duration of time (e.g., as logged localization data). At 3006, a camera or other image sensor may be implemented to generate a subset of the localization data. As such, the retrieved localization data may include image data. At 3008, subsets of localization data are aligned to identify a global position (e.g., a global pose). At 3010, three dimensional ("3D") map data is generated based on the global position, and at 3012, the three dimensional map data is made available for implementation by, for example, a manual route data editor (e.g., including a manual road network data editor, such as an RNDF editor), an automated route data generator (e.g., including an automatic road network generator, including an automatic RNDF generator), a fleet of autonomous vehicles, a simulator, a teleoperator computing device, and any other component of an autonomous vehicle service.
[0121] FIG. 31 is a diagram depicting an architecture of a mapping engine, according to some embodiments. Diagram 3100 includes a 3D mapping engine that is configured to receive trajectory log data 3140, LIDAR log data 3172, camera log data 3174, radar log data 3176, and other optional logged sensor data (not shown). Logic 3141 includes a loop-closure detector 3150 configured to detect whether sensor data indicates a nearby point in space has been previously visited, among other things. Logic 3141 also includes a registration controller 3152 for aligning map data, including 3D map data in some cases, relative to one or more registration points. Further, logic 3141 provides data 3142 representing states of loop closures for use by a global pose graph generator 3143, which is configured to generate pose graph data 3145. In some examples, pose graph data 3145 may also be generated based on data from registration refinement module 3146. Logic 3144 includes a 3D mapper 3154 and a LIDAR self-calibration unit 3156. Further, logic 3144 receives sensor data and pose graph data 3145 to generate 3D map data 3120 (or other map data, such as 4D map data). In some examples, logic 3144 may implement a truncated signed distance function ("TSDF") to fuse sensor data and/or map data to form optimal three-dimensional maps. Further, logic 3144 is configured to include texture and reflectance properties. 3D map data 3120 may be released for usage by a manual route data editor 3160 (e.g., an editor to manipulate route data or other types of route or reference data), an automated route data generator 3162 (e.g., logic configured to generate route data or other types of road network or reference data), a fleet of autonomous vehicles 3164, a simulator 3166, a teleoperator computing device 3168, and any other component of an autonomous vehicle service. Mapping engine 3110 may capture semantic information from manual annotation or automatically-generated annotation, as well as from other sensors, such as sonar, or from an instrumented environment (e.g., smart stop-lights).
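Purely as an illustrative sketch of TSDF-style fusion (simplified to one dimension, with hypothetical voxel size, truncation distance, and observations), the following Python example maintains a weighted running average of signed distances from repeated range observations and recovers the fused surface position from the zero crossing:

```python
import numpy as np

def tsdf_update(tsdf, weights, surface_x, voxel_m=0.2, truncation_m=0.6):
    """Weighted-average update of a 1-D truncated signed distance field from a
    single range observation of a surface at surface_x metres from the sensor."""
    centers = (np.arange(tsdf.size) + 0.5) * voxel_m
    sdf = np.clip(surface_x - centers, -truncation_m, truncation_m)
    new_weights = weights + 1.0
    fused = (tsdf * weights + sdf) / new_weights        # running weighted mean
    return fused, new_weights

# Fuse two noisy observations of a wall roughly 2.0 m ahead of the sensor.
tsdf, weights = np.zeros(20), np.zeros(20)
for observation in (2.02, 1.98):
    tsdf, weights = tsdf_update(tsdf, weights, observation)

centers = (np.arange(20) + 0.5) * 0.2
i = int(np.argmax(tsdf < 0))                            # first voxel past the surface
x0, x1, d0, d1 = centers[i - 1], centers[i], tsdf[i - 1], tsdf[i]
print(round(x0 + d0 * (x1 - x0) / (d0 - d1), 3))        # ~2.0, the fused surface position
```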
[0122] FIG. 32 is a diagram depicting an autonomous vehicle application, according to some examples. Diagram 3200 depicts a mobile computing device 3203 including an autonomous service application 3240 that is configured to contact an autonomous vehicle service platform 3201 to arrange transportation of user 3202 via an autonomous vehicle 3230. As shown, autonomous service application 3240 may include a transportation controller 3242, which may be a software application residing on a computing device (e.g., a mobile phone 3203, etc.). Transportation controller 3242 is configured to receive, schedule, select, or perform operations related to autonomous vehicles and/or autonomous vehicle fleets for which a user 3202 may arrange transportation from the user's location to a destination. For example, user 3202 may open up an application to request vehicle 3230. The application may display a map and user 3202 may drop a pin to indicate their destination within, for example, a geo-fenced region. Alternatively, the application may display a list of nearby pre-specified pick-up locations, or provide the user with a text entry field in which to type a destination either by address or by name.
[0123] Further to the example shown, autonomous vehicle application 3240 may also include a user identification controller 3246 that may be configured to detect that user 3202 is in a geographic region, or vicinity, near autonomous vehicle 3230, as the vehicle approaches. In some situations, user 3202 may not readily perceive or identify autonomous vehicle 3230 as it approaches for use by user 3202 (e.g., due to various other vehicles, including trucks, cars, taxis, and other obstructions that are typical in city environments). In one example, autonomous vehicle 3230 may establish a wireless communication link 3262 (e.g., via a radio frequency ("RF") signal, such as WiFi or Bluetooth®, including BLE, or the like) for communicating and/or determining a spatial location of user 3202 relative to autonomous vehicle 3230 (e.g., using relative direction of RF signal and signal strength). In some cases, autonomous vehicle 3230 may detect an approximate geographic location of user 3202 using, for example, GPS data or the like. A GPS receiver (not shown) of mobile computing device 3203 may be configured to provide GPS data to autonomous vehicle service application 3240. Thus, user identification controller 3246 may provide GPS data via link 3260 to autonomous vehicle service platform 3201, which, in turn, may provide that location to autonomous vehicle 3230 via link 3261. Subsequently, autonomous vehicle 3230 may determine a relative distance and/or direction of user 3202 by comparing the user's GPS data to the vehicle's GPS-derived location.
[0124] Autonomous vehicle 3230 may also include additional logic to identify the presence of user 3202, such as logic configured to perform face detection algorithms to detect either user 3202 generally, or to specifically identify the identity (e.g., name, phone number, etc.) of user 3202 based on the user's unique facial characteristics. Further, autonomous vehicle 3230 may include logic to detect codes for identifying user 3202. Examples of such codes include specialized visual codes, such as QR codes, color codes, etc., specialized audio codes, such as voice activated or recognized codes, etc., and the like. In some cases, a code may be an encoded security key that may be transmitted digitally via link 3262 to autonomous vehicle 3230 to ensure secure ingress and/or egress. Further, one or more of the above-identified techniques for identifying user 3202 may be used as a secured means to grant ingress and egress privileges to user 3202 so as to prevent others from entering autonomous vehicle 3230 (e.g., to ensure third party persons do not enter an unoccupied autonomous vehicle prior to arriving at user 3202). According to various examples, any other means for identifying user 3202 and providing secured ingress and egress may also be implemented in one or more of autonomous vehicle service application 3240, autonomous vehicle service platform 3201, and autonomous vehicle 3230. [0125] To assist user 3202 in identifying the arrival of the requested transportation, autonomous vehicle 3230 may be configured to notify or otherwise alert user 3202 to the presence of autonomous vehicle 3230 as it approaches user 3202. For example, autonomous vehicle 3230 may activate one or more light-emitting devices 3280 (e.g., LEDs) in accordance with specific light patterns. In particular, certain light patterns are created so that user 3202 may readily perceive that autonomous vehicle 3230 is reserved to service the transportation needs of user 3202. As an example, autonomous vehicle 3230 may generate light patterns 3290 that may be perceived by user 3202 as a "wink," or other animation of its exterior and interior lights in such a visual and temporal way. The patterns of light 3290 may be generated with or without patterns of sound to identify to user 3202 that this vehicle is the one that they booked.
[0126] According to some embodiments, autonomous vehicle user controller 3244 may implement a software application that is configured to control various functions of an autonomous vehicle. Further, an application may be configured to redirect or reroute the autonomous vehicle during transit to its initial destination. Further, autonomous vehicle user controller 3244 may be configured to cause on-board logic to modify interior lighting of autonomous vehicle 3230 to effect, for example, mood lighting. Controller 3244 may also control a source of audio (e.g., an external source such as Spotify, or audio stored locally on the mobile computing device 3203), select a type of ride (e.g., modify desired acceleration and braking aggressiveness, modify active suspension parameters to select a set of "road-handling" characteristics to implement aggressive driving characteristics, including vibrations, or to select "soft-ride" qualities with vibrations dampened for comfort), and the like. For example, mobile computing device 3203 may be configured to control HVAC functions as well, like ventilation and temperature.
[0127] FIGs. 33 to 35 illustrate examples of various computing platforms configured to provide various functionalities to components of an autonomous vehicle service, according to various embodiments. In some examples, computing platform 3300 may be used to implement computer programs, applications, methods, processes, algorithms, or other software to perform the above-described techniques.
[0128] Note that various structures and/or functionalities of FIG. 33 are applicable to FIGs. 34 and 35, and, as such, some elements in those figures may be discussed in the context of FIG. 33.
[0129] In some cases, computing platform 3300 can be disposed in any device, such as a computing device 3390a, which may be disposed in one or more computing devices in an autonomous vehicle service platform, an autonomous vehicle 3391, and/or mobile computing device 3390b.
[0130] Computing platform 3300 includes a bus 3302 or other communication mechanism for communicating information, which interconnects subsystems and devices, such as processor 3304, system memory 3306 (e.g., RAM, etc.), storage device 3308 (e.g., ROM, etc.), an in-memory cache (which may be implemented in RAM 3306 or other portions of computing platform 3300), a communication interface 3313 (e.g., an Ethernet or wireless controller, a Bluetooth controller, NFC logic, etc.) to facilitate communications via a port on communication link 3321 to communicate, for example, with a computing device, including mobile computing and/or communication devices with processors. Processor 3304 can be implemented with one or more graphics processing units ("GPUs"), with one or more central processing units ("CPUs"), such as those manufactured by Intel® Corporation, or one or more virtual processors, as well as any combination of CPUs and virtual processors. Computing platform 3300 exchanges data representing inputs and outputs via input-and-output devices 3301, including, but not limited to, keyboards, mice, audio inputs (e.g., speech-to-text devices), user interfaces, displays, monitors, cursors, touch-sensitive displays, LCD or LED displays, and other I/O-related devices.
[0131] According to some examples, computing platform 3300 performs specific operations by processor 3304 executing one or more sequences of one or more instructions stored in system memory 3306, and computing platform 3300 can be implemented in a client-server arrangement, peer-to-peer arrangement, or as any mobile computing device, including smart phones and the like. Such instructions or data may be read into system memory 3306 from another computer readable medium, such as storage device 3308. In some examples, hard-wired circuitry may be used in place of or in combination with software instructions for implementation. Instructions may be embedded in software or firmware. The term "computer readable medium" refers to any tangible medium that participates in providing instructions to processor 3304 for execution. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media includes, for example, optical or magnetic disks and the like. Volatile media includes dynamic memory, such as system memory 3306.
[0132] Common forms of computer readable media include, for example, floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read. Instructions may further be transmitted or received using a transmission medium. The term "transmission medium" may include any tangible or intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such instructions. Transmission media includes coaxial cables, copper wire, and fiber optics, including wires that comprise bus 3302 for transmitting a computer data signal. [0133] In some examples, execution of the sequences of instructions may be performed by computing platform 3300. According to some examples, computing platform 3300 can be coupled by communication link 3321 (e.g., a wired network, such as LAN, PSTN, or any wireless network, including WiFi of various standards and protocols, Bluetooth®, NFC, Zig-Bee, etc.) to any other processor to perform the sequence of instructions in coordination with (or asynchronously to) one another. Computing platform 3300 may transmit and receive messages, data, and instructions, including program code (e.g., application code) through communication link 3321 and communication interface 3313. Received program code may be executed by processor 3304 as it is received, and/or stored in memory 3306 or other non-volatile storage for later execution.
[0134] In the example shown, system memory 3306 can include various modules that include executable instructions to implement functionalities described herein. System memory 3306 may include an operating system ("O/S") 3332, as well as an application 3336 and/or logic module(s) 3359. In the example shown in FIG. 33, system memory 3306 includes an autonomous vehicle ("AV") controller module 3350 and/or its components (e.g., a perception engine module, a localization module, a planner module, and/or a motion controller module), any of which, or one or more portions of which, can be configured to facilitate an autonomous vehicle service by implementing one or more functions described herein.
[0135] Referring to the example shown in FIG. 34, system memory 3306 includes an autonomous vehicle service platform module 3450 and/or its components (e.g., a teleoperator manager, a simulator, etc.), any of which, or one or more portions of which, can be configured to facilitate managing an autonomous vehicle service by implementing one or more functions described herein.
[0136] Referring to the example shown in FIG. 35, system memory 3306 includes an autonomous vehicle ("AV") module 3550 and/or its components for use, for example, in a mobile computing device. One or more portions of module 3550 can be configured to facilitate delivery of an autonomous vehicle service by implementing one or more functions described herein.
[0137] Referring back to FIG. 33, the structures and/or functions of any of the above-described features can be implemented in software, hardware, firmware, circuitry, or a combination thereof. Note that the structures and constituent elements above, as well as their functionality, may be aggregated with one or more other structures or elements. Alternatively, the elements and their functionality may be subdivided into constituent sub-elements, if any. As software, the above-described techniques may be implemented using various types of programming or formatting languages, frameworks, syntax, applications, protocols, objects, or techniques. As hardware and/or firmware, the above-described techniques may be implemented using various types of programming or integrated circuit design languages, including hardware description languages, such as any register transfer language ("RTL") configured to design field-programmable gate arrays ("FPGAs"), application-specific integrated circuits ("ASICs"), or any other type of integrated circuit. According to some embodiments, the term "module" can refer, for example, to an algorithm or a portion thereof, and/or logic implemented in either hardware circuitry or software, or a combination thereof. These can be varied and are not limited to the examples or descriptions provided.
[0138] In some embodiments, module 3350 of FIG. 33, module 3450 of FIG. 34, and module 3550 of FIG. 35, or one or more of their components, or any process or device described herein, can be in communication (e.g., wired or wirelessly) with a mobile device, such as a mobile phone or computing device, or can be disposed therein.
[0139] In some cases, a mobile device, or any networked computing device (not shown) in communication with one or more modules 3359 (module 3350 of FIG. 33, module 3450 of FIG.
34, and module 3550 of FIG. 35) or one or more of its components (or any process or device described herein), can provide at least some of the structures and/or functions of any of the features described herein. As depicted in the above-described figures, the structures and/or functions of any of the above-described features can be implemented in software, hardware, firmware, circuitry, or any combination thereof. Note that the structures and constituent elements above, as well as their functionality, may be aggregated or combined with one or more other structures or elements. Alternatively, the elements and their functionality may be subdivided into constituent sub-elements, if any. As software, at least some of the above-described techniques may be implemented using various types of programming or formatting languages, frameworks, syntax, applications, protocols, objects, or techniques. For example, at least one of the elements depicted in any of the figures can represent one or more algorithms. Or, at least one of the elements can represent a portion of logic including a portion of hardware configured to provide constituent structures and/or functionalities.
[0140] For example, module 3350 of FIG. 33, module 3450 of FIG. 34, and module 3550 of FIG. 35, or one or more of its components, or any process or device described herein, can be implemented in one or more computing devices (i.e., any mobile computing device, such as a wearable device, an audio device (such as headphones or a headset) or mobile phone, whether worn or carried) that include one or more processors configured to execute one or more algorithms in memory. Thus, at least some of the elements in the above-described figures can represent one or more algorithms. Or, at least one of the elements can represent a portion of logic including a portion of hardware configured to provide constituent structures and/or functionalities. These can be varied and are not limited to the examples or descriptions provided.
[0141] As hardware and/or firmware, the above-described structures and techniques can be implemented using various types of programming or integrated circuit design languages, including hardware description languages, such as any register transfer language ("RTL") configured to design field-programmable gate arrays ("FPGAs"), application-specific integrated circuits ("ASICs"), multi-chip modules, or any other type of integrated circuit.
[0142] For example, module 3350 of FIG. 33, module 3450 of FIG. 34, and module 3550 of FIG. 35, or one or more of its components, or any process or device described herein, can be implemented in one or more computing devices that include one or more circuits. Thus, at least one of the elements in the above-described figures can represent one or more components of hardware. Or, at least one of the elements can represent a portion of logic including a portion of a circuit configured to provide constituent structures and/or functionalities.
[0143] According to some embodiments, the term "circuit" can refer, for example, to any system including a number of components through which current flows to perform one or more functions, the components including discrete and complex components. Examples of discrete components include transistors, resistors, capacitors, inductors, diodes, and the like, and examples of complex components include memory, processors, analog circuits, digital circuits, and the like, including field-programmable gate arrays ("FPGAs") and application-specific integrated circuits ("ASICs"). Therefore, a circuit can include a system of electronic components and logic components (e.g., logic configured to execute instructions, such as a group of executable instructions of an algorithm, which, thus, is a component of a circuit). According to some embodiments, the term "module" can refer, for example, to an algorithm or a portion thereof, and/or logic implemented in either hardware circuitry or software, or a combination thereof (i.e., a module can be implemented as a circuit). In some embodiments, algorithms and/or the memory in which the algorithms are stored are "components" of a circuit. Thus, the term "circuit" can also refer, for example, to a system of components, including algorithms. These can be varied and are not limited to the examples or descriptions provided.
[0144] FIG. 36 is a diagram depicting a mapping engine configured to generate mapping data adaptively for autonomous vehicles responsive to changes in physical environments, according to some examples. Diagram 3600 depicts a mapping engine 3654 disposed in an autonomous vehicle service platform 3601 communicatively coupled via a communication layer (not shown) to one or more autonomous vehicles 3630. Mapping engine 3654 is configured to generate map data, and to modify map data adaptively responsive to changes in physical environments in which autonomous vehicle 3630 transits. In the example shown, mapping engine 3654 may generate mapping data based on sensor data received from autonomous vehicle 3630, which is depicted as having any number of sensors or sensor devices 3604a, 3604b, and 3604c, of a sensor type 3602a, a sensor type 3602b, and a sensor type 3602c, respectively. Autonomous vehicle 3630 may include any number of other sensors or sensor devices 3604n having any other sensor types 3602n. Sensors 3604a, 3604b, 3604c, and 3604n respectively generate sensor data 3607a, 3607b, 3607c, and 3607n, one or more of which may be received into mapping engine 3654 for generating map data 3659 (e.g., 2D, 3D, and/or 4D map data). Map data 3659 may be transmitted to autonomous vehicle 3630 for storage in map repository 3605a and for use in facilitating localization as well as other functionalities. In particular, autonomous vehicle 3630 may include a localizer (not shown) that uses map data in map repository 3605a to determine a location and/or local pose of the autonomous vehicle at any time, including during transit.
[0145] In view of the foregoing, the structures and/or functionalities of mapping engine 3654, as well as its components, can facilitate the generation of "self-healing" maps and map data by, for example, detecting variations in portions of map data over time, and generating updated maps (i.e., updated map data) that include the variations or changes to the physical environment in which autonomous vehicle 3630 travels. In some implementations, mapping engine 3654 may generate an adaptive three-dimensional model of a cityscape adjacent to networks of paths and roadways over which a fleet of autonomous vehicles travel. A 3D model of a portion of the cityscape can be derived by identifying data that represent surfaces (and other surface attributes, such as shape, size, texture, color, etc., of the surfaces) that constitute the facade or exterior surfaces of objects, such as buildings (including commercial signage), trees, guard rails, barriers, street lamps, traffic signs and signals, and any other physical feature that may be detected by sensors 3604a, 3604b, 3604c, and 3604n. Thus, mapping engine 3654 may be configured to detect an object (or the absence of the object) associated with a portion of map data, as well as changes in the object (e.g., changes in color, size, etc.), and may be further configured to incorporate changes to an object into map data to adaptively form (e.g., automatically) an updated portion of the map data. Therefore, an updated portion of map data may be stored in map repository 3605a so as to enhance, among other things, the accuracy of localization functions for autonomous vehicle 3630 (as well as other autonomous vehicle controller functions, including planning and the like).
[0146] In some cases, map data 3659 generated by mapping engine 3654 may be used in combination with locally-generated map data (not shown), as generated by a local map generator (not shown) in autonomous vehicle 3630. For example, an autonomous vehicle controller (not shown) may detect that one or more portions of map data in map repository 3605a vary from one or more portions of locally-generated map data. Logic in the autonomous vehicle controller can analyze the differences in map data (e.g., variation data) to identify a change in the physical environment (e.g., the addition, removal, or change in a static object). In a number of examples, the term "variation data" may refer to the differences between remotely-generated and locally-generated map data. Based on the changed portion of an environment, the autonomous vehicle controller may implement varying proportional amounts of map data in map repository 3605a and locally-generated map data to optimize localization. For example, an autonomous vehicle controller may generate hybrid map data composed of both remotely-generated map data and locally-generated map data to optimize the determination of the location or local pose of autonomous vehicle 3630. Further, an autonomous vehicle controller, upon detecting variation data, may cause transmission (at various bandwidths or data rates) of varying amounts of sensor-based data or other data to autonomous vehicle service platform 3601. For example, autonomous vehicle service platform 3601 may receive different types of data at different data rates based on, for instance, the criticality of receiving guidance from a teleoperator. As another example, subsets of sensor data 3607a, 3607b, 3607c, and 3607n may be transmitted (e.g., at appropriate data rates) to, for example, modify map data to form various degrees of updated map data in real-time (or near real-time), and to further perform one or more of the following: (1) evaluate and characterize differences in map data, (2) propagate updated portions of map data to other autonomous vehicles in the fleet, (3) generate a notification responsive to detecting map data differences to a teleoperator computing device, (4) generate a depiction of the environment (and the changed portion thereof), as sensed by various sensor devices 3604a, 3604b, 3604c, and 3604n, to display at any sufficient resolution in a user interface of a teleoperator computing device. Note that the above-described examples are not limiting, and any other map-related functionality for managing a fleet of autonomous vehicles may be implemented using mapping engine 3654 in view of detected changes in physical environments relative to map data.
[0147] According to some examples, sensor type 3602a, sensor type 3602b, and sensor type 3602c may include laser-based sensors, image-based sensors, and radar-based sensors, respectively. As such, sensors 3604a, 3604b, and 3604c may include Lidars, cameras, and radar devices, respectively. As shown in diagram 3600, multiple sensor devices (e.g., Lidars) 3604a each generate different laser-based sensed data 3607a at a geographic location. For example, each Lidar 3604a may be disposed at different locations on autonomous vehicle 3630, and each may be oriented differently (see FIGs. 3A and 3C, both of which depict different Lidars having different views and sensor fields). Given the directional nature of projecting laser beams, different laser returns of different Lidars 3604a may return from a common point (or a common set of points associated with, for example, a traffic sign) at different times. Mapping engine 3654 and/or components of autonomous vehicle services platform 3601 may be configured to align, map, transform, or correlate the laser returns of different Lidars 3604a for common points of laser returns from a surface in the environment. Mapping engine 3654 and/or components of autonomous vehicle services platform 3601 may also process sensor data 3607b and sensor data 3607c similarly.
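By way of non-limiting illustration, the following Python sketch shows one way laser returns from differently mounted Lidars may be brought into a common vehicle frame before being correlated to common points: each sensor's returns are transformed by a per-sensor extrinsic calibration. The extrinsic matrices, point values, and function name are illustrative assumptions and are not drawn from the figures.

```python
import numpy as np

def to_vehicle_frame(points_sensor: np.ndarray, extrinsic: np.ndarray) -> np.ndarray:
    """Transform Nx3 Lidar returns from a sensor frame into a common vehicle frame.

    `extrinsic` is a 4x4 homogeneous transform (sensor -> vehicle), assumed to
    come from an offline calibration step.
    """
    n = points_sensor.shape[0]
    homogeneous = np.hstack([points_sensor, np.ones((n, 1))])   # Nx4 homogeneous points
    return (extrinsic @ homogeneous.T).T[:, :3]                 # back to Nx3 in vehicle frame

# Two Lidars observing the same traffic sign report it in their own frames;
# after applying each sensor's extrinsic, the returns land near a common point.
lidar_a_points = np.array([[10.0, 0.5, 1.2]])
lidar_b_points = np.array([[9.8, -0.4, 1.1]])
T_a = np.eye(4); T_a[:3, 3] = [1.0, 0.6, 1.8]    # assumed mounting offsets
T_b = np.eye(4); T_b[:3, 3] = [1.2, -0.5, 1.9]
common_a = to_vehicle_frame(lidar_a_points, T_a)
common_b = to_vehicle_frame(lidar_b_points, T_b)
```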
[0148] In some examples, one or more sensors 3604n may include various different sensor types ("n") 3602n to generate various different subsets of sensor data 3607n. Examples of sensors 3604n include positioning sensors, such as one or more global positioning system ("GPS") data receiver-sensors, one or more inertial measurement units ("IMUs"), one or more odometry sensors (e.g., wheel encoder sensors, wheel speed sensors, and the like), one or more wheel angle sensors, and the like to provide autonomous vehicle position and pose data. Such pose data may include one or more coordinates (e.g., an x-coordinate, a y-coordinate, and/or a z-coordinate), a yaw value, a roll value, a pitch value (e.g., an angle value), a rate (e.g., velocity), an altitude, and the like.
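The position and pose data enumerated above may be grouped into a single record per sample. The following minimal sketch (in Python) assumes illustrative field names and units; it is not a depiction of any particular implementation described herein.

```python
from dataclasses import dataclass

@dataclass
class PoseData:
    """Illustrative container for the position and pose data described above."""
    x: float            # x-coordinate in a global or local frame (meters)
    y: float            # y-coordinate (meters)
    z: float            # z-coordinate / altitude (meters)
    roll: float         # rotation about the longitudinal axis (radians)
    pitch: float        # rotation about the lateral axis (radians)
    yaw: float          # heading about the vertical axis (radians)
    velocity: float     # rate of travel (meters per second)
    timestamp: float    # time at which the pose was sensed (seconds)

# Example: a pose assembled from GPS, IMU, and odometry readings (values are placeholders).
pose = PoseData(x=12.3, y=-4.1, z=0.2, roll=0.0, pitch=0.01, yaw=1.57,
                velocity=5.6, timestamp=1_699_000_000.0)
```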
[0149] A log data repository 3609 in autonomous vehicle services platform 3601 is configured to receive and store subsets of sensor data 3607a, 3607b, 3607c, and 3607n, which, in at least one example, include raw LIDAR data, raw camera data, raw radar data, and other raw sensor data, respectively. As shown in diagram 3600, subsets of sensor data 3607a, 3607b, and 3607c may be stored or logged at a common point in time or during a common interval of time as data set ("1") 3610a, data set ("2") 3610b, and data set ("n") 3610n, or as any number of data sets. Data sets 3610a, 3610b, and 3610n may be stored in data structures of log files, according to some examples. Further, sensor data 3607n, which may be sensed contemporaneously with subsets of sensor data 3607a, 3607b, and 3607c, may also be stored as part of log files for data sets 3610a, 3610b, and 3610n.
[0150] Alignment controller 3640 may be configured to receive one or more of sensor data 3607a, 3607b, 3607c, and 3607n, as well as other data 3603m. Alignment controller 3640 may also be configured to generate data representing aligned subsets of sensor data 3607a, 3607b, 3607c, and 3607n. In some cases, sensor data 3607 may include a subset of sensor data 3607n that includes positioning data (e.g., sensor data 3607n may include GPS, IMU, and odometry data). Regarding sensor data alignment, examples of data representing aligned subsets of sensor data include data representing at least aligned Lidar data and aligned camera data. According to some examples, alignment controller 3640 may be configured to implement a registration algorithm to align sensor data by identifying "registration" points at which to register portions or frames of Lidar sensor data and to register portions or frames of camera data. For example, alignment controller 3640 may map or relate laser returns from one Lidar to other Lidars, and may map or relate pixel data from one camera to other cameras. Further, alignment controller 3640 may generate positioning map data, which may be stored in a data structure based on a pose graph model in which data specifying individual poses (e.g., local poses) may be interrelated spatially based on positioning sensor data collected from sensors 3607n (e.g., GPS data, IMU data, odometry data, etc.).
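A pose graph of the kind referenced above may, for example, interrelate local poses with relative constraints derived from odometry, IMU, or registration results. The sketch below assumes a simple planar pose for brevity; the class and field names are illustrative only and are not drawn from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class PoseNode:
    """A local pose in the graph (planar pose assumed for brevity)."""
    pose_id: int
    x: float
    y: float
    yaw: float

@dataclass
class PoseEdge:
    """Spatial constraint between two poses, e.g., from odometry or a registration result."""
    from_id: int
    to_id: int
    dx: float
    dy: float
    dyaw: float

@dataclass
class PoseGraph:
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)

    def add_node(self, node: PoseNode) -> None:
        self.nodes[node.pose_id] = node

    def add_odometry_edge(self, from_id: int, to_id: int,
                          dx: float, dy: float, dyaw: float) -> None:
        self.edges.append(PoseEdge(from_id, to_id, dx, dy, dyaw))

# Two consecutive local poses interrelated by an odometry measurement.
graph = PoseGraph()
graph.add_node(PoseNode(0, 0.0, 0.0, 0.0))
graph.add_node(PoseNode(1, 1.0, 0.1, 0.02))
graph.add_odometry_edge(0, 1, dx=1.0, dy=0.1, dyaw=0.02)
```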
[0151] Mapping engine 3654 may be configured to receive the above-described aligned sensor data (e.g., registered sensor data) and positioning map data (e.g., pose graph-related data) to generate a high definition ("HD") three-dimensional model of a cityscape adjacent a network of roadways based on the integration of the subsets of sensor data 3607a, 3607b, 3607c, and 3607n. As shown in diagram 3600, mapping engine 3654 may include one or more of the following: an integrator 3651 to integrate sensor data, a calibrator 3652 to calibrate sensor data, a data change detector 3653 to detect changes in portions of map data, a tile generator 3656 to generate formatted map data, and a data change manager 3657 to manage implementation of changed map data, according to various examples.
[0152] Integrator 3651 may be configured to integrate multiple subsets of sensor data (e.g., of the same and different sensor modalities) to generate high-resolution (e.g., relatively high-resolution) imagery data as a 3D model of an environment in which autonomous vehicles travel, and may be further configured to reduce errors related to the individual types of sensors. According to some examples, integrator 3651 is configured to fuse sensor data (e.g., LIDAR data, camera data, radar data, etc.) to form integrated sensor data. Further, raw sensor data sets 3610a, 3610b, and 3610n may be received from one or more autonomous vehicles 3630 so as to fuse the aggregation of one or more subsets of sensor data of one or more sensor modalities from a fleet of autonomous vehicles 3630. By fusing data from raw sensor data sets 3610a, 3610b, and 3610n, integrator 3651 may generate 3D data sets that include fused sensor data, such as data set ("1") 3655a and data set ("2") 3655b. Integrator 3651 may integrate or otherwise fuse at least two types of sensor data, including the subsets of laser return data and the subsets of image data. In some examples, the fusing of laser and image data may include correlating pixel data of subsets of image data to subsets of laser return data. Optionally, integrator 3651 may associate pixel data of one or more pixels to one or more laser returns, whereby the laser data may be associated with a portion of a surface in the three-dimensional tile data. Note that pixel data may specify one or more surface characteristics including texture, color, reflectivity, transparency, etc. According to some examples, integrator 3651 may implement a Kalman filtering process or a variant thereof (e.g., an extended Kalman filtering process), or any other process in which to fuse sensor data. Integrator 3651 may also include logic to extract or otherwise determine surfaces of objects (e.g., buildings, trees, parked automobiles, etc.) or features, as well as surface characteristics, relative to a pose of an autonomous vehicle at which sensor data may be acquired.
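One hedged way to correlate pixel data with laser returns, as described above, is to project the returns into a camera image through an assumed pinhole model and sample the underlying pixels. The intrinsic matrix, placeholder image, and function name below are illustrative assumptions, not a description of integrator 3651 itself.

```python
import numpy as np

def colorize_lidar_points(points_cam: np.ndarray, image: np.ndarray, K: np.ndarray):
    """Attach RGB pixel data to 3D points expressed in the camera frame.

    points_cam : Nx3 array of points in front of the camera (z > 0).
    image      : HxWx3 RGB image.
    K          : 3x3 pinhole intrinsic matrix (assumed to come from calibration).
    Returns (points, colors) for points that project inside the image.
    """
    z = points_cam[:, 2]
    uv = (K @ points_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                      # perspective divide
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    h, w, _ = image.shape
    valid = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    return points_cam[valid], image[v[valid], u[valid]]

# Placeholder calibration and camera frame; the sampled colors become surface
# characteristics (e.g., color, texture) attached to the fused 3D data.
K = np.array([[700.0, 0.0, 320.0], [0.0, 700.0, 240.0], [0.0, 0.0, 1.0]])
image = np.zeros((480, 640, 3), dtype=np.uint8)
points = np.array([[1.0, 0.5, 10.0], [-0.2, 0.1, 5.0]])
pts, colors = colorize_lidar_points(points, image, K)
```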
[0153] Integrator 3651 may be configured to use sensor data sets 3655a and 3655b to extract surface-related data of physical objects in an environment of an autonomous vehicle. Data set 3655a and data set 3655b, as well as others not shown, may include fused sensor data representing a three-dimensional model relative to different points in time or different intervals of time. Therefore, data sets 3655 may be used to detect whether there are changes to the physical environment, or portions thereof, over time. Note that, at least in some implementations, integrator 3651 may also implement a distance transform, such as a signed distance function ("SDF"), to determine one or more surfaces external to an autonomous vehicle. In one example, a truncated signed distance function ("TSDF"), or equivalent, may be implemented to identify one or more points on the surface relative to a reference point (e.g., one or more distances to points on a surface of an external object relative to a local pose).
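As a non-limiting illustration of the truncated signed distance function noted above, the following sketch computes TSDF samples for voxels along a single sensor ray; the truncation band and sample spacing are assumed values, and the zero crossing of the samples locates the surface relative to the sensor origin.

```python
import numpy as np

def tsdf_value(voxel_depth: float, surface_depth: float, truncation: float) -> float:
    """Truncated signed distance of a voxel sample along a sensor ray.

    voxel_depth   : distance from the sensor origin to the voxel along the ray.
    surface_depth : measured distance to the surface along the same ray (e.g., a laser return).
    truncation    : truncation band; distances are clipped to [-truncation, +truncation].
    Positive values lie in free space in front of the surface; negative values lie behind it.
    """
    sdf = surface_depth - voxel_depth
    return float(np.clip(sdf, -truncation, truncation))

# Voxels sampled every 0.2 m along a ray whose laser return lies at 4.0 m.
samples = [tsdf_value(d, surface_depth=4.0, truncation=0.5)
           for d in np.arange(3.0, 5.0, 0.2)]
# The sign change in `samples` marks the surface crossing relative to the local pose.
```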
[0154] Integrator 3651 may be configured to generate 3D models of a cityscape (or any external object feature) as probabilistic maps, whereby map data may represent probability distributions over one or more environmental properties. For example, the probabilistic map may be formed using laser intensity (e.g., average laser intensity or reflectivity) and variances of infrared remittance value at distances or points in space relative to a pose of an autonomous vehicle. A data structure for storing map data may include a number of cells that include, for example, an intensity average value and a variance value. In some examples, this or any other data structure may also include a number of cells for storing 3D map data, such as color data (e.g., RGB values or other color space values), texture data, reflectance data, or any other surface characteristic or attribute data (e.g., specular data). A cell that is configured to store map-related data may be implemented as a voxel or as a 3D tile, according to some examples.
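A cell of such a probabilistic map may, for example, accumulate a running average and variance of laser intensity. The sketch below uses Welford's online update as one possible realization; the field names are illustrative and are not drawn from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class MapCell:
    """Illustrative cell (e.g., a voxel or 3D tile) storing intensity statistics."""
    count: int = 0
    intensity_mean: float = 0.0
    _m2: float = 0.0          # running sum of squared deviations (for the variance)

    def update(self, intensity: float) -> None:
        """Welford's online update of the running mean and variance."""
        self.count += 1
        delta = intensity - self.intensity_mean
        self.intensity_mean += delta / self.count
        self._m2 += delta * (intensity - self.intensity_mean)

    @property
    def intensity_variance(self) -> float:
        return self._m2 / self.count if self.count > 0 else 0.0

# Laser intensity / remittance samples accumulated into one cell over several passes.
cell = MapCell()
for reading in (0.42, 0.45, 0.40, 0.44):
    cell.update(reading)
print(cell.intensity_mean, cell.intensity_variance)
```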
[0155] Mapping engine 3654 and/or integrator 3651, as well as other components of mapping engine 3654, may be configured to generate 3D map data in an "offline" mode of operation. For example, mapping engine 3654 may implement algorithms (e.g., machine learning, including deep-learning algorithms) that analyze data sets 3655 based on logged data sets (e.g., static data) to generate map data. Note, however, that mapping engine 3654 need not be limited to offline map generation, but may also implement "online" map generation techniques in which one or more portions of raw sensor data may be received in real-time (or nearly real-time) to generate map data or identify changes thereto. Mapping engine 3654 may implement logic configured to perform simultaneous localization and mapping ("SLAM") or any suitable mapping technique.
[0156] Data change detector 3653 is configured to detect changes in data sets 3655a and 3655b, which are examples of any number of data sets of 3D map data. Data change detector 3653 also is configured to generate data identifying a portion of map data that has changed, as well as optionally identifying or classifying an object associated with the changed portion of map data. In the example shown, a number of data sets, including data set 3655a, includes map data used to generate a map, which is conceptually depicted as 3D model data 3660 (e.g., a roadway at time T1, including portions of map data 3664). At time T2, however, data change detector 3653 may detect that another number of data sets, including data set 3655b, includes data representing the presence of external objects in portions of map data 3665 of 3D model data 3661, whereby portions of map data 3665 coincide with portions of map data 3664 at different times. Therefore, data change detector 3653 may detect changes in map data, and may further adaptively modify map data to include the changed map data (e.g., as updated map data). [0157] According to some examples, data change detector 3653 is configured to perform one or more statistical change detection algorithms to detect changes in physical environments. Multi-temporal analysis techniques or other suitable algorithms may also be used. The structures of data sets 3655a and 3655b may be implemented as cumulative data structures with which to index sensor data (e.g., measurements thereof) stored in a 3D map data structure. As an example, a statistical change detection algorithm may be configured to detect portions of map data that change by identifying boundaries over one or more iterations of a deep-learning computation. In particular, data change detector 3653 may be configured to detect boundaries of map data portions 3664 and 3665 over time, such as over two or more data sets (e.g., over one or more passes, or epochs, of application of data sets to statistical change detection algorithms or deep-learning algorithms). Epoch determination may also be applied to, for example, construct 4D maps and associated 4D map data. In some examples, data change detector 3653 may classify portions of map data, as well as an object therein, to identify whether an object is static or dynamic. In some cases, dynamic objects may be filtered out from map data generation.
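As one hedged stand-in for the statistical change detection described above, the following sketch flags map cells whose mean intensity at a later epoch deviates from an earlier epoch by more than an assumed number of standard deviations; the threshold and example values are illustrative only.

```python
import numpy as np

def detect_changed_cells(mean_t1: np.ndarray, var_t1: np.ndarray,
                         mean_t2: np.ndarray, threshold_sigma: float = 3.0) -> np.ndarray:
    """Return a boolean mask of map cells whose statistics changed between epochs.

    mean_t1, var_t1 : per-cell intensity mean and variance from data set 1 (time T1).
    mean_t2         : per-cell intensity mean from data set 2 (time T2).
    A cell is flagged when the new mean deviates from the old mean by more than
    `threshold_sigma` standard deviations of the earlier epoch.
    """
    sigma = np.sqrt(np.maximum(var_t1, 1e-6))
    return np.abs(mean_t2 - mean_t1) > threshold_sigma * sigma

# Cells covering a stretch of roadway; objects placed before T2 change a few cells.
mean_t1 = np.array([0.40, 0.41, 0.39, 0.40])
var_t1 = np.array([0.001, 0.001, 0.001, 0.001])
mean_t2 = np.array([0.40, 0.85, 0.88, 0.41])
changed = detect_changed_cells(mean_t1, var_t1, mean_t2)   # -> [False, True, True, False]
```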
[0158] Mapping engine 3654 is configured to provide map data 3659 to map data repository 3605a in reference data repository 3605. Mapping engine 3654 may be configured to apply the change in map data to form updated three-dimensional ("3D") map data as reference data for transmission to reference data stores (i.e., repositories) in a fleet of autonomous vehicles. The change in data may be representative of a state change of an environment at which various types of sensor data are sensed. The state change of the environment, therefore, may be indicative of a change in state of an object located therein (e.g., inclusion of data representing the presence or absence of one or more objects). In some examples, data change manager 3657 may be configured to identify or otherwise specify (e.g., via identifier or indicator data 3658) that a portion of map data includes changed map data 3658 (or an indication thereof). As shown, map data 3692 stored in map repository 3605a is associated with, or linked to, indication data ("delta data") 3694 that indicates that an associated portion of map data has changed. Further to the example shown, indication data 3694 may identify a set of traffic cones, as changed portions of map data 3665, disposed in a physical environment associated with 3D model 3661 through which an autonomous vehicle travels.
[0159] A tile generator 3656 may be configured to generate two-dimensional or three-dimensional map tiles based on map data from data sets 3655a and 3655b. The map tiles may be transmitted for storage in map repository 3605a. Tile generator 3656 may generate map tiles that include indicator data for indicating that a portion of the map is an updated portion of map data. Further, an updated map portion may be incorporated into a reference data repository 3605 in an autonomous vehicle. Therefore, consider an example in which an autonomous vehicle 3630 travels through the physical environment and plans on traveling near a recently-added object (e.g., traffic cones) in an environment. A localizer (not shown) may access map data that is associated with a changed portion of map data (e.g., an updated portion of map data) to localize the autonomous vehicle. Upon detecting the performance of localization with an updated map version, logic may invoke additional processing to ensure that the updated map data may be used effectively and safely to navigate an autonomous vehicle 3630. For example, when a map tile including changed map data is accessed or implemented during localization, a request for teleoperator monitoring or assistance may be generated. Note that in some examples, changed portions of map data may also refer to temporary map data, as such data may be used in fewer situations than, for example, validated map data.
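The indicator data and teleoperator-request behavior described above might be sketched as follows; the tile fields, status strings, and request function are illustrative assumptions rather than a depiction of tile generator 3656 or the localizer itself.

```python
from dataclasses import dataclass

@dataclass
class MapTile:
    tile_id: str
    data: bytes                  # serialized 2D/3D tile payload (placeholder)
    is_updated: bool             # indicator that this tile contains changed map data
    status: str = "validated"    # "temporary" until validated, then "validated"

def request_teleoperator(reason: str, level: str) -> None:
    """Placeholder for sending a monitoring/assistance request to the platform."""
    print(f"teleoperator request [{level}]: {reason}")

def localize_with_tile(tile: MapTile, request) -> None:
    """Use a tile for localization; escalate when changed map data is involved."""
    if tile.is_updated or tile.status == "temporary":
        request(reason=f"localizing against changed map tile {tile.tile_id}",
                level="monitor")
    # ... localization against tile.data would proceed here ...

tile = MapTile(tile_id="tile-042", data=b"", is_updated=True, status="temporary")
localize_with_tile(tile, request_teleoperator)
```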
[0160] Note, however, that changed portions of map data may also be validated for integration into map data, whereby the status of the changed map data is transitioned from "temporary" to "validated." To illustrate an example of validating such data, consider that a change in map data may be exported, as updated three-dimensional map data, to a simulator computing device. The simulator computing device may then simulate performance of a portion of the fleet of autonomous vehicles in a simulated environment based on the updated three-dimensional map data. Upon validating the updated three-dimensional map data, the changed map portions may be incorporated to form new three-dimensional map data. "New" three-dimensional map data may be viewed as three-dimensional map data that may be relied upon such that indications of changed map data (i.e., indications of changed map data 3694) may be removed, along with the invocation of requests (e.g., automatic requests) for teleoperator assistance.
[0161] According to some examples, mapping engine 3654 may include, or be implemented as, a 3D mapping engine and/or a mapper as shown in FIG. 31. Further, components of mapping engine 3654 may be combined or otherwise distributed within or without mapping engine 3654. Mapping engine 3654 and any of its components may be implemented in hardware or software, or a combination thereof. Moreover, mapping engine 3654 may include any functionality and/or structure described herein, including one or more components of a perception engine to perform object detection, segmentation, and/or classification.
[0162] As a further example, consider that alignment controller 3640 may include one or more components of mapping engine 3110 of FIG. 31. For example, alignment controller 3640 may include a loop-closure detector 3150, a registration controller 3152, a global pose generator 3143, and a registration refinement module 3146. In the example shown in FIG. 36, autonomous vehicle service platform 3601 may implement, as part of alignment controller 3640, loop-closure detector 3150 of FIG. 31 that may be configured to detect one or more portions of pose graphs at which autonomous vehicle 3630 of FIG. 36 has previously traversed (e.g., loop-closure detector 3150 of FIG. 31 may perform one or more loop-closure processes to identify a closed loop). Registration controller 3152 may be configured to align or register multiple portions or frames of the same or different sensor data. For example, one or more data sets of image data may be transformed or otherwise mapped to each other, as well as to one or more data sets of laser return data and/or radar return data. Registration controller 3152 may be configured to align subsets of laser return data, subsets of image data, and the like based on trajectory data representing position data to identify a relative coordinate of the global coordinate system. Examples of trajectory data include GPS data, IMU data, odometry data, etc. Global pose graph generator 3143 may be configured to generate pose graph data 3145 to specify a pose of autonomous vehicle 3630 of FIG. 36 relative to a global coordinate system. Therefore, locally-detected poses of a pose graph may be referenced to a global coordinate system. For example, global pose graph generator 3143 of FIG. 31 may be configured to form a global pose graph referenced to a global coordinate system. A global pose graph may be formed based on a first type of sensor data (e.g., subsets of laser return data) and a second type of sensor data (e.g., subsets of image data), as well as other optional sensor data (e.g., subsets of radar data). Further, global pose graph generator 3143 may also be configured to align the subsets of laser return data and the subsets of image data to a location relative to a coordinate of a global coordinate system. Registration refinement module 3146 is configured to refine the registration of one or more of captured image data, captured laser return data, or other captured sensor data, such as radar data and the like. In some examples, registration refinement module 3146 is configured to reduce or eliminate artifacts of map data (e.g., blurring artifacts or the like) subsequent to, for example, the projection of color data onto 3D mapped surfaces.
[0163] FIG. 37 is a diagram depicting an example of an autonomous vehicle controller implementing updated map data, according to some examples. Diagram 3700 depicts a mapping engine 3754 configured to generate map data 3759, which may be implemented as three-dimensional map tiles. In the example shown, map data 3759 may also include changed map data 3758 that either includes a portion of changed map data (e.g., updated portions of map data for use with unchanged portions of map data) or an indication (e.g., indicator data or a pointer) that identifies an updated portion of changed map data, or both. Further to diagram 3700, an autonomous vehicle service platform 3701 may be configured to transmit map data 3786 and changed map data 3788 via network 3702. An autonomous vehicle controller 3747 uses map data 3786 and/or changed map data 3788 to localize autonomous vehicle 3730. In some examples, autonomous vehicle controller 3747 may detect that changed map data 3788 is being accessed during localization. In turn, autonomous vehicle controller 3747 may generate teleoperator request data 3770 to request teleoperator assistance. Teleoperator request data 3770 may also be configured to request that a teleoperator at least monitor performance of autonomous vehicle 3730 during localization in which updated portions of map data are accessed or implemented (or when autonomous vehicle 3730 approaches or travels near a physical location associated with an updated portion of map data). [0164] In some examples, mapping data generated by mapping engine 3754 may be used to generate other reference data, such as route data (e.g., road network data, such as RNDF-like data), mission data (e.g., MDF-like data), and other reference data that may be used to navigate a fleet of autonomous vehicles. As shown, route data generator 3780 may be configured to generate route data 3782 based on unchanged and/or validated map data. Further, route data generator 3780 may be configured to generate changed route data 3784, which may be generated using changed and/or non-validated map data. In some cases, autonomous vehicle controller 3747 may generate teleoperator request data 3770 responsive to detecting the use of changed route data 3784. Therefore, changed route data 3784 (e.g., non-validated or temporary route data) may be used to navigate an autonomous vehicle, with or without assistance of guidance data generated by a teleoperator.
[0165] FIG. 38 is a flow chart illustrating an example of generating map data, according to some examples. Flow 3800 begins at 3802. Subsets of multiple types of sensor data are accessed at 3802 (e.g., in a data store or repository that may include log files). The subsets of multiple types of sensor data may correspond to groups of multiple sensors or sensor devices. For example, subsets of LIDAR sensor data may correspond to a group of different LIDAR sensors from which laser return data is received. At 3804, sensor data may be aligned relative to a global coordinate system to form aligned sensor data. For example, a registration process or algorithm may be configured to align or register the sensor data. At 3806, data sets of three-dimensional map data may be generated based on the aligned sensor data. At 3808, a change in map data may be detected relative to at least two data sets of three-dimensional map data. A change in map data may be applied at 3810 to form updated three-dimensional map data. The one or more updated portions of 3D map data may be formatted, as reference data, for transmission to one or more vehicles in a fleet of autonomous vehicles. At 3812, updated (e.g., changed) three-dimensional map data may be transmitted to at least one autonomous vehicle. Note that the order depicted in this and other flow charts herein is not intended to imply a requirement to linearly perform various functions, as each portion of a flow chart may be performed serially or in parallel with any one or more other portions of the flow chart, as well as independently of or dependent on other portions of the flow chart.
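For illustration only, flow 3800 may be outlined as a pipeline in which each numbered operation is a caller-supplied stage; the stage names and placeholder implementations below are hypothetical and do not correspond to any specific component in the figures.

```python
def update_map_pipeline(access_sensor_logs, align, build_map_data_sets,
                        detect_change, apply_change, transmit_to_fleet):
    """Skeleton of flow 3800; each argument is a caller-supplied stage (hypothetical)."""
    raw = access_sensor_logs()                    # 3802: subsets of multiple sensor types
    aligned = align(raw)                          # 3804: align to a global coordinate system
    data_sets = build_map_data_sets(aligned)      # 3806: data sets of 3D map data
    delta = detect_change(data_sets)              # 3808: change across at least two data sets
    updated_map = apply_change(data_sets, delta)  # 3810: updated 3D map data as reference data
    transmit_to_fleet(updated_map)                # 3812: send to at least one autonomous vehicle
    return updated_map

# Trivial placeholder stages, shown only to make the orchestration runnable.
result = update_map_pipeline(
    access_sensor_logs=lambda: ["raw-lidar", "raw-camera"],
    align=lambda raw: ["aligned-" + name for name in raw],
    build_map_data_sets=lambda aligned: ["data-set-1", "data-set-2"],
    detect_change=lambda sets: {"changed_portion": "cells 12-14"},
    apply_change=lambda sets, delta: "updated-3d-map",
    transmit_to_fleet=print)
```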
[0166] FIG. 39 is a diagram depicting an example of a localizer configured to implement map data and locally-generated map data, according to some examples. According to various examples, localizer 3968 of an autonomous vehicle ("AV") controller 3947 may be configured to generate local pose data 3920 based on either locally-generated map data 3941 or map data 3943, or a combination thereof. Local pose data 3920 may include data describing a local position of an autonomous vehicle 3930, and map data 3943 may be generated at mapping engine 3954 of an autonomous vehicle service platform 3901. Therefore, localizer 3968 may use map data 3943 to perform localization in view of changes, deviations, or variances between the locally-generated map data 3941 and map data 3943.
[0167] Diagram 3900 depicts an autonomous vehicle 3930, which includes autonomous vehicle controller 3947, a local map generator 3940, and a reference data repository 3905. Diagram 3900 also depicts an autonomous vehicle service platform 3901 including a mapping engine 3954 and a teleoperator computing device 3904. Reference data repository 3905 includes a map store 3905a configured to store three-dimensional map data 3943 and a route data store 3905b, which may be a data repository for storing route data (e.g., with or without an indication that a portion of route data, or road network data, is associated with changed road network data or updated road network data).
[0168] Local map generator 3940 may be configured to receive multiple amounts and types of sensor data, such as sensor data from sensor types 3902a, 3902b, and 3902c. According to various examples, local map generator 3940 may be configured to generate map data (e.g., three-dimensional map data) locally in real-time (or nearly in real-time) based on sensor data from sensor types 3902a, 3902b, and 3902c (e.g., from groups of LIDAR sensors, groups of cameras, groups of radars, etc.). Local map generator 3940 may implement logic configured to perform simultaneous localization and mapping ("SLAM") or any suitable mapping technique. In at least some examples, local map generator 3940 may implement "online" map generation techniques in which one or more portions of raw sensor data from sensor types 3902a to 3902c may be received in real-time (or nearly real-time) to generate map data (or identify changes thereto) with which to navigate autonomous vehicle 3930. Local map generator 3940 may also implement a distance transform, such as a signed distance function ("SDF"), to determine surfaces external to an autonomous vehicle. In one example, a truncated signed distance function ("TSDF"), or equivalent, may be implemented to identify one or more points on a surface relative to a reference point (e.g., one or more distances to points on a surface of an external object), whereby the TSDF function may be used to fuse sensor data and surface data to form three-dimensional local map data 3941.
[0169] Localizer 3968 may be configured to receive sensor data, as well as locally-generated map data 3941 and map data 3943, to localize autonomous vehicle 3930 relative to a coordinate of the global coordinate system associated with three-dimensional map data 3943 (or any other reference data). Also, localizer 3968 is shown to include a variant detector 3969a and a hybrid map selection controller 3969b. Variant detector 3969a is configured to compare locally-generated map data 3941 to map data 3943 to determine whether portions of map data associated with a specific surface or point in space vary. In particular, variant detector 3969a may detect that data (e.g., variance data) representing one or more map portions of local map data 3941 varies from the three-dimensional map data 3943. [0170] Localizer 3968, upon detecting varying map data portions or variance data, may be configured to localize autonomous vehicle 3930 using hybrid map data from locally-generated map data 3941 and map data 3943. In the example shown, hybrid map selection controller 3969b is configured to control whether locally-generated map data 3941 or map data 3943, or a combination thereof, may be used for localization. According to some examples, different amounts of locally-generated map data 3941 and map data 3943 may be used based on, for example, corresponding probability distributions that may indicate the reliability or accuracy of each. In some examples, hybrid map selection controller 3969b may be configured to characterize the difference between the one or more map portions of map data 3943 and one or more portions of local map data 3941 to form variation data. Based on the variation data, hybrid map selection controller 3969b may be configured to determine priorities of using local map data 3941 and priorities of using map data 3943, and may be further configured to cause localizer 3968 to use a first prioritized amount of local map data 3941 and a second prioritized amount of three-dimensional map data 3943, based on the variation data. For example, consider a case in which variant detector 3969a detects variance data for several portions of map data 3943 that vary from corresponding portions of local map data 3941. Further consider that local map data 3941 is determined to be more accurate for most portions of variance data. However, at least one portion of local map data 3941 has a relatively lower probability of being accurate than a corresponding portion of map data 3943. In this case, hybrid map selection controller 3969b may rely more on local map data 3941 for localization (with some reliance on map data 3943), but may also rely more on a specific portion of map data 3943 (e.g., having a higher priority) for localization than the corresponding portion of local map data 3941 (e.g., having a lower priority).
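The prioritized, per-portion blending of local and platform map data described above might be sketched as follows; the confidence values, portion identifiers, and the 0.5 decision threshold are illustrative assumptions rather than a description of hybrid map selection controller 3969b.

```python
def blend_map_portions(local_portions: dict, remote_portions: dict,
                       local_confidence: dict) -> dict:
    """Choose, per map portion, whether to localize against locally-generated
    map data or platform-generated map data, based on a per-portion confidence.

    local_confidence[portion_id] is an assumed probability (0..1) that the
    local map is more accurate than the remote map for that portion.
    """
    hybrid = {}
    for portion_id in remote_portions:
        confidence = local_confidence.get(portion_id, 0.0)
        if portion_id in local_portions and confidence >= 0.5:
            hybrid[portion_id] = ("local", local_portions[portion_id])
        else:
            hybrid[portion_id] = ("remote", remote_portions[portion_id])
    return hybrid

# Most portions favor the local map, but portion "p3" is more reliable remotely.
remote = {"p1": "tile-r1", "p2": "tile-r2", "p3": "tile-r3"}
local = {"p1": "tile-l1", "p2": "tile-l2", "p3": "tile-l3"}
confidence = {"p1": 0.9, "p2": 0.8, "p3": 0.2}
hybrid = blend_map_portions(local, remote, confidence)
# -> {"p1": ("local", "tile-l1"), "p2": ("local", "tile-l2"), "p3": ("remote", "tile-r3")}
```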
[0171] FIG. 40 is a diagram depicting an example of a localizer configured to vary transmission rates or amounts of locally-generated sensor and/or map data, according to some examples. Diagram 4000 depicts a number of autonomous vehicles, including autonomous vehicles 4030a, 4030b, 4030c, and 4030n, and diagram 4000 also depicts an autonomous vehicle service platform 4001 including a mapping engine 4054 and teleoperator logic 4004, which is implemented in association with a teleoperator computing device 4006 that accepts data signals (e.g., user inputs) from a teleoperator 4008. Teleoperator logic 4004 may be disposed in a server computing device (not shown) or teleoperator computing device 4006. As shown, autonomous vehicle 4030a may include an autonomous vehicle controller 4047, a reference data repository 4005 (e.g., including a map store or repository 4005a for storing map data 4046, and a route data store or repository 4005b), and a transceiver 4044 that is configured to exchange data between autonomous vehicle 4030a and autonomous vehicle service platform 4001. Further to diagram 4000, autonomous vehicle controller 4047 may include a local map generator 4040, which may be configured to generate local map data 4041 based on sensor data from different types of sensors 4002a to 4002c. Autonomous vehicle controller 4047 is shown to also include a localizer 4068, which is shown to include a variant detector 4069a and a communication controller 4069b, for generating local pose data 4020. Note that elements depicted in diagram 4000 of FIG. 40 may include structures and/or functions as similarly-named elements described in connection to one or more other drawings, such as FIG. 39, among others.
[0172] Subsequent to detecting a variation between local map data 4041 and map data 4043 (generated by mapping engine 4054), communication controller 4069b may be configured to control transceiver 4044, as well as the types or amounts of data transmitted to autonomous vehicle service platform 4001. Therefore, communication controller 4069b is configured to provide sufficient data for teleoperator logic 4004 and/or teleoperator 4008 to select an optimal set of guidance data to resolve detected map data variations, according to various examples. Communication controller 4069b is configured to provide optimal amounts of data or data rates so as to conserve bandwidth. To illustrate the operation of communication controller 4069b, consider that variant detector 4069a detects a relatively minor or small amount of difference between map data 4043 and local map data 4041. In this case, communication controller 4069b may transmit relatively low amounts of data to provide an alert to teleoperator 4008 to urge the teleoperator to at least monitor autonomous vehicle 4030a as it travels through an environment that includes a minor change. Moreover, during degraded or low-speed data communication connections, simpler or more abstract depictions of the data may be transmitted (e.g., bounding boxes with associated metadata, etc.) rather than greater amounts of data.
[0173] As another example, consider that variant detector 4069a detects relatively moderate amounts of differences between map data 4043 and local map data 4041. In this case, communication controller 4069b may be configured to increase the transmission bandwidth of transceiver 4044 to transmit one or more portions of local map data 4041 to autonomous vehicle service platform 4001 for evaluation by teleoperator logic 4004. In yet another example, consider that variant detector 4069a detects relatively large amounts of differences between map data 4043 and local map data 4041. In this case, communication controller 4069b may be configured to further increase the transmission bandwidth of transceiver 4044 to transmit one or more portions of high resolution sensor data 4047 to autonomous vehicle service platform 4001 for visual presentation of the physical environment on a display 4009. For example, all or substantially all Lidar data may be transmitted; however, any amount less than all of the Lidar data may also be transmitted. Sensor-based data 4002 may be used to generate a three-dimensional view in real-time (or nearly in real-time) so that teleoperator 4008 may identify changes in map data visually. As shown, recently-placed traffic cones 4011 are identified as being a cause of variance data, or the differences between portions of map data 4043 and local map data 4041. Note that the above-described implementations are just a few examples of any number of implementations of the elements shown in diagram 4000, and, as such, the above description of diagram 4000 is not intended to be limiting.
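The escalating transmission behavior described in the two preceding paragraphs might be sketched as a simple policy that maps the magnitude of detected variation to a payload type and data rate; the thresholds, labels, and rates below are illustrative assumptions only.

```python
def select_transmission(variation_magnitude: float) -> dict:
    """Map the magnitude of detected map-data variation to a transmission policy.

    The thresholds (0.1, 0.5) and data rates are assumed values; the point is
    simply that larger variations justify more bandwidth toward the platform.
    """
    if variation_magnitude < 0.1:
        return {"payload": "alert_only", "data_rate_kbps": 16}            # teleoperator monitoring
    if variation_magnitude < 0.5:
        return {"payload": "local_map_portions", "data_rate_kbps": 512}   # evaluate local map data
    return {"payload": "high_resolution_sensor_data", "data_rate_kbps": 8000}  # 3D view for display

print(select_transmission(0.05))   # minor change: low-rate alert
print(select_transmission(0.3))    # moderate change: send local map portions
print(select_transmission(0.8))    # large change: stream sensor data for visual presentation
```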
[0174] FIG. 41 is a flow diagram depicting an example process of using various amounts of locally-generated map data to localize an autonomous vehicle, according to some examples. Flow 4100 begins at 4102, which includes localizing an autonomous vehicle relative to a coordinate of a global coordinate system in association with three-dimensional map data. At 4104, variance data may be detected. That is, data representing one or more map portions of three-dimensional map data that vary from sensed data (e.g., LIDAR data, camera data, etc.) generated by multiple sensor types may be detected. In one example, hybrid map data may be implemented using map data from a local map and from a three-dimensional map. Note that different amounts of data from a local map and from a three-dimensional map may be used based on, for example, the predicted accuracy of the portions of map data. In another example, at 4106, flow 4100 may implement hybrid map data from a local map and a three-dimensional map; a teleoperator request may be generated at 4108. At 4110, a difference between three-dimensional map data and sensed data (e.g., data used to generate local map data) may be characterized, and, based on the characterization, the rate of transmitting sensor-related data (e.g., raw sensor data, local map data, etc.) to an autonomous vehicle platform may be adjusted at 4112. At 4114, a three-dimensional representation of an environment is generated at which the autonomous vehicle acquires the data to depict an event on a display of a teleoperator computing device. Therefore, the addition or absence of an object that causes a difference between map data and locally-generated map data may be visually presented to a teleoperator.
[0175] FIGs. 42 to 43 illustrate examples of various computing platforms configured to provide various mapping-related functionalities to components of an autonomous vehicle service, according to various embodiments. In some examples, computing platform 3300 may be used to implement computer programs, applications, methods, processes, algorithms, or other software to perform the above-described techniques. Note that various structures and/or functionalities of FIG. 33 may be applicable to FIGs. 42 and 43, and, as such, some elements in those figures may be discussed in the context of FIG. 33. Note further that elements depicted in diagram 4200 of FIG. 42 and diagram 4300 of FIG. 43 may include structures and/or functions as similarly-named elements described in connection to one or more other drawings, such as FIGs. 33 to 35, among others.
[0176] Referring to the example shown in FIG. 42, system memory 3306 includes an autonomous vehicle service platform module 4250 and/or its components (e.g., a mapping engine module 4252, etc.), any of which, or one or more portions of which, can be configured to facilitate navigation for an autonomous vehicle service by implementing one or more functions described herein. [0177] Referring to the example shown in FIG. 43, system memory 3306 includes an autonomous vehicle ("AV") module 4350 and/or its components (e.g., a local map generator module 4352, a hybrid map selection control module 4354, a communication control module 4356, etc.), any of which may be implemented, for example, in an autonomous vehicle 4391. In some cases, system memory 3306 or a portion thereof may be disposed in mobile computing device 4390a. One or more portions of module 4350 can be configured to facilitate navigation of an autonomous vehicle service by implementing one or more functions described herein.
[0178] Although the foregoing examples have been described in some detail for purposes of clarity of understanding, the above-described inventive techniques are not limited to the details provided. There are many alternative ways of implementing the above-described inventive techniques. The disclosed examples are illustrative and not restrictive.

Claims

What is claimed:
1. A method comprising:
monitoring a fleet of vehicles by a computing system, at least one of which includes a vehicle controller configured to cause a vehicle to autonomously transit via a road network from a first geographic region to a second geographic region via a planned path;
receiving, by the computing system, data indicative of an event associated with the vehicle;
receiving, by the computing system, data representing multiple candidate trajectories for the vehicle that could be used to maneuver the vehicle from a current trajectory, each of the multiple candidate trajectories being configured to reduce impact of the event on operation of the vehicle and return the vehicle to the planned path, following receipt of the data indicative of the event, at least some of the multiple candidate trajectories having associated confidence levels; receiving, by the computing system, data representing a selection of a particular candidate trajectory from among the multiple candidate trajectories to be used as a guided trajectory of the vehicle, wherein the selection is based at least in part on the confidence levels; and
transmitting the guided trajectory from the computing system to the vehicle for use by the vehicle to maneuver from the current trajectory in response to the event.
2. The method of claim 1, wherein transmitting the selection of the candidate trajectory as the guided trajectory to the vehicle comprises transmitting the guided trajectory to a bidirectional autonomous vehicle that is capable of substantially symmetric control and motion in a first direction and a second opposite direction, the bidirectional autonomous vehicle having active lighting set to a first state to indicate that a forward direction for the bidirectional autonomous vehicle is in the first direction.
3. The method of claim 2, further comprising monitoring the bidirectional autonomous vehicle operating in accordance with the guided trajectory, wherein the guided trajectory causes the active lighting to set to a second state to indicate that the primary forward direction of the bidirectional autonomous vehicle is in the second direction.
4. The method of claim 1, wherein receiving the data indicative of the event comprises receiving data specifying a detected obstacle associated with the current trajectory.
5. The method of claim 4, wherein receiving data specifying a detected obstacle comprises identifying a classification type for the detected obstacle.
6. The method of claim 4, wherein receiving data specifying a detected obstacle comprises identifying whether the detected obstacle is a static obstacle or a dynamic obstacle.
7. The method of claim 1, further comprising receiving a ranking of the multiple candidate trajectories according to the associated confidence levels.
8. The method of claim 1, further comprising generating a response from a teleoperator computing device that identifies geographic areas to exclude from consideration for the multiple candidate trajectories.
9. The method of claim 1, wherein the associated confidence levels are related to a probability that the event impacts operation of the vehicle.
10. The method of claim 1, wherein the associated confidence levels are related to respective degrees of certainty that a particular candidate trajectory will reduce a probability that the event impacts operation of the vehicle.
11. A method comprising:
monitoring, at a computing system, a fleet of independent driverless vehicles, at least one of which is configured to autonomously transit from a first geographic region to a second geographic region via a planned path via a road network, the driverless vehicles being bidirectional autonomous vehicles capable of driving forward in a first direction or driving forward in an opposite second direction and having at least one mechanism for communicating a current forward direction to one or more potential people in a surrounding environment;
receiving, at the computing system, data from a driverless vehicle in the fleet indicating an event encountered by the driverless vehicle that may cause the driverless vehicle to maneuver from a current trajectory;
generating, by the computing system, multiple candidate trajectories for the driverless vehicle to maneuver from the current trajectory, each of the multiple candidate trajectories being configured to reduce impact of the event on operation of the vehicle and return the vehicle to the planned path;
calculating, by the computing system, confidence levels associated with the multiple candidate trajectories;
ranking, by the computing system, the multiple candidate trajectories according to the confidence levels;
choosing a particular candidate trajectory from among the multiple candidate trajectories based at least in part on the confidence levels associated with the multiple candidate trajectories; and
transmitting, to the driverless vehicle, the particular candidate trajectory for use in maneuvering the driverless vehicle whereby the driverless vehicle uses the mechanism to communicate any change of the first direction to the potential people in the surrounding environment.
12. The method of claim 11, wherein the choosing comprises:
presenting, to a human teleoperator, at least a subset of the multiple candidate trajectories in ranked order according to the associated confidence levels; and
receiving input from the human teleoperator to select the particular candidate trajectory.
13. The method of claim 11, wherein the mechanism comprises active lighting, and further comprising placing the active lighting in a first state that indicates the driverless vehicle is moving forward in the first direction and as part of the maneuvering, placing the active lighting in a second state that indicates the driverless vehicle is moving forward in the second direction.
14. The method of claim 11, wherein the mechanism comprises active lighting, and further comprising causing the active lighting to enter an animated state when the driverless vehicle is maneuvering.
15. The method of claim 11, wherein the mechanism comprises a sound generator to emit a pattern of sounds, and further comprising causing the sound generator to emit the pattern of sounds when the driverless vehicle is maneuvering.
16. The method of claim 11, wherein receiving data indicating the event comprises receiving data specifying a detected obstacle in the current trajectory, and further comprising determining whether the obstacle is a static obstacle or a dynamic obstacle.
17. The method of claim 16, wherein choosing a particular candidate trajectory comprises selecting the particular candidate trajectory based at least in part on whether the obstacle is determined to be a static obstacle or a dynamic obstacle.
18. The method of claim 11, wherein the associated confidence levels are related to a probability that the event impacts operation of the vehicle.
19. The method of claim 11, wherein the associated confidence levels are related to respective degrees of certainty that a particular candidate trajectory will reduce a probability that the event impacts operation of the vehicle.
20. A system comprising:
one or more processors; and
computer readable memory comprising computer-executable instructions that, when executed by the one or more processors, cause the system to:
receive sensor data from an on-road autonomous vehicle configured to autonomously transit via a road network from a first geographic region to a second geographic region via a planned path, the sensor data indicating an event encountered by the on-road autonomous vehicle that may cause the on-road autonomous vehicle to maneuver from a current trajectory along an outdoor roadway within a predefined area; generate a ranked list of multiple candidate trajectories for the on-road autonomous vehicle to maneuver from the current trajectory, each of the multiple candidate trajectories being configured to reduce impact of the event on operation of the on-road autonomous vehicle and return the on-road autonomous vehicle to the planned path;
receive input from a human operator to select a particular candidate trajectory from among at least a subset of the multiple candidate trajectories; and
transmit the particular candidate trajectory to the on-road autonomous vehicle for use in maneuvering the on-road autonomous vehicle.
21. The system of claim 20, wherein the computer-executable instructions, when executed by the one or more processors, further cause the system to calculate confidence levels for associated candidate trajectories.
22. The system of claim 20, wherein the computer-executable instructions, when executed by the one or more processors, further cause the system to calculate confidence levels for associated candidate trajectories and to present, to the human operator, the subset of the multiple candidate trajectories along with the associated confidence levels for the human operator to consider when selecting the particular candidate trajectory.
23. The system of claim 20, wherein the computer-executable instructions, when executed by the one or more processors, further cause the system to receive data specifying a detected obstacle in the current trajectory and a classification type of the detected obstacle.
24. The system of claim 23, wherein the computer-executable instructions, when executed by the one or more processors, further cause the system to receive data specifying whether the detected obstacle is a static obstacle that is not expected to move or a dynamic obstacle that is one of moving or expected to move.
25. The system of claim 20, wherein the computer-executable instructions, when executed by the one or more processors, further cause the system to transmit the particular candidate trajectory that directs the on-road autonomous vehicle to change from going forward in a first direction to going forward in an opposite second direction without turning the on-road autonomous vehicle around, wherein the change of directions causes active lighting on the on-road autonomous vehicle to change from a first state that indicates the on-road autonomous vehicle is moving forward in the first direction to a second state that indicates the on-road autonomous vehicle is moving forward in the second direction.
26. The system of claim 20, wherein the computer-executable instructions, when executed by the one or more processors, further cause the system to transmit the particular candidate trajectory that directs the on-road autonomous vehicle to change directions, wherein the change of directions causes active lighting on the on-road autonomous vehicle to enter an animated state while the on-road autonomous vehicle is maneuvering.
27. A system comprising:
a bidirectional autonomous vehicle with quad portions of substantially similar structural components that enable the autonomous vehicle to drive forward in a first direction or drive forward in a substantially opposite second direction without turning around the autonomous vehicle, the autonomous vehicle having a plurality of sensors for sensing one or more objects that may cause the autonomous vehicle to maneuver from a current trajectory in the first direction; and
a computing system communicatively coupled to receive data from the autonomous vehicle and to transmit instructions to the autonomous vehicle, the computing system being programmed to:
receive sensor data from the autonomous vehicle;
generate multiple candidate trajectories for the autonomous vehicle to maneuver while driving;
select a particular candidate trajectory from among the multiple candidate trajectories; and
transmit the particular candidate trajectory to the autonomous vehicle for use in maneuvering; and
wherein the bidirectional autonomous vehicle, upon receiving the particular candidate trajectory, changes directions from driving forward in a first direction to driving forward in the second direction without turning around.
28. The system of claim 27, wherein the computing system is remote from and independent of the autonomous vehicle.
29. The system of claim 27, wherein the computing system is further programmed to:
calculate confidence levels associated with the multiple candidate trajectories; and
select the particular candidate trajectory from among the multiple candidate trajectories based, at least in part, on the confidence levels.
30. The system of claim 27, wherein the computing system is further programmed to receive input from a human operator to select the particular candidate trajectory from among the multiple candidate trajectories.
31. The system of claim 27, wherein the autonomous vehicle has active lighting that changes appearance when the autonomous vehicle changes direction to indicate that the autonomous vehicle is no longer driving forward in the first direction, but is now driving forward in the second direction.
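Claims 25, 26, and 31 all hinge on active lighting that signals which end of the bidirectional vehicle is currently the front. The minimal state-machine sketch below is one way to model that behavior; the light-pattern identifiers and the animated transition are assumptions made for illustration only.

```python
from enum import Enum

class Direction(Enum):
    FIRST = "first"
    SECOND = "second"

class ActiveLighting:
    """Hypothetical lighting controller for a bidirectional vehicle.

    The vehicle never turns around; 'forward' is redefined instead, and
    the lighting state is updated so surrounding traffic can tell which
    end is currently the front (cf. claims 25, 26, and 31).
    """
    def __init__(self) -> None:
        self.state = "forward_first"     # assumed light-pattern identifier

    def on_direction_change(self, new_direction: Direction) -> None:
        # Animated pattern while the maneuver is in progress (cf. claim 26).
        self.state = "animated_transition"
        print("lighting:", self.state)
        # Settle on the pattern for the new travel direction.
        self.state = (
            "forward_first" if new_direction is Direction.FIRST else "forward_second"
        )
        print("lighting:", self.state)

if __name__ == "__main__":
    lights = ActiveLighting()
    lights.on_direction_change(Direction.SECOND)
```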
32. A method comprising:
receiving, at a computing system, a first type of sensor data from multiple driverless vehicles in a fleet of autonomous driverless vehicles, wherein the first type of sensor data is acquired by a first type of sensors on the driverless vehicles to sense objects in environments encountered while the driverless vehicles are driving along roads;
receiving, at the computing system, a second type of sensor data from the multiple driverless vehicles in the fleet of autonomous driverless vehicles, wherein the second type of sensor data is acquired by a second type of sensors on the driverless vehicles to sense objects in the environments encountered while the driverless vehicles are driving along the roads;
storing, at the computing system, the first type of sensor data and the second type of sensor data;
accessing, by the computing system and for a particular driverless vehicle, subsets of the first type of sensor data and subsets of the second type of sensor data;
aligning, by the computing system and for the particular driverless vehicle, the subsets of the first type of sensor data with the subsets of the second type of sensor data to provide aligned sensor data pertaining to objects in an environment encountered by the particular driverless vehicle;
aligning, by the computing system and for the particular driverless vehicle, the aligned sensor data relative to positioning map data pertaining to a global coordinate system to localize the aligned sensor data to a location of the particular driverless vehicle;
generating, by the computing system, data sets of three-dimensional map data based on the aligned sensor data and the positioning map data;
detecting a change between the generated data sets of three-dimensional map data and a stored data set of three-dimensional map data, the change being representative of a state change in the environment surrounding the particular driverless vehicle;
updating, at the computing system, the stored data set of three-dimensional map data to reflect the state change in the environment; and
transmitting the updated data set of three-dimensional map data to the multiple driverless vehicles in the fleet of autonomous driverless vehicles,
wherein at least one of the driverless vehicles is controlled based at least in part on the updated data set of three-dimensional map data.
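Claim 32 walks through a multi-step pipeline: align two sensor modalities, localize the aligned data against a global coordinate system, generate three-dimensional map data, detect a change relative to the stored data set, update it, and transmit the update to the fleet. The outline below mirrors those steps with placeholder functions; it is a structural sketch under simplifying assumptions (pre-registered point sets, voxel tiles, in-memory "transmits"), not the patented implementation.

```python
import numpy as np

def align_modalities(lidar_pts: np.ndarray, camera_pts: np.ndarray) -> np.ndarray:
    """Placeholder alignment: concatenate already-registered point sets."""
    return np.vstack([lidar_pts, camera_pts])

def localize(aligned: np.ndarray, vehicle_pose: np.ndarray) -> np.ndarray:
    """Transform vehicle-frame points into the global frame using a 4x4 pose."""
    homo = np.hstack([aligned, np.ones((len(aligned), 1))])
    return (vehicle_pose @ homo.T).T[:, :3]

def voxelize(points: np.ndarray, res: float = 0.5) -> set:
    """Reduce points to occupied voxel indices (a stand-in for 3D tile data)."""
    return set(map(tuple, np.floor(points / res).astype(int)))

def detect_change(new_tiles: set, stored_tiles: set) -> set:
    """Symmetric difference marks state changes in the environment."""
    return new_tiles ^ stored_tiles

def update_and_broadcast(stored_tiles: set, changed: set, fleet: list) -> set:
    updated = stored_tiles ^ changed        # apply the detected change
    for vehicle_inbox in fleet:
        vehicle_inbox.append(updated)       # stand-in for a network transmit
    return updated

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lidar = rng.uniform(0, 10, (100, 3))
    camera = rng.uniform(0, 10, (50, 3))
    pose = np.eye(4); pose[:3, 3] = [100.0, 200.0, 0.0]   # global offset
    tiles = voxelize(localize(align_modalities(lidar, camera), pose))
    stored = set(list(tiles)[:-5])          # pretend the stored map is stale
    fleet_inboxes = [[], []]
    updated = update_and_broadcast(stored, detect_change(tiles, stored), fleet_inboxes)
    print(len(updated), "tiles after update; fleet copies:", len(fleet_inboxes[0]))
```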
33. The method of claim 32, wherein updating, at the computing system, the stored data set of three-dimensional map data to reflect the state change comprises:
associating the change between at least two data sets to a map portion;
forming an updated map portion to include the change; and
incorporating the updated map portion into a reference data store in the particular driverless vehicle,
wherein implementation of the updated map portion for localization of the particular driverless vehicle invokes a teleoperator request.
34. The method of claim 32, further comprising:
localizing the particular driverless vehicle relative to a coordinate of the global coordinate system associated with the stored data set of three-dimensional map data;
detecting that data representing one or more map portions of the stored data set of three-dimensional map data vary from a local map maintained at the particular driverless vehicle; and
generating a teleoperator request.
35. The method of claim 32, further comprising:
localizing the particular driverless vehicle relative to a coordinate of the global coordinate system associated with the stored data set of three-dimensional map data; and
detecting that data representing one or more map portions of the stored data set of three-dimensional map data vary from a local map maintained at the particular driverless vehicle.
36. The method of claim 35, further comprising localizing the particular driverless vehicle against hybrid map data from the local map and the stored data set of three-dimensional map data.
37. The method of claim 32, wherein accessing subsets of the first type of sensor data and subsets of the second type of sensor data comprise receiving subsets of laser return data and receiving subsets of image data, respectively, wherein each subset of laser return data is associated with a Lidar sensor and each subset of image data is associated with an image capture device.
38. The method of claim 37, further comprising receiving subsets of radar data into a data store originating from a group of radar sensors, wherein each subset of radar data is associated with a radar sensor.
39. The method of claim 37, wherein aligning the subsets of the first type of sensor data with the subsets of the second type of sensor data and aligning the aligned sensor data comprises:
forming a global pose graph relative to the global coordinate system based on the subsets of the first type of sensor data and the subsets of the second type of sensor data; and
aligning the subsets of laser return data and the subsets of image data relative to a location relative to a coordinate of the global coordinate system.
40. The method of claim 39, wherein aligning the subsets of laser return data and the subsets of image data relative to a location relative to a coordinate of the global coordinate system comprises receiving trajectory data representing position data to identify the coordinate of the global coordinate system.
41. The method of claim 39, wherein forming the global pose graph comprises performing one or more loop-closure processes to identify a closed loop.
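Claims 39-41 recite forming a global pose graph and performing loop-closure processing so the aligned sensor data stays consistent with the global coordinate system. The fragment below builds a toy pose graph and flags a loop closure by proximity of widely separated poses; a production system would go on to optimize the graph (for example with a nonlinear least-squares solver), which is omitted here.

```python
import math
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class PoseNode:
    index: int
    x: float
    y: float
    heading: float

@dataclass
class PoseGraph:
    nodes: List[PoseNode] = field(default_factory=list)
    odometry_edges: List[Tuple[int, int]] = field(default_factory=list)
    loop_edges: List[Tuple[int, int]] = field(default_factory=list)

    def add_pose(self, x: float, y: float, heading: float) -> PoseNode:
        node = PoseNode(len(self.nodes), x, y, heading)
        if self.nodes:
            self.odometry_edges.append((self.nodes[-1].index, node.index))
        self.nodes.append(node)
        return node

    def find_loop_closures(self, radius: float = 1.0, min_gap: int = 10) -> None:
        """Naive loop detection: revisiting a place seen at least min_gap poses ago."""
        for a in self.nodes:
            for b in self.nodes:
                if b.index - a.index >= min_gap and math.hypot(a.x - b.x, a.y - b.y) < radius:
                    self.loop_edges.append((a.index, b.index))

if __name__ == "__main__":
    g = PoseGraph()
    # Drive a circular loop of 41 poses that returns near the start.
    for i in range(41):
        t = i / 40 * 2 * math.pi
        g.add_pose(5 * math.cos(t), 5 * math.sin(t), t)
    g.find_loop_closures()
    print(len(g.odometry_edges), "odometry edges,", len(g.loop_edges), "loop closures")
```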
42. The method of claim 37, wherein generating the data sets of three-dimensional map data further comprises integrating at least two types of sensor data including the subsets of laser return data and the subsets of image data.
43. The method of claim 42, wherein integrating the at least two types of sensor data comprises fusing the subsets of laser return data and the subsets of image data to form the generated data sets of three-dimensional map data to include three-dimensional tile data.
44. The method of claim 42, wherein integrating the at least two types of sensor data comprises correlating pixel data of the subsets of image data to laser data of the subsets of laser return data.
45. The method of claim 44, further comprising associating the pixel data of one or more pixels to the laser data of one or more laser returns, the laser data being associated with a portion of a surface in three-dimensional tile data, wherein the pixel data specifies one or more surface characteristics including texture, color, reflectivity, or transparency.
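Claims 42-45 describe integrating the two sensor streams by correlating image pixels with laser returns so that each three-dimensional point can carry surface characteristics such as color. A common way to do this, used here purely as an illustrative assumption, is to project lidar points through the camera intrinsics and sample the image at the resulting pixel coordinates.

```python
import numpy as np

def colorize_lidar(points_cam: np.ndarray, image: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Attach RGB from `image` to lidar points already in the camera frame.

    points_cam: (N, 3) points in camera coordinates (z forward).
    image:      (H, W, 3) uint8 image.
    K:          (3, 3) camera intrinsic matrix.
    Returns an (M, 6) array of [x, y, z, r, g, b] for points that project
    inside the image with positive depth.
    """
    z = points_cam[:, 2]
    pts = points_cam[z > 0.1]                 # keep points in front of the camera
    uvw = (K @ pts.T).T                       # pinhole projection
    uv = (uvw[:, :2] / uvw[:, 2:3]).astype(int)
    h, w = image.shape[:2]
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    pts, uv = pts[inside], uv[inside]
    rgb = image[uv[:, 1], uv[:, 0]].astype(float)
    return np.hstack([pts, rgb])

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    cloud = rng.uniform([-2, -1, 2], [2, 1, 10], (500, 3))   # points ahead of camera
    img = rng.integers(0, 255, (480, 640, 3), dtype=np.uint8)
    K = np.array([[500.0, 0, 320.0], [0, 500.0, 240.0], [0, 0, 1.0]])
    colored = colorize_lidar(cloud, img, K)
    print(colored.shape, "points carry xyz + rgb")
```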
46. The method of claim 32, wherein:
receiving the first type of sensor data and receiving the second type of sensor data comprises receiving the first and second types of sensor data via a network; and
logging the first type of sensor data and the second type of sensor data into a database.
47. The method of claim 35, further comprising:
characterizing a difference between the one or more map portions of the stored data set of three-dimensional map data and the local map; and
using at least part of the local map and at least part of the one or more map portions to update the stored data set of three-dimensional map data.
48. The method of claim 32, further comprising:
exporting the updated data set of three-dimensional map data to a simulator computing device; and
simulating performance of a portion of the fleet of autonomous vehicles in a simulated environment based on the updated data set of three-dimensional map data.
49. A method comprising:
receiving, at a computing system from individual autonomous vehicles in a fleet of autonomous vehicles, multiple types of sensor data indicative of driving conditions and objects present in environments surrounding the autonomous vehicles in the fleet;
storing the multiple types of sensor data in a data store;
accessing, from the data store, subsets of a first type of sensor data received from a first autonomous vehicle in a first location, the subsets of the first type of sensor data originating from one or more first sensors on the first autonomous vehicle;
accessing, from the data store, subsets of a second type of sensor data received from the first autonomous vehicle in the first location, the subsets of the second type of sensor data originating from one or more second sensors on the first autonomous vehicle;
aligning the subsets of the first type of sensor data and the subsets of the second type of sensor data to form aligned sensor data;
determining, at the computing system, whether the aligned sensor data manifests a variation between a stored map of the first location and an environment surrounding the first autonomous vehicle as sensed while the vehicle is in the first location;
updating the stored map to reflect the variation; and
transmitting the updated map from the computing system to the individual autonomous vehicles in the fleet of autonomous vehicles,
wherein at least one of the autonomous vehicles is controlled based at least in part on the updated map.
50. The method of claim 49, wherein determining whether the aligned sensor data manifests a variation comprises:
localizing the first autonomous vehicle relative to a global coordinate system associated with map data; and
detecting a difference between one or more map portions of the map data and a local map maintained at the first autonomous vehicle.
51. The method of claim 50, further comprising:
characterizing the difference between the one or more map portions of the three-dimensional map data and the local map; and
using at least part of the local map and at least part of the one or more map portions to update the stored map.
52. The method of claim 49, further comprising:
exporting the updated map to a simulator computing device; and
simulating, at the simulator computing device, performance of a portion of the fleet of autonomous vehicles in a simulated environment based on the updated map.
53. The method of claim 49, further comprising aligning the aligned sensor data relative to a global coordinate system.
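Claims 49-51 (and claim 47 above) turn on detecting and characterizing a difference between the stored map and the local map maintained on the vehicle, then using parts of both to build the update. The sketch below compares two tile dictionaries and classifies each divergence as added, removed, or modified before merging; the tile keying and the merge policy are assumptions chosen for illustration.

```python
from typing import Dict, Tuple

Tile = Tuple[int, int]          # tile index in a global grid (assumed keying)

def diff_maps(stored: Dict[Tile, str], local: Dict[Tile, str]) -> Dict[str, set]:
    """Characterize differences between the stored map and the vehicle's local map."""
    stored_keys, local_keys = set(stored), set(local)
    return {
        "added":    local_keys - stored_keys,
        "removed":  stored_keys - local_keys,
        "modified": {k for k in stored_keys & local_keys if stored[k] != local[k]},
    }

def merge_update(stored: Dict[Tile, str], local: Dict[Tile, str],
                 diff: Dict[str, set]) -> Dict[Tile, str]:
    """Build the updated stored map using parts of both maps (cf. claims 47, 51)."""
    updated = dict(stored)
    for k in diff["added"] | diff["modified"]:
        updated[k] = local[k]          # trust the fresher on-vehicle observation
    for k in diff["removed"]:
        updated.pop(k, None)           # feature no longer observed
    return updated

if __name__ == "__main__":
    stored = {(0, 0): "lane", (0, 1): "lane", (1, 0): "construction"}
    local  = {(0, 0): "lane", (1, 0): "lane", (2, 0): "new_crosswalk"}
    changes = diff_maps(stored, local)
    print(changes)
    print(merge_update(stored, local, changes))
```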
54. A system comprising:
a driverless vehicle configured to drive autonomously on a road network in traffic with other motor vehicles, the driverless vehicle being a passenger vehicle and having a plurality of sensors for sensing one or more objects in an environment surrounding the driverless vehicle; and
a computing system communicatively coupled to receive data from the driverless vehicle and to transmit instructions to the driverless vehicle, the computing system being programmed to:
receive multiple types of sensor data from the driverless vehicle pertaining to the one or more objects in the environment while the driverless vehicle is at a location;
process the multiple types of sensor data to determine whether the sensor data indicates a change detected in the environment with respect to the one or more objects therein as compared to map data of the location;
update the map data of the location to reflect the change detected in the environment; and
transmit the updated map data to the driverless vehicle,
wherein the driverless vehicle updates a local version of the map data with the updated map data.
55. The system of claim 54, wherein the computing system is remote from and independent of the driverless vehicle.
56. The system of claim 54, wherein the driverless vehicle is part of a fleet of driverless vehicles, and the computing system is further programmed to transmit the updated map data to multiple driverless vehicles in the fleet.
57. The system of claim 56, further comprising a simulator configured to simulate performance of the fleet of driverless vehicles in a simulated environment based on the updated map data.
58. The system of claim 54, further comprising a teleoperator computing device configured to present the updated map data to a human teleoperator.
59. The system of claim 54, wherein the computing system is further programmed to align the multiple types of sensor data to provide aligned sensor data pertaining to one or more objects in the environment at the location.
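Claims 54-59 close the loop on the vehicle side: the driverless vehicle receives the updated map data and replaces its local version with it. A minimal vehicle-side sketch might look like the following, where the versioning scheme and the tile-keyed map are assumed interfaces introduced only for illustration.

```python
class DriverlessVehicle:
    """Minimal vehicle-side sketch of the last step of claim 54 (assumed interfaces)."""

    def __init__(self, local_map: dict):
        self.local_map = dict(local_map)
        self.map_version = 0

    def receive_map_update(self, updated_portions: dict, version: int) -> bool:
        """Apply updated map portions to the local copy if the update is newer."""
        if version <= self.map_version:
            return False                       # stale update, ignore
        self.local_map.update(updated_portions)
        self.map_version = version
        return True

if __name__ == "__main__":
    car = DriverlessVehicle({("tile", 3): "lane", ("tile", 4): "lane"})
    applied = car.receive_map_update({("tile", 4): "construction"}, version=1)
    print("applied:", applied, "->", car.local_map)
```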
PCT/US2016/060368 2015-11-04 2016-11-03 Aptive mapping to navigate autonomous vehicles responsive to physical environment changes WO2017079460A2 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CN202410296946.0A CN118192555A (en) 2015-11-04 2016-11-03 Adaptive mapping to navigate autonomous vehicles in response to changes in physical environment
JP2018543270A JP7316789B2 (en) 2015-11-04 2016-11-03 Adaptive mapping for navigating autonomous vehicles in response to changes in the physical environment
EP16862985.5A EP3371797A4 (en) 2015-11-04 2016-11-03 Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes
CN201680064836.5A CN108369775B (en) 2015-11-04 2016-11-03 Adaptive mapping to navigate an autonomous vehicle in response to changes in a physical environment
CN202111033039.XA CN113721629B (en) 2015-11-04 2016-11-03 Adaptive mapping to navigate autonomous vehicles in response to changes in physical environment
JP2022020682A JP2022065083A (en) 2015-11-04 2022-02-14 Adaptive mapping for navigating autonomous vehicles responsive to physical environment change

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US14/756,996 US9916703B2 (en) 2015-11-04 2015-11-04 Calibration for autonomous vehicle operation
US14/932,959 US9606539B1 (en) 2015-11-04 2015-11-04 Autonomous vehicle fleet service and system
US14/932,940 US9734455B2 (en) 2015-11-04 2015-11-04 Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
US14/932,963 US9612123B1 (en) 2015-11-04 2015-11-04 Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes
US14/756,995 US9958864B2 (en) 2015-11-04 2015-11-04 Coordination of dispatching and maintaining fleet of autonomous vehicles
US14/756,991 US9720415B2 (en) 2015-11-04 2015-11-04 Sensor-based object-detection optimization for autonomous vehicles
US14/932,959 2015-11-04
US14/932,963 2015-11-04
US14/932,962 US9630619B1 (en) 2015-11-04 2015-11-04 Robotic vehicle active safety systems and methods
US14/756,992 US9910441B2 (en) 2015-11-04 2015-11-04 Adaptive autonomous vehicle planner logic

Publications (3)

Publication Number Publication Date
WO2017079460A2 true WO2017079460A2 (en) 2017-05-11
WO2017079460A8 WO2017079460A8 (en) 2017-07-13
WO2017079460A3 WO2017079460A3 (en) 2017-08-24

Family

ID=58419084

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/US2016/060018 WO2017079219A1 (en) 2015-11-04 2016-11-02 Teleoperation system and method for trajectory modification of autonomous vehicles
PCT/US2016/060368 WO2017079460A2 (en) 2015-11-04 2016-11-03 Aptive mapping to navigate autonomous vehicles responsive to physical environment changes

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/US2016/060018 WO2017079219A1 (en) 2015-11-04 2016-11-02 Teleoperation system and method for trajectory modification of autonomous vehicles

Country Status (3)

Country Link
US (3) US9612123B1 (en)
EP (2) EP3371668B1 (en)
WO (2) WO2017079219A1 (en)

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018224873A1 (en) 2017-06-06 2018-12-13 PlusAI Corp Method and system for close loop perception in autonomous driving vehicles
WO2019040800A1 (en) * 2017-08-23 2019-02-28 TuSimple 3d submap reconstruction system and method for centimeter precision localization using camera-based submap and lidar-based global map
CN109425348A (en) * 2017-08-23 2019-03-05 北京图森未来科技有限公司 A kind of while positioning and the method and apparatus for building figure
US10338594B2 (en) 2017-03-13 2019-07-02 Nio Usa, Inc. Navigation of autonomous vehicles to enhance safety under one or more fault conditions
US10369974B2 (en) 2017-07-14 2019-08-06 Nio Usa, Inc. Control and coordination of driverless fuel replenishment for autonomous vehicles
US10423162B2 (en) 2017-05-08 2019-09-24 Nio Usa, Inc. Autonomous vehicle logic to identify permissioned parking relative to multiple classes of restricted parking
CN111065980A (en) * 2017-08-23 2020-04-24 图森有限公司 System and method for centimeter-accurate positioning using camera-based sub-maps and LIDAR-based global maps
WO2020141694A1 (en) * 2019-01-04 2020-07-09 Seoul Robotics Co., Ltd. Vehicle using spatial information acquired using sensor, sensing device using spatial information acquired using sensor, and server
US10710633B2 (en) 2017-07-14 2020-07-14 Nio Usa, Inc. Control of complex parking maneuvers and autonomous fuel replenishment of driverless vehicles
CN111464637A (en) * 2020-03-31 2020-07-28 北京百度网讯科技有限公司 Unmanned vehicle data processing method, device, equipment and medium
WO2020154676A1 (en) * 2019-01-25 2020-07-30 Uatc, Llc Operator assistance for autonomous vehicles
KR20200092819A (en) * 2019-01-04 2020-08-04 (주)서울로보틱스 Vehicle and sensing device of utilizing spatial information acquired using sensor, and server for the same
EP3722909A1 (en) * 2019-04-10 2020-10-14 Siemens Aktiengesellschaft Transport system for transporting transport pieces
US10816354B2 (en) 2017-08-22 2020-10-27 Tusimple, Inc. Verification module system and method for motion-based lane detection with multiple sensors
US10896539B2 (en) 2018-06-22 2021-01-19 Beijing Didi Infinity Technology And Development Co., Ltd. Systems and methods for updating highly automated driving maps
US10942271B2 (en) 2018-10-30 2021-03-09 Tusimple, Inc. Determining an angle between a tow vehicle and a trailer
US10953880B2 (en) 2017-09-07 2021-03-23 Tusimple, Inc. System and method for automated lane change control for autonomous vehicles
US10953881B2 (en) 2017-09-07 2021-03-23 Tusimple, Inc. System and method for automated lane change control for autonomous vehicles
US10962650B2 (en) 2017-10-31 2021-03-30 United States Of America As Represented By The Administrator Of Nasa Polyhedral geofences
WO2021092230A1 (en) 2019-11-08 2021-05-14 Zoox, Inc. Guidance authentication with vehicles
US11009365B2 (en) 2018-02-14 2021-05-18 Tusimple, Inc. Lane marking localization
US11009356B2 (en) 2018-02-14 2021-05-18 Tusimple, Inc. Lane marking localization and fusion
US11010874B2 (en) 2018-04-12 2021-05-18 Tusimple, Inc. Images for perception modules of autonomous vehicles
US11022971B2 (en) 2018-01-16 2021-06-01 Nio Usa, Inc. Event data recordation to identify and resolve anomalies associated with control of driverless vehicles
US20210180987A1 (en) * 2018-08-31 2021-06-17 Denso Corporation Vehicle-side device, method and non-transitory computer-readable storage medium for autonomously driving vehicle
WO2021138475A1 (en) * 2019-12-31 2021-07-08 Zoox, Inc. Vehicle control to join and depart a route
JP2021518557A (en) * 2018-03-19 2021-08-02 アウトサイト Methods and systems for identifying the material composition of moving objects
US11151393B2 (en) 2017-08-23 2021-10-19 Tusimple, Inc. Feature matching and corresponding refinement and 3D submap position refinement system and method for centimeter precision localization using camera-based submap and LiDAR-based global map
US11292480B2 (en) 2018-09-13 2022-04-05 Tusimple, Inc. Remote safe driving methods and systems
US11295146B2 (en) 2018-02-27 2022-04-05 Tusimple, Inc. System and method for online real-time multi-object tracking
US11305782B2 (en) 2018-01-11 2022-04-19 Tusimple, Inc. Monitoring system for autonomous vehicle operation
US11312334B2 (en) 2018-01-09 2022-04-26 Tusimple, Inc. Real-time remote control of vehicles with high redundancy
US11392133B2 (en) 2017-06-06 2022-07-19 Plusai, Inc. Method and system for object centric stereo in autonomous driving vehicles
US11500101B2 (en) 2018-05-02 2022-11-15 Tusimple, Inc. Curb detection by analysis of reflection images
US11550334B2 (en) 2017-06-06 2023-01-10 Plusai, Inc. Method and system for integrated global and distributed learning in autonomous driving vehicles
DE102021213147A1 (en) 2021-11-23 2023-05-25 Volkswagen Aktiengesellschaft Method, server device and motor vehicle for automatically mapping a surrounding area in sections
US11665017B2 (en) 2018-02-28 2023-05-30 Cisco Technology, Inc. Telemetry reporting in vehicle super resolution systems
US20230221719A1 (en) * 2019-11-26 2023-07-13 Zoox, Inc. Correction of sensor data alignment and environment mapping
US11701931B2 (en) 2020-06-18 2023-07-18 Tusimple, Inc. Angle and orientation measurements for vehicles with multiple drivable sections
RU2805133C2 (en) * 2019-01-21 2023-10-11 Рено С.А.С Method for determining reliability of the target in the environment of the vehicle
US11810322B2 (en) 2020-04-09 2023-11-07 Tusimple, Inc. Camera pose estimation techniques
US11823460B2 (en) 2019-06-14 2023-11-21 Tusimple, Inc. Image fusion for autonomous vehicle operation
US11853071B2 (en) 2017-09-07 2023-12-26 Tusimple, Inc. Data-driven prediction-based system and method for trajectory planning of autonomous vehicles
US11914378B2 (en) 2021-05-18 2024-02-27 Ford Global Technologies, Llc Intersection node-assisted high-definition mapping
US11972690B2 (en) 2018-12-14 2024-04-30 Beijing Tusen Zhitu Technology Co., Ltd. Platooning method, apparatus and system of autonomous driving platoon
US11993256B2 (en) 2020-05-22 2024-05-28 Cnh Industrial America Llc Dynamic perception zone estimation

Families Citing this family (454)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011112577A1 (en) * 2011-09-08 2013-03-14 Continental Teves Ag & Co. Ohg Method and device for an assistance system in a vehicle for carrying out an autonomous or semi-autonomous driving maneuver
US20170286884A1 (en) 2013-03-15 2017-10-05 Via Transportation, Inc. System and Method for Transportation
US10373301B2 (en) 2013-09-25 2019-08-06 Sikorsky Aircraft Corporation Structural hot spot and critical location monitoring system and method
US9499139B2 (en) 2013-12-05 2016-11-22 Magna Electronics Inc. Vehicle monitoring system
JP6174516B2 (en) * 2014-04-24 2017-08-02 本田技研工業株式会社 Collision avoidance support device, collision avoidance support method, and program
DE102014212898A1 (en) * 2014-07-03 2016-01-07 Robert Bosch Gmbh Method for determining an emergency trajectory and method for partially automated or automated guidance of an ego vehicle
BR112017002129B1 (en) * 2014-08-04 2022-01-04 Nissan Motor Co., Ltd SELF-POSITION CALCULATION APPARATUS AND SELF-POSITION CALCULATION METHOD
DE102014221888A1 (en) * 2014-10-28 2016-04-28 Robert Bosch Gmbh Method and device for locating a vehicle in its environment
EP3218885A4 (en) * 2014-11-11 2018-10-24 Sikorsky Aircraft Corporation Trajectory-based sensor planning
CN107624155B (en) * 2014-12-05 2021-09-28 苹果公司 Autonomous navigation system
CN107249954B (en) * 2014-12-29 2020-07-10 罗伯特·博世有限公司 System and method for operating an autonomous vehicle using a personalized driving profile
EP3245474A4 (en) * 2015-01-13 2018-07-04 Sikorsky Aircraft Corporation Structural health monitoring employing physics models
US9616773B2 (en) 2015-05-11 2017-04-11 Uber Technologies, Inc. Detecting objects within a vehicle in connection with a service
US10291339B2 (en) * 2015-07-22 2019-05-14 Bae Systems Plc Evaluating near range communications quality
US9849852B1 (en) * 2015-09-04 2017-12-26 Waymo Llc Intelligent deployment of safety mechanisms for autonomous vehicles
US9802568B1 (en) 2015-09-04 2017-10-31 Waymo Llc Interlocking vehicle airbags
US9817397B1 (en) 2015-09-04 2017-11-14 Waymo Llc Active safety mechanisms for an autonomous vehicle
US10248119B2 (en) 2015-11-04 2019-04-02 Zoox, Inc. Interactive autonomous vehicle command controller
US9606539B1 (en) 2015-11-04 2017-03-28 Zoox, Inc. Autonomous vehicle fleet service and system
US9804599B2 (en) 2015-11-04 2017-10-31 Zoox, Inc. Active lighting control for communicating a state of an autonomous vehicle to entities in a surrounding environment
US9507346B1 (en) 2015-11-04 2016-11-29 Zoox, Inc. Teleoperation system and method for trajectory modification of autonomous vehicles
US10401852B2 (en) 2015-11-04 2019-09-03 Zoox, Inc. Teleoperation system and method for trajectory modification of autonomous vehicles
US10334050B2 (en) 2015-11-04 2019-06-25 Zoox, Inc. Software application and logic to modify configuration of an autonomous vehicle
WO2017079341A2 (en) 2015-11-04 2017-05-11 Zoox, Inc. Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
US9754490B2 (en) 2015-11-04 2017-09-05 Zoox, Inc. Software application to request and control an autonomous vehicle service
US9494940B1 (en) 2015-11-04 2016-11-15 Zoox, Inc. Quadrant configuration of robotic vehicles
US11283877B2 (en) * 2015-11-04 2022-03-22 Zoox, Inc. Software application and logic to modify configuration of an autonomous vehicle
US9632502B1 (en) 2015-11-04 2017-04-25 Zoox, Inc. Machine-learning systems and techniques to optimize teleoperation and/or planner decisions
US9878664B2 (en) 2015-11-04 2018-01-30 Zoox, Inc. Method for robotic vehicle communication with an external environment via acoustic beam forming
US10871555B1 (en) * 2015-12-02 2020-12-22 Apple Inc. Ultrasonic sensor
FR3044800B1 (en) * 2015-12-07 2018-08-17 Valeo Schalter Und Sensoren Gmbh DEVICE AND METHOD FOR ASSISTING DRIVING
US9432929B1 (en) 2015-12-08 2016-08-30 Uber Technologies, Inc. Communication configuration system for a fleet of automated vehicles
US10243604B2 (en) 2015-12-08 2019-03-26 Uber Technologies, Inc. Autonomous vehicle mesh networking configuration
US10712160B2 (en) 2015-12-10 2020-07-14 Uatc, Llc Vehicle traction map for autonomous vehicles
KR101798519B1 (en) * 2015-12-15 2017-11-16 현대자동차주식회사 Emergency braking system and method for controlling the same
US9841763B1 (en) 2015-12-16 2017-12-12 Uber Technologies, Inc. Predictive sensor array configuration system for an autonomous vehicle
US9840256B1 (en) 2015-12-16 2017-12-12 Uber Technologies, Inc. Predictive sensor array configuration system for an autonomous vehicle
CN112051855A (en) * 2016-01-05 2020-12-08 御眼视觉技术有限公司 Navigation system for a host vehicle, autonomous vehicle and method of navigating an autonomous vehicle
US10029682B2 (en) * 2016-01-22 2018-07-24 Toyota Motor Engineering & Manufacturing North America, Inc. Surrounding vehicle classification and path prediction
JP6757749B2 (en) * 2016-01-29 2020-09-23 株式会社小松製作所 Work machine management system, work machine, work machine management method
US9904867B2 (en) * 2016-01-29 2018-02-27 Pointivo, Inc. Systems and methods for extracting information about objects from scene information
US9902311B2 (en) 2016-02-22 2018-02-27 Uber Technologies, Inc. Lighting device for a vehicle
US9990548B2 (en) 2016-03-09 2018-06-05 Uber Technologies, Inc. Traffic signal analysis system
US9903733B2 (en) * 2016-03-17 2018-02-27 Honda Motor Co., Ltd. Vehicular communications network and methods of use and manufacture thereof
US11267489B2 (en) * 2016-03-17 2022-03-08 Hitachi, Ltd. Automatic operation assistance system and automatic operation assistance method
JP6897668B2 (en) * 2016-03-30 2021-07-07 ソニーグループ株式会社 Information processing method and information processing equipment
JP6327283B2 (en) * 2016-04-06 2018-05-23 トヨタ自動車株式会社 Vehicle information providing device
US10816654B2 (en) * 2016-04-22 2020-10-27 Huawei Technologies Co., Ltd. Systems and methods for radar-based localization
US10545229B2 (en) * 2016-04-22 2020-01-28 Huawei Technologies Co., Ltd. Systems and methods for unified mapping of an environment
US9672446B1 (en) * 2016-05-06 2017-06-06 Uber Technologies, Inc. Object detection for an autonomous vehicle
EP3258433A1 (en) 2016-06-17 2017-12-20 Starship Technologies OÜ Method and system for delivering items
CN106096192B (en) * 2016-06-27 2019-05-28 百度在线网络技术(北京)有限公司 A kind of construction method and device of the test scene of automatic driving vehicle
US10474162B2 (en) 2016-07-01 2019-11-12 Uatc, Llc Autonomous vehicle localization using passive image data
US9754205B1 (en) * 2016-07-01 2017-09-05 Intraspexion Inc. Using classified text or images and deep learning algorithms to identify risk of product defect and provide early warning
US9754206B1 (en) * 2016-07-01 2017-09-05 Intraspexion Inc. Using classified text and deep learning algorithms to identify drafting risks in a document and provide early warning
US10821987B2 (en) * 2016-07-20 2020-11-03 Ford Global Technologies, Llc Vehicle interior and exterior monitoring
US10203696B2 (en) * 2016-07-26 2019-02-12 Waymo Llc Determining drivability of objects for autonomous vehicles
US20180052470A1 (en) * 2016-08-18 2018-02-22 GM Global Technology Operations LLC Obstacle Avoidance Co-Pilot For Autonomous Vehicles
WO2018037900A1 (en) * 2016-08-22 2018-03-01 ソニー株式会社 Driving support device, method, mobile body, and program
US10066946B2 (en) 2016-08-26 2018-09-04 Here Global B.V. Automatic localization geometry detection
US20190004524A1 (en) * 2016-08-31 2019-01-03 Faraday&Future Inc. System and method for planning a vehicle path
US10677894B2 (en) 2016-09-06 2020-06-09 Magna Electronics Inc. Vehicle sensing system for classification of vehicle model
US10317901B2 (en) 2016-09-08 2019-06-11 Mentor Graphics Development (Deutschland) Gmbh Low-level sensor fusion
US10585409B2 (en) 2016-09-08 2020-03-10 Mentor Graphics Corporation Vehicle localization with map-matched sensor measurements
US10678240B2 (en) 2016-09-08 2020-06-09 Mentor Graphics Corporation Sensor modification based on an annotated environmental model
US11067996B2 (en) 2016-09-08 2021-07-20 Siemens Industry Software Inc. Event-driven region of interest management
CN109789880B (en) * 2016-09-21 2022-03-11 苹果公司 External communication of vehicle
EP3485337B1 (en) * 2016-09-23 2020-10-21 Apple Inc. Decision making for autonomous vehicle motion control
US10152058B2 (en) * 2016-10-24 2018-12-11 Ford Global Technologies, Llc Vehicle virtual map
EP3322149B1 (en) * 2016-11-10 2023-09-13 Tata Consultancy Services Limited Customized map generation with real time messages and locations from concurrent users
US10769452B2 (en) * 2016-11-14 2020-09-08 Lyft, Inc. Evaluating and presenting pick-up and drop-off locations in a situational-awareness view of an autonomous vehicle
US11042161B2 (en) 2016-11-16 2021-06-22 Symbol Technologies, Llc Navigation control method and apparatus in a mobile automation system
JP2020516091A (en) * 2016-11-29 2020-05-28 ルミレッズ ホールディング ベーフェー Vehicle monitoring
US10065647B2 (en) 2016-12-02 2018-09-04 Starsky Robotics, Inc. Vehicle control system and method of use
EP3330908A1 (en) 2016-12-02 2018-06-06 Starship Technologies OÜ System and method for securely delivering packages to different delivery recipients with a single vehicle
WO2018099930A1 (en) 2016-12-02 2018-06-07 Starship Technologies Oü System and method for securely delivering packages to different delivery recipients with a single vehicle
KR101911703B1 (en) * 2016-12-09 2018-10-25 엘지전자 주식회사 Driving control apparatus for vehicle and vehicle
KR20180070932A (en) * 2016-12-19 2018-06-27 삼성전자주식회사 A movable object and a method for controlling the same
US10118627B2 (en) * 2016-12-20 2018-11-06 Uber Technologies, Inc. Vehicle controls based on the measured weight of freight
JP6551382B2 (en) 2016-12-22 2019-07-31 トヨタ自動車株式会社 Collision avoidance support device
GB2558273A (en) * 2016-12-23 2018-07-11 Hohla Martin Sensor system for a vehicle and method for determining assessment threat
EP3343431A1 (en) * 2016-12-28 2018-07-04 Volvo Car Corporation Method and system for vehicle localization from camera image
US10353931B2 (en) * 2016-12-30 2019-07-16 DeepMap Inc. High definition map and route storage management system for autonomous vehicles
WO2018126067A1 (en) * 2016-12-30 2018-07-05 DeepMap Inc. Vector data encoding of high definition map data for autonomous vehicles
US10837773B2 (en) * 2016-12-30 2020-11-17 DeepMap Inc. Detection of vertical structures based on LiDAR scanner data for high-definition maps for autonomous vehicles
US10168167B2 (en) 2017-01-25 2019-01-01 Via Transportation, Inc. Purposefully selecting longer routes to improve user satisfaction
JP6760114B2 (en) * 2017-01-31 2020-09-23 富士通株式会社 Information processing equipment, data management equipment, data management systems, methods, and programs
US10436595B2 (en) * 2017-02-02 2019-10-08 Baidu Usa Llc Method and system for updating localization maps of autonomous driving vehicles
DE102017201936A1 (en) * 2017-02-08 2018-08-09 Robert Bosch Gmbh Method for reducing collision damage
EP3360746A1 (en) * 2017-02-13 2018-08-15 Autoliv Development AB Apparatus operable to determine a position of a portion of a lane
US10780879B2 (en) * 2017-02-14 2020-09-22 Denso Ten Limited Parking controller, parking control system, and parking control method
US20180232840A1 (en) * 2017-02-15 2018-08-16 Uber Technologies, Inc. Geospatial clustering for service coordination systems
US10133275B1 (en) 2017-03-01 2018-11-20 Zoox, Inc. Trajectory generation using temporal logic and tree search
US10671076B1 (en) 2017-03-01 2020-06-02 Zoox, Inc. Trajectory prediction of third-party objects using temporal logic and tree search
WO2018159429A1 (en) 2017-03-02 2018-09-07 パナソニックIpマネジメント株式会社 Driving assistance method, and driving assistance device and driving assistance system using said method
US10293818B2 (en) 2017-03-07 2019-05-21 Uber Technologies, Inc. Teleassistance data prioritization for self-driving vehicles
US10202126B2 (en) 2017-03-07 2019-02-12 Uber Technologies, Inc. Teleassistance data encoding for self-driving vehicles
JP6717240B2 (en) * 2017-03-08 2020-07-01 株式会社デンソー Target detection device
US10518770B2 (en) * 2017-03-14 2019-12-31 Uatc, Llc Hierarchical motion planning for autonomous vehicles
WO2018170074A1 (en) 2017-03-14 2018-09-20 Starsky Robotics, Inc. Vehicle sensor system and method of use
DE102017204357A1 (en) * 2017-03-16 2018-09-20 Robert Bosch Gmbh Method and device for updating a digital map for vehicle navigation
JP6914065B2 (en) * 2017-03-17 2021-08-04 シャープ株式会社 Obstacle detection device, traveling device, obstacle detection system and obstacle detection method
US10466953B2 (en) * 2017-03-30 2019-11-05 Microsoft Technology Licensing, Llc Sharing neighboring map data across devices
US10248121B2 (en) * 2017-03-31 2019-04-02 Uber Technologies, Inc. Machine-learning based autonomous vehicle management system
US20180290590A1 (en) * 2017-04-07 2018-10-11 GM Global Technology Operations LLC Systems for outputting an alert from a vehicle to warn nearby entities
US20180299899A1 (en) * 2017-04-13 2018-10-18 Neato Robotics, Inc. Localized collection of ambient data
US10838422B2 (en) * 2017-04-13 2020-11-17 Panasonic Intellectual Property Corporation Of America Information processing method and information processing apparatus
US10552691B2 (en) 2017-04-25 2020-02-04 TuSimple System and method for vehicle position and velocity estimation based on camera and lidar data
US10679312B2 (en) * 2017-04-25 2020-06-09 Lyft Inc. Dynamic autonomous vehicle servicing and management
WO2018195996A1 (en) * 2017-04-28 2018-11-01 SZ DJI Technology Co., Ltd. Multi-object tracking based on lidar point cloud
US11367092B2 (en) 2017-05-01 2022-06-21 Symbol Technologies, Llc Method and apparatus for extracting and processing price text from an image set
DE112018002314T5 (en) 2017-05-01 2020-01-23 Symbol Technologies, Llc METHOD AND DEVICE FOR DETECTING AN OBJECT STATUS
US10949798B2 (en) 2017-05-01 2021-03-16 Symbol Technologies, Llc Multimodal localization and mapping for a mobile automation apparatus
US11449059B2 (en) * 2017-05-01 2022-09-20 Symbol Technologies, Llc Obstacle detection for a mobile automation apparatus
WO2018204342A1 (en) 2017-05-01 2018-11-08 Symbol Technologies, Llc Product status detection system
US10884409B2 (en) 2017-05-01 2021-01-05 Mentor Graphics (Deutschland) Gmbh Training of machine learning sensor data classification system
GB2562049A (en) * 2017-05-02 2018-11-07 Kompetenzzentrum Das Virtuelle Fahrzeug Improved pedestrian prediction by using enhanced map data in automated vehicles
US11600084B2 (en) 2017-05-05 2023-03-07 Symbol Technologies, Llc Method and apparatus for detecting and interpreting price label text
US10942525B2 (en) 2017-05-09 2021-03-09 Uatc, Llc Navigational constraints for autonomous vehicles
US20180328745A1 (en) * 2017-05-09 2018-11-15 Uber Technologies, Inc. Coverage plan generation and implementation
DE102017004473A1 (en) * 2017-05-10 2018-11-15 GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) Method and device for creating a model for the environment of a vehicle
US9970615B1 (en) * 2017-05-23 2018-05-15 Ford Global Technologies, Llc Light-based vehicle-device communications
US11282009B2 (en) 2017-05-23 2022-03-22 Uatc, Llc Fleet utilization efficiency for on-demand transportation services
US10762447B2 (en) * 2017-05-23 2020-09-01 Uatc, Llc Vehicle selection for on-demand transportation services
US10186156B2 (en) * 2017-05-25 2019-01-22 Uber Technologies, Inc. Deploying human-driven vehicles for autonomous vehicle routing and localization map updating
US10663303B2 (en) * 2017-06-12 2020-05-26 Panasonic Intellectual Property Management Co., Ltd. System and method for dynamically authenticating map data using blockchains
US10543828B2 (en) * 2017-06-13 2020-01-28 Nissan North America, Inc. Structured multivariate contextual vehicle operation with integrated semiotic control
US10444759B2 (en) 2017-06-14 2019-10-15 Zoox, Inc. Voxel based ground plane estimation and object segmentation
US10983206B2 (en) * 2017-11-07 2021-04-20 FLIR Belgium BVBA Low cost high precision GNSS systems and methods
US10595455B2 (en) * 2017-06-19 2020-03-24 Cnh Industrial America Llc Planning system for an autonomous work vehicle system
US20180374341A1 (en) * 2017-06-27 2018-12-27 GM Global Technology Operations LLC Systems and methods for predicting traffic patterns in an autonomous vehicle
WO2019000417A1 (en) * 2017-06-30 2019-01-03 SZ DJI Technology Co., Ltd. Map generation systems and methods
US10759534B2 (en) 2017-07-03 2020-09-01 George A. Miller Method and system from controlling an unmanned aerial vehicle
US10564638B1 (en) * 2017-07-07 2020-02-18 Zoox, Inc. Teleoperator situational awareness
US10438486B2 (en) * 2017-07-10 2019-10-08 Lyft, Inc. Dynamic modeling and simulation of an autonomous vehicle fleet using real-time autonomous vehicle sensor input
US10578453B2 (en) * 2017-07-14 2020-03-03 Rosemount Aerospace Inc. Render-based trajectory planning
JP6974465B2 (en) * 2017-07-18 2021-12-01 パイオニア株式会社 Controls, control methods, and programs
US10274961B2 (en) * 2017-07-26 2019-04-30 GM Global Technology Operations LLC Path planning for autonomous driving
WO2019023324A1 (en) 2017-07-26 2019-01-31 Via Transportation, Inc. Systems and methods for managing and routing ridesharing vehicles
JP7299210B2 (en) * 2017-07-28 2023-06-27 ニューロ・インコーポレーテッド Systems and Mechanisms for Upselling Products in Autonomous Vehicles
CN113119963B (en) 2017-07-28 2024-03-26 现代摩比斯株式会社 Intelligent ultrasonic system, vehicle rear collision warning device and control method thereof
GB201718507D0 (en) 2017-07-31 2017-12-27 Univ Oxford Innovation Ltd A method of constructing a model of the motion of a mobile device and related systems
EP3667450B1 (en) * 2017-08-07 2021-10-13 Panasonic Corporation Mobile body and method for control of mobile body
US11175132B2 (en) 2017-08-11 2021-11-16 Zoox, Inc. Sensor perturbation
EP3665501A1 (en) * 2017-08-11 2020-06-17 Zoox, Inc. Vehicle sensor calibration and localization
US10983199B2 (en) 2017-08-11 2021-04-20 Zoox, Inc. Vehicle sensor calibration and localization
CN107450546A (en) * 2017-08-16 2017-12-08 北京克路德人工智能科技有限公司 Obstacle Avoidance based on GPS and ultrasonic radar
WO2019035997A1 (en) * 2017-08-17 2019-02-21 Sri International Advanced control system with multiple control paradigms
US10599931B2 (en) * 2017-08-21 2020-03-24 2236008 Ontario Inc. Automated driving system that merges heterogenous sensor data
US20190065878A1 (en) * 2017-08-22 2019-02-28 GM Global Technology Operations LLC Fusion of radar and vision sensor systems
US10558217B2 (en) * 2017-08-28 2020-02-11 GM Global Technology Operations LLC Method and apparatus for monitoring of an autonomous vehicle
CN111279734A (en) * 2017-09-01 2020-06-12 诺基亚技术有限公司 Reducing coverage problems via dynamic measurements
CN109425353B (en) * 2017-09-05 2020-12-15 阿里巴巴(中国)有限公司 Main and auxiliary road transfer identification method and device
US10282995B2 (en) 2017-09-05 2019-05-07 Starship Technologies Oü Mobile robot having collision avoidance system for crossing a road from a pedestrian pathway
US20190079526A1 (en) * 2017-09-08 2019-03-14 Uber Technologies, Inc. Orientation Determination in Object Detection and Tracking for Autonomous Vehicles
US11378955B2 (en) 2017-09-08 2022-07-05 Motional Ad Llc Planning autonomous motion
US10794710B1 (en) * 2017-09-08 2020-10-06 Perceptin Shenzhen Limited High-precision multi-layer visual and semantic map by autonomous units
CN115503752A (en) * 2017-09-08 2022-12-23 动态Ad有限责任公司 Method and system for operating a vehicle
DE102017215868A1 (en) * 2017-09-08 2019-03-14 Robert Bosch Gmbh Method and device for creating a map
US10515321B2 (en) * 2017-09-11 2019-12-24 Baidu Usa Llc Cost based path planning for autonomous driving vehicles
US10754339B2 (en) * 2017-09-11 2020-08-25 Baidu Usa Llc Dynamic programming and quadratic programming based decision and planning for autonomous driving vehicles
EP3665507A4 (en) * 2017-09-13 2021-08-11 Velodyne Lidar USA, Inc. Multiple resolution, simultaneous localization and mapping based on 3-d lidar measurements
US10710599B2 (en) * 2017-09-15 2020-07-14 Toyota Research Institute, Inc. System and method for online probabilistic change detection in feature-based maps
US10514697B2 (en) 2017-09-15 2019-12-24 GM Global Technology Operations LLC Vehicle remote assistance mode
US10606277B2 (en) * 2017-09-18 2020-03-31 Baidu Usa Llc Speed optimization based on constrained smoothing spline for autonomous driving vehicles
US10860034B1 (en) * 2017-09-27 2020-12-08 Apple Inc. Barrier detection
DE102017122440A1 (en) * 2017-09-27 2019-03-28 Valeo Schalter Und Sensoren Gmbh A method for locating and further developing a digital map by a motor vehicle; localization device
US11465586B1 (en) * 2017-09-28 2022-10-11 Apple Inc. User-to-vehicle interaction
US10612932B2 (en) * 2017-09-29 2020-04-07 Wipro Limited Method and system for correcting a pre-generated navigation path for an autonomous vehicle
US10782138B2 (en) * 2017-10-06 2020-09-22 Here Global B.V. Method, apparatus, and computer program product for pedestrian behavior profile generation
WO2019074478A1 (en) * 2017-10-09 2019-04-18 Vivek Anand Sujan Autonomous safety systems and methods for vehicles
US10802485B2 (en) 2017-10-09 2020-10-13 Here Global B.V. Apparatus, method and computer program product for facilitating navigation of a vehicle based upon a quality index of the map data
CN107678306B (en) * 2017-10-09 2021-04-16 驭势(上海)汽车科技有限公司 Dynamic scene information recording and simulation playback method, device, equipment and medium
JP6939376B2 (en) * 2017-10-10 2021-09-22 トヨタ自動車株式会社 Autonomous driving system
US10473772B2 (en) * 2017-10-12 2019-11-12 Ford Global Technologies, Llc Vehicle sensor operation
US20190113920A1 (en) 2017-10-18 2019-04-18 Luminar Technologies, Inc. Controlling an autonomous vehicle using model predictive control
US10867455B2 (en) 2017-10-20 2020-12-15 Appliedea, Inc. Diagnostics, prognostics, and health management for vehicles using kinematic clusters, behavioral sensor data, and maintenance impact data
DE102017124788A1 (en) * 2017-10-24 2019-04-25 Jungheinrich Ag Method for driving support in an industrial truck and industrial truck
US10739775B2 (en) * 2017-10-28 2020-08-11 Tusimple, Inc. System and method for real world autonomous vehicle trajectory simulation
US10203210B1 (en) 2017-11-03 2019-02-12 Toyota Research Institute, Inc. Systems and methods for road scene change detection using semantic segmentation
US10967862B2 (en) * 2017-11-07 2021-04-06 Uatc, Llc Road anomaly detection for autonomous vehicle
US11556777B2 (en) * 2017-11-15 2023-01-17 Uatc, Llc Continuous convolution and fusion in neural networks
US10535138B2 (en) 2017-11-21 2020-01-14 Zoox, Inc. Sensor data segmentation
WO2019107147A1 (en) * 2017-11-28 2019-06-06 株式会社小糸製作所 Mobile-body display device
US10860018B2 (en) * 2017-11-30 2020-12-08 Tusimple, Inc. System and method for generating simulated vehicles with configured behaviors for analyzing autonomous vehicle motion planners
US10908607B2 (en) * 2017-11-30 2021-02-02 Ford Global Technologies, Llc Enhanced traffic jam assist
US10343286B2 (en) 2017-12-01 2019-07-09 Starship Technologies Oü Storage system, use and method with robotic parcel retrieval and loading onto a delivery vehicle
US10509410B2 (en) 2017-12-06 2019-12-17 Zoox, Inc. External control of an autonomous vehicle
US11073828B2 (en) * 2017-12-08 2021-07-27 Samsung Electronics Co., Ltd. Compression of semantic information for task and motion planning
US10899348B2 (en) * 2017-12-20 2021-01-26 Here Global B.V. Method, apparatus and computer program product for associating map objects with road links
JP7007183B2 (en) * 2017-12-27 2022-01-24 日立Astemo株式会社 Traffic flow control device, data structure of driving scenario
EP3506040B8 (en) * 2017-12-28 2021-09-22 Einride AB Cooperative sensing
CN108344414A (en) 2017-12-29 2018-07-31 中兴通讯股份有限公司 A kind of map structuring, air navigation aid and device, system
US11009359B2 (en) 2018-01-05 2021-05-18 Lacuna Technologies Inc. Transportation systems and related methods
CN110015290B (en) * 2018-01-08 2020-12-01 湖南中车时代电动汽车股份有限公司 Control method for intelligent driving system
WO2019136341A1 (en) 2018-01-08 2019-07-11 Via Transportation, Inc. Systems and methods for managing and scheduling ridesharing vehicles
US11422561B2 (en) * 2018-01-11 2022-08-23 Toyota Jidosha Kabushiki Kaisha Sensor system for multiple perspective sensor data sets
US11262756B2 (en) * 2018-01-15 2022-03-01 Uatc, Llc Discrete decision architecture for motion planning system of an autonomous vehicle
EP3514494A1 (en) * 2018-01-19 2019-07-24 Zenuity AB Constructing and updating a behavioral layer of a multi layered road network high definition digital map
WO2019147235A1 (en) * 2018-01-24 2019-08-01 Ford Global Technologies, Llc Path planning for autonomous moving devices
US10553044B2 (en) * 2018-01-31 2020-02-04 Mentor Graphics Development (Deutschland) Gmbh Self-diagnosis of faults with a secondary system in an autonomous driving system
US11145146B2 (en) 2018-01-31 2021-10-12 Mentor Graphics (Deutschland) Gmbh Self-diagnosis of faults in an autonomous driving system
US10480952B2 (en) * 2018-02-01 2019-11-19 Didi Research America, Llc Probabilistic navigation system and method
DE102018201889A1 (en) * 2018-02-07 2019-08-08 Bayerische Motoren Werke Aktiengesellschaft Updating a geographical map
US10955851B2 (en) 2018-02-14 2021-03-23 Zoox, Inc. Detecting blocking objects
US11157527B2 (en) 2018-02-20 2021-10-26 Zoox, Inc. Creating clean maps including semantic information
US11093759B2 (en) 2018-03-06 2021-08-17 Here Global B.V. Automatic identification of roadside objects for localization
EP3540463B1 (en) * 2018-03-09 2022-06-01 Tata Consultancy Services Limited Radar and ultrasound sensor based real time tracking of a moving object
US10740964B2 (en) 2018-03-13 2020-08-11 Recogni Inc. Three-dimensional environment modeling based on a multi-camera convolver system
JP7030573B2 (en) * 2018-03-15 2022-03-07 本田技研工業株式会社 Vehicle control devices, vehicle control methods, and programs
US10384718B1 (en) 2018-03-26 2019-08-20 Zoox, Inc. Vehicle parking assist
WO2019189525A1 (en) * 2018-03-27 2019-10-03 パナソニックIpマネジメント株式会社 Automatic driving control device, vehicle, and demand mediation system
US10627818B2 (en) 2018-03-28 2020-04-21 Zoox, Inc. Temporal prediction model for semantic intent understanding
US20180215377A1 (en) * 2018-03-29 2018-08-02 GM Global Technology Operations LLC Bicycle and motorcycle protection behaviors
US10521913B2 (en) * 2018-03-29 2019-12-31 Aurora Innovation, Inc. Relative atlas for autonomous vehicle and generation thereof
US10468062B1 (en) 2018-04-03 2019-11-05 Zoox, Inc. Detecting errors in sensor data
US10823572B2 (en) 2018-04-05 2020-11-03 Symbol Technologies, Llc Method, system and apparatus for generating navigational data
US11327504B2 (en) 2018-04-05 2022-05-10 Symbol Technologies, Llc Method, system and apparatus for mobile automation apparatus localization
US10414395B1 (en) 2018-04-06 2019-09-17 Zoox, Inc. Feature-based prediction
US11262755B2 (en) 2018-04-09 2022-03-01 Toyota Motor Engineering & Manufacturing North America, Inc. Driver assistance system for autonomously indicating vehicle user intent in response to a predefined driving situation
US11620592B2 (en) 2018-04-09 2023-04-04 Via Transportation, Inc. Systems and methods for planning transportation routes
US11467590B2 (en) 2018-04-09 2022-10-11 SafeAI, Inc. Techniques for considering uncertainty in use of artificial intelligence models
US11169536B2 (en) 2018-04-09 2021-11-09 SafeAI, Inc. Analysis of scenarios for controlling vehicle operations
US11625036B2 (en) 2018-04-09 2023-04-11 SafeAl, Inc. User interface for presenting decisions
US11561541B2 (en) * 2018-04-09 2023-01-24 SafeAI, Inc. Dynamically controlling sensor behavior
KR102420568B1 (en) * 2018-04-27 2022-07-13 삼성전자주식회사 Method for determining a position of a vehicle and vehicle thereof
US11334753B2 (en) 2018-04-30 2022-05-17 Uatc, Llc Traffic signal state classification for autonomous vehicles
US11610165B2 (en) 2018-05-09 2023-03-21 Volvo Car Corporation Method and system for orchestrating multi-party services using semi-cooperative nash equilibrium based on artificial intelligence, neural network models,reinforcement learning and finite-state automata
US10860025B2 (en) * 2018-05-15 2020-12-08 Toyota Research Institute, Inc. Modeling graph of interactions between agents
US10499180B1 (en) 2018-05-17 2019-12-03 Zoox, Inc. Three-dimensional sound for passenger notification
US11126873B2 (en) 2018-05-17 2021-09-21 Zoox, Inc. Vehicle lighting state determination
US10315563B1 (en) 2018-05-22 2019-06-11 Zoox, Inc. Acoustic notifications
US10414336B1 (en) 2018-05-22 2019-09-17 Zoox, Inc. Acoustic notifications
US20210132604A1 (en) * 2018-05-31 2021-05-06 Carla R. Gillett Autonomous passenger vehicle system
US11650059B2 (en) 2018-06-06 2023-05-16 Toyota Research Institute, Inc. Systems and methods for localizing a vehicle using an accuracy specification
US11287816B2 (en) 2018-06-11 2022-03-29 Uatc, Llc Navigational constraints for autonomous vehicles
DE102018209603A1 (en) * 2018-06-14 2019-12-19 Robert Bosch Gmbh Method and device for controlling self-driving vehicles
CN110832275B (en) 2018-06-14 2021-05-18 北京嘀嘀无限科技发展有限公司 System and method for updating high-resolution map based on binocular image
KR102092392B1 (en) * 2018-06-15 2020-03-23 네이버랩스 주식회사 Method and system for automatically collecting and updating information about point of interest in real space
US10642275B2 (en) 2018-06-18 2020-05-05 Zoox, Inc. Occulsion aware planning and control
US11203318B2 (en) 2018-06-18 2021-12-21 Waymo Llc Airbag extension system
US11048265B2 (en) 2018-06-18 2021-06-29 Zoox, Inc. Occlusion aware planning
JP7422687B2 (en) * 2018-06-18 2024-01-26 ズークス インコーポレイテッド Occlusion awareness planning
US11354406B2 (en) * 2018-06-28 2022-06-07 Intel Corporation Physics-based approach for attack detection and localization in closed-loop controls for autonomous vehicles
US11066067B2 (en) * 2018-06-29 2021-07-20 Baidu Usa Llc Planning parking trajectory for self-driving vehicles
US10899364B2 (en) 2018-07-02 2021-01-26 International Business Machines Corporation Autonomous vehicle system
CN109141446B (en) * 2018-07-04 2021-11-12 阿波罗智能技术(北京)有限公司 Method, apparatus, device and computer-readable storage medium for obtaining map
DE102018211604A1 (en) * 2018-07-12 2020-01-16 Robert Bosch Gmbh Mobile device and method for operating the mobile device
JP7025294B2 (en) * 2018-07-13 2022-02-24 ヤフー株式会社 Decision device, decision method, and decision program
US20200033870A1 (en) * 2018-07-24 2020-01-30 Exyn Technologies Fault Tolerant State Estimation
US11138350B2 (en) * 2018-08-09 2021-10-05 Zoox, Inc. Procedural world generation using tertiary data
US11126763B1 (en) * 2018-08-22 2021-09-21 Waymo Llc Realism metric for testing software for controlling autonomous vehicles
TWI679511B (en) * 2018-08-22 2019-12-11 和碩聯合科技股份有限公司 Method and system for planning trajectory
CN110146097B (en) * 2018-08-28 2022-05-13 北京初速度科技有限公司 Method and system for generating automatic driving navigation map, vehicle-mounted terminal and server
US11009590B2 (en) 2018-08-29 2021-05-18 Aptiv Technologies Limited Annotation of radar-profiles of objects
CN109165150A (en) * 2018-08-30 2019-01-08 百度在线网络技术(北京)有限公司 Information processing method, device and equipment in automatic driving vehicle
CN109367500B (en) * 2018-08-31 2021-03-23 百度在线网络技术(北京)有限公司 Vehicle control processing method, device, equipment and storage medium
EP3620978A1 (en) * 2018-09-07 2020-03-11 Ibeo Automotive Systems GmbH Method and device for classifying objects
KR20200029785A (en) * 2018-09-11 2020-03-19 삼성전자주식회사 Localization method and apparatus of displaying virtual object in augmented reality
JP6873960B2 (en) * 2018-09-27 2021-05-19 株式会社日立製作所 Map data high-detailed system, its server, and its method
US11353577B2 (en) 2018-09-28 2022-06-07 Zoox, Inc. Radar spatial estimation
US10782136B2 (en) * 2018-09-28 2020-09-22 Zoox, Inc. Modifying map elements associated with map data
KR102233260B1 (en) * 2018-10-02 2021-03-29 에스케이텔레콤 주식회사 Apparatus and method for updating high definition map
US11506483B2 (en) 2018-10-05 2022-11-22 Zebra Technologies Corporation Method, system and apparatus for support structure depth determination
US11010920B2 (en) 2018-10-05 2021-05-18 Zebra Technologies Corporation Method, system and apparatus for object detection in point clouds
JP7253446B2 (en) * 2018-10-05 2023-04-06 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ Information processing method and information processing system
US10816987B2 (en) * 2018-10-15 2020-10-27 Zoox, Inc. Responsive vehicle control
CN109949422B (en) * 2018-10-15 2020-12-15 华为技术有限公司 Data processing method and equipment for virtual scene
WO2020078572A1 (en) * 2018-10-19 2020-04-23 Harman Becker Automotive Systems Gmbh Global map creation using fleet trajectories and observations
US11035931B1 (en) * 2018-10-29 2021-06-15 Ansys, Inc. Accelerated radar data processing via limited pulse extrapolation
DK180774B1 (en) 2018-10-29 2022-03-04 Motional Ad Llc Automatic annotation of environmental features in a map during navigation of a vehicle
US10928819B2 (en) * 2018-10-29 2021-02-23 Here Global B.V. Method and apparatus for comparing relevant information between sensor measurements
US20200133272A1 (en) * 2018-10-29 2020-04-30 Aptiv Technologies Limited Automatic generation of dimensionally reduced maps and spatiotemporal localization for navigation of a vehicle
GB2613740B (en) * 2018-10-30 2023-12-06 Motional Ad Llc Redundancy in autonomous vehicles
US11829143B2 (en) * 2018-11-02 2023-11-28 Aurora Operations, Inc. Labeling autonomous vehicle data
WO2020097221A1 (en) * 2018-11-08 2020-05-14 Evangelos Simoudis Systems and methods for managing vehicle data
US11003188B2 (en) 2018-11-13 2021-05-11 Zebra Technologies Corporation Method, system and apparatus for obstacle handling in navigational path generation
US11124185B2 (en) 2018-11-13 2021-09-21 Zoox, Inc. Perception collision avoidance
US11090811B2 (en) 2018-11-13 2021-08-17 Zebra Technologies Corporation Method and apparatus for labeling of support structures
US10464577B1 (en) * 2018-11-15 2019-11-05 GM Global Technology Operations LLC Contextual autonomous vehicle support through written interaction
KR102499976B1 (en) * 2018-11-29 2023-02-15 현대자동차주식회사 Vehicle and control method thereof
US10950125B2 (en) * 2018-12-03 2021-03-16 Nec Corporation Calibration for wireless localization and detection of vulnerable road users
US11580687B2 (en) * 2018-12-04 2023-02-14 Ottopia Technologies Ltd. Transferring data from autonomous vehicles
US10885785B2 (en) 2018-12-04 2021-01-05 At&T Intellectual Property I, L.P. Network-controllable physical resources for vehicular transport system safety
US11079240B2 (en) 2018-12-07 2021-08-03 Zebra Technologies Corporation Method, system and apparatus for adaptive particle filter localization
US11416000B2 (en) 2018-12-07 2022-08-16 Zebra Technologies Corporation Method and apparatus for navigational ray tracing
US11100303B2 (en) 2018-12-10 2021-08-24 Zebra Technologies Corporation Method, system and apparatus for auxiliary label detection and association
US11022445B2 (en) * 2018-12-11 2021-06-01 Here Global B.V. Segmented path coordinate system
US11015938B2 (en) 2018-12-12 2021-05-25 Zebra Technologies Corporation Method, system and apparatus for navigational assistance
US11104332B2 (en) * 2018-12-12 2021-08-31 Zoox, Inc. Collision avoidance system with trajectory validation
DE102018132520A1 (en) * 2018-12-17 2020-06-18 Trw Automotive Gmbh Method and system for controlling a motor vehicle
KR102092482B1 (en) * 2018-12-21 2020-03-23 부산대학교 산학협력단 Method and Apparatus for Detection of Feature Interaction in Autonomous System Using Pattern
IL270540A (en) * 2018-12-26 2020-06-30 Yandex Taxi Llc Method and system for training machine learning algorithm to detect objects at distance
US10948300B2 (en) * 2018-12-27 2021-03-16 Beijing Voyager Technology Co., Ltd. Systems and methods for path determination
CA3028708A1 (en) 2018-12-28 2020-06-28 Zih Corp. Method, system and apparatus for dynamic loop closure in mapping trajectories
CN111382768B (en) 2018-12-29 2023-11-14 华为技术有限公司 Multi-sensor data fusion method and device
US10547941B1 (en) 2019-01-16 2020-01-28 Ford Global Technologies, Llc Vehicle acoustic transducer operation
US11016489B2 (en) * 2019-01-18 2021-05-25 Baidu Usa Llc Method to dynamically determine vehicle effective sensor coverage for autonomous driving application
JP7248208B2 (en) * 2019-01-30 2023-03-29 バイドゥドットコム タイムズ テクノロジー (ベイジン) カンパニー リミテッド Point cloud registration system for self-driving cars
GB2621722B (en) * 2019-02-14 2024-06-05 Mobileye Vision Technologies Ltd Systems and methods for vehicle navigation
US11312379B2 (en) * 2019-02-15 2022-04-26 Rockwell Collins, Inc. Occupancy map synchronization in multi-vehicle networks
WO2020166745A1 (en) * 2019-02-15 2020-08-20 엘지전자 주식회사 Electronic device for vehicle, and method and system for operating electronic device for vehicle
US11526816B2 (en) 2019-02-27 2022-12-13 Uber Technologies, Inc. Context-based remote autonomous vehicle assistance
CN113498391B (en) * 2019-03-08 2023-05-16 马自达汽车株式会社 Automobile computing device
US11232711B2 (en) * 2019-03-27 2022-01-25 Robotic Research Opco, Llc Message conveying system of rendezvous locations for stranded autonomous vehicles
US11507084B2 (en) * 2019-03-27 2022-11-22 Intel Corporation Collaborative 3-D environment map for computer-assisted or autonomous driving vehicles
JP7288781B2 (en) * 2019-03-27 2023-06-08 本田技研工業株式会社 INFORMATION PROVIDING DEVICE, INFORMATION PROVIDING METHOD AND PROGRAM
US11532060B2 (en) * 2019-03-28 2022-12-20 Lyft, Inc. Systems and methods for matching autonomous transportation provider vehicles and transportation requests in transportation management systems
US11181922B2 (en) 2019-03-29 2021-11-23 Zoox, Inc. Extension of autonomous driving functionality to new regions
US10957196B2 (en) 2019-04-03 2021-03-23 International Business Machines Corporation Traffic redirection for autonomous vehicles
US11321972B1 (en) 2019-04-05 2022-05-03 State Farm Mutual Automobile Insurance Company Systems and methods for detecting software interactions for autonomous vehicles within changing environmental conditions
US11048261B1 (en) 2019-04-05 2021-06-29 State Farm Mutual Automobile Insurance Company Systems and methods for evaluating autonomous vehicle software interactions for proposed trips
EP3722992B1 (en) * 2019-04-10 2023-03-01 Teraki GmbH System and method for pre-processing images captured by a vehicle
US11618439B2 (en) * 2019-04-11 2023-04-04 Phantom Auto Inc. Automatic imposition of vehicle speed restrictions depending on road situation analysis
US11105642B2 (en) 2019-04-17 2021-08-31 Waymo Llc Stranding and scoping analysis for autonomous vehicle services
US11922819B2 (en) * 2019-04-22 2024-03-05 Wonder Robotics Ltd System and method for autonomously landing a vertical take-off and landing (VTOL) aircraft
EP3730374A1 (en) * 2019-04-26 2020-10-28 Tusimple, Inc. Auditory assistant module for autonomous vehicles
DE102019206036A1 (en) 2019-04-26 2020-10-29 Volkswagen Aktiengesellschaft Method and device for determining the geographical position and orientation of a vehicle
US20200348668A1 (en) * 2019-04-30 2020-11-05 Nissan North America, Inc. Mobility Manager Trainer for Improved Autonomy in Exception Handling
US10511971B1 (en) * 2019-05-06 2019-12-17 Pointr Limited Systems and methods for location enabled search and secure authentication
US11716616B2 (en) * 2019-05-06 2023-08-01 Pointr Limited Systems and methods for location enabled search and secure authentication
US11447142B1 (en) 2019-05-16 2022-09-20 Waymo Llc Assessing surprise for autonomous vehicles
US11109041B2 (en) * 2019-05-16 2021-08-31 Tencent America LLC Method and apparatus for video coding
US11613253B2 (en) * 2019-05-29 2023-03-28 Baidu Usa Llc Method of monitoring localization functions in an autonomous driving vehicle
US11341663B2 (en) 2019-06-03 2022-05-24 Zebra Technologies Corporation Method, system and apparatus for detecting support structure obstructions
US11151743B2 (en) 2019-06-03 2021-10-19 Zebra Technologies Corporation Method, system and apparatus for end of aisle detection
US11662739B2 (en) 2019-06-03 2023-05-30 Zebra Technologies Corporation Method, system and apparatus for adaptive ceiling-based localization
US11960286B2 (en) 2019-06-03 2024-04-16 Zebra Technologies Corporation Method, system and apparatus for dynamic task sequencing
US11080566B2 (en) 2019-06-03 2021-08-03 Zebra Technologies Corporation Method, system and apparatus for gap detection in support structures with peg regions
US11402846B2 (en) 2019-06-03 2022-08-02 Zebra Technologies Corporation Method, system and apparatus for mitigating data capture light leakage
US11200677B2 (en) 2019-06-03 2021-12-14 Zebra Technologies Corporation Method, system and apparatus for shelf edge detection
CN112101069A (en) * 2019-06-18 2020-12-18 华为技术有限公司 Method and device for determining driving area information
US11549815B2 (en) * 2019-06-28 2023-01-10 GM Cruise Holdings LLC. Map change detection
US20220234621A1 (en) * 2019-06-28 2022-07-28 Gm Cruise Holdings Llc Augmented 3d map
US11574089B2 (en) * 2019-06-28 2023-02-07 Zoox, Inc. Synthetic scenario generator based on attributes
US11610078B2 (en) * 2019-06-28 2023-03-21 Zoox, Inc. Low variance region detection for improved high variance region detection using machine learning
US11163990B2 (en) * 2019-06-28 2021-11-02 Zoox, Inc. Vehicle control system and method for pedestrian detection based on head detection in sensor data
US11605236B2 (en) * 2019-06-28 2023-03-14 Zoox, Inc. Training a machine-learned model to detect low variance regions
US11568100B2 (en) * 2019-06-28 2023-01-31 Zoox, Inc. Synthetic scenario simulator based on events
WO2020264010A1 (en) 2019-06-28 2020-12-30 Zoox, Inc. Low variance region detection for improved detection
US11214270B2 (en) * 2019-06-28 2022-01-04 Woven Planet North America, Inc. Systems and methods for navigating vehicles with redundant navigation systems
CN112179361B (en) 2019-07-02 2022-12-06 华为技术有限公司 Method, device and storage medium for updating work map of mobile robot
US11593344B2 (en) * 2019-07-02 2023-02-28 Nvidia Corporation Updating high definition maps based on age of maps
GB201909693D0 (en) * 2019-07-05 2019-08-21 Vaion Ltd Computer-implemented method
JP7098580B2 (en) * 2019-07-05 2022-07-11 株式会社東芝 Predictors, predictors, programs and vehicle control systems
CN110414098B (en) * 2019-07-12 2021-07-06 北京三快在线科技有限公司 Generation method and device of simulation test environment
US11249479B2 (en) * 2019-07-18 2022-02-15 Nissan North America, Inc. System to recommend sensor view for quick situational awareness
CN110502797B (en) * 2019-07-24 2021-06-04 同济大学 Lane acquisition modeling system and method based on GNSS
US11458965B2 (en) 2019-08-13 2022-10-04 Zoox, Inc. Feasibility validation for vehicle trajectory selection
US11914368B2 (en) 2019-08-13 2024-02-27 Zoox, Inc. Modifying limits on vehicle dynamics for trajectories
US11407409B2 (en) 2019-08-13 2022-08-09 Zoox, Inc. System and method for trajectory validation
US11397434B2 (en) * 2019-08-13 2022-07-26 Zoox, Inc. Consistency validation for vehicle trajectory selection
US11314246B2 (en) * 2019-08-16 2022-04-26 Uber Technologies, Inc. Command toolbox for autonomous vehicles
US11353874B2 (en) 2019-08-20 2022-06-07 Zoox, Inc. Lane handling for merge prior to turn
CN114269618A (en) 2019-08-20 2022-04-01 祖克斯有限公司 Lane handling for merging before turning
US11468773B2 (en) 2019-08-20 2022-10-11 Zoox, Inc. Lane classification for improved vehicle handling
US11167754B2 (en) 2019-08-22 2021-11-09 Argo AI, LLC Systems and methods for trajectory based safekeeping of vehicles
US11072326B2 (en) * 2019-08-22 2021-07-27 Argo AI, LLC Systems and methods for trajectory based safekeeping of vehicles
US11958183B2 (en) 2019-09-19 2024-04-16 The Research Foundation For The State University Of New York Negotiation-based human-robot collaboration via augmented reality
US11541843B1 (en) 2019-09-20 2023-01-03 Zoox, Inc. Nonspecific vehicle
KR20210041224A (en) * 2019-10-07 2021-04-15 현대자동차주식회사 Vehicle and method of providing information around the same
CN110554396A (en) * 2019-10-21 2019-12-10 深圳市元征科技股份有限公司 laser radar mapping method, device, equipment and medium in indoor scene
US11480961B1 (en) 2019-11-21 2022-10-25 Zoox, Inc. Immersive sound for teleoperators
US11688082B2 (en) * 2019-11-22 2023-06-27 Baidu Usa Llc Coordinate gradient method for point cloud registration for autonomous vehicles
US11494533B2 (en) * 2019-11-27 2022-11-08 Waymo Llc Simulations with modified agents for testing autonomous vehicle software
FR3103906B1 (en) * 2019-11-29 2021-11-05 Balyo METHOD OF MAPPING THE MOVING ENVIRONMENT OF A FLEET OF SELF-GUIDED HANDLING VEHICLES
US11393489B2 (en) 2019-12-02 2022-07-19 Here Global B.V. Method, apparatus, and computer program product for road noise mapping
US11788859B2 (en) 2019-12-02 2023-10-17 Here Global B.V. Method, apparatus, and computer program product for road noise mapping
US11507103B2 (en) 2019-12-04 2022-11-22 Zebra Technologies Corporation Method, system and apparatus for localization-based historical obstacle handling
US11391577B2 (en) * 2019-12-04 2022-07-19 Pony Ai Inc. Dynamically modelling objects in map
US11361543B2 (en) * 2019-12-10 2022-06-14 Caterpillar Inc. System and method for detecting objects
US11345360B1 (en) 2019-12-12 2022-05-31 Zoox, Inc. Localization error handling
US11107238B2 (en) 2019-12-13 2021-08-31 Zebra Technologies Corporation Method, system and apparatus for detecting item facings
RU2733599C1 (en) * 2019-12-18 2020-10-05 федеральное государственное бюджетное образовательное учреждение высшего образования "Московский политехнический университет" (Московский Политех) Control system of power plant of unmanned hybrid vehicle
US11544936B2 (en) * 2019-12-20 2023-01-03 Zoox, Inc. In-path obstacle detection and avoidance system
US11462041B2 (en) 2019-12-23 2022-10-04 Zoox, Inc. Pedestrians with objects
US11789155B2 (en) 2019-12-23 2023-10-17 Zoox, Inc. Pedestrian object detection training
US11902327B2 (en) * 2020-01-06 2024-02-13 Microsoft Technology Licensing, Llc Evaluating a result of enforcement of access control policies instead of enforcing the access control policies
DE112020006441T5 (en) * 2020-01-08 2022-11-17 Mitsubishi Electric Corporation Vehicle control device and vehicle control method
US11383634B2 (en) 2020-01-14 2022-07-12 Qualcomm Incorporated Collaborative vehicle headlight directing
US11325524B2 (en) * 2020-01-14 2022-05-10 Qualcomm Incorporated Collaborative vehicle headlight directing
US11872929B2 (en) 2020-01-14 2024-01-16 Qualcomm Incorporated Collaborative vehicle headlight directing
US11241996B2 (en) 2020-01-14 2022-02-08 Qualcomm Incorporated Collaborative vehicle headlight directing
DE102020103906B4 (en) 2020-02-14 2022-12-29 Audi Aktiengesellschaft Method and processor circuit for updating a digital road map
US11643105B2 (en) 2020-02-21 2023-05-09 Argo AI, LLC Systems and methods for generating simulation scenario definitions for an autonomous vehicle system
US11526721B1 (en) 2020-02-21 2022-12-13 Zoox, Inc. Synthetic scenario generator using distance-biased confidences for sensor data
US11429107B2 (en) 2020-02-21 2022-08-30 Argo AI, LLC Play-forward planning and control system for an autonomous vehicle
US11385642B2 (en) 2020-02-27 2022-07-12 Zoox, Inc. Perpendicular cut-in training
US11466992B2 (en) 2020-03-02 2022-10-11 Beijing Baidu Netcom Science And Technology Co., Ltd. Method, apparatus, device and medium for detecting environmental change
SE2050260A1 (en) 2020-03-09 2021-09-10 Einride Ab Method for controlling a fleet of autonomous/remotely operated vehicles
US11592299B2 (en) * 2020-03-19 2023-02-28 Mobile Industrial Robots A/S Using static scores to control vehicle operations
US11830302B2 (en) 2020-03-24 2023-11-28 Uatc, Llc Computer system for utilizing ultrasonic signals to implement operations for autonomous vehicles
US20230095384A1 (en) * 2020-03-25 2023-03-30 Intel Corporation Dynamic contextual road occupancy map perception for vulnerable road user safety in intelligent transportation systems
US11822333B2 (en) 2020-03-30 2023-11-21 Zebra Technologies Corporation Method, system and apparatus for data capture illumination control
US11340081B2 (en) * 2020-04-08 2022-05-24 Pony Ai Inc. System and method for updating map
JP7251513B2 (en) * 2020-04-08 2023-04-04 トヨタ自動車株式会社 Automatic valet parking system and service provision method
US11472442B2 (en) 2020-04-23 2022-10-18 Zoox, Inc. Map consistency checker
US11628850B2 (en) * 2020-05-05 2023-04-18 Zoox, Inc. System for generating generalized simulation scenarios
US11915487B2 (en) * 2020-05-05 2024-02-27 Toyota Research Institute, Inc. System and method for self-supervised depth and ego-motion overfitting
CN111649739B (en) * 2020-06-02 2023-09-01 阿波罗智能技术(北京)有限公司 Positioning method and device, automatic driving vehicle, electronic equipment and storage medium
US11402840B2 (en) * 2020-06-30 2022-08-02 Woven Planet North America, Inc. Independent trajectory validation system for vehicles
US11733696B2 (en) 2020-07-17 2023-08-22 Waymo Llc Detecting loops for autonomous vehicles
US11450024B2 (en) 2020-07-17 2022-09-20 Zebra Technologies Corporation Mixed depth object detection
US11718320B1 (en) 2020-08-21 2023-08-08 Aurora Operations, Inc. Using transmission sensor(s) in localization of an autonomous vehicle
EP4185990A1 (en) * 2020-08-21 2023-05-31 Waymo Llc Object-centric three-dimensional auto labeling of point cloud data
US11687094B2 (en) 2020-08-27 2023-06-27 Here Global B.V. Method, apparatus, and computer program product for organizing autonomous vehicles in an autonomous transition region
US11691643B2 (en) 2020-08-27 2023-07-04 Here Global B.V. Method and apparatus to improve interaction models and user experience for autonomous driving in transition regions
US11713979B2 (en) * 2020-08-27 2023-08-01 Here Global B.V. Method, apparatus, and computer program product for generating a transition variability index related to autonomous driving
US20220067768A1 (en) * 2020-08-28 2022-03-03 Telenav, Inc. Navigation system with high definition mapping mechanism and method of operation thereof
US11417110B2 (en) 2020-09-09 2022-08-16 Waymo Llc Annotated surfel maps
JP7310764B2 (en) 2020-09-11 2023-07-19 トヨタ自動車株式会社 Vehicle allocation system, vehicle allocation server, and vehicle allocation method
US11561552B2 (en) * 2020-09-15 2023-01-24 Waymo Llc Detecting environment changes using surfel data
US11731661B2 (en) 2020-10-01 2023-08-22 Argo AI, LLC Systems and methods for imminent collision avoidance
US11618444B2 (en) 2020-10-01 2023-04-04 Argo AI, LLC Methods and systems for autonomous vehicle inference of routes for actors exhibiting unrecognized behavior
US11358598B2 (en) 2020-10-01 2022-06-14 Argo AI, LLC Methods and systems for performing outlet inference by an autonomous vehicle to determine feasible paths through an intersection
US11648959B2 (en) 2020-10-20 2023-05-16 Argo AI, LLC In-vehicle operation of simulation scenarios during autonomous vehicle runs
US11593915B2 (en) 2020-10-21 2023-02-28 Zebra Technologies Corporation Parallax-tolerant panoramic image generation
USD981819S1 (en) 2020-10-30 2023-03-28 Zoox, Inc. Vehicle with a door opening lever
US11619497B2 (en) 2020-10-30 2023-04-04 Pony Ai Inc. Autonomous vehicle navigation using coalescing constraints for static map data
US11392891B2 (en) 2020-11-03 2022-07-19 Zebra Technologies Corporation Item placement detection and optimization in material handling systems
US11847832B2 (en) 2020-11-11 2023-12-19 Zebra Technologies Corporation Object classification for autonomous navigation systems
US11868137B2 (en) * 2020-11-12 2024-01-09 Honda Motor Co., Ltd. Systems and methods for path planning with latent state inference and graphical relationships
US11932280B2 (en) * 2020-11-16 2024-03-19 Ford Global Technologies, Llc Situation handling and learning for an autonomous vehicle control system
USD971255S1 (en) * 2020-11-20 2022-11-29 Zoox, Inc. Display screen or portion thereof having a graphical user interface
US11618320B1 (en) 2020-11-20 2023-04-04 Zoox, Inc. Multi-passenger interaction
USD971244S1 (en) * 2020-11-20 2022-11-29 Zoox, Inc. Display screen or portion thereof having an animated graphical user interface
CN116802699A (en) * 2020-11-26 2023-09-22 哲内提 Enhanced path planning for automotive applications
EP4252146A1 (en) * 2020-11-26 2023-10-04 Zenuity AB Augmented capabilities for automotive applications
US11958501B1 (en) 2020-12-07 2024-04-16 Zoox, Inc. Performance-based metrics for evaluating system quality
US11814042B1 (en) 2020-12-09 2023-11-14 Zoox, Inc. Reducing hydraulic fluid pressure based on predicted collision
US11577718B1 (en) 2020-12-09 2023-02-14 Zoox, Inc. Adjusting vehicle ride height based on predicted collision
US20220188581A1 (en) * 2020-12-10 2022-06-16 Toyota Research Institute, Inc. Learning-based online mapping
US11859994B1 (en) 2021-02-18 2024-01-02 Aurora Innovation, Inc. Landmark-based localization methods and architectures for an autonomous vehicle
US12004118B2 (en) 2021-03-01 2024-06-04 Toyota Motor North America, Inc. Detection of aberration on transport
US11828608B1 (en) * 2021-04-02 2023-11-28 Joseph E. Conroy Controlling vehicles in a complex ecosystem
US11709260B2 (en) * 2021-04-30 2023-07-25 Zoox, Inc. Data driven resolution function derivation
US11954882B2 (en) 2021-06-17 2024-04-09 Zebra Technologies Corporation Feature-based georegistration for mobile computing devices
PL4109194T3 (en) * 2021-06-25 2024-05-13 Siemens Aktiengesellschaft Sensor data generation for controlling an autonomous vehicle
US20230025579A1 (en) * 2021-07-26 2023-01-26 Cyngn, Inc. High-definition mapping
US20230035780A1 (en) * 2021-07-29 2023-02-02 Zoox, Inc. Systematic fault detection in vehicle control systems
WO2023048615A1 (en) * 2021-09-23 2023-03-30 Epiroc Rock Drills Aktiebolag Method and control node for assisted vehicle control in a mining environment
US11556403B1 (en) 2021-10-19 2023-01-17 Bank Of America Corporation System and method for an application programming interface (API) service modification
US11675362B1 (en) * 2021-12-17 2023-06-13 Motional Ad Llc Methods and systems for agent prioritization
US20230260404A1 (en) * 2022-02-15 2023-08-17 International Business Machines Corporation Multi-vehicle collaboration with group visualization
FR3132962A1 (en) * 2022-02-22 2023-08-25 Ez-Wheel electric motor control system with integrated safety and navigation control
US11543264B1 (en) 2022-02-23 2023-01-03 Plusai, Inc. Methods and apparatus for navigating an autonomous vehicle based on a map updated in regions
US12012108B1 (en) * 2022-03-31 2024-06-18 Zoox, Inc. Prediction models in autonomous vehicles using modified map data
US12008681B2 (en) * 2022-04-07 2024-06-11 Gm Technology Operations Llc Systems and methods for testing vehicle systems
DE102022110655A1 (en) 2022-05-02 2023-11-02 Valeo Schalter Und Sensoren Gmbh Method for executing a teleoperated driving function of an at least partially autonomously operated motor vehicle, computer program product, computer-readable storage medium and an assistance system
DE102022111181A1 (en) * 2022-05-05 2023-11-09 Bayerische Motoren Werke Aktiengesellschaft METHOD AND DEVICE FOR ASSESSING THE QUALITY OF AN AUTOMATED FUNCTION OF A MOTOR VEHICLE
DE102022115478A1 (en) 2022-06-22 2023-12-28 Valeo Schalter Und Sensoren Gmbh Method for the teleoperated operation of an at least partially autonomously operated motor vehicle by means of a teleoperation device, computer program product, computer-readable storage medium and teleoperation device
CN114995465B (en) * 2022-08-02 2022-11-15 北京理工大学 Multi-unmanned vehicle motion planning method and system considering vehicle motion capability
WO2024039888A1 (en) * 2022-08-18 2024-02-22 Faction Technology, Inc. Cooperative teleoperation
US11938963B1 (en) * 2022-12-28 2024-03-26 Aurora Operations, Inc. Remote live map system for autonomous vehicles

Family Cites Families (267)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5220507A (en) * 1990-11-08 1993-06-15 Motorola, Inc. Land vehicle multiple navigation route apparatus
US10361802B1 (en) 1999-02-01 2019-07-23 Blanding Hovenweep, Llc Adaptive pattern recognition based control system and method
US5558370A (en) 1995-03-30 1996-09-24 Automotive Systems Laboratory, Inc. Electronic seat belt tensioning system
IL117792A (en) * 1995-05-08 2003-10-31 Rafael Armament Dev Authority Autonomous command and control unit for mobile platform
US10573093B2 (en) 1995-06-07 2020-02-25 Automotive Technologies International, Inc. Vehicle computer design and use techniques for receiving navigation software
US5646613A (en) * 1996-05-20 1997-07-08 Cho; Myungeun System for minimizing automobile collision damage
DE19750338A1 (en) 1997-11-13 1999-05-20 Siemens Ag Motor vehicle cruise control system
US7426429B2 (en) 1998-04-27 2008-09-16 Joseph A Tabe Smart seatbelt control system
US6264353B1 (en) 1998-11-23 2001-07-24 Lear Automotive Dearborn, Inc. Exterior mirror with supplement turn signal
JP3865182B2 (en) * 1998-12-25 2007-01-10 タカタ株式会社 Seat belt system
US7036128B1 (en) 1999-01-05 2006-04-25 Sri International Offices Using a community of distributed electronic agents to support a highly mobile, ambient computing environment
US6615130B2 (en) 2000-03-17 2003-09-02 Makor Issues And Rights Ltd. Real time vehicle guidance and traffic forecasting system
US6728616B1 (en) 2000-10-20 2004-04-27 Joseph A. Tabe Smart seatbelt control system
US20020131608A1 (en) 2001-03-01 2002-09-19 William Lobb Method and system for providing digitally focused sound
JP2003048481A (en) 2001-08-08 2003-02-18 Koito Mfg Co Ltd Headlight system for vehicle
EP1505571A4 (en) 2002-04-12 2007-02-21 Mitsubishi Electric Corp Car navigation system and speech recognizing device thereof
US6746049B2 (en) 2002-07-24 2004-06-08 Visteon Global Technologies, Inc. Adaptive seat belt tensioning system
US20060064202A1 (en) 2002-08-26 2006-03-23 Sony Corporation Environment identification device, environment identification method, and robot device
US6910788B2 (en) 2003-06-24 2005-06-28 Bo T. Jones LED vehicle wheel well illumination device
US20070096447A1 (en) 2003-10-07 2007-05-03 Tabe Joseph A Smart seatbelt control system
AU2004294651A1 (en) 2003-10-21 2005-06-16 Proxy Aviation Systems, Inc. Methods and apparatus for unmanned vehicle control
US7447593B2 (en) 2004-03-26 2008-11-04 Raytheon Company System and method for adaptive path planning
US8972052B2 (en) 2004-07-07 2015-03-03 Irobot Corporation Celestial navigation system for an autonomous vehicle
JP2006113836A (en) 2004-10-15 2006-04-27 Fuji Heavy Ind Ltd Road information providing system
US7499774B2 (en) 2004-10-22 2009-03-03 Irobot Corporation System and method for processing safety signals in an autonomous vehicle
KR100679042B1 (en) 2004-10-27 2007-02-06 삼성전자주식회사 Method and apparatus for speech recognition, and navigation system using for the same
JP4760715B2 (en) 2004-12-28 2011-08-31 株式会社豊田中央研究所 Vehicle motion control device
US7644799B2 (en) 2005-02-10 2010-01-12 Friedman Research Corporation Vehicle safety control system
US20060207820A1 (en) 2005-03-20 2006-09-21 Hemant Joshi Active Vehicle Shield
US7894951B2 (en) 2005-10-21 2011-02-22 Deere & Company Systems and methods for switching between autonomous and manual operation of a vehicle
US8280742B2 (en) 2005-12-16 2012-10-02 Panasonic Corporation Input device and input method for mobile body
US9459622B2 (en) 2007-01-12 2016-10-04 Legalforce, Inc. Driverless vehicle commerce network and community
US8164461B2 (en) 2005-12-30 2012-04-24 Healthsense, Inc. Monitoring task performance
US7813870B2 (en) 2006-03-03 2010-10-12 Inrix, Inc. Dynamic time series prediction of future traffic conditions
US9373149B2 (en) 2006-03-17 2016-06-21 Fatdoor, Inc. Autonomous neighborhood vehicle commerce network and community
US20090306989A1 (en) 2006-03-31 2009-12-10 Masayo Kaji Voice input support device, method thereof, program thereof, recording medium containing the program, and navigation device
JP2007290423A (en) 2006-04-21 2007-11-08 Toyota Motor Corp Seat belt device
JP2007322172A (en) 2006-05-30 2007-12-13 Nissan Motor Co Ltd Bypass proposal system and method
US20080033645A1 (en) 2006-08-03 2008-02-07 Jesse Sol Levinson Probabilistic methods for mapping and localization in arbitrary outdoor environments
US7972045B2 (en) 2006-08-11 2011-07-05 Donnelly Corporation Automatic headlamp control system
KR101841753B1 (en) 2006-08-18 2018-03-23 브룩스 오토메이션 인코퍼레이티드 Reduced capacity carrier, transport, load port, buffer system
CN102663884B (en) 2006-09-14 2014-03-05 克朗设备公司 System and method of remotely controlling materials handling vehicle
US7579942B2 (en) 2006-10-09 2009-08-25 Toyota Motor Engineering & Manufacturing North America, Inc. Extra-vehicular threat predictor
WO2008043795A1 (en) 2006-10-13 2008-04-17 Continental Teves Ag & Co. Ohg Method and apparatus for identifying concealed objects in road traffic
ES2522589T3 (en) 2007-02-08 2014-11-17 Behavioral Recognition Systems, Inc. Behavioral recognition system
JP4949063B2 (en) 2007-02-14 2012-06-06 富士重工業株式会社 Vehicle driving support device
US10096038B2 (en) 2007-05-10 2018-10-09 Allstate Insurance Company Road segment safety rating system
US20080320421A1 (en) 2007-06-20 2008-12-25 Demaris David L Feature extraction that supports progressively refined search and classification of patterns in a semiconductor layout
KR20090001403A (en) 2007-06-29 2009-01-08 엘지전자 주식회사 Telematics terminal capable of receiving broadcast and method of processing broadcast signal
US20090248587A1 (en) 2007-08-31 2009-10-01 Van Buskirk Peter C Selectively negotiated ridershare system comprising riders, drivers, and vehicles
US9513125B2 (en) 2008-01-14 2016-12-06 The Boeing Company Computing route plans for routing around obstacles having spatial and temporal dimensions
US8244469B2 (en) 2008-03-16 2012-08-14 Irobot Corporation Collaborative engagement for target identification and tracking
WO2009117582A2 (en) 2008-03-19 2009-09-24 Appleseed Networks, Inc. Method and apparatus for detecting patterns of behavior
KR100946723B1 (en) 2008-04-12 2010-03-12 재단법인서울대학교산학협력재단 Steering Method for vehicle and Apparatus thereof
US8548727B2 (en) 2008-05-02 2013-10-01 Honeywell International Inc. Cognitive aircraft hazard advisory system (CAHAS)
WO2009137582A1 (en) 2008-05-06 2009-11-12 University Of Virginia Patent Foundation System and method for minimizing occupant injury during vehicle crash events
US8392064B2 (en) 2008-05-27 2013-03-05 The Board Of Trustees Of The Leland Stanford Junior University Systems, methods and devices for adaptive steering control of automotive vehicles
JP5215740B2 (en) 2008-06-09 2013-06-19 株式会社日立製作所 Mobile robot system
US7948439B2 (en) 2008-06-20 2011-05-24 Honeywell International Inc. Tracking of autonomous systems
US8686095B2 (en) 2008-06-30 2014-04-01 Basf Corporation Process for continuous production of epoxy resins
US8989972B2 (en) 2008-09-11 2015-03-24 Deere & Company Leader-follower fully-autonomous vehicle with operator on side
US8126642B2 (en) 2008-10-24 2012-02-28 Gray & Company, Inc. Control and systems for autonomously driven vehicles
JP5251496B2 (en) 2008-12-25 2013-07-31 アイシン・エィ・ダブリュ株式会社 Hydraulic control device for automatic transmission
US9449142B2 (en) 2009-03-09 2016-09-20 Massachusetts Institute Of Technology System and method for modeling supervisory control of heterogeneous unmanned vehicles through discrete event simulation
US8352111B2 (en) 2009-04-06 2013-01-08 GM Global Technology Operations LLC Platoon vehicle management
JP5210233B2 (en) 2009-04-14 2013-06-12 日立オートモティブシステムズ株式会社 Vehicle external recognition device and vehicle system using the same
US20100286845A1 (en) 2009-05-11 2010-11-11 Andrew Karl Wilhelm Rekow Fail-safe system for autonomous vehicle
EP2280241A3 (en) * 2009-07-30 2017-08-23 QinetiQ Limited Vehicle control
KR101597289B1 (en) 2009-07-31 2016-03-08 삼성전자주식회사 Apparatus for recognizing speech according to dynamic picture and method thereof
US20110098922A1 (en) 2009-10-27 2011-04-28 Visteon Global Technologies, Inc. Path Predictive System And Method For Vehicles
US20110106615A1 (en) 2009-11-03 2011-05-05 Yahoo! Inc. Multimode online advertisements and online advertisement exchanges
US9230292B2 (en) 2012-11-08 2016-01-05 Uber Technologies, Inc. Providing on-demand services through use of portable computing devices
US20130246301A1 (en) 2009-12-04 2013-09-19 Uber Technologies, Inc. Providing user feedback for transport services through use of mobile devices
US8559673B2 (en) 2010-01-22 2013-10-15 Google Inc. Traffic signal mapping and detection
US20110190972A1 (en) 2010-02-02 2011-08-04 Gm Global Technology Operations, Inc. Grid unlock
WO2011098848A1 (en) * 2010-02-11 2011-08-18 John Gano A method of bidirectional automotive transport
EP2549456B1 (en) 2010-03-16 2020-05-06 Toyota Jidosha Kabushiki Kaisha Driving assistance device
EP2550191B2 (en) * 2010-03-26 2022-01-26 Siemens Mobility S.A.S. Method and system for management of special events on a trip of a guided vehicle
US8031085B1 (en) 2010-04-15 2011-10-04 Deere & Company Context-based sound generation
US8260482B1 (en) 2010-04-28 2012-09-04 Google Inc. User interface for displaying internal state of autonomous driving system
JP2011248855A (en) 2010-04-30 2011-12-08 Denso Corp Vehicle collision warning apparatus
US9014848B2 (en) * 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
GB201009523D0 (en) 2010-06-07 2010-07-21 Capoco Design Ltd An autonomous vehicle
RU103114U1 (en) * 2010-09-29 2011-03-27 Закрытое акционерное общество ЗАО "ВНИИстройдормаш" AUTOMATED REVERSE PROFILE
US8509982B2 (en) 2010-10-05 2013-08-13 Google Inc. Zone driving
US9513630B2 (en) 2010-11-17 2016-12-06 General Electric Company Methods and systems for data communications
US20140358427A1 (en) 2010-12-13 2014-12-04 Google Inc. Enhancing driving navigation via passive drivers feedback
AU2012223032B2 (en) 2011-02-28 2015-10-29 Bae Systems Australia Limited Control computer for an unmanned vehicle
EP2681512B1 (en) 2011-03-03 2021-01-13 Verizon Patent and Licensing Inc. Vehicle route calculation
JP5670246B2 (en) 2011-04-07 2015-02-18 本田技研工業株式会社 Lower body structure
US9171268B1 (en) 2011-04-22 2015-10-27 Angel A. Penilla Methods and systems for setting and transferring user profiles to vehicles and temporary sharing of user profiles to shared-use vehicles
US20120310465A1 (en) 2011-06-02 2012-12-06 Harman International Industries, Incorporated Vehicle navigation system
DE102011104925A1 (en) 2011-06-18 2012-12-20 Daimler Ag Motor vehicle with a driver assistance unit
CN103814394A (en) 2011-08-16 2014-05-21 佳境有限公司 Estimation and management of loads in electric vehicle networks
US8583361B2 (en) 2011-08-24 2013-11-12 Modular Mining Systems, Inc. Guided maneuvering of a mining vehicle to a target destination
IL214867A0 (en) 2011-08-29 2012-01-31 Elta Systems Ltd Moving cellular communication system
DE102011112577A1 (en) 2011-09-08 2013-03-14 Continental Teves Ag & Co. Ohg Method and device for an assistance system in a vehicle for carrying out an autonomous or semi-autonomous driving maneuver
CA2849739C (en) 2011-09-22 2018-09-04 Aethon, Inc. Monitoring, diagnostic and tracking tool for autonomous mobile robots
KR101552074B1 (en) 2011-10-03 2015-09-09 도요타 지도샤(주) Vehicle driving support system
US20160189544A1 (en) * 2011-11-16 2016-06-30 Autoconnect Holdings Llc Method and system for vehicle data collection regarding traffic
FR2984254B1 (en) 2011-12-16 2016-07-01 Renault Sa CONTROL OF AUTONOMOUS VEHICLES
US20140309872A1 (en) 2013-04-15 2014-10-16 Flextronics Ap, Llc Customization of vehicle user interfaces based on user intelligence
US8880272B1 (en) 2012-03-16 2014-11-04 Google Inc. Approach for estimating the geometry of roads and lanes by using vehicle trajectories
EP2645196B1 (en) 2012-03-30 2018-12-12 The Boeing Company Network of unmanned vehicles
US20130268138A1 (en) 2012-04-05 2013-10-10 Caterpillar Inc. High Availability For Autonomous Machine Control System
US9921069B2 (en) 2012-04-05 2018-03-20 Hitachi, Ltd. Map data creation device, autonomous movement system and autonomous movement control device
US9495874B1 (en) 2012-04-13 2016-11-15 Google Inc. Automated system and method for modeling the behavior of vehicles and other agents
WO2014021961A2 (en) * 2012-05-03 2014-02-06 Lockheed Martin Corporation Systems and methods for vehicle survivability planning
US20130304514A1 (en) 2012-05-08 2013-11-14 Elwha Llc Systems and methods for insurance based on monitored characteristics of an autonomous drive mode selection system
US10089537B2 (en) 2012-05-18 2018-10-02 Magna Electronics Inc. Vehicle vision system with front and rear camera integration
US8768565B2 (en) 2012-05-23 2014-07-01 Enterprise Holdings, Inc. Rental/car-share vehicle access and management system and method
JP5987259B2 (en) 2012-06-19 2016-09-07 株式会社リコー Automated driving navigation system
US9255989B2 (en) 2012-07-24 2016-02-09 Toyota Motor Engineering & Manufacturing North America, Inc. Tracking on-road vehicles with sensors of different modalities
US8849515B2 (en) * 2012-07-24 2014-09-30 GM Global Technology Operations LLC Steering assist in driver initiated collision avoidance maneuver
US9686306B2 (en) 2012-11-02 2017-06-20 University Of Washington Through Its Center For Commercialization Using supplemental encrypted signals to mitigate man-in-the-middle attacks on teleoperated systems
US9671233B2 (en) 2012-11-08 2017-06-06 Uber Technologies, Inc. Dynamically providing position information of a transit object to a computing device
USD743978S1 (en) 2012-11-08 2015-11-24 Uber Technologies, Inc. Display screen of a computing device with a computer-generated electronic panel for providing confirmation for a service request
US20140129302A1 (en) 2012-11-08 2014-05-08 Uber Technologies, Inc. Providing a confirmation interface for on-demand services through use of portable computing devices
DE102012022392B4 (en) * 2012-11-15 2016-02-04 Audi Ag Method and device for controlling a safety belt connected to a seatbelt device of a vehicle with a predictive collision detection unit
US8914225B2 (en) 2012-12-04 2014-12-16 International Business Machines Corporation Managing vehicles on a road network
FR3000005B1 (en) 2012-12-21 2015-10-09 Valeo Securite Habitacle REMOTE CONTROL BOX OF A PARKING MANEUVER CONTROL SYSTEM OF A VEHICLE, AND ASSOCIATED METHOD
US20140188347A1 (en) 2012-12-31 2014-07-03 Joseph Akwo Tabe Smart supplemental restraint and occupant classification system
US9367065B2 (en) 2013-01-25 2016-06-14 Google Inc. Modifying behavior of autonomous vehicles based on sensor blind spots and limitations
US8948993B2 (en) 2013-03-08 2015-02-03 Richard Schulman Method and system for controlling the behavior of an occupant of a vehicle
US9613534B2 (en) 2013-03-11 2017-04-04 Rockwell Collins, Inc. Systems and methods for creating a network cloud based system for supporting regional, national and international unmanned aircraft systems
US9747898B2 (en) 2013-03-15 2017-08-29 Honda Motor Co., Ltd. Interpretation of ambiguous vehicle instructions
CN105074793A (en) 2013-03-15 2015-11-18 凯利普公司 Lane-level vehicle navigation for vehicle routing and traffic management
US8996224B1 (en) * 2013-03-15 2015-03-31 Google Inc. Detecting that an autonomous vehicle is in a stuck condition
US9395727B1 (en) 2013-03-22 2016-07-19 Google Inc. Single layer shared aperture beam forming network
US9342074B2 (en) 2013-04-05 2016-05-17 Google Inc. Systems and methods for transitioning control of an autonomous vehicle to a driver
US9141107B2 (en) 2013-04-10 2015-09-22 Google Inc. Mapping active and inactive construction zones for autonomous driving
US9632210B2 (en) 2013-05-07 2017-04-25 Google Inc. Methods and systems for detecting weather conditions using vehicle onboard sensors
US9025140B2 (en) 2013-05-07 2015-05-05 Google Inc. Methods and systems for detecting weather conditions including sunlight using vehicle onboard sensors
WO2014172316A1 (en) 2013-04-15 2014-10-23 Flextronics Ap, Llc Building profiles associated with vehicle users
US8977007B1 (en) 2013-04-23 2015-03-10 Google Inc. Detecting a vehicle signal through image differencing and filtering
US9411780B1 (en) 2013-06-03 2016-08-09 Amazon Technologies, Inc. Employing device sensor data to determine user characteristics
US9384443B2 (en) 2013-06-14 2016-07-05 Brain Corporation Robotic training apparatus and methods
JP6166107B2 (en) 2013-06-17 2017-07-19 株式会社Ihiエアロスペース Unmanned mobile remote control system and unmanned mobile
US9412173B2 (en) 2013-06-21 2016-08-09 National University Of Ireland, Maynooth Method for mapping an environment
US9760698B2 (en) 2013-09-17 2017-09-12 Toyota Motor Sales, U.S.A., Inc. Integrated wearable article for interactive vehicle control system
US9579789B2 (en) 2013-09-27 2017-02-28 Brain Corporation Apparatus and methods for training of robotic control arbitration
JP6257318B2 (en) 2013-09-30 2018-01-10 株式会社日本総合研究所 Mobile object in automatic driving traffic system, cooperative vehicle allocation apparatus for mobile object, and vehicle allocation method thereof
US9425654B2 (en) 2013-09-30 2016-08-23 Google Inc. Contactless electrical coupling for a rotatable LIDAR device
EP3056394B1 (en) 2013-10-08 2022-11-30 ICTK Holdings Co., Ltd. Vehicle security network device and design method therefor
US9528834B2 (en) 2013-11-01 2016-12-27 Intelligent Technologies International, Inc. Mapping techniques using probe vehicles
US20150127224A1 (en) 2013-11-02 2015-05-07 Joseph Akwo Tabe Advanced weight responsive supplemental restraint and occupant classification system
US20150268665A1 (en) 2013-11-07 2015-09-24 Google Inc. Vehicle communication using audible signals
US9212926B2 (en) 2013-11-22 2015-12-15 Ford Global Technologies, Llc In-vehicle path verification
US9984348B2 (en) 2013-11-29 2018-05-29 Fedex Corporate Services, Inc. Context management of a wireless node network
US9200910B2 (en) 2013-12-11 2015-12-01 Here Global B.V. Ranking of path segments based on incident probability
US9002634B1 (en) 2013-12-12 2015-04-07 Verizon Patent And Licensing Inc. Navigation service in support of mobile communication sessions
US10332405B2 (en) 2013-12-19 2019-06-25 The United States Of America As Represented By The Administrator Of Nasa Unmanned aircraft systems traffic management
WO2015099679A1 (en) 2013-12-23 2015-07-02 Intel Corporation In-vehicle authorization for autonomous vehicles
US9915950B2 (en) 2013-12-31 2018-03-13 Polysync Technologies, Inc. Autonomous vehicle interface system
US20150202770A1 (en) 2014-01-17 2015-07-23 Anthony Patron Sidewalk messaging of an autonomous robot
US20150203107A1 (en) 2014-01-17 2015-07-23 Ford Global Technologies, Llc Autonomous vehicle precipitation detection
US9550419B2 (en) 2014-01-21 2017-01-24 Honda Motor Co., Ltd. System and method for providing an augmented reality vehicle interface
US9984574B2 (en) 2014-01-21 2018-05-29 Tribal Rides, Inc. Method and system for anticipatory deployment of autonomously controlled vehicles
US9656667B2 (en) 2014-01-29 2017-05-23 Continental Automotive Systems, Inc. Method for minimizing automatic braking intrusion based on collision confidence
US9567077B2 (en) 2014-02-14 2017-02-14 Accenture Global Services Limited Unmanned vehicle (UV) control system
EP2908202B1 (en) 2014-02-14 2019-03-27 Accenture Global Services Limited Unmanned vehicle (UV) control system
US9201426B1 (en) 2014-02-19 2015-12-01 Google Inc. Reverse iteration of planning data for system control
US9465388B1 (en) 2014-03-03 2016-10-11 Google Inc. Remote assistance for an autonomous vehicle in low confidence situations
US9720410B2 (en) 2014-03-03 2017-08-01 Waymo Llc Remote assistance for autonomous vehicles in predetermined situations
US10692370B2 (en) 2014-03-03 2020-06-23 Inrix, Inc. Traffic obstruction detection
US9547989B2 (en) 2014-03-04 2017-01-17 Google Inc. Reporting road event data and sharing with other vehicles
JP6137001B2 (en) 2014-03-14 2017-05-31 株式会社デンソー In-vehicle device
US9960986B2 (en) 2014-03-19 2018-05-01 Uber Technologies, Inc. Providing notifications to devices based on real-time conditions related to an on-demand service
JP5877574B1 (en) 2014-04-01 2016-03-08 みこらった株式会社 Automotive and automotive programs
DE102014213171A1 (en) * 2014-04-09 2015-10-15 Continental Automotive Gmbh System for autonomous vehicle guidance and motor vehicle
US20150292894A1 (en) 2014-04-11 2015-10-15 Telecommunication Systems, Inc. Travel route
US10223479B1 (en) 2014-05-20 2019-03-05 State Farm Mutual Automobile Insurance Company Autonomous vehicle operation feature evaluation
US9613274B2 (en) 2014-05-22 2017-04-04 International Business Machines Corporation Identifying an obstacle in a route
US20150338226A1 (en) 2014-05-22 2015-11-26 Telogis, Inc. Context-based routing and access path selection
US9475422B2 (en) 2014-05-22 2016-10-25 Applied Invention, Llc Communication between autonomous vehicle and external observers
US9436182B2 (en) 2014-05-23 2016-09-06 Google Inc. Autonomous vehicles
US9631933B1 (en) 2014-05-23 2017-04-25 Google Inc. Specifying unavailable locations for autonomous vehicles
US9393922B2 (en) 2014-05-23 2016-07-19 Google Inc. Devices and methods for an energy-absorbing end of a vehicle
US10124800B2 (en) 2014-05-30 2018-11-13 The Boeing Company Variably controlled ground vehicle
US10424036B2 (en) 2014-06-02 2019-09-24 Uber Technologies, Inc. Maintaining data for use with a transport service during connectivity loss between systems
US9235775B2 (en) 2014-06-08 2016-01-12 Uber Technologies, Inc. Entrance detection from street-level imagery
US9494937B2 (en) 2014-06-20 2016-11-15 Verizon Telematics Inc. Method and system for drone deliveries to vehicles in route
US9858922B2 (en) 2014-06-23 2018-01-02 Google Inc. Caching speech recognition scores
US10474470B2 (en) 2014-06-24 2019-11-12 The Boeing Company Techniques deployment system
EP2962903A1 (en) 2014-07-04 2016-01-06 Fujitsu Limited Configurable rental vehicle
US9365218B2 (en) 2014-07-14 2016-06-14 Ford Global Technologies, Llc Selectable autonomous driving modes
US9283678B2 (en) 2014-07-16 2016-03-15 Google Inc. Virtual safety cages for robotic devices
US10127927B2 (en) 2014-07-28 2018-11-13 Sony Interactive Entertainment Inc. Emotional speech processing
KR20160015987A (en) 2014-08-01 2016-02-15 한국전자통신연구원 Remote Autonomous Driving System based on the High Accuracy of Localization by indoor Infrastructure's Map and Sensor and Method thereof
JP6181300B2 (en) * 2014-09-05 2017-08-16 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd System for controlling the speed of unmanned aerial vehicles
US9459620B1 (en) 2014-09-29 2016-10-04 Amazon Technologies, Inc. Human interaction with unmanned aerial vehicles
US9381949B2 (en) 2014-10-15 2016-07-05 Toyota Motor Engineering & Manufacturing North America, Inc. Vehicles having a cross-vehicle stabilizing structure
DE102014224108A1 (en) 2014-11-26 2016-06-02 Robert Bosch Gmbh Method and device for operating a vehicle
USPP28774P3 (en) 2014-11-27 2017-12-19 Agro Selections Fruits Peach tree named ‘CRISPONDA’
US9371093B1 (en) 2014-12-03 2016-06-21 Toyota Motor Engineering & Manufacturing North America, Inc. Vehicles having upper side member reinforcement portions
JP6277948B2 (en) 2014-12-05 2018-02-14 マツダ株式会社 Lower body structure of automobile
USPP28706P3 (en) 2014-12-08 2017-11-28 Syngenta Participations Ag Dahlia plant named ‘DAHZ0001’
US10074224B2 (en) 2015-04-20 2018-09-11 Gate Labs Inc. Access management system
US10345775B2 (en) 2015-01-28 2019-07-09 Brian Westcott Methods and systems for infrastructure performance: monitoring, control, operations, analysis and adaptive learning
US9151628B1 (en) 2015-01-30 2015-10-06 Nissan North America, Inc. Associating parking areas with destinations
CA3067160A1 (en) 2015-02-10 2016-08-18 Mobileye Vision Technologies Ltd. Sparse map for autonomous vehicle navigation
US20160247106A1 (en) 2015-02-24 2016-08-25 Siemens Aktiengesellschaft Managing a fleet of autonomous electric vehicles for on-demand transportation and ancillary services to electrical grid
GB2535718A (en) 2015-02-24 2016-08-31 Addison Lee Ltd Resource management
WO2016145379A1 (en) 2015-03-12 2016-09-15 William Marsh Rice University Automated Compilation of Probabilistic Task Description into Executable Neural Network Specification
US9667710B2 (en) 2015-04-20 2017-05-30 Agverdict, Inc. Systems and methods for cloud-based agricultural data processing and management
US10345809B2 (en) 2015-05-13 2019-07-09 Uber Technologies, Inc. Providing remote assistance to an autonomous vehicle
US9494439B1 (en) 2015-05-13 2016-11-15 Uber Technologies, Inc. Autonomous vehicle operated with guide assistance of human driven vehicles
US9547309B2 (en) 2015-05-13 2017-01-17 Uber Technologies, Inc. Selecting vehicle type for providing transport
US9690290B2 (en) 2015-06-04 2017-06-27 Toyota Motor Engineering & Manufacturing North America, Inc. Situation-based transfer of vehicle sensor data during remote operation of autonomous vehicles
US20180348023A1 (en) 2015-06-09 2018-12-06 Google Llc Sensor Calibration Based On Environmental Factors
US9904900B2 (en) 2015-06-11 2018-02-27 Bao Tran Systems and methods for on-demand transportation
KR20170010645A (en) 2015-07-20 2017-02-01 엘지전자 주식회사 Autonomous vehicle and autonomous vehicle system including the same
US10623162B2 (en) 2015-07-23 2020-04-14 Centurylink Intellectual Property Llc Customer based internet of things (IoT)
US9805605B2 (en) 2015-08-12 2017-10-31 Madhusoodhan Ramanujam Using autonomous vehicles in a taxi service
US10023231B2 (en) 2015-08-12 2018-07-17 Madhusoodhan Ramanujam Parking autonomous vehicles
US10220705B2 (en) 2015-08-12 2019-03-05 Madhusoodhan Ramanujam Sharing autonomous vehicles
KR101895485B1 (en) 2015-08-26 2018-09-05 엘지전자 주식회사 Drive assistance appratus and method for controlling the same
US10540891B2 (en) 2015-08-27 2020-01-21 Nec Corporation Traffic-congestion prevention system, traffic-congestion prevention method, and recording medium
US10139237B2 (en) 2015-09-01 2018-11-27 Chris Outwater Method for remotely identifying one of a passenger and an assigned vehicle to the other
US10150448B2 (en) 2015-09-18 2018-12-11 Ford Global Technologies, Llc Autonomous vehicle unauthorized passenger or object detection
WO2017053046A1 (en) 2015-09-21 2017-03-30 Continental Intelligent Transportation Systems, LLC On-demand and on-site vehicle maintenance service
US10139828B2 (en) 2015-09-24 2018-11-27 Uber Technologies, Inc. Autonomous vehicle operated with safety augmentation
US9714089B1 (en) 2015-09-25 2017-07-25 Amazon Technologies, Inc. Trigger agents in video streams from drones
US9830757B2 (en) 2015-09-30 2017-11-28 Faraday & Future Inc. System and method for operating vehicle using mobile device
US10334050B2 (en) 2015-11-04 2019-06-25 Zoox, Inc. Software application and logic to modify configuration of an autonomous vehicle
WO2017079341A2 (en) 2015-11-04 2017-05-11 Zoox, Inc. Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
US9606539B1 (en) 2015-11-04 2017-03-28 Zoox, Inc. Autonomous vehicle fleet service and system
US9507346B1 (en) 2015-11-04 2016-11-29 Zoox, Inc. Teleoperation system and method for trajectory modification of autonomous vehicles
US20190227553A1 (en) 2015-11-04 2019-07-25 Zoox, Inc. Interactive autonomous vehicle command controller
US9734455B2 (en) 2015-11-04 2017-08-15 Zoox, Inc. Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
US11283877B2 (en) 2015-11-04 2022-03-22 Zoox, Inc. Software application and logic to modify configuration of an autonomous vehicle
US9754490B2 (en) 2015-11-04 2017-09-05 Zoox, Inc. Software application to request and control an autonomous vehicle service
US9494940B1 (en) 2015-11-04 2016-11-15 Zoox, Inc. Quadrant configuration of robotic vehicles
US9958864B2 (en) * 2015-11-04 2018-05-01 Zoox, Inc. Coordination of dispatching and maintaining fleet of autonomous vehicles
US10401852B2 (en) 2015-11-04 2019-09-03 Zoox, Inc. Teleoperation system and method for trajectory modification of autonomous vehicles
US9632502B1 (en) 2015-11-04 2017-04-25 Zoox, Inc. Machine-learning systems and techniques to optimize teleoperation and/or planner decisions
US10248119B2 (en) 2015-11-04 2019-04-02 Zoox, Inc. Interactive autonomous vehicle command controller
US9716565B2 (en) 2015-11-12 2017-07-25 Ben Mandeville-Clarke System, apparatus, and method for generating and transmitting an interruption signal to a substantially autonomous vehicle
US9963012B2 (en) 2015-12-07 2018-05-08 GM Global Technology Operations LLC Personalizing vehicular comfort settings for a specific user
US10036642B2 (en) 2015-12-08 2018-07-31 Uber Technologies, Inc. Automated vehicle communications system
US10887155B2 (en) 2015-12-30 2021-01-05 Sony Corporation System and method for a unified connected network
US11049391B2 (en) 2016-01-03 2021-06-29 Yosef Mintz System and methods to apply robust predictive traffic load balancing control and robust cooperative safe driving for smart cities
US9940834B1 (en) 2016-01-22 2018-04-10 State Farm Mutual Automobile Insurance Company Autonomous vehicle application
US9813541B2 (en) 2016-02-29 2017-11-07 Ford Global Technologies, Llc Mobile device control for powered door
US9836057B2 (en) 2016-03-24 2017-12-05 Waymo Llc Arranging passenger pickups for autonomous vehicles
US9989645B2 (en) 2016-04-01 2018-06-05 Uber Technologies, Inc. Utilizing accelerometer data to configure an autonomous vehicle for a user
US10012990B2 (en) 2016-04-01 2018-07-03 Uber Technologies, Inc. Optimizing timing for configuring an autonomous vehicle
US10255648B2 (en) 2016-04-14 2019-04-09 Eric John Wengreen Self-driving vehicle systems and methods
EP3232285B1 (en) 2016-04-14 2019-12-18 Volvo Car Corporation Method and arrangement for monitoring and adapting the performance of a fusion system of an autonomous vehicle
JP6569596B2 (en) 2016-05-20 2019-09-04 トヨタ自動車株式会社 vehicle
US9928434B1 (en) 2016-06-14 2018-03-27 State Farm Mutual Automobile Insurance Company Apparatuses, systems, and methods for determining when a vehicle occupant is using a mobile telephone
US20180017399A1 (en) 2016-07-15 2018-01-18 Robert C. Rolnik Safety charging for computer vehicle
US9956910B2 (en) 2016-07-18 2018-05-01 Toyota Motor Engineering & Manufacturing North America, Inc. Audible notification systems and methods for autonomous vehicles
US10095229B2 (en) 2016-09-13 2018-10-09 Ford Global Technologies, Llc Passenger tracking systems and methods
US20180075565A1 (en) 2016-09-13 2018-03-15 Ford Global Technologies, Llc Passenger validation systems and methods
US9963106B1 (en) 2016-11-07 2018-05-08 Nio Usa, Inc. Method and system for authentication in autonomous vehicles
US10515390B2 (en) 2016-11-21 2019-12-24 Nio Usa, Inc. Method and system for data optimization
US10214219B2 (en) 2017-01-10 2019-02-26 Ford Global Technologies, Llc Methods and systems for powertrain NVH control in a vehicle
US9953538B1 (en) 2017-01-17 2018-04-24 Lyft, Inc. Autonomous vehicle notification system
US10147325B1 (en) 2017-02-02 2018-12-04 Wells Fargo Bank, N.A. Customization of sharing of rides
US10053088B1 (en) 2017-02-21 2018-08-21 Zoox, Inc. Occupant aware braking system
US10415983B2 (en) 2017-03-21 2019-09-17 Sony Corporation System and method for automatic passenger sharing among vehicles
US10303961B1 (en) 2017-04-13 2019-05-28 Zoox, Inc. Object detection and passenger notification
US10459444B1 (en) 2017-11-03 2019-10-29 Zoox, Inc. Autonomous vehicle fleet model training and testing
US11176426B2 (en) 2018-06-18 2021-11-16 Zoox, Inc. Sensor obstruction detection and mitigation using vibration and/or heat
US10647250B1 (en) 2019-03-08 2020-05-12 Pony Ai Inc. Directed acoustic alert notification from autonomous vehicles
US10919497B1 (en) 2019-10-09 2021-02-16 Ford Global Technologies, Llc Systems and methods for starting a vehicle using a secure password entry system

Cited By (80)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10338594B2 (en) 2017-03-13 2019-07-02 Nio Usa, Inc. Navigation of autonomous vehicles to enhance safety under one or more fault conditions
US10423162B2 (en) 2017-05-08 2019-09-24 Nio Usa, Inc. Autonomous vehicle logic to identify permissioned parking relative to multiple classes of restricted parking
EP3635625A4 (en) * 2017-06-06 2021-02-17 PlusAI Corp Method and system for close loop perception in autonomous driving vehicles
WO2018224873A1 (en) 2017-06-06 2018-12-13 PlusAI Corp Method and system for close loop perception in autonomous driving vehicles
US11392133B2 (en) 2017-06-06 2022-07-19 Plusai, Inc. Method and system for object centric stereo in autonomous driving vehicles
US11573573B2 (en) 2017-06-06 2023-02-07 Plusai, Inc. Method and system for distributed learning and adaptation in autonomous driving vehicles
CN110753892A (en) * 2017-06-06 2020-02-04 智加科技公司 Method and system for instant object tagging via cross-modality verification in autonomous vehicles
CN110785774A (en) * 2017-06-06 2020-02-11 智加科技公司 Method and system for closed loop sensing in autonomous vehicles
US11550334B2 (en) 2017-06-06 2023-01-10 Plusai, Inc. Method and system for integrated global and distributed learning in autonomous driving vehicles
US11537126B2 (en) 2017-06-06 2022-12-27 Plusai, Inc. Method and system for on-the-fly object labeling via cross modality validation in autonomous driving vehicles
US11435750B2 (en) 2017-06-06 2022-09-06 Plusai, Inc. Method and system for object centric stereo via cross modality validation in autonomous driving vehicles
US10710633B2 (en) 2017-07-14 2020-07-14 Nio Usa, Inc. Control of complex parking maneuvers and autonomous fuel replenishment of driverless vehicles
US10369974B2 (en) 2017-07-14 2019-08-06 Nio Usa, Inc. Control and coordination of driverless fuel replenishment for autonomous vehicles
US10816354B2 (en) 2017-08-22 2020-10-27 Tusimple, Inc. Verification module system and method for motion-based lane detection with multiple sensors
US11874130B2 (en) 2017-08-22 2024-01-16 Tusimple, Inc. Verification module system and method for motion-based lane detection with multiple sensors
US11573095B2 (en) 2017-08-22 2023-02-07 Tusimple, Inc. Verification module system and method for motion-based lane detection with multiple sensors
US11846510B2 (en) 2017-08-23 2023-12-19 Tusimple, Inc. Feature matching and correspondence refinement and 3D submap position refinement system and method for centimeter precision localization using camera-based submap and LiDAR-based global map
CN109425348A (en) * 2017-08-23 2019-03-05 北京图森未来科技有限公司 A method and apparatus for simultaneous localization and mapping
US10762673B2 (en) 2017-08-23 2020-09-01 Tusimple, Inc. 3D submap reconstruction system and method for centimeter precision localization using camera-based submap and LiDAR-based global map
CN109425348B (en) * 2017-08-23 2023-04-07 北京图森未来科技有限公司 Method and device for simultaneous localization and mapping
WO2019040800A1 (en) * 2017-08-23 2019-02-28 TuSimple 3d submap reconstruction system and method for centimeter precision localization using camera-based submap and lidar-based global map
CN111373337B (en) * 2017-08-23 2024-01-05 图森有限公司 3D sub-map reconstruction system and method for centimeter-accurate positioning using camera-based sub-maps and LIDAR-based global maps
US11151393B2 (en) 2017-08-23 2021-10-19 Tusimple, Inc. Feature matching and corresponding refinement and 3D submap position refinement system and method for centimeter precision localization using camera-based submap and LiDAR-based global map
CN111065980A (en) * 2017-08-23 2020-04-24 图森有限公司 System and method for centimeter-accurate positioning using camera-based sub-maps and LIDAR-based global maps
CN111373337A (en) * 2017-08-23 2020-07-03 图森有限公司 3D sub-map reconstruction system and method for centimeter-accurate positioning using camera-based sub-maps and LIDAR-based global maps
CN111065980B (en) * 2017-08-23 2023-07-14 图森有限公司 System and method for centimeter-accurate positioning using camera-based sub-maps and LIDAR-based global maps
US10953881B2 (en) 2017-09-07 2021-03-23 Tusimple, Inc. System and method for automated lane change control for autonomous vehicles
US10953880B2 (en) 2017-09-07 2021-03-23 Tusimple, Inc. System and method for automated lane change control for autonomous vehicles
US11853071B2 (en) 2017-09-07 2023-12-26 Tusimple, Inc. Data-driven prediction-based system and method for trajectory planning of autonomous vehicles
US10962650B2 (en) 2017-10-31 2021-03-30 United States Of America As Represented By The Administrator Of Nasa Polyhedral geofences
US11312334B2 (en) 2018-01-09 2022-04-26 Tusimple, Inc. Real-time remote control of vehicles with high redundancy
US11305782B2 (en) 2018-01-11 2022-04-19 Tusimple, Inc. Monitoring system for autonomous vehicle operation
US11022971B2 (en) 2018-01-16 2021-06-01 Nio Usa, Inc. Event data recordation to identify and resolve anomalies associated with control of driverless vehicles
US11740093B2 (en) 2018-02-14 2023-08-29 Tusimple, Inc. Lane marking localization and fusion
US11852498B2 (en) 2018-02-14 2023-12-26 Tusimple, Inc. Lane marking localization
US11009356B2 (en) 2018-02-14 2021-05-18 Tusimple, Inc. Lane marking localization and fusion
US11009365B2 (en) 2018-02-14 2021-05-18 Tusimple, Inc. Lane marking localization
US11830205B2 (en) 2018-02-27 2023-11-28 Tusimple, Inc. System and method for online real-time multi-object tracking
US11295146B2 (en) 2018-02-27 2022-04-05 Tusimple, Inc. System and method for online real-time multi-object tracking
US11665017B2 (en) 2018-02-28 2023-05-30 Cisco Technology, Inc. Telemetry reporting in vehicle super resolution systems
JP2021518557A (en) * 2018-03-19 2021-08-02 アウトサイト Methods and systems for identifying the material composition of moving objects
US11694308B2 (en) 2018-04-12 2023-07-04 Tusimple, Inc. Images for perception modules of autonomous vehicles
US11010874B2 (en) 2018-04-12 2021-05-18 Tusimple, Inc. Images for perception modules of autonomous vehicles
US11500101B2 (en) 2018-05-02 2022-11-15 Tusimple, Inc. Curb detection by analysis of reflection images
US10896539B2 (en) 2018-06-22 2021-01-19 Beijing Didi Infinity Technology And Development Co., Ltd. Systems and methods for updating highly automated driving maps
US11835361B2 (en) * 2018-08-31 2023-12-05 Denso Corporation Vehicle-side device, method and non-transitory computer-readable storage medium for autonomously driving vehicle
US20210180987A1 (en) * 2018-08-31 2021-06-17 Denso Corporation Vehicle-side device, method and non-transitory computer-readable storage medium for autonomously driving vehicle
US11292480B2 (en) 2018-09-13 2022-04-05 Tusimple, Inc. Remote safe driving methods and systems
US11714192B2 (en) 2018-10-30 2023-08-01 Tusimple, Inc. Determining an angle between a tow vehicle and a trailer
US10942271B2 (en) 2018-10-30 2021-03-09 Tusimple, Inc. Determining an angle between a tow vehicle and a trailer
US11972690B2 (en) 2018-12-14 2024-04-30 Beijing Tusen Zhitu Technology Co., Ltd. Platooning method, apparatus and system of autonomous driving platoon
AU2019419781B2 (en) * 2019-01-04 2021-12-09 Seoul Robotics Co., Ltd. Vehicle using spatial information acquired using sensor, sensing device using spatial information acquired using sensor, and server
EP3756056A4 (en) * 2019-01-04 2022-02-23 Seoul Robotics Co., Ltd. Vehicle using spatial information acquired using sensor, sensing device using spatial information acquired using sensor, and server
KR20210152051A (en) * 2019-01-04 2021-12-14 (주)서울로보틱스 Vehicle and sensing device of tracking three-dimensional space, and computer program stored in storage medium
KR102338370B1 (en) * 2019-01-04 2021-12-13 주식회사 서울로보틱스 Vehicle and sensing device of utilizing spatial information acquired using sensor, and server for the same
US11914388B2 (en) 2019-01-04 2024-02-27 Seoul Robotics Co., Ltd. Vehicle using spatial information acquired using sensor, sensing device using spatial information acquired using sensor, and server
KR102453933B1 (en) * 2019-01-04 2022-10-14 (주)서울로보틱스 Vehicle and sensing device of tracking three-dimensional space, and computer program stored in storage medium
WO2020141694A1 (en) * 2019-01-04 2020-07-09 Seoul Robotics Co., Ltd. Vehicle using spatial information acquired using sensor, sensing device using spatial information acquired using sensor, and server
KR20200092819A (en) * 2019-01-04 2020-08-04 (주)서울로보틱스 Vehicle and sensing device of utilizing spatial information acquired using sensor, and server for the same
JP7405451B2 (en) 2019-01-04 2023-12-26 ソウル ロボティクス カンパニー リミテッド Vehicles that utilize spatial information acquired using sensors, sensing devices that utilize spatial information acquired using sensors, and servers
KR20200141422A (en) * 2019-01-04 2020-12-18 (주)서울로보틱스 Vehicle and sensing device of utilizing spatial information acquired using sensor, and server for the same
JP2022518369A (en) * 2019-01-04 2022-03-15 ソウル ロボティクス カンパニー リミテッド Vehicles that utilize spatial information acquired using sensors, sensing devices that utilize spatial information acquired using sensors, and servers
KR102193950B1 (en) * 2019-01-04 2020-12-22 주식회사 서울로보틱스 Vehicle and sensing device of utilizing spatial information acquired using sensor, and server for the same
US11507101B2 (en) 2019-01-04 2022-11-22 Seoul Robotics Co., Ltd. Vehicle using spatial information acquired using sensor, sensing device using spatial information acquired using sensor, and server
RU2805133C2 (en) * 2019-01-21 2023-10-11 Renault S.A.S Method for determining reliability of the target in the environment of the vehicle
US11884293B2 (en) 2019-01-25 2024-01-30 Uber Technologies, Inc. Operator assistance for autonomous vehicles
WO2020154676A1 (en) * 2019-01-25 2020-07-30 Uatc, Llc Operator assistance for autonomous vehicles
EP3722909A1 (en) * 2019-04-10 2020-10-14 Siemens Aktiengesellschaft Transport system for transporting transport pieces
US11823460B2 (en) 2019-06-14 2023-11-21 Tusimple, Inc. Image fusion for autonomous vehicle operation
US11736495B2 (en) 2019-11-08 2023-08-22 Zoox, Inc. Guidance authentication with vehicles
US11349851B2 (en) 2019-11-08 2022-05-31 Zoox, Inc. Guidance authentication with vehicles
WO2021092230A1 (en) 2019-11-08 2021-05-14 Zoox, Inc. Guidance authentication with vehicles
US20230221719A1 (en) * 2019-11-26 2023-07-13 Zoox, Inc. Correction of sensor data alignment and environment mapping
WO2021138475A1 (en) * 2019-12-31 2021-07-08 Zoox, Inc. Vehicle control to join and depart a route
CN111464637A (en) * 2020-03-31 2020-07-28 北京百度网讯科技有限公司 Unmanned vehicle data processing method, device, equipment and medium
US11810322B2 (en) 2020-04-09 2023-11-07 Tusimple, Inc. Camera pose estimation techniques
US11993256B2 (en) 2020-05-22 2024-05-28 Cnh Industrial America Llc Dynamic perception zone estimation
US11701931B2 (en) 2020-06-18 2023-07-18 Tusimple, Inc. Angle and orientation measurements for vehicles with multiple drivable sections
US11914378B2 (en) 2021-05-18 2024-02-27 Ford Global Technologies, Llc Intersection node-assisted high-definition mapping
DE102021213147A1 (en) 2021-11-23 2023-05-25 Volkswagen Aktiengesellschaft Method, server device and motor vehicle for automatically mapping a surrounding area in sections

Also Published As

Publication number Publication date
EP3371797A4 (en) 2019-05-01
EP3371668B1 (en) 2022-12-28
US9612123B1 (en) 2017-04-04
EP3371668A1 (en) 2018-09-12
WO2017079460A8 (en) 2017-07-13
EP3371797A2 (en) 2018-09-12
US11106218B2 (en) 2021-08-31
WO2017079219A1 (en) 2017-05-11
WO2017079460A3 (en) 2017-08-24
US9630619B1 (en) 2017-04-25
US20170248963A1 (en) 2017-08-31
US20170120904A1 (en) 2017-05-04

Similar Documents

Publication Publication Date Title
US11796998B2 (en) Autonomous vehicle fleet service and system
US11106218B2 (en) Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes
US11061398B2 (en) Machine-learning systems and techniques to optimize teleoperation and/or planner decisions
US10921811B2 (en) Adaptive autonomous vehicle planner logic
US11314249B2 (en) Teleoperation system and method for trajectory modification of autonomous vehicles
US11301767B2 (en) Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
US20200074024A1 (en) Simulation system and methods for autonomous vehicles
JP7316789B2 (en) Adaptive mapping for navigating autonomous vehicles in response to changes in the physical environment
EP3371795B1 (en) Coordination of dispatching and maintaining fleet of autonomous vehicles
US9734455B2 (en) Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
US9720415B2 (en) Sensor-based object-detection optimization for autonomous vehicles
US9916703B2 (en) Calibration for autonomous vehicle operation
US9507346B1 (en) Teleoperation system and method for trajectory modification of autonomous vehicles
WO2017079229A1 (en) Simulation system and methods for autonomous vehicles
US20240028031A1 (en) Autonomous vehicle fleet service and system

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 16862985; Country of ref document: EP; Kind code of ref document: A2)
WWE WIPO information: entry into national phase (Ref document number: 2018543270; Country of ref document: JP)
NENP Non-entry into the national phase (Ref country code: DE)
WWE WIPO information: entry into national phase (Ref document number: 2016862985; Country of ref document: EP)