US20200142428A1 - Systems and Methods for Controlling Autonomous Vehicles that Provide a Vehicle Service to Users - Google Patents
- Publication number
- US20200142428A1 (U.S. application Ser. No. 16/734,945)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- user
- computing system
- location
- autonomous vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- B60R16/037—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R25/00—Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
- B60R25/20—Means to switch the anti-theft system on or off
- B60R25/24—Means to switch the anti-theft system on or off using electronic identifiers containing a code not memorised by the user
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3407—Route searching; Route guidance specially adapted for specific applications
- G01C21/3438—Rendez-vous, i.e. searching a destination where several users can meet, and the routes to this destination for these users; Ride sharing, i.e. searching a route such that at least two users can share a vehicle for at least part of the route
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3453—Special cost functions, i.e. other than distance or default speed limit of road segments
- G01C21/3492—Special cost functions, i.e. other than distance or default speed limit of road segments employing speed data or traffic data, e.g. real-time or historical
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3667—Display of a road map
- G01C21/3676—Overview of the route on the road map
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/0088—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/005—Traffic control systems for road vehicles including pedestrian guidance indicator
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0129—Traffic data processing for creating historical data or processing based on historical data
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0137—Measuring and analyzing of parameters relative to traffic conditions for specific applications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0137—Measuring and analyzing of parameters relative to traffic conditions for specific applications
- G08G1/0145—Measuring and analyzing of parameters relative to traffic conditions for specific applications for active traffic flow control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/09623—Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/09626—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages where the origin of the information is within the own vehicle, e.g. a local storage device, digital map
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096725—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096775—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096791—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is another vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/096805—Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route
- G08G1/096827—Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route where the route is computed onboard
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/123—Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/14—Traffic control systems for road vehicles indicating individual free spaces in parking areas
- G08G1/141—Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces
- G08G1/143—Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces inside the vehicles
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/14—Traffic control systems for road vehicles indicating individual free spaces in parking areas
- G08G1/145—Traffic control systems for road vehicles indicating individual free spaces in parking areas where the indication depends on the parking areas
- G08G1/147—Traffic control systems for road vehicles indicating individual free spaces in parking areas where the indication depends on the parking areas where the parking area is within an open public zone, e.g. city centre
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/20—Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles
- G08G1/202—Dispatching vehicles on the basis of a location, e.g. taxi dispatching
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/20—Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles
- G08G1/205—Indicating the location of the monitored vehicles as destination, e.g. accidents, stolen, rental
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
- H04W4/46—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for vehicle-to-vehicle communication [V2V]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
Definitions
- the present disclosure relates generally to controlling the travel holding pattern of an autonomous vehicle that provides a vehicle service to a user.
- An autonomous vehicle is a vehicle that is capable of sensing its environment and navigating without human input.
- an autonomous vehicle can observe its surrounding environment using a variety of sensors and can attempt to comprehend the environment by performing various processing techniques on data collected by the sensors. Given knowledge of its surrounding environment, the autonomous vehicle can identify an appropriate motion path through such surrounding environment.
- One example aspect of the present disclosure is directed to a computer-implemented method of controlling autonomous vehicles.
- the method includes obtaining, by a computing system that includes one or more computing devices, data indicative of a location associated with a user to which an autonomous vehicle is to travel. The autonomous vehicle is to travel along a first vehicle route that leads to the location associated with the user.
- the method includes obtaining, by the computing system, traffic data associated with a geographic area that includes the location associated with the user.
- the method includes determining, by the computing system, an estimated traffic impact of the autonomous vehicle on the geographic area based at least in part on the traffic data.
- the method includes determining, by the computing system, one or more vehicle actions based at least in part on the estimated traffic impact.
- the method includes causing, by the computing system, the autonomous vehicle to perform the one or more vehicle actions.
- the one or more vehicle actions include at least one of stopping the autonomous vehicle at least partially in a travel way within a vicinity of the location associated with the user or travelling along a second vehicle route.
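The claimed method steps above amount to a decision routine: estimate the vehicle's traffic impact, then either stop in the travel way or travel a second route. A minimal sketch follows; the numeric impact score, the threshold value, and the action names are illustrative assumptions, not part of the claims.

```python
def choose_vehicle_action(estimated_traffic_impact, impact_threshold=0.5):
    """Hypothetical decision rule over the claimed vehicle actions.

    estimated_traffic_impact: assumed score in [0, 1], higher = worse.
    Stop at least partially in the travel way when the estimated impact
    is acceptable; otherwise travel along a second vehicle route.
    """
    if estimated_traffic_impact <= impact_threshold:
        return "stop_in_travel_way"
    return "travel_second_route"
```

The threshold could equally be learned or set per geographic area; the disclosure leaves the decision criterion unspecified.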
- the computing system includes one or more processors and one or more memory devices.
- the one or more memory devices store instructions that when executed by the one or more processors cause the computing system to perform operations.
- the operations include obtaining data indicative of a location associated with a user.
- the user is associated with a request for a vehicle service provided by an autonomous vehicle.
- the autonomous vehicle is to travel along a first vehicle route to arrive within a vicinity of the location associated with the user.
- the operations include obtaining traffic data associated with a geographic area that includes the location associated with the user.
- the operations include obtaining location data associated with a user device associated with the user.
- the operations include determining at least one of an estimated traffic impact of the autonomous vehicle on the geographic area based at least in part on the traffic data or an estimated time of user arrival based at least in part on the location data associated with the user device.
- the operations include determining one or more vehicle actions based at least in part on at least one of the estimated traffic impact or the estimated time of user arrival.
- the operations include causing the autonomous vehicle to perform the one or more vehicle actions.
- the one or more vehicle actions include at least one of stopping the autonomous vehicle at least partially in a travel way within a vicinity of the location associated with the user or travelling along a second vehicle route.
- an autonomous vehicle includes one or more sensors, a communication system, one or more processors, and one or more memory devices.
- the one or more memory devices store instructions that when executed by the one or more processors cause the autonomous vehicle to perform operations.
- the operations include obtaining data indicative of a location associated with a user.
- the user is associated with a request for a vehicle service provided by the autonomous vehicle.
- the operations include controlling the autonomous vehicle to travel along a first vehicle route to arrive within a vicinity of the location associated with the user.
- the operations include obtaining traffic data associated with a geographic area that includes the location associated with the user based at least in part on sensor data obtained via the one or more sensors.
- the traffic data is indicative of a level of traffic within a surrounding environment of the autonomous vehicle.
- the operations include obtaining, via the communication system, location data associated with a user device associated with the user.
- the location data associated with the user device is indicative of one or more locations of the user device associated with the user at one or more times.
- the operations include determining an estimated traffic impact of the autonomous vehicle on the geographic area based at least in part on the traffic data.
- the operations include determining an estimated time of user arrival based at least in part on the location data associated with the user device.
- the operations include determining one or more vehicle actions based at least in part on at least one of the estimated traffic impact or the estimated time of user arrival.
- the operations include causing the autonomous vehicle to perform the one or more vehicle actions.
- the one or more vehicle actions include at least one of stopping the autonomous vehicle within the vicinity of the location associated with the user or travelling along a second vehicle route.
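The onboard operations above also use an estimated time of user arrival derived from user-device location data (one or more locations at one or more times). The disclosure does not fix an estimator; the sketch below assumes the most recent fix, a straight-line walking distance in meters, and a typical walking speed.

```python
import math

def estimated_time_of_user_arrival(user_locations, vehicle_location,
                                   walking_speed_mps=1.4):
    """Estimate seconds until the user reaches the vehicle.

    user_locations: list of (x, y) user-device fixes in meters,
    oldest first (a simplifying assumption about the location data).
    vehicle_location: (x, y) position of the stopped/approaching vehicle.
    """
    latest = user_locations[-1]  # most recent user-device fix
    dx = vehicle_location[0] - latest[0]
    dy = vehicle_location[1] - latest[1]
    distance_m = math.hypot(dx, dy)
    return distance_m / walking_speed_mps
```

A production estimator would presumably use routable walking paths rather than straight-line distance.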
- FIG. 1 depicts an example system overview according to example embodiments of the present disclosure.
- FIG. 2 depicts an example geographic area that includes a location associated with a user according to example embodiments of the present disclosure.
- FIG. 3 depicts example information associated with an acceptable walking distance according to example embodiments of the present disclosure.
- FIG. 4 depicts an example display device with an example communication according to example embodiments of the present disclosure.
- FIG. 5 depicts example information associated with an estimated traffic impact according to example embodiments of the present disclosure.
- FIG. 6 depicts an example travel way according to example embodiments of the present disclosure.
- FIG. 7 depicts a flow diagram of an example method of determining an estimated time of user arrival according to example embodiments of the present disclosure.
- FIG. 8A depicts an example portion of a communications system according to example embodiments of the present disclosure.
- FIG. 8B depicts an example portion of a communications system according to example embodiments of the present disclosure.
- FIG. 8C depicts an example diagram of obtaining location data according to example embodiments of the present disclosure.
- FIG. 9 depicts example information associated with a second vehicle route according to example embodiments of the present disclosure.
- FIG. 10 depicts an example display device with an example communication according to example embodiments of the present disclosure.
- FIG. 11 depicts example information associated with a vehicle service cancellation threshold according to example embodiments of the present disclosure.
- FIG. 12 depicts a flow diagram of an example method of controlling autonomous vehicles according to example embodiments of the present disclosure.
- FIGS. 13A-B depict a flow diagram of an example method of controlling autonomous vehicles according to example embodiments of the present disclosure.
- FIG. 14 depicts example system components according to example embodiments of the present disclosure.
- Example aspects of the present disclosure are directed to improving the travel patterns of an autonomous vehicle to account for potential traffic impacts while waiting to provide a vehicle service to a user.
- an entity (e.g., a service provider) can use a fleet of vehicles to provide a vehicle service (e.g., transportation service, delivery service, courier service, etc.) to a plurality of users.
- the fleet can include, for example, autonomous vehicles that can drive, navigate, operate, etc. with minimal and/or no interaction from a human driver.
- an autonomous vehicle can receive data indicative of a location associated with a user that has requested a vehicle service, such as a transportation service. The autonomous vehicle can autonomously navigate along a route towards the location associated with the user.
- the autonomous vehicle can attempt to identify a parking spot that is out of a travel way (e.g., out of a traffic lane). Such parking spots may not, however, be available to the autonomous vehicle.
- the autonomous vehicle can determine whether the vehicle should stop in a travel lane to pick up the user and/or whether the vehicle should enter a holding pattern whereby the vehicle continues to travel (e.g., around the block) to return to the user's location.
- the autonomous vehicle can estimate the impact the autonomous vehicle may have on traffic if the autonomous vehicle were to stop in a travel way to wait for the user, given, for example, the user's proximity to the vehicle.
- the autonomous vehicle can stop in the travel way and alert the user of the vehicle's location.
- the autonomous vehicle can be re-routed (e.g., around the block) so that the vehicle can again attempt to pick up the user.
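The passage above hinges on an estimated traffic impact: how much stopping in the travel way would disrupt traffic, given conditions and the expected wait. The disclosure does not specify the scoring function; the form below (traffic level, blocked fraction of the travel way, capped wait time) is purely illustrative.

```python
def estimated_traffic_impact(traffic_level, wait_time_s,
                             lanes_blocked=1, total_lanes=2):
    """Hypothetical impact score in [0, 1].

    traffic_level: assumed normalized congestion level in [0, 1].
    wait_time_s: expected wait for the user (e.g., from an ETA estimate).
    Heavier traffic, longer waits, and a larger blocked fraction of the
    travel way all raise the impact of stopping in a lane.
    """
    blocked_fraction = lanes_blocked / total_lanes
    raw = traffic_level * blocked_fraction * min(wait_time_s / 60.0, 1.0)
    return min(raw, 1.0)
```

Under this sketch, a vehicle blocking one of two lanes in moderate traffic for two minutes scores 0.4 and might stop, while the same stop at rush hour would instead trigger a holding-pattern reroute.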
- the systems and methods of the present disclosure can improve the situational awareness of an autonomous vehicle that is waiting for a user of a vehicle service (e.g., while attempting to pick up a user for transport).
- an entity (e.g., a service provider, owner, manager) can use one or more vehicles (e.g., ground-based vehicles) to provide a vehicle service such as a transportation service (e.g., rideshare service), a courier service, a delivery service, etc.
- the vehicle(s) can be autonomous vehicles that include various systems and devices configured to control the operation of the vehicle.
- an autonomous vehicle can include an onboard vehicle computing system for operating the vehicle (e.g., located on or within the autonomous vehicle).
- the vehicle computing system can receive sensor data from sensor(s) onboard the vehicle (e.g., cameras, LIDAR, RADAR), attempt to comprehend the vehicle's surrounding environment by performing various processing techniques on the sensor data, and generate an appropriate motion plan through the vehicle's surrounding environment.
- the autonomous vehicle can be configured to communicate with one or more computing devices that are remote from the vehicle.
- the autonomous vehicle can communicate with an operations computing system that can be associated with the entity.
- the operations computing system can help the entity monitor, communicate with, manage, etc. the fleet of vehicles.
- An autonomous vehicle can be configured to operate in a plurality of operating modes.
- an autonomous vehicle can be configured to operate in a fully autonomous (e.g., self-driving) operating mode in which the autonomous vehicle can drive and navigate with no interaction from a human driver present in the vehicle.
- a human driver may not be present in the autonomous vehicle.
- the autonomous vehicle can also be configured to operate in an approach mode in which the autonomous vehicle performs various functions as it approaches a location associated with a user, such as searching its surrounding environment for a parking location.
- the approach operating mode can be utilized, for example, when the autonomous vehicle is approaching a user that has requested a vehicle service, as further described herein.
- a user can make a request for a vehicle service provided by the autonomous vehicle.
- a user can provide (e.g., via a user device) a request to the operations computing system of an entity (e.g., service provider, manager, owner) that is associated with the autonomous vehicle.
- the request can indicate the type of vehicle service that the user desires (e.g., a transportation service, a delivery service, a courier service, etc.), a location associated with the user (e.g., a current location of the user, a different location, etc.), an identifier (e.g., phone number, Bluetooth, WiFi, Cellular, other data that can be used to contact the user, etc.) associated with the user device that provided the request, and/or other information.
- the operations computing system can process the request and select an autonomous vehicle to provide the requested vehicle service to the user.
- the operations computing system can provide, to the autonomous vehicle, data indicative of a location to which the autonomous vehicle is to travel.
- the location can be associated with the user that requested the vehicle service.
- the location can be the current location of the user and/or a different location, such as for example a location at which the user would like to be picked up by the autonomous vehicle, provide an item to the autonomous vehicle, retrieve an item from the autonomous vehicle, etc.
- the location can be expressed as a coordinate (e.g., GPS coordinate, latitude-longitude coordinate pair), an address, a place name, and/or other geographic reference that can be used to identify the location.
- the location associated with the user can be represented, for example, as a pin on a map user interface.
- the autonomous vehicle can obtain, from the operations computing system, the data indicative of the location associated with the user.
- the autonomous vehicle can also obtain a first vehicle route that leads to the location associated with the user.
- the first vehicle route can be, for example, a route from the current location of the vehicle to the location associated with the user.
- the operations computing system can provide the first vehicle route to the autonomous vehicle.
- the onboard vehicle computing system of the autonomous vehicle can determine the first vehicle route.
- the autonomous vehicle can travel in accordance with the first vehicle route to arrive within a vicinity of the location associated with the user.
- the vicinity can be defined by a distance (e.g., a radial distance) from the location associated with the user.
- the distance can be indicative of an acceptable walking distance from the location associated with the user.
- an acceptable walking distance can be a distance that a user would be willing to walk to arrive at a vehicle.
- the acceptable walking distance can be determined based on a variety of information.
- Such information can include, for example, specific user preferences stored in a profile associated with the user, weather information (e.g., gathered via sensors onboard the vehicle, provided by a third party source, etc.), traffic conditions (e.g., current, historic, future predicted traffic conditions), historic vehicle services data (e.g., historic pickup data for previous transportation services), and/or other types of information.
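The information sources listed above can be combined into a single acceptable walking distance. How they are weighted is left open by the disclosure; the base distance and the weather/traffic adjustments below are illustrative assumptions only.

```python
def acceptable_walking_distance(base_m=200.0, user_pref_m=None,
                                raining=False, heavy_traffic=False):
    """Hypothetical combination of the listed information sources.

    user_pref_m: a distance from the user's stored profile, if any,
    which overrides the generic base value.
    """
    d = user_pref_m if user_pref_m is not None else base_m
    if raining:
        d *= 0.5   # users are assumed to walk less far in bad weather
    if heavy_traffic:
        d *= 1.25  # a longer walk may beat a vehicle stuck in traffic
    return d
```

Historic pickup data could refine these factors per location, e.g., shortening the radius at venues where users rarely walk far.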
- the autonomous vehicle can provide a communication to the user indicating that the autonomous vehicle is within the vicinity of the location associated with the user. For instance, such a communication can be in the form of a textual message stating “your ride is arriving, please prepare to board”.
- the autonomous vehicle can begin to search for a parking location. For example, the autonomous vehicle can enter into the approach operating mode when the vehicle is within the vicinity of the location associated with the user. While it is within the vicinity, the autonomous vehicle can search for a parking location before it reaches the location associated with the user (e.g., before the GPS pin coordinate on a map) and/or after it passes the location associated with the user, but is still within the vicinity of the user (e.g., within acceptable walking distance for the user).
- the autonomous vehicle can obtain sensor data associated with one or more objects that are proximate to the vehicle (e.g., within a field of view of one or more of the vehicle's onboard sensor(s)).
- the sensor data can include image data, radar data, LIDAR data, and/or other data acquired by the vehicle's sensor(s).
- the object(s) can include, for example, pedestrians, vehicles, bicycles, and/or other objects.
- the sensor data can be indicative of locations associated with the object(s) within the surrounding environment of the vehicle at one or more times.
- the autonomous vehicle can process the sensor data to determine if there are any available parking locations that are not currently occupied by the objects (e.g., other vehicles) within the vehicle's surrounding environment.
- the autonomous vehicle can utilize map data to determine if there are any designated parking locations (e.g., parking lots, pullover lanes, etc.) within the vicinity of the location associated with the user.
- In the event the autonomous vehicle is able to identify a parking location that is out of a travel way (e.g., out of a traffic lane) and within the vicinity of the location associated with the user, the autonomous vehicle can position itself into that parking location accordingly.
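The parking search described above can be sketched as a simple filter over map-designated parking locations, discarding any spot that is occupied by a perceived object or too far from the user's pin. The function name, coordinate representation, and distance thresholds below are illustrative assumptions, not values from this disclosure.

```python
import math

def find_parking_spot(designated_spots, detected_objects, user_pin,
                      max_walk_m=100.0, occupied_radius_m=3.0):
    """Return the first designated parking spot that is unoccupied and
    within walking distance of the user's pin, or None if none exists.
    All inputs are (x, y) positions in meters."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    for spot in designated_spots:
        if dist(spot, user_pin) > max_walk_m:
            continue  # beyond acceptable walking distance for the user
        # Treat a spot as occupied if any perceived object (e.g., a
        # parked vehicle) sits within occupied_radius_m of it.
        if all(dist(spot, obj) > occupied_radius_m for obj in detected_objects):
            return spot
    return None
```

In this sketch, map data supplies the designated spots and the vehicle's processed sensor data supplies the detected object positions.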
- the autonomous vehicle can send a communication to a user device associated with the user.
- the communication can indicate that the vehicle has arrived as well as the location of the autonomous vehicle.
- the user device can display a map user interface that includes a user route.
- the user route can be a route along which a user can travel to arrive at the autonomous vehicle.
- the autonomous vehicle can decide whether or not to stop at least partially in a travel way to wait for the user.
- the autonomous vehicle can obtain traffic data associated with a geographic area that includes the location associated with the user.
- the traffic data can include various types of data such as historic traffic data, predicted traffic data, and/or current traffic data associated with the geographic area (e.g., within the vicinity of the location of the user, a wider area, etc.).
- the traffic data can be obtained from a variety of sources such as other autonomous vehicles (e.g., within the vehicle fleet), the operations computing system, third party sources (e.g., regional traffic management entities, etc.), as well as the autonomous vehicle itself.
- the autonomous vehicle can obtain the current traffic data associated with the geographic area that includes the location associated with the user.
- the autonomous vehicle can obtain sensor data (e.g., via its onboard sensors) associated with the surrounding environment of the autonomous vehicle that is within the vicinity of the location associated with the user, as described herein.
- the sensor data can be indicative of one or more objects within the surrounding environment of the autonomous vehicle.
- the autonomous vehicle can process the sensor data to classify which of the object(s) would be impacted (e.g., caused to stop) by the autonomous vehicle stopping in a travel way.
- the autonomous vehicle can classify the vehicles that are behind the autonomous vehicle (in the same travel lane) as objects that would be impacted in the event the autonomous vehicle were to stop at least partially in the travel way.
- the autonomous vehicle can also identify object(s) that would not be affected by the vehicle stopping in the travel way. For example, the autonomous vehicle can determine that object(s) that have a path to travel around the autonomous vehicle (e.g., bicycles, vehicles in adjacent lanes, vehicles with clear paths to change lanes, etc.), may not be impacted and/or may be impacted to an insignificant degree. After such classification, the autonomous vehicle can determine a level of traffic associated with the geographic area (e.g., within the vicinity of the user's location) based at least in part on the sensor data.
- the level of traffic can be based at least in part on the number of object(s) within the surrounding environment of the autonomous vehicle that would be impacted by the autonomous vehicle stopping at least partially in the travel way, while filtering out those object(s) that would not be impacted. Similar such information could be acquired via one or more other vehicles in the associated vehicle fleet.
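The traffic-level determination above amounts to counting the perceived objects that would actually be forced to stop, while filtering out those with a path around the vehicle. A minimal sketch, with assumed object fields for illustration:

```python
def level_of_traffic(perceived_objects, vehicle_lane):
    """Count objects that would be forced to stop by an in-lane stop:
    vehicles behind the autonomous vehicle in its own lane with no
    clear path around it. The dictionary fields are assumptions."""
    impacted = 0
    for obj in perceived_objects:
        if obj["type"] != "vehicle":
            continue  # e.g., bicycles are assumed able to pass
        if obj["lane"] != vehicle_lane or not obj["is_behind"]:
            continue  # adjacent-lane traffic is filtered out
        if obj["can_change_lane"]:
            continue  # a clear path to change lanes means little impact
        impacted += 1
    return impacted
```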
- the autonomous vehicle can determine an estimated traffic impact of the autonomous vehicle on the geographic area based at least in part on the traffic data.
- the estimated traffic impact can be indicative of an estimated impact of the autonomous vehicle on one or more objects within a surrounding environment of the autonomous vehicle in the event that the autonomous vehicle were to stop at least partially in the travel way (e.g., in the vicinity of the location associated with the user).
- the autonomous vehicle can compare the level of traffic (e.g., based on the sensor data) to a traffic constraint to determine whether the estimated traffic impact would be high or low.
- the traffic constraint can be implemented in a variety of forms.
- the traffic constraint can include a traffic threshold that is indicative of an acceptable level of traffic (e.g., an acceptable number of objects) that would be impacted by the autonomous vehicle stopping at least partially in the travel way. A traffic level that exceeds the traffic threshold would be considered a high impact on traffic.
- the traffic constraint can be implemented as cost data (e.g., one or more cost function(s)).
- the autonomous vehicle's onboard vehicle computing system can include cost data that reflects the cost(s) of stopping vehicle motion, the cost(s) of causing traffic build-up, the cost(s) of illegally stopping in a travel way, etc.
- the traffic constraint can be based on a variety of information.
- the traffic constraint can be based at least in part on historic traffic data that indicates the level of traffic previously occurring in that geographic area. For example, if the geographic area normally experiences a high level of traffic build-up, a corresponding traffic threshold can be higher (and/or the cost of stopping can be lower).
- the traffic constraint can be based at least in part on real-time traffic data (e.g., from other vehicles in the fleet, from the autonomous vehicle, other sources). For example, in the event that there is already a traffic jam in the vicinity of the location of the user, the traffic threshold could be higher (and/or the cost of stopping could be lower).
- the traffic constraint can be based at least in part on the typical travel expectations of individuals in the geographic area. For example, individuals that are located in City A may be more patient when waiting in traffic than those in City B. Thus, a traffic threshold may be higher in City A than in City B (and/or the cost of stopping may be lower in City A than in City B). In some implementations, the traffic constraint can be based at least in part on map data. For example, in the event that the autonomous vehicle is traveling on a wide travel way in which impacted vehicles could eventually travel around the autonomous vehicle, the traffic threshold could be higher (and/or the cost of stopping could be lower).
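One way to combine these factors is to start from a base traffic threshold and loosen it when historic, real-time, or map conditions make an in-lane stop less disruptive. The base value and increments below are illustrative assumptions, not values from this disclosure.

```python
def traffic_threshold(base=3, historically_congested=False,
                      active_traffic_jam=False, wide_travel_way=False):
    """Acceptable number of impacted objects for an in-lane stop.
    Each condition below loosens the threshold, mirroring the factors
    discussed above."""
    threshold = base
    if historically_congested:
        threshold += 2  # area normally experiences traffic build-up
    if active_traffic_jam:
        threshold += 2  # traffic is already stopped nearby anyway
    if wide_travel_way:
        threshold += 1  # impacted vehicles can eventually pass
    return threshold
```

The equivalent cost-data formulation would lower the cost of stopping under the same conditions rather than raise a threshold.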
- the traffic constraint can be determined at least in part from a model, such as a machine-learned model.
- the machine-learned model can be or can otherwise include one or more various model(s) such as, for example, models using boosted random forest techniques, neural networks (e.g., deep neural networks), or other multi-layer non-linear models.
- Neural networks can include recurrent neural networks (e.g., long short-term memory recurrent neural networks), feed-forward neural networks, and/or other forms of neural networks.
- supervised training techniques can be performed to train the model (e.g., using previous driving logs) to determine a traffic constraint based at least in part on input data.
- the input data can include, for example, traffic data as described herein, map data, data from a traffic management entity, driving characteristics of individuals in an associated geographic area, complaints received from operators of vehicles that were caused to stop by autonomous vehicles, etc.
- the machine-learned model can provide, as an output, data indicative of a recommended traffic constraint.
- the recommended traffic constraint can be specific to the geographic area.
- the autonomous vehicle can also, or alternatively, determine an estimated time of user arrival in order to help determine whether or not to stop at least partially in the travel way. For instance, the autonomous vehicle can obtain location data associated with a user device associated with a user. The location data can be indicative of the position of the user device associated with the user. The autonomous vehicle can use one or more identifier(s) of the user device (e.g., provided by the operations computing system) to scan for and/or communicate with the user device when the vehicle is within the vicinity of the user. For example, the autonomous vehicle can utilize multiple input/multiple output communication, Bluetooth low energy protocol, RF signaling, and/or other communication technologies to obtain the location data. The autonomous vehicle can determine the estimated time of user arrival based at least in part on the location data associated with the user device.
- the estimated time of user arrival can be indicative of, for example, a time at which the user is estimated to complete boarding of the autonomous vehicle (e.g., for a transportation service).
- the estimated time of user arrival can be expressed as a time duration (e.g., user estimated to arrive in 1 minute) and/or a point in time (e.g., user estimated to arrive at 10:31 am (PT)).
- the autonomous vehicle can determine an amount of stopping time for which the object(s) within its surroundings would be impacted as the autonomous vehicle waits for the user's arrival (e.g., how long other vehicles would be caused to stop while waiting for the user).
- the operations computing system can provide the autonomous vehicle with one or more identifier(s) of a user device associated with the user.
- the autonomous vehicle can use the identifier(s) (e.g., Bluetooth, WiFi, Cellular, other identifier) to determine the location of the user. For example, when the autonomous vehicle is within a vicinity of the location associated with the user, the autonomous vehicle can scan for the user device based at least in part on the identifier(s). Once the user device is found, the autonomous vehicle can track changes in the signal strength (e.g., radio signal strength identifier) to determine the approximate heading of the user as well as the approximate distance between the user and autonomous vehicle (e.g., without authenticated connection).
- the autonomous vehicle can determine differences in Bluetooth Low Energy beacon radio signal strength identifiers over time and/or inertial measurement unit changes, which can indicate distance and direction of the autonomous vehicle from the user device (e.g., mobile phone associated with the user).
- the autonomous vehicle can calculate the estimated time of user arrival based at least on the heading of the user and the approximate distance between the user and the vehicle (and/or the estimated velocity of the user), as further described herein.
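The distance and arrival-time estimates described above might be sketched as a log-distance path-loss conversion from a BLE RSSI reading to an approximate range, followed by a walking-time estimate. The constants (reference transmit power, path-loss exponent, walking speed, boarding allowance) are typical assumptions, not values from this disclosure.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
    """Approximate range to a BLE beacon from a single RSSI reading
    using the log-distance path-loss model; tx_power_dbm is the
    assumed RSSI at 1 meter."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def estimated_time_of_user_arrival(distance_m, heading_toward_vehicle,
                                   walking_speed_mps=1.4,
                                   boarding_time_s=20.0):
    """Seconds until the user is estimated to complete boarding:
    walking time over the estimated distance plus a fixed boarding
    allowance. Returns None when the user is heading away, since no
    reliable estimate can be made."""
    if not heading_toward_vehicle:
        return None
    return distance_m / walking_speed_mps + boarding_time_s
```

In practice, single-reading RSSI ranging is noisy; the tracking of signal-strength changes over time described above serves to smooth exactly this kind of estimate.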
- the autonomous vehicle can obtain the location data using multiple input, multiple output communication between the autonomous vehicle and a user device associated with the user. This can allow the autonomous vehicle to take advantage of the multiple antennas included in the vehicle's communication system as well as those of the user device to increase accuracy of the location data associated with the user.
- the estimated time to user arrival can be based at least in part on historic data. Such historic data can include, for example, previous correlations between changes in the signal strength of an identifier and the user's time to arriving at a vehicle.
- the autonomous vehicle, systems, and methods described herein can utilize other communication methods.
- Such communication methods can include, for example, autonomous vehicle based triangulation (e.g., on vehicle triangulated RF), on-vehicle multi-range beacon (e.g., Bluetooth Low Energy) hardware (e.g., paired with a software application on a user device), application triangulation, user device to vehicle handshake (e.g., light signal handshake), autonomous vehicle perception of the user (e.g., via processing of sensor data to perceive the user and the user's location, distance, heading, velocity, and/or other state data associated therewith), GPS location of the user device, device specific techniques (e.g., associated with a specific device/model type), the autonomous vehicle serving as a localized base station (e.g., GPS, WiFi, etc.) for the user device, autonomous vehicle localization and user device image based localization via one or more network(s), and/or other techniques.
- the autonomous vehicle in order to determine whether the amount of stopping time is acceptable, can compare the estimated time of user arrival to a time constraint.
- the time constraint can be expressed as a time threshold (e.g., indicating an acceptable amount of stopping time) and/or cost data (e.g., cost functions expressing a cost in relation to stopping time). Similar to the traffic constraint, the time constraint can be based on historic data (e.g., indicating historic wait times), real-time data (e.g., indicating that the vehicles are already waiting due to another traffic build-up in front of the autonomous vehicle), expectations of individuals in the geographic area, machine-learned model(s), and/or other information.
- the estimated time of user arrival can also factor in additional amounts of time that can impact the object(s) within the surrounding environment of the autonomous vehicle while the autonomous vehicle is stopped, awaiting the user.
- Example instances requiring such additional amounts of time can be associated with a user getting into the vehicle and securely fastening his/her seatbelt, a user helping other passengers enter and become securely positioned within the vehicle (e.g., children or others requiring assistance), a user loading luggage or other items for transportation within the vehicle, a user receiving delivered item(s) from/placing item(s) within the vehicle, etc.
- the additional amounts of time can be determined based at least in part on information provided with a service request (e.g., type of service, destination, number of passengers, child's car seat requested, etc.).
- the autonomous vehicle can determine one or more vehicle actions based at least in part on the estimated traffic impact and/or the estimated time of user arrival.
- vehicle action(s) can include stopping within the vicinity of the location associated with the user (e.g., at least partially in the travel way). For example, in the event that the level of traffic (e.g., the number of other vehicles that would be impacted by an in-lane stop) is below a traffic threshold, the autonomous vehicle can determine that the vehicle can stop within the travel way to wait for the user to arrive at the vehicle. In another example, in the event that the estimated time of user arrival is below the time threshold, the autonomous vehicle can determine that the vehicle can stop at least partially within the travel way.
- the autonomous vehicle can base its determination to stop at least partially within the travel way on both the estimated traffic impact and the estimated time of user arrival. For instance, the autonomous vehicle can weigh each of these estimates to determine whether it would be appropriate for the vehicle to stop at least partially in the travel way to wait for the user.
- the autonomous vehicle can apply a first weighting factor to the estimated traffic impact and a second weighting factor to the estimated time of user arrival.
- the first weighting factor can be different than the second weighting factor.
- the estimated impact on traffic may be high while the estimated time of user arrival may be short.
- the autonomous vehicle may determine that it can stop within the travel way because although a higher number of objects (e.g., other vehicles) may be caused to stop, it would only be for a short period of time because the user is close in distance to the autonomous vehicle. In such a case, the estimated time of user arrival can be given a greater weight than the estimated traffic impact. In another example, the estimated impact on traffic may be low while the estimated time of user arrival may be long. Accordingly, the autonomous vehicle may determine that it should not stop within the travel way because although only a few objects (e.g., other vehicles) may be caused to stop, it would be for a greater period of time because the user is farther from the autonomous vehicle. In such a case, the estimated traffic impact can be given a greater weight than the estimated time of user arrival.
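The weighted trade-off between the two estimates can be illustrated by normalizing each estimate against its constraint and comparing a weighted sum to a unit budget. The normalization scheme and default weights are assumptions for illustration; the disclosure does not prescribe a specific formula.

```python
def should_stop_in_travel_way(traffic_level, traffic_threshold,
                              eta_s, time_threshold_s,
                              w_traffic=0.5, w_time=0.5):
    """Weighted stop/no-stop decision: each estimate is normalized
    against its constraint, weighted, and the combined score compared
    to 1.0. Stop only when the combined impact is acceptable."""
    score = (w_traffic * (traffic_level / traffic_threshold)
             + w_time * (eta_s / time_threshold_s))
    return score <= 1.0
```

With time weighted more heavily, a high traffic impact paired with a short wait can still permit an in-lane stop, matching the first example above.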
- the vehicle action(s) can also, or alternatively, include the autonomous vehicle entering into a holding pattern.
- the vehicle action(s) can include traveling along a second vehicle route (e.g., an optimal holding pattern route).
- the autonomous vehicle may be unable to find a parking location before and/or after the location associated with the user (e.g., before and/or after a pin location of the user).
- the autonomous vehicle may determine that it should not stop within a travel way to wait for the user's arrival, as described herein.
- the autonomous vehicle can be re-routed along a second vehicle route that is at least partially different than the first vehicle route.
- the second vehicle route can be a path along which the autonomous vehicle can travel to re-arrive within the vicinity of the location of the user.
- the second vehicle route can be a path along which the autonomous vehicle can travel around a block, back to the location associated with the user. This can afford the user additional time to arrive at the vehicle, without the autonomous vehicle impacting traffic (e.g., due to a stop).
- the autonomous vehicle can determine that it can stop within the travel way, but later determine that it must begin to travel again (e.g., according to a holding pattern route). For example, the autonomous vehicle can determine that it would be appropriate to stop at least partially within the travel way to wait for the user based at least in part on the estimated traffic impact and/or the estimated time of user arrival. While the autonomous vehicle is stopped, the traffic impact may increase (e.g., due to an increase in the number of other vehicle(s) stopped behind the autonomous vehicle) and/or the user may take longer than estimated to arrive at the autonomous vehicle.
- the autonomous vehicle can determine an updated estimated traffic impact (e.g., based on the number of vehicles that have already stopped and/or additional vehicles that may be caused to stop) and/or an updated estimated time of user arrival (e.g., based on a change in the user device location data, if any).
- the autonomous vehicle can then determine that it can no longer remain stopped to wait for the user based at least in part on the updated estimates.
- the autonomous vehicle can begin to travel again, for example, along the second vehicle route (e.g., around the block).
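The re-evaluation while stopped can be sketched as a small state update: a vehicle already stopped in the travel way abandons the stop and takes the second (holding-pattern) route as soon as either updated estimate exceeds its constraint. The action names are illustrative.

```python
def update_holding_action(current_action, updated_traffic_level,
                          traffic_threshold, updated_eta_s,
                          time_threshold_s):
    """Re-evaluate a vehicle already stopped in the travel way: if
    either updated estimate now exceeds its constraint, abandon the
    stop and travel the second vehicle route instead."""
    if current_action != "stopped_in_travel_way":
        return current_action  # only an active stop is re-evaluated
    if (updated_traffic_level > traffic_threshold
            or updated_eta_s > time_threshold_s):
        return "travel_second_route"  # e.g., around the block
    return "stopped_in_travel_way"
```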
- the vehicle computing system of the autonomous vehicle can implement the determined vehicle action(s). For example, in the event that the autonomous vehicle has determined to stop in the travel way, the vehicle computing system can cause the autonomous vehicle to stop by sending one or more control signals to the braking control system(s) of the autonomous vehicle. In the event that the autonomous vehicle has determined to travel along a second vehicle route (e.g., in accordance with the holding pattern), the vehicle computing system can obtain data associated with the second vehicle route and implement the route accordingly. For example, the vehicle computing system can request and obtain data indicative of the second vehicle route from the operations computing system. Additionally, or alternatively, the vehicle computing system can determine the second vehicle route onboard the vehicle. In either case, the vehicle computing system can send one or more control signals to cause a motion planning system of the autonomous vehicle to plan the motion of the vehicle in accordance with the second vehicle route (e.g., to implement a vehicle trajectory in accordance with the second vehicle action).
- the autonomous vehicle can provide the user with one or more communications indicating the actions taken by the autonomous vehicle. For example, in the event that the autonomous vehicle stops within the travel way, the autonomous vehicle can provide a communication to the user indicating that the vehicle is waiting for the user (e.g., “your vehicle has arrived, please proceed quickly to the vehicle”). In response to receiving such a communication, a user device associated with the user can display a map user interface indicating the vehicle location of the autonomous vehicle and a user route to the vehicle's location. In the event that the autonomous vehicle does not find a parking spot and does not stop in the travel way, the autonomous vehicle can provide a communication to the user indicating as such (e.g., “I could not locate you at the pin drop, traffic forced me to go around the block. Please proceed to the pin drop”).
- the autonomous vehicle may be relieved of its responsibility to provide a vehicle service to the user. For instance, in the event that a vehicle computing system and/or operations computing system determines that an autonomous vehicle should travel along the second vehicle route, such computing system(s) can determine whether it would be advantageous (e.g., more time efficient, more fuel efficient, etc.) for another autonomous vehicle in the area to be routed to the user. In the event that it would be advantageous, the computing system(s) can provide data to the autonomous vehicle indicating that the autonomous vehicle is no longer responsible for the service request. Additionally, or alternatively, the computing system(s) can re-route the autonomous vehicle to provide a vehicle service to another user.
- the autonomous vehicle can cancel the service request of the user.
- the autonomous vehicle may be caused to re-route (e.g., circle the block) a certain number of times and/or the user may not arrive at the autonomous vehicle within a certain timeframe (e.g., above a trip cancellation threshold).
- the autonomous vehicle can send data to the operations computing system requesting the cancellation of the user's service request.
- the operations computing system can cancel the service request (and inform the user accordingly) and/or re-route the autonomous vehicle to provide a vehicle service to another user.
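The cancellation logic can be sketched as a simple check against a maximum re-route count and a trip-cancellation wait threshold; both limits are assumed values for illustration, not thresholds from this disclosure.

```python
def should_request_cancellation(reroute_count, wait_time_s,
                                max_reroutes=3, cancel_wait_s=600.0):
    """True when the vehicle has circled the block too many times or
    the user's wait has exceeded the trip-cancellation threshold,
    triggering a cancellation request to the operations computing
    system."""
    return reroute_count >= max_reroutes or wait_time_s > cancel_wait_s
```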
- the autonomous vehicle can communicate directly with a user device associated with the user to cancel the service request and inform the user accordingly.
- the autonomous vehicle can report such a cancellation to the operations computing system.
- the systems and methods described herein may provide a number of technical effects and benefits.
- the systems and methods enable an autonomous vehicle to determine a holding pattern, for waiting for a user, onboard the autonomous vehicle.
- the autonomous vehicle need not communicate with a remote computing system (e.g., operations computing system) each time the vehicle must decide whether to stop in a travel way or re-route the vehicle (e.g., around the block). This can help improve the response time of the vehicle computing system when deciding and/or implementing such a holding pattern.
- the systems and methods described herein can save computational resources of the operations computing system (that would otherwise be required for such determination). Accordingly, the computational resources of the operations computing system can be allocated to other core functions such as the management of service requests, routing of autonomous vehicles, etc.
- the systems and methods of the present disclosure also provide an improvement to vehicle computing technology, such as autonomous vehicle computing technology.
- the computer-implemented methods and systems improve the situational awareness of the autonomous vehicle to provide a vehicle service to a user.
- the systems and methods can enable a computing system to obtain data indicative of a location associated with a user to which an autonomous vehicle is to travel.
- the autonomous vehicle can travel along a first vehicle route that leads to the location associated with the user.
- the computing system can obtain traffic data associated with a geographic area that includes the location associated with the user and location data associated with a user device associated with the user.
- the computing system can determine an estimated traffic impact of the autonomous vehicle on the geographic area based at least in part on the traffic data and/or an estimated time of user arrival based at least in part on the location data associated with the user device.
- the computing system can determine one or more vehicle actions based at least in part on the estimated traffic impact and/or the estimated time of user arrival.
- the computing system can cause the autonomous vehicle to perform the one or more vehicle actions.
- the vehicle action(s) can include at least one of stopping the autonomous vehicle at least partially in a travel way within a vicinity of the location associated with the user or travelling along a second vehicle route. In this way, the vehicle computing system can improve the holding pattern of the autonomous vehicle that is waiting for a user.
- the holding pattern can be customized based on the estimated impact on the traffic surrounding the autonomous vehicle and/or the estimated time it will take for the specific user to arrive at the vehicle.
- the systems and methods can improve the vehicle computing system's situational awareness by allowing it to take into account (e.g., in real-time) such circumstances when making a determination as to how to best provide vehicle services.
- Such an approach can also increase the efficiency of implementing a holding pattern (e.g., by avoiding the aforementioned latency issues) while providing an additional benefit of minimizing the autonomous vehicle's impact on traffic.
- the systems and methods of the present disclosure can enhance the user experience associated with the autonomous vehicle.
- the systems and methods described herein provide a systematic approach that enables users to engage autonomous vehicles and receive effectively communicated information regarding expected autonomous vehicle locations including, for example, arrival times, use of holding patterns when needed, etc.
- the communications and user interfaces described herein provide the user with updated information regarding the autonomous vehicle's actions and locations, thereby increasing the user's knowledge and understanding of the autonomous vehicle's intentions.
- the user experience is further enhanced in that the described systems and methods can decrease the likelihood that a user will be subjected to potential frustration from other drivers or pedestrians in the surrounding environment that could be impacted while the user is arriving to and/or boarding the autonomous vehicle.
- FIG. 1 depicts an example system 100 according to example embodiments of the present disclosure.
- the system 100 can include a vehicle computing system 102 associated with a vehicle 104 and an operations computing system 106 that is remote from the vehicle 104 .
- the vehicle 104 incorporating the vehicle computing system 102 can be a ground-based autonomous vehicle (e.g., car, truck, bus, etc.), an air-based autonomous vehicle (e.g., airplane, drone, helicopter, or other aircraft), or other types of vehicles (e.g., watercraft, etc.).
- the vehicle 104 can be an autonomous vehicle that can drive, navigate, operate, etc. with minimal and/or no interaction from a human driver.
- a human operator can be omitted from the vehicle 104 (and/or also omitted from remote control of the vehicle 104 ).
- the vehicle 104 can be configured to operate in a plurality of operating modes 108 A-D.
- the vehicle 104 can be configured to operate in a fully autonomous (e.g., self-driving) operating mode 108 A in which the vehicle 104 can drive and navigate with no input from a user present in the vehicle 104 .
- the vehicle 104 can be configured to operate in a semi-autonomous operating mode 108 B in which the vehicle 104 can operate with some input from a user present in the vehicle 104 .
- the vehicle 104 can enter into a manual operating mode 108 C in which the vehicle 104 is fully controllable by a user (e.g., human driver) and can be prohibited from performing autonomous navigation (e.g., autonomous driving).
- the vehicle 104 can implement vehicle operating assistance technology (e.g., collision mitigation system, power assist steering, etc.) while in the manual operating mode 108 C to help assist the operator of the vehicle 104 .
- the vehicle 104 can also be configured to operate in an approach mode 108 D in which vehicle 104 performs various functions as it approaches a location associated with a user. For example, the vehicle 104 can enter into the approach mode 108 D when the vehicle is within a vicinity of a user to which the vehicle 104 is to provide a vehicle service, as further described herein. While in the approach mode 108 D, the vehicle 104 can search its surrounding environment for a parking location as well as communicate with a user (e.g., via a user device associated with the user). Additionally, or alternatively, the vehicle 104 can be configured to evaluate its estimated impact on traffic and/or an estimated time of user arrival, as further described herein, when the vehicle 104 is in the approach mode 108 D.
- the operating mode of the vehicle 104 can be adjusted in a variety of manners. In some implementations, the operating mode of the vehicle 104 can be selected remotely, off-board the vehicle 104 .
- an entity associated with the vehicle 104 (e.g., a service provider) can select the operating mode of the vehicle 104 from a remote location.
- the operations computing system 106 can send one or more control signals to the vehicle 104 instructing the vehicle 104 to enter into, exit from, maintain, etc. an operating mode.
- the operations computing system 106 can send one or more control signals to the vehicle 104 instructing the vehicle 104 to enter into the fully autonomous operating mode 108 A.
- the operating mode of the vehicle 104 can be set onboard and/or near the vehicle 104 .
- the vehicle computing system 102 can automatically determine when and where the vehicle 104 is to enter, change, maintain, etc. a particular operating mode (e.g., without user input).
- the operating mode of the vehicle 104 can be manually selected via one or more interfaces located onboard the vehicle 104 (e.g., key switch, button, etc.) and/or associated with a computing device proximate to the vehicle 104 (e.g., a tablet operated by authorized personnel located near the vehicle 104 ).
- the operating mode of the vehicle 104 can be adjusted based at least in part on a sequence of interfaces located on the vehicle 104 .
- the operating mode may be adjusted by manipulating a series of interfaces in a particular order to cause the vehicle 104 to enter into a particular operating mode.
- the vehicle computing system 102 can include one or more computing devices located onboard the vehicle 104 .
- the computing device(s) can be located on and/or within the vehicle 104 .
- the computing device(s) can include various components for performing various operations and functions.
- the computing device(s) can include one or more processor(s) and one or more tangible, non-transitory, computer readable media (e.g., memory devices).
- the one or more tangible, non-transitory, computer readable media can store instructions that when executed by the one or more processor(s) cause the vehicle 104 (e.g., its computing system, one or more processors, etc.) to perform operations and functions, such as those described herein for controlling an autonomous vehicle.
- the vehicle 104 can include one or more sensors 112 , an autonomy computing system 114 , and one or more vehicle control systems 116 .
- One or more of these systems can be configured to communicate with one another via a communication channel.
- the communication channel can include one or more data buses (e.g., controller area network (CAN)), on-board diagnostics connector (e.g., OBD-II), and/or a combination of wired and/or wireless communication links.
- the onboard systems can send and/or receive data, messages, signals, etc. amongst one another via the communication channel.
- the sensor(s) 112 can be configured to acquire sensor data 118 associated with one or more objects that are proximate to the vehicle 104 (e.g., within a field of view of one or more of the sensor(s) 112 ).
- the sensor(s) 112 can include a Light Detection and Ranging (LIDAR) system, a Radio Detection and Ranging (RADAR) system, one or more cameras (e.g., visible spectrum cameras, infrared cameras, etc.), motion sensors, and/or other types of imaging capture devices and/or sensors.
- the sensor data 118 can include image data, radar data, LIDAR data, and/or other data acquired by the sensor(s) 112 .
- the object(s) can include, for example, pedestrians, vehicles, bicycles, and/or other objects.
- the object(s) can be located in front of, to the rear of, and/or to the side of the vehicle 104 .
- the sensor data 118 can be indicative of locations associated with the object(s) within the surrounding environment of the vehicle 104 at one or more times.
- the sensor(s) 112 can provide the sensor data 118 to the autonomy computing system 114 .
- the autonomy computing system 114 can retrieve or otherwise obtain map data 120 .
- the map data 120 can provide detailed information about the surrounding environment of the vehicle 104 .
- the map data 120 can provide information regarding: the identity and location of different roadways, road segments, buildings, or other items or objects (e.g., lampposts, crosswalks, curbing, etc.); the location and directions of traffic lanes (e.g., the location and direction of a parking lane, a turning lane, a bicycle lane, or other lanes within a particular roadway or other travel way and/or one or more boundary markings associated therewith); traffic control data (e.g., the location and instructions of signage, traffic lights, or other traffic control devices); and/or any other map data that provides information that assists the vehicle 104 in comprehending and perceiving its surrounding environment and its relationship thereto.
- the vehicle 104 can include a positioning system 122 .
- the positioning system 122 can determine a current position of the vehicle 104 .
- the positioning system 122 can be any device or circuitry for analyzing the position of the vehicle 104 .
- the positioning system 122 can determine position by using one or more of inertial sensors, a satellite positioning system, based on IP/MAC address, by using triangulation and/or proximity to network access points or other network components (e.g., cellular towers, WiFi access points, etc.) and/or other suitable techniques.
- the position of the vehicle 104 can be used by various systems of the vehicle computing system 102 and/or provided to one or more remote computing device(s) (e.g., of the operations computing system 106 ).
- the map data 120 can provide the vehicle 104 relative positions of the surrounding environment of the vehicle 104 .
- the vehicle 104 can identify its position within the surrounding environment (e.g., across six axes) based at least in part on the data described herein.
- the vehicle 104 can process the sensor data 118 (e.g., LIDAR data, camera data) to match it to a map of the surrounding environment to get an understanding of the vehicle's position within that environment.
- the autonomy computing system 114 can include a perception system 124 , a prediction system 126 , a motion planning system 128 , and/or other systems that cooperate to perceive the surrounding environment of the vehicle 104 and determine a motion plan for controlling the motion of the vehicle 104 accordingly.
- the autonomy computing system 114 can receive the sensor data 118 from the sensor(s) 112 , attempt to comprehend the surrounding environment by performing various processing techniques on the sensor data 118 (and/or other data), and generate an appropriate motion plan through such surrounding environment.
- the autonomy computing system 114 can control the one or more vehicle control systems 116 to operate the vehicle 104 according to the motion plan.
- the autonomy computing system 114 can identify one or more objects that are proximate to the vehicle 104 based at least in part on the sensor data 118 and/or the map data 120 .
- the perception system 124 can obtain state data 130 descriptive of a current state of an object that is proximate to the vehicle 104 .
- the state data 130 for each object can describe, for example, an estimate of the object's: current location (also referred to as position); current speed (also referred to as velocity); current acceleration; current heading; current orientation; size/footprint (e.g., as represented by a bounding shape); class (e.g., pedestrian class vs. vehicle class vs. bicycle class), and/or other state information.
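The per-object state data described above can be sketched as a simple container. The field names and units here are illustrative assumptions, not the patent's actual data structures:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ObjectState:
    """Illustrative container for per-object state data (state data 130)."""
    location: Tuple[float, float]      # current position (x, y) in meters
    speed: float                       # current speed (m/s)
    acceleration: float                # current acceleration (m/s^2)
    heading: float                     # current heading (radians)
    bounding_box: Tuple[float, float]  # size/footprint (length, width) in meters
    object_class: str                  # e.g., "pedestrian", "vehicle", "bicycle"

# Example: a vehicle detected 20 m ahead, traveling at 8 m/s
state = ObjectState(location=(20.0, 0.0), speed=8.0, acceleration=0.0,
                    heading=0.0, bounding_box=(4.5, 1.8), object_class="vehicle")
```

A perception system would populate one such record per tracked object and hand the collection to the prediction system.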
- the perception system 124 can provide the state data 130 to the prediction system 126 (e.g., for predicting the movement of an object).
- the prediction system 126 can create predicted data 132 associated with each of the respective one or more objects proximate to the vehicle 104 .
- the predicted data 132 can be indicative of one or more predicted future locations of each respective object.
- the predicted data 132 can be indicative of a predicted path (e.g., predicted trajectory) of at least one object within the surrounding environment of the vehicle 104 .
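One simple way a prediction system could produce predicted future locations is constant-velocity extrapolation from an object's current state. This is a generic sketch of the idea, not the patent's prediction method:

```python
import math

def predict_locations(x, y, speed, heading, dt=0.5, steps=4):
    """Extrapolate an object's future (x, y) positions assuming constant
    speed and heading (a deliberately simple motion model)."""
    positions = []
    for i in range(1, steps + 1):
        t = i * dt
        positions.append((x + speed * math.cos(heading) * t,
                          y + speed * math.sin(heading) * t))
    return positions

# Object at the origin heading due east (+x) at 10 m/s, sampled every 0.5 s
path = predict_locations(0.0, 0.0, speed=10.0, heading=0.0)
```

Real prediction systems would use richer, learned motion models; the output shape (a sequence of timestamped future locations per object) is the relevant point.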
- the prediction system 126 can provide the predicted data 132 associated with the object(s) to the motion planning system 128 .
- the motion planning system 128 can determine a motion plan 134 for the vehicle 104 based at least in part on the predicted data 132 (and/or other data).
- the motion plan 134 can include vehicle actions with respect to the objects proximate to the vehicle 104 as well as the predicted movements.
- the motion planning system 128 can implement an optimization algorithm that considers cost data associated with a vehicle action as well as other objective functions (e.g., cost functions based on speed limits, traffic lights, etc.), if any, to determine optimized variables that make up the motion plan 134 .
- the motion planning system 128 can determine that the vehicle 104 can perform a certain action (e.g., pass an object) without increasing the potential risk to the vehicle 104 and/or violating any traffic laws (e.g., speed limits, lane boundaries, signage).
- the motion plan 134 can include a planned trajectory, speed, acceleration, other actions, etc. of the vehicle 104 .
- the motion planning system 128 can provide the motion plan 134 with data indicative of the vehicle actions, a planned trajectory, and/or other operating parameters to the vehicle control system(s) 116 to implement the motion plan 134 for the vehicle 104 .
- the vehicle 104 can include a mobility controller configured to translate the motion plan 134 into instructions.
- the mobility controller can translate a determined motion plan 134 into instructions to adjust the steering of the vehicle 104 “X” degrees, apply a certain magnitude of braking force, etc.
- the mobility controller can send one or more control signals to the responsible vehicle control component (e.g., braking control system, steering control system, acceleration control system) to execute the instructions and implement the motion plan 134 .
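The mobility controller's translation step can be sketched as mapping a motion-plan entry to per-subsystem commands. The command names and the braking normalization are hypothetical, invented for illustration:

```python
def translate_motion_plan(steering_deg, decel_mps2, max_brake_decel=8.0):
    """Translate one motion-plan step into control signals for the steering
    and braking subsystems (illustrative mapping only)."""
    signals = {"steering": {"adjust_degrees": steering_deg}}
    if decel_mps2 > 0:
        # Express braking as a fraction of the platform's maximum deceleration
        signals["braking"] = {"force_fraction": min(decel_mps2 / max_brake_decel, 1.0)}
    return signals

# Steer 5 degrees and decelerate at 2 m/s^2
cmd = translate_motion_plan(steering_deg=5.0, decel_mps2=2.0)
```

Each entry in `cmd` would then be dispatched to the responsible vehicle control component.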
- the vehicle 104 can include a communications system 136 configured to allow the vehicle computing system 102 (and its computing device(s)) to communicate with other computing devices.
- the vehicle computing system 102 can use the communications system 136 to communicate with the operations computing system 106 and/or one or more other remote computing device(s) over one or more networks (e.g., via one or more wireless signal connections).
- the communications system 136 can allow communication among one or more of the system(s) on-board the vehicle 104 .
- the communications system 136 can also be configured to enable the autonomous vehicle to communicate with and/or otherwise receive data from a user device 138 associated with a user 110 .
- the communications system 136 can utilize various communication technologies such as, for example, Bluetooth low energy protocol, radio frequency signaling, etc.
- the communications system 136 can enable the vehicle 104 to function as a WiFi base station for a user device 138 and/or implement localization techniques.
- the communications system 136 can include any suitable components for interfacing with one or more network(s), including, for example, transmitters, receivers, ports, controllers, antennas, and/or other suitable components that can help facilitate communication.
- the vehicle 104 can include one or more human-machine interfaces 139 .
- the vehicle 104 can include one or more display devices located onboard the vehicle 104 .
- a display device can be viewable by a user of the vehicle 104 that is located in the front of the vehicle 104 (e.g., driver's seat, front passenger seat).
- a display device can be viewable by a user of the vehicle 104 that is located in the rear of the vehicle 104 (e.g., back passenger seat(s)).
- the vehicle 104 can be associated with an entity (e.g., a service provider, owner, manager).
- the entity can be one that provides one or more vehicle service(s) to a plurality of users via a fleet of vehicles that includes, for example, the vehicle 104 .
- the entity can be associated with only the vehicle 104 (e.g., a sole owner, manager).
- the operations computing system 106 can be associated with the entity.
- the vehicle 104 can be configured to provide one or more vehicle services to one or more users.
- the vehicle service(s) can include transportation services (e.g., rideshare services in which the user rides in the vehicle 104 to be transported), courier services, delivery services, and/or other types of services.
- the vehicle service(s) can be offered to users by the entity, for example, via a software application (e.g., a mobile phone software application).
- the entity can utilize the operations computing system 106 to coordinate and/or manage the vehicle 104 (and its associated fleet, if any) to provide the vehicle services to a user 110 .
- the operations computing system 106 can include one or more computing devices that are remote from the vehicle 104 (e.g., located off-board the vehicle 104 ).
- such computing device(s) can be components of a cloud-based server system and/or other type of computing system that can communicate with the vehicle computing system 102 of the vehicle 104 .
- the computing device(s) of the operations computing system 106 can include various components for performing various operations and functions.
- the computing device(s) can include one or more processor(s) and one or more tangible, non-transitory, computer readable media (e.g., memory devices).
- the one or more tangible, non-transitory, computer readable media can store instructions that when executed by the one or more processor(s) cause the operations computing system 106 (e.g., the one or more processors, etc.) to perform operations and functions, such as coordinating vehicles to provide vehicle services.
- a user 110 can request a vehicle service provided by the vehicle 104 .
- a user can provide (e.g., via a user device 138 ) data indicative of a request 140 to the operations computing system 106 (e.g., of the entity that is associated with the vehicle 104 ).
- the request 140 can be generated based at least in part on user input to a user interface displayed on the user device 138 (e.g., a user interface associated with a software application of the entity).
- the request 140 can indicate the type of vehicle service that the user 110 desires (e.g., a transportation service, a delivery service, a courier service, etc.) and a location associated with the user 110 (e.g., a current location of the user, a different location, etc.).
- the request 140 can also include an identifier (e.g., phone number, Bluetooth, WiFi, Cellular, IP address, other information, etc.) associated with the user device 138 that provided the request 140 (and/or other user device).
- the identifier can be used by the vehicle computing system 102 to communicate with the user device 138 and/or otherwise provide/obtain data associated therewith, as further described herein.
- such an identifier can be retrieved from a memory that securely stores such information in a profile/account associated with the user 110 (e.g., such that the request 140 need not provide the identifier).
- the operations computing system 106 can process the request 140 and select the vehicle 104 to provide the requested vehicle service to the user 110 .
- the operations computing system 106 can provide, to the vehicle 104 , data 142 indicative of a location to which the vehicle 104 is to travel.
- FIG. 2 depicts an example geographic area 200 that includes a location 202 associated with the user 110 according to example embodiments of the present disclosure.
- the location 202 can be associated with the user 110 that requested the vehicle service.
- the location 202 can be the current location of the user 110 , as specified by the user 110 and/or determined based on user device location data (e.g., provided with the request 140 and/or otherwise obtained).
- the location 202 can also be a location that is different than a current location of a user 110 , such as for example a location at which the user 110 would like to be picked-up by the vehicle 104 , provide an item to the vehicle 104 , retrieve an item from the vehicle 104 , and/or otherwise interact with the vehicle 104 .
- the location 202 can be expressed as a coordinate (e.g., GPS coordinate, latitude-longitude coordinate pair), an address, a place name, and/or another geographic reference that can be used to identify the location 202 .
- the location 202 associated with the user 110 can be represented, for example, as a pin on a map user interface.
- the vehicle computing system 102 can obtain the data 142 indicative of the location 202 associated with the user 110 to which the vehicle 104 is to travel.
- the user 110 can be associated with a request 140 for a vehicle service provided by the vehicle 104 .
- the vehicle 104 can obtain the data 142 indicative of the location 202 associated with the user 110 from the operations computing system 106 .
- the user 110 may communicate directly with the vehicle 104 to request the vehicle service.
- the user 110 may use the user device 138 to send the request 140 to the vehicle computing system 102 .
- the vehicle computing system 102 can process the request 140 and determine the location 202 of the user 110 .
- the vehicle 104 can also obtain a first vehicle route 204 that leads to the location 202 associated with the user 110 .
- the first vehicle route 204 can be, for example, a route from the current location of the vehicle 104 to the location 202 associated with the user 110 .
- the operations computing system 106 can provide the first vehicle route 204 to the vehicle 104 .
- the vehicle computing system 102 of the vehicle 104 can determine the first vehicle route 204 .
- the vehicle computing system 102 can determine the first vehicle route 204 based at least in part on the map data 120 .
- the vehicle 104 is to travel in accordance with the first vehicle route 204 to arrive within a vicinity 206 of the location 202 associated with the user 110 .
- the vehicle computing system 102 can control the vehicle 104 (e.g., via a motion plan 134 implemented by the control system(s) 116 ) to travel along a first vehicle route 204 to arrive within a vicinity 206 of the location 202 associated with the user 110 .
- the vicinity 206 of the location 202 associated with the user 110 can be defined at least in part by a distance (e.g., a radial distance) from the location 202 associated with the user 110 .
- the distance can be indicative of an acceptable walking distance from the location 202 associated with the user 110 .
- an acceptable walking distance can be a distance that a user would be willing to walk (or otherwise travel) to arrive at a vehicle.
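Determining whether the vehicle is within the vicinity defined by such a radial distance can be sketched with a great-circle distance check. The coordinates and the 150 m threshold below are assumptions for illustration:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_vicinity(vehicle_pos, user_pos, acceptable_walking_distance_m):
    """True if the vehicle is within the radial vicinity of the user's location."""
    return haversine_m(*vehicle_pos, *user_pos) <= acceptable_walking_distance_m

# Two points roughly 111 m apart (0.001 degrees of latitude)
near = within_vicinity((40.4406, -79.9959), (40.4416, -79.9959), 150.0)
```

The same predicate could be evaluated onboard (against the positioning system's fix) or off-board by the operations computing system.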
- the operations computing system 106 can determine the acceptable walking distance, as described herein, and provide such information to the vehicle 104 .
- the vehicle 104 can determine the acceptable walking distance.
- the acceptable walking distance can be determined by another computing system and provided to the operations computing system 106 and/or the vehicle 104 .
- the acceptable walking distance can be determined based on a variety of information.
- FIG. 3 depicts example information 300 associated with an acceptable walking distance according to example embodiments of the present disclosure.
- the acceptable walking distance 302 can be determined based at least in part on at least one of current traffic data 304 A, historic data 304 B, user preference data 304 C, local vehicle weather data 304 D, regional weather data 304 E, and/or other data.
- the current traffic data 304 A can be indicative of a current level of traffic within the geographic area 200 and/or an area that would affect the geographic area 200 .
- the current traffic data 304 A can be obtained from another computing system (e.g., city management database, the operations computing system 106 , another vehicle, etc.) and/or determined by a vehicle 104 , as further described herein.
- when the current level of traffic is higher, the acceptable walking distance 302 may be increased so that the user 110 is not waiting a longer time to board the vehicle 104 , place an item in the vehicle 104 , retrieve an item from the vehicle 104 , etc.
- the historic data 304 B can include historic data associated with providing vehicle services to a user.
- the historic data 304 B can be indicative of previously calculated acceptable walking distances for the specific user 110 and/or for other user(s) of the vehicle services.
- the acceptable walking distance 302 can be determined to reflect the historically acceptable walking distance.
- the historic data 304 B can include historic traffic data.
- the acceptable walking distance 302 can be based at least in part on preferences of the user 110 .
- the entity associated with the vehicle 104 can maintain an account/profile associated with the user 110 .
- the user 110 can specify an acceptable walking distance (e.g., via user input to a user interface).
- the user-specified acceptable walking distance can be securely stored and used to determine the acceptable walking distance 302 when the user 110 requests a vehicle service.
- the user 110 may provide feedback regarding the distance the user 110 walked to arrive at a vehicle that is providing the user 110 vehicle services.
- the user 110 can provide feedback data (e.g., via user input to a user interface) indicating whether the distance was acceptable or unacceptable (e.g., as prompted by a software application). Accordingly, the acceptable walking distance 302 can be based at least in part on such feedback data. In some implementations, the acceptable walking distance 302 can be based at least in part on preferences of other users, such as those similarly situated to the user 110 , within the geographic area, within a similar geographic area, etc.
- the acceptable walking distance 302 can also, or alternatively, be based at least in part on weather data.
- the acceptable walking distance 302 can be based at least in part on local weather data 304 D obtained via a vehicle.
- the vehicle 104 can include a rain sensor, thermometer, humidity sensor, and/or other types of sensor(s) that can be used to determine weather conditions within the surrounding environment of the vehicle 104 .
- the vehicle 104 can also be configured to determine the presence of one or more weather conditions (e.g., rain, sleet, snow, etc.) based at least in part on the sensor data 118 .
- the acceptable walking distance 302 can be based at least in part on regional weather data 304 E (e.g., from a third party weather source).
- the acceptable walking distance 302 can be adjusted depending on the weather conditions indicated by the weather data.
- for example, in the event of inclement weather (e.g., rain), the acceptable walking distance 302 may be decreased to a shorter distance from the location 202 .
- in the event of fair weather, the acceptable walking distance 302 can be a greater distance from the location 202 .
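A simple multiplicative heuristic combining the inputs above (traffic, weather, user preference) can be sketched as follows. The base distance and multipliers are invented for illustration, not values from the disclosure:

```python
def acceptable_walking_distance(base_m=200.0, heavy_traffic=False,
                                inclement_weather=False, user_preference_m=None):
    """Heuristic acceptable-walking-distance estimate (illustrative only).

    Heavy traffic raises the distance (so the vehicle need not stop in-lane);
    inclement weather lowers it; an explicit user preference caps the result.
    """
    distance = base_m
    if heavy_traffic:
        distance *= 1.5
    if inclement_weather:
        distance *= 0.5
    if user_preference_m is not None:
        distance = min(distance, user_preference_m)
    return distance

# Heavy traffic in the rain: 200 * 1.5 * 0.5 = 150 m
d = acceptable_walking_distance(heavy_traffic=True, inclement_weather=True)
```

A production system would derive these adjustments from data (e.g., the machine-learned model described below) rather than fixed multipliers.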
- the acceptable walking distance 302 can be determined at least in part from a model, such as a machine-learned model.
- the machine-learned model can be or can otherwise include one or more various model(s) such as, for example, models using boosted random forest techniques, neural networks (e.g., deep neural networks), or other multi-layer non-linear models.
- Neural networks can include recurrent neural networks (e.g., long short-term memory recurrent neural networks), feed-forward neural networks, and/or other forms of neural networks.
- supervised training techniques can be performed to train the model using historical acceptable walking distances, weather data, user feedback data, etc. to determine an acceptable walking distance 302 based at least in part on input data.
- the input data can include, for example, the various types of information 300 , as described herein.
- the machine-learned model can provide, as an output, data indicative of an acceptable walking distance 302 .
- the acceptable walking distance 302 can be specific to the geographic area.
- the model can be trained based on information associated with the geographic area 200 such that the outputted acceptable walking distance 302 is specific to the geographic area 200 .
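As a toy stand-in for the machine-learned model, a one-feature least-squares fit on historic examples can be sketched. The training pairs below are fabricated for illustration; real systems would use the richer models named above (random forests, neural networks) over many features:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b on a single feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical training pairs: traffic level (0-10) -> acceptable walking distance (m)
traffic = [0.0, 2.0, 4.0, 6.0, 8.0]
dist_m = [100.0, 140.0, 180.0, 220.0, 260.0]
a, b = fit_linear(traffic, dist_m)
predicted = a * 5.0 + b  # predicted distance for traffic level 5
```

This illustrates the supervised-training shape described above: historic inputs and labeled acceptable distances in, a predictor of acceptable walking distance out.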
- the vehicle 104 can provide a communication to the user 110 indicating that the vehicle 104 is within the vicinity 206 of the location 202 associated with the user 110 .
- the vehicle computing system 102 can determine that the vehicle 104 is within the acceptable walking distance 302 from the location 202 of the user 110 (e.g., based on the positioning system 122 ).
- the vehicle computing system 102 can send a communication to the user device 138 associated with the user 110 .
- the communication can indicate that the vehicle 104 is within the vicinity 206 of the user 110 (e.g., within the acceptable walking distance 302 ).
- a communication can be in the form of a textual message, auditory message, etc. stating "Your vehicle is arriving; please prepare to board."
- the vehicle 104 can begin to search for a parking location (e.g., out of the vehicle's travel way) when the vehicle 104 is within the vicinity of the location 202 associated with the user 110 .
- the vehicle 104 can enter into the approach operating mode 108 D when the vehicle 104 is within the vicinity 206 of the location 202 associated with the user 110 .
- the vehicle 104 can search for a parking location before it reaches the location 202 associated with the user 110 (e.g., before the GPS pin coordinate on a map).
- the vehicle 104 can search for a parking location after it passes the location 202 associated with the user 110 and is still within the vicinity 206 of the user 110 (e.g., within the acceptable walking distance 302 ).
- the vehicle computing system 102 can obtain sensor data 118 associated with one or more objects that are proximate to the vehicle 104 .
- the sensor data 118 can be indicative of locations associated with the object(s) within the surrounding environment of the vehicle 104 at one or more times.
- the vehicle 104 can process the sensor data 118 to determine if there are any available parking locations out of the vehicle's travel way (e.g., out of a traffic lane) that are not currently occupied by the objects (e.g., other vehicles) within the vehicle's surrounding environment.
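The parking search described above can be sketched as filtering candidate out-of-lane spots against detected object positions. The spot coordinates, planar geometry, and occupancy radius are simplifying assumptions:

```python
def find_parking(candidate_spots, detected_objects, vicinity_m, user_pos,
                 occupied_radius_m=2.5):
    """Return candidate spots that are within the user's vicinity and not
    occupied by any detected object (simple 2-D planar distances)."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    available = []
    for spot in candidate_spots:
        if dist(spot, user_pos) > vicinity_m:
            continue  # too far for an acceptable walk
        if any(dist(spot, obj) <= occupied_radius_m for obj in detected_objects):
            continue  # another object (e.g., a parked car) occupies the spot
        available.append(spot)
    return available

spots = find_parking(candidate_spots=[(10.0, 2.0), (30.0, 2.0), (200.0, 2.0)],
                     detected_objects=[(30.0, 2.0)],
                     vicinity_m=100.0, user_pos=(0.0, 0.0))
```

Here candidate spots could come from map data (designated parking areas) and detected objects from the sensor data.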
- the vehicle 104 can utilize the map data 120 to determine whether any designated parking locations (e.g., parking lots, pullover lanes, etc.) are located within the vicinity 206 of the location 202 associated with the user 110 .
- in the event that the vehicle 104 identifies a parking location that is out of a travel way (e.g., out of a traffic lane) and within the vicinity 206 of the location 202 associated with the user 110 , the vehicle 104 can position itself into that parking location accordingly.
- the vehicle computing system 102 can send a communication to the user device 138 associated with the user 110 indicating that the vehicle 104 has parked.
- FIG. 4 depicts an example display device 400 with an example communication 402 according to example embodiments of the present disclosure.
- the communication 402 can be presented via a user interface 404 on the display device 400 .
- the communication 402 can indicate that the vehicle 104 has arrived.
- the communication 402 and/or another portion of the user interface 404 can be indicative of a location of the vehicle 104 .
- the display device 400 can display a map user interface 406 that includes a user route 408 .
- the user route 408 can be a route along which a user 110 can travel to arrive at the vehicle 104 .
- the vehicle 104 can decide whether or not to stop at least partially in a travel way to wait for the user 110 . To help do so, the vehicle computing system 102 of the vehicle 104 can obtain traffic data associated with the geographic area 200 that includes the location 202 associated with the user 110 .
- FIG. 5 depicts example traffic data 500 that can be obtained by the vehicle computing system 102 according to example embodiments of the present disclosure.
- the traffic data 500 can be obtained from a variety of sources such as other vehicles (e.g., other autonomous vehicles within the vehicle fleet), the operations computing system 106 , third party sources (e.g., traffic management entities, etc.), as well as the vehicle 104 itself.
- the traffic data 500 can include, for example, in-lane traffic data 502 A, out-of-lane traffic data 502 B, other vehicle traffic data 502 C, current wider traffic data 502 D, historic traffic data 502 E, and/or other types of traffic data.
- the traffic data 500 can be updated periodically, as scheduled, upon request, in real-time, and/or in near real-time.
- the in-lane traffic data 502 A and/or out-of-lane traffic data 502 B can be indicative of a level of traffic within the surrounding environment of the vehicle 104 .
- the in-lane traffic data 502 A can be indicative of the number of objects, object locations, and the speed of the respective objects within the current travel lane (and/or other designated travel boundaries) of the vehicle 104 (e.g., the other vehicles to the rear and front of the vehicle 104 ).
- the out-of-lane traffic data 502 B can be indicative of the number of objects, object locations, and the speed of the respective objects within the surrounding environment, other than in the current travel lane (or other boundaries) of the vehicle 104 (e.g., other vehicles in the other lanes, all other classified objects around the vehicle 104 , etc.).
- the in-lane traffic data 502 A and/or out-of-lane traffic data 502 B can be based at least in part on the sensor data 118 associated with the surrounding environment of the vehicle 104 , as further described herein.
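Splitting detections into in-lane versus out-of-lane groups can be sketched by testing each object's lateral offset against the lane boundaries. The straight-lane geometry and the 3.5 m lane width are simplifications; a real system would use lane geometry from the map data:

```python
def split_traffic(objects, lane_center_y=0.0, lane_half_width_m=1.75):
    """Partition detected objects into in-lane and out-of-lane groups based on
    lateral offset from the vehicle's current travel lane (straight-lane
    simplification)."""
    in_lane, out_of_lane = [], []
    for x, y in objects:
        if abs(y - lane_center_y) <= lane_half_width_m:
            in_lane.append((x, y))
        else:
            out_of_lane.append((x, y))
    return in_lane, out_of_lane

# (x ahead, y lateral) detections in meters: two in-lane, one in an adjacent lane
detections = [(15.0, 0.2), (-8.0, -0.5), (12.0, 3.5)]
in_lane, out_of_lane = split_traffic(detections)
```

The counts, positions, and speeds of each group would then feed the in-lane traffic data 502 A and out-of-lane traffic data 502 B.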
- the vehicle computing system 102 can obtain the other vehicle traffic data 502 C from one or more other vehicles, such as one or more other autonomous vehicles in a fleet that includes the vehicle 104 .
- the other vehicle traffic data 502 C can be indicative of the number of objects, object locations, and the speed of the respective objects within the surrounding environment of the other vehicle(s), while the other vehicle(s) are within the geographic area 200 , the vicinity 206 of the location 202 associated with the user 110 , and/or another location which may have a traffic effect on the geographic area 200 and/or the vicinity 206 of the location 202 associated with the user 110 .
- the vehicle computing system 102 can obtain the other vehicle traffic data 502 C via vehicle to vehicle communication and/or via other computing device(s) (e.g., the operations computing system 106 ).
- the current wider traffic data 502 D can be indicative of the traffic within a larger regional area that includes the geographic area 200 .
- the vehicle computing system 102 can obtain the current wider traffic data 502 D that indicates the current traffic patterns, build-ups, etc. within a region that includes the geographic area 200 .
- Such data can be obtained via a third party source such as, for example, a traffic management entity associated with the region.
- the historic traffic data 502 E can include previously collected traffic data of the types of traffic data described herein and/or other types of traffic data.
- the historic traffic data 502 E can include traffic data previously obtained by the vehicle computing system 102 (e.g., in-lane, out-of-lane traffic data), traffic data previously obtained by other vehicle(s) (e.g., associated with the geographic area 200 , the vicinity 206 of the location 202 , etc.), historic traffic data (e.g., traffic patterns) associated with a region that includes the geographic area 200 , and/or other types of historic traffic data.
- the vehicle computing system 102 can determine an estimated traffic impact 504 of the vehicle 104 on the geographic area 200 based at least in part on the traffic data 500 .
- the estimated traffic impact 504 can be indicative of an estimated impact of the vehicle 104 on one or more objects within a surrounding environment of the vehicle 104 in the event that the vehicle 104 were to stop at least partially in the travel way within the vicinity 206 of the location 202 associated with the user 110 .
- the estimated traffic impact 504 can estimate the likelihood that an approaching object (e.g., another vehicle) can pass safely without endangering the user 110 .
- the estimated traffic impact 504 can be based at least in part on a comparison of a level of traffic to a traffic constraint 506 .
- the vehicle computing system 102 can determine a level of traffic (e.g., a number of objects that could be impacted by the vehicle 104 stopping at least partially in a travel way) within the surrounding environment of the vehicle 104 based at least in part on one or more of the types of traffic data 500 .
- the vehicle computing system 102 can compare the level of traffic to a traffic constraint 506 to determine whether the estimated traffic impact 504 would be high or low (e.g., significant or insignificant, unacceptable or acceptable, etc.).
- the traffic constraint 506 can be implemented in a variety of forms.
- the traffic constraint 506 can include a traffic threshold that is indicative of an acceptable level of traffic (e.g., an acceptable number of objects) that would be impacted by the vehicle 104 stopping at least partially in the travel way. A level of traffic that exceeds the traffic threshold would be considered a high impact on traffic.
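The threshold form of the comparison above can be sketched as follows; the function name, inputs, and numeric values are hypothetical, not from the source:

```python
# Hypothetical sketch: compare an observed level of traffic (number of
# objects that would be impacted by stopping in the travel way) to a
# traffic threshold, classifying the estimated traffic impact as high/low.

def estimate_traffic_impact(num_impacted_objects: int, traffic_threshold: int) -> str:
    """Return "high" if the level of traffic exceeds the acceptable threshold."""
    return "high" if num_impacted_objects > traffic_threshold else "low"

# e.g., five vehicles would be forced to stop, but only two are acceptable
print(estimate_traffic_impact(5, 2))   # -> high
print(estimate_traffic_impact(1, 2))   # -> low
```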
- the traffic constraint 506 can be implemented as cost data (e.g., one or more cost function(s)).
- the vehicle computing system 102 (e.g., the motion planning system 128 ) can include cost data that reflects the cost(s) of stopping vehicle motion, the cost(s) of causing traffic build-up, the cost(s) of illegally stopping in a travel way, etc.
- the traffic constraint 506 can be based on a variety of information.
- the traffic constraint 506 can be based at least in part on historic traffic data that indicates the level of traffic previously occurring in that geographic area. For example, if the geographic area 200 normally experiences a high level of traffic build-up, a corresponding traffic threshold can be higher (and/or the cost of stopping can be lower).
- the traffic constraint 506 can be based at least in part on real-time traffic data (e.g., from other vehicles in the fleet, from the vehicle 104 , other sources). For example, in the event that there is already a traffic jam in the vicinity 206 of the location 202 of the user 110 , the traffic threshold could be higher (and/or the cost of stopping could be lower).
- the traffic constraint 506 can be based at least in part on the typical travel expectations of individuals in the geographic area 200 . For example, individuals that are located in City A may be more patient when waiting in traffic than those in City B. Thus, a traffic threshold may be higher in City A than in City B (and/or the cost of stopping may be lower in City A than in City B). In some implementations, the traffic constraint 506 can be based at least in part on map data 120 (and/or other map data). For example, in the event that the vehicle 104 is traveling on a wide travel way in which impacted vehicles could eventually travel around the vehicle 104 , the traffic threshold could be higher (and/or the cost of stopping could be lower). The traffic constraint 506 can be determined dynamically, in real-time (and/or near real-time) to reflect the conditions currently faced by the vehicle 104 .
- the traffic constraint 506 can be determined at least in part from a model, such as a machine-learned model.
- the machine-learned model can be or can otherwise include one or more various model(s) such as, for example, models using boosted random forest techniques, neural networks (e.g., deep neural networks), or other multi-layer non-linear models.
- Neural networks can include recurrent neural networks (e.g., long short-term memory recurrent neural networks), feed-forward neural networks, and/or other forms of neural networks.
- supervised training techniques can be performed to train the model (e.g., using previous driving logs) to determine a traffic constraint 506 based at least in part on input data.
- the input data can include, for example, the traffic data 500 as described herein, map data, data from a traffic management entity, driving characteristics of individuals in an associated geographic area, complaints received from operators of vehicles that were caused to stop by autonomous vehicles, etc.
- the machine-learned model can provide, as an output, data indicative of a recommended traffic constraint.
- the recommended traffic constraint can be specific to the geographic area 200 .
- the model can be trained based at least in part on data associated with geographic area 200 such that the recommended traffic constraint is specific for that particular region.
- the estimated traffic impact 504 can be based at least in part on a model.
- the vehicle computing system 102 and/or the operations computing system 106 can obtain data descriptive of the model (e.g., machine learned model).
- the vehicle computing system 102 and/or the operations computing system 106 can provide input data to the model.
- the input data can include the one or more of the types of traffic data 500 (e.g., associated with the geographic area 200 ).
- the input data can include the map data 120 .
- the model can determine the estimated traffic impact 504 that the vehicle 104 would have on the geographic area 200 if the vehicle 104 were to stop at least partially within a travel way (e.g., within a current lane of travel).
- the model can evaluate the traffic data 500 to determine the level of traffic in the vehicle surrounding environment.
- the model can analyze the traffic data 500 with respect to the traffic constraint 506 to determine whether the estimated traffic impact 504 is high or low, significant or insignificant, unacceptable or acceptable, etc.
- the output of such a model can be the estimated traffic impact 504 , which can be indicative of, for example, a number of objects that would be impacted by the vehicle 104 stopping at least partially in a travel way and/or whether the estimated traffic impact is high or low, significant or insignificant, acceptable or unacceptable, etc.
- the output of the model can be provided as an input to the model for another set of traffic data (e.g., at a subsequent time step). In such fashion, confidence can be built that a determined estimated traffic impact is accurate.
- the process can be iterative such that the estimated traffic impact can be recalculated over time as it becomes clearer what the estimated traffic impact is on the respective geographic area.
- the model can include one or more autoregressive models.
- the model can include one or more machine-learned recurrent neural networks.
- recurrent neural networks can include long short-term memory recurrent neural networks, gated recurrent unit networks, or other forms of recurrent neural networks.
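The iterative estimation described above, in which a prior estimate is fed back as an input alongside the next set of traffic data, can be illustrated with a simple exponential-smoothing update standing in for the (unspecified) autoregressive or recurrent model; all names and values here are illustrative assumptions:

```python
# Illustrative stand-in for the iterative estimation: the previous estimated
# traffic impact is blended with the newest observed traffic level at each
# time step, so the estimate stabilizes as more data arrives.

def update_estimate(previous_estimate: float, new_observation: float,
                    alpha: float = 0.3) -> float:
    """Blend the prior estimate with the newest observed traffic level."""
    return (1 - alpha) * previous_estimate + alpha * new_observation

estimate = 0.0
for observed_level in [4, 6, 5, 5]:   # objects impacted at successive time steps
    estimate = update_estimate(estimate, observed_level)
```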
- the estimated traffic impact 504 can be based at least in part on the sensor data 118 acquired onboard the vehicle 104 .
- the vehicle computing system 102 can obtain traffic data associated with the geographic area 200 that includes the location 202 associated with the user 110 based at least in part on the sensor data 118 obtained via the one or more sensors 112 .
- the vehicle computing system 102 can obtain the in-lane traffic data 502 A and/or out-of-lane traffic data 502 B associated with the geographic area 200 that includes the location 202 associated with the user 110 .
- the vehicle computing system 102 can obtain sensor data 118 (e.g., via the one or more sensors 112 ) associated with the surrounding environment of the vehicle 104 that is within the vicinity 206 of the location 202 associated with the user 110 , as described herein.
- the sensor data 118 can be indicative of one or more objects within the surrounding environment of the vehicle 104 that is within the vicinity 206 of the location 202 .
- the vehicle computing system 102 can process the sensor data 118 to classify which of the object(s) would be impacted (e.g., caused to stop) by the vehicle 104 stopping in a travel way.
- FIG. 6 depicts an example travel way 600 according to example embodiments of the present disclosure.
- the travel way 600 can be associated with the geographic area 200 .
- the travel way 600 can be located within the vicinity 206 of the location 202 associated with the user 110 (e.g., the street on which the user 110 is located).
- the travel way 600 can include the current travel lane 602 (and/or other designated travel boundaries) of the vehicle 104 .
- the travel way 600 can include other travel lanes, such as other lane 604 (e.g., a lane adjacent to the current travel lane 602 ).
- the vehicle computing system 102 can classify the objects within the surrounding environment that would be impacted in the event the vehicle 104 were to stop at least partially in the travel way 600 .
- the object 606 (e.g., another vehicle behind the vehicle 104 in the same travel lane 602 ) would be impacted in the event the vehicle 104 were to stop at least partially in the travel way 600 .
- the vehicle computing system 102 can also identify object(s) that would not be affected by the vehicle 104 stopping at least partially in the travel way 600 .
- object(s) can include one or more objects in another travel lane as well as objects within the same travel lane as the vehicle 104 .
- the vehicle computing system 102 can determine that the object 608 located in the other travel lane 604 (e.g., another vehicle in an adjacent travel lane) will not be impacted by the vehicle 104 stopping at least partially in the travel way 600 because the object 608 can continue past the vehicle 104 via the other travel lane 604 .
- the vehicle computing system 102 can determine that an object 610 (e.g., a bicycle) may not be impacted by the vehicle 104 stopping (and/or may be impacted to an insignificant degree) because the object 610 may be parked and/or have the opportunity to travel around the vehicle 104 (e.g., via a lane change into the adjacent lane, maneuver around the vehicle 104 within the same lane, etc.).
- the vehicle computing system 102 can determine a level of traffic associated with the geographic area 200 (e.g., within the vicinity of the user's location) based at least in part on the sensor data 118 .
- the level of traffic can be based at least in part on the number of object(s) within the surrounding environment of the vehicle 104 that would be impacted by the vehicle 104 stopping at least partially in the travel way 600 (e.g., the current travel lane 602 ), while filtering out those object(s) that would not be impacted. Similar information could be acquired via one or more other vehicles in the associated vehicle fleet.
- the vehicle computing system 102 can determine the estimated traffic impact by comparing the level of traffic to the traffic constraint 506 (e.g., a traffic threshold indicative of a threshold level of traffic). In the event that the level of traffic exceeds the traffic constraint 506 , the vehicle computing system 102 can determine that the estimated traffic impact 504 on the geographic area 200 would be high.
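A minimal sketch of the filter-and-count step described above, assuming a hypothetical object representation (the source does not specify a data structure):

```python
# Hypothetical sketch: count only the objects that would actually be forced
# to stop (e.g., vehicles behind the autonomous vehicle, in the same travel
# lane, with no opportunity to pass), filtering out objects that could
# continue past. Object fields are illustrative, not from the source.

from dataclasses import dataclass

@dataclass
class PerceivedObject:
    lane: str           # "current" or "other"
    can_pass: bool      # object could maneuver around a stopped vehicle

def level_of_traffic(objects: list[PerceivedObject]) -> int:
    """Number of objects impacted by stopping at least partially in the travel way."""
    return sum(1 for o in objects if o.lane == "current" and not o.can_pass)

scene = [
    PerceivedObject(lane="current", can_pass=False),  # vehicle behind (cf. object 606)
    PerceivedObject(lane="other", can_pass=True),     # adjacent-lane vehicle (cf. object 608)
    PerceivedObject(lane="current", can_pass=True),   # bicycle that can go around (cf. object 610)
]
```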
- the vehicle computing system 102 can also, or alternatively, determine an estimated time of user arrival.
- the estimated time of user arrival can be indicative of an amount of time needed for the user 110 to interact with the vehicle 104 (e.g., before the vehicle 104 can begin moving again).
- the vehicle computing system 102 can determine an estimated time until the user 110 completes boarding of the vehicle 104 (e.g., enters the vehicle 104 and fastens a seatbelt for a transportation service).
- the vehicle computing system 102 can determine an estimated time until the user 110 completes the retrieval of an item from the vehicle 104 (e.g., completely removes an item from the vehicle 104 for a delivery service).
- the vehicle computing system 102 can determine an estimated time until the user 110 places an item in the vehicle 104 (e.g., for a courier service).
- the estimated time of user arrival can help the vehicle computing system 102 determine whether or not to stop at least partially in the travel way 600 .
- Such time estimate(s) can be expressed as a time duration (e.g., user estimated to arrive in 1 minute) and/or a point in time (e.g., user estimated to arrive at 10:31 am (PT)).
- the estimated time of user arrival (e.g., the estimated time until the user 110 completes boarding, item retrieval, item placement, etc.) can be based on a variety of information.
- FIG. 7 depicts a flow diagram of an example method 700 of determining an estimated time of user arrival (e.g., autonomous vehicle user boarding times) according to example embodiments of the present disclosure. While the following provides examples of the method 700 with respect to determining autonomous vehicle user boarding times, a similar approach can be taken for determining an estimated time until the user 110 completes the retrieval of an item from the vehicle 104 and/or an estimated time until the user 110 places an item in the vehicle 104 .
- One or more portion(s) of the method 700 can be implemented by one or more computing devices such as, for example, the one or more computing device(s) of the vehicle computing system 102 and/or other systems (e.g., as computing operations). Each respective portion of the method 700 can be performed by any (or any combination) of the one or more computing devices. Moreover, one or more portion(s) of the method 700 can be implemented as an algorithm on the hardware components of the device(s) described herein (e.g., as in FIGS. 1 and 14 ), for example, to control an autonomous vehicle.
- FIG. 7 depicts elements performed in a particular order for purposes of illustration and discussion.
- the method 700 can include obtaining one or more identifiers of a user device.
- the vehicle computing system 102 can use one or more identifier(s) of the user device 138 (e.g., obtained with the request 140 , obtained by the vehicle 104 , provided by the operations computing system 106 , etc.) to scan for and/or communicate with the user device 138 when the vehicle 104 is within the vicinity 206 of the user 110 .
- the operations computing system 106 can provide the vehicle computing system 102 with one or more identifier(s) of a user device 138 associated with the user 110 .
- the identifiers can be, for example, unique radio identifiers (e.g., Bluetooth, WiFi, Cellular, friendly names, contact number, other identifier) collected via a software application and provided to the operations computing system 106 .
- the vehicle computing system 102 can utilize the identifiers to communicate with and/or locate a user device 138 associated with the user 110 . For example, when the vehicle 104 is within a vicinity 206 of the location 202 associated with the user 110 , the vehicle computing system 102 can scan for the user device 138 (e.g., opt-in radios) based at least in part on the identifier(s) (e.g., when the vehicle 104 is in the approach mode 108 D).
- the method 700 can include obtaining location data associated with the user device.
- the vehicle computing system 102 can obtain (e.g., via the communication system 136 ) location data associated with a user device 138 associated with a user 110 .
- the location data associated with the user device 138 can be indicative of one or more locations of the user device 138 associated with the user 110 , at one or more times.
- the vehicle computing system 102 can use the identifier(s) to determine the location of the user 110 .
- the vehicle computing system 102 can be configured to utilize a variety of communication technologies to obtain the location data associated with the user device 138 .
- the vehicle computing system 102 can obtain the location data based at least in part on a triangulation of signals via at least one of multiple input, multiple output communication between the vehicle 104 and the user device 138 , one or more Bluetooth low energy beacons located onboard the vehicle 104 , or a light signal handshake between the user device 138 and the vehicle 104 , as further described herein.
- the vehicle computing system 102 can use radio frequency (RF) signaling to obtain location data associated with the user device 138 .
- FIG. 8A depicts an example portion 800 of a communications system 136 according to example embodiments of the present disclosure.
- the communications system 136 can include one or more electronic devices 802 (e.g., RF module) configured to transmit and/or obtain one or more RF signals (e.g., from the user device 138 ) via one or more communication device(s) 804 (e.g., transmitters, receivers, RF sensors, etc.).
- the vehicle computing system 102 can track changes in the signal strength (e.g., received signal strength indicator (RSSI)) to determine various information about the user device 138 .
- the vehicle computing system 102 can determine the approximate distance of the user device 138 (and the user 110 ) to the vehicle 104 (e.g., RSSI triangulation, refined over time, can be used to determine the approximate distance of the user 110 to the vehicle 104 ).
- the vehicle computing system 102 can determine a heading of the user 110 based at least in part on RF signal(s) associated with the user device 138 (e.g., the communication device(s) 804 with the best/strongest RSSI reading can indicate direction/heading of the user 110 ).
- the vehicle computing system 102 can also determine a speed/velocity of the user device 138 using such a technique. The approximate distance, heading, speed, etc. measurements can be determined with or without an authenticated connection.
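The RSSI-based ranging described above is commonly approximated with a log-distance path-loss model, in which distance grows exponentially as the received signal weakens. This sketch assumes hypothetical calibration constants; the source does not specify a formula:

```python
# Hypothetical sketch: convert a received signal strength (dBm) to an
# approximate distance (meters) using the log-distance path-loss model.
# rssi_at_1m (measured power at 1 m) and the path-loss exponent are
# environment-dependent calibration constants, assumed here.

def rssi_to_distance(rssi_dbm: float, rssi_at_1m: float = -59.0,
                     path_loss_exp: float = 2.0) -> float:
    """Approximate distance in meters from a received signal strength."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exp))
```

Distances estimated this way from several antennas can then be triangulated to approximate the user's position and heading.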
- the vehicle computing system 102 can utilize Bluetooth low energy protocol to obtain location data associated with the user device 138 .
- FIG. 8B depicts an example portion 810 of a communications system 136 according to example embodiments of the present disclosure.
- the communications system 136 can include one or more Bluetooth low energy beacons 812 that are located onboard the vehicle 104 .
- the vehicle computing system 102 can obtain the location data associated with the user device 138 based at least in part on the one or more Bluetooth low energy beacons 812 located onboard the vehicle 104 . In some implementations, the vehicle computing system 102 can determine differences in Bluetooth Low Energy (BLE) beacon received signal strength indicators over time and/or inertial measurement unit changes, which can indicate a distance between the user device 138 of the user 110 (e.g., a mobile phone associated with the user 110 ) and the vehicle 104 .
- the vehicle computing system 102 can determine a heading of the user device 138 (and the user 110 ) based at least in part on the Bluetooth low energy beacon(s) 812 (e.g., the beacon(s) with the best/strongest RSSI reading can indicate direction/heading of the user 110 ).
- the vehicle computing system 102 can also determine a speed/velocity of the user device 138 using such a technique.
- the user device 138 can determine its location (e.g., relative to the vehicle 104 ) based at least in part on signals transmitted from one or more beacons 812 located onboard the vehicle 104 .
- the user device 138 can provide data indicative of its location to one or more remote computing device(s) (e.g., the operations computing system 106 , cloud-based system).
- the remote computing device(s) can provide data indicative of the location of the user device 138 (determined based on the beacons 812 ) to the vehicle computing system 102 .
- the vehicle computing system 102 can determine the location of the user device 138 (and the user 110 ) based on such data.
- the remote computing device(s) can process data from the user device 138 (e.g., associated with the beacons 812 ) to determine a location of the user device 138 and provide data associated therewith to the vehicle computing system 102 .
- the vehicle computing system 102 can utilize one or more altimeters (and/or other measuring device) to obtain location data associated with the user device 138 .
- FIG. 8C depicts an example diagram 820 of obtaining location data according to example embodiments of the present disclosure.
- the vehicle 104 can include one or more altimeters 822 located onboard the vehicle 104 .
- the user device 138 can include one or more altimeters 824 .
- the user device 138 (and the user 110 ) can be located on the second story of a building 826 .
- the vehicle computing system 102 can obtain location data associated with the user device 138 via at least one altimeter 822 located onboard the vehicle 104 .
- the vehicle computing system 102 can obtain location data associated with altimeter(s) 824 of the user device 138 via one or more network(s) 828 .
- the vehicle computing system 102 can compare the location data associated with altimeter(s) 824 of the user device 138 to location data associated with altimeter(s) 822 of the vehicle 104 .
- the vehicle computing system 102 can determine an elevation/altitude of the user device 138 (and the user 110 ) based at least in part on this comparison (e.g., a difference between the altimeter(s) 824 of the user device 138 and the altimeter(s) 822 of the vehicle 104 can indicate elevation/altitude difference).
- the elevation/altitude can be relative to the position of the vehicle 104 and/or another reference (e.g., ground level, sea level, etc.).
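The altimeter comparison described above can be sketched with the international barometric formula, which converts a pressure reading to an approximate altitude; the difference between the user device's altitude and the vehicle's indicates how far above (or below) the vehicle the user is (e.g., on the second story of building 826 ). The constants and function names here are assumptions for illustration:

```python
# Hypothetical sketch: approximate barometric altitude from a pressure
# reading (international barometric formula), then compare the user
# device's altitude against the vehicle's to get a relative elevation.

def altitude_from_pressure(pressure_hpa: float,
                           sea_level_hpa: float = 1013.25) -> float:
    """Approximate altitude in meters for a given barometric pressure."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** 0.1903)

def user_elevation_relative_to_vehicle(user_pressure_hpa: float,
                                       vehicle_pressure_hpa: float) -> float:
    """Positive when the user device is above the vehicle."""
    return (altitude_from_pressure(user_pressure_hpa)
            - altitude_from_pressure(vehicle_pressure_hpa))
```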
- the vehicle computing system 102 can utilize other communication technologies to obtain location data associated with the user device 138 .
- the vehicle computing system 102 can obtain the location data associated with the user device 138 based at least in part on multiple input, multiple output communication between the vehicle 104 and the user device 138 associated with the user 110 . This can allow the vehicle 104 to take advantage of the multiple antennas included in the vehicle's communication system 136 as well as those of the user device 138 to increase accuracy of the location data 704 A associated with the user 110 .
- the user device 138 can obtain an identifier (e.g., Radio Network Temporary Identifier) that can be associated with the user device 138 .
- the user device 138 can provide data indicative of the identifier to one or more remote computing device(s) (e.g., the operations computing system 106 , cloud-based system).
- the remote computing device(s) can provide data indicative of the identifier to the vehicle computing system 102 .
- the vehicle computing system 102 can locate the user device 138 (and the user 110 ) based at least in part on the identifier via the multiple antennas onboard the vehicle 104 (and/or the antenna(s) of the user device 138 ).
- the vehicle computing system 102 can utilize a handshake (e.g., light signal handshake) between the user device 138 and the vehicle 104 .
- the vehicle computing system 102 can obtain location data associated with the user device 138 based at least in part on image data.
- the user 110 can obtain image data associated with the user 110 (e.g., via the user device 138 ).
- the image data can be indicative of one or more characteristics (e.g., buildings, street signs, etc.) of the geographic area and/or surrounding environment of the user 110 .
- the user 110 may be included in the image data.
- the user device 138 can process the image data to determine the location of the user 110 (e.g., via a comparison of image data features to known geographic features).
- the user device 138 can provide the determined location to one or more remote computing device(s) (e.g., the operations computing system 106 and/or other cloud-based system).
- the remote computing device(s) can provide data indicative of the location of the user 110 to the vehicle computing system 102 .
- the vehicle computing system 102 can obtain the data indicative of the location of the user 110 determined based at least in part on image data associated with the user 110 .
- the estimated time until the user 110 starts boarding the vehicle 104 can be based at least in part on the location of the user 110 .
- the remote computing device (e.g., the operations computing system 106 and/or other cloud-based system) can obtain the image data associated with the user 110 , process the image data (e.g., as described herein), and provide the data indicative of the location of the user 110 to the vehicle computing system 102 . This can be helpful to save the processing resources of a computationally limited device (e.g., mobile device).
- the vehicle computing system 102 can obtain the image data associated with the user 110 (e.g., via the user device 138 and/or the remote computing device). The vehicle computing system 102 can determine a location of the user 110 based at least in part on the image data. For example, the vehicle computing system 102 can analyze the features of the image data (e.g., the background, street signs, buildings, etc.) and compare the features to other data (e.g., sensor data 118 , map data 120 , other data, etc.). The vehicle computing system 102 can determine a location of the user based at least in part on this comparison.
- the vehicle computing system 102 can utilize other communication techniques. These techniques can include, for example, vehicle perception of the user 110 (e.g., via processing of sensor data 118 to perceive the user 110 and the user's location, distance, heading, velocity, and/or other state data 130 associated therewith), GPS location of the user device 138 , device specific techniques (e.g., specific device/model type), the vehicle 104 serving as a localized base station (e.g., GPS, WiFi, etc.), and/or other techniques.
- the method 700 can include determining an estimated time until the user starts interaction with the vehicle, at ( 706 ).
- the vehicle computing system 102 can determine an estimated time until the user starts boarding the vehicle 104 (ETSB) based at least in part on the location data associated with the user device 138 .
- the estimated time until the user 110 starts interaction with the vehicle 104 can be indicative of, for example, a countdown in time until the user's location is near/at the vehicle 104 .
- this can include, for example, the time until the user 110 begins to board, until the doors of the vehicle 104 are unlocked, until the doors are opened, etc.
- the user 110 can, but need not, physically interact with the vehicle 104 for this time estimate.
- the estimated time until the user 110 starts interaction with the vehicle 104 can be based on a variety of data.
- the vehicle computing system 102 can determine a distance 708 between the user 110 and the vehicle 104 based at least in part on the location data associated with the user device 138 , as described herein.
- the estimated time until the user 110 starts interaction with the vehicle 104 can be based at least in part on the distance 708 between the user 110 and the vehicle 104 .
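One simple way to turn the distance 708 (and, per the following passages, the user's elevation) into an estimated time until the user starts boarding (ETSB) is to divide by an assumed walking speed, padding for elevation changes such as stairs. The speeds and padding below are illustrative assumptions, not values from the source:

```python
# Hypothetical sketch: ETSB in seconds from the user-to-vehicle distance,
# with optional padding for elevation change (e.g., descending from the
# second story of a building). Walking speed and climb padding are assumed.

def estimate_etsb_seconds(distance_m: float, elevation_m: float = 0.0,
                          walking_speed_mps: float = 1.4,
                          seconds_per_meter_climb: float = 8.0) -> float:
    """Estimated time until the user starts boarding, in seconds."""
    return distance_m / walking_speed_mps + abs(elevation_m) * seconds_per_meter_climb
```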
- the vehicle computing system 102 can determine an elevation/altitude 710 of the user 110 based at least in part on the location data associated with the user device 138 (e.g., obtained via the one or more altimeters onboard the vehicle 104 ), as described herein.
- the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) can be based at least in part on the elevation/altitude 710 of the user 110 .
- the vehicle computing system 102 can determine a heading 712 of the user 110 based at least in part on the location data associated with the user device 138 , as described herein.
- the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) can be based at least in part on the heading 712 of the user 110 .
- the vehicle computing system 102 can determine a location of the user 110 based at least in part on image data, as described herein.
- the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) can be based at least in part on the location of the user 110 determined from the image data.
- the vehicle computing system 102 can obtain historic data 714 to help determine the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)).
- the historic data 714 can be indicative of, for example, historic start boarding times of one or more other users (and/or the user 110 ).
- a machine learned model can be trained based on such historic data to determine the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)).
- the model can be trained based on training data indicative of previous location data, user distances, altitudes, headings, etc.
- the model can be trained to receive input data (e.g., location data, user distances, altitudes, headings, etc.) and provide, as an output, an estimated time until the user 110 starts interaction with the vehicle 104 (e.g., an estimated time until the user starts boarding a vehicle (ETSB)).
- the estimated time until the user 110 starts interaction with the vehicle 104 can be based at least in part on the historic data 714 .
- the vehicle computing system 102 can determine the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) based at least in part on such a machine-learned model (e.g., as an output thereof).
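As an illustration of the training step described above, a tiny linear model can stand in for the machine-learned model, fit on hypothetical historic (distance, observed boarding time) pairs and then queried for a new distance. A production system would use richer features (altitude, heading, etc.) and a proper learning framework; every name and value here is an assumption:

```python
# Hypothetical sketch: ordinary least squares on historic driving-log data
# (distance to vehicle -> seconds until boarding started), then a
# prediction for a new distance.

def fit_linear(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Ordinary least squares for y = a * x + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# hypothetical historic data: distance (m) -> seconds until boarding began
distances = [10.0, 20.0, 40.0, 80.0]
times = [8.0, 15.0, 29.0, 57.0]
a, b = fit_linear(distances, times)
predicted_etsb = a * 30.0 + b   # prediction for a user 30 m away
```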
- the estimated time until the user 110 starts interaction with the vehicle 104 can be used to determine one or more vehicle actions, at ( 716 ).
- the estimated time until the user 110 starts interaction with the vehicle 104 can indicate that the user 110 is close to the vehicle 104 and/or heading toward the vehicle 104 .
- the vehicle computing system 102 can cause one or more doors of the vehicle 104 to unlock based at least in part on the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)). To do so, the vehicle computing system 102 can provide control signal(s) to an associated door controller. Additionally, or alternatively, the vehicle computing system 102 can cause the vehicle 104 to implement one or more vehicle settings (e.g., temperature, music, etc.) associated with the user 110 based at least in part on the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)).
- the vehicle computing system 102 can access a profile associated with the user 110 to identify the user's preferred vehicle settings and can provide one or more control signals to the appropriate systems onboard the vehicle 104 (e.g., temperature control system, sound system, etc.) to implement the vehicle settings.
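- As one hedged sketch of the settings step, the profile lookup and control signals can be modeled as follows (the profile layout and controller interfaces are assumptions, not part of the disclosure):

```python
def apply_user_settings(profile, controllers):
    """Send each preferred vehicle setting in the user's profile to the
    matching onboard controller (e.g., temperature control, sound system).

    The `profile` and `controllers` layouts are illustrative assumptions.
    """
    applied = {}
    for system, setting in profile.get("vehicle_settings", {}).items():
        controller = controllers.get(system)
        if controller is not None:
            controller(setting)  # stands in for a control signal to the system
            applied[system] = setting
    return applied
```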
- the method 700 can include determining an estimated time of interaction duration between the user and the vehicle. This estimated time can be indicative of the time it will take for the user 110 to interact with the vehicle 104 .
- the estimated time of interaction duration can include an estimated boarding duration (EBD) that is indicative of how long the user 110 may take to load him/herself, children, luggage, etc. into the vehicle 104 , to securely fasten seatbelts, and/or to undertake other tasks (e.g., for a transportation service).
- the estimated time of interaction duration can be indicative of how long the user 110 may take to unsecure and remove an item from the vehicle 104 (e.g., for a delivery service).
- the estimated time of interaction duration can be indicative of how long the user 110 may take to securely place an item into the vehicle 104 (e.g., for a courier service).
- the vehicle computing system 102 can obtain data associated with the user 110 .
- the vehicle computing system 102 can determine the estimated time of interaction duration (e.g., an estimated time of boarding duration for the user 110 ) based at least in part on the data associated with the user 110 .
- the data associated with the user 110 can include, for example, data indicative of one or more preferences 720 of the user 110 and/or one or more vehicle service parameters 722 associated with the user 110 .
- the preferences 720 can be indicative of the user's destination (e.g., airport, train station, etc.), service type, and/or other information specified by the user 110 (e.g., when requesting the vehicle service).
- the one or more vehicle service parameters 722 can be indicative of a number of passengers, a child's car seat request, the presence/amount of luggage, etc.
- the vehicle service parameters 722 can also be specified by the user 110 (e.g., when requesting the vehicle service).
- the data associated with the user 110 can include historic data 724 (e.g., indicative of a boarding behavior associated with one or more other users).
- the historic data 724 can be indicative of historic wait time(s) associated with other users in the geographic area 200 and/or a greater region, similarly situated users, etc.
- the historic data 724 can be associated with the specific user 110 .
- the historic data 724 can include, for example, previous correlations between changes in the signal strength of an identifier and a user's time to arriving at a vehicle.
- the historic data 724 can indicate historic RSSI changes as a countdown to user arrival.
- the method 700 can include determining an estimated time of user arrival with the vehicle 104 .
- the estimated time of user arrival can include, for example, an estimated time until the user completes boarding of the vehicle 104 (ETCB) (e.g., for a transportation service), an estimated time until the user 110 finishes retrieving an item from the vehicle 104 (e.g., for a delivery service), an estimated time until the user 110 finishes placing an item in the vehicle 104 (e.g., for a courier service), etc.
- the vehicle computing system 102 can determine an estimated time of user arrival based at least in part on the location data associated with the user device 138 .
- the vehicle computing system 102 can determine the estimated time of user arrival based at least in part on the estimated time until the user 110 starts interaction with the vehicle 104 and the estimated time of interaction duration (e.g., a sum of these estimated times). By way of example, the vehicle computing system 102 can determine an estimated time until the user 110 completes boarding of the vehicle 104 based at least in part on the location data associated with the user device 138 and the data associated with the user 110 . More particularly, the vehicle computing system 102 can determine an estimated time until the user 110 completes boarding of the vehicle 104 based at least in part on the estimated time until the user 110 starts boarding the vehicle 104 and the estimated time of boarding duration for the user 110 .
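- A minimal sketch of the summation and comparison described above, assuming the estimates are expressed in seconds (names are illustrative):

```python
def estimate_time_to_complete_boarding(etsb_s, ebd_s):
    # ETCB = ETSB + EBD: time until boarding starts plus boarding duration.
    return etsb_s + ebd_s

def stop_is_acceptable(etcb_s, time_constraint_s):
    # The stop is acceptable when the estimated completion time does not
    # exceed the acceptable amount of stopping time (the time constraint).
    return etcb_s <= time_constraint_s
```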
- the vehicle computing system 102 can compare the estimated time for interaction between the user 110 and the vehicle 104 (e.g., the estimated time until the user 110 completes boarding of the vehicle 104 ) to a time constraint 750 .
- the time constraint 750 can be expressed as a time threshold (e.g., indicating an acceptable amount of stopping time) and/or cost data (e.g., cost functions expressing a cost in relation to stopping time). This can allow the vehicle computing system 102 to determine whether the amount of stopping time is acceptable.
- the time constraint 750 can be based on historic data (e.g., indicating historic wait times), real-time data (e.g., indicating that the vehicles are already waiting due to another traffic build-up in front of the autonomous vehicle), expectations of individuals in the geographic area, machine-learned model(s), and/or other information. For example, in the event that there is already a traffic jam in front of the vehicle 104 , the time constraint 750 (e.g., indicative of an acceptable wait time) can be higher.
- the time constraint 750 can be determined at least in part from a model, such as a machine-learned model.
- the machine-learned model can be or can otherwise include one or more of various models such as, for example, models using boosted random forest techniques, neural networks (e.g., deep neural networks), or other multi-layer non-linear models.
- Neural networks can include recurrent neural networks (e.g., long short-term memory recurrent neural networks), feed-forward neural networks, and/or other forms of neural networks.
- supervised training techniques can be performed to train the model (e.g., using historical wait time data) to determine a time constraint 750 based at least in part on input data.
- the input data can include, for example, the data described herein with reference to FIGS. 7 and 8A -C, driving characteristics of individuals in an associated geographic area, complaints received from operators of vehicles that were caused to stop by autonomous vehicles, etc.
- the machine-learned model can provide, as an output, data indicative of a recommended time constraint.
- the recommended time constraint can be specific to the geographic area 200 .
- the model can be trained based at least in part on data associated with geographic area 200 such that the recommended time constraint is specific for that particular region.
- one or more time estimates of FIG. 7 can be based at least in part on a model.
- the vehicle computing system 102 can obtain data descriptive of the model (e.g., a machine-learned model).
- the vehicle computing system 102 can provide input data to the model.
- the input data can include the one or more of the types of data described herein with reference to FIGS. 7 and 8A -C.
- the model can determine an estimated time of user arrival (e.g., an estimated time until the user 110 completes boarding of the vehicle 104 (ETCB)), an estimated time until the user 110 starts interaction with the vehicle 104 (e.g., an estimated time until the user 110 starts boarding the vehicle 104 (ETSB)), and/or an estimated time of interaction duration between the user 110 and the vehicle 104 (e.g., an estimated time of boarding duration for a user 110 (EBD)).
- the model can analyze the input data with respect to the time constraint 750 to determine whether the estimated time of user arrival (e.g., the estimated time until the user 110 completes boarding of the vehicle 104 (ETCB)) is high or low, significant or insignificant, acceptable or unacceptable, etc.
- the output of such a model can be the estimated time for interaction between the user 110 and the vehicle 104 (e.g., the estimated time until the user 110 completes boarding of the vehicle 104 (ETCB)) and/or whether it is high or low, significant or insignificant, acceptable or unacceptable, etc.
- the output of the model can be provided as an input to the model for another set of data (e.g., at a subsequent time step). In such fashion, confidence can be built that a determined time estimate is accurate.
- the process can be iterative such that the time estimate can be recalculated over time as it becomes clearer what the time estimate is with respect to the user 110 .
- the model can include one or more autoregressive models.
- the model can include one or more machine-learned recurrent neural networks.
- recurrent neural networks can include long short-term memory recurrent neural networks, gated recurrent unit networks, or other forms of recurrent neural networks.
- the vehicle computing system 102 can determine one or more vehicle actions based at least in part on at least one of the estimated traffic impact 504 or the estimated time of user arrival (e.g., one and/or both of the estimates). In some implementations, the vehicle computing system 102 can determine the vehicle action(s) based at least in part on the estimated traffic impact 504 .
- the vehicle action(s) can include stopping within the vicinity 206 of the location 202 associated with the user 110 (e.g., at least partially in the travel way 600 ).
- the vehicle computing system 102 can determine that the vehicle 104 can stop within the travel way 600 to wait for the user 110 to arrive at the vehicle 104 .
- the vehicle computing system 102 can determine the vehicle action(s) based at least in part on the estimated time of user arrival. For instance, in the event that the estimated time of user arrival 702 is below the time constraint 750 (e.g., the estimated time of user arrival is low), the vehicle computing system 102 can determine that the vehicle 104 can stop at least partially within the travel way 600 . Such a stop can occur, for example, as close as possible (e.g., for the vehicle 104 ) to the location 202 associated with the user 110 .
- the vehicle computing system 102 can base its determination to stop at least partially within the travel way 600 on both the estimated traffic impact 504 and the estimated time of user arrival. For instance, the vehicle computing system 102 can weigh each of these estimates to determine whether it would be appropriate for the vehicle 104 to stop at least partially in the travel way 600 to wait for the user 110 .
- the vehicle computing system 102 can apply a first weighting factor to the estimated traffic impact 504 and a second weighting factor to the estimated time of user arrival.
- the first weighting factor can be different than the second weighting factor.
- the first weighting factor can be inversely related to the second weighting factor.
- An example equation can include: ETI*w1+ETUR*w2, where "ETI" is the estimated traffic impact 504 , "w1" is the first weighting factor (e.g., 0 to 1), "ETUR" is the estimated time to user arrival 702 , and "w2" is the second weighting factor (e.g., 0 to 1).
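- The example equation can be sketched as follows (the weight bounds follow the 0-to-1 ranges given above; the function name is illustrative):

```python
def weighted_stop_score(eti, etur, w1, w2):
    """Combine the estimated traffic impact (ETI) and the estimated time
    to user arrival (ETUR) using the weighting factors w1 and w2."""
    for w in (w1, w2):
        if not 0.0 <= w <= 1.0:
            raise ValueError("weighting factors must be in [0, 1]")
    return eti * w1 + etur * w2
```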
- the estimated traffic impact 504 may be high while the estimated time of user arrival may be short.
- the vehicle computing system 102 may determine that it can stop within the travel way 600 because although a higher number of objects (e.g., other vehicles) may be caused to stop, it would only be for a short period of time because the user 110 is close in distance to and/or quickly heading toward the vehicle 104 .
- the estimated time of user arrival can be given a greater weight than the estimated traffic impact 504 .
- the estimated traffic impact 504 may be low while the estimated time of user arrival may be long.
- the vehicle computing system 102 can determine that it should not stop within the travel way 600 because although only a few objects (e.g., other vehicles) may be caused to stop, it would be for a greater period of time because the user 110 is farther from (and/or moving slowly, moving away from, etc.) the vehicle 104 .
- the estimated time of user arrival can be given a greater weight than the estimated traffic impact 504 .
- the first and second weighting factors can be manually and/or automatically adjusted depending on the circumstances (e.g., a VIP user is being provided the vehicle services) and/or the geographic area 200 .
- the estimated traffic impact 504 can be adjusted based at least in part on the estimated time to user arrival 702 . For instance, in the event that the estimated time to user arrival 702 is long, the estimated traffic impact 504 can be higher.
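- One hedged way to express that adjustment (the threshold and scaling factor are illustrative assumptions, not values from the disclosure):

```python
def adjust_traffic_impact(base_impact, etur_s, long_wait_s=60.0, factor=1.5):
    # Scale the estimated traffic impact upward when the estimated time
    # to user arrival is long; otherwise leave it unchanged.
    return base_impact * factor if etur_s > long_wait_s else base_impact
```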
- the vehicle computing system 102 can also, or alternatively, determine that the vehicle 104 is to enter into a holding pattern.
- the vehicle action(s) can include traveling along a second vehicle route 208 (e.g., an optimal holding pattern route).
- the vehicle computing system 102 can cause the vehicle 104 to enter into a particular operating mode in which the vehicle 104 implements a holding pattern (e.g., a holding pattern operating mode).
- the vehicle 104 may be unable to find a parking location before and/or after travelling past the location 202 associated with the user 110 .
- the vehicle 104 may determine that it should not stop within a travel way 600 to wait for the user's arrival, as described herein.
- the vehicle 104 can be re-routed along a second vehicle route 208 that is at least partially different than the first vehicle route.
- the second vehicle route 208 can be a path along which the vehicle 104 can travel to re-arrive within the vicinity 206 of the location 202 of the user 110 .
- the second vehicle route 208 can be a path along which the vehicle 104 can travel around a block, back to the location associated with the user.
- such a path may be similar to and/or the same as a portion of the first vehicle route 204 (e.g., along the street associated with the user 110 ).
- the second vehicle route 208 can be completely different from the first vehicle route 204 such that no portion of the second vehicle route 208 overlaps with the first vehicle route 204 .
- the determination of the second vehicle route 208 can be based on a variety of information.
- FIG. 9 depicts example information 900 associated with a second vehicle route 208 according to example embodiments of the present disclosure.
- the vehicle computing system 102 can determine the second vehicle route 208 based at least in part on such information.
- the second vehicle route 208 can be determined off-board the vehicle 104 by another computing system (e.g., the operations computing system 106 ) and data indicative of the second vehicle route 208 can be provided to the vehicle computing system 102 .
- the second vehicle route 208 can be determined based at least in part on current and/or historic traffic data 902 A. For example, the second vehicle route 208 can be determined to implement the route that will allow the vehicle 104 to arrive back within the vicinity 206 of the location 202 of the user 110 within the shortest amount of time and/or distance. The second vehicle route 208 can take into account the current traffic (and/or historic traffic patterns) within the geographic area 200 such that the vehicle 104 is minimally impeded by such traffic (e.g., such that the second vehicle route 208 is the fastest and/or shortest vehicle route to the location 202 of the user 110 ).
- the second vehicle route 208 can be determined based at least in part on map data 902 B.
- the map data 902 B can be used to determine the path (e.g., roads, other terrain, etc.) the vehicle 104 is to travel along to arrive back within the vicinity 206 of the user 110 .
- the second vehicle route 208 can be based on data 902 C associated with other vehicle(s) in the geographic area 200 .
- the data 902 C associated with other vehicle(s) can include additional traffic data associated with the geographic area 200 (e.g., indicating a certain road is impeded by heavy traffic).
- the data 902 C associated with the other vehicle(s) can also include the location of such vehicles.
- the vehicle computing system 102 and/or the operations computing system 106 can determine the second vehicle route 208 (e.g., optimal vehicle holding pattern) by processing map data and traffic data to establish an estimated time back to the location 202 associated with the user 110 .
- in the event that another vehicle would arrive at the location 202 faster, the other vehicle can be routed to the location 202 . If the vehicle 104 would arrive at the location 202 the fastest, then the vehicle 104 can be routed in accordance with the second vehicle route 208 .
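- The comparison of estimated times back to the location 202 can be sketched as follows (the vehicle identifiers and the ETA mapping are hypothetical):

```python
def pick_fastest_vehicle(eta_by_vehicle):
    """Given a mapping of vehicle identifier -> estimated time (seconds)
    back to the user's location, return the vehicle that would arrive
    the fastest; that vehicle is then routed to the location."""
    if not eta_by_vehicle:
        raise ValueError("no candidate vehicles")
    return min(eta_by_vehicle, key=eta_by_vehicle.get)
```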
- the second vehicle route 208 can be based on a model, such as a machine-learned model, in a manner similar to that described herein with respect to the estimated traffic impact 504 , the estimated time to user arrival 702 , etc.
- the vehicle computing system 102 can determine that the vehicle 104 can stop within the travel way 600 , but later determine that the vehicle 104 should begin to travel again (e.g., according to a holding pattern route). For example, the vehicle computing system 102 can determine that it would be appropriate for the vehicle 104 to stop at least partially within the travel way 600 to wait for the user 110 based at least in part on the estimated traffic impact 504 and/or the estimated time of user arrival, as described herein.
- the vehicle computing system 102 can be configured to update (e.g., continuously, periodically, as scheduled, in real-time, in near real-time, etc.) the estimated traffic impact 504 and/or the estimated time of user arrival.
- the traffic impact may increase (e.g., due to an increase in the number of other vehicle(s) stopped behind the vehicle 104 ) and/or the user 110 may take longer than estimated to arrive at the vehicle 104 .
- the vehicle computing system 102 can determine at least one of an updated estimated traffic impact (e.g., based on the number of vehicles that have already stopped and/or additional vehicles that may be caused to stop) or an updated estimated time of user arrival (e.g., based on a change in the user device location data, if any).
- the vehicle computing system 102 can determine that the vehicle 104 can no longer remain stopped to wait for the user 110 based at least in part on at least one of the updated estimated traffic impact or the updated estimated time of user arrival. Accordingly, the vehicle computing system 102 can cause the vehicle 104 to travel along the second vehicle route 208 based at least in part on at least one of the updated estimated traffic impact or the updated estimated time of user arrival.
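- The re-evaluation step can be sketched as follows (the limits are illustrative stand-ins for the time constraint and an acceptable traffic impact):

```python
def reevaluate_stop(updated_traffic_impact, updated_eta_s,
                    impact_limit, time_constraint_s):
    """Re-check a vehicle stopped in the travel way against updated estimates.

    Returns "remain_stopped" while both updated estimates stay within
    their limits, otherwise "travel_second_route" (holding pattern).
    """
    if updated_traffic_impact > impact_limit or updated_eta_s > time_constraint_s:
        return "travel_second_route"
    return "remain_stopped"
```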
- the vehicle computing system 102 can cause the vehicle 104 to perform one or more vehicle action(s).
- the vehicle action(s) can include at least one of stopping the vehicle 104 (e.g., at least partially within the travel way 600 ) within the vicinity 206 of the location 202 associated with the user 110 or travelling along a second vehicle route 208 .
- the vehicle computing system 102 can cause the vehicle 104 to stop.
- the vehicle computing system 102 can provide one or more control signals to a control system 116 of the vehicle 104 (e.g., braking control system) to cause the vehicle 104 to decelerate to a stopped position that is at least partially in a travel way 600 within the vicinity 206 of the location 202 associated with the user 110 .
- the vehicle computing system 102 can obtain data associated with the second vehicle route 208 and implement the second vehicle route 208 accordingly.
- the vehicle computing system 102 can request and obtain data indicative of the second vehicle route 208 from the operations computing system 106 .
- the vehicle computing system 102 can determine the second vehicle route 208 onboard the vehicle 104 .
- the vehicle computing system 102 can provide one or more control signals to cause the vehicle 104 to implement a motion plan that causes the vehicle 104 to travel in accordance with the second vehicle route 208 (e.g., to implement one or more vehicle trajectories in accordance with the second vehicle route 208 ).
- the vehicle computing system 102 can provide the user 110 with one or more communications indicating the actions performed by (or to be performed by) the vehicle 104 .
- the vehicle computing system 102 can provide, via the communication system 136 , a communication to the user device 138 associated with the user 110 .
- the communication can include, for example, a textual message, auditory message, etc. that indicates the vehicle actions (e.g., “I could not locate you at the pin drop, traffic forced me to go around the block. Please proceed to the pin drop”).
- the vehicle computing system 102 can provide a communication (e.g., data) to a user device 138 associated with the user 110 .
- the communication can indicate that the vehicle 104 is travelling to return to the location 202 associated with the user 110 .
- the user device 138 associated with the user 110 can display a user interface indicative of the communication.
- FIG. 10 depicts an example display device 1000 with an example communication 1002 according to example embodiments of the present disclosure.
- the communication 1002 can be presented via a user interface 1004 on the display device 1000 .
- the communication 1002 can indicate that the vehicle 104 has arrived and is waiting in the travel way (e.g., in a current traffic lane).
- the communication 1002 and/or another portion of the user interface 1004 can be indicative of a location of the vehicle 104 .
- the display device 1000 can display a map user interface 1006 that includes a user route 1008 .
- the user route 1008 can be a route along which a user 110 can travel to arrive at the vehicle 104 .
- the vehicle 104 may be relieved of its responsibility to provide a vehicle service to the user 110 .
- in the event that a vehicle computing system 102 (and/or operations computing system 106 ) determines that a vehicle 104 is to travel along the second vehicle route 208 , such computing system(s) can determine whether it would be advantageous (e.g., more time efficient, more fuel efficient, etc.) for another vehicle 210 (e.g., another autonomous vehicle) within the geographic area 200 to be routed to the user 110 .
- the operations computing system 106 can provide data to the vehicle 104 indicating that the vehicle 104 is no longer responsible for the request 140 .
- the other vehicle 210 can be routed to the user 110 in the manner described herein. Additionally, or alternatively, the operations computing system 106 (and/or the vehicle computing system 102 of the vehicle 104 ) can re-route the vehicle 104 to provide a vehicle service to another user.
- the vehicle computing system 102 can cancel the request 140 associated with the user 110 .
- the vehicle 104 may be caused to re-route (e.g., circle the block) a certain number of times and/or the user 110 may not arrive at the vehicle 104 within a certain timeframe.
- the vehicle computing system 102 can determine whether to cancel the request 140 based at least in part on a vehicle service cancellation threshold.
- FIG. 11 depicts example information 1100 associated with a vehicle service cancellation threshold 1102 according to example embodiments of the present disclosure.
- the vehicle service cancellation threshold 1102 can be indicative of a threshold time and/or distance that the vehicle 104 is in the holding pattern.
- the vehicle service cancellation threshold 1102 can be indicative of a time between when the vehicle 104 initially arrived within a vicinity 206 of the user 110 (and/or passed the location 202 ) to the current time.
- the vehicle service cancellation threshold 1102 can be indicative of a distance travelled by the vehicle 104 while in a holding pattern (e.g., number of times the vehicle is re-routed to arrive at the user 110 ).
- the vehicle service cancellation threshold 1102 can be determined and updated continuously, periodically, as scheduled, on request, in real-time, in near real-time, etc. (e.g., per trip, while on a trip, etc.).
- the vehicle service cancellation threshold 1102 can be determined by the vehicle computing system 102 and/or off-board the vehicle 104 and provided to the vehicle computing system 102 .
- the information 1100 can include vehicle service demand data 1104 A, historic vehicle service data 1104 B, geographic area preferences 1104 C, data 1104 D associated with other vehicle(s) within the geographic area, user selected holding patterns 1104 E, and/or other types of information.
- the vehicle service demand data 1104 A can include a current level of demand (e.g., number of current service requests) for vehicle services (e.g., within the geographic area 200 ). Additionally, or alternatively, the vehicle service demand data 1104 A can include a historic level of demand for vehicle services at a certain time, day, etc. (e.g., within the geographic area, similarly situated area, etc.).
- in the event that the demand is lower, the vehicle service cancellation threshold 1102 can be higher (e.g., because the vehicle 104 may not be needed for other vehicle service requests). In the event that the demand is higher, the vehicle service cancellation threshold 1102 can be lower (e.g., because the vehicle 104 may be needed for other vehicle service requests).
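- A hedged sketch of that demand-based adjustment (the scaling factors are illustrative, not part of the disclosure):

```python
def demand_adjusted_threshold(base_threshold_s, current_demand, typical_demand):
    # Lower demand -> higher cancellation threshold (the vehicle is less
    # likely to be needed elsewhere); higher demand -> lower threshold.
    if current_demand < typical_demand:
        return base_threshold_s * 1.5
    if current_demand > typical_demand:
        return base_threshold_s * 0.5
    return base_threshold_s
```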
- the historic vehicle service data 1104 B can be associated with the specific user 110 and/or one or more other user(s).
- the historic vehicle service data 1104 B can indicate that the user 110 typically takes a longer amount of time to arrive at the vehicle 104 (e.g., due to a disability).
- the vehicle service cancellation threshold 1102 can be higher in order to cause the vehicle 104 to remain in the holding pattern longer, thereby giving the user 110 a greater opportunity to arrive at the vehicle 104 .
- the historic vehicle service data 1104 B can indicate that it generally takes user(s) within the geographic area 200 longer to arrive at the vehicle 104 . As such, the vehicle service cancellation threshold 1102 may be higher. In the event that the historic vehicle service data 1104 B indicates that the user 110 (and/or other user(s)) typically arrives at the vehicle 104 in a relatively short timeframe, the vehicle service cancellation threshold 1102 may be lower.
- the geographic area preferences 1104 C can be descriptive of the preferences associated with a geographic area 200 (e.g., as indicated by the managers of the geographic area 200 ). For example, the geographic area 200 may prefer that a vehicle 104 only remain in a holding pattern (e.g., circle the block) for a certain time period and/or distance so as not to affect local traffic.
- the vehicle service cancellation threshold 1102 can be based at least in part on data 1104 D associated with one or more other vehicles (e.g., other autonomous vehicles in an associated fleet).
- the data 1104 D can be indicative of the location(s) of other vehicle(s) (e.g., within the geographic area 200 ) and/or whether the other vehicle(s) are available to provide a vehicle service (e.g., whether or not the other vehicle is assigned to a service request, currently providing a vehicle service, etc.).
- the vehicle service cancellation threshold 1102 may be lower (e.g., because the user 110 can be serviced by the other vehicle in the event a new request is made after cancellation).
- the vehicle service cancellation threshold 1102 may be higher (e.g., because another vehicle is not readily available in the event the user 110 makes a new request after cancellation).
- the vehicle service cancellation threshold 1102 can be based at least in part on user selected holding patterns 1104 E.
- the user selected holding patterns 1104 E can include data indicative of a vehicle service cancellation threshold 1102 selected by a user 110 .
- the user 110 can purchase (e.g., via a user interface associated with a software application) a higher vehicle service cancellation threshold 1102 , such that the vehicle 104 will remain in the holding pattern (e.g., circle the block) for a longer time/distance before the vehicle service request is cancelled.
- a user 110 can have a higher vehicle service cancellation threshold 1102 due to a higher user rating, specialized treatment (e.g., frequent user), and/or based on other conditions.
- the vehicle service cancellation threshold 1102 can be based at least in part on a model, such as a machine-learned model.
- the model can be trained based on previously obtained information 1100 and labeled data indicative of the vehicle service cancellation thresholds associated therewith.
- the vehicle computing system 102 (or other computing system) can provide input data (e.g., the information 1100 ) into such a model and receive, as an output, a recommended vehicle service cancellation threshold.
- the vehicle computing system 102 can cancel the request 140 associated with the user 110 in the event that the user 110 has not arrived at the vehicle 104 and the vehicle 104 has exceeded the vehicle service cancellation threshold 1102 .
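- The cancellation check can be sketched as follows, with the threshold 1102 expressed as a time and/or distance in the holding pattern (names and threshold values are illustrative):

```python
def should_cancel_request(user_arrived, holding_time_s, holding_distance_m,
                          time_threshold_s, distance_threshold_m):
    """Cancel the service request only when the user has not arrived and
    the vehicle has exceeded the cancellation threshold, expressed here
    as time and/or distance spent in the holding pattern."""
    if user_arrived:
        return False
    return (holding_time_s > time_threshold_s
            or holding_distance_m > distance_threshold_m)
```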
- the vehicle computing system 102 can provide data indicating that the request 140 for the vehicle service provided by the vehicle 104 is cancelled to the operations computing system 106 (and/or one or more other computing devices that are remote from the vehicle computing system 102 ).
- such data can request the cancellation of the user's service request 140 .
- the operations computing system 106 can cancel the service request 140 (and inform the user 110 accordingly) and/or re-route the vehicle 104 to provide a vehicle service to another user.
- the vehicle computing system 102 can communicate directly with a user device 138 associated with the user 110 to cancel the service request 140 and/or inform the user 110 of the vehicle service cancellation.
- the vehicle computing system 102 can report such a cancellation to the operations computing system 106 .
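The cancellation condition itself reduces to a simple predicate. This sketch assumes a time-based threshold; a distance-based threshold (e.g., total holding-pattern distance) works identically.

```python
def should_cancel_request(user_arrived: bool,
                          elapsed_in_holding_s: float,
                          cancellation_threshold_s: float) -> bool:
    """Cancel when the user has not arrived at the vehicle and the vehicle
    has exceeded the vehicle service cancellation threshold."""
    return (not user_arrived) and elapsed_in_holding_s > cancellation_threshold_s
```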
- FIG. 12 depicts a flow diagram of an example method 1200 of controlling autonomous vehicles according to example embodiments of the present disclosure.
- One or more portion(s) of the method 1200 can be implemented by one or more computing devices such as, for example, the one or more computing device(s) of the vehicle computing system 102 and/or other systems.
- Each respective portion of the method 1200 (e.g., 1202 - 1222 ) can be performed by any (or any combination) of the computing device(s) described herein.
- one or more portion(s) of the method 1200 can be implemented as an algorithm on the hardware components of the device(s) described herein (e.g., as in FIGS. 1 and 14 ), for example, to control an autonomous vehicle.
- the method 1200 can include obtaining data indicative of a location associated with a user.
- the vehicle computing system 102 can obtain data 142 indicative of a location 202 associated with a user 110 to which the vehicle 104 is to travel.
- the vehicle 104 is to travel along a first vehicle route 204 that leads to the location 202 associated with the user 110 .
- the user 110 can be associated with a request 140 for a vehicle service.
- the vehicle computing system 102 and/or the operations computing system 106 can determine the first vehicle route 204 so that the vehicle 104 can travel to the user 110 to provide the user 110 with the requested vehicle service (e.g., pick up the user 110 for a transportation service, deliver an item for a delivery service, receive an item for courier service, and/or provide another service).
- the method 1200 can include travelling along a first vehicle route.
- a human operator may not be located within the vehicle 104 .
- the vehicle computing system 102 can provide one or more control signals to the motion planning system 128 and/or the vehicle's control systems 116 to cause the vehicle 104 to plan its motion and/or implement a motion plan to travel in accordance with the first vehicle route 204 (e.g., autonomously, without input from a human operator to the vehicle 104 ).
- the first vehicle route 204 can bring the vehicle 104 within the vicinity 206 of the location 202 associated with the user 110 .
- the vicinity of the location 202 associated with the user 110 can be defined at least in part by a distance from the location 202 associated with the user 110 .
- the distance from the location 202 associated with the user 110 can be based at least in part on an acceptable walking distance 302 from the location 202 associated with the user 110 , as described herein.
- the method 1200 can include determining whether a parking location out of a travel way is available for the vehicle. For instance, the vehicle computing system 102 can determine whether a parking location within the vicinity 206 of the location 202 associated with the user 110 is available for the vehicle 104 (e.g., based on sensor data 118 , map data 120 , etc.). The vehicle computing system 102 may search for a parking location while the vehicle 104 is in an approach operating mode 108 D, as described herein. The vehicle computing system 102 may search for a parking location before and/or after the vehicle 104 passes the location 202 associated with the user 110 .
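The parking-location search can be sketched as a filter over candidate spots derived from sensor data 118 and map data 120. This is a simplified illustration: candidate spots, their representation as ((x, y), open_length_m) tuples, and the straight-line vicinity test are all assumptions for the sake of the example.

```python
import math

def find_parking_location(candidate_spots, user_xy,
                          vicinity_radius_m, vehicle_length_m):
    """Return the nearest out-of-travel-way spot that fits the vehicle and
    lies within the vicinity of the user's location, or None if unavailable.

    candidate_spots: iterable of ((x, y), open_length_m) tuples, meters.
    """
    # Keep only spots with enough open curb length for the vehicle.
    fits = [(math.hypot(x - user_xy[0], y - user_xy[1]), (x, y))
            for (x, y), open_len in candidate_spots
            if open_len >= vehicle_length_m]
    # The vicinity is defined by a distance from the user's location.
    in_range = [(d, p) for d, p in fits if d <= vicinity_radius_m]
    return min(in_range)[1] if in_range else None
```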
- the method 1200 can include sending a communication to a user 110 to indicate that the vehicle 104 has arrived and the location of the vehicle 104 . Once the user has arrived at the vehicle 104 , the vehicle 104 can provide the vehicle service to the user 110 . In some implementations, the vehicle computing system 102 can determine that a parking location that is out of the travel way 600 is unavailable for the vehicle 104 .
- the method 1200 can include obtaining traffic data associated with the geographic area that includes the location associated with the user.
- the vehicle computing system 102 can obtain traffic data 500 associated with a geographic area 200 that includes the location 202 associated with the user 110 .
- the traffic data 500 can be associated with the vicinity 206 of the location 202 (e.g., a block, neighborhood, etc. where the user 110 is located) and/or other portions of the geographic area 200 .
- the vehicle computing system 102 can obtain, via one or more sensors 112 of the vehicle 104 , sensor data 118 associated with the surrounding environment of the vehicle 104 that is within the vicinity 206 of the location 202 associated with the user 110 .
- the vehicle computing system 102 can determine a level of traffic based at least in part on the sensor data 118 , as described herein.
- the method 1200 can include determining an estimated traffic impact.
- the vehicle computing system 102 can determine an estimated traffic impact 504 of the vehicle 104 on the geographic area 200 based at least in part on the traffic data 500 .
- the estimated traffic impact 504 can be indicative of an estimated impact of the vehicle 104 on one or more objects within a surrounding environment of the vehicle 104 in the event that the vehicle 104 were to stop at least partially in the travel way 600 (e.g., a current lane 602 ) within the vicinity 206 of the location 202 associated with the user 110 .
- the vehicle computing system 102 can compare the level of traffic (e.g., determined based at least in part on the sensor data, other traffic data) to a traffic constraint 506 .
- the traffic constraint 506 can include a traffic threshold indicative of a threshold level of traffic.
- the traffic constraint can be determined at least in part from a machine-learned model, as described herein.
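One simple way to score the estimated traffic impact and compare it to a traffic constraint is sketched below. The scoring formula (trailing vehicles weighted by expected stop duration and traffic speed) is an illustrative assumption, not the disclosure's method.

```python
def estimated_traffic_impact(trailing_vehicles: int,
                             avg_speed_mps: float,
                             expected_stop_s: float) -> float:
    """Rough impact score for stopping at least partially in the travel way:
    vehicles delayed, weighted by how long the stop blocks the lane and how
    fast surrounding traffic is moving."""
    return trailing_vehicles * expected_stop_s * max(avg_speed_mps, 0.0)

def exceeds_traffic_constraint(impact: float, threshold: float) -> bool:
    """Compare the estimated impact to the traffic constraint's threshold
    (which could itself come from a machine-learned model)."""
    return impact > threshold
```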
- the method 1200 can include obtaining location data associated with a user.
- the vehicle computing system 102 can obtain location data associated with a user device 138 (e.g., mobile device) associated with the user 110 , as described herein.
- the location data associated with the user device 138 can be indicative of one or more locations of the user device 138 associated with the user 110 at one or more times.
- the method 1200 can include determining an estimated time of user arrival.
- the vehicle computing system 102 can determine an estimated time of user arrival based at least in part on the location data associated with the user device 138 .
- the estimated time of user arrival can be indicative of an estimated time at which the user 110 will arrive at the vehicle 104 .
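A minimal estimate from the device's location fixes might look like the following. Straight-line distance and a nominal walking speed are simplifying assumptions; a real system would use routing and the history of fixes rather than only the latest one.

```python
import math

def estimated_time_of_user_arrival(user_positions, vehicle_pos,
                                   walking_speed_mps: float = 1.4) -> float:
    """Estimate seconds until the user reaches the vehicle.

    user_positions: list of (x, y) device fixes in meters, most recent last.
    vehicle_pos:    (x, y) of the vehicle, meters.
    """
    ux, uy = user_positions[-1]          # latest user-device fix
    vx, vy = vehicle_pos
    distance = math.hypot(ux - vx, uy - vy)
    return distance / walking_speed_mps
```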
- the method 1200 can include determining one or more vehicle actions based at least in part on the estimated traffic impact and/or the estimated time of user arrival.
- the vehicle computing system 102 can determine one or more vehicle actions based at least in part on the estimated traffic impact 504 .
- the vehicle computing system 102 can determine the one or more vehicle actions also, or alternatively, based at least in part on the estimated time of user arrival.
- the one or more vehicle actions can include at least one of stopping the vehicle 104 at least partially in a travel way 600 within a vicinity 206 of the location 202 associated with the user 110 or travelling along a second vehicle route 208 (e.g., entering into a vehicle holding pattern).
- the second vehicle route 208 can be at least partially different from the first vehicle route 204 .
- the second vehicle route 208 can include a route that leads to the location 202 associated with the user 110 (or at least to a vicinity 206 of the location 202 ).
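The choice between the two vehicle actions can be sketched as a single decision function. The thresholds here are hypothetical tuning parameters; the rule that a low estimated time of user arrival can justify stopping despite traffic build-up follows the behavior described elsewhere in this disclosure.

```python
def choose_vehicle_action(traffic_impact: float,
                          impact_threshold: float,
                          eta_s: float,
                          max_in_lane_wait_s: float) -> str:
    """Pick between stopping at least partially in the travel way and
    travelling along a second vehicle route (a holding pattern)."""
    # A low estimated time of user arrival justifies a brief in-lane stop
    # even with some traffic build-up; otherwise stop only when the
    # estimated traffic impact is acceptable.
    if eta_s <= max_in_lane_wait_s or traffic_impact <= impact_threshold:
        return "STOP_IN_TRAVEL_WAY"
    return "TRAVEL_SECOND_ROUTE"
```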
- the method 1200 can include causing the vehicle to perform the one or more vehicle actions.
- the vehicle computing system 102 can cause the vehicle 104 to perform the one or more vehicle actions.
- the vehicle computing system 102 can provide one or more control signals to one or more systems onboard the vehicle 104 to cause the vehicle 104 to perform the vehicle action(s) (e.g., to stop in the travel way 600 , enter the vehicle holding pattern).
- the method 1200 can include providing a communication to the user.
- the vehicle computing system 102 can provide a communication to a user device 138 associated with the user 110 that is indicative of a vehicle action.
- the vehicle computing system 102 can provide, to the user device 138 associated with the user 110 , a communication indicating that the vehicle 104 is stopped (and/or will stop).
- the user device 138 can display a map user interface 1006 that indicates a vehicle location of the vehicle 104 and a user route 1008 to the vehicle location of the vehicle 104 .
- the vehicle computing system can provide a communication to the user device 138 associated with the user 110 indicating that the vehicle 104 is travelling to return to the location 202 associated with the user 110 .
- FIGS. 13A-B depict a flow diagram of an example method 1300 of controlling autonomous vehicles according to example embodiments of the present disclosure.
- One or more portion(s) of the method 1300 can be implemented by one or more computing devices such as, for example, the one or more computing device(s) of the vehicle computing system 102 and/or other systems.
- Each respective portion of the method 1300 (e.g., 1302 - 1346 ) can be performed by any (or any combination) of the computing device(s) described herein.
- one or more portion(s) of the method 1300 can be implemented as an algorithm on the hardware components of the device(s) described herein (e.g., as in FIGS. 1 and 14 ), for example, to control an autonomous vehicle.
- FIGS. 13A-B depict elements performed in a particular order for purposes of illustration and discussion.
- the vehicle 104 can enter into an approach operating mode 108 D.
- a vehicle computing system 102 can obtain data 142 indicative of a location 202 associated with a user 110 .
- the vehicle computing system 102 can cause the vehicle 104 to travel in accordance with a first vehicle route 204 to arrive within a vicinity 206 of the location 202 associated with the user 110 .
- the vehicle 104 can enter into the approach mode 108 D, for example, when it is within the vicinity 206 of the location 202 .
- the vehicle 104 can approach the location 202 associated with the user 110 in the approach operating mode 108 D.
- the vehicle computing system 102 can send accurate (e.g., localized) approach data (e.g., through a network) to a software application running on a user device 138 associated with the user 110 .
- the software application can cause the user device 138 to display a user interface via a display device.
- the user interface can display a map interface with the user's position (e.g., based on GPS) and a precise location approach (e.g., of the vehicle 104 ), as well as a target location of the vehicle 104 to meet the user 110 .
- the user interface can also alert the user 110 that the vehicle 104 is arriving (e.g., “your vehicle is arriving, please prepare to board”).
- the vehicle 104 can include an outwardly visible lighting element, such as a number or array of LED lights capable of producing rapid flash patterns.
- the lighting element can be located within the interior and viewable through the front of the vehicle (e.g., windshield), and/or can be located on the exterior of the vehicle 104 .
- the operations computing system 106 can transmit a flash code to the vehicle 104 and the requesting user device 138 . As the vehicle 104 approaches the pick-up location, the vehicle 104 can output the flash code using the lighting element.
- the requesting user 110 can be prompted to hold up the user device 138 so that a camera or the camera lens of the user device 138 is pointed towards the vehicle 104 and can detect the flash code (e.g., the camera can be pointed towards the vehicle 104 and the display screen of the mobile device can display a viewfinder or preview of the imagery detected or captured by the camera).
- the user device 138 can determine whether the flash code matches the flash code provided by the operations computing system 106 (e.g., utilizing a perception algorithm). If so, the user device 138 can display an indicator, such as a circle or a highlight for the vehicle 104 , so that the requesting user 110 can readily identify the vehicle 104 .
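The flash-code match on the user device can be sketched as a pattern comparison. The (on, duration) encoding of a flash pattern and the timing tolerance are assumptions for illustration; an actual implementation would extract the pattern from camera frames with a perception algorithm.

```python
def matches_flash_code(detected, expected, tolerance_s: float = 0.05) -> bool:
    """Compare a flash pattern detected by the device camera against the
    code issued by the operations computing system.

    Each pattern is a sequence of (on, duration_s) pairs; durations are
    matched within a tolerance to absorb camera frame-rate jitter.
    """
    if len(detected) != len(expected):
        return False
    return all(d_on == e_on and abs(d_dur - e_dur) <= tolerance_s
               for (d_on, d_dur), (e_on, e_dur) in zip(detected, expected))
```

On a match, the device would then overlay the indicator (e.g., a circle or highlight) on the vehicle in the camera preview.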
- the vehicle 104 can search for an available parking location that is out of a travel way 600 (e.g., out of a current travel lane 602 ), at ( 1304 ).
- the vehicle computing system 102 can search for an out-of-lane parking location when the vehicle 104 is within a certain distance (e.g., an acceptable walking distance 302 ) from the location 202 associated with the user 110 .
- the vehicle computing system 102 can utilize the vehicle's sensor(s), as described herein.
- the vehicle computing system 102 can cause the vehicle 104 to park (e.g., autonomously, without user input), at ( 1306 ), and send a communication to the user, as described herein.
- the vehicle doors can be unlocked and/or user specific vehicle settings (e.g., music, temperature, seat position, etc.) can be implemented, at ( 1307 ) (e.g., because boarding is imminent).
- the vehicle 104 can start to provide the user 110 with a vehicle service (e.g., transport the user 110 to a destination location), at ( 1310 ). However, if the user 110 does not arrive (e.g., start and/or complete boarding, item retrieval, item placement, etc.) at the vehicle 104 within the time frame, the vehicle computing system 102 can again contact the user 110 , at ( 1312 ).
- the contact can be facilitated by the vehicle computing system 102 providing a communication to the user 110 via a software application on the user device 138 (e.g., using a suitable communication protocol). If unsuccessful (e.g., the user 110 is unresponsive, does not travel to the vehicle 104 ), the vehicle computing system 102 can cancel the vehicle service, at ( 1314 ), as described herein. However, if the contact is successful, the vehicle 104 can wait another timeframe (e.g., “Y” time) for the user 110 to arrive (e.g., start and/or complete boarding, item retrieval, item placement, etc.), at ( 1316 ).
- the vehicle computing system 102 can cause the vehicle 104 to provide the vehicle service to the user 110 , in the event that the user 110 arrives at the vehicle 104 (e.g., boards the vehicle 104 ). In the event the user 110 does not arrive within “Y” time (e.g., start and/or complete boarding, item retrieval, item placement, etc.), the vehicle 104 can enter into the holding pattern and travel in accordance with the second vehicle route 208 , at ( 1318 ).
- the vehicle computing system 102 can determine at least one of an estimated traffic impact 504 of the vehicle 104 on the geographic area 200 based at least in part on traffic data 500 , at ( 1320 ), or an estimated time of user arrival based at least in part on the location data associated with the user device 138 , as described herein, at ( 1322 ).
- the vehicle computing system 102 can determine the estimated traffic impact 504 to determine if and how long to wait in-lane for the user 110 to arrive at the vehicle 104 .
- the vehicle computing system 102 can determine whether the vehicle 104 should stop at the location 202 based at least in part on at least one of the estimated traffic impact 504 or the estimated time to user arrival. For example, the vehicle computing system 102 can start sensing the traffic presence and object speed around the vehicle 104 (e.g., in-lane, behind and ahead of the vehicle 104 ). Moreover, the vehicle computing system 102 can use RF sensors and/or Bluetooth beacons to determine a user presence, general distance between the user 110 and the vehicle 104 , and/or changes in the distance indicating number of seconds from user arrival.
- the vehicle computing system 102 can cause the vehicle 104 to stop in the travel way 600 , at ( 1324 ). If the user 110 arrives (e.g., start and/or complete boarding, item retrieval, item placement, etc.) within a certain timeframe (e.g., within “Y” time), the vehicle 104 can provide the vehicle service to the user 110 . If the user 110 does not arrive (e.g., start and/or complete boarding, item retrieval, item placement, etc.) within the timeframe, the vehicle 104 can enter into the holding pattern.
- the vehicle computing system 102 can determine the estimated time to user arrival. In the event that the estimated time to user arrival is low, the vehicle computing system 102 can cause the vehicle 104 to stop within the travel way 600 (e.g., in a travel lane), at ( 1326 ), despite potential traffic build-up. If the user 110 arrives (e.g., starts and/or completes boarding, item retrieval, item placement, etc.) within a certain timeframe (e.g., within “Y” time), the vehicle 104 can provide the vehicle service to the user 110 . If the user 110 does not arrive (e.g., start and/or complete boarding, item retrieval, item placement, etc.) within the timeframe, the vehicle 104 can enter into the holding pattern.
- the vehicle computing system 102 can cause the vehicle 104 to travel past the location 202 associated with the user 110 , at ( 1328 ). In some implementations, this can be a portion of the second vehicle route 208 .
- the vehicle computing system 102 can search for a parking location out of the travel way 600 after the vehicle 104 passes the location 202 associated with the user 110 , at ( 1330 ). This can occur until the vehicle 104 travels a certain distance past the location 202 (e.g., until the vehicle 104 reaches the acceptable walking distance 302 ). In some implementations, even after the vehicle travels past the location associated with the user 110 , if the estimated traffic impact 504 and/or the estimated time of user arrival is low enough the vehicle computing system 102 can cause the vehicle 104 to stop.
- the vehicle computing system 102 can contact the user 110 via the user device 138 (e.g., indicating the location of the vehicle 104 and a user route thereto). The contact can be facilitated by the vehicle computing system 102 providing a communication to the user 110 via a software application on the user device 138 (e.g., “This is as close as I can get. Please come to me”).
- the vehicle doors can be unlocked and/or user specific vehicle settings (e.g., music, temperature, seat position, etc.) can be implemented, at ( 1333 ) (e.g., because boarding is imminent).
- the vehicle 104 can provide a vehicle service to the user 110 , at ( 1336 ). If the user 110 does not arrive (e.g., start and/or complete boarding, item retrieval, item placement, etc.) within the timeframe, the vehicle computing system 102 can contact the user 110 , at ( 1338 ). If the contact is successful, the vehicle 104 can again wait for the user 110 to arrive (e.g., start and/or complete boarding, item retrieval, item placement, etc.) to the vehicle 104 . If the contact is not successful, the vehicle computing system 102 can cancel the vehicle service, at ( 1340 ).
- the vehicle computing system 102 can cause the vehicle 104 to implement a vehicle holding pattern (e.g., enter a holding pattern operating mode), at ( 1342 ). As such, the vehicle 104 can ignore any potential parking locations and provide a communication to the user device 138 associated with the user 110 .
- the user device 138 can display a map interface depicting a location of the vehicle 104 .
- the vehicle computing system 102 can request a second vehicle route 208 from the operations computing system 106 .
- the operations computing system 106 can provide data indicative of the second vehicle route 208 to the vehicle computing system 102 .
- the vehicle computing system 102 can obtain the data indicative of the second vehicle route 208 and send one or more control signals to cause the vehicle 104 to travel in accordance with the second vehicle route 208 .
- the vehicle computing system 102 can send a communication to the user 110 indicating that the vehicle 104 is travelling to return to the location 202 associated with the user 110 , as described herein.
- the vehicle 104 can enter into the approach operating mode 108 D again, at ( 1346 ). As such, the process can continue as shown in FIG. 13A .
- the vehicle 104 can continue in the holding pattern until the vehicle service cancellation threshold 1102 is reached.
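The holding-pattern loop can be sketched as below. Whole laps of fixed duration (e.g., circling the block) and a time-based threshold are simplifying assumptions; the same structure applies to a distance-based threshold.

```python
def holding_pattern_outcome(lap_duration_s: float,
                            cancellation_threshold_s: float,
                            user_arrival_s=None):
    """Simulate how the holding pattern ends: the vehicle keeps circling
    in whole laps until the user arrives or the vehicle service
    cancellation threshold is reached.

    Returns ("USER_ARRIVED" | "CANCELLED", elapsed_seconds).
    """
    elapsed = 0.0
    while elapsed < cancellation_threshold_s:
        # Would the user arrive during the next lap back to the location?
        if user_arrival_s is not None and user_arrival_s <= elapsed + lap_duration_s:
            return ("USER_ARRIVED", elapsed + lap_duration_s)
        elapsed += lap_duration_s
    return ("CANCELLED", elapsed)
```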
- the vehicle computing system 102 can provide a communication directly to the user device 138 (e.g., via the vehicle's communication system 136 ) to inform the user 110 that the vehicle service has been cancelled.
- the operations computing system 106 can re-route the other vehicle to the location 202 associated with the user 110 .
- FIG. 14 depicts example system components of an example system 1400 according to example embodiments of the present disclosure.
- the example system 1400 can include the vehicle computing system 102 , the operations computing system 106 , and a machine learning computing system 1430 that are communicatively coupled over one or more network(s) 1480 .
- the vehicle computing system 102 can include one or more computing device(s) 1401 .
- the computing device(s) 1401 of the vehicle computing system 102 can include processor(s) 1402 and a memory 1404 (e.g., onboard the vehicle 104 ).
- the one or more processors 1402 can be any suitable processing device (e.g., a processor core, a microprocessor, an ASIC, a FPGA, a controller, a microcontroller, etc.) and can be one processor or a plurality of processors that are operatively connected.
- the memory 1404 can include one or more non-transitory computer-readable storage media, such as RAM, ROM, EEPROM, EPROM, one or more memory devices, flash memory devices, etc., and combinations thereof.
- the memory 1404 can store information that can be accessed by the one or more processors 1402 .
- the instructions 1406 can be software written in any suitable programming language or can be implemented in hardware. Additionally, or alternatively, the instructions 1406 can be executed in logically and/or virtually separate threads on processor(s) 1402 .
- the memory 1404 can store instructions 1406 that when executed by the one or more processors 1402 cause the one or more processors 1402 (the vehicle computing system 102 ) to perform operations such as any of the operations and functions of the vehicle computing system 102 , the vehicle 104 , or for which the vehicle computing system 102 and/or the vehicle 104 are configured, as described herein, including the operations for determining autonomous boarding times and/or other time estimates (e.g., one or more portions of method 700 ), the operations for controlling autonomous vehicles (e.g., one or more portions of methods 1200 and/or 1300 ), and/or any other functions for the vehicle 104 , as described herein.
- the memory 1404 can store data 1408 that can be obtained, received, accessed, written, manipulated, created, and/or stored.
- the data 1408 can include, for instance, traffic data, location data, historic data, map data, sensor data, state data, prediction data, motion planning data, data associated with operating modes, data associated with estimated times, and/or other data information described herein.
- the computing device(s) 1401 can obtain data from one or more memory device(s) that are remote from the vehicle 104 .
- the computing device(s) 1401 can also include a communication interface 1409 used to communicate with one or more other system(s) on-board the vehicle 104 and/or a remote computing device that is remote from the vehicle 104 (e.g., the other systems of FIG. 14 , a user device associated with a user, etc.).
- the communication interface 1409 can include any circuits, components, software, etc. for communicating via one or more networks (e.g., 1480 ).
- the communication interface 1409 can include for example, one or more of a communications controller, receiver, transceiver, transmitter, port, conductors, software and/or hardware for communicating data/information.
- the operations computing system 106 can perform the operations and functions for managing autonomous vehicles, as described herein.
- the operations computing system 106 can be located remotely from the vehicle 104 .
- the operations computing system 106 can operate offline, off-board, etc.
- the operations computing system 106 can include one or more distinct physical computing devices.
- the operations computing system 106 can include one or more computing devices 1420 .
- the one or more computing devices 1420 can include one or more processors 1422 and a memory 1424 .
- the one or more processors 1422 can be any suitable processing device (e.g., a processor core, a microprocessor, an ASIC, a FPGA, a controller, a microcontroller, etc.) and can be one processor or a plurality of processors that are operatively connected.
- the memory 1424 can include one or more non-transitory computer-readable storage media, such as RAM, ROM, EEPROM, EPROM, one or more memory devices, flash memory devices, etc., and combinations thereof.
- the memory 1424 can store information that can be accessed by the one or more processors 1422 .
- the data 1426 can include, for instance, service request data, vehicle data, vehicle service cancellation thresholds, and/or other data or information described herein.
- the operations computing system 106 can obtain data from one or more memory device(s) that are remote from the operations computing system 106 .
- the memory 1424 can also store computer-readable instructions 1428 that can be executed by the one or more processors 1422 .
- the instructions 1428 can be software written in any suitable programming language or can be implemented in hardware. Additionally, or alternatively, the instructions 1428 can be executed in logically and/or virtually separate threads on processor(s) 1422 .
- the memory 1424 can store instructions 1428 that when executed by the one or more processors 1422 cause the one or more processors 1422 to perform any of the operations and/or functions described herein, including, for example, any of the operations and functions of the operations computing system 106 , the computing device(s) 1420 , and any of the operations and functions for which the operations computing system 106 and/or the computing device(s) 1420 are configured, as described herein, as well as one or more portions of methods 1200 and/or 1300 .
- the computing device(s) 1420 can also include a communication interface 1429 used to communicate with one or more other system(s).
- the communication interface 1429 can include any circuits, components, software, etc. for communicating via one or more networks (e.g., 1480 ).
- the communication interface 1429 can include for example, one or more of a communications controller, receiver, transceiver, transmitter, port, conductors, software and/or hardware for communicating data/information.
- the vehicle computing system 102 and/or the operations computing system 106 can store or include one or more machine-learned models 1440 .
- the machine-learned models 1440 can be or can otherwise include various machine-learned models such as, for example, neural networks (e.g., deep neural networks), support vector machines, decision trees, ensemble models, k-nearest neighbors models, Bayesian networks, or other types of models including linear models and/or non-linear models.
- Example neural networks include feed-forward neural networks, recurrent neural networks (e.g., long short-term memory recurrent neural networks), or other forms of neural networks.
- the vehicle computing system 102 and/or the operations computing system 106 can receive the one or more machine-learned models 1440 from the machine learning computing system 1430 over the network(s) 1480 and can store the one or more machine-learned models 1440 in the memory of the respective system.
- the vehicle computing system 102 and/or the operations computing system 106 can use or otherwise implement the one or more machine-learned models 1440 (e.g., by processor(s) 1402 , 1422 ).
- the vehicle computing system 102 and/or the operations computing system 106 can implement the machine-learned model(s) 1440 to determine an acceptable walking distance, traffic constraint, estimated traffic impact, estimated time of user arrival (e.g., estimated boarding time, estimated boarding duration, estimated boarding completion, etc.), second vehicle route (e.g., vehicle holding pattern), vehicle service cancellation threshold, etc., as described herein.
- the machine learning computing system 1430 can include one or more processors 1432 and a memory 1434 .
- the one or more processors 1432 can be any suitable processing device (e.g., a processor core, a microprocessor, an ASIC, a FPGA, a controller, a microcontroller, etc.) and can be one processor or a plurality of processors that are operatively connected.
- the memory 1434 can include one or more non-transitory computer-readable storage media, such as RAM, ROM, EEPROM, EPROM, one or more memory devices, flash memory devices, etc., and combinations thereof.
- the memory 1434 can store information that can be accessed by the one or more processors 1432 .
- the memory 1434 can store data 1436 that can be obtained, received, accessed, written, manipulated, created, and/or stored.
- the machine learning computing system 1430 can obtain data from one or more memory devices that are remote from the system 1430 .
- the memory 1434 can also store computer-readable instructions 1438 that can be executed by the one or more processors 1432 .
- the instructions 1438 can be software written in any suitable programming language or can be implemented in hardware. Additionally, or alternatively, the instructions 1438 can be executed in logically and/or virtually separate threads on processor(s) 1432 .
- the memory 1434 can store the instructions 1438 that when executed by the one or more processors 1432 cause the one or more processors 1432 to perform operations.
- the machine learning computing system 1430 can include one or more server computing devices. If the machine learning computing system 1430 includes multiple server computing devices, such server computing devices can operate according to various computing architectures, including, for example, sequential computing architectures, parallel computing architectures, or some combination thereof.
- the machine learning computing system 1430 can include one or more machine-learned models 1450 .
- the machine-learned models 1450 can be or can otherwise include various machine-learned models such as, for example, neural networks (e.g., deep neural networks), support vector machines, decision trees, ensemble models, k-nearest neighbors models, Bayesian networks, or other types of models including linear models and/or non-linear models.
- Example neural networks include feed-forward neural networks, recurrent neural networks (e.g., long short-term memory recurrent neural networks), or other forms of neural networks.
- the machine-learned models 1450 can be similar to and/or the same as the machine-learned models 1440 .
- the machine learning computing system 1430 can communicate with the vehicle computing system 102 and/or the operations computing system 106 according to a client-server relationship.
- the machine learning computing system 1430 can implement the machine-learned models 1450 to provide a web service to the vehicle computing system 102 and/or the operations computing system 106 .
- the web service can provide machine-learned models to an entity associated with an autonomous vehicle such that the entity can implement the machine-learned model (e.g., to determine estimated traffic impacts, vehicle service request cancellation, etc.).
- machine-learned models 1450 can be located and used at the vehicle computing system 102 and/or the operations computing system 106 and/or machine-learned models 1450 can be located and used at the machine learning computing system 1430 .
- the machine learning computing system 1430 , the vehicle computing system 102 , and/or the operations computing system 106 can train the machine-learned models 1440 and/or 1450 through use of a model trainer 1460 .
- the model trainer 1460 can train the machine-learned models 1440 and/or 1450 using one or more training or learning algorithms.
- One example training technique is backwards propagation of errors.
- the model trainer 1460 can perform supervised training techniques using a set of labeled training data.
- the model trainer 1460 can perform unsupervised training techniques using a set of unlabeled training data.
- the model trainer 1460 can perform a number of generalization techniques to improve the generalization capability of the models being trained. Generalization techniques include weight decay, dropout, or other techniques.
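As a concrete illustration of one such generalization technique, the sketch below applies L2 weight decay inside a plain gradient-descent update; the function name, learning rate, and decay constant are illustrative assumptions, not details from the disclosure:

```python
# Illustrative sketch of weight decay (dropout would instead randomly
# zero activations during training). All names and constants here are
# hypothetical, not from the patent.

def sgd_step_with_weight_decay(weights, grads, lr=0.01, decay=1e-4):
    """One gradient-descent update with L2 weight decay.

    Weight decay shrinks each weight toward zero on every step,
    discouraging large weights and improving generalization.
    """
    return [w - lr * (g + decay * w) for w, g in zip(weights, grads)]

# Example: a single update on two weights.
w = [1.0, -2.0]
g = [0.5, -0.5]
w_new = sgd_step_with_weight_decay(w, g, lr=0.1, decay=0.01)
```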
- the model trainer 1460 can train a machine-learned model 1440 and/or 1450 based on a set of training data 1462 .
- the training data 1462 can include, for example, a number of sets of data from previous events (e.g., acceptable walking distance data, historic traffic data, user arrival data, trip cancellation data, user feedback data, other data described herein).
- the training data 1462 can be taken from the same geographic area (e.g., city, state, and/or country) in which an autonomous vehicle utilizing that model 1440 / 1450 is designed to operate.
- the models 1440/1450 can be trained to determine outputs (e.g., estimated traffic impact, acceptable walking distances) in a manner that is tailored to the customs of a particular location (e.g., waiting for a user longer, decreasing an acceptable walking distance, etc.).
- the model trainer 1460 can be implemented in hardware, firmware, and/or software controlling one or more processors.
- the network(s) 1480 can be any type of network or combination of networks that allows for communication between devices.
- the network(s) 1480 can include one or more of a local area network, wide area network, the Internet, secure network, cellular network, mesh network, peer-to-peer communication link and/or some combination thereof and can include any number of wired or wireless links.
- Communication over the network(s) 1480 can be accomplished, for instance, via a network interface using any type of protocol, protection scheme, encoding, format, packaging, etc.
- FIG. 14 illustrates one example system 1400 that can be used to implement the present disclosure.
- the vehicle computing system 102 and/or the operations computing system 106 can include the model trainer 1460 and the training dataset 1462 .
- the machine-learned models 1440 can be both trained and used locally at the vehicle computing system 102 and/or the operations computing system 106 .
- the vehicle computing system 102 and/or the operations computing system 106 may not be connected to other computing systems.
- Computing tasks discussed herein as being performed at computing device(s) remote from the vehicle can instead be performed at the vehicle (e.g., via the vehicle computing system), or vice versa. Such configurations can be implemented without deviating from the scope of the present disclosure.
- the use of computer-based systems allows for a great variety of possible configurations, combinations, and divisions of tasks and functionality between and among components.
- Computer-implemented operations can be performed on a single component or across multiple components.
- Computer-implemented tasks and/or operations can be performed sequentially or in parallel.
- Data and instructions can be stored in a single memory device or across multiple memory devices.
Abstract
Description
- The present application is based on and claims priority to U.S. Provisional Application 62/510,515 having a filing date of May 24, 2017, which is incorporated by reference herein.
- The present disclosure relates generally to controlling the travel holding pattern of an autonomous vehicle that provides a vehicle service to a user.
- An autonomous vehicle is a vehicle that is capable of sensing its environment and navigating without human input. In particular, an autonomous vehicle can observe its surrounding environment using a variety of sensors and can attempt to comprehend the environment by performing various processing techniques on data collected by the sensors. Given knowledge of its surrounding environment, the autonomous vehicle can identify an appropriate motion path through such surrounding environment.
- Aspects and advantages of embodiments of the present disclosure will be set forth in part in the following description, or may be learned from the description, or may be learned through practice of the embodiments.
- One example aspect of the present disclosure is directed to a computer-implemented method of controlling autonomous vehicles. The method includes obtaining, by a computing system that includes one or more computing devices, data indicative of a location associated with a user to which an autonomous vehicle is to travel. The autonomous vehicle is to travel along a first vehicle route that leads to the location associated with the user. The method includes obtaining, by the computing system, traffic data associated with a geographic area that includes the location associated with the user. The method includes determining, by the computing system, an estimated traffic impact of the autonomous vehicle on the geographic area based at least in part on the traffic data. The method includes determining, by the computing system, one or more vehicle actions based at least in part on the estimated traffic impact. The method includes causing, by the computing system, the autonomous vehicle to perform the one or more vehicle actions. The one or more vehicle actions include at least one of stopping the autonomous vehicle at least partially in a travel way within a vicinity of the location associated with the user or travelling along a second vehicle route.
- Another example aspect of the present disclosure is directed to a computing system for controlling autonomous vehicles. The computing system includes one or more processors and one or more memory devices. The one or more memory devices store instructions that when executed by the one or more processors cause the computing system to perform operations. The operations include obtaining data indicative of a location associated with a user. The user is associated with a request for a vehicle service provided by an autonomous vehicle. The autonomous vehicle is to travel along a first vehicle route to arrive within a vicinity of the location associated with the user. The operations include obtaining traffic data associated with a geographic area that includes the location associated with the user. The operations include obtaining location data associated with a user device associated with the user. The operations include determining at least one of an estimated traffic impact of the autonomous vehicle on the geographic area based at least in part on the traffic data or an estimated time of user arrival based at least in part on the location data associated with the user device. The operations include determining one or more vehicle actions based at least in part on at least one of the estimated traffic impact or the estimated time of user arrival. The operations include causing the autonomous vehicle to perform the one or more vehicle actions. The one or more vehicle actions include at least one of stopping the autonomous vehicle at least partially in a travel way within a vicinity of the location associated with the user or travelling along a second vehicle route.
- Yet another example aspect of the present disclosure is directed to an autonomous vehicle that includes one or more sensors, a communication system, one or more processors, and one or more memory devices. The one or more memory devices store instructions that when executed by the one or more processors cause the autonomous vehicle to perform operations. The operations include obtaining data indicative of a location associated with a user. The user is associated with a request for a vehicle service provided by the autonomous vehicle. The operations include controlling the autonomous vehicle to travel along a first vehicle route to arrive within a vicinity of the location associated with the user. The operations include obtaining traffic data associated with a geographic area that includes the location associated with the user based at least in part on sensor data obtained via the one or more sensors. The traffic data is indicative of a level of traffic within a surrounding environment of the autonomous vehicle. The operations include obtaining, via the communication system, location data associated with a user device associated with the user. The location data associated with the user device is indicative of one or more locations of the user device associated with the user at one or more times. The operations include determining an estimated traffic impact of the autonomous vehicle on the geographic area based at least in part on the traffic data. The operations include determining an estimated time of user arrival based at least in part on the location data associated with the user device. The operations include determining one or more vehicle actions based at least in part on at least one of the estimated traffic impact or the estimated time of user arrival. The operations include causing the autonomous vehicle to perform the one or more vehicle actions.
The one or more vehicle actions include at least one of stopping the autonomous vehicle within the vicinity of the location associated with the user or travelling along a second vehicle route.
- Other example aspects of the present disclosure are directed to systems, methods, vehicles, apparatuses, tangible, non-transitory computer-readable media, and memory devices for controlling autonomous vehicles.
- These and other features, aspects and advantages of various embodiments will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the present disclosure and, together with the description, serve to explain the related principles.
- Detailed discussion of embodiments directed to one of ordinary skill in the art is set forth in the specification, which makes reference to the appended figures, in which:
- FIG. 1 depicts an example system overview according to example embodiments of the present disclosure;
- FIG. 2 depicts an example geographic area that includes a location associated with a user according to example embodiments of the present disclosure;
- FIG. 3 depicts example information associated with an acceptable walking distance according to example embodiments of the present disclosure;
- FIG. 4 depicts an example display device with an example communication according to example embodiments of the present disclosure;
- FIG. 5 depicts example information associated with an estimated traffic impact according to example embodiments of the present disclosure;
- FIG. 6 depicts an example travel way according to example embodiments of the present disclosure;
- FIG. 7 depicts a flow diagram of an example method of determining an estimated time of user arrival according to example embodiments of the present disclosure;
- FIG. 8A depicts an example portion of a communications system according to example embodiments of the present disclosure;
- FIG. 8B depicts an example portion of a communications system according to example embodiments of the present disclosure;
- FIG. 8C depicts an example diagram of obtaining location data according to example embodiments of the present disclosure;
- FIG. 9 depicts example information associated with a second vehicle route according to example embodiments of the present disclosure;
- FIG. 10 depicts an example display device with an example communication according to example embodiments of the present disclosure;
- FIG. 11 depicts example information associated with a vehicle service cancellation threshold according to example embodiments of the present disclosure;
- FIG. 12 depicts a flow diagram of an example method of controlling autonomous vehicles according to example embodiments of the present disclosure;
- FIGS. 13A-B depict a flow diagram of an example method of controlling autonomous vehicles according to example embodiments of the present disclosure; and
- FIG. 14 depicts example system components according to example embodiments of the present disclosure.
- Reference now will be made in detail to embodiments, one or more example(s) of which are illustrated in the drawings. Each example is provided by way of explanation of the embodiments, not limitation of the present disclosure. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made to the embodiments without departing from the scope or spirit of the present disclosure. For instance, features illustrated or described as part of one embodiment can be used with another embodiment to yield a still further embodiment. Thus, it is intended that aspects of the present disclosure cover such modifications and variations.
- Example aspects of the present disclosure are directed to improving the travel patterns of an autonomous vehicle to account for potential traffic impacts, while waiting to provide a vehicle service to a user. For instance, an entity (e.g., service provider) can use a fleet of vehicles to provide a vehicle service (e.g., transportation service, delivery service, courier service, etc.) to a plurality of users. The fleet can include, for example, autonomous vehicles that can drive, navigate, operate, etc. with minimal and/or no interaction from a human driver. For example, an autonomous vehicle can receive data indicative of a location associated with a user that has requested a vehicle service, such as a transportation service. The autonomous vehicle can autonomously navigate along a route towards the location associated with the user. When within a vicinity of the location, the autonomous vehicle can attempt to identify a parking spot that is out of a travel way (e.g., out of a traffic lane). Such parking spots may not, however, be available to the autonomous vehicle. In such a case, the autonomous vehicle can determine whether the vehicle should stop in a travel lane to pick up the user and/or whether the vehicle should enter a holding pattern whereby the vehicle continues to travel (e.g., around the block) to return to the user's location. For example, as further described herein, the autonomous vehicle can estimate the impact the autonomous vehicle may have on traffic if the autonomous vehicle were to stop in a travel way to wait for the user, given, for example, the user's proximity to the vehicle. In the event that the impact would be low, the autonomous vehicle can stop in the travel way and alert the user of the vehicle's location. In the event that the impact would be high, or becomes high after the vehicle has stopped, the autonomous vehicle can be re-routed (e.g., around the block) so that the vehicle can again attempt to pick up the user.
In this way, the system and methods of the present disclosure can improve the situational awareness of an autonomous vehicle that is waiting for a user of a vehicle service (e.g., while attempting to pick up a user for transport).
- More particularly, an entity (e.g., service provider, owner, manager) can use one or more vehicles (e.g., ground-based vehicles) to provide a vehicle service such as a transportation service (e.g., rideshare service), a courier service, a delivery service, etc. The vehicle(s) can be autonomous vehicles that include various systems and devices configured to control the operation of the vehicle. For example, an autonomous vehicle can include an onboard vehicle computing system for operating the vehicle (e.g., located on or within the autonomous vehicle). The vehicle computing system can receive sensor data from sensor(s) onboard the vehicle (e.g., cameras, LIDAR, RADAR), attempt to comprehend the vehicle's surrounding environment by performing various processing techniques on the sensor data, and generate an appropriate motion plan through the vehicle's surrounding environment. Moreover, the autonomous vehicle can be configured to communicate with one or more computing devices that are remote from the vehicle. For example, the autonomous vehicle can communicate with an operations computing system that can be associated with the entity. The operations computing system can help the entity monitor, communicate with, manage, etc. the fleet of vehicles.
- An autonomous vehicle can be configured to operate in a plurality of operating modes. For example, an autonomous vehicle can be configured to operate in a fully autonomous (e.g., self-driving) operating mode in which the autonomous vehicle can drive and navigate with no interaction from a human driver present in the vehicle. In some implementations, a human driver may not be present in the autonomous vehicle. The autonomous vehicle can also be configured to operate in an approach mode in which the autonomous vehicle performs various functions as it approaches a location associated with a user, such as searching its surrounding environment for a parking location. The approach operating mode can be utilized, for example, when the autonomous vehicle is approaching a user that has requested a vehicle service, as further described herein.
- A user can make a request for a vehicle service provided by the autonomous vehicle. For instance, a user can provide (e.g., via a user device) a request to the operations computing system of an entity (e.g., service provider, manager, owner) that is associated with the autonomous vehicle. The request can indicate the type of vehicle service that the user desires (e.g., a transportation service, a delivery service, a courier service, etc.), a location associated with the user (e.g., a current location of the user, a different location, etc.), an identifier (e.g., phone number, Bluetooth, WiFi, Cellular, other data that can be used to contact the user, etc.) associated with the user device that provided the request, and/or other information.
- The operations computing system can process the request and select an autonomous vehicle to provide the requested vehicle service to the user. The operations computing system can provide, to the autonomous vehicle, data indicative of a location to which the autonomous vehicle is to travel. The location can be associated with the user that requested the vehicle service. For example, the location can be the current location of the user and/or a different location, such as for example a location at which the user would like to be picked up by the autonomous vehicle, provide an item to the autonomous vehicle, retrieve an item from the autonomous vehicle, etc. The location can be expressed as a coordinate (e.g., GPS coordinate, latitude-longitude coordinate pair), an address, a place name, and/or other geographic reference that can be used to identify the location. The location associated with the user can be represented, for example, as a pin on a map user interface.
- The autonomous vehicle can obtain, from the operations computing system, the data indicative of the location associated with the user. The autonomous vehicle can also obtain a first vehicle route that leads to the location associated with the user. The first vehicle route can be, for example, a route from the current location of the vehicle to the location associated with the user. In some implementations, the operations computing system can provide the first vehicle route to the autonomous vehicle. Additionally, or alternatively, the onboard vehicle computing system of the autonomous vehicle can determine the first vehicle route.
- The autonomous vehicle can travel in accordance with the first vehicle route to arrive within a vicinity of the location associated with the user. The vicinity can be defined by a distance (e.g., a radial distance) from the location associated with the user. The distance can be indicative of an acceptable walking distance from the location associated with the user. For example, an acceptable walking distance can be a distance that a user would be willing to walk to arrive at a vehicle. The acceptable walking distance can be determined based on a variety of information. Such information can include, for example, specific user preferences stored in a profile associated with the user, weather information (e.g., gathered via sensors onboard the vehicle, provided by a third party source, etc.), traffic conditions (e.g., current, historic, future predicted traffic conditions), historic vehicle services data (e.g., historic pickup data for previous transportation services), and/or other types of information. The autonomous vehicle can provide a communication to the user indicating that the autonomous vehicle is within the vicinity of the location associated with the user. For instance, such a communication can be in the form of a textual message stating "your ride is arriving, please prepare to board".
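The factors above could be combined in many ways; the sketch below is one minimal illustration, assuming a hypothetical base distance and adjustment factors (none of these values appear in the disclosure):

```python
# Illustrative sketch of combining user preference, weather, and
# traffic conditions into an acceptable walking distance. The base
# distance and scaling factors are assumptions, not from the patent.

def acceptable_walking_distance(base_m=200.0, user_factor=1.0,
                                raining=False, heavy_traffic=False):
    """Scale a base walking distance by user preference and conditions."""
    distance = base_m * user_factor  # user_factor from profile preferences
    if raining:
        distance *= 0.5   # users are typically willing to walk less in rain
    if heavy_traffic:
        distance *= 1.25  # a longer walk may beat waiting in traffic
    return distance

# A user comfortable with longer walks (factor 1.2) on a rainy day.
d = acceptable_walking_distance(user_factor=1.2, raining=True)
```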
- Once the autonomous vehicle is within the vicinity of the location associated with the user, the autonomous vehicle can begin to search for a parking location. For example, the autonomous vehicle can enter into the approach operating mode when the vehicle is within the vicinity of the location associated with the user. While it is within the vicinity, the autonomous vehicle can search for a parking location before it reaches the location associated with the user (e.g., before the GPS pin coordinate on a map) and/or after it passes the location associated with the user, but is still within the vicinity of the user (e.g., within acceptable walking distance for the user).
- To identify a parking spot, the autonomous vehicle can obtain sensor data associated with one or more objects that are proximate to the vehicle (e.g., within a field of view of one or more of the vehicle's onboard sensor(s)). The sensor data can include image data, radar data, LIDAR data, and/or other data acquired by the vehicle's sensor(s). The object(s) can include, for example, pedestrians, vehicles, bicycles, and/or other objects. The sensor data can be indicative of locations associated with the object(s) within the surrounding environment of the vehicle at one or more times. The autonomous vehicle can process the sensor data to determine if there are any available parking locations that are not currently occupied by the objects (e.g., other vehicles) within the vehicle's surrounding environment. In some implementations, the autonomous vehicle can utilize map data to determine if there are any designated parking locations (e.g., parking lots, pullover lanes, etc.) within the vicinity of the location associated with the user.
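The parking search described above can be sketched as a simple filter over map-designated spots, assuming hypothetical data shapes for the map data and the perception output:

```python
# Illustrative sketch: check designated parking locations (from map
# data) against spots that perception reports as occupied, and return
# the first open one. The data shapes are assumptions for illustration.

def find_open_parking(parking_spots, occupied_spot_ids):
    """Return the first designated spot not occupied by a perceived
    object (e.g., another vehicle), or None if every spot is taken."""
    for spot in parking_spots:
        if spot["id"] not in occupied_spot_ids:
            return spot
    return None

spots = [{"id": "p1"}, {"id": "p2"}]
open_spot = find_open_parking(spots, occupied_spot_ids={"p1"})
```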
- In the event the autonomous vehicle is able to identify a parking location that is out of a travel way and within the vicinity of the location associated with the user (e.g., out of a traffic lane), the autonomous vehicle can position itself into that parking location accordingly. The autonomous vehicle can send a communication to a user device associated with the user. The communication can indicate that the vehicle has arrived as well as the location of the autonomous vehicle. In some implementations, the user device can display a map user interface that includes a user route. The user route can be a route along which a user can travel to arrive at the autonomous vehicle. In the event that the autonomous vehicle is unable to identify a parking location that is out of the travel way (and within the vicinity of the user), the autonomous vehicle can decide whether or not to stop at least partially in a travel way to wait for the user.
- To help decide whether to stop within a travel way, the autonomous vehicle can obtain traffic data associated with a geographic area that includes the location associated with the user. The traffic data can include various types of data such as historic traffic data, predicted traffic data, and/or current traffic data associated with the geographic area (e.g., within the vicinity of the location of the user, a wider area, etc.). The traffic data can be obtained from a variety of sources such as other autonomous vehicles (e.g., within the vehicle fleet), the operations computing system, third party sources (e.g., regional traffic management entities, etc.), as well as the autonomous vehicle itself.
- By way of example, the autonomous vehicle can obtain the current traffic data associated with the geographic area that includes the location associated with the user. To do so, the autonomous vehicle can obtain sensor data (e.g., via its onboard sensors) associated with the surrounding environment of the autonomous vehicle that is within the vicinity of the location associated with the user, as described herein. The sensor data can be indicative of one or more objects within the surrounding environment of the autonomous vehicle. The autonomous vehicle can process the sensor data to classify which of the object(s) would be impacted (e.g., caused to stop) by the autonomous vehicle stopping in a travel way. For example, the autonomous vehicle can classify the vehicles that are behind the autonomous vehicle (in the same travel lane) as objects that would be impacted in the event the autonomous vehicle were to stop at least partially in the travel way. The autonomous vehicle can also identify object(s) that would not be affected by the vehicle stopping in the travel way. For example, the autonomous vehicle can determine that object(s) that have a path to travel around the autonomous vehicle (e.g., bicycles, vehicles in adjacent lanes, vehicles with clear paths to change lanes, etc.), may not be impacted and/or may be impacted to an insignificant degree. After such classification, the autonomous vehicle can determine a level of traffic associated with the geographic area (e.g., within the vicinity of the user's location) based at least in part on the sensor data. For example, the level of traffic can be based at least in part on the number of object(s) within the surrounding environment of the autonomous vehicle that would be impacted by the autonomous vehicle stopping at least partially in the travel way, while filtering out those object(s) that would not be impacted. 
Similar such information could be acquired via one or more other vehicles in the associated vehicle fleet.
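The classification and filtering steps above can be sketched as follows, assuming hypothetical perception-output fields (`lane`, `position_m`, `can_pass`) that are not specified in the disclosure:

```python
# Illustrative sketch: count only the perceived objects that would
# actually be blocked if the vehicle stopped in its lane, filtering
# out objects with a path around (bicycles, adjacent-lane vehicles,
# vehicles with a clear lane change).

def traffic_level(objects, ego_lane):
    """Number of objects impacted by stopping: same-lane objects
    behind the ego vehicle with no clear path to pass."""
    impacted = 0
    for obj in objects:
        same_lane = obj["lane"] == ego_lane
        behind = obj["position_m"] < 0  # negative = behind the ego vehicle
        blocked = not obj["can_pass"]
        if same_lane and behind and blocked:
            impacted += 1
    return impacted

objs = [
    {"lane": 1, "position_m": -12.0, "can_pass": False},  # impacted
    {"lane": 2, "position_m": -5.0, "can_pass": True},    # adjacent lane
    {"lane": 1, "position_m": -30.0, "can_pass": True},   # can change lanes
]
level = traffic_level(objs, ego_lane=1)
```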
- The autonomous vehicle can determine an estimated traffic impact of the autonomous vehicle on the geographic area based at least in part on the traffic data. The estimated traffic impact can be indicative of an estimated impact of the autonomous vehicle on one or more objects within a surrounding environment of the autonomous vehicle in the event that the autonomous vehicle were to stop at least partially in the travel way (e.g., in the vicinity of the location associated with the user). For example, in some implementations, the autonomous vehicle can compare the level of traffic (e.g., based on the sensor data) to a traffic constraint to determine whether the estimated traffic impact would be high or low. The traffic constraint can be implemented in a variety of forms. For example, the traffic constraint can include a traffic threshold that is indicative of an acceptable level of traffic (e.g., an acceptable number of objects) that would be impacted by the autonomous vehicle stopping at least partially in the travel way. A traffic level that exceeds the traffic threshold would be considered a high impact on traffic. In some implementations, the traffic constraint can be implemented as cost data (e.g., one or more cost function(s)). For example, the autonomous vehicle's onboard vehicle computing system can include cost data that reflects the cost(s) of stopping vehicle motion, the cost(s) of causing traffic build-up, the cost(s) of illegally stopping in a travel way, etc.
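Both constraint forms described above, a traffic threshold and cost data, can be sketched as below; the threshold value and cost weights are illustrative assumptions:

```python
# Illustrative sketch of the two traffic-constraint forms: a hard
# threshold on the number of impacted objects, and a cost function
# combining blocked objects with expected blocking time.

def should_stop_in_lane(traffic_level, traffic_threshold=3):
    """Threshold form: stopping is acceptable only if the number of
    impacted objects stays at or below the acceptable level."""
    return traffic_level <= traffic_threshold

def stop_cost(traffic_level, wait_s, cost_per_object=1.0,
              cost_per_second=0.1):
    """Cost form: cost of blocking objects plus cost of how long
    they would remain blocked."""
    return traffic_level * cost_per_object + wait_s * cost_per_second

ok = should_stop_in_lane(2)                  # low impact: stop and wait
cost = stop_cost(traffic_level=2, wait_s=30)
```

In the cost form, the vehicle's motion planner could compare this stopping cost against the cost of a holding-pattern re-route and pick the cheaper action.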
- The traffic constraint can be based on a variety of information. In some implementations, the traffic constraint can be based at least in part on historic traffic data that indicates the level of traffic previously occurring in that geographic area. For example, if the geographic area normally experiences a high level of traffic build-up, a corresponding traffic threshold can be higher (and/or the cost of stopping can be lower). Additionally, or alternatively, the traffic constraint can be based at least in part on real-time traffic data (e.g., from other vehicles in the fleet, from the autonomous vehicle, other sources). For example, in the event that there is already a traffic jam in the vicinity of the location of the user, the traffic threshold could be higher (and/or the cost of stopping could be lower). In some implementations, the traffic constraint can be based at least in part on the typical travel expectations of individuals in the geographic area. For example, individuals that are located in City A may be more patient when waiting in traffic than those in City B. Thus, a traffic threshold may be higher in City A than in City B (and/or the cost of stopping may be lower in City A than in City B). In some implementations, the traffic constraint can be based at least in part on map data. For example, in the event that the autonomous vehicle is traveling on a wide travel way in which impacted vehicles could eventually travel around the autonomous vehicle, the traffic threshold could be higher (and/or the cost of stopping could be lower).
- In some implementations, the traffic constraint can be determined at least in part from a model, such as a machine-learned model. For example, the machine-learned model can be or can otherwise include one or more various model(s) such as, for example, models using boosted random forest techniques, neural networks (e.g., deep neural networks), or other multi-layer non-linear models. Neural networks can include recurrent neural networks (e.g., long short-term memory recurrent neural networks), feed-forward neural networks, and/or other forms of neural networks. For instance, supervised training techniques can be performed to train the model (e.g., using previous driving logs) to determine a traffic constraint based at least in part on input data. The input data can include, for example, traffic data as described herein, map data, data from a traffic management entity, driving characteristics of individuals in an associated geographic area, complaints received from operators of vehicles that were caused to stop by autonomous vehicles, etc. The machine-learned model can provide, as an output, data indicative of a recommended traffic constraint. In some implementations, the recommended traffic constraint can be specific to the geographic area.
- The autonomous vehicle can also, or alternatively, determine an estimated time of user arrival in order to help determine whether or not to stop at least partially in the travel way. For instance, the autonomous vehicle can obtain location data associated with a user device associated with a user. The location data can be indicative of the position of the user device associated with the user. The autonomous vehicle can use one or more identifier(s) of the user device (e.g., provided by the operations computing system) to scan for and/or communicate with the user device when the vehicle is within the vicinity of the user. For example, the autonomous vehicle can utilize multiple input/multiple output communication, Bluetooth low energy protocol, RF signaling, and/or other communication technologies to obtain the location data. The autonomous vehicle can determine the estimated time of user arrival based at least in part on the location data associated with the user device. The estimated time of user arrival can be indicative of, for example, a time at which the user is estimated to complete boarding of the autonomous vehicle (e.g., for a transportation service). The estimated time of user arrival can be expressed as a time duration (e.g., user estimated to arrive in 1 minute) and/or a point in time (e.g., user estimated to arrive at 10:31 am (PT)). In this way, the autonomous vehicle can determine an amount of stopping time that the object(s) within its surroundings would be impacted as the autonomous vehicle waits for the user's arrival (e.g., how long other vehicles would be caused to stop while waiting for the user).
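As a non-limiting illustration, an estimated time of user arrival of the kind described above could be computed from the approximate user-to-vehicle distance plus a boarding buffer. The assumed walking speed and boarding time below are illustrative defaults, not values from the disclosure:

```python
def estimated_time_of_user_arrival(distance_m: float,
                                   walking_speed_mps: float = 1.4,
                                   boarding_time_s: float = 20.0) -> float:
    """Estimate seconds until the user has completed boarding, given the
    approximate distance between the user device and the vehicle.
    The default walking speed and boarding buffer are assumptions."""
    if walking_speed_mps <= 0:
        raise ValueError("walking speed must be positive")
    # Travel time to reach the vehicle, plus time to board and be seated.
    return distance_m / walking_speed_mps + boarding_time_s
```

A user already at the vehicle (distance 0) would thus still contribute the boarding buffer to the estimated stopping time experienced by other vehicles.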
- By way of example, the operations computing system can provide the autonomous vehicle with one or more identifier(s) of a user device associated with the user. The autonomous vehicle can use the identifier(s) (e.g., Bluetooth, WiFi, Cellular, other identifier) to determine the location of the user. For example, when the autonomous vehicle is within a vicinity of the location associated with the user, the autonomous vehicle can scan for the user device based at least in part on the identifier(s). Once the user device is found, the autonomous vehicle can track changes in the signal strength (e.g., radio signal strength identifier) to determine the approximate heading of the user as well as the approximate distance between the user and autonomous vehicle (e.g., without authenticated connection). In some implementations, the autonomous vehicle can determine differences in Bluetooth Low Energy beacon radio signal strength identifiers over time and/or inertial measurement unit changes, which can indicate distance and direction of the autonomous vehicle from the user device (e.g., mobile phone associated with the user). The autonomous vehicle can calculate the estimated time of user arrival based at least in part on the heading of the user and the approximate distance between the user and the vehicle (and/or the estimated velocity of the user), as further described herein. In some implementations, the autonomous vehicle can obtain the location data using multiple input, multiple output communication between the autonomous vehicle and a user device associated with the user. This can allow the autonomous vehicle to take advantage of the multiple antennas included in the vehicle's communication system as well as those of the user device to increase accuracy of the location data associated with the user. In some implementations, the estimated time of user arrival can be based at least in part on historic data. 
Such historic data can include, for example, previous correlations between changes in the signal strength of an identifier and the user's time to arrive at a vehicle.
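As a non-limiting illustration, the relationship between beacon signal strength and approximate distance described above is commonly modeled with a log-distance path-loss formula. The sketch below uses that common approximation; the reference RSSI at one meter and the path-loss exponent are assumed constants, not values from the disclosure:

```python
def rssi_to_distance_m(rssi_dbm: float,
                       rssi_at_1m_dbm: float = -59.0,
                       path_loss_exponent: float = 2.0) -> float:
    """Approximate distance (meters) from a BLE beacon RSSI reading using a
    log-distance path-loss model: rssi = rssi_1m - 10 * n * log10(d).
    The reference RSSI and exponent are illustrative assumptions."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

def is_user_approaching(rssi_samples_dbm: list) -> bool:
    """Rising RSSI over successive samples suggests the user device is
    moving toward the vehicle (i.e., the approximate distance is shrinking)."""
    return len(rssi_samples_dbm) >= 2 and rssi_samples_dbm[-1] > rssi_samples_dbm[0]
```

Tracking the sign and rate of change of such estimates over time is one way to approximate the heading and distance trends the disclosure refers to.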
- Additionally, or alternatively, the autonomous vehicle, systems, and methods described herein can utilize other communication methods. Such communication methods can include, for example, autonomous vehicle based triangulation (e.g., on vehicle triangulated RF), on-vehicle multi-range beacon (e.g., Bluetooth Low Energy) hardware (e.g., paired with a software application on a user device), application triangulation, user device to vehicle handshake (e.g., light signal handshake), autonomous vehicle perception of the user (e.g., via processing of sensor data to perceive the user and the user's location, distance, heading, velocity, and/or other state data associated therewith), GPS location of the user device, device specific techniques (e.g., associated with a specific device/model type), the autonomous vehicle serving as a localized base station (e.g., GPS, WiFi, etc.) for the user device, autonomous vehicle localization and user device image based localization via one or more network(s), and/or other techniques.
- In some implementations, in order to determine whether the amount of stopping time is acceptable, the autonomous vehicle can compare the estimated time of user arrival to a time constraint. The time constraint can be expressed as a time threshold (e.g., indicating an acceptable amount of stopping time) and/or cost data (e.g., cost functions expressing a cost in relation to stopping time). Similar to the traffic constraint, the time constraint can be based on historic data (e.g., indicating historic wait times), real-time data (e.g., indicating that the vehicles are already waiting due to another traffic build-up in front of the autonomous vehicle), expectations of individuals in the geographic area, machine-learned model(s), and/or other information.
- In some implementations, the estimated time of user arrival can also factor in additional amounts of time that can impact the object(s) within the surrounding environment of the autonomous vehicle while the autonomous vehicle is stopped, awaiting the user. Example instances requiring such additional amounts of time can be associated with a user getting into the vehicle and securely fastening his/her seatbelt, a user helping other passengers enter and become securely positioned within the vehicle (e.g., children or others requiring assistance), a user loading luggage or other items for transportation within the vehicle, a user receiving delivered item(s) from/placing item(s) within the vehicle, etc. In some implementations, the additional amounts of time can be determined based at least in part on information provided with a service request (e.g., type of service, destination, number of passengers, child's car seat requested, etc.).
- The autonomous vehicle can determine one or more vehicle actions based at least in part on the estimated traffic impact and/or the estimated time of user arrival. In some implementations, vehicle action(s) can include stopping within the vicinity of the location associated with the user (e.g., at least partially in the travel way). For example, in the event that the level of traffic (e.g., the number of other vehicles that would be impacted by an in-lane stop) is below a traffic threshold, the autonomous vehicle can determine that the vehicle can stop within the travel way to wait for the user to arrive at the vehicle. In another example, in the event that the estimated time of user arrival is below the time threshold, the autonomous vehicle can determine that the vehicle can stop at least partially within the travel way.
- In some implementations, the autonomous vehicle can base its determination to stop at least partially within the travel way on both the estimated traffic impact and the estimated time of user arrival. For instance, the autonomous vehicle can weigh each of these estimates to determine whether it would be appropriate for the vehicle to stop at least partially in the travel way to wait for the user. The autonomous vehicle can apply a first weighting factor to the estimated traffic impact and a second weighting factor to the estimated time of user arrival. The first weighting factor can be different than the second weighting factor. By way of example, the estimated impact on traffic may be high while the estimated time of user arrival may be short. Accordingly, the autonomous vehicle may determine that it can stop within the travel way because although a higher number of objects (e.g., other vehicles) may be caused to stop, it would only be for a short period of time because the user is close in distance to the autonomous vehicle. In such a case, the estimated time of user arrival can be given a greater weight than the estimated traffic impact. In another example, the estimated impact on traffic may be low while the estimated time of user arrival may be long. Accordingly, the autonomous vehicle may determine that it should not stop within the travel way because although only a few objects (e.g., other vehicles) may be caused to stop, it would be for a greater period of time because the user is farther from the autonomous vehicle. In such a case, the estimated traffic impact can be given a greater weight than the estimated time of user arrival.
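As a non-limiting illustration, the weighted combination of the two estimates described above can be sketched as a single stopping cost compared against a maximum acceptable cost. The weighting factors and cost limit below are illustrative assumptions, not values from the disclosure:

```python
def should_stop_in_travel_way(impacted_vehicles: int,
                              arrival_time_s: float,
                              traffic_weight: float = 2.0,
                              time_weight: float = 0.2,
                              max_cost: float = 40.0) -> bool:
    """Combine the estimated traffic impact and the estimated time of user
    arrival into one weighted stopping cost. All weights and the maximum
    acceptable cost are assumptions for illustration."""
    cost = traffic_weight * impacted_vehicles + time_weight * arrival_time_s
    return cost <= max_cost
```

Under these assumed weights, a high impact (ten vehicles) with a short wait (30 seconds) yields an acceptable cost, while a low impact (two vehicles) with a long wait (five minutes) does not, mirroring the two examples in the paragraph above.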
- The vehicle action(s) can also, or alternatively, include the autonomous vehicle entering into a holding pattern. In such a case, the vehicle action(s) can include traveling along a second vehicle route (e.g., an optimal holding pattern route). For example, the autonomous vehicle may be unable to find a parking location before and/or after the location associated with the user (e.g., before and/or after a pin location of the user). Additionally, the autonomous vehicle may determine that it should not stop within a travel way to wait for the user's arrival, as described herein. Thus, the autonomous vehicle can be re-routed along a second vehicle route that is at least partially different than the first vehicle route. The second vehicle route can be a path along which the autonomous vehicle can travel to re-arrive within the vicinity of the location of the user. For example, the second vehicle route can be a path along which the autonomous vehicle can travel around a block, back to the location associated with the user. This can afford the user additional time to arrive at the vehicle, without the autonomous vehicle impacting traffic (e.g., due to a stop).
- In some implementations, the autonomous vehicle can determine that it can stop within the travel way, but later determine that it must begin to travel again (e.g., according to a holding pattern route). For example, the autonomous vehicle can determine that it would be appropriate to stop at least partially within the travel way to wait for the user based at least in part on the estimated traffic impact and/or the estimated time of user arrival. While the autonomous vehicle is stopped, the traffic impact may increase (e.g., due to an increase in the number of other vehicle(s) stopped behind the autonomous vehicle) and/or the user may take longer than estimated to arrive at the autonomous vehicle. As such, the autonomous vehicle can determine an updated estimated traffic impact (e.g., based on the number of vehicles that have already stopped and/or additional vehicles that may be caused to stop) and/or an updated estimated time of user arrival (e.g., based on a change in the user device location data, if any). The autonomous vehicle can then determine that it can no longer remain stopped to wait for the user based at least in part on the updated estimates. As such, the autonomous vehicle can begin to travel again, for example, along the second vehicle route (e.g., around the block).
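As a non-limiting illustration, the re-evaluation of an in-progress stop can reuse the same kind of weighted cost check on the updated estimates, returning the vehicle action to take. The weights, cost limit, and action labels are illustrative assumptions:

```python
def reevaluate_stop(updated_impacted_vehicles: int,
                    updated_arrival_time_s: float) -> str:
    """Re-check a stop already in progress against updated estimates.
    Returns "remain_stopped" if the updated weighted cost is still
    acceptable, otherwise "travel_second_route" (e.g., around the block).
    The weighting (2.0, 0.2) and limit (40.0) are assumed values."""
    cost = 2.0 * updated_impacted_vehicles + 0.2 * updated_arrival_time_s
    return "remain_stopped" if cost <= 40.0 else "travel_second_route"
```

Calling this periodically while stopped captures the case above where a growing queue of vehicles, or a slower-than-estimated user, tips the decision toward resuming travel along the second vehicle route.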
- The vehicle computing system of the autonomous vehicle can implement the determined vehicle action(s). For example, in the event that the autonomous vehicle has determined to stop in the travel way, the vehicle computing system can cause the autonomous vehicle to stop by sending one or more control signals to the braking control system(s) of the autonomous vehicle. In the event that the autonomous vehicle has determined to travel along a second vehicle route (e.g., in accordance with the holding pattern), the vehicle computing system can obtain data associated with the second vehicle route and implement the route accordingly. For example, the vehicle computing system can request and obtain data indicative of the second vehicle route from the operations computing system. Additionally, or alternatively, the vehicle computing system can determine the second vehicle route onboard the vehicle. In either case, the vehicle computing system can send one or more control signals to cause a motion planning system of the autonomous vehicle to plan the motion of the vehicle in accordance with the second vehicle route (e.g., to implement a vehicle trajectory in accordance with the second vehicle action).
- The autonomous vehicle can provide the user with one or more communications indicating the actions taken by the autonomous vehicle. For example, in the event that the autonomous vehicle stops within the travel way, the autonomous vehicle can provide a communication to the user indicating that the vehicle is waiting for the user (e.g., “your vehicle has arrived, please proceed quickly to the vehicle”). In response to receiving such a communication, a user device associated with the user can display a map user interface indicating the vehicle location of the autonomous vehicle and a user route to the vehicle's location. In the event that the autonomous vehicle does not find a parking spot and does not stop in the travel way, the autonomous vehicle can provide a communication to the user indicating as such (e.g., “I could not locate you at the pin drop, traffic forced me to go around the block. Please proceed to the pin drop”).
- In some implementations, the autonomous vehicle may be relieved of its responsibility to provide a vehicle service to the user. For instance, in the event that a vehicle computing system and/or operations computing system determines that an autonomous vehicle should travel along the second vehicle route, such computing system(s) can determine whether it would be advantageous (e.g., more time efficient, more fuel efficient, etc.) for another autonomous vehicle in the area to be routed to the user. In the event that it would be advantageous, the computing system(s) can provide data to the autonomous vehicle indicating that the autonomous vehicle is no longer responsible for the service request. Additionally, or alternatively, the computing system(s) can re-route the autonomous vehicle to provide a vehicle service to another user.
- In some implementations, the autonomous vehicle can cancel the service request of the user. By way of example, the autonomous vehicle may be caused to re-route (e.g., circle the block) a certain number of times and/or the user may not arrive at the autonomous vehicle within a certain timeframe (e.g., above a trip cancellation threshold). In response, the autonomous vehicle can send data to the operations computing system requesting the cancellation of the user's service request. The operations computing system can cancel the service request (and inform the user accordingly) and/or re-route the autonomous vehicle to provide a vehicle service to another user. In some implementations, the autonomous vehicle can communicate directly with a user device associated with the user to cancel the service request and inform the user accordingly. The autonomous vehicle can report such a cancellation to the operations computing system.
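As a non-limiting illustration, the trip-cancellation condition described above can be sketched as a check against a re-route count and a wait-time threshold. The limits below are illustrative assumptions, not values from the disclosure:

```python
def should_request_cancellation(reroute_count: int,
                                wait_time_s: float,
                                max_reroutes: int = 3,
                                max_wait_s: float = 600.0) -> bool:
    """Return True once the vehicle has been re-routed (e.g., circled the
    block) too many times or the user's wait has exceeded the assumed
    trip-cancellation threshold."""
    return reroute_count >= max_reroutes or wait_time_s > max_wait_s
```

When this returns True, the vehicle would send the cancellation request to the operations computing system (or communicate directly with the user device) as described above.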
- The systems and methods described herein may provide a number of technical effects and benefits. For instance, the systems and methods enable an autonomous vehicle to determine a holding pattern, for waiting for a user, onboard the autonomous vehicle. As such, the autonomous vehicle need not communicate with a remote computing system (e.g., operations computing system) each time the vehicle must decide whether to stop in a travel way or re-route the vehicle (e.g., around the block). This can help improve the response time of the vehicle computing system when deciding and/or implementing such a holding pattern. Moreover, by enabling the vehicle computing system to determine whether to stop and/or to re-route onboard the vehicle, the systems and methods described herein can save computational resources of the operations computing system (that would otherwise be required for such determination). Accordingly, the computational resources of the operations computing system can be allocated to other core functions such as the management of service requests, routing of autonomous vehicles, etc.
- The systems and methods of the present disclosure also provide an improvement to vehicle computing technology, such as autonomous vehicle computing technology. For instance, the computer-implemented methods and systems improve the situational awareness of the autonomous vehicle to provide a vehicle service to a user. The systems and methods can enable a computing system to obtain data indicative of a location associated with a user to which an autonomous vehicle is to travel. The autonomous vehicle can travel along a first vehicle route that leads to the location associated with the user. The computing system can obtain traffic data associated with a geographic area that includes the location associated with the user and location data associated with a user device associated with the user. The computing system can determine an estimated traffic impact of the autonomous vehicle on the geographic area based at least in part on the traffic data and/or an estimated time of user arrival based at least in part on the location data associated with the user device. The computing system can determine one or more vehicle actions based at least in part on the estimated traffic impact and/or the estimated time of user arrival. The computing system can cause the autonomous vehicle to perform the one or more vehicle actions. As described herein, the vehicle action(s) can include at least one of stopping the autonomous vehicle at least partially in a travel way within a vicinity of the location associated with the user or travelling along a second vehicle route. In this way, the vehicle computing system can improve the holding pattern of the autonomous vehicle that is waiting for a user. For example, the holding pattern can be customized based on the estimated impact on the traffic surrounding the autonomous vehicle and/or the estimated time it will take for the specific user to arrive at the vehicle. 
As such, the systems and methods can improve the vehicle computing system's situational awareness by allowing it to take into account (e.g., in real-time) such circumstances when making a determination as to how to best provide vehicle services. Such an approach can also increase the efficiency of implementing a holding pattern (e.g., by avoiding the aforementioned latency issues) while providing an additional benefit of minimizing the autonomous vehicle's impact on traffic.
- Additionally, the systems and methods of the present disclosure can enhance the user experience associated with the autonomous vehicle. For instance, the systems and methods described herein provide a systematic approach that enables users to engage autonomous vehicles and receive effectively communicated information regarding expected autonomous vehicle locations including, for example, arrival times, use of holding patterns when needed, etc. By way of example, the communications and user interfaces described herein provide the user with updated information regarding the autonomous vehicle's actions and locations, thereby increasing the user's knowledge and understanding of the autonomous vehicle's intentions. In addition to these advantages, the user experience is further enhanced in that the described systems and methods can decrease the likelihood that a user will be subjected to potential frustration from other drivers or pedestrians in the surrounding environment that could be impacted while the user is arriving to and/or boarding the autonomous vehicle.
- With reference now to the FIGS., example embodiments of the present disclosure will be discussed in further detail.
FIG. 1 depicts an example system 100 according to example embodiments of the present disclosure. The system 100 can include a vehicle computing system 102 associated with a vehicle 104 and an operations computing system 106 that is remote from the vehicle 104.
- The vehicle 104 incorporating the vehicle computing system 102 can be a ground-based autonomous vehicle (e.g., car, truck, bus, etc.), an air-based autonomous vehicle (e.g., airplane, drone, helicopter, or other aircraft), or other types of vehicles (e.g., watercraft, etc.). The vehicle 104 can be an autonomous vehicle that can drive, navigate, operate, etc. with minimal and/or no interaction from a human driver. A human operator can be omitted from the vehicle 104 (and/or also omitted from remote control of the vehicle 104).
- The vehicle 104 can be configured to operate in a plurality of operating modes 108A-D. The vehicle 104 can be configured to operate in a fully autonomous (e.g., self-driving) operating mode 108A in which the vehicle 104 can drive and navigate with no input from a user present in the vehicle 104. The vehicle 104 can be configured to operate in a semi-autonomous operating mode 108B in which the vehicle 104 can operate with some input from a user present in the vehicle 104. The vehicle 104 can enter into a manual operating mode 108C in which the vehicle 104 is fully controllable by a user (e.g., human driver) and can be prohibited from performing autonomous navigation (e.g., autonomous driving). In some implementations, the vehicle 104 can implement vehicle operating assistance technology (e.g., collision mitigation system, power assist steering, etc.) while in the manual operating mode 108C to help assist the operator of the vehicle 104.
- The vehicle 104 can also be configured to operate in an approach mode 108D in which the vehicle 104 performs various functions as it approaches a location associated with a user. For example, the vehicle 104 can enter into the approach mode 108D when the vehicle is within a vicinity of a user to which the vehicle 104 is to provide a vehicle service, as further described herein. While in the approach mode 108D, the vehicle 104 can search its surrounding environment for a parking location as well as communicate with a user (e.g., via a user device associated with the user). Additionally, or alternatively, the vehicle 104 can be configured to evaluate its estimated impact on traffic and/or an estimated time of user arrival, as further described herein, when the vehicle 104 is in the approach mode 108D.
- The operating mode of the vehicle 104 can be adjusted in a variety of manners. In some implementations, the operating mode of the vehicle 104 can be selected remotely, off-board the vehicle 104. For example, an entity associated with the vehicle 104 (e.g., a service provider) can utilize the operations computing system 106 to manage the vehicle 104 (and/or an associated fleet). The operations computing system 106 can send one or more control signals to the vehicle 104 instructing the vehicle 104 to enter into, exit from, maintain, etc. an operating mode. By way of example, the operations computing system 106 can send one or more control signals to the vehicle 104 instructing the vehicle 104 to enter into the fully autonomous operating mode 108A. In some implementations, the operating mode of the vehicle 104 can be set onboard and/or near the vehicle 104. For example, the vehicle computing system 102 can automatically determine when and where the vehicle 104 is to enter, change, maintain, etc. a particular operating mode (e.g., without user input). Additionally, or alternatively, the operating mode of the vehicle 104 can be manually selected via one or more interfaces located onboard the vehicle 104 (e.g., key switch, button, etc.) and/or associated with a computing device proximate to the vehicle 104 (e.g., a tablet operated by authorized personnel located near the vehicle 104). In some implementations, the operating mode of the vehicle 104 can be adjusted based at least in part on a sequence of interfaces located on the vehicle 104. For example, the operating mode may be adjusted by manipulating a series of interfaces in a particular order to cause the vehicle 104 to enter into a particular operating mode.
- The
vehicle computing system 102 can include one or more computing devices located onboard the vehicle 104. For example, the computing device(s) can be located on and/or within the vehicle 104. The computing device(s) can include various components for performing various operations and functions. For instance, the computing device(s) can include one or more processor(s) and one or more tangible, non-transitory, computer readable media (e.g., memory devices). The one or more tangible, non-transitory, computer readable media can store instructions that when executed by the one or more processor(s) cause the vehicle 104 (e.g., its computing system, one or more processors, etc.) to perform operations and functions, such as those described herein for controlling an autonomous vehicle.
- As shown in FIG. 1, the vehicle 104 can include one or more sensors 112, an autonomy computing system 114, and one or more vehicle control systems 116. One or more of these systems can be configured to communicate with one another via a communication channel. The communication channel can include one or more data buses (e.g., controller area network (CAN)), on-board diagnostics connector (e.g., OBD-II), and/or a combination of wired and/or wireless communication links. The onboard systems can send and/or receive data, messages, signals, etc. amongst one another via the communication channel.
- The sensor(s) 112 can be configured to acquire sensor data 118 associated with one or more objects that are proximate to the vehicle 104 (e.g., within a field of view of one or more of the sensor(s) 112). The sensor(s) 112 can include a Light Detection and Ranging (LIDAR) system, a Radio Detection and Ranging (RADAR) system, one or more cameras (e.g., visible spectrum cameras, infrared cameras, etc.), motion sensors, and/or other types of imaging capture devices and/or sensors. The sensor data 118 can include image data, radar data, LIDAR data, and/or other data acquired by the sensor(s) 112. The object(s) can include, for example, pedestrians, vehicles, bicycles, and/or other objects. The object(s) can be located in front of, to the rear of, and/or to the side of the vehicle 104. The sensor data 118 can be indicative of locations associated with the object(s) within the surrounding environment of the vehicle 104 at one or more times. The sensor(s) 112 can provide the sensor data 118 to the autonomy computing system 114.
- In addition to the sensor data 118, the autonomy computing system 114 can retrieve or otherwise obtain map data 120. The map data 120 can provide detailed information about the surrounding environment of the vehicle 104. For example, the map data 120 can provide information regarding: the identity and location of different roadways, road segments, buildings, or other items or objects (e.g., lampposts, crosswalks, curbing, etc.); the location and directions of traffic lanes (e.g., the location and direction of a parking lane, a turning lane, a bicycle lane, or other lanes within a particular roadway or other travel way and/or one or more boundary markings associated therewith); traffic control data (e.g., the location and instructions of signage, traffic lights, or other traffic control devices); and/or any other map data that provides information that assists the vehicle 104 in comprehending and perceiving its surrounding environment and its relationship thereto.
- The vehicle 104 can include a positioning system 122. The positioning system 122 can determine a current position of the vehicle 104. The positioning system 122 can be any device or circuitry for analyzing the position of the vehicle 104. For example, the positioning system 122 can determine position by using one or more of inertial sensors, a satellite positioning system, based on IP/MAC address, by using triangulation and/or proximity to network access points or other network components (e.g., cellular towers, WiFi access points, etc.), and/or other suitable techniques. The position of the vehicle 104 can be used by various systems of the vehicle computing system 102 and/or provided to one or more remote computing device(s) (e.g., of the operations computing system 106). For example, the map data 120 can provide the vehicle 104 relative positions of the surrounding environment of the vehicle 104. The vehicle 104 can identify its position within the surrounding environment (e.g., across six axes) based at least in part on the data described herein. For example, the vehicle 104 can process the sensor data 118 (e.g., LIDAR data, camera data) to match it to a map of the surrounding environment to get an understanding of the vehicle's position within that environment.
- The
autonomy computing system 114 can include aperception system 124, aprediction system 126, amotion planning system 128, and/or other systems that cooperate to perceive the surrounding environment of thevehicle 104 and determine a motion plan for controlling the motion of thevehicle 104 accordingly. For example, theautonomy computing system 114 can receive thesensor data 118 from the sensor(s) 112, attempt to comprehend the surrounding environment by performing various processing techniques on the sensor data 118 (and/or other data), and generate an appropriate motion plan through such surrounding environment. Theautonomy computing system 114 can control the one or morevehicle control systems 116 to operate thevehicle 104 according to the motion plan. - The
autonomy computing system 114 can identify one or more objects that are proximate to thevehicle 104 based at least in pail on thesensor data 118 and/or themap data 120. For example, theperception system 124 can obtainstate data 130 descriptive of a current state of an object that is proximate to thevehicle 104. Thestate data 130 for each object can describe, for example, an estimate of the object's: current location (also referred to as position); current speed (also referred to as velocity); current acceleration; current heading; current orientation; size/footprint (e.g., as represented by a bounding shape); class (e.g., pedestrian class vs. vehicle class vs. bicycle class), and/or other state information. Theperception system 124 can provide thestate data 130 to the prediction system 126 (e.g., for predicting the movement of an object). - The
prediction system 126 can create predicteddata 132 associated with each of the respective one or more objects proximate to thevehicle 104. The predicteddata 132 can be indicative of one or more predicted future locations of each respective object. The predicteddata 132 can be indicative of a predicted path (e.g., predicted trajectory) of at least one object within the surrounding environment of thevehicle 104. For example, the predicted path (e.g., trajectory) can indicate a path along which the respective object is predicted to travel over time (and/or the speed at which the object is predicted to travel along the predicted path). Theprediction system 126 can provide the predicteddata 132 associated with the object(s) to themotion planning system 128. - The
motion planning system 128 can determine amotion plan 134 for thevehicle 104 based at least in part on the predicted data 132 (and/or other data). Themotion plan 134 can include vehicle actions with respect to the objects proximate to thevehicle 104 as well as the predicted movements. For instance, themotion planning system 128 can implement an optimization algorithm that considers cost data associated with a vehicle action as well as other objective functions (e.g., cost functions based on speed limits, traffic lights, etc.), if any, to determine optimized variables that make up themotion plan 134. By way of example, themotion planning system 128 can determine that thevehicle 104 can perform a certain action (e.g., pass an object) without increasing the potential risk to thevehicle 104 and/or violating any traffic laws (e.g., speed limits, lane boundaries, signage). Themotion plan 134 can include a planned trajectory, speed, acceleration, other actions, etc. of thevehicle 104. - The
motion planning system 128 can provide the motion plan 134 with data indicative of the vehicle actions, a planned trajectory, and/or other operating parameters to the vehicle control system(s) 116 to implement the motion plan 134 for the vehicle 104. For instance, the vehicle 104 can include a mobility controller configured to translate the motion plan 134 into instructions. By way of example, the mobility controller can translate a determined motion plan 134 into instructions to adjust the steering of the vehicle 104 “X” degrees, apply a certain magnitude of braking force, etc. The mobility controller can send one or more control signals to the responsible vehicle control component (e.g., braking control system, steering control system, acceleration control system) to execute the instructions and implement the motion plan 134. - The
vehicle 104 can include a communications system 136 configured to allow the vehicle computing system 102 (and its computing device(s)) to communicate with other computing devices. The vehicle computing system 102 can use the communications system 136 to communicate with the operations computing system 106 and/or one or more other remote computing device(s) over one or more networks (e.g., via one or more wireless signal connections). In some implementations, the communications system 136 can allow communication among one or more of the system(s) on-board the vehicle 104. The communications system 136 can also be configured to enable the autonomous vehicle to communicate with and/or otherwise receive data from a user device 138 associated with a user 110. The communications system 136 can utilize various communication technologies such as, for example, Bluetooth low energy protocol, radio frequency signaling, etc. In some implementations, the communications system 136 can enable the vehicle 104 to function as a WiFi base station for a user device 138 and/or implement localization techniques. The communications system 136 can include any suitable components for interfacing with one or more network(s), including, for example, transmitters, receivers, ports, controllers, antennas, and/or other suitable components that can help facilitate communication. - The
vehicle 104 can include one or more human-machine interfaces 139. For example, the vehicle 104 can include one or more display devices located onboard the vehicle 104. A display device (e.g., screen of a tablet, laptop, etc.) can be viewable by a user of the vehicle 104 that is located in the front of the vehicle 104 (e.g., driver's seat, front passenger seat). Additionally, or alternatively, a display device can be viewable by a user of the vehicle 104 that is located in the rear of the vehicle 104 (e.g., back passenger seat(s)). - In some implementations, the
vehicle 104 can be associated with an entity (e.g., a service provider, owner, manager). The entity can be one that provides one or more vehicle service(s) to a plurality of users via a fleet of vehicles that includes, for example, the vehicle 104. In some implementations, the entity can be associated with only the vehicle 104 (e.g., a sole owner, manager). In some implementations, the operations computing system 106 can be associated with the entity. The vehicle 104 can be configured to provide one or more vehicle services to one or more users. The vehicle service(s) can include transportation services (e.g., rideshare services in which the user rides in the vehicle 104 to be transported), courier services, delivery services, and/or other types of services. The vehicle service(s) can be offered to users by the entity, for example, via a software application (e.g., a mobile phone software application). The entity can utilize the operations computing system 106 to coordinate and/or manage the vehicle 104 (and its associated fleet, if any) to provide the vehicle services to a user 110. - The
operations computing system 106 can include one or more computing devices that are remote from the vehicle 104 (e.g., located off-board the vehicle 104). For example, such computing device(s) can be components of a cloud-based server system and/or other type of computing system that can communicate with the vehicle computing system 102 of the vehicle 104. The computing device(s) of the operations computing system 106 can include various components for performing various operations and functions. For instance, the computing device(s) can include one or more processor(s) and one or more tangible, non-transitory, computer readable media (e.g., memory devices). The one or more tangible, non-transitory, computer readable media can store instructions that, when executed by the one or more processor(s), cause the operations computing system 106 (e.g., the one or more processors, etc.) to perform operations and functions, such as coordinating vehicles to provide vehicle services. - For example, a
user 110 can request a vehicle service provided by the vehicle 104. For instance, a user can provide (e.g., via a user device 138) data indicative of a request 140 to the operations computing system 106 (e.g., of the entity that is associated with the vehicle 104). In some implementations, the request 140 can be generated based at least in part on user input to a user interface displayed on the user device 138 (e.g., a user interface associated with a software application of the entity). The request 140 can indicate the type of vehicle service that the user 110 desires (e.g., a transportation service, a delivery service, a courier service, etc.) and a location associated with the user 110 (e.g., a current location of the user, a different location, etc.). The request 140 can also include an identifier (e.g., phone number, Bluetooth, WiFi, Cellular, IP address, other information, etc.) associated with the user device 138 that provided the request 140 (and/or other user device). The identifier can be used by the vehicle computing system 102 to communicate with the user device 138 and/or otherwise provide/obtain data associated therewith, as further described herein. In some implementations, such an identifier can be retrieved from a memory that securely stores such information in a profile/account associated with the user 110 (e.g., such that the request 140 need not provide the identifier). - The
operations computing system 106 can process the request 140 and select the vehicle 104 to provide the requested vehicle service to the user 110. The operations computing system can provide, to the vehicle 104, data 142 indicative of a location to which the vehicle 104 is to travel. For example, FIG. 2 depicts an example geographic area 200 that includes a location 202 associated with the user 110 according to example embodiments of the present disclosure. The location 202 can be associated with the user 110 that requested the vehicle service. For example, the location 202 can be the current location of the user 110, as specified by the user 110 and/or determined based on user device location data (e.g., provided with the request 140 and/or otherwise obtained). The location 202 can also be a location that is different than a current location of a user 110, such as, for example, a location at which the user 110 would like to be picked up by the vehicle 104, provide an item to the vehicle 104, retrieve an item from the vehicle 104, and/or otherwise interact with the vehicle 104. The location 202 can be expressed as a coordinate (e.g., GPS coordinate, latitude-longitude coordinate pair), an address, a place name, and/or another geographic reference that can be used to identify the location 202. The location 202 associated with the user 110 can be represented, for example, as a pin on a map user interface. - The
vehicle computing system 102 can obtain the data 142 indicative of the location 202 associated with the user 110 to which the vehicle 104 is to travel. As described herein, the user 110 can be associated with a request 140 for a vehicle service provided by the vehicle 104. The vehicle 104 can obtain the data 142 indicative of the location 202 associated with the user 110 from the operations computing system 106. In some implementations, the user 110 may communicate directly with the vehicle 104 to request the vehicle service. For example, the user 110 may use the user device 138 to send the request 140 to the vehicle computing system 102. In such a case, the vehicle computing system 102 can process the request 140 and determine the location 202 of the user 110. - The
vehicle 104 can also obtain a first vehicle route 204 that leads to the location 202 associated with the user 110. The first vehicle route 204 can be, for example, a route from the current location of the vehicle 104 to the location 202 associated with the user 110. In some implementations, the operations computing system 106 can provide the first vehicle route 204 to the vehicle 104. Additionally, or alternatively, the vehicle computing system 102 of the vehicle 104 can determine the first vehicle route 204. For example, the vehicle computing system 102 can determine the first vehicle route 204 based at least in part on the map data 120. - The
vehicle 104 is to travel in accordance with the first vehicle route 204 to arrive within a vicinity 206 of the location 202 associated with the user 110. For example, the vehicle computing system 102 can control the vehicle 104 (e.g., via a motion plan 134 implemented by the control system(s) 116) to travel along the first vehicle route 204 to arrive within the vicinity 206 of the location 202 associated with the user 110. The vicinity 206 of the location 202 associated with the user 110 can be defined at least in part by a distance (e.g., a radial distance) from the location 202 associated with the user 110. The distance can be indicative of an acceptable walking distance from the location 202 associated with the user 110. For example, an acceptable walking distance can be a distance that a user would be willing to walk (or otherwise travel) to arrive at a vehicle. In some implementations, the operations computing system 106 can determine the acceptable walking distance, as described herein, and provide such information to the vehicle 104. In some implementations, the vehicle 104 can determine the acceptable walking distance. In some implementations, the acceptable walking distance can be determined by another computing system and provided to the operations computing system 106 and/or the vehicle 104. - The acceptable walking distance can be determined based on a variety of information.
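Expressed in code, the radial vicinity test described above reduces to a great-circle distance comparison against the acceptable walking distance. The following is a minimal sketch, not the disclosed implementation; the function names and meter-based threshold are illustrative assumptions:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_vicinity(vehicle_pos, user_pos, acceptable_walking_distance_m):
    """True if the vehicle is within the radial vicinity of the user's location.

    vehicle_pos and user_pos are (latitude, longitude) pairs; the threshold
    corresponds to the acceptable walking distance 302.
    """
    return haversine_m(*vehicle_pos, *user_pos) <= acceptable_walking_distance_m
```

In practice the threshold would be supplied by the operations computing system 106, the vehicle 104, or another computing system, as described above.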
FIG. 3 depicts example information 300 associated with an acceptable walking distance according to example embodiments of the present disclosure. The acceptable walking distance 302 can be determined based at least in part on at least one of current traffic data 304A, historic data 304B, user preference data 304C, local weather data 304D, regional weather data 304E, and/or other data. The current traffic data 304A can be indicative of a current level of traffic within the geographic area 200 and/or an area that would affect the geographic area 200. The current traffic data 304A can be obtained from another computing system (e.g., city management database, the operations computing system 106, another vehicle, etc.) and/or determined by the vehicle 104, as further described herein. For example, in the event that the current traffic level is high such that it would take a longer time for the vehicle 104 to reach the location 202, the acceptable walking distance 302 may be higher so that the user 110 isn't waiting a longer time to board the vehicle 104, place an item in the vehicle 104, retrieve an item from the vehicle 104, etc. - The
historic data 304B can include historic data associated with providing vehicle services to a user. For instance, the historic data 304B can be indicative of previously calculated acceptable walking distances for the specific user 110 and/or for other user(s) of the vehicle services. By way of example, in the event that the user (or similarly situated user(s)) has shown a willingness to walk a certain distance to a vehicle, the acceptable walking distance 302 can be determined to reflect the historically acceptable walking distance. In some implementations, the historic data 304B can include historic traffic data. - Additionally, or alternatively, the
acceptable walking distance 302 can be based at least in part on preferences of the user 110. For instance, the entity associated with the vehicle 104 can maintain an account/profile associated with the user 110. In some implementations, the user 110 can specify an acceptable walking distance (e.g., via user input to a user interface). The user-specified acceptable walking distance can be securely stored and used to determine the acceptable walking distance 302 when the user 110 requests a vehicle service. In some implementations, the user 110 may provide feedback regarding the distance the user 110 walked to arrive at a vehicle that is providing the user 110 vehicle services. The user 110 can provide feedback data (e.g., via user input to a user interface) indicating whether the distance was acceptable or unacceptable (e.g., as prompted by a software application). Accordingly, the acceptable walking distance 302 can be based at least in part on such feedback data. In some implementations, the acceptable walking distance 302 can be based at least in part on preferences of other users, such as those similarly situated to the user 110, within the geographic area, within a similar geographic area, etc. - The
acceptable walking distance 302 can also, or alternatively, be based at least in part on weather data. In some implementations, the acceptable walking distance 302 can be based at least in part on local weather data 304D obtained via a vehicle. For example, the vehicle 104 can include a rain sensor, thermometer, humidity sensor, and/or other types of sensor(s) that can be used to determine weather conditions within the surrounding environment of the vehicle 104. The vehicle 104 can also be configured to determine the presence of one or more weather conditions (e.g., rain, sleet, snow, etc.) based at least in part on the sensor data 118. Additionally, or alternatively, the acceptable walking distance 302 can be based at least in part on regional weather data 304E (e.g., from a third party weather source). The acceptable walking distance 302 can be adjusted depending on the weather conditions indicated by the weather data. By way of example, in the event that the local weather data 304D and/or the regional weather data 304E indicate that the geographic area 200 is experiencing (and/or will experience) rain, the acceptable walking distance 302 may be decreased to a short distance from the location 202. In the event that the local weather data 304D and/or the regional weather data 304E indicate that the geographic area 200 is experiencing (and/or will experience) clear skies, the acceptable walking distance 302 can be a greater distance from the location 202. - In some implementations, the
acceptable walking distance 302 can be determined at least in part from a model, such as a machine-learned model. For example, the machine-learned model can be or can otherwise include one or more various model(s) such as, for example, models using boosted random forest techniques, neural networks (e.g., deep neural networks), or other multi-layer non-linear models. Neural networks can include recurrent neural networks (e.g., long short-term memory recurrent neural networks), feed-forward neural networks, and/or other forms of neural networks. For instance, supervised training techniques can be performed to train the model using historical acceptable walking distances, weather data, user feedback data, etc. to determine an acceptable walking distance 302 based at least in part on input data. The input data can include, for example, the various types of information 300, as described herein. The machine-learned model can provide, as an output, data indicative of an acceptable walking distance 302. In some implementations, the acceptable walking distance 302 can be specific to the geographic area. For example, the model can be trained based on information associated with the geographic area 200 such that the outputted acceptable walking distance 302 is specific to the geographic area 200. - In some implementations, the
vehicle 104 can provide a communication to the user 110 indicating that the vehicle 104 is within the vicinity 206 of the location 202 associated with the user 110. For example, the vehicle computing system 102 can determine that the vehicle 104 is within the acceptable walking distance 302 from the location 202 of the user 110 (e.g., based on the positioning system 144). In response, the vehicle computing system 102 can send a communication to the user device 138 associated with the user 110. The communication can indicate that the vehicle 104 is within the vicinity 206 of the user 110 (e.g., within the acceptable walking distance 302). For instance, for a transportation service, such a communication can be in the form of a textual message, auditory message, etc. stating “your vehicle is arriving, please prepare to board”. - The
vehicle 104 can begin to search for a parking location (e.g., out of the vehicle's travel way) when the vehicle 104 is within the vicinity of the location 202 associated with the user 110. For example, the vehicle 104 can enter into the approach operating mode 108D when the vehicle 104 is within the vicinity 206 of the location 202 associated with the user 110. While within the vicinity 206, the vehicle 104 can search for a parking location before it reaches the location 202 associated with the user 110 (e.g., before the GPS pin coordinate on a map). In some implementations, the vehicle 104 can search for a parking location after it passes the location 202 associated with the user 110 and is still within the vicinity 206 of the user 110 (e.g., within the acceptable walking distance 302). - To identify a parking location, the
vehicle computing system 102 can obtain sensor data 118 associated with one or more objects that are proximate to the vehicle 104. As described herein, the sensor data 118 can be indicative of locations associated with the object(s) within the surrounding environment of the vehicle 104 at one or more times. The vehicle 104 can process the sensor data 118 to determine if there are any available parking locations out of the vehicle's travel way (e.g., out of a traffic lane) that are not currently occupied by the objects (e.g., other vehicles) within the vehicle's surrounding environment. In some implementations, the vehicle 104 can utilize the map data 120 to determine whether any designated parking locations (e.g., parking lots, pullover lanes, etc.) are located and/or are available within the vicinity 206 of the location 202 associated with the user 110. In the event the vehicle 104 can identify a parking location that is out of a travel way and within the vicinity 206 of the location 202 associated with the user 110 (e.g., out of a traffic lane), the vehicle 104 can position itself into that parking location accordingly. - The
vehicle computing system 102 can send a communication to the user device 138 associated with the user 110 indicating that the vehicle 104 has parked. For example, FIG. 4 depicts an example display device 400 with an example communication 402 according to example embodiments of the present disclosure. The display device 400 (e.g., display screen) can be associated with the user device 138 associated with the user 110. The communication 402 can be presented via a user interface 404 on the display device 400. The communication 402 can indicate that the vehicle 104 has arrived. In some implementations, the communication 402 and/or another portion of the user interface 404 can be indicative of a location of the vehicle 104. For example, the display device 400 can display a map user interface 406 that includes a user route 408. The user route 408 can be a route along which a user 110 can travel to arrive at the vehicle 104. - In the event that the
vehicle 104 is unable to identify a parking location that is out of the travel way (and within the vicinity 206 of the user 110), the vehicle 104 can decide whether or not to stop at least partially in a travel way to wait for the user 110. To inform that decision, the vehicle computing system 102 of the vehicle 104 can obtain traffic data associated with the geographic area 200 that includes the location 202 associated with the user 110. -
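At its simplest, this stop-or-wait decision can be framed as comparing an estimated count of impacted road users against a traffic threshold. The following sketch is illustrative only; the object fields, helper names, and threshold-style constraint are assumptions rather than the disclosed implementation:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """Minimal stand-in for a perceived object's state."""
    object_id: int
    lane_id: int      # lane the object currently occupies
    is_moving: bool   # parked objects are not counted as impacted

def estimate_traffic_impact(objects, vehicle_lane_id):
    """Count objects that would be blocked if the vehicle stopped in its lane.

    Objects in other lanes (or parked objects) are assumed able to
    continue past the stopped vehicle.
    """
    return sum(
        1 for o in objects
        if o.lane_id == vehicle_lane_id and o.is_moving
    )

def may_stop_in_travel_way(objects, vehicle_lane_id, traffic_threshold):
    """Apply a threshold-style traffic constraint to the estimated impact."""
    return estimate_traffic_impact(objects, vehicle_lane_id) <= traffic_threshold
```

A cost-function formulation, as described below for the traffic constraint 506, would replace the hard threshold with a penalty term in the motion planner's optimization.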
FIG. 5 depicts example traffic data 500 that can be obtained by the vehicle computing system 102 according to example embodiments of the present disclosure. The traffic data 500 can be obtained from a variety of sources such as other vehicles (e.g., other autonomous vehicles within the vehicle fleet), the operations computing system 106, third party sources (e.g., traffic management entities, etc.), as well as the vehicle 104 itself. The traffic data 500 can include, for example, in-lane traffic data 502A, out-of-lane traffic data 502B, other vehicle traffic data 502C, current wider traffic data 502D, historic traffic data 502E, and/or other types of traffic data. The traffic data 500 can be updated periodically, as scheduled, upon request, in real-time, and/or in near real-time. - The in-
lane traffic data 502A and/or out-of-lane traffic data 502B can be indicative of a level of traffic within the surrounding environment of the vehicle 104. For instance, the in-lane traffic data 502A can be indicative of the number of objects, object locations, and the speed of the respective objects within the current travel lane (and/or other designated travel boundaries) of the vehicle 104 (e.g., the other vehicles to the rear and front of the vehicle 104). The out-of-lane traffic data 502B can be indicative of the number of objects, object locations, and the speed of the respective objects within the surrounding environment, other than in the current travel lane (or other boundaries) of the vehicle 104 (e.g., other vehicles in the other lanes, all other classified objects around the vehicle 104, etc.). The in-lane traffic data 502A and/or out-of-lane traffic data 502B can be based at least in part on the sensor data 118 associated with the surrounding environment of the vehicle 104, as further described herein. - The
vehicle computing system 102 can obtain the other vehicle traffic data 502C from one or more other vehicles, such as one or more other autonomous vehicles in a fleet that includes the vehicle 104. The other vehicle traffic data 502C can be indicative of the number of objects, object locations, and the speed of the respective objects within the surrounding environment of the other vehicle(s), while the other vehicle(s) are within the geographic area 200, the vicinity 206 of the location 202 associated with the user 110, and/or another location which may have a traffic effect on the geographic area 200 and/or the vicinity 206 of the location 202 associated with the user 110. The vehicle computing system 102 can obtain the other vehicle traffic data 502C via vehicle-to-vehicle communication and/or via other computing device(s) (e.g., the operations computing system 106). - The current
wider traffic data 502D can be indicative of the traffic within a larger regional area that includes the geographic area 200. For instance, the vehicle computing system 102 can obtain the current wider traffic data 502D that indicates the current traffic patterns, build-ups, etc. within a region that includes the geographic area 200. Such data can be obtained via a third party source such as, for example, a traffic management entity associated with the region. - The
historic traffic data 502E can include previously collected traffic data of the types of traffic data described herein and/or other types of traffic data. For example, the historic traffic data 502E can include traffic data previously obtained by the vehicle computing system 102 (e.g., in-lane, out-of-lane traffic data), traffic data previously obtained by other vehicle(s) (e.g., associated with the geographic area 200, the vicinity 206 of the location 202, etc.), historic traffic data (e.g., traffic patterns) associated with a region that includes the geographic area 200, and/or other types of historic traffic data. - The
vehicle computing system 102 can determine an estimated traffic impact 504 of the vehicle 104 on the geographic area 200 based at least in part on the traffic data 500. The estimated traffic impact 504 can be indicative of an estimated impact of the vehicle 104 on one or more objects within a surrounding environment of the vehicle 104 in the event that the vehicle 104 were to stop at least partially in the travel way within the vicinity 206 of the location 202 associated with the user 110. In some implementations, the estimated traffic impact 504 can estimate the likelihood that an approaching object (e.g., another vehicle) can pass safely without endangering the user 110. - In some implementations, the estimated
traffic impact 504 can be based at least in part on a comparison of a level of traffic to a traffic constraint 506. The vehicle computing system 102 can determine a level of traffic (e.g., a number of objects that could be impacted by the vehicle 104 stopping at least partially in a travel way) within the surrounding environment of the vehicle 104 based at least in part on one or more of the types of traffic data 500. The vehicle computing system 102 can compare the level of traffic to a traffic constraint 506 to determine whether the estimated traffic impact 504 would be high or low (e.g., significant or insignificant, unacceptable or acceptable, etc.). - The
traffic constraint 506 can be implemented in a variety of forms. For example, the traffic constraint 506 can include a traffic threshold that is indicative of an acceptable level of traffic (e.g., an acceptable number of objects) that would be impacted by the vehicle 104 stopping at least partially in the travel way. A level of traffic that exceeds the traffic threshold would be considered a high impact on traffic. In some implementations, the traffic constraint 506 can be implemented as cost data (e.g., one or more cost function(s)). For example, the vehicle computing system 102 (e.g., the motion planning system 128) can include cost data that reflects the cost(s) of stopping vehicle motion, the cost(s) of causing traffic build-up, the cost(s) of illegally stopping in a travel way, etc. - The
traffic constraint 506 can be based on a variety of information. In some implementations, the traffic constraint 506 can be based at least in part on historic traffic data that indicates the level of traffic previously occurring in that geographic area. For example, if the geographic area 200 normally experiences a high level of traffic build-up, a corresponding traffic threshold can be higher (and/or the cost of stopping can be lower). Additionally, or alternatively, the traffic constraint 506 can be based at least in part on real-time traffic data (e.g., from other vehicles in the fleet, from the vehicle 104, other sources). For example, in the event that there is already a traffic jam in the vicinity 206 of the location 202 of the user 110, the traffic threshold could be higher (and/or the cost of stopping could be lower). In some implementations, the traffic constraint 506 can be based at least in part on the typical travel expectations of individuals in the geographic area 200. For example, individuals that are located in City A may be more patient when waiting in traffic than those in City B. Thus, a traffic threshold may be higher in City A than in City B (and/or the cost of stopping may be lower in City A than in City B). In some implementations, the traffic constraint 506 can be based at least in part on map data 120 (and/or other map data). For example, in the event that the vehicle 104 is traveling on a wide travel way in which impacted vehicles could eventually travel around the vehicle 104, the traffic threshold could be higher (and/or the cost of stopping could be lower). The traffic constraint 506 can be determined dynamically, in real-time (and/or near real-time), to reflect the conditions currently faced by the vehicle 104. - In some implementations, the
traffic constraint 506 can be determined at least in part from a model, such as a machine-learned model. For example, the machine-learned model can be or can otherwise include one or more various model(s) such as, for example, models using boosted random forest techniques, neural networks (e.g., deep neural networks), or other multi-layer non-linear models. Neural networks can include recurrent neural networks (e.g., long short-term memory recurrent neural networks), feed-forward neural networks, and/or other forms of neural networks. For instance, supervised training techniques can be performed to train the model (e.g., using previous driving logs) to determine a traffic constraint 506 based at least in part on input data. The input data can include, for example, the traffic data 500 as described herein, map data, data from a traffic management entity, driving characteristics of individuals in an associated geographic area, complaints received from operators of vehicles that were caused to stop by autonomous vehicles, etc. The machine-learned model can provide, as an output, data indicative of a recommended traffic constraint. In some implementations, the recommended traffic constraint can be specific to the geographic area 200. For example, the model can be trained based at least in part on data associated with the geographic area 200 such that the recommended traffic constraint is specific for that particular region. - Additionally, or alternatively, the estimated
traffic impact 504 can be based at least in part on a model. The vehicle computing system 102 and/or the operations computing system 106 can obtain data descriptive of the model (e.g., a machine-learned model). The vehicle computing system 102 and/or the operations computing system 106 can provide input data to the model. The input data can include one or more of the types of traffic data 500 (e.g., associated with the geographic area 200). In some implementations, the input data can include the map data 120. The model can determine the estimated traffic impact 504 that the vehicle 104 would have on the geographic area 200 if the vehicle 104 were to stop at least partially within a travel way (e.g., within a current lane of travel). By way of example, the model can evaluate the traffic data 500 to determine the level of traffic in the vehicle surrounding environment. The model can analyze the traffic data 500 with respect to the traffic constraint 506 to determine whether the estimated traffic impact 504 is high or low, significant or insignificant, unacceptable or acceptable, etc. The output of such a model can be the estimated traffic impact 504, which can be indicative of, for example, a number of objects that would be impacted by the vehicle 104 stopping at least partially in a travel way and/or whether the estimated traffic impact is high or low, significant or insignificant, acceptable or unacceptable, etc. - In some implementations, the output of the model can be provided as an input to the model for another set of traffic data (e.g., at a subsequent time step). In such fashion, confidence can be built that a determined estimated traffic impact is accurate. Stated differently, in some implementations, the process can be iterative such that the estimated traffic impact can be recalculated over time as it becomes clearer what the estimated traffic impact is on the respective geographic area. For example, the model can include one or more autoregressive models.
In some implementations, the model can include one or more machine-learned recurrent neural networks. For example, recurrent neural networks can include long short-term memory recurrent neural networks, gated recurrent unit networks, or other forms of recurrent neural networks.
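The feedback loop described above, in which each output is fed back as an input at the next time step, can be sketched as a simple first-order autoregressive-style update that blends the newest observation with the previous estimate. The smoothing factor and pure-Python form are illustrative assumptions; the disclosure contemplates richer autoregressive and recurrent models:

```python
def iterative_impact_estimate(observed_counts, smoothing=0.5, initial=0.0):
    """Iteratively refine a traffic-impact estimate over time steps.

    Each step computes est_t = smoothing * obs_t + (1 - smoothing) * est_{t-1},
    so the estimate stabilizes as successive observations agree.
    Returns the list of per-step estimates.
    """
    estimate = initial
    history = []
    for obs in observed_counts:
        estimate = smoothing * obs + (1.0 - smoothing) * estimate
        history.append(estimate)
    return history
```

For example, three consecutive observations of four impacted objects yield estimates that converge toward four, which is the sense in which confidence in the estimated traffic impact builds over time.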
- As described herein, the estimated
traffic impact 504 can be based at least in part on the sensor data 118 acquired onboard the vehicle 104. For instance, the vehicle computing system 102 can obtain traffic data associated with the geographic area 200 that includes the location 202 associated with the user 110 based at least in part on the sensor data 118 obtained via the one or more sensors 112. By way of example, the vehicle computing system 102 can obtain the in-lane traffic data 502A and/or out-of-lane traffic data 502B associated with the geographic area 200 that includes the location 202 associated with the user 110. In some implementations, the vehicle computing system 102 can obtain sensor data 118 (e.g., via the one or more sensors 112) associated with the surrounding environment of the vehicle 104 that is within the vicinity 206 of the location 202 associated with the user 110, as described herein. The sensor data 118 can be indicative of one or more objects within the surrounding environment of the vehicle 104 that is within the vicinity 206 of the location 202. The vehicle computing system 102 can process the sensor data 118 to classify which of the object(s) would be impacted (e.g., caused to stop) by the vehicle 104 stopping in a travel way. - By way of example,
FIG. 6 depicts an example travel way 600 according to example embodiments of the present disclosure. The travel way 600 can be associated with the geographic area 200. For example, the travel way 600 can be located within the vicinity 206 of the location 202 associated with the user 110 (e.g., the street on which the user 110 is located). The travel way 600 can include the current travel lane 602 (and/or other designated travel boundaries) of the vehicle 104. The travel way 600 can include other travel lanes, such as other lane 604 (e.g., a lane adjacent to the current travel lane 602). The vehicle computing system 102 can classify the objects within the surrounding environment that would be impacted in the event the vehicle 104 were to stop at least partially in the travel way 600. For example, the object 606 (e.g., another vehicle behind the vehicle 104 in the same travel lane 602) can be identified as an object that would be impacted in the event that the vehicle 104 stops at least partially in the current lane 602. - The
vehicle computing system 102 can also identify object(s) that would not be affected by the vehicle 104 stopping at least partially in the travel way 600. Such object(s) can include one or more objects in another travel lane as well as objects within the same travel lane as the vehicle 104. For example, the vehicle computing system 102 can determine that the object 608 located in the other travel lane 604 (e.g., another vehicle in an adjacent travel lane) will not be impacted by the vehicle 104 stopping at least partially in the travel way 600 because the object 608 can continue past the vehicle 104 via the other travel lane 604. In another example, the vehicle computing system 102 can determine that an object 610 (e.g., a bicycle) may not be impacted by the vehicle 104 stopping (and/or may be impacted to an insignificant degree) because the object 610 may be parked and/or have the opportunity to travel around the vehicle 104 (e.g., via a lane change into the adjacent lane, a maneuver around the vehicle 104 within the same lane, etc.). - After such classification, the
vehicle computing system 102 can determine a level of traffic associated with the geographic area 200 (e.g., within the vicinity of the user's location) based at least in part on the sensor data 118. For example, the level of traffic can be based at least in part on the number of object(s) within the surrounding environment of the vehicle 104 that would be impacted by the vehicle 104 stopping at least partially in the travel way 600 (e.g., the current travel lane 602), while filtering out those object(s) that would not be impacted. Similar information could be acquired via one or more other vehicles in the associated vehicle fleet. The vehicle computing system 102 can determine the estimated traffic impact by comparing the level of traffic to the traffic constraint 506 (e.g., a traffic threshold indicative of a threshold level of traffic). In the event that the level of traffic exceeds the traffic constraint 506, the vehicle computing system 102 can determine that the estimated traffic impact 504 on the geographic area 200 would be high. - The
vehicle computing system 102 can also, or alternatively, determine an estimated time of user arrival. The estimated time of user arrival can be indicative of an amount of time needed for the user 110 to interact with the vehicle 104 (e.g., before the vehicle 104 can begin moving again). For example, the vehicle computing system 102 can determine an estimated time until the user 110 completes boarding of the vehicle 104 (e.g., enters the vehicle 104 and fastens a seatbelt for a transportation service). This can be made up of an estimated time until the user 110 starts boarding the vehicle 104 (e.g., the time needed to walk to the vehicle 104, the time until the vehicle doors unlock, etc.) and an estimated time of boarding duration for a user 110 (e.g., to load luggage, help other passengers board, etc.). In some implementations, the vehicle computing system 102 can determine an estimated time until the user 110 completes the retrieval of an item from the vehicle 104 (e.g., completely removes an item from the vehicle 104 for a delivery service). In some implementations, the vehicle computing system 102 can determine an estimated time until the user 110 places an item in the vehicle 104 (e.g., for a courier service). The estimated time of user arrival (e.g., estimated time until the user 110 completes boarding) can help the vehicle computing system 102 determine whether or not to stop at least partially in the travel way 600. Such time estimate(s) can be expressed as a time duration (e.g., user estimated to arrive in 1 minute) and/or a point in time (e.g., user estimated to arrive at 10:31 am (PT)). - The estimated time of user arrival (e.g., the estimated time until the
user 110 completes boarding, item retrieval, item placement, etc.) can be based on a variety of information. FIG. 7 depicts a flow diagram of an example method 700 of determining an estimated time of user arrival (e.g., autonomous vehicle user boarding times) according to example embodiments of the present disclosure. While the following provides examples of the method 700 with respect to determining autonomous vehicle user boarding times, a similar approach can be taken for determining an estimated time until the user 110 completes the retrieval of an item from the vehicle 104 and/or an estimated time until the user 110 places an item in the vehicle 104. One or more portion(s) of the method 700 can be implemented by one or more computing devices such as, for example, the one or more computing device(s) of the vehicle computing system 102 and/or other systems (e.g., as computing operations). Each respective portion of the method 700 can be performed by any (or any combination) of the one or more computing devices. Moreover, one or more portion(s) of the method 700 can be implemented as an algorithm on the hardware components of the device(s) described herein (e.g., as in FIGS. 1 and 14), for example, to control an autonomous vehicle. FIG. 7 depicts elements performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the elements of any of the methods discussed herein can be adapted, rearranged, expanded, omitted, combined, and/or modified in various ways without deviating from the scope of the present disclosure. - At (702), the
method 700 can include obtaining one or more identifiers of a user device. The vehicle computing system 102 can use one or more identifier(s) of the user device 138 (e.g., obtained with the request 140, obtained by the vehicle 104, provided by the operations computing system 106, etc.) to scan for and/or communicate with the user device 138 when the vehicle 104 is within the vicinity 206 of the user 110. By way of example, the operations computing system 106 can provide the vehicle computing system 102 with one or more identifier(s) of a user device 138 associated with the user 110. The identifiers can be, for example, unique radio identifiers (e.g., Bluetooth, WiFi, Cellular, friendly names, contact number, other identifier) collected via a software application and provided to the operations computing system 106. The vehicle computing system 102 can utilize the identifiers to communicate with and/or locate a user device 138 associated with the user 110. For example, when the vehicle 104 is within a vicinity 206 of the location 202 associated with the user 110, the vehicle computing system 102 can scan for the user device 138 (e.g., opt-in radios) based at least in part on the identifier(s) (e.g., when the vehicle 104 is in the approach mode 108D). - At (704), the
method 700 can include obtaining location data associated with the user device. For instance, the vehicle computing system 102 can obtain (e.g., via the communication system 136) location data associated with a user device 138 associated with a user 110. The location data associated with the user device 138 can be indicative of one or more locations of the user device 138 associated with the user 110, at one or more times. In some implementations, the vehicle computing system 102 can use the identifier(s) to determine the location of the user 110. The vehicle computing system 102 can be configured to utilize a variety of communication technologies to obtain the location data associated with the user device 138. For example, the vehicle computing system 102 can obtain the location data based at least in part on a triangulation of signals via at least one of multiple input, multiple output communication between the vehicle 104 and the user device 138, one or more Bluetooth low energy beacons located onboard the vehicle 104, or a light signal handshake between the user device 138 and the vehicle 104, as further described herein. - In some implementations, the
vehicle computing system 102 can use radio frequency (RF) signaling to obtain location data associated with the user device 138. For example, FIG. 8A depicts an example portion 800 of a communications system 136 according to example embodiments of the present disclosure. The communications system 136 can include one or more electronic devices 802 (e.g., RF module) configured to transmit and/or obtain one or more RF signals (e.g., from the user device 138) via one or more communication device(s) 804 (e.g., transmitters, receivers, RF sensors, etc.). For example, once the user device 138 is found (e.g., via the identifiers), the vehicle computing system 102 can track changes in the signal strength (e.g., radio signal strength identifier (RSSI)) to determine various information about the user device 138. For example, the vehicle computing system 102 can determine the approximate distance of the user device 138 (and the user 110) to the vehicle 104 (e.g., differences in RSSI, triangulation, and improvements over time can be used to determine the approximate distance of the user 110 to the vehicle 104). Additionally, or alternatively, the vehicle computing system 102 can determine a heading of the user 110 based at least in part on RF signal(s) associated with the user device 138 (e.g., the communication device(s) 804 with the best/strongest RSSI reading can indicate the direction/heading of the user 110). The vehicle computing system 102 can also determine a speed/velocity of the user device 138 using such a technique. The approximate distance, heading, speed, etc. measurements can be determined with or without an authenticated connection. - Additionally, or alternatively, the
vehicle computing system 102 can utilize Bluetooth low energy protocol to obtain location data associated with the user device 138. For example, FIG. 8B depicts an example portion 810 of a communications system 136 according to example embodiments of the present disclosure. The communications system 136 can include one or more Bluetooth low energy beacons 812 that are located onboard the vehicle 104. The vehicle computing system 102 can obtain the location data associated with the user device 138 based at least in part on the one or more Bluetooth low energy beacons 812 located onboard the vehicle 104. In some implementations, the vehicle computing system 102 can determine differences in Bluetooth Low Energy (BLE) beacon radio signal strength identifiers over time and/or inertial measurement unit changes, which can indicate a distance between the user device 138 of the user 110 (e.g., a mobile phone associated with the user 110) and the vehicle 104. Additionally, or alternatively, the vehicle computing system 102 can determine a heading of the user device 138 (and the user 110) based at least in part on the Bluetooth low energy beacon(s) 812 (e.g., the beacon(s) with the best/strongest RSSI reading can indicate the direction/heading of the user 110). The vehicle computing system 102 can also determine a speed/velocity of the user device 138 using such a technique. In some implementations, the user device 138 can determine its location (e.g., relative to the vehicle 104) based at least in part on signals transmitted from one or more beacons 812 located onboard the vehicle 104. The user device 138 can provide data indicative of its location to one or more remote computing device(s) (e.g., the operations computing system 106, cloud-based system). The remote computing device(s) can provide data indicative of the location of the user device 138 (determined based on the beacons 812) to the vehicle computing system 102.
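A minimal sketch of the RSSI-based ranging described above, using the standard log-distance path-loss model. The reference power at 1 m, the path-loss exponent, and the receiver names are illustrative assumptions, not values from the disclosure.

```python
def rssi_to_distance(rssi_dbm: float, ref_rssi_dbm: float = -40.0,
                     path_loss_exponent: float = 2.0) -> float:
    """Approximate device distance in meters from a single RSSI reading.

    Log-distance path-loss model: rssi = ref_rssi - 10 * n * log10(d),
    where ref_rssi is the expected reading at 1 m and n is the path-loss
    exponent (roughly 2 in free space, higher in cluttered environments).
    """
    return 10.0 ** ((ref_rssi_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

def heading_from_readings(readings: dict) -> str:
    """The beacon/receiver with the strongest (least negative) RSSI
    reading suggests the direction of the user device relative to the
    vehicle, per the heading technique described above."""
    return max(readings, key=readings.get)
```

For example, with the assumed −40 dBm reference, a −60 dBm reading maps to roughly 10 m, and a front-mounted receiver reporting the strongest signal suggests the user is ahead of the vehicle.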
The vehicle computing system 102 can determine the location of the user device 138 (and the user 110) based on such data. In some implementations, the remote computing device(s) can process data from the user device 138 (e.g., associated with the beacons 812) to determine a location of the user device 138 and provide data associated therewith to the vehicle computing system 102. - In some implementations, the
vehicle computing system 102 can utilize one or more altimeters (and/or other measuring device) to obtain location data associated with the user device 138. For example, FIG. 8C depicts an example diagram 820 of obtaining location data according to example embodiments of the present disclosure. The vehicle 104 can include one or more altimeters 822 located onboard the vehicle 104. The user device 138 can include one or more altimeters 824. As shown, the user device 138 (and the user 110) can be located on the second story of a building 826. The vehicle computing system 102 can obtain location data associated with the user device 138 via at least one altimeter 822 located onboard the vehicle 104. By way of example, the vehicle computing system 102 can obtain location data associated with altimeter(s) 824 of the user device 138 via one or more network(s) 828. The vehicle computing system 102 can compare the location data associated with altimeter(s) 824 of the user device 138 to location data associated with altimeter(s) 822 of the vehicle 104. The vehicle computing system 102 can determine an elevation/altitude of the user device 138 (and the user 110) based at least in part on this comparison (e.g., a difference between the altimeter(s) 824 of the user device 138 and the altimeter(s) 822 of the vehicle 104 can indicate an elevation/altitude difference). The elevation/altitude can be relative to the position of the vehicle 104 and/or another reference (e.g., ground level, sea level, etc.). - In some implementations, the
vehicle computing system 102 can utilize other communication technologies to obtain location data associated with the user device 138. For example, the vehicle computing system 102 can obtain the location data associated with the user device 138 based at least in part on multiple input, multiple output communication between the vehicle 104 and the user device 138 associated with the user 110. This can allow the vehicle 104 to take advantage of the multiple antennas included in the vehicle's communication system 136 as well as those of the user device 138 to increase the accuracy of the location data 704A associated with the user 110. For example, the user device 138 can obtain an identifier (e.g., Radio Network Temporary Identifier) that can be associated with the user device 138. The user device 138 can provide data indicative of the identifier to one or more remote computing device(s) (e.g., the operations computing system 106, cloud-based system). The remote computing device(s) can provide data indicative of the identifier to the vehicle computing system 102. The vehicle computing system 102 can locate the user device 138 (and the user 110) based at least in part on the identifier via the multiple antennas onboard the vehicle 104 (and/or the antenna(s) of the user device 138). In some implementations, the vehicle computing system 102 can utilize a handshake (e.g., light signal handshake) between the user device 138 and the vehicle 104. - In some implementations, the
vehicle computing system 102 can obtain location data associated with the user device 138 based at least in part on image data. For example, the user 110 can obtain image data associated with the user 110 (e.g., via the user device 138). The image data can be indicative of one or more characteristics (e.g., buildings, street signs, etc.) of the geographic area and/or surrounding environment of the user 110. In some implementations, the user 110 may be included in the image data. The user device 138 can process the image data to determine the location of the user 110 (e.g., via a comparison of image data features to known geographic features). The user device 138 can provide the determined location to one or more remote computing device(s) (e.g., the operations computing system 106 and/or other cloud-based system). The remote computing device(s) can provide data indicative of the location of the user 110 to the vehicle computing system 102. The vehicle computing system 102 can obtain the data indicative of the location of the user 110 determined based at least in part on image data associated with the user 110. The estimated time until the user 110 starts boarding the vehicle 104 can be based at least in part on the location of the user 110. In some implementations, the remote computing device (e.g., the operations computing system 106 and/or other cloud-based system) can obtain the image data associated with the user 110, process the image data (e.g., as described herein), and provide the data indicative of the location of the user 110 to the vehicle computing system 102. This can be helpful to save the processing resources of a computationally limited device (e.g., mobile device). - In some implementations, the
vehicle computing system 102 can obtain the image data associated with the user 110 (e.g., via the user device 138 and/or the remote computing device). The vehicle computing system 102 can determine a location of the user 110 based at least in part on the image data. For example, the vehicle computing system 102 can analyze the features of the image data (e.g., the background, street signs, buildings, etc.) and compare the features to other data (e.g., sensor data 118, map data 120, other data, etc.). The vehicle computing system 102 can determine a location of the user based at least in part on this comparison. - In some implementations, the
vehicle computing system 102 can utilize other communication techniques. These techniques can include, for example, vehicle perception of the user 110 (e.g., via processing of sensor data 118 to perceive the user 110 and the user's location, distance, heading, velocity, and/or other state data 130 associated therewith), GPS location of the user device 138, device specific techniques (e.g., specific device/model type), the vehicle 104 serving as a localized base station (e.g., GPS, WiFi, etc.), and/or other techniques. - Returning to
FIG. 7, the method 700 can include determining an estimated time until the user starts interaction with the vehicle, at (706). For instance, the vehicle computing system 102 can determine an estimated time until the user starts boarding the vehicle 104 (ETSB) based at least in part on the location data associated with the user device 138. The estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) can be indicative of, for example, a countdown in time until the user's location is near/at the vehicle 104, until the user 110 begins to board, until the doors of the vehicle 104 are unlocked, until the doors are opened, etc. The user 110 can, but need not, physically interact with the vehicle 104 for this time estimate. - The estimated time until the
user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) can be based on a variety of data. For example, the vehicle computing system 102 can determine a distance 708 between the user 110 and the vehicle 104 based at least in part on the location data associated with the user device 138, as described herein. The estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) can be based at least in part on the distance 708 between the user 110 and the vehicle 104. Additionally, or alternatively, the vehicle computing system 102 can determine an elevation/altitude 710 of the user 110 based at least in part on the location data associated with the user device 138 (e.g., obtained via the one or more altimeters onboard the vehicle 104), as described herein. The estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) can be based at least in part on the elevation/altitude 710 of the user 110 (e.g., if the user 110 is in a tall building then additional time should be added). Additionally, or alternatively, the vehicle computing system 102 can determine a heading 712 of the user 110 based at least in part on the location data associated with the user device 138, as described herein. The estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) can be based at least in part on the heading 712 of the user 110 (e.g., if the user 110 is across the street then additional time can be added). Additionally, or alternatively, the vehicle computing system 102 can determine a location of the user 110 based at least in part on image data, as described herein.
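One way to combine the distance 708, elevation/altitude 710, and heading 712 factors into an ETSB estimate is a simple additive heuristic like the sketch below. The walking speed, per-meter vertical penalty, and street-crossing penalty are illustrative assumptions, not values from the disclosure.

```python
def estimate_time_to_start_boarding(distance_m: float,
                                    elevation_m: float = 0.0,
                                    across_street: bool = False,
                                    walk_speed_mps: float = 1.4,
                                    vertical_s_per_m: float = 6.0,
                                    crossing_penalty_s: float = 30.0) -> float:
    """Heuristic ETSB in seconds: walking time over the horizontal
    distance, plus time to descend/climb the elevation difference (e.g.,
    a user on an upper story of a building), plus a penalty if the user
    must cross the street to reach the vehicle."""
    seconds = distance_m / walk_speed_mps          # horizontal walking time
    seconds += abs(elevation_m) * vertical_s_per_m  # stairs/elevator time
    if across_street:
        seconds += crossing_penalty_s               # wait for a safe crossing
    return seconds
```

A learned model, as described below for the historic data 714, could replace these fixed constants with coefficients fit to observed boarding times.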
The estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) can be based at least in part on the location of the user 110, as determined from the image data. - In some implementations, the
vehicle computing system 102 can obtain historic data 714 to help determine the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)). The historic data 714 can be indicative of, for example, historic start boarding times of one or more other users (and/or the user 110). In some implementations, a machine-learned model can be trained based on such historic data to determine the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)). For example, the model can be trained based on training data indicative of previous location data, user distances, altitudes, headings, etc. labeled with the times of user arrival (e.g., the times when the users started boarding the vehicle). The model can be trained to receive input data (e.g., location data, user distances, altitudes, headings, etc.) and provide, as an output, an estimated time until the user 110 starts interaction with the vehicle 104 (e.g., an estimated time until the user starts boarding a vehicle (ETSB)). In some implementations, the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) can be based at least in part on the historic data 714. For example, the vehicle computing system 102 can determine the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) based at least in part on such a machine-learned model (e.g., as an output thereof). - In some implementations, the estimated time until the
user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) can be used to determine one or more vehicle actions, at (716). For example, the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)) can indicate that the user 110 is close to the vehicle 104 and/or heading toward the vehicle 104. The vehicle computing system 102 can cause one or more doors of the vehicle 104 to unlock based at least in part on the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)). To do so, the vehicle computing system 102 can provide control signal(s) to an associated door controller. Additionally, or alternatively, the vehicle computing system 102 can cause the vehicle 104 to implement one or more vehicle settings (e.g., temperature, music, etc.) associated with the user 110 based at least in part on the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user starts boarding the vehicle 104 (ETSB)). For example, the vehicle computing system 102 can access a profile associated with the user 110 to identify the user's preferred vehicle settings and can provide one or more control signals to the appropriate systems onboard the vehicle 104 (e.g., temperature control system, sound system, etc.) to implement the vehicle settings. - At (718), the
method 700 can include determining an estimated time of interaction duration between the user and the vehicle. This estimated time can be indicative of the time it will take for the user 110 to interact with the vehicle 104. By way of example, the estimated time of interaction duration can include an estimated boarding duration (EBD) that is indicative of how long the user 110 may take to load him/herself, children, luggage, etc. into the vehicle 104, to securely fasten seatbelts, and/or to undertake other tasks (e.g., for a transportation service). In some implementations, the estimated time of interaction duration can be indicative of how long the user 110 may take to unsecure and remove an item from the vehicle 104 (e.g., for a delivery service). In some implementations, the estimated time of interaction duration can be indicative of how long the user 110 may take to securely place an item into the vehicle 104 (e.g., for a courier service). - To help determine the estimated time of interaction duration between the
user 110 and the vehicle 104, the vehicle computing system 102 can obtain data associated with the user 110. The vehicle computing system 102 can determine the estimated time of interaction duration (e.g., an estimated time of boarding duration for the user 110) based at least in part on the data associated with the user 110. The data associated with the user 110 can include, for example, data indicative of one or more preferences 720 of the user 110 and/or one or more vehicle service parameters 722 associated with the user 110. The preferences 720 can be indicative of the user's destination (e.g., airport, train station, etc.), service type, and/or other information specified by the user 110 (e.g., when requesting the vehicle service). The one or more vehicle service parameters 722 can be indicative of a number of passengers, a child's car seat request, the presence/amount of luggage, etc. The vehicle service parameters 722 can also be specified by the user 110 (e.g., when requesting the vehicle service). - In some implementations, the data associated with the
user 110 can include historic data 724 (e.g., indicative of a boarding behavior associated with one or more other users). For example, the historic data 724 can be indicative of historic wait time(s) associated with other users in the geographic area 200 and/or a greater region, similarly situated users, etc. In some implementations, the historic data 724 can be associated with the specific user 110. The historic data 724 can include, for example, previous correlations between changes in the signal strength of an identifier and a user's time to arriving at a vehicle. For example, the historic data 724 can indicate historic RSSI changes as a countdown to user arrival. - At (726), the
method 700 can include determining an estimated time of user arrival with the vehicle 104. The estimated time of user arrival can include, for example, an estimated time until the user completes boarding of the vehicle 104 (ETCB) (e.g., for a transportation service), an estimated time until the user 110 finishes retrieving an item from the vehicle 104 (e.g., for a delivery service), an estimated time until the user 110 finishes placing an item in the vehicle 104 (e.g., for a courier service), etc. The vehicle computing system 102 can determine an estimated time of user arrival based at least in part on the location data associated with the user device 138. The vehicle computing system 102 can determine the estimated time of user arrival based at least in part on the estimated time until the user 110 starts interaction with the vehicle 104 and the estimated time of interaction duration (e.g., a sum of these estimated times). By way of example, the vehicle computing system 102 can determine an estimated time until the user 110 completes boarding of the vehicle 104 based at least in part on the location data associated with the user device 138 and the data associated with the user 110. More particularly, the vehicle computing system 102 can determine an estimated time until the user 110 completes boarding of the vehicle 104 based at least in part on the estimated time until the user 110 starts boarding the vehicle 104 and the estimated time of boarding duration for the user 110. The vehicle computing system 102 can determine the estimated time until the user 110 completes boarding of the vehicle 104 via the addition of these two time estimates (e.g., ETCB=ETSB+EBD). In this way, the vehicle computing system 102 can determine the amount of time that the object(s) within its surroundings would be impacted as the vehicle 104 waits for the user 110 (e.g., how long other vehicles would be caused to stop while waiting for the user 110 to board the vehicle 104).
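The ETCB computation can be sketched directly from the relation ETCB = ETSB + EBD. The boarding-duration constants derived from the vehicle service parameters (passenger count, luggage, child seat) are illustrative assumptions, not values from the disclosure.

```python
def estimate_boarding_duration(passengers: int = 1, luggage_items: int = 0,
                               child_seat: bool = False) -> float:
    """Heuristic EBD in seconds from vehicle service parameters
    (number of passengers, amount of luggage, child's car seat request);
    all constants here are illustrative assumptions."""
    base_s, per_passenger_s, per_item_s, child_seat_s = 20.0, 10.0, 15.0, 60.0
    return (base_s
            + passengers * per_passenger_s
            + luggage_items * per_item_s
            + (child_seat_s if child_seat else 0.0))

def estimate_time_to_complete_boarding(etsb_s: float, ebd_s: float) -> float:
    """ETCB = ETSB + EBD: the time until the user starts boarding plus
    the boarding duration gives the total time that surrounding objects
    could be impacted while the vehicle waits."""
    return etsb_s + ebd_s
```

For a solo rider with no luggage, this sketch yields an EBD of 30 seconds; adding passengers, luggage, or a child seat extends the estimate accordingly.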
- In some implementations, in order to determine whether the estimated time of user arrival (e.g., the estimated time until the
user 110 completes boarding of the vehicle 104) is acceptable, the vehicle computing system 102 can compare the estimated time for interaction between the user 110 and the vehicle 104 (e.g., the estimated time until the user 110 completes boarding of the vehicle 104) to a time constraint 750. The time constraint 750 can be expressed as a time threshold (e.g., indicating an acceptable amount of stopping time) and/or cost data (e.g., cost functions expressing a cost in relation to stopping time). This can allow the vehicle computing system 102 to determine whether the amount of stopping time is acceptable. - Similar to the
traffic constraint 506, the time constraint 750 can be based on historic data (e.g., indicating historic wait times), real-time data (e.g., indicating that the vehicles are already waiting due to another traffic build-up in front of the autonomous vehicle), expectations of individuals in the geographic area, machine-learned model(s), and/or other information. For example, in the event that there is already a traffic jam in front of the vehicle 104, the time constraint 750 (e.g., indicative of an acceptable wait time) can be higher. - In some implementations, the
time constraint 750 can be determined at least in part from a model, such as a machine-learned model. For example, the machine-learned model can be or can otherwise include one or more various model(s) such as, for example, models using boosted random forest techniques, neural networks (e.g., deep neural networks), or other multi-layer non-linear models. Neural networks can include recurrent neural networks (e.g., long short-term memory recurrent neural networks), feed-forward neural networks, and/or other forms of neural networks. For instance, supervised training techniques can be performed to train the model (e.g., using historical wait time data) to determine a time constraint 750 based at least in part on input data. The input data can include, for example, the data described herein with reference to FIGS. 7 and 8A-C, driving characteristics of individuals in an associated geographic area, complaints received from operators of vehicles that were caused to stop by autonomous vehicles, etc. The machine-learned model can provide, as an output, data indicative of a recommended time constraint. In some implementations, the recommended time constraint can be specific to the geographic area 200. For example, the model can be trained based at least in part on data associated with the geographic area 200 such that the recommended time constraint is specific for that particular region. - Additionally, or alternatively, one or more time estimates of
FIG. 7 can be based at least in part on a model. The vehicle computing system 102 can obtain data descriptive of the model (e.g., a machine-learned model). The vehicle computing system 102 can provide input data to the model. The input data can include one or more of the types of data described herein with reference to FIGS. 7 and 8A-C. For example, the model can determine an estimated time of user arrival (e.g., an estimated time until the user 110 completes boarding of the vehicle 104 (ETCB)), an estimated time until the user 110 starts interaction with the vehicle 104 (e.g., an estimated time until the user 110 starts boarding the vehicle 104 (ETSB)), and/or an estimated time of interaction duration between the user 110 and the vehicle 104 (e.g., an estimated time of boarding duration for a user 110 (EBD)). For example, the estimated time of user arrival to the vehicle 104 (e.g., ETSB) can be based at least in part on the input data (e.g., the user's location, heading, distance, altitude, etc.). In some implementations, the model can analyze the input data with respect to the time constraint 750 to determine whether the estimated time of user arrival (e.g., the estimated time until the user 110 completes boarding of the vehicle 104 (ETCB)) is high or low, significant or insignificant, acceptable or unacceptable, etc. The output of such a model can be the estimated time for interaction between the user 110 and the vehicle 104 (e.g., the estimated time until the user 110 completes boarding of the vehicle 104 (ETCB)) and/or whether it is high or low, significant or insignificant, acceptable or unacceptable, etc. - In some implementations, the output of the model can be provided as an input to the model for another set of data (e.g., at a subsequent time step). In such fashion, confidence can be built that a determined time estimate is accurate.
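This feedback loop can be sketched as follows. The class and parameter names are hypothetical, and a simple exponential-smoothing update stands in for the recurrent or autoregressive model described in the disclosure:

```python
from dataclasses import dataclass

@dataclass
class ArrivalEstimator:
    """Illustrative sketch: feed the previous time estimate back in at
    each time step so the estimate is refined as new user-location
    observations arrive (names and smoothing rule are assumptions)."""
    smoothing: float = 0.5  # weight given to the newest observation

    def update(self, prev_estimate_s, observed_estimate_s):
        # First time step: no prior estimate to blend with.
        if prev_estimate_s is None:
            return observed_estimate_s
        # Blend the fed-back prior estimate with the new observation.
        return (1 - self.smoothing) * prev_estimate_s + self.smoothing * observed_estimate_s

estimator = ArrivalEstimator(smoothing=0.5)
estimate = None
# Observation-derived estimates (seconds) as the user walks toward the vehicle.
for observed in [120.0, 90.0, 80.0]:
    estimate = estimator.update(estimate, observed)
# estimate is now 92.5 seconds
```

Each call blends the fed-back prior estimate with the newest observation, so a noisy single reading moves the estimate only partway, which is the sense in which confidence in the estimate builds over successive time steps.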
Stated differently, in some implementations, the process can be iterative such that the time estimate can be recalculated over time as the estimate with respect to the user 110 becomes clearer. For example, the model can include one or more autoregressive models. In some implementations, the model can include one or more machine-learned recurrent neural networks. For example, recurrent neural networks can include long short-term memory recurrent neural networks, gated recurrent unit networks, or other forms of recurrent neural networks. - Returning to
FIG. 6, the vehicle computing system 102 can determine one or more vehicle actions based at least in part on at least one of the estimated traffic impact 504 or the estimated time of user arrival (e.g., one and/or both of the estimates). In some implementations, the vehicle computing system 102 can determine the vehicle action(s) based at least in part on the estimated traffic impact 504. By way of example, the vehicle action(s) can include stopping within the vicinity 206 of the location 202 associated with the user 110 (e.g., at least partially in the travel way 600). In the event that the level of traffic (e.g., the number of other vehicles that would be impacted by an in-lane stop) is below a traffic threshold (e.g., the estimated traffic impact 504 is low), the vehicle computing system 102 can determine that the vehicle 104 can stop within the travel way 600 to wait for the user 110 to arrive at the vehicle 104. In some implementations, the vehicle computing system 102 can determine the vehicle action(s) based at least in part on the estimated time of user arrival. For instance, in the event that the estimated time of user arrival 702 is below the time constraint 750 (e.g., the estimated time of user arrival is low), the vehicle computing system 102 can determine that the vehicle 104 can stop at least partially within the travel way 600. Such a stop can occur, for example, as close as possible (e.g., for the vehicle 104) to the location 202 associated with the user 110. - In some implementations, the
vehicle computing system 102 can base its determination to stop at least partially within the travel way 600 on both the estimated traffic impact 504 and the estimated time of user arrival. For instance, the vehicle computing system 102 can weigh each of these estimates to determine whether it would be appropriate for the vehicle 104 to stop at least partially in the travel way 600 to wait for the user 110. The vehicle computing system 102 can apply a first weighting factor to the estimated traffic impact 504 and a second weighting factor to the estimated time of user arrival. The first weighting factor can be different than the second weighting factor. In some implementations, the first weighting factor can be inversely related to the second weighting factor. An example equation can include: ETI*w1+ETUR*w2, where “ETI” is the estimated traffic impact 504, “w1” is the first weighting factor (e.g., 0 to 1), “ETUR” is the estimated time to user arrival 702, and “w2” is the second weighting factor (e.g., 0 to 1). By way of example, the estimated traffic impact 504 may be high while the estimated time of user arrival may be short. Accordingly, the vehicle computing system 102 may determine that it can stop within the travel way 600 because although a higher number of objects (e.g., other vehicles) may be caused to stop, it would only be for a short period of time because the user 110 is close in distance to and/or quickly heading toward the vehicle 104. In such a case, the estimated time of user arrival can be given a greater weight than the estimated traffic impact 504. In another example, the estimated impact on traffic 504 may be low while the estimated time of user arrival may be long.
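The weighted combination ETI*w1+ETUR*w2 described above can be sketched as a minimal illustration. The function name, the normalization of both estimates to a 0-to-1 scale, and the decision threshold are assumptions for the sketch, not values given in the disclosure:

```python
def should_stop_in_travel_way(eti, etur, w1, w2, threshold):
    """Return True if the weighted score favors stopping in the travel way.

    eti:  estimated traffic impact (assumed normalized to 0..1)
    etur: estimated time to user arrival (assumed normalized to 0..1)
    w1, w2: weighting factors in [0, 1] (w2 may be chosen as 1 - w1)
    threshold: assumed cutoff below which an in-lane stop is acceptable
    """
    score = eti * w1 + etur * w2
    return score <= threshold

# High traffic impact but a short arrival time: arrival is weighted more,
# so the score stays low and the vehicle may stop in the travel way.
decision = should_stop_in_travel_way(eti=0.8, etur=0.1, w1=0.3, w2=0.7, threshold=0.5)
```

With these assumed weights, a short arrival time can outweigh a high traffic impact (and a long arrival time can outweigh a low traffic impact), mirroring the two examples in the surrounding text.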
Accordingly, the vehicle computing system 102 can determine that it should not stop within the travel way 600 because although only a few objects (e.g., other vehicles) may be caused to stop, it would be for a greater period of time because the user 110 is farther from (and/or moving slowly, moving away from, etc.) the vehicle 104. In such a case, the estimated time of user arrival can be given a greater weight than the estimated traffic impact 504. In some implementations, the first and second weighting factors can be manually and/or automatically adjusted depending on the circumstances (e.g., a VIP user is being provided the vehicle services) and/or the geographic area 200. - In some implementations, the estimated
traffic impact 504 can be adjusted based at least in part on the estimated time to user arrival 702. For instance, in the event that the estimated time to user arrival 702 is long, the estimated traffic impact 504 can be higher. - With reference again to
FIG. 2, the vehicle computing system 102 can also, or alternatively, determine that the vehicle 104 is to enter into a holding pattern. For instance, the vehicle action(s) can include traveling along a second vehicle route 208 (e.g., an optimal holding pattern route). In some implementations, the vehicle computing system 102 can cause the vehicle 104 to enter into a particular operating mode in which the vehicle 104 implements a holding pattern (e.g., a holding pattern operating mode). The vehicle 104 may be unable to find a parking location before and/or after travelling past the location 202 associated with the user 110. Additionally, the vehicle 104 may determine that it should not stop within a travel way 600 to wait for the user's arrival, as described herein. Thus, the vehicle 104 can be re-routed along a second vehicle route 208 that is at least partially different than the first vehicle route. The second vehicle route 208 can be a path along which the vehicle 104 can travel to re-arrive within the vicinity 206 of the location 202 of the user 110. For example, the second vehicle route 208 can be a path along which the vehicle 104 can travel around a block, back to the location associated with the user. In some implementations, such a path may be similar to and/or the same as a portion of the first vehicle route 204 (e.g., along the street associated with the user 110). In some implementations, the second vehicle route 208 can be completely different from the first vehicle route 204 such that no portion of the second vehicle route 208 overlaps with the first vehicle route 204. - The determination of the
second vehicle route 208 can be based on a variety of information. For example, FIG. 9 depicts example information 900 associated with a second vehicle route 208 according to example embodiments of the present disclosure. In some implementations, the vehicle computing system 102 can determine the second vehicle route 208 based at least in part on such information. In some implementations, the second vehicle route 208 can be determined off-board the vehicle 104 by another computing system (e.g., the operations computing system 106) and data indicative of the second vehicle route 208 can be provided to the vehicle computing system 102. - The
second vehicle route 208 can be determined based at least in part on current and/or historic traffic data 902A. For example, the second vehicle route 208 can be determined to implement the route that will allow the vehicle 104 to arrive back within the vicinity 206 of the location 202 of the user 110 within the shortest amount of time and/or distance. The second vehicle route 208 can take into account the current traffic (and/or historic traffic patterns) within the geographic area 200 such that the vehicle 104 is minimally impeded by such traffic (e.g., such that the second vehicle route 208 is the fastest and/or shortest vehicle route to the location 202 of the user 110). - Additionally, or alternatively, the
second vehicle route 208 can be determined based at least in part on map data 902B. For example, the map data 902B can be used to determine the path (e.g., roads, other terrain, etc.) the vehicle 104 is to travel along to arrive back within the vicinity 206 of the user 110. - In some implementations, the
second vehicle route 208 can be based on data 902C associated with other vehicle(s) in the geographic area 200. For instance, the data 902C associated with other vehicle(s) can include additional traffic data associated with the geographic area 200 (e.g., indicating a certain road is impeded by heavy traffic). The data 902C associated with the other vehicle(s) can also include the location of such vehicles. For example, the vehicle computing system 102 and/or the operations computing system 106 can determine the second vehicle route 208 (e.g., optimal vehicle holding pattern) by processing map data and traffic data to establish an estimated time back to the location 202 associated with the user 110. If another vehicle (e.g., another autonomous vehicle in an associated fleet) can arrive at the location 202 associated with the user 110 in a shorter time period than the vehicle 104, the other vehicle (rather than the vehicle 104) can be routed to the location 202. If the vehicle 104 would arrive at the location 202 the fastest, then the vehicle 104 can be routed in accordance with the second vehicle route 208. In some implementations, the second vehicle route 208 can be based on a model, such as a machine-learned model, in a manner similar to that described herein with respect to the estimated traffic impact 504, the estimated time to user arrival 702, etc. - In some implementations, the
vehicle computing system 102 can determine that the vehicle 104 can stop within the travel way 600, but later determine that the vehicle 104 should begin to travel again (e.g., according to a holding pattern route). For example, the vehicle computing system 102 can determine that it would be appropriate for the vehicle 104 to stop at least partially within the travel way 600 to wait for the user 110 based at least in part on the estimated traffic impact 504 and/or the estimated time of user arrival, as described herein. The vehicle computing system 102 can be configured to update (e.g., continuously, periodically, as scheduled, in real-time, in near real-time, etc.) the estimated traffic impact 504 and/or the estimated time of user arrival. For example, while the vehicle 104 is stopped, the traffic impact may increase (e.g., due to an increase in the number of other vehicle(s) stopped behind the vehicle 104) and/or the user 110 may take longer than estimated to arrive at the vehicle 104. The vehicle computing system 102 can determine at least one of an updated estimated traffic impact (e.g., based on the number of vehicles that have already stopped and/or additional vehicles that may be caused to stop) or an updated estimated time of user arrival (e.g., based on a change in the user device location data, if any). The vehicle computing system 102 can determine that the vehicle 104 can no longer remain stopped to wait for the user 110 based at least in part on at least one of the updated estimated traffic impact or the updated estimated time of user arrival. Accordingly, the vehicle computing system 102 can cause the vehicle 104 to travel along the second vehicle route 208 based at least in part on at least one of the updated estimated traffic impact or the updated estimated time of user arrival. - The
vehicle computing system 102 can cause the vehicle 104 to perform one or more vehicle action(s). As described herein, the vehicle action(s) can include at least one of stopping the vehicle 104 (e.g., at least partially within the travel way 600) within the vicinity 206 of the location 202 associated with the user 110 or travelling along a second vehicle route 208. In the event that the one or more vehicle actions include stopping the vehicle 104 within the vicinity 206 of the location 202 associated with the user 110, the vehicle computing system 102 can cause the vehicle 104 to stop. For example, the vehicle computing system 102 can provide one or more control signals to a control system 116 of the vehicle 104 (e.g., braking control system) to cause the vehicle 104 to decelerate to a stopped position that is at least partially in a travel way 600 within the vicinity 206 of the location 202 associated with the user 110. In the event that the vehicle computing system 102 has determined that the vehicle 104 is to travel along a second vehicle route 208 (e.g., in accordance with the holding pattern), the vehicle computing system 102 can obtain data associated with the second vehicle route 208 and implement the second vehicle route 208 accordingly. For example, the vehicle computing system 102 can request and obtain data indicative of the second vehicle route 208 from the operations computing system 106. Additionally, or alternatively, the vehicle computing system 102 can determine the second vehicle route 208 onboard the vehicle 104. The vehicle computing system 102 can provide one or more control signals to cause the vehicle 104 to implement a motion plan that causes the vehicle 104 to travel in accordance with the second vehicle route 208 (e.g., to implement one or more vehicle trajectories in accordance with the second vehicle route 208). - The
vehicle computing system 102 can provide the user 110 with one or more communications indicating the actions performed by (or to be performed by) the vehicle 104. For example, in the event that the vehicle 104 does not find a parking location and does not stop in the travel way, the vehicle computing system 102 can provide, via the communication system 136, a communication to the user device 138 associated with the user 110. The communication can include, for example, a textual message, auditory message, etc. that indicates the vehicle actions (e.g., “I could not locate you at the pin drop, traffic forced me to go around the block. Please proceed to the pin drop”). - In the event that the
vehicle 104 stops at least partially within the travel way 600, the vehicle computing system 102 can provide a communication (e.g., data) to a user device 138 associated with the user 110. The communication can indicate that the vehicle 104 is travelling to return to the location 202 associated with the user 110. In response to receiving such a communication, the user device 138 associated with the user 110 can display a user interface indicative of the communication. For example, FIG. 10 depicts an example display device 1000 with an example communication 1002 according to example embodiments of the present disclosure. The display device 1000 (e.g., display screen) can be associated with the user device 138 associated with the user 110. The communication 1002 can be presented via a user interface 1004 on the display device 1000. The communication 1002 can indicate that the vehicle 104 has arrived and is waiting in the travel way (e.g., in a current traffic lane). In some implementations, the communication 1002 and/or another portion of the user interface 1004 can be indicative of a location of the vehicle 104. For example, the display device 1000 can display a map user interface 1006 that includes a user route 1008. The user route 1008 can be a route along which a user 110 can travel to arrive at the vehicle 104. - Returning to
FIG. 2, in some implementations, the vehicle 104 may be relieved of its responsibility to provide a vehicle service to the user 110. For instance, in the event that a vehicle computing system 102 (and/or operations computing system 106) determines that a vehicle 104 is to travel along the second vehicle route 208, such computing system(s) can determine whether it would be advantageous (e.g., more time efficient, more fuel efficient, etc.) for another vehicle 210 (e.g., another autonomous vehicle) within the geographic area 200 to be routed to the user 110. In the event that it would be advantageous (e.g., because the other vehicle 210 can arrive sooner), the operations computing system 106 can provide data to the vehicle 104 indicating that the vehicle 104 is no longer responsible for the request 140. The other vehicle 210 can be routed to the user 110 in the manner described herein. Additionally, or alternatively, the operations computing system 106 (and/or the vehicle computing system 102 of the vehicle 104) can re-route the vehicle 104 to provide a vehicle service to another user. - In some implementations, the
vehicle computing system 102 can cancel the request 140 associated with the user 110. By way of example, the vehicle 104 may be caused to re-route (e.g., circle the block) a certain number of times and/or the user 110 may not arrive at the vehicle 104 within a certain timeframe. The vehicle computing system 102 can determine whether to cancel the request 140 based at least in part on a vehicle service cancellation threshold. -
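A minimal sketch of such a cancellation check, assuming the threshold is expressed as both a maximum holding-pattern time and a maximum holding-pattern distance (the function name, units, and sample values are illustrative assumptions):

```python
def should_cancel_request(holding_time_s, holding_distance_m,
                          max_holding_time_s, max_holding_distance_m,
                          user_arrived):
    """Cancel only if the user has not arrived and the vehicle has
    exceeded either component of the cancellation threshold."""
    if user_arrived:
        return False
    return (holding_time_s > max_holding_time_s
            or holding_distance_m > max_holding_distance_m)

# Three loops around the block (~900 s, ~2,400 m) against an assumed
# 600 s / 3,000 m threshold: the time component is exceeded.
cancel = should_cancel_request(900, 2400, 600, 3000, user_arrived=False)
```

Treating time and distance as independent limits reflects the text's "time and/or distance" framing: exceeding either one is sufficient grounds to cancel.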
FIG. 11 depicts example information 1100 associated with a vehicle service cancellation threshold 1102 according to example embodiments of the present disclosure. The vehicle service cancellation threshold 1102 can be indicative of a threshold time and/or distance that the vehicle 104 is in the holding pattern. For instance, the vehicle service cancellation threshold 1102 can be indicative of a time between when the vehicle 104 initially arrived within a vicinity 206 of the user 110 (and/or passed the location 202) and the current time. Additionally, or alternatively, the vehicle service cancellation threshold 1102 can be indicative of a distance travelled by the vehicle 104 while in a holding pattern (e.g., the number of times the vehicle is re-routed to arrive at the user 110). The vehicle service cancellation threshold 1102 can be determined and updated continuously, periodically, as scheduled, on request, in real-time, in near real-time, etc. (e.g., per trip, while on a trip, etc.). The vehicle service cancellation threshold 1102 can be determined by the vehicle computing system 102 and/or off-board the vehicle 104 and provided to the vehicle computing system 102. - The
information 1100 can include vehicle service demand data 1104A, historic vehicle service data 1104B, geographic area preferences 1104C, data 1104D associated with other vehicle(s) within the geographic area, user selected holding patterns 1104E, and/or other types of information. The vehicle service demand data 1104A can include a current level of demand (e.g., number of current service requests) for vehicle services (e.g., within the geographic area 200). Additionally, or alternatively, the vehicle service demand data 1104A can include a historic level of demand for vehicle services at a certain time, day, etc. (e.g., within the geographic area, a similarly situated area, etc.). In the event that the demand is lower, the vehicle service cancellation threshold 1102 can be higher (e.g., because the vehicle 104 may not be needed for other vehicle service requests). In the event that the demand is higher, the vehicle service cancellation threshold 1102 can be lower (e.g., because the vehicle 104 may be needed for other vehicle service requests). - The historic
vehicle service data 1104B can be associated with the specific user 110 and/or one or more other user(s). For example, the historic vehicle service data 1104B can indicate that the user 110 typically takes a longer amount of time to arrive at the vehicle 104 (e.g., due to a disability). As such, the vehicle service cancellation threshold 1102 can be higher in order to cause the vehicle 104 to remain in the holding pattern longer, thereby giving the user 110 a greater opportunity to arrive at the vehicle 104. Additionally, or alternatively, the historic vehicle service data 1104B can indicate that it generally takes user(s) within the geographic area 200 longer to arrive at the vehicle 104. As such, the vehicle service cancellation threshold 1102 may be higher. In the event that the historic vehicle service data 1104B indicates that the user 110 (and/or other user(s)) typically arrives at the vehicle 104 in a relatively short timeframe, the vehicle service cancellation threshold 1102 may be lower. - The geographic area preferences 1104C can be descriptive of the preferences associated with a geographic area 200 (e.g., as indicated by the managers of the geographic area 200). For example, the
geographic area 200 may prefer that a vehicle 104 only remain in a holding pattern (e.g., circle the block) for a certain time period and/or distance so as not to affect local traffic. - In some implementations, the vehicle
service cancellation threshold 1102 can be based at least in part on data 1104D associated with one or more other vehicles (e.g., other autonomous vehicles in an associated fleet). For example, the data 1104D can be indicative of the location(s) of other vehicle(s) (e.g., within the geographic area 200) and/or whether the other vehicle(s) are available to provide a vehicle service (e.g., whether or not the other vehicle is assigned to a service request, currently providing a vehicle service, etc.). In the event that another vehicle is located close to the location 202 associated with the user 110, the vehicle service cancellation threshold 1102 may be lower (e.g., because the user 110 can be serviced by the other vehicle in the event a new request is made after cancellation). In the event that another vehicle is not located close to the location 202 associated with the user 110, the vehicle service cancellation threshold 1102 may be higher (e.g., because another vehicle is not readily available in the event the user 110 makes a new request after cancellation). - In some implementations, the vehicle
service cancellation threshold 1102 can be based at least in part on user selected holding patterns 1104E. The user selected holding patterns 1104E can include data indicative of a vehicle service cancellation threshold 1102 selected by a user 110. For example, the user 110 can purchase (e.g., via a user interface associated with a software application) a higher vehicle service cancellation threshold 1102, such that the vehicle 104 will remain in the holding pattern (e.g., circle the block) for a longer time/distance before the vehicle service request is cancelled. In some implementations, a user 110 can have a higher vehicle service cancellation threshold 1102 due to a higher user rating, specialized treatment (e.g., frequent user), and/or based on other conditions. - In some implementations, the vehicle
service cancellation threshold 1102 can be based at least in part on a model, such as a machine-learned model. For example, the model can be trained based on previously obtained information 1100 and labeled data indicative of the vehicle service cancellation thresholds associated therewith. In a manner similar to that described herein for the estimated traffic impact 504 and the estimated time of user arrival, the vehicle computing system 102 (or other computing system) can provide input data (e.g., the information 1100) into such a model and receive, as an output, a recommended vehicle service cancellation threshold. - The
vehicle computing system 102 can cancel the request 140 associated with the user 110 in the event that the user 110 has not arrived at the vehicle 104 and the vehicle 104 has exceeded the vehicle service cancellation threshold 1102. In response, the vehicle computing system 102 can provide, to the operations computing system 106 (and/or one or more other computing devices that are remote from the vehicle computing system 102), data indicating that the request 140 for the vehicle service provided by the vehicle 104 is cancelled. In some implementations, such data can request the cancellation of the user's service request 140. The operations computing system 106 can cancel the service request 140 (and inform the user 110 accordingly) and/or re-route the vehicle 104 to provide a vehicle service to another user. In some implementations, the vehicle computing system 102 can communicate directly with a user device 138 associated with the user 110 to cancel the service request 140 and/or inform the user 110 of the vehicle service cancellation. The vehicle computing system 102 can report such a cancellation to the operations computing system 106. -
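The factor-driven raising and lowering of the cancellation threshold described above (demand, nearby fleet vehicles, user-selected holding patterns) can be sketched as follows. The multipliers, names, and baseline value are illustrative assumptions, not values from the disclosure:

```python
def adjusted_cancellation_threshold_s(baseline_s, demand_high,
                                      other_vehicle_nearby,
                                      user_purchased_extension):
    """Illustrative sketch: adjust a baseline threshold (seconds) using
    the qualitative factors described in the text."""
    threshold = baseline_s
    if demand_high:
        threshold *= 0.75       # vehicle may be needed for other requests
    if other_vehicle_nearby:
        threshold *= 0.75       # another vehicle could service a new request
    if user_purchased_extension:
        threshold *= 1.5        # user-selected longer holding pattern
    return threshold

threshold = adjusted_cancellation_threshold_s(
    baseline_s=600, demand_high=True,
    other_vehicle_nearby=False, user_purchased_extension=True)
# 600 * 0.75 * 1.5 = 675.0 seconds
```

Each factor pushes the threshold in the direction the text describes: high demand and nearby available vehicles lower it, while a user-purchased extension raises it.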
FIG. 12 depicts a flow diagram of an example method 1200 of controlling autonomous vehicles according to example embodiments of the present disclosure. One or more portion(s) of the method 1200 can be implemented by one or more computing devices such as, for example, the one or more computing device(s) of the vehicle computing system 102 and/or other systems. Each respective portion of the method 1200 (e.g., 1202-1222) can be performed by any (or any combination) of the one or more computing devices. Moreover, one or more portion(s) of the method 1200 can be implemented as an algorithm on the hardware components of the device(s) described herein (e.g., as in FIGS. 1 and 14), for example, to control an autonomous vehicle. FIG. 12 depicts elements performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the elements of any of the methods discussed herein can be adapted, rearranged, expanded, omitted, combined, and/or modified in various ways without deviating from the scope of the present disclosure. - At (1202), the
method 1200 can include obtaining data indicative of a location associated with a user. For instance, the vehicle computing system 102 can obtain data 142 indicative of a location 202 associated with a user 110 to which the vehicle 104 is to travel. The vehicle 104 is to travel along a first vehicle route 204 that leads to the location 202 associated with the user 110. As described herein, the user 110 can be associated with a request 140 for a vehicle service. The vehicle computing system 102 and/or the operations computing system 106 can determine the first vehicle route 204 so that the vehicle 104 can travel to the user 110 to provide the user 110 with the requested vehicle service (e.g., pick up the user 110 for a transportation service, deliver an item for a delivery service, receive an item for a courier service, and/or provide another service). - At (1204), the
method 1200 can include travelling along a first vehicle route. A human operator may not be located within thevehicle 104. Thevehicle computing system 102 can provide one or more control signals to themotion planning system 128 and/or the vehicle'scontrol systems 116 to cause thevehicle 104 to plan its motion and/or implement a motion plan to travel in accordance with first vehicle route 204 (e.g., autonomously, without input from a human operator to the vehicle 104). - The
first vehicle route 204 can bring thevehicle 104 within thevicinity 206 of thelocation 202 associated with theuser 110. The vicinity of thelocation 202 associated with theuser 110 can be defined at least in part by a distance from thelocation 202 associated with theuser 110. In some implementations, the distance from thelocation 202 associated with theuser 110 can be based at least in part on anacceptable walking distance 302 from thelocation 202 associated with theuser 110, as described herein. - At (1206), the
method 1200 can include determining whether a parking location out of a travel way is available for the vehicle. For instance, thevehicle computing system 102 can determine whether a parking location within thevicinity 206 of thelocation 202 associated with theuser 110 is available for the vehicle 104 (e.g., based onsensor data 118,map data 120, etc.). Thevehicle computing system 102 may search for a parking location while thevehicle 104 is in anapproach operating mode 108D, as described herein. Thevehicle computing system 102 may search for a parking location before and/or after thevehicle 104 passes thelocation 202 associated with theuser 110. In the event that thevehicle computing system 102 is able to identify an available parking location, themethod 1200 can include sending a communication to auser 110 to indicate that thevehicle 104 has arrived and the location of thevehicle 104. Once the user has arrived at thevehicle 104, thevehicle 104 can provide the vehicle service to theuser 110. In some implementations, thevehicle computing system 102 can determine that a parking location that is out of thetravel way 600 is unavailable for thevehicle 104. - At (1210), the
method 1200 can include obtaining traffic data associated with the geographic area that includes the location associated with the user. Thevehicle computing system 102 can obtaintraffic data 500 associated with ageographic area 200 that includes thelocation 202 associated with theuser 110. Thetraffic data 500 can be associated with thevicinity 206 of the location 202 (e.g., a block, neighborhood, etc. where theuser 110 is located) and/or other portions of thegeographic area 200. For instance, thevehicle computing system 102 can obtain, via one ormore sensors 112 of thevehicle 104,sensor data 118 associated with the surrounding environment of thevehicle 104 that is within thevicinity 206 of thelocation 202 associated with theuser 110. Thevehicle computing system 102 can determine a level of traffic based at least in part on thesensor data 118, as described herein. - At (1212), the
method 1200 can include determining an estimated traffic impact. For instance, thevehicle computing system 102 can determine an estimatedtraffic impact 504 of thevehicle 104 on thegeographic area 200 based at least in part on thetraffic data 500. The estimatedtraffic impact 504 can be indicative of an estimated impact of thevehicle 104 on one or more objects within a surrounding environment of thevehicle 104 in the event that thevehicle 104 were to stop at least partially in the travel way 600 (e.g., a current lane 602) within thevicinity 206 of thelocation 202 associated with theuser 110. In some implementations, to determine the estimatedtraffic impact 504, thevehicle computing system 102 can compare the level of traffic (e.g., determined based at least in part on the sensor data, other traffic data) to atraffic constraint 506. Thetraffic constraint 506 can include a traffic threshold indicative of a threshold level of traffic. In some implementations, the traffic constraint can be determined at least in part from a machine-learned model, as described herein. - At (1214), the
method 1200 can include obtaining location data associated with a user. For instance, the vehicle computing system 102 can obtain location data associated with a user device 138 (e.g., mobile device) associated with the user 110, as described herein. The location data associated with the user device 138 can be indicative of one or more locations of the user device 138 associated with the user 110 at one or more times. - At (1216), the
method 1200 can include determining an estimated time of user arrival. For instance, the vehicle computing system 102 can determine an estimated time of user arrival based at least in part on the location data associated with the user device 138. The estimated time of user arrival can be indicative of an estimated time at which the user 110 will arrive at the vehicle 104. - At (1218), the
method 1200 can include determining one or more vehicle actions based at least in part on the estimated traffic impact and/or the estimated time of user arrival. For instance, the vehicle computing system 102 can determine one or more vehicle actions based at least in part on the estimated traffic impact 504. The vehicle computing system 102 can determine the one or more vehicle actions also, or alternatively, based at least in part on the estimated time of user arrival. The one or more vehicle actions can include at least one of stopping the vehicle 104 at least partially in a travel way 600 within a vicinity 206 of the location 202 associated with the user 110 or travelling along a second vehicle route 208 (e.g., entering into a vehicle holding pattern). The second vehicle route 208 can be at least partially different from the first vehicle route 204. The second vehicle route 208 can include a route that leads to the location 202 associated with the user 110 (or at least to a vicinity 206 of the location 202). - At (1220), the
method 1200 can include causing the vehicle to perform the one or more vehicle actions. For instance, the vehicle computing system 102 can cause the vehicle 104 to perform the one or more vehicle actions. To do so, the vehicle computing system 102 can provide one or more control signals to one or more systems onboard the vehicle 104 to cause the vehicle 104 to perform the vehicle action(s) (e.g., to stop in the travel way 600, enter the vehicle holding pattern). - At (1222), the
method 1200 can include providing a communication to the user. For instance, the vehicle computing system 102 can provide a communication to a user device 138 associated with the user 110 that is indicative of a vehicle action. By way of example, in the event that the vehicle 104 stops at least partially within a travel way 600 within the vicinity 206 of the location 202 associated with the user 110, the vehicle computing system 102 can provide, to the user device 138 associated with the user 110, a communication indicating that the vehicle 104 is stopped (and/or will stop). In response to receiving the communication, the user device 138 can display a map user interface 1006 that indicates a vehicle location of the vehicle 104 and a user route 1008 to the vehicle location of the vehicle 104. In the event that the vehicle 104 travels in accordance with the second vehicle route 208, the vehicle computing system can provide a communication to the user device 138 associated with the user 110 indicating that the vehicle 104 is travelling to return to the location 202 associated with the user 110. -
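The action selection described at (1212) through (1220) amounts to a comparison of the level of traffic against the traffic constraint 506, weighed against the estimated time of user arrival. The following is a minimal illustrative sketch of that rule; the function name, threshold values, and action labels are assumptions introduced only for illustration and are not part of the disclosed system.

```python
# Illustrative sketch (names and thresholds are assumptions) of choosing
# a vehicle action from an estimated traffic impact and an estimated
# time of user arrival, as in steps (1212)-(1220) of method 1200.

def choose_vehicle_action(level_of_traffic: float,
                          traffic_threshold: float,
                          est_user_arrival_s: float,
                          max_in_lane_wait_s: float = 30.0) -> str:
    """Return 'stop_in_travel_way' or 'second_route' (holding pattern)."""
    # Compare the level of traffic to the traffic constraint (threshold).
    high_impact = level_of_traffic > traffic_threshold
    if not high_impact:
        # Low estimated impact: stopping briefly in the travel way is acceptable.
        return "stop_in_travel_way"
    if est_user_arrival_s <= max_in_lane_wait_s:
        # High impact but the user is nearly there: stop despite traffic build-up.
        return "stop_in_travel_way"
    # High impact and a distant user: travel along the second vehicle route.
    return "second_route"

assert choose_vehicle_action(0.2, 0.5, 120.0) == "stop_in_travel_way"
assert choose_vehicle_action(0.9, 0.5, 10.0) == "stop_in_travel_way"
assert choose_vehicle_action(0.9, 0.5, 120.0) == "second_route"
```

In practice such a rule would be re-evaluated continuously as new sensor and location data arrive, rather than decided once.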
FIGS. 13A-B depict a flow diagram of an example method 1300 of controlling autonomous vehicles according to example embodiments of the present disclosure. One or more portion(s) of the method 1300 can be implemented by one or more computing devices such as, for example, the one or more computing device(s) of the vehicle computing system 102 and/or other systems. Each respective portion of the method 1300 (e.g., 1302-1346) can be performed by any (or any combination) of the one or more computing devices. Moreover, one or more portion(s) of the method 1300 can be implemented as an algorithm on the hardware components of the device(s) described herein (e.g., as in FIGS. 1 and 14), for example, to control an autonomous vehicle. FIGS. 13A-B depict elements performed in a particular order for purposes of illustration and discussion. - At (1302) of
FIG. 13A, the vehicle 104 can enter into an approach operating mode 108D. For example, a vehicle computing system 102 can obtain data 142 indicative of a location 202 associated with a user 110. The vehicle computing system 102 can cause the vehicle 104 to travel in accordance with a first vehicle route 204 to arrive within a vicinity 206 of the location 202 associated with the user 110. The vehicle 104 can enter into the approach mode 108D, for example, when it is within the vicinity 206 of the location 202. The vehicle 104 can approach the location 202 associated with the user 110 in the approach operating mode 108D. Moreover, the vehicle computing system 102 can send accurate (e.g., localized) approach data (e.g., through a network) to a software application running on a user device 138 associated with the user 110. The software application can cause the user device 138 to display a user interface via a display device. The user interface can display a map interface with the user's position (e.g., based on GPS) and a precise location approach (e.g., of the vehicle 104), as well as a target location of the vehicle 104 to meet the user 110. The user interface can also alert the user 110 that the vehicle 104 is arriving (e.g., "your vehicle is arriving, please prepare to board"). - In some implementations, the
vehicle 104 can include an outwardly visible lighting element, such as a number or array of LED lights capable of producing rapid flash patterns. For example, the lighting element can be located within the interior and viewable through the front of the vehicle (e.g., windshield), and/or can be located on the exterior of the vehicle 104. When the vehicle 104 is assigned (or accepts) a service request, the operations computing system 106 can transmit a flash code to the vehicle 104 and the requesting user device 138. As the vehicle 104 approaches the pick-up location, the vehicle 104 can output the flash code using the lighting element. The requesting user 110 can be prompted to hold up the user device 138 so that a camera or the camera lens of the user device 138 is pointed towards the vehicle 104 and can detect the flash code (e.g., the camera can be pointed towards the vehicle 104 and the display screen of the mobile device can display a viewfinder or preview of the imagery detected or captured by the camera). Upon detecting the flash code from the vehicle 104, the user device 138 can determine whether the flash code matches the flash code provided by the operations computing system 106 (e.g., utilizing a perception algorithm). If so, the user device 138 can display an indicator, such as a circle or a highlight for the vehicle 104, so that the requesting user 110 can readily identify the vehicle 104. - While in the
approach operating mode 108D, the vehicle 104 can search for an available parking location that is out of a travel way 600 (e.g., out of a current travel lane 602), at (1304). For example, the vehicle computing system 102 can search for an out-of-lane parking location when the vehicle 104 is within a certain distance (e.g., an acceptable walking distance 302) from the location 202 associated with the user 110. To do so, the vehicle computing system 102 can utilize the vehicle's sensor(s), as described herein. - In the event an out-of-lane parking location is found, the
vehicle computing system 102 can cause the vehicle 104 to park (e.g., autonomously, without user input), at (1306), and send a communication to the user, as described herein. In some implementations, if the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user 110 starts boarding the vehicle 104) is low (e.g., less than a few seconds, 1, 2, 3 s, etc.), the vehicle doors can be unlocked and/or user specific vehicle settings (e.g., music, temperature, seat position, etc.) can be implemented, at (1307) (e.g., because boarding is imminent). If the user 110 arrives within a certain time frame (e.g., starts and/or completes boarding, item retrieval, item placement, etc. within "X" time), at (1308), the vehicle 104 can start to provide the user 110 with a vehicle service (e.g., transport the user 110 to a destination location), at (1310). However, if the user 110 does not arrive (e.g., start and/or complete boarding, item retrieval, item placement, etc.) at the vehicle 104 within the time frame, the vehicle computing system 102 can again contact the user 110, at (1312). The contact can be facilitated by the vehicle computing system 102 providing a communication to the user 110 via a software application on the user device 138 (e.g., using a suitable communication protocol). If unsuccessful (e.g., the user 110 is unresponsive, does not travel to the vehicle 104), the vehicle computing system 102 can cancel the vehicle service, at (1314), as described herein. However, if the contact is successful, the vehicle 104 can wait another timeframe (e.g., "Y" time) for the user 110 to arrive (e.g., start and/or complete boarding, item retrieval, item placement, etc.), at (1316). The vehicle computing system 102 can cause the vehicle 104 to provide the vehicle service to the user 110, in the event that the user 110 arrives at the vehicle 104 (e.g., boards the vehicle 104).
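The parked-vehicle branch at (1306) through (1318) is a small decision sequence: wait "X" time, re-contact the user, then wait "Y" time before holding or cancelling. It might be sketched as follows; the function and outcome names are hypothetical, and a real system would evaluate these conditions continuously over time rather than as fixed booleans.

```python
# Hypothetical sketch (names are assumptions) of the parked-vehicle
# wait flow at steps (1306)-(1318) of method 1300.

def parked_wait_flow(arrived_within_x: bool,
                     contact_succeeded: bool,
                     arrived_within_y: bool) -> str:
    """Return the outcome of waiting for the user after parking out of lane."""
    if arrived_within_x:
        return "provide_service"      # (1310): user arrived within "X" time
    if not contact_succeeded:         # (1312): re-contact attempt failed
        return "cancel_service"       # (1314)
    if arrived_within_y:              # (1316): user arrived within "Y" time
        return "provide_service"
    return "holding_pattern"          # (1318): travel second vehicle route 208

assert parked_wait_flow(True, False, False) == "provide_service"
assert parked_wait_flow(False, False, False) == "cancel_service"
assert parked_wait_flow(False, True, True) == "provide_service"
assert parked_wait_flow(False, True, False) == "holding_pattern"
```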
In the event the user 110 does not arrive within "Y" time (e.g., start and/or complete boarding, item retrieval, item placement, etc.), the vehicle 104 can enter into the holding pattern and travel in accordance with the second vehicle route 208, at (1318). - In the event that an out-of-lane parking location is not found, the
vehicle computing system 102 can determine at least one of an estimated traffic impact 504 of the vehicle 104 on the geographic area 200 based at least in part on traffic data 500, at (1320), or an estimated time of user arrival based at least in part on the location data associated with the user device 138, as described herein, at (1322). The vehicle computing system 102 can determine the estimated traffic impact 504 to determine if and how long to wait in-lane for the user 110 to arrive at the vehicle 104. As the vehicle 104 moves towards the location 202 associated with the user 110, the vehicle computing system 102 can determine whether the vehicle 104 should stop at the location 202 based at least in part on at least one of the estimated traffic impact 504 or the estimated time to user arrival. For example, the vehicle computing system 102 can start sensing the traffic presence and object speed around the vehicle 104 (e.g., in-lane, behind and ahead of the vehicle 104). Moreover, the vehicle computing system 102 can use RF sensors and/or Bluetooth beacons to determine a user presence, a general distance between the user 110 and the vehicle 104, and/or changes in the distance indicating the number of seconds until user arrival. - In some implementations, in the event that the estimated
traffic impact 504 is low, the vehicle computing system 102 can cause the vehicle 104 to stop in the travel way 600, at (1324). If the user 110 arrives (e.g., starts and/or completes boarding, item retrieval, item placement, etc.) within a certain timeframe (e.g., within "Y" time), the vehicle 104 can provide the vehicle service to the user 110. If the user 110 does not arrive (e.g., start and/or complete boarding, item retrieval, item placement, etc.) within the timeframe, the vehicle 104 can enter into the holding pattern. - In some implementations, in the event that the estimated
traffic impact 504 is high, the vehicle computing system 102 can determine the estimated time to user arrival. In the event that the estimated time to user arrival is low, the vehicle computing system 102 can cause the vehicle 104 to stop within the travel way 600 (e.g., in a travel lane), at (1326), despite potential traffic build-up. If the user 110 arrives (e.g., starts and/or completes boarding, item retrieval, item placement, etc.) within a certain timeframe (e.g., within "Y" time), the vehicle 104 can provide the vehicle service to the user 110. If the user 110 does not arrive (e.g., start and/or complete boarding, item retrieval, item placement, etc.) within the timeframe, the vehicle 104 can enter into the holding pattern. - With reference to
FIG. 13B, the vehicle computing system 102 can cause the vehicle 104 to travel past the location 202 associated with the user 110, at (1328). In some implementations, this can be a portion of the second vehicle route 208. The vehicle computing system 102 can search for a parking location out of the travel way 600 after the vehicle 104 passes the location 202 associated with the user 110, at (1330). This can occur until the vehicle 104 travels a certain distance past the location 202 (e.g., until the vehicle 104 reaches the acceptable walking distance 302). In some implementations, even after the vehicle travels past the location associated with the user 110, if the estimated traffic impact 504 and/or the estimated time of user arrival is low enough, the vehicle computing system 102 can cause the vehicle 104 to stop. - In the event that a parking location is found, at (1332), the
vehicle computing system 102 can contact the user 110 via the user device 138 (e.g., indicating the location of the vehicle 104 and a user route thereto). The contact can be facilitated by the vehicle computing system 102 providing a communication to the user 110 via a software application on the user device 138 (e.g., "This is as close as I can get. Please come to me"). In some implementations, if the estimated time until the user 110 starts interaction with the vehicle 104 (e.g., the estimated time until the user 110 starts boarding the vehicle 104) is low (e.g., less than a few seconds, 1, 2, 3 s, etc.), the vehicle doors can be unlocked and/or user specific vehicle settings (e.g., music, temperature, seat position, etc.) can be implemented, at (1333) (e.g., because boarding is imminent). If the user 110 arrives (e.g., starts and/or completes boarding, item retrieval, item placement, etc.) at the vehicle 104 within a certain timeframe (e.g., within "Z" time), at (1334), the vehicle 104 can provide a vehicle service to the user 110, at (1336). If the user 110 does not arrive (e.g., start and/or complete boarding, item retrieval, item placement, etc.) within the timeframe, the vehicle computing system 102 can contact the user 110, at (1338). If the contact is successful, the vehicle 104 can again wait for the user 110 to arrive (e.g., start and/or complete boarding, item retrieval, item placement, etc.) at the vehicle 104. If the contact is not successful, the vehicle computing system 102 can cancel the vehicle service, at (1340). - In the event that the
vehicle computing system 102 does not find a parking location after the location 202 associated with the user 110 and within the vicinity 206 of the location 202 (e.g., an acceptable walking distance 302), the vehicle computing system 102 can cause the vehicle 104 to implement a vehicle holding pattern (e.g., enter a holding pattern operating mode), at (1342). As such, the vehicle 104 can ignore any potential parking locations and provide a communication to the user device 138 associated with the user 110. The user device 138 can display a map interface depicting a location of the vehicle 104. In some implementations, the vehicle computing system 102 can request a second vehicle route 208 from the operations computing system 106. The operations computing system 106 can provide data indicative of the second vehicle route 208 to the vehicle computing system 102. The vehicle computing system 102 can obtain the data indicative of the second vehicle route 208 and send one or more control signals to cause the vehicle 104 to travel in accordance with the second vehicle route 208. At (1344), the vehicle computing system 102 can send a communication to the user 110 indicating that the vehicle 104 is travelling to return to the location 202 associated with the user 110, as described herein. As the vehicle 104 returns back toward the location 202 associated with the user 110, the vehicle 104 can enter into the approach operating mode 108D again, at (1346). As such, the process can continue as shown in FIG. 13A. - The
vehicle 104 can continue in the holding pattern until the vehicle service cancellation threshold 1102 is reached. After the vehicle service cancellation threshold 1102 is reached, the vehicle computing system 102 can provide a communication directly to the user device 138 (e.g., via the vehicle's communication system 136) to inform the user 110 that the vehicle service has been cancelled. In some implementations, if another vehicle would be more appropriate to provide the vehicle service to the user 110 (e.g., another vehicle could arrive at the location 202 associated with the user 110 quicker than the vehicle 104), the operations computing system 106 can re-route the other vehicle to the location 202 associated with the user 110. -
FIG. 14 depicts example system components of an example system 1400 according to example embodiments of the present disclosure. The example system 1400 can include the vehicle computing system 102, the operations computing system 106, and a machine learning computing system 1430 that are communicatively coupled over one or more network(s) 1480. - The
vehicle computing system 102 can include one or more computing device(s) 1401. The computing device(s) 1401 of the vehicle computing system 102 can include processor(s) 1402 and a memory 1404 (e.g., onboard the vehicle 104). The one or more processors 1402 can be any suitable processing device (e.g., a processor core, a microprocessor, an ASIC, a FPGA, a controller, a microcontroller, etc.) and can be one processor or a plurality of processors that are operatively connected. The memory 1404 can include one or more non-transitory computer-readable storage media, such as RAM, ROM, EEPROM, EPROM, one or more memory devices, flash memory devices, etc., and combinations thereof. - The
memory 1404 can store information that can be accessed by the one or more processors 1402. For instance, the memory 1404 (e.g., one or more non-transitory computer-readable storage mediums, memory devices) can include computer-readable instructions 1406 that can be executed by the one or more processors 1402. The instructions 1406 can be software written in any suitable programming language or can be implemented in hardware. Additionally, or alternatively, the instructions 1406 can be executed in logically and/or virtually separate threads on processor(s) 1402. - For example, the
memory 1404 can store instructions 1406 that when executed by the one or more processors 1402 cause the one or more processors 1402 (e.g., the vehicle computing system 102) to perform operations such as any of the operations and functions of the vehicle computing system 102, the vehicle 104, or for which the vehicle computing system 102 and/or the vehicle 104 are configured, as described herein, the operations for determining autonomous boarding times and/or other time estimates (e.g., one or more portions of method 700), the operations for controlling autonomous vehicles (e.g., one or more portions of methods 1200 and/or 1300), and/or any other functions for the vehicle 104, as described herein. - The
memory 1404 can store data 1408 that can be obtained, received, accessed, written, manipulated, created, and/or stored. The data 1408 can include, for instance, traffic data, location data, historic data, map data, sensor data, state data, prediction data, motion planning data, data associated with operating modes, data associated with estimated times, and/or other data or information described herein. In some implementations, the computing device(s) 1401 can obtain data from one or more memory device(s) that are remote from the vehicle 104. - The computing device(s) 1401 can also include a
communication interface 1409 used to communicate with one or more other system(s) on-board the vehicle 104 and/or a remote computing device that is remote from the vehicle 104 (e.g., the other systems of FIG. 14, a user device associated with a user, etc.). The communication interface 1409 can include any circuits, components, software, etc. for communicating via one or more networks (e.g., 1480). In some implementations, the communication interface 1409 can include, for example, one or more of a communications controller, receiver, transceiver, transmitter, port, conductors, software and/or hardware for communicating data/information. - The
operations computing system 106 can perform the operations and functions for managing autonomous vehicles, as described herein. The operations computing system 106 can be located remotely from the vehicle 104. For example, the operations computing system 106 can operate offline, off-board, etc. The operations computing system 106 can include one or more distinct physical computing devices. - The
operations computing system 106 can include one or more computing devices 1420. The one or more computing devices 1420 can include one or more processors 1422 and a memory 1424. The one or more processors 1422 can be any suitable processing device (e.g., a processor core, a microprocessor, an ASIC, a FPGA, a controller, a microcontroller, etc.) and can be one processor or a plurality of processors that are operatively connected. The memory 1424 can include one or more non-transitory computer-readable storage media, such as RAM, ROM, EEPROM, EPROM, one or more memory devices, flash memory devices, etc., and combinations thereof. - The
memory 1424 can store information that can be accessed by the one or more processors 1422. For instance, the memory 1424 (e.g., one or more non-transitory computer-readable storage mediums, memory devices) can store data 1426 that can be obtained, received, accessed, written, manipulated, created, and/or stored. The data 1426 can include, for instance, service request data, vehicle data, vehicle service cancellation thresholds, and/or other data or information described herein. In some implementations, the operations computing system 106 can obtain data from one or more memory device(s) that are remote from the operations computing system 106. - The
memory 1424 can also store computer-readable instructions 1428 that can be executed by the one or more processors 1422. The instructions 1428 can be software written in any suitable programming language or can be implemented in hardware. Additionally, or alternatively, the instructions 1428 can be executed in logically and/or virtually separate threads on processor(s) 1422. - For example, the
memory 1424 can store instructions 1428 that when executed by the one or more processors 1422 cause the one or more processors 1422 to perform any of the operations and/or functions described herein, including, for example, any of the operations and functions of the operations computing system 106, the computing device(s) 1420, and any of the operations and functions for which the operations computing system 106 and/or the computing device(s) 1420 are configured, as described herein, as well as one or more portions of methods 1200 and/or 1300. - The computing device(s) 1420 can also include a
communication interface 1429 used to communicate with one or more other system(s). The communication interface 1429 can include any circuits, components, software, etc. for communicating via one or more networks (e.g., 1480). In some implementations, the communication interface 1429 can include, for example, one or more of a communications controller, receiver, transceiver, transmitter, port, conductors, software and/or hardware for communicating data/information. - According to an aspect of the present disclosure, the
vehicle computing system 102 and/or the operations computing system 106 can store or include one or more machine-learned models 1440. As examples, the machine-learned models 1440 can be or can otherwise include various machine-learned models such as, for example, neural networks (e.g., deep neural networks), support vector machines, decision trees, ensemble models, k-nearest neighbors models, Bayesian networks, or other types of models including linear models and/or non-linear models. Example neural networks include feed-forward neural networks, recurrent neural networks (e.g., long short-term memory recurrent neural networks), or other forms of neural networks. - In some implementations, the
vehicle computing system 102 and/or the operations computing system 106 can receive the one or more machine-learned models 1440 from the machine learning computing system 1430 over the network(s) 1480 and can store the one or more machine-learned models 1440 in the memory of the respective system. The vehicle computing system 102 and/or the operations computing system 106 can use or otherwise implement the one or more machine-learned models 1440 (e.g., by processor(s) 1402, 1422). In particular, the vehicle computing system 102 and/or the operations computing system 106 can implement the machine-learned model(s) 1440 to determine an acceptable walking distance, traffic constraint, estimated traffic impact, estimated time of user arrival (e.g., estimated boarding time, estimated boarding duration, estimated boarding completion, etc.), second vehicle route (e.g., vehicle holding pattern), vehicle service cancellation threshold, etc., as described herein. - The machine
learning computing system 1430 can include one or more processors 1432 and a memory 1434. The one or more processors 1432 can be any suitable processing device (e.g., a processor core, a microprocessor, an ASIC, a FPGA, a controller, a microcontroller, etc.) and can be one processor or a plurality of processors that are operatively connected. The memory 1434 can include one or more non-transitory computer-readable storage media, such as RAM, ROM, EEPROM, EPROM, one or more memory devices, flash memory devices, etc., and combinations thereof. - The
memory 1434 can store information that can be accessed by the one or more processors 1432. For instance, the memory 1434 (e.g., one or more non-transitory computer-readable storage mediums, memory devices) can store data 1436 that can be obtained, received, accessed, written, manipulated, created, and/or stored. In some implementations, the machine learning computing system 1430 can obtain data from one or more memory devices that are remote from the system 1430. - The
memory 1434 can also store computer-readable instructions 1438 that can be executed by the one or more processors 1432. The instructions 1438 can be software written in any suitable programming language or can be implemented in hardware. Additionally, or alternatively, the instructions 1438 can be executed in logically and/or virtually separate threads on processor(s) 1432. The memory 1434 can store the instructions 1438 that when executed by the one or more processors 1432 cause the one or more processors 1432 to perform operations. - In some implementations, the machine
learning computing system 1430 can include one or more server computing devices. If the machine learning computing system 1430 includes multiple server computing devices, such server computing devices can operate according to various computing architectures, including, for example, sequential computing architectures, parallel computing architectures, or some combination thereof. - In addition or alternatively to the model(s) 1440 at the
vehicle computing system 102 and/or the operations computing system 106, the machine learning computing system 1430 can include one or more machine-learned models 1450. As examples, the machine-learned models 1450 can be or can otherwise include various machine-learned models such as, for example, neural networks (e.g., deep neural networks), support vector machines, decision trees, ensemble models, k-nearest neighbors models, Bayesian networks, or other types of models including linear models and/or non-linear models. Example neural networks include feed-forward neural networks, recurrent neural networks (e.g., long short-term memory recurrent neural networks), or other forms of neural networks. The machine-learned models 1450 can be similar to and/or the same as the machine-learned models 1440. - As an example, the machine
learning computing system 1430 can communicate with the vehicle computing system 102 and/or the operations computing system 106 according to a client-server relationship. For example, the machine learning computing system 1430 can implement the machine-learned models 1450 to provide a web service to the vehicle computing system 102 and/or the operations computing system 106. For example, the web service can provide machine-learned models to an entity associated with an autonomous vehicle, such that the entity can implement the machine-learned model (e.g., to determine estimated traffic impacts, vehicle service request cancellation, etc.). Thus, machine-learned models 1450 can be located and used at the vehicle computing system 102 and/or the operations computing system 106 and/or machine-learned models 1450 can be located and used at the machine learning computing system 1430. - In some implementations, the machine
learning computing system 1430, the vehicle computing system 102, and/or the operations computing system 106 can train the machine-learned models 1440 and/or 1450 through use of a model trainer 1460. The model trainer 1460 can train the machine-learned models 1440 and/or 1450 using one or more training or learning algorithms. One example training technique is backwards propagation of errors. In some implementations, the model trainer 1460 can perform supervised training techniques using a set of labeled training data. In other implementations, the model trainer 1460 can perform unsupervised training techniques using a set of unlabeled training data. The model trainer 1460 can perform a number of generalization techniques to improve the generalization capability of the models being trained. Generalization techniques include weight decays, dropouts, or other techniques. - In particular, the
model trainer 1460 can train a machine-learned model 1440 and/or 1450 based on a set of training data 1462. The training data 1462 can include, for example, a number of sets of data from previous events (e.g., acceptable walking distance data, historic traffic data, user arrival data, trip cancellation data, user feedback data, other data described herein). In some implementations, the training data 1462 can be taken from the same geographic area (e.g., city, state, and/or country) in which an autonomous vehicle utilizing that model 1440/1450 is designed to operate. In this way, the models 1440/1450 can be trained to determine outputs (e.g., estimated traffic impact, acceptable walking distances) in a manner that is tailored to the customs of a particular location (e.g., waiting for a user longer, decreasing an acceptable walking distance, etc.). The model trainer 1460 can be implemented in hardware, firmware, and/or software controlling one or more processors. - The network(s) 1480 can be any type of network or combination of networks that allows for communication between devices. In some embodiments, the network(s) 1480 can include one or more of a local area network, wide area network, the Internet, secure network, cellular network, mesh network, peer-to-peer communication link and/or some combination thereof and can include any number of wired or wireless links. Communication over the network(s) 1480 can be accomplished, for instance, via a network interface using any type of protocol, protection scheme, encoding, format, packaging, etc.
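As a rough illustration of the kind of supervised training the model trainer 1460 might perform, the sketch below fits a toy linear model by gradient descent (i.e., backwards propagation of errors) with weight decay, one of the generalization techniques mentioned above. The data, model form, variable names, and hyperparameters are assumptions chosen only to keep the example self-contained; the disclosed models could be neural networks or any of the other model types listed.

```python
# Minimal illustrative trainer (a linear model stands in for the
# machine-learned models 1440/1450; all values are assumptions).
def train(data, lr=0.05, weight_decay=1e-3, epochs=500):
    """Fit y = w*x + b by gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(data)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in data:
            err = (w * x + b) - y          # prediction error for one example
            gw += 2 * err * x / n          # gradient w.r.t. the weight
            gb += 2 * err / n              # gradient w.r.t. the bias
        w -= lr * (gw + weight_decay * w)  # weight decay for generalization
        b -= lr * gb
    return w, b

# Toy labeled training data: (level of traffic, acceptable in-lane wait in s).
data = [(0.0, 60.0), (0.5, 35.0), (1.0, 10.0)]
w, b = train(data)
assert w < 0   # heavier traffic should learn a shorter acceptable wait
assert b > 50  # near-zero traffic should learn a wait close to 60 s
```

A production trainer would of course operate on far richer features (historic traffic data, user arrival data, etc., per the training data 1462) and a non-linear model, but the loop structure is the same.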
-
FIG. 14 illustrates one example system 1400 that can be used to implement the present disclosure. Other computing systems can be used as well. For example, in some implementations, the vehicle computing system 102 and/or the operations computing system 106 can include the model trainer 1460 and the training dataset 1462. In such implementations, the machine-learned models 1440 can be both trained and used locally at the vehicle computing system 102 and/or the operations computing system 106. As another example, in some implementations, the vehicle computing system 102 and/or the operations computing system 106 may not be connected to other computing systems. - Computing tasks discussed herein as being performed at computing device(s) remote from the vehicle can instead be performed at the vehicle (e.g., via the vehicle computing system), or vice versa. Such configurations can be implemented without deviating from the scope of the present disclosure. The use of computer-based systems allows for a great variety of possible configurations, combinations, and divisions of tasks and functionality between and among components. Computer-implemented operations can be performed on a single component or across multiple components. Computer-implemented tasks and/or operations can be performed sequentially or in parallel. Data and instructions can be stored in a single memory device or across multiple memory devices.
- While the present subject matter has been described in detail with respect to specific example embodiments and methods thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing, can readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, the scope of the present disclosure is by way of example rather than by way of limitation, and the subject disclosure does not preclude inclusion of such modifications, variations, and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/734,945 US20200142428A1 (en) | 2017-05-24 | 2020-01-06 | Systems and Methods for Controlling Autonomous Vehicles that Provide a Vehicle Service to Users |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762510515P | 2017-05-24 | 2017-05-24 | |
US15/662,314 US10528059B2 (en) | 2017-05-24 | 2017-07-28 | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
US16/734,945 US20200142428A1 (en) | 2017-05-24 | 2020-01-06 | Systems and Methods for Controlling Autonomous Vehicles that Provide a Vehicle Service to Users |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/662,314 Continuation US10528059B2 (en) | 2017-05-24 | 2017-07-28 | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200142428A1 true US20200142428A1 (en) | 2020-05-07 |
Family
ID=64401112
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/662,327 Active US10372141B2 (en) | 2017-05-24 | 2017-07-28 | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
US15/662,314 Active 2037-09-27 US10528059B2 (en) | 2017-05-24 | 2017-07-28 | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
US16/504,034 Active 2037-11-18 US11029703B2 (en) | 2017-05-24 | 2019-07-05 | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
US16/734,947 Active 2038-03-18 US11385657B2 (en) | 2017-05-24 | 2020-01-06 | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
US16/734,945 Abandoned US20200142428A1 (en) | 2017-05-24 | 2020-01-06 | Systems and Methods for Controlling Autonomous Vehicles that Provide a Vehicle Service to Users |
US17/340,881 Active US11599123B2 (en) | 2017-05-24 | 2021-06-07 | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
Family Applications Before (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/662,327 Active US10372141B2 (en) | 2017-05-24 | 2017-07-28 | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
US15/662,314 Active 2037-09-27 US10528059B2 (en) | 2017-05-24 | 2017-07-28 | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
US16/504,034 Active 2037-11-18 US11029703B2 (en) | 2017-05-24 | 2019-07-05 | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
US16/734,947 Active 2038-03-18 US11385657B2 (en) | 2017-05-24 | 2020-01-06 | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/340,881 Active US11599123B2 (en) | 2017-05-24 | 2021-06-07 | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
Country Status (1)
Country | Link |
---|---|
US (6) | US10372141B2 (en) |
Families Citing this family (77)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8931400B1 (en) | 2009-05-28 | 2015-01-13 | iDevices, LLC | Remote cooking systems and methods |
US9547989B2 (en) | 2014-03-04 | 2017-01-17 | Google Inc. | Reporting road event data and sharing with other vehicles |
US10536951B2 (en) * | 2015-06-29 | 2020-01-14 | Sony Corporation | Methods, base station, infrastructure node and terminal |
US10388085B2 (en) | 2017-07-14 | 2019-08-20 | Allstate Insurance Company | Distributed data processing system for processing remotely captured sensor data |
US10671087B2 (en) * | 2017-07-28 | 2020-06-02 | Crown Equipment Corporation | Traffic management for materials handling vehicles in a warehouse environment |
WO2019027288A1 (en) * | 2017-08-03 | 2019-02-07 | 엘지전자 주식회사 | Method and apparatus for calculating ranging information by terminal in wireless communication system supporting device to device communication |
US20190066515A1 (en) * | 2017-08-22 | 2019-02-28 | Waymo Llc | Estimating time to pick up and drop off passengers for improved stopping analysis in autonomous vehicles |
US10725473B2 (en) * | 2017-09-01 | 2020-07-28 | Uatc, Llc | Systems and methods for changing a destination of an autonomous vehicle in real-time |
US10768621B1 (en) * | 2017-09-25 | 2020-09-08 | Uatc, Llc | Determining routes for autonomous vehicles |
US10739155B2 (en) * | 2017-09-29 | 2020-08-11 | International Business Machines Corporation | Cognitive route quality-learning service |
JP6852638B2 (en) * | 2017-10-10 | 2021-03-31 | トヨタ自動車株式会社 | Self-driving vehicle dispatch system, self-driving vehicle, and vehicle dispatch method |
CN107948956A (en) * | 2017-11-07 | 2018-04-20 | 北京小米移动软件有限公司 | Localization method and device |
JP2019101463A (en) * | 2017-11-28 | 2019-06-24 | トヨタ自動車株式会社 | Delivery system, server, movable body, and package delivery method |
JP7006187B2 (en) * | 2017-11-28 | 2022-01-24 | トヨタ自動車株式会社 | Mobiles, vehicle allocation systems, servers, and mobile vehicle allocation methods |
CN109936819B (en) * | 2017-12-15 | 2021-04-30 | 北京嘀嘀无限科技发展有限公司 | Method, device and equipment for recommending boarding points |
US20190206258A1 (en) * | 2018-01-04 | 2019-07-04 | nuTonomy Inc. | Augmented reality vehicle interfacing |
EP3735340B1 (en) * | 2018-01-05 | 2023-06-07 | iRobot Corporation | Mobile robot enabling the display of networked devices |
US20190215378A1 (en) * | 2018-01-05 | 2019-07-11 | Cisco Technology, Inc. | Predicting vehicle dwell time to optimize v2i communication |
DE112019000070T5 (en) | 2018-01-07 | 2020-03-12 | Nvidia Corporation | GUIDING VEHICLES BY VEHICLE MANEUVER USING MODELS FOR MACHINE LEARNING |
CN110352153A (en) | 2018-02-02 | 2019-10-18 | 辉达公司 | It is analyzed in autonomous vehicle for the security procedure of Obstacle avoidance |
JP6831344B2 (en) * | 2018-02-07 | 2021-02-17 | 株式会社Subaru | Control system |
CN111712420B (en) * | 2018-02-22 | 2024-02-09 | 本田技研工业株式会社 | Vehicle control system, vehicle control method, and storage medium |
CA3095197A1 (en) * | 2018-03-27 | 2019-10-03 | AirboardX Pty Ltd | A passenger handling system and method |
US20190317525A1 (en) * | 2018-04-11 | 2019-10-17 | Uber Technologies, Inc. | Controlling an Autonomous Vehicle and the Service Selection of an Autonomous Vehicle |
US10809081B1 (en) * | 2018-05-03 | 2020-10-20 | Zoox, Inc. | User interface and augmented reality for identifying vehicles and persons |
US11846514B1 (en) | 2018-05-03 | 2023-12-19 | Zoox, Inc. | User interface and augmented reality for representing vehicles and persons |
US10837788B1 (en) | 2018-05-03 | 2020-11-17 | Zoox, Inc. | Techniques for identifying vehicles and persons |
US10754336B2 (en) * | 2018-05-04 | 2020-08-25 | Waymo Llc | Using environmental information to estimate sensor functionality for autonomous vehicles |
CN112119436A (en) * | 2018-05-15 | 2020-12-22 | 日产自动车株式会社 | Riding position calculation method, riding position calculation device and riding position calculation system |
US10908610B2 (en) * | 2018-06-21 | 2021-02-02 | Daniel KHURGIN | Method of vehicle operation in a mixed mode |
KR20200017810A (en) * | 2018-08-09 | 2020-02-19 | 현대자동차주식회사 | Autonomous valet service apparatus and mehtod |
US11163300B2 (en) * | 2018-08-21 | 2021-11-02 | GM Global Technology Operations LLC | Navigating an autonomous vehicle based upon an image from a mobile computing device |
US11240649B2 (en) * | 2018-08-31 | 2022-02-01 | Good2Go, Inc. | Real-time adaptive facility deployment |
US11725960B2 (en) * | 2018-09-14 | 2023-08-15 | Uber Technologies, Inc. | Determining navigation data based on service type |
US11146522B1 (en) * | 2018-09-24 | 2021-10-12 | Amazon Technologies, Inc. | Communication with user location |
US10962381B2 (en) * | 2018-11-01 | 2021-03-30 | Here Global B.V. | Method, apparatus, and computer program product for creating traffic information for specialized vehicle types |
US10675205B1 (en) * | 2018-11-20 | 2020-06-09 | Toyota Mobility Foundation | Transportation support for a user having chronic or acute mobility needs |
US10704918B2 (en) | 2018-11-26 | 2020-07-07 | Ford Global Technologies, Llc | Method and apparatus for improved location decisions based on surroundings |
US11127162B2 (en) * | 2018-11-26 | 2021-09-21 | Ford Global Technologies, Llc | Method and apparatus for improved location decisions based on surroundings |
US11175156B2 (en) * | 2018-12-12 | 2021-11-16 | Ford Global Technologies, Llc | Method and apparatus for improved location decisions based on surroundings |
US11578986B2 (en) * | 2018-12-20 | 2023-02-14 | Here Global B.V. | Autonomous driving instructions |
US20200201354A1 (en) * | 2018-12-20 | 2020-06-25 | Here Global B.V. | Methods and systems for autonomous vehicle navigation |
CN109523827A (en) * | 2018-12-24 | 2019-03-26 | 广东电网有限责任公司 | A kind of underground parking lot shutdown system |
CN109785617B (en) * | 2019-01-03 | 2021-01-26 | 中国联合网络通信集团有限公司 | Method for processing traffic control information |
DE102020201039A1 (en) * | 2019-01-31 | 2020-08-06 | Hyundai Motor Company | METHOD AND DEVICE FOR THE SHARED USE OF MOVABLE OBJECTS IN A FLEET SYSTEM |
WO2020163390A1 (en) | 2019-02-05 | 2020-08-13 | Nvidia Corporation | Driving lane perception diversity and redundancy in autonomous driving applications |
USD916919S1 (en) * | 2019-02-21 | 2021-04-20 | Volvo Car Corporation | Display screen or portion thereof with graphical user interface |
JP7127578B2 (en) * | 2019-02-26 | 2022-08-30 | トヨタ自動車株式会社 | Vehicle allocation support device, program, and control method |
JP7145105B2 (en) * | 2019-03-04 | 2022-09-30 | 本田技研工業株式会社 | Vehicle control system, vehicle control method, and program |
US11532060B2 (en) * | 2019-03-28 | 2022-12-20 | Lyft, Inc. | Systems and methods for matching autonomous transportation provider vehicles and transportation requests in transportation management systems |
US11269353B2 (en) * | 2019-03-28 | 2022-03-08 | Gm Cruise Holdings Llc | Autonomous vehicle hailing and pickup location refinement through use of an identifier |
JP7188249B2 (en) * | 2019-04-11 | 2022-12-13 | トヨタ自動車株式会社 | Parking lot management device, parking lot management method and parking lot management program |
US11512968B2 (en) * | 2019-05-30 | 2022-11-29 | Ford Global Technologies, Llc | Systems and methods for queue management of passenger waypoints for autonomous vehicles |
US11281217B2 (en) * | 2019-06-25 | 2022-03-22 | Ford Global Technologies, Llc | Enhanced vehicle operation |
US11645685B2 (en) | 2019-06-26 | 2023-05-09 | Lyft, Inc. | Dynamically adjusting transportation provider pool size |
US11615355B2 (en) * | 2019-06-26 | 2023-03-28 | Waymo Llc | Service area maps for autonomous vehicles |
US20210004728A1 (en) * | 2019-07-05 | 2021-01-07 | Lyft, Inc. | Determining arrival of transportation providers to pickup locations utilizing a hiking distance predictor model |
US11308736B2 (en) | 2019-07-30 | 2022-04-19 | T-Mobile Usa, Inc. | Selecting V2X communications interface |
US11328592B2 (en) * | 2019-08-14 | 2022-05-10 | Toyota Motor North America, Inc. | Systems and methods for roadway obstruction detection |
US20210053567A1 (en) * | 2019-08-21 | 2021-02-25 | Waymo Llc | Identifying pullover regions for autonomous vehicles |
US11462019B2 (en) * | 2019-09-20 | 2022-10-04 | Gm Cruise Holdings Llc | Predicting rider entry time for pick-up and drop-off locations |
US11188853B2 (en) * | 2019-09-30 | 2021-11-30 | The Travelers Indemnity Company | Systems and methods for artificial intelligence (AI) damage triage and dynamic resource allocation, routing, and scheduling |
US10743136B1 (en) * | 2019-09-30 | 2020-08-11 | GM Cruise Holdings, LLC | Communication between autonomous vehicles and operations personnel |
JP2021147143A (en) * | 2020-03-18 | 2021-09-27 | 本田技研工業株式会社 | Management device, management method, and program |
US11485515B2 (en) * | 2020-03-25 | 2022-11-01 | Ge Aviation Systems Llc | Method and system for operating an aircraft |
US11597393B2 (en) * | 2020-03-26 | 2023-03-07 | Intel Corporation | Systems, methods, and devices for driving control |
GB2594974A (en) | 2020-05-14 | 2021-11-17 | Dromos Tech Ag | A method and infrastructure for boarding a plurality of passengers into an autonomous vehicle |
US11570576B2 (en) * | 2020-07-01 | 2023-01-31 | Here Global B.V. | Image-based approach for device localization based on a vehicle location |
US11510026B2 (en) | 2020-07-01 | 2022-11-22 | Here Global B.V. | Radio-visual approach for device localization based on a vehicle location |
CN112071105A (en) * | 2020-08-28 | 2020-12-11 | 重庆长安汽车股份有限公司 | High-precision map-based automatic driving receiving method and device for parking lot |
KR20220039090A (en) * | 2020-09-21 | 2022-03-29 | 현대자동차주식회사 | System and method for autonomous un-parking control of vehicle |
US20220092718A1 (en) * | 2020-09-23 | 2022-03-24 | Apple Inc. | Vehicle Hailing In A Mobile Ecosystem |
US11624627B2 (en) | 2020-11-17 | 2023-04-11 | Ford Global Technologies, Llc | Augmented reality displays for locating vehicles |
US11631327B2 (en) | 2021-06-30 | 2023-04-18 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for learning driver parking preferences and generating parking recommendations |
US11639180B1 (en) * | 2021-06-30 | 2023-05-02 | Gm Cruise Holdings Llc | Notifications from an autonomous vehicle to a driver |
US11675362B1 (en) * | 2021-12-17 | 2023-06-13 | Motional Ad Llc | Methods and systems for agent prioritization |
US20240040349A1 (en) * | 2022-07-28 | 2024-02-01 | Robert Bosch Gmbh | Vehicle to target range finder via rf power |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9373149B2 (en) * | 2006-03-17 | 2016-06-21 | Fatdoor, Inc. | Autonomous neighborhood vehicle commerce network and community |
WO2012093483A1 (en) * | 2011-01-06 | 2012-07-12 | アクアエンタープライズ株式会社 | Travel process prediction system, travel process prediction method, travel process prediction device, and computer program |
US8774752B1 (en) | 2011-12-14 | 2014-07-08 | Lonestar Inventions, L.P. | Method for emergency alert using SMS text |
JP6182482B2 (en) * | 2014-03-12 | 2017-08-16 | オムロンオートモーティブエレクトロニクス株式会社 | Control device and control system |
EP3183688B1 (en) | 2014-08-18 | 2023-08-23 | Mobileye Vision Technologies Ltd. | Recognition and prediction of lane constraints |
US9904900B2 (en) * | 2015-06-11 | 2018-02-27 | Bao Tran | Systems and methods for on-demand transportation |
US20160364812A1 (en) * | 2015-06-11 | 2016-12-15 | Raymond Cao | Systems and methods for on-demand transportation |
US9701239B2 (en) * | 2015-11-04 | 2017-07-11 | Zoox, Inc. | System of configuring active lighting to indicate directionality of an autonomous vehicle |
US10745003B2 (en) * | 2015-11-04 | 2020-08-18 | Zoox, Inc. | Resilient safety system for a robotic vehicle |
US10379533B2 (en) | 2016-01-04 | 2019-08-13 | GM Global Technology Operations LLC | System and method for autonomous vehicle fleet routing |
US10088846B2 (en) | 2016-03-03 | 2018-10-02 | GM Global Technology Operations LLC | System and method for intended passenger detection |
JP2019525299A (en) * | 2016-06-21 | 2019-09-05 | ヴィア トランスポーテーション、インコーポレイテッド | System and method for vehicle sharing management |
US10198941B2 (en) * | 2016-07-27 | 2019-02-05 | Here Global B.V. | Method and apparatus for evaluating traffic approaching a junction at a lane level |
US11599833B2 (en) | 2016-08-03 | 2023-03-07 | Ford Global Technologies, Llc | Vehicle ride sharing system and method using smart modules |
WO2018046102A1 (en) * | 2016-09-10 | 2018-03-15 | Swiss Reinsurance Company Ltd. | Automated, telematics-based system with score-driven triggering and operation of automated sharing economy risk-transfer systems and corresponding method thereof |
US10248120B1 (en) * | 2016-09-16 | 2019-04-02 | Amazon Technologies, Inc. | Navigable path networks for autonomous vehicles |
US10032373B2 (en) | 2016-09-29 | 2018-07-24 | Cubic Corporation | Systems and methods for using autonomous vehicles in traffic |
US9805595B1 (en) * | 2016-10-27 | 2017-10-31 | International Business Machines Corporation | Vehicle and non-vehicle traffic flow control |
US10810695B2 (en) | 2016-12-31 | 2020-10-20 | Ava Information Systems Gmbh | Methods and systems for security tracking and generating alerts |
US10677602B2 (en) | 2017-01-25 | 2020-06-09 | Via Transportation, Inc. | Detecting the number of vehicle passengers |
US10678244B2 (en) * | 2017-03-23 | 2020-06-09 | Tesla, Inc. | Data synthesis for autonomous control systems |
US20180281815A1 (en) | 2017-03-31 | 2018-10-04 | Uber Technologies, Inc. | Predictive teleassistance system for autonomous vehicles |
US9769616B1 (en) | 2017-04-04 | 2017-09-19 | Lyft, Inc. | Geohash-related location predictions |
US10942525B2 (en) | 2017-05-09 | 2021-03-09 | Uatc, Llc | Navigational constraints for autonomous vehicles |
US10372141B2 (en) | 2017-05-24 | 2019-08-06 | Uber Technologies, Inc. | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
US20200162489A1 (en) | 2018-11-16 | 2020-05-21 | Airspace Systems, Inc. | Security event detection and threat assessment |
US10890907B2 (en) | 2018-12-14 | 2021-01-12 | Toyota Jidosha Kabushiki Kaisha | Vehicle component modification based on vehicular accident reconstruction data |
US11494921B2 (en) | 2019-04-26 | 2022-11-08 | Samsara Networks Inc. | Machine-learned model based event detection |
US11080568B2 (en) | 2019-04-26 | 2021-08-03 | Samsara Inc. | Object-model based event detection system |
2017
- 2017-07-28 US US15/662,327 patent/US10372141B2/en active Active
- 2017-07-28 US US15/662,314 patent/US10528059B2/en active Active

2019
- 2019-07-05 US US16/504,034 patent/US11029703B2/en active Active

2020
- 2020-01-06 US US16/734,947 patent/US11385657B2/en active Active
- 2020-01-06 US US16/734,945 patent/US20200142428A1/en not_active Abandoned

2021
- 2021-06-07 US US17/340,881 patent/US11599123B2/en active Active
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11385657B2 (en) | 2017-05-24 | 2022-07-12 | Uber Technologies, Inc. | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
US11599123B2 (en) | 2017-05-24 | 2023-03-07 | Uber Technologies, Inc. | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users |
Also Published As
Publication number | Publication date |
---|---|
US11029703B2 (en) | 2021-06-08 |
US20180342157A1 (en) | 2018-11-29 |
US20190332123A1 (en) | 2019-10-31 |
US20180341274A1 (en) | 2018-11-29 |
US10528059B2 (en) | 2020-01-07 |
US11599123B2 (en) | 2023-03-07 |
US11385657B2 (en) | 2022-07-12 |
US20200150682A1 (en) | 2020-05-14 |
US10372141B2 (en) | 2019-08-06 |
US20210365042A1 (en) | 2021-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11599123B2 (en) | Systems and methods for controlling autonomous vehicles that provide a vehicle service to users | |
US11667283B2 (en) | Autonomous vehicle motion control systems and methods | |
EP3704684B1 (en) | Object motion prediction and vehicle control for autonomous vehicles | |
RU2761270C2 (en) | System and method for providing transportation | |
US11747808B2 (en) | Systems and methods for matching an autonomous vehicle to a rider | |
US11378980B2 (en) | Cellular device location discovery systems and methods for autonomous vehicles | |
CN112446989A (en) | Method for occupant authentication and door operation of an autonomous vehicle | |
US11315431B2 (en) | Systems and methods for autonomous vehicle controls | |
US20220137615A1 (en) | Systems and Methods for Dynamic Data Buffering for Autonomous Vehicle Remote Assistance | |
US11743564B2 (en) | Sensor platform for autonomous vehicles | |
US20220041146A1 (en) | Systems and Methods for Emergency Braking in Autonomous Vehicles | |
US20230168095A1 (en) | Route providing device and route providing method therefor | |
KR20230069899A (en) | Autonomous vehicle stations | |
CN116670735A (en) | Method for navigating an autonomous vehicle to a passenger pick-up/drop-off position | |
US20230111327A1 (en) | Techniques for finding and accessing vehicles | |
CN117083575A (en) | Track inspector | |
US10421396B2 (en) | Systems and methods for signaling intentions to riders | |
CN114170823A (en) | Vehicle allocation system, vehicle allocation server and vehicle allocation method | |
JP2019035622A (en) | Information storage method for vehicle, travel control method for vehicle, and information storage device for vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: UBER TECHNOLOGIES, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DONNELLY, RICHARD BRIAN;AITKEN, MICHAEL;SIGNING DATES FROM 20170615 TO 20170619;REEL/FRAME:051440/0108

Owner name: UATC, LLC, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:UBER TECHNOLOGIES, INC.;REEL/FRAME:051497/0001
Effective date: 20190702
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: UBER TECHNOLOGIES, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:UATC, LLC;REEL/FRAME:054940/0765 Effective date: 20201204 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
AS | Assignment |
Owner name: UBER TECHNOLOGIES, INC., CALIFORNIA Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNMENT DOCUMENT PREVIOUSLY RECORDED AT REEL: 054940 FRAME: 0765. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:UATC, LLC;REEL/FRAME:059692/0345 Effective date: 20201204 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |