WO2019139957A1 - Systems and methods for controlling an autonomous vehicle - Google Patents

Systems and methods for controlling an autonomous vehicle

Info

Publication number
WO2019139957A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
autonomous
instructions
computing system
autonomous vehicle
Prior art date
Application number
PCT/US2019/012857
Other languages
French (fr)
Inventor
Robert Evan MILLER
Alden James WOODROW
Eyal Cohen
Original Assignee
Uber Technologies, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/933,499 (US11243547B2)
Priority claimed from US15/980,324 (US11215984B2)
Application filed by Uber Technologies, Inc.
Priority to EP21210479.8A (EP3989032B1)
Priority to EP19703800.3A (EP3721313B1)
Publication of WO2019139957A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287 - Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0291 - Fleet control
    • G05D1/0293 - Convoy travelling
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/20 - Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles
    • G08G1/202 - Dispatching vehicles on the basis of a location, e.g. taxi dispatching
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/22 - Platooning, i.e. convoy of communicating vehicles
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W12/00 - Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06 - Authentication
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 - Services specially adapted for particular environments, situations or purposes
    • H04W4/40 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W4/46 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for vehicle-to-vehicle communication [V2V]

Definitions

  • the present disclosure relates generally to controlling an autonomous vehicle in response to communications sent by a third-party entity and controlling an autonomous vehicle to provide inspection information to a remote third-party entity.
  • An autonomous vehicle is a vehicle that is capable of sensing its environment and navigating without human input.
  • an autonomous vehicle can observe its surrounding environment using a variety of sensors and can attempt to comprehend the environment by performing various processing techniques on data collected by the sensors. Given knowledge of its surrounding environment, the autonomous vehicle can identify an appropriate motion plan through such surrounding environment.
  • One example aspect of the present disclosure is directed to a computer-implemented method for controlling an autonomous vehicle in response to vehicle instructions from a third-party entity.
  • the method includes controlling, by one or more computing devices, a first autonomous vehicle to provide a vehicle service, the first autonomous vehicle being associated with a first convoy that includes one or more second autonomous vehicles.
  • the method includes receiving, by the one or more computing devices, one or more communications from a remote computing system associated with a third-party entity, the one or more communications including one or more vehicle instructions.
  • the method includes coordinating, by the one or more computing devices, with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity.
  • the method includes controlling, by the one or more computing devices, the first autonomous vehicle to implement the one or more vehicle actions.
  • the computing system includes one or more processors and one or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the computing system to perform operations.
  • the operations include controlling a first autonomous vehicle to provide a vehicle service, the first autonomous vehicle being associated with a first convoy that includes one or more second autonomous vehicles.
  • the operations include receiving one or more communications from a remote computing system associated with a third-party entity, the one or more communications including one or more vehicle instructions.
  • the operations include coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity.
  • the operations include controlling the first autonomous vehicle to implement the one or more vehicle actions.
  • the autonomous vehicle includes one or more processors and one or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the autonomous vehicle to perform operations.
  • the operations include controlling a first autonomous vehicle to provide a vehicle service, the first autonomous vehicle being selected from a fleet of vehicles controlled by a first entity to provide the vehicle service to a second entity.
  • the operations include receiving one or more communications from a remote computing system associated with a third entity, the one or more communications including one or more vehicle instructions.
  • the operations include determining one or more vehicle actions to perform in response to the one or more vehicle instructions.
  • the operations include controlling the first autonomous vehicle to implement the one or more vehicle actions.
  • Another example aspect of the present disclosure is directed to a computer-implemented method for controlling an autonomous vehicle to provide a vehicle service.
  • the method includes determining, by one or more computing devices, vehicle diagnostics information associated with a first autonomous vehicle that is part of a fleet of vehicles controlled by a first entity to provide a vehicle service to a second entity.
  • the method includes determining, by the one or more computing devices, remote inspection information that includes an assessment of one or more categories pertaining to a third entity, based at least in part on the vehicle diagnostics information.
  • the method includes providing, by the one or more computing devices, the remote inspection information to the third entity to provide the vehicle service.
  • the computing system includes one or more processors and one or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the computing system to perform operations.
  • the operations include determining vehicle diagnostics information associated with a first autonomous vehicle that is part of a fleet of vehicles controlled by a first entity to provide a vehicle service to a second entity.
  • the operations include determining remote inspection information that includes an assessment of one or more categories pertaining to a third entity, based at least in part on the vehicle diagnostics information.
  • the operations include providing the remote inspection information to the third entity to provide the vehicle service.
  • the autonomous vehicle includes one or more processors and one or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the autonomous vehicle to perform operations.
  • the operations include determining vehicle diagnostics information associated with the autonomous vehicle, the autonomous vehicle controlled by a first entity to provide a vehicle service to a second entity.
  • the operations include determining remote inspection information that includes an assessment of one or more categories pertaining to a third entity, based at least in part on the vehicle diagnostics information.
  • the operations include providing the remote inspection information to the third entity to provide the vehicle service.
  • FIG. 1 depicts a block diagram of an example system overview according to example embodiments of the present disclosure
  • FIG. 2 depicts a block diagram of an example vehicle computing system according to example embodiments of the present disclosure
  • FIG. 3 depicts a block diagram of an example vehicle remote computing system interface according to example embodiments of the present disclosure
  • FIGS. 4A-4C depict diagrams that illustrate an example of controlling an autonomous vehicle according to example embodiments of the present disclosure
  • FIGS. 5A and 5B depict diagrams that illustrate an example of controlling an autonomous vehicle according to example embodiments of the present disclosure
  • FIG. 6 depicts a diagram that illustrates an example of controlling an autonomous vehicle according to example embodiments of the present disclosure
  • FIG. 7 depicts a diagram that illustrates an example of controlling an autonomous vehicle according to example embodiments of the present disclosure
  • FIG. 8 depicts a diagram that illustrates an example of controlling an autonomous vehicle according to example embodiments of the present disclosure
  • FIG. 9 depicts a diagram that illustrates an example of controlling an autonomous vehicle according to example embodiments of the present disclosure.
  • FIG. 10 depicts a diagram that illustrates an example of controlling an autonomous vehicle according to example embodiments of the present disclosure
  • FIG. 11 depicts a diagram that illustrates an example of controlling an autonomous vehicle according to example embodiments of the present disclosure
  • FIG. 12 depicts a diagram that illustrates an example of controlling an autonomous vehicle according to example embodiments of the present disclosure
  • FIG. 13 depicts a flow diagram of controlling an autonomous vehicle according to example embodiments of the present disclosure
  • FIG. 14 depicts an example transfer hub according to example embodiments of the present disclosure
  • FIG. 15 depicts an example transportation route according to example embodiments of the present disclosure
  • FIGS. 16A-16C depict determining remote inspection information using a mobile external monitor according to example embodiments of the present disclosure
  • FIG. 17 depicts example remote inspection information according to example embodiments of the present disclosure
  • FIG. 18 depicts a flow diagram of controlling an autonomous vehicle to provide a vehicle service according to example embodiments of the present disclosure.
  • FIG. 19 depicts example system components according to example embodiments of the present disclosure.
  • Example aspects of the present disclosure are directed to managing a fleet of vehicles in response to communications sent by a third-party entity.
  • An entity (e.g., a service provider) can operate a fleet of vehicles to provide a vehicle service.
  • the service provider can operate the fleet of vehicles to provide a vehicle service for another entity requesting the vehicle service (e.g., a client).
  • the service provider can operate the fleet of vehicles in one or more jurisdictions under the purview of one or more third-party entities (e.g., law enforcement entity, transportation infrastructure regulatory entity, tax assessment entity, etc.).
  • the fleet can include, for example, autonomous vehicles that can drive, navigate, operate, etc. with minimal and/or no interaction from a human driver.
  • a service provider can control an autonomous vehicle in the fleet of vehicles to provide the vehicle service for a client in one or more jurisdictions under the purview of one or more third-party entities (e.g., law enforcement entity, transportation infrastructure regulatory entity, tax assessment entity, etc.).
  • Systems and methods of the present disclosure can enable an autonomous vehicle that is providing a vehicle service, to receive one or more vehicle command(s) from a third-party entity, and to perform one or more vehicle action(s) in response to the vehicle command(s) sent by the third-party entity.
  • the service provider can control the autonomous vehicle to provide remote inspection information associated with the autonomous vehicle to the one or more third-party entities when the autonomous vehicle is operating in the one or more jurisdictions.
  • a service provider can operate a fleet of one or more vehicles (e.g., ground-based vehicles) to provide a vehicle service such as a transportation service, a courier service, a delivery service, etc.
  • the vehicles can be autonomous vehicles that include various systems and devices configured to control the operation of the vehicle.
  • an autonomous vehicle can include an onboard vehicle computing system for operating the vehicle (e.g., located on or within the autonomous vehicle).
  • the autonomous vehicles can operate in an autonomous mode.
  • the vehicle computing system can receive sensor data from sensors onboard the vehicle (e.g., cameras, LIDAR, RADAR), attempt to comprehend the environment proximate to the vehicle by performing various processing techniques on the sensor data, and generate an appropriate motion plan through the environment.
  • the autonomous vehicles can operate in a manual mode.
  • for example, a human operator (e.g., a driver) can manually control the autonomous vehicle.
  • the autonomous vehicle can be configured to communicate with one or more computing device(s) that are remote from the vehicle.
  • the autonomous vehicle can communicate with an operations computing system that can be associated with the service provider.
  • the operations computing system can help the service provider monitor, communicate with, manage, etc. the fleet of vehicles.
  • the autonomous vehicle can communicate with one or more other vehicles (e.g., a vehicle computing system onboard each of the one or more other vehicles in the fleet), third- party computing systems (e.g., a client computing system, law enforcement computing system, transportation infrastructure computing system, tax assessment computing system, etc.), or other remote computing systems.
  • the operations computing system can mediate communication between the autonomous vehicle and the computing device(s) that are remote from the vehicle.
  • a vehicle application programming interface (Vehicle API) platform can provide for a translation/transport layer as an interface between vehicle computing systems onboard vehicles within an entity's fleet and one or more remote clients and/or applications operating on a remote computing system (e.g., a vehicle computing system onboard each of the one or more other vehicles in the fleet, a third-party computing system, etc.).
  • the Vehicle API platform can receive data from a vehicle over a communications pipeline established with the Vehicle API.
  • the Vehicle API platform can provide for communicating vehicle data to the remote computing system in a secure manner that allows for expanded processing of vehicle data off the vehicle, analyzing such data in real time, and/or the like.
  • a Vehicle API platform can be vehicle agnostic, allowing for any autonomous vehicle and/or computer-capable vehicle to interact with a remote computing system by providing a consistent communication pipeline that any vehicle computing system would be able to use to send vehicle data (e.g., vehicle state information, etc.) and/or receive messages (e.g., command/control messages, configuration messages, etc.) from a remote computing system.
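  • As a rough, non-authoritative illustration of such a vehicle-agnostic pipeline, the Python sketch below models vehicle data and remote messages as plain structures exchanged through a single send/receive interface. All class, field, and method names here are assumptions for illustration and are not the disclosed Vehicle API.

```python
from dataclasses import dataclass
from typing import Any, Dict, List


# Hypothetical, vehicle-agnostic message types; field names are illustrative only.
@dataclass
class VehicleStateMessage:
    vehicle_id: str
    state: Dict[str, Any]        # e.g., {"speed_mph": 54.0, "tire_pressure_psi": 102}


@dataclass
class RemoteMessage:
    kind: str                    # e.g., "command", "configuration"
    payload: Dict[str, Any]


class VehicleApiPipeline:
    """Minimal sketch of a translation/transport layer between an onboard
    vehicle computing system and remote clients (operations system, third parties)."""

    def __init__(self) -> None:
        self._to_remote: List[VehicleStateMessage] = []
        self._to_vehicle: List[RemoteMessage] = []

    # Called by the onboard vehicle computing system.
    def send_vehicle_data(self, msg: VehicleStateMessage) -> None:
        self._to_remote.append(msg)

    def receive_messages(self) -> List[RemoteMessage]:
        msgs, self._to_vehicle = self._to_vehicle, []
        return msgs

    # Called by a remote client or application.
    def send_remote_message(self, msg: RemoteMessage) -> None:
        self._to_vehicle.append(msg)

    def receive_vehicle_data(self) -> List[VehicleStateMessage]:
        msgs, self._to_remote = self._to_remote, []
        return msgs


pipeline = VehicleApiPipeline()
pipeline.send_vehicle_data(VehicleStateMessage("AV-001", {"speed_mph": 54.0}))
pipeline.send_remote_message(RemoteMessage("command", {"action": "stop", "priority": "low"}))
print(pipeline.receive_vehicle_data())
print(pipeline.receive_messages())
```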
  • one or more autonomous vehicle(s) in the fleet can receive one or more vehicle command(s) from a third-party computing system associated with a third-party entity, and determine one or more vehicle action(s) to perform in response to the vehicle command(s).
  • the third-party entity can be a law enforcement entity, and more particularly a representative of the law enforcement entity, such as, for example, a police officer.
  • the police officer can send vehicle command(s) to the autonomous vehicle(s) via the third-party computing system.
  • the autonomous vehicle(s) can be configured to receive the vehicle command(s) from the third-party computing system via a local-area or short-range communication network.
  • the autonomous vehicle(s) can receive the vehicle command(s) via a direct line-of-sight communication network.
  • the third-party computing system can be limited to controlling the autonomous vehicle(s) via the vehicle command(s) when the third-party entity associated with the third-party computing system can directly observe the autonomous vehicle(s), or when the third-party entity and/or third-party computing system is proximate to the autonomous vehicle(s).
  • the autonomous vehicle(s) can be initially configured as being unselected.
  • the unselected autonomous vehicle(s) can be configured to respond only to vehicle command(s) for selecting an autonomous vehicle from the third-party entity.
  • the unselected autonomous vehicle(s) can be configured as being selected.
  • the selected autonomous vehicle(s) can be configured to respond to additional vehicle command(s) from the third-party entity.
  • a police officer can identify a first autonomous vehicle and send vehicle command(s) for selecting an autonomous vehicle, to the first autonomous vehicle.
  • the first autonomous vehicle can be configured as being selected.
  • the first autonomous vehicle can perform vehicle action(s) to indicate the selection, for example, by flashing external indicator lights, displaying a message on an external display, or performing other vehicle action(s) that can be perceived by the police officer so that the police officer can determine that the first autonomous vehicle is selected.
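  • A minimal sketch of this selected/unselected behavior is shown below; the class name, command format, and acknowledgement action are illustrative assumptions rather than the disclosed implementation.

```python
class SelectableVehicle:
    """Sketch: an unselected vehicle responds only to a SELECT command;
    once selected, it also responds to further third-party commands."""

    def __init__(self, vehicle_id: str) -> None:
        self.vehicle_id = vehicle_id
        self.selected = False

    def handle_command(self, command: dict) -> str:
        if not self.selected:
            if command.get("action") == "select":
                self.selected = True
                # A perceivable acknowledgement, e.g., flashing external indicator lights.
                return "flash_indicator_lights"
            return "ignore"  # unselected vehicles ignore all other commands
        # Selected vehicles respond to additional commands (stop, report status, ...).
        return "perform:" + str(command.get("action"))


av = SelectableVehicle("AV-001")
print(av.handle_command({"action": "stop"}))    # ignore
print(av.handle_command({"action": "select"}))  # flash_indicator_lights
print(av.handle_command({"action": "stop"}))    # perform:stop
```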
  • a police officer can identify a first autonomous vehicle and send vehicle command(s) to select the first autonomous vehicle.
  • the police officer can then send vehicle command(s) instructing the selected first autonomous vehicle to relay future vehicle command(s) from the police officer to a second autonomous vehicle.
  • the vehicle command(s) can identify the second autonomous vehicle based on an identifier associated with the second autonomous vehicle, or based on a relative position of the second autonomous vehicle with respect to the first autonomous vehicle (e.g., in front, behind, etc.).
  • the first autonomous vehicle can communicate with the second autonomous vehicle for sending data indicative of the future vehicle command(s) to the second autonomous vehicle.
  • the future vehicle command(s) can include vehicle command(s) for selecting an autonomous vehicle.
  • when the first autonomous vehicle receives the future vehicle command(s), the first autonomous vehicle can send data indicative of the future vehicle command(s) to the second autonomous vehicle.
  • the second autonomous vehicle can be configured as being selected.
  • the first autonomous vehicle can then be configured as being unselected.
  • a police officer can identify a first autonomous vehicle and broadcast vehicle command(s) for selecting an autonomous vehicle, with an intent to select the first autonomous vehicle. If a second autonomous vehicle receives the vehicle command(s) instead of the first autonomous vehicle, then the second autonomous vehicle can be selected.
  • the police officer can broadcast vehicle command(s) instructing the selected second autonomous vehicle to relay future vehicle command(s) to the first autonomous vehicle.
  • the vehicle command(s) can, for example, instruct the second autonomous vehicle to relay the future vehicle command(s) to an autonomous vehicle in front of the second autonomous vehicle or behind the second autonomous vehicle.
  • the future vehicle command(s) can include vehicle command(s) for selecting an autonomous vehicle.
  • the second autonomous vehicle can send data indicative of the future vehicle command(s) to the first autonomous vehicle.
  • the first autonomous vehicle can be configured as being selected.
  • the second autonomous vehicle can then be configured as being unselected.
  • a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle.
  • the vehicle command(s) can include or otherwise indicate a reason for the selection, such as, for example, low tire pressure, broken windshield, fluid leak, etc.
  • the police officer can send vehicle command(s) instructing the first autonomous vehicle to provide information indicating the reason for the selection to a service provider.
  • the first autonomous vehicle can determine one or more vehicle action(s) to perform in response to the vehicle command(s) from the police officer.
  • the vehicle action(s) can include, for example, communicating with the service provider to send data indicative of the reason for the selection.
  • the service provider can, for example, schedule the first autonomous vehicle for a maintenance service or a repair service at a later time.
  • a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the first autonomous vehicle to provide status information associated with the vehicle.
  • the first autonomous vehicle can determine one or more vehicle action(s) to perform in response to the vehicle command(s) from the police officer. If the vehicle command(s) instructing the first autonomous vehicle to provide the status information include or otherwise indicate one or more vehicle component(s) of interest, then the vehicle action(s) can include generating status information data that includes a status of the vehicle component(s).
  • the vehicle action(s) can include generating status information data that includes, for example, an overall status or health of the first autonomous vehicle, a status of a default set of vehicle component(s), or other status information associated with the first autonomous vehicle.
  • the first autonomous vehicle can provide the status information to the third-party computing system associated with the police officer.
  • the police officer can review the status information to verify the tire pressure of the first autonomous vehicle and take further action if necessary.
  • the further action can include, for example, sending vehicle command(s) that instruct the first autonomous vehicle to stop, travel to a maintenance area, etc.
  • a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the first autonomous vehicle to stop.
  • the first autonomous vehicle can determine one or more vehicle action(s) to perform in response to the vehicle command(s) from the police officer.
  • the vehicle action(s) can include, for example, implementing a stopping action.
  • the vehicle command(s) from a third-party entity can include or otherwise indicate a reason or a priority associated with the vehicle command(s), and an autonomous vehicle can determine vehicle action(s) to perform in response to the vehicle command(s) based on the associated reason or the priority.
  • a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the vehicle to provide status information associated with the vehicle.
  • the vehicle command(s) can include a reason for the status information, such as, for example, to check a tire pressure of the first autonomous vehicle.
  • the first autonomous vehicle can determine one or more vehicle action(s) to perform in response to the vehicle command(s) from the police officer.
  • the vehicle action(s) can include, for example, generating status information indicative of the vehicle’s tire pressure. In this way, the first autonomous vehicle can determine vehicle action(s) to perform based on the reason associated with the vehicle command(s) from the third-party entity.
  • a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the vehicle to stop.
  • the vehicle command(s) can include a reason for stopping the vehicle, such as, for example, low tire pressure.
  • the first autonomous vehicle can determine that the reason for the stop is not a critical reason, and the first autonomous vehicle can perform a soft-stop vehicle action in response to the vehicle command(s).
  • the soft-stop vehicle action can include the first autonomous vehicle continuing to travel along its route until a safe stopping location is identified (e.g., a shoulder lane, off-ramp, etc.), and stopping at the safe stopping location.
  • a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the vehicle to stop.
  • the vehicle command(s) can include a reason for the stop, such as, for example, a fluid leak.
  • the first autonomous vehicle can determine that the reason for the stop is a critical reason, and the first autonomous vehicle can perform an emergency-stop vehicle action in response to the vehicle command(s).
  • the emergency-stop vehicle action can include the first autonomous vehicle immediately stopping in a lane that the autonomous vehicle is travelling in.
  • a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the vehicle to stop.
  • the vehicle command(s) can include an associated priority level, such as, for example, low-priority or high-priority. If the priority level is low-priority, then the first autonomous vehicle can perform a soft-stop vehicle action in response to the vehicle command(s). If the priority level is high-priority, then the first autonomous vehicle can perform an emergency- stop vehicle action in response to the vehicle command(s).
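  • The reason- and priority-based choice between a soft stop and an emergency stop described above can be read as a simple decision function, sketched below; the specific reason strings, priority labels, and the set of critical reasons are assumptions.

```python
from typing import Optional

# Which reasons count as critical is an assumption drawn from the examples above
# (e.g., a fluid leak); real criteria would come from the service provider.
CRITICAL_REASONS = {"fluid_leak", "brake_failure"}


def choose_stop_action(reason: Optional[str] = None, priority: Optional[str] = None) -> str:
    """Return 'emergency_stop' (stop immediately in the current lane) or
    'soft_stop' (continue to a safe stopping location such as a shoulder or off-ramp)."""
    if priority == "high":
        return "emergency_stop"
    if priority == "low":
        return "soft_stop"
    if reason in CRITICAL_REASONS:
        return "emergency_stop"
    return "soft_stop"  # e.g., low tire pressure is treated as non-critical


print(choose_stop_action(reason="low_tire_pressure"))  # soft_stop
print(choose_stop_action(reason="fluid_leak"))         # emergency_stop
print(choose_stop_action(priority="high"))             # emergency_stop
```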
  • a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the vehicle to stop.
  • the vehicle command(s) can indicate that a reason for the stop is to enable the police officer to inspect the first autonomous vehicle.
  • the first autonomous vehicle can perform a stopping action so that the police officer can inspect the vehicle.
  • an autonomous vehicle that performs a vehicle action to come to a stop can implement one or more vehicle action(s) subsequent to the stopping action.
  • an autonomous vehicle can provide limited access to a police officer who is inspecting the autonomous vehicle.
  • an autonomous vehicle can communicate with the service provider to provide information indicative of vehicle command(s) from a third-party entity.
  • the information can include a reason for the stop.
  • a police officer can send vehicle command(s) instructing the autonomous vehicle to resume normal operations.
  • the autonomous vehicle can, for example, resume a vehicle service, join/rejoin a convoy, etc.
  • an autonomous vehicle can implement one or more action(s) subsequent to a stopping action based on one or more cargo asset(s) being transported by the autonomous vehicle.
  • the first autonomous vehicle can request that a second autonomous vehicle pick-up the cargo asset(s) from a location of the first autonomous vehicle and transport the cargo asset(s) instead of the first autonomous vehicle.
  • the first autonomous vehicle can lock-down and restrict access in order to safeguard the cargo asset(s).
  • the vehicle command(s) from a third-party entity can include information used to authenticate the third-party entity.
  • the vehicle command(s) can be encrypted using a predetermined key.
  • the predetermined key can be one of a plurality of predetermined keys previously shared between the autonomous vehicle and a plurality of third-party computing systems each associated with a third-party entity.
  • the autonomous vehicle can receive one or more encrypted vehicle command(s) from a third-party computing system, and decrypt the encrypted vehicle command(s) using a corresponding predetermined key.
  • a first predetermined key can be shared between an autonomous vehicle and a first third-party computing system associated with a first third-party entity (e.g., law enforcement entity, transportation infrastructure regulatory entity, tax assessment entity, etc.), a second predetermined key can be shared between the autonomous vehicle and a second third-party computing system associated with a second third-party entity, and a third predetermined key can be shared between the autonomous vehicle and a third third-party computing system associated with a third third-party entity.
  • If the autonomous vehicle can decrypt the encrypted vehicle command(s) with the first predetermined key, then the autonomous vehicle can authenticate the first third-party entity in association with the encrypted vehicle command(s). If the autonomous vehicle can decrypt the encrypted vehicle command(s) with the second predetermined key, then the autonomous vehicle can authenticate the second third-party entity in association with the encrypted vehicle command(s). If the autonomous vehicle can decrypt the encrypted vehicle command(s) with the third predetermined key, then the autonomous vehicle can authenticate the third third-party entity in association with the encrypted vehicle command(s).
  • the autonomous vehicle can determine one or more vehicle action(s) to perform based in part on an authentication of the third-party entity.
  • an autonomous vehicle can receive vehicle command(s) instructing the vehicle to stop. If the autonomous vehicle authenticates a law enforcement entity in association with the vehicle command(s), then the autonomous vehicle can perform a stopping action to come to a stop. Alternatively, if the autonomous vehicle authenticates a tax assessment entity in association with the vehicle command(s), then the autonomous vehicle can ignore the vehicle command(s).
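  • The key-based authentication and the resulting accept-or-ignore decision can be sketched as follows. The sketch uses symmetric Fernet tokens from the third-party cryptography package purely as a stand-in for whatever encryption scheme is contemplated, and the entity names and stop/ignore policy mirror the example above; everything else is an assumption.

```python
from typing import Optional, Tuple

from cryptography.fernet import Fernet, InvalidToken

# Predetermined keys previously shared with each third-party computing system
# (generated here only so the example runs end to end).
shared_keys = {
    "law_enforcement": Fernet.generate_key(),
    "tax_assessment": Fernet.generate_key(),
    "transport_regulator": Fernet.generate_key(),
}


def authenticate_command(token: bytes) -> Tuple[Optional[str], Optional[bytes]]:
    """Try each predetermined key; the key that successfully decrypts the token
    identifies the third-party entity that sent it."""
    for entity, key in shared_keys.items():
        try:
            return entity, Fernet(key).decrypt(token)
        except InvalidToken:
            continue
    return None, None


def respond_to_stop(token: bytes) -> str:
    entity, command = authenticate_command(token)
    if entity == "law_enforcement" and command == b"stop":
        return "perform_stopping_action"
    return "ignore"  # e.g., a stop command authenticated as a tax assessment entity


officer_token = Fernet(shared_keys["law_enforcement"]).encrypt(b"stop")
assessor_token = Fernet(shared_keys["tax_assessment"]).encrypt(b"stop")
print(respond_to_stop(officer_token))   # perform_stopping_action
print(respond_to_stop(assessor_token))  # ignore
```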
  • an autonomous vehicle can receive one or more vehicle command(s) instructing the vehicle to provide status information associated with the vehicle. If the autonomous vehicle authenticates a law enforcement entity in association with the vehicle command(s), then the autonomous vehicle can provide the status information relevant to the law enforcement entity (e.g., tire pressure, speed, etc.) to a third-party computing system associated with the law enforcement entity. Alternatively, if the autonomous vehicle authenticates a tax assessment entity in association with the vehicle command(s), then the autonomous vehicle can provide the status information relevant to the tax assessment entity (e.g., cargo type, cargo weight, etc.) to a third-party computing system associated with the tax assessment entity.
  • In some implementations, the vehicle command(s) from a third-party entity can include a vehicle identifier corresponding to a specific autonomous vehicle.
  • an autonomous vehicle can include a static display of an identifier (e.g., an identification code corresponding to the autonomous vehicle painted on the outside of the autonomous vehicle) or a dynamic display of an identifier (e.g., an external display that displays an identification code corresponding to the autonomous vehicle).
  • in this way, a law enforcement entity (e.g., a police officer) can identify a specific autonomous vehicle and include the corresponding vehicle identifier in the vehicle command(s).
  • a first autonomous vehicle that receives the vehicle command(s) can determine whether the vehicle identifier included in the vehicle command(s) corresponds to the first autonomous vehicle. If the vehicle identifier corresponds to the first autonomous vehicle, then the first autonomous vehicle can determine one or more vehicle action(s) to perform based on the vehicle command(s). If the vehicle identifier does not correspond to the first autonomous vehicle, then the first autonomous vehicle can ignore the vehicle command(s). Alternatively, if the vehicle identifier corresponds to a second autonomous vehicle that is proximate to the first autonomous vehicle, then the first autonomous vehicle can relay the vehicle command(s) to the second autonomous vehicle.
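  • The identifier check and relay behavior can be reduced to a small dispatch function, as in the sketch below; the identifier format and return values are illustrative assumptions.

```python
from typing import Optional, Set


def handle_addressed_command(my_id: str,
                             command: dict,
                             nearby_ids: Optional[Set[str]] = None) -> str:
    """Act on commands addressed to this vehicle, relay commands addressed to a
    proximate vehicle, and ignore everything else (sketch only)."""
    nearby_ids = nearby_ids or set()
    target = command.get("vehicle_id")
    if target == my_id:
        return "perform:" + str(command.get("action"))
    if target in nearby_ids:
        return "relay_to:" + target
    return "ignore"


cmd = {"vehicle_id": "AV-002", "action": "report_status"}
print(handle_addressed_command("AV-001", cmd, nearby_ids={"AV-002"}))  # relay_to:AV-002
print(handle_addressed_command("AV-002", cmd))                         # perform:report_status
print(handle_addressed_command("AV-003", cmd))                         # ignore
```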
  • a plurality of autonomous vehicles can each receive the vehicle command(s) from a third-party entity.
  • the plurality of autonomous vehicles can each determine one or more vehicle action(s) to perform in response to the vehicle command(s).
  • for example, a law enforcement entity (e.g., a computing system associated with the law enforcement entity) can broadcast the vehicle command(s) to a first autonomous vehicle, a second autonomous vehicle, and a third autonomous vehicle.
  • the first autonomous vehicle, second autonomous vehicle, and third autonomous vehicle can each independently determine one or more vehicle action(s) to perform in response to the vehicle command(s).
  • the service provider can control the autonomous vehicle to provide remote inspection information associated with the autonomous vehicle to one or more third-party entities.
  • a service provider can provide a vehicle service across a state boundary between two states that are each governed by a different state government with different laws and regulations for operating an autonomous vehicle, and policed by different law enforcement entities.
  • a service provider can provide a vehicle service across a national boundary between two nations that are each governed by different government entities.
  • a service provider can provide a vehicle service using a highway transportation infrastructure administered by a highway transportation administration, using a maritime transportation infrastructure administered by a port authority, using an air transportation infrastructure administered by an aviation administration, and/or within a shared environment administered by one or more of a national, state and local environmental protection entity.
  • a service provider can control an autonomous vehicle to autonomously determine remote inspection information, and control the autonomous vehicle to provide the remote inspection information to a third-party entity.
  • the autonomous vehicle can determine the remote inspection information based on diagnostics information associated with the autonomous vehicle.
  • the autonomous vehicle can generate the diagnostics information and/or obtain the diagnostics information from an external monitor.
  • the remote inspection information can include, for example, a speed of the autonomous vehicle, a weight of a cargo item attached to the autonomous vehicle, a contents of a cargo item attached to the autonomous vehicle, an engine status, tire pressure, tire wear, readings of one or more sensors on-board the autonomous vehicle, an operating status of one or more sensors on-board the autonomous vehicle, a distance travelled by the autonomous vehicle, fuel consumption, emissions levels, a physical location of the autonomous vehicle, an indication of one or more faults detected by the autonomous vehicle, etc.
  • the autonomous vehicle can provide, for example, remote inspection information including a speed of the autonomous vehicle to a law enforcement entity, remote inspection information including a weight of a cargo item attached to the autonomous vehicle to a tax assessment entity, etc.
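  • A hedged sketch of such entity-specific remote inspection views follows; the diagnostics fields, entity names, and field-to-entity mapping are illustrative assumptions based on the examples above.

```python
from dataclasses import dataclass


@dataclass
class DiagnosticsSnapshot:
    speed_mph: float
    cargo_weight_lb: float
    cargo_contents: str
    tire_pressure_psi: float
    emissions_g_per_mi: float


# Illustrative mapping of which diagnostics fields each third-party entity receives.
ENTITY_FIELDS = {
    "law_enforcement": ["speed_mph", "tire_pressure_psi"],
    "tax_assessment": ["cargo_weight_lb", "cargo_contents"],
    "environmental_protection": ["emissions_g_per_mi"],
}


def remote_inspection_info(diag: DiagnosticsSnapshot, entity: str) -> dict:
    """Select only the diagnostics fields relevant to the requesting entity."""
    return {name: getattr(diag, name) for name in ENTITY_FIELDS.get(entity, [])}


snapshot = DiagnosticsSnapshot(54.0, 12500.0, "produce", 102.0, 8.4)
print(remote_inspection_info(snapshot, "law_enforcement"))
print(remote_inspection_info(snapshot, "tax_assessment"))
```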
  • an operations computing system associated with the service provider can mediate communication between the autonomous vehicle and one or more third-party entities.
  • a service provider can control an autonomous vehicle to travel to a vicinity of an external monitor that can determine remote inspection information associated with the autonomous vehicle.
  • the external monitor can determine the remote inspection information based on diagnostics information associated with the autonomous vehicle.
  • the external monitor can generate the diagnostics information and/or obtain the diagnostics information from the autonomous vehicle.
  • the external monitor can provide the remote inspection information to a third-party entity.
  • the remote inspection information can include, for example, a speed of the autonomous vehicle, a weight of a cargo item attached to the autonomous vehicle, a content of a cargo item attached to the autonomous vehicle, etc.
  • the external monitor can provide, for example, remote inspection information including a weight of a cargo item attached to the autonomous vehicle to a tax assessment entity, and remote inspection information including emissions levels of the autonomous vehicle to an environmental protection entity.
  • the service provider can control an autonomous vehicle to autonomously determine remote inspection information.
  • the autonomous vehicle can generate diagnostics information associated with an operation of the autonomous vehicle, and determine the remote inspection information based on the diagnostics information.
  • the service provider can control the autonomous vehicle to autonomously generate such diagnostics information while the autonomous vehicle is in use.
  • the diagnostics information can include information corresponding to one or more systems on board the autonomous vehicle and/or information corresponding to an environment in which the autonomous vehicle operates.
  • diagnostics information can include information on one or more faults detected with respect to one or more systems on-board an autonomous vehicle.
  • diagnostics information can include sensor data obtained by one or more sensors on-board an autonomous vehicle.
  • the sensor data can include information on one or more components of the autonomous vehicle and/or information on an environment in which the autonomous vehicle operates.
  • the service provider can control an autonomous vehicle to travel to a vicinity of an external monitor to determine remote inspection information associated with the autonomous vehicle.
  • the service provider can control the autonomous vehicle to provide a vehicle service using a transportation network that includes one or more external monitors at one or more locations.
  • the transportation network can include a plurality of transfer hubs and a plurality of transportation routes that link the plurality of transfer hubs with one another.
  • the transportation network can utilize, for example, one or more of a highway transportation infrastructure, maritime transportation infrastructure, and air transportation infrastructure.
  • the transportation network can include one or more external monitors that can determine remote inspection information associated with an autonomous vehicle.
  • the one or more external monitors can determine the remote inspection information based on diagnostics information associated with the autonomous vehicle, and can provide the remote inspection information to a third-party entity.
  • the one or more external monitors can generate the diagnostics information and/or obtain the diagnostics information from the autonomous vehicle, when the autonomous vehicle is in a vicinity of the external monitor.
  • the plurality of transfer hubs can include one or more external monitors that can determine remote inspection information associated with an autonomous vehicle when the autonomous vehicle enters and/or exits the transfer hub.
  • the one or more external monitors can include an autonomous inspector that can inspect the autonomous vehicle and/or a human inspector that can inspect the autonomous vehicle.
  • an external monitor can include a camera that visually inspects an autonomous vehicle to generate diagnostics information associated with the autonomous vehicle and determine remote inspection information based on the diagnostics information.
  • an external monitor can include a weigh scale that can weigh a cargo item attached to an autonomous vehicle to generate diagnostics information associated with the autonomous vehicle and determine remote inspection information based on the diagnostics information.
  • an external monitor can include a human inspector that can perform an inspection of an autonomous vehicle to generate diagnostics information associated with the autonomous vehicle and determine remote inspection information based on the diagnostics information.
  • an external monitor can include a wireless beacon configured to wirelessly communicate with an autonomous vehicle as it passes a location of the beacon to obtain diagnostics information from the autonomous vehicle.
  • the plurality of transportation routes can include one or more external monitors located at one or more locations along the transportation routes that can determine remote inspection information associated with an autonomous vehicle when the autonomous vehicle passes within a vicinity of the external monitor.
  • a transportation route can include a plurality of external monitors at periodic intervals along the transportation route.
  • a transportation route can include an external monitor where the transportation route crosses a jurisdictional boundary.
  • an external monitor can include a dedicated and/or physical connection to a communications network, and the transportation route can include an external monitor where wireless communication is unavailable or unreliable.
  • the service provider can control an autonomous vehicle to provide remote inspection information to a third-party entity.
  • a service provider can control an autonomous vehicle to autonomously provide remote inspection information to a third-party entity at one or more times when the autonomous vehicle is in use.
  • a service provider can control an autonomous vehicle to travel to a vicinity of an external monitor when the autonomous vehicle enters and/or exits a transfer hub, to provide remote inspection information to a third-party entity.
  • a service provider can control an autonomous vehicle to autonomously provide remote inspection information to an external monitor, when the autonomous vehicle is travelling on a transportation route.
  • an external monitor can communicate with one or more of a service provider, an autonomous vehicle, and a third-party entity.
  • an external monitor can communicate diagnostics information to a service provider.
  • an external monitor can communicate diagnostics information to an autonomous vehicle that it inspected so that the autonomous vehicle can aggregate diagnostics information from one or more sources.
  • the service provider can determine remote inspection information based on diagnostics information obtained from an external monitor and/or an autonomous vehicle, and provide the remote inspection information to a third-party entity. Additionally and/or alternatively, the service provider can control the autonomous vehicle to provide the remote inspection information to the third-party entity.
  • an external monitor can be fixed at a particular geographic location.
  • an external monitor that includes a weigh scale can be fixed at a location in a transfer hub.
  • an external monitor can be mobile.
  • an external monitor that includes a camera to visually inspect an autonomous vehicle can be affixed to another vehicle.
  • a service provider can control a first autonomous vehicle affixed with an external monitor to travel to a vicinity of a second autonomous vehicle so that the external monitor can inspect the second autonomous vehicle and determine remote inspection information associated with the second autonomous vehicle.
  • a service provider can control a first autonomous vehicle affixed with an external monitor to travel to a vicinity of a second autonomous vehicle in response to a request by the second autonomous vehicle. For example, if a first autonomous vehicle detects a fault with a tire pressure sensor, then the first autonomous vehicle can request that a second autonomous vehicle affixed with an external monitor travel to a vicinity of the first autonomous vehicle to visually inspect one or more tires of the first autonomous vehicle. The second autonomous vehicle can communicate a status of the first autonomous vehicle’s tires to the first autonomous vehicle.
  • one or more sensors onboard an autonomous vehicle can be used to inspect another vehicle (e.g., a second autonomous vehicle) and determine remote inspection information associated with the vehicle.
  • a service provider can control one or more other autonomous vehicles in a fleet of vehicles to travel to a vicinity of the first autonomous vehicle to visually inspect one or more tires of the first autonomous vehicle.
  • the second autonomous vehicle can communicate a status of the first autonomous vehicle’s tires to the first autonomous vehicle.
  • the remote inspection information can include a status of one or more categories pertaining to a third-party entity to which the remote inspection information is provided.
  • remote inspection information can include a numerical value as a status for a speed and/or a weight associated with an autonomous vehicle.
  • remote inspection information can include an indication of "under speed limit", "within a threshold value of the speed limit", or "over speed limit" as a status for a speed.
  • remote inspection information can include an indication of "unchanged" or "changed" as a status for a weight, relative to a starting weight associated with an autonomous vehicle.
  • a status of one or more categories can include "green", "yellow", and "red", and the one or more categories can include a first category corresponding to one or more components of an autonomous vehicle, a second category corresponding to a performance of an autonomous vehicle, and a third category corresponding to a surrounding environment of the autonomous vehicle.
  • a category of one or more components of an autonomous vehicle can include one or more of a vehicle platform, vehicle computing system, one or more sensors, engine, tires, etc.
  • Remote inspection information associated with the autonomous vehicle can include "green" for a status of the components if diagnostics information associated with the autonomous vehicle indicates that all the components of the autonomous vehicle are functioning properly and there are no detected faults.
  • Remote inspection information associated with the autonomous vehicle can include "yellow" for a status of the components of the autonomous vehicle if vehicle diagnostics information associated with the autonomous vehicle indicates a problem or detected fault (e.g., engine temperature is high, tire pressure is low), but a current operation of the autonomous vehicle can be completed safely.
  • Remote inspection information associated with the autonomous vehicle can include "red" for a status of the components of the autonomous vehicle if vehicle diagnostics information associated with the autonomous vehicle indicates a critical problem or error that affects a safe operation of the autonomous vehicle.
  • a category of performance of an autonomous vehicle can include one or more of a speed, distance travelled, fuel consumption, weight, emissions, coolant level, brake wear, etc.
  • Remote inspection information associated with the autonomous vehicle can include "green" for a status of the performance of the autonomous vehicle if a speed, weight, and emissions are within an acceptable range for a jurisdiction in which the autonomous vehicle is operating.
  • Remote inspection information associated with the autonomous vehicle can include "yellow" for a status of the performance of the autonomous vehicle if there is a spike in fuel consumption or emissions while the autonomous vehicle is operating.
  • Remote inspection information associated with the autonomous vehicle can include "red" for a status of the performance of the autonomous vehicle if a coolant level drops below a critical level and/or brake wear exceeds a critical level.
  • a category of a surrounding environment of an autonomous vehicle can include one or more of road conditions, weather conditions, traffic conditions, etc.
  • Remote inspection information associated with the autonomous vehicle can include "green" for a status of the surrounding environment if the autonomous vehicle encounters good road conditions (e.g., well maintained, existence of safety lanes, etc.).
  • Remote inspection information associated with the autonomous vehicle can include "yellow" for a status of the surrounding environment if the autonomous vehicle encounters degraded road, weather, or traffic conditions that affect, but do not prevent, safe operation.
  • Remote inspection information associated with the autonomous vehicle can include "red" for a status of the surrounding environment if the autonomous vehicle encounters a hazard, accident, or other event that renders a road segment impassable.
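  • The green/yellow/red rollup described in the preceding bullets can be expressed as a simple severity mapping, sketched below; the fault labels and condition flags are assumptions.

```python
from typing import List


def component_status(faults: List[dict]) -> str:
    """Map detected faults to a green/yellow/red component status (sketch):
    'critical' faults affect safe operation; other faults still allow the
    current operation to be completed safely."""
    if any(f.get("critical") for f in faults):
        return "red"
    if faults:
        return "yellow"
    return "green"


def environment_status(road_impassable: bool, degraded_conditions: bool) -> str:
    """Surrounding-environment rollup: impassable -> red, degraded -> yellow."""
    if road_impassable:
        return "red"
    if degraded_conditions:
        return "yellow"
    return "green"


print(component_status([]))                                           # green
print(component_status([{"code": "low_tire_pressure"}]))              # yellow
print(component_status([{"code": "coolant_low", "critical": True}]))  # red
print(environment_status(road_impassable=False, degraded_conditions=True))  # yellow
```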
  • the service provider can control an autonomous vehicle and/or an external monitor to provide remote inspection information to an appropriate third- party entity.
  • remote inspection information that includes a status of one or more components of an autonomous vehicle can be provided to a highway transportation administration, port authority, or aviation administration to ensure compliance with rules and/or regulations concerning operation of an autonomous vehicle over a highway, maritime, or air transportation infrastructure.
  • remote inspection information that includes a status of a location, speed, and transportation route of an autonomous vehicle can be provided to a law enforcement entity to assist the law enforcement entity in monitoring vehicular traffic in its jurisdiction.
  • remote inspection information that includes a status of a cargo attached to an autonomous vehicle (e.g., cargo weight) can be provided to a tax assessment entity.
  • remote inspection information that includes emissions information of an autonomous vehicle can be provided to an environmental protection entity to ensure compliance with emissions standards.
  • remote inspection information that includes a status of a surrounding environment of an autonomous vehicle can be provided to a highway transportation administration to report road conditions.
  • remote inspection information that includes a status of a surrounding environment can be provided to a law enforcement entity to report an accident.
  • the service provider can control an autonomous vehicle and/or an external monitor to provide remote inspection information to a third-party entity at one or more times.
  • remote inspection information associated with an autonomous vehicle can be provided to a law enforcement entity each time the autonomous vehicle exits a transfer hub onto a transportation route.
  • remote inspection information associated with an autonomous vehicle can be provided to an environmental protection entity at a predetermined time interval and/or if diagnostics information associated with the autonomous vehicle indicates an emissions spike.
  • remote inspection information associated with an autonomous vehicle can be provided to a tax assessment entity each time the autonomous vehicle crosses into a different tax jurisdiction.
  • a plurality of vehicles in the fleet can operate as a convoy, such that the plurality of vehicles in the convoy travel together as a group.
  • the convoy can include a lead vehicle and one or more follower vehicle(s).
  • the lead vehicle can be configured to operate ahead of the follower vehicle(s), and the follower vehicle(s) can be configured to follow behind the lead vehicle.
  • the follower vehicle(s) can be configured to follow a preceding vehicle in the convoy.
  • the follower vehicle(s) can include a first follower vehicle, second follower vehicle, third follower vehicle, and fourth follower vehicle.
  • the first follower vehicle can be configured to follow behind the lead vehicle, the second follower vehicle configured to follow behind the first follower vehicle, the third follower vehicle configured to follow behind the second follower vehicle, and the fourth follower vehicle can be configured to follow behind the third follower vehicle.
  • the follower vehicle(s) can be configured to follow at a predetermined distance.
  • the predetermined distance can be static or dynamic, and can be based on, for example, whether the plurality of vehicles are operating on a highway or on local roads, traffic conditions (e.g., volume of traffic, speed of traffic, etc.), road conditions (e.g., road incline/decline, speed limit, construction zones, etc.), a communication range (e.g., so that the plurality of autonomous vehicles can communicate with each other), weather conditions (e.g., that affect visibility, vehicle traction, stopping distance, etc.), etc.
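  • A dynamic follow distance of the kind described can be computed from a few condition inputs, as in the sketch below; the base distance, multipliers, and communication-range cap are purely illustrative values.

```python
def follow_distance_m(base_m: float = 30.0,
                      on_highway: bool = True,
                      heavy_traffic: bool = False,
                      poor_weather: bool = False,
                      comms_range_m: float = 200.0) -> float:
    """Sketch: scale a base follow distance by road type, traffic, and weather,
    but never exceed the vehicle-to-vehicle communication range."""
    distance = base_m * (1.5 if on_highway else 1.0)
    if heavy_traffic:
        distance *= 0.8   # tighter spacing at lower speeds
    if poor_weather:
        distance *= 1.5   # longer stopping distance, reduced visibility
    return min(distance, comms_range_m)


print(follow_distance_m())                                      # 45.0
print(follow_distance_m(poor_weather=True))                     # 67.5
print(follow_distance_m(on_highway=False, heavy_traffic=True))  # 24.0
```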
  • the plurality of vehicles in the convoy can operate independently, and/or communicate with each other to coordinate one or more vehicle action(s) to perform in response to vehicle command(s) from a third-party entity.
  • one or more autonomous vehicle(s) in the convoy can operate in an autonomous mode.
  • Each of the autonomous vehicle(s) can obtain sensor data from sensors onboard the vehicle, attempt to comprehend the environment proximate to the vehicle by performing processing techniques on the sensor data, and generate an appropriate motion plan through the environment.
  • each of the autonomous vehicle(s) can identify one or more obstacle(s) in the environment proximate to the vehicle, and generate a motion plan to avoid the obstacle(s).
  • the motion plan can include, for example, avoidance maneuvers, stopping actions, etc.
  • the autonomous vehicle(s) can communicate with each other to share information, such as, for example, one or more obstacle(s) identified in the environment, avoidance maneuvers, stopping actions, or other information.
  • the autonomous vehicle(s) can generate an appropriate motion plan based in part on the shared information.
  • an autonomous vehicle in the convoy can know in advance about a location of an obstacle or a trajectory of another autonomous vehicle, so that the autonomous vehicle can generate a motion plan to avoid the obstacle and/or other vehicle.
  • an autonomous vehicle in the convoy, in response to a stop command, can determine vehicle actions to perform that include removal from the convoy and a stopping action.
  • the autonomous vehicle can communicate with other vehicles in the convoy to send data indicative of the stop command and/or the determined vehicle actions, and then perform the stopping action.
  • the other vehicles in the convoy can receive the data and adjust their motion plan to maneuver past the autonomous vehicle as it performs the stopping action.
  • an autonomous vehicle in the convoy, in response to a stop command, can determine vehicle actions to perform that include coordinating with the other vehicles in the convoy to stop as a group.
  • the autonomous vehicle can communicate with the other vehicles in the convoy to determine coordinated stopping actions for the vehicles in the convoy.
  • Each vehicle in the convoy can perform a respective coordinated stopping action so that the convoy can stop as a group.
  • the lead vehicle in the convoy can include a human operator.
• the human operator can manually control the lead vehicle (e.g., via a human-machine interface), and send one or more vehicle command(s) to the follower vehicle(s) to manage/control the convoy.
  • the vehicle command(s) can include, for example, instructions for a follower vehicle to start, stop, slow down, speed up, increase/decrease a follow distance, follow a specific/different vehicle in the convoy, respond to or ignore a vehicle command sent by a third-party entity, etc.
  • the vehicle command(s) can also include instructions to rearrange the convoy by adding or removing a vehicle from the convoy, as will be described further below.
  • the lead vehicle can be an autonomous vehicle operating in a manual mode, and a human operator can be a driver who can manually drive the lead vehicle.
  • the human operator can also communicate (e.g., receive data, send vehicle command(s), etc.) with the follower vehicle(s) via the lead vehicle.
  • a human operator can identify one or more obstacle(s) in an environment.
  • the obstacle(s) can include, for example, traffic conditions (e.g., volume of traffic, speed of traffic, etc.), road conditions (e.g., blocked lanes, incline/decline, sharp turns, speed limit, construction zones, etc.), weather conditions (e.g., that affect visibility, vehicle traction, stopping distance, etc.), etc.
  • the human operator can manage/control the convoy by sending one or more vehicle command(s) to assist the follower vehicle(s) in navigating the environment including the obstacle(s).
  • a human operator in the lead vehicle of a stopped convoy can control the convoy to start/resume travelling in an environment.
  • the human operator can determine when and how the convoy should start/resume based on one or more obstacle(s) in the environment (e.g., traffic conditions, road conditions, weather conditions, etc.) at one or more times.
• when the human operator decides to start/resume, the human operator can send vehicle command(s) to the follower vehicle(s) to perform coordinated starting actions and start/resume travelling in the environment.
  • a human operator in the lead vehicle of a convoy can determine that the convoy is approaching (or has entered) a road segment including a construction zone.
  • the construction zone can be associated with a lower speed limit than a normal speed limit for the road segment.
  • the human operator can send vehicle command(s) to instruct the follower vehicle(s) to reduce speed.
  • the human operator can send vehicle command(s) to instruct the follower vehicle(s) to increase speed.
  • a human operator in the lead vehicle of a convoy can determine that an upcoming road segment along the convoy’s route includes a steep decline segment.
  • the human operator can send vehicle command(s) instructing the follower vehicle(s) to reduce speed and increase a follow distance while traversing the decline segment. After traversing the decline segment, the human operator can send vehicle command(s) instructing the follower vehicle(s) to increase speed and decrease a follow distance.
  • the lead vehicle can communicate with the follower vehicle(s) to obtain status information from the follower vehicle(s).
  • Each of the follower vehicle(s) can generate status information data associated with the vehicle, and send the status information data to the lead vehicle.
  • the status information data can include, for example, a vehicle location, vehicle health, vehicle diagnostics, raw sensor data, audio-visual data, a vehicle command sent by a third-party entity, etc., associated with one or more of the follower vehicle(s).
  • a human operator in the lead vehicle can manage/control the convoy based on the received data.
  • each of the follower vehicle(s) in a convoy can periodically generate data indicative of the vehicle’s location, and send the data to the lead vehicle. If a human operator in the lead vehicle determines that a follower vehicle is travelling too slow based on the follower vehicle’s location, then the human operator can send vehicle command(s) instructing the follower vehicle to increase speed. If the human operator in the lead vehicle determines that a follower vehicle is travelling too fast, based on the follower vehicle’s location, then the human operator can send vehicle command(s) instructing the follower vehicle to reduce speed.
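• A status-information record of the kind described above could be represented, for example, by a simple data structure such as the hypothetical sketch below; the field names and serialization are illustrative assumptions rather than a defined message format.

```python
# Hypothetical sketch of a status-information record that a follower vehicle
# might periodically send to the lead vehicle. Field names are illustrative.
from dataclasses import dataclass, field, asdict
from typing import Optional, Dict
import json, time

@dataclass
class StatusInformation:
    vehicle_id: str
    timestamp: float
    location: Dict[str, float]                 # e.g., {"lat": ..., "lon": ...}
    health: str                                # e.g., "OK", "DEGRADED"
    diagnostics: Dict[str, float] = field(default_factory=dict)  # e.g., tire pressures
    third_party_command: Optional[str] = None  # last command received from a third party

    def to_message(self) -> bytes:
        """Serialize for transmission to the lead vehicle."""
        return json.dumps(asdict(self)).encode("utf-8")

if __name__ == "__main__":
    msg = StatusInformation(
        vehicle_id="follower-2",
        timestamp=time.time(),
        location={"lat": 37.77, "lon": -122.42},
        health="OK",
        diagnostics={"tire_pressure_psi": 101.5},
    ).to_message()
    print(msg)
```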
  • a follower vehicle in a convoy can encounter an obstacle or an unfamiliar environment that the follower vehicle is unable to navigate.
  • the follower vehicle can generate data indicative of the obstacle or unfamiliar environment, and send the data to the lead vehicle in the convoy.
  • a human operator in the lead vehicle can instruct the follower vehicle to send audio-visual data from one or more cameras onboard the vehicle, and manually control the follower vehicle to navigate past the obstacle or unfamiliar environment.
• a follower vehicle in a convoy can receive vehicle command(s) from a third-party entity selecting the vehicle and instructing it to stop.
  • the vehicle command(s) can also include a reason for the stop, such as, for example, because the third-party entity believes that a tire pressure of the selected follower vehicle is too low.
  • the selected follower vehicle can send data indicative of the vehicle command(s) sent by the third-party entity to the lead vehicle.
  • a human operator in the lead vehicle can check the reason for the stop command based on vehicle diagnostics data received from the selected follower vehicle.
  • the vehicle diagnostics data can include, for example, a tire pressure of the vehicle, as measured by one or more sensor(s) onboard the vehicle.
  • the human operator can send vehicle command(s) to the selected follower vehicle for the vehicle diagnostics data, and the selected follower vehicle can send the data in response to the vehicle command(s). If the human operator determines that the tire pressure is normal, then the human operator can send vehicle command(s) instructing the selected follower vehicle to ignore the stop command sent by the third-party entity. If the human operator determines that the tire pressure is not normal, then the human operator can send vehicle command(s) instructing all vehicles in the convoy to stop as a group, so that the human operator can inspect the selected follower vehicle.
  • the human operator can send vehicle command(s) to remove the selected follower vehicle from the convoy, so that the selected follower vehicle can come to a stop independently of the convoy, and the convoy can continue without the selected follower vehicle.
  • a selected follower vehicle in a convoy can receive vehicle command(s) from a third-party entity instructing the vehicle to stop because the third-party entity would like to inspect the vehicle.
  • the vehicle command(s) can also include a request for a human to be present at the inspection.
  • the selected follower vehicle can send data indicative of the vehicle command(s) from the third-party entity to the lead vehicle.
  • a human operator in the lead vehicle can send vehicle command(s) instructing all vehicles in the convoy to stop as a group, so that the human operator can be present for the inspection by the third-party entity.
  • one or more vehicles in a convoy can be removed from the convoy, and the convoy can be rearranged to continue without the one or more vehicles.
  • the convoy can include a first vehicle that is configured as a lead vehicle, a second vehicle configured to follow the first vehicle, a third vehicle configured to follow the second vehicle, and a fourth vehicle configured to follow the third vehicle. If the first vehicle in the convoy receives a vehicle command to stop, then the first vehicle can send vehicle commands to stop the convoy. If the second vehicle in the convoy receives a vehicle command to stop from a third-party entity, then the second vehicle can be removed from the convoy, and the third vehicle can be configured to follow the first vehicle. Additionally, the first vehicle can slow down and/or the third and fourth vehicles can speed up to maintain a predetermined distance between the vehicles in the convoy.
• if the third vehicle in the convoy receives a vehicle command to stop, then the third vehicle can be removed from the convoy, and the fourth vehicle can be configured to follow the second vehicle. Additionally, the first and second vehicles can slow down and/or the fourth vehicle can speed up to maintain a predetermined distance between the vehicles in the convoy.
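• One way to express the rearrangement described in the two examples above is to treat the convoy as an ordered list of vehicle identifiers and re-link the follow relationships when a vehicle is removed. The sketch below is a hypothetical illustration; the Convoy class and its methods are assumptions, not part of the disclosure.

```python
# Hypothetical sketch: rearranging an ordered convoy when a vehicle is removed.
from typing import Dict, List, Optional

class Convoy:
    def __init__(self, ordered_ids: List[str]):
        self.order = list(ordered_ids)  # index 0 is the lead vehicle

    def follow_map(self) -> Dict[str, Optional[str]]:
        """Return which vehicle each member is configured to follow."""
        return {vid: (self.order[i - 1] if i > 0 else None)
                for i, vid in enumerate(self.order)}

    def remove(self, vehicle_id: str) -> Dict[str, Optional[str]]:
        """Remove a vehicle; the vehicle behind it re-targets the one ahead."""
        self.order.remove(vehicle_id)
        return self.follow_map()

if __name__ == "__main__":
    convoy = Convoy(["first", "second", "third", "fourth"])
    # Second vehicle is stopped by a third-party command and leaves the convoy:
    print(convoy.remove("second"))
    # -> {'first': None, 'third': 'first', 'fourth': 'third'}
```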
• a selected autonomous vehicle in the convoy, in response to vehicle command(s) to stop, can determine vehicle actions to perform that include removal from the convoy and a stopping action.
  • the selected autonomous vehicle can communicate with other vehicles in the convoy to send data indicative of the determined vehicle actions.
  • the convoy can be rearranged to continue without the selected autonomous vehicle, and the rearranged convoy can maneuver past the selected autonomous vehicle as it performs the stopping action.
• a selected autonomous vehicle in the convoy, in response to vehicle command(s) to stop, can determine vehicle actions to perform that include notifying a lead vehicle in the convoy.
  • the selected autonomous vehicle can communicate with the lead vehicle to send data indicative of the vehicle command(s) to stop, and wait for a decision from the lead vehicle. If a human operator in the lead vehicle decides to remove the selected autonomous vehicle from the convoy, then the human operator can send vehicle command(s) to remove the selected autonomous vehicle from the convoy and rearrange the convoy to continue without the selected autonomous vehicle.
  • the selected autonomous vehicle can automatically perform the stopping action.
  • one or more vehicles in a convoy can be added to the convoy, and the convoy can be rearranged to incorporate the one or more vehicles.
  • the convoy can include a first vehicle that is configured as a lead vehicle, a second vehicle configured to follow the first vehicle, a third vehicle configured to follow the second vehicle, and a fourth vehicle configured to follow the third vehicle.
  • a fifth vehicle that is not in the convoy can attempt to join the convoy by sending a request to the lead vehicle. If a human operator in the lead vehicle decides to add the fifth vehicle to the convoy, then the human operator can send vehicle command(s) to add the fifth vehicle to the convoy. In particular, the human operator can send vehicle command(s) to the fifth vehicle to follow the fourth vehicle. Alternatively, the human operator can send vehicle command(s) to the fifth vehicle to follow the third vehicle, and send vehicle command(s) to the fourth vehicle to follow the fifth vehicle.
  • the human operator can send vehicle command(s) to the fifth vehicle to follow the second vehicle, and send vehicle command(s) to the third vehicle to follow the fifth vehicle.
  • the human operator can send vehicle command(s) to the fifth vehicle to follow the first vehicle, and send vehicle command(s) to the second vehicle to follow the fifth vehicle.
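• The alternative placements described above amount to inserting the joining vehicle at a chosen position and re-linking the vehicle immediately behind it. A minimal, hypothetical sketch (the function name and ordering convention are assumptions):

```python
# Hypothetical sketch: inserting a joining vehicle at a chosen position in an
# ordered convoy; each vehicle follows the one listed just ahead of it.
from typing import List

def add_to_convoy(order: List[str], new_id: str, follow_id: str) -> List[str]:
    """Place new_id directly behind follow_id; the previous follower of
    follow_id is thereby re-configured to follow new_id."""
    updated = list(order)
    updated.insert(updated.index(follow_id) + 1, new_id)
    return updated

if __name__ == "__main__":
    convoy = ["first", "second", "third", "fourth"]
    # The fifth vehicle joins behind the third; the fourth now follows the fifth.
    print(add_to_convoy(convoy, "fifth", "third"))
    # -> ['first', 'second', 'third', 'fifth', 'fourth']
```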
  • a selected autonomous vehicle in a first convoy can receive vehicle command(s) to stop from a third-party entity.
  • the selected autonomous vehicle can be removed from the first convoy so that the selected autonomous vehicle can come to a stop and the first convoy can continue without the selected autonomous vehicle.
• once the reason for the stop is resolved (e.g., an inspection is completed, one or more tires are inflated, etc.), the selected autonomous vehicle can attempt to rejoin the first convoy if it can safely catch up to the first convoy.
  • the selected autonomous vehicle can send a request to join/rejoin the first convoy to the lead vehicle.
  • a human operator in the lead vehicle can decide whether to add the selected autonomous vehicle to the first convoy. If the human operator decides to add the selected autonomous vehicle, then the human operator can send vehicle command(s) instructing the selected autonomous vehicle to follow a vehicle in the first convoy.
  • a selected autonomous vehicle in a first convoy can receive vehicle command(s) to stop from a third-party entity.
  • the selected autonomous vehicle can be removed from the first convoy so that the selected autonomous vehicle can come to a stop and the first convoy can continue without the selected autonomous vehicle.
  • the selected autonomous vehicle can attempt to join a second convoy.
  • the selected autonomous vehicle can send a request to join the second convoy to the lead vehicle in the second convoy.
  • a human operator in the lead vehicle can decide whether to add the selected autonomous vehicle to the second convoy. If the human operator decides to add the selected autonomous vehicle, then the human operator can send vehicle command(s) instructing the selected autonomous vehicle to follow a vehicle in the second convoy.
  • the systems and methods described herein may provide a number of technical effects and benefits. For instance, instead of stopping at weigh stations intermittently along a route, inspections and weigh-ins can be performed during transfer at a transfer hub and/or in real-time as an autonomous vehicle is travelling from a first transfer hub to a second transfer hub.
  • diagnostics information associated with the autonomous vehicle can be generated and used to determine remote inspection information that can be provided to a remote third-party enforcement entity.
  • the third-party entity can be provided with more accurate inspection information associated with an autonomous vehicle.
• improvements to the computing technology tasked with providing a vehicle service and/or managing a fleet of vehicles can improve utilization of the fleet for providing the vehicle service, resulting in greater throughput and reduced energy expenditure by avoiding intermittent stops along a route.
  • FIG. 1 depicts an example system 100 according to example embodiments of the present disclosure.
  • the system 100 can include a vehicle computing system 102 associated with a vehicle 104.
  • the system 100 can also include one or more vehicle(s) 105, each including a respective vehicle computing system (not shown).
  • the system 100 can include one or more remote computing system(s) 103 that are remote from the vehicle 104 and the vehicle(s) 105.
  • the remote computing system(s) 103 can include an operations computing system 120, one or more client computing system(s) 122, one or more third-party computing system(s) 124, and one or more external monitor computing system(s) 126.
• the remote computing system(s) 103 can be separate from one another or share computing device(s).
  • the vehicle 104 can be part of a fleet of vehicles operated by the operations computing system 120.
  • the fleet of vehicles can also include the vehicle(s) 105.
  • the operations computing system 120 can manage or operate the vehicle 104 via the vehicle computing system 102, and manage or operate the vehicle(s) 105 via the respective vehicle computing system for each vehicle.
  • the operations computing system 120 can obtain data indicative of a vehicle service request from a client, for example, via the client computing system 122.
• the operations computing system 120 can select the vehicle 104 to provide the requested vehicle service.
  • the operations computing system 120 can control the vehicle 104 to provide remote inspection information to the one or more third-party computing system(s) 124.
• the vehicle 104 incorporating the vehicle computing system 102, and the vehicle(s) 105, can each be a ground-based autonomous vehicle (e.g., car, truck, bus), an air-based autonomous vehicle (e.g., airplane, drone, helicopter, or other aircraft), or another type of vehicle (e.g., watercraft).
  • the vehicle 104, and vehicle(s) 105 can be an autonomous vehicle that can drive, navigate, operate, etc. with minimal and/or no interaction from a human driver.
  • the vehicle computing system 102 can include one or more computing device(s) located on-board the vehicle 104 (e.g., located on and/or within the vehicle 104).
  • the computing device(s) can include various components for performing various operations and functions.
  • the computing device(s) can include one or more processor(s) and one or more tangible, non-transitory, computer readable media.
  • the one or more tangible, non-transitory, computer readable media can store instructions that when executed by the one or more processor(s) cause the vehicle 104 (e.g., its computing system, one or more processors, etc.) to perform operations and functions, such as those described herein.
• the vehicle computing system 102 can include a Vehicle API client that can enable bidirectional communication with a remote computing system 103 (e.g., operations computing system 120, client computing system(s) 122, third-party computing system(s) 124) and/or a vehicle computing system onboard each of the vehicle(s) 105 through a Vehicle API Platform operating on the remote computing system 103 and/or the vehicle computing system onboard each of the vehicle(s) 105.
• the Vehicle API Platform and the Vehicle API client can provide for establishing one or more communication channels between the vehicle computing system 102 and the remote computing system 103 and/or the vehicle computing system onboard each of the vehicle(s) 105.
  • the Vehicle API client can provide for communicating data using intelligent quality of service (QoS), multiplexing data over different communication streams, prioritizing and/or de-prioritizing data traffic dynamically, for example, based on link conditions and/or the like.
  • the vehicle 104 can include one or more sensors 108, an autonomy computing system 110, a vehicle control system 112, a communications system 114, and a memory system 116.
  • One or more of these systems can be configured to communicate with one another via a communication channel.
  • the communication channel can include one or more data buses (e.g., controller area network (CAN)), on-board diagnostics connector (e.g., OBD-II), and/or a combination of wired and/or wireless communication links.
  • the on-board systems can send and/or receive data, messages, signals, etc. amongst one another via the communication channel.
  • the sensor(s) 108 can be configured to acquire sensor data 109 associated with one or more objects that are proximate to the vehicle 104 (e.g., within a field of view of one or more of the sensor(s) 108).
  • the sensor(s) 108 can include a Light Detection and Ranging (LIDAR) system, a Radio Detection and Ranging (RADAR) system, one or more cameras (e.g., visible spectrum cameras, infrared cameras, etc.), motion sensors, and/or other types of imaging capture devices and/or sensors.
  • the sensor data 109 can include image data, radar data, LIDAR data, and/or other data acquired by the sensor(s) 108.
  • the object(s) can include, for example, pedestrians, vehicles, bicycles, and/or other objects.
  • the object(s) can be located in front of, to the rear of, and/or to the side of the vehicle 104.
• the sensor data 109 can be indicative of locations associated with the object(s) within the surrounding environment of the vehicle 104.
  • the sensor(s) 108 can provide the sensor data 109 to the autonomy computing system 110.
  • the autonomy computing system 110 can include a perception system 202, a prediction system 204, a motion planning system 206, and/or other systems that cooperate to perceive the surrounding environment of the vehicle 104 and determine a motion plan for controlling the motion of the vehicle 104 accordingly.
  • the autonomy computing system 110 can receive the sensor data 109 from the sensor(s) 108, attempt to comprehend the surrounding environment by performing various processing techniques on the sensor data 109 (and/or other data), and generate an appropriate motion plan through such surrounding environment.
  • the autonomy computing system 110 can control the one or more vehicle control systems 112 to operate the vehicle 104 according to the motion plan.
  • the autonomy computing system 110 can identify one or more objects that are proximate to the vehicle 104 based at least in part on the sensor data 109 and/or the map data 260.
  • the perception system 202 can perform various processing techniques on the sensor data 109 to determine perception data 262 that is descriptive of a current state of one or more object(s) that are proximate to the vehicle 104.
  • the prediction system 204 can create prediction data 264 associated with each of the respective one or more object(s) proximate to the vehicle 104.
  • the prediction data 264 can be indicative of one or more predicted future locations of each respective object.
  • the motion planning system 206 can determine a motion plan for the vehicle 104 based at least in part on the prediction data 264 (and/or other data), and save the motion plan as motion plan data 266.
  • the motion plan data 266 can include vehicle actions with respect to the object(s) proximate to the vehicle 104 as well as the predicted movements.
  • the motion plan data 266 can include a planned trajectory, speed, acceleration, etc. of the vehicle 104.
  • the motion planning system 206 can provide at least a portion of the motion plan data 266 that indicates one or more vehicle actions, a planned trajectory, and/or other operating parameters to the vehicle control system 112 to implement the motion plan for the vehicle 104.
  • the vehicle 104 can include a mobility controller configured to translate the motion plan data 266 into instructions.
• the mobility controller can translate the motion plan data 266 into instructions to adjust the steering of the vehicle 104 "X" degrees, apply a certain magnitude of braking force, etc.
  • the mobility controller can send one or more control signals to the responsible vehicle control sub-system (e.g., powertrain control system 220, steering control system 222, braking control system 224) to execute the instructions and implement the motion plan.
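• The flow described above, from sensor data through the perception, prediction, and motion planning systems and then into low-level control signals via the mobility controller, might be sketched as the hypothetical loop below. The function names (perceive, predict, plan, translate_to_controls) and the control-signal fields are illustrative assumptions and do not correspond to the disclosed implementation.

```python
# Hypothetical sketch of the perception -> prediction -> motion planning ->
# control translation flow described above. All names are illustrative.
from dataclasses import dataclass
from typing import List

@dataclass
class Obstacle:
    x: float
    y: float

@dataclass
class MotionPlan:
    trajectory: List[tuple]   # planned (x, y) waypoints
    speed_mps: float

@dataclass
class ControlSignals:
    steering_deg: float
    brake_force: float
    throttle: float

def perceive(sensor_data: List[tuple]) -> List[Obstacle]:
    # Perception system: turn raw detections into object state.
    return [Obstacle(x, y) for x, y in sensor_data]

def predict(objects: List[Obstacle]) -> List[Obstacle]:
    # Prediction system: assume objects continue straight ahead (toy model).
    return [Obstacle(o.x + 1.0, o.y) for o in objects]

def plan(predicted: List[Obstacle], cruise_speed: float) -> MotionPlan:
    # Motion planning system: slow down if any predicted object is close ahead.
    near = any(abs(o.y) < 2.0 and 0.0 < o.x < 20.0 for o in predicted)
    return MotionPlan(trajectory=[(0.0, 0.0), (10.0, 0.0)],
                      speed_mps=cruise_speed * (0.5 if near else 1.0))

def translate_to_controls(current_speed: float, plan_: MotionPlan) -> ControlSignals:
    # Mobility controller: translate the plan into steering/brake/throttle.
    if plan_.speed_mps < current_speed:
        return ControlSignals(steering_deg=0.0, brake_force=0.3, throttle=0.0)
    return ControlSignals(steering_deg=0.0, brake_force=0.0, throttle=0.2)

if __name__ == "__main__":
    detections = [(15.0, 0.5)]           # one object ahead in the lane
    controls = translate_to_controls(
        current_speed=25.0,
        plan_=plan(predict(perceive(detections)), cruise_speed=25.0))
    print(controls)
```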
  • the communications system 114 can allow the vehicle computing system 102 (and its computing system(s)) to communicate with one or more other computing systems (e.g., remote computing system(s) 103, additional vehicle(s) 105).
  • the vehicle computing system 102 can use the communications system 114 to communicate with one or more remote computing system(s) 103 (e.g., operations computing system 120, third-party computing system(s) 124, external monitor computing system 126) or a vehicle computing system onboard each of the vehicle(s) 105 over one or more networks (e.g., via one or more wireless signal connections).
  • the vehicle computing system 102 can communicate with the operations computing system 120 over one or more wide-area networks (e.g., satellite network, cellular network, etc.) that use a relatively low-frequency spectrum and/or that are associated with relatively long-range communications.
• the vehicle computing system 102 can communicate with the third-party computing system(s) 124 over one or more local-area networks (e.g., WiFi networks, infrared or laser based communication networks, ad-hoc mesh networks, etc.) that use a relatively high-frequency spectrum and/or are associated with relatively short-range communications.
  • the communications system 114 can allow communication among one or more of the system(s) on-board the vehicle 104.
  • the communications system 114 can include any suitable sub-systems for interfacing with one or more network(s) including, for example, transmitters, receivers, ports, controllers, antennas, and/or other suitable sub systems that can help facilitate communication.
  • the memory system 116 of the vehicle 104 can include one or more memory devices located at the same or different locations (e.g., on-board the vehicle 104, distributed throughout the vehicle 104, off-board the vehicle 104, etc.).
  • the vehicle computing system 102 can use the memory system 116 to store and retrieve data/information.
  • the memory system 116 can store map data 260, perception data 262, prediction data 264, motion plan data 266, third-party identification data 270, vehicle identification data 272, status information data 273, diagnostics data 274, and remote inspection data 276.
  • the map data 260 can include information regarding: an identity and location of different roadways, road segments, buildings, or other items or objects (e.g., lampposts, crosswalks, curbing, etc.); a location and direction of traffic lanes (e.g., the location and direction of a parking lane, a turning lane, a bicycle lane, or other lanes within a particular roadway or other travel way and/or one or more boundary markings associated therewith); and/or any other data that assists the vehicle computing system 102 in comprehending and perceiving its surrounding environment and its relationship thereto.
  • the third-party identification data 270 can include information associated with one or more third-party entities.
  • the third-party identification data 270 can include authentication information used to authenticate a third-party entity.
• the third-party identification data 270 can include one or more predetermined keys that have been previously shared between the vehicle computing system 102 and the third-party computing system(s) 124.
  • the third-party identification data 270 can include information indicating which predetermined key is shared with which third-party computing system 124.
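• One conventional way to check a vehicle command against a predetermined key of the kind described above is a keyed message-authentication code; the sketch below uses Python's standard hmac module. The key registry, entity identifiers, and message layout are hypothetical assumptions for illustration.

```python
# Hypothetical sketch: authenticating a third-party command with an HMAC
# computed over a predetermined (pre-shared) key. Key registry is illustrative.
import hmac
import hashlib

# Predetermined keys previously shared with each third-party computing system.
THIRD_PARTY_KEYS = {
    "inspection-authority-1": b"pre-shared-secret-key",
}

def authenticate_command(entity_id: str, payload: bytes, tag: bytes) -> bool:
    """Return True only if the tag matches the payload under the entity's key."""
    key = THIRD_PARTY_KEYS.get(entity_id)
    if key is None:
        return False
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

if __name__ == "__main__":
    payload = b'{"command": "stop", "priority": "low"}'
    tag = hmac.new(THIRD_PARTY_KEYS["inspection-authority-1"],
                   payload, hashlib.sha256).digest()
    print(authenticate_command("inspection-authority-1", payload, tag))  # True
```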
  • the vehicle identification data 272 can include information indicative of a vehicle identifier that corresponds to the vehicle 104.
  • the vehicle identification data 272 can include an identification code corresponding to the vehicle 104 that is painted on the outside of the vehicle 104.
  • the vehicle identification data 272 can include an algorithm to generate an identification code corresponding to the vehicle 104.
  • Each generated identification code can be valid for a limited time, and a new identification code can be generated to replace an outdated identification code.
  • the vehicle 104 can include a display board that displays a valid identification code at any given time.
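• A time-limited identification code of the kind described above can be produced, for example, by hashing a vehicle secret together with the current time window, similar in spirit to a time-based one-time password. The sketch below is a hypothetical illustration using only the Python standard library; the secret, window length, and code format are assumptions.

```python
# Hypothetical sketch: generating a rotating identification code that is only
# valid for a limited time window. Secret and window length are illustrative.
import hashlib
import hmac
import time
from typing import Optional

CODE_VALIDITY_SECONDS = 300  # assumed five-minute validity window

def current_identification_code(vehicle_secret: bytes,
                                now: Optional[float] = None) -> str:
    """Derive a short display code from the secret and the current time window."""
    window = int((time.time() if now is None else now) // CODE_VALIDITY_SECONDS)
    digest = hmac.new(vehicle_secret, str(window).encode(), hashlib.sha256).hexdigest()
    return digest[:8].upper()   # short code suitable for a display board

if __name__ == "__main__":
    secret = b"vehicle-104-secret"
    print(current_identification_code(secret))           # code for the current window
    print(current_identification_code(secret, now=0.0))  # code for an earlier window
```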
  • the status information data 273 can include status information associated with the vehicle 104.
  • the status information data 273 can be generated by the vehicle computing system 102 periodically, or in response to vehicle command(s) from a third-party entity.
  • the status information can include a status associated with one or more component(s) of the vehicle 104, and/or an overall health/status associated with the vehicle 104.
  • the vehicle computing system 102 can autonomously generate diagnostics information corresponding to one or more systems on-board the vehicle 104 and/or information corresponding to an environment in which the vehicle 104 operates.
  • the vehicle computing system can obtain diagnostics information associated with the vehicle 104 from the one or more external monitor computing system(s) 126.
• the vehicle computing system 102 can store the diagnostics information as the diagnostics data 274.
• the vehicle computing system 102 can autonomously determine remote inspection information based on the diagnostics data 274, and store the remote inspection information as remote inspection data 276.
  • the vehicle computing system 102 can provide the remote inspection information to one or more remote computing system(s) 103 (e.g., operations computing system 120, third-party computing system(s) 124, external monitor computing system(s) 126) at one or more times.
  • the vehicle-operations system interface 300 can include a Vehicle API 304 associated with a remote computing system 301 (e.g., remote computing system(s) 103, vehicle computing system onboard each of the vehicle(s) 105, etc.).
  • the Vehicle API 304 can provide for a translation/transport layer as an interface between vehicle computing systems onboard vehicles within an entity’s fleet (e.g., vehicle 104, additional vehicle(s) 105) and one or more remote clients and/or applications operating within the remote computing system 301.
  • the Vehicle API 304 can include an offboard gateway 306 which can provide for establishing one or more communication channels between the Vehicle API 304 and a vehicle, such as vehicle 104 (e.g., via vehicle computing system 102, etc.).
  • the offboard gateway 306 can establish multiplexing connections between the vehicle 104 and the Vehicle API 304 that can be used to send arbitrary communications through the same connections.
• the Vehicle API 304, through offboard gateway 306, can provide for establishing multiple hypertext transfer protocol (or other suitable protocol) connections, for example, using HTTP/2, between a Vehicle API relay/client 308 and the offboard gateway 306, allowing the ability to parallelize and assert traffic priority within a connection.
  • the offboard gateway 306 of Vehicle API 304 can establish at least two hypertext transfer protocol (or other suitable protocol) connections, such as HTTP/2 connections, to the operations computing system from a vehicle, where at least one connection can be dedicated to high reliability, high deliverability traffic and at least one connection can be dedicated to best-effort, unguaranteed traffic.
  • the use of multiple connections can allow for the underlying transport to be controlled in terms of different connections having different weights such that data can be identified as more important.
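• A very reduced illustration of splitting traffic across a high-reliability connection and a best-effort connection, with weights expressing relative importance, is sketched below; the Connection abstraction, the weights, and the message categories are assumptions for illustration and are not the disclosed HTTP/2 implementation.

```python
# Hypothetical sketch: routing onboard data traffic onto one of two logical
# connections -- guaranteed delivery vs. best effort. Names are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Connection:
    name: str
    weight: int                      # relative transport priority
    queue: List[bytes] = field(default_factory=list)

    def send(self, message: bytes) -> None:
        # Placeholder for an HTTP/2 stream write in a real system.
        self.queue.append(message)

reliable = Connection(name="high-reliability", weight=200)
best_effort = Connection(name="best-effort", weight=16)

GUARANTEED_TYPES = {"vehicle_command_ack", "status_information", "diagnostics"}

def route(message_type: str, payload: bytes) -> None:
    """Send guaranteed message types on the reliable connection, the rest best-effort."""
    target = reliable if message_type in GUARANTEED_TYPES else best_effort
    target.send(payload)

if __name__ == "__main__":
    route("status_information", b"tire_pressure_psi=101.5")
    route("telemetry", b"pose=(37.77,-122.42,heading=90)")
    print(len(reliable.queue), len(best_effort.queue))  # 1 1
```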
  • the vehicle 104 can include a Vehicle API relay/client 308, for example, associated with a vehicle computing system 102, which can provide for establishing the one or more communication channels between the offboard gateway 306 of the Vehicle API 304 and the vehicle 104.
  • the Vehicle API relay/client 308 onboard the vehicle 104 can provide for communicating data using intelligent quality of service (QoS), multiplexing data over different communication streams, prioritizing and/or de-prioritizing data traffic dynamically, for example, based on link conditions and/or the like.
  • the Vehicle API relay/client 308 can provide for making determinations about what data it thinks is more important and handling the communication of that data as appropriate.
• the Vehicle API 304, through offboard gateway 306 and Vehicle API relay/client 308, can provide for communicating onboard data traffic 310 (e.g., telemetry, vehicle state information, etc.) from the vehicle 104 to the remote computing system 301.
• the offboard gateway 306 can receive the onboard data traffic 310 from the Vehicle API relay/client 308 and the Vehicle API 304 can provide for handling the onboard data traffic 310 and providing the onboard data traffic 310 to one or more clients and/or applications associated with the remote computing system 301 in client messages 314.
  • the Vehicle API 304 can provide for communicating authenticated vehicle messages 312 from the remote computing system 301 to the vehicle 104 (e.g., to vehicle computing system 102, etc.).
  • the offboard gateway 306 can receive vehicle messages 316 from one or more clients/applications associated with the remote computing system 301 (e.g., messages signed by the client to allow for authenticating the messages before sending to a vehicle) and the Vehicle API 304 can provide for communicating the vehicle messages 316 to the vehicle 104, through offboard gateway 306 and Vehicle API relay/client 308, as authenticated vehicle messages 312 (e.g., once the Vehicle API 304 has authenticated the signed vehicle messages 316).
  • the Vehicle API 304 can allow for a vehicle 104 to send multiple types of data to the remote computing system 301 over the established connections with the vehicle 104.
  • the Vehicle API 304 can provide for a vehicle 104 sending status information data 273 to the remote computing system 301.
  • the Vehicle API 304 can provide for a vehicle 104 to send low resolution perception data, such as labels and/or geometries, to the operations computing system 120, allowing for processing the data offboard the vehicle 104 by one or more clients/applications associated with the operations computing system 120 and allowing for developing a better understanding of the world.
  • the Vehicle API 304 can provide for a vehicle to send data such as current vehicle pose (e.g., global and relative to map), vehicle trajectory, onboard diagnostics, status information, and/or the like, to the remote computing system 301 to be processed by one or more clients/applications associated with the remote computing system 301.
  • the Vehicle API 304 can provide for the remote computing system 301 to receive multiple types of data from the vehicle 104 and/or additional vehicle(s) 105.
  • the Vehicle API 304 can provide for the remote computing system 301 to receive multiple types of data from the vehicle 104 over the established connections with the vehicle 104.
  • the Vehicle API 304 can provide for receiving status information data 273 from the vehicle 104 at one or more times, and analyzing the status information data 273 to determine a vehicle state associated with the vehicle 104.
  • the Vehicle API 304 can provide for the remote computing system 301 to send multiple types of data to the vehicle 104 and/or additional vehicle(s) 105.
  • the Vehicle API 304 can provide for the remote computing system 301 to send multiple types of data to the vehicle 104 over the established connections to the vehicle 104.
  • the Vehicle API 304 can provide for sending command signals to the vehicle 104, such as, for example, sending specific vehicle command(s) to the vehicle 104, sending advisories to the vehicle 104, etc.
  • the specific vehicle command(s) can, for example, instruct the vehicle 104 to offload the data from its computing system, instruct the vehicle 104 to go to certain geographic coordinates, instruct the vehicle 104 to report for maintenance, instruct the vehicle 104 to procure fuel, instruct the vehicle 104 to indicate a selection by the remote computing system 301, instruct the vehicle 104 to relay vehicle command(s), instruct the vehicle 104 to come to a stop, and/or the like.
  • the advisories can, for example, notify a vehicle operator associated with the vehicle 104 about status information associated with the vehicle 104 or vehicle(s) 105, flagged geo regions (e.g., areas to avoid, areas to proceed with caution, areas under construction that should be routed around, etc.), etc.
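• The specific command signals listed above could be dispatched on the vehicle by a simple lookup from command type to handler, as in the hypothetical sketch below; the command names, arguments, and handler behavior are illustrative assumptions rather than a defined command set.

```python
# Hypothetical sketch: dispatching command signals received from a remote
# computing system to onboard handlers. Command names are illustrative.
from typing import Callable, Dict

def offload_data(args: dict) -> str:
    return "offloading stored data"

def go_to_coordinates(args: dict) -> str:
    return f"routing to ({args['lat']}, {args['lon']})"

def report_for_maintenance(args: dict) -> str:
    return "routing to maintenance location"

def come_to_stop(args: dict) -> str:
    return f"stopping with priority {args.get('priority', 'low')}"

HANDLERS: Dict[str, Callable[[dict], str]] = {
    "offload_data": offload_data,
    "go_to_coordinates": go_to_coordinates,
    "report_for_maintenance": report_for_maintenance,
    "stop": come_to_stop,
}

def handle_command(command: dict) -> str:
    handler = HANDLERS.get(command["type"])
    if handler is None:
        return "ignored: unknown command type"
    return handler(command.get("args", {}))

if __name__ == "__main__":
    print(handle_command({"type": "go_to_coordinates",
                          "args": {"lat": 37.77, "lon": -122.42}}))
    print(handle_command({"type": "stop", "args": {"priority": "high"}}))
```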
  • FIG. 4A depicts a diagram of a plurality of vehicles 105 operating in an environment under the jurisdiction of a third-party entity 410.
  • the plurality of vehicles 105 can include autonomous vehicles 411, 412, 413, and 414.
  • the vehicles 411, 412, 413, and 414 can operate as a convoy.
  • the third-party entity 410 can be associated with a third-party computing system 124, and the third-party entity 410 can send one or more vehicle command(s) to the vehicles 411, 412, 413, and/or 414, via the third-party computing system 124.
  • the vehicle computing system(s) corresponding to the vehicles 411, 412, 413, and/or 414 can perform one or more vehicle action(s) in response to receiving the vehicle command(s).
  • the third-party entity 410 can send vehicle command(s) to select the vehicle 411.
  • the vehicle 411 can be configured as being selected, and perform vehicle action(s) to indicate the selection.
  • the vehicle 411 can flash an external indicator light, display a message on an external display, and/or perform other vehicle action(s) that the third-party entity 410 can perceive to determine that the vehicle 411 is selected.
  • the third-party entity 410 can broadcast vehicle command(s) in a general direction toward the vehicles 411, 412, 413, and 414. If the third-party entity 410 broadcasts vehicle command(s) in order to select the vehicle 411 and the vehicle 412 receives the vehicle command(s), then the vehicle 412 can perform vehicle action(s) to indicate selection of the vehicle 412. The third-party entity 410 can perceive the vehicle action(s) and determine that the vehicle 412 is selected. The third-party entity 410 can broadcast vehicle command(s) instructing the vehicle 412 to relay future vehicle command(s) to an autonomous vehicle in front of the vehicle 412. In response to receiving the vehicle command(s), the vehicle 412 can communicate with the vehicle 411 and relay the future vehicle command(s) from the third-party entity 410 to the vehicle 411.
  • the third-party entity 410 can broadcast vehicle command(s) in a general direction toward the vehicles 411, 412, 413, and 414. If the vehicle command(s) include a vehicle identifier corresponding to the vehicle 411, and the vehicle 412 receives the vehicle command(s), then the vehicle 412 can ignore the vehicle command(s). Alternatively, the vehicle 412 can determine that the vehicle 411 is proximate to the vehicle 412, and the vehicle 412 can relay the vehicle command(s) to the vehicle 411.
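• The decision of whether a vehicle acts on, relays, or ignores a broadcast command that carries another vehicle's identifier, as described above, might be expressed as in the hypothetical sketch below; the function name and the notion of a set of "proximate" vehicles are illustrative assumptions.

```python
# Hypothetical sketch: a follower vehicle deciding whether to act on, relay,
# or ignore a broadcast third-party command. Names are illustrative.
from typing import Iterable, Optional

def handle_broadcast(own_id: str,
                     target_id: Optional[str],
                     proximate_ids: Iterable[str]) -> str:
    """Return the action the receiving vehicle takes for a broadcast command."""
    if target_id is None or target_id == own_id:
        return "act"                       # command addressed to this vehicle
    if target_id in proximate_ids:
        return f"relay to {target_id}"     # forward to the identified nearby vehicle
    return "ignore"                        # not addressed here and not relayable

if __name__ == "__main__":
    # Vehicle 412 receives a command carrying vehicle 411's identifier.
    print(handle_broadcast("412", "411", proximate_ids={"411", "413"}))  # relay to 411
    print(handle_broadcast("412", "999", proximate_ids={"411", "413"}))  # ignore
```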
  • the third-party entity 410 can determine that the vehicle 411 has low tire pressure, and send vehicle command(s) selecting the vehicle 411.
  • the vehicle command(s) can include low tire pressure as the reason for the selection, and instruct the vehicle 411 to provide information indicating the reason to a service provider.
  • the vehicle 411 can communicate with the operations computing system 120 to send data indicative of the reason for the selection.
  • the third-party entity can determine that the vehicle 411 appears to have low tire pressure, and send vehicle command(s) selecting the vehicle 411 and instructing the vehicle 411 to provide status information indicative of its tire pressure.
  • the vehicle 411 can retrieve the status information from the status information data 273, or generate the status information, and send the status information to the third-party entity 410.
  • the third-party entity can verify the tire pressure of the vehicle 411 based on the status information and send vehicle command(s) instructing the vehicle 411 to travel to a maintenance area if the tire pressure is low.
  • the third-party entity can send vehicle command(s) selecting the vehicle 411 and instructing the vehicle 411 to stop.
  • the vehicle 411 can perform a stopping action to come to a stop. If the vehicle command(s) include a reason for stopping the vehicle 411, then the vehicle 411 can perform a safe-stop action if the reason is determined not to be critical, or the vehicle 411 can perform an emergency-stop action if the reason is determined to be critical. If the vehicle command(s) include an associated priority level (e.g., low-priority, high-priority), then the vehicle 411 can perform a stopping action corresponding to the priority level.
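• Selecting between a safe-stop action and an emergency-stop action from the reason and/or priority level accompanying the command, as described above, could look like the hypothetical sketch below; the priority values, reason strings, and action names are illustrative assumptions.

```python
# Hypothetical sketch: choosing a stopping action from the priority level or
# stated reason accompanying a stop command. Values are illustrative.
from typing import Optional

CRITICAL_REASONS = {"brake_failure", "cargo_unsecured", "fire"}

def choose_stop_action(priority: Optional[str], reason: Optional[str]) -> str:
    if priority == "high" or reason in CRITICAL_REASONS:
        return "emergency_stop"   # stop immediately in the current lane
    return "safe_stop"            # continue until a safe stopping location is found

if __name__ == "__main__":
    print(choose_stop_action("low", "low_tire_pressure"))  # safe_stop
    print(choose_stop_action("high", None))                # emergency_stop
```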
  • the third-party entity 410 can send one or more encrypted vehicle command(s) to the vehicle 411.
  • the vehicle 411 can receive the encrypted vehicle command(s), and decrypt the vehicle command(s) using a predetermined key that was previously shared between the vehicle 411 and the third-party entity 410.
  • the vehicle 411 can retrieve the third-party identification data 270 to authenticate the third-party entity 410.
  • the vehicle 411 can perform vehicle action(s) in response to the vehicle command(s) from the third-party entity 410.
  • FIG. 4B depicts a diagram of a plurality of vehicles 105 operating in an environment under the jurisdiction of a third-party entity 410.
  • the plurality of vehicles 105 can include autonomous vehicles 411, 412, 413, and 414.
  • the vehicles 411, 412, 413, and 414 can operate as a convoy.
  • the third-party entity 410 can identify the vehicle 411, and send vehicle command(s) that include a vehicle identifier corresponding to the vehicle 411.
  • the vehicle computing system associated with the vehicle 412 can receive the vehicle command(s) and determine that the vehicle identifier included in the vehicle command(s) does not correspond to the vehicle 412.
  • the vehicle computing system can determine that the vehicle identifier included in the vehicle command(s) corresponds to the vehicle 411, and that the vehicle 411 is proximate to the vehicle 412.
• the vehicle computing system can control the vehicle 412 to send data indicative of the vehicle command(s) to the vehicle 411.
  • FIG. 4C depicts a diagram of a plurality of vehicles 105 operating in an environment under the jurisdiction of a third-party entity 410.
  • the plurality of vehicles 105 can include autonomous vehicles 411, 412, 413, and 414.
  • the vehicles 411, 412, 413, and 414 can operate as a convoy.
  • the third-party entity 410 can identify the vehicle 414, and send vehicle command(s) that include a vehicle identifier corresponding to the vehicle 414.
  • the vehicle computing system associated with the vehicle 412 can receive the vehicle command(s) and determine that the vehicle identifier included in the vehicle command(s) does not correspond to the vehicle 412.
  • FIG. 5 A depicts a diagram of a plurality of vehicles 105 operating in an environment under the jurisdiction of a third-party entity 510.
  • the plurality of vehicles 105 can include autonomous vehicles 511, 512, 513, and 514.
  • the vehicles 511, 512, 513, and 514 can operate as a convoy.
  • the third-party entity 510 can send one or more vehicle command(s) to select the vehicle 512.
  • the vehicle computing system associated with the vehicle 512 can control the vehicle 512 to flash its hazard lights so that the third-party entity 510 can verify that the vehicle 512 is selected.
  • FIG. 5B depicts a diagram of a plurality of vehicles 105 operating in an environment under the jurisdiction of a third-party entity 510.
  • the plurality of vehicles 105 can include autonomous vehicles 511, 512, 513, and 514.
  • the vehicles 511, 512, 513, and 514 can operate as a convoy.
  • the third-party entity 510 can send vehicle command(s) to the vehicle 512 indicating a selection of the vehicle 511.
  • the vehicle computing system associated with the vehicle 512 can communicate with the vehicle computing system associated with the vehicle 511 to indicate a selection of the vehicle 511 by third-party entity 510.
• the vehicle computing system associated with the vehicle 511 can control the vehicle 511 to flash its hazard lights so that the third-party entity 510 can verify that the vehicle 511 is selected.
• FIG. 6 depicts a diagram of a plurality of vehicles 105 including vehicles 611, 612, and 613 operating as a convoy.
  • the vehicle 611 can be configured as the lead vehicle in the convoy, and vehicles 612 and 613 can be configured as follower vehicles.
• the vehicle 612 can be configured to follow vehicle 611 at a first distance (d1), and vehicle 613 can be configured to follow vehicle 612 at the first distance (d1).
• the vehicles 611, 612, and 613 can be configured to travel at a first velocity (v1).
  • vehicle 611 can be positioned at a first location marker (A)
• vehicle 612 can be positioned at the first distance (d1) behind vehicle 611
• vehicle 613 can be positioned at the first distance (d1) behind vehicle 612.
• vehicle 611 can travel to a second location marker (B), vehicle 612 can travel to maintain a position at the first distance (d1) behind vehicle 611, and vehicle 613 can travel to maintain a position at the first distance (d1) behind vehicle 612.
  • vehicle 611 can travel to a third location marker (C)
• vehicle 612 can travel to maintain a position at the first distance (d1) behind vehicle 611
• vehicle 613 can travel to maintain a position at the first distance (d1) behind vehicle 612.
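• Maintaining the first distance (d1) behind the vehicle ahead while the convoy moves between location markers, as in the figure described above, can be approximated by a simple proportional speed adjustment; the gain, limits, and function name in the sketch below are illustrative assumptions, not the disclosed control law.

```python
# Hypothetical sketch: a follower adjusting its speed to hold a target gap (d1)
# behind the vehicle ahead. Gain and limits are illustrative.
def follower_speed(lead_speed_mps: float,
                   gap_m: float,
                   target_gap_m: float,
                   gain: float = 0.2,
                   max_speed_mps: float = 30.0) -> float:
    """Proportional adjustment: close the gap if too far, back off if too close."""
    command = lead_speed_mps + gain * (gap_m - target_gap_m)
    return max(0.0, min(command, max_speed_mps))

if __name__ == "__main__":
    d1 = 30.0                                                 # the first distance
    print(follower_speed(25.0, gap_m=30.0, target_gap_m=d1))  # 25.0 (hold speed)
    print(follower_speed(25.0, gap_m=45.0, target_gap_m=d1))  # 28.0 (catch up)
    print(follower_speed(25.0, gap_m=20.0, target_gap_m=d1))  # 23.0 (fall back)
```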
• FIG. 7 depicts a diagram of a plurality of vehicles 105 including vehicles 711, 712, and 713 operating as a convoy.
  • the vehicle 711 can be configured as the lead vehicle in the convoy, and vehicles 712 and 713 can be configured as follower vehicles.
• the vehicle 712 can be configured to follow vehicle 711 at a first distance (d1), and vehicle 713 can be configured to follow vehicle 712 at the first distance (d1).
• vehicles 711, 712, and 713 can travel at a first velocity (v1).
• the vehicle 712 can follow vehicle 711 at the first distance (d1), and the vehicle 713 can follow vehicle 712 at the first distance (d1).
  • vehicle 712 can detect an obstacle 1001 in front of the vehicle, and generate a motion plan to avoid hitting the obstacle 1001.
• the motion plan of vehicle 712 can include reducing speed to a second velocity (v2 < v1).
• the vehicle 711 can continue to travel at the first velocity (v1).
• the vehicle 713 can generate a motion plan to maintain a position at the first distance (d1) behind vehicle 712, and to avoid hitting the vehicle 712.
• the motion plan of vehicle 713 can include reducing speed to the second velocity (v2).
  • vehicle 712 can communicate with vehicle 713 to provide data indicative of the obstacle 1001 and the motion plan of vehicle 712.
  • the vehicle 713 can receive the data and generate the motion plan of vehicle 713 based in part on the received data.
  • vehicle 713 can detect that vehicle 712 that is in front of vehicle 713 is slowing down.
  • the vehicle 713 can generate the motion plan of vehicle 713 based in part on the detected slow down.
• vehicle 712 can determine that the obstacle 1001 is clear, but that vehicle 712 is now a second distance (d2 > d1) behind vehicle 711.
• the vehicle 712 can generate a motion plan to resume a position at the first distance (d1) behind vehicle 711.
• the motion plan of vehicle 712 can include increasing speed to a third velocity (v3 > v2) to reduce a follow distance of vehicle 712 behind vehicle 711.
• the vehicle 713 can generate a motion plan to maintain a position at the first distance (d1) behind vehicle 712.
• the motion plan of vehicle 713 can include increasing speed to the third velocity (v3).
  • vehicle 712 can communicate with vehicle 713 to provide data indicative of the motion plan of vehicle 712.
  • the vehicle 713 can receive the data and generate the motion plan of vehicle 713 based in part on the received data.
  • vehicle 713 can detect that vehicle 712 that is in front of vehicle 713 is speeding up.
  • the vehicle 713 can generate the motion plan of vehicle 713 based in part on the detected speed up.
• FIG. 8 depicts a diagram of a plurality of vehicles 105 including vehicles 811, 812, and 813 operating as a convoy.
  • the vehicle 811 can be configured as the lead vehicle in the convoy, and vehicles 812 and 813 can be configured as follower vehicles.
• vehicle 812 can be configured to follow vehicle 811 at a first distance (d1), and vehicle 813 can be configured to follow vehicle 812 at the first distance (d1).
• vehicles 811, 812, and 813 can travel at a first velocity (v1).
• the vehicle 812 can follow vehicle 811 at the first distance (d1), and the vehicle 813 can follow vehicle 812 at the first distance (d1).
  • vehicle 812 can detect an obstacle 1101 in front of the vehicle, and generate a motion plan to avoid hitting the obstacle 1101.
  • the motion plan of vehicle 812 can include reducing speed, and moving to a different travel lane.
• the vehicle 811 can continue to travel at the first velocity (v1).
  • the vehicle 812 can communicate with vehicle 813 to provide data indicative of the obstacle 1101 and the motion plan of vehicle 812.
  • the vehicle 813 can receive the data and generate a motion plan of vehicle 813 based in part on the received data.
  • the motion plan of vehicle 813 can include reducing speed, and moving to a different travel lane.
• vehicle 812 can determine that the obstacle 1101 is clear, but that vehicle 812 is now a second distance (d2 > d1) behind vehicle 811.
• the vehicle 812 can generate a motion plan to maintain a position at the first distance (d1) behind vehicle 811.
• the motion plan of vehicle 812 can include increasing speed to a second velocity (v2 > v1) to reduce a follow distance of vehicle 812 behind vehicle 811.
• vehicle 813 can determine that it is a third distance (d3 > d1) behind vehicle 812.
• the vehicle 813 can generate a motion plan to maintain a position at the first distance (d1) behind vehicle 812.
• the motion plan of vehicle 813 can include increasing speed to a third velocity (v3 > v1) to reduce a follow distance of vehicle 813 behind vehicle 812.
• FIG. 9 depicts a diagram of a plurality of vehicles 105 including vehicles 911, 912, and 913 operating as a convoy in an environment under the jurisdiction of a third-party entity (not shown).
  • the vehicle 911 can be configured as the lead vehicle in the convoy, and vehicles 912 and 913 can be configured as follower vehicles.
  • the vehicle 912 can be configured to follow vehicle 911, and vehicle 913 can be configured to follow vehicle 912.
  • vehicles 911, 912, and 913 can travel as a group.
  • the vehicle 912 can receive one or more vehicle command(s) from the third-party entity.
  • the vehicle command(s) can instruct vehicle 912 to stop.
  • the vehicle command(s) can include or otherwise indicate a low-priority for the stop.
  • the vehicle 912 can determine one or more vehicle action(s) to perform.
  • the vehicle action(s) can include, for example, continuing to travel with the convoy until a safe stopping location is identified, removal from the convoy, and a safe-stop action.
  • vehicle 912 can identify the safety lane 1202 as a safe stopping location.
• the vehicle 912 can send data indicative of the vehicle command(s) from the third-party entity and the determined vehicle action(s) to vehicles 911 and 913, and generate a motion plan to come to a stop in the safety lane 1202.
  • a human operator in the vehicle 911 can verify the vehicle command(s) and confirm the vehicle action(s) by sending one or more vehicle command(s) to remove vehicle 912 from the convoy.
  • vehicle 913 can be configured to follow vehicle 911.
  • vehicle 912 can come to a stop in the safety lane 1202.
  • the vehicle 911 can slow down and/or vehicle 913 can speed up so that vehicle 913 can maintain a predetermined follow distance behind vehicle 911.
• FIG. 10 depicts a diagram of a plurality of vehicles 105 including vehicles 1011, 1012, and 1013 operating as a convoy in an environment under the jurisdiction of a third-party entity (not shown).
  • the vehicle 1011 can be configured as the lead vehicle in the convoy, and vehicles 1012 and 1013 can be configured as follower vehicles.
  • the vehicle 1012 can be configured to follow vehicle 1011, and vehicle 1013 can be configured to follow vehicle 1012.
  • vehicles 1011, 1012, and 1013 can travel as a group.
  • vehicle 1012 can receive one or more vehicle command(s) from the third-party entity.
  • the vehicle command(s) can instruct vehicle 1012 to stop.
  • the vehicle command(s) can include or otherwise indicate a high-priority for the stop.
  • the vehicle 1012 can determine one or more vehicle action(s) to perform.
  • the vehicle action(s) can include, for example, removal from the convoy, and an emergency-stop action.
  • the vehicle 1012 can send data indicative of the vehicle command(s) from the third-party entity and the determined vehicle action(s) to vehicles 1011 and 1013, and generate a motion plan to immediately come to a stop in the same lane that the vehicle is travelling in.
  • a human operator in the vehicle 1011 can verify the vehicle command(s) and confirm the vehicle action(s) by sending one or more vehicle command(s) to remove vehicle 1012 from the convoy.
  • vehicle 1013 can be configured to follow vehicle 1011.
• the vehicle 1013 can generate a motion plan to avoid hitting vehicle 1012 when it is stopping/stopped.
  • vehicle 1012 can come to a stop.
  • the vehicle 1011 can slow down and/or vehicle 1013 can speed up so that vehicle 1013 can maintain a predetermined follow distance behind vehicle 1011.
• FIG. 11 depicts a diagram of a plurality of vehicles 105 including vehicles 1111, 1112, and 1113 operating as a convoy in an environment under the jurisdiction of a third-party entity (not shown).
  • the vehicle 1111 can be configured as the lead vehicle in the convoy, and vehicles 1112 and 1113 can be configured as follower vehicles.
  • the vehicle 1112 can be configured to follow vehicle 1111, and vehicle 1113 can be configured to follow vehicle 1112.
  • vehicles 1111, 1112, and 1113 can travel as a group.
  • vehicle 1112 can receive one or more vehicle command(s) from the third-party entity.
  • the vehicle command(s) can instruct vehicle 1112 to stop.
  • the vehicle command(s) can include or otherwise indicate a request for a human to be present at an inspection, or instructions for all vehicles in the convoy to stop.
  • the vehicle 1112 can determine one or more vehicle action(s) to perform.
  • the vehicle action(s) can include, for example, coordinating with other vehicles in the convoy to stop as a group.
  • the vehicle 1112 can send data indicative of the vehicle command(s) from the third-party entity and the determined vehicle action(s) to vehicle 1111.
  • a human operator in the vehicle 1111 can verify the vehicle command(s) from the third-party entity, and send one or more vehicle command(s) to determine coordinated stopping actions for all the vehicles in the convoy.
  • vehicles 1111, 1112, and 1113 can perform a respective coordinated stopping action so that vehicles 1111, 1112, and 1113 can stop as a group in the safety lane 1102.
• FIG. 12 depicts a diagram of a plurality of vehicles 105 including vehicles 1211, 1212, and 1213 operating as a convoy in an environment under the jurisdiction of a third-party entity (not shown).
  • the vehicle 1211 can be configured as the lead vehicle in the convoy, and vehicles 1212 and 1213 can be configured as follower vehicles.
  • the vehicle 1212 can be configured to follow vehicle 1211, and vehicle 1213 can be configured to follow vehicle 1212.
  • vehicles 1211, 1212, and 1213 can travel as a group.
  • vehicle 1212 can receive one or more vehicle command(s) from the third-party entity.
  • the vehicle command(s) can instruct vehicle 1212 to stop.
  • the vehicle command(s) can include or otherwise indicate a request for a human to be present at an inspection, or instructions for all vehicles in the convoy to stop.
  • the vehicle 1212 can determine one or more vehicle action(s) to perform.
  • the vehicle action(s) can include, for example, coordinating with other vehicles in the convoy to stop as a group.
  • the vehicle 1212 can send data indicative of the vehicle command(s) from the third-party entity and the determined vehicle action(s) to vehicle 1211.
  • a human operator in the vehicle 1211 can verify the vehicle command(s) from the third-party entity, and send one or more vehicle command(s) to determine coordinated stopping actions for all the vehicles in the convoy.
  • vehicles 1211, 1212, and 1213 can perform a respective coordinated stopping action so that the convoy can stop as a group in a same lane that vehicles 1211, 1212, and 1213 are travelling in.
  • FIG. 13 depicts a flow diagram of an example method 1300 for controlling an autonomous vehicle according to example embodiments of the present disclosure.
  • One or more portion(s) of the method 1300 can be implemented as operations by one or more computing system(s) such as, for example, the computing system(s) 102, 120, 1401, and 1410 shown in FIGS. 1, 2, and 10.
  • one or more portion(s) of the method 1300 can be implemented as an algorithm on the hardware components of the system(s) described herein (e.g., as in FIGS. 1, 2, and 14), for example, to control an autonomous vehicle in response to vehicle instructions from a remote computing system associated with a third-party entity.
  • FIG. 13 depicts elements performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the elements of any of the methods (e.g., of FIG. 13) discussed herein can be adapted, rearranged, expanded, omitted, combined, and/or modified in various ways without deviating from the scope of the present disclosure.
  • the method 1300 can include controlling a first autonomous vehicle that is part of a convoy to provide a vehicle service.
  • the vehicle computing system 102 can control the vehicle 104 to provide a vehicle service.
  • the vehicle 104 can be part of a fleet of vehicles controlled by an operations computing system 120 associated with a service provider, and more particularly, part of a convoy that includes a plurality of vehicles from the fleet.
  • the vehicle computing system 102 can control the vehicle 104 to provide the vehicle service for a second entity associated with the client computing system 122.
  • the vehicle computing system 102 can control the vehicle 104 to provide the vehicle service at least partly in a geographic area under jurisdiction of a third-party entity.
  • the method 1300 can include receiving one or more communication(s) from a remote computing system associated with a third-party entity.
  • the vehicle computing system 102 can receive one or more communications from the third-party computing system 124 associated with the third-party entity.
  • one or more clients and/or applications operating in association with the third-party computing system 124 can send vehicle messages 316 corresponding to the communication(s) to the offboard gateway 306, and the Vehicle API 304 can provide the vehicle messages 316 to the vehicle 104, through the offboard gateway 306 and the Vehicle API relay/client 308, as authenticated vehicle messages 312.
  • the communication(s) can include one or more vehicle instruction(s), such as, for example, instructions for selecting the vehicle 104, instructions for stopping the vehicle 104, instructions for the vehicle 104 to relay information to the other vehicles in the convoy, or instructions for the vehicle 104 to provide information to the third-party entity.
  • the vehicle computing system 102 can receive the vehicle instruction(s) as one or more encrypted communication(s) from the third-party computing system 124 associated with a third-party entity, and the vehicle computing system 102 can decrypt the encrypted communication(s) based on a predetermined key.
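The disclosure leaves the encryption scheme unspecified beyond decryption based on a predetermined key. As one hedged illustration, a pre-shared symmetric key could be used as in the sketch below, which relies on the `cryptography` package's Fernet recipe; the message format and key handling shown here are assumptions for illustration only.

```python
# Hypothetical sketch of decrypting a third-party vehicle instruction with a
# predetermined (pre-shared) symmetric key. The use of Fernet is an illustrative
# assumption; the disclosure does not name a particular cipher or key-exchange scheme.

import json
from cryptography.fernet import Fernet, InvalidToken


def decrypt_vehicle_instruction(encrypted: bytes, predetermined_key: bytes) -> dict:
    """Decrypt and parse an encrypted instruction; raise if the key does not match."""
    try:
        plaintext = Fernet(predetermined_key).decrypt(encrypted)
    except InvalidToken as exc:
        raise ValueError("instruction could not be authenticated/decrypted") from exc
    return json.loads(plaintext)


if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice, provisioned ahead of time
    message = json.dumps({"vehicle_id": "104", "instruction": "stop", "priority": "low"})
    token = Fernet(key).encrypt(message.encode())
    print(decrypt_vehicle_instruction(token, key))
```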
  • the method 1300 can include determining one or more vehicle action(s) to perform based on one or more vehicle instruction(s) included in the communication(s).
  • the vehicle instruction(s) can instruct the vehicle computing system 102 to stop the vehicle 104.
  • the vehicle computing system 102 can determine an identity of the third-party entity based on the communication(s), and determine one or more vehicle action(s) to perform based on the identity.
  • the vehicle computing system 102 can also determine a vehicle identifier associated with the communication(s), and determine one or more vehicle action(s) to perform if the vehicle identifier corresponds to the vehicle 104. If the vehicle instruction(s) in the communication(s) include an associated priority level, then the vehicle computing system 102 can determine the vehicle action(s) to perform that correspond to that priority level.
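A minimal sketch of the filtering step above, assuming a registry of recognized third-party issuers and a vehicle identifier field in each communication; the field names and registry contents are hypothetical and not taken from the disclosure.

```python
# Hypothetical sketch of deciding whether to act on a third-party communication:
# check that the issuing entity is recognized and that the vehicle identifier in the
# message corresponds to this vehicle. Field names and the registry are assumptions.

KNOWN_THIRD_PARTIES = {"law_enforcement", "transport_authority"}  # assumed registry


def should_act_on(message: dict, own_vehicle_id: str) -> bool:
    """Accept a communication only if its issuer is recognized and it targets this vehicle."""
    return (message.get("issuer") in KNOWN_THIRD_PARTIES
            and message.get("vehicle_id") == own_vehicle_id)


if __name__ == "__main__":
    msg = {"issuer": "law_enforcement", "vehicle_id": "104", "instruction": "stop"}
    print(should_act_on(msg, own_vehicle_id="104"))  # True
    print(should_act_on(msg, own_vehicle_id="105"))  # False: addressed to another vehicle
```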
  • the vehicle computing system 102 can send data indicative of the vehicle instruction(s) to the other vehicle(s) 105 in the convoy, determine a stopping action that corresponds to the vehicle instruction(s), remove the vehicle 104 from the convoy, and implement the stopping action to bring the vehicle 104 to a stop.
  • the vehicle computing system 102 can send data indicative of the vehicle instruction(s) to a lead vehicle 105 in the convoy, receive one or more vehicle instruction(s) from the lead vehicle 105, and implement the vehicle instruction(s) received from the lead vehicle 105.
  • one or more clients and/or applications operating in association with the vehicle computing system 102 can send vehicle messages corresponding to the vehicle instruction(s) to an offboard gateway, and a Vehicle API can provide the vehicle messages to the other vehicle(s) 105, through the offboard gateway and a Vehicle API relay/client associated with the vehicle(s) 105, as authenticated vehicle messages.
  • the method 1300 can include controlling the first autonomous vehicle to implement the vehicle action(s).
  • the vehicle computing system 102 can control the vehicle 104 to implement the determined vehicle action(s) in response to receiving the one or more vehicle instructions. If the vehicle instruction(s) include instructions for the vehicle computing system 102 to stop the vehicle 104, and the vehicle instruction(s) include a non-critical reason for stopping the vehicle 104, then the vehicle computing system 102 can control the vehicle 104 to implement a soft-stop vehicle action.
  • If the vehicle instruction(s) include instructions for the vehicle computing system 102 to stop the vehicle 104, and the vehicle instruction(s) include a critical reason for stopping the vehicle 104, then the vehicle computing system 102 can control the vehicle 104 to implement an emergency-stop vehicle action. If the vehicle instruction(s) include instructions for the vehicle computing system 102 to stop the vehicle 104, and the vehicle instruction(s) are associated with a low priority, then the vehicle computing system 102 can control the vehicle 104 to implement a soft-stop vehicle action. If the vehicle instruction(s) include instructions for the vehicle computing system 102 to stop the vehicle 104, and the vehicle instruction(s) are associated with a high priority, then the vehicle computing system 102 can control the vehicle 104 to implement an emergency-stop vehicle action.
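The mapping above from a stop instruction's reason and priority to a soft-stop or emergency-stop action can be sketched as a small decision function. The criticality set and field names below are illustrative assumptions, not values given by the disclosure.

```python
# Hypothetical sketch of choosing between a soft stop and an emergency stop from the
# fields accompanying a stop instruction. The disclosure distinguishes non-critical /
# low-priority instructions (soft stop) from critical / high-priority ones (emergency stop).

CRITICAL_REASONS = {"fluid_leak", "brake_failure"}   # assumed examples of critical reasons


def select_stop_action(instruction: dict) -> str:
    """Return 'emergency_stop' or 'soft_stop' for a stop instruction."""
    if instruction.get("priority") == "high":
        return "emergency_stop"
    if instruction.get("reason") in CRITICAL_REASONS:
        return "emergency_stop"
    return "soft_stop"                               # e.g., low tire pressure, inspection


if __name__ == "__main__":
    print(select_stop_action({"reason": "low_tire_pressure", "priority": "low"}))  # soft_stop
    print(select_stop_action({"reason": "fluid_leak"}))                            # emergency_stop
```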
  • FIG. 14 depicts a diagram 1400 of a transfer hub 1460 according to example embodiments of the present disclosure.
  • the transfer hub 1460 can include a loading zone 1462, launch zone 1464, and landing zone 1466.
  • the loading zone 1462 can be connected to the launch zone 1464 via an access route 1472, and connected to the landing zone 1466 via an access route 1478.
  • the launch zone 1464 can be connected to a highway via an on-ramp 1474, and the landing zone can be connected to the highway via an off-ramp 1476.
  • An autonomous vehicle can exit the transfer hub 1460 via the on-ramp 1474, and the autonomous vehicle can enter the transfer hub 1460 via the off-ramp 1476.
  • the transfer hub 1460 can include a first external monitor 1482.
  • the vehicle computing system 102, and/or the operations computing system 120 can control the vehicle 104 to travel to a vicinity of the first external monitor 1482 when the vehicle 104 enters the transfer hub 1460 via the off-ramp 1476.
  • the first external monitor 1482 (e.g., the external monitor computing system 126 corresponding to the first external monitor 1482) can inspect the vehicle 104 to generate diagnostics information associated with the vehicle 104.
  • the first external monitor 1482 can obtain diagnostics information from the vehicle 104.
  • the external monitor 1482 can determine remote inspection information associated with the vehicle 104 based on the diagnostics information, and provide the remote inspection information to a third-party computing system 124.
  • the first external monitor 1482 can inspect the vehicle 104 to generate diagnostics information associated with the vehicle 104, and provide the diagnostics information to the vehicle computing system 102.
  • the vehicle computing system 102 can store the diagnostics information and/or determine remote inspection information associated with the vehicle 104 based on the diagnostics information.
  • the vehicle computing system 102 can provide the remote inspection information to a third-party computing system 124.
  • the transfer hub 1460 can include a second external monitor 1484.
  • the vehicle computing system 102, and/or the operations computing system 120 can control the vehicle 104 to travel to a vicinity of the second external monitor 1484 before the vehicle 104 exits the transfer hub 1460 via the on-ramp 1474.
  • the second external monitor 1484 can inspect the vehicle 104 to generate diagnostics information associated with the vehicle 104. Additionally, or alternatively, the second external monitor 1484 can obtain diagnostics information from the vehicle 104. The external monitor 1484 can determine remote inspection information associated with the vehicle 104 based on the diagnostics information, and provide the remote inspection information to a third-party computing system 124.
  • the second external monitor 1484 can inspect the vehicle 104 to generate diagnostics information associated with the vehicle 104, and provide the diagnostics information to the vehicle computing system 102.
  • the vehicle computing system 102 can store the diagnostics information and/or determine remote inspection information associated with the vehicle 104 based on the diagnostics information.
  • the vehicle computing system 102 can provide the remote inspection information to a third-party computing system 124.
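A hedged sketch of the external-monitor flow described above: diagnostics are obtained (whether generated by the monitor's own inspection or reported by the vehicle), reduced to remote inspection information, and forwarded to a third-party computing system. The field names, thresholds, and function signatures are assumptions for illustration.

```python
# Hypothetical sketch of the external-monitor flow at a transfer hub: obtain diagnostics
# for a vehicle, reduce them to remote inspection information, and forward that
# information to a third-party computing system.

from typing import Callable, Dict


def to_remote_inspection_info(diagnostics: Dict[str, float]) -> Dict[str, str]:
    """Summarize raw diagnostics into a coarse, shareable report."""
    return {
        "brakes": "ok" if diagnostics.get("brake_pad_mm", 0.0) >= 4.0 else "attention",
        "coolant": "ok" if diagnostics.get("coolant_level_pct", 0.0) >= 50.0 else "attention",
    }


def inspect_and_report(
    obtain_diagnostics: Callable[[str], Dict[str, float]],     # monitor sensors or vehicle link
    send_to_third_party: Callable[[str, Dict[str, str]], None],
    vehicle_id: str,
) -> Dict[str, str]:
    report = to_remote_inspection_info(obtain_diagnostics(vehicle_id))
    send_to_third_party(vehicle_id, report)
    return report


if __name__ == "__main__":
    fake_diagnostics = lambda _vid: {"brake_pad_mm": 6.5, "coolant_level_pct": 40.0}
    print(inspect_and_report(fake_diagnostics, lambda vid, rep: None, "104"))
    # {'brakes': 'ok', 'coolant': 'attention'}
```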
  • FIG. 15 depicts a diagram 1500 of a transportation route 1502 according to example embodiments of the present disclosure.
  • the transportation route 1502 can be part of a highway transportation infrastructure administered by a highway transportation administration.
  • the transportation route 1502 can extend across regions 1503, 1505, 1507, and 1509, that are separated by boundaries 1504, 1506, 1508, and 1510.
  • the regions 1503, 1505, and 1507 can each correspond to, for example, different government entities, and the boundaries 1504 and 1506 can each correspond to, for example, political boundaries.
  • the boundary 1504 can separate region 1503 and region 1505, and the boundary 1506 can separate region 1505 and region 1507.
  • the region 1509 can correspond to, for example, a region where wireless communication is unavailable or unreliable.
  • the region 1509 can be bounded by the boundaries 1508 and 1510.
  • the transportation route 1502 can include an external monitor 1512 located at the boundary 1504, and an external monitor 1516 located at the boundary 1506.
  • the vehicle 104 can travel within a vicinity of the external monitor 1512 when the vehicle 104 crosses the boundary 1504, and within a vicinity of the external monitor 1516 when the vehicle 104 crosses the boundary 1506.
  • the vehicle 104 can autonomously provide remote inspection information associated with it to the external monitors 1512 and 1516 when the vehicle 104 is within a vicinity of the external monitors 1512 and 1516, respectively.
  • when the vehicle 104 is travelling from region 1503 to region 1507, the vehicle 104 can provide remote inspection information including a weight of an attached cargo item to the external monitor 1512 when the vehicle 104 crosses the boundary 1504.
  • the external monitor 1512 can provide the remote inspection information received from the vehicle 104 to a tax assessment entity associated with the region 1505.
  • when the vehicle 104 is travelling from region 1503 to region 1507, the vehicle 104 can provide remote inspection information including a weight of an attached cargo item to the external monitor 1516 when the vehicle 104 crosses the boundary 1506.
  • the external monitor 1516 can provide the remote inspection information received from the vehicle 104 to a tax assessment entity associated with region 1507.
  • the transportation route 1502 can include an external monitor 1522 located at the boundary 1504, and an external monitor 1518 located at the boundary 1506.
  • the vehicle 104 can travel within a vicinity of the external monitor 1518 when the vehicle 104 crosses the boundary 1506, and within a vicinity of the external monitor 1522 when the vehicle 104 crosses the boundary 1504.
  • the vehicle 104 can autonomously provide remote inspection information associated with it to the external monitors 1518 and 1522 when the vehicle 104 is within a vicinity of the external monitors 1518 and 1522, respectively.
  • the vehicle 104 can provide remote inspection information including a weight of an attached cargo item to the external monitor 1518 when the vehicle 104 crosses the boundary 1506.
  • the external monitor 1518 can provide the remote inspection information received from the vehicle 104 to a tax assessment entity associated with the region 1505.
  • the vehicle 104 can provide remote inspection information including a weight of an attached cargo item to the external monitor 1522 when the vehicle 104 crosses the boundary 1504.
  • the external monitor 1522 can provide the remote inspection information received from the vehicle 104 to a tax assessment entity associated with region 1503.
  • the vehicle computing system 102, and/or the operations computing system 120 can control the vehicle 104 to wirelessly provide remote inspection information to a third-party entity at periodic intervals along the transportation route 1502.
  • the vehicle 104 can provide remote inspection information including a location and speed associated with the vehicle 104 to a law enforcement entity to assist the law enforcement entity in monitoring vehicular traffic in its jurisdiction.
  • when the vehicle 104 is located in the region 1503, the vehicle 104 can provide such remote inspection information to a law enforcement entity associated with the region 1503; when the vehicle 104 is located in the region 1505, the vehicle 104 can provide such remote inspection information to a law enforcement entity associated with the region 1505; and when the vehicle 104 is located in the region 1507, the vehicle 104 can provide such remote inspection information to a law enforcement entity associated with the region 1507.
  • the transportation route 1502 can include external monitors 1514 and 1520 located within the region 1509.
  • the external monitors 1514 and 1520 can include a dedicated and/or physical connection to a communications network to provide remote inspection information to a third-party entity.
  • the vehicle 104 can be unable to provide remote inspection information including a location and speed associated with the vehicle 104 to a law enforcement entity associated with the region 1505 because wireless communication is unavailable or unreliable in the region 1509.
  • the vehicle 104 can instead travel within a vicinity of the external monitor 1514 when travelling from the region 1503 to the region 1507.
  • the vehicle 104 can provide remote inspection information including a location and speed associated with the vehicle 104 to the external monitors 1514 and 1520, and the external monitors 1514 and 1520 can provide the remote inspection information to a law-enforcement entity associated with the region 1505.
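The per-jurisdiction reporting above, including deferring reports while inside a region without reliable wireless coverage until the vehicle reaches a wired external monitor, can be sketched as a simple lookup on route position. The region table, mileposts, and recipient names below are hypothetical values used only to illustrate the idea.

```python
# Hypothetical sketch of routing periodic location/speed reports to the law-enforcement
# entity for whichever region the vehicle is currently in, deferring the report to a
# fixed external monitor when the vehicle is inside a region without reliable wireless
# coverage (as with region 1509 above). Region bounds and endpoints are assumptions.

from typing import List, Optional, Tuple

# (start_mile, end_mile, recipient) along the route; values are illustrative only.
REGION_TABLE: List[Tuple[float, float, str]] = [
    (0.0, 100.0, "law_enforcement_1503"),
    (100.0, 200.0, "law_enforcement_1505"),
    (200.0, 300.0, "law_enforcement_1507"),
]
NO_WIRELESS: List[Tuple[float, float]] = [(120.0, 160.0)]  # assumed extent of region 1509


def report_target(route_mile: float) -> Optional[str]:
    """Return who should receive the report now, or None to hold it for the next monitor."""
    for start, end in NO_WIRELESS:
        if start <= route_mile < end:
            return None  # queue until the vehicle passes a wired external monitor
    for start, end, recipient in REGION_TABLE:
        if start <= route_mile < end:
            return recipient
    return None


if __name__ == "__main__":
    print(report_target(50.0))    # law_enforcement_1503
    print(report_target(130.0))   # None -> deliver via external monitor 1514/1520
    print(report_target(250.0))   # law_enforcement_1507
```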
  • FIGS. 16A, 16B, and 16C depict diagrams 1602, 1604, and 1606 of determining remote inspection information using a mobile external monitor according to exemplary embodiments of the present disclosure.
  • the vehicle 104 can detect a fault, for example, with a tire pressure sensor of the vehicle 104, when travelling via the transportation route 1610. In response to the fault, the vehicle 104 can pull over in the safety lane 1604 and request a mobile inspection of its tires.
  • the vehicle 105 can be selected to travel to a vicinity of the vehicle 104 to inspect one or more tires of the vehicle 104.
  • the vehicle 105 can be affixed with an external monitor that can inspect the vehicle 104.
  • the vehicle 105 can use one or more sensors onboard the vehicle 105 to inspect the vehicle 104.
  • the vehicle 105 can inspect the vehicle 104 to generate diagnostics information associated with the vehicle 104, and provide the diagnostics information to the vehicle 104.
  • the vehicle 105 can determine remote inspection information associated with the vehicle 104, and provide the remote inspection information to a third-party computing system 124.
  • the vehicle 104 can detect a fault, for example, with a tire pressure sensor of the vehicle 104, when travelling via the transportation route 1610. In response to the fault, the vehicle 104 can travel in a right-side lane of the transportation route 1610, and request a mobile inspection of its left-side tires.
  • the vehicle 105 can be selected to travel to a vicinity of the vehicle 104 to inspect one or more tires of the vehicle 104.
  • the vehicle 105 can be affixed with an external monitor that can inspect the vehicle 104. Additionally, or alternatively, the vehicle 105 can use one or more sensors onboard the vehicle 105 to inspect the vehicle 104.
  • the vehicle 105 can travel in a left-side lane of the transportation route 1610 and inspect a left-side of the vehicle 104 to generate diagnostics information associated with the vehicle 104.
  • the vehicle 105 can provide the diagnostics information to the vehicle 104. Additionally, or alternatively, the vehicle 105 can determine remote inspection information associated with the vehicle 104, and provide the remote inspection information to a third-party computing system 124.
  • the vehicle 104 can detect a fault, for example, with a tire pressure sensor of the vehicle 104, when travelling via the transportation route 1610. In response to the fault, the vehicle 104 can travel in a left-side lane of the transportation route 1610, and request a mobile inspection of its right-side tires.
  • the vehicle 105 can be selected to travel to a vicinity of the vehicle 104 to inspect one or more tires of the vehicle 104.
  • the vehicle 105 can be affixed with an external monitor that can inspect the vehicle 104. Additionally, or alternatively, the vehicle 105 can use one or more sensors onboard the vehicle 105 to inspect the vehicle 104.
  • the vehicle 105 can travel in a right-side lane of the transportation route 1610 and inspect a right-side of the vehicle 104 to generate diagnostics information associated with the vehicle 104.
  • the vehicle 105 can provide the diagnostics information to the vehicle 104. Additionally, or alternatively, the vehicle 105 can determine remote inspection information associated with the vehicle 104, and provide the remote inspection information to a third-party computing system 124.
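The lane choices in the scenarios above (the inspected vehicle keeping to one side while the mobile monitor passes on the side to be inspected) can be captured in a small planning helper; the function name and return fields are illustrative assumptions.

```python
# Hypothetical sketch of choosing lanes for a mobile inspection: the inspected vehicle
# keeps to the lane opposite the side that needs inspection, and the inspecting vehicle
# travels on the side to be inspected so its sensors face that side of the vehicle.

def plan_mobile_inspection(side_to_inspect: str) -> dict:
    """Return lanes for the inspected vehicle and the mobile external monitor."""
    if side_to_inspect not in ("left", "right"):
        raise ValueError(f"unknown side: {side_to_inspect}")
    inspected_lane = "right" if side_to_inspect == "left" else "left"
    return {"inspected_vehicle_lane": inspected_lane, "monitor_lane": side_to_inspect}


if __name__ == "__main__":
    print(plan_mobile_inspection("left"))
    # {'inspected_vehicle_lane': 'right', 'monitor_lane': 'left'}
    print(plan_mobile_inspection("right"))
    # {'inspected_vehicle_lane': 'left', 'monitor_lane': 'right'}
```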
  • FIG. 17 depicts a diagram 1700 of remote inspection information 1702 according to example embodiments of the present disclosure.
  • the remote inspection information 1702 can be associated with the vehicle 104, and can indicate a vehicle components status 1704, vehicle performance status 1706, and vehicle environment status 1708 associated with the vehicle 104.
  • the vehicle components status 1704 can correspond to, for example, a status of the sensor(s) 108, autonomy computing system 110, vehicle control system 112, and/or other systems onboard the vehicle 104.
  • the vehicle performance status 1706 can correspond to, for example, a speed, distance travelled, fuel consumption, weight, coolant levels, and brake wear associated with the vehicle 104.
  • the vehicle environment status 1708 can correspond to, for example, road conditions, weather conditions, and traffic conditions that are determined based on the sensor data 109.
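A minimal sketch of a container for the remote inspection information 1702 with the three status groups described above; the individual field names inside each group are assumptions, since the disclosure only gives examples.

```python
# Hypothetical sketch of a data structure grouping the vehicle components status 1704,
# vehicle performance status 1706, and vehicle environment status 1708.

from dataclasses import asdict, dataclass, field
from typing import Dict


@dataclass
class RemoteInspectionInfo:
    vehicle_id: str
    components_status: Dict[str, str] = field(default_factory=dict)    # e.g., sensors, autonomy
    performance_status: Dict[str, float] = field(default_factory=dict) # e.g., speed, brake wear
    environment_status: Dict[str, str] = field(default_factory=dict)   # e.g., road, weather

    def to_payload(self) -> dict:
        """Serialize for transmission to a third-party computing system."""
        return asdict(self)


if __name__ == "__main__":
    info = RemoteInspectionInfo(
        vehicle_id="104",
        components_status={"lidar": "ok", "vehicle_control_system": "ok"},
        performance_status={"speed_mps": 24.0, "brake_wear_pct": 35.0},
        environment_status={"weather": "clear"},
    )
    print(info.to_payload())
```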
  • FIG. 18 depicts a flow diagram of an example method 1800 for controlling an autonomous vehicle according to example embodiments of the present disclosure.
  • One or more portion(s) of the method 1800 can be implemented as operations by one or more computing system(s) such as, for example, the computing system(s) 102, 120, 801, and 810 shown in FIGS. 1, 2, and 8.
  • one or more portion(s) of the method 1800 can be implemented as an algorithm on the hardware components of the system(s) described herein (e.g., as in FIGS. 1, 2, and 8) to, for example, provide remote inspection information to a third-party entity.
  • FIG. 18 depicts elements performed in a particular order for purposes of illustration and discussion.
  • the method 1800 can include controlling an autonomous vehicle to provide a vehicle service.
  • the vehicle computing system 102 can control the vehicle 104 to provide a vehicle service to a client entity.
  • the method 1800 can include determining diagnostics information associated with the autonomous vehicle.
  • the vehicle computing system 102 can determine vehicle diagnostics information associated with the vehicle 104.
  • the vehicle computing system 102 can determine the vehicle diagnostics information by autonomously generating diagnostics information associated with the vehicle 104. Additionally, or alternatively, the vehicle computing system 102 can determine the vehicle diagnostics information by controlling the vehicle 104 to travel to a vicinity of an external monitor that can generate diagnostics information associated with the vehicle 104.
  • the external monitor can include one or more of an automated inspection device and a human inspector.
  • the external monitor can be located at one or more of a transfer hub and along a transportation route of a transportation network used to provide the vehicle service to the client entity.
  • the external monitor can be mobile, and affixed to an additional vehicle 105.
  • the external monitor can include one or more sensors onboard the vehicle 105.
  • the operations computing system 120 can control the vehicle 105 to travel to a vicinity of the vehicle 104 and generate diagnostics information associated with the vehicle 104.
  • the operations computing system 120 can control the vehicle 105 in response to a request by the vehicle 104 for an external monitor to generate diagnostics information associated with the vehicle 104.
  • the method 1800 can include determining remote inspection information associated with the autonomous vehicle.
  • the vehicle computing system 102 can determine remote inspection information associated with the vehicle 104, that includes an assessment of one or more categories pertaining to a third-party entity, based on vehicle diagnostics information associated with the vehicle 104.
  • the vehicle computing system 102 can determine one or more categories pertaining to the third-party entity, analyze the vehicle diagnostics information associated with the vehicle 104 to determine an assessment for each of the one or more categories pertaining to the third-party entity, and generate remote inspection information based at least in part on the assessment for each of the one or more categories pertaining to the third-party entity.
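A hedged sketch of the assessment step above: select the categories that pertain to a given third-party entity, assess each category from the diagnostics, and assemble the result into remote inspection information. The category map, thresholds, and field names are illustrative assumptions only.

```python
# Hypothetical sketch of generating remote inspection information from vehicle
# diagnostics, scoped to the categories a given third-party entity cares about.

from typing import Dict

CATEGORIES_BY_ENTITY = {                      # assumed interests of each third-party entity
    "law_enforcement": ["tires", "lights"],
    "tax_assessment": ["cargo_weight"],
}


def assess_category(category: str, diagnostics: Dict[str, float]) -> str:
    if category == "tires":
        return "pass" if diagnostics.get("tire_pressure_psi", 0.0) >= 95.0 else "fail"
    if category == "lights":
        return "pass" if diagnostics.get("lights_operational", 0.0) == 1.0 else "fail"
    if category == "cargo_weight":
        return f"{diagnostics.get('cargo_weight_kg', 0.0):.0f} kg"
    return "unknown"


def remote_inspection_for(entity: str, diagnostics: Dict[str, float]) -> Dict[str, str]:
    return {c: assess_category(c, diagnostics) for c in CATEGORIES_BY_ENTITY.get(entity, [])}


if __name__ == "__main__":
    diag = {"tire_pressure_psi": 90.0, "lights_operational": 1.0, "cargo_weight_kg": 18000.0}
    print(remote_inspection_for("law_enforcement", diag))  # {'tires': 'fail', 'lights': 'pass'}
    print(remote_inspection_for("tax_assessment", diag))   # {'cargo_weight': '18000 kg'}
```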
  • the method 1800 can include providing remote inspection information to a third-party entity.
  • the vehicle computing system 102 can provide remote inspection information associated with the vehicle 104 to a third-party computing system 124 corresponding to a third-party entity.
  • the vehicle computing system 102 can provide the remote inspection information to a remote third-party entity at one or more times when providing the vehicle service to the client entity.
  • FIG. 19 depicts an example computing system 1900 according to example embodiments of the present disclosure.
  • the example system 1900 illustrated in FIG. 19 is provided as an example only.
  • the components, systems, connections, and/or other aspects illustrated in FIG. 19 are optional and are provided as examples of what is possible, but not required, to implement the present disclosure.
  • the example system 1900 can include the vehicle computing system 102 of the vehicle 104 and, in some implementations, remote computing system(s) 1910 including one or more remote computing system(s) that are remote from the vehicle 104 (e.g., operations computing system 120) that can be communicatively coupled to one another over one or more networks 1920.
  • the remote computing system 1910 can be associated with a central operations system and/or an entity associated with the vehicle 104 such as, for example, a vehicle owner, vehicle manager, fleet operator, service provider, etc.
  • the computing device(s) 1901 of the vehicle computing system 102 can include processor(s) 1902 and a memory 1904.
  • the one or more processors 1902 can be any suitable processing device (e.g., a processor core, a microprocessor, an ASIC, an FPGA, a controller, a microcontroller, etc.) and can be one processor or a plurality of processors that are operatively connected.
  • the memory 1904 can include one or more non-transitory computer-readable storage media, such as RAM, ROM, EEPROM, EPROM, one or more memory devices, flash memory devices, etc., and combinations thereof.
  • the memory 1904 can store information that can be accessed by the one or more processors 1902.
  • the memory 1904 (e.g., one or more non-transitory computer-readable storage mediums, memory devices) can store instructions 1906 that can be executed by the one or more processors 1902.
  • the instructions 1906 can be software written in any suitable programming language or can be implemented in hardware. Additionally, or alternatively, the instructions 1906 can be executed in logically and/or virtually separate threads on processor(s) 1902.
  • the memory 1904 on-board the vehicle 104 can store instructions 1906 that when executed by the one or more processors 1902 on-board the vehicle 104 cause the one or more processors 1902 (the vehicle computing system 102) to perform operations such as any of the operations and functions of the vehicle computing system 102, as described herein, one or more operations of method 1300, and/or any other operations and functions of the vehicle computing system 102, as described herein.
  • the memory 1904 can store data 1908 that can be obtained, received, accessed, written, manipulated, created, and/or stored.
  • the data 1908 can include, for instance, data associated with perception, prediction, motion plan, maps, third-party identification, vehicle identification, vehicle status information, vehicle diagnostics, remote inspection, and/or other data/information as described herein.
  • the computing device(s) 1901 can obtain data from one or more memory device(s) that are remote from the vehicle 104.
  • the computing device(s) 1901 can also include a communication interface 1903 used to communicate with one or more other system(s) on-board the vehicle 104 and/or a remote computing device that is remote from the vehicle 104 (e.g., of remote computing system(s) 1910).
  • the communication interface 1903 can include any circuits, components, software, etc. for communicating via one or more networks (e.g., 1920).
  • the communication interface 1903 can include, for example, one or more of a communications controller, receiver, transceiver, transmitter, port, conductors, software, and/or hardware for communicating data.
  • the network(s) 1920 can be any type of network or combination of networks that allows for communication between devices.
  • the network(s) can include one or more of a local area network, wide area network, the Internet, secure network, cellular network, mesh network, peer-to-peer communication link, and/or some combination thereof, and can include any number of wired or wireless links.
  • Communication over the network(s) 1920 can be accomplished, for instance, via a communication interface using any type of protocol, protection scheme, encoding, format, packaging, etc.
  • the remote computing system 1910 can include one or more remote computing devices that are remote from the vehicle computing system 102.
  • the remote computing devices can include components (e.g., processor(s), memory, instructions, data) similar to that described herein for the computing device(s) 1901.
  • the remote computing system(s) 1910 can be configured to perform one or more operations of the operations computing system 120, as described herein.
  • the computing systems of other vehicles described herein can include components similar to those of the vehicle computing system 102.
  • Computing tasks discussed herein as being performed at computing device(s) remote from the vehicle can instead be performed at the vehicle (e.g., via the vehicle computing system), or vice versa.
  • Such configurations can be implemented without deviating from the scope of the present disclosure.
  • the use of computer-based systems allows for a great variety of possible configurations, combinations, and divisions of tasks and functionality between and among components.
  • Computer-implemented operations can be performed on a single component or across multiple components.
  • Computer-implemented tasks and/or operations can be performed sequentially or in parallel.
  • Data and instructions can be stored in a single memory device or across multiple memory devices.

Abstract

Systems and methods for controlling an autonomous vehicle in response to vehicle instructions from a remote computing system are provided. In one example embodiment, a computer-implemented method includes controlling a first autonomous vehicle to provide a vehicle service, the first autonomous vehicle being associated with a first convoy that includes one or more second autonomous vehicles. The method includes receiving one or more communications from a remote computing system associated with a third-party entity, the one or more communications including one or more vehicle instructions. The method includes coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity. The method includes controlling the first autonomous vehicle to implement the one or more vehicle actions.

Description

SYSTEMS AND METHODS FOR CONTROLLING AN AUTONOMOUS VEHICLE
PRIORITY CLAIM
[0001] The present application claims the benefit of priority of U.S. Provisional Patent Application No. 62/615,206 filed January 9, 2018, entitled “Systems and Methods for Controlling Autonomous Vehicle,” and U.S. Provisional Patent Application No. 62/620,656 filed January 23, 2018, entitled “Systems and Methods For Remote Inspection of a Vehicle.” The above-referenced patent applications are incorporated herein by reference.
FIELD
[0002] The present disclosure relates generally to controlling an autonomous vehicle in response to communications sent by a third-party entity and controlling an autonomous vehicle to provide inspection information to a remote third-party entity.
BACKGROUND
[0003] An autonomous vehicle is a vehicle that is capable of sensing its environment and navigating without human input. In particular, an autonomous vehicle can observe its surrounding environment using a variety of sensors and can attempt to comprehend the environment by performing various processing techniques on data collected by the sensors. Given knowledge of its surrounding environment, the autonomous vehicle can identify an appropriate motion plan through such surrounding environment.
SUMMARY
[0004] Aspects and advantages of the present disclosure will be set forth in part in the following description, or may be learned from the description, or may be learned through practice of the embodiments.
[0005] One example aspect of the present disclosure is directed to a computer-implemented method for controlling an autonomous vehicle in response to vehicle
instructions from a remote computing system. The method includes controlling, by one or more computing devices, a first autonomous vehicle to provide a vehicle service, the first autonomous vehicle being associated with a first convoy that includes one or more second autonomous vehicles. The method includes receiving, by the one or more computing devices, one or more communications from a remote computing system associated with a third-party entity, the one or more communications including one or more vehicle instructions. The method includes coordinating, by the one or more computing devices, with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity. The method includes controlling, by the one or more computing devices, the first autonomous vehicle to implement the one or more vehicle actions.
[0006] Another example aspect of the present disclosure is directed to a computing system for controlling an autonomous vehicle in response to vehicle instructions from a remote computing system. The computing system includes one or more processors and one or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the computing system to perform operations. The operations include controlling a first autonomous vehicle to provide a vehicle service, the first autonomous vehicle being associated with a first convoy that includes one or more second autonomous vehicles. The operations include receiving one or more communications from a remote computing system associated with a third-party entity, the one or more communications including one or more vehicle instructions. The operations include coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity. The operations include controlling the first autonomous vehicle to implement the one or more vehicle actions.
[0007] Another example aspect of the present disclosure is directed to an autonomous vehicle. The autonomous vehicle includes one or more processors and one or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the autonomous vehicle to perform operations. The operations include controlling a first autonomous vehicle to provide a vehicle service, the first autonomous vehicle being selected from a fleet of vehicles controlled by a first entity to provide the vehicle service to a second entity. The operations include receiving one or more communications from a remote computing system associated with a third entity, the one or more communications including one or more vehicle instructions. The operations include determining one or more vehicle actions to perform in response to the one or more vehicle instructions. The operations include controlling the first autonomous vehicle to implement the one or more vehicle actions.
[0008] Another example aspect of the present disclosure is directed to a computer- implemented method for controlling an autonomous vehicle to provide a vehicle service. The method includes determining, by one or more computing devices, vehicle diagnostics information associated with a first autonomous vehicle that is part of a fleet of vehicles controlled by a first entity to provide a vehicle service to a second entity. The method includes determining, by the one or more computing devices, remote inspection information that includes an assessment of one or more categories pertaining to a third entity, based at least in part on the vehicle diagnostics information. The method includes providing, by the one or more computing devices, the remote inspection information to the third entity to provide the vehicle service.
[0009] Another example aspect of the present disclosure is directed to a computing system for controlling an autonomous vehicle to provide a vehicle service. The computing system includes one or more processors and one or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the computing system to perform operations. The operations include determining vehicle diagnostics information associated with a first autonomous vehicle that is part of a fleet of vehicles controlled by a first entity to provide a vehicle service to a second entity. The operations include determining remote inspection information that includes an assessment of one or more categories pertaining to a third entity, based at least in part on the vehicle diagnostics information. The operations include providing the remote inspection information to the third entity to provide the vehicle service.
[0010] Yet another example aspect of the present disclosure is directed to an autonomous vehicle. The autonomous vehicle includes one or more processors and one or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the autonomous vehicle to perform operations. The operations include determining vehicle diagnostics information associated with the autonomous vehicle, the autonomous vehicle controlled by a first entity to provide a vehicle service to a second entity. The operations include determining remote inspection information that includes an assessment of one or more categories pertaining to a third entity, based at least in part on the vehicle diagnostics information. The operations include providing the remote inspection information to the third entity to provide the vehicle service.
[0011] Other example aspects of the present disclosure are directed to systems, methods, vehicles, apparatuses, tangible, non-transitory computer-readable media, and memory devices for controlling an autonomous vehicle.
[0012] These and other features, aspects, and advantages of various embodiments will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the present disclosure and, together with the description, serve to explain the related principles.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] Detailed discussion of embodiments directed to one of ordinary skill in the art are set forth below, which make reference to the appended figures, in which:
[0014] FIG. 1 depicts a block diagram of an example system overview according to example embodiments of the present disclosure;
[0015] FIG. 2 depicts a block diagram of an example vehicle computing system according to example embodiments of the present disclosure;
[0016] FIG. 3 depicts a block diagram of an example vehicle remote computing system interface according to example embodiments of the present disclosure;
[0017] FIGS. 4A-4C depict diagrams that illustrate an example of controlling an autonomous vehicle according to example embodiments of the present disclosure;
[0018] FIGS. 5 A and 5B depict diagrams that illustrate an example of controlling an autonomous vehicle according to example embodiments of the present disclosure;
[0019] FIG. 6 depicts a diagram that illustrates an example of controlling an autonomous vehicle according to example embodiments of the present disclosure;
[0020] FIG. 7 depicts a diagram that illustrates an example of controlling an autonomous vehicle according to example embodiments of the present disclosure;
[0021] FIG. 8 depicts a diagram that illustrates an example of controlling an autonomous vehicle according to example embodiments of the present disclosure;
[0022] FIG. 9 depicts a diagram that illustrates an example of controlling an autonomous vehicle according to example embodiments of the present disclosure; and
[0023] FIG. 10 depicts a diagram that illustrates an example of controlling an
autonomous vehicle according to example embodiments of the present disclosure;
[0024] FIG. 11 depicts a diagram that illustrates an example of controlling an
autonomous vehicle according to example embodiments of the present disclosure;
[0025] FIG. 12 depicts a diagram that illustrates an example of controlling an
autonomous vehicle according to example embodiments of the present disclosure;
[0026] FIG. 13 depicts a flow diagram of controlling an autonomous vehicle according to example embodiments of the present disclosure;
[0027] FIG. 14 depicts an example transfer hub according to example embodiments of the present disclosure; [0028] FIG. 15 depicts an example transportation route according to example
embodiments of the present disclosure;
[0029] FIGS. 16A-16C depict determining remote inspection information using a mobile external monitor according to example embodiments of the present disclosure;
[0030] FIG. 17 depicts example remote inspection information according to example embodiments of the present disclosure;
[0031] FIG. 18 depicts a flow diagram of controlling an autonomous vehicle to provide a vehicle service according to example embodiments of the present disclosure; and
[0032] FIG. 19 depicts example system components according to example embodiments of the present disclosure.
[0033] Reference numerals that are repeated across plural figures are intended to identify the same components or features in various implementations.
DETAILED DESCRIPTION
[0034] Example aspects of the present disclosure are directed to managing a fleet of vehicles in response to communications sent by a third-party entity. An entity (e.g., a service provider) can operate the fleet of vehicles to provide a vehicle service for another entity requesting the vehicle service (e.g., a client). The service provider can operate the fleet of vehicles in one or more jurisdictions under the purview of one or more third-party entities (e.g., law enforcement entity, transportation infrastructure regulatory entity, tax assessment entity, etc.). The fleet can include, for example, autonomous vehicles that can drive, navigate, operate, etc. with minimal and/or no interaction from a human driver. A service provider can control an autonomous vehicle in the fleet of vehicles to provide the vehicle service for a client in one or more jurisdictions under the purview of one or more third-party entities (e.g., law enforcement entity, transportation infrastructure regulatory entity, tax assessment entity, etc.). Systems and methods of the present disclosure can enable an autonomous vehicle that is providing a vehicle service, to receive one or more vehicle command(s) from a third-party entity, and to perform one or more vehicle action(s) in response to the vehicle command(s) sent by the third-party entity. In some implementations, for example, the service provider can control the autonomous vehicle to provide remote inspection information associated with the autonomous vehicle to the one or more third-party entities when the autonomous vehicle is operating in the one or more jurisdictions.
[0035] More particularly, a service provider can operate a fleet of one or more vehicles (e.g., ground-based vehicles) to provide a vehicle service such as a transportation service, a courier service, a delivery service, etc. The vehicles can be autonomous vehicles that include various systems and devices configured to control the operation of the vehicle. For example, an autonomous vehicle can include an onboard vehicle computing system for operating the vehicle (e.g., located on or within the autonomous vehicle). In some implementations, the autonomous vehicles can operate in an autonomous mode. For example, the vehicle computing system can receive sensor data from sensors onboard the vehicle (e.g., cameras, LIDAR, RADAR), attempt to comprehend the environment proximate to the vehicle by performing various processing techniques on the sensor data, and generate an appropriate motion plan through the environment. In some implementations, the autonomous vehicles can operate in a manual mode. For example, a human operator (e.g., driver) can manually control the autonomous vehicle. Moreover, the autonomous vehicle can be configured to communicate with one or more computing device(s) that are remote from the vehicle. For example, the autonomous vehicle can communicate with an operations computing system that can be associated with the service provider. The operations computing system can help the service provider monitor, communicate with, manage, etc. the fleet of vehicles. As another example, the autonomous vehicle can communicate with one or more other vehicles (e.g., a vehicle computing system onboard each of the one or more other vehicles in the fleet), third- party computing systems (e.g., a client computing system, law enforcement computing system, transportation infrastructure computing system, tax assessment computing system, etc.), or other remote computing systems. In some implementations, the operations computing system can mediate communication between the autonomous vehicle and the computing device(s) that are remote from the vehicle.
[0036] According to aspects of the present disclosure, a vehicle application programming interface (Vehicle API) platform can provide for a translation/transport layer as an interface between vehicle computing systems onboard vehicles within an entity's fleet and one or more remote clients and/or applications operating on a remote computing system (e.g., a vehicle computing system onboard each of the one or more other vehicles in the fleet, a third-party computing system, etc.). For example, the Vehicle API platform can receive data from a vehicle over a communications pipeline established with the Vehicle API. The Vehicle API platform can provide for communicating vehicle data to the remote computing system in a secure manner that allows for expanded processing of vehicle data off the vehicle, analyzing such data in real time, and/or the like. According to example aspects of the present disclosure, a Vehicle API platform can be vehicle agnostic, allowing for any autonomous vehicle and/or computer-capable vehicle to interact with a remote computing system by providing a consistent communication pipeline that any vehicle computing system would be able to use to send vehicle data (e.g., vehicle state information, etc.) and/or receive messages (e.g., command/control messages, configuration messages, etc.) from a remote computing system.
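As a hedged illustration of the vehicle-agnostic pipeline described above, the sketch below wraps an arbitrary payload (vehicle state, command, or configuration) in a common envelope that any vehicle computing system could emit or consume; the envelope fields and function name are assumptions and not the actual Vehicle API schema.

```python
# Hypothetical sketch of a vehicle-agnostic message envelope for a Vehicle API-style
# pipeline: any vehicle wraps its payload in a common envelope before it is relayed
# to or from remote clients and applications.

import json
import time
import uuid


def make_envelope(vehicle_id: str, kind: str, payload: dict) -> str:
    """Wrap a payload so that any consumer can route it without vehicle-specific logic."""
    return json.dumps({
        "message_id": str(uuid.uuid4()),
        "vehicle_id": vehicle_id,
        "kind": kind,              # e.g., "vehicle_state", "command", "configuration"
        "sent_at": time.time(),
        "payload": payload,
    })


if __name__ == "__main__":
    print(make_envelope("104", "vehicle_state", {"speed_mps": 24.0, "lane": "right"}))
```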
[0037] According to aspects of the present disclosure, one or more autonomous vehicle(s) in the fleet can receive one or more vehicle command(s) from a third-party computing system associated with a third-party entity, and determine one or more vehicle action(s) to perform in response to the vehicle command(s). The third-party entity can be a law enforcement entity, and more particularly a representative of the law enforcement entity, such as, for example, a police officer. The police officer can send vehicle command(s) to the autonomous vehicle(s) via the third-party computing system.
[0038] In some implementations, the autonomous vehicle(s) can be configured to receive the vehicle command(s) from the third-party computing system via a local-area or short-range communication network. For example, the autonomous vehicle(s) can receive the vehicle command(s) via a direct line-of-sight communication network. In this way, the third-party computing system can be limited to controlling the autonomous vehicle(s) via the vehicle command(s) when the third-party entity associated with the third-party computing system can directly observe the autonomous vehicle(s), or when the third-party entity and/or third-party computing system is proximate to the autonomous vehicle(s).
[0039] In some implementations, the autonomous vehicle(s) can be initially configured as being unselected. The unselected autonomous vehicle(s) can be configured to respond only to vehicle command(s) for selecting an autonomous vehicle from the third-party entity. In response to receiving vehicle command(s) for selecting an autonomous vehicle, the unselected autonomous vehicle(s) can be configured as being selected. The selected autonomous vehicle(s) can be configured to respond to additional vehicle command(s) from the third-party entity.
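The selected/unselected behavior described above can be pictured as a small command gate: while unselected, a vehicle acts only on a selection command; once selected, it accepts further commands. The command names and return values below are illustrative assumptions, not the disclosed protocol.

```python
# Hypothetical sketch of the selected/unselected command gate. Unselected vehicles
# respond only to a selection command; selected vehicles respond to further commands
# from the third-party entity until they are deselected.

class ThirdPartyCommandGate:
    def __init__(self) -> None:
        self.selected = False

    def handle(self, command: str) -> str:
        if not self.selected:
            if command == "select":
                self.selected = True
                return "acknowledged_selection"   # e.g., flash external indicator lights
            return "ignored"                      # unselected vehicles ignore other commands
        if command == "deselect":
            self.selected = False
            return "deselected"
        return f"executing:{command}"             # selected vehicles act on further commands


if __name__ == "__main__":
    gate = ThirdPartyCommandGate()
    print(gate.handle("stop"))     # ignored
    print(gate.handle("select"))   # acknowledged_selection
    print(gate.handle("stop"))     # executing:stop
```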
[0040] As an example, a police officer can identify a first autonomous vehicle and send vehicle command(s) for selecting an autonomous vehicle, to the first autonomous vehicle. In response to receiving the vehicle command(s), the first autonomous vehicle can be configured as being selected. In addition, the first autonomous vehicle can perform vehicle action(s) to indicate the selection, for example, by flashing external indicator lights, displaying a message on an external display, or performing other vehicle action(s) that can be perceived by the police officer so that the police officer can determine that the first autonomous vehicle is selected. [0041] As another example, a police officer can identify a first autonomous vehicle and send vehicle command(s) to select the first autonomous vehicle. The police officer can then send vehicle command(s) instructing the selected first autonomous vehicle to relay future vehicle command(s) from the police officer to a second autonomous vehicle. The vehicle command(s) can identify the second autonomous vehicle based on an identifier associated with the second autonomous vehicle, or based on a relative position of the second
autonomous vehicle with respect to the first autonomous vehicle (e.g., in front, behind, etc.). In response to the vehicle command(s), the first autonomous vehicle can communicate with the second autonomous vehicle for sending data indicative of the future vehicle command(s) to the second autonomous vehicle. The future vehicle command(s) can include vehicle command(s) for selecting an autonomous vehicle. When the first autonomous vehicle receives the future vehicle command(s), the first autonomous vehicle can send data indicative of the future vehicle command(s) to the second autonomous vehicle. In response to receiving the data from the first autonomous vehicle, the second autonomous vehicle can be configured as being selected. The first autonomous vehicle can then be configured as being unselected.
[0042] As another example, a police officer can identify a first autonomous vehicle and broadcast vehicle command(s) for selecting an autonomous vehicle, with an intent to select the first autonomous vehicle. If a second autonomous vehicle receives the vehicle command(s) instead of the first autonomous vehicle, then the second autonomous vehicle can be selected. The police officer can broadcast vehicle command(s) instructing the selected second autonomous vehicle to relay future vehicle command(s) to the first autonomous vehicle. The vehicle command(s) can, for example, instruct the second autonomous vehicle to relay the future vehicle command(s) to an autonomous vehicle in front of the second autonomous vehicle or behind the second autonomous vehicle. The future vehicle command(s) can include vehicle command(s) for selecting an autonomous vehicle. When the second autonomous vehicle receives the future vehicle command(s), the second autonomous vehicle can send data indicative of the future vehicle command(s) to the first autonomous vehicle. In response to receiving the data from the second autonomous vehicle, the first autonomous vehicle can be configured as being selected. The second autonomous vehicle can then be configured as being unselected.
[0043] As another example, a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle. The vehicle command(s) can include or otherwise indicate a reason for the selection, such as, for example, low tire pressure, broken windshield, fluid leak, etc. The police officer can send vehicle command(s) instructing the first autonomous vehicle to provide information indicating the reason for the selection to a service provider. The first autonomous vehicle can determine one or more vehicle action(s) to perform in response to the vehicle command(s) from the police officer. The vehicle action(s) can include, for example, communicating with the service provider to send data indicative of the reason for the selection. In response, the service provider can, for example, schedule the first autonomous vehicle for a maintenance service or a repair service at a later time.
[0044] As another example, a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the first autonomous vehicle to provide status information associated with the vehicle. The first autonomous vehicle can determine one or more vehicle action(s) to perform in response to the vehicle command(s) from the police officer. If the vehicle command(s) instructing the first autonomous vehicle to provide the status information include or otherwise indicate one or more vehicle component(s) of interest, then the vehicle action(s) can include generating status information data that includes a status of the vehicle component(s). Otherwise, the vehicle action(s) can include generating status information data that includes, for example, an overall status or health of the first autonomous vehicle, a status of a default set of vehicle component(s), or other status information associated with the first autonomous vehicle. The first autonomous vehicle can provide the status information to the third-party computing system associated with the police officer. The police officer can review the status information to verify the tire pressure of the first autonomous vehicle and take further action if necessary. The further action can include, for example, sending vehicle command(s) that instruct the first autonomous vehicle to stop, travel to a maintenance area, etc.
[0045] As another example, a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the first autonomous vehicle to stop. The first autonomous vehicle can determine one or more vehicle action(s) to perform in response to the vehicle command(s) from the police officer. The vehicle action(s) can include, for example, implementing a stopping action.
[0046] In some implementations, the vehicle command(s) from a third-party entity can include or otherwise indicate a reason or a priority associated with the vehicle command(s), and an autonomous vehicle can determine vehicle action(s) to perform in response to the vehicle command(s) based on the associated reason or the priority.
[0047] As an example, a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the vehicle to provide status information associated with the vehicle. The vehicle command(s) can include a reason for the status information, such as, for example, to check a tire pressure of the first autonomous vehicle. The first autonomous vehicle can determine one or more vehicle action(s) to perform in response to the vehicle command(s) from the police officer. The vehicle action(s) can include, for example, generating status information indicative of the vehicle’s tire pressure. In this way, the first autonomous vehicle can determine vehicle action(s) to perform based on the reason associated with the vehicle command(s) from the third-party entity.
[0048] As another example, a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the vehicle to stop. The vehicle command(s) can include a reason for stopping the vehicle, such as, for example, low tire pressure. The first autonomous vehicle can determine that the reason for the stop is not a critical reason, and the first autonomous vehicle can perform a soft-stop vehicle action in response to the vehicle command(s). The soft-stop vehicle action can include the first autonomous vehicle continuing to travel along its route until a safe stopping location is identified (e.g., a shoulder lane, off-ramp, etc.), and stopping at the safe stopping location.
[0049] As another example, a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the vehicle to stop. The vehicle command(s) can include a reason for the stop, such as, for example, a fluid leak. The first autonomous vehicle can determine that the reason for the stop is a critical reason, and the first autonomous vehicle can perform an emergency-stop vehicle action in response to the vehicle command(s). The emergency-stop vehicle action can include the first autonomous vehicle immediately stopping in a lane that the autonomous vehicle is travelling in.
[0050] As another example, a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the vehicle to stop. The vehicle command(s) can include an associated priority level, such as, for example, low-priority or high-priority. If the priority level is low-priority, then the first autonomous vehicle can perform a soft-stop vehicle action in response to the vehicle command(s). If the priority level is high-priority, then the first autonomous vehicle can perform an emergency-stop vehicle action in response to the vehicle command(s).
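By way of a non-limiting illustration, the following sketch shows one way a vehicle computing system could map the stated reason or priority level of a stop command to a soft-stop or emergency-stop vehicle action. The reason strings, priority labels, and function names are assumptions made for illustration only and are not specified by the present disclosure.

```python
# Illustrative sketch: select a stopping action from a third-party stop
# command's stated reason or priority (names and values are assumed).

CRITICAL_REASONS = {"fluid_leak", "brake_failure"}

def choose_stop_action(reason=None, priority=None):
    """Return "emergency_stop" or "soft_stop" for a received stop command."""
    if priority == "high" or reason in CRITICAL_REASONS:
        # Stop immediately in the current travel lane.
        return "emergency_stop"
    # Otherwise continue to a safe stopping location (e.g., shoulder, off-ramp).
    return "soft_stop"

# A fluid leak triggers an emergency stop; low tire pressure yields a soft stop.
assert choose_stop_action(reason="fluid_leak") == "emergency_stop"
assert choose_stop_action(reason="low_tire_pressure", priority="low") == "soft_stop"
```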
[0051] As another example, a police officer can identify a first autonomous vehicle and send vehicle command(s) selecting the first autonomous vehicle and instructing the vehicle to stop. The vehicle command(s) can indicate that a reason for the stop is to enable the police officer to inspect the first autonomous vehicle. In response, the first autonomous vehicle can perform a stopping action so that the police officer can inspect the vehicle.
[0052] In some implementations, an autonomous vehicle that performs a vehicle action to come to a stop (e.g., soft-stop, emergency-stop, etc.) can implement one or more vehicle action(s) subsequent to the stopping action.
[0053] As an example, an autonomous vehicle can provide limited access to a police officer who is inspecting the autonomous vehicle.
[0054] As another example, an autonomous vehicle can communicate with the service provider to provide information indicative of vehicle command(s) from a third-party entity. The information can include a reason for the stop.
[0055] As another example, a police officer can send vehicle command(s) instructing the autonomous vehicle to resume normal operations. In response to the vehicle command(s), the autonomous vehicle can, for example, resume a vehicle service, join/rejoin a convoy, etc.
[0056] In some implementations, an autonomous vehicle can implement one or more action(s) subsequent to a stopping action based on one or more cargo asset(s) being transported by the autonomous vehicle.
[0057] As an example, if the cargo asset(s) include perishable goods, then the first autonomous vehicle can request that a second autonomous vehicle pick up the cargo asset(s) from a location of the first autonomous vehicle and transport the cargo asset(s) instead of the first autonomous vehicle.
[0058] As another example, if the cargo asset(s) are confidential or protected, then the first autonomous vehicle can lock down and restrict access in order to safeguard the cargo asset(s).
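By way of a non-limiting illustration, the following sketch shows one way the post-stop action(s) could be selected from attributes of the cargo asset(s). The attribute and action names are assumptions made for illustration only.

```python
# Illustrative sketch: choose follow-up actions after a stop based on the
# cargo asset(s) being transported (attribute and action names are assumed).

def post_stop_actions(cargo: dict) -> list:
    actions = []
    if cargo.get("perishable"):
        # Ask the service provider to dispatch another vehicle for the cargo.
        actions.append("request_cargo_transfer")
    if cargo.get("confidential"):
        # Restrict access to the vehicle and cargo area.
        actions.append("lock_down")
    return actions

print(post_stop_actions({"perishable": True}))    # ['request_cargo_transfer']
print(post_stop_actions({"confidential": True}))  # ['lock_down']
```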
[0059] In some implementations, the vehicle command(s) from a third-party entity can include information used to authenticate the third-party entity. For example, the vehicle command(s) can be encrypted using a predetermined key. The predetermined key can be one of a plurality of predetermined keys previously shared between the autonomous vehicle and a plurality of third-party computing systems each associated with a third-party entity. The autonomous vehicle can receive one or more encrypted vehicle command(s) from a third-party computing system, and decrypt the encrypted vehicle command(s) using a
predetermined key that is previously shared between the autonomous vehicle and the third- party computing system. In this way, the autonomous vehicle can authenticate the third-party entity. [0060] As an example, a first predetermined key can be shared between an autonomous vehicle and a first third-party computing system associated with a first third-party entity (e.g., law enforcement entity, transportation infrastructure regulatory entity, tax assessment entity, etc.), a second predetermined key can be shared between the autonomous vehicle and a second third-party computing system associated with a second third-party entity, and a third predetermined key can be shared between the autonomous vehicle and a third third-party computing system associated with a third third-party entity. If the autonomous vehicle can decrypt the encrypted vehicle command(s) with the first predetermined key, then the autonomous vehicle can authenticate the first third-party entity in association with the encrypted vehicle command(s). If the autonomous vehicle can decrypt the encrypted vehicle command(s) with the second predetermined key, then the autonomous vehicle can
authenticate the second third-party entity in association with the encrypted vehicle command(s). If the autonomous vehicle can decrypt the encrypted vehicle command(s) with the third predetermined key, then the autonomous vehicle can authenticate the third third-party entity in association with the encrypted vehicle command(s).
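By way of a non-limiting illustration, the following sketch shows a key-per-entity authentication flow in which the vehicle attempts to decrypt a received command with each previously shared key. It uses symmetric Fernet keys from the third-party `cryptography` package purely as an example; the present disclosure does not specify a particular cryptographic scheme, and the entity names are assumptions.

```python
# Illustrative sketch: authenticate a third-party entity by attempting to
# decrypt an encrypted command with keys previously shared with known
# entities (scheme and entity names are assumed for illustration).
from cryptography.fernet import Fernet, InvalidToken

SHARED_KEYS = {
    "law_enforcement": Fernet.generate_key(),
    "transport_regulator": Fernet.generate_key(),
    "tax_assessor": Fernet.generate_key(),
}

def authenticate_and_decrypt(encrypted_command: bytes):
    """Return (entity, plaintext) if any previously shared key decrypts the command."""
    for entity, key in SHARED_KEYS.items():
        try:
            return entity, Fernet(key).decrypt(encrypted_command)
        except InvalidToken:
            continue
    return None, None  # the command could not be authenticated

# A command encrypted with the law-enforcement key authenticates that entity.
token = Fernet(SHARED_KEYS["law_enforcement"]).encrypt(b"stop")
assert authenticate_and_decrypt(token) == ("law_enforcement", b"stop")
```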
[0061] In some implementations, the autonomous vehicle can determine one or more vehicle action(s) to perform based in part on an authentication of the third-party entity.
[0062] As an example, an autonomous vehicle can receive vehicle command(s) instructing the vehicle to stop. If the autonomous vehicle authenticates a law enforcement entity in association with the vehicle command(s), then the autonomous vehicle can perform a stopping action to come to a stop. Alternatively, if the autonomous vehicle authenticates a tax assessment entity in association with the vehicle command(s), then the autonomous vehicle can ignore the vehicle command(s).
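By way of a non-limiting illustration, the following sketch shows one way the determination of vehicle action(s) could be conditioned on the authenticated entity, honoring a stop command from a law enforcement entity and ignoring the same command from a tax assessment entity. The entity and command names are assumptions made for illustration only.

```python
# Illustrative sketch: which commands are honored depends on which
# third-party entity was authenticated (entity/command names are assumed).

PERMITTED_COMMANDS = {
    "law_enforcement": {"stop", "provide_status"},
    "tax_assessor": {"provide_status"},
}

def action_for_command(entity: str, command: str):
    if command in PERMITTED_COMMANDS.get(entity, set()):
        return command  # e.g., hand a "stop" command to the motion planner
    return None         # ignore commands the entity is not permitted to issue

print(action_for_command("law_enforcement", "stop"))  # 'stop'
print(action_for_command("tax_assessor", "stop"))     # None (ignored)
```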
[0063] As another example, an autonomous vehicle can receive vehicle command(s) instructing the vehicle to provide status information associated with the vehicle. If the autonomous vehicle authenticates a law enforcement entity in association with the vehicle command(s), then the autonomous vehicle can provide the status information relevant to the law enforcement entity (e.g., tire pressure, speed, etc.) to a third-party computing system associated with the law enforcement entity. Alternatively, if the autonomous vehicle authenticates a tax assessment entity in association with the vehicle command(s), then the autonomous vehicle can provide the status information relevant to the tax assessment entity (e.g., cargo type, cargo weight, etc.) to a third-party computing system associated with the tax assessment entity. [0064] In some implementations, the vehicle command(s) from a third-party entity can include a vehicle identifier corresponding to a specific autonomous vehicle.
[0065] For example, an autonomous vehicle can include a static display of an identifier (e.g., an identification code corresponding to the autonomous vehicle painted on the outside of the autonomous vehicle) or a dynamic display of an identifier (e.g., an external display that displays an identification code corresponding to the autonomous vehicle). A law
enforcement entity (e.g., police officer) can identify a specific autonomous vehicle based on the vehicle identifier corresponding to the specific autonomous vehicle, and send vehicle command(s) that include the vehicle identifier. A first autonomous vehicle that receives the vehicle command(s) can determine whether the vehicle identifier included in the vehicle command(s) corresponds to the first autonomous vehicle. If the vehicle identifier
corresponds to the first autonomous vehicle, then the first autonomous vehicle can determine one or more vehicle action(s) to perform based on the vehicle command(s). If the vehicle identifier does not correspond to the first autonomous vehicle, then the first autonomous vehicle can ignore the vehicle command(s). Alternatively, if the vehicle identifier corresponds to a second autonomous vehicle that is proximate to the first autonomous vehicle, then the first autonomous vehicle can relay the vehicle command(s) to the second autonomous vehicle.
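By way of a non-limiting illustration, the following sketch shows one way a receiving vehicle could process, ignore, or relay a command based on the vehicle identifier it carries. The identifier format and the set of nearby vehicles are assumptions made for illustration only.

```python
# Illustrative sketch: act on, relay, or ignore a command based on its
# vehicle identifier (identifier values are assumed for illustration).

OWN_ID = "AV-0042"                        # e.g., identifier displayed on the vehicle
NEARBY_VEHICLES = {"AV-0077", "AV-0101"}  # peers within communication range

def route_command(command: dict) -> str:
    target = command.get("vehicle_id")
    if target == OWN_ID:
        return "process"   # determine vehicle action(s) to perform
    if target in NEARBY_VEHICLES:
        return "relay"     # forward to the addressed nearby vehicle
    return "ignore"

print(route_command({"vehicle_id": "AV-0042", "type": "stop"}))  # process
print(route_command({"vehicle_id": "AV-0077", "type": "stop"}))  # relay
print(route_command({"vehicle_id": "AV-9999", "type": "stop"}))  # ignore
```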
[0066] In some implementations, a plurality of autonomous vehicles can each receive the vehicle command(s) from a third-party entity. The plurality of autonomous vehicles can each determine one or more vehicle action(s) to perform in response to the vehicle command(s).
[0067] For example, a law enforcement entity (e.g., computing system associated with the law enforcement entity) can broadcast vehicle command(s) that are received by a first autonomous vehicle, a second autonomous vehicle, and a third autonomous vehicle. In response to receiving the vehicle command(s), the first autonomous vehicle, second autonomous vehicle, and third autonomous vehicle can each independently determine one or more vehicle action(s) to perform in response to the vehicle command(s).
[0068] According to aspects of the present disclosure, the service provider can control the autonomous vehicle to provide remote inspection information associated with the
autonomous vehicle to the one or more third-party entities when the autonomous vehicle is operating in one or more jurisdictions. As an example, a service provider can provide a vehicle service across a state boundary between two states that are each governed by a different state government with different laws and regulations for operating an autonomous vehicle, and policed by different law enforcement entities. As another example, a service provider can provide a vehicle service across a national boundary between two nations that are each governed by different government entities. As yet another example, a service provider can provide a vehicle service using a highway transportation infrastructure administered by a highway transportation administration, using a maritime transportation infrastructure administered by a port authority, using an air transportation infrastructure administered by an aviation administration, and/or within a shared environment administered by one or more of a national, state and local environmental protection entity.
[0069] As another example, a service provider can control an autonomous vehicle to autonomously determine remote inspection information, and control the autonomous vehicle to provide the remote inspection information to a third-party entity. In particular, the autonomous vehicle can determine the remote inspection information based on diagnostics information associated with the autonomous vehicle. The autonomous vehicle can generate the diagnostics information and/or obtain the diagnostics information from an external monitor. The remote inspection information can include, for example, a speed of the autonomous vehicle, a weight of a cargo item attached to the autonomous vehicle, the contents of a cargo item attached to the autonomous vehicle, an engine status, tire pressure, tire wear, readings of one or more sensors on-board the autonomous vehicle, an operating status of one or more sensors on-board the autonomous vehicle, a distance travelled by the autonomous vehicle, fuel consumption, emissions levels, a physical location of the autonomous vehicle, an indication of one or more faults detected by the autonomous vehicle, etc. The autonomous vehicle can provide, for example, remote inspection information including a speed of the autonomous vehicle to a law enforcement entity, remote inspection information including a weight of a cargo item attached to the autonomous vehicle to a tax assessment entity, etc. Additionally, and/or alternatively, an operations computing system associated with the service provider can mediate communication between the autonomous vehicle and one or more third-party entities.
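By way of a non-limiting illustration, the following sketch shows one way remote inspection information could be assembled from diagnostics information generated on-board and, optionally, diagnostics information obtained from an external monitor. The field names and schema are assumptions made for illustration only.

```python
# Illustrative sketch: derive remote inspection information from aggregated
# diagnostics information (field names and schema are assumed).
from typing import Optional

def build_inspection_info(onboard_diag: dict, monitor_diag: Optional[dict] = None) -> dict:
    diag = dict(onboard_diag)
    if monitor_diag:
        # External-monitor readings supplement or fill gaps in on-board data.
        diag.update(monitor_diag)
    return {
        "speed_mph": diag.get("speed_mph"),
        "cargo_weight_lb": diag.get("cargo_weight_lb"),
        "emissions_ppm": diag.get("emissions_ppm"),
        "detected_faults": diag.get("faults", []),
        "location": diag.get("gps_position"),
    }

info = build_inspection_info(
    {"speed_mph": 58.0, "faults": ["tire_pressure_sensor"]},
    {"cargo_weight_lb": 14500.0},  # e.g., from a weigh scale at a transfer hub
)
```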
[0070] As yet another example, a service provider can control an autonomous vehicle to travel to a vicinity of an external monitor that can determine remote inspection information associated with the autonomous vehicle. In particular, the external monitor can determine the remote inspection information based on diagnostics information associated with the autonomous vehicle. The external monitor can generate the diagnostics information and/or obtain the diagnostics information from the autonomous vehicle. The external monitor can provide the remote inspection information to a third-party entity. The remote inspection information can include, for example, a speed of the autonomous vehicle, a weight of a cargo item attached to the autonomous vehicle, a content of a cargo item attached to the
autonomous vehicle, tire wear, emissions levels, a physical location of the autonomous vehicle, physical damage to the autonomous vehicle, etc. The external monitor can provide, for example, remote inspection information including a weight of a cargo item attached to the autonomous vehicle to a tax assessment entity, and remote inspection information including emissions levels of the autonomous vehicle to an environmental protection entity.
[0071] In some implementations, the service provider can control an autonomous vehicle to autonomously determine remote inspection information. In particular, the autonomous vehicle can generate diagnostics information associated with an operation of the autonomous vehicle, and determine the remote inspection information based on the diagnostics information. The service provider can control the autonomous vehicle to autonomously generate such diagnostics information while the autonomous vehicle is in use. The diagnostics information can include information corresponding to one or more systems on board the autonomous vehicle and/or information corresponding to an environment in which the autonomous vehicle operates. For example, diagnostics information can include information on one or more faults detected with respect to one or more systems on-board an autonomous vehicle. As another example, diagnostics information can include sensor data obtained by one or more sensors on-board an autonomous vehicle. The sensor data can include information on one or more components of the autonomous vehicle and/or information on an environment in which the autonomous vehicle operates.
[0072] In some implementations, the service provider can control an autonomous vehicle to travel to a vicinity of an external monitor to determine remote inspection information associated with the autonomous vehicle. In particular, the service provider can control the autonomous vehicle to provide a vehicle service using a transportation network that includes one or more external monitors at one or more locations. The transportation network can include a plurality of transfer hubs and a plurality of transportation routes that link the plurality of transfer hubs with one another. The transportation network can utilize, for example, one or more of a highway transportation infrastructure, maritime transportation infrastructure, and air transportation infrastructure.
[0073] In some implementations, the transportation network can include one or more external monitors that can determine remote inspection information associated with an autonomous vehicle. In particular, the one or more external monitors can determine the remote inspection information based on diagnostics information associated with the autonomous vehicle, and can provide the remote inspection information to a third-party entity. The one or more external monitors can generate the diagnostics information and/or obtain the diagnostics information from the autonomous vehicle, when the autonomous vehicle is in a vicinity of the external monitor.
[0074] In some implementations, the plurality of transfer hubs can include one or more external monitors that can determine remote inspection information associated with an autonomous vehicle when the autonomous vehicle enters and/or exits the transfer hub. The one or more external monitors can include an autonomous inspector that can inspect the autonomous vehicle and/or a human inspector that can inspect the autonomous vehicle. For example, an external monitor can include a camera that visually inspects an autonomous vehicle to generate diagnostics information associated with the autonomous vehicle and determine remote inspection information based on the diagnostics information. As another example, an external monitor can include a weigh scale that can weigh a cargo item attached to an autonomous vehicle to generate diagnostics information associated with the
autonomous vehicle and determine remote inspection information based on the diagnostics information. As yet another example, an external monitor can include a human inspector that can perform an inspection of an autonomous vehicle to generate diagnostics information associated with the autonomous vehicle and determine remote inspection information based on the diagnostics information. As yet another example, an external monitor can include a wireless beacon configured to wirelessly communicate with an autonomous vehicle as it passes a location of the beacon to obtain diagnostics information from the autonomous vehicle.
[0075] In some implementations, the plurality of transportation routes can include one or more external monitors located at one or more locations along the transportation routes that can determine remote inspection information associated with an autonomous vehicle when the autonomous vehicle passes within a vicinity of the external monitor. For example, a transportation route can include a plurality of external monitors at periodic intervals along the transportation route. As another example, a transportation route can include an external monitor where the transportation route crosses a jurisdictional boundary. As yet another example, an external monitor can include a dedicated and/or physical connection to a communications network, and the transportation route can include an external monitor where wireless communication is unavailable or unreliable.
[0076] In some implementations, the service provider can control an autonomous vehicle to provide remote inspection information to a third-party entity. For example, a service provider can control an autonomous vehicle to autonomously provide remote inspection information to a third-party entity at one or more times when the autonomous vehicle is in use. As another example, a service provider can control an autonomous vehicle to travel to a vicinity of an external monitor when the autonomous vehicle enters and/or exits a transfer hub, to provide remote inspection information to a third-party entity. As yet another example, a service provider can control an autonomous vehicle to autonomously provide remote inspection information to an external monitor, when the autonomous vehicle is travelling on a transportation route.
[0077] In some implementations, an external monitor can communicate with one or more of a service provider, an autonomous vehicle, and a third-party entity. For example, an external monitor can communicate diagnostics information to a service provider. As another example, an external monitor can communicate diagnostics information to an autonomous vehicle that it inspected so that the autonomous vehicle can aggregate diagnostics information from one or more sources. The service provider can determine remote inspection information based on diagnostics information obtained from an external monitor and/or an autonomous vehicle, and provide the remote inspection information to a third-party entity. Additionally and/or alternatively, the service provider can control the autonomous vehicle to
autonomously determine remote inspection information based on aggregated diagnostics information, and provide the remote inspection information to a third-party entity.
[0078] In some implementations, an external monitor can be fixed at a particular geographic location. For example, an external monitor that includes a weigh scale can be fixed at a location in a transfer hub. In some implementations, an external monitor can be mobile. For example, an external monitor that includes a camera to visually inspect an autonomous vehicle can be affixed to another vehicle. A service provider can control a first autonomous vehicle affixed with an external monitor to travel to a vicinity of a second autonomous vehicle so that the external monitor can inspect the second autonomous vehicle and determine remote inspection information associated with the second autonomous vehicle.
[0079] In some implementations, a service provider can control a first autonomous vehicle affixed with an external monitor to travel to a vicinity of a second autonomous vehicle in response to a request by the second autonomous vehicle. For example, if a first autonomous vehicle detects a fault with a tire pressure sensor, then the first autonomous vehicle can request that a second autonomous vehicle affixed with an external monitor travel to a vicinity of the first autonomous vehicle to visually inspect one or more tires of the first autonomous vehicle. The second autonomous vehicle can communicate a status of the first autonomous vehicle’s tires to the first autonomous vehicle. [0080] In some implementations, one or more sensors onboard an autonomous vehicle (e.g., cameras, LIDAR, RADAR) can be used to inspect another vehicle (e.g., a second autonomous vehicle) and determine remote inspection information associated with the vehicle. For example, if a first autonomous vehicle detects a fault with respect to a tire pressure sensor and requests an external inspection, a service provider can control one or more other autonomous vehicles in a fleet of vehicles to travel to a vicinity of the first autonomous vehicle to visually inspect one or more tires of the first autonomous vehicle. The inspecting autonomous vehicle can communicate a status of the first autonomous vehicle’s tires to the first autonomous vehicle.
[0081] In some implementations, the remote inspection information can include a status of one or more categories pertaining to a third-party entity to which the remote inspection information is provided. For example, remote inspection information can include a numerical value as a status for a speed and/or a weight associated with an autonomous vehicle. As another example, remote inspection information can include an indication of “under speed limit”, “within a threshold value of the speed limit”, or “over speed limit” as a status for a speed. As yet another example, remote inspection information can include an indication of “unchanged” or “changed” as a status for a weight, relative to a starting weight associated with an autonomous vehicle.
[0082] In some implementations, a status of one or more categories can include “green”, “yellow”, and “red”, and the one or more categories can include a first category
corresponding to one or more components of an autonomous vehicle, a second category corresponding to a performance of an autonomous vehicle, and a third category
corresponding to a surrounding environment of an autonomous vehicle.
[0083] As an example, a category of one or more components of an autonomous vehicle can include one or more of a vehicle platform, vehicle computing system, one or more sensors, engine, tires, etc. Remote inspection information associated with the autonomous vehicle can include “green” for a status of the components if diagnostics information associated with the autonomous vehicle indicates that all the components of the autonomous vehicle are functioning properly and there are no detected faults. Remote inspection information associated with the autonomous vehicle can include “yellow” for a status of the components of the autonomous vehicle if vehicle diagnostics information associated with the autonomous vehicle indicates a problem or detected fault (e.g., engine temperature is high, tire pressure is low), but that a current operation of the autonomous vehicle can be completed safely. Remote inspection information associated with the autonomous vehicle can include “red” for a status of the components of the autonomous vehicle if vehicle diagnostics information associated with the autonomous vehicle indicates a critical problem or error that affects a safe operation of the autonomous vehicle.
[0084] As another example, a category of performance of an autonomous vehicle can include one or more of a speed, distance travelled, fuel consumption, weight, emissions, coolant level, brake wear, etc. Remote inspection information associated with the
autonomous vehicle can include “green” for a status of the performance of the autonomous vehicle if a speed, weight, and emissions are within an acceptable range for a jurisdiction in which the autonomous vehicle is operating. Remote inspection information associated with the autonomous vehicle can include “yellow” for a status of the performance of the autonomous vehicle if there is a spike in fuel consumption or emissions while the
autonomous vehicle is in operation. Remote inspection information associated with the autonomous vehicle can include “red” for a status of the performance of the autonomous vehicle if a coolant level drops below a critical level and brake wear exceeds a critical level.
[0085] As yet another example, a category of a surrounding environment of an
autonomous vehicle can include one or more of road conditions, weather conditions, traffic conditions, etc. Remote inspection information associated with the autonomous vehicle can include “green” for a status of the surrounding environment if the autonomous vehicle encounters good road conditions (e.g., well maintained, existence of safety lanes, etc.).
Remote inspection information associated with the autonomous vehicle can include “yellow” for a status of the surrounding environment if the autonomous vehicle encounters
construction zones, inclement weather affecting visibility or traction, or traffic congestion. Remote inspection information associated with the autonomous vehicle can include “red” for a status of the surrounding environment if the autonomous vehicle encounters a hazard, accident, or other event that renders a road segment impassable.
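By way of a non-limiting illustration, the following sketch shows one way diagnostics information could be mapped to a “green”, “yellow”, or “red” status for the component, performance, and surrounding-environment categories described above. The thresholds, field names, and limits are assumptions made for illustration only.

```python
# Illustrative sketch: classify each category as green/yellow/red from
# diagnostics information (thresholds and field names are assumed).

def component_status(diag: dict) -> str:
    if diag.get("critical_fault"):
        return "red"     # fault that affects safe operation
    if diag.get("faults"):
        return "yellow"  # e.g., high engine temperature, low tire pressure
    return "green"

def performance_status(diag: dict, limits: dict) -> str:
    if (diag["coolant_level"] < limits["coolant_min"]
            or diag["brake_wear"] > limits["brake_wear_max"]):
        return "red"
    if diag["emissions_ppm"] > limits["emissions_nominal"]:
        return "yellow"  # e.g., a spike in emissions while in operation
    return "green"

def environment_status(observations: dict) -> str:
    if observations.get("road_impassable"):
        return "red"     # hazard, accident, or other blocking event
    if observations.get("construction") or observations.get("inclement_weather"):
        return "yellow"
    return "green"
```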
[0086] In some implementations, the service provider can control an autonomous vehicle and/or an external monitor to provide remote inspection information to an appropriate third-party entity.
[0087] As an example, remote inspection information that includes a status of one or more components of an autonomous vehicle can be provided to a highway transportation administration, port authority, or aviation administration to ensure compliance with rules and/or regulations concerning operation of an autonomous vehicle over a highway
transportation infrastructure, maritime transportation infrastructure, or air transportation infrastructure, respectively. [0088] As another example, remote inspection information that includes a status of a location, speed, and transportation route of an autonomous vehicle can be provided to a law enforcement entity to assist the law enforcement entity in monitoring vehicular traffic in its jurisdiction. Additionally, or alternatively, remote inspection information that includes a status of a cargo attached to an autonomous vehicle (e.g., cargo weight) can be provided to a tax assessment entity when the autonomous vehicle crosses a boundary from one state to another. Additionally, or alternatively, remote inspection information that includes emissions information of an autonomous vehicle can be provided to an environmental protection entity to ensure compliance with emissions standards.
[0089] As yet another example, remote inspection information that includes a status of a surrounding environment of an autonomous vehicle can be provided to a highway
transportation administration to report segments of a road that are in need of repair and/or lack a safety lane. Additionally, or alternatively, remote inspection information that includes a status of a surrounding environment can be provided to a law enforcement entity to report an accident.
[0090] In some implementations, the service provider can control an autonomous vehicle and/or an external monitor to provide remote inspection information to a third-party entity at one or more times. For example, remote inspection information associated with an autonomous vehicle can be provided to a law enforcement entity each time the autonomous vehicle exits a transfer hub onto a transportation route. As another example, remote inspection information associated with an autonomous vehicle can be provided to an environmental protection entity at a predetermined time interval and/or if diagnostics information associated with the autonomous vehicle indicates an emissions spike. As yet another example, remote inspection information associated with an autonomous vehicle can be provided to a tax assessment entity each time the autonomous vehicle crosses into a different tax jurisdiction.
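By way of a non-limiting illustration, the following sketch shows one way reporting events could be mapped to the third-party entities that receive remote inspection information. The event names, entities, and reporting interval are assumptions made for illustration only.

```python
# Illustrative sketch: decide which third-party entities receive remote
# inspection information for a given event (names/values are assumed).

REPORT_INTERVAL_S = 3600  # e.g., periodic reports to an environmental entity

def reporting_targets(event: dict) -> list:
    targets = []
    if event.get("exited_transfer_hub"):
        targets.append("law_enforcement")
    if event.get("crossed_tax_jurisdiction"):
        targets.append("tax_assessor")
    if event.get("emissions_spike") or event.get("elapsed_s", 0) >= REPORT_INTERVAL_S:
        targets.append("environmental_protection")
    return targets

print(reporting_targets({"crossed_tax_jurisdiction": True}))  # ['tax_assessor']
```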
[0091] According to aspects of the present disclosure, a plurality of vehicles in the fleet can operate as a convoy, such that the plurality of vehicles in the convoy travel together as a group.
[0092] In some implementations, the convoy can include a lead vehicle and one or more follower vehicle(s). The lead vehicle can be configured to operate ahead of the follower vehicle(s), and the follower vehicle(s) can be configured to follow behind the lead vehicle.
[0093] As an example, the follower vehicle(s) can be configured to follow a preceding vehicle in the convoy. The follower vehicle(s) can include a first follower vehicle, second follower vehicle, third follower vehicle, and fourth follower vehicle. The first follower vehicle can be configured to follow behind the lead vehicle, the second follower vehicle configured to follow behind the first follower vehicle, the third follower vehicle configured to follow behind the second follower vehicle, and the fourth follower vehicle can be configured to follow behind the third follower vehicle.
[0094] As another example, the follower vehicle(s) can be configured to follow at a predetermined distance. The predetermined distance can be static or dynamic, and can be based on, for example, whether the plurality of vehicles are operating on a highway or on local roads, traffic conditions (e.g., volume of traffic, speed of traffic, etc.), road conditions (e.g., road incline/decline, speed limit, construction zones, etc.), a communication range (e.g., so that the plurality of autonomous vehicles can communicate with each other), weather conditions (e.g., that affect visibility, vehicle traction, stopping distance, etc.), etc.
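By way of a non-limiting illustration, the following sketch shows one way a dynamic follow distance could be computed from the factors listed above. The baseline time gaps, multipliers, and the communication-range cap are assumptions made for illustration only and are not disclosed values.

```python
# Illustrative sketch: compute a dynamic follow distance from road type,
# weather, speed, and communication range (all constants are assumed).

def follow_distance_m(speed_mps: float, on_highway: bool,
                      wet_road: bool, comm_range_m: float) -> float:
    base = speed_mps * (2.0 if on_highway else 1.5)  # simple time-gap heuristic
    if wet_road:
        base *= 1.5  # allow for a longer stopping distance
    # Do not exceed the range at which convoy vehicles can still communicate.
    return min(base, comm_range_m)

print(follow_distance_m(speed_mps=29.0, on_highway=True,
                        wet_road=True, comm_range_m=120.0))  # 87.0
```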
[0095] In some implementations, the plurality of vehicles in the convoy can operate independently, and/or communicate with each other to coordinate one or more vehicle action(s) to perform in response to vehicle command(s) from a third-party entity.
[0096] As an example, one or more autonomous vehicle(s) in the convoy can operate in an autonomous mode. Each of the autonomous vehicle(s) can obtain sensor data from sensors onboard the vehicle, attempt to comprehend the environment proximate to the vehicle by performing processing techniques on the sensor data, and generate an appropriate motion plan through the environment. In this way, each of the autonomous vehicle(s) can identify one or more obstacle(s) in the environment proximate to the vehicle, and generate a motion plan to avoid the obstacle(s). The motion plan can include, for example, avoidance maneuvers, stopping actions, etc. Additionally, or alternatively, the autonomous vehicle(s) can communicate with each other to share information, such as, for example, one or more obstacle(s) identified in the environment, avoidance maneuvers, stopping actions, or other information. The autonomous vehicle(s) can generate an appropriate motion plan based in part on the shared information. In this way, an autonomous vehicle in the convoy can know in advance about a location of an obstacle or a trajectory of another autonomous vehicle, so that the autonomous vehicle can generate a motion plan to avoid the obstacle and/or other vehicle.
[0097] As another example, in response to a stop command, an autonomous vehicle in the convoy can determine vehicle actions to perform that include: removal from the convoy, and a stopping action. The autonomous vehicle can communicate with other vehicles in the convoy to send data indicative of the stop command and/or the determined vehicle actions, and then perform the stopping action. The other vehicles in the convoy can receive the data and adjust their motion plan to maneuver past the autonomous vehicle as it performs the stopping action.
[0098] As another example, in response to a stop command, an autonomous vehicle in the convoy can determine vehicle actions to perform that include: coordinating with other vehicles in the convoy to stop as a group. The autonomous vehicle can communicate with the other vehicles in the convoy to determine coordinated stopping actions for the vehicles in the convoy. Each vehicle in the convoy can perform a respective coordinated stopping action so that the convoy can stop as a group.
[0099] In some implementations, the lead vehicle in the convoy can include a human operator. The human operator can manually control the lead vehicle (e.g., via a human-machine interface), and send one or more vehicle command(s) to the follower vehicle(s) to manage/control the convoy. The vehicle command(s) can include, for example, instructions for a follower vehicle to start, stop, slow down, speed up, increase/decrease a follow distance, follow a specific/different vehicle in the convoy, respond to or ignore a vehicle command sent by a third-party entity, etc. The vehicle command(s) can also include instructions to rearrange the convoy by adding or removing a vehicle from the convoy, as will be described further below.
[0100] As an example, the lead vehicle can be an autonomous vehicle operating in a manual mode, and a human operator can be a driver who can manually drive the lead vehicle. The human operator can also communicate (e.g., receive data, send vehicle command(s), etc.) with the follower vehicle(s) via the lead vehicle.
[0101] As another example, a human operator can identify one or more obstacle(s) in an environment. The obstacle(s) can include, for example, traffic conditions (e.g., volume of traffic, speed of traffic, etc.), road conditions (e.g., blocked lanes, incline/decline, sharp turns, speed limit, construction zones, etc.), weather conditions (e.g., that affect visibility, vehicle traction, stopping distance, etc.), etc. The human operator can manage/control the convoy by sending one or more vehicle command(s) to assist the follower vehicle(s) in navigating the environment including the obstacle(s).
[0102] As another example, a human operator in the lead vehicle of a stopped convoy can control the convoy to start/resume travelling in an environment. The human operator can determine when and how the convoy should start/resume based on one or more obstacle(s) in the environment (e.g., traffic conditions, road conditions, weather conditions, etc.) at one or more times. When the human operator decides to start/resume, the human operator can send vehicle command(s) to the follower vehicle(s) to perform coordinated starting actions and start/resume travelling in the environment.
[0103] As another example, a human operator in the lead vehicle of a convoy can determine that the convoy is approaching (or has entered) a road segment including a construction zone. The construction zone can be associated with a lower speed limit than a normal speed limit for the road segment. Upon approaching (or having entered) the construction zone, the human operator can send vehicle command(s) to instruct the follower vehicle(s) to reduce speed. Upon leaving the construction zone, the human operator can send vehicle command(s) to instruct the follower vehicle(s) to increase speed.
[0104] As another example, a human operator in the lead vehicle of a convoy can determine that an upcoming road segment along the convoy’s route includes a steep decline segment. The human operator can send vehicle command(s) instructing the follower vehicle(s) to reduce speed and increase a follow distance while traversing the decline segment. After traversing the decline segment, the human operator can send vehicle command(s) instructing the follower vehicle(s) to increase speed and decrease a follow distance.
[0105] In some implementations, the lead vehicle can communicate with the follower vehicle(s) to obtain status information from the follower vehicle(s). Each of the follower vehicle(s) can generate status information data associated with the vehicle, and send the status information data to the lead vehicle. The status information data can include, for example, a vehicle location, vehicle health, vehicle diagnostics, raw sensor data, audio-visual data, a vehicle command sent by a third-party entity, etc., associated with one or more of the follower vehicle(s). A human operator in the lead vehicle can manage/control the convoy based on the received data.
[0106] As an example, each of the follower vehicle(s) in a convoy can periodically generate data indicative of the vehicle’s location, and send the data to the lead vehicle. If a human operator in the lead vehicle determines that a follower vehicle is travelling too slow based on the follower vehicle’s location, then the human operator can send vehicle command(s) instructing the follower vehicle to increase speed. If the human operator in the lead vehicle determines that a follower vehicle is travelling too fast, based on the follower vehicle’s location, then the human operator can send vehicle command(s) instructing the follower vehicle to reduce speed.
[0107] As another example, a follower vehicle in a convoy can encounter an obstacle or an unfamiliar environment that the follower vehicle is unable to navigate. The follower vehicle can generate data indicative of the obstacle or unfamiliar environment, and send the data to the lead vehicle in the convoy. A human operator in the lead vehicle can instruct the follower vehicle to send audio-visual data from one or more cameras onboard the vehicle, and manually control the follower vehicle to navigate past the obstacle or unfamiliar environment.
[0108] As another example, a follower vehicle in a convoy can receive vehicle command(s) from a third-party entity selecting the vehicle and instructing it to stop. The vehicle command(s) can also include a reason for the stop, such as, for example, because the third-party entity believes that a tire pressure of the selected follower vehicle is too low. The selected follower vehicle can send data indicative of the vehicle command(s) sent by the third-party entity to the lead vehicle. A human operator in the lead vehicle can check the reason for the stop command based on vehicle diagnostics data received from the selected follower vehicle. The vehicle diagnostics data can include, for example, a tire pressure of the vehicle, as measured by one or more sensor(s) onboard the vehicle. Alternatively, the human operator can send vehicle command(s) to the selected follower vehicle for the vehicle diagnostics data, and the selected follower vehicle can send the data in response to the vehicle command(s). If the human operator determines that the tire pressure is normal, then the human operator can send vehicle command(s) instructing the selected follower vehicle to ignore the stop command sent by the third-party entity. If the human operator determines that the tire pressure is not normal, then the human operator can send vehicle command(s) instructing all vehicles in the convoy to stop as a group, so that the human operator can inspect the selected follower vehicle. Alternatively, if the human operator determines that the tire pressure is not normal, then the human operator can send vehicle command(s) to remove the selected follower vehicle from the convoy, so that the selected follower vehicle can come to a stop independently of the convoy, and the convoy can continue without the selected follower vehicle.
[0109] As another example, a selected follower vehicle in a convoy can receive vehicle command(s) from a third-party entity instructing the vehicle to stop because the third-party entity would like to inspect the vehicle. The vehicle command(s) can also include a request for a human to be present at the inspection. The selected follower vehicle can send data indicative of the vehicle command(s) from the third-party entity to the lead vehicle. In response, a human operator in the lead vehicle can send vehicle command(s) instructing all vehicles in the convoy to stop as a group, so that the human operator can be present for the inspection by the third-party entity. [0110] In some implementations, one or more vehicles in a convoy can be removed from the convoy, and the convoy can be rearranged to continue without the one or more vehicles.
[0111] As an example, the convoy can include a first vehicle that is configured as a lead vehicle, a second vehicle configured to follow the first vehicle, a third vehicle configured to follow the second vehicle, and a fourth vehicle configured to follow the third vehicle. If the first vehicle in the convoy receives a vehicle command to stop, then the first vehicle can send vehicle commands to stop the convoy. If the second vehicle in the convoy receives a vehicle command to stop from a third-party entity, then the second vehicle can be removed from the convoy, and the third vehicle can be configured to follow the first vehicle. Additionally, the first vehicle can slow down and/or the third and fourth vehicles can speed up to maintain a predetermined distance between the vehicles in the convoy. If the third vehicle in the convoy receives a vehicle command to stop, then the third vehicle can be removed from the convoy, and the fourth vehicle can be configured to follow the second vehicle. Additionally, the first and second vehicles can slow down and/or the fourth vehicle can speed up to maintain a predetermined distance between the vehicles in the convoy.
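By way of a non-limiting illustration, the following sketch represents a convoy as an ordered list of vehicle identifiers, so that removing a vehicle re-links the vehicle behind it to the vehicle ahead of it, and adding a vehicle (as described further below) inserts it behind the vehicle it is instructed to follow. The representation and identifiers are assumptions made for illustration only.

```python
# Illustrative sketch: rearrange a convoy, kept as an ordered list of
# vehicle identifiers, when a vehicle is removed or added.

def remove_from_convoy(convoy: list, vehicle_id: str) -> list:
    # Removing "v2" from ["lead", "v2", "v3", "v4"] leaves "v3" following
    # "lead" and "v4" following "v3".
    return [v for v in convoy if v != vehicle_id]

def add_to_convoy(convoy: list, vehicle_id: str, follow: str) -> list:
    # Insert the new vehicle directly behind the vehicle it is told to follow.
    idx = convoy.index(follow)
    return convoy[:idx + 1] + [vehicle_id] + convoy[idx + 1:]

convoy = ["lead", "v2", "v3", "v4"]
convoy = remove_from_convoy(convoy, "v2")   # ['lead', 'v3', 'v4']
convoy = add_to_convoy(convoy, "v5", "v4")  # ['lead', 'v3', 'v4', 'v5']
```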
[0112] As another example, in response to vehicle command(s) to stop, a selected autonomous vehicle in the convoy can determine vehicle actions to perform that include: removal from the convoy, and a stopping action. The selected autonomous vehicle can communicate with other vehicles in the convoy to send data indicative of the determined vehicle actions. In response, the convoy can be rearranged to continue without the selected autonomous vehicle, and the rearranged convoy can maneuver past the selected autonomous vehicle as it performs the stopping action.
[0113] As another example, in response to vehicle command(s) to stop, a selected autonomous vehicle in the convoy can determine vehicle actions to perform that include: notifying a lead vehicle in the convoy. The selected autonomous vehicle can communicate with the lead vehicle to send data indicative of the vehicle command(s) to stop, and wait for a decision from the lead vehicle. If a human operator in the lead vehicle decides to remove the selected autonomous vehicle from the convoy, then the human operator can send vehicle command(s) to remove the selected autonomous vehicle from the convoy and rearrange the convoy to continue without the selected autonomous vehicle. Alternatively, if the selected autonomous vehicle is unable to send the data and/or receive vehicle command(s) (e.g., because of a communications fault or error), or if the wait time exceeds a threshold value, then the selected autonomous vehicle can automatically perform the stopping action. [0114] In some implementations, one or more vehicles can be added to a convoy, and the convoy can be rearranged to incorporate the one or more vehicles.
[0115] As an example, the convoy can include a first vehicle that is configured as a lead vehicle, a second vehicle configured to follow the first vehicle, a third vehicle configured to follow the second vehicle, and a fourth vehicle configured to follow the third vehicle. A fifth vehicle that is not in the convoy can attempt to join the convoy by sending a request to the lead vehicle. If a human operator in the lead vehicle decides to add the fifth vehicle to the convoy, then the human operator can send vehicle command(s) to add the fifth vehicle to the convoy. In particular, the human operator can send vehicle command(s) to the fifth vehicle to follow the fourth vehicle. Alternatively, the human operator can send vehicle command(s) to the fifth vehicle to follow the third vehicle, and send vehicle command(s) to the fourth vehicle to follow the fifth vehicle. Alternatively, the human operator can send vehicle command(s) to the fifth vehicle to follow the second vehicle, and send vehicle command(s) to the third vehicle to follow the fifth vehicle. Alternatively, the human operator can send vehicle command(s) to the fifth vehicle to follow the first vehicle, and send vehicle command(s) to the second vehicle to follow the fifth vehicle.
[0116] As another example, a selected autonomous vehicle in a first convoy can receive vehicle command(s) to stop from a third-party entity. In response, the selected autonomous vehicle can be removed from the first convoy so that the selected autonomous vehicle can come to a stop and the first convoy can continue without the selected autonomous vehicle. Once the reason for the stop is resolved (e.g., an inspection is completed, one or more tires are inflated, etc.), the selected autonomous vehicle can attempt to rejoin the first convoy if it can safely catch up to the first convoy. The selected autonomous vehicle can send a request to join/rejoin the first convoy to the lead vehicle. A human operator in the lead vehicle can decide whether to add the selected autonomous vehicle to the first convoy. If the human operator decides to add the selected autonomous vehicle, then the human operator can send vehicle command(s) instructing the selected autonomous vehicle to follow a vehicle in the first convoy.
[0117] As another example, a selected autonomous vehicle in a first convoy can receive vehicle command(s) to stop from a third-party entity. In response, the selected autonomous vehicle can be removed from the first convoy so that the selected autonomous vehicle can come to a stop and the first convoy can continue without the selected autonomous vehicle. Once a reason for the stop is resolved, the selected autonomous vehicle can attempt to join a second convoy. The selected autonomous vehicle can send a request to join the second convoy to the lead vehicle in the second convoy. A human operator in the lead vehicle can decide whether to add the selected autonomous vehicle to the second convoy. If the human operator decides to add the selected autonomous vehicle, then the human operator can send vehicle command(s) instructing the selected autonomous vehicle to follow a vehicle in the second convoy.
[0118] The systems and methods described herein may provide a number of technical effects and benefits. For instance, instead of stopping at weigh stations intermittently along a route, inspections and weigh-ins can be performed during transfer at a transfer hub and/or in real-time as an autonomous vehicle is travelling from a first transfer hub to a second transfer hub. By utilizing diagnostics sensors and communications tools on-board an autonomous vehicle, as well as external monitors, diagnostics information associated with the autonomous vehicle can be generated and used to determine remote inspection information that can be provided to a remote third-party enforcement entity. Moreover, by generating an inspection report using up-to-date diagnostics information, the third-party entity can be provided with more accurate inspection information associated with an autonomous vehicle.
[0119] The systems and methods described herein may also provide resulting
improvements to computing technology tasked with providing a vehicle service and/or managing a fleet of vehicles to provide a vehicle service. For example, the systems and methods described herein may provide improvements in a utilization of the fleet of vehicles for providing the vehicle service, resulting in greater throughput and reduced energy expenditure by avoiding intermittent stops along a route.
[0120] Reference now will be made in detail to embodiments, one or more example(s) of which are illustrated in the drawings. Each example is provided by way of explanation of the embodiments, not limitation of the present disclosure. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made to the embodiments without departing from the scope or spirit of the present disclosure. For instance, features illustrated or described as part of one embodiment can be used with another embodiment to yield a still further embodiment. Thus, it is intended that aspects of the present disclosure cover such modifications and variations.
[0121] With reference now to the FIGS., example embodiments of the present disclosure will be discussed in further detail. FIG. 1 depicts an example system 100 according to example embodiments of the present disclosure. The system 100 can include a vehicle computing system 102 associated with a vehicle 104. The system 100 can also include one or more vehicle(s) 105, each including a respective vehicle computing system (not shown). [0122] In some implementations, the system 100 can include one or more remote computing system(s) 103 that are remote from the vehicle 104 and the vehicle(s) 105. The remote computing system(s) 103 can include an operations computing system 120, one or more client computing system(s) 122, one or more third-party computing system(s) 124, and one or more external monitor computing system(s) 126. The remote computing system(s)
103 can be separate from one another or share computing device(s).
[0123] In some implementations, the vehicle 104 can be part of a fleet of vehicles operated by the operations computing system 120. The fleet of vehicles can also include the vehicle(s) 105.
[0124] The operations computing system 120 can manage or operate the vehicle 104 via the vehicle computing system 102, and manage or operate the vehicle(s) 105 via the respective vehicle computing system for each vehicle. The operations computing system 120 can obtain data indicative of a vehicle service request from a client, for example, via the client computing system 122. The operations computing system 120 can select the vehicle
104 (or one of the vehicle(s) 105) to provide the vehicle service for the client. In some implementations, the operations computing system 120 can control the vehicle 104 to provide remote inspection information to the one or more third-party computing system(s) 124.
[0125] The vehicle 104, incorporating the vehicle computing system 102, and the vehicle(s) 105 can each be a ground-based autonomous vehicle (e.g., car, truck, bus), an air-based autonomous vehicle (e.g., airplane, drone, helicopter, or other aircraft), or other type of vehicle (e.g., watercraft). The vehicle 104, and vehicle(s) 105, can be an autonomous vehicle that can drive, navigate, operate, etc. with minimal and/or no interaction from a human driver.
[0126] The vehicle computing system 102 can include one or more computing device(s) located on-board the vehicle 104 (e.g., located on and/or within the vehicle 104). The computing device(s) can include various components for performing various operations and functions. For instance, the computing device(s) can include one or more processor(s) and one or more tangible, non-transitory, computer readable media. The one or more tangible, non-transitory, computer readable media can store instructions that when executed by the one or more processor(s) cause the vehicle 104 (e.g., its computing system, one or more processors, etc.) to perform operations and functions, such as those described herein.
[0127] In some implementations, the vehicle computing system 102 can include a Vehicle API client that can enable bidirectional communication with a remote computing system 103 (e.g., operations computing system 120, client computing system(s) 122, third-party computing system(s) 124) and/or a vehicle computing system onboard each of the vehicle(s) 105 through a Vehicle API Platform operating on the remote computing system 103 and/or the vehicle computing system onboard each of the vehicle(s) 105. For example, the Vehicle API Platform and the Vehicle API client can provide for establishing
communication tunnels between the vehicle computing system 102 and the remote computing system 103 and/or the vehicle computing system onboard each of the vehicle(s) 105. In some implementations, the Vehicle API client can provide for communicating data using intelligent quality of service (QoS), multiplexing data over different communication streams, prioritizing and/or de-prioritizing data traffic dynamically, for example, based on link conditions and/or the like.
[0128] As shown in FIG. 1, the vehicle 104 can include one or more sensors 108, an autonomy computing system 110, a vehicle control system 112, a communications system 114, and a memory system 116. One or more of these systems can be configured to communicate with one another via a communication channel. The communication channel can include one or more data buses (e.g., controller area network (CAN)), on-board diagnostics connector (e.g., OBD-II), and/or a combination of wired and/or wireless communication links. The on-board systems can send and/or receive data, messages, signals, etc. amongst one another via the communication channel.
[0129] The sensor(s) 108 can be configured to acquire sensor data 109 associated with one or more objects that are proximate to the vehicle 104 (e.g., within a field of view of one or more of the sensor(s) 108). The sensor(s) 108 can include a Light Detection and Ranging (LIDAR) system, a Radio Detection and Ranging (RADAR) system, one or more cameras (e.g., visible spectrum cameras, infrared cameras, etc.), motion sensors, and/or other types of imaging capture devices and/or sensors. The sensor data 109 can include image data, radar data, LIDAR data, and/or other data acquired by the sensor(s) 108. The object(s) can include, for example, pedestrians, vehicles, bicycles, and/or other objects. The object(s) can be located in front of, to the rear of, and/or to the side of the vehicle 104. The sensor data 109 can be indicative of locations associated with the object(s) within the surrounding
environment of the vehicle 104 at one or more times. The sensor(s) 108 can provide the sensor data 109 to the autonomy computing system 110.
[0130] As shown in FIG. 2, the autonomy computing system 110 can include a perception system 202, a prediction system 204, a motion planning system 206, and/or other systems that cooperate to perceive the surrounding environment of the vehicle 104 and determine a motion plan for controlling the motion of the vehicle 104 accordingly. For example, the autonomy computing system 110 can receive the sensor data 109 from the sensor(s) 108, attempt to comprehend the surrounding environment by performing various processing techniques on the sensor data 109 (and/or other data), and generate an appropriate motion plan through such surrounding environment. The autonomy computing system 110 can control the one or more vehicle control systems 112 to operate the vehicle 104 according to the motion plan.
[0131] The autonomy computing system 110 can identify one or more objects that are proximate to the vehicle 104 based at least in part on the sensor data 109 and/or the map data 260. For instance, the perception system 202 can perform various processing techniques on the sensor data 109 to determine perception data 262 that is descriptive of a current state of one or more object(s) that are proximate to the vehicle 104. The prediction system 204 can create prediction data 264 associated with each of the respective one or more object(s) proximate to the vehicle 104. The prediction data 264 can be indicative of one or more predicted future locations of each respective object. The motion planning system 206 can determine a motion plan for the vehicle 104 based at least in part on the prediction data 264 (and/or other data), and save the motion plan as motion plan data 266. The motion plan data 266 can include vehicle actions with respect to the object(s) proximate to the vehicle 104 as well as the predicted movements. The motion plan data 266 can include a planned trajectory, speed, acceleration, etc. of the vehicle 104.
[0132] The motion planning system 206 can provide at least a portion of the motion plan data 266 that indicates one or more vehicle actions, a planned trajectory, and/or other operating parameters to the vehicle control system 112 to implement the motion plan for the vehicle 104. For instance, the vehicle 104 can include a mobility controller configured to translate the motion plan data 266 into instructions. By way of example, the mobility controller can translate the motion plan data 266 into instructions to adjust the steering of the vehicle 104 by "X" degrees, apply a certain magnitude of braking force, etc. The mobility controller can send one or more control signals to the responsible vehicle control sub-system (e.g., powertrain control system 220, steering control system 222, braking control system 224) to execute the instructions and implement the motion plan.
[0133] The communications system 114 can allow the vehicle computing system 102 (and its computing system(s)) to communicate with one or more other computing systems (e.g., remote computing system(s) 103, additional vehicle(s) 105). The vehicle computing system 102 can use the communications system 114 to communicate with one or more remote computing system(s) 103 (e.g., operations computing system 120, third-party computing system(s) 124, external monitor computing system 126) or a vehicle computing system onboard each of the vehicle(s) 105 over one or more networks (e.g., via one or more wireless signal connections). In some implementations, the vehicle computing system 102 can communicate with the operations computing system 120 over one or more wide-area networks (e.g., satellite network, cellular network, etc.) that use a relatively low-frequency spectrum and/or that are associated with relatively long-range communications. In some implementations, the vehicle computing system 102 can communicate with the third-party computing system(s) 124 over one or more local-area networks (e.g., WiFi networks, infrared or laser based communication networks, ad-hoc mesh networks, etc.) that use a relatively high-frequency spectrum and/or are associated with relatively short-range communications. In some implementations, the communications system 114 can allow communication among one or more of the system(s) on-board the vehicle 104. The communications system 114 can include any suitable sub-systems for interfacing with one or more network(s) including, for example, transmitters, receivers, ports, controllers, antennas, and/or other suitable sub-systems that can help facilitate communication.
[0134] The memory system 116 of the vehicle 104 can include one or more memory devices located at the same or different locations (e.g., on-board the vehicle 104, distributed throughout the vehicle 104, off-board the vehicle 104, etc.). The vehicle computing system 102 can use the memory system 116 to store and retrieve data/information. For instance, the memory system 116 can store map data 260, perception data 262, prediction data 264, motion plan data 266, third-party identification data 270, vehicle identification data 272, status information data 273, diagnostics data 274, and remote inspection data 276.
[0135] The map data 260 can include information regarding: an identity and location of different roadways, road segments, buildings, or other items or objects (e.g., lampposts, crosswalks, curbing, etc.); a location and direction of traffic lanes (e.g., the location and direction of a parking lane, a turning lane, a bicycle lane, or other lanes within a particular roadway or other travel way and/or one or more boundary markings associated therewith); and/or any other data that assists the vehicle computing system 102 in comprehending and perceiving its surrounding environment and its relationship thereto.
[0136] The third-party identification data 270 can include information associated with one or more third-party entities. As an example, the third-party identification data 270 can include authentication information used to authenticate a third-party entity. As another example, the third-party identification data 270 can include one or more predetermined keys that have been previously shared between the vehicle computing system 102 and the third-party computing system(s) 124. The third-party identification data 270 can include information indicating which predetermined key is shared with which third-party computing system 124.
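As a non-limiting illustration only, the third-party identification data 270 could be organized as a lookup table keyed by third-party computing system. The record fields and method names below are assumptions made for the sketch, not part of the disclosure:

    from dataclasses import dataclass
    from typing import Dict, Optional

    @dataclass
    class ThirdPartyRecord:
        entity_name: str          # e.g., a highway authority (illustrative)
        shared_key: bytes         # predetermined key previously exchanged
        authorized_commands: set  # command types this entity may issue (illustrative)

    class ThirdPartyIdentificationData:
        """Hypothetical registry mapping third-party computing systems to shared keys."""

        def __init__(self):
            self._records: Dict[str, ThirdPartyRecord] = {}

        def register(self, system_id: str, record: ThirdPartyRecord) -> None:
            self._records[system_id] = record

        def key_for(self, system_id: str) -> Optional[bytes]:
            # Return the predetermined key shared with the given third-party system, if any.
            record = self._records.get(system_id)
            return record.shared_key if record else None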
[0137] The vehicle identification data 272 can include information indicative of a vehicle identifier that corresponds to the vehicle 104. As an example, the vehicle identification data 272 can include an identification code corresponding to the vehicle 104 that is painted on the outside of the vehicle 104. As another example, the vehicle identification data 272 can include an algorithm to generate an identification code corresponding to the vehicle 104.
Each generated identification code can be valid for a limited time, and a new identification code can be generated to replace an outdated identification code. Additionally, the vehicle 104 can include a display board that displays a valid identification code at any given time.
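The disclosure does not specify how a time-limited identification code is generated. One possible scheme, shown purely for illustration, derives a short code from an HMAC over the current time bucket, so that a party holding the same secret can re-derive and verify the code shown on the display board; the secret, validity window, and code length below are arbitrary assumptions:

    import hashlib
    import hmac
    import time

    def current_identification_code(vehicle_secret: bytes, validity_seconds: int = 300) -> str:
        """Generate a short-lived identification code for an external display board.

        Illustrative only: an HMAC over the current time bucket is one of many
        possible algorithms; the disclosure does not name one.
        """
        time_bucket = int(time.time()) // validity_seconds
        digest = hmac.new(vehicle_secret, str(time_bucket).encode(), hashlib.sha256).hexdigest()
        return digest[:8].upper()  # short code suitable for an external display

    # Example: the code changes every validity window and can be re-derived by a
    # verifier that holds the same secret.
    print(current_identification_code(b"example-vehicle-secret"))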
[0138] The status information data 273 can include status information associated with the vehicle 104. The status information data 273 can be generated by the vehicle computing system 102 periodically, or in response to vehicle command(s) from a third-party entity. The status information can include a status associated with one or more component(s) of the vehicle 104, and/or an overall health/status associated with the vehicle 104.
[0139] The vehicle computing system 102 can autonomously generate diagnostics information corresponding to one or more systems on-board the vehicle 104 and/or information corresponding to an environment in which the vehicle 104 operates.
Additionally, or alternatively, the vehicle computing system 102 can obtain diagnostics information associated with the vehicle 104 from the one or more external monitor computing system(s) 126. The vehicle computing system 102 can store the diagnostics information as the diagnostics data 274.
[0140] The vehicle computing system 102 can autonomously determine remote inspection information based on the diagnostics data 274, and store the remote inspection information as remote inspection data 276. The vehicle computing system 102 can provide the remote inspection information to one or more remote computing system(s) 103 (e.g., operations computing system 120, third-party computing system(s) 124, external monitor computing system(s) 126) at one or more times.
[0141] As illustrated in FIG. 3, the vehicle-operations system interface 300 can include a Vehicle API 304 associated with a remote computing system 301 (e.g., remote computing system(s) 103, vehicle computing system onboard each of the vehicle(s) 105, etc.). The Vehicle API 304 can provide for a translation/transport layer as an interface between vehicle computing systems onboard vehicles within an entity’s fleet (e.g., vehicle 104, additional vehicle(s) 105) and one or more remote clients and/or applications operating within the remote computing system 301.
[0142] The Vehicle API 304 can include an offboard gateway 306 which can provide for establishing one or more communication channels between the Vehicle API 304 and a vehicle, such as vehicle 104 (e.g., via vehicle computing system 102, etc.). The offboard gateway 306 can establish multiplexing connections between the vehicle 104 and the Vehicle API 304 that can be used to send arbitrary communications through the same connections.
[0143] In some implementations, the Vehicle API 304, through offboard gateway 306, can provide for establishing multiple hypertext transfer protocol (or other suitable protocol) connections, for example, using HTTP/2, between a Vehicle API relay/client 308 and the offboard gateway 306, allowing the ability to parallelize and assert traffic priority within a connection. In some implementations, the offboard gateway 306 of Vehicle API 304 can establish at least two hypertext transfer protocol (or other suitable protocol) connections, such as HTTP/2 connections, to the operations computing system from a vehicle, where at least one connection can be dedicated to high reliability, high deliverability traffic and at least one connection can be dedicated to best-effort, unguaranteed traffic. In some implementations, the use of multiple connections can allow for the underlying transport to be controlled in terms of different connections having different weights such that data can be identified as more important.
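As a non-limiting illustration of the dual-connection arrangement described above (one high-reliability connection, one best-effort connection), the routing decision can be sketched as a simple message router. The channel abstraction below is a hypothetical stand-in for whatever HTTP/2 client the relay actually uses, and the field names are assumptions:

    from dataclasses import dataclass, field
    from queue import Queue

    @dataclass
    class Message:
        payload: bytes
        guaranteed: bool = False   # True -> must be delivered on the reliable connection

    @dataclass
    class ConnectionPair:
        """Stand-in for two connections from the relay/client to the offboard gateway."""
        reliable: Queue = field(default_factory=Queue)     # high-reliability, high-deliverability traffic
        best_effort: Queue = field(default_factory=Queue)  # best-effort, unguaranteed traffic

        def send(self, message: Message) -> None:
            # Route each message onto the connection matching its delivery class.
            if message.guaranteed:
                self.reliable.put(message)
            else:
                self.best_effort.put(message)

    connections = ConnectionPair()
    connections.send(Message(b"vehicle state snapshot", guaranteed=True))
    connections.send(Message(b"low-resolution perception labels"))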
[0144] The vehicle 104 can include a Vehicle API relay/client 308, for example, associated with a vehicle computing system 102, which can provide for establishing the one or more communication channels between the offboard gateway 306 of the Vehicle API 304 and the vehicle 104. In some implementations, the Vehicle API relay/client 308 onboard the vehicle 104 can provide for communicating data using intelligent quality of service (QoS), multiplexing data over different communication streams, prioritizing and/or de-prioritizing data traffic dynamically, for example, based on link conditions and/or the like. In some implementations, the Vehicle API relay/client 308 can provide for making determinations about what data it thinks is more important and handling the communication of that data as appropriate.
[0145] In some implementations, the Vehicle API 304, through offboard gateway 306 and Vehicle API relay/client 308, can provide for communicating onboard data traffic 310 (e.g., telemetry, vehicle state information, etc.) from the vehicle 104 to the remote computing system 301. For example, the offboard gateway 306 can receive the onboard data traffic 310 from the Vehicle API relay/client 308 and the Vehicle API 304 can provide for handling the onboard data traffic 310 and providing the onboard data traffic 310 to one or more clients and/or applications associated with the remote computing system 301 in client messages 314.
[0146] In some implementations, the Vehicle API 304, through offboard gateway 306 and Vehicle API relay/client 308, can provide for communicating authenticated vehicle messages 312 from the remote computing system 301 to the vehicle 104 (e.g., to vehicle computing system 102, etc.). For example, the offboard gateway 306 can receive vehicle messages 316 from one or more clients/applications associated with the remote computing system 301 (e.g., messages signed by the client to allow for authenticating the messages before sending to a vehicle) and the Vehicle API 304 can provide for communicating the vehicle messages 316 to the vehicle 104, through offboard gateway 306 and Vehicle API relay/client 308, as authenticated vehicle messages 312 (e.g., once the Vehicle API 304 has authenticated the signed vehicle messages 316).
[0147] In some implementations, the Vehicle API 304 can allow for a vehicle 104 to send multiple types of data to the remote computing system 301 over the established connections with the vehicle 104. For instance, in some implementations, the Vehicle API 304 can provide for a vehicle 104 sending status information data 273 to the remote computing system 301. In some implementations, the Vehicle API 304 can provide for a vehicle 104 to send low resolution perception data, such as labels and/or geometries, to the operations computing system 120, allowing for processing the data offboard the vehicle 104 by one or more clients/applications associated with the operations computing system 120 and allowing for developing a better understanding of the world. In some implementations, the Vehicle API 304 can provide for a vehicle to send data such as current vehicle pose (e.g., global and relative to map), vehicle trajectory, onboard diagnostics, status information, and/or the like, to the remote computing system 301 to be processed by one or more clients/applications associated with the remote computing system 301.
[0148] In some implementations, the Vehicle API 304 can provide for the remote computing system 301 to receive multiple types of data from the vehicle 104 and/or additional vehicle(s) 105. For example, the Vehicle API 304 can provide for the remote computing system 301 to receive multiple types of data from the vehicle 104 over the established connections with the vehicle 104. In some implementations, the Vehicle API 304 can provide for receiving status information data 273 from the vehicle 104 at one or more times, and analyzing the status information data 273 to determine a vehicle state associated with the vehicle 104.
[0149] In some implementations, the Vehicle API 304 can provide for the remote computing system 301 to send multiple types of data to the vehicle 104 and/or additional vehicle(s) 105. For example, the Vehicle API 304 can provide for the remote computing system 301 to send multiple types of data to the vehicle 104 over the established connections to the vehicle 104. For example, in some implementations, the Vehicle API 304 can provide for sending command signals to the vehicle 104, such as, for example, sending specific vehicle command(s) to the vehicle 104, sending advisories to the vehicle 104, etc. The specific vehicle command(s) can, for example, instruct the vehicle 104 to offload the data from its computing system, instruct the vehicle 104 to go to certain geographic coordinates, instruct the vehicle 104 to report for maintenance, instruct the vehicle 104 to procure fuel, instruct the vehicle 104 to indicate a selection by the remote computing system 301, instruct the vehicle 104 to relay vehicle command(s), instruct the vehicle 104 to come to a stop, and/or the like. The advisories can, for example, notify a vehicle operator associated with the vehicle 104 about status information associated with the vehicle 104 or vehicle(s) 105, flagged geo-regions (e.g., areas to avoid, areas to proceed with caution, areas under construction that should be routed around, etc.), etc.
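The command signals listed above lend themselves to a simple enumeration-and-dispatch pattern on the vehicle side. The sketch below is illustrative only: the enum members paraphrase the commands named in this paragraph, and the handler mapping is an assumption rather than a disclosed interface:

    from enum import Enum, auto

    class VehicleCommand(Enum):
        OFFLOAD_DATA = auto()
        GO_TO_COORDINATES = auto()
        REPORT_FOR_MAINTENANCE = auto()
        PROCURE_FUEL = auto()
        INDICATE_SELECTION = auto()
        RELAY_COMMANDS = auto()
        COME_TO_STOP = auto()

    def dispatch(command: VehicleCommand, handlers: dict) -> None:
        """Invoke the handler registered for a received command, if any."""
        handler = handlers.get(command)
        if handler is not None:
            handler()

    # Usage sketch: the vehicle computing system registers a callable per command type.
    dispatch(VehicleCommand.INDICATE_SELECTION,
             {VehicleCommand.INDICATE_SELECTION: lambda: print("flashing external indicator")})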
[0150] FIG. 4A depicts a diagram of a plurality of vehicles 105 operating in an environment under the jurisdiction of a third-party entity 410. The plurality of vehicles 105 can include autonomous vehicles 411, 412, 413, and 414. In some implementations, the vehicles 411, 412, 413, and 414 can operate as a convoy. The third-party entity 410 can be associated with a third-party computing system 124, and the third-party entity 410 can send one or more vehicle command(s) to the vehicles 411, 412, 413, and/or 414, via the third-party computing system 124. The vehicle computing system(s) corresponding to the vehicles 411, 412, 413, and/or 414 can perform one or more vehicle action(s) in response to receiving the vehicle command(s).
[0151] As an example, the third-party entity 410 can send vehicle command(s) to select the vehicle 411. In response to receiving the vehicle command(s), the vehicle 411 can be configured as being selected, and perform vehicle action(s) to indicate the selection. The vehicle 411 can flash an external indicator light, display a message on an external display, and/or perform other vehicle action(s) that the third-party entity 410 can perceive to determine that the vehicle 411 is selected.
[0152] As another example, the third-party entity 410 can broadcast vehicle command(s) in a general direction toward the vehicles 411, 412, 413, and 414. If the third-party entity 410 broadcasts vehicle command(s) in order to select the vehicle 411 and the vehicle 412 receives the vehicle command(s), then the vehicle 412 can perform vehicle action(s) to indicate selection of the vehicle 412. The third-party entity 410 can perceive the vehicle action(s) and determine that the vehicle 412 is selected. The third-party entity 410 can broadcast vehicle command(s) instructing the vehicle 412 to relay future vehicle command(s) to an autonomous vehicle in front of the vehicle 412. In response to receiving the vehicle command(s), the vehicle 412 can communicate with the vehicle 411 and relay the future vehicle command(s) from the third-party entity 410 to the vehicle 411.
[0153] As another example, the third-party entity 410 can broadcast vehicle command(s) in a general direction toward the vehicles 411, 412, 413, and 414. If the vehicle command(s) include a vehicle identifier corresponding to the vehicle 411, and the vehicle 412 receives the vehicle command(s), then the vehicle 412 can ignore the vehicle command(s). Alternatively, the vehicle 412 can determine that the vehicle 411 is proximate to the vehicle 412, and the vehicle 412 can relay the vehicle command(s) to the vehicle 411.
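The handling of a broadcast command that carries a vehicle identifier (act on it, relay it to a nearby addressee, or discard it) can be expressed compactly. The helper and attribute names below are illustrative assumptions:

    def handle_broadcast_command(command, own_id, nearby_ids, act, relay, discard):
        """Dispatch a received vehicle command based on its target identifier.

        `command` is assumed to expose a `vehicle_id` attribute; `act`, `relay`,
        and `discard` are callbacks supplied by the vehicle computing system.
        """
        target = command.vehicle_id
        if target == own_id:
            act(command)            # command addressed to this vehicle
        elif target in nearby_ids:
            relay(command, target)  # addressee is proximate: pass the command along
        else:
            discard(command)        # addressee not proximate: ignore the command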
[0154] As another example, the third-party entity 410 can determine that the vehicle 411 has low tire pressure, and send vehicle command(s) selecting the vehicle 411. The vehicle command(s) can include low tire pressure as the reason for the selection, and instruct the vehicle 411 to provide information indicating the reason to a service provider. In response, the vehicle 411 can communicate with the operations computing system 120 to send data indicative of the reason for the selection.
[0155] As another example, the third-party entity can determine that the vehicle 411 appears to have low tire pressure, and send vehicle command(s) selecting the vehicle 411 and instructing the vehicle 411 to provide status information indicative of its tire pressure. In response, the vehicle 411 can retrieve the status information from the status information data 273, or generate the status information, and send the status information to the third-party entity 410. The third-party entity can verify the tire pressure of the vehicle 411 based on the status information and send vehicle command(s) instructing the vehicle 411 to travel to a maintenance area if the tire pressure is low.
[0156] As another example, the third-party entity can send vehicle command(s) selecting the vehicle 411 and instructing the vehicle 411 to stop. In response, the vehicle 411 can perform a stopping action to come to a stop. If the vehicle command(s) include a reason for stopping the vehicle 411, then the vehicle 411 can perform a safe-stop action if the reason is determined not to be critical, or the vehicle 411 can perform an emergency-stop action if the reason is determined to be critical. If the vehicle command(s) include an associated priority level (e.g., low-priority, high-priority), then the vehicle 411 can perform a stopping action corresponding to the priority level.
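The selection between a safe-stop action and an emergency-stop action, as described above, can be summarized in a short illustrative function; the string labels and parameter names are assumptions for the sketch only:

    def choose_stopping_action(reason_is_critical=None, priority=None) -> str:
        """Pick a stopping behavior from the information carried by the vehicle command(s).

        Mirrors the behavior described above: a critical reason or a high priority
        maps to an emergency stop; otherwise a safe stop is used.
        """
        if reason_is_critical or priority == "high":
            return "emergency-stop"
        return "safe-stop"

    assert choose_stopping_action(priority="high") == "emergency-stop"
    assert choose_stopping_action(reason_is_critical=False, priority="low") == "safe-stop"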
[0157] As another example, the third-party entity 410 can send one or more encrypted vehicle command(s) to the vehicle 411. The vehicle 411 can receive the encrypted vehicle command(s), and decrypt the vehicle command(s) using a predetermined key that was previously shared between the vehicle 411 and the third-party entity 410. The vehicle 411 can retrieve the third-party identification data 270 to authenticate the third-party entity 410.
If the third-party entity 410 is authenticated, then the vehicle 411 can perform vehicle action(s) in response to the vehicle command(s) from the third-party entity 410.
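The decrypt-then-authenticate flow described above can be sketched as follows. The disclosure does not name a cipher; Fernet (a symmetric scheme from the third-party `cryptography` package) is used here only as a convenient stand-in, and the `key_registry` interface is the hypothetical registry sketched earlier:

    # Illustrative only: the disclosure does not specify an encryption scheme.
    from typing import Optional
    from cryptography.fernet import Fernet, InvalidToken

    def process_encrypted_command(ciphertext: bytes, sender_id: str, key_registry) -> Optional[bytes]:
        """Decrypt a command with the sender's predetermined key and authenticate the sender.

        `key_registry.key_for(sender_id)` is assumed to return the key previously
        shared with that third party, or None if the sender is unknown.
        """
        shared_key = key_registry.key_for(sender_id)
        if shared_key is None:
            return None                      # unknown third party: do not act
        try:
            plaintext = Fernet(shared_key).decrypt(ciphertext)
        except InvalidToken:
            return None                      # wrong key or tampered message: reject
        # Successful decryption with the previously shared key serves as authentication here.
        return plaintext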
[0158] FIG. 4B depicts a diagram of a plurality of vehicles 105 operating in an environment under the jurisdiction of a third-party entity 410. The plurality of vehicles 105 can include autonomous vehicles 411, 412, 413, and 414. In some implementations, the vehicles 411, 412, 413, and 414 can operate as a convoy. The third-party entity 410 can identify the vehicle 411, and send vehicle command(s) that include a vehicle identifier corresponding to the vehicle 411. The vehicle computing system associated with the vehicle 412 can receive the vehicle command(s) and determine that the vehicle identifier included in the vehicle command(s) does not correspond to the vehicle 412. In some implementations, the vehicle computing system can determine that the vehicle identifier included in the vehicle command(s) corresponds to the vehicle 411, and that the vehicle 411 is proximate to the vehicle 412. The vehicle computing system can control the vehicle 412 to relay data indicative of the vehicle command(s) to the vehicle 411.
[0159] FIG. 4C depicts a diagram of a plurality of vehicles 105 operating in an environment under the jurisdiction of a third-party entity 410. The plurality of vehicles 105 can include autonomous vehicles 411, 412, 413, and 414. In some implementations, the vehicles 411, 412, 413, and 414 can operate as a convoy. The third-party entity 410 can identify the vehicle 414, and send vehicle command(s) that include a vehicle identifier corresponding to the vehicle 414. The vehicle computing system associated with the vehicle 412 can receive the vehicle command(s) and determine that the vehicle identifier included in the vehicle command(s) does not correspond to the vehicle 412. In some implementations, the vehicle computing system can determine that the vehicle identifier included in the vehicle command(s) corresponds to the vehicle 414, and that the vehicle 414 is not proximate to the vehicle 412. The vehicle computing system associated with the vehicle 412 can discard the vehicle command(s) that include the vehicle identifier corresponding to the vehicle 414.
[0160] FIG. 5A depicts a diagram of a plurality of vehicles 105 operating in an environment under the jurisdiction of a third-party entity 510. The plurality of vehicles 105 can include autonomous vehicles 511, 512, 513, and 514. In some implementations, the vehicles 511, 512, 513, and 514 can operate as a convoy. The third-party entity 510 can send one or more vehicle command(s) to select the vehicle 512. In response, the vehicle computing system associated with the vehicle 512 can control the vehicle 512 to flash its hazard lights so that the third-party entity 510 can verify that the vehicle 512 is selected.
[0161] FIG. 5B depicts a diagram of a plurality of vehicles 105 operating in an environment under the jurisdiction of a third-party entity 510. The plurality of vehicles 105 can include autonomous vehicles 511, 512, 513, and 514. In some implementations, the vehicles 511, 512, 513, and 514 can operate as a convoy. The third-party entity 510 can send vehicle command(s) to the vehicle 512 indicating a selection of the vehicle 511. In response, the vehicle computing system associated with the vehicle 512 can communicate with the vehicle computing system associated with the vehicle 511 to indicate a selection of the vehicle 511 by the third-party entity 510. In response, the vehicle computing system associated with the vehicle 511 can control the vehicle 511 to flash its hazard lights so that the third-party entity 510 can verify that the vehicle 511 is selected.
[0162] FIG. 6 depicts a diagram of a plurality of vehicles 105 including vehicles 611,
612, and 613 operating as a convoy. The vehicle 611 can be configured as the lead vehicle in the convoy, and vehicles 612 and 613 can be configured as follower vehicles. The vehicle 612 can be configured to follow vehicle 611 at a first distance (d1), and vehicle 613 can be configured to follow vehicle 612 at the first distance (d1). The vehicles 611, 612, and 613 can be configured to travel at a first velocity (v1).
[0163] At a first time (t = 1), vehicle 611 can be positioned at a first location marker (A), vehicle 612 can be positioned at the first distance (d1) behind vehicle 611, and vehicle 613 can be positioned at the first distance (d1) behind vehicle 612.
[0164] At a second time (t = 2), vehicle 611 can travel to a second location marker (B), vehicle 612 can travel to maintain a position at the first distance (d1) behind vehicle 611, and vehicle 613 can travel to maintain a position at the first distance (d1) behind vehicle 612.
[0165] At a third time (t = 3), vehicle 611 can travel to a third location marker (C), vehicle 612 can travel to maintain a position at the first distance (d1) behind vehicle 611, and vehicle 613 can travel to maintain a position at the first distance (d1) behind vehicle 612.
[0166] FIG. 7 depicts a diagram of a plurality of vehicles 105 including vehicles 711,
712, and 713 operating as a convoy. The vehicle 711 can be configured as the lead vehicle in the convoy, and vehicles 712 and 713 can be configured as follower vehicles. The vehicle 712 can be configured to follow vehicle 711 at a first distance (d1), and vehicle 713 can be configured to follow vehicle 712 at the first distance (d1).
[0167] At a first time (t = 1), vehicles 711, 712, and 713 can travel at a first velocity (v1). The vehicle 712 can follow vehicle 711 at the first distance (d1), and the vehicle 713 can follow vehicle 712 at the first distance (d1).
[0168] At a second time (t = 2), vehicle 712 can detect an obstacle 1001 in front of the vehicle, and generate a motion plan to avoid hitting the obstacle 1001. The motion plan of vehicle 712 can include reducing speed to a second velocity (v2 < v1). The vehicle 711 can continue to travel at the first velocity (v1).
[0169] The vehicle 713 can generate a motion plan to maintain a position at the first distance (d1) behind vehicle 712, and to avoid hitting the vehicle 712. The motion plan of vehicle 713 can include reducing speed to the second velocity (v2). In some implementations, vehicle 712 can communicate with vehicle 713 to provide data indicative of the obstacle 1001 and the motion plan of vehicle 712. The vehicle 713 can receive the data and generate the motion plan of vehicle 713 based in part on the received data. In some implementations, vehicle 713 can detect that vehicle 712, which is in front of vehicle 713, is slowing down. The vehicle 713 can generate the motion plan of vehicle 713 based in part on the detected slowdown.
[0170] At a third time (t = 3), vehicle 712 can determine that the obstacle 1001 is clear, but that vehicle 712 is now a second distance (d2 > d1) behind vehicle 711. The vehicle 712 can generate a motion plan to resume a position at the first distance (d1) behind vehicle 711. The motion plan of vehicle 712 can include increasing speed to a third velocity (v3 > v2) to reduce a follow distance of vehicle 712 behind vehicle 711.
[0171] The vehicle 713 can generate a motion plan to maintain a position at the first distance (d1) behind vehicle 712. The motion plan of vehicle 713 can include increasing speed to the third velocity (v3). In some implementations, vehicle 712 can communicate with vehicle 713 to provide data indicative of the motion plan of vehicle 712. The vehicle 713 can receive the data and generate the motion plan of vehicle 713 based in part on the received data. In some implementations, vehicle 713 can detect that vehicle 712, which is in front of vehicle 713, is speeding up. The vehicle 713 can generate the motion plan of vehicle 713 based in part on the detected speed-up.
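The follow-distance behavior illustrated in FIGS. 6 and 7 amounts to speeding up when the gap to the preceding vehicle exceeds the first distance (d1) and slowing down when it shrinks. A proportional rule is one simple way to express that; the gain, units, and speed cap below are arbitrary assumptions, not values from the disclosure:

    def follower_target_speed(lead_speed: float, gap: float, desired_gap: float,
                              gain: float = 0.5, max_speed: float = 30.0) -> float:
        """Return a follower speed that closes (or opens) the gap toward desired_gap.

        Illustrative proportional rule: if the gap exceeds the desired distance,
        the follower travels faster than the vehicle ahead; if it is smaller, slower.
        """
        error = gap - desired_gap
        target = lead_speed + gain * error
        return max(0.0, min(target, max_speed))

    # Example mirroring FIG. 7: after the obstacle clears, the gap d2 > d1, so the
    # follower's target speed exceeds the lead vehicle's speed v1 until the gap closes.
    v1, d1, d2 = 20.0, 10.0, 18.0
    v3 = follower_target_speed(lead_speed=v1, gap=d2, desired_gap=d1)
    assert v3 > v1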
[0172] FIG. 8 depicts a diagram of a plurality of vehicles 105 including vehicles 811,
812, and 813 operating as a convoy. The vehicle 811 can be configured as the lead vehicle in the convoy, and vehicles 812 and 813 can be configured as follower vehicles. The vehicle
812 can be configured to follow vehicle 811 at a first distance (d1), and vehicle 813 can be configured to follow vehicle 812 at the first distance (d1).
[0173] At a first time (t = 1), vehicles 811, 812, and 813 can travel at a first velocity (v1). The vehicle 812 can follow vehicle 811 at the first distance (d1), and the vehicle 813 can follow vehicle 812 at the first distance (d1).
[0174] At a second time (t = 2), vehicle 812 can detect an obstacle 1101 in front of the vehicle, and generate a motion plan to avoid hitting the obstacle 1101. The motion plan of vehicle 812 can include reducing speed, and moving to a different travel lane. The vehicle 811 can continue to travel at the first velocity (v1).
[0175] The vehicle 812 can communicate with vehicle 813 to provide data indicative of the obstacle 1101 and the motion plan of vehicle 812. The vehicle 813 can receive the data and generate a motion plan of vehicle 813 based in part on the received data. The vehicle 813 can generate the motion plan to try to maintain a position at the first distance (d1) behind vehicle 812, avoid hitting the vehicle 812, and avoid hitting the obstacle 1101. The motion plan of vehicle 813 can include reducing speed, and moving to a different travel lane.
[0176] At a third time (t = 3), vehicle 812 can determine that the obstacle 1101 is clear, but that vehicle 812 is now a second distance (d2 > d1) behind vehicle 811. The vehicle 812 can generate a motion plan to maintain a position at the first distance (d1) behind vehicle 811. The motion plan of vehicle 812 can include increasing speed to a second velocity (v2 > v1) to reduce a follow distance of vehicle 812 behind vehicle 811.
[0177] Additionally, vehicle 813 can determine that it is a third distance (d3 > d1) behind vehicle 812. The vehicle 813 can generate a motion plan to maintain a position at the first distance (d1) behind vehicle 812. The motion plan of vehicle 813 can include increasing speed to a third velocity (v3 > v1) to reduce a follow distance of vehicle 813 behind vehicle 812.
[0178] FIG. 9 depicts a diagram of a plurality of vehicles 105 including vehicles 911,
912, and 913 operating as a convoy in an environment under the jurisdiction of a third-party entity (not shown). The vehicle 911 can be configured as the lead vehicle in the convoy, and vehicles 912 and 913 can be configured as follower vehicles. The vehicle 912 can be configured to follow vehicle 911, and vehicle 913 can be configured to follow vehicle 912.
[0179] At a first time (t = 1), vehicles 911, 912, and 913 can travel as a group. The vehicle 912 can receive one or more vehicle command(s) from the third-party entity. The vehicle command(s) can instruct vehicle 912 to stop. The vehicle command(s) can include or otherwise indicate a low-priority for the stop. In response to the vehicle command(s), the vehicle 912 can determine one or more vehicle action(s) to perform. The vehicle action(s) can include, for example, continuing to travel with the convoy until a safe stopping location is identified, removal from the convoy, and a safe-stop action.
[0180] At a second time (t = 2), vehicle 912 can identify the safety lane 902 as a safe stopping location. The vehicle 912 can send data indicative of the vehicle command(s) from the third-party entity and the determined vehicle action(s) to vehicles 911 and 913, and generate a motion plan to come to a stop in the safety lane 902. A human operator in the vehicle 911 can verify the vehicle command(s) and confirm the vehicle action(s) by sending one or more vehicle command(s) to remove vehicle 912 from the convoy. In particular, vehicle 913 can be configured to follow vehicle 911.
[0181] At a third time (t = 3), vehicle 912 can come to a stop in the safety lane 902.
The vehicle 911 can slow down and/or vehicle 913 can speed up so that vehicle 913 can maintain a predetermined follow distance behind vehicle 911.
[0182] FIG. 10 depicts a diagram of a plurality of vehicles 105 including vehicles 1011, 1012, and 1013 operating as a convoy in an environment under the jurisdiction of a third-party entity (not shown). The vehicle 1011 can be configured as the lead vehicle in the convoy, and vehicles 1012 and 1013 can be configured as follower vehicles. The vehicle 1012 can be configured to follow vehicle 1011, and vehicle 1013 can be configured to follow vehicle 1012.
[0183] At a first time (t = 1), vehicles 1011, 1012, and 1013 can travel as a group.
[0184] At a second time (t = 2), vehicle 1012 can receive one or more vehicle command(s) from the third-party entity. The vehicle command(s) can instruct vehicle 1012 to stop. The vehicle command(s) can include or otherwise indicate a high-priority for the stop. In response to the vehicle command(s), the vehicle 1012 can determine one or more vehicle action(s) to perform. The vehicle action(s) can include, for example, removal from the convoy, and an emergency-stop action. The vehicle 1012 can send data indicative of the vehicle command(s) from the third-party entity and the determined vehicle action(s) to vehicles 1011 and 1013, and generate a motion plan to immediately come to a stop in the same lane that the vehicle is travelling in. A human operator in the vehicle 1011 can verify the vehicle command(s) and confirm the vehicle action(s) by sending one or more vehicle command(s) to remove vehicle 1012 from the convoy. In particular, vehicle 1013 can be configured to follow vehicle 1011. The vehicle 1013 can generate a motion plan to avoid hitting vehicle 1012 when it is stopping/stopped.
[0185] At a third time (t = 3), vehicle 1012 can come to a stop. The vehicle 1011 can slow down and/or vehicle 1013 can speed up so that vehicle 1013 can maintain a predetermined follow distance behind vehicle 1011.
[0186] FIG. 11 depicts a diagram of a plurality of vehicles 105 including vehicles 1111,
1112, and 1113 operating as a convoy in an environment under the jurisdiction of a third-party entity (not shown). The vehicle 1111 can be configured as the lead vehicle in the convoy, and vehicles 1112 and 1113 can be configured as follower vehicles. The vehicle 1112 can be configured to follow vehicle 1111, and vehicle 1113 can be configured to follow vehicle 1112.
[0187] At a first time (t = 1), vehicles 1111, 1112, and 1113 can travel as a group.
[0188] At a second time (t = 2), vehicle 1112 can receive one or more vehicle command(s) from the third-party entity. The vehicle command(s) can instruct vehicle 1112 to stop. The vehicle command(s) can include or otherwise indicate a request for a human to be present at an inspection, or instructions for all vehicles in the convoy to stop. In response to the vehicle command(s), the vehicle 1112 can determine one or more vehicle action(s) to perform. The vehicle action(s) can include, for example, coordinating with other vehicles in the convoy to stop as a group. In particular, the vehicle 1112 can send data indicative of the vehicle command(s) from the third-party entity and the determined vehicle action(s) to vehicle 1111. A human operator in the vehicle 1111 can verify the vehicle command(s) from the third-party entity, and send one or more vehicle command(s) to determine coordinated stopping actions for all the vehicles in the convoy.
[0189] At a third time (t = 3), vehicles 1111, 1112, and 1113 can perform a respective coordinated stopping action so that vehicles 1111, 1112, and 1113 can stop as a group in the safety lane 1102.
[0190] FIG. 12 depicts a diagram of a plurality of vehicles 105 including vehicles 1211, 1212, and 1213 operating as a convoy in an environment under the jurisdiction of a third-party entity (not shown). The vehicle 1211 can be configured as the lead vehicle in the convoy, and vehicles 1212 and 1213 can be configured as follower vehicles. The vehicle 1212 can be configured to follow vehicle 1211, and vehicle 1213 can be configured to follow vehicle 1212.
[0191] At a first time (t = 1), vehicles 1211, 1212, and 1213 can travel as a group.
[0192] At a second time (t = 2), vehicle 1212 can receive one or more vehicle command(s) from the third-party entity. The vehicle command(s) can instruct vehicle 1212 to stop. The vehicle command(s) can include or otherwise indicate a request for a human to be present at an inspection, or instructions for all vehicles in the convoy to stop. In response to the vehicle command(s), the vehicle 1212 can determine one or more vehicle action(s) to perform. The vehicle action(s) can include, for example, coordinating with other vehicles in the convoy to stop as a group. In particular, the vehicle 1212 can send data indicative of the vehicle command(s) from the third-party entity and the determined vehicle action(s) to vehicle 1211. A human operator in the vehicle 1211 can verify the vehicle command(s) from the third-party entity, and send one or more vehicle command(s) to determine coordinated stopping actions for all the vehicles in the convoy.
[0193] At a third time (t = 3), vehicles 1211, 1212, and 1213 can perform a respective coordinated stopping action so that the convoy can stop as a group in a same lane that vehicles 1211, 1212, and 1213 are travelling in.
[0194] FIG. 13 depicts a flow diagram of an example method 1300 for controlling an autonomous vehicle according to example embodiments of the present disclosure. One or more portion(s) of the method 1300 can be implemented as operations by one or more computing system(s) such as, for example, the computing system(s) 102, 120, 1401, and 1410 shown in FIGS. 1, 2, and 10. Moreover, one or more portion(s) of the method 1300 can be implemented as an algorithm on the hardware components of the system(s) described herein (e.g., as in FIGS. 1, 2, and 14), for example, to control an autonomous vehicle in response to vehicle instructions from a remote computing system associated with a third-party entity.
FIG. 13 depicts elements performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the elements of any of the methods (e.g., of FIG. 13) discussed herein can be adapted, rearranged, expanded, omitted, combined, and/or modified in various ways without deviating from the scope of the present disclosure.
[0195] At (1301), the method 1300 can include controlling a first autonomous vehicle that is part of a convoy to provide a vehicle service. For example, the vehicle computing system 102 can control the vehicle 104 to provide a vehicle service. The vehicle 104 can be part of a fleet of vehicles controlled by an operations computing system 120 associated with a service provider, and more particularly, part of a convoy that includes a plurality of vehicles from the fleet. The vehicle computing system 102 can control the vehicle 104 to provide the vehicle service for a second entity associated with the client computing system 122. The vehicle computing system 102 can control the vehicle 104 to provide the vehicle service at least partly in a geographic area under the jurisdiction of a third-party entity.
[0196] At (1302), the method 1300 can include receiving one or more communication(s) from a remote computing system associated with a third-party entity. For example, the vehicle computing system 102 can receive one or more communications from the third-party computing system 124 associated with the third-party entity. In particular, one or more clients and/or applications associated with the third-party computing system 124 can send vehicle messages 316 corresponding to the communication(s) to the offboard gateway 306, and the Vehicle API 304 can provide the vehicle messages 316 to the vehicle 104, through offboard gateway 306 and Vehicle API relay/client 308, as authenticated vehicle messages 312. The communication(s) can include one or more vehicle instruction(s), such as, for example, instructions for selecting the vehicle 104, instructions for stopping the vehicle 104, instructions for the vehicle 104 to relay information to the other vehicles in the convoy, or instructions for the vehicle 104 to provide information to the third-party entity.
The vehicle computing system 102 can receive the vehicle instruction(s) as one or more encrypted communication(s) from the third-party computing system 124 associated with a third-party entity, and the vehicle computing system 102 can decrypt the encrypted communication(s) based on a predetermined key.
[0197] At (1303), the method 1300 can include determining one or more vehicle action(s) to perform based on one or more vehicle instruction(s) included in the communication(s).
For example, the vehicle instruction(s) can instruct the vehicle computing system 102 to stop the vehicle 104. The vehicle computing system 102 can determine an identity of the third-party entity based on the communication(s), and determine one or more vehicle action(s) to perform based on the identity. The vehicle computing system 102 can also determine a vehicle identifier associated with the communication(s), and determine one or more vehicle action(s) to perform if the vehicle identifier corresponds to the vehicle 104. If the vehicle instruction(s) in the communication(s) include an associated priority-level, then the vehicle computing system 102 can determine the vehicle action(s) to perform that correspond to the priority level. If the vehicle instruction(s) include instructions for the vehicle computing system 102 to stop the vehicle 104, then the vehicle computing system 102 can send data indicative of the vehicle instruction(s) to the other vehicle(s) 105 in the convoy, determine a stopping action that corresponds to the vehicle instruction(s), remove the vehicle 104 from the convoy, and implement the stopping action to bring the vehicle 104 to a stop.
Alternatively, the vehicle computing system 102 can send data indicative of the vehicle instruction(s) to a lead vehicle 105 in the convoy, receive one or more vehicle instruction(s) from the lead vehicle 105, and implement the vehicle instruction(s) received from the lead vehicle 105. In particular, one or more clients and/or applications associated with the vehicle computing system 102 can send vehicle messages corresponding to the vehicle instruction(s) to an offboard gateway, and a Vehicle API can provide the vehicle messages to the other vehicle(s) 105, through the offboard gateway and a Vehicle API relay/client associated with the vehicle(s) 105, as authenticated vehicle messages.
[0198] At (1304), the method 1300 can include controlling the first autonomous vehicle to implement the vehicle action(s). For example, the vehicle computing system 102 can control the vehicle 104 to implement the determined vehicle action(s) in response to receiving the one or more vehicle instructions. If the vehicle instruction(s) include instructions for the vehicle computing system 102 to stop the vehicle 104, and the vehicle instruction(s) include a non-critical reason for stopping the vehicle 104, then the vehicle computing system 102 can control the vehicle 104 to implement a soft-stop vehicle action. If the vehicle instruction(s) include instructions for the vehicle computing system 102 to stop the vehicle 104, and the vehicle instruction(s) include a critical reason for stopping the vehicle 104, then the vehicle computing system 102 can control the vehicle 104 to implement an emergency-stop vehicle action. If the vehicle instruction(s) include instructions for the vehicle computing system 102 to stop the vehicle 104, and the vehicle instruction(s) are associated with a low-priority, then the vehicle computing system 102 can control the vehicle 104 to implement a soft-stop vehicle action. If the vehicle instruction(s) include instructions for the vehicle computing system 102 to stop the vehicle 104, and the vehicle instruction(s) are associated with a high-priority, then the vehicle computing system 102 can control the vehicle 104 to implement an emergency-stop vehicle action.
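The handling of a stop instruction at (1303)-(1304) can be sketched, for illustration only, as a single handler that notifies the rest of the convoy, removes the vehicle from the convoy, and selects between the stopping behaviors described above. The `convoy` and `vehicle_control` interfaces and the attribute names are hypothetical stand-ins, not disclosed APIs:

    def handle_stop_instruction(instruction, convoy, vehicle_control):
        """Illustrative sketch of acting on a stop instruction addressed to this vehicle.

        `instruction` is assumed to expose `priority` and `reason_is_critical`
        attributes; `convoy` and `vehicle_control` stand in for the convoy
        coordination and motion-control layers.
        """
        # Share the received instruction with the other vehicles in the convoy.
        convoy.broadcast(instruction)
        critical = getattr(instruction, "reason_is_critical", False)
        high_priority = getattr(instruction, "priority", None) == "high"
        # Leave the convoy so the remaining vehicles can close ranks, then stop.
        convoy.remove_self()
        if critical or high_priority:
            vehicle_control.emergency_stop()   # critical reason or high priority
        else:
            vehicle_control.safe_stop()        # non-critical reason or low priority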
[0199] FIG. 14 depicts a diagram 1400 of a transfer hub 1460 according to example embodiments of the present disclosure. The transfer hub 1460 can include a loading zone 1462, launch zone 1464, and landing zone 1466. The loading zone 1462 can be connected to the launch zone 1464 via an access route 1472, and connected to the landing zone 1466 via an access route 1478. The launch zone 1464 can be connected to a highway via an on-ramp 1474, and the landing zone can be connected to the highway via an off-ramp 1476. An autonomous vehicle can exit the transfer hub 1460 via the on-ramp 1474, and the autonomous vehicle can enter the transfer hub 1460 via the off-ramp 1476.
[0200] The transfer hub 1460 can include a first external monitor 1482. The vehicle computing system 102, and/or the operations computing system 120 can control the vehicle 104 to travel to a vicinity of the first external monitor 1482 when the vehicle 104 enters the transfer hub 1460 via the off-ramp 1476.
[0201] In some implementations, the first external monitor 1482 (e.g., external monitor computing system 126 corresponding to the first external monitor 1482) can inspect the vehicle 104 to generate diagnostics information associated with the vehicle 104.
Additionally, or alternatively, the first external monitor 1482 can obtain diagnostics information from the vehicle 104. The external monitor 1482 can determine remote inspection information associated with the vehicle 104 based on the diagnostics information, and provide the remote inspection information to a third-party computing system 124.
[0202] In some implementations, the first external monitor 1482 can inspect the vehicle 104 to generate diagnostics information associated with the vehicle 104, and provide the diagnostics information to the vehicle computing system 102. The vehicle computing system 102 can store the diagnostics information and/or determine remote inspection information associated with the vehicle 104 based on the diagnostics information. The vehicle computing system 102 can provide the remote inspection information to a third-party computing system 124.
[0203] The transfer hub 1460 can include a second external monitor 1484. The vehicle computing system 102, and/or the operations computing system 120 can control the vehicle 104 to travel to a vicinity of the second external monitor 1484 before the vehicle 104 exits the transfer hub 1460 via the on-ramp 1474.
[0204] In some implementations, the second external monitor 1484 (e.g., external monitor computing system 126 corresponding to the second external monitor 1484) can inspect the vehicle 104 to generate diagnostics information associated with the vehicle 104. Additionally, or alternatively, the second external monitor 1484 can obtain diagnostics information from the vehicle 104. The external monitor 1484 can determine remote inspection information associated with the vehicle 104 based on the diagnostics information, and provide the remote inspection information to a third-party computing system 124.
[0205] In some implementations, the second external monitor 1484 can inspect the vehicle 104 to generate diagnostics information associated with the vehicle 104, and provide the diagnostics information to the vehicle computing system 102. The vehicle computing system 102 can store the diagnostics information and/or determine remote inspection information associated with the vehicle 104 based on the diagnostics information. The vehicle computing system 102 can provide the remote inspection information to a third-party computing system 124.
[0206] FIG. 15 depicts a diagram 1500 of a transportation route 1502 according to example embodiments of the present disclosure. The transportation route 1502 can be part of a highway transportation infrastructure administered by a highway transportation administration. The transportation route 1502 can extend across regions 1503, 1505, 1507, and 1509, which are separated by boundaries 1504, 1506, 1508, and 1510. The regions 1503,
1505, and 1507 can each correspond to, for example, different government entities, and the boundaries 1504 and 1506 can each correspond to, for example, political boundaries. The boundary 1504 can separate region 1503 and region 1505, and the boundary 1506 can separate region 1505 and region 1507. The region 1509 can correspond to, for example, a region where wireless communication is unavailable or unreliable. The region 1509 can be bounded by the boundaries 1508 and 1510.
[0207] The transportation route 1502 can include an external monitor 1512 located at the boundary 1504, and an external monitor 1516 located at the boundary 1506. When the vehicle 104 is travelling from region 1503 to region 1507 on the transportation route 1502, the vehicle 104 can travel within a vicinity of the external monitor 1512 when the vehicle 104 crosses the boundary 1504, and within a vicinity of the external monitor 1516 when the vehicle 104 crosses the boundary 1506. The vehicle 104 can autonomously provide remote inspection information associated with it to the external monitors 1512 and 1516 when the vehicle 104 is within a vicinity of the external monitors 1512 and 1516, respectively.
[0208] As an example, when the vehicle 104 is travelling from region 1503 to region 1507, the vehicle 104 can provide remote inspection information including a weight of an attached cargo item to the external monitor 1512 when the vehicle 104 crosses the boundary 1504. The external monitor 1512 can provide the remote inspection information received from the vehicle 104 to a tax assessment entity associated with the region 1505.
[0209] As another example, when the vehicle 104 is travelling from region 1503 to region 1507, the vehicle 104 can provide remote inspection information including a weight of an attached cargo item to the external monitor 1516 when the vehicle 104 crosses the boundary
1506. The external monitor 1516 can provide the remote inspection information received from the vehicle 104 to a tax assessment entity associated with region 1507.
[0210] The transportation route 1502 can include an external monitor 1522 located at the boundary 1504, and an external monitor 1518 located at the boundary 1506. When the vehicle 104 is travelling from region 1507 to region 1503 on the transportation route 1502, the vehicle 104 can travel within a vicinity of the external monitor 1518 when the vehicle 104 crosses the boundary 1506, and within a vicinity of the external monitor 1522 when the vehicle 104 crosses the boundary 1504. The vehicle 104 can autonomously provide remote inspection information associated with it to the external monitors 1518 and 1522 when the vehicle 104 is within a vicinity of the external monitors 1518 and 1522, respectively.
[0211] As an example, when the vehicle 104 is travelling from region 1507 to region 1503, the vehicle 104 can provide remote inspection information including a weight of an attached cargo item to the external monitor 1518 when the vehicle 104 crosses the boundary
1506. The external monitor 1518 can provide the remote inspection information received from the vehicle 104 to a tax assessment entity associated with the region 1505.
[0212] As another example, when the vehicle 104 is travelling from region 1507 to region
1503, the vehicle 104 can provide remote inspection information including a weight of an attached cargo item to the external monitor 1522 when the vehicle 104 crosses the boundary
1504. The external monitor 1522 can provide the remote inspection information received from the vehicle 104 to a tax assessment entity associated with region 1503.
[0213] In some implementations, the vehicle computing system 102, and/or the operations computing system 120 can control the vehicle 104 to wirelessly provide remote inspection information to a third-party entity at periodic intervals along the transportation route 1502. For example, the vehicle 104 can provide remote inspection information including a location and speed associated with the vehicle 104 to a law enforcement entity to assist the law enforcement entity in monitoring vehicular traffic in its jurisdiction. When the vehicle 104 is located in the region 1503, the vehicle 104 can provide such remote inspection information to a law enforcement entity associated with the region 1503; when the vehicle 104 is located in the region 1505, the vehicle 104 can provide such remote inspection information to a law enforcement entity associated with the region 1505; and when the vehicle 104 is located in the region 1507, the vehicle 104 can provide such remote inspection information to a law enforcement entity associated with the region 1507.
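The region-dependent reporting described above reduces to mapping the vehicle's current position to a region and forwarding a report to the entity registered for that region. The sketch below is illustrative only; the `region_of` mapping and `endpoints` registry are assumed interfaces, not part of the disclosure:

    from typing import Callable, Dict, Tuple

    def report_to_local_authority(position: Tuple[float, float], speed: float,
                                  region_of: Callable[[Tuple[float, float]], str],
                                  endpoints: Dict[str, Callable[[dict], None]]) -> None:
        """Send a periodic location/speed report to the entity for the current region.

        `region_of` maps a position to a region identifier (e.g., "1503"), and
        `endpoints` maps region identifiers to send callables.
        """
        region = region_of(position)
        send = endpoints.get(region)
        if send is not None:
            send({"position": position, "speed": speed, "region": region})

    # Usage sketch: invoke on a timer while travelling along the transportation route 1502.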
[0214] The transportation route 1502 can include external monitors 1514 and 1520 located within the region 1509. The external monitors 1514 and 1520 can include a dedicated and/or physical connection to a communications network to provide remote inspection information to a third-party entity. For example, when the vehicle 104 is located in the region 1509 that is included in the region 1505, the vehicle 104 can be unable to provide remote inspection information including a location and speed associated with the vehicle 104 to a law enforcement entity associated with the region 1505 because wireless communication is unavailable or unreliable in the region 1509. The vehicle 104 can instead travel within a vicinity of the external monitor 1514 when travelling from the region 1503 to the region
1507, and travel within a vicinity of the external monitor 1520 when travelling from the region 1507 to the region 1503. The vehicle 104 can provide remote inspection information including a location and speed associated with the vehicle 104 to the external monitors 1514 and 1520, and the external monitors 1514 and 1520 can provide the remote inspection information to a law-enforcement entity associated with the region 1505.
[0215] FIGS. 16A, 16B, and 16C depict diagrams 1602, 1604, and 1606 of determining remote inspection information using a mobile external monitor according to exemplary embodiments of the present disclosure. In FIG. 16A, the vehicle 104 can detect a fault, for example, with a tire pressure sensor of the vehicle 104, when travelling via the transportation route 1610. In response to the fault, the vehicle 104 can pull over in the safety lane 1604 and request a mobile inspection of its tires. The vehicle 105 can be selected to travel to a vicinity of the vehicle 104 to inspect one or more tires of the vehicle 104. The vehicle 105 can be affixed with an external monitor that can inspect the vehicle 104. Additionally, or alternatively, the vehicle 105 can use one or more sensors onboard the vehicle 105 to inspect the vehicle 104. The vehicle 105 can inspect the vehicle 104 to generate diagnostics information associated with the vehicle 104, and provide the diagnostics information to the vehicle 104. Additionally, or alternatively, the vehicle 105 can determine remote inspection information associated with the vehicle 104, and provide the remote inspection information to a third-party computing system 124.
[0216] In FIG. 16B, the vehicle 104 can detect a fault, for example, with a tire pressure sensor of the vehicle 104, when travelling via the transportation route 1610. In response to the fault, the vehicle 104 can travel in a right-side lane of the transportation route 1610, and request a mobile inspection of its left-side tires. The vehicle 105 can be selected to travel to a vicinity of the vehicle 104 to inspect one or more tires of the vehicle 104. The vehicle 105 can be affixed with an external monitor that can inspect the vehicle 104. Additionally, or alternatively, the vehicle 105 can use one or more sensors onboard the vehicle 105 to inspect the vehicle 104. The vehicle 105 can travel in a left-side lane of the transportation route 1610 and inspect a left-side of the vehicle 104 to generate diagnostics information associated with the vehicle 104. The vehicle 105 can provide the diagnostics information to the vehicle 104. Additionally, or alternatively, the vehicle 105 can determine remote inspection information associated with the vehicle 104, and provide the remote inspection information to a third-party computing system 124.
[0217] In FIG. 16C, the vehicle 104 can detect a fault, for example, with a tire pressure sensor of the vehicle 104, when travelling via the transportation route 1610. In response to the fault, the vehicle 104 can travel in a left-side lane of the transportation route 1610, and request a mobile inspection of its right-side tires. The vehicle 105 can be selected to travel to a vicinity of the vehicle 104 to inspect one or more tires of the vehicle 104. The vehicle 105 can be affixed with an external monitor that can inspect the vehicle 104. Additionally, or alternatively, the vehicle 105 can use one or more sensors onboard the vehicle 105 to inspect the vehicle 104. The vehicle 105 can travel in a right-side lane of the transportation route 1610 and inspect a right-side of the vehicle 104 to generate diagnostics information associated with the vehicle 104. The vehicle 105 can provide the diagnostics information to the vehicle 104. Additionally, or alternatively, the vehicle 105 can determine remote inspection information associated with the vehicle 104, and provide the remote inspection information to a third-party computing system 124.
[0218] FIG. 17 depicts a diagram 1700 of remote inspection information 1702 according to example embodiments of the present disclosure. The remote inspection information 1702 can be associated with the vehicle 104, and can indicate a vehicle components status 1704, vehicle performance status 1706, and vehicle environment status 1708 associated with the vehicle 104. The vehicle components status 1704 can correspond to, for example, a status of the sensor(s) 108, autonomy computing system 110, vehicle control system 112, communications system 114, and memory system 116. The vehicle performance status 1706 can correspond to, for example, a speed, distance travelled, fuel consumption, weight, coolant levels, and brake wear associated with the vehicle 104. The vehicle environment status 1708 can correspond to, for example, road conditions, weather conditions, and traffic conditions that are determined based on the sensor data 109.
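A minimal sketch of one possible container for the remote inspection information 1702 follows; the field and key names are assumptions, since the disclosure identifies only the three status categories and representative items within each.

```python
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class RemoteInspectionInformation:
    # Vehicle components status 1704: health of onboard systems.
    components_status: Dict[str, str] = field(default_factory=dict)
    # Vehicle performance status 1706: speed, distance travelled, fuel, etc.
    performance_status: Dict[str, float] = field(default_factory=dict)
    # Vehicle environment status 1708: road, weather, and traffic conditions.
    environment_status: Dict[str, str] = field(default_factory=dict)


# Example values are made up for illustration.
example = RemoteInspectionInformation(
    components_status={"sensors": "OK", "autonomy_computing_system": "OK",
                       "vehicle_control_system": "OK",
                       "communications_system": "OK", "memory_system": "OK"},
    performance_status={"speed_kph": 88.0, "distance_travelled_km": 412.3,
                        "fuel_consumption_l_per_100km": 31.5, "brake_wear_pct": 18.0},
    environment_status={"road": "dry", "weather": "clear", "traffic": "light"},
)
```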
[0219] FIG. 18 depicts a flow diagram of an example method 1800 for controlling an autonomous vehicle according to example embodiments of the present disclosure. One or more portion(s) of the method 1800 can be implemented as operations by one or more computing system(s) such as, for example, the computing system(s) 102, 120, 801, and 810 shown in FIGS. 1, 2, and 8. Moreover, one or more portion(s) of the method 1800 can be implemented as an algorithm on the hardware components of the system(s) described herein (e.g., as in FIGS. 1, 2, and 8) to, for example, provide remote inspection information to a third-party entity. FIG. 18 depicts elements performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the elements of any of the methods (e.g., of FIG. 18) discussed herein can be adapted, rearranged, expanded, omitted, combined, and/or modified in various ways without deviating from the scope of the present disclosure.

[0220] At (1801), the method 1800 can include controlling an autonomous vehicle to provide a vehicle service. For example, the vehicle computing system 102 can control the vehicle 104 to provide a vehicle service to a client entity.
[0221] At (1802), the method 1800 can include determining diagnostics information associated with the autonomous vehicle. For example, the vehicle computing system 102 can determine vehicle diagnostics information associated with the vehicle 104. The vehicle computing system 102 can determine the vehicle diagnostics information by autonomously generating diagnostics information associated with the vehicle 104. Additionally, or alternatively, the vehicle computing system 102 can determine the vehicle diagnostics information by controlling the vehicle 104 to travel to a vicinity of an external monitor that can generate diagnostics information associated with the vehicle 104. The external monitor can include one or more of an automated inspection device and a human inspector. The external monitor can be located at a transfer hub and/or along a transportation route of a transportation network used to provide the vehicle service to the client entity. In some implementations, the external monitor can be mobile, and affixed to an additional vehicle 105. In some implementations, the external monitor can include one or more sensors onboard the vehicle 105. The operations computing system 120 can control the vehicle 105 to travel to a vicinity of the vehicle 104 and generate diagnostics information associated with the vehicle 104. The operations computing system 120 can control the vehicle 105 in response to a request by the vehicle 104 for an external monitor to generate diagnostics information associated with the vehicle 104.
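A hedged sketch of step (1802) follows, assuming hypothetical helpers such as self_diagnose(), find_nearest_monitor(), dispatch(), travel_to(), and inspect(); none of these names come from the disclosure, and the real system may divide this logic differently between the vehicle computing system 102 and the operations computing system 120.

```python
def determine_diagnostics(vehicle, operations_system):
    """Obtain diagnostics either autonomously or via an external monitor (step 1802)."""
    diagnostics = vehicle.self_diagnose()  # autonomously generated diagnostics, if available
    if diagnostics is not None:
        return diagnostics

    # Otherwise fall back to an external monitor: a fixed monitor at a transfer hub
    # or along the transportation route, or a mobile monitor affixed to another vehicle.
    monitor = operations_system.find_nearest_monitor(vehicle.location)
    if monitor.is_mobile:
        # The operations computing system dispatches the monitoring vehicle
        # (e.g., the vehicle 105) to the vicinity of the vehicle being inspected.
        operations_system.dispatch(monitor.carrier_vehicle, to=vehicle.location)
    else:
        # The vehicle travels to the vicinity of the fixed monitor.
        vehicle.travel_to(monitor.location)
    return monitor.inspect(vehicle)
```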
[0222] At (1803), the method 1800 can include determining remote inspection information associated with the autonomous vehicle. For example, the vehicle computing system 102 can determine remote inspection information associated with the vehicle 104 that includes an assessment of one or more categories pertaining to a third-party entity, based on vehicle diagnostics information associated with the vehicle 104. The vehicle computing system 102 can determine one or more categories pertaining to the third-party entity, analyze the vehicle diagnostics information associated with the vehicle 104 to determine an assessment for each of the one or more categories, and generate remote inspection information based at least in part on the assessment for each of the one or more categories.
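Step (1803) can be illustrated with a small sketch that maps each category a third-party entity cares about onto an assessment derived from the diagnostics; the category names, diagnostic keys, and the placeholder pass/fail rule below are all assumptions.

```python
def determine_remote_inspection_information(diagnostics, third_party_categories):
    """Assess each category a third-party entity cares about from raw diagnostics.

    third_party_categories maps a category name (e.g., "braking") to the
    diagnostic keys assumed to feed that category's assessment.
    """
    assessments = {}
    for category, required_keys in third_party_categories.items():
        readings = {k: diagnostics[k] for k in required_keys if k in diagnostics}
        if len(readings) < len(required_keys):
            # Not every needed reading is available in the diagnostics information.
            assessments[category] = "insufficient data"
        else:
            # Placeholder analysis: the real system would apply whatever
            # category-specific rules the third-party entity requires.
            assessments[category] = "within expected range"
    return assessments


# Example usage with made-up diagnostics and categories:
remote_inspection_information = determine_remote_inspection_information(
    {"brake_wear_pct": 18.0, "tire_pressure_front_left_kpa": 780.0},
    {"braking": ["brake_wear_pct"], "tires": ["tire_pressure_front_left_kpa"]},
)
```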
[0223] At (1804), the method 1800 can include providing remote inspection information to a third-party entity. For example, the vehicle computing system 102 can provide remote inspection information associated with the vehicle 104 to a third-party computing system 124 corresponding to a third-party entity. The vehicle computing system 102 can provide the remote inspection information to a remote third-party entity at one or more times when providing the vehicle service to the client entity.
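Step (1804) can be sketched as packaging the assessments and handing them to whatever channel reaches the third-party computing system 124; the JSON payload layout and the injected send_to_third_party callable are assumptions (only the standard json and time modules are real).

```python
import json
import time


def provide_remote_inspection_information(vehicle_id, assessments, send_to_third_party):
    """Package the category assessments and hand them to a transport callable."""
    payload = json.dumps({
        "vehicle_id": vehicle_id,
        "timestamp": time.time(),
        "remote_inspection_information": assessments,
    })
    # The transport is injected so the sketch does not presume any particular
    # network stack for reaching the third-party computing system 124.
    send_to_third_party(payload)


# Example: printing stands in for transmission to the third-party computing system.
provide_remote_inspection_information(
    "vehicle-104", {"braking": "within expected range"}, print)
```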
[0224] FIG. 19 depicts an example computing system 1900 according to example embodiments of the present disclosure. The example system 1900 illustrated in FIG. 19 is provided as an example only. The components, systems, connections, and/or other aspects illustrated in FIG. 19 are optional and are provided as examples of what is possible, but not required, to implement the present disclosure. The example system 1900 can include the vehicle computing system 102 of the vehicle 104 and, in some implementations, one or more remote computing systems 1910 that are remote from the vehicle 104 (e.g., the operations computing system 120) and that can be communicatively coupled to one another over one or more networks 1920. The remote computing system 1910 can be associated with a central operations system and/or an entity associated with the vehicle 104 such as, for example, a vehicle owner, vehicle manager, fleet operator, service provider, etc.
[0225] The computing device(s) 1901 of the vehicle computing system 102 can include processor(s) 1902 and a memory 1904. The one or more processors 1902 can be any suitable processing device (e.g., a processor core, a microprocessor, an ASIC, an FPGA, a controller, a microcontroller, etc.) and can be one processor or a plurality of processors that are operatively connected. The memory 1904 can include one or more non-transitory computer-readable storage media, such as RAM, ROM, EEPROM, EPROM, one or more memory devices, flash memory devices, etc., and combinations thereof.
[0226] The memory 1904 can store information that can be accessed by the one or more processors 1902. For instance, the memory 1904 (e.g., one or more non-transitory computer-readable storage media, memory devices) on-board the vehicle 104 can include computer-readable instructions 1906 that can be executed by the one or more processors 1902. The instructions 1906 can be software written in any suitable programming language or can be implemented in hardware. Additionally, or alternatively, the instructions 1906 can be executed in logically and/or virtually separate threads on processor(s) 1902.
[0227] For example, the memory 1904 on-board the vehicle 104 can store instructions 1906 that when executed by the one or more processors 1902 on-board the vehicle 104 cause the one or more processors 1902 (the vehicle computing system 102) to perform operations such as any of the operations and functions of the vehicle computing system 102 described herein, one or more operations of the method 1300, and/or any other operations and functions of the vehicle computing system 102, as described herein.
[0228] The memory 1904 can store data 1908 that can be obtained, received, accessed, written, manipulated, created, and/or stored. The data 1908 can include, for instance, data associated with perception, prediction, motion plan, maps, third-party identification, vehicle identification, vehicle status information, vehicle diagnostics, remote inspection, and/or other data/information as described herein. In some implementations, the computing device(s) 1901 can obtain data from one or more memory device(s) that are remote from the vehicle 104.
[0229] The computing device(s) 1901 can also include a communication interface 1903 used to communicate with one or more other system(s) on-board the vehicle 104 and/or a remote computing device that is remote from the vehicle 104 (e.g., of remote computing system(s) 1910). The communication interface 1903 can include any circuits, components, software, etc. for communicating via one or more networks (e.g., 1920). In some implementations, the communication interface 1903 can include, for example, one or more of a communications controller, receiver, transceiver, transmitter, port, conductors, software, and/or hardware for communicating data.
[0230] The network(s) 1920 can be any type of network or combination of networks that allows for communication between devices. In some embodiments, the network(s) can include one or more of a local area network, wide area network, the Internet, secure network, cellular network, mesh network, peer-to-peer communication link, and/or some combination thereof, and can include any number of wired or wireless links. Communication over the network(s) 1920 can be accomplished, for instance, via a communication interface using any type of protocol, protection scheme, encoding, format, packaging, etc.
[0231] The remote computing system 1910 can include one or more remote computing devices that are remote from the vehicle computing system 102. The remote computing devices can include components (e.g., processor(s), memory, instructions, data) similar to those described herein for the computing device(s) 1901. Moreover, the remote computing system(s) 1910 can be configured to perform one or more operations of the operations computing system 120, as described herein. Moreover, the computing systems of other vehicles described herein can include components similar to those of the vehicle computing system 102.
[0232] Computing tasks discussed herein as being performed at computing device(s) remote from the vehicle can instead be performed at the vehicle (e.g., via the vehicle computing system), or vice versa. Such configurations can be implemented without deviating from the scope of the present disclosure. The use of computer-based systems allows for a great variety of possible configurations, combinations, and divisions of tasks and functionality between and among components. Computer-implemented operations can be performed on a single component or across multiple components. Computer-implemented tasks and/or operations can be performed sequentially or in parallel. Data and instructions can be stored in a single memory device or across multiple memory devices.
[0233] While the present subject matter has been described in detail with respect to specific example embodiments and methods thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing, can readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, the scope of the present disclosure is by way of example rather than by way of limitation, and the subject disclosure does not preclude inclusion of such modifications, variations, and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.

Claims

WHAT IS CLAIMED IS:
1. A computer-implemented method for controlling an autonomous vehicle in response to vehicle instructions from a remote computing system, the method comprising: controlling, by one or more computing devices, a first autonomous vehicle to provide a vehicle service, the first autonomous vehicle being associated with a first convoy that includes one or more second autonomous vehicles, the vehicle service being associated with a service provider entity;
receiving, by the one or more computing devices, one or more communications from a remote computing system associated with a third-party entity that is separate from the service provider entity, the one or more communications including one or more vehicle instructions;
coordinating, by the one or more computing devices, with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity; and
controlling, by the one or more computing devices, the first autonomous vehicle to implement the one or more vehicle actions.
2. The computer-implemented method of claim 1, wherein coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity comprises:
providing, by the one or more computing devices, data indicative of the one or more vehicle instructions to the one or more second autonomous vehicles;
determining, by the one or more computing devices, a stopping action that corresponds to the one or more vehicle instructions;
removing, by the one or more computing devices, the first autonomous vehicle from the first convoy; and
controlling, by the one or more computing devices, the first autonomous vehicle to implement the stopping action.
3. The computer-implemented method of any of the preceding claims, wherein coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity comprises:
providing, by the one or more computing devices, data indicative of the one or more vehicle instructions to a lead vehicle among the one or more second autonomous vehicles; receiving, by the one or more computing devices, one or more vehicle instructions from the lead vehicle; and
implementing, by the one or more computing devices, the one or more vehicle instructions received from the lead vehicle.
4. The computer-implemented method of any of the preceding claims, wherein coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity comprises:
determining, by the one or more computing devices, an identity of the third-party entity based at least in part on the one or more communications;
determining, by the one or more computing devices, the one or more vehicle actions based at least in part on the identity; and
providing, by the one or more computing devices, data indicative of the one or more determined vehicle actions to the one or more second autonomous vehicles.
5. The computer-implemented method of any of the preceding claims, wherein coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity comprises:
determining, by the one or more computing devices, a vehicle identifier associated with the one or more communications;
determining, by the one or more computing devices, the one or more vehicle actions based at least in part on the vehicle identifier; and
providing, by the one or more computing devices, data indicative of the one or more determined vehicle actions to the one or more second autonomous vehicles.
6. The computer-implemented method of any of the preceding claims, wherein coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity comprises:
determining, by the one or more computing devices, a priority level associated with the one or more vehicle instructions;
determining, by the one or more computing devices, the one or more vehicle actions based at least in part on the priority level; and
providing, by the one or more computing devices, data indicative of the one or more determined vehicle actions to the one or more second autonomous vehicles.
7. The computer-implemented method of any of the preceding claims, wherein receiving the one or more communications from the remote computing system associated with the third-party entity comprises:
receiving, by the one or more computing devices, one or more encrypted communications from the remote computing system; and
decrypting, by the one or more computing devices, the one or more encrypted communications based on a predetermined key that is shared between the one or more computing devices and the remote computing system, the decrypted communications including the one or more vehicle instructions.
8. The computer-implemented method of any of the preceding claims, wherein the one or more vehicle instructions include at least one of: instructions for selecting the first autonomous vehicle, instructions for stopping the first autonomous vehicle, instructions for the first autonomous vehicle to relay information to the one or more second autonomous vehicles, and instructions for the first autonomous vehicle to provide information to the third- party entity.
9. The computer-implemented method of any of the preceding claims, wherein the one or more vehicle actions include at least one of: a stopping action, and a coordinated stopping action for stopping the first autonomous vehicle and the one or more second autonomous vehicles in the first convoy as a group.
10. A computing system for controlling an autonomous vehicle in response to vehicle instructions from a remote computing system, the system comprising:
one or more processors; and one or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the computing system to perform operations, the operations comprising:
controlling a first autonomous vehicle to provide a vehicle service, the first autonomous vehicle being associated with a first convoy that includes one or more second autonomous vehicles;
receiving one or more communications from a remote computing system associated with a third-party entity, the one or more communications including one or more vehicle instructions;
coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity; and
controlling the first autonomous vehicle to implement the one or more vehicle actions.
11. The computing system of claim 10 wherein coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity comprises:
providing data indicative of the one or more vehicle instructions to the one or more second autonomous vehicles;
determining a stopping action that corresponds to the one or more vehicle instructions;
removing, by the one or more computing devices, the first autonomous vehicle from the first convoy; and
controlling the first autonomous vehicle to implement the stopping action.
12. The computing system of any of the preceding claims wherein coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity comprises:
providing data indicative of the one or more vehicle instructions to a lead vehicle among the one or more second autonomous vehicles;
receiving one or more vehicle instructions from the lead vehicle; and implementing, by the one or more computing devices, the one or more vehicle instructions received from the lead vehicle.
13. The computing system of any of the preceding claims wherein coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity comprises:
authenticating an identity of the third-party entity based at least in part on the one or more communications;
determining the one or more vehicle actions based at least in part on the authentication; and
providing data indicative of the one or more determined vehicle actions to the one or more second autonomous vehicles.
14. The computing system of any of the preceding claims wherein coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity comprises:
determining a vehicle identifier associated with the one or more communications; determining the one or more vehicle actions based at least in part on the vehicle identifier; and
providing data indicative of the one or more determined vehicle actions to the one or more second autonomous vehicles.
15. The computing system of any of the preceding claims wherein coordinating with the one or more second autonomous vehicles to determine one or more vehicle actions to perform in response to receiving the one or more vehicle instructions from the third-party entity comprises:
determining a priority level associated with the one or more vehicle instructions; determining the one or more vehicle actions based at least in part on the priority level; and
providing data indicative of the one or more determined vehicle actions to the one or more second autonomous vehicles.
16. An autonomous vehicle, comprising:
one or more processors; and
one or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the autonomous vehicle to perform operations, the operations comprising:
controlling a first autonomous vehicle to provide a vehicle service, the first autonomous vehicle being selected from a fleet of vehicles controlled by a first entity to provide the vehicle service to a second entity;
receiving one or more communications from a remote computing system associated with a third entity, the one or more communications including one or more vehicle instructions;
determining one or more vehicle actions to perform in response to the one or more vehicle instructions; and
controlling the first autonomous vehicle to implement the one or more vehicle actions.
17. The autonomous vehicle of claim 16 wherein receiving the one or more communications from the remote computing system associated with the third entity comprises:
receiving, by the one or more computing devices, one or more encrypted communications from the remote computing system; and
decrypting, by the one or more computing devices, the one or more encrypted communications based on a predetermined key that is shared between the one or more computing devices and the remote computing system, the decrypted communications including the one or more vehicle instructions.
18. The autonomous vehicle of any of the preceding claims wherein determining the one or more vehicle actions to perform in response to the one or more vehicle instructions comprises:
determining an identity of the third entity based at least in part on the one or more communications; and
determining the one or more vehicle actions when the identity of the third entity is one of a plurality of predetermined third entities that can control the first autonomous vehicle.
19. The autonomous vehicle of claim 16 wherein determining the one or more vehicle actions to perform in response to the one or more vehicle instructions comprises: determining a vehicle identifier associated with the one or more communications; and determining the one or more vehicle actions based at least in part on the vehicle identifier.
20. The autonomous vehicle of any of the preceding claims wherein determining the one or more vehicle actions based at least in part on the vehicle identifier comprises: controlling the first autonomous vehicle to implement the one or more determined vehicle actions when the vehicle identifier corresponds to the first autonomous vehicle; and controlling the first autonomous vehicle to transmit the one or more vehicle instructions to a second autonomous vehicle when the vehicle identifier corresponds to the second autonomous vehicle.
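The decryption and coordination behavior recited in the claims above can be illustrated with a brief, hedged sketch. The Fernet cipher, the JSON message fields, and every identifier below are assumptions introduced for illustration; the claims do not specify any particular cipher, message format, or field names, and the mapping of branches to individual claims is only an interpretive gloss.

```python
import json

from cryptography.fernet import Fernet  # requires the third-party "cryptography" package


def handle_third_party_communication(encrypted_message, shared_key, own_vehicle_id, convoy_ids):
    """Decrypt a third-party communication and decide how the first autonomous vehicle responds."""
    # Decrypt using a predetermined key shared with the remote computing system
    # of the third-party entity (cf. claims 7 and 17).
    instructions = json.loads(Fernet(shared_key).decrypt(encrypted_message))

    target = instructions.get("vehicle_id")
    priority = instructions.get("priority", "normal")

    if target == own_vehicle_id:
        # Stop the addressed vehicle and drop it from the convoy (cf. claim 2),
        # with urgency influenced by the priority level (cf. claim 6).
        return {"action": "stop", "leave_convoy": True,
                "urgency": "immediate" if priority == "high" else "next_safe_location"}
    if target in convoy_ids:
        # Relay instructions addressed to another convoy member (cf. claim 20).
        return {"action": "relay", "relay_to": target, "instructions": instructions}
    # No specific target: coordinate a stop of the convoy as a group (cf. claim 9).
    return {"action": "coordinated_stop", "notify": convoy_ids}


# Example round trip with a made-up instruction payload:
key = Fernet.generate_key()
message = Fernet(key).encrypt(json.dumps(
    {"vehicle_id": "AV-002", "priority": "high", "instruction": "stop"}).encode())
print(handle_third_party_communication(message, key, "AV-001", ["AV-002", "AV-003"]))
```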
PCT/US2019/012857 2018-01-09 2019-01-09 Systems and methods for controlling an autonomous vehicle WO2019139957A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21210479.8A EP3989032B1 (en) 2018-01-09 2019-01-09 Systems and methods for controlling an autonomous vehicle
EP19703800.3A EP3721313B1 (en) 2018-01-09 2019-01-09 Systems and methods for controlling an autonomous vehicle

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US201862615206P 2018-01-09 2018-01-09
US62/615,206 2018-01-09
US201862620656P 2018-01-23 2018-01-23
US62/620,656 2018-01-23
US15/933,499 2018-03-23
US15/933,499 US11243547B2 (en) 2018-01-23 2018-03-23 Systems and methods for remote inspection of a vehicle
US15/980,324 US11215984B2 (en) 2018-01-09 2018-05-15 Systems and methods for controlling an autonomous vehicle
US15/980,324 2018-05-15

Publications (1)

Publication Number Publication Date
WO2019139957A1 true WO2019139957A1 (en) 2019-07-18

Family

ID=67218389

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/012857 WO2019139957A1 (en) 2018-01-09 2019-01-09 Systems and methods for controlling an autonomous vehicle

Country Status (1)

Country Link
WO (1) WO2019139957A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3098777A1 (en) * 2019-07-19 2021-01-22 Psa Automobiles Sa Method of insertion into a convoy of autonomous vehicles by a motor vehicle
US20210295620A1 (en) * 2020-03-20 2021-09-23 Beijing Idriverplus Information Technology Co., Ltd. Method and system for real-time and reliable autonomous vehicle fault diagnosis and protection

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016006672A1 (en) * 2015-06-04 2016-12-08 Scania Cv Ab Method and control unit for vehicle communication
WO2017048165A1 (en) * 2015-09-17 2017-03-23 Telefonaktiebolaget Lm Ericsson (Publ) Communication device, first radio node, second radio node, and methods therein, for determining whether to allow a first vehicle to overtake a vehicle platoon
WO2017148531A1 (en) * 2016-03-04 2017-09-08 Telefonaktiebolaget Lm Ericsson (Publ) Method and traffic control entity for controlling vehicle traffic


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19703800

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019703800

Country of ref document: EP

Effective date: 20200709