WO2019023704A1 - Fleet of robot vehicles for specialty product and service delivery - Google Patents


Info

Publication number
WO2019023704A1
WO2019023704A1 (PCT/US2018/044361)
Authority
WO
WIPO (PCT)
Prior art keywords
fleet
robot
goods
customer
services
Prior art date
Application number
PCT/US2018/044361
Other languages
French (fr)
Inventor
David Ferguson
Jiajun Zhu
Cosimo LEIPOLD
Original Assignee
Nuro, Inc.
Priority date
Filing date
Publication date
Priority claimed from US16/047,659 (US10860015B2)
Application filed by Nuro, Inc.
Priority to CA3070725A (CA3070725A1)
Priority to JP2020504119A (JP7236434B2)
Priority to CN201880048988.5A (CN110945451A)
Priority to EP18837696.6A (EP3659003A4)
Priority to US16/158,917 (US20190050808A1)
Priority to US16/158,982 (US10719805B2)
Priority to US16/159,047 (US11551278B2)
Priority to US16/158,889 (US11200613B2)
Priority to US16/159,016 (US10486640B2)
Priority to US16/158,963 (US11556970B2)
Priority to US16/158,940 (US11151632B2)
Priority to US16/176,462 (US20190064847A1)
Priority to US16/181,724 (US11574352B2)
Publication of WO2019023704A1
Priority to EP19761983.6A (EP3830777A1)
Priority to JP2021503926A (JP2021532480A)
Priority to CN201980047757.7A (CN112437934A)
Priority to PCT/US2019/043614 (WO2020028162A1)
Priority to CA3107444A (CA3107444A1)
Priority to CA3107746A (CA3107746A1)
Priority to EP19759770.1A (EP3830776A1)
Priority to EP19750228.9A (EP3830800B1)
Priority to PCT/US2019/043893 (WO2020028238A1)
Priority to JP2021503911A (JP2021532476A)
Priority to CN201980048134.1A (CN112424841A)
Priority to CA3107512A (CA3107512A1)
Priority to PCT/US2019/043897 (WO2020028241A1)
Priority to PCT/US2019/043887 (WO2020028235A1)
Priority to CN201980048683.9A (CN112470178A)
Priority to JP2021503912A (JP7365395B2)
Priority to US16/654,216 (US11222378B2)
Priority to US17/539,819 (US11645696B2)
Priority to US17/676,563 (US20220180419A1)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287 - Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0291 - Fleet control
    • G05D1/0297 - Fleet control by controlling means in a control room
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/08 - Logistics, e.g. warehousing, loading or distribution; Inventory or stock management

Definitions

  • the present application relates to autonomous vehicles.
  • Robots are being used for many purposes including warehouse inventory operations, household vacuuming robots, hospital delivery robots, sanitation robots, and military or defense applications.
  • This disclosure relates to an autonomous and/or semi-autonomous robot fleet comprising a plurality of robots, in particular robots for transporting or retrieving deliveries in either unstructured outdoor environments or closed environments.
  • a robot fleet comprising a plurality of robot vehicles operating autonomously and/or semi-autonomously and a fleet management module, associated with a central server, for coordination of the robot fleet; the fleet management module configured to coordinate the activity and positioning of each robot in the fleet, wherein the fleet is configured for transporting, delivering or retrieving goods or services and is capable of operating in unstructured open or closed environments; each robot in the fleet comprising: a power system; a conveyance system (e.g., a drive system with a propulsion engine, wheels, treads, wings, rotors, blowers, rockets, propellers, brakes, etc.); a navigation module for navigation in the unstructured open or closed environments (e.g., digital maps, HD maps, GPS); and a communication module configurable to receive, store and send data to the fleet management module, a user, and the robots in the fleet, related to at least user interactions and the robot fleet interactions, comprising: scheduled requests or orders; on-demand requests or orders; or a need for self-positioning of the robot fleet based on anticipated demand within the unstructured open or closed environments.
  • the unstructured open environment is a non-confined geographic region accessible by navigable pathways comprising: public roads; private roads; bike paths; open fields; open public lands; open private lands; pedestrian walkways; lakes; rivers; streams; or open airspace.
  • the closed environment is a confined, enclosed or semi-enclosed structure accessible by navigable pathways comprising: open areas or rooms within commercial architecture, with or without structures or obstacles therein; airspace within open areas or rooms within commercial architecture, with or without structures or obstacles therein; public or dedicated aisles; hallways; tunnels; ramps; elevators; conveyors; or pedestrian walkways.
  • the navigation module controls routing of the conveyance system of the robots in the fleet in the unstructured open or closed environments.
  • the communication to the user, to the robots in the fleet, between the robots of the fleet, and between the user and the robots in the fleet occurs via wireless transmission.
  • the user comprises a fleet manager; a sub-contracting vendor; a service provider; a customer; a business entity; an individual; or a third party.
  • the user's wireless transmission interactions and the robot fleet wireless transmission interactions occur via mobile application transmitted by an electronic device and forwarded to the communication module via: a central server; a fleet management module; and/or a mesh network.
  • the electronic device comprises: a phone; a personal mobile device; a personal digital assistant (PDA); a mainframe computer; a desktop computer; a laptop computer; a tablet computer; and/or wearable computing device comprising: a communication headset; smart glasses; a contact lens or lenses; a digital watch; a bracelet; a ring; jewelry; or a combination thereof.
  • each robot fleet is configured with a maximum speed range from 1.0 mph to 90.0 mph.
  • the plurality of securable compartments is humidity and temperature controlled for: hot goods, cold goods, wet goods, dry goods, or combinations or variants thereof.
  • the plurality of securable compartments is configurable for a plurality of goods.
  • Such configurations and goods comprise: bookshelves for books; thin drawers for documents; larger box-like drawers for packages, and sized compartments for vending machines, coffee makers, pizza ovens and dispensers.
  • the plurality of securable compartments is variably configurable based on: anticipated demands; patterns of behaviors; area of service; or types of goods to be transported.
  • the services comprise: subscription services; prescription services; marketing services; advertising services; notification services; a mobile marketplace; or requested, ordered or scheduled delivery services.
  • the scheduled delivery services include, by way of example, special repeat deliveries such as groceries, prescriptions, drinks, mail, documents, etc.
  • the services further comprise: the user receiving and returning the same or similar goods within the same interaction (e.g., signed documents); the user receiving one set of goods and returning a different set of goods within the same interaction (e.g., product replacements/returns, groceries, merchandise, books, recordings, videos, movies, payment transactions, etc.); or a third party user providing instruction and/or authorization to a goods or service provider to prepare, transport, deliver and/or retrieve goods to a principal user in a different location.
  • the services further comprise general services (e.g., picking up a user's dry cleaning, dropping off a user's dry cleaning, renting goods (such as tools, DVDs, etc.), sharing/borrowing goods from other users or businesses, etc.). Further still, it may be a general pickup service for items to be shipped, returned, or sent to other users/businesses, etc.
  • At least one robot in the fleet is further configured to process or manufacture goods.
  • the processed or manufactured goods comprise: beverages, with or without condiments (e.g., coffee, tea, carbonated drinks, etc.); a plurality of fast foods; or microwavable foods.
  • the robot fleet further comprises at least one robot having a digital display for curated content comprising advertisements (i.e., for both a specific user and the general public), including: services provided; marketing/promotion; region/location of areas served; customer details; local environment; lost, sought or detected people; public service announcements; date; time; or weather.
  • the positioning of robots can be customized based on: anticipated use, a pattern of historical behaviors, or specific goods being carried.
  • the robot fleet is fully-autonomous.
  • the robot fleet is semi-autonomous.
  • the robot fleet is controlled directly by the user.
  • a plurality of said autonomous or semi-autonomous robots within the fleet is operated on behalf of a third party vendor/service provider (e.g., a fleet managed by an owner, but providing a coffee service/experience for a third party vendor (i.e., Starbucks) with white label robots in the fleet).
  • a plurality of said autonomous robots within the fleet is further configured to be part of a sub-fleet comprising a sub-plurality of autonomous robots, wherein each sub-fleet is configured to operate independently or in tandem with multiple sub-fleets comprising two or more sub-fleets.
  • FIG. 1 is an exemplary view of an autonomous robot fleet, wherein each vehicle within a fleet or sub-fleet can be branded for an entity;
  • FIG. 2 is an exemplary ISO view of a robot vehicle, part of an autonomous robot fleet, illustrating securable compartments within the vehicle;
  • FIG. 3 is an exemplary front view of a robot vehicle, part of an autonomous robot fleet, shown in comparison to the height of an average person;
  • FIG. 4 is an exemplary right side view of a robot vehicle, part of an autonomous robot fleet, illustrating a configuration with two large side doors, each enclosing securable compartments;
  • FIG. 5 is an exemplary left side view of a robot vehicle, part of an autonomous robot fleet, shown in comparison to the height of an average person;
  • FIG. 6 is an exemplary rear view of a robot vehicle, part of an autonomous robot fleet
  • FIG. 7 is an exemplary ISO view of a robot vehicle, part of an autonomous robot fleet, illustrating an autonomous lunch delivery vehicle for any branded company;
  • FIG. 8 is an exemplary ISO view of a robot vehicle, part of an autonomous robot fleet, illustrating an autonomous pizza delivery vehicle for any branded company;
  • FIG. 9 is an exemplary ISO view of a robot vehicle, part of an autonomous robot fleet, illustrating an autonomous coffee delivery vehicle for any branded company;
  • FIG. 10 is an exemplary ISO view of a robot vehicle, part of an autonomous robot fleet, illustrating an autonomous evening/nighttime delivery vehicle for any branded company, comprising a lighted interior;
  • FIG. 11 is an exemplary flowchart representation of the logic for a fleet management control module associated with a central server for the robot fleet.
  • FIG. 12 is an exemplary flowchart representation of the logic flow from the Fleet Management Control Module through the robot processor to the various systems and modules of the robot.

DETAILED DESCRIPTION
  • This disclosure relates to a fully-autonomous and/or semi-autonomous robot fleet and, in particular, to robot vehicles for transporting or retrieving deliveries in either open unstructured outdoor environments or closed environments.
  • a robot fleet having robot vehicles operating fully-autonomously or semi-autonomously and a fleet management module for coordination of the robot fleet, where each robot within the fleet is configured for transporting, delivering or retrieving goods or services and is capable of operating in an unstructured open or closed environment.
  • Each robot can include a power system, a conveyance system, a navigation module, at least one securable compartment or multiple securable compartments to hold goods, a controller configurable to associate each of the securable compartments with an assignable customer, a customer group within a marketplace, or a provider and to provide entry when authorized, a communication module, and a processor configured to manage the conveyance system, the navigation module, the sensor system, the communication module and the controller.
  • autonomous includes fully-autonomous, semi-autonomous, and any configuration in which a vehicle can operate in a controlled manner for a period of time without human intervention.
  • the term “fleet,” “sub-fleet,” and like terms are used to indicate a number of land vehicles, watercraft or aircraft operating together or under the same ownership. In some embodiments the fleet or sub-fleet is engaged in the same activity. In some embodiments, the fleet or sub-fleet are engaged in similar activities. In some embodiments, the fleet or sub-fleet are engaged in different activities.
  • As used herein, the term “robot,” “robot vehicle,” “robot fleet,” “vehicle,” “all-terrain vehicle,” and like terms are used to indicate a mobile machine that transports cargo, items, and/or goods.
  • Typical vehicles include cars, wagons, vans, unmanned motor vehicles (e.g., tricycles, trucks, trailers, buses, etc.), unmanned railed vehicles (e.g., trains, trams, etc.), unmanned watercraft (e.g., ships, boats, ferries, landing craft, barges, rafts, etc.), aerial drones, unmanned hovercraft (air, land and water types), unmanned aircraft, and even including unmanned spacecraft.
  • the term “compartment” is used to indicate an internal bay of a robot vehicle that has a dedicated door at the exterior of the vehicle for accessing the bay, and also indicates an insert secured within the bay.
  • the term “sub-compartment” is used to indicate a subdivision or portion of a compartment.
  • the term “module” may be used herein to refer to a compartment and/or a sub-compartment.
  • the term "user,” “operator,” “fleet operator,” and like terms are used to indicate the entity that owns or is responsible for managing and operating the robot fleet.
  • the term “customer” and like terms are used to indicate the entity that requests the services provided by the robot fleet.
  • the term "provider,” “business,” “vendor,” “third party vendor,” and like terms are used to indicate an entity that works in concert with the fleet owner or operator to utilize the services of the robot fleet to deliver the provider's product from, and/or return the provider's product to, the provider's place of business or staging location.
  • As used herein, the term “server,” “computer server,” “central server,” “main server,” and like terms are used to indicate a computer or device on a network that manages the fleet resources, namely the robot vehicles.
  • the term “controller” and like terms are used to indicate a device that controls the transfer of data from a computer to a peripheral device and vice versa.
  • disk drives, display screens, keyboards, and printers all require controllers.
  • the controllers are often single chips.
  • the controller is commonly used for managing access to components of the robot such as the securable compartments.
  • a “mesh network” is a network topology in which each node relays data for the network. All mesh nodes cooperate in the distribution of data in the network. It can be applied to both wired and wireless networks. Wireless mesh networks can be considered a type of "Wireless ad hoc" network. Thus, wireless mesh networks are closely related to Mobile ad hoc networks (MANETs). Although MANETs are not restricted to a specific mesh network topology, Wireless ad hoc networks or MANETs can take any form of network topology.
  • Mesh networks can relay messages using either a flooding technique or a routing technique. With routing, the message is propagated along a path by hopping from node to node until it reaches its destination.
  • the network must allow for continuous connections and must reconfigure itself around broken paths, using self-healing algorithms such as Shortest Path Bridging.
  • Self-healing allows a routing-based network to operate when a node breaks down or when a connection becomes unreliable.
  • the network is typically quite reliable, as there is often more than one path between a source and a destination in the network. This concept can also apply to wired networks and to software interaction.
  • a mesh network whose nodes are all connected to each other is a fully connected network.
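
The routing and self-healing behavior described in the preceding bullets can be pictured with a small graph example. The sketch below is an editorial illustration (plain breadth-first search rather than Shortest Path Bridging); the node names and topology are hypothetical.

```python
from collections import deque

def shortest_path(mesh, src, dst):
    """Breadth-first search over an adjacency dict; returns a node list or None."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in mesh.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Hypothetical five-node mesh: every node relays data for its neighbors.
mesh = {
    "A": {"B", "C"}, "B": {"A", "D"}, "C": {"A", "D"},
    "D": {"B", "C", "E"}, "E": {"D"},
}
print(shortest_path(mesh, "A", "E"))          # e.g. ['A', 'B', 'D', 'E']

# "Self-healing": drop node B and route around the broken path.
degraded = {n: nbrs - {"B"} for n, nbrs in mesh.items() if n != "B"}
print(shortest_path(degraded, "A", "E"))      # ['A', 'C', 'D', 'E']
```
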
  • the term “module” and like terms are also used to indicate a self-contained hardware component of the central server, which in turn includes software modules.
  • a module is a part of a program. Programs are composed of one or more independently developed modules that are not combined until the program is linked. A single module can contain one or several routines, or sections of programs that perform a particular task.
  • the fleet management module includes software modules for managing various aspects and functions of the robot fleet.
  • the terms “processor” and “digital processing device” are used to indicate a central processing unit (CPU).
  • the CPU is the electronic circuitry within a computer that carries out the instructions of a computer program by performing the basic arithmetic, logical, control and input/output (I/O) operations specified by the instructions.
  • suitable digital processing devices include, by way of non-limiting examples, server computers, desktop computers, laptop computers, notebook computers, sub-notebook computers, netbook computers, netpad computers, set-top computers, handheld computers, Internet appliances, mobile smartphones, tablet computers, personal digital assistants, video game consoles, and vehicles.
  • Suitable tablet computers include those with booklet, slate, and convertible configurations, known to those of skill in the art.
  • the digital processing device includes an operating system configured to perform executable instructions.
  • the operating system is, for example, software, including programs and data, which manages the device's hardware and provides services for execution of applications.
  • suitable server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD®, Linux, Apple® Mac OS X Server®, Oracle® Solaris®, Windows Server®, and Novell® NetWare®.
  • suitable personal computer operating systems include, by way of non-limiting examples, Microsoft® Windows®, Apple® Mac OS X®, UNIX®, and UNIX- like operating systems such as GNU/Linux®.
  • the operating system is provided by cloud computing.
  • suitable mobile smart phone operating systems include, by way of non-limiting examples, Nokia® Symbian® OS, Apple® iOS®, Research In Motion® BlackBerry OS®, Google® Android®, Microsoft® Windows Phone® OS, Microsoft® Windows Mobile® OS, Linux®, and Palm® WebOS®.
  • the device includes a storage and/or memory device.
  • the storage and/or memory device is one or more physical apparatus used to store data or programs on a temporary or permanent basis.
  • the device is volatile memory and requires power to maintain stored information.
  • the device is non-volatile memory and retains stored information when the digital processing device is not powered.
  • the non-volatile memory includes flash memory.
  • the non-volatile memory includes dynamic random-access memory (DRAM).
  • the non-volatile memory includes ferroelectric random access memory (FRAM).
  • the non-volatile memory includes phase-change random access memory (PRAM).
  • the device is a storage device including, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, magnetic disk drives, magnetic tape drives, optical disk drives, and cloud computing based storage.
  • the storage and/or memory device is a combination of devices such as those disclosed herein.
  • the digital processing device includes a display to send visual information to a user.
  • the display is a cathode ray tube (CRT).
  • the display is a liquid crystal display (LCD).
  • the display is a thin film transistor liquid crystal display (TFT-LCD).
  • the display is an organic light emitting diode (OLED) display.
  • an OLED display is a passive-matrix OLED (PMOLED) or active-matrix OLED (AMOLED) display.
  • the display is a plasma display.
  • the display is a video projector.
  • the display is interactive (e.g., having a touch screen or a sensor such as a camera, a 3D sensor, a LiDAR, a radar, etc.) that can detect user interactions/gestures/responses and the like.
  • the display is a combination of devices such as those disclosed herein.
  • a robot fleet 100 as illustrated in FIG. 1, having robot vehicles 101, with each one operating fully-autonomously or semi-autonomously.
  • one exemplary configuration of a robot 101 is a vehicle configured for land travel, such as a small fully-autonomous (or semi-autonomous) automobile.
  • the exemplary fully-autonomous (or semi-autonomous) automobile is narrow (i.e., 2-5 feet wide), with low mass and a low center of gravity for stability, having multiple secure compartments assignable to one or more customers, retailers and/or vendors, and designed for moderate working speed ranges (i.e., 1.0-45.0 mph) to accommodate inner-city and residential driving speeds.
  • the land vehicle robot units in the fleet are configured with a maximum speed range from 1.0 mph to about 90.0 mph for high speed, intrastate or interstate driving.
  • Each robot in the fleet is equipped with onboard sensors 170 (e.g., cameras (running at a high frame rate, akin to video), LiDAR, radar, ultrasonic sensors, microphones, etc.) and internal computer processing to constantly determine where it can safely navigate, what other objects are around each robot and what it may do.
  • the robot fleet is fully-autonomous.
  • the robot fleet is semi-autonomous. In some embodiments, it may be necessary to have human interaction between the robot 101, the fleet operator 200, the provider 204 and/or the customer 202 to address previously unforeseen issues (e.g., a malfunction with the navigation module; provider inventory issues; unanticipated traffic or road conditions; or direct customer interaction after the robot arrives at the customer location).
  • the robot fleet 100 is controlled directly by the user 200. In some embodiments, it may be necessary to have direct human interaction between the robot 101 and/or the fleet operator 200 to address maintenance issues such as mechanical failure, electrical failure or a traffic accident.
  • the robot fleet is configured for land travel.
  • each robot land vehicle in the fleet is configured with a working speed range from 13.0 mph to 45.0 mph.
  • the land vehicle robot units in the fleet are configured with a maximum speed range from 13.0 mph to about 90.0 mph.
  • the robot fleet is configured for water travel as a watercraft and is configured with a working speed range from 1.0 mph to 45.0 mph.
  • the robot fleet is configured for hover travel as an over-land or over- water hovercraft and is configured with a working speed range from 1.0 mph to 60.0 mph.
  • the robot fleet is configured for air travel as an aerial drone or aerial hovercraft and is configured with a working speed range from 1.0 mph to 80.0 mph.
  • the autonomous robots within the fleet are operated on behalf of a third party vendor/service provider.
  • a fleet management service is established to provide a roving delivery service for a third party beverage/food provider (e.g., a coffee service/experience for a third party vendor (i.e., Starbucks)). It is conceived that the fleet management service would provide a sub-fleet of "white label" vehicles carrying the logo and products of that third party beverage/food provider to operate either fully-autonomously or semi-autonomously to provide this service.
  • the autonomous robots within the fleet are further configured to be part of a sub-fleet of autonomous robots, and each sub-fleet is configured to operate independently or in tandem with multiple sub-fleets having two or more sub-fleets (100-a, 100-b).
  • a package delivery service is configured to offer multiple levels of service such as “immediate dedicated rush service,” “guaranteed morning/afternoon delivery service,” or “general delivery service.”
  • a service provider could then have a dedicated sub-fleet of delivery vehicles for each type of service within their overall fleet of vehicles.
  • a third party has priority over a certain number of vehicles in the fleet. In so doing, they can guarantee a certain level of responsiveness. When they aren't using the vehicles, the vehicles are used for general services within the fleet (e.g., other third parties).
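
As a rough illustration of the priority arrangement described above, where a third party holds priority over a certain number of vehicles that otherwise serve general requests, one could track a reserved headroom per sub-fleet. The class, names and counts below are assumptions, not the disclosed implementation.

```python
class PrioritySubFleet:
    """Reserve up to `reserved` vehicles for a priority partner; the rest
    remain available for general service requests."""

    def __init__(self, vehicle_ids, reserved):
        self.idle = set(vehicle_ids)
        self.reserved = reserved          # vehicles the partner may still claim

    def dispatch(self, priority_request=False):
        if not self.idle:
            return None                   # nothing available
        if priority_request:
            self.reserved = max(self.reserved - 1, 0)
            return self.idle.pop()
        # General requests must leave the reserved headroom untouched.
        if len(self.idle) > self.reserved:
            return self.idle.pop()
        return None

    def release(self, vehicle_id, was_priority=False):
        self.idle.add(vehicle_id)
        if was_priority:
            self.reserved += 1

fleet = PrioritySubFleet(["R1", "R2", "R3"], reserved=1)
print(fleet.dispatch())                        # general order -> a vehicle
print(fleet.dispatch())                        # general order -> a vehicle
print(fleet.dispatch())                        # general order -> None (headroom kept)
print(fleet.dispatch(priority_request=True))   # partner order -> last vehicle
```
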
  • the robot fleet is controlled directly by the user.
  • each robot within the fleet is configurable to allow for direct control of the robot's processor to override the conveyance and sensor systems (i.e., cameras, etc.) by a fleet operator to allow for the safe return of the vehicle to a base station for repair.
  • a fleet of vehicles for a transportation or delivery service that includes any set of fully human-driven vehicles, semi-autonomous vehicles, fully-autonomous vehicles, vehicles operated remotely by human drivers, and/or any vehicle that is a combination/hybrid of these.
  • the system can choose to dispatch an appropriate type of vehicle based on the specific requirements of that particular transaction. This can be based on distance, locations, customers' preferences, and/or weather conditions, among other factors.
  • the system may include a portal for a business to call, schedule, and monitor a delivery, and also a routing mechanism to find the best paths for all the vehicles on the system.
  • the fleet management module receives an order of one or more goods either directly from the customer or from the customer via the central server.
  • the parameters for the order are determined.
  • the parameters may include the customer's preference for service providers, the type of vehicle needed to perform the delivery, care instructions for the one or more goods, and/or size and weight of the one or more goods.
  • a vehicle is selected from a fleet of vehicles to perform the delivery of the one or more goods to the customer based on the determined parameters.
  • a service provider is selected to fulfill the order based on the determined parameters, and the order is transmitted to the selected service provider.
  • the determined parameters may specify a particular service provider.
  • a message is sent to the selected vehicle to obtain the one or more goods from the service provider and deliver the one or more goods to the customer.
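
The order flow in the preceding bullets (receive an order, determine its parameters, select a vehicle and a service provider, then message the vehicle to collect and deliver the goods) could be sketched roughly as follows. All field names and the closest-suitable-vehicle rule are illustrative assumptions, not the disclosed implementation.

```python
from dataclasses import dataclass
import math

@dataclass
class Vehicle:
    vehicle_id: str
    vehicle_type: str      # e.g. "refrigerated", "standard"
    capacity_kg: float
    location: tuple        # (lat, lon), assumed

def _distance(a, b):
    # Crude planar distance; a real system would use road-network routing.
    return math.hypot(a[0] - b[0], a[1] - b[1])

def select_vehicle(fleet, order):
    """Pick the closest idle vehicle that satisfies the order parameters."""
    candidates = [
        v for v in fleet
        if v.vehicle_type == order["vehicle_type"]
        and v.capacity_kg >= order["weight_kg"]
    ]
    if not candidates:
        return None
    return min(candidates, key=lambda v: _distance(v.location, order["provider_location"]))

def dispatch(fleet, order):
    vehicle = select_vehicle(fleet, order)
    if vehicle is None:
        return {"status": "no_vehicle_available"}
    # In the described system, a message would now be sent to the selected
    # vehicle to obtain the goods from the provider and deliver to the customer.
    return {
        "status": "dispatched",
        "vehicle": vehicle.vehicle_id,
        "pickup": order["provider_location"],
        "dropoff": order["customer_location"],
    }

fleet = [
    Vehicle("R1", "standard", 50.0, (0.0, 0.0)),
    Vehicle("R2", "refrigerated", 80.0, (1.0, 1.0)),
]
order = {
    "vehicle_type": "refrigerated",
    "weight_kg": 12.0,
    "provider_location": (1.2, 0.9),
    "customer_location": (2.0, 2.0),
}
print(dispatch(fleet, order))
```
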
  • a method of providing services using a fleet of mixed vehicles includes receiving a request for a service, determining parameters for the service, selecting a vehicle from the fleet of mixed vehicles to perform at least a portion of the service based on the determined parameters, and transmitting a message to the selected vehicle to perform at least a portion of the service.
  • the fleet of mixed vehicles includes at least one of a human-driven vehicle, a semi-autonomous vehicle, a fully autonomous vehicle, or a vehicle remotely operated by a human.
  • the parameters include distance, location, customer's preference, vehicle type, size of a requested good, weight of a requested good, or weather or road conditions at or near the location of the customer receiving the service.
  • the service includes transporting or delivering a good or product.
  • the method includes receiving a message that the selected vehicle has completed performing the requested service, and identifying the selected vehicle as being available for performing another requested service. In various embodiments, the method includes determining similar requests for services from multiple customers located near each other, and determining a path to deliver the services to the multiple customers using the selected vehicle.
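
For the grouping step mentioned above, where similar requests from customers located near each other are served by one vehicle along a single path, a simple proximity grouping with a nearest-neighbor ordering might look like the sketch below; the radius and distance metric are assumptions for illustration.

```python
import math

def cluster_nearby(requests, radius=1.0):
    """Greedily group drop-off points that lie within `radius` of a group seed."""
    groups = []
    for point in requests:
        for group in groups:
            if math.dist(point, group[0]) <= radius:
                group.append(point)
                break
        else:
            groups.append([point])
    return groups

def nearest_neighbor_route(start, stops):
    """Order stops by repeatedly visiting the closest remaining drop-off."""
    route, remaining, current = [], list(stops), start
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    return route

drop_offs = [(0.1, 0.2), (0.3, 0.1), (5.0, 5.1), (0.2, 0.4)]
groups = cluster_nearby(drop_offs, radius=1.0)
print(groups)                                   # near-origin cluster plus (5.0, 5.1)
print(nearest_neighbor_route((0.0, 0.0), groups[0]))
```
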
  • an integrated system that enables a smaller, sidewalk-friendly autonomous robot to reside inside a larger, on-road autonomous vehicle and carry a package from the on-road vehicle to the customer's front door/drop-box.
  • the main autonomous vehicle can travel to the curbside of the destination address, and the sub-robot vehicle will complete the journey to the destination and deliver the package to the door or to a drop-box.
  • the autonomous sub-robot vehicle can receive its destination through either communication between the autonomous robot vehicle and the sub-robot vehicle, or through communication with a central server.
  • an autonomous robot vehicle includes a first land conveyance system configured to travel on vehicle roadways, a navigation system configured to navigate to a destination location, an exterior housing, and a sub-robot vehicle carried within the exterior housing while the first land conveyance system autonomously travels on the vehicle roadways to the destination location.
  • the sub-robot vehicle includes a second land conveyance system configured to travel on pedestrian walkways, at least one module configured to store customer items where the at least one module includes at least one compartment or sub-compartment, at least one processor, and a memory storing instructions. The instructions, when executed by the at least one processor, cause the sub-robot vehicle to autonomously control the second conveyance system to exit the exterior housing and travel the pedestrian walkways to a customer pickup location.
  • a method for autonomous robot vehicle delivery includes navigating via a navigation system configured to navigate to a destination location, autonomously traveling via a first land conveyance system on vehicle roadways to the destination location, and carrying a sub-robot vehicle within an exterior housing, where the sub-robot vehicle includes a second land conveyance system configured to travel on pedestrian walkways, and at least one module configured to store customer items.
  • the method includes instructing the sub-robot vehicle to exit the exterior housing and autonomously travel, via the second land conveyance system, the pedestrian walkways to a customer pickup location.
  • the customer pickup location includes a front door.
  • the destination includes a street curb near the customer pick up location.
  • the destination is at least one of a securable drop-box, a residential address, or a commercial address.
  • the sub-robot vehicle receives an item or items corresponding to a purchase order.
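
One way to picture the curb-to-door handoff described in this section: the road vehicle navigates to the curbside, then passes the final pedestrian-walkway destination to the carried sub-robot (or the sub-robot receives it from the central server). The classes and message shapes below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SubRobot:
    robot_id: str
    destination: tuple = None   # set via handoff from the carrier or the central server

    def receive_destination(self, destination):
        self.destination = destination

    def deliver(self):
        # Placeholder for traveling the pedestrian walkway and releasing the package.
        return f"{self.robot_id}: delivered at door/drop-box {self.destination}"

@dataclass
class RoadVehicle:
    vehicle_id: str
    sub_robot: SubRobot
    curb_location: tuple = None

    def drive_to_curb(self, curbside):
        self.curb_location = curbside   # road-network navigation omitted

    def hand_off(self, door_location):
        # The sub-robot may alternatively receive this from the central server.
        self.sub_robot.receive_destination(door_location)
        return self.sub_robot.deliver()

carrier = RoadVehicle("AV-1", SubRobot("SR-1"))
carrier.drive_to_curb(curbside=(37.77, -122.41))
print(carrier.hand_off(door_location=(37.7701, -122.4099)))
```
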
  • the unstructured open environment is a non-confined geographic region accessible by navigable pathways, including, for example, public roads, private roads, bike paths, open fields, open public lands, open private lands, pedestrian walkways, lakes, rivers or streams.
  • the closed environment is a confined, enclosed or semi-enclosed structure accessible by navigable pathways, including, for example, open areas or rooms within commercial architecture, with or without structures or obstacles therein, airspace within open areas or rooms within commercial architecture, with or without structures or obstacles therein, public or dedicated aisles, hallways, tunnels, ramps, elevators, conveyors, or pedestrian walkways.
  • the unstructured open environment is a non-confined airspace or even near-space environment which includes all main layers of the Earth's atmosphere including the troposphere, the stratosphere, the mesosphere, the thermosphere and the exosphere.
  • the navigation module controls routing of the conveyance system of the robots in the fleet in the unstructured open or closed environments.
  • the fleet includes a fleet management module 120 (associated with a central server) for coordination of the robot fleet 100 and assignment of tasks for each robot 101 in the fleet.
  • the fleet management module coordinates the activity and positioning of each robot in the fleet.
  • the fleet management module also communicates with providers/vendors/businesses and customers to optimize behavior of the entire system.
  • the fleet management module works in coordination with a central server 110, typically located in a central operating facility owned or managed by the fleet owner 200.
  • a request is sent to a main server 110 (typically located at the fleet owner's or fleet manager's location), which then communicates with the fleet management module 120.
  • the fleet management module then relays the request to the appropriate provider 204 of the service (e.g., restaurant, delivery service, vendor or retailer) and an appropriate robot or robots 101 in the fleet.
  • the best appropriate robot(s) in the fleet, within the geographic region and typically closest to the service provider, is then assigned the task, and the provider of the service 204 then interacts with that robot 101 at their business (e.g., loading it with goods, if needed).
  • the robot then travels to the customer 202 and the customer interacts with the robot to retrieve their goods or service (e.g., the goods ordered).
  • An interaction can include requesting the robot to open its compartment 102, 104 through the customer's app or through a user interface on the robot itself (using, e.g., RFID reader and customer phone, a touchpad, a keypad, voice commands, vision-based recognition of the person, etc.).
  • Upon completion of the delivery (or retrieval, if appropriate), the robot reports completion of the assignment and reports back to the fleet management module for re-assignment.
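
The request-to-completion loop just described, in which an assigned robot is loaded at the provider, travels to the customer, and then reports back to the fleet management module for re-assignment, can also be viewed as a small state machine. The state names and transitions below are an editorial sketch, not the disclosed control logic.

```python
from enum import Enum, auto

class RobotState(Enum):
    IDLE = auto()
    ASSIGNED = auto()
    LOADING_AT_PROVIDER = auto()
    EN_ROUTE_TO_CUSTOMER = auto()
    AWAITING_CUSTOMER = auto()
    REPORTING_COMPLETE = auto()

# Allowed transitions, mirroring the flow in the text above.
TRANSITIONS = {
    RobotState.IDLE: {RobotState.ASSIGNED},
    RobotState.ASSIGNED: {RobotState.LOADING_AT_PROVIDER},
    RobotState.LOADING_AT_PROVIDER: {RobotState.EN_ROUTE_TO_CUSTOMER},
    RobotState.EN_ROUTE_TO_CUSTOMER: {RobotState.AWAITING_CUSTOMER},
    RobotState.AWAITING_CUSTOMER: {RobotState.REPORTING_COMPLETE},
    RobotState.REPORTING_COMPLETE: {RobotState.IDLE},   # back to the pool for re-assignment
}

def advance(state, next_state):
    if next_state not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state.name} -> {next_state.name}")
    return next_state

state = RobotState.IDLE
for nxt in (RobotState.ASSIGNED, RobotState.LOADING_AT_PROVIDER,
            RobotState.EN_ROUTE_TO_CUSTOMER, RobotState.AWAITING_CUSTOMER,
            RobotState.REPORTING_COMPLETE, RobotState.IDLE):
    state = advance(state, nxt)
    print(state.name)
```
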
  • the fleet management module 120 handles coordination of the robot fleet 100 and assignment of tasks for each robot 101 in the fleet.
  • the fleet management module coordinates the activity and positioning of each robot in the fleet.
  • the fleet management module also communicates with vendors/businesses 204 and customers 202 to optimize behavior of the entire system. It does this by utilizing the robot's processor 125 to process the various inputs and outputs from each of the robot's systems and modules, including: the conveyance system 130, the power system 135, the navigation module 140, the sensor system 170, 175, the communication module 160, and the controller 150, to effectively manage and coordinate the various functions of each robot in the fleet.
  • the robot may be requested for a pick-up of an item (e.g., a document) with the intent of delivery to another party.
  • the fleet management module would assign the robot to arrive at a given location, assign a securable compartment for receipt of the item, confirm receipt from the first party to the fleet management module, then proceed to the second location where an informed receiving party would recover the item from the robot using an appropriate PIN or other recognition code to gain access to the secure compartment.
  • the robot would then report completion of the assignment and report back to the fleet management module for re-assignment.
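
A minimal sketch of the secure-compartment access step described above, where the receiving party uses a PIN or other recognition code to open an assigned compartment. Generating a one-time code and storing only its hash are editorial assumptions.

```python
import hashlib
import secrets

class SecurableCompartment:
    def __init__(self, compartment_id):
        self.compartment_id = compartment_id
        self._code_hash = None
        self.locked = True

    def assign(self):
        """Generate a one-time access code for the receiving party; store only its hash."""
        code = f"{secrets.randbelow(10**6):06d}"
        self._code_hash = hashlib.sha256(code.encode()).hexdigest()
        return code        # delivered to the recipient out of band (e.g., via the app)

    def open_with(self, code):
        if self._code_hash and hashlib.sha256(code.encode()).hexdigest() == self._code_hash:
            self.locked = False
            self._code_hash = None      # one-time use
            return True
        return False

bay = SecurableCompartment("C-102")
pin = bay.assign()
print(bay.open_with("wrong!") or "rejected")
print(bay.open_with(pin) and "opened")
```
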
  • Each robot vehicle 101 in the fleet includes a conveyance system 130 (e.g., a drive system with a propulsion engine, wheels, treads, wings, rotors, blowers, rockets, propellers, brakes, etc.).
  • the robot fleet is configurable for land, water or air.
  • Typical vehicles include cars, wagons, vans, unmanned motor vehicles (e.g., tricycles, trucks, trailers, buses, etc.), unmanned railed vehicles (e.g., trains, trams, etc.), unmanned watercraft (e.g., ships, boats, ferries, landing craft, barges, rafts, etc.), aerial drones, unmanned hovercraft (air, land, and water types), unmanned aircraft, and unmanned spacecraft.
  • a robot land vehicle 101 is configured with a traditional 4-wheeled automotive configuration comprising conventional steering and braking systems.
  • the drive train is configurable for standard 2-wheel drive or 4-wheel all-terrain traction drive.
  • the propulsion system (engine) is configurable as a gas engine, a turbine engine, an electric motor and/or a hybrid gas/electric engine.
  • the robot could be configured with an auxiliary solar power system 135 to provide back-up emergency power or power for minor low-power sub-systems.
  • the robot fleet is configured for water travel as a watercraft with a propulsion system (engine) that is configurable as a gas engine, a turbine engine, an electric motor and/or a hybrid gas/electric engine and is further configured with a propeller.
  • the robot fleet is configured for hover travel as an over-land or over-water hovercraft or an air-cushion vehicle (ACV) and is configured with blowers to produce a large volume of air below the hull that is slightly above atmospheric pressure.
  • the propulsion system (engine) is configurable as a gas engine, a turbine engine, an electric motor and/or a hybrid gas/electric engine.
  • the robot fleet is configured for air travel as an aerial drone or aerial hovercraft and is configured with wings, rotors, blowers, rockets, and/or propellers and an appropriate brake system.
  • the propulsion system (engine) is configurable as a gas engine, a turbine engine, an electric motor and/or a hybrid gas/electric engine.
  • each robot of the robot fleet is configured with one or more power sources, which include the power system 135 (e.g., battery, solar, gasoline, propane, etc.).
  • Each robot in the fleet further includes a navigation module 140 for navigation in the unstructured open or closed environments (e.g., digital maps, HD maps, GPS, etc.).
  • the fleet 100 relies on maps generated by the user, operator, or fleet operator, specifically created to cover the intended environment where the robot is configured to operate. These maps would then be used for general guidance of each robot in the fleet, which would augment this understanding of the environment by using a variety of on-board sensors such as cameras, LiDAR, altimeters or radar to confirm its relative geographic position and elevation.
  • the fleet of robots uses internal maps to provide information about where they are going and the structure of the road environment (e.g., lanes, etc.) and combine this information with onboard sensors (e.g., cameras, LiDAR, radar, ultrasound, microphones, etc.) and internal computer processing to constantly determine where they can safely navigate, what other objects are around each robot and what they may do.
  • the fleet incorporates on-line maps to augment internal maps. This information is then combined to determine a safe, robust trajectory for the robot to follow and this is then executed by the low level actuators on the robot.
  • the fleet relies on a global positioning system (GPS) that allows land, sea, and airborne users to determine their exact location, velocity, and time 24 hours a day, in all weather conditions, anywhere in the world.
  • the fleet of robots will use a combination of internal maps, sensors and GPS systems to confirm its relative geographic position and elevation.
  • the autonomous fleet is strategically positioned throughout a geographic region in anticipation of a known demand.
  • a user 200 and/or a vendor 204 can anticipate demand for robot services by storing data concerning how many orders (and what type of orders) are made at particular times of day from different areas of the region. This can be done for both source (e.g., restaurants, grocery stores, general businesses, etc.) and destination (e.g., customer, other businesses, etc.). Then, for a specific current day and time, this stored data is used to determine what the optimal location of the fleet is given the expected demand. More concretely, the fleet can be positioned to be as close as possible to the expected source locations, anticipating these source locations will be the most likely new orders to come into the system. Even more concretely, it is possible to estimate the number of orders from each possible source in the next hour and weight each source location by this number. Then one can position the fleet so that the fleet optimally covers the weighted locations based on these numbers.
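
The demand-weighted positioning described above, i.e., estimating orders per source location for the next hour, weighting each source by that estimate, and positioning the fleet to cover the weighted locations, could be approximated with a weighted k-means over source coordinates, as in the rough sketch below. The sample data and the choice of k-means are illustrative assumptions.

```python
import math
import random

def weighted_kmeans(points, weights, k, iterations=20, seed=0):
    """Place k staging positions to cover demand-weighted source locations."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iterations):
        buckets = [[] for _ in range(k)]
        for p, w in zip(points, weights):
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            buckets[i].append((p, w))
        for i, bucket in enumerate(buckets):
            total = sum(w for _, w in bucket)
            if total:
                centers[i] = (
                    sum(p[0] * w for p, w in bucket) / total,
                    sum(p[1] * w for p, w in bucket) / total,
                )
    return centers

# Hypothetical sources (restaurants, grocery stores) with expected orders next hour.
sources = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 4.8), (2.5, 2.5)]
expected_orders = [8, 6, 10, 9, 1]
print(weighted_kmeans(sources, expected_orders, k=2))
```
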
  • the positioning of robots can be customized based on: anticipated use, a pattern of historical behaviors, or specific goods being carried.
  • each robot is equipped with a sensor system 170, which includes at least a minimum number of onboard sensors (e.g., cameras (for example, those running at a high frame rate akin to video), LiDAR, radar, ultrasonic sensors, microphones, etc.) and internal computer processing 125 to constantly determine where it can safely navigate, what other objects are around each robot, and what it may do within its immediate surroundings.
  • the robots of the robot fleet further include conveyance system sensors 175 configured to: monitor drive mechanism performance (e.g., the propulsion engine); monitor power system levels 135 (e.g., battery, solar, gasoline, propane, etc.); or monitor drive train performance (e.g., transmission, tires, brakes, rotors, etc.).
  • a system that processes sensor data using an on-board computer inside the autonomous vehicle and sends out only relevant data to a central server or to remote drivers ("teleoperators") during a remote operation mode, in order to reduce the bandwidth and amount of data transferred.
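
For the bandwidth reduction described above, where on-board processing sends only relevant data to the central server or teleoperators, an on-board filter might rate-limit and forward only frames flagged as relevant, as in this hypothetical sketch; the relevance test and rate cap are assumptions.

```python
import json
import zlib

def is_relevant(frame):
    """Assumed relevance test: forward frames with detections or low perception confidence."""
    return bool(frame["detections"]) or frame["confidence"] < 0.6

def uplink(frames, max_hz=2.0):
    """Keep at most `max_hz` relevant frames per second; compress before sending."""
    sent, last_t = [], float("-inf")
    for frame in frames:
        if is_relevant(frame) and frame["t"] - last_t >= 1.0 / max_hz:
            sent.append(zlib.compress(json.dumps(frame).encode()))
            last_t = frame["t"]
    return sent

frames = [
    {"t": 0.0, "detections": ["pedestrian"], "confidence": 0.9},
    {"t": 0.1, "detections": ["pedestrian"], "confidence": 0.9},   # dropped: too soon
    {"t": 0.7, "detections": [], "confidence": 0.4},               # kept: low confidence
    {"t": 1.0, "detections": [], "confidence": 0.95},              # dropped: not relevant
]
print(len(uplink(frames)), "frames uplinked instead of", len(frames))
```
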
  • Each robot in the fleet further includes a communication module 160 configurable to receive, store and send data to and from the fleet management module 120, to and from a user, and to and from the robots in the fleet 100.
  • the data is related to at least user interactions and the robot fleet interactions, including, for example, scheduled requests or orders, on-demand requests or orders, or a need for self-positioning of the robot fleet based on anticipated demand within the unstructured open or closed environments.
  • each robot in the fleet includes at least one communication module configurable to receive, store and transmit data, and to store that data to a memory device, for future data transfer or manual download.
  • each business 204 and customer 202 has their own app/interface to communicate with the fleet operator 200 (e.g., "Nuro customer app” for customers on their phone, "Nuro vendor app” for businesses on a tablet or phone or their internal computer system, etc.).
  • the communication to the user and the robots in the fleet, between the robots of the fleet, and between the user and the robots in the fleet occurs via wireless transmission.
  • the user's wireless transmission interactions and the robot fleet wireless transmission interactions occur via mobile application transmitted by an electronic device and forwarded to the communication module via: a central server, a fleet management module, and/or a mesh network.
  • one preferred method of communication is to use cellular communication between the fleet manager and the fleet of robots (e.g., 3G, 4G, 5G, or the like).
  • the communication between the fleet control module and the robots could occur via satellite communication systems.
  • a customer uses an app (either on a cellphone, laptop, tablet, computer or any interactive device) to request a service (e.g., an on-demand food order or for a mobile marketplace robot to come to them).
  • the electronic device includes: a phone, a personal mobile device, a personal digital assistant (PDA), a mainframe computer, a desktop computer, a laptop computer, a tablet computer, and/or wearable computing device such as a communication headset, smart glasses, a contact lens or lenses, a digital watch, a bracelet, a ring, jewelry, or a combination thereof.
  • the present disclosure includes a system that stores the user's default location and payment method, and allows the user to summon an autonomous vehicle and/or buy goods and services via a smartphone app or a website by clicking one button.
  • a customer to transact with an autonomous vehicle by using a single button of an app and/or website.
  • a customer using the app and/or website logs in and sets a default location and default payment method. This information is associated with the single button of the app and/or website.
  • the single button can cause the autonomous vehicle to deliver an item or items to the customer, such as an item or items that the customer ordered.
  • an autonomous vehicle management system includes a database configured to store information of a customer where the information includes a default location, a communication system configured to communicate with an autonomous vehicle and with a device of the customer where the device includes a display screen having a button that is associated with the information of the customer, at least one processor, and a memory storing instructions.
  • the instructions, when executed by the processor(s), cause the autonomous vehicle management system to receive an indication via the communication system that the button on the device of the customer has been selected, access in the database the default location of the customer, and instruct the autonomous vehicle to travel to the default location based on the indication.
  • an apparatus for summoning an autonomous vehicle includes a communication device configured to communicate with an autonomous vehicle management system, a display screen, at least one processor, and a memory storing instructions.
  • the instructions, when executed by the processor(s), cause the apparatus to communicate via the communication device information of a customer to the autonomous vehicle management system where the information includes a default location of the customer, display on the display screen a button associated with the default location, communicate to the autonomous vehicle management system via the communication device an indication that the button has been selected, and receive, in response to communicating the indication that the button has been selected, an indication from the autonomous vehicle management system that an autonomous vehicle will be dispatched to the default location.
  • the information of the customer includes a payment account.
  • the button is associated with a payment account, and selection of the button causes a charge to the payment account.
  • the button is not associated with a destination confirmation screen, such that selection of the button causes an autonomous vehicle to travel to the default location without a destination confirmation screen.
  • the button is associated with a purchase order and prior to the autonomous vehicle traveling to the default destination, the vehicle receives an item or items corresponding to the purchase order.
  • a customer can place items into a basket, and with one click, the customer can check out and the system dispatches an autonomous vehicle.
  • the system does not need to ask the customer for the customer's location (which can be the customer's current phone location or a saved default location) or the customer's payment method (which can be a saved payment method or, for example, Apple® Pay).
  • the system can also change the destination location in real time if the customer is on the move.
  • the customer does not need to purchase any items and can summon an autonomous vehicle with one click, and the autonomous vehicle will travel to the customer.
  • a summoned autonomous vehicle can be pre-stocked with products that the customer can buy directly from the autonomous vehicle, or the summoned autonomous vehicle can pick up items from the customer for return purposes, for selling an item to someone else, or for another reason.
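
The single-button transaction described in the preceding bullets, using a stored default location and saved payment method so that one tap dispatches a vehicle without a confirmation screen, could look roughly like the sketch below. The class names, fields and in-memory store are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class CustomerProfile:
    customer_id: str
    default_location: tuple
    payment_token: str          # saved payment method (e.g., a tokenized card)

class SummonService:
    def __init__(self):
        self.profiles = {}       # stand-in for the customer database

    def register(self, profile):
        self.profiles[profile.customer_id] = profile

    def handle_button_press(self, customer_id, live_location=None, order_total=0.0):
        """One tap: charge the saved payment method and dispatch to the stored
        default location (or the phone's current location, if provided)."""
        profile = self.profiles[customer_id]
        destination = live_location or profile.default_location
        charge_id = self._charge(profile.payment_token, order_total)
        return {
            "dispatched": True,
            "destination": destination,     # may be updated in real time if the customer moves
            "charge_id": charge_id,
        }

    def _charge(self, payment_token, amount):
        # Placeholder for a payment-processor call.
        return f"charge:{payment_token}:{amount:.2f}"

service = SummonService()
service.register(CustomerProfile("cust-42", (37.77, -122.41), "tok_abc"))
print(service.handle_button_press("cust-42", order_total=14.50))
```
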
  • the user includes a fleet manager, a sub-contracting vendor, a service provider, a customer, a business entity, an individual, or a third party.
  • the services include: subscription services, prescription services, marketing services, advertising services, notification services, or requested, ordered or scheduled delivery services.
  • the scheduled delivery services include, by way of example, special repeat deliveries such as groceries, prescriptions, drinks, mail, documents, etc.
  • the services further include: the user receiving and returning the same or similar goods within the same interaction (e.g., signed documents); the user receiving one set of goods and returning a different set of goods within the same interaction (e.g., product replacements/returns, groceries, merchandise, books, recordings, videos, movies, payment transactions, etc.); or a third party user providing instruction and/or authorization to a goods or service provider to prepare, transport, deliver and/or retrieve goods to a principal user in a different location.
  • the services further include: advertising services, land survey services, patrol services, monitoring services, traffic survey services, signage and signal survey services, architectural building or road infrastructure survey services.
  • At least one robot is further configured to process or manufacture goods.
  • the processed or manufactured goods include: beverages, with or without condiments (such as coffee, tea, carbonated drinks, etc.); various fast foods; or microwavable foods.
  • the robots within the fleet are equipped for financial transactions. These can be accomplished using known transaction methods such as debit/credit card readers or the like.
  • a grocery delivery system via a fleet of autonomous vehicles is disclosed.
  • the vehicle can deliver fresh produce and other grocery items.
  • the vehicles may or may not include temperature control inside the compartments.
  • the temperature inside an autonomous vehicle can be controlled based on the products carried inside the vehicle.
  • the temperature can be set manually or automatically based on the system's or vehicle's knowledge of what it is carrying, and can include heating or cooling.
  • the system enables transporting groceries between residential and/or industrial locations with one or more autonomous vehicles.
  • the autonomous vehicles include any number of compartments for storing, preserving, heating, and/or cooling such groceries.
  • Such compartments include one or more temperature and/or humidity controlled compartments for preserving food and/or delivering food at predetermined temperatures and/or humidity ranges.
  • Groceries include produce, frozen foods, hot foods, wet foods, dry foods, and related consumer products such as medication, hygiene products, toys, magazines, cards, and/or other specialty items.
  • An autonomous robotic vehicle includes a conveyance system configured to autonomously drive the autonomous robotic vehicle between at least one grocery storage location and at least one delivery location, and a compartment coupled to the conveyance system and configured to receive at least one grocery item stored at the at least one grocery storage location.
  • the compartment includes a temperature control module configured to maintain the compartment within a predetermined temperature range to provide temperature control for the at least one grocery item as the conveyance system drives between the at least one grocery storage location and the at least one delivery location.
  • an autonomous vehicle management system for delivering groceries includes a database configured to store a list of groceries for delivery by an autonomous vehicle and configured to store information of a delivery location and of at least one grocery storage location, a communication system configured to communicate with a computing device to enable at least one grocery item to be selected from the list of groceries stored on the database, at least one processor, and a memory storing instructions.
  • the instructions, when executed by the at least one processor, cause the autonomous vehicle management system to access in the database the at least one grocery item selected, instruct the storage location to load the at least one grocery item selected on the autonomous vehicle, and instruct the autonomous vehicle to travel to the delivery location when the at least one grocery item selected is loaded in the autonomous vehicle.
  • the autonomous vehicle and/or fleet can include specific heating, cooling, and/or humidity modules, manual or automatic temperature changes, vehicles that deliver only certain types of groceries (e.g., cold or hot), multiple vehicles coordinating with one another, scheduling of deliveries, returning items or whole deliveries, changes in orders, customer verification, multiple location drop off, and/or bulk ordering, among other features.
  • a system that allows a user to take a picture of a product she would like to purchase and receive the product via an autonomous delivery vehicle.
  • the picture taken may or may not need to include the product barcode.
  • the system involves recognizing images of products captured by a customer and delivering the products to the customer via an autonomous delivery vehicle.
  • a customer using an app and/or website logs in and transmits images of products to the delivery system.
  • the delivery system then performs image recognition, identifies the products, and transmits the identified products to the customer.
  • the customer verifies and selects products from among the identified products and transmits the selection to the delivery system, which, upon receiving the selection, controls autonomous delivery vehicles to deliver the selected products to the customer (a sketch of this flow is provided after this list).
  • the delivery system controls the delivery of the products via autonomous delivery vehicles.
  • a delivery control system includes a communication device configured to receive images of products transmitted from a customer, at least one processor, and a memory storing instructions.
  • the instructions, when executed by the at least one processor, cause the delivery control system to perform an image recognition process to identify the products in the images, transmit the identified products to the communication device, receive from the communication device products selected by the customer, and instruct an autonomous vehicle to deliver the selected products to a location of the customer.
  • an apparatus for ordering products includes a communication device, a display screen, at least one processor, and a memory storing instructions.
  • the instructions, when executed by the at least one processor, cause the apparatus to receive one or more images of products from a customer, transmit via the communication device the one or more images to a delivery system, receive identified products from the delivery system, display the identified products on the display screen, receive from the customer selected products from among the identified products, and transmit via the communication device an order to the delivery system, where the order includes the selected products from the identified products and a location of the customer.
  • the delivery control system and/or the customer's apparatus can include transmission of updated location of the customer, payment before transmitting the order, use of website images of products or images captured by the customer, confirmation between the customer and the delivery system via the app/web page for confirming the ordered products, image recognition of products in case no product is identified in an image or more than two potential products are identified in an image, delivery feature to deliver the products to the most recent location of the customer, and/or selection of autonomous delivery vehicles based on the most recent location or the default location of the customer.
  • an integrated system to ensure secure delivery of prescription drugs including a number of secure lockers on an autonomous vehicle that can only be accessed by the intended recipient.
  • the recipient can verify her identity by a built-in ID verification system on the autonomous vehicle, which may include a camera or a card reader to detect and identify the ID card along with a facial recognition system to compare the user with the ID card. If the system does not recognize a face, a remote human operator can access the camera stream to verify the ID and customer manually.
  • an ID and/or age verification system on an autonomous vehicle platform may include a fingerprint-based system.
  • an autonomous robotic vehicle includes a conveyance system, a securable compartment configured to autonomously lock and unlock where the securable compartment contains an item for delivery to a particular individual, a personal identification reader, at least one processor, and a memory storing instructions.
  • the instructions, when executed by the at least one processor, cause the autonomous robotic vehicle to autonomously travel to a destination location of the particular individual, capture, by the personal identification reader at the destination location, a personal identification object, determine that the captured personal identification object matches an identity of the particular individual, and unlock the securable compartment based on the determination (a sketch of this verification flow is provided after this list).
  • the item is a prescription drug.
  • the personal identification reader is a camera and the personal identification object is a face, and identity is verified if the captured face matches a face image on file.
  • the personal identification reader is a camera and the personal identification object is a government issued photo ID card.
  • identity is verified if the captured photo ID information matches the photo and information on file.
  • identity is verified if the captured face matches the photo on the photo ID card.
  • the system includes receiving an image and transmitting the image off-site to compare and determine if the individual is an intended recipient. In various embodiments, additional verification is contemplated by having a recipient answer questions and/or enter information regarding the prescribing physician, the pharmacist, the medication, or medical history.
  • the system can unlock and/or open an autonomous vehicle based on facial recognition.
  • a recipient's ID or image of a face can be saved, and the system can compare the recipient's face with the previously saved ID or photograph. In this manner, a recipient does not need to provide any identification to be verified to receive an order.
  • the system can unlock and/or open an autonomous vehicle based on the recipient's ID.
  • the system can compare the recipient's current ID with a name on record and/or a previously saved ID or photograph of an ID.
  • the system can unlock and/or open an autonomous vehicle based on a combination of facial verification and ID verification.
  • the ID processing system may be on the autonomous vehicle or may be server-side.
  • a database on the server side keeps a record of a face and/or ID.
  • the system may not need to save preexisting records or photographs.
  • a manual ID system includes capturing information using a camera on the autonomous vehicle and having a certified remote operator check the face and/or ID manually in real time.
  • the system can use vehicle sensors that have depth perception (such as LiDAR and radar) to check that the face being presented is three dimensional.
  • the system can use two cameras to capture parallax effects of a 3D object to verify that the face is three dimensional.
  • the autonomous vehicle can instruct the person to follow a command, such as blinking the eyes, to verify that the face is a live face.
  • the face and/or ID verification system can be implemented on the recipient's device, such as a smartphone.
  • the device's camera can be used to capture live images instead of a camera on an autonomous vehicle.
  • an additional check can be performed to verify that the user device is in close proximity to the autonomous vehicle before allowing the vehicle to unlock.
  • the service can confirm the exact location with the customer just before the delivery, and/or the customer can make real-time changes to the location.
  • the systems and methods provide for secure delivery of goods on a regular interval based on a subscription, including goods such as prescriptions, groceries, detergent, engine oil, or any other item that is consumed or depleted over time.
  • an autonomous delivery management system includes at least one processor and a memory storing instructions which, when executed by the at least one processor, cause the autonomous delivery management system to access subscription information that includes an item and a time interval for regularly delivering the item to a customer, determine a handling itinerary for the item that includes delivery of the item in compliance with the time interval, and communicate instructions to an autonomous vehicle based on the handling itinerary.
  • the subscription information includes automatic payment method.
  • the handling itinerary includes pickup location, destination location, and deadline for delivery.
  • the system can adjust delivery location and/or time when indicated by the customer.
  • the customer subscribes to a product once, which can include setting up dates and/or times for regular deliveries, and the system dispatches a vehicle with the subscribed product on that regular basis.
  • the system may or may not confirm with the customer ahead of time. If advance confirmation is desired, the system can send a notification before the scheduled delivery time, and allow the customer to modify the details of the delivery, such as precise timing, location, and/or quantity, or modify the product itself, or cancel the delivery.
  • when advance confirmation is not required, the system can dispatch an autonomous vehicle and have the vehicle wait for the customer at the desired location. If the customer misses the delivery, the system can instruct the vehicle to leave the customer location.
  • the customer can cancel the subscription up to the very last minute or nearly in real-time with minimal or zero cancellation fees.
  • the customer can enable the location feature on the customer's smartphone app, and the system can determine when the customer is at a particular location.
  • the system can modify the delivery time and/or location based on the particular location or based on an estimate of when the customer will be home if travelling from the particular location.
  • for example, the customer chooses to receive a monthly delivery at home at 6 pm, but at 5:45 pm the system determines that the customer is not yet home.
  • the system can delay the delivery, with or without confirming with the customer.
  • if the system determines that the customer is home or on her way home with a certain estimated time of arrival, it can dispatch a vehicle with the subscribed products to arrive at the customer's home at approximately the same time as the customer, or at another time relative to the estimated time of arrival (a sketch of this scheduling logic is provided after this list).
  • robots in the fleet are each configured for transporting, delivering or retrieving goods or services and are capable of operating in an unstructured open environment or closed environment.
  • the vehicle 101 is configured to travel practically anywhere that a small all-terrain vehicle could travel on land, while providing at least one and preferably two large storage compartments 102, and more preferably, at least one large compartment 102 is configured with smaller internal secure compartments 104 of variable configurations to carry individual items that are to be delivered to, or need to be retrieved from customers.
  • the vehicle could be configured for water travel, providing at least one and preferably two large storage compartments, and more preferably, at least one large compartment is configured with smaller internal secure compartments of variable configurations to carry individual items that are to be delivered to, or need to be retrieved from customers.
  • the vehicle could be configured for hover travel, providing at least one and preferably two large storage compartments, and more preferably, at least one large compartment is configured with smaller internal secure compartments of variable configurations to carry individual items that are to be delivered to, or need to be retrieved from customers.
  • the vehicle could be configured for aerial drone or aerial hover travel, providing at least one and preferably two large storage compartments, and more preferably, at least one large compartment is configured with smaller internal secure compartments of variable configurations to carry individual items that are to be delivered to, or need to be retrieved from customers.
  • the securable compartments are humidity and temperature controlled for, for example, hot goods, cold goods, wet goods, dry goods, or combinations or variants thereof. Further still, as illustrated in FIGS. 8 - 10, the compartment(s) are configurable with various amenities, such as compartment lighting for night deliveries and condiment dispensers.
  • the securable compartments are configurable for various goods.
  • Such configurations and goods include: bookshelves for books, thin drawers for documents, larger box-like drawers for packages, and sized compartments for vending machines, coffee makers, pizza ovens and dispensers.
  • the securable compartments are variably configurable based on: anticipated demands, patterns of behaviors, area of service, or types of goods to be transported.
  • each robot includes securable compartments to hold said goods or items associated with said services, and a controller 150 configurable to associate each one of the securable compartments 102, 104 to an assignable customer 202 or provider 204 and provide entry when authorized.
  • Each robot vehicle further includes at least one processor configured to manage the conveyance system, the navigation module, the sensor system, instructions from the fleet management module, the communication module, and the controller.
  • each robot is configured with securable compartments.
  • a robot is configurable to contain a set of goods or even a mobile marketplace (similar to a mini bar at a hotel).
  • When a robot is assigned to a customer 202, one or more of the compartments 102, 104 is also assigned to that customer. Each of the large compartments 102 is secured separately and can securely transport goods to a separate set of customers 202.
  • Upon arrival of the robot at the customer destination, the customer can then open their respective compartment(s) by verifying their identity with the robot. This can be done through a wide variety of approaches comprising, but not limited to:
  • the customers can be given a PIN (e.g., a 4-digit number) when they make their initial request/order. They can then enter this PIN at the robot using the robot touchscreen or a keypad.
  • the customers can verify themselves using their mobile phone and an RFID reader on the robot.
  • the customers can verify themselves using their voice and a personal keyword or key phrase they speak to the robot.
  • the customers can verify themselves through their face, a government ID, or a business ID badge using cameras and facial recognition or magnetic readers on the robot.
  • the customers can verify themselves using their mobile phone, for example by pushing a button or entering a predetermined code on their phone (and the system can optionally detect that the customer is near the robot by using the GPS position from the phone).
  • a temporary storage system using autonomous vehicles can drop off items with an autonomous vehicle in one location, and then schedule to pick up the items from the autonomous vehicle in another location and/or at a later time. In this manner, a customer, for example, would not need to leave time to return to their original drop-off location to retrieve their items.
  • the temporary storage system can be used as a delayed delivery system. For example, a customer can place an order ahead of time (e.g., at least 4 hours ahead or a day ahead), and a human operator can place the order into an autonomous vehicle ahead of time.
  • the system can inform the customer when the order is ready, and the customer can notify the autonomous vehicle or delivery system within a period of time (such as 24 hours of the order) to schedule pick up.
  • the autonomous vehicle can keep the order inside for the period of time (e.g., 24 hours), and the customer can summon it when the customer is ready.
  • each robot in the robot fleet is equipped with one or more processors 125 capable of both high-level computing for processing and low-level safety-critical computing capacity for controlling the hardware.
  • the at least one processor is configured to manage the conveyance system, the navigation module, the sensor system, instructions from the fleet management module, the communication module and the controller.
  • each robot in the robot fleet is equipped with a controller 150 configurable to associate each one of the securable compartments 102, 104 to an assignable customer 202 or provider 204 and provide entry when authorized.
  • the robot fleet further includes at least one robot having a digital display for curated content comprising: advertisements (i.e., for both specific user and general public), including services provided, marketing/ promotion, regional / location of areas served, customer details, local environment, lost, sought or detected people, public service announcements, date, time, or weather.
  • phrases “in an embodiment,” “in embodiments,” “in various embodiments,” “in some embodiments,” or “in other embodiments” may each refer to one or more of the same or different embodiments in accordance with the present disclosure.
  • a phrase in the form “A or B” means “(A), (B), or (A and B).”
  • a phrase in the form “at least one of A, B, or C” means “(A); (B); (C); (A and B); (A and C); (B and C); or (A, B, and C).”
  • any of the herein described methods, programs, algorithms or codes may be converted to, or expressed in, a programming language or computer program.
  • the terms "programming language" and "computer program," as used herein, each include any language used to specify instructions to a computer, and include (but are not limited to) the following languages and their derivatives: Assembler, Basic, Batch files, BCPL, C, C+, C++, Delphi, Fortran, Java, JavaScript, machine code, operating system command languages, Pascal, Perl, PL1, Python, scripting languages, Visual Basic, metalanguages which themselves specify programs, and all first, second, third, fourth, fifth, or further generation computer languages. Also included are database and other data schemas, and any other meta-languages.
  • the systems described herein may also utilize one or more controllers to receive various information and transform the received information to generate an output.
  • the controller may include any type of computing device, computational circuit, or any type of processor or processing circuit capable of executing a series of instructions that are stored in a memory.
  • the controller may include multiple processors and/or multicore central processing units (CPUs) and may include any type of processor, such as a microprocessor, digital signal processor, microcontroller, programmable logic device (PLD), field programmable gate array (FPGA), or the like.
  • the controller may also include a memory to store data and/or instructions that, when executed by the one or more processors, cause the one or more processors to perform one or more methods and/or algorithms.
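The following is a minimal sketch of the one-click checkout and dispatch flow referenced above. It assumes hypothetical CustomerProfile, Basket, and fleet interfaces that do not come from the disclosure; it is illustrative only, not the disclosed implementation.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class CustomerProfile:
    customer_id: str
    default_location: Tuple[float, float]                 # saved delivery location (lat, lon)
    saved_payment_method: str                             # e.g., a tokenized card reference
    live_location: Optional[Tuple[float, float]] = None   # optional phone GPS fix

@dataclass
class Basket:
    items: List[str] = field(default_factory=list)

def one_click_checkout(customer: CustomerProfile, basket: Basket, fleet):
    """Check out and dispatch a vehicle without prompting for location or payment."""
    # Prefer the live phone location if available, otherwise the saved default.
    destination = customer.live_location or customer.default_location
    # Charge the saved payment method; no additional prompt is shown to the customer.
    payment_ref = fleet.charge(customer.saved_payment_method, basket.items)
    # Dispatch a vehicle carrying (or stocked with) the ordered items.
    return fleet.dispatch(items=basket.items, destination=destination,
                          payment_ref=payment_ref)

def update_destination(vehicle, customer: CustomerProfile):
    """Re-route an already dispatched vehicle if the customer is on the move."""
    if customer.live_location is not None:
        vehicle.set_destination(customer.live_location)
```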
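As a rough illustration of the picture-to-delivery flow referenced above, the sketch below assumes placeholder recognizer, customer-app, and fleet interfaces; none of these names are taken from the disclosure.

```python
def handle_image_order(images, customer, recognizer, fleet):
    """Identify products from customer photos, confirm the selection, and dispatch."""
    candidates = []
    for image in images:
        matches = recognizer.recognize(image)   # may yield zero, one, or several products
        if not matches:
            # Nothing identified (or the result is ambiguous): flag for manual review.
            candidates.append({"image": image, "needs_review": True})
        else:
            candidates.append({"image": image, "products": matches})

    # Send the identified products back to the customer's app for confirmation.
    selected = customer.confirm_products(candidates)

    # Deliver to the customer's most recent location, falling back to the default.
    destination = customer.latest_location() or customer.default_location
    return fleet.dispatch(items=selected, destination=destination)
```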
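The identity verification and compartment unlock flow referenced above might look roughly like the following sketch. The face-matching helper, liveness check, and remote-operator fallback are assumptions standing in for whatever verification a fleet actually uses.

```python
def _face_match(face_a, face_b, threshold=0.6):
    # Placeholder similarity check; a deployed system would compare learned embeddings.
    distance = sum((a - b) ** 2 for a, b in zip(face_a.embedding, face_b.embedding)) ** 0.5
    return distance < threshold

def try_unlock_compartment(vehicle, compartment, recipient, camera, id_reader,
                           remote_operator=None):
    """Unlock only when the person at the vehicle matches the intended recipient."""
    face = camera.capture_face()
    # Optional liveness check, e.g., depth from LiDAR/radar or a blink prompt.
    if face is not None and not vehicle.sensors.looks_three_dimensional(face):
        face = None

    id_card = id_reader.read()   # government ID or business badge, if presented
    verified = False
    if face is not None and recipient.face_on_file is not None:
        verified = _face_match(face, recipient.face_on_file)
    if not verified and id_card is not None:
        verified = (id_card.name == recipient.name
                    and (face is None or _face_match(face, id_card.photo)))
    if not verified and remote_operator is not None:
        # Fall back to a certified human checking the camera stream in real time.
        verified = remote_operator.review(camera.stream(), recipient)

    if verified:
        compartment.unlock()
    return verified
```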
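Finally, a minimal sketch of the subscription scheduling logic referenced above, assuming hypothetical subscription, customer, and fleet objects; it only illustrates the delay-until-home idea, not the disclosed method.

```python
import datetime as dt

def plan_subscription_delivery(subscription, customer, fleet, now=None):
    """Decide when to dispatch a recurring delivery, delaying it if the customer
    is not yet home."""
    now = now or dt.datetime.now()
    scheduled = subscription.next_delivery_time             # e.g., today at 18:00
    eta_home = customer.estimated_time_of_arrival_home()    # None if unknown

    if eta_home is not None and eta_home > scheduled:
        # The customer will not be home on time: shift the delivery toward the ETA.
        dispatch_for = eta_home
    else:
        dispatch_for = scheduled

    if subscription.requires_confirmation:
        # Notify ahead of time so the customer can adjust time, place, quantity,
        # or cancel with no fee.
        fleet.notify(customer, proposed_time=dispatch_for)

    return fleet.schedule_dispatch(items=[subscription.item],
                                   destination=customer.default_location,
                                   arrive_by=dispatch_for)
```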

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Economics (AREA)
  • Physics & Mathematics (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Strategic Management (AREA)
  • Human Resources & Organizations (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Theoretical Computer Science (AREA)
  • Development Economics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A robot fleet comprising a plurality of robot vehicles operating autonomously and a fleet management module for coordination of the robot fleet, each robot configured for transporting, delivering or retrieving goods or services and capable of operating in an unstructured open or closed environment, each robot comprising a conveyance system, a navigation module, a plurality of securable compartments to hold goods, a controller configurable to associate each of the securable compartments to an assignable customer, a customer group in a market, or provider and provide entry when authorized, a communication module and a processor configured to manage the conveyance system, the navigation module, the sensor system, the communication module and the controller.

Description

FLEET OF ROBOT VEHICLES FOR SPECIALTY PRODUCT AND
SERVICE DELIVERY
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims priority to U.S. Provisional Application No. 62/538,538, filed on July 28, 2017, U.S. Application No. 16/047,598, filed on July 27, 2018, U.S. Application No. 16/047,640, filed on July 27, 2018, U.S. Application No. 16/047,659, filed on July 27, 2018, U.S. Application No. 16/047,894, filed on July 27, 2018, U.S. Application No. 16/048,669, filed on July 30, 2018, U.S. Application No. 16/048,737, filed on July 30, 2018, and U.S. Application No. 16/048,797, filed on July 30, 2018. The entire contents of each of the foregoing applications are hereby incorporated by reference.
FIELD OF THE TECHNOLOGY
[0002] The present application relates to autonomous vehicles.
BACKGROUND
[0003] The field of fully-autonomous and/or semi-autonomous robots is a growing field of innovation. Robots are being used for many purposes including warehouse inventory operations, household vacuuming robots, hospital delivery robots, sanitation robots, and military or defense applications.
SUMMARY
[0004] This disclosure relates to an autonomous and/or semi-autonomous robot fleet comprising a plurality of robots, in particular robots for transporting or retrieving deliveries in either unstructured outdoor environments or closed environments.
[0005] Provided herein is a robot fleet comprising a plurality of robot vehicles operating autonomously and/or semi-autonomously and a fleet management module, associated with a central server for coordination of the robot fleet; the fleet management module configured to coordinate the activity and positioning of each robot in the fleet, wherein the fleet is configured for transporting, delivering or retrieving goods or services and capable of operating in unstructured open or closed environments; each robot in the fleet comprising: a power system, a conveyance system; (e.g., a drive system with a propulsion engine, wheels, treads, wings, rotors, blowers, rockets, propellers, brakes, etc.); a navigation module for navigation in the unstructured open or closed environments; (e.g., digital maps, HD maps, GPS); a communication module configurable to receive, store and send data to the fleet management module, a user, and the robots in the fleet, related to at least: user interactions and the robot fleet interactions, comprising: scheduled requests or orders, on-demand requests or orders, or a need for self-positioning of the robot fleet based on anticipated demand within the unstructured open or closed environments; a sensor system, at least one securable compartment or a plurality of securable compartments to hold said goods or items associated with said services; and a controller configurable to associate each one of the at least one or plurality of securable compartments to an assignable customer, or customer group in a marketplace, or provider and provide entry when authorized; at least one processor configured to manage the conveyance system, the navigation module, the sensor system, instructions from the fleet management module, the communication module, and the controller.
[0006] In some embodiments, the unstructured open environment is a non-confined geographic region accessible by navigable pathways comprising: public roads; private roads; bike paths; open fields; open public lands; open private lands; pedestrian walkways; lakes; rivers; streams; or open airspace. [0007] In some embodiments, the closed environment is a confined, enclosed or semi- enclosed structure accessible by navigable pathways comprising: open areas or rooms within commercial architecture, with or without structures or obstacles therein; airspace within open areas or rooms within commercial architecture, with or without structures or obstacles therein; public or dedicated aisles; hallways; tunnels; ramps; elevators; conveyors; or pedestrian walkways.
[0008] In some embodiments, the navigation module controls routing of the conveyance system of the robots in the fleet in the unstructured open or closed environments.
[0009] In some embodiments, the communication to the user, to the robots in the fleet, between the robots of the fleet, and between the user and the robots in the fleet, occurs via wireless transmission.
[0010] In some embodiments, the user comprises a fleet manager; a sub-contracting vendor; a service provider; a customer; a business entity; an individual; or a third party.
[0011] In some embodiments, the user's wireless transmission interactions and the robot fleet wireless transmission interactions occur via mobile application transmitted by an electronic device and forwarded to the communication module via: a central server; a fleet management module; and/or a mesh network.
[0012] In some embodiments, the electronic device comprises: a phone; a personal mobile device; a personal digital assistant (PDA); a mainframe computer; a desktop computer; a laptop computer; a tablet computer; and/or wearable computing device comprising: a communication headset; smart glasses; a contact lens or lenses; a digital watch; a bracelet; a ring; jewelry; or a combination thereof. [0013] In some embodiments, each robot fleet is configured with a maximum speed range from 1.0 mph to 90.0 mph.
[0014] In some embodiments, the plurality of securable compartments is humidity and temperature controlled for: hot goods, cold goods, wet goods, dry goods, or combinations or variants thereof.
[0015] In some embodiments, the plurality of securable compartments is configurable for a plurality of goods. Such configurations and goods comprise: bookshelves for books; thin drawers for documents; larger box-like drawers for packages, and sized compartments for vending machines, coffee makers, pizza ovens and dispensers.
[0016] In some embodiments, the plurality of securable compartments is variably configurable based on: anticipated demands; patterns of behaviors; area of service; or types of goods to be transported.
[0017] In some embodiments, the services comprise: subscription services; prescription services; marketing services; advertising services; notification services; a mobile marketplace; or requested, ordered or scheduled delivery services. In particular embodiments, the scheduled delivery services include, by way of example, special repeat deliveries such as groceries, prescriptions, drinks, mail, documents, etc.
[0018] In some embodiments, the services further comprise: the user receiving and returning the same or similar goods within the same interaction; (e.g., signed documents); the user receiving one set of goods and returning a different set of goods within the same interaction; (e.g., product replacement/returns, groceries, merchandise, books, recordings, videos, movies, payment transactions, etc.); a third party user providing instruction and/or authorization to a goods or service provider to prepare, transport, deliver and/or retrieve goods to a principal user in a different location.
[0019] In some embodiments, the services further comprise: general services, (e.g., picking up a user's dry cleaning, dropping off a user's dry cleaning, renting goods, (such as tools, DVDs, etc.), sharing/borrowing goods from other users or businesses, etc.). Further still, it may be a general pickup service for items to be shipped, returned, or sent to other users/businesses, etc.
[0020] In some embodiments, at least one robot in the fleet is further configured to process or manufacture goods.
[0021] In some embodiments, the processed or manufactured goods comprise: beverages, etc., with or without condiments; (e.g., coffee, tea, carbonated drinks, etc.); a plurality of fast foods; or microwavable foods.
[0022] In some embodiments, the robot fleet further comprises at least one robot having a digital display for curated content comprising: advertisements (i.e., for both specific user and general public), including; services provided, marketing/ promotion, regional / location of areas served, customer details, local environment, lost, sought or detected people, public service announcements, date, time, or weather.
[0023] In some embodiments of the robot fleet, the positioning of robots can be customized based on: anticipated use, a pattern of historical behaviors, or specific goods being carried.
[0024] In some embodiments, the robot fleet is fully-autonomous.
[0025] In some embodiments, the robot fleet is semi-autonomous.
[0026] In some embodiments, the robot fleet is controlled directly by the user. [0027] In some embodiments of the robot fleet, a plurality of said autonomous or semi-autonomous robots within the fleet is operated on behalf of a third party vendor/service provider; (e.g., fleet managed by an owner, but providing a coffee service/experience for a third party vendor (i.e., Starbucks) with white label robots in the fleet).
[0028] In some embodiments of the robot fleet, a plurality of said autonomous robots within the fleet is further configured to be part of a sub-fleet comprising a sub-plurality of autonomous robots, wherein each sub-fleet is configured to operate independently or in tandem with multiple sub-fleets comprising two or more sub-fleets.
[0029] Further details and aspects of exemplary embodiments of the present disclosure are described in more detail below with reference to the appended figures.
INCORPORATION BY REFERENCE
[0030] All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.
BRIEF DESCRIPTION OF THE DRAWINGS
[0031] A better understanding of the features and advantages of the disclosed technology will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the technology are utilized, and the accompanying drawings of which:
[0032] FIG. 1 is an exemplary view of an autonomous robot fleet, wherein each vehicle within a fleet or sub-fleet can be branded for an entity;
[0033] FIG. 2 is an exemplary ISO view of a robot vehicle, part of an autonomous robot fleet, illustrating securable compartments within the vehicle; [0034] FIG. 3 is an exemplary front view of a robot vehicle, part of an autonomous robot fleet, shown in comparison to the height of an average person;
[0035] FIG. 4 is an exemplary right side view of a robot vehicle, part of an autonomous robot fleet, illustrating a configuration with two large side doors, each enclosing securable compartments;
[0036] FIG. 5 is an exemplary left side view of a robot vehicle, part of an autonomous robot fleet, shown in comparison to the height of an average person;
[0037] FIG. 6 is an exemplary rear view of a robot vehicle, part of an autonomous robot fleet;
[0038] FIG. 7 is an exemplary ISO view of a robot vehicle, part of an autonomous robot fleet, illustrating an autonomous lunch delivery vehicle for any branded company;
[0039] FIG. 8 is an exemplary ISO view of a robot vehicle, part of an autonomous robot fleet, illustrating an autonomous pizza delivery vehicle for any branded company;
[0040] FIG. 9 is an exemplary ISO view of a robot vehicle, part of an autonomous robot fleet, illustrating an autonomous coffee delivery vehicle for any branded company;
[0041] FIG. 10 is an exemplary ISO view of a robot vehicle, part of an autonomous robot fleet, illustrating an autonomous evening/ nighttime delivery vehicle for any branded company, comprising a lighted interior;
[0042] FIG. 11 is an exemplary flowchart representation of the logic for a fleet management control module associated with a central server for the robot fleet; and
[0043] FIG. 12 is an exemplary flowchart representation of the logic flow from the Fleet Management Control Module through the robot processor to the various systems and modules of the robot.
DETAILED DESCRIPTION
[0044] This disclosure relates to a fully-autonomous and/or semi-autonomous robot fleet and, in particular, to robot vehicles for transporting or retrieving deliveries in either open unstructured outdoor environments or closed environments.
[0045] Provided herein is a robot fleet having robot vehicles operating fully-autonomously or semi-autonomously and a fleet management module for coordination of the robot fleet, where each robot within the fleet is configured for transporting, delivering or retrieving goods or services and is capable of operating in an unstructured open or closed environment. Each robot can include a power system, a conveyance system, a navigation module, at least one securable compartment or multiple securable compartments to hold goods, a controller configurable to associate each of the securable compartments to an assignable customer, a customer group within a marketplace, or provider and provide entry when authorized, a communication module and a processor configured to manage the conveyance system, the navigation module, the sensor system, the communication module and the controller.
[0046] As used herein, the term "autonomous" includes fully-autonomous, semi- autonomous, and any configuration in which a vehicle can operate in a controlled manner for a period of time without human intervention.
[0047] As used herein, the term "fleet," "sub-fleet," and like terms are used to indicate a number of land vehicles, watercraft or aircraft operating together or under the same ownership. In some embodiments the fleet or sub-fleet is engaged in the same activity. In some embodiments, the fleet or sub-fleet are engaged in similar activities. In some embodiments, the fleet or sub-fleet are engaged in different activities. [0048] As used herein, the term "robot," "robot vehicle," "robot fleet," "vehicle," "all-terrain vehicle," and like terms are used to indicate a mobile machine that transports cargo, items, and/or goods. Typical vehicles include cars, wagons, vans, unmanned motor vehicles (e.g., tricycles, trucks, trailers, buses, etc.), unmanned railed vehicles (e.g., trains, trams, etc.), unmanned watercraft (e.g., ships, boats, ferries, landing craft, barges, rafts, etc.), aerial drones, unmanned hovercraft (air, land and water types) , unmanned aircraft, and even including unmanned spacecraft.
[0049] As used herein, the term "compartment" is used to indicate an internal bay of a robot vehicle that has a dedicated door at the exterior of the vehicle for accessing the bay, and also indicates an insert secured within the bay. As used herein, the term "sub-compartment" is used to indicate a subdivision or portion of a compartment. Additionally, within the context of descriptions relating to compartments and sub-compartments, the term "module" may be used herein to refer to a compartment and/or a sub-compartment.
[0050] As used herein, the term "user," "operator," "fleet operator," and like terms are used to indicate the entity that owns or is responsible for managing and operating the robot fleet.
[0051] As used herein, the term "customer" and like terms are used to indicate the entity that requests the services provided by the robot fleet.
[0052] As used herein, the term "provider," "business," "vendor," "third party vendor," and like terms are used to indicate an entity that works in concert with the fleet owner or operator to utilize the services of the robot fleet to deliver the provider's product from, and/or return the provider's product to, the provider's place of business or staging location.
[0053] As used herein, the term "server," "computer server," "central server," "main server," and like terms are used to indicate a computer or device on a network that manages the fleet resources, namely the robot vehicles.
[0054] As used herein, the term "controller" and like terms are used to indicate a device that controls the transfer of data from a computer to a peripheral device and vice versa. For example, disk drives, display screens, keyboards, and printers all require controllers. In personal computers, the controllers are often single chips. As used herein the controller is commonly used for managing access to components of the robot such as the securable compartments.
[0055] As used herein a "mesh network" is a network topology in which each node relays data for the network. All mesh nodes cooperate in the distribution of data in the network. It can be applied to both wired and wireless networks. Wireless mesh networks can be considered a type of "Wireless ad hoc" network. Thus, wireless mesh networks are closely related to Mobile ad hoc networks (MANETs). Although MANETs are not restricted to a specific mesh network topology, Wireless ad hoc networks or MANETs can take any form of network topology. Mesh networks can relay messages using either a flooding technique or a routing technique. With routing, the message is propagated along a path by hopping from node to node until it reaches its destination. To ensure that all its paths are available, the network must allow for continuous connections and must reconfigure itself around broken paths, using self-healing algorithms such as Shortest Path Bridging. Self-healing allows a routing-based network to operate when a node breaks down or when a connection becomes unreliable. As a result, the network is typically quite reliable, as there is often more than one path between a source and a destination in the network. This concept can also apply to wired networks and to software interaction. A mesh network whose nodes are all connected to each other is a fully connected network.
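To make the flooding technique described above concrete, the following is a minimal sketch of mesh nodes relaying a message until it reaches its destination; the Node class and the example topology are assumptions for illustration only and are not part of the disclosed system.

```python
class Node:
    """Minimal mesh node that relays every new message to its neighbours (flooding)."""
    def __init__(self, name):
        self.name = name
        self.neighbours = []
        self.seen = set()

    def receive(self, message_id, payload, destination):
        if message_id in self.seen:          # drop duplicates so flooding terminates
            return
        self.seen.add(message_id)
        if self.name == destination:
            print(f"{self.name}: delivered {payload!r}")
            return
        for neighbour in self.neighbours:    # relay to every connected node
            neighbour.receive(message_id, payload, destination)

# Usage: a small square mesh where losing any single link still leaves a path,
# illustrating why a mesh remains reliable when a connection breaks.
a, b, c, d = Node("a"), Node("b"), Node("c"), Node("d")
a.neighbours = [b, c]; b.neighbours = [a, d]; c.neighbours = [a, d]; d.neighbours = [b, c]
a.receive("msg-1", "status update", destination="d")
```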
[0056] As used herein, the term "module" and like terms are used to indicate a self-contained hardware component of the central server, which in turn includes software modules. In software, a module is a part of a program. Programs are composed of one or more independently developed modules that are not combined until the program is linked. A single module can contain one or several routines, or sections of programs that perform a particular task. As used herein the fleet management module includes software modules for managing various aspects and functions of the robot fleet.
[0057] As used herein, the term "processor," "digital processing device" and like terms are used to indicate a microprocessor or central processing unit (CPU). The CPU is the electronic circuitry within a computer that carries out the instructions of a computer program by performing the basic arithmetic, logical, control and input/output (I/O) operations specified by the instructions.
[0058] In accordance with the description herein, suitable digital processing devices include, by way of non-limiting examples, server computers, desktop computers, laptop computers, notebook computers, sub-notebook computers, netbook computers, netpad computers, set-top computers, handheld computers, Internet appliances, mobile smartphones, tablet computers, personal digital assistants, video game consoles, and vehicles. Those of skill in the art will recognize that many smartphones are suitable for use in the system described herein. Suitable tablet computers include those with booklet, slate, and convertible configurations, known to those of skill in the art.
[0059] In some embodiments, the digital processing device includes an operating system configured to perform executable instructions. The operating system is, for example, software, including programs and data, which manages the device's hardware and provides services for execution of applications. Those of skill in the art will recognize that suitable server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD®, Linux, Apple® Mac OS X Server®, Oracle® Solaris®, Windows Server®, and Novell® NetWare®. Those of skill in the art will recognize that suitable personal computer operating systems include, by way of non-limiting examples, Microsoft® Windows®, Apple® Mac OS X®, UNIX®, and UNIX-like operating systems such as GNU/Linux®. In some embodiments, the operating system is provided by cloud computing. Those of skill in the art will also recognize that suitable mobile smart phone operating systems include, by way of non-limiting examples, Nokia® Symbian® OS, Apple® iOS®, Research In Motion® BlackBerry OS®, Google® Android®, Microsoft® Windows Phone® OS, Microsoft® Windows Mobile® OS, Linux®, and Palm® WebOS®.
[0060] In some embodiments, the device includes a storage and/or memory device. The storage and/or memory device is one or more physical apparatus used to store data or programs on a temporary or permanent basis. In some embodiments, the device is volatile memory and requires power to maintain stored information. In some embodiments, the device is non-volatile memory and retains stored information when the digital processing device is not powered. In some embodiments, the non-volatile memory includes flash memory. In some embodiments, the non-volatile memory includes dynamic random-access memory (DRAM). In some embodiments, the non-volatile memory includes ferroelectric random access memory (FRAM). In some embodiments, the non-volatile memory includes phase-change random access memory (PRAM). In some embodiments, the device is a storage device including, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, magnetic disk drives, magnetic tape drives, optical disk drives, and cloud computing based storage. In some embodiments, the storage and/or memory device is a combination of devices such as those disclosed herein. [0061] In some embodiments, the digital processing device includes a display to send visual information to a user. In some embodiments, the display is a cathode ray tube (CRT). In some embodiments, the display is a liquid crystal display (LCD). In some embodiments, the display is a thin film transistor liquid crystal display (TFT-LCD). In some embodiments, the display is an organic light emitting diode (OLED) display. In some embodiments, an OLED display is a passive-matrix OLED (PMOLED) or active-matrix OLED (AMOLED) display. In some embodiments, the display is a plasma display. In some embodiments, the display is a video projector. In some embodiments, the display is interactive (e.g., having a touch screen or a sensor such as a camera, a 3D sensor, a LiDAR, a radar, etc.) that can detect user interactions/gestures/responses and the like. In still some embodiments, the display is a combination of devices such as those disclosed herein.
The Fleet of Robot Vehicles
[0062] Provided herein is a robot fleet 100, as illustrated in FIG. 1, having robot vehicles 101, with each one operating fully-autonomously or semi-autonomously.
[0063] As illustrated in FIGS. 3 - 6, one exemplary configuration of a robot 101 is a vehicle configured for land travel, such as a small fully-autonomous (or semi-autonomous) automobile. The exemplary fully-autonomous (or semi-autonomous) automobile is narrow (i.e., 2- 5 feet wide), low mass and low center of gravity for stability, having multiple secure compartments assignable to one or more customers, retailers and/or vendors, and designed for moderate working speed ranges (i.e., 1.0 - 45.0 mph) to accommodate inner-city and residential driving speeds. Additionally, in some embodiments, the land vehicle robot units in the fleet are configured with a maximum speed range from 1.0 mph to about 90.0 mph for high speed, intrastate or interstate driving. Each robot in the fleet is equipped with onboard sensors 170 (e.g., cameras (running at a high frame rate, akin to video), LiDAR, radar, ultrasonic sensors, microphones, etc.) and internal computer processing to constantly determine where it can safely navigate, what other objects are around each robot and what it may do.
[0064] In some embodiments, the robot fleet is fully-autonomous.
[0065] In some embodiments, the robot fleet is semi-autonomous. In some embodiments, it may be necessary to have human interaction between the robot 101, the fleet operator 200, the provider 204 and/or the customer 202 to address previously unforeseen issues (e.g., a malfunction with the navigation module; provider inventory issues; unanticipated traffic or road conditions; or direct customer interaction after the robot arrives at the customer location).
[0066] In some embodiments, the robot fleet 100 is controlled directly by the user 200. In some embodiments, it may be necessary to have direct human interaction between the robot 101 and/or the fleet operator 200 to address maintenance issues such as mechanical failure, electrical failure or a traffic accident.
[0067] In some embodiments, the robot fleet is configured for land travel. In some embodiments, each robot land vehicle in the fleet is configured with a working speed range from 13.0 mph to 45.0 mph. In some embodiments, the land vehicle robot units in the fleet are configured with a maximum speed range from 13.0 mph to about 90.0 mph.
[0068] In some embodiments, the robot fleet is configured for water travel as a watercraft and is configured with a working speed range from 1.0 mph to 45.0 mph.
[0069] In some embodiments, the robot fleet is configured for hover travel as an over-land or over- water hovercraft and is configured with a working speed range from 1.0 mph to 60.0 mph.
[0070] In some embodiments, the robot fleet is configured for air travel as an aerial drone or aerial hovercraft and is configured with a working speed range from 1.0 mph to 80.0 mph. [0071] In some embodiments of the robot fleet, the autonomous robots within the fleet are operated on behalf of third party vendor/service provider.
[0072] For example, a fleet management service is established to provide a roving delivery service for a third party beverage/ food provider (e.g., a coffee service/experience for a third party vendor (i.e., Starbucks)). It is conceived that the fleet management service would provide a sub-fleet of "white label" vehicles carrying the logo and products of that third party beverage/ food provider to operate either fully-autonomously or semi-autonomously to provide this service.
[0073] In some embodiments of the robot fleet, the autonomous robots within the fleet are further configured to be part of a sub-fleet of autonomous robots, and each sub-fleet is configured to operate independently or in tandem with multiple sub-fleets having two or more sub-fleets (100-a, 100-b).
[0074] For example, a package delivery service is configured to offer multiple levels of service such as "immediate dedicated rush service," "guaranteed morning/ afternoon delivery service," or "general delivery service." A service provider could then have a dedicated sub-fleet of delivery vehicles for each type of service within their overall fleet of vehicles. In yet another example, a third party has priority over a certain number of vehicles in the fleet. In so doing, they can guarantee a certain level of responsiveness. When they aren't using the vehicles, the vehicles are used for general services within the fleet (e.g., other third parties).
[0075] In some embodiments, the robot fleet is controlled directly by the user.
[0076] In some embodiments, there will likely be times when a vehicle breaks down, has an internal system or module failure or is in need of maintenance. For example, in the event that the navigation module should fail, each robot within the fleet is configurable to allow for direct control of the robot's processor to override the conveyance and sensor systems (i.e., cameras, etc.) by a fleet operator to allow for the safe return of the vehicle to a base station for repair.
[0077] In accordance with aspects of the present disclosure, disclosed is a fleet of vehicles for a transportation or delivery service that includes any set of fully human-driven vehicles, semi-autonomous vehicles, fully-autonomous vehicles, vehicles operated remotely by human drivers, and/or any vehicle that is a combination/hybrid of these. The system can choose to dispatch an appropriate type of vehicle based on the specific requirements of that particular transaction. This can be based on distance, locations, customers' preferences, and/or weather conditions, among other factors. The system may include a portal for a business to call, schedule, and monitor a delivery, and also a routing mechanism to find best paths for all the vehicles on the system.
[0078] In various embodiments, the fleet management module receives an order of one or more goods either directly from the customer or from the customer via the central server. The parameters for the order are determined. The parameters may include the customer's preference for service providers, the type of vehicle needed to perform the delivery, care instructions for the one or more goods, and/or size and weight of the one or more goods. A vehicle is selected from a fleet of vehicles to perform the delivery of the one or more goods to the customer based on the determined parameters. A service provider is selected to fulfill the order based on the determined parameters, and the order is transmitted to the selected service provider. In some embodiments, the determined parameters may specify a particular service provider. A message is sent to the selected vehicle to obtain the one or more goods from the service provider and deliver the one or more goods to the customer. A message is received from the selected vehicle that the one or more goods have been delivered to the customer. The fleet management module 120 may then identify the selected vehicle as a vehicle available for another delivery. [0079] In various embodiments, a method of providing services using a fleet of mixed vehicles includes receiving a request for a service, determining parameters for the service, selecting a vehicle from the fleet of mixed vehicles to perform at least a portion of the service based on the determined parameters, and transmitting a message to the selected vehicle to perform at least a portion of the service. In various embodiments, the fleet of mixed vehicles includes at least one of a human-driven vehicle, a semi-autonomous vehicle, a fully autonomous vehicle, or a vehicle remotely operated by a human.
[0080] In various embodiments, the parameters include distance, location, customer's preference, vehicle type, size of a requested good, weight of a requested good, or weather or road conditions at or near the location of the customer receiving the service. In various embodiments, the service includes transporting or delivering a good or product.
[0081] In various embodiments, the method includes receiving a message that the selected vehicle has completed performing the requested service, and identifying the selected vehicle as being available for performing another requested service. In various embodiments, the method includes determining similar requests for services from multiple customers located near each other, and determining a path to deliver the services to the multiple customers using the selected vehicle.
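A simplified sketch of how a dispatcher might select a vehicle from the mixed fleet and fulfill an order based on the determined parameters follows; the attribute names, vehicle kinds, and scoring rules are illustrative assumptions rather than the disclosed logic.

```python
def select_vehicle(fleet, order):
    """Pick a vehicle type for a single transaction from the mixed fleet."""
    def suitable(vehicle):
        if order.weight_kg > vehicle.max_payload_kg:
            return False
        if order.needs_temperature_control and not vehicle.has_temperature_control:
            return False
        # Severe weather or an explicit customer preference can require a human
        # in the loop (fully human-driven or remotely operated).
        if (order.weather == "severe" or order.customer_prefers_human) and \
                vehicle.kind not in ("human-driven", "remotely-operated"):
            return False
        return True

    candidates = [v for v in fleet.available_vehicles() if suitable(v)]
    if not candidates:
        return None
    # Among suitable vehicles, prefer the one closest to the pickup location.
    return min(candidates, key=lambda v: v.distance_to(order.pickup_location))

def fulfill_order(fleet, order):
    vehicle = select_vehicle(fleet, order)
    provider = fleet.select_provider(order)       # the order itself may fix the provider
    fleet.send(provider, order)                   # provider prepares the goods
    fleet.send(vehicle, {"pickup": provider.location,
                         "dropoff": order.delivery_location})
    # When the vehicle later reports completion, it is marked available again.
    return vehicle, provider
```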
[0082] In accordance with aspects of the present disclosure, disclosed is an integrated system that enables a smaller, sidewalk-friendly autonomous robot to reside inside a larger, on-road autonomous vehicle, and carries a package from the on-road vehicle to the customer's front door/drop-box. The main autonomous vehicle can travel to the curbside of the destination address, and the sub-robot vehicle will complete the journey to the destination and deliver the package to the door or to a drop-box. In various embodiments, the autonomous sub-robot vehicle can receive its destination through either communication between the autonomous robot vehicle and the sub-robot vehicle, or through communication with a central server.
[0083] In various embodiments, an autonomous robot vehicle includes a first land conveyance system configured to travel on vehicle roadways, a navigation system configured to navigate to a destination location, an exterior housing, and a sub-robot vehicle carried within the exterior housing while the first land conveyance system autonomously travels on the vehicle roadways to the destination location. The sub-robot vehicle includes a second land conveyance system configured to travel on pedestrian walkways, at least one module configured to store customer items where the at least one module includes at least one compartment or sub- compartment, at least one processor, and a memory storing instructions. The instructions, when executed by the at least one processor, cause the sub-robot vehicle to autonomously control the second conveyance system to exit the exterior housing and travel the pedestrian walkways to a customer pickup location.
[0084] In various embodiments, a method for autonomous robot vehicle delivery includes navigating via a navigation system configured to navigate to a destination location, autonomously traveling via a first land conveyance system on vehicle roadways to the destination location, and carrying a sub-robot vehicle within an exterior housing, where the sub-robot vehicle includes a second land conveyance system configured to travel on pedestrian walkways, and at least one module configured to store customer items. The method includes instructing the sub-robot vehicle to exit the exterior housing and autonomously travel, via the second land conveyance system, the pedestrian walkways to a customer pickup location.
[0085] In various embodiments, the customer pickup location includes a front door. In various embodiments, the destination includes a street curb near the customer pickup location. In various embodiments, the destination is at least one of a securable drop-box, a residential address, or a commercial address. In various embodiments, prior to the autonomous vehicle traveling to the destination location, the sub-robot vehicle receives an item or items corresponding to a purchase order.
The Operating Environments
[0086] In some embodiments, the unstructured open environment is a non-confined geographic region accessible by navigable pathways, including, for example, public roads, private roads, bike paths, open fields, open public lands, open private lands, pedestrian walkways, lakes, rivers or streams.
[0087] In some embodiments, the closed environment is a confined, enclosed or semi-enclosed structure accessible by navigable pathways, including, for example, open areas or rooms within commercial architecture, with or without structures or obstacles therein, airspace within open areas or rooms within commercial architecture, with or without structures or obstacles therein, public or dedicated aisles, hallways, tunnels, ramps, elevators, conveyors, or pedestrian walkways.
[0088] In some embodiments, the unstructured open environment is a non-confined airspace or even near-space environment which includes all main layers of the Earth's atmosphere including the troposphere, the stratosphere, the mesosphere, the thermosphere and the exosphere.
[0089] In some embodiments, the navigation module controls routing of the conveyance system of the robots in the fleet in the unstructured open or closed environments.
The Fleet Management Module
[0090] In some embodiments of the robot fleet 100, the fleet includes a fleet management module 120 (associated with a central server) for coordination of the robot fleet 100 and assignment of tasks for each robot 101 in the fleet. The fleet management module coordinates the activity and positioning of each robot in the fleet. In addition to communicating with the robot fleet, fleet owner/operator and/or user, the fleet management module also communicates with providers/vendors/businesses and customers to optimize behavior of the entire system.
[0091] The fleet management module works in coordination with a central server 110, typically located in a central operating facility owned or managed by the fleet owner 200.
[0092] As illustrated in FIG. 11, in one embodiment, a request is sent to a main server 110 (typically located at the fleet owner's or fleet manager's location), which then communicates with the fleet management module 120. The fleet management module then relays the request to the appropriate provider 204 of the service (e.g., restaurant, delivery service, vendor or retailer) and an appropriate robot or robots 101 in the fleet. The most appropriate robot or robots in the fleet, typically those within the geographic region and closest to the service provider, are then assigned the task, and the provider of the service 204 then interacts with that robot 101 at their business (e.g., loading it with goods, if needed). The robot then travels to the customer 202 and the customer interacts with the robot to retrieve their goods or service (e.g., the goods ordered). An interaction can include requesting the robot to open its compartment 102, 104 through the customer's app or through a user interface on the robot itself (using, e.g., an RFID reader and the customer's phone, a touchpad, a keypad, voice commands, vision-based recognition of the person, etc.). Upon completion of the delivery (or retrieval, if appropriate), the robot reports completion of the assignment and reports back to the fleet management module for re-assignment.
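A minimal sketch of the assignment step described above, assuming each robot reports its latitude/longitude, region, and availability; the haversine distance is used here only as a stand-in for actual road-network travel time.

```python
# Hypothetical sketch: assign the closest available robot to the service provider,
# as outlined for the request flow of FIG. 11.
import math


def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def assign_robot(robots: list[dict], provider_lat: float, provider_lon: float,
                 region: str) -> dict | None:
    """robots: dicts with 'id', 'lat', 'lon', 'region', 'available'."""
    candidates = [r for r in robots if r["available"] and r["region"] == region]
    if not candidates:
        return None
    return min(candidates,
               key=lambda r: haversine_km(r["lat"], r["lon"],
                                          provider_lat, provider_lon))
```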
[0093] As further illustrated in FIG. 12, and previously noted, in some embodiments, the fleet management module 120 handles coordination of the robot fleet 100 and assignment of tasks for each robot 101 in the fleet. The fleet management module coordinates the activity and positioning of each robot in the fleet. The fleet management module also communicates with vendors/businesses 204 and customers 202 to optimize behavior of the entire system. It does this by utilizing the robot's processor 125 to process the various inputs and outputs from each of the robot's systems and modules, including: the conveyance system 130, the power system 135, the navigation module 140, the sensor system 170, 175, the communication module 160, and the controller 150, to effectively manage and coordinate the various functions of each robot in the fleet.
[0094] In some embodiments, the robot may be requested for a pick-up of an item (e.g., a document) with the intent of delivery to another party. In this scenario, the fleet management module would assign the robot to arrive at a given location, assign a securable compartment for receipt of the item, confirm receipt from the first party to the fleet management module, then proceed to the second location where an informed receiving party would recover the item from the robot using an appropriate PIN or other recognition code to gain access to the secure compartment. The robot would then report completion of the assignment and report back to the fleet management module for re-assignment.
Conveyance Systems
[0095] Each robot vehicle 101 in the fleet includes a conveyance system 130 (e.g., a drive system with a propulsion engine, wheels, treads, wings, rotors, blowers, rockets, propellers, brakes, etc.).
[0096] As noted previously, the robot fleet is configurable for land, water or air. Typical vehicles include cars, wagons, vans, unmanned motor vehicles (e.g., tricycles, trucks, trailers, buses, etc.), unmanned railed vehicles (e.g., trains, trams, etc.), unmanned watercraft (e.g., ships, boats, ferries, landing craft, barges, rafts, etc.), aerial drones, unmanned hovercraft (air, land, and water types), unmanned aircraft, and unmanned spacecraft.
[0097] In one exemplary embodiment, a robot land vehicle 101 is configured with a traditional 4-wheeled automotive configuration comprising conventional steering and braking systems. The drive train is configurable for standard 2-wheel drive or 4-wheel all-terrain traction drive. The propulsion system (engine) is configurable as a gas engine, a turbine engine, an electric motor and/or a hybrid gas/electric engine. Alternatively, the robot could be configured with an auxiliary solar power system 135 to provide back-up emergency power or power for minor low-power sub-systems.
[0098] Alternative configurations of components to a total drive system with a propulsion engine could include wheels, treads, wings, rotors, blowers, rockets, propellers, brakes, etc.
[0099] In some embodiments, the robot fleet is configured for water travel as a watercraft with a propulsion system (engine) that is configurable as a gas engine, a turbine engine, an electric motor and/or a hybrid gas/electric engine and is further configured with a propeller.
[0100] In some embodiments, the robot fleet is configured for hover travel as an over-land or over-water hovercraft or an air-cushion vehicle (ACV) and is configured with blowers to produce a large volume of air below the hull that is slightly above atmospheric pressure. The propulsion system (engine) is configurable as a gas engine, a turbine engine, an electric motor and/or a hybrid gas/electric engine.
[0101] In some embodiments, the robot fleet is configured for air travel as an aerial drone or aerial hovercraft and is configured with wings, rotors, blowers, rockets, and/or propellers and an appropriate brake system. The propulsion system (engine) is configurable as a gas engine, a turbine engine, an electric motor and/or a hybrid gas/electric engine.
The Power System
[0102] In some embodiments, each robot of the robot fleet is configured with one or more power sources, which include the power system 135 (e.g., battery, solar, gasoline, propane, etc.).
Navigation Module
[0103] Each robot in the fleet further includes a navigation module 140 for navigation in the unstructured open or closed environments (e.g., digital maps, HD maps, GPS, etc.). In some embodiments, the fleet 100 relies on maps generated by the user, operator, or fleet operator, specifically created to cover the intended environment where the robot is configured to operate. These maps would then be used for general guidance of each robot in the fleet, and each robot would augment this understanding of the environment by using a variety of on-board sensors, such as cameras, LiDAR, altimeters or radar, to confirm its relative geographic position and elevation.
[0104] In some embodiments, for navigation, the fleet of robots uses internal maps to provide information about where they are going and the structure of the road environment (e.g., lanes, etc.) and combine this information with onboard sensors (e.g., cameras, LiDAR, radar, ultrasound, microphones, etc.) and internal computer processing to constantly determine where they can safely navigate, what other objects are around each robot and what they may do. In still other embodiments, the fleet incorporates on-line maps to augment internal maps. This information is then combined to determine a safe, robust trajectory for the robot to follow and this is then executed by the low level actuators on the robot.
[0105] In some embodiments, the fleet relies on a global positioning system (GPS) that allows land, sea, and airborne users to determine their exact location, velocity, and time 24 hours a day, in all weather conditions, anywhere in the world.
[0106] In some embodiments, the fleet of robots will use a combination of internal maps, sensors and GPS systems to confirm their relative geographic position and elevation.
[0107] In some embodiments, the autonomous fleet is strategically positioned throughout a geographic region in anticipation of a known demand.
[0108] Over time, a user 200 and/or a vendor 204 can anticipate demand for robot services by storing data concerning how many orders (and what type of orders) are made at particular times of day from different areas of the region. This can be done for both source (e.g., restaurants, grocery stores, general businesses, etc.) and destination (e.g., customer, other businesses, etc.). Then, for a specific current day and time, this stored data is used to determine the optimal location of the fleet given the expected demand. More concretely, the fleet can be positioned to be as close as possible to the expected source locations, anticipating that new orders are most likely to originate from these source locations. Even more concretely, it is possible to estimate the number of orders from each possible source in the next hour and weight each source location by this number. Then one can position the fleet so that the fleet optimally covers the weighted locations based on these numbers.
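The weighting and positioning described in paragraph [0108] can be sketched as a demand-weighted k-means over source locations. The function names and the use of planar coordinates are assumptions made only for illustration.

```python
# Hypothetical sketch of demand-weighted pre-positioning: estimate next-hour order
# counts per source from history, then place the fleet with a weighted k-means.
import random


def estimate_weights(history: list[tuple[int, int, str]], weekday: int, hour: int) -> dict:
    """history: list of (weekday, hour, source_id). Returns {source_id: count}."""
    weights: dict[str, int] = {}
    for d, h, src in history:
        if d == weekday and h == hour:
            weights[src] = weights.get(src, 0) + 1
    return weights


def position_fleet(sources: dict, weights: dict, n_robots: int,
                   iters: int = 20, seed: int = 0) -> list[tuple[float, float]]:
    """sources: {source_id: (x, y)}. Returns staging coordinates for the fleet."""
    rng = random.Random(seed)
    pts = [(sources[s], weights.get(s, 0)) for s in sources]
    centers = [xy for xy, _ in rng.sample(pts, min(n_robots, len(pts)))]
    for _ in range(iters):
        buckets: list[list] = [[] for _ in centers]
        for (x, y), w in pts:
            i = min(range(len(centers)),
                    key=lambda c: (x - centers[c][0]) ** 2 + (y - centers[c][1]) ** 2)
            buckets[i].append(((x, y), w))
        for i, b in enumerate(buckets):
            tot = sum(w for _, w in b)
            if tot:  # keep the previous center if no weighted points were assigned
                centers[i] = (sum(x * w for (x, _), w in b) / tot,
                              sum(y * w for (_, y), w in b) / tot)
    return centers
```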
[0109] In some embodiments of the robot fleet, the positioning of robots can be customized based on: anticipated use, a pattern of historical behaviors, or specific goods being carried.
Sensor Systems
[0110] As noted previously, each robot is equipped with a sensor system 170, which includes at least a minimum number of onboard sensors (e.g., cameras (for example, those running at a high frame rate akin to video), LiDAR, radar, ultrasonic sensors, microphones, etc.) and internal computer processing 125 to constantly determine where it can safely navigate, what other objects are around each robot, and what it may do within its immediate surroundings.
[0111] In some embodiments, the robots of the robot fleet further include conveyance system sensors 175 configured to: monitor drive mechanism performance (e.g., the propulsion engine); monitor power system levels 135 (e.g., battery, solar, gasoline, propane, etc.); or monitor drive train performance (e.g., transmission, tires, brakes, rotors, etc.).
[0112] In accordance with aspects of the present disclosure, disclosed is a system that processes sensor data using an on-board computer inside the autonomous vehicle and sends out only relevant data to a central server or to remote drivers ("teleoperators") during a remote operation mode, in order to reduce the bandwidth and amount of data transferred.
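A hedged sketch of such on-board filtering: only frames that have changed appreciably since the last transmission, or that carry detections needing review, are streamed, and frames are downsampled first. The thresholds and the numpy-based frame representation are assumptions, not part of the disclosure.

```python
# Hypothetical sketch: decide which sensor frames are worth transmitting to a
# teleoperator, and reduce their resolution before sending.
import numpy as np


def should_transmit(frame: np.ndarray, last_sent: np.ndarray | None,
                    detections: list[str], diff_threshold: float = 12.0) -> bool:
    """Send a frame only if the scene changed enough or a detection needs review."""
    if last_sent is None or detections:
        return True
    mean_abs_diff = float(np.mean(np.abs(frame.astype(np.int16) -
                                         last_sent.astype(np.int16))))
    return mean_abs_diff > diff_threshold


def downsample(frame: np.ndarray, factor: int = 4) -> np.ndarray:
    """Reduce resolution before transmission to cut bandwidth further."""
    return frame[::factor, ::factor]
```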
Communications Module
[0113] Each robot in the fleet further includes a communication module 160 configurable to receive, store, and send data to and from the fleet management module 120, a user, and the other robots in the fleet 100. In some embodiments, the data is related to at least user interactions and the robot fleet interactions, including, for example, scheduled requests or orders, on-demand requests or orders, or a need for self-positioning of the robot fleet based on anticipated demand within the unstructured open or closed environments.
[0114] In some embodiments, each robot in the fleet includes at least one communication module configurable to receive, store and transmit data, and to store that data to a memory device, for future data transfer or manual download.
[0115] In some embodiments, each business 204 and customer 202 has their own app/interface to communicate with the fleet operator 200 (e.g., "Nuro customer app" for customers on their phone, "Nuro vendor app" for businesses on a tablet or phone or their internal computer system, etc.).
[0116] In some embodiments, the communication to the user and the robots in the fleet, between the robots of the fleet, and between the user and the robots in the fleet, occurs via wireless transmission.
[0117] In some embodiments, the user's wireless transmission interactions and the robot fleet wireless transmission interactions occur via mobile application transmitted by an electronic device and forwarded to the communication module via: a central server, a fleet management module, and/or a mesh network.
[0118] In some embodiments, one preferred method of communication is to use cellular communication between the fleet manager and the fleet of robots (e.g., 3G, 4G, 5G, or the like). Alternatively, the communication between the fleet control module and the robots could occur via satellite communication systems.
[0119] In some embodiments, a customer uses an app (either on a cellphone, laptop, tablet, computer or any interactive device) to request a service (e.g., an on-demand food order or for a mobile marketplace robot to come to them).
[0120] In some embodiments, the electronic device includes: a phone, a personal mobile device, a personal digital assistant (PDA), a mainframe computer, a desktop computer, a laptop computer, a tablet computer, and/or wearable computing device such as a communication headset, smart glasses, a contact lens or lenses, a digital watch, a bracelet, a ring, jewelry, or a combination thereof.
[0121] In various embodiments, the present disclosure includes a system that stores the user's default location and payment method, and allows the user to summon an autonomous vehicle and/or buy goods and services via a smartphone app or a website by clicking one button. There may be another button that allows the user to edit or change the location or payment method if the user chooses to. Thus, disclosed is a streamlined way for a customer to transact with an autonomous vehicle by using a single button of an app and/or website. A customer using the app and/or website logs in and sets a default location and default payment method. This information is associated with the single button of the app and/or website. In all cases when the user clicks the single button, an autonomous vehicle is summoned and travels to the associated default location. However, in various situations, the single button can cause the autonomous vehicle to deliver an item or items to the customer, such as an item or items that the customer ordered.
[0122] In various embodiments, an autonomous vehicle management system includes a database configured to store information of a customer where the information includes a default location, a communication system configured to communicate with an autonomous vehicle and with a device of the customer where the device includes a display screen having a button that is associated with the information of the customer, at least one processor, and a memory storing instructions. The instructions, when executed by the processor(s), cause the autonomous vehicle management system to receive an indication via the communication system that the button on the device of the customer has been selected, access in the database the default location of the customer, and instruct the autonomous vehicle to travel to the default location based on the indication.
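One way the system side of paragraph [0122] could look in code, assuming a simple in-memory customer database and a dispatcher object exposing a hypothetical dispatch_to() call; none of these names are part of the disclosure.

```python
# Hypothetical sketch of the single-button summon flow: look up the stored
# default location (and payment method) and dispatch a vehicle without any
# confirmation screen.
from dataclasses import dataclass


@dataclass
class CustomerRecord:
    customer_id: str
    default_location: tuple[float, float]
    payment_method: str


class SummonService:
    def __init__(self, customer_db: dict[str, CustomerRecord], dispatcher):
        self.customer_db = customer_db
        self.dispatcher = dispatcher      # assumed client of the fleet management module

    def on_button_pressed(self, customer_id: str) -> dict:
        record = self.customer_db.get(customer_id)
        if record is None:
            return {"status": "unknown_customer"}
        # No confirmation screen: the stored defaults are used as-is.
        vehicle_id = self.dispatcher.dispatch_to(record.default_location)
        return {"status": "dispatched", "vehicle_id": vehicle_id,
                "charged_to": record.payment_method}
```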
[0123] In various embodiments, an apparatus for summoning an autonomous vehicle includes a communication device configured to communicate with an autonomous vehicle management system, a display screen, at least one processor, and a memory storing instructions. The instructions, when executed by the processor(s), cause the apparatus to communicate via the communication device information of a customer to the autonomous vehicle management system where the information includes a default location of the customer, display on the display screen a button associated with the default location, communicate to the autonomous vehicle management system via the communication device an indication that the button has been selected, and receive, in response to communicating the indication that the button has been selected, an indication from the autonomous vehicle management system that an autonomous vehicle will be dispatched to the default location.
[0124] In various embodiments, the information of the customer includes a payment account. In various embodiments, the button is associated with a payment account, and selection of the button causes a charge to the payment account. In various embodiments, the button is not associated with a destination confirmation screen, such that selection of the button causes an autonomous vehicle to travel to the default location without a destination confirmation screen. In various embodiments, the button is associated with a purchase order and, prior to the autonomous vehicle traveling to the default location, the vehicle receives an item or items corresponding to the purchase order. In various embodiments, a customer can place items into a basket, and with one click, the customer can check out and the system dispatches an autonomous vehicle. Thus, the system does not need to ask the customer for the customer's location (which can be the customer's current phone location or a saved default location) or the customer's payment method (which can be a saved payment method or, for example, Apple® Pay). In various embodiments, the system can also change the destination location in real time if the customer is on the move.
[0125] In various embodiments, the customer does not need to purchase any items and can summon an autonomous vehicle with one click, and the autonomous vehicle will travel to the customer. In various embodiments, such a summoned autonomous vehicle can be pre-stocked with products that the customer can buy directly from the autonomous vehicle, or the summoned autonomous vehicle can pick up items from the customer for return purposes, for selling an item to someone else, or for another reason.
Goods and Services
[0126] In some embodiments, the user includes a fleet manager, a sub-contracting vendor, a service provider, a customer, a business entity, an individual, or a third party.
[0127] In some embodiments, the services include: subscription services, prescription services, marketing services, advertising services, notification services, or requested, ordered or scheduled delivery services. In particular embodiments, the scheduled delivery services include, by way of example, special repeat deliveries such as groceries, prescriptions, drinks, mail, documents, etc.
[0128] In some embodiments, the services further include: the user receiving and returning the same or similar goods within the same interaction (e.g., signed documents); the user receiving one set of goods and returning a different set of goods within the same interaction (e.g., product replacements/returns, groceries, merchandise, books, recordings, videos, movies, payment transactions, etc.); or a third party user providing instruction and/or authorization to a goods or service provider to prepare, transport, deliver and/or retrieve goods to a principal user in a different location.
[0129] In some embodiments, the services further include: advertising services, land survey services, patrol services, monitoring services, traffic survey services, signage and signal survey services, architectural building or road infrastructure survey services.
[0130] In some embodiments, at least one robot is further configured to process or manufacture goods.
[0131] In some embodiments, the processed or manufactured goods include: beverages, with or without condiments (such as coffee, tea, carbonated drinks, etc.); various fast foods; or microwavable foods.
[0132] In some embodiments, the robots within the fleet are equipped for financial transactions. These can be accomplished using known transaction methods such as debit/credit card readers or the like.
[0133] In accordance with aspects of the present disclosure, a grocery delivery system via a fleet of autonomous vehicles is disclosed. The vehicle can deliver fresh produce and other grocery items. The vehicles may or may not include temperature control inside the compartments. In various embodiments, the temperature inside an autonomous vehicle can be controlled based on the products carried inside the vehicle. The temperature can be set manually or automatically based on the system's or vehicle's knowledge of what it is carrying, and can include heating or cooling. In various embodiments, the system enables transporting groceries between residential and/or industrial locations with one or more autonomous vehicles. The autonomous vehicles include any number of compartments for storing, preserving, heating, and/or cooling such groceries. Such compartments include one or more temperature and/or humidity controlled compartments for preserving food and/or delivering food at predetermined temperatures and/or humidity ranges. Groceries include produce, frozen foods, hot foods, wet foods, dry foods, and related consumer products such as medication, hygiene products, toys, magazines, cards, and/or other specialty items.
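The automatic temperature setting described above could, for example, map the most temperature-critical category on board to a compartment setpoint and drive a simple heat/cool/hold decision. The category names and setpoints below are illustrative assumptions, not values taken from the disclosure.

```python
# Hypothetical sketch: derive a compartment temperature setpoint automatically
# from the categories of the groceries loaded, then make a bang-bang decision.
SETPOINTS_C = {            # assumed target temperatures per category, in Celsius
    "frozen": -18.0,
    "hot": 60.0,
    "chilled": 4.0,
    "ambient": 20.0,
}


def compartment_setpoint(categories: list[str]) -> float:
    """Use the most temperature-critical category present in the compartment."""
    for cat in ("frozen", "hot", "chilled", "ambient"):
        if cat in categories:
            return SETPOINTS_C[cat]
    return SETPOINTS_C["ambient"]


def control_step(current_c: float, setpoint_c: float, deadband_c: float = 1.0) -> str:
    """Simple decision for the heating/cooling module: 'cool', 'heat', or 'hold'."""
    if current_c > setpoint_c + deadband_c:
        return "cool"
    if current_c < setpoint_c - deadband_c:
        return "heat"
    return "hold"
```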
[0134] In various embodiments, an autonomous robotic vehicle includes a conveyance system configured to autonomously drive the autonomous robotic vehicle between at least one grocery storage location and at least one delivery location, and a compartment coupled to the conveyance system and configured to receive at least one grocery item stored at the at least one grocery storage location. The compartment includes a temperature control module configured to maintain the compartment within a predetermined temperature range to provide temperature control for the at least one grocery item as the conveyance system drives between the at least one grocery storage location and the at least one delivery location.
[0135] In various embodiments, an autonomous vehicle management system for delivering groceries includes a database configured to store a list of groceries for delivery by an autonomous vehicle and configured to store information of a delivery location and of at least one grocery storage location, a communication system configured to communicate with a computing device to enable at least one grocery item to be selected from the list of groceries stored on the database, at least one processor, and a memory storing instructions. The instructions, when executed by the at least one processor, cause the autonomous vehicle management system to access in the database the at least one grocery item selected, instruct the storage location to load the at least one grocery item selected on the autonomous vehicle, and instruct the autonomous vehicle to travel to the delivery location when the at least one grocery item selected is loaded in the autonomous vehicle.
[0136] In various embodiments, the autonomous vehicle and/or fleet can include specific heating, cooling, and/or humidity modules, manual or automatic temperature changes, vehicles that deliver only certain types of groceries (e.g., cold or hot), multiple vehicles coordinating with one another, scheduling of deliveries, returning items or whole deliveries, changes in orders, customer verification, multiple location drop off, and/or bulk ordering, among other features.
[0137] In accordance with aspects of the present disclosure, disclosed is a system that allows a user to take a picture of a product she would like to purchase and receive the product via an autonomous delivery vehicle. In various embodiments, the picture taken may or may not need to include the product barcode. In various embodiments, the system involves recognizing images of products captured by a customer and delivering the products to the customer via an autonomous delivery vehicle.
[0138] In various embodiments, a customer using an app and/or website logs in and transmits images of products to the delivery system. The delivery system then performs image recognition, identifies the products, and transmits the identified products to the customer. The customer then verifies and selects products from the identified products and transmits the selected products to the delivery system, which controls autonomous delivery vehicles to deliver the selected products to the customer upon reception of the selected products. Thus, disclosed is an easier way for customers to order products by transmitting images of products via an app and/or website, and the delivery system controls the delivery of the products via autonomous delivery vehicles.
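A minimal sketch of the round trip in paragraph [0138], with the image-recognition backend left as a pluggable callable; the recognize and dispatch_to interfaces are assumptions, not disclosed APIs.

```python
# Hypothetical sketch of the image-based ordering flow: receive images, return
# recognized candidates for confirmation, then dispatch a vehicle with the
# confirmed products.
from typing import Callable


class DeliveryControlSystem:
    def __init__(self, recognize: Callable[[bytes], list[str]], dispatcher):
        self.recognize = recognize        # image bytes -> candidate product names
        self.dispatcher = dispatcher      # assumed fleet dispatch interface
        self.pending: dict[str, list[str]] = {}

    def receive_images(self, customer_id: str, images: list[bytes]) -> list[str]:
        """Identify candidate products and return them for customer confirmation."""
        candidates: list[str] = []
        for img in images:
            candidates.extend(self.recognize(img))
        self.pending[customer_id] = candidates
        return candidates

    def confirm_order(self, customer_id: str, selected: list[str], location) -> dict:
        """Dispatch an autonomous vehicle with the products the customer confirmed."""
        valid = [p for p in selected if p in self.pending.get(customer_id, [])]
        if not valid:
            return {"status": "nothing_selected"}
        vehicle_id = self.dispatcher.dispatch_to(location)
        return {"status": "dispatched", "vehicle_id": vehicle_id, "items": valid}
```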
[0139] In various embodiments, a delivery control system includes a communication device configured to receive images of products transmitted from a customer, at least one processor, and a memory storing instructions. The instructions, when executed by the at least one processor, cause the delivery control system to perform an image recognition process to identify the products in the images, transmit identified products to the communication device, receive from the communication device products selected by the customer, and instruct an autonomous vehicle to deliver the selected products to a location of the customer.
[0140] In various embodiments, an apparatus for ordering products includes a communication device, a display screen, at least one processor, and a memory storing instructions. The instructions, when executed by the at least one processor, cause the apparatus to receive one or more images of products from a customer, transmit via the communication device the one or more images to a delivery system, receive identified products from the delivery system, display the identified products on the display screen, receive from the customer selected products from among the identified products, and transmit via the communication device an order to the delivery system, where the order includes the selected products from the identified products and a location of the customer.
[0141] In various embodiments, the delivery control system and/or the customer's apparatus can include transmission of updated location of the customer, payment before transmitting the order, use of website images of products or images captured by the customer, confirmation between the customer and the delivery system via the app/web page for confirming the ordered products, image recognition of products in case no product is identified in an image or more than two potential products are identified in an image, delivery feature to deliver the products to the most recent location of the customer, and/or selection of autonomous delivery vehicles based on the most recent location or the default location of the customer.
[0142] In accordance with aspects of the present disclosure, disclosed is an integrated system to ensure secure delivery of prescription drugs, including a number of secure lockers on an autonomous vehicle that can only be accessed by the intended recipient. The recipient can verify her identity by a built-in ID verification system on the autonomous vehicle, which may include a camera or a card reader to detect and identify the ID card along with a facial recognition system to compare the user with the ID card. If the system does not recognize a face, a remote human operator can access the camera stream to verify the ID and customer manually. In various embodiments, an ID and/or age verification system on an autonomous vehicle platform may include a fingerprint-based system.
[0143] In various embodiments, an autonomous robotic vehicle includes a conveyance system, a securable compartment configured to autonomously lock and unlock where the securable compartment contains an item for delivery to a particular individual, a personal identification reader, at least one processor, and a memory storing instructions. The instructions, when executed by the at least one processor, cause the autonomous robotic vehicle to, autonomously, travel to a destination location of the particular individual, capture by the personal identification reader at the destination location a personal identification object, determine that the captured personal identification object matches an identity of the particular individual, and unlock the securable compartment based on the determination.
[0144] In various embodiments, the item is a prescription drug. In various embodiments, the personal identification reader is a camera and the personal identification object is a face, and identity is verified if the captured face matches a face image on file. In various embodiments, the personal identification reader is a camera and the personal identification object is a government issued photo ID card. In various embodiments, identity is verified if captured photo ID information matches the photo and information on file. In various embodiments, identity is verified if the captured face matches the photo on the photo ID card. In various embodiments, the system includes receiving an image and transmitting the image off-site to compare and determine if the individual is an intended recipient. In various embodiments, additional verification by having a recipient answer questions and/or enter information regarding the prescribing physician, the pharmacist, the medication, or medical history is contemplated.
[0145] In various embodiments, the system can unlock and/or open an autonomous vehicle based on facial recognition. A recipient's ID or image of a face can be saved, and the system can compare the recipient's face with the previously saved ID or photograph. In this manner, a recipient does not need to provide any identification to be verified to receive an order. In various embodiments, the system can unlock and/or open an autonomous vehicle based on the recipient's ID. The system can compare the recipient's current ID with a name on record and/or a previously saved ID or photograph of an ID. In various embodiments, the system can unlock and/or open an autonomous vehicle based on a combination of facial verification and ID verification.
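A hedged sketch of the unlock decision, assuming face images are reduced to embeddings by some unspecified model and compared against the embedding on file with a cosine-similarity threshold; a failed match falls back to the remote-operator check described in paragraph [0142]. The threshold and function names are illustrative assumptions.

```python
# Hypothetical sketch: compare a face embedding captured at the vehicle with the
# embedding on file; escalate to a remote operator when the match is uncertain.
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0


def verify_recipient(captured_embedding: list[float],
                     embedding_on_file: list[float],
                     threshold: float = 0.8) -> str:
    """Return 'unlock', or 'escalate' to route the camera stream to an operator."""
    if cosine_similarity(captured_embedding, embedding_on_file) >= threshold:
        return "unlock"
    return "escalate"      # a certified remote operator checks the face/ID manually
```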
[0146] In various embodiments, the ID processing system may be on the autonomous vehicle or may be server-side. In various embodiments, a database on the server side keeps a record of a face and/or ID. In various embodiments, when the verification involves real-time comparisons between an ID and a face, the system may not need to save preexisting records or photographs.
[0147] In various embodiments, a manual ID system includes capturing information using a camera on the autonomous vehicle and having a certified remote operator check the face and/or ID manually in real time.
[0148] In various embodiments, to verify that the face being used for verification is a live face and is not a picture of a face, the system can use vehicle sensors that have depth perception (such as LiDAR and radar) to check that the face being presented is three dimensional. In various embodiments, the system can use two cameras to capture parallax effects of a 3D object to verify that the face is three dimensional. In various embodiments, the autonomous vehicle can instruct the person to follow a command, such as blinking their eyes, to verify that the face is a live face.
[0149] In various embodiments, the face and/or ID verification system can be implemented on the recipient's device, such as a smartphone. For example, the device's camera can be used to capture live images instead of a camera on an autonomous vehicle. In such embodiments, an additional check can be performed to verify that the user device is in close proximity to the autonomous vehicle, before allowing the vehicle to unlock.
[0150] In accordance with aspects of the present disclosure, disclosed is a system for the user to subscribe to certain products and have them delivered automatically to a desired location by an autonomous vehicle on a regular interval without any need to make purchases every time. In various embodiments, the service can confirm the exact location with the customer just before the delivery, and/or the customer can make real-time changes to the location. Thus, the systems and methods provide for secure delivery of goods on a regular interval based on a subscription, including goods such as prescriptions, groceries, detergent, engine oil, or any item that becomes consumed or depleted over time.
[0151] In various embodiments, an autonomous delivery management system includes at least one processor and a memory storing instructions which, when executed by the at least one processor, cause the autonomous delivery management system to access subscription information that includes an item and a time interval for regularly delivering the item to a customer, determine a handling itinerary for the item that includes delivery of the item in compliance with the time interval, and communicate instructions to an autonomous vehicle based on the handling itinerary.
[0152] In various embodiments, the subscription information includes an automatic payment method. In various embodiments, the handling itinerary includes a pickup location, a destination location, and a deadline for delivery. In various embodiments, the system can adjust the delivery location and/or time when indicated by the customer.
[0153] In various embodiments, the customer subscribes to a product once, which can include setting up dates and/or times for regular deliveries, and the system dispatches a vehicle with the subscribed product on that regular basis. On the day of the delivery, the system may or may not confirm with the customer ahead of time. If advance confirmation is desired, the system can send a notification before the scheduled delivery time, and allow the customer to modify the details of the delivery, such as precise timing, location, and/or quantity, or modify the product itself, or cancel the delivery. If advance confirmation is not required, the system can dispatch an autonomous vehicle and have the vehicle wait for the customer at the desired location. If the customer misses the delivery, the system can instruct the vehicle to leave the customer location. In various embodiments, the customer can cancel the subscription up to the very last minute or nearly in real-time with minimal or zero cancellation fees.
[0154] In various embodiments, the customer can enable the location feature on the customer's smartphone app, and the system can determine when the customer is at a particular location. The system can modify the delivery time and/or location based on the particular location or based on an estimate of when the customer will be home if travelling from the particular location. As an example, the customer chooses to receive monthly delivery at home at 6 pm, but at 5:45 pm the system determines that the customer is not home yet. The system can delay the delivery, with or without confirming with the customer. When the system determines that the customer is home or on her way home with a certain estimated time of arrival, it can dispatch a vehicle with the subscribed products to arrive at the customer's home at approximately the same time as the customer or at another time relative to the estimated time of arrival.
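The interval and hold-for-customer logic of paragraphs [0151]-[0154] can be sketched as follows; the decision strings and the ETA handling are illustrative assumptions rather than the disclosed method.

```python
# Hypothetical sketch of interval-based subscription dispatch with an optional
# hold when the customer is not yet at the delivery location.
from datetime import datetime, timedelta


def next_delivery(last_delivery: datetime, interval_days: int) -> datetime:
    """Compute the next scheduled delivery from the subscription interval."""
    return last_delivery + timedelta(days=interval_days)


def dispatch_decision(now: datetime, scheduled: datetime,
                      customer_at_location: bool,
                      customer_eta_minutes: float | None) -> str:
    """Decide whether to dispatch now, delay, or keep waiting for the schedule."""
    if now < scheduled:
        return "wait"
    if customer_at_location:
        return "dispatch"
    if customer_eta_minutes is not None:
        # Time the departure so the vehicle and customer arrive together.
        return f"dispatch_in_{max(0.0, customer_eta_minutes):.0f}_minutes"
    return "delay_and_recheck"
```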
Securable Compartments
[0155] As illustrated in FIG. 2, robots in the fleet are each configured for transporting, delivering or retrieving goods or services and are capable of operating in an unstructured open environment or closed environment. In some embodiments, the vehicle 101 is configured to travel practically anywhere that a small all-terrain vehicle could travel on land, while providing at least one and preferably two large storage compartments 102, and more preferably, at least one large compartment 102 is configured with smaller internal secure compartments 104 of variable configurations to carry individual items that are to be delivered to, or need to be retrieved from customers.
[0156] Alternately, in some embodiments, the vehicle could be configured for water travel, providing at least one and preferably two large storage compartments, and more preferably, at least one large compartment is configured with smaller internal secure compartments of variable configurations to carry individual items that are to be delivered to, or need to be retrieved from customers.
[0157] Further still, in some embodiments, the vehicle could be configured for hover travel, providing at least one and preferably two large storage compartments, and more preferably, at least one large compartment is configured with smaller internal secure compartments of variable configurations to carry individual items that are to be delivered to, or need to be retrieved from customers.
[0158] Further still, in some embodiments, the vehicle could be configured for aerial drone or aerial hover travel, providing at least one and preferably two large storage compartments, and more preferably, at least one large compartment is configured with smaller internal secure compartments of variable configurations to carry individual items that are to be delivered to, or need to be retrieved from customers.
[0159] As illustrated in FIGS. 7 - 10, in some embodiments, the securable compartments are humidity and temperature controlled for, for example, hot goods, cold goods, wet goods, dry goods, or combinations or variants thereof. Further still, as illustrated in FIGS. 8 - 10, the compartment(s) are configurable with various amenities, such as compartment lighting for night deliveries and condiment dispensers.
[0160] In some embodiments, the securable compartments are configurable for various goods. Such configurations and goods include: bookshelves for books, thin drawers for documents, larger box-like drawers for packages, and sized compartments for vending machines, coffee makers, pizza ovens and dispensers.
[0161] In some embodiments, the securable compartments are variably configurable based on: anticipated demands, patterns of behaviors, area of service, or types of goods to be transported.
[0162] Further still, each robot includes securable compartments to hold said goods or items associated with said services, and a controller 150 configurable to associate each one of the securable compartments 102, 104 to an assignable customer 202 or provider 204 and provide entry when authorized. Each robot vehicle further includes at least one processor configured to manage the conveyance system, the navigation module, the sensor system, instructions from the fleet management module, the communication module, and the controller.
[0163] As described previously, each robot is configured with securable compartments. Alternately, a robot is configurable to contain a set of goods or even a mobile marketplace (similar to a mini bar at a hotel).
[0164] When a robot is assigned to a customer 202, one or more of the compartments 102, 104 is also assigned to that customer. Each of the large compartments 102 is secured separately and can securely transport goods to a separate set of customers 202.
[0165] Upon arrival of the robot at the customer destination, the customer can then open their respective compartment(s) by verifying their identity with the robot. This can be done through a wide variety of approaches comprising, but not limited to:
1. The customers can be given a PIN (e.g., 4 digit number) when they make their initial request/order. They can then enter this pin at the robot using the robot touchscreen or a keypad.
2. The customers can verify themselves using their mobile phone and an RFID reader on the robot.
3. The customers can verify themselves using their voice and a personal keyword or key phrase they speak to the robot.
4. The customers can verify themselves through their face, a government ID, or a business ID badge using cameras and facial recognition or magnetic readers on the robot.
5. The customers can verify themselves using their mobile phone, by pushing a button or entering a predetermined code on their phone (and the system could optionally detect that the customer is near the robot by using the GPS position from the phone).
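As one illustration of item 1 above, the PIN issued at order time could be checked at the robot with a constant-time comparison and a bounded number of attempts before escalation. The class below is a hypothetical sketch, not the disclosed controller implementation.

```python
# Hypothetical sketch: verify the PIN issued with the order before the assigned
# compartment unlocks; lock out after repeated failures.
import hmac


class CompartmentLock:
    MAX_ATTEMPTS = 3

    def __init__(self, compartment_id: str, expected_pin: str):
        self.compartment_id = compartment_id
        self._expected_pin = expected_pin
        self._attempts = 0

    def try_unlock(self, entered_pin: str) -> bool:
        """Return True when the PIN matches; False otherwise or when locked out."""
        if self._attempts >= self.MAX_ATTEMPTS:
            return False                   # locked out; escalate to an operator
        self._attempts += 1
        if hmac.compare_digest(entered_pin, self._expected_pin):
            self._attempts = 0
            return True                    # the controller releases the latch
        return False
```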
[0166] In accordance with aspects of the present disclosure, disclosed is a temporary storage system using autonomous vehicles. In various embodiments, a user can drop off items with an autonomous vehicle in one location, and then schedule to pick up the items from the autonomous vehicle in another location and/or at a later time. In this manner, a customer, for example, would not need to leave time to return to their original drop-off location to retrieve their items. In various embodiments, the temporary storage system can be used as a delayed delivery system. For example, a customer can place an order ahead of time (e.g., at least 4 hours ahead or a day ahead), and a human operator can place the order into an autonomous vehicle ahead of time. The system can inform the customer when the order is ready, and the customer can notify the autonomous vehicle or delivery system within a period of time (such as within 24 hours of the order) to schedule pick up. The autonomous vehicle can keep the order inside for the period of time (e.g., 24 hours), and the customer can summon it when the customer is ready.
Controller and Processor
[0167] In some embodiments, each robot in the robot fleet is equipped with one or more processors 125 capable of both high-level computing for processing and low-level safety-critical computing for controlling the hardware. The at least one processor is configured to manage the conveyance system, the navigation module, the sensor system, instructions from the fleet management module, the communication module and the controller.
[0168] Further still, in some embodiments, each robot in the robot fleet is equipped with a controller 150 configurable to associate each one of the securable compartments 102, 104 to an assignable customer 202 or provider 204 and provide entry when authorized.
Additional Features
[0169] In some embodiments, the robot fleet further includes at least one robot having a digital display for curated content comprising: advertisements (e.g., for both a specific user and the general public), including services provided, marketing/promotion, regions/locations of areas served, customer details, local environment, lost, sought or detected people, public service announcements, date, time, or weather.
[0170] The embodiments disclosed herein are examples of the disclosure and may be embodied in various forms. For instance, although certain embodiments herein are described as separate embodiments, each of the embodiments herein may be combined with one or more of the other embodiments herein. Specific structural and functional details disclosed herein are not to be interpreted as limiting, but as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure. Like reference numerals may refer to similar or identical elements throughout the description of the figures.
[0171] The phrases "in an embodiment," "in embodiments," "in various embodiments," "in some embodiments," or "in other embodiments" may each refer to one or more of the same or different embodiments in accordance with the present disclosure. A phrase in the form "A or B" means "(A), (B), or (A and B)." A phrase in the form "at least one of A, B, or C" means "(A); (B); (C); (A and B); (A and C); (B and C); or (A, B, and C)."
[0172] Any of the herein described methods, programs, algorithms or codes may be converted to, or expressed in, a programming language or computer program. The terms "programming language" and "computer program," as used herein, each include any language used to specify instructions to a computer, and include (but are not limited to) the following languages and their derivatives: Assembler, Basic, Batch files, BCPL, C, C+, C++, Delphi, Fortran, Java, JavaScript, machine code, operating system command languages, Pascal, Perl, PL1, Python, scripting languages, Visual Basic, metalanguages which themselves specify programs, and all first, second, third, fourth, fifth, or further generation computer languages. Also included are database and other data schemas, and any other meta-languages. No distinction is made between languages which are interpreted, compiled, or use both compiled and interpreted approaches. No distinction is made between compiled and source versions of a program. Thus, reference to a program, where the programming language could exist in more than one state (such as source, compiled, object, or linked) is a reference to any and all such states. Reference to a program may encompass the actual instructions and/or the intent of those instructions.
[0173] The systems described herein may also utilize one or more controllers to receive various information and transform the received information to generate an output. The controller may include any type of computing device, computational circuit, or any type of processor or processing circuit capable of executing a series of instructions that are stored in a memory. The controller may include multiple processors and/or multicore central processing units (CPUs) and may include any type of processor, such as a microprocessor, digital signal processor, microcontroller, programmable logic device (PLD), field programmable gate array (FPGA), or the like. The controller may also include a memory to store data and/or instructions that, when executed by the one or more processors, cause the one or more processors to perform one or more methods and/or algorithms.
[0174] It should be understood that the foregoing description is only illustrative of the present disclosure. Various alternatives and modifications can be devised by those skilled in the art without departing from the disclosure. Accordingly, the present disclosure is intended to embrace all such alternatives, modifications and variances. The embodiments described with reference to the attached drawing figures are presented only to demonstrate certain examples of the disclosure. Other elements, steps, methods, and techniques that are insubstantially different from those described above and/or in the appended claims are also intended to be within the scope of the disclosure.

Claims

WHAT IS CLAIMED IS:
1. A robot fleet comprising a plurality of robot vehicles operating autonomously and a fleet management module for coordination of the robot fleet, the fleet management module configured to coordinate the activity and positioning of each robot in the fleet, the fleet configured for transporting, delivering, or retrieving goods or services and capable of operating in unstructured open or closed environments, each robot in the fleet comprising: a) a power system;
b) a conveyance system;
c) a navigation module for navigation in the unstructured open or closed environments;
d) a communication module configurable to receive, store, and send data to the fleet management module, comprising: scheduled requests or orders, on-demand requests or orders, or a need for self-positioning of the robot fleet based on anticipated demand within the unstructured open or closed environments;
e) a sensor system;
f) at least one securable compartment to hold said goods or items associated with said services;
g) a controller configurable to associate the at least one securable compartment to an assignable customer, a customer group or a provider and provide entry when authorized; and
h) at least one processor configured to manage the conveyance system, the navigation module, the sensor system, instructions from the fleet management module, the communication module and the controller.
2. The robot fleet of claim 1, wherein the unstructured open environment is a non-confined geographic region accessible by navigable pathways comprising: public roads, private roads; bike paths, open fields, open public lands, open private lands, pedestrian walkways, lakes, rivers or streams, and wherein the unstructured, closed environment is a confined, enclosed or semi-enclosed structure accessible by navigable pathways comprising: open areas or rooms within commercial architecture, with or without structures or obstacles therein, airspace within open areas or rooms within commercial architecture, with or without structures or obstacles therein, public or dedicated aisles, hallways, tunnels, ramps, elevators, conveyors, or pedestrian walkways.
3. The robot fleet of claim 1, wherein the navigation module controls routing of the conveyance system of the robots in the fleet in the unstructured open or closed environments.
4. The robot fleet of claim 1, wherein the communication to the user and the robots in the fleet, between the robots of the fleet and between the user and the robots in the fleet occurs via wireless transmission.
5. The robot fleet of claim 4, wherein the user comprises: a fleet manager, a sub-contracting vendor, a service provider, a customer, a business entity, an individual, or a third party.
6. The robot fleet of claim 4, wherein the user's wireless transmission interactions and the robot fleet wireless transmission interactions occur via mobile application transmitted by an electronic device and forwarded to the communication module via one or more of: a central server, a fleet management module, and a mesh network.
7. The robot fleet of claim 6, wherein the electronic device comprises one or more of: a smartphone, a personal mobile device, a personal digital assistant (PDA), a desktop computer, a laptop computer, a tablet computer, and a wearable computing device.
8. The robot fleet of claim 1, wherein each robot in the fleet is configured with a maximum speed range from 1.0 mph to 90.0 mph.
9. The robot fleet of claim 1, wherein the plurality of securable compartments are humidity and temperature controlled for one or more of: hot goods, cold goods, wet goods, dry goods.
10. The robot fleet of claim 1, wherein the at least one or the plurality of securable compartments are configurable for a plurality of goods, wherein the at least one or the plurality of securable compartments and goods comprise: bookshelves for books, thin drawers for documents, shelves or compartments designed to hold a variety of items that can be selected and purchased by a customer, larger box-like drawers for packages, and sized compartments for vending machines, coffee makers, ovens, and dispensers.
11. The robot fleet of claim 1, wherein the plurality of securable compartments is variably configurable based on one or more of: anticipated demands, patterns of behaviors, area of service, and types of goods to be transported.
12. The robot fleet of claim 1, wherein the services comprise: subscription services, prescription services, marketing services, advertising services, notification services, dry cleaning, rental of objects, sharing or loaning objects comprising shoes, clothes, goods repair, shipping items or scheduled delivery services.
13. The robot fleet of claim 12, wherein the services further comprise:
a) the user receiving and returning the same or similar goods within the same interaction;
b) the user receiving one set of goods and returning a different set of goods within the same interaction; or
c) a third party user providing instruction and/or authorization to a goods or service provider to prepare, transport, deliver and/or retrieve goods to a principal user in a different location.
14. The robot fleet of claim 1, wherein at least one robot is further configured to process or manufacture goods.
15. The robot fleet of claim 14, wherein the processed or manufactured goods comprise one or more of: beverages with or without condiments, a plurality of fast foods, and microwavable foods.
16. The robot fleet of claim 1, further comprising at least one robot having a digital display for curated content comprising: advertisements, lost, sought, or detected people, public service announcements, date, time, and weather.
17. The robot fleet of any one of claims 1 - 16, wherein the positioning of robots can be customized based on one or more of: anticipated use, a pattern of historical behaviors, and specific goods being carried.
18. The robot fleet of any one of claims 1 - 17, wherein the robot fleet is semi-autonomous or fully-autonomous.
19. The robot fleet of any one of claims 1 - 18, wherein the robot fleet is controlled directly by the user.
20. The robot fleet of any one of claims 1 - 19, wherein a plurality of said autonomous robots within the fleet is operated on behalf of a third party vendor/service provider.
21. The robot fleet of any one of claims 1 - 20, wherein a plurality of said autonomous robots within the fleet is further configured to be part of a sub-fleet comprising a sub-plurality of autonomous robots, wherein each sub-fleet is configured to operate independently or in tandem with multiple sub-fleets comprising two or more sub-fleets.
PCT/US2018/044361 2017-07-28 2018-07-30 Fleet of robot vehicles for specialty product and service delivery WO2019023704A1 (en)

Priority Applications (32)

Application Number Priority Date Filing Date Title
CA3070725A CA3070725A1 (en) 2017-07-28 2018-07-30 Fleet of robot vehicles for specialty product and service delivery
JP2020504119A JP7236434B2 (en) 2017-07-28 2018-07-30 A fleet of robotic vehicles for the delivery of specialty products and services
CN201880048988.5A CN110945451A (en) 2017-07-28 2018-07-30 Fleet of robotic vehicles for special product and service delivery
EP18837696.6A EP3659003A4 (en) 2017-07-28 2018-07-30 Fleet of robot vehicles for specialty product and service delivery
US16/158,940 US11151632B2 (en) 2017-07-28 2018-10-12 Systems and methods for visual search and autonomous delivery
US16/159,047 US11551278B2 (en) 2017-07-28 2018-10-12 Systems and methods for a mixed fleet transportation service
US16/158,982 US10719805B2 (en) 2017-07-28 2018-10-12 Autonomous robot vehicle with securable compartments
US16/158,917 US20190050808A1 (en) 2017-07-28 2018-10-12 Systems and methods for one-click delivery of autonomous vehicle
US16/158,889 US11200613B2 (en) 2017-07-28 2018-10-12 Systems and methods for a subscription service via autonomous vehicles
US16/159,016 US10486640B2 (en) 2017-07-28 2018-10-12 Grocery delivery system having robot vehicles with temperature and humidity control compartments
US16/158,963 US11556970B2 (en) 2017-07-28 2018-10-12 Systems and methods for personal verification for autonomous vehicle deliveries
US16/176,462 US20190064847A1 (en) 2017-07-28 2018-10-31 Systems and methods for a sub-robot unit transporting a package from on-road an autonomous vehicle to a door or dropbox
US16/181,724 US11574352B2 (en) 2017-07-28 2018-11-06 Systems and methods for return logistics for merchandise via autonomous vehicle
PCT/US2019/043614 WO2020028162A1 (en) 2017-07-28 2019-07-26 Delivery system having robot vehicles with temperature and humidity control compartments
CA3107444A CA3107444A1 (en) 2017-07-28 2019-07-26 Delivery system having robot vehicles with temperature and humidity control compartments
EP19761983.6A EP3830777A1 (en) 2017-07-28 2019-07-26 Delivery system having robot vehicles with temperature and humidity control compartments
JP2021503926A JP2021532480A (en) 2017-07-28 2019-07-26 Delivery system with robotic vehicle with temperature and humidity control compartment
CN201980047757.7A CN112437934A (en) 2017-07-28 2019-07-26 Conveyor system having robotic vehicle with temperature and humidity control chamber
CN201980048683.9A CN112470178A (en) 2017-07-28 2019-07-29 Autonomous robotic vehicle with safety compartment
JP2021503912A JP7365395B2 (en) 2017-07-28 2019-07-29 Autonomous robotic vehicle with secure compartment
JP2021503911A JP2021532476A (en) 2017-07-28 2019-07-29 Systems and methods for personal identification in delivery by autonomous vehicle
PCT/US2019/043897 WO2020028241A1 (en) 2017-07-28 2019-07-29 Systems and methods for a sub-robot unit transporting a package from on-road an autonomous vehicle to a door or dropbox
EP19750228.9A EP3830800B1 (en) 2017-07-28 2019-07-29 Systems and methods for personal verification for autonomous vehicle deliveries
PCT/US2019/043893 WO2020028238A1 (en) 2017-07-28 2019-07-29 Systems and methods for personal verification for autonomous vehicle deliveries
CA3107746A CA3107746A1 (en) 2017-07-28 2019-07-29 Autonomous robot vehicle with securable compartments
CN201980048134.1A CN112424841A (en) 2017-07-28 2019-07-29 System and method for personal verification for autonomous vehicle delivery
CA3107512A CA3107512A1 (en) 2017-07-28 2019-07-29 Systems and methods for personal verification for autonomous vehicle deliveries
EP19759770.1A EP3830776A1 (en) 2017-07-28 2019-07-29 Autonomous robot vehicle with securable compartments
PCT/US2019/043887 WO2020028235A1 (en) 2017-07-28 2019-07-29 Autonomous robot vehicle with securable compartments
US16/654,216 US11222378B2 (en) 2017-07-28 2019-10-16 Delivery system having robot vehicles with temperature and humidity control compartments
US17/539,819 US11645696B2 (en) 2017-07-28 2021-12-01 Delivery system having robot vehicles with temperature and humidity control compartments
US17/676,563 US20220180419A1 (en) 2017-07-28 2022-02-21 Systems and methods for one-click delivery of autonomous vehicle

Applications Claiming Priority (16)

Application Number Priority Date Filing Date Title
US201762538538P 2017-07-28 2017-07-28
US62/538,538 2017-07-28
US16/047,894 2018-07-27
US16/047,659 US10860015B2 (en) 2017-07-28 2018-07-27 Systems and methods for unmanned positioning and delivery of rental vehicles
US16/047,640 US10719078B2 (en) 2017-07-28 2018-07-27 Systems and methods for augmented capabilities for remote operation of robot vehicles
US16/047,894 US10732629B2 (en) 2017-07-28 2018-07-27 Systems and methods for fulfilling peer-to-peer transactions by autonomous robot vehicles
US16/047,659 2018-07-27
US16/047,640 2018-07-27
US16/047,598 2018-07-27
US16/047,598 US10332065B2 (en) 2017-07-28 2018-07-27 Fleet of robot vehicles for food product preparation
US16/048,669 US10864885B2 (en) 2017-07-28 2018-07-30 Systems and methods for autonomously loading and unloading autonomous vehicles
US16/048,669 2018-07-30
US16/048,737 2018-07-30
US16/048,797 2018-07-30
US16/048,797 US10882488B2 (en) 2017-07-28 2018-07-30 Hardware and software mechanisms on autonomous vehicle for pedestrian safety
US16/048,737 US10583803B2 (en) 2017-07-28 2018-07-30 Systems and methods for remote operation of robot vehicles

Related Child Applications (9)

Application Number Title Priority Date Filing Date
US16/159,016 Continuation-In-Part US10486640B2 (en) 2017-07-28 2018-10-12 Grocery delivery system having robot vehicles with temperature and humidity control compartments
US16/158,917 Continuation-In-Part US20190050808A1 (en) 2017-07-28 2018-10-12 Systems and methods for one-click delivery of autonomous vehicle
US16/158,963 Continuation-In-Part US11556970B2 (en) 2017-07-28 2018-10-12 Systems and methods for personal verification for autonomous vehicle deliveries
US16/158,940 Continuation-In-Part US11151632B2 (en) 2017-07-28 2018-10-12 Systems and methods for visual search and autonomous delivery
US16/158,982 Continuation-In-Part US10719805B2 (en) 2017-07-28 2018-10-12 Autonomous robot vehicle with securable compartments
US16/159,047 Continuation-In-Part US11551278B2 (en) 2017-07-28 2018-10-12 Systems and methods for a mixed fleet transportation service
US16/158,889 Continuation-In-Part US11200613B2 (en) 2017-07-28 2018-10-12 Systems and methods for a subscription service via autonomous vehicles
US16/176,462 Continuation-In-Part US20190064847A1 (en) 2017-07-28 2018-10-31 Systems and methods for a sub-robot unit transporting a package from on-road an autonomous vehicle to a door or dropbox
US16/181,724 Continuation-In-Part US11574352B2 (en) 2017-07-28 2018-11-06 Systems and methods for return logistics for merchandise via autonomous vehicle

Publications (1)

Publication Number Publication Date
WO2019023704A1 true WO2019023704A1 (en) 2019-01-31

Family

ID=65039926

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/044361 WO2019023704A1 (en) 2017-07-28 2018-07-30 Fleet of robot vehicles for specialty product and service delivery

Country Status (1)

Country Link
WO (1) WO2019023704A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100084426A1 (en) * 2008-10-08 2010-04-08 Devers Jeffrey M Portable self-serve beer vending station
US20140081445A1 (en) * 2012-08-07 2014-03-20 Daniel Judge Villamar Automated delivery vehicle, systems and methods for automated delivery
US20150006005A1 (en) 2013-07-01 2015-01-01 Steven Sounyoung Yu Autonomous Unmanned Road Vehicle for Making Deliveries
WO2015061008A1 (en) 2013-10-26 2015-04-30 Amazon Technologies, Inc. Unmanned aerial vehicle delivery system
US20150202770A1 (en) 2014-01-17 2015-07-23 Anthony Patron Sidewalk messaging of an autonomous robot
US20170174343A1 (en) * 2015-12-22 2017-06-22 International Business Machines Corporation Drone delivery of coffee based on a cognitive state of an individual

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3659003A4

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11772717B1 (en) * 2018-01-03 2023-10-03 AI Incorporated Autonomous versatile vehicle system
CN112005261A (en) * 2019-03-11 2020-11-27 乐天株式会社 Distribution system, control device, distribution method and control method
DE102019208326B4 (en) 2019-06-07 2022-06-09 Volkswagen Aktiengesellschaft Vehicle with loading space, fastening system and method for fastening cargo in a vehicle
DE102019208326A1 (en) * 2019-06-07 2021-01-21 Volkswagen Aktiengesellschaft Vehicle with cargo space, fastening system and method for fastening goods to be transported in a vehicle
US20210072750A1 (en) * 2019-09-05 2021-03-11 Lg Electronics Inc. Robot
US11875389B2 (en) 2019-09-25 2024-01-16 Ebay Inc. Auto posting system
US11023938B2 (en) 2019-09-25 2021-06-01 Ebay Inc. Auto posting system
JP2021101303A (en) * 2019-12-24 2021-07-08 楽天グループ株式会社 Transport system, control device and method
JP7138618B2 (en) 2019-12-24 2022-09-16 楽天グループ株式会社 Conveying system, controller and method
CN112537705B (en) * 2020-03-31 2023-04-11 深圳优地科技有限公司 Robot elevator taking scheduling method and device, terminal equipment and storage medium
CN112537705A (en) * 2020-03-31 2021-03-23 深圳优地科技有限公司 Robot elevator taking scheduling method and device, terminal equipment and storage medium
WO2021255442A1 (en) * 2020-06-16 2021-12-23 Exaactly Limited Systems and methods for capturing image data for mapping
JP7347349B2 (en) 2020-07-07 2023-09-20 トヨタ自動車株式会社 Information processing device, information processing system, and information processing method
CN112947561A (en) * 2021-02-09 2021-06-11 北京三快在线科技有限公司 Unmanned aerial vehicle exception handling system, method and device
WO2022182387A1 (en) * 2021-02-25 2022-09-01 Gm Cruise Holdings Llc Transparent cubby system for autonomous delivery services
US11830312B2 (en) 2021-02-25 2023-11-28 Gm Cruise Holdings Llc Transparent cubby system for autonomous delivery services
CN113222404A (en) * 2021-05-10 2021-08-06 阿尔华(广州)科技有限公司 Scheduling device and method of intelligent delivery system
CN113222404B (en) * 2021-05-10 2024-05-03 阿尔华(广州)科技有限公司 Scheduling device and method of intelligent object delivery system
RU2810208C2 (en) * 2021-10-26 2023-12-22 Юрий Хабижевич Хамуков Delivery robot on single-axle chassis

Similar Documents

Publication Publication Date Title
JP7236434B2 (en) A fleet of robotic vehicles for the delivery of specialty products and services
WO2019023704A1 (en) Fleet of robot vehicles for specialty product and service delivery

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18837696
    Country of ref document: EP
    Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 3070725
    Country of ref document: CA
ENP Entry into the national phase
    Ref document number: 2020504119
    Country of ref document: JP
    Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE
WWE Wipo information: entry into national phase
    Ref document number: 2018837696
    Country of ref document: EP