EP3682191A2 - Automatic routing of autonomous vehicles intra-facility movement - Google Patents

Automatic routing of autonomous vehicles intra-facility movement

Info

Publication number
EP3682191A2
EP3682191A2
Authority
EP
European Patent Office
Prior art keywords
autonomous vehicle
location
data
facility
recommended route
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP18778736.1A
Other languages
German (de)
French (fr)
Inventor
Robert J. GILLEN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
United Parcel Service of America Inc
Original Assignee
United Parcel Service of America Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by United Parcel Service of America Inc
Publication of EP3682191A2

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0234 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/029 Location-based management or tracking services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/06 Selective distribution of broadcast services, e.g. multimedia broadcast multicast service [MBMS]; Services to user groups; One-way selective calling services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Definitions

  • various concepts provide specific indications for various addresses, enabling the generation of predictive routing to various locations along known travel paths.
  • such concepts merely facilitate travel to a particular outdoor address, such as the address of a specific building, campus, and/or the like.
  • When a visitor (e.g., service personnel, delivery personnel, maintenance personnel, and/or the like), resident, visually impaired individual, and/or the like is scheduled to visit a specific location and/or individual within a building, campus, and/or the like, the visitor must manually determine a route to the desired destination location based on limited and potentially outdated information/data provided via a static building directory, based on the instructions of a receptionist, security guard, or other building personnel, and/or the like.
  • the autonomous vehicles comprise: one or more locomotion mechanisms configured to maneuver the autonomous vehicle within the facility; one or more sensors configured to detect navigational cues emitted by one or more location devices positioned along a recommended route; an onboard controller comprising at least one non-transitory memory and a processor, wherein the onboard controller is configured to: receive navigational data from the one or more sensors, wherein the navigational data is indicative of the location of one or more location devices emitting navigational cues identifying the recommended route; control the one or more locomotion mechanisms based at least in part on the identified locations of the one or more location devices emitting navigational cues identifying the recommended route to move the autonomous vehicle along the recommended route identified by the navigational cues emitted by location devices positioned along the recommended route.
  • Certain embodiments are directed to methods for guiding an autonomous vehicle along a recommended route defined within a facility.
  • the methods comprise: detecting one or more navigational cues emitted by one or more location devices positioned along the recommended route; determining a location of the one or more location devices emitting the navigational cues; and activating a locomotion mechanism based at least in part on the identified locations of the one or more location devices emitting navigational cues to move the autonomous vehicle along the recommended route.
  • the autonomous vehicle guidance system comprises: a mapping computing entity comprising a non-transitory memory and a processor, wherein the mapping computing entity is configured to determine a recommended route for an autonomous vehicle to travel through a facility to reach an intended destination; a plurality of location devices positioned throughout the facility, wherein each of the location devices is configured to emit navigational cues detectable by an autonomous vehicle upon receipt of a signal from the mapping computing entity; at least one autonomous vehicle configured for autonomous movement within the facility, the autonomous vehicle comprising: one or more locomotion mechanisms configured to freely maneuver the autonomous vehicle within the facility; one or more sensors configured to detect navigational cues emitted by the one or more location devices; and an onboard controller comprising at least one non-transitory memory and a processor, wherein the onboard controller is configured to: receive navigational data from the one or more sensors, wherein the navigational data is indicative of the location of one or more location devices emitting navigational cues identifying the recommended route; and control the one or more locomotion mechanisms, based at least in part on the identified locations of the one or more location devices, to move the autonomous vehicle along the recommended route.
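  • As a non-limiting illustration of the onboard guidance logic summarized above, the following sketch (in Python) shows one way an onboard controller might step through detected navigational cues and command its locomotion mechanisms; the class and function names (NavigationalCue, drive_toward, and so on) are hypothetical and are not taken from the disclosure.

      # Minimal sketch of the onboard guidance loop; all names are illustrative
      # assumptions rather than elements of the claimed system.
      from dataclasses import dataclass
      from typing import Iterable, Optional, Tuple

      @dataclass
      class NavigationalCue:
          device_id: str                  # identifier of the emitting location device
          position: Tuple[float, float]   # device position in facility coordinates
          sequence: int                   # order of the device along the recommended route

      def next_waypoint(cues: Iterable[NavigationalCue],
                        visited: set) -> Optional[NavigationalCue]:
          """Pick the earliest not-yet-visited cue along the recommended route."""
          pending = [c for c in cues if c.device_id not in visited]
          return min(pending, key=lambda c: c.sequence) if pending else None

      def guidance_step(cues, visited, drive_toward) -> bool:
          """One control-loop iteration: read cues, steer toward the next device."""
          target = next_waypoint(cues, visited)
          if target is None:
              return False                  # route complete
          drive_toward(target.position)     # command the locomotion mechanisms
          visited.add(target.device_id)
          return True

      if __name__ == "__main__":
          # Simulate following a three-beacon breadcrumb trail.
          route = [NavigationalCue("b1", (0.0, 1.0), 1),
                   NavigationalCue("b2", (0.0, 5.0), 2),
                   NavigationalCue("b3", (3.0, 5.0), 3)]
          seen = set()
          while guidance_step(route, seen, lambda p: print("driving toward", p)):
              pass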
  • Fig. 1 is a diagram of a system that can be used to practice various embodiments of the present invention.
  • Fig. 2 is a schematic illustration of a ground based autonomous vehicle in accordance with certain embodiments of the present invention.
  • Fig. 3 is a schematic illustration of an aerial-based autonomous vehicle in accordance with certain embodiments of the present invention.
  • Fig. 4 is a schematic of a mapping computing entity in accordance with certain embodiments of the present invention.
  • Fig. 5 is a schematic of a mobile computing entity in accordance with certain embodiments of the present invention.
  • Fig. 6 shows an example interior map indicating locations of various location devices.
  • Fig. 7 is a flow chart showing an example method for providing navigational guidance to an autonomous vehicle within a facility.
  • Fig. 8 shows an example beacon activity indicating a determined navigation route for a ground-based autonomous vehicle within an interior hallway.
  • Fig. 9 shows an example beacon activity indicating a determined navigational route for an aerial-based autonomous vehicle within an interior hallway.
  • Fig. 10 shows an example ground-based autonomous vehicle depositing an item at a destination location in accordance with certain embodiments of the present invention.
  • Fig. 11 is a flow chart showing an example method for automatically operating transportation mechanisms to move an autonomous vehicle along a recommended route within a facility.
  • Fig. 12 shows an example beacon activity indicating a transportation mechanism operation for a ground-based autonomous vehicle in accordance with certain embodiments of the present invention.
  • Various embodiments are directed to concepts for providing and/or utilizing internal addresses within a facility (e.g., a building, a campus, a suite, a house, an apartment, a warehouse, a building complex, a mall, and/or the like) as described in co-pending U.S. Patent Appl. No. 15/378,515, which is incorporated herein by reference in its entirety.
  • the internal addresses provide reference points that may be utilized as navigational references by autonomous delivery vehicles (e.g., ground-based vehicles and/or aerial-based vehicles) traversing the interior of the facility, for example, to deliver shipments to one or more interior locations.
  • the internal addresses may facilitate locating specific individuals (e.g., mobile computing entities carried by those individuals), rooms, furniture (e.g., desks), and/or other locations within the facility.
  • the internal addresses may be associated with one or more location devices (e.g., location beacons, Internet of Things enabled devices, and/or the like), which may be configured to wirelessly broadcast or otherwise transmit information/data indicative of the location device's location to various computer-enabled devices (e.g., mobile user devices, autonomous vehicles, and/or the like) within a corresponding broadcast range.
  • location devices may act as virtual landmarks and/or may be associated with particular internal addresses within a facility that enable mobile user devices and autonomous vehicles to determine their relative locations within the facility.
  • the location devices form a navigational network that may be utilized to generate and/or provide navigational instructions for autonomous vehicles.
  • the network of location devices may be configured to highlight a recommended travel path to guide autonomous vehicles to a desired location within the facility.
  • the location devices may thereby form a virtual breadcrumb trail through the facility to a desired location by providing navigational cues in series, emitted from the location devices along the recommended travel path.
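  • For illustration only, the following Python sketch shows how a mapping computing entity might compute a recommended route over a graph whose nodes are location devices and then signal each device along that route to emit its navigational cue in series, forming the breadcrumb trail described above; the corridor graph, device identifiers, and activate() callback are assumptions rather than details of the disclosure.

      # Illustrative route computation and cue activation (not the patented implementation).
      import heapq

      def shortest_route(graph, start, goal):
          """Dijkstra over a facility graph keyed by location-device identifiers."""
          dist, prev, heap = {start: 0.0}, {}, [(0.0, start)]
          while heap:
              d, node = heapq.heappop(heap)
              if node == goal:
                  break
              for nbr, weight in graph.get(node, {}).items():
                  nd = d + weight
                  if nd < dist.get(nbr, float("inf")):
                      dist[nbr], prev[nbr] = nd, node
                      heapq.heappush(heap, (nd, nbr))
          path, node = [], goal
          while node != start:
              path.append(node)
              node = prev[node]
          return [start] + path[::-1]

      def activate_route(graph, start, goal, activate):
          """Ask each location device on the route to emit its navigational cue."""
          for order, device_id in enumerate(shortest_route(graph, start, goal), 1):
              activate(device_id, order)  # e.g., illuminate a light or broadcast a packet

      if __name__ == "__main__":
          corridors = {"lobby": {"hall-1": 10}, "hall-1": {"lobby": 10, "suite-210": 4},
                       "suite-210": {"hall-1": 4}}
          activate_route(corridors, "lobby", "suite-210",
                         lambda device, n: print(f"cue {n}: beacon {device}"))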
  • the location devices may also be configured to receive information/data indicative of a desired destination of a particular autonomous vehicle (e.g., transmitted from the autonomous vehicle), and may provide navigational instructions to direct the autonomous vehicle toward the desired destination.
  • location devices may be located along travel paths (e.g., walkways, autonomous vehicle travel paths, and/or the like) within a facility and at various internal addresses within the facility.
  • location devices located along a recommended navigational path leading toward a desired destination may provide navigational cues indicative of the navigational instructions for the autonomous vehicle.
  • the location devices may illuminate associated lights, wirelessly transmit navigational data packets, and/or the like in order to direct the autonomous vehicle toward the desired destination.
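  • As a hedged sketch of the device-side behavior described in the preceding bullets (a location device receiving a destination from an autonomous vehicle and answering with navigational data), the following example uses a hypothetical routing table and packet layout that are not specified in the disclosure.

      # Illustrative location-device handler for a destination query from a vehicle.
      ROUTING_TABLE = {
          # (this device, destination) -> next-hop cue for the vehicle
          ("beacon-07", "Suite 210"): {"next_device": "beacon-08", "heading_deg": 90},
          ("beacon-08", "Suite 210"): {"next_device": "beacon-09", "heading_deg": 0},
      }

      def handle_destination_query(device_id: str, destination: str):
          """Return a navigational data packet directing the vehicle toward its
          destination, or None if this device holds no route entry for it."""
          hop = ROUTING_TABLE.get((device_id, destination))
          if hop is None:
              return None
          return {"device_id": device_id, "destination": destination, **hop}

      if __name__ == "__main__":
          print(handle_destination_query("beacon-07", "Suite 210"))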
  • facilities comprising one or more automated transportation mechanisms (e.g., elevators and/or the like) may be configured to operate the automated transportation mechanisms automatically, in order to move a particular autonomous vehicle, individual, and/or item toward a desired destination.
  • the facility may monitor the location of a particular autonomous vehicle as the autonomous vehicle moves along the recommended travel path toward the desired destination. As the autonomous vehicle nears an automated transport mechanism on the recommended travel path, the facility automatically positions the automated transport mechanism so that the autonomous vehicle may board the automated transport mechanism to be moved toward the desired destination location.
  • the facility may move the automated transport mechanism toward the desired destination location without requiring additional input and/or signals from the autonomous vehicle (e.g., mechanical input and/or wirelessly transmitted input).
  • a facility may position an elevator such that the autonomous vehicle may board the elevator upon determining that the autonomous vehicle is proximate the elevator. The elevator may then automatically move to the floor on which a desired destination is located, without requiring the autonomous vehicle to provide any input to the elevator.
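  • The elevator example above can be sketched as follows; this is an assumed illustration (class names, coordinates, and the proximity threshold are invented), showing the facility tracking the vehicle's position and pre-positioning the elevator without any input from the vehicle.

      # Assumed sketch of automatic transport-mechanism operation.
      from dataclasses import dataclass

      @dataclass
      class Elevator:
          current_floor: int = 1

          def move_to(self, floor: int) -> None:
              print(f"elevator moving from floor {self.current_floor} to floor {floor}")
              self.current_floor = floor

      def monitor_vehicle(vehicle_xy, elevator_xy, elevator,
                          boarding_floor, destination_floor, threshold_m=5.0):
          """Call the elevator when the tracked vehicle comes within threshold_m
          metres of the elevator lobby, then send it toward the destination floor."""
          dx, dy = vehicle_xy[0] - elevator_xy[0], vehicle_xy[1] - elevator_xy[1]
          if (dx * dx + dy * dy) ** 0.5 <= threshold_m:
              elevator.move_to(boarding_floor)     # position so the vehicle can board
              elevator.move_to(destination_floor)  # carry it toward the destination

      if __name__ == "__main__":
          monitor_vehicle((12.0, 3.0), (10.0, 4.0), Elevator(), boarding_floor=1,
                          destination_floor=7)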
  • Embodiments of the present invention may be implemented in various ways, including as computer program products that comprise articles of manufacture.
  • a computer program product may include a non-transitory computer-readable storage medium storing applications, programs, program modules, scripts, source code, program code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like (also referred to herein as executable instructions, instructions for execution, program code, and/or similar terms used herein interchangeably).
  • Such non-transitory computer-readable storage media include all computer-readable media (including volatile and non-volatile media).
  • a non-volatile computer-readable storage medium may include a floppy disk, flexible disk, hard disk, solid-state storage (SSS) (e.g., a solid state drive (SSD), solid state card (SSC), solid state module (SSM)), enterprise flash drive, magnetic tape, or any other non-transitory magnetic medium, and/or the like.
  • a non-volatile computer-readable storage medium may also include a punch card, paper tape, optical mark sheet (or any other physical medium with patterns of holes or other optically recognizable indicia), compact disc read only memory (CD-ROM), compact disc-rewritable (CD-RW), digital versatile disc (DVD), Blu-ray disc (BD), any other non-transitory optical medium, and/or the like.
  • Such a non-volatile computer-readable storage medium may also include read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory (e.g., Serial, NAND, NOR, and/or the like), multimedia memory cards (MMC), secure digital (SD) memory cards, SmartMedia cards, CompactFlash (CF) cards, Memory Sticks, and/or the like.
  • a non-volatile computer-readable storage medium may also include conductive-bridging random access memory (CBRAM), phase-change random access memory (PRAM), ferroelectric random-access memory (FeRAM), non-volatile random-access memory (NVRAM), magnetoresistive random-access memory (MRAM), resistive random-access memory (RRAM), Silicon-Oxide-Nitride-Oxide-Silicon memory (SONOS), floating junction gate random access memory (FJG RAM), Millipede memory, racetrack memory, and/or the like.
  • a volatile computer-readable storage medium may include random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), fast page mode dynamic random access memory (FPM DRAM), extended data-out dynamic random access memory (EDO DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), double data rate type two synchronous dynamic random access memory (DDR2 SDRAM), double data rate type three synchronous dynamic random access memory (DDR3 SDRAM), Rambus dynamic random access memory (RDRAM), Twin Transistor RAM (TTRAM), Thyristor RAM (T-RAM), Zero-capacitor RAM (Z-RAM), Rambus in-line memory module (RIMM), dual in-line memory module (DIMM), single in-line memory module (SIMM), video random access memory (VRAM), cache memory (including various levels), flash memory, register memory, and/or the like.
  • embodiments of the present invention may also be implemented as methods, apparatus, systems, computing devices, computing entities, and/or the like.
  • embodiments of the present invention may take the form of an apparatus, system, computing device, computing entity, and/or the like executing instructions stored on a computer-readable storage medium to perform certain steps or operations.
  • embodiments of the present invention may also take the form of an entirely hardware embodiment performing certain steps or operations.
  • retrieval, loading, and/or execution may be performed in parallel such that multiple instructions are retrieved, loaded, and/or executed together.
  • such embodiments can produce specifically-configured machines performing the steps or operations specified in the block diagrams and flowchart illustrations. Accordingly, the block diagrams and flowchart illustrations support various combinations of embodiments for performing the specified instructions, operations, or steps.
  • Fig. 1 provides an illustration of a system that can be used in conjunction with various embodiments of the present invention.
  • the system may include one or more vehicles 100 (e.g., autonomous vehicles), one or more mobile computing entities 105, one or more mapping computing entities 110, one or more Global Positioning System (GPS) satellites 115, one or more location sensors 120, one or more information/data collection devices 130, one or more networks 135, one or more location devices 400, one or more user computing entities 140 (not shown), and/or the like.
  • Each of the components of the system may be in electronic communication with, for example, one another over the same or different wireless or wired networks including, for example, a wired or wireless Personal Area Network (PAN), Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), or the like.
  • although Fig. 1 illustrates certain system entities as separate, standalone entities, the various embodiments are not limited to this particular architecture.
  • autonomous vehicles may be configured for transporting one or more items (e.g., one or more packages, parcels, bags, containers, loads, crates, items banded together, vehicle parts, pallets, drums, the like, and/or similar words used herein interchangeably).
  • each autonomous vehicle 100 may be associated with a unique vehicle identifier (such as a vehicle ID) that uniquely identifies the autonomous vehicle 100.
  • the unique vehicle ID may include characters, such as numbers, letters, symbols, and/or the like. For example, an alphanumeric vehicle ID (e.g., "AS445") may be associated with each vehicle 100.
  • the autonomous vehicles 100 may encompass land-based autonomous vehicles (e.g., as shown in Fig. 2) and/or aerial-based autonomous vehicles (e.g., as shown in Fig. 3).
  • Ground-based autonomous vehicles 100 may be configured for travelling along a support surface (e.g., the ground).
  • the ground-based autonomous vehicles 100 may be freely maneuverable along a support surface such that the vehicles are not limited to movement along a specifically configured track.
  • the ground-based autonomous vehicles 100 may be configured for movement along one or more predefined tracks (e.g., rails, slots, grooves, painted lines, embedded wires, and/or the like) traversing the interior of a facility.
  • the ground-based autonomous vehicles 100 may be driven by one or more forms of locomotion.
  • the ground-based autonomous vehicles 100 may travel on one or more wheels 121 (e.g., two wheels, three wheels, four wheels, 18 wheels, and/or the like), on one or more continuous tracks (e.g., two continuous tracks driven by one or more wheels), on one or more legs (e.g., a bipedal vehicle, a quadruped vehicle, a hexapod vehicle, and/or the like), on one or more hover mechanisms (e.g., air pillows), and/or the like.
  • Aerial-based autonomous vehicles 100 may be Unmanned Aerial Vehicles (UAVs) such as those described in co-pending U.S. Patent Appl. Serial No. 15/582,129, filed April 28, 2017, the contents of which are incorporated herein by reference in their entirety.
  • the one or more aerial-based autonomous vehicles 100 may be driven by one or more forms of locomotion and/or lift mechanisms.
  • the autonomous vehicles 100 may comprise one or more rotors 141 configured for vertical and/or horizontal locomotion.
  • aerial-based autonomous vehicles may be embodied as hexacopters having six rotors configured to enable vertical locomotion (e.g., lift) and/or horizontal locomotion, as shown in the example embodiment of Fig. 3, as well as enabling roll, pitch, and yaw movements of the vehicle.
  • rotor-based autonomous vehicles may have any number of rotors (e.g., 1 rotor, 2 rotors, 3 rotors, 4 rotors, 8 rotors, and/or the like).
  • Aerial-based autonomous vehicles may also have a variety of other lift and drive mechanisms, such as balloon-based lift mechanisms (e.g., enabling lighter-than-air transportation), wing-based lift mechanisms, turbine-based lift mechanisms, and/or the like.
  • the aerial-based autonomous vehicles 100 may be freely maneuverable (e.g., in three-dimensions), or the aerial-based autonomous vehicles 100 may be maneuverable along one or more tracks traversing a facility (e.g., rails built onto and/or into walls or ceilings of a facility).
  • the autonomous vehicle 100 locomotion mechanisms may be driven by one or more power devices, such as electrical motors, internal-combustion engines (e.g., alcohol-fueled, oil-fueled, gasoline-fueled, and/or the like), and/or the like.
  • the power devices may be connected to the one or more locomotion mechanisms directly, via one or more shafts, via one or more gears, via one or more flywheels, via one or more viscous couplings, and/or the like.
  • the autonomous vehicles 100 may have a body portion 122, 142 housing or otherwise connecting the various components of the autonomous vehicles 100.
  • the body portion may comprise one or more rigid materials, such as carbon fiber, aluminum, plastic, and/or the like.
  • the body portion 122, 142 may thereby be secured relative to the one or more locomotion mechanisms, the one or more power devices, an onboard controller (discussed herein), and/or the like.
  • the body portion 122, 142 may encompass, or be secured relative to, one or more item support portions 123, 143 carried by the autonomous vehicle 100.
  • the item support portions 123, 143 may be configured to hold a single item (e.g., a single item shipment for delivery) and/or a plurality of items (e.g., multi-item shipments destined for a common destination and/or multiple single-item shipments destined for multiple destinations).
  • the item support portions 123, 143 may comprise one or more cargo containers, one or more item cages, one or more onboard conveying mechanisms (e.g., belts, chains, lifts, and/or the like, such as conveyor 124 of Fig. 2), one or more grippers (e.g., gripper 144 of Fig. 3), and/or the like.
  • the body portion 122, 142 may form the item support portion 123, 143 such that the item support portion 123, 143 is positioned within the body portion 122, 142.
  • the item support portion 123, 143 may be secured (detachably or permanently) relative to the body portion 122, 142, for example, as an item cage and/or item gripper suspended below the body portion 122, 142 of a flying autonomous vehicle 100.
  • the item support portion may be embodied as a trailer secured (e.g., detachably or permanently) relative to the body portion 122 of a ground-based autonomous vehicle 100.
  • the item support portion 123, 143 of the autonomous vehicles 100 may encompass or may otherwise be secured relative to an item deposit mechanism 125, 145 configured to deposit an item at a delivery location.
  • the item deposit mechanism 125, 145 may comprise one or more grippers, one or more conveying mechanisms, one or more multi-axis robotic arms, one or more actuated doors, and/or the like.
  • the deposit mechanism 125, 145 is configured to retrieve an item to be delivered at a particular location from the item support portion 123,143, and to deposit the item at the delivery location.
  • a multi-axis robotic arm having a gripper at one end may be configured to retrieve the item to be delivered from the item support portion, and to move the item to the destination location (e.g., a desk surface, a floor, and/or the like at the destination location).
  • an item support portion 123 comprising a conveying mechanism 124 may be configured to convey an item onto the deposit mechanism 125 (which may also comprise a conveyor), and the deposit mechanism may be configured to deposit the item at the delivery location, as shown in the example embodiment of Fig. 10.
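  • The conveyor-based deposit sequence described above (see also Fig. 10) can be summarized with the following minimal Python sketch; the state names and conveyor callbacks are assumptions for illustration only.

      # Assumed delivery-deposit sequence for a conveyor-style deposit mechanism.
      from enum import Enum, auto

      class DeliveryState(Enum):
          EN_ROUTE = auto()      # travelling along the recommended route
          TRANSFERRING = auto()  # item moving from support portion to deposit mechanism
          DEPOSITED = auto()     # item placed at the destination location

      def deposit_item(run_onboard_conveyor, run_deposit_conveyor) -> DeliveryState:
          """Sequence the two conveyors to place the item at the delivery location."""
          state = DeliveryState.TRANSFERRING
          run_onboard_conveyor()   # item support portion 123 -> deposit mechanism 125
          run_deposit_conveyor()   # deposit mechanism 125 -> floor or desk surface
          state = DeliveryState.DEPOSITED
          return state

      if __name__ == "__main__":
          final = deposit_item(lambda: print("onboard conveyor 124 running"),
                               lambda: print("deposit conveyor running"))
          print("final state:", final)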
  • the autonomous vehicle 100 additionally comprises an onboard controller configured to control locomotion of the vehicle, navigation of the vehicle, obstacle avoidance, item delivery, and/or the like.
  • the autonomous vehicle 100 may additionally comprise one or more user interfaces 126, which may comprise an output mechanism and/or an input mechanism configured to receive user input.
  • the user interface may be configured to enable autonomous vehicle technicians to review diagnostic information/data relating to the autonomous vehicle, and/or a user of the autonomous vehicle 100 may utilize the user interface to input and/or review information/data indicative of a destination location for the autonomous vehicle 100.
  • Fig. 1 shows one or more computing entities, devices, and/or similar words used herein interchangeably that are associated with and/or otherwise collectively form the onboard controller.
  • the one or more computing entities may encompass, for example, an information/data collection device 130 or other computing entities.
  • the terms computing entity, entity, device, system, and/or similar words used herein interchangeably may refer to, for example, one or more computers, computing entities, desktop computers, mobile phones, tablets, phablets, notebooks, laptops, distributed systems, gaming consoles (e.g., Xbox, Play Station, Wii), watches, glasses, iBeacons, proximity beacons, key fobs, radio frequency identification (RFID) tags, ear pieces, scanners, televisions, dongles, cameras, wristbands, wearable items/devices, items/devices, vehicles, kiosks, input terminals, servers or server networks, blades, gateways, switches, processing devices, processing entities, set-top boxes, relays, routers, network access points, base stations, the like, and/or any combination of devices or entities adapted to perform the functions, operations, and/or processes described herein.
  • the information/data collection device 130 may include, be associated with, or be in wired or wireless communication with one or more processors (various exemplary processors are described in greater detail below), one or more location- determining devices or one or more location sensors 120 (e.g., Global Navigation Satellite System (GNSS) sensors, indoor location sensors, such as Bluetooth sensors, Wi-Fi sensors, and/or the like), one or more real-time clocks, a J-Bus protocol architecture, one or more electronic control modules (ECM), one or more communication ports for receiving information/data from various sensors (e.g., via a CAN-bus), one or more communication ports for transmitting/sending data, one or more RFID tags/sensors, one or more power sources, one or more information/data radios for communication with a variety of communication networks, one or more memory modules, and one or more programmable logic controllers (PLC). It should be noted that many of these components may be located in the autonomous vehicle 100 but external to the information/data collection device 130.
  • the one or more location sensors 120, modules, or similar words used herein interchangeably may be one of several components in wired or wireless communication with or available to the information/data collection device 130.
  • the one or more location sensors 120 may be compatible with GPS satellites 115, such as Low Earth Orbit (LEO) satellite systems, Department of Defense (DOD) satellite systems, the European Union Galileo positioning systems, the Chinese Compass navigation systems, Indian Regional Navigational satellite systems, and/or the like.
  • This information/data can be collected using a variety of coordinate systems, such as the Decimal Degrees (DD); Degrees, Minutes, Seconds (DMS); Universal Transverse Mercator (UTM); Universal Polar Stereographic (UPS) coordinate systems; and/or the like.
  • triangulation and/or proximity based location determinations may be used in connection with a device associated with a particular autonomous vehicle and with various communication points (e.g., cellular towers, Wi-Fi access points, location devices 400, and/or the like) positioned at various locations throughout a geographic area and/or throughout an interior of a facility to monitor the location of the vehicle 100 and/or its operator.
  • the one or more location sensors 120 may be used to receive latitude, longitude, altitude, heading or direction, geocode, course, position, time, location identifying information/data, and/or speed information/data (e.g., referred to herein as location information/data and further described herein below).
  • the one or more location sensors 120 may also communicate with the mapping computing entity 110, the information/data collection device 130, user computing entity 105, and/or similar computing entities.
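  • One common way to realize the proximity-based location determination mentioned above is a signal-strength-weighted centroid over the detected location devices; the sketch below is illustrative only, and the path-loss constants and weighting scheme are general indoor-positioning assumptions rather than details from the disclosure.

      # Illustrative proximity-based position estimate from location-device signals.
      def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
          """Estimate range in metres from received signal strength (log-distance model)."""
          return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

      def weighted_centroid(observations):
          """observations: list of ((x, y), rssi_dbm) for detected location devices."""
          weights = [((x, y), 1.0 / max(rssi_to_distance(rssi), 0.1))
                     for (x, y), rssi in observations]
          total = sum(w for _, w in weights)
          x = sum(p[0] * w for p, w in weights) / total
          y = sum(p[1] * w for p, w in weights) / total
          return x, y

      if __name__ == "__main__":
          detections = [((0.0, 0.0), -61.0), ((8.0, 0.0), -72.0), ((4.0, 6.0), -68.0)]
          print("estimated position:", weighted_centroid(detections))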
  • the ECM may be one of several components in communication with and/or available to the information/data collection device 130.
  • the ECM which may be a scalable and subservient device to the information/data collection device 130, may have information/data processing capability to decode and store analog and digital inputs received from, for example, vehicle systems and sensors.
  • the ECM may further have information/data processing capability to collect and present location information/data to the J-Bus (which may allow transmission to the information/data collection device 130), and output location identifying data, for example, via a display and/or other output device (e.g., a speaker).
  • a communication port may be one of several components available in the information/data collection device 130 (or be in or as a separate computing entity).
  • Embodiments of the communication port may include an Infrared information/data Association (IrDA) communication port, an information/data radio, and/or a serial port.
  • the communication port may receive instructions for the information/data collection device 130. These instructions may be specific to the vehicle 100 in which the information/data collection device 130 is installed, specific to the geographic area and/or serviceable point in which the vehicle 100 will be traveling, specific to the function the vehicle 100 serves within a fleet, and/or the like.
  • the information/data radio may be configured to communicate with a wireless wide area network (WWAN), wireless local area network (WLAN), wireless personal area network (WPAN), or any combination thereof.
  • the information/data radio may communicate via various wireless protocols, such as 802.11, general packet radio service (GPRS), Universal Mobile Telecommunications System (UMTS), Code Division Multiple Access 2000 (CDMA2000), CDMA2000 1X (1xRTT), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), Evolved Universal Terrestrial Radio Access Network (E-UTRAN), Evolution-Data Optimized (EVDO), High Speed Packet Access (HSPA), High-Speed Downlink Packet Access (HSDPA), IEEE 802.11 (Wi-Fi), 802.16 (WiMAX), ultra-wideband (UWB), infrared (IR) protocols, Bluetooth protocols (including Bluetooth low energy (BLE)), wireless universal serial bus (USB) protocols, and/or any other wireless protocol.
  • the communication port may be configured to transmit and/or receive information/data transmissions via light-based communication protocols (e.g., utilizing specific light emission frequencies and/or wavelengths (e.g., visible light, infrared light, and/or the like) to transmit data), via sound-based communication protocols (e.g., utilizing specific sound frequencies to transmit data), and/or the like.
  • the autonomous vehicle 100 may comprise a user interface 126 comprising one or more input devices and/or one or more output devices configured to receive user input and/or to provide visual and/or audible output to a user (e.g., a programmer, a technician, and/or the like).
  • the vehicle may comprise a touchscreen (e.g., a capacitive touchscreen), a keyboard, a mouse, a touchpad, a display (e.g., an LCD display, an LED display, a tube display, and/or the like), and/or the like.
  • the onboard controller of the autonomous vehicle 100 may be configured to generate and/or retrieve navigational instructions for the autonomous vehicle 100 to move to a particular destination location within the facility.
  • the onboard controller may be configured to interpret the navigational instructions relative to a determined location of the autonomous vehicle 100 within the facility, and to provide signals to the onboard locomotion mechanisms to move the autonomous vehicle 100 along a determined route to the destination location.
  • the onboard controller may comprise one or more sensors 127 configured to assist in navigating the autonomous vehicle 100 during movement.
  • the one or more sensors 127 are configured to detect objects around the autonomous vehicle 100 and to provide feedback to the onboard controller to assist in guiding the autonomous vehicle 100 in the execution of various operations, including, for example, initialization (e.g., takeoff), movement navigation, route completion (e.g., landing), and/or the like.
  • the one or more sensors 127 may comprise one or more travel sensors configured to aid in navigating the autonomous vehicle 100 through a facility, one or more shipment/item delivery sensors configured to aid in placing a shipment/item at an intended destination location, and/or the like.
  • a single set of one or more sensors may operate as both the travel sensors and the shipment/item delivery sensors; however, in certain embodiments, an autonomous vehicle 100 may comprise a dedicated set of one or more travel sensors and a separate and dedicated set of one or more shipment/item delivery sensors.
  • the sensors 127 may comprise one or more light sensors, cameras, infrared sensors, microphones, RFID sensors, wireless communication receivers (e.g., Bluetooth, Wi-Fi, and/or the like), LIDAR sensors, proximity sensors, and/or the like.
  • the one or more sensors may be configured for generating and/or interpreting three-dimensional mapping aspects and/or three-dimensional aspects of an environment surrounding the autonomous vehicle 100, for example, utilizing three- dimensional depth sensors (e.g., dual cameras, depth-sensing cameras, depth-sensing scanners, and/or the like).
  • the three-dimensional sensors may be utilized in conjunction with and/or as an alternative to other location-determining devices to ascertain a current location of the autonomous vehicle 100 relative to various visible objects (which may be utilized as known landmarks that are recognizable to the autonomous vehicle 100 as matching information/data stored in a memory associated with the autonomous vehicle 100).
  • the one or more sensors may be secured to and/or within the body portion of the autonomous vehicle.
  • the one or more sensors may be positioned on the autonomous vehicle 100 based on the intended functionality of the sensor. For example, shipment/item delivery sensors may be positioned such that the shipment/item delivery sensors are able to monitor the relative location of a shipment/item to be delivered at a particular location.
  • a camera may have a field of view including one or more edges of a shipment/item (and/or a shipment/item deposit mechanism 125), such that the camera is configured to monitor the placement of the shipment/item at an intended destination location.
  • one or more navigational and/or obstacle avoidance sensors may be positioned such that the sensors are able to monitor the location of one or more edges of the autonomous vehicle 100 relative to one or more detected obstacles within a travel path of the autonomous vehicle 100.
  • the sensors 127 may be configured to detect data outputs from the one or more location devices within the facility. Accordingly, as discussed herein, the onboard controller may be configured to receive sensor data from the one or more sensors 127, and to utilize the retrieved sensor data to self-determine a precise location of the autonomous vehicle 100 within the facility. Moreover, the onboard controller may utilize the sensor data to identify an appropriate location to deposit an item at a destination location (e.g., on a desk, on a floor, and/or the like).
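  • As a simple illustration of the self-location step described above, the sketch below resolves the strongest detected location device to an internal address via a registry lookup; the detection tuples and registry contents are hypothetical.

      # Assumed sketch: map the nearest detected location device to an internal address.
      def self_locate(detections, device_registry):
          """detections: list of (device_id, rssi_dbm) reported by the sensors 127.
          device_registry: mapping of device_id -> internal address string."""
          known = [(dev, rssi) for dev, rssi in detections if dev in device_registry]
          if not known:
              return None
          nearest_id, _ = max(known, key=lambda d: d[1])  # strongest signal ~ nearest
          return device_registry[nearest_id]

      if __name__ == "__main__":
          registry = {"beacon-17": "Floor 2 / Hallway B / Suite 210 entrance"}
          print(self_locate([("beacon-17", -58.0), ("beacon-03", -81.0)], registry))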
  • Fig. 4 provides a schematic of a mapping computing entity 110 according to one embodiment of the present invention.
  • each facility (e.g., office building, apartment building, storage building, campus, office suite, hotel, motel, inn, school, house, warehouse, convention center, and/or the like) may have an associated mapping computing entity 110.
  • various entities may comprise a mapping computing entity 110 storing location information/data for various locations located in a plurality of facilities.
  • a carrier may have a third-party mapping computing entity 110 storing location information/data for various locations internal to a plurality of facilities.
  • a carrier may be a traditional carrier, such as United Parcel Service (UPS), FedEx, DHL, courier services, the United States Postal Service (USPS), Canadian Post, freight companies (e.g. truck-load, less-than-truckload, rail carriers, air carriers, ocean carriers, etc.), and/or the like.
  • a carrier may also be a nontraditional carrier, such as Coyote, Amazon, Google, Uber, ride-sharing services, crowd-sourcing services, retailers, and/or the like. Accordingly, each time an employee of the carrier arrives at a particular facility, the carrier's mapping computing entity 110 may provide the employee with location information/data corresponding to the particular facility.
  • a third party may provide software to configure a facility-specific mapping computing entity 110 to provide various internal addressing and/or navigational features as discussed herein.
  • the provided software may comprise algorithms for generating and/or storing map data, algorithms for generating recommended navigational routes to a desired destination location (as discussed herein), and/or the like.
  • the software may be configurable based on hardware utilized at a particular facility. For example, the software may be configured such that signals generated according to the third-party provided software are compatible and readable with various hardware components (e.g., transportation mechanisms, location devices 400, and/or the like) located within the facility.
  • the provided software may be configured to provide security features for a specific facility, for example, to prevent unauthorized devices from obtaining mapping data for the facility.
  • the software may be configured to enable certain devices (e.g., certain autonomous vehicles 100) to connect and communicate with various hardware components (e.g., location devices 400) within the facility.
  • the software may be configured to enable all devices of a certain type (e.g., an autonomous vehicle 100), devices having corresponding specific identifiers (e.g., serial numbers, device names, and/or the like), and/or the like to connect with the various building hardware components.
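  • A device-authorization check of the kind described above might look like the following sketch; the whitelist fields and identifiers are invented for illustration and are not part of the disclosed software.

      # Assumed authorization policy: allow a device to connect to facility hardware
      # if its device type is permitted or its specific identifier is registered.
      ALLOWED_DEVICE_TYPES = {"autonomous_vehicle"}
      ALLOWED_DEVICE_IDS = {"AS445", "AS446"}

      def is_authorized(device_type: str, device_id: str) -> bool:
          return device_type in ALLOWED_DEVICE_TYPES or device_id in ALLOWED_DEVICE_IDS

      if __name__ == "__main__":
          print(is_authorized("autonomous_vehicle", "ZZ001"))  # True: permitted type
          print(is_authorized("smartphone", "AS445"))          # True: registered ID
          print(is_authorized("smartphone", "XY123"))          # False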
  • the mapping computing entity 110 may also include one or more communications interfaces 320 for communicating with various computing entities, such as by communicating data, content, information, and/or similar terms used herein interchangeably that can be transmitted, received, operated on, processed, displayed, stored, and/or the like.
  • the mapping computing entity 110 may communicate with autonomous vehicles 100, user computing entities 105, location devices 400, and/or the like.
  • the mapping computing entity 110 may include or be in communication with one or more processing elements 305 (also referred to as processors, processing circuitry, and/or similar terms used herein interchangeably) that communicate with other elements within the mapping computing entity 110 via a bus, for example.
  • processing element 305 may be embodied in a number of different ways.
  • the processing element 305 may be embodied as one or more complex programmable logic devices (CPLDs), microprocessors, multi-core processors, coprocessing entities, application- specific instruction-set processors (ASIPs), and/or controllers.
  • the processing element 305 may be embodied as one or more other processing devices or circuitry.
  • circuitry may refer to an entirely hardware embodiment or a combination of hardware and computer program products.
  • the processing element 305 may be embodied as integrated circuits, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic arrays (PLAs), hardware accelerators, other circuitry, and/or the like.
  • the processing element 305 may be configured for a particular use or configured to execute instructions stored in volatile or non-volatile media or otherwise accessible to the processing element 305.
  • the processing element 305 may be capable of performing steps or operations according to embodiments of the present invention when configured accordingly.
  • the mapping computing entity 110 may further include or be in communication with non-volatile media (also referred to as non-volatile storage, memory, memory storage, memory circuitry and/or similar terms used herein interchangeably).
  • non-volatile storage or memory may include one or more non-volatile storage or memory media 310 as described above, such as hard disks, ROM, PROM, EPROM, EEPROM, flash memory, MMCs, SD memory cards, Memory Sticks, CBRAM, PRAM, FeRAM, RRAM, SONOS, racetrack memory, and/or the like.
  • non- volatile storage or memory media may store databases, database instances, database management system entities, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like.
  • database, database instance, database management system entity, and/or similar terms used herein interchangeably may refer to a structured collection of records or information/data that is stored in a computer- readable storage medium, such as via a relational database, hierarchical database, and/or network database.
  • the mapping computing entity 110 may further include or be in communication with volatile media (also referred to as volatile storage, memory, memory storage, memory circuitry and/or similar terms used herein interchangeably).
  • volatile storage or memory may also include one or more volatile storage or memory media 315 as described above, such as RAM, DRAM, SRAM, FPM DRAM, EDO DRAM, SDRAM, DDR SDRAM, DDR2 SDRAM, DDR3 SDRAM, RDRAM, RIMM, DIMM, SIMM, VRAM, cache memory, register memory, and/or the like.
  • the volatile storage or memory media may be used to store at least portions of the databases, database instances, database management system entities, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like being executed by, for example, the processing element 305.
  • the databases, database instances, database management system entities, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like may be used to control certain aspects of the operation of the mapping computing entity 110 with the assistance of the processing element 305 and operating system.
  • the mapping computing entity 110 may also include one or more communications interfaces 320 for communicating with various computing entities, such as by communicating data, content, information, and/or similar terms used herein interchangeably that can be transmitted, received, operated on, processed, displayed, stored, and/or the like.
  • the mapping computing entity 110 may communicate with computing entities or communication interfaces of the vehicle 100, mobile computing entities 105, and/or the like.
  • Such communication may be executed using a wired information/data transmission protocol, such as fiber distributed information/data interface (FDDI), digital subscriber line (DSL), Ethernet, asynchronous transfer mode (ATM), frame relay, information/data over cable service interface specification (DOCSIS), or any other wired transmission protocol.
  • the mapping computing entity 110 may be configured to communicate via wireless external communication networks using any of a variety of protocols, such as GPRS, UMTS, CDMA2000, 1xRTT, WCDMA, TD-SCDMA, LTE, E-UTRAN, EVDO, HSPA, HSDPA, Wi-Fi, WiMAX, UWB, IR protocols, Bluetooth protocols, USB protocols, and/or any other wireless protocol.
  • the mapping computing entity 110 may be configured to transmit and/or receive information/data transmissions via light-based communication protocols (e.g., utilizing specific light emission frequencies and/or wavelengths (e.g., visible light, infrared light, and/or the like) to transmit data), via sound-based communication protocols (e.g., utilizing specific sound frequencies to transmit data), and/or the like.
  • the mapping computing entity 110 may include or be in communication with one or more input elements, such as a keyboard input, a mouse input, a touch screen/display input, audio input, pointing device input, joystick input, keypad input, and/or the like.
  • the mapping computing entity 110 may also include or be in communication with one or more output elements (not shown), such as audio output, video output, screen/display output, motion output, movement output, and/or the like.
  • one or more of the mapping computing entity's 110 components may be located remotely from other mapping computing entity 110 components, such as in a distributed system. Furthermore, one or more of the components may be combined and additional components performing functions described herein may be included in the mapping computing entity 110. Thus, the mapping computing entity 110 can be adapted to accommodate a variety of needs and circumstances.
  • a user device as discussed herein may be a user computing entity 105.
  • the user computing entity 105 may be a stationary computing device (e.g., a desktop computer, a server, a mounted computing device, and/or the like) or a mobile computing device (e.g., a PDA, a smartphone, a tablet, a phablet, a wearable computing entity, and/or the like) having an onboard power supply (e.g., a battery).
  • Fig. 5 provides an illustrative schematic representative of a user computing entity 105 that can be used in conjunction with embodiments of the present invention.
  • the user computing entities 105 may include one or more components that are functionally similar to those of the mapping computing entity 110 and/or as described below.
  • user computing entities 105 can be operated by various parties, including residents, employees, and/or visitors of a facility.
  • a user computing entity 105 can include an antenna 412, a transmitter 404 (e.g., radio), a receiver 406 (e.g., radio), and a processing element 408 that provides signals to and receives signals from the transmitter 404 and receiver 406, respectively.
  • the user computing entity 105 may exchange signaling information/data in accordance with an air interface standard of applicable wireless systems to communicate with various entities, such as autonomous vehicles 100, mapping computing entities 110, location devices 400, and/or the like.
  • the user computing entity 105 may be capable of operating with one or more air interface standards, communication protocols, modulation types, and access types. More particularly, the user computing entity 105 may operate in accordance with any of a number of wireless communication standards and protocols.
  • the user computing entity 105 may operate in accordance with multiple wireless communication standards and protocols, such as GPRS, UMTS, CDMA2000, 1xRTT, WCDMA, TD-SCDMA, LTE, E-UTRAN, EVDO, HSPA, HSDPA, Wi-Fi, WiMAX, UWB, IR protocols, Bluetooth protocols, USB protocols, and/or any other wireless protocol.
  • the user computing entity 105 may be configured to transmit and/or receive information/data transmissions via light-based communication protocols (e.g., utilizing specific light emission frequencies and/or wavelengths (e.g., visible light, infrared light, and/or the like) to transmit data), via sound-based communication protocols (e.g., utilizing specific sound frequencies to transmit data), and/or the like.
  • the user computing entity 105 can communicate with various other entities using concepts such as Unstructured Supplementary Service information/data (USSD), Short Message Service (SMS), Multimedia Messaging Service (MMS), Dual-Tone Multi-Frequency Signaling (DTMF), and/or Subscriber Identity Module Dialer (SIM dialer).
  • the user computing entity 105 can also download changes, add-ons, and updates, for instance, to its firmware, software (e.g., including executable instructions, applications, program modules), and operating system.
  • the user computing entity 105 may include location determining aspects, devices, modules, functionalities, and/or similar words used herein interchangeably.
  • the user computing entity 105 may include outdoor positioning aspects, such as a location module adapted to acquire, for example, latitude, longitude, altitude, geocode, course, direction, heading, speed, UTC, date, and/or various other information/data.
  • the location module can acquire data, sometimes known as ephemeris data, by identifying the number of satellites in view and the relative positions of those satellites.
  • the satellites may be a variety of different satellites, including LEO satellite systems, DOD satellite systems, the European Union Galileo positioning systems, the Chinese Compass navigation systems, Indian Regional Navigational satellite systems, and/or the like.
  • the location information/data may be determined by triangulating the user computing entity's 105 position in connection with a variety of other systems, including cellular towers, Wi-Fi access points, location devices 400, and/or the like.
  • the user computing entity 105 (e.g., a mobile user computing entity 105) may also include indoor positioning aspects.
  • Some of the indoor aspects may use various position or location technologies including RFID tags, indoor location devices 400 or transmitters, Wi-Fi access points, cellular towers, nearby computing devices (e.g., smartphones, laptops) and/or the like.
  • Such technologies may include iBeacons, Gimbal proximity beacons, BLE transmitters, Near Field Communication (NFC) transmitters, and/or the like.
  • the user computing entity 105 may also comprise a user interface (that can include a display 416 coupled to a processing element 408) and/or a user input interface (coupled to a processing element 408).
  • the user interface may be an application, browser, user interface, dashboard, webpage, and/or similar words used herein interchangeably executing on and/or accessible via the mobile computing entity 105 to interact with and/or cause display of information.
  • the user input interface can comprise any of a number of devices allowing the user computing entity 105 to receive data, such as a keypad 418 (hard or soft), a touch display, voice/speech or motion interfaces, scanners, readers, or other input device.
  • the keypad 418 can include (or cause display of) the conventional numeric (0-9) and related keys (#, *), and other keys used for operating the user computing entity 105 and may include a full set of alphabetic keys or set of keys that may be activated to provide a full set of alphanumeric keys.
  • the user input interface can be used, for example, to activate or deactivate certain functions, such as screen savers and/or sleep modes. Through such inputs the user computing entity can collect contextual information/data as part of the telematics data.
  • the user computing entity 105 can also include volatile storage or memory 422 and/or non-volatile storage or memory 424, which can be embedded and/or may be removable.
  • the non-volatile memory may be ROM, PROM, EPROM, EEPROM, flash memory, MMCs, SD memory cards, Memory Sticks, CBRAM, PRAM, FeRAM, RRAM, SONOS, racetrack memory, and/or the like.
  • the volatile memory may be RAM, DRAM, SRAM, FPM DRAM, EDO DRAM, SDRAM, DDR SDRAM, DDR2 SDRAM, DDR3 SDRAM, RDRAM, RIMM, DIMM, SIMM, VRAM, cache memory, register memory, and/or the like.
  • the volatile and non-volatile storage or memory can store databases, database instances, database management system entities, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like to implement the functions of the user computing entity 105.
  • a facility, facility address, and/or similar words used herein interchangeably may be any identifiable location having a physical address, such as one or more campuses, lockers, access points, delivery locations, longitude and latitude points, geocodes, stops (e.g., pick up stops, delivery stops, vehicle visits, stops) geofenced areas, geographic areas, landmarks, buildings, bridges, and/or other identifiable locations.
  • a facility may be a residential location, such as one or more homes, one or more mobile homes, one or more apartments, one or more apartment buildings, one or more condominiums, one or more townhomes, and/or the like.
  • a facility may also be a commercial location, such as one or more stores in a mall having a defined address, one or more office buildings, one or more office parks, one or more offices of an office complex, one or more garages, one or more lockers or access points, one or more warehouses, one or more restaurants, one or more stores, one or more retail locations, and/or the like.
  • Facilities may also comprise one or more industrial locations, such as manufacturing locations, distribution locations, processing locations, industrial park locations, and/or the like.
  • facilities may encompass one or more internal locations having corresponding internal addresses.
  • the internal locations may comprise one or more rooms, hallways, portions of rooms, portions of hallways, cubicles, offices, stalls, restrooms, furniture (e.g., desks, chairs, and/or the like), walls, floors, portions of floors, stores, departments, elevators, stairwells, escalators, ramps, walkways, catwalks, roofs, basements, parking spaces, buildings (e.g., in a multi-building campus), mobile devices, mobile beacons, fixed-location beacons, and/or the like.
  • Fig. 6 provides a two-dimensional map view of a portion of an example facility, in which a plurality of location devices 400 are shown in various rooms and hallways with corresponding internal addresses.
  • only a subset of a plurality of internal locations may be associated with corresponding internal addresses.
  • various floors, portions of floors, rooms, portions of rooms, furniture, and/or the like may be associated with one or more internal addresses, while other internal locations, such as hallways, walls, and/or the like may not be specifically associated with an internal address.
  • the internal addresses may correspond to one or more network enabled computing entities, such as one or more location devices 400.
  • facilities may encompass a plurality of location devices 400 each associated with one or more internal locations, internal addresses, and/or the like.
  • a facility may encompass a network of location devices 400, collectively providing information/data regarding a plurality of internal locations within the facility and/or one or more autonomous vehicles 100 configured to traverse the interior of the facility.
  • facilities may each be associated with a location device 400 providing a general internal address for the facility.
  • a shipment/item may be any tangible and/or physical object.
  • a shipment/item may be or be enclosed in one or more packages, envelopes, parcels, bags, goods, products, containers, loads, crates, items banded together, vehicle parts, pallets, drums, the like, and/or similar words used herein interchangeably.
  • each shipment/item may include and/or be associated with an item/shipment identifier, such as an alphanumeric identifier.
  • item/shipment identifiers may be represented as text, barcodes, tags, character strings, Aztec Codes, MaxiCodes, information/data Matrices, Quick Response (QR) Codes, electronic representations, and/or the like.
  • a unique item/shipment identifier may be used by the carrier to identify and track the shipment/item as it moves through the carrier's transportation network. Further, such item/shipment identifiers can be affixed to shipments/items by, for example, using a sticker (e.g., label) with the unique item/shipment identifier printed thereon (in human and/or machine readable form) or an RFID tag with the unique item/shipment identifier stored therein. Such items may be referred to as "connected" shipments/items and/or "non-connected" shipments/items.
  • connected shipments/items include the ability to determine their locations and/or communicate with various computing entities. This may include the shipment/item being able to communicate via a chip or other devices, such as an integrated circuit chip, RFID technology, Near Field Communication (NFC) technology, Bluetooth technology, Wi-Fi technology, light-based communication protocols, sound-based communication protocols, and any other suitable communication techniques, standards, or protocols with one another and/or communicate with various computing entities for a variety of purposes.
  • Connected shipments/items may include one or more components that are functionally similar to those of the carrier server 100 and/or the mobile device 110 as described herein.
  • each connected shipment/item may include one or more processing elements, one or more display device/input devices (e.g., including user interfaces), volatile and non-volatile storage or memory, and/or one or more communications interfaces.
  • a shipment/item may communicate "send to" address information/data, "received from" address information/data, unique identifier codes, location information/data, status information/data, and/or various other information/data.
  • non-connected shipments/items do not typically include the ability to determine their locations, might not be able to communicate with various computing entities, or are not designated to do so by the carrier.
  • the location of non-connected shipments/items can be determined with the aid of other appropriate computing entities.
  • non-connected shipments/items can be scanned (e.g., affixed barcodes, RFID tags, and/or the like) or have the containers or vehicles in which they are located scanned or located.
  • an actual scan or location determination of a shipment/item is not necessarily required to determine the location of a shipment/item.
  • a scanning operation might not actually be performed on a label affixed directly to a shipment/item or location determination might not be made specifically for or by a shipment/item.
  • a label on a larger container housing many shipments/items can be scanned, and by association, the location of the shipments/items housed within the container are considered to be located in the container at the scanned location.
  • the location of a vehicle transporting many shipments/items can be determined, and by association, the location of the shipments/items being transported by the vehicle are considered to be located in the vehicle 100 at the determined location.
  • These can be referred to as "logical" scans/determinations or "virtual" scans/determinations.
  • the location of the shipments/items is based on the assumption they are within the container or vehicle, despite the fact that one or more of such shipments/items might not actually be there.
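To illustrate the "logical" or "virtual" scan association described above, the following Python sketch (all class and function names are hypothetical and not part of the disclosure) applies a single container-level scan to every shipment/item registered to that container, so each item is treated as located at the scanned internal address.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Container:
    """A container (or vehicle) holding many non-connected shipments/items."""
    container_id: str
    item_ids: List[str] = field(default_factory=list)

def apply_logical_scan(container: Container,
                       scanned_location: str,
                       item_locations: Dict[str, str]) -> None:
    """Record a single container-level scan and, by association, treat every
    item housed in the container as being located at the scanned location."""
    for item_id in container.item_ids:
        item_locations[item_id] = scanned_location  # virtual/logical scan

# Usage: one scan of the container label updates all contained items.
locations: Dict[str, str] = {}
tote = Container("TOTE-7", item_ids=["1Z001", "1Z002", "1Z003"])
apply_logical_scan(tote, "05L37D89", locations)
print(locations)  # every item is assumed to be at internal address 05L37D89
```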
  • one or more location devices 400 located within a facility may be utilized to provide location information/data to one or more devices (e.g., mobile user computing entity 105, shipment/item, autonomous vehicle 100, and/or the like) located within the facility, and/or to provide internal address information/data indicative of the current location (e.g., current location determined in real-time) of a particular mobile device user (e.g., an intended package recipient), internal location, and/or the like.
  • the location devices 400 may be associated with and/or define a particular internal location (e.g., a cubicle, a hallway, a floor, a portion of a floor, a portion of a hallway, a department, a store, and/or the like).
  • the location devices 400 may each be configured to broadcast information/data indicative of the internal location associated with the location device 400 wirelessly, within a wireless communication range associated with the location device 400.
  • the location devices 400 may be configured to transmit data via one or more wireless transmission protocols (e.g., Wi-Fi, Bluetooth, NFC, and/or the like), via soundwaves, via light, and/or the like.
  • the location devices 400 may be configured to transmit different data for different devices. For example, a single location device 400 may simultaneously and/or alternatively transmit separate data intended for multiple autonomous vehicles 100.
  • a single location device 400 may simultaneously transmit data indicative of a first navigation instruction intended for a first autonomous vehicle 100 and transmit data indicative of a second navigation instruction intended for a second autonomous vehicle 100.
  • the location devices 400 may be configured to transmit the multiple data in multiple frequencies (e.g., each transmission being broadcast in a corresponding frequency) (e.g., different radio frequencies, different light frequencies, and/or the like).
  • the multiple data may be combined into a single transmission, but separated by data marker bytes within the transmission. For example, data intended for a first autonomous vehicle 100 may be preceded by a first data marker byte, and data intended for a second autonomous vehicle 100 may be preceded by a second data marker byte.
  • the receiving devices may be configured to review the received data to search for a corresponding data marker byte that serves to mark data intended for the particular receiving device.
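One possible way to frame and read such marker-byte transmissions is sketched below; the disclosure does not define a concrete wire format, so the [marker byte][length][data] layout and the marker values used here are illustrative assumptions only.

```python
from typing import Dict, Optional

def split_by_markers(payload: bytes, markers: Dict[int, str]) -> Dict[str, bytes]:
    """Split a combined broadcast into per-vehicle segments.

    Assumed frame layout (illustrative only): each segment is
    [marker byte][1-byte length][data bytes], repeated back to back.
    `markers` maps a marker byte to the vehicle it addresses.
    """
    segments: Dict[str, bytes] = {}
    i = 0
    while i + 2 <= len(payload):
        marker, length = payload[i], payload[i + 1]
        data = payload[i + 2:i + 2 + length]
        if marker in markers:
            segments[markers[marker]] = data
        i += 2 + length
    return segments

def data_for(payload: bytes, my_marker: int) -> Optional[bytes]:
    """A receiving vehicle scans the broadcast for its own marker byte."""
    found = split_by_markers(payload, {my_marker: "me"})
    return found.get("me")

# Usage: 0xA1 addresses vehicle 1, 0xA2 addresses vehicle 2 (assumed values).
broadcast = bytes([0xA1, 4]) + b"LEFT" + bytes([0xA2, 5]) + b"RIGHT"
print(data_for(broadcast, 0xA1))  # b'LEFT'
print(data_for(broadcast, 0xA2))  # b'RIGHT'
```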
  • the location devices 400 may be configured to broadcast information/data indicative of the identity of their location to other devices (e.g., user computing entities 105, autonomous vehicles 100, connected shipments/items, and/or the like) located within the communication range of the location device 400.
  • the device may be configured to determine its location based on the information/data received from the location device 400.
  • the location devices 400 are spaced in various facilities such that a mobile device (e.g., an autonomous vehicle 100) can detect output data from at least two adjacent location devices 400.
  • an autonomous vehicle 100 may detect output signals from at least one adjacent location device 400 such that the autonomous vehicle 100 is able to determine whether the adjacent location device 400 is along an intended travel path for the autonomous vehicle 100.
  • the location devices 400 may be spaced such that at least one adjacent location device 400 is visible from each location device 400.
  • the location devices 400 may be configured to broadcast location information/data wirelessly via radio transmission (e.g., Wi-Fi, Bluetooth®, BLE, and/or the like), light transmission (e.g., visible light, infrared light, and/or the like detectable via a mobile computing entity 105), sound transmission, and/or the like.
  • the broadcast signals from the location devices 400 may enable an autonomous vehicle 100 (or other device) to determine its location (e.g., based on the location of the location device 400) and/or the autonomous vehicle's heading.
  • signals broadcast from a location device 400 may be directional, such that an autonomous vehicle may be configured to determine its direction relative to the directional signal broadcast from the location device 400.
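A rough illustration of how an autonomous vehicle 100 might derive a location and heading from the location devices 400 it detects is given below; treating the strongest detected device as the current position and deriving a heading from the previously nearest device is one simple approach, and every coordinate, identifier, and signal-strength value shown is assumed for illustration.

```python
import math
from typing import Dict, Optional, Tuple

# Assumed facility data: location-device ID -> (x, y) floor coordinates.
DEVICE_POSITIONS: Dict[str, Tuple[float, float]] = {
    "05L36": (10.0, 4.0),
    "05L37": (14.0, 4.0),
    "05L38": (18.0, 4.0),
}

def strongest_device(detections: Dict[str, float]) -> str:
    """Pick the detected location device with the strongest signal (e.g., RSSI)."""
    return max(detections, key=detections.get)

def estimate_pose(detections: Dict[str, float],
                  previous_device: Optional[str]
                  ) -> Tuple[Tuple[float, float], Optional[float]]:
    """Estimate (position, heading_degrees).

    Position is approximated by the strongest nearby device; heading is the
    bearing from the previously nearest device to the current one, if known.
    """
    current = strongest_device(detections)
    x, y = DEVICE_POSITIONS[current]
    heading = None
    if previous_device and previous_device != current:
        px, py = DEVICE_POSITIONS[previous_device]
        heading = math.degrees(math.atan2(y - py, x - px))
    return (x, y), heading

# Usage: the vehicle moved from near 05L36 to near 05L37.
pose, heading = estimate_pose({"05L36": -70.0, "05L37": -52.0},
                              previous_device="05L36")
print(pose, heading)  # (14.0, 4.0) 0.0 -> heading along +x
```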
  • the location devices 400 may comprise one or more wireless transmitters and/or receivers, as described herein with respect to various computing entities.
  • the location devices 400 may comprise a short range wireless transmitter and/or receiver (e.g., Bluetooth®, BLE, and/or the like) and/or a long range wireless transmitter and/or receiver (e.g., Wi-Fi).
  • the location devices 400 may be configured to transmit information/data indicative of the identity of the internal location associated with the location device 400 via short-range wireless transmitters and may transmit other information/data to computing entities via the long range wireless transmitters.
  • the location devices 400 may be configured to receive information/data transmitted from one or more computing entities, such as the onboard controllers of one or more autonomous vehicles 100 (or other devices such as internal building systems) and to provide navigational and/or other information/data to an autonomous vehicle 100.
  • the location devices 400 may be configured to operate as an information/data relay between the autonomous vehicle 100 (or other device) and the mapping computing entity 110.
  • the location devices 400 may be configured to receive information/data indicative of a desired destination for an autonomous vehicle 100 from the autonomous vehicle 100.
  • the location devices 400 may be configured to relay the received information/data to a mapping computing entity 110, which may be configured to determine a recommended route between the current location of the autonomous vehicle 100 (determined based at least in part on the location and/or identity of the location device 400) and the desired destination location.
  • a location device 400 may receive information/data indicative of a desired delivery location from a connected shipment/item carried by a particular autonomous vehicle 100 and may relay the desired delivery location to a mapping computing entity 110 to determine a recommended route to the desired delivery location.
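As a minimal sketch of how a mapping computing entity 110 might compute such a recommended route, the snippet below runs a breadth-first search over an assumed adjacency list of location devices 400 (which devices are directly reachable from which); the graph contents and identifiers are illustrative, not from the disclosure.

```python
from collections import deque
from typing import Dict, List, Optional

# Assumed adjacency list: which location devices are directly reachable
# from which (e.g., devices along the same hallway segment).
ADJACENCY: Dict[str, List[str]] = {
    "05L35": ["05L36"],
    "05L36": ["05L35", "05L37"],
    "05L37": ["05L36", "05L37D89"],
    "05L37D89": ["05L37"],
}

def recommended_route(current: str, destination: str) -> Optional[List[str]]:
    """Breadth-first search for the fewest-hop route between two devices."""
    queue = deque([[current]])
    visited = {current}
    while queue:
        path = queue.popleft()
        if path[-1] == destination:
            return path
        for neighbor in ADJACENCY.get(path[-1], []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # destination unreachable from the current location

# Usage: route from the device nearest the vehicle to the delivery desk.
print(recommended_route("05L35", "05L37D89"))
# ['05L35', '05L36', '05L37', '05L37D89']
```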
  • one or more of the location devices 400 may receive information/data instructing the location devices 400 to provide guidance to an autonomous vehicle 100, for example, by providing autonomous vehicle-detectable indicia of a recommended direction of travel to reach the desired destination.
  • the location devices 400 may have associated notification mechanisms, such as speakers, lights (e.g., Light Emitting Diodes), displays, and/or the like configured to provide an indication of a direction of travel for the mobile device user.
  • the mapping computing entity 110 of the facility may provide information/data instructing location devices 400 located along the recommended travel path to emit navigational instructions detectable by an autonomous vehicle 100 to form a virtual breadcrumb trail through the facility to the destination location.
  • the detectable navigational instructions may comprise illuminating an indicator (e.g., by illuminating the location devices 400 in a particular color recognizable by a camera and/or light sensor of the autonomous vehicle, as shown in example Figs. 7-8 and 10) to provide a path of lights for the autonomous vehicle 100 to follow to the desired destination.
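The "virtual breadcrumb trail" could be driven by logic along the lines of the following sketch, which maps each location device 400 on a recommended route to the color it should emit; the specific colors and command structure are assumptions for illustration.

```python
from typing import Dict, List

# Assumed colors recognizable by the vehicle's camera/light sensor.
PATH_COLOR = "green"         # "follow me" indicator along the route
DESTINATION_COLOR = "blue"   # marks the final destination device

def breadcrumb_commands(route: List[str]) -> Dict[str, str]:
    """Map each location device on the route to the color it should emit:
    intermediate devices light up in the path color, the last device in the
    destination color, forming a trail the autonomous vehicle can follow."""
    if not route:
        return {}
    commands = {device_id: PATH_COLOR for device_id in route[:-1]}
    commands[route[-1]] = DESTINATION_COLOR
    return commands

# Usage with the route computed earlier.
print(breadcrumb_commands(["05L35", "05L36", "05L37", "05L37D89"]))
# {'05L35': 'green', '05L36': 'green', '05L37': 'green', '05L37D89': 'blue'}
```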
  • the location devices 400 may be in wireless and/or wired communication with other devices, such as other location devices 400, user computing entities 105, mapping computing entities 110, autonomous vehicles 100, shipments/items, and/or the like.
  • Each location device 400 may comprise one or more memory storage units (e.g., for storing information/data indicative of a location corresponding to the location device 400), one or more processing units, and/or the like.
  • the location devices 400 may be standalone units providing location information/data for various internal mapping, internal navigation, and/or internal addressing functions.
  • One or more location devices 400 may be secured relative to a particular item, device, and/or the like, and may store information/data indicative of an internal location description for the item, device, and/or the like to which it is attached.
  • a location device 400 may be secured to a ceiling tile, a desk, a chair, a wall, a floor tile, an elevator, a step, a door, and/or the like.
  • the location devices 400 may be embodied as one or more network enabled devices (e.g., Internet of Things enabled devices), such as a thermostat, light fixture, light switch, desktop computer, notebook computer, electronic whiteboard, and/or the like.
  • each location device 400 may be configured to store information/data indicative of an internal location associated with the location device 400.
  • the information/data stored by the location device 400 may comprise at least a portion of an internal address within a particular facility (e.g., within a building, a campus, and/or the like).
  • the location information/data stored by the location devices 400 may comprise a character and/or a string of characters, a symbol, and/or the like.
  • the location information/data may comprise data received from a plurality of location information/data sources.
  • the location information/data may be indicative of a relative location of the location device 400 within the facility.
  • the location information/data may be indicative of a floor on which the location device 400 is located, a room in which the location device 400 is located, a building (e.g., in a multi-building facility) in which the location device 400 is located, and/or the like.
  • the location device 400 may store a portion of an internal address in the form of 05L37D89, which may correlate to Desk Number 89, located proximate Light Fixture Number 37, on Floor 5 of a particular facility.
  • the location information/data may be dynamic location information/data reflective of a current location of a mobile device (e.g., a mobile user computing entity 105) relative to one or more location devices 400.
  • the location information/data may comprise an internal address comprising data indicative of a mobile device located proximate a location device 400.
  • the address may be updated to reflect the location of a particular mobile device (e.g., a mobile device associated with a resident of a building).
  • a mobile device identifier (e.g., a character string) may be appended to the internal address, such that the indoor address for the mobile device may be 05L37D89P33.
  • a plurality of mobile devices located at the same internal location may have different internal location addresses.
  • a first mobile device may be associated with the internal location address 05L37D89P33, while a second mobile device may be associated with the internal location address 05L37D89P46.
  • the internal location addresses for each mobile device may reflect that the location devices are located at the same internal location, but may reflect a distinction in identity between the various mobile devices.
  • location devices 400 may be in communication with other location devices 400 in order to provide information/data indicative of the current internal location of a particular location device 400, to provide information/data indicative of navigational instructions between location devices 400, and/or to provide other information/data between a plurality of location devices 400.
  • location devices 400 may be in communication with one another in a hierarchical fashion, for example, in which a plurality of location devices 400 are in communication with a master location device 400.
  • each master beacon 400 may be associated with a large area within a location (e.g., a single floor in a multi-floor building, a geofenced area within a particular building, one or more areas associated with defined, subservient location devices 400, and/or the like) and each subservient location device 400 may be associated with a small area within the area corresponding to the master beacon 400 (e.g., a particular cubicle on the floor of the building).
  • continuing the 05L37D89 example, a first master level location device 400 associated with the fifth floor of the building may provide the first two characters (05) of the address, a second master level location device 400 associated with Light Fixture 37 may provide the next three characters (L37), and a third level location device 400 associated with Desk 89 may provide the last three characters (D89). It should be understood that this example should not be construed as limiting, as various location devices 400 may provide other configurations and/or portions of an internal address.
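Assuming the segments are simply concatenated in master-to-subservient order (an assumption consistent with, but not stated by, the 05L37D89 and 05L37D89P33 examples), composing and parsing such an internal address might look like the sketch below; all function names are hypothetical.

```python
from typing import List, Optional

def compose_internal_address(segments: List[str],
                             mobile_suffix: Optional[str] = None) -> str:
    """Concatenate address segments contributed by master/subservient location
    devices (e.g., floor, light fixture, desk) into one internal address, and
    optionally append a mobile-device identifier."""
    address = "".join(segments)
    return address + mobile_suffix if mobile_suffix else address

def parse_example_address(address: str) -> dict:
    """Illustrative parse of the 05L37D89 pattern used in the text:
    two characters of floor, then 'L' + fixture number, then 'D' + desk number."""
    return {
        "floor": address[0:2],                 # '05'
        "light_fixture": address[2:5],         # 'L37'
        "desk": address[5:8],                  # 'D89'
        "mobile_device": address[8:] or None,  # e.g. 'P33' when present
    }

# Usage: floor 5, light fixture 37, desk 89, mobile device P33.
addr = compose_internal_address(["05", "L37", "D89"], mobile_suffix="P33")
print(addr)                         # 05L37D89P33
print(parse_example_address(addr))  # {'floor': '05', 'light_fixture': 'L37', ...}
```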
  • each location device 400 may be a standalone location device 400 in direct communication with a mapping computing entity 110.
  • each location device 400 may comprise the entirety of the internal address corresponding to the location device 400.
  • the mapping computing entity 110 may store information/data indicative of the location of various location devices 400 within the facility.
  • the mapping computing entity 110 may store digital map information/data for a facility having an indication of the location of each location device 400 stored within the map information/data.
  • the location devices 400 may comprise one or more notification mechanisms configured to output autonomous-vehicle detectable notifications (e.g., visual (e.g., light-based) notifications, audible notifications, radio frequency notifications, and/or the like).
  • one or more location devices 400 may comprise one or more light sources (e.g., Light Emitting Diodes ("LEDs")) configured to emit light in response to one or more signals received from another computing entity.
  • the one or more location devices 400 may comprise a plurality of light sources and/or one or more light sources configured to emit multiple light colors (e.g., light having a selectable wavelength) in order to convey specific information/data to various autonomous vehicles 100.
  • a location device 400 may be configured to emit a first light color to indicate a desired direction of travel and a second light color to indicate the location of a desired destination.
  • the autonomous vehicle 100 may be configured to detect the emitted light and to distinguish between the first light color and the second light color.
  • the location devices 400 may be configured to emit one or more radio frequency signals detectable by an autonomous vehicle, for example, to guide the autonomous vehicle 100 to a desired internal location within a facility.
  • location information/data stored on one or more location devices 400 may be generated manually during a location device 400 initialization process, and/or automatically. For example, location information/data indicative of a particular internal location may be manually loaded onto a storage device associated with a location device 400 (e.g., based on user input received by the location device 400). In such embodiments, information/data indicative of the relative locations of various location devices 400 within a serviceable point may be manually and/or automatically determined to enable internal mapping and/or internal navigation between various internal locations. Information/data indicative of the internal addresses associated with each of the plurality of location devices 400 may be stored in association with the mapping computing entity 110, such that mapping and/or navigational operations may be enabled by the mapping computing entity 110.
  • the mapping computing entity 110 may be configured to automatically associate various internal addresses with various location devices 400.
  • the various location devices 400 may be configured to automatically identify other location devices 400 in an area surrounding the location device 400 in order to determine a relative location of each location device 400 relative to other location devices 400.
  • the mapping computing entity 110 may comprise information/data indicative of an internal map, such as a blueprint (e.g., a two-dimensional blueprint and/or a three-dimensional blueprint) and may be configured to associate the relative locations of various location devices 400 with particular internal locations reflected within the internal map.
  • one or more computing entities may provide functionality similar to a location device 400.
  • a mobile user computing entity 105 associated with a particular mobile device user located within a facility may operate as a location device 400 indicating the current location of the associated mobile device user.
  • the mobile computing entity 105 may be configured to determine its location relative to one or more location devices 400 to enable its location to be monitored and/or stored by a mapping computing entity 110.
  • a particular mobile user computing entity 105 may be identified as a desired destination location for a particular mobile device user, and the mapping computing entity 110 (and/or another computing entity) may be configured to generate a recommended route to the current location of the mobile user computing entity 105 defining the destination location.
  • the current location of the associated mobile device user may be monitored and/or identified as a desired destination within the facility.
  • the mapping computing entity 110 may thus be configured to determine a recommended travel path from a particular location to the current location of a mobile device user, based at least in part on the current location of the associated computing entity 105.
  • the mapping computing entity 110 may be configured to determine whether a particular mobile computing entity 105 is located within the associated facility prior to generating a recommended route to the mobile computing entity 105. For example, upon determining that a particular mobile computing entity 105 is identified as a desired destination (e.g., an individual associated with the mobile computing entity 105 is identified as a recipient of a package to be delivered), the mapping computing entity 110 may first determine whether the mobile computing entity 105 is located within the associated facility. If the mobile computing entity 105 is not located within the facility, the mapping computing entity 110 may be configured to transmit information/data to a requesting computing entity (e.g., an autonomous vehicle 100) indicating that the desired destination is unavailable. The mapping computing entity 110 may additionally provide potential alternative destination locations, such as a user's office, desk, cubicle, and/or the like, as alternative destinations for the requesting computing entity.
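A simplified version of that availability check with an alternative-destination fallback might look like the following; the directory structure and identifiers are assumptions made for illustration.

```python
from typing import Dict, List

def resolve_destination(recipient_entity_id: str,
                        entities_in_facility: Dict[str, str],
                        alternative_destinations: List[str]) -> dict:
    """Decide what to return to a requesting autonomous vehicle.

    If the recipient's mobile computing entity is currently located within the
    facility, route to its monitored internal address; otherwise report the
    destination as unavailable and offer alternatives (e.g., office, desk).
    """
    current_address = entities_in_facility.get(recipient_entity_id)
    if current_address is not None:
        return {"available": True, "destination": current_address}
    return {"available": False, "alternatives": alternative_destinations}

# Usage: the recipient left the building, so fall back to their desk.
print(resolve_destination(
    "entity-105-jsmith",
    entities_in_facility={},                # nobody currently tracked inside
    alternative_destinations=["05L37D89"],  # e.g., the recipient's desk
))
```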
  • a transportation mechanism 170 may be configured for movement of a user, an item, an autonomous vehicle 100, and/or the like within a facility.
  • transportation mechanisms 170 may comprise elevators, dumbwaiters, escalators, people movers, moving walkways, automated transit (e.g., monorail, train, and/or the like), and/or the like.
  • transportation mechanisms 170 may comprise one or more computing mechanisms, such as one or more processors, memory storage devices, communication interfaces, and/or the like, as described herein in reference to other computing entities.
  • transportation mechanisms 170 may be in wired and/or wireless communication with one or more other computing entities, such as mapping computing entity 110, autonomous vehicles 100, user computing entities 105, location devices 400, and/or the like. Moreover, in various embodiments, one or more transportation mechanisms 170 may comprise one or more location devices 400 associated with the transportation mechanism 170.
  • one or more transportation mechanisms 170 may be selectably and/or continuously operable.
  • an escalator may be configured to operate continuously, regardless of whether a person and/or item is being transported by the escalator.
  • a transportation mechanism 170, such as an elevator (and/or an escalator), may be configured to operate (e.g., move) only in response to an indication that a person and/or item is located thereon.
  • the one or more transportation mechanisms 170 may be configured to determine the presence of a user and/or an autonomous vehicle 100 based on information/data received from a mobile user computing entity 105 (and/or the autonomous vehicle 100) indicating that the user or autonomous vehicle 100 is proximate a particular transportation mechanism 170.
  • the one or more transportation mechanisms 170 may be configured to move to a particular location to pick up a user carrying the detected mobile user computing entity 105 (e.g., an elevator may move to a particular floor at which the mobile computing entity is detected).
  • the one or more transportation mechanisms may be configured to receive a desired location (e.g., a desired floor) for permitting a user to exit the transportation mechanism 170 from the mobile computing entity (and/or another computing entity). Accordingly, various transportation mechanisms 170 need not require physical user interactions (e.g., pressing buttons) in order to operate the transportation mechanism 170.
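A transportation mechanism 170 that responds to wireless presence messages rather than button presses could be modeled roughly as below; the message fields (detected floor, requested exit floor) and the stop queue are illustrative assumptions.

```python
from collections import deque
from dataclasses import dataclass, field
from typing import Deque

@dataclass
class Elevator:
    current_floor: int
    stops: Deque[int] = field(default_factory=deque)

    def handle_presence_message(self, detected_floor: int,
                                requested_exit_floor: int) -> None:
        """React to a wireless presence message from a mobile computing entity
        or autonomous vehicle: queue a pick-up at the floor where it was
        detected and a drop-off at the requested exit floor, with no physical
        button press required."""
        self.stops.append(detected_floor)
        self.stops.append(requested_exit_floor)

    def step(self) -> int:
        """Serve the next queued stop (movement details omitted)."""
        if self.stops:
            self.current_floor = self.stops.popleft()
        return self.current_floor

# Usage: an autonomous vehicle detected on floor 1 requests floor 5.
car = Elevator(current_floor=3)
car.handle_presence_message(detected_floor=1, requested_exit_floor=5)
print(car.step(), car.step())  # 1 5
```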
  • Various embodiments are configured for providing one or more internal locations with corresponding internal addresses based on a location of one or more location devices 400, and for providing guidance to a particular location having an internal address.
  • various locations (and/or mobile devices) within a facility may be associated with defined internal addresses unique to each of the various locations.
  • the internal addresses may correspond to a particular location within the facility, as reflected in map information/data stored for a particular facility.
  • the internal addresses may be identified in reference to location devices 400 located nearby to various internal locations.
  • Each of a variety of locations may correspond to unique internal addresses that may distinguish a particular location from others in the same facility. Accordingly, a particular internal location, such as a particular cubicle, office, floor, room, and/or the like, may be identified based on a corresponding internal address.
  • internal addresses may comprise one or more information/data elements configured to be reflective of a particular location corresponding to the particular address.
  • a particular internal address may comprise a portion identifying a floor, a portion of a floor (e.g., corresponding to a particular light fixture within the portion of the floor), a building (e.g., within a multi-building facility), and/or a particular internal location (e.g., a piece of furniture (e.g., a desk)) located proximate the light fixture.
  • internal addresses for various internal locations may be defined in any of a variety of ways, such as via unique character strings.
  • an internal address may be generated based on data identifying a plurality of associated locations, devices, objects, and/or the like.
  • various internal addresses may be static and/or dynamic internal addresses (e.g., an address of a particular room may be static and an address of a particular mobile device, such as a mobile user computing entity carried by a facility resident, may be dynamic).
  • an individual (e.g., having an associated mobile user computing entity) entering a particular space may thus be associated with a dynamic internal address at that location reflecting the identity of the individual's mobile user computing entity.
  • a cubicle/office may have an internal address determined based at least in part on an internal address associated with a nearby light fixture, a nearby desk, a nearby mobile device, and/or the like.
  • a second individual entering the same space with a second mobile user computing entity may have a second dynamic internal address at the same location, reflecting the identity of the second mobile user computing entity.
  • particular internal locations may be associated with multiple unique internal addresses based at least in part on a mobile user computing entity located at the internal location.
  • one or more internal addresses may correspond to a variety of internal locations within a single building defining a facility, and/or a variety of locations within a plurality of buildings collectively defining a facility (e.g., within a multi-building campus). Accordingly, in various embodiments, an internal address may be indicative of a particular building, a particular floor within the building, a particular region of the floor, a particular room on the floor, a particular piece of furniture (or other internal location) within the particular region of the floor, and/or the like.
  • the internal addresses may correspond with one or more location devices 400 located near the addressed locations.
  • each location device 400 may have associated location information/data identifying the particular location device 400.
  • the location information/data may comprise a unique identifier for the location device 400, and a particular location proximate the location device 400 may be identified based on the unique identifier of the location device 400.
  • as an example, the internal address for a particular office may be 05L56R10.
  • the internal addresses may be correlated with map information/data comprising information/data indicative of the relative position of various locations.
  • the map information/data may comprise one or more building (or campus) maps, blueprints, and/or the like, such as two-dimensional maps, three-dimensional maps, and/or the like in order to provide a locational relationship between the one or more internal addresses and various other locations within a particular facility.
  • Fig. 6 provides a two-dimensional map view of an example portion of a facility indicating the location of various location devices 400 and their associated internal addresses within the facility.
  • the map information/data may be stored in the mapping computing entity 110 having embedded internal location information/data points stored therein.
  • the map information/data having embedded internal location information/data points may be publicly accessible. However, in other embodiments, the map information/data may be privately stored, such that only authorized personnel (and/or authorized autonomous vehicles 100) are granted access to at least a portion of the map information/data. For example, as discussed herein, the map information/data and/or other information/data associated with the internal addressing features may be provided to various autonomous vehicles 100 upon receipt of authorization information/data from the autonomous vehicle 100.
  • the map information/data may be reflective of the location of one or more location devices 400 within the facility.
  • the map information/data may be automatically and/or manually populated with the relative positioning of the location devices 400.
  • the mapping computing entity 110 may be configured to identify a relative location of a particular location device 400 within stored map information/data based at least in part on location information/data stored for the location device 400.
  • location information/data for a particular location device 400 may be indicative of a floor, a location on a floor, a building, and/or the like for the location device 400.
  • the map information/data may comprise information/data identifying various portions of the map information/data as a particular floor, a particular location on a floor, and/or the like.
  • the map information/data may comprise information/data indicative of the location of various location devices 400 relative to various walls, doors, and/or rooms.
  • the map information/data may comprise information/data indicative of various internal addresses associated with various locations (e.g., associated with the locations of various location devices), and/or the like.
  • the mapping computing entity 110 may be configured to correlate the location information/data for a particular location device 400 with information/data identifying various locations within the map information/data to automatically identify a precise location of a location device 400 within the map information/data.
  • the location of various location devices 400 may be manually provided within the map information/data.
  • the map information/data may be stored in a mapping computing entity 110 associated with the mapped facility.
  • facilities may store their own map information/data and may communicate various portions of the stored map information/data to autonomous vehicles 100, via the internet, via local area networks, and/or the like.
  • map information/data corresponding to a particular facility may be communicated to a particular autonomous vehicle 100 when the autonomous vehicle 100 is located within a particular geographical area (e.g., within the facility, within a defined geofenced area, within a wireless communication range of one or more location devices 400 located within the facility, and/or the like).
  • the mapping computing entity 110 may be configured to transmit at least a portion of the map information/data to the computing entity upon establishing an electronic communication between the autonomous vehicle 100 and one or more electronic entities (e.g., mapping computing entity 110 and/or location devices 400) corresponding to the facility. Accordingly, the map information/data may be publicly accessible to any autonomous vehicles 100 within a facility.
  • the mapping computing entity 110 may be configured to only transmit map information/data to authorized autonomous vehicles 100 (e.g., autonomous vehicles 100 associated with the facility; autonomous vehicles associated with known visitors of the facility (e.g., delivery services companies); and/or the like), and in such embodiments, the autonomous vehicle 100 (and/or another user computing entity 105) may be required to present authentication information/data to the mapping computing entity 110 before the mapping computing entity 110 transmits map information/data to the autonomous vehicle 100.
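One way such an authorization gate might work is sketched below, using a simple token check before any map information/data is returned; the token scheme and identifiers are purely illustrative, as the disclosure does not specify an authentication mechanism.

```python
from typing import Dict, Optional

# Assumed registry of credentials for authorized autonomous vehicles
# (facility-owned vehicles, known visitors such as delivery carriers, etc.).
AUTHORIZED_TOKENS: Dict[str, str] = {"av-100": "secret-token-123"}

MAP_DATA = {"floors": 12, "location_devices": ["05L36", "05L37", "05L37D89"]}

def request_map_data(vehicle_id: str, token: Optional[str]) -> Optional[dict]:
    """Return map information/data only when the requesting autonomous vehicle
    presents valid authentication information/data; otherwise return nothing."""
    if token is not None and AUTHORIZED_TOKENS.get(vehicle_id) == token:
        return MAP_DATA
    return None

# Usage: an authorized vehicle gets the map, an unknown one does not.
print(request_map_data("av-100", "secret-token-123") is not None)  # True
print(request_map_data("av-999", "guess") is not None)             # False
```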
  • the mapping information/data may be provided to one or more autonomous vehicles 100 from the mapping computing entity 110 via one or more wireless networks (e.g., the internet, an intranet, and/or the like).
  • the map information/data may be provided to the autonomous vehicle 100 with an indication of a current location of the autonomous vehicle 100, and/or the autonomous vehicle 100 may be configured to separately determine its own location.
  • the mapping computing entity 110 may provide the autonomous vehicle 100 with at least a portion of the map information/data via a network connection between the autonomous vehicle 100 and the mapping computing entity 110.
  • the mapping computing entity 110 may be configured to provide at least a portion of the map information/data upon the autonomous vehicle 100 connecting to a wireless network (e.g., a Wi-Fi network corresponding to the facility, a short-range wireless connection with one or more location devices 400, and/or the like).
  • the autonomous vehicle 100 may be configured to determine its own location relative to one or more location devices 400 within the facility, and to generate an indicator of its own location within the received map information/data.
  • the autonomous vehicle 100 may be configured to transmit information/data indicative of its current location (as determined based on a location of nearby location devices 400) to the mapping computing entity 110, which may be configured to incorporate an indication of the location of the autonomous vehicle 100 into the map information/data before transmitting the same to the autonomous vehicle 100.
  • the mapping computing entity 110 may be configured to provide at least a portion of the map information/data to an autonomous vehicle 100 via one or more location devices 400 located near the autonomous vehicle 100.
  • the location devices 400 utilized to transmit map information/data to the autonomous vehicle 100 may be configured to update the map information/data received from the mapping computing entity 110 to incorporate information/data indicative of its own location prior to transmitting the updated map information/data to the autonomous vehicle 100.
  • the map information/data comprises an indication of the location of the autonomous vehicle 100 based on the location of a nearby location device 400 from which the map information/data is received.
  • the location device 400 may be configured to transmit the map information/data (and/or other data) to the autonomous vehicle 100 via wireless communication protocols, such as short range Bluetooth, short range Wi-Fi, NFC, and/or the like.
  • the mapping computing entity 110 may enable computing entities located outside of a facility to access the map information/data for a particular facility.
  • the mapping computing entity 110 may be configured to publish the map information/data via the Internet, thereby enabling computing entities (e.g., user computing entities) located outside of the facility to access one or more portions of the map information/data.
  • the map information/data may be stored on one or more third party mapping computing entities 110 located geographically remotely from the facility.
  • a computing system located within the facility may be configured to relay map information/data from the third party mapping computing entities 110 to the autonomous vehicles 100, and/or the autonomous vehicles 100 may be configured to receive the map information/data directly from the third party mapping computing entities 110.
  • the third party mapping computing entities 110 may be configured to automatically determine the location of the autonomous vehicle 100 (e.g., based on information/data provided by the autonomous vehicle) prior to providing the map information/data to the autonomous vehicle 100.
  • the third-party mapping computing system 110 may be configured to transmit map information/data for a particular facility to an autonomous vehicle 100 upon determining that the autonomous vehicle 100 is located within the facility.
  • the third party mapping computing entity 110 may comprise map information/data for a plurality of facilities, and may be configured to identify appropriate map information/data to provide to an autonomous vehicle 100 based at least in part on the location of the autonomous vehicle 100.
  • the autonomous vehicle 100 may be configured to periodically provide the mapping computing entity 110 with updated information/data indicative of the location of the autonomous vehicle 100 within the facility such that the mapping computing entity 110 may be configured to update the location of the autonomous vehicle 100 within the map information/data.
  • the autonomous vehicle 100 may be configured to receive location information/data from a nearby location device 400 (e.g., location information/data identifying the location device 400 broadcast by the location device 400 to the autonomous vehicle 100 while the autonomous vehicle 100 is located within a communication range corresponding to the location device 400), and to transmit information/data identifying the corresponding location device 400 to the mapping computing entity 110 (e.g., via Wi-Fi, cellular information/data connection, and/or the like).
  • the mapping computing entity 110 upon receipt of the location information/data from the autonomous vehicle 100, may update the determined location of the autonomous vehicle 100 within the facility.
  • the mapping computing entity 110 may be configured to transmit updated map information/data back to the autonomous vehicle 100 to reflect the updated location of the autonomous vehicle 100 within the facility.
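On the mapping computing entity 110 side, the periodic location-update exchange described above might reduce to a handler like the following sketch, which accepts the identifier of the location device 400 the vehicle last heard, updates the vehicle's tracked position, and returns refreshed map information/data; all names and coordinates are assumed.

```python
from typing import Dict, Tuple

# Assumed mapping-entity state: device locations and tracked vehicle positions.
DEVICE_POSITIONS: Dict[str, Tuple[float, float]] = {
    "05L36": (10.0, 4.0),
    "05L37": (14.0, 4.0),
}
VEHICLE_POSITIONS: Dict[str, Tuple[float, float]] = {}

def handle_location_update(vehicle_id: str, heard_device_id: str) -> dict:
    """Update the tracked location of an autonomous vehicle based on the
    location device it reports being within range of, and return refreshed
    map information/data that reflects the new vehicle position."""
    VEHICLE_POSITIONS[vehicle_id] = DEVICE_POSITIONS[heard_device_id]
    return {"devices": DEVICE_POSITIONS, "vehicles": dict(VEHICLE_POSITIONS)}

# Usage: vehicle 100 reports that it currently hears device 05L37.
updated_map = handle_location_update("av-100", "05L37")
print(updated_map["vehicles"])  # {'av-100': (14.0, 4.0)}
```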
  • the autonomous vehicle 100 may be required to have a specific software application installed thereon and configured to parse the received map information/data. However, in certain embodiments, it should be understood that the autonomous vehicle 100 may not be required to have specific software applications installed thereon.
  • the specific software applications may comprise mapping software.
  • the mapping software may be specific to indoor navigation, may be configured for both outdoor (e.g., GPS-based) navigation and indoor navigation, and/or the like.
  • the map information/data may be configured to be viewable via an interface, such as an Internet browser (e.g., Safari®, Google Chrome, Internet Explorer, Firefox, Opera, Netscape Navigator, and/or the like).
  • the map information/data may comprise information/data usable by the autonomous vehicle 100 to navigate the interior of the facility.
  • the autonomous vehicle 100 may be configured to generate a graphical display indicative of an interior map of the facility.
  • the graphical display may comprise a three-dimensional graphical display indicative of distance and altitude within the facility, and/or one or more two-dimensional graphical displays each indicative of a single altitude (e.g., floor) within a facility.
  • An example map display is shown in Fig. 4; the map display may identify the location of various location devices 400, the location of an autonomous vehicle 100, the location of various destinations, the address of various locations, and/or the like.
  • the location of one or more mobile user computing entities 105 may be monitored within a facility, and the current location of a particular mobile user computing entity 105 may be correlated with a particular interior address.
  • the location of residents, employees, and/or the like within the building may be monitored and associated with particular indoor addresses based on the location of mobile user computing entities 105 associated with each mobile device user.
  • the one or more internal addresses may be utilized for defining a route between a particular location (e.g., a current location of an autonomous vehicle 100) and a desired interior address.
  • the internal addresses may be correlated with map information/data (e.g., comprising a blueprint and/or other internal layout providing information/data indicative of the spatial relationships between various internal locations within the facility).
  • various embodiments may be configured to calculate one or more routes between the current location of an autonomous vehicle 100 and a desired destination address within the facility.
  • navigational instructions may then be provided to the autonomous vehicle 100 to guide the autonomous vehicle 100 to the desired destination location.
  • the recommended route utilized to generate the navigational instructions may be generated for a particular autonomous vehicle 100 and may be provided to the autonomous vehicle 100 to guide the autonomous vehicle 100 to the desired destination.
  • one or more computing entities may be configured to determine a recommended route between a current location of an autonomous vehicle 100 and a desired destination of the autonomous vehicle 100 (e.g., a delivery location and/or intended recipient for a shipment to be delivered by the autonomous vehicle 100).
  • the computing entity may receive information/data indicative of a current location of the autonomous vehicle 100 within a facility, and/or additional information about the facility, such as facility systems (e.g., environmental systems, transport systems, crowd control systems, and/or the like) as shown at Block 601 of Fig. 7.
  • the location of an autonomous vehicle 100 within a facility may be determined based on the identity of location devices 400 determined to be nearby the autonomous vehicle 100 (e.g., based on an estimated wireless communication link between the autonomous vehicle 100 and one or more location devices 400).
  • a desired destination may be identified by a corresponding destination address (e.g., a character string corresponding to the internal address of the destination location); a name associated with a particular location (e.g., John Smith's office; 15th floor conference room; Reception; and/or the like); a mobile user computing entity user's name (e.g., John Smith); and/or the like.
  • the autonomous vehicle 100 may be configured to accept a free-form text input indicative of the destination location (e.g., via user input provided prior to the autonomous vehicle departing for the intended destination), a user selection of one or more listed locations, a scanned destination location (e.g., a delivery destination for a shipment/item), transmitted destination location from a connected shipment/item, transmitted destination location from a mobile user computing entity 105, and/or the like.
  • the autonomous vehicle 100 may receive user input identifying a desired destination location, and the autonomous vehicle 100 may utilize the received user input to identify the location of the desired destination address, and/or the autonomous vehicle 100 may transmit information/data indicative of the desired destination to a mapping computing entity 110 to identify a recommended route.
  • a connected shipment/item may transmit information/data indicative of an intended consignee/destination for the shipment/item to the computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100), which may utilize the intended consignee/destination as the desired destination.
  • a desired destination address within the facility may be identified based on information/data stored within the onboard controller of the autonomous vehicle 100.
  • the autonomous vehicle 100 may be configured to utilize information/data indicative of scheduled tasks, scheduled deliveries of shipments/items, and/or the like occurring at defined locations within the facility to identify a desired destination address.
  • a desired arrival time corresponding to the particular entry and a location corresponding to the entry may be identified (e.g., a desired delivery time and location for a particular shipment).
  • the autonomous vehicle 100 may be configured to compare the desired arrival time for the entry against the current time, and may be configured to identify the location for the entry as the desired destination address if the desired arrival time for the entry is less than a configurable threshold amount of time from the current time.
  • the autonomous vehicle 100 may be configured to identify a plurality of possible delivery locations each corresponding to non-overlapping time frames identified relative to the current time. For example, the autonomous vehicle 100 may be configured to identify a first delivery location if the desired arrival time is within a first timeframe relative to the current time (e.g., within 15 minutes of the current time) and a second delivery location if the desired arrival time is within a second timeframe relative to the current time (e.g., between 15 minutes and 30 minutes after the current time).
  • the autonomous vehicle 100 may be configured to automatically identify the location of the delivery as a desired destination location upon identifying information/data stored within the onboard controller of the autonomous vehicle 100 identifying a delivery scheduled to occur less than 15 minutes in the future.
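The threshold-based selection of a scheduled delivery as the desired destination can be pictured with the sketch below; the 15-minute default, the schedule format, and the addresses shown are illustrative assumptions.

```python
from datetime import datetime, timedelta
from typing import List, Optional, Tuple

# Assumed schedule entries stored in the onboard controller:
# (desired arrival time, internal delivery address).
Schedule = List[Tuple[datetime, str]]

def next_destination(schedule: Schedule,
                     now: datetime,
                     threshold: timedelta = timedelta(minutes=15)) -> Optional[str]:
    """Identify the delivery location whose desired arrival time is within a
    configurable threshold of the current time, preferring the soonest one."""
    upcoming = [(t, addr) for t, addr in schedule
                if timedelta(0) <= t - now <= threshold]
    if not upcoming:
        return None
    return min(upcoming)[1]  # soonest qualifying entry

# Usage: one delivery due in 10 minutes qualifies; one in 40 minutes does not.
now = datetime(2018, 9, 12, 9, 0)
schedule = [(now + timedelta(minutes=10), "05L37D89"),
            (now + timedelta(minutes=40), "07L12D03")]
print(next_destination(schedule, now))  # 05L37D89
```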
  • the computing entity (e.g., mapping computing entity 110 and/or onboard computing entity of the autonomous vehicle 100) may receive information/data indicative of multiple desired destinations and/or one or more waypoints the autonomous vehicle 100 is scheduled to visit prior to arriving at a desired destination.
  • an autonomous vehicle 100 may store information/data indicative of a plurality of shipment/item deliveries for a particular facility, each of which may be identified as a particular waypoint within a facility.
  • the computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100) may be configured to determine a most efficient route within the facility for delivering the shipments/items, and may generate a route between each of the plurality of shipment/item delivery destinations.
  • a destination internal location may be identified based on a desired internal delivery location for an item.
  • the desired internal delivery location for the item may be identified based on user input identifying the desired internal delivery location received by the autonomous vehicle 100, based on computer-readable information/data printed on the item (e.g., a bar code, MaxiCode, QR code, RFID tag, and/or the like) and received by the autonomous vehicle 100 (e.g., via scanning the computer-readable information/data from the item); based on information/data transmitted from the item to a computing entity (e.g., onboard controller of the autonomous vehicle 100, location devices 400, and/or mapping computing entity 110); based on information/data transmitted from a third-party computing entity (e.g., a carrier-operated computing entity); based on information/data stored within the onboard controller of the autonomous vehicle 100, and/or the like.
  • a desired internal delivery location may be identified as a specific internal address; as an identifier associated with a specific internal address (e.g., John Smith's office); as an identity of an intended recipient; and/or the like.
  • one or more computing entities (e.g., mapping computing entity 110 and/or autonomous vehicle 100) may be configured to identify an internal location associated with the intended recipient (e.g., based on a static and/or dynamic directory comprising information/data identifying one or more locations corresponding to one or more occupants of a serviceable point).
  • an internal destination location corresponding to the intended recipient may be a static location (e.g., the intended recipient's cubicle, desk, office, apartment, and/or the like) and/or a dynamic location (e.g., determined based on a monitored current location of an intended recipient's mobile computing entity).
  • various embodiments may monitor the locations of one or more users (e.g., based on the determined location of a mobile computing entity carried by the user), and may determine typical internal locations associated with the user and/or typical times (and/or ranges) at which the user moves to one or more locations.
  • the onboard controller of the autonomous vehicle 100 and/or mapping computing entity 110 may be configured to collect and store location information/data indicative of a corresponding user's location within the facility over time. Based at least in part on the collected and stored location information/data for a particular mobile user computing entity user, the autonomous vehicle 100 and/or mapping computing entity 110 may be configured to identify times at which a particular user is positioned at a particular location within the facility.
  • the computing entity may be configured to determine that a particular mobile user computing entity user is located in a particular office between 8AM and 12PM, in a lunch room between 12PM and 1PM, and in the same office between 1PM and 5PM.
  • the computing entity may be configured to identify the office as the destination location when the mobile user computing entity user first arrives at the facility at approximately 8AM, the lunch room as the destination location at approximately noon, and the office as the destination location at approximately 1PM.
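A sketch of resolving a recipient to a destination from such a learned daily pattern follows; the pattern contents and helper name are assumptions for illustration:

    from datetime import time

    # Hypothetical learned pattern for one mobile user computing entity user.
    USER_PATTERN = [
        (time(8, 0), time(12, 0), "Office 4-212"),
        (time(12, 0), time(13, 0), "Lunch Room, Floor 4"),
        (time(13, 0), time(17, 0), "Office 4-212"),
    ]

    def destination_for(now, pattern):
        """Return the typical location for the user at the given time of day."""
        for start, end, location in pattern:
            if start <= now < end:
                return location
        return None  # no historical location known for this time

    print(destination_for(time(12, 30), USER_PATTERN))  # -> "Lunch Room, Floor 4"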
  • various embodiments may be configured to automatically determine destination locations for one or more users based on historical information/data indicative of typical locations and/or time periods associated with one or more users.
  • a computing entity (e.g., mapping computing entity 110 and/or onboard controller of an autonomous vehicle 100) may be configured to identify a desired destination location based on information/data identifying a mobile user computing entity user, a title, a service, and/or the like.
  • the autonomous vehicle 100 may receive data identifying "John Smith" as a desired destination location.
  • the autonomous vehicle 100 may receive data identifying "television salesman," "masseuse," "parts department," "IT support," and/or the like as a desired destination.
  • the computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100) may be configured to determine whether the desired destination location is associated with a particular stationary location (e.g., a location associated with a corresponding stationary location device 400), a mobile location (e.g., a mobile user computing entity 105 identifying the current location of a particular mobile device user), and/or the like.
  • upon identifying a mobile device user (e.g., John Smith) as a desired destination, a computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100) may determine whether the destination should be treated as a stationary location or a mobile location based on user preferences (e.g., the preferences of the mobile device user associated with a prospective destination). For example, a mobile device user may provide user input indicative of whether the mobile device user's office or the mobile device user's current location should be utilized as a destination location.
  • facility occupants and/or individuals may be enabled to designate a desired destination location for an autonomous vehicle 100.
  • facility occupants may request the presence of an autonomous vehicle 100 to pick up a package to be shipped via one or more carriers.
  • upon receiving such a request from the user (e.g., a facility occupant), the desired internal destination location may be automatically indicated to be the current location of the user requesting the autonomous vehicle's presence.
  • the computing entity may be configured to collect and/or store historical information/data indicative of an average amount of time to travel through various portions of a facility at various times.
  • the computing entity may be configured to monitor the movement of one or more autonomous vehicles 100 within the facility (e.g., by monitoring which location devices 400 are in communication with various autonomous vehicles 100 and the amount of time that the location devices 400 are in communication with the autonomous vehicles 100; and/or by receiving information/data from the autonomous vehicles 100 indicative of their movement throughout the facility).
  • the computing entity may be configured to store the historical information/data collected as a result of the monitoring of the autonomous vehicle 100 movement and may determine, based on the historical data, an average amount of time to move through various portions of the facility at one or more times.
  • the mapping computing system may be configured to generate information/data indicative of an average amount of time for an autonomous vehicle 100 to move from John Smith's office to a loading dock at 5PM on Wednesdays.
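As a sketch, the monitored traversal observations could be bucketed by facility segment and time of week and averaged; the record layout and bucket granularity below are illustrative assumptions:

    from collections import defaultdict
    from statistics import mean

    # Hypothetical observations: (segment, weekday, hour, seconds to traverse).
    OBSERVATIONS = [
        ("John Smith's office -> loading dock", "Wed", 17, 95),
        ("John Smith's office -> loading dock", "Wed", 17, 110),
        ("John Smith's office -> loading dock", "Wed", 9, 70),
    ]

    def average_traversal_times(observations):
        """Average traversal time per (segment, weekday, hour) bucket."""
        buckets = defaultdict(list)
        for segment, weekday, hour, seconds in observations:
            buckets[(segment, weekday, hour)].append(seconds)
        return {key: mean(times) for key, times in buckets.items()}

    averages = average_traversal_times(OBSERVATIONS)
    print(averages[("John Smith's office -> loading dock", "Wed", 17)])  # -> 102.5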
  • the mapping computing entity 110 may be configured to monitor the amount of time to move between various areas of a facility along a plurality of routes.
  • the mapping computing entity 110 may be configured to identify a fastest and/or shortest route for an autonomous vehicle 100 to travel between points within a facility. Accordingly, the mapping computing entity 110 may be configured to utilize the historical information/data to select a fastest (e.g., least travel time) route between a current location of an autonomous vehicle 100 and a desired destination.
  • the historical information/data may be generated based on monitoring various autonomous vehicles 100 moving throughout the facility.
  • the mapping computing entity 110 may be configured to collect information/data indicative of various autonomous vehicles 100 traveling throughout the facility over time, and to generate information/data indicative of an average travel time between various locations within the facility at various times (e.g., 2:00 PM on Wednesday).
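Given such time-bucketed averages, a least-travel-time route can be found with an ordinary shortest-path search over the facility, using the averages for the relevant time (e.g., 2:00 PM on Wednesday) as edge weights; the facility graph below is invented for illustration:

    import heapq

    # Hypothetical facility graph for 2:00 PM on Wednesday:
    # node -> {neighbor: average traversal time in seconds at that time}.
    GRAPH = {
        "Dock A":       {"Hall C": 95},
        "Hall C":       {"Dock A": 95, "Elevator 2": 40, "Hall D": 120},
        "Hall D":       {"Hall C": 120, "Office 4-212": 60},
        "Elevator 2":   {"Hall C": 40, "Office 4-212": 150},
        "Office 4-212": {},
    }

    def fastest_route(graph, start, goal):
        """Dijkstra's algorithm returning (total seconds, node sequence)."""
        queue, visited = [(0, start, [start])], set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == goal:
                return cost, path
            if node in visited:
                continue
            visited.add(node)
            for neighbor, seconds in graph[node].items():
                if neighbor not in visited:
                    heapq.heappush(queue, (cost + seconds, neighbor, path + [neighbor]))
        return None

    # -> (275, ['Dock A', 'Hall C', 'Hall D', 'Office 4-212'])
    print(fastest_route(GRAPH, "Dock A", "Office 4-212"))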
  • the computing entity (e.g., mapping computing entity 110 and/or onboard controller for the autonomous vehicle 100) may be configured to generate a recommended route between the current location of the autonomous vehicle 100 and the desired destination location as indicated at Block 603 of Fig. 7.
  • the recommended route may be dynamically determined, such that the recommended route may change if the autonomous vehicle 100 is determined to move off of the recommended route or if the desired destination location is changed.
  • Upon generating a recommended route for an autonomous vehicle 100, the computing entity (e.g., onboard controller of the autonomous vehicle 100 and/or mapping computing entity 110) may direct the autonomous vehicle 100 along the recommended route toward the desired destination location, as shown in Block 604 of Fig. 7.
  • the location of the autonomous vehicle 100 may be monitored (as shown at Block 605) to ensure the provided guidance remains accurate.
  • the onboard controller of the autonomous vehicle 100 may periodically (e.g., every 1 second, every 0.5 seconds, every 10 milliseconds, and/or the like) compare a determined location and heading of the autonomous vehicle 100 against a desired location and heading of the autonomous vehicle 100 to ensure the autonomous vehicle is travelling along an appropriate travel path to the desired destination.
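A sketch of that periodic comparison, with an invented pose representation and correction hook standing in for the onboard controller's actual locomotion interface:

    import math
    import time

    def heading_error(current_deg, desired_deg):
        """Smallest signed difference between two headings, in degrees."""
        return (desired_deg - current_deg + 180) % 360 - 180

    def monitor_pose(get_pose, get_desired_pose, correct, period_s=0.5,
                     max_offset_ft=1.0, max_heading_err_deg=5.0, cycles=10):
        """Periodically compare the determined pose against the desired pose and
        request a correction whenever either tolerance is exceeded."""
        for _ in range(cycles):
            x, y, heading = get_pose()
            dx, dy, desired_heading = get_desired_pose()
            offset = math.hypot(dx - x, dy - y)
            err = heading_error(heading, desired_heading)
            if offset > max_offset_ft or abs(err) > max_heading_err_deg:
                correct(offset, err)  # steer back toward the appropriate travel path
            time.sleep(period_s)

    # Example with stub callables standing in for the onboard controller hooks.
    monitor_pose(get_pose=lambda: (10.0, 4.5, 92.0),
                 get_desired_pose=lambda: (12.0, 4.0, 90.0),
                 correct=lambda off, err: print(f"correct: {off:.1f} ft off path, {err:.1f} deg off heading"),
                 period_s=0.01, cycles=3)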
  • the onboard controller of the autonomous vehicle 100 may additionally monitor the path along the determined heading of the autonomous vehicle 100 for one or more obstacles (e.g., people, objects, and/or the like) that pose a collision risk for the autonomous vehicle 100, such that the onboard controller may adjust the heading of the autonomous vehicle to avoid the detected collision risks.
  • the navigational instructions may be provided to and/or generated by the onboard controller of the autonomous vehicle 100, may be provided via one or more location devices 400, and/or the like.
  • the following subsections provide example configurations for providing navigational instructions to autonomous vehicles 100, and should not be construed as limiting.
a. Navigational Guidance Provided by Facility-Specific Mapping Computing Entity
  • a facility-specific mapping computing entity 110 (e.g., a mapping computing entity 110 configured to provide mapping services for a particular facility, such as by storing map information/data for the particular facility and being in direct communication with one or more facility-specific computing entities) may be configured to provide navigational guidance for autonomous vehicles 100.
  • the facility-specific mapping computing entity 110 may be configured to generate a recommended route for the autonomous vehicle 100 upon receipt of information/data identifying a current location of the autonomous vehicle 100 and a desired destination for the autonomous vehicle 100 (for example, according to the methodology discussed herein).
  • the facility-specific mapping computing entity 110 may be configured to be in direct communication with the autonomous vehicle 100, in order to receive information/data indicative of the current location of the autonomous vehicle 100 and/or information/data indicative of the desired destination of the autonomous vehicle 100; or the facility-specific mapping computing entity 110 may be configured to be in communication with the autonomous vehicle 100 via one or more relays (e.g., location devices 400).
  • the autonomous vehicle 100 may be configured to self-determine its location within the facility, and to provide information/data indicative of its current location to the facility-specific mapping computing entity 110 together with information/data identifying the autonomous vehicle 100.
  • the autonomous vehicle 100 may be configured to receive transmitted information/data from a nearby location device 400 providing information/data indicative of the location of the location device 400.
  • the autonomous vehicle 100 may be configured to determine that the location information/data received from the nearby location device 400 is indicative of the current location of the autonomous vehicle 100.
  • the autonomous vehicle 100 may be configured to self-determine its location relative to one or more known landmarks detected by one or more onboard sensors (e.g., artwork identified via image recognition, room name labels recognized via OCR (e.g., identified based on a room label such as "CONFERENCE ROOM 201"), and/or the like).
  • the autonomous vehicle 100 may be configured to determine its own location using any of a variety of other technologies (e.g., triangulation based on signals received from a plurality of information/data communication devices and/or location devices 400, GPS, and/or the like).
  • the location of the autonomous vehicle 100 within a facility need not be identified with sufficient precision to guide the autonomous vehicle 100 along a travel path and/or around various known obstacles. Instead, as discussed in greater detail herein, the location of the autonomous vehicle 100 may be determined and relayed to a mapping computing entity 110 with sufficient precision to identify a generic route to a determined destination. For example, the location of the autonomous vehicle 100 may be determined with a location tolerance of 1 foot, 2 feet, 5 feet, 10 feet, and/or the like.
  • the onboard controller of the autonomous vehicle 100 may utilize provided navigational guidance received from the mapping computing entity 110 together with input received from the various onboard sensors 127 (e.g., LIDAR, proximity sensors, cameras, and/or the like) to make precise determinations of various movements for the autonomous vehicle 100.
  • the determined location of the autonomous vehicle 100 as identified by the mapping computing entity 110 may indicate that the autonomous vehicle 100 is located at a desired destination, within a 5 foot tolerance diameter.
  • the autonomous vehicle may utilize input received from the various onboard sensors to guide the autonomous vehicle while it continues to move to the precise location of the desired destination to avoid potential collisions with detected objects.
  • the autonomous vehicle 100 may be configured to transmit information/data to the facility-specific mapping computing entity 110 indicative of the current location of the autonomous vehicle 100 (e.g., on an information/data transmission channel different from that between the location device 400 and the autonomous vehicle 100 and/or at a different transmission frequency).
  • the autonomous vehicle 100 may be configured to transmit information/data indicative of a desired destination location to the facility-specific mapping computing entity 110 together with the information/data identifying the current location of the autonomous vehicle 100 and/or in a separate information/data transmission.
  • the autonomous vehicle 100 may transmit information/data indicative of a desired destination to the facility-specific mapping computing entity 110 via one or more location devices 400 configured to indicate the current location of the autonomous vehicle 100 based on the known location of the transmitting location device 400.
  • the autonomous vehicle 100 may be configured to transmit information/data indicative of a desired destination and information/data indicative of the identity of the autonomous vehicle 100 to a proximate location device 400.
  • the location device 400 may relay at least a portion of the received information/data to the facility-specific mapping computing entity 110 and may additionally transmit the current location of the autonomous vehicle 100 to the mapping computing entity 110.
  • the facility-specific mapping computing entity 110 may identify a recommended route to the desired destination location (e.g., based on current time, distance, and/or the like).
  • the mapping computing entity 110 may additionally generate navigational instructions (e.g., route-based instructions, such as indications regarding when to turn, how far to travel, and/or the like) and may provide the navigational instructions to the autonomous vehicle 100.
  • the mapping computing entity 110 may transmit (e.g., directly and/or through one or more relays) map information/data together with information/data identifying the recommended route to the autonomous vehicle 100, such that the onboard controller of the autonomous vehicle 100 is enabled to operate the locomotion mechanisms of the autonomous vehicle 100 to move the vehicle along the recommended route, while utilizing input received from various onboard sensors 127 (e.g., LIDAR, proximity sensors, cameras, and/or the like) to avoid detected obstacles detected while the autonomous vehicle 100 moves along the recommended route.
  • the onboard controller of the autonomous vehicle 100 is configured to supplement the navigational guidance data received from the mapping computing entity 110 with the data received from the various onboard sensors.
  • the data received from the onboard sensors may be utilized to address known precision tolerances in the location data received from the mapping computing entity 110.
  • the onboard sensors may thus be utilized to ensure the autonomous vehicle 100 turns at appropriate locations (e.g., at corners of hallways), fully boards various transportation mechanisms (e.g., elevators), fully approaches identified destination locations (e.g., a specific portion of a desk at a destination location), and/or the like.
  • the mapping computing entity 110 may transmit signals to one or more location devices 400 to cause the location devices 400 to emit autonomous vehicle-detectable navigational cues (e.g., light, sound, displayed information, and/or the like) indicative of the navigational instructions for the autonomous vehicle 100.
  • the one or more location devices 400 located along a recommended travel path may be illuminated such that an autonomous vehicle 100 may detect the signals (e.g., light signals) emitted from the location devices 400.
  • the autonomous vehicles may utilize the illuminated location devices 400 as indicative of the recommended route, while the autonomous vehicle 100 utilizes additional sensor data generated by the one or more onboard sensors 127 to provide highly precise maneuvering determinations to move along the recommended route while avoiding obstacles.
  • the navigational cues are detectable by one or more sensors 127 onboard the autonomous vehicles 100 (e.g., lights are detected by a light sensor and/or a camera; sounds are detected by a microphone, infrared lights are detected by an infrared sensor, displayed images are detected by a camera and are interpreted by one or more algorithms operable on the onboard controller, and/or the like).
  • the navigational cues may be indicative of a determined recommended route for the autonomous vehicle 100, and accordingly the autonomous vehicle 100 may rely on the navigational cues as "breadcrumbs" that serve to indicate the recommended route.
  • the autonomous vehicle 100 may be configured to detect a first navigational cue emitted from a first location device 400, and may immediately begin monitoring for a second navigational cue emitted from a second location device 400 farther along the recommended route.
  • the autonomous vehicle 100 may be configured to determine an orientation of the autonomous vehicle 100 within the facility (e.g., based on a known direction of travel relative to a location device, based on an internal compass, and/or the like) such that the autonomous vehicle 100 is enabled to distinguish between location devices 400 located between the autonomous vehicle 100 and the intended destination, and location devices 400 located between the autonomous vehicle 100 and the starting point of the recommended route (i.e., such that the autonomous vehicle 100 is enabled to identify location devices farther along the recommended route).
  • the autonomous vehicle 100 may be configured to move toward the detected navigational cues upon detection of the navigational cues. However, in certain embodiments, the autonomous vehicle 100 may be configured to move along the recommended route and to utilize the navigational cues to verify that the autonomous vehicle 100 is moving along the correct recommended route. In such embodiments, the autonomous vehicle 100 may be configured to anticipate the location of a subsequent navigational cue based at least in part on the location of a currently detected navigational cue, a previously detected navigational cue, and/or detected environmental characteristics surrounding the autonomous vehicle 100.
  • the navigational cues may be directional, thereby indicating the location of the recommended travel path for the autonomous vehicle (e.g., the navigational cues may be arrows and/or text displayed on the location devices 400 that are indicative of the location of a subsequent location device 400 along the recommended travel path).
  • the navigational cues may not be directional, and accordingly the autonomous vehicle 100 may be configured to anticipate the location of a subsequent navigational cue based on various characteristics of the environment surrounding the autonomous vehicle 100.
  • when the autonomous vehicle 100 is travelling along an elongated hallway and detects a navigational cue emitted by a location device 400 located proximate a midpoint of the hallway, rather than waiting until the next navigational cue is detected, the autonomous vehicle may be configured to anticipate the location of the next navigational cue as being farther along the hallway, in the same direction that the autonomous vehicle had been previously traveling.
  • the autonomous vehicle 100 may be configured to move smoothly along the recommended route, making adjustments to the route upon determining that an anticipated direction of movement is incorrect (e.g., as determined when no navigational cues are detected).
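The breadcrumb behavior described in the preceding passages might be sketched as a small loop: move toward the last detected cue, anticipate the next cue farther along the current direction of travel, and fall back to a course adjustment when an anticipated cue fails to appear. The sensing and drive calls below are placeholders, not an actual vehicle interface:

    def follow_breadcrumbs(detect_cue, drive_toward, adjust_course, max_steps=50):
        """Follow navigational cues emitted by location devices along a route.

        detect_cue()    -> (x, y) of a detected cue, or None if none is visible.
        drive_toward(p) -> move a short increment toward the given point.
        adjust_course() -> search maneuver used when an anticipated cue is missing.
        """
        last_cue, anticipated = None, None
        for _ in range(max_steps):
            cue = detect_cue()
            if cue is not None:
                if last_cue is not None:
                    # Anticipate the next cue farther along the same direction.
                    dx, dy = cue[0] - last_cue[0], cue[1] - last_cue[1]
                    anticipated = (cue[0] + dx, cue[1] + dy)
                last_cue = cue
                drive_toward(cue)
            elif anticipated is not None:
                drive_toward(anticipated)      # keep moving along the expected path
            else:
                adjust_course()                # no cue detected and nothing anticipated

    # Example with canned detections standing in for onboard sensors.
    cues = iter([(0, 10), (0, 20), None, (0, 40), None, None])
    follow_breadcrumbs(lambda: next(cues, None),
                       lambda p: print("drive toward", p),
                       lambda: print("adjust course"),
                       max_steps=6)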
  • the autonomous vehicle 100 need not store navigational data locally on the onboard controller, because the autonomous vehicle 100 is guided based on the locations of the various navigational cues emitted by the location devices 400.
  • the autonomous vehicle 100 may have navigational data indicative of the recommended route stored locally on the onboard controller, and accordingly the navigational cues are utilized by the autonomous vehicle 100 to verify that the autonomous vehicle 100 is travelling along the recommended travel path.
  • the mapping computing entity 110 may transmit signals to one or more location devices 400 satisfying configurable characteristics. For example, the mapping computing entity 110 may transmit signals to location devices 400 within a configurable distance of the autonomous vehicle 100 (e.g., 20 feet) and along the recommended route; to all location devices 400 along the recommended route (e.g., all location devices 400 that are likely to be in communication with the autonomous vehicle 100 for at least a period of time while the autonomous vehicle 100 travels to the desired destination); to a configurable number of location devices 400 located along the recommended route and substantially adjacent the current location of the autonomous vehicle 100 (e.g., the nearest 5 location devices 400 along the recommended route); to location devices 400 of a particular type (e.g., light fixtures) along at least a portion of the recommended route; and/or the like.
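A sketch of how such configurable selection criteria might be expressed; the device registry, route representation, and criteria values are illustrative assumptions:

    import math

    # Hypothetical registry of location devices: id -> (x, y, device type).
    DEVICES = {
        "LD-01": (5, 0, "light fixture"),
        "LD-02": (15, 0, "light fixture"),
        "LD-03": (30, 0, "speaker"),
        "LD-07": (15, 40, "light fixture"),
    }

    def devices_to_signal(devices, route_ids, vehicle_xy,
                          max_distance_ft=20, nearest_n=None, device_type=None):
        """Return the location devices along the route that satisfy the criteria."""
        candidates = []
        for dev_id in route_ids:                       # only devices on the route
            x, y, kind = devices[dev_id]
            if device_type is not None and kind != device_type:
                continue
            distance = math.dist(vehicle_xy, (x, y))
            if max_distance_ft is not None and distance > max_distance_ft:
                continue
            candidates.append((distance, dev_id))
        candidates.sort()                              # nearest first
        if nearest_n is not None:
            candidates = candidates[:nearest_n]
        return [dev_id for _, dev_id in candidates]

    # e.g., light fixtures along the route within 20 feet of the vehicle
    print(devices_to_signal(DEVICES, ["LD-01", "LD-02", "LD-03"], (0, 0),
                            device_type="light fixture"))  # -> ['LD-01', 'LD-02']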
  • the location devices 400 receiving signals to generate navigational cues may be identified based at least in part on the identified current location of an autonomous vehicle 100.
  • location devices 400 identified to generate navigational cues may be dynamically selected such that location devices 400 near a monitored current location of an autonomous vehicle 100 are utilized to provide navigational cues to the autonomous vehicle 100.
  • the location devices 400 located along the recommended route may receive signals that cause the location devices 400 to emit navigational cues when an autonomous vehicle 100 is detected to be proximate the location device 400.
  • the autonomous vehicle 100 may broadcast a unique wireless communication (e.g., via radio transmission, light transmission, and/or sound transmission) that is detectable by proximate location devices 400. Once the location devices 400 receive the unique transmission, the location devices 400 may be configured to emit navigational cues to the autonomous vehicle 100.
  • a location device 400 proximate the autonomous vehicle 100 may detect the presence of the autonomous vehicle 100 as discussed herein (e.g., the location device 400 may establish a wireless communication with the autonomous vehicle 100 while the autonomous vehicle 100 is within a communication range associated with the location device 400).
  • the location device 400 may transmit information/data to a mapping computing entity 110 identifying itself (e.g., the identity and/or location of the location device 400) and identifying the autonomous vehicle 100.
  • the location device 400 may also act as a relay to transmit information/data identifying a desired destination from the autonomous vehicle 100 to the mapping computing entity 110.
  • the location device 400 may handoff communications with the autonomous vehicle 100 to a second location device 400.
  • the second location device 400 may transmit information/data identifying itself and information/data identifying the autonomous vehicle 100 to the mapping computing entity 110.
  • the mapping computing entity 110 may thereby be configured to monitor the location of the autonomous vehicle 100 within the facility.
  • the mapping computing entity 110 may transmit signals to one or more location devices 400 causing those location devices 400 to emit navigational cues based on the determined location of autonomous vehicle 100.
  • the mapping computing entity 110 may identify those location devices 400 located along the recommended route (after the recommended route is determined), and/or meeting one or more criteria based on the determined location of the autonomous vehicle 100.
  • the autonomous vehicle 100 may be configured to self-generate navigational instructions to guide the autonomous vehicle 100 to a desired destination.
  • the autonomous vehicle 100 may have map data stored locally on the onboard controller of the autonomous vehicle 100, and the autonomous vehicle 100 may thereby utilize the stored map data to self-generate a recommended route upon receipt of information/data identifying a current location of the autonomous vehicle 100 and a desired destination for the autonomous vehicle 100.
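Self-generation of a recommended route from locally stored map data can be sketched as a breadth-first search over a grid of traversable cells; the map contents below are invented for illustration, and any comparable search (e.g., A*) could be substituted:

    from collections import deque

    # Hypothetical onboard map: '.' traversable floor, '#' wall/obstruction.
    FACILITY_MAP = [
        "....#....",
        ".##.#.##.",
        ".#......#",
        ".#.####..",
        ".........",
    ]

    def generate_route(grid, start, goal):
        """Breadth-first search returning a list of (row, col) cells, or None."""
        rows, cols = len(grid), len(grid[0])
        queue, came_from = deque([start]), {start: None}
        while queue:
            cell = queue.popleft()
            if cell == goal:
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = came_from[cell]
                return path[::-1]
            r, c = cell
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == "." \
                        and (nr, nc) not in came_from:
                    came_from[(nr, nc)] = cell
                    queue.append((nr, nc))
        return None

    print(generate_route(FACILITY_MAP, (0, 0), (4, 8)))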
  • the autonomous vehicle 100 may be configured to self-determine its location within the facility such that the onboard controller of the autonomous vehicle 100 can determine a recommended route to a desired destination location.
  • the autonomous vehicle 100 may be configured to receive transmitted information/data from a nearby location device 400 providing information/data indicative of the location of the location device 400.
  • the autonomous vehicle 100 may be configured to determine that the location information/data received from the nearby location device 400 is indicative of the current location of the autonomous vehicle 100 and may correlate the location information/data received from the nearby location device 400 with map information/data to identify a current location of the autonomous vehicle 100 within the facility.
  • the autonomous vehicle 100 may be configured to determine its own location using any of a variety of other technologies (e.g., triangulation based on signals received from a plurality of information/data communication devices, GPS, detection of known landmarks within the facility (e.g., known artwork) by various onboard sensors of the autonomous vehicle 100, and/or the like).
  • one or more location devices 400 may detect the presence of the autonomous vehicle 100 as being within a communication range.
  • the autonomous vehicle 100 may broadcast information/data indicative of its identity to nearby location devices 400.
  • the location devices 400 may transmit information/data back to the autonomous vehicle 100 directly and/or indirectly indicative of the current location of the autonomous vehicle 100 within the facility.
  • the location devices 400 may, in certain embodiments, transmit information/data indicative of the identity of the location device 400 and the location of the autonomous vehicle 100 to the mapping computing entity 110, which may then transmit information/data back to the autonomous vehicle 100 indicative of the current location of the autonomous vehicle 100.
  • the autonomous vehicle 100 may be configured to execute the generated navigational instructions by moving through the facility along the recommended route. As discussed herein, the autonomous vehicle 100 may utilize the generated navigational instructions along with data received from the various onboard sensors to maneuver the autonomous vehicle 100 along the recommended route while avoiding collisions with detected persons, objects, and/or the like detected by the various onboard sensors.
  • the mapping computing entity 110 may transmit signals to one or more location devices 400 to cause the location devices 400 to emit autonomous vehicle-detectable navigational cues (e.g., light, sound, displayed information, and/or the like) indicative of the navigational instructions for the autonomous vehicle 100.
  • the autonomous vehicle 100 may be configured to transmit a signal to a location device 400 within a communication range of the autonomous vehicle 100 configured to cause one or more location devices 400 to emit navigational cues for the autonomous vehicle 100.
  • the location device 400 receiving the signal from the autonomous vehicle 100 may be configured to relay the signal to a mapping computing entity 110, which may be configured to transmit signals to one or more location devices 400 to emit navigational cues for the autonomous vehicle 100, as described herein.
  • the mapping computing entity 110 may transmit signals to one or more location devices 400 satisfying configurable characteristics.
  • the mapping computing entity 110 may transmit signals to location devices 400 within a configurable distance of the autonomous vehicle 100 (e.g., 20 feet) and along the recommended route; to all location devices 400 along the recommended route (e.g., all location devices 400 that are likely to be in communication with the autonomous vehicle 100 for at least a period of time while the autonomous vehicle 100 travels to the desired destination); to a configurable number of location devices 400 located along the recommended route and substantially adjacent the current location of the autonomous vehicle 100 (e.g., the nearest 5 location devices 400 along the recommended route); to location devices 400 of a particular type (e.g., light fixtures) along at least a portion of the recommended route; and/or the like.
  • the autonomous vehicle 100 may be configured to utilize the navigational cues as breadcrumbs identifying the recommended route.
  • the location devices 400 receiving signals to generate navigational cues may be identified based at least in part on the identified current location of an autonomous vehicle 100.
  • location devices 400 identified to generate navigational cues may be dynamically selected such that location devices 400 near a monitored current location of an autonomous vehicle 100 are utilized to provide navigational cues to the autonomous vehicle 100.
  • the autonomous vehicle 100 may be configured to transmit a signal to a location device 400 located within a communication range of the autonomous vehicle 100.
  • the receiving location device 400 may be configured to transmit the received signal to one or more additional location devices 400 (e.g., via parallel transmissions from the receiving location device 400 to a plurality of additional location devices 400 and/or via a series transmissions from the receiving location device 400 to a second location device 400, which then transmits the signal to a third location device 400, and/or the like).
  • the receiving location device 400 and/or one or more of the additional location devices 400 may thereby receive the transmitted signal and may provide navigational cues for the autonomous vehicle 100.
  • the autonomous vehicle 100 may be configured to transmit a new signal each time a new location device 400 is within range of the autonomous vehicle 100. Accordingly, the identity, number, and/or location of location devices 400 emitting navigational cues may be updated each time the autonomous vehicle 100 moves to connect to a new location device 400 within the facility.
  • the autonomous vehicle 100 may receive a broadcast signal from a nearby location device 400.
  • the autonomous vehicle 100, which may have map information/data for the facility stored thereon, may compare the received signal from the location device 400 against the map information/data to identify its location within the facility.
  • the autonomous vehicle 100 may generate a recommended route within the facility to the desired destination location and may begin executing the recommended route by moving through the facility.
  • the autonomous vehicle 100 may continuously receive new broadcast location information/data from new nearby location devices 400, and accordingly the autonomous vehicle 100 may update the determined current location of the autonomous vehicle 100 within the facility.
  • the autonomous vehicle 100 may be configured to update the recommended route traveled by the autonomous vehicle 100 based on the determined location of the autonomous vehicle 100 within the facility.
  • the autonomous vehicle 100 may be configured to transmit signals to one or more location devices 400 causing the location devices 400 to provide navigational cues for the autonomous vehicle 100. Accordingly, as the autonomous vehicle 100 passes each location device 400 while travelling toward a desired destination location, the autonomous vehicle 100 may emit signals to each passed location device 400 causing one or more location devices 400 to emit navigational cues.
  • a third-party mapping computing entity 110 may be configured to generate and provide navigational instructions to an autonomous vehicle 100.
  • upon receipt of information/data identifying a current location and a desired destination of the autonomous vehicle 100 from one or more additional computing entities (e.g., onboard controller of the autonomous vehicle 100 and/or facility-specific mapping computing entity 110), the third-party mapping computing entity 110 may be configured to generate a recommended route to the desired destination location for the autonomous vehicle 100.
  • the third-party mapping computing entity 110 may be configured to transmit information/data indicative of the recommended route (and/or map data) to the autonomous vehicle 100 and/or the facility-specific mapping computing entity 110 to provide navigational instructions to the autonomous vehicle 100, as discussed herein.
  • the autonomous vehicle 100 may receive the information/data indicative of the recommended route from the third-party mapping computing entity 110 (e.g., via a direct information/data transmission from the third-party mapping computing entity 110 and/or via an indirect information/data transmission from the third-party mapping computing entity 110 and through the facility-specific mapping computing entity 110 and/or one or more location devices 400).
  • the autonomous vehicle 100 and/or the facility-specific mapping computing entity 110 may receive information/data indicative of the recommended route from the third-party mapping computing entity and may transmit signals to one or more location devices 400 to cause the location devices 400 to emit navigational cues for the autonomous vehicle 100.
d. Providing Navigational Instructions via Location Devices
  • one or more location devices 400 may be configured to provide navigational cues to an autonomous vehicle 100 along a recommended route to a desired destination within a facility.
  • the recommended route may be generated by a facility-specific mapping computing entity 110, a third-party mapping computing entity 110, an onboard controller of the autonomous vehicle 100, and/or the like.
  • location devices 400 located along at least a portion of a recommended route (e.g., location devices 400 that are likely to be in communication with the autonomous vehicle 100 as the autonomous vehicle 100 moves along the calculated route) between a determined current location of an autonomous vehicle 100 and a desired destination may be configured to emit navigational cues (e.g., light, sound, and/or the like) to direct a particular autonomous vehicle 100 to the desired destination.
  • various location devices 400 may be configured to emit one or more colored lights to indicate a recommended route to the destination internal address (e.g., as shown in Figs. 8-9 and 11).
  • location devices 400 along a recommended route may illuminate in a first color (e.g., green) to indicate the recommended route to the autonomous vehicle 100.
  • the illuminated color may be generic and may apply to all autonomous vehicles 100, or it may be selected for a particular autonomous vehicle 100 (e.g., a recommended route for a first autonomous vehicle 100 is illuminated with purple location devices 400 and the recommended route for a second autonomous vehicle 100 is illuminated with blue location devices 400).
  • the autonomous vehicles 100 may be configured to detect the illuminated colors of the navigational cues, and may utilize the illuminated colors as virtual breadcrumbs indicating the recommended route. Accordingly, as discussed herein, the autonomous vehicle 100 may be configured to continuously detect the illuminated colors of the location devices 400 as the autonomous vehicle 100 moves along the recommended route.
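A sketch of matching detected illumination colors against the color assigned to a particular autonomous vehicle, assuming detections are reported as simple labeled readings (the identifiers and colors are hypothetical):

    # Hypothetical color assignments: which illumination color guides which vehicle.
    ROUTE_COLORS = {"AV-17": "purple", "AV-23": "blue"}

    # Hypothetical camera/light-sensor detections: (device id, detected color).
    DETECTIONS = [("LD-04", "blue"), ("LD-05", "purple"), ("LD-06", "green")]

    def cues_for_vehicle(vehicle_id, detections, route_colors):
        """Return the location devices whose illuminated color is this vehicle's
        assigned breadcrumb color (other colors belong to other vehicles)."""
        assigned = route_colors.get(vehicle_id)
        return [device for device, color in detections if color == assigned]

    print(cues_for_vehicle("AV-17", DETECTIONS, ROUTE_COLORS))  # -> ['LD-05']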
  • one or more location devices 400 may be configured to provide an audible instruction (e.g., a beep and/or a spoken instruction) detectable by the autonomous vehicle 100 as the autonomous vehicle 100 moves along the determined route.
  • the audible instruction need not be audible to a human ear (e.g., the instruction may be emitted at higher and/or lower frequencies than detectable by the human ear), so long as the audible instruction is detectable by sensors onboard the autonomous vehicle 100.
  • the location device 400 corresponding to the desired destination internal address may be configured to emit one or more signals (e.g., a second colored light (e.g., yellow) and/or an audible tone) to indicate to the autonomous vehicle 100 the final location of the destination location.
  • the location devices 400 may be configured to receive signals from a computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100) causing the location devices 400 to emit navigational cues.
  • a signal causing the location devices 400 to emit navigational cues may be transmitted from the mapping computing entity 110 to one or more location devices 400 after the mapping computing entity 110 receives a request for navigational instructions from an autonomous vehicle 100.
  • the autonomous vehicle 100 may transmit a signal to a first location device 400 located proximate the autonomous vehicle 100 (e.g., within a communication range associated with the location device 400) causing at least the first location device 400 to emit a navigational cue for the particular autonomous vehicle 100.
  • the first location device 400 may be configured to transmit a signal to a second location device 400 located along the recommended route toward a desired destination location (e.g., as determined by the autonomous vehicle 100 and/or the mapping computing entity 110).
  • the second location device 400 may emit a navigational cue.
  • location devices 400 along the entire recommended route between a particular location (e.g., the current location of the autonomous vehicle 100 identified when the autonomous vehicle 100 requested navigational instructions to a destination location) and the destination location may be configured to emit navigational cues simultaneously.
  • location devices 400 along the entire recommended route may emit light having desired characteristics (e.g., wavelength, flashing frequency, and/or the like) simultaneously to guide the particular autonomous vehicle 100 toward the desired destination.
  • location devices 400 along a portion of the recommended route may be configured to provide navigational cues simultaneously.
  • a computing entity (e.g., mapping computing entity 110) may be configured to dynamically update which location devices 400 emit navigational cues based on a dynamically determined current location of the autonomous vehicle 100 within the facility.
  • location devices 400 located within a configurable threshold distance of the current location of an autonomous vehicle 100 and located between the current location of the autonomous vehicle 100 and the desired destination location along the recommended route may receive signals causing the location devices 400 to emit navigational cues.
  • the location devices 400 emitting navigational cues may change as the particular autonomous vehicle 100 moves along the recommended route to lead the autonomous vehicle 100 along the recommended route.
  • as a non-limiting example involving a computing entity (e.g., onboard controller of the autonomous vehicle 100 and/or mapping computing entity 110), an autonomous vehicle 100 located at a loading dock on a first floor of a facility may receive user input indicating that the autonomous vehicle 100 is to deliver a shipment to the office of John Smith on the fifth floor of the facility.
  • the autonomous vehicle 100 may transmit information/data indicative of the desired destination location to the mapping computing entity 110 corresponding to the facility.
  • the mapping computing entity 110 may determine the location of the autonomous vehicle 100 being within a transmission range of a location device 400 located at the loading dock, and may calculate a recommended route between the autonomous vehicle's current location at the loading dock on the first floor to John Smith's office on the fifth floor.
  • the mapping computing entity may transmit signals to one or more location devices 400 between John Smith's office and the current location of the autonomous vehicle 100 (determined to be located at the location device 400 located nearest the location of the autonomous vehicle 100) to cause the location devices 400 to indicate the recommended route.
  • the mapping computing entity 110 may transmit signals to all of the location devices 400 located along the recommended travel path, or a subset of the location devices 400 located along the recommended travel path.
  • the mapping computing entity 110 may be configured to transmit signals to a predetermined number of location devices 400 located between the current location of the autonomous vehicle 100 and the desired destination and adjacent the current location of the autonomous vehicle 100 (e.g., the three location devices 400 located between the location device 400 identifying the current location of the autonomous vehicle 100 and the destination location and immediately adjacent the current location of the autonomous vehicle 100).
  • the autonomous vehicle 100 may begin executing the recommended route by moving toward a first detected navigational cue emitted from a nearby location device 400.
  • the first location device 400 indicating the recommended travel path may be configured to stop providing an indication of a recommended travel path once the autonomous vehicle 100 enters the transmission range of the second location device 400.
  • another location device 400 located beyond the last location device 400 providing an indication of the recommended travel path may be configured to begin providing an indication of the recommended travel path, such that the location devices 400 leading the autonomous vehicle 100 continue to provide an indication of the recommended travel path.
  • the autonomous vehicle 100 may communicate with a first, nearest location device 400, while a second location device 400 may illuminate to indicate a recommended direction toward a destination location.
  • the location device 400 corresponding to the destination location may be configured to indicate the location of the desired destination location.
  • a location device 400 proximate John Smith's office may illuminate to indicate the destination location to the autonomous vehicle 100.
  • the autonomous vehicle 100 may thereafter utilize the various onboard sensors to maneuver to a precise destination location, such as John Smith's desk in his office.
  • the autonomous vehicle 100 may thus rely on various sensors to avoid collisions with objects and/or persons within John Smith's office, and to ultimately move adjacent the desk to deposit an item to be delivered to John Smith.
  • the autonomous vehicle 100 may utilize the various onboard sensors to verify that the shipment is deposited onto the desk in John Smith's office.
  • the one or more computing entities may be configured to operate one or more transportation mechanisms 170 (e.g., elevators, escalators, automatic doors, and/or the like) to facilitate movement of an autonomous vehicle 100 within a facility.
  • when a recommended route between a current location of an autonomous vehicle 100 and a desired destination location includes one or more transportation mechanisms 170 (e.g., an elevator), the computing entity may be configured to transmit one or more signals to the transportation mechanism 170 such that the transportation mechanism 170 is available for boarding by the autonomous vehicle 100 when the autonomous vehicle 100 reaches the transportation mechanism 170, and to automatically direct the transportation mechanism 170 to move the autonomous vehicle 100 to an appropriate location along the recommended route.
  • an elevator may be available and open for an autonomous vehicle 100 when the autonomous vehicle 100 arrives at the elevator bank while moving toward a destination location, and the elevator may automatically move to a desired floor once the autonomous vehicle 100 is in the elevator.
  • An example method for automatically operating one or more transportation mechanisms 170 is illustrated in the flow chart of Fig. 11.
  • one or more transportation mechanisms 170 may be configured to receive operating information/data from an autonomous vehicle 100 and/or provide information/data to an autonomous vehicle 100, internal mapping entity, and/or the like.
  • the transportation mechanisms 170 may comprise a communication device configured to communicate (e.g., wirelessly) with one or more autonomous vehicles 100.
  • the communication device of the transportation mechanisms may be configured to communicate with autonomous vehicles 100 within a defined communication range, and accordingly the autonomous vehicle 100 may be configured to transmit signals causing the transportation mechanisms 170 to move once the autonomous vehicle 100 is within the communication range of the transportation mechanism 170.
  • a computing entity (e.g., an onboard controller of an autonomous vehicle 100 and/or mapping computing entity 110) may generate a recommended route through the facility between the current location and the desired destination of the autonomous vehicle 100, as indicated at Block 703.
  • Upon generating the recommended route, the computing entity (e.g., the onboard controller of the autonomous vehicle 100 and/or mapping computing entity 110) may identify one or more transportation mechanisms located along the recommended route, as indicated at Block 704. For example, the computing entity may determine that movement from a current location on one floor of a facility to a desired destination on a different floor of the facility may involve travelling in an elevator between floors. The one or more identified transportation mechanisms 170 may be identified as candidates for automated operation as the autonomous vehicle 100 approaches the transportation mechanism 170. As the autonomous vehicle 100 moves along the generated recommended route within the facility, the computing entity monitors the location of the autonomous vehicle 100, as indicated at Block 705.
  • Based on the monitored location of the autonomous vehicle 100, the computing entity estimates an arrival time for the autonomous vehicle 100 to arrive at the transportation mechanism, as shown at Block 706. For example, the computing entity may determine an average speed for the autonomous vehicle 100 moving along the recommended route and/or a distance to reach the transportation mechanism 170, and may estimate an amount of time remaining until the autonomous vehicle 100 reaches the transportation mechanism. As yet another example, the computing entity may comprise information/data indicative of a geofenced area surrounding the transportation mechanism 170.
  • the edge of the geofenced area may be an estimated travel time away from the transportation mechanism (e.g., an estimated amount of time for a determined average user to move from the edge of the geofence to the transportation mechanism 170), such that the computing entity estimates the time remaining before the autonomous vehicle 100 reaches the destination location based on the time at which the autonomous vehicle 100 crosses the edge of the geofenced area.
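Both arrival-time estimates described above reduce to simple arithmetic, as in the following sketch; the speeds, distances, and geofence travel time are placeholder values:

    def eta_from_speed(distance_remaining_ft, average_speed_ft_per_s):
        """Estimate seconds until the autonomous vehicle reaches the transportation
        mechanism from its average speed along the recommended route."""
        return distance_remaining_ft / average_speed_ft_per_s

    def eta_from_geofence(seconds_since_crossing, geofence_travel_time_s):
        """Estimate remaining seconds from the time at which the vehicle crossed
        the edge of a geofenced area surrounding the transportation mechanism."""
        return max(0.0, geofence_travel_time_s - seconds_since_crossing)

    print(eta_from_speed(120, 3.0))     # 120 ft at 3 ft/s -> 40.0 seconds
    print(eta_from_geofence(25, 60))    # crossed the geofence 25 s ago -> 35.0 seconds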
  • the computing entity may be configured to transmit one or more signals to the transportation mechanism 170 to cause the transportation mechanism 170 to enable the autonomous vehicle 100 to board the transportation mechanism 170. For example, as an autonomous vehicle 100 approaches an elevator along the recommended route, as shown in Fig. , the computing entity may transmit a signal to the elevator to cause the elevator to move to the current location of the autonomous vehicle 100 (e.g., the current floor of the autonomous vehicle 100), if the elevator was not previously located at the current location of the autonomous vehicle 100, and to open the elevator doors to enable the autonomous vehicle 100 to board the elevator when the autonomous vehicle 100 arrives at the elevator.
  • the computing entity may be configured to transmit a second signal to the transportation mechanism 170 to cause the transportation mechanism to move to a second location along the recommended route, as indicated at Block 708.
  • the transportation mechanism 170 may be configured to close included doors (if applicable) and move to a second specified location along the recommended route.
  • the computing entity may transmit a signal to an elevator causing the elevator to move to a different floor on which the autonomous vehicle's destination location is located.
  • the transportation mechanism 170 may be configured to enable the autonomous vehicle 100 to disembark (e.g., by opening the doors of the transportation mechanism 170), as indicated at Block 709.
  • the transportation mechanism 170 may be configured to automatically enable the autonomous vehicle 100 to disembark; however, in certain embodiments, the computing entity may be configured to transmit a third signal to the transportation mechanism 170 to enable the autonomous vehicle 100 to disembark.
  • the autonomous vehicle 100 may be configured to transmit a signal to a transportation mechanism 170 located along the recommended route causing the transportation mechanism to facilitate the autonomous vehicle's movement toward the desired destination.
  • the autonomous vehicle 100 may call the elevator to the autonomous vehicle's initial floor.
  • the autonomous vehicle 100 may transmit a second signal causing the elevator to move to a desired floor (e.g., the floor of the desired destination location) to enable the autonomous vehicle 100 to disembark at the desired floor.
  • the autonomous vehicle 100 may be configured to generate and transmit the signals to the transportation mechanism automatically, based on the generated recommended route.
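The signalling sequence for an elevator along the route (call the car, confirm boarding, request the destination floor, open the doors for disembarking) might be driven by the generated recommended route as in the following sketch; the transportation-mechanism interface shown is purely illustrative:

    class ElevatorStub:
        """Stand-in for a transportation mechanism 170 reachable over a wireless link."""

        def call_to(self, floor):
            print(f"elevator: moving to floor {floor} and opening doors for boarding")

        def go_to(self, floor):
            print(f"elevator: doors closed, moving to floor {floor}")

        def open_doors(self):
            print("elevator: doors open for disembarking")

    def traverse_elevator(elevator, current_floor, destination_floor, has_boarded):
        """Signal sequence derived from the generated recommended route."""
        elevator.call_to(current_floor)      # first signal: bring the car to the vehicle
        while not has_boarded():             # onboard sensors confirm the vehicle is aboard
            pass
        elevator.go_to(destination_floor)    # second signal: move to the destination floor
        elevator.open_doors()                # enable the autonomous vehicle to disembark

    boarding_states = iter([False, True])
    traverse_elevator(ElevatorStub(), current_floor=1, destination_floor=5,
                      has_boarded=lambda: next(boarding_states))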
  • one or more transportation mechanisms 170 may be in communication with a facility-specific mapping computing entity 110 (e.g., directly and/or via a relay, such as one or more location devices 400). Accordingly, the one or more transportation mechanisms 170 may be configured to receive operating information/data from the facility-specific mapping computing entity 110. Moreover, in various embodiments, the one or more transportation mechanisms may be configured to provide operational information/data (e.g., current location of the transportation mechanism, current operating state of the transportation mechanism, and/or the like) to an autonomous vehicle 100, mapping computing entity 110, and/or the like.
  • the facility-specific mapping computing entity 110 may be configured to transmit a signal to a transportation mechanism 170 located along the recommended route causing the transportation mechanism 170 to facilitate the autonomous vehicle movement toward the desired destination.
  • the facility-specific mapping computing entity 110 may detect the autonomous vehicle's presence proximate the elevator, and may transmit a signal calling the elevator to the autonomous vehicle's current floor.
  • the facility-specific mapping computing entity 110 may transmit a second signal to the elevator causing the elevator to move to a desired floor (e.g., the floor of the desired destination location) to enable the autonomous vehicle 100 to disembark at the desired floor.
  • the facility-specific mapping computing entity 110 may be configured to generate and transmit the signals to the transportation mechanism 170 automatically, based on the generated recommended route.
  • the facility-specific mapping computing entity 110 may be configured to transmit one or more signals to a transportation mechanism 170 upon determining that an autonomous vehicle 100 is within a predefined distance of the particular transportation mechanism.
  • the facility-specific mapping computing entity 110 may be configured to monitor the location of the autonomous vehicle 100 moving within the facility (e.g., based on the identity of location device 400 in communication with the autonomous vehicle 100).
  • the facility-specific mapping computing entity 110 may transmit a signal to the transportation mechanism 170 causing the transportation mechanism 170 to move to the autonomous vehicle's current floor such that the transportation mechanism 170 is available when the autonomous vehicle 100 arrives at the transportation mechanism 170.
  • the signal transmitted to the transportation mechanism 170 may comprise information/data indicative of a current location of an autonomous vehicle 100 (e.g., the initial floor) and a desired destination for the transportation mechanism 170 (e.g., a destination floor).
  • the current location of autonomous vehicle 100 (e.g., the initial floor) and/or the desired destination for the autonomous vehicle 100 (e.g., the destination floor) may be determined based at least in part on a generated recommended route to the desired destination.
  • a particular destination location may be mobile within a particular building and/or campus.
  • a destination location may correspond to the current location of a particular mobile user computing entity user (e.g., determined based on the location of a mobile user computing entity carried by the mobile user computing entity user).
  • the destination internal address may change.
  • various embodiments may be configured to adjust a route determined between a current location of an autonomous vehicle 100 and the destination location such that the autonomous vehicle 100 intercepts the mobile user computing entity user defining the destination location; a simplified sketch of this route-adjustment logic follows this list.
  • computing entities may be configured to generate one or more notifications to various mobile user computing entity users, such as a mobile user computing entity user defining a destination location, that a particular autonomous vehicle 100 is scheduled to visit the mobile user computing entity user. Accordingly, the generated notification may request that the mobile user computing entity user remain within their current area until the autonomous vehicle 100 reaches the current location of the mobile user computing entity user.
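As a non-limiting illustration of the route-adjustment behavior noted in the list above, the following Python sketch re-plans a route over a small graph of location devices whenever the destination user's nearest location device changes. The adjacency map, the device identifiers D1 through D4, and the locate_destination_user callback are hypothetical stand-ins introduced only for this example and are not part of the disclosed system.

    from collections import deque

    def plan_route(adjacency, start, goal):
        # Breadth-first search over an unweighted graph of location-device IDs.
        frontier, seen = deque([[start]]), {start}
        while frontier:
            path = frontier.popleft()
            if path[-1] == goal:
                return path
            for nxt in adjacency.get(path[-1], ()):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(path + [nxt])
        return None  # goal not reachable from start

    def track_mobile_destination(adjacency, vehicle_device, locate_destination_user):
        # Re-plan whenever the destination user's nearest location device changes.
        goal = locate_destination_user()
        route = plan_route(adjacency, vehicle_device, goal)
        while route and vehicle_device != goal:
            vehicle_device = route[1] if len(route) > 1 else route[0]  # advance one hop
            new_goal = locate_destination_user()
            if new_goal != goal:          # destination moved; adjust the route to intercept
                goal = new_goal
                route = plan_route(adjacency, vehicle_device, goal)
            else:
                route = route[1:]
        return vehicle_device

    if __name__ == "__main__":
        hallway = {"D1": ["D2"], "D2": ["D1", "D3"], "D3": ["D2", "D4"], "D4": ["D3"]}
        user_positions = iter(["D4", "D4", "D3", "D3", "D3", "D3"])
        print(track_mobile_destination(hallway, "D1", lambda: next(user_positions)))

In practice, the facility-specific mapping computing entity 110 would perform such re-planning and push updated navigational cues to the location devices 400 along the revised route.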

Abstract

Various embodiments are directed to autonomous vehicles and systems and methods for guiding autonomous vehicles within a facility. The autonomous vehicle guidance system comprises a mapping computing entity configured to determine a recommended route for the autonomous vehicle and a plurality of location devices configured to emit navigational cues detectable by an autonomous vehicle. The autonomous vehicles each comprise one or more sensors configured to detect the navigational cues emitted by the one or more location devices and an onboard controller configured to selectively activate locomotion mechanisms of the autonomous vehicle to move along the recommended route as indicated by the one or more navigational cues.

Description

AUTOMATIC ROUTING OF AUTONOMOUS VEHICLES INTRA-FACILITY MOVEMENT
BACKGROUND OF THE INVENTION
At present, various concepts enable specific indications for various addresses that enable the generation of predictive routing to various locations along known travel paths. However, such concepts merely facilitate travel to a particular outdoor address, such as the address of a specific building, campus, and/or the like. In instances in which a visitor (e.g., service personnel, delivery personnel, maintenance personnel, and/or the like), resident, visually impaired individual, and/or the like is scheduled to visit a specific location and/or individual within a building, campus, and/or the like, however, the visitor must manually determine a route to the desired destination location based on limited and potentially outdated information/data provided via a static building directory, based on the instructions of a receptionist, security guard, or other building personnel, and/or the like.
Moreover, various companies are looking to implement delivery of shipments via autonomous vehicles configured to transport items to various outdoor addresses. However, these vehicles are not configured to make deliveries within various facilities, and accordingly manual delivery of items within facilities will still be necessary even if items are autonomously delivered to the facilities.
Accordingly, a need exists for accurate, internal location addressing and routing concepts enabling visitors and autonomous delivery vehicles to quickly locate a desired internal destination in order to facilitate navigation and shipment delivery within locations.
SUMMARY OF THE INVENTION
Various embodiments are directed to autonomous vehicles configured for autonomous movement within a facility along one or more recommended routes. In certain embodiments, the autonomous vehicles comprise: one or more locomotion mechanisms configured to maneuver the autonomous vehicle within the facility; one or more sensors configured to detect navigational cues emitted by one or more location devices positioned along a recommended route; an onboard controller comprising at least one non-transitory memory and a processor, wherein the onboard controller is configured to: receive navigational data from the one or more sensors, wherein the navigational data is indicative of the location of one or more location devices emitting navigational cues identifying the recommended route; control the one or more locomotion mechanisms based at least in part on the identified locations of the one or more location devices emitting navigational cues identifying the recommended route to move the autonomous vehicle along the recommended route identified by the navigational cues emitted by location devices positioned along the recommended route.
Certain embodiments are directed to methods for guiding an autonomous vehicle along a recommended route defined within a facility. In various embodiments, the methods comprise: detecting one or more navigational cues emitted by one or more location devices positioned along the recommended route; determining a location of the one or more location devices emitting the navigational cues; and activating a locomotion mechanism based at least in part on the identified locations of the one or more location devices emitting navigational cues to move the autonomous vehicle along the recommended route.
Moreover, various embodiments are directed to autonomous vehicle guidance systems for guiding an autonomous vehicle through a facility. In various embodiments, the autonomous vehicle guidance system comprises: a mapping computing entity comprising a non-transitory memory and a processor, wherein the mapping computing entity is configured to determine a recommended route for an autonomous vehicle to travel through a facility to reach an intended destination; a plurality of location devices positioned throughout the facility, wherein each of the location devices is configured to emit navigational cues detectable by an autonomous vehicle upon receipt of a signal from the mapping computing entity; and at least one autonomous vehicle configured for autonomous movement within the facility, the autonomous vehicle comprising: one or more locomotion mechanisms configured to freely maneuver the autonomous vehicle within the facility; one or more sensors configured to detect navigational cues emitted by the one or more location devices; and an onboard controller comprising at least one non-transitory memory and a processor, wherein the onboard controller is configured to: receive navigational data from the one or more sensors, wherein the navigational data is indicative of the location of one or more location devices emitting navigational cues; identify a recommended route based on the location of the one or more location devices emitting navigational cues; and control the one or more locomotion mechanisms based at least in part on the identified locations of the one or more location devices emitting navigational cues to move the autonomous vehicle along the recommended route.
BRIEF DESCRIPTION OF THE DRAWINGS
Reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Fig. 1 is a diagram of a system that can be used to practice various embodiments of the present invention.
Fig. 2 is a schematic illustration of a ground-based autonomous vehicle in accordance with certain embodiments of the present invention.
Fig. 3 is a schematic illustration of an aerial-based autonomous vehicle in accordance with certain embodiments of the present invention.
Fig. 4 is a schematic of a mapping computing entity in accordance with certain embodiments of the present invention.
Fig. 5 is a schematic of a mobile computing entity in accordance with certain embodiments of the present invention.
Fig. 6 shows an example interior map indicating locations of various location devices.
Fig. 7 is a flow chart showing an example method for providing navigational guidance to an autonomous vehicle within a facility.
Fig. 8 shows an example beacon activity indicating a determined navigational route for a ground-based autonomous vehicle within an interior hallway.
Fig. 9 shows an example beacon activity indicating a determined navigational route for an aerial-based autonomous vehicle within an interior hallway.
Fig. 10 shows an example ground-based autonomous vehicle depositing an item at a destination location in accordance with certain embodiments of the present invention.
Fig. 11 is a flow chart showing an example method for automatically operating transportation mechanisms to move an autonomous vehicle along a recommended route within a facility.
Fig. 12 shows an example of beacon and transportation mechanism operation for a ground-based autonomous vehicle in accordance with certain embodiments of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like numbers refer to like elements throughout.
Various embodiments are directed to concepts for providing and/or utilizing internal addresses within a facility (e.g., a building, a campus, a suite, a house, an apartment, a warehouse, a building complex, a mall, and/or the like) as described in co-pending U.S. Patent Appl. No. 15/378,515, which is incorporated herein by reference in its entirety. The internal addresses provide reference points that may be utilized as navigational references by autonomous delivery vehicles (e.g., ground-based vehicles and/or aerial-based vehicles) traversing the interior of the facility, for example, to deliver shipments to one or more interior locations.
The internal addresses may facilitate locating specific individuals (e.g., mobile computing entities carried by those individuals), rooms, furniture (e.g., desks), and/or other locations within the facility. The internal addresses may be associated with one or more location devices (e.g., location beacons, Internet of Things enabled devices, and/or the like), which may be configured to wirelessly broadcast or otherwise transmit information/data indicative of the location device's location to various computer-enabled devices (e.g., mobile user devices, autonomous vehicles, and/or the like) within a corresponding broadcast range. By broadcasting and/or otherwise transmitting their location, the location devices may act as virtual landmarks and/or may be associated with particular internal addresses within a facility that enable mobile user devices and autonomous vehicles to determine their relative locations within the facility. Collectively, the location devices form a navigational network that may be utilized to generate and/or provide navigational instructions for autonomous vehicles. The network of location devices may be configured to highlight a recommended travel path to guide autonomous vehicles to a desired location within the facility. The location devices may thereby form a virtual breadcrumb trail through the facility to a desired location by providing navigational cues in series, emitted from the location devices along the recommended travel path.
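As a non-limiting illustration of how such a breadcrumb trail might be computed, the Python sketch below treats the location devices as nodes of a weighted graph and returns the ordered sequence of devices along a shortest path. The device names (LOBBY, HALL-1, HALL-2, SUITE-210) and the corridor distances are invented for this example, and the mapping computing entity's actual route-generation logic is not limited to this particular algorithm.

    import heapq

    def breadcrumb_trail(edges, start, goal):
        # edges: {device_id: [(neighbor_id, distance_in_meters), ...]}
        # Dijkstra's algorithm returning the ordered list of device IDs on the route.
        queue = [(0.0, start, [start])]
        best = {start: 0.0}
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == goal:
                return path
            if cost > best.get(node, float("inf")):
                continue  # stale queue entry
            for neighbor, dist in edges.get(node, ()):
                new_cost = cost + dist
                if new_cost < best.get(neighbor, float("inf")):
                    best[neighbor] = new_cost
                    heapq.heappush(queue, (new_cost, neighbor, path + [neighbor]))
        return None  # destination not reachable

    if __name__ == "__main__":
        network = {
            "LOBBY": [("HALL-1", 12.0), ("HALL-2", 30.0)],
            "HALL-1": [("HALL-2", 10.0), ("SUITE-210", 25.0)],
            "HALL-2": [("SUITE-210", 8.0)],
            "SUITE-210": [],
        }
        print(breadcrumb_trail(network, "LOBBY", "SUITE-210"))
        # ['LOBBY', 'HALL-1', 'HALL-2', 'SUITE-210']

The location devices on the returned list would then be instructed, in series, to emit their navigational cues.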
In certain embodiments, the location devices may also be configured to receive information/data indicative of a desired destination of a particular autonomous vehicle (e.g., transmitted from the autonomous vehicle), and may provide navigational instructions to direct the autonomous vehicle toward the desired destination. For example, location devices may be located along travel paths (e.g., walkways, autonomous vehicle travel paths, and/or the like) within a facility and at various internal addresses within the facility. As an autonomous vehicle moves along the various travel paths within the facility, location devices located along a recommended navigational path leading toward a desired destination may provide navigational cues indicative of the navigational instructions for the autonomous vehicle. For example, the location devices may illuminate associated lights, wirelessly transmit navigational data packets, and/or the like in order to direct the autonomous vehicle toward the desired destination.
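The device-side behavior can be pictured with the following non-limiting Python sketch, in which a location device that has been told it lies on a recommended route illuminates an associated light and returns a small navigational data packet for the passing autonomous vehicle. The class name, the internal-address strings, and the packet fields are illustrative assumptions only.

    class LocationDevice:
        # Illustrative location device that emits a navigational cue when signaled
        # that it lies on a vehicle's recommended route (hypothetical interface).

        def __init__(self, internal_address):
            self.internal_address = internal_address
            self.led_on = False

        def activate_cue(self, vehicle_id, next_hop_address):
            self.led_on = True  # visible cue for camera-equipped vehicles
            # Wireless cue: a small navigational data packet that the vehicle's
            # sensors can receive and interpret.
            return {
                "cue_source": self.internal_address,
                "intended_vehicle": vehicle_id,
                "next_hop": next_hop_address,
            }

        def deactivate_cue(self):
            self.led_on = False

    if __name__ == "__main__":
        device = LocationDevice("FLOOR2-HALL-EAST-03")
        packet = device.activate_cue("AS445", "FLOOR2-HALL-EAST-04")
        print(device.led_on, packet)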
For facilities having automated transportation mechanisms (e.g., elevators, escalators, people movers, dumbwaiters, and/or the like), the facilities may be configured to operate the automated transportation mechanisms automatically, in order to move a particular autonomous vehicle, individual, and/or item toward a desired destination. As a specific example, the facility may monitor the location of a particular autonomous vehicle as the autonomous vehicle moves along the recommended travel path toward the desired destination. As the autonomous vehicle nears an automated transport mechanism on the recommended travel path, the facility automatically positions the automated transport mechanism so that the autonomous vehicle may board the automated transport mechanism to be moved toward the desired destination location. Once the facility detects that the autonomous vehicle has boarded the automated transport mechanism, the facility may move the automated transport mechanism toward the desired destination location without requiring additional input and/or signals from the autonomous vehicle (e.g., mechanical input and/or wirelessly transmitted input). As a specific example, a facility may position an elevator such that the autonomous vehicle may board the elevator upon determining that the autonomous vehicle is proximate the elevator. The elevator may then automatically move to the floor on which a desired destination is located, without requiring the autonomous vehicle to provide any input to the elevator.
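A highly simplified, non-limiting sketch of the facility-side elevator handling described above follows. The dictionary-based vehicle and elevator state, the five-meter call radius, and the signal names CALL and GO_TO are assumptions made only for illustration and do not describe any particular building-control interface.

    ELEVATOR_CALL_RADIUS_M = 5.0  # assumed proximity threshold

    def manage_elevator(vehicle, elevator, destination_floor):
        # Call the elevator when the vehicle nears it, then send the car to the
        # destination floor once the vehicle is detected aboard.
        signals = []
        dx = vehicle["x"] - elevator["x"]
        dy = vehicle["y"] - elevator["y"]
        within_range = (dx * dx + dy * dy) ** 0.5 <= ELEVATOR_CALL_RADIUS_M

        if within_range and elevator["floor"] != vehicle["floor"]:
            signals.append(("CALL", vehicle["floor"]))    # position the car for boarding
        if vehicle.get("aboard_elevator") and elevator["floor"] != destination_floor:
            signals.append(("GO_TO", destination_floor))  # move toward the destination
        return signals

    if __name__ == "__main__":
        vehicle = {"x": 10.0, "y": 4.0, "floor": 1, "aboard_elevator": False}
        elevator = {"x": 12.0, "y": 5.0, "floor": 3}
        print(manage_elevator(vehicle, elevator, destination_floor=4))  # [('CALL', 1)]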
I. Computer Program Products, Methods, and Computing Entities
Embodiments of the present invention may be implemented in various ways, including as computer program products that comprise articles of manufacture. A computer program product may include a non-transitory computer-readable storage medium storing applications, programs, program modules, scripts, source code, program code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like (also referred to herein as executable instructions, instructions for execution, program code, and/or similar terms used herein interchangeably). Such non-transitory computer-readable storage media include all computer-readable media (including volatile and non-volatile media).
In one embodiment, a non-volatile computer-readable storage medium may include a floppy disk, flexible disk, hard disk, solid-state storage (SSS) (e.g., a solid state drive (SSD), solid state card (SSC), solid state module (SSM)), enterprise flash drive, magnetic tape, or any other non-transitory magnetic medium, and/or the like. A non-volatile computer-readable storage medium may also include a punch card, paper tape, optical mark sheet (or any other physical medium with patterns of holes or other optically recognizable indicia), compact disc read only memory (CD-ROM), compact disc-rewritable (CD-RW), digital versatile disc (DVD), Blu-ray disc (BD), any other non-transitory optical medium, and/or the like. Such a non-volatile computer-readable storage medium may also include read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory (e.g., Serial, NAND, NOR, and/or the like), multimedia memory cards (MMC), secure digital (SD) memory cards, SmartMedia cards, CompactFlash (CF) cards, Memory Sticks, and/or the like. Further, a non-volatile computer-readable storage medium may also include conductive-bridging random access memory (CBRAM), phase-change random access memory (PRAM), ferroelectric random-access memory (FeRAM), non-volatile random-access memory (NVRAM), magnetoresistive random-access memory (MRAM), resistive random-access memory (RRAM), Silicon-Oxide-Nitride-Oxide-Silicon memory (SONOS), floating junction gate random access memory (FJG RAM), Millipede memory, racetrack memory, and/or the like. In one embodiment, a volatile computer-readable storage medium may include random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), fast page mode dynamic random access memory (FPM DRAM), extended data-out dynamic random access memory (EDO DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), double data rate type two synchronous dynamic random access memory (DDR2 SDRAM), double data rate type three synchronous dynamic random access memory (DDR3 SDRAM), Rambus dynamic random access memory (RDRAM), Twin Transistor RAM (TTRAM), Thyristor RAM (T-RAM), Zero-capacitor (Z-RAM), Rambus in-line memory module (RIMM), dual in-line memory module (DIMM), single in-line memory module (SIMM), video random access memory (VRAM), cache memory (including various levels), flash memory, register memory, and/or the like. It will be appreciated that where embodiments are described to use a computer-readable storage medium, other types of computer-readable storage media may be substituted for or used in addition to the computer-readable storage media described above.
As should be appreciated, various embodiments of the present invention may also be implemented as methods, apparatus, systems, computing devices, computing entities, and/or the like. As such, embodiments of the present invention may take the form of an apparatus, system, computing device, computing entity, and/or the like executing instructions stored on a computer-readable storage medium to perform certain steps or operations. However, embodiments of the present invention may also take the form of an entirely hardware embodiment performing certain steps or operations.
Embodiments of the present invention are described below with reference to block diagrams and flowchart illustrations. Thus, it should be understood that each block of the block diagrams and flowchart illustrations may be implemented in the form of a computer program product, an entirely hardware embodiment, a combination of hardware and computer program products, and/or apparatus, systems, computing devices, computing entities, and/or the like carrying out instructions, operations, steps, and similar words used interchangeably (e.g., the executable instructions, instructions for execution, program code, and/or the like) on a computer-readable storage medium for execution. For example, retrieval, loading, and execution of code may be performed sequentially such that one instruction is retrieved, loaded, and executed at a time. In some exemplary embodiments, retrieval, loading, and/or execution may be performed in parallel such that multiple instructions are retrieved, loaded, and/or executed together. Thus, such embodiments can produce specifically-configured machines performing the steps or operations specified in the block diagrams and flowchart illustrations. Accordingly, the block diagrams and flowchart illustrations support various combinations of embodiments for performing the specified instructions, operations, or steps.
II. Exemplary System Architecture
Fig. 1 provides an illustration of a system that can be used in conjunction with various embodiments of the present invention. As shown in Fig. 1, the system may include one or more vehicles 100 (e.g., autonomous vehicles), one or more mobile computing entities 105, one or more mapping computing entities 110, one or more Global Positioning System (GPS) satellites 115, one or more location sensors 120, one or more information/data collection devices 130, one or more networks 135, one or more location devices 400, one or more user computing entities 140 (not shown), and/or the like. Each of the components of the system may be in electronic communication with, for example, one another over the same or different wireless or wired networks including, for example, a wired or wireless Personal Area Network (PAN), Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), or the like. Additionally, while Fig. 1 illustrates certain system entities as separate, standalone entities, the various embodiments are not limited to this particular architecture.
A. Exemplary Autonomous Vehicle
As utilized herein, autonomous vehicles may be configured for transporting one or more items (e.g., one or more packages, parcels, bags, containers, loads, crates, items banded together, vehicle parts, pallets, drums, the like, and/or similar words used herein interchangeably). In certain embodiments, each autonomous vehicle 100 may be associated with a unique vehicle identifier (such as a vehicle ID) that uniquely identifies the autonomous vehicle 100. The unique vehicle ID may include characters, such as numbers, letters, symbols, and/or the like. For example, an alphanumeric vehicle ID (e.g., "AS445") may be associated with each vehicle 100. The autonomous vehicles 100 may encompass land-based autonomous vehicles (e.g., as shown in Fig. 2) and/or aerial-based autonomous vehicles (e.g., as shown in Fig. 3). Ground-based autonomous vehicles 100 may be configured for travelling along a support surface (e.g., the ground). The ground-based autonomous vehicles 100 may be freely maneuverable along a support surface such that the vehicles are not limited to movement along a specifically configured track. However, it should be understood that in certain embodiments, the ground-based autonomous vehicles 100 may be configured for movement along one or more predefined tracks (e.g., rails, slots, grooves, painted lines, embedded wires, and/or the like) traversing the interior of a facility.
The ground-based autonomous vehicles 100 (e.g., as shown in Fig. 2) may be driven by one or more forms of locomotion. For example, the ground-based autonomous vehicles 100 may travel on one or more wheels 121 (e.g., two-wheels, three-wheels, four-wheels, 18-wheels, and/or the like), on one or more continuous tracks (e.g., two continuous tracks driven by one or more wheels), on one or more legs (e.g., a bipedal vehicle, a quadruped vehicle, a hexapod vehicle, and/or the like), on one or more hover mechanisms (e.g., air pillows), and/or the like.
Aerial-based autonomous vehicles 100 (e.g., as shown in Fig. 3) may be Unmanned Aerial Vehicles (UAVs) such as those described in co-pending U.S. Patent Appl. Serial No. 15/582,129, filed April 28, 2017, the contents of which are incorporated herein by reference in their entirety. Accordingly, the one or more aerial-based autonomous vehicles 100 may be driven by one or more forms of locomotion and/or lift mechanisms. For example, the autonomous vehicles 100 may comprise one or more rotors 141 configured for vertical and/or horizontal locomotion. As just one non-limiting example, aerial-based autonomous vehicles may be embodied as hexacopters having six rotors configured to enable vertical locomotion (e.g., lift) and/or horizontal locomotion, as shown in the example embodiment of Fig. 3, as well as enabling roll, pitch, and yaw movements of the vehicle. It should be understood that rotor-based autonomous vehicles may have any number of rotors (e.g., 1 rotor, 2 rotors, 3 rotors, 4 rotors, 8 rotors, and/or the like). Aerial-based autonomous vehicles may also have a variety of other lift and drive mechanisms, such as balloon-based lift mechanisms (e.g., enabling lighter-than-air transportation), wing-based lift mechanisms, turbine-based lift mechanisms, and/or the like. Moreover, like the ground-based autonomous vehicles 100 discussed above, the aerial-based autonomous vehicles 100 may be freely maneuverable (e.g., in three dimensions), or the aerial-based autonomous vehicles 100 may be maneuverable along one or more tracks traversing a facility (e.g., rails built onto and/or into walls or ceilings of a facility).
In various embodiments, the autonomous vehicle 100 locomotion mechanisms (e.g., ground-based and/or aerial-based) may be driven by one or more power devices, such as electrical motors, internal-combustion engines (e.g., alcohol-fueled, oil-fueled, gasoline-fueled, and/or the like), and/or the like. The power devices may be connected to the one or more locomotion mechanisms directly, via one or more shafts, via one or more gears, via one or more flywheels, via one or more viscous couplings, and/or the like.
The autonomous vehicles 100 may have a body portion 122, 142 housing or otherwise connecting the various components of the autonomous vehicles 100. The body portion may comprise one or more rigid materials, such as carbon fiber, aluminum, plastic, and/or the like. The body portion 122, 142 may thereby be secured relative to the one or more locomotion mechanisms, the one or more power devices, an onboard controller (discussed herein), and/or the like. Moreover, the body portion 122, 142 may encompass, or be secured relative to, one or more item support portions 123, 143 carried by the autonomous vehicle 100. The item support portions 123, 143 may be configured to hold a single item (e.g., a single item shipment for delivery) and/or a plurality of items (e.g., multi-item shipments destined for a common destination and/or multiple single-item shipments destined for multiple destinations). The item support portions 123, 143 may comprise one or more cargo containers, one or more item cages, one or more onboard conveying mechanisms (e.g., belts, chains, lifts, and/or the like, such as conveyor 124 of Fig. 2), one or more grippers (e.g., gripper 144 of Fig. 3), and/or the like. As a non-limiting example, the body portion 122, 142 may form the item support portion 123, 143 such that the item support portion 123, 143 is positioned within the body portion 122, 142. As another non-limiting example, the item support portion 123, 143 may be secured (detachably or permanently) relative to the body portion 122, 142, for example, as an item cage and/or item gripper suspended below the body portion 122, 142 of a flying autonomous vehicle 100. As yet another non-limiting example, the item support portion may be embodied as a trailer secured (e.g., detachably or permanently) relative to the body portion 122 of a ground-based autonomous vehicle 100.
The item support portion 123, 143 of the autonomous vehicles 100 may encompass or may otherwise be secured relative to an item deposit mechanism 125, 145 configured to deposit an item at a delivery location. For example, the item deposit mechanism 125, 145 may comprise one or more grippers, one or more conveying mechanisms, one or more multi-axis robotic arms, one or more actuated doors, and/or the like. In various embodiments, the deposit mechanism 125, 145 is configured to retrieve an item to be delivered at a particular location from the item support portion 123, 143, and to deposit the item at the delivery location. For example, a multi-axis robotic arm having a gripper at one end may be configured to retrieve the item to be delivered from the item support portion, and to move the item to the destination location (e.g., a desk surface, a floor, and/or the like at the destination location). As yet another example, an item support portion 123 comprising a conveying mechanism 124 may be configured to convey an item onto the deposit mechanism 125 (which may also comprise a conveyor), and the deposit mechanism may be configured to deposit the item at the delivery location, as shown in the example embodiment of Fig. 10.
Moreover, the autonomous vehicle 100 comprises an onboard controller configured to control locomotion of the vehicle, navigation of the vehicle, obstacle avoidance, item delivery, and/or the like. As discussed herein, the autonomous vehicle 100 may additionally comprise one or more user interfaces 126, which may comprise an output mechanism and/or an input mechanism configured to receive user input. For example, the user interface may be configured to enable autonomous vehicle technicians to review diagnostic information/data relating to the autonomous vehicle, and/or a user of the autonomous vehicle 100 may utilize the user interface to input and/or review information/data indicative of a destination location for the autonomous vehicle 100.
Fig. 1 shows one or more computing entities, devices, and/or similar words used herein interchangeably that are associated with and/or otherwise collectively form the onboard controller. For example, the one or more computing entities may encompass, for example, an information/data collection device 130 or other computing entities. In general, the terms computing entity, entity, device, system, and/or similar words used herein interchangeably may refer to, for example, one or more computers, computing entities, desktop computers, mobile phones, tablets, phablets, notebooks, laptops, distributed systems, gaming consoles (e.g., Xbox, Play Station, Wii), watches, glasses, iBeacons, proximity beacons, key fobs, radio frequency identification (RFID) tags, ear pieces, scanners, televisions, dongles, cameras, wristbands, wearable items/devices, items/devices, vehicles, kiosks, input terminals, servers or server networks, blades, gateways, switches, processing devices, processing entities, set-top boxes, relays, routers, network access points, base stations, the like, and/or any combination of devices or entities adapted to perform the functions, operations, and/or processes described herein.
In one embodiment, the information/data collection device 130 may include, be associated with, or be in wired or wireless communication with one or more processors (various exemplary processors are described in greater detail below), one or more location-determining devices or one or more location sensors 120 (e.g., Global Navigation Satellite System (GNSS) sensors, indoor location sensors, such as Bluetooth sensors, Wi-Fi sensors, and/or the like), one or more real-time clocks, a J-Bus protocol architecture, one or more electronic control modules (ECM), one or more communication ports for receiving information/data from various sensors (e.g., via a CAN-bus), one or more communication ports for transmitting/sending data, one or more RFID tags/sensors, one or more power sources, one or more information/data radios for communication with a variety of communication networks, one or more memory modules, and one or more programmable logic controllers (PLC). It should be noted that many of these components may be located in the autonomous vehicle 100 but external to the information/data collection device 130.
In one embodiment, the one or more location sensors 120, modules, or similar words used herein interchangeably may be one of several components in wired or wireless communication with or available to the information/data collection device 130. Moreover, the one or more location sensors 120 may be compatible with GPS satellites 115, such as Low Earth Orbit (LEO) satellite systems, Department of Defense (DOD) satellite systems, the European Union Galileo positioning systems, the Chinese Compass navigation systems, Indian Regional Navigational satellite systems, and/or the like. This information/data can be collected using a variety of coordinate systems, such as the Decimal Degrees (DD); Degrees, Minutes, Seconds (DMS); Universal Transverse Mercator (UTM); Universal Polar Stereographic (UPS) coordinate systems; and/or the like.
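For reference, the relationship between the Degrees, Minutes, Seconds (DMS) format and the Decimal Degrees (DD) format mentioned above is decimal = degrees + minutes/60 + seconds/3600, with southern and western hemispheres expressed as negative values. A short, non-limiting Python sketch using invented example coordinates:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # Convert Degrees/Minutes/Seconds to Decimal Degrees; "S" and "W" are negative.
        value = abs(degrees) + minutes / 60.0 + seconds / 3600.0
        return -value if hemisphere in ("S", "W") else value

    if __name__ == "__main__":
        # 33 degrees 44' 56" N, 84 degrees 23' 24" W (illustrative coordinates only)
        print(round(dms_to_decimal(33, 44, 56, "N"), 6))  # 33.748889
        print(round(dms_to_decimal(84, 23, 24, "W"), 6))  # -84.39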
As discussed herein, triangulation and/or proximity based location determinations may be used in connection with a device associated with a particular autonomous vehicle and with various communication points (e.g., cellular towers, Wi-Fi access points, location devices 400, and/or the like) positioned at various locations throughout a geographic area and/or throughout an interior of a facility to monitor the location of the vehicle 100 and/or its operator. The one or more location sensors 120 may be used to receive latitude, longitude, altitude, heading or direction, geocode, course, position, time, location identifying information/data, and/or speed information/data (e.g., referred to herein as location information/data and further described herein below). The one or more location sensors 120 may also communicate with the mapping computing entity 110, the information/data collection device 130, user computing entity 105, and/or similar computing entities.
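One simple, non-limiting way to picture proximity-based location determination from nearby communication points is a signal-strength-weighted centroid, sketched below in Python. The beacon coordinates and RSSI values are invented for this example, and deployed systems may instead rely on trilateration, fingerprinting, or other techniques.

    def estimate_position(beacon_readings):
        # beacon_readings: [(x_meters, y_meters, rssi_dbm), ...]
        # Stronger (less negative) RSSI contributes more weight to the estimate.
        if not beacon_readings:
            return None
        weights = [10 ** (rssi / 20.0) for _, _, rssi in beacon_readings]
        total = sum(weights)
        x = sum(w * bx for w, (bx, _, _) in zip(weights, beacon_readings)) / total
        y = sum(w * by for w, (_, by, _) in zip(weights, beacon_readings)) / total
        return x, y

    if __name__ == "__main__":
        readings = [(0.0, 0.0, -45.0), (10.0, 0.0, -60.0), (0.0, 8.0, -70.0)]
        print(estimate_position(readings))  # biased toward the strongest beacon at (0, 0)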
In one embodiment, the ECM may be one of several components in communication with and/or available to the information/data collection device 130. The ECM, which may be a scalable and subservient device to the information/data collection device 130, may have information/data processing capability to decode and store analog and digital inputs received from, for example, vehicle systems and sensors. The ECM may further have information/data processing capability to collect and present location information/data to the J-Bus (which may allow transmission to the information/data collection device 130), and output location identifying data, for example, via a display and/or other output device (e.g., a speaker).
As indicated, a communication port may be one of several components available in the information/data collection device 130 (or be in or as a separate computing entity). Embodiments of the communication port may include an Infrared information/data Association (IrDA) communication port, an information/data radio, and/or a serial port. The communication port may receive instructions for the information/data collection device 130. These instructions may be specific to the vehicle 100 in which the information/data collection device 130 is installed, specific to the geographic area and/or serviceable point in which the vehicle 100 will be traveling, specific to the function the vehicle 100 serves within a fleet, and/or the like. In one embodiment, the information/data radio may be configured to communicate with a wireless wide area network (WWAN), wireless local area network (WLAN), wireless personal area network (WPAN), or any combination thereof. For example, the information/data radio may communicate via various wireless protocols, such as 802.11, general packet radio service (GPRS), Universal Mobile Telecommunications System (UMTS), Code Division Multiple Access 2000 (CDMA2000), CDMA2000 1X (1xRTT), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), Evolved Universal Terrestrial Radio Access Network (E-UTRAN), Evolution-Data Optimized (EVDO), High Speed Packet Access (HSPA), High-Speed Downlink Packet Access (HSDPA), IEEE 802.11 (Wi-Fi), 802.16 (WiMAX), ultra-wideband (UWB), infrared (IR) protocols, Bluetooth protocols (including Bluetooth low energy (BLE)), wireless universal serial bus (USB) protocols, and/or any other wireless protocol. As yet other examples, the communication port may be configured to transmit and/or receive information/data transmissions via light-based communication protocols (e.g., utilizing specific light emission frequencies, wavelengths (e.g., visible light, infrared light, and/or the like), and/or the like to transmit data), via sound-based communication protocols (e.g., utilizing specific sound frequencies to transmit data), and/or the like. In various embodiments, the autonomous vehicle 100 may comprise a user interface 126 comprising one or more input devices and/or one or more output devices configured to receive user input and/or to provide visual and/or audible output to a user (e.g., a programmer, a technician, and/or the like). For example, the vehicle may comprise a touchscreen (e.g., a capacitive touchscreen), a keyboard, a mouse, a touchpad, a display (e.g., an LCD display, an LED display, a tube display, and/or the like), and/or the like.
As discussed herein, the onboard controller of the autonomous vehicle 100 may be configured to generate and/or retrieve navigational instructions for the autonomous vehicle 100 to move to a particular destination location within the facility. The onboard controller may be configured to interpret the navigational instructions relative to a determined location of the autonomous vehicle 100 within the facility, and to provide signals to the onboard locomotion mechanisms to move the autonomous vehicle 100 along a determined route to the destination location.
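A minimal, non-limiting sketch of how an onboard controller might translate the position of the next cue-emitting location device into locomotion signals is shown below. The pose representation, the one-meter step limit, and the returned command fields are illustrative assumptions rather than the claimed control scheme.

    import math

    def steering_command(vehicle_pose, cue_position):
        # vehicle_pose = (x, y, heading_radians); cue_position = (x, y).
        vx, vy, heading = vehicle_pose
        cx, cy = cue_position
        bearing = math.atan2(cy - vy, cx - vx)
        # Smallest signed angle between the current heading and the bearing to the cue.
        turn = math.atan2(math.sin(bearing - heading), math.cos(bearing - heading))
        distance = math.hypot(cx - vx, cy - vy)
        return {"turn_radians": turn, "forward_meters": min(distance, 1.0)}

    if __name__ == "__main__":
        print(steering_command((0.0, 0.0, 0.0), (3.0, 3.0)))
        # Roughly a 45-degree turn toward the cue, then advance up to one meter.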
Moreover, in various embodiments, the onboard controller may comprise one or more sensors 127 configured to assist in navigating the autonomous vehicle 100 during movement. The one or more sensors 127 are configured to detect objects around the autonomous vehicle 100 and to provide feedback to the onboard controller to assist in guiding the autonomous vehicle 100 in the execution of various operations, including, for example, initialization (e.g., takeoff), movement navigation, route completion (e.g., landing), and/or the like. The one or more sensors 127 may comprise one or more travel sensors configured to aid in navigating the autonomous vehicle 100 through a facility, one or more shipment/item delivery sensors configured to aid in placing a shipment/item at an intended destination location, and/or the like. In certain embodiments, a single set of one or more sensors may operate as both the travel sensors and the shipment/item delivery sensors; however, in other embodiments, an autonomous vehicle 100 may comprise a dedicated set of one or more travel sensors and a separate and dedicated set of one or more shipment/item delivery sensors.
For example, the sensors 127 may comprise one or more light sensors, cameras, infrared sensors, microphones, RFID sensors, wireless communication receivers (e.g., Bluetooth, Wi-Fi, and/or the like), LIDAR sensors, proximity sensors, and/or the like. In certain embodiments, the one or more sensors may be configured for generating and/or interpreting three-dimensional mapping aspects and/or three-dimensional aspects of an environment surrounding the autonomous vehicle 100, for example, utilizing three-dimensional depth sensors (e.g., dual cameras, depth-sensing cameras, depth-sensing scanners, and/or the like). The three-dimensional sensors may be utilized in conjunction with and/or as an alternative to other location-determining devices to ascertain a current location of the autonomous vehicle 100 relative to various visible objects (which may be utilized as known landmarks that are recognizable to the autonomous vehicle 100 as matching information/data stored in a memory associated with the autonomous vehicle 100). In certain embodiments, the one or more sensors may be secured to and/or within the body portion of the autonomous vehicle. The one or more sensors may be positioned on the autonomous vehicle 100 based on the intended functionality of the sensor. For example, shipment/item delivery sensors may be positioned such that the shipment/item delivery sensors are able to monitor the relative location of a shipment/item to be delivered at a particular location. For example, a camera may have a field of view including one or more edges of a shipment/item (and/or a shipment/item deposit mechanism 125), such that the camera is configured to monitor the placement of the shipment/item at an intended destination location. As yet another example, one or more navigational and/or obstacle avoidance sensors may be positioned such that the sensors are able to monitor the location of one or more edges of the autonomous vehicle 100 relative to one or more detected obstacles within a travel path of the autonomous vehicle 100.
In various embodiments, the sensors 127 may be configured to detect data outputs from the one or more location devices within the facility. Accordingly, as discussed herein, the onboard controller may be configured to receive sensor data from the one or more sensors 127, and to utilize the retrieved sensor data to self-determine a precise location of the autonomous vehicle 100 within the facility. Moreover, the onboard controller may utilize the sensor data to identify an appropriate location to deposit an item at a destination location (e.g., on a desk, on a floor, and/or the like).
B. Exemplary Mapping Computing Entity
Fig. 4 provides a schematic of a mapping computing entity 110 according to one embodiment of the present invention. In various embodiments, each facility (e.g., office building, apartment building, storage building, campus, office suite, hotel, motel, inn, school, house, warehouse, convention center, and/or the like) may have a corresponding facility-specific mapping computing entity 110 configured to store and/or provide information/data indicative of various locations within the facility. In certain embodiments, various entities may comprise a mapping computing entity 110 storing location information/data for various locations located in a plurality of facilities. For example, a carrier may have a third-party mapping computing entity 110 storing location information/data for various locations internal to a plurality of facilities. A carrier may be a traditional carrier, such as United Parcel Service (UPS), FedEx, DHL, courier services, the United States Postal Service (USPS), Canadian Post, freight companies (e.g., truck-load, less-than-truckload, rail carriers, air carriers, ocean carriers, etc.), and/or the like. However, a carrier may also be a nontraditional carrier, such as Coyote, Amazon, Google, Uber, ride-sharing services, crowd-sourcing services, retailers, and/or the like. Accordingly, each time an employee of the carrier arrives at a particular facility, the carrier's mapping computing entity 110 may provide the employee with location information/data corresponding to the particular facility.
In various embodiments, a third party may provide software to configure a facility-specific mapping computing entity 110 to provide various internal addressing and/or navigational features as discussed herein. In various embodiments, the provided software may comprise algorithms for generating and/or storing map data, algorithms for generating recommended navigational routes to a desired destination location (as discussed herein), and/or the like. In various embodiments, the software may be configurable based on hardware utilized at a particular facility. For example, the software may be configured such that signals generated according to the third-party provided software are compatible and readable with various hardware components (e.g., transportation mechanisms, location devices 400, and/or the like) located within the facility.
Moreover, the provided software may be configured to provide security features for a specific facility, for example, to prevent unauthorized devices from obtaining mapping data for the facility. The software may be configured to enable certain devices (e.g., certain autonomous vehicles 100) to connect and communicate with various hardware components (e.g., location devices 400) within the facility. For example, the software may be configured to enable all devices of a certain type (e.g., an autonomous vehicle 100), devices having corresponding specific identifiers (e.g., serial numbers, device names, and/or the like), and/or the like to connect with the various building hardware components.
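As a non-limiting sketch of this access-control behavior, the following Python fragment admits either whole device classes or specific device identifiers to the facility's location-device network. The policy values are assumptions made for illustration; AS445 simply reuses the example vehicle ID appearing elsewhere in this description.

    ALLOWED_DEVICE_TYPES = {"autonomous_vehicle"}   # assumed policy: admit this class
    ALLOWED_IDENTIFIERS = {"AS445", "AS446"}        # assumed policy: admit these IDs

    def may_connect(device_type, identifier):
        # Permit a requesting device if its class or its specific identifier is allowed.
        return device_type in ALLOWED_DEVICE_TYPES or identifier in ALLOWED_IDENTIFIERS

    if __name__ == "__main__":
        print(may_connect("autonomous_vehicle", "AS999"))  # True (allowed device class)
        print(may_connect("smartphone", "AS445"))          # True (allowed identifier)
        print(may_connect("smartphone", "XX123"))          # False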
As indicated, in one embodiment, the mapping computing entity 110 may also include one or more communications interfaces 320 for communicating with various computing entities, such as by communicating data, content, information, and/or similar terms used herein interchangeably that can be transmitted, received, operated on, processed, displayed, stored, and/or the like. For instance, the mapping computing entity 110 may communicate with autonomous vehicles 100, user computing entities 105, location devices 400, and/or the like.
As shown in Fig. 4, in one embodiment, the mapping computing entity 110 may include or be in communication with one or more processing elements 305 (also referred to as processors, processing circuitry, and/or similar terms used herein interchangeably) that communicate with other elements within the mapping computing entity 110 via a bus, for example. As will be understood, the processing element 305 may be embodied in a number of different ways. For example, the processing element 305 may be embodied as one or more complex programmable logic devices (CPLDs), microprocessors, multi-core processors, coprocessing entities, application-specific instruction-set processors (ASIPs), and/or controllers. Further, the processing element 305 may be embodied as one or more other processing devices or circuitry. The term circuitry may refer to an entirely hardware embodiment or a combination of hardware and computer program products. Thus, the processing element 305 may be embodied as integrated circuits, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic arrays (PLAs), hardware accelerators, other circuitry, and/or the like. As will therefore be understood, the processing element 305 may be configured for a particular use or configured to execute instructions stored in volatile or non-volatile media or otherwise accessible to the processing element 305. As such, whether configured by hardware or computer program products, or by a combination thereof, the processing element 305 may be capable of performing steps or operations according to embodiments of the present invention when configured accordingly.
In one embodiment, the mapping computing entity 110 may further include or be in communication with non-volatile media (also referred to as non-volatile storage, memory, memory storage, memory circuitry and/or similar terms used herein interchangeably). In one embodiment, the non-volatile storage or memory may include one or more non-volatile storage or memory media 310 as described above, such as hard disks, ROM, PROM, EPROM, EEPROM, flash memory, MMCs, SD memory cards, Memory Sticks, CBRAM, PRAM, FeRAM, RRAM, SONOS, racetrack memory, and/or the like. As will be recognized, the non-volatile storage or memory media may store databases, database instances, database management system entities, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like. The term database, database instance, database management system entity, and/or similar terms used herein interchangeably may refer to a structured collection of records or information/data that is stored in a computer-readable storage medium, such as via a relational database, hierarchical database, and/or network database.
In one embodiment, the mapping computing entity 110 may further include or be in communication with volatile media (also referred to as volatile storage, memory, memory storage, memory circuitry and/or similar terms used herein interchangeably). In one embodiment, the volatile storage or memory may also include one or more volatile storage or memory media 315 as described above, such as RAM, DRAM, SRAM, FPM DRAM, EDO DRAM, SDRAM, DDR SDRAM, DDR2 SDRAM, DDR3 SDRAM, RDRAM, RIMM, DIMM, SIMM, VRAM, cache memory, register memory, and/or the like. As will be recognized, the volatile storage or memory media may be used to store at least portions of the databases, database instances, database management system entities, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like being executed by, for example, the processing element 305. Thus, the databases, database instances, database management system entities, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like may be used to control certain aspects of the operation of the mapping computing entity 110 with the assistance of the processing element 305 and operating system.
As indicated, in one embodiment, the mapping computing entity 110 may also include one or more communications interfaces 320 for communicating with various computing entities, such as by communicating data, content, information, and/or similar terms used herein interchangeably that can be transmitted, received, operated on, processed, displayed, stored, and/or the like. For instance, the mapping computing entity 110 may communicate with computing entities or communication interfaces of the vehicle 100, mobile computing entities 105, and/or the like.
Such communication may be executed using a wired information/data transmission protocol, such as fiber distributed information/data interface (FDDI), digital subscriber line (DSL), Ethernet, asynchronous transfer mode (ATM), frame relay, information/data over cable service interface specification (DOCSIS), or any other wired transmission protocol. Similarly, the mapping computing entity 110 may be configured to communicate via wireless external communication networks using any of a variety of protocols, such as GPRS, UMTS, CDMA2000, 1xRTT, WCDMA, TD-SCDMA, LTE, E-UTRAN, EVDO, HSPA, HSDPA, Wi-Fi, WiMAX, UWB, IR protocols, Bluetooth protocols, USB protocols, and/or any other wireless protocol. As yet other examples, the mapping computing entity 110 may be configured to transmit and/or receive information/data transmissions via light-based communication protocols (e.g., utilizing specific light emission frequencies, wavelengths (e.g., visible light, infrared light, and/or the like), and/or the like to transmit data), via sound-based communication protocols (e.g., utilizing specific sound frequencies to transmit data), and/or the like. Although not shown, the mapping computing entity 110 may include or be in communication with one or more input elements, such as a keyboard input, a mouse input, a touch screen/display input, audio input, pointing device input, joystick input, keypad input, and/or the like. The mapping computing entity 110 may also include or be in communication with one or more output elements (not shown), such as audio output, video output, screen/display output, motion output, movement output, and/or the like.
As will be appreciated, one or more of the mapping computing entity's 110 components may be located remotely from other mapping computing entity 110 components, such as in a distributed system. Furthermore, one or more of the components may be combined and additional components performing functions described herein may be included in the mapping computing entity 110. Thus, the mapping computing entity 110 can be adapted to accommodate a variety of needs and circumstances.
C. Exemplary User Computing Entity
In various embodiments, a user device as discussed herein may be a user computing entity 105. As discussed herein, the user computing entity 105 may be a stationary computing device (e.g., a desktop computer, a server, a mounted computing device, and/or the like) or a mobile computing device (e.g., a PDA, a smartphone, a tablet, a phablet, a wearable computing entity, and/or the like) having an onboard power supply (e.g., a battery).
Fig. 5 provides an illustrative schematic representative of a user computing entity 105 that can be used in conjunction with embodiments of the present invention. In one embodiment, the user computing entities 105 may include one or more components that are functionally similar to those of the mapping computing entity 110 and/or as described below. As will be recognized, user computing entities 105 can be operated by various parties, including residents, employees, and/or visitors of a facility. As shown in Fig. 5, a user computing entity 105 can include an antenna 412, a transmitter 404 (e.g., radio), a receiver 406 (e.g., radio), and a processing element 408 that provides signals to and receives signals from the transmitter 404 and receiver 406, respectively.
The signals provided to and received from the transmitter 404 and the receiver 406, respectively, may include signaling information/data in accordance with an air interface standard of applicable wireless systems to communicate with various entities, such as autonomous vehicles 100, mapping computing entities 110, location devices 400, and/or the like. In this regard, the user computing entity 105 may be capable of operating with one or more air interface standards, communication protocols, modulation types, and access types. More particularly, the user computing entity 105 may operate in accordance with any of a number of wireless communication standards and protocols. In a particular embodiment, the user computing entity 105 may operate in accordance with multiple wireless communication standards and protocols, such as GPRS, UMTS, CDMA2000, 1xRTT, WCDMA, TD-SCDMA, LTE, E-UTRAN, EVDO, HSPA, HSDPA, Wi-Fi, WiMAX, UWB, IR protocols, Bluetooth protocols, USB protocols, and/or any other wireless protocol. As yet other examples, the user computing entity 105 may be configured to transmit and/or receive information/data transmissions via light-based communication protocols (e.g., utilizing specific light emission frequencies, wavelengths (e.g., visible light, infrared light, and/or the like), and/or the like to transmit data), via sound-based communication protocols (e.g., utilizing specific sound frequencies to transmit data), and/or the like.
Via these communication standards and protocols, the user computing entity 105 can communicate with various other entities using concepts such as Unstructured Supplementary Service information/data (USSD), Short Message Service (SMS), Multimedia Messaging Service (MMS), Dual-Tone Multi-Frequency Signaling (DTMF), and/or Subscriber Identity Module Dialer (SIM dialer). The user computing entity 105 can also download changes, add-ons, and updates, for instance, to its firmware, software (e.g., including executable instructions, applications, program modules), and operating system.
According to one embodiment, the user computing entity 105 (e.g., a mobile user computing entity 105) may include location determining aspects, devices, modules, functionalities, and/or similar words used herein interchangeably. For example, the user computing entity 105 may include outdoor positioning aspects, such as a location module adapted to acquire, for example, latitude, longitude, altitude, geocode, course, direction, heading, speed, UTC, date, and/or various other information/data. In one embodiment, the location module can acquire data, sometimes known as ephemeris data, by identifying the number of satellites in view and the relative positions of those satellites. The satellites may be a variety of different satellites, including LEO satellite systems, DOD satellite systems, the European Union Galileo positioning systems, the Chinese Compass navigation systems, Indian Regional Navigational satellite systems, and/or the like. Alternatively, the location information/data may be determined by triangulating the user computing entity's 105 position in connection with a variety of other systems, including cellular towers, Wi-Fi access points, location devices 400, and/or the like. Similarly, the user computing entity 105 (e.g., a mobile user computing entity 105) may include indoor positioning aspects, such as a location module adapted to acquire, for example, latitude, longitude, altitude, geocode, course, direction, heading, speed, time, date, location identifying data, and/or various other information/data. Some of the indoor aspects may use various position or location technologies including RFID tags, indoor location devices 400 or transmitters, Wi-Fi access points, cellular towers, nearby computing devices (e.g., smartphones, laptops) and/or the like. For instance, such technologies may include iBeacons, Gimbal proximity beacons, BLE transmitters, Near Field Communication (NFC) transmitters, and/or the like. These indoor positioning aspects can be used in a variety of settings to determine the location of someone or something to within inches or centimeters.
The user computing entity 105 may also comprise a user interface (that can include a display 416 coupled to a processing element 408) and/or a user input interface (coupled to a processing element 408). For example, the user interface may be an application, browser, user interface, dashboard, webpage, and/or similar words used herein interchangeably executing on and/or accessible via the mobile computing entity 105 to interact with and/or cause display of information. The user input interface can comprise any of a number of devices allowing the user computing entity 105 to receive data, such as a keypad 418 (hard or soft), a touch display, voice/speech or motion interfaces, scanners, readers, or other input device. In embodiments including a keypad 418, the keypad 418 can include (or cause display of) the conventional numeric (0-9) and related keys (#, *), and other keys used for operating the user computing entity 105 and may include a full set of alphabetic keys or set of keys that may be activated to provide a full set of alphanumeric keys. In addition to providing input, the user input interface can be used, for example, to activate or deactivate certain functions, such as screen savers and/or sleep modes. Through such inputs the user computing entity can collect contextual information/data as part of the telematics data.
The user computing entity 105 can also include volatile storage or memory 422 and/or non-volatile storage or memory 424, which can be embedded and/or may be removable. For example, the non-volatile memory may be ROM, PROM, EPROM, EEPROM, flash memory, MMCs, SD memory cards, Memory Sticks, CBRAM, PRAM, FeRAM, RRAM, SONOS, racetrack memory, and/or the like. The volatile memory may be RAM, DRAM, SRAM, FPM DRAM, EDO DRAM, SDRAM, DDR SDRAM, DDR2 SDRAM, DDR3 SDRAM, RDRAM, RIMM, DIMM, SIMM, VRAM, cache memory, register memory, and/or the like. The volatile and non-volatile storage or memory can store databases, database instances, database management system entities, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like to implement the functions of the user computing entity 105.
D. Exemplary Facilities
In one embodiment, a facility, facility address, and/or similar words used herein interchangeably may be any identifiable location having a physical address, such as one or more campuses, lockers, access points, delivery locations, longitude and latitude points, geocodes, stops (e.g., pick up stops, delivery stops, vehicle visits, stops), geofenced areas, geographic areas, landmarks, buildings, bridges, and/or other identifiable locations. For example, a facility may be a residential location, such as one or more homes, one or more mobile homes, one or more apartments, one or more apartment buildings, one or more condominiums, one or more townhomes, and/or the like. A facility may also be a commercial location, such as one or more stores in a mall having a defined address, one or more office buildings, one or more office parks, one or more offices of an office complex, one or more garages, one or more lockers or access points, one or more warehouses, one or more restaurants, one or more stores, one or more retail locations, and/or the like. Facilities may also comprise one or more industrial locations, such as manufacturing locations, distribution locations, processing locations, industrial park locations, and/or the like. As will be recognized, a variety of approaches and techniques can be used to adapt to various needs and circumstances. As discussed herein, facilities may encompass one or more internal locations having corresponding internal addresses. The internal locations may comprise one or more rooms, hallways, portions of rooms, portions of hallways, cubicles, offices, stalls, restrooms, furniture (e.g., desks, chairs, and/or the like), walls, floors, portions of floors, stores, departments, elevators, stairwells, escalators, ramps, walkways, catwalks, roofs, basements, parking spaces, buildings (e.g., in a multi-building campus), mobile devices, mobile beacons, fixed-location beacons, and/or the like. As a non-limiting example, Fig. 6 provides a two-dimensional map view of a portion of an example facility, in which a plurality of location devices 400 are shown in various rooms and hallways with corresponding internal addresses. In various embodiments, only a subset of a plurality of internal locations may be associated with corresponding internal addresses. For example, various floors, portions of floors, rooms, portions of rooms, furniture, and/or the like may be associated with one or more internal addresses, while other internal locations, such as hallways, walls, and/or the like may not be specifically associated with an internal address. As discussed herein, the internal addresses may correspond to one or more network enabled computing entities, such as one or more location devices 400. Moreover, as discussed herein, facilities may encompass a plurality of location devices 400 each associated with one or more internal locations, internal addresses, and/or the like. In various embodiments, a facility may encompass a network of location devices 400, collectively providing information/data regarding a plurality of internal locations within the facility and/or one or more autonomous vehicles 100 configured to traverse the interior of the facility. Moreover, in various embodiments, facilities may each be associated with a location device 400 providing a general internal address for the facility.
E. Exemplary Shipment/Item
In one embodiment, a shipment/item may be any tangible and/or physical object. In one embodiment, a shipment/item may be or be enclosed in one or more packages, envelopes, parcels, bags, goods, products, containers, loads, crates, items banded together, vehicle parts, pallets, drums, the like, and/or similar words used herein interchangeably. In one embodiment, each shipment/item may include and/or be associated with an item/shipment identifier, such as an alphanumeric identifier. Such item/shipment identifiers may be represented as text, barcodes, tags, character strings, Aztec Codes, MaxiCodes, Data Matrices, Quick Response (QR) Codes, electronic representations, and/or the like. A unique item/shipment identifier (e.g., 123456789) may be used by the carrier to identify and track the shipment/item as it moves through the carrier's transportation network. Further, such item/shipment identifiers can be affixed to shipments/items by, for example, using a sticker (e.g., label) with the unique item/shipment identifier printed thereon (in human and/or machine readable form) or an RFID tag with the unique item/shipment identifier stored therein. Such items may be referred to as "connected" shipments/items and/or "non-connected" shipments/items.
In one embodiment, connected shipments/items include the ability to determine their locations and/or communicate with various computing entities. This may include the shipment/item being able to communicate via a chip or other devices, such as an integrated circuit chip, RFID technology, Near Field Communication (NFC) technology, Bluetooth technology, Wi-Fi technology, light-based communication protocols, sound-based communication protocols, and any other suitable communication techniques, standards, or protocols with one another and/or communicate with various computing entities for a variety of purposes. Connected shipments/items may include one or more components that are functionally similar to those of the carrier server 100 and/or the mobile device 110 as described herein. For example, in one embodiment, each connected shipment/item may include one or more processing elements, one or more display device/input devices (e.g., including user interfaces), volatile and non-volatile storage or memory, and/or one or more communications interfaces. In this regard, in some example embodiments, a shipment/item may communicate "send to" address information/data, "received from" address information/data, unique identifier codes, location information/data, status information/data, and/or various other information/data.
In one embodiment, non-connected shipments/items do not typically include the ability to determine their locations and/or might not be able to communicate with various computing entities or are not designated to do so by the carrier. The location of non-connected shipments/items can be determined with the aid of other appropriate computing entities. For example, non-connected shipments/items can be scanned (e.g., affixed barcodes, RFID tags, and/or the like) or have the containers or vehicles in which they are located scanned or located. As will be recognized, an actual scan or location determination of a shipment/item is not necessarily required to determine the location of a shipment/item. That is, a scanning operation might not actually be performed on a label affixed directly to a shipment/item or location determination might not be made specifically for or by a shipment/item. For example, a label on a larger container housing many shipments/items can be scanned, and by association, the shipments/items housed within the container are considered to be located in the container at the scanned location. Similarly, the location of a vehicle transporting many shipments/items can be determined, and by association, the shipments/items being transported by the vehicle are considered to be located in the vehicle 100 at the determined location. These can be referred to as "logical" scans/determinations or "virtual" scans/determinations. Thus, the location of the shipments/items is based on the assumption they are within the container or vehicle, despite the fact that one or more of such shipments/items might not actually be there.
F. Exemplary Location Device
In various embodiments, one or more location devices 400 located within a facility may be utilized to provide location information/data to one or more devices (e.g., mobile user computing entity 105, shipment/item, autonomous vehicle 100, and/or the like) located within the facility, and/or to provide internal address information/data indicative of the current location (e.g., current location determined in real-time) of a particular mobile device user (e.g., an intended package recipient), internal location, and/or the like. For example, the location devices 400 may be associated with and/or define a particular internal location (e.g., a cubicle, a hallway, a floor, a portion of a floor, a portion of a hallway, a department, a store, and/or the like). In various embodiments, the location devices 400 may each be configured to broadcast information/data indicative of the internal location associated with the location device 400 wirelessly, within a wireless communication range associated with the location device 400. For example, the location devices 400 may be configured to transmit data via one or more wireless transmission protocols (e.g., Wi-Fi, Bluetooth, NFC, and/or the like), via soundwaves, via light, and/or the like. Moreover, the location devices 400 may be configured to transmit different data for different devices. For example, a single location device 400 may simultaneously and/or alternatively transmit separate data intended for multiple autonomous vehicles 100. As a specific example, a single location device 400 may simultaneously transmit data indicative of a first navigation instruction intended for a first autonomous vehicle 100 and transmit data indicative of a second navigation instruction intended for a second autonomous vehicle 100. The location devices 400 may be configured to transmit the multiple data in multiple frequencies (e.g., each transmission being broadcast in a corresponding frequency) (e.g., different radio frequencies, different light frequencies, and/or the like). As yet another example, the multiple data may be combined into a single transmission, but separated by data marker bytes within the transmission. For example, data intended for a first autonomous vehicle 100 may be preceded by a first data marker byte, and data intended for a second autonomous vehicle 100 may be preceded by a second data marker byte. In such embodiments, the receiving devices (e.g., receiving autonomous vehicles 100) may be configured to review the received data to search for a corresponding data marker byte that serves to mark data intended for the particular receiving device. Accordingly, the location devices 400 may be configured to broadcast information/data indicative of the identity of their location to other devices (e.g., user computing entities 105, autonomous vehicles 100, connected shipments/items, and/or the like) located within the communication range of the location device 400. Thus, as a device (e.g., an autonomous vehicle 100) enters the transmission range associated with the location device 400, the device may be configured to determine its location based on the information/data received from the location device 400.
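By way of a non-limiting illustration only, the following Python sketch shows one way the marker-byte scheme described above could be framed and parsed by a receiving autonomous vehicle 100. The one-byte marker plus one-byte length layout and the function names (pack_broadcast, extract_for_marker) are assumptions introduced here for the example; the disclosure requires only that data intended for a given autonomous vehicle 100 be preceded by a corresponding data marker byte.

```python
# Sketch of marker-byte framing for a combined location-device broadcast.
# The framing (1-byte marker + 1-byte length + payload) is assumed for
# illustration only.

def pack_broadcast(segments):
    """Combine per-vehicle payloads into one transmission.

    `segments` maps a data marker byte (0-255) to the payload (bytes)
    intended for the autonomous vehicle assigned that marker.
    """
    out = bytearray()
    for marker, payload in segments.items():
        if len(payload) > 255:
            raise ValueError("payload too long for 1-byte length field")
        out.append(marker)        # marker byte identifying the intended vehicle
        out.append(len(payload))  # payload length so receivers can skip segments
        out.extend(payload)
    return bytes(out)


def extract_for_marker(transmission, marker):
    """Scan the combined transmission for the segment addressed to `marker`."""
    i = 0
    while i + 2 <= len(transmission):
        seg_marker, seg_len = transmission[i], transmission[i + 1]
        payload = transmission[i + 2:i + 2 + seg_len]
        if seg_marker == marker:
            return payload        # data intended for this receiving vehicle
        i += 2 + seg_len          # skip over segments addressed to other vehicles
    return None                   # nothing addressed to this vehicle


if __name__ == "__main__":
    # Vehicle with marker 0x01 should turn left; vehicle 0x02 should go straight.
    tx = pack_broadcast({0x01: b"TURN_LEFT", 0x02: b"STRAIGHT"})
    print(extract_for_marker(tx, 0x02))  # b'STRAIGHT'
```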
The location devices 400 are spaced in various facilities such that a mobile device (e.g., an autonomous vehicle 100) can detect output data from at least two adjacent location devices 400. For example, while located proximate a location device 400, an autonomous vehicle 100 may detect output signals from at least one adjacent location device 400 such that the autonomous vehicle 100 is able to determine whether the adjacent location device 400 is along an intended travel path for the autonomous vehicle 100. For example, the location devices 400 may be spaced such that at least one adjacent location device 400 is visible from each location device 400.
In various embodiments, the location devices 400 may be configured to broadcast location information/data wirelessly via radio transmission (e.g., Wi-Fi, Bluetooth®, BLE, and/or the like), light transmission (e.g., visible light, infrared light, and/or the like detectable via a mobile computing entity 105), sound transmission, and/or the like. In various embodiments, the broadcast signals from the location devices 400 may enable an autonomous vehicle 100 (or other device) to determine its location (e.g., based on the location of the location device 400) and/or the autonomous vehicle's heading. For example, signals broadcast from a location device 400 may be directional, such that an autonomous vehicle may be configured to determine its direction relative to the directional signal broadcast from the location device 400.
The location devices 400 may comprise one or more wireless transmitters and/or receivers, as described herein with respect to various computing entities. In various embodiments, the location devices 400 may comprise a short range wireless transmitter and/or receiver (e.g., Bluetooth®, BLE, and/or the like) and/or a long range wireless transmitter and/or receiver (e.g., Wi-Fi). Accordingly, the location devices 400 may be configured to transmit information/data indicative of the identity of the internal location associated with the location device 400 via short-range wireless transmitters and may transmit other information/data to computing entities via the long range wireless transmitters.
In various embodiments, the location devices 400 may be configured to receive information/data transmitted from one or more computing entities, such as the onboard controllers of one or more autonomous vehicles 100 (or other devices such as internal building systems) and to provide navigational and/or other information/data to an autonomous vehicle 100. The location devices 400 may be configured to operate as an information/data relay between the autonomous vehicle 100 (or other device) and the mapping computing entity 110. For example, the location devices 400 may be configured to receive information/data indicative of a desired destination for an autonomous vehicle 100 from the autonomous vehicle 100. The location devices 400 may be configured to relay the received information/data to a mapping computing entity 110, which may be configured to determine a recommended route between the current location of the autonomous vehicle 100 (determined based at least in part on the location and/or identity of the location device 400) and the desired destination location. As yet another example, a location device 400 may receive information/data indicative of a desired delivery location from a connected shipment/item carried by a particular autonomous vehicle 100 and may relay the desired delivery location to a mapping computing entity 110 to determine a recommended route to the desired delivery location. In various embodiments, one or more of the location devices 400 (including the location device 400 proximate to the autonomous vehicle 100 and/or other location devices 400) may receive information/data instructing the location devices 400 to provide guidance to the autonomous vehicle 100, for example, by providing an autonomous vehicle-detectable indicia of a recommended direction of travel to reach the desired destination. For example, the location devices 400 may have associated notification mechanisms, such as speakers, lights (e.g., Light Emitting Diodes), displays, and/or the like configured to provide an indication of a direction of travel for the autonomous vehicle 100. As described herein, the mapping computing entity 110 of the facility may provide information/data instructing location devices 400 located along the recommended travel path to emit navigational instructions detectable by an autonomous vehicle 100 to form a virtual breadcrumb trail through the facility to the destination location. The detectable navigational instructions may comprise illuminating an indicator (e.g., by illuminating the location devices 400 in a particular color recognizable by a camera and/or light sensor of the autonomous vehicle, as shown in example Figs. 7-8 and 10) to provide a path of lights for the autonomous vehicle 100 to follow to the desired destination.
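As a non-limiting illustration of the virtual breadcrumb trail described above, the following Python sketch shows a mapping computing entity instructing location devices along a recommended route to illuminate. The LocationDevice class, its illuminate method, and the specific colors are assumptions for the example rather than the disclosed device interface.

```python
# Sketch of the "virtual breadcrumb trail": instruct each location device along
# the recommended route to emit a vehicle-detectable indication. The colors are
# assumed for illustration only.

GUIDANCE_COLOR = "green"     # assumed color meaning "travel this way"
DESTINATION_COLOR = "blue"   # assumed color meaning "destination reached"


class LocationDevice:
    def __init__(self, device_id):
        self.device_id = device_id
        self.color = None

    def illuminate(self, color):
        # A real device would drive an LED (or speaker/display); here we just
        # record and report the commanded state.
        self.color = color
        print(f"device {self.device_id}: illuminate {color}")

    def clear(self):
        self.color = None


def light_breadcrumb_trail(devices, route):
    """Illuminate the devices along `route` (a list of device ids, in order)."""
    for device_id in route[:-1]:
        devices[device_id].illuminate(GUIDANCE_COLOR)
    devices[route[-1]].illuminate(DESTINATION_COLOR)  # mark the destination


if __name__ == "__main__":
    fleet = {d: LocationDevice(d) for d in ["05L37", "05L38", "05L39", "05L40D12"]}
    recommended_route = ["05L37", "05L38", "05L39", "05L40D12"]
    light_breadcrumb_trail(fleet, recommended_route)
```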
Accordingly, the location devices 400 may be in wireless and/or wired communication with other devices, such as other location devices 400, user computing entities 105, mapping computing entities 110, autonomous vehicles 100, shipments/items, and/or the like. Each location device 400 may comprise one or more memory storage units (e.g., for storing information/data indicative of a location corresponding to the location device 400), one or more processing units, and/or the like. In various embodiments, the location devices 400 may be standalone units providing location information/data for various internal mapping, internal navigation, and/or internal addressing functions. One or more location devices 400 may be secured relative to a particular item, device, and/or the like, and may store information/data indicative of an internal location description for the item, device, and/or the like to which it is attached. For example, a location device 400 may be secured to a ceiling tile, a desk, a chair, a wall, a floor tile, an elevator, a step, a door, and/or the like. In other embodiments, the location devices 400 may be embodied as one or more network enabled devices (e.g., Internet of Things enabled devices), such as a thermostat, light fixture, light switch, desktop computer, notebook computer, electronic whiteboard, and/or the like.
As noted herein, each location device 400 may be configured to store information/data indicative of an internal location associated with the location device 400. For example, the information/data stored by the location device 400 may comprise at least a portion of an internal address within a particular facility (e.g., within a building, a campus, and/or the like). The location information/data stored by the location devices 400 may comprise a character and/or a string of characters, a symbol, and/or the like. In certain embodiments, the location information/data may comprise data received from a plurality of location information/data sources. In various embodiments, the location information/data may be indicative of a relative location of the location device 400 within the facility. For example, the location information/data may be indicative of a floor on which the location device 400 is located, a room in which the location device 400 is located, a building (e.g., in a multi-building facility) in which the location device 400 is located, and/or the like. As a specific example, the location device 400 may store a portion of an internal address in the form of 05L37D89, which may correlate to Desk Number 89, located proximate Light Fixture Number 37, on Floor 5 of a particular facility.
In various embodiments, the location information/data may be dynamic location information/data reflective of a current location of a mobile device (e.g., a mobile user computing entity 105) relative to one or more location devices 400. For example, the location information/data may comprise an internal address comprising data indicative of a mobile device located proximate a location device 400. With reference to the above-mentioned example location address (05L37D89), in various embodiments, the address may be updated to reflect the location of a particular mobile device (e.g., a mobile device associated with a resident of a building). Accordingly, a mobile device identifier (e.g., a character string) may be appended onto the location address to reflect the current location of the mobile device. As a specific example, if a mobile device having an associated mobile device identifier P33 is located near the above-mentioned address, the indoor address for the mobile device may be 05L37D89P33. In various embodiments, a plurality of mobile devices located at the same internal location may have different internal location addresses. For example, a first mobile device may be associated with the internal location address 05L37D89P33, while a second mobile device may be associated with the internal location address 05L37D89P46. Accordingly, the internal location addresses for each mobile device may reflect that the mobile devices are located at the same internal location, while also reflecting a distinction in identity between the various mobile devices.
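The following Python sketch illustrates composing and parsing the example internal address format discussed above (05L37D89, optionally extended with a mobile device identifier such as P33). The fixed-width regular expression and field names are assumptions drawn from this single example, as the disclosure permits any character-string scheme.

```python
# Sketch of composing and parsing the example internal address format
# 05L37D89 (Floor 05, Light fixture 37, Desk 89), optionally extended with a
# mobile device identifier such as P33. The fixed-width pattern is assumed
# from the example in the text.
import re

ADDRESS_PATTERN = re.compile(
    r"^(?P<floor>\d{2})L(?P<light>\d{2})D(?P<desk>\d{2})(?:P(?P<device>\d{2}))?$"
)


def compose(floor, light, desk, device=None):
    address = f"{floor:02d}L{light:02d}D{desk:02d}"
    if device is not None:
        address += f"P{device:02d}"  # dynamic suffix for a nearby mobile device
    return address


def parse(address):
    match = ADDRESS_PATTERN.match(address)
    if match is None:
        raise ValueError(f"not a recognized internal address: {address}")
    return {k: int(v) for k, v in match.groupdict().items() if v is not None}


if __name__ == "__main__":
    print(compose(5, 37, 89))             # 05L37D89 (static internal address)
    print(compose(5, 37, 89, device=33))  # 05L37D89P33 (dynamic, device P33 nearby)
    print(parse("05L37D89P46"))           # {'floor': 5, 'light': 37, 'desk': 89, 'device': 46}
```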
In various embodiments, location devices 400 may be in communication with other location devices 400 in order to provide information/data indicative of the current internal location of a particular location device 400, to provide information/data indicative of navigational instructions between location devices 400, and/or to provide other information/data between a plurality of location devices 400. In various embodiments, location devices 400 may be in communication with one another in a hierarchical fashion, for example, in which a plurality of location devices 400 are in communication with a master location device 400. For example, each master beacon 400 may be associated with a large area within a location (e.g., a single floor in a multi-floor building, a geofenced area within a particular building, one or more areas associated with defined, subservient location devices 400, and/or the like) and each subservient location device 400 may be associated with a small area within the area corresponding to the master beacon 400 (e.g., a particular cubicle on the floor of the building). With reference again to the above example internal address (05L37D89), a first master level location device 400 associated with the fifth floor of the building may provide the first two digits (05) of the address, a second master level location device 400 associated with the Light Fixture 37 may provide the second three digits (L37), and a third level location device 400 associated with Desk 89 may provide the last three digits (D89). It should be understood that this example should not be construed as limiting, as various location devices 400 may provide other configurations and/or portions of an internal address.
As yet another example, each location device 400 may be a standalone location device 400 in direct communication with a mapping computing entity 110. In such embodiments, each location device 400 may comprise the entirety of the internal address corresponding to the location device 400. In such embodiments, the mapping computing entity 110 may store information/data indicative of the location of various location devices 400 within the facility. For example, the mapping computing entity 110 may store digital map information/data for a facility having an indication of the location of each location device 400 stored within the map information/data.
As mentioned herein, the location devices 400 may comprise one or more notification mechanisms configured to output autonomous-vehicle detectable notifications (e.g., visual (e.g., light-based) notifications, audible notifications, radio frequency notifications, and/or the like). For example, one or more location devices 400 may comprise one or more light sources (e.g., Light Emitting Diodes ("LEDs")) configured to emit light in response to one or more signals received from another computing entity. In certain embodiments, the one or more location devices 400 may comprise a plurality of light sources and/or one or more light sources configured to emit multiple light colors (e.g., light having a selectable wavelength) in order to convey specific information/data to various autonomous vehicles 100. For example, a location device 400 may be configured to emit a first light color to indicate a desired direction of travel and a second light color to indicate the location of a desired destination. The autonomous vehicle 100 may be configured to detect the emitted light and to distinguish between the first light color and the second light color. In certain embodiments, the location devices 400 may be configured to emit one or more radio frequency signals detectable by an autonomous vehicle, for example, to guide the autonomous vehicle 100 to a desired internal location within a facility.
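As a non-limiting receiver-side illustration, the following Python sketch maps a detected light color onto a navigation cue for the autonomous vehicle 100. The specific colors and the Cue categories are assumptions for the example (matching the colors used in the earlier breadcrumb sketch), since the disclosure leaves the color assignments to the particular embodiment.

```python
# Sketch of receiver-side interpretation: classify a detected beacon color into
# a navigation cue. A real vehicle would derive the nominal color from its
# camera or light sensor; the color-to-meaning table is assumed.
from enum import Enum


class Cue(Enum):
    FOLLOW = "follow this device toward the destination"
    DESTINATION = "destination reached"
    IGNORE = "not part of the current route"


COLOR_TO_CUE = {
    "green": Cue.FOLLOW,      # first light color: desired direction of travel
    "blue": Cue.DESTINATION,  # second light color: location of the destination
}


def interpret_detection(detected_color):
    """Map a nominal detected color onto a navigation cue for the vehicle."""
    return COLOR_TO_CUE.get(detected_color.lower(), Cue.IGNORE)


if __name__ == "__main__":
    for color in ["green", "blue", "red"]:
        print(color, "->", interpret_detection(color).value)
```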
In various embodiments, location information/data stored on one or more location devices 400 may be generated manually during a location device 400 initialization process, and/or automatically. For example, location information/data indicative of a particular internal location may be manually loaded onto a storage device associated with a location device 400 (e.g., based on user input received by the location device 400). In such embodiments, information/data indicative of the relative locations of various location devices 400 within a serviceable point may be manually and/or automatically determined to enable internal mapping and/or internal navigation between various internal locations. Information/data indicative of the internal addresses associated with each of the plurality of location devices 400 may be stored in association with the mapping computing entity 110, such that mapping and/or navigational operations may be enabled by the mapping computing entity 110.
In certain embodiments, the mapping computing entity 110 may be configured to automatically associate various internal addresses with various location devices 400. For example, the various location devices 400 may be configured to automatically identify other location devices 400 in an area surrounding the location device 400 in order to determine a relative location of each location device 400 relative to other location devices 400. Moreover, the mapping computing entity 110 may comprise information/data indicative of an internal map, such as a blueprint (e.g., a two-dimensional blueprint and/or a three-dimensional blueprint) and may be configured to associate the relative locations of various location devices 400 with particular internal locations reflected within the internal map.
As discussed herein, one or more computing entities (e.g., mobile user computing entity 105) may provide functionality similar to a location device 400. For example, a mobile user computing entity 105 associated with a particular mobile device user located within a facility may operate as a location device 400 indicating the current location of the associated mobile device user. In such embodiments, the mobile computing entity 105 may be configured to determine its location relative to one or more location devices 400 to enable its location to be monitored and/or stored by a mapping computing entity 110. Accordingly, a particular mobile user computing entity 105 may be identified as a desired destination location for a particular mobile device user, and the mapping computing entity 110 (and/or another computing entity) may be configured to generate a recommended route to the current location of the mobile user computing entity 105 defining the destination location. Thus, for example, the current location of the associated mobile device user may be monitored and/or identified as a desired destination within the facility. The mapping computing entity 110 may thus be configured to determine a recommended travel path from a particular location to the current location of a mobile device user, based at least in part on the current location of the associated computing entity 105.
Moreover, the mapping computing entity 110 may be configured to determine whether a particular mobile computing entity 105 is located within the associated facility prior to generating a recommended route to the mobile computing entity 105. For example, upon determining that a particular mobile computing entity 105 is identified as a desired destination (e.g., an individual associated with the mobile computing entity 105 is identified as a recipient of a package to be delivered), the mapping computing entity 110 may first determine whether the mobile computing entity 105 is located within the associated facility. If the mobile computing entity 105 is not located within the facility, the mapping computing entity 110 may be configured to transmit information/data to a requesting computing entity (e.g., an autonomous vehicle 100) indicating that the desired destination is unavailable. The mapping computing entity 110 may be configured to additionally provide potential alternative destination locations, such as a user's office, desk, cubicle, and/or the like as alternative destinations for the requesting computing entity.
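The following Python sketch illustrates this presence check. The in-memory registries (CURRENT_LOCATIONS, FALLBACK_LOCATIONS) and the response fields are assumptions for the example rather than a disclosed data model.

```python
# Sketch of the "is the recipient in the facility?" check performed before a
# route is generated, with alternative destinations offered when the recipient
# is not present. All names and data are illustrative assumptions.

# Last known internal address of each monitored mobile computing entity,
# or None if the entity is not currently detected inside the facility.
CURRENT_LOCATIONS = {
    "john.smith": "05L37D89",
    "jane.doe": None,  # not presently inside the facility
}

# Static fallback locations (office, desk, cubicle, ...) per recipient.
FALLBACK_LOCATIONS = {
    "john.smith": ["05L37D89"],
    "jane.doe": ["03L12D04", "03L12D05"],
}


def resolve_destination(recipient):
    """Return either the recipient's live location or alternative destinations."""
    location = CURRENT_LOCATIONS.get(recipient)
    if location is not None:
        return {"status": "available", "destination": location}
    return {
        "status": "unavailable",  # recipient not currently within the facility
        "alternatives": FALLBACK_LOCATIONS.get(recipient, []),
    }


if __name__ == "__main__":
    print(resolve_destination("john.smith"))
    print(resolve_destination("jane.doe"))
```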
G. Exemplary Transportation Mechanism
In various embodiments, a transportation mechanism 170 may be configured for movement of a user, an item, an autonomous vehicle 100, and/or the like within a facility. For example, transportation mechanisms 170 may comprise elevators, dumbwaiters, escalators, people movers, moving walkways, automated transit (e.g., monorail, train, and/or the like), and/or the like. In various embodiments, transportation mechanisms 170 may comprise one or more computing mechanisms, such as one or more processors, memory storage devices, communication interfaces, and/or the like, as described herein in reference to other computing entities. In various embodiments, transportation mechanisms 170 may be in wired and/or wireless communication with one or more other computing entities, such as mapping computing entity 110, autonomous vehicles 100, user computing entities 105, location devices 400, and/or the like. Moreover, in various embodiments, one or more transportation mechanisms 170 may comprise one or more location devices 400 associated with the transportation mechanism 170.
In various embodiments, one or more transportation mechanisms 170 may be selectably and/or continuously operable. For example, an escalator may be configured to operate continuously, regardless of whether a person and/or item is being transported by the escalator. Alternatively, a transportation mechanism 170, such as an elevator (and/or an escalator), may be configured to operate (e.g., move) only in response to an indication that a person and/or item is located thereon.
In various embodiments, the one or more transportation mechanisms 170 may be configured to receive information/data indicative of the presence of a user and/or an autonomous vehicle 100 based on information/data received from a mobile user computing entity (and/or autonomous vehicle 100) indicative of the presence of a user or autonomous vehicle 100 proximate a particular transportation mechanism 170. In various embodiments, the one or more transportation mechanisms 170 may be configured to move to a particular location to pick up a user carrying the detected mobile user computing entity 105 (e.g., an elevator may move to a particular floor at which the mobile computing entity is detected). Moreover, the one or more transportation mechanisms may be configured to receive a desired location (e.g., a desired floor) for permitting a user to exit the transportation mechanism 170 from the mobile computing entity (and/or another computing entity). Accordingly, various transportation mechanisms 170 need not require physical user interactions (e.g., pressing buttons) in order to operate the transportation mechanism 170.
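As a non-limiting illustration, the following Python sketch models a button-free elevator interaction of the kind described above: the transportation mechanism moves to the floor where a vehicle (or user) is detected and then travels to the requested exit floor. The Elevator class, its handle_presence method, and the message fields are assumptions introduced for the example.

```python
# Sketch of a button-free elevator interaction driven by wireless presence and
# destination messages. The interface is assumed for illustration only.

class Elevator:
    def __init__(self, current_floor=1):
        self.current_floor = current_floor

    def _move_to(self, floor):
        print(f"elevator moving {self.current_floor} -> {floor}")
        self.current_floor = floor

    def handle_presence(self, detected_floor, requested_floor):
        """Respond to a presence/destination message from a nearby entity."""
        if self.current_floor != detected_floor:
            self._move_to(detected_floor)  # pick up the vehicle or user
        print("doors open for boarding")
        self._move_to(requested_floor)     # carry it to the requested floor
        print("doors open for exit")


if __name__ == "__main__":
    # An autonomous vehicle on floor 1 reports its presence and asks for floor 5.
    Elevator(current_floor=3).handle_presence(detected_floor=1, requested_floor=5)
```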
III. Exemplary System Operation
Various embodiments are configured for providing one or more internal locations with corresponding internal addresses based on a location of one or more location devices 400, and for providing guidance to a particular location having an internal address.
A. Internal Addressing
As discussed above, various locations (and/or mobile devices) within a facility may be associated with defined internal addresses unique to each of the various locations. The internal addresses may correspond to a particular location within the facility, as reflected in map information/data stored for a particular facility. In various embodiments, the internal addresses may be identified in reference to location devices 400 located nearby to various internal locations.
Each of a variety of locations may correspond to unique internal addresses that may distinguish a particular location from others in the same facility. Accordingly, a particular internal location, such as a particular cubicle, office, floor, room, and/or the like, may be identified based on a corresponding internal address. In certain embodiments, internal addresses may comprise one or more information/data elements configured to be reflective of a particular location corresponding to the particular address. As discussed herein, a particular internal address may comprise a portion identifying a floor, a portion of a floor (e.g., corresponding to a particular light fixture within the portion of the floor), a building (e.g., within a multi-building facility), and/or a particular internal location (e.g., a piece of furniture (e.g., a desk)) located proximate the light fixture. However, internal addresses for various internal locations may be defined in any of a variety of ways, such as via unique character strings. As discussed herein, an internal address may be generated based on data identifying a plurality of associated locations, devices, objects, and/or the like. Moreover, various internal addresses may be static and/or dynamic internal addresses (e.g., an address of a particular room may be static and an address of a particular mobile device, such as a mobile user computing entity carried by a facility resident, may be dynamic). For example, an individual (e.g., having an associated mobile user computing entity) in a cubicle/office may have an internal address determined based at least in part on an internal address associated with a nearby light fixture, a nearby desk, a nearby mobile device, and/or the like. Moreover, a second individual entering the same space with a second mobile user computing entity may have a second dynamic internal address at the same location, and reflecting the identity of the second mobile user computing entity. Accordingly, in various embodiments, particular internal locations may be associated with multiple unique internal addresses based at least in part on a mobile user computing entity located at the internal location.
Moreover, one or more internal addresses may correspond to a variety of internal locations within a single building defining a facility, and/or a variety of locations within a plurality of buildings collectively defining a facility (e.g., within a multi-building campus). Accordingly, in various embodiments, an internal address may be indicative of a particular building, a particular floor within the building, a particular region of the floor, a particular room on the floor, a particular piece of furniture (or other internal location) within the particular region of the floor, and/or the like.
In various embodiments, the internal addresses may correspond with one or more location devices 400 located near the addressed locations. For example, each location device 400 may have associated location information/data identifying the particular location device 400. The location information/data may comprise a unique identifier for the location device 400, and a particular location proximate the location device 400 may be identified based on the unique identifier of the location device 400. As a specific example, if a location device 400 located within a particular office is identified as 05L56R10, then the internal address for the particular office may be 05L56R10.
In various embodiments, the internal addresses may be correlated with map information/data comprising information/data indicative of the relative position of various locations. For example, the map information/data may comprise one or more building (or campus) maps, blueprints, and/or the like, such as two-dimensional maps, three-dimensional maps, and/or the like in order to provide a locational relationship between the one or more internal addresses and various other locations within a particular facility. For example, Fig. 6 provides a two-dimensional map view of an example portion of a facility indicating the location of various location devices 400 and their associated internal addresses within the facility. In various embodiments, the map information/data may be stored in the mapping computing entity 110 having embedded internal location information/data points stored therein. In various embodiments, the map information/data having embedded internal location information/data points may be publicly accessible. However, in other embodiments, the map information/data may be privately stored, such that only authorized personnel (and/or authorized autonomous vehicles 100) are granted access to at least a portion of the map information/data. For example, as discussed herein, the map information/data and/or other information/data associated with the internal addressing features may be provided to various autonomous vehicles 100 upon receipt of authorization information/data from the autonomous vehicle 100.
Moreover, the map information/data may be reflective of the location of one or more location devices 400 within the facility. In various embodiments, the map information/data may be automatically and/or manually populated with the relative positioning of the location devices 400. For example, the mapping computing entity 110 may be configured to identify a relative location of a particular location device 400 within stored map information/data based at least in part on location information/data stored for the location device 400. For example, location information/data for a particular location device 400 may be indicative of a floor, a location on a floor, a building, and/or the like for the location device 400. Similarly, the map information/data may comprise information/data identifying various portions of the map information/data as a particular floor, a particular location on a floor, and/or the like. For example, as shown in the example map information/data shown in Fig. 4, the map information/data may comprise information/data indicative of the location of various location devices 400 relative to various walls, doors, and/or rooms. Moreover, the map information/data may comprise information/data indicative of various internal addresses associated with various locations (e.g., associated with the locations of various location devices), and/or the like. Accordingly, the mapping computing entity 110 may be configured to correlate the location information/data for a particular location device 400 with information/data identifying various locations within the map information/data to automatically identify a precise location of a location device 400 within the map information/data. However, it should be understood that the location of various location devices 400 may be manually provided within the map information/data.
In certain embodiments, the map information/data may be stored in a mapping computing entity 110 associated with the mapped facility. Thus, facilities may store their own map information/data and may communicate various portions of the stored map information/data to autonomous vehicles 100, via the internet, via local area networks, and/or the like. For example, map information/data corresponding to a particular facility may be communicated to a particular autonomous vehicle 100 when the autonomous vehicle 100 is located within a particular geographical area (e.g., within the facility, within a defined geofenced area, within a wireless communication range of one or more location devices 400 located within the facility, and/or the like). In such embodiments, the mapping computing entity 110 may be configured to transmit at least a portion of the map information/data to the computing entity upon establishing an electronic communication between the autonomous vehicle 100 and one or more electronic entities (e.g., mapping computing entity 110 and/or location devices 400) corresponding to the facility. Accordingly, the map information/data may be publicly accessible to any autonomous vehicles 100 within a facility. However, in certain embodiments, the mapping computing entity 110 may be configured to only transmit map information/data to authorized autonomous vehicles 100 (e.g., autonomous vehicles 100 associated with the facility; autonomous vehicles associated with known visitors of the facility (e.g., delivery services companies); and/or the like), and in such embodiments, the autonomous vehicle 100 (and/or another user computing entity 105) may be required to present authentication information/data to the mapping computing entity 110 before the mapping computing entity 110 transmits map information/data to the autonomous vehicle 100.
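The following Python sketch illustrates one way map information/data could be gated behind such an authorization check. The token registry, the split between public and full map data, and the get_map_data function are assumptions for the example, as the disclosure does not specify an authentication scheme.

```python
# Sketch of gating map information/data behind an authorization check: full map
# data for authorized autonomous vehicles, a public subset otherwise. All
# credentials and payloads are illustrative assumptions.

AUTHORIZED_TOKENS = {"carrier-av-0012", "facility-av-0001"}  # assumed credentials

FACILITY_MAP = {
    "public": {"entrances": ["Lobby A", "Dock 3"]},
    "full": {"floors": 5, "location_devices": {"05L37D89": (12.0, 48.5, 5)}},
}


class NotAuthorizedError(Exception):
    pass


def get_map_data(auth_token=None):
    """Return full map data for authorized vehicles, else only a public subset."""
    if auth_token in AUTHORIZED_TOKENS:
        return {**FACILITY_MAP["public"], **FACILITY_MAP["full"]}
    if auth_token is None:
        return FACILITY_MAP["public"]  # unauthenticated: public portion only
    raise NotAuthorizedError("unrecognized authentication information/data")


if __name__ == "__main__":
    print(get_map_data("carrier-av-0012"))  # full map for an authorized vehicle
    print(get_map_data())                   # public subset otherwise
```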
In various embodiments, the mapping information/data may be provided to one or more autonomous vehicles 100 from the mapping computing entity 110 via one or more wireless networks (e.g., the internet, an intranet, and/or the like). The map information/data may be provided to the autonomous vehicle 100 with an indication of a current location of the autonomous vehicle 100, and/or the autonomous vehicle 100 may be configured to separately determine its own location. As just one example, the mapping computing entity 110 may provide the autonomous vehicle 100 with at least a portion of the map information/data via a network connection between the autonomous vehicle 100 and the mapping computing entity 110. As a specific example, the mapping computing entity 110 may be configured to provide at least a portion of the map information/data upon the autonomous vehicle 100 connecting to a wireless network (e.g., a Wi-Fi network corresponding to the facility, a short-range wireless connection with one or more location devices 400, and/or the like). In such embodiments, the autonomous vehicle 100 may be configured to determine its own location relative to one or more location devices 400 within the facility, and to generate an indicator of its own location within the received map information/data. As yet another alternative, the autonomous vehicle 100 may be configured to transmit information/data indicative of its current location (as determined based on a location of nearby location devices 400) to the mapping computing entity 110, which may be configured to incorporate an indication of the location of the autonomous vehicle 100 into the map information/data before transmitting the same to the autonomous vehicle 100.
In certain embodiments, the mapping computing entity 110 may be configured to provide at least a portion of the map information/data to an autonomous vehicle 100 via one or more location devices 400 located near the autonomous vehicle 100. Accordingly, the location devices 400 utilized to transmit map information/data to the autonomous vehicle 100 may be configured to update the map information/data received from the mapping computing entity 110 to incorporate information/data indicative of its own location prior to transmitting the updated map information/data to the autonomous vehicle 100. Accordingly, once received by the autonomous vehicle 100, the map information/data comprises an indication of the location of the autonomous vehicle 100 based on the location of a nearby location device 400 from which the map information/data is received. As discussed herein, the location device 400 may be configured to transmit the map information/data (and/or other data) to the autonomous vehicle 100 via wireless communication protocols, such as short range Bluetooth, short range Wi-Fi, NFC, and/or the like.
In various embodiments, the mapping computing entity 110 may enable computing entities located outside of a facility to access the map information/data for a particular facility. For example, the mapping computing entity 110 may be configured to publish the map information/data via the Internet, thereby enabling computing entities (e.g., user computing entities) located outside of the facility to access one or more portions of the map information/data.
Moreover, as discussed herein, the map information/data may be stored on one or more third party mapping computing entities 110 located geographically remotely from the facility. In such embodiments, a computing system located within the facility may be configured to relay map information/data from the third party mapping computing entities 110 to the autonomous vehicles 100, and/or the autonomous vehicles 100 may be configured to receive the map information/data directly from the third party mapping computing entities 110. In various embodiments, the third party mapping computing entities 110 may be configured to automatically determine the location of the autonomous vehicle 100 (e.g., based on information/data provided by the autonomous vehicle) prior to providing the map information/data to the autonomous vehicle 100. For example, the third-party mapping computing system 110 may be configured to transmit map information/data for a particular facility to an autonomous vehicle 100 upon determining that the autonomous vehicle 100 is located within the facility. In various embodiments, the third party mapping computing entity 110 may comprise map information/data for a plurality of facilities, and may be configured to identify appropriate map information/data to provide to an autonomous vehicle 100 based at least in part on the location of the autonomous vehicle 100.
Moreover, in embodiments in which the map information/data is stored geographically remotely from the facility, the autonomous vehicle 100 may be configured to periodically provide the mapping computing entity 110 with updated information/data indicative of the location of the autonomous vehicle 100 within the facility such that the mapping computing entity 110 may be configured to update the location of the autonomous vehicle 100 within the map information/data. For example, the autonomous vehicle 100 may be configured to receive location information/data from a nearby location device 400 (e.g., location information/data identifying the location device 400 broadcast by the location device 400 to the autonomous vehicle 100 while the autonomous vehicle 100 is located within a communication range corresponding to the location device 400), and to transmit information/data identifying the corresponding location device 400 to the mapping computing entity 110 (e.g., via Wi-Fi, cellular information/data connection, and/or the like). The mapping computing entity 110, upon receipt of the location information/data from the autonomous vehicle 100, may update the determined location of the autonomous vehicle 100 within the facility. In various embodiments, the mapping computing entity 110 may be configured to transmit updated map information/data back to the autonomous vehicle 100 to reflect the updated location of the autonomous vehicle 100 within the facility.
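As a non-limiting illustration, the following Python sketch models the periodic location-update exchange described above: the vehicle reports the identifier of the location device it currently hears, and the (remote) mapping entity updates its record and returns refreshed map data. The MappingEntity and AutonomousVehicle classes, the report_location call, and the coordinate registry are assumptions introduced for the example.

```python
# Sketch of the periodic location-update loop between an autonomous vehicle and
# a remotely hosted mapping entity. All classes and data are assumed.

DEVICE_POSITIONS = {         # assumed device-id -> (x, y, floor) registry
    "05L37": (10.0, 4.0, 5),
    "05L38": (14.0, 4.0, 5),
}


class MappingEntity:
    def __init__(self):
        self.vehicle_positions = {}

    def report_location(self, vehicle_id, device_id):
        """Record the vehicle's position and return updated map information."""
        self.vehicle_positions[vehicle_id] = DEVICE_POSITIONS[device_id]
        return {"devices": DEVICE_POSITIONS, "vehicles": dict(self.vehicle_positions)}


class AutonomousVehicle:
    def __init__(self, vehicle_id, mapper):
        self.vehicle_id = vehicle_id
        self.mapper = mapper
        self.map_data = {}

    def on_beacon_heard(self, device_id):
        # Triggered whenever the vehicle enters a location device's range.
        self.map_data = self.mapper.report_location(self.vehicle_id, device_id)


if __name__ == "__main__":
    mapper = MappingEntity()
    av = AutonomousVehicle("av-7", mapper)
    av.on_beacon_heard("05L37")
    av.on_beacon_heard("05L38")
    print(av.map_data["vehicles"])  # {'av-7': (14.0, 4.0, 5)}
```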
In various embodiments, the autonomous vehicle 100 may be required to have a specific software application installed thereon and configured to parse the received map information/data. However, in certain embodiments, it should be understood that the autonomous vehicle 100 may not be required to have specific software applications installed thereon. In various embodiments, the specific software applications may comprise mapping software. For example, the mapping software may be specific to indoor navigation, may be configured for both outdoor (e.g., GPS-based) navigation and indoor navigation, and/or the like. In certain embodiments, the map information/data may be configured to be viewable via an interface, such as an Internet browser (e.g., the Safari® browser, Google Chrome, Internet Explorer, Firefox, Opera, Netscape Navigator, and/or the like).
The map information/data may comprise information/data usable by the autonomous vehicle 100 to navigate the interior of the facility. Moreover, in certain embodiments in which the autonomous vehicle 100 comprises a graphical display, the autonomous vehicle 100 may be configured to generate a graphical display indicative of an interior map of the facility. For example, the graphical display may comprise a three-dimensional graphical display indicative of distance and altitude within the facility, and/or one or more two-dimensional graphical displays each indicative of a single altitude (e.g., floor) within a facility. An example map display is shown in Fig. 4. As shown in Fig. 4, the map display may identify the location of various location devices 400, the location of an autonomous vehicle 100, the location of various destinations, the address of various locations, and/or the like.
In various embodiments, the location of one or more mobile user computing entities 105 may be monitored within a facility, and the current location of a particular mobile user computing entity 105 may be correlated with a particular interior address. Thus, for example, the location of residents, employees, and/or the like within the building may be monitored and associated with particular indoor addresses based on the location of mobile user computing entities 105 associated with each mobile device user. By monitoring the current location of a particular mobile device user within a facility, a computing entity (e.g., autonomous vehicle 100 and/or mapping computing entity 110) may determine an appropriate route to reach the current location of a particular mobile device user within a facility.
B. Internal Routing
In various embodiments, the one or more internal addresses may be utilized for defining a route between a particular location (e.g., a current location of an autonomous vehicle 100) and a desired interior address. As noted above, the internal addresses may be correlated with map information/data (e.g., comprising a blueprint and/or other internal layout providing information/data indicative of the spatial relationships between various internal locations within the facility). Accordingly, various embodiments may be configured to calculate one or more routes between the current location of an autonomous vehicle 100 and a desired destination address within the facility. In various embodiments, navigational instructions may then be provided to the autonomous vehicle 100 to guide the autonomous vehicle 100 to the desired destination location. As shown in Fig. 7, which illustrates a flow chart of an example method for generating and providing navigational guidance for an autonomous vehicle 100 moving within a facility, the recommended route utilized to generate the navigational instructions may be generated for a particular autonomous vehicle 100 and may be provided to the autonomous vehicle 100 to guide the autonomous vehicle 100 to the desired destination.
1. Generating a Recommended Route
In various embodiments, one or more computing entities (e.g., mapping computing entity 110 and/or onboard controller of an autonomous vehicle 100) may be configured to determine a recommended route between a current location of an autonomous vehicle 100 and a desired destination of the autonomous vehicle 100 (e.g., a delivery location and/or intended recipient for a shipment to be delivered by the autonomous vehicle 100).
In various embodiments, the computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100) may receive information/data indicative of a current location of the autonomous vehicle 100 within a facility, and/or additional information about the facility, such as facility systems (e.g., environmental systems, transport systems, crowd control systems, and/or the like) as shown at Block 601 of Fig. 7. As discussed herein, the location of an autonomous vehicle 100 within a facility may be determined based on the identity of location devices 400 determined to be nearby the autonomous vehicle 100 (e.g., based on an estimated wireless communication link between the autonomous vehicle 100 and one or more location devices 400). Moreover, the computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100) may receive information/data identifying a desired destination for the autonomous vehicle 100, as shown at Block 602 of Fig. 7. In various embodiments, a desired destination address may be identified by a corresponding destination address (e.g., a character string corresponding to the internal address of the destination location); a name associated with a particular location (e.g., John Smith's office; 15th floor conference room; Reception; and/or the like); a mobile user computing entity user's name (e.g., John Smith); and/or the like. Accordingly, the autonomous vehicle 100 may be configured to accept a free-form text input indicative of the destination location (e.g., via user input provided prior to the autonomous vehicle departing for the intended destination), a user selection of one or more listed locations, a scanned destination location (e.g., a delivery destination for a shipment/item), transmitted destination location from a connected shipment/item, transmitted destination location from a mobile user computing entity 105, and/or the like. For example, the autonomous vehicle 100 may receive user input identifying a desired destination location, and the autonomous vehicle 100 may utilize the received user input to identify the location of the desired destination address, and/or the autonomous vehicle 100 may transmit information/data indicative of the desired destination to a mapping computing entity 110 to identify a recommended route. As yet other examples, a connected shipment/item may transmit information/data indicative of an intended consignee/destination for the shipment/item to the computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100), which may utilize the intended consignee/destination as the desired destination.
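As a non-limiting illustration of route generation, the following Python sketch models the facility as a graph whose nodes are location devices 400 (keyed by internal address) and finds a shortest-hop path with breadth-first search. The adjacency data is assumed, and a production implementation could instead weight edges by distance, congestion, transportation mechanisms 170, or other factors.

```python
# Sketch of recommended-route generation over a graph of location devices,
# using breadth-first search for a shortest-hop path. The facility graph is
# an illustrative assumption.
from collections import deque


def recommended_route(adjacency, start, destination):
    """Return a list of internal addresses from `start` to `destination`."""
    if start == destination:
        return [start]
    parents = {start: None}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in adjacency.get(node, []):
            if neighbor not in parents:
                parents[neighbor] = node
                if neighbor == destination:
                    # Walk back through parents to reconstruct the path.
                    path = [destination]
                    while parents[path[-1]] is not None:
                        path.append(parents[path[-1]])
                    return list(reversed(path))
                queue.append(neighbor)
    return []  # no route found within the facility


if __name__ == "__main__":
    FACILITY_GRAPH = {
        "05L37": ["05L38"],
        "05L38": ["05L37", "05L39", "05L42"],
        "05L39": ["05L38", "05L40D12"],
        "05L42": ["05L38"],
        "05L40D12": ["05L39"],
    }
    print(recommended_route(FACILITY_GRAPH, "05L37", "05L40D12"))
    # ['05L37', '05L38', '05L39', '05L40D12']
```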
In various embodiments, a desired destination address within the facility may be identified based on information/data stored within the onboard controller of the autonomous vehicle 100. For example, the autonomous vehicle 100 may be configured to utilize information/data indicative of scheduled tasks, scheduled deliveries of shipments/items, and/or the like occurring at defined locations within the facility to identify a desired destination address. For a particular entry stored within the autonomous vehicle 100, a desired arrival time corresponding to the particular entry and a location corresponding to the entry may be identified (e.g., a desired delivery time and location for a particular shipment). The autonomous vehicle 100 may be configured to compare the desired arrival time for the entry against the current time, and may be configured to identify the location for the entry as the desired destination address if the desired arrival time for the entry is less than a configurable threshold amount of time from the current time. In certain embodiments, the autonomous vehicle 100 may be configured to identify a plurality of possible delivery locations each corresponding to non-overlapping time frames identified relative to the current time. For example, the autonomous vehicle 100 may be configured to identify a first delivery location if the desired arrival time is within a first timeframe relative to the current time (e.g., within 15 minutes of the current time) and a second delivery location if the desired arrival time is within a second timeframe relative to the current time (e.g., between 15 minutes and 30 minutes after the current time). As a non-limiting example, upon identifying information/data stored within the onboard controller of the autonomous vehicle 100 identifying a delivery scheduled to occur less than 15 minutes in the future, the autonomous vehicle 100 may be configured to automatically identify the location of the delivery as a desired destination location.
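The following Python sketch illustrates the threshold-based selection described above, using the 15-minute window from the example. The schedule structure and the next_destination function are assumptions introduced for illustration.

```python
# Sketch of choosing the next destination from scheduled deliveries by comparing
# desired arrival times against the current time, within a configurable window.
from datetime import datetime, timedelta


def next_destination(schedule, now, threshold=timedelta(minutes=15)):
    """Return the internal address of the most imminent delivery within `threshold`."""
    upcoming = [
        entry for entry in schedule
        if timedelta(0) <= entry["arrival"] - now <= threshold
    ]
    if not upcoming:
        return None  # nothing due soon enough to become the active destination
    return min(upcoming, key=lambda e: e["arrival"])["destination"]


if __name__ == "__main__":
    now = datetime(2018, 9, 12, 9, 0)
    schedule = [
        {"destination": "05L37D89", "arrival": now + timedelta(minutes=10)},
        {"destination": "03L12D04", "arrival": now + timedelta(minutes=40)},
    ]
    print(next_destination(schedule, now))  # 05L37D89
```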
Moreover, the computing entity (e.g., mapping computing entity 110 and/or onboard computing entity of the autonomous vehicle 100) may receive information/data indicative of multiple desired destinations and/or one or more waypoints the autonomous vehicle 100 is scheduled to visit prior to arriving at a desired destination. In various embodiments, the computing entity (e.g., mapping computing entity 110 and/or onboard computing entity of the autonomous vehicle 100) may be configured to determine a most efficient order to visit the plurality of desired destinations and/or waypoints based on one or more configurable characteristics, such as the relative locations of the various desired destinations and/or waypoints, deadlines for arriving at the various destinations and/or waypoints, and/or the like. For example, an autonomous vehicle 100 may store information/data indicative of a plurality of shipment/item deliveries for a particular facility, each of which may be identified as a particular waypoint within a facility. In such embodiments, the computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100) may be configured to determine a most efficient route within the facility for delivering the shipments/items, and may generate a route between each of the plurality of shipment/item delivery destinations.
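As a non-limiting illustration of ordering multiple waypoints, the following Python sketch applies a simple nearest-neighbor heuristic over assumed planar coordinates. The disclosure does not prescribe an optimization method, so both the heuristic and the coordinates are assumptions for the example; a production system could also account for delivery deadlines and other configurable characteristics.

```python
# Sketch of ordering several delivery waypoints with a greedy nearest-neighbor
# heuristic over assumed (x, y) coordinates for each internal address.
import math


def nearest_neighbor_order(coords, start):
    """Visit every address in `coords`, always moving to the closest unvisited one."""
    unvisited = set(coords) - {start}
    order = [start]
    while unvisited:
        here = coords[order[-1]]
        nearest = min(unvisited, key=lambda a: math.dist(here, coords[a]))
        order.append(nearest)
        unvisited.remove(nearest)
    return order


if __name__ == "__main__":
    WAYPOINTS = {  # internal address -> assumed (x, y) position in meters
        "DOCK": (0.0, 0.0),
        "05L37D89": (30.0, 12.0),
        "05L42D07": (28.0, 40.0),
        "03L12D04": (5.0, 35.0),
    }
    print(nearest_neighbor_order(WAYPOINTS, start="DOCK"))
```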
As yet other examples, a destination internal location may be identified based on a desired internal delivery location for an item. In various embodiments, the desired internal delivery location for the item may be identified based on user input identifying the desired internal delivery location received by the autonomous vehicle 100, based on computer-readable information/data printed on the item (e.g., a bar code, MaxiCode, QR code, RFID tag, and/or the like) and received by the autonomous vehicle 100 (e.g., via scanning the computer-readable information/data from the item); based on information/data transmitted from the item to a computing entity (e.g., onboard controller of the autonomous vehicle 100, location devices 400, and/or mapping computing entity 110); based on information/data transmitted from a third-party computing entity (e.g., a carrier-operated computing entity); based on information/data stored within the onboard controller of the autonomous vehicle 100, and/or the like.
In such embodiments, a desired internal delivery location may be identified as a specific internal address; as an identifier associated with a specific internal address (e.g., John Smith's office); as an identity of an intended recipient; and/or the like. For example, one or more computing entities (e.g., mapping computing entity 110 and/or autonomous vehicle 100) may be configured to identify an internal location associated with the intended recipient (e.g., based on a static and/or dynamic directory comprising information/data identifying one or more locations corresponding to one or more occupants of a serviceable point). In various embodiments, an internal destination location corresponding to the intended recipient may be a static location (e.g., the intended recipient's cubicle, desk, office, apartment, and/or the like) and/or a dynamic location (e.g., determined based on a monitored current location of an intended recipient's mobile computing entity).
As yet another methodology for identifying desired internal destination locations, various embodiments may monitor the locations of one or more users (e.g., based on the determined location of a mobile computing entity carried by the user), and may determine typical internal locations associated with the user and/or typical times (and/or ranges) at which the user moves to one or more locations. The onboard controller of the autonomous vehicle 100 and/or mapping computing entity 110 may be configured to collect and store location information/data indicative of a corresponding user's location within the facility over time. Based at least in part on the collected and stored location information/data for a particular mobile user computing entity user, the autonomous vehicle 100 and/or mapping computing entity 110 may be configured to identify times at which a particular user is positioned at a particular location within the facility. For example, the computing entity may be configured to determine that a particular mobile user computing entity user is located in a particular office between 8AM and 12PM, in a lunch room between 12PM and 1PM, and in the same office between 1PM and 5PM. Thus, the computing entity may be configured to identify the office as the destination location when the mobile user computing entity user first arrives at the facility at approximately 8AM, the lunch room as the destination location at approximately noon, and the office as the destination location at approximately 1PM. Accordingly, various embodiments may be configured to automatically determine destination locations for one or more users based on historical information/data indicative of typical locations and/or time periods associated with one or more users.
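A minimal sketch of this historical-profile approach follows; the observation tuples and the (weekday, hour) bucketing are hypothetical simplifications of the collected location information/data.

```python
from collections import Counter, defaultdict

# Hypothetical historical observations: (weekday, hour, observed location).
OBSERVATIONS = [
    ("Wed", 9, "Office 512"), ("Wed", 10, "Office 512"),
    ("Wed", 12, "Lunch Room"), ("Wed", 14, "Office 512"),
]

def build_location_profile(observations):
    """Keep the most frequently observed location for each (weekday, hour)."""
    buckets = defaultdict(Counter)
    for weekday, hour, location in observations:
        buckets[(weekday, hour)][location] += 1
    return {key: counts.most_common(1)[0][0] for key, counts in buckets.items()}

PROFILE = build_location_profile(OBSERVATIONS)

def predict_destination(weekday, hour):
    """Return the historically typical location for this time, if any."""
    return PROFILE.get((weekday, hour))

# At noon on a Wednesday the lunch room would be used as the destination.
print(predict_destination("Wed", 12))
```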
In various embodiments, a computing entity (e.g., mapping computing entity 110 and/or onboard controller of an autonomous vehicle 100) may be configured to identify a desired destination location based on information/data identifying a mobile user computing entity user, a title, a service, and/or the like. For example, the autonomous vehicle 100 may receive data identifying "John Smith" as a desired destination location. As yet other examples, the autonomous vehicle 100 may receive data identifying "television salesman," "masseuse," "parts department," "IT support," and/or the like as a desired destination. Based on the provided input identifying a particular desired destination, the computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100) may be configured to determine whether the desired destination location is associated with a particular stationary location (e.g., a location associated with a corresponding stationary location device 400), a mobile location (e.g., a mobile user computing entity 105 identifying the current location of a particular mobile device user), and/or the like. For example, identifying a mobile device user (e.g., John Smith) as a destination location may cause a computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100) to determine whether the corresponding destination is a stationary location (e.g., John Smith's office) or a mobile destination (e.g., the current location of John Smith's mobile computing entity). In various embodiments, the determination of whether a particular destination location is a stationary location or a mobile location may be determined based on user preferences (e.g., the preferences of the mobile device user associated with a prospective destination). For example, a mobile device user may provide user input indicative of whether the mobile device user's office or the mobile device user's current location should be utilized as a destination location.
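The stationary-versus-mobile resolution described above might be sketched as follows; the directory contents, preference flag, and live-location table are hypothetical stand-ins for the information/data sources the disclosure contemplates.

```python
# Minimal sketch: resolve a name or title to either a stationary internal
# address or the live location of the person's mobile computing entity,
# honoring a stored preference. All entries are hypothetical.
DIRECTORY = {
    "john smith": {"stationary": "Office 512", "prefers_mobile": True},
    "it support": {"stationary": "Help Desk, Floor 2", "prefers_mobile": False},
}

# Most recent locations reported by mobile user computing entities.
LIVE_LOCATIONS = {"john smith": "Conference Room 201"}

def resolve_destination(query):
    """Return the destination for a name/title query, or None if unknown."""
    key = query.strip().lower()
    entry = DIRECTORY.get(key)
    if entry is None:
        return None
    if entry["prefers_mobile"] and key in LIVE_LOCATIONS:
        return LIVE_LOCATIONS[key]    # mobile destination
    return entry["stationary"]        # stationary destination

print(resolve_destination("John Smith"))   # current mobile location
print(resolve_destination("IT support"))   # stationary help desk
```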
In various embodiments, other facility occupants and/or individuals may be enabled to designate a desired destination location for an autonomous vehicle 100. For example, facility occupants may request the presence of an autonomous vehicle 100 to pick up a package to be shipped via one or more carriers. In such embodiments, the user (e.g., a facility occupant) requesting the presence of the autonomous vehicle 100 may be enabled to designate a desired internal destination location (e.g., based on user input) and/or the desired internal destination location may be automatically indicated to be the current location of the user requesting the autonomous vehicle's presence.
Moreover, in various embodiments, the computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100) may be configured to collect and/or store historical information/data indicative of an average amount of time to travel through various portions of a facility at various times. The computing entity may be configured to monitor the movement of one or more autonomous vehicles 100 within the facility (e.g., by monitoring which location devices 400 are in communication with various autonomous vehicles 100 and the amount of time that the location devices 400 are in communication with the autonomous vehicles 100; and/or by receiving information/data from the autonomous vehicles 100 indicative of their movement throughout the facility). The computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100) may be configured to store the historical information/data collected as a result of the monitoring of the autonomous vehicle 100 movement and may determine, based on the historical data, an average amount of time to move through various portions of the facility at one or more times. For example, the mapping computing entity 110 may be configured to generate information/data indicative of an average amount of time for an autonomous vehicle 100 to move from John Smith's office to a loading dock at 5PM on Wednesdays. In various embodiments, the mapping computing entity 110 may be configured to monitor the amount of time to move between various areas of a facility along a plurality of routes. Accordingly, the mapping computing entity 110 may be configured to identify a fastest and/or shortest route for an autonomous vehicle 100 to travel between points within a facility. For example, the mapping computing entity 110 may be configured to utilize the historical information/data to select a fastest (e.g., least travel time) route between a current location of an autonomous vehicle 100 and a desired destination.
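The following sketch illustrates one way historical average traversal times could feed a fastest-route selection, here with a standard Dijkstra search over segments between facility landmarks; the segment graph and timings are hypothetical rather than measured values.

```python
import heapq

# Hypothetical average seconds to traverse each segment during a given hour,
# e.g., as derived from monitored location-device handoffs.
TRAVEL_TIMES = {
    ("Dock", "Hall A"): {17: 40}, ("Hall A", "Elevator"): {17: 25},
    ("Dock", "Hall B"): {17: 30}, ("Hall B", "Elevator"): {17: 60},
    ("Elevator", "Office 512"): {17: 90},
}

def fastest_route(start, goal, hour):
    """Dijkstra over segments weighted by the hour's historical average time."""
    graph = {}
    for (a, b), by_hour in TRAVEL_TIMES.items():
        graph.setdefault(a, []).append((b, by_hour[hour]))
    queue, settled = [(0, start, [start])], {}
    while queue:
        elapsed, node, path = heapq.heappop(queue)
        if node == goal:
            return elapsed, path
        if node in settled and settled[node] <= elapsed:
            continue
        settled[node] = elapsed
        for neighbor, seconds in graph.get(node, []):
            heapq.heappush(queue, (elapsed + seconds, neighbor, path + [neighbor]))
    return None

# At 5 PM the Hall A leg (65 s to the elevator) beats the Hall B leg (90 s).
print(fastest_route("Dock", "Office 512", hour=17))
```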
The historical information/data may be generated based on monitoring various autonomous vehicles 100 moving throughout the facility. The mapping computing entity 110 may be configured to collect information/data indicative of various autonomous vehicles 100 traveling throughout the facility over time, and to generate information/data indicative of an average travel time between various locations within the facility at various times (e.g., 2:00 PM on Wednesday).
Based at least in part on the current location of an autonomous vehicle 100, a desired destination location, and/or the historical data, the computing entity (e.g., mapping computing entity 110 and/or onboard controller for the autonomous vehicle 100) may be configured to generate a recommended route between the current location of the autonomous vehicle 100 and the desired destination location as indicated at Block 603 of Fig. 7. In various embodiments, the recommended route may be dynamically determined, such that the recommended route may change if the autonomous vehicle 100 is determined to move off of the recommended route or if the desired destination location is changed.
2. Directing an Autonomous Vehicle
Upon generating a recommended route for an autonomous vehicle 100, the computing entity (e.g., onboard controller of the autonomous vehicle 100 and/or mapping computing entity 110) may direct the autonomous vehicle 100 along the recommended route toward the desired destination location, as shown in Block 604 of Fig. 7. The location of the autonomous vehicle 100 may be monitored (as shown at Block 605) to ensure the provided guidance remains accurate. For example, the onboard controller of the autonomous vehicle 100 may periodically (e.g., every 1 second, every 0.5 seconds, every 10 milliseconds, and/or the like) compare a determined location of the autonomous vehicle 100 and a determined heading of the autonomous vehicle 100 against a desired location of the autonomous vehicle 100 and a desired heading of the autonomous vehicle 100 to ensure the autonomous vehicle is travelling along an appropriate travel path to the desired destination. As discussed herein, the onboard controller of the autonomous vehicle 100 may additionally monitor the travel path along the determined heading of the autonomous vehicle 100 for one or more obstacles (e.g., people, objects, and/or the like) that pose a collision risk for the autonomous vehicle 100, such that the onboard controller may adjust the heading of the autonomous vehicle 100 to avoid the detected collision risks.
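As a non-limiting sketch of the periodic comparison described above, the following routine computes a heading correction from the determined position and heading toward the next desired position; the tolerances and coordinate conventions are hypothetical.

```python
import math

# Compare the determined position/heading against the desired position and
# return a signed heading correction in degrees. Tolerances are hypothetical.
POSITION_TOLERANCE_FT = 1.0
HEADING_TOLERANCE_DEG = 5.0

def heading_correction(determined_pos, determined_heading_deg, desired_pos):
    """Return the heading adjustment toward desired_pos, or 0.0 if on target."""
    dx = desired_pos[0] - determined_pos[0]
    dy = desired_pos[1] - determined_pos[1]
    if math.hypot(dx, dy) <= POSITION_TOLERANCE_FT:
        return 0.0
    desired_heading = math.degrees(math.atan2(dy, dx)) % 360
    error = (desired_heading - determined_heading_deg + 180) % 360 - 180
    return error if abs(error) > HEADING_TOLERANCE_DEG else 0.0

# Vehicle at the origin heading due east (0 degrees); the next route point is
# to the north-east, so the controller would steer about +45 degrees.
print(heading_correction((0.0, 0.0), 0.0, (10.0, 10.0)))
```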
The navigational instructions may be provided to and/or generated by the onboard controller of the autonomous vehicle 100, may be provided via one or more location devices 400, and/or the like. The following subsections provide example configurations for providing navigational instructions to autonomous vehicles 100, and should not be construed as limiting.
a. Navigational Guidance Provided by Facility-Specific Mapping Computing Entity
In various embodiments, a facility-specific mapping computing entity 110 may be configured to provide navigational guidance for autonomous vehicles 100. For example, a mapping computing entity 110 configured to provide mapping services for a particular facility (e.g., storing map information/data for the particular facility and being in direct communication with one or more facility-specific computing entities) may be configured to provide navigational guidance to autonomous vehicles 100 upon generation of a recommended route for the autonomous vehicle 100.
In such embodiments, the facility-specific mapping computing entity 110 may be configured to generate a recommended route for the autonomous vehicle 100 upon receipt of information/data identifying a current location of the autonomous vehicle 100 and a desired destination for the autonomous vehicle 100 (for example, according to the methodology discussed herein). The facility-specific mapping computing entity 110 may be configured to be in direct communication with the autonomous vehicle 100, in order to receive information/data indicative of the current location of the autonomous vehicle 100 and/or information/data indicative of the desired destination of the autonomous vehicle 100; or the facility-specific mapping computing entity 110 may be configured to be in communication with the autonomous vehicle 100 via one or more relays (e.g., location devices 400).
In certain embodiments, the autonomous vehicle 100 may be configured to self-determine its location within the facility, and to provide information/data indicative of its current location to the facility-specific mapping computing entity 110 together with information/data identifying the autonomous vehicle 100. For example, the autonomous vehicle 100 may be configured to receive transmitted information/data from a nearby location device 400 providing information/data indicative of the location of the location device 400. The autonomous vehicle 100 may be configured to determine that the location information/data received from the nearby location device 400 is indicative of the current location of the autonomous vehicle 100. As yet another example, the autonomous vehicle 100 may be configured to self-determine its location relative to one or more known landmarks detected by one or more onboard sensors (e.g., artwork identified via image recognition, room name labels recognized via OCR (e.g., identified based on a room label such as "CONFERENCE ROOM 201"), and/or the like). The autonomous vehicle 100 may be configured to determine its own location using any of a variety of other technologies (e.g., triangulation based on signals received from a plurality of information/data communication devices and/or location devices 400, GPS, and/or the like).
In various embodiments, the location of the autonomous vehicle 100 within a facility need not be identified with sufficient precision to guide the autonomous vehicle 100 along a travel path and/or around various known obstacles. Instead, as discussed in greater detail herein, the location of the autonomous vehicle 100 may be determined and relayed to a mapping computing entity 110 with sufficient precision to identify a generic route to a determined destination. For example, the location of the autonomous vehicle 100 may be determined with a location tolerance of 1 foot, 2 feet, 5 feet, 10 feet, and/or the like. As discussed herein, the onboard controller of the autonomous vehicle 100 may utilize provided navigational guidance received from the mapping computing entity 110 together with input received from the various onboard sensors 127 (e.g., LIDAR, proximity sensors, cameras, and/or the like) to make precise determinations of various movements for the autonomous vehicle 100. As a specific example, the determined location of the autonomous vehicle 100 as identified by the mapping computing entity 110 may indicate that the autonomous vehicle 100 is located at a desired destination, within a 5 foot tolerance diameter. The autonomous vehicle may utilize input received from the various onboard sensors to guide the autonomous vehicle while it continues to move to the precise location of the desired destination to avoid potential collisions with detected objects.
The autonomous vehicle 100 may be configured to transmit information/data to the facility-specific mapping computing entity 110 indicative of the current location of the autonomous vehicle 100 (e.g., on an information/data transmission channel different from that between the location device 400 and the autonomous vehicle 100 and/or at a different transmission frequency). The autonomous vehicle 100 may be configured to transmit information/data indicative of a desired destination location to the facility-specific mapping computing entity 110 together with the information/data identifying the current location of the autonomous vehicle 100 and/or in a separate information/data transmission.
In various embodiments, the autonomous vehicle 100 may transmit information/data indicative of a desired destination to the facility-specific mapping computing entity 110 via one or more location devices 400 configured to indicate the current location of the autonomous vehicle 100 based on the known location of the transmitting location device 400. In such embodiments, the autonomous vehicle 100 may be configured to transmit information/data indicative of a desired destination and information/data indicative of the identity of the autonomous vehicle 100 to a proximate location device 400. The location device 400 may relay at least a portion of the received information/data to the facility-specific mapping computing entity 110 and may additionally transmit the current location of the autonomous vehicle 100 to the mapping computing entity 110.
As discussed herein, upon receipt of information/data identifying the desired destination location, the facility-specific mapping computing entity 110 may identify a recommended route to the desired destination location (e.g., based on current time, distance, and/or the like). The mapping computing entity 110 may additionally generate navigational instructions (e.g., route-based instructions, such as indications regarding when to turn, how far to travel, and/or the like) and may provide the navigational instructions to the autonomous vehicle 100.
In certain embodiments, the mapping computing entity 110 may transmit (e.g., directly and/or through one or more relays) map information/data together with information/data identifying the recommended route to the autonomous vehicle 100, such that the onboard controller of the autonomous vehicle 100 is enabled to operate the locomotion mechanisms of the autonomous vehicle 100 to move the vehicle along the recommended route, while utilizing input received from various onboard sensors 127 (e.g., LIDAR, proximity sensors, cameras, and/or the like) to avoid obstacles detected while the autonomous vehicle 100 moves along the recommended route. Moreover, as discussed above, the onboard controller of the autonomous vehicle 100 is configured to supplement the navigational guidance data received from the mapping computing entity 110 with the data received from the various onboard sensors. For example, the data received from the onboard sensors may be utilized to address known precision tolerances in the location data received from the mapping computing entity 110. The onboard sensors may thus be utilized to ensure the autonomous vehicle 100 turns at appropriate locations (e.g., at corners of hallways), fully boards various transportation mechanisms (e.g., elevators), fully approaches identified destination locations (e.g., a specific portion of a desk at a destination location), and/or the like.
In various embodiments, the mapping computing entity 110 may transmit signals to one or more location devices 400 to cause the location devices 400 to emit autonomous vehicle-detectable navigational cues (e.g., light, sound, displayed information, and/or the like) indicative of the navigational instructions for the autonomous vehicle 100. For example, as shown in the example schematic view of Figures 8-9 and 11, the one or more location devices 400 located along a recommended travel path may be illuminated such that an autonomous vehicle 100 may detect the signals (e.g., light signals) emitted from the location devices 400. The autonomous vehicles may utilize the illuminated location devices 400 as indicative of the recommended route, while the autonomous vehicle 100 utilizes additional sensor data generated by the one or more onboard sensors 127 to provide highly precise maneuvering determinations to move along the recommended route while avoiding obstacles. In various embodiments, the navigational cues are detectable by one or more sensors 127 onboard the autonomous vehicles 100 (e.g., lights are detected by a light sensor and/or a camera; sounds are detected by a microphone, infrared lights are detected by an infrared sensor, displayed images are detected by a camera and are interpreted by one or more algorithms operable on the onboard controller, and/or the like). The navigational cues may be indicative of a determined recommended route for the autonomous vehicle 100, and accordingly the autonomous vehicle 100 may rely on the navigational cues as "breadcrumbs" that serve to indicate the recommended route. For example, the autonomous vehicle 100 may be configured to detect a first navigational cue emitted from a first location device 400, and may immediately begin monitoring for a second navigational cue emitted from a second location device 400 farther along the recommended route. In various embodiments, the autonomous vehicle 100 may be configured to determine an orientation of the autonomous vehicle 100 within the facility (e.g., based on a known direction of travel relative to a location device, based on an internal compass, and/or the like) such that the autonomous vehicle 100 is enabled to distinguish between location devices 400 located between the autonomous vehicle 100 and the intended destination, and location devices 400 located between the autonomous vehicle 100 and the starting point of the recommended route (i.e., such that the autonomous vehicle 100 is enabled to identify location devices farther along the recommended route). Moreover, the autonomous vehicle 100 may be configured to move toward the detected navigational cues upon detection of the navigational cues. However, in certain embodiments, the autonomous vehicle 100 may be configured to move along the recommended route and to utilize the navigational cues to verify that the autonomous vehicle 100 is moving along the correct recommended route. In such embodiments, the autonomous vehicle 100 may be configured to anticipate the location of a subsequent navigational cue based at least in part on the location of a currently detected navigational cue, a previously detected navigational cue, and/or detected environmental characteristics surrounding the autonomous vehicle 100. 
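The "breadcrumb" behavior and orientation check described above could be sketched as follows; the cue positions and the forward-cone threshold are hypothetical, and real detections would come from the onboard sensors 127 rather than a lookup table.

```python
import math

# Among detected navigational cues, pick the nearest cue lying ahead of the
# vehicle's heading (i.e., toward the destination) rather than behind it.
FORWARD_CONE_DEG = 90.0  # cues within +/-90 degrees of the heading count as ahead

def next_cue(vehicle_pos, vehicle_heading_deg, detected_cues):
    """Return the identifier of the nearest cue ahead of the vehicle, if any."""
    ahead = []
    for cue_id, (x, y) in detected_cues.items():
        bearing = math.degrees(math.atan2(y - vehicle_pos[1], x - vehicle_pos[0]))
        relative = (bearing - vehicle_heading_deg + 180) % 360 - 180
        if abs(relative) <= FORWARD_CONE_DEG:
            ahead.append((math.dist(vehicle_pos, (x, y)), cue_id))
    return min(ahead)[1] if ahead else None

# The vehicle faces east; the cue behind it is ignored and the cue ahead wins.
cues = {"LD-ahead": (12.0, 1.0), "LD-behind": (-6.0, 0.0)}
print(next_cue((0.0, 0.0), 0.0, cues))
```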
In certain embodiments, the navigational cues may be directional, thereby indicating the location of the recommended travel path for the autonomous vehicle (e.g., the navigational cues may be arrows and/or text displayed on the location devices 400 that are indicative of the location of a subsequent location device 400 along the recommended travel path). However, it should be understood that the navigational cues may not be directional, and accordingly the autonomous vehicle 100 may be configured to anticipate the location of a subsequent navigational cue based on various characteristics of the environment surrounding the autonomous vehicle 100. For example, the autonomous vehicle 100 may be travelling along an elongated hallway and may detect a navigational cue emitted by a location device 400 located proximate a midpoint of the hallway; rather than waiting until the next navigational cue is detected, the autonomous vehicle 100 may be configured to anticipate the location of the next navigational cue as being farther along the hallway, in the same direction that the autonomous vehicle had been previously traveling. Thus, the autonomous vehicle 100 may be configured to move smoothly along the recommended route, making adjustments to the route upon determining that an anticipated direction of movement is incorrect (as determined when no navigational cues are detected). In such embodiments, the autonomous vehicle 100 need not store navigational data locally on the onboard controller, because the autonomous vehicle 100 is guided based on the locations of the various navigational cues emitted by the location devices 400. However, in certain embodiments, the autonomous vehicle 100 may have navigational data indicative of the recommended route stored locally on the onboard controller, and accordingly the navigational cues are utilized by the autonomous vehicle 100 to verify that the autonomous vehicle 100 is travelling along the recommended travel path.
When transmitting signals to the one or more location devices, the mapping computing entity 110 may transmit signals to one or more location devices 400 satisfying configurable characteristics. For example, the mapping computing entity 110 may transmit signals to location devices 400 within a configurable distance of the autonomous vehicle 100 (e.g., 20 feet) and along the recommended route; to all location devices 400 along the recommended route (e.g., all location devices 400 that are likely to be in communication with the autonomous vehicle 100 for at least a period of time while the autonomous vehicle 100 travels to the desired destination); to a configurable number of location devices 400 located along the recommended route and substantially adjacent the current location of the autonomous vehicle 100 (e.g., the nearest 5 location devices 400 along the recommended route); to location devices 400 of a particular type (e.g., light fixtures) along at least a portion of the recommended route; and/or the like.
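The configurable selection rules listed above could be combined in a single filter, as in the sketch below; the device records, distances, and device types are hypothetical.

```python
import math

# Location devices on the recommended route, in route order; all hypothetical.
ROUTE_DEVICES = [
    {"id": "LD-1", "pos": (5, 0),  "type": "light_fixture"},
    {"id": "LD-2", "pos": (25, 0), "type": "speaker"},
    {"id": "LD-3", "pos": (45, 0), "type": "light_fixture"},
]

def devices_to_signal(vehicle_pos, max_distance_ft=None, nearest_n=None,
                      device_type=None):
    """Filter the on-route devices by the configured characteristics."""
    candidates = ROUTE_DEVICES
    if device_type is not None:
        candidates = [d for d in candidates if d["type"] == device_type]
    if max_distance_ft is not None:
        candidates = [d for d in candidates
                      if math.dist(vehicle_pos, d["pos"]) <= max_distance_ft]
    if nearest_n is not None:
        candidates = sorted(candidates,
                            key=lambda d: math.dist(vehicle_pos, d["pos"]))[:nearest_n]
    return [d["id"] for d in candidates]

print(devices_to_signal((0, 0), max_distance_ft=30))           # LD-1, LD-2
print(devices_to_signal((0, 0), device_type="light_fixture"))  # LD-1, LD-3
```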
In various embodiments, the location devices 400 receiving signals to generate navigational cues may be identified based at least in part on the identified current location of an autonomous vehicle 100. In such embodiments, as the autonomous vehicle 100 moves within the facility, location devices 400 identified to generate navigational cues may be dynamically selected such that location devices 400 near a monitored current location of an autonomous vehicle 100 are utilized to provide navigational cues to the autonomous vehicle 100.
As yet another example, at least a portion of the location devices 400 located along the recommended route (e.g., all location devices 400 located along the recommended route) may receive signals that cause the location devices 400 to emit navigational cues when an autonomous vehicle 100 is detected to be proximate the location device 400. For example, the autonomous vehicle 100 may broadcast a unique wireless communication (e.g., via radio transmission, light transmission, and/or sound transmission) that is detectable by proximate location devices 400. Once the location devices 400 receive the unique transmission, the location devices 400 may be configured to emit navigational cues to the autonomous vehicle 100.
As a specific example, when an autonomous vehicle 100 is first initialized within a facility, a location device 400 proximate the autonomous vehicle 100 may detect the presence of the autonomous vehicle 100 as discussed herein (e.g., the location device 400 may establish wireless communication with the autonomous vehicle 100 while the autonomous vehicle 100 is within a communication range associated with the location device 400). The location device 400 may transmit information/data to a mapping computing entity 110 identifying itself (e.g., the identity and/or location of the location device 400) and identifying the autonomous vehicle 100. As discussed herein, the location device 400 may also act as a relay to transmit information/data identifying a desired destination from the autonomous vehicle 100 to the mapping computing entity 110. As the autonomous vehicle 100 moves within the facility (e.g., along a recommended route generated as discussed herein), the location device 400 may hand off communications with the autonomous vehicle 100 to a second location device 400. Once the autonomous vehicle 100 is in communication with the second location device 400, the second location device 400 may transmit information/data identifying itself and information/data identifying the autonomous vehicle 100 to the mapping computing entity 110. Accordingly, the mapping computing entity 110 may thereby be configured to monitor the location of the autonomous vehicle 100 within the facility. In those embodiments in which the mapping computing entity 110 selects location devices 400 to emit navigational cues based on the determined location of the autonomous vehicle 100, the mapping computing entity 110 may transmit signals to one or more location devices 400 causing those location devices 400 to emit navigational cues based on the determined location of the autonomous vehicle 100. For example, the mapping computing entity 110 may identify those location devices 400 located along the recommended route (after the recommended route is determined), and/or meeting one or more criteria based on the determined location of the autonomous vehicle 100.
b. Navigational Instructions Generated by the Autonomous Vehicle
In various embodiments, the autonomous vehicle 100 may be configured to self-generate navigational instructions to guide the autonomous vehicle 100 to a desired destination. In such embodiments, the autonomous vehicle 100 may have map data stored locally on the onboard controller of the autonomous vehicle 100, and the autonomous vehicle 100 may thereby utilize the stored map data to self-generate a recommended route upon receipt of information/data identifying a current location of the autonomous vehicle 100 and a desired destination for the autonomous vehicle 100.
In certain embodiments, the autonomous vehicle 100 may be configured to self-determine its location within the facility such that the onboard controller of the autonomous vehicle 100 can determine a recommended route to a desired destination location. For example, the autonomous vehicle 100 may be configured to receive transmitted information/data from a nearby location device 400 providing information/data indicative of the location of the location device 400. The autonomous vehicle 100 may be configured to determine that the location information/data received from the nearby location device 400 is indicative of the current location of the autonomous vehicle 100 and may correlate the location information/data received from the nearby location device 400 with map information/data to identify a current location of the autonomous vehicle 100 within the facility. As yet another example, the autonomous vehicle 100 may be configured to determine its own location using any of a variety of other technologies (e.g., triangulation based on signals received from a plurality of information/data communication devices, GPS, detection of known landmarks within the facility (e.g., known artwork) by various onboard sensors of the autonomous vehicle 100, and/or the like).
As yet another example, one or more location devices 400 may detect the presence of the autonomous vehicle 100 as being within a communication range. For example, the autonomous vehicle 100 may broadcast information/data indicative of its identity to nearby location devices 400. The location devices 400 may transmit information/data back to the autonomous vehicle 100 directly and/or indirectly indicative of the current location of the autonomous vehicle 100 within the facility. The location devices 400 may, in certain embodiments, transmit information/data indicative of the identity of the location device 400 and the location of the autonomous vehicle 100 to the mapping computing entity 110, which may then transmit information/data back to the autonomous vehicle 100 indicative of the current location of the autonomous vehicle 100.
As discussed herein, upon receipt of information/data identifying the desired destination location, the autonomous vehicle 100 may be configured to execute the generated navigational instructions by moving through the facility along the recommended route. As discussed herein, the autonomous vehicle 100 may utilize the generated navigational instructions along with data received from the various onboard sensors to maneuver the autonomous vehicle 100 along the recommended route while avoiding collisions with detected persons, objects, and/or the like detected by the various onboard sensors.
In various embodiments, the mapping computing entity 110 may transmit signals to one or more location devices 400 to cause the location devices 400 to emit autonomous vehicle-detectable navigational cues (e.g., light, sound, displayed information, and/or the like) indicative of the navigational instructions for the autonomous vehicle 100. For example, the autonomous vehicle 100 may be configured to transmit a signal to a location device 400 within a communication range of the autonomous vehicle 100, the signal being configured to cause one or more location devices 400 to emit navigational cues for the autonomous vehicle 100. The location device 400 receiving the signal from the autonomous vehicle 100 may be configured to relay the signal to a mapping computing entity 110, which may be configured to transmit signals to one or more location devices 400 to emit navigational cues for the autonomous vehicle 100, as described herein. For example, the mapping computing entity 110 may transmit signals to one or more location devices 400 satisfying configurable characteristics. For example, the mapping computing entity 110 may transmit signals to location devices 400 within a configurable distance of the autonomous vehicle 100 (e.g., 20 feet) and along the recommended route; to all location devices 400 along the recommended route (e.g., all location devices 400 that are likely to be in communication with the autonomous vehicle 100 for at least a period of time while the autonomous vehicle 100 travels to the desired destination); to a configurable number of location devices 400 located along the recommended route and substantially adjacent the current location of the autonomous vehicle 100 (e.g., the nearest 5 location devices 400 along the recommended route); to location devices 400 of a particular type (e.g., light fixtures) along at least a portion of the recommended route; and/or the like. As discussed herein, the autonomous vehicle 100 may be configured to utilize the navigational cues as breadcrumbs identifying the recommended route.
In various embodiments, the location devices 400 receiving signals to generate navigational cues may be identified based at least in part on the identified current location of an autonomous vehicle 100. In such embodiments, as the autonomous vehicle 100 moves within the facility, location devices 400 identified to generate navigational cues may be dynamically selected such that location devices 400 near a monitored current location of an autonomous vehicle 100 are utilized to provide navigational cues to the autonomous vehicle 100.
As yet another example, the autonomous vehicle 100 may be configured to transmit a signal to a location device 400 located within a communication range of the autonomous vehicle 100. The receiving location device 400 may be configured to transmit the received signal to one or more additional location devices 400 (e.g., via parallel transmissions from the receiving location device 400 to a plurality of additional location devices 400 and/or via a series of transmissions from the receiving location device 400 to a second location device 400, which then transmits the signal to a third location device 400, and/or the like). The receiving location device 400 and/or one or more of the additional location devices 400 may thereby receive the transmitted signal and may provide navigational cues for the autonomous vehicle 100. In various embodiments, the autonomous vehicle 100 may be configured to transmit a new signal each time a new location device 400 is within range of the autonomous vehicle 100. Accordingly, the identity, number, and/or location of location devices 400 emitting navigational cues may be updated each time the autonomous vehicle 100 moves to connect to a new location device 400 within the facility.
As a specific example, when an autonomous vehicle 100 is first initialized, the autonomous vehicle 100 may receive a broadcast signal from a nearby location device 400. The autonomous vehicle 100, which may have map information/data for the facility stored thereon, may compare the received signal from the location device 400 against the map information/data to identify its location within the facility. Upon receipt of information/data identifying a desired destination location, the autonomous vehicle 100 may generate a recommended route within the facility to the desired destination location and may begin executing the recommended route by moving through the facility. As the autonomous vehicle 100 moves within the facility, the autonomous vehicle 100 may continuously receive new broadcast location information/data from new nearby location devices 400, and accordingly the autonomous vehicle 100 may update the determined current location of the autonomous vehicle 100 within the facility. The autonomous vehicle 100 may be configured to update the recommended route traveled by the autonomous vehicle 100 based on the determined location of the autonomous vehicle 100 within the facility.
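A minimal sketch of this re-localize-and-re-plan loop follows; the stored map, the breadth-first route generator, and the stream of broadcast device identifiers are hypothetical simplifications of the disclosed behavior.

```python
# Adjacency of location devices on the locally stored facility map (hypothetical).
FACILITY_MAP = {
    "LD-1": ["LD-2"], "LD-2": ["LD-1", "LD-3", "LD-5"],
    "LD-3": ["LD-2", "LD-4"], "LD-4": ["LD-3", "LD-5"],
    "LD-5": ["LD-2", "LD-4"],
}

def plan_route(start, goal):
    """Breadth-first route over the stored map (placeholder route generator)."""
    frontier, seen = [[start]], {start}
    while frontier:
        path = frontier.pop(0)
        if path[-1] == goal:
            return path
        for nxt in FACILITY_MAP.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None

def follow(goal, broadcast_stream):
    """Re-localize on each broadcast and re-plan whenever the vehicle strays."""
    route = None
    for current_device in broadcast_stream:  # each item: nearest device heard
        if route is None or current_device not in route:
            route = plan_route(current_device, goal)
        yield current_device, route

# The vehicle drifts to LD-5, which is off the planned route, so the remaining
# route to LD-4 is regenerated from there.
for update in follow("LD-4", ["LD-1", "LD-5"]):
    print(update)
```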
In various embodiments, the autonomous vehicle 100 may be configured to transmit signals to one or more location devices 400 causing the location devices 400 to provide navigational cues for the autonomous vehicle 100. Accordingly, as the autonomous vehicle 100 passes each location device 400 while travelling toward a desired destination location, the autonomous vehicle 100 may emit signals to each passed location device 400 causing one or more location devices 400 to emit navigational cues.
c. Navigational Instructions Provided by Third-Party Mapping Computing Entity
In various embodiments, a third-party mapping computing entity 110 may be configured to generate and provide navigational instructions to an autonomous vehicle 100. In such embodiments, one or more additional computing entities (e.g., onboard controller of the autonomous vehicle 100 and/or facility-specific mapping computing entity 110) may be configured to transmit information/data indicative of the identity of the facility, the identity of the autonomous vehicle 100, the current location of the autonomous vehicle 100 within the facility, and/or a desired destination location for the autonomous vehicle 100 to the third-party mapping computing entity 110. Upon receipt of information/data identifying the facility, the autonomous vehicle 100, the current location of the autonomous vehicle 100, and the desired destination location for the autonomous vehicle 100, the third-party mapping computing entity 110 may be configured to generate a recommended route to the desired destination location for the autonomous vehicle 100. The third-party mapping computing entity 110 may be configured to transmit information/data indicative of the recommended route (and/or map data) to the autonomous vehicle 100 and/or the facility-specific mapping computing entity 110 to provide navigational instructions to the autonomous vehicle 100, as discussed herein. For example, the autonomous vehicle 100 may receive the information/data indicative of the recommended route from the third-party mapping computing entity 110 (e.g., via a direct information/data transmission from the third-party mapping computing entity 110 and/or via an indirect information/data transmission from the third-party mapping computing entity 110 and through the facility-specific mapping computing entity 110 and/or one or more location devices 400). As yet another example, the autonomous vehicle 100 and/or the facility-specific mapping computing entity 110 may receive information/data indicative of the recommended route from the third-party mapping computing entity 110 and may transmit signals to one or more location devices 400 to cause the location devices 400 to emit navigational cues for the autonomous vehicle 100.
d. Providing Navigational Instructions via Location Devices
As discussed herein, one or more location devices 400 may be configured to provide navigational cues to an autonomous vehicle 100 along a recommended route to a desired destination within a facility. In various embodiments, the recommended route may be generated by a facility-specific mapping computing entity 110, a third-party mapping computing entity 110, an onboard controller of the autonomous vehicle 100, and/or the like. For example, location devices 400 located along at least a portion of a recommended route (e.g., location devices 400 that are likely to be in communication with the autonomous vehicle 100 as the autonomous vehicle 100 moves along the calculated route) between a determined current location of an autonomous vehicle 100 and a desired destination may be configured to emit navigational cues (e.g., light, sound, and/or the like) to direct a particular autonomous vehicle 100 to the desired destination.
For example, various location devices 400 may be configured to emit one or more colored lights to indicate a recommended route to the destination internal address (e.g., as shown in Figs. 8-9 and 11). For example, location devices 400 along a recommended route may illuminate in a first color (e.g., green) to indicate the recommended route to the autonomous vehicle 100. In various embodiments, the illuminated color may be generic and may apply to all autonomous vehicles 100, or it may be selected for a particular autonomous vehicle 100 (e.g., a recommended route for a first autonomous vehicle 100 is illuminated with purple location devices 400 and the recommended route for a second autonomous vehicle 100 is illuminated with blue location devices 400). As discussed herein, the autonomous vehicles 100 may be configured to detect the illuminated colors of the navigational cues, and may utilize the illuminated colors as virtual breadcrumbs indicating the recommended route. Accordingly, as discussed herein, the autonomous vehicle 100 may be configured to continuously detect the illuminated colors of the location devices 400 as the autonomous vehicle 100 moves along the recommended route.
In various embodiments, one or more location devices 400 may be configured to provide an audible instruction (e.g., a beep and/or a spoken instruction) detectable by the autonomous vehicle 100 as the autonomous vehicle 100 moves along the determined route. The audible instruction need not be audible to a human ear (e.g., the instruction may be emitted at higher and/or lower frequencies than detectable by the human ear), so long as the audible instruction is detectable by sensors onboard the autonomous vehicle 100. Moreover, in various embodiments, the location device 400 corresponding to the desired destination internal address may be configured to emit one or more signals (e.g., a second colored light (e.g., yellow) and/or an audible tone) to indicate to the autonomous vehicle 100 the final location of the destination location.
In various embodiments, the location devices 400 may be configured to receive signals from a computing entity (e.g., mapping computing entity 110 and/or onboard controller of the autonomous vehicle 100) causing the location devices 400 to emit navigational cues. For example, the mapping computing entity 110 may transmit a signal to one or more location devices 400 causing the location devices 400 to emit navigational cues after the mapping computing entity 110 receives a request for navigational instructions from an autonomous vehicle 100. As another example, the autonomous vehicle 100 may transmit a signal to a first location device 400 located proximate the autonomous vehicle 100 (e.g., within a communication range associated with the location device 400) causing at least the first location device 400 to emit a navigational cue for the particular autonomous vehicle 100. In various embodiments, the first location device 400 may be configured to transmit a signal to a second location device 400 located along the recommended route toward a desired destination location (e.g., as determined by the autonomous vehicle 100 and/or the mapping computing entity 110). In response to receipt of the signal from the first location device 400, the second location device 400 may emit a navigational cue. In various embodiments, location devices 400 along the entire recommended route between a particular location (e.g., the current location of the autonomous vehicle 100 identified when the autonomous vehicle 100 requested navigational instructions to a destination location) and the destination location may be configured to emit navigational cues simultaneously. As a specific example, location devices 400 along the entire recommended route may emit light having desired characteristics (e.g., wavelength, flashing frequency, and/or the like) simultaneously to guide the particular autonomous vehicle 100 toward the desired destination. As yet another example, location devices 400 along a portion of the recommended route may be configured to provide navigational cues simultaneously. In various embodiments, a computing entity (e.g., mapping computing entity 110) may be configured to dynamically update which location devices 400 emit navigational cues based on a dynamically determined current location of the autonomous vehicle 100 within the facility. For example, location devices 400 located within a configurable threshold distance of the current location of an autonomous vehicle 100 and located between the current location of the autonomous vehicle 100 and the desired destination location along the recommended route may receive signals causing the location devices 400 to emit navigational cues. Thus, the location devices 400 emitting navigational cues may change as the particular autonomous vehicle 100 moves along the recommended route to lead the autonomous vehicle 100 along the recommended route. In such embodiments, a computing entity (e.g., onboard controller of the autonomous vehicle 100 and/or mapping computing entity 110) may be configured to transmit a signal to a location device 400 causing the location device 400 to terminate emitting navigational cues upon determining that the autonomous vehicle 100 has passed the particular location device 400 (e.g., upon determining that the particular location device 400 is no longer located between the current location of the autonomous vehicle 100 and the desired destination).
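The dynamic selection and termination of cue-emitting devices described above could be sketched as a simple set difference over the route ordering; the route list, device spacing, and cue distance below are hypothetical.

```python
# Route devices in order from the current location to the destination; the
# spacing and cue-distance threshold are hypothetical configuration values.
RECOMMENDED_ROUTE = ["LD-1", "LD-2", "LD-3", "LD-4", "LD-5"]
DEVICE_SPACING_FT = 20       # assumed spacing between consecutive route devices
CUE_DISTANCE_FT = 45         # emit cues only this far ahead of the vehicle

def update_cues(current_device, lit_devices):
    """Return (devices to start emitting cues, devices to stop emitting cues)."""
    idx = RECOMMENDED_ROUTE.index(current_device)
    devices_ahead = CUE_DISTANCE_FT // DEVICE_SPACING_FT
    should_be_lit = set(RECOMMENDED_ROUTE[idx + 1: idx + 1 + devices_ahead])
    return should_be_lit - lit_devices, lit_devices - should_be_lit

# As the vehicle advances to LD-2, LD-4 starts emitting and LD-2 (now passed)
# is signaled to stop.
start, stop = update_cues("LD-2", {"LD-2", "LD-3"})
print(sorted(start), sorted(stop))
```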
As a specific example, an autonomous vehicle 100 located at a loading dock on a first floor of a facility may receive user input indicating that the autonomous vehicle 100 is to deliver a shipment to the office of John Smith on the fifth floor of the facility. The autonomous vehicle 100 may transmit information/data indicative of the desired destination location to the mapping computing entity 110 corresponding to the facility. The mapping computing entity 110 may determine that the autonomous vehicle 100 is located within a transmission range of a location device 400 located at the loading dock, and may calculate a recommended route between the autonomous vehicle's current location at the loading dock on the first floor and John Smith's office on the fifth floor.
Upon identifying a recommended route to John Smith's office, the mapping computing entity may transmit signals to one or more location devices 400 between John Smith's office and the current location of the autonomous vehicle 100 (determined to be located at the location device 400 located nearest the location of the autonomous vehicle 100) to cause the location devices 400 to indicate the recommended route. For example, the mapping computing entity 110 may transmit signals to all of the location devices 400 located along the recommended travel path, or a subset of the location devices 400 located along the recommended travel path. As just one specific example, the mapping computing entity 110 may be configured to transmit signals to a predetermined number of location devices 400 located between the current location of the autonomous vehicle 100 and the desired destination and adjacent the current location of the autonomous vehicle 100 (e.g., the three location devices 400 located between the location device 400 identifying the current location of the autonomous vehicle 100 and the destination location and immediately adjacent the current location of the autonomous vehicle 100). The autonomous vehicle 100 may begin executing the recommended route by moving toward a first detected navigational cue emitted from a nearby location device 400. In various embodiments, as the autonomous vehicle 100 moves along the recommended path, the first location device 400 indicating the recommended travel path may be configured to stop providing an indication of a recommended travel path once the autonomous vehicle 100 enters the transmission range of the second location device 400. Moreover, another location device 400 located beyond the last location device 400 providing an indication of the recommended travel path may be configured to begin providing an indication of the recommended travel path, such that the location devices 400 leading the autonomous vehicle 100 continue to provide an indication of the recommended travel path. With reference to Figs. 8-9, which each provide schematic views of an autonomous vehicle 100 traveling along a hallway toward a destination location, the autonomous vehicle 100 may communicate with a first, nearest location device 400, while a second location device 400 may illuminate to indicate a recommended direction toward a destination location. Upon reaching the desired destination location, the location device 400 corresponding to the destination location may be configured to indicate the location of the desired destination location. With reference to the previously mentioned example, a location device 400 proximate John Smith's office may illuminate to indicate the destination location to the autonomous vehicle 100. The autonomous vehicle 100 may thereafter utilize the various onboard sensors to maneuver to a precise destination location, such as John Smith's desk in his office. The autonomous vehicle 100 may thus rely on various sensors to avoid collisions with objects and/or persons within John Smith's office, and to ultimately move adjacent the desk to deposit an item to be delivered to John Smith. The autonomous vehicle 100 may utilize the various onboard sensors to verify that the shipment is deposited onto the desk in John Smith's office.
C. Transportation Mechanism Operation
Moreover, in various embodiments, the one or more computing entities (e.g., onboard controller of autonomous vehicles 100 and/or mapping computing entity 110) may be configured to operate one or more transportation mechanisms 170 (e.g., elevators, escalators, automatic doors, and/or the like) to facilitate movement of an autonomous vehicle 100 within a facility. For example, when a recommended route between a current location of an autonomous vehicle 100 and a desired destination location includes one or more transportation mechanisms 170 (e.g., an elevator), the computing entity may be configured to transmit one or more signals to the transportation mechanism 170 such that the transportation mechanism 170 is available for boarding by the autonomous vehicle 100 when the autonomous vehicle 100 reaches the transportation mechanism 170, and to automatically direct the transportation mechanism 170 to move the autonomous vehicle 100 to an appropriate location along the recommended route. For example, an elevator may be available and open for an autonomous vehicle 100 when the autonomous vehicle 100 arrives at the elevator bank while moving toward a destination location, and the elevator may automatically move to a desired floor once the autonomous vehicle 100 is in the elevator. An example method for automatically operating one or more transportation mechanisms 170 is illustrated in the flow chart of Fig. 11.
In various embodiments, one or more transportation mechanisms 170 may be configured to receive operating information/data from an autonomous vehicle 100 and/or provide information/data to an autonomous vehicle 100, internal mapping entity, and/or the like. As discussed herein, the transportation mechanisms 170 may comprise a communication device configured to communicate (e.g., wirelessly) with one or more autonomous vehicles 100. The communication device of the transportation mechanisms may be configured to communicate with autonomous vehicles 100 within a defined communication range, and accordingly the autonomous vehicle 100 may be configured to transmit signals causing the transportation mechanisms 170 to move once the autonomous vehicle 100 is within the communication range of the transportation mechanism 170.
With reference specifically to Fig. 11, one or more transportation mechanisms 170 may be operated to move an autonomous vehicle 100 along a recommended route generated as discussed herein. Accordingly, a computing entity (e.g., an onboard controller of an autonomous vehicle 100 and/or mapping computing entity 110) may receive information/data indicative of a current location of an autonomous vehicle 100 and information/data indicative of a desired destination associated with the autonomous vehicle 100, as indicated at Blocks 701-702. Based at least in part on the information/data indicative of the current location of the autonomous vehicle 100 and the information/data indicative of the desired destination associated with the autonomous vehicle 100, the computing entity may generate a recommended route through the facility between the current location and the desired destination of the autonomous vehicle 100, as indicated at Block 703.
Upon generating the recommended route, the computing entity (e.g., the onboard controller of the autonomous vehicle 100 and/or mapping computing entity 110) may identify one or more transportation mechanisms located along the recommended route, as indicated at Block 704. For example, the computing entity may determine that movement from a current location on one floor of a facility to a desired destination on a different floor of the facility may involve travelling in an elevator between floors. The one or more identified transportation mechanisms 170 may be identified as candidates for automated operation as the autonomous vehicle 100 approaches the transportation mechanism 170. As the autonomous vehicle 100 moves along the generated recommended route within the facility, the computing entity monitors the location of the autonomous vehicle 100, as indicated at Block 705. Based on the monitored location of the autonomous vehicle 100, the computing entity estimates an arrival time for the autonomous vehicle 100 to arrive at the transportation mechanism, as shown at Block 706. For example, the computing entity may determine an average speed for the autonomous vehicle 100 moving along the recommended route and/or a distance to reach the transportation mechanism 170, and may estimate an amount of time remaining until the autonomous vehicle 100 reaches the transportation mechanism. As yet another example, the computing entity may comprise information/data indicative of a geofenced area surrounding the transportation mechanism 170. In various embodiments, the edge of the geofenced area may be an estimated travel time away from the transportation mechanism (e.g., an estimated amount of time for an autonomous vehicle 100 moving at an average speed to travel from the edge of the geofence to the transportation mechanism 170), such that the computing entity estimates the time remaining before the autonomous vehicle 100 reaches the transportation mechanism 170 based on the time at which the autonomous vehicle 100 crosses the edge of the geofenced area. As shown in Block 707, the computing entity may be configured to transmit one or more signals to the transportation mechanism 170 to cause the transportation mechanism 170 to enable the autonomous vehicle 100 to board the transportation mechanism 170. For example, as an autonomous vehicle 100 approaches an elevator along the recommended route, as shown in Fig. 12, the computing entity may transmit a signal to the elevator to cause the elevator to move to the current location of the autonomous vehicle 100 (e.g., the current floor of the autonomous vehicle 100) if the elevator was not previously located at the current location of the autonomous vehicle 100 and to open the elevator doors to enable the autonomous vehicle 100 to board the elevator when the autonomous vehicle 100 arrives at the elevator.
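A non-limiting sketch of the arrival-time estimate (Block 706) and the boarding signal (Block 707) follows; the speed, lead time, and send_signal stub are hypothetical, and a real system would transmit the signal to the transportation mechanism 170 over the facility network.

```python
AVERAGE_SPEED_FT_PER_S = 3.0   # hypothetical average speed along the route
CALL_LEAD_TIME_S = 20.0        # call the elevator this many seconds before arrival

def estimate_arrival_seconds(distance_to_mechanism_ft,
                             speed_ft_per_s=AVERAGE_SPEED_FT_PER_S):
    """Estimate seconds until the vehicle reaches the transportation mechanism."""
    return distance_to_mechanism_ft / speed_ft_per_s

def send_signal(mechanism_id, command, **kwargs):
    """Stand-in for the signal transmitted to the transportation mechanism."""
    print(f"signal to {mechanism_id}: {command} {kwargs}")

def maybe_call_elevator(distance_to_mechanism_ft, current_floor):
    eta = estimate_arrival_seconds(distance_to_mechanism_ft)
    if eta <= CALL_LEAD_TIME_S:
        # Block 707: bring the car to the vehicle's floor and hold the doors.
        send_signal("elevator-2", "call", floor=current_floor, hold_doors=True)
    return eta

# 45 feet from the elevator at 3 ft/s gives a 15-second ETA, so the call is sent.
print(maybe_call_elevator(distance_to_mechanism_ft=45.0, current_floor=1))
```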
Upon receiving information/data indicating that the autonomous vehicle 100 is onboard the transportation mechanism (e.g., received from the autonomous vehicle 100, the transportation mechanism, one or more location devices 400, and/or the like), the computing entity may be configured to transmit a second signal to the transportation mechanism 170 to cause the transportation mechanism to move to a second location along the recommended route, as indicated at Block 708. Accordingly, upon receipt of the second signal, the transportation mechanism 170 may be configured to close its doors (if applicable) and move to a second specified location along the recommended route. For example, the computing entity may transmit a signal to an elevator causing the elevator to move to a different floor on which the autonomous vehicle's destination location is located. Upon reaching the second location at which the autonomous vehicle 100 is to disembark the transportation mechanism 170, the transportation mechanism 170 may be configured to enable the autonomous vehicle 100 to disembark (e.g., by opening the doors of the transportation mechanism 170), as indicated at Block 709. In various embodiments, the transportation mechanism 170 may be configured to automatically enable the autonomous vehicle 100 to disembark; however, in certain embodiments, the computing entity may be configured to transmit a third signal to the transportation mechanism 170 to enable the autonomous vehicle 100 to disembark. For example, as an autonomous vehicle 100 moves along a recommended route between an initial location and a desired destination location, the autonomous vehicle 100 (which may generate and store information/data indicative of the recommended route) may be configured to transmit a signal to a transportation mechanism 170 located along the recommended route causing the transportation mechanism to facilitate the autonomous vehicle's movement toward the desired destination. For example, as the autonomous vehicle 100 approaches an elevator, the autonomous vehicle 100 may call the elevator to the autonomous vehicle's initial floor. Once the autonomous vehicle 100 enters the elevator, the autonomous vehicle 100 may transmit a second signal causing the elevator to move to a desired floor (e.g., the floor of the desired destination location) to enable the autonomous vehicle 100 to disembark at the desired floor. In various embodiments, the autonomous vehicle 100 may be configured to generate and transmit the signals to the transportation mechanism automatically, based on the generated recommended route.
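The signal sequence of Blocks 707-709 might be sketched as follows; the mechanism identifier, floor numbers, and the send_signal and vehicle_onboard stand-ins are hypothetical.

```python
def send_signal(mechanism_id, command, **kwargs):
    """Stand-in for a signal transmitted to the transportation mechanism."""
    print(f"signal to {mechanism_id}: {command} {kwargs}")

def operate_elevator(mechanism_id, boarding_floor, destination_floor,
                     vehicle_onboard):
    """Drive the transportation mechanism through the boarding sequence.

    vehicle_onboard is a callable returning True once the autonomous vehicle
    (or a location device) reports that the vehicle is inside the car.
    """
    # Block 707: make the car available at the vehicle's current floor.
    send_signal(mechanism_id, "call", floor=boarding_floor, hold_doors=True)
    if vehicle_onboard():
        # Block 708: second signal moves the car to the destination floor.
        send_signal(mechanism_id, "move", floor=destination_floor)
        # Block 709: enable the vehicle to disembark at the destination floor.
        send_signal(mechanism_id, "open_doors", floor=destination_floor)

operate_elevator("elevator-2", boarding_floor=1, destination_floor=5,
                 vehicle_onboard=lambda: True)
```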
In certain embodiments, one or more transportation mechanisms 170 may be in communication with a facility-specific mapping computing entity 110 (e.g., directly and/or via a relay, such as one or more location devices 400). Accordingly, the one or more transportation mechanisms 170 may be configured to receive operating information/data from the facility-specific mapping computing entity 110. Moreover, in various embodiments, the one or more transportation mechanisms may be configured to provide operational information/data (e.g., current location of the transportation mechanism, current operating state of the transportation mechanism, and/or the like) to an autonomous vehicle 100, mapping computing entity 110, and/or the like.
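A hypothetical example of the operational information/data a transportation mechanism 170 might report to the facility-specific mapping computing entity 110 (directly or via a location-device relay) is sketched below; the field names and JSON encoding are editorial assumptions rather than a defined protocol:

```python
# Illustrative sketch only: a hypothetical status report a transportation mechanism
# could provide to the facility-specific mapping computing entity. Field names and
# the JSON encoding are editorial assumptions, not a defined protocol.
import json
import time


def build_status_report(mechanism_id: str, current_floor: int,
                        doors_open: bool, in_motion: bool) -> str:
    """Serialize the mechanism's current location and operating state for transmission."""
    return json.dumps({
        "mechanism_id": mechanism_id,
        "current_floor": current_floor,   # current location of the mechanism
        "doors_open": doors_open,         # current operating state
        "in_motion": in_motion,
        "reported_at": time.time(),
    })
```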
For example, as an autonomous vehicle 100 moves along a recommended route between an initial location and a desired destination location, the facility-specific mapping computing entity 110 (which may generate and store information/data indicative of the recommended route) may be configured to transmit a signal to a transportation mechanism 170 located along the recommended route causing the transportation mechanism 170 to facilitate the autonomous vehicle's movement toward the desired destination. For instance, as an autonomous vehicle 100 approaches an elevator, the facility-specific mapping computing entity 110 may detect the autonomous vehicle's presence proximate the elevator, and may transmit a signal calling the elevator to the autonomous vehicle's current floor. Once the autonomous vehicle 100 enters the elevator, the facility-specific mapping computing entity 110 may transmit a second signal to the elevator causing the elevator to move to a desired floor (e.g., the floor of the desired destination location) to enable the autonomous vehicle 100 to disembark at the desired floor. In various embodiments, the facility-specific mapping computing entity 110 may be configured to generate and transmit the signals to the transportation mechanism 170 automatically, based on the generated recommended route.
In various embodiments, the facility-specific mapping computing entity 110 may be configured to transmit one or more signals to a transportation mechanism 170 upon determining that an autonomous vehicle 100 is within a predefined distance of the particular transportation mechanism. For example, the facility-specific mapping computing entity 110 may be configured to monitor the location of the autonomous vehicle 100 moving within the facility (e.g., based on the identity of the location device 400 in communication with the autonomous vehicle 100). Upon determining that the autonomous vehicle 100 is within the predefined distance of the transportation mechanism 170, the facility-specific mapping computing entity 110 may transmit a signal to the transportation mechanism 170 causing the transportation mechanism 170 to move to the autonomous vehicle's current floor such that the transportation mechanism 170 is available when the autonomous vehicle 100 arrives at the transportation mechanism 170. In various embodiments, the signal transmitted to the transportation mechanism 170 may comprise information/data indicative of a current location of the autonomous vehicle 100 (e.g., the initial floor) and a desired destination for the transportation mechanism 170 (e.g., a destination floor). In various embodiments, the current location of the autonomous vehicle 100 (e.g., the initial floor) and/or the desired destination for the autonomous vehicle 100 (e.g., the destination floor) may be determined based at least in part on a generated recommended route to the desired destination.
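As one hedged illustration of the predefined-distance trigger described above, the sketch below infers proximity from the identity of the location device currently in communication with the vehicle. The device-to-elevator distance table, the threshold, the route attributes, and the signal fields are assumptions introduced for this sketch:

```python
# Illustrative sketch only: triggering a transportation mechanism when the vehicle is
# within a predefined distance, inferred from the identity of the location device
# currently in communication with it. All names, distances, and fields are assumptions.
PREDEFINED_DISTANCE_M = 20.0

# Hypothetical lookup: distance from each location device to the nearest elevator.
DEVICE_TO_ELEVATOR_DISTANCE_M = {
    "device-3F-lobby": 8.0,
    "device-3F-hall-east": 35.0,
}


def maybe_pre_stage_elevator(active_device_id: str, route, send_signal) -> None:
    """If the reporting location device places the vehicle close enough to the elevator,
    send the elevator the vehicle's initial floor and the destination floor taken from
    the generated recommended route."""
    dist = DEVICE_TO_ELEVATOR_DISTANCE_M.get(active_device_id)
    if dist is not None and dist <= PREDEFINED_DISTANCE_M:
        send_signal({
            "command": "pre_stage",
            "initial_floor": route.current_floor,          # current location of the vehicle
            "destination_floor": route.destination_floor,  # from the recommended route
        })
```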
Conclusion
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation. Moreover, a particular destination location may be mobile within a particular building and/or campus. For example, a destination location may correspond to the current location of a particular mobile user computing entity user (e.g., determined based on the location of a mobile user computing entity carried by the mobile user computing entity user). In such embodiments, the destination internal address may change. Accordingly, various embodiments may be configured to adjust a route determined between a current location of an autonomous vehicle 100 and the destination location such that the autonomous vehicle 100 intercepts the mobile user computing entity user defining the destination location.
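As an illustrative sketch only, such route adjustment toward a moving destination could take the form of a periodic re-planning loop; get_user_location, plan_route, and the vehicle interface below are hypothetical stand-ins rather than disclosed components:

```python
# Illustrative sketch only: periodically re-planning a route toward the latest reported
# location of a mobile user computing entity so that the vehicle intercepts the user.
# `get_user_location`, `plan_route`, and the vehicle interface are hypothetical stand-ins.
import time


def follow_mobile_destination(vehicle, get_user_location, plan_route,
                              tolerance_m: float = 5.0) -> None:
    """Re-plan toward the user's latest reported location until the vehicle reaches it."""
    while True:
        target = get_user_location()                     # latest destination internal address
        route = plan_route(vehicle.current_location(), target)
        vehicle.follow(route, max_duration_s=10.0)       # travel briefly, then re-check
        if vehicle.distance_to(target) <= tolerance_m:
            break                                        # destination (user) intercepted
        time.sleep(1.0)
```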
Moreover, in various embodiments, computing entities may be configured to generate one or more notifications to various mobile user computing entity users, such as a mobile user computing entity user defining a destination location, that a particular autonomous vehicle 100 is scheduled to visit the mobile user computing entity user. Accordingly, the generated notification may request that the mobile user computing entity user remain within their current area until the autonomous vehicle 100 reaches the current location of the mobile user computing entity user.

Claims

What is claimed is:
1. An autonomous vehicle configured for autonomous movement within a facility along one or more recommended routes, the autonomous vehicle comprising:
one or more locomotion mechanisms configured to maneuver the autonomous vehicle within the facility;
one or more sensors configured to detect navigational cues emitted by one or more location devices positioned along a recommended route;
an onboard controller comprising at least one non-transitory memory and a processor, wherein the onboard controller is configured to:
receive navigational data from the one or more sensors, wherein the navigational data is indicative of the location of one or more location devices emitting navigational cues identifying the recommended route;
control the one or more locomotion mechanisms based at least in part on the identified locations of the one or more location devices emitting navigational cues identifying the recommended route to move the autonomous vehicle along the recommended route identified by the navigational cues emitted by location devices positioned along the recommended route.
2. The autonomous vehicle of claim 1, wherein the navigational cues are detectable within a localized area surrounding the location device, and the one or more sensors are configured to detect the navigational cues while the autonomous vehicle is positioned within the localized area surrounding the location device.
3. The autonomous vehicle of claim 2, wherein the navigational cues are detectable within a direct line of sight of the location devices, and the one or more sensors are configured to detect the navigational cues while the autonomous vehicle is positioned within a direct line of sight of the location devices.
4. The autonomous vehicle of claim 3, wherein the navigational cues are defined by light emitted from the location devices, and the one or more sensors comprise at least one of a light sensor or a camera.
5. The autonomous vehicle of claim 2, wherein the navigational cues are defined by wireless data transmissions and the localized area is defined by the transmission range of the location device, and wherein the one or more sensors comprise a wireless data receiver.
6. The autonomous vehicle of claim 5, wherein the wireless data transmissions are Bluetooth low energy data transmissions.
7. The autonomous vehicle of claim 1, wherein the navigational cues are directional, and wherein the onboard controller is configured to determine a direction of travel along the recommended route based on the detected directional navigational cue.
8. The autonomous vehicle of claim 1, wherein the one or more sensors are configured to detect obstacles along a travel path of the autonomous vehicle, and wherein the controller is configured to control the operation of the locomotion mechanism to avoid colliding with detected obstacles.
9. The autonomous vehicle of claim 8, wherein the one or more sensors comprises at least one sensor selected from: a LIDAR sensor, a light sensor, a camera, a microphone, and a wireless data receiver.
10. The autonomous vehicle of claim 8, wherein the controller is configured to identify the recommended route based at least in part on the detected navigational cues and the output of the one or more sensors to avoid collisions.
11. The autonomous vehicle of claim 1, wherein the controller comprises map data stored within the non-transitory memory, and the controller is further configured to:
determine a recommended route to be traversed by the autonomous vehicle;
control the operation of the locomotion mechanism to move the autonomous vehicle along the recommended route;
monitor the navigational data generated by the one or more sensors to ensure the autonomous vehicle is travelling along the recommended route.
12. The autonomous vehicle of claim 1, further comprising an item support configured to support one or more items to be transported along a recommended route to be delivered to a destination location defining an end of the recommended route.
13. The autonomous vehicle of claim 12, wherein the item support is configured to support a plurality of items to be delivered at one or more locations within the facility.
14. The autonomous vehicle of claim 12, wherein the one or more sensors are configured to detect a delivery location for at least one item, and the controller is configured to maneuver the autonomous vehicle adjacent the delivery location.
15. The autonomous vehicle of claim 14, further comprising an item deposit mechanism configured to deposit the item at the delivery location, and wherein the controller is configured to operate the item deposit mechanism once the autonomous vehicle is positioned adjacent the delivery location.
16. The autonomous vehicle of claim 1, wherein the locomotion mechanism is selected from an aerial-based locomotion mechanism and a land-based locomotion mechanism.
17. A method for guiding an autonomous vehicle along a recommended route defined within a facility, the method comprising:
detecting one or more navigational cues emitted by one or more location devices positioned along the recommended route;
determining a location of the one or more location devices emitting the navigational cues; and
activating a locomotion mechanism based at least in part on the identified locations of the one or more location devices emitting navigational cues to move the autonomous vehicle along the recommended route.
18. The method of claim 17, wherein the navigational cues are detectable within a localized area surrounding the location device, and wherein detecting the one or more navigational cues comprises: moving the autonomous vehicle into the localized area surrounding the location device; and detecting a navigational cue emitted by the location device while the autonomous vehicle is positioned within the localized area surrounding the location device.
19. The method of claim 18, wherein the navigational cues are defined by light emitted by the one or more location devices, and wherein detecting the one or more navigational cues comprises: moving the autonomous vehicle within a line-of-sight of the location device; and detecting the light emitted by the location device while the autonomous vehicle is positioned within the line-of-sight of the location device.
20. The method of claim 17, wherein the navigational cues are directional, and wherein activating the locomotion mechanism comprises: determining an appropriate direction of travel relative to the location device along the recommended route based on a detected directional navigational cue; and activating the locomotion mechanism to move the autonomous vehicle in the appropriate direction of travel along the recommended route.
21. The method of claim 17, further comprising: detecting one or more obstacles along the recommended route; and wherein activating the locomotion mechanism comprises activating the locomotion mechanism to move the autonomous vehicle along the recommended route and to avoid collisions with detected obstacles along the recommended route.
22. The method of claim 17, wherein the recommended route terminates at an intended delivery destination for a shipment transported by the autonomous vehicle, and wherein the method further comprises:
detecting a delivery location at the intended delivery destination;
activating the locomotion mechanism to move the autonomous vehicle adjacent the detected delivery location;
activating an item deposit mechanism to deposit the shipment at the delivery location.
23. An autonomous vehicle guidance system for guiding an autonomous vehicle through a facility, the autonomous vehicle guidance system comprising:
a mapping computing entity comprising a non-transitory memory and a processor, wherein the mapping computing entity is configured to determine a recommended route for an autonomous vehicle to travel through a facility to reach an intended destination;
a plurality of location devices positioned throughout the facility, wherein each of the location devices is configured to emit navigational cues detectable by an autonomous vehicle upon receipt of a signal from the mapping computing entity;
at least one autonomous vehicle configured for autonomous movement within the facility, the autonomous vehicle comprising:
one or more locomotion mechanisms configured to freely maneuver the autonomous vehicle within the facility;
one or more sensors configured to detect navigational cues emitted by the one or more location devices;
an onboard controller comprising at least one non-transitory memory and a processor, wherein the onboard controller is configured to:
receive navigational data from the one or more sensors, wherein the navigational data is indicative of the location of one or more location devices emitting navigational cues;
identify a recommended route based on the location of the one or more location devices emitting navigational cues;
control the one or more locomotion mechanisms based at least in part on the identified locations of the one or more location devices emitting navigational cues to move the autonomous vehicle along the recommended route.
24. The autonomous vehicle guidance system of claim 23, wherein location devices are configured to emit the navigational cues within a localized area surrounding the location device, and wherein the one or more sensors are configured to detect the navigational cues while the autonomous vehicle is positioned within the localized area.
25. The autonomous vehicle guidance system of claim 24, wherein the navigational cues are defined as light emitted from the location devices, and the one or more sensors are configured to detect the emitted light while the autonomous vehicle is positioned within a line-of-sight of the location device.
26. The autonomous vehicle guidance system of claim 24, wherein the navigational cues are defined by wireless data transmissions emitted from the location devices within a transmission range of the location devices, and the one or more sensors are configured to detect the emitted wireless data transmissions while the autonomous vehicle is located within the transmission range of the location device.
27. The autonomous vehicle guidance system of claim 23, wherein the one or more sensors are configured to detect obstacles within the facility, and wherein the controller is configured to control the operation of the locomotion mechanism to avoid colliding with the detected obstacles.
28. The autonomous vehicle guidance system of claim 23, wherein the autonomous vehicle further comprises an item support configured to support one or more items to be transported along a recommended route to be delivered to a destination location defining an end of the recommended route.
29. The autonomous vehicle guidance system of claim 28, wherein the one or more sensors are configured to detect a delivery location for at least one item, and the controller is configured to maneuver the autonomous vehicle adjacent the delivery location.
30. The autonomous vehicle guidance system of claim 28, wherein the autonomous vehicle further comprises an item deposit mechanism configured to deposit the item at the delivery location, and wherein the controller is configured to operate the item deposit mechanism once the autonomous vehicle is positioned adjacent the delivery location.
31. The autonomous vehicle guidance system of claim 23, wherein the locomotion mechanism is selected from an aerial-based locomotion mechanism and a land-based locomotion mechanism.
EP18778736.1A 2017-09-14 2018-09-06 Automatic routing of autonomous vehicles intra-facility movement Withdrawn EP3682191A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201715704640A 2017-09-14 2017-09-14
PCT/US2018/049749 WO2019055281A2 (en) 2017-09-14 2018-09-06 Automatic routing of autonomous vehicles intra-facility movement

Publications (1)

Publication Number Publication Date
EP3682191A2 true EP3682191A2 (en) 2020-07-22

Family

ID=63684569

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18778736.1A Withdrawn EP3682191A2 (en) 2017-09-14 2018-09-06 Automatic routing of autonomous vehicles intra-facility movement

Country Status (3)

Country Link
EP (1) EP3682191A2 (en)
CA (1) CA3068963A1 (en)
WO (1) WO2019055281A2 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA3107180C (en) 2016-09-06 2022-10-04 Advanced Intelligent Systems Inc. Mobile work station for transporting a plurality of articles
CA3090827C (en) 2018-02-15 2021-01-19 Advanced Intelligent Systems Inc. Apparatus for supporting an article during transport
US10745219B2 (en) 2018-09-28 2020-08-18 Advanced Intelligent Systems Inc. Manipulator apparatus, methods, and systems with at least one cable
US10751888B2 (en) 2018-10-04 2020-08-25 Advanced Intelligent Systems Inc. Manipulator apparatus for operating on articles
US10645882B1 (en) 2018-10-29 2020-05-12 Advanced Intelligent Systems Inc. Method and apparatus for performing pruning operations using an autonomous vehicle
US10966374B2 (en) 2018-10-29 2021-04-06 Advanced Intelligent Systems Inc. Method and apparatus for performing pruning operations using an autonomous vehicle
US10676279B1 (en) 2018-11-20 2020-06-09 Advanced Intelligent Systems Inc. Systems, methods, and storage units for article transport and storage
US11747149B2 (en) * 2019-04-01 2023-09-05 Robert Bosch Gmbh Method for operating a track guidance system
DE102019207773A1 (en) * 2019-04-01 2020-10-01 Robert Bosch Gmbh Method for operating a guidance system
US11693420B2 (en) * 2019-08-09 2023-07-04 Universal City Studios Llc Vehicle guidance via infrared projection
US11529981B2 (en) 2020-01-31 2022-12-20 Siemens Mobility, Inc. Ultra-wideband based vital train tracking
US20230408289A1 (en) * 2020-07-28 2023-12-21 Ception Technologies Ltd. Guidance of a transport vehicle to a loading point
CN114612622A (en) * 2020-12-14 2022-06-10 北京石头创新科技有限公司 Robot three-dimensional map pose display method, device and equipment and storage medium
CN112698648B (en) * 2020-12-14 2024-04-19 珠海格力智能装备有限公司 Vehicle positioning method and device and autonomous navigation vehicle
CA3214999A1 (en) * 2021-05-07 2022-11-10 Youngjun Choi Cloud-based platform for determining and generating optimized navigation instructions for autonomous vehicles
US11754414B2 (en) 2021-05-13 2023-09-12 Ford Global Technologies, Llc Systems and methods to provide last mile assistance to a delivery robot
JP2023128584A (en) * 2022-03-03 2023-09-14 トヨタ自動車株式会社 Information processing device, method, and program

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2610427B1 (en) * 1987-02-04 1995-09-29 Protee SYSTEM AND METHOD FOR MONITORING THE RUNNING OF A SELF-CONTAINED VEHICLE
ITTO20080489A1 (en) * 2008-06-23 2009-12-24 Bromas S R L INFRARED DRIVING SYSTEM FOR AUTOMATIC DRIVING TROLLEYS
ES2401509B1 (en) * 2011-10-05 2014-03-05 Universidad De Almería GUIDING SYSTEM FOR AUTONOMOUS MOVEMENT OF VEHICLES IN STRUCTURED ENVIRONMENTS.
US9147173B2 (en) * 2011-10-31 2015-09-29 Harvest Automation, Inc. Methods and systems for automated transportation of items between variable endpoints
CN103228040A (en) * 2012-01-31 2013-07-31 国际商业机器公司 Indoor electronic map generating method and system, and indoor target positioning method and system

Also Published As

Publication number Publication date
WO2019055281A2 (en) 2019-03-21
WO2019055281A3 (en) 2019-04-25
CA3068963A1 (en) 2019-03-21

Similar Documents

Publication Publication Date Title
CA3068963A1 (en) Automatic routing of autonomous vehicles intra-facility movement
CA3065721C (en) System and method for enabling remote operation of at least one transportation mechanism
US11435744B2 (en) Autonomously delivering items to corresponding delivery locations proximate a delivery route
US9905100B2 (en) Remote initiation of interaction by a computing entity
US10351400B2 (en) Apparatus and method of obtaining location information of a motorized transport unit
US10796269B2 (en) Methods for sending and receiving notifications in an unmanned aerial vehicle delivery system
US20170213308A1 (en) Arbitration of passenger pickup and drop-off and vehicle routing in an autonomous vehicle based transportation system
US10949792B2 (en) System and method for delivering items using autonomous vehicles and receptacle targets
US11829927B2 (en) Remote initiation of interaction by a computing entity
US11474515B2 (en) Method and control apparatus for an autonomous and/or semiautonomous transport vehicle
CN112660267A (en) Article transfer robot, article transfer system, and robot management device
JP2019077530A (en) Article conveying device
JP2021064241A (en) Article transfer system
GB2542470A (en) Shopping facility assistance systems, devices, and methods to dispatch and recover motorized transport units that effect remote deliveries
RU2734927C1 (en) System for accurate delivery by drones with identification of a recipient's personality
JP2024007239A (en) Information processing device, delivery system, and information processing method
JP2023076332A (en) Collection and delivery system, collection and delivery center device, and collection and delivery management method
CA3023051A1 (en) Remote initiation of interaction by a computing entity

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200108

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210428

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20230627