US20210383320A1 - Object location in a delivery vehicle
- Publication number
- US20210383320A1 (U.S. application Ser. No. 17/411,495)
- Authority
- US
- United States
- Prior art keywords
- asset
- location
- control system
- user
- user device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G06Q10/083—Shipping
- G06Q10/0833—Tracking
- G06K9/00201
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/021—Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
- H04W4/029—Location-based management or tracking services
Definitions
- U.S. application Ser. No. 16/226,180 is a continuation-in-part of U.S. application Ser. No. 16/103,566, entitled “Hands-Free Augmented Reality System For Picking and/or Sorting Assets,” filed Aug. 14, 2018; which claims priority to U.S. Provisional Patent Application No. 62/545,752, entitled “Hands-Free Augmented Reality System For Picking and/or Sorting Assets and Methods of Utilizing The Same,” filed Aug. 15, 2017.
- A primary component in some systems and methods for automated handling of packages is a conveyance device (i.e., a conveyor belt), which is generally formed and/or extended around at least two driving wheels. Thus, by turning the driving wheels, the conveyor belt may run continuously. Conveyor belts may also generally be flexible and deformable, at least while running in contact with the driving wheels, and a multitude of materials, linkages, and so forth have been used to achieve these goals.
- Embodiments may be utilized in a variety of environments, e.g., a delivery vehicle, a trailer or cargo area of a delivery vehicle, or a warehouse environment, whether relative to a sort location, a pick location, a conveyor belt, and/or any combination thereof.
- Various embodiments are directed to a system for hands-free handling of at least one asset by a user.
- the system can include a user device configured to be worn by a user.
- the user device may include one or more memories and one or more processors configured to perform the following operations.
- Asset identifier data can be obtained for at least one asset.
- Location data, associated with a location for the at least one asset, can be determined based, at least in part, upon the obtained asset identifier data.
- One or more navigational projections configured to guide the user to the location can be dynamically generated and displayed.
- Handling of the at least one asset by the user can be detected.
- One or more notifications associated with the handling of the at least one asset by the user at the location may be received.
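For illustration only, the claimed device-side flow can be pictured as a simple control loop. The Python sketch below is hypothetical: the class and names (`HandsFreeDevice`, `ASSET_LOCATIONS`, the lookup table contents) are invented here and do not appear in the disclosure, and real identifier capture, location determination, and projection would use the hardware described later in this document.

```python
from dataclasses import dataclass

@dataclass
class Location:
    """A named position within the mapped facility (coordinates in meters)."""
    name: str
    x: float
    y: float

# Hypothetical lookup table standing in for location data derived from the
# asset identifier; in the disclosure this may come from the control system.
ASSET_LOCATIONS = {"PKG-001": Location("shelf-B3", 12.5, 4.0)}

class HandsFreeDevice:
    """Minimal, invented sketch of the claimed user-device operations."""

    def obtain_asset_identifier(self) -> str:
        # A camera/scanner would read a label; hard-coded for the sketch.
        return "PKG-001"

    def determine_location(self, asset_id: str) -> Location:
        # Location data determined, at least in part, from the identifier.
        return ASSET_LOCATIONS[asset_id]

    def project_guidance(self, target: Location) -> None:
        # Stand-in for the pivoting projector: emit a navigational cue.
        print(f"Projecting arrow toward {target.name} at ({target.x}, {target.y})")

    def detect_handling(self) -> bool:
        # Cameras/sensors would detect a pick or placement event.
        return True

    def run(self) -> None:
        asset_id = self.obtain_asset_identifier()
        target = self.determine_location(asset_id)
        self.project_guidance(target)
        if self.detect_handling():
            print(f"Handling of {asset_id} detected; awaiting notification")

HandsFreeDevice().run()
```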
- Also described is a computer-implemented method for hands-free handling of at least one asset by a user.
- the method may include the following operations.
- Asset identifier data for at least one asset can be received from a remote location relative to a user device that is worn by the user.
- First location data associated with the user device can be determined at the user device.
- Second location data associated with the at least one asset may be determined.
- the second location data may be determined, based at least in part, on analyzing the received asset identifier data.
- the first location data may be determined, based at least in part, on analyzing a present position of the user device.
- One or more navigational projections configured to guide the user to a location associated with the second location data may be dynamically generated and displayed.
- the one or more navigational projections may be dynamically updated based at least in part on one or more detected changes of a present location of the user device.
- Further described is a computer program product for hands-free handling of at least one asset by a user.
- the computer program product may include at least one non-transitory computer-readable storage medium having computer-readable program code portions stored therein.
- the computer-readable program code portions may include one or more executable portions configured for performing the following operations.
- An environment that a user is located in can be mapped based at least in part on generating a multidimensional graphical representation of the environment.
- Asset identifier data to identify at least one asset can be received at a user device.
- One or more assets locations can be associated within the mapped environment.
- the one or more asset locations may be associated with the at least one asset.
- One or more navigational projections configured to guide the user to an asset location within the environment may be generated and displayed based at least on the associating and within the environment that the user is in.
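As a toy illustration of the mapping-and-association step, the sketch below stands in a plain dictionary for the multidimensional graphical representation; the names and grid-cell scheme are assumptions for illustration, not the disclosed implementation, which would build a model from depth-sensor data.

```python
# Minimal sketch: map an environment, then associate asset locations within it.
environment_map = {}   # (row, col) -> feature label
asset_locations = {}   # asset_id -> (row, col)

def map_environment(features):
    """Record a (cell, label) list as the 'multidimensional' representation."""
    for cell, label in features:
        environment_map[cell] = label

def associate_asset(asset_id, cell):
    """Associate an asset with a location inside the mapped environment."""
    if cell not in environment_map:
        raise ValueError(f"cell {cell} not in mapped environment")
    asset_locations[asset_id] = cell

map_environment([((0, 0), "aisle"), ((0, 1), "shelf-A"), ((1, 1), "shelf-B")])
associate_asset("PKG-002", (0, 1))
print(asset_locations)   # {'PKG-002': (0, 1)}
```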
- Also provided is a system comprising one or more processors and one or more computer-storage media having instructions stored thereon that, when used by the one or more processors, cause the one or more processors to perform the following operations.
- the operations comprise initializing a scanning device secured to a storage area of a delivery vehicle. Additionally, the operations comprise determining that an asset is located within the storage area based on scanning information obtained from the scanning device, wherein the scanning information includes an asset identifier associated with the asset and a defined delivery destination. The operations also comprise obtaining a current location of the delivery vehicle based on detected location data.
- the operations further comprise, based on a determination that the obtained current location is within a threshold distance of the defined delivery destination, activating a projection device to emit a projection that corresponds to a determined position of the asset, the position of the asset being determined based at least in part on the obtained scanning information.
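The threshold-distance determination can be pictured as a simple geofence test. The sketch below assumes GPS latitude/longitude and an invented 50-meter threshold; the disclosure does not fix a threshold value or a distance formula, so the haversine computation here is only one plausible choice.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

THRESHOLD_M = 50.0  # assumed value; the disclosure does not specify one

def maybe_activate_projector(vehicle, destination, activate):
    """Activate the projection device when the vehicle nears the destination."""
    if haversine_m(*vehicle, *destination) <= THRESHOLD_M:
        activate()

maybe_activate_projector(
    (40.7128, -74.0060),   # current vehicle location
    (40.7130, -74.0058),   # defined delivery destination
    lambda: print("Projector on: highlighting asset position"),
)
```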
- a computer-implemented method for scanning and locating assets comprises obtaining, by a computing device, an asset identifier for an asset located within a physical environment based on received scanning information, wherein the asset identifier is associated with a delivery destination defined for the asset. Additionally, the method comprises generating, by the computing device, asset location data that is associated with the obtained asset identifier and defines a position of the asset within the physical environment based on the received scanning information. The method further comprises determining, by the computing device, a current location of the physical environment based on received asset location data.
- the method also comprises causing, by the computing device, a projection device to emit a navigational projection that corresponds to the position of the asset within the physical environment based on the asset location data and a determination that the determined current location is within a threshold distance of the defined delivery destination.
- one or more computer-storage media having computer-executable instructions embodied thereon that, when executed by a computing device, perform a method.
- the method comprises determining a physical location of a delivery vehicle based on detected location data.
- the method also comprises obtaining an asset identifier for an asset stored within the delivery vehicle based on obtained scanning information, wherein the asset identifier is associated with a defined delivery destination.
- the method further comprises determining a location to emit a navigational projection within a cargo portion of the delivery vehicle based on asset location data that is generated based at least in part on the obtained scanning information.
- the method comprises causing a projection device secured to the cargo portion to emit the navigational projection directed to the determined location based on a determination that the determined physical location is within a threshold distance of the defined delivery destination.
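Determining where to direct the projection from a projector fixed in the cargo portion reduces to aiming geometry. The following sketch, with an assumed cargo-area coordinate frame, converts an asset position derived from scanning information into pan/tilt angles; it is illustrative only and not the patented method.

```python
import math

def aim_projector(projector_xyz, asset_xyz):
    """Compute pan/tilt angles (degrees) to point a fixed projector at an asset.

    Coordinates are meters in an assumed cargo-area frame: x forward, y left,
    z up. This geometry is an illustration, not taken from the disclosure.
    """
    dx = asset_xyz[0] - projector_xyz[0]
    dy = asset_xyz[1] - projector_xyz[1]
    dz = asset_xyz[2] - projector_xyz[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# Projector mounted near the ceiling at the front of the cargo portion;
# asset position derived from scanning information.
print(aim_projector((0.0, 0.0, 2.2), (3.0, -0.8, 0.5)))
```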
- FIG. 1 schematically depicts a control system according to one or more embodiments shown and described herein;
- FIG. 2 schematically depicts a control system according to one or more embodiments shown and described herein;
- FIG. 3 schematically depicts a user device that communicates with a control system according to one or more embodiments shown and described herein;
- FIG. 4 depicts a user device in conjunction with a harness mechanism according to one or more embodiments shown and described herein;
- FIG. 5 depicts a user device in isolation without the harness mechanism according to one or more embodiments shown and described herein;
- FIG. 6 schematically depicts a flowchart illustrating operations and processes performed by a user device according to one or more embodiments shown and described herein;
- FIG. 7 schematically depicts a flowchart illustrating operations and processes performed according to one or more embodiments shown and described herein;
- FIG. 8 depicts a facility and an environmental mapping procedure achieved via a user device according to one or more embodiments shown and described herein;
- FIG. 9 depicts a facility and a pathway indicating navigational projection achieved via a user device according to one or more embodiments shown and described herein;
- FIG. 10 depicts a shelving containing portion of a facility and a placement indicating navigational projection achieved via a user device according to one or more embodiments shown and described herein;
- FIGS. 11A-C depict further views of three exemplary embodiments of the placement indicating navigational projection achieved via a user device according to one or more embodiments shown and described herein;
- FIG. 12 is a perspective or isometric view of a conveyor belt assembly that may be utilized in conjunction with the control system and user device according to one or more embodiments shown and described herein;
- FIGS. 13A-13F depict further views of additional exemplary navigational projections achieved via a user device and in conjunction with a conveyor belt assembly according to one or more embodiments shown and described herein;
- FIGS. 14A-B depict a navigational projection within a physical environment according to one or more embodiments shown and described herein;
- FIG. 15A is a perspective view of a physical environment according to one or more embodiments shown and described herein;
- FIG. 15B is a side view of the physical environment of FIG. 15A according to one or more embodiments shown and described herein;
- FIG. 15C is a perspective view of a plurality of scanning devices having a field of view of the physical environment of FIG. 15A according to one or more embodiments shown and described herein;
- FIG. 16 depicts an exemplary projector device according to one or more embodiments shown and described herein;
- FIG. 17 depicts a navigational projection within a physical environment according to one or more embodiments shown and described herein;
- FIG. 18 is a flow diagram of an exemplary process for locating an asset according to one or more embodiments shown and described herein.
- augmented reality-based computing solutions have been pursued, such as with reference to U.S. Ser. No. 15/581,609, the contents of which are incorporated by reference herein in their entirety.
- These augmented-reality-based solutions can utilize objects, such as smart glasses, to generate an environment so as to provide to carrier personnel (e.g., via a lens of the smart glasses) directions for transporting particular assets or packages.
- smart glasses may be uncomfortable to use for long periods of time (e.g., due to the weight and constant pressure), and such glasses reduce the peripheral vision available for instructions needed by users or reduce vision in general due to glare on the lenses, which may impact both safety and job accuracy.
- Various embodiments of the present disclosure improve these existing technologies, such as smart glasses, by at least utilizing a hands-free user device(s), a control system or server in networked communication with the hands-free user device(s), and/or a generated augmented reality environment to facilitate handling and transport of an asset or package by carrier personnel or the like.
- the handling and/or transport of the asset or package may be related to a picking of the asset from a pick location (e.g., to “pull” the asset to fulfill an order thereof by a customer), the sorting of the asset to a sort location (e.g., from a conveyor belt or the like to the next location in which transport or handling of the asset may occur, for example, on the shelving of a warehouse or a vehicle).
- the hands-free user device(s) enables carrier personnel to transport and/or handle the asset or package in a safe, ergonomic, efficient, and accurate manner, regardless of where (e.g., to and from) the handling and/or transport is occurring, at least within a three-dimensional environment mapped via the hands-free user device(s).
- a user device can be worn by a user, such as on a wearable article of clothing, as opposed to placing eyewear over a user's eyes or using a mobile or cart device for the handling of assets.
- navigational projections can be dynamically generated and displayed (e.g., within a physical environment a user is in, as opposed to a lens medium) to guide the user to the location associated with the location data.
- aspects can also detect handling of the asset by the user (e.g., via cameras, sensors).
- One or more notifications associated with the detection of the handling can be received (e.g., from a control system).
- Location data can be determined based on analyzing asset identifier data and analyzing a present position of the user device.
- a user's environment may be mapped based at least on generating a multidimensional graphical representation of the environment and associating one or more asset locations within the mapped environment. At least each of these new functionalities improve existing technologies, as these are functionalities that various existing technologies do not now employ.
- non-conventional methods include the following operations: obtaining one or more asset identifiers and determining location data for the associated asset(s).
- Navigational projections can be dynamically generated and displayed (e.g., within a physical environment a user is in, as opposed to a lens medium) to guide the user to the location associated with the location data.
- Aspects can also detect handling of the asset by the user (e.g., via cameras, sensors).
- One or more notifications associated with the handling can be received (e.g., from a control system).
- Location data can be determined based on analyzing asset identifier data and analyzing a present position of the user device.
- a user's environment may be mapped based at least on generating a multidimensional graphical representation of the environment and associating one or more asset locations within the mapped environment. At least each of these new functionalities include non-conventional functions.
- an asset may be a parcel or group of parcels, a package or group of packages, a box, a crate, a drum, a box strapped to a pallet, and/or the like.
- packages to be sorted may be moved along a conveyor belt from some package source to an intake location (e.g., one or more sort employee workstations).
- embodiments utilizing a conveyor belt assembly may rely upon an acquisition device (e.g., a stationary imager) positioned above the conveyor, upstream of the intake location or sort employee workstations, to capture data associated with the package. Additional details in this respect may be understood with reference to U.S. Ser. No. 15/581,609, the contents of which are incorporated by reference herein in their entirety.
- the carrier personnel or sort employee may be guided to particular packages to select for transport.
- the hands-free user device(s) may be configured, according to various embodiments, to generate various projections, visible to the carrier personnel or sort employee.
- the generated projections, which may be three-dimensional or two-dimensional in form, are configured to guide the carrier personnel or sort employee from their current location to the appropriate sort location for the particular package being handled.
- the hands-free user device(s), upon detecting a placement of the particular package, may further verify that the placement is correct. If incorrect, notification(s) may be generated, which notifications may take multiple forms, as detailed elsewhere herein.
- the control system may, via the network, interface with the hands-free user device(s) so as to generate one or more of various projections to guide the carrier personnel or pick employee to the location of a particular package that needs to be picked or “pulled” for order fulfillment from a warehouse location or the like.
- specific projections may be generated, so as to advise the personnel or employee which specific package should be picked/pulled and/or how many packages (i.e., of the same type) should be picked/pulled.
- the latter may be further configured to then guide the carrier personnel to a subsequent location for ongoing handling/transport of the package.
- Exemplary subsequent locations may include a conveyor belt, a sort location, and/or a delivery location, as discussed above and also detailed elsewhere herein.
- the hands-free user device(s) may utilize software that not only detects changes in handling of the packages (e.g., picking up or placement actions), but that also detects various markers or identifiers distributed throughout the facility or warehouse, so as to ensure accuracy of the guidance and/or navigational instructions provided to the carrier personnel.
- no such markers or identifiers may be provided, as the three dimensional mapping via the user device(s)—with networking connectivity to the control system/server—may be utilized to calibrate and establish defined locations (i.e., pick or sort) throughout the facility or warehouse prior to utilization of the hands-free user device(s) for operational purposes by the carrier personnel.
- RFID/WiFi radio signal triangulation, a digital compass, or any other current method to determine indoor and outdoor position and bearing may be utilized.
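As one concrete (and assumed) example of such radio-signal positioning, the sketch below performs textbook 2-D trilateration from measured ranges to three fixed anchors; the disclosure only notes that such triangulation may be utilized and does not prescribe this algorithm.

```python
# Minimal 2-D trilateration sketch: estimate position from distances to three
# fixed RFID/Wi-Fi anchors by linearizing the circle equations.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]   # known anchor positions (m)
dists = [7.07, 7.07, 7.07]                         # measured ranges (m)

(x1, y1), (x2, y2), (x3, y3) = anchors
d1, d2, d3 = dists

# Subtracting the circle equations pairwise yields a 2x2 linear system A p = b.
a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2

det = a11 * a22 - a12 * a21
x = (b1 * a22 - b2 * a12) / det
y = (a11 * b2 - a21 * b1) / det
print(f"estimated position: ({x:.2f}, {y:.2f})")   # ~ (5.00, 5.00)
```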
- control system may also, via the network or otherwise, interface with a fixed projector and/or the hands-free user device(s) so as to generate one or more of various projections to guide a delivery vehicle operator to the location of a particular package that needs to be picked or “pulled” for delivery at an address at which the delivery vehicle is presently located.
- specific projections may be generated, so as to advise the personnel or employee which specific package should be picked/pulled for delivery at the present address. It should be understood that upon detection of the picking/pulling of the package(s) by the projector and/or the user device, at least the latter may be further configured to then guide the carrier personnel or vehicle operator to a subsequent location for ongoing handling/transport of the package.
- Embodiments of the present disclosure may be implemented in various ways, including as computer program products that comprise articles of manufacture.
- a computer program product may include a non-transitory computer-readable storage medium storing applications, programs, program modules, scripts, source code, program code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like (also referred to herein as executable instructions, instructions for execution, computer program products, program code, and/or similar terms used herein interchangeably).
- Such non-transitory computer-readable storage media include all computer-readable media (including volatile and non-volatile media).
- a non-volatile computer-readable storage medium may include a floppy disk, flexible disk, hard disk, solid-state storage (SSS) (e.g., a solid state drive (SSD), solid state card (SSC), solid state module (SSM)), enterprise flash drive, magnetic tape, or any other non-transitory magnetic medium, and/or the like.
- a non-volatile computer-readable storage medium may also include a punch card, paper tape, optical mark sheet (or any other physical medium with patterns of holes or other optically recognizable indicia), compact disc read only memory (CD-ROM), compact disc-rewritable (CD-RW), digital versatile disc (DVD), Blu-ray disc (BD), any other non-transitory optical medium, and/or the like.
- Such a non-volatile computer-readable storage medium may also include read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory (e.g., Serial, NAND, NOR, and/or the like), multimedia memory cards (MMC), secure digital (SD) memory cards, SmartMedia cards, CompactFlash (CF) cards, Memory Sticks, and/or the like.
- a non-volatile computer-readable storage medium may also include conductive-bridging random access memory (CBRAM), phase-change random access memory (PRAM), ferroelectric random-access memory (FeRAM), non-volatile random-access memory (NVRAM), magnetoresistive random-access memory (MRAM), resistive random-access memory (RRAM), Silicon-Oxide-Nitride-Oxide-Silicon memory (SONOS), floating junction gate random access memory (FJG RAM), Millipede memory, racetrack memory, and/or the like.
- a volatile computer-readable storage medium may include random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), fast page mode dynamic random access memory (FPM DRAM), extended data-out dynamic random access memory (EDO DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), double data rate type two synchronous dynamic random access memory (DDR2 SDRAM), double data rate type three synchronous dynamic random access memory (DDR3 SDRAM), Rambus dynamic random access memory (RDRAM), Twin Transistor RAM (TTRAM), Thyristor RAM (T-RAM), Zero-capacitor (Z-RAM), Rambus in-line memory module (RIMM), dual in-line memory module (DIMM), single in-line memory module (SIMM), video random access memory (VRAM), cache memory (including various levels), flash memory, register memory, and/or the like.
- embodiments of the present disclosure may also be implemented as methods, apparatus, systems, computing devices, computing entities, and/or the like. As such, embodiments of the present disclosure may take the form of an apparatus, system, computing device, computing entity, and/or the like executing instructions stored on a computer-readable storage medium to perform certain steps or operations. However, embodiments of the present disclosure may also take the form of an entirely hardware embodiment performing certain steps or operations.
- retrieval, loading, and/or execution may be performed in parallel such that multiple instructions are retrieved, loaded, and/or executed together.
- such embodiments can produce specifically-configured machines performing the steps or operations specified in the block diagrams and flowchart illustrations. Accordingly, the block diagrams and flowchart illustrations support various combinations of embodiments for performing the specified instructions, operations, or steps.
- FIG. 1 is a schematic diagram showing the exemplary communication relationships between components of various embodiments of the present disclosure.
- the system may include one or more control systems 100, one or more user devices 110, optionally one or more location devices 415 associated with a location 400 (e.g., a sort location or a pick location), optionally one or more conveyor belt assemblies 800, and one or more networks 105.
- Each of the components of the system may be in electronic communication with one another over the same or different wireless or wired networks including, for example, a wired or wireless Personal Area Network (PAN), Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), or the like.
- While FIG. 1 illustrates certain system entities as separate, standalone entities, the various embodiments are not limited to this particular architecture.
- FIG. 2 provides a schematic of a control system 100 according to one embodiment of the present disclosure.
- the control system 100 may be incorporated into a system as one or more components for providing information regarding the appropriate location 400 for each of one or more assets 10 (see FIGS. 8-12 ).
- computing entity may refer to, for example, one or more computers, computing entities, desktops, mobile phones, tablets, phablets, notebooks, laptops, distributed systems, gaming consoles (e.g., Xbox, Play Station, Wii), watches, glasses, key fobs, radio frequency identification (RFID) tags, ear pieces, scanners, televisions, dongles, cameras, wristbands, kiosks, input terminals, servers or server networks, blades, gateways, switches, processing devices, processing entities, set-top boxes, relays, routers, network access points, base stations, the like, and/or any combination of devices or entities adapted to perform the functions, operations, and/or processes described herein.
- Such functions, operations, and/or processes may include, for example, transmitting, receiving, operating on, processing, displaying, storing, determining, creating/generating, monitoring, evaluating, comparing, and/or similar terms used herein interchangeably. In one embodiment, these functions, operations, and/or processes can be performed on data, content, information, and/or similar terms used herein interchangeably.
- the control system 100 may also comprise various other systems, such as an Address Matching System (AMS), an Internet Membership System (IMS), a Customer Profile System (CPS), a Package Center Information System (PCIS), a Customized Pickup and Delivery System (CPAD), a Web Content Management System (WCMS), a Notification Email System (NES), a Fraud Prevention System (FPS), and a variety of other systems and their corresponding components.
- the control system 100 may include or be in communication with one or more processing elements 205 (also referred to as processors, processing circuitry, and/or similar terms used herein interchangeably) that communicate with other elements within the control system 100 via a bus, for example.
- the processing element 205 may be embodied in a number of different ways.
- the processing element 205 may be embodied as one or more complex programmable logic devices (CPLDs), microprocessors, multi-core processors, co-processing entities, application-specific instruction-set processors (ASIPs), microcontrollers, and/or controllers.
- the processing element 205 may be embodied as one or more other processing devices or circuitry.
- circuitry may refer to an entirely hardware embodiment or a combination of hardware and computer program products.
- the processing element 205 may be embodied as integrated circuits, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic arrays (PLAs), hardware accelerators, other circuitry, and/or the like.
- the processing element 205 may be configured for a particular use or configured to execute instructions stored in volatile or non-volatile media or otherwise accessible to the processing element 205 .
- the processing element 205 may be capable of performing steps or operations according to embodiments of the present disclosure when configured accordingly.
- control system 100 may further include or be in communication with non-volatile media (also referred to as non-volatile storage, memory, memory storage, memory circuitry and/or similar terms used herein interchangeably).
- non-volatile storage or memory may include one or more non-volatile storage or memory media 210 , including but not limited to hard disks, ROM, PROM, EPROM, EEPROM, flash memory, MMCs, SD memory cards, Memory Sticks, CBRAM, PRAM, FeRAM, NVRAM, MRAM, RRAM, SONOS, FJG RAM, Millipede memory, racetrack memory, and/or the like.
- the non-volatile storage or memory media may store databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like.
- code may include an operating system, an acquisition module, a sort location module, a matching module, and a notification module.
- database, database instance, database management system, and/or similar terms used herein interchangeably may refer to a structured collection of records or data that is stored in a computer-readable storage medium, such as via a relational database, hierarchical database, and/or network database.
- control system 100 may further include or be in communication with volatile media (also referred to as volatile storage, memory, memory storage, memory circuitry and/or similar terms used herein interchangeably).
- volatile storage or memory may also include one or more volatile storage or memory media 215 , including but not limited to RAM, DRAM, SRAM, FPM DRAM, EDO DRAM, SDRAM, DDR SDRAM, DDR2 SDRAM, DDR3 SDRAM, RDRAM, TTRAM, T-RAM, Z-RAM, RIMM, DIMM, SIMM, VRAM, cache memory, register memory, and/or the like.
- the volatile storage or memory media may be used to store at least portions of the databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like being executed by, for example, the processing element 205 .
- the databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like may be used to control certain aspects of the operation of the control system 100 with the assistance of the processing element 205 and operating system.
- control system 100 may also include one or more communications interfaces 220 for communicating with various computing entities, such as by communicating data, content, information, and/or similar terms used herein interchangeably that can be transmitted, received, operated on, processed, displayed, stored, and/or the like.
- communications may be executed using a wired data transmission protocol, such as fiber distributed data interface (FDDI), digital subscriber line (DSL), Ethernet, asynchronous transfer mode (ATM), frame relay, data over cable service interface specification (DOCSIS), or any other wired transmission protocol.
- control system 100 may be configured to communicate via wireless external communication networks using any of a variety of protocols, such as general packet radio service (GPRS), Universal Mobile Telecommunications System (UMTS), Code Division Multiple Access 2000 (CDMA2000), CDMA2000 1X (1xRTT), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), Evolved Universal Terrestrial Radio Access Network (E-UTRAN), Evolution-Data Optimized (EVDO), High Speed Packet Access (HSPA), High-Speed Downlink Packet Access (HSDPA), IEEE 802.11 (Wi-Fi), 802.16 (WiMAX), ultra-wideband (UWB), infrared (IR) protocols, near field communication (NFC) protocols, Bluetooth™ protocols (e.g., Bluetooth™ Smart), wireless universal serial bus (USB) protocols, and/or any other wireless protocol.
- the control system 100 may include or be in communication with one or more input elements, such as a keyboard input, a mouse input, a touch screen/display input, motion input, movement input, audio input, pointing device input, joystick input, keypad input, and/or the like.
- the control system 100 may also include or be in communication with one or more output elements (not shown), such as audio output, video output, screen/display output, motion output, movement output, and/or the like.
- control system 100 components may be located remotely from other control system 100 components, such as in a distributed system. Furthermore, one or more of the components may be combined and additional components performing functions described herein may be included in the control system 100 .
- the control system 100 can be adapted to accommodate a variety of needs and circumstances. As will be recognized, these architectures and descriptions are provided for exemplary purposes only and are not limiting to the various embodiments. Additional details in this respect may be understood from U.S. Ser. No. 15/390,109, the contents of which are incorporated herein by reference in their entirety.
- control system 100 may be generally configured to maintain and/or update a defined location map associated with a facility or warehouse in which the user device(s) will be operated. This may be maintained for provision to the user device(s) upon calibration or initial “environment mapping” (see FIG. 8 ) via the user device(s); in other embodiments, the control system may maintain the defined location map—indicating where each package or asset should be located (whether for picking or sorting)—as a fail-safe check or validation to be assessed against the environment mapping conducted via and at the user device(s).
- location devices 415 (or identifier tags/codes/or the like) may be provided at the respective locations, for scanning or recognition via the user device(s) during calibration and/or environment mapping. In at least one preferred embodiment, however, the environment mapping occurs without need for or utilization of such location devices 415 .
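The fail-safe check described above, comparing the device's environment mapping against the control system's defined location map, might look like the following sketch; the tolerance value and the map structures are assumptions for illustration.

```python
# Compare locations the user device mapped in the field against the control
# system's defined location map and flag deviations beyond a tolerance.
TOLERANCE_M = 0.5   # assumed tolerance; the disclosure does not fix a value

defined_map = {"sort-A": (2.0, 3.0), "sort-B": (8.0, 1.5)}   # control system
device_map = {"sort-A": (2.1, 3.1), "sort-B": (9.2, 1.4)}    # environment mapping

def validate(defined, observed, tol=TOLERANCE_M):
    """Return names of defined locations the device map disagrees with."""
    mismatches = []
    for name, (dx, dy) in defined.items():
        ox, oy = observed.get(name, (float("inf"), float("inf")))
        if ((ox - dx) ** 2 + (oy - dy) ** 2) ** 0.5 > tol:
            mismatches.append(name)
    return mismatches

print(validate(defined_map, device_map))   # ['sort-B']
```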
- FIG. 3 depicts a user device 110 that a user 5 (e.g., user 5 of FIGS. 8-12 ) may operate.
- a user may be an individual (e.g., carrier personnel, such as a sort employee, a pick employee, or the like), a group of individuals, and/or the like.
- the user may operate the user device 110 , which may include one or more components that are functionally similar to those of the control system 100 .
- the user device 110 may be one or more mobile phones, tablets, watches, glasses (e.g., Google Glass, HoloLens, Vuzix M-100, SeeThru, Optinvent ORA-S, Epson Moverio BT-300, Epson Moverio BT-2000, ODG R-7, binocular Smart Glasses, monocular Smart Glasses, and the like), wristbands, and the like, and/or any combination of devices or entities adapted to perform the functions, operations, and/or processes described herein.
- the user device 110 is a hands-free type device, including wearable items/devices (e.g., the user devices of FIGS. 4 and 5) and/or head-mounted displays (HMDs) and the like.
- the user device 110 is configured to not impede the user's range of vision and/or the like during the use thereof, as is possible in other embodiments wherein, for example, the user device 110 is a type of glasses or the like.
- the term user device 110 is intended to refer to any device that projects, superimposes, overlays, or otherwise provides an image or projection on a surface with respect to a user's viewing angle or line of vision, or with respect to the user device 110's angle.
- the device 110 is configured for chest mounting via a mounting mechanism 112 in the illustrated embodiment.
- the mounting mechanism 112 may include a shoulder strap portion 112 - 1 configured to adjustably wrap around a user's anterior and posterior portion of the user's shoulders and a waist portion 112 - 2 configured to adjustably wrap around the user's waist.
- the shoulder strap portion 112 - 1 is connected to a top portion of the device component 114 at the anterior side and a top portion of the waist portion 112 - 2 at the posterior side.
- the device 110 may come in any suitable form and be mounted in any suitable manner and in any suitable location.
- the user device 110 does not include the mounting mechanism 112 and may be mounted to a component: on the user's head (e.g., a hardhat), within a wristband, within a sock, within a glove, within a shirt, and/or within any other suitable article of clothing in any orientation.
- a mounting mechanism (not illustrated) on a picking cart or a forklift or any type of component operated and/or being moved by the carrier personnel.
- other mounts could place the device 110 on top of the head (e.g., via a helmet, cap, or headband) or on the shoulder, in order to avoid covering the projection element when carrying a parcel in front of the chest.
- the projection element, the sensors and the processing units are mounted in different parts of the body to get a better weight distribution.
- the user device 110 in its hands-free form may include not only the mounting mechanism 112 but also a device component 114 that together define and constitute the user device 110 in some embodiments.
- the user device 110 in its hands-free form may include an antenna 115 (e.g., the antenna 312 of the user device of FIG. 3 ), a camera 116 , a speaker/microphone 117 , a pivoting laser projector 118 and two or more three-dimensional depth sensors 119 .
- the term user device 110 is intended to also include any other peripheral electronics and functionality that may be provided in conjunction with such devices.
- the user device 110 may include speakers, headphones, or other electronic hardware for audio output, a plurality of display devices, one or more position sensors (e.g., gyroscopes, global positioning system receivers, and/or accelerometers), battery packs, beacons for external sensors (e.g., infrared lamps), or the like.
- the user device 110 can be used to provide an augmented reality environment/area, a mixed reality environment/area, and/or similar words, as may be used herein interchangeably, to a user.
- the terms augmented/mixed environment/area should be understood to refer to a combined environment/area including the physical environment/area and elements of a virtual environment/area.
- the pivoting laser projector 118 is alternatively an LED picoprojector.
- the device component 114 alternatively or additionally includes different sensors for various functions, such as one or more digital compasses, accelerometers and/or gyroscopes configured to determine changes in position or speed of a user such that the pivoting laser projector 118 projects the correct image in the correct orientation. For example, if the user is hanging in a sideways manner, an accelerometer can detect that the associated device component 114 is likewise oriented. This information can be identified by a processor, which causes the pivoting laser projector 118 to responsively transmit a projection in a sideways manner, as opposed to a manner associated with the user standing on his/her feet.
- the user can be running or otherwise moving at a particular speed, which causes the projector 118 to update projections faster or slower based on the user's speed or acceleration.
- these movement sensors can be used for notification purposes to the control system 100 .
- the accelerometer may infer that a person is in a particular orientation. These accelerometer readings may then be transmitted, via the antenna 115 , to the control system 100 such that the control system 100 responsively transmits a notification back to the device component 114 in order to warn or notify the user whether the user is in a suitable orientation.
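A minimal sketch of the orientation inference described above, assuming a static accelerometer reading and an invented 45-degree threshold: the gravity direction in the device frame yields a roll estimate, from which a counter-rotation for the projection can be derived.

```python
import math

def roll_from_accel(ax, ay, az):
    """Estimate roll (degrees) from a static accelerometer reading.

    In the device frame assumed here, an upright wearer sees gravity at
    roughly (0, 0, -9.8); leaning sideways rotates it into the y axis.
    """
    return math.degrees(math.atan2(ay, az))

def projection_rotation(ax, ay, az, threshold_deg=45.0):
    """Rotate the projected image to compensate for the wearer's orientation."""
    roll = roll_from_accel(ax, ay, az)
    if abs(roll) > threshold_deg:
        return -roll    # counter-rotate so the projection reads upright
    return 0.0

# Wearer leaning roughly 90 degrees sideways: gravity appears along +y.
print(projection_rotation(0.0, 9.8, -0.5))
```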
- Other sensors may be used alternatively or additionally, such as range finders to identify how far away the device component 114 is from obstacles (e.g. conveyor devices) within an environment.
- the projection 810 may be projected in its specific orientation based at least on one or more range finders identifying the precise distance between a user device and the conveying mechanism 802 .
- proximity-based sensors (e.g., an RFID reader and tag) may also be used to detect when the device component 114 is near an object (e.g., an asset and/or conveying mechanism).
- the device component 114 may include a tag reader that, when within a proximity or signal strength threshold of a tag located on an asset/object, triggers projections and/or notifications from the control system 100 .
- assets or other pieces of equipment may include one or more beacons configured to transmit location identifiers to any listening device within a threshold distance.
- the listening device may be the device component 114 , which receives the location identifiers and transmits them to the control system 100 such that the control system 100 provides responsive notifications back to the device component 114 , such as “pick package Y from shelf X,” etc.
- the device component 114 includes one or more location sensors (e.g., beacons, GPS modules) for determining and analyzing location data as described herein.
- the user device 110 can include an antenna 312 (e.g., the antenna 115 of FIG. 5 ), a transmitter 304 (e.g., radio), a receiver 306 (e.g., radio), and a processing element 308 (e.g., CPLDs, microprocessors, multi-core processors, co-processing entities, ASIPs, microcontrollers, and/or controllers) that provides signals to and receives signals from the transmitter 304 and receiver 306 , respectively.
- Certain embodiments of the user device 110 may also include and/or be associated with any of a variety of sensors (e.g., three-dimensional sensors, such as the depth sensors 119 of FIG. 5 ), depth cameras (e.g., the camera 116 of FIG. 5 ), three-dimensional scanners, binocular cameras, stereo-vision systems, pivoting projectors (e.g., the laser projector 118 of FIG. 5 ).
- the three-dimensional scanners may be utilized to “read” the environment surrounding the user device 110 , as detailed elsewhere herein.
- the sensors and/or scanners may build therefrom a three-dimensional model of the area through which the device 110 travels and/or has travelled. This generated model, as detailed elsewhere herein, may then be compared by one or more processors within the device 110 to a memory-based map of the facility or area (i.e., the environment). By doing so, the scanner readings may be used to determine which area of the map is in front of a user of the user device 110 (during operation) and extrapolate from the same the position and heading of the user for future movement.
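The comparison of scanner readings against a memory-based map is, in spirit, a scan-matching problem. The toy sketch below matches a 1-D depth profile against a stored profile by minimizing squared error; production systems would use full 3-D scan matching, so this is only a schematic of the idea.

```python
# Match a local depth "scan" against a stored facility profile to recover
# which part of the map is in front of the device (minimum sum of squared
# differences over all candidate offsets).
stored_map = [3, 3, 1, 4, 4, 2, 5, 1, 1, 3, 2, 4]   # depth profile along a wall
local_scan = [4, 2, 5, 1]                           # what the sensors see now

def best_offset(map_profile, scan):
    """Return the offset where the scan best matches the map (min SSD)."""
    best, best_err = None, float("inf")
    for off in range(len(map_profile) - len(scan) + 1):
        err = sum((map_profile[off + i] - s) ** 2 for i, s in enumerate(scan))
        if err < best_err:
            best, best_err = off, err
    return best

print(best_offset(stored_map, local_scan))   # 4 -> device faces map cell 4
```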
- the pivoting projectors may be the pivoting laser projector 118 of FIG. 5 ; although, in other embodiments the projectors need not necessarily be laser-based.
- the projectors are configured to generate and provide—in a manner visible to the user (e.g., the carrier personnel)—one or more navigational guidance projections (e.g., arrows, frames, text, and/or the like). These projections, as will be discussed in further detail elsewhere herein, may be two-dimensional representations (e.g., as illustrated in FIGS. 8-11C ), three-dimensional representations (e.g., the projection 810 of FIG. 12 ), and/or holographic-based projections, or the like.
- the projections may be provided on a floor surface, a wall surface, and/or on or adjacent a shelving structure, as detailed elsewhere herein.
- a separate sensor, a regular camera, or the like may also be provided on the user device 110 for reading the projected image(s), and therefrom verifying whether the projector is working properly and/or whether the projected result is readable.
- the projections provided via the user device are updated in real-time or near-real-time, as the user moves physically.
- a refresh rate in the range of 35-60 times per second may be provided, although differing refresh rates may be desirable, provided that the rate is substantially real-time or near real-time in nature.
- the camera 116 of the user device component 114 illustrated therein in its hands-free form may be utilized as a fail-safe for visual confirmation or the like of correct/accurate handling of an asset or package by the user of the user device 110 .
- the camera 116 captures each asset identifier and/or asset location identifier as a user traverses an environment. This location data and asset identifier data may then be transmitted, in near-real time via the antenna 115 , to the control system 100 .
- the control system 100 may then compare the asset identifier to asset identifiers stored in a data store to identify an asset and do the same with the captured location.
- the control system 100 may identify any discrepancies between the asset and the location by locating any mismatches between identifiers. For example, the control system 100 may determine that the identifier associated with package X should be located, picked, and/or placed at shelf Y, but the camera 116 captured it located in, picked, sorted and/or placed at shelf B. A notification indicating this may be responsively transmitted back to the device 114 such that the speaker 117 issues a prompt indicating the discrepancy and/or telling the user where the correct location is for the particular package. In some embodiments, the device component 114 itself determines this information without the need to transmit the information to the control system 100 for processing. In some embodiments, other notifications may be provided additionally or alternatively, such as a visual notification within a display screen on the device component and/or a notification that causes vibration of the device component 114 .
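The discrepancy check described above reduces to comparing an observed (asset, location) pair against the expected assignment; a minimal sketch, with invented identifier formats, follows.

```python
# Compare where a package was observed (via the camera) against where the
# control system's data store says it should be, and produce a notification.
expected = {"PKG-X": "shelf-Y"}   # expected location per the data store

def check_placement(asset_id, observed_shelf):
    """Return a confirmation or a discrepancy notification string."""
    want = expected.get(asset_id)
    if want is None:
        return f"unknown asset {asset_id}"
    if observed_shelf != want:
        return f"{asset_id} seen at {observed_shelf}; correct location is {want}"
    return "placement confirmed"

print(check_placement("PKG-X", "shelf-B"))
# -> 'PKG-X seen at shelf-B; correct location is shelf-Y'
```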
- the camera 116 may be utilized as a verification mechanism for ensuring that the projector 118 is working properly and/or is displaying readable projections.
- the device component 114 may stream in near-real-time information captured via the camera 116 to the control system 100 . If no projections are captured, this may trigger an alert (e.g., to a supervisor mobile device), which indicates that the projections are not being made. Likewise, if a projection is not verified (e.g., because there is a lot of light reducing projection image boundaries), a notification can be made in a similar manner as described above.
- the speaker 117 may be utilized in conjunction therewith, so as to provide audible commands to the user (e.g., delivered from the control system 100 to the component 114 via the antenna 115 ) should a deviation occur and/or to enable the user of the user device to communicate, via the network, with the control system in a near real-time or real-time manner.
- the speaker 117 alternatively or additionally includes a microphone that picks up sound variations that are stored in the memory.
- the sound variations may correspond to a command or natural language phrase issued by the user, such as “where do I find item X?” or “where is shelf Y located?” Responsively, these sound variations are transmitted, via the antenna 115 , to the control system 100 .
- control system 100 may employ one or more voice recognition algorithms to interpret the sound variations and provide one or more responsive notifications back to the device component 114 , such that the speaker 117 provides the notification output. For example, in response to the user question of “where do I find item X?” the control system 100 may interpret the phrase and identify a data structure that associates item X with its location. The control system 100 may then responsively transmit to the device component 114 a notification that causes the speaker 117 to output the location of item X .
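- As an illustrative sketch of that round trip (transcribed speech in, spoken location out), assuming a toy intent pattern and data store that the disclosure does not prescribe:

```python
import re

ASSET_LOCATIONS = {"X": "aisle 3, shelf Y, bin 2"}  # stand-in data store


def answer_query(transcript):
    """Map a transcribed question to a spoken-notification string."""
    match = re.search(r"where (?:do i find|is) item (\w+)", transcript.lower())
    if not match:
        return "Sorry, I did not understand the request."
    item = match.group(1).upper()
    location = ASSET_LOCATIONS.get(item)
    return (f"Item {item} is located at {location}." if location
            else f"Item {item} was not found.")


print(answer_query("Where do I find item X?"))
# -> Item X is located at aisle 3, shelf Y, bin 2.
```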
- the signals provided to and received from the transmitter 304 and the receiver 306 may include signaling information in accordance with air interface standards of applicable wireless systems.
- the user device 110 may be capable of operating with one or more air interface standards, communication protocols, modulation types, and access types. More particularly, the user device 110 may operate in accordance with any of a number of wireless communication standards and protocols, such as those described above with regard to the control system 100 .
- the user device 110 may operate in accordance with multiple wireless communication standards and protocols, such as UMTS, CDMA2000, 1×RTT, WCDMA, TD-SCDMA, LTE, E-UTRAN, EVDO, HSPA, HSDPA, Wi-Fi, WiMAX, UWB, IR, NFC, Bluetooth™ Smart, USB, and/or the like.
- the user device 110 may operate in accordance with multiple wired communication standards and protocols, such as those described above with regard to the control system 100 via a network interface 320 .
- the user device 110 can communicate with various other entities (e.g., the control system 100 , a location device 415 , or the like) using concepts such as Unstructured Supplementary Service Data (USSD), Short Message Service (SMS), Multimedia Messaging Service (MMS), Dual-Tone Multi-Frequency Signaling (DTMF), and/or Subscriber Identity Module Dialer (SIM dialer).
- the user device 110 can also download changes, add-ons, and updates, for instance, to its firmware, software (e.g., including executable instructions, applications, program modules), and operating system; this may occur periodically, upon initiation via a user of the user device 110 , or upon cue(s) received at the user device from the control system 100 .
- the user device 110 may also include a location and/or perspective determining aspect, device, module, functionality, and/or similar words used herein interchangeably.
- the user device 110 may include outdoor and/or environmental positioning aspects, such as a location module adapted to acquire, for example, latitude, longitude, geocode, course, direction, heading, speed, universal time (UTC), date, and/or various other information/data.
- the location module can acquire data, sometimes known as ephemeris data, by identifying the number of satellites in view and the relative positions of those satellites.
- the satellites may be a variety of different satellites, including Low Earth Orbit (LEO) satellite systems, Department of Defense (DOD) satellite systems, the European Union Galileo positioning systems, the Chinese Compass navigation systems, Indian Regional Navigational satellite systems, and/or the like.
- the location information may be determined by triangulating the user device 110 's position in connection with a variety of other systems, including cellular towers, Wi-Fi access points, and/or the like.
- the user device 110 may include indoor positioning aspects, such as a location/environment module adapted to acquire, for example, latitude, longitude, geocode, course, direction, heading, speed, time, date, and/or various other information/data.
- Some of the indoor systems may use various position or location technologies including RFID tags, indoor beacons or transmitters, Wi-Fi access points, cellular towers, nearby computing devices (e.g., smartphones, laptops), nearby components with known relative locations, and/or the like.
- such technologies may include iBeacons, Gimbal proximity beacons, Bluetooth Low Energy (BLE) transmitters, Near Field Communication (NFC) transmitters, three-dimensional scanners, robot vision systems, environmental mapping devices, and/or the like.
- the user device 110 may also detect markers and/or target objects.
- the user device 110 may include readers, scanners, cameras, sensors, and/or the like for detecting when a marker, target object, and/or pattern of unique colors is within its point-of-view (POV)/field-of-view (FOV) of the real-world environment/area.
- readers, scanners, cameras, sensors, and/or the like may include RFID readers/interrogators to read RFID tags, scanners and cameras to capture visual patterns and/or codes (e.g., text, barcodes, character strings, Aztec Codes, MaxiCodes, Data Matrices, QR Codes, electronic representations, and/or the like), and sensors to detect beacon signals transmitted from target objects or the environment/area in which target objects are located.
- the user device 110 may detect signals transmitted from the control system 100 ( FIGS. 1-2 ), an asset 10 ( FIG. 8 ), an improved conveyor belt assembly ( FIG. 12 ), and/or from a location device 415 ( FIG. 1 ), as may be desirable or advantageous.
- the user device 110 may include accelerometer circuitry for detecting movement, pitch, bearing, orientation, and the like of the user device 110 .
- This information/data may be used to determine which area of the augmented/mixed environment/area corresponds to the orientation/bearing of the user device 110 (e.g., x, y, and z axes), so that the corresponding environment/area of the augmented/mixed environment/area may be displayed via the display along with a displayed image.
- the user device 110 may overlay an image in a portion of the user's POV/FOV of the real world environment/area.
- the user device 110 may also include circuitry and/or software for determining when a change in the handling of a package or asset by a user of the user device has occurred. Exemplary changes detected may include the picking up of an asset or package, the setting down of an asset or package, or the like.
- the user device 110 may also comprise or be associated with an asset indicia reader, device, module, functionality, and/or similar words used herein interchangeably.
- the user device 110 may include a camera or RFID tag reader configured to receive information from passive RFID tags and/or from active RFID tags associated with an asset 10 .
- the user device 110 may additionally or alternatively include an optical reader configured for receiving information printed on an asset 10 .
- the optical reader may be configured to receive information stored as a bar code, QR code, or other machine-readable code.
- the optical reader may be integral to the user device 110 and/or may be an external peripheral device in electronic communication with the user device 110 .
- the optical reader may also or alternatively be configured to receive information stored as human readable text, such as characters, character strings, symbols, and/or the like.
- the user device 110 may utilize the asset indicia reader to receive information regarding an asset 10 to be sorted.
- the user device 110 may be equipped with an optical reader or the like configured to receive and/or monitor information associated with an associated conveyor belt, as detailed elsewhere herein.
- the optical reader may be configured to receive and/or otherwise monitor and/or recognize a pattern located on the conveyor belt and associated with respective assets or packages. Additional details in this respect may be understood with reference to U.S. Ser. No. 15/581,609, the contents of which are incorporated by reference herein in their entirety.
- the user device 110 may also comprise a user interface (that can include a display or see-through display 314 coupled to a processing element 308 and/or a user input device 318 coupled to a processing element 308 ).
- a user interface may be a user application, browser, user interface, and/or similar words used herein interchangeably executing on and/or accessible via the user device 110 to interact with and/or cause display of information, as described herein.
- the user interface can comprise any of a number of devices allowing the user device 110 to receive data, such as a keypad (hard or soft), a touch display, voice or motion interfaces, or other input device.
- the keypad can include (or cause display of) the conventional numeric (0-9) and related keys (#, *), and other keys used for operating the user device 110 and may include a full set of alphabetic keys or set of keys that may be activated to provide a full set of alphanumeric keys.
- the user input interface can be used, for example, to activate or deactivate certain functions, such as screen savers and/or sleep modes.
- the user device 110 can also include volatile storage or memory 322 and/or non-volatile storage or memory 324 , which can be embedded and/or may be removable.
- the non-volatile memory may be ROM, PROM, EPROM, EEPROM, flash memory, MMCs, SD memory cards, Memory Sticks, CBRAM, PRAM, FeRAM, NVRAM, MRAM, RRAM, SONOS, FJG RAM, Millipede memory, racetrack memory, and/or the like.
- the volatile memory may be RAM, DRAM, SRAM, FPM DRAM, EDO DRAM, SDRAM, DDR SDRAM, DDR2 SDRAM, DDR3 SDRAM, RDRAM, TTRAM, T-RAM, Z-RAM, RIMM, DIMM, SIMM, VRAM, cache memory, register memory, and/or the like.
- the volatile and non-volatile storage or memory can store databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like to implement the functions of the user device 110 . As indicated, this may include a user application that is resident on the entity or accessible through a browser or other user interface for communicating with the control system 100 ( FIG. 2 ), location device 415 ( FIG. 1 ), and/or various other computing entities.
- the user device 110 may include one or more components or functionality that are the same or similar to those of the control system 100 , as described in greater detail above. As will be recognized, these architectures and descriptions are provided for exemplary purposes only and are not limiting to the various embodiments.
- an information gathering device may be provided via a combination of the image camera 116 that is mounted on the device component 114 and/or the three-dimensional depth sensors 119 .
- the information gathering device may be a three-dimensional depth sensor, a stereo camera, and/or the like—utilized independently relative to the camera.
- the displayed or captured image data (e.g., FIGS. 8-11 described elsewhere herein) is merged with objects in the physical world/environment in a seamless manner, so as to provide a sense that the displayed image(s) or projection is an extension of the reality present in the physical world/environment.
- the overlay provided via the user device 110 and its physical component 114 is at least a two-dimensional representation that is projected ahead of or before the user of the user device, so as to provide handling/movement guidance (i.e., navigational guidance) for the user during transport of, or travel to initiate transport of, an asset or package.
- FIG. 12 depicts a conveyor belt assembly 800 in communication with the control system 100 , where the improved conveyor belt assembly facilitates obtaining of asset 10 information.
- the conveyor belt assembly 800 may comprise a conveying mechanism 802 and an acquisition/display entity 804 (for capturing the asset 10 information), each of which is described in further detail in U.S. Ser. No. 15/581,609, the contents of which are incorporated by reference herein in their entirety.
- a user 5 may be guided toward a particular asset or package on the conveying mechanism 802 via one or more navigational projections 810 .
- the projections 810 are provided in a three-dimensional form, as may be compared with the two-dimensional projections in FIGS. 8-11C . Either may be utilized interchangeably, as may be holographic-based projections or the like. Again, as mentioned, additional detail in this respect may be obtained from U.S. Ser. No. 15/581,609, the contents of which are incorporated by reference herein in their entirety.
- a hologram or holographic-based projection is a recording of a light field, as opposed to an image formed by a lens (e.g., of a pair of smart glasses), and is used to display a fully three-dimensional image without the use or aid of special glasses or other intermediate objects.
- a hologram can be displayed within any physical geographical environment without the need of any projecting medium (e.g., projector screen, object, or lens).
- the navigation projection 810 is or includes a multidimensional image that represents a volumetric display, which is a visual representation of an object in at least three physical dimensions, as opposed to simulating depth or multiple dimensions through visual effects.
- the same projected object looks different from various perspectives (e.g., a side view, versus a front view, versus a back view).
- a first front view can include a first arrow and first instructions for worker X to pick/sort from, while from another perspective the same first arrow can include second instructions for worker Y to pick/sort from, etc.
- one or more locations 400 may be associated with one or more (optionally provided) location devices 415 , with both being configured for identifying one or more assets 10 being sorted to each location 400 .
- locations 400 may include one or more vehicles (e.g., aircraft, tractor-trailer, cargo container, local delivery vehicles, and/or the like), pallets, identified areas within a building, bins, chutes, conveyor belts, shelves, and/or the like.
- the locations may be sort locations (for transport of the asset for additional movement/handling) or pick locations (for storing of the asset until it needs to be picked or “pulled” for order fulfillment purposes or the like).
- the one or more location devices 415 may be attached to a location 400 and/or located more generally within and/or at the location 1400 (see FIGS. 13A-13F ). Alternatively the one or more location devices 415 may be located adjacent to a sort location 400 / 1400 or otherwise proximate the sort location 400 / 1400 . In various embodiments, a location device 415 may be located proximate to an area designated to store the sort location 400 / 1400 . For example, when the sort location 400 includes a delivery vehicle, a location device 415 may be located above each of a plurality of parking areas designated for one or more delivery vehicles. This may apply equally relative to sort and/or pick locations (e.g., FIGS. 13D-F ).
- the one or more location devices 415 may include components functionally similar to the control system 100 and/or the user device 110 .
- the term “computing entity” may refer to, for example, one or more computers, computing entities, desktops, mobile phones, tablets, phablets, notebooks, laptops, distributed systems, gaming consoles (e.g., Xbox, Play Station, Wii), watches, glasses, key fobs, RFID tags, ear pieces, scanners, televisions, dongles, cameras, wristbands, kiosks, input terminals, servers or server networks, blades, gateways, switches, processing devices, processing entities, set-top boxes, relays, routers, network access points, base stations, the like, and/or any combination of devices or entities adapted to perform the functions, operations, and/or processes described herein.
- the location device 415 can include an antenna, a transmitter (e.g., radio), a receiver (e.g., radio), and a processing element (e.g., CPLDs, microprocessors, multi-core processors, co-processing entities, ASIPs, microcontrollers, and/or controllers) that provides signals to and receives signals from the transmitter and receiver, respectively.
- the signals provided to and received from the transmitter and the receiver, respectively, may include signaling information in accordance with air interface standards of applicable wireless systems.
- the location device 415 may be capable of operating with one or more air interface standards, communication protocols, modulation types, and access types. More particularly, the location device 415 may operate in accordance with any of a number of wireless communication standards and protocols, such as those described above with regard to the control system 100 .
- the location device 415 may operate in accordance with multiple wireless communication standards and protocols, such as UMTS, CDMA2000, 1×RTT, WCDMA, TD-SCDMA, LTE, E-UTRAN, EVDO, HSPA, HSDPA, Wi-Fi, WiMAX, UWB, IR, NFC, Bluetooth™, USB, and/or the like.
- the location device 415 may operate in accordance with multiple wired communication standards and protocols, such as those described above with regard to the control system 100 via a network interface.
- the location device 415 can communicate with various other entities (e.g., the user device 110 and/or the control system 100 ) using concepts such as USSD, SMS, MMS, DTMF, and/or SIM dialer.
- the location device 415 can also download changes, add-ons, and updates, for instance, to its firmware, software (e.g., including executable instructions, applications, program modules), and operating system.
- embodiments utilizing the hands-free form of the user device 110 may not utilize any location devices 415 , whereby the location of the various assets (for sorting or picking) may be communicated directly between the user device 110 and the control system 100 , further in conjunction with an environmental mapping capability of the user device 110 , as described further below.
- the location 400 may include one or more vehicles (e.g., aircraft, tractor-trailer, cargo container, local delivery vehicles, and/or the like), pallets, identified areas within a building, bins, chutes, conveyor belts, shelves, and/or the like.
- the location 400 includes a plurality of shelves onto which the assets 10 may be placed and/or removed from. While these figures depict a specific quantity of shelves as being stacked in a vertical direction, it should be understood that any quantity of shelves may be arranged in any suitable configuration to hold the assets 10 .
- Each of the shelves may include one or more visual indicators (e.g., 715 , FIG. 10 ) positioned on or proximate to the shelves; however, as mentioned previously herein, certain hands-free user device embodiments (e.g., 110 c ) may dispense with such indicators.
- navigational projections 715 , much like the navigational projections 810 provided in conjunction with the conveyor belt assembly 800 , may assist in identifying an appropriate position for placement and/or removal of the asset 10 within the sort and/or pick location.
- a user 5 may utilize the indicia reader or camera of the user device 110 to scan, read, or otherwise receive asset identifier data from the asset 10 so as to identify, in cooperation with the control system 100 , an appropriate position for placement of the asset 10 within the warehouse or facility, namely at the location 400 / 1400 .
- the control system 100 may determine the appropriate position for placement of the asset within the warehouse relative to the location 400 / 1400 and convey that information to the user device 110 in response to the user device having approached an asset or package (e.g., for sorting).
- the control system 100 may proactively transmit projections to the user device 110 upon receipt of a package or asset order, requiring “picking” of the asset or package from a pick location for order fulfillment purposes or the like.
- a user 5 may utilize the indicia reader or camera of the user device 110 to scan, read, or otherwise receive asset identifier data from the asset 10 at a location 400 that is associated with a mobile storage area (i.e., a delivery vehicle).
- the delivery vehicle may be configured with a projector 900 that proactively transmits navigational projections within the physical space of the storage area that is visible to the user, analogous to the user device 110 .
- the navigational projections may be visible within the physical space of the storage area so as to aid in the selection (i.e., picking) of an asset or package.
- the projector 900 may be used in these and other embodiments involving a delivery vehicle in conjunction with or in place of the user device 110 .
- the location 400 can be mobile or static.
- the location 400 can be mobile as it may be associated with the delivery vehicle.
- the location 400 can be a cargo container associated with the delivery vehicle or within the delivery vehicle itself.
- the location 400 may be static.
- the location 400 can be a storage area within a storefront. Additionally or alternatively, the storage area can be behind a customer counter.
- the location 400 can be within a sorting facility.
- control system 100 may determine the appropriate position for placement of the asset 10 within the location 400 / 1400 based on a variety of factors. For example and without limitation, the control system 100 may determine the appropriate position for placement of the asset 10 within the location 400 based on the destination of the assets 10 .
- an exemplary embodiment illustrates identifying an asset within a storage area of a delivery vehicle.
- the delivery vehicle may be any kind of vehicle, such as an automobile, truck, train, or airplane.
- the control system 100 may be configured to communicate with the delivery vehicle (i.e., location 400 ) in any of the ways and/or manners detailed elsewhere herein.
- the user 5 may also, in exemplary embodiments, utilize a user device 110 in any of the ways and/or manners detailed elsewhere herein.
- the user 5 may dispense with the user device 110 and rely instead upon instructions communicated via a projector 900 mounted on the delivery vehicle (i.e., location 400 ).
- the projector 900 may be configured to communicate with the control system 100 in a manner and/or way analogous to the communication between the control system 100 and the user device 110 , as detailed elsewhere herein.
- utilizing a projector 900 that is mounted within the storage area can be advantageous as it removes the need for the user 5 to wear additional equipment.
- because the user 5 may be physically active (e.g., carrying assets, entering and exiting the vehicle, climbing stairs at a delivery location), any additional equipment, including the user device 110 , could interfere with or encumber the movement of the user 5 .
- smart glasses may be prone to falling off the wearer's head or obstructing the user's sight, presenting potential safety issues.
- an in-storage area mounted projector 900 can rely on a permanent power source. This is in contrast to a portable device that is powered by a portable battery, which may require constant recharging or replacement. As such, an in-storage area mounted system may be advantageous over a user device 110 , in some instances.
- While FIGS. 14A-18 describe a location 400 with respect to the storage area associated with a delivery vehicle, the location 400 can be any physical environment.
- the location 400 can be a physical environment within a warehouse, a sorting facility, a shopping area of a store, and the like.
- the projector 900 can be secured within a physical environment of a static location.
- the location of a set of packages may be illuminated by one or more navigational projections 901 .
- the location of the set of packages may be highlighted by a user device (such as user device 110 of FIG. 4 ) that generates one or more navigational projections.
- the one or more navigational projections 901 as illustrated in FIG. 14A , may be configured according to various embodiments to operate and provide navigational guidance to a user 5 in substantially the same way and/or manner as the one or more navigational projections 715 detailed elsewhere herein.
- the current process for drivers or personnel on a delivery vehicle involves a manual sequencing of loading assets 10 inside the delivery vehicle.
- When the personnel goes to deliver the asset 10 , he or she must sort through the assets to determine the correct asset to deliver.
- Although assets are conventionally loaded following a delivery sequence, the drivers or personnel must still spend some degree of time finding the right box at each stop; oftentimes, additional boxes for a particular stop may be inadvertently overlooked.
- the exemplary embodiment described herein utilizes augmented reality techniques (as described elsewhere herein) to highlight at each stop which asset(s) 10 is to be picked by the drivers or personnel.
- the control system 100 may, over time, improve and understand how best to project and identify individual assets 10 and to build a three-dimensional representation of the assets based on two-dimensional images and/or three-dimensional sensor captured data.
- the projector 900 may be utilized to provide navigational projections (e.g., “light” the asset) to the asset(s) 10 for picking at a particular service stop; this may be done in conjunction with—or in place of—a “lighting” of the asset(s) 10 via the user device 110 , as detailed elsewhere herein.
- the system may also, in addition to the projector 900 , have multiple components, including a scanning device 1500 (e.g., a video camera, a three dimensional sensor or camera, a LIDAR scanner, tomography scanner, wireless RF signal scanner, or the like).
- the scanning device 1500 can be used to capture the size and shape of asset 10 as it is loaded into the storage area and placed onto shelves (i.e., specific locations). Any asset 10 added, stored, or removed, can be captured by the scanning device 1500 .
- the captured information can then be processed via one or more computer processors associated with the scanning device, the control system 100 , and/or the projector 900 .
- Pattern recognition, machine learning, and/or AI-based algorithms may be utilized to identify, from the scanning information, the shape, size, and position of each asset stored. Additionally or alternatively, the control system 100 may determine an asset identifier from the scanning information (which could be determined through image recognition that identifies a label on an asset and/or via wireless sensors detecting an RFID signal, or the like).
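- A minimal sketch of that identifier resolution, assuming hypothetical field names for the RFID read and the image-recognized label:

```python
def resolve_asset(scan, known_assets):
    """Resolve an asset identifier from heterogeneous scan data.

    scan may carry an RFID-read identifier and/or label text decoded by
    image recognition; known_assets maps identifiers to asset records.
    Prefers the RFID read, falls back to the decoded label, and returns
    None when neither resolves to a known asset.
    """
    for key in ("rfid_id", "label_text"):
        candidate = scan.get(key)
        if candidate and candidate in known_assets:
            return candidate
    return None
```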
- Again, while FIGS. 14A-18 describe a location 400 with respect to the storage area of a delivery vehicle, the location 400 can be any physical environment, such as a storage area within a warehouse, a sorting facility, a shopping area of a store, and the like.
- the scanning device 1500 can be secured within a storage area of a static location.
- the control system 100 can cause the scanning device 1500 to capture scanning information of a physical environment of a storage area. Based on obtained scanning information, the control system 100 can track the addition, movement, or removal of the asset 10 from the storage area. For example, the asset 10 may be added to the storage area. The addition of the asset 10 can be detected by the control system 100 via one or more scanning devices 1500 . As a further example, the asset 10 may slide as a result of the delivery vehicle stopping, turning, or accelerating, or a user may move a particular parcel to a different location so as to reach a different parcel. The control system 100 can cause the scanning device 1500 to capture scanning information and then analyze the scanning information to determine that the asset 10 has moved to a new location.
- the control system 100 can store the asset's location within a location database.
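- The add/move/remove tracking described above might be sketched as a diff between scan passes; the position format and the 10 cm movement tolerance below are illustrative assumptions.

```python
import math


def diff_scans(previous, current, tolerance_m=0.10):
    """Derive added/moved/removed events from two scan passes.

    previous and current map asset ids to (x, y, z) positions in meters.
    Movement smaller than tolerance_m is ignored as sensor noise.
    """
    events = []
    for asset_id, pos in current.items():
        if asset_id not in previous:
            events.append(("added", asset_id, pos))
        elif math.dist(previous[asset_id], pos) > tolerance_m:
            events.append(("moved", asset_id, pos))  # e.g., slid while driving
    for asset_id in previous.keys() - current.keys():
        events.append(("removed", asset_id, previous[asset_id]))
    return events


def apply_events(location_db, events):
    """Persist scan-derived events into the asset location database."""
    for kind, asset_id, pos in events:
        if kind == "removed":
            location_db.pop(asset_id, None)
        else:
            location_db[asset_id] = pos
```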
- the projector 900 can provide navigational projections to guide the user 5 to the particular parcel.
- the projector 900 may be configured to illuminate the tracked/monitored/identified asset 10 , as detailed elsewhere herein.
- one or more scanning devices 1500 can be used to obtain location data for an asset within an environment (e.g., a storage area or a cargo area).
- the scanning device 1500 may comprise one or more sensors.
- the scanning device 1500 can include a three dimensional sensor, an image sensor (e.g., video or camera), a laser sensor (e.g., for use in LIDAR), infrared sensor, motion detector, and the like.
- the scanning device(s) 1500 can be secured to and/or mounted within the location 400 in any manner.
- the scanning device 1500 can be mounted in a mobile or fixed fashion.
- the scanning device 1500 can be a mobile 3D sensor that can sweep the location 400 to identify one or more assets.
- the scanning device 1500 may be a component within the projector 900 that rotates. It should be appreciated that the one or more scanning devices 1500 may be secured along the ceiling, walls, or shelves within the storage area (e.g., location 400 ). It should be appreciated that the term secured and/or mounted may refer to a permanent coupling or a temporary coupling such that the scanning device may be removed. Data obtained by the scanning device can then be analyzed (e.g., by the control system 100 ) to generate location data for an asset within the storage area.
- the scanning device 1500 can be mounted to a shelving unit.
- the shelving unit can be positioned such that a face of one shelf is opposite the face of another shelf. As shown in FIG. 15A , the face of a left shelving unit 410 L is opposite the face of the right shelving unit 410 R. While only two shelving units are depicted, the storage area may comprise rows of shelving units, as shown in FIG. 8 .
- one or more scanning devices 1500 can be mounted within the location 400 so as to have a field of view of a shelving unit. As described above, the one or more scanning devices 1500 can be mounted along the ceiling, walls, or shelves within the storage area (e.g., location 400 ). In some aspects, the one or more scanning devices 1500 can be mounted to a particular shelving unit so as to have a field of view of an opposing shelving unit. For example, the scanning device 1500 can be coupled to a front face of the shelving unit. Additionally or alternatively, the one or more scanning devices 1500 can be positioned in series, along a front face of an individual shelf. As shown in FIG. 15B , scanning devices 1500 can be positioned in series along a plurality of shelves. For instance, the scanning devices 1500 can be positioned in series along a first shelf 1530 and a second shelf 1540 .
- the sensors associated with each scanning device 1500 have a field of view of an opposing shelving unit.
- the series of scanning devices 1500 n can be positioned along the shelves of the left shelving unit 410 L such that the field of view of each sensor produces a combined field of view 1520 of one or more shelves of the right shelving unit 410 R.
- a second series of scanning devices 1500 a - f can be positioned along a right shelving unit 410 R to provide a combined field of view of the left shelving unit 410 L. Positioning the scanning device(s) 1500 along a shelf of an opposing shelving unit can be advantageous because it maximizes the field of view of each scanning device 1500 .
- the series of scanning devices 1500 a - f can be spaced apart horizontally (e.g., along the x-axis) and vertically (e.g., along the y-axis). Additionally, while the scanning devices 1500 n are illustrated as being aligned vertically or horizontally, in some embodiments, the scanning devices are not aligned as such.
- the control system 100 can generate location data for the asset 10 representing the asset's physical location within the location 400 .
- the asset's location data can then be stored in an asset location database.
- the control system 100 can update the asset location database based on determining that an asset has been added, moved, or removed from the storage area.
- the control system 100 can detect the addition (or removal) of the asset based on analyzing data obtained via the scanning device 1500 . For instance, the scanning device 1500 can detect (or no longer detect) an asset identifier transmitted via a wireless signal generated by an RFID tag. Additionally or alternatively, the control system 100 can determine that the scanning information no longer includes a visual pattern (e.g., a QR code, particular dimensions of the asset) associated with a previously identified asset.
- the control system 100 can determine whether there is an object (the user 5 , a truck loader, etc.) in the storage area. The control system 100 can then determine that the object may interfere with capturing scanning information.
- the control system 100 can detect an object (e.g., the user 5 ) is in the aisle based on scanning information received via the scanning device(s), such as scanning information captured by an image sensor, a motion sensor, a thermal image sensor, or the like. Based on detecting an object in the aisle, the control system 100 can determine that the object has interfered or will interfere with the scanning information. As such, the control system 100 can delay analyzing the scanning information so as to determine an asset's location.
- control system 100 can analyze the obtained scanning information to generate asset location data.
- the asset location data can then be stored in an asset location database, which can be updated over time as new scanning information is obtained.
- control system 100 instructs the scanning device 1500 to receive scanning information based on determining that no object is detected.
- the control system 100 can generate asset location data based on position data for the scanning device.
- the position data can define the physical location of the scanning device 1500 within the storage area.
- the control system 100 can store position data for each scanning device 1500 a - f .
- the position data can include a measured distance with respect to a point of origin.
- the control system 100 can utilize the point of origin as a reference to generate a value (e.g., coordinates) for the asset location based on the location of the scanning device.
- the point of origin is associated with a physical location of a particular scanning device.
- the point of origin can be associated with the physical location of scanning device 1500 a .
- the position data for each scanning device 1500 b - i can be determined with respect to the point of origin (e.g., scanning device 1500 a ).
- the position data can include a measured distance (e.g., along the x-axis, y-axis, and z-axis) of each scanning device with respect to the point of origin.
- the control system 100 can then utilize the point of origin to determine a value (e.g., coordinates) for the asset's location.
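- In the simplest case this reduces to vector addition of the scanner's offset and its relative measurement, as in the sketch below; the specific offsets are made-up examples.

```python
def asset_coordinates(scanner_offset, relative_measurement):
    """Translate a scanner-relative reading into origin-frame coordinates.

    scanner_offset: the scanner's (x, y, z) position measured from the
    point-of-origin device (e.g., scanning device 1500a).
    relative_measurement: the asset position as seen by that scanner.
    """
    return tuple(s + r for s, r in zip(scanner_offset, relative_measurement))


# A scanner 2.0 m along the aisle from the origin device sees the asset
# 0.5 m further along, 1.2 m up, 0.3 m deep into the shelf:
print(asset_coordinates((2.0, 0.0, 0.0), (0.5, 1.2, 0.3)))  # -> (2.5, 1.2, 0.3)
```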
- the control system 100 can obtain scanning information from scanning device 1500 b .
- the control system 100 can then account for the position of the scanning device when analyzing the scanning information. For instance, the control system 100 can analyze the scanning information received from scanning device 1500 b based on the position of scanning device 1500 b .
- the control system 100 can then generate location data for the asset 10 from the scanning information obtained for each scanning device 1500 n .
- the asset location can then be stored in an asset location database.
- the scanning device 1500 may comprise a wireless signal reader that can determine the location of the asset 10 through wireless signals.
- each asset 10 can be equipped with a tag (e.g., a microchip coupled to an antenna) that emits a wireless RF signal that is received by one or more tag readers associated with the scanning device 1500 .
- the wireless signal emitted by the tag can then be used to determine the location of the asset 10 within the location 400 .
- a distance between the tag and a tag reader can be determined through received signal strength indicator (RSSI) triangulation, Time Difference of Arrival (TDOA), and the like.
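- As one hedged illustration of the RSSI option, the textbook log-distance path-loss model below converts received power to range; the calibration constants are assumptions, and the disclosure does not mandate this particular model.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Estimate tag-to-reader distance in meters from received power.

    tx_power_dbm is the calibrated RSSI at 1 m; path_loss_exponent is
    ~2.0 in free space and higher in cluttered cargo areas.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))


# A -65 dBm reading with the defaults above implies roughly 2 m; three
# such ranges from readers at known positions allow triangulation.
print(round(rssi_to_distance(-65.0), 2))  # -> 2.0
```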
- the control system 100 can identify a particular asset based on asset characteristics captured by the scanning device 1500 .
- the control system 100 can analyze the scanning information to identify a particular visual pattern or characteristics associated with the asset (e.g., a QR code, particular dimensions of the asset, or particular markings on the asset).
- the control system 100 can analyze the scanning information to identify a RF signal emitted from an RFID associated with the asset.
- the control system 100 can then utilize this information to identify the particular asset.
- the identified asset can be associated with a unique identifier (e.g., alphanumeric code).
- the control system 100 can then store the location data in association with the identified asset (e.g., associating the location data with the unique identifier).
- the control system 100 can determine a delivery location associated with the identified asset.
- the control system 100 can reference a database that comprises a delivery location for each identified asset.
- the control system 100 causes a projector 900 to generate one or more navigational projections that identifies an asset to be pulled.
- the one or more navigational projections provided by the projector 900 create a visual cue for the physical location of the asset 10 .
- the projector 900 can illuminate a portion of the environment of the delivery vehicle so as to guide the user 5 to the particular asset.
- the projector 900 can include one or more light sources mounted within the storage area (e.g., location 400 ).
- the one or more light sources can be mounted at any location within the storage area, including a ceiling, a wall, a floor, or a shelving unit.
- the projector 900 can be a light source that is mounted to the ceiling of the storage area.
- the one or more light sources can be mounted along the shelves of the shelving unit 410 (e.g., along a front surface of a shelf).
- the projector 900 may include a light source that projects a structured light in a particular direction. Additionally or alternatively, the projector 900 is a light source that generates omnidirectional light.
- the light source can be a single, centralized light source or a plurality of distributed light sources.
- the light source can be activated, by the control system 100 , to illuminate a portion of the environment to identify the physical location of the asset to be pulled.
- Any light source may be used, including a halogen light source, an LED light source, a laser light source, or the like.
- the light source can be a stationary light source or a rotatable light source. It should be appreciated that a stationary light source can eliminate the need for any moving parts, which can be beneficial in some instances. For example, if the storage area is a portion of a delivery vehicle, the movement of the vehicle along with the movement of a rotatable light source can hinder an accurate placement of the navigational projection. As such, a stationary light source may be preferred. Still, in some instances, the rotatable light source may be preferred.
- a rotatable light source can reduce installation time as it may eliminate the installation of multiple, stationary light sources.
- control system 100 can determine the location of the navigational projection within the physical environment of the storage area (e.g., location 400 ). For example, the control system 100 can determine the location of the navigational projection based on the asset location data. The control system 100 can reference an asset location stored in the asset location database and cause a navigational projection to be presented proximate to the physical location of the environment that is associated with the asset location data.
- the control system 100 can activate the projector 900 to generate a navigational projection in a particular portion within the physical environment.
- the control system 100 can cause a centralized light source to project light onto or near the surface of the asset.
- the control system 100 can selectively activate one or more light sources positioned near the asset.
- the projector 900 may comprise a plurality of light sources that are distributed throughout the storage area. The control system 100 can activate the projector 900 to generate a navigational projection in a particular portion of the environment by selectively activating a light source among the plurality of light sources.
- the control system 100 can selectively activate a particular light source that is mounted proximate to the determined location of the asset 10 . That is, based on the determined location of the asset, the control system 100 can reference a physical location of the one or more light sources and selectively illuminate a light source that is located near the generated asset location. It should be appreciated that the navigational projection may illuminate an area within 0-20 feet of the surface of the asset.
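- A minimal sketch of that selective activation, assuming a hypothetical fixture table and an activate() stand-in for the actual switching hardware:

```python
import math

LIGHT_SOURCES = {  # fixture id -> mounted position (x, y, z) in meters
    "shelf-L1": (0.5, 1.0, 0.0),
    "shelf-L2": (2.5, 1.0, 0.0),
    "ceiling-1": (1.5, 2.4, 1.0),
}


def nearest_light(asset_pos):
    """Pick the fixture mounted closest to the asset's stored location."""
    return min(LIGHT_SOURCES,
               key=lambda fixture: math.dist(LIGHT_SOURCES[fixture], asset_pos))


def illuminate(asset_pos, activate):
    activate(nearest_light(asset_pos))  # e.g., close the fixture's relay


illuminate((2.6, 1.1, 0.2), activate=print)  # -> shelf-L2
```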
- the control system 100 can mechanically control the direction of the light emitted by the projector 900 .
- the control system 100 can generate a navigational projection in a particular portion of the physical environment by mechanically controlling the position of the projector 900 .
- the projector 900 may comprise a centralized light source 1600 that is rotatable, horizontally 1630 (e.g., about the x-axis) or vertically 1640 (e.g., about the y-axis).
- the light source 1600 can be rotated by one or more stepper motors 1610 , 1620 .
- one or more encoders can provide feedback to the control system 100 as to the orientation of the rotatable centralized light source. Additionally or alternatively, the rotatable centralized light source can be recalibrated or reset based on returning to an original position.
- the control system 100 can control the direction of a light that is emitted by the projector 900 by controlling the position of a mirror. For example, the control system 100 can cause a mirror to rotate such that the mirror redirects a light emitted from a fixed, centralized light source toward a particular portion of the physical environment, thereby indicating a particular asset to be pulled.
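- For illustration, aiming such a rotatable source reduces to computing pan and tilt angles from the projector-to-asset offset and quantizing them into motor steps; the mounting frame and step size below are assumptions of this sketch.

```python
import math


def pan_tilt(dx, dy, dz):
    """Convert a projector-to-asset offset (meters) into aim angles.

    dx and dz are horizontal offsets, dy is vertical; returns
    (pan_degrees, tilt_degrees) for the two rotation axes.
    """
    pan = math.degrees(math.atan2(dx, dz))
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return pan, tilt


def to_steps(angle_deg, deg_per_step=1.8):
    """Quantize an angle into whole steps for a typical 200-step motor."""
    return round(angle_deg / deg_per_step)


pan, tilt = pan_tilt(1.0, -0.8, 2.0)
print(to_steps(pan), to_steps(tilt))  # step commands for the two motors
```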
- the navigational projection is an illumination of the asset 10 .
- the control system 100 can cause a rotatable, centralized light source to shine a light onto a surface of the asset 10 , thereby illuminating the asset 10 .
- the control system 100 can selectively activate a stationary light source to project a light onto a surface of the asset 10 . A user can then quickly identify the illuminated asset 10 to be pulled without having to take the time to manually look through the storage area.
- the navigational projection can indicate that the asset 10 is behind a different asset.
- the asset 10 to be pulled is positioned behind another asset 1700 .
- the control system 100 can determine that the location of the asset 10 is behind another asset 1700 .
- the control system 100 may determine, based on scanning information, that asset 1700 has been placed in front of the asset 10 .
- the control system 100 can enter a loading state. During the loading state, the control system 100 can assume that any asset loaded will remain within the cargo area. The control system 100 can obtain first scanning information and identify the asset 10 has been loaded.
- the control system 100 can then obtain updated scanning information and identify that an asset 1700 has been placed in a similar location as asset 10 .
- the control system 100 can then assume that asset 10 has been pushed to the rear of asset 1700 .
- the control system 100 can receive data from a user device indicating that asset 10 is located behind asset 1700 .
- the control system 100 can receive scanning information wirelessly and determine that the asset 10 is located to the rear of asset 1700 .
- the control system 100 can then cause the projector 900 to generate a distinct navigational projection or otherwise alter the navigational projection to indicate that the asset 10 is located behind another asset 1700 .
- a light illuminating asset 1700 may blink.
- the distinct navigational projection can be a particular color, symbol, or pattern to indicate that the user should look behind the asset 1700 to find the asset 10 to be pulled.
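- A sketch of that occlusion cue, using a deliberately simple depth comparison that is an assumption of this illustration rather than the disclosed detection method:

```python
def is_behind(wanted, blocker, xy_tolerance=0.3):
    """Positions are (x, y, depth); larger depth is farther from the aisle.

    'Behind' here means a similar shelf footprint but greater depth.
    """
    return (abs(wanted[0] - blocker[0]) <= xy_tolerance
            and abs(wanted[1] - blocker[1]) <= xy_tolerance
            and wanted[2] > blocker[2])


def projection_mode(wanted_pos, other_positions):
    """Blink on the front asset when the wanted asset sits behind it."""
    if any(is_behind(wanted_pos, other) for other in other_positions):
        return {"target": "front_asset", "pattern": "blink"}
    return {"target": "asset", "pattern": "steady"}


print(projection_mode((1.0, 1.2, 0.6), [(1.0, 1.2, 0.2)]))  # -> blink
```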
- the control system 100 will communicate—to the projector 900 and/or the user device 110 —instructions identifying which of one or more assets 10 should be highlighted as the driver or user 5 enters the package storing portion of the delivery vehicle.
- asset locations can be known based upon the three-dimensional representation of the assets 10 mapped via the control system 100 , with capabilities including mapping of shelving, floor, and/or aisle locations for various assets 10 .
- the control system 100 can monitor the location of the delivery vehicle. For example, the control system 100 can obtain location data from one or more location modules. The control system 100 can then utilize the location data to indicate the physical location of a delivery vehicle. It should be appreciated that the location module (e.g., a GPS location module) can be a component of the control system 100 , the delivery vehicle, or a user device. Based on obtained location data from the location module, the control system 100 can determine the vehicle's proximity to a delivery destination of a particular asset stored in the delivery vehicle. The control system 100 can determine the delivery destination for any particular asset from a database associating the asset identifier of the particular asset with its respective delivery destination. The control system 100 can thus receive an asset identifier associated with each asset being transported and determine the delivery destination of each asset.
- the control system 100 can determine whether a detected location is associated with a delivery location of an asset. For instance, the control system 100 can determine whether the delivery vehicle is within a predefined threshold of a delivery location associated with an asset. The predefined threshold can be any threshold distance, including a foot up to several miles. If the delivery vehicle's location is within a predefined threshold, the control system 100 can determine the one or more assets to be pulled. The control system 100 can then direct the user 5 to the asset 10 to be pulled for delivery to a location (e.g., house, apartment, building, or smart locker) proximate the vehicle's physical location.
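- For illustration, the proximity check can be sketched with the standard haversine great-circle formula; the 100 m default threshold is an arbitrary example within the range the disclosure allows.

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def assets_to_pull(vehicle_pos, deliveries, threshold_m=100.0):
    """deliveries maps asset ids to (lat, lon) delivery destinations."""
    return [asset_id for asset_id, dest in deliveries.items()
            if haversine_m(*vehicle_pos, *dest) <= threshold_m]
```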
- the control system 100 can cause one or more navigational projections to be generated based on predefined conditions.
- the predefined conditions may be associated with the state of the delivery vehicle. For example, the control system 100 can determine whether the delivery vehicle's gear has been placed in park. If so, the control system 100 can automatically cause the projector 900 to generate the one or more navigational projections. As another example, the control system 100 can determine that the delivery vehicle's door has been opened (e.g., the opening of a door to the storage area or driver side door).
- the predefined condition may be associated with receiving a command signal.
- the user 5 may arrive at a particular stop and activate a command signal (e.g., through a user device or a switch mounted within the delivery vehicle). Based on receiving the command signal, the control system 100 can cause one or more navigational projections to be generated.
- the control system 100 can exchange asset-related data with a user device regarding the handling of the asset.
- the control system 100 can receive asset-related data from a user device (e.g., a handheld computing device or the user device 110 ) indicating that a particular asset has been loaded or unloaded.
- the asset-related data can include an asset identifier, dimensions, a weight, or a delivery destination.
- the control system 100 can receive asset-related data during the loading of the asset, which can then be used to determine the location of the asset.
- the control system 100 can analyze scanning information for an asset having a particular asset identifier or having particular dimensions.
- the control system 100 can receive asset-related data related to the unloading of the asset from the cargo area.
- the control system 100 can utilize the asset-related data to determine that the asset has been removed and that the asset will be absent from any further scanning information.
- an exemplary flow diagram 1800 shows a process of locating an asset.
- a scanning device is initialized.
- the control system 100 can instruct a scanning device 1500 to begin obtaining scanning information regarding a cargo container (e.g., location 400 ) of a delivery vehicle.
- the control system 100 can instruct a plurality of scanning devices to obtain information regarding the cargo area.
- the control system 100 can initialize the scanning device based on determining that no object will interfere or disrupt the scanning information obtained by the scanning device.
- the scanning device can then capture scanning information.
- the scanning device 1500 can receive scanning information through image sensors, depth sensors, wireless sensors, and the like. It should be appreciated that while the exemplary flow diagram 1800 refers to a cargo container of a delivery vehicle, the steps could be performed with respect to any location.
- the control system 100 can determine that an asset is located within the cargo container (e.g., location 400 ) based on scanning information obtained from a scanning device (e.g., scanning device 1500 ). It should be appreciated that the control system 100 can receive, from a scanner (e.g., scanning device 1500 or handheld scanner), scanning information including an asset identifier associated with the asset and a defined delivery location. In some aspects, the control system 100 can determine an asset identifier based on analyzing the scanning information.
- control system 100 can analyze the scanning information for distinguishing characteristics of the asset, such as a distinct visual aspect associated with the asset (such as an alphanumeric code, QR code, dimensions of the asset, symbols, and the like) or a wireless signal (e.g., an RFID signal communicating an asset identifier). Based on scanning information, the control system 100 can determine that a particular asset is located within the cargo container. As described herein, the control system 100 can also determine a particular location for the particular asset. It should be appreciated that the control system 100 can reference a database linking the asset identifier and the particular delivery location so as to determine a delivery destination for the asset.
- a current location of the delivery vehicle is obtained.
- the control system 100 can utilize location data that is detected from one or more location modules.
- the location module (e.g., a GPS location module) can be a component of the control system 100 , the delivery vehicle, or a user device.
- a projection device can be activated. For example, based on the control system 100 determining that a current location of the delivery vehicle is within a predefined range or threshold of a delivery destination, the control system 100 can activate a projection device 900 to emit a projection that corresponds to a determined position of the asset.
- the position of the asset can be determined based on the scanning information obtained from the scanning device 1500 .
- the scanning information can include a scanned position of the asset within the storage area.
- the scanning device 1500 can detect a particular position of the asset relative to the scanning device 1500 based on one or more sensors, such as a depth sensor, an image sensor, a wireless signal sensor, and the like.
- the projection is emitted or generated by a light source associated with the projection device 900 .
- the control system 100 can activate the projector 900 to generate a navigational projection in a particular portion of the environment by selectively activating a light source among the plurality of light sources. For instance, based on determining the location of the asset 10 to be pulled, the control system 100 can selectively activate a particular light source that is associated with the determined location of the asset 10 . That is, based on the determined location of the asset 10 , the control system 100 can reference a physical location of the one or more light sources and selectively illuminate a light source that is located near the determined asset location. As a further example, the control system 100 can cause a centralized light source to project light onto or near the surface of the asset. It is contemplated that the projected light can be within 0-20 feet of the surface of the asset.
- the control system modifies projection coordinates of the projection device based on the scanned position and the determination that the current location is within the threshold distance of the defined delivery destination. For example, the control system 100 can determine a placement of a projection within the storage area from scanning information received from the scanning device 1500 . The control system 100 can then cause a centralized light source of the projector 900 to rotate so as to point in a particular direction. The control system can then instruct the projector 900 to emit the projection based on the modified projection coordinates.
- additional scanning information is obtained.
- the control system 100 can instruct the scanning device 1500 to obtain additional scanning information.
- the control system 100 can instruct the scanning device 1500 to obtain additional scanning information based on predetermined conditions, such as determining that a vehicle door has been closed, that the vehicle is moving or has stopped, that a particular time interval has elapsed, and the like.
- the additional scanning information can include an updated scanned position of the asset within the storage area. This scanning information can then be analyzed by the control system 100 and stored in the asset location database. It should be appreciated that the control system 100 can modify the projection coordinates based on the updated scanned position.
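- Condensing the flow just described into one hedged sketch, with the scanner, projector, and GPS interfaces as hypothetical stand-ins rather than disclosed components:

```python
def handle_stop(scanner, projector, gps, asset_db, threshold_m=100.0):
    """One pass of the flow: rescan the cargo area, then project onto
    every asset whose delivery destination is near the vehicle."""
    positions = scanner.scan()       # capture current scanned positions
    here = gps.current_location()    # current vehicle location
    for asset_id, record in asset_db.items():
        if gps.distance_m(here, record["destination"]) <= threshold_m:
            projector.aim(positions[asset_id])  # modify projection coords
            projector.emit()                    # light the asset to pull
```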
- control system 100 may comprise a plurality of modules, each module configured to perform at least a portion of the functions associated with the methods described herein.
- the control system 100 may comprise an acquisition module, a location module, and a notification module.
- the various modules may operate on a combination of one or more devices (e.g., the user device 110 , the acquisition/display entity 804 (for capturing the asset 10 information), the location device 415 (where provided), and/or the control system 100 ), such that each device performs the functions of one or more modules.
- the acquisition module may be configured to obtain asset identifier data associated with an asset 10 to be sorted and/or picked.
- This asset identifier data may be obtained, in part, via an order placed by a customer desiring transport and delivery (e.g., picking, as a first step) of the asset or package.
- the asset identifier data may be obtained, in part, via the acquisition/display entity 804 associated with a conveyor belt or the like, transporting packages or assets to a sort location.
- the asset identifier data may comprise a unique asset identifier such as a tracking number or code, and data defining the one or more appropriate locations 400 for the asset 10 as it moves between an origin and a destination, and/or the like.
- the acquisition module may be configured to obtain data from the user device 110 (e.g., of FIGS. 3 and 4 ) and/or the acquisition device 810 (e.g., of FIG. 12 ).
- the data received from the user device 110 and/or the acquisition device 810 may include the entirety of the asset identifier data and therefore the acquisition module need only receive asset identifier data from one of the user device 110 and/or the acquisition device 810 .
- the data received from the user device 110 ( FIGS. 3 and 4 ) and/or the acquisition device 810 ( FIG. 12 ) may comprise only a portion of the asset identifier data, and the acquisition module may be configured to obtain the remainder of the asset identifier data from one or more other sources.
- the acquisition module may be configured to search one or more databases in communication with the control system 100 for asset identifier data corresponding to the data received from the user device 110 and/or the acquisition device 810 .
- the acquisition module may additionally or alternatively be configured to receive and store at least a portion of the asset identifier data corresponding to the asset 10 that is stored in one or more databases.
- the acquisition module may be configured to transmit at least a portion of the asset identifier data to one or more devices (e.g., the user device 110 ) and/or one or more modules (e.g., the location module and/or the notification module). Moreover, upon receiving the asset identifier data regarding an asset 10 to be sorted, the acquisition module may be configured to link or otherwise associate the user device 110 and the asset identifier data. As will be described in greater detail herein, the user device 110 may be associated with the asset identifier data by storing at least a portion of the asset identifier data in a memory associated with the user device 110 .
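- The linking of a user device with asset identifier data can be illustrated with a few records. This is a minimal sketch with assumed field names only; the actual contents of the asset identifier data are as described above.

```python
from dataclasses import dataclass, field

@dataclass
class AssetIdentifierData:
    tracking_number: str           # unique asset identifier
    appropriate_locations: list    # locations 400 between origin and destination
    service_level: str = "Ground"  # e.g., "Next Day Air"

@dataclass
class UserDevice:
    device_id: str
    linked_assets: dict = field(default_factory=dict)

def associate(device: UserDevice, asset: AssetIdentifierData) -> None:
    """Associate the user device with the asset by storing at least a
    portion of the asset identifier data in memory tied to the device."""
    device.linked_assets[asset.tracking_number] = asset
```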
- the location module may be configured to receive asset identifier data from the acquisition module.
- the location module is configured to ascertain the appropriate location 400 and/or the appropriate position within the location 400 for the asset 10 based at least in part on the asset identifier data.
- the location module may be configured to determine the appropriate location 400 based at least in part on the asset identifier data and location data that is associated with each of the plurality of locations 400 .
- the location data may be generated based not only upon the asset identifier data, but also upon the environmental mapping conducted via the user device, as described elsewhere herein.
- each of the plurality of locations 400 may be identified by location data, which may include a unique location identifier.
- the unique location identifier may comprise a unique character string individually identifying each of the plurality of locations 400 .
- the location data may define any subsequent processing to be performed on assets 10 within each location 400 and/or 1400 , and may comprise the unique sort location identifier for each of the plurality of locations 400 / 1400 the assets 10 will pass through.
- the location module may determine whether the processing to be performed on assets 10 in each of the plurality of locations 400 (as defined in the location data) will move the asset 10 closer to its final destination.
- the location module may determine whether the processing steps to be performed on the assets 10 in each of the locations 400 / 1400 complies with the service level (e.g., Same Day shipping, Next Day Air, Second Day Air, 3 Day Select, Ground shipping, and/or the like) corresponding to the asset 10 .
- the location module may determine the appropriate location for an asset 10 to be delivered to 123 Main Street, Atlanta, Ga. is a delivery vehicle that will deliver other assets 10 to the same address or nearby addresses (e.g., along the same delivery route).
- likewise, the location module may determine the appropriate location for an asset 10 to be delivered to 345 Broad Street, Los Angeles, Calif. to be a delivery vehicle delivering other assets 10 to that address or nearby addresses.
- the location module may determine the appropriate location for an asset 10 prior to its fulfilment for delivery, which location may be characterized—as done elsewhere herein—as a pick location for the asset.
- the location module may be configured to transmit data defining the appropriate location 400 / 1400 and/or the appropriate position for the asset 10 within the location 400 / 1400 to one or more devices (e.g., the user device 110 ) and/or modules (e.g., the notification module). Additional details in this respect are provided in U.S. Ser. No. 15/390,109, the contents of which are hereby incorporated by reference in their entirety.
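- The two criteria applied by the location module, namely that processing at a candidate location must move the asset closer to its final destination and must comply with the asset's service level, can be combined into a single selection rule. The sketch below is illustrative only; the record layout and the distance function are assumptions.

```python
from typing import Callable, NamedTuple, Optional

class CandidateLocation(NamedTuple):
    name: str
    supported_service_levels: frozenset
    next_stop: tuple  # position handed to the distance function

def choose_location(
    service_level: str,
    candidates: list,
    distance_to_destination: Callable[[tuple], float],
) -> Optional[CandidateLocation]:
    """Among locations whose processing complies with the asset's service
    level, pick the one whose next stop leaves the least remaining
    distance to the final destination."""
    eligible = [c for c in candidates
                if service_level in c.supported_service_levels]
    if not eligible:
        return None
    return min(eligible, key=lambda c: distance_to_destination(c.next_stop))
```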
- the notification module may receive data indicating whether the location 400 and/or 1400 (e.g., as transmitted to the control system 100 via the user device) is the appropriate sort or pick location (e.g., as determined by the control system 100 ) for the asset or package being handled. As described herein, the notification module may cause one or more alerts to be generated in order to notify the user 5 (e.g., sort or pick personnel, or more generally the carrier personnel) whether the asset 10 should be deposited in the location 400 or picked therefrom, as the case may be.
- the notification module may be configured to transmit confirmation data and/or mistake data to the user device 110 in order to cause the device to generate an alert discernible by the user 5 (e.g., carrier personnel) indicative of the appropriate sort location for the asset 10 .
- the user device 110 and/or sensors associated therewith (e.g., three-dimensional sensors or the camera 116 ) may utilize object recognition algorithms that identify whenever a person is clasping an object in a particular manner to determine whether the asset is being handled properly.
- the notification module may cause the user device 110 to audibly provide the user with a confirmation message (e.g., via the speaker 117 ) upon a determination that the location 400 / 1400 is the appropriate sort or pick location.
- the notification module may alternatively or additionally cause one or more sounds to be generated, one or more lights to illuminate, one or more mechanical assemblies to move, and/or other processes discernible by a user 5 to operate and thus indicate to the user 5 whether the location 400 / 1400 is the appropriate location.
- notifications may be generated—and communicated to the user via the user device—not only when the user is at the location (e.g., for picking or sorting), but also during travel of the user to/from the location relative to other locations in the warehouse or facility.
- the user device could generate an audible (or other type of) notification to the user, either independently or upon a cue received via the control system 100 .
- notifications may be generated and communicated (e.g., via the network or otherwise) to one or more parties other than the user of the user device (e.g., carrier supervisory personnel, other internal carrier personnel (e.g., quality assurance representatives, or the like), external personnel, external third party entities, or the like). Any of the notifications and/or communications described herein may be so communicated, whether to the user alone and/or to parties other than the user and/or to a combination of both, as may be desirable.
- the notification module may be configured to generate an alert after associating asset identifier data with location data and/or cueing an asset or package for picking.
- the notification module may be configured to generate an alert to inform the user 5 (e.g., carrier personnel) or other users regarding asset identifier data being associated with location data and/or the immediate need for navigation or travel to occur toward the location for picking of the asset or package or otherwise.
- the notification module may be configured to generate one or more navigational projections (e.g., 710 , 715 , 1401 and/or the like, with reference to FIGS. 8-11C by way of non-limiting example) to convey navigational instructions to the user 5 .
- the navigational projections may be computer-generated and/or overlaid over an augmented reality environment, which may in certain embodiments be displayed to the user via the utilized user devices 110 .
- the navigational projections may be generated via the pivoting laser projector (e.g., 118 of FIG. 5 ) of the user device 110 .
- FIGS. 13A-13F illustrate various types of navigational projections as may be generated via the notification module, when conveyed via the control system 100 further to the user device 110 .
- the navigational projections may be generated at/by the user device 110 , independent of the control system 100 , upon receipt from the control system of only a new “pick” or “sort” command for a particular asset or package.
- the text indicia and navigational projections of FIGS. 13A-13F occur via one or more components of the device component 114 (e.g., laser projector 118 to project the arrow 1303 and the indicia 1309 of FIG. 13A ).
- FIG. 13A includes the environment 800 that the user is physically located in, which includes the conveyance device 1305 , an asset 1301 , various location devices, one of which is indicated by the device 1307 .
- the location devices help map the environment 800 .
- the text indicia (which may also be considered a navigational projection) 1309 may be projected within the environment 800 , which commands the user to “push forward” the asset 1301 .
- the navigational projection 1303 (i.e., the arrow) indicates the direction in which the user should push the asset 1301 .
- FIG. 13B includes the same environment 800 , except a different asset 1315 arrives and the text indicia 1311 commands the user to “push” the asset 1315 to the “other side.”
- the navigation projection 1313 helps guide the user by showing which direction to push the asset 1315 . In this way, these visual frames together help guide and instruct the user for handling of assets.
- in FIG. 13C , another asset 1317 arrives and the text indicia 1319 prompts the user to "pick and sort" the asset 1317 , i.e., to pick it up, sort it, and place it within a sorting location.
- FIG. 13D illustrates prompting the user to sort the asset in a particular location within the environment 1400 .
- the instructions illustrated in FIG. 13D occur in response to the user picking up the asset 1317 as illustrated in FIG. 13C .
- the environment 1400 includes the text indicia 1401 that states "look that way," which is accompanied by the navigational projection 1403 that illustrates what direction the user should walk in order to sort the asset.
- FIG. 13E illustrates the correct location to sort an asset within the environment 1400 .
- the instructions illustrated in FIG. 13E occur in response to the user moving responsive to the text indicia 1401 and/or the navigational projection 1403 of FIG. 13D .
- the environment 1400 includes the text indicia 1407 “sort here” indicating, along with the navigation projection 1405 , where the correct cell or location for sorting the asset is.
- FIG. 13F illustrates the combined environments 800 and 1400 of FIGS. 13A through 13E . Accordingly, the user 1411 picks up the asset 1413 responsive to viewing a first set of navigational projections and/or text indicia and places the asset 1413 within the correct location responsive to viewing a second set of navigational projections and/or text indicia.
- FIGS. 8-13F illustrate an exemplary environment in which assets 10 are moved amongst various locations 400 , which locations may be pick locations (e.g., for storage of an asset in a warehouse or the like; e.g., FIGS. 8-11C in particular), a conveyor belt location (e.g., FIG. 12 in particular), and/or sort locations (e.g., for placement of an asset following distribution from a pick location to a conveyor belt location; e.g., FIGS. 13A-F in particular).
- a user 5 (e.g., sort personnel) may wear the user device 110 , which may be configured for receiving information regarding a particular asset 10 to be transported, whether from the control system 100 or otherwise, for guiding the user 5 to a location in which the asset 10 is or should be transported to, and for informing the user 5 whether the asset 10 is being located (e.g., via navigational projections) and/or transported appropriately.
- FIG. 6 illustrates exemplary steps carried out by the user device 110 according to various embodiments of the present disclosure to achieve the advantages and capabilities outlined above.
- an initialization or calibration of the user device 110 may be conducted according to various embodiments. In certain embodiments this step or block may be optional; in other embodiments, it need only be conducted periodically, for example upon initial use of the user device and/or upon receipt—from the control system 100 —of a notification that an environment in which the user device operates has been altered or updated.
- an environment that a user is located in is mapped based at least in part on generating a multidimensional (e.g., 3-D) graphical representation of the environment.
- the user device 110 may be worn by a user 5 so as to map an environment 700 in which the user device 110 is to be used.
- a graphical representation 701 of the environment may be generated and/or stored via the user device 110 ; whereby storing may occur locally at the user device and/or be stored at and/or synced with the control system 100 , for example, upon completion of the mapping procedure.
- the mapping procedure of step or Block 501 involves the user, while wearing the user device 110 , moving through the environment 700 , which movement necessarily involves passage of locations 400 / 1400 (e.g., shelves) and various assets or packages 10 .
- three-dimensional depth sensors 119 (e.g., the depth sensors 119 of FIG. 5 ) of the user device may be utilized to capture the data required to generate the graphical representation 701 .
- locations of shelving (e.g., locations 400 ) and locations of assets/packages—for example in "pick locations"—may also be established, determined, and/or otherwise saved at or by the user device 110 .
- Movement of the user 5 through the environment 700 defines progressive mapping zones 705 , through which the three dimensional depth sensors are configured to scan during the course of mapping.
- one or more commands may be transmitted to the user (e.g., via the speaker 117 ) from the control system (or otherwise), for example to instruct the user to “turn left” at specific intervals configured to ensure that the entirety of the environment 700 (or a desired portion thereof) is sufficiently mapped.
- This feature will, for example, minimize and/or eliminate instances of users not covering every area within the environment for which mapping may be desired and/or required.
- based on this mapping, which uses 3D scanning capabilities to detect the immediate environment of the user, projections are modified to avoid visual distortion. Accordingly, shapes and images projected are adjusted to the shape of the surface they will be projected upon in particular embodiments.
- object recognition devices and/or scanners located within a device component (e.g., device component 114 ) can identify the contours of the environment.
- a projection can be made in the environment based on the shapes (e.g., uneven surfaces) of the environment.
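- One common way to adjust a projected shape to an uneven or oblique surface is to pre-warp it with a homography computed from four corresponding points: the corners of the intended shape in projector space and the corners of the quadrilateral actually observed on the surface via the contour scan. A minimal direct-linear-transform sketch follows; whether the device component 114 uses this particular technique is an assumption.

```python
import numpy as np

def homography(src, dst):
    """3x3 homography mapping four (x, y) points in src to four (u, v)
    points in dst, via the standard direct linear transform."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

# The projected image would then be warped through this matrix (e.g.,
# with an image-warping routine) before being emitted, so that it
# appears undistorted on the scanned surface.
```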
- the user device 110 is configured to receive and/or associate product (e.g., package or asset) locations within the mapped environment. This may be via utilization of identifiers 415 (as described elsewhere herein), via transmission of asset location information from the control system to the user device, and/or the like.
- upon completion of the mapping of the environment 700 and the association of product (e.g., package or asset 10 ) locations therein, the user device 110 is calibrated for operational mode or use, which use may occur in either (or both) a pick and a sort mode.
- in the pick mode, the user device is configured to guide a user thereof to a location in which a package or asset 10 may be picked or "pulled" for fulfillment of an order (e.g., within the environment 800 of FIG. 13A ); in the sort mode, the user device guides the user (e.g., from a conveyor belt to a sort location) so as to enable further transport and handling of the asset or package en route to a customer or the like (e.g., within the environment 1400 of FIG. 13D ).
- the user device 110 proceeds to step or Block 503 , wherein pick location data is received.
- the pick location data is received—at the user device 110 —from the control system 100 , for example, upon receipt—at the control system—of a customer order for a particular asset 10 or package.
- the user device 110 is configured to, in certain embodiments, generate pick instructions in Block 504 .
- the generation of pick instructions in Block 504 may entail compilation of a route through which the user of the user device 110 must travel—from their present location—so as to reach the location of the asset needing to be picked.
- Block 504 may further entail generation of a plurality of potential navigational projections (e.g., as described in FIGS. 9-11C ) that will be necessary to accurately and efficiently guide or direct the user of the user device to the pick location.
- multiple possible routes may be determined and assessed, with either the user device (automatically) or the user (via an interface selection) choosing an optimal route, for which the navigational projections may thereafter be established.
- Associated audible commands may also be generated/established, in conjunction with the navigational projections, should it be desirable—in addition or alternatively to the navigational projections—to also audibly guide (e.g., via the speaker 117 ) the user, instructing them to, for example, "turn left after the next shelving row," as depicted in FIG. 9 , by way of non-limiting example.
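- Route compilation over the mapped environment can be treated as a shortest-path search over a graph extracted from the three-dimensional map. The following Dijkstra sketch assumes string-named waypoints and an adjacency structure that the mapping step would have to supply; it is illustrative, not a prescribed implementation.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra search over {node: [(neighbor, cost), ...]}; returns the
    waypoint sequence from start to goal, or None if unreachable."""
    queue, visited = [(0.0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, step in graph.get(node, ()):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + step, neighbor, path + [neighbor]))
    return None
```

- Multiple candidate routes could be scored the same way and offered for selection, with the navigational projections then established along the chosen route.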
- Upon completion of step or Block 504 , the user device 110 is configured to proceed to Block 505 , wherein the navigational projections and/or audible instructions are dynamically displayed and/or otherwise provided to the user of the user device, characterized generically as "pick instructions."
- Blocks 504 and 505 need not be separate and distinct steps or blocks; instead, as will be described below, as the user moves through the environment 700 , the user device 110 may be configured to dynamically generate and display various navigational projections and/or audible instructions.
- Block 504 may entail merely identifying—at the user device 110 —the user's present location, the pick location, and a variety of pathways or routes there-between.
- FIGS. 9-11C depict a variety of exemplary navigational projections that may be generated and displayed via the user device 110 as a user 5 wearing the same moves around an environment 700 .
- various navigational projections 710 may be generated and displayed (e.g., via the pivoting laser projector 118 of FIG. 5 ).
- These navigational projections 710 may be two or three-dimensional in nature and—as generally understood—provide directional guidance to the user as to which direction they should move or turn. As illustrated, by way of non-limiting example, in FIG. 9 , the navigational projection 710 in certain embodiments may be a two-dimensional directional arrow configured to instruct a user wearing a user device to make a change relative to their present movement pattern.
- the navigational projections, therein described as indicators 810 or navigational projections 810 , may be three-dimensional in nature.
- navigational projections 715 , 715 A, 715 B, 715 C may be generated and displayed for the user, via the worn user device 110 , as they approach the proximity of the pick location. Proximity may be defined as within a specific row upon which the asset or package to be picked is located. As may be understood from FIGS. 11A-11C in particular, the navigational projections 715 may include a frame and/or a checkmark, showing the right location to pick the asset or package from. In certain embodiments, text indicating a quantity may also be illustrated, as in FIG. 11A with projection 715 A, which projection encompasses multiple package or asset boundaries, as may be recognized and detected by the user device 110 upon approach to the asset or package.
- the boundaries may, as a non-limiting example, be determined by the user device 110 , at least in part, based upon asset information received from the control system 100 .
- software embedded upon the user device 110 in conjunction with the camera (e.g., the camera 116 of FIG. 5 ) may be configured to, via an iterative machine learning-type algorithm and/or object recognition algorithm, recognize and determine asset/package dimensions and thus boundaries over a period of time.
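- Although the disclosure leaves the recognition algorithm open, one simple illustration of boundary determination from depth data is to segment everything standing in front of the shelf plane and take the bounding box of each connected region. The sketch below assumes a rectified depth image and uses SciPy's connected-component labeling; it is not presented as the device's actual algorithm.

```python
import numpy as np
from scipy import ndimage

def package_bounding_boxes(depth_map, shelf_depth_m, tolerance_m=0.05):
    """Label pixels nearer than the shelf plane as package candidates
    and return one bounding box (a pair of slices) per connected region."""
    foreground = depth_map < (shelf_depth_m - tolerance_m)
    labels, count = ndimage.label(foreground)
    return ndimage.find_objects(labels)
```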
- FIG. 11A illustrates one exemplary embodiment, in which, as alluded to above, the navigational projection 715 A encompasses multiple assets or packages 10 in a particular location 400 , with textual instructions also being generated and provided to the user to "pick 2" of the highlighted or framed packages.
- the instructions may say “pick these two” assets, with the frame encompassing only two specific assets on the shelf or location 400 .
- in FIG. 11C , multiple frames and instructions associated with navigational projection 715 C may be provided, distinctly identifying three assets or packages that need to be picked.
- a single asset pick embodiment is also illustrated in FIG. 11B .
- FIGS. 11A-C are non-limiting in nature; additional and/or alternatively configured navigational projections 715 may be envisioned within the scope of the inventive concept described herein.
- various embodiments may involve the user device 110 further generating and transmitting to the user 5 , via its speaker 117 of FIG. 5 , audible alerts when deviations occur.
- upon the user with the user device 110 reaching the pick location, the user device 110 is configured to further detect asset handling, specifically when the asset or package has been picked up by the user. In certain embodiments, feedback may also be provided at this juncture, via the speaker 117 of the user device 110 . In these and still other embodiments, the user device 110 may capture an image of the "picking" (e.g., via its camera 116 ) and/or transmit the same to the control system 100 for centralized/remote verification of picking accuracy and completeness. This action may also occur in conjunction with Block 507 , whereby one or more pick-related notifications may be generated and/or transmitted by the user device 110 , whether to the user 5 and/or the control system 100 .
- the detection of the “picking” may be conducted by the user device 110 via a collision detection algorithm, as detailed elsewhere herein.
- As generally understood, such collision detection algorithm(s) are configured to detect changes in movement relative to the user device 110 , whereby if an item or person (e.g., a user) associated with or wearing the user device 110 encounters a collision—for example by picking up and physically touching an asset or package—that "collision" likewise registers at the user device.
- the user device 110 may be programmed to transition from a guidance mode to—at least temporarily—a report or notification mode, so as to convey—for example to the control system 100 —that the pick has occurred.
- multiple algorithms may be utilized. One may be to identify what an asset or package is, namely what its physical boundaries entail. Another is to interpret when a user's hands (or the like) collide with and pick up (or set down) the asset or package. Each may be assisted, not only via the depth sensors 119 of the user device, but also the camera 116 thereof. In certain embodiments, at least two and in some instances four depth sensors 119 may be provided, each with a working range between 0.85 and 3.1 meters (alternative ranges may also be provided).
- the environment may thus not only be mapped, but changes therein, including collisions between objects—including packages, assets, users, and/or devices such as forklifts operated by a user—may be detected and accounted for.
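- A toy version of such collision-style detection, reduced to the minimum depth reading reported by the sensors, is sketched below. The hysteresis thresholds are illustrative assumptions chosen inside the stated 0.85-3.1 meter working range; a production algorithm would fuse camera and depth data as described above.

```python
class PickDetector:
    """Tracks successive minimum-depth readings (meters) from the
    device's depth sensors and reports pick/place transitions."""

    def __init__(self, contact_m=0.9, release_m=1.3):
        self.contact_m = contact_m    # closer than this counts as a collision
        self.release_m = release_m    # farther than this counts as release
        self.in_contact = False

    def update(self, min_depth_m):
        """Return 'pick' on entering contact range, 'place' on leaving
        it, and None otherwise."""
        if not self.in_contact and min_depth_m <= self.contact_m:
            self.in_contact = True
            return "pick"
        if self.in_contact and min_depth_m >= self.release_m:
            self.in_contact = False
            return "place"
        return None
```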
- particular algorithms may be able to identify the parcel itself using machine learning techniques that search for physical clues (e.g., size, color, scratches, or any feature, even microscopic, that may uniquely identify the parcel) without needing to read any parcel ID label or barcode.
- the user device 110 may be configured to operate in a sort mode, which corresponds generally to utilization of the user device to move assets or packages to/from various sorting locations within a mapped environment, as contrasted with a user locating the asset or package from a stored “pick” location.
- the sequence in sort mode initiates with the user device 110 , in Block 508 , detecting handling of an asset by the user (e.g., via the collision algorithm described previously herein) and/or obtaining asset identifier data from the asset or package (e.g., via the camera 116 , whether independently at the user device 110 or further in conjunction with an exchange of certain asset/package data with the control system 100 ).
- Upon obtaining the asset identifier data, the user device 110 is able to determine and/or receive sort location data for the asset or package 10 in Block 509 . Based thereon, much like in Blocks 504 - 505 and 507 (in the context of sorting), the user device 110 is configured to—according to various embodiments—generate sort instructions in Block 510 , dynamically display sort instructions (e.g., navigational projections, text indicia, and/or audible instructions and the like) in Block 511 , and generate/transmit one or more sort-related notifications in Block 512 .
- the generating and displaying of one or more navigational projections configured to guide the user to an asset location within an environment is based at least on associating one or more asset locations within a mapped environment.
- the displaying occurs within an environment that a user is in without regard to a necessary medium (e.g., lens, projector screen, etc.).
- the projection is displayed in open space within the environment. It should be understood that any of Blocks 509 - 512 may be substantially the same as or identical to those in Blocks 503 - 505 and 507 , as previously detailed herein; in certain embodiments, though, one or more of the Blocks may be configured differently for sort versus pick mode.
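- Read together, Blocks 508-512 amount to a short control loop. The sketch below names hypothetical device and control-system methods purely to show the sequencing; none of these interfaces are defined by the disclosure.

```python
def run_sort_cycle(device, control_system):
    """One pass through the sort-mode sequence (Blocks 508-512)."""
    asset_id = device.detect_handling_and_read_identifier()          # Block 508
    sort_location = control_system.sort_location_for(asset_id)       # Block 509
    instructions = device.generate_sort_instructions(sort_location)  # Block 510
    device.display(instructions)                                     # Block 511
    control_system.notify(asset_id, sort_location)                   # Block 512
```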
- Additional details relative to the utilization of the user device 110 in sort mode may be understood with reference to FIGS. 13A-F , which figures are also described elsewhere herein. Additional details regarding sorting procedures may also be understood with reference to U.S. Ser. No. 15/390,109, the contents of which are hereby incorporated by reference in their entirety.
- FIG. 7 illustrates exemplary steps carried out by the control system 100 according to various embodiments of the present disclosure.
- the control system 100 may receive asset identifier data at Block 601 .
- the asset identifier data may be received from the user device 110 , the acquisition device 810 , and/or the one or more location devices 415 at a location 400 . Further details regarding the scope and contents of the asset identifier data have been described previously herein. Still additional details in this respect may be understood with reference to U.S. Ser. No. 15/390,109, the contents of which are hereby incorporated by reference in their entirety.
- the control system 100 may be configured to determine the appropriate location 400 for the asset 10 and/or the appropriate position within the location for the asset 10 .
- the determination of the appropriate location for the asset 10 may be based at least in part on the received asset identifier data.
- the control system 100 may utilize location data corresponding to each of the locations 400 to determine whether any subsequent processing to be performed on assets 10 at each location 400 will move the asset 10 closer to its final destination.
- the control system 100 may determine the appropriate location for an asset 10 to be delivered to 123 Main Street, Atlanta, Ga. is the delivery vehicle that will deliver other assets 10 to 123 Main Street, Atlanta, Ga. Additional details in this respect may be understood with reference to U.S. Ser. No. 15/390,109, the contents of which are hereby incorporated by reference in their entirety.
- the control system 100 may be configured to transmit data identifying the appropriate sort location to the user device 110 .
- the user device 110 may be configured to generate an indicator (e.g., visual indicators or navigational projections 710 / 715 / 810 ) discernible by the user 5 (e.g., carrier personnel) regarding the appropriate pick or sort location for the asset 10 .
- each asset 10 may have information indicative of an appropriate location printed thereon, and accordingly the control system 100 may not need to—in those embodiments—transmit appropriate location data to the user device 110 .
- the control system 100 may also be configured to receive a variety of data—including location data—from the user device 110 at Block 604 .
- the control system 100 may subsequently compare the appropriate location (at which the user for picking or the asset for sorting should be located) and the actual location data received at Block 604 to determine whether the user device 110 is proximate the appropriate location.
- the remaining steps to be completed may be selected based at least in part on a determination of whether the location is an appropriate (or desired/accurate) location. Additional details in this respect may be understood with reference to U.S. Ser. No. 15/390,109, the contents of which are hereby incorporated by reference in their entirety.
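- The comparison of Blocks 604-605 reduces to a proximity test between the received location data and the appropriate location. A haversine sketch is shown below on the assumption that both are expressed as latitude/longitude; indoor positioning systems would substitute their own coordinate frame and metric.

```python
import math

def is_proximate(actual_latlon, appropriate_latlon, radius_m=10.0):
    """True when the user device's reported location lies within
    radius_m of the appropriate location (great-circle distance)."""
    lat1, lon1, lat2, lon2 = map(
        math.radians, (*actual_latlon, *appropriate_latlon))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * math.asin(math.sqrt(a)) <= radius_m
```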
- the control system 100 may generate mistake data at Block 610 . Upon generating the mistake data, the control system 100 may transmit the mistake data to the user device 110 at Block 611 .
- the user device 110 may be configured to generate a message discernible by the user 5 (e.g., carrier personnel) indicating the user device 110 is proximate an incorrect location 400 (e.g., as illustrated in FIG. 13D ).
- the control system 100 may be configured to associate the asset identifier data with the location data corresponding to the location 400 at Block 612 .
- the user 5 may continue transporting the asset 10 (and consequently the user device 110 ) to another (ideally correct) location 400 . The process may return to Block 601 in such scenarios and repeat the recited steps.
- if the user 5 approaches the appropriate location, the process may proceed, after comparing the actual/received location data and the appropriate location data for the asset 10 (illustrated as Block 605 ), with reference to Blocks 607 - 609 .
- Additional details in this respect may be understood with reference to U.S. Ser. No. 15/390,109, the contents of which are hereby incorporated by reference in their entirety.
- the control system 100 may be further configured to generate one or more alerts regarding the association between the asset identifier data and the location data.
- the control system 100 may be configured to generate an alert to inform the user 5 (e.g., carrier personnel) or other users regarding asset identifier data being associated with location data.
Description
- This Application is a continuation of U.S. application Ser. No. 16/226,180, entitled “Hands-Free Augmented Reality System For Picking and/or Sorting Assets,” filed Dec. 19, 2018.
- U.S. application Ser. No. 16/226,180 is a continuation-in-part of U.S. application Ser. No. 16/103,566, entitled “Hands-Free Augmented Reality System For Picking and/or Sorting Assets,” filed Aug. 14, 2018; which claims priority to U.S. Provisional Patent Application No. 62/545,752, entitled “Hands-Free Augmented Reality System For Picking and/or Sorting Assets and Methods of Utilizing The Same,” filed Aug. 15, 2017.
- U.S. application Ser. No. 16/226,180 claims priority to U.S. Provisional Patent Application No. 62/607,814, entitled "Hands-Free Augmented Reality System for Picking and/or Sorting Assets and Methods of Utilizing the Same," filed Dec. 19, 2017.
- Each of the aforementioned applications is incorporated herein by reference in its entirety.
- The automated handling of parcels (e.g., packages, containers, letters, items, pallets, etc.) transported by common carriers through transportation networks is a complex problem with many parts. No single system or method alone appears to provide a comprehensive solution for all conditions. A primary component in some systems and methods for automated handling of packages is a conveyance device (i.e., a conveyor belt), which is generally formed and/or extended around at least two driving wheels. Thus, by turning the driving wheels, the conveyor belt may run continuously. Conveyor belts may also generally be flexible and deformable at least while running in contact with the driving wheels, and a multitude of materials, linkages, and so forth have been used to achieve these goals.
- Where automated handling of packages has been implemented, certain inefficiencies may arise. For example, where packages may be improperly or too closely placed relative to one another on a conveyor belt, congestion may arise, impacting various measurements or the like that need to be performed on the packages while on the conveyor belt. Still further, where the materials in which packages are wrapped (e.g., foil or paper or the like) differ in color or other material characteristics, inaccuracies may also arise in any measurements, imaging, or observations made in an automated fashion relative to the packages or assets.
- Beyond interactions with conveyor belts, automated handling of parcels creates additional challenges related to how the parcels—referred to elsewhere herein as assets—are transported and/or handled by carrier personnel between conveyor belts (and the like) and respective inventory locations (e.g., for picking) and/or sort locations (e.g., for sorting) associated with the assets.
- In this context, a need exists for improved technological systems, assemblies, and/or methods for maintaining accurate records of the location of an asset in a sort and/or pick process, while also providing to carrier personnel improved instructions and/or guidance for the automated handling of the packages within various environments (e.g., a delivery vehicle, a trailer or cargo area of a delivery vehicle, a warehouse environment whether relative to a sort location, a pick location, a conveyor belt, and/or any combination thereof).
- According to various embodiments described herein, a system is provided for hands-free handling of at least one asset by a user. The system can include a user device configured to be worn by a user. The user device may include one or more memories and one or more processors configured to perform the following operations. Asset identifier data can be obtained for at least one asset. Location data, associated with a location for the at least one asset, can be determined based, at least in part, upon the obtained asset identifier data. One or more navigational projections configured to guide the user to the location can be dynamically generated and displayed. Handling of the at least one asset by the user can be detected. One or more notifications associated with the handling of the at least one asset by the user at the location may be received.
- In another embodiment, a computer implemented method is provided for hands-free handling of at least one asset by a user is provided. The method may include the following operations. Asset identifier data for at least one asset can be received from a remote location relative to a user device that is worn by the user. First location data associated with the user device can be determined at the user device. Second location data associated with the at least one asset may be determined. The second location data may be determined, based at least in part, on analyzing the received asset identifier data. The first location data may be determined, based at least in part, on analyzing a present position of the user device. One or more navigational projections configured to guide the user to a location associated with the second location data may be dynamically generated and displayed. The one or more navigational projections may be dynamically updated based at least in part on one or more detected changes of a present location of the user device.
- In yet another embodiment, a computer program product is provided for hands-free handling of at least one asset by a user. The computer program product may include at least one non-transitory computer-readable storage medium having computer-readable program code portions stored therein. The computer-readable program code portions may include one or more executable portions configured for performing the following operations. An environment that a user is located in can be mapped based at least in part on generating a multidimensional graphical representation of the environment. Asset identifier data to identify at least one asset can be received at a user device. One or more assets locations can be associated within the mapped environment. The one or more asset locations may be associated with the at least one asset. One or more navigational projections configured to guide the user to an asset location within the environment may be generated and displayed based at least on the associating and within the environment that the user is in.
- In a further embodiment, a system is provided. The system comprises one or more processors and one or more computer-storage media having one or more instructions stored that, when used by the one or more processors, cause the one or more processors to perform the following operations. The operations comprise initializing a scanning device secured to a storage area of a delivery vehicle. Additionally, the operations comprise determining that an asset is located within the storage area based on scanning information obtained from the scanning device, wherein the scanning information includes an asset identifier associated with the asset and a defined delivery destination. The operations also comprise obtaining a current location of the delivery vehicle based on detected location data. The operations further comprise, based on a determination that the obtained current location is within a threshold distance of the defined delivery destination, activating a projection device to emit a projection that corresponds to a determined position of the asset, the position of the asset being determined based at least in part on the obtained scanning information.
- In another embodiment, a computer-implemented method for scanning and locating assets is provided. The method comprises obtaining, by a computing device, an asset identifier for an asset located within a physical environment based on received scanning information, wherein the asset identifier is associated with a delivery destination defined for the asset. Additionally, the method comprises generating, by the computing device, asset location data that is associated with the obtained asset identifier and defines a position of the asset within the physical environment based on the received scanning information. The method further comprises determining, by the computing device, a current location of the physical environment based on received asset location data. The method also comprises causing, by the computing device, a projection device to emit a navigational projection that corresponds to the position of the asset within the physical environment based on the asset location data and a determination that the determined current location is within a threshold distance of the defined delivery destination.
- In yet another embodiment, one or more computer-storage media are provided having computer-executable instructions embodied thereon that, when executed by a computing device, perform a method. The method comprises determining a physical location of a delivery vehicle based on detected location data. The method also comprises obtaining an asset identifier for an asset stored within the delivery vehicle based on obtained scanning information, wherein the asset identifier is associated with a defined delivery destination. The method further comprises determining a location to emit a navigational projection within a cargo portion of the delivery vehicle based on asset location data that is generated based at least in part on the obtained scanning information. Additionally, the method comprises causing a projection device secured to the cargo portion to emit the navigational projection directed to the determined location based on a determination that the determined physical location is within a threshold distance of the defined delivery destination.
- FIG. 1 schematically depicts a control system according to one or more embodiments shown and described herein;
- FIG. 2 schematically depicts a control system according to one or more embodiments shown and described herein;
- FIG. 3 schematically depicts a user device that communicates with a control system according to one or more embodiments shown and described herein;
- FIG. 4 depicts a user device in conjunction with a harness mechanism according to one or more embodiments shown and described herein;
- FIG. 5 depicts a user device in isolation without the harness mechanism according to one or more embodiments shown and described herein;
- FIG. 6 schematically depicts a flowchart illustrating operations and processes performed by a user device according to one or more embodiments shown and described herein;
- FIG. 7 schematically depicts a flowchart illustrating operations and processes performed according to one or more embodiments shown and described herein;
- FIG. 8 depicts a facility and an environmental mapping procedure achieved via a user device according to one or more embodiments shown and described herein;
- FIG. 9 depicts a facility and a pathway-indicating navigational projection achieved via a user device according to one or more embodiments shown and described herein;
- FIG. 10 depicts a shelving-containing portion of a facility and a placement-indicating navigational projection achieved via a user device according to one or more embodiments shown and described herein;
- FIGS. 11A-C depict further views of three exemplary embodiments of the placement-indicating navigational projection achieved via a user device according to one or more embodiments shown and described herein;
- FIG. 12 is a perspective or isometric view of a conveyor belt assembly that may be utilized in conjunction with the control system and user device according to one or more embodiments shown and described herein;
- FIGS. 13A-13F depict further views of additional exemplary navigational projections achieved via a user device and in conjunction with a conveyor belt assembly according to one or more embodiments shown and described herein;
- FIGS. 14A-B depict a navigational projection within a physical environment according to one or more embodiments shown and described herein;
- FIG. 15A is a perspective view of a physical environment according to one or more embodiments shown and described herein;
- FIG. 15B is a side view of the physical environment of FIG. 15A according to one or more embodiments shown and described herein;
- FIG. 15C is a perspective view of a plurality of scanning devices having a field of view of the physical environment of FIG. 15A according to one or more embodiments shown and described herein;
- FIG. 16 depicts an exemplary projector device according to one or more embodiments shown and described herein;
- FIG. 17 depicts a navigational projection within a physical environment according to one or more embodiments shown and described herein; and
- FIG. 18 is a flow diagram of an exemplary process for locating an asset according to one or more embodiments shown and described herein.
- The present disclosure will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the disclosure are shown. Indeed, the disclosure may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like numbers refer to like elements throughout.
- Existing asset handling technologies remain burdensome and are not suitable for all tasks. For example, augmented reality-based computing solutions have been pursued, such as with reference to U.S. Ser. No. 15/581,609, the contents of which are incorporated by reference herein in their entirety. These augmented-reality-based solutions can utilize objects, such as smart glasses, to generate an environment so as to provide to carrier personnel (e.g., via a lens of the smart glasses) directions for transporting particular assets or packages. However, smart glasses may be uncomfortable to use for long periods of time (e.g., due to the weight and constant pressure), and these glasses reduce the peripheral vision available to users for instructions, or reduce vision in general due to glare on the lenses, which may impact both safety and job accuracy. Existing technology solutions also fail to have more robust functionality to meet the needs of multiple users for various tasks. Further, some existing asset handling technologies are based on static projection methods that are also burdensome. For example, some solutions require users to move around a large projector cart or wagon mounted with a generic projector to guide users to destinations.
- Various embodiments of the present disclosure improve these existing technologies, such as smart glasses, by at least utilizing a hands-free user device(s), a control system or server in networked communication with the hands-free user device(s), and/or a generated augmented reality environment to facilitate handling and transport of an asset or package by carrier personnel or the like. The handling and/or transport of the asset or package may be related to a picking of the asset from a pick location (e.g., to "pull" the asset to fulfill an order thereof by a customer), or the sorting of the asset to a sort location (e.g., from a conveyor belt or the like to the next location in which transport or handling of the asset may occur, for example, on the shelving of a warehouse or a vehicle). The hands-free user device(s) enables carrier personnel to transport and/or handle the asset or package in a safe, ergonomic, efficient, and accurate manner, regardless of where (e.g., to and from) the handling and/or transport is occurring, at least within a three dimensional environment mapped via the hands-free user device(s).
- In an illustrative example of how these existing technologies are improved according to aspects of the present disclosure, a user device can be worn by a user, such as on a wearable article of clothing, as opposed to placing eyewear over a user's eyes or using a mobile or cart device for the handling of assets. After one or more asset identifiers are obtained and location data is determined for an asset, one or more navigational projections can be dynamically generated and displayed (e.g., within a physical environment a user is in, as opposed to a lens medium) to guide the user to the location associated with the location data. Aspects can also detect handling of the asset by the user (e.g., via cameras, sensors). One or more notifications associated with the detection of the handling can be received (e.g., from a control system). Location data can be determined based on analyzing asset identifier data and analyzing a present position of the user device. Further, a user's environment may be mapped based at least on generating a multidimensional graphical representation of the environment and associating one or more asset locations within the mapped environment. At least each of these new functionalities improve existing technologies, as these are functionalities that various existing technologies do not now employ.
- Conventional methods in the shipping and other industries rely upon carrier personnel manually reading and/or scanning asset identifier data associated with the asset or package and then based thereon manually transporting the package or asset to the proper location. A pick or inventory location code or identifier and/or a sort location code or identifier could then also—in some instances—be read or scanned to confirm transport was correctly completed. Inefficiencies and inaccuracies are oftentimes encountered.
- To address these inefficiencies and inaccuracies of the conventional methods, various non-conventional methods have been employed in the present disclosure. For example, such non-conventional methods include the following operations: obtaining one or more asset identifiers and determining location data for the associated asset(s). Navigational projections can be dynamically generated and displayed (e.g., within a physical environment a user is in, as opposed to on a lens medium) to guide the user to the location associated with the location data. Aspects can also detect handling of the asset by the user (e.g., via cameras, sensors). One or more notifications associated with the handling can be received (e.g., from a control system). Location data can be determined based on analyzing asset identifier data and analyzing a present position of the user device. Further, a user's environment may be mapped based at least on generating a multidimensional graphical representation of the environment and associating one or more asset locations within the mapped environment. At least each of these new functionalities includes non-conventional functions.
- As used herein, an asset may be a parcel or group of parcels, a package or group of packages, a box, a crate, a drum, a box strapped to a pallet, and/or the like. According to standard practices, packages to be sorted may be moved along a conveyor belt from some package source to an intake location (e.g., one or more sort employee workstations). A user (e.g., a sort employee or carrier personnel generally) may scan a bar code on the package, or simply review information printed on the package, and move that package to an appropriate sort location (e.g., a vehicle, a shelf, and/or the like) based on the information provided on the package or via the barcode scanner. As described herein, embodiments utilizing a conveyor belt assembly may rely upon an acquisition device (e.g., a stationary imager) positioned above the conveyor, upstream of the intake location or sort employee workstations, to capture data associated with the package. Additional details in this respect may be understood with reference to U.S. Ser. No. 15/581,609, the contents of which are incorporated by reference herein in their entirety.
- Via the hands-free user device(s), the carrier personnel or sort employee may be guided to particular packages to select for transport. Upon the carrier personnel or sort employee picking up the particular packages they are guided to, the hands-free user device(s) may be configured, according to various embodiments, to generate various projections, visible to the carrier personnel or sort employee. The generated projections, which may be three-dimensional or two-dimensional in form, are configured to guide the carrier personnel or sort employee from their current location to the appropriate sort location for the particular package being handled. Upon arrival—via the guidance of the various projections—at the appropriate sort location, the hands-free user device(s), upon detecting a placement of the particular package may further verify that the placement is correct. If incorrect, notification(s) may be generated, which notifications may take multiple forms, as detailed elsewhere herein.
- In the context of picking, the control system may, via the network, interface with the hands-free user device(s) so as to generate one or more of various projections to guide the carrier personnel or pick employee to the location of a particular package that needs to be picked or “pulled” for order fulfillment from a warehouse location or the like. Upon arrival of the carrier personnel or pick employee at the pick location, specific projections may be generated, so as to advise the personnel or employee which specific package should be picked/pulled and/or how many packages (i.e., of the same type) should be picked/pulled. It should be understood that upon or in response to detection of the picking/pulling of the package(s) by the user device, the latter may be further configured to then guide the carrier personnel to a subsequent location for ongoing handling/transport of the package. Exemplary subsequent locations may include a conveyor belt, a sort location, and/or a delivery location, as discussed above and also detailed elsewhere herein.
- In certain embodiments, although not necessary via the three dimensional mapping of the facility or warehouse and the network interface between the user device(s) and the control system/server, the hands-free user device(s) may utilize software that not only detects changes in handling of the packages (e.g., picking up or placement actions), but that also detects various markers or identifiers distributed throughout the facility or warehouse, so as to ensure accuracy of the guidance and/or navigational instructions provided to the carrier personnel. In other embodiments, no such markers or identifiers may be provided, as the three dimensional mapping via the user device(s)—with networking connectivity to the control system/server—may be utilized to calibrate and establish defined locations (i.e., pick or sort) throughout the facility or warehouse prior to utilization of the hands-free user device(s) for operational purposes by the carrier personnel. In some embodiments, radio signal triangulation (RFID/WIFI), digital compass, and/or any other current method to determine indoor and outdoor position and bearing may be utilized.
- In the context of picking, the control system may also, via the network or otherwise, interface with a fixed projector and/or the hands-free user device(s) so as to generate one or more of various projections to guide a delivery vehicle operator to the location of a particular package that needs to be picked or "pulled" for delivery at an address at which the delivery vehicle is presently located. Upon entry of the vehicle operator (or other personnel) into the loading portion of the delivery vehicle, specific projections may be generated, so as to advise the personnel or employee which specific package should be picked/pulled for delivery at the present address. It should be understood that upon detection of the picking/pulling of the package(s) by the projector and/or the user device, at least the latter may be further configured to then guide the carrier personnel or vehicle operator to a subsequent location for ongoing handling/transport of the package.
- Embodiments of the present disclosure may be implemented in various ways, including as computer program products that comprise articles of manufacture. A computer program product may include a non-transitory computer-readable storage medium storing applications, programs, program modules, scripts, source code, program code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like (also referred to herein as executable instructions, instructions for execution, computer program products, program code, and/or similar terms used herein interchangeably). Such non-transitory computer-readable storage media include all computer-readable media (including volatile and non-volatile media).
- In one embodiment, a non-volatile computer-readable storage medium may include a floppy disk, flexible disk, hard disk, solid-state storage (SSS) (e.g., a solid state drive (SSD), solid state card (SSC), solid state module (SSM)), enterprise flash drive, magnetic tape, or any other non-transitory magnetic medium, and/or the like. A non-volatile computer-readable storage medium may also include a punch card, paper tape, optical mark sheet (or any other physical medium with patterns of holes or other optically recognizable indicia), compact disc read only memory (CD-ROM), compact disc-rewritable (CD-RW), digital versatile disc (DVD), Blu-ray disc (BD), any other non-transitory optical medium, and/or the like. Such a non-volatile computer-readable storage medium may also include read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory (e.g., Serial, NAND, NOR, and/or the like), multimedia memory cards (MMC), secure digital (SD) memory cards, SmartMedia cards, CompactFlash (CF) cards, Memory Sticks, and/or the like. Further, a non-volatile computer-readable storage medium may also include conductive-bridging random access memory (CBRAM), phase-change random access memory (PRAM), ferroelectric random-access memory (FeRAM), non-volatile random-access memory (NVRAM), magnetoresistive random-access memory (MRAM), resistive random-access memory (RRAM), Silicon-Oxide-Nitride-Oxide-Silicon memory (SONOS), floating junction gate random access memory (FJG RAM), Millipede memory, racetrack memory, and/or the like.
- In one embodiment, a volatile computer-readable storage medium may include random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), fast page mode dynamic random access memory (FPM DRAM), extended data-out dynamic random access memory (EDO DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), double data rate type two synchronous dynamic random access memory (DDR2 SDRAM), double data rate type three synchronous dynamic random access memory (DDR3 SDRAM), Rambus dynamic random access memory (RDRAM), Twin Transistor RAM (TTRAM), Thyristor RAM (T-RAM), Zero-capacitor (Z-RAM), Rambus in-line memory module (RIMM), dual in-line memory module (DIMM), single in-line memory module (SIMM), video random access memory (VRAM), cache memory (including various levels), flash memory, register memory, and/or the like. It will be appreciated that where embodiments are described to use a computer-readable storage medium, other types of computer-readable storage media may be substituted for or used in addition to the computer-readable storage media described above.
- As should be appreciated, various embodiments of the present disclosure may also be implemented as methods, apparatus, systems, computing devices, computing entities, and/or the like. As such, embodiments of the present disclosure may take the form of an apparatus, system, computing device, computing entity, and/or the like executing instructions stored on a computer-readable storage medium to perform certain steps or operations. However, embodiments of the present disclosure may also take the form of an entirely hardware embodiment performing certain steps or operations.
- Embodiments of the present disclosure are described below with reference to block diagrams and flowchart illustrations. Thus, it should be understood that each block of the block diagrams and flowchart illustrations may be implemented in the form of a computer program product, an entirely hardware embodiment, a combination of hardware and computer program products, and/or apparatus, systems, computing devices, computing entities, and/or the like carrying out instructions, operations, steps, and similar words used interchangeably (e.g., the executable instructions, instructions for execution, program code, and/or the like) on a computer-readable storage medium for execution. For example, retrieval, loading, and execution of code may be performed sequentially such that one instruction is retrieved, loaded, and executed at a time. In some exemplary embodiments, retrieval, loading, and/or execution may be performed in parallel such that multiple instructions are retrieved, loaded, and/or executed together. Thus, such embodiments can produce specifically-configured machines performing the steps or operations specified in the block diagrams and flowchart illustrations. Accordingly, the block diagrams and flowchart illustrations support various combinations of embodiments for performing the specified instructions, operations, or steps.
- Generally, embodiments of the present disclosure relate to concepts for utilizing a hands-free user device(s), a control system or server in networked communication with the hands-free user device(s), and a generated augmented reality environment to facilitate handling and transport of an asset or package by carrier personnel or the like.
FIG. 1 is a schematic diagram showing the exemplary communication relationships between components of various embodiments of the present disclosure. As shown in FIG. 1, the system may include one or more control systems 100, one or more user devices 110, one or more (optional) location devices 415 associated with a location 400 (e.g., a sort location or a pick location), one or more (optional) conveyor belt assemblies 800, and one or more networks 105. Each of the components of the system may be in electronic communication with one another over the same or different wireless or wired networks including, for example, a wired or wireless Personal Area Network (PAN), Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), or the like. Additionally, while FIG. 1 illustrates certain system entities as separate, standalone entities, the various embodiments are not limited to this particular architecture. - A. Exemplary Control System
- FIG. 2 provides a schematic of a control system 100 according to one embodiment of the present disclosure. As described above, the control system 100 may be incorporated into a system as one or more components for providing information regarding the appropriate location 400 for each of one or more assets 10 (see FIGS. 8-12). In general, the terms computing entity, computer, entity, device, system, and/or similar words used herein interchangeably may refer to, for example, one or more computers, computing entities, desktops, mobile phones, tablets, phablets, notebooks, laptops, distributed systems, gaming consoles (e.g., Xbox, Play Station, Wii), watches, glasses, key fobs, radio frequency identification (RFID) tags, ear pieces, scanners, televisions, dongles, cameras, wristbands, kiosks, input terminals, servers or server networks, blades, gateways, switches, processing devices, processing entities, set-top boxes, relays, routers, network access points, base stations, the like, and/or any combination of devices or entities adapted to perform the functions, operations, and/or processes described herein. Such functions, operations, and/or processes may include, for example, transmitting, receiving, operating on, processing, displaying, storing, determining, creating/generating, monitoring, evaluating, comparing, and/or similar terms used herein interchangeably. In one embodiment, these functions, operations, and/or processes can be performed on data, content, information, and/or similar terms used herein interchangeably. The control system 100 may also comprise various other systems, such as an Address Matching System (AMS), an Internet Membership System (IMS), a Customer Profile System (CPS), a Package Center Information System (PCIS), a Customized Pickup and Delivery System (CPAD), a Web Content Management System (WCMS), a Notification Email System (NES), a Fraud Prevention System (FPS), and a variety of other systems and their corresponding components.
- As indicated, in one embodiment, the control system 100 may also include one or more communications interfaces 220 for communicating with various computing entities, such as by communicating data, content, information, and/or similar terms used herein interchangeably that can be transmitted, received, operated on, processed, displayed, stored, and/or the like.
- As shown in FIG. 2, in one embodiment, the control system 100 may include or be in communication with one or more processing elements 205 (also referred to as processors, processing circuitry, and/or similar terms used herein interchangeably) that communicate with other elements within the control system 100 via a bus, for example. As will be understood, the processing element 205 may be embodied in a number of different ways. For example, the processing element 205 may be embodied as one or more complex programmable logic devices (CPLDs), microprocessors, multi-core processors, co-processing entities, application-specific instruction-set processors (ASIPs), microcontrollers, and/or controllers. Further, the processing element 205 may be embodied as one or more other processing devices or circuitry. The term circuitry may refer to an entirely hardware embodiment or a combination of hardware and computer program products. Thus, the processing element 205 may be embodied as integrated circuits, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic arrays (PLAs), hardware accelerators, other circuitry, and/or the like. As will therefore be understood, the processing element 205 may be configured for a particular use or configured to execute instructions stored in volatile or non-volatile media or otherwise accessible to the processing element 205. As such, whether configured by hardware or computer program products, or by a combination thereof, the processing element 205 may be capable of performing steps or operations according to embodiments of the present disclosure when configured accordingly.
- In one embodiment, the control system 100 may further include or be in communication with non-volatile media (also referred to as non-volatile storage, memory, memory storage, memory circuitry, and/or similar terms used herein interchangeably). In one embodiment, the non-volatile storage or memory may include one or more non-volatile storage or memory media 210, including but not limited to hard disks, ROM, PROM, EPROM, EEPROM, flash memory, MMCs, SD memory cards, Memory Sticks, CBRAM, PRAM, FeRAM, NVRAM, MRAM, RRAM, SONOS, FJG RAM, Millipede memory, racetrack memory, and/or the like. As will be recognized, the non-volatile storage or memory media may store databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like. Such code may include an operating system, an acquisition module, a sort location module, a matching module, and a notification module. The terms database, database instance, database management system, and/or similar terms used herein interchangeably may refer to a structured collection of records or data that is stored in a computer-readable storage medium, such as via a relational database, hierarchical database, and/or network database.
- In one embodiment, the control system 100 may further include or be in communication with volatile media (also referred to as volatile storage, memory, memory storage, memory circuitry, and/or similar terms used herein interchangeably). In one embodiment, the volatile storage or memory may also include one or more volatile storage or memory media 215, including but not limited to RAM, DRAM, SRAM, FPM DRAM, EDO DRAM, SDRAM, DDR SDRAM, DDR2 SDRAM, DDR3 SDRAM, RDRAM, TTRAM, T-RAM, Z-RAM, RIMM, DIMM, SIMM, VRAM, cache memory, register memory, and/or the like. As will be recognized, the volatile storage or memory media may be used to store at least portions of the databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like being executed by, for example, the processing element 205. Thus, the databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like may be used to control certain aspects of the operation of the control system 100 with the assistance of the processing element 205 and operating system.
- As indicated, in one embodiment, the control system 100 may also include one or more communications interfaces 220 for communicating with various computing entities, such as by communicating data, content, information, and/or similar terms used herein interchangeably that can be transmitted, received, operated on, processed, displayed, stored, and/or the like. Such communication may be executed using a wired data transmission protocol, such as fiber distributed data interface (FDDI), digital subscriber line (DSL), Ethernet, asynchronous transfer mode (ATM), frame relay, data over cable service interface specification (DOCSIS), or any other wired transmission protocol. Similarly, the control system 100 may be configured to communicate via wireless external communication networks using any of a variety of protocols, such as general packet radio service (GPRS), Universal Mobile Telecommunications System (UMTS), Code Division Multiple Access 2000 (CDMA2000), CDMA2000 1× (1×RTT), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), Evolved Universal Terrestrial Radio Access Network (E-UTRAN), Evolution-Data Optimized (EVDO), High Speed Packet Access (HSPA), High-Speed Downlink Packet Access (HSDPA), IEEE 802.11 (Wi-Fi), 802.16 (WiMAX), ultra-wideband (UWB), infrared (IR) protocols, near field communication (NFC) protocols, Bluetooth™ protocols (e.g., Bluetooth™ Smart), wireless universal serial bus (USB) protocols, and/or any other wireless protocol.
- The control system 100 may include or be in communication with one or more input elements, such as a keyboard input, a mouse input, a touch screen/display input, motion input, movement input, audio input, pointing device input, joystick input, keypad input, and/or the like. The control system 100 may also include or be in communication with one or more output elements (not shown), such as audio output, video output, screen/display output, motion output, movement output, and/or the like.
- As will be appreciated, one or more of the control system 100 components may be located remotely from other control system 100 components, such as in a distributed system. Furthermore, one or more of the components may be combined, and additional components performing the functions described herein may be included in the control system 100. Thus, the control system 100 can be adapted to accommodate a variety of needs and circumstances. As will be recognized, these architectures and descriptions are provided for exemplary purposes only and are not limiting to the various embodiments. Additional details in this respect may be understood from U.S. Ser. No. 15/390,109, the contents of which are incorporated herein by reference in their entirety.
- As will also be appreciated, the control system 100 may be generally configured to maintain and/or update a defined location map associated with a facility or warehouse in which the user device(s) will be operated. This may be maintained for provision to the user device(s) upon calibration or initial "environment mapping" (see FIG. 8) via the user device(s); in other embodiments, the control system may maintain the defined location map (indicating where each package or asset should be located, whether for picking or sorting) as a fail-safe check or validation to be assessed against the environment mapping conducted via and at the user device(s). In these embodiments, location devices 415 (or identifier tags, codes, or the like) may be provided at the respective locations, for scanning or recognition via the user device(s) during calibration and/or environment mapping. In at least one preferred embodiment, however, the environment mapping occurs without need for or utilization of such location devices 415.
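- For illustration purposes only, the following non-limiting sketch shows one way the calibration pass described above might populate a defined location map: the device traverses the facility and each recognized or operator-confirmed pick/sort location is registered against the coordinates at which it was observed. The labels, coordinates, and data layout are hypothetical assumptions rather than the disclosed implementation.

```python
# Sketch: build the defined location map from a calibration walk.
from dataclasses import dataclass

@dataclass
class DefinedLocation:
    label: str            # e.g., "SHELF-A3" or "VEHICLE-BAY-2" (hypothetical)
    x: float              # position in the device's mapped frame, meters
    y: float
    z: float

location_map: dict[str, DefinedLocation] = {}

def register_location(label: str, pose: tuple[float, float, float]) -> None:
    """Record (or update) a defined location observed during calibration."""
    location_map[label] = DefinedLocation(label, *pose)

# During the calibration walk, the device recognizes a marker, or the
# operator confirms a location, at the device's current mapped coordinates:
register_location("SHELF-A3", (12.4, 3.1, 1.5))
register_location("VEHICLE-BAY-2", (40.2, 7.8, 0.0))
print(location_map["SHELF-A3"])
```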
- B. Exemplary User Device
- FIG. 3 depicts a user device 110 that a user 5 (e.g., user 5 of FIGS. 8-12) may operate. As used herein, a user may be an individual (e.g., carrier personnel, such as a sort employee, a pick employee, or the like), a group of individuals, and/or the like. In various embodiments, the user may operate the user device 110, which may include one or more components that are functionally similar to those of the control system 100. In one embodiment, the user device 110 may be one or more mobile phones, tablets, watches, glasses (e.g., Google Glass, HoloLens, Vuzix M-100, SeeThru, Optinvent ORA-S, Epson Moverio BT-300, Epson Moverio BT-2000, ODG R-7, binocular Smart Glasses, monocular Smart Glasses, and the like), wristbands, and the like, and/or any combination of devices or entities adapted to perform the functions, operations, and/or processes described herein. In particularly preferred embodiments herein, the user device 110 is a hands-free type device, including wearable items/devices (e.g., the user devices of FIGS. 4-5), head-mounted displays (HMDs) (e.g., Oculus Rift, Sony HMZ-T3W, and the like), and the like. It should be understood that, in at least these embodiments, the user device 110 is configured so as not to impede the user's range of vision or the like during use thereof, as may occur in other embodiments wherein, for example, the user device 110 is a type of glasses or the like.
- It should also be understood that the term user device 110 is intended to refer to any device that projects, superimposes, overlays, or otherwise provides an image or projection on a surface with respect to a user's viewing angle or line of vision or a user device 110's angle. With reference now to FIGS. 4 and 5, illustrated therein is an exemplary embodiment of a hands-free user device 110 according to various solutions described herein. The device 110 is configured for chest mounting via a mounting mechanism 112 in the illustrated embodiment. The mounting mechanism 112 may include a shoulder strap portion 112-1 configured to adjustably wrap around the anterior and posterior portions of the user's shoulders and a waist portion 112-2 configured to adjustably wrap around the user's waist. The shoulder strap portion 112-1 is connected to a top portion of the device component 114 at the anterior side and to a top portion of the waist portion 112-2 at the posterior side. As discussed herein, it should be understood that the device 110 may come in any suitable form and be mounted in any suitable manner and in any suitable location. For example, in some embodiments, the user device 110 does not include the mounting mechanism 112 and may be mounted to a component on the user's head (e.g., a hardhat), within a wristband, within a sock, within a glove, within a shirt, and/or within any other suitable article of clothing, in any orientation. It may also be provided with a mounting mechanism (not illustrated) on a picking cart, a forklift, or any type of component operated and/or being moved by the carrier personnel. In some embodiments, other mounts could place the device 110 on top of the head (e.g., via a helmet, cap, or headband) or on the shoulder, in order to avoid covering the projection element when a parcel is carried in front of the chest. In some embodiments, the projection element, the sensors, and the processing units are mounted on different parts of the body to achieve a better weight distribution.
- Remaining with FIGS. 4 and 5, the user device 110 in its hands-free form may include not only the mounting mechanism 112 but also a device component 114 that together define and constitute the user device 110 in some embodiments. Via the device component 114 (FIG. 5), the user device 110 in its hands-free form may include an antenna 115 (e.g., the antenna 312 of the user device of FIG. 3), a camera 116, a speaker/microphone 117, a pivoting laser projector 118, and two or more three-dimensional depth sensors 119. In this respect, it should more broadly be understood that the term user device 110 is intended to also include any other peripheral electronics and functionality that may be provided in conjunction with such devices. For example, the user device 110 may include speakers, headphones, or other electronic hardware for audio output, a plurality of display devices, one or more position sensors (e.g., gyroscopes, global positioning system receivers, and/or accelerometers), battery packs, beacons for external sensors (e.g., infrared lamps), or the like. In one embodiment, the user device 110 can be used to provide an augmented reality environment/area, a mixed reality environment/area, and/or similar words, as may be used herein interchangeably, to a user. The terms augmented/mixed environment/area should be understood to refer to a combined environment/area including the physical environment/area and elements of a virtual environment/area. In some embodiments, the pivoting laser projector 118 is alternatively an LED picoprojector.
- In some embodiments, the device component 114 alternatively or additionally includes different sensors for various functions, such as one or more digital compasses, accelerometers, and/or gyroscopes configured to determine changes in the position or speed of a user such that the pivoting laser projector 118 projects the correct image in the correct orientation. For example, if the user is hanging in a sideways manner, an accelerometer can detect that the associated device component 114 is likewise oriented. This information can be identified by a processor, which causes the pivoting laser projector 118 to responsively transmit a projection in a sideways manner, as opposed to a manner associated with the user standing on his/her feet. In another example, the user can be running or otherwise moving at a particular speed, which causes the projector 118 to make projections faster or slower based on the speed or acceleration at which the user is moving. Additionally or alternatively, these movement sensors can be used for notification purposes to the control system 100. For example, the accelerometer may infer that a person is in a particular orientation. These accelerometer readings may then be transmitted, via the antenna 115, to the control system 100 such that the control system 100 responsively transmits a notification back to the device component 114 in order to warn or notify the user whether the user is in a suitable orientation. Other sensors may be used alternatively or additionally, such as range finders to identify how far away the device component 114 is from obstacles (e.g., conveyor devices) within an environment. This may help the projected image be projected in a more precise three-dimensional manner. For example, referring to FIG. 12, the projection 810 may be projected in its specific orientation based at least on one or more range finders identifying the precise distance between a user device and the conveying mechanism 802. In another example, proximity-based sensors (e.g., an RFID reader and tag) may be utilized within the user device and an object (e.g., an asset and/or conveying mechanism) in order to trigger the appropriate projections projected by the projector 118. For example, the device component 114 may include a tag reader that, when within a proximity or signal strength threshold of a tag located on an asset/object, triggers projections and/or notifications from the control system 100. In another example, assets or other pieces of equipment (e.g., a conveyance mechanism) include one or more beacons configured to transmit location identifiers to any listening device within a threshold distance. In these embodiments, the listening device may be the device component 114, which receives the location identifiers and transmits them to the control system 100 such that the control system 100 provides responsive notifications back to the device component 114, such as "pick package Y from shelf X," etc. In some aspects, the device component 114 includes one or more location sensors (e.g., beacons, GPS modules) for determining and analyzing location data as described herein.
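- As a non-limiting sketch of the orientation compensation described above, one common convention derives the device's tilt about the projection axis from the accelerometer's gravity vector and counter-rotates the projected frame so it stays upright for the user. The function names and axis convention are illustrative assumptions.

```python
# Sketch: counter-rotate the projection based on accelerometer tilt.
import math

def tilt_from_accel(ax: float, ay: float, az: float) -> float:
    """Tilt angle (radians) of the device, from a static accelerometer read."""
    return math.atan2(ax, math.sqrt(ay * ay + az * az))

def projection_rotation(ax: float, ay: float, az: float) -> float:
    """Rotation (degrees) to apply to the projected frame so it reads upright."""
    return -math.degrees(tilt_from_accel(ax, ay, az))

# Device tilted roughly 90 degrees sideways: gravity appears on the x axis,
# so the projector should counter-rotate the image by about -90 degrees.
print(projection_rotation(9.81, 0.0, 0.0))
```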
- Referring back to FIG. 3, the user device 110 can include an antenna 312 (e.g., the antenna 115 of FIG. 5), a transmitter 304 (e.g., radio), a receiver 306 (e.g., radio), and a processing element 308 (e.g., CPLDs, microprocessors, multi-core processors, co-processing entities, ASIPs, microcontrollers, and/or controllers) that provides signals to and receives signals from the transmitter 304 and receiver 306, respectively. Certain embodiments of the user device 110 may also include and/or be associated with any of a variety of sensors (e.g., three-dimensional sensors, such as the depth sensors 119 of FIG. 5), depth cameras (e.g., the camera 116 of FIG. 5), three-dimensional scanners, binocular cameras, stereo-vision systems, and pivoting projectors (e.g., the laser projector 118 of FIG. 5).
- In certain embodiments, the three-dimensional sensors (e.g., sensors 119 of FIG. 5) and/or the three-dimensional scanners may be utilized to "read" the environment surrounding the user device 110, as detailed elsewhere herein. In those and additional embodiments, the sensors and/or scanners may build therefrom a three-dimensional model of the area through which the device 110 travels and/or has travelled. This generated model, as detailed elsewhere herein, may then be compared by one or more processors within the device 110 to a memory-based map of the facility or area (i.e., the environment). By doing so, the scanner readings may be used to determine which area of the map is in front of a user of the user device 110 (during operation) and to extrapolate from the same the position and heading of the user for future movement.
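- The following toy sketch illustrates, in a non-limiting way, the map-matching idea just described: a small occupancy grid captured by the depth sensors is slid across a stored facility map, and the best-overlapping offset is taken as the device's position estimate. Real systems would use three-dimensional scan matching (e.g., ICP); the map contents here are fabricated for illustration only.

```python
# Sketch: estimate position by correlating a sensor snippet against a map.
import numpy as np

facility_map = np.zeros((8, 10), dtype=int)   # stored "memory-based" map
facility_map[2:5, 6] = 1                      # a shelving column
facility_map[7, :] = 1                        # a wall

def locate(scan: np.ndarray, world: np.ndarray) -> tuple:
    """Return the (row, col) offset where the scan best matches the map."""
    best, best_pos = -1, (0, 0)
    rows, cols = scan.shape
    for r in range(world.shape[0] - rows + 1):
        for c in range(world.shape[1] - cols + 1):
            score = np.sum(world[r:r + rows, c:c + cols] == scan)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

scan = facility_map[1:5, 5:8].copy()          # what the sensors "see"
print(locate(scan, facility_map))             # -> (1, 5)
```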
- In certain embodiments, the pivoting projectors may be the pivoting laser projector 118 of FIG. 5, although in other embodiments the projectors need not necessarily be laser-based. In various embodiments, it should be understood that the projectors are configured to generate and provide, in a manner visible to the user (e.g., the carrier personnel), one or more navigational guidance projections (e.g., arrows, frames, text, and/or the like). These projections, as will be discussed in further detail elsewhere herein, may be two-dimensional representations (e.g., as illustrated in FIGS. 8-11C), three-dimensional representations (e.g., the projection 810 of FIG. 12), holographic-based projections, or the like. The projections may be provided on a floor surface, a wall surface, and/or on or adjacent a shelving structure, as detailed elsewhere herein. Optionally, a separate sensor, a regular camera, or the like (e.g., the camera 116 of FIG. 5) may also be provided on the user device 110 for reading the projected image(s) and verifying therefrom whether the projector is working properly and/or whether the projected result is readable. Via the user device 110 and software associated therewith, in conjunction with the location-determining capabilities of the system described herein generally, the projections provided via the user device are updated in real-time or near-real-time as the user moves physically. A refresh rate in the range of 35-60 times per second may be provided, although differing refresh rates may be desirable, provided that the rate is substantially real-time or near-real-time in nature.
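- Purely for illustration, a fixed-rate update loop in the 35-60 Hz band mentioned above might be paced as in the following sketch; get_pose() and render_guidance() are placeholders standing in for the device's localization and projector pipelines, and are not actual disclosed interfaces.

```python
# Sketch: hold a steady projection refresh rate within the 35-60 Hz band.
import time

REFRESH_HZ = 50                     # an assumed rate within the suggested band
FRAME_BUDGET = 1.0 / REFRESH_HZ

def get_pose():
    return (0.0, 0.0, 0.0)          # stub: x, y, heading from the sensors

def render_guidance(pose):
    pass                            # stub: re-aim and redraw the projection

def run(frames: int = 200) -> None:
    for _ in range(frames):
        started = time.monotonic()
        render_guidance(get_pose())                   # redraw for latest pose
        elapsed = time.monotonic() - started
        time.sleep(max(0.0, FRAME_BUDGET - elapsed))  # hold the frame rate

run()
```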
- Remaining with FIG. 5, the camera 116 of the user device component 114 illustrated therein in its hands-free form may be utilized as a fail-safe for visual confirmation or the like of correct/accurate handling of an asset or package by the user of the user device 110. For example, in some embodiments the camera 116 captures each asset identifier and/or asset location identifier as a user traverses an environment. This location data and asset identifier data may then be transmitted, in near-real-time via the antenna 115, to the control system 100. The control system 100 may then compare the asset identifier to asset identifiers stored in a data store to identify an asset and do the same with the captured location. In this way, the control system 100 may identify any discrepancies between the asset and the location by locating any mismatches between identifiers. For example, the control system 100 may determine that the identifier associated with package X should be located, picked, and/or placed at shelf Y, but the camera 116 captured it located in, picked, sorted, and/or placed at shelf B. A notification indicating this may be responsively transmitted back to the device component 114 such that the speaker 117 issues a prompt indicating the discrepancy and/or telling the user where the correct location is for the particular package. In some embodiments, the device component 114 itself determines this information without the need to transmit the information to the control system 100 for processing. In some embodiments, other notifications may be provided additionally or alternatively, such as a visual notification within a display screen on the device component 114 and/or a notification that causes vibration of the device component 114.
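- A minimal, non-limiting sketch of the discrepancy check described above follows: each (asset identifier, observed location) capture is compared against an assignment table, and a mismatch yields a corrective prompt for the speaker. The identifiers and data-store layout are illustrative assumptions.

```python
# Sketch: flag mismatches between observed and assigned asset locations.
ASSIGNED = {"PKG-001": "SHELF-Y", "PKG-002": "SHELF-A"}  # hypothetical store

def check_scan(asset_id: str, observed_shelf: str) -> str | None:
    """Return a spoken prompt if the capture disagrees with the assignment."""
    expected = ASSIGNED.get(asset_id)
    if expected is None:
        return f"Asset {asset_id} is not in the manifest."
    if observed_shelf != expected:
        return f"{asset_id} belongs at {expected}, not {observed_shelf}."
    return None  # placement agrees; no notification needed

print(check_scan("PKG-001", "SHELF-B"))  # -> corrective prompt for the user
```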
- In certain embodiments, the camera 116 may be utilized as a verification mechanism for ensuring that the projector 118 is working properly and/or is displaying readable projections. For example, the device component 114 may stream, in near-real-time, information captured via the camera 116 to the control system 100. If no projections are captured, this may trigger an alert (e.g., to a supervisor mobile device), which indicates that the projections are not being made. Likewise, if a projection is not verified (e.g., because there is a lot of light reducing projection image boundaries), a notification can be made in a similar manner as described above. The speaker 117 may be utilized in conjunction therewith, so as to provide audible commands to the user (e.g., delivered from the control system 100 to the component 114 via the antenna 115) should a deviation occur and/or to enable the user of the user device to communicate, via the network, with the control system in a near-real-time or real-time manner. For example, in some embodiments, the speaker 117 alternatively or additionally includes a microphone that picks up sound variations that are stored in the memory. The sound variations may correspond to a command or natural language phrase issued by the user, such as "where do I find item X?" or "where is shelf Y located?" Responsively, these sound variations are transmitted, via the antenna 115, to the control system 100. In these embodiments, the control system 100 may employ one or more voice recognition algorithms to interpret the sound variations and provide one or more responsive notifications back to the device component 114, such that the speaker 117 provides the notification output. For example, in response to the user question of "where do I find item X?" the control system 100 may interpret the phrase and identify a data structure that associates the location with item X. The control system 100 may then responsively transmit to the device component 114 a notification that causes the speaker 117 to output the location of item X.
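- For illustration only, the voice-request flow described above might be handled as in the following sketch once an utterance has been transcribed to text; a real deployment would sit behind a speech-to-text engine, and the regular expression and item table here are fabricated assumptions.

```python
# Sketch: map a transcribed "where is ..." question to a spoken-style answer.
import re

ITEM_LOCATIONS = {"item x": "shelf 12, aisle 3", "shelf y": "aisle 7"}

def answer(utterance: str) -> str:
    """Interpret a 'where is/do I find ...' question against a lookup table."""
    match = re.search(r"where (?:is|do i find) (.+?)\??$", utterance.lower())
    if not match:
        return "Sorry, I did not understand the request."
    subject = match.group(1).strip()
    place = ITEM_LOCATIONS.get(subject)
    return f"{subject} is at {place}." if place else f"I cannot find {subject}."

print(answer("Where do I find item X?"))  # -> "item x is at shelf 12, aisle 3."
```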
- Returning back to FIG. 3, the signals provided to and received from the transmitter 304 and the receiver 306, respectively, may include signaling information in accordance with air interface standards of applicable wireless systems. In this regard, the user device 110 may be capable of operating with one or more air interface standards, communication protocols, modulation types, and access types. More particularly, the user device 110 may operate in accordance with any of a number of wireless communication standards and protocols, such as those described above with regard to the control system 100. In a particular embodiment, the user device 110 may operate in accordance with multiple wireless communication standards and protocols, such as UMTS, CDMA2000, 1×RTT, WCDMA, TD-SCDMA, LTE, E-UTRAN, EVDO, HSPA, HSDPA, Wi-Fi, WiMAX, UWB, IR, NFC, Bluetooth™ Smart, USB, and/or the like. Similarly, the user device 110 may operate in accordance with multiple wired communication standards and protocols, such as those described above with regard to the control system 100, via a network interface 320.
- Via these communication standards and protocols, the user device 110 can communicate with various other entities (e.g., the control system 100, a location device 415, or the like) using concepts such as Unstructured Supplementary Service Data (USSD), Short Message Service (SMS), Multimedia Messaging Service (MMS), Dual-Tone Multi-Frequency Signaling (DTMF), and/or Subscriber Identity Module Dialer (SIM dialer). The user device 110 can also download changes, add-ons, and updates, for instance, to its firmware, software (e.g., including executable instructions, applications, program modules), and operating system; this may occur periodically, upon initiation by a user of the user device 110, or upon cue(s) received at the user device from the control system 100.
- According to one embodiment, the user device 110 may also include a location and/or perspective determining aspect, device, module, functionality, and/or similar words used herein interchangeably. For example, the user device 110 may include outdoor and/or environmental positioning aspects, such as a location module adapted to acquire, for example, latitude, longitude, geocode, course, direction, heading, speed, universal time (UTC), date, and/or various other information/data. In one embodiment, the location module can acquire data, sometimes known as ephemeris data, by identifying the number of satellites in view and the relative positions of those satellites. The satellites may be a variety of different satellites, including Low Earth Orbit (LEO) satellite systems, Department of Defense (DOD) satellite systems, the European Union Galileo positioning systems, the Chinese Compass navigation systems, Indian Regional Navigational satellite systems, and/or the like. Alternatively, the location information may be determined by triangulating the user device 110's position in connection with a variety of other systems, including cellular towers, Wi-Fi access points, and/or the like. Similarly, the user device 110 may include indoor positioning aspects, such as a location/environment module adapted to acquire, for example, latitude, longitude, geocode, course, direction, heading, speed, time, date, and/or various other information/data. Some of the indoor systems may use various position or location technologies, including RFID tags, indoor beacons or transmitters, Wi-Fi access points, cellular towers, nearby computing devices (e.g., smartphones, laptops), nearby components with known relative locations, and/or the like. For instance, such technologies may include iBeacons, Gimbal proximity beacons, Bluetooth Low Energy (BLE) transmitters, Near Field Communication (NFC) transmitters, three-dimensional scanners, robot vision systems, environmental mapping devices, and/or the like. These indoor positioning aspects can be used in a variety of settings to determine the location of someone or something to within inches or centimeters.
- The user device 110 may also detect markers and/or target objects. For example, the user device 110 may include readers, scanners, cameras, sensors, and/or the like for detecting when a marker, a target object, and/or a pattern of unique colors is within its point-of-view (POV)/field-of-view (FOV) of the real-world environment/area. For example, readers, scanners, cameras, sensors, and/or the like may include RFID readers/interrogators to read RFID tags, scanners and cameras to capture visual patterns and/or codes (e.g., text, barcodes, character strings, Aztec Codes, MaxiCodes, information/data Matrices, QR Codes, electronic representations, and/or the like), and sensors to detect beacon signals transmitted from target objects or the environment/area in which target objects are located. For example, in some embodiments, the user device 110 may detect signals transmitted from the control system 100 (FIGS. 1-2), an asset 10 (FIG. 8), an improved conveyor belt assembly (FIG. 12), and/or from a location device 415 (FIG. 1), as may be desirable or advantageous.
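- As a non-limiting sketch of the beacon-driven detection just described, a device might keep only advertisements above a signal-strength threshold and resolve each surviving beacon identifier to a known target object. The identifiers, RSSI values, and threshold are illustrative assumptions; real detections would come from a BLE/NFC/RFID stack.

```python
# Sketch: filter beacon advertisements and resolve ids to target objects.
RSSI_THRESHOLD_DBM = -65            # "near enough" cutoff, an assumption
KNOWN_BEACONS = {"beacon-22": "sort location 400", "beacon-7": "asset 10"}

def nearby_targets(advertisements: list[tuple[str, int]]) -> list[str]:
    """Resolve (beacon_id, rssi_dbm) pairs that clear the strength threshold."""
    return [
        KNOWN_BEACONS[bid]
        for bid, rssi in advertisements
        if rssi >= RSSI_THRESHOLD_DBM and bid in KNOWN_BEACONS
    ]

scans = [("beacon-22", -58), ("beacon-7", -80), ("beacon-99", -40)]
print(nearby_targets(scans))        # -> ['sort location 400']
```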
- In one embodiment, the user device 110 may include accelerometer circuitry for detecting movement, pitch, bearing, orientation, and the like of the user device 110. This information/data may be used to determine which area of the augmented/mixed environment/area corresponds to the orientation/bearing of the user device 110 (e.g., x, y, and z axes), so that the corresponding environment/area of the augmented/mixed environment/area may be displayed via the display along with a displayed image. For example, the user device 110 may overlay an image in a portion of the user's POV/FOV of the real-world environment/area. In these and other embodiments, the user device 110 may also include circuitry and/or software for determining when a change in the handling of a package or asset by a user of the user device has occurred. Exemplary changes detected may include the picking up of an asset or package, the setting down of an asset or package, or the like.
- The user device 110 may also comprise or be associated with an asset indicia reader, device, module, functionality, and/or similar words used herein interchangeably. For example, the user device 110 may include a camera or RFID tag reader configured to receive information from passive RFID tags and/or from active RFID tags associated with an asset 10. The user device 110 may additionally or alternatively include an optical reader configured for receiving information printed on an asset 10. For example, the optical reader may be configured to receive information stored as a bar code, QR code, or other machine-readable code. The optical reader may be integral to the user device 110 and/or may be an external peripheral device in electronic communication with the user device 110. The optical reader may also or alternatively be configured to receive information stored as human-readable text, such as characters, character strings, symbols, and/or the like. The user device 110 may utilize the asset indicia reader to receive information regarding an asset 10 to be sorted.
- In at least one embodiment, the user device 110 may be equipped with an optical reader or the like configured to receive and/or monitor information associated with an associated conveyor belt, as detailed elsewhere herein. For example, the optical reader may be configured to receive and/or otherwise monitor and/or recognize a pattern located on the conveyor belt and associated with respective assets or packages. Additional details in this respect may be understood with reference to U.S. Ser. No. 15/581,609, the contents of which are incorporated by reference herein in their entirety.
- The user device 110 may also comprise a user interface (that can include a display or see-through display 314 coupled to a processing element 308 and/or a user input device 318 coupled to a processing element 308). For example, the user interface may be a user application, browser, user interface, and/or similar words used herein interchangeably executing on and/or accessible via the user device 110 to interact with and/or cause display of information, as described herein. The user interface can comprise any of a number of devices allowing the user device 110 to receive data, such as a keypad (hard or soft), a touch display, voice or motion interfaces, or other input device. In embodiments including a keypad, the keypad can include (or cause display of) the conventional numeric (0-9) and related keys (#, *), and other keys used for operating the user device 110, and may include a full set of alphabetic keys or a set of keys that may be activated to provide a full set of alphanumeric keys. In addition to providing input, the user input interface can be used, for example, to activate or deactivate certain functions, such as screen savers and/or sleep modes.
- The user device 110 can also include volatile storage or memory 322 and/or non-volatile storage or memory 324, which can be embedded and/or may be removable. For example, the non-volatile memory may be ROM, PROM, EPROM, EEPROM, flash memory, MMCs, SD memory cards, Memory Sticks, CBRAM, PRAM, FeRAM, NVRAM, MRAM, RRAM, SONOS, FJG RAM, Millipede memory, racetrack memory, and/or the like. The volatile memory may be RAM, DRAM, SRAM, FPM DRAM, EDO DRAM, SDRAM, DDR SDRAM, DDR2 SDRAM, DDR3 SDRAM, RDRAM, TTRAM, T-RAM, Z-RAM, RIMM, DIMM, SIMM, VRAM, cache memory, register memory, and/or the like. The volatile and non-volatile storage or memory can store databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like to implement the functions of the user device 110. As indicated, this may include a user application that is resident on the entity or accessible through a browser or other user interface for communicating with the control system 100 (FIG. 2), location device 415 (FIG. 1), and/or various other computing entities.
- In another embodiment, the user device 110 may include one or more components or functionality that are the same or similar to those of the control system 100, as described in greater detail above. As will be recognized, these architectures and descriptions are provided for exemplary purposes only and are not limiting to the various embodiments.
- In the embodiment shown in FIGS. 4 and 5, an information gathering device may be provided via a combination of the image camera 116 that is mounted on the device component 114 and/or the three-dimensional depth sensors 119. In other embodiments, the information gathering device may be a three-dimensional depth sensor, a stereo camera, and/or the like, utilized independently relative to the camera. In all of these and additional embodiments, the displayed or captured image data (e.g., FIGS. 8-11 described elsewhere herein) is merged with objects in the physical world/environment in a seamless manner, so as to provide a sense that the displayed image(s) or projection is an extension of the reality present in the physical world/environment. This is oftentimes referred to as a "mixed reality" or "hybrid reality" environment, whereby the merging of real and virtual worlds produces a new environment containing visualizations of both physical and digital objects that are able to co-exist and interact relative to one another in a real-time manner. Stated otherwise, provided and/or generated is an overlay of synthetic content on the real world or physical environment, with the former being anchored to, and able to interact with, the real world or physical environment in a real-time manner (e.g., upon movement of a user).
- In the exemplary embodiment of FIGS. 4 and 5, the overlay provided via the user device 110 and its physical component 114 is at least a two-dimensional representation that is projected ahead of or before the user of the user device, so as to provide handling/movement guidance (i.e., navigational guidance) for the user during transport of, or travel to initiate transport of, an asset or package. - C. Exemplary Conveyor Belt Assembly
- FIG. 12 depicts a conveyor belt assembly 800 in communication with the control system 100, where the improved conveyor belt assembly facilitates obtaining of asset 10 information. In the embodiment depicted in FIG. 12, the conveyor belt assembly 800 may comprise a conveying mechanism 802 and an acquisition/display entity 804 (for capturing the asset 10 information), each of which is described in further detail in U.S. Ser. No. 15/581,609, the contents of which are incorporated by reference herein in their entirety.
- Of note relative to FIG. 12, in conjunction with the hands-free user device(s) 110 described elsewhere herein, a user 5 may be guided toward a particular asset or package on the conveying mechanism 802 via one or more navigational projections 810. In at least the illustrated embodiment, the projections 810 are provided in a three-dimensional form, as may be compared with the two-dimensional projections in FIGS. 8-11C. Either may be utilized interchangeably, as may be holographic-based projections or the like. Again, as mentioned, additional detail in this respect may be obtained from U.S. Ser. No. 15/581,609, the contents of which are incorporated by reference herein in their entirety. In various embodiments, a hologram or holographic-based projection is a recording of a light field, as opposed to an image formed by a lens (e.g., of a pair of smart glasses), and is used to display a fully three- or more-dimensional image without the use or aid of special glasses or other intermediate objects. A hologram can be displayed within any physical geographical environment without the need of any projecting medium (e.g., projector screen, object, or lens). In some embodiments, the navigational projection 810 is or includes a multidimensional image that represents a volumetric display, which is a visual representation of an object in at least three physical dimensions, as opposed to simulating depth or multiple dimensions through visual effects. In these embodiments, the same projected object (e.g., an arrow pointing to an asset over a conveyor) looks different from various perspectives (e.g., a side view, versus a front view, versus a back view). For example, a first front view can include a first arrow and first instructions for worker X to pick/sort from. From a second rear view, the same first arrow can include second instructions for worker Y to pick/sort from, etc. - D. Exemplary Location Device
- In various embodiments, one or more locations 400 (and/or 1400) may be associated with one or more (optionally provided)
location devices 415, with both being configured for identifying one or more assets 10 being sorted to each location 400. As non-limiting examples, such locations 400 may include one or more vehicles (e.g., aircraft, tractor-trailer, cargo container, local delivery vehicles, and/or the like), pallets, identified areas within a building, bins, chutes, conveyor belts, shelves, and/or the like. The locations may be sort locations (for transport of the asset for additional movement/handling) or pick locations (for storing of the asset until it needs to be picked or "pulled" for order fulfillment purposes or the like). The one or more location devices 415 (e.g., 415-1 of FIG. 13E) may be attached to a location 400 and/or located more generally within and/or at the location 1400 (see FIGS. 13A-13F). Alternatively, the one or more location devices 415 may be located adjacent to a sort location 400/1400 or otherwise proximate the sort location 400/1400. In various embodiments, a location device 415 may be located proximate to an area designated to store the sort location 400/1400. For example, when the sort location 400 includes a delivery vehicle, a location device 415 may be located above each of a plurality of parking areas designated for one or more delivery vehicles. This may apply equally relative to sort and/or pick locations (e.g., FIGS. 13D-F). - In various embodiments, the one or
more location devices 415 may include components functionally similar to the control system 100 and/or the user device 110. As noted above in referencing the control system 100, the term "computing entity" may refer to, for example, one or more computers, computing entities, desktops, mobile phones, tablets, phablets, notebooks, laptops, distributed systems, gaming consoles (e.g., Xbox, Play Station, Wii), watches, glasses, key fobs, RFID tags, ear pieces, scanners, televisions, dongles, cameras, wristbands, kiosks, input terminals, servers or server networks, blades, gateways, switches, processing devices, processing entities, set-top boxes, relays, routers, network access points, base stations, the like, and/or any combination of devices or entities adapted to perform the functions, operations, and/or processes described herein. Like the user device shown schematically in FIG. 3, the location device 415 can include an antenna, a transmitter (e.g., radio), a receiver (e.g., radio), and a processing element (e.g., CPLDs, microprocessors, multi-core processors, co-processing entities, ASIPs, microcontrollers, and/or controllers) that provides signals to and receives signals from the transmitter and receiver, respectively. - The signals provided to and received from the transmitter and the receiver, respectively, may include signaling information in accordance with air interface standards of applicable wireless systems. In this regard, the
location device 415 may be capable of operating with one or more air interface standards, communication protocols, modulation types, and access types. More particularly, the location device 415 may operate in accordance with any of a number of wireless communication standards and protocols, such as those described above with regard to the control system 100. In a particular embodiment, the location device 415 may operate in accordance with multiple wireless communication standards and protocols, such as UMTS, CDMA2000, 1×RTT, WCDMA, TD-SCDMA, LTE, E-UTRAN, EVDO, HSPA, HSDPA, Wi-Fi, WiMAX, UWB, IR, NFC, Bluetooth™, USB, and/or the like. Similarly, the location device 415 may operate in accordance with multiple wired communication standards and protocols, such as those described above with regard to the control system 100, via a network interface. - Via these communication standards and protocols, the
location device 415 can communicate with various other entities (e.g., the user device 110 and/or the control system 100) using concepts such as USSD, SMS, MMS, DTMF, and/or SIM dialer. The location device 415 can also download changes, add-ons, and updates, for instance, to its firmware, software (e.g., including executable instructions, applications, program modules), and operating system. Of course, it should also be understood, as mentioned previously herein, that certain embodiments utilizing the hands-free form of the user device 110 may not utilize any location devices 415, whereby the location of the various assets (for sorting or picking) may be communicated directly as between the user device 110 and the control system 100, further in conjunction with an environmental mapping capability of the user device 110, as described further below. - E. Exemplary Location
- Referring to
FIGS. 11A-C and 13D-E, an exemplary location 400 (and/or 1400) is schematically depicted. As described above, the location 400 may include one or more vehicles (e.g., aircraft, tractor-trailer, cargo container, local delivery vehicles, and/or the like), pallets, identified areas within a building, bins, chutes, conveyor belts, shelves, and/or the like. In the embodiment depicted in the figures listed above, the location 400 includes a plurality of shelves onto which the assets 10 may be placed and/or from which they may be removed. While these figures depict a specific quantity of shelves as being stacked in a vertical direction, it should be understood that any quantity of shelves may be arranged in any suitable configuration to hold the assets 10. Each of the shelves may include one or more visual indicators (e.g., 715 of FIG. 10) positioned on or proximate to the shelves; however, as mentioned previously herein, certain hands-free user device embodiments may dispense with such indicators. In these and other embodiments, navigational projections 715, much like the navigational projections 810 provided in conjunction with the conveyor belt assembly 800, may assist in identifying an appropriate position for placement and/or removal of the asset 10 within the sort and/or pick location. In particular embodiments, for example, a user 5 (FIG. 12) may utilize the indicia reader or camera of the user device 110 to scan, read, or otherwise receive asset identifier data from the asset 10 to identify, in cooperation with the control system 100, an appropriate position for placement of the asset 10 within the warehouse or facility, namely at the location 400/1400. In other embodiments, the control system 100 may determine the appropriate position for placement of the asset within the warehouse relative to the location 400/1400 and convey that information to the user device 110 in response to the user device having approached an asset or package (e.g., for sorting). In other embodiments, the control system 100 may proactively transmit projections to the user device 110 upon receipt of a package or asset order, requiring "picking" of the asset or package from a pick location for order fulfillment purposes or the like. In still other embodiments, a user 5 may utilize the indicia reader or camera of the user device 110 to scan, read, or otherwise receive asset identifier data from the asset 10 at a location 400 that is associated with a mobile storage area (i.e., a delivery vehicle). In this and other embodiments, as illustrated in FIG. 14A, the delivery vehicle may be configured with a projector 900 that proactively transmits navigational projections, visible to the user, within the physical space of the storage area, analogous to the user device 110. The navigational projections may be visible within the physical space of the storage area so as to aid in the selection (i.e., picking) of an asset or package. The projector 900 may be used in these and other embodiments involving a delivery vehicle in conjunction with or in place of the user device 110. It should be appreciated that the location 400 can be mobile or static. In some aspects, the location 400 can be mobile as it may be associated with the delivery vehicle. For example, the location 400 can be a cargo container associated with the delivery vehicle or within the delivery vehicle itself. In some aspects, the location 400 may be static. For example, the location 400 can be a storage area within a storefront.
Additionally or alternatively, the storage area can be behind a customer counter. As a further example, the location 400 can be within a sorting facility. - Still further, the
control system 100 may determine the appropriate position for placement of the asset 10 within the location 400/1400 based on a variety of factors. For example and without limitation, the control system 100 may determine the appropriate position for placement of the asset 10 within the location 400 based on the destination of the assets 10. - F. Package Car Application
- Turning now with particular focus upon FIGS. 14A-B, an exemplary embodiment illustrates identifying an asset within a storage area of a delivery vehicle. It is contemplated that the term delivery vehicle may refer to any kind of vehicle, such as an automobile, truck, train, or airplane. In this exemplary embodiment, the control system 100 may be configured to communicate with the delivery vehicle (i.e., location 400) in any of the ways and/or manners detailed elsewhere herein. The user 5 may also, in exemplary embodiments, utilize a user device 110 in any of the ways and/or manners detailed elsewhere herein. In further exemplary embodiments, the user 5 may dispense with the user device 110 and rely instead upon instructions communicated via a projector 900 mounted on the delivery vehicle (i.e., location 400). In some embodiments, the projector 900 may be configured to communicate with the control system 100 in a manner and/or way analogous to the communication between the control system 100 and the user device 110, as detailed elsewhere herein. In some aspects, utilizing a projector 900 that is mounted within the storage area can be advantageous, as it removes the need for the user 5 to wear additional equipment. Because the user 5 may be physically active (e.g., carrying assets, entering and exiting the vehicle, climbing stairs at a delivery location), any additional equipment, including the user device 110, could interfere with or encumber the movement of the user 5. Additionally, due to the user's movement, smart glasses may be prone to falling off the wearer's head or obstructing the user's sight, presenting potential safety issues. In addition, a projector 900 mounted within the storage area can rely on a permanent power source, in contrast to a portable device powered by a portable battery, which may require constant recharging or replacement. As such, an in-storage-area-mounted system may be advantageous over a user device 110 in some instances. It should be appreciated that while FIGS. 14A-18 describe a location 400 with respect to the storage area associated with a delivery vehicle, the location 400 can be any physical environment. For example, the location 400 can be a physical environment within a warehouse, a sorting facility, a shopping area of a store, and the like. As such, the projector 900 can be secured within a physical environment of a static location.
- According specifically to the embodiment of
FIG. 14A, in a fashion analogous to the "picking"-focused embodiments detailed elsewhere herein, the location of a set of packages (i.e., assets 10 a-d) may be illuminated by one or more navigational projections 901. It should be appreciated that, while not shown, the location of the set of packages may be highlighted by a user device (such as user device 110 of FIG. 4) that generates one or more navigational projections. It should be understood that the one or more navigational projections 901, as illustrated in FIG. 14A, may be configured according to various embodiments to operate and provide navigational guidance to a user 5 in substantially the same way and/or manner as the one or more navigational projections 715 detailed elsewhere herein.
- For purposes of background, the current process for drivers or personnel on a delivery vehicle (i.e., location 400) involves a manual sequencing of
loading assets 10 inside the delivery vehicle. When delivering an asset 10, the driver or personnel must sort through the assets to determine the correct asset to deliver. Even though assets are conventionally loaded following a delivery sequence, the drivers or personnel must still spend some degree of time finding the right box at each stop; oftentimes, additional boxes for a particular stop may be inadvertently overlooked. To alleviate and address the deficiencies of the current process's manual nature, the exemplary embodiment described herein utilizes augmented reality techniques (as described elsewhere herein) to highlight, at each stop, which asset(s) 10 is to be picked by the drivers or personnel. This may be achieved, in certain embodiments, without utilization of smart glasses; alternative mechanisms include the user device 110 (also as described elsewhere herein) and/or a projector 900 mounted inside the delivery vehicle (i.e., location 400). As in the various embodiments described elsewhere herein, pattern recognition and machine learning may be utilized for the control system 100 to, over time, improve and understand how best to project and identify individual assets 10 and to build a three-dimensional representation of the assets based on two-dimensional images and/or three-dimensional sensor-captured data. In at least one embodiment, the projector 900 may be utilized to provide navigational projections (e.g., "light" the asset) to the asset(s) 10 for picking at a particular service stop; this may be done in conjunction with—or in place of—a "lighting" of the asset(s) 10 via the user device 110, as detailed elsewhere herein.
- According to various embodiments, the system may also, in addition to the
projector 900, have multiple components, including a scanning device 1500 (e.g., a video camera, a three-dimensional sensor or camera, a LIDAR scanner, a tomography scanner, a wireless RF signal scanner, or the like). The scanning device 1500 can be used to capture the size and shape of the asset 10 as it is loaded into the storage area and placed onto shelves (i.e., specific locations). Any asset 10 that is added, stored, or removed can be captured by the scanning device 1500. The captured information can then be processed via one or more computer processors associated with the scanning device, the control system 100, and/or the projector 900. Pattern recognition, machine learning, and/or AI-based algorithms may be utilized to identify, from the scanning information, the shape, size, and position of each asset stored. Additionally or alternatively, the control system 100 may determine an asset identifier from the scanning information (which could be determined through image recognition that identifies a label on an asset and/or via wireless sensors detecting an RFID signal, or the like). As described above, it should be appreciated that while FIGS. 14A-18 describe a location 400 with respect to the storage area of a delivery vehicle, the location 400 can be any physical environment. For example, the location 400 can be a storage area within a warehouse, a sorting facility, a shopping area of a store, and the like. As such, the scanning device 1500 can be secured within a storage area of a static location.
- In some embodiments, the
control system 100 can cause the scanning device 1500 to capture scanning information of a physical environment of a storage area. Based on the obtained scanning information, the control system 100 can track the addition, movement, or removal of the asset 10 from the storage area. For example, the asset 10 may be added to the storage area. The addition of the asset 10 can be detected by the control system 100 via one or more scanning devices 1500. As a further example, the asset 10 may slide as a result of the delivery vehicle stopping, turning, or accelerating, or a user may move a particular parcel to a different location so as to reach a different parcel. The control system 100 can cause the scanning device 1500 to capture scanning information and then analyze that scanning information to determine that the asset 10 has moved to a new location. The control system 100 can store the asset's location within a location database. As described in more detail below, when the user 5 needs to pick the asset 10 upon the delivery vehicle arriving at a delivery location associated with a delivery destination of the asset 10, the projector 900 can provide navigational projections to guide the user 5 to the particular parcel. For example, the projector 900 may be configured to illuminate the tracked/monitored/identified asset 10, as detailed elsewhere herein.
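- By way of a non-limiting illustration only, the tracking behavior described above amounts to diffing each new scan against the stored asset locations. The following minimal sketch is an editorial aid, not the patented implementation; the names (AssetObservation, track_scan) and the in-memory dictionary standing in for the asset location database are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class AssetObservation:
    asset_id: str   # e.g., decoded from a label or an RFID signal
    x: float        # position within the storage area, in feet
    y: float
    z: float

# In-memory stand-in for the asset location database.
asset_locations: dict = {}

def track_scan(observations: list, tolerance_ft: float = 0.5) -> None:
    """Diff one scan against stored locations to detect additions,
    movements, and removals of assets within the storage area."""
    seen = set()
    for obs in observations:
        seen.add(obs.asset_id)
        new_pos = (obs.x, obs.y, obs.z)
        old_pos = asset_locations.get(obs.asset_id)
        if old_pos is None:
            print(f"asset {obs.asset_id} added at {new_pos}")
        elif max(abs(a - b) for a, b in zip(old_pos, new_pos)) > tolerance_ft:
            print(f"asset {obs.asset_id} moved from {old_pos} to {new_pos}")
        asset_locations[obs.asset_id] = new_pos
    for asset_id in list(asset_locations):
        if asset_id not in seen:
            print(f"asset {asset_id} removed")   # e.g., pulled for delivery
            del asset_locations[asset_id]
```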
- Referring to FIG. 15A, one or more scanning devices 1500 can be used to obtain location data for an asset within an environment (e.g., a storage area or a cargo area). As described herein, the scanning device 1500 may comprise one or more sensors. For example, the scanning device 1500 can include a three-dimensional sensor, an image sensor (e.g., video or camera), a laser sensor (e.g., for use in LIDAR), an infrared sensor, a motion detector, and the like. The scanning device(s) 1500 can be secured to and/or mounted within the location 400 in any manner. The scanning device 1500 can be mounted in a mobile or fixed fashion. In some aspects, the scanning device 1500 can be a mobile 3D sensor that can sweep the location 400 to identify one or more assets. For example, the scanning device 1500 may be a component within the projector 900 that rotates. It should be appreciated that the one or more scanning devices 1500 may be secured along the ceiling, walls, or shelves within the storage area (e.g., location 400). It should be appreciated that the terms secured and/or mounted may refer to a permanent coupling or a temporary coupling such that the scanning device may be removed. Data obtained by the scanning device can then be analyzed (e.g., by the control system 100) to generate location data for an asset within the storage area.
- In some embodiments, the scanning device 1500 can be mounted to a shelving unit. The shelving unit can be positioned such that a face of one shelf is opposite the face of another shelf. As shown in
FIG. 15A, the face of a left shelving unit 410L is opposite the face of the right shelving unit 410R. While only two shelving units are depicted, the storage area may comprise rows of shelving units, as shown in FIG. 8.
- In some embodiments, one or more scanning devices 1500 can be mounted within the
location 400 so as to have a field of view of a shelving unit. As described above, the one or more scanning devices 1500 can be mounted along the ceiling, walls, or shelves within the storage area (e.g., location 400). In some aspects, the one or more scanning devices 1500 can be mounted to a particular shelving unit so as to have a field of view of an opposing shelving unit. For example, the scanning device 1500 can be coupled to a front face of the shelving unit. Additionally or alternatively, the one or more scanning devices 1500 can be positioned in series along a front face of an individual shelf. As shown in FIG. 15B, scanning devices 1500 can be positioned in series along a plurality of shelves. For instance, the scanning devices 1500 can be positioned in series along a first shelf 1530 and a second shelf 1540.
- In some embodiments, the sensors associated with each scanning device 1500 have a field of view of an opposing shelving unit. As shown in
FIG. 15C, the series of scanning devices 1500 n can be positioned along the shelves of the left shelving unit 410L such that the field of view of each sensor produces a combined field of view 1520 of one or more shelves of the right shelving unit 410R. While not shown, a second series of scanning devices 1500 a-f can be positioned along the right shelving unit 410R to provide a combined field of view of the left shelving unit 410L. Positioning the scanning device(s) 1500 along a shelf of an opposing shelving unit can be advantageous because it maximizes the field of view of each scanning device 1500. This is in contrast to wall- or ceiling-mounted scanning devices, which may have sensors that suffer from a limited field of view. For example, a ceiling-mounted scanner might not be able to detect an asset that is positioned to the rear of the shelf. As shown in FIG. 15B, the series of scanning devices 1500 a-f can be spaced apart horizontally (e.g., along the x-axis) and vertically (e.g., along the y-axis). Additionally, while the scanning devices 1500 n are illustrated as being aligned vertically or horizontally, in some embodiments, the scanning devices are not aligned as such.
- Utilizing data obtained from the scanning device 1500, the
control system 100 can generate location data for the asset 10 representing the asset's physical location within the location 400. The asset's location data can then be stored in an asset location database. The control system 100 can update the asset location database based on determining that an asset has been added, moved, or removed from the storage area. In some aspects, the control system 100 can detect the addition (or removal) of the asset based on analyzing data obtained via the scanning device 1500. For instance, the scanning device 1500 can detect (or no longer detect) an asset identifier transmitted via a wireless signal generated by an RFID tag. Additionally or alternatively, the control system 100 can determine that the scanning information no longer includes a visual pattern (e.g., a QR code, particular dimensions of the asset) associated with a previously identified asset.
- In some embodiments, to ensure that the scanning devices 1500 generate accurate location data for the asset, the
control system 100 can determine whether there is an object (theuser 5, a truck loader, etc.) in the storage area. Thecontrol system 100 can then determine that the object may interfere with capturing scanning information. By way of example, thecontrol system 100 can detect an object (e.g., the user 5) is in the aisle based on scanning information received via the scanning device(s), such as scanning information captured by an image sensor, a motion sensor, a thermal image sensor, or the like. Based on detecting an object in the aisle, thecontrol system 100 can determine that the object has interfered or will interfere with the scanning information. As such, thecontrol system 100 can delay analyzing the scanning information so as to determine an asset's location. If thecontrol system 100 determines that no object is detected, thecontrol system 100 can analyze the obtained scanning information to generate asset location data. The asset location data can then be stored in an asset location database, which can be updated over time as new scanning information is obtained. In some embodiments, thecontrol system 100 instructs the scanning device 1500 to receive scanning information based on determining that no object is detected. - In some embodiments, the
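- Illustratively, the gating described in this paragraph defers analysis while an object occupies the aisle. A minimal, hypothetical sketch follows, with a toy occupancy test standing in for whichever image/motion/thermal detection a given deployment actually uses:

```python
import time

def aisle_is_occupied(frame: dict, motion_threshold: float = 0.1) -> bool:
    """Toy occupancy test: `frame` is assumed to carry a motion score in
    [0, 1] computed upstream (e.g., by frame differencing or a thermal
    sensor); anything above the threshold counts as an object in the aisle."""
    return frame["motion_score"] > motion_threshold

def acquire_clean_scan(capture_frame, max_wait_s: float = 30.0, poll_s: float = 1.0):
    """Poll the scanner until the aisle is clear, then return a frame suitable
    for asset-location analysis; return None if the aisle never clears."""
    deadline = time.monotonic() + max_wait_s
    while time.monotonic() < deadline:
        frame = capture_frame()
        if not aisle_is_occupied(frame):
            return frame      # safe to analyze and update the location database
        time.sleep(poll_s)    # object detected: delay analysis, scan again later
    return None
```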
- In some embodiments, the control system 100 can generate asset location data based on position data for the scanning device. Among other things, the position data can define the physical location of the scanning device 1500 within the storage area. For example, referring to FIG. 15A, the control system 100 can store position data for each scanning device 1500 a-f. In some aspects, the position data can include a measured distance with respect to a point of origin. The control system 100 can utilize the point of origin as a reference to generate a value (e.g., coordinates) for the asset location based on the location of the scanning device.
- In some embodiments, the point of origin is associated with a physical location of a particular scanning device. For instance, the point of origin can be associated with the physical location of scanning
device 1500 a. In some aspects, the position data for each scanning device 1500 b-i can be determined with respect to the point of origin (e.g., scanning device 1500 a). The position data can include a measured distance (e.g., along the x-axis, y-axis, and z-axis) of each scanning device with respect to the point of origin. The control system 100 can then utilize the point of origin to determine a value (e.g., coordinates) for the asset's location.
- By way of example, the
control system 100 can obtain scanning information from scanning device 1500 b. The control system 100 can then account for the position of the scanning device when analyzing the scanning information. For instance, the control system 100 can analyze the scanning information received from scanning device 1500 b based on the position of scanning device 1500 b. The control system 100 can then generate location data for the asset 10 from the scanning information obtained for each of the scanning devices 1500 n. The asset location can then be stored in an asset location database.
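- Concretely, the bookkeeping in the preceding three paragraphs is a translation of each scanner-relative detection by that scanner's stored offset from the point of origin (scanning device 1500 a in the example above). A minimal sketch follows; the coordinates are invented purely for illustration:

```python
# Position data: measured offsets (x, y, z), in feet, of each scanning
# device with respect to the point of origin (scanning device 1500a here).
SCANNER_POSITIONS = {
    "1500a": (0.0, 0.0, 0.0),   # the point of origin itself
    "1500b": (4.0, 0.0, 0.0),   # e.g., 4 ft farther down the aisle
    "1500c": (8.0, 0.0, 0.0),
}

def to_storage_area_coords(scanner_id, relative_detection):
    """Convert a detection expressed relative to one scanner into
    storage-area coordinates anchored at the point of origin."""
    sx, sy, sz = SCANNER_POSITIONS[scanner_id]
    dx, dy, dz = relative_detection
    return (sx + dx, sy + dy, sz + dz)

# Scanner 1500b sees an asset 1 ft to its right and 2 ft below it:
print(to_storage_area_coords("1500b", (1.0, -2.0, 0.0)))  # -> (5.0, -2.0, 0.0)
```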
- In some embodiments, the scanning device 1500 may comprise a wireless signal reader that can determine the location of the asset 10 through wireless signals. By way of example, each asset 10 can be equipped with a tag (e.g., a microchip coupled to an antenna) that emits a wireless RF signal that is received by one or more tag readers associated with the scanning device 1500. The wireless signal emitted by the tag can then be used to determine the location of the asset 10 within the location 400. For instance, as known in the art, a distance between the tag and a tag reader can be determined through received signal strength indicator (RSSI) triangulation, Time Difference of Arrival (TDOA), and the like.
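- As one conventional possibility (consistent with, but not dictated by, the RSSI/TDOA techniques named above), tag-to-reader distances can be estimated with a log-distance path-loss model and the tag position recovered by least-squares trilateration. In the sketch below, the reference power at 1 m and the path-loss exponent are assumed values, and the linearized solve is one of several standard formulations:

```python
import numpy as np

def rssi_to_distance_m(rssi_dbm, rssi_at_1m_dbm=-50.0, path_loss_n=2.0):
    """Log-distance path-loss model: d = 10 ** ((P_1m - RSSI) / (10 * n))."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_n))

def trilaterate(readers, distances):
    """Linearized least-squares position fix from reader coordinates
    (an n x 3 array, n >= 4) and estimated tag-to-reader distances.
    Subtracting the first sphere equation from the rest yields a
    linear system A x = b in the unknown tag position x."""
    readers = np.asarray(readers, dtype=float)
    distances = np.asarray(distances, dtype=float)
    p0, d0 = readers[0], distances[0]
    A = 2.0 * (readers[1:] - p0)
    b = (d0**2 - distances[1:]**2
         + np.sum(readers[1:]**2, axis=1) - np.sum(p0**2))
    return np.linalg.lstsq(A, b, rcond=None)[0]
```

A TDOA variant would replace the distance estimates with arrival-time differences but lead to a similar least-squares solve.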
- In some embodiments, the control system 100 can identify a particular asset based on asset characteristics captured by the scanning device 1500. For example, the control system 100 can analyze the scanning information to identify a particular visual pattern or characteristic associated with the asset (e.g., a QR code, particular dimensions of the asset, or particular markings on the asset). Additionally or alternatively, the control system 100 can analyze the scanning information to identify an RF signal emitted from an RFID tag associated with the asset. The control system 100 can then utilize this information to identify the particular asset. It should be appreciated that the identified asset can be associated with a unique identifier (e.g., an alphanumeric code). The control system 100 can then store the location data in association with the identified asset (e.g., associating the location data with the unique identifier). Additionally or alternatively, once the asset has been identified, the control system 100 can determine a delivery location associated with the identified asset. For instance, the control system 100 can reference a database that comprises a delivery location for each identified asset.
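- At bottom, this identification step is a keyed lookup: map whatever distinguishing characteristic the scan yields (QR payload, RFID code, dimensions) to the unique identifier, then resolve that identifier against a delivery-location table. A minimal sketch; the sample identifiers are invented, and the addresses simply echo the examples used later in this section:

```python
# Hypothetical lookup table; in practice this would be a database keyed
# by the unique (e.g., alphanumeric) asset identifier.
DELIVERY_LOCATIONS = {
    "1Z-0001": "123 Main Street, Atlanta, GA",
    "1Z-0002": "345 Broad Street, Los Angeles, CA",
}

def identify_asset(qr_payload=None, rfid_code=None):
    """Either cue is assumed to carry the unique asset identifier;
    prefer the optically read payload when both are available."""
    return qr_payload or rfid_code

asset_id = identify_asset(qr_payload="1Z-0001")
print(asset_id, "->", DELIVERY_LOCATIONS.get(asset_id))
```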
- In some embodiments, the control system 100 causes a projector 900 to generate one or more navigational projections that identify an asset to be pulled. As described herein, the one or more navigational projections provided by the projector 900 create a visual cue for the physical location of the asset 10. In other words, the projector 900 can illuminate a portion of the environment of the delivery vehicle so as to guide the user 5 to the particular asset.
- In some embodiments, the
projector 900 can include one or more light sources mounted within the storage area (e.g., location 400). The one or more light sources can be mounted at any location within the storage area, including a ceiling, a wall, a floor, or a shelving unit. In some aspects, as shown in FIG. 14A and FIG. 16, the projector 900 can be a light source that is mounted to the ceiling of the storage area. In some aspects, the one or more light sources can be mounted along the shelves of the shelving unit 410 (e.g., along a front surface of a shelf). It should be appreciated that the projector 900 may include a light source that projects a structured light in a particular direction. Additionally or alternatively, the projector 900 may be a light source that generates omnidirectional light.
- The light source can be a single, centralized light source or a plurality of distributed light sources. In some embodiments, the light source can be activated, by the
control system 100, to illuminate a portion of the environment to identify the physical location of the asset to be pulled. Any light source may be used, including a halogen light source, an LED light source, a laser light source, or the like. The light source can be a stationary light source or a rotatable light source. It should be appreciated that a stationary light source can eliminate the need for any moving parts, which can be beneficial in some instances. For example, if the storage area is a portion of a delivery vehicle, the movement of the vehicle, combined with the movement of a rotatable light source, can hinder accurate placement of the navigational projection. As such, a stationary light source may be preferred. Still, in some instances, the rotatable light source may be preferred: a rotatable light source can reduce installation time, as it may eliminate the installation of multiple stationary light sources.
- In some embodiments, the
control system 100 can determine the location of the navigational projection within the physical environment of the storage area (e.g., location 400). For example, the control system 100 can determine the location of the navigational projection based on the asset location data. The control system 100 can reference an asset location stored in the asset location database and cause a navigational projection to be presented proximate to the physical location of the environment that is associated with the asset location data.
- In some embodiments, the
control system 100 can activate the projector 900 to generate a navigational projection in a particular portion of the physical environment. For example, the control system 100 can cause a centralized light source to project light onto or near the surface of the asset. As a further example, the control system 100 can selectively activate one or more light sources positioned near the asset. As described above, the projector 900 may comprise a plurality of light sources that are distributed throughout the storage area. The control system 100 can activate the projector 900 to generate a navigational projection in a particular portion of the environment by selectively activating a light source among the plurality of light sources. For instance, based on determining the location of the asset 10 to be pulled, the control system 100 can selectively activate a particular light source that is mounted proximate to the determined location of the asset 10. That is, based on the determined location of the asset, the control system 100 can reference a physical location of the one or more light sources and selectively illuminate a light source that is located near the generated asset location. It should be appreciated that the navigational projection may illuminate an area within 0-20 feet of the surface of the asset.
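- Selective activation, as described above, is in essence a nearest-neighbor choice over the known mounting positions of the distributed light sources. A minimal sketch; the names and coordinates are invented:

```python
import math

# Mounted positions (x, y, z), in feet, of the distributed light sources.
LIGHT_POSITIONS = {
    "shelf_light_1": (2.0, 6.5, 0.0),
    "shelf_light_2": (6.0, 6.5, 0.0),
    "ceiling_light_1": (4.0, 8.0, 3.0),
}

def nearest_light(asset_pos):
    """Return the identifier of the light source mounted closest to the
    determined asset location."""
    return min(LIGHT_POSITIONS,
               key=lambda name: math.dist(LIGHT_POSITIONS[name], asset_pos))

print(nearest_light((5.5, 6.0, 0.5)))  # -> "shelf_light_2"
```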
- In some embodiments, the control system 100 can mechanically control the direction of the light emitted by the projector 900. In some aspects, the control system 100 can generate a navigational projection in a particular portion of the physical environment by mechanically controlling the position of the projector 900. By way of example, as shown in FIG. 16, the projector 900 may comprise a centralized light source 1600 that is rotatable horizontally 1630 (e.g., about the x-axis) or vertically 1640 (e.g., about the y-axis). In some aspects, the light source 1600 can be rotated by one or more stepper motors 1610, 1620. In some aspects, to ensure that the projector 900 projects light in the correct direction, one or more encoders (e.g., a light encoder or a magnetic encoder) can provide feedback to the control system 100 as to the orientation of the rotatable centralized light source. Additionally or alternatively, the rotatable centralized light source can be recalibrated or reset based on returning to an original position. In some aspects, the control system 100 can control the direction of the light that is emitted by the projector 900 by controlling the position of a mirror. For example, the control system 100 can cause a mirror to rotate such that the mirror redirects light emitted from a fixed, centralized light source toward a particular portion of the physical environment, thereby indicating a particular asset to be pulled.
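- Geometrically, aiming the rotatable centralized light source reduces to converting the projector-to-asset vector into a pan angle and a tilt angle, then driving the stepper motors to those setpoints (with the encoders closing the loop). A sketch under the assumption, made purely for illustration, that the z-axis is vertical:

```python
import math

def aim_angles(projector_pos, asset_pos):
    """Pan (rotation in the horizontal plane) and tilt (elevation) angles,
    in degrees, that point the projector's beam at the asset."""
    dx = asset_pos[0] - projector_pos[0]
    dy = asset_pos[1] - projector_pos[1]   # horizontal axes: x, y
    dz = asset_pos[2] - projector_pos[2]   # vertical axis: z
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# Ceiling-mounted projector aiming down at an asset on a low shelf:
print(aim_angles((0.0, 0.0, 3.0), (2.0, 2.0, 0.5)))  # ~ (45.0, -41.5)
```

The mirror-based variant described above would apply the same geometry to the mirror's orientation rather than to the light source itself.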
- In some embodiments, as shown in FIG. 14, the navigational projection is an illumination of the asset 10. For example, the control system 100 can cause a rotatable, centralized light source to shine a light onto a surface of the asset 10, thereby illuminating the asset 10. As a further example, the control system 100 can selectively activate a stationary light source to project a light onto a surface of the asset 10. A user can then quickly identify the illuminated asset 10 to be pulled without having to take the time to manually look through the storage area.
- In some embodiments, the navigational projection can indicate that the
asset 10 is behind a different asset. As shown in FIG. 17, the asset 10 to be pulled is positioned behind another asset 1700. In some aspects, the control system 100 can determine that the location of the asset 10 is behind another asset 1700. For example, during loading of the delivery vehicle, the control system 100 may determine, based on scanning information, that asset 1700 has been placed in front of the asset 10. For example, if the scanning device 1500 relies on image sensors or depth sensors to determine the location of the asset, the control system 100 can enter a loading state. During the loading state, the control system 100 can assume that any asset loaded will remain within the cargo area. The control system 100 can obtain first scanning information and identify that the asset 10 has been loaded. The control system 100 can then obtain updated scanning information and identify that an asset 1700 has been placed in a similar location as asset 10. The control system 100 can then assume that asset 10 has been pushed to the rear of asset 1700. Additionally or alternatively, the control system 100 can receive data from a user device indicating that asset 10 is located behind asset 1700. In some embodiments, the control system 100 can receive scanning information wirelessly and determine that the asset 10 is located to the rear of asset 1700. Based on determining that asset 10 is behind asset 1700, the control system 100 can then cause the projector 900 to generate a distinct navigational projection or otherwise alter the navigational projection to indicate that the asset 10 is located behind another asset 1700. For example, a light illuminating asset 1700 may blink. As a further example, the distinct navigational projection can be a particular color, symbol, or pattern to indicate that the user should look behind the asset 1700 to find the asset 10 to be pulled.
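- The loading-state inference can be approximated by marking any earlier asset whose shelf footprint laterally overlaps a newly placed asset as occluded, so that its projection can later be altered (e.g., blinked). A deliberately simplified, hypothetical sketch:

```python
from dataclasses import dataclass

@dataclass
class PlacedAsset:
    asset_id: str
    x: float          # lateral position on the shelf, in feet
    width: float      # lateral extent, in feet
    occluded_by: str = ""   # set when another asset is placed in front

def register_placement(loaded: list, new: PlacedAsset) -> None:
    """Loading-state assumption: any earlier asset whose footprint overlaps
    the newly placed asset laterally is presumed pushed behind it, so its
    navigational projection can later be rendered distinctly."""
    for prior in loaded:
        laterally_overlaps = abs(prior.x - new.x) < (prior.width + new.width) / 2
        if laterally_overlaps:
            prior.occluded_by = new.asset_id
    loaded.append(new)
```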
- In certain embodiments, either upon request from the driver or personnel or automatically based upon—as a non-limiting example—GPS positional data associated with the delivery vehicle (i.e., location), the control system 100 will communicate—to the projector 900 and/or the user device 110—instructions identifying which of one or more assets 10 should be highlighted as the driver or user 5 enters the package-storing portion of the delivery vehicle. As described herein, in some embodiments, asset locations can be known based upon the three-dimensional representation of the assets 10 mapped via the control system 100, with capabilities including mapping of shelving, floor, and/or aisle locations for various assets 10.
- In some embodiments, the
control system 100 can monitor the location of the delivery vehicle. For example, the control system 100 can obtain location data from one or more location modules. The control system 100 can then utilize the location data to indicate the physical location of a delivery vehicle. It should be appreciated that the location module (e.g., a GPS location module) can be a component of the control system 100, the delivery vehicle, or a user device. Based on location data obtained from the location module, the control system 100 can determine the vehicle's proximity to a delivery destination of a particular asset stored in the delivery vehicle. The control system 100 can determine the delivery destination for any particular asset from a database associating the asset identifier of the particular asset with its respective delivery destination. The control system 100 can thus receive an asset identifier associated with each asset being transported and determine the delivery destination of each asset.
- Continuing, in some embodiments, the
control system 100 can determine whether a detected location is associated with a delivery location of an asset. For instance, the control system 100 can determine whether the delivery vehicle is within a predefined threshold of a delivery location associated with an asset. The predefined threshold can be any threshold distance, ranging from a foot up to several miles. If the delivery vehicle's location is within the predefined threshold, the control system 100 can determine the one or more assets to be pulled. The control system 100 can then direct the user 5 to the asset 10 to be pulled for delivery to a location (e.g., a house, apartment, building, or smart locker) proximate the vehicle's physical location.
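- This proximity test is, in effect, a geofence check: compare the great-circle distance between the vehicle's GPS fix and each asset's delivery destination against the predefined threshold. A sketch using the haversine formula; the 300-foot threshold is illustrative only:

```python
import math

def haversine_ft(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in feet."""
    earth_radius_ft = 20_902_231  # mean Earth radius (~6,371 km) in feet
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_ft * math.asin(math.sqrt(a))

def assets_to_pull(vehicle_fix, destinations, threshold_ft=300.0):
    """Asset IDs whose delivery destination lies within the predefined
    threshold distance of the vehicle's current location."""
    return [asset_id for asset_id, dest in destinations.items()
            if haversine_ft(*vehicle_fix, *dest) <= threshold_ft]
```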
- In some embodiments, the control system 100 can cause one or more navigational projections to be generated based on predefined conditions. In some aspects, the predefined conditions may be associated with the state of the delivery vehicle. For example, the control system 100 can determine whether the delivery vehicle's gear has been placed in park. If so, the control system 100 can automatically cause the projector 900 to generate the one or more navigational projections. As another example, the control system 100 can determine that the delivery vehicle's door has been opened (e.g., the opening of a door to the storage area or the driver-side door). In some aspects, the predefined condition may be associated with receiving a command signal. For example, the user 5 may arrive at a particular stop and activate a command signal (e.g., through a user device or a switch mounted within the delivery vehicle). Based on receiving the command signal, the control system 100 can cause one or more navigational projections to be generated.
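- These predefined conditions reduce to a disjunction over vehicle-state events. A minimal sketch with an assumed (hypothetical) vehicle-state record:

```python
def should_project(vehicle_state: dict) -> bool:
    """Activate the projector when any predefined condition holds."""
    return (vehicle_state.get("gear") == "park"
            or vehicle_state.get("storage_door_open", False)
            or vehicle_state.get("command_signal", False))

print(should_project({"gear": "drive"}))                          # False
print(should_project({"gear": "park"}))                           # True
print(should_project({"gear": "drive", "command_signal": True}))  # True
```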
- In various embodiments, the control system 100 can exchange asset-related data with a user device regarding the handling of the asset. For example, the control system 100 can receive asset-related data from a user device (e.g., a handheld computing device or the user device 110) indicating that a particular asset has been loaded or unloaded. In some aspects, the asset-related data can include an asset identifier, dimensions, a weight, or a delivery destination. The control system 100 can receive asset-related data during the loading of the asset, which can then be used to determine the location of the asset. For example, the control system 100 can analyze scanning information for an asset having a particular asset identifier or having particular dimensions. As a further example, the control system 100 can receive asset-related data related to the unloading of the asset from the cargo area. The control system 100 can utilize the asset-related data to determine that the asset has been removed and that the asset will be absent from any further scanning information.
- Turning now to
FIG. 18, an exemplary flow diagram 1800 shows a process of locating an asset. At Block 1810, a scanning device is initialized. For instance, the control system 100 can instruct a scanning device 1500 to begin obtaining scanning information regarding a cargo container (e.g., location 400) of a delivery vehicle. In some embodiments, the control system 100 can instruct a plurality of scanning devices to obtain information regarding the cargo area. In some aspects, the control system 100 can initialize the scanning device based on determining that no object will interfere with or disrupt the scanning information obtained by the scanning device. The scanning device can then capture scanning information. For instance, as described herein, the scanning device 1500 can receive scanning information through image sensors, depth sensors, wireless sensors, and the like. It should be appreciated that while the exemplary flow diagram 1800 refers to a cargo container of a delivery vehicle, the steps could be performed with respect to any location.
- At
Block 1820, it can be determined that an asset is located within a cargo container. In some embodiments, the control system 100 can determine that an asset is located within the cargo container (e.g., location 400) based on scanning information obtained from a scanning device (e.g., scanning device 1500). It should be appreciated that the control system 100 can receive, from a scanner (e.g., the scanning device 1500 or a handheld scanner), scanning information including an asset identifier associated with the asset and a defined delivery location. In some aspects, the control system 100 can determine an asset identifier based on analyzing the scanning information. For instance, the control system 100 can analyze the scanning information for distinguishing characteristics of the asset, such as a distinct visual aspect associated with the asset (such as an alphanumeric code, a QR code, dimensions of the asset, symbols, and the like) or a wireless signal (e.g., an RFID signal communicating an asset identifier). Based on the scanning information, the control system 100 can determine that a particular asset is located within the cargo container. As described herein, the control system 100 can also determine a particular location for the particular asset. It should be appreciated that the control system 100 can reference a database linking the asset identifier and the particular delivery location so as to determine a delivery destination for the asset.
- At
Block 1830, a current location of the delivery vehicle is obtained. For example, the control system 100 can utilize location data that is detected from one or more location modules. The location module (e.g., a GPS location module) can be a component of the control system 100, the delivery vehicle, or a user device.
- At
Block 1840, a projection device can be activated. For example, based on the control system 100 determining that a current location of the delivery vehicle is within a predefined range or threshold of a delivery destination, the control system 100 can activate a projection device 900 to emit a projection that corresponds to a determined position of the asset. As described herein, the position of the asset can be determined based on the scanning information obtained from the scanning device 1500. In some embodiments, the scanning information is a scanned position of the asset within the storage area. For example, the scanning device 1500 can detect a particular position of the asset relative to the scanning device 1500 based on one or more sensors, such as a depth sensor, an image sensor, a wireless signal sensor, and the like.
- In some aspects, the projection is emitted or generated by a light source associated with the
projection device 900. For example, the control system 100 can activate the projector 900 to generate a navigational projection in a particular portion of the environment by selectively activating a light source among the plurality of light sources. For instance, based on determining the location of the asset 10 to be pulled, the control system 100 can selectively activate a particular light source that is associated with the determined location of the asset 10. That is, based on the determined location of the asset 10, the control system 100 can reference a physical location of the one or more light sources and selectively illuminate a light source that is located near the determined asset location. As a further example, the control system 100 can cause a centralized light source to project light onto or near the surface of the asset. It is contemplated that the projected light can be within 0-20 feet of the surface of the asset.
- In some embodiments, the control system modifies projection coordinates of the projection device based on the scanned position and the determination that the current location is within the threshold distance of the defined delivery destination. For example, the
control system 100 can determine a placement of a projection within the storage area from scanning information received from the scanning device 1500. The control system 100 can then cause a centralized light source of the projector 900 to rotate so as to point in a particular direction. The control system can then instruct the projector 900 to emit the projection based on the modified projection coordinates.
- In some embodiments, additional scanning information is obtained. For example, the
control system 100 can instruct the scanning device 1500 to obtain additional scanning information. In some aspects, the control system 100 can instruct the scanning device 1500 to obtain additional scanning information based on predetermined conditions, such as determining that a vehicle door has been closed, determining that the vehicle is moving or has stopped, determining that a particular time interval has elapsed, and the like. The additional scanning information can include an updated scanned position of the asset within the storage area. This scanning information can then be analyzed by the control system 100 and stored in the asset location database. It should be appreciated that the control system 100 can modify the projection coordinates based on the updated scanned position.
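- Tying Blocks 1810-1840 together: scan, locate, geofence, aim, project. The sketch below strings the earlier illustrative helpers (asset_locations, haversine_ft, aim_angles) into a single pass; DELIVERY_COORDS and the scanner/vehicle/projector objects are likewise hypothetical stand-ins for hardware and database facades, not elements of the disclosure:

```python
DELIVERY_COORDS = {}  # hypothetical: asset_id -> (lat, lon) of its destination

def locate_and_highlight(scanner, vehicle, projector, threshold_ft=300.0):
    """One pass of a FIG. 18-style flow, reusing the sketches above."""
    # Blocks 1810/1820: initialize the scanner, determine which assets are
    # in the cargo container, and refresh the asset location database.
    for obs in scanner.capture():          # yields AssetObservation-like records
        asset_locations[obs.asset_id] = (obs.x, obs.y, obs.z)

    # Block 1830: obtain the vehicle's current location from a GPS module.
    lat, lon = vehicle.gps_fix()

    # Block 1840: for each asset whose destination lies within the threshold,
    # aim the projector at its scanned position and emit the projection.
    for asset_id, pos in asset_locations.items():
        dest = DELIVERY_COORDS.get(asset_id)
        if dest and haversine_ft(lat, lon, *dest) <= threshold_ft:
            pan, tilt = aim_angles(projector.mount_position, pos)
            projector.point_to(pan, tilt)  # stepper setpoints, encoder-verified
            projector.illuminate()
```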
- In various embodiments, the control system 100 may comprise a plurality of modules, each module configured to perform at least a portion of the functions associated with the methods described herein. For example, the control system 100 may comprise an acquisition module, a location module, and a notification module. Although described herein as being individual components of the control system 100, the various modules may operate on a combination of one or more devices (e.g., the user device 110, the acquisition/display entity 804 (for capturing the asset 10 information), the location device 415 (where provided), and/or the control system 100), such that each device performs the functions of one or more modules.
- A. Acquisition Module
- In various embodiments, the acquisition module may be configured to obtain asset identifier data associated with an
asset 10 to be sorted and/or picked. This asset identifier data may be obtained, in part, via an order placed by a customer desiring transport and delivery (e.g., picking, as a first step) of the asset or package. In other embodiments, the asset identifier data may be obtained, in part, via the acquisition/display entity 804 associated with a conveyor belt or the like, transporting packages or assets to a sort location.
- In various embodiments, the asset identifier data may comprise a unique asset identifier such as a tracking number or code, and data defining the one or more
appropriate locations 400 for the asset 10 as it moves between an origin and a destination, and/or the like.
- As a non-limiting example, the acquisition module may be configured to obtain data from the user device 110 (e.g., of
FIGS. 3 and 4) and/or the acquisition device 810 (e.g., of FIG. 12). In various embodiments, the data received from the user device 110 and/or the acquisition device 810 may include the entirety of the asset identifier data, and therefore the acquisition module need only receive asset identifier data from one of the user device 110 and/or the acquisition device 810. However, in various embodiments, the data received from the user device 110 (FIGS. 3 and 4) and/or the acquisition device 810 (FIG. 12) may comprise only a portion of the asset identifier data, and the acquisition module may be configured to obtain the remainder of the asset identifier data from one or more other sources. As another non-limiting example, the acquisition module may be configured to search one or more databases in communication with the control system 100 for asset identifier data corresponding to the data received from the user device 110 and/or the acquisition device 810. The acquisition module may additionally or alternatively be configured to receive and store at least a portion of the asset identifier data corresponding to the asset 10 that is stored in one or more databases.
- In various embodiments, the acquisition module may be configured to transmit at least a portion of the asset identifier data to one or more devices (e.g., the user device 110) and/or one or more modules (e.g., the location module and/or the notification module). Moreover, upon receiving the asset identifier data regarding an
asset 10 to be sorted, the acquisition module may be configured to link or otherwise associate the user device 110 and the asset identifier data. As will be described in greater detail herein, the user device 110 may be associated with the asset identifier data by storing at least a portion of the asset identifier data in a memory associated with the user device 110.
- B. Location Module
- The location module may be configured to receive asset identifier data from the acquisition module. The location module is configured to ascertain the
appropriate location 400 and/or the appropriate position within the location 400 for the asset 10 based at least in part on the asset identifier data. In certain embodiments, the location module may be configured to determine the appropriate location 400 based at least in part on the asset identifier data and location data that is associated with each of the plurality of locations 400. The location data may be generated based not only upon the asset identifier data, but also upon the environmental mapping conducted via the user device, as described elsewhere herein.
- In various embodiments, each of the plurality of
locations 400 may be identified by location data, which may include a unique location identifier. The unique location identifier may comprise a unique character string individually identifying each of the plurality of locations 400. In various embodiments, the location data may define any subsequent processing to be performed on assets 10 within each location 400 and/or 1400, and may comprise the unique location identifier for each of the plurality of locations 400/1400 the assets 10 will pass through. In various embodiments, the location module may determine whether the processing to be performed on assets 10 in each of the plurality of locations 400 (as defined in the location data) will move the asset 10 closer to its final destination.
- In various embodiments, the location module may determine whether the processing steps to be performed on the
assets 10 in each of the locations 400/1400 comply with the service level (e.g., Same Day shipping, Next Day Air, Second Day Air, 3 Day Select, Ground shipping, and/or the like) corresponding to the asset 10. As a non-limiting example, the location module may determine that the appropriate location for an asset 10 to be delivered to 123 Main Street, Atlanta, Ga. is a delivery vehicle that will deliver other assets 10 to the same address or nearby addresses (e.g., along the same delivery route). As a second non-limiting example, the location module may determine that the appropriate location for an asset 10 to be delivered to 345 Broad Street, Los Angeles, Calif. via Next Day Delivery is a pallet to be loaded onto a plane destined for Los Angeles, Calif. As yet another non-limiting example, the location module may determine the appropriate location for an asset 10 prior to its fulfillment for delivery, which location may be characterized—as done elsewhere herein—as a pick location for the asset.
- After determining the
appropriate location 400/1400 and/or the appropriate position for the asset 10 within the location 400/1400, the location module may be configured to transmit data defining the appropriate location 400/1400 and/or the appropriate position for the asset 10 within the location 400/1400 to one or more devices (e.g., the user device 110) and/or modules (e.g., the notification module). Additional details in this respect are provided in U.S. Ser. No. 15/390,109, the contents of which are hereby incorporated by reference in their entirety.
- C. Notification Module
- In various embodiments, the notification module may receive data indicating whether the
location 400 and/or 1400 (e.g., as transmitted to the control system 100 via the user device) is the appropriate sort or pick location (e.g., as determined by the control system 100) for the asset or package being handled. As described herein, the notification module may cause one or more alerts to be generated in order to notify the user 5 (e.g., sort or pick personnel, or, more generally, the carrier personnel) whether the asset 10 should be deposited in the location 400 and/or picked therefrom, as the case may be. For example, the notification module may be configured to transmit confirmation data and/or mistake data to the user device 110 in order to cause the device to generate an alert discernible by the user 5 (e.g., carrier personnel) indicative of the appropriate sort location for the asset 10. To ascertain whether confirmation data and/or mistake data is appropriate for transmission, the user device 110 (and/or sensors associated therewith, e.g., three-dimensional sensors) may be configured to determine not only the position of the asset but also the position of the user's hands (e.g., including not only location, but also gestures), so as to gauge whether or not sorting and/or picking of the asset is proceeding properly. For example, the camera 116 may utilize object recognition algorithms that identify whenever a person is clasping an object in a particular manner, so as to determine whether the asset is being handled properly.
- In various embodiments, the notification module may cause the
user device 110 to audibly provide the user with a confirmation message (e.g., via the speaker 117) upon a determination that the location 400/1400 is the appropriate sort or pick location. In various embodiments, the notification module may alternatively or additionally cause one or more sounds to be generated, one or more lights to illuminate, one or more mechanical assemblies to move, and/or other processes discernible by a user 5 to operate and thus indicate to the user 5 whether the location 400/1400 is the appropriate location. It should also be understood that notifications may be generated—and communicated to the user via the user device—not only when the user is at the location (e.g., for picking or sorting), but also during travel of the user to/from the location relative to other locations in the warehouse or facility. As a non-limiting example, with reference to FIG. 9, as the user travels down the hallway or open path, were the user to navigate contrary to the navigational projection 710 provided, the user device could generate an audible (or other type of) notification to the user, either independently or upon a cue received via the control system 100. In certain embodiments, notifications may be generated and communicated (e.g., via the network or otherwise) to one or more parties other than the user of the user device (e.g., carrier supervisory personnel, other internal carrier personnel (e.g., quality assurance representatives, or the like), external personnel, external third-party entities, or the like). Any of the notifications and/or communications described herein may be so communicated, whether to the user alone and/or to parties other than the user and/or to a combination of both, as may be desirable.
- According to various embodiments, whether adjacent a
location 400/1400 or a conveyingmechanism 802, the notification module may be configured to generate one or more navigational projections (e.g., 710, 715, 1401 and/or the like, with reference toFIGS. 8-11C by way of non-limiting example) to convey navigational instructions to theuser 5. It should be understood that according to various embodiments, the navigational projections may be computer-generated and/or overlaid over an augmented reality environment, which may in certain embodiments be displayed to the user via the utilizeduser devices 110. In at least the hands-free embodiment, the navigational projections may be generated via the pivoting laser projector (e.g., 118 ofFIG. 5 ) of theuser device 110.FIGS. 8-11C and alsoFIGS. 13A-13F illustrate various types of navigational projections as may be generated via the notification module, when conveyed via thecontrol system 100 further to theuser device 110. In at least one embodiment, it may be understood that the navigational projections may be generated at/by theuser device 110, independent of thecontrol system 100, upon receipt from the control system of only a new “pick” or “sort” command for a particular asset or package. In various embodiments, the text indicia and navigational projections ofFIGS. 13A-13F occur via one or more components of the device component 114 (e.g.,laser projector 118 to project thearrow 1303 and theindicia 1309 ofFIG. 13A ). -
- FIG. 13A includes the environment 800 that the user is physically located in, which includes the conveyance device 1305, an asset 1301, and various location devices, one of which is indicated by the device 1307. As described above, the location devices help map the environment 800. The text indicia 1309 (which may also be considered a navigational projection) may be projected within the environment 800, commanding the user to "push forward" the asset 1301. The navigational projection 1303 (i.e., the arrow) also illustrates the direction in which the user should push the asset 1301 forward on the conveyance mechanism 1305. FIG. 13B includes the same environment 800, except that a different asset 1315 arrives and the text indicia 1311 command the user to "push" the asset 1315 to the "other side." The navigation projection 1313 helps guide the user by showing which direction to push the asset 1315. In this way, these visual frames together help guide and instruct the user in the handling of assets. Turning to FIG. 13C, another asset 1317 arrives and the text indicia 1319 prompt the user to "pick and sort" the asset 1317, i.e., to pick it up, sort it, and place it within a sorting location.
- FIG. 13D illustrates prompting the user to sort the asset in a particular location within the environment 1400. In some embodiments, the instructions illustrated in FIG. 13D occur in response to the user picking up the asset 1317 as illustrated in FIG. 13C. The environment 1400 includes the text indicia 1401 that state "look that way," accompanied by the navigational projection 1403, which illustrates the direction in which the user should walk in order to sort the asset. FIG. 13E illustrates the correct location at which to sort an asset within the environment 1400. In some embodiments, the instructions illustrated in FIG. 13E occur in response to the user moving responsive to the text indicia 1401 and/or the navigational projection 1403 of FIG. 13D. The environment 1400 includes the text indicia 1407, "sort here," indicating, along with the navigation projection 1405, the correct cell or location for sorting the asset. FIG. 13F illustrates the combined environments of FIGS. 13A through 13E. Accordingly, the user 1411 picks up the asset 1413 responsive to viewing a first set of navigational projections and/or text indicia and places the asset 1413 within the correct location responsive to viewing a second set of navigational projections and/or text indicia.
- A. Exemplary User Device Operation
-
FIGS. 8-13F illustrate an exemplary environment in which assets 10 are moved amongst various locations 400, which locations may be pick locations (e.g., for storage of an asset in a warehouse or the like; e.g., FIGS. 8-11C in particular), a conveyor belt location (e.g., FIG. 12 in particular), and/or sort locations (e.g., for placement of an asset following distribution from a pick location to a conveyor belt location; e.g., FIGS. 13A-F in particular). In various embodiments, a user 5 (e.g., sort personnel) may utilize a user device 110 as described herein while transporting assets 10. As described herein, the user device 110 may be configured for receiving information regarding a particular asset 10 to be transported, whether from the control system 100 or otherwise, for guiding the user 5 to a location to which the asset 10 is or should be transported, and for informing the user 5 whether the asset 10 is being located (e.g., via navigational projections) and/or transported appropriately.
- FIG. 6 illustrates exemplary steps carried out by the user device 110 according to various embodiments of the present disclosure to achieve the advantages and capabilities outlined above. In Block 501, an initialization or calibration of the user device 110 may be conducted according to various embodiments. In certain embodiments this step or block may be optional; in other embodiments, it need only be conducted periodically, for example upon initial use of the user device and/or upon receipt—from the control system 100—of a notification that an environment in which the user device operates has been altered or updated. As described herein above, per Block 501, an environment that a user is located in is mapped based at least in part on generating a multidimensional (e.g., 3-D) graphical representation of the environment.
- According to various embodiments, with reference to
FIG. 6 and FIG. 8, during Block 501 of FIG. 6, the user device 110 may be worn by a user 5 so as to map an environment 700 in which the user device 110 is to be used. A graphical representation 701 of the environment may be generated and/or stored via the user device 110, whereby storing may occur locally at the user device and/or the representation may be stored at and/or synced with the control system 100, for example, upon completion of the mapping procedure. As may be understood from FIG. 8 and in some embodiments, the mapping procedure of step or Block 501 involves the user, while wearing the user device 110, moving through the environment 700, which movement necessarily involves passage of locations 400/1400 (e.g., shelves) and various assets or packages 10. During the mapping procedure, the three-dimensional depth sensors 119 (e.g., the depth sensors 119 of FIG. 5) of the user device may be utilized to capture the data required to generate the graphical representation 701. In conjunction with a camera (e.g., the camera 116 of FIG. 5), locations of shelving (e.g., locations 400) and also locations of assets/packages—for example, in "pick locations"—may also be established, determined, and/or otherwise saved at or by the user device 110. Movement of the user 5 through the environment 700 defines progressive mapping zones 705, through which the three-dimensional depth sensors are configured to scan during the course of mapping. To facilitate and further optimize the mapping procedure, in certain embodiments, one or more commands may be transmitted to the user (e.g., via the speaker 117) from the control system (or otherwise), for example to instruct the user to "turn left" at specific intervals configured to ensure that the entirety of the environment 700 (or a desired portion thereof) is sufficiently mapped. This feature will, for example, minimize and/or eliminate instances of users not covering every area within the environment for which mapping may be desired and/or required. In some embodiments, because this mapping uses 3D scanning capabilities to detect the immediate environment of the user, projections are modified to avoid visual distortion. Accordingly, shapes and images projected are adjusted to the shape of the surface they will be projected upon in particular embodiments. In these embodiments, object recognition devices and/or scanners, such as cameras, located within a device component (e.g., device component 114) can identify the contours of the environment. In response, a projection can be made in the environment based on the shapes (e.g., uneven surfaces) of the environment.
- Returning to
FIG. 6, in Block (or step) 502, the user device 110 is configured to receive and/or associate product (e.g., package or asset) locations within the mapped environment. This may be via utilization of identifiers 415 (as described elsewhere herein), via transmission of asset location information from the control system to the user device, and/or the like.
- According to various embodiments, upon completion of the mapping of the
environment 700 and the association of product (e.g., package or asset 10) locations therein, the user device 110 is calibrated for operational mode or use, which use may occur in either (or both) a pick and a sort mode. In the pick mode, the user device is configured to guide a user thereof to a location in which a package or asset 10 may be picked or "pulled" for fulfillment of an order (e.g., within the environment 800 of FIG. 13A); in the sort mode, the user device guides the user (e.g., from a conveyor belt to a sort location) so as to enable further transport and handling of the asset or package en route to a customer or the like (e.g., within the environment 1400 of FIG. 13D).
- If it is determined in
block 515 that pick mode is appropriate, the user device 110 proceeds to step or Block 503, wherein pick location data is received. In certain embodiments, the pick location data is received—at the user device 110—from the control system 100, for example, upon receipt—at the control system—of a customer order for a particular asset 10 or package. Based upon the received pick location data in Block 503, the user device 110 is configured to, in certain embodiments, generate pick instructions in Block 504. The generation of pick instructions in Block 504 may entail compilation of a route through which the user of the user device 110 must travel—from their present location—so as to reach the location of the asset needing to be picked. Block 504 may further entail generation of a plurality of potential navigational projections (e.g., as described in FIGS. 9-11C) that will be necessary to accurately and efficiently guide or direct the user of the user device to the pick location. In Block 504, in certain embodiments, multiple possible routes may be determined and assessed, with either the user device (automatically) or the user (via an interface selection) choosing an optimal route, for which the navigational projections may thereafter be established; a minimal routing sketch appears below. Associated audible commands may also be generated/established, in conjunction with the navigational projections, should it be desirable to—in addition or alternatively to the navigational projections—also audibly guide (e.g., via the speaker 117) the user, instructing them to, for example, “turn left after the next shelving row,” as depicted in FIG. 9, by way of non-limiting example.
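The disclosure does not prescribe a particular routing algorithm. Purely as a non-limiting sketch, the route compilation and optimal-route selection of Block 504 could be realized as a shortest-path search over an occupancy grid derived from the mapped environment 700; the grid encoding and unit step cost below are assumptions:

```python
from heapq import heappush, heappop

def plan_route(grid, start, goal):
    """Lowest-cost route over a warehouse occupancy grid
    (0 = walkable aisle, 1 = shelving); Dijkstra with unit step cost.
    Turn commands such as "turn left" can be derived from heading
    changes between consecutive cells of the returned path."""
    rows, cols = len(grid), len(grid[0])
    frontier, seen = [(0, start, [start])], set()
    while frontier:
        cost, cell, path = heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols \
                    and grid[nxt[0]][nxt[1]] == 0:
                heappush(frontier, (cost + 1, nxt, path + [nxt]))
    return None  # pick location unreachable from the user's position
```

Several candidate routes could be scored in this manner, with the cheapest offered as the optimal route for projection. - Upon completion of step or Block 504 the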
user device 110 is configured to proceed to Block 505, wherein the navigational projections and/or audible instructions are dynamically displayed and/or otherwise provided to the user of the user device, characterized generically as “pick instructions.” It should be understood that, according to certain embodiments, Blocks 504 and 505 need not be separate and distinct steps or blocks; instead, as will be described below, as the user moves through the environment 700, the user device 110 may be configured to dynamically generate and display various navigational projections and/or audible instructions. In at least those embodiments, Block 504 may entail merely identifying—at the user device 110—the user's present location, the pick location, and a variety of pathways or routes there-between. - Reference is made now to
FIGS. 9-11C, which depict a variety of exemplary navigational projections that may be generated and displayed via the user device 110 as a user 5 wearing the same moves around an environment 700. As may be understood by contrasting FIG. 9 with FIGS. 10-11C, ahead of the user arriving proximate the pick location, various navigational projections 710 may be generated and displayed (e.g., via the pivoting laser projector 118 of FIG. 5). These navigational projections 710 may be two- or three-dimensional in nature and—as generally understood—provide directional guidance to the user as to which direction they should move or turn. As illustrated, by way of non-limiting example, in FIG. 9, the navigational projection 710 in certain embodiments may be a two-dimensional directional arrow configured to instruct a user wearing a user device to make a change relative to their present movement pattern; a sketch of such arrow selection follows. In other embodiments, such as those illustrated in FIGS. 12-13F, the navigational projections, therein described as indicators 810 or navigational projections 810, may be three-dimensional in nature.
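As a hypothetical illustration of how a two-dimensional arrow might be chosen, the sketch below compares the user's heading against the bearing to the next route waypoint; the angle convention and the 30-degree threshold are assumptions, not taken from the disclosure:

```python
import math

def turn_instruction(heading_deg, user_pos, waypoint):
    """Choose the arrow to project, given the user's heading (degrees,
    counterclockwise from +x) and the next waypoint in the map frame."""
    bearing = math.degrees(math.atan2(waypoint[1] - user_pos[1],
                                      waypoint[0] - user_pos[0]))
    delta = (bearing - heading_deg + 180) % 360 - 180  # signed turn angle
    if abs(delta) < 30:   # roughly ahead: keep going
        return "straight"
    return "left" if delta > 0 else "right"
```

- Returning now to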
FIGS. 9-11C, with reference now in particular to FIGS. 10-11C, therein may be seen certain navigational projections 715 that may be generated and displayed—via the user device 110—as the user approaches the proximity of the pick location. Proximity may be defined as within a specific row upon which the asset or package to be picked is located. As may be understood from FIGS. 11A-11C in particular, the navigational projections 715 may include a frame and/or a checkmark, showing the right location to pick the asset or package from. In certain embodiments, text indicating a quantity may also be illustrated, as in FIG. 11A via projection 715A, which projection encompasses multiple package or asset boundaries, as may be recognized and detected by the user device 110 upon approach to the asset or package. The boundaries may, as a non-limiting example, be determined by the user device 110, at least in part, based upon asset information received from the control system 100. In other embodiments, software embedded upon the user device 110, in conjunction with the camera (e.g., the camera 116 of FIG. 5), may be configured to, via an iterative machine learning-type algorithm and/or object recognition algorithm, recognize and determine asset/package dimensions and thus boundaries over a period of time; a classical-vision sketch of such boundary detection appears below.
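One hedged, classical-vision realization of such boundary recognition is contour detection over frames from the camera 116; a trained object-recognition model, as the disclosure contemplates, could be substituted. All thresholds below are assumed:

```python
import cv2

def detect_package_boundaries(frame_bgr, min_area=5000):
    """Return rough bounding boxes (x, y, w, h) of package-like regions
    in a camera frame via edge + contour detection. A classical
    stand-in for learned boundary recognition."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```

-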
FIG. 11A illustrates one exemplary embodiment, in which, as alluded to above, the navigational projection 715A encompasses multiple assets or packages 10 in a particular location 400, with textual instructions also being generated and provided to the user to “pick 2” of the highlighted or framed packages. In other embodiments (not illustrated), the instructions may say “pick these two” assets, with the frame encompassing only two specific assets on the shelf or location 400. In still other embodiments (e.g., FIG. 11C), multiple frames and instructions associated with navigational projection 715C may be provided, distinctly identifying three assets or packages that need to be picked. A single asset pick embodiment is also illustrated in FIG. 11B, whereby a navigational projection 715B is provided, similar to the projection 715C, in that each frame and textual instruction generated and displayed surrounds and is overlaid relative to a single package or asset 10. It should be understood that the illustrations and embodiments of FIGS. 11A-C are non-limiting in nature; additional and/or alternatively configured navigational projections 715 may be envisioned within the scope of the inventive concept described herein. - Returning now to
FIG. 6 and remaining with Block 505, in addition to audible guidance instructions that may be provided/generated in conjunction with the navigational projections of FIGS. 10-11C, various embodiments may involve the user device 110 further generating and transmitting to the user 5 audible alerts when deviations occur. For example, the user device 110—via its speaker 117 of FIG. 5—may provide general feedback to the user, such as “you picked the wrong asset; please await further instructions” or “you turned the wrong way; please stand still pending updated navigational projections becoming visible.” Alternative or additional “feedback”-type alerts and/or instructions may be generated, in a near-real-time or real-time manner, based upon the user's responsiveness (and accuracy of movement) relative to the provided navigational projections, as sketched below.
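Such a deviation alert might be triggered, for example, by a simple distance check of the user's tracked position against the planned route; the tolerance and alert text below are illustrative assumptions:

```python
def deviation_alert(position, route, tolerance_m=1.5):
    """Return an audible-alert string when the tracked position strays
    beyond `tolerance_m` of every cell on the planned route, else None."""
    px, py = position
    nearest = min(((px - x) ** 2 + (py - y) ** 2) ** 0.5 for x, y in route)
    if nearest > tolerance_m:
        return ("you turned the wrong way; please stand still "
                "pending updated navigational projections")
    return None
```

- Turning now to step or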
Block 506 in FIG. 6, upon the user with the user device 110 reaching the pick location, the user device 110 is configured to further detect asset handling, specifically when the asset or package has been picked up by the user. In certain embodiments, feedback may also be provided at this juncture, via the speaker 117 of the user device 110. In these and still other embodiments, the user device 110 may capture an image of the “picking” (e.g., via its camera 116) and/or transmit the same to the control system 100 for centralized/remote verification of picking accuracy and completeness. This action may also occur in conjunction with Block 507, whereby one or more pick-related notifications may be generated and/or transmitted by the user device 110, whether to the user 5 and/or the control system 100. - According to various embodiments, the detection of the “picking” may be conducted by the
user device 110 via a collision detection algorithm, as detailed elsewhere herein. As generally understood, though, such algorithm(s) are configured to detect changes in movement relative to the user device 110, whereby if an item or person (e.g., a user) associated with or wearing the user device 110 encounters a collision—for example by picking up and physically touching an asset or package—that “collision” likewise registers at the user device. In this manner, the user device 110 may be programmed to transition from a guidance mode to—at least temporarily—a report or notification mode, so as to convey—for example to the control system 100—that the pick has occurred; a depth-based sketch of such detection appears below.
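The disclosure does not detail the collision mathematics. One minimal sketch, assuming registered depth frames from the depth sensors 119 and a known shelf-slot mask, flags a pick when depth values over the slot change abruptly between frames:

```python
import numpy as np

def pick_detected(prev_depth, curr_depth, slot_mask,
                  delta_m=0.10, changed_frac=0.3):
    """Flag a 'pick' when a large fraction of depth pixels over a known
    shelf slot change by more than `delta_m` meters between frames --
    either receding (package removed) or advancing (hand entering)."""
    changed = np.abs(curr_depth - prev_depth) > delta_m
    return (changed & slot_mask).sum() >= changed_frac * slot_mask.sum()
```

- In certain embodiments multiple algorithms may be utilized. One may be to identify what an asset or package is, namely what its physical boundaries entail. Another is to interpret when a user's hands (or the like) collide with and pick up (or set down) the asset or package. Each may be assisted, not only via the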
depth sensors 119 of the user device, but also the camera 116 thereof. In certain embodiments, at least two and in some instances four depth sensors 119 may be provided, each with a working range between 0.85 and 3.1 meters (alternative ranges may also be provided). Using data collected via the depth sensors, the environment may thus not only be mapped, but changes therein, including collisions between objects—including packages, assets, users, and/or devices such as forklifts operated by a user—may be detected and accounted for. In some embodiments, particular algorithms may be able to identify the parcel itself using machine learning techniques to search for physical clues (e.g., size, color, scratches, or any feature, even microscopic, that may lead to uniquely identifying the parcel) without needing to read any parcel ID label or barcode; a feature-matching sketch of such label-free identification follows.
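By way of a non-limiting sketch of such label-free identification, local feature descriptors (ORB here, an assumed choice; the disclosure speaks only of machine learning techniques generally) can fingerprint a parcel's surface texture and be matched across sightings:

```python
import cv2

def parcel_signature(crop_bgr):
    """ORB descriptors over a parcel crop; captures texture cues such
    as print, tape edges, and scratches. Returns None if featureless."""
    gray = cv2.cvtColor(crop_bgr, cv2.COLOR_BGR2GRAY)
    _, descriptors = cv2.ORB_create(nfeatures=500).detectAndCompute(gray, None)
    return descriptors

def same_parcel(desc_a, desc_b, ratio=0.75, min_matches=25):
    """Hamming k-NN matching with Lowe's ratio test; thresholds assumed."""
    pairs = cv2.BFMatcher(cv2.NORM_HAMMING).knnMatch(desc_a, desc_b, k=2)
    good = [m for pair in pairs if len(pair) == 2
            for m, n in [pair] if m.distance < ratio * n.distance]
    return len(good) >= min_matches
```

- Remaining with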
FIG. 6 and returning to Block 515, in certain instances the user device 110—whether independently or following a successful pick procedure—may be configured to operate in a sort mode, which corresponds generally to utilization of the user device to move assets or packages to/from various sorting locations within a mapped environment, as contrasted with a user locating the asset or package from a stored “pick” location. The sequence in sort mode initiates with the user device 110, in Block 508, detecting handling of an asset by the user (e.g., via the collision algorithm described previously herein) and/or obtaining asset identifier data from the asset or package (e.g., via the camera 116, whether independently at the user device 110 or further in conjunction with an exchange of certain asset/package data with the control system 100). - Upon obtaining the asset identifier data, the
user device 110 is able to determine and/or receive sort location data for the asset or package 10 in Block 509. Based thereon, much like in Blocks 504-505 and 507 (in the context of sorting), the user device 110 is configured to—according to various embodiments—generate sort instructions in Block 510, dynamically display sort instructions (e.g., navigational projections, text indicia, and/or audible instructions and the like) in Block 511, and generate/transmit one or more sort-related notifications in Block 512. In some embodiments, the generating and displaying of one or more navigational projections configured to guide the user to an asset location within an environment is based at least on associating one or more asset locations within a mapped environment. In various embodiments, the displaying occurs within the environment the user occupies without requiring a particular medium (e.g., lens, projector screen, etc.); in these embodiments, the projection is displayed in open space within the environment. It should be understood that any of Blocks 509-512 may be substantially the same as, or identical to, those in Blocks 503-505 and 507, as previously detailed herein; in certain embodiments, though, one or more of the Blocks may be configured differently for sort versus pick mode. The overall sort-mode sequence is sketched below.
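Purely as an orienting sketch, the sort-mode sequence of Blocks 508-512 can be summarized as follows; the device and control_system objects and every method name are illustrative shorthand, not an API disclosed herein:

```python
def run_sort_mode(device, control_system):
    """Hypothetical sort-mode loop; each call is illustrative shorthand
    for the corresponding Block of FIG. 6, not a disclosed interface."""
    asset_id = device.detect_handled_asset()             # Block 508
    sort_loc = control_system.sort_location(asset_id)    # Block 509
    plan = device.generate_sort_instructions(sort_loc)   # Block 510
    device.display_projections(plan)                     # Block 511
    control_system.notify("sort-in-progress", asset_id)  # Block 512
```

- Additional details relative to the utilization of the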
user device 110 in sort mode may be understood with reference to FIGS. 13A-F, which figures are also described elsewhere herein. Additional details regarding sorting procedures may also be understood with reference to U.S. Ser. No. 15/390,109, the contents of which are hereby incorporated by reference in their entirety.
-
FIG. 7 illustrates exemplary steps carried out by the control system 100 according to various embodiments of the present disclosure. As illustrated in FIG. 7, the control system 100 may receive asset identifier data at Block 601. As indicated herein, the asset identifier data may be received from the user device 110, the acquisition device 810, and/or the one or more location devices 415 at a location 400. Further details regarding the scope and contents of the asset identifier data have been described previously herein. Still additional details in this respect may be understood with reference to U.S. Ser. No. 15/390,109, the contents of which are hereby incorporated by reference in their entirety. - At
Block 602, thecontrol system 100 may be configured to determine theappropriate location 400 for theasset 10 and/or the appropriate position within the location for theasset 10. In various embodiments, the determination of the appropriate location for theasset 10 may be based at least in part on the received asset identifier data. Moreover, thecontrol system 100 may utilize location data corresponding to each of thelocations 400 to determine whether any subsequent processing to be performed onassets 10 at eachlocation 400 will move theasset 10 closer to its final destination. As a non-limiting example, thecontrol system 100 may determine the appropriate location for anasset 10 to be delivered to 123 Main Street, Atlanta, Ga. is the delivery vehicle that will deliverother assets 10 to 123 Main Street, Atlanta, Ga. Additional details in this respect may be understood with reference to U.S. Ser. No. 15/390,109, the contents of which as are hereby incorporated by reference in their entirety. - Referring again to
FIG. 7, at Block 603 the control system 100 may be configured to transmit data identifying the appropriate sort location to the user device 110. As noted herein, the user device 110 may be configured to generate an indicator (e.g., visual indicators or navigational projections 710/715/810) discernible by the user 5 (e.g., carrier personnel) regarding the appropriate pick or sort location for the asset 10. However, as noted herein, each asset 10 may have information indicative of an appropriate location printed thereon, and accordingly the control system 100 may not need to—in those embodiments—transmit appropriate location data to the user device 110. - The
control system 100 may also be configured to receive a variety of data—including location data—from the user device 110 at Block 604. At Block 605, the control system 100 may subsequently compare the appropriate location (at which the user, for picking, or the asset, for sorting, should be located) and the actual location data received at Block 604 to determine whether the user device 110 is proximate the appropriate location. As indicated at Block 606, the remaining steps to be completed may be selected based at least in part on a determination of whether the location is an appropriate (or desired/accurate) location; a minimal proximity check is sketched below. Additional details in this respect may be understood with reference to U.S. Ser. No. 15/390,109, the contents of which are hereby incorporated by reference in their entirety.
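As a non-limiting sketch, the comparison of Blocks 605 and 606 may reduce to a Euclidean distance test against an assumed proximity radius:

```python
def is_proximate(actual_xyz, appropriate_xyz, radius_m=2.0):
    """Blocks 605-606 as a distance test: is the reported device
    location within `radius_m` of the appropriate location?"""
    dist = sum((a - b) ** 2 for a, b in zip(actual_xyz, appropriate_xyz)) ** 0.5
    return dist <= radius_m
```

- Upon a determination that the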
user device 110 is proximate an incorrect location 400, the control system 100 may generate mistake data at Block 610. Upon generating the mistake data, the control system 100 may transmit the mistake data to the user device 110 at Block 611. As indicated herein, the user device 110 may be configured to generate a message discernible by the user 5 (e.g., carrier personnel) indicating the user device 110 is proximate an incorrect location 400 (e.g., as illustrated in FIG. 13D). In various embodiments, the control system 100 may be configured to associate the asset identifier data with the location data corresponding to the location 400 at Block 612. At Block 613, the user 5 may continue transporting the asset 10 (and consequently the user device 110) to another (ideally correct) location 400. The process may return to Block 601 in such scenarios and repeat the recited steps. - Referring again to Block 606, the process may proceed after comparing the actual/received location data and the appropriate location data for the asset 10 (illustrated as Block 605) with reference to Blocks 607-609 if the
user 5 approaches the appropriate location. In the context of sorting procedures, additional details in this respect may be understood with reference to U.S. Ser. No. 15/390,109, the contents of which are hereby incorporated by reference in their entirety. - The
control system 100 may be further configured to generate one or more alerts regarding the association between the asset identifier data and the location data. The control system 100 may be configured to generate an alert to inform the user 5 (e.g., carrier personnel) or other users regarding asset identifier data being associated with location data. - Many modifications and other embodiments of the disclosures set forth herein will come to mind to one skilled in the art to which these disclosures pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the disclosures are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/411,495 US20210383320A1 (en) | 2017-08-15 | 2021-08-25 | Object location in a delivery vehicle |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762545752P | 2017-08-15 | 2017-08-15 | |
US201762607814P | 2017-12-19 | 2017-12-19 | |
US16/103,566 US11156471B2 (en) | 2017-08-15 | 2018-08-14 | Hands-free augmented reality system for picking and/or sorting assets |
US16/226,180 US11797910B2 (en) | 2017-08-15 | 2018-12-19 | Hands-free augmented reality system for picking and/or sorting assets |
US17/411,495 US20210383320A1 (en) | 2017-08-15 | 2021-08-25 | Object location in a delivery vehicle |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/226,180 Continuation US11797910B2 (en) | 2017-08-15 | 2018-12-19 | Hands-free augmented reality system for picking and/or sorting assets |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210383320A1 true US20210383320A1 (en) | 2021-12-09 |
Family
ID=66170613
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/226,180 Active 2039-06-03 US11797910B2 (en) | 2017-08-15 | 2018-12-19 | Hands-free augmented reality system for picking and/or sorting assets |
US17/411,495 Abandoned US20210383320A1 (en) | 2017-08-15 | 2021-08-25 | Object location in a delivery vehicle |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/226,180 Active 2039-06-03 US11797910B2 (en) | 2017-08-15 | 2018-12-19 | Hands-free augmented reality system for picking and/or sorting assets |
Country Status (1)
Country | Link |
---|---|
US (2) | US11797910B2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI822132B (en) * | 2022-06-20 | 2023-11-11 | 國立臺北科技大學 | Courier assistance system and method of using the courier assistance system |
US11935169B2 (en) | 2016-11-02 | 2024-03-19 | United Parcel Service Of America, Inc. | Displaying items of interest in an augmented reality environment |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10546173B2 (en) * | 2015-04-09 | 2020-01-28 | Nec Corporation | Information processing device, information processing system, position reporting method, and program recording medium |
WO2018165052A1 (en) | 2017-03-06 | 2018-09-13 | United States Postal Service | System and method of providing informed delivery items using a hybrid-digital mailbox |
WO2019232420A2 (en) * | 2018-06-01 | 2019-12-05 | Culvert-Iot Corporation | An intelligent tracking system and methods and systems therefor |
JP2021144064A (en) * | 2018-06-06 | 2021-09-24 | ソニーグループ株式会社 | Information processing device, information processing method and program |
US10977867B2 (en) * | 2018-08-14 | 2021-04-13 | Goodrich Corporation | Augmented reality-based aircraft cargo monitoring and control system |
US11109309B2 (en) * | 2019-03-29 | 2021-08-31 | Blackberry Limited | Systems and methods for establishing short-range communication links between asset tracking devices |
US11715060B2 (en) | 2019-05-31 | 2023-08-01 | X Development Llc | Intelligent tracking system and methods and systems therefor |
US11556937B2 (en) * | 2019-06-24 | 2023-01-17 | Sap Se | Virtual reality for situational handling |
FR3096215A1 (en) * | 2019-06-24 | 2020-11-20 | Orange | Method of communicating an estimate of the location of at least one radio terminal to a connected object |
KR20190106930A (en) * | 2019-08-30 | 2019-09-18 | 엘지전자 주식회사 | Intelligent Device and Method for Information Display with Projection Type Using the Same |
US12033111B2 (en) * | 2019-10-03 | 2024-07-09 | United States Postal Service | Distribution item delivery point management system |
US20210142265A1 (en) * | 2019-11-08 | 2021-05-13 | Walmart Apollo, Llc | System and Method for Orderfilling Trip Allocation |
US11783268B2 (en) | 2020-03-31 | 2023-10-10 | Walmart Apollo, Llc | Systems and methods for packing visualizations |
EP3913528A1 (en) * | 2020-05-20 | 2021-11-24 | Hand Held Products, Inc. | Apparatuses, computer-implemented methods, and computer program products for automatic item searching and verification |
US11512956B2 (en) | 2020-07-09 | 2022-11-29 | Trimble Inc. | Construction layout using augmented reality |
US11360310B2 (en) * | 2020-07-09 | 2022-06-14 | Trimble Inc. | Augmented reality technology as a controller for a total station |
CN112150072A (en) * | 2020-09-27 | 2020-12-29 | 北京海益同展信息科技有限公司 | Asset checking method and device based on intelligent robot, electronic equipment and medium |
IT202100011462A1 (en) | 2021-05-05 | 2022-11-05 | Engynya S R L | ULTRA BROADBAND REAL-TIME TRACKING SYSTEM. |
CA3218658A1 (en) * | 2021-05-28 | 2022-12-01 | Ashutosh Prasad | System for inventory tracking |
US11983755B2 (en) * | 2021-08-31 | 2024-05-14 | International Busi Corporation ess Machines | Digital twin exchange filtering of digital resources based on owned assets |
JP2023136239A (en) * | 2022-03-16 | 2023-09-29 | 株式会社リコー | Information processing device, information processing system, supporting system, and information processing method |
US20230316212A1 (en) * | 2022-03-29 | 2023-10-05 | United Parcel Service Of America, Inc. | Package identification using multiple signals |
CN116329098A (en) * | 2022-05-31 | 2023-06-27 | 北京三快在线科技有限公司 | Goods sorting method and device, sorting system and electronic equipment |
US11816635B1 (en) * | 2022-09-07 | 2023-11-14 | David Paul Winter | System and methods of three-dimensional projection mapping-based visual guidance for order fulfillment |
WO2024054698A1 (en) * | 2022-09-07 | 2024-03-14 | Baker Creek Heirloom Seed Co., LLC | System and methods of three-dimensional projection mapping-based visual guidance for order fulfillment |
CN115965301B (en) * | 2022-11-18 | 2023-10-27 | 共青科技职业学院 | Arrival cargo distribution system based on AR space identification and distribution method thereof |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030036985A1 (en) * | 2001-08-15 | 2003-02-20 | Soderholm Mark J. | Product locating system for use in a store or other facility |
US20040088229A1 (en) * | 2002-11-05 | 2004-05-06 | Yongjie Xu | Multi-user light directed inventory system |
US20110199187A1 (en) * | 2010-02-12 | 2011-08-18 | Biotillion, Llc | Tracking Biological and Other Samples Using RFID Tags |
JP2017007866A (en) * | 2016-09-13 | 2017-01-12 | オークラ輸送機株式会社 | Picking system |
US20170015502A1 (en) * | 2015-07-17 | 2017-01-19 | Intelligrated Headquarters, Llc | Modular and configurable pick/put wall |
US20170140329A1 (en) * | 2015-11-18 | 2017-05-18 | Hand Held Products, Inc. | In-vehicle package location identification at load and delivery times |
US20180068266A1 (en) * | 2016-09-08 | 2018-03-08 | Position Imaging, Inc. | System and method of object tracking using weight confirmation |
Family Cites Families (102)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5090789A (en) * | 1990-08-03 | 1992-02-25 | Crabtree Allen E | Laser light show device and method |
US5555090A (en) * | 1994-10-24 | 1996-09-10 | Adaptive Optics Associates | System for dimensioning objects |
US5770841A (en) * | 1995-09-29 | 1998-06-23 | United Parcel Service Of America, Inc. | System and method for reading package information |
US6778683B1 (en) * | 1999-12-08 | 2004-08-17 | Federal Express Corporation | Method and apparatus for reading and decoding information |
US20020010661A1 (en) | 2000-05-31 | 2002-01-24 | Waddington Steffanie G. | Distribution system |
US7035856B1 (en) * | 2000-09-28 | 2006-04-25 | Nobuyoshi Morimoto | System and method for tracking and routing shipped items |
US9092841B2 (en) | 2004-06-09 | 2015-07-28 | Cognex Technology And Investment Llc | Method and apparatus for visual detection and inspection of objects |
US7301547B2 (en) * | 2002-03-22 | 2007-11-27 | Intel Corporation | Augmented reality system |
US7002551B2 (en) | 2002-09-25 | 2006-02-21 | Hrl Laboratories, Llc | Optical see-through augmented reality modified-scale display |
US7725406B2 (en) * | 2004-03-30 | 2010-05-25 | United Parcel Service Of America, Inc. | Systems and methods for international shipping and brokerage operations support processing |
US7158241B2 (en) * | 2004-06-17 | 2007-01-02 | The Boeing Company | Method for calibration and certifying laser projection beam accuracy |
US7624024B2 (en) * | 2005-04-18 | 2009-11-24 | United Parcel Service Of America, Inc. | Systems and methods for dynamically updating a dispatch plan |
US7489411B2 (en) * | 2005-07-27 | 2009-02-10 | The Boeing Company | Apparatus and methods for calibrating a laser projection device |
JP4913514B2 (en) * | 2006-03-28 | 2012-04-11 | 東芝テック株式会社 | Display shelf and display shelf system |
US7504949B1 (en) | 2006-05-24 | 2009-03-17 | Amazon Technologies, Inc. | Method and apparatus for indirect asset tracking with RFID |
US8684268B2 (en) * | 2006-07-21 | 2014-04-01 | Hussmann Corporation | Product display system, profile assembly for a product display system, and method for illuminating a product |
US7495561B2 (en) * | 2006-08-25 | 2009-02-24 | International Business Machines Corporation | Item position indicator and optimized item retrieval for a sensor equipped storage unit |
US20080183328A1 (en) * | 2007-01-26 | 2008-07-31 | Danelski Darin L | Laser Guided System for Picking or Sorting |
US9277351B2 (en) * | 2007-09-07 | 2016-03-01 | International Business Machines Corporation | Wireless transmission duration and location-based services |
US8423431B1 (en) | 2007-12-20 | 2013-04-16 | Amazon Technologies, Inc. | Light emission guidance |
US20100022221A1 (en) * | 2008-07-25 | 2010-01-28 | Yahoo! Inc. | Real-time inventory tracking via mobile device |
AT10520U3 (en) * | 2008-09-05 | 2013-10-15 | Knapp Systemintegration Gmbh | DEVICE AND METHOD FOR THE VISUAL SUPPORT OF PICKING PROCESSES |
JP2011118834A (en) | 2009-12-07 | 2011-06-16 | Sony Corp | Apparatus and method for processing information, and program |
US8400548B2 (en) | 2010-01-05 | 2013-03-19 | Apple Inc. | Synchronized, interactive augmented reality displays for multifunction devices |
US20130278631A1 (en) | 2010-02-28 | 2013-10-24 | Osterhout Group, Inc. | 3d positioning of augmented reality information |
US20150309316A1 (en) | 2011-04-06 | 2015-10-29 | Microsoft Technology Licensing, Llc | Ar glasses with predictive control of external device based on event input |
US20120063125A1 (en) * | 2010-03-17 | 2012-03-15 | The Sloan Company, Inc. Dba Sloanled | Display case lighting |
US9183560B2 (en) | 2010-05-28 | 2015-11-10 | Daniel H. Abelow | Reality alternate |
US8660581B2 (en) * | 2011-02-23 | 2014-02-25 | Digimarc Corporation | Mobile device indoor navigation |
US10394843B2 (en) | 2011-07-01 | 2019-08-27 | Here Global B.V. | Method and apparatus for personal asset management |
US8866391B2 (en) * | 2011-07-26 | 2014-10-21 | ByteLight, Inc. | Self identifying modulated light source |
US8941560B2 (en) | 2011-09-21 | 2015-01-27 | Google Inc. | Wearable computer with superimposed controls and instructions for external device |
EP2764469A4 (en) | 2011-10-03 | 2015-04-15 | Avocent Huntsville Corp | Data center infrastructure management system having real time enhanced reality tablet |
US10223710B2 (en) * | 2013-01-04 | 2019-03-05 | Visa International Service Association | Wearable intelligent vision device apparatuses, methods and systems |
US8690057B2 (en) | 2012-03-06 | 2014-04-08 | A-I Packaging Solutions, Inc. | Radio frequency identification system for tracking and managing materials in a manufacturing process |
US8947456B2 (en) | 2012-03-22 | 2015-02-03 | Empire Technology Development Llc | Augmented reality process for sorting materials |
US9064165B2 (en) * | 2012-03-28 | 2015-06-23 | Metrologic Instruments, Inc. | Laser scanning system using laser beam sources for producing long and short wavelengths in combination with beam-waist extending optics to extend the depth of field thereof while resolving high resolution bar code symbols having minimum code element widths |
JP5334145B1 (en) | 2012-06-29 | 2013-11-06 | トーヨーカネツソリューションズ株式会社 | Support system for picking goods |
BR202012019055Y1 (en) * | 2012-07-31 | 2019-09-10 | Waisman Reinaldo | image projection equipment at points of sale simulating a holography |
US9235553B2 (en) * | 2012-10-19 | 2016-01-12 | Hand Held Products, Inc. | Vehicle computer system with transparent display |
US20140175165A1 (en) * | 2012-12-21 | 2014-06-26 | Honeywell Scanning And Mobility | Bar code scanner with integrated surface authentication |
US9483875B2 (en) | 2013-02-14 | 2016-11-01 | Blackberry Limited | Augmented reality system with encoding beacons |
US9070032B2 (en) * | 2013-04-10 | 2015-06-30 | Hand Held Products, Inc. | Method of programming a symbol reading system |
CA2851950A1 (en) * | 2013-05-21 | 2014-11-21 | Fonella Oy | System for managing locations of items |
EP3028104A4 (en) | 2013-08-02 | 2016-08-03 | Tweddle Group | Systems and methods of creating and delivering item of manufacture specific information to remote devices |
US9464885B2 (en) * | 2013-08-30 | 2016-10-11 | Hand Held Products, Inc. | System and method for package dimensioning |
US9171278B1 (en) * | 2013-09-25 | 2015-10-27 | Amazon Technologies, Inc. | Item illumination based on image recognition |
US20150130592A1 (en) * | 2013-11-13 | 2015-05-14 | Symbol Technologies. Inc. | Package-loading system |
CN204009928U (en) * | 2013-12-12 | 2014-12-10 | 手持产品公司 | Laser scanner |
US10139495B2 (en) * | 2014-01-24 | 2018-11-27 | Hand Held Products, Inc. | Shelving and package locating systems for delivery vehicles |
CN106030427B (en) | 2014-02-20 | 2020-09-08 | M·奥利尼克 | Method and system for preparing food in a robotic cooking kitchen |
US9697548B1 (en) | 2014-02-20 | 2017-07-04 | Amazon Technologies, Inc. | Resolving item returns of an electronic marketplace |
WO2015143423A1 (en) * | 2014-03-21 | 2015-09-24 | Lightwave International, Inc. | Laser projection system |
US9632313B1 (en) | 2014-03-27 | 2017-04-25 | Amazon Technologies, Inc. | Augmented reality user interface facilitating fulfillment |
WO2015171825A1 (en) | 2014-05-06 | 2015-11-12 | Carvajal Hernan Ramiro | Switch network of containers and trailers for transportation, storage, and distribution of physical items |
EP3146729B1 (en) | 2014-05-21 | 2024-10-16 | Millennium Three Technologies Inc. | System comprising a helmet, a multi-camera array and an ad hoc arrangement of fiducial marker patterns and their automatic detection in images |
US9091530B1 (en) * | 2014-06-10 | 2015-07-28 | The Boeing Company | Calibration system and method for a three-dimensional measurement system |
MX364081B (en) * | 2014-07-23 | 2019-04-11 | Dematic Corp | Laser mobile put wall. |
US9342724B2 (en) * | 2014-09-10 | 2016-05-17 | Honeywell International, Inc. | Variable depth of field barcode scanner |
US20200294336A1 (en) * | 2014-10-02 | 2020-09-17 | Luxer Corporation (Dba Luxer One) | Automated storage area |
US9443222B2 (en) * | 2014-10-14 | 2016-09-13 | Hand Held Products, Inc. | Identifying inventory items in a storage facility |
US10438409B2 (en) | 2014-12-15 | 2019-10-08 | Hand Held Products, Inc. | Augmented reality asset locator |
US10169677B1 (en) * | 2014-12-19 | 2019-01-01 | Amazon Technologies, Inc. | Counting stacked inventory using image analysis |
WO2016103265A1 (en) * | 2014-12-26 | 2016-06-30 | Splitty Travel Ltd. | System and method for optimizing utilization of a population of underutilized physical facilities such as tourist facilities |
US20160189087A1 (en) * | 2014-12-30 | 2016-06-30 | Hand Held Products, Inc,. | Cargo Apportionment Techniques |
US9645482B2 (en) | 2015-03-10 | 2017-05-09 | Disney Enterprises, Inc. | Fail-safe projection system |
US10148918B1 (en) * | 2015-04-06 | 2018-12-04 | Position Imaging, Inc. | Modular shelving systems for package tracking |
US9659275B2 (en) * | 2015-04-14 | 2017-05-23 | Wal-Mart Stores, Inc. | Consumer demand-based inventory management system |
US9658310B2 (en) | 2015-06-16 | 2017-05-23 | United Parcel Service Of America, Inc. | Concepts for identifying an asset sort location |
US10495723B2 (en) * | 2015-06-16 | 2019-12-03 | United Parcel Service Of America, Inc. | Identifying an asset sort location |
US10013883B2 (en) | 2015-06-22 | 2018-07-03 | Digital Ally, Inc. | Tracking and analysis of drivers within a fleet of vehicles |
US9911290B1 (en) | 2015-07-25 | 2018-03-06 | Gary M. Zalewski | Wireless coded communication (WCC) devices for tracking retail interactions with goods and association to user accounts |
US10474987B2 (en) | 2015-08-05 | 2019-11-12 | Whirlpool Corporation | Object recognition system for an appliance and method for managing household inventory of consumables |
JP2017048024A (en) * | 2015-09-03 | 2017-03-09 | 株式会社東芝 | Eyeglass-type wearable terminal and picking method using the same |
WO2017074989A1 (en) * | 2015-10-28 | 2017-05-04 | Wal-Mart Stores, Inc. | Apparatus and method for providing package release to unmanned aerial system |
US10395116B2 (en) | 2015-10-29 | 2019-08-27 | Hand Held Products, Inc. | Dynamically created and updated indoor positioning map |
CA3153451A1 (en) * | 2015-11-02 | 2017-05-11 | Sargent Manufacturing Company | Methods and systems for ensuring secure delivery of parcels using internet-enabled storage receptacle |
US20170193428A1 (en) | 2016-01-06 | 2017-07-06 | International Business Machines Corporation | Method, system and computer product for identifying, coordinating and managing mail |
US10600109B2 (en) | 2016-01-11 | 2020-03-24 | Honeywell International Inc. | Tag for order fulfillment |
US10628862B2 (en) * | 2016-03-08 | 2020-04-21 | Walmart Apollo, Llc | Fresh perishable store item notification systems and methods |
US12094276B2 (en) * | 2016-04-06 | 2024-09-17 | Smiota, Inc. | Smart locker agnostic operating platform |
US11900312B2 (en) * | 2016-04-06 | 2024-02-13 | Smiota, Inc. | Package analysis devices and systems |
US10250720B2 (en) | 2016-05-05 | 2019-04-02 | Google Llc | Sharing in an augmented and/or virtual reality environment |
EP4410155A1 (en) * | 2016-05-09 | 2024-08-07 | Grabango Co. | System and method for computer vision driven applications within an environment |
WO2017223242A1 (en) | 2016-06-22 | 2017-12-28 | United States Postal Service | Item tracking using a dynamic region of interest |
US10679204B2 (en) * | 2016-07-21 | 2020-06-09 | Hewlett-Packard Development Company, Lp. | Imaging a package to identify contents associated with the package |
EP3276530A1 (en) | 2016-07-29 | 2018-01-31 | Neopost Technologies | Assisted manual mail sorting system and method |
US10346797B2 (en) * | 2016-09-26 | 2019-07-09 | Cybernet Systems, Inc. | Path and load localization and operations supporting automated warehousing using robotic forklifts or other material handling vehicles |
JP2018055429A (en) * | 2016-09-29 | 2018-04-05 | ファナック株式会社 | Object recognition device and object recognition method |
US10535169B2 (en) | 2016-11-02 | 2020-01-14 | United Parcel Service Of America, Inc. | Displaying items of interest in an augmented reality environment |
US10198711B2 (en) * | 2017-02-28 | 2019-02-05 | Walmart Apollo, Llc | Methods and systems for monitoring or tracking products in a retail shopping facility |
US10471478B2 (en) | 2017-04-28 | 2019-11-12 | United Parcel Service Of America, Inc. | Conveyor belt assembly for identifying an asset sort location and methods of utilizing the same |
US10592536B2 (en) | 2017-05-30 | 2020-03-17 | Hand Held Products, Inc. | Systems and methods for determining a location of a user when using an imaging device in an indoor facility |
EP3625960A4 (en) * | 2017-06-14 | 2021-03-10 | Roborep Inc. | Telepresence management |
US20180373327A1 (en) | 2017-06-26 | 2018-12-27 | Hand Held Products, Inc. | System and method for selective scanning on a binocular augmented reality device |
KR20200060704A (en) * | 2017-06-30 | 2020-06-01 | 클리어 데스티네이션 인크. | Systems and methods to expose and integrate multiple supply chains and delivery networks to optimize capacity utilization |
WO2019083822A1 (en) * | 2017-10-24 | 2019-05-02 | Walmart Apollo, Llc | System and method for identifying transition points in a retail facility |
US11023851B2 (en) * | 2018-03-30 | 2021-06-01 | A-1 Packaging Solutions, Inc. | RFID-based inventory tracking system |
US11348067B2 (en) * | 2018-03-30 | 2022-05-31 | A-1 Packaging Solutions, Inc. | RFID-based inventory tracking system |
US20200111050A1 (en) * | 2018-10-04 | 2020-04-09 | Ford Global Technologies, Llc | Method and apparatus for vehicle-assisted delivery fulfilment |
US11361277B2 (en) * | 2019-03-06 | 2022-06-14 | Walmart Apollo, Llc | Integrated container conveyance system |
WO2021042293A1 (en) * | 2019-09-04 | 2021-03-11 | 北京图森智途科技有限公司 | Auto-driving vehicle service system and method |
- 2018-12-19 US US16/226,180 patent/US11797910B2/en active Active
- 2021-08-25 US US17/411,495 patent/US20210383320A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US11797910B2 (en) | 2023-10-24 |
US20190122174A1 (en) | 2019-04-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210383320A1 (en) | Object location in a delivery vehicle | |
US11703345B2 (en) | Hands-free augmented reality system for picking and/or sorting assets | |
US11090689B2 (en) | Conveyor belt assembly for identifying an asset sort location and methods of utilizing the same | |
US11841452B2 (en) | Identifying an asset sort location | |
KR102347015B1 (en) | Vehicle tracking in a warehouse environment | |
US11580684B2 (en) | Displaying items of interest in an augmented reality environment | |
US10078916B2 (en) | Pick to augmented reality | |
US10268892B1 (en) | System and methods for volume dimensioning for supply chains and shelf sets | |
US11803803B2 (en) | Electronically connectable packaging systems configured for shipping items | |
JP2009012923A (en) | Moving device, system, moving method and moving program | |
US10360528B2 (en) | Product delivery unloading assistance systems and methods | |
US20170200115A1 (en) | Systems and methods of consolidating product orders | |
US20190259150A1 (en) | Autonomous marking system | |
JP2015184894A (en) | Information storage processing device, terminal device, control method, program, and storage medium | |
CA3046378A1 (en) | Identifying an asset sort location |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
AS | Assignment |
Owner name: UNITED PARCEL SERVICE OF AMERICA, INC., GEORGIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GIL, JULIO;REEL/FRAME:062736/0373 Effective date: 20210226 |
|
AS | Assignment |
Owner name: UNITED PARCEL SERVICE OF AMERICA, INC., GEORGIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TYLER, DANIEL PAUL;REEL/FRAME:062764/0596 Effective date: 20210823 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |