WO2022107000A1 - Automated tracking of inventory items for order fulfilment and replenishment - Google Patents


Info

Publication number
WO2022107000A1
Authority
WO
WIPO (PCT)
Prior art keywords
item
operator
position information
control server
storage facility
Prior art date
Application number
PCT/IB2021/060622
Other languages
French (fr)
Inventor
Akash Gupta
Sumit Tiwary
Ankit Agarwal
Original Assignee
Grey Orange Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Grey Orange Inc. filed Critical Grey Orange Inc.
Publication of WO2022107000A1 publication Critical patent/WO2022107000A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00: Administration; Management
    • G06Q 10/08: Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q 10/087: Inventory or stock management, e.g. order filling, procurement or balancing against orders

Definitions

  • Various embodiments of the disclosure relate generally to management of storage facilities. More specifically, various embodiments of the disclosure relate to methods and systems for automated tracking of inventory items in storage facilities.
  • Modern storage facilities handle a large number of inventory items on a daily basis.
  • the inventory items are handled within the storage facility for fulfilment of an order or brought inside the storage facility for replenishment of inventory stock.
  • Throughputs of such storage facilities have a direct bearing on various business metrics such as time taken to complete orders, total number of orders completed within a time duration, customer satisfaction, or the like.
  • a system for automated item tracking includes a plurality of imaging devices positioned within the storage facility and a control server.
  • the control server may be configured to receive, from at least one imaging device of the plurality of imaging devices, time-series images.
  • the control server may be further configured to process the time-series images to identify an item being handled by a first operator in the storage facility.
  • the control server may be further configured to tag the item with the first operator based on a successful identification of the item in the time-series images.
  • the control server may be further configured to track a movement of the item and the first operator, tagged with the item, within the storage facility based on subsequent time-series images captured by the at least one imaging device or one or more other imaging devices of the plurality of imaging devices.
  • the control server may be further configured to determine final position information of the item based on the tracked movement of at least one of the item and the first operator tagged with the item.
  • the control server may be further configured to compare the final position information of the item with desired position information of the item.
  • the control server may be further configured to detect a successful handling of the item or a failure in handling the item based on a result of the comparison between the final position information and the desired position information.
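The tag-track-compare flow described above can be sketched minimally as follows. `Position`, the unit/section identifiers, and `detect_handling_outcome` are hypothetical names chosen for illustration and are not taken from the disclosure:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Position:
    """A resolved location in the storage facility (storage unit plus section)."""
    unit_id: str
    section_id: str

def detect_handling_outcome(final: Position, desired: Position) -> bool:
    """Handling is detected as successful only when the final position matches
    the desired position exactly; any mismatch is flagged as a handling failure."""
    return final == desired

# Hypothetical usage: the item ended up in section S2 of unit U1,
# but the order required section S1, so handling fails.
final = Position(unit_id="U1", section_id="S2")
desired = Position(unit_id="U1", section_id="S1")
assert detect_handling_outcome(final, desired) is False
```

The frozen dataclass gives value equality for free, so the comparison step reduces to a single equality check over the resolved positions.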
  • the control server may be configured to receive, from at least one imaging device of the plurality of imaging devices, time-series images.
  • the control server may be further configured to process the time-series images to identify a batch of items being handled by a first operator in the storage facility.
  • the control server may be further configured to tag the batch of items with the first operator based on a successful identification of the batch of items in the time-series images.
  • the control server may be further configured to track a movement of the batch of items and the first operator, tagged with the batch of items, within the storage facility based on subsequent time-series images captured by the at least one imaging device or one or more other imaging devices of the plurality of imaging devices.
  • the control server may be further configured to determine final position information of the batch of items based on the tracked movement of at least one of the batch of items and the first operator tagged with the batch of items.
  • the control server may be further configured to compare the final position information of the batch of items with desired position information of the batch of items.
  • the control server may be further configured to detect a successful handling of the batch of items or a failure in handling the batch of items based on a result of the comparison between the final position information and the desired position information.
  • the handling of the item may be detected as successful when the result of the comparison indicates that the final position information is the same as the desired position information.
  • the handling of the item may be detected as a failure when the result of the comparison indicates that the final position information is different from the desired position information.
  • when the handling of the item is detected as a failure, the control server may be further configured to display one or more instructions on a display.
  • the one or more instructions indicate a sequence of actions to be performed on the item by the first operator to match the final position information of the item with the desired position information.
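The corrective-instruction step can be sketched as below; the function name and the plain-string position labels are illustrative assumptions, not names from the disclosure:

```python
def corrective_instructions(final: str, desired: str) -> list[str]:
    """Derive a minimal pick/put sequence that moves an item from its
    detected final position to the desired position."""
    if final == desired:
        return []  # positions already match; nothing to correct
    return [
        f"Pick the item from {final}",
        f"Put the item at {desired}",
    ]

steps = corrective_instructions("U1/S2", "U1/S1")
assert steps == ["Pick the item from U1/S2", "Put the item at U1/S1"]
```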
  • the control server may be further configured to control a visual indicator mechanism to provide visual cues for indicating the desired position information of the item.
  • the control server may be further configured to communicate a set of instructions to the at least one imaging device or the one or more other imaging devices to orient in a direction of movement of the item.
  • the control server may be further configured to process the subsequent time-series images to identify at least one of the item or the first operator tagged with the item. The movement of the item and the first operator may be tracked based on the identification of the item or the first operator in the subsequent time-series images.
  • the control server may be further configured to generate a movement trajectory of the item in the storage facility based on the identification of the item or the first operator in the subsequent time-series images.
  • a starting point of the movement trajectory corresponds to an initial position information of the item and an ending point of the movement trajectory corresponds to the final position information of the item.
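A movement trajectory of this kind can be represented as a time-ordered list of observations whose first and last entries give the initial and final positions. The timestamps and marker labels below are illustrative:

```python
# Each trajectory point pairs a timestamp with the fiducial marker (or
# coordinate) observed for the item in one image of the time series.
trajectory = [
    ("2021-11-16T10:00:00", "FM1"),  # starting point: initial position
    ("2021-11-16T10:00:07", "FM2"),
    ("2021-11-16T10:00:15", "RM1"),  # ending point: final position
]

initial_position = trajectory[0][1]
final_position = trajectory[-1][1]
assert initial_position == "FM1"
assert final_position == "RM1"
```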
  • the control server may track the movement of the item based on the movement of the first operator.
  • the control server may be further configured to detect, based on the subsequent time-series images, that the item is handled by a second operator different from the first operator.
  • the control server may be further configured to re-tag the item with the second operator based on the detection that the item is handled by the second operator.
  • the control server may be further configured to track a movement of the second operator, tagged with the item, within the storage facility based on the subsequent time-series images. The final position information of the item is further determined based on the tracked movement of the second operator.
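The re-tagging behavior can be sketched as a simple mapping update; `retag` and the item/operator identifiers are hypothetical names used only for illustration:

```python
def retag(tags: dict, item_id: str, new_operator: str) -> dict:
    """Re-associate an item with the operator now observed handling it,
    so that subsequent tracking follows the new operator."""
    updated = dict(tags)  # leave the previous tagging state untouched
    updated[item_id] = new_operator
    return updated

tags = {"item-42": "operator-120a"}             # tagged with the first operator
tags = retag(tags, "item-42", "operator-120b")  # second operator detected
assert tags["item-42"] == "operator-120b"
```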
  • the control server may be further configured to execute one or more image processing operations on the time-series images to identify the item.
  • FIG. 1 is a block diagram that illustrates an exemplary environment of a storage facility, in accordance with an exemplary embodiment of the disclosure
  • FIG. 2 is a block diagram that illustrates a control server of FIG. 1, in accordance with an exemplary embodiment of the disclosure
  • FIGS. 3A-3E are schematic diagrams that, collectively, illustrate an exemplary scenario for tracking movement of inventory items in a storage facility, in accordance with an exemplary embodiment of the disclosure
  • FIG. 4 is a block diagram that illustrates a system architecture of a computer system, in accordance with an exemplary embodiment of the disclosure
  • FIG. 5 is a flowchart that illustrates a process for tracking movement of inventory items in a storage facility, in accordance with an exemplary embodiment of the disclosure.
  • FIGS. 6A-6B are high-level flowcharts that, collectively, illustrate a process for tracking movement of inventory items in a storage facility, in accordance with an exemplary embodiment of the disclosure.
  • Certain embodiments of the disclosure may be found in disclosed systems and methods for tracking movement of inventory items within a storage facility. Exemplary aspects of the disclosure provide methods for tracking and monitoring inventory items.
  • the methods and systems of the disclosure provide a solution for facilitating automated tracking of inventory items within a storage facility.
  • the methods and systems disclosed herein eliminate the scope of human error during handling of inventory items in the storage facility.
  • FIG. 1 is a block diagram that illustrates an exemplary environment of a storage facility, in accordance with an exemplary embodiment of the disclosure.
  • the environment 100 shows the storage facility 102.
  • the storage facility 102 includes a storage area 104, first through third operator stations 106a-106c (hereinafter, the first through third operator stations 106a-106c are collectively referred to as ‘the operator stations 106’), a plurality of transport vehicles (for example, a transport vehicle 108), a plurality of imaging devices (e.g., imaging devices 110a-110e) controlled by an image acquisition device 111, and a control server 112.
  • the control server 112 communicates with the operator stations 106, the transport vehicle 108, and the image acquisition device 111 by way of a communication network 114 or through separate communication networks established therebetween.
  • Examples of the communication network 114 may include, but are not limited to, a wireless fidelity (Wi-Fi) network, a light fidelity (Li-Fi) network, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a satellite network, the Internet, a fiber optic network, a coaxial cable network, an infrared (IR) network, a radio frequency (RF) network, or a combination thereof.
  • Various entities in the environment 100 may be coupled to the communication network 114 in accordance with various wired and wireless communication protocols, such as Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Long Term Evolution (LTE) communication protocols, or any combination thereof.
  • the storage facility 102 stores multiple inventory items for fulfillment of one or more orders, maintenance of inventory stock, and/or selling of one or more inventory items.
  • Examples of the storage facility 102 may include, but are not limited to, a forward warehouse, a backward warehouse, a fulfillment center, or a retail store (e.g., a supermarket, an apparel store, or the like).
  • Examples of the inventory items may include, but are not limited to, groceries, apparel, or the like.
  • the inventory items are stored in the storage area 104.
  • the storage area 104 may be of any shape, for example, a rectangular shape.
  • the storage area 104 includes first through third storage units 116a-116c that are arranged within the storage area 104.
  • the first through third storage units 116a-116c are collectively referred to as ‘the storage units 116’.
  • One or more inventory items are allocated to each storage unit 116a-116c and each storage unit 116a-116c stores the corresponding allocated inventory items.
  • the storage units 116 may have different shapes, sizes, and dimensions.
  • the terms ‘inventory items’ and ‘items’ are used interchangeably.
  • the storage units 116 are arranged such that first through fourth aisles 118a-118d are formed therebetween.
  • the first through fourth aisles 118a-118d are collectively referred to as ‘the aisles 118’.
  • the first aisle 118a is formed between the first and second storage units 116a and 116b.
  • the second aisle 118b is formed between the second and third storage units 116b and 116c.
  • the third and fourth aisles 118c and 118d are formed between side faces of the storage units 116 and a sidewall of the storage area 104.
  • the aisles 118 are passageways used by customers, operators, or the transport vehicle 108 to move in the storage area 104.
  • Arrangement of the storage units 116 may be performed in any desired configuration known to those of skill in the art. In a non-limiting example, it is assumed that the storage units 116 are arranged such that a layout of the aisles 118 forms a virtual grid in a rectangular space.
  • each aisle 118 is one of a horizontal aisle or a vertical aisle.
  • the first aisle 118a is a vertical aisle
  • the fourth aisle 118d is a horizontal aisle.
  • An intersection between horizontal and vertical aisles forms a cross-aisle.
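Under the virtual-grid assumption, aisles and cross-aisles can be modeled as grid lines and their intersections. The column and row indices below are illustrative, not taken from the disclosure:

```python
# Vertical aisles run along fixed column indices of the grid and
# horizontal aisles along fixed row indices; a cross-aisle exists at
# every (column, row) pair where one of each intersects.
vertical_aisles = {1, 3}    # e.g. a vertical aisle such as the first aisle 118a
horizontal_aisles = {0, 4}  # e.g. a horizontal aisle such as the fourth aisle 118d

cross_aisles = {(c, r) for c in vertical_aisles for r in horizontal_aisles}
assert (1, 0) in cross_aisles  # the two example aisles meet in a cross-aisle
```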
  • the storage facility 102 may be marked with various fiducial markers (e.g., fiducial markers FM1, FM2, RM1, and RM2).
  • Fiducial markers are markers (or identifiers) placed in the storage facility 102 for uniquely identifying different locations in the storage facility 102, different sections of each storage unit 116a- 116c, or the like.
  • the storage area 104 has been shown to include multiple fiducial markers and only the fiducial markers FM1, FM2, RM1, and RM2 have been labeled. It will be apparent to those of skill in the art that the entire storage facility 102 may include the fiducial markers without deviating from the scope of the disclosure.
  • Each fiducial marker may correspond to one of two types: location markers (e.g., the fiducial markers FM1 and FM2) and storage unit markers (e.g., the fiducial markers RM1 and RM2).
  • the location markers (e.g., the fiducial markers FM1 and FM2) are located at pre-determined locations in the storage facility 102.
  • the pre-determined locations may not conform to any specific pattern and may be subject to a configuration of the storage facility 102.
  • the fiducial markers FM1 and FM2 are located at first and second locations (e.g., on the floor of the storage area 104) along the first and second aisles 118a and 118b, respectively.
  • the storage unit markers may uniquely identify different storage units or different sections of each storage unit 116a-116c.
  • the fiducial markers RM1 and RM2 uniquely identify shelves that partly constitute the first and second storage units 116a and 116b, respectively.
  • the fiducial markers include, but are not limited to, barcodes, quick response (QR) codes, radio frequency identification device (RFID) tags, or the like.
  • the placement of the fiducial markers may be non-uniform (i.e., a distance between consecutive fiducial markers is variable).
  • the storage facility 102 may not be marked with any fiducial markers.
  • different locations in the storage facility 102 may be determined by using global positioning system (GPS) coordinates or other localization technology-based coordinates.
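One way to resolve positions without GPS is a lookup from each fiducial marker to a surveyed floor coordinate. The registry below is a minimal sketch; the coordinates and the `resolve_position` helper are invented for illustration:

```python
# Minimal marker registry: each fiducial marker maps to a surveyed
# (x, y) floor coordinate, so reading a marker resolves a position.
marker_positions = {
    "FM1": (2.0, 5.5),  # location marker along the first aisle
    "FM2": (6.0, 5.5),  # location marker along the second aisle
}

def resolve_position(marker_id: str):
    """Return the known floor coordinate for a marker, or None if unmapped."""
    return marker_positions.get(marker_id)

assert resolve_position("FM1") == (2.0, 5.5)
assert resolve_position("UNKNOWN") is None
```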
  • the operator stations 106 in the storage facility 102 may refer to pick-and-put stations (PPSs) for holding inventory items that are to be placed in the storage units 116 or the inventory items that are retrieved from the storage units 116.
  • Each operator station 106 may be manned by one or more operators.
  • the first through third operator stations 106a-106c are manned by first through third operators 120a-120c, respectively.
  • the storage units 116 are transported to the operator stations 106 by the plurality of transport vehicles (e.g., the transport vehicle 108).
  • the storage facility 102 is shown to include three operator stations 106, it will be apparent to those of skill in the art that the storage facility 102 may include any number of operator stations without deviating from the scope of the disclosure.
  • the operator stations 106 may include a display device (as shown in FIG. 3B) that receives various commands or instructions from the control server 112 for placing the inventory items in the storage units 116 or retrieving the inventory items from the storage units 116. Based on the received commands or instructions, the first through third operators 120a-120c at the operator stations 106 may place the inventory items in the storage units 116 or retrieve the inventory items from the storage units 116.
  • An item placement operation involves, in some examples, picking up the inventory items from one or more storage bins (shown in FIG. 3B) at the operator stations 106 and placing the picked inventory items in one of the storage units 116.
  • an item retrieval operation involves retrieving one or more inventory items from a storage unit 116a, 116b, or 116c and placing the retrieved inventory items in one of the storage bins in a temporary storage at the operator stations 106 or one of the storage bins carried by a mobile robotic apparatus.
  • the operator stations 106 may include robotic operators for performing item placement and item retrieval operations, without deviating from the scope of the disclosure.
  • the first through third operators 120a-120c may be associated with respective first through third portable devices 121a-121c.
  • the first through third portable devices 121a-121c may be configured to receive one or more instructions from the control server 112 for providing one or more sensory inputs (such as vibrations, audio sounds, or visual cues) to the respective first through third operators 120a-120c.
  • the one or more sensory inputs are provided to alert the first through third operators 120a-120c regarding an error committed by the respective first through third operators 120a-120c, a pick/put operation to be performed for handling the item, or an emergency situation in the storage facility 102, and/or the like.
  • the first through third portable devices 121a-121c may be controlled by the control server 112 to provide the sensory inputs to the respective first through third operators 120a-120c.
  • the first through third portable devices 121a-121c may be controlled by the control server 112 to display the one or more instructions for handling the item.
  • the first through third portable devices 121a-121c may be wearable devices, for example, smartwatches, smart pendants, wearable computers, and/or the like.
  • the first through third portable devices 121a-121c may be cellphones or mobile phones of the corresponding first through third operators 120a-120c.
  • the transport vehicle 108 is a robotic apparatus that moves within the storage facility 102.
  • the transport vehicle 108 is an automatic guided vehicle (AGV) that is responsive to commands received from the control server 112.
  • the transport vehicle 108 may include suitable logic, instructions, circuitry, interfaces, and/or codes, executable by the circuitry, for transporting payloads (e.g., the storage units 116) in the storage facility 102 based on the commands received from the control server 112.
  • the transport vehicle 108 may carry and transport the storage units 116 from the storage area 104 to the operator stations 106 and from the operator stations 106 to the storage area 104 for fulfillment of orders, replenishment of inventory stock, loading of inventory items into the storage units 116, and/or the like.
  • the transport vehicle 108 may be configured to read the fiducial markers (e.g., the fiducial markers FM1, FM2, RM1, and RM2).
  • the transport vehicle 108 may include various sensors (e.g., imaging devices, RFID sensors, and/or the like) for reading the fiducial markers.
  • the transport vehicle 108 may utilize the fiducial markers for determining a relative position of the transport vehicle 108 within the storage facility 102 and/or identifying the storage units 116.
  • the storage facility 102 is shown to have a single transport vehicle (i.e., the transport vehicle 108). It will be apparent to those of skill in the art that the storage facility 102 may include any number of transport vehicles without deviating from the scope of the disclosure.
  • the storage facility 102 may further include the plurality of imaging devices 110a-110e installed at different locations.
  • the storage facility 102 has been shown to include multiple imaging devices and only five imaging devices 110a, 110b, 110c, 110d, and 110e have been labeled.
  • the imaging devices 110a-110e may be installed in such a way that the storage units 116, the transport vehicle 108, and the operator stations 106, along with the corresponding operators 120a-120c, are captured from various angles and orientations.
  • the field of view of the imaging devices 110a-110e is set in such a way that there are no unmonitored zones in the storage facility 102.
  • the plurality of imaging devices 110a-110e are configured to capture time-series images of different locations within the storage facility 102.
  • the storage facility 102 may include multiple imaging devices (for example, high definition cameras, imaging devices, scanners, or the like) of varying configurations installed at different locations within the storage facility 102 without deviating from the scope of the disclosure.
  • each imaging device of the plurality of imaging devices 110a-110e may have high resolution and may be configured to capture the smallest of details and movements within the storage facility 102.
  • Each of the imaging devices 110a-110e may be communicably coupled to the control server 112 via the communication network 114 and the image acquisition device 111.
  • one or more of the plurality of imaging devices 110a-110e may have a dynamic field of view and hence may change orientation based on an instruction received from the control server 112.
  • In an embodiment, the plurality of imaging devices 110a-110e may be mounted on walls of the storage facility 102. In another embodiment, the plurality of imaging devices 110a-110e may be mounted on the storage units 116 and/or the storage bins. In another embodiment, the plurality of imaging devices 110a-110e may be mounted on the transport vehicle 108. In another embodiment, the plurality of imaging devices 110a-110e may be mounted on structures (e.g., a vertical pole) positioned at different locations within the storage facility 102. In some embodiments, the plurality of imaging devices 110a-110e may be controlled by at least one of the image acquisition device 111 and the control server 112.
  • the plurality of imaging devices 110a-110e may be controlled to focus at different locations and objects based on the movement of the inventory items.
  • the plurality of imaging devices 110a-110e may be further controlled to rotate at different angles.
  • the plurality of imaging devices 110a-110e may be controlled to operate in different modes, such as day-light mode and night mode, based upon ambient light within the storage facility 102.
  • the plurality of imaging devices 110a-110e may be further controlled to manipulate corresponding configurations (for example, aperture, shutter speed, flash, color intensity, warmth, hue, and the like) such that the plurality of imaging devices 110a-110e are optimized to capture clear and unambiguous images and/or video.
  • the plurality of imaging devices 110a-110e communicate their corresponding time-series image data and/or video data to the control server 112. In another embodiment, the plurality of imaging devices 110a-110e communicate corresponding time-series image data and/or video data to the image acquisition device 111, which may communicate the time-series image data and/or video data received from the plurality of imaging devices 110a-110e to the control server 112.
  • the plurality of imaging devices 110a-110e may periodically communicate the time-series image data to the control server 112 and/or the image acquisition device 111.
  • the plurality of imaging devices 110a-110e may continuously transmit a live feed of the time-series image data and/or video data to the control server 112 and/or the image acquisition device 111.
  • the plurality of imaging devices 110a-110e may be configured to transmit the time-series image data and/or video data upon being prompted by the image acquisition device 111 and/or the control server 112.
  • the plurality of imaging devices 110a-110e may be configured to transmit the time-series image data upon detecting a movement associated with one or more inventory items, the transport vehicle 108, the storage units 116, the first through third operators 120a-120c, or any other object in the storage facility 102.
  • each of the plurality of imaging devices 110a-110e may be configured to track the first through third operators 120a-120c, the storage units 116, the storage bins, and/or the transport vehicle 108 in order to capture the movement of the inventory items during pick or put sessions associated with the inventory items.
  • the image acquisition device 111 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to perform one or more operations for collecting the time-series image data and/or video data from the plurality of imaging devices 110a-110e.
  • the time-series image data may refer to a temporal sequence of images captured by the plurality of imaging devices HOa-l lOe.
  • the time-series images may include one or more images in which the item is shown being picked by the corresponding operator.
  • the time-series image data and/or video data may include a live feed, a recorded video, an image, or the like that includes visual data associated with the movement of the inventory items inside the storage facility 102.
  • the image acquisition device 111 may be further configured to filter the time-series image data and/or video data for image optimization and noise filtration.
  • the image acquisition device 111 may be further configured to select relevant image data from cumulative time-series image data and/or video data of each imaging device 110a-110e. Selection of the relevant image data from the cumulative time-series image data and/or video data may be performed based on identification of the item or the operator. In an instance, when at least one image or video frame in the time-series images or video data presents the item being handled by the corresponding operator, that image or video frame and subsequent images in the time-series images or video data may be selected as the relevant image data.
  • At least one image of the time-series images captured by a specific imaging device may present the item being handled by the corresponding operator.
  • subsequent time-series images captured by the specific imaging device may be selected as the relevant image data.
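The frame-selection rule described above (keep the first frame that shows the item being handled, plus everything after it) can be sketched as follows; `select_relevant` and the frame labels are illustrative names:

```python
def select_relevant(frames, shows_handling):
    """Return the first frame in which item handling is detected together
    with every subsequent frame; earlier frames are discarded as irrelevant."""
    for i, frame in enumerate(frames):
        if shows_handling(frame):
            return frames[i:]
    return []  # no frame shows handling, so nothing is relevant

frames = ["f0", "f1", "f2", "f3"]
relevant = select_relevant(frames, lambda f: f == "f2")  # handling first seen in f2
assert relevant == ["f2", "f3"]
```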
  • the image acquisition device 111 may be further configured to communicate the cumulative time-series image data and/or video data or the relevant image data to the control server 112.
  • the image acquisition device 111 may periodically communicate the cumulative time-series image data and/or video data or the relevant image data to the control server 112. In another embodiment, the image acquisition device 111 may continuously transmit a live feed of the cumulative time-series image data and/or video data to the control server 112. In another embodiment, the image acquisition device 111 may be configured to transmit the cumulative time-series image data and/or video data or the relevant image data upon being prompted by the control server 112.
  • the image acquisition device 111 may be configured to transmit the cumulative time-series image data and/or video data or the relevant image data upon observing a movement associated with one or more inventory items, the transport vehicle 108, the storage units 116, the first through third operators 120a- 120c, or any other object in the storage facility 102.
  • the image acquisition device 111 may be configured to perform the transmission based on a size of the cumulative time-series image data and/or video data or the relevant image data being greater than a threshold value.
  • the storage facility 102 may further include one or more microphones and/or an audio input device 109.
  • the audio input device 109 may be configured to capture audio sound, such as words spoken by the first through third operators 120a-120c, or sound generated due to falling of the inventory items, movements of the transport vehicle 108, movements of the storage units 116, movements of the inventory items, and/or movements of the storage bins.
  • the audio input device 109 may be communicably coupled to the control server 112 via the communication network 114.
  • the audio input device 109 may be configured to communicate audio data associated with the storage facility 102 to the control server 112.
  • the control server 112 may be a network of computers, a software framework, or a combination thereof, that may provide a generalized approach to create the server implementation.
  • Examples of the control server 112 include, but are not limited to, personal computers, laptops, mini-computers, mainframe computers, any non-transient and tangible machine that can execute a machine-readable code, cloud-based servers, distributed server networks, or a network of computer systems.
  • the control server 112 may be realized through various web-based technologies such as, but not limited to, a Java web-framework, a .NET framework, a personal home page (PHP) framework, or any web-application framework.
  • the control server 112 may be maintained by a warehouse management authority or a third-party entity that facilitates inventory management operations for the storage facility 102. It will be apparent to a person of ordinary skill in the art that the control server 112 may perform other warehouse management operations as well along with the inventory item tracking and management operations.
  • the control server 112 may be configured to store, in a memory of the control server 112, a virtual map of the storage facility 102 and inventory storage data (as shown in FIG. 9) of the inventory stock.
  • the virtual map is indicative of the current location of the storage units 116, the operator stations 106, entry and exit points of the storage facility 102, the fiducial markers in the storage facility 102, a current location of the transport vehicle 108, or the like.
  • the inventory storage data is indicative of associations between the inventory items stored in the storage facility 102 and the storage units 116 in the storage facility 102.
  • the inventory storage data may further include historic storage locations of each inventory item.
  • the inventory storage data further includes parameters (for example, weight, shape, size, color, dimensions, or the like) associated with each inventory item.
  • the control server 112 may be configured to receive, from at least one imaging device (for example, the imaging device 110a) of the plurality of imaging devices 110a-110e, time-series images.
  • the control server 112 may be further configured to process the time-series images to identify the item being handled by an operator (for example, the first operator 120a) in the storage facility 102.
  • the control server 112 may be further configured to tag the item with the first operator 120a based on the successful identification of the item in the time-series images.
  • the control server 112 may be further configured to track the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 based on subsequent time-series images captured by the at least one imaging device (e.g., the imaging device 110a) or one or more other imaging devices (for example, the imaging devices 110b and 110c) of the plurality of imaging devices 110a-110e.
  • the control server 112 may be further configured to determine a final position information of the item based on the tracked movement of at least one of the item and the first operator 120a tagged with the item.
  • the control server 112 may be further configured to compare the final position information of the item with a desired position information of the item.
  • the control server 112 may be further configured to detect the successful handling of the item or the failure in handling the item based on the result of the comparison between the final position information and the desired position information.
  • the control server 112 may be configured to receive, from at least one imaging device (for example, the imaging device 110a) of the imaging devices 110a-110e, time-series images.
  • the time-series images may include a time series of image frames captured one after the other; the images in the time-series images are temporally related to each other.
  • the control server 112 may be further configured to process the time-series images to identify an item being handled by an operator (for example, the first operator 120a) in the storage facility 102.
  • the item may be handled by the first operator 120a at an operator station (for example, the first operator station 106a) by executing one or more pick and/or put operations by picking the item from an initial position (for example, a shelf of a storage unit) and putting the item at a final position (for example, a storage bin).
  • the item may be handled by the first operator 120a in the storage area 104 by picking the item from an initial position (for example, a shelf of a storage unit) and putting the item at a final position (for example, a storage bin or another shelf of the same or a different storage unit).
  • the processing of the time-series images may be performed by executing one or more image processing operations on the time-series images.
  • Examples of the image processing operations may include, but are not limited to, a bounding box technique to identify regions of interest in the time-series images.
  • the time-series images may be processed by executing a plurality of operations such as image acquisition, image enhancement, feature extraction, and object recognition on the time-series images.
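As a non-limiting illustration (not part of the claimed method), the region-of-interest comparison that such a bounding-box technique enables can be sketched in Python. The axis-aligned (x1, y1, x2, y2) box format, the upstream detector producing the boxes, and the 0.5 coverage threshold are all assumptions made for this sketch.

```python
# Sketch: treat an item as "being handled" when the item's detected bounding
# box is largely covered by the operator's bounding box.
# Boxes are (x1, y1, x2, y2); upstream object detection is assumed.

def overlap_fraction(item_box, operator_box):
    """Fraction of the item box's area covered by the operator box."""
    ix1 = max(item_box[0], operator_box[0])
    iy1 = max(item_box[1], operator_box[1])
    ix2 = min(item_box[2], operator_box[2])
    iy2 = min(item_box[3], operator_box[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    item_area = (item_box[2] - item_box[0]) * (item_box[3] - item_box[1])
    return inter / item_area if item_area else 0.0

def is_handling(item_box, operator_box, threshold=0.5):
    """True when the item box lies mostly within the operator's region of interest."""
    return overlap_fraction(item_box, operator_box) >= threshold

# An item box fully inside the operator's box is treated as being handled.
handled = is_handling((40, 80, 60, 110), (0, 0, 100, 200))
```

The sketch covers only the region-of-interest comparison; the acquisition, enhancement, feature-extraction, and object-recognition stages mentioned above are assumed to run upstream, and the threshold would be tuned per camera placement.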
  • the control server 112 may be configured to tag the item with the first operator 120a.
  • the tagging of the item with the first operator 120a is indicative of a coexistence of the item and the first operator 120a, in subsequent time-series images thereof, regardless of the item being visible or obscured from vision.
  • the tagging of the item with the first operator 120a results in the item and the first operator 120a being considered as a single unit.
  • the item may be tagged with the first operator 120a based on one or more tagging or labelling techniques known in the art.
  • the control server 112 may maintain a reference database that includes various items tagged to different operators.
  • the reference database may be a look-up table and each row of the look-up table may indicate a unique tagging between an operator and an item.
  • a first row of the look-up table may include two cells: one for a unique identifier allocated to the first operator 120a and another for a unique identifier allocated to an item being handled by the first operator 120a.
  • the control server 112 determines which items are being handled by which operators at all times. In some scenarios, some items may not be handled by any operator or some operators may not be involved in item handling. For such items or operators, the look-up table may not include any tagging.
  • the control server 112 may be configured to update the reference database to add new rows for newly tagged items and operators, delete previous rows upon completion of a handling operation, and modify previous rows due to change in tagging between items and operators.
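The add, delete, and modify operations on the look-up table described above might be kept as a simple in-memory mapping. The following Python sketch is illustrative only; identifiers such as "ITEM-310" and "OP-120a" are assumptions, not from the source.

```python
# Sketch of the reference database of operator-item taggings: each entry is
# one "row" pairing an item identifier with the operator currently handling it.

class TaggingRegistry:
    def __init__(self):
        self._tags = {}  # item_id -> operator_id

    def tag(self, item_id, operator_id):
        """Add a new row, or modify an existing one (e.g., after a hand-over)."""
        self._tags[item_id] = operator_id

    def untag(self, item_id):
        """Delete the row once the handling operation completes."""
        self._tags.pop(item_id, None)

    def operator_for(self, item_id):
        """Operator tagged to the item, or None for items not being handled."""
        return self._tags.get(item_id)

registry = TaggingRegistry()
registry.tag("ITEM-310", "OP-120a")   # item picked up by the first operator
registry.tag("ITEM-310", "OP-120b")   # row modified after a hand-over
registry.untag("ITEM-310")            # row deleted when handling completes
```

Items not being handled by any operator simply have no row, matching the description above.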
  • the control server 112 may be further configured to generate a movement trajectory of the item in the storage facility 102 based on identification of the item or the first operator 120a in subsequent time-series images captured by the plurality of imaging devices 110a-110e.
  • the subsequent time-series images may be one or more images captured by the imaging device 110a and one or more other imaging devices of the plurality of imaging devices 110a-110e that may have the item and/or the first operator 120a in a corresponding field of view.
  • the control server 112 may be configured to process the subsequent time-series images captured by the imaging device 110a and the one or more other imaging devices of the plurality of imaging devices 110a-110e that have the item and/or the first operator 120a in a corresponding field of view.
  • the control server 112 may be further configured to cause the one or more other imaging devices to capture time-series images of a portion of the storage facility 102 where the item is currently being handled by the first operator 120a.
  • the field of view of the one or more other imaging devices may overlap with a field of view of the imaging device 110a.
  • the one or more other imaging devices may vary with time based on a movement of the item within the storage facility 102.
  • the one or more other imaging devices may include an imaging device 110b.
  • the control server 112 may determine an imaging device 110c as the other imaging device for receiving the subsequent time-series images. Subsequently, based on identification of one of (i) the item and the first operator 120a or (ii) only the first operator 120a, in the subsequent time-series images of the imaging device 110a and/or the other imaging devices (for example, the imaging devices 110b and 110c), the control server 112 may be configured to determine a movement trajectory of the item within the storage facility 102.
  • a starting point of the movement trajectory corresponds to an initial position information of the item and an ending point of the movement trajectory corresponds to the final position information of the item.
  • the starting point of the movement trajectory may refer to a position of the item within the storage facility 102 where the item was identified to be handled by the first operator 120a initially.
  • the initial position information may include a fiducial marker of a storage unit or a location within the storage facility 102 where the item may have been stored initially, an identifier of a storage bin, or the like.
  • the ending point of the movement trajectory may be a position of the item where subsequent time-series images may present the item being separated from the first operator 120a (or any other operator) handling the item.
  • the final position information indicated by the ending point of the movement trajectory may be indicative of a location within the storage facility 102 where the item was separated from the first operator 120a.
  • the tagging between the item and the first operator 120a (or any other operator) handling the item may be removed once the item reaches the ending point of the movement trajectory.
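The starting-point/ending-point structure described above can be sketched as timestamped detections, where the earliest detection gives the initial position and the last detection (where the item separates from the operator) gives the final position. The position labels below are illustrative assumptions.

```python
# Sketch: a movement trajectory as an ordered list of (timestamp, position)
# detections; the starting point is the earliest detection after tagging and
# the ending point is the last detection before the item separates from the
# operator. Position labels are illustrative.

def trajectory_endpoints(detections):
    """detections: iterable of (timestamp, position); returns (initial, final) positions."""
    ordered = sorted(detections)
    return ordered[0][1], ordered[-1][1]

track = [(0, "shelf-116a"), (5, "aisle-3"), (9, "bin-320")]
initial_position, final_position = trajectory_endpoints(track)
```

Once the ending point is reached, the tagging between item and operator would be removed, as the description notes.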
  • the imaging device 110a may capture the time-series images in which the item is identified to be picked up by the first operator 120a from the storage unit 116a and based on such identification the control server 112 may tag the item with the first operator 120a.
  • different imaging devices 110a-110e may capture the first operator 120a and/or the item at different instances of time.
  • the control server 112 continues to process images of the plurality of imaging devices 110a-110e to identify the first operator 120a and/or the item in the subsequent images after the tagging.
  • the control server 112 may determine that a field of view of the imaging device 110a may overlap with that of the imaging device 110b. Therefore, the control server 112 may process the subsequent time-series images of the imaging devices 110a and 110b to locate (or track) the movement of the first operator 120a and the handled item. In an instance, the control server 112 may identify the item and/or the first operator 120a in the subsequent time-series images captured by the imaging device 110b. Thus, the control server 112 may select the imaging device 110b as the other imaging device for tracking the movement of the first operator 120a and the handled item.
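One way to picture this device-selection step is a filter that keeps only the feeds whose latest frame shows the tagged operator or the handled item. This Python sketch is illustrative; the device ids, entity labels, and per-frame detection sets are assumptions.

```python
# Sketch: keep processing only the imaging-device feeds whose latest frame
# shows the tagged operator or the handled item; other feeds are discarded.

def select_devices(latest_detections, targets):
    """latest_detections: {device_id: set of entity ids seen in the last frame}."""
    return sorted(dev for dev, seen in latest_detections.items() if seen & targets)

frames = {
    "110a": {"OP-120a", "ITEM-310"},  # overlapping field of view, both visible
    "110b": {"OP-120a"},              # item obscured, operator still visible
    "110c": set(),                    # nothing relevant; feed is discarded
}
active = select_devices(frames, {"OP-120a", "ITEM-310"})
```

Because the item is tagged to the operator, a feed showing only the operator (110b above) is still kept for tracking.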
  • in another instance, the control server 112 may discard the images captured by the imaging device 110b for tracking the movement of the first operator 120a and the handled item.
  • the control server 112 may be configured to identify the presence or absence of at least one of the item and the first operator 120a in time-series images captured by the plurality of imaging devices 110a-110e to keep track of the first operator 120a and the handled item.
  • the control server 112 may be configured to communicate a set of instructions to one or more imaging devices of the plurality of imaging devices 110a-110e to orient in a direction of movement of the item. The control server 112 may communicate such instructions to the one or more imaging devices when the current field of view of any imaging device 110a-110e is not oriented in the direction of the movement of the item and/or the tagged first operator 120a.
  • the control server 112 may be configured to track the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 based on subsequent time-series images captured by the imaging device 110a and/or the one or more other imaging devices 110b and 110c of the plurality of imaging devices 110a-110e.
  • the control server 112 may track the movement of the item by identifying (e.g., detecting presence or absence of) the item and/or the first operator 120a in the subsequent time-series images.
  • the control server 112 may be configured to track the movement of the item based on the determined movement trajectory of the item within the storage facility 102.
  • the control server 112 may be configured to track the movement of the item based on the movement of the first operator 120a. Since the item is tagged with the first operator 120a, for each image where the first operator 120a is visible, the item is considered to coexist with the first operator 120a. Therefore, the control server 112 may be configured to track the movement of the item by tracking the first operator 120a in the subsequent time-series images.
  • the control server 112 may be configured to track the movement of the item by processing a real-time video that captures the first operator 120a moving in the storage facility 102 along with the item.
  • the control server 112 may be configured to identify the item or the first operator 120a in the real-time video captured by the imaging device 110a and the other imaging devices 110b and 110c, and may track the item or the first operator 120a based on one or more object tracking techniques (for example, a bounding box technique, thermal imaging for facial recognition, face recognition, feature extraction, or the like).
  • the control server 112 may be configured to detect, based on the subsequent time-series images, that the item is being handled by the second operator 120b, who is different from the first operator 120a.
  • the subsequent time-series images display the item being in the possession of the second operator 120b instead of the first operator 120a.
  • the control server 112 may detect, based on the subsequent time-series images, a transfer of the item from the first operator 120a to the second operator 120b to detect that the item is currently being handled by the second operator 120b that is different from the first operator 120a tagged with the item.
  • the control server 112 may be further configured to re-tag the item with the second operator 120b based on the detection that the item is handled by a different operator, e.g., the second operator 120b.
  • the tagging of the item with the second operator 120b may be performed similar to the tagging of the item with the first operator 120a.
  • the control server 112 may be configured to track a movement of the second operator 120b, tagged with the item, within the storage facility 102 based on the subsequent time-series images.
  • the final position information of the item is further determined based on the tracked movement of the second operator 120b.
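The hand-over detection and re-tagging described above can be sketched as a comparison of per-frame possession labels; a change of possession between consecutive frames triggers re-tagging. The per-frame labels below are illustrative and would come from the image processing described earlier.

```python
# Sketch: detect a hand-over by comparing which operator possesses the item
# in consecutive frames; a change of possession triggers re-tagging.

def detect_transfer(possession_by_frame):
    """Return (previous_operator, new_operator) at the first hand-over, else None."""
    for prev, cur in zip(possession_by_frame, possession_by_frame[1:]):
        if prev and cur and prev != cur:
            return prev, cur
    return None

frames = ["OP-120a", "OP-120a", "OP-120b", "OP-120b"]
handover = detect_transfer(frames)
```

When a hand-over is returned, the tagging row for the item would be modified to point at the new operator, and tracking would continue against the second operator.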
  • the item is considered to be tagged with the first operator 120a throughout the description.
  • the control server 112 is further configured to determine the final position information of the item based on the tracked movement of at least one of the item and the first operator 120a tagged with the item.
  • the final position information of the item is determined based on identification of the item being separated from the first operator 120a.
  • the final position information of the item corresponds to the end point of the movement trajectory of the item.
  • the final position information is indicative of the item having reached a final location after which the item is not required to be moved by the first operator 120a or any other operator for at least some time period.
  • the control server 112 may be configured to compare the final position information of the item with desired position information of the item.
  • the desired position information of the item may be indicative of a position within the storage facility 102 where the item has to reach for inventory replenishment, order fulfilment, or any other operation within the storage facility 102. That is to say, a desired position of the item refers to a position within the storage facility 102 where the item has to reach for a successful handling of the item.
  • the control server 112 may detect a successful handling of the item or a failure in handling the item based on a result of the comparison between the final position information and the desired position information. The handling of the item is detected as successful when the result of the comparison indicates that the final position information is the same as the desired position information. The handling of the item is detected as a failure when the result of the comparison indicates that the final position information is different from the desired position information.
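The comparison step reduces to an equality check between the final and desired positions; a minimal Python sketch follows, with illustrative bin identifiers.

```python
# Sketch of the comparison step: handling succeeds only when the item's final
# position matches its desired position; any mismatch is a failure.

def handling_result(final_position, desired_position):
    return "success" if final_position == desired_position else "failure"

correct_put = handling_result("bin-320", "bin-320")  # put into the desired bin
wrong_put = handling_result("bin-322", "bin-320")    # put into the wrong bin
```

On a failure, the control server would then issue the corrective instructions described below.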
  • the control server 112 may be further configured to display one or more instructions on a display (for example, a display of the first portable device 121a associated with the first operator 120a).
  • the one or more instructions indicate a sequence of actions to be performed on the item by the first operator 120a to match the final position information of the item with the desired position information.
  • the sequence of actions may include one or more pick/put operations to be executed by the first operator 120a to achieve the successful handling of the item.
  • the control server 112 may be further configured to control a visual indicator mechanism to provide visual cues for indicating the desired position information of the item.
  • the visual indicator may be a pick/put to light (PPTL) structure or a projector system corresponding to a shelf, a conveyor, a storage bin, a kiosk, a curbside locker, or the like that may be indicative of the desired position of the item.
  • the visual cues may be provided by way of one or more light emitting diodes (LEDs) positioned corresponding to the desired location.
  • the item is considered to be a single item being handled by the first operator 120a. In other embodiments, the item may be a batch of items being handled as a unit by the first operator 120a without deviating from the scope of the disclosure. For the sake of brevity, the ongoing description is described with respect to the first operator 120a. It will be apparent to a person of skill in the art that FIG. 1 shown herein is exemplary and does not limit the scope of the disclosure.
  • FIG. 2 is a block diagram that illustrates the control server 112, in accordance with an exemplary embodiment of the disclosure.
  • the control server 112 may include an image processor 202, an audio processor 204, a processor 206, a natural language processor 208, a memory 210, a machine learning engine 212, and a network interface 214.
  • the image processor 202 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to perform one or more image processing operations for processing the time-series image data and/or video data captured by the plurality of imaging devices 110a-110e.
  • the image processor 202 may process the time-series image data and/or video data to filter out noise and optimize the time-series image data and/or video data.
  • the image processor 202 may further process the time-series image data and/or video data to detect one or more entities (e.g., the first through third operators 120a-120c, the storage units 116, the transport vehicle 108, and the inventory items) in the time-series image data and identify one or more actions of the first through third operators 120a-120c during a pick or put session associated with the inventory items.
  • the image processor 202 may be configured to apply one or more image processing algorithms for processing the time-series image data and/or video data. Examples of the image processing algorithms may include anisotropic diffusion, hidden Markov models, linear filtering, and/or the like.
  • the audio processor 204 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to perform one or more operations for processing the audio data received from the audio input device 109.
  • the output of the audio processor 204 may be used to optimize and enhance tracking of the movement of the inventory items within the storage facility 102.
  • the audio processor 204 may apply one or more audio processing algorithms to the audio data. Examples of the audio processing algorithms may include, but are not limited to, digital signal processing (DSP) techniques. It will be apparent to a person skilled in the art that the audio data may be processed by applying any audio processing technique known in the art.
  • the processor 206 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to perform one or more operations for tracking and monitoring the movement of the inventory items within the storage facility 102.
  • the processor 206 may be configured to receive, from the imaging device 110a of the plurality of imaging devices 110a-110e, time-series images.
  • the processor 206 may be further configured to process the time-series images to identify the item being handled by the first operator 120a in the storage facility 102.
  • the processor 206 may process the time-series images by way of the image processor 202.
  • the processor 206 may be further configured to tag the item with the first operator 120a based on the successful identification of the item being handled by the first operator 120a in at least one of the time-series images.
  • the processor 206 may be further configured to generate the movement trajectory of the item in the storage facility 102 based on the identification of the item or the first operator 120a in the subsequent time-series images.
  • the processor 206 may be further configured to track the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 based on subsequent time-series images captured by the imaging device 110a or the one or more other imaging devices 110b and 110c of the plurality of imaging devices 110a-110e.
  • the processor 206 may be further configured to determine the final position information of the item based on the tracked movement of at least one of the item and the first operator 120a tagged with the item.
  • the processor 206 may be further configured to detect the successful handling of the item or a failure in handling the item based on a result of the comparison between the final position information and the desired position information.
  • the processor 206 may be further configured to control the first portable device 121a of the first operator 120a to display the one or more instructions to be followed by the first operator 120a for successful handling of the inventory item.
  • the processor 206 may be further configured to determine the initial position, the final position, and the desired position of the inventory item in the storage facility 102.
  • the processor 206 may determine whether placement of the inventory item at the desired position is correct. Further, the processor 206 may determine one or more actions to be performed by the first operator 120a for achieving the correct placement of the inventory item at the desired position.
  • the natural language processor 208 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to perform one or more operations for tracking and monitoring the movement of the inventory items within the storage facility 102.
  • the natural language processor 208 may process the audio data captured by the audio input device 109 in order to identify the movement of one of the inventory items, the transport vehicle 108, the storage units 116, and the storage bins.
  • the natural language processor 208 may be configured to process the audio data received from the audio input device 109.
  • the natural language processor 208 may be configured to process the audio data that has already been processed by the audio processor 204.
  • the natural language processor 208 may be further configured to determine one or more words spoken by the first through third operators 120a-120c to identify an issue with handling of the inventory items.
  • the natural language processor 208 may apply one or more natural language processing (NLP) techniques for processing the audio data.
  • Examples of the NLP techniques may include, but are not limited to, a named entity recognition technique, tokenization, and the like.
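As a deliberately simplified illustration of how spoken words could flag a handling issue, the sketch below uses only lower-casing and phrase matching. The phrase list is an assumption; a production system would rely on trained NLP models such as the named entity recognition and tokenization techniques mentioned above.

```python
# Sketch: flag a handling issue from a transcribed operator utterance using
# simple lower-casing and phrase matching. The phrase list is illustrative.

ISSUE_PHRASES = ("dropped", "damaged", "wrong bin", "missing")

def detect_issues(transcript):
    """Return the known issue phrases found in the transcript, in a fixed order."""
    text = transcript.lower()
    return [phrase for phrase in ISSUE_PHRASES if phrase in text]

issues = detect_issues("I think item 310 was dropped near the wrong bin")
```

A detected issue could then be cross-checked against the audio processor's output (e.g., the sound of a falling item) to refine tracking.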
  • the memory 210 may include suitable logic, circuitry, and interfaces that may be configured to store one or more instructions which when executed by the processor 206 cause the control server 112 to perform various operations for tracking and monitoring movement of the inventory items.
  • the memory 210 may be configured to store information associated with one or more orders that are to be fulfilled, information associated with one or more orders fulfilled in past, the image data, the audio data, the inventory storage data, and the virtual map. Examples of the memory 210 may include, but are not limited to, a random-access memory (RAM), a read only memory (ROM), a removable storage drive, a hard disk drive (HDD), a flash memory, a solid-state memory, or the like.
  • the scope of the disclosure is not limited to realizing the memory 210 in the control server 112, as described herein.
  • the memory 210 may be realized in form of a database or a cloud storage working in conjunction with the control server 112, without departing from the scope of the disclosure.
  • the machine learning engine 212 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to perform one or more operations for tracking and monitoring the inventory items.
  • the machine learning engine 212 may be further configured to identify a trend or pattern in actions of the first through third operators 120a-120c and display personalized information associated with placement of the inventory items.
  • the machine learning engine 212 may be configured to learn, based on an organization of the inventory items in the storage facility 102, to generate customized instructions for handling of the items.
  • the machine learning engine 212 may learn to instruct the transport vehicle 108 to transport the inventory items that are stored on a single storage unit (for example, the storage unit 116a) in a single batch.
  • the machine learning engine 212 may apply one or more machine learning algorithms and/or techniques for operations thereof.
  • Examples of the machine learning algorithms and/or techniques may include, but are not limited to, Linear Regression, Logistic Regression, k-nearest neighbors algorithm, neural networks, and the like.
  • the image processor 202, the audio processor 204, the processor 206, the natural language processor 208, and the machine learning engine 212 may be implemented by one or more processors, such as, but not limited to, an application-specific integrated circuit (ASIC) processor, a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, and a field-programmable gate array (FPGA) processor.
  • the one or more processors may also correspond to central processing units (CPUs), graphics processing units (GPUs), network processing units (NPUs), digital signal processors (DSPs), or the like. It will be apparent to a person of ordinary skill in the art that the image processor 202, the audio processor 204, the processor 206, the natural language processor 208, and the machine learning engine 212 may be compatible with multiple operating systems.
  • the network interface 214 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to enable the control server 112 to communicate with the plurality of imaging devices HOa-l lOe, the audio input device 109, the image acquisition device 111, the display device, and the transport vehicle 108.
  • the network interface 214 may be implemented as a hardware, software, firmware, or a combination thereof. Examples of the network interface 214 may include a network interface card, a physical port, a network interface device, an antenna, a radio frequency transceiver, a wireless transceiver, an Ethernet port, a universal serial bus (USB) port, or the like.
  • FIGS. 3A-3E are schematic diagrams that, collectively, illustrate an exemplary scenario for tracking movement of the inventory items in the storage facility, in accordance with an exemplary embodiment of the disclosure.
  • the exemplary scenarios 300A-300E are explained with respect to a goods-to-person implementation.
  • the scope of the present disclosure may be extended to a person-to-goods implementation as well.
  • the transport vehicle 108 receives a first set of instructions, from the control server 112, regarding fulfillment of an order.
  • the first set of instructions may indicate that the storage unit 116a, storing the item, i.e., the inventory item 310 (shown in FIG. 3B), is to be transported to the first operator station 106a for fulfillment of the order.
  • the transport vehicle 108 may receive, along with the first set of instructions, fiducial markers required for the transport of the storage unit 116a from a current location of the storage unit 116a to the first operator station 106a.
  • the received fiducial markers may include a sequence of fiducial markers that the transport vehicle 108 is required to follow to reach the location of the storage unit 116a from the current location of the transport vehicle 108, and then the location of the first operator station 106a from the location of the storage unit 116a.
  • the received fiducial markers may be indicative of a path that the transport vehicle 108 is required to follow for transporting the storage unit 116a to the first operator station 106a.
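Following such a marker sequence amounts to an in-order traversal check: the vehicle's scanned markers must contain the instructed sequence in order. The Python sketch below is illustrative; the marker identifiers are assumptions.

```python
# Sketch: verify that the transport vehicle traverses the received sequence of
# fiducial markers in order (an in-order subsequence check on scanned markers).

def follows_path(required_markers, scanned_markers):
    """True if the scanned markers contain the required markers in order."""
    remaining = iter(scanned_markers)
    # `marker in remaining` consumes the iterator up to the first match,
    # so each required marker must appear after the previous one.
    return all(marker in remaining for marker in required_markers)

route = ["FM-07", "FM-12", "FM-19"]  # instructed path toward the storage unit
in_order = follows_path(route, ["FM-07", "FM-12", "FM-19"])
out_of_order = follows_path(route, ["FM-12", "FM-07", "FM-19"])
```

A vehicle that reports markers out of the instructed order would be flagged as having deviated from the path.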
  • the received fiducial markers may further include a fiducial marker (e.g., FMi) of the storage unit 116a using which the transport vehicle 108 may recognize or identify the storage unit 116a for transportation.
  • the transport vehicle 108 may reach the desired location of the storage unit 116a from its current location and transport the storage unit 116a to the first operator station 106a.
  • the transport vehicle 108 may be configured to transport the storage unit 116a by lifting the storage unit 116a from its bottom (shown in FIGS. 3A and 3B).
  • the transport vehicle 108 may transport the storage unit 116a to the first operator station 106a for fulfillment of the order.
  • the control server 112 may be configured to determine one or more actions to be performed by the first operator 120a for fulfillment of the order.
  • the control server 112 may be further configured to determine one or more sequences in which the one or more actions (for example, pick or put actions) are to be performed for fulfillment of the order.
  • the transport vehicle 108 is shown to have transported the storage unit 116a to the first operator station 106a.
  • the storage unit 116a includes various shelves (for example, the shelves 302, 304, 306, and 308). As shown in FIG. 3B, the shelves 302-308 store inventory items 310, 312, 314, and 316, respectively.
  • the first operator station 106a is manned by the first operator 120a and includes a display device 318 and storage bins 320 and 322. The first operator station 106a, the first operator 120a, the transport vehicle 108, the storage unit 116a, and the inventory items 310, 312, 314, and 316 may be monitored by a plurality of imaging devices 324-330.
  • the plurality of imaging devices 324-330 may be positioned within the first operator station 106a or in vicinity of the first operator station 106a in a way that the plurality of imaging devices 324-330 capture the first operator station 106a, the first operator 120a, the transport vehicle 108, the storage unit 116a, and the display device 318 from various angles, positions, and orientations.
  • the plurality of imaging devices 324-330 may be any of the plurality of imaging devices 110a-110e.
  • the control server 112 determines that the transport vehicle 108 has transported the storage unit 116a to the first operator station 106a for fulfillment of the order that is assigned to the first operator 120a. Further, the control server 112 identifies, based on the processing of the time-series images, that the item 310 is currently being handled by the first operator 120a. For example, one of the time-series images may display the first operator 120a reaching out for the item 310 in the storage unit 116a and another image may display that the first operator 120a has picked the inventory item 310, required for fulfilment of the order, from the storage unit 116a.
  • the control server 112 may tag the item 310 with the first operator 120a based on the successful identification of the item 310 in the time-series images received from the imaging device 330.
  • the control server 112 may be configured to track the movement of the item 310 and the first operator 120a, tagged with the item 310, based on subsequent time-series images received from the imaging device 330 and other imaging devices 324 and 326.
  • the other imaging devices 324 and 326 may have a field of view that does not overlap with the imaging device 330. However, the subsequent time-series images captured by the other imaging devices 324 and 326 have the item 310 and/or the first operator 120a displayed therein.
  • the control server 112 may process the subsequent time-series images received from the imaging device 330 and the other imaging devices 324 and 326, and identify at least one of the item 310 and/or the first operator 120a tagged with the item 310 in the subsequent time-series images. Since the item 310 is tagged with the first operator 120a, the control server 112 uses the presence of the first operator 120a in an image as a proxy for the presence of the item 310.
  • the control server 112 may determine a current position of the item 310 in the storage facility 102 and a time instance at which the item 310 was identified to be present at the current position. Thus, by processing the subsequent time-series images, the control server 112 is able to track the movement of the item 310 and the first operator 120a in the storage facility 102.
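The operator-as-proxy presence check described above can be sketched as follows. This is a minimal illustrative sketch, not the disclosed implementation; the detection sets and identifiers (e.g., `item_310`, `operator_120a`) are hypothetical placeholders.

```python
def item_present(frame_detections: set, item_id: str, operator_id: str) -> bool:
    """An item counts as present in a frame if either the item itself or the
    operator tagged with it is detected (the operator acts as a proxy)."""
    return item_id in frame_detections or operator_id in frame_detections

# A frame in which only the tagged operator is visible still locates the item:
frame = {"operator_120a", "shelf_302"}
assert item_present(frame, "item_310", "operator_120a")
```

In practice the per-frame detections would come from processing the time-series images; the proxy rule lets tracking continue even when the item itself is occluded, for example while being carried.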
  • the control server 112 may be further configured to generate a movement trajectory of the item 310 in the storage facility 102 based on the identification of the item 310 or the first operator 120a in the subsequent time-series images.
  • the movement trajectory indicates various positions or locations through which the item 310 was moved by the first operator 120a during the handling and also time instances at which the item 310 was present at those locations or positions.
  • the movement trajectory may indicate spatial as well as temporal information regarding the movement of the item 310 in the storage facility 102 during the handling of the item 310.
  • the movement trajectory may be continuously updated by the control server 112 based on the movement of the item 310 or the first operator 120a. The generated movement trajectory may enable the control server 112 to be aware of current position information of the item 310 at all time instances during the handling of the item 310.
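As a minimal sketch, a movement trajectory of the kind described above can be represented as a time-ordered list of (position, timestamp) observations. The data-structure names below are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class MovementTrajectory:
    item_id: str
    operator_id: str
    # each point pairs a facility position (x, y) with the time it was observed,
    # capturing both the spatial and the temporal information of the movement
    points: List[Tuple[Tuple[float, float], float]] = field(default_factory=list)

    def add_observation(self, position: Tuple[float, float], timestamp: float) -> None:
        self.points.append((position, timestamp))

    def current_position(self) -> Optional[Tuple[float, float]]:
        return self.points[-1][0] if self.points else None

# continuously appending observations keeps the current position available
trajectory = MovementTrajectory("item_310", "operator_120a")
trajectory.add_observation((2.0, 5.0), 100.0)
trajectory.add_observation((4.0, 5.5), 101.5)
assert trajectory.current_position() == (4.0, 5.5)
```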
  • the first operator 120a may put the item 310 in the storage bin 320.
  • the control server 112 may determine the final position information of the item 310 to be the storage bin 320 based on the tracked movement of the item 310 and/or the first operator 120a tagged with the item 310.
  • the control server 112 may further compare the final position information of the item 310 with the desired position information, which may be the storage bin 320. Based on a "successful match" result of the comparison of the final position information and the desired position information, the control server 112 may determine successful handling of the item 310.
  • the desired position information may be the storage bin 322. Therefore, the result of the comparison of the final position information and the desired position information may be a "failed match". Subsequently, the control server 112 may determine a failure in handling of the item 310. Therefore, the control server 112 may control the display device 318 to display information associated with a sequence of pick and put actions that are to be performed by the first operator 120a to handle the item 310 and correct the handling failure. Since the time-series images are being generated and processed in real time or near-real time, the control server 112 is able to detect item handling failure in real time or near-real time.
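The comparison step above reduces to an equality check between the final and desired positions, with a corrective instruction issued on a failed match. This is a hedged sketch; the position identifiers are illustrative.

```python
def detect_handling(final_position: str, desired_position: str) -> str:
    """Return 'successful match' or 'failed match' for an item's final position."""
    return "successful match" if final_position == desired_position else "failed match"

# Item 310 was put in bin 320 but was expected in bin 322:
result = detect_handling("storage_bin_320", "storage_bin_322")
if result == "failed match":
    # in the disclosed system, the display device would show the corrective
    # pick/put sequence to the operator at this point
    corrective_action = "display pick/put sequence to operator"
assert result == "failed match"
```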
  • the display device 318 is controlled to display identification information associated with the inventory item (e.g., the inventory item 310) that is to be picked by the first operator 120a from the storage unit 116a.
  • the identification information may include a serial number of the inventory item 310 and an identifier of the shelf 302. In another embodiment, the identification information may further include a shelf number, a placement position (e.g., 5th item from the right side), a physical description (e.g., color, shape, size, item type, item category), or the like of the inventory item 310.
  • the display device 318 may be further controlled to display identification information (e.g., a serial number) of the storage bin (e.g., the storage bin 322) that is designated to receive the inventory item 310.
  • the control server 112 may be configured to indicate the shelf 302 and the storage bin 322 to the first operator 120a for the pick operation by way of a light guided path formed using light emitting diodes or other such light emitting components.
  • the shelf 302 and the storage bin 322 may be indicated to the first operator 120a by way of vibrations or audio signals being generated by the shelf 302 and the storage bin 322 or a device associated therewith.
  • the shelf 302 and the storage bin 322 may be indicated to the first operator 120a by way of light projections on the shelf 302 and the storage bin 322.
  • the light projections may be indicative of an identifier (such as a barcode) associated with the shelf 302 and the storage bin 322.
  • the light projections may be used by the first operator 120a and the imaging devices 324-330 to identify the shelf 302 and the storage bin 322.
  • a projector system may be utilized to provide visual cues to assist the handling of the inventory item.
  • the projector system may project a visual indicator (i.e., light projections for visual cues) to the first operator 120a regarding various operations the first operator 120a needs to perform for order fulfillment or inventory replenishment.
  • the projector system may be configured to present an image or video based on one or more instructions received from the control server 112.
  • the projector system may be configured to present one of an image of the inventory item 310, the identification information associated with the inventory item 310, the identifier associated with the storage unit 116a and the shelf 302, the identifier associated with the storage bin 322, images of one or more additional inventory items required for fulfilling the order, and an identification information associated with the one or more additional inventory items required for fulfilling the order.
  • the first operator 120a may be unsure whether he/she has picked up a correct inventory item for fulfilling the order.
  • the first operator 120a may utilize a projected barcode of the picked inventory item for confirmation.
  • the control server 112 may be further configured to convey, via the projection of the projector system, to the first operator 120a regarding the picked inventory item being correct or incorrect.
  • the projector system may be configured to highlight the storage unit 116a, the shelf 302, the inventory item 310, and the storage bin 320 that are to be accessed by the first operator 120a in order to fulfill the order.
  • the projector system may highlight the storage unit 116a, the shelf 302, the inventory item 310, and the storage bin 320 in accordance with a sequence in which they are to be accessed by the first operator 120a.
  • the projector system may be configured to change color of a projection light to indicate an error that has been detected while the first operator 120a was accessing one of the storage unit 116a, the shelf 302, the inventory item 310, and the storage bin 320.
  • the projector system may render the projection in the form of an image on a flat surface or as a hologram.
  • the projector system may render the projection on a wall, a floor, or a ceiling of the storage facility 102, a display screen in the storage facility 102, or a surface of a storage bin or a storage unit.
  • the projector system may render the projection in air.
  • the first operator 120a may pick up another inventory item 312 from a different shelf 304. Consequently, based on the processing of the captured time-series image data, the control server 112 may detect that the first operator 120a has picked up an inventory item that does not resemble the inventory item 310.
  • the control server 112 may detect that the shape, the size, the dimensions, the color, and an identifier (e.g., a barcode or a QR code) of the inventory item 312 picked by the first operator 120a do not match the shape, the size, the dimensions, the color, and an identifier (e.g., a barcode or a QR code) of the inventory item 310 required for fulfilling the order. Further, based on the image data, the control server 112 may detect that an action performed by the first operator 120a for picking the inventory item 312 from the shelf 304 does not resemble the action that the first operator 120a was expected to perform for picking the inventory item 310 from the shelf 302.
  • the control server 112 causes the display device 318 to display a corrective notification (e.g., an alarm or a display message) to alert the first operator 120a of the incorrect action.
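The attribute-matching check described above can be sketched as a comparison of observed item attributes against the expected ones, triggering a corrective notification on any mismatch. The attribute records and values below are hypothetical placeholders, not the disclosed vision pipeline.

```python
EXPECTED = {"shape": "box", "color": "red", "barcode": "ITEM-310"}

def check_picked_item(observed: dict, expected: dict) -> list:
    """Return the list of attribute names that do not match the expected item."""
    return [key for key, value in expected.items() if observed.get(key) != value]

# Operator picked item 312 instead of item 310: color and barcode differ.
observed = {"shape": "box", "color": "blue", "barcode": "ITEM-312"}
mismatches = check_picked_item(observed, EXPECTED)
assert sorted(mismatches) == ["barcode", "color"]
# a non-empty mismatch list would drive the corrective notification on the display
```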
  • the storage bin 322 may be assigned to the inventory item 310 by the control server 112.
  • the first operator 120a may be informed regarding the assignment via the display device 318 or the first portable device 121a.
  • the storage bin 322 may be assigned to the inventory item 310 on-the-fly, i.e., dynamically. For example, the first operator 120a may choose to put the inventory item 310 in the storage bin 322 randomly.
  • the control server 112 may identify that a first item (e.g., the inventory item 310) of the order has been put in the storage bin 322 by the first operator 120a. In such a scenario, the control server 112 may allocate or assign the storage bin 322 to one or more remaining inventory items of the order for order fulfilment.
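The on-the-fly bin assignment described above amounts to a first-placement rule: whichever bin receives the first item of an order is assigned to the order's remaining items. The sketch below is illustrative; the identifiers are assumptions.

```python
order_bins: dict = {}  # order_id -> storage bin assigned to that order

def on_item_placed(order_id: str, observed_bin: str) -> str:
    """Assign the first observed bin to the order; later items reuse it."""
    return order_bins.setdefault(order_id, observed_bin)

# The operator puts the first item of order_1 in bin 322 at random:
assert on_item_placed("order_1", "storage_bin_322") == "storage_bin_322"
# A later item of the same order is directed to the same bin:
assert on_item_placed("order_1", "storage_bin_320") == "storage_bin_322"
```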
  • the disclosed invention may also be implemented for replenishment of the inventory stock, organization of the inventory stock within the storage facility 102 and any other task that involves movement of the inventory stock stored within the storage facility 102.
  • the plurality of imaging devices 324-330 may capture the time-series image data while the first operator 120a is replenishing inventory items in the storage unit 116a.
  • the control server 112 may determine whether the first operator 120a has replenished inventory items in the correct shelves of the storage unit 116a. In an event of error detection, the control server 112 may generate an alert and notify the first operator 120a regarding the error.
  • FIG. 4 is a block diagram that illustrates a system architecture of a computer system 400 for tracking movement of the inventory items within the storage facility 102, in accordance with an exemplary embodiment of the disclosure.
  • An embodiment of the disclosure, or portions thereof, may be implemented as computer readable code on the computer system 400.
  • the control server 112 and the image acquisition device 111 of FIG. 1 may be implemented in the computer system 400 using hardware, software, firmware, non-transitory computer readable media having instructions stored thereon, or a combination thereof and may be implemented in one or more computer systems or other processing systems.
  • Hardware, software, or any combination thereof may embody modules and components used to implement the method of FIGS. 5, 6A, and 6B.
  • the computer system 400 may include a processor 402 that may be a special purpose or a general-purpose processing device.
  • the processor 402 may be a single processor or multiple processors.
  • the processor 402 may have one or more processor “cores.”
  • the processor 402 may be coupled to a communication infrastructure 404, such as a bus, a bridge, a message queue, the communication network 114, multi-core message-passing scheme, or the like.
  • the computer system 400 may further include a main memory 406 and a secondary memory 408. Examples of the main memory 406 may include RAM, ROM, and the like.
  • the secondary memory 408 may include a hard disk drive or a removable storage drive (not shown), such as a floppy disk drive, a magnetic tape drive, a compact disc, an optical disk drive, a flash memory, or the like. Further, the removable storage drive may read from and/or write to a removable storage device in a manner known in the art. In some embodiments, the removable storage unit may be a non-transitory computer readable recording media.
  • the computer system 400 may further include an input/output (I/O) port 410 and a communication interface 412.
  • the I/O port 410 may include various input and output devices that are configured to communicate with the processor 402. Examples of the input devices may include a keyboard, a mouse, a joystick, a touchscreen, a microphone, and the like. Examples of the output devices may include a display screen, a speaker, headphones, and the like.
  • the communication interface 412 may be configured to allow data to be transferred between the computer system 400 and various devices that are communicatively coupled to the computer system 400. Examples of the communication interface 412 may include a modem, a network interface (e.g., an Ethernet card), a communication port, and the like.
  • Data transferred via the communication interface 412 may be signals, such as electronic, electromagnetic, optical, or other signals as will be apparent to a person skilled in the art.
  • the signals may travel via a communications channel, such as the communication network 114, which may be configured to transmit the signals to the various devices that are communicatively coupled to the computer system 400.
  • Examples of the communication channel may include a wired, wireless, and/or optical medium such as cable, fiber optics, a phone line, a cellular phone link, a radio frequency link, and the like.
  • the main memory 406 and the secondary memory 408 may refer to non-transitory computer readable mediums that may provide data that enables the computer system 400 to implement the method illustrated in FIGS. 5, 6A, and 6B.
  • FIG. 5 is a flowchart that illustrates a process for tracking movement of inventory items in a storage facility, in accordance with an exemplary embodiment of the disclosure.
  • the process 500 generally starts at 502, where the time-series images are received from the imaging device 110a of the plurality of imaging devices 110a-110e.
  • the control server 112 may be configured to receive the time-series images from the imaging device 110a of the plurality of imaging devices 110a-110e.
  • the process then proceeds to 504, where the time-series images are processed to identify an item (or a batch of items) being handled by the first operator 120a in the storage facility 102.
  • the control server 112 may be configured to process the time-series images to identify the item (or the batch of items) being handled by the first operator 120a in the storage facility 102.
  • the process then proceeds to 506, where the item (or the batch of items) is tagged with the first operator 120a based on the successful identification of the item in the time-series images.
  • the control server 112 may be configured to tag the item (or the batch of items) with the first operator 120a based on the successful identification of the item (or the batch of items) in the time-series images.
  • the process then proceeds to 508, where the movement of the item (or the batch of items) and the first operator 120a, tagged with the item (or the batch of items), within the storage facility 102 is tracked based on the subsequent time-series images captured by the imaging device 110a or one or more other imaging devices 110b and 110c of the plurality of imaging devices 110a-110e.
  • the control server 112 may be configured to track the movement of the item (or the batch of items) and the first operator 120a, tagged with the item (or the batch of items), within the storage facility 102 based on the subsequent time-series images captured by the imaging device 110a or one or more other imaging devices 110b and 110c of the plurality of imaging devices 110a-110e.
  • the process then proceeds to 510, where the final position information of the item (or the batch of items) is determined based on the tracked movement of at least one of the item (or the batch of items) and the first operator 120a tagged with the item (or the batch of items).
  • the control server 112 may be configured to determine the final position information of the item (or the batch of items) based on the tracked movement of at least one of the item (or the batch of items) and the first operator 120a tagged with the item.
  • the process then proceeds to 512, where the final position information of the item (or the batch of items) is compared with the desired position information of the item (or the batch of items).
  • the control server 112 may be configured to compare the final position information of the item (or the batch of items) with the desired position information of the item (or the batch of items).
  • the process then proceeds to 514, where the successful handling of the item or the failure in handling the item (or the batch of items) is detected based on the result of the comparison between the final position information and the desired position information.
  • the control server 112 may be configured to detect the successful handling of the item (or the batch of items) or the failure in handling the item (or the batch of items) based on the result of the comparison between the final position information and the desired position information.
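Steps 502-514 of the process 500 can be summarized as a single pipeline. In the sketch below, the helper functions stand in for the image-processing stages of the disclosure and are hypothetical placeholders, assuming for illustration that each image yields one observed position.

```python
def identify_item(images, operator_id):
    # step 504: placeholder for vision-based identification of the handled item
    return "item_310"

def observe_position(image, item_id, operator_id):
    # step 508: placeholder returning the position seen in one image
    return image["position"]

def track_and_verify(images, operator_id, desired_position):
    item_id = identify_item(images, operator_id)                 # 504: identify
    tag = (item_id, operator_id)                                 # 506: tag item with operator
    positions = [observe_position(img, *tag) for img in images]  # 508: track movement
    final_position = positions[-1]                               # 510: final position
    # 512-514: compare final vs. desired position and report the outcome
    return "success" if final_position == desired_position else "failure"

frames = [{"position": "shelf_302"}, {"position": "storage_bin_322"}]
assert track_and_verify(frames, "operator_120a", "storage_bin_322") == "success"
```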
  • FIGS. 6A-6B are high-level flowcharts that, collectively, illustrate a process for tracking the movement of the inventory items in a storage facility, in accordance with an exemplary embodiment of the disclosure.
  • FIGS. 6A-6B are described with respect to elements of FIG. 1.
  • the process 600 may generally start at 602, where the time-series images are received from the imaging device 110a of the plurality of imaging devices 110a-110e.
  • the control server 112 may be configured to receive the time-series images from the imaging device 110a of the plurality of imaging devices 110a-110e.
  • the process then proceeds to 604, where the time-series images are processed to identify the item being handled by the first operator 120a in the storage facility 102.
  • the control server 112 may be configured to process the time-series images to identify the item being handled by the first operator 120a in the storage facility 102.
  • the process then proceeds to 606, where the item is tagged with the first operator 120a based on the successful identification of the item in the time-series images.
  • the control server 112 may be configured to tag the item with the first operator 120a based on the successful identification of the item in the time-series images.
  • the process then proceeds to 608, where the subsequent time-series images are processed to identify at least one of the item or the first operator 120a tagged with the item.
  • the movement of the item and the first operator 120a is tracked based on the identification of the item or the first operator 120a in the subsequent time-series images received from the imaging device 110a and the other imaging devices 110b and 110c.
  • the control server 112 may be configured to process the subsequent time-series images to identify at least one of the item or the first operator 120a tagged with the item.
  • the process then proceeds to 610, where the movement trajectory of the item in the storage facility 102 is generated based on the identification of the item or the first operator 120a in the subsequent time-series images.
  • the control server 112 may be configured to generate the movement trajectory of the item in the storage facility 102 based on the identification of the item or the first operator 120a in the subsequent time-series images.
  • the starting point of the movement trajectory corresponds to the initial position information of the item and the ending point of the movement trajectory corresponds to the final position information of the item.
  • the process then proceeds to 612, where the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 is tracked based on the subsequent time-series images captured by the imaging device 110a and the one or more other imaging devices 110b and 110c of the plurality of imaging devices 110a-110e.
  • the control server 112 may be configured to track the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 based on the subsequent time-series images captured by the imaging device 110a or one or more other imaging devices 110b and 110c of the plurality of imaging devices 110a-110e and the generated movement trajectory.
  • the process then proceeds to 614, where it is detected that the item is being handled by the second operator 120b different from the first operator 120a based on the subsequent time-series images.
  • the control server 112 may be configured to detect, based on the subsequent time-series images, that the item is being handled by the second operator 120b different from the first operator 120a.
  • the process may then proceed to 616 or process A based on the detection that whether the item is being handled by the second operator 120b different from the first operator 120a.
  • the process may proceed to 616 when the item is detected to be handled by the second operator 120b different from the first operator 120a and to process A when the item is detected to be handled by the same first operator 120a.
  • at 616, the item is re-tagged with the second operator 120b based on the detection that the item is being handled by the second operator 120b.
  • the control server 112 may be configured to re-tag the item based on the detection that the item is being handled by the second operator 120b and the process proceeds to process B.
  • the process B proceeds to 618, where the movement of the second operator 120b, tagged with the item, within the storage facility 102 is tracked based on the subsequent time-series images.
  • the control server 112 may be configured to track the movement of the second operator 120b, tagged with the item, within the storage facility 102 based on the subsequent time-series images.
  • the process proceeds to 620, where the final position information of the item is determined based on the tracked movement of at least one of the item and the operator tagged with the item.
  • the control server 112 may be configured to determine the final position information of the item based on the tracked movement of at least one of the item and the operator tagged with the item.
  • the process then proceeds to 622, where the final position information of the item is compared with the desired position information of the item.
  • the control server 112 may be configured to compare the final position information of the item with the desired position information of the item.
  • the process then proceeds to 624, where the successful handling of the item or the failure in handling the item is detected based on the result of the comparison between the final position information and the desired position information.
  • the control server 112 may be configured to detect the successful handling of the item or the failure in handling the item based on the result of the comparison between the final position information and the desired position information.
  • the process then proceeds to 626, where it is determined whether the item has been handled successfully. If at 626, it is determined that the item has been handled successfully, the process stops. If at 626, it is determined that the item has not been handled successfully, the process proceeds to 628, where the one or more instructions are displayed on the display of the first portable device 121a of the first operator 120a or a display (for example, the display 318) of the storage facility 102.
  • the control server 112 may be configured to display the one or more instructions on the display.
  • the one or more instructions indicate a sequence of actions to be performed on the item by the second operator 120b to match the final position information of the item with the desired position information.
  • the process then proceeds to 630, where the visual indicator mechanism is controlled to provide visual cues for indicating the desired position information of the item.
  • the control server 112 may be configured to control the visual indicator mechanism to provide visual cues for indicating the desired position information of the item.
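The re-tagging branch of the process 600 (steps 614-618) reduces to updating the operator half of the tag whenever a different operator is detected handling the item, so that subsequent tracking follows the new handler. The tag structure below is an illustrative assumption.

```python
def update_tag(tag: dict, detected_operator: str) -> dict:
    """Re-tag the item (step 616) when the detected handler differs from the
    currently tagged operator; otherwise the tag is returned unchanged."""
    if detected_operator != tag["operator"]:
        return {**tag, "operator": detected_operator}
    return tag

# Item 310 is first tagged with the first operator; the second operator takes over:
tag = {"item": "item_310", "operator": "operator_120a"}
tag = update_tag(tag, "operator_120b")
assert tag == {"item": "item_310", "operator": "operator_120b"}
```

After the re-tag, tracking (step 618) proceeds exactly as before, but against the second operator.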
  • Various embodiments of the disclosure provide the control server 112 for automated tracking of the inventory items in the storage facility 102.
  • the control server 112 may be configured to receive, from at least one imaging device (for example, the imaging device 110a) of the plurality of imaging devices 110a-110e, time-series images.
  • the control server 112 may be further configured to process the time-series images to identify the item being handled by the first operator 120a in the storage facility 102.
  • the control server 112 may be further configured to tag the item with the first operator 120a based on the successful identification of the item in the time-series images.
  • the control server 112 may be further configured to track the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 based on subsequent time-series images captured by the at least one imaging device (e.g., the imaging device 110a) or one or more other imaging devices (for example, the imaging devices 110b and 110c) of the plurality of imaging devices 110a-110e.
  • the control server 112 may be further configured to determine the final position information of the item based on the tracked movement of at least one of the item and the first operator 120a tagged with the item.
  • the control server 112 may be further configured to compare the final position information of the item with the desired position information of the item.
  • the control server 112 may be further configured to detect the successful handling of the item or the failure in handling the item based on the result of the comparison between the final position information and the desired position information.
  • Various embodiments of the disclosure provide a non-transitory computer readable medium having stored thereon, computer executable instructions, which when executed by a computer, cause the computer to execute one or more operations for automated tracking of inventory items in the storage facility 102.
  • the one or more operations include receiving, by the control server 112, the time-series images from at least one imaging device (for example, the imaging device 110a) positioned within the storage facility 102.
  • the one or more operations further include processing, by the control server 112, the time-series images to identify the item being handled by the first operator 120a in the storage facility 102.
  • the one or more operations further include tagging, by the control server 112, the item with the first operator 120a based on the successful identification of the item in the time-series images.
  • the one or more operations further include tracking, by the control server 112, the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 based on the subsequent time-series images captured by the at least one imaging device (for example, the imaging device 110a) or the one or more other imaging devices (for example, the imaging device 110b and 110c) positioned within the storage facility 102.
  • the one or more operations further include determining, by the control server 112, the final position information of the item based on the tracked movement of at least one of the item and the first operator 120a tagged with the item.
  • the one or more operations further include comparing, by the control server 112, the final position information of the item with the desired position information of the item.
  • the one or more operations further include detecting, by the control server 112, the successful handling of the item or the failure in handling the item based on the result of the comparison between the final position information and the desired position information.
  • the disclosed embodiments encompass numerous advantages. Exemplary advantages of the disclosed methods include, but are not limited to, tracking and auditing of inventory stock. In other words, the disclosed methods and systems allow for security and automated monitoring of the inventory stock. Further, the disclosed methods and systems significantly reduce the time required for rectifying human errors in inventory handling. Therefore, the disclosed methods and systems increase the throughput of the storage facility 102. Further, the disclosed methods and systems significantly reduce inconvenience caused to operators of the storage facility 102. The disclosed methods and systems significantly reduce the probability of human errors being caused during processing of orders. Moreover, the disclosed methods and systems eliminate the requirement to manually scan the inventory items for verification thereof for fulfilling the orders. Also, the disclosed methods and systems ensure the correct placement of inventory items while the items are being moved by the operators.
  • the inventory items may be processed in less time and with a significantly reduced chance of error.
  • the disclosed methods and systems significantly reduce time consumption and manual intervention required for order fulfillment, inventory stock replenishment, and inventory stock organization.
  • the disclosed methods and systems do not require any additional infrastructure (e.g., handheld devices) for processing the inventory items at the storage facility 102.
  • Such reduced reliance on hardware infrastructure increases the ease of implementation and flexibility of the disclosed methods and systems. Therefore, the disclosed methods and systems may be implemented in storage facilities of varying layouts.

Abstract

An automated item tracking system is provided. The system includes a plurality of imaging devices positioned within a storage facility and a control server. The control server receives time-series images from at least one imaging device and identifies an item being handled by an operator. The control server tags the item with the operator based on successful identification of the item and tracks movement of the item and the operator based on subsequent time-series images captured by the at least one imaging device or one or more other imaging devices. The control server determines final position information of the item based on the tracked movement of at least one of the item and the operator and compares the final position information with desired position information of the item. The control server detects successful handling or failure in handling of the item based on a result of the comparison between the final and desired position information.

Description

AUTOMATED TRACKING OF INVENTORY ITEMS FOR ORDER FULFILMENT
AND REPLENISHMENT
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority of Indian Provisional Application No. IN202011050940, filed November 23, 2020, the contents of which are incorporated herein by reference.
FIELD OF THE DISCLOSURE
[0002] Various embodiments of the disclosure relate generally to management of storage facilities. More specifically, various embodiments of the disclosure relate to methods and systems for automated tracking of inventory items in storage facilities.
BACKGROUND
[0003] Modern storage facilities handle a large number of inventory items on a daily basis. The inventory items are handled within the storage facility for fulfilment of an order or brought inside the storage facility for replenishment of inventory stock. Throughputs of such storage facilities have a direct bearing on various business metrics such as time taken to complete orders, total number of orders completed within a time duration, customer satisfaction, or the like.
[0004] Typically, different tasks for handling inventory items involve manual intervention. One of the major challenges faced by storage facilities due to the involvement of humans is the handling of wrong items, or the incorrect handling of a correct item, which sometimes even leads to missing inventory in the storage facilities. Thus, such an approach is suboptimal and may cause customer dissatisfaction due to delays and errors in order fulfillment. Therefore, the aforementioned approach lacks flexibility and ease of operation.
[0005] In light of the foregoing, there exists a need for a technical solution that reduces the scope of human error in the handling of inventory items in storage facilities.
[0006] Limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of the described systems with some aspects of the present disclosure, as set forth in the remainder of the present application and with reference to the drawings.
SUMMARY
[0007] Methods and systems for automated tracking of inventory items in a storage facility are provided substantially as shown in, and described in connection with, at least one of the figures. The automated item tracking system includes a plurality of imaging devices positioned within the storage facility and a control server. The control server may be configured to receive, from at least one imaging device of the plurality of imaging devices, time-series images. The control server may be further configured to process the time-series images to identify an item being handled by a first operator in the storage facility. The control server may be further configured to tag the item with the first operator based on a successful identification of the item in the time-series images. The control server may be further configured to track a movement of the item and the first operator, tagged with the item, within the storage facility based on subsequent time-series images captured by the at least one imaging device or one or more other imaging devices of the plurality of imaging devices. The control server may be further configured to determine final position information of the item based on the tracked movement of at least one of the item and the first operator tagged with the item. The control server may be further configured to compare the final position information of the item with desired position information of the item. The control server may be further configured to detect a successful handling of the item or a failure in handling the item based on a result of the comparison between the final position information and the desired position information.
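For illustration only, the final comparison step of paragraph [0007] may be sketched as follows. This is a minimal sketch, not the claimed implementation; the `Position` record and its fields are assumptions introduced for the example.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Position:
    """Illustrative position record: a storage unit and a section within it."""
    storage_unit: str
    section: str


def detect_handling(final: Position, desired: Position) -> str:
    """Return 'success' when the final position information matches the
    desired position information, otherwise 'failure'."""
    return "success" if final == desired else "failure"


# Example: an item was supposed to land in section RM2 of storage unit 116b.
desired = Position("116b", "RM2")
print(detect_handling(Position("116b", "RM2"), desired))  # success
print(detect_handling(Position("116a", "RM1"), desired))  # failure
```

In this sketch a frozen dataclass gives value-based equality, so the comparison reduces to a single `==` check between the two position records.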
[0008] In some embodiments, the control server may be configured to receive, from at least one imaging device of the plurality of imaging devices, time-series images. The control server may be further configured to process the time-series images to identify a batch of items being handled by a first operator in the storage facility. The control server may be further configured to tag the batch of items with the first operator based on a successful identification of the batch of items in the time-series images. The control server may be further configured to track a movement of the batch of items and the first operator, tagged with the batch of items, within the storage facility based on subsequent time-series images captured by the at least one imaging device or one or more other imaging devices of the plurality of imaging devices. The control server may be further configured to determine final position information of the batch of items based on the tracked movement of at least one of the batch of items and the first operator tagged with the batch of items. The control server may be further configured to compare the final position information of the batch of items with desired position information of the batch of items. The control server may be further configured to detect a successful handling of the batch of items or a failure in handling the batch of items based on a result of the comparison between the final position information and the desired position information.
[0009] In some embodiments, the handling of the item may be detected as successful when the result of the comparison indicates that the final position information is the same as the desired position information.
[0010] In some embodiments, the handling of the item may be detected as a failure when the result of the comparison indicates that the final position information is different from the desired position information.
[0011] In some embodiments, when the handling of the item is detected as a failure, the control server may be further configured to display one or more instructions on a display. The one or more instructions indicate a sequence of actions to be performed on the item by the first operator to match the final position information of the item with the desired position information.
[0012] In some embodiments, the control server may be further configured to control a visual indicator mechanism to provide visual cues for indicating the desired position information of the item.
[0013] In some embodiments, the control server may be further configured to communicate a set of instructions to the at least one imaging device or the one or more other imaging devices to orient in a direction of movement of the item.
[0014] In some embodiments, the control server may be further configured to process the subsequent time-series images to identify at least one of the item or the first operator tagged with the item. The movement of the item and the first operator may be tracked based on the identification of the item or the first operator in the subsequent time-series images.
[0015] In some embodiments, the control server may be further configured to generate a movement trajectory of the item in the storage facility based on the identification of the item or the first operator in the subsequent time-series images. A starting point of the movement trajectory corresponds to an initial position information of the item and an ending point of the movement trajectory corresponds to the final position information of the item.
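The trajectory construction of paragraph [0015] may be sketched, for illustration only, as follows. The observation format (timestamped planar sightings of the item or the tagged operator) is an assumption introduced for the example, not part of the disclosure.

```python
def movement_trajectory(observations):
    """Build a movement trajectory from timestamped (t, x, y) sightings of
    the item (or of the operator tagged with it). The starting point of the
    trajectory is the initial position and the ending point is the final
    position of the item."""
    ordered = sorted(observations)            # sort sightings by timestamp
    points = [(x, y) for _, x, y in ordered]  # drop timestamps, keep (x, y)
    return {
        "trajectory": points,
        "initial_position": points[0],
        "final_position": points[-1],
    }


# Sightings arrive out of order from different imaging devices: (t, x, y).
sightings = [(2, 4.0, 1.0), (1, 0.0, 0.0), (3, 4.0, 5.0)]
result = movement_trajectory(sightings)
print(result["initial_position"])  # (0.0, 0.0)
print(result["final_position"])    # (4.0, 5.0)
```

Sorting by timestamp before extracting coordinates lets sightings from several imaging devices be merged into one ordered trajectory.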
[0016] In some embodiments, when the item is not visible in one or more images of the subsequent time-series images and the first operator tagged with the item is visible, the control server may track the movement of the item based on the movement of the first operator.
[0017] In some embodiments, the control server may be further configured to detect, based on the subsequent time-series images, that the item is handled by a second operator different from the first operator. The control server may be further configured to re-tag the item with the second operator based on the detection that the item is handled by the second operator. The control server may be further configured to track a movement of the second operator, tagged with the item, within the storage facility based on the subsequent time-series images. The final position information of the item is further determined based on the tracked movement of the second operator.
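For illustration only, the tagging and re-tagging behavior of paragraphs [0007] and [0017] may be sketched as follows. The class, method names, and identifier strings are illustrative assumptions, not the claimed implementation.

```python
class ItemTagger:
    """Minimal sketch: an item is tagged with the operator currently
    handling it and re-tagged when a different operator takes over."""

    def __init__(self):
        self._tags = {}  # item_id -> operator_id currently tagged

    def tag(self, item_id, operator_id):
        self._tags[item_id] = operator_id

    def observe_handling(self, item_id, operator_id):
        """Called when an image shows `operator_id` handling `item_id`;
        re-tags only when the handler has changed."""
        if self._tags.get(item_id) != operator_id:
            self.tag(item_id, operator_id)
            return "re-tagged"
        return "unchanged"

    def tagged_operator(self, item_id):
        return self._tags.get(item_id)


tagger = ItemTagger()
tagger.tag("item-42", "operator-120a")
print(tagger.observe_handling("item-42", "operator-120a"))  # unchanged
print(tagger.observe_handling("item-42", "operator-120b"))  # re-tagged
print(tagger.tagged_operator("item-42"))                    # operator-120b
```

After the second observation the item is tracked via the second operator, mirroring the hand-over case in paragraph [0017].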
[0018] In some embodiments, the control server may be further configured to execute one or more image processing operations on the time-series images to identify the item.
[0019] These and other features and advantages of the present disclosure may be appreciated from a review of the following detailed description of the present disclosure, along with the accompanying figures in which like reference numerals refer to like parts throughout.
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] FIG. 1 is a block diagram that illustrates an exemplary environment of a storage facility, in accordance with an exemplary embodiment of the disclosure;
[0021] FIG. 2 is a block diagram that illustrates a control server of FIG. 1, in accordance with an exemplary embodiment of the disclosure;
[0022] FIGS. 3A-3E are schematic diagrams that, collectively, illustrate an exemplary scenario for tracking movement of inventory items in a storage facility, in accordance with an exemplary embodiment of the disclosure;
[0023] FIG. 4 is a block diagram that illustrates a system architecture of a computer system, in accordance with an exemplary embodiment of the disclosure;
[0024] FIG. 5 is a flowchart that illustrates a process for tracking movement of inventory items in a storage facility, in accordance with an exemplary embodiment of the disclosure; and
[0025] FIGS. 6A-6B are high-level flowcharts that, collectively, illustrate a process for tracking movement of inventory items in a storage facility, in accordance with an exemplary embodiment of the disclosure.
DETAILED DESCRIPTION
[0026] Certain embodiments of the disclosure may be found in disclosed systems and methods for tracking movement of inventory items within a storage facility. Exemplary aspects of the disclosure provide methods for tracking and monitoring inventory items.
[0027] The methods and systems of the disclosure provide a solution for facilitating automated tracking of inventory items within a storage facility. The methods and systems disclosed herein eliminate scope of human error during handling of inventory items in the storage facility.
[0028] FIG. 1 is a block diagram that illustrates an exemplary environment of a storage facility, in accordance with an exemplary embodiment of the disclosure. The environment 100 shows the storage facility 102. The storage facility 102 includes a storage area 104, first through third operator stations 106a-106c (hereinafter, the first through third operator stations 106a-106c are collectively referred to as ‘the operator stations 106’), a plurality of transport vehicles (for example, a transport vehicle 108), a plurality of imaging devices (e.g., imaging devices 110a-110e) controlled by an image acquisition device 111, and a control server 112. The control server 112 communicates with the operator stations 106, the transport vehicle 108, and the image acquisition device 111 by way of a communication network 114 or through separate communication networks established therebetween. Examples of the communication network 114 may include, but are not limited to, a wireless fidelity (Wi-Fi) network, a light fidelity (Li-Fi) network, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a satellite network, the Internet, a fiber optic network, a coaxial cable network, an infrared (IR) network, a radio frequency (RF) network, and a combination thereof. Various entities (such as the transport vehicle 108, the plurality of imaging devices 110a-110e, the image acquisition device 111, and the control server 112) in the environment 100 may be coupled to the communication network 114 in accordance with various wired and wireless communication protocols, such as Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Long Term Evolution (LTE) communication protocols, or any combination thereof.
[0029] The storage facility 102 stores multiple inventory items for fulfillment of one or more orders, maintenance of inventory stock, and/or selling of one or more inventory items. Examples of the storage facility 102 may include, but are not limited to, a forward warehouse, a backward warehouse, a fulfillment center, or a retail store (e.g., a supermarket, an apparel store, or the like). Examples of the inventory items may include, but are not limited to, groceries, apparel, or the like. The inventory items are stored in the storage area 104. The storage area 104 may be of any shape, for example, a rectangular shape. The storage area 104 includes first through third storage units 116a-116c that are arranged within the storage area 104. Hereinafter, the first through third storage units 116a-116c are collectively referred to as ‘the storage units 116’. One or more inventory items are allocated to each storage unit 116a-116c and each storage unit 116a-116c stores the corresponding allocated inventory items. In one embodiment, the storage units 116 may have different shapes, sizes, and dimensions. Hereinafter, the terms ‘inventory items’ and ‘items’ are used interchangeably.
[0030] The storage units 116 are arranged such that first through fourth aisles 118a-118d are formed therebetween. Hereinafter, the first through fourth aisles 118a-118d are collectively referred to as ‘the aisles 118’. The first aisle 118a is formed between the first and second storage units 116a and 116b. The second aisle 118b is formed between the second and third storage units 116b and 116c. The third and fourth aisles 118c and 118d are formed between side faces of the storage units 116 and a sidewall of the storage area 104. The aisles 118 are passageways used by customers, operators, or the transport vehicle 108 to move in the storage area 104. Arrangement of the storage units 116 may be performed in any desired configuration known to those of skill in the art.
In a non-limiting example, it is assumed that the storage units 116 are arranged such that a layout of the aisles 118 forms a virtual grid in a rectangular space. Thus, each aisle 118 is one of a horizontal aisle or a vertical aisle. For example, the first aisle 118a is a vertical aisle and the fourth aisle 118d is a horizontal aisle. An intersection between horizontal and vertical aisles forms a cross-aisle.
[0031] In one embodiment, the storage facility 102 may be marked with various fiducial markers (e.g., fiducial markers FM1, FM2, RM1, and RM2). Fiducial markers are markers (or identifiers) placed in the storage facility 102 for uniquely identifying different locations in the storage facility 102, different sections of each storage unit 116a-116c, or the like. For the sake of brevity, the storage area 104 has been shown to include multiple fiducial markers and only the fiducial markers FM1, FM2, RM1, and RM2 have been labeled. It will be apparent to those of skill in the art that the entire storage facility 102 may include the fiducial markers without deviating from the scope of the disclosure. Each fiducial marker may correspond to one of two types - location markers (e.g., the fiducial markers FM1 and FM2) and storage unit markers (e.g., the fiducial markers RM1 and RM2). The location markers (e.g., the fiducial markers FM1 and FM2) are located at pre-determined locations in the storage facility 102. The pre-determined locations may not conform to any specific pattern and may be subject to a configuration of the storage facility 102. For example, the fiducial markers FM1 and FM2 are located at first and second locations (e.g., on the floor of the storage area 104) along the first and second aisles 118a and 118b, respectively. The storage unit markers (e.g., the fiducial markers RM1 and RM2) may uniquely identify different storage units or different sections of each storage unit 116a-116c. For example, the fiducial markers RM1 and RM2 uniquely identify shelves that partly constitute the first and second storage units 116a and 116b, respectively. Examples of the fiducial markers include, but are not limited to, barcodes, quick response (QR) codes, radio frequency identification device (RFID) tags, or the like.
In one embodiment, placement of the fiducial markers is uniform (i.e., a distance between consecutive fiducial markers is constant). In another embodiment, the placement of the fiducial markers may be non-uniform (i.e., a distance between consecutive fiducial markers is variable). In other embodiments, the storage facility 102 may not be marked with any fiducial markers. In such a scenario, different locations in the storage facility 102 may be determined by using global positioning system (GPS) coordinates or other localization technology-based coordinates.
[0032] The operator stations 106 in the storage facility 102 may refer to pick-and-put stations (PPSs) for holding inventory items that are to be placed in the storage units 116 or the inventory items that are retrieved from the storage units 116. Each operator station 106 may be manned by one or more operators. For example, the first through third operator stations 106a-106c are manned by first through third operators 120a-120c, respectively. The storage units 116 are transported to the operator stations 106 by the plurality of transport vehicles (e.g., the transport vehicle 108). Although the storage facility 102 is shown to include three operator stations 106, it will be apparent to those of skill in the art that the storage facility 102 may include any number of operator stations without deviating from the scope of the disclosure. The operator stations 106 may include a display device (as shown in FIG. 3B) that receives various commands or instructions from the control server 112 for placing the inventory items in the storage units 116 or retrieving the inventory items from the storage units 116. Based on the received commands or instructions, the first through third operators 120a-120c at the operator stations 106 may place the inventory items in the storage units 116 or retrieve the inventory items from the storage units 116. An item placement operation involves, in some examples, picking up the inventory items from one or more storage bins (shown in FIG. 3B) at the operator stations 106 and placing the picked inventory items in one of the storage units 116. Similarly, an item retrieval operation involves retrieving one or more inventory items from a storage unit 116a, 116b, or 116c and placing the retrieved inventory items in one of the storage bins in a temporary storage at the operator stations 106 or one of the storage bins carried by a mobile robotic apparatus.
In another embodiment, the operator stations 106 may include robotic operators for performing item placement and item retrieval operations, without deviating from the scope of the disclosure.
[0033] In some embodiments, the first through third operators 120a-120c may be associated with respective first through third portable devices 121a-121c. The first through third portable devices 121a-121c may be configured to receive one or more instructions from the control server 112 for providing one or more sensory inputs (such as vibrations, audio sounds, or visual cues) to the respective first through third operators 120a-120c. The one or more sensory inputs are provided to alert the first through third operators 120a-120c regarding an error committed by the respective first through third operators 120a-120c, a pick/put operation to be performed for handling the item, an emergency situation in the storage facility 102, and/or the like. The first through third portable devices 121a-121c may be controlled by the control server 112 to provide the sensory inputs to the respective first through third operators 120a-120c. The first through third portable devices 121a-121c may be controlled by the control server 112 to display the one or more instructions for handling the item. The first through third portable devices 121a-121c may be wearable devices, for example, smartwatches, smart pendants, wearable computers, and/or the like. In one example, the first through third portable devices 121a-121c may be cellphones or mobile phones of the corresponding first through third operators 120a-120c.
[0034] The transport vehicle 108 is a robotic apparatus that moves within the storage facility 102. For example, the transport vehicle 108 is an automatic guided vehicle (AGV) that is responsive to commands received from the control server 112. The transport vehicle 108 may include suitable logic, instructions, circuitry, interfaces, and/or codes, executable by the circuitry, for transporting payloads (e.g., the storage units 116) in the storage facility 102 based on the commands received from the control server 112. For example, the transport vehicle 108 may carry and transport the storage units 116 from the storage area 104 to the operator stations 106 and from the operator stations 106 to the storage area 104 for fulfillment of orders, replenishment of inventory stock, loading of inventory items into the storage units 116, and/or the like. The transport vehicle 108 may be configured to read the fiducial markers (e.g., the fiducial markers FM1, FM2, RM1, and RM2). The transport vehicle 108 may include various sensors (e.g., imaging devices, RFID sensors, and/or the like) for reading the fiducial markers. The transport vehicle 108 may utilize the fiducial markers for determining a relative position of the transport vehicle 108 within the storage facility 102 and/or identifying the storage units 116. For the sake of brevity, the storage facility 102 is shown to have a single transport vehicle (i.e., the transport vehicle 108). It will be apparent to those of skill in the art that the storage facility 102 may include any number of transport vehicles without deviating from the scope of the disclosure.
[0035] The storage facility 102 may further include the plurality of imaging devices 110a-110e installed at different locations. For the sake of brevity, the storage facility 102 has been shown to include multiple imaging devices and only five imaging devices 110a, 110b, 110c, 110d, and 110e have been labeled. The imaging devices 110a-110e may be installed in such a way that the storage units 116, the transport vehicle 108, and the operator stations 106, along with the corresponding operators 120a-120c, are captured from various angles and orientations. In other words, the field of view of the imaging devices 110a-110e is set in such a way that there are no unmonitored zones in the storage facility 102. The plurality of imaging devices 110a-110e are configured to capture time-series images of different locations within the storage facility 102. The storage facility 102 may include multiple imaging devices (for example, high definition cameras, imaging devices, scanners, or the like) of varying configurations installed at different locations within the storage facility 102 without deviating from the scope of the disclosure. Further, each imaging device of the plurality of imaging devices 110a-110e may have high resolution and may be configured to capture the smallest of details and movements within the storage facility 102. Each of the imaging devices 110a-110e may be communicably coupled to the control server 112 via the communication network 114 and the image acquisition device 111. In some embodiments, one or more of the plurality of imaging devices 110a-110e may have a dynamic field of view and hence may change its orientation based on an instruction received from the control server 112.
[0036] In some embodiments, the plurality of imaging devices 110a-110e may be mounted on walls of the storage facility 102. In other embodiments, the plurality of imaging devices 110a-110e may be mounted on the storage units 116 and/or the storage bins. In other embodiments, the plurality of imaging devices 110a-110e may be mounted on the transport vehicle 108. In other embodiments, the plurality of imaging devices 110a-110e may be mounted on structures (e.g., a vertical pole) positioned at different locations within the storage facility 102.
[0037] In some embodiments, the plurality of imaging devices 110a-110e may be controlled by at least one of the image acquisition device 111 and the control server 112. The plurality of imaging devices 110a-110e may be controlled to focus at different locations and objects based on the movement of the inventory items. The plurality of imaging devices 110a-110e may be further controlled to rotate at different angles. Further, the plurality of imaging devices 110a-110e may be controlled to operate in different modes, such as daylight mode and night mode, based upon ambient light within the storage facility 102. The plurality of imaging devices 110a-110e may be further controlled to manipulate corresponding configurations (for example, aperture, shutter speed, flash, color intensity, warmth, hue, and the like) such that the plurality of imaging devices 110a-110e are optimized to capture clear and unambiguous images and/or video.
[0038] In some embodiments, the plurality of imaging devices 110a-110e communicate their corresponding time-series image data and/or video data to the control server 112. In another embodiment, the plurality of imaging devices 110a-110e communicate corresponding time-series image data and/or video data to the image acquisition device 111, which may communicate the time-series image data and/or video data received from the plurality of imaging devices 110a-110e to the control server 112.
[0039] In some embodiments, the plurality of imaging devices 110a-110e may periodically communicate the time-series image data to the control server 112 and/or the image acquisition device 111. In another embodiment, the plurality of imaging devices 110a-110e may continuously transmit a live feed of the time-series image data and/or video data to the control server 112 and/or the image acquisition device 111. In another embodiment, the plurality of imaging devices 110a-110e may be configured to transmit the time-series image data and/or video data upon being prompted by one of the image acquisition device 111 and/or the control server 112. In another embodiment, the plurality of imaging devices 110a-110e may be configured to transmit the time-series image data upon detecting a movement associated with one or more inventory items, the transport vehicle 108, the storage units 116, the first through third operators 120a-120c, or any other object in the storage facility 102. In some embodiments, each of the plurality of imaging devices 110a-110e may be configured to track the first through third operators 120a-120c, the storage units 116, the storage bins, and/or the transport vehicle 108 in order to capture the movement of the inventory items during pick or put sessions associated with the inventory items.
[0040] The image acquisition device 111 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to perform one or more operations for collecting the time-series image data and/or video data from the plurality of imaging devices 110a-110e. The time-series image data may refer to a temporal sequence of images captured by the plurality of imaging devices 110a-110e. The time-series images may include one or more images in which the item is shown being picked by the corresponding operator. The time-series image data and/or video data may include a live feed, a recorded video, an image, or the like that includes visual data associated with the movement of the inventory items inside the storage facility 102. The image acquisition device 111 may be further configured to filter the time-series image data and/or video data for image optimization and noise filtration. The image acquisition device 111 may be further configured to select relevant image data from the cumulative time-series image data and/or video data of each imaging device 110a-110e. Selection of the relevant image data from the cumulative time-series image data and/or video data may be performed based on identification of the item or the operator. In one instance, when at least one image or video frame in the time-series images or video data presents the item being handled by the corresponding operator, the at least one image or video frame and subsequent images in the time-series images or video data may be selected as the relevant image data. In another instance, at least one image of the time-series images captured by a specific imaging device (for example, the imaging device 110a) may present the item being handled by the corresponding operator.
In such an instance, subsequent time-series images captured by the specific imaging device (for example, the imaging device 110a) may be selected as the relevant image data. The image acquisition device 111 may be further configured to communicate the cumulative time-series image data and/or video data or the relevant image data to the control server 112.
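The relevant-image selection described in paragraph [0040] may be sketched, for illustration only, as follows. The frame representation and the detector predicate are assumptions standing in for the actual image identification step.

```python
def select_relevant_frames(frames, shows_item_with_operator):
    """Return the first frame in which the handled item appears, plus all
    subsequent frames of the temporal sequence; earlier frames are dropped
    as not relevant. `shows_item_with_operator` stands in for the detector."""
    for i, frame in enumerate(frames):
        if shows_item_with_operator(frame):
            return frames[i:]
    return []  # item never seen: nothing relevant to forward


# Toy frame labels; a real system would pass image data to a detector.
frames = ["empty-aisle", "operator-approaches", "item-picked", "item-carried"]
relevant = select_relevant_frames(frames, lambda f: f.startswith("item"))
print(relevant)  # ['item-picked', 'item-carried']
```

Only the tail of the sequence starting at the first positive detection is forwarded, which mirrors how the image acquisition device trims the cumulative feed before sending it to the control server.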
[0041] In one embodiment, the image acquisition device 111 may periodically communicate the cumulative time-series image data and/or video data or the relevant image data to the control server 112. In another embodiment, the image acquisition device 111 may continuously transmit a live feed of the cumulative time-series image data and/or video data to the control server 112. In another embodiment, the image acquisition device 111 may be configured to transmit the cumulative time-series image data and/or video data or the relevant image data upon being prompted by the control server 112. In another embodiment, the image acquisition device 111 may be configured to transmit the cumulative time-series image data and/or video data or the relevant image data upon observing a movement associated with one or more inventory items, the transport vehicle 108, the storage units 116, the first through third operators 120a- 120c, or any other object in the storage facility 102. In another embodiment, the image acquisition device 111 may be configured to perform the transmission based on a size of the cumulative time-series image data and/or video data or the relevant image data being greater than a threshold value.
[0042] In some embodiments, the storage facility 102 may further include one or more microphones and/or an audio input device 109. The audio input device 109 may be configured to capture audio sounds, such as words spoken by the first through third operators 120a-120c, or sounds generated due to the falling of inventory items, movement of the transport vehicle 108, movement of the storage units 116, movement of the inventory items, and/or movement of the storage bins. The audio input device 109 may be communicably coupled to the control server 112 via the communication network 114. The audio input device 109 may be configured to communicate audio data associated with the storage facility 102 to the control server 112.
[0043] The control server 112 may be a network of computers, a software framework, or a combination thereof, that may provide a generalized approach to create the server implementation. Examples of the control server 112 include, but are not limited to, personal computers, laptops, mini-computers, mainframe computers, any non-transient and tangible machine that can execute a machine-readable code, cloud-based servers, distributed server networks, or a network of computer systems. The control server 112 may be realized through various web-based technologies such as, but not limited to, a Java web-framework, a .NET framework, a personal home page (PHP) framework, or any other web-application framework. The control server 112 may be maintained by a warehouse management authority or a third-party entity that facilitates inventory management operations for the storage facility 102. It will be apparent to a person of ordinary skill in the art that the control server 112 may perform other warehouse management operations as well, along with the inventory item tracking and management operations.
[0044] The control server 112 may be configured to store, in a memory of the control server 112, a virtual map of the storage facility 102 and inventory storage data (as shown in FIG. 9) of the inventory stock. The virtual map is indicative of the current location of the storage units 116, the operator stations 106, entry and exit points of the storage facility 102, the fiducial markers in the storage facility 102, a current location of the transport vehicle 108, or the like. The inventory storage data is indicative of associations between the inventory items stored in the storage facility 102 and the storage units 116 in the storage facility 102. The inventory storage data may further include historic storage locations of each inventory item.
The inventory storage data further includes parameters (for example, weight, shape, size, color, dimensions, or the like) associated with each inventory item. The control server 112 may be configured to receive, from at least one imaging device (for example, the imaging device 110a) of the plurality of imaging devices 110a-110e, time-series images. The control server 112 may be further configured to process the time-series images to identify the item being handled by an operator (for example, the first operator 120a) in the storage facility 102. The control server 112 may be further configured to tag the item with the first operator 120a based on the successful identification of the item in the time-series images. The control server 112 may be further configured to track the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 based on subsequent time-series images captured by the at least one imaging device (e.g., the imaging device 110a) or one or more other imaging devices (for example, the imaging devices 110b and 110c) of the plurality of imaging devices 110a-110e. The control server 112 may be further configured to determine final position information of the item based on the tracked movement of at least one of the item and the first operator 120a tagged with the item. The control server 112 may be further configured to compare the final position information of the item with desired position information of the item. The control server 112 may be further configured to detect the successful handling of the item or the failure in handling the item based on the result of the comparison between the final position information and the desired position information.
[0045] In operation, the control server 112 may be configured to receive, from at least one imaging device (for example, the imaging device 110a) of the imaging devices 110a-110e, time-series images. The time-series images may include a time series of image frames captured one after the other. Images in the time-series images are temporally related to each other. The control server 112 may be further configured to process the time-series images to identify an item being handled by an operator (for example, the first operator 120a) in the storage facility 102. In an embodiment, the item may be handled by the first operator 120a at an operator station (for example, the first operator station 106a) by executing one or more pick and/or put operations by picking the item from an initial position (for example, a shelf of a storage unit) and putting the item at a final position (for example, a storage bin). In another embodiment, the item may be handled by the first operator 120a in the storage area 104 by picking the item from an initial position (for example, a shelf of a storage unit) and putting the item at a final position (for example, a storage bin or another shelf of the same or a different storage unit). The processing of the time-series images may be performed by executing one or more image processing operations on the time-series images. Examples of the image processing operations include, but are not limited to, the bounding box technique to identify regions of interest in the time-series images. In some embodiments, the time-series images may be processed by executing a plurality of operations such as image acquisition, image enhancement, feature extraction, and object recognition on the time-series images.
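Purely as an illustrative aid (not part of the disclosed embodiment), the region-of-interest step of the bounding box technique mentioned above may be sketched as follows, assuming each image frame is represented as a 2D intensity array; the function name and representation are hypothetical:

```python
def bounding_box(frame, threshold=0):
    """Return the region of interest (min_row, min_col, max_row, max_col)
    covering all pixels whose intensity exceeds the threshold, or None
    when no such pixel exists (no item in view)."""
    coords = [(r, c) for r, row in enumerate(frame)
              for c, v in enumerate(row) if v > threshold]
    if not coords:
        return None
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return (min(rows), min(cols), max(rows), max(cols))
```

A production system would instead apply the full pipeline of image acquisition, enhancement, feature extraction, and object recognition described above.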
[0046] Based on a successful identification of the item being handled by the first operator 120a in at least one of the time-series images, the control server 112 may be configured to tag the item with the first operator 120a. The tagging of the item with the first operator 120a is indicative of a coexistence of the item and the first operator 120a, in subsequent time-series images thereof, regardless of the item being visible or obscured from vision. The tagging of the item with the first operator 120a results in the item and the first operator 120a being considered as a single unit. Further, the item may be tagged with the first operator 120a based on one or more tagging or labelling techniques known in the art. In an embodiment, the control server 112 may maintain a reference database that includes various items tagged to different operators. Here, the reference database may be a look-up table and each row of the look-up table may indicate a unique tagging between an operator and an item. For example, a first row of the look-up table may include two cells: one for a unique identifier allocated to the first operator 120a and the other for a unique identifier allocated to an item being handled by the first operator 120a. Thus, by referring to the look-up table, the control server 112 determines which items are being handled by which operators at all times. In some scenarios, some items may not be handled by any operator or some operators may not be involved in item handling. For such items or operators, the look-up table may not include any tagging. The control server 112 may be configured to update the reference database to add new rows for newly tagged items and operators, delete previous rows upon completion of a handling operation, and modify previous rows due to a change in tagging between items and operators.
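As an illustrative sketch only (the class and method names below are hypothetical and not part of the disclosed embodiment), the look-up-table reference database described above may be modelled as a mapping from item identifiers to operator identifiers:

```python
class TagRegistry:
    """Reference database of item-to-operator taggings (the look-up table).

    Each entry pairs a unique item identifier with the unique identifier
    of the operator currently handling that item."""

    def __init__(self):
        self._tags = {}  # item_id -> operator_id

    def tag(self, item_id, operator_id):
        # Add a new row, or modify an existing row on a change in tagging.
        self._tags[item_id] = operator_id

    def untag(self, item_id):
        # Delete the row upon completion of a handling operation.
        self._tags.pop(item_id, None)

    def operator_of(self, item_id):
        # None means the item is not currently handled by any operator.
        return self._tags.get(item_id)
```

Items not being handled simply have no entry, matching the scenario where the look-up table contains no tagging for idle items or operators.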
[0047] The control server 112 may be further configured to generate a movement trajectory of the item in the storage facility 102 based on identification of the item or the first operator 120a in subsequent time-series images captured by the plurality of imaging devices 110a-110e. The subsequent time-series images may be one or more images captured by the imaging device 110a and one or more other imaging devices of the plurality of imaging devices 110a-110e that may have the item and/or the first operator 120a in a corresponding field of view. The control server 112 may be configured to process the subsequent time-series images captured by the imaging device 110a and the one or more other imaging devices of the plurality of imaging devices 110a-110e that have the item and/or the first operator 120a in a corresponding field of view. The control server 112 may be further configured to cause the one or more other imaging devices to capture time-series images of a portion of the storage facility 102 where the item is currently being handled by the first operator 120a. The field of view of the one or more other imaging devices may overlap with a field of view of the imaging device 110a. The one or more other imaging devices may vary with time based on a movement of the item within the storage facility 102. For example, in an instance, the one or more other imaging devices may include the imaging device 110b. Subsequently, the item may be moved to a portion of the storage facility 102 that is outside a field of view of the imaging device 110b. Therefore, the control server 112 may determine the imaging device 110c as the other imaging device for receiving the subsequent time-series images.
Subsequently, based on identification of one of (i) the item and the first operator 120a or (ii) only the first operator 120a, in the subsequent time-series images of the imaging device 110a and/or the other imaging devices (for example, the imaging devices 110b and 110c), the control server 112 may be configured to determine a movement trajectory of the item within the storage facility 102. A starting point of the movement trajectory corresponds to an initial position information of the item and an ending point of the movement trajectory corresponds to the final position information of the item. The starting point of the movement trajectory may refer to a position of the item within the storage facility 102 where the item was initially identified to be handled by the first operator 120a. The initial position information may include a fiducial marker of a storage unit or a location within the storage facility 102 where the item may have been stored initially, an identifier of a storage bin, or the like. The ending point of the movement trajectory may be a position of the item where subsequent time-series images may present the item being separated from the first operator 120a (or any other operator) handling the item. The final position information indicated by the ending point of the movement trajectory may be indicative of a location within the storage facility 102 where the item was separated from the first operator 120a. The tagging between the item and the first operator 120a (or any other operator) handling the item may be removed once the item reaches the ending point of the movement trajectory.
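The movement trajectory described above can be illustrated with a minimal sketch, in which a trajectory is an ordered list of timestamped sightings; all names here are hypothetical and only serve to show the starting-point/ending-point relationship:

```python
from collections import namedtuple

# One sighting of the tagged item (or its tagged operator) at a point in time.
Observation = namedtuple("Observation", ["timestamp", "position"])

class Trajectory:
    """Movement trajectory of an item: an ordered list of observations.

    The starting point corresponds to the initial position information
    and the ending point to the final position information."""

    def __init__(self):
        self.points = []

    def add(self, timestamp, position):
        self.points.append(Observation(timestamp, position))

    @property
    def initial_position(self):
        return self.points[0].position if self.points else None

    @property
    def final_position(self):
        return self.points[-1].position if self.points else None
```

In this representation the trajectory carries both spatial and temporal information, mirroring the description of the movement trajectory in the disclosure.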
[0048] In an exemplary scenario, the imaging device 110a may capture the time-series images in which the item is identified to be picked up by the first operator 120a from the storage unit 116a, and based on such identification the control server 112 may tag the item with the first operator 120a. As the first operator 120a continues to move the handled item within the storage facility 102, different imaging devices 110a-110e may capture the first operator 120a and/or the item at different instances of time. Thus, the control server 112 continues to process images of the plurality of imaging devices 110a-110e to identify the first operator 120a and/or the item in the subsequent images after the tagging. For example, the control server 112 may determine that a field of view of the imaging device 110a may overlap with that of the imaging device 110b. Therefore, the control server 112 may process the subsequent time-series images of the imaging devices 110a and 110b to locate (or track) the movement of the first operator 120a and the handled item. In an instance, the control server 112 may identify the item and/or the first operator 120a in the subsequent time-series images captured by the imaging device 110b. Thus, the control server 112 may select the imaging device 110b as the other imaging device for tracking the movement of the first operator 120a and the handled item. However, if the control server 112 fails to identify the item and/or the first operator 120a in any of the subsequent time-series images captured by the imaging device 110b, the control server 112 may discard the images captured by the imaging device 110b for tracking the movement of the first operator 120a and the handled item.
Similarly, the control server 112 may be configured to identify presence or absence of at least one of the item and the first operator 120a in time-series images captured by the plurality of imaging devices 110a-110e to keep track of the first operator 120a and the handled item. [0049] In some embodiments, the control server 112 may be configured to communicate a set of instructions to one or more imaging devices of the plurality of imaging devices 110a-110e to orient in a direction of movement of the item. The control server 112 may communicate such instructions to the one or more imaging devices when the current field of view of any imaging device 110a-110e is not oriented in the direction of the movement of the item and/or the tagged first operator 120a.
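The selection of imaging devices whose images are processed (and the discarding of the rest) can be sketched as follows; the assumption that each field of view is an axis-aligned rectangle, and all names below, are illustrative only:

```python
def cameras_covering(fields_of_view, position):
    """Select the imaging devices whose field of view contains the item's
    current position; images from the remaining devices are discarded
    for tracking purposes. Fields of view are modelled as axis-aligned
    rectangles (x0, y0, x1, y1) on the facility floor plan."""
    x, y = position
    return [device_id
            for device_id, (x0, y0, x1, y1) in fields_of_view.items()
            if x0 <= x <= x1 and y0 <= y <= y1]
```

Overlapping rectangles naturally yield multiple devices for a position near a field-of-view boundary, matching the hand-off between the imaging devices 110a and 110b described above.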
[0050] Subsequently, the control server 112 may be configured to track the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 based on subsequent time-series images captured by the imaging device 110a and/or the one or more other imaging devices 110b and 110c of the plurality of imaging devices 110a-110e. The control server 112 may track the movement of the item by identifying (e.g., detecting presence or absence of) the item and/or the first operator 120a in the subsequent time-series images. In some embodiments, the control server 112 may be configured to track the movement of the item based on the determined movement trajectory of the item within the storage facility 102.
[0051] In some embodiments, when the item is not visible in one or more images of the subsequent time-series images but the first operator 120a tagged with the item is visible, the control server 112 may be configured to track the movement of the item based on the movement of the first operator 120a. Since the item is tagged with the first operator 120a, for each image where the first operator 120a is visible, the item is considered to coexist with the first operator 120a. Therefore, the control server 112 may be configured to track the movement of the item by tracking the first operator 120a in the subsequent time-series images.
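This proxy-tracking rule, i.e., falling back to the tagged operator's position when the item is obscured, can be shown in a few lines; the detection-dictionary representation and names are hypothetical:

```python
def item_position(detections, item_id, operator_id):
    """Locate the item in a processed frame. When the item itself is
    obscured, the tagged operator's position serves as a proxy, since
    the tagged pair is treated as a single unit."""
    if item_id in detections:
        return detections[item_id]
    return detections.get(operator_id)  # None if neither is visible
```

Here `detections` is assumed to map every entity identified in a frame to its position within that frame.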
[0052] In some embodiments, the control server 112 may be configured to track the movement of the item by processing a real-time video that captures the first operator 120a moving in the storage facility 102 along with the item. In such embodiments, the control server 112 may be configured to identify the item or the first operator 120a in the real-time video captured by the imaging device 110a and the other imaging devices 110b and 110c, and may track the item or the first operator 120a based on one or more object tracking techniques (for example, the bounding box technique, thermal imaging, face recognition, feature extraction, or the like). [0053] In some embodiments, the control server 112 may be configured to detect, based on the subsequent time-series images, that the item is being handled by the second operator 120b different from the first operator 120a. For example, the subsequent time-series images may display the item being in the possession of the second operator 120b instead of the first operator 120a. In such a scenario, the control server 112 may detect, based on the subsequent time-series images, a transfer of the item from the first operator 120a to the second operator 120b to detect that the item is currently being handled by the second operator 120b that is different from the first operator 120a tagged with the item. In such embodiments, the control server 112 may be further configured to re-tag the item with the second operator 120b based on the detection that the item is handled by a different operator, e.g., the second operator 120b. The tagging of the item with the second operator 120b may be performed similar to the tagging of the item with the first operator 120a. Subsequently, the control server 112 may be configured to track a movement of the second operator 120b, tagged with the item, within the storage facility 102 based on the subsequent time-series images.
The final position information of the item is further determined based on the tracked movement of the second operator 120b. For the sake of brevity, the item is considered to be tagged with the first operator 120a throughout the description.
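The re-tagging on a detected hand-off can be sketched as a single update rule over the tagging look-up table; the function and argument names are illustrative only:

```python
def retag_on_transfer(tags, item_id, observed_operator_id):
    """Re-tag an item when a frame shows it in the possession of an
    operator other than the one currently tagged with it. Returns True
    when a hand-off (transfer between operators) was detected."""
    if tags.get(item_id) != observed_operator_id:
        tags[item_id] = observed_operator_id
        return True
    return False
```

After re-tagging, tracking proceeds as before, with the newly tagged operator acting as the item's proxy.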
[0054] The control server 112 is further configured to determine the final position information of the item based on the tracked movement of at least one of the item and the first operator 120a tagged with the item. The final position information of the item is determined based on identification of the item being separated from the first operator 120a. As mentioned in the foregoing, the final position information of the item corresponds to the ending point of the movement trajectory of the item. The final position information is indicative of the item having reached a final location after which the item is not required to be moved by the first operator 120a or any other operator for at least some time period. Subsequently, the control server 112 may be configured to compare the final position information of the item with desired position information of the item. The desired position information of the item may be indicative of a position within the storage facility 102 where the item has to reach for inventory replenishment, order fulfilment, or any other operation within the storage facility 102. That is to say, a desired position of the item refers to a position within the storage facility 102 where the item has to reach for a successful handling of the item. The control server 112 may detect a successful handling of the item or a failure in handling the item based on a result of the comparison between the final position information and the desired position information. The handling of the item is detected as successful when the result of the comparison indicates that the final position information is the same as the desired position information. The handling of the item is detected as a failure when the result of the comparison indicates that the final position information is different from the desired position information.
When the handling of the item is detected as a failure, the control server 112 may be further configured to display one or more instructions on a display (for example, a display of the first portable device 121a associated with the first operator 120a). The one or more instructions indicate a sequence of actions to be performed on the item by the first operator 120a to match the final position information of the item with the desired position information. The sequence of actions may include one or more pick/put operations to be executed by the first operator 120a to achieve the successful handling of the item.
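The comparison and the resulting corrective instructions can be illustrated as follows; the return format and instruction wording are hypothetical and merely demonstrate the success/failure decision described above:

```python
def check_handling(final_position, desired_position):
    """Compare the final position information with the desired position
    information. On a mismatch, return corrective pick/put instructions
    for the operator's display."""
    if final_position == desired_position:
        return "success", []
    instructions = [
        f"Pick the item from {final_position}",
        f"Put the item at {desired_position}",
    ]
    return "failure", instructions
```

A matching comparison yields a successful handling with no further actions; a mismatch yields the pick/put sequence needed to bring the item to its desired position.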
[0055] In some embodiments, the control server 112 may be further configured to control a visual indicator mechanism to provide visual cues for indicating the desired position information of the item. The visual indicator may be a pick/put to light (PPTL) structure or a projector system corresponding to a shelf, a conveyor, a storage bin, a kiosk, a curbside locker, or the like that may be indicative of the desired position of the item. In some embodiments, the visual cues may be provided by way of one or more light emitting diodes (LEDs) positioned corresponding to the desired location.
[0056] For the sake of brevity, the item is considered to be a single item being handled by the first operator 120a. In other embodiments, the item may be a batch of items being handled as a unit by the first operator 120a without deviating from the scope of the disclosure. For the sake of brevity, the ongoing description is described with respect to the first operator 120a. It will be apparent to a person of skill in the art that FIG. 1 shown herein is exemplary and does not limit the scope of the disclosure.
[0057] FIG. 2 is a block diagram that illustrates the control server 112, in accordance with an exemplary embodiment of the disclosure. As shown, the control server 112 may include an image processor 202, an audio processor 204, a processor 206, a natural language processor 208, a memory 210, a machine learning engine 212, and a network interface 214. [0058] The image processor 202 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to perform one or more image processing operations for processing the time-series image data and/or video data captured by the plurality of imaging devices 110a-110e. The image processor 202 may process the time-series image data and/or video data to filter out noise and optimize the time-series image data and/or video data. The image processor 202 may further process the time-series image data and/or video data to detect one or more entities (e.g., the first through third operators 120a-120c, the storage units 116, the transport vehicle 108, and the inventory items) in the time-series image data and identify one or more actions of the first through third operators 120a-120c during a pick or put session associated with the inventory items. The image processor 202 may be configured to apply one or more image processing algorithms for processing the time-series image data and/or video data. Examples of the image processing algorithms may include Anisotropic Diffusion, Hidden Markov Models, Linear filtering, and/or the like.
[0059] The audio processor 204 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to perform one or more operations for processing the audio data received from the audio input device 109. The output of the audio processor 204 may be used to optimize and enhance tracking of the movement of the inventory items within the storage facility 102. The audio processor 204 may apply one or more audio processing algorithms to the audio data. Examples of the audio processing algorithms may include, but are not limited to, digital signal processing (DSP) techniques. It will be apparent to a person skilled in the art that the audio data may be processed by applying any audio processing technique known in the art.
[0060] The processor 206 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to perform one or more operations for tracking and monitoring the movement of the inventory items within the storage facility 102. The processor 206 may be configured to receive, from the imaging device 110a of the plurality of imaging devices 110a-110e, time-series images. The processor 206 may be further configured to process the time-series images to identify the item being handled by the first operator 120a in the storage facility 102. The processor 206 may process the time-series images by way of the image processor 202. The processor 206 may be further configured to tag the item with the first operator 120a based on the successful identification of the item being handled by the first operator 120a in at least one of the time-series images. The processor 206 may be further configured to generate the movement trajectory of the item in the storage facility 102 based on the identification of the item or the first operator 120a in the subsequent time-series images. The processor 206 may be further configured to track the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 based on subsequent time-series images captured by the imaging device 110a or the one or more other imaging devices 110b and 110c of the plurality of imaging devices 110a-110e. The processor 206 may be further configured to determine the final position information of the item based on the tracked movement of at least one of the item and the first operator 120a tagged with the item. The processor 206 may be further configured to detect the successful handling of the item or a failure in handling the item based on a result of the comparison between the final position information and the desired position information.
When the failure in the handling of the item is detected, the processor 206 may be further configured to control the first portable device 121a of the first operator 120a to display the one or more instructions to be followed by the first operator 120a for successful handling of the inventory item. The processor 206 may be further configured to determine the initial position, the final position, and the desired position of the inventory item in the storage facility 102. Further, the processor 206, based on one of the time-series image data and the audio data, may determine whether placement of the inventory item at the desired position is correct. Further, the processor 206 may determine one or more actions to be performed by the first operator 120a for achieving the correct placement of the inventory item at the desired position.
[0061] The natural language processor 208 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to perform one or more operations for tracking and monitoring the movement of the inventory items within the storage facility 102. The natural language processor 208 may process the audio data captured by the audio input device 109 in order to identify the movement of one of the inventory items, the transport vehicle 108, the storage units 116, and the storage bins. In one embodiment, the natural language processor 208 may be configured to process the audio data received from the audio input device 109. In another embodiment, the natural language processor 208 may be configured to process the audio data that has already been processed by the audio processor 204. The natural language processor 208 may be further configured to determine one or more words spoken by the first through third operators 120a-120c to identify an issue with handling of the inventory items. The natural language processor 208 may apply one or more natural language processing (NLP) techniques for processing the audio data. Examples of the NLP techniques include, but are not limited to, named entity recognition technique, tokenization, and the like.
[0062] The memory 210 may include suitable logic, circuitry, and interfaces that may be configured to store one or more instructions which when executed by the processor 206 cause the control server 112 to perform various operations for tracking and monitoring movement of the inventory items. The memory 210 may be configured to store information associated with one or more orders that are to be fulfilled, information associated with one or more orders fulfilled in past, the image data, the audio data, the inventory storage data, and the virtual map. Examples of the memory 210 may include, but are not limited to, a random-access memory (RAM), a read only memory (ROM), a removable storage drive, a hard disk drive (HDD), a flash memory, a solid-state memory, or the like. It will be apparent to a person skilled in the art that the scope of the disclosure is not limited to realizing the memory 210 in the control server 112, as described herein. In another embodiment, the memory 210 may be realized in form of a database or a cloud storage working in conjunction with the control server 112, without departing from the scope of the disclosure.
[0063] The machine learning engine 212 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to perform one or more operations for tracking and monitoring the inventory items. The machine learning engine 212 may be further configured to identify a trend or pattern in actions of the first through third operators 120a-120c and display personalized information associated with placement of the inventory items. Further, the machine learning engine 212 may be configured to learn, based on an organization of the inventory items in the storage facility 102, to generate customized instructions for handling of the items. In one example, the machine learning engine 212 may learn to instruct the transport vehicle 108 to transport the inventory items that are stored on a single storage unit (for example, the storage unit 116a) in a single batch. The machine learning engine 212 may apply one or more machine learning algorithms and/or techniques for operations thereof. Examples of the machine learning algorithms and/or techniques may include, but are not limited to, Linear Regression, Logistic Regression, k-nearest neighbors algorithm, neural networks, and the like.
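The single-batch example above, i.e., grouping items stored on the same storage unit so they can be transported together, can be sketched as a simple grouping step; the function name and the mapping format are illustrative assumptions, not the learned behavior itself:

```python
from collections import defaultdict

def batch_by_storage_unit(item_locations):
    """Group inventory items stored on the same storage unit so that the
    transport vehicle can move them in a single batch. `item_locations`
    maps each item identifier to its storage unit identifier."""
    batches = defaultdict(list)
    for item_id, unit_id in item_locations.items():
        batches[unit_id].append(item_id)
    return dict(batches)
```

In the disclosed embodiment this grouping would be an outcome learned by the machine learning engine 212 rather than a hard-coded rule.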
[0064] The image processor 202, the audio processor 204, the processor 206, the natural language processor 208, and the machine learning engine 212 may be implemented by one or more processors, such as, but not limited to, an application-specific integrated circuit (ASIC) processor, a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, and a field-programmable gate array (FPGA) processor. The one or more processors may also correspond to central processing units (CPUs), graphics processing units (GPUs), network processing units (NPUs), digital signal processors (DSPs), or the like. It will be apparent to a person of ordinary skill in the art that the image processor 202, the audio processor 204, the processor 206, the natural language processor 208, and the machine learning engine 212 may be compatible with multiple operating systems.
[0065] The network interface 214 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to enable the control server 112 to communicate with the plurality of imaging devices HOa-l lOe, the audio input device 109, the image acquisition device 111, the display device, and the transport vehicle 108. The network interface 214 may be implemented as a hardware, software, firmware, or a combination thereof. Examples of the network interface 214 may include a network interface card, a physical port, a network interface device, an antenna, a radio frequency transceiver, a wireless transceiver, an Ethernet port, a universal serial bus (USB) port, or the like.
[0066] FIGS. 3A-3E are schematic diagrams that, collectively, illustrate an exemplary scenario for tracking movement of the inventory items in the storage facility, in accordance with an exemplary embodiment of the disclosure. For the sake of brevity, the exemplary scenarios 300A-300E are explained with respect to a goods-to-person implementation. However, the scope of the present disclosure may be extended to a person-to-goods implementation as well.
[0067] With reference to FIG. 3A, the storage facility 102 is shown. The transport vehicle 108 receives a first set of instructions, from the control server 112, regarding fulfillment of an order. The first set of instructions may indicate that the storage unit 116a, which stores the inventory item 310 (shown in FIG. 3B), is to be transported to the first operator station 106a for fulfillment of the order. The transport vehicle 108 may receive, along with the first set of instructions, fiducial markers required for the transport of the storage unit 116a from a current location of the storage unit 116a to the first operator station 106a. For example, the received fiducial markers may include a sequence of fiducial markers that the transport vehicle 108 is required to follow to reach the location of the storage unit 116a from a current location of the transport vehicle 108 and the location of the first operator station 106a from the location of the storage unit 116a. In other words, the received fiducial markers may be indicative of a path that the transport vehicle 108 is required to follow for transporting the storage unit 116a to the first operator station 106a. The received fiducial markers may further include a fiducial marker (e.g., FMi) of the storage unit 116a using which the transport vehicle 108 may recognize or identify the storage unit 116a for transportation. Based on the instructions and the received fiducial markers, the transport vehicle 108 may reach the location of the storage unit 116a from its current location and transport the storage unit 116a to the first operator station 106a. In some embodiments, the transport vehicle 108 may be configured to transport the storage unit 116a by lifting the storage unit 116a from its bottom (shown in FIGS. 3A and 3B). The transport vehicle 108 may transport the storage unit 116a to the first operator station 106a for fulfillment of the order.
[0068] Further, the control server 112 may be configured to determine one or more actions to be performed by the first operator 120a for fulfillment of the order. The control server 112 may be further configured to determine one or more sequences in which the one or more actions (for example, pick or put actions) are to be performed for fulfillment of the order.
[0069] With reference to FIG. 3B, the transport vehicle 108 is shown to have transported the storage unit 116a to the first operator station 106a. The storage unit 116a includes various shelves (for example, the shelves 302, 304, 306, and 308). As shown in FIG. 3B, the shelves 302-308 store inventory items 310, 312, 314, and 316, respectively. Further, the first operator station 106a is manned by the first operator 120a and includes a display device 318 and storage bins 320 and 322. The first operator station 106a, the first operator 120a, the transport vehicle 108, the storage unit 116a, and the inventory items 310, 312, 314, and 316 may be monitored by a plurality of imaging devices 324-330. The plurality of imaging devices 324-330 may be positioned within the first operator station 106a or in vicinity of the first operator station 106a in a way that the plurality of imaging devices 324-330 capture the first operator station 106a, the first operator 120a, the transport vehicle 108, the storage unit 116a, and the display device 318 from various angles, positions, and orientations. The plurality of imaging devices 324-330 may be any of the plurality of imaging devices 110a-110e.
[0070] Based on processing of the time-series images received from the imaging device 330, the control server 112 determines that the transport vehicle 108 has transported the storage unit 116a to the first operator station 106a for fulfillment of the order that is assigned to the first operator 120a. The control server 112 further identifies, based on the processing of the time-series images, that the item 310 is currently being handled by the first operator 120a. For example, one of the time-series images may display the first operator 120a reaching out for the item 310 in the storage unit 116a and another image may display that the first operator 120a has picked the inventory item 310, required for fulfillment of the order, from the storage unit 116a. The control server 112 may tag the item 310 with the first operator 120a based on the successful identification of the item 310 in the time-series images received from the imaging device 330. The control server 112 may be configured to track the movement of the item 310 and the first operator 120a, tagged with the item 310, based on subsequent time-series images received from the imaging device 330 and other imaging devices 324 and 326. The other imaging devices 324 and 326 may have a field of view that does not overlap with that of the imaging device 330. However, the subsequent time-series images captured by the other imaging devices 324 and 326 have the item 310 and/or the first operator 120a displayed therein. For example, after the item 310 is tagged with the first operator 120a, the control server 112 may process the subsequent time-series images received from the imaging device 330 and the other imaging devices 324 and 326, and identify at least one of the item 310 and/or the first operator 120a tagged with the item 310 in the subsequent time-series images.
Since the item 310 is tagged with the first operator 120a, the control server 112 uses the presence of the first operator 120a in an image as a proxy for the presence of the item 310. Once the item 310 or the first operator 120a is identified in any of the subsequent time-series images, the control server 112 may determine a current position of the item 310 in the storage facility 102 and a time instance at which the item 310 was identified to be present at the current position. Thus, by processing the subsequent time-series images, the control server 112 is able to track the movement of the item 310 and the first operator 120a in the storage facility 102. The control server 112 may be further configured to generate a movement trajectory of the item 310 in the storage facility 102 based on the identification of the item 310 or the first operator 120a in the subsequent time-series images. The movement trajectory indicates various positions or locations through which the item 310 was moved by the first operator 120a during the handling and also time instances at which the item 310 was present at those locations or positions. In other words, the movement trajectory may indicate spatial as well as temporal information regarding the movement of the item 310 in the storage facility 102 during the handling of the item 310. The control server 112 may continuously update the movement trajectory based on the movement of the item 310 or the first operator 120a. The generated movement trajectory may enable the control server 112 to be aware of current position information of the item 310 at all time instances during the handling of the item 310.
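The tagging-and-trajectory logic described above can be sketched in code. This is a minimal illustration only: the class name, the reduction of a processed image frame to a set of recognized identifiers plus a position label, and the method names are assumptions for the sketch, not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class TrajectoryTracker:
    """Sketch of item/operator tagging and trajectory generation."""
    # item_id -> operator_id once the item is tagged with an operator
    tags: dict = field(default_factory=dict)
    # item_id -> list of (timestamp, position) entries, i.e. the trajectory
    trajectories: dict = field(default_factory=dict)

    def tag(self, item_id, operator_id):
        """Tag an item with the operator identified as handling it."""
        self.tags[item_id] = operator_id

    def observe(self, detected_ids, position, timestamp):
        """Record a position for every tagged item whose item or tagged
        operator appears in a processed image frame."""
        for item_id, operator_id in self.tags.items():
            # Presence of the tagged operator is a proxy for the item.
            if item_id in detected_ids or operator_id in detected_ids:
                self.trajectories.setdefault(item_id, []).append(
                    (timestamp, position))

    def final_position(self, item_id):
        """The trajectory's ending point is the final position information."""
        traj = self.trajectories.get(item_id)
        return traj[-1][1] if traj else None
```

Note how the second `observe` call below records a position even though only the operator, not the item, is visible in the frame: the operator tag acts as the proxy described above.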
[0071] Referring now to FIG. 3C, the first operator 120a may put the item 310 in the storage bin 320. The control server 112 may determine the final position information of the item 310 to be the storage bin 320 based on the tracked movement of the item 310 and/or the first operator 120a tagged with the item 310. The control server 112 may further compare the final position information of the item 310 with the desired position information that may be the storage bin 320. Based on the "successful match" result of the comparison of the final position information and the desired position information, the control server 112 may determine successful handling of the item 310.
[0072] Alternatively, referring now to FIG. 3D, the desired position information may be the storage bin 322. Therefore, the result of the comparison of the final position information and the desired position information may be "failed match". Subsequently, the control server 112 may determine failure in handling of the item 310. Therefore, the control server 112 may control the display device 318 to display information associated with a sequence of pick and put actions that are to be performed by the first operator 120a to handle the item 310, correcting the failure in handling. Since the time-series images are being generated and processed in real time or near-real time, the control server 112 is able to detect item handling failure in real time or near-real time. Thus, the time taken by the control server 112 to rectify an item handling failure in the storage facility 102 is less as compared to conventional methods where item handling mistakes are identified at a later time instance.

[0073] The display device 318 is controlled to display identification information associated with the inventory item (e.g., the inventory item 310) that is to be picked by the first operator 120a from the storage unit 116a. The identification information may include a serial number of the inventory item 310 and an identifier of the shelf 302. In another embodiment, the identification information may further include a shelf number, a placement position (e.g., 5th item from the right side), a physical description (e.g., color, shape, size, item type, item category), or the like of the inventory item 310. The display device 318 may be further controlled to display identification information (e.g., a serial number) of the storage bin (e.g., the storage bin 322) that is designated to receive the inventory item 310.
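The comparison of the final position information against the desired position information, together with the corrective instruction issued on a failed match, can be sketched as follows. The function name, the message format, and the use of a plain list as a stand-in for the display device 318 are illustrative assumptions.

```python
def check_handling(final_position, desired_position, display):
    """Compare final position information with desired position
    information; on a failed match, push a corrective instruction
    to the operator's display."""
    if final_position == desired_position:
        return "successful match"
    # Failed match: tell the operator how to correct the placement.
    display.append(f"Move item from {final_position} to {desired_position}")
    return "failed match"
```

The return values mirror the "successful match" and "failed match" outcomes described above; a real system would drive the display device 318 rather than append to a list.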
[0074] In some embodiments, the control server 112 may be configured to indicate the shelf 302 and the storage bin 322 to the first operator 120a for the pick operation by way of a light guided path formed using light emitting diodes or other such light emitting components. In another embodiment, the shelf 302 and the storage bin 322 may be indicated to the first operator 120a by way of vibrations or audio signals being generated by the shelf 302 and the storage bin 322 or a device associated therewith. In another embodiment, the shelf 302 and the storage bin 322 may be indicated to the first operator 120a by way of light projections on the shelf 302 and the storage bin 322. The light projections may be indicative of an identifier (such as a barcode) associated with the shelf 302 and the storage bin 322. The light projections may be used by the first operator 120a and the imaging devices 324-330 to identify the shelf 302 and the storage bin 322.
[0075] In some embodiments, a projector system (not shown) may be utilized to provide visual cues to assist in the handling of the inventory item. For example, the projector system may project a visual indicator (i.e., light projections for visual cues) to the first operator 120a regarding various operations the first operator 120a needs to perform for order fulfillment or inventory replenishment. The projector system may be configured to present an image or video based on one or more instructions received from the control server 112. The projector system may be configured to present one of an image of the inventory item 310, the identification information associated with the inventory item 310, the identifier associated with the storage unit 116a and the shelf 302, the identifier associated with the storage bin 322, images of one or more additional inventory items required for fulfilling the order, and identification information associated with the one or more additional inventory items required for fulfilling the order. In an exemplary scenario, the first operator 120a may be unsure whether he/she has picked up the correct inventory item for fulfilling the order. In such a scenario, the first operator 120a may utilize a projected barcode of the picked inventory item for confirmation. The control server 112 may be further configured to indicate to the first operator 120a, via the projection of the projector system, whether the picked inventory item is correct or incorrect.
[0076] In some embodiments, the projector system may be configured to highlight the storage unit 116a, the shelf 302, the inventory item 310, and the storage bin 320 that are to be accessed by the first operator 120a in order to fulfill the order. The projector system may highlight the storage unit 116a, the shelf 302, the inventory item 310, and the storage bin 320 in accordance with a sequence in which they are to be accessed by the first operator 120a. In some embodiments, while highlighting the storage unit 116a, the shelf 302, the inventory item 310, and the storage bin 320, the projector system may be configured to change the color of the projection light to indicate an error that has been detected while the first operator 120a was accessing one of the storage unit 116a, the shelf 302, the inventory item 310, and the storage bin 320.
[0077] The projector system may render the projection in the form of an image on a flat surface or a hologram. For example, the projector system may render the projection on a wall, a floor, or a ceiling of the storage facility 102, a display screen in the storage facility 102, or a surface of a storage bin or a storage unit. In another example, the projector system may render the projection in air.
[0078] Referring now to FIG. 3E, while picking the inventory item 310, the first operator 120a may pick up another inventory item 312 from a different shelf 304. Consequently, based on the processing of the captured time-series image data, the control server 112 may detect that the first operator 120a has picked up an inventory item that does not resemble the inventory item 310. For example, based on the received image data, the control server 112 may detect that the shape, the size, the dimensions, the color, and an identifier (e.g., a barcode or a QR code) of the inventory item 312 picked by the first operator 120a do not match the shape, the size, the dimensions, the color, and an identifier (e.g., a barcode or a QR code) of the inventory item 310 required for fulfilling the order. Further, based on the image data, the control server 112 may detect that an action performed by the first operator 120a for picking the inventory item 312 from the shelf 304 does not resemble the action that the first operator 120a was expected to perform for picking the inventory item 310 from the shelf 302. Subsequently, the first operator 120a may put the item 312 in the storage bin 320, which may not match the desired position information. Thus, based on such detection, the control server 112 causes the display device 318 to display a corrective notification (i.e., an alarm or a display message) to alert the first operator 120a of the incorrect action.
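The wrong-item check described above, comparing the detected attributes of the picked item against those of the item required for the order, can be sketched as below. The attribute names and the dictionary representation of a detected item are assumptions for the sketch.

```python
def item_matches(picked, expected,
                 attributes=("shape", "size", "color", "barcode")):
    """Sketch of the wrong-item check: the picked item matches only if
    every compared attribute agrees with the expected item's attributes."""
    return all(picked.get(a) == expected.get(a) for a in attributes)
```

A mismatch on any single attribute, such as the barcode of the inventory item 312 differing from that of the inventory item 310, would be enough to trigger the corrective notification described above.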
[0079] In some embodiments, the storage bin 322 may be assigned to the inventory item 310 by the control server 112. The first operator 120a may be informed regarding the assignment via the display device 318 or the first portable device 121a. In another embodiment, the storage bin 322 may be assigned to the inventory item 310 on-the-fly, i.e., dynamically. For example, the first operator 120a may choose to put the inventory item 310 in the storage bin 322 randomly. Based on the captured image data, the control server 112 may identify that a first item (e.g., the inventory item 310) of the order has been put in the storage bin 322 by the first operator 120a. In such a scenario, the control server 112 may allocate or assign the storage bin 322 to one or more remaining inventory items of the order for order fulfilment.
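The on-the-fly bin assignment in paragraph [0079] can be sketched as follows: once the first item of an order is observed in a bin, the remaining items of the order are assigned to the same bin. The function name and data shapes are illustrative assumptions.

```python
def assign_bin(order_items, observed_bin, bin_assignment):
    """Sketch of dynamic bin assignment: if no bin has been assigned yet
    for this order, assign the observed bin to all items of the order."""
    if not bin_assignment:
        for item in order_items:
            bin_assignment[item] = observed_bin
    return bin_assignment
```

Once an assignment exists, later observations do not override it, reflecting that the first put fixes the order's bin.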
[0080] It will be apparent to a person skilled in the art that although the foregoing embodiments are described for fulfillment of an order, the disclosed invention may also be implemented for replenishment of the inventory stock, organization of the inventory stock within the storage facility 102 and any other task that involves movement of the inventory stock stored within the storage facility 102. For example, the plurality of imaging devices 324-330 may capture the time-series image data while the first operator 120a is replenishing inventory items in the storage unit 116a. Based on the captured time-series image data, the control server 112 may determine whether the first operator 120a has replenished inventory items in the correct shelves of the storage unit 116a. In an event of error detection, the control server 112 may generate an alert and notify the first operator 120a regarding the error.
[0081] FIG. 4 is a block diagram that illustrates a system architecture of a computer system 400 for tracking movement of the inventory items within the storage facility 102, in accordance with an exemplary embodiment of the disclosure. An embodiment of the disclosure, or portions thereof, may be implemented as computer readable code on the computer system 400. In one example, the control server 112 and the image acquisition device 111 of FIG. 1 may be implemented in the computer system 400 using hardware, software, firmware, non-transitory computer readable media having instructions stored thereon, or a combination thereof and may be implemented in one or more computer systems or other processing systems. Hardware, software, or any combination thereof may embody modules and components used to implement the method of FIGS. 5, 6A, and 6B.
[0082] The computer system 400 may include a processor 402 that may be a special purpose or a general-purpose processing device. The processor 402 may be a single processor or multiple processors. The processor 402 may have one or more processor “cores.” Further, the processor 402 may be coupled to a communication infrastructure 404, such as a bus, a bridge, a message queue, the communication network 114, multi-core message-passing scheme, or the like. The computer system 400 may further include a main memory 406 and a secondary memory 408. Examples of the main memory 406 may include RAM, ROM, and the like. The secondary memory 408 may include a hard disk drive or a removable storage drive (not shown), such as a floppy disk drive, a magnetic tape drive, a compact disc, an optical disk drive, a flash memory, or the like. Further, the removable storage drive may read from and/or write to a removable storage device in a manner known in the art. In some embodiments, the removable storage unit may be a non-transitory computer readable recording media.
[0083] The computer system 400 may further include an input/output (I/O) port 410 and a communication interface 412. The I/O port 410 may include various input and output devices that are configured to communicate with the processor 402. Examples of the input devices may include a keyboard, a mouse, a joystick, a touchscreen, a microphone, and the like. Examples of the output devices may include a display screen, a speaker, headphones, and the like. The communication interface 412 may be configured to allow data to be transferred between the computer system 400 and various devices that are communicatively coupled to the computer system 400. Examples of the communication interface 412 may include a modem, a network interface (e.g., an Ethernet card), a communication port, and the like. Data transferred via the communication interface 412 may be signals, such as electronic, electromagnetic, optical, or other signals as will be apparent to a person skilled in the art. The signals may travel via a communications channel, such as the communication network 114, which may be configured to transmit the signals to the various devices that are communicatively coupled to the computer system 400. Examples of the communication channel may include a wired, wireless, and/or optical medium such as cable, fiber optics, a phone line, a cellular phone link, a radio frequency link, and the like. The main memory 406 and the secondary memory 408 may refer to non-transitory computer readable mediums that may provide data that enables the computer system 400 to implement the method illustrated in FIGS. 5, 6A, and 6B.
[0084] FIG. 5 is a flowchart that illustrates a process for tracking movement of inventory items in a storage facility, in accordance with an exemplary embodiment of the disclosure. FIG. 5 is described with respect to elements of FIG. 1. With reference to FIG. 5, the process 500 generally starts at 502, where the time-series images are received from the imaging device 110a of the plurality of imaging devices 110a-110e. The control server 112 may be configured to receive the time-series images from the imaging device 110a of the plurality of imaging devices 110a-110e.
[0085] The process then proceeds to 504, where the time-series images are processed to identify an item (or a batch of items) being handled by the first operator 120a in the storage facility 102. The control server 112 may be configured to process the time-series images to identify the item (or the batch of items) being handled by the first operator 120a in the storage facility 102.
[0086] The process then proceeds to 506, where the item (or the batch of items) is tagged with the first operator 120a based on the successful identification of the item in the time-series images. The control server 112 may be configured to tag the item (or the batch of items) with the first operator 120a based on the successful identification of the item (or the batch of items) in the time-series images.
[0087] The process then proceeds to 508, where the movement of the item (or the batch of items) and the first operator 120a, tagged with the item (or the batch of items), within the storage facility 102 is tracked based on the subsequent time-series images captured by the imaging device 110a or one or more other imaging devices 110b and 110c of the plurality of imaging devices 110a-110e. The control server 112 may be configured to track the movement of the item (or the batch of items) and the first operator 120a, tagged with the item (or the batch of items), within the storage facility 102 based on the subsequent time-series images captured by the imaging device 110a or one or more other imaging devices 110b and 110c of the plurality of imaging devices 110a-110e.
[0088] The process then proceeds to 510, where the final position information of the item (or the batch of items) is determined based on the tracked movement of at least one of the item (or the batch of items) and the first operator 120a tagged with the item (or the batch of items). The control server 112 may be configured to determine the final position information of the item (or the batch of items) based on the tracked movement of at least one of the item (or the batch of items) and the first operator 120a tagged with the item.
[0089] The process then proceeds to 512, where the final position information of the item (or the batch of items) is compared with the desired position information of the item (or the batch of items). The control server 112 may be configured to compare the final position information of the item (or the batch of items) with the desired position information of the item (or the batch of items).
[0090] The process then proceeds to 514, where the successful handling of the item or the failure in handling the item (or the batch of items) is detected based on the result of the comparison between the final position information and the desired position information. The control server 112 may be configured to detect the successful handling of the item (or the batch of items) or the failure in handling the item (or the batch of items) based on the result of the comparison between the final position information and the desired position information.
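Steps 502 through 514 of the process 500 can be condensed into one illustrative pass. The frame format, an assumed simplification where each frame reduces to a set of detected identifiers plus a position label, and the function name are not part of the disclosure.

```python
def track_and_verify(frames, item_id, operator_id, desired_position):
    """Sketch of process 500: after the item is tagged with its operator,
    track it through subsequent frames (using the operator's presence as
    a proxy for the item), take the last observed position as the final
    position information, and compare it with the desired position
    information (steps 512 and 514)."""
    final_position = None
    for detected_ids, position in frames:
        # Steps 508/510: track the item or its tagged operator.
        if item_id in detected_ids or operator_id in detected_ids:
            final_position = position
    # Steps 512/514: comparison and detection of the handling outcome.
    if final_position == desired_position:
        return "successful handling"
    return "failure in handling"
```

For example, with frames showing the operator at the shelf, the item in transit, and the operator at the storage bin, the last tracked position is compared against the desired bin.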
[0091] FIGS. 6A-6B are high-level flowcharts that, collectively, illustrate a process for tracking the movement of the inventory items in a storage facility, in accordance with an exemplary embodiment of the disclosure. FIGS. 6A-6B are described with respect to elements of FIG. 1.
[0092] With reference to FIG. 6A, the process 600 may generally start at 602, where the time-series images are received from the imaging device 110a of the plurality of imaging devices 110a-110e. The control server 112 may be configured to receive the time-series images from the imaging device 110a of the plurality of imaging devices 110a-110e.
[0093] The process then proceeds to 604, where the time-series images are processed to identify the item being handled by the first operator 120a in the storage facility 102. The control server 112 may be configured to process the time-series images to identify the item being handled by the first operator 120a in the storage facility 102.
[0094] The process then proceeds to 606, where the item is tagged with the first operator 120a based on the successful identification of the item in the time-series images. The control server 112 may be configured to tag the item with the first operator 120a based on the successful identification of the item in the time-series images.
[0095] The process then proceeds to 608, where the subsequent time-series images are processed to identify at least one of the item or the first operator 120a tagged with the item. The movement of the item and the first operator 120a is tracked based on the identification of the item or the first operator 120a in the subsequent time-series images received from the imaging device 110a and the other imaging devices 110b and 110c. The control server 112 may be configured to process the subsequent time-series images to identify at least one of the item or the first operator 120a tagged with the item.
[0096] The process then proceeds to 610, where the movement trajectory of the item in the storage facility 102 is generated based on the identification of the item or the first operator 120a in the subsequent time-series images. The control server 112 may be configured to generate the movement trajectory of the item in the storage facility 102 based on the identification of the item or the first operator 120a in the subsequent time-series images. The starting point of the movement trajectory corresponds to the initial position information of the item and the ending point of the movement trajectory corresponds to the final position information of the item.
[0097] The process then proceeds to 612, where the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 is tracked based on the subsequent time-series images captured by the imaging device 110a and the one or more other imaging devices 110b and 110c of the plurality of imaging devices 110a-110e. The control server 112 may be configured to track the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 based on the subsequent time-series images captured by the imaging device 110a or one or more other imaging devices 110b and 110c of the plurality of imaging devices 110a-110e and the generated movement trajectory.
[0098] The process then proceeds to 614, where it is detected whether the item is being handled by the second operator 120b, different from the first operator 120a, based on the subsequent time-series images. The control server 112 may be configured to detect, based on the subsequent time-series images, that the item is being handled by the second operator 120b different from the first operator 120a. The process may then proceed to 616 or process A based on whether the item is detected to be handled by the second operator 120b. The process may proceed to 616 when the item is detected to be handled by the second operator 120b, different from the first operator 120a, and to process A when the item is detected to be handled by the same first operator 120a.
[0099] At 616, the item is re-tagged based on the detection that the item is being handled by the second operator 120b. The control server 112 may be configured to re-tag the item based on the detection that the item is being handled by the second operator 120b and the process proceeds to process B.
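The re-tagging step at 616 can be sketched as below: when the images show the item being handled by a different operator, the tag is moved to that operator so that tracking continues via the new proxy. The function name and tag representation are assumptions for the sketch.

```python
def retag_if_handoff(tags, item_id, detected_operator):
    """Sketch of step 616: re-tag the item when a different operator is
    detected handling it. Returns True if a re-tag occurred."""
    if tags.get(item_id) != detected_operator:
        tags[item_id] = detected_operator
        return True
    return False
```

After the re-tag, the presence of the second operator in subsequent images serves as the proxy for the item, exactly as the first operator's presence did before the handoff.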
[00100] Referring now to FIG. 6B, the process B proceeds to 618, where the movement of the second operator 120b, tagged with the item, within the storage facility 102 is tracked based on the subsequent time-series images. The control server 112 may be configured to track the movement of the second operator 120b, tagged with the item, within the storage facility 102 based on the subsequent time-series images.
[00101] The process proceeds to 620, where the final position information of the item is determined based on the tracked movement of at least one of the item and the first operator 120a tagged with the item. The control server 112 may be configured to determine the final position information of the item based on the tracked movement of at least one of the item and the first operator 120a tagged with the item.

[00102] The process then proceeds to 622, where the final position information of the item is compared with the desired position information of the item. The control server 112 may be configured to compare the final position information of the item with the desired position information of the item.
[00103] The process then proceeds to 624, where the successful handling of the item or the failure in handling the item is detected based on the result of the comparison between the final position information and the desired position information. The control server 112 may be configured to detect the successful handling of the item or the failure in handling the item based on the result of the comparison between the final position information and the desired position information.
[00104] The process then proceeds to 626, where it is determined whether the item has been handled successfully. If at 626, it is determined that the item has been handled successfully, the process stops. If at 626, it is determined that the item has not been handled successfully, the process proceeds to 628, where the one or more instructions are displayed on the display of the first portable device 121a of the first operator 120a or a display (for example, the display device 318) of the storage facility 102. The control server 112 may be configured to display the one or more instructions on the display. The one or more instructions indicate a sequence of actions to be performed on the item by the second operator 120b to match the final position information of the item with the desired position information. The process then proceeds to 630, where the visual indicator mechanism is controlled to provide visual cues for indicating the desired position information of the item. The control server 112 may be configured to control the visual indicator mechanism to provide visual cues for indicating the desired position information of the item.
[00105] Various embodiments of the disclosure provide the control server 112 for automated tracking of the inventory items in the storage facility 102. The control server 112 may be configured to receive, from at least one imaging device (for example, the imaging device 110a) of the plurality of imaging devices 110a-110e, time-series images. The control server 112 may be further configured to process the time-series images to identify the item being handled by the first operator 120a in the storage facility 102. The control server 112 may be further configured to tag the item with the first operator 120a based on the successful identification of the item in the time-series images. The control server 112 may be further configured to track the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 based on subsequent time-series images captured by the at least one imaging device (e.g., the imaging device 110a) or one or more other imaging devices (for example, the imaging devices 110b and 110c) of the plurality of imaging devices 110a-110e. The control server 112 may be further configured to determine the final position information of the item based on the tracked movement of at least one of the item and the first operator 120a tagged with the item. The control server 112 may be further configured to compare the final position information of the item with the desired position information of the item. The control server 112 may be further configured to detect the successful handling of the item or the failure in handling the item based on the result of the comparison between the final position information and the desired position information.
[00106] Various embodiments of the disclosure provide a non-transitory computer readable medium having stored thereon, computer executable instructions, which when executed by a computer, cause the computer to execute one or more operations for automated tracking of inventory items in the storage facility 102. The one or more operations include receiving, by the control server 112, the time-series images from at least one imaging device (for example, the imaging device 110a) positioned within the storage facility 102. The one or more operations further include processing, by the control server 112, the time-series images to identify the item being handled by the first operator 120a in the storage facility 102. The one or more operations further include tagging, by the control server 112, the item with the first operator 120a based on the successful identification of the item in the time-series images. The one or more operations further include tracking, by the control server 112, the movement of the item and the first operator 120a, tagged with the item, within the storage facility 102 based on the subsequent time-series images captured by the at least one imaging device (for example, the imaging device 110a) or the one or more other imaging devices (for example, the imaging devices 110b and 110c) positioned within the storage facility 102. The one or more operations further include determining, by the control server 112, the final position information of the item based on the tracked movement of at least one of the item and the first operator 120a tagged with the item. The one or more operations further include comparing, by the control server 112, the final position information of the item with the desired position information of the item.
The one or more operations further include detecting, by the control server 112, the successful handling of the item or the failure in handling the item based on the result of the comparison between the final position information and the desired position information.
[00107] The disclosed embodiments encompass numerous advantages. Exemplary advantages of the disclosed methods include, but are not limited to, tracking and auditing of inventory stock. In other words, the disclosed methods and systems allow for security and automated monitoring of the inventory stock. Further, the disclosed methods and systems significantly reduce the time required for rectifying human errors in inventory handling. Therefore, the disclosed methods and systems increase the throughput of the storage facility 102. Further, the disclosed methods and systems significantly reduce inconvenience caused to operators of the storage facility 102. The disclosed methods and systems significantly reduce a probability of human errors being caused during processing of orders. Moreover, the disclosed methods and systems eliminate the requirement to manually scan the inventory items for verification thereof for fulfilling the orders. Also, the disclosed methods and systems ensure the correct placement of inventory items while the items are being moved by the operators. Therefore, the inventory items may be processed in less time and with a significantly reduced chance of error. Hence, the disclosed methods and systems significantly reduce the time consumption and manual intervention required for order fulfillment, inventory stock replenishment, and inventory stock organization. Moreover, the disclosed methods and systems do not require any additional infrastructure (e.g., handheld devices) for processing the inventory items at the storage facility 102. Such lack of reliance on hardware infrastructure increases ease of implementation and flexibility of the disclosed methods and systems. Therefore, the disclosed methods and systems may be implemented in storage facilities of varying layouts.
[00108] A person of ordinary skill in the art will appreciate that embodiments and exemplary scenarios of the disclosed subject matter may be practiced with various computer system configurations, including multi-core multiprocessor systems, minicomputers, mainframe computers, computers linked or clustered with distributed functions, as well as pervasive or miniature computers that may be embedded into virtually any device. Further, although the operations may be described as a sequential process, some of the operations may in fact be performed in parallel, concurrently, and/or in a distributed environment, with program code stored locally or remotely for access by single-processor or multiprocessor machines. In addition, in some embodiments, the order of operations may be rearranged without departing from the spirit of the disclosed subject matter.
[00109] Techniques consistent with the disclosure provide, among other features, systems and methods for tracking movement of the inventory items. While various exemplary embodiments of the disclosed systems and methods have been described above, it should be understood that they have been presented for purposes of example only, and not limitation. The foregoing description is not exhaustive and does not limit the disclosure to the precise forms disclosed. Modifications and variations are possible in light of the above teachings or may be acquired from practicing the disclosure, without departing from its breadth or scope.

Claims

What is claimed is:
1. An automated item tracking system in a storage facility, the system comprising:
a plurality of imaging devices positioned within the storage facility; and
a control server configured to:
receive, from at least one imaging device of the plurality of imaging devices, time-series images;
process the time-series images to identify an item being handled by a first operator in the storage facility;
tag the item with the first operator based on a successful identification of the item in the time-series images;
track a movement of the item and the first operator, tagged with the item, within the storage facility based on subsequent time-series images captured by the at least one imaging device or one or more other imaging devices of the plurality of imaging devices;
determine final position information of the item based on the tracked movement of at least one of the item and the first operator tagged with the item;
compare the final position information of the item with desired position information of the item; and
detect a successful handling of the item or a failure in handling the item based on a result of the comparison between the final position information and the desired position information.
2. The automated item tracking system of claim 1, wherein the handling of the item is detected as successful when the result of the comparison indicates that the final position information is the same as the desired position information.
3. The automated item tracking system of claim 1, wherein the handling of the item is detected as a failure when the result of the comparison indicates that the final position information is different from the desired position information.
4. The automated item tracking system of claim 1, wherein when the handling of the item is detected as a failure, the control server is further configured to display one or more instructions on a display, and wherein the one or more instructions indicate a sequence of actions to be performed on the item by the first operator to match the final position information of the item with the desired position information.
5. The automated item tracking system of claim 1, wherein the control server is further configured to control a visual indicator mechanism to provide visual cues for indicating the desired position information of the item.
6. The automated item tracking system of claim 1, wherein the control server is further configured to communicate a set of instructions to the at least one imaging device or the one or more other imaging devices to orient in a direction of movement of the item.
7. The automated item tracking system of claim 1, wherein the control server is further configured to process the subsequent time-series images to identify at least one of the item or the first operator tagged with the item, and wherein the movement of the item and the first operator is tracked based on the identification of the item or the first operator in the subsequent time-series images.
8. The automated item tracking system of claim 7, wherein the control server is further configured to generate a movement trajectory of the item in the storage facility based on the identification of the item or the first operator in the subsequent time-series images, and wherein a starting point of the movement trajectory corresponds to an initial position information of the item and an ending point of the movement trajectory corresponds to the final position information of the item.
9. The automated item tracking system of claim 7, wherein when the item is not visible in one or more images of the subsequent time-series images and the first operator tagged with the item is visible, the control server tracks the movement of the item based on the movement of the first operator.
10. The automated item tracking system of claim 1, wherein the control server is further configured to:
detect, based on the subsequent time-series images, that the item is handled by a second operator different from the first operator;
re-tag the item with the second operator based on the detection that the item is handled by the second operator; and
track a movement of the second operator, tagged with the item, within the storage facility based on the subsequent time-series images,
wherein the final position information of the item is further determined based on the tracked movement of the second operator.
11. The automated item tracking system of claim 1, wherein the control server is further configured to execute one or more image processing operations on the time-series images to identify the item.
12. An automated item tracking method for a storage facility, the method comprising:
receiving, by a control server, time-series images from at least one imaging device positioned within the storage facility;
processing, by the control server, the time-series images to identify an item being handled by a first operator in the storage facility;
tagging, by the control server, the item with the first operator based on a successful identification of the item in the time-series images;
tracking, by the control server, a movement of the item and the first operator, tagged with the item, within the storage facility based on subsequent time-series images captured by the at least one imaging device or one or more other imaging devices positioned within the storage facility;
determining, by the control server, final position information of the item based on the tracked movement of at least one of the item and the first operator tagged with the item;
comparing, by the control server, the final position information of the item with desired position information of the item; and
detecting, by the control server, a successful handling of the item or a failure in handling the item based on a result of the comparison between the final position information and the desired position information.
13. The automated item tracking method of claim 12, wherein the handling of the item is detected as successful when the result of the comparison indicates that the final position information is the same as the desired position information, and wherein the handling of the item is detected as a failure when the result of the comparison indicates that the final position information is different from the desired position information.
14. The automated item tracking method of claim 13, further comprising displaying, by the control server, one or more instructions on a display when the handling of the item is detected as a failure, wherein the one or more instructions indicate a sequence of actions to be performed on the item by the first operator to match the final position information of the item with the desired position information.
15. The automated item tracking method of claim 12, further comprising controlling, by the control server, a visual indicator mechanism in the storage facility to provide visual cues for indicating the desired position information of the item.
16. The automated item tracking method of claim 12, further comprising processing, by the control server, the subsequent time-series images to identify at least one of the item and the first operator tagged with the item, wherein the movement of the item and the first operator is tracked based on the identification of the item or the first operator in the subsequent time-series images.
17. The automated item tracking method of claim 16, further comprising generating, by the control server, a movement trajectory of the item in the storage facility based on the identification of the item or the first operator in the subsequent time-series images, wherein a starting point of the movement trajectory corresponds to an initial position information of the item and an ending point of the movement trajectory corresponds to the final position information of the item.
18. The automated item tracking method of claim 16, wherein when the item is not visible in one or more images of the subsequent time-series images and the first operator tagged with the item is visible, the movement of the item is tracked based on the movement of the first operator.
19. The automated item tracking method of claim 12, further comprising:
detecting, by the control server, based on the subsequent time-series images, that the item is handled by a second operator different from the first operator;
re-tagging, by the control server, the item with the second operator based on the detection that the item is handled by the second operator; and
tracking, by the control server, a movement of the second operator, tagged with the item, within the storage facility based on the subsequent time-series images,
wherein the final position information of the item is further determined based on the tracked movement of the second operator.
20. An automated item tracking system in a storage facility, the system comprising:
a plurality of imaging devices positioned within the storage facility; and
a control server configured to:
receive, from at least one imaging device of the plurality of imaging devices, time-series images;
process the time-series images to identify a batch of items being handled by an operator;
tag the batch of items with the operator based on a successful identification of the batch of items in the time-series images;
track a movement of the batch of items and the operator, tagged with the batch of items, within the storage facility based on subsequent time-series images captured by the at least one imaging device or one or more other imaging devices of the plurality of imaging devices;
determine final position information of the batch of items based on the tracked movement of at least one of the batch of items and the operator tagged with the batch of items;
compare the final position information of the batch of items with desired position information of the batch of items; and
detect a successful handling of the batch of items or a failure in handling the batch of items based on a result of the comparison between the final position information and the desired position information.
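The tracking behaviors recited in claims 8-10 — building a movement trajectory from time-ordered detections, falling back to the tagged operator's position when the item is occluded, and re-tagging the item when a different operator handles it — can be sketched as follows. The frame format, field names, and entity IDs are illustrative assumptions, not the claimed implementation:

```python
# Illustrative sketch of the tracking details in claims 8-10. Each frame is
# modeled as a dict mapping detected entity IDs to (x, y) positions, with an
# optional "handled_by" field; this frame format is an assumption made only
# for this sketch.

def track_item(frames, item_id, operator_id):
    """Build the item's movement trajectory across time-ordered frames.
    When the item is occluded in a frame, fall back to the position of
    the operator currently tagged with it (claim 9); when a different
    operator is observed handling the item, re-tag it (claim 10)."""
    trajectory = []
    for frame in frames:
        handler = frame.get("handled_by")
        if handler is not None and handler != operator_id:
            operator_id = handler                      # re-tag on handoff (claim 10)
        if item_id in frame:
            trajectory.append(frame[item_id])          # item visible
        elif operator_id in frame:
            trajectory.append(frame[operator_id])      # occlusion fallback (claim 9)
    return {
        "trajectory": trajectory,
        "initial_position": trajectory[0],             # starting point (claim 8)
        "final_position": trajectory[-1],              # ending point (claim 8)
        "tagged_operator": operator_id,
    }

frames = [
    {"item_42": (0.0, 0.0), "op_7": (0.1, 0.0)},
    {"op_7": (1.0, 0.5)},                              # item occluded in this frame
    {"item_42": (2.0, 1.5), "op_9": (2.1, 1.5), "handled_by": "op_9"},
]
result = track_item(frames, "item_42", "op_7")
print(result["final_position"], result["tagged_operator"])  # (2.0, 1.5) op_9
```

The resulting `final_position` is what the control server would compare against the desired position information to classify the handling as a success or a failure.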
PCT/IB2021/060622 2020-11-23 2021-11-16 Automated tracking of inventory items for order fulfilment and replenishment WO2022107000A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202011050940 2020-11-23
IN202011050940 2020-11-23

Publications (1)

Publication Number Publication Date
WO2022107000A1 true WO2022107000A1 (en) 2022-05-27

Family

ID=78770847

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2021/060622 WO2022107000A1 (en) 2020-11-23 2021-11-16 Automated tracking of inventory items for order fulfilment and replenishment

Country Status (1)

Country Link
WO (1) WO2022107000A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024042457A1 (en) * 2022-08-23 2024-02-29 Flymingo Innovations Ltd. Visual pick validation

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10212319B1 (en) * 2014-11-04 2019-02-19 Amazon Technologies, Inc. Camera positioning fixture
US20190258853A1 (en) * 2014-03-27 2019-08-22 Amazon Technologies, Inc. Visual task feedback for workstations in materials handling facilities

Similar Documents

Publication Publication Date Title
US11049278B2 (en) System and method for visual identification, and system and method for classifying and sorting
US10083418B2 (en) Distributed autonomous robot systems and mehtods
US10512941B2 (en) Projection instruction device, parcel sorting system, and projection instruction method
US11565424B2 (en) System and method for task assignment management
US10860855B2 (en) Instruction projecting device, package sorting system and instruction projecting method
CN111899131B (en) Article distribution method, apparatus, robot, and medium
JP2019537541A (en) An adaptive process to guide inventory work performed by humans
US11462005B1 (en) Image partitioning for re-identification
US20210319195A1 (en) Computer vision system and method of label detection, reading, and registration of labels on objects
US20220315341A1 (en) Automated locker system for delivery and collection of inventory items
US20190084008A1 (en) Instruction projecting device, package sorting system and instruction projecting method
US10471474B2 (en) Projection indicator, cargo assortment system, and projection indicating method
EP4071684A1 (en) Warehouse monitoring system
EP4116906A1 (en) Method for warehouse storage-location monitoring, computer device, and non-volatile storage medium
US20230280758A1 (en) Autonomous Robotic Navigation In Storage Site
WO2022107000A1 (en) Automated tracking of inventory items for order fulfilment and replenishment
US11704787B2 (en) Method and system for determining stock in an inventory
WO2015194118A1 (en) Object management device, object management method, and recording medium storing object management program
JP6728995B2 (en) Automated warehouse system and automated warehouse management method
US11666948B2 (en) Projection instruction device, parcel sorting system, and projection instruction method
Agnihotram et al. Combination of advanced robotics and computer vision for shelf analytics in a retail store
US20220162001A1 (en) Predicting a path of material handling equipment and determining an obstacle-free path
JP2016066277A (en) Object management system, object management device, object management method, and object management program
JP6690411B2 (en) Management system, management method, and transportation system
US11010903B1 (en) Computer vision and machine learning techniques for item tracking

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21814898

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21814898

Country of ref document: EP

Kind code of ref document: A1