US20210027245A1 - Systems and methods for automated association of product information with electronic shelf labels - Google Patents
- Publication number
- US20210027245A1 (application US16/935,688)
- Authority
- US
- United States
- Prior art keywords
- shelf labels
- electronic shelf
- arv
- facility
- product information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G06Q10/087—Inventory or stock management, e.g. order filling, procurement or balancing against orders
- G06Q10/0875—Itemisation or classification of parts, supplies or services, e.g. bill of materials
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09F—DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
- G09F3/00—Labels, tag tickets, or similar identification or indication means; Seals; Postage or like stamps
- G09F3/08—Fastening or securing by means not forming part of the material of the label itself
- G09F3/18—Casings, frames or enclosures for labels
- G09F3/20—Casings, frames or enclosures for labels for adjustable, removable, or interchangeable labels
- G09F3/208—Electronic labels, Labels integrating electronic displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/35—Services specially adapted for particular environments, situations or purposes for the management of goods or merchandise
Definitions
- ESLs Electronic shelf labels
- FIG. 1 illustrates a block diagram of an exemplary system for automated association of product information with electronic shelf labels in accordance with some embodiments described herein.
- FIG. 2A illustrates an overhead view of an exemplary embodiment that obtains images of paper shelf labels on modular units.
- FIG. 2B illustrates an overhead view of the exemplary embodiment of FIG. 2A obtaining images of electronic shelf labels on modular units.
- FIG. 3A illustrates an image of paper labels obtained in an exemplary embodiment.
- FIG. 3B illustrates an image of electronic shelf labels obtained in an exemplary embodiment.
- FIG. 4 illustrates a block diagram of a remote computing device suitable for use with exemplary embodiments.
- FIG. 5 illustrates a network environment suitable for use with exemplary embodiments.
- FIG. 6 illustrates a flowchart for a method for automated association of product information with electronic shelf labels in an exemplary embodiment.
- the systems and methods employ an autonomous robotic vehicle (ARV) alone or in combination with a remote computing device to detect pre-existing product information in the form of paper labels located on modular units.
- the ARV can then detect the location of electronic shelf labels (ESLs) after installation and can associate the pre-existing product information gleaned from the paper labels with the corresponding ESLs.
- ESLs are an increasingly desirable way to display product information to purchasers at a retailer. Because information displayed on ESLs can be automatically updated from a central control server, or at least updated wirelessly from a local device, pricing or other information can be updated or corrected on a regular basis without requiring an entity (such as a retail employee) to physically walk to the shelf and replace the paper label with a new label containing updated information.
- When a retailer opts to change from the existing paper labels to ESLs, the modular units that display products are conventionally modified to accommodate the ESLs. This is often done without removing the products from the modular unit.
- the process can involve removal of a portion of the modular unit that retains the paper labels, such as but not limited to shelf facings, and installation of a new portion that includes electronic shelf labels.
- Conventionally, after installation of the new portion including the ESLs, a person manually identifies each ESL one-by-one, consults the corresponding paper label (since removed from the modular unit) to determine the product information that should be associated with the ESL, and individually programs the ESL with the appropriate product information.
- This manual process utilizes a significant amount of labor to singly program each of the thousands of ESLs in a given retail facility.
- the manual process is repetitive and, thus, highly error-prone as it can be difficult to maintain correspondence between the ESLs and the removed paper labels over a work shift when each association must be made individually.
- errors are particularly difficult to detect for certain ESLs that display only price information for a product as the displayed price may not immediately indicate to the viewer that the association of the ESL with a product was made incorrectly.
- Systems and methods are described herein to automate the process of conversion for a facility from paper shelf labels to electronic shelf labels.
- By using an autonomous robotic vehicle to obtain initial images of paper shelf labels before removal and subsequent images of electronic shelf labels after placement on the shelf, systems and methods described herein can program multiple ESLs in a batch-processing fashion.
- the time and cost associated with initial manual programming of the ESLs and costs associated with correcting errors in the programming process are significantly reduced.
- the process can be performed without human intervention, which enables the programming to be performed by the autonomous robotic vehicle and/or remote computing device while human labor resources are allocated elsewhere.
- the ARV can determine compliance or non-compliance of a modular unit with a planogram for the facility.
- FIG. 1 illustrates a system 100 for automated association of product information with electronic shelf labels in accordance with an exemplary embodiment.
- the system 100 includes an autonomous robotic vehicle (ARV) 110 and a remote computing device 150 .
- the ARV 110 includes a memory 116 , a processor 115 , at least one sensor 112 , and a communications interface 114 .
- the sensor 112 may be, but is not limited to, a camera or video camera capable of obtaining still or moving images.
- the memory 116 of the ARV 110 can store an identification module 160 that can be executed by the processor 115 .
- the ARV is a ground-based autonomous vehicle.
- the ARV may be an Unmanned Aerial Vehicle (UAV) capable of flight.
- the remote computing device 150 includes a processor 155 , a communications interface 154 , and a memory 156 that may store the identification module 160 that can be executed by the processor 155 .
- the remote computing device 150 and/or the ARV 110 can be in communication with one or more databases 152 that include product information 142 related to products stored on the modular units.
- the database 152 including product information 142 is implemented within the remote computing device 150 .
- the remote computing device 150 and/or the ARV 110 can be in communication with one or more ESLs 134 disposed on a modular unit.
- the ARV 110 may execute instructions causing it to obtain one or more initial images of one or more modular units in a facility in which paper shelf labels on the modular units appear.
- the ARV 110 is configured to transmit the initial images to the remote computing device 150 using the communications interface 114 .
- the ARV 110 may also execute instructions causing it to obtain one or more subsequent images of the same modular units in which electronic shelf labels 134 appear.
- the ARV 110 is configured to transmit the subsequent images to the remote computing device 150 using the communications interface 114 .
- the remote computing device 150 receives the initial and subsequent images via the communications interface 154 .
- the remote computing device 150 also executes the identification module 160 to determine the product information 142 associated with paper shelf labels 132 in the initial images and to determine identifying information for the electronic shelf labels 134 in the subsequent images.
- the execution of the identification module 160 determines the correspondence between the paper shelf labels 132 in the initial images and the ESLs 134 in the subsequent images and associates the proper product information 142 with the ESLs 134 .
- the ARV 110 or remote computing device 150 can program the ESL 134 to display the correct product information 142 .
- the ARV 110 can move in relation to the modular units 130 in the facility.
- the ARV 110 can include wheels or treads to enable motion laterally with respect to the modular units 130 or to enable motion closer to or further from the modular units 130 .
- the ARV may hover in proximity of modular units containing paper labels or ESLs in a position enabling the ARV to obtain images.
- the sensor 112 can obtain initial images of the modular units 130 and associated paper labels 132 as shown schematically in FIG. 2A .
- Each of the paper labels 132 can correspond to a product stored on the modular unit 130 .
- the images are sent from the ARV 110 to the remote computing device 150 .
- the ARV 110 may communicate with remote computing device 150 via communications interface 114 of the ARV 110 and communications interface 154 of the remote computing device 150 .
- the communication may be performed using a wired or wireless communication standard including, but not limited to, 802.11x, BlueTooth®, Wi-Max, or any other suitable communications standard.
- The initial images can be retained for further analysis at the ARV 110 in embodiments without a remote computing device 150. Movement of the ARV 110 and acquisition of images can be controlled by the processor 115 executing instructions on-board the ARV 110 in some embodiments.
- the modular units 130 can be prepared for conversion to electronic shelf labels.
- the modular units 130 can include a removable edge/shelf facing portion including the labels at the front of each shelf.
- the original removable edge portion including paper labels 132 can be removed and replaced with a new removable edge portion including ESLs 134 .
- the new removable edge portion can include a same number of ESLs 134 as the number of paper labels 132 on the original removable edge portion.
- each ESL 134 can be in a same position with respect to the removable edge portion as a position of the corresponding paper label 132 on the original removable edge portion.
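Because the replacement edge portion carries the same number of ESLs in the same positions as the paper labels it replaces, the association can reduce to a slot-by-slot mapping. The sketch below illustrates the idea in Python; the function name and data shapes are illustrative assumptions, not taken from the patent:

```python
def associate_by_slot(paper_product_info, esl_ids):
    """Pair each ESL with the product info of the paper label that
    occupied the same slot, assuming a one-to-one edge replacement.

    Both sequences are ordered left-to-right along the shelf edge.
    """
    if len(paper_product_info) != len(esl_ids):
        # Counts differ: the one-to-one assumption is violated, so a
        # location-based matching step would be needed instead.
        raise ValueError("slot counts differ")
    return dict(zip(esl_ids, paper_product_info))
```

If the counts do not match, the method falls back to comparing label locations, as described for the identification module below.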
- the ARV 110 can move relative to the modular units 130 and acquire subsequent images of the modular units 130 (subsequent to the addition of the ESLs) and associated ESLs 134 as shown schematically in FIG. 2B .
- FIG. 3A depicts a portion of an image 300 obtained by the ARV 110 during the image acquisition process depicted in FIG. 2A .
- the modular unit 130, paper labels 132, and products 140 situated on shelves 135 of the modular unit 130 can appear.
- a modular unit identifier 138 associated with the modular unit 130 can appear in the image 300 .
- the ARV 110 may obtain multiple images of the modular units 130 as the ARV 110 moves relative to the modular units 130 in exemplary embodiments.
- the multiple images can include overlapping image content to enable stitching of the separate images or a similar method to identify the same objects in separate images.
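One simple way to register overlapping images is to estimate the offset at which adjacent frames agree best. The sketch below is a deliberate simplification of full image stitching: it compares 1-D brightness profiles taken along the shelf edge, and the names and error metric are illustrative assumptions:

```python
def estimate_overlap(strip_a, strip_b, min_overlap=3):
    """Estimate how many trailing samples of strip_a overlap the leading
    samples of strip_b by minimising mean squared difference.

    strip_a, strip_b: 1-D brightness profiles (lists of numbers) taken
    along the shelf edge in two consecutive images.
    Returns the overlap length with the best match.
    """
    best_len, best_err = min_overlap, float("inf")
    for n in range(min_overlap, min(len(strip_a), len(strip_b)) + 1):
        tail, head = strip_a[-n:], strip_b[:n]
        err = sum((x - y) ** 2 for x, y in zip(tail, head)) / n
        if err < best_err:
            best_err, best_len = err, n
    return best_len
```

Once the overlap is known, objects detected near the seam can be de-duplicated so the same label is not counted twice across images.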
- the image 300 can be analyzed by the identification module 160 performing video analytics to identify the paper shelf labels 132 appearing in the image 300 .
- the sensor 112 of the ARV 110 can acquire images of sufficiently high resolution that subsequent analysis can resolve information appearing on the paper shelf labels 132 from several feet away.
- the sensor 112 can include optics and/or detection elements (such as charge coupled devices or CCDs) capable of producing an image including legible paper shelf labels 132 with 8-10 point font from five feet away.
- the paper shelf labels 132 can include information associated with one or more products 140 .
- the paper shelf labels 132 can include a Universal Product Code (UPC), price information for the product, product serial numbers or other identification numbers, or a two-dimensional machine-translatable code such as a barcode or a QR Code® that identifies the product.
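When a UPC is decoded from a label image, its check digit offers a quick way to catch read errors before any database lookup. A minimal UPC-A validation in Python (this is the standard UPC-A checksum; the function name is illustrative):

```python
def upc_a_check_digit_ok(upc: str) -> bool:
    """Validate a 12-digit UPC-A code read off a paper shelf label.

    Sanity-checking decoded digits this way catches many scan and OCR
    errors before the code is matched against the product database.
    """
    if len(upc) != 12 or not upc.isdigit():
        return False
    digits = [int(c) for c in upc]
    # UPC-A checksum: 3x the odd positions plus the even positions
    # (1-based), then the check digit completes the total to a
    # multiple of 10.
    total = 3 * sum(digits[0:11:2]) + sum(digits[1:11:2])
    return (10 - total % 10) % 10 == digits[11]
```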
- the identification module 160 is stored in the memory 156 of the remote computing device 150 , and the initial images 300 are transmitted from the ARV 110 to the remote computing device 150 for analysis. In some embodiments, the identification module 160 is stored in the memory 116 of the ARV 110 , and the initial image 300 is analyzed locally in the ARV 110 .
- the memory 116 of the ARV 110 or the memory 156 of the remote computing device 150 can include one or more label templates.
- the one or more label templates can include information, for example, as to the location of a barcode or other information within the borders of the paper label 132 .
- portions of the initial image 300 including images of paper shelf labels 132 can be compared to the one or more label templates to improve accuracy in isolation and/or determination of information appearing on the paper shelf labels 132 .
- the identification module 160 can compare information obtained from the paper shelf labels 132 to product information 142 retrieved from the one or more databases 152. The comparison ensures that the information was obtained without error from the paper shelf labels 132. Additionally, the comparison enables the identification module 160 to determine which product information 142 stored in the one or more databases is associated with each of the paper shelf labels 132.
- the identification module 160 can assess the location of the paper shelf labels 132 with respect to the modular units 130 , with respect to one or more products 140 on the shelves 135 , or with respect to both.
- the identification module 160 can identify the paper shelf labels 132 and associate the paper shelf label 132 with the nearest product 140 in some embodiments.
- the identification module 160 can associate a location of each paper shelf label 132 on the modular unit 130 with the corresponding product information 142 in the database.
- the paper shelf labels 132 are removed from the modular units 130 .
- ESLs 134 are affixed to the modular units 130 and subsequent images are acquired as described next.
- FIG. 3B illustrates a portion of an example image 300 ′ obtained by the ARV 110 during the image acquisition process depicted in FIG. 2B after ESLs 134 have been affixed to the modular units 130 .
- the modular units 130 , ESLs 134 , and products 140 situated on shelves 135 of the modular unit 130 can appear.
- the modular unit identifier 138 associated with the modular unit 130 can appear in the image 300 ′.
- the ESLs 134 can include identifying information.
- the ESLs 134 can include a serial number or other individualized number or a two-dimensional machine-translatable code such as a barcode or a QR Code® that identifies the ESL 134.
- the sensor 112 can produce images 300 ′ of sufficient quality as to enable the resolution and/or analysis of identifying information displayed on the ESL 134 .
- the identification module 160 is stored in the memory 156 of the remote computing device 150 , and the subsequent image 300 ′ is transmitted from the ARV 110 to the remote computing device 150 for analysis. In some embodiments, the identification module 160 is stored in the memory 116 of the ARV 110 , and the subsequent image 300 ′ is analyzed locally in the ARV 110 .
- the identification module 160 can assess the location of the ESLs 134 with respect to the modular units 130 , with respect to one or more products 140 on the shelves 135 , or with respect to both.
- the identification module 160 can identify the ESLs 134 and associate the ESLs 134 with the nearest product 140 in some embodiments.
- the identification module 160 identifies a correspondence between each of the ESLs 134 in the subsequent image 300 ′ and one of the paper shelf labels 132 in the initial image 300 .
- the correspondence can be identified based upon the locations of the paper shelf label 132 and the ESL 134 relative to the modular unit 130 , relative to products 140 on shelves 135 , or both.
- If a paper shelf label 132 is identified as being at a particular location in image 300 and an ESL 134 is identified as being at the same location in image 300′, the paper shelf label 132 and the ESL 134 correspond.
- the identification module 160 associates product information 142 previously assigned to each of the paper shelf labels 132 to the corresponding ESL 134 . In this way, each ESL 134 affixed on the modular unit 130 is properly associated with the product nearest to it on the shelf 135 .
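One way to implement this location-based matching is a nearest-neighbour assignment in the modular unit's coordinate frame. The sketch below assumes normalized (x, y) label positions and a distance tolerance; all names and the 0.05 default are illustrative assumptions:

```python
import math

def transfer_product_info(paper_labels, esls, max_dist=0.05):
    """paper_labels: {(x, y): product_info} from the initial images;
    esls: {esl_id: (x, y)} from the subsequent images, both expressed
    in the same modular-unit coordinate frame.

    Each ESL inherits the product info of the nearest paper-label
    position, provided it lies within max_dist.
    """
    assignments = {}
    for esl_id, (ex, ey) in esls.items():
        # Find the paper label whose recorded position is closest.
        pos, info = min(paper_labels.items(),
                        key=lambda kv: math.dist((ex, ey), kv[0]))
        if math.dist((ex, ey), pos) <= max_dist:
            assignments[esl_id] = info
    return assignments
```

ESLs with no paper label within the tolerance are left unassigned, which in practice could be surfaced for manual review.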
- the identification module 160 can transmit instructions to the ARV 110 to program the ESL 134 with the associated product information 142 .
- the remote computing device 150 if the remote computing device 150 is able to communicate directly or indirectly with the ESL, the remote computing device can program each ESL 134 with product information 142 by transmitting instructions to do so via the communications interface 154 .
- the ESL 134 can display the product information 142 such as, but not limited to, price information.
- the identification module 160 performs video analytics and identifies and analyzes the modular unit identifier 138 disposed on the modular unit 130 and appearing in the initial images 300 , the subsequent images 300 ′, or both.
- the modular unit identifier 138 can include information specific to each modular unit 130 such as a serial number or two-dimensional machine-translatable code.
- the modular unit identifier 138 can include information related to the position of the modular unit 130 within the facility such as a number or graphic keyed to a planogram of the facility.
- the identification module 160 can identify a location of the modular unit 130 within the facility based on the analysis of the modular unit identifier 138 with respect to stored facility location information.
- The ARV 110 stores a planogram of the facility in memory and can check the accuracy of the planogram of the facility after image acquisition. For example, the ARV 110 can confirm that one or more ESLs 134 (e.g., the location or identity of the ESLs 134) correspond to the planogram of the facility and can transmit a notification to the remote computing device 150. Alternatively or in addition, the ARV 110 can determine that one or more ESLs 134 fail to correspond to the planogram of the facility and can transmit a notification to the remote computing device 150. The notification can include the identifying information for the one or more ESLs 134. Upon receipt of a notification that an ESL fails to correspond to the planogram, the remote computing device 150 can issue an alert.
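A compliance check of this kind can be sketched as a comparison of observed ESL placements against the stored planogram. The data shapes here are illustrative assumptions:

```python
def check_planogram(observed, planogram):
    """observed and planogram both map esl_id -> (unit_id, slot).

    Returns the set of ESL ids whose observed placement differs from
    the planogram; these ids would be included in the notification
    sent to the remote computing device.
    """
    mismatched = {esl for esl, place in observed.items()
                  if planogram.get(esl) != place}
    # ESLs the planogram expects but that were never seen in the
    # subsequent images are also non-compliant.
    mismatched |= set(planogram) - set(observed)
    return mismatched
```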
- memory 156 included in the remote computing device 150 may store computer-readable and computer-executable instructions or software (e.g., identification module 160) for implementing exemplary operations of the remote computing device 150.
- the remote computing device 150 also includes configurable and/or programmable processor 155 and associated core(s) 404 , and optionally, one or more additional configurable and/or programmable processor(s) 402 ′ and associated core(s) 404 ′ (for example, in the case of computer systems having multiple processors/cores), for executing computer-readable and computer-executable instructions or software stored in the memory 156 and other programs for implementing exemplary embodiments of the present disclosure.
- Virtualization may be employed in the remote computing device 150 so that infrastructure and resources in the remote computing device 150 may be shared dynamically.
- a virtual machine 412 may be provided to handle a process running on multiple processors so that the process appears to be using only one computing resource rather than multiple computing resources. Multiple virtual machines may also be used with one processor.
- a user may interact with the remote computing device 150 through a visual display device 152 , such as a computer monitor, which may display one or more graphical user interfaces 416 .
- the user may interact with the remote computing device 150 using a multi-point touch interface 420 or a pointing device 418 .
- the remote computing device 150 may also include one or more computer storage devices 426 , such as a hard-drive, CD-ROM, or other computer readable media, for storing data and computer-readable instructions and/or software that implement exemplary embodiments of the present disclosure (e.g., applications).
- exemplary storage device 426 can include one or more databases 152 for storing product information 142 , location information for paper shelf labels 132 or ESLs 134 , planograms of the facility, or identifying information related to ESLs 134 .
- the databases 152 may be updated manually or automatically at any suitable time to add, delete, and/or update one or more data items in the databases.
- the communications interface 154 may include a built-in network adapter, network interface card, PCMCIA network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the remote computing device 150 to any type of network capable of communication and performing the operations described herein.
- the remote computing device 150 can host one or more applications (e.g., the identification module 160 ) configured to interact with one or more components of the ARVs 110 and/or to facilitate access to the content of the databases 152 .
- the databases 152 may store information or data as described above herein.
- the databases 152 can include product information 142 , identifying information for one or more ESLs 134 , one or more planograms for the facility, and location information associated with paper shelf labels 132 and/or ESLs 134 .
- the databases 152 can be located at one or more geographically distributed locations away from the ARVs 110 and/or the remote computing device 150 .
- the databases 152 can be located at the same geographical location as the remote computing device 150 and/or at the same geographical location as the ARVs 110 .
- one or more portions of the communications network 505 can be an ad hoc network, a mesh network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless wide area network (WWAN), a metropolitan area network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a Wi-Fi network, a WiMAX network, an Internet-of-Things (IoT) network established using BlueTooth® or any other protocol, any other type of network, or a combination of two or more such networks.
- FIG. 6 illustrates a flowchart for a method 600 for automated association of product information with electronic shelf labels in an exemplary embodiment.
- the method 600 includes obtaining initial images 300 of modular units 130 in a facility using at least one sensor 112 of an autonomous robotic vehicle (ARV) 110 (step 602).
- the modular units 130 include multiple paper shelf labels 132 .
- the initial images 300 are taken before removal of the paper shelf labels 132 from the modular units 130 .
- the method 600 further includes obtaining, using the at least one sensor 112 , subsequent images 300 ′ of the modular units 130 (step 604 ).
- the subsequent images 300 ′ are taken after multiple electronic shelf labels 134 are affixed to the modular units 130 .
- the method 600 also includes retrieving product information 142 from one or more databases 152 holding product information 142 associated with products 140 assigned to the modular units 130 in the facility (step 606 ).
- the method 600 additionally includes analyzing the initial images 300 to identify the paper shelf labels 132 appearing in the initial images 300 to determine the product information 142 associated with each of the paper shelf labels 132 (step 608 ).
- the method 600 includes analyzing the electronic shelf labels 134 disposed on the modular units 130 that appear in the subsequent images 300 ′ to determine identifying information associated with each of the electronic shelf labels 134 (step 610 ).
- the method 600 includes identifying a correspondence between each of the electronic shelf labels 134 and one of the paper shelf labels 132 (step 612 ).
- the method 600 also includes associating product information 142 previously assigned to each of the paper shelf labels 132 with the corresponding one of the electronic shelf labels 134 (step 614).
- the corresponding one of the electronic shelf labels is programmed with the product information by the remote computing device or the ARV (step 616 ).
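Steps 602-616 can be summarized as the following orchestration sketch. The per-image analytics and the ESL programming link are injected as callables, since the patent leaves their implementation open; all names and data shapes are illustrative assumptions:

```python
def convert_facility(initial_images, subsequent_images, db,
                     read_paper_labels, read_esls, program_esl):
    """Sketch of method 600 per modular unit.

    initial_images / subsequent_images: {unit_id: image}.
    read_paper_labels(image) -> [(slot, upc)]        (step 608)
    read_esls(image)         -> [(slot, esl_id)]     (step 610)
    db: {upc: product_info}                          (step 606)
    program_esl(esl_id, product_info)                (step 616)
    """
    programmed = {}
    for unit_id, image in initial_images.items():
        labels = read_paper_labels(image)
        esls = read_esls(subsequent_images[unit_id])
        # Steps 606/608: resolve each paper label to product info.
        info_by_slot = {slot: db[upc] for slot, upc in labels}
        # Steps 612-616: match ESLs to slots and program them.
        for slot, esl_id in esls:
            if slot in info_by_slot:
                program_esl(esl_id, info_by_slot[slot])
                programmed[esl_id] = info_by_slot[slot]
    return programmed
```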
- Exemplary flowcharts are provided herein for illustrative purposes and are non-limiting examples of methods.
- One of ordinary skill in the art will recognize that exemplary methods may include more or fewer steps than those illustrated in the exemplary flowcharts, and that the steps in the exemplary flowcharts may be performed in a different order than the order shown in the illustrative flowcharts.
Abstract
Systems and methods that employ an autonomous robotic vehicle (ARV) alone or in combination with a remote computing device during the installation of electronic shelf labels (ESLs) in a facility are discussed. The ARV may detect pre-existing product information from paper labels located on modular units prior to their removal and then detect the location of electronic shelf labels (ESLs) after installation. Pre-existing product information gleaned from the paper labels is associated with the corresponding ESLs. The ARV may also determine compliance or non-compliance of modular units to which an ESL is affixed with a planogram of the facility.
Description
- This application claims priority from U.S. Provisional Application No. 62/878,162, filed Jul. 24, 2019, the entire contents of the above application being incorporated herein by reference.
- Electronic shelf labels (ESLs) are gaining greater acceptance in the retail environment. Unlike standard paper shelf labels, information displayed on ESLs can be automatically updated from a central control server.
- Illustrative embodiments are shown by way of example in the accompanying drawings and should not be considered as a limitation of the present disclosure.
-
FIG. 1 illustrates a block diagram of an exemplary system for automated association of product information with electronic shelf labels in accordance with some embodiments described herein. -
FIG. 2A illustrates an overhead view of an exemplary embodiment that obtains images of paper shelf labels on modular units. -
FIG. 2B illustrates an overhead view of the exemplary embodiment ofFIG. 2A obtaining images of electronic shelf labels on modular units. -
FIG. 3A illustrates an image of paper labels obtained in an exemplary embodiment. -
FIG. 3B illustrates an image of electronic shelf labels obtained in an exemplary embodiment. -
FIG. 4 illustrates a block diagram of a remote computing device suitable for use with exemplary embodiments. -
FIG. 5 illustrates a network environment suitable for use with exemplary embodiments. -
FIG. 6 illustrates a flowchart for a method for automated association of product information with electronic shelf labels in an exemplary embodiment. - Described in detail herein are systems and methods for automated association of product information with electronic shelf labels. The systems and methods employ an autonomous robotic vehicle (ARV) alone or in combination with a remote computing device to detect pre-existing product information in the form of paper labels located on modular units. The ARV can then detect the location of electronic shelf labels (ESLs) after installation and can associate the pre-existing product information gleaned from the paper labels with the corresponding ESLs.
- ESLs are an increasingly desirable way to display product information to purchasers at a retailer. Because information displayed on ESLs can be automatically updated from a central control server, or at least updated wirelessly from a local device, pricing or other information can be updated or corrected on a regular basis without requiring an entity (such as a retail employee) to physically walk to the shelf and replace the paper label with a new label containing updated information.
- When a retailer opts to change from the existing paper labels to ESLs, the modular units that display products are conventionally modified to accommodate the ESLs. This is often done without removing the products from the modular unit. The process can involve removal of a portion of the modular unit that retains the paper labels, such as but not limited to shelf facings, and installation of a new portion that includes electronic shelf labels. Conventionally, after installation of the new portion including the ESLs, a person manually identifies each ESL one-by-one, consults the corresponding paper label (since removed from the modular unit) to determine the product information that should be associated with the ESL, and individually programs the ESL with the appropriate product information. This manual process utilizes a significant amount of labor to singly program each of the thousands of ESLs in a given retail facility. In addition, the manual process is repetitive and, thus, highly error-prone as it can be difficult to maintain correspondence between the ESLs and the removed paper labels over a work shift when each association must be made individually. Furthermore, errors are particularly difficult to detect for certain ESLs that display only price information for a product as the displayed price may not immediately indicate to the viewer that the association of the ESL with a product was made incorrectly.
- Systems and methods are described herein to automate the process of conversion for a facility from paper shelf labels to electronic shelf labels. By using an autonomous robotic vehicle to obtain initial images of paper shelf labels before removal and subsequent images of electronic shelf labels after placement on the shelf, systems and methods described herein can program multiple ESLs in a batch-processing fashion. As a result, the time and cost associated with initial manual programming of the ESLs and costs associated with correcting errors in the programming process are significantly reduced. Moreover, the process can be performed without human intervention, which enables the programming to be performed by the autonomous robotic vehicle and/or remote computing device while human labor resources are allocated elsewhere. Additionally, in some embodiments, the ARV can determine compliance or non-compliance of a modular unit with a planogram for the facility.
-
FIG. 1 illustrates a system 100 for automated association of product information with electronic shelf labels in accordance with an exemplary embodiment. The system 100 includes an autonomous robotic vehicle (ARV) 110 and a remote computing device 150. The ARV 110 includes a memory 116, a processor 115, at least one sensor 112, and a communications interface 114. The sensor 112 may be, but is not limited to, a camera or video camera capable of obtaining still or moving images. Optionally, the memory 116 of the ARV 110 can store an identification module 160 that can be executed by the processor 115. In one embodiment, the ARV is a ground-based autonomous vehicle. In another embodiment, the ARV may be an Unmanned Aerial Vehicle (UAV) capable of flight. The remote computing device 150 includes a processor 155, a communications interface 154, and a memory 156 that may store the identification module 160 that can be executed by the processor 155. The remote computing device 150 and/or the ARV 110 can be in communication with one or more databases 152 that include product information 142 related to products stored on the modular units. In some embodiments, the database 152 including product information 142 is implemented within the remote computing device 150. In some embodiments, the remote computing device 150 and/or the ARV 110 can be in communication with one or more ESLs 134 disposed on a modular unit. - Continuing with the description of
FIG. 1, the ARV 110 may execute instructions causing it to obtain one or more initial images of one or more modular units in a facility in which paper shelf labels on the modular units appear. The ARV 110 is configured to transmit the initial images to the remote computing device 150 using the communications interface 114. The ARV 110 may also execute instructions causing it to obtain one or more subsequent images of the same modular units in which electronic shelf labels 134 appear. The ARV 110 is configured to transmit the subsequent images to the remote computing device 150 using the communications interface 114. The remote computing device 150 receives the initial and subsequent images via the communications interface 154. The remote computing device 150 also executes the identification module 160 to determine the product information 142 associated with paper shelf labels 132 in the initial images and to determine identifying information for the electronic shelf labels 134 in the subsequent images. The execution of the identification module 160 determines the correspondence between the paper shelf labels 132 in the initial images and the ESLs 134 in the subsequent images and associates the proper product information 142 with the ESLs 134. Once informed of the association, the ARV 110 or remote computing device 150 can program the ESL 134 to display the correct product information 142. By automating identification and association between paper shelf labels and ESLs, the system 100 reduces human involvement in the process of preparing and programming the replacement ESLs upon removal of paper shelf labels on modular units and reduces rates of error in programming of the ESLs. - As shown in
FIGS. 2A and 2B, the ARV 110 can move in relation to the modular units 130 in the facility. In some embodiments, the ARV 110 can include wheels or treads to enable motion laterally with respect to the modular units 130 or to enable motion closer to or further from the modular units 130. In other embodiments, the ARV may hover in proximity of modular units containing paper labels or ESLs in a position enabling the ARV to obtain images. As the ARV 110 moves in relation to the modular units 130, the sensor 112 can obtain initial images of the modular units 130 and associated paper labels 132 as shown schematically in FIG. 2A. Each of the paper labels 132 can correspond to a product stored on the modular unit 130. In some embodiments, the images are sent from the ARV 110 to the remote computing device 150. For example, the ARV 110 may communicate with the remote computing device 150 via the communications interface 114 of the ARV 110 and the communications interface 154 of the remote computing device 150. In some embodiments, the communication may be performed using a wired or wireless communication standard including, but not limited to, 802.11x, BlueTooth®, Wi-Max, or any other suitable communications standard. As described below in greater detail, the initial images can be retained for further analysis at the ARV 110 in embodiments without a remote computing device 150. Movement of the ARV 110 and acquisition of images can be controlled by the processor 115 executing instructions on-board the ARV 110 in some embodiments. - After the image acquisition described above in relation to
FIG. 2A, the modular units 130 can be prepared for conversion to electronic shelf labels. For example, the modular units 130 can include a removable edge/shelf facing portion including the labels at the front of each shelf. The original removable edge portion including paper labels 132 can be removed and replaced with a new removable edge portion including ESLs 134. In some embodiments, the new removable edge portion can include a same number of ESLs 134 as the number of paper labels 132 on the original removable edge portion. In addition, each ESL 134 can be in a same position with respect to the removable edge portion as a position of the corresponding paper label 132 on the original removable edge portion. - After installation of the
ESLs 134 on the modular units 130, the ARV 110 can move relative to the modular units 130 and acquire subsequent images of the modular units 130 (subsequent to the addition of the ESLs) and associated ESLs 134 as shown schematically in FIG. 2B.
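The lateral image-acquisition pass described above can be planned with simple arithmetic. The sketch below is illustrative only: the field-of-view width and overlap fraction are assumptions, not values from the disclosure, chosen so that consecutive frames overlap enough to stitch or re-identify labels across images.

```python
def capture_positions(aisle_length_ft, fov_width_ft, overlap=0.3):
    """Plan lateral camera positions along a modular unit so that
    consecutive frames overlap by `overlap` (fraction of frame width),
    enabling stitching or re-identification of labels across frames."""
    step = fov_width_ft * (1.0 - overlap)
    positions = []
    x = fov_width_ft / 2.0          # center the first frame at the unit's start
    while True:
        positions.append(round(x, 3))
        if x + fov_width_ft / 2.0 >= aisle_length_ft:
            break                   # coverage has reached the end of the unit
        x += step
    return positions

# A 12 ft modular unit imaged with an assumed 4 ft field of view:
print(capture_positions(12.0, 4.0))  # [2.0, 4.8, 7.6, 10.4]
```

With a 30% overlap, each paper label or ESL near a frame boundary appears in two frames, which supports the stitching approach noted below.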
FIG. 3A depicts a portion of an image 300 obtained by the ARV 110 during the image acquisition process depicted in FIG. 2A. In the image 300, the modular unit 130, paper labels 132, and products 140 situated on shelves 135 of the modular unit 130 can appear. In some embodiments, a modular unit identifier 138 associated with the modular unit 130 can appear in the image 300. Although only a single image 300 is illustrated herein, it should be appreciated that the ARV 110 may obtain multiple images of the modular units 130 as the ARV 110 moves relative to the modular units 130 in exemplary embodiments. In some embodiments, the multiple images can include overlapping image content to enable stitching of the separate images or a similar method to identify the same objects in separate images. - The
image 300 can be analyzed by the identification module 160 performing video analytics to identify the paper shelf labels 132 appearing in the image 300. In some embodiments, the sensor 112 of the ARV 110 can acquire images of sufficiently high resolution that subsequent analysis can resolve information appearing on the paper shelf labels 132 from several feet away. For example, the sensor 112 can include optics and/or detection elements (such as charge coupled devices or CCDs) capable of producing an image including legible paper shelf labels 132 with 8-10 point font from five feet away. In some embodiments, the paper shelf labels 132 can include information associated with one or more products 140. For example, the paper shelf labels 132 can include a Universal Product Code (UPC), price information for the product, product serial numbers or other identification numbers, or a two-dimensional machine-translatable code such as a barcode or a QR Code® that identifies the product. - In some embodiments, the
identification module 160 is stored in the memory 156 of the remote computing device 150, and the initial images 300 are transmitted from the ARV 110 to the remote computing device 150 for analysis. In some embodiments, the identification module 160 is stored in the memory 116 of the ARV 110, and the initial image 300 is analyzed locally in the ARV 110. - In some embodiments, the
memory 116 of the ARV 110 or the memory 156 of the remote computing device 150 can include one or more label templates. The one or more label templates can include information, for example, as to the location of a barcode or other information within the borders of the paper label 132. As part of the image analysis and information extraction performed by the identification module 160, portions of the initial image 300 including images of paper shelf labels 132 can be compared to the one or more label templates to improve accuracy in isolation and/or determination of information appearing on the paper shelf labels 132. - In some embodiments, the
identification module 160 can compare information obtained from the paper shelf labels 132 to product information 142 retrieved from the one or more databases 152. The comparison ensures that the information was obtained without error from the paper shelf labels 132. Additionally, the comparison enables the identification module 160 to determine which product information 142 stored in the one or more databases is associated with each of the paper shelf labels 132. - The
identification module 160 can assess the location of the paper shelf labels 132 with respect to the modular units 130, with respect to one or more products 140 on the shelves 135, or with respect to both. The identification module 160 can identify the paper shelf labels 132 and associate the paper shelf label 132 with the nearest product 140 in some embodiments. In some embodiments, the identification module 160 can associate a location of each paper shelf label 132 on the modular unit 130 with the corresponding product information 142 in the database. - After the
ARV 110 acquires initial images (of which image 300 is an example), the paper shelf labels 132 are removed from the modular units 130. Then, ESLs 134 are affixed to the modular units 130 and subsequent images are acquired as described next.
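One concrete error check of the kind described above, comparing information read from a label against the databases 152, is validating the UPC-A check digit before looking the code up. A minimal sketch; the `product_db` contents are hypothetical placeholders for product information 142, not data from the disclosure:

```python
def upc_a_is_valid(upc: str) -> bool:
    """UPC-A check: 3 * (sum of digits in odd positions 1, 3, ..., 11)
    + (sum of digits in even positions 2, 4, ..., 10) + the check digit
    must be divisible by 10."""
    if len(upc) != 12 or not upc.isdigit():
        return False
    digits = [int(c) for c in upc]
    total = 3 * sum(digits[0:11:2]) + sum(digits[1:10:2]) + digits[11]
    return total % 10 == 0

# Hypothetical product database keyed by UPC (stand-in for databases 152).
product_db = {"036000291452": {"name": "example item", "price": 7.25}}

def lookup(label_upc):
    """Cross-check a UPC read from a paper label before associating it,
    catching misreads instead of propagating them to an ESL."""
    if not upc_a_is_valid(label_upc):
        return None                  # misread label: flag for review
    return product_db.get(label_upc)

print(lookup("036000291452"))
```

A failed check digit indicates the optical read was wrong, so the label can be re-imaged rather than associated with the wrong product.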
FIG. 3B illustrates a portion of an example image 300′ obtained by the ARV 110 during the image acquisition process depicted in FIG. 2B after ESLs 134 have been affixed to the modular units 130. In the image 300′, the modular units 130, ESLs 134, and products 140 situated on shelves 135 of the modular unit 130 can appear. In some embodiments, the modular unit identifier 138 associated with the modular unit 130 can appear in the image 300′. In some embodiments, the ESLs 134 can include identifying information. For example, the ESLs 134 can include a serial number or other individualized number or a two-dimensional machine-translatable code such as a barcode or a QR Code® that identifies the ESL 134. As described above with respect to FIG. 3A, the sensor 112 can produce images 300′ of sufficient quality as to enable the resolution and/or analysis of identifying information displayed on the ESL 134. - In some embodiments, the
identification module 160 is stored in the memory 156 of the remote computing device 150, and the subsequent image 300′ is transmitted from the ARV 110 to the remote computing device 150 for analysis. In some embodiments, the identification module 160 is stored in the memory 116 of the ARV 110, and the subsequent image 300′ is analyzed locally in the ARV 110. - The
identification module 160 can assess the location of the ESLs 134 with respect to the modular units 130, with respect to one or more products 140 on the shelves 135, or with respect to both. The identification module 160 can identify the ESLs 134 and associate the ESLs 134 with the nearest product 140 in some embodiments. - The
identification module 160 identifies a correspondence between each of the ESLs 134 in the subsequent image 300′ and one of the paper shelf labels 132 in the initial image 300. The correspondence can be identified based upon the locations of the paper shelf label 132 and the ESL 134 relative to the modular unit 130, relative to products 140 on shelves 135, or both. When a paper shelf label 132 is identified as being at a particular location in image 300 and an ESL 134 is identified as being at the same location in image 300′, the paper shelf label 132 and the ESL 134 correspond. - The
identification module 160 associates product information 142 previously assigned to each of the paper shelf labels 132 to the corresponding ESL 134. In this way, each ESL 134 affixed on the modular unit 130 is properly associated with the product nearest to it on the shelf 135. In some embodiments, the identification module 160 can transmit instructions to the ARV 110 to program the ESL 134 with the associated product information 142. Alternatively, if the remote computing device 150 is able to communicate directly or indirectly with the ESL, the remote computing device can program each ESL 134 with product information 142 by transmitting instructions to do so via the communications interface 154. In some embodiments, the ESL 134 can display the product information 142 such as, but not limited to, price information. - In some embodiments, the
identification module 160 performs video analytics and identifies and analyzes the modular unit identifier 138 disposed on the modular unit 130 and appearing in the initial images 300, the subsequent images 300′, or both. The modular unit identifier 138 can include information specific to each modular unit 130 such as a serial number or two-dimensional machine-translatable code. In some embodiments, the modular unit identifier 138 can include information related to the position of the modular unit 130 within the facility such as a number or graphic keyed to a planogram of the facility. The identification module 160 can identify a location of the modular unit 130 within the facility based on the analysis of the modular unit identifier 138 with respect to stored facility location information. In some embodiments, the analysis of the modular unit identifier 138 includes an analysis of the planogram of the facility. Once the location of the modular unit 130 within the facility has been identified, the location can be associated with the identifying information of a corresponding ESL 134 that is affixed to that modular unit 130. Identification of the location of an ESL 134 (on a modular unit 130) within the facility provides the advantage that the ESL 134 can be programmed with product information 142 that is tailored to the location of the associated product within the facility. For example, the facility may have two customer zones in which a product is sold at different prices. The first zone may be the general merchandise section of the facility while the second zone may be a special “convenience” section, a limited-availability sale section (e.g., a section including “doorbuster” products in limited quantities or for limited times), or a specialized section such as a home and garden section. Thus, an ESL 134 for the same product may display different product information 142 depending upon the location of the ESL 134 within the facility.
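The zone-dependent display just described reduces to a keyed lookup once an ESL's location has been resolved. A sketch with hypothetical zone names and prices; the disclosure does not specify a data model:

```python
# Hypothetical zone-aware price table: the same product may carry
# different product information 142 depending on the facility zone
# where the modular unit (and thus the ESL) is located.
zone_prices = {
    ("036000291452", "general"): 7.25,
    ("036000291452", "convenience"): 8.50,   # illustrative premium zone
}

def esl_display_price(upc, esl_zone, default_zone="general"):
    """Pick the price to program into an ESL given its resolved zone,
    falling back to the base zone when no zone-specific price exists."""
    price = zone_prices.get((upc, esl_zone))
    if price is None:
        price = zone_prices.get((upc, default_zone))
    return price

print(esl_display_price("036000291452", "convenience"))  # 8.5
```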
The identification module 160 can program the ESL 134 with product information 142 that takes into account not only the identifying information of the ESL 134 but also associated location information. - In some embodiments, the
ARV 110 stores a planogram of the facility in memory and can check the accuracy of the planogram of the facility after image acquisition. For example, the ARV 110 can confirm that one or more ESLs 134 (e.g., the location or identity of the ESLs 134) corresponds to the planogram of the facility and transmits a notification to the remote computing device 150. Alternatively or in addition, the ARV 110 can confirm that one or more ESLs 134 fail to correspond to the planogram of the facility and can transmit a notification to the remote computing device 150. The notification can include the identifying information for the one or more ESLs 134. Upon receipt of the notification that the ESL fails to correspond to the planogram, the remote computing device 150 can issue an alert. In one embodiment, the alert may be transmitted to a store associate that can then remedy the discrepancy if necessary. In another embodiment, the alert may be transmitted to the same or different ARV capable of performing an action to remedy the planogram issue. For example, if the ARV is equipped with an articulating arm capable of placing and removing items, the ARV may be tasked by the remote computing device with adding or removing items to or from the modular unit until the modular unit corresponds with the planogram.
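The planogram check can be sketched as a comparison of observed placements against expected ones. The location keys and UPC values below are hypothetical stand-ins for the identifying information a notification would carry:

```python
# Hypothetical records: (modular unit, shelf, slot) -> expected/observed UPC.
planogram = {("mod-7", 0, 0): "036000291452", ("mod-7", 0, 1): "012345678905"}
observed  = {("mod-7", 0, 0): "036000291452", ("mod-7", 0, 1): "049000042566"}

def compliance_report(planogram, observed):
    """Flag locations where what the ARV saw disagrees with the planogram,
    mirroring the notification the ARV sends to the remote computing device."""
    issues = []
    for loc, expected in planogram.items():
        seen = observed.get(loc)
        if seen != expected:
            issues.append({"location": loc, "expected": expected, "seen": seen})
    return issues

print(compliance_report(planogram, observed))
```

Each entry in the report identifies a discrepancy that could be routed to a store associate or to an ARV with an articulating arm, as described above.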
FIG. 4 is a block diagram of a remote computing device 150 suitable for use with exemplary embodiments of the present disclosure. The remote computing device 150 may be, but is not limited to, a smartphone, laptop, tablet, desktop computer, server, or network appliance. The remote computing device 150 includes one or more non-transitory computer-readable media for storing one or more computer-executable instructions or software for implementing exemplary embodiments. The non-transitory computer-readable media may include, but are not limited to, one or more types of hardware memory, non-transitory tangible media (for example, one or more magnetic storage disks, one or more optical disks, one or more flash drives, one or more solid state disks), and the like. For example, memory 156 included in the remote computing device 150 may store computer-readable and computer-executable instructions or software (e.g., the identification module 160) for implementing exemplary operations of the remote computing device 150. The remote computing device 150 also includes configurable and/or programmable processor 155 and associated core(s) 404, and optionally, one or more additional configurable and/or programmable processor(s) 402′ and associated core(s) 404′ (for example, in the case of computer systems having multiple processors/cores), for executing computer-readable and computer-executable instructions or software stored in the memory 156 and other programs for implementing exemplary embodiments of the present disclosure. Processor 155 and processor(s) 402′ may each be a single core processor or multiple core (404 and 404′) processor. Either or both of processor 155 and processor(s) 402′ may be configured to execute one or more of the instructions described in connection with remote computing device 150. - Virtualization may be employed in the
remote computing device 150 so that infrastructure and resources in the remote computing device 150 may be shared dynamically. A virtual machine 412 may be provided to handle a process running on multiple processors so that the process appears to be using only one computing resource rather than multiple computing resources. Multiple virtual machines may also be used with one processor. -
Memory 156 may include a computer system memory or random access memory, such as DRAM, SRAM, EDO RAM, and the like. Memory 156 may include other types of memory as well, or combinations thereof. - A user may interact with the
remote computing device 150 through a visual display device 152, such as a computer monitor, which may display one or more graphical user interfaces 416. The user may interact with the remote computing device 150 using a multi-point touch interface 420 or a pointing device 418. - The
remote computing device 150 may also include one or more computer storage devices 426, such as a hard-drive, CD-ROM, or other computer readable media, for storing data and computer-readable instructions and/or software that implement exemplary embodiments of the present disclosure (e.g., applications). For example, exemplary storage device 426 can include one or more databases 152 for storing product information 142, location information for paper shelf labels 132 or ESLs 134, planograms of the facility, or identifying information related to ESLs 134. The databases 152 may be updated manually or automatically at any suitable time to add, delete, and/or update one or more data items in the databases. - The
remote computing device 150 can include a communications interface 154 configured to interface via one or more network devices 424 with one or more networks, for example, Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (for example, 802.11, T1, T3, 56 kb, X.25), broadband connections (for example, ISDN, Frame Relay, ATM), wireless connections, controller area network (CAN), or some combination of any or all of the above. In exemplary embodiments, the remote computing device 150 can include one or more antennas 422 to facilitate wireless communication (e.g., via the network interface) between the remote computing device 150 and a network and/or between the remote computing device 150 and the ARV 110. The communications interface 154 may include a built-in network adapter, network interface card, PCMCIA network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the remote computing device 150 to any type of network capable of communication and performing the operations described herein. - The
remote computing device 150 may run operating system 410, such as versions of the Microsoft® Windows® operating systems, different releases of the Unix and Linux operating systems, versions of the MacOS® for Macintosh computers, embedded operating systems, real-time operating systems, open source operating systems, proprietary operating systems, or other operating system capable of running on the remote computing device 150 and performing the operations described herein. In exemplary embodiments, the operating system 410 may be run in native mode or emulated mode. In an exemplary embodiment, the operating system 410 may be run on one or more cloud machine instances. -
FIG. 5 illustrates a network environment 500 including the ARV 110 and remote computing device 150 suitable for use with exemplary embodiments. The network environment 500 can include one or more databases 152, one or more ARVs 110, one or more ESLs 134, and one or more remote computing devices 150 that can communicate with one another via a communications network 505. - The
remote computing device 150 can host one or more applications (e.g., the identification module 160) configured to interact with one or more components of the ARVs 110 and/or to facilitate access to the content of the databases 152. The databases 152 may store information or data as described above herein. For example, the databases 152 can include product information 142, identifying information for one or more ESLs 134, one or more planograms for the facility, and location information associated with paper shelf labels 132 and/or ESLs 134. The databases 152 can be located at one or more geographically distributed locations away from the ARVs 110 and/or the remote computing device 150. Alternatively, the databases 152 can be located at the same geographical location as the remote computing device 150 and/or at the same geographical location as the ARVs 110. - In an example embodiment, one or more portions of the
communications network 505 can be an ad hoc network, a mesh network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless wide area network (WWAN), a metropolitan area network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a Wi-Fi network, a WiMAX network, an Internet-of-Things (IoT) network established using BlueTooth® or any other protocol, any other type of network, or a combination of two or more such networks. -
FIG. 6 illustrates a flowchart for a method 600 for automated association of product information with electronic shelf labels in an exemplary embodiment. The method 600 includes obtaining initial images 300 of modular units 130 in a facility using at least one sensor 112 of an autonomous robotic vehicle (ARV) 110 (step 602). The modular units 130 include multiple paper shelf labels 132. The initial images 300 are taken before removal of the paper shelf labels 132 from the modular units 130. The method 600 further includes obtaining, using the at least one sensor 112, subsequent images 300′ of the modular units 130 (step 604). The subsequent images 300′ are taken after multiple electronic shelf labels 134 are affixed to the modular units 130. - The
method 600 also includes retrieving product information 142 from one or more databases 152 holding product information 142 associated with products 140 assigned to the modular units 130 in the facility (step 606). The method 600 additionally includes analyzing the initial images 300 to identify the paper shelf labels 132 appearing in the initial images 300 to determine the product information 142 associated with each of the paper shelf labels 132 (step 608). The method 600 includes analyzing the electronic shelf labels 134 disposed on the modular units 130 that appear in the subsequent images 300′ to determine identifying information associated with each of the electronic shelf labels 134 (step 610). - Additionally, the
method 600 includes identifying a correspondence between each of the electronic shelf labels 134 and one of the paper shelf labels 132 (step 612). The method 600 also includes associating product information 142 previously assigned to each of the paper shelf labels 132 with the corresponding one of the electronic shelf labels 134 (step 614). Following the association of paper shelf label to ESL, the corresponding one of the electronic shelf labels is programmed with the product information by the remote computing device or the ARV (step 616). - In describing exemplary embodiments, specific terminology is used for the sake of clarity. For purposes of description, each specific term is intended to at least include all technical and functional equivalents that operate in a similar manner to accomplish a similar purpose. Additionally, in some instances where a particular exemplary embodiment includes multiple system elements, device components or method steps, those elements, components or steps may be replaced with a single element, component, or step. Likewise, a single element, component, or step may be replaced with multiple elements, components, or steps that serve the same purpose. Moreover, while exemplary embodiments have been shown and described with references to particular embodiments thereof, those of ordinary skill in the art will understand that various substitutions and alterations in form and detail may be made therein without departing from the scope of the present disclosure. Further still, other aspects, functions, and advantages are also within the scope of the present disclosure.
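Steps 602-616 of the method 600 described above can be summarized in code. This is a schematic sketch rather than the claimed implementation: the image-analysis steps are represented by already-extracted records, and "slot" is a hypothetical stand-in for the position a label occupies on its modular unit.

```python
# Step 608 output: paper labels found in the initial images 300.
paper_labels = [
    {"slot": ("mod-7", 0, 0), "upc": "036000291452"},
    {"slot": ("mod-7", 0, 1), "upc": "012345678905"},
]
# Step 610 output: ESLs found in the subsequent images 300'.
esls = [
    {"slot": ("mod-7", 0, 0), "esl_id": "ESL-A"},
    {"slot": ("mod-7", 0, 1), "esl_id": "ESL-B"},
]

def associate_and_program(paper_labels, esls, program):
    """Steps 612-616: match each ESL to the paper label that occupied the
    same slot, then program it with that label's product information."""
    upc_by_slot = {p["slot"]: p["upc"] for p in paper_labels}
    programmed = {}
    for e in esls:
        upc = upc_by_slot.get(e["slot"])
        if upc is not None:
            program(e["esl_id"], upc)    # e.g., via the ARV or remote device
            programmed[e["esl_id"]] = upc
    return programmed

result = associate_and_program(paper_labels, esls, lambda esl_id, upc: None)
print(result)  # {'ESL-A': '036000291452', 'ESL-B': '012345678905'}
```

The `program` callback stands in for whichever channel (ARV radio or remote computing device) actually writes the product information to the ESL.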
- Exemplary flowcharts are provided herein for illustrative purposes and are non-limiting examples of methods. One of ordinary skill in the art will recognize that exemplary methods may include more or fewer steps than those illustrated in the exemplary flowcharts, and that the steps in the exemplary flowcharts may be performed in a different order than the order shown in the illustrative flowcharts.
Claims (18)
1. A system for automated association of product information with electronic shelf labels, comprising:
a remote computing device that includes a processor, a memory, and a communications interface, the remote computing device configured to execute an identification module;
one or more databases holding product information associated with products assigned to a plurality of modular units in a facility; and
an autonomous robotic vehicle (ARV) that includes at least one sensor, a communications interface, a processor, and a memory, the memory storing instructions that, when executed by the processor, cause the ARV to:
obtain, using the at least one sensor, a plurality of initial images of a plurality of modular units in the facility, the plurality of modular units including a plurality of paper shelf labels, the plurality of initial images taken before a removal of the plurality of paper shelf labels from the plurality of modular units;
transmit the plurality of initial images to the remote computing device;
obtain, using the at least one sensor, a plurality of subsequent images of the plurality of modular units, the plurality of subsequent images taken after a plurality of electronic shelf labels are affixed to the plurality of modular units;
transmit the plurality of subsequent images to the remote computing device;
wherein the identification module when executed:
retrieves product information from the one or more databases,
analyzes the plurality of initial images to identify the plurality of paper shelf labels appearing in the plurality of initial images to determine the product information associated with each of the plurality of paper shelf labels,
analyzes the plurality of electronic shelf labels disposed on the plurality of modular units that appear in the plurality of subsequent images to determine identifying information associated with each of the plurality of electronic shelf labels,
identifies a correspondence between each of the plurality of electronic shelf labels and one of the plurality of paper shelf labels, and
associates product information previously assigned to each of the plurality of paper shelf labels with the corresponding one of the plurality of electronic shelf labels, wherein the corresponding one of the plurality of electronic shelf labels is programmed with the product information.
2. The system of claim 1, wherein the identification module when executed:
transmits instructions to the ARV to program the plurality of electronic shelf labels with the associated product information.
3. The system of claim 1, wherein the identification module when executed:
analyzes a modular unit identifier disposed on the modular unit and appearing in the plurality of initial images;
identifies a location of the modular unit within a facility based on the analysis of the modular unit identifier and a planogram of the facility; and
associates the location with the identifying information of a corresponding one of the plurality of electronic shelf labels.
4. The system of claim 1, wherein the ARV stores a planogram of the facility and uses the planogram to confirm that one of the plurality of electronic shelf labels corresponds to the planogram of the facility and transmits a notification to the remote computing device.
5. The system of claim 1, wherein the ARV stores a planogram of the facility and uses the planogram to confirm that one of the plurality of electronic shelf labels fails to correspond to the planogram of the facility and transmits a notification to the remote computing device.
6. A system for automated association of product information with electronic shelf labels, comprising:
one or more databases holding product information associated with products assigned to a plurality of modular units in a facility; and
an autonomous robotic vehicle (ARV) that includes at least one sensor, an identification module, a processor, and a memory, the memory storing instructions that, when executed by the processor, cause the ARV to:
obtain, using the at least one sensor, a plurality of initial images of a plurality of modular units in the facility, the plurality of modular units including a plurality of paper shelf labels, the plurality of initial images taken before a removal of the plurality of paper shelf labels from the plurality of modular units;
obtain, using the at least one sensor, a plurality of subsequent images of the plurality of modular units, the plurality of subsequent images taken after a plurality of electronic shelf labels are affixed to the plurality of modular units;
wherein the identification module when executed:
retrieves product information from the one or more databases,
analyzes the plurality of initial images to identify the plurality of paper shelf labels appearing in the plurality of initial images to determine the product information associated with each of the plurality of paper shelf labels,
analyzes the plurality of electronic shelf labels disposed on the modular unit that appear in the plurality of subsequent images to determine identifying information associated with each of the plurality of electronic shelf labels,
identifies a correspondence between each of the plurality of electronic shelf labels and one of the plurality of paper shelf labels, and
associates product information previously assigned to each of the plurality of paper shelf labels with the corresponding one of the plurality of electronic shelf labels;
wherein the corresponding one of the plurality of electronic shelf labels is programmed with the product information.
7. The system of claim 6, wherein the ARV includes a communications interface and the identification module when executed:
programs the plurality of electronic shelf labels with the associated product information using the communications interface.
8. The system of claim 6, wherein the identification module when executed:
analyzes a modular unit identifier disposed on the modular unit and appearing in the plurality of initial images;
identifies a location of the modular unit within the facility based on the analysis of the modular unit identifier and a planogram of the facility; and
associates the location with the identifying information of a corresponding one of the plurality of electronic shelf labels.
9. The system of claim 6, wherein the ARV stores a planogram of the facility and uses the planogram to confirm that one of the plurality of electronic shelf labels corresponds to the planogram of the facility and transmits a notification to a remote computing device.
10. The system of claim 6, wherein the ARV stores a planogram of the facility and uses the planogram to confirm that one of the plurality of electronic shelf labels fails to correspond to the planogram of the facility and transmits a notification to a remote computing device.
11. The system of claim 6, wherein the ARV includes the one or more databases.
12. The system of claim 6, wherein the one or more databases include one or more communications interfaces and the ARV includes a communications interface, and wherein the identification module retrieves product information from the one or more databases by using the communications interface of the ARV to receive product information from the one or more communications interfaces of the one or more databases.
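Claims 9 and 10 have the ARV check each electronic shelf label against a stored planogram and notify a remote computing device of correspondence or failure to correspond. A hedged sketch of that check, assuming the planogram is representable as a slot-to-product mapping (all names are illustrative, not recited in the claims):

```python
def planogram_check(esl_assignments: dict[str, str],
                    planogram: dict[str, str]) -> list[str]:
    """Compare each ESL's assigned product against the product the planogram
    expects at that label's slot, and build one notification per label."""
    notes = []
    for esl_id, product in esl_assignments.items():
        expected = planogram.get(esl_id)
        if expected == product:
            notes.append(f"{esl_id}: corresponds to planogram ({product})")
        else:
            notes.append(f"{esl_id}: mismatch, expected {expected!r}, found {product!r}")
    return notes
```

In the claimed system the resulting notifications would be transmitted to the remote computing device over the ARV's communications interface rather than returned locally.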
13. A method for automated association of product information with electronic shelf labels, comprising:
obtaining a plurality of initial images of a plurality of modular units in a facility using at least one sensor of an autonomous robotic vehicle (ARV), the plurality of modular units including a plurality of paper shelf labels, the plurality of initial images taken before a removal of the plurality of paper shelf labels from the plurality of modular units;
obtaining, using the at least one sensor, a plurality of subsequent images of the plurality of modular units, the plurality of subsequent images taken after a plurality of electronic shelf labels are affixed to the plurality of modular units;
retrieving product information from one or more databases holding product information associated with products assigned to the plurality of modular units in the facility,
analyzing the plurality of initial images to identify the plurality of paper shelf labels appearing in the plurality of initial images to determine the product information associated with each of the plurality of paper shelf labels,
analyzing the plurality of electronic shelf labels disposed on the plurality of modular units that appear in the plurality of subsequent images to determine identifying information associated with each of the plurality of electronic shelf labels,
identifying a correspondence between each of the plurality of electronic shelf labels and one of the plurality of paper shelf labels, and
associating product information previously assigned to each of the plurality of paper shelf labels with the corresponding one of the plurality of electronic shelf labels; and
programming the corresponding one of the plurality of electronic shelf labels with the product information.
14. The method of claim 13, further comprising:
programming the plurality of electronic shelf labels with the associated product information using the ARV.
15. The method of claim 13, further comprising:
analyzing a modular unit identifier disposed on a modular unit in the plurality of modular units and appearing in the plurality of initial images;
identifying a location of the modular unit within the facility based on the analysis of the modular unit identifier and a planogram of the facility; and
associating the location with the identifying information of a corresponding one of the plurality of electronic shelf labels.
16. The method of claim 13, wherein the ARV stores a planogram of the facility, the method further comprising:
determining, using the stored planogram, whether one of the plurality of electronic shelf labels corresponds to the planogram of the facility; and
upon a determination of correspondence, transmitting a notification from the ARV to a remote computing device.
17. The method of claim 13, wherein the ARV stores a planogram of the facility, the method further comprising:
determining, using the stored planogram, whether one of the plurality of electronic shelf labels fails to correspond to the planogram of the facility; and
upon a determination of failure to correspond, transmitting a notification from the ARV to a remote computing device.
18. The method of claim 13, further comprising:
transmitting the plurality of initial images to a remote computing device using a communications interface of the ARV; and
transmitting the plurality of subsequent images to the remote computing device,
wherein analyzing the plurality of initial images and analyzing the plurality of electronic shelf labels disposed on the plurality of modular units that appear in the plurality of subsequent images are performed by the remote computing device.
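The location step of claim 15 reduces to a lookup: the modular unit identifier decoded from the initial images is resolved against the facility planogram, and the resulting location is associated with the identifying information of every ESL on that unit. The identifier format, location strings, and function name below are illustrative assumptions:

```python
def associate_locations(unit_identifier: str,
                        esl_ids: list[str],
                        planogram: dict[str, str]) -> dict[str, str]:
    """Resolve the modular unit's facility location from the planogram and
    associate it with every ESL affixed to that unit."""
    location = planogram[unit_identifier]  # e.g. "aisle 7, bay 3"
    return {esl_id: location for esl_id in esl_ids}
```

Tagging each ESL with a physical location in this way is what later lets the system route price updates or mismatch notifications to a specific aisle and bay.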
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/935,688 US20210027245A1 (en) | 2019-07-24 | 2020-07-22 | Systems and methods for automated association of product information with electronic shelf labels |
US17/522,252 US11580495B2 (en) | 2019-07-24 | 2021-11-09 | Systems and methods for automated association of product information with electronic shelf labels |
US18/155,538 US20230153756A1 (en) | 2019-07-24 | 2023-01-17 | Systems and methods for automated association of product information with electronic shelf labels |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962878162P | 2019-07-24 | 2019-07-24 | |
US16/935,688 US20210027245A1 (en) | 2019-07-24 | 2020-07-22 | Systems and methods for automated association of product information with electronic shelf labels |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/522,252 Continuation US11580495B2 (en) | 2019-07-24 | 2021-11-09 | Systems and methods for automated association of product information with electronic shelf labels |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210027245A1 true US20210027245A1 (en) | 2021-01-28 |
Family
ID=74190462
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/935,688 Abandoned US20210027245A1 (en) | 2019-07-24 | 2020-07-22 | Systems and methods for automated association of product information with electronic shelf labels |
US17/522,252 Active US11580495B2 (en) | 2019-07-24 | 2021-11-09 | Systems and methods for automated association of product information with electronic shelf labels |
US18/155,538 Abandoned US20230153756A1 (en) | 2019-07-24 | 2023-01-17 | Systems and methods for automated association of product information with electronic shelf labels |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/522,252 Active US11580495B2 (en) | 2019-07-24 | 2021-11-09 | Systems and methods for automated association of product information with electronic shelf labels |
US18/155,538 Abandoned US20230153756A1 (en) | 2019-07-24 | 2023-01-17 | Systems and methods for automated association of product information with electronic shelf labels |
Country Status (2)
Country | Link |
---|---|
US (3) | US20210027245A1 (en) |
CA (1) | CA3087587A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11276033B2 (en) * | 2017-12-28 | 2022-03-15 | Walmart Apollo, Llc | System and method for fine-tuning sales clusters for stores |
US11853834B2 (en) * | 2019-06-14 | 2023-12-26 | Ses-Imagotag Gmbh | Electronic shelf labelling system with a shelf edge strip sub-system |
CA3087587A1 (en) * | 2019-07-24 | 2021-01-24 | Walmart Apollo, Llc | Systems and methods for automated association of product information with electronic shelf labels |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5307294A (en) | 1992-12-22 | 1994-04-26 | Aman James A | Automated end tally system |
US7693757B2 (en) | 2006-09-21 | 2010-04-06 | International Business Machines Corporation | System and method for performing inventory using a mobile inventory robot |
KR20120074813A (en) * | 2010-12-28 | 2012-07-06 | 삼성전기주식회사 | Method for operating of esl system |
WO2017172790A1 (en) | 2016-03-29 | 2017-10-05 | Bossa Nova Robotics Ip, Inc. | Planogram assisted inventory system and method |
JP6744430B2 (en) | 2016-05-19 | 2020-08-19 | シムビ ロボティクス, インコーポレイテッドSimbe Robotics, Inc. | How to automatically generate a shelf allocation table that assigns products to a shelf structure in a store |
US20180257228A1 (en) | 2017-03-10 | 2018-09-13 | Walmart Apollo, Llc | Systems and methods for robotic assistance with retail location monitoring |
CL2017003463A1 (en) | 2017-12-28 | 2019-10-11 | Univ Pontificia Catolica Chile | Autonomous robotic system for automatic monitoring of the status of shelves in stores |
US10846561B1 (en) * | 2020-04-01 | 2020-11-24 | Scandit Ag | Recognition and selection of discrete patterns within a scene or image |
US11069073B2 (en) | 2019-07-23 | 2021-07-20 | Advanced New Technologies Co., Ltd. | On-shelf commodity detection method and system |
CA3087587A1 (en) * | 2019-07-24 | 2021-01-24 | Walmart Apollo, Llc | Systems and methods for automated association of product information with electronic shelf labels |
US11514665B2 (en) * | 2020-04-01 | 2022-11-29 | Scandit Ag | Mapping optical-code images to an overview image |
2020
- 2020-07-22 CA CA3087587A patent/CA3087587A1/en not_active Abandoned
- 2020-07-22 US US16/935,688 patent/US20210027245A1/en not_active Abandoned
2021
- 2021-11-09 US US17/522,252 patent/US11580495B2/en active Active
2023
- 2023-01-17 US US18/155,538 patent/US20230153756A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
CA3087587A1 (en) | 2021-01-24 |
US20220067653A1 (en) | 2022-03-03 |
US11580495B2 (en) | 2023-02-14 |
US20230153756A1 (en) | 2023-05-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11580495B2 (en) | Systems and methods for automated association of product information with electronic shelf labels | |
US20200265494A1 (en) | Remote sku on-boarding of products for subsequent video identification and sale | |
US10664692B2 (en) | Visual task feedback for workstations in materials handling facilities | |
KR102492129B1 (en) | System and method for dynamic inventory management | |
US11887051B1 (en) | Identifying user-item interactions in an automated facility | |
US20180270631A1 (en) | Object Identification Detection System | |
US11783288B2 (en) | Systems and methods for rush order fulfillment optimization | |
US11514665B2 (en) | Mapping optical-code images to an overview image | |
US10769585B2 (en) | Systems and methods for automated harmonized system (HS) code assignment | |
US11138551B2 (en) | Bundled application for load management system | |
US20180218471A1 (en) | Systems and methods for displaying an item in a selected storage location using augmented reality | |
US20230316250A1 (en) | Location based register rules | |
CN104200300A (en) | Packaging and printing enterprise production cost estimation system | |
US11397910B2 (en) | System and method for product recognition and assignment at an automated storage and retrieval device | |
US11238401B1 (en) | Identifying user-item interactions in an automated facility | |
US10372753B2 (en) | System for verifying physical object absences from assigned regions using video analytics | |
KR102278061B1 (en) | System for managing buying orders, device for buying agencies, and information processing method thereof | |
US20190045025A1 (en) | Distributed Recognition Feedback Acquisition System | |
US20190019339A1 (en) | Systems and methods for dynamically displaying information about an object using augmented reality | |
US10552685B2 (en) | Systems and methods for locating physical object using live image feeds | |
US11494729B1 (en) | Identifying user-item interactions in an automated facility | |
US20190102831A1 (en) | System and Method for Virtual Display of Customized Products in a Facility | |
US11556891B2 (en) | Operations system for combining independent product monitoring systems to automatically manage product inventory and product pricing and automate store processes | |
US11823124B2 (en) | Inventory management and delivery through image and data integration | |
US20240144340A1 (en) | Remote SKU On-Boarding of Products for Subsequent Video Identification and Sale |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: WALMART APOLLO, LLC, ARKANSAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HIGH, DONALD RAY;FAITAK, MARTIN THOMAS;SIGNING DATES FROM 20190728 TO 20190805;REEL/FRAME:053293/0038 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |