GB2543136A - Systems, devices and methods for monitoring modular compliance in a shopping space - Google Patents
- Publication number
- GB2543136A (Application GB1613873.7A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- display module
- motorized transport
- item
- images
- baseline condition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- G06T7/0002: Image analysis; inspection of images, e.g. flaw detection
- G06Q30/0607: Electronic shopping [e-shopping]; regulated transactions
- G06F18/22: Pattern recognition; analysing; matching criteria, e.g. proximity measures
- G06Q10/00: Administration; management
- G06Q10/06: Resources, workflows, human or project management; enterprise or organisation planning; enterprise or organisation modelling
- G06Q30/0641: Electronic shopping [e-shopping]; shopping interfaces
- G06V10/75: Image or video pattern matching; organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; coarse-fine approaches, e.g. multi-scale approaches; context analysis; selection of dictionaries
- G06V20/10: Scenes; scene-specific elements; terrestrial scenes
- G06Q30/06: Commerce; buying, selling or leasing transactions
Abstract
Monitoring a condition of display modules in a retail space includes obtaining one or more images of a display module 650 in the shopping space from an image capture device 620 and retrieving, from a baseline condition database 610, a baseline condition model for a display space corresponding to the display module. The images are compared with the baseline condition model to determine a modular compliance status for the display module. The baseline condition model may be a three-dimensional model. The determination of the compliance status may involve comparing item placement, signage placement or content, item quantity, or display area cleanliness with the baseline model, and may also use information on environment temperature or barcode or RFID scans. The camera for capturing images may be mounted on a centrally controlled motorized transport unit. A follow-up task may be determined based on the compliance status.
Description
SYSTEMS, DEVICES AND METHODS FOR MONITORING MODULAR COMPLIANCE IN A SHOPPING SPACE
Cross-Reference To Related Application [0001] This application claims the benefit of U.S. Provisional Application No. 62/205,548, filed August 14, 2015, which is incorporated herein by reference.
Technical Field [0002] These teachings relate generally to shopping environments and more particularly to devices, systems and methods for assisting customers and/or workers in those shopping environments.
Background [0003] In a modern retail store environment, there is a need to improve the customer experience and/or convenience for the customer. Whether shopping in a large format (big box) store or smaller format (neighborhood) store, customers often require assistance that employees of the store are not always able to provide. For example, particularly during peak hours, there may not be enough employees available to assist customers such that customer questions go unanswered. Additionally, due to high employee turnover rates, available employees may not be fully trained or have access to information to adequately support customers. Other routine tasks also are difficult to keep up with, particularly during peak hours. For example, shopping carts are left abandoned, aisles become messy, inventory is not displayed in the proper locations or is not even placed on the sales floor, shelf prices may not be properly set, and theft is hard to discourage. All of these issues can result in low customer satisfaction or reduced convenience to the customer. With increasing competition from non-traditional shopping mechanisms, such as online shopping provided by e-commerce merchants and alternative store formats, it can be important for “brick and mortar” retailers to focus on improving the overall customer experience and/or convenience.
Brief Description of the Drawings [0004] The above needs are at least partially met through provision of embodiments of systems, devices, and methods designed to provide assistance to customers and/or workers in a shopping facility, such as described in the following detailed description, particularly when studied in conjunction with the drawings, wherein:
[0005] FIG. 1 comprises a block diagram of a shopping assistance system as configured in accordance with various embodiments of these teachings;
[0006] FIGS. 2A and 2B are illustrations of a motorized transport unit of the system of FIG. 1 in a retracted orientation and an extended orientation in accordance with some embodiments;
[0007] FIGS. 3A and 3B are illustrations of the motorized transport unit of FIGS. 2A and 2B detachably coupling to a movable item container, such as a shopping cart, in accordance with some embodiments;
[0008] FIG. 4 comprises a block diagram of a motorized transport unit as configured in accordance with various embodiments of these teachings;
[0009] FIG. 5 comprises a block diagram of a computer device as configured in accordance with various embodiments of these teachings;
[0010] FIG. 6 comprises a block diagram of a system for monitoring modular compliance in accordance with some embodiments.
[0011] FIG. 7 comprises a flow diagram of a method for monitoring modular compliance in accordance with some embodiments.
[0012] FIG. 8 comprises an illustration of a baseline model and a store shelf compared for ensuring modular compliance in accordance with some embodiments.
[0013] FIG. 9 comprises a flow diagram of a process for monitoring modular compliance in accordance with some embodiments.
[0014] Elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions and/or relative positioning of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present teachings. Also, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present teachings. Certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required. The terms and expressions used herein have the ordinary technical meaning as is accorded to such terms and expressions by persons skilled in the technical field as set forth above except where different specific meanings have otherwise been set forth herein.
Detailed Description [0015] The following description is not to be taken in a limiting sense, but is made merely for the purpose of describing the general principles of exemplary embodiments. Reference throughout this specification to "one embodiment," "an embodiment," or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment," "in an embodiment," and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0016] Generally speaking, pursuant to various embodiments, systems, devices and methods are provided for assistance of persons at a shopping facility. Generally, assistance may be provided to customers or shoppers at the facility and/or to workers at the facility. The facility may be any type of shopping facility at a location in which products for display and/or for sale are variously distributed throughout the shopping facility space. The shopping facility may be a retail sales facility, or any other type of facility in which products are displayed and/or sold. The shopping facility may include one or more of sales floor areas, checkout locations, parking locations, entrance and exit areas, stock room areas, stock receiving areas, hallway areas, common areas shared by merchants, and so on. Generally, a shopping facility includes areas that may be dynamic in terms of the physical structures occupying the space or area and objects, items, machinery and/or persons moving in the area. For example, the shopping area may include product storage units, shelves, racks, modules, bins, etc., and other walls, dividers, partitions, etc. that may be configured in different layouts or physical arrangements. In other examples, persons or other movable objects may be freely and independently traveling through the shopping facility space. In still other examples, the persons or movable objects move according to known travel patterns and timing. The facility may be of any size or format, and may include products from one or more merchants.
For example, a facility may be a single store operated by one merchant or may be a collection of stores covering multiple merchants such as a mall. Generally, the system makes use of automated, robotic mobile devices, e.g., motorized transport units, that are capable of self-powered movement through a space of the shopping facility and providing any number of functions. Movement and operation of such devices may be controlled by a central computer system or may be autonomously controlled by the motorized transport units themselves. Various embodiments provide one or more user interfaces to allow various users to interact with the system including the automated mobile devices and/or to directly interact with the automated mobile devices. In some embodiments, the automated mobile devices and the corresponding system serve to enhance a customer shopping experience in the shopping facility, e.g., by assisting shoppers and/or workers at the facility.
[0017] In some embodiments, a shopping facility personal assistance system comprises: a plurality of motorized transport units located in and configured to move through a shopping facility space; a plurality of user interface units, each corresponding to a respective motorized transport unit during use of the respective motorized transport unit; and a central computer system having a network interface such that the central computer system wirelessly communicates with one or both of the plurality of motorized transport units and the plurality of user interface units, wherein the central computer system is configured to control movement of the plurality of motorized transport units through the shopping facility space based at least on inputs from the plurality of user interface units.
[0018] SYSTEM OVERVIEW
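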
[0019] Referring now to the drawings, FIG. 1 illustrates embodiments of a shopping facility assistance system 100 that can serve to carry out at least some of the teachings set forth herein. It will be understood that the details of this example are intended to serve in an illustrative capacity and are not necessarily intended to suggest any limitations as regards the present teachings. It is noted that generally, FIGS. 1-5 describe the general functionality of several embodiments of a system, and FIGS. 6-9 expand on some functionalities of some embodiments of the system and/or embodiments independent of such systems.
[0020] In the example of FIG. 1, a shopping assistance system 100 is implemented in whole or in part at a shopping facility 101. Generally, the system 100 includes one or more motorized transport units (MTUs) 102; one or more item containers 104; a central computer system 106 having at least one control circuit 108, at least one memory 110 and at least one network interface 112; at least one user interface unit 114; a location determination system 116; at least one video camera 118; at least one motorized transport unit (MTU) dispenser 120; at least one motorized transport unit (MTU) docking station 122; at least one wireless network 124; at least one database 126; at least one user interface computer device 128; an item display module 130; and a locker or an item storage unit 132. It is understood that more or fewer of such components may be included in different embodiments of the system 100.
[0021] These motorized transport units 102 are located in the shopping facility 101 and are configured to move throughout the shopping facility space. Further details regarding such motorized transport units 102 appear further below. Generally speaking, these motorized transport units 102 are configured to either comprise, or to selectively couple to, a corresponding movable item container 104. A simple example of an item container 104 would be a shopping cart as one typically finds at many retail facilities, or a rocket cart, a flatbed cart or any other mobile basket or platform that may be used to gather items for potential purchase.
[0022] In some embodiments, these motorized transport units 102 wirelessly communicate with, and are wholly or largely controlled by, the central computer system 106. In particular, in some embodiments, the central computer system 106 is configured to control movement of the motorized transport units 102 through the shopping facility space based on a variety of inputs. For example, the central computer system 106 communicates with each motorized transport unit 102 via the wireless network 124 which may be one or more wireless networks of one or more wireless network types (such as, a wireless local area network, a wireless personal area network, a wireless mesh network, a wireless star network, a wireless wide area network, a cellular network, and so on), capable of providing wireless coverage of the desired range of the motorized transport units 102 according to any known wireless protocols, including but not limited to a cellular, Wi-Fi, Zigbee or Bluetooth network.
[0023] By one approach the central computer system 106 is a computer-based device and includes at least one control circuit 108, at least one memory 110 and at least one wired and/or wireless network interface 112. Such a control circuit 108 can comprise a fixed-purpose hard-wired platform or can comprise a partially or wholly programmable platform, such as a microcontroller, an application-specific integrated circuit, a field programmable gate array, and so on. These architectural options are well known and understood in the art and require no further description here. This control circuit 108 is configured (for example, by using corresponding programming stored in the memory 110 as will be well understood by those skilled in the art) to carry out one or more of the steps, actions, and/or functions described herein.
[0024] In this illustrative example the control circuit 108 operably couples to one or more memories 110. The memory 110 may be integral to the control circuit 108 or can be physically discrete (in whole or in part) from the control circuit 108 as desired. This memory 110 can also be local with respect to the control circuit 108 (where, for example, both share a common circuit board, chassis, power supply, and/or housing) or can be partially or wholly remote with respect to the control circuit 108 (where, for example, the memory 110 is physically located in another facility, metropolitan area, or even country as compared to the control circuit 108).
[0025] This memory 110 can serve, for example, to non-transitorily store the computer instructions that, when executed by the control circuit 108, cause the control circuit 108 to behave as described herein. (As used herein, this reference to “non-transitorily” will be understood to refer to a non-ephemeral state for the stored contents (and hence excludes when the stored contents merely constitute signals or waves) rather than volatility of the storage media itself and hence includes both non-volatile memory (such as read-only memory (ROM)) as well as volatile memory (such as an erasable programmable read-only memory (EPROM)).)
[0026] Additionally, at least one database 126 may be accessible by the central computer system 106. Such databases may be integrated into the central computer system 106 or separate from it. Such databases may be at the location of the shopping facility 101 or remote from the shopping facility 101. Regardless of location, the databases comprise memory to store and organize certain data for use by the central computer system 106. In some embodiments, the at least one database 126 may store data pertaining to one or more of: shopping facility mapping data, customer data, customer shopping data and patterns, inventory data, product pricing data, and so on.
[0027] In this illustrative example, the central computer system 106 also wirelessly communicates with a plurality of user interface units 114. These teachings will accommodate a variety of user interface units including, but not limited to, mobile and/or handheld electronic devices such as so-called smart phones and portable computers such as tablet/pad-styled computers. Generally speaking, these user interface units 114 should be able to wirelessly communicate with the central computer system 106 via a wireless network, such as the wireless network 124 of the shopping facility 101 (such as a Wi-Fi wireless network). These user interface units 114 generally provide a user interface for interaction with the system. In some embodiments, a given motorized transport unit 102 is paired with, associated with, assigned to or otherwise made to correspond with a given user interface unit 114. In some embodiments, these user interface units 114 should also be able to receive verbally-expressed input from a user and forward that content to the central computer system 106 or a motorized transport unit 102 and/or convert that verbally-expressed input into a form useful to the central computer system 106 or a motorized transport unit 102.
[0028] By one approach at least some of the user interface units 114 belong to corresponding customers who have come to the shopping facility 101 to shop. By another approach, in lieu of the foregoing or in combination therewith, at least some of the user interface units 114 belong to the shopping facility 101 and are loaned to individual customers to employ as described herein. In some embodiments, one or more user interface units 114 are attachable to a given movable item container 104 or are integrated with the movable item container 104. Similarly, in some embodiments, one or more user interface units 114 may be those of shopping facility workers, belong to the shopping facility 101 and are loaned to the workers, or a combination thereof.
[0029] In some embodiments, the user interface units 114 may be general purpose computer devices that include computer programming code to allow them to interact with the central computer system 106. For example, such programming may be in the form of an application installed on the user interface unit 114 or in the form of a browser that displays a user interface provided by the central computer system 106 or other remote computer or server (such as a web server). In some embodiments, one or more user interface units 114 may be special purpose devices that are programmed to primarily function as a user interface for the system 100. Depending on the functionality and use case, user interface units 114 may be operated by customers of the shopping facility or may be operated by workers at the shopping facility, such as facility employees (associates or colleagues), vendors, suppliers, contractors, etc.
[0030] By one approach, the system 100 optionally includes one or more video cameras 118. Captured video imagery from such a video camera 118 can be provided to the central computer system 106. That information can then serve, for example, to help the central computer system 106 determine a present location of one or more of the motorized transport units 102 and/or determine issues or concerns regarding automated movement of those motorized transport units 102 in the shopping facility space. As one simple example in these regards, such video information can permit the central computer system 106, at least in part, to detect an object in a path of movement of a particular one of the motorized transport units 102.
[0031] By one approach these video cameras 118 comprise existing surveillance equipment employed at the shopping facility 101 to serve, for example, various security purposes. By another approach these video cameras 118 are dedicated to providing video content to the central computer system 106 to facilitate the latter’s control of the motorized transport units 102. If desired, the video cameras 118 can have a selectively movable field of view and/or zoom capability that the central computer system 106 controls as appropriate to help ensure receipt of useful information at any given moment.
[0032] In some embodiments, a location detection system 116 is provided at the shopping facility 101. The location detection system 116 provides input to the central computer system 106 useful to help determine the location of one or more of the motorized transport units 102. In some embodiments, the location detection system 116 includes a series of light sources (e.g., LEDs (light-emitting diodes)) that are mounted in the ceiling at known positions throughout the space and that each encode data in the emitted light that identifies the source of the light (and thus, the location of the light). As a given motorized transport unit 102 moves through the space, light sensors (or light receivers) at the motorized transport unit 102, on the movable item container 104 and/or at the user interface unit 114 receive the light and can decode the data. This data is sent back to the central computer system 106 which can determine the position of the motorized transport unit 102 by the data of the light it receives, since it can relate the light data to a mapping of the light sources to locations at the facility 101. Generally, such lighting systems are known and commercially available, e.g., the ByteLight system from ByteLight of Boston, Massachusetts. In embodiments using a ByteLight system, a typical display screen of the typical smart phone device can be used as a light sensor or light receiver to receive and process data encoded into the light from the ByteLight light sources.
[0033] In other embodiments, the location detection system 116 includes a series of low energy radio beacons (e.g., Bluetooth low energy beacons) at known positions throughout the space and that each encode data in the emitted radio signal that identifies the beacon (and thus, the location of the beacon). As a given motorized transport unit 102 moves through the space, low energy receivers at the motorized transport unit 102, on the movable item container 104 and/or at the user interface unit 114 receive the radio signal and can decode the data. This data is sent back to the central computer system 106 which can determine the position of the motorized transport unit 102 by the location encoded in the radio signal it receives, since it can relate the location data to a mapping of the low energy radio beacons to locations at the facility 101. Generally, such low energy radio systems are known and commercially available. In embodiments using a Bluetooth low energy radio system, a typical Bluetooth radio of a typical smart phone device can be used as a receiver to receive and process data encoded into the Bluetooth low energy radio signals from the Bluetooth low energy beacons.
[0034] In still other embodiments, the location detection system 116 includes a series of audio beacons at known positions throughout the space and that each encodes data in the emitted audio signal that identifies the beacon (and thus, the location of the beacon). As a given motorized transport unit 102 moves through the space, microphones at the motorized transport unit 102, on the movable item container 104 and/or at the user interface unit 114 receive the audio signal and can decode the data. This data is sent back to the central computer system 106 which can determine the position of the motorized transport unit 102 by the location encoded in the audio signal it receives, since it can relate the location data to a mapping of the audio beacons to locations at the facility 101. Generally, such audio beacon systems are known and commercially available. In embodiments using an audio beacon system, a typical microphone of a typical smart phone device can be used as a receiver to receive and process data encoded into the audio signals from the audio beacon.
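For any of the emitter variants above (LED light sources, Bluetooth low energy beacons, or audio beacons), the central computer system relates the decoded identifiers to a stored mapping of emitters to facility locations. The short sketch below illustrates that lookup; the mapping format, identifier strings, and the centroid estimate used when several emitters are decoded are assumptions for illustration only.

```python
# Minimal sketch, assuming a stored mapping from emitter identifiers to known
# installation coordinates; entries and function names are hypothetical.
BEACON_LOCATIONS = {
    "light-0041": (12.5, 3.0),   # hypothetical emitter near aisle 4
    "ble-0107":   (27.0, 9.5),
}


def position_from_received_ids(received_ids, beacon_locations=BEACON_LOCATIONS):
    """Estimate a transport unit's position from the emitter IDs its sensors decoded.

    With one decoded ID the unit is assumed near that emitter; with several,
    a simple centroid of the known emitter positions is used.
    """
    points = [beacon_locations[i] for i in received_ids if i in beacon_locations]
    if not points:
        return None
    xs, ys = zip(*points)
    return (sum(xs) / len(points), sum(ys) / len(points))
```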
[0035] Also optionally, the central computer system 106 can operably couple to one or more user interface computers 128 (comprising, for example, a display and a user input interface such as a keyboard, touch screen, and/or cursor-movement device). Such a user interface computer 128 can permit, for example, a worker (e.g., an associate, analyst, etc.) at the retail or shopping facility 101 to monitor the operations of the central computer system 106 and/or to attend to any of a variety of administrative, configuration or evaluation tasks as may correspond to the programming and operation of the central computer system 106. Such user interface computers 128 may be at or remote from the location of the facility 101 and may access one or more of the databases 126.
[0036] In some embodiments, the system 100 includes at least one motorized transport unit (MTU) storage unit or dispenser 120 at various locations in the shopping facility 101. The dispenser 120 provides for storage of motorized transport units 102 that are ready to be assigned to customers and/or workers. In some embodiments, the dispenser 120 takes the form of a cylinder within which motorized transports units 102 are stacked and released through the bottom of the dispenser 120. Further details of such embodiments are provided further below. In some embodiments, the dispenser 120 may be fixed in location or may be mobile and capable of transporting itself to a given location or utilizing a motorized transport unit 102 to transport the dispenser 120, then dispense one or more motorized transport units 102.
[0037] In some embodiments, the system 100 includes at least one motorized transport unit (MTU) docking station 122. These docking stations 122 provide locations where motorized transport units 102 can travel and connect to. For example, the motorized transport units 102 may be stored and charged at the docking station 122 for later use, and/or may be serviced at the docking station 122.
[0038] In accordance with some embodiments, a given motorized transport unit 102 detachably connects to a movable item container 104 and is configured to move the movable item container 104 through the shopping facility space under control of the central computer system 106 and/or the user interface unit 114. For example, a motorized transport unit 102 can move to a position underneath a movable item container 104 (such as a shopping cart, a rocket cart, a flatbed cart, or any other mobile basket or platform), align itself with the movable item container 104 (e.g., using sensors) and then raise itself to engage an undersurface of the movable item container 104 and lift a portion of the movable item container 104. Once the motorized transport unit is cooperating with the movable item container 104 (e.g., lifting a portion of the movable item container), the motorized transport unit 102 can continue to move throughout the facility space 101 taking the movable item container 104 with it. In some examples, the motorized transport unit 102 takes the form of the motorized transport unit 202 of FIGS. 2A-3B as it engages and detachably connects to a given movable item container 104. It is understood that in other embodiments, the motorized transport unit 102 may not lift a portion of the movable item container 104, but that it removably latches to, connects to or otherwise attaches to a portion of the movable item container 104 such that the movable item container 104 can be moved by the motorized transport unit 102. For example, the motorized transport unit 102 can connect to a given movable item container using a hook, a mating connector, a magnet, and so on.
[0039] In addition to detachably coupling to movable item containers 104 (such as shopping carts), in some embodiments, motorized transport units 102 can move to and engage or connect to an item display module 130 and/or an item storage unit or locker 132. For example, an item display module 130 may take the form of a mobile display rack or shelving unit configured to house and display certain items for sale. It may be desired to position the display module 130 at various locations within the shopping facility 101 at various times. Thus, one or more motorized transport units 102 may move (as controlled by the central computer system 106) underneath the item display module 130, extend upward to lift the module 130 and then move it to the desired location. A storage locker 132 may be a storage device where items for purchase are collected and placed therein for a customer and/or worker to later retrieve. In some embodiments, one or more motorized transport units 102 may be used to move the storage locker to a desired location in the shopping facility 101. Similar to how a motorized transport unit engages a movable item container 104 or item display module 130, one or more motorized transport units 102 may move (as controlled by the central computer system 106) underneath the storage locker 132, extend upward to lift the locker 132 and then move it to the desired location.
[0040] FIGS. 2A and 2B illustrate some embodiments of a motorized transport unit 202, similar to the motorized transport unit 102 shown in the system of FIG. 1. In this embodiment, the motorized transport unit 202 takes the form of a disc-shaped robotic device having motorized wheels (not shown), a lower body portion 204 and an upper body portion 206 that fits over at least part of the lower body portion 204. It is noted that in other embodiments, the motorized transport unit may have other shapes and/or configurations, and is not limited to disc-shaped. For example, the motorized transport unit may be cubic, octagonal, triangular, or other shapes, and may be dependent on a movable item container with which the motorized transport unit is intended to cooperate. Also included are guide members 208. In FIG. 2A, the motorized transport unit 202 is shown in a retracted position in which the upper body portion 206 fits over the lower body portion 204 such that the motorized transport unit 202 is in its lowest profile orientation which is generally the preferred orientation for movement when it is unattached to a movable item container 104 for example. In FIG. 2B, the motorized transport unit 202 is shown in an extended position in which the upper body portion 206 is moved upward relative to the lower body portion 204 such that the motorized transport unit 202 is in its highest profile orientation for movement when it is lifting and attaching to a movable item container 104 for example. The mechanism within the motorized transport unit 202 is designed to provide sufficient lifting force to lift the weight of the upper body portion 206 and other objects to be lifted by the motorized transport unit 202, such as movable item containers 104 and items placed within the movable item container, item display modules 130 and items supported by the item display module, and storage lockers 132 and items placed within the storage locker. The guide members 208 are embodied as pegs or shafts that extend horizontally from both the upper body portion 206 and the lower body portion 204. In some embodiments, these guide members 208 assist in docking the motorized transport unit 202 to a docking station 122 or a dispenser 120. In some embodiments, the lower body portion 204 and the upper body portion 206 are capable of moving independently of each other. For example, the upper body portion 206 may be raised and/or rotated relative to the lower body portion 204. That is, one or both of the upper body portion 206 and the lower body portion 204 may move toward/away from the other or rotate relative to the other. In some embodiments, in order to raise the upper body portion 206 relative to the lower body portion 204, the motorized transport unit 202 includes an internal lifting system (e.g., including one or more electric actuators or rotary drives or motors). Numerous examples of such motorized lifting and rotating systems are known in the art. Accordingly, further elaboration in these regards is not provided here for the sake of brevity.
[0041] FIGS. 3A and 3B illustrate some embodiments of the motorized transport unit 202 detachably engaging a movable item container embodied as a shopping cart 302. In FIG. 3A, the motorized transport unit 202 is in the orientation of FIG. 2A such that it is retracted and able to move in position underneath a portion of the shopping cart 302. Once the motorized transport unit 202 is in position (e.g., using sensors), as illustrated in FIG. 3B, the motorized transport unit 202 is moved to the extended position of FIG. 2B such that the front portion 304 of the shopping cart is lifted off of the ground by the motorized transport unit 202, with the wheels 306 at the rear of the shopping cart 302 remaining on the ground. In this orientation, the motorized transport unit 202 is able to move the shopping cart 302 throughout the shopping facility. It is noted that in these embodiments, the motorized transport unit 202 does not bear the weight of the entire cart 302 since the rear wheels 306 rest on the floor. It is understood that in some embodiments, the motorized transport unit 202 may be configured to detachably engage other types of movable item containers, such as rocket carts, flatbed carts or other mobile baskets or platforms.
[0042] FIG. 4 presents a more detailed example of some embodiments of the motorized transport unit 102 of FIG. 1. In this example, the motorized transport unit 102 has a housing 402 that contains (partially or fully) or at least supports and carries a number of components. These components include a control unit 404 comprising a control circuit 406 that, like the control circuit 108 of the central computer system 106, controls the general operations of the motorized transport unit 102. Accordingly, the control unit 404 also includes a memory 408 coupled to the control circuit 406 and that stores, for example, operating instructions and/or useful data.
[0043] The control circuit 406 operably couples to a motorized wheel system 410. This motorized wheel system 410 functions as a locomotion system to permit the motorized transport unit 102 to move within the aforementioned retail or shopping facility 101 (thus, the motorized wheel system 410 may more generically be referred to as a locomotion system). Generally speaking, this motorized wheel system 410 will include at least one drive wheel (i.e., a wheel that rotates (around a horizontal axis) under power to thereby cause the motorized transport unit 102 to move through interaction with, for example, the floor of the shopping facility 101). The motorized wheel system 410 can include any number of rotating wheels and/or other floor-contacting mechanisms as may be desired and/or appropriate to the application setting.
[0044] The motorized wheel system 410 also includes a steering mechanism of choice. One simple example in these regards comprises one or more of the aforementioned wheels that can swivel about a vertical axis to thereby cause the moving motorized transport unit 102 to turn as well.
[0045] Numerous examples of motorized wheel systems are known in the art. Accordingly, further elaboration in these regards is not provided here for the sake of brevity save to note that the aforementioned control circuit 406 is configured to control the various operating states of the motorized wheel system 410 to thereby control when and how the motorized wheel system 410 operates.
[0046] In this illustrative example, the control circuit 406 also operably couples to at least one wireless transceiver 412 that operates according to any known wireless protocol.
This wireless transceiver 412 can comprise, for example, a Wi-Fi-compatible and/or Bluetooth-compatible transceiver that can communicate with the aforementioned central computer system 106 via the aforementioned wireless network 124 of the shopping facility 101. So configured, the control circuit 406 of the motorized transport unit 102 can provide information to the central computer system 106 and can receive information and/or instructions from the central computer system 106. As one simple example in these regards, the control circuit 406 can receive instructions from the central computer system 106 regarding movement of the motorized transport unit 102.
[0047] These teachings will accommodate using any of a wide variety of wireless technologies as desired and/or as may be appropriate in a given application setting. These teachings will also accommodate employing two or more different wireless transceivers 412 if desired.
[0048] The control circuit 406 also couples to one or more on-board sensors 414. These teachings will accommodate a wide variety of sensor technologies and form factors.
By one approach at least one such sensor 414 can comprise a light sensor or light receiver. When the aforementioned location detection system 116 comprises a plurality of light emitters disposed at particular locations within the shopping facility 101, such a light sensor can provide information that the control circuit 406 and/or the central computer system 106 employs to determine a present location and/or orientation of the motorized transport unit 102.
[0049] As another example, such a sensor 414 can comprise a distance measurement unit configured to detect a distance between the motorized transport unit 102 and one or more objects or surfaces around the motorized transport unit 102 (such as an object that lies in a projected path of movement for the motorized transport unit 102 through the shopping facility 101). These teachings will accommodate any of a variety of distance measurement units including optical units and sound/ultrasound units. In one example, a sensor 414 comprises a laser distance sensor device capable of determining a distance to objects in proximity to the sensor. In some embodiments, a sensor 414 comprises an optical based scanning device to sense and read optical patterns in proximity to the sensor, such as bar codes variously located on structures in the shopping facility 101. In some embodiments, a sensor 414 comprises a radio frequency identification (RFID) tag reader capable of reading RFID tags in proximity to the sensor. Such sensors may be useful to determine proximity to nearby objects, avoid collisions, orient the motorized transport unit at a proper alignment orientation to engage a movable item container, and so on.
[0050] The foregoing examples are intended to be illustrative and are not intended to convey an exhaustive listing of all possible sensors. Instead, it will be understood that these teachings will accommodate sensing any of a wide variety of circumstances or phenomena to support the operating functionality of the motorized transport unit 102 in a given application setting.
[0051] By one optional approach an audio input 416 (such as a microphone) and/or an audio output 418 (such as a speaker) can also operably couple to the control circuit 406. So configured, the control circuit 406 can provide a variety of audible sounds to thereby communicate with a user of the motorized transport unit 102, other persons in the vicinity of the motorized transport unit 102, or even other motorized transport units 102 in the area. These audible sounds can include any of a variety of tones and other non-verbal sounds.
These audible sounds can also include, in lieu of the foregoing or in combination therewith, pre-recorded or synthesized speech.
[0052] The audio input 416, in turn, provides a mechanism whereby, for example, a user provides verbal input to the control circuit 406. That verbal input can comprise, for example, instructions, inquiries, or information. So configured, a user can provide, for example, a question to the motorized transport unit 102 (such as, “Where are the towels?”). The control circuit 406 can cause that verbalized question to be transmitted to the central computer system 106 via the motorized transport unit’s wireless transceiver 412. The central computer system 106 can process that verbal input to recognize the speech content and to then determine an appropriate response. That response might comprise, for example, transmitting back to the motorized transport unit 102 specific instructions regarding how to move the motorized transport unit 102 (via the aforementioned motorized wheel system 410) to the location in the shopping facility 101 where the towels are displayed.
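A minimal sketch of the question-and-response flow is given below, assuming that speech recognition has already been performed upstream at the central computer system. The product location table and function names are hypothetical illustrations, not parts of the disclosed system.

```python
# Hedged sketch; the product location table and the response format are assumptions.
PRODUCT_LOCATIONS = {"towels": (14.0, 6.5)}  # hypothetical mapping to facility coordinates


def respond_to_verbal_query(recognized_text: str) -> dict:
    """Convert a recognized spoken question into movement instructions for the transport unit."""
    text = recognized_text.lower()
    for product, location in PRODUCT_LOCATIONS.items():
        if product in text:
            return {"instruction": "move_to", "destination": location}
    return {"instruction": "ask_clarification"}
```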
[0053] In this example the motorized transport unit 102 includes a rechargeable power source 420 such as one or more batteries. The power provided by the rechargeable power source 420 can be made available to whichever components of the motorized transport unit 102 require electrical energy. By one approach the motorized transport unit 102 includes a plug or other electrically conductive interface that the control circuit 406 can utilize to automatically connect to an external source of electrical energy to thereby recharge the rechargeable power source 420.
[0054] By one approach the motorized transport unit 102 comprises an integral part of a movable item container 104 such as a grocery cart. As used herein, this reference to “integral” will be understood to refer to a non-temporary combination and joinder that is sufficiently complete so as to consider the combined elements to be as one. Such a joinder can be facilitated in a number of ways including by securing the motorized transport unit housing 402 to the item container using bolts or other threaded fasteners as versus, for example, a clip.
[0055] These teachings will also accommodate selectively and temporarily attaching the motorized transport unit 102 to an item container 104. In such a case the motorized transport unit 102 can include a movable item container coupling structure 422. By one approach this movable item container coupling structure 422 operably couples to the control circuit 406 to thereby permit the latter to control, for example, the latched and unlatched states of the movable item container coupling structure 422. So configured, by one approach the control circuit 406 can automatically and selectively move the motorized transport unit 102 (via the motorized wheel system 410) towards a particular item container until the movable item container coupling structure 422 can engage the item container to thereby temporarily physically couple the motorized transport unit 102 to the item container. So latched, the motorized transport unit 102 can then cause the item container to move with the motorized transport unit 102. In embodiments such as illustrated in FIGS. 2A-3B, the movable item container coupling structure 422 includes a lifting system (e.g., including an electric drive or motor) to cause a portion of the body or housing 402 to engage and lift a portion of the item container off of the ground such that the motorized transport unit 102 can carry a portion of the item container. In other embodiments, the motorized transport unit latches to a portion of the movable item container without lifting a portion thereof off of the ground.
[0056] In either case, by combining the motorized transport unit 102 with an item container, and by controlling movement of the motorized transport unit 102 via the aforementioned central computer system 106, these teachings will facilitate a wide variety of useful ways to assist both customers and associates in a shopping facility setting. For example, the motorized transport unit 102 can be configured to follow a particular customer as they shop within the shopping facility 101. The customer can then place items they intend to purchase into the item container that is associated with the motorized transport unit 102.
[0057] In some embodiments, the motorized transport unit 102 includes an input/output (I/O) device 424 that is coupled to the control circuit 406. The I/O device 424 allows an external device to couple to the control unit 404. The function and purpose of connecting devices will depend on the application. In some examples, devices connecting to the I/O device 424 may add functionality to the control unit 404, allow the exporting of data from the control unit 404, allow the diagnosing of the motorized transport unit 102, and so on.
[0058] In some embodiments, the motorized transport unit 102 includes a user interface 426 including for example, user inputs and/or user outputs or displays depending on the intended interaction with the user. For example, user inputs could include any input device such as buttons, knobs, switches, touch sensitive surfaces or display screens, and so on. Example user outputs include lights, display screens, and so on. The user interface 426 may work together with or separate from any user interface implemented at a user interface unit 114 (such as a smart phone or tablet device).
[0059] The control unit 404 includes a memory 408 coupled to the control circuit 406 and that stores, for example, operating instructions and/or useful data. The control circuit 406 can comprise a fixed-purpose hard-wired platform or can comprise a partially or wholly programmable platform. These architectural options are well known and understood in the art and require no further description here. This control circuit 406 is configured (for example, by using corresponding programming stored in the memory 408 as will be well understood by those skilled in the art) to carry out one or more of the steps, actions, and/or functions described herein. The memory 408 may be integral to the control circuit 406 or can be physically discrete (in whole or in part) from the control circuit 406 as desired. This memory 408 can also be local with respect to the control circuit 406 (where, for example, both share a common circuit board, chassis, power supply, and/or housing) or can be partially or wholly remote with respect to the control circuit 406. This memory 408 can serve, for example, to non-transitorily store the computer instructions that, when executed by the control circuit 406, cause the control circuit 406 to behave as described herein. (As used herein, this reference to “non-transitorily” will be understood to refer to a non-ephemeral state for the stored contents (and hence excludes when the stored contents merely constitute signals or waves) rather than volatility of the storage media itself and hence includes both non-volatile memory (such as read-only memory (ROM)) as well as volatile memory (such as an erasable programmable read-only memory (EPROM)).)
[0060] It is noted that not all components illustrated in FIG. 4 are included in all embodiments of the motorized transport unit 102. That is, some components may be optional depending on the implementation.
[0061] FIG. 5 illustrates a functional block diagram that may generally represent any number of various electronic components of the system 100 that are computer type devices. The computer device 500 includes a control circuit 502, a memory 504, a user interface 506 and an input/output (I/O) interface 508 providing any type of wired and/or wireless connectivity to the computer device 500, all coupled to a communication bus 510 to allow data and signaling to pass therebetween. Generally, the control circuit 502 and the memory 504 may be referred to as a control unit. The control circuit 502, the memory 504, the user interface 506 and the I/O interface 508 may be any of the devices described herein or as understood in the art. The functionality of the computer device 500 will depend on the programming stored in the memory 504. The computer device 500 may represent a high level diagram for one or more of the central computer system 106, the motorized transport unit 102, the user interface unit 114, the location detection system 116, the user interface computer 128, the MTU docking station 122 and the MTU dispenser 120, or any other device or component in the system that is implemented as a computer device.
[0062] ADDITIONAL FEATURES OVERVIEW
[0063] Referring generally to FIGS. 1-5, the shopping assistance system 100 may implement one or more of several different features depending on the configuration of the system and its components. The following provides a brief description of several additional features that could be implemented by the system. One or more of these features could also be implemented in other systems separate from embodiments of the system. This is not meant to be an exhaustive description of all features and not meant to be an exhaustive description of the details any one of the features. Further details with regards to one or more features beyond this overview may be provided herein.
[0064] Tagalong Steering: This feature allows a given motorized transport unit 102 to lead or follow a user (e.g., a customer and/or a worker) throughout the shopping facility 101.
For example, the central computer system 106 uses the location detection system 116 to determine the location of the motorized transport unit 102. For example, LED smart lights (e.g., the ByteLight system) of the location detection system 116 transmit a location number to smart devices which are with the customer (e.g., user interface units 114), and/or on the item container 104/motorized transport unit 102. The central computer system 106 receives the LED location numbers received by the smart devices through the wireless network 124. Using this information, in some embodiments, the central computer system 106 uses a grid placed upon a 2D CAD map and 3D point cloud model (e.g., from the databases 126) to direct, track, and plot paths for the other devices. Using the grid, the motorized transport unit 102 can drive a movable item container 104 in a straight path rather than zigzagging around the facility. As the user moves from one grid to another, the motorized transport unit 102 drives the container 104 from one grid to the other. In some embodiments, as the user moves towards the motorized transport unit, it stays still until the customer moves beyond an adjoining grid.
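The following sketch illustrates one possible tagalong rule under the grid model just described: the unit stays put while the user remains within an adjoining grid cell and otherwise steps one cell toward the user. The cell size and adjacency rule are assumptions made for the sketch, not the patented behaviour.

```python
# Illustrative tagalong logic; cell size and adjacency rule are hypothetical.
CELL_SIZE = 2.0  # metres per grid cell (assumed)


def to_cell(position):
    return (int(position[0] // CELL_SIZE), int(position[1] // CELL_SIZE))


def tagalong_step(unit_position, user_position):
    """Return the grid cell the transport unit should move to, or None to stay put."""
    unit_cell, user_cell = to_cell(unit_position), to_cell(user_position)
    dx, dy = user_cell[0] - unit_cell[0], user_cell[1] - unit_cell[1]
    if max(abs(dx), abs(dy)) <= 1:   # user still within the same or an adjoining cell
        return None
    # Step one cell toward the user in each axis that differs.
    return (unit_cell[0] + (dx > 0) - (dx < 0), unit_cell[1] + (dy > 0) - (dy < 0))
```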
[0065] Detecting Objects: In some embodiments, motorized transport units 102 detect objects through several sensors mounted on the motorized transport unit 102, through independent cameras (e.g., video cameras 118), through sensors of a corresponding movable item container 104, and through communications with the central computer system 106. In some embodiments, with semi-autonomous capabilities, the motorized transport unit 102 will attempt to avoid obstacles, and if unable to avoid, it will notify the central computer system 106 of an exception condition. In some embodiments, using sensors 414 (such as distance measurement units, e.g., laser or other optical-based distance measurement sensors), the motorized transport unit 102 detects obstacles in its path, and will move to avoid, or stop until the obstacle is clear.
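A simple sketch of this avoid-or-report behaviour is shown below. The distance thresholds and the notification callback are illustrative assumptions, not values or interfaces disclosed in the patent.

```python
# Hedged sketch; thresholds and the notification call are assumptions.
STOP_DISTANCE = 0.5    # metres at which the unit stops (hypothetical)
AVOID_DISTANCE = 1.5   # metres at which the unit starts planning around an obstacle


def obstacle_behaviour(distance_ahead, alternative_path_available, notify_central):
    """Decide how the transport unit reacts to an obstacle detected in its path."""
    if distance_ahead > AVOID_DISTANCE:
        return "continue"
    if alternative_path_available:
        return "reroute"
    if distance_ahead <= STOP_DISTANCE:
        notify_central("exception: obstacle blocking path")  # report to central computer system
        return "stop"
    return "slow_down"
```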
[0066] Visual Remote Steering: This feature enables movement and/or operation of a motorized transport unit 102 to be controlled by a user on-site, off-site, or anywhere in the world. This is due to the architecture of some embodiments where the central computer system 106 outputs the control signals to the motorized transport unit 102. These control signals could have originated at any device in communication with the central computer system 106. For example, the movement signals sent to the motorized transport unit 102 may be movement instructions determined by the central computer system 106; commands received at a user interface unit 114 from a user; and commands received at the central computer system 106 from a remote user not located at the shopping facility space.
[0067] Determining Location: Similar to that described above, this feature enables the central computer system 106 to determine the location of devices in the shopping facility 101. For example, the central computer system 106 maps received LED light transmissions, Bluetooth low energy radio signals or audio signals (or other received signals encoded with location data) to a two-dimensional map of the shopping facility. Objects within the area of the shopping facility are also mapped and associated with those transmissions. Using this information, the central computer system 106 can determine the location of devices such as motorized transport units.
[0068] Digital Physical Map Integration: In some embodiments, the system 100 is capable of integrating 2D and 3D maps of the shopping facility with physical locations of objects and workers. Once the central computer system 106 maps all objects to specific locations using algorithms, measurements and LED geo-location, for example, grids are applied which section off the maps into access ways and blocked sections. Motorized transport units 102 use these grids for navigation and recognition. In some cases, grids are applied to 2D horizontal maps along with 3D models. In some cases, grids start at a higher unit level and then can be broken down into smaller units of measure by the central computer system 106 when needed to provide more accuracy.
[0069] Calling a Motorized Transport Unit: This feature provides multiple methods to request and schedule a motorized transport unit 102 for assistance in the shopping facility. In some embodiments, users can request use of a motorized transport unit 102 through the user interface unit 114. The central computer system 106 can check to see if there is an available motorized transport unit. Once assigned to a given user, other users will not be able to control the already assigned transport unit. Workers, such as store associates, may also reserve multiple motorized transport units in order to accomplish a coordinated large job.
[0070] Locker Delivery: In some embodiments, one or more motorized transport units 102 may be used to pick, pack, and deliver items to a particular storage locker 132. The motorized transport units 102 can couple to and move the storage locker to a desired location. In some embodiments, once delivered, the requestor will be notified that the items are ready to be picked up, and will be provided the locker location and locker security code key.
[0071] Route Optimization: In some embodiments, the central computer system automatically generates a travel route for one or more motorized transport units through the shopping facility space. In some embodiments, this route is based on one or more of a user provided list of items entered by the user via a user interface unit 114; user selected route preferences entered by the user via the user interface unit 114; user profile data received from a user information database (e.g., from one of databases 126); and product availability information from a retail inventory database (e.g., from one of databases 126). In some cases, the route is intended to minimize the time it takes to get through the facility, and in some cases, may route the shopper to the least busy checkout area. Frequently, there will be multiple possible optimum routes. The route chosen may take the user by things the user is more likely to purchase (in case they forgot something), and away from things they are not likely to buy (to avoid embarrassment). That is, routing a customer who, based on past customer behavior, has never purchased sporting goods, women’s lingerie, baby food, or feminine products through those departments would be non-productive, and potentially embarrassing to the customer. In some cases, a route may be determined from multiple possible routes based on past shopping behavior, e.g., if the customer typically buys a cold Diet Coke product, children’s shoes or power tools, this information would be used to add weight to the best alternative routes, and determine the route accordingly.
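The route-weighting idea might be sketched as follows; the affinity table, weights, travel times, and department names are illustrative assumptions rather than the described method.

```python
# Hypothetical sketch of weighting candidate routes by past purchase
# behavior; department names, weights, and times are illustrative only.

PAST_PURCHASE_AFFINITY = {"soft drinks": 0.9, "children's shoes": 0.7,
                          "power tools": 0.6, "lingerie": 0.0}

def score_route(route, travel_minutes):
    """Lower score is better: travel time minus a bonus for departments the
    shopper is likely to buy from, plus a penalty for irrelevant ones."""
    bonus = sum(PAST_PURCHASE_AFFINITY.get(dept, 0.0) for dept in route)
    penalty = sum(1.0 for dept in route
                  if PAST_PURCHASE_AFFINITY.get(dept, 0.0) == 0.0)
    return travel_minutes - 2.0 * bonus + penalty

routes = {
    "route A": (["soft drinks", "power tools"], 18.0),
    "route B": (["lingerie", "soft drinks"], 16.0),
}
best = min(routes, key=lambda r: score_route(*routes[r]))
print(best)   # route A wins despite the slightly longer travel time
```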
[0072] Store Facing Features: In some embodiments, these features enable functions to support workers in performing store functions. For example, the system can assist workers to know what products and items are on the shelves and which ones need attention. For example, using 3D scanning and point cloud measurements, the central computer system can determine where products are supposed to be, enabling workers to be alerted to facing or zoning issues along with potential inventory issues.
[0073] Phone Home: This feature allows users in a shopping facility 101 to be able to contact remote users who are not at the shopping facility 101 and include them in the shopping experience. For example, the user interface unit 114 may allow the user to place a voice call, a video call, or send a text message. With video call capabilities, a remote person can virtually accompany an in-store shopper, visually sharing the shopping experience while seeing and talking with the shopper. One or more remote shoppers may join the experience.
[0074] Returns: In some embodiments, the central computer system 106 can task a motorized transport unit 102 to keep the returns area clear of returned merchandise. For example, the transport unit may be instructed to move a cart from the returns area to a different department or area. Such commands may be initiated from video analytics (the central computer system analyzing camera footage showing a cart full), from an associate command (digital or verbal), or on a schedule, as other priority tasks allow. The motorized transport unit 102 can first bring an empty cart to the returns area, prior to removing a full one.
[0075] Bring a Container: One or more motorized transport units can retrieve a movable item container 104 (such as a shopping cart) to use. For example, upon a customer or worker request, the motorized transport unit 102 can re-position one or more item containers 104 from one location to another. In some cases, the system instructs the motorized transport unit where to obtain an empty item container for use. For example, the system can recognize an empty and idle item container that has been abandoned or instruct that one be retrieved from a cart storage area. In some cases, the call to retrieve an item container may be initiated through a call button placed throughout the facility, or through the interface of a user interface unit 114.
[0076] Respond to Voice Commands: In some cases, control of a given motorized transport unit is implemented through the acceptance of voice commands. For example, the user may speak voice commands to the motorized transport unit 102 itself and/or to the user interface unit 114. In some embodiments, a voice print is used to authorize use of a motorized transport unit 102, allowing voice commands from a single user at a time.
[0077] Retrieve Abandoned Item Containers: This feature allows the central computer system to track movement of movable item containers in and around the area of the shopping facility 101, including both the sale floor areas and the back-room areas. For example, using visual recognition through store cameras 118 or through user interface units 114, the central computer system 106 can identify abandoned and out-of-place movable item containers. In some cases, each movable item container has a transmitter or smart device which sends a unique identifier and its position (using LED geo-location identification) to facilitate tracking and other tasks. Using LED geo-location identification with the Determining Location feature through smart devices on each cart, the central computer system 106 can determine the length of time a movable item container 104 is stationary.
[0078] Stocker Assistance: This feature allows the central computer system to track movement of merchandise flow into and around the back-room areas. For example, using visual recognition and captured images, the central computer system 106 can determine if carts are loaded or not for moving merchandise between the back room areas and the sale floor areas. Tasks or alerts may be sent to workers to assign tasks.
[0079] Self-Docking: Motorized transport units 102 will run low or out of power when used. Before this happens, the motorized transport units 102 need to recharge to stay in service. According to this feature, motorized transport units 102 will self-dock and recharge (e.g., at a MTU docking station 122) to stay at maximum efficiency, when not in use. When use is completed, the motorized transport unit 102 will return to a docking station 122. In some cases, if the power is running low during use, a replacement motorized transport unit can be assigned to move into position and replace the motorized transport unit with low power. The transition from one unit to the next can be seamless to the user.
[0080] Item Container Retrieval: With this feature, the central computer system 106 can cause multiple motorized transport units 102 to retrieve abandoned item containers from exterior areas such as parking lots. For example, multiple motorized transport units are loaded into a movable dispenser, e.g., the motorized transport units are vertically stacked in the dispenser. The dispenser is moved to the exterior area and the transport units are dispensed. Based on video analytics, it is determined which item containers 104 are abandoned and for how long. A transport unit will attach to an abandoned cart and return it to a storage bay.
[0081] Motorized Transport Unit Dispenser: This feature provides the movable dispenser that contains and moves a group of motorized transport units to a given area (e.g., an exterior area such as a parking lot) to be dispensed for use. For example, motorized transport units can be moved to the parking lot to retrieve abandoned item containers 104. In some cases, the interior of the dispenser includes helically wound guide rails that mate with the guide member 208 to allow the motorized transport units to be guided to a position to be dispensed.
[0082] Specialized Module Retrieval: This feature allows the system 100 to track movement of merchandise flow into and around the sales floor areas and the back-room areas including special modules that may be needed to move to the sales floor. For example, using video analytics, the system can determine if a modular unit is loaded or empty. Such modular units may house items that are of seasonal or temporary use on the sales floor. For example, when it is raining, it is useful to move a module unit displaying umbrellas from a back room area (or a lesser accessed area of the sales floor) to a desired area of the sales floor area.
[0083] Authentication: This feature uses a voice imprint with an attention code/word to authenticate a user to a given motorized transport unit. One motorized transport unit can be swapped for another using this authentication. For example, a token is used during the session with the user. The token is a unique identifier for the session which is dropped once the session is ended. A logical token may be a session id used by the application of the user interface unit 114, established when the user logs on and decides to use the system 100. In some embodiments, communications throughout the session are encrypted using SSL or other methods at the transport level.
[0084] FURTHER DETAILS OF SOME EMBODIMENTS
[0085] In accordance with some embodiments, further details are now provided for one or more of these and other features. A system and method for monitoring modular compliance is provided herein.
[0086] An MTU may be an intelligent robotic device capable of carrying out various tasks either alone, in conjunction with another MTU, and/or in concert with a remote or host control. Multiple MTUs can work as a team to accomplish tasks that are assigned. In some embodiments, tasks performed by MTUs include the ability to scan shelves for compliance. Having the ability to connect to a central computer system, MTUs may compare sections in the shopping space to what is expected to ensure modular compliance. In some embodiments, each MTU in a shopping space may have this capability to allow for continuous and real-time monitoring of shopping space conditions.
[0087] With a continuous product flow on and off of retailers’ shelves and limited manpower, it is difficult to ensure that shelves are properly set for maximizing sales at all times. Associates must periodically roam the area to look for situations that require attention. However, associates can’t always keep up with frequent modular and price changes made by the retailers and suppliers. Since MTUs may be continuously monitoring the shelves and have knowledge of the most recent layout and pricing information, MTUs may be used to detect when a section is out-of-tolerance with the expectation, and may either correct the issue, or may notify an associate in order to bring the module back to an acceptable level of compliance. In some locations, having an inaccurate price displayed on the shelf can cost the store fines and legal fees.
[0088] In some embodiments, an MTU may include specialized capabilities including visually recognizing merchandise, looking up the merchandise expected to be on the shelf, looking for zoning and re-stocking opportunities, and providing alerts and corrections when discrepancies exist. Additionally, since MTUs may be monitoring continuously, an MTU may uncover an opportunity for resetting a modular if it is frequently found “out of tolerance.” MTUs with visual and voice recognition may be utilized for this task. An MTU can be tasked to monitor a modular, aisle, or department. By watching sales and receiving, the MTU system may determine the optimum time to scan the shelf to provide the best data to ensure compliance. For example, peak selling timeframes or just after a load of merchandise has been received may not be the best time to scan the shelves. The MTU may instead scan the shelves after a selling rush when items are likely to be low in stock and/or misplaced. In some embodiments, MTUs may always be searching for “out of tolerance” situations as they perform other tasks to ensure that tasks are assigned to take corrective action as soon as possible.
[0089] The MTU modular compliance monitoring system allows for constant modular accuracy checks which can respond to frequent price changes and modular changes. The MTUs may also provide accurate module layouts (store-unique modules) based on up-to-date models stored on a server. The MTUs may then collect information on modular issues. For example, information may be gathered regarding frequently damaged or unusable products and pricing mistakes. Information gathered by MTUs may also be used to determine whether some items should be relocated and/or should be featured. The shelf images gathered by the MTUs can also be provided to buyers, manufacturers, distributors, etc.
[0090] When an MTU is assigned a shelf scanning task, the MTU may continuously monitor shelves to look for “out of compliance” situations. Using image recognition, and with the help of the central computer, an MTU may capture an image for the central system to detect whether the monitored section is in tolerance or out of tolerance. When an out of tolerance condition is detected, a store associate may be alerted to address the condition. For example, if a shelf pricing label is different than what is expected, corrective action may be required. The system may generate a task message based on the detected condition (e.g. “Inaccurate shelf label: aisle 13, green beans (item should be priced at $.87/lb)”).
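A minimal sketch of turning a detected label discrepancy into a task message of the kind quoted above follows; the function names, price tolerance, and message format are assumptions for illustration only.

```python
# Illustrative sketch of generating a corrective task message from a
# recognized label price that differs from the expected price.

def price_task_message(aisle, item_name, expected_price, unit="lb"):
    return (f"Inaccurate shelf label: aisle {aisle}, {item_name} "
            f"(item should be priced at ${expected_price:.2f}/{unit})")

def check_label(detected_price, expected_price, aisle, item_name):
    """Return a task message when the recognized label price differs from
    the expected price, otherwise None."""
    if abs(detected_price - expected_price) > 0.005:
        return price_task_message(aisle, item_name, expected_price)
    return None

print(check_label(0.97, 0.87, 13, "green beans"))
```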
[0091] In some embodiments, MTUs may be configured to multi-task. An MTU may continue to perform other tasks to which it is assigned while scanning shelves. In some embodiments, the MTU may capture images of shelves opposite to the side on which it is driving in order to capture a full image of the display module. When freed from other tasks, an MTU may search for situations that require attention and monitor everything in its path while traveling. The MTU system may keep track of the last time an image was captured of each module to make sure that every module has been scanned at least once within a predetermined period of time. The MTU system may determine how often to scan a section based on the amount of customer traffic that has passed through the section. In some embodiments, management may be alerted when an MTU has been idle for a prolonged period of time and task assignments may be modified to more fully utilize the MTUs.
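The scan-scheduling idea above (scan every module within a maximum interval, and re-check busy sections sooner) could be sketched as follows; the per-module records, weights, and interval are hypothetical assumptions, not the described scheduling logic.

```python
# Minimal sketch of picking which module to scan next based on time since
# the last scan and recent customer traffic (all values are hypothetical).

import time

modules = {
    "aisle-13-cereal":  {"last_scan": time.time() - 7200, "traffic": 45},
    "aisle-02-produce": {"last_scan": time.time() - 1800, "traffic": 120},
}

MAX_INTERVAL = 4 * 3600   # every module scanned at least once per 4 hours

def scan_priority(record, now):
    """Higher priority means the module should be scanned sooner."""
    age = now - record["last_scan"]
    overdue = max(0.0, age - MAX_INTERVAL)
    # Customer traffic shortens the effective interval: busier sections
    # are re-checked sooner.
    return age * (1 + record["traffic"] / 100.0) + 10 * overdue

now = time.time()
print(max(modules, key=lambda m: scan_priority(modules[m], now)))
```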
[0092] In some embodiments, images captured by the MTUs may form two-dimensional images and/or three-dimensional models. Images may be captured at the same precise location over time to analyze customer behavior. In some embodiments, each frame may be placed in motion from one image to the next to form a time-lapse 3D model to understand how customers shop. These images can later be studied for improving the shelving and replenishment process.
[0093] In some embodiments, at least some merchandise may include RFID tags. An MTU may read and interpret the RFID signals around it where appropriate. Items with a short shelf life may have an expiration date stamp (which may be in invisible ink). An MTU may further read the expiration date on items to determine if the items need to be marked down or thrown out. In some embodiments, an image of the shelf may be used to determine inventory accuracy and adjust inventory where appropriate. In some embodiments, if images reveal that a particular item unit has remained on the shelf in the same location for an unreasonable amount of time, it may be determined to be unsellable and an alert may be sent to an associate to pick it up or move it.
[0094] In some embodiments, an MTU may communicate with a central computer system to identify item locations and compliance attributes. An MTU may further be configured to provide inventory statistics when asked by a customer and/or an associate. An MTU may notify the central computer that it is available for task assignment. An MTU may further communicate with associates to determine the items to scan (e.g. “Do you mean the flashlight batteries or the car batteries?”). An MTU may also communicate with an associate to request assistance and provide information in bringing a module into compliance.
[0095] In some embodiments, a central computer system may identify items based on shelf labels captured by MTUs and determine an item’s expected inventory from the store’s inventory database. The central computer system may further visually recognize the price found on the label and compare the price to the expected price. The central computer system may also determine the frequency that shelves should be scanned based on customer activity and/or captured data. For example, the system may increase the scanning frequency for modules that are frequently found to be out of compliance.
[0096] FIG. 6 illustrates a block diagram of a modular compliance monitoring system 600 as configured in accordance with various embodiments of these teachings. The modular compliance monitoring system 600 includes a central computer system 630, a baseline condition database 610, and at least an image capture device 620 configured to capture images of sections of a shopping space such as a display module 650. The system is configured to output modular compliance status to be stored in the modular compliance status database 640. The modular compliance monitoring system 600 may include or may be implemented at least partially with one or more components shown in FIGS. 1, 4, and 5 or may be more generically implemented outside of the system described with reference to FIGS. 1, 4 and 5.
[0097] The central computer system 630 includes a control circuit 621 and a memory 622 and may be generally referred to as a processor-based device, a computing device, a server, and the like. In some embodiments, the central computer system 630 may be implemented with one or more of the central computer system 106 and/or the computer device 500 described above. For example, the functionalities of the central computer system 630 described herein may be implemented as one or more software and/or hardware modules in the central computer system 106.
[0098] The central computer system 630 has stored on its memory 622 a set of computer readable instructions that is executable by the control circuit 621 to cause the control circuit 621 to obtain images from the image capture device 620, compare the captured images with baseline models in the baseline condition database 610, and output modular compliance status to the modular compliance status database 640. In some embodiments, the image capture device 620 is implemented on an MTU and the central computer system 630 is further configured to instruct the MTU as it travels through a shopping space. In some embodiments, the central computer system 630 may further be configured to generate tasks for MTUs and/or store associates based on the modular compliance status information stored in the modular compliance status database 640. In some embodiments, the central computer system 630 may be located inside of and serve a specific shopping space. In some embodiments, the central computer system 630 may be at least partially implemented on a remote or cloud-based server that provides instructions to MTUs in one or more shopping spaces.
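A minimal sketch of the capture/compare/store data flow described above follows; the helper names, dictionary-based "databases", and placeholder comparison are assumptions chosen only to show the flow, not the described implementation.

```python
# Sketch of the obtain-images / compare-to-baseline / store-status loop.

def compare_to_baseline(image, baseline):
    """Placeholder comparison; a real system would use image analysis
    (see the later sketches). Returns a list of noncompliant conditions."""
    return []

def monitor_display_modules(captured, baseline_db, status_db):
    """captured: iterable of (module_id, image); *_db: dict-like stores."""
    for module_id, image in captured:
        baseline = baseline_db.get(module_id)
        if baseline is None:
            continue                      # no model defined for this section
        conditions = compare_to_baseline(image, baseline)
        status_db[module_id] = {
            "in_tolerance": not conditions,
            "conditions": conditions,
        }

# Example with stand-in data:
status = {}
monitor_display_modules([("aisle-13-cereal", object())],
                        {"aisle-13-cereal": object()}, status)
print(status)
```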
[0099] The baseline condition database 610 may be a non-transitory memory storage that stores one or more baseline condition models for one or more shopping spaces. The baseline condition database 610 may be coupled to the central computer system 630 through one or more of local, remote, cloud-based, wired, and wireless connections. In some embodiments, the baseline condition database 610 may be at least partially implemented on one or more of the memory 632, the database 126, the memory 110, the memory 408, and the memory 504 described herein. A baseline condition model stored in the baseline condition database 610 may include one or more of a 2D image of a baseline module, a 3D image of a baseline module, a computer generated image of a baseline module, and descriptive data of a baseline module. In some embodiments, the baseline models may be based on data captured from modules in the monitored shopping space right after they are properly set up. In some embodiments, the baseline models may be based on data captured from modules in a separate facility. For example, a retail entity may scan standard modules at one location and build baseline models that are used at multiple shopping spaces to ensure uniformity between store locations. In some embodiments, the baseline models may include descriptive data. The descriptive data may be extracted from the images of a baseline module, manually entered, and/or machine generated. A baseline model may include one or more of permitted items in the model, allotted space for each item, designated position for each item, item identifier (e.g. barcode, RFID), pricing information, signage information, etc. In some embodiments, a display module may be stocked by a party other than the operator of the shopping space. For example, employees of a manufacturer, a vendor, a distributor, and the like may be responsible for placing certain items on the shelves of a display module. The baseline condition model may be based on an agreement between the operator of the shopping space and the manufacturer, the vendor, the distributor, and the like.
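One possible, purely illustrative representation of a baseline condition model as a data structure is sketched below; the field names mirror the description above but are assumptions rather than the defined schema.

```python
# Illustrative data structure for a baseline condition model.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ItemFacing:
    item_id: str                 # e.g. barcode- or RFID-derived identifier
    shelf: int                   # shelf index within the module
    position: int                # left-to-right slot on the shelf
    allotted_width_cm: float
    expected_price: Optional[float] = None

@dataclass
class BaselineConditionModel:
    module_id: str
    image_paths: List[str] = field(default_factory=list)   # 2D/3D captures
    facings: List[ItemFacing] = field(default_factory=list)
    signage: List[str] = field(default_factory=list)

model = BaselineConditionModel(
    module_id="aisle-13-cereal",
    facings=[ItemFacing("012345678905", shelf=2, position=0,
                        allotted_width_cm=30.0, expected_price=3.48)],
)
print(len(model.facings))
```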
[00100] The image capture device 620 may include one or more optical sensors configured to capture images of display modules in a shopping space. In some embodiments, the image capture device 620 may include one or more of optical sensors, image sensors, the video cameras 118, and sensors on MTUs 102 described with reference to FIG. 1 above. In some embodiments, the image capture device 620 may be attached to an MTU that is configured to travel down aisles of a shopping space to capture images of display modules. In some embodiments, the image capture device 620 may include a set of stationary cameras for providing images of the shopping space to the central computer system 630. The central computer system 630 may analyze the images captured by the image capture device 620 to determine whether the display modules in the shopping space are in compliance with the baseline model.
[00101] The modular compliance status database 640 may be a non-transitory memory device. In some embodiments, the modular compliance status database 640 may be stored on one or more of the memory 632, the database 126, the memory 110, the memory 408, and the memory 504 described herein. The modular compliance status database 640 may be implemented with or separately from the baseline condition database 610. The modular compliance status database 640 may include one or more of a display module identifier, a noncompliant condition type indicator (e.g. item placement error, pricing error, signage error, etc.), a noncompliant item indicator (e.g. A-brand box cereal, B-brand toilet paper, etc.), and a noncompliant condition description. The information stored in the modular compliance status database 640 may be used by the central computer system 630 or another system to assign tasks to MTUs and/or store associates. For example, based on the noncompliant condition, an MTU or a store associate may be instructed to arrange the items based on the baseline model, replace a price tag, set up the correct signage, etc.
[00102] In some embodiments, the central computer system 630 may further determine a modular compliance score and store the score in the modular compliance status database 640. Each modular compliance score may be associated with a display module and/or item and may be based on the number and the degree of noncompliance detected. For example, an item that slightly exceeds its allotted display space may cause a small deduction in the compliance score. A missing item or a pricing mistake may cause a heavier deduction in the compliance score. Multiple compliance errors may cause further deductions to the compliance score. In some embodiments, the compliance score may be used by a system to determine whether a task needs to be assigned to address the noncompliance status of a module or an item. For example, a modular compliance task may be assigned only if the compliance score falls below a threshold value. In some embodiments, the compliance score may be used by the system to prioritize the assigning of the tasks to address the noncompliance condition of each module. For example, the tasks to address noncompliance conditions may be ordered based on the compliance score and when an available MTU and/or associate is identified, the task associated with the lowest compliance score is assigned first. The modular compliance status and score in the modular compliance status database 640 may be constantly updated as images are captured by the image capture device 620. In some embodiments, after a modular compliance task is completed, an MTU and/or associate may be instructed to capture an image of the assigned module and that image may be used to further update the compliance status of the module in the modular compliance status database 640. In some embodiments, if a display module is stocked and/or arranged by a third party (e.g. manufacturer, vendor, distributor, etc.), the third party may be notified of the noncompliance condition.
[00103] In some embodiments, the modular compliance status database 640 may store a history of compliance status for a plurality of display modules. The historical compliance data may be used to identify problematic modules that are frequently out of compliance. In some embodiments, the central computer system 630 may further be configured to generate task assignments based on historical compliance data. For example, the central computer system 630 may increase the image capture frequency of problematic modules. In another example, the central computer system 630 may identify one or more modules as needing relocation and/or redesign to reduce noncompliant conditions.
[00104] The display module 650 generally may be a structure for displaying items for purchase in a shopping space. In some embodiments, the display module 650 may include one or more of a fixture, shelf, a stand, a case, a refrigerated unit, etc. A display module 650 may refer to a section of the shopping space that may be part of a physical display structure or encompass two or more physical display structures. A display module 650 may include one or more display spaces such as shelves. Each display module may correspond to a baseline condition model stored in the baseline condition database. In some embodiments, a module may correspond to a section of a shopping area that should appear similarly across different store locations while the placement of modules relative to each other in a store may vary depending on the store’s space and dimension.
[00105] The central computer system 630 may further be communicatively coupled to a plurality of MTUs (not shown). The MTUs may be the MTU 102 described in FIG. 1, the MTU shown in FIGS. 2A-3B, and/or the MTU 402 described in FIG. 4. Generally, an MTU may be a motorized device having an image capture device 620 and configured to travel in a shopping space according to instructions received from a central computer system 630 while capturing images with the image capture device 620. In some embodiments, the central computer system 630 may instruct the MTUs to capture images of each monitored module at a set frequency. In some embodiments, the monitoring frequency for each module may be adjusted based on one or more of: time of day, how often customers visit the display module area, how often the display module has been found to be out of compliance in the past, etc. In some embodiments, the central computer system 630 is further configured to instruct MTUs assigned to other tasks to capture images of display modules in their path of travel.
For example, an MTU that is traveling to retrieve a shopping cart for a customer may capture one or more images of display modules on the way to and from the cart. In some embodiments, the central computer system 630 may selectively modify the route of an MTU assigned to other tasks in order to capture images of a module that needs a status update. For example, an MTU traveling to a cleaning task may be given an alternate route to travel in order to pass through a section that needs a modular compliance status update. In some embodiments, the central computer system 630 may assign a shelf scanning task to one or more MTUs and the MTUs travel around the shopping space for the specific purpose of capturing images of display modules. In some embodiments, the central computer system 630 may further be configured to assign tasks to the MTUs based on modular compliance status information stored in the modular compliance status database 640. In some embodiments, the MTUs may include other input and output devices such as a speaker, an audio input device, a visual status indicator, and the like for communicating with customers and/or store associates.
[00106] FIG. 7 shows a flow diagram of a method for monitoring for modular compliance in a shopping space in accordance with various embodiments of these teachings.
The steps shown in FIG. 7 may be performed by one or more of the central computer system 630 in FIG. 6, the central computer system 106 in FIG. 1, and the computer device 500 in FIG. 5, for example. In some embodiments, the steps are performed by a processor-based device executing a set of computer readable instructions stored on a memory device.
[00107] In step 710, the system obtains an image of a display module. In some embodiments, the image of the display module may be captured by an MTU communicating with a central computer system. The MTU may be assigned to perform a task in the store (e.g. escort a customer, retrieve a shopping cart, etc.) and capture images of modules on its path of travel. In some embodiments, one or more MTUs may be assigned a shelf scanning task which directs the MTU to travel through the shopping space to capture images of display modules. In some embodiments, the system may instruct the MTU to travel to a specific location such that the captured image matches the angle and orientation of the image of the module in the baseline model. In some embodiments, the image of the display module may be captured by one or more of cameras on portable user devices and stationary cameras installed in a shopping space. In some embodiments, the display module and/or items on the display module may include an identifier such as a bar code, a serial number, and an RFID tag that is also captured by one or more sensors of the capturing device.
[00108] In some embodiments, between steps 710 and 715, the system identifies a display module associated with the retrieved image. In some embodiments, the image capture device may further capture an identifier from the display module. For example, an optically readable code (e.g. barcode) may be placed at the bottom of a display module that can be scanned by an MTU. In some embodiments, the system may determine the module based on a location sensor on the MTU that captured the image. For example, the MTU may include a geolocation beacon receiver (e.g. smart LED sensor) and/or a GPS that can provide the MTU’s location information back to the central computer system. The central computer system may then match the location and orientation of the MTU and the image sensor to a module in the shopping space. In some embodiments, the system identifies a display module to scan when it instructs the MTU to capture the image and the obtained image is assumed to be of that of the assigned display module.
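A sketch of matching an MTU's reported position and camera heading to a display module is shown below; the module table, distance tolerance, and angle tolerance are hypothetical assumptions for illustration only.

```python
# Minimal sketch of resolving which display module an MTU's camera is
# pointed at, given a position estimate and heading.

import math

MODULE_POSITIONS = {
    "aisle-13-cereal": {"xy": (12.5, 31.0), "facing_deg": 90},
    "aisle-13-soup":   {"xy": (12.5, 35.0), "facing_deg": 270},
}

def identify_module(mtu_xy, camera_heading_deg, max_dist=2.5, max_angle=45):
    """Return the module the camera is most plausibly pointed at, or None."""
    best, best_dist = None, float("inf")
    for module_id, info in MODULE_POSITIONS.items():
        dist = math.dist(mtu_xy, info["xy"])
        # Smallest signed angular difference between headings.
        angle_err = abs((camera_heading_deg - info["facing_deg"] + 180) % 360 - 180)
        if dist <= max_dist and angle_err <= max_angle and dist < best_dist:
            best, best_dist = module_id, dist
    return best

print(identify_module((12.0, 31.5), 85))   # -> aisle-13-cereal
```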
[00109] In step 715, the system retrieves a baseline condition model corresponding to the imaged display module from a baseline condition model database. In some embodiments, each baseline condition model may have a baseline model identifier. The central computer system may include a lookup table that matches at least some of the display module identifiers associated with display modules in the shopping space to baseline model identifiers. The baseline condition models may include one or more of a 2D image of a baseline module, a 3D model of a baseline module, a computer generated image/model of a baseline module, and descriptive data of a baseline module. In some embodiments, the baseline models may be based on data captured from modules in the shopping space being monitored right after the modules are properly set up. In some embodiments, the baseline models may be based on data captured from modules in a separate facility. For example, a retail entity may scan standard modules at one location and build baseline models that are used at multiple shopping spaces to ensure uniformity between store locations. In some embodiments, the baseline models may include descriptive data. The descriptive data may be extracted from the images of a baseline module through image analysis and/or be manually entered. For example, a system may analyze an image of a standard module to determine where each item should be displayed and how much shelf space each item should occupy. In some embodiments, the baseline model may include one or more of: allotted space for each item, designated positions for each item, item name, item identifier (e.g. barcode, RFID), pricing information, signage information, etc. In some embodiments, a display module may be stocked by a party other than the operator of the shopping space. For example, a display module may be stocked and/or arranged by employees of a manufacturer, vendor, distributor, and the like. The baseline condition model may be based on an agreement of permitted items and allotted space between the operator of the shopping space and a manufacturer, vendor, distributor, and the like.
[00110] In step 720, the system compares the image of the display module from step 710 and the baseline condition model retrieved in step 715. In some embodiments, the system may perform image analysis on the one or more images of the display module to determine one or more of: item placement, signage placement, signage content, price label content, item quantity, item expiration date, and display area cleanliness. In some embodiments, the system may perform image analysis on the image of the baseline model and the obtained image of the display module to isolate portions of images that appear different in the two images. The isolated portions may then be further analyzed to determine the nature of the divergence from the baseline model. For example, the system may determine whether the difference is caused by items and/or signage that are missing or misplaced. In some embodiments, the image of the baseline module may be an image of the same module. For example, the system may instruct MTUs to capture images of a module at the same location and same orientation over time. In some embodiments, the image of the baseline module may be an image of a standard display module set up in a different location and/or a computer simulated image of a display module.
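In a very simplified form, isolating divergent regions could be done with a block-wise pixel difference as sketched below; a production system would first align the images and use more robust features, so this is illustrative only (it assumes numpy is available and that the two grayscale images are already registered).

```python
# Crude sketch of isolating regions where a captured image diverges from a
# baseline image, using a block-wise mean absolute pixel difference.

import numpy as np

def divergent_blocks(captured, baseline, block=32, threshold=25.0):
    """Return (row, col) indices of blocks whose mean absolute pixel
    difference exceeds the threshold. Inputs are same-shaped grayscale arrays."""
    diff = np.abs(captured.astype(float) - baseline.astype(float))
    h, w = diff.shape
    flagged = []
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            if diff[r:r + block, c:c + block].mean() > threshold:
                flagged.append((r // block, c // block))
    return flagged

# Synthetic example: one block of the captured image has changed.
base = np.zeros((128, 128), dtype=np.uint8)
cap = base.copy()
cap[32:64, 64:96] = 200
print(divergent_blocks(cap, base))   # -> [(1, 2)]
```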
[00111] In some embodiments, the system may perform image analysis on the obtained image of the display module to identify one or more characteristics of the display module to compare to characteristics of the baseline condition model. For example, the system may compare the image of the display modules to known images of item packaging to identify one or more items displayed in the display module. The system may also determine whether any of the products is misplaced or missing based on the baseline condition model. In some embodiments, the system may determine the display position and/or orientation of the items and compare the position and/or orientation to the position and/or orientation specified in the baseline condition model. In some embodiments, the system runs a text recognition algorithm to identify the content of the pricing label text and/or signage on the display module. The system may determine whether the content of the pricing label and/or signage are accurate based on the pricing and signage information specified in the baseline condition model.
[00112] In some embodiments, the system also compares other information gathered by the image capturing device, such as an MTU, with the baseline model. For example, the MTU may scan for RFID tags and count the number of items on the shelves. The MTU may also gather other environmental information such as lighting, temperature, moisture, etc. The baseline condition model may further include a model range for the environmental conditions. In some embodiments, the system may also extract expiration dates from the images of items. In some embodiments, the baseline condition model may specify the earliest permissible expiration date for one or more of the items on the module. The system then assigns a task to remove any item that has an expiration date that predates the earliest permissible expiration date.
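A minimal sketch of the expiration-date check follows, assuming the dates have already been extracted from the images; the item identifiers, dates, and task wording are hypothetical.

```python
# Illustrative sketch of flagging items whose extracted expiration date
# predates the earliest permissible date in the baseline model.

from datetime import date

def expired_item_tasks(extracted_dates, earliest_permissible):
    """extracted_dates: {item_id: date read from the item}.
    Returns task strings for items that must be removed or marked down."""
    return [f"Remove or mark down {item_id} (expires {d.isoformat()})"
            for item_id, d in extracted_dates.items()
            if d < earliest_permissible]

tasks = expired_item_tasks({"milk-1l": date(2016, 8, 10),
                            "yogurt-4pk": date(2016, 9, 1)},
                           earliest_permissible=date(2016, 8, 20))
print(tasks)   # only the milk is flagged
```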
[00113] In step 725, the system determines a modular compliance status for the display module. In some embodiments, the modular compliance status may indicate whether the display module is within tolerance or out of tolerance. A tolerance level may be set to the degree that a display module is permitted to deviate from the baseline model before a task to address the condition is assigned. For example, a single item that is misplaced may be within the tolerance level while three misplaced items may cause the module’s modular compliance status to become out of tolerance. In some embodiments, the degree of noncompliance may be based on the amount of difference between an image of the display module and an image associated with the baseline condition model. In some embodiments, the system may identify the type and nature of the noncompliance condition to determine the degree of noncompliance.
[00114] In some embodiments, the system also determines a compliance score in step 725. The compliance score may be based on the number and the degree of noncompliance detected. For example, an item that slightly exceeds its allotted display space may cause a small deduction in the compliance score. A missing item or a pricing mistake may cause a heavier deduction in the compliance score. Multiple compliance errors may cause further deductions to the compliance score. In some embodiments, a deduction value may be associated with the type of noncompliant condition. For example, a misplaced item may have a low deduction value and a pricing mistake may have a high deduction value. In some embodiments, the deduction value may be based on the amount of divergence from the baseline model. For example, an item that is supposed to occupy a third of a shelf space may cause a minor compliance score deduction if it occupies half of a shelf space, and may cause a high compliance score deduction if it occupies two thirds of the shelf space. In another example, a space that is supposed to have at least eight items on display may cause a minor compliance score deduction if only six items are on display, and may cause a high compliance score deduction if no item is on display. In some embodiments, an aggregation of noncompliant conditions that does not, each by itself, cause the module to be out of tolerance may cause the module to be out of tolerance on the whole. A compliance score may be associated with each item and/or each module. The system may determine that an item is out of tolerance based on the item’s compliance score and/or determine that a module is out of tolerance based on the module’s compliance score. In some embodiments, the modular compliance score may comprise the compliance scores of the items displayed by the module.
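A deduction-based compliance score of the kind described might look like the following sketch; the deduction values, condition names, and severity scale are illustrative assumptions rather than values taken from these teachings.

```python
# Minimal sketch of a deduction-based modular compliance score.

BASE_DEDUCTIONS = {
    "misplaced_item": 5,
    "missing_item": 15,
    "pricing_error": 20,
    "overflow_space": 5,
}

def compliance_score(conditions, start=100):
    """conditions: list of (condition_type, severity) with severity in [0, 1]
    expressing how far the module diverges from the baseline."""
    score = start
    for condition_type, severity in conditions:
        # Heavier conditions and larger divergences deduct more.
        score -= BASE_DEDUCTIONS.get(condition_type, 5) * (0.5 + severity)
    return max(score, 0)

# A slight space overflow plus a pricing mistake:
print(compliance_score([("overflow_space", 0.2), ("pricing_error", 1.0)]))  # 66.5
```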
[00115] In some embodiments, the compliance score may be used by a system to determine whether a task needs to be assigned to address the noncompliance status. For example, a modular compliance task may be assigned only if the compliance score falls below a threshold value. In some embodiments, the compliance score may be used by the system to prioritize the assigning of the tasks to address the noncompliance status in the shopping space. For example, tasks to address noncompliance conditions on various modules may be ordered based on the compliance score of each module such that any available MTUs and/or associates are instructed to address the module with the lowest compliance score first. The modular compliance status and score in the modular compliance status database may be constantly updated as images are captured by the image capture device. In some embodiments, after a modular compliance task is completed, an MTU and/or associate may capture an image of the assigned module and that image may be used to further update the compliance status of the module in the modular compliance status database. In some embodiments, if a display module is stocked by a third party (e.g. manufacturer, vendor, distributor, etc.), the third party may be notified of the noncompliance condition. While deductions to a compliance score are described herein, in some embodiments the system may similarly use a noncompliant score that adds values based on noncompliant conditions to determine whether action should be taken to address the noncompliant condition.
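Threshold-gated, lowest-score-first task assignment could be sketched as follows; the threshold value and module scores are hypothetical and used only to show the ordering.

```python
# Sketch of selecting which modules need a compliance task, worst first.

TOLERANCE_THRESHOLD = 70

def tasks_by_priority(module_scores):
    """Return module ids needing attention, lowest compliance score first."""
    return sorted((m for m, s in module_scores.items() if s < TOLERANCE_THRESHOLD),
                  key=lambda m: module_scores[m])

scores = {"aisle-13-cereal": 66.5, "aisle-02-produce": 88.0, "aisle-07-soup": 40.0}
print(tasks_by_priority(scores))   # -> ['aisle-07-soup', 'aisle-13-cereal']
```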
[00116] In some embodiments, the MTU further collects additional information when capturing an image of the module, such as environmental temperature, barcode scan, radio-frequency identification (RFID) tag scan, and surrounding area condition. The modular compliance status and/or score may be further based on the additional information. For example, a temperature outside of an accepted range may cause a noncompliance condition.
[00117] In some embodiments, after step 725, the system may determine a task for one or more of an MTU and a store associate based on the modular compliance status determined in step 725. The follow-up task assignment may be based on the type of noncompliant condition detected and determined in steps 720 and 725. For example, if an error in the content of a sign is detected, the system may assign a task to replace the sign. In another example, if items are misplaced or missing, the system may assign a task to remove or stock the item. In some embodiments, when a module is out of tolerance on the whole, the system may assign tasks to correct all noncompliant conditions in that module. In some embodiments, the system may check with the inventory system before assigning a task. For example, if an item is missing from a module, the system may check whether the item is out of stock according to the inventory system before assigning a task to restock the item. In some embodiments, if the task is assigned to a store associate, the system may provide natural language instructions (e.g. “remove one column of A-brand soup cans and add one column of B-brand soup cans”) and/or cause the correct item and placement to be displayed on the store associate’s portable device for reference. In some embodiments, the baseline condition model may be displayed on the store associate’s portable device.
[00118] FIG. 8 is an illustration of a display module and a baseline condition model according to some embodiments. The baseline condition model 810 represents a model stored in the baseline condition model database. The baseline condition model 810 includes shelf spaces 811, 812, and 813. The display module 820 represents a module in a shopping space and includes shelf spaces 821, 822, and 823. A system comparing the baseline condition model 810 and the display module 820 may directly compare the images of the two modules and/or may perform image analysis to extract characteristics of one or both of the modules for comparison. For example, the system may compare shelf spaces 811 and 821 and identify that the items on the left are not fully stocked in the display module 820. The system may compare shelf spaces 812 and 822 and determine that the item on the left of the shelf 822 is displayed in the wrong orientation. The system may compare shelf spaces 813 and 823 and determine that the item on the right of the shelf 823 exceeds the allotted space. In some embodiments, the system may record each of these noncompliant conditions and determine whether the divergence in the display module 820 is out of tolerance from the baseline condition model 810. For example, the system may assign a score to each of the detected out of compliance conditions and determine whether the combined score for the display module 820 exceeds a tolerance threshold. In some embodiments, the system may further check for price tags and signage on the display module 820 to determine whether any other noncompliant condition exists. After comparing the baseline condition model 810 and the display module 820, the system may determine a compliance status for each item, each shelf (821, 822, 823), and/or the entire display module 820.
[00119] FIG. 9 is an illustration of a process for monitoring for modular compliance with MTUs. In some embodiments, the steps in FIG. 9 may be implemented by one or more components of the systems shown in FIG. 1 and FIG. 6. In step 911, a store colleague or associate performs the initial setup of a digital map, location beacons (e.g. LED smart lights), and MTUs. In step 951, the location beacons transmit location numbers corresponding to different areas of the store. In step 931, the MTU communicates its status and location to a central computer based on the location number transmitted by the location beacons in step 951.
[00120] In step 921, the central computer records a digital map of the shopping store, which may be mapped to the location numbers transmitted by location beacons. The central computer also records the statuses of MTUs in step 921. The statuses of MTUs may comprise availabilities and capabilities of MTUs in the system. In step 922, the central computer determines a baseline scanning path based on the digital map. In some embodiments, the path may cover each section with modules that the system monitors for modular compliance.
[00121] In step 932, at least one MTU drives the scanning equipment to the start of the baseline scanning path determined in step 922. The scanning equipment may include one or more of an image sensor, an optical sensor, an RFID reader, a 3D scanner, and the like. In step 923, the central computer directs the scanning equipment to begin an initial baseline scan. In step 941, the scanning equipment continues to scan for baseline images until all sections in a shopping space on the scanning path are scanned.
[00122] In step 924, the central computer stitches the baseline scans together to form a baseline model. The baseline model may be a series of 2D images and/or a 3D model. Also in step 924, the central computer directs the MTUs to perform delta scans. Delta scans generally refer to scans to detect changes from the baseline scan. In step 933, the MTUs drive around the store to perform delta scans. In some embodiments, the baseline scans and the delta scans may be performed by the same devices. In some embodiments, the baseline scans may be performed by specialized equipment while delta scans may be performed by the MTUs’ image sensors. In step 942, the scanning equipment performs delta scans.
[00123] In step 925, the central computer compares data obtained from baseline scans with images from delta scans to determine modular compliance. In step 926, the central computer determines whether a module is in compliance. If the module is not in compliance, in step 912 a store associate may be notified of the noncompliant status and assigned a task to perform. In some embodiments, a module may be determined to be not in compliance when its deviation from the baseline model exceeds a preset threshold. In step 927, the central computer continues to instruct MTUs to perform delta scans.
[00124] In some embodiments, the baseline scans may be performed at a different physical location and on different modules. In such embodiments, steps 921, 922, 923, 931, and 932 may be performed by a first system that stores the baseline scan in a baseline condition database accessible by multiple shopping facilities. Steps 912, 925, 926, 927, 933, and 942 may be performed by a second system associated with a particular shopping facility.
[00125] In some embodiments, apparatuses and methods are provided herein useful for monitoring modular compliance in a shopping space. In some embodiments, a method for monitoring modular compliance in a shopping space includes obtaining, from an image capture device, one or more images of a display module in the shopping space, retrieving, from a baseline condition database, a baseline condition model for a display space corresponding to the display module, comparing, with a control circuit, the one or more images of the display module in the shopping space and the baseline condition model, and determining, with the control circuit, a modular compliance status of the display module based on the comparing.
[00126] In some embodiments, a system for monitoring modular compliance in a shopping space includes an image capture device, a baseline condition database, and a control circuit coupled to the image capture device and the baseline condition database. The control circuit is configured to obtain, from the image capture device, one or more images of a display module in the shopping space, retrieve, from the baseline condition database, a baseline condition model for a display space corresponding to the display module, and compare the one or more images of the display module in the shopping space and the baseline condition model to determine a modular compliance status for the display module.
[00127] In some embodiments, an apparatus for monitoring modular compliance in a shopping space comprises a non-transitory storage medium storing a set of computer readable instructions and a control circuit configured to execute the set of computer readable instructions which causes the control circuit to: obtain, from an image capture device, one or more images of a display module in the shopping space, retrieve, from a baseline condition database, a baseline condition model for a display space corresponding to the display module, compare the one or more images of the display module in the shopping space and the baseline condition model, and determine a modular compliance status of the display module based on the comparison.
[00128] Those skilled in the art will recognize that a wide variety of modifications, alterations, and combinations can be made with respect to the above described embodiments without departing from the scope of the invention, and that such modifications, alterations, and combinations are to be viewed as being within the ambit of the inventive concept.
Claims (21)
1. A method for monitoring modular compliance in a shopping space comprising: obtaining, from an image capture device, one or more images of a display module in the shopping space; retrieving, from a baseline condition database, a baseline condition model for a display space corresponding to the display module; comparing, with a control circuit, the one or more images of the display module in the shopping space and the baseline condition model; and determining, with the control circuit, a modular compliance status of the display module based on the comparing.
2. The method of claim 1, wherein the baseline condition model is a three-dimensional model.
3. The method of claim 1, wherein the baseline condition model is based on images of a standard display module different from the display module in the shopping space.
4. The method of claim 1, wherein the image capture device comprises a camera attached to and operated by a motorized transport unit communicating with the control circuit.
5. The method of claim 1, further comprising: performing image analysis on the one or more images of the display module to determine one or more of: item placement, signage placement, signage content, price label content, item quantity, item expiration date, and display area cleanliness.
6. The method of claim 1, wherein the determining of the modular compliance status is based on comparing one or more of: item placement, signage placement, signage content, price label content, item quantity, item expiration date, and display area cleanliness in the one or more images of the display module with information in the baseline condition model.
7. The method of claim 1, wherein the retrieving of the baseline condition model is based on scanning an optically readable code on the display module.
8. The method of claim 1, further comprising: collecting additional information relating to the display module, the additional information comprises one or more of: environmental temperature, barcode scan, radio-frequency identification (RFID) tag scan, and surrounding area condition; wherein the determining of the modular compliance status is further based on the additional information.
9. The method of claim 1, further comprising: determining a follow-up task assignment based on the modular compliance status of the display module.
10. The method of claim 9, wherein the determining of the modular compliance status of the display module comprises determining a modular compliance score and the follow-up task assignment is selected based at least on the modular compliance score.
11. A system for monitoring modular compliance in a shopping space comprising: an image capture device; a baseline condition database; and a control circuit coupled to the image capture device and the baseline condition database, the control circuit being configured to: obtain, from the image capture device, one or more images of a display module in the shopping space; retrieve, from the baseline condition database, a baseline condition model for a display space corresponding to the display module; and compare the one or more images of the display module in the shopping space and the baseline condition model to determine a modular compliance status for the display module.
12. The system of claim 11, wherein the baseline condition model is a three-dimensional model.
13. The system of claim 11, wherein the baseline condition model is based on images of a standard display module different from the display module in the shopping space.
14. The system of claim 11, wherein the image capture device comprises a camera attached to and operated by a motorized transport unit.
15. The system of claim 11, wherein the control circuit is further configured to: perform image analysis on the one or more images of the display module to determine one or more of: item placement, signage placement, signage content, price label content, item quantity, item expiration date, and display area cleanliness.
16. The system of claim 11, wherein the modular compliance status is determined based on comparing one or more of: item placement, signage placement, signage content, price label content, item quantity, item expiration date, and display area cleanliness in the one or more images of the display module with information in the baseline condition model.
17. The system of claim 11, wherein the retrieving of the baseline condition model is based on scanning an optically readable code on the display module.
18. The system of claim 11, wherein the control circuit is further configured to: collect additional information relating to the display module, the additional information comprising one or more of: environmental temperature, barcode scan, radio-frequency identification (RFID) tag scan, and surrounding area condition; wherein the modular compliance status is determined further based on the additional information.
19. The system of claim 11, wherein the control circuit is further configured to: determine a follow-up task assignment based on the modular compliance status of the display module.
20. The system of claim 19, wherein the modular compliance status comprises a modular compliance score and the follow-up task assignment is selected based at least on the modular compliance score.
21. An apparatus for monitoring modular compliance in a shopping space comprising: a non-transitory storage medium storing a set of computer readable instructions; and a control circuit configured to execute the set of computer readable instructions which causes the control circuit to: obtain, from an image capture device, one or more images of a display module in the shopping space; retrieve, from a baseline condition database, a baseline condition model for a display space corresponding to the display module; compare the one or more images of the display module in the shopping space and the baseline condition model; and determine a modular compliance status of the display module based on the comparison.
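The comparison recited in claims 1, 5, 6, 11, 15, 16 and 21, in which captured images of a display module are checked against a baseline condition model, can be illustrated with a minimal sketch. The Python sketch below is hypothetical: the BaselineSlot structure, the slot identifiers, and the assumption that an upstream image-analysis step has already produced per-slot detections are illustrative choices, not details taken from the specification.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class BaselineSlot:
    """Expected state of one shelf position in the baseline condition model (hypothetical format)."""
    slot_id: str
    item_id: str
    min_quantity: int


def compliance_score(detections: Dict[str, Tuple[str, int]],
                     baseline: List[BaselineSlot]) -> float:
    """Return the fraction of baseline slots whose detected item and quantity match.

    `detections` maps slot_id to (detected item_id, detected quantity), as
    produced by an upstream image-analysis step on the captured images of
    the display module; this intermediate format is an assumption.
    """
    if not baseline:
        return 1.0
    matched = 0
    for slot in baseline:
        item_id, quantity = detections.get(slot.slot_id, ("", 0))
        if item_id == slot.item_id and quantity >= slot.min_quantity:
            matched += 1
    return matched / len(baseline)


# Example: two of three slots match the baseline condition model.
baseline = [
    BaselineSlot("A1", "sku-100", 4),
    BaselineSlot("A2", "sku-200", 2),
    BaselineSlot("A3", "sku-300", 1),
]
detections = {"A1": ("sku-100", 5), "A2": ("sku-200", 1), "A3": ("sku-300", 3)}
print(round(compliance_score(detections, baseline), 2))  # 0.67
```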
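Claims 7 and 17 recite retrieving the baseline condition model based on scanning an optically readable code on the display module. A minimal sketch of that lookup follows, assuming the baseline condition database is keyed by the decoded code value; the `baseline_models` table, its columns, and the use of SQLite are illustrative assumptions rather than the claimed database design.

```python
import json
import sqlite3


def baseline_for_code(db_path: str, decoded_code: str) -> dict:
    """Fetch the baseline condition model keyed by a decoded optical code.

    `decoded_code` is the string read from the barcode or QR code on the
    display module; the `baseline_models` table and its columns are
    hypothetical placeholders for the baseline condition database schema.
    """
    with sqlite3.connect(db_path) as conn:
        row = conn.execute(
            "SELECT model_json FROM baseline_models WHERE module_code = ?",
            (decoded_code,),
        ).fetchone()
    if row is None:
        raise KeyError(f"no baseline condition model for code {decoded_code!r}")
    return json.loads(row[0])
```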
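Claims 8 and 18 recite folding additional collected information, such as environmental temperature or an RFID tag scan, into the determination of the modular compliance status. The sketch below shows one hypothetical way to combine such signals with an image-based score; the score threshold, the temperature band, and the RFID count check are assumptions, not figures from the specification.

```python
def combined_status(image_score: float, temperature_c: float,
                    rfid_count: int, expected_rfid_count: int) -> str:
    """Fold additional collected information into the compliance decision.

    The 0.9 score threshold, the 0-8 degree C band (e.g. a chilled display
    module), and the RFID count check are illustrative assumptions only.
    """
    compliant = image_score >= 0.9
    if not 0.0 <= temperature_c <= 8.0:
        compliant = False
    if rfid_count < expected_rfid_count:
        compliant = False
    return "compliant" if compliant else "non-compliant"


print(combined_status(0.95, temperature_c=5.5,
                      rfid_count=24, expected_rfid_count=24))  # compliant
```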
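Claims 9, 10, 19 and 20 recite selecting a follow-up task assignment based at least on a modular compliance score. A minimal sketch of such a selection is shown below; the score thresholds and task names are illustrative assumptions that a deployed system would instead draw from store policy.

```python
def follow_up_task(score: float) -> str:
    """Select a follow-up task assignment from a modular compliance score in [0, 1].

    Thresholds and task names are illustrative assumptions only.
    """
    if score >= 0.95:
        return "no action"
    if score >= 0.75:
        return "associate spot-check"
    if score >= 0.50:
        return "restock and re-face display module"
    return "full modular reset and supervisor review"


print(follow_up_task(0.67))  # restock and re-face display module
```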
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562205548P | 2015-08-14 | 2015-08-14 |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201613873D0 (en) | 2016-09-28 |
GB2543136A (en) | 2017-04-12 |
Family
ID=56985872
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1613873.7A (published as GB2543136A, Withdrawn) | 2015-08-14 | 2016-08-12 | Systems, devices and methods for monitoring modular compliance in a shopping space |
Country Status (2)
Country | Link |
---|---|
CA (1) | CA2938573A1 (en) |
GB (1) | GB2543136A (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109308251B (en) * | 2017-07-27 | 2022-03-25 | Alibaba Group Holding Ltd. | Test data verification method and device |
2016
- 2016-08-11: CA application CA2938573A filed, published as CA2938573A1 (not active, Abandoned)
- 2016-08-12: GB application GB1613873.7A filed, published as GB2543136A (not active, Withdrawn)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100171826A1 (en) * | 2006-04-12 | 2010-07-08 | Store Eyes, Inc. | Method for measuring retail display and compliance |
WO2009027835A2 (en) * | 2007-08-31 | 2009-03-05 | Accenture Global Services Gmbh | Detection of stock out conditions based on image processing |
WO2011063527A1 (en) * | 2009-11-27 | 2011-06-03 | Sentry Technology Corporation | Enterprise management system and auditing method employed thereby |
US20130051667A1 (en) * | 2011-08-31 | 2013-02-28 | Kevin Keqiang Deng | Image recognition to support shelf auditing for consumer research |
US20130300729A1 (en) * | 2012-05-11 | 2013-11-14 | Dassault Systemes | Comparing Virtual and Real Images in a Shopping Experience |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220004745A1 (en) * | 2020-01-24 | 2022-01-06 | Synchrony Bank | Systems and methods for machine vision based object recognition |
US11741420B2 (en) * | 2020-01-24 | 2023-08-29 | Synchrony Bank | Systems and methods for machine vision based object recognition |
US12001997B2 (en) | 2020-01-24 | 2024-06-04 | Synchrony Bank | Systems and methods for machine vision based object recognition |
Also Published As
Publication number | Publication date |
---|---|
CA2938573A1 (en) | 2017-02-14 |
GB201613873D0 (en) | 2016-09-28 |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
US20160260148A1 (en) | Systems, devices and methods for monitoring modular compliance in a shopping space | |
US20180099846A1 (en) | Method and apparatus for transporting a plurality of stacked motorized transport units | |
CA2938075A1 (en) | Shopping space mapping systems, devices and methods | |
GB2542469A (en) | Shopping facility assistance systems, devices, and method to identify security and safety anomalies | |
GB2543136A (en) | Systems, devices and methods for monitoring modular compliance in a shopping space | |
GB2549188B (en) | Assignment of a motorized personal assistance apparatus | |
US12123155B2 (en) | Apparatus and method of monitoring product placement within a shopping facility | |
US20230374746A1 (en) | Apparatus and method of monitoring product placement within a shopping facility | |
GB2542473A (en) | Shopping facility assistance system and method to retrieve in-store abandoned mobile item containers | |
GB2550016B (en) | Motorized transport unit worker support systems and methods | |
CA2938589A1 (en) | Shopping facility assistance systems, devices, and methods to facilitate responding to a user's request for product pricing information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 732E | Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977) | Free format text: REGISTERED BETWEEN 20180412 AND 20180418 |
| WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) | |