WO2018002864A2 - Shopping cart-integrated system and method for automatic identification of products - Google Patents

Shopping cart-integrated system and method for automatic identification of products

Info

Publication number
WO2018002864A2
WO2018002864A2 (PCT/IB2017/053904)
Authority
WO
WIPO (PCT)
Prior art keywords
product
system
object
database
shopping cart
Prior art date
Application number
PCT/IB2017/053904
Other languages
French (fr)
Other versions
WO2018002864A3 (en)
Inventor
Rami VILMOSH
Shmuel KOTZEV
Original Assignee
Rami VILMOSH
Kotzev Shmuel
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US201662356558P priority Critical
Priority to US62/356,558 priority
Application filed by Rami VILMOSH, Kotzev Shmuel filed Critical Rami VILMOSH
Publication of WO2018002864A2 publication Critical patent/WO2018002864A2/en
Publication of WO2018002864A3 publication Critical patent/WO2018002864A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06QDATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce, e.g. shopping or e-commerce
    • G06Q30/06Buying, selling or leasing transactions
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06KRECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07GREGISTERING THE RECEIPT OF CASH, VALUABLES, OR TOKENS
    • G07G1/00Cash registers
    • G07G1/0036Checkout procedures
    • G07G1/0045Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader
    • G07G1/0054Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader with control of supplementary check-parameters, e.g. weight or number of articles
    • G07G1/0063Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader with control of supplementary check-parameters, e.g. weight or number of articles with means for detecting the geometric dimensions of the article of which the code is read, such as its size or height, for the verification of the registration
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07GREGISTERING THE RECEIPT OF CASH, VALUABLES, OR TOKENS
    • G07G1/00Cash registers
    • G07G1/0036Checkout procedures
    • G07G1/0045Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader
    • G07G1/0054Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader with control of supplementary check-parameters, e.g. weight or number of articles
    • G07G1/0072Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader with control of supplementary check-parameters, e.g. weight or number of articles with means for detecting the weight of the article of which the code is read, for the verification of the registration
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07GREGISTERING THE RECEIPT OF CASH, VALUABLES, OR TOKENS
    • G07G1/00Cash registers
    • G07G1/0036Checkout procedures
    • G07G1/0045Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader
    • G07G1/0081Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader the reader being a portable scanner or data reader

Abstract

A shopping cart-integrated system includes a shopping cart carrying a sensor arrangement with a pair of spaced-apart imaging sensors or a three-dimensional image sensor deployed to view the upper opening of the cart. A processing system processes data from the sensor arrangement to sense the presence of an object adjacent to the upper opening, to track motion of the object and to determine whether the object is inserted into the cart, and to perform a product identification process. The product identification process uses at least one non-image parameter to filter a database of products to derive a subset of candidate products from the database and then compares a sampled image of the object with reference images of the candidate products to identify the object as a product within the database.

Description

Title: Shopping Cart-Integrated System and Method for Automatic Identification of Products

FIELD AND BACKGROUND OF THE INVENTION

The present invention relates to retail systems and, in particular, it concerns a shopping cart-integrated system and corresponding method for automatic identification of products placed into a shopping cart.

Conventional arrangements for shopping in stores such as supermarkets typically require a checkout process in which products are removed from a shopping cart and passed through a scanning process to assemble a list of products purchased. The checkout may be a manned checkout or a self-checkout. In either case, the checkout tends to be a bottleneck, typically leading to significant and frustrating delays for customers at times of peak shopping activity. At off-peak times, much of the checkout area of the store is typically not utilized, leading to inefficient use of the retail floor space.

SUMMARY OF THE INVENTION

The present invention is a shopping cart-integrated system and corresponding method for automatic identification of products placed into a shopping cart.

According to the teachings of the present invention there is provided a shopping cart-integrated system and corresponding method for automatic identification of products placed into a shopping cart.

According to a further feature of an embodiment of the present invention, there is provided a shopping cart-integrated system comprising: (a) a shopping cart comprising a set of walls enclosing a product receiving volume having a base and an upper opening; (b) a sensor arrangement comprising a pair of spaced-apart imaging sensors or a three-dimensional image sensor, the sensor arrangement being deployed in fixed relation to the product receiving volume so as to have an imaging field of view substantially spanning the upper opening; and (c) a processing system including at least one processor, the processing system being in data communication with the sensor arrangement, the processing system being configured to process data from the sensor arrangement so as: (i) to sense the presence of an object in a region adjacent to the upper opening; (ii) to track motion of the object and to determine whether the motion corresponds to an insertion event in which the object is inserted into the product receiving volume; and (iii) to perform a product identification process comprising employing at least one non-image parameter to filter a database of products to derive a subset of candidate products from the database and comparing a sampled image of the object derived from the sensor arrangement with reference images of the subset of candidate products by an image matching process to identify the object as a product within the database.

According to a further feature of an embodiment of the present invention, the sensor arrangement further comprises a weighing device associated with the base of the product receiving volume and deployed for weighing objects placed within the product receiving volume.

According to a further feature of an embodiment of the present invention, the processing system is further configured to: (a) after determination of an insertion event, monitor for a change in weight measured by the weighing device; and (b) compare the change in weight to an expected weight of at least one product.

According to a further feature of an embodiment of the present invention, if the change in weight does not match the expected weight to within a predefined margin of error, a suspect-item flag is generated for communication to a sales supervisor.

According to a further feature of an embodiment of the present invention, there is also provided an indoor tracking system including at least one tracking system component associated with the shopping cart, the indoor tracking system being associated with the processing system so as to provide to the processing system an indication of a location of the shopping cart within a mapped store.

According to a further feature of an embodiment of the present invention, the product identification process further comprises performing a pre-filtering selection from the database of products to derive a locality-based subset of candidate products from the database, the locality-based subset of candidate products being derived from a location of the shopping cart and a product map of product locations within the mapped store.

According to a further feature of an embodiment of the present invention, the processing system is further configured such that, in a case in which the product identification process has initially failed to identify the object as a product within the locality-based subset of candidate products, the processing system repeats the product identification process with the pre-filtering selection used to select a second subset of candidate products derived from the location of the shopping cart and the product map of product locations within the mapped store based on a second-level proximity condition.
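The locality-based pre-filtering with a second-level proximity fallback described in the two preceding paragraphs can be sketched as follows. This is a minimal illustration only: the product map, coordinates, radii, and function names are assumptions for the sketch, not details taken from the disclosure.

```python
def locality_candidates(product_map, cart_pos, radius):
    """Products whose mapped shelf location lies within `radius` of the cart."""
    cx, cy = cart_pos
    return [sku for sku, (px, py) in product_map.items()
            if ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5 <= radius]

def identify_with_fallback(product_map, cart_pos, try_match, radii=(5.0, 15.0)):
    """Attempt identification against nearby products first; on failure,
    widen the proximity condition, and finally drop the locality filter."""
    for r in radii:
        sku = try_match(locality_candidates(product_map, cart_pos, r))
        if sku is not None:
            return sku
    # Last resort: repeat identification without the locality pre-filter.
    return try_match(list(product_map))
```

The successive radii implement the "second-level proximity condition": each failed pass enlarges the candidate subset before the system falls back to the full database.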

According to a further feature of an embodiment of the present invention, the processing system is further configured such that, in a case in which the product identification process has initially failed to identify the object as a product within the locality-based subset of candidate products, the processing system repeats the product identification process without the pre-filtering selection based on locality.

According to a further feature of an embodiment of the present invention, the processing system is further configured to process image data and range data derived from the sensor arrangement to derive at least one non-image parameter relating to the object for use in the product identification process, the at least one non-image parameter being selected from the group consisting of: at least one dimension of the object; a shape of the object; and a color property of the object.

According to a further feature of an embodiment of the present invention, the processing system is implemented at least in part through at least one computer located remotely relative to the shopping cart, the system further comprising wireless networking components associated with the shopping cart and the at least one computer.

According to a further feature of an embodiment of the present invention, the processing system is further configured to determine whether the motion of the object corresponds to a removal event in which an object previously inserted into the product receiving volume is removed from the product receiving volume.

According to a further feature of an embodiment of the present invention, there is also provided a payment system in data communication with the processing system, the payment system providing a payment user interface configured to present to a user a list of purchased items corresponding to products inserted into the product receiving volume and to receive payment from the user for the list of purchased items.

According to a further feature of an embodiment of the present invention, there is also provided a wired or short-range wireless data connection associated with the shopping cart and configured for data communication with a user mobile communications device to provide information to the user mobile communications device for display in a graphic user interface of the user mobile communications device.

There is also provided according to the teachings of an embodiment of the present invention, a method comprising the steps of: (a) employing the aforementioned shopping cart-integrated system: (i) to sense the presence of an object in a region adjacent to the upper opening; (ii) to track motion of the object and to determine whether the motion corresponds to an insertion event in which the object is inserted into the product receiving volume; and (iii) to perform a product identification process comprising employing at least one non-image parameter to filter a database of products to derive a subset of candidate products from the database and comparing a sampled image of the object derived from the sensor arrangement with reference images of the subset of candidate products by an image matching process to identify the object as a product within the database; (b) for each insertion event, adding a corresponding identified product to a list of items purchased; and (c) processing a customer confirmation to complete a sale based on the list of items purchased, without requiring a checkout product scan.

BRIEF DESCRIPTION OF THE DRAWINGS

The invention is herein described, by way of example only, with reference to the accompanying drawings, wherein:

FIG. 1 is a block diagram showing the components of a non-limiting exemplary implementation of a shopping cart-integrated system, constructed and operative according to an embodiment of the present invention, for automatic identification of products placed into the shopping cart;

FIG. 2A is a schematic isometric view of a shopping cart with components of the shopping cart-integrated system of FIG. 1;

FIG. 2B is an enlarged view of a region of FIG. 2A designated II;

FIG. 2C is a schematic top view of the shopping cart of FIG. 2A illustrating the fields of view of two imaging sensors associated with the shopping cart-integrated system;

FIG. 3 is a flow diagram illustrating a sequence of operations performed by a user in order to complete purchases according to a non-limiting exemplary implementation of the present invention;

FIG. 4 is a flow diagram illustrating a mode of operation of the shopping cart-integrated system during selection of purchases according to a non-limiting exemplary implementation of the present invention;

FIG. 5 is a flow diagram illustrating a method for generating a database of products according to an implementation of the present invention;

FIG. 6 is a flow diagram illustrating a method according to an implementation of the present invention for identifying a product introduced by a user into the shopping cart;

FIG. 7 is a schematic plan view of the layout of a store illustrating the application of a proximity criterion in filtering candidate products for a product identification process according to an aspect of the present invention; and

FIG. 8 is a block diagram showing the interrelation of a server of the shopping cart-integrated system of the present invention with other dedicated and non-dedicated subsystems according to a non-limiting exemplary implementation of the present invention.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

The present invention is a shopping cart-integrated system and corresponding method for automatic identification of products placed into a shopping cart.

The principles and operation of systems and methods according to the present invention may be better understood with reference to the drawings and the accompanying description.

Turning now to FIG. 1, this illustrates schematically the components of a shopping cart-integrated system, generally designated 10, constructed and operative according to an embodiment of the present invention. In general terms, system 10 includes a shopping cart 12 which, as better seen in FIGS. 2A-2C, has a set of walls 14 enclosing a product receiving volume 16 having a base 18 and an upper opening 20. A sensor arrangement, including a pair of spaced-apart imaging sensors 24a, 24b or a three-dimensional image sensor (not shown), is deployed, typically as part of a cart-mounted housing 22, in fixed relation to product receiving volume 16 so as to have an imaging field of view (region of overlap in plan view of FIG. 2C) substantially spanning the upper opening 20. The cart-mounted housing 22 is typically mounted on the handle of shopping cart 12, slightly spaced from the product receiving volume 16, such that the region of stereoscopic coverage of the spaced-apart imaging sensors effectively spans the upper opening 20. In certain cases, even if the stereoscopic coverage does not extend to 100% of the upper opening, this can be compensated for by applying an additional tracking criterion, such as, for example, whether an object crosses in front of a boundary of the upper opening 20 as viewed even by only one of the cameras.

Returning to FIG. 1, system 10 also includes a processing system including at least one processor in data communication with the sensor arrangement. The processing system is configured to process data from the sensor arrangement so as:

(a) to sense the presence of an object in a region adjacent to the upper opening;

(b) to track motion of the object and to determine whether the motion corresponds to an insertion event in which the object is inserted into the product receiving volume; and (c) to perform a product identification process comprising employing at least one non-image parameter to filter a database of products to derive a subset of candidate products from the database and comparing a sampled image of the object derived from the sensor arrangement with reference images of the subset of candidate products by an image matching process to identify the object as a product within the database.
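The two-stage product identification of step (c) can be illustrated with a short sketch: a non-image parameter first narrows the database, and image matching then runs only against that subset. The `Product` fields, the use of object height as the filter parameter, and the scoring interface are hypothetical stand-ins; the disclosure does not prescribe a particular matching engine.

```python
from dataclasses import dataclass

@dataclass
class Product:
    sku: str
    height_cm: float
    ref_image: str  # stand-in for a stored reference image

def identify(products, measured_height_cm, sampled_image, match_score,
             tolerance_cm=1.0):
    # Stage 1: a non-image parameter (here, measured object height)
    # filters the database down to a subset of candidate products.
    candidates = [p for p in products
                  if abs(p.height_cm - measured_height_cm) <= tolerance_cm]
    if not candidates:
        return None
    # Stage 2: image matching runs only against the candidates'
    # reference images; the best-scoring candidate is the identification.
    return max(candidates,
               key=lambda p: match_score(sampled_image, p.ref_image)).sku
```

The point of the filter stage is that the expensive image-matching step never sees the full database, only the handful of candidates consistent with the measured parameter.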

The system as presented herein, with further details described below, enables a mode of shopping which essentially bypasses the checkout process, thereby streamlining and enhancing the shopping experience. An exemplary sequence of the shopping process 100, from the user's point of view, as enabled by the present invention, is presented in FIG. 3. Specifically, on arrival at a store, the user takes a "smart cart" (step 102), corresponding to shopping cart 12 with various cart-mounted components of system 10. The user arranges take-home shopping bags 120, which may be disposable or reusable bags, within product receiving volume 16 for receiving the purchased items (step 104). Optionally, in a smartphone-integrated implementation (detailed below), the user's smartphone is connected to system 10 via a wired or short-range wireless data connection (step 106). The user then identifies herself to the system (step 108) in any suitable manner, such as for example: by logging on to the system with a user ID and password via the cart-mounted system, via the smartphone or via a free-standing customer service station; by presenting a store-issued magnetic or RFID customer card; or by direct user recognition by facial recognition, a fingerprint reader, an iris reader, voice recognition or any other distinctive biometric identification technique, as is known in the field of biometric identification.

Once the user has been identified as an authorized user and the shopping cart has been associated with that user, the user proceeds with shopping simply by placing products into the shopping cart (step 110), with each product being captured by the image sensor(s) to perform tracking of the object into or out of the basket, as well as product recognition, all as further detailed below. If the user mistakenly adds an item to the cart, or changes her mind regarding a particular purchase, the purchase can be canceled by removing the item from the cart so that the image sensor(s) capture images for tracking the removal of the object and identifying the object.

When the user indicates that she has completed selecting items for purchase, a list of the purchased items is preferably displayed to the user for review, typically via the user's smartphone, or alternatively via a dedicated display, which may be part of the cart-mounted system or at a free-standing customer service station (step 114). After review of the items purchased, the user authorizes payment (step 116), either by performing an electronic payment transaction or by authorizing charging of the appropriate sum via a credit or debit account with the store, if such exists. The user is then free to leave the store directly (step 118), carrying the purchased items in shopping bags 120 into which they were placed inside the cart, without requiring removal and scanning of the items at any checkout process.

By avoiding the need for scanning of items at a checkout, many disadvantages of conventional supermarket logistics are avoided: for example, the major bottlenecks that may occur at peak times with users lining up at checkouts are eliminated, while the wastage of retail space taken up by a large array of checkout counters which lie unused during off-peak times is also reduced.

Turning now to the features of system 10 in more detail, FIGS. 1 and 2A-2C show a number of components that are typically associated with cart 12 according to certain exemplary embodiments of the present invention. According to certain particularly preferred implementations of the present invention, the sensor arrangement further includes a weighing device built into shopping cart 12 so as to weigh objects that are placed in product receiving volume 16. The weighing device is typically implemented as a set of load cells 26, deployed so as to support base 18 and walls 14, which are associated with suitable circuitry connected via electric cable(s) (not shown) to other cart-mounted electronic components in cart-mounted housing 22.

The processing system for implementing the present invention typically includes a combination of a local processing system 122 in the cart-mounted housing 22 and a processing system 124 located remotely relative to the shopping cart, either as part of a back-office server system 126, located on site or elsewhere, or via a cloud computing implementation, or any other distributed combination of processing functions between these different options, all as is known in the art. Wireless networking components 128, 130 are associated with cart-mounted processing system 122 and back-office server system 126, respectively, to provide networked connection between them. Due to the relatively large number of cart-mounted systems required, and the fact that the cart-mounted systems are publicly accessible and subject to significant wear-and-tear, it is typically preferable to employ low-cost processing components with limited processing power for local processing system 122 in the cart-mounted system, and to perform most of the heavier processing tasks using processing system 124. In any case, any implication in the description herein as to specific functions being performed by specific processing systems should be understood to be only by way of an illustrative preferred embodiment without limiting the invention to the specific implementation.

According to certain preferred implementations of the present invention, the cart-mounted system may interface with a user mobile electronics device such as a smartphone or tablet. In the retail store environment of most industrialized countries, the vast majority of customers carry a mobile communications device with a wide range of networking capabilities and which is readily configurable to perform various desired functions by installation of a suitable application ("APP"). Association of a user mobile communications device such as a smartphone 132 with the cart-based system may be performed via a communication link 136 of the smartphone, which may be for example a radio connection according to the standards of a WIFI connection, a Bluetooth connection, or an NFC connection, or may be a wired connection such as via a suitable USB cable connection standard, and is preferably managed by a suitable dedicated APP 138 installed on the device. Once associated, the user mobile communications device may provide a graphical user interface (GUI) 134 for the system, thereby avoiding the need for integration of a display component into the cart-mounted system. This further reduces the unit cost and maintenance costs of the cart-based system. Integration with a user mobile communications device also allows integration of additional features such as management of shopping lists, which may be created in advance and automatically updated during shopping as items are added to the cart, or maintaining a history of purchases and facilitating tracking of expenditure. Integration of a personal mobile device also provides convenient options for identification of the customer, and may also provide options for executing payment on completion of the shopping process. Implementation of any and all of the above options will be readily achieved by a person having ordinary skill in the art.

Structurally, cart-mounted housing 22 is advantageously implemented with an adjustable mount 140 (FIG. 2B) to receive the user mobile communications device 132 during use of cart 12, thereby positioning the device to be conveniently accessible and visible as a GUI.

Although illustrated herein in an implementation that integrates a user mobile communications device, it should be noted that alternative implementations without such integration, or where such integration is relied upon less for basic functionality of the system, also fall within the scope of the present invention. In the latter cases, a touch-screen (not shown) is typically integrated into cart-mounted housing 22, for example, in place of adjustable mount 140.

Other preferred features of cart-mounted housing 22 typically include green and red indicator lights 142 and 144, which are used to indicate successful processing of a purchase/cancellation transaction or an error condition, respectively, as discussed further below. Other alternatives or additional indications, such as a message flashing on the display or an audio signal (beep for successful transaction, buzzer for error) may be used. Cart-mounted housing 22 may also include an RFID reader 146 as a further preferred option for identifying a customer who carries a customer ID card or tag, which may be issued by a specific store or store-chain, or may be a general purpose tagged ID issued by any suitable authority. Additionally, or alternatively, an RFID reader may play other roles in functions such as indoor navigation.

The various components of cart-mounted housing 22 are preferably powered by one or more rechargeable batteries 148, which are preferably configured with suitable charging circuitry which docks with a charging arrangement (not shown) when the cart is tethered in a cart storage area between periods of use.

Additional hardware and/or software components are provided, represented generically by element 150 in FIG. 1, as required to perform indoor navigation to trace the approximate location of cart 12 within a store. Many suitable techniques are known in the art for indoor navigation, which may readily be implemented in the cart-based systems of the present invention. The navigation system may be a true navigation system which provides an estimate of cart position within the store, for example, based on WiFi beacons. Alternatively, for the purposes of the present invention, the navigation system may track progress of the cart through a store in the sense of when it passes waypoints defined at the entrances to, and optionally also at intermediate positions along, the aisles of a store. Sensing of waypoints may be achieved by use of RFID gates detecting passage of a cart, or by locating optical markers to be sensed either by cameras 24a or 24b, or by a dedicated camera deployed to detect markers deployed on walls or product displays, on the floor, or on the ceiling. One non-limiting example of a commercially available waypoint marker system is the iBeacon system from Apple Inc., based on Bluetooth Low Energy (BLE) technology.
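The coarse beacon-based localization mentioned above can be approximated by a simple nearest-beacon heuristic, commonly used with BLE beacons: the cart's zone is taken as the beacon with the strongest received signal. The beacon names and RSSI values below are purely illustrative assumptions.

```python
def nearest_beacon(rssi_readings_dbm):
    """Estimate the cart's zone as the beacon with the strongest signal.
    RSSI is in dBm, so 'strongest' means the highest (least negative) value."""
    if not rssi_readings_dbm:
        return None
    return max(rssi_readings_dbm, key=rssi_readings_dbm.get)
```

This yields only aisle-level granularity, which is sufficient for the locality-based candidate pre-filtering described earlier; a true positioning system would fuse several beacons or waypoint crossings.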

Cameras 24a and 24b are preferably implemented as a pair of cameras in fixed spatial relation with dedicated preprocessing hardware and/or software which correlates the images to determine a range to each pixel. A wide range of photogrammetry software for deriving range information from image pairs is commercially available and can be used to implement such a device; one non-limiting example is the software package "123D Catch" commercially available from Autodesk Inc. This use of a stereo-camera arrangement with photogrammetry processing allows application of a simple range threshold-based algorithm to sense the presence of an object over the opening 20 of the cart. A similar functionality is provided by alternative implementations using a time-of-flight or "3D" camera which senses pixel distance together with a 2D image. In both cases, the camera(s) are deployed to provide a field of view which substantially spans the opening 20 to the inside of the cart, i.e., providing sufficient coverage to effectively prevent introduction of products into the cart without them being sensed.
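The range threshold-based presence test mentioned above can be sketched as follows: any pixel whose range is shorter than the empty-cart baseline belongs to an intervening object. The depth values, threshold, and minimum pixel count are illustrative assumptions, not values from the disclosure.

```python
def object_present(depth_map_mm, empty_threshold_mm, min_pixels=50):
    """Detect an object over the cart opening by counting pixels whose
    measured range is shorter than the empty-cart threshold; requiring a
    minimum pixel count rejects single-pixel noise."""
    close = sum(1 for row in depth_map_mm
                for d in row if d < empty_threshold_mm)
    return close >= min_pixels
```

In practice the threshold would be calibrated per camera mounting height, and the same thresholded pixel mask doubles as the segmented image used for tracking.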

Turning now to FIG. 4, this details a preferred non-limiting exemplary implementation of the operation of the system of the present invention, designated as process 200, which provides the user functionality described in steps 110 and 112 of FIG. 3. Firstly, at 202, the camera(s) monitor a volume above opening 20 to sense the presence of an object in that volume. When an object is sensed, at 204, tracking algorithms are applied to successive images to track motion of the object. Selection criteria are applied to the sensed track to identify whether the motion is towards (i.e., insertion into) the cart ("yes" at 206) or out of (i.e., removal from) the cart ("yes" at 208). If the tracked motion leaves the tracking volume over the cart without satisfying either of these criteria, this indicates a passing object, and the flow returns to the monitoring state of 202.

Implementation of the above tracking algorithms can typically be performed using standard routines available from open source libraries, as is well known in the field of computer vision. By way of non-limiting example, the processes might include: retrieval of pixel depths from 3D camera or stereo camera circuitry; thresholding of depth to obtain a segmented image containing only pixels corresponding to an object within the volume over the opening; selection of trackable features within the segmented image; correlation of trackable features between successive images; and building of tracks for the trackable features from entry into the field of view until exit from the field of view. Optical flow algorithms may be used as an alternative to feature tracking. The boundaries of the field of view corresponding to outside the cart and inside the cart are typically well defined, such that it is typically straightforward to distinguish between cases of insertion into the cart, removal from the cart, and passing objects.
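The final step of the pipeline above, classifying a completed track as an insertion, a removal, or a passing object, might look like the following sketch. The single-axis "inside the cart" convention is an assumed simplification of the well-defined field-of-view boundaries mentioned above.

```python
def classify_track(track, opening_y):
    """Classify a tracked path over the cart opening. Points are (x, y)
    image coordinates; in this simplified model, y greater than
    `opening_y` counts as inside the cart region (an assumed convention)."""
    start_inside = track[0][1] > opening_y
    end_inside = track[-1][1] > opening_y
    if not start_inside and end_inside:
        return "insertion"
    if start_inside and not end_inside:
        return "removal"
    # Anything else (e.g., a track that stays outside) is a passing object.
    return "passing"
```

Only the endpoints of the track are examined here; a production system would also check that the track actually crossed the opening boundary rather than skirting it.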

In the case of an insertion event, at 210, the system attempts to identify the object inserted into the cart via a process described more fully below with reference to FIG. 6. Where the identification is successful, resulting in a Stock Keeping Unit (SKU) code uniquely identifying the product, a transaction is then registered as a purchase of the corresponding product, resulting in lighting of the green light and/or any other predefined transaction confirmation as well as adding the SKU to the list of purchased products (step 212). The purchase is preferably also notified to the primary supermarket back-office computer system in order to update inventory records, although this could alternatively be done only at the end of the shopping session, for example on payment, in order to reduce data traffic to the supermarket server. If the product identification is not successful, the red light is activated (step 214) and notification is preferably sent to a human assistant to assist in resolving the problem (step 216). According to a further particularly preferred feature of certain implementations of the present invention, as a confirmation of successful identification at step 212, after determination of an insertion event, the system monitors for a change in weight of the contents of the cart, as measured by the weighing device (load cells 26), and compares the change in weight to an expected weight of the identified product (step 218). If the weight change matches the expected weight change to within some predefined margin of error, this provides verification of the identity of the product added to the cart and the process is considered complete, with control returning to the monitoring mode of 202. If the change in weight does not match the expected weight of the purchased product to within a predefined margin of error, the red light is activated (step 214) and a suspect-item flag is generated for communication to a sales supervisor for follow-up at 216.
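The weight-verification step 218 described above might be sketched as follows; the ±5% fractional tolerance is one illustrative choice of predefined margin of error:

```python
def verify_weight(prev_total_g, new_total_g, expected_g, tolerance=0.05):
    """Compare the measured change in cart weight against the expected
    weight of the identified product, to within a fractional tolerance
    (an illustrative +/-5% margin of error). The previous total serves
    as the base-line, so per-product errors do not accumulate."""
    change = new_total_g - prev_total_g
    return abs(change - expected_g) <= tolerance * expected_g
```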

A parallel process occurs in the event of a removal event detected at 208, with retrieval of the SKU at 220 and, if successful, activation of green light 142 and subtraction of the detected item from the list of purchases (step 222). Here too, the transaction is verified by comparing the change in weight of items in the basket against the expected weight of the detected item as derived from the product database at step 224. Where a match is verified, control returns to the monitoring of step 202. In the event of a failure to identify the object, or where the weight verification step indicates an irregularity, the red error light is illuminated at 226 and a flag is generated for supervisor follow-up at 228.

The various components of the system operate under the control and integration of a computer system to provide the various functions described herein. The computer system may be a distributed system including the processing system 122 of the cart-mounted system in networked communication with the processing system 124 of the smart-cart server system 126, which typically also includes data storage 230. This may further be connected and/or integrated with one or more centralized computer system of the retail establishment and/or a remote or cloud-based computing system, such as back-office computer 234. Each computer typically includes at least one processor, with at least one associated non-volatile data storage device, configured either by software, by hardware design or by a combination thereof, to perform the various processes as described herein, all as will be readily understood by a person ordinarily skilled in the art. The various parts of the computer system are interconnected by suitable wireless, or in the case of the back-office computers optionally wired, communications infrastructure and communicate using standard communications protocols to form a local and/or wide area network. Dedicated, non-standard communications equipment and/or protocols may also be used. The number of devices need not follow the particular architecture illustrated here. For example, the functions of server system 126 may be integrated into the otherwise conventional back-office supermarket system 234 where sufficient processing power is available.

A database of product information may be accommodated in any suitable data storage device, for example in data storage 230, which may be a local back-office networked storage system operating a RAID array, may be a centralized storage system of a chain of stores located at a remote location, or may be implemented on a cloud server using dynamically allocated resources, all as is known in the art. The database preferably stores a set of N entries corresponding to products available in a store, each entry including at least one reference image of a product and at least one non-image parameter characterizing a property of the corresponding product. The reference imagery and the non-image parameters may be stored in distinct data structures and/or physically separate databases, so long as there is clear one-to-one indexing between them, but are referred to herein functionally as "a database". Examples of the non-image parameters, and details of a database update process will be discussed by way of example with reference to FIG. 5 below.
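By way of non-limiting illustration, one possible layout for such a database entry is sketched below; the field names are hypothetical, chosen only to mirror the parameters discussed herein, and the reference imagery could equally reside in a physically separate store indexed by SKU:

```python
from dataclasses import dataclass, field

@dataclass
class ProductEntry:
    """One of the N entries of the product database (illustrative)."""
    sku: str                 # unique Stock Keeping Unit code
    name: str
    weight_g: float          # expected weight, for weight verification
    shape_class: str         # e.g. "cuboid", "cylinder", "bottle"
    dimensions_mm: tuple     # parametric size, e.g. (height, width, depth)
    color_signature: list    # dominant quantized colors
    reference_images: list = field(default_factory=list)
```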

FIG. 6 illustrates the flow of a non-limiting example of the main identification process, generally designated 30, which provides the functionality of steps 210 and 220 of FIG. 4, according to an implementation of the present invention. The process starts with various input steps, primarily corresponding to the aforementioned segmentation of images from the imaging sensors to obtain images of pixels belonging to an object involved in an insertion or removal event (step 32). Since the image data also includes range information to the image pixels, this data also allows estimation of the size of the object, and further processing can provide at least partial information regarding object shape and color properties, as will be discussed further below. These inputs are then used by the computer system to derive one or more non-image parameter characterizing a corresponding property of the product presented by the user, and hence to narrow down the subset of possibly-relevant database records to which image-matching processing is to be applied.

According to one particularly preferred aspect of certain implementations of the present invention, an indoor tracking system including at least one tracking system component associated with shopping cart 12 provides the processing system an indication of a location of the shopping cart within a mapped store (step 34). This indication of shopping cart location is then used to perform a pre-filtering selection from the database of products to derive a locality-based subset (or "database slice") of candidate products from the database (step 36). Specifically, the locality-based subset of candidate products is derived from a location of the shopping cart and a product map of product locations within the mapped store, based on the assumption that most products placed into the cart have been taken from the shelves or other product displays adjacent to the current cart position.

The use of location for filtering the database of candidate products has a major impact on the efficiency of the product identification process. Specifically, by limiting the product search to products which are nearby the current cart position, the number of candidate products can typically be reduced from many thousands to less than 100, rendering the additional filtering steps and the image matching process highly effective. It is noted however that the location proximity assumption is a refutable assumption, and is therefore preferably implemented as an adjustable or releasable condition, since it is possible for a customer to bring an item from elsewhere in the store to put in the cart. Accordingly, if at the end of the identification cycle at step 50 a match is not found, the location filter is preferably adjusted (step 54) and the process from step 36 is repeated.

One non-limiting example of an implementation of an adjustable proximity filter is illustrated in FIG. 7, which shows a schematic plan view of a part of a supermarket with product display shelves in regions labeled by number-letter combinations forming successive aisles. Two carts 12 in different positions are labeled I and II, respectively. For each cart position, the product identification process for a product inserted into the cart preferably employs a hierarchical search pattern, starting with the immediately adjacent shelving regions, then the regions a short distance away, then the further regions, and then finally releasing the proximity condition altogether. Thus, in the cases illustrated here, product selections for successive iterations of the identification process might be defined as follows:

Cart I:

1. 2C, 3C

2. 2B, 3B, 1D, 2D

3. 1A, 2A, 1C, 4C, 5C

4. Rest of store

Cart II:

1. 2D

2. 2C, 3C, 4C, 5C, 1D, 3D

3. 2B, 3B, 4B, 5B, 1C, 6C

4. Rest of store

It will be noted that the relative "proximity" of successive areas of product display is preferably not based on absolute distance from the cart to the product display but rather the distance along the aisles or other features defining paths of travel followed by the customers moving around the store. For this reason, as mentioned earlier, a tracking system that uses waypoints rather than absolute position is typically sufficient for implementation of the present invention, and even where an absolute position tracking system is used, the processing system will typically translate the position into a system of zones as implied by the examples above.
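A minimal sketch of the expanding-proximity search follows, using the zone tiers of the Cart I example above; the zone-to-product mapping and the function names are illustrative, and the final `None` tier represents releasing the proximity condition altogether:

```python
# Zone tiers for Cart I, from the example above; each tier is tried in
# turn until identification succeeds.
ZONE_TIERS_CART_I = [
    ["2C", "3C"],
    ["2B", "3B", "1D", "2D"],
    ["1A", "2A", "1C", "4C", "5C"],
    None,  # None = rest of store (proximity condition released)
]

def candidates_by_tier(product_map, tiers):
    """Yield successive candidate-SKU subsets, one per proximity tier.
    product_map maps zone id -> list of SKUs stocked in that zone."""
    for tier in tiers:
        if tier is None:
            yield [sku for skus in product_map.values() for sku in skus]
        else:
            yield [sku for zone in tier
                   for sku in product_map.get(zone, [])]
```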

The example of FIG. 7 is only schematic, and details such as the size of each zone and the number of zones defined along the length of each aisle will vary according to the dimensions of a store and the specific needs of a given implementation.

Parenthetically, the conceptual equivalent of the proximity filter in the case of a removal event is the list of items already placed into the cart, which inherently limits the number of candidate products to a very small subset of the products in the database.

In order to achieve highly reliable product identification, the system of the present invention preferably employs an image matching process for final identification of a given product, as described further below. Since, however, such image matching processes are computationally heavy, it is preferred to filter and further reduce the number of candidate products by use of non-image parameters of the products, prior to performing an image matching process. Non-limiting examples of suitable non-image parameters of products which may be used for such filtering include one or more of: at least one dimension (or a volume) of the object; a shape of the object; and a color property of the object. A number of these options will be discussed further below.

In the flow illustrated here, at step 38, the sampled images containing the product are processed to determine a shape of the presented product, and a corresponding subset of the database entries is selected. Shape determination may use standard computer vision techniques that do not require heavy processing, and is preferably greatly simplified by defining the task as a classification task, classifying each presented product into one of a relatively small number of predefined shapes. Exemplary shape classifications preferably include, but are not limited to: rectangular box ("cuboid"), cylindrical, bottle (optionally subdivided by neck geometry to distinguish wine bottles, soda bottles, conical-neck bottles), cartons, frustoconical tubs, flat rectangular packages, and a catch-all classification for "other" shapes that do not fall into one of the above classes.

Shape determination is typically achieved by image processing techniques such as segmentation, to determine the pixels of the images belonging to the presented product (which has typically already been performed based on pixel range data as a preparatory step for the tracking algorithms, as mentioned above), and outlining, to derive silhouettes of the product from the available camera directions, combined with 3D depth data which may give a direct indication of the shape of the part of the product facing the cameras. The derived partial shapes are then compared with shape templates corresponding to each class of shapes to identify a match.

The shape classification is used to select a corresponding "slice" of entries from the product database. If shape classification is inconclusive between two options, the two shape class options are typically combined to select a larger shape "slice".
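A minimal sketch of this shape-based slicing follows; the entry representation is illustrative, and an inconclusive classification is simply expressed by passing more than one shape class:

```python
# Illustrative set of shape classes, following the classification
# scheme discussed above.
SHAPE_CLASSES = {"cuboid", "cylinder", "bottle", "carton",
                 "frustoconical_tub", "flat_pack", "other"}

def shape_slice(entries, shape_options):
    """Select the database 'slice' whose shape class matches any of the
    classified options (two options when classification is inconclusive
    between them)."""
    return [e for e in entries if e["shape_class"] in shape_options]
```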

In a related step 40, which may be performed prior to, subsequent to, or together with shape determination, one or more size parameter for the presented product is determined, and a corresponding database slice selected. The size parameter may be defined in various ways, and may be derived from the silhouettes or other shape data derived in the shape determination. According to one option, the size parameter is simply the largest measured dimension of the product. Alternatively, a parametric definition may be used including the largest dimension and the smallest dimension, or in some cases, three orthogonal dimensions. In one implementation, an estimate of product volume may be used as a non-imaging size-related parameter.

In certain implementations, the size determination may be integrated with the shape classification, benefitting from the determination of product orientation that is typically inherent in the shape classification process. For example, where a shape is classified as a cylinder, the parametric size determination may define an axial length and an outer diameter of the cylinder. In the case of a rectangular cuboid, the parametric definition of size preferably includes all three dimensions measured parallel to the edges of the cuboid. As with the other parameters, a range of values is selected around the measured values sufficient to comfortably encompass product-to-product variations and the degree of precision achieved by the measurement arrangements.

According to certain particularly preferred implementations as illustrated here, the non-image parameters used for filtering the database entries also include a color property of the presented product (step 42). The color property may be variously defined, but is preferably chosen to indicate the predominant color or colors of the product trade dress. According to one non-limiting implementation, the color property may be defined by quantizing the image pixels belonging to the product (according to a segmentation process described above in the context of shape determination) into a relatively small number of colors, such as 256 colors, and then picking out the one or two most prevalent pixel colors in a histogram of the pixel color distribution as a color signature. Alternatively, a full or reduced histogram of colors associated with pixels belonging to the object in the sampled images may be used as the multidimensional "parameter" defining the color property. In cases where a product has different color properties when viewed from different directions, the product may have plural valid color property entries in the database. Here too, where combined with shape determination, the orientation together with the shape determination may be used to enhance the color property determination, for example, disregarding surfaces such as ends of cans which do not typically have distinguishing color properties. Additionally, or alternatively, images or regions of images which have near-uniform color may be disregarded.
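The quantized-histogram color signature described above might be sketched as follows; the number of quantization levels and the signature length are illustrative (4 levels per channel gives 64 bins, a coarser grid than the 256 colors mentioned above, chosen here for brevity):

```python
import numpy as np

def color_signature(pixels, levels=4, top_k=2):
    """Quantize the RGB pixels of the segmented product into levels^3
    bins and return the top_k most prevalent bins as a color signature.
    `pixels` is an (N, 3) integer array of the product's pixels only."""
    step = 256 // levels
    quant = (pixels // step).astype(int)  # per-channel bin index, 0..levels-1
    bins = quant[:, 0] * levels * levels + quant[:, 1] * levels + quant[:, 2]
    counts = np.bincount(bins, minlength=levels ** 3)
    # Most prevalent bins first.
    return [int(i) for i in np.argsort(counts)[::-1][:top_k]]
```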

The above example has related to each of the non-image properties as independently defining a "slice" of the main database. It should be noted however that this approach is only one of a number of ways to use the non-image properties to derive a reduced-size subset for image matching processing. In a further non-limiting example, each non-image parameter of the presented product may be used with a corresponding distance metric, probability distribution or other function to define a "distance", "probability" or other measure relating to the degree of match between the presented product and entries in the database. For single value parameters, the function may be a simple "normal distribution" centered on the measured weight for a "probability distribution", or an inverted bell curve for a "distance distribution". For more complex parameters, such as a color histogram derived from a product image, the measure may be a distance in multi-dimensional space defined by any suitable measure. These measures can then be combined, typically with different weights given to the different measures, to derive an overall score for each product database entry as a match for the presented item. The group of M highest scoring entries (a subset of the full database of N entries) is then chosen for subsequent image-matching processing. If no match is found, an additional subset of the next-highest scoring entries may then be processed. This may be repeated until the score from the non-image parameter matching falls below some predefined threshold at which point a no-match result is returned.
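By way of non-limiting illustration, this score-combination approach might be sketched as follows; the normal-distribution measure, the weights and the entry representation are all illustrative:

```python
import math

def gaussian_measure(measured, reference, sigma):
    """'Probability'-style measure for a single-value parameter (e.g. a
    dimension or weight): a normal curve centered on the database
    reference value, where 1.0 indicates an exact match."""
    return math.exp(-0.5 * ((measured - reference) / sigma) ** 2)

def combined_score(measures, weights):
    """Weighted combination of per-parameter match measures, each in
    [0, 1], into one overall score for a database entry."""
    total = sum(weights[k] for k in measures)
    return sum(measures[k] * weights[k] for k in measures) / total

def top_m(entries, score_fn, m):
    """Return the M highest-scoring database entries, to which the
    image-matching processing is then applied."""
    return sorted(entries, key=score_fn, reverse=True)[:m]
```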

It should be noted that the list of non-image parameters herein is only exemplary, and various implementations may use a subset of the above non-image parameters, variants of these non-image parameters, additional non-image parameters, and any combination thereof. By way of one additional example, all or part of a barcode visible in a sampled image of a presented product may be searched for and used as a non-image parameter. Clearly, if an entire barcode happens to be visible and legible in an image, this gives positive identification of a product and may be used as a basis to bypass unnecessary steps of the identification process and reduce processing load. Even where only a part of the barcode is visible and legible in the images, this information can be a very helpful non-image parameter in determining a reduced subset of candidate matches within the database for further processing.

Whatever technique is employed, the use of non-image parameters to derive a subgroup of candidate entries drastically cuts down the number of candidate database entries for a given product. In a particularly preferred implementation employing location, shape, size and color property parameters, a database of tens-of-thousands of product entries is typically cut down to at most a few tens of products which are still candidates for a given presented product. This reduces the subsequent image matching process, which is inherently computationally heavy, to a small-scale task which can be performed rapidly with standard low-cost computing resources at commercially acceptable rates, typically on the order of about 1 second per product. Thus the candidate database entries are identified, and the corresponding reference images retrieved from the database are employed in an image matching process 44 performed to search for a match for the sampled product images from amongst the candidate database entry reference images.

Image matching can be performed using well known techniques of computer vision which can be implemented using publicly available software modules, such as the Speeded-Up Robust Features (SURF) algorithm, available in an open source version (OpenSURF) written by Chris Evans and available from http://code.google.com/p/opensurf1/.

At step 50, if a match is found in the reference images of a database entry, the product introduced into the cart is identified as the product with the corresponding database entry. In certain particularly preferred implementations, the system then performs the aforementioned weight-based verification 218, based on the change in weight of the contents of the cart and comparison with the expected weight as retrieved from the database. The verification thresholds are chosen to be suited to the precision of weight measurement that is available, as well as leaving a margin of variation in product weights, which may be defined globally, such as a variance of ±5% in the weight of each product, or may be a product-specific acceptable variance defined in the product database, optionally derived by statistical processing of weights of a number of samples of the product. The weight of each newly-added product is determined using the previous total weight of the cart contents as the new base-line "zero" value, so cumulative errors are avoided.

If the weight-based verification is successful, at step 52, the corresponding product is added to the list of products being purchased.

As mentioned earlier, if product identification fails, the location filter is preferably adjusted at 54 and the process repeated. In certain cases, the relative weighting of the other non-image parameter comparisons may be varied depending on the status of the location filter. For example, for many customers walking around a supermarket with their shopping cart, the first-level (high proximity) location based filter may be highly effective to allow identification of the vast majority of products based on a small number of candidate database entries, and it may be unnecessary to apply most or all of the other non-image parameters for filtering. In such an application, only as the location filter is released and the number of candidate database entries increases, would the other filtering criteria be given progressively more weight.

Non-recognition of a product may occur for a number of reasons, for example, if the visual appearance of a product has changed significantly from the standard appearance, such as by removal of an outer wrapper of a product or by obscuring of a major part of the surface of the object, such as if a product is presented within an opaque bag or with a plurality of products obscuring each other. Optionally, where temporary impediments such as obscuration or overlap of products are likely, the user may be presented with notification and/or instructions on how to re-present the product, for example, removing visual obscurations or spatially separating products, in order for the automated recognition process to succeed. Non-recognition may also occur in cases where a new product is stocked or where an existing product's packaging has been changed by the manufacturer without the database having been timely updated.

If, after successive adjustments of the location-based filter and/or re-presentation of the product, still no confirmed product match is found, or if the weight verification fails, indicating a possible "refilling" theft attempt or some other failure of the identification process, the human intervention/assistance procedures of step 216 are initiated. In cases of flagging of a product for intervention, a human customer service assistant is typically called, either physically present on the shop floor or via intercom or video conferencing from a back-office location. Particularly where there may be a concern of refilling, or any other concern of intentional misuse of the system, a customer service assistant physically present is preferred. However, it is expected that the frequency of flagging for intervention will be sufficiently low to allow a low ratio of attendants to customers. As in all retail environments, video monitoring of the store environment to watch for intentional foul-play is recommended.

The system preferably stores, at least for short term recall, a video record of every act involving inserting or removing a product into or from the cart, which can be replayed on demand by the customer service assistant to assess what occurred. If desired, this video may be stored for a defined period after completion of a sale, for use in subsequent quality control or for further investigation of any irregularities.

The process 30 of FIG. 6 preferably runs repeatedly, optionally with temporal overlap between the steps, for processing successive products that are placed by the user into the cart. An analogous process, not shown here separately in detail, performs identification for products photographed in removal events. For products removed from the cart, the process is greatly simplified, since the candidate database entries start from a relatively short list of the products previously added to the cart. As a result, most of the other pre-filtering steps can usually be omitted, or can be implemented with lower threshold values.

Turning now to FIG. 5, this illustrates schematically the flow of a process 76 for creating and updating the product database to support the identification process described herein. The process includes obtaining the product's barcode 77, for example using a barcode scanner, allowing retrieval of the product name and SKU (stock keeping unit), with which all the additional details are stored. Then, depending upon which non-image parameters are chosen for use in the identification process, a typical database entry procedure includes one or all of: weighing of the product 78, sampling images of the product 80, classifying the shape of the product 82, deriving dimensions of the product 84, and deriving a color property of the product 86. A composite database record is then stored containing non-image parameters characterizing a property of the product and at least one associated reference image of the product. Details of each of these processes will be clear by analogy to the corresponding parameters of the identification process as discussed above, and will not be discussed further here.

The hardware required for creating and updating the database typically allows the operator to select particularly preferred directions for sampling images of the product most likely to be useful in the identification process. A single camera can be used sequentially to sample the required images. A keyboard or barcode reader may be used to identify the product within the inventory system, and the data and sampled images derived from the product are then used to automatically generate a new database entry, or to update or supplement an existing entry with a new appearance of an existing product, for example, after a change to the graphics of the product packaging.

Optionally, the registration process may be repeated for a number of samples of a given product, thereby allowing statistical analysis to assess the range of variation in the measured parameters between different samples. This statistical analysis may then be used to set confidence limits determining how narrowly each parametric slice can be defined for each product. For example, a product with very narrow variations in weight between samples may only need to appear in a single weight "slice" whereas a product of the same average weight but with larger variance may require inclusion in two or more adjacent weight "slices". The sampled images may also be stored together with the results of the identification process for subsequent offline analysis (investigation, quality control etc.).
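The assignment of a product to one or more adjacent weight "slices" based on its sample variance might be sketched as follows; the slice width and the two-sigma confidence interval are illustrative assumptions:

```python
def weight_slices(mean_g, std_g, slice_width_g, n_sigma=2):
    """Return the indices of every weight 'slice' overlapped by the
    product's mean +/- n_sigma*std interval, where slice i covers the
    weight range [i*slice_width_g, (i+1)*slice_width_g)."""
    lo = mean_g - n_sigma * std_g
    hi = mean_g + n_sigma * std_g
    return list(range(int(lo // slice_width_g),
                      int(hi // slice_width_g) + 1))
```

A product with very narrow weight variation thus lands in a single slice, while a product of the same average weight but larger variance spans two or more.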

Although the primary example described herein refers to matching of sampled product images to reference images from the database, it should be noted that the reference images in the database may not be simple images, but may instead be three-dimensional models of the products with associated textural information. This allows generation of a reference image of the product from any desired viewpoint, thereby in some cases making the image matching process more robust. Technology for generating three-dimensional models of objects is well known, and will not be described herein. In most cases, use of a database containing one or more two-dimensional reference image of each product is believed to be sufficient, and to simplify the processing in both the database update procedure and the purchase procedure.

Turning now to FIG. 8, the system of the present invention may interface with a range of dedicated and standard hardware components, both on-site and located remotely, which provide the various aspects of the functionality described above and/or additional functionality. In the exemplary non-limiting architecture illustrated here, the server system 126 forms the central hub which interconnects all the other components, including connections to the smart-cart cart-based systems 22 and, either directly or via the cart-based systems, to user smartphones 132. The server typically also interfaces with, or may in some cases be integrated as part of, the store server 234. Server 126 may also interface with a dedicated payment station 236, which is typically associated with an automatic gate, allowing the customer to leave the store after payment has been completed. Dedicated payment station 236 allows completion of a purchase process other than via a smartphone, for example, by credit card or cash transactions. Server 126 preferably also includes wireless networked connection to one or more mobile in-store service device 238 which may be carried by a store assistant, and may be used to intervene in purchase lists currently active, for example, manually adding or removing an item from a customer's list of purchases according to the results of a service intervention, for example, where identification of a certain product failed. Server 126 may also advantageously have access, directly or indirectly, to current or historic data from a system of in-store and/or out-of-store surveillance cameras 240. This may allow recall of relevant imagery according to a certain time-stamp and location within the store in order to assess any suspicious customer actions from one or more additional viewpoint, as well as allowing monitoring of any unauthorized removal of smart carts from the store vicinity.
Optionally, the smart cart system may include a GPS-trackable tag to allow locating and retrieval of any misappropriated cart.

An array of charging terminals 242 is preferably provided for recharging the cart-based systems while not in use, preferably combined into a shopping-cart storage area. Optionally, the charger docking arrangement may also include a data communications capability interconnected with server 126 to allow tracking of which units are currently out-of-use, accounted for, and their state of charge. The charging terminals may also serve as a docking station allowing uploading of video and any other information from the cart-based systems, allowing emptying of the local data storage for subsequent use. Software updates etc. may also be applied via the docking station of a charging terminal.

Server 126 may further interface with an on-site or off-site support center 244, which may provide customer support services via intercom or video conferencing functionality directly to the customer via the cart-based system and/or customer mobile communications device without requiring a physical presence on the shop-floor.

It will be appreciated that the above descriptions are intended only to serve as examples, and that many other embodiments are possible within the scope of the present invention as defined in the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A shopping cart-integrated system comprising:
(a) a shopping cart comprising a set of walls enclosing a product receiving volume having a base and an upper opening;
(b) a sensor arrangement comprising a pair of spaced-apart imaging sensors or a three-dimensional image sensor, said sensor arrangement being deployed in fixed relation to said product receiving volume so as to have an imaging field of view substantially spanning said upper opening; and
(c) a processing system including at least one processor, said processing system being in data communication with said sensor arrangement, said processing system being configured to process data from said sensor arrangement so as:
(i) to sense the presence of an object in a region adjacent to said upper opening;
(ii) to track motion of the object and to determine whether the motion corresponds to an insertion event in which the object is inserted into the product receiving volume; and
(iii) to perform a product identification process comprising employing at least one non-image parameter to filter a database of products to derive a subset of candidate products from said database and comparing a sampled image of the object derived from said sensor arrangement with reference images of the subset of candidate products by an image matching process to identify the object as a product within the database.
2. The system of claim 1, wherein said sensor arrangement further comprises a weighing device associated with the base of the product receiving volume and deployed for weighing objects placed within the product receiving volume.
3. The system of claim 2, wherein said processing system is further configured to:
(a) after determination of an insertion event, monitor for a change in weight measured by said weighing device; and
(b) compare the change in weight to an expected weight of at least one product.
4. The system of claim 3, wherein, if said change in weight does not match said expected weight to within a predefined margin of error, a suspect-item flag is generated for communication to a sales supervisor.
5. The system of claim 1, further comprising an indoor tracking system including at least one tracking system component associated with said shopping cart, said indoor tracking system being associated with said processing system so as to provide to said processing system an indication of a location of said shopping cart within a mapped store.
6. The system of claim 5, wherein said product identification process further comprises performing a pre-filtering selection from said database of products to derive a locality-based subset of candidate products from said database, said locality-based subset of candidate products being derived from a location of said shopping cart and a product map of product locations within the mapped store.
7. The system of claim 6, wherein said processing system is further configured such that, in a case in which said product identification process has initially failed to identify the object as a product within said locality-based subset of candidate products, said processing system repeats said product identification process with said pre-filtering selection used to select a second subset of candidate products derived from the location of said shopping cart and the product map of product locations within the mapped store based on a second-level proximity condition.
8. The system of claim 6, wherein said processing system is further configured such that, in a case in which said product identification process has initially failed to identify the object as a product within said locality-based subset of candidate products, said processing system repeats said product identification process without said pre-filtering selection based on locality.
9. The system of claim 1, wherein said processing system is further configured to process image data and range data derived from said sensor arrangement to derive at least one non-image parameter relating to the object for use in said product identification process, the at least one non-image parameter being selected from the group consisting of: at least one dimension of the object; a shape of the object; and a color property of the object.
10. The system of claim 1, wherein said processing system is implemented at least in part through at least one computer located remotely relative to said shopping cart, the system further comprising wireless networking components associated with said shopping cart and said at least one computer.
11. The system of claim 1, wherein said processing system is further configured to determine whether the motion of the object corresponds to a removal event in which an object previously inserted into the product receiving volume is removed from the product receiving volume.
12. The system of claim 1, further comprising a payment system in data communication with said processing system, said payment system providing a payment user interface configured to present to a user a list of purchased items corresponding to products inserted into the product receiving volume and to receive payment from the user for said list of purchased items.
13. The system of claim 1, further comprising a wired or short-range wireless data connection associated with said shopping cart and configured for data communication with a user mobile communications device to provide information to the user mobile communications device for display in a graphic user interface of the user mobile communications device.
14. A method comprising the steps of:
(a) employing the shopping cart-integrated system of claim 1:
(i) to sense the presence of an object in a region adjacent to said upper opening;
(ii) to track motion of the object and to determine whether the motion corresponds to an insertion event in which the object is inserted into the product receiving volume; and
(iii) to perform a product identification process comprising employing at least one non-image parameter to filter a database of products to derive a subset of candidate products from said database and comparing a sampled image of the object derived from said sensor arrangement with reference images of the subset of candidate products by an image matching process to identify the object as a product within the database;
(b) for each insertion event, adding a corresponding identified product to a list of items purchased; and
(c) processing a customer confirmation to complete a sale based on said list of items purchased, without requiring a checkout product scan.
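The identification cascade recited in claims 1 and 6-8 — pre-filter by locality, widen to a second-level proximity condition, then fall back to the unfiltered database, applying a non-image parameter filter and image matching at each stage — can be sketched as follows. This is an illustrative sketch only: the `Product` fields, the aisle-number locality test, the height-based non-image filter, and the trivial `match_score` stand-in are all hypothetical choices for this example, not details from the specification:

```python
from dataclasses import dataclass

@dataclass
class Product:
    # Hypothetical product record; field names are illustrative.
    sku: str
    weight_g: float
    height_mm: float
    aisle: int
    reference_image: str  # stand-in for stored reference imagery

def match_score(sampled_image, reference_image):
    # Stand-in for the image-matching process of claim 1(iii); a real
    # system would use feature matching or a learned embedding.
    return 1.0 if sampled_image == reference_image else 0.0

def identify(sampled_image, object_height_mm, cart_aisle, database,
             tolerance_mm=10.0, threshold=0.8):
    """Cascade over progressively wider candidate subsets (claims 6-8)."""
    candidate_sets = (
        [p for p in database if p.aisle == cart_aisle],           # claim 6: locality subset
        [p for p in database if abs(p.aisle - cart_aisle) <= 1],  # claim 7: second-level proximity
        list(database),                                           # claim 8: no locality pre-filter
    )
    for candidates in candidate_sets:
        # Non-image parameter filter (claim 1(iii)): here, object height.
        subset = [p for p in candidates
                  if abs(p.height_mm - object_height_mm) <= tolerance_mm]
        best = max(subset, default=None,
                   key=lambda p: match_score(sampled_image, p.reference_image))
        if best and match_score(sampled_image, best.reference_image) >= threshold:
            return best
    return None  # unidentified: candidate for a suspect-item flag
```

Note that the weight check of claims 3-4 would run after this identification, comparing the measured weight change against the matched product's expected weight.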
PCT/IB2017/053904 2016-06-30 2017-06-29 Shopping cart-integrated system and method for automatic identification of products WO2018002864A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201662356558P true 2016-06-30 2016-06-30
US62/356,558 2016-06-30

Publications (2)

Publication Number Publication Date
WO2018002864A2 true WO2018002864A2 (en) 2018-01-04
WO2018002864A3 WO2018002864A3 (en) 2018-11-08

Family

ID=60787247

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2017/053904 WO2018002864A2 (en) 2016-06-30 2017-06-29 Shopping cart-integrated system and method for automatic identification of products

Country Status (1)

Country Link
WO (1) WO2018002864A2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018144650A1 (en) * 2017-01-31 2018-08-09 Focal Systems, Inc. Automated checkout system through mobile shopping units
GB2562131A (en) * 2017-05-05 2018-11-07 Arm Kk Methods, systems and devicesfor detecting user interactions
US10319198B2 (en) 2016-05-02 2019-06-11 Focal Systems, Inc. Expedited checkout system through portable checkout units

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09307134A (en) * 1996-05-13 1997-11-28 Fujitsu Ltd Light receiving element and its optical module and optical unit
US8146811B2 (en) * 2007-03-12 2012-04-03 Stoplift, Inc. Cart inspection for suspicious items
US8325982B1 (en) * 2009-07-23 2012-12-04 Videomining Corporation Method and system for detecting and tracking shopping carts from videos
US20120320214A1 (en) * 2011-06-06 2012-12-20 Malay Kundu Notification system and methods for use in retail environments
US20150095189A1 (en) * 2012-03-16 2015-04-02 In Situ Media Corporation System and method for scanning, tracking and collating customer shopping selections
US9664510B2 (en) * 2013-06-22 2017-05-30 Intellivision Technologies Corp. Method of tracking moveable objects by combining data obtained from multiple sensor types

Also Published As

Publication number Publication date
WO2018002864A3 (en) 2018-11-08

Similar Documents

Publication Publication Date Title
US9757002B2 (en) Shopping facility assistance systems, devices and methods that employ voice input
US10339595B2 (en) System and method for computer vision driven applications within an environment
US20160110786A1 (en) Method, computer program product, and system for providing a sensor-based environment
US20160140397A1 (en) System and method for video content analysis using depth sensing
US10268983B2 (en) Detecting item interaction and movement
US8774462B2 (en) System and method for associating an order with an object in a multiple lane environment
US9697429B2 (en) Method and apparatus for image processing to avoid counting shelf edge promotional labels when counting product labels
US10176456B2 (en) Transitioning items from a materials handling facility
US9171442B2 (en) Item identification using video recognition to supplement bar code or RFID information
US10573141B2 (en) Security system, security method, and non-transitory computer readable medium
JPWO2015033577A1 (en) Customer behavior analysis system, customer behavior analysis method, customer behavior analysis program, and shelf system
CN101268478B (en) Method and apparatus for detecting suspicious activity using video analysis
US8876001B2 (en) Methods and apparatus for image recognition in checkout verification
US8254633B1 (en) Method and system for finding correspondence between face camera views and behavior camera views
US20190172039A1 (en) Information processing system
CN105324714B (en) Computer control, unattended automatic checkout retail shop
US10242267B2 (en) Systems and methods for false alarm reduction during event detection
US9396622B2 (en) Electronic article surveillance tagged item validation prior to deactivation
US20180165733A1 (en) Notification system and methods for use in retail environments
US9916561B2 (en) Methods, devices and computer readable storage devices for tracking inventory
CN204613978U (en) For managing the system of the assets in specific environment
US20150029339A1 (en) Whole Store Scanner
WO2014004576A1 (en) Image-augmented inventory management and wayfinding
US8430311B2 (en) Systems and methods for merchandise automatic checkout
CN104781857A (en) Mobile retail peripheral platform for handheld devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17819465

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17819465

Country of ref document: EP

Kind code of ref document: A2