WO2017055890A1 - Interactive product auditing with a mobile device - Google Patents

Interactive product auditing with a mobile device

Info

Publication number
WO2017055890A1
Authority
WO
WIPO (PCT)
Prior art keywords
product
interest
key performance
region
auditing device
Prior art date
Application number
PCT/IB2015/002064
Other languages
French (fr)
Inventor
Diego GARCÍA MORATE
Antonio HURTADO GARCÍA
Original Assignee
The Nielsen Company (Us), Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Nielsen Company (Us), Llc filed Critical The Nielsen Company (Us), Llc
Priority to EP15905278.6A priority Critical patent/EP3357019A4/en
Priority to PCT/IB2015/002064 priority patent/WO2017055890A1/en
Priority to US14/894,901 priority patent/US10796262B2/en
Priority to EP21162114.9A priority patent/EP3862948A1/en
Publication of WO2017055890A1 publication Critical patent/WO2017055890A1/en
Priority to US17/062,159 priority patent/US11562314B2/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06393 Score-carding, benchmarking or key performance indicator [KPI] analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/08 Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087 Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241 Advertisements
    • G06Q30/0242 Determining effectiveness of advertisements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]

Abstract

Interactive product auditing with a mobile device is described. Example methods disclosed herein include performing, with an auditing device, image recognition based on a first set of candidate patterns accessed by the auditing device to identify a first product in a first region of interest of a segmented image. The disclosed example methods also include prompting, with the auditing device, a user to enter input associated with a first grid of the first region of interest displayed on a display, the first grid including the first product. The disclosed example methods further include determining, with the auditing device, a second set of candidate patterns to use to identify a second product in a second region of interest of the segmented image, the second set of candidate patterns determined based on the user input and a group of products identified in a neighborhood of the first region of interest.

Description

FIELD OF THE DISCLOSURE
[0001] This disclosure relates generally to product auditing and, more particularly, to interactive product auditing with a mobile device.
BACKGROUND
[0002] Shelf audits are typically performed by sending auditors to stores to collect information about different products in the stores. In some examples, shelf audits are completed by performing image recognition on point of sale images taken by the auditors. For example, retail establishments, product manufacturers, and/or other business establishments may take advantage of image recognition techniques performed on photographs taken in such establishments (e.g., pictures of product shelving) to identify quantities and/or types of products in inventory, to identify shelves that need to be restocked and/or the frequency with which products need restocking, to recognize and read product barcodes, to assess product arrangements and displays, etc. Image recognition may be used to identify consumer packaged goods displayed on store shelves. In some examples, image recognition applications or programs attempt to identify products depicted in images of a shelf taken at a point-of-sale. After the image recognition application or program has analyzed the point-of-sale image, an auditor manually reviews the results to verify the accuracy and/or make corrections. An auditor typically has to adjust or modify information in the results.
BRIEF DESCRIPTION OF THE DRAWINGS
[0003] FIG. 1 is a block diagram of an example environment in which an example auditing device constructed in accordance with the teachings of this disclosure operates to perform interactive product auditing.
[0004] FIG. 2 is a block diagram of an example implementation implemented by the auditing device of FIG. 1.
[0005] FIGS. 3-12 are example implementations of a user interface on the auditing device of FIGS. 1 and/or 2 to perform interactive product auditing.
[0006] FIG. 13 is a flowchart representative of example machine-readable instructions for interactive product auditing that may be executed by the example auditing device of FIGS. 1 and/or 2.
[0007] FIG. 14 is a flowchart representative of example machine-readable instructions for updating a candidate pattern list that may be executed by the example auditing device of FIGS. 1 and/or 2.
[0008] FIG. 15 is a flowchart representative of example machine-readable instructions for updating key performance indicators that may be executed by the example auditing device of FIGS. 1 and/or 2.
[0009] FIG. 16 is a block diagram of an example processor platform structured to execute the example machine-readable instructions of FIGS. 13, 14, and/or 15 implemented by the example auditing device of FIGS. 1 and/or 2.
[0010] Wherever possible, the same reference numbers will be used throughout the drawing(s) and accompanying written description to refer to the same or like parts.
DETAILED DESCRIPTION
[0011] Manufacturers are interested in measuring the effectiveness of product advertisements. In some examples, manufacturers perform shelf audits to analyze how products are being sold in stores and to measure Key Performance Indicators (KPIs) that provide information related to the manners in which the products are presented in the stores and whether the stores are displaying the products according to the manufacturers' specifications. Typically, shelf auditing is a labor intensive and costly task. For example, in prior shelf audits, a sales representative visits each store and manually collects one or more variables related to the display of each product of interest. Such variables may include in-store location, number of facings, whether product holes are present, and whether products are out-of-stock. In some examples, collecting product information includes the sales representative manually scanning barcodes for products that appear on the shelves, which is potentially time consuming. Furthermore, the quality of the data collected using such methods may be inaccurate.
[0012] In some examples, the audit data is collected using image recognition techniques, which allow the process to be partially automated. In some such examples, the sales representative's involvement in the audit is limited to taking pictures of the shelves that are to be audited. Using image recognition techniques is typically more accurate than having the sales representative manually scan barcodes to obtain information, but requires that the pictures be sent to a central location for processing. In some such examples, processing the images and verifying the results is performed by a human and is time consuming, inefficient, and costly.
[0013] Disclosed herein are example auditing methods, apparatus/systems, and articles of manufacture (e.g., physical storage media) that may be implemented to perform interactive product auditing with an auditing device using image recognition, thus improving the speed at which results are obtained, as well as the accuracy of the results. Interactive product auditing as disclosed herein can significantly reduce the turn-around time of receiving shelf audit results, enabling the user to capture the point of sale image(s) and immediately view and modify results (e.g., which may include a segmented image created based on the point of sale image, Key Performance Indicators (KPIs), etc.) using the auditing device. Such interactive product auditing can increase the accuracy and efficiency of the results. For example, the user can fix errors in the results and/or immediately collect more information related to the products recognized in the point of sale image. Thus, in some examples, the results transmitted from the auditing device and obtained at the time of the in-store audit are the final results and do not require additional processing.
[0014] In examples disclosed herein, shelf auditing is completed using an example auditing application executed on the auditing device. The example auditing application includes a user interface that enables a user (e.g., the sales representative) to capture one or more point-of-sale images of a product shelf using a camera on the auditing device. In some examples, the auditing application analyzes the quality of the point of sale images and, if necessary, performs image stitching using the auditing device to create a single image from multiple point-of-sale images of the same product shelf. In some examples, a set of candidate patterns (e.g., a candidate pattern list) is used as a guide when performing the image recognition. In some such examples, the initial candidate pattern list is determined based on the store, a product type, and/or a user input, etc.
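The initialization of the candidate pattern list described above can be sketched as a simple filter over the downloaded patterns. This is an illustrative sketch only: the patent does not specify a data model, so the pattern fields ("stores", "type") and the ids below are hypothetical.

```python
# Hypothetical pattern records; the patent does not define this schema.
def select_initial_candidates(patterns, store_id, product_type=None):
    """Filter downloaded patterns to an initial candidate pattern list
    based on the store being audited and an optional user-supplied
    product type."""
    candidates = [p for p in patterns if store_id in p["stores"]]
    if product_type is not None:
        candidates = [p for p in candidates if p["type"] == product_type]
    return candidates

patterns = [
    {"id": "cola-12oz", "type": "beverage", "stores": {"s1", "s2"}},
    {"id": "chips-bbq", "type": "snack", "stores": {"s1"}},
    {"id": "cola-2l", "type": "beverage", "stores": {"s2"}},
]
print([p["id"] for p in select_initial_candidates(patterns, "s1", "beverage")])
# -> ['cola-12oz']
```

A narrower initial list reduces the number of pattern comparisons the on-device image recognition must attempt for the first shelf.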
[0015] In some examples, the auditing application prompts the user, via the user interface, for an input related to the position (e.g., top, middle, or bottom) of a region of interest (e.g., each shelf). Based on the input from the user, the auditing device, in some examples, performs image recognition on the shelf individually and displays, via the user interface, the auditing results for the shelf to the user. The results may include, for example, a segmented image created from the point-of-sale image using image recognition techniques and KPIs indicating variables related to each product(s) depicted in the image. The segmented image includes, for example, a region of interest, a grid, a confidence level associated with each shelf, and an indication of whether the user reviewed the results for each shelf. The KPIs include, for example, respective shares of shelf space occupied by the different products, a task list for the user to complete, assortment compliance, etc. In some examples, the auditing application enables the user to modify the results in an interactive way, such as allowing the user to fix errors in the results, including errors in the segmented image and/or the KPIs. Errors can include, for example, a failure of the image recognition system to find a product in a particular location on the shelf, a misidentification of a product, a misidentification of one or more variables associated with the product (e.g., share of shelf, number of facings, etc.), etc.
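One of the KPIs mentioned above, share of shelf, can be sketched as the ratio of a product's facings to the total facings identified on the shelf. The patent does not define an exact formula, so this computation is illustrative.

```python
def share_of_shelf(facings):
    """Compute each product's share of shelf from a mapping of
    product id -> number of facings identified in the segmented image.
    Illustrative only; the patent does not specify this formula."""
    total = sum(facings.values())
    if total == 0:
        return {}
    return {pid: count / total for pid, count in facings.items()}

# e.g., 3 facings of product A and 1 facing of product B on the shelf
print(share_of_shelf({"A": 3, "B": 1}))
# -> {'A': 0.75, 'B': 0.25}
```

Because the computation runs on the auditing device, a user correction to the number of facings can update the displayed KPI immediately.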
[0016] In some examples, the interactive auditing application executed by the auditing device considers input(s) from the user related to a first region of interest (e.g., a first shelf) in the segmented image when performing image recognition on the remaining regions of interest (e.g., other shelves), thus increasing the accuracy of image recognition results for the subsequent shelves. For example, the candidate pattern list used to recognize products on subsequent shelves can be updated based on the input from the user and the products identified in relation to the first shelf. In some such examples, the input(s) from the user include a verification of products identified or a modification of the results due to an error in the recognition of the products. In some examples, the user verifies and/or modifies the results of each region of interest on the auditing device prior to transmitting the results to a server for view by a client. In such examples, the auditing device sends the results (e.g., segmented image, KPIs, etc.) and the point of sale images to a central server. In some examples, the auditing device performs the shelf audit without requiring an internet connection and later connects to the internet to transmit the results to the central server.
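The shelf-by-shelf flow described above can be sketched as a loop that recognizes one region of interest, has the user verify or correct the result, and folds the verified products back into the ordering of the candidate list for the next region. The function names and the re-ranking heuristic below are assumptions for illustration, not the patent's algorithm.

```python
def audit_shelves(regions, candidates, recognize, verify):
    """Process regions of interest in order; after each user-verified
    result, move patterns confirmed on that shelf to the front of the
    candidate list for the next shelf (an assumed heuristic)."""
    results = []
    for region in regions:
        identified = recognize(region, candidates)  # stand-in for image recognition
        verified = verify(region, identified)       # user confirms or corrects
        results.append(verified)
        # Try already-confirmed patterns first on the next shelf.
        candidates = sorted(candidates, key=lambda p: p not in verified)
    return results
```

With stub `recognize`/`verify` callables, the loop shows how a verified first shelf re-ranks candidates before the second shelf is evaluated.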
[0017] FIG. 1 is a block diagram of an example environment 100 in which an example auditing device 102, constructed in accordance with the teachings of this disclosure, operates to perform interactive audits of product shelves in a store. In the illustrated example, the auditing device 102 is a smartphone. However, in other examples, the auditing device 102 can be, for example, a mobile device, a tablet, a laptop computer, and/or any other suitable device. The example auditing device 102 is described in further detail in connection with FIG. 2 below.
[0018] The example environment 100 includes an example central server 104 communicatively coupled to the auditing device 102 to synchronize information with the auditing device 102. In some examples, the example central server 104 communicates with the auditing device 102 via a wireless internet network. Additionally or alternatively, in some examples, the central server 104 communicates with the auditing device 102 using any other suitable communication protocol, including but not limited to, a cellular network, a data network, Bluetooth, Radio-Frequency Identification (RFID), Near Field Communication (NFC), or a wired internet connection, etc. In some examples, product shelf audit data and/or results are communicated between the central server 104 and the auditing device 102. For example, the central server 104, in some examples, transmits patterns and/or images to the auditing device 102. In some examples, the auditing device 102 transmits reported results (e.g., image-based results and/or KPIs) to the central server 104.
[0019] In the illustrated example, the example environment 100 includes an example pattern database 106 in communication with the central server 104 via any wired and/or wireless network. The example pattern database 106 includes, in some examples, patterns corresponding to products to be audited by the auditing device 102. In some examples, the auditing device 102 performs image recognition using the patterns (e.g., which may be reference images, graphics, etc., of products-of-interest) to match the patterns with products on the product shelf. In some examples, the example pattern database 106 communicates with the central server 104 to synchronize patterns to the auditing device 102. Additionally or alternatively, the auditing device 102 may be in direct communication with the pattern database 106. In some examples, the patterns are communicated to the auditing device 102 prior to the user arriving at a store to perform an audit. In such examples, the user is able to audit the products in the store without reconnecting to the central server 104 and/or the pattern database 106. In some examples, the auditing device 102 may be in communication (e.g., via a wireless network) with the central server 104 and/or the pattern database 106 while performing the product shelf audit in the store. In some examples, the auditing device 102 creates a new pattern by identifying a product on the product shelf that does not match an existing pattern. In some such examples, the auditing device 102 communicates the new pattern to the central server 104 and/or the pattern database 106. In some examples, the example pattern database 106 is implemented by a server. Additionally or alternatively, the pattern database 106 can be implemented by, for example, a mass storage device, such as a hard drive, a flash disk, a flash drive, etc.
[0020] In some examples, the illustrated example environment 100 includes an image database 108. In some examples, the central server 104 is in communication with the image database 108 via a wired and/or wireless network. In some examples, the example central server 104 synchronizes data and/or images between the example image database 108 and the example auditing device 102. Additionally or alternatively, in some examples, the auditing device 102 is in direct communication with the image database 108. In some examples, the auditing device 102 transmits reported image-based results and/or point of sale images to the central server 104 and/or the image database 108. In some such examples, the central server 104 communicates the image-based results and/or point of sale images to the image database 108. In some examples, the auditing device 102 transmits the image-based results and/or the point of sale images immediately after obtaining the image-based results and/or the point of sale images. In other examples, the auditing device 102 delays transmittal of the image-based results and/or the point of sale images until the auditing device 102 is in communication with the central server 104 via a network connection (e.g., such as a wireless and/or wired internet connection). In some examples, the image database 108 transmits point of sale images to the auditing device 102 and/or the central server 104. In some examples, the image database 108 is in communication with the central server 104, the auditing device 102, and/or the pattern database 106 via any wired or wireless connection. In some examples, the example image database 108 is implemented by a server. Additionally or alternatively, the image database 108 can be implemented by, for example, a mass storage device, such as a hard drive, a flash disk, a flash drive, etc.
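The delayed-transmittal behavior described above (audit offline, transmit when a connection is available) can be sketched as a small outbox queue on the device. Everything here, including the class and method names, is an illustrative assumption.

```python
class ResultOutbox:
    """Queue audit results locally and flush them to the central
    server once a network connection is available (illustrative
    sketch; not the patent's implementation)."""

    def __init__(self, send):
        self.send = send      # callable that uploads one result
        self.pending = []     # results awaiting transmittal

    def submit(self, result, online):
        self.pending.append(result)
        if online:
            self.flush()

    def flush(self):
        # Upload queued results in the order they were produced.
        while self.pending:
            self.send(self.pending.pop(0))
```

A queue like this lets the in-store audit complete without an internet connection while preserving the order of results for the server.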
[0021] FIG. 2 is a block diagram of an example implementation of the auditing device 102 of FIG. 1. In some examples, the example auditing device 102 includes an example auditing device processor 202 structured to enable the auditing device to perform a product shelf audit interactively. In some such examples, the processor 202 is operatively coupled to additional components of the auditing device 102, such as an example camera 204, an example display 206, and/or an input/output (I/O) and/or other communication interface 208.
[0022] In the illustrated example, the auditing device 102 includes the example camera 204 operatively coupled to the processor 202. In some examples, the camera 204 captures point of sale image(s) of a region of interest (e.g., a product shelf) and communicates the image(s) to the processor 202. In some examples, the camera 204 is capable of scanning barcodes to provide additional input related to the products in the point of sale image(s), and may communicate the barcode to the processor 202.
[0023] The example auditing device 102 of the illustrated example includes an example display 206 operatively coupled to the processor 202. The display 206, in some examples, presents results to the user via a user interface (e.g., an interactive and/or graphical user interface) implemented by the example processor 202 of the auditing device 102. In some examples, the display 206 is a touchscreen to simplify interaction between the auditing device 102 and the user when providing input related to the displayed results. In some examples, the user provides input in response to prompts on the display 206 communicated via the user interface. In some examples, the user provides input to correct errors in the results presented to the user on the display 206 via the user interface. FIGS. 3-12 depict example representations of the user interface that may be implemented by the example processor 202 using the example display 206 on the example auditing device 102.
[0024] In some examples, the auditing device 102 includes an example input/output (I/O) interface 208 operatively coupled to the processor 202. The I/O interface 208 is operative to communicate with, in some examples, the central server 104, the pattern database 106, and/or the image database 108 of FIG. 1. In some examples, the I/O interface 208 is operative to interactively communicate with the user using, for example, the display 206, a button on the auditing device, a voice command, a gesture, a sensor to receive input from the user, etc. In some such examples, the I/O interface 208 enables the user to provide input to the user interface, via the display 206, related to the products on the region of interest (e.g., the product shelf) and/or the results displayed to the user.
[0025] An example implementation of the processor 202 of the example auditing device 102 is also depicted in FIG. 2. In some examples, the example processor 202 of the auditing device 102 includes an example image segmentor 210. In some examples, the example image segmentor 210 of the processor 202 receives the point of sale images from the camera 204. In some examples where multiple point of sale images are captured for a product shelf being evaluated, the image segmentor 210 performs image stitching to combine the multiple images to create a single image corresponding to the product shelf. An example image stitching process that may be used in connection with the example image segmentor 210 disclosed herein is described in detail in International Patent Application No. PCT/ES2015/00007S (International Patent Publication No. ), titled Methods and Apparatus to Capture Photographs Using Mobile Devices, and filed on June 18, 2015, which is hereby incorporated by reference in its entirety.
[0026] In some examples, the image segmentor 210 defines segments in the image that may contain products to be identified. In some examples, the image segmentor 210 designates the locations of the segments in a segmented image by defining shapes (e.g., rectangles/boxes, etc.) around the segments and/or products. As used herein, the term "segmented image" refers to a point of sale image that has been segmented by the image segmentor 210, and when displayed, the segmented image includes, for example, the image content of the original image and the shapes (e.g., rectangles/boxes, etc.) defining the products identified in the image. In some examples, the segmented image is displayed as a portion of the results (e.g., the image-based results) via the user interface and the display 206. In some such examples, the image segmentor 210 displays the segmented image to the user via the user interface to enable a user to verify that the image is properly segmented and/or correct errors in the segmented image. In some examples, the user designates segments to be added to the segmented image when reviewing and interacting with the results using the user interface on the display 206 of the auditing device 102. In some such examples, the user interface of the auditing device 102 prompts the user to define and/or redefine the segments in the segmented image. In other such examples, the user defines additional segments and/or redefines existing segments to correct segmentation error(s) made by the image segmentor 210. For example, a segmentation error includes failing to create a segment for a product on the shelf that is to be identified, creating a segment where there is no product to be identified, creating a segment including too many products to be identified, etc. In some examples, a segment is created where there is no product on the shelf, which may correspond to an out-of-stock product expected (e.g., based on stored/retrieved information from prior audit results) to be in that location on the shelf. An example image-based result including a segmented image designating example segments using boxes is shown in FIG. 3, which includes an example depiction of an example image-based result 300 displayed by the example user interface implemented by the example processor 202.
[0027] In some examples, the segments defined by the image segmentor 210 include regions of interest. The regions of interest, in some examples, correspond to shelves (e.g., shelves of a product shelving unit) identified in the image of the product shelving unit. Examples of such regions of interest corresponding to shelves are designated by, for example, box 306 of FIG. 3. Additionally or alternatively, in some examples, the regions of interest correspond to an entire product shelving unit, a product type, an individual product, and/or any other area in the point of sale image designated by the user as a region of interest. As used herein, the term "product shelving unit" refers to a section of store shelving that includes multiple shelves, the terms "shelf" and/or "shelves" refer to individual shelves of the product shelving unit, the term "product type" refers to products on the product shelving unit and/or a shelf that are identified as the same product, and the term "individual product" refers to each product on the product shelving unit and/or a shelf, regardless of product type.
[0028] In some examples, the segments defined by the image segmentor 210 include grids. In some examples, the grids correspond to a product type (e.g., multiple instances of an individual product of the same product type are included in the grid). Examples of grids corresponding to the product type are depicted by, for example, box 308 of FIG. 3. Additionally or alternatively, in other examples, the grids correspond to the product shelving unit, a shelf, or an individual product.
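The grid notion above, where adjacent facings of one product type collapse into a single grid, can be sketched as run-length grouping of the identified facings along a shelf. This grouping rule is an illustrative assumption.

```python
from itertools import groupby

def facings_to_grids(facings):
    """Collapse consecutive facings of the same product type into grids,
    returning one (product_type, facing_count) pair per grid.
    Illustrative sketch of the grid grouping described above."""
    return [(ptype, len(list(run))) for ptype, run in groupby(facings)]

print(facings_to_grids(["cola", "cola", "chips", "chips", "chips", "cola"]))
# -> [('cola', 2), ('chips', 3), ('cola', 1)]
```

Note that the same product type separated by another product forms two grids, which is why grid position (not just product type) matters for the neighborhood computations later in the disclosure.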
[0029] In some examples, the processor 202 includes an example candidate pattern selector 212. The example candidate pattern selector 212, in some examples, communicates with the pattern database 106 to download patterns from the pattern database 106 to the auditing device 102. A pattern, in some examples, includes a reference image of a product, a graphical representation of a product, logos/brand information depicted on product packaging, etc. In some examples, the candidate pattern selector 212 selects patterns to download (e.g., downloaded patterns) based on a store and/or a type of store being audited and/or a user performing the audit. In some such examples, the candidate pattern selector 212 selects and downloads the downloaded patterns to the auditing device 102 prior to the user beginning the shelf audit. In some examples, the candidate pattern selector 212 selects and downloads the downloaded patterns after the audit is initiated. In some examples, the candidate pattern selector 212 selects a first set of patterns (e.g., a first candidate pattern list) from the downloaded patterns to be used by an example product identifier 214 (described in further detail below) to evaluate a first region of interest (e.g., a first product shelf). In some such examples, the first set of patterns is selected from the downloaded patterns based on a product type or a store type associated with the product shelf being evaluated. In some such examples, the product type is designated by an input from a user via the user interface.
[0030] In some examples, in response to a verification of the products identified by the product identifier 214 in the first region of interest, the candidate pattern selector 212 receives an indication of the patterns used by the product identifier 214 during the evaluation of the first region of interest and/or an indication of the patterns matching products in the first region of interest. In some such examples, the candidate pattern selector 212 selects, based on the first set of patterns and/or the received indication(s) of the patterns associated with the first region of interest, a second set of patterns (e.g., a second candidate pattern list) to be used by the product identifier 214 to evaluate a second region of interest in the segmented image. In some such examples, the candidate pattern selector 212 determines a neighborhood of the products identified in the first region of interest to assist in choosing the second set of patterns. In some examples, the neighborhood for a given product includes products (and/or grids of products) immediately adjacent to and/or within a particular number of grids away from the given product identified in the first region of interest. In some examples, the neighborhood of a given product identified in the first region of interest includes the products identified in the first region of interest, other products identified in the product shelf containing the given product, other products identified in verified regions of interest of the segmented image, and/or products identified in unverified regions of interest of the segmented image. In some examples, the candidate pattern selector 212 chooses the second set of patterns based on one or more of a product category, a category level, a store, etc. In some such examples, the product category, the category level, or the store may be determined from the segmented image and/or based on a user input. In some examples, the candidate pattern selector 212 chooses a new set of patterns to be used to evaluate different regions of interest in the segmented image. For example, if the segmented image includes five regions of interest, the candidate pattern selector 212 may select a new set of patterns after each of the regions of interest in the segmented image is verified. In some such examples, the candidate pattern selector 212 evaluates information related to the products identified in verified region(s) of interest to select the new set of patterns used to evaluate a subsequent region of interest.
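A minimal sketch of the neighborhood-based selection described in paragraph [0030]: gather the products identified within a given number of grids of a verified product, then keep only the downloaded patterns related to those neighbors. The grid layout, the brand-based relatedness heuristic, and all names are illustrative assumptions; the disclosure does not prescribe a specific data structure or relatedness test.

```python
def neighborhood(grid, row, col, radius=1):
    """Products identified in grid cells within `radius` grid steps of (row, col)."""
    found = set()
    for r in range(row - radius, row + radius + 1):
        for c in range(col - radius, col + radius + 1):
            if (r, c) != (row, col) and (r, c) in grid:
                found.add(grid[(r, c)])
    return found

def select_second_set(downloaded_patterns, neighbor_products, brand_of):
    """Second candidate list: patterns sharing a brand with a neighboring product."""
    brands = {brand_of(p) for p in neighbor_products}
    return [pat for pat in downloaded_patterns if pat["brand"] in brands]

# Hypothetical verified region: (row, col) -> identified product id.
grid = {(0, 0): "cola-330", (0, 1): "cola-500", (1, 1): "tissue-10"}
neighbors = neighborhood(grid, 0, 0)
second_set = select_second_set(
    [{"id": "p1", "brand": "cola"}, {"id": "p2", "brand": "soap"}],
    neighbors,
    brand_of=lambda pid: pid.split("-")[0],
)
```

The same scheme extends to the other neighborhood definitions the paragraph lists (whole shelf, verified regions, unverified regions) by widening what `neighborhood` collects.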
[0031] In some examples, the example product identifier 214 of the processor 202 uses image recognition techniques to identify products in, for example, a region of interest of a segmented image, a grid of the segmented image, etc. In some examples, the product identifier 214 compares the products in the region(s) of interest and/or the grid(s) to the respective set of patterns obtained for that region/grid (e.g., the first set of patterns is used for the first region of interest, the second set of patterns is used for the second region of interest, etc.). For example, to evaluate the products in a first region of interest, the product identifier 214 of the illustrated example compares the products to the first set of patterns to find a pattern that matches a product in the first region of interest. In some examples, a product that matches a pattern is referred to as an identified product. In some examples, the product identifier 214 displays the identified product in the corresponding grid of the segmented image for verification by the user. An example identified product matching a pattern 310 is shown in the example image-based results 300 of FIG. 3. In some examples, the product identifier identifies an out-of-stock product in a segment (e.g., which has no product) based on audit information stored and/or retrieved for a prior audit of the shelving unit and identifying a product previously located in the shelving unit. In some such examples, the product identifier identifies a type of product that is out-of-stock.
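The pattern-matching step of paragraph [0031] can be illustrated with feature vectors standing in for actual image recognition (the disclosure does not limit the recognition technique). The similarity measure (negative squared distance) and data layout below are assumptions chosen only to make the sketch runnable.

```python
def identify_product(region_features, candidates):
    """Match a region's feature vector against each candidate pattern.

    The highest-scoring pattern gives the identified product. The scoring
    function is a stand-in for whatever image-recognition technique is used.
    """
    def score(a, b):
        # Negative squared Euclidean distance: higher means more similar.
        return -sum((x - y) ** 2 for x, y in zip(a, b))

    best = max(candidates, key=lambda p: score(region_features, p["features"]))
    return best["id"], score(region_features, best["features"])

# Hypothetical candidate patterns with toy 2-D feature vectors.
candidates = [
    {"id": "A", "features": [1.0, 0.0]},
    {"id": "B", "features": [0.0, 1.0]},
]
best_id, _ = identify_product([0.9, 0.1], candidates)
```

The returned score is what a confidence estimate (paragraph [0032]) could aggregate over the grids of a region.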
[0032] In some examples, the product identifier 214 identifies some or all of the products in a region of interest and/or a product shelf prior to displaying the identified products to the user in the segmented image via the user interface. In some such examples, the product identifier 214 determines a confidence level indicating the certainty that the products identified in the region of interest are accurate. In some examples, the confidence level is determined as described in, for example, International Patent Application No. PCT/US2015/______ (International Patent Publication No. ______), titled "Product Auditing in Point of Sale Images" and filed on September 10, 2015, which is hereby incorporated by reference in its entirety. In some examples, the product identifier 214 displays the confidence level in the region of interest in the segmented image via the user interface. In some examples, the product identifier 214 prompts the user to verify, via the user interface, that the identified product is correct and/or to select a correct product to replace the identified product via the user interface. In some examples, when the product identifier 214 identifies a product, the product identifier 214 also displays other potential matches for the product via the user interface, and prompts the user to select the correct product to be the identified product, one of the other displayed potential matches, or a different product entered by the user. In some such examples, if the potential matches are not the correct product, the user may use the camera 204 to scan the barcode of the product and/or may enter the product information manually. In some examples, if a product in the region of interest does not match any patterns, the product identifier 214 creates a new pattern corresponding to that product using information entered by the user. In some such examples, the new pattern is communicated to the pattern database 106.
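Paragraph [0032] defers the actual confidence computation to the incorporated application, which is not reproduced here. The sketch below is therefore only a stand-in aggregation — the fraction of grids whose best pattern-match score clears a threshold — to show where a region-level confidence could come from; it is not the disclosed method.

```python
def region_confidence(match_scores, threshold=0.8):
    """Aggregate per-grid best-match scores into a region-level confidence.

    Stand-in heuristic (NOT the incorporated application's method): the
    fraction of grids whose best match score is at least `threshold`.
    """
    if not match_scores:
        return 0.0
    return sum(1 for s in match_scores if s >= threshold) / len(match_scores)
```

A low value would then trigger the verification prompts the paragraph describes.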
[0033] In some examples, the processor includes an example key performance indicator (KPI) definer 216. In some examples, the example KPI definer 216 computes key performance indicators (KPIs) based on the shelf audit. In some examples, the KPI definer 216 receives information related to the identified products (e.g., facings (e.g., a side(s) of the product facing outward from the shelf), location, assortments, share of shelf, etc.). In some examples, the KPI definer 216 computes the number of products (e.g., the total number of products and/or the number of each type of product). In some examples, the KPI definer 216 computes metric information (e.g., dimensions of the product) related to the products on a product shelf. In some examples, the KPI definer 216 compiles information (e.g., computed information and/or received information) related to the product shelf audit. In some such examples, the KPI definer 216 determines the output KPIs based on the information. In some examples, the KPI definer 216 compares the output KPIs to target KPIs. In some such examples, the target KPIs are pre-defined and/or designated by the user prior to the audit. In some examples, the output KPIs are transmitted to the central server 104. In some such examples, the output KPIs are queued for transmission to the central server 104 when the auditing device 102 is connected via a network connection.
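The output-KPI computation of paragraph [0033] (per-product counts, share of shelf) and the comparison against target KPIs can be sketched as follows. Field names, units (centimeters of shelf width), and the tolerance are illustrative assumptions.

```python
from collections import Counter, defaultdict

def compute_kpis(identified_products, shelf_width_cm):
    """Output KPIs for an audited shelf: facing counts and share of shelf."""
    facings = Counter(p["name"] for p in identified_products)
    width = defaultdict(float)
    for p in identified_products:
        width[p["name"]] += p["width_cm"]
    share = {name: w / shelf_width_cm for name, w in width.items()}
    return {"facings": dict(facings), "share_of_shelf": share}

def meets_targets(share_of_shelf, targets, tolerance=0.05):
    """Flag whether each output KPI is within tolerance of its target."""
    return {name: abs(share_of_shelf.get(name, 0.0) - t) <= tolerance
            for name, t in targets.items()}

# Hypothetical identified products on a 100 cm shelf.
products = [
    {"name": "cola", "width_cm": 30.0},
    {"name": "cola", "width_cm": 30.0},
    {"name": "soap", "width_cm": 40.0},
]
kpis = compute_kpis(products, 100.0)
```

The resulting dictionaries correspond to what would be queued for transmission to the central server 104 once connectivity is available.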
[0034] In some examples, the KPI definer 216 creates a to-do list including tasks to be completed by the user. In some examples, the KPIs are displayed by the user interface based on the type of KPI (e.g., tasks, assortment, share of shelf, promotions, prices, and position). For example, FIGS. 4-6 and 9-11 illustrate example depictions of the KPIs displayed in lists via the user interface of the auditing device 102. In some examples, the KPIs are modifiable using the lists displayed via the user interface. In some such examples, the KPI definer 216 prompts the user to edit the KPIs. For example, in FIG. 5, a user may select the value 506 corresponding to the number of each product on the shelf and change the value. In some such examples, changing the value 506 also updates the percentage value 508 corresponding to the product. In some examples, some KPIs, such as the share of shelf and position KPIs, include an image-based KPI result in the user interface of the auditing device 102 (as shown in FIGS. 7, 8, and 12). In some such examples, the KPIs are modifiable using either the listed KPI results display or the corresponding image-based KPI results display.
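The coupling between the unit-count value 506 and the percentage value 508 described in paragraph [0034] amounts to renormalizing each product's count against the total units on the shelving unit. A minimal sketch (function name and rounding are assumptions):

```python
def assortment_percentages(unit_counts):
    """Recompute each product's percentage (value 508) from unit counts (value 506)."""
    total = sum(unit_counts.values())
    return {name: round(100.0 * n / total, 1) for name, n in unit_counts.items()}
```

Editing one count and re-running the function reproduces the behavior described for FIG. 5: all percentages shift because the total changes.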
[0035] In some examples, the processor 202 includes an example results analyzer 218. The example results analyzer 218 evaluates the segmented image and/or the image-based results to determine whether the product identifier 214 has completed evaluation of the regions of interest or grids of the segmented images and/or the image-based results. In some examples, the results analyzer 218 determines if the user has verified all of the regions of interest and/or grids in the segmented image and/or the image-based results. In some examples, the results analyzer 218, additionally or alternatively, determines if any KPIs are to be evaluated by the user and/or whether the user is to provide additional input based on the KPIs. In some examples, the results analyzer 218 communicates with the central server 104, via the I/O interface 208, to transmit the final results to the central server 104.
[0036] In some examples, the processor 202 includes an example storage device 220. In some examples, the storage device 220 is in communication with the example image segmentor 210, the example candidate pattern selector 212, the example product identifier 214, the example KPI definer 216, the example results analyzer 218, the camera 204, the display 206, and/or the I/O interface 208. In some examples, the camera 204 communicates images (e.g., point of sale images captured by the user) to the storage device 220 for later transmittal to the central server 104. In some examples, the image segmentor 210 receives the point-of-sale images from the storage device 220. In other examples, the image segmentor 210 stores a segmented image in the storage device 220 for later evaluation and/or later transmittal to the central server 104. In some examples, the candidate pattern selector 212 downloads patterns from the pattern database 106 to the storage device 220 and/or retrieves patterns from the storage device 220 to create candidate pattern lists. In some examples, the product identifier 214 stores image-based results (e.g., results not yet verified) to be presented to the user in the storage device 220. In some examples, the KPI definer 216 stores target KPIs and/or output KPIs in the storage device 220. In some examples, the results analyzer 218 stores final results (image-based results and/or KPIs) in the storage device 220 for transmittal to the central server 104.
[0037] FIG. 3 is an example image-based result 300 viewable in an example user interface implemented by the auditing device 102 of FIGS. 1 and/or 2 for interactive product auditing. The example image-based result 300 is created using an example point-of-sale image 302, which is a background layer in the illustrated example. In the illustrated example, the image-based result 300 depicts a product shelving unit 304 including one or more regions of interest 306. In some examples, a region of interest 306 includes one or more grids 308. In the example image-based result 300, a displayed grid 308 includes a candidate pattern 310 matched to the product(s) in the grid 308. In some examples, the grid 308 includes more than one instance of an individual product 312. In some examples, the image-based result 300 includes an error, such as a product 314 that has not been properly segmented. In some examples, the image-based result 300 includes an indicator 316 indicating whether the user has reviewed the region of interest 306. In some such examples, when the user has verified that the products in the region of interest 306 are correctly identified, the indicator 316 includes a signifier 318 (e.g., a check mark) that the user has approved the region of interest 306. In some such examples, when the user has not reviewed the region of interest 306, the indicator 316 signifies that the region of interest 306 is not reviewed. In some examples, when the region of interest 306 has not been evaluated, the indicator 316 signifies that the region of interest 306 is pending evaluation. In some examples, the indicators 316 include respective confidence levels 320. In some examples, the image-based result 300 includes an edit button 322 to enable the user to edit the image-based result 300. In some examples, the image-based result 300 includes a KPI button 324 that, when selected, displays a KPI viewer in the user interface.
[0038] FIG. 4 is an example to-do list KPI 400 viewable in an example user interface implemented by the auditing device 102 of FIGS. 1 and/or 2. In some examples, the to-do list KPI 400 includes a title 402 to indicate which KPI the user is viewing. The example to-do list KPI 400 includes a list of products 404 related to tasks to be performed by the user. In some examples, the to-do list KPI 400 includes a legend 406 to provide instructions to the user related to the tasks. In the illustrated example, a product in the list includes an indicator 408 (e.g., "+1") indicating that the user is to add an indicated number of units (e.g., one unit) of the product to the shelf. In the illustrated example, another product in the list includes an indicator 410 (e.g., "-1") indicating that the user is to remove an indicated number of units (e.g., one unit) of the product from the shelf. In some examples, the to-do list KPI includes a warning 412 to indicate that the user needs to complete one or more tasks before leaving the store. In some examples, the to-do list KPI 400 includes a back button 414 to return the user to the previous view in the user interface.
[0039] FIG. 5 is an example assortment KPI 500 viewable in an example user interface implemented by the auditing device 102 of FIGS. 1 and/or 2. In some examples, the assortment KPI 500 includes a title 502 to indicate which KPI the user is viewing. The example assortment KPI 500 includes a list of products 504 including a number of units 506 for each product and a percentage 508 of the units identified in the shelving unit being audited. In some examples, the percentage 508 is based on the number of units 506 of the product that are on the example product shelving unit 304 of FIG. 3. In some examples, the user may change the number of product units 506 by selecting the number of units 506 and using a pop-up input window (e.g., a keyboard) to adjust the number of units 506. In some examples, the assortment KPI 500 includes one or more view-switching buttons 510. In the illustrated example, the view-switching buttons 510 include a level button 512 and a product button 514. In some examples, the level view displays the products based on the region of interest (e.g., shelf) in which the product is located. In some examples, the product view, as illustrated in the example assortment KPI 500, displays the products in a defined order (e.g., by products having the most units). In some examples, the assortment KPI 500 includes a back button 516 to return the user to the previous view in the user interface.
[0040] FIG. 6 is an example share of shelf KPI 600 viewable in an example user interface implemented by the auditing device 102 of FIGS. 1 and/or 2. In some examples, the share of shelf KPI 600 includes a title 602 to indicate which KPI the user is viewing. In some examples, the share of shelf KPI 600 includes a graph 604 representing the share of shelf occupied by a specified product over time. In the illustrated example, the graph 604 includes entries on a monthly basis. In some examples, the share of shelf for a product is visually compared to a target share of shelf 606 using the graph 604. In some examples, the share of shelf KPI 600 includes a list of products 608 associated with the graph. In some examples, the share of shelf KPI 600 displays the share of shelf measured during the last visit in a first column 610. In some examples, the share of shelf measured during the current audit is displayed in the second column 612 of the share of shelf KPI 600. In some examples, a target share of shelf (e.g., within a tolerance, which may be the same or different for different products) is displayed in the third column 614. In some examples, an indicator of whether the measured share of shelf meets the target share of shelf is displayed in a fourth column 616. In some examples, the user can change the share of shelf values (e.g., the measured share of shelf from the current audit 612, the target share of shelf 614) by clicking on the values and using a pop-up window to enter a new share of shelf value. In some examples, the share of shelf KPI 600 includes one or more view-switching buttons 616. In the illustrated example, the view-switching buttons 616 include a level button 618 and a product button 620. In some examples, the share of shelf KPI 600 includes a back button 622 to return the user to the previous view in the user interface.
[0041] FIGS. 7 and 8 depict an example share of shelf image-based KPI 700 viewable in an example user interface implemented by the auditing device 102 of FIGS. 1 and/or 2. In some examples, the share of shelf image-based KPI 700 is displayed to the user to enable the user to change the share of shelf of one or more products identified in the product shelving unit 304 of FIG. 3. For example, in the illustrated share of shelf image-based KPI 700, the share of shelf for Kleenex Cottonelle® Toilet Paper 702 is changed from 73 cm in FIG. 7 to 100 cm in FIG. 8. In response to the change of the share of shelf for Kleenex Cottonelle® Toilet Paper 702, the share of shelf for Pepsi™ 704 changes from 60 cm in FIG. 7 to 33 cm in FIG. 8. In some examples, an unspecified share of shelf 706 is editable by the user to include shares of shelf for different products on the shelf. In some examples, the share of shelf image-based KPI 700 includes an edit button 708 to enable the user to edit the respective shares of shelf of the products. In some examples, the share of shelf image-based KPI 700 includes a back button 710 to return the user to the previous view in the user interface.
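The reallocation shown in FIGS. 7 and 8 — enlarging one product's share shrinks a neighboring share so the total shelf length stays fixed — can be sketched with the figures' own numbers (73 cm → 100 cm for one product, 60 cm → 33 cm for the other). Treating the adjusted neighbor as an explicit "donor" is an assumption about how the UI resolves the edit; the disclosure only states that one share changes in response to the other.

```python
def edit_share(shares, product, new_cm, donor):
    """Set `product`'s share to `new_cm`, taking the difference from `donor`
    so the total occupied shelf length is unchanged."""
    delta = new_cm - shares[product]
    updated = dict(shares)
    updated[product] = new_cm
    updated[donor] = updated[donor] - delta
    return updated

# Numbers from FIGS. 7 and 8 (centimeters of shelf occupied).
before = {"kleenex": 73, "pepsi": 60}
after = edit_share(before, "kleenex", 100, donor="pepsi")
```

Note that the total (133 cm) is preserved across the edit, matching the figures.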
[0042] FIG. 9 is an example promotions KPI 900 viewable in an example user interface implemented by the auditing device 102 of FIGS. 1 and/or 2. In some examples, the promotions KPI 900 includes a title 902 to indicate which KPI the user is viewing. In some examples, the promotions KPI 900 includes a list of products 904 for which promotion(s) and/or discount(s) are currently available. In some examples, each product in the list corresponds to an indicator (e.g., a "Yes" or "No") to indicate to the user whether the product is included in a current promotion. In some examples, the promotions KPI 900 includes a back button 906 to return the user to the previous view in the user interface.
[0043] FIG. 10 is an example prices KPI 1000 viewable in an example user interface implemented by the auditing device 102 of FIGS. 1 and/or 2. In some examples, the prices KPI 1000 includes a title 1002 to indicate which KPI the user is viewing. In some examples, the prices KPI 1000 includes a list of products 1004 including a price indication 1006 for each product. In some examples, the prices KPI includes an edit button 1008 to enable the user to edit the prices. In some examples, the prices KPI 1000 includes a back button 1010 to return the user to the previous view in the user interface.
[0044] FIG. 11 is an example position KPI 1100 viewable in an example user interface implemented by the auditing device 102 of FIGS. 1 and/or 2. In some examples, the position KPI 1100 includes a title 1102 to indicate which KPI the user is viewing. In some examples, the position KPI 1100 includes a list of products 1104 including a position indication 1106 indicating a position on the product shelf for each product. In some examples, the indicated position includes eyes (e.g., corresponding to an eye-level position, a top-level position, etc.) 1108, hands (e.g., corresponding to a hand-level position, a mid-level position, etc.) 1110, or floor (e.g., corresponding to a floor-level position, a bottom-level position, etc.) 1112, which define different positions of the shelving unit 304. In some examples, the indicated position may include a combination of eyes 1108, hands 1110, or floor 1112. For example, a product on a shelf between an eyes 1108 location and a hands 1110 location may indicate the product is 50% in eyes 1108 and 50% in hands 1110. In some examples, the position KPI 1100 includes one or more view-switching buttons 1114. In the illustrated example, the view-switching buttons 1114 include a level button 1116 and a product button 1118. In some examples, the position KPI 1100 includes a back button 1120 to return the user to the previous view in the user interface. In some examples, the position KPI 1100 includes an auxiliary button 1122.
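The 50%-eyes/50%-hands example in paragraph [0044] is a vertical-overlap computation. A sketch, assuming each position region is a vertical band and each product occupies a vertical interval in the same units (the band boundaries and names are illustrative):

```python
def position_mix(product_bottom, product_top, bands):
    """Fraction of a product's height falling in each named vertical band.

    `bands` maps a position name (e.g., "eyes") to its (low, high) extent.
    """
    height = product_top - product_bottom
    mix = {}
    for name, (lo, hi) in bands.items():
        overlap = max(0.0, min(product_top, hi) - max(product_bottom, lo))
        if overlap > 0.0:
            mix[name] = overlap / height
    return mix

# Hypothetical vertical bands of the shelving unit, in centimeters from the floor.
bands = {"eyes": (100.0, 150.0), "hands": (50.0, 100.0), "floor": (0.0, 50.0)}
mix = position_mix(75.0, 125.0, bands)  # product straddling hands and eyes
```

A product lying entirely inside one band gets a single entry with fraction 1.0, reproducing the simple eyes/hands/floor labels of the list view.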
[0045] FIG. 12 depicts an example position image-based KPI 1200 viewable in an example user interface implemented by the auditing device 102 of FIGS. 1 and/or 2. In some examples, the user uses the auxiliary button 1122 of FIG. 11 to view the position image-based KPI 1200. In some examples, the user uses the position image-based KPI 1200 to view the position regions, including an eyes region 1202, a hands region 1204, and a feet region 1206. In some examples, the user may edit the size and/or locations of the regions 1202, 1204, and 1206. In some such examples, an edit button 1208 enables the user to edit the regions 1202, 1204, and 1206. In some examples, the position image-based KPI 1200 includes a back button 1210 to return the user to the previous view in the user interface.
[0046] While an example manner of implementing the auditing device 102 of FIG. 1 is illustrated in FIG. 2, one or more of the elements, processes and/or devices illustrated in FIG. 2 may be combined, divided, re-arranged, omitted, eliminated and/or implemented in any other way. Further, the example processor 202, the example camera 204, the example display 206, the example I/O interface 208, the example image segmentor 210, the example candidate pattern selector 212, the example product identifier 214, the example KPI definer 216, the example results analyzer 218, the example storage device 220, and/or, more generally, the example auditing device 102 may be implemented by hardware, software, firmware and/or any combination of hardware, software and/or firmware. Thus, for example, any of the example processor 202, the example camera 204, the example display 206, the example I/O interface 208, the example image segmentor 210, the example candidate pattern selector 212, the example product identifier 214, the example KPI definer 216, the example results analyzer 218, the example storage device 220 and/or, more generally, the example auditing device 102 could be implemented by one or more analog or digital circuit(s), logic circuits, programmable processor(s), application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)) and/or field programmable logic device(s) (FPLD(s)). When reading any of the apparatus or system claims of this patent to cover a purely software and/or firmware implementation, at least one of the example processor 202, the example camera 204, the example display 206, the example I/O interface 208, the example image segmentor 210, the example candidate pattern selector 212, the example product identifier 214, the example KPI definer 216, the example results analyzer 218, the example storage device 220, and/or, more generally, the example auditing device 102 is/are hereby expressly defined to include a tangible computer readable storage device or storage disk such as a memory, a digital versatile disk (DVD), a compact disk (CD), a Blu-ray disk, etc. storing the software and/or firmware. Further still, the example auditing device 102 of FIG. 1 may include one or more elements, processes and/or devices in addition to, or instead of, those illustrated in FIG. 2, and/or may include more than one of any or all of the illustrated elements, processes and devices.
[0047] Flowcharts representative of example machine readable instructions for implementing the example auditing device 102 of FIGS. 1 and/or 2 are shown in FIGS. 13-15. In these examples, the machine readable instructions comprise a program or programs for execution by a processor such as the processor 1602 shown in the example processor platform 1600 discussed below in connection with FIG. 16. The program may be embodied in software stored on a tangible computer readable storage medium such as a CD-ROM, a floppy disk, a hard drive, a digital versatile disk (DVD), a Blu-ray disk, or a memory associated with the processor 1602, but the entire program or programs and/or parts thereof could alternatively be executed by a device other than the processor 1602 and/or embodied in firmware or dedicated hardware. Further, although the example program is described with reference to the flowcharts illustrated in FIGS. 13-15, many other methods of implementing the example auditing device 102, the example processor 202, the example camera 204, the example display 206, the example I/O interface 208, the example image segmentor 210, the example candidate pattern selector 212, the example product identifier 214, the example KPI definer 216, the example results analyzer 218, and/or the example storage device 220 may alternatively be used. For example, the order of execution of the blocks may be changed, and/or some of the blocks described may be changed, eliminated, or combined.
[0048] As mentioned above, the example processes of FIGS. 13-15 may be implemented using coded instructions (e.g., computer and/or machine readable instructions) stored on a tangible computer readable storage medium such as a hard disk drive, a flash memory, a read-only memory (ROM), a compact disk (CD), a digital versatile disk (DVD), a cache, a random-access memory (RAM) and/or any other storage device or storage disk in which information is stored for any duration (e.g., for extended time periods, permanently, for brief instances, for temporarily buffering, and/or for caching of the information). As used herein, the term tangible computer readable storage medium is expressly defined to include any type of computer readable storage device and/or storage disk and to exclude propagating signals and transmission media. As used herein, "tangible computer readable storage medium" and "tangible machine readable storage medium" are used interchangeably. Additionally or alternatively, the example processes of FIGS. 13-15 may be implemented using coded instructions (e.g., computer and/or machine readable instructions) stored on a non-transitory computer and/or machine readable medium such as a hard disk drive, a flash memory, a read-only memory, a compact disk, a digital versatile disk, a cache, a random-access memory and/or any other storage device or storage disk in which information is stored for any duration (e.g., for extended time periods, permanently, for brief instances, for temporarily buffering, and/or for caching of the information). As used herein, the term non-transitory computer readable medium is expressly defined to include any type of computer readable storage device and/or storage disk and to exclude propagating signals and transmission media. As used herein, when the phrase "at least" is used as the transition term in a preamble of a claim, it is open-ended in the same manner as the term "comprising" is open-ended.
[0049] FIG. 13 is a flowchart 1300 representative of example machine-readable instructions for interactive product auditing that may be executed by the example auditing device 102 of FIGS. 1 and/or 2. The instructions begin execution with the example camera 204 capturing a point of sale image (block 1302). The example camera 204 transmits the point of sale image to the image segmentor 210, and the image segmentor 210 creates a segmented image from the point of sale image (block 1304). The example candidate pattern selector 212 selects a first set of patterns to be used to evaluate the product shelf (block 1306). The results analyzer 218 determines if any regions of interest of the segmented image have not been evaluated (block 1308). If the results analyzer 218 determines that there are regions of interest that are unevaluated (block 1308), the product identifier 214 uses image recognition to identify products in the region of interest (block 1310). In the illustrated example, the product identifier 214 estimates a confidence level associated with the identified products in the region of interest (block 1312). The example results analyzer 218 determines if any grids in the region of interest are to be reviewed by the user (block 1314). If the results analyzer 218 determines that there are grids to be reviewed by the user (block 1314), the product identifier 214 receives input from the user related to the products identified in the grids (block 1316). If the results analyzer 218 determines that there are no grids to be reviewed by the user, the example candidate pattern selector 212 identifies patterns related to the products identified in the grid (block 1318). The example candidate pattern selector 212 updates the set of patterns to be used during the product audit based on the identified patterns (block 1320).
Execution returns to block 1308. [0050] If the results analyzer 218 determines in block 1308 that there are no more regions of interest to be evaluated, the KPI definer 216 determines output KPIs based on the products identified during the shelf audit and updates the output KPIs based on a user input (block 1322). The example results analyzer 218 displays the final results (e.g., the image-based results and/or the KPIs) to the user via the user interface of the auditing device 102 (block 1324). The example results analyzer 218 then determines if the user made any changes to the final results (block 1326). If changes were made to the final results, the instructions return to block 1324. If no changes were made to the final results, the example results analyzer 218 determines if there are more product shelves in the store to evaluate (block 1328). If the results analyzer 218 determines that there are more product shelves to evaluate (block 1328), execution returns to block 1302. If the results analyzer 218 determines that there are no more product shelves in the store to evaluate, the results analyzer 218 transmits the results to the central server 104 (block 1330). Execution of the program of FIG. 13 then ends.
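The per-region loop of FIG. 13 (blocks 1308-1320) can be condensed into a few lines, with stand-in callables for the recognition and pattern-update steps. Everything here is a simplification of the flowchart for illustration, not the disclosed instructions; the stub behaviors are assumptions.

```python
def run_shelf_audit(regions, initial_patterns, identify, update_patterns):
    """Evaluate each region with the current candidate list (block 1310),
    then refresh the list from what was matched (blocks 1318-1320) before
    moving to the next region (back to block 1308)."""
    patterns = list(initial_patterns)
    results = []
    for region in regions:
        identified = [identify(item, patterns) for item in region]
        results.append(identified)
        patterns = update_patterns(patterns, identified)
    return results

def toy_identify(item, patterns):
    # Stand-in for image recognition: "identified" when a pattern matches.
    return item if item in patterns else None

def toy_update(patterns, identified):
    # Stand-in for neighborhood analysis: suggest one extra pattern.
    return patterns + ["c"]

results = run_shelf_audit([["a", "b"], ["b", "c"]], ["a", "b"],
                          toy_identify, toy_update)
```

The second region's product "c" is only identified because the candidate list was refreshed after the first region, which is the point of the FIG. 14 update step.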
[0051] FIG. 14 is a flowchart representative of example machine-readable instructions for updating a candidate pattern list in block 1320 of FIG. 13 and that may be executed by the example auditing device 102 of FIG. 1. The instructions begin execution with the example candidate pattern selector 212 receiving the user input related to a reviewed region of interest (block 1402). The example results analyzer 218 verifies that all grids in the region of interest have been reviewed (block 1404). The example candidate pattern selector 212 receives the results of the region of interest, including an indication of the patterns matched to products in the region of interest (block 1406). The example candidate pattern selector 212 reviews the products identified in the region of interest and the patterns matched to the products (block 1408). The example candidate pattern selector 212 analyzes the neighborhood of the products in the region of interest (block 1410). The candidate pattern selector 212 analyzes the segmented image for additional information (e.g., number of products not analyzed, brand information for identified products, etc.) to be used to update the set of patterns (block 1412). The candidate pattern selector 212 creates a new set of patterns to use to audit the product shelf to replace the first set of patterns used to audit the product shelf (block 1414). The candidate pattern selector 212 determines if more regions of interest have been received (block 1416). If more regions of interest have been received (block 1416), execution returns to block 1404. If no more regions of interest have been received, execution of the program of FIG. 14 ends.
[0052] FIG. 15 is a flowchart representative of example machine-readable instructions for updating key performance indicators in block 1322 of FIG. 13 and that may be executed by the example auditing device 102 of FIGS. 1 and/or 2. The instructions begin execution with the example KPI definer 216 identifying the position of the one or more products (block 1502). The example KPI definer 216 estimates metric data (e.g., dimensions of a product, share of shelf of a product) for one or more products (block 1504). The example KPI definer 216 determines a number of products in a region of interest (block 1506). The example KPI definer 216 compares the output KPIs calculated in blocks 1502 to 1506 to target KPIs designated by the user (block 1508). The example results analyzer 218 prepares the results for transmittal to the central server 104 (block 1510). The example KPI definer 216 displays the output KPI(s) to the user via the user interface and modifies the output KPI(s) as appropriate (block 1512). The example KPI definer 216 receives input from the user related to the output KPIs (block 1514). The example results analyzer 218 determines if any KPIs need to be reviewed by the user (block 1516). If the results analyzer 218 determines there are KPIs to be reviewed by the user (block 1516), execution returns to block 1514. If the results analyzer 218 determines there are no more KPIs to be reviewed by the user (block 1516), execution of the program of FIG. 15 ends.
[0053] FIG. 16 is a block diagram of an example processor platform 1600 structured to execute the instructions of FIGS. 13-15 to implement the auditing device 102 of FIGS. 1 and/or 2. The processor platform 1600 can be, for example, a server, a personal computer, a mobile device (e.g., a cell phone, a smart phone, a tablet such as an iPad™), a personal digital assistant (PDA), an Internet appliance, a digital video recorder, a personal video recorder, or any other type of computing device.
[0054] The processor platform 1600 of the illustrated example includes a processor 1602. The processor 1602 of the illustrated example is hardware. For example, the processor 1602 can be implemented by one or more integrated circuits, logic circuits, microprocessors or controllers from any desired family or manufacturer. In the illustrated example, the processor 1602 executes example instructions 1632 corresponding to the example instructions of FIGS. 13, 14, and/or 15 to implement the example auditing device processor 202, the example image segmentor 210, the example candidate pattern selector 212, the example product identifier 214, the example KPI definer 216, and/or the example results analyzer 218.
[0055] The processor 1602 of the illustrated example includes a local memory 1613 (e.g., a cache). The processor 1602 of the illustrated example is in communication with a main memory including a volatile memory 1614 and a non-volatile memory 1616 via a bus 1618. The volatile memory 1614 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS Dynamic Random Access Memory (RDRAM) and/or any other type of random access memory device. The non-volatile memory 1616 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 1614, 1616 is controlled by a memory controller.
[0056] The processor platform 1600 of the illustrated example also includes an interface circuit 1620. The interface circuit 1620 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), and/or a PCI express interface.
[0057] In the illustrated example, one or more input devices 1622 are connected to the interface circuit 1620. The input device(s) 1622 permit(s) a user to enter data and commands into the processor 1602. The input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system. In the illustrated example, the one or more input devices 1622 include the example camera 204.
[0058] One or more output devices 1624 are also connected to the interface circuit 1620 of the illustrated example. The output devices 1624 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display, a cathode ray tube display (CRT), a touchscreen, a tactile output device, a printer and/or speakers). The interface circuit 1620 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip or a graphics driver processor. In the illustrated example, the one or more output devices 1624 include the example display 206.
[0059] The interface circuit 1620 of the illustrated example also includes a
communication device such as a transmitter, a receiver, a transceiver, a modem and/or network interface card to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 1626 (e.g., an Ethernet connection, a digital subscriber line (DSL), a telephone line, coaxial cable, a cellular telephone system, etc.). In the illustrated example, the interface circuit 1620 implements the example I/O interface 208.
[0060] The processor platform 1600 of the illustrated example also includes one or more mass storage devices 1628 for storing software and/or data. Examples of such mass storage devices 1628 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, RAID systems, and digital versatile disk (DVD) drives. In some examples, the mass storage device(s) 1628 and/or the volatile memory 1614 implement the example storage device 220.
[0061] The coded instructions 1632 of FIGS. 13-15 may be stored in the mass storage device 1628, in the volatile memory 1614, in the non-volatile memory 1616, and/or on a removable tangible computer readable storage medium such as a CD or DVD.
[0062] From the foregoing, it will be appreciated that the above disclosed example methods, apparatus and articles of manufacture can reduce the overall cost of performing shelf audits by not requiring complex infrastructures to perform the image recognition. Additionally, the example methods, apparatus, and/or articles of manufacture disclosed herein reduce the amount of offline manual intervention required to review and verify the results, which is traditionally very costly. The example methods, apparatus and/or articles of manufacture disclosed herein can also reduce the amount of time between collecting the information and obtaining the final results.
[0063] Although certain example methods, apparatus and articles of manufacture have been disclosed herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, apparatus and articles of manufacture fairly falling within the scope of the claims of this patent.

Claims

What Is Claimed Is:
1. An interactive product auditing method, comprising:
performing, with a processor of an auditing device, image recognition based on a first set of candidate patterns accessed by the auditing device to identify a first product depicted in a first region of interest of a segmented image;
prompting, with the processor of the auditing device, a user to enter a user input associated with a first grid of the first region of interest displayed on a display of the auditing device, the first grid including the first product; and
determining, with the processor of the auditing device, a second set of candidate patterns to use to identify a second product in a second region of interest of the segmented image, the second set of candidate patterns being determined based on the user input and a group of products identified in the segmented image in a neighborhood of the first region of interest.
2. The method of claim 1, further including:
displaying, on the display of the auditing device, an image-based result for the first and second products identified in the first and second regions of interest; and
modifying the image-based results based on the user input.
3. The method of claim 2, wherein modifying the image-based results includes changing an identification of at least one of the first product or the second product, or changing a segment of the segmented image.
4. The method of claim 1, further including:
displaying, on the display of the auditing device, key performance indicators providing data related to the first and second products;
displaying a prompt to the user for a second user input while the key performance indicators are displayed on the display of the auditing device; and
modifying at least one of the key performance indicators based on the second user input.
5. The method of claim 4, wherein the key performance indicators include at least one of respective shares of shelf space or respective positions of the first and second products in the segmented image.
6. The method of claim 1, further including at least one of determining a location of the first and second products in the segmented image, determining respective numbers of facings of the first and second products in the segmented image, detecting a grid in the first region of interest that is empty, or detecting a grid in the first region of interest that is associated with an out-of-stock product.
7. The method of claim 1, wherein the user input includes scanning a barcode associated with the first region of interest to identify the first product.
8. The method of claim 1, wherein the first set of candidate patterns is selected based on at least one of a store being audited, a product category, or an auditor performing the audit.
9. The method of claim 1, wherein the second set of candidate patterns is further determined based on at least one of a product category or a store.
10. An auditing device comprising:
a display;
an input/output interface;
a product identifier to perform image recognition based on a first set of candidate patterns accessed by the auditing device to identify a first product depicted in a first region of interest of a segmented image;
a results analyzer to prompt, via the display, a user to enter a user input, via the input/output interface, associated with a first grid of the first region of interest displayed on the display, the first grid including the first product; and a candidate pattern selector to determine a second set of candidate patterns to use to identify a second product in a second region of interest of the segmented image, the second set of candidate patterns being determined based on the user input and a group of products identified in the segmented image in a neighborhood of the first region of interest.
11. The auditing device of claim 10, wherein the results analyzer of the auditing device is further to:
display, via the display, an image-based result for the first and second products identified in the first and second regions of interest; and
enable the user to modify the image-based results.
12. The auditing device of claim 11, wherein the results analyzer is to modify the image-based results by changing an identification of at least one of the first product or the second product, or changing a segment of the segmented image.
13. The auditing device of claim 10, further including a key performance indicator definer to:
determine key performance indicators providing data related to the first and second products;
display the key performance indicators on the display; and
prompt for a second user input to modify at least one of the key performance indicators.
14. The auditing device of claim 13, wherein the key performance indicators include at least one of respective shares of shelf space or respective positions of the first and second products in the segmented image.
15. The auditing device of claim 13, wherein the key performance indicator definer is further to:
determine a location of the first and second products in the segmented image; and determine respective numbers of facings of the first and second products in the segmented image.
16. The auditing device of claim 10, wherein an image segmentor is further to:
detect a grid in the first region of interest that is empty; and
detect a grid in the first region of interest that is associated with an out-of-stock product.
17. The auditing device of claim 10, further including a camera to:
capture point-of-sale images; and
scan a barcode associated with the first region of interest to identify the first product.
18. The auditing device of claim 10, wherein the candidate pattern selector is further to select the first set of candidate patterns based on at least one of a store being audited, a product category, or a user performing the audit.
19. The auditing device of claim 10, wherein the candidate pattern selector is further to select the second set of candidate patterns based on at least one of a product category or a store.
20. A tangible computer readable storage medium comprising instructions that, when executed, cause a processor of an auditing device to at least:
perform image recognition based on a first set of candidate patterns accessed by the auditing device to identify a first product depicted in a first region of interest of a segmented image;
prompt a user to enter a user input associated with a first grid of the first region of interest displayed on a display of the auditing device, the first grid including the first product; and
determine a second set of candidate patterns to use to identify a second product in a second region of interest of the segmented image, the second set of candidate patterns being determined based on the user input and a group of products identified in the segmented image in a neighborhood of the first region of interest.
21. The machine-readable instructions of claim 20, wherein the instructions, when executed, further cause the processor to:
display, on the display of the auditing device, an image-based result for the first and second products identified in the first and second regions of interest; and
modify the image-based results based on the user input.
22. The machine-readable instructions of claim 21, wherein the instructions cause the processor to modify the image-based results by changing an identification of at least one of the first product or the second product, or changing a segment of the segmented image.
23. The machine-readable instructions of claim 20, wherein the instructions, when executed, further cause the processor to:
display, on the display of the auditing device, key performance indicators providing data related to the first and second products;
display a prompt to the user for a second user input while the key performance indicators are displayed on the display of the auditing device; and
modify at least one of the key performance indicators based on the second user input.
24. The machine-readable instructions of claim 23, wherein the key performance indicators include at least one of respective shares of shelf space or respective positions of the first and second products in the segmented image.
25. The machine-readable instructions of claim 20, wherein the instructions, when executed, further cause the processor to at least one of determine a location of the first and second products in the segmented image, determine respective numbers of facings of the first and second products in the segmented image, detect a grid in the first region of interest that is empty, or detect a grid in the first region of interest that is associated with an out-of-stock product.
26. The machine-readable instructions of claim 20, wherein the user input includes scanning a barcode associated with the first region of interest to identify the first product.
27. The machine-readable instructions of claim 20, wherein the first set of candidate patterns is selected based on at least one of a store being audited, a product category, or an auditor performing the audit.
28. The machine-readable instructions of claim 20, wherein the second set of candidate patterns is further determined based on at least one of a product category or a store.
29. An interactive product auditing method comprising:
performing, with a processor of an auditing device, image recognition on a first segmented image to identify a first product depicted in a first region of interest of the segmented image;
determining, with the processor of the auditing device, a key performance indicator based on the first region of interest and product information associated with the first product; and
prompting, with the processor of the auditing device, a user to enter a user input to modify the key performance indicator.
30. The method of claim 29, further including:
displaying the key performance indicator on a display of the auditing device while prompting the user;
modifying the key performance indicator based on the user input to determine a modified key performance indicator; and
displaying the modified key performance indicator on the display of the auditing device.
31. The method of claim 30, further including transmitting the modified key performance indicator to a central server.
32. The method of claim 29, wherein the key performance indicator is a task to be completed by the user prior to the user completing a store audit, the task to provide second information related to the first product and first region of interest.
33. The method of claim 29, further including modifying the key performance indicator by changing a value related to the first product, wherein the value includes at least one of an assortment, a number of facings, a share of shelf space, or a price.
34. The method of claim 33, further including displaying, via a display of the auditing device, an image-based result including the segmented image and the value related to the first product.
35. The method of claim 29, further including:
identifying a second product in a second region of interest in the segmented image; and
determining the key performance indicator based on the first product and the second product.
36. The method of claim 29, wherein determining the key performance indicator includes at least one of determining a number of instances of the first product in the first region of interest, estimating dimensions of the first product, or identifying a position of the first product in the first region of interest.
37. The method of claim 29, further including:
comparing the key performance indicator with a target key performance indicator; and
displaying a result of the comparison on a display of the auditing device.
38. An auditing device comprising: a product identifier to perform image recognition on a first segmented image to identify a first product depicted in a first region of interest of the segmented image; and
a key performance indicator definer to:
determine a key performance indicator based on the first region of interest and product information associated with the first product; and
prompt a user to enter a user input to modify the key performance indicator.
39. The auditing device of claim 38, wherein the key performance indicator definer is further to:
display the key performance indicator on a display of the auditing device while prompting the user;
modify the key performance indicator based on the user input to determine a modified key performance indicator; and
display the modified key performance indicator on the display of the auditing device.
40. The auditing device of claim 39, further including an input/output interface to transmit the modified key performance indicator to a central server.
41. The auditing device of claim 38, wherein the key performance indicator is a task to be completed by the user prior to the user completing a store audit, the task to provide second information related to the first product and first region of interest.
42. The auditing device of claim 38, wherein the key performance indicator definer is further to modify the key performance indicator by changing a value related to the first product, wherein the value includes at least one of an assortment, a number of facings, a share of shelf space, or a price.

43. The auditing device of claim 42, further including a display of the auditing device to display an image-based result including the segmented image and the value related to the first product.
44. The auditing device of claim 38, wherein the key performance indicator definer is further to:
identify a second product in a second region of interest in the segmented image; and determine the key performance indicator based on the first product and the second product.
45. The auditing device of claim 38, wherein the key performance indicator definer is further to determine the key performance indicator by at least one of determining a number of instances of the first product in the first region of interest, estimating dimensions of the first product, or identifying a position of the first product in the first region of interest.
46. The auditing device of claim 38, wherein the key performance indicator definer is further to:
compare the key performance indicator with a target key performance indicator; and display a result of the comparison on a display of the auditing device.
47. A tangible computer readable storage medium comprising instructions that, when executed, cause a processor of an auditing device to at least:
perform image recognition on a first segmented image to identify a first product depicted in a first region of interest of the segmented image;
determine a key performance indicator based on the first region of interest and product information associated with the first product; and
prompt a user to enter a user input to modify the key performance indicator.
48. The machine-readable instructions of claim 47, wherein the instructions, when executed, further cause the processor to:
display the key performance indicator on a display of the auditing device while prompting the user;
modify the key performance indicator based on the user input to determine a modified key performance indicator; and
display the modified key performance indicator on the display of the auditing device.
49. The machine-readable instructions of claim 48, wherein the instructions, when executed, further cause the processor to transmit the modified key performance indicator to a central server.
50. The machine-readable instructions of claim 47, wherein the key performance indicator is a task to be completed by the user prior to the user completing a store audit, the task to provide second information related to the first product and first region of interest.
51. The machine-readable instructions of claim 47, wherein the instructions, when executed, further cause the processor to modify the key performance indicator by changing a value related to the first product, wherein the value includes at least one of an assortment, a number of facings, a share of shelf space, or a price.
52. The machine-readable instructions of claim 51, wherein the instructions, when executed, further cause the processor to display, via a display of the auditing device, an image-based result including the segmented image and the value related to the first product.
53. The machine-readable instructions of claim 47, wherein the instructions, when executed, further cause the processor to:
identify a second product in a second region of interest in the segmented image; and determine the key performance indicator based on the first product and the second product.
54. The machine-readable instructions of claim 47, wherein the instructions, when executed, cause the processor to determine the key performance indicator by at least one of determining a number of instances of the first product in the first region of interest, estimating dimensions of the first product, or identifying a position of the first product in the first region of interest.
55. The machine-readable instructions of claim 47, wherein the instructions, when executed, further cause the processor to:
compare the key performance indicator with a target key performance indicator; and display a result of the comparison on a display of the auditing device.
PCT/IB2015/002064 2015-09-30 2015-09-30 Interactive product auditing with a mobile device WO2017055890A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
EP15905278.6A EP3357019A4 (en) 2015-09-30 2015-09-30 Interactive product auditing with a mobile device
PCT/IB2015/002064 WO2017055890A1 (en) 2015-09-30 2015-09-30 Interactive product auditing with a mobile device
US14/894,901 US10796262B2 (en) 2015-09-30 2015-09-30 Interactive product auditing with a mobile device
EP21162114.9A EP3862948A1 (en) 2015-09-30 2015-09-30 Interactive product auditing with a mobile device
US17/062,159 US11562314B2 (en) 2015-09-30 2020-10-02 Interactive product auditing with a mobile device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2015/002064 WO2017055890A1 (en) 2015-09-30 2015-09-30 Interactive product auditing with a mobile device

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US14/894,901 A-371-Of-International US10796262B2 (en) 2015-09-30 2015-09-30 Interactive product auditing with a mobile device
US17/062,159 Continuation US11562314B2 (en) 2015-09-30 2020-10-02 Interactive product auditing with a mobile device

Publications (1)

Publication Number Publication Date
WO2017055890A1 true WO2017055890A1 (en) 2017-04-06

Family

ID=58422737

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2015/002064 WO2017055890A1 (en) 2015-09-30 2015-09-30 Interactive product auditing with a mobile device

Country Status (3)

Country Link
US (2) US10796262B2 (en)
EP (2) EP3357019A4 (en)
WO (1) WO2017055890A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017055890A1 (en) * 2015-09-30 2017-04-06 The Nielsen Company (Us), Llc Interactive product auditing with a mobile device
US10339690B2 (en) * 2015-12-18 2019-07-02 Ricoh Co., Ltd. Image recognition scoring visualization
US11093958B2 (en) * 2016-11-23 2021-08-17 Observa, Inc. System and method for facilitating real-time feedback in response to collection of real-world data
US11393047B2 (en) * 2018-12-11 2022-07-19 Nielsen Consumer Llc Methods, systems, articles of manufacture and apparatus to monitor auditing devices
WO2020181066A1 (en) * 2019-03-06 2020-09-10 Trax Technology Solutions Pte Ltd. Methods and systems for monitoring products
CN112001349B (en) * 2020-08-31 2023-09-26 杭州海康威视数字技术股份有限公司 Data auditing method, system and electronic equipment
US20220327511A1 (en) * 2021-04-07 2022-10-13 Vcognition, Inc. System and method for acquiring training data of products for automated checkout
USD1005305S1 (en) * 2021-08-01 2023-11-21 Soubir Acharya Computing device display screen with animated graphical user interface to select clothes from a virtual closet

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080306787A1 (en) * 2005-04-13 2008-12-11 Craig Hamilton Method and System for Automatically Measuring Retail Store Display Compliance
US20130051611A1 (en) * 2011-08-24 2013-02-28 Michael A. Hicks Image overlaying and comparison for inventory display auditing
US20130051667A1 (en) * 2011-08-31 2013-02-28 Kevin Keqiang Deng Image recognition to support shelf auditing for consumer research
US20130265400A1 (en) * 2000-11-06 2013-10-10 Nant Holdings Ip, Llc Image Capture and Identification System and Process
US20140013193A1 (en) * 2012-06-29 2014-01-09 Joseph John Selinger Methods and systems for capturing information-enhanced images

Family Cites Families (89)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4373133A (en) 1980-01-03 1983-02-08 Nicholas Clyne Method for producing a bill, apparatus for collecting items, and a self-service shop
US4973952A (en) 1987-09-21 1990-11-27 Information Resources, Inc. Shopping cart display system
JP2974459B2 (en) 1991-06-21 1999-11-10 エヌシーアール インターナショナル インコーポレイテッド Travel route data collection system
AU1333895A (en) 1993-11-30 1995-06-19 Raymond R. Burke Computer system for allowing a consumer to purchase packaged goods at home
US5699244A (en) 1994-03-07 1997-12-16 Monsanto Company Hand-held GUI PDA with GPS/DGPS receiver for collecting agronomic and GPS position data
US5640002A (en) 1995-08-15 1997-06-17 Ruppert; Jonathan Paul Portable RF ID tag and barcode reader
US5821513A (en) 1996-06-26 1998-10-13 Telxon Corporation Shopping cart mounted portable data collection device with tethered dataform reader
US6026387A (en) 1996-07-15 2000-02-15 Kesel; Brad Consumer comment reporting apparatus and method
CA2196930C (en) 1997-02-06 2005-06-21 Nael Hirzalla Video sequence recognition
US6026376A (en) 1997-04-15 2000-02-15 Kenney; John A. Interactive electronic shopping system and method
US6304284B1 (en) 1998-03-31 2001-10-16 Intel Corporation Method of and apparatus for creating panoramic or surround images using a motion sensor equipped camera
US6281874B1 (en) 1998-08-27 2001-08-28 International Business Machines Corporation Method and system for downloading graphic images on the internet
US20030055707A1 (en) 1999-09-22 2003-03-20 Frederick D. Busche Method and system for integrating spatial analysis and data mining analysis to ascertain favorable positioning of products in a retail environment
US6539393B1 (en) 1999-09-30 2003-03-25 Hill-Rom Services, Inc. Portable locator system
US6911908B1 (en) 1999-10-08 2005-06-28 Activerf Limited Security
US6381510B1 (en) 1999-11-19 2002-04-30 Eruggallery.Com Methods and apparatus for facilitating electronic commerce in area rugs
US7064783B2 (en) 1999-12-31 2006-06-20 Stmicroelectronics, Inc. Still picture format for subsequent picture stitching for forming a panoramic image
US6577346B1 (en) 2000-01-24 2003-06-10 Webtv Networks, Inc. Recognizing a pattern in a video segment to identify the video segment
US20100179859A1 (en) 2000-02-10 2010-07-15 Davis Bruce L Method and System for Facilitating On-Line Shopping
US6708156B1 (en) 2000-04-17 2004-03-16 Michael Von Gonten, Inc. System and method for projecting market penetration
AU2001264954A1 (en) 2000-05-25 2001-12-03 Realitybuy, Inc. A real time, three-dimensional, configurable, interactive product display systemand method
US20020007295A1 (en) 2000-06-23 2002-01-17 John Kenny Rental store management system
US7353188B2 (en) 2000-06-30 2008-04-01 Lg Electronics Product selling system and method for operating the same
US6876988B2 (en) 2000-10-23 2005-04-05 Netuitive, Inc. Enhanced computer performance forecasting system
US8103877B2 (en) 2000-12-21 2012-01-24 Digimarc Corporation Content identification and electronic tickets, coupons and credits
US20040012631A1 (en) 2001-03-20 2004-01-22 Wesley Skorski Master dynamic multi-catalog
JP2002312652A (en) 2001-04-12 2002-10-25 Noritsu Koki Co Ltd Inventory management device for photographic processor
US6584375B2 (en) 2001-05-04 2003-06-24 Intellibot, Llc System for a retail environment
US7206753B2 (en) 2001-05-04 2007-04-17 Axxon Robotics, Llc Methods for facilitating a retail environment
US8140378B2 (en) 2004-07-09 2012-03-20 Shopper Scientist, Llc System and method for modeling shopping behavior
US20030187677A1 (en) 2002-03-28 2003-10-02 Commerce One Operations, Inc. Processing user interaction data in a collaborative commerce environment
US20050035198A1 (en) 2003-01-23 2005-02-17 Wilensky Craig A. Mobile wireless computer system including devices and methods related thereto
US20040224703A1 (en) 2003-05-09 2004-11-11 Takaki Steven M. Method and system for enhancing venue participation by venue participants
US6928343B2 (en) 2003-07-30 2005-08-09 International Business Machines Corporation Shopper tracker and portable customer service terminal charger
CA2540575C (en) 2003-09-12 2013-12-17 Kevin Deng Digital video signature apparatus and methods for use with video program identification systems
US7148803B2 (en) 2003-10-24 2006-12-12 Symbol Technologies, Inc. Radio frequency identification (RFID) based sensor networks
US7751805B2 (en) 2004-02-20 2010-07-06 Google Inc. Mobile image-based information retrieval system
US7420464B2 (en) 2004-03-15 2008-09-02 Arbitron, Inc. Methods and systems for gathering market research data inside and outside commercial establishments
US7155336B2 (en) 2004-03-24 2006-12-26 A9.Com, Inc. System and method for automatically collecting images of objects at geographic locations and displaying same in online directories
US20060237532A1 (en) 2005-04-20 2006-10-26 Style And Form, Inc.., Incorporation: Delaware System and method for facilitating in-store customer service, product marketing and inventory management
US7420149B2 (en) 2005-08-24 2008-09-02 Avaak, Inc. Network sensor system and protocol
US7575171B2 (en) 2005-09-01 2009-08-18 Zvi Haim Lev System and method for reliable content access using a cellular/wireless device with imaging capabilities
TWI274493B (en) 2005-09-23 2007-02-21 Via Tech Inc Serial transceiver and control method thereof
JP4640155B2 (en) * 2005-12-15 2011-03-02 ソニー株式会社 Image processing apparatus and method, and program
US7681796B2 (en) 2006-01-05 2010-03-23 International Business Machines Corporation Mobile device tracking
US7412427B2 (en) 2006-01-27 2008-08-12 Microsoft Corporation Object instance recognition using feature symbol triplets
JP4018727B2 (en) 2006-02-14 2007-12-05 キヤノン株式会社 IMAGING DEVICE, ITS CONTROL METHOD, PROGRAM, AND STORAGE MEDIUM
US20100171826A1 (en) 2006-04-12 2010-07-08 Store Eyes, Inc. Method for measuring retail display and compliance
US8233675B2 (en) 2006-06-20 2012-07-31 L-1 Secure Credentialing, Inc. Image, video or audio filtering before biometric recognition
US8031940B2 (en) 2006-06-29 2011-10-04 Google Inc. Recognizing text in images using ranging data
US7953295B2 (en) 2006-06-29 2011-05-31 Google Inc. Enhancing text in images
US8098934B2 (en) 2006-06-29 2012-01-17 Google Inc. Using extracted image text
US8260689B2 (en) 2006-07-07 2012-09-04 Dollens Joseph R Method and system for managing and displaying product images
WO2008034001A1 (en) 2006-09-15 2008-03-20 Nielsen Media Research, Inc. Methods and apparatus to identify images in print advertisements
US8331725B2 (en) 2007-01-12 2012-12-11 Qualcomm Incorporated Panoramic imaging techniques
US9031858B2 (en) 2007-04-03 2015-05-12 International Business Machines Corporation Using biometric data for a customer to improve upsale and cross-sale of items
US7949568B2 (en) * 2007-08-31 2011-05-24 Accenture Global Services Limited Determination of product display parameters based on image processing
US9135491B2 (en) 2007-08-31 2015-09-15 Accenture Global Services Limited Digital point-of-sale analyzer
US8630924B2 (en) 2007-08-31 2014-01-14 Accenture Global Services Limited Detection of stock out conditions based on image processing
US8009864B2 (en) 2007-08-31 2011-08-30 Accenture Global Services Limited Determination of inventory conditions based on image processing
US8189855B2 (en) 2007-08-31 2012-05-29 Accenture Global Services Limited Planogram extraction based on image processing
US8091782B2 (en) 2007-11-08 2012-01-10 International Business Machines Corporation Using cameras to monitor actual inventory
US9239958B2 (en) 2007-11-09 2016-01-19 The Nielsen Company (Us), Llc Methods and apparatus to measure brand exposure in media streams
US20090128644A1 (en) 2007-11-15 2009-05-21 Camp Jr William O System and method for generating a photograph
US20090192921A1 (en) 2008-01-24 2009-07-30 Michael Alan Hicks Methods and apparatus to survey a retail environment
GB2471036B (en) * 2008-03-03 2012-08-22 Videoiq Inc Object matching for tracking, indexing, and search
EP2250623A4 (en) 2008-03-05 2011-03-23 Ebay Inc Method and apparatus for image recognition services
WO2009155991A1 (en) 2008-06-27 2009-12-30 Nokia Corporation Image retrieval based on similarity search
US8767081B2 (en) 2009-02-23 2014-07-01 Microsoft Corporation Sharing video data associated with the same event
US20110184972A1 (en) 2009-12-23 2011-07-28 Cbs Interactive Inc. System and method for navigating a product catalog
WO2011106520A1 (en) 2010-02-24 2011-09-01 Ipplex Holdings Corporation Augmented reality panorama supporting visually impaired individuals
US8433142B2 (en) 2010-04-05 2013-04-30 The Nielsen Company (Us), Llc Methods and apparatus to detect differences between images
KR101293776B1 (en) * 2010-09-03 2013-08-06 주식회사 팬택 Apparatus and Method for providing augmented reality using object list
US8447863B1 (en) * 2011-05-06 2013-05-21 Google Inc. Systems and methods for object recognition
JP5830784B2 (en) * 2011-06-23 2015-12-09 Cyber Ai Entertainment Inc. Interest graph collection system using relevance search with an image recognition system
US8311973B1 (en) 2011-09-24 2012-11-13 Zadeh Lotfi A Methods and systems for applications for Z-numbers
US9269022B2 (en) 2013-04-11 2016-02-23 Digimarc Corporation Methods for object recognition and related arrangements
US9224243B2 (en) 2013-05-20 2015-12-29 Nokia Technologies Oy Image enhancement using a multi-dimensional model
US9454848B2 (en) 2013-05-20 2016-09-27 Nokia Technologies Oy Image enhancement using a multi-dimensional model
US10290031B2 (en) 2013-07-24 2019-05-14 Gregorio Reid Method and system for automated retail checkout using context recognition
US10366306B1 (en) * 2013-09-19 2019-07-30 Amazon Technologies, Inc. Item identification among item variations
KR102113813B1 (en) 2013-11-19 2020-05-22 Electronics and Telecommunications Research Institute Apparatus and method for searching shoe images using matching pairs
US9122958B1 (en) * 2014-02-14 2015-09-01 Social Sweepster, LLC Object recognition or detection based on verification tests
US10628789B2 (en) * 2014-05-20 2020-04-21 Gimme Vending LLC Communication device for vending machine and method of using the same
US9569692B2 (en) 2014-10-31 2017-02-14 The Nielsen Company (Us), Llc Context-based image recognition for consumer market research
US9619899B2 (en) * 2014-12-23 2017-04-11 Ricoh Co., Ltd. Distinguishing between stock keeping units using hough voting methodology
WO2017055890A1 (en) 2015-09-30 2017-04-06 The Nielsen Company (Us), Llc Interactive product auditing with a mobile device
US9911033B1 (en) * 2016-09-05 2018-03-06 International Business Machines Corporation Semi-supervised price tag detection
US10229347B2 (en) * 2017-05-14 2019-03-12 International Business Machines Corporation Systems and methods for identifying a target object in an image

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130265400A1 (en) * 2000-11-06 2013-10-10 Nant Holdings Ip, Llc Image Capture and Identification System and Process
US20080306787A1 (en) * 2005-04-13 2008-12-11 Craig Hamilton Method and System for Automatically Measuring Retail Store Display Compliance
US20130051611A1 (en) * 2011-08-24 2013-02-28 Michael A. Hicks Image overlaying and comparison for inventory display auditing
US20130051667A1 (en) * 2011-08-31 2013-02-28 Kevin Keqiang Deng Image recognition to support shelf auditing for consumer research
US20140013193A1 (en) * 2012-06-29 2014-01-09 Joseph John Selinger Methods and systems for capturing information-enhanced images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3357019A4 *

Also Published As

Publication number Publication date
EP3357019A4 (en) 2019-03-27
EP3357019A1 (en) 2018-08-08
US20170255891A1 (en) 2017-09-07
US20210073705A1 (en) 2021-03-11
EP3862948A1 (en) 2021-08-11
US11562314B2 (en) 2023-01-24
US10796262B2 (en) 2020-10-06

Similar Documents

Publication Publication Date Title
US11562314B2 (en) Interactive product auditing with a mobile device
US11853347B2 (en) Product auditing in point-of-sale images
US10445821B2 (en) Planogram and realogram alignment
US10592854B2 (en) Planogram matching
CN108416902B (en) Real-time object identification method and device based on difference identification
CN107690657B (en) Identifying merchants from images
EP2662831B1 (en) Comparing virtual and real images of a shopping planogram
KR102358607B1 (en) Artificial intelligence appraisal system, artificial intelligence appraisal method and storage medium
US20170255830A1 (en) Method, apparatus, and system for identifying objects in video images and displaying information of same
US11699019B2 (en) Visual content optimization system using artificial intelligence (AI) based design generation and validation
US11436617B2 (en) Behavior pattern search system and behavior pattern search method
US20220189190A1 (en) Methods and apparatus to detect a text region of interest in a digital image using machine-based analysis
KR20160018550A (en) Methods and devices for smart shopping
US9582835B2 (en) Apparatus, system, and method for searching for power user in social media
US20190311317A1 (en) Inventory management server, inventory management system, inventory management program, and inventory management method
US20210256540A1 (en) Alcohol information management system and management method
KR102417157B1 (en) Electronic device for selling and design apparel product and method for operating thereof
US20170038924A1 (en) Graphical user interface indicating virtual storage of consumable items
JP2015513331A (en) System and method for rule-based content optimization
US20170039741A1 (en) Multi-dimensional visualization
US20170249697A1 (en) System and method for machine learning based line assignment
JP2021189598A (en) Order management device and order management method
KR20230085033A (en) Data curation for consumption and utilization data
WO2023059630A1 (en) Predicting the value of an asset using machine-learning techniques
US20140149310A1 (en) Selection of images to select a place

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 14894901

Country of ref document: US

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15905278

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE