AU2021245114B1 - Image Recognition using Code Based Data Reduction - Google Patents


Info

Publication number
AU2021245114B1
AU2021245114B1
Authority
AU
Australia
Prior art keywords
product group
product
determining
server
digital device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
AU2021245114A
Inventor
Van Der Weegen Mark
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Van Der Weegen Mark
Original Assignee
Van Der Weegen Mark
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Van Der Weegen Mark filed Critical Van Der Weegen Mark
Priority to AU2021245114A priority Critical patent/AU2021245114B1/en
Publication of AU2021245114B1 publication Critical patent/AU2021245114B1/en
Priority to PCT/AU2022/050860 priority patent/WO2023056500A1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]
    • G06Q30/0641 Shopping interfaces
    • G06Q30/0643 Graphical representation of items or shoppers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/587 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/53 Querying
    • G06F16/538 Presentation of query results
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 Methods for optical code recognition
    • G06K7/1439 Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1443 Methods for optical code recognition including a method step for retrieval of the optical code locating of the code in an image

Abstract

A computer implemented method (518') of selecting (1107), by a customer digital device (472), a product (219) from a plurality of products in a product image (200) displayed on the customer digital device, the method comprising the steps of: specifying (1107), by the customer digital device (472), a spatial coordinate on the product image displayed on the customer digital device; determining (1207), by a server (401), a pre-determined bounding box (301) dependent upon the spatial coordinate; determining (1209), by the server, a product group location code (304) associated with the pre-determined bounding box; and determining (1211), by the server (401), the product associated with the product group location code (304).

Description

IMAGE PROCESSING USING CODE BASED DATA REDUCTION
Technical Field
[0001] The present invention relates generally to image processing systems and, in particular, to such systems as they relate to product selection and identification in online-shopping applications. The present invention also relates to a method and apparatus for product selection and identification, and to a computer program product including a computer readable medium having recorded thereon a computer program for product selection and identification.
Background
[0002] Online shopping has been available for some time and many different types of products can now be purchased from the comfort of a customer's home with the click of a mouse or the touch of a tablet or smart phone.
[0003] Whilst online storefronts have become quite sophisticated over time, they are still cumbersome to use and are largely based around menus and images of products arranged in groups or categories. A customer normally needs to know exactly what they are looking for, and it is rare for them to stumble across a product they had not thought they needed, as often happens when browsing in a physical shop. Current menu-based systems typically consist of a menu listing the various product categories along with pictures and pricing of the selectable products within those categories. This requires the customer to dig down into the category menus and search for the desired products. Accordingly, current menu-based systems require a customer to know what they are looking for before starting to shop. If they want a large number of products, as is often the case when supermarket shopping, they are required to navigate through many levels of categories, back and forth to the main menu, and only after considerable time and effort will they have filled their virtual shopping cart.
[0004] Virtual reality shopping is in its infancy and promises to be more intuitive than menu-based shopping, allowing the customer to virtually walk down a shopping aisle, looking at, browsing and shopping for the products displayed on the shelves by pointing at the desired product using a pointing device such as a mouse, or a finger swipe on a touch-sensitive screen. However, this requires that the virtual shopping system recognise items (products) displayed within an image presented to, and selected by, the customer in real time as the customer shops. Current image recognition systems require significant computational resources to perform such recognition. These problems are exacerbated in systems which are web based, operate over a communication network, and need to operate in real time as a customer searches for and selects a product to add to their shopping cart. The complexity of such systems is further exacerbated if items on a shelf do not exactly match the photo of the item in the database, if an item is not displayed correctly on the shelf, or if an item is partially obscured.
Summary
[0005] It is an object of the present invention to substantially overcome, or at least ameliorate, one or more disadvantages of existing arrangements or to offer a useful alternative.
[0006] Disclosed are Code Based Data Reduction (CBDR) arrangements which seek to address the above problems by providing a "real" panoramic image of the store to the customer for the customer to make their shopping selection, while providing a more efficient and less resource intensive processing arrangement by (i) affixing a unique machine readable Product Group Location Code (PGLC), typically implemented using a "Quick Response" (QR) code or the like, in a Product Group Envelope (PGE) encompassing a Product Group (PG) on a shelf, (ii) determining the location of the Product Group Location Code (PGLC) in a Product Image (P) of the shelf, (iii) determining a Product Group Bounding Box (PGBB) and an associated Bounding Box Region (BBR) encompassing the Product Group (PG) and the Product Group Location Code (PGLC), (iv) associating the Product Group (PG), the Product Group Location Code (PGLC) and the Bounding Box Region (BBR) in a Bounding Box Database (BBD), and then, when a customer clicks on a desired product in the "real" image using a mouse pointer for example, (v) cross-referencing the Mouse Pointer Location (MPL) (also associated with a selected X-Y location, i.e. SXYL, on a selected product image P) to the corresponding Bounding Box Region (BBR) in the Bounding Box Database (BBD) to determine the selected product.
[0007] The aforementioned CBDR method is used instead of performing image processing on the "real" image presented to the customer, as is the current approach. This cross-referencing CBDR process can be performed in a fraction of the time needed by current image recognition systems to recognise the product being selected by the customer.
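By way of illustration only, the cross-referencing step (v) can be sketched as a simple point-in-rectangle lookup against the Bounding Box Database. The Python sketch below is a minimal, assumption-laden example: the names BoundingBoxRecord and find_selected_product, and the in-memory data layout, are illustrative and are not taken from this disclosure.

# Illustrative sketch only: a point-in-rectangle lookup against a Bounding Box
# Database (BBD). Names and data layout are assumptions, not from the patent.
from dataclasses import dataclass
from typing import Optional

@dataclass
class BoundingBoxRecord:
    pglc: str      # Product Group Location Code decoded from the affixed QR code
    image_id: str  # identifier of the product image the region was computed for
    x: float       # bottom-left X of the Bounding Box Region (image coordinates)
    y: float       # bottom-left Y of the Bounding Box Region (image coordinates)
    width: float
    height: float

def find_selected_product(image_id: str, sx: float, sy: float,
                          bounding_box_db: list[BoundingBoxRecord],
                          product_db: dict[str, dict]) -> Optional[dict]:
    """Cross-reference a selected X-Y location (SXYL) on a product image to the
    Bounding Box Region containing it, then look up the product group by PGLC."""
    for box in bounding_box_db:
        if box.image_id != image_id:
            continue
        if box.x <= sx <= box.x + box.width and box.y <= sy <= box.y + box.height:
            return product_db.get(box.pglc)  # the PGLC keys the product group record
    return None  # the selection fell outside every known Bounding Box Region

Because the Bounding Box Regions are pre-computed during image processing, the per-selection work reduces to a lookup of this kind, which is the data reduction the CBDR arrangement relies on.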
[0008] Although the CBDR arrangements are described in the context of virtual-reality on-line shopping, they are equally applicable to on-line shopping arrangements in which 2-D images are presented to a customer without a virtual reality context.
[0009] According to an aspect of the present disclosure, there is provided a computer implemented method (518') of selecting (1107), by a customer digital device (472), a product (219) from a plurality of products in a product image (200) displayed on the customer digital device, the method comprising the steps of: specifying (1107), by the customer digital device (472), a spatial coordinate on the product image displayed on the customer digital device; determining (1207), by a server (401), a pre-determined bounding box (301) dependent upon the spatial coordinate; determining (1209), by the server, a product group location code (304) associated with the pre-determined bounding box; and determining (1211), by the server (401), the product associated with the product group location code (304).
[0010] According to an aspect of the present disclosure, there is provided a computer implemented method of processing, by a server device, a selection signal from a customer digital device resulting from selection, by the customer digital device, of a product from a plurality of products in a product image displayed on the customer digital device, the method comprising the steps of: determining, by the server device, the product image being displayed on the customer digital device; determining, by the server device, a spatial coordinate specified by the customer digital device on the product image displayed on the customer digital device; determining, by the server device, a pre-determined product group bounding box encompassing the spatial coordinate; determining, by the server device, a product group location code associated with the product group bounding box; and determining, by the server device, a product group associated with the product group location code.
[0011] According to an aspect of the present disclosure, there is provided a customer digital device for selecting a product from a plurality of products in a product image displayed on the customer digital device, the device comprising: a processor; and a memory storing a computer executable software program for directing the processor to perform a method comprising the steps of: specifying, by the customer digital device, a spatial coordinate on the product image displayed on the customer digital device; determining, by a server, a pre-determined product group bounding box dependent upon the spatial coordinate; determining, by the server, a product group location code associated with the pre-determined product group bounding box; and determining, by the server, the product associated with the product group location code.
[0012] According to an aspect of the present disclosure, there is provided a server device for processing a selection signal from a customer digital device resulting from selecting, by the customer digital device, a product from a plurality of products in a product image displayed on the customer digital device, the server device comprising: a processor; and a memory storing a computer executable software program for directing the processor to perform a method comprising the steps of: determining the product image being displayed on the customer digital device; determining a spatial coordinate specified by the customer digital device on the product image displayed on the customer digital device; determining a pre-determined product group bounding box encompassing the spatial coordinate; determining a product group location code associated with the product group bounding box; and determining a product group associated with the product group location code.
[0013] According to an aspect of the present disclosure, there is provided a computer executable software program for directing a processor of a customer digital device to perform a method for selecting a product from a plurality of products in a product image displayed on the customer digital device, the method comprising the steps of: specifying, by the customer digital device, a spatial coordinate on the product image displayed on the customer digital device; determining, by a server, a pre-determined product group bounding box dependent upon the spatial coordinate; determining, by the server, a product group location code associated with the pre-determined product group bounding box; and determining, by the server, the product associated with the product group location code.
[0014] According to an aspect of the present disclosure, there is provided a computer executable software program for directing a processor of a server device to perform a method for processing a selection signal from a customer digital device resulting from selection, by the customer digital device, of a product from a plurality of products in a product image displayed on the customer digital device, the method comprising the steps of: determining the product image being displayed on the customer digital device; determining a spatial coordinate specified by the customer digital device on the product image displayed on the customer digital device; determining a pre-determined product group bounding box encompassing the spatial coordinate; determining a product group location code associated with the product group bounding box; and determining a product group associated with the product group location code.
[0015] According to an aspect of the present disclosure, there is provided an apparatus for implementing any one of the aforementioned methods.
[0016] According to an aspect of the present disclosure there is provided a computer program product including a computer readable medium having recorded thereon a computer program for implementing any one of the methods described above.
[0017] Other aspects are also disclosed.
Brief Description of the Drawings
[0018] Some aspects of the prior art and at least one embodiment of the present invention will now be described with reference to the drawings and appendices, in which:
[0019] Fig. 1 depicts an aisle view 100 in a virtual reality shopping system according to the disclosed CBDR arrangement;
[0020] Fig. 2 shows an example of a product image which a customer would view in the disclosed CBDR arrangement in which multiple product groups (PGs) are located on shelves;
[0021] Fig. 3 depicts a fragment of the product image of Fig. 2 showing examples of product group location codes PGLCs and bounding boxes associated with the product group location codes;
[0022] Figs. 4A and 4B form a schematic block diagram of a general-purpose computer system upon which the disclosed CBDR arrangements can be practiced;
[0023] Figs. 5A, 5B and 5C depict respective flow charts for example processes for preparing a store (involving a store preparation process, an image capture process, and an image processing process), a shopping process seen from a shopper perspective, and a shopping process from the CBDR server perspective according to the disclosed CBDR arrangement;
[0024] Figs. 6A and 6B are flow chart fragments showing, in aggregate, an example of a process for store preparation according to the process depicted in Fig. 5A;
[0025] Fig. 7 depicts a computer-controlled camera trolley which can, in one CBDR example, be used for capturing the product images such as that shown in Fig. 2 according to Fig. 5A;
[0026] Figs. 8A and 8B depict examples of a guidance arrangement that can be used to guide the trolley of Fig. 7 through a store in order to capture product images such as that shown in Fig. 2 according to a process depicted in Fig. 5A;
[0027] Fig. 9 is a flow chart for an example process used to guide the trolley of Fig. 7 according to the store layout of Fig. 8A for performing image capture according to a process depicted in Fig. 5A;
[0028] Figs. 10A, 10B and 10C are flow chart fragments for an example process for performing image processing of the product images captured by the trolley of Fig. 7 according to a process depicted in Fig. 5A;
[0029] Fig 11 is a flow chart for an example shopping process from a shopper perspective as depicted in Fig. 5B;
[0030] Fig 12 is a flow chart for an example shopping process from the CBDR server perspective as depicted in Fig. 5C;
[0031] Figs. 13A and 13B depict flowcharts for an alternate / complementary arrangement to that depicted in Figs. 6A and 6B for store preparation;
[0032] Figs. 14 to 17 depict flowcharts for an alternate / complementary arrangement to that depicted in Figs. 10A, 10B and 10C for image processing;
[0033] Figs. 18A and 18B depict flowchart fragments for an alternate / complementary arrangement to that depicted in Fig. 9 for image capture; and
[0034] APPENDIX A is a glossary of terms used in this description.
Detailed Description including Best Mode
[0035] Where reference is made in any one or more of the accompanying drawings to steps and/or features, which have the same reference numerals, those steps and/or features have for the purposes of this description the same function(s) or operation(s), unless the contrary intention appears.
[0036] It is to be noted that the discussions contained in the "Background" section and that above relating to prior art arrangements relate to discussions of arrangements which may form public knowledge through their use. Such discussions should not be interpreted as a representation by the inventor or the patent applicant that such arrangements in any way form part of the common general knowledge in the art.
[0037] The disclosed CBDR arrangements capture comprehensive two-dimensional images covering the store in question (described hereinafter in more detail with reference to Figs. 7, 8A, 8B and 9), and convert the captured images to panoramic images using known techniques such as those used by Google Street ViewTM. Virtual reality shopping using such panoramic images is more intuitive than traditional menu-based shopping.
[0038] A customer can typically use the disclosed CBDR arrangement by initiating a shopping session on a CBDR website (see 1101 in Fig. 11) using a CBDR software application 469 executing on the customer's digital device 472 (which may be a desktop PC, a smart phone, a tablet or a similar fixed or portable digital platform). The customer can shop (described hereinafter in more detail with reference to Figs. 5B, 5C, 11 and 12) by swiping his or her finger on a touch sensitive screen 474 or by moving a pointing device 403 to move a cursor on a screen 414, walking virtually down an aisle (see Fig. 1 for example), looking left and right, browsing the products on the shelves as though they are in the store, selecting a desired product and adding the product to their shopping cart.
[0039] In situations where a physical store offers the virtual CBDR arrangement in addition to "bricks and mortar" shopping options, the CBDR layout of the store presented on the display 474 of the customer's device 472 can be similar to the store layout the customer would see in person. The customer can save their regular items in a list, but in addition can virtually browse the aisles for products that they did not plan on purchasing when they began. Specials can be placed anywhere in the store, as is currently done in regular shops. The whole familiar experience of walking into a shop, browsing, purchasing and walking out can be recreated virtually. When the customer has finished shopping, they can head for the checkout and complete their purchase.
[0040] The CBDR system can be visualised as being made up of 4 processes as follows:
I. Store Preparation (described hereinafter in more detail with reference to Figs. 6A and 6B);
II. Image Capture (described hereinafter in more detail with reference to Figs. 7, 8A, 8B and 9);
III. Image Processing (described hereinafter in more detail with reference to Figs. 10A, 10B and 10C);
IV. User Interface (described hereinafter in more detail with reference to Figs. 11 and 12).
INTRODUCTION
[0041] Fig. 1 depicts an aisle view 100 in a virtual reality shopping system according to the disclosed CBDR arrangement. The aisle depicted in Fig. 1 has a floor 102 running between a left-hand side set of shelves 101 and a right-hand side set of shelves 103. The panoramic view 100 is constructed by stitching together two-dimensional images (such as that depicted in Fig. 2) captured of the store in question (described hereinafter in regard to Figs. 5A, 7, 8A, 8B and 9) in a manner such as that used in Google Street ViewTM.
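The stitching itself can be performed with any off-the-shelf panorama tool; this disclosure does not prescribe one. As a hedged example only, OpenCV's high-level Stitcher could be used along the following lines, assuming opencv-python is available:

# Hedged example: stitch captured 2-D aisle images into a panorama with OpenCV.
# This is one possible tool choice, not a technique mandated by the disclosure.
import cv2

def stitch_aisle_images(image_paths):
    images = [cv2.imread(p) for p in image_paths]
    stitcher = cv2.Stitcher_create()          # panorama mode by default
    status, panorama = stitcher.stitch(images)
    if status != 0:                           # 0 corresponds to Stitcher::OK
        raise RuntimeError(f"stitching failed with status code {status}")
    return panorama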
STORE PREPARATION
[0042] Fig. 2 depicts a two-dimensional product image P1 (200) of a set of shelves in a store upon which a variety of products are stacked according to the CBDR arrangement. The product image 200 depicts four shelves (202, 203, 204 and 205). Products are stacked on the shelves 202-205 in Product Groups (PGs) such as 201, 210. A product group can consist of the single product 210 or a number of identical products 213. Dark grey rectangles 207, 214, 216 are referred to as Product Group Envelopes (PGEs). The individual Product Group Envelopes 207, 214, 216 represent the shelf space visualised for the corresponding product groups in question by a store employee preparing the shelves (described hereinafter in more detail with reference to Figs. 5A, 6A and 6B) for the CBDR arrangement.
[0043] The product group envelope visualised by the store employee forms the basis upon which the store employee manually affixes the associated product group location code PGLC. As described hereinafter in regard to Figs. 5A, 10A, 10B and 10C, the CBDR server 401 processes the captured product images, such as the image 200, using the affixed product group location codes PGLC, in order to determine the locations and sizes of the bounding boxes associated with the product groups.
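Locating the affixed product group location codes in a captured product image can use a standard QR detector; the following is an illustrative sketch only. It assumes opencv-python 4.3 or later (which provides detectAndDecodeMulti) and takes the bottom-left corner of each detected code as its location, as in Fig. 3.

# Illustrative sketch: detect and decode the PGLC QR codes in a product image.
# Assumes opencv-python >= 4.3; the bottom-left corner is taken as the location.
import cv2

def locate_pglcs(image_path):
    image = cv2.imread(image_path)
    detector = cv2.QRCodeDetector()
    ok, decoded, corner_points, _ = detector.detectAndDecodeMulti(image)
    if not ok:
        return []
    located = []
    for text, quad in zip(decoded, corner_points):  # quad is a 4x2 corner array
        if not text:
            continue  # a code was detected but could not be decoded
        xs, ys = quad[:, 0], quad[:, 1]
        # Image Y grows downwards, so the bottom-left corner is (min X, max Y).
        located.append({"pglc": text, "pglcx": float(xs.min()), "pglcy": float(ys.max())})
    return located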
[0044] Accordingly, the product group envelope PGE is an artefact visualized by the store employee to enable the employee to affix the product group location codes PGLCs on the shelves. In contrast, the product group boundary box PGBB is a mathematical construct, having a location and a size, determined by the CBDR system using the affixed product group location codes PGLCs and stored in the bounding box database 477, which is used to implement the CBDR method.
[0045] Two example types of Product Group Location Code (PGLC) are depicted in Fig. 2. Other types of PGLC can be used. A Boundary Product Group Location Code (BPGLC) 215 is shown affixed to the shelf 203 at the bottom left hand corner of the Product Group Envelope PGE 214 associated with the product group 201. An adjacent Boundary Product Group Location Code (BPGLC) 217 is associated with an adjacent Product Group Envelope (not shown). A Centreline Product Group Location Code (CPGLC) 209 is shown affixed to the shelf 205 at the bottom centre location of the Product Group Envelope PGE 207 associated with the product group 210.
[0046] Each product group PG typically also has a Product Description Code 208 (PDC) which is often implemented as a QR code, and which is used to identify the product group in a product database 490 for the store in question. The Product Description Code PDC is typically printed on a user readable label 208 associated with the product group in question and affixed to the product shelf. Each individual product typically also has a Product Bar Code (PBC) 218 on the product itself which identifies the specific product in question.
[0047] Fig. 2 shows an example of a product image which a customer would view as part of the panoramic image in the disclosed CBDR arrangement in which multiple product groups (PGs) are located on shelves. Fig. 2 depicts Product Group Location Codes PGLCs located at boundary corners and centrelines within their corresponding Product Group Envelopes PGEs. However, the disclosed CBDR arrangement can be implemented using Product Group Location Codes PGLCs located at any pre-defined relative location to their corresponding Product Group Envelopes PGEs.
[0048] Prior to the Image Capture process, described hereinafter in more detail with reference to Figs. 7, 8A, 8B and 9, a store employee works their way down each aisle, ensuring that the products are displayed neatly on each shelf above their respective product label, which includes a price tag, to facilitate the best possible image for the CBDR website and recognition process. The store employee also visualises, for each product group PG such as 201, an associated Product Group Envelope PGE 214. Having visualised the Product Group Envelope PGE 214, in one CBDR arrangement the store employee affixes a Centreline Product Group Location Code CPGLC on the shelf centred on a vertical centre line through the Product Group Envelope PGE. See for example the Centreline Product Group Location Code CPGLC 209 which is located on a centre line (not shown) of the Product Group Envelope PGE 207 associated with the product group PG 210. In another CBDR arrangement the store employee affixes a Boundary Product Group Location Code BPGLC on the shelf in the bottom left-hand corner of the corresponding Product Group Envelope PGE. See for example the Boundary Product Group Location Code BPGLC 215 which is located at the bottom left-hand corner of the Product Group Envelope PGE 214 associated with the product group PG 201.
[0049] Multiple product items of the same product are often placed on a shelf stacked next to and / or on top of each other. The number of items that are located next to each other in a row for that particular product group is stored in a Qty-Wide parameter in the product database 490, and the number stacked on top of each other is stored in a Qty-High parameter in the product database 490.
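As an illustration only, a product database entry could carry these stacking quantities alongside the single-product dimensions; the field names below are assumptions, not terms defined in this disclosure.

# Hypothetical product database record; field names are illustrative only.
from dataclasses import dataclass

@dataclass
class ProductGroupRecord:
    pglc: str              # Product Group Location Code affixed to the shelf
    pdc: str               # Product Description Code on the shelf label
    product_width: float   # PW: width of a single product item
    product_height: float  # PH: height of a single product item
    qty_wide: int          # Qty-Wide: items placed next to each other in a row
    qty_high: int          # Qty-High: items stacked on top of each other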
[0050] Fig. 3 depicts a fragment of a product image showing examples of Product Group Location Codes PGLCs and product group bounding boxes PGBBs associated with the product group location codes PGLCs. A first product group (PG1) 319, located on the shelf 203, consists of a single product 319. A boundary product group location code BPGLC1 (304) is located at a lower left-hand corner of a product group bounding box PGBB1 (301) which has been constructed based upon a location 303 (with X-Y coordinates PGLCX1, PGLCY1) of the boundary product group location code BPGLC1 (304).
[0051] In general, in the disclosed CBDR arrangements a "location" of a product group location code PGLC can be specified to be any fixed location within the PGLC in question. PGLCs in a CBDR system can thus have their "locations" specified in any convenient manner.
[0052] For example, the boundary product group location code BPGLC1 (304) of the first product group PG1 has a location specified by a black dot 303 (with X-Y coordinates PGLCX1, PGLCY1) at the bottom left hand corner of the boundary product group location code BPGLC1 (304). An associated product group bounding box PGBB1 (301) has a product group bounding box width (PGBBW1) 307 which in this example is determined by an X distance 307 (PGLCX2 - PGLCX1) between the location 303 (PGLCX1, PGLCY1) of the first boundary product group location code BPGLC1 (304) and a location (PGLCX2, PGLCY2) (308) of an adjacent (second) boundary product group location code BPGLC2 (309). The first product group bounding box PGBB1 (301) has a product group bounding box height (PGBBH1 = PGLCY1 + D1) 302 which in this example is determined by a distance D1 (302) between a bottom surface of the shelf 203 and a lower surface of the upper shelf 202. The first product group bounding box PGBB1 (301) thus has a product group bounding box width (PGBBW1 = PGLCX2 - PGLCX1) 307 and a product group bounding box height (PGBBH1 = PGLCY1 + D1) 302.
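Expressed as arithmetic (an illustrative sketch using the variable names above; the coordinate conventions are assumptions, not specified here):

# PGBB1 from two adjacent boundary codes and the shelf spacing, per the formulas
# above. Variable names mirror the description; values are image coordinates.
def pgbb_from_adjacent_boundary_codes(pglcx1, pglcy1, pglcx2, d1):
    pgbbw1 = pglcx2 - pglcx1   # width: X distance to the adjacent boundary code
    pgbbh1 = pglcy1 + d1       # height as stated above: PGBBH1 = PGLCY1 + D1
    # The box is anchored at the code location 303 (its bottom-left corner).
    return {"x": pglcx1, "y": pglcy1, "width": pgbbw1, "height": pgbbh1}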
[0053] A second product group PG2 (328), comprising eight products including 313 and located on the shelf 203, each product having a product width PW2 (321) and a product height PH2 (322), is arranged two products wide (ie Qty-Wide2 = 2) and four products high (ie Qty-High2 = 4). The second product group PG2 has a boundary product group location code BPGLC2 (309) (with a location 308 having X-Y coordinates PGLCX2, PGLCY2). A product group bounding box PGBB2 (311) associated with the product group PG2 (including the product 313) extends upwards (from PGLCY2 to PGLCY2 + PGBBH2) and to the right (from PGLCX2 to PGLCX2 + PGBBW2) from the location 308 (PGLCX2, PGLCY2) of the boundary product group location code BPGLC2 (309). The product group bounding box 311 has a width (PGBBW2) 310 which in this example is determined by the width PW2 (321) of the product 313 and the parameter Qty-Wide2 = 2, and therefore PGBBW2 = PW2 * Qty-Wide2. The product group bounding box 311 has a product group bounding box height (PGBBH2) 312 which in this example is determined by the height PH2 (322) of the product 313 and the parameter Qty-High2 = 4, and therefore PGBBH2 = PH2 * Qty-High2 + PGLCPHH2, where PGLCPHH2 is the height of the boundary product group location code BPGLC2 (309).
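The same relationships can be written out directly (illustrative only; names mirror the description):

# PGBB2 for a boundary code, from single-product dimensions and stacking
# quantities, per the formulas above (illustrative sketch only).
def pgbb_from_product_dimensions(pglcx2, pglcy2, pw2, ph2, qty_wide2, qty_high2,
                                 pglcphh2):
    pgbbw2 = pw2 * qty_wide2               # PGBBW2 = PW2 * Qty-Wide2
    pgbbh2 = ph2 * qty_high2 + pglcphh2    # PGBBH2 = PH2 * Qty-High2 + code height
    # The box extends upwards and to the right of the code location (PGLCX2, PGLCY2).
    return {"x_min": pglcx2, "x_max": pglcx2 + pgbbw2,
            "y_min": pglcy2, "y_max": pglcy2 + pgbbh2}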
[0054] A third product group PG3 (327) comprises eight products such as 314, suspended on a hook arrangement (not shown) from the shelf 202, each product having a product width PW3 (323) and a product height PH3 (324), arranged two products wide (ie Qty-Wide3 = 2) and four products high (ie Qty-High3 = 4). The third product group PG3 has an associated centreline product group location code CPGLC3 (316) having a location 320 at (PGLCX3, PGLCY3).
[0055] A product group bounding box PGBB3 (315) associated with the third product group PG3 (including the product 314) extends equidistantly to the right and left of the location 320 of the centreline product group location code CPGLC3 (316) and extends downwards from an upper edge of the centreline product group location code CPGLC3 (316), the upper edge having a Y coordinate of PGLCY3 - PGLCPHH3/2, where PGLCY3 is the Y-coordinate of the location 320 of the centreline product group location code CPGLC3 and PGLCPHH3/2 is half the height of the product group location code CPGLC3.
[0056] The product group bounding box PGBB3 (315) has a width (PGBBW3) 317 which in this example is determined by the width PW3 of the product 314 and the parameter Qty-Wide3 = 2, and therefore PGBBW3 = PW3 * Qty-Wide3, and extends from (PGLCX3 - PGBBW3/2) to (PGLCX3 + PGBBW3/2), where PGLCX3 is the X-coordinate of the location of the centreline product group location code CPGLC3 and PGBBW3/2 is half the width of the product group bounding box PGBB3. The product group bounding box PGBB3 (315) has a height (PGBBH3) 318 which in this example is determined by the height PH3 of the product 314 and the parameter Qty-High3 = 4, and therefore PGBBH3 = PH3 * Qty-High3 + PGLCPHH3, and extends from (PGLCY3 - PGLCPHH3/2) to (PGLCY3 + PGBBH3 + PGLCPHH3/2), where PGLCY3 is the Y-coordinate of the location of the centreline product group location code CPGLC3, PGLCPHH3/2 is half the height of the centreline product group location code CPGLC3, and PGBBH3 is the height of the product group bounding box PGBB3.
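For the centreline case the extents follow the same pattern, centred on the code (an illustrative sketch that reproduces the formulas as stated above):

# PGBB3 for a centreline code, per the formulas above (illustrative sketch only).
def pgbb_from_centreline_code(pglcx3, pglcy3, pw3, ph3, qty_wide3, qty_high3,
                              pglcphh3):
    pgbbw3 = pw3 * qty_wide3               # PGBBW3 = PW3 * Qty-Wide3
    pgbbh3 = ph3 * qty_high3 + pglcphh3    # PGBBH3 = PH3 * Qty-High3 + code height
    return {"x_min": pglcx3 - pgbbw3 / 2,  # extends equidistantly left and right
            "x_max": pglcx3 + pgbbw3 / 2,
            "y_min": pglcy3 - pglcphh3 / 2,            # upper edge of the code
            "y_max": pglcy3 + pgbbh3 + pglcphh3 / 2}   # as stated in the text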
[0057] Although different PGLCs can have different locations, as shown by the location 303 for the PGLC 304, and the location 320 for the PGLC 316, for the sake of simplicity it is possible for all PGLCs in a CBDR system to use the same location specification, eg the location specified by the black dot 303 at the bottom left hand corner of the boundary product group location code BPGLC 304.
[0058] Figs. 4A and 4B form a schematic block diagram of a general-purpose computer system upon which the disclosed CBDR arrangements can be practiced. The description below is primarily directed to the CBDR server 401 and its operation. However, the description applies mutatis mutandis to the customer device 472, an image capture trolley 700, and a hand scanner 478.
[0059] Furthermore, although the description may indicate that a particular process step is performed by a particular one of the CBDR server 401, the customer device 472, the image capture trolley 700, and the hand scanner 478 executing respective CBDR server software 433, customer digital device CBDR software 469, trolley CBDR software (not shown), and hand scanner CBDR software (not shown), the various CBDR software applications can be distributed in a flexible manner so that processes may, to a significant degree, be performed equivalently by one or more of the CBDR server 401, the customer device 472, the image capture trolley 700, and the hand scanner 478 and their respective CBDR software applications, subject of course to constraints in computing and memory resources, network latency and the like.
[0060] As seen in Fig. 4A, the computer system 400 includes: a computer module 401; input devices such as a keyboard 402, a mouse pointer device 403, a scanner 426, a camera 427, and a microphone 480; and output devices including a printer 415, a display device 414 and loudspeakers 417. An external Modulator-Demodulator (Modem) transceiver device 416 may be used by the computer module 401 (the CBDR server) for communicating to and from a customer PC or mobile terminal 472, the trolley 700 and the hand scanner 478 over a communications network 420 via respective connections 421/473 and 421/475. The communications network 420 may be a wide-area network (WAN), such as the Internet, a cellular telecommunications network, or a private WAN. Where the connection 421 is a telephone line, the modem 416 may be a traditional "dial-up" modem. Alternatively, where the connection 421 is a high capacity (e.g., cable) connection, the modem 416 may be a broadband modem. A wireless modem may also be used for wireless connection to the communications network 420. The customer device 472, the trolley 700 and the hand scanner 478 communicate with the network 420 as depicted by dashed lines 473, 475 and 481. The line 473 can depict a hard-wired connection if the customer device 472 is a PC, or can depict a wireless connection if the customer device 472 is a smart phone or a tablet and so on. Similarly the trolley 700 can communicate with the network 420 using either a hardwired or wireless connection.
[0061] The computer module 401 typically includes at least one processor unit 405, and a memory unit 406. For example, the memory unit 406 may have semiconductor random access memory (RAM) and semiconductor read only memory (ROM). The computer module 401 also includes a number of input/output (I/O) interfaces including: an audio-video interface 407 that couples to the video display 414, loudspeakers 417 and microphone 480; an I/O interface 413 that couples to the keyboard 402, mouse 403, scanner 426, camera 427 and optionally a joystick or other human interface device (not illustrated); and an interface 408 for the external modem 416 and printer 415. In some implementations, the modem 416 may be incorporated within the computer module 401, for example within the interface 408. The computer module 401 also has a local network interface 411, which permits coupling of the computer system 400 via a connection 423 to a local-area communications network 422, known as a Local Area Network (LAN). As illustrated in Fig. 4A, the local communications network 422 may also couple to the wide network 420 via a connection 424, which would typically include a so-called "firewall" device or device of similar functionality. The local network interface 411 may comprise an Ethernet circuit card, a Bluetooth® wireless arrangement or an IEEE 802.11 wireless arrangement; however, numerous other types of interfaces may be practiced for the interface 411.
[0062] The I/O interfaces 408 and 413 may afford either or both of serial and parallel connectivity, the former typically being implemented according to the Universal Serial Bus (USB) standards and having corresponding USB connectors (not illustrated). Storage devices 409 are provided and typically include a hard disk drive (HDD) 410. Other storage devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used. An optical disk drive 412 is typically provided to act as a non-volatile source of data. Portable memory devices, such as optical disks (e.g., CD-ROM, DVD, Blu-ray Disc™), USB-RAM, portable external hard drives, and floppy disks, for example, may be used as appropriate sources of data to the system 400.
[0063] The components 405 to 413 of the computer module 401 typically communicate via an interconnected bus 404 and in a manner that results in a conventional mode of operation of the computer system 400 known to those in the relevant art. For example, the processor 405 is coupled to the system bus 404 using a connection 418. Likewise, the memory 406 and optical disk drive 412 are coupled to the system bus 404 by connections 419. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun SPARCstations, Apple Mac™ or similar computer systems.
[0064] The CBDR method may be implemented using the computer system 400 wherein the processes of Figs. 5-6 and 9-18, to be described, may be implemented as one or more software application programs 433, 469 and so on executable within the computer system 400. In particular, the steps of the CBDR method are effected by instructions 431 (see Fig. 4B) in the software 433 that are carried out within the computer system 400. The software instructions 431 may be formed as one or more code modules, distributed between the server 401, the customer device 472, the trolley 700 and the hand scanner 478 each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules performs the CBDR methods and a second part and the corresponding code modules manage a user interface between the first part and the user.
[0065] The software may be stored in a computer readable medium, including the storage devices described below, for example. The software is loaded into the computer system 400 from the computer readable medium, and then executed by the computer system 400. A computer readable medium having such software or computer program recorded on the computer readable medium is a computer program product. The use of the computer program product in the computer system 400 preferably effects an advantageous CBDR apparatus.
[0066] The software 433 is typically stored in the HDD 410 or the memory 406 (the aforementioned software can be distributed between the HDD 410 or the memory 406 of the server 401 and a memory 470 in the customer device 472). The software is loaded into the computer system 400 from a computer readable medium and executed by the computer system 400. Thus, for example, the software 433 may be stored on an optically readable disk storage medium (e.g., CD-ROM) 425 that is read by the optical disk drive 412. A computer readable medium having such software or computer program recorded on it is a computer program product. The use of the computer program product in the computer system 400 preferably effects a CBDR apparatus.
[0067] In some instances, the application programs 433 may be supplied to the user encoded on one or more CD-ROMs 425 and read via the corresponding drive 412, or alternatively may be read by the user from the networks 420 or 422. Still further, the software can also be loaded into the computer system 400 from other computer readable media. Computer readable storage media refers to any non-transitory tangible storage medium that provides recorded instructions and/or data to the computer system 400 for execution and/or processing. Examples of such storage media include floppy disks, magnetic tape, CD-ROM, DVD, Blu-ray TM Disc, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 401. Examples of transitory or non-tangible computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the computer module 401 include radio or infra-red transmission channels as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.
[0068] The second part of the application programs 433 and the corresponding code modules mentioned above may be executed to implement one or more graphical user interfaces (GUIs) to be rendered or otherwise represented upon the display 414. Through manipulation of typically the keyboard 402 and the mouse 403, a user of the computer system 400 and the application may manipulate the interface in a functionally adaptable manner to provide controlling commands and/or input to the applications associated with the GUI(s). Other forms of functionally adaptable user interfaces may also be implemented, such as an audio interface utilizing speech prompts output via the loudspeakers 417 and user voice commands input via the microphone 480.
[0069] Fig. 4B is a detailed schematic block diagram of the processor 405 and a "memory" 434. The memory 434 represents a logical aggregation of all the memory modules (including the HDD 409 and semiconductor memory 406) that can be accessed by the computer module 401 in Fig. 4A. The following description applies mutatis mutandis to the customer device 472.
[0070] When the computer module 401 is initially powered up, a power-on self-test (POST) program 450 executes. The POST program 450 is typically stored in a ROM 449 of the semiconductor memory 406 of Fig. 4A. A hardware device such as the ROM 449 storing software is sometimes referred to as firmware. The POST program 450 examines hardware within the computer module 401 to ensure proper functioning and typically checks the processor 405, the memory 434 (409, 406), and a basic input-output systems software (BIOS) module 451, also typically stored in the ROM 449, for correct operation. Once the POST program 450 has run successfully, the BIOS 451 activates the hard disk drive 410 of Fig. 4A. Activation of the hard disk drive 410 causes a bootstrap loader program 452 that is resident on the hard disk drive 410 to execute via the processor 405. This loads an operating system 453 into the RAM memory 406, upon which the operating system 453 commences operation. The operating system 453 is a system level application, executable by the processor 405, to fulfil various high-level functions, including processor management, memory management, device management, storage management, software application interface, and generic user interface.
[0071] The operating system 453 manages the memory 434 (409, 406) to ensure that each process or application running on the computer module 401 has sufficient memory in which to execute without colliding with memory allocated to another process. Furthermore, the different types of memory available in the system 400 of Fig. 4A must be used properly so that each process can run effectively. Accordingly, the aggregated memory 434 is not intended to illustrate how particular segments of memory are allocated (unless otherwise stated), but rather to provide a general view of the memory accessible by the computer system 400 and how such is used.
[0072] As shown in Fig. 4B, the processor 405 includes a number of functional modules including a control unit 439, an arithmetic logic unit (ALU) 440, and a local or internal memory 448, sometimes called a cache memory. The cache memory 448 typically includes a number of storage registers 444 - 446 in a register section. One or more internal busses 441 functionally interconnect these functional modules. The processor 405 typically also has one or more interfaces 442 for communicating with external devices via the system bus 404, using a connection 418. The memory 434 is coupled to the bus 404 using a connection 419.
[0073] The application program 433 includes a sequence of instructions 431 that may include conditional branch and loop instructions. The program 433 may also include data 432 which is used in execution of the program 433. The instructions 431 and the data 432 are stored in memory locations 428, 429, 430 and 435, 436, 437, respectively. Depending upon the relative size of the instructions 431 and the memory locations 428-430, a particular instruction may be stored in a single memory location as depicted by the instruction shown in the memory location 430. Alternately, an instruction may be segmented into a number of parts each of which is stored in a separate memory location, as depicted by the instruction segments shown in the memory locations 428 and 429.
[0074] In general, the processor 405 is given a set of instructions which are executed therein. The processor 405 waits for a subsequent input, to which the processor 405 reacts by executing another set of instructions. Each input may be provided from one or more of a number of sources, including data generated by one or more of the input devices 402, 403, data received from an external source across one of the networks 420, 422, data retrieved from one of the storage devices 406, 409 or data retrieved from a storage medium 425 inserted into the corresponding reader 412, all depicted in Fig. 4A. The execution of a set of the instructions may in some cases result in output of data. Execution may also involve storing data or variables to the memory 434.
[0075] The disclosed CBDR arrangements use input variables 454, which are stored in the memory 434 in corresponding memory locations 455, 456, 457. The CBDR arrangements produce output variables 461, which are stored in the memory 434 in corresponding memory locations 462, 463, 464. Intermediate variables 458 may be stored in memory locations 459, 460, 466 and 467.
[0076] Referring to the processor 405 of Fig. 4B, the registers 444, 445, 446, the arithmetic logic unit (ALU) 440, and the control unit 439 work together to perform sequences of micro operations needed to perform "fetch, decode, and execute" cycles for every instruction in the instruction set making up the program 433. Each fetch, decode, and execute cycle comprises:
• a fetch operation, which fetches or reads an instruction 431 from a memory location 428, 429, 430;
• a decode operation in which the control unit 439 determines which instruction has been fetched; and
• an execute operation in which the control unit 439 and/or the ALU 440 execute the instruction.
[0077] Thereafter, a further fetch, decode, and execute cycle for the next instruction may be executed. Similarly, a store cycle may be performed by which the control unit 439 stores or writes a value to a memory location 432.
[0078] Each step or sub-process in the processes of Figs. 5-6 and 9-18 is associated with one or more segments of the program 433 and is performed by the register section 444, 445, 447, the ALU 440, and the control unit 439 in the processor 405 working together to perform the fetch, decode, and execute cycles for every instruction in the instruction set for the noted segments of the program 433.
CBDR ARRANGEMENT TOP LEVEL PROCESSES
[0079] Figs. 5A, 5B and 5C depict respective flow charts for example processes for preparing a store (involving a store preparation process, an image capture process, and an image processing process), a shopping process seen from a shopper perspective, and a shopping process from the CBDR server perspective according to the disclosed CBDR arrangement.
[0080] Fig. 5A is a flow chart for an example process 500 for setting up the store in question for the CBDR arrangement. The process 500 has three process elements relating to a store preparation process 528 (described hereinafter in more detail with reference to Figs. 6A, 6B), an image capture process 507 (described hereinafter in more detail with reference to Figs. 7, 8A, 8B and 9), and an image processing process 512 (described hereinafter in more detail with reference to Figs. 10A, 10B and 10C).
[0081] The store preparation process 528 (performed by the CBDR server 401 executing the CBDR software application 433 and the employee hand scanner 478 executing CBDR scanner software (not shown)) needs to be performed on a regular basis, to ensure that products are displayed neatly on each shelf above their respective product label, and that all products have associated product group location codes PGLCs properly affixed. Accordingly, the CBDR server may typically arrange for the store preparation process 528 to be performed according to a pre-determined schedule, or after special events such as sales which result in movement of products on the shelves, by sending suitable notifications to relevant store employees.
[0082] The image capture process 507 (performed by the CBDR server 401 executing the CBDR software application 433 and the computer-controlled camera trolley 700 executing CBDR trolley software (not shown)) similarly needs to be performed on a regular basis, to ensure that products as displayed in the panoramic image presented to customers, and as displayed in the product images 200, accurately represent the actual product layouts on the store shelves. This is necessary to ensure that out-of-stock items, items which are no longer carried, new products and so on are faithfully displayed in order to maintain customer satisfaction. Accordingly, the CBDR server may typically arrange for the image capture process 507 to be performed according to a pre-determined schedule, or after special events such as sales, by directing a computer-controlled camera trolley 700 (described hereinafter in more detail with reference to Figs. 7, 8A, 8B and 9) to perform the image capture process. The CBDR server would also typically arrange for the image processing process 512 (performed by the CBDR server 401 executing the CBDR software application 433) to be performed (described hereinafter in more detail with reference to Figs. 10A, 10B and 10C) after an image capture cycle.
[0083] The process 500 includes a decision step 501 in which the server 401, based upon elapsed time or a pre-programmed schedule, determines if store preparation is required (this relates to the tidying up of the shelves and affixing product group location codes). If the step 501 returns a TRUE value, then the process 500 follows a YES arrow 502 from 501 to the step 528 (described hereinafter in more detail with reference to process fragments 528' and 528" in Figs. 6A, 6B respectively), which results in product group location codes PGLCs being affixed (ie 527), and the process 500 then follows an arrow 503 back to the decision step 501.
[0084] If the step 501 returns a FALSE value, then the process 500 follows a NO arrow 504 from 501 to a decision step 505 in which the CBDR server determines, based upon elapsed time or a pre-programmed schedule, if image capture is required. If the step 505 returns a TRUE value, then the process 500 follows a YES arrow 506 from 505 to the step 507 (described hereinafter in more detail with reference to Figs. 7, 8A, 8B and 9), which results in captured images being stored in a folder 1 (ie 529), and the process 500 then follows an arrow 508 back to the decision step 505. Each captured image is stored in a memory module 476 (also referred to as folder 1) along with the aisle number ANO in which the image was captured, the identifying number CID of the camera which captured the image, and a sequence number ISNO in the sequence of images taken by the camera in question in the aisle in question. The data structure consisting of the three parameters {ANO, CID, ISNO} is referred to as the product image identifier PII. If the step 505 returns a FALSE value, then the process 500 follows a NO arrow 509 from 505 to a decision step 510 in which the CBDR server determines, based upon elapsed time or a pre-programmed schedule, if image processing is required. If the step 510 returns a TRUE value, then the process 500 follows a YES arrow 511 from 510 to the step 512 (described hereinafter in more detail with reference to Figs. 10A, 10B and 10C), which results in bounding box data being determined from the product group location codes PGLCs affixed in the process 528 and stored in the bounding box database (ie 477). The process 500 then follows an arrow 513 back to the decision step 510. If the step 510 returns a FALSE value, then the process 500 follows an arrow 514 from 510 back to the step 501.
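As an illustration of the product image identifier PII only (the class name and storage layout below are assumptions, not part of this disclosure):

# Sketch of the product image identifier PII = {ANO, CID, ISNO}.
from dataclasses import dataclass

@dataclass(frozen=True)
class ProductImageIdentifier:
    ano: int   # aisle number in which the image was captured
    cid: int   # identifying number of the camera that captured the image
    isno: int  # sequence number of the image for that camera in that aisle

# A captured image stored in the memory module 476 ("folder 1") could then be
# keyed by its identifier, e.g.:
# folder1[ProductImageIdentifier(ano=3, cid=2, isno=17)] = captured_image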
[0085] Fig. 5B is a flow chart for an example process 515 for performing a customer shopping process by the customer according to the CBDR arrangement. The process 515 has a decision step 516 in which the customer determines whether to initiate a shopping session according to the CBDR arrangement. If this is the case, then the process 515 follows a YES arrow 517 from 516 to 518 (described hereinafter in more detail with reference to Fig. 11 and performed by the CBDR server 401 executing the CBDR software application 433 and the customer digital device 472 executing the CBDR software 469). After conclusion of the shopping session the process 515 follows an arrow 519 back from 518 to the step 516. If a shopping session is not required, the process 515 follows a NO arrow 520 from the step 516 back to the step 516.
[0086] Fig. 5C is a flow chart for an example process 521 for performing a customer shopping process by the CBDR server according to the CBDR arrangement. The process 521 has a decision step 522 in which the CBDR server determines if a shopping session has been initiated by a customer. If this is the case, then the process 521 follows a YES arrow 523 from 522 to a step 1200 (described hereinafter in more detail with reference to Fig. 12 and performed by the CBDR server 401 executing the CBDR software application 433 and the customer digital device 472 executing the CBDR software 469). After conclusion of the shopping session the process 521 follows an arrow 525 back to the step 522. If a shopping session is not required, the process 521 follows a NO arrow 526 from the step 522 back to the step 522.
CBDR ARRANGEMENT STORE PREPARATION
[0087] Fig. 6A is a flow chart for an example process fragment 528' by which the store employee performs the store preparation process 528 (see Figs. 13A and 13B for an alternate/complementary store preparation process). In a decision step 601 the store employee decides if all aisles have been processed. The store employee makes use of the portable digital hand-held scanner 478 which can communicate with the CBDR server 401. The scanner 478 enables the employee to enter an "aisle processed" command after each aisle is processed according to the store preparation process, thereby enabling him to keep track of what has been done. If all aisles have been processed, then the process 528' follows a YES arrow 616 from 601 back to 501 in Fig. 5A. If however all aisles have not been processed, then the process 528' follows a NO arrow 602 from the step 601 to a step 603 in which the store employee identifies and goes to the next aisle by querying the CBDR server to determine which aisles have not yet been processed.
[0088] The process 528' then follows an arrow 604 from the step 603 to a decision step 605 in which the store employee determines if all product groups PGs in the aisle have been processed by properly affixing the relevant product group location codes PGLCs. This is a manual process requiring the employee to visually inspect each product group PG to determine if the relevant product group location codes PGLCs have been affixed. If this is the case, the process 528' follows a YES arrow 615 from 605 back to the decision step 601. If however the step 605 returns a FALSE value (which occurs if the employee finds one or more product groups PGs in the aisle being processed without the relevant product group location codes PGLCs), then the process 528' follows a NO arrow 606 from 605 to a step 607 in which the store employee visually finds the next product group PG which requires that a relevant product group location code PGLC be affixed. The process 528' then follows an arrow 608 from 607 to a step 609 in which the employee visually identifies a product on the shelf in the product group. The process 528' then follows an arrow 610 to 631 in Fig. 6B.
[0089] Fig. 6B depicts a flowchart fragment 528" which is entered via the arrow 610 from 609 in Fig. 6A and is directed to a step 631. In 631 the employee scans a product bar code (PBC) on one of the products in the product group PG in question using the hand scanner 478, and the process then follows an arrow 632 from 631 to 633 in which the employee confirms that the product description provided by the hand scanner matches the actual product on the shelf. If this is not the case, the employee reports this via the hand scanner and moves to the next unprocessed product group (not shown). The process 528" then follows an arrow 634 from 633 to 635. In the step 635, if the product description provided by the hand scanner matches the actual product, the employee confirms the match to the scanner 478, and the scanner then asks the employee to indicate whether the product group location code PGLC to be affixed should be affixed at the boundary (see 304 in Fig. 3) or the centre (see 316 in Fig. 3). The employee makes this decision based upon her visual inspection of the product group in question and guidelines provided in this regard by the store, and enters this decision into the hand scanner 478.
[0090] If the employee decides that the product group location code PGLC is to be affixed in the centre, the process 528" then follows a "C" arrow 636 from 635 to 637 in which the scanner asks the employee to specify whether the product group location code PGLC is to be affixed above the product group PG (as in 316 in Fig. 3) or below the product group (as in 325 in Fig. 3), and the employee enters his selection into the hand scanner 478. The process 528" then follows an arrow 638 from 637 to a decision step 639 in which the employee interrogates the CBDR server using the hand scanner 478 to determine if the dimensions of the product (ie PH and PW) making up the product group in question, as well as the parameters Qty-Wide and Qty-High for the product group PG in question, are already stored in the product database 490.
[0091] If the scanner indicates that this information is stored in the product database 490, then the process 528" follows a "Y" arrow 640 from 639 to a decision step 641 in which the employee manually checks the product dimensions PH and PW, as well as the parameters Qty-Wide and Qty-High, provided by the hand scanner 478 against manual measurement of the product dimensions PH and PW and visual inspection of the product group PG. If the measurements and the parameters match, then the employee confirms this to the scanner 478 and the process 528" follows a "Y" arrow 646 from 641 to 647, where the scanner 478 prints the centreline product group location code CPGLC. The printed centreline product group location code CPGLC contains, in one example, a prefix "CU" or "CL" to denote whether the CPGLC is to be located above or below the product group PG in question, the height PGH of the product group PG being PH * Qty-High, the width PGW of the product group PG being PW * Qty-Wide, and a unique product group identifier PGID.
[0092] The process 528" then follows an arrow 648 to a step 649 in which the employee visualizes the product group envelope PGE for the product group PG in question and affixes the centreline product group location code CPGLC at a centre location within the product group envelope PGE, above or below the product group PG in question as specified by the corresponding prefix "CU" or "CL" in the step 637. The process 528" then follows an arrow 614 to 607 in Fig. 6A. If however at 641 the measurements and parameters do not match, then the process 528" follows a "N" arrow 642 from 641 to 644 in which the employee measures the product dimensions PH and PW and enters these measurements, as well as the parameters Qty-Wide and Qty-High if necessary, into the scanner 478. The process 528" then follows an arrow 645 to the step 647. The (width) * (height) dimensions of the product group PG printed on the CPGLC are thus (PW * Qty-Wide) * (PH * Qty-High).
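As a purely illustrative sketch (not part of the described arrangement), the information printed on a centreline product group location code CPGLC could be assembled as follows; the Python function and field names are hypothetical.

    def make_cpglc_payload(prefix: str, ph_cm: float, pw_cm: float,
                           qty_high: int, qty_wide: int, pgid: str) -> dict:
        # prefix "CU" (centre, upper) or "CL" (centre, lower) per the step 637
        assert prefix in ("CU", "CL")
        pgh = ph_cm * qty_high   # product group height PGH = PH * Qty-High
        pgw = pw_cm * qty_wide   # product group width PGW = PW * Qty-Wide
        return {"prefix": prefix, "PGH_cm": pgh, "PGW_cm": pgw, "PGID": pgid}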
[0093] Returning to the step 635, the scanner asks the employee to indicate whether the product group location code should be affixed at the boundary (see 304 in Fig. 3) or the centre (see 316 in Fig. 3) of the product group envelope PGE. The employee makes this decision based upon her visual inspection of the product group in question and guidelines provided in this regard by the store and enters this decision into the hand scanner 478.
[0094] If the employee decides that the product group location code is to be affixed at the boundary, the process 528" then follows a "B" arrow 650 from 635 to 651 in which the scanner asks the employee to specify whether the boundary product group location code BPGLC is to be affixed above the product group PG (as in 316 in Fig. 3) or below the product group PG (as in 325 in Fig. 3), and the employee enters his selection into the hand scanner 478. The process 528" then follows an arrow 652 from 651 to a decision step 653 in which the employee interrogates the CBDR server using the hand scanner 478 to determine if the dimensions PH and PW of the product (eg 314 in Fig. 3) in the product group in question (ie the product group PG in the dashed ellipse 327) and the parameters Qty-Wide and Qty-High are already stored in the product database 490.
[0095] If the scanner indicates that the product height PH is stored in the product database 490, then the process 528" follows a "Y" arrow 654 from 653 to a decision step 655 in which the employee manually checks the product height PH provided by the hand scanner 478 against manual measurement of the product height PH. If the measurements match, then the employee confirms this to the scanner 478 and the process 528" follows a "Y" arrow 656 from 655 to 657, where the scanner requests the Qty-High parameter for the product group in question (ie Qty-High is 4 for the product group 327) and the employee enters this into the hand scanner. Thereafter the CBDR server determines the height PGH of the product group, which is PH * Qty-High.
[0096] The process 528" then follows an arrow 662 from 657 to a decision step 663 in which the scanner asks the employee if there is an adjacent product group (on the right-hand side, for example) on the same shelf (eg the product group 328 is adjacent on the right-hand side of the product group 319). If this is the case, the employee indicates this on the scanner 478 and the process 528" follows a "Y" arrow 666 from 663 to 647 in which the scanner 478 prints the boundary product group location code BPGLC. In this event, ie where there is an adjacent product group on the same shelf, the printed boundary product group location code BPGLC contains, in one example, a prefix "CU" or "CL" to denote whether the PGLC is to be located above or below the product group PG in question, the height PGH of the product group PG being PH * Qty-High, and a unique product group identifier PGID. The width PGW of the product group is not printed on the boundary product group location code BPGLC in this case because this will be determined by the image processing process 512 having regard to the adjacent product group on the same shelf. Accordingly, the (width) * (height) dimensions of the product group PG printed on the BPGLC are thus (not printed) * (PH * Qty-High).
[0097] Returning to the step 663, if there is no adjacent product group PG on the same shelf, then the employee enters this on the scanner 478, and the process 528" follows a "N" arrow 664 to a step 665 in which the employee visualizes the product group envelope PGE for the product group PG in question and enters the product group envelope width PGEW into the scanner 478. The process 528" then follows the arrow 603 from 665 to 647 in which the scanner 478 prints the boundary product group location code BPGLC. In this event, ie where there is no adjacent product group on the same shelf, the printed boundary product group location code BPGLC contains, in one example, a prefix "CU" or "CL" to denote whether the PGLC is to be located above or below the product group PG in question, the height PGH of the product group PG being PH * Qty-High, the measured product group envelope width PGEW, and a unique product group identifier PGID. The (width) * (height) dimensions of the product group PG printed on the BPGLC are thus (PGEW) * (PH * Qty-High).
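A corresponding hypothetical sketch for the boundary product group location code BPGLC, in which the width is printed only when there is no adjacent product group on the same shelf, might look as follows (illustrative names only):

    def make_bpglc_payload(prefix: str, ph_cm: float, qty_high: int, pgid: str,
                           has_adjacent_group: bool, pgew_cm=None) -> dict:
        payload = {"prefix": prefix, "PGH_cm": ph_cm * qty_high, "PGID": pgid}
        if not has_adjacent_group:
            # no adjacent product group: print the measured envelope width PGEW
            payload["PGEW_cm"] = pgew_cm
        return payload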
IMAGE CAPTURE
[0098] Fig. 7 depicts a computer-controlled camera trolley 700 for capturing the sets of images according to the process depicted in Fig. 5A in one CBDR example. A platform 709 is supported on wheels 710, 719. The wheels 710, 719 are powered by a driving mechanism and processor (not shown) which are controlled, as depicted by a dashed line 475 in Fig. 4A, by the CBDR software 433 executing on the CBDR server processor 405. Three opposing sets of image-capturing devices 708, 718 and 706, 716 and 704, 714, controlled, as depicted by a dashed line 475 in Fig. 4A, by the CBDR software 433 executing on the server processor 405, are attached to a central pillar 701. The three opposing sets of image-capturing devices 708, 718 and 706, 716 and 704, 714 are directed sideways as depicted by respective arrows 707, 717 and 705, 715 and 703, 713. A further image capturing device 712, controlled, as depicted by a dashed line 475 in Fig. 4A, by the CBDR software 433 executing on the server processor 405, is attached to the central pillar 701 and is directed forward as depicted by an arrow 702. A further image capturing device (not visible), controlled, as depicted by a dashed line 475 in Fig. 4A, by the CBDR software 433 executing on the server processor 405, is attached to the central pillar 701 and is directed backwards as depicted by an arrow 711.
[0099] Figs. 8A and 8B depict examples of a guidance arrangement that can be used to guide the trolley 700 of Fig. 7 through a store in order to capture product images such as that shown in Fig. 2 according to a process as depicted in Fig. 9.
[00100] Fig. 8A depicts a CBDR trolley guidance arrangement store shelving layout 800 from above, the layout comprising seven sets of shelves 805, 807, 818, 823, 824, 829, and 834 spaced apart by aisles 1 - 8. The trolley 700, when not in image capture mode, is positioned at a "home" location indicated by a camera trolley home marker CTHM 801 which is stuck or painted on the floor. The trolley 700 has a sensing mechanism on its underside (not shown) comprising a code reader configured to (i) read markers (eg CTSC), typically implemented using QR codes, which are stuck or painted on the floor of the store in order to guide and control movement of the trolley, and (ii) follow a line 841 which is stuck onto or painted onto the floor as a guide. A number of markers are used in the described example, including CTSC (a marker indicating the start of an aisle), CTEC (a marker indicating the end of an aisle), CTTC (a marker indicating that the trolley should make a right hand turn) and CTHM (a marker indicating the location of the trolley at rest).
[00101] As described hereinafter in more detail with reference to Fig. 9 when the CBDR server processor 405 decides that a new image capture cycle 507 should be initiated, the trolley is directed to move from the home marker 801 along the line 841. The trolley 700 reaches and reads a camera trolley start code CTSC 802 which incorporates data indicating the start of aisle 1. The trolley proceeds along the line in aisle 1, periodically capturing sets of images, and continues until it reaches and reads a camera trolley end code CTEC 804 which incorporates data indicating the end of aisle 1 at which point the trolley 700 ceases image capture. The trolley follows the line 841 around the corner 842 until it reaches and reads a camera trolley start code CTSC 806 which incorporates data indicating the start of aisle 2. The trolley 700 proceeds along the line in aisle 2, periodically capturing sets of images, and continues until it reaches and reads a camera trolley end code CTEC 809 which incorporates data indicating the end of aisle 2 at which point the trolley 700 ceases image capture.
[00102] The trolley 700 follows the line around a corner 843 until it reaches and reads a camera trolley start code CTSC 810 which incorporates data indicating the start of aisle 9. The trolley 700 proceeds along the line in aisle 9, periodically capturing sets of images, and continues until it reaches and reads a camera trolley turn code CTTC 811 which incorporates data indicating that the trolley should make a right hand turn around a corner 844 which the trolley does until it reaches and reads a camera trolley start code CTSC 812 which incorporates data indicating the start of aisle 3. The trolley 700 proceeds along the line in aisle 3, periodically capturing sets of images, and continues until it reaches and reads a camera trolley end code CTEC 814 which incorporates data indicating the end of aisle 3 at which point the trolley 700 ceases image capture.
[00103] The trolley follows the line around a corner 845 until it reaches and reads a camera trolley end code CTEC 804 which the trolley ignores because it has encountered this code previously in the present image capture cycle. The trolley follows the line around the corner 842 until it reaches and reads the camera trolley start code CTSC 806 which the trolley ignores because it has encountered this code previously in the present image capture cycle. The trolley 700 proceeds along the line in aisle 2, without capturing sets of images because it has already done so in the present cycle and continues until it reaches and reads a camera trolley end code CTEC 809 which the trolley ignores because it has encountered this code previously in the present image capture cycle. The trolley follows the line around the corner 843 until it reaches and reads the camera trolley start code CTSC 810 which the trolley ignores because it has encountered this code previously in the present image capture cycle. The trolley 700 proceeds along the line in aisle 9, without capturing sets of images because it has already done so in the present image capture cycle and continues until it reaches and reads the camera trolley turn code CTTC 811 which the trolley 700 ignores because it has previously encountered this code in the present cycle. The trolley thus follows the line 846, periodically capturing sets of images because it has not done so in the present cycle, until it reaches and reads a camera trolley turn code CTTC 815 which incorporates data indicating that the trolley should make a right hand turn around a corner 847 which the trolley does until it reaches and reads a camera trolley start code CTSC 816 which incorporates data indicating the start of aisle 4.
[00104] The trolley continues in this manner with respect to aisles 4, 5, 6 and 7 until the trolley reaches the camera trolley end code CTEC 836 at which point it ceases image capture. The trolley follows the line around a corner 848 until it reaches and reads a camera trolley start code CTSC 837 which incorporates data indicating the start of aisle 8. The trolley 700 proceeds along the line in aisle 8, periodically capturing sets of images, and continues until it reaches and reads a camera trolley end code CTEC 839 which incorporates data indicating the end of aisle 8 at which point the trolley 700 ceases image capture. The trolley continues along the line and encounters a camera trolley turn code CTTC 840 which incorporates data indicating that the trolley should make a left hand turn around a corner 849 which the trolley does until it reaches and reads the camera trolley home marker CTHM 801 which incorporates data indicating the home location, at which point the trolley ceases image capture and powers down, waiting for the next image capture cycle.
[00105] Fig. 8B depicts an alternate CBDR trolley guidance arrangement store shelving layout 850 from above, the layout comprising seven sets of shelves 855, 857, 863, 866, 869, 872 and 875 spaced apart by aisles 1 - 8. The trolley 700, when not in image capture mode, is positioned at a "home" location indicated by a camera trolley home marker CTHM 852 which is stuck or painted on the floor.
[00106] When the CBDR server processor 405 decides that a new image capture cycle 507 should be initiated, the trolley is directed to move from the home marker 852 along a line 853. The trolley 700 reaches and reads a camera trolley start code CTSC 854 which incorporates data indicating the start of aisle 1. The trolley proceeds along the line in aisle 1, periodically capturing sets of images, and continues until it reaches and reads a camera trolley end code CTEC 856 which incorporates data indicating the end of aisle 1 at which point the trolley 700 ceases image capture. The trolley follows the line around a corner 878 until it reaches and reads a camera trolley start code CTSC 858 which incorporates data indicating the start of aisle 2. The trolley 700 proceeds along the line in aisle 2, periodically capturing sets of images, and continues until it reaches and reads a camera trolley end code CTEC 859 which incorporates data indicating the end of aisle 2 at which point the trolley 700 ceases image capture.
[00107] The trolley follows the line around a corner 879 until it reaches and reads a camera trolley start code CTSC 860 which incorporates data indicating the start of aisle 3. The trolley 700 proceeds along the line in aisle 3, periodically capturing sets of images, and continues until it reaches and reads a camera trolley end code CTEC 861 which incorporates data indicating the end of aisle 3 at which point the trolley 700 ceases image capture. The trolley follows the line around a corner 880 until it reaches and reads a camera trolley start code CTSC 862 which incorporates data indicating the start of aisle 4. The trolley 700 proceeds along the line in aisle 4, periodically capturing sets of images, and continues until it reaches and reads a camera trolley end code CTEC 864 which incorporates data indicating the end of aisle 4 at which point the trolley 700 ceases image capture.
[00108] The trolley follows the line around a corner 881 until it reaches and reads a camera trolley start code CTSC 865 which incorporates data indicating the start of aisle 5. The trolley 700 proceeds along the line in aisle 5, periodically capturing sets of images, and continues until it reaches and reads a camera trolley end code CTEC 867 which incorporates data indicating the end of aisle 5 at which point the trolley 700 ceases image capture. The trolley follows the line around a corner 882 until it reaches and reads a camera trolley start code CTSC 868 which incorporates data indicating the start of aisle 6. The trolley 700 proceeds along the line in aisle 6, periodically capturing sets of images, and continues until it reaches and reads a camera trolley end code CTEC 870 which incorporates data indicating the end of aisle 6 at which point the trolley 700 ceases image capture.
[00109] The trolley follows the line around a corner 883 until it reaches and reads a camera trolley start code CTSC 871 which incorporates data indicating the start of aisle 7. The trolley 700 proceeds along the line in aisle 7, periodically capturing sets of images, and continues until it reaches and reads a camera trolley end code CTEC 873 which incorporates data indicating the end of aisle 7 at which point the trolley 700 ceases image capture. The trolley follows the line around a corner 884 until it reaches and reads a camera trolley start code CTSC 874 which incorporates data indicating the start of aisle 8. The trolley 700 proceeds along the line in aisle 8, periodically capturing sets of images, and continues until it reaches and reads a camera trolley end code CTEC 876 which incorporates data indicating the end of aisle 8 at which point the trolley 700 ceases image capture. The trolley follows a line 877 back to the camera trolley home marker CTHM 852.
[00110] Fig. 9 is a flow chart for an example process 507' used to guide the trolley of Fig. 7 according to the store layout of Fig. 8A (see Figs. 18A and 18B for alternate / complementary store preparation processes). The process 507' enters with an arrow 506 from 505 in Fig. 5A and then commences with a step 901 in which the CBDR server decides the list of aisles for which images are to be captured. This would typically be all the aisles, and the capture process would be performed periodically. However, in periods during which special events occur, such as seasonal clearances, more frequent image capture cycles may be called for. The process 507' follows an arrow 902 from 901 to 903 in which the CBDR server 401 directs the trolley 700 to follow the guideline 841 from the camera trolley home marker CTHM 801 and to look for the next marker. The process 507' follows an arrow 904 from 903 to a decision step 905 in which the processor and sensing system (not shown) of the trolley determines if a marker has been found. If this is not the case, the process 507' follows a "N" arrow 906 from 905 back to 903.
[00111] If however a marker has been found, then the process 507' follows a "Y" arrow 907 from 905 to a step 908. In the step 908 the trolley determines if the marker is a camera trolley start code CTSC. If this is the case, the process 507' follows a "Y" arrow 909 from 908 to 910 in which the trolley determines if this marker is the next marker on the list. If this is the case, the process 507' follows a "Y" arrow 911 from 910 to 912 which directs all the cameras on the trolley 700 to commence capturing and storing images in folder 1 (ie 529). The process 507' then follows an arrow 913 from 912 to 914 which directs the trolley to follow the guideline 841, periodically capturing sets of images, looking for the next aisle marker. The process 507' then follows an arrow 915 from 914 to 916 in which the trolley determines if a next marker has been found.
[00112] If this is the case, then the process 507' follows a "Y" arrow 917 from 916 to 918 in which the trolley determines if the marker is a camera trolley end of aisle code CTEC. If this is not the case, then the process 507' follows a "N" arrow 919 from 918 back to 912. If however the marker is a camera trolley end of aisle code CTEC, then the process 507' follows a "Y" arrow 920 back to the step 903.
[00113] Returning to the step 908, if the trolley determines that the marker is not a camera trolley start code CTSC, then the process 507' follows a "N" arrow 921 from 908 to a decision step 922. In the step 922 the trolley determines if the marker is a camera trolley turn code CTTC. If this is not the case, then the process 507' follows a "N" arrow 923 from 922 to a decision step 924. In the step 924 the trolley determines if the marker is a camera trolley end of cycle code CTECC. If this is not the case, the process 507' follows a "N" arrow 925 from 924 to 903. If however the marker is a camera trolley end of cycle code CTECC, the process 507' follows a "Y" arrow 935 from 924 to a step 936 which directs the trolley to follow the guideline and proceed to the camera trolley home marker CTHM 801. The process 507' then follows an arrow 508 to 505 in Fig. 5A.
[00114] Returning to the decision step 922, if the step determines that the marker is a camera trolley turn code CTTC, then the process 507' follows a "Y" arrow 926 from 922 to 927. In the step 927 the trolley determines if the turn aisle is next on the list. If this is the case, then the process 507' follows a "Y" arrow 928 from 927 to 929. In the step 929 the trolley follows the guideline to the right and the process 507' then follows an arrow 930 back to the step 903. If the decision step 927 returns a FALSE value, then the process 507' follows a "N" arrow 933 back to 903.
[00115] Returning to the decision step 910, if the step returns a FALSE value, then the process 507' follows a "N" arrow 931 back to the step 903.
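The marker-driven behaviour of the process 507' can be summarised by the following simplified sketch, in which marker reading, guideline following and camera control are abstracted behind hypothetical callables; turn handling and suppression of markers already seen in the present cycle are omitted for brevity, and the names are illustrative only.

    def run_capture_cycle(aisles_to_capture, read_next_marker, follow_guideline,
                          start_capture, stop_capture, go_home):
        while True:
            follow_guideline()                                     # move along the line until a marker is read
            kind, _, aisle = read_next_marker().partition(":")     # eg "CTSC:3", "CTEC:3", "CTECC"
            if kind == "CTSC" and aisle and int(aisle) in aisles_to_capture:
                aisles_to_capture.remove(int(aisle))               # capture each listed aisle once per cycle
                start_capture()                                    # store image sets in folder 1 while in this aisle
            elif kind == "CTEC":
                stop_capture()
            elif kind == "CTECC":
                go_home()                                          # return to the home marker CTHM
                return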
IMAGE PROCESSING
[00116] Figs. 10A, 10B and 10C are flow charts for an example process for performing image processing of the product images captured by the trolley of Fig. 7.
[00117] Fig. 10A is a flow chart fragment 512' which is entered via an arrow 511 from 510 in Fig. 5A and which is directed to a step 1006 in which the CBDR server finds the next (unprocessed) product image from folder 1 (ie 529). If all images have been processed, then folder 1 is empty. The process 512' then follows an arrow 1005 to a decision step 1004 in which the CBDR server determines if all product images have been processed. If this is the case, the process 512' then follows an arrow 513 from 1004 to 510 in Fig. 5A. If however the step 1004 returns a FALSE value, then the process 512' follows a "NO" arrow 1007 from 1004 to a decision step 1008 in which the CBDR server determines if all product group location codes PGLCs have been processed. If this is the case, the process 512' follows a "YES" arrow 1015 from 1008 to a step 1002 in which the CBDR server moves the captured image from the folder 1 (ie 529) to a folder 2 (1001). The process 512' then follows an arrow 1003 from 1002 to the step 1006. If however the step 1008 returns a FALSE value, then the process 512' follows an arrow 1009 from 1008 to a step 1010 in which the CBDR server finds the next product group location code PGLC. The process 512' then follows an arrow 1011 from 1010 to a decision step 1012 in which the CBDR server determines if the PGLC has been affixed according to a centreline process (per the step 635 in Fig. 6B). If this is the case, then the process 512' follows a "YES" arrow 1013 to 1050 in Fig. 10C. If however the step 1012 returns a FALSE value, then the process 512' follows a "NO" arrow 1014 to 1025 in Fig. 10B.
[00118] Fig. 10B is a flow chart fragment 512" (for processing boundary product group location codes BPGLCs) which is entered via an arrow 1014 from 1012 in Fig. 10A, and which is directed to a step 1025 in which the CBDR server determines the location of the PGLC (eg by finding 303 or 320 in Fig. 3), which is a Boundary Product Group Location Code BPGLC (see Figs. 14 and 15 for alternate / complementary store preparation processes). If the BPGLC is implemented as a QR code, then open source "zebra crossing" (ZXing) code can be used to implement the step 1025. The process 512" then follows an arrow 1026 from the step 1025 to a step 1027 in which the CBDR server directly measures the pixel height PGLCPXH and the pixel width PGLCPXW of the boundary product group location code BPGLC in the product image PI being processed according to the step 1006 in Fig. 10A.
[00119] In the step 1027 the CBDR server also determines a conversion factor PHYPXH for converting from physical height units to pixel height units using the previously stored product group location code physical height PGLCPHH and the now measured boundary product group location code pixel height PGLCPXH, so that PHYPXH = PGLCPXH / PGLCPHH. The CBDR server similarly determines a conversion factor PHYPXW for converting from physical width units to pixel width units using the previously stored boundary product group location code physical width PGLCPHW and the now measured boundary product group location code pixel width PGLCPXW, so that PHYPXW = PGLCPXW / PGLCPHW.
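For illustration only (hypothetical names), the two conversion factors reduce to simple ratios; for example, a code measuring 120 x 90 pixels in the image with a physical size of 4 cm x 3 cm gives 30 pixels per cm in each direction:

    def physical_to_pixel_factors(pglcpxh: float, pglcpxw: float,
                                  pglcphh_cm: float, pglcphw_cm: float):
        phypxh = pglcpxh / pglcphh_cm   # PHYPXH = PGLCPXH / PGLCPHH
        phypxw = pglcpxw / pglcphw_cm   # PHYPXW = PGLCPXW / PGLCPHW
        return phypxh, phypxw

    print(physical_to_pixel_factors(120, 90, 4, 3))   # (30.0, 30.0)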
[00120] The process 512" then follows an arrow 1028 from 1027 to 1029 in which the CBDR server determines the boundary product group bounding box height in pixel units (PGBBPH) using the information printed on the boundary product group location code BPGLC.
[00121] For the boundary product group location code case, if there is an adjacent product group on the same shelf, the printed boundary product group location code BPGLC contains, in one example, a prefix "CU" or "CL" to denote whether the PGLC is to be located above or below the product group PG in question, the height PGH of the product group PG being PH * Qty-High, and a unique product group identifier PGID. In this case the product group bounding box height in pixel units PGBBPH = PGH * PHYPXH.
[00122] For the boundary product group location code case, if there is no adjacent product group on the same shelf, the printed boundary product group location code BPGLC contains, in one example, a prefix "CU" or "CL" to denote whether the PGLC is to be located above or below the product group PG in question, the height PGH of the product group PG being PH * Qty-High, the measured product group envelope width PGEW, and a unique product group identifier PGID. In this case the boundary product group bounding box height in pixel units PGBBPH = PGH * PHYPXH.
[00123] The process 512" then follows an arrow 1030 from 1029 to a decision step 1031 in which the CBDR server determines if the detected boundary product group location code BPGLC contains the product group bounding box width PGBBW. If this is the case, then the process 512" follows a "Y" arrow 1032 from 1031 to 1033 in which the CBDR server determines the boundary product group bounding box pixel width PGBBPW as PGBBPW = PGBBW * PHYPXW. The process 512" then follows an arrow 1034 from 1033 to 1035 in which the CBDR server stores the product image identifier PII, the boundary product group identifier PGID, the boundary product group bounding box pixel height and width PGBBPH and PGBBPW, and the product group location code location coordinates PGLCX and PGLCY in the bounding box data memory module 477. The process 512" then follows an arrow 1016 from 1035 to 1008 in Fig. 10A.
[00124] Returning to the step 1031, if the step returns a FALSE value, then the process 512" follows a "N" arrow 1036 from 1031 to a step 1037 in which the CBDR server determines the location of the immediately adjacent boundary product group location code BPGLC. The process 512" then follows an arrow 1038 from 1037 to a step 1039 in which the CBDR server determines the boundary product group bounding box pixel width PGBBPW from the product group bounding box width PGBBW (eg 307 in Fig. 3, which is determined by a distance 307 between the location 303 of the boundary product group location code BPGLC 304 and a location 308 of an adjacent boundary product group location code BPGLC 309) and the conversion factor for converting from physical width units to pixel width units PHYPXW, whereby PGBBPW = PGBBW * PHYPXW.
[00125] The process 512" then follows an arrow 1040 from 1039 to 1035.
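The two boundary cases of Fig. 10B can be summarised by the following illustrative sketch (hypothetical names; the distance to the adjacent code is taken in the same physical units as the printed dimensions):

    def boundary_bbox_pixels(pgh_cm, phypxh, phypxw,
                             printed_pgbbw_cm=None, distance_to_adjacent_code_cm=None):
        pgbbph = pgh_cm * phypxh                           # PGBBPH = PGH * PHYPXH
        pgbbw_cm = (printed_pgbbw_cm if printed_pgbbw_cm is not None
                    else distance_to_adjacent_code_cm)     # width printed, or taken from the adjacent BPGLC
        pgbbpw = pgbbw_cm * phypxw                         # PGBBPW = PGBBW * PHYPXW
        return pgbbph, pgbbpw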
[00126] Fig. 10C is a flow chart fragment 512''' (for processing centreline product group location codes CPGLCs) which is entered via an arrow 1013 from 1012 in Fig. 10A, and which is directed to a step 1050 in which the CBDR server determines the location of the CPGLC (eg by finding 303 or 320 in Fig. 3), which is a Centreline Product Group Location Code CPGLC (see Figs. 16 and 16 for alternate / complementary store preparation processes). If the CPGLC is implemented as a QR code, then open source "zebra crossing" (ZXing) code can be used to implement the step 1050. The process 512''' then follows an arrow 1051 from the step 1050 to a step 1052 in which the CBDR server directly measures the pixel height PGLCPXH and the pixel width PGLCPXW of the centreline product group location code CPGLC in the product image PI being processed according to the step 1006 in Fig. 10A.
[00127] In the step 1052 the CBDR server also determines a conversion factor PHYPXH for converting from physical height units to pixel height units using the previously stored centreline product group location code physical height PGLCPHH and the now measured product group location code pixel height PGLCPXH, so that PHYPXH = PGLCPXH / PGLCPHH. The CBDR server similarly determines a conversion factor PHYPXW for converting from physical width units to pixel width units using the previously stored centreline product group location code physical width PGLCPHW and the now measured product group location code pixel width PGLCPXW, so that PHYPXW = PGLCPXW / PGLCPHW.
[00128] The process 512''' then follows an arrow 1053 from 1052 to 1054 in which the CBDR server determines the centreline product group bounding box height in pixel units (PGBBPH) using the information printed on the centreline product group location code CPGLC.
[00129] For the centreline product group location code case, the printed centreline product group location code CPGLC contains, in one example, a prefix "CU" or "CL" to denote whether the CPGLC is to be located above or below the product group PG in question, the height PGH of the product group PG being PH * Qty-High, the width PGW of the product group PG being PW * Qty-Wide, and a unique product group identifier PGID. In this case the product group bounding box height in pixel units PGBBPH = PGH * PHYPXH.
[00130] The process 512''' then follows an arrow 1055 from 1054 to a step 1056 in which the CBDR server determines the centreline product group bounding box pixel width PGBBPW as PGBBPW = PGBBW * PHYPXW. The process 512''' then follows an arrow 1057 from 1056 to 1058 in which the CBDR server stores the product image identifier PII, the centreline product group identifier PGID, the centreline product group bounding box pixel height and width PGBBPH and PGBBPW, and the centreline product group location code location coordinates PGLCX and PGLCY in the bounding box data memory module 477. The process 512''' then follows an arrow 1016 from 1058 to 1008 in Fig. 10A.
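For the centreline case, the bounding box record stored in the module 477 could, purely as an illustration with hypothetical names, be assembled as follows:

    def centreline_bbox_record(pii, pgid, pgh_cm, pgw_cm, phypxh, phypxw, pglcx, pglcy):
        return {
            "PII": pii,                   # product image identifier {ANO, CID, ISNO}
            "PGID": pgid,                 # product group identifier
            "PGBBPH": pgh_cm * phypxh,    # PGBBPH = PGH * PHYPXH
            "PGBBPW": pgw_cm * phypxw,    # PGBBPW = PGW * PHYPXW
            "PGLCX": pglcx, "PGLCY": pglcy,
        }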
USER INTERFACE
[00131] Fig. 11 is a flow chart for an example computer-implemented process by which a customer performs on-line shopping using their digital device in accordance with the disclosed CBDR arrangement by browsing the captured set of images.
[00132] A customer is presented with an image of the store front. They click the door to enter the store and then decide if they would like to browse the store as they would in the physical version or search for particular products. The comprehensive two-dimensional images covering the store in question (described in more detail with reference to Figs. 7, 8A, 8B and 9) are converted to panoramic images using known techniques such as those used by Google Street View (TM) and provided to the customer on his or her display 474 on their digital device 472 when they establish a shopping session as depicted in a step 1101.
[00133] Once an aisle is chosen, they are taken to the start of the aisle to begin browsing. If they choose to search for a particular product, this will be possible via a text box on the main screen in a similar way to existing online store searches. After specifying a search, the customer is taken directly to the image of the product on the actual shelf in the aisle that the product is located in. There the customer can see multiple brands that might be located next to or near the searched item and even unrelated items that are normally located on shelves nearby. From there the customer can perform another search or continue browsing that aisle for other products. Whether a customer is browsing the virtual store aisle by aisle or searching for a particular product as described above, when they come across a product they wish to purchase, they simply click on the product they have chosen within the image they are viewing. The CBDR arrangement identifies the selected product, as described in more detail with reference to Fig. 12, and adds the selected product to the customer's cart.
[00134] Fig. 11 follows an arrow 1102 from 1101 to a decision step 1103, in which the CBDR server determines, dependent upon a command provided by the customer to a user interface (not shown) of their digital device 472, if the customer wishes to check out. If this is the case, the process 1100 follows a YES arrow 1111 to a step 1112. The step 1112, performed by the CBDR server, performs a checkout process which enables the customer to check the list of selected products and pay for them using electronic funds transfer or a similar mechanism. The process 1100 then follows an arrow 1113 from 1112 to an end step 1114.
[00135] Returning to the decision step 1103, if this step returns a FALSE value, then the process 1100 follows an arrow 1104 from 1103 to a step 1105. The step 1105, performed by the CBDR server, enables the customer to navigate to a desired shelf or a desired product and a corresponding product image PI having a unique product image identifier PII. The process 1100 then follows an arrow 1106 from 1105 to a step 1107. The step 1107, performed by the CBDR server, enables the customer to point to (ie specify) a desired product which is visible in the product image PI being viewed, and select a "quantity" of the product desired, using the user interface (not shown) of their digital device 472 by designating (pointing to) a selected X-Y spatial coordinate location SXYL on the desired product in the product image PI being viewed. This selection generates a selection signal SSIG (479) which is communicated to the CBDR server 401. The selection signal SSIG includes at least the unique product image identifier PII and the designated X-Y location SXYL being pointed to by the customer on the selected product image PI.
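A minimal sketch of the selection signal SSIG, with hypothetical Python field names, is:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class SelectionSignal:      # SSIG sent from the customer digital device 472 to the server 401
        pii: tuple              # product image identifier {ANO, CID, ISNO} of the image being viewed
        sxyl: tuple             # selected X-Y coordinate SXYL pointed to on that image
        quantity: int = 1       # quantity of the product requested by the customer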
[00136] The process 1100 then follows an arrow 1108 from 1107 to a step 1109. The step 1109, performed by the customer digital device 472 and the CBDR server 401, confirms the product selection made in the step 1107 and adds the selected product to the customer's cart. The process 1100 then follows an arrow 1110 back to the decision step 1103.
SERVER OPERATION
[00137] Fig. 12 is a flow chart for an example computer-implemented process 1200 by which the CBDR server identifies, according to the disclosed CBDR arrangement, a product selected by a customer shopping in the step 1107 in the process of Fig. 11.
[00138] The process 1200 is entered by an arrow 523 from 522 in Fig. 5C, and then commences with a decision step 1201, performed by the CBDR server, which determines if a product selection pointing signal 479 has been received by the server 401 from the step 1107 in Fig. 11, dependent upon the customer making a selection using the user interface (not shown) of their digital device 472. If this is not the case, then the process 1200 follows a NO arrow 1222 back to the step 1201. If, however, the step 1201 returns a TRUE output, then the process 1200 follows a YES arrow 1202 from 1201 to a step 1203. The step 1203, performed by the CBDR server, identifies, from the selection signal 479 from the customer digital device, the selection signal parameter SSIG which includes the unique product image identifier PII and the designated X-Y location SXYL being pointed to by the customer on the selected product image PI. From the aforementioned information the step 1203 determines, from the captured images in folder 2 (578), the product image being viewed (having the corresponding product image identifier PII).
[00139] The process 1200 then follows an arrow 1204 from 1203 to a step 1205. The step 1205, performed by the CBDR server, determines the pointer location provided by the customer at the step 1107 in Fig. 11 using her digital device 472. The process 1200 then follows an arrow 1206 to a step 1207. The step 1207, performed by the CBDR server, identifies the (pre-determined) product group bounding box PGBB which corresponds to (ie encompasses) the pointer location provided by the customer using his digital device 472 with reference to the product image. More particularly, as described with reference to Figs. 3, 10A and 10B, each product group PG is processed, and for each product group PG the CBDR server stores a corresponding product image identifier PII, product group identifier PGID, product group bounding box pixel height and width PGBBPH and PGBBPW, and product group location code location coordinates PGLCX and PGLCY in the bounding box data memory module 477. Accordingly, the CBDR server is able to identify the relevant product group bounding box PGBB within whose bounding box region BBR the pointer location SXYL lies.
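By way of illustration only, and assuming that the stored coordinates PGLCX and PGLCY give one corner of the bounding box region (the exact geometric relationship is figure-dependent), the lookup could be sketched as follows with hypothetical names:

    def find_product_group(pii, sxyl, bounding_box_db):
        x, y = sxyl
        for rec in bounding_box_db:                 # records stored by the image processing of Figs. 10A-10C
            if rec["PII"] != pii:
                continue                            # only consider boxes belonging to the image being viewed
            in_x = rec["PGLCX"] <= x <= rec["PGLCX"] + rec["PGBBPW"]
            in_y = rec["PGLCY"] <= y <= rec["PGLCY"] + rec["PGBBPH"]
            if in_x and in_y:
                return rec["PGID"]                  # product group whose bounding box encompasses SXYL
        return None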
[00140] The process 1200 then follows an arrow 1208 from 1207 to 1209 in which the CBDR server identifies the product group location code PGLC associated with the product group bounding box PGBB identified in the step 1207. The process then follows an arrow 1210 from 1209 to 1211 in which the CBDR server identifies the product group PG associated with the product group location code PGLC identified in the step 1209.
[00141] The process 1200 then follows an arrow 1221 to a step 1213 which displays the product group PG identified in the step 1211 to the customer on their display 474. The process then follows an arrow 1214 from 1213 to a decision step 1215 in which the CBDR server waits for a confirmation signal from the step 1109 in Fig. 11 in which the customer confirms the displayed product group PG and adds it to their cart. If a confirmation signal is not received, then the process 1200 follows a "N" arrow 1221 back to the step 1201. If however the step 1215 provides a TRUE output, then the process 1200 follows a "Y" arrow 1216 from 1215 to 1217 in which the CBDR server 401 adds one or more products from the selected product group PG, according to a quantity input by the customer in the step 1107, to the customer's shopping cart. The process 1200 then follows an arrow 1218 from 1217 to a decision step 1219 in which the CBDR server determines if the customer wishes to continue shopping. If this is the case, then the process 1200 follows a "Y" arrow 1220 back to the step 1201. If however the step 1219 returns a FALSE value, then the process 1200 follows a "N" arrow 525 to 522 in Fig. 5C.
[00142] A virtual shopping cart is maintained in the same way as traditional online stores currently operate. Once the user has completed their shopping, they checkout and pay in the same way as they do in current systems.
Industrial Applicability
[00143] The arrangements described are applicable to the computer and data processing industries and particularly for the on-line shopping industry.
[00144] The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive.
[00145] In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including", and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises" have correspondingly varied meanings.
Appendix A
GLOSSARY OF TERMS (dimensions are in physical units such as cm unless otherwise stated)
ANO (Aisle number): the aisle number associated with a particular image
BBD (Bounding Box Database): a database in which bounding box and associated data is stored
BBR (Bounding Box Region): a spatial region lying within the associated bounding box
BPGLC (Boundary Product Group Location Code): a PGLC affixed at a corner within a rectangular bounding box
CBDR (Code Based Data Reduction): the name of the claimed invention
CID (Camera Identifier): the identifying number of the camera in question
CPGLC (Centreline Product Group Location Code): a PGLC affixed at the centre of a side of a rectangular bounding box
CTSC (Camera trolley start code): a marker indicating the start of an aisle
CTHM (Camera trolley home marker): a marker indicating the location of the trolley at rest
CTEC (Camera trolley end code): a marker indicating the end of an aisle
CTECC (Camera trolley end of cycle code): a marker indicating the end of the image capture cycle
CTTC (Camera trolley turn code): a marker indicating that the trolley should make a right-hand or a left-hand turn
D (Inter shelf spacing): spacing between a bottom surface of a shelf upon which the product group in question is located and a lower surface of the shelf above the product group in question
ISNO (Image sequence number): the sequence number of the image taken by a specified camera in a specified aisle
MPL (Mouse Pointer Location): a point which the customer selects in the "real" panoramic image presented
PBC (Product Bar Code): a tag, typically printed on a product, used to identify the specific product in question
PDC (Product Description Code): a tag with information used to identify the product group in a product database
PG (Product group): one or more products which are co-located on or under a shelf for display to customers
PGBB (Product group bounding box): a closed curve, usually rectangular in shape, describing a region associated with a particular product group which, if a customer points to a point on or within the curve, indicates selection of the product making up the product group
PGBBH (Product group bounding box height): the height (physical units) of a product group bounding box
PGBBW (Product group bounding box width): the width (physical units) of a product group bounding box
PGBBPH (Product group bounding box pixel height): the height (pixels) of a product group bounding box
PGBBPW (Product group bounding box pixel width): the width (pixels) of a product group bounding box
PGE (Product group envelope): the shelf space allocated to the corresponding product group in question as visualised by the store employee preparing the shelves
PGEH (Product group envelope height): the height of a product group envelope
PGEW (Product group envelope width): the width of a product group envelope
PGH (Product group height): the height of a product group
PGW (Product group width): the width of a product group
PGLC (Product Group Location Code): a tag affixed to a shelf or the like in the vicinity of a product group, the tag being used to construct a bounding box encompassing the associated product group
PGID (Product group identifier): a unique identifier for the product group
PGLCPHH (Product group location code physical height): the height of the product group location code (in physical units such as cm)
PGLCPHW (Product group location code physical width): the width of the product group location code (in physical units such as cm)
PGLCPXH (Product group location code pixel height): the height of the product group location code (in pixels)
PGLCPXW (Product group location code pixel width): the width of the product group location code (in pixels)
PGLCX (Product group location code location, X): the X coordinate of the location of the product group location code (eg 303 or 320 in Fig. 3)
PGLCY (Product group location code location, Y): the Y coordinate of the location of the product group location code (eg 303 or 320 in Fig. 3)
PHYPXH (Height conversion factor, physical units to pixel units): a conversion factor for converting from physical height units to pixel height units
PHYPXW (Width conversion factor, physical units to pixel units): a conversion factor for converting from physical width units to pixel width units
PW (Product width): the width of the product
PH (Product height): the height of the product
PI (Product image): one of the series of captured 2-D images used to represent the store and stitched together to form the corresponding panoramic image to be browsed by the customer
PII (Product image identifier): a unique identifier for each product image comprising the tuple {ANO, CID, ISNO}
Qty-Wide: the number of items that are located next to each other in a row for that particular product group
Qty-High: the number of items that are located on top of each other for that particular product group
SSIG (Selection signal): a signal generated by the user digital device comprising at least the product image identifier PII of the image being viewed and the selected X-Y location SXYL being pointed to
SXYL (Selected X-Y location): the selected X-Y location on a selected product image PI


CLAIMS:
1. A computer implemented method of selecting, by a customer digital device, a product from a plurality of products in a product image displayed on the customer digital device, the method comprising the steps of:
specifying, by the customer digital device, a spatial coordinate on the product image displayed on the customer digital device;
determining, by a server, a pre-determined product group bounding box dependent upon the spatial coordinate;
determining, by the server, a product group location code associated with the pre-determined product group bounding box; and
determining, by the server, the product associated with the product group location code.

2. A method according to claim 1, wherein prior to the step of determining, by the server, the pre-determined product group bounding box, the method comprises the further steps of:
determining, by the server, a size and location of the product group location code in the product image displayed on the customer digital device; and
determining, by the server, a size and location of the product group bounding box relative to the size and location of the product group location code.

3. A method according to claim 2, comprising the further step of:
adding, by the server, the product to the customer shopping cart.

4. A method according to claim 2 or 3, wherein the step of determining, by the server, the size and location of the product group bounding box comprises the steps of:
determining by the server that the product group location code is located at a boundary of an associated product group bounding box;
determining pixel dimensions of the product group location code and a conversion factor from physical to pixel dimensions;
determining a pixel height of the product group bounding box dependent upon the conversion factor and a height of a product group associated with the product group location code; and
determining a pixel width of the product group bounding box dependent upon a distance between a location of the product group location code and a location of an adjacent product group location code in the image.

5. A method according to claim 2 or 3, wherein the step of determining, by the server, the size and location of the product group bounding box comprises the steps of:
determining by the server that the product group location code is located at a centreline of an associated product group bounding box;
determining pixel dimensions of the product group location code and a conversion factor from physical to pixel dimensions;
determining a pixel height of the product group bounding box dependent upon the conversion factor and a height of a product group associated with the product group location code; and
determining a pixel width of the product group bounding box dependent upon the conversion factor and a width of a product group associated with the product group location code.

6. A computer implemented method of processing, by a server device, a selection signal from a customer digital device resulting from selecting, by the customer digital device, a product from a plurality of products in a product image displayed on the customer digital device, the method comprising the steps of:
determining, by the server device, the product image being displayed on the customer digital device;
determining, by the server device, a spatial coordinate specified by the customer digital device on the product image displayed on the customer digital device;
determining, by the server device, a pre-determined product group bounding box encompassing the spatial coordinate;
determining, by the server device, a product group location code associated with the product group bounding box; and
determining, by the server device, a product group associated with the product group location code.

7. A customer digital device for selecting a product from a plurality of products in a product image displayed on the customer digital device, the device comprising:
a processor; and
a memory storing a computer executable software program for directing the processor to perform a method comprising the steps of:
specifying, by the customer digital device, a spatial coordinate on the product image displayed on the customer digital device;
determining, by a server, a pre-determined product group bounding box dependent upon the spatial coordinate;
determining, by the server, a product group location code associated with the pre-determined product group bounding box; and
determining, by the server, the product associated with the product group location code.

8. A server device for processing a selection signal from a customer digital device resulting from selection, by the customer digital device, of a product from a plurality of products in a product image displayed on the customer digital device, the server device comprising:
a processor; and
a memory storing a computer executable software program for directing the processor to perform a method comprising the steps of:
determining the product image being displayed on the customer digital device;
determining a spatial coordinate specified by the customer digital device on the product image displayed on the customer digital device;
determining a pre-determined product group bounding box encompassing the spatial coordinate;
determining a product group location code associated with the product group bounding box; and
determining a product group associated with the product group location code.

9. A computer executable software program for directing a processor of a customer digital device to perform a method for selecting a product from a plurality of products in a product image displayed on the customer digital device, the method comprising the steps of:
specifying, by the customer digital device, a spatial coordinate on the product image displayed on the customer digital device;
determining, by a server, a pre-determined product group bounding box dependent upon the spatial coordinate;
determining, by the server, a product group location code associated with the pre-determined product group bounding box; and
determining, by the server, the product associated with the product group location code.

10. A computer executable software program for directing a processor of a server device to perform a method for processing a selection signal from a customer digital device resulting from selection, by the customer digital device, of a product from a plurality of products in a product image displayed on the customer digital device, the method comprising the steps of:
determining the product image being displayed on the customer digital device;
determining a spatial coordinate specified by the customer digital device on the product image displayed on the customer digital device;
determining a pre-determined product group bounding box encompassing the spatial coordinate;
determining a product group location code associated with the product group bounding box; and
determining a product group associated with the product group location code.
van der Weegen, Mark
Patent Attorneys for the Applicant/Nominated Person
SPRUSON & FERGUSON
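For illustration only, the following Python sketch shows one way the selection handling recited in claims 6 to 10 could be carried out: the customer digital device normalises a tap position into a spatial coordinate, and the server finds the pre-determined product group bounding box that encompasses that coordinate, reads the associated product group location code, and maps the code to a product group. All class names, table contents and product names below are hypothetical and do not come from the specification.

```python
# Illustrative sketch only; hypothetical names and data, not the patented implementation.
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class ProductGroupBox:
    """A pre-determined bounding box on a product image, keyed by a product group location code."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float
    location_code: str

    def contains(self, x: float, y: float) -> bool:
        """True if the normalised coordinate (x, y) falls inside this box."""
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


# Hypothetical server-side lookup tables: boxes per product image, and codes to product groups.
IMAGE_BOXES = {
    "shelf-image-001": [
        ProductGroupBox(0.00, 0.00, 0.25, 0.50, "PG-001"),
        ProductGroupBox(0.25, 0.00, 0.50, 0.50, "PG-002"),
    ],
}
CODE_TO_PRODUCT_GROUP = {
    "PG-001": "breakfast cereals",
    "PG-002": "muesli bars",
}


def normalise_tap(px: int, py: int, width: int, height: int) -> tuple:
    """Client side: convert a tap position in pixels into a resolution-independent coordinate."""
    return px / width, py / height


def resolve_selection(image_id: str, x: float, y: float) -> Optional[str]:
    """Server side: bounding box -> product group location code -> product group."""
    for box in IMAGE_BOXES.get(image_id, []):
        if box.contains(x, y):
            return CODE_TO_PRODUCT_GROUP.get(box.location_code)
    return None  # the coordinate falls outside every pre-determined box


x, y = normalise_tap(360, 240, width=1200, height=960)   # (0.3, 0.25)
print(resolve_selection("shelf-image-001", x, y))         # -> muesli bars
```

On this reading, a selection is resolved by a small lookup from coordinate to bounding box to location code, rather than by re-analysing the image pixels at selection time, which appears consistent with the code based data reduction referred to in the title.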
AU2021245114A 2021-10-05 2021-10-05 Image Recognition using Code Based Data Reduction Active AU2021245114B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2021245114A AU2021245114B1 (en) 2021-10-05 2021-10-05 Image Recognition using Code Based Data Reduction
PCT/AU2022/050860 WO2023056500A1 (en) 2021-10-05 2022-08-09 Image processing using code based data reduction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2021245114A AU2021245114B1 (en) 2021-10-05 2021-10-05 Image Recognition using Code Based Data Reduction

Publications (1)

Publication Number Publication Date
AU2021245114B1 (en) 2022-03-10

Family

ID=80533967

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2021245114A Active AU2021245114B1 (en) 2021-10-05 2021-10-05 Image Recognition using Code Based Data Reduction

Country Status (2)

Country Link
AU (1) AU2021245114B1 (en)
WO (1) WO2023056500A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080077473A1 (en) * 2006-09-25 2008-03-27 Allin-Bradshaw Catherine E Method and apparatus for collecting information relating to the possible consumer purchase of one or more products
US20120239536A1 (en) * 2011-03-18 2012-09-20 Microsoft Corporation Interactive virtual shopping experience

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9811754B2 (en) * 2014-12-10 2017-11-07 Ricoh Co., Ltd. Realogram scene analysis of images: shelf and label finding
US9342900B1 (en) * 2014-12-23 2016-05-17 Ricoh Co., Ltd. Distinguishing between stock keeping units using marker based methodology
CA3018381A1 (en) * 2016-03-29 2017-10-05 Bossa Nova Robotics Ip, Inc. System and method for locating, identifying and counting items
WO2018002709A2 (en) * 2016-06-29 2018-01-04 Adato Yair Identifying products using a visual code
WO2019246452A1 (en) * 2018-06-20 2019-12-26 Simbe Robotics, Inc Method for managing click and delivery shopping events

Also Published As

Publication number Publication date
WO2023056500A1 (en) 2023-04-13

Similar Documents

Publication Publication Date Title
US11507933B2 (en) Cashier interface for linking customers to virtual data
CN108175227B (en) Goods shelf control method and device and electronic equipment
US11176602B2 (en) Interactive transaction system, method, and device for physical merchant stores
US9020845B2 (en) System and method for enhanced shopping, preference, profile and survey data input and gathering
JP5967553B2 (en) Method for estimating purchase behavior of customer in store or between stores, and computer system and computer program thereof
US20090313142A1 (en) Information providing device, computer-readable recording medium, and store system
US20180324549A1 (en) Prompting method and apparatus
US20190005569A1 (en) Methods and systems for automatically mapping a retail location
US10643267B2 (en) Retail purchasing computer system and method of operating same
TWI496096B (en) System, method, and storage unit for managing multiple objects in an object zone
US20120330781A1 (en) System and Method for Shopping Goods, Virtualizing a Personalized Storefront
CN106779940B (en) Method and device for confirming display commodity
JPWO2017065187A1 (en) Information display terminal device, product information providing system, and product sales promotion method
JP2024003212A (en) Information processing device, information processing method, program, and guide system
US20180012267A1 (en) Electronic device, apparatus and system
WO2012075589A1 (en) Method and system for virtual shopping
CN113168602A (en) Customer-assisted robotic picking
KR20170067976A (en) Method, apparatus and computer program for managing shopping information
JP5021327B2 (en) Setting information creating apparatus, setting information creating method, setting information creating program, and information output system
WO2015159601A1 (en) Information-processing device
CN109948736A (en) Commodity identification model active training method, system, equipment and storage medium
AU2021245114B1 (en) Image Recognition using Code Based Data Reduction
US20180315226A1 (en) Information processing system and information processing device
JP6522173B1 (en) INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING PROGRAM
KR20210014280A (en) Method for shopping virtual product in offline virtual product display store and system therefore

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)
DA3 Amendments made section 104

Free format text: THE NATURE OF THE AMENDMENT IS: AMEND THE INVENTION TITLE TO READ IMAGE RECOGNITION USING CODE BASED DATA REDUCTION