US20220161763A1 - Systems, Method And Apparatus For Automated Inventory Interaction - Google Patents

Systems, Method And Apparatus For Automated Inventory Interaction

Info

Publication number
US20220161763A1
Authority
US
United States
Prior art keywords
inventory
proximity
fascia
camera
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/669,118
Inventor
Greg Schumacher
Kevin Howard
Emad Mirgoli
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Adroit Worldwide Media Inc
Original Assignee
Adroit Worldwide Media Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Adroit Worldwide Media Inc filed Critical Adroit Worldwide Media Inc
Priority to US 17/669,118
Assigned to ADROIT WORLDWIDE MEDIA, INC. Assignors: SCHUMACHER, GREG; HOWARD, KEVIN; MIRGOLI, Emad
Publication of US20220161763A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/03Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for supply of electrical power to vehicle subsystems or for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60SSERVICING, CLEANING, REPAIRING, SUPPORTING, LIFTING, OR MANOEUVRING OF VEHICLES, NOT OTHERWISE PROVIDED FOR
    • B60S1/00Cleaning of vehicles
    • B60S1/02Cleaning windscreens, windows or optical devices
    • B60S1/023Cleaning windscreens, windows or optical devices including defroster or demisting means
    • B60S1/026Cleaning windscreens, windows or optical devices including defroster or demisting means using electrical means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133Distances to prototypes
    • G06F18/24143Distances to neighbourhood prototypes, e.g. restricted Coulomb energy networks [RCEN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0207Discounts or incentives, e.g. coupons or rebates
    • G06Q30/0223Discounts or incentives, e.g. coupons or rebates based on inventory
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0267Wireless devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/64Three-dimensional objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02SGENERATION OF ELECTRIC POWER BY CONVERSION OF INFRARED RADIATION, VISIBLE LIGHT OR ULTRAVIOLET LIGHT, e.g. USING PHOTOVOLTAIC [PV] MODULES
    • H02S40/00Components or accessories in combination with PV modules, not provided for in groups H02S10/00 - H02S30/00
    • H02S40/30Electrical components
    • H02S40/38Energy storage means, e.g. batteries, structurally associated with PV modules
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02SGENERATION OF ELECTRIC POWER BY CONVERSION OF INFRARED RADIATION, VISIBLE LIGHT OR ULTRAVIOLET LIGHT, e.g. USING PHOTOVOLTAIC [PV] MODULES
    • H02S40/00Components or accessories in combination with PV modules, not provided for in groups H02S10/00 - H02S30/00
    • H02S40/40Thermal components
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60YINDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2300/00Purposes or special features of road vehicle drive control systems
    • B60Y2300/91Battery charging
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60YINDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2400/00Special features of vehicle units
    • B60Y2400/21External power supplies
    • B60Y2400/216External power supplies by solar panels
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/68Food, e.g. fruit or vegetables
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02EREDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E10/00Energy generation through renewable energy sources
    • Y02E10/50Photovoltaic [PV] energy
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02EREDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E70/00Other energy conversion or management systems reducing GHG emissions
    • Y02E70/30Systems combining energy storage with energy generation of non-fossil origin

Definitions

  • Retail environments are ever challenging. Consumers typically are confronted with pricing and information about a continuously increasing number of competitors and brands, including information about pricing, labeling, promotions, and the like. Traditionally, this information has been provided using print systems, such as slide-in paper systems, plastic label systems, and adhesive label systems.
  • consumers are increasingly confounded by the sheer volume of printed information displayed in retail environments, and thus a growing number of consumers are turning to online shopping for day-to-day purchases.
  • a retailer's overall performance and profits are significantly impacted by the challenge of getting the right products to the right places at the right time.
  • a retailer may lose money due to a failure to restock inventory. For example, a customer may approach a shelf seeking to purchase a particular item; however, the shelf indicated as the location of the particular item may be empty. In some situations, a retailer may have that particular item stored in the back of the store but due to a lack of knowledge that the shelf was empty, the shelf was not restocked with the item causing the retailer to lose the money the customer would have spent on purchasing the particular item. Such a situation occurs at a high rate and may cost a retailer thousands or even millions of dollars in lost revenue each year.
  • customers often enter a retail location or pass by a retail exhibit (e.g., vending machine or small retail stand such as in a mall, an airport, a hospital, etc.) and fail to notice objects on some shelving units or fail to realize promotions or discounts apply to certain objects.
  • customers are often distracted for a variety of reasons including looking at their mobile device, talking on their mobile device and/or watching children.
  • a proximity camera system comprises a proximity camera having a lens and a housing, one or more fascia, one or more processors communicatively coupled to the proximity camera and the one or more fascia, and a non-transitory computer-readable medium communicatively coupled to the one or more processors and having logic thereon, the logic, when executed by the one or more processors, being configured to perform operations including: (i) receiving an image captured by the proximity camera, (ii) performing object recognition techniques on the image, (iii) determining whether an object was detected within a first predetermined proximity region, and (iv) transmitting one or more instructions configured to cause a graphical display to be displayed by the one or more fascia.
  • the proximity camera and the one or more fascia are coupled to a shelving unit. Additionally, the image illustrates a geo-fence region at least partially surrounding the shelving unit, the geo-fence region including the first predetermined proximity region. In some embodiments, the logic is configured to determine whether a second object is detected within a second predetermined proximity region, the second proximity region including a physical area closer to the shelving unit than the first predetermined proximity region.
  • the image is transmitted to a cloud computing service for analysis of the first predetermined proximity region.
  • the graphical display displayed by the one or more fascia may include an immersive graphic.
  • the immersive graphic spans a plurality of the one or more fascia.
  • a cabinet display top may be communicatively coupled to the one or more processors and the non-transitory computer-readable medium, and the immersive graphic spans a plurality of the one or more fascia and the cabinet display top.
  • the graphical display displayed by the one or more fascia includes a product information graphic, wherein the product information graphic includes at least pricing information for one or more inventory items.
  • the graphical display displayed by the one or more fascia includes a promotional graphic, wherein the promotional graphic includes at least information corresponding to a promotion or discount for one or more inventory items.
  • a computerized method includes receiving an image captured by the proximity camera, performing object recognition techniques on the image, determining whether an object was detected within a first predetermined proximity region, and transmitting one or more instructions configured to cause a graphical display to be displayed by the one or more fascia.
  • the proximity camera and the one or more fascia are coupled to a shelving unit.
  • the image may illustrate a geo-fence region at least partially surrounding the shelving unit, the geo-fence region including the first predetermined proximity region.
  • the computerized method may further include determining whether a second object is detected within a second predetermined proximity region, the second proximity region including a physical area closer to the shelving unit than the first predetermined proximity region.
  • the graphical display displayed by the one or more fascia includes an immersive graphic that spans a plurality of the one or more fascia.
  • a non-transitory computer readable storage medium having stored thereon instructions is disclosed.
  • the instructions are executable by one or more processors to perform operations including receiving an image captured by the proximity camera, performing object recognition techniques on the image, determining whether an object was detected within a first predetermined proximity region, and transmitting one or more instructions configured to cause a graphical display to be displayed by the one or more fascia.
  • the proximity camera and the one or more fascia are coupled to a shelving unit.
  • the image illustrates a geo-fence region at least partially surrounding the shelving unit, the geo-fence region including the first predetermined proximity region.
  • the instructions are executable by the one or more processors to perform further operations including determining whether a second object is detected within a second predetermined proximity region, the second proximity region including a physical area closer to the shelving unit than the first predetermined proximity region.
  • the graphical display displayed by the one or more fascia includes an immersive graphic that spans a plurality of the one or more fascia.
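The proximity-detection flow summarized in the bullets above (receive an image, run object recognition, test the first and second proximity regions, and drive the fascia display) can be sketched in Python. This is a hypothetical illustration only; the detector, the fascia interface, the region coordinates, and the graphic names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the claimed proximity-detection flow: receive an image,
# detect objects, test which predetermined proximity region (if any) contains
# the detection, and instruct the fascia accordingly. Names are illustrative.

def point_in_region(point, region):
    """Return True if an (x, y) point lies inside an axis-aligned region."""
    (x, y) = point
    (x_min, y_min, x_max, y_max) = region
    return x_min <= x <= x_max and y_min <= y <= y_max

def handle_proximity_image(image, detector, fascia, regions):
    """regions: dict with 'first' (outer) and 'second' (closer to the shelf)."""
    detections = detector.detect(image)          # assumed detector API; returns boxes
    for box in detections:
        center = ((box.x1 + box.x2) / 2, (box.y1 + box.y2) / 2)
        if point_in_region(center, regions["second"]):
            fascia.show("product_information")   # customer is at the shelf
            return
        if point_in_region(center, regions["first"]):
            fascia.show("promotional_graphic")   # customer is nearby
            return
    fascia.show("immersive_graphic")             # default attract state
```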
  • FIG. 1 provides an illustration of an automated inventory intelligence system in accordance with some embodiments
  • FIG. 2A provides a second illustration of a plurality of shelves with an automated inventory intelligence system in accordance with some embodiments
  • FIG. 2B provides an illustration of a mount of the inventory camera of FIG. 2A in accordance with some embodiments
  • FIG. 2C provides an illustration of the inventory camera positioned within the mount of the automated inventory intelligence system of FIGS. 2A-2B ;
  • FIG. 3 provides a second illustration of a plurality of shelves with an automated inventory intelligence system in accordance with some embodiments
  • FIG. 4 provides an illustration of a portion of an automated inventory intelligence system in accordance with some embodiments
  • FIG. 5 provides an illustration of an image captured by a camera of an automated inventory intelligence system in accordance with some embodiments
  • FIG. 6A provides a schematic illustrating a sensor coupled to a retail shelving unit in accordance with some embodiments
  • FIG. 6B provides a schematic illustrating a sensor such as an inventory camera coupled to an automated inventory intelligence system in accordance with some embodiments
  • FIG. 6C provides a schematic illustrating a sensor such as an inventory camera coupled to the automated inventory intelligence system in accordance with some embodiments
  • FIG. 7A provides an exemplary embodiment of a first logical representation of the automated inventory intelligence system of FIG. 1 ;
  • FIG. 7B provides an exemplary embodiment of a second logical representation of the automated inventory intelligence system of FIG. 1 ;
  • FIG. 8 provides a flowchart illustrating an exemplary method for analyzing, by the automated inventory intelligence system logic of FIGS. 7A-7B , an image of inventory to determine whether the inventory is to be restocked in accordance with some embodiments;
  • FIG. 9 provides a flowchart illustrating an exemplary method for analyzing, by the automated inventory intelligence system logic of FIGS. 7A-7B , an image of inventory to determine whether the inventory is to be restocked based on a triggering event in accordance with some embodiments.
  • FIG. 10 provides an exemplary embodiment of the proximity sensor positioned on a cabinet display top of the automated inventory intelligence system of FIG. 1 in accordance with some embodiments;
  • FIG. 11A provides a first illustration of an image captured by the proximity sensor of the automated inventory intelligence system of FIG. 1 in accordance with some embodiments
  • FIG. 11B provides a second illustration of an image captured by the proximity sensor of the automated inventory intelligence system of FIG. 1 in accordance with some embodiments
  • FIG. 12A provides an exemplary illustration of a plurality of proximity regions based on one configuration of a proximity sensor of an automated inventory intelligence system in accordance with some embodiments
  • FIG. 12B provides an exemplary illustration of fascia of the automated inventory intelligence system of FIG. 12A displaying a first graphic while in a promotion state in accordance with some embodiments;
  • FIG. 12C provides an exemplary illustration of fascia of the automated inventory intelligence system of FIG. 12A displaying a first graphic while in a product information state in accordance with some embodiments;
  • FIG. 12D provides an exemplary illustration of fascia of the automated inventory intelligence system of FIG. 12A displaying a first graphic while in an immersive state in accordance with some embodiments;
  • FIG. 13 provides an exemplary flowchart illustrating operations corresponding to detecting whether an object is located within one of a plurality of proximity regions performed by an automated inventory intelligence system in accordance with some embodiments
  • FIG. 14 provides an exemplary flowchart illustrating operations corresponding to detecting one or more objects in one or more of a plurality of proximity regions and generating activity logs performed by an automated inventory intelligence system in accordance with some embodiments;
  • FIG. 15A provides an exemplary embodiment of a first user interface display screen produced by an automated inventory intelligence system, where the first user interface display screen provides an interactive dashboard in accordance with some embodiments;
  • FIG. 15B provides an exemplary embodiment of a second user interface display screen produced by an automated inventory intelligence system, where the second user interface display screen provides an interactive dashboard in accordance with some embodiments.
  • Labels such as “left,” “right,” “front,” “back,” “top,” “bottom,” “forward,” “reverse,” “clockwise,” “counter clockwise,” “up,” “down,” or other similar terms such as “upper,” “lower,” “aft,” “fore,” “vertical,” “horizontal,” “proximal,” “distal,” and the like are used for convenience and are not intended to imply, for example, any particular fixed location, orientation, or direction. Instead, such labels are used to reflect, for example, relative location, orientation, or directions. Singular forms of “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by those of ordinary skill in the art.
  • the present disclosure describes an apparatus and a method for an automated inventory intelligence system that provides intelligence in tracking inventory on, for example, retail shelves, as well as intelligence in determining the proximity of retail customers as they approach, stall at, and pass a particular retail shelf or display, and the demographics of those customers.
  • the automated inventory intelligence system is comprised of a cabinet top display, fascia, a proximity sensor, one or more inventory sensors, and one or more demographic tracking sensors.
  • the cabinet top display can be configured to display animated and/or graphical content and is mounted on top of in-store shelves.
  • the fascia may include one or more panels of light-emitting diodes (LEDs) configured to display animated and/or graphical content and to mount to an in-store retail shelf.
  • the automated inventory intelligence system can also include a data processing system comprising a media player that is configured to simultaneously execute (i.e., “play”) a multiplicity of media files that are displayed on the cabinet top and/or the fascia.
  • the cabinet top and the fascia are typically configured to display content so as to entice potential customers to approach the shelves, and then the fascia may switch to displaying pricing and other information pertaining to the merchandise on the shelves once a potential customer approaches the shelves.
  • the proximity sensor is configured to detect the presence of potential customers.
  • one or more inventory sensors may be configured to track the inventory stocked on one or more in-store retail shelves.
  • the automated inventory intelligence system may create one or more alerts once the stocked inventory remaining on the shelves is reduced to a predetermined minimum threshold quantity.
  • the automated inventory intelligence system 100 couples to a shelving unit 102 , which often includes shelves 104 , a back component 105 (e.g., pegboard, gridwall, slatwall, etc.) and a cabinet top display 106 .
  • the cabinet display top 106 is coupled to an upper portion of the shelving unit 102 , extending vertically from the back component 105 .
  • a proximity camera 107 may be positioned on top of, or otherwise affixed to, the cabinet top display 106 .
  • the proximity camera 107 is shown in FIG. 1 as being centrally positioned atop the cabinet top display 106 , the proximity camera 107 may be positioned in different locations, such as near either end of the top of the cabinet top 106 , on a side of the cabinet top 106 and/or at other locations coupled to the shelving unit 102 and/or the fascia 108 .
  • the cabinet display top 106 and fascia 108 may be attached to the shelves 104 by way of any fastening means deemed suitable, wherein examples include, but are not limited or restricted to, magnets, adhesives, brackets, hardware fasteners, and the like.
  • the fascia 108 and the cabinet display top 106 may each be comprised of one or more arrays of light emitting diodes (LEDs) that are configured to display visual content (e.g., still or animated content), with optional speakers, not shown, coupled thereto to provide audio content.
  • any of the fascia 108 and/or the cabinet display top 106 may be comprised of relatively smaller LED arrays that may be coupled together so as to tessellate the cabinet display top 106 and the fascia 108 , such that the fascia and cabinet top desirably extend along the length of the shelves 104 .
  • the smaller LED arrays may be comprised of any number of LED pixels, which may be organized into any arrangement to conveniently extend the cabinet display top 106 and the fascia 108 along the length of a plurality of shelves 104 .
  • a first dimension of the smaller LED arrays may be comprised of about 132 or more pixels.
  • a second dimension of the smaller LED arrays may be comprised of about 62 or more pixels.
  • the cabinet display top 106 and the fascia 108 may be configured to display visual content to attract the attention of potential customers. As shown in the embodiment of FIG. 1 , the cabinet display top 106 may display desired visual content that extends along the length of the shelves 104 .
  • the desired content may be comprised of a single animated or graphical image that fills the entirety of the cabinet display top 106 , or the desired content may be a group of smaller, multiple animated or graphical images that cover the area of the cabinet display top 106 .
  • the fascia 108 may cooperate with the cabinet display top 106 to display either a single image or multiple images that appear to be spread across the height and/or length of the shelves 104 .
  • the cabinet display top 106 may display visual content selected to attract the attention of potential customers to one or more products comprising inventory 112 , e.g., merchandise, located on the shelves 104 .
  • the visual content shown on the cabinet display top 106 may be specifically configured to draw the potential customers to approach the shelves 104 , and is often related to the specific inventory 112 located on the corresponding shelves 104 .
  • a similar configuration with respect to visual content displayed on the fascia 108 may apply as well, as will be discussed below.
  • the content shown on the cabinet display top 106 , as well as the fascia 108 , may be dynamically changed to engage and inform customers of ongoing sales, promotions, and advertising. As will be appreciated, these features offer brands and retailers a way to increase sales locally by offering customers a personalized campaign that may be changed quickly and easily.
  • portions of the fascia 108 may display visual content such as images of brand names and/or symbols representing products stocked on the shelves 104 nearest to each portion of the fascia.
  • a single fascia 108 may be comprised of a first portion 114 and a second portion 116 .
  • the first portion 114 may display an image of a brand name of inventory 112 that is stocked on the shelf above the first portion 114 (e.g., in one embodiment, stocked directly above the first portion 114 ), while the second portion 116 may display pricing information for the inventory 112 .
  • Additional portions may include an image of a second brand name and/or varied pricing information when such portions correspond to inventory different than inventory 112 .
  • the fascia 108 extending along each of the shelves 104 may be sectionalized to display images corresponding to each of the products stocked on the shelves 104 . It is further contemplated that the displayed images will advantageously help customers quickly locate desired products.
  • the animated and/or graphical images displayed on the cabinet display top 106 and the fascia 108 are comprised of media files that are executed by way of a suitable media player.
  • the media player is often configured to simultaneously play any desired number of media files that may be displayed on the smaller LED arrays.
  • each of the smaller LED arrays may display one media file being executed by the multiplayer, such that a group of adjacent smaller LED arrays combine to display the desired images to the customer.
  • base video may be stretched to fit any of various sizes of the smaller LED arrays, and/or the cabinet display top 106 and fascia 108 . It should be appreciated, therefore, that the multiplayer disclosed herein enables implementing a single media player per aisle in-store instead of relying on multiple media players dedicated to each aisle.
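As a rough, hypothetical illustration of a single "multiplayer" driving several LED regions along an aisle, the sketch below assigns media frames to display regions and stretches each frame to fit. The region names, panel sizes, and the use of OpenCV for resizing are assumptions, not part of the disclosure.

```python
# Hypothetical sketch: a single "multiplayer" process drives several LED
# regions (cabinet display top and fascia segments), scaling each source
# frame to its target region. Region names/sizes are illustrative only.
import cv2  # OpenCV, assumed available for frame resizing

REGIONS = {
    "cabinet_top": (1920, 132),    # (width, height) in pixels, assumed
    "fascia_shelf_1": (1920, 62),
    "fascia_shelf_2": (1920, 62),
}

def render_frames(frames_by_media, assignments):
    """frames_by_media: {media_id: frame (numpy array)};
    assignments: {region_name: media_id}. Returns frames sized per region."""
    output = {}
    for region, media_id in assignments.items():
        width, height = REGIONS[region]
        frame = frames_by_media[media_id]
        # Stretch the base video frame to fit the target LED region.
        output[region] = cv2.resize(frame, (width, height))
    return output
```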
  • FIG. 1 illustrates a plurality of inventory cameras 110 (i.e., the inventory cameras 110 1 - 110 8 ).
  • the inventory cameras 110 are coupled to the shelving unit 102 , e.g., via the pegboard 105 , and positioned above merchandise 112 , also referred to herein as "inventory." Each of the inventory cameras 110 can be configured to monitor a portion of the inventory stocked on each shelf 104 , and in some instances, may be positioned below a shelf 104 , e.g., as is seen with the inventory cameras 110 3 - 110 8 .
  • an inventory camera 110 may not be positioned below a shelf 104 , e.g., as is seen with the inventory cameras 110 1 - 110 2 .
  • the inventory camera 110 4 is positioned above the inventory portion 116 and is therefore capable of monitoring, and configured to monitor, the inventory portion 116 .
  • the inventory camera 110 4 may have a viewing angle of 180° (degrees) and is capable of monitoring a larger portion of the inventory 112 on the shelf 104 2 than merely inventory portion 116 .
  • FIG. 5 illustrates one exemplary image captured by an inventory camera having a viewing angle of 180°.
  • the positioning of the inventory cameras 110 may differ from the illustration of FIG. 1 .
  • the inventory cameras 110 may be affixed to the shelving unit 102 in a variety of manners, including attachment to various types of shelves 104 and monitoring of any available inventory 112 stored thereon.
  • various embodiments of the automated inventory intelligence system 100 can also include a facial recognition camera 109 .
  • the facial recognition camera 109 may be coupled to the exterior of the shelving unit 102 .
  • the facial recognition camera 109 may be positioned between five and six feet from the ground in order to obtain a clear image of the faces of a majority of customers.
  • the facial recognition camera 109 may be positioned at heights other than five to six feet from the ground.
  • the facial recognition camera 109 need not be coupled to the exterior of the shelving unit 102 as illustrated in FIG. 1 ; instead, the illustration of FIG. 1 is merely one embodiment.
  • the facial recognition camera 109 may be coupled to the interior of a side of the shelving unit 102 as well as to any portion of any of the shelves 104 1 - 104 4 , the cabinet display top 106 , the fascia 108 and/or the back component 105 of the shelving unit 102 . Further, a plurality of facial recognition cameras 109 may be coupled to the shelving unit 102 . In certain embodiments, the facial recognition camera 109 may be eliminated and its associated functions accomplished by any available proximity cameras 107 . In these embodiments, software can be utilized to account for any discrepancy between the images and angles captured by the proximity cameras 107 as compared to the facial recognition cameras 109 . In further embodiments, especially where privacy concerns are heightened, facial recognition cameras may be eliminated, leaving the automated inventory intelligence system 100 to gather customer data by other means including, but not limited to, mobile phone signals/application data and/or radio-frequency identification (RFID) signals.
  • the automated inventory intelligence system 100 may include one or more processors, a non-transitory computer-readable memory, one or more communication interfaces, and logic stored on the non-transitory computer-readable memory.
  • the images or other data captured by the proximity sensor 107 , the facial recognition camera 109 and/or the inventory cameras 110 1 - 110 8 may be analyzed by the logic of the automated inventory intelligence system 100 .
  • the non-transitory computer-readable medium may be local storage, e.g., located at the store in which the proximity sensor 107 , the facial recognition camera 109 and/or the inventory cameras 110 1 - 110 8 reside, or may be cloud-computing storage.
  • the one or more processors may be local to the proximity sensor 107 , the facial recognition camera 109 and/or the inventory cameras 110 1 - 110 8 or may be provided by cloud computing services.
  • Examples of the environment in which the automated inventory intelligence system 100 may be located include, but are not limited or restricted to, a retailer, a warehouse, an airport, a high school, college or university, any cafeteria, a hospital lobby, a hotel lobby, a train station, or any other area in which a shelving unit for storing inventory may be located.
  • FIG. 2A a second illustration of a plurality of shelves with an automated inventory intelligence system in accordance with some embodiments is shown.
  • FIG. 2A illustrates the automated inventory intelligence system 206 coupled to a shelving unit 200 .
  • the shelving unit 200 includes a back component 202 (e.g., pegboard) and shelves 204 (wherein shelves 204 1 - 204 3 are illustrated; however, the shelving unit 200 may include additional shelves).
  • the automated inventory intelligence system 206 includes fascia 208 and the inventory sensor 210 (depicted herein as an inventory camera). Although only a single inventory camera 210 is shown in FIG. 2A , the automated inventory intelligence system 206 may include additional inventory cameras not shown.
  • FIG. 2A provides a clear perspective as to where the inventory camera 210 may be positioned in one embodiment.
  • the inventory camera 210 is shown to be coupled to a corner formed by an underside of the shelf 204 1 and the back component 202 .
  • the positioning of the inventory camera 210 can enable the inventory camera 210 to monitor the inventory 212 .
  • Additional detail of the coupling of the inventory camera 210 to the shelving unit 200 is seen in FIG. 2B .
  • the fascia, e.g., fascia 208 2 , may display pricing information (as also shown in FIG. ) or an alert, e.g., a visual indicator via LEDs of a portion of the fascia, indicating that inventory stocked on the corresponding shelf, e.g., the shelf 204 2 , is to be restocked.
  • FIG. 2B an illustration of a mount of the inventory camera 210 of FIG. 2A is shown in accordance with some embodiments.
  • the mount 222 , which may be "L-shaped" in nature (i.e., two sides extending at a 90° angle from each other), is shown without the inventory camera 210 placed therein.
  • the inventory camera 210 may snap into the mount 222 , which may enable inventory cameras to be easily replaced, moved, removed for charging or repair, etc.
  • the mount 222 is shown as being coupled to a corner formed by an underside of the shelf 204 1 and the back component 202 .
  • first metal runner 214 is attached to the back component 202 and a second metal runner 220 is shown as being attached to the underside of the shelf 204 1 .
  • the first metal runner 214 includes a first groove 216 and a second groove 218 to which flanges of the mount 222 , such as the flange 228 , may slide or otherwise couple.
  • a groove is also formed by the second metal runner 220 , which may also assist in the coupling of the mount 222 .
  • the mount 222 includes a top component 224 , a side component 226 , an optional flange 228 , bottom grips 230 , top grips 232 , a top cavity 234 and side cavity 236 .
  • a flange extending from the top component 224 to couple with the metal runner 220 may be included.
  • the inventory camera 210 may couple to the mount 222 and be securely held in place by the bottom grips 230 and the top grips 232 .
  • the body of the inventory camera 210 may include projections that couple, e.g., mate, with the cavity 234 and/or the cavity 236 to prevent shifting of the inventory camera 210 upon coupling with the mount 222 .
  • FIG. 2C an illustration of the inventory camera 210 positioned within the mount 222 of the automated inventory intelligence system 206 of FIGS. 2A-2B is shown.
  • the inventory camera 210 is positioned within the mount 222 and includes a lens 238 and a housing 240 .
  • the inventory camera 210 is shown as having four straight sides but may take alternative forms and still be within the scope of the invention. For example, in other embodiments, the inventory camera 210 may have only two straight sides and may include two curved sides. Additionally, the inventory camera 210 may take a circular shape or include one or more circular arcs. Further, the inventory camera 210 may take the form of any polygon or other known geometric shape.
  • the housing 240 may have an angled face such that the face of the housing 240 slopes away from the lens 238 , which may be advantageous in capturing an image having a viewing angle of 180°.
  • the inventory camera 210 may snap into the mount 222 and be held in place by friction of the bottom grips 230 and top grips 232 , and by the force applied by the top component 224 and the side component 226 .
  • the mount 222 can comprise a variety of shapes depending on the camera and shelving unit 200 being utilized, as shown in the camera mount depicted in FIG. 3 below.
  • FIG. 3 a second illustration of a plurality of shelves with an automated inventory intelligence system is shown in accordance with some embodiments.
  • FIG. 3 illustrates an inventory camera 310 1 of the automated inventory intelligence system 300 coupled to the underside of a shelf 304 1 , which is part of the shelving unit 302 .
  • the automated inventory intelligence system 300 includes the fascia 306 1 - 306 2 , the camera 310 1 and a mount 314 .
  • the mount 314 is coupled to the underside of the shelf 304 1 , which is possible due to the configuration of the shelf 304 1 ; particularly, the shelf 304 1 is comprised of a series of grates. Due to the grated nature of the shelf 304 1 , the mount 314 may be configured to clip directly to one or more of the grates.
  • the shelving unit 302 is refrigerated, e.g., configured for housing milk, and includes a door, not shown. As a result of being refrigerated, the shelving unit 302 experiences temperature swings as the door is opened and closed, which often results in the temporary accumulation of condensation on the lens of the inventory camera 310 1 .
  • the logic of the automated inventory intelligence system may perform various forms of processing for handling the temporary accumulation of condensation on the lens of the inventory camera 310 1 , which may include, for example, (i) sensing when the door of the shelving unit 302 is opened, e.g., via sensing activation of a light, and waiting a predetermined amount of time before taking an image capture with the inventory camera 310 1 (e.g., to wait until the condensation has dissipated), and/or (ii) capturing an image with the inventory camera 310 1 , performing image processing such as object recognition techniques, and discarding the image when the object recognition techniques do not provide a confidence level of the recognized objects above a predetermined threshold (e.g., condensation blurred or otherwise obscured the image, indicating the presence of condensation).
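The two condensation-handling strategies described above might be realized along the lines of the following hypothetical sketch; the door-event input, the delay value, the recognizer interface, and the confidence threshold are all assumptions rather than details from the disclosure.

```python
# Hypothetical sketch of the two condensation-handling strategies described:
# (i) wait a fixed delay after the refrigerator door opens before capturing,
# and (ii) discard captures whose recognition confidence is too low
# (suggesting the lens was fogged). All parameters are illustrative.
import time

CONDENSATION_DELAY_SECONDS = 30      # assumed dissipation time
MIN_CONFIDENCE = 0.6                 # assumed confidence threshold

def capture_after_door_event(camera, door_opened_at):
    """Strategy (i): delay image capture until condensation has dissipated."""
    elapsed = time.time() - door_opened_at
    if elapsed < CONDENSATION_DELAY_SECONDS:
        time.sleep(CONDENSATION_DELAY_SECONDS - elapsed)
    return camera.capture()           # assumed camera API

def usable_capture(camera, recognizer):
    """Strategy (ii): discard images whose object-recognition confidence
    falls below the threshold, and let the caller retry."""
    image = camera.capture()
    detections = recognizer.detect(image)        # assumed recognizer API
    if detections and min(d.confidence for d in detections) >= MIN_CONFIDENCE:
        return image
    return None                                   # likely fogged; caller retries
```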
  • the inventory camera 310 1 may be coupled to the front of the shelf 304 1 and face the inventory 312 .
  • Such an embodiment may be advantageous with refrigerated shelving units such as the shelving unit 302 when a light source, not shown, is housed within the shelving unit and turns on when a door of the shelving unit is opened. More specifically, when the light source is positioned at the rear of the shelving unit, the image captured by the inventory camera 310 1 may appear clearer and less blurred in such an embodiment.
  • a sensor 408 is shown positioned near merchandise 406 stocked on a shelving unit 402 of an automated inventory intelligence system 400 .
  • the sensor 408 is shown integrated in a housing 404 , wherein the housing 404 may, in one embodiment, take the form of a rod that extends along at least a portion of the back component of the shelving unit and may be configured to couple to the shelving unit.
  • the sensor 408 may include a digital camera; however, in other embodiments, the sensor 408 may be any sensing device whereby merchandise stocked on a shelving unit may be monitored.
  • the sensor 408 is configured to be coupled directly to the shelving unit 402 by way of any fastening means deemed suitable, such as, by way of non-limiting example, magnets, adhesives, brackets, hardware fasteners, and the like. In other embodiments, such as those illustrated in FIGS. 5-6 below, the sensor 408 may be coupled to the shelving unit 402 through a mounting bracket 506 . Further, the location of a sensor such as the sensor 408 is not to be limited to the location shown in FIG. 4 . It should be understood that the sensor 408 may be disposed in any location with respect to a retail display or warehouse storage unit whereby the stocked merchandise may be monitored. Embodiments of some alternative positioning of sensors are illustrated in FIGS. 6A-6C .
  • preferred locations suited to receive the sensor 408 will generally depend upon one or more factors, such as, for example, the type of merchandise, an ability to capture a desired quantity of merchandise within the field of view of the sensor 408 , as well as the methods whereby customers typically remove merchandise from the retail display units.
  • any of the retail displays or warehouse storage units outfitted with the automated inventory intelligence system 400 can monitor the quantity of stocked merchandise by way of one or more sensors such as the sensor 408 and then create a notification or an alert once the remaining merchandise is reduced to a predetermined minimum threshold quantity.
  • low-inventory alerts may be created when the remaining merchandise is reduced to 50% and 20% thresholds; however, the disclosure is not intended to be so limited and thresholds may be predetermined and/or dynamically configurable (e.g., in response to weather conditions, and/or past sales history data).
  • the low-inventory alerts may be sent to in-store staff to signal that a retail display needs to be restocked with merchandise.
  • the low-inventory alerts can include real-time images and/or stock levels of the retail displays so that staff can see the quantity of merchandise remaining on the retail displays by way of a computer or a mobile device.
  • the low-inventory alerts may be sent in the form of text messages in real time to mobile devices carried by in-store staff.
  • the low-inventory alerts can signal in-store staff to restock the retail displays with additional merchandise to maintain a frictionless shopping experience for consumers.
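A minimal sketch of the low-inventory alerting described above, assuming the illustrative 50% and 20% thresholds and a placeholder text-message transport (the send_text_message helper is hypothetical, not a real API):

```python
# Hypothetical sketch of low-inventory alerting: when the estimated stock
# level crosses the 50% or 20% thresholds, notify in-store staff and attach
# the most recent shelf image. The transport is a placeholder only.

ALERT_THRESHOLDS = (0.50, 0.20)   # fractions of initial stock, per the example above

def check_and_alert(inventory_set, stock_fraction, latest_image, staff_numbers):
    for threshold in ALERT_THRESHOLDS:
        if stock_fraction <= threshold:
            message = (f"Restock needed: {inventory_set} at "
                       f"{stock_fraction:.0%} of initial stock")
            for number in staff_numbers:
                send_text_message(number, message, attachment=latest_image)
            return True
    return False

def send_text_message(number, message, attachment=None):
    """Placeholder transport; a real system might use SMS or a staff app."""
    print(f"to {number}: {message} (image attached: {attachment is not None})")
```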
  • the automated inventory intelligence system 400 can facilitate deeper analyses of sales performance by coupling actual sales with display shelf activity.
  • FIG. 5 an illustration of an image captured by a camera of an automated inventory intelligence system is shown in accordance with some embodiments.
  • the image 500 shown in FIG. 5 illustrates the ability of an inventory camera configured for use with the automated inventory intelligence system 206 of FIGS. 2A-2C to capture the image 500 having an approximately 180° viewing angle.
  • the inventory camera 310 1 may capture an image such as the image 500 , which includes a capture of an inventory portion 508 and an inventory portion 510 stocked on shelving 506 .
  • the image 500 may include a capture of a portion of the store environment 502 and additional inventory 512 .
  • the positioning of the inventory camera as shown in FIG. 5 enables the inventory camera to capture images such as the image 500 , which may be analyzed by logic of the automated inventory intelligence system 206 to automatically and intelligently determine the amount of inventory stocked on the shelf.
  • the first inventory portion 508 and the second inventory portion 510 may be identified by the automated inventory intelligence system 206 using object recognition techniques.
  • For example, upon recognition of the first inventory portion 508 (e.g., recognition of Pepsi bottles), logic of the automated inventory intelligence system 206 may analyze the quantity remaining on the shelf 506 .
  • the automated inventory intelligence system 206 may determine whether a threshold number of bottles have been removed from the shelf 506 .
  • the automated inventory intelligence system 206 may generate a report and/or an alert notifying employees and/or the manufacturer that the inventory portion 508 requires restocking. In additional embodiments, the automated inventory intelligence system 206 may determine that less than a threshold number of bottles remain on the shelf 506 and therefore that the first inventory portion 508 requires restocking. Utilization of other methodologies of determining whether at least a predetermined number of items remain on a shelf for a given inventory set is within the scope of the invention.
  • the term “inventory set” generally refers to a grouping of a particular item, e.g., a grouping of a particular type of merchandise, which may include brand, product size (12 oz. bottle v. 2 L bottle), etc.
  • the image 500 may also be analyzed to determine the remaining items of other inventory portions such as the second inventory portion 510 and/or the alternative portion 512 .
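A hypothetical sketch of the counting step described above, assuming the object recognizer returns labeled detections with confidence scores; the labels, the confidence cutoff, and the minimum-count values are illustrative assumptions.

```python
# Hypothetical sketch: group object detections by recognized inventory set
# (e.g., a particular bottled drink), count items remaining, and flag the set
# for restocking if fewer than a predetermined number remain.
from collections import Counter

def restock_decisions(detections, min_remaining):
    """detections: iterable of (inventory_set_label, confidence) pairs;
    min_remaining: {inventory_set_label: minimum item count}."""
    counts = Counter(label for (label, confidence) in detections
                     if confidence >= 0.5)        # assumed confidence cutoff
    return {label: counts.get(label, 0) < minimum
            for label, minimum in min_remaining.items()}

# Example: 3 bottles detected but at least 5 required -> flagged for restock.
flags = restock_decisions([("pepsi_12oz", 0.9)] * 3, {"pepsi_12oz": 5})
```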
  • the inventory camera may be placed at various positions within, or coupled to, a shelving unit. The utilization of such alternative configurations may be dependent upon the type of shelving unit, the type of inventory being captured in images taken by the inventory camera and/or the positioning of inventory within the store environment (e.g., across an aisle).
  • FIGS. 6A-6C provide schematics illustrating sensors coupled to retail displays in accordance with some embodiments.
  • the one or more sensors are configured to be disposed in a retail environment such as by coupling the sensors to retail displays or warehouse storage units.
  • retail displays include, but are not limited to, shelves, panels (e.g., pegboard, gridwall, slatwall, etc.), tables, cabinets, cases, bins, boxes, stands, and racks
  • warehouse storage includes, but is not limited to, shelves, cabinets, bins, boxes, and racks.
  • the sensors may be coupled to the retail displays or the warehouse storage units such that one sensor is provided for every set of inventory items (e.g., one-to-one relationship), one sensor for a number of sets of inventory items (e.g., one-to-many relationship), or a combination thereof.
  • the sensors may also be coupled to the retail displays or the warehouse storage units with more than one sensor for every set of inventory items (e.g., many-to-one relationship), more than one sensor for a number of sets of inventory items (e.g., many-to-many relationship), or a combination thereof.
  • at least two sensors monitor the same set of inventory items thereby providing contemporaneous sensor data for the set of inventory items.
  • Each of FIGS. 6A-6C shows a one-to-one relationship of a sensor to a set of inventory items, but each sensor can alternatively be in one of the foregoing alternative relationships with one or more sets of inventory items.
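The sensor-to-inventory relationships described above (one-to-one, one-to-many, many-to-one, many-to-many) can be represented with a simple mapping; the sensor IDs and inventory-set labels below are illustrative assumptions only.

```python
# Hypothetical sketch of sensor-to-inventory coverage, represented as a
# mapping from sensor IDs to the inventory sets each sensor monitors.

SENSOR_COVERAGE = {
    "cam_A": ["soda_12oz"],                      # one-to-one
    "cam_B": ["chips_family", "chips_single"],   # one-to-many
    "cam_C": ["soda_12oz"],                      # many-to-one with cam_A
}

def sensors_for(inventory_set, coverage=SENSOR_COVERAGE):
    """Return every sensor that monitors a given inventory set, enabling
    contemporaneous (redundant) sensor data when more than one matches."""
    return [sensor for sensor, sets in coverage.items() if inventory_set in sets]
```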
  • the sensors include, but are not limited to, light- or sound-based sensors such as digital cameras and microphones, respectively.
  • the sensors are digital cameras, also referred to as "inventory cameras," with a wide viewing angle of up to 180°.
  • FIG. 6A a schematic illustrating a sensor such as a sensor 606 coupled to a retail shelving unit 604 is shown in accordance with some embodiments.
  • the shelving unit 604 is a component of the housing 602 of the automated inventory intelligence system 600 .
  • the inventory camera 606 is configured in an orientation to view a set of inventory items 608 on an inventory item-containing shelf beneath the upper shelf.
  • While the inventory camera 606 is shown mounted inside the retail shelving unit 604 , such as on a back (e.g., pegboard) of the housing 602 , looking out from the automated inventory intelligence system 600 , the inventory camera 606 may alternatively be coupled to the upper shelf and look in to the automated inventory intelligence system 600 . Due to a wide viewing angle of up to 180°, whether looking out from or in to the automated inventory intelligence system 600 , the inventory camera 606 may collect visual information on sets of inventory items adjacent to the set of inventory items 608 .
  • FIG. 6B a schematic illustrating a sensor such as an inventory camera 612 coupled to an automated inventory intelligence system 600 is shown in accordance with some embodiments.
  • the inventory camera 612 may be coupled to or mounted on the automated inventory intelligence system 600 on an inventory-item containing shelf of the automated inventory intelligence system 600 in an orientation to view a set of inventory items 614 on the inventory item-containing shelf. While the inventory camera 612 is shown mounted inside the automated inventory intelligence system 600 on the inventory item-containing shelf and looking in to the automated inventory intelligence system 600 , which may be advantageous when a light 610 is present in a back of automated inventory intelligence system 600 , the inventory camera 612 may alternatively be coupled to the inventory item-containing shelf and looking out from the automated inventory intelligence system 600 . Due to a wide viewing angle of up to 180°, whether looking in to or out from the automated inventory intelligence system 600 , the inventory camera 612 may collect visual information on sets of inventory items adjacent to the set of inventory items 614 .
  • FIG. 6C a schematic illustrating a sensor such as an inventory camera 622 coupled to the automated inventory intelligence system 600 is shown in accordance with some embodiments.
  • FIG. 6C further provides a second housing 618 with a second sensor such as an inventory camera 624 coupled to a second upper shelf 620 and in communication with a second automated inventory intelligence system 616 in accordance with some embodiments.
  • the automated inventory intelligence system 600 and second automated inventory intelligence system 616 may be separate and independent systems or may be communicatively coupled and/or processing data cooperatively.
  • the inventory camera 622 may be physically coupled to or mounted on the automated inventory intelligence system 600 in an orientation to view a set of inventory items 628 on an inventory-item containing shelf of an opposing shelving unit across an aisle such as the automated inventory intelligence system 616 .
  • the inventory camera 624 may be coupled to or mounted on the automated inventory intelligence system 616 in an orientation to view a set of inventory items 626 on an inventory-item containing shelf of an opposing shelving unit across an aisle such as the automated inventory intelligence system 600 .
  • the inventory camera 622 can collect visual information on sets of inventory items on the automated inventory intelligence system 616 adjacent to the set of inventory items 628 (not shown), and the inventory camera 624 can collect visual information on sets of inventory items on the automated inventory intelligence system 600 adjacent to the set of inventory items 626 (not shown).
  • inventory cameras such as inventory cameras 606 , 612 , 622 , and 624 are coupled to or mounted on endcaps or other vantage points of the automated inventory intelligence systems to collect visual information while looking in to the retail shelving units.
  • the automated inventory intelligence system 700 may include one or more processors 702 that are coupled to a communication interface 704 .
  • the communication interface 704 , in combination with a communication interface logic 708 , enables communications with external network devices and/or other network appliances to transmit and receive data.
  • the communication interface 704 may be implemented as a physical interface including one or more ports for wired connectors. Additionally, or in the alternative, the communication interface 704 may be implemented with one or more radio units for supporting wireless communications with other electronic devices.
  • the communication interface logic 708 may include logic for performing operations of receiving and transmitting data via the communication interface 704 to enable communication between the automated inventory intelligence system 700 and network devices via a network (e.g., the internet) and/or cloud computing services, not shown.
  • the processor(s) 702 is further coupled to a persistent storage 706 .
  • the persistent storage 706 may store logic as software modules, including an automated inventory intelligence system logic 710 and the communication interface logic 708 .
  • the operations of these software modules, upon execution by the processor(s) 702 are described above. Of course, it is contemplated that some or all of this logic may be implemented as hardware, and if so, such logic could be implemented separately from each other.
  • the automated inventory intelligence system 700 may include hardware components such as fascia 711 1 - 711 m (wherein m ≥ 1), inventory cameras 712 1 - 712 i (wherein i ≥ 1), proximity sensors 714 1 - 714 j (wherein j ≥ 1), and facial recognition cameras 716 1 - 716 k (wherein k ≥ 1).
  • couplings, i.e., communication paths, are not illustrated between the processor(s) 702 and the fascia 711 1 - 711 m , the inventory cameras 712 1 - 712 i , the proximity sensors 714 1 - 714 j , and the facial recognition cameras 716 1 - 716 k ; however, such couplings may be direct or indirect and are configured to allow for the provision of instructions from the automated inventory intelligence system logic 710 to such components.
  • Each of the inventory cameras 712 1 - 712 i , the proximity sensors 714 1 - 714 j , and the facial recognition cameras 716 1 - 716 k may be configured to capture images, e.g., at predetermined time intervals or upon a triggering event, and transmit the images to the persistent storage 706 .
  • the automated inventory intelligence system logic 710 may, upon execution by the processor(s) 702 , perform operations to analyze the images.
  • the automated inventory intelligence system logic 710 includes an image receiving logic 718 , an object recognition logic 720 , an inventory threshold logic 722 , an alert generation logic 724 , a facial recognition logic 726 and a proximity logic 728 , as will be discussed in further detail below.
  • the image receiving logic 718 can be configured to, upon execution by the processor(s) 702 , perform operations to receive a plurality of images from a sensor, such as the inventory cameras 712 1 - 712 i .
  • the image receiving logic 718 may receive a trigger, such as a request for a determination as to whether an inventory set needs to be restocked, and request an image be captured by one or more of the inventory cameras 712 1 - 712 i .
  • the object recognition logic 720 is configured to, upon execution by the processor(s) 702 , perform operations to analyze an image received from an inventory camera 712 1 - 712 i , including object recognition techniques.
  • the object recognition techniques may include the use of machine learning, predetermined rule sets and/or deep convolutional neural networks.
  • the object recognition logic 720 may be configured to identify one or more inventory sets within an image and determine an amount of each product within the inventory set.
  • the object recognition logic 720 may identify a percentage, numerical determination, or other equivalent figure that indicates how much of the inventory set remains on the shelf (i.e., stocked) relative to an initial amount (e.g., based on analysis and comparison with an earlier image and/or retrieval of an initial amount predetermined and stored in a data store, such as the inventory threshold data store 730 ).
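  • By way of illustration only, the following minimal Python sketch shows one way such a stocked-fraction figure could be computed from a detected item count and a stored initial amount; the function and parameter names are hypothetical and are not drawn from the disclosure.

```python
# Hypothetical sketch: estimate how much of an inventory set remains stocked.
# detected_count stands in for the output of the object recognition step.

def stocked_fraction(detected_count: int, initial_count: int) -> float:
    """Return the fraction of the initial stock still on the shelf (0.0 to 1.0)."""
    if initial_count <= 0:
        raise ValueError("initial_count must be positive")
    return min(detected_count / initial_count, 1.0)

# Example: 8 bottles detected out of an initial facing of 16 -> 0.5 (50% remaining)
print(stocked_fraction(detected_count=8, initial_count=16))
```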
  • the inventory threshold logic 722 is configured to, upon execution by the processor(s) 702 , perform operations to retrieve one or more predetermined thresholds and determine whether the inventory set needs to be restocked.
  • a plurality of predetermined thresholds, which may be stored in the inventory threshold data store 730 , may be utilized in a single embodiment.
  • a first threshold may be used to determine whether the inventory set needs to be stocked and an alert transmitted to, for example, a retail employee (e.g., at least a first amount of the initial inventory set has been removed).
  • a second threshold may be used to determine whether a product delivery person needs to deliver more of the corresponding product to the retailer (e.g., indicating at least a second amount of the initial inventory set has been removed, the second amount greater than the first amount).
  • alerts may be transmitted to both a retail employee and a product delivery person.
  • the alert generation logic 724 can be configured to, upon execution by the processor(s) 702 , perform operations to generate alerts according to determinations made by, for example, the object recognition logic 720 and the inventory threshold logic 722 .
  • the alerts may take any form such as a digital communication transmitted to one or more electronic devices, and/or an audio/visual cue in proximity to the shelf on which the inventory set is stocked, etc.
  • the facial recognition logic 726 may be configured to, upon execution by the processor(s) 702 , perform operations to analyze images received by the image receiving logic 718 from the facial recognition cameras 716 1 - 716 k .
  • the facial recognition logic 726 may look to determine trends in customers based on ethnicity, age, gender, time of visit, geographic location of the store, etc., and, based on additional analysis, the automated inventory intelligence system logic 710 may determine trends in accordance with graphics displayed by the automated inventory intelligence system 700 , sales, time of day, time of the year, day of the week, etc.
  • Facial recognition logic 726 may also be able to generate data relating to the overall traffic associated with the facial recognition cameras 716 1 - 716 k . This can be generated directly based on the number of faces (unique and non-unique) that are processed within a given time period. This data can be stored within the persistent storage 706 within a traffic density log 734 .
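  • As a non-authoritative illustration of how face detections might be rolled up into the kind of traffic data described above, the sketch below aggregates detection timestamps into hourly counts; the data structures are assumptions made for illustration.

```python
from collections import Counter
from datetime import datetime

# Hypothetical sketch: aggregate face-detection timestamps into hourly counts
# suitable for a traffic density log such as the traffic density log 734.
def hourly_traffic(detection_times: list[datetime]) -> dict[str, int]:
    return dict(Counter(t.strftime("%Y-%m-%d %H:00") for t in detection_times))

# Example: two detections in the same hour produce a single bucket with count 2.
sample = [datetime(2019, 10, 10, 14, 5), datetime(2019, 10, 10, 14, 40)]
print(hourly_traffic(sample))  # {'2019-10-10 14:00': 2}
```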
  • the proximity logic 728 can be configured to, upon execution by the processor(s) 702 , perform operations to analyze images received by, for example, the image receiving logic 718 from the proximity sensors 714 1 - 714 j .
  • the proximity logic 728 may determine when a customer is within a particular distance threshold from the shelving unit on which the inventory set is stocked and transmit a communication (e.g., instruction or command) to change the graphics displayed on the fascia, e.g., such as the fascia 711 1 - 711 m .
  • Data related to the proximity, and therefore the potential effectiveness of an advertisement, may be stored within a proximity log 732 . In this way, data may be provided that can be correlated with particular displays, products, and/or advertising campaigns.
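  • A minimal sketch of the proximity-driven behavior described above follows, assuming a simple distance threshold and an in-memory proximity log; the names and the threshold value are illustrative only.

```python
from datetime import datetime, timezone

# Hypothetical sketch of the proximity-driven display switch described above.
# The threshold value, graphic names, and log format are illustrative placeholders.
PROXIMITY_THRESHOLD_M = 2.0  # assumed distance threshold in meters

def on_proximity_reading(distance_m: float, proximity_log: list) -> str:
    """Choose a fascia graphic from shopper distance and record the event."""
    graphic = "product_information" if distance_m <= PROXIMITY_THRESHOLD_M else "promotion"
    proximity_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "distance_m": distance_m,
        "graphic_shown": graphic,
    })
    return graphic  # the caller would transmit this choice as an instruction to the fascia

log: list = []
print(on_proximity_reading(1.2, log))  # product_information
```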
  • Referring to FIG. 7B , an exemplary embodiment of a second logical representation of the automated inventory intelligence system of FIG. 1 is shown in accordance with some embodiments.
  • the illustration of FIG. 7B provides a second embodiment of the automated inventory intelligence system 700 in which the automated inventory intelligence system logic 710 of FIG. 7A resides in cloud computing services 740 .
  • each of the fascia 711 1 - 711 m , the inventory cameras 712 1 - 712 i , the proximity sensors 714 1 - 714 j , and the facial recognition cameras 716 1 - 716 k may be configured to capture images which are then transmitted, via the communication interface 704 , to the automated inventory intelligence system logic 710 in the cloud computing services 740 .
  • the automated inventory intelligence system logic 710 , upon execution via the cloud computing services 740 , performs operations to analyze the images.
  • Processor(s) 702 can represent a single processor or multiple processors with a single processor core or multiple processor cores included therein.
  • Processor(s) 702 can represent one or more general-purpose processors such as a microprocessor, a central processing unit (“CPU”), or the like. More particularly, processor(s) 702 may be a complex instruction set computing (“CISC”) microprocessor, reduced instruction set computing (“RISC”) microprocessor, very long instruction word (“VLIW”) microprocessor, or processor implementing other instruction sets, or processors implementing a combination of instruction sets.
  • Processor(s) 702 can also be one or more special-purpose processors such as an application specific integrated circuit (“ASIC”), a field programmable gate array (“FPGA”), a digital signal processor (“DSP”), a network processor, a graphics processor, a communications processor, a cryptographic processor, a co-processor, an embedded processor, or any other type of logic capable of processing instructions.
  • processors 702 can be configured to execute instructions for performing the operations and steps discussed herein.
  • Persistent storage 706 can include one or more volatile storage (or memory) devices, such as random access memory (“RAM”), dynamic RAM (“DRAM”), synchronous DRAM (“SDRAM”), static RAM (“SRAM”), or other types of storage devices.
  • Persistent storage 706 can store information including sequences of instructions that are executed by the processor(s) 702 , or any other device. For example, executable code and/or data of a variety of operating systems, device drivers, firmware (e.g., basic input/output system or BIOS), and/or applications may be loaded in persistent storage 706 and executed by the processor(s) 702 .
  • An operating system may be any kind of operating system, such as, for example, Windows® operating system from Microsoft®, Mac OS®/iOS® from Apple, Android® from Google®, Linux®, Unix®, or other real-time or embedded operating systems such as VxWorks.
  • the techniques shown in the figures may be implemented using code and data stored and executed on one or more electronic devices.
  • Such electronic devices store and communicate (internally and/or with other electronic devices over a network) code and data using computer-readable media, such as non-transitory computer-readable storage media (e.g., magnetic disks; optical disks; random access memory; read only memory; flash memory devices; phase-change memory) and transitory computer-readable transmission media (e.g., electrical, optical, acoustical or other form of propagated signals—such as carrier waves, infrared signals, digital signals).
  • the techniques described herein may be performed by processing logic that includes hardware (e.g., circuitry, dedicated logic, etc.), firmware, software (e.g., embodied on a non-transitory computer readable medium), or a combination thereof.
  • Referring to FIG. 8 , a flowchart illustrating an exemplary method for analyzing, by the automated inventory intelligence system logic 710 of FIGS. 7A-7B , an image of inventory to determine whether the inventory is to be restocked is shown in accordance with some embodiments.
  • Each block illustrated in FIG. 8 represents an operation performed in the method 800 of analyzing an image of inventory to determine whether the inventory is to be restocked.
  • the method 800 starts when the image receiving logic 718 of the automated inventory intelligence system logic 710 receives an image captured by an inventory camera, e.g., the inventory camera 712 1 (block 802 ).
  • Upon receiving the image, the object recognition logic 720 of the automated inventory intelligence system logic 710 performs processing on the image including one or more object recognition techniques to distill at least a first inventory set (block 804 ).
  • the object recognition logic 720 may perform operations including, but not limited or restricted to, identification of objects of interest within the image, quantification of objects of interest within the image, depth determination of objects of interest within the image, and/or pattern recognition of at least portions of items of interest within the image in order to differentiate inventory sets.
  • the object recognition logic 720 may distill at least two inventory sets from the image 500 including the first inventory portion 508 and second inventory portion 510 .
  • the object recognition techniques may utilize deep convolutional neural networks, machine learning techniques and/or predetermined rule sets. Object recognition techniques may also be referred to as image recognition and/or computer vision.
  • object recognition logic 720 determines the amount of inventory within the first inventory set (block 806 ).
  • the amount of inventory may refer to a specific number of items (e.g., sixteen 2 L soda bottles remain) or a percentage/ratio (e.g., 50% of the 2 L soda bottles remain or ½ of the 2 L soda bottles remain).
  • the number of items initially stocked may be predetermined and stored in the inventory threshold data store 730 .
  • the object recognition logic 720 may determine or estimate the number of items to fill the shelf space provided for the first inventory set (e.g., estimate the shelf space based on the location of the first inventory set relative to the edges of the shelf and other inventory sets also located on the shelf).
  • the inventory threshold logic 722 can determine whether the amount of inventory in the first inventory set meets a predetermined threshold (block 808 ).
  • the thresholds may be stored in and retrieved from the inventory threshold data store 730 .
  • a single predetermined threshold may be utilized for all inventory sets, e.g., a percentage of inventory remaining on the shelf.
  • the object recognition logic 720 may determine the percentage of inventory remaining by retrieving a predetermined number representing the initial inventory (e.g., a number of items initially stocked) and determining the ratio between the initial inventory and the current inventory.
  • Such a predetermined number may also be stored in the inventory threshold data store 730 and stored according to the inventory set and the retailer or retail location (as one retailer may not have the same amount of space to stock the inventory set as a second retailer).
  • the predetermined number may be dynamically adjustable by, for example, a retail employee when the inventory set is restocked and/or the inventory set is moved from a first location to a second location.
  • a predetermined threshold (e.g., either a percentage or a number) may be stored for one or more inventory sets.
  • a first threshold may correspond to a first inventory set, such as 2 L bottles of a first soda
  • a second threshold (different than the first threshold) may correspond to a second inventory set, such as 12 oz. bottles of the first soda.
  • thresholds may be dynamically adjusted for various reasons including, but not limited to, sale prices, local stock levels, sales history, promotional campaigns, etc.
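  • As a rough, non-authoritative sketch of how such per-inventory-set, per-location thresholds might be stored and dynamically adjusted, consider the following; the keys, field names and numeric values are assumptions made for illustration.

```python
from dataclasses import dataclass

# Hypothetical sketch of an inventory threshold data store keyed by
# (inventory set, retail location); names and values are illustrative only.
@dataclass
class ThresholdEntry:
    initial_count: int        # items stocked when the shelf facing is full
    restock_fraction: float   # below this fraction, alert a retail employee
    reorder_fraction: float   # below this (lower) fraction, also alert the distributor

threshold_store: dict[tuple[str, str], ThresholdEntry] = {
    ("2L-soda-brand-A", "store-001"): ThresholdEntry(16, 0.50, 0.25),
}

def update_initial_count(inventory_set: str, location: str, new_count: int) -> None:
    """Dynamically adjust the stored initial count, e.g., after a restock or a relocation."""
    threshold_store[(inventory_set, location)].initial_count = new_count

update_initial_count("2L-soda-brand-A", "store-001", 20)
```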
  • the alert generation logic 724 may optionally generate a report, or add information to an existing report (such as a log), indicating that a determination was made that the first inventory set did not need to be restocked (block 810 ). For example, one or more of the following may be included in a log: a time stamp, an indication or identifier of the inventory camera that captured the image, the image, an indication or identifier of the first inventory set (as well as other inventory sets recognized), an indication of the determination not to restock, etc.
  • the alert generation logic 724 may generate an alert (and/or a report) indicating that the first inventory set is to be restocked (block 812 ).
  • the alert may take many forms including, but not limited or restricted to, a text message, an email, or any other digitally transmitted audio/visual indication (e.g., message transmitted via Bluetooth®). Additionally, or in the alternative, the alert may include a visual indication located proximate to the shelf on which the first inventory set is stocked (e.g., a light on or near the relevant shelf or a light on the corresponding cabinet display top).
  • the alert may include information such as an identifier of the first inventory set, an indicator identifying the location of the first inventory set within the retailer or retail location (e.g., a map of the retailer's space and an indicator of the relevant shelf on the map or an aisle number), an amount of inventory to restock (e.g., a particular number or percentage, the determination of which is discussed above), the image of the first inventory set, etc.
  • the alert may be transmitted to one or more of: an electronic device at the retailer or retail location (e.g., a dedicated tablet in the back of the retailer), an electronic device of a specific employee, an electronic device of a distributor/product delivery person, etc.
  • the determination of what information to include and to what electronic devices to transmit the alert may be based on a predetermined rule set (which may also be stored in the inventory threshold data store 730 ).
  • the predetermined rule set may include a set of thresholds such that a first, lower threshold indicates an alert is to be sent to a retail employee, while a second, higher threshold indicates the alert is also to be sent to a distributor/product delivery person (i.e., when inventory is very low, send alert to distributor/product delivery person).
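  • A minimal sketch of such a tiered rule set is shown below, assuming two depletion thresholds and placeholder recipient identifiers; it is an illustration, not the disclosed implementation.

```python
# Hypothetical sketch of the tiered rule set described above: crossing a first,
# higher-remaining threshold notifies a retail employee; crossing a second, lower
# threshold also notifies the distributor. Recipient names are placeholders.
def route_alerts(fraction_remaining: float,
                 restock_fraction: float = 0.50,
                 reorder_fraction: float = 0.25) -> list[str]:
    recipients: list[str] = []
    if fraction_remaining <= restock_fraction:
        recipients.append("retail_employee_device")
    if fraction_remaining <= reorder_fraction:
        recipients.append("distributor_device")
    return recipients

# Example: 20% remaining -> both the employee and the distributor are alerted.
print(route_alerts(0.20))  # ['retail_employee_device', 'distributor_device']
```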
  • the alert generation logic 724 may access a data store, not shown, which includes information as to the total amount of the product comprising the first inventory set at a particular retailer or retail location (e.g., amount of the product stocked and also located in the back of the retailer or retail location, which may be dynamically updated in real-time based on product sales information obtained when a customer makes a purchase).
  • Referring to FIG. 9 , a flowchart illustrating an exemplary method for analyzing, by the automated inventory intelligence system logic of FIGS. 7A-7B , an image of inventory to determine whether the inventory is to be restocked based on a triggering event is shown in accordance with some embodiments.
  • Each block illustrated in FIG. 9 represents an operation performed in the method 900 of analyzing an image of inventory to determine whether the inventory is to be restocked based on a triggering event.
  • the method 900 can start when the image receiving logic 718 of the automated inventory intelligence system logic 710 receives a trigger to determine whether inventory of a first inventory set is to be restocked (block 902 ).
  • a trigger may take various forms including a communication from a proximity sensor 714 1 - 714 j , from a facial recognition camera 716 1 - 716 k , from a retail employee and/or from a product delivery person.
  • a trigger may be received automatically at predetermined intervals (e.g., a pull system such that the inventory cameras 712 1 - 712 i are activated upon an instruction from the image receiving logic 718 ).
  • the image receiving logic 718 can transmit a signal or other communication to an inventory camera 712 1 - 712 i monitoring the first inventory set identified in the trigger (e.g., the inventory camera configured to capture an image of the first inventory set) (block 904 ).
  • the image receiving logic 718 may determine one or more inventory cameras of the inventory cameras 712 1 - 712 i that are to capture an image.
  • the received trigger may include an indication of a particular inventory set (e.g., 2 L bottles of a first soda).
  • the image receiving logic 718 may determine the location of the particular inventory set in the retail location and determine the inventory camera 712 1 - 712 i that is configured/positioned to capture an image of the identified location.
  • a data store of the automated inventory intelligence system 700 may be configured to store data indicating pairings of: ⁇ inventory set, retail location ⁇ . Based on the stored pairing, which may be dynamically adjustable based on restocking of shelves, the image receiving logic 718 may determine the retail location of the inventory. In addition, the data store may also be configured to store pairings of: ⁇ retail location, inventory camera ⁇ . Therefore, based on the determined retail location, the image receiving logic 718 may determine the inventory camera 712 1 - 712 i that is positioned to capture an image of the retail location at which the particular inventory set is stocked. Based on this determination, the image receiving logic 718 may transmit a signal to the identified inventory camera with the instruction to capture an image.
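  • The two pairings described above can be pictured as simple lookup tables; the following sketch is illustrative only, and the identifiers used are hypothetical.

```python
# Hypothetical sketch of the two lookup tables described above:
# {inventory set -> retail location} and {retail location -> inventory camera}.
# Identifiers are illustrative; real pairings would be updated as shelves change.
inventory_location: dict[str, str] = {"2L-soda-brand-A": "aisle-4-bay-2"}
location_camera: dict[str, str] = {"aisle-4-bay-2": "inventory_camera_712_3"}

def camera_for_inventory_set(inventory_set: str) -> str:
    """Resolve which inventory camera should be instructed to capture an image."""
    return location_camera[inventory_location[inventory_set]]

# A trigger naming "2L-soda-brand-A" resolves to "inventory_camera_712_3".
print(camera_for_inventory_set("2L-soda-brand-A"))
```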
  • the image receiving logic 718 can receive an image captured by the relevant inventory camera (block 906 ).
  • the automated inventory intelligence system logic 710 determines an amount of inventory within the first inventory set as discussed in FIG. 8 (block 908 ). Further, the automated inventory intelligence system logic 710 transmits a communication indicating whether inventory of the first inventory set is to be restocked based on a comparison of the amount of inventory of the first inventory set remaining with one or more inventory thresholds as discussed in FIG. 8 (block 910 ).
  • Referring to FIG. 10 , an exemplary embodiment of the proximity sensor positioned on a cabinet display top of the automated inventory intelligence system of FIG. 1 is shown in accordance with some embodiments.
  • the proximity sensor 107 is a camera (“proximity camera 107 ”); however, the disclosure is not intended to be so limited.
  • the proximity sensor 107 may be any sensor configured to detect objects within a predetermined proximity region, e.g., without any physical contact with the objects.
  • a proximity region may be a geo-fence, e.g., a virtual perimeter for a geographical area.
  • Examples of alternative proximity sensors may include, but are not limited or restricted to, electromagnetic sensors, infrared sensors, and optical sensors.
  • capacitive proximity sensors, photoelectric proximity sensors, and/or inductive proximity sensors may be used.
  • the proximity camera 107 of FIG. 10 is configured to couple to the cabinet display top 106 via a mount 1004 and includes a lens 1000 and a housing 1002 .
  • the proximity camera 107 may be the same camera as the inventory camera 210 as seen in, for example, FIG. 2C , or an equivalent model.
  • the proximity camera 107 may have a viewing angle of approximately 180° and is capable of monitoring a larger portion of the store environment, particularly the geographic region in front of the shelving unit to which the proximity camera 107 is coupled.
  • the proximity camera 107 may be coupled to a side or the bottom of the cabinet display top 106 . In addition, although not shown, the proximity camera 107 may be integrated directly into the cabinet display top 106 . In addition, the proximity camera 107 may be positioned at any location on the cabinet display top 106 , e.g., the middle, toward either end, etc.
  • the image 1100 provides one exemplary perspective from the proximity camera 107 of FIGS. 1 and 10 .
  • the image 1100 provides an illustration of a store environment 1102 including a view of numerous store aisles as well as a geographic region proximate the shelving unit to which the automated inventory intelligence system (and hence the proximity camera 107 ) is coupled.
  • the proximity camera 107 is configured to capture the image 1100 , which provides a capture of objects in front of, and approaching, the front of the shelving unit.
  • the image 1100 includes an image capture, e.g., a picture, of a region including a first proximity region 1104 .
  • the image 1100 captures a first shopper 1106 at least partially within the first proximity region 1104 , and shoppers 1108 , 1110 and 1112 outside of the first proximity region 1104 .
  • the image 1100 may capture additional objects 1111 .
  • the proximity logic 728 as seen in FIGS. 7A-7B , is communicatively coupled to the proximity sensor 107 and may determine the graphic(s) to be displayed on the fascia, according to the location of one or more objects detected in the captured image 1100 .
  • the proximity logic 728 may be configured to detect objects within the image 1100 and determine whether each detected object is within a proximity region. In some embodiments, the proximity logic 728 may be configured to detect objects within one or more proximity regions.
  • a single proximity region 1104 may be utilized such that when an object is detected and determined to be within the proximity region 1104 , the proximity logic 728 may transmit one or more instructions causing the automated inventory intelligence system to display a first graphic (e.g., product information) on its fascia, e.g., the fascia 108 1 - 108 4 , as well as the cabinet display top 106 . Subsequently, when objects are no longer detected within the proximity region, the proximity logic 728 may transmit one or more instructions causing the automated inventory intelligence system to display a second graphic (e.g., a promotion or other immersive graphic) on the fascia 108 1 - 108 4 and/or the cabinet display top 106 .
  • the proximity logic 728 may be configured to detect objects and determine whether each object is within one of a plurality of proximity regions such that the automated inventory intelligence system causes the display of various graphics depending on in which region each object is determined to be.
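  • As an illustration of how region membership might be tested on a captured image, the sketch below treats each proximity region as an axis-aligned box in the image plane; the region geometry and names are assumptions made for illustration.

```python
# Hypothetical sketch: test whether a detected object's image coordinates fall
# inside one or more proximity regions modeled as axis-aligned pixel boxes.
Region = tuple[int, int, int, int]  # (x_min, y_min, x_max, y_max) in pixels

def regions_containing(point: tuple[int, int], regions: dict[str, Region]) -> list[str]:
    x, y = point
    return [name for name, (x0, y0, x1, y1) in regions.items()
            if x0 <= x <= x1 and y0 <= y <= y1]

# Illustrative geometry only: the second region sits inside the first, nearer the shelf.
regions = {"first_region": (100, 400, 1180, 700), "second_region": (300, 550, 980, 700)}
print(regions_containing((640, 600), regions))  # ['first_region', 'second_region']
```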
  • Referring to FIG. 11B , a second illustration of an image captured by the proximity sensor of the automated inventory intelligence system of FIG. 1 is shown in accordance with some embodiments.
  • the image 1114 captured by the proximity camera 107 illustrates the proximity logic 728 being configured to detect objects within a plurality of proximity regions, e.g., the first proximity region 1118 and the second proximity region 1120 within the store environment 1116 .
  • the proximity logic 728 may provide instructions that cause the corresponding fascia to display varying graphics depending on whether objects are detected in the first proximity region 1118 and/or the second proximity region 1120 .
  • the second proximity region 1120 may comprise a smaller region compared to the first proximity region 1118 and be located closer to the shelving unit compared to the first proximity region 1118 .
  • the first and second proximity regions 1118 , 1120 may be overlapping, or separate and non-overlapping (or a mix of both).
  • Referring to FIG. 12A , an exemplary illustration of a plurality of proximity regions based on one configuration of a proximity sensor of an automated inventory intelligence system is shown in accordance with some embodiments.
  • the illustration of FIG. 12A provides an illustration of a first proximity region 1216 and a second proximity region 1218 partially surrounding the shelving unit 1200 , with a shopper 1219 located within the first proximity region 1216 .
  • the shelving unit 1200 , being similar in this embodiment to the shelving unit 102 of FIG. 1 , includes a first shelf 1202 and a second shelf 1204 as well as a back component 1206 .
  • the shelving unit 1200 may be coupled to the automated inventory intelligence system 1207 , which is shown to include inventory cameras 1208 1 - 1208 4 , and fascia 1210 1 - 1210 2 .
  • the automated inventory intelligence system 1207 of the depicted embodiment may be closely equivalent to the automated inventory intelligence system 100 of FIG. 1 .
  • the shelving unit 1200 may be stocked with inventory such as a first inventory set 1212 and a second inventory set 1214 .
  • the first and second inventory sets 1212 , 1214 may be monitored by the inventory cameras 1208 1 - 1208 4 as discussed above in accordance with at least FIGS. 1-5 .
  • the graphics displayed on the fascia 1210 1 - 1210 2 may be changed in accordance with instructions provided by the proximity logic 728 based on whether an object is detected in the first proximity region 1216 and/or the second proximity region 1218 .
  • the proximity logic 728 may receive images captured by a proximity camera, such as the proximity camera 107 , and perform object recognition techniques, e.g., such as those discussed above with respect to the images captured by the inventory cameras 110 1 - 110 i .
  • the images captured by the proximity camera may be provided to the object recognition logic 720 .
  • the proximity logic 728 may cause the automated inventory intelligence system 1207 to change states, e.g., an immersive state, a promotion state, and a product information state. It should be noted that these are merely exemplary names and that an alternative naming convention may be utilized. Further, more or fewer states may be utilized by the proximity logic 728 . For example, the image 1100 of FIG. 11A illustrates a scenario in which the proximity logic 728 may be configured with fewer than three states, e.g., two states.
  • the shopper 1219 has been detected as being located in the first proximity region 1216 . Based on the detection of the shopper 1219 being located in the first proximity region 1216 , the proximity logic 728 has caused the automated inventory intelligence system 1207 to enter into a “promotion state” such that the fascia 1210 1 - 1210 2 are configured to display a promotion graphic (e.g., “Sale!” and “Kids Cereal!”).
  • the proximity logic 728 may be configured with three possible states: (1) an immersive state triggered when no objects are detected in either of the first proximity region 1216 or the second proximity region 1218 , wherein the fascia 1210 1 - 1210 2 are to display an immersive graphic; (2) a promotion state triggered when an object is detected in the first proximity region 1216 but no object detected in the second proximity region 1218 , wherein the fascia 1210 1 - 1210 2 are to display a promotion graphic; and (3) a product information state triggered when an object is detected in the second proximity region 1218 , wherein the fascia 1210 1 - 1210 2 are to display a product information graphic.
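  • The three-state mapping described above can be summarized in a few lines; the sketch below is illustrative only and assumes the detection results are already available.

```python
# Hypothetical sketch of the three-state mapping described above; state names
# mirror the description, and how detections are obtained is out of scope here.
def display_state(in_first_region: bool, in_second_region: bool) -> str:
    if in_second_region:
        return "product_information"  # a shopper is immediately in front of the unit
    if in_first_region:
        return "promotion"            # a shopper is nearby but not directly in front
    return "immersive"                # nobody detected; attract attention from afar

assert display_state(False, False) == "immersive"
assert display_state(True, False) == "promotion"
assert display_state(True, True) == "product_information"
```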
  • Additional fascia, not shown, and/or the cabinet display top 1220 may also display graphics according to the state of the automated inventory intelligence system 1207 .
  • Embodiments of an automated inventory intelligence system 1207 in different states are discussed below and illustrated in FIGS. 12B-12D .
  • one goal of the graphics displayed on the cabinet display top 1220 and/or the fascia 1210 1 - 1210 2 is to grab the attention of shoppers and prolong dwell time in front of the shelving unit 1200 , which makes it more likely the shopper will purchase an item stocked on the shelving unit 1200 .
  • the invention provides manufacturers and retailers the advantage of enabling point of sale marketing through dynamic video and animation.
  • Referring to FIGS. 12B-12D , an exemplary illustration of fascia of the automated inventory intelligence system 1207 displaying a first graphic while in a promotion state is shown in accordance with some embodiments.
  • FIG. 12B illustrates the shelving unit 1200 of FIG. 12A coupled to the automated inventory intelligence system of FIG. 12A in a promotion state.
  • the promotion state may correspond to the display of content-specific promotions graphics on one or more of the fascia 1210 1 - 1210 i and cabinet display top 1220 .
  • the graphics displayed may be (i) specific to the items stocked on the shelving unit, (ii) specific to the brands having items stocked on the shelving unit, (iii) promotions and/or discounts specific to the items stocked on the shelving unit, etc.
  • the proximity logic 728 has provided instructions causing the fascia 1210 1 - 1210 i to display promotional information, e.g., a promotion benefitting schools by virtue of purchasing cereal and returning a portion of the cereal box, which corresponds to the items, e.g., cereal, stocked on the shelving unit 1200 .
  • FIG. 12C provides an exemplary illustration of fascia of the automated inventory intelligence system of FIG. 12A displaying a first graphic while in a product information state, in accordance with some embodiments.
  • FIG. 12C illustrates the shelving unit 1200 of FIG. 12A coupled to the automated inventory intelligence system 1207 in a product information state.
  • the product information state may correspond to the display of product specific information graphics, including pricing information, on one or more of the fascia 1210 1 - 1210 i and cabinet display top 1220 .
  • the product information state may include information similar to information displayed in the promotion state including discounts or promotions on particular items or with particular brands.
  • the proximity logic 728 has provided instructions causing the fascia 1210 1 - 1210 i to display product information, e.g., pricing information for each of the items stocked on the shelving unit 1200 .
  • FIG. 12D provides an exemplary illustration of fascia of the automated inventory intelligence system 1207 displaying a first graphic while in an immersive state, in accordance with some embodiments.
  • FIG. 12D illustrates the shelving unit 1200 of FIG. 12A coupled to the automated inventory intelligence system 1207 in an immersive state.
  • the immersive state may correspond to the display of graphics which may span a plurality of the fascia 1210 1 - 1210 i .
  • the graphics displayed while the automated inventory intelligence system 1207 is in the immersive state are intended to grab the attention of shoppers not standing immediately in front of the shelving unit 1200 .
  • the graphics displayed while the automated inventory intelligence system 1207 is in the immersive state can span from the cabinet display top 1220 and across each of the fascia 1210 1 - 1210 i .
  • the proximity logic 728 has provided instructions causing the cabinet display top 1220 and the fascia 1210 1 - 1210 i to display an immersive graphic, e.g., an animation of a school bus.
  • the graphic may be animated wherein lights of the school bus may appear to flash and/or the school bus may appear to be approaching the edge of the shelving unit 1200 facing customers and the retail environment.
  • Referring to FIG. 13 , an exemplary flowchart illustrating operations corresponding to detecting whether an object is located within one of a plurality of proximity regions performed by an automated inventory intelligence system is shown in accordance with some embodiments.
  • Each block illustrated in FIG. 13 represents an operation performed in the method 1300 of detecting whether an object is located within one of a plurality of proximity regions performed by an automated inventory intelligence system.
  • FIG. 13 will be discussed with reference to FIGS. 12A-12D .
  • the method 1300 begins when an image captured by a proximity sensor, e.g., the proximity camera 107 , is received (block 1302 ). Following receipt of the image from the proximity camera 107 , object recognition techniques, as described above, can be performed on the image (block 1304 ).
  • when an object is detected within the second proximity region 1218 , instructions are transmitted by the proximity logic 728 that are configured to cause fascia, e.g., the fascia 1210 1 - 1210 i , and/or the cabinet display top 1220 to display a product information graphic (block 1308 ).
  • the method 1300 then continues the operations discussed below with respect to FIG. 13 .
  • when an object is detected within the first proximity region 1216 but not within the second proximity region 1218 , instructions are transmitted by the proximity logic 728 that are configured to cause fascia, e.g., the fascia 1210 1 - 1210 i and/or the cabinet display top 1220 to display a promotion graphic (block 1312 ).
  • the method 1300 then continues the operations discussed below with respect to FIG. 13 .
  • when no object is detected within either proximity region, instructions are transmitted by the proximity logic 728 that are configured to cause fascia, e.g., the fascia 1210 1 - 1210 i , and/or the cabinet display top 1220 to display an immersive graphic (block 1314 ).
  • the method 1300 then continues the operations discussed below with respect to FIG. 13 .
  • the proximity logic 728 can stall for a predetermined amount of time. Following the expiration of the predetermined amount of time, the proximity logic 728 may transmit a request for an updated image to the proximity camera 107 and perform object recognition techniques on the received image (block 1316 ).
  • when an object is detected only in the first proximity region 1216 , i.e., no change since the previous image was captured, no change is instituted to the graphic displayed on the fascia 1210 1 - 1210 i and/or the cabinet display top 1220 and the method 1300 returns to block 1316 .
  • when an object is detected within the second proximity region 1218 , instructions are transmitted by the proximity logic 728 that are configured to cause fascia, e.g., the fascia 1210 1 - 1210 i , and/or the cabinet display top 1220 to display a product information graphic (block 1320 ) and the method 1300 returns to block 1316 .
  • when no object is detected within either proximity region, instructions can be transmitted by the proximity logic 728 that are configured to cause fascia, e.g., the fascia 1210 1 - 1210 i , and/or the cabinet display top 1220 to display an immersive graphic (block 1322 ) and the method 1300 returns to block 1316 .
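  • A minimal polling sketch of the loop described above follows, assuming placeholder capture, detection and instruction functions; it illustrates the stall-then-recheck behavior rather than the disclosed implementation.

```python
import random
import time

# Hypothetical sketch of the stall-then-recheck loop of method 1300. The stubbed
# capture_image(), detect_state(), and send_instructions() functions are
# placeholders standing in for the camera, recognition, and fascia interfaces.
def capture_image() -> object:
    return object()  # stand-in for an image from the proximity camera

def detect_state(image: object) -> str:
    return random.choice(["immersive", "promotion", "product_information"])

def send_instructions(state: str) -> None:
    print(f"fascia -> display {state} graphic")

def proximity_loop(poll_seconds: float = 0.1, iterations: int = 5) -> None:
    current_state = None
    for _ in range(iterations):            # bounded here; a deployment would loop continuously
        new_state = detect_state(capture_image())
        if new_state != current_state:     # unchanged state -> leave the graphics as-is
            send_instructions(new_state)
            current_state = new_state
        time.sleep(poll_seconds)           # stall for the predetermined amount of time

proximity_loop()
```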
  • Referring to FIG. 14 , an exemplary flowchart illustrating operations corresponding to detecting one or more objects in one or more of a plurality of proximity regions and generating activity logs performed by an automated inventory intelligence system is shown in accordance with some embodiments.
  • Each block illustrated in FIG. 14 represents an operation performed in the method 1400 of detecting one or more objects in one or more of a plurality of proximity regions and generating activity logs performed by an automated inventory intelligence system.
  • the method 1400 begins when an image captured by a proximity sensor, e.g., the proximity camera 107 , is received by, e.g., the proximity logic 728 and/or the object recognition logic 720 (block 1402 ). Following receipt of the image from the proximity camera 107 , object recognition techniques, as described above, are performed on the image (block 1404 ).
  • data corresponding to any detected objects can be recorded in a proximity log, e.g., the proximity log 732 of FIGS. 7A-7B (block 1406 ).
  • traffic density information can be determined, e.g., by the proximity logic 728 , and recorded in a traffic density log, e.g., the traffic density log 734 of FIGS. 7A-7B (block 1408 ).
  • the traffic density information may include, but is not limited or restricted to, traffic density (of shoppers, employees, or a mix of either) over a given time period (broken down in time frames including by the hours of the day, days of the week, etc.), inventory on the corresponding unit during each time frame, the display on the fascia and/or the cabinet top display during each time frame, demographic data of each shopper, location of the retail location, location of the shelving unit within the retail location, an outdoor temperature at the retail location during a given time frame (e.g., exact, average, etc.), etc.
  • the shelving unit may be able to differentiate shoppers from employees due to an identifying uniform/object (which can be parsed by the object detection techniques described above) or via a signal transmission (from, for example, employee cards/identification badges).
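  • As a non-authoritative sketch of the kind of record such a traffic density log might hold, and of the shopper/employee differentiation mentioned above, consider the following; field names and the classification heuristic are assumptions made for illustration.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical sketch of a traffic density log entry and a shopper/employee
# classifier; field names and the classification heuristic are illustrative only.
@dataclass
class TrafficEntry:
    timestamp: datetime
    shopper_count: int
    employee_count: int
    fascia_content: str                 # what the fascia/cabinet display top showed
    outdoor_temp_c: Optional[float] = None

def classify_detection(badge_signal_present: bool, uniform_detected: bool) -> str:
    """Differentiate employees from shoppers via badge signals or uniform detection."""
    return "employee" if (badge_signal_present or uniform_detected) else "shopper"

entry = TrafficEntry(datetime(2019, 10, 10, 14), shopper_count=12, employee_count=2,
                     fascia_content="promotion")
print(classify_detection(badge_signal_present=False, uniform_detected=True))  # employee
```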
  • the proximity logic 728 and/or the alert generation logic 724 may generate and cause the rendering of one or more visual graphic displays based at least in part on the traffic density information (block 1410 ). Exemplary visual graphic displays are illustrated in FIGS. 15A-15B .
  • Referring to FIG. 15A , an exemplary embodiment of a first user interface display screen produced by an automated inventory intelligence system, where the first user interface display screen provides an interactive dashboard, is shown in accordance with some embodiments.
  • the first user interface display screen 1500 , generated by the proximity logic 728 and/or the alert generation logic 724 and rendered by the alert generation logic 724 , comprises a plurality of display areas including a top navigation panel 1502 , a side navigation panel 1504 , a first graphical display area 1506 and a second graphical display area 1508 .
  • the top navigation panel 1502 may include graphic display elements, each configured to receive user input and cause performance of specific operations including filtering options and/or downloading options.
  • the side navigation panel 1504 may include graphic display elements, each configured to receive user input and cause performance of specific operations such as rendering of user interface display screens directed to options or links such as: administrative options or links, media player options or links, triggering options or links (e.g., of one or more of the sensors included in the automated inventory intelligence system 700 as seen in FIGS. 7A-7B ), etc.
  • the first graphical display area 1506 and the second graphical display area 1508 may provide information pertaining to the traffic density information in a graphical user interface format including a graphical display of dwell time of shoppers by the hour of the day for a given period of time (e.g., the first graphical display area 1506 ) and/or a graphical display of proximity triggering events by the hour of the day for a given period of time (second graphical display area 1508 ).
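  • As an illustration of how dwell-time observations might be bucketed by hour of day for a dashboard panel such as the first graphical display area 1506 , the following sketch aggregates (entry time, dwell seconds) pairs; the data layout is an assumption made for illustration.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical sketch: bucket (entry time, dwell seconds) observations into mean
# dwell time per hour of day, as might feed a dashboard panel like area 1506.
def average_dwell_by_hour(observations: list[tuple[datetime, float]]) -> dict[int, float]:
    buckets: dict[int, list[float]] = defaultdict(list)
    for entered_at, dwell_seconds in observations:
        buckets[entered_at.hour].append(dwell_seconds)
    return {hour: sum(vals) / len(vals) for hour, vals in sorted(buckets.items())}

sample = [(datetime(2019, 10, 10, 14, 5), 30.0), (datetime(2019, 10, 10, 14, 40), 50.0)]
print(average_dwell_by_hour(sample))  # {14: 40.0}
```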
  • Referring to FIG. 15B , an exemplary embodiment of a second user interface display screen produced by an automated inventory intelligence system, where the second user interface display screen provides an interactive dashboard, is shown in accordance with some embodiments.
  • the second user interface display screen 1510 , generated in many embodiments by the proximity logic 728 and/or the alert generation logic 724 and rendered by the alert generation logic 724 , comprises a plurality of display areas including the top navigation panel 1502 , the side navigation panel 1504 , a third graphical display area 1512 and a fourth graphical display area 1514 .
  • the third graphical display area 1512 and the fourth graphical display area 1514 may provide information pertaining to the traffic density information in a graphical user interface format including a graphical display of an average shopper count by the hour of the day for a given period of time (e.g., the third graphical display area 1512 ) and/or a graphical display of average shopper count by the day of the week for a given period of time (fourth graphical display area 1514 ).
  • the graphical displays provided by the graphical display areas 1506 , 1508 , 1512 and 1514 are merely examples and representative of graphical displays that may be generated by the proximity logic 728 and/or the alert generation logic 724 based at least in part on the traffic density information.

Abstract

In one embodiment, a proximity camera system is disclosed. The proximity camera system may include a proximity camera having a lens and a housing, fascia, and one or more processors communicatively coupled to the proximity camera and the fascia. The proximity camera system may additionally include a non-transitory computer-readable medium communicatively coupled to the one or more processors and having logic thereon, the logic, when executed by the one or more processors, being configured to perform operations including: (i) receiving an image captured by the proximity camera, (ii) performing object recognition techniques on the image, (iii) determining whether an object was detected within a first predetermined proximity region, and (iv) transmitting one or more instructions configured to cause a graphical display to be displayed by the fascia. In some embodiments, the proximity camera and the fascia may be coupled to a shelving unit.

Description

    PRIORITY
  • This application is a continuation of U.S. patent application Ser. No. 16/598,557, filed Oct. 10, 2019, now issued U.S. Pat. No. 11,250,465, which claims the benefit of priority to U.S. Provisional Patent Application No. 62/743,734, filed Oct. 10, 2018, which are hereby incorporated by reference into this application in their entireties.
  • BACKGROUND
  • Retail environments are ever challenging. Consumers typically are confronted with pricing and information about a continuously increasing number of competitors and brands, including information about pricing, labeling, promotions, and the like. Traditionally, this information has been provided using print systems, such as slide-in paper systems, plastic label systems, and adhesive label systems. However, consumers are increasingly confounded by the sheer volume of printed information displayed in retail environments, and thus a growing number of consumers are turning to online shopping for day-to-day purchases. Furthermore, a retailer's overall performance and profits are significantly impacted by the challenge of getting the right products to the right places at the right time.
  • In addition, retailers are constantly concerned with the stocking of their shelves. A retailer may lose money due to a failure to restock inventory. For example, a customer may approach a shelf seeking to purchase a particular item; however, the shelf indicated as the location of the particular item may be empty. In some situations, a retailer may have that particular item stored in the back of the store but due to a lack of knowledge that the shelf was empty, the shelf was not restocked with the item causing the retailer to lose the money the customer would have spent on purchasing the particular item. Such a situation occurs at a high rate and may cost a retailer thousands or even millions of dollars in lost revenue each year.
  • Furthermore, customers often enter a retail location or pass by a retail exhibit (e.g., vending machine or small retail stand such as in a mall, an airport, a hospital, etc.) and fail to notice objects on some shelving units or fail to realize promotions or discounts apply to certain objects. When shopping at a retail location, customers are often distracted for a variety of reasons including looking at their mobile device, talking on their mobile device and/or watching children.
  • SUMMARY
  • In one embodiment, a proximity camera system comprises a proximity camera having a lens and a housing, one or more fascia, one or more processors communicatively coupled to the proximity camera and the one or more fascia, and a non-transitory computer-readable medium communicatively coupled to the one or more processors and having logic thereon, the logic, when executed by the one or more processors, being configured to perform operations including: (i) receiving an image captured by the proximity camera, (ii) performing object recognition techniques on the image, (iii) determining whether an object was detected within a first predetermined proximity region, and (iv) transmitting one or more instructions configured to cause a graphical display to be displayed by the one or more fascia.
  • In one embodiment, the proximity camera and the one or more fascia are coupled to a shelving unit. Additionally, the image illustrates a geo-fence region at least partially surrounding the shelving unit, the geo-fence region including the first predetermined proximity region. In some embodiments, the logic is configured to determine whether a second object is detected within a second predetermined proximity region, the second proximity region including a physical area closer to the shelving unit than the first predetermined proximity region.
  • In one embodiment, the image is transmitted to a cloud computing service for analysis of the first predetermined proximity region. Additionally, the graphical display displayed by the one or more fascia may include an immersive graphic. In some embodiments, the immersive graphic spans a plurality of the one or more fascia. In one embodiment, a cabinet display top may be communicatively coupled to the one or more processors and the non-transitory computer-readable medium, and the immersive graphic spans a plurality of the one or more fascia and the cabinet display top. In some embodiments, the graphical display displayed by the one or more fascia includes a product information graphic, wherein the product information graphic includes at least pricing information for one or more inventory items. In other embodiments, the graphical display displayed by the one or more fascia includes a promotional graphic, wherein the promotional graphic includes at least information corresponding to a promotion or discount for one or more inventory items.
  • In one embodiment a computerized method is disclosed. The method includes receiving an image captured by the proximity camera, performing object recognition techniques on the image, determining whether an object was detected within a first predetermined proximity region, and transmitting one or more instructions configured to cause a graphical display to be displayed by the one or more fascia. In some embodiments of the computerized method, the proximity camera and the one or more fascia are coupled to a shelving unit. The image may illustrate a geo-fence region at least partially surrounding the shelving unit, the geo-fence region including the first predetermined proximity region.
  • In some embodiments, the computerized method may further include determining whether a second object is detected within a second predetermined proximity region, the second proximity region including a physical area closer to the shelving unit than the first predetermined proximity region. In yet other embodiments of the computerized method, the graphical display displayed by the one or more fascia includes an immersive graphic that spans a plurality of the one or more fascia.
  • In one embodiment a non-transitory computer readable storage medium having stored thereon instructions is disclosed. In some embodiments, the instructions are executable by one or more processors to perform operations including receiving an image captured by the proximity camera, performing object recognition techniques on the image, determining whether an object was detected within a first predetermined proximity region, and transmitting one or more instructions configured to cause a graphical display to be displayed by the one or more fascia. In some embodiments, the proximity camera and the one or more fascia are coupled to a shelving unit. In some embodiments, the image illustrates a geo-fence region at least partially surrounding the shelving unit, the geo-fence region including the first predetermined proximity region.
  • In other embodiments, the instructions are executable by the one or more processors to perform further operations including determining whether a second object is detected within a second predetermined proximity region, the second proximity region including a physical area closer to the shelving unit than the first predetermined proximity region. In yet other embodiments, the graphical display displayed by the one or more fascia includes an immersive graphic that spans a plurality of the one or more fascia.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention may best be understood by referring to the following description and accompanying drawings that are used to illustrate embodiments of the invention. In the drawings:
  • FIG. 1 provides an illustration of an automated inventory intelligence system in accordance with some embodiments;
  • FIG. 2A provides a second illustration of a plurality of shelves with an automated inventory intelligence system in accordance with some embodiments;
  • FIG. 2B provides an illustration of a mount of the inventory camera of FIG. 2A in accordance with some embodiments;
  • FIG. 2C provides an illustration of the inventory camera positioned within the mount of the automated inventory intelligence system of FIGS. 2A-2B;
  • FIG. 3 provides a second illustration of a plurality of shelves with an automated inventory intelligence system in accordance with some embodiments;
  • FIG. 4 provides an illustration of a portion of an automated inventory intelligence system in accordance with some embodiments;
  • FIG. 5 provides an illustration of an image captured by a camera of an automated inventory intelligence system in accordance with some embodiments;
  • FIG. 6A provides a schematic illustrating a sensor coupled to a retail shelving unit in accordance with some embodiments;
  • FIG. 6B provides a schematic illustrating a sensor such as an inventory camera coupled to an automated inventory intelligence system in accordance with some embodiments;
  • FIG. 6C provides a schematic illustrating a sensor such as an inventory camera coupled to the automated inventory intelligence system in accordance with some embodiments;
  • FIG. 7A provides an exemplary embodiment of a first logical representation of the automated inventory intelligence system of FIG. 1;
  • FIG. 7B provides an exemplary embodiment of a second logical representation of the automated inventory intelligence system of FIG. 1;
  • FIG. 8 provides a flowchart illustrating an exemplary method for analyzing, by the automated inventory intelligence system logic of FIGS. 7A-7B, an image of inventory to determine whether the inventory is to be restocked in accordance with some embodiments; and
  • FIG. 9 provides a flowchart illustrating an exemplary method for analyzing, by the automated inventory intelligence system logic of FIGS. 7A-7B, an image of inventory to determine whether the inventory is to be restocked based on a triggering event in accordance with some embodiments.
  • FIG. 10 provides an exemplary embodiment of the proximity sensor positioned on a cabinet display top of the automated inventory intelligence system of FIG. 1 in accordance with some embodiments;
  • FIG. 11A provides a first illustration of an image captured by the proximity sensor of the automated inventory intelligence system of FIG. 1 in accordance with some embodiments;
  • FIG. 11B provides a second illustration of an image captured by the proximity sensor of the automated inventory intelligence system of FIG. 1 in accordance with some embodiments;
  • FIG. 12A provides an exemplary illustration of a plurality of proximity regions based on one configuration of a proximity sensor of an automated inventory intelligence system in accordance with some embodiments;
  • FIG. 12B provides an exemplary illustration of fascia of the automated inventory intelligence system of FIG. 12A displaying a first graphic while in a promotion state in accordance with some embodiments;
  • FIG. 12C provides an exemplary illustration of fascia of the automated inventory intelligence system of FIG. 12A displaying a first graphic while in a product information state in accordance with some embodiments;
  • FIG. 12D provides an exemplary illustration of fascia of the automated inventory intelligence system of FIG. 12A displaying a first graphic while in an immersive state in accordance with some embodiments;
  • FIG. 13 provides an exemplary flowchart illustrating operations corresponding to detecting whether an object is located within one of a plurality of proximity regions performed by an automated inventory intelligence system in accordance with some embodiments;
  • FIG. 14 provides an exemplary flowchart illustrating operations corresponding to detecting one or more objects in one or more of a plurality of proximity regions and generating activity logs performed by an automated inventory intelligence system in accordance with some embodiments;
  • FIG. 15A provides an exemplary embodiment of a first user interface display screen produced by an automated inventory intelligence system, where the first user interface display screen provides an interactive dashboard in accordance with some embodiments; and
  • FIG. 15B provides an exemplary embodiment of a second user interface display screen produced by an automated inventory intelligence system, where the second user interface display screen provides an interactive dashboard in accordance with some embodiments.
  • DETAILED DESCRIPTION
  • In response to the problems outlined above, a continuing need exists for solutions that help retailers increase operational efficiencies, create intimate customer experiences, streamline processes, and provide real-time understanding of customer behavior in the store. Provided herein are automated inventory intelligence systems and methods that address the foregoing. Thus, it would be advantageous for retail locations and manufacturers that have inventory for sale at retail locations to be able to: (i) cause customers to notice a particular shelving unit, and (ii) provide entertaining and attractive graphics that grab customers' attention and also provide promotion as well as product information.
  • Before some particular embodiments are provided in greater detail, it should be understood that the particular embodiments provided herein do not limit the scope of the concepts provided herein. It should also be understood that a particular embodiment provided herein can have features that may be readily separated from the particular embodiment and optionally combined with or substituted for features of any of a number of other embodiments provided herein.
  • Regarding terms used herein, it should also be understood the terms are for the purpose of describing some particular embodiments, and the terms do not limit the scope of the concepts provided herein. Ordinal numbers (e.g., first, second, third, etc.) are generally used to distinguish or identify different features or steps in a group of features or steps, and do not supply a serial or numerical limitation. For example, “first,” “second,” and “third” features or steps need not necessarily appear in that order, and the particular embodiments including such features or steps need not necessarily be limited to the three features or steps. Labels such as “left,” “right,” “front,” “back,” “top,” “bottom,” “forward,” “reverse,” “clockwise,” “counter clockwise,” “up,” “down,” or other similar terms such as “upper,” “lower,” “aft,” “fore,” “vertical,” “horizontal,” “proximal,” “distal,” and the like are used for convenience and are not intended to imply, for example, any particular fixed location, orientation, or direction. Instead, such labels are used to reflect, for example, relative location, orientation, or directions. Singular forms of “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by those of ordinary skill in the art.
• In general, the present disclosure describes an apparatus and a method for an automated inventory intelligence system that provides intelligence in tracking inventory on, for example, retail shelves, as well as intelligence in determining the proximity of retail customers as they approach, stall and pass a particular retail shelf or display and the demographics of the retail customers. In one embodiment, the automated inventory intelligence system is comprised of a cabinet top display, fascia, a proximity sensor, one or more inventory sensors, and one or more demographic tracking sensors. The cabinet top display can be configured to display animated and/or graphical content and is mounted on top of in-store shelves. In many embodiments, the fascia may include one or more panels of light-emitting diodes (LEDs) configured to display animated and/or graphical content and to mount to an in-store retail shelf. It would be understood by those skilled in the art that other light-emitting technologies may be utilized that can provide sufficient brightness, resolution, contrast, and/or color response. The automated inventory intelligence system can also include a data processing system comprising a media player that is configured to simultaneously execute (i.e., “play”) a multiplicity of media files that are displayed on the cabinet top and/or the fascia. The cabinet top and the fascia are typically configured to display content so as to entice potential customers to approach the shelves, and then the fascia may switch to displaying pricing and other information pertaining to the merchandise on the shelves once a potential customer approaches the shelves. The proximity sensor is configured to detect the presence of potential customers. Further, one or more inventory sensors may be configured to track the inventory stocked on one or more in-store retail shelves. The automated inventory intelligence system may create one or more alerts once the stocked inventory remaining on the shelves is reduced to a predetermined minimum threshold quantity.
  • I. System Architecture
  • Referring now to FIG. 1, an illustration of an automated inventory intelligence system 100 in accordance with some embodiments is shown. The automated inventory intelligence system 100 comprises a proximity camera 107, fascia 108 1-108 4, a plurality of inventory cameras 110 1-110 i (wherein i≥1, herein, i=8) and a facial recognition camera 109. It is noted that the disclosure is not limited to the automated inventory intelligence system 100 including a single cabinet display top 106 but may include a plurality of cabinet top displays 106. Additionally, the automated inventory intelligence system 100 is not limited to the number of fascia, shelving units, proximity cameras, facial recognition cameras and/or inventory cameras shown in FIG. 1. In typical embodiments, the automated inventory intelligence system 100 couples to a shelving unit 102, which often includes shelves 104, a back component 105 (e.g., pegboard, gridwall, slatwall, etc.) and a cabinet top display 106.
  • In many embodiments, the cabinet display top 106 is coupled to an upper portion of the shelving unit 102, extending vertically from the back component 105. Further, a proximity camera 107 may be positioned on top of, or otherwise affixed to, the cabinet top display 106. Although the proximity camera 107 is shown in FIG. 1 as being centrally positioned atop the cabinet top display 106, the proximity camera 107 may be positioned in different locations, such as near either end of the top of the cabinet top 106, on a side of the cabinet top 106 and/or at other locations coupled to the shelving unit 102 and/or the fascia 108.
  • The cabinet display top 106 and fascia 108 may be attached to the shelves 104 by way of any fastening means deemed suitable, wherein examples include, but are not limited or restricted to, magnets, adhesives, brackets, hardware fasteners, and the like. In a variety of embodiments, the fascia 108 and the cabinet display top 106 may each be comprised of one or more arrays of light emitting diodes (LEDs) that are configured to display visual content (e.g., still or animated content), with optional speakers, not shown, coupled thereto to provide audio content. Any of the fascia 108 and/or the cabinet display top 106 may be comprised of relatively smaller LED arrays that may be coupled together so as to tessellate the cabinet display top 106 and the fascia 108, such that the fascia and cabinet top desirably extend along the length of the shelves 104. The smaller LED arrays may be comprised of any number of LED pixels, which may be organized into any arrangement to conveniently extend the cabinet display top 106 and the fascia 108 along the length of a plurality of shelves 104. In some embodiments, for example, a first dimension of the smaller LED arrays may be comprised of about 132 or more pixels. In some embodiments, a second dimension of the smaller LED arrays may be comprised of about 62 or more pixels.
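• By way of a non-limiting illustration, the following sketch estimates how many of the smaller LED arrays would be needed to tessellate one fascia run. The panel pixel count follows the dimensions noted above, while the pixel pitch and shelf length are hypothetical assumptions rather than specified hardware values.

```python
# Minimal sketch: estimating how many small LED panels tessellate one fascia run.
# The panel pixel count follows the description above; the pixel pitch and shelf
# length are illustrative assumptions, not specified hardware values.
import math

PANEL_PIXELS_WIDE = 132   # approximate first dimension of a small LED array
PIXEL_PITCH_MM = 2.5      # assumed LED pixel pitch in millimeters

def panels_for_run(shelf_length_mm: float) -> int:
    """Return how many panels are needed to span one shelf-length fascia run."""
    panel_width_mm = PANEL_PIXELS_WIDE * PIXEL_PITCH_MM
    return math.ceil(shelf_length_mm / panel_width_mm)

# Example: a 1200 mm shelf needs ceil(1200 / 330) = 4 panels.
print(panels_for_run(1200))
```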
  • The cabinet display top 106 and the fascia 108 may be configured to display visual content to attract the attention of potential customers. As shown in the embodiment of FIG. 1, the cabinet display top 106 may display desired visual content that extends along the length of the shelves 104. The desired content may be comprised of a single animated or graphical image that fills the entirety of the cabinet display top 106, or the desired content may be a group of smaller, multiple animated or graphical images that cover the area of the cabinet display top 106. In some embodiments, the fascia 108 may cooperate with the cabinet display top 106 to display either a single image or multiple images that appear to be spread across the height and/or length of the shelves 104.
• In some embodiments, the cabinet display top 106 may display visual content selected to attract the attention of potential customers to one or more products comprising inventory 112, e.g., merchandise, located on the shelves 104. Thus, the visual content shown on the cabinet display top 106 may be specifically configured to draw the potential customers to approach the shelves 104, and is often related to the specific inventory 112 located on the corresponding shelves 104. A similar configuration with respect to visual content displayed on the fascia 108 may apply as well, as will be discussed below. The content shown on the cabinet display top 106, as well as the fascia 108, may be dynamically changed to engage and inform customers of ongoing sales, promotions, and advertising. As will be appreciated, these features offer brands and retailers a way to increase sales locally by offering customers a personalized campaign that may be changed easily and quickly.
• Moreover, as referenced above, portions of the fascia 108 may display visual content such as images of brand names and/or symbols representing products stocked on the shelves 104 nearest to each portion of the fascia. For example, in an embodiment, a single fascia 108 may be comprised of a first portion 114 and a second portion 116. The first portion 114 may display an image of a brand name of inventory 112 that is stocked on the shelf above the first portion 114 (e.g., in one embodiment, stocked directly above the first portion 114), while the second portion 116 may display pricing information for the inventory 112. Additional portions may include an image of a second brand name and/or varied pricing information when such portions correspond to inventory different than inventory 112. It is contemplated, therefore, that the fascia 108 extending along each of the shelves 104 may be sectionalized to display images corresponding to each of the products stocked on the shelves 104. It is further contemplated that the displayed images will advantageously help customers quickly locate desired products.
• In an embodiment, the animated and/or graphical images displayed on the cabinet display top 106 and the fascia 108 are comprised of media files that are executed by way of a suitable media player. The media player is often configured to simultaneously play any desired number of media files that may be displayed on the smaller LED arrays. In some embodiments, each of the smaller LED arrays may display one media file being executed by the media player, such that a group of adjacent smaller LED arrays combine to display the desired images to the customer. Still, in some embodiments, base video may be stretched to fit any of various sizes of the smaller LED arrays, and/or the cabinet display top 106 and fascia 108. It should be appreciated, therefore, that the media player disclosed herein enables implementing a single media player per aisle in-store instead of relying on multiple media players dedicated to each aisle.
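• The following sketch illustrates, under assumed zone identifiers and a simplified interface, how a single media player instance might track which media file is stretched onto which LED zone; it is a hedged model of the behavior described above, not the actual player implementation.

```python
# Minimal sketch of the "one media player per aisle" idea: a single player
# object tracks which media file is shown on which panel zone. All names,
# fields, and resolutions are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class MultiZonePlayer:
    # zone id -> (media file path, target resolution the clip is scaled to)
    assignments: Dict[str, Tuple[str, Tuple[int, int]]] = field(default_factory=dict)

    def assign(self, zone_id: str, media_path: str, zone_resolution: Tuple[int, int]) -> None:
        """Schedule one media file on one LED zone; the clip is stretched to fit."""
        self.assignments[zone_id] = (media_path, zone_resolution)

    def play_all(self) -> None:
        # A real deployment would render every zone simultaneously; here we
        # only report what would be played where.
        for zone_id, (media_path, resolution) in self.assignments.items():
            print(f"zone {zone_id}: {media_path} stretched to {resolution[0]}x{resolution[1]}")

player = MultiZonePlayer()
player.assign("cabinet_top", "promo_loop.mp4", (1584, 62))      # 12 panels wide
player.assign("fascia_shelf_2", "pricing_strip.mp4", (792, 62))  # 6 panels wide
player.play_all()
```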
• Furthermore, FIG. 1 illustrates a plurality of inventory cameras 110 (i.e., the inventory cameras 110 1-110 8). In some embodiments, the inventory cameras 110 are coupled to the shelving unit 102, e.g., via the pegboard 105, and positioned above merchandise 112, also referred to herein as “inventory.” Each of the inventory cameras 110 can be configured to monitor a portion of the inventory stocked on each shelf 104, and in some instances, may be positioned below a shelf 104, e.g., as is seen with the inventory cameras 110 3-110 8. However, in some instances, an inventory camera 110 may not be positioned below a shelf 104, e.g., as is seen with the inventory cameras 110 1-110 2. Taking the inventory camera 110 4 as an example, the inventory camera 110 4 is positioned above the inventory portion 116 and is therefore capable of monitoring (and configured to monitor) the inventory portion 116. It should be noted, however, that the inventory camera 110 4 may have a viewing angle of 180° (degrees) and is capable of monitoring a larger portion of the inventory 112 on the shelf 104 2 than merely the inventory portion 116. For example, FIG. 5 illustrates one exemplary image captured by an inventory camera having a viewing angle of 180°.
• As is illustrated in FIGS. 2A-4 and 6A-6C and discussed with respect thereto, the positioning of the inventory cameras 110 may differ from the illustration of FIG. 1. In addition to being positioned differently with respect to spacing above inventory 112 on a particular shelf 104, the inventory cameras 110 may be affixed to the shelving unit 102 in a variety of manners, including attachment to various types of shelves 104 and monitoring of any available inventory 112 stored thereon.
• In addition to the proximity camera 107 and the inventory cameras 110 1-110 8, various embodiments of the automated inventory intelligence system 100 can also include a facial recognition camera 109. In one embodiment, the facial recognition camera 109 may be coupled to the exterior of the shelving unit 102. In some embodiments, the facial recognition camera 109 may be positioned between five to six feet from the ground in order to obtain a clear image of the faces of a majority of customers. The facial recognition camera 109 may be positioned at heights other than five to six feet from the ground. The facial recognition camera 109 need not be coupled to the exterior of the shelving unit 102 as illustrated in FIG. 1; instead, the illustration of FIG. 1 is merely one embodiment. The facial recognition camera 109 may be coupled to the interior of a side of the shelving unit 102 as well as to any portion of any of the shelves 104 1-104 4, the cabinet display top 106, the fascia 108 and/or the back component 105 of the shelving unit 102. Further, a plurality of facial recognition cameras 109 may be coupled to the shelving unit 102. In certain embodiments, the facial recognition camera 109 may be eliminated and its associated functions accomplished by any available proximity cameras 107. In these embodiments, software can be utilized to account for any discrepancy between the images and angles captured by the proximity cameras 107 as compared to the facial recognition cameras 109. In further embodiments, especially where privacy concerns are heightened, facial recognition cameras may be eliminated, leaving the automated inventory intelligence system 100 to gather customer data by other means including, but not limited to, mobile phone signals/application data and/or radio-frequency identification (RFID) signals.
  • In some embodiments, the automated inventory intelligence system 100 may include one or more processors, a non-transitory computer-readable memory, one or more communication interfaces, and logic stored on the non-transitory computer-readable memory. The images or other data captured by the proximity sensor 107, the facial recognition camera 109 and/or the inventory cameras 110 1-110 8 may be analyzed by the logic of the automated inventory intelligence system 100. The non-transitory computer-readable medium may be local storage, e.g., located at the store in which the proximity sensor 107, the facial recognition camera 109 and/or the inventory cameras 110 1-110 8 reside, or may be cloud-computing storage. Similarly, the one or more processors may be local to the proximity sensor 107, the facial recognition camera 109 and/or the inventory cameras 110 1-110 8 or may be provided by cloud computing services.
  • Examples of the environment in which the automated inventory intelligence system 100 may be located include, but are not limited or restricted to, a retailer, a warehouse, an airport, a high school, college or university, any cafeteria, a hospital lobby, a hotel lobby, a train station, or any other area in which a shelving unit for storing inventory may be located.
  • II. Inventory Sensors
• Referring to FIG. 2A, a second illustration of a plurality of shelves with an automated inventory intelligence system in accordance with some embodiments is shown. Specifically, FIG. 2A illustrates the automated inventory intelligence system 206 coupled to a shelving unit 200. More particularly, the shelving unit 200 includes a back component 202 (e.g., pegboard) and shelves 204 (wherein shelves 204 1-204 3 are illustrated; however, the shelving unit 200 may include additional shelves). In the illustrated embodiment, the automated inventory intelligence system 206 includes fascia 208 and the inventory sensor 210 (herein the inventory sensor 210 is depicted as an inventory camera). Although only a single inventory camera 210 is shown in FIG. 2A, the automated inventory intelligence system 206 may include additional inventory cameras not shown. FIG. 2A provides a clear perspective as to where the inventory camera 210 may be positioned in one embodiment. Specifically, the inventory camera 210 is shown to be coupled to a corner formed by an underside of the shelf 204 1 and the back component 202. The positioning of the inventory camera 210 can enable the inventory camera 210 to monitor the inventory 212. Additional detail of the coupling of the inventory camera 210 to the shelving unit 200 is seen in FIG. 2B. In addition, the fascia, e.g., the fascia 208 2, may display pricing information (as also shown in FIG. 1) as well as display an alert, e.g., a visual indicator via LEDs of a portion of the fascia, indicating that inventory stocked on the corresponding shelf, e.g., the shelf 204 2, is to be restocked.
• Referring now to FIG. 2B, an illustration of a mount of the inventory camera 210 of FIG. 2A is shown in accordance with some embodiments. The mount 222, which may be “L-shaped” in nature (i.e., two sides extending at a 90° (degree) angle from each other), is shown without the inventory camera 210 placed therein. In some embodiments, the inventory camera 210 may snap into the mount 222, which may enable inventory cameras to be easily replaced, moved, removed for charging or repair, etc. The mount 222 is shown as being coupled to a corner formed by an underside of the shelf 204 1 and the back component 202. In particular, the shelving unit 200 depicted in FIG. 2B comprises a first metal runner 214 attached to the back component 202 and a second metal runner 220 shown as being attached to the underside of the shelf 204 1. The first metal runner 214 includes a first groove 216 and a second groove 218 to which flanges of the mount 222, such as the flange 228, may slide or otherwise couple. Although not shown, a groove is also formed by the second metal runner 220, which may also assist in the coupling of the mount 222.
  • In the embodiment illustrated, the mount 222 includes a top component 224, a side component 226, an optional flange 228, bottom grips 230, top grips 232, a top cavity 234 and side cavity 236. In addition, although not shown, a flange extending from the top component 224 to couple with the metal runner 220 may be included. The inventory camera 210 may couple to the mount 222 and be securely held in place by the bottom grips 230 and the top grips 232. Further, the body of the inventory camera 210 may include projections that couple, e.g., mate, with the cavity 234 and/or the cavity 236 to prevent shifting of the inventory camera 210 upon coupling with the mount 222.
• Referring to FIG. 2C, an illustration of the inventory camera 210 positioned within the mount 222 of the automated inventory intelligence system 206 of FIGS. 2A-2B is shown. The inventory camera 210 is positioned within the mount 222 and includes a lens 238 and a housing 240. The inventory camera 210 is shown as having four straight sides but may take alternative forms while still being within the scope of the invention. For example, in other embodiments, the inventory camera 210 may only have two straight sides and may include two curved sides. Additionally, the inventory camera 210 may take a circular shape or include one or more circular arcs. Further, the inventory camera 210 may take the form of any polygon or other known geometric shape. In addition, the housing 240 may have an angled face such that the face of the housing 240 slopes away from the lens 238, which may be advantageous in capturing an image having a viewing angle of 180°. The inventory camera 210 may snap into the mount 222 and be held in place by the friction of the bottom grips 230 and top grips 232, and the force applied by the top component 224 and the side component 226. It would be understood by those skilled in the art that the mount 222 can comprise a variety of shapes depending on the camera and shelving unit 200 being utilized, as shown in the camera mount depicted in FIG. 3 below.
• Referring now to FIG. 3, a second illustration of a plurality of shelves with an automated inventory intelligence system is shown in accordance with some embodiments. In particular, FIG. 3 illustrates an inventory camera 310 1 of the automated inventory intelligence system 300 coupled to the underside of a shelf 304 1, which is part of the shelving unit 302. In the embodiment depicted in FIG. 3, the automated inventory intelligence system 300 includes the fascia 306 1-306 2, the inventory camera 310 1 and a mount 314. In one embodiment, the mount 314 is coupled to the underside of the shelf 304 1, which is possible due to the configuration of the shelf 304 1; particularly, the shelf 304 1 is comprised of a series of grates. Due to the grated nature of the shelf 304 1, the mount 314 may be configured to clip directly to one or more of the grates.
  • It should also be noted that the shelving unit 302 is refrigerated, e.g., configured for housing milk, and includes a door, not shown. As a result of being refrigerated, the shelving unit 302 experiences temperature swings as the door is opened and closed, which often results in the temporary accumulation of condensation on the lens of the inventory camera 310 1. Thus, the logic of the automated inventory intelligence system may perform various forms of processing for handling the temporary accumulation of condensation on the lens of the inventory camera 310 1, which may include, for example, (i) sensing when the door of the shelving unit 302 is opened, e.g., via sensing activation of a light, and waiting a predetermined amount of time before taking an image capture with the inventory camera 310 1 (e.g., to wait until the condensation has dissipated), and/or (ii) capturing an image with the inventory camera 310 1, performing image processing such as object recognition techniques, and discarding the image when the object recognition techniques do not provide a confidence level of the recognized objects above a predetermined threshold (e.g., condensation blurred or otherwise obscured the image, indicating the presence of condensation).
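• The two condensation-handling strategies described above may be sketched as follows; the camera and recognizer interfaces, the delay, and the confidence floor are all hypothetical placeholders used only to show the control flow, not the claimed implementation.

```python
# Hedged sketch of the two condensation-handling strategies described above.
# The camera and recognizer objects are hypothetical placeholders; only the
# control flow mirrors the description.
import time
from typing import Callable, List, Optional, Tuple

CONDENSATION_DELAY_S = 30.0   # assumed dwell time for condensation to dissipate
MIN_CONFIDENCE = 0.6          # assumed confidence floor for keeping an image

def capture_after_door_event(door_opened_at: float, camera) -> bytes:
    """Strategy (i): wait a fixed delay after the door opens before capturing."""
    elapsed = time.time() - door_opened_at
    if elapsed < CONDENSATION_DELAY_S:
        time.sleep(CONDENSATION_DELAY_S - elapsed)
    return camera.capture()

def capture_and_validate(camera,
                         recognize: Callable[[bytes], List[Tuple[str, float]]]
                         ) -> Optional[bytes]:
    """Strategy (ii): capture, run object recognition, discard low-confidence frames."""
    image = camera.capture()
    detections = recognize(image)   # list of (label, confidence) pairs
    if not detections or max(conf for _, conf in detections) < MIN_CONFIDENCE:
        return None                 # image likely obscured by condensation; retry later
    return image
```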
  • Although not shown, in one embodiment, the inventory camera 310 1 may be coupled to the front of the shelf 304 1 and face the inventory 312. Such an embodiment may be advantageous with refrigerated shelving units such as the shelving unit 302 when a light source, not shown, is housed within the shelving unit and turns on when a door of the shelving unit is opened. More specifically, when the light source is positioned at the rear of the shelving unit, the image captured by the inventory camera 310 1 may appear clearer and less blurred in such an embodiment.
  • Referring to FIG. 4, an illustration of a portion of an automated inventory intelligence system is shown in accordance with some embodiments. In particular, a sensor 408 is shown positioned near merchandise 406 stocked on a shelving unit 402 of an automated inventory intelligence system 400. The sensor 408 is shown integrated in a housing 404, wherein the housing 404 may, in one embodiment, take the form of a rod that extends along at least a portion of the back component of the shelving unit and may be configured to couple to the shelving unit. As in other embodiments disclosed herein, the sensor 408 may include a digital camera; however, in other embodiments, the sensor 408 may be any sensing device whereby merchandise stocked on a shelving unit may be monitored. In the embodiment shown, the sensor 408 is configured to be coupled directly to the shelving unit 402 by way of any fastening means deemed suitable, such as, by way of non-limiting example, magnets, adhesives, brackets, hardware fasteners, and the like. In other embodiments, such as those illustrated in FIGS. 5-6 below, the sensor 408 may be coupled to the shelving unit 402 through a mounting bracket 506. Further, the location of a sensor such as the sensor 408 is not to be limited to the location shown in FIG. 4. It should be understood that the sensor 408 may be disposed in any location with respect to a retail display or warehouse storage unit whereby the stocked merchandise may be monitored. Embodiments of some alternative positioning of sensors are illustrated in FIGS. 6A-6C. Furthermore, preferred locations suited to receive the sensor 408 will generally depend upon one or more factors, such as, for example, the type of merchandise, an ability to capture a desired quantity of merchandise within the field of view of the sensor 408, as well as the methods whereby customers typically remove merchandise from the retail display units.
• Any of the retail displays or warehouse storage units outfitted with the automated inventory intelligence system 400 can monitor the quantity of stocked merchandise by way of one or more sensors such as the sensor 408 and then create a notification or an alert once the remaining merchandise is reduced to a predetermined minimum threshold quantity. For example, low-inventory alerts may be created when the remaining merchandise is reduced to 50% and 20% thresholds; however, the disclosure is not intended to be so limited and thresholds may be predetermined and/or dynamically configurable (e.g., in response to weather conditions, and/or past sales history data). The low-inventory alerts may be sent to in-store staff to signal that a retail display needs to be restocked with merchandise. In some embodiments, the low-inventory alerts can include real-time images and/or stock levels of the retail displays so that staff can see the quantity of merchandise remaining on the retail displays by way of a computer or a mobile device. In some embodiments, the low-inventory alerts may be sent in the form of text messages in real time to mobile devices carried by in-store staff. As will be appreciated, the low-inventory alerts can signal in-store staff to restock the retail displays with additional merchandise to maintain a frictionless shopping experience for consumers. In addition, the automated inventory intelligence system 400 can facilitate deeper analyses of sales performance by coupling actual sales with display shelf activity.
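• A minimal sketch of the low-inventory alerting described above follows; the 50% and 20% levels mirror the example given, while the notification hook and the item counts are illustrative assumptions.

```python
# Minimal sketch of threshold-based low-inventory alerts. The 50% and 20%
# levels come from the example above; the notify hook is a placeholder.
from typing import List

ALERT_THRESHOLDS = (0.50, 0.20)   # fraction of stock remaining that triggers an alert

def check_stock_level(items_remaining: int, items_when_full: int,
                      notify) -> List[float]:
    """Return the thresholds crossed and fire a notification for each one."""
    fraction = items_remaining / items_when_full if items_when_full else 0.0
    crossed = [t for t in ALERT_THRESHOLDS if fraction <= t]
    for threshold in crossed:
        notify(f"Stock at {fraction:.0%} (at or below {threshold:.0%}); restock needed.")
    return crossed

# Example: 3 of 20 bottles remain -> both the 50% and 20% alerts fire.
check_stock_level(3, 20, notify=print)
```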
  • III. Inventory Monitoring
  • Referring to FIG. 5, an illustration of an image captured by a camera of an automated inventory intelligence system is shown in accordance with some embodiments. The image 500 shown in FIG. 5 illustrates the ability of an inventory camera configured for use with the automated inventory intelligence system 206 of FIGS. 2A-2C to capture the image 500 having an approximately 180° viewing angle. In certain embodiments, an inventory camera, such as the inventory camera 310 1 of FIG. 3, may be positioned within a shelving unit, such as the shelving unit 302 of FIG. 3, such that the inventory camera is located at the inner rear of the shelving unit and above a portion of inventory. In such an embodiment, the inventory camera 310 1 may capture an image such as the image 500, which includes a capture of an inventory portion 508 and an inventory portion 510 stocked on shelving 506. In addition, the image 500 may include a capture of a portion of the store environment 502 and additional inventory 512.
• Specifically, the positioning of the inventory camera as shown in FIG. 5 enables the inventory camera to capture images such as the image 500, which may be analyzed by logic of the automated inventory intelligence system 206 to automatically and intelligently determine the amount of inventory stocked on the shelf. For example, as seen in the image 500, the first inventory portion 508 and the second inventory portion 510 may be identified by the automated inventory intelligence system 206 using object recognition techniques. For example, upon recognition of the first inventory portion 508 (e.g., recognition of Pepsi bottles), logic of the automated inventory intelligence system 206 may analyze the quantity remaining on the shelf 506. In additional embodiments, the automated inventory intelligence system 206 may determine whether a threshold number of bottles have been removed from the shelf 506. Upon determining at least the threshold number of bottles have been removed, the automated inventory intelligence system 206 may generate a report and/or an alert notifying employees and/or a manufacturer that the inventory portion 508 requires restocking. In additional embodiments, the automated inventory intelligence system 206 may determine that less than a threshold number of bottles remain on the shelf 506 and therefore the first inventory portion 508 requires restocking. Utilization of other methodologies of determining whether at least a predetermined number of items remain on a shelf for a given inventory set is within the scope of the invention. Herein, the term “inventory set” generally refers to a grouping of a particular item, e.g., a grouping of a particular type of merchandise, which may include brand, product size (12 oz. bottle v. 2 L bottle), etc.
• In some embodiments, the image 500 may also be analyzed to determine the remaining items of other inventory portions such as the second inventory portion 510 and/or the alternative portion 512. As seen in FIGS. 6A-6C, the inventory camera may be placed at varying positions within, or coupled to, a shelving unit. The utilization of such alternative configurations may be dependent upon the type of shelving unit, the type of inventory being captured in images taken by the inventory camera and/or the positioning of inventory within the store environment (e.g., across an aisle).
  • FIGS. 6A-6C provide schematics illustrating sensors coupled to retail displays in accordance with some embodiments. The one or more sensors are configured to be disposed in a retail environment such as by coupling the sensors to retail displays or warehouse storage units. Such retail displays include, but are not limited to, shelves, panels (e.g., pegboard, gridwall, slatwall, etc.), tables, cabinets, cases, bins, boxes, stands, and racks, and such warehouse storage includes, but is not limited to, shelves, cabinets, bins, boxes, and racks. The sensors may be coupled to the retail displays or the warehouse storage units such that one sensor is provided for every set of inventory items (e.g., one-to-one relationship), one sensor for a number of sets of inventory items (e.g., one-to-many relationship), or a combination thereof. The sensors may also be coupled to the retail displays or the warehouse storage units with more than one sensor for every set of inventory items (e.g., many-to-one relationship), more than one sensor for a number of sets of inventory items (e.g., many-to-many relationship), or a combination thereof. In an example of a many-to-one relationship, at least two sensors monitor the same set of inventory items thereby providing contemporaneous sensor data for the set of inventory items. Providing two (or more) sensors for a single set of inventory is useful for sensor data redundancy or simply having a backup. Each of FIGS. 6A-6C shows a one-to-one relationship of a sensor to a set of inventory items, but each sensor can alternatively be in one of the foregoing alternative relationships with one or more sets of inventory items.
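• The sensor-to-inventory-set relationships described above may be represented with a simple mapping, as in the following sketch; all identifiers are hypothetical and chosen only to show one-to-one, one-to-many, and redundant (many-to-one) coverage.

```python
# Illustrative sketch of recording sensor-to-inventory-set relationships.
# Identifiers are hypothetical; the point is that one sensor can cover one or
# many inventory sets, and one set can be covered by several sensors.
from collections import defaultdict
from typing import Dict, List

# sensor id -> inventory sets it watches (one-to-one or one-to-many)
sensor_coverage: Dict[str, List[str]] = {
    "cam_606": ["soda_2L"],                   # one-to-one
    "cam_612": ["soda_12oz", "water_1L"],     # one-to-many
    "cam_622": ["soda_2L"],                   # second view of soda_2L -> redundancy
}

def sets_to_sensors(coverage: Dict[str, List[str]]) -> Dict[str, List[str]]:
    """Invert the mapping to see which sensors provide data for each inventory set."""
    inverted: Dict[str, List[str]] = defaultdict(list)
    for sensor, inventory_sets in coverage.items():
        for inventory_set in inventory_sets:
            inverted[inventory_set].append(sensor)
    return dict(inverted)

print(sets_to_sensors(sensor_coverage))  # soda_2L is watched by two sensors (backup)
```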
  • The sensors include, but are not limited to, light- or sound-based sensors such as digital cameras and microphones, respectively. In some embodiments, the sensors are digital cameras, also referred to as “inventory cameras,” with a wide viewing angle up to a 180° viewing angle.
  • Referring now to FIG. 6A, a schematic illustrating a sensor such as a sensor 606 coupled to a retail shelving unit 604 is shown in accordance with some embodiments. As shown, the sensor 606, e.g., an inventory camera, may be coupled to or mounted on the retail shelving unit 604 under an upper shelf of the shelving unit 604, wherein the shelving unit 604 is a component of the housing 602 of the automated inventory intelligence system 600. In the illustrated embodiment, the inventory camera 606 is configured in an orientation to view a set of inventory items 608 on an inventory item-containing shelf beneath the upper shelf. While the inventory camera 606 is shown mounted inside the retail shelving unit 604 such as on a back (e.g., pegboard) of the housing 602 and looking out from the automated inventory intelligence system 600, the inventory camera 606 may alternatively be coupled to the upper shelf and looking in to the automated inventory intelligence system 600. Due to a wide viewing angle of up to 180°, whether looking out from or in to the automated inventory intelligence system 600, the inventory camera 606 may collect visual information on sets of inventory items adjacent to the set of inventory items 608.
  • Referring to FIG. 6B, a schematic illustrating a sensor such as an inventory camera 612 coupled to an automated inventory intelligence system 600 is shown in accordance with some embodiments. As shown, the inventory camera 612 may be coupled to or mounted on the automated inventory intelligence system 600 on an inventory-item containing shelf of the automated inventory intelligence system 600 in an orientation to view a set of inventory items 614 on the inventory item-containing shelf. While the inventory camera 612 is shown mounted inside the automated inventory intelligence system 600 on the inventory item-containing shelf and looking in to the automated inventory intelligence system 600, which may be advantageous when a light 610 is present in a back of automated inventory intelligence system 600, the inventory camera 612 may alternatively be coupled to the inventory item-containing shelf and looking out from the automated inventory intelligence system 600. Due to a wide viewing angle of up to 180°, whether looking in to or out from the automated inventory intelligence system 600, the inventory camera 612 may collect visual information on sets of inventory items adjacent to the set of inventory items 614.
  • Referring to FIG. 6C, a schematic illustrating a sensor such as an inventory camera 622 coupled to the automated inventory intelligence system 600 is shown in accordance with some embodiments. In addition, FIG. 6C further provides a second housing 618 with a second sensor such as an inventory camera 624 coupled to a second upper shelf 620 and in communication with a second automated inventory intelligence system 616 in accordance with some embodiments. In certain embodiments the automated inventory intelligence system 600 and second automated inventory intelligence system 616 may be separate and independent systems or may be communicatively coupled and/or processing data cooperatively.
• As shown, the inventory camera 622 may be physically coupled to or mounted on the automated inventory intelligence system 600 in an orientation to view a set of inventory items 628 on an inventory-item containing shelf of an opposing shelving unit across an aisle such as the automated inventory intelligence system 616. Likewise, the inventory camera 624 may be coupled to or mounted on the automated inventory intelligence system 616 in an orientation to view a set of inventory items 626 on an inventory-item containing shelf of an opposing shelving unit across an aisle such as the automated inventory intelligence system 600. Due to wide viewing angles of up to 180°, the inventory camera 622 can collect visual information on sets of inventory items on the automated inventory intelligence system 616 adjacent to the set of inventory items 628 (not shown), and the inventory camera 624 can collect visual information on sets of inventory items on the automated inventory intelligence system 600 adjacent to the set of inventory items 626 (not shown).
  • In some embodiments, inventory cameras such as inventory cameras 606, 612, 622, and 624 are coupled to or mounted on endcaps or other vantage points of the automated inventory intelligence systems to collect visual information while looking in to the retail shelving units.
• Referring to FIG. 7A, an exemplary embodiment of a first logical representation of the automated inventory intelligence system of FIG. 1 is shown in accordance with some embodiments. In many embodiments, the automated inventory intelligence system 700 may include one or more processors 702 that are coupled to a communication interface 704. The communication interface 704, in combination with a communication interface logic 708, enables communications with external network devices and/or other network appliances to transmit and receive data. According to one embodiment of the disclosure, the communication interface 704 may be implemented as a physical interface including one or more ports for wired connectors. Additionally, or in the alternative, the communication interface 704 may be implemented with one or more radio units for supporting wireless communications with other electronic devices. The communication interface logic 708 may include logic for performing operations of receiving and transmitting data via the communication interface 704 to enable communication between the automated inventory intelligence system 700 and network devices via a network (e.g., the internet) and/or cloud computing services, not shown.
• The processor(s) 702 is further coupled to a persistent storage 706. According to one embodiment of the disclosure, the persistent storage 706 may store logic as software modules including an automated inventory intelligence system logic 710 and the communication interface logic 708. The operations of these software modules, upon execution by the processor(s) 702, are described herein. Of course, it is contemplated that some or all of this logic may be implemented as hardware, and if so, such logic could be implemented separately from one another.
• Additionally, the automated inventory intelligence system 700 may include hardware components such as fascia 711 1-711 m (wherein m≥1), inventory cameras 712 1-712 i (wherein i≥1), proximity sensors 714 1-714 j (wherein j≥1), and facial recognition cameras 716 1-716 k (wherein k≥1). For the purpose of clarity, couplings, i.e., communication paths, are not illustrated between the processor(s) 702 and the fascia 711 1-711 m, the inventory cameras 712 1-712 i, the proximity sensors 714 1-714 j, and the facial recognition cameras 716 1-716 k; however, such couplings may be direct or indirect and configured to allow for the provision of instructions from the automated inventory intelligence system logic 710 to such components.
• Each of the inventory cameras 712 1-712 i, the proximity sensors 714 1-714 j, and the facial recognition cameras 716 1-716 k may be configured to capture images, e.g., at predetermined time intervals or upon a triggering event, and transmit the images to the persistent storage 706. The automated inventory intelligence system logic 710 may, upon execution by the processor(s) 702, perform operations to analyze the images. Specifically, the automated inventory intelligence system logic 710 includes an image receiving logic 718, an object recognition logic 720, an inventory threshold logic 722, an alert generation logic 724, a facial recognition logic 726 and a proximity logic 728. As will be discussed in further detail below with respect to FIGS. 8-9, the image receiving logic 718 can be configured to, upon execution by the processor(s) 702, perform operations to receive a plurality of images from a sensor, such as the inventory cameras 712 1-712 i. In some embodiments, the image receiving logic 718 may receive a trigger, such as a request for a determination as to whether an inventory set needs to be restocked, and request an image be captured by one or more of the inventory cameras 712 1-712 i.
  • The object recognition logic 720 is configured to, upon execution by the processor(s) 702, perform operations to analyze an image received by an inventory camera 712 1-712 i, including object recognition techniques. In some embodiments, the object recognition techniques may include the use of machine learning, predetermined rule sets and/or deep convolutional neural networks. The object recognition logic 720 may be configured to identify one or more inventory sets within an image and determine an amount of each product within the inventory set. In addition, the object recognition logic 720 may identify a percentage, numerical determination, or other equivalent figure that indicates how much of the inventory set remains on the shelf (i.e., stocked) relative to an initial amount (e.g., based on analysis and comparison with an earlier image and/or retrieval of an initial amount predetermined and stored in a data store, such as the inventory threshold data store 730).
• The inventory threshold logic 722 is configured to, upon execution by the processor(s) 702, perform operations to retrieve one or more predetermined thresholds and determine whether the inventory set needs to be restocked. A plurality of predetermined thresholds, which may be stored in the inventory threshold data store 730, may be utilized in a single embodiment. For example, a first threshold may be used to determine whether the inventory set needs to be restocked and an alert transmitted to, for example, a retail employee (e.g., at least a first amount of the initial inventory set has been removed). In addition, a second threshold may be used to determine whether a product delivery person needs to deliver more of the corresponding product to the retailer (e.g., indicating at least a second amount of the initial inventory set has been removed, the second amount greater than the first amount). In such an embodiment, when the second threshold is met, alerts may be transmitted to both a retail employee and a product delivery person.
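• The two-tier threshold routing described above might be sketched as follows; the threshold values and recipient names are illustrative assumptions rather than values retrieved from the inventory threshold data store 730.

```python
# Hedged sketch of two-tier alert routing: a first threshold alerts in-store
# staff, a deeper second threshold also alerts the product delivery person.
# Threshold values and recipient labels are illustrative assumptions.
from typing import List

FIRST_THRESHOLD = 0.50    # assumed fraction removed that alerts in-store staff
SECOND_THRESHOLD = 0.80   # assumed deeper fraction that also alerts the distributor

def route_alerts(initial_count: int, current_count: int) -> List[str]:
    """Return who should be alerted given how much of the initial stock is gone."""
    removed_fraction = 1.0 - (current_count / initial_count) if initial_count else 1.0
    recipients = []
    if removed_fraction >= FIRST_THRESHOLD:
        recipients.append("retail_employee")
    if removed_fraction >= SECOND_THRESHOLD:
        recipients.append("product_delivery_person")
    return recipients

print(route_alerts(initial_count=20, current_count=3))  # both recipients alerted
```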
• The alert generation logic 724 can be configured to, upon execution by the processor(s) 702, perform operations to generate alerts according to determinations made by, for example, the object recognition logic 720 and the inventory threshold logic 722. In certain embodiments, the alerts may take any form such as a digital communication transmitted to one or more electronic devices, and/or an audio/visual cue in proximity to the shelf on which the inventory set is stocked, etc.
  • The facial recognition logic 726 may be configured to, upon execution by the processor(s) 702, perform operations to analyze images received by the image receiving logic 718 from the facial recognition cameras 716 1-716 k. In some embodiments, the facial recognition logic 726 may look to determine trends in customers based on ethnicity, age, gender, time of visit, geographic location of the store, etc., and, based on additional analysis, the automated inventory intelligence system logic 710 may determine trends in accordance with graphics displayed by the automated inventory intelligence system 700, sales, time of day, time of the year, day of the week, etc. Facial recognition logic 726 may also be able to generate data relating to the overall traffic associated with the facial recognition cameras 716 1-716 k. This can be generated directly based on the number of faces (unique and non-unique) that are processed within a given time period. This data can be stored within the persistent storage 706 within a traffic density log 734.
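• The traffic density log 734 described above could be as simple as a per-hour counter of detected faces, as in the following sketch; the bucket granularity and the source of the face counts are assumptions.

```python
# Minimal sketch of a traffic-density log: accumulating face counts per hourly
# bucket. The upstream face-detection call that produces the counts is assumed.
from collections import Counter
from datetime import datetime

traffic_density_log = Counter()   # hour bucket -> number of faces observed

def record_faces(detected_faces: int, observed_at: datetime) -> None:
    """Accumulate face counts (unique and non-unique) into hourly buckets."""
    bucket = observed_at.strftime("%Y-%m-%d %H:00")
    traffic_density_log[bucket] += detected_faces

record_faces(detected_faces=4, observed_at=datetime(2022, 2, 10, 14, 5))
record_faces(detected_faces=2, observed_at=datetime(2022, 2, 10, 14, 40))
print(traffic_density_log)   # Counter({'2022-02-10 14:00': 6})
```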
• The proximity logic 728 can be configured to, upon execution by the processor(s) 702, perform operations to analyze images received by, for example, the image receiving logic 718 from the proximity sensors 714 1-714 j. In some embodiments, the proximity logic 728 may determine when a customer is within a particular distance threshold from the shelving unit on which the inventory set is stocked and transmit a communication (e.g., an instruction or command) to change the graphics displayed on the fascia, e.g., such as the fascia 711 1-711 m. Data related to the proximity, and therefore the potential effectiveness of an advertisement, may be stored within a proximity log 732. In this way, data may be provided that can be tracked with particular displays, products, and/or advertising campaigns.
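• The following sketch shows one way the proximity logic 728 might switch fascia content and record events into the proximity log 732; the distance threshold, state names, and fascia interface are assumptions, not the claimed implementation.

```python
# Hedged sketch of proximity-driven fascia switching and proximity logging.
# The distance threshold, state names, and fascia.set_state interface are
# illustrative placeholders.
from datetime import datetime
from typing import List

PROXIMITY_THRESHOLD_M = 1.5   # assumed distance at which the fascia changes state
proximity_log: List[dict] = []

def on_proximity_reading(distance_m: float, fascia) -> None:
    """Switch the fascia to product information when a shopper is close by."""
    if distance_m <= PROXIMITY_THRESHOLD_M:
        fascia.set_state("product_information")
        proximity_log.append({"time": datetime.now().isoformat(),
                              "distance_m": distance_m,
                              "state": "product_information"})
    else:
        fascia.set_state("promotion")
```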
• Referring to FIG. 7B, an exemplary embodiment of a second logical representation of the automated inventory intelligence system of FIG. 1 is shown in accordance with some embodiments. The illustration of FIG. 7B provides a second embodiment of the automated inventory intelligence system 700 in which the automated inventory intelligence system logic 710 of FIG. 7A resides in cloud computing services 740. In such an embodiment, each of the fascia 711 1-711 m, the inventory cameras 712 1-712 i, the proximity sensors 714 1-714 j, and the facial recognition cameras 716 1-716 k may be configured to capture images which are then transmitted, via the communication interface 704, to the automated inventory intelligence system logic 710 in the cloud computing services 740. The automated inventory intelligence system logic 710, upon execution via the cloud computing services 740, performs operations to analyze the images.
• Processor(s) 702 can represent a single processor or multiple processors with a single processor core or multiple processor cores included therein. Processor(s) 702 can represent one or more general-purpose processors such as a microprocessor, a central processing unit (“CPU”), or the like. More particularly, processor(s) 702 may be a complex instruction set computing (“CISC”) microprocessor, reduced instruction set computing (“RISC”) microprocessor, very long instruction word (“VLIW”) microprocessor, or processor implementing other instruction sets, or processors implementing a combination of instruction sets. Processor(s) 702 can also be one or more special-purpose processors such as an application specific integrated circuit (“ASIC”), a field programmable gate array (“FPGA”), a digital signal processor (“DSP”), a network processor, a graphics processor, a communications processor, a cryptographic processor, a co-processor, an embedded processor, or any other type of logic capable of processing instructions. Processor(s) 702 can be configured to execute instructions for performing the operations and steps discussed herein.
• Persistent storage 706 can include one or more storage (or memory) devices, such as random access memory (“RAM”), dynamic RAM (“DRAM”), synchronous DRAM (“SDRAM”), static RAM (“SRAM”), or other types of storage devices. Persistent storage 706 can store information including sequences of instructions that are executed by the processor(s) 702, or any other device. For example, executable code and/or data of a variety of operating systems, device drivers, firmware (e.g., basic input/output system or BIOS), and/or applications may be loaded in persistent storage 706 and executed by the processor(s) 702. An operating system may be any kind of operating system, such as, for example, Windows® operating system from Microsoft®, Mac OS®/iOS® from Apple, Android® from Google®, Linux®, Unix®, or other real-time or embedded operating systems such as VxWorks.
  • Some portions of the description provided herein have been presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the ways used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities.
  • It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the above discussion, it should be appreciated that throughout the description, discussions utilizing terms such as those set forth in the claims below, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system's memories or registers or other such information storage, transmission or display devices.
  • The techniques shown in the figures may be implemented using code and data stored and executed on one or more electronic devices. Such electronic devices store and communicate (internally and/or with other electronic devices over a network) code and data using computer-readable media, such as non-transitory computer-readable storage media (e.g., magnetic disks; optical disks; random access memory; read only memory; flash memory devices; phase-change memory) and transitory computer-readable transmission media (e.g., electrical, optical, acoustical or other form of propagated signals—such as carrier waves, infrared signals, digital signals).
• The processes or methods depicted in the figures may be performed by processing logic that includes hardware (e.g., circuitry, dedicated logic, etc.), firmware, software (e.g., embodied on a non-transitory computer readable medium), or a combination thereof. Although the processes or methods are described above in terms of some sequential operations, it should be appreciated that some of the operations described may be performed in a different order. Moreover, some operations may be performed in parallel rather than sequentially.
  • IV. Inventory Monitoring Methodology
  • Referring now to FIG. 8 with additional reference to FIGS. 7A-7B, a flowchart illustrating an exemplary method for analyzing, by the automated inventory intelligence system logic 710 of FIGS. 7A-7B, an image of inventory to determine whether the inventory is to be restocked is shown in accordance with some embodiments. Each block illustrated in FIG. 8 represents an operation performed in the method 800 of analyzing an image of inventory to determine whether the inventory is to be restocked. Herein, the method 800 starts when the image receiving logic 718 of the automated inventory intelligence system logic 710 receives an image captured by an inventory camera, e.g., the inventory camera 712 1 (block 802).
• Upon receiving the image, the object recognition logic 720 of the automated inventory intelligence system logic 710 performs processing on the image including one or more object recognition techniques to distill at least a first inventory set (block 804). For example, the object recognition logic 720 may perform operations including, but not limited or restricted to, identification of objects of interest within the image, quantification of objects of interest within the image, depth determination of objects of interest within the image, and/or pattern recognition of at least portions of items of interest within the image in order to differentiate inventory sets. For example, as illustrated in FIG. 5, the object recognition logic 720 may distill at least two inventory sets from the image 500 including the first inventory portion 508 and the second inventory portion 510. The object recognition techniques may utilize deep convolutional neural networks, machine learning techniques and/or predetermined rule sets. Object recognition techniques may also be referred to as image recognition and/or computer vision.
  • Following the performance of the object recognition techniques and the distilling of at least the first inventory set, object recognition logic 720 determines the amount of inventory within the first inventory set (block 806). As discussed in further detail below with respect to thresholds, the amount of inventory may refer to a specific number of items (e.g., 16 2 L soda bottles remain) or a percentage/ratio (e.g., 50% of the 2 L soda bottles remain or ½ of the 2 L soda bottles remain). As discussed below, the number of items initially stocked may be predetermined and stored in the inventory threshold data store 730. Additionally, or in the alternative, the object recognition logic 720 may determine or estimate the number of items to fill the shelf space provided for the first inventory set (e.g., estimate the shelf space based on the location of the first inventory set relative to the edges of the shelf and other inventory sets also located on the shelf).
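• One way of turning recognized objects into an amount of inventory, as described above, is sketched below; the detection format, confidence cutoff, and initial counts are illustrative assumptions rather than the recognizer actually employed by the system.

```python
# Hedged sketch: converting object-recognition output into per-set counts and a
# remaining fraction. The detection list format, confidence cutoff, and initial
# counts are illustrative assumptions.
from collections import Counter
from typing import Dict, List, Tuple

def amount_remaining(detections: List[Tuple[str, float]],
                     initial_counts: Dict[str, int]) -> Dict[str, float]:
    """Count detected items per inventory set and return the fraction remaining."""
    counts = Counter(label for label, confidence in detections if confidence >= 0.5)
    return {inventory_set: counts.get(inventory_set, 0) / full_count
            for inventory_set, full_count in initial_counts.items()}

detections = [("soda_2L", 0.91)] * 8 + [("soda_12oz", 0.88)] * 3
print(amount_remaining(detections, {"soda_2L": 16, "soda_12oz": 12}))
# {'soda_2L': 0.5, 'soda_12oz': 0.25}
```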
• Subsequent to the determination of the amount of inventory within the first inventory set, the inventory threshold logic 722 can determine whether the amount of inventory within the first inventory set meets a predetermined threshold (block 808). In certain embodiments, the thresholds may be stored in and retrieved from the inventory threshold data store 730. In some embodiments, a single predetermined threshold may be utilized for all inventory sets, e.g., a percentage of inventory remaining on the shelf. In one embodiment, the object recognition logic 720 may determine the percentage of inventory remaining by retrieving a predetermined number representing the initial inventory (e.g., a number of items initially stocked) and determining the ratio between the initial inventory and the current inventory. Such a predetermined number may also be stored in the inventory threshold data store 730 and stored according to the inventory set and the retailer or retail location (as one retailer may not have the same amount of space to stock the inventory set as a second retailer). The predetermined number may be dynamically adjustable by, for example, a retail employee when the inventory set is restocked and/or the inventory set is moved from a first location to a second location.
  • In other embodiments, a predetermined threshold (e.g., either a percentage or a number) may be stored for one or more inventory sets. For example, a first threshold may correspond to a first inventory set, such as 2 L bottles of a first soda, and a second threshold (different than the first threshold) may correspond to a second inventory set, such as 12 oz. bottles of the first soda. Additionally, thresholds may be dynamically adjusted for various reasons including, but not limited to, sale prices, local stock levels, sales history, promotional campaigns, etc.
• When the inventory threshold logic 722 determines that the amount of inventory of the first inventory set meets or exceeds the predetermined threshold (yes at block 808), the alert generation logic 724 may optionally generate a report, or add information to an existing report (such as a log), indicating that a determination was made that the first inventory set did not need to be restocked (block 810). For example, one or more of the following may be included in a log: a time stamp, an indication or identifier of the inventory camera that captured the image, the image, an indication or identifier of the first inventory set (as well as other inventory sets recognized), an indication of the determination not to restock, etc.
  • When the inventory threshold logic 722 determines that the amount of inventory of the first inventory set does not at least meet the predetermined threshold (no at block 808), the alert generation logic 724 may generate an alert (and/or a report) indicating that the first inventory set is to be restocked (block 812). The alert may take many forms including, but not limited or restricted to, a text message, an email, or any other digitally transmitted audio/visual indication (e.g., message transmitted via Bluetooth®). Additionally, or in the alternative, the alert may include a visual indication located proximate to the shelf on which the first inventory set is stocked (e.g., a light on or near the relevant shelf or a light on the corresponding cabinet display top). The alert may include information such as an identifier of the first inventory set, an indicator identifying the location of the first inventory set within the retailer or retail location (e.g., a map of the retailer's space and an indicator of the relevant shelf on the map or an aisle number), an amount of inventory to restock (e.g., a particular number or percentage, the determination of which is discussed above), the image of the first inventory set, etc. The alert may be transmitted to one or more of: an electronic device at the retailer or retail location (e.g., a dedicated tablet in the back of the retailer), an electronic device of a specific employee, an electronic device of a distributor/product delivery person, etc. In some embodiments, the determination of what information to include and to what electronic devices to transmit the alert may be based on a predetermined rule set (which may also be stored in the inventory threshold data store 730). For example, the predetermined rule set may include a set of thresholds such that a first, lower threshold indicates an alert is to be sent to a retail employee, while a second, higher threshold indicates the alert is also to be sent to a distributor/product delivery person (i.e., when inventory is very low, send alert to distributor/product delivery person). In some embodiments, the alert generation logic 724 may access a data store, not shown, which includes information as to the total amount of the product comprising the first inventory set at a particular retailer or retail location (e.g., amount of the product stocked and also located in the back of the retailer or retail location, which may be dynamically updated in real-time based on product sales information obtained when a customer makes a purchase).
  • Referring now to FIG. 9, a flowchart illustrating an exemplary method for analyzing, by the automated inventory intelligence system logic of FIGS. 7A-7B, an image of inventory to determine whether the inventory is to be restocked based on a triggering event is shown in accordance with some embodiments. Each block illustrated in FIG. 9 represents an operation performed in the method 900 of analyzing an image of inventory to determine whether the inventory is to be restocked based on a triggering event. Herein, in accordance with certain embodiments, the method 900 can start when the image receiving logic 718 of the automated inventory intelligence system logic 710 receives a trigger to determine whether inventory of a first inventory set is to be restocked (block 902). A trigger may take various forms, including a communication from a proximity sensor 714 1-714 j, from a facial recognition camera 716 1-716 k, from a retail employee and/or from a product delivery person. In addition, a trigger may be received automatically at predetermined intervals (e.g., a pull system such that the inventory cameras 712 1-712 i are activated upon an instruction from the image receiving logic 718).
  • In response to receiving a trigger, the image receiving logic 718 can transmit a signal or other communication to an inventory camera 712 1-712 i monitoring the first inventory set identified in the trigger (e.g., the inventory camera configured to capture an image of the first inventory set) (block 904). The image receiving logic 718 may determine one or more inventory cameras of the inventory cameras 712 1-712 i that are to capture an image. In one embodiment, the received trigger may include an indication of a particular inventory set (e.g., 2 L bottles of a first soda). The image receiving logic 718 may determine the location of the particular inventory set in the retail location and determine the inventory camera 712 1-712 i that is configured/positioned to capture an image of the identified location. For example, a data store of the automated inventory intelligence system 700, not shown, may be configured to store data indicating pairings of: {inventory set, retail location}. Based on the stored pairing, which may be dynamically adjustable based on restocking of shelves, the image receiving logic 718 may determine the retail location of the inventory. In addition, the data store may also be configured to store pairings of: {retail location, inventory camera}. Therefore, based on the determined retail location, the image receiving logic 718 may determine the inventory camera 712 1-712 i that is positioned to capture an image of the retail location at which the particular inventory set is stocked. Based on this determination, the image receiving logic 718 may transmit a signal to the identified inventory camera with the instruction to capture an image. A sketch of this lookup chain is provided below.
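  • The fragment below is one hedged illustration of that chain: two hypothetical dictionaries stand in for the (unshown) pairings data store, and a placeholder capture_image call stands in for the signal to the identified camera. The names are illustrative, not part of the disclosed system.

```python
# Hypothetical pairing tables standing in for the (unshown) data store.
INVENTORY_TO_LOCATION = {"soda_2L": "aisle_4_shelf_2"}   # {inventory set, retail location}
LOCATION_TO_CAMERA = {"aisle_4_shelf_2": "inventory_camera_712_3"}  # {retail location, inventory camera}


def capture_image(camera_id):
    # Placeholder for transmitting a capture instruction to the identified camera (block 904).
    print(f"capture requested from {camera_id}")
    return b""  # in a real deployment, the captured image bytes would be returned (block 906)


def handle_trigger(inventory_set_id):
    """Resolve which camera covers an inventory set and instruct it to capture an image."""
    location = INVENTORY_TO_LOCATION[inventory_set_id]  # look up the retail location
    camera_id = LOCATION_TO_CAMERA[location]            # look up the covering camera
    return capture_image(camera_id)


# Example: a trigger identifying 2 L bottles of the first soda.
handle_trigger("soda_2L")
```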
  • Subsequent to the transmission of the signal to the relevant inventory camera, the image receiving logic 718 can receive an image captured by the relevant inventory camera (block 906). Following the receipt of the image, the automated inventory intelligence system logic 710 determines an amount of inventory within the first inventory set as discussed in FIG. 8 (block 908). Further, the automated inventory intelligence system logic 710 transmits a communication indicating whether inventory of the first inventory set is to be restocked based on a comparison of the amount of inventory of the first inventory set remaining with one or more inventory thresholds as discussed in FIG. 8 (block 910).
  • V. Proximity Sensor
  • a. Proximity Sensor Architecture
  • Referring now to FIG. 10, an exemplary embodiment of the proximity sensor positioned on a cabinet display top of the automated inventory intelligence system of FIG. 1 is shown in accordance with some embodiments. In particular, FIG. 10 illustrates one embodiment in which the proximity sensor 107 is a camera (“proximity camera 107”); however, the disclosure is not intended to be so limited. Instead, the proximity sensor 107 may be any sensor configured to detect objects within a predetermined proximity region, e.g., without any physical contact with the objects. For example, a proximity region may be a geo-fence, e.g., a virtual perimeter for a geographical area. Examples of alternative proximity sensors may include, but are not limited or restricted to, electromagnetic sensors, infrared sensors, and optical sensors. In addition, in certain embodiments, capacitive proximity sensors, photoelectric proximity sensors, and/or inductive proximity sensors may be used.
  • The proximity camera 107 of FIG. 10 is configured to couple to the cabinet display top 106 via a mount 1004 and includes a lens 1000 and a housing 1002. In one embodiment, the proximity camera 107 may be the same camera as the inventory camera 210 as seen in, for example, FIG. 2C, or an equivalent model. In one embodiment, the proximity camera 107 may have a viewing angle of approximately 180 degrees and may be capable of monitoring a larger portion of the store environment, particularly the geographic region in front of the shelving unit to which the proximity camera 107 is coupled.
  • Although illustrated as being positioned on top of the cabinet display top 106, the proximity camera 107 may be coupled to a side or the bottom of the cabinet display top 106. In addition, although not shown, the proximity camera 107 may be integrated directly into the cabinet display top 106. In addition, the proximity camera 107 may be positioned at any location on the cabinet display top 106, e.g., the middle, toward either end, etc.
  • b. Proximity Region Analysis and Graphic Displays
  • Referring to FIG. 11A, a first illustration of an image captured by the proximity sensor of the automated inventory intelligence system of FIG. 1 is shown in accordance with some embodiments. The image 1100 provides one exemplary perspective from the proximity camera 107 of FIGS. 1 and 10. The image 1100 provides an illustration of a store environment 1102 including a view of numerous store aisles as well as a geographic region proximate the shelving unit to which the automated inventory intelligence system (and hence the proximity camera 107) is coupled. For example and as shown in FIG. 11A, the proximity camera 107 is configured to capture the image 1100, which captures objects in front of, and approaching, the shelving unit.
  • In particular, the image 1100 includes an image capture, e.g., a picture, of a region including a first proximity region 1104. Additionally, the image 1100 captures a first shopper 1106 at least partially within the first proximity region 1104, and shoppers 1108, 1110 and 1112 outside of the first proximity region 1104. In some embodiments, the image 1100 may capture additional objects 1111. In one embodiment, the proximity logic 728, as seen in FIGS. 7A-7B, is communicatively coupled to the proximity sensor 107 and may determine the graphic(s) to be displayed on the fascia according to the location of one or more objects detected in the captured image 1100. Specifically, upon receipt of the image 1100 from the proximity sensor 107, the proximity logic 728 may be configured to detect objects within the image 1100 and determine whether each detected object is within a proximity region. In some embodiments, the proximity logic 728 may be configured to detect objects within one or more proximity regions.
  • For example, as shown in FIG. 11A, a single proximity region 1104 may be utilized such that when an object is detected and determined to be within the proximity region 1104, the proximity logic 728 may transmit one or more instructions causing the automated inventory intelligence system to display a first graphic (e.g., product information) on its fascia, e.g., the fascia 108 1-108 4, as well as the cabinet display top 106. Subsequently, when objects are no longer detected within the proximity region, the proximity logic 728 may transmit one or more instructions causing the automated inventory intelligence system to display a second graphic (e.g., a promotion or other immersive graphic) on the fascia 108 1-108 4 and/or the cabinet display top 106. As will be discussed in further detail below with respect to FIGS. 11B and 12A, the proximity logic 728 may be configured to detect objects and determine whether each object is within one of a plurality of proximity regions such that the automated inventory intelligence system causes the display of various graphics depending on in which region each object is determined to be.
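  • One way to express the single-region behavior just described is a simple containment test on the positions of detected objects. The sketch below is an assumption-laden illustration: it models a proximity region as an axis-aligned rectangle in image coordinates and uses the bottom-center of each detected bounding box as the object's position; a polygon or geo-fence test could be substituted, and the names Region and select_graphic are hypothetical.

```python
from dataclasses import dataclass
from typing import Iterable, Tuple


@dataclass
class Region:
    """Proximity region modeled as an axis-aligned rectangle in image pixel coordinates."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


def select_graphic(detections: Iterable[Tuple[float, float, float, float]], region: Region) -> str:
    """Pick a graphic based on whether any detected object falls inside the proximity region."""
    for x_min, y_min, x_max, y_max in detections:  # bounding boxes from object recognition
        foot_x = (x_min + x_max) / 2  # bottom-center of the box approximates the
        foot_y = y_max                # object's position on the floor plane
        if region.contains(foot_x, foot_y):
            return "product_information_graphic"   # first graphic: object in the region
    return "immersive_graphic"                     # second graphic: region is empty


# Example: one shopper's bounding box lands inside the region, so product information is shown.
region_1104 = Region(x_min=200, y_min=300, x_max=800, y_max=700)
print(select_graphic([(350, 250, 450, 650)], region_1104))
```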
  • Referring to FIG. 11B, a second illustration of an image captured by the proximity sensor of the automated inventory intelligence system of FIG. 1 is shown in accordance with some embodiments. The image 1114 captured by the proximity camera 107 illustrates the proximity logic 728 being configured to detect objects within a plurality of proximity regions, e.g., the first proximity region 1118 and the second proximity region 1120 within the store environment 1116. As will be discussed in more detail with respect to FIGS. 12A-12D, the proximity logic 728 may provide instructions that cause the corresponding fascia to display varying graphics depending on whether objects are detected in the first proximity region 1118 and/or the second proximity region 1120. In a variety of embodiments, the second proximity region 1120 may comprise a smaller region than the first proximity region 1118 and be located closer to the shelving unit than the first proximity region 1118. However, in other embodiments the first and second proximity regions 1118, 1120 may be separate and non-overlapping (or a mix of both).
  • Referring now to FIG. 12A, an exemplary illustration of a plurality of proximity regions based on one configuration of a proximity sensor of an automated inventory intelligence system is shown in accordance with some embodiments. FIG. 12A provides an illustration of a first proximity region 1216 and a second proximity region 1218 partially surrounding the shelving unit 1200, with a shopper 1219 located within the first proximity region 1216. The shelving unit 1200, being similar in this embodiment to the shelving unit 102 of FIG. 1, includes a first shelf 1202 and a second shelf 1204 as well as a back component 1206. The shelving unit 1200 may be coupled to the automated inventory intelligence system 1207, which is shown to include inventory cameras 1208 1-1208 4, and fascia 1210 1-1210 2. The automated inventory intelligence system 1207 of the depicted embodiment may be closely equivalent to the automated inventory intelligence system 100 of FIG. 1. Additionally, the shelving unit 1200 may be stocked with inventory such as a first inventory set 1212 and a second inventory set 1214. The first and second inventory sets 1212, 1214 may be monitored by the inventory cameras 1208 1-1208 4 as discussed above in accordance with at least FIGS. 1-5.
  • In some embodiments, the graphics displayed on the fascia 1210 1-1210 2 may be changed in accordance with instructions provided by the proximity logic 728 based on whether an object is detected in the first proximity region 1216 and/or the second proximity region 1218. For example, in certain embodiments, the proximity logic 728 may receive images captured by a proximity camera, such as the proximity camera 107, and perform object recognition techniques, e.g., such as those discussed above with respect to the images captured by the inventory cameras 110 1-110 i. In other embodiments, the images captured by the proximity camera may be provided to the object recognition logic 720. Based on the detection of one or more objects in one or more of the first proximity region 1216 and/or the second proximity region 1218, the proximity logic 728 may cause the automated inventory intelligence system 1207 to change states, e.g., an immersive state, a promotion state, and a product information state. It should be noted that these are merely exemplary names and that an alternative naming convention may be utilized. Further, more or fewer states may be utilized by the proximity logic 728. For example, the image 1100 of FIG. 11A illustrates a scenario in which the proximity logic 728 may be configured with fewer than three states, e.g., two states.
  • As illustrated in FIG. 12A, the shopper 1219 has been detected as being located in the first proximity region 1216. Based on the detection of the shopper 1219 being located in the first proximity region 1216, the proximity logic 728 has caused the automated inventory intelligence system 1207 to enter into a "promotion state" such that the fascia 1210 1-1210 2 are configured to display a promotion graphic (e.g., "Sale!" and "Kids Cereal!"). In the embodiment illustrated in FIG. 12A, the proximity logic 728 may be configured with three possible states: (1) an immersive state triggered when no objects are detected in either of the first proximity region 1216 or the second proximity region 1218, wherein the fascia 1210 1-1210 2 are to display an immersive graphic; (2) a promotion state triggered when an object is detected in the first proximity region 1216 but no object is detected in the second proximity region 1218, wherein the fascia 1210 1-1210 2 are to display a promotion graphic; and (3) a product information state triggered when an object is detected in the second proximity region 1218, wherein the fascia 1210 1-1210 2 are to display a product information graphic. Additional fascia, not shown, and/or the cabinet display top 1220 may also display graphics according to the state of the automated inventory intelligence system 1207. A minimal sketch of this state selection is shown below. Embodiments of an automated inventory intelligence system 1207 in different states are discussed below and illustrated in FIGS. 12B-12D.
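  • In the sketch, the DisplayState names mirror the exemplary state names used in this description, but the code is only an illustrative mapping from region detections to a state, not the proximity logic 728 itself.

```python
from enum import Enum, auto


class DisplayState(Enum):
    IMMERSIVE = auto()            # no object detected in either proximity region
    PROMOTION = auto()            # object detected in the first (outer) region only
    PRODUCT_INFORMATION = auto()  # object detected in the second (inner) region


def resolve_state(in_first_region: bool, in_second_region: bool) -> DisplayState:
    """Map region detections to the display state described for FIG. 12A."""
    if in_second_region:
        return DisplayState.PRODUCT_INFORMATION
    if in_first_region:
        return DisplayState.PROMOTION
    return DisplayState.IMMERSIVE


# Example: the shopper 1219 is in the first region only, so the promotion state is entered.
print(resolve_state(in_first_region=True, in_second_region=False))
```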
  • As is evident from the illustrations of FIGS. 12A-12D, one goal of the graphics displayed on the cabinet display top 1220 and/or the fascia 1210 1-1210 2 is to grab the attention of shoppers and prolong dwell time in front of the shelving unit 1200, which makes it more likely the shopper will purchase an item stocked on the shelving unit 1200. Specifically, the invention provides manufacturers and retailers the advantage of enabling point of sale marketing through dynamic video and animation.
  • The discussion of FIGS. 12B-12D will be in reference to FIG. 12A. Referring now to FIG. 12B, an exemplary illustration of fascia of the automated inventory intelligence system 1207 displaying a first graphic while in a promotion state is shown in accordance with some embodiments. FIG. 12B illustrates the shelving unit 1200 of FIG. 12A coupled to the automated inventory intelligence system of FIG. 12A in a promotion state. The promotion state may correspond to the display of content-specific promotion graphics on one or more of the fascia 1210 1-1210 i and cabinet display top 1220. In particular embodiments, the graphics displayed may be (i) specific to the items stocked on the shelving unit, (ii) specific to the brands having items stocked on the shelving unit, (iii) promotions and/or discounts specific to the items stocked on the shelving unit, etc. As is illustrated in FIG. 12B, the proximity logic 728 has provided instructions causing the fascia 1210 1-1210 i to display promotional information, e.g., a promotion benefitting schools by virtue of purchasing cereal and returning a portion of the cereal box, which corresponds to the items, e.g., cereal, stocked on the shelving unit 1200.
  • Referring now to FIG. 12C, an exemplary illustration of fascia of the automated inventory intelligence system of FIG. 12A displaying a first graphic while in a product information state is shown in accordance with some embodiments. FIG. 12C illustrates the shelving unit 1200 of FIG. 12A coupled to the automated inventory intelligence system 1207 in a product information state. The product information state may correspond to the display of product-specific information graphics, including pricing information, on one or more of the fascia 1210 1-1210 i and cabinet display top 1220. In some embodiments, the product information state may include information similar to information displayed in the promotion state, including discounts or promotions on particular items or with particular brands. As is illustrated in FIG. 12C, the proximity logic 728 has provided instructions causing the fascia 1210 1-1210 i to display product information, e.g., pricing information for each of the items stocked on the shelving unit 1200.
  • Referring to FIG. 12D, an exemplary illustration of fascia of the automated inventory intelligence system 1207 displaying a first graphic while in an immersive state is shown in accordance with some embodiments. FIG. 12D illustrates the shelving unit 1200 of FIG. 12A coupled to the automated inventory intelligence system 1207 in an immersive state. The immersive state may correspond to the display of graphics which may span a plurality of the fascia 1210 1-1210 i. The graphics displayed while the automated inventory intelligence system 1207 is in the immersive state are intended to grab the attention of shoppers not standing immediately in front of the shelving unit 1200. In some embodiments, the graphics displayed while the automated inventory intelligence system 1207 is in the immersive state may span from the cabinet display top 1220 across each of the fascia 1210 1-1210 i. For example, as is illustrated in FIG. 12D, the proximity logic 728 has provided instructions causing the cabinet display top 1220 and the fascia 1210 1-1210 i to display an immersive graphic, e.g., an animation of a school bus. In such an embodiment, the graphic may be animated wherein lights of the school bus may appear to flash and/or the school bus may appear to be approaching the edge of the shelving unit 1200 facing customers and the retail environment.
  • c. Proximity Region Detection Methodology
  • Referring to FIG. 13, an exemplary flowchart illustrating operations corresponding to detecting whether an object is located within one of a plurality of proximity regions performed by an automated inventory intelligence system is shown in accordance with some embodiments. Each block illustrated in FIG. 13 represents an operation performed in the method 1300 of detecting whether an object is located within one of a plurality of proximity regions performed by an automated inventory intelligence system. FIG. 13 will be discussed with reference to FIGS. 12A-12D. The method 1300 begins when an image captured by a proximity sensor, e.g., the proximity camera 107, is received (block 1302). Following receipt of the image from the proximity camera 107, object recognition techniques, as described above, can be performed on the image (block 1304).
  • Subsequent to the performance of object recognition techniques, a determination is made as to whether an object is detected within the second proximity region 1218 of FIG. 12A, i.e., a proximity region closest to the proximity camera 107 (block 1306). When an object is detected within the second proximity region 1218, instructions are transmitted by the proximity logic 728 that are configured to cause fascia, e.g., the fascia 1210 1-1210 i, and/or the cabinet display top 1220 to display a product information graphic (block 1308). The method 1300 then continues with the operations discussed below with respect to FIG. 13.
  • When an object is not detected within the second proximity region 1218, a determination is made as to whether an object is detected within the first proximity region 1216 of FIG. 12A (block 1310). When an object is detected within the first proximity region 1216 but not the second proximity region 1218, instructions are transmitted by the proximity logic 728 that are configured to cause fascia, e.g., the fascia 1210 1-1210 i, and/or the cabinet display top 1220 to display a promotion graphic (block 1312). The method 1300 then continues with the operations discussed below with respect to FIG. 13.
  • When an object is not detected within either the first proximity region 1216 or the second proximity region 1218, instructions are transmitted by the proximity logic 728 that are configured to cause fascia, e.g., the fascia 1210 1-1210 i, and/or the cabinet display top 1220 to display an immersive graphic (block 1314). The method 1300 then continues with the operations discussed below with respect to FIG. 13.
  • Referring still to FIG. 13, subsequent to the transmission of instructions to display a graphic, i.e., any of a product information graphic, a promotion graphic or an immersive graphic, on the fascia 1210 1-1210 i and/or the cabinet display top 1220, the proximity logic 728 can stall for a predetermined amount of time. Following the expiration of the predetermined amount of time, the proximity logic 728 may transmit a request for an updated image to the proximity camera 107 and perform object recognition techniques on the received image (block 1316).
  • Subsequent to the performance of object recognition techniques, a determination is made as to whether an object is detected within a proximity region (block 1318). When an object is detected only in the first proximity region 1216, i.e., no change since the previous image was captured, no change is instituted to the graphic displayed on the fascia 1210 1-1210 i and/or the cabinet display top 1220 and the method 1300 returns to block 1316.
  • When an object is detected in the second proximity region 1218, instructions are transmitted by the proximity logic 728 that are configured to cause fascia, e.g., the fascia 1210 1-1210 i, and/or the cabinet display top 1220 to display a product information graphic (block 1320) and the method 1300 returns to block 1316.
  • When no object is detected in any proximity region, instructions can be transmitted by the proximity logic 728 that are configured to cause fascia, e.g., the fascia 1210 1-1210 i, and/or the cabinet display top 1220 to display an immersive graphic (block 1322) and the method 1300 returns to block 1316. A sketch of this polling loop is provided below.
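  • In the fragment, get_image, detect_regions and set_display are hypothetical callables standing in for the proximity camera 107, the object recognition step, and the fascia/cabinet display top instructions; resolve_state refers to the state-selection sketch above, and the poll interval stands in for the predetermined stall time.

```python
import time


def proximity_loop(get_image, detect_regions, set_display, poll_interval_s=2.0):
    """Hypothetical polling loop for method 1300: capture, classify, update, wait."""
    current_state = None
    while True:
        image = get_image()                          # request an updated image (block 1316)
        in_first, in_second = detect_regions(image)  # object recognition + region tests
        state = resolve_state(in_first, in_second)   # defined in the state-selection sketch above
        if state != current_state:                   # only push new instructions on a change
            set_display(state)                       # fascia and/or cabinet display top 1220
            current_state = state
        time.sleep(poll_interval_s)                  # stall for a predetermined amount of time
```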
  • Referring to FIG. 14, an exemplary flowchart illustrating operations corresponding to detecting one or more objects in one or more of a plurality of proximity regions and generating activity logs performed by an automated inventory intelligence system is shown in accordance with some embodiments. Each block illustrated in FIG. 14 represents an operation performed in the method 1400 of detecting one or more objects in one or more of a plurality of proximity regions and generating activity logs performed by an automated inventory intelligence system. The method 1400 begins when an image captured by a proximity sensor, e.g., the proximity camera 107, is received by, e.g., the proximity logic 728 and/or the object recognition logic 720 (block 1402). Following receipt of the image from the proximity camera 107, object recognition techniques, as described above, are performed on the image (block 1404).
  • Upon completion of the object recognition techniques, information corresponding to the detected object(s) and accompanying metadata is recorded in a proximity log, e.g., the proximity log 732 of FIGS. 7A-7B (block 1406). Based on the information corresponding to the detected object(s) and the accompanying metadata, traffic density information can be determined, e.g., by the proximity logic 728, and recorded in a traffic density log, e.g., the traffic log 734 of FIGS. 7A-7B (block 1408). The traffic density information may include, but is not limited or restricted to, traffic density (of shoppers, employees, or a mix of both) over a given time period (broken down into time frames including the hours of the day, days of the week, etc.), inventory on the corresponding unit during each time frame, the display on the fascia and/or the cabinet display top during each time frame, demographic data of each shopper, location of the retail location, location of the shelving unit within the retail location, an outdoor temperature at the retail location during a given time frame (e.g., exact, average, etc.), etc. In certain embodiments, the shelving unit may be able to differentiate shoppers from employees due to an identifying uniform/object (which can be parsed by the object detection techniques described above) or via a signal transmission (from, for example, employee cards/identification badges).
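  • A minimal sketch of how blocks 1406 and 1408 might record detections and roll them up into hourly traffic density buckets is shown below; proximity_log and traffic_log are in-memory stand-ins for the proximity log 732 and the traffic log 734, and the field names are illustrative assumptions rather than the disclosed schema.

```python
from datetime import datetime

proximity_log = []  # in-memory stand-in for the proximity log 732
traffic_log = {}    # in-memory stand-in for the traffic log 734


def record_detections(detections, fascia_content):
    """Append detection metadata to the proximity log (block 1406) and update the hourly
    traffic density bucket (block 1408)."""
    now = datetime.now()
    proximity_log.append({
        "timestamp": now.isoformat(),
        "shopper_count": len(detections),
        "fascia_content": fascia_content,
    })
    bucket = now.strftime("%Y-%m-%d %H:00")  # hour-of-day bucket for traffic density
    traffic_log[bucket] = traffic_log.get(bucket, 0) + len(detections)


# Example: two shoppers detected while a promotion graphic is on the fascia.
record_detections(detections=[(10, 20, 50, 90), (200, 30, 260, 120)],
                  fascia_content="promotion_graphic")
```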
  • In addition, the proximity logic 728 and/or the alert generation logic 724 may generate and cause the rendering of one or more visual graphic displays based at least in part on the traffic density information (block 1410). Exemplary visual graphic displays are illustrated in FIGS. 15A-15B.
  • d. Visual Graphic Display of Traffic Density Information
  • Referring to FIG. 15A, an exemplary embodiment of a first user interface display screen produced by an automated inventory intelligence system, where the first user interface display screen provides an interactive dashboard, is shown in accordance with some embodiments. The first user interface display screen 1500, generated by the proximity logic 728 and/or the alert generation logic 724 and rendered by the alert generation logic 724, comprises a plurality of display areas including a top navigation panel 1502, a side navigation panel 1504, a first graphical display area 1506 and a second graphical display area 1508.
  • The top navigation panel 1502 may include graphic display elements, each configured to receive user input and cause performance of specific operations including filtering options and/or downloading options. The side navigation panel 1504 may include graphic display elements, each configured to receive user input and cause performance of specific operations such as rendering of user interface display screens directed to options or links such as: administrative options or links, media player options or links, triggering options or links (e.g., of one or more of the sensors included in the automated inventory intelligence system 700 as seen in FIGS. 7A-7B), etc.
  • The first graphical display area 1506 and the second graphical display area 1508 may provide information pertaining to the traffic density information in a graphical user interface format including a graphical display of dwell time of shoppers by the hour of the day for a given period of time (e.g., the first graphical display area 1506) and/or a graphical display of proximity triggering events by the hour of the day for a given period of time (second graphical display area 1508).
  • Referring now to FIG. 15B, an exemplary embodiment of a second user interface display screen produced by an automated inventory intelligence system, where the second user interface display screen provides an interactive dashboard, is shown in accordance with some embodiments. The second user interface display screen 1510, generated in many embodiments by the proximity logic 728 and/or the alert generation logic 724 and rendered by the alert generation logic 724, comprises a plurality of display areas including the top navigation panel 1502, the side navigation panel 1504, a third graphical display area 1512 and a fourth graphical display area 1514.
  • The third graphical display area 1512 and the fourth graphical display area 1514 may provide information pertaining to the traffic density information in a graphical user interface format including a graphical display of an average shopper count by the hour of the day for a given period of time (e.g., the third graphical display area 1512) and/or a graphical display of average shopper count by the day of the week for a given period of time (fourth graphical display area 1514). It should be noted that the graphical displays provided by the graphical display areas 1506, 1508, 1512 and 1514 are merely examples and representative of graphical displays that may be generated by the proximity logic 728 and/or the alert generation logic 724 based at least in part on the traffic density information.
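  • The hour-of-day and day-of-week averages rendered in display areas such as 1512 and 1514 could be derived from the proximity log with a simple aggregation like the one sketched below; the log schema matches the earlier logging sketch and is an assumption, not the disclosed implementation.

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean


def average_shopper_counts(proximity_log):
    """Aggregate proximity-log entries into average shopper counts by hour of day and
    by day of week, i.e., the kind of data charted in the dashboard display areas."""
    by_hour, by_day = defaultdict(list), defaultdict(list)
    for entry in proximity_log:
        ts = datetime.fromisoformat(entry["timestamp"])
        by_hour[ts.hour].append(entry["shopper_count"])
        by_day[ts.strftime("%A")].append(entry["shopper_count"])
    hourly = {hour: mean(counts) for hour, counts in sorted(by_hour.items())}
    daily = {day: mean(counts) for day, counts in by_day.items()}
    return hourly, daily
```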
  • As will be understood by those skilled in the art, the descriptions of logics utilized in the embodiments depicted in FIGS. 8-15B are not limiting. In fact, various processes performed by a certain logic may be configured to be performed by a separate logic, and/or a comparable logic/service located remotely in, for example, a cloud computing service. Indeed, a mix of various logics located locally and remotely may exist within the system.
  • While some particular embodiments have been provided herein, and while the particular embodiments have been provided in some detail, it is not the intention for the particular embodiments to limit the scope of the concepts presented herein. Additional adaptations and/or modifications can appear to those of ordinary skill in the art, and, in broader aspects, these adaptations and/or modifications are encompassed as well. Accordingly, departures may be made from the particular embodiments provided herein without departing from the scope of the concepts provided herein.

Claims (20)

What is claimed is:
1. A proximity camera system, comprising:
a proximity camera having a lens and a housing;
one or more fascia;
one or more processors communicatively coupled to the proximity camera and the one or more fascia; and
a non-transitory computer-readable medium communicatively coupled to the one or more processors and having logic thereon, the logic, when executed by the one or more processors, being configured to perform operations including:
receiving an image captured by the proximity camera,
performing object recognition techniques on the image,
determining whether an object was detected within a first predetermined proximity region, and
transmitting one or more instructions configured to cause a graphical display to be displayed by the one or more fascia.
2. The proximity camera system of claim 1, wherein the proximity camera and the one or more fascia are coupled to a shelving unit.
3. The proximity camera system of claim 2, wherein the image illustrates a geo-fence region at least partially surrounding the shelving unit, the geo-fence region including the first predetermined proximity region.
4. The proximity camera system of claim 2, wherein the logic is configured to determine whether a second object is detected within a second predetermined proximity region, the second proximity region including a physical area closer to the shelving unit than the first predetermined proximity region.
5. The proximity camera system of claim 1, wherein the image is transmitted to a cloud computing service for analysis of the first predetermined proximity region.
6. The proximity camera system of claim 1, wherein the graphical display displayed by the one or more fascia includes an immersive graphic.
7. The proximity camera system of claim 6, wherein the immersive graphic spans a plurality of the one or more fascia.
8. The proximity camera system of claim 6, further comprising a cabinet display top communicatively coupled to the one or more processors and the non-transitory computer-readable medium, and wherein the immersive graphic spans a plurality of the one or more fascia and the cabinet display top.
9. The proximity camera system of claim 1, wherein the graphical display displayed by the one or more fascia includes a product information graphic, wherein the product information graphic includes at least pricing information for one or more inventory items.
10. The proximity camera system of claim 1, wherein the graphical display displayed by the one or more fascia includes a promotional graphic, wherein the promotional graphic includes at least information corresponding to a promotion or discount for one or more inventory items.
11. A computerized method, the method comprising:
receiving an image captured by the proximity camera,
performing object recognition techniques on the image,
determining whether an object was detected within a first predetermined proximity region, and
transmitting one or more instructions configured to cause a graphical display to be displayed by the one or more fascia.
12. The computerized method of claim 11, wherein the proximity camera and the one or more fascia are coupled to a shelving unit.
13. The computerized method of claim 12, wherein the image illustrates a geo-fence region at least partially surrounding the shelving unit, the geo-fence region including the first predetermined proximity region.
14. The computerized method of claim 12, further comprising determining whether a second object is detected within a second predetermined proximity region, the second proximity region including a physical area closer to the shelving unit than the first predetermined proximity region.
15. The computerized method of claim 12, wherein the graphical display displayed by the one or more fascia includes an immersive graphic that spans a plurality of the one or more fascia.
16. A non-transitory computer readable storage medium having stored thereon instructions, the instructions being executable by one or more processors to perform operations comprising:
receiving an image captured by the proximity camera,
performing object recognition techniques on the image,
determining whether an object was detected within a first predetermined proximity region, and
transmitting one or more instructions configured to cause a graphical display to be displayed by the one or more fascia.
17. The non-transitory computer readable storage medium of claim 16, wherein the proximity camera and the one or more fascia are coupled to a shelving unit.
18. The non-transitory computer readable storage medium of claim 17, wherein the image illustrates a geo-fence region at least partially surrounding the shelving unit, the geo-fence region including the first predetermined proximity region.
19. The non-transitory computer readable storage medium of claim 17, wherein the instructions being executable by the one or more processors to perform further operations comprising determining whether a second object is detected within a second predetermined proximity region, the second proximity region including a physical area closer to the shelving unit than the first predetermined proximity region.
20. The non-transitory computer readable storage medium of claim 18, wherein the graphical display displayed by the one or more fascia includes an immersive graphic that spans a plurality of the one or more fascia.
US17/669,118 2018-10-10 2022-02-10 Systems, Method And Apparatus For Automated Inventory Interaction Abandoned US20220161763A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/669,118 US20220161763A1 (en) 2018-10-10 2022-02-10 Systems, Method And Apparatus For Automated Inventory Interaction

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201862743734P 2018-10-10 2018-10-10
US201916598577A 2019-10-10 2019-10-10
US17/669,118 US20220161763A1 (en) 2018-10-10 2022-02-10 Systems, Method And Apparatus For Automated Inventory Interaction

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US201916598577A Continuation 2018-10-10 2019-10-10

Publications (1)

Publication Number Publication Date
US20220161763A1 true US20220161763A1 (en) 2022-05-26

Family

ID=70159544

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/598,557 Active 2039-11-25 US11250456B2 (en) 2018-10-10 2019-10-10 Systems, method and apparatus for automated inventory interaction
US17/669,118 Abandoned US20220161763A1 (en) 2018-10-10 2022-02-10 Systems, Method And Apparatus For Automated Inventory Interaction

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/598,557 Active 2039-11-25 US11250456B2 (en) 2018-10-10 2019-10-10 Systems, method and apparatus for automated inventory interaction

Country Status (2)

Country Link
US (2) US11250456B2 (en)
WO (1) WO2020077096A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230325138A1 (en) * 2020-05-14 2023-10-12 Nec Corporation Image storage apparatus, image storage method, and non-transitory computer-readable medium

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11126861B1 (en) 2018-12-14 2021-09-21 Digimarc Corporation Ambient inventorying arrangements
US10896584B2 (en) * 2019-04-30 2021-01-19 Walmart Apollo, Llc Systems and methods for projecting action indicators
US11495348B2 (en) * 2019-05-28 2022-11-08 Candice E. Lowry Artificial intelligence storage and tracking system for emergency departments and trauma centers
US11410122B1 (en) * 2020-01-31 2022-08-09 Amazon Technologies, Inc. Determining inventory levels using switch-equipped strips and patterns of activated or deactivated indicators
JP7434032B2 (en) * 2020-03-31 2024-02-20 キヤノン株式会社 Information processing device, information processing method, and program
US20220101354A1 (en) * 2020-09-28 2022-03-31 Sensormatic Electronics, LLC Shopper influencer system and method
US20220101245A1 (en) * 2020-09-29 2022-03-31 International Business Machines Corporation Automated computerized identification of assets
AU2021453892A1 (en) * 2021-07-01 2024-01-25 Ses-Imagotag Gmbh Display-system for displaying product or price information
US20230341847A1 (en) * 2022-04-22 2023-10-26 Pepsico, Inc. Multi-sensor perception for resource tracking and quantification

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070076251A1 (en) * 2005-09-30 2007-04-05 Brother Kogyo Kabushiki Kaisha System and Device for Image Reading
JP2014525152A (en) * 2011-05-19 2014-09-25 クアルコム,インコーポレイテッド Method and apparatus for enhanced multi-camera motion capture using proximity sensors
US20190149725A1 (en) * 2017-09-06 2019-05-16 Trax Technologies Solutions Pte Ltd. Using augmented reality for image capturing a retail unit
US10546173B2 (en) * 2015-04-09 2020-01-28 Nec Corporation Information processing device, information processing system, position reporting method, and program recording medium
US10872227B2 (en) * 2018-03-29 2020-12-22 Boe Technology Group Co., Ltd. Automatic object recognition method and system thereof, shopping device and storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8484076B2 (en) * 2003-09-11 2013-07-09 Catalina Marketing Corporation Proximity-based method and system for generating customized incentives
US8672427B2 (en) 2010-01-25 2014-03-18 Pepsico, Inc. Video display for product merchandisers
US8607242B2 (en) * 2010-09-02 2013-12-10 International Business Machines Corporation Selecting cloud service providers to perform data processing jobs based on a plan for a cloud pipeline including processing stages
US20130027561A1 (en) * 2011-07-29 2013-01-31 Panasonic Corporation System and method for improving site operations by detecting abnormalities
US10360593B2 (en) * 2012-04-24 2019-07-23 Qualcomm Incorporated Retail proximity marketing
US20140249928A1 (en) 2013-02-01 2014-09-04 Shelfbucks Shelf to consumer platform
KR20150018264A (en) * 2013-08-09 2015-02-23 엘지전자 주식회사 Wearable glass-type device and control method thereof
EP3195300A4 (en) 2014-07-31 2018-05-23 Cloverleaf Media LLC Dynamic merchandising communication system
US11042894B2 (en) 2015-05-13 2021-06-22 Abl Ip Holding, Llc Systems and methods for POP display and wireless beacon engagement with mobile devices

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070076251A1 (en) * 2005-09-30 2007-04-05 Brother Kogyo Kabushiki Kaisha System and Device for Image Reading
JP2014525152A (en) * 2011-05-19 2014-09-25 クアルコム,インコーポレイテッド Method and apparatus for enhanced multi-camera motion capture using proximity sensors
US10546173B2 (en) * 2015-04-09 2020-01-28 Nec Corporation Information processing device, information processing system, position reporting method, and program recording medium
US20190149725A1 (en) * 2017-09-06 2019-05-16 Trax Technologies Solutions Pte Ltd. Using augmented reality for image capturing a retail unit
US10872227B2 (en) * 2018-03-29 2020-12-22 Boe Technology Group Co., Ltd. Automatic object recognition method and system thereof, shopping device and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Westell, J.; Saeedi, P., 3D object recognition via multi-view inspection in unkown environments (English), 2010 11th International Conference on control Automation Robotics & Vision (Page(s): 2088-2095), 1-Dec-2010 (Year: 2010) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230325138A1 (en) * 2020-05-14 2023-10-12 Nec Corporation Image storage apparatus, image storage method, and non-transitory computer-readable medium

Also Published As

Publication number Publication date
US11250456B2 (en) 2022-02-15
WO2020077096A1 (en) 2020-04-16
US20200118154A1 (en) 2020-04-16

Similar Documents

Publication Publication Date Title
US11250456B2 (en) Systems, method and apparatus for automated inventory interaction
US11288734B2 (en) Intelligent shelf display system
US20210216952A1 (en) System and Methods for Inventory Management
US11727353B2 (en) Comparing planogram compliance to checkout data
US10846512B2 (en) Updating online store inventory based on physical store inventory
US20210216951A1 (en) System and Methods for Inventory Tracking
US20170124603A1 (en) Marketing display systems and methods
US9361628B2 (en) Interactive video shelving system
US9235375B2 (en) Retail digital signage
US9575558B2 (en) System and method for electronically assisting a customer at a product retail location
US8650073B2 (en) Glasses-free 3D advertising system and method
JP2020518936A (en) Method, system, and device for detecting user interaction
AU2019271906A1 (en) Systems and methods for merchandizing electronic displays
WO2017030177A1 (en) Exhibition device, display control device and exhibition system
US20200118077A1 (en) Systems, Method and Apparatus for Optical Means for Tracking Inventory
US20130317903A1 (en) Integrated system for displaying items and for measurable promotional communication
US20200250736A1 (en) Systems, method and apparatus for frictionless shopping
US20230016554A1 (en) Electronic Shelf-Tag Systems and Methods Thereof
JP6981002B2 (en) Display shelves, information processing equipment, programs and display shelf systems
US20210295341A1 (en) System and Methods for User Authentication in a Retail Environment
KR20180062619A (en) Method, system and non-transitory computer-readable recording medium for managing digital signage
US20200118078A1 (en) Systems, Method and Apparatus for Automated and Intelligent Inventory Stocking
US20150103248A1 (en) Configurable advertising and content rendering
JP2024055919A (en) Intelligent Marketing and Advertising Platform
GB2547534A (en) Audio/visual recording apparatus, audio/visual recording and playback system and methods for the same

Legal Events

Date Code Title Description
AS Assignment

Owner name: ADROIT WORLDWIDE MEDIA, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SCHUMACHER, GREG;HOWARD, KEVIN;MIRGOLI, EMAD;SIGNING DATES FROM 20191107 TO 20191110;REEL/FRAME:058977/0026

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION