US20160110791A1 - Method, computer program product, and system for providing a sensor-based environment

Info

Publication number: US20160110791A1
Authority: United States (US)
Prior art keywords: person, field, item, view, items
Legal status: Abandoned
Application number: US14/590,240
Inventors: Dean Frederick Herring, Monsak Jason Chirakansakcharoen, Ankit Singh
Current Assignee: Toshiba Global Commerce Solutions Holdings Corp
Original Assignee: Toshiba Global Commerce Solutions Holdings Corp
Priority to US201462064323P
Application filed by Toshiba Global Commerce Solutions Holdings Corp
Priority to US14/590,240
Assigned to TOSHIBA GLOBAL COMMERCE SOLUTIONS HOLDINGS CORPORATION (assignment of assignors interest). Assignors: HERRING, DEAN FREDERICK; CHIRAKANSAKCHAROEN, MONSAK JASON; SINGH, ANKIT
Publication of US20160110791A1
Application status: Abandoned

Classifications

    • G06Q30/0623 Electronic shopping: item investigation
    • G01G19/4144 Weighing apparatus using electronic computing means for controlling weight of goods in commercial establishments, e.g. supermarket, P.O.S. systems
    • G06F1/163 Wearable computers, e.g. on a belt
    • G06F3/012 Head tracking input arrangements
    • G06F3/013 Eye tracking input arrangements
    • G06K9/00221 Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
    • G06K9/00624 Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G06K9/78 Combination of image acquisition and recognition functions
    • G06Q10/087 Inventory or stock management, e.g. order filling, procurement, balancing against orders
    • G06Q20/18 Payment architectures involving self-service terminals [SSTs], vending machines, kiosks or multimedia terminals
    • G06Q20/208 Point-of-sale [POS] network systems: input by product or record sensing, e.g. weighing or scanner processing
    • G06Q20/209 Point-of-sale [POS] network systems: specified transaction journal output feature, e.g. printed receipt or voice output
    • G06Q20/322 Aspects of commerce using mobile devices [M-devices]
    • G06Q30/016 Customer service, i.e. after purchase service
    • G06Q30/0224 Discounts or incentives, e.g. coupons, rebates, offers or upsales, based on user history
    • G06Q30/0235 Discounts or incentives including timing, i.e. limited awarding or usage time constraint
    • G06Q30/0251 Targeted advertisement
    • G06Q30/0269 Targeted advertisement based on user profile or attribute
    • G06Q30/0609 Electronic shopping: buyer or seller confidence or verification
    • G06Q30/0629 Item investigation directed with specific intent or strategy for generating comparisons
    • G06Q30/0631 Item recommendations
    • G06Q30/0633 Lists, e.g. purchase orders, compilation or processing
    • G06Q30/0635 Processing of requisition or of purchase orders
    • G06Q30/0639 Item locations
    • G06T7/0069
    • G06T7/2093
    • G06T7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G07G1/0045 Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader
    • G07G1/0054 Checkout procedures with a code reader, with control of supplementary check-parameters, e.g. weight or number of articles
    • G07G1/0063 Checkout procedures with a code reader, with means for detecting the geometric dimensions of the article of which the code is read, such as its size or height, for the verification of the registration
    • G07G1/0072 Checkout procedures with a code reader, with means for detecting the weight of the article of which the code is read, for the verification of the registration
    • G01B11/02 Measuring arrangements characterised by the use of optical means for measuring length, width or thickness
    • G01G19/40 Weighing apparatus with provisions for indicating, recording, or computing price or other quantities dependent on the weight
    • G06K2209/17 Recognition of food, fruit, vegetables
    • G06T2207/30232 Surveillance (indexing scheme for image analysis)

Abstract

Method, computer program product, and system to influence a person within an environment having a plurality of items for selection. The method includes capturing, using a first visual sensor disposed within the environment, field of view information for the person, performing analysis on the field of view information using a computing device, and identifying, based on the analysis, one or more first items of the plurality of items that are included within the field of view of the person. The method further includes selecting, based on the identified one or more items, at least one second item of the plurality of items for presentation to the person; and presenting information related to the at least one second item to the person.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims benefit of U.S. provisional patent application Ser. No. 62/064,323, filed Oct. 15, 2014, entitled “Integrated Shopping Environment,” which is herein incorporated by reference.
  • BACKGROUND
  • The present disclosure relates to a sensor-based environment, and more specifically, to providing an adaptive personal experience within such an environment using a determined field of view of the person.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • FIG. 1 illustrates an integrated shopping environment, according to one embodiment.
  • FIG. 2 illustrates a shopping environment system, according to one embodiment.
  • FIG. 3 illustrates an integrated shopping environment, according to one embodiment.
  • FIG. 4 illustrates a system for influencing a shopping experience based on a customer field of view, according to one embodiment.
  • FIGS. 5A and 5B illustrate an example wearable computing device for use in a shopping environment, according to one embodiment.
  • FIGS. 6A-6C illustrate determining a customer field of view and identifying items included within the field of view, according to one embodiment.
  • FIG. 7 illustrates several example views of determining a customer focus on an item within the customer's field of view, according to one embodiment.
  • FIGS. 8A-8C illustrate several example presentations of information related to an item, according to one embodiment.
  • FIG. 9A illustrates a method of influencing a shopping experience of a customer within a shopping environment, according to one embodiment.
  • FIG. 9B illustrates a method of determining content to present to a customer based on a customer interest score, according to one embodiment.
  • To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures. It is contemplated that elements disclosed in one embodiment may be beneficially utilized on other embodiments without specific recitation. The illustrations referred to here should not be understood as being drawn to scale unless specifically noted. Also, the drawings are often simplified and details or components omitted for clarity of presentation and explanation. The drawings and discussion serve to explain principles discussed below, where like designations denote like elements.
  • DETAILED DESCRIPTION
  • Aspects of the current disclosure relate to an integrated environment capable of providing a personalized, automated, and adaptive experience for a person within the environment. A number of different sensor devices may be employed within the environment, and networked with various computing devices such as point-of-sale (POS) terminals, digital signage, servers, and mobile or handheld computing devices to provide a seamless integration of mobile technologies and e-commerce into a traditional experience.
  • Using one or more visual sensors within the environment, a retailer or other provider may determine a person's field of view, and may compile personal behaviors and determine personal preferences. This data may then be used to provide timely, tailored recommendations in real-time to the person in order to more effectively influence their experience. While generally discussed within the context of a shopping environment, such as a retail store or other commercial environment, it is contemplated that the techniques disclosed herein may be applied to other environments (some non-limiting examples include libraries, museums, classrooms, hospitals, etc.) to provide an adaptive experience for persons included therein.
  • FIG. 1 illustrates an integrated shopping environment, according to one embodiment. As shown, environment 100 includes a plurality of terminals 105, a plurality of servers 110₁, 110₂ coupled to a network 115, one or more sensors 120 of different types, one or more user devices 140, and one or more other devices 150. In some embodiments, the environment 100 may be integrated into the layout of a retail store, market, or other commercial environment that is known or hereinafter developed.
  • Terminals 105 generally include any structure that is capable of receiving input from customers and/or producing output to customers within the environment 100. The terminals 105 may include computing systems, portions of computing systems, or devices controllable by computing systems. In one example, a terminal may include a computing device that is communicatively coupled with a visual display and audio speaker(s), as well as being communicatively coupled with one or more input devices. In another example, a terminal may include a visual display and associated driver hardware, but a computing device coupled to the terminal and providing data for display is disposed separately from the terminal. In some embodiments, terminals 105 may be implemented as standalone devices, such as a kiosk disposed on the store floor or a monolithic device disposed on a shelf or platform. In some embodiments, terminals 105 may be integrated partially or wholly with other components of the environment 100, such as input or output devices included with shelving or other structural components in the environment (e.g., components used for product display or storage). In some embodiments, terminals 105 may be modular and may be easily attached to and detached from elements of the environment 100, such as the structural components.
  • Generally, terminals 105 may be distributed throughout the environment 100 and may enhance various phases of the shopping experience for customers. For example, terminals 105 may include digital signage 108 disposed throughout the environment, such as included in or near aisles, endcaps, displays, and/or shelving in the environment. A customer may view and/or interact with the digital signage 108 as he/she moves through the store environment. The digital signage may be included in a static display or may be movable, such as including digital signage within a customer's shopping cart or basket. Terminals 105 may also include POS terminals 106 that provide a checkout functionality, allowing the customer to complete his/her shopping transaction (e.g., make payment for selected items). In some embodiments, terminals 105 may provide an integrated functionality. For example, the terminals may function in one mode as digital signage, and when engaged by a customer, the terminals function as a POS terminal.
  • Servers 110₁, 110₂ generally include processors, memory, and communications capabilities, and may perform various computing tasks to support the commercial operation of the environment 100. Servers 110₁, 110₂ communicate using various wired and/or wireless communications methods with terminals 105, sensors 120, and with other networked devices such as user devices 140 and other devices 150. Servers 110₁, 110₂ generally execute computer program code in which input data is received from networked devices, the input data is processed and/or stored by the servers, and output data is provided to networked devices for operation of the environment 100.
  • Sensors 120 may include various sensor devices, such as video sensors 125, audio sensors 130, and other sensors 135. The other sensors 135 generally include any sensors that are capable of providing meaningful information about customer interactions with the environment, e.g., location sensors, weight sensors, and so forth. Sensors 120 may be deployed throughout the environment 100 in fixed and/or in movable locations. For example, sensors 120 may be statically included in walls, floors, ceilings, displays, or other devices, or may be included in shopping carts or baskets capable of being transported around the environment. In one embodiment, sensors 120 may include adjustable position sensor devices, such as motorized cameras attached to a rail, wire, or frame. In one embodiment, sensors 120 may be included on one or more unmanned vehicles, such as unmanned ground vehicles (UGVs) or unmanned aerial vehicles (UAVs or “drones”). Sensors 120 may also include sensor devices that are included in user devices 140 or other devices 150 (which in some cases may include body-worn or carried devices). User devices 140 and other devices 150 may include passive or actively-powered devices capable of communicating with at least one of the networked devices of environment 100. One example of a passive device (which may be worn or carried) is a near-field communication (NFC) tag. Active devices may include mobile computing devices, such as smartphones or tablets, or wearable devices such as a Google Glass™ interactive eyepiece (Glass is a trademark of Google Inc.). The term user devices 140 generally denotes devices owned or possessed by customers, while the term other devices 150 denotes devices owned or possessed by the retailer or other administrator of the environment 100. In some cases, other devices 150 may be carried by employees and used in the course of their employment. User devices 140 and other devices 150 may execute applications or other program code that generally enables various features provided by the servers and/or other networked computing devices.
  • FIG. 2 illustrates a shopping environment system, according to one embodiment. Generally, the system 200 corresponds to the environment 100 described above. System 200 includes one or more processors 205, memory 210, and input/output 250, which are interconnected using one or more connections 240. In one embodiment, system 200 may be included in a singular computing device, and the connection 240 may be a common bus. In other embodiments, system 200 is distributed and includes a plurality of discrete computing devices that are connected through wired or wireless networking. Processors 205 may include any processing element suitable for performing functions described herein, and may include single or multiple core processors, as well as combinations thereof. Processors 205 may be included in a single computing device, or may represent an aggregation of processing elements included across a number of networked devices such as user devices 140, POS terminals 106, etc.
  • Memory 210 may include a variety of computer-readable media selected for their size, relative performance, or other capabilities: volatile and/or non-volatile media, removable and/or non-removable media, etc. Memory 210 may include cache, random access memory (RAM), storage, etc. Storage included as part of memory 210 may typically provide a non-volatile memory for the networked computing devices (e.g., servers 110₁, 110₂), and may include one or more different storage elements such as Flash memory, a hard disk drive, a solid state drive, an optical storage device, and/or a magnetic storage device. Memory 210 may be included in a single computing device or may represent an aggregation of memory included in networked devices. Memory 210 may include a plurality of modules 211 for performing various functions described herein. The modules 211 include program code that is executable by one or more of the processors 205. As shown, modules 211 include user identification 212, item identification 214, advertising 216, recommendations 218, virtual cart 220, assistance 222, security 224, power management 226, gaming 228, audit 230, loyalty program 232, and inventory 234. The modules 211 may also interact to perform certain functions. For example, a loyalty program module 232 during operation may make calls to user identification module 212, item identification module 214, advertising module 216, and so forth. The person of ordinary skill will recognize that the modules provided here are merely non-exclusive examples; different functions and/or groupings of functions may be included as desired to suitably operate the shopping environment. Memory 210 may also include customer profiles 236 and customer images 238, which may be accessed and/or modified by various modules 211. In one embodiment, the customer profiles 236 and customer images 238 may be stored on the servers 110₁, 110₂ or on a separate database.
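As a concrete, simplified illustration of how modules 211 might interoperate (e.g., the loyalty program module calling the user and item identification modules), consider the following minimal Python sketch. The registry class, function names, and return values are hypothetical; the disclosure does not prescribe any implementation.

```python
# Hypothetical sketch of a module registry for memory 210. Module names
# follow the disclosure; the registry API and stub behaviors are invented.
from typing import Callable, Dict

class ModuleRegistry:
    """Maps module names to callables so modules can invoke one another."""

    def __init__(self) -> None:
        self._modules: Dict[str, Callable[..., object]] = {}

    def register(self, name: str, fn: Callable[..., object]) -> None:
        self._modules[name] = fn

    def call(self, name: str, *args, **kwargs) -> object:
        return self._modules[name](*args, **kwargs)

registry = ModuleRegistry()
registry.register("user_identification", lambda image: "customer-42")
registry.register("item_identification", lambda image: ["item-315"])

# A loyalty-program module might call other modules, as the text describes:
customer = registry.call("user_identification", image=None)
items = registry.call("item_identification", image=None)
print(customer, items)
```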
  • Input/output (I/O) 250 may include a number of different devices interfacing with various computing devices and with the shopping environment. I/O 250 includes sensors 120, which have been described above. I/O 250 may further include input devices 252 and output devices 254 that are included to enhance the shopping experience for customers. In one embodiment, terminals 105, user devices 140, and other devices 150 may include visual displays and/or audio speakers (examples of the output devices 254), and various input devices 252 (such as cameras, keyboards or keypads, touchscreens, buttons, inertial sensors, etc.). I/O 250 may further include wired or wireless connections to an external network 256 using I/O adapter circuitry. Network 256 may include one or more networks of various types, including a local area or local access network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet). In one embodiment, various networked computing devices of the system 200 are interconnected using a LAN, and one or more computing devices (e.g., servers 110₁, 110₂, user devices 140) include connections to the Internet.
  • FIG. 3 illustrates an integrated shopping environment, according to one embodiment. The environment 300 includes a plurality of sensor modules 302 disposed in the ceiling 301 of the store. The sensor modules 302 may each include one or more types of sensors, such as video sensors (e.g., cameras), audio sensors (e.g., microphones), and so forth. Sensor modules 302 may also include actuating devices for providing a desired sensor orientation. Sensor modules or individual sensors may generally be disposed at any suitable location within the environment 300. Some non-limiting examples of alternative locations include below, within, or above the floor 330, within other structural components of the environment 300 such as a shelving unit 303 or walls, and so forth. In some embodiments, sensors may be disposed on, within, or near product display areas such as shelving unit 303. The sensors may also be oriented toward an expected location of a customer interaction with items, to provide better data about a customer's interaction, such as determining a customer's field of view.
  • Environment 300 also includes a number of kiosks (or terminals) 305. Generally, kiosks 305 may be configured for performing customer checkout and/or other shopping functions. Each kiosk 305 may include computing devices or portions of computing systems, and may include various I/O devices, such as visual displays, audio speakers, cameras, microphones, etc. for interacting with the customer. In some embodiments, a customer 340 may have a mobile computing device, such as smartphone 345, that communicatively couples with the kiosk 305 for completing the checkout transaction. In one embodiment, the mobile computing device may execute a store application that is connected to the networked computing systems (e.g., through servers 110₁, 110₂), or may be directly connected to kiosk 305 through wireless networks within the environment (e.g., over Wi-Fi or Bluetooth). In one embodiment, the mobile computing device may couple to the kiosk 305 when brought within range, e.g., using Bluetooth or NFC.
  • Environment 300 also includes one or more shelving units 303 having shelves 310 that support various store items 315. Though not shown, multiple shelving units 303 may be disposed in a particular arrangement in the environment 300, and the space between adjacent shelving units may form aisles through which customers may travel. In some embodiments, the shelving unit 303 may include visual sensors or other sensor devices or I/O devices. The sensors or devices may couple to a customer's smartphone 345 and/or other networked computing devices (including servers) within the environment 300. For example, the front portions 320 of shelves 310 may include video sensors oriented outward from the shelving unit 303 to capture customer interactions with items 315 on the shelving unit 303, and the data from the video sensors may be provided to back-end servers for storage and/or analysis. In some embodiments, portions of the shelving unit 303 (such as the front portions 320 of shelves 310) may include indicator lights or other visual display devices or audio output devices that are used to communicate with a customer.
  • FIG. 4 illustrates a system for influencing a shopping experience based on a customer field of view, according to one embodiment. System 400 may be used in coordination with the various shopping environments described herein. Generally, system 400 may share at least portions of several components with the shopping environment system 200, such as processors 205, memory 210, and I/O 250. System 400 may also utilize one or more of the modules 211 to provide various aspects of the system's functionality, such as item identification module 214, advertising module 216, recommendation module 218, and so on.
  • I/O 250 includes output devices 254 and sensors 120. Output devices 254 include one or more devices for presenting information to customers, such as audio output devices 440 and visual output devices 445. The audio output devices 440 may include conventional audio speakers having any suitable form factor (e.g., in a stereo, headphones, etc.), as well as devices using alternative methods of producing sound to a customer, such as bone conduction transducers in a worn device. Visual output devices 445 may include visual displays and various visual indicators such as light emitting diodes (LEDs). Other output devices 450 may provide information to customers through tactile feedback (e.g., haptic devices) or other sensory stimuli. Sensors 120 may include visual sensors 455, which may be carried or worn sensors 460 or distributed sensors 465 disposed throughout the shopping environment. Other sensors 470 may be included that are suitable for collecting information about a customer and his/her interactions within the shopping environment. Examples of other sensors 470 may include infrared (IR) sensors, thermal sensors, weight sensors, capacitive sensors, magnetic sensors, sonar sensors, radar sensors, lidar sensors, and so forth.
  • The visual sensors 455 may be used to capture one or more images 415 of the customer and/or the shopping environment, which may include views from various perspectives (e.g., a customer-worn visual sensor, static or movable visual sensors at various locations in the environment). The images 415 may be stored in memory 210, and may be individually or collectively processed to determine information about customers in the environment and their respective interactions with items in the environment.
  • Memory 210 includes one or more programs 435 that collectively receive data about customers and the shopping environment, process the received data, and transmit information to customers in order to influence the customers' shopping experiences. Programs 435 may include program code to determine a customer's field of view at a given time, including which items are included in the field of view. In one embodiment, the customer's field of view may be determined directly. For example, a body-worn device may include a visual sensor (i.e., a worn visual sensor 460) that, when the device is worn, gives the visual sensor an orientation that is similar to the orientation of the customer's head or eyes (e.g., a forward-looking camera). Images captured from the worn visual sensor may generally reflect the customer's field of view.
  • In some embodiments, the customer's field of view may be estimated (or determined indirectly) using other sensor measurements. In one embodiment, the customer's field of view may be estimated by determining the orientation of one or both of the customer's eyes. Eye orientation may be determined using worn visual sensors 460 (e.g., an inward-facing camera on a head-worn device) and/or distributed visual sensors 465 (e.g., capturing images of the customer's face and image processing to determine an eye orientation). In other embodiments, the customer's field of view may be estimated by determining the position and/or orientation of the customer's head and/or body using various visual sensor measurements. The customer's field of view may be represented in any suitable data format, such as an image or as coordinate data (e.g., Cartesian, polar, spherical coordinates).
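As a simplified illustration of the coordinate-data representation mentioned above, the following Python sketch models an estimated field of view as an eye position, a gaze angle measured against a reference direction, and an angular extent. The 2-D geometry, class name, and half-angle value are assumptions made for this example; the disclosure permits any suitable format, including 3-D or spherical coordinates.

```python
# Minimal sketch: an estimated field of view as an origin, a gaze angle,
# and an angular extent, in 2-D overhead (x, y) coordinates. All names
# and the half-angle value are illustrative assumptions.
import math
from dataclasses import dataclass

@dataclass
class FieldOfView:
    x: float            # eye position (metres, store coordinates)
    y: float
    gaze: float         # gaze direction, radians from the x-axis
    half_angle: float   # half of the angular extent of the view

    def contains(self, px: float, py: float) -> bool:
        """True if point (px, py) lies within the angular extent."""
        bearing = math.atan2(py - self.y, px - self.x)
        # Wrap the difference into [-pi, pi] before comparing.
        diff = (bearing - self.gaze + math.pi) % (2 * math.pi) - math.pi
        return abs(diff) <= self.half_angle

fov = FieldOfView(x=0.0, y=0.0, gaze=0.0, half_angle=math.radians(30))
print(fov.contains(2.0, 0.5))   # True: roughly straight ahead
print(fov.contains(0.0, 2.0))   # False: 90 degrees off-axis
```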
  • While it is possible that a single visual sensor 455 may be used to determine a customer's field of view, several embodiments employ a combination of a plurality of visual sensors 455 to determine a field of view. These embodiments may be preferred as providing additional data to support a more accurate estimate of the field of view. Additionally, the plurality of visual sensors used to determine a field of view may include visual sensors selected from different categories (worn sensors 460, distributed sensors 465) to provide additional robustness to the collected data.
  • Programs 435 may also include program code to identify one or more items included within the customer's field of view. The items may be identified directly, or their inclusion may be estimated. One example of direct identification is performing image processing on images collected from a worn, forward-looking camera to visually identify one or more items. Estimating items within a customer's field of view may require combining sensor data with known information about the shopping environment, such as the items in the environment (item data 420) and their relative arrangement or layout within the environment (location data 425).
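A minimal sketch of the indirect approach follows, assuming the item data 420 and location data 425 are available as a simple planogram of 2-D coordinates. The data layout, item names, and the cone test are simplifying assumptions; the actual data structures are not specified by the disclosure.

```python
# Sketch of indirect item identification: intersect an estimated view cone
# with known item locations (a planogram-style stand-in for item data 420
# and location data 425). All names and coordinates are hypothetical.
import math

location_data = {            # item id -> (x, y) position in store coordinates
    "cereal_brand_x": (3.0, 0.2),
    "cereal_brand_y": (3.0, 1.0),
    "soup_can": (0.5, 4.0),
}

def items_in_view(eye, gaze, half_angle, locations):
    """Return ids of items whose bearing from the eye is within the cone."""
    visible = []
    for item_id, (ix, iy) in locations.items():
        bearing = math.atan2(iy - eye[1], ix - eye[0])
        diff = (bearing - gaze + math.pi) % (2 * math.pi) - math.pi
        if abs(diff) <= half_angle:
            visible.append(item_id)
    return visible

print(items_in_view((0.0, 0.0), 0.0, math.radians(25), location_data))
# ['cereal_brand_x', 'cereal_brand_y'] -- the soup can is far off-axis
```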
  • Programs 435 may also include program code to present information to customers based on the identified one or more items. In some embodiments, the information may be used to influence the customer's shopping experience. The information presented to the customer may include information about the identified items (e.g., nutritional data, pricing, a customer's purchase history of the items, etc.), information encouraging the purchase of identified items (e.g., bringing particular items to the customer's attention, touting the items' features, offering discounts or other promotions on the items, etc.), and information encouraging the purchase of alternatives to the identified items (e.g., highlighting differences of the items, offering discounts, etc.).
  • To present relevant and persuasive information in real-time to a particular customer, programs 435 may access and analyze various additional data in memory 210 that is related to the customer and perhaps other customers of the shopping environment. In one embodiment, programs 435 may analyze shopping data 430 collected from previous shopping experiences for the particular customer and/or for other customers or groups of customers. For example, shopping data 430 may include customer views and the items included therein, progressions of customer views (showing a customer's interest over time), selection or purchase history for items, and so forth. While shopping data 430 may be compiled and used to generate information to present to customers and influence their shopping experiences in real-time, the shopping data 430 may also be used by the retailer or administrator of the shopping environment to modify the layout of the environment. The shopping data 430 may help the retailer identify trends in customer shopping, and to optimize placement of items within the environment to improve customer sales.
  • System 400 may also present information to customers based on their personal preferences 405. The preferences 405 may generally be stored in a corresponding customer profile 236, and may reflect preferences that are explicitly specified by a customer, or may be determined based on the customer's historical shopping behavior (e.g., included in shopping data 430). For example, a customer may have an allergy to a particular ingredient, and the customer may enter this allergy information in preferences 405, e.g., using a mobile phone app for the retailer. Accordingly, the system 400 when determining which information to present to the customer may present information that highlights items within the customer's field of view that include the ingredient, and may further suggest alternative items that do not include the ingredient.
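The allergy example above can be sketched as a simple filter over the identified in-view items. The dictionary layout, item names, and function signature below are assumptions made for illustration; the disclosure does not specify how preferences 405 or item data are stored.

```python
# Hypothetical sketch of the allergy example: flag in-view items containing
# an ingredient from the customer's preferences and suggest alternatives.
item_data = {
    "granola_a": {"ingredients": {"oats", "peanuts", "honey"}},
    "granola_b": {"ingredients": {"oats", "honey"}},
    "trail_mix": {"ingredients": {"peanuts", "raisins"}},
}

def screen_for_allergy(in_view, allergen, catalog):
    """Split the world into flagged in-view items and safe alternatives."""
    flagged = [i for i in in_view if allergen in catalog[i]["ingredients"]]
    alternatives = [i for i in catalog if allergen not in catalog[i]["ingredients"]]
    return flagged, alternatives

flagged, safe = screen_for_allergy(["granola_a", "trail_mix"], "peanuts", item_data)
print("contains allergen:", flagged)   # ['granola_a', 'trail_mix']
print("suggest instead:", safe)        # ['granola_b']
```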
  • A customer's shopping history may also be used to determine customer preferences 405. For example, the customer's determined fields of view and purchase history from shopping data 430 may be processed to deduce which items, combinations of items, and/or aspects of items are preferred by the customer. For example, preferred aspects might include preferred brands, costs, quantities, sizes, ingredients, nutritional properties (e.g., calorie content, fat, sugar, vitamins, minerals, etc.), and so forth. For example, the customer may specify a preference for low-fat foods, and the system may determine recommended items based on the items included in the customer's field of view and the customer's preferences. This may include suggesting a particular item within the field of view for purchase (or alternatively, an item located outside the field of view) and/or advising the customer about the item's properties vis-à-vis the customer's preferences (e.g., reporting fat content).
  • Of course, a customer's preferences may be included as a logical combination of a plurality of these aspects (e.g., a customer prefers Brand X items to Brand Y, so long as the cost of the Brand X item is no more than 150% of the Brand Y item). In some embodiments, other customers' shopping data may also be used to deduce a particular customer's preferences. Of course, the preferences may be dynamically updated to identify whether deduced preferences are accurate or not. The dynamic updating may be caused by the customer's explicit indication and/or by the customer's shopping patterns following the deduced preference. For example, the system 400 may deduce that a customer has a categorical preference for Brand X items over similar Brand Y items. However, the customer's historical shopping data indicated that the customer looked at a Brand X item (e.g., field of view data) before selecting and purchasing a similar Brand Y item (e.g., field of view data and/or purchase history data). The system in response may adapt or may entirely remove the deduced (and determined inaccurate) preference.
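The Brand X / Brand Y example can be made concrete as a logical rule plus a confidence value that decays when observed behavior contradicts the deduced preference. The 150% threshold comes from the example above; the confidence mechanics, learning rates, and names are assumptions for this sketch only.

```python
# Sketch of a deduced preference as a rule plus a confidence score that is
# reinforced or decayed by shopping behavior, and retired when too weak.
def prefers_x(price_x: float, price_y: float, ratio: float = 1.5) -> bool:
    """Deduced rule: prefer Brand X unless it costs more than 150% of Brand Y."""
    return price_x <= ratio * price_y

class DeducedPreference:
    def __init__(self, confidence: float = 0.8):
        self.confidence = confidence

    def observe(self, followed_rule: bool) -> None:
        # Reinforce when behavior matches the rule, decay otherwise.
        self.confidence += 0.1 if followed_rule else -0.3
        self.confidence = max(0.0, min(1.0, self.confidence))

    @property
    def active(self) -> bool:
        return self.confidence >= 0.5

pref = DeducedPreference()
pref.observe(followed_rule=False)   # looked at Brand X, bought Brand Y
pref.observe(followed_rule=False)
print(prefers_x(2.99, 2.49), pref.active)   # rule fires, but is now retired
```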
  • In some embodiments, system 400 may present information to customers that is also based on other programs 410. Examples of programs 410 may include fitness, nutrition, or health goals, money management goals, etc. In some embodiments, the programs 410 themselves may be integrated into a store application and accessible by the customer's mobile computing device or wearable device. In other embodiments, the store application may interface with applications from other providers to determine the customer's goals and present appropriate information during the shopping experience. For example, the system 400 could include a nutrition-oriented program, and may make suggestions for more nutritious items to a customer who is looking at junk food items (e.g., candy).
  • FIGS. 5A and 5B illustrate an example wearable computing device for use in a shopping environment, according to one embodiment. Portions of wearable computing device 500 may be head-worn or worn on other portions of the body. The wearable computing device 500 includes a band 505 and several other structural components. The band 505 may be used as a structural frame, supporting other portions while itself being supported by a customer's head when worn. Other structural components may include nose pieces 520, ear piece 515, and enclosure 512. Enclosure 512 may include a computing device 525, which includes video I/O components 530 and audio I/O components 540. The enclosure formed by the ear piece 515 may also house components, such as a battery 545 for powering the computing device 525. Although not shown, computing device 525 also includes wireless networking components for communicating with other computing devices. The video I/O components 530 may include a forward-looking camera 513 that provides an estimate of the wearer's field of view based on their head orientation, and a transparent prism 514 that is used to project light onto the wearer's retina, displaying information to the wearer over the wearer's natural field of view. Other video I/O components may include an inward-looking camera that is configured to capture the eye orientation of the wearer, a conventional display device (e.g., an LCD), and so forth. Audio I/O components 540 may include one or more microphones and/or audio output devices, such as speakers or bone-conducting transducers.
  • FIG. 5B illustrates the wearable computing device 500 as worn. As shown, scene 550 includes the wearer's natural view 560. A portion of the area of the natural view 560 may be used for displaying an overlay 565 (e.g., using the prism 514) that provides additional information to the wearer. The display of the overlay 565 may be adequately transparent to permit the wearer to continue to observe their natural view through the overlay area. As shown, the overlay 565 includes a map view of the wearer's current location (e.g., the wearer is at W 34th street). In some embodiments, information may be selected and visually presented in text and/or graphics to a customer wearing such a device to influence his/her shopping experience.
  • FIGS. 6A-6C illustrate determining a customer field of view and identifying items included within the field of view, according to one embodiment. In scene 600, a shelving unit 603 is depicted having a plurality of shelves 610 that each support and display a number of items 612 that are available for selection and purchase by a customer.
  • Within the scene 600 are defined a customer's field of view 615 and an area 605 outside the customer field of view. In one embodiment, the customer's field of view 615 may be represented by an image captured from a forward-looking camera. While shown as generally rectangular, the customer field of view 615 may have any suitable alternative shape and size. For example, the customer's actual vision may encompass a significantly larger area, but determining the field of view for purposes of the shopping environment may include applying a threshold or weighting scheme that emphasizes areas that are closer to the center of a customer's vision. Of course, data provided by various visual sensors and/or other sensors may be used to make these determinations.
  • The field of view 615 may include a plurality of fully included items 620, as well as a plurality of partially included items 625. When determining which items to identify as “included” in the field of view, certain embodiments may categorically include or exclude the partially included items 625. An alternative embodiment may rely on image processing to determine whether a partially included item 625 should be identified as included. For example, if the processing cannot recognize the particular item with a certain degree of confidence, the item may be excluded. In another alternative embodiment, partially included items 625 may be included, and the amount of item inclusion (e.g., the percentage of surface area of the item included) may be used to determine a customer focus or calculate a customer interest score, which are discussed further below.
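One way to make the "amount of item inclusion" and the center-weighting idea concrete is to compute the fraction of an item's bounding box that overlaps the field-of-view rectangle, then down-weight it by distance from the view center. The rectangle convention, weighting function, and constants below are illustrative assumptions, not the disclosed method.

```python
# Sketch of an interest-score input: the fraction of an item's bounding box
# inside the field-of-view rectangle, down-weighted by distance from the
# view center. Rectangles are (x1, y1, x2, y2) in image coordinates.
def inclusion_fraction(item, view):
    ix1, iy1, ix2, iy2 = item
    vx1, vy1, vx2, vy2 = view
    w = max(0.0, min(ix2, vx2) - max(ix1, vx1))
    h = max(0.0, min(iy2, vy2) - max(iy1, vy1))
    item_area = (ix2 - ix1) * (iy2 - iy1)
    return (w * h) / item_area if item_area else 0.0

def interest_weight(item, view):
    """Emphasize items near the center of the view, as the text suggests."""
    icx, icy = (item[0] + item[2]) / 2, (item[1] + item[3]) / 2
    vcx, vcy = (view[0] + view[2]) / 2, (view[1] + view[3]) / 2
    half_diag = ((view[2] - view[0]) ** 2 + (view[3] - view[1]) ** 2) ** 0.5 / 2
    dist = ((icx - vcx) ** 2 + (icy - vcy) ** 2) ** 0.5
    return max(0.0, 1.0 - dist / half_diag)

view = (0, 0, 100, 60)
item = (90, 10, 110, 30)            # half in view, off to the right
frac = inclusion_fraction(item, view)
print(frac, frac * interest_weight(item, view))   # 0.5, then a smaller score
```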
  • Items that are fully or partially included in the customer field of view 615 may be recognized by performing image processing techniques on images captured by various visual sensors. For example, images that include the items may be compared against stock item images stored in a database or server. To aid image processing, items may also include markers or distinctive symbols, some of which may include item identification data such as barcodes or quick response (QR) codes. Of course, other processing techniques may be employed to recognize a particular item, such as textual recognition, determining the item's similarity to adjacent items, and so forth.
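For the marker-based variant, a generic sketch using OpenCV's built-in QR detector is shown below. This is one common way to decode such markers, not the disclosed implementation; the image path is a placeholder and requires the opencv-python package.

```python
# Generic sketch of marker-based item identification using OpenCV's QR
# detector (not the patent's implementation). The image path is a placeholder.
import cv2

def identify_item_by_marker(image_path: str):
    """Return decoded item identification data from a QR code, if any."""
    image = cv2.imread(image_path)
    if image is None:
        return None
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(image)
    return data or None   # empty string means no code was decoded

print(identify_item_by_marker("shelf_capture.png"))
```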
  • Scene 650 of FIG. 6B depicts a customer 660 in a shopping environment. The customer 660 is standing in an aisle 655 adjacent to a shelving unit 603, which has a plurality of shelves 610. Visual sensors may capture one or more images of scene 650 from various spatial perspectives, and the images may be used to determine the customer's field of view. Specifically, various aspects of the scene that are captured in the images may be used to estimate the customer's field of view.
  • In one embodiment, the relative position and/or orientation of portions of the customer's body may be determined. In one embodiment, the position and orientation of the customer's eyes 680 may be determined. For example, eye position within the environment may be determined in Cartesian coordinates (i.e., determining x, y, and z-direction values) and eye orientation may be represented by an angle α defined relative to a reference direction or plane (such as horizontal or an x-y plane corresponding to a particular value of z). In other embodiments, other portions of the customer's body may (also) be used to determine the field of view, such as the position and orientation of the customer's head 665, or of one or both shoulders 670. In other embodiments, the customer's interaction with the shelving unit 603 by extending her arm 675 may also be captured in one or more images, and the direction of the extended arm may be used to determine her field of view.
  • Of course, embodiments may use combinations of various aspects of the scene to determine the customer's field of view. In some embodiments, the combinations may be weighted; for example, data showing a customer 660 reaching out her arm 675 towards a specific item may be weighted more heavily to determine her field of view than the orientation of her shoulders. In some embodiments, the weights may be dynamically updated based on the customer's shopping behaviors following an estimate of the customer's field of view. For example, if a customer reached for (or selected) an item that was not included in the determined field of view, the system may adjust the relative weighting in order to accurately capture the customer's field of view. This adjustment may include determining correlation values between particular captured aspects of the scene to the selected item; for example, the customer's head may be partly turned towards the selected item, but their eye orientation may generally be more closely tied to the selected item. In some embodiments, the correlation values may be more useful where one or more aspects of the scene cannot be determined (e.g., the system may be unable to determine eye orientation for a customer wearing sunglasses, non-optimal visual sensor positioning, etc.).
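The weighted combination and the correlation-driven weight adjustment can be sketched as a circular weighted mean over whatever cues are available, with weight shifted toward the cue that best predicted an actual selection. The cue names follow the text; the initial weights and the update rule are assumptions for illustration.

```python
# Sketch of weighted fusion of body-pose cues into one gaze estimate, with
# a crude weight update when a selection falls outside the estimate.
import math

weights = {"eyes": 0.6, "head": 0.25, "arm": 0.1, "shoulders": 0.05}

def fuse_gaze(cues: dict) -> float:
    """Circular weighted mean of the available cue angles (radians)."""
    avail = {k: v for k, v in cues.items() if v is not None}
    total = sum(weights[k] for k in avail)
    sx = sum(weights[k] * math.cos(a) for k, a in avail.items())
    sy = sum(weights[k] * math.sin(a) for k, a in avail.items())
    return math.atan2(sy / total, sx / total)

def reinforce(cues: dict, selection_bearing: float, lr: float = 0.1) -> None:
    """Shift weight toward cues that pointed closest to the selected item."""
    avail = {k: v for k, v in cues.items() if v is not None}
    if len(avail) < 2:
        return
    best = min(avail, key=lambda k: abs(avail[k] - selection_bearing))
    for k in avail:
        weights[k] += lr if k == best else -lr / (len(avail) - 1)
        weights[k] = max(0.01, weights[k])

cues = {"eyes": None, "head": 0.3, "arm": 0.8, "shoulders": 0.1}  # sunglasses
print(round(fuse_gaze(cues), 3))
reinforce(cues, selection_bearing=0.75)   # customer grabbed an off-estimate item
print(weights)
```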
  • Scene 685 of FIG. 6C illustrates an overhead view of several customers 660 in a shopping environment. In one embodiment, the view of scene 685 may be represented by an image captured from a ceiling-mounted camera or from a drone.
  • Certain additional aspects depicted in scene 685 and captured in images may be used to estimate a customer's field of view. In one example, the orientation of customer 660A may be estimated using the relative position of his/her shoulders 670. As shown, a line connecting the two shoulders may be compared to a reference direction or plane (e.g., parallel to the length of shelving unit 603A) and represented by an angle β. In another example, the orientation of customer 660B may be estimated using the orientation of his/her head 665, comparing a direction of the customer's head to a reference direction or plane, which may be represented by an angle γ. Images may also capture a customer 660C interacting with the shelving unit 603B, and the position and/or orientation of the customer's arm 675 may be used to determine the customer's field of view.
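  • The shoulder-line angle β can be computed from two overhead shoulder positions with basic trigonometry; this sketch assumes 2D floor-plane coordinates, and notes that the facing direction is one of the two normals to the shoulder line:

```python
import math

def shoulder_angle_beta(left_shoulder, right_shoulder, reference_deg=0.0):
    """Angle between the line connecting the shoulders (overhead view,
    floor-plane coordinates) and a reference direction such as the long
    axis of a shelving unit. A line has 180-degree symmetry."""
    dx = right_shoulder[0] - left_shoulder[0]
    dy = right_shoulder[1] - left_shoulder[1]
    return (math.degrees(math.atan2(dy, dx)) - reference_deg) % 180.0

def facing_candidates(beta):
    """The customer faces along one of the two normals to the shoulder
    line; a head or arm cue would disambiguate between the two."""
    return ((beta + 90.0) % 360.0, (beta - 90.0) % 360.0)
```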
  • FIG. 7 illustrates several example views of determining a customer focus on an item within the customer's field of view, according to one embodiment. While the computing system may determine information to present to a customer based on all of the items identified within a determined field of view, in some embodiments it may be advantageous to further identify one or more items on which the customer is specifically focused. Presenting information based on the customer-focused items may generally provide more relevant and more persuasive information to influence the customer's shopping experience.
  • Items 715A-D are included on a shelf 710 of a shelving unit 703. The determined field of view 705 of a customer may include different groups of items at different times, and the progression of the customer's field of view over time may help determine which item(s) within the field of view are specifically being focused on by the customer. Generally, the customer's focus on a particular item may indicate that the item is merely attracting the customer's attention, or that the customer is deciding whether or not to purchase the item. Either way, understanding the object of a customer's focus may help retailers or suppliers to improve packaging and placement of items or to influence the customer's shopping experience in real-time.
  • View 1 illustrates a field of view 705 that includes items 715A, 715B, and a portion of 715C. In one embodiment, items 715A and 715B may be included in a customer focus determination due to their full inclusion within the field of view 705. Conversely, item 715C may be excluded from a customer focus for being only partially included in the field of view 705. In an alternative embodiment, all three items may be included by virtue of being at least partially included in the field of view 705. In some embodiments, a customer focus on particular items may be a time-based determination. For example, if the customer's field of view 705 remained relatively steady during a preset amount of time (e.g., 5 or 10 seconds), such that both items 715A and 715B remained within the field of view during this time, the computing system may determine that the customer is focused on items 715A and 715B. In some embodiments, the particular item(s) must continuously remain in the field of view 705 during the preset amount of time (e.g., remain across several consecutive samples of the field of view during this time).
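  • A dwell-time determination of this kind might be sketched as follows, assuming the field of view is sampled periodically and each sample yields the set of identifiers of items currently in view (the 5-second window is one of the example values above):

```python
import time

class FocusTracker:
    """Flags items as customer-focused once they have remained in every
    sampled field of view for a preset dwell time (5 seconds here)."""

    def __init__(self, dwell_seconds=5.0):
        self.dwell = dwell_seconds
        self.first_seen = {}  # item_id -> time the item entered (and stayed in) the view

    def update(self, visible_ids, now=None):
        now = time.time() if now is None else now
        # An item that drops out of any sample loses its accumulated dwell time.
        self.first_seen = {i: t for i, t in self.first_seen.items() if i in visible_ids}
        for item_id in visible_ids:
            self.first_seen.setdefault(item_id, now)
        # Items continuously present for the full window are "focused".
        return {i for i, t in self.first_seen.items() if now - t >= self.dwell}
```

Each field-of-view sample would pass its set of recognized item identifiers to update(), and the returned set would drive the focus determination.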
  • View 2 illustrates a field of view 705 that includes item 715B and portions of items 715A and 715C. View 2 could represent the same field of view as View 1 at a later time. In some embodiments, the customer's focus may be determined to include item 715B but not 715A. For example, the system could determine that 715B is the lone customer-focused item based on its relatively central position within the field of view 705, or perhaps based on the changes to the field of view from View 1 (i.e., a shift away from previously fully included item 715A). In one embodiment, item 715B must also remain within the field of view 705 for the predetermined amount of time to be considered a customer-focused item. Of course, other methods are possible for identifying one or more relatively prominent items within a field of view to determine a customer's focus, such as determining the percentage of the item that is included within the field of view, determining the percentage of the field of view occupied by the item, and so forth. In alternative embodiments, the customer-focused items may still include item 715A and/or item 715C.
  • View 3 illustrates a field of view 705 that includes item 715B, and portions of items 715A and 715C. View 3 differs from View 2 in that the field of view 705 is “closer” to the shelving unit 703 and items 715 in View 3 than in View 2. View 3 could represent the same field of view as View 2 at a later time, e.g., as the customer moves towards the shelving unit 703. Here, the system could determine that 715B is a customer-focused item based on its relatively central position within the field of view 705 and/or based on the changes to the field of view from View 2 (i.e., item 715B occupies an increased percentage of the field of view 705).
  • View 4 also illustrates a field of view 705 that includes item 715B, and portions of items 715A and 715C. View 4 could represent the same field of view as View 3 at a later time. In View 4, the customer has selected item 715B and is holding the item in his/her hand 725. Here, the system could determine that 715B is a customer-focused item based on its relatively central position within the field of view 705, based on the changes to the field of view from View 3 (e.g., item 715B occupies an increased percentage of the field of view 705), and/or based on the presence of a portion 730 of the customer's hand 725 included in the field of view 705.
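  • The prominence cues discussed across Views 2-4 (central position, share of the field of view, presence of the customer's hand) could be folded into a single score along these lines; the bounding-box representation and the weights are assumptions for illustration:

```python
import math

def prominence_score(item_box, view_box, hand_present=False):
    """Score one item's prominence within the field of view from the cues
    in Views 2-4: centrality, share of the view occupied by the item's
    visible portion, and whether the customer's hand appears with the item.
    Boxes are (x1, y1, x2, y2); 'item_box' is clipped to the view."""
    vx, vy = (view_box[0] + view_box[2]) / 2.0, (view_box[1] + view_box[3]) / 2.0
    ix, iy = (item_box[0] + item_box[2]) / 2.0, (item_box[1] + item_box[3]) / 2.0
    half_diag = math.hypot(view_box[2] - view_box[0], view_box[3] - view_box[1]) / 2.0
    centrality = 1.0 - min(math.hypot(ix - vx, iy - vy) / half_diag, 1.0)

    view_area = (view_box[2] - view_box[0]) * (view_box[3] - view_box[1])
    item_area = (item_box[2] - item_box[0]) * (item_box[3] - item_box[1])
    coverage = min(item_area / view_area, 1.0)

    return 0.4 * centrality + 0.4 * coverage + (0.2 if hand_present else 0.0)
```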
  • In some embodiments, a customer interest score may be determined for items included within the customer's field of view. The customer interest score is generally time-based, and the score may be used to interject desired information at opportune times in order to influence a customer's shopping decisions (e.g., whether to buy a particular item) in real-time. In one embodiment, a customer interest score is calculated and updated concurrently with determining a customer focus, and an item's customer interest score exceeding a threshold value may be used to determine that the customer has focused on that item. The customer interest score may also be adjusted based on various other aspects shown in Views 1-4, such as position within the field of view (e.g., a central position may indicate more interest), percentage of the field of view occupied (e.g., a larger percentage may indicate more interest), customer selection and/or manipulation of the item, and so forth. Of course, the calculation of a customer's interest score may reflect the customer's particular shopping behaviors (e.g., if a customer is determined to be 95% likely to pick up and manipulate a customer-focused item within her field of view, the mere fact of picking up an item might not indicate an increased level of interest in the item).
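  • A sketch of such a score, assuming periodic sampling and a per-customer pickup likelihood drawn from shopping history (all coefficients and the threshold are illustrative assumptions):

```python
class InterestScore:
    """Time-based interest score for one item. The score accumulates while
    the item stays in view, weighted by its prominence, and discounts events
    the customer performs for almost every item (a shopper who picks up 95%
    of focused items gains little score from yet another pickup)."""

    def __init__(self, pickup_rate=0.5):
        self.score = 0.0
        self.pickup_rate = pickup_rate  # from the customer's shopping history

    def on_sample(self, prominence, dt_seconds):
        self.score += prominence * dt_seconds  # dwell time scaled by prominence

    def on_pickup(self):
        # Events that are rare for this particular customer carry more information.
        self.score += 2.0 * (1.0 - self.pickup_rate)

    def is_focused(self, threshold=3.0):
        return self.score >= threshold
```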
  • After determining that a customer has focused on a particular item, the customer interest score may generally decrease over time as it becomes more likely that the customer will not purchase the particular item. To maintain or improve customer interest, and to thereby increase the likelihood that the customer will purchase the item, the system may provide information about the item at desired times (corresponding to customer interest scores). This may include providing information about the items or alternatives, and/or offering promotional pricing on the item (or an alternative item). In one embodiment, to provide more effective promotions (e.g., more customer-tailored and more timely), and to prevent customer abuse of a time-based promotional pricing scheme, the customer shopping history may include data regarding the number of promotions already offered to the customer, the length of time (or customer interest score) before offering the promotion, the customer utilization rate of the previous promotions, etc.
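  • The timing logic might then resemble the following sketch, where the decay rate, trigger band, history keys, and abuse limits are illustrative assumptions rather than values from the disclosure:

```python
def should_offer_promotion(score, elapsed_since_focus, history,
                           decay_rate=0.1, trigger_band=(1.0, 2.0),
                           max_offers=3, min_utilization=0.2):
    """Decide whether now is the moment to present a promotion. Interest
    decays after focus is established; the promotion is offered while
    interest is fading but not yet lost, and only if the customer's
    promotion history does not suggest abuse of time-based offers."""
    decayed = score - decay_rate * elapsed_since_focus
    offers = history.get("offers", 0)   # promotions already offered to the customer
    used = history.get("used", 0)       # promotions the customer actually redeemed
    in_band = trigger_band[0] <= decayed <= trigger_band[1]
    under_limit = offers < max_offers
    # Skip customers who accumulate offers but rarely redeem them.
    utilization_ok = offers == 0 or used / offers >= min_utilization
    return in_band and under_limit and utilization_ok
```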
  • FIGS. 8A-8C illustrate several example presentations of information related to an item, according to one embodiment. As discussed above, the presentation of information may be based upon some or all of the items that are identified as being within the customer's field of view. For ease of description, FIG. 8A illustrates a single item 805 within the customer's field of view 705; this could also represent an example in which one item is identified from several items as being a customer-focused item. However, similar techniques may be applied to a field of view or a customer focus that includes more than one item. In some embodiments, the computing system may present information about a particular item to the customer, where the selection of that information is based on the item 805. The presented information may relate to the same item 805 and/or to one or more different items. The different items may include other items located within the customer's field of view 705, and may include items (such as item 810) that are not included within the field of view 705. In some embodiments, the information may be determined and timely presented so as to influence the customer to purchase the particular item(s).
  • FIG. 8B illustrates several example presentations of information to a customer. The presentations may be delivered to the customer in real-time in any suitable manner, such as a visual output (e.g., graphical and/or textual), audio-based output, and/or other sensory output (e.g., haptic). While the presentations 815, 820, 825 are shown as textual presentations for simplicity, the presentations may be spoken or presented in any suitable alternative manner. In one example, a presentation may be displayed by a wearable computing device of the customer, such as a visual overlay on a display of the wearable computing device as discussed above with respect to FIGS. 5A and 5B. In another example, a presentation may be transmitted as a message to the customer's mobile computing device, causing text or graphics to be displayed on the mobile computing device and causing the device to vibrate.
  • Presentation 815 indicates a customer preference for a second item 810 over the currently viewed or focused item 805. Presentation 815 generally suggests that the customer may also prefer the second item 810. The customer preference may be determined based on compiled previous shopping data for other customers. The shopping data may include separate statistics for the two items, for example, based on historical purchase data for each item. The shopping data may also include compiled information that directly interrelates the two items; for example, in previous shopping experiences, other customers also focused on item 805 (e.g., based on determined fields of view) but ultimately declined to purchase item 805 in favor of item 810.
  • Presentation 820 indicates that a promotion is available for the second item 810. Presentation 820 may reflect a sales promotion or prices already in effect (e.g., a reporting function), or the promotion may be dynamically generated to encourage the customer to purchase item 810. For example, manufacturers or stores may wish to promote item 810 over item 805, and the promotion may be presented to the customer in real-time. As discussed above, in some embodiments, the promotion may be determined, as well as the timing of its presentation to the customer, based on a determined level of customer interest in an item.
  • Presentation 825 indicates that item 805 would have a negative impact on the customer's goals. The presentation 825 may be based on customer profile data such as customer preferences and/or programs. Examples of programs include fitness, nutrition, or health goals, money management goals, etc. Although not shown here, one embodiment may present to the customer the information upon which the negative impact determination was made. If the system determines that item 805 is incompatible with specified goals or preferences, or that better alternative items exist that are consistent with the customer's goals/preferences, the system may recommend one or more alternative items for presentation to the customer.
  • FIG. 8C illustrates several example presentations of information to a customer. Like the presentations discussed above, these presentations may be delivered to the customer in real-time in any suitable manner, such as a visual output (e.g., graphical and/or textual), audio-based output, and/or other sensory output (e.g., haptic). The presentations may be output using a computing device worn or carried by the customer, or using output hardware (e.g., displays, speakers, etc.) disposed within the shopping environment. For example, LEDs or other visual display elements may be included in the shelving units, such as display elements 835a-b and 836c-d.
  • As shown, a single item 805 is included within a customer's field of view 705, and item 810 is located outside the field of view. Of course, similar techniques may be applied to a field of view or a customer focus that includes more than one item. In cases where the item 805 (already included in the field of view 705) is recommended for presentation to the customer, the system may display various indicators that highlight the item 805 to the user. For example, if the field of view 705 represents the customer's view through a head-worn computing device such as wearable computing device 500, the system may display colors, shapes, etc. that visually enhance the item 805, or may reduce the visibility of other items in the field of view (e.g., gray out). The system may also display location information of the item 805, for example, by illuminating one or more of the display elements 835a-b nearest to the item 805.
  • In cases where an item is recommended for presentation that is not disposed within the field of view 705 (e.g., item 810), the system may output directional information from the current field of view 705 to the item. For example, the system may display an arrow 830 overlay indicating the relative direction from the field of view 705 or item 805 to the target item 810. The system may also provide directional information using one or more of the display elements 835, 836. For example, the display elements 835, 836 may be illuminated in sequence from left to right to guide the customer's eye toward item 810. Of course, the various types of information discussed here (e.g., highlighting, location, and directional) may additionally or alternatively be provided to the customer by non-visual means, such as audio or haptic outputs.
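  • As one possible sketch of this guidance, where the coordinate convention, arrow-angle representation, and element-ordering rule are assumptions for illustration:

```python
import math

def guidance_to_item(view_center, item_pos, display_elements):
    """Directional guidance toward an item outside the field of view: the
    overlay arrow angle for a head-worn display, plus the order in which
    shelf display elements should illuminate to lead the eye across.
    Positions are 2D (x, y) coordinates in the shelf plane."""
    dx = item_pos[0] - view_center[0]
    dy = item_pos[1] - view_center[1]
    arrow_deg = math.degrees(math.atan2(dy, dx))
    # Illuminate elements in spatial order toward the item: left-to-right
    # when the item lies to the right of the view, right-to-left otherwise.
    lit_sequence = sorted(display_elements, key=lambda e: e[0], reverse=dx < 0)
    return arrow_deg, lit_sequence
```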
  • FIG. 9A illustrates a method of influencing a shopping experience of a customer within a shopping environment, according to one embodiment. Method 900 may generally be used in conjunction with the various environments and systems discussed above. Method 900 begins at block 905, where a field of view is determined for the customer. Determining a field of view may include a “direct” determination, such as capturing an image from a body-worn, forward-looking camera (i.e., an image that strongly corresponds to the customer's eye gaze). Alternatively (or in addition to the direct determination), the field of view may be determined “indirectly” by capturing images of the shopping environment and performing image processing to determine various aspects. The aspects may include the determined position and orientation of one or more of the customer's body, head, shoulders, eyes, arms, hands, etc. The images may be captured by visual sensors that are worn by the customer or disposed throughout the shopping environment.
  • At block 915, one or more first items that are included within the customer's field of view are identified. In one embodiment, identifying items may include performing image processing on a captured image that reflects the customer's field of view (e.g., from a forward-looking camera). In another embodiment, identifying items may include referencing coordinate data determined for the customer's field of view against a known store layout that includes item location information. In such an embodiment, image processing may not be necessary to identify the one or more first items.
  • At block 925, at least one second item is selected for presentation to the customer. The selection is based on the identified one or more first items. In one embodiment, at least one of the second item(s) may be included in the identified first items. In other embodiments, each of the second item(s) may be different than the identified first items. The selection may be based on decisional logic for recommending a particular item for sale, based on historical shopping data for the customer (and/or other customers) and/or customer preferences or programs.
  • At block 935, information related to the at least one second item is presented to the customer. Presenting information may be done in any suitable manner, such as displaying textual or graphical information in a visual overlay, displaying directional information to the customer, or outputting sound or other feedback. The information may be related to the properties of the second item (e.g., price, nutrition information, etc.) or to historical shopping data, promotions, or other customer-specific preferences or programs. The timing of the presentation may be determined based on customer interest scores or other timing considerations that are collectively intended to influence a customer's shopping purchases.
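  • Blocks 905-935 might be strung together as in the following sketch, which uses the layout-lookup variant of block 915 on plain data structures; the function names, box representation, and the simple recommendation table are assumptions for illustration:

```python
def items_in_view(view_box, item_locations):
    """Block 915 via the layout-lookup variant: intersect the field-of-view
    rectangle with known item locations instead of running image processing.
    Boxes are (x1, y1, x2, y2); 'item_locations' maps item -> box."""
    x1, y1, x2, y2 = view_box
    return [item for item, (ix1, iy1, ix2, iy2) in item_locations.items()
            if ix1 < x2 and ix2 > x1 and iy1 < y2 and iy2 > y1]

def influence_shopping_experience(view_box, item_locations, recommendations):
    """One pass of method 900 on plain data: identify the first items within
    the determined field of view (block 915), map them to second items
    through a recommendation table (block 925), and return the messages to
    present (block 935)."""
    first_items = items_in_view(view_box, item_locations)
    second_items = {recommendations.get(item, item) for item in first_items}
    return ["Consider {}".format(item) for item in sorted(second_items)]
```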
  • FIG. 9B illustrates a method of determining content to present to a customer based on a customer interest score, according to one embodiment. Method 940 may generally be used in conjunction with the various shopping environments and systems discussed above. In one embodiment, method 940 is performed as part of performing method 900. Method 940 begins at block 945, where a customer focus is determined on at least one item selected from one or more items that are included within the customer's field of view. The customer focus may be time-based, such as requiring the item to remain within the field of view for a minimum amount of time.
  • At block 955, a customer interest score is determined for the at least one customer-focused item. The customer interest score may also be time-based, and may be influenced by various characteristics of the customer's interaction with the item, which may be observed through the determined customer field of view. In one embodiment, the customer interest score may be determined concurrently with determining the customer focus on an item. In one embodiment, the customer-focused item is determined based on the customer interest score for that item reaching a predetermined threshold value. In another embodiment, the customer-focused item is determined as the item whose score exceeds the scores of the other items within the field of view.
  • At block 965, the content of the information to present to the customer is determined based on the determined customer interest score. Determining the content may include determining whether to present information for the customer-focused item or for an alternate item, whether or not to present a promotion for the item or the alternate item, determining an amount of the promotion, and so forth.
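  • One illustrative way block 965 might map an interest score to content; the thresholds and the discount sizing are assumptions, not values from the disclosure:

```python
def determine_content(item, interest_score, alternates,
                      promo_threshold=2.0, alternate_threshold=0.8):
    """Block 965 sketch: high interest -> plain information about the
    focused item; fading interest -> a promotion sized to how far interest
    has fallen; low interest -> suggest an alternate item instead."""
    if interest_score >= promo_threshold:
        return {"item": item, "type": "information"}
    if interest_score >= alternate_threshold:
        discount = min(25, int(5 + 10 * (promo_threshold - interest_score)))
        return {"item": item, "type": "promotion", "percent_off": discount}
    alternate = alternates[0] if alternates else item
    return {"item": alternate, "type": "alternate-suggestion"}
```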
  • The descriptions of the various embodiments of the present disclosure have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
  • In the preceding, reference is made to embodiments presented in this disclosure. However, the scope of the present disclosure is not limited to specific described embodiments. Instead, any combination of the following features and elements, whether related to different embodiments or not, is contemplated to implement and practice contemplated embodiments. Furthermore, although embodiments disclosed herein may achieve advantages over other possible solutions or over the prior art, whether or not a particular advantage is achieved by a given embodiment is not limiting of the scope of the present disclosure. Thus, the following aspects, features, embodiments and advantages are merely illustrative and are not considered elements or limitations of the appended claims except where explicitly recited in a claim(s). Likewise, reference to “the invention” shall not be construed as a generalization of any inventive subject matter disclosed herein and shall not be considered to be an element or limitation of the appended claims except where explicitly recited in a claim(s).
  • Aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.”
  • The present disclosure may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
  • The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
  • Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
  • These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
  • Embodiments of the disclosure may be provided to end users through a cloud computing infrastructure. Cloud computing generally refers to the provision of scalable computing resources as a service over a network. More formally, cloud computing may be defined as a computing capability that provides an abstraction between the computing resource and its underlying technical architecture (e.g., servers, storage, networks), enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. Thus, cloud computing allows a user to access virtual computing resources (e.g., storage, data, applications, and even complete virtualized computing systems) in “the cloud,” without regard for the underlying physical systems (or locations of those systems) used to provide the computing resources.
  • Typically, cloud computing resources are provided to a user on a pay-per-use basis, where users are charged only for the computing resources actually used (e.g., an amount of storage space consumed by a user or a number of virtualized systems instantiated by the user). A user can access any of the resources that reside in the cloud at any time, and from anywhere across the Internet. In context of the present disclosure, a user may access applications (e.g., a retail store app for a mobile computing device) or related data (e.g., compiled shopping data) available in the cloud. Doing so allows a user to access this information from any computing system attached to a network connected to the cloud (e.g., the Internet).
  • While the foregoing is directed to embodiments of the present disclosure, other and further embodiments of the disclosure may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.

Claims (20)

What is claimed is:
1. A computer-implemented method to influence a person within an environment having a plurality of items for selection, the method comprising:
capturing, using a first visual sensor disposed within the environment, field of view information for the person;
performing analysis on the field of view information using a computing device;
identifying, based on the analysis, one or more first items of the plurality of items that are included within the field of view of the person;
selecting, based on the identified one or more first items, at least one second item of the plurality of items for presentation to the person; and
presenting information related to the at least one second item to the person.
2. The computer-implemented method of claim 1, wherein capturing field of view information for the person includes using the first visual sensor to determine at least one of an eye orientation, a head orientation, and a body orientation of the person.
3. The computer-implemented method of claim 1, wherein performing analysis on the field of view information includes comparing the field of view information with location data for the plurality of items of the environment.
4. The computer-implemented method of claim 1, wherein performing analysis on the field of view information includes performing image processing upon one or more images of portions of the environment captured using at least a second visual sensor disposed within the environment.
5. The computer-implemented method of claim 1, further comprising:
determining a focus of the person on at least one item selected from the one or more first items included within the person's field of view, wherein determining a focus is based on inclusion of the at least one selected item within the person's field of view for at least a predetermined first length of time.
6. The computer-implemented method of claim 5, further comprising:
calculating an interest score of the person for the at least one selected item based on a length of time of inclusion within the person's field of view.
7. The computer-implemented method of claim 6, wherein presenting information related to the at least one second item includes determining content of the information based on the interest score.
8. The computer-implemented method of claim 1, wherein selecting the at least one second item is based on data compiled from one or more previous experiences of one or more persons.
9. The computer-implemented method of claim 1, wherein selecting the at least one second item is based on personal preferences included in a personal profile associated with the person.
10. The computer-implemented method of claim 1, wherein the at least one second item is not included within the person's field of view, and wherein presenting information related to the at least one second item includes indicating a direction to a location of the at least one second item within the environment.
11. A computer program product to influence a person within an environment having a plurality of items for selection, the computer program product comprising:
a computer-readable storage medium having computer-readable program code embodied therewith, the computer-readable program code executable by one or more computer processors to:
capture, using a first visual sensor disposed within the environment, field of view information for the person;
perform analysis on the field of view information using a computing device;
identify one or more first items of the plurality of items that are included within the field of view of the person;
select, based on the identified one or more first items, at least one second item of the plurality of items for presentation to the person; and
present information related to the at least one second item to the person.
12. The computer program product of claim 11, wherein performing analysis on the field of view information includes performing image processing upon one or more images of portions of the environment captured using at least a second visual sensor disposed within the environment.
13. The computer program product of claim 11, wherein the computer-readable program code is further executable to:
determine a focus of the person on at least one item selected from the one or more first items included within the person's field of view, wherein determining a focus is based on inclusion of the at least one selected item within the person's field of view for at least a predetermined first length of time.
14. The computer program product of claim 13, wherein the computer-readable program code is further executable to:
calculate an interest score of the person for the at least one selected item based on a length of time of inclusion within the person's field of view.
15. The computer program product of claim 14, wherein presenting information related to the at least one second item includes determining content of the information based on the interest score.
16. A system to influence a person within an environment having a plurality of items for selection, the system comprising:
one or more computer processors;
a first visual sensor disposed within the environment and communicatively coupled with the one or more computer processors; and
a memory containing a program which, when executed by the one or more computer processors, performs an operation comprising:
capturing, using a first visual sensor disposed within the environment, field of view information for the person;
performing analysis on the field of view information using a computing device;
identifying, based on the analysis, one or more first items of the plurality of items that are included within the field of view of the person;
selecting, based on the identified one or more first items, at least one second item of the plurality of items for presentation to the person; and
presenting information related to the at least one second item to the person.
17. The system of claim 16, wherein the first visual sensor is included in a body-worn computing device of the person.
18. The system of claim 16, further comprising a plurality of visual sensors distributed throughout the environment, and wherein performing analysis on the field of view information includes performing image processing upon one or more images of portions of the environment captured using the plurality of visual sensors.
19. The system of claim 16, further comprising an output device communicatively coupled with the one or more computer processors, wherein the output device is configured to present the information related to the at least one second item to the person using at least one of audio and visual output.
20. The system of claim 19, wherein the output device is one of a handheld computing device and a body-worn computing device of the person.
US14/590,240 2014-10-15 2015-01-06 Method, computer program product, and system for providing a sensor-based environment Abandoned US20160110791A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201462064323P 2014-10-15 2014-10-15
US14/590,240 US20160110791A1 (en) 2014-10-15 2015-01-06 Method, computer program product, and system for providing a sensor-based environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US14/590,240 US20160110791A1 (en) 2014-10-15 2015-01-06 Method, computer program product, and system for providing a sensor-based environment

Publications (1)

Publication Number Publication Date
US20160110791A1 true US20160110791A1 (en) 2016-04-21

Family

ID=55748790

Family Applications (18)

Application Number Title Priority Date Filing Date
US14/590,240 Abandoned US20160110791A1 (en) 2014-10-15 2015-01-06 Method, computer program product, and system for providing a sensor-based environment
US14/644,888 Abandoned US20160110700A1 (en) 2014-10-15 2015-03-11 Transaction audit suggestions based on customer shopping behavior
US14/659,128 Active 2037-01-24 US10482724B2 (en) 2014-10-15 2015-03-16 Method, computer program product, and system for providing a sensor-based environment
US14/659,169 Active 2035-05-24 US9679327B2 (en) 2014-10-15 2015-03-16 Visual checkout with center of mass security check
US14/663,190 Active 2036-11-26 US10417878B2 (en) 2014-10-15 2015-03-19 Method, computer program product, and system for providing a sensor-based environment
US14/673,390 Active US9424601B2 (en) 2014-10-15 2015-03-30 Method, computer program product, and system for providing a sensor-based environment
US14/675,025 Pending US20160110799A1 (en) 2014-10-15 2015-03-31 Method, computer program product, and system for providing a sensor-based environment
US14/674,845 Active US9786000B2 (en) 2014-10-15 2015-03-31 Method, computer program product, and system for providing a sensor-based environment
US14/675,206 Abandoned US20160110751A1 (en) 2014-10-15 2015-03-31 Method, computer program product, and system for providing a sensor-based environment
US14/675,161 Active 2035-06-27 US9842363B2 (en) 2014-10-15 2015-03-31 Method, computer program product, and system for producing combined image information to provide extended vision
US14/674,922 Abandoned US20160110793A1 (en) 2014-10-15 2015-03-31 Method, computer program product, and system for providing a sensor-based environment
US14/674,776 Active 2037-06-21 US10176677B2 (en) 2014-10-15 2015-03-31 Method, computer program product, and system for providing a sensor-based environment
US14/883,198 Pending US20160110703A1 (en) 2014-10-15 2015-10-14 Method, product, and system for identifying items for transactions
US14/883,146 Pending US20160110701A1 (en) 2014-10-15 2015-10-14 Method, product, and system for unmanned vehicles in retail environments
US14/883,178 Active 2037-02-02 US10157413B2 (en) 2014-10-15 2015-10-14 Method of using, apparatus, product, and system for a no touch point-of-sale self-checkout
US15/837,507 Pending US20180108074A1 (en) 2014-10-15 2017-12-11 Method, computer program product, and system for producing combined image information to provide extended vision
US16/201,194 Pending US20190096198A1 (en) 2014-10-15 2018-11-27 Method of using apparatus, product, and system for a no touch point-of-sale self-checkout
US16/241,610 Pending US20190139375A1 (en) 2014-10-15 2019-01-07 Method, computer program product, and system for providing a sensor-based environment


Country Status (1)

Country Link
US (18) US20160110791A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018208671A1 (en) * 2017-05-08 2018-11-15 Walmart Apollo, Llc Uniquely identifiable customer traffic systems and methods

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9139363B2 (en) 2013-03-15 2015-09-22 John Lert Automated system for transporting payloads
US10129507B2 (en) * 2014-07-15 2018-11-13 Toshiba Global Commerce Solutions Holdings Corporation System and method for self-checkout using product images
US20160110791A1 (en) * 2014-10-15 2016-04-21 Toshiba Global Commerce Solutions Holdings Corporation Method, computer program product, and system for providing a sensor-based environment
US9330474B1 (en) 2014-12-23 2016-05-03 Ricoh Co., Ltd. Distinguishing between stock keeping units using a physical dimension of a region depicted in an image
US20160260142A1 (en) 2015-03-06 2016-09-08 Wal-Mart Stores, Inc. Shopping facility assistance systems, devices and methods to support requesting in-person assistance
WO2016142794A1 (en) 2015-03-06 2016-09-15 Wal-Mart Stores, Inc Item monitoring system and method
IN2015CH01602A (en) * 2015-03-28 2015-04-24 Wipro Ltd
US10373190B2 (en) 2015-05-13 2019-08-06 Shelfbucks, Inc. System and methods for determining location of pop displays with wireless beacons through engagement with mobile devices
US9697560B2 (en) * 2015-05-21 2017-07-04 Encompass Technologies Llp Product palletizing system
US10489863B1 (en) * 2015-05-27 2019-11-26 United Services Automobile Association (Usaa) Roof inspection systems and methods
US10435241B2 (en) 2015-06-02 2019-10-08 Alert Innovation Inc. Storage and retrieval system
US9571738B2 (en) * 2015-06-23 2017-02-14 Toshiba Tec Kabushiki Kaisha Image processing apparatus
US9911290B1 (en) * 2015-07-25 2018-03-06 Gary M. Zalewski Wireless coded communication (WCC) devices for tracking retail interactions with goods and association to user accounts
US10318976B2 (en) * 2015-07-28 2019-06-11 Walmart Apollo, Llc Methods for determining measurement data of an item
US20170090195A1 (en) * 2015-09-25 2017-03-30 Intel Corporation Selective object filtering devices, systems and methods
US20170140314A1 (en) * 2015-11-17 2017-05-18 Target Brands, Inc. Planogram resetting using augmented reality in a retail environment
US20170178103A1 (en) * 2015-12-16 2017-06-22 Samsung Electronics Co., Ltd. Guided Positional Tracking
US9875548B2 (en) * 2015-12-18 2018-01-23 Ricoh Co., Ltd. Candidate list generation
US10041827B2 (en) * 2015-12-21 2018-08-07 Ncr Corporation Image guided scale calibration
US20170195572A1 (en) * 2016-01-06 2017-07-06 Orcam Technologies Ltd. Systems and methods for automatically varying privacy settings of wearable camera systems
CA2961938A1 (en) 2016-04-01 2017-10-01 Wal-Mart Stores, Inc. Systems and methods for moving pallets via unmanned motorized unit-guided forklifts
US10366144B2 (en) * 2016-04-01 2019-07-30 Ebay Inc. Analyzing and linking a set of images by identifying objects in each image to determine a primary image and a secondary image
US10331964B2 (en) * 2016-05-23 2019-06-25 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Trunk inventory detector
US10083358B1 (en) * 2016-07-26 2018-09-25 Videomining Corporation Association of unique person to point-of-sale transaction data
US20180032990A1 (en) * 2016-07-28 2018-02-01 Ncr Corporation Item location detection on scales
US10445791B2 (en) * 2016-09-08 2019-10-15 Walmart Apollo, Llc Systems and methods for autonomous assistance and routing
US10438164B1 (en) * 2016-09-27 2019-10-08 Amazon Technologies, Inc. Merging events in interactive data processing systems
WO2018081782A1 (en) * 2016-10-31 2018-05-03 Caliburger Cayman Devices and systems for remote monitoring of restaurants
EP3542327A1 (en) * 2016-11-17 2019-09-25 Alert Innovation Inc. Automated-service retail system and method
WO2018165093A1 (en) * 2017-03-07 2018-09-13 Walmart Apollo, Llc Unmanned vehicle in shopping environment
FR3070086B1 (en) * 2017-08-08 2019-08-30 Safran Identity & Security Fraud detection for access control by facial recognition
JP2019032276A (en) * 2017-08-09 2019-02-28 株式会社DSi Metering system, electronic force balance, and electronic force balance marker
JP2019045311A (en) * 2017-09-01 2019-03-22 東芝テック株式会社 Register
UA127479U (en) * 2017-12-18 2018-08-10 Юрій Юрійович Голузинець Automated system for identification and personalized communications with consumers of goods and services
FR3077261A1 (en) * 2018-02-01 2019-08-02 Eddy Gouraud Supermarket shopping cart integrating artificial intelligence with visual object recognition
USD848530S1 (en) 2018-03-14 2019-05-14 Tambria Wagner Sign
JP2019164418A (en) * 2018-03-19 2019-09-26 日本電気株式会社 Adjustment system, adjustment method, and program
CN108520605A (en) * 2018-03-23 2018-09-11 阿里巴巴集团控股有限公司 Self-service shopping risk control method and system
CN108875690A (en) * 2018-06-29 2018-11-23 百度在线网络技术(北京)有限公司 Unmanned Retail commodity identifying system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080228577A1 (en) * 2005-08-04 2008-09-18 Koninklijke Philips Electronics, N.V. Apparatus For Monitoring a Person Having an Interest to an Object, and Method Thereof
US20110143779A1 (en) * 2009-12-11 2011-06-16 Think Tek, Inc. Providing City Services using Mobile Devices and a Sensor Network
US20130054377A1 (en) * 2011-08-30 2013-02-28 Nils Oliver Krahnstoever Person tracking and interactive advertising
US20130060843A1 (en) * 2010-03-29 2013-03-07 Rakuten, Inc. Server apparatus, information providing method, information providing program, recording medium recording the information providing program, and information providing system
US20130085345A1 (en) * 2011-09-30 2013-04-04 Kevin A. Geisner Personal Audio/Visual System Providing Allergy Awareness
US20130290107A1 (en) * 2012-04-27 2013-10-31 Soma S. Santhiveeran Behavior based bundling
US20140372211A1 (en) * 2013-06-14 2014-12-18 International Business Machines Corporation Real-time advertisement based on common point of attraction of different viewers

Family Cites Families (185)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4327819A (en) 1980-08-01 1982-05-04 Coutta John M Object detection system for a shopping cart
JPH0533941Y2 (en) * 1984-02-16 1993-08-27
US4964053A (en) * 1988-04-22 1990-10-16 Checkrobot, Inc. Self-checkout of produce items
US5235509A (en) * 1989-06-28 1993-08-10 Management Information Support, Inc. Customer self-ordering system using information displayed on a screen
US5425140A (en) * 1992-03-20 1995-06-13 International Business Machines Corporation Method and apparatus for providing conditional cascading in a computer system graphical user interface
AT181219T (en) 1993-02-05 1999-07-15 S T O P International Brighton Monitoring device for shopping carts for the control of
US5485006A (en) 1994-01-28 1996-01-16 S.T.O.P. International (Brighton) Inc. Product detection system for shopping carts
US5497314A (en) * 1994-03-07 1996-03-05 Novak; Jeffrey M. Automated apparatus and method for object recognition at checkout counters
EP0709658B1 (en) * 1994-05-13 2001-09-12 ISHIDA CO., Ltd. Combination weighing apparatus
US5666157A (en) 1995-01-03 1997-09-09 Arc Incorporated Abnormality detection and surveillance system
EP0727760A3 (en) * 1995-02-17 1997-01-29 Ibm Produce size recognition system
JP3276547B2 (en) 1995-12-01 2002-04-22 シャープ株式会社 Image recognition method
US6092725A (en) 1997-01-24 2000-07-25 Symbol Technologies, Inc. Statistical sampling security methodology for self-scanning checkout system
US5973699A (en) * 1996-09-19 1999-10-26 Platinum Technology Ip, Inc. System and method for increasing the performance for real-time rendering of three-dimensional polygonal data
US5910769A (en) 1998-05-27 1999-06-08 Geisler; Edwin Shopping cart scanning system
US6513015B2 (en) * 1998-09-25 2003-01-28 Fujitsu Limited System and method for customer recognition using wireless identification and visual data transmission
US6268882B1 (en) * 1998-12-31 2001-07-31 Elbex Video Ltd. Dome shaped camera with simplified construction and positioning
US8391851B2 (en) 1999-11-03 2013-03-05 Digimarc Corporation Gestural techniques with wireless mobile phone devices
AUPQ212499A0 (en) 1999-08-10 1999-09-02 Ajax Cooke Pty Ltd Item recognition method and apparatus
US6250671B1 (en) * 1999-08-16 2001-06-26 Cts Corporation Vehicle occupant position detector and airbag control system
US6998152B2 (en) * 1999-12-20 2006-02-14 Micron Technology, Inc. Chemical vapor deposition methods utilizing ionic liquids
US6726094B1 (en) * 2000-01-19 2004-04-27 Ncr Corporation Method and apparatus for multiple format image capture for use in retail transactions
US6685000B2 (en) 2000-05-19 2004-02-03 Kabushiki Kaisha Nippon Conlux Coin discrimination method and device
GB2368928A (en) 2000-07-21 2002-05-15 Dennis Stephen Livingstone Computer system for a kitchen
US20020047867A1 (en) 2000-09-07 2002-04-25 Mault James R Image based diet logging
US6412694B1 (en) * 2000-09-20 2002-07-02 Ncr Corporation Produce recognition system and method including weighted rankings
US20020109600A1 (en) 2000-10-26 2002-08-15 Mault James R. Body supported activity and condition monitor
US7845554B2 (en) 2000-10-30 2010-12-07 Fujitsu Frontech North America, Inc. Self-checkout method and apparatus
US7640512B1 (en) * 2000-12-22 2009-12-29 Automated Logic Corporation Updating objects contained within a webpage
US6790448B2 (en) * 2001-05-08 2004-09-14 The Texas A&M University System University Surface proteins from gram-positive bacteria having highly conserved motifs and antibodies that recognize them
US7933797B2 (en) 2001-05-15 2011-04-26 Shopper Scientist, Llc Purchase selection behavior analysis system and method
US6601762B2 (en) * 2001-06-15 2003-08-05 Koninklijke Philips Electronics N.V. Point-of-sale (POS) voice authentication transaction system
US20030018897A1 (en) * 2001-07-20 2003-01-23 Psc Scanning, Inc. Video identification verification system and method for a self-checkout system
US20030036985A1 (en) * 2001-08-15 2003-02-20 Soderholm Mark J. Product locating system for use in a store or other facility
CA2457198A1 (en) * 2001-08-16 2003-02-27 Trans World New York Llc User-personalized media sampling, recommendation and purchasing system using real-time inventory database
US20030039379A1 (en) 2001-08-23 2003-02-27 Koninklijke Philips Electronics N.V. Method and apparatus for automatically assessing interest in a displayed product
KR20040065260A (en) * 2001-12-13 2004-07-21 코닌클리케 필립스 일렉트로닉스 엔.브이. Recommending media content on a media system
US6991066B2 (en) * 2002-02-01 2006-01-31 International Business Machines Corporation Customized self-checkout system
US7050078B2 (en) 2002-12-19 2006-05-23 Accenture Global Services Gmbh Arbitrary object tracking augmented reality applications
JP3992629B2 (en) * 2003-02-17 2007-10-17 株式会社ソニー・コンピュータエンタテインメント Image generation system, image generation apparatus, and image generation method
US7180014B2 (en) * 2003-03-20 2007-02-20 Boris Farber Method and equipment for automated tracking and identification of nonuniform items
EP1616288A4 (en) 2003-04-07 2008-02-27 Silverbrook Res Pty Ltd Laser scanning device for printed product identification codes
US7406331B2 (en) * 2003-06-17 2008-07-29 Sony Ericsson Mobile Communications Ab Use of multi-function switches for camera zoom functionality on a mobile phone
US6926202B2 (en) * 2003-07-22 2005-08-09 International Business Machines Corporation System and method of deterring theft of consumers using portable personal shopping solutions in a retail environment
US20050097064A1 (en) * 2003-11-04 2005-05-05 Werden Todd C. Method and apparatus to determine product weight and calculate price using a camera
EP1702222A4 (en) * 2003-12-30 2009-05-27 Trans World New York Llc Systems and methods for the selection and purchase of digital assets
US7246745B2 (en) * 2004-02-27 2007-07-24 Evolution Robotics Retail, Inc. Method of merchandising for checkout lanes
US7337960B2 (en) 2004-02-27 2008-03-04 Evolution Robotics, Inc. Systems and methods for merchandise automatic checkout
US7207477B1 (en) * 2004-03-08 2007-04-24 Diebold, Incorporated Wireless transfer of account data and signature from hand-held device to electronic check generator
AU2005251372B2 (en) 2004-06-01 2008-11-20 L-3 Communications Corporation Modular immersive surveillance processing system and method
US8448858B1 (en) * 2004-06-21 2013-05-28 Stoplift, Inc. Method and apparatus for detecting suspicious activity using video analysis from alternative camera viewpoint
US7631808B2 (en) 2004-06-21 2009-12-15 Stoplift, Inc. Method and apparatus for detecting suspicious activity using video analysis
US7516888B1 (en) 2004-06-21 2009-04-14 Stoplift, Inc. Method and apparatus for auditing transaction activity in retail and other environments using visual recognition
US20050283402A1 (en) * 2004-06-22 2005-12-22 Ncr Corporation System and method of facilitating remote interventions in a self-checkout system
US20060018516A1 (en) * 2004-07-22 2006-01-26 Masoud Osama T Monitoring activity using video information
US7219838B2 (en) 2004-08-10 2007-05-22 Howell Data Systems System and method for notifying a cashier of the presence of an item in an obscured area of a shopping cart
US7168618B2 (en) 2004-08-12 2007-01-30 International Business Machines Corporation Retail store method and system
JP4284448B2 (en) 2005-01-28 2009-06-24 Fujifilm Corporation Image processing apparatus and method
US8040361B2 (en) 2005-04-11 2011-10-18 Systems Technology, Inc. Systems and methods for combining virtual and real-time physical environments
US8046375B2 (en) * 2005-06-16 2011-10-25 Lycos, Inc. Geo targeted commerce
DE102005036572A1 (en) 2005-08-01 2007-02-08 Scheidt & Bachmann Gmbh A method of automatically determining the number of people and/or objects in a gate
US8639543B2 (en) 2005-11-01 2014-01-28 International Business Machines Corporation Methods, systems, and media to improve employee productivity using radio frequency identification
JP4607797B2 (en) 2006-03-06 2011-01-05 Toshiba Corporation Behavior discrimination device, method and program
KR100786700B1 (en) * 2006-07-14 2007-12-21 삼성전자주식회사 Method of drying an object and apparatus for performing the same
US7930204B1 (en) 2006-07-25 2011-04-19 Videomining Corporation Method and system for narrowcasting based on automatic analysis of customer behavior in a retail store
WO2008031163A1 (en) 2006-09-13 2008-03-20 Eatingsafe Pty Ltd. On-line ingredient register
US7987111B1 (en) 2006-10-30 2011-07-26 Videomining Corporation Method and system for characterizing physical retail spaces by determining the demographic composition of people in the physical retail spaces utilizing video image analysis
US7533799B2 (en) 2006-12-14 2009-05-19 Ncr Corporation Weight scale fault detection
US9269244B2 (en) 2007-03-06 2016-02-23 Verint Systems Inc. Event detection based on video metadata
US8146811B2 (en) 2007-03-12 2012-04-03 Stoplift, Inc. Cart inspection for suspicious items
US20080228549A1 (en) * 2007-03-14 2008-09-18 Harrison Michael J Performance evaluation systems and methods
US8965042B2 (en) 2007-03-20 2015-02-24 International Business Machines Corporation System and method for the measurement of retail display effectiveness
US7762458B2 (en) 2007-03-25 2010-07-27 Media Cart Holdings, Inc. Media enabled shopping system user interface
US7679522B2 (en) 2007-03-26 2010-03-16 Media Cart Holdings, Inc. Media enhanced shopping systems with electronic queuing
US20080294514A1 (en) * 2007-05-23 2008-11-27 Calman Matthew A System and method for remote deposit capture and customer information gathering
US8794524B2 (en) * 2007-05-31 2014-08-05 Toshiba Global Commerce Solutions Holdings Corporation Smart scanning system
EP2181427A2 (en) * 2007-07-09 2010-05-05 Velti Plc Mobile device marketing and advertising platforms, methods, and systems
US7672876B2 (en) 2007-07-13 2010-03-02 Sunrise R&D Holdings, Llc System for shopping in a store
US8876001B2 (en) 2007-08-07 2014-11-04 Ncr Corporation Methods and apparatus for image recognition in checkout verification
US20090039165A1 (en) 2007-08-08 2009-02-12 Ncr Corporation Methods and Apparatus for a Bar Code Scanner Providing Video Surveillance
US7909248B1 (en) 2007-08-17 2011-03-22 Evolution Robotics Retail, Inc. Self checkout with visual recognition
JP4413957B2 (en) 2007-08-24 2010-02-10 Toshiba Corporation Moving object detection device and autonomous moving object
US7949568B2 (en) 2007-08-31 2011-05-24 Accenture Global Services Limited Determination of product display parameters based on image processing
US8189855B2 (en) * 2007-08-31 2012-05-29 Accenture Global Services Limited Planogram extraction based on image processing
JP5080196B2 (en) 2007-10-09 2012-11-21 Nintendo Co., Ltd. Program, information processing apparatus, information processing system, and information processing method
US8456293B1 (en) 2007-10-22 2013-06-04 Alarm.Com Incorporated Providing electronic content based on sensor data
JP2011503724A (en) * 2007-11-08 2011-01-27 Wal-Mart Stores, Inc. Automatic shopper checkout method and automatic shopper checkout apparatus using radio frequency identification technology
WO2009078114A1 (en) 2007-12-18 2009-06-25 Ssd Company Limited Mobile recording apparatus, body movement measuring apparatus, information processing apparatus, movement pattern determining apparatus, activity amount calculating apparatus, recording method, body movement measuring method, information processing method, movement pattern determining method, activity amount calculating method
FR2927442B1 (en) 2008-02-12 2013-06-14 Cliris Method for determining a local transformation rate of an object of interest
US8746557B2 (en) * 2008-02-26 2014-06-10 Toshiba Global Commerce Solutions Holdings Corporation Secure self-checkout
JP5521276B2 (en) * 2008-03-13 2014-06-11 Fujitsu Limited Authentication apparatus, authentication method, and authentication program
US20090237564A1 (en) 2008-03-18 2009-09-24 Invism, Inc. Interactive immersive virtual reality and simulation
US8419433B2 (en) 2008-04-15 2013-04-16 International Business Machines Corporation Monitoring recipe preparation using interactive cooking device
US8341077B1 (en) 2008-06-16 2012-12-25 Bank Of America Corporation Prediction of future funds positions
US8219438B1 (en) * 2008-06-30 2012-07-10 Videomining Corporation Method and system for measuring shopper response to products based on behavior and facial expression
US8126195B2 (en) * 2008-07-01 2012-02-28 International Business Machines Corporation Graphical retail item identification with point-of-sale terminals
US20110191117A1 (en) * 2008-08-15 2011-08-04 Mohammed Hashim-Waris Systems and methods for delivering medical consultation at pharmacies
US8448859B2 (en) 2008-09-05 2013-05-28 Datalogic ADC, Inc. System and method for preventing cashier and customer fraud at retail checkout
US20100063862A1 (en) * 2008-09-08 2010-03-11 Thompson Ronald L Media delivery system and system including a media delivery system and a building automation system
US8818875B2 (en) * 2008-09-23 2014-08-26 Toshiba Global Commerce Solutions Holdings Corporation Point of sale system with item image capture and deferred invoicing capability
US8194985B2 (en) * 2008-10-02 2012-06-05 International Business Machines Corporation Product identification using image analysis and user interaction
US8493408B2 (en) 2008-11-19 2013-07-23 Apple Inc. Techniques for manipulating panoramas
US8289162B2 (en) 2008-12-22 2012-10-16 Wimm Labs, Inc. Gesture-based user interface for a wearable portable device
US8571298B2 (en) 2008-12-23 2013-10-29 Datalogic ADC, Inc. Method and apparatus for identifying and tallying objects
US8494909B2 (en) 2009-02-09 2013-07-23 Datalogic ADC, Inc. Automatic learning in a merchandise checkout system with visual recognition
US8494215B2 (en) 2009-03-05 2013-07-23 Microsoft Corporation Augmenting a field of view in connection with vision-tracking
US20100262461A1 (en) * 2009-04-14 2010-10-14 Mypoints.Com Inc. System and Method for Web-Based Consumer-to-Business Referral
SG175827A1 (en) * 2009-06-16 2011-12-29 Intel Corp Camera applications in a handheld device
US10296937B2 (en) * 2009-06-29 2019-05-21 Excalibur Ip, Llc Operating a sensor recording marketplace
GB0913990D0 (en) * 2009-08-11 2009-09-16 Connelly Sean R Trolley
US8452868B2 (en) 2009-09-21 2013-05-28 Checkpoint Systems, Inc. Retail product tracking system, method, and apparatus
US8538820B1 (en) 2009-10-26 2013-09-17 Stoplift, Inc. Method and apparatus for web-enabled random-access review of point of sale transactional video
US8332255B2 (en) * 2009-11-09 2012-12-11 Palo Alto Research Center Incorporated Sensor-integrated mirror for determining consumer shopping behavior
US8320633B2 (en) * 2009-11-27 2012-11-27 Ncr Corporation System and method for identifying produce
US8964298B2 (en) * 2010-02-28 2015-02-24 Microsoft Corporation Video display modification based on sensor input for a see-through near-to-eye display
US20110231331A1 (en) * 2010-03-19 2011-09-22 International Business Machines Corporation Providing An Enhanced Shopping Experience
US20110279446A1 (en) 2010-05-16 2011-11-17 Nokia Corporation Method and apparatus for rendering a perspective view of objects and content related thereto for location-based services on mobile device
US9326116B2 (en) * 2010-08-24 2016-04-26 Rhonda Enterprises, Llc Systems and methods for suggesting a pause position within electronic text
US8941723B2 (en) 2010-08-26 2015-01-27 Blast Motion Inc. Portable wireless mobile device motion capture and analysis system and method
US20120072936A1 (en) * 2010-09-20 2012-03-22 Microsoft Corporation Automatic Customized Advertisement Generation System
US9171442B2 (en) * 2010-11-19 2015-10-27 Tyco Fire & Security Gmbh Item identification using video recognition to supplement bar code or RFID information
GB201022049D0 (en) * 2010-12-29 2011-02-02 Imp Innovations Ltd Methods
US20120233003A1 (en) * 2011-03-08 2012-09-13 Bank Of America Corporation Providing retail shopping assistance
US20120239504A1 (en) * 2011-03-15 2012-09-20 Microsoft Corporation Virtual Shopping Assistance
JP5780791B2 (en) * 2011-03-23 2015-09-16 Olympus Corporation Cell tracking method
US20120271715A1 (en) 2011-03-25 2012-10-25 Morton Timothy B System and method for the automatic delivery of advertising content to a consumer based on the consumer's indication of interest in an item or service available in a retail environment
DE102011016663A1 (en) * 2011-04-05 2012-10-11 How To Organize (H2O) Gmbh Device and method for identifying instruments
US20120290288A1 (en) * 2011-05-09 2012-11-15 Xerox Corporation Parsing of text using linguistic and non-linguistic list properties
WO2012170551A2 (en) 2011-06-06 2012-12-13 Stoplift, Inc. Notification system and methods for use in retail environments
US8698874B2 (en) * 2011-06-10 2014-04-15 Microsoft Corporation Techniques for multiple video source stitching in a conference room
KR101822655B1 (en) * 2011-06-21 2018-01-29 삼성전자주식회사 Object recognition method using camera and camera system for the same
US20130030915A1 (en) * 2011-06-23 2013-01-31 Qualcomm Incorporated Apparatus and method for enhanced in-store shopping services using mobile device
US8851372B2 (en) 2011-07-18 2014-10-07 Tiger T G Zhou Wearable personal digital device with changeable bendable battery and expandable display used as standalone electronic payment card
US20130024265A1 (en) * 2011-07-22 2013-01-24 Marc Lotzof Programmable Customer Loyalty and Discount Card
US9251679B2 (en) 2011-08-16 2016-02-02 Tamperseal Ab Method and a system for monitoring the handling of an object
US20130046648A1 (en) * 2011-08-17 2013-02-21 Bank Of America Corporation Shopping list system and process
US20130222371A1 (en) 2011-08-26 2013-08-29 Reincloud Corporation Enhancing a sensory perception in a field of view of a real-time source within a display screen through augmented reality
US9367770B2 (en) 2011-08-30 2016-06-14 Digimarc Corporation Methods and arrangements for identifying objects
US9033238B2 (en) 2011-08-30 2015-05-19 Digimarc Corporation Methods and arrangements for sensing identification information from objects
JP2013109539A (en) 2011-11-21 2013-06-06 Hitachi Consumer Electronics Co Ltd Product purchase device and product purchase method
US20130185155A1 (en) * 2012-01-12 2013-07-18 Big Red Pen, Inc. Systems and methods for providing contributions from third parties to lower a cost of a transaction for a purchaser
US9530060B2 (en) 2012-01-17 2016-12-27 Avigilon Fortress Corporation System and method for building automation using video content analysis with depth sensing
DE112013000936T5 (en) * 2012-02-10 2014-11-27 Zachary T. Bonefas System and method for material transport with an imaging device on the receiving vehicle for controlling the distribution of material in the hold of the receiving vehicle
JP5785123B2 (en) * 2012-03-16 2015-09-24 Ishida Co., Ltd. Combination weighing device
US20150095189A1 (en) * 2012-03-16 2015-04-02 In Situ Media Corporation System and method for scanning, tracking and collating customer shopping selections
US20130254114A1 (en) 2012-03-23 2013-09-26 Ncr Corporation Network-based self-checkout
US9875483B2 (en) * 2012-05-17 2018-01-23 Wal-Mart Stores, Inc. Conversational interfaces
EP2850570A4 (en) 2012-05-17 2015-10-07 Catalina Marketing Corp System and method of initiating in-trip audits in a self-checkout system
US8919653B2 (en) 2012-07-19 2014-12-30 Datalogic ADC, Inc. Exception handling in automated data reading systems
US9135789B2 (en) 2012-07-31 2015-09-15 Ncr Corporation Method and apparatus for reducing recognition times in an image-based product recognition system
US9171382B2 (en) * 2012-08-06 2015-10-27 Cloudparc, Inc. Tracking speeding violations and controlling use of parking spaces using cameras
US8856034B2 (en) 2012-08-16 2014-10-07 International Business Machines Corporation Intelligent point of sale system
US20140207600A1 (en) * 2012-08-24 2014-07-24 Daniel Ezell System and method for collection and management of items
US20160063671A1 (en) 2012-08-30 2016-03-03 Nokia Corporation A method and apparatus for updating a field of view in a user interface
US20140098185A1 (en) 2012-10-09 2014-04-10 Shahram Davari Interactive user selected video/audio views by real time stitching and selective delivery of multiple video/audio sources
US9396622B2 (en) 2012-11-02 2016-07-19 Tyco Fire & Security Gmbh Electronic article surveillance tagged item validation prior to deactivation
US20140182953A1 (en) 2012-12-31 2014-07-03 Fujikura Composite America, Inc. Electronic scale
JP5314199B1 (en) * 2013-01-29 2013-10-16 Panasonic Corporation Customer segment analysis apparatus, customer segment analysis system, and customer segment analysis method
US20140214623A1 (en) * 2013-01-30 2014-07-31 Wal-Mart Stores, Inc. In-store customer scan process including product automated ingredient warning
US9076157B2 (en) 2013-01-30 2015-07-07 Wal-Mart Stores, Inc. Camera time out feature for customer product scanning device
US10438228B2 (en) * 2013-01-30 2019-10-08 Walmart Apollo, Llc Systems and methods for price matching and comparison
US20140222596A1 (en) 2013-02-05 2014-08-07 Nithin Vidya Prakash S System and method for cardless financial transaction using facial biometrics
US10179543B2 (en) 2013-02-27 2019-01-15 Magna Electronics Inc. Multi-camera dynamic top view vision system
US20140274307A1 (en) * 2013-03-13 2014-09-18 Brainz SAS System and method for providing virtual world reward in response to the user accepting and/or responding to an advertisement for a real world product received in the virtual world
US9330413B2 (en) * 2013-03-14 2016-05-03 Sears Brands, L.L.C. Checkout and/or ordering systems and methods
US9033227B2 (en) 2013-05-20 2015-05-19 Ncr Corporation Methods and systems for performing security weight checks at checkouts
US20140365333A1 (en) * 2013-06-07 2014-12-11 Bby Solutions, Inc. Retail store customer natural-gesture interaction with animated 3d images using sensor array
US20140365334A1 (en) 2013-06-07 2014-12-11 Bby Solutions, Inc. Retail customer service interaction system and method
US20140365336A1 (en) 2013-06-07 2014-12-11 Bby Solutions, Inc. Virtual interactive product display with mobile device interaction
US20140367466A1 (en) * 2013-06-12 2014-12-18 Motorola Solutions, Inc. Checkout kiosk
US9338440B2 (en) * 2013-06-17 2016-05-10 Microsoft Technology Licensing, Llc User interface for three-dimensional modeling
US10268983B2 (en) 2013-06-26 2019-04-23 Amazon Technologies, Inc. Detecting item interaction and movement
US10176456B2 (en) 2013-06-26 2019-01-08 Amazon Technologies, Inc. Transitioning items from a materials handling facility
US9127891B2 (en) * 2013-07-10 2015-09-08 Honeywell International, Inc. Furnace visualization
US9473747B2 (en) * 2013-07-25 2016-10-18 Ncr Corporation Whole store scanner
US20150039388A1 (en) * 2013-07-30 2015-02-05 Arun Rajaraman System and method for determining consumer profiles for targeted marketplace activities
KR20150018037A (en) * 2013-08-08 2015-02-23 KT Corporation System for monitoring and method for monitoring using the same
US20150100433A1 (en) * 2013-10-04 2015-04-09 Retailigence Corporation Online Reservation System For Local Pickup Of Products Across Multiple Retailers
US20150134413A1 (en) * 2013-10-31 2015-05-14 International Business Machines Corporation Forecasting for retail customers
US9122958B1 (en) * 2014-02-14 2015-09-01 Social Sweepster, LLC Object recognition or detection based on verification tests
US9244280B1 (en) * 2014-03-25 2016-01-26 Rockwell Collins, Inc. Near eye display system and method for display enhancement or redundancy
US9779395B2 (en) * 2014-05-13 2017-10-03 Wal-Mart Stores, Inc. Systems and methods for identifying transaction capabilities of cashier
US20150379118A1 (en) * 2014-06-27 2015-12-31 United Video Properties, Inc. Methods and systems for generating playlists based on activities being performed by a user
US20160110791A1 (en) * 2014-10-15 2016-04-21 Toshiba Global Commerce Solutions Holdings Corporation Method, computer program product, and system for providing a sensor-based environment
JP6302849B2 (en) * 2015-01-23 2018-03-28 Toshiba Tec Corporation Article recognition apparatus, sales data processing apparatus, and control program

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080228577A1 (en) * 2005-08-04 2008-09-18 Koninklijke Philips Electronics, N.V. Apparatus For Monitoring a Person Having an Interest to an Object, and Method Thereof
US20110143779A1 (en) * 2009-12-11 2011-06-16 Think Tek, Inc. Providing City Services using Mobile Devices and a Sensor Network
US20130060843A1 (en) * 2010-03-29 2013-03-07 Rakuten, Inc. Server apparatus, information providing method, information providing program, recording medium recording the information providing program, and information providing system
US20130054377A1 (en) * 2011-08-30 2013-02-28 Nils Oliver Krahnstoever Person tracking and interactive advertising
US20130085345A1 (en) * 2011-09-30 2013-04-04 Kevin A. Geisner Personal Audio/Visual System Providing Allergy Awareness
US20130290107A1 (en) * 2012-04-27 2013-10-31 Soma S. Santhiveeran Behavior based bundling
US20140372211A1 (en) * 2013-06-14 2014-12-18 International Business Machines Corporation Real-time advertisement based on common point of attraction of different viewers

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018208671A1 (en) * 2017-05-08 2018-11-15 Walmart Apollo, Llc Uniquely identifiable customer traffic systems and methods

Also Published As

Publication number Publication date
US10482724B2 (en) 2019-11-19
US20160110751A1 (en) 2016-04-21
US20160110793A1 (en) 2016-04-21
US20160110797A1 (en) 2016-04-21
US20190139375A1 (en) 2019-05-09
US20160110786A1 (en) 2016-04-21
US20160110702A1 (en) 2016-04-21
US9424601B2 (en) 2016-08-23
US20160110622A1 (en) 2016-04-21
US20190096198A1 (en) 2019-03-28
US20180108074A1 (en) 2018-04-19
US9786000B2 (en) 2017-10-10
US10417878B2 (en) 2019-09-17
US9679327B2 (en) 2017-06-13
US20160110700A1 (en) 2016-04-21
US20160110902A1 (en) 2016-04-21
US20160110799A1 (en) 2016-04-21
US9842363B2 (en) 2017-12-12
US10157413B2 (en) 2018-12-18
US20160110772A1 (en) 2016-04-21
US20160110760A1 (en) 2016-04-21
US20160110701A1 (en) 2016-04-21
US10176677B2 (en) 2019-01-08
US20160110703A1 (en) 2016-04-21
US20160109281A1 (en) 2016-04-21

Similar Documents

Publication Publication Date Title
US10198712B2 (en) Virtual planogram management systems and methods
US9796093B2 (en) Customer service robot and related systems and methods
US20160042315A1 (en) System and methods for order fulfillment, inventory management, and providing personalized services to customers
US9053483B2 (en) Personal audio/visual system providing allergy awareness
US7624923B2 (en) Providing directed content to anonymous customers
US20130110666A1 (en) Interactive retail system
US20140365334A1 (en) Retail customer service interaction system and method
US20160026868A1 (en) Wearable apparatus and method for processing images including product descriptors
US10290031B2 (en) Method and system for automated retail checkout using context recognition
US20130293530A1 (en) Product augmentation and advertising in see through displays
KR101997957B1 (en) Systems and methods for providing information based on location
US20090182499A1 (en) Method and apparatus for augmented reality shopping assistant
US10360571B2 (en) Method for monitoring and analyzing behavior and uses thereof
AU2014225837B2 (en) In-store item alert architecture
CA2871413C (en) Customer assistance request system using smart device
US10176677B2 (en) Method, computer program product, and system for providing a sensor-based environment
US9575558B2 (en) System and method for electronically assisting a customer at a product retail location
US8636209B2 (en) System and method for interactive marketing to consumers
US20130117153A1 (en) Fully interactive, wireless, retail video display tag, integrated with content distribution, data management, feedback data collection, inventory and product price search capabilities
TW201303784A (en) Guideline-based food purchase management
US8479975B2 (en) System and method for using machine-readable indicia to provide additional information and offers to potential customers
US20100153174A1 (en) Generating Retail Cohorts From Retail Data
US20160203499A1 (en) Customer behavior analysis system, customer behavior analysis method, non-transitory computer readable medium, and shelf system
US9811840B2 (en) Consumer interface device system and method for in-store navigation
US10382804B2 (en) Systems and methods for identifying exposure to a recognizable item

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOSHIBA GLOBAL COMMERCE SOLUTIONS HOLDINGS CORPORATION

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HERRING, DEAN FREDERICK;CHIRAKANSAKCHAROEN, MONSAK JASON;SINGH, ANKIT;SIGNING DATES FROM 20141223 TO 20141224;REEL/FRAME:034644/0068

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION