US20180018681A1 - Holographic Technology Implemented Retail Solutions - Google Patents

Holographic Technology Implemented Retail Solutions

Info

Publication number
US20180018681A1
Authority
US
United States
Prior art keywords
persons
mixed reality
servers
devices
store
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/381,262
Inventor
Robert B. Locke
Paul B. Rasband
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tyco Fire and Security GmbH
Original Assignee
Tyco Fire and Security GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tyco Fire and Security GmbH
Priority to US15/381,262
Assigned to TYCO FIRE & SECURITY GMBH. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: RASBAND, PAUL B.
Assigned to TYCO FIRE & SECURITY GMBH. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LOCKE, Robert B.
Publication of US20180018681A1
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1446Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/103Formatting, i.e. changing of presentation of documents
    • G06F40/117Tagging; Marking up; Designating a block; Setting of attributes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/166Editing, e.g. inserting or deleting
    • G06F40/169Annotation, e.g. comment data or footnotes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/205Parsing
    • G06K9/00288
    • G06K9/00362
    • G06K9/00771
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/004Artificial life, i.e. computing arrangements simulating life
    • G06N3/006Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00Computing arrangements using knowledge-based models
    • G06N5/04Inference or reasoning models
    • G06N5/043Distributed expert systems; Blackboards
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201Market modelling; Market analysis; Collecting market data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0269Targeted advertisements based on user profile or attribute
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0281Customer communication at a business location, e.g. providing product or service information, consulting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/01Social networking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/20Individual registration on entry or exit involving the use of a pass
    • G07C9/22Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
    • G07C9/25Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition
    • G07C9/257Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition electronically
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/26Speech to text systems
    • H04N13/044
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0132Head-up displays characterised by optical features comprising binocular systems
    • G02B2027/0134Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0141Head-up displays characterised by optical features characterised by the informative content of the display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/004Annotating, labelling
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19608Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19654Details concerning communication with a camera
    • G08B13/19656Network used to communicate with a camera, e.g. WAN, LAN, Internet
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/28Constructional details of speech recognition systems
    • G10L15/30Distributed recognition, e.g. in client-server systems, for mobile phones or network applications
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/223Execution procedure of a spoken command

Definitions

  • This description relates to operation of sensor networks, such as those used with security, intrusion and alarm systems installed at commercial facilities.
  • Many different types of security sensors are deployed in commercial buildings. Types of sensors typically include motion detectors, cameras, and proximity sensors (used to determine whether a door or window has been opened). Such sensors constantly collect data that is used to determine whether an alarm should be triggered, and continue to collect data after an alarm is triggered.
  • Retail establishments often rely on simple physical walk-throughs (with users having smart-phone and/or tablet based presentations), conventional retail analytics applications, and verbal descriptions as tools for investigating trends and potential explanations of observations suggested by data analytics.
  • Augmented reality, virtual reality, and mixed reality technologies are known.
  • Virtual reality generally refers to technologies that replicate an environment and simulate a user's immersion in the replicated environment.
  • Augmented reality generally refers to technologies that present a view of a real-world environment augmented with computer-generated data.
  • Mixed reality, a relatively new term, generally refers to technologies that merge real-world and virtual environments, in which real and virtual objects co-exist and interact.
  • According to an aspect, a system includes r.f. detectors that detect the presence of r.f. signals and a mixed reality system including a processor device, memory in communication with the processor device, and a head mounted display device including a stereoscopic 3D display that renders one or more images of persons within a field of view of the mixed reality system, with the mixed reality system configured to send the images to one or more server computer systems.
  • The system also includes the one or more server computer systems configured to process information from r.f. signals sent from r.f. devices in possession of at least some of the persons in the field of view, analyze the processed information and the one or more images to identify at least one of the persons in the field of view, access at least one profile corresponding to the at least one identified person, and classify the one or more persons in the one or more images according to a status.
  • aspects also include computer program products and methods.
  • The disclosed approach uses computer-implemented techniques that obtain information from various electronic systems/devices in the physical world, exemplified by security systems, and merges that information into a virtual world of policies and analytics for retail systems that generate analytical information regarding customers and their preferences and needs. This improves upon simple physical walk-throughs blended with smart-phone and tablet based presentations, conventional retail analytics apps, and verbal descriptions.
  • FIG. 1 is a schematic diagram of an exemplary networked security system.
  • FIG. 2 is a block diagram of a generally conventional constrained device that is typically used in a security system application.
  • FIG. 3 is a flow chart depicting application processing using mixed reality devices adapted for retail solutions.
  • FIG. 4 is a flow chart depicting an alternative application processing using mixed reality devices adapted for retail solutions.
  • FIG. 5 is a block diagram that shows a mixed reality session manager.
  • FIG. 6 is a block diagram that shows components of a typical AI based cognitive assistant or agent.
  • An integrated platform 10 integrates, via a distributed network 11 , servers executing various analytical applications (e.g., to detect trends, for loss prevention, etc.) with mixed reality devices 13 a - 13 c and sensors, such as those of installed security/intrusion/alarm/surveillance systems 15 a - 15 c (typically including sensors 20 , functional nodes 18 , and typically a panel, not shown).
  • Example mixed reality devices 13 a - 13 c combine a live, real-world presentation of elements of the physical world with virtual elements, so that a user perceives these elements as existing together in a common environment.
  • Examples of such mixed reality devices 13 a - 13 c include the Microsoft HoloLens®, a smart-glasses, cordless, Windows 10® (Microsoft) computer headset that includes various sensors, a high-definition stereoscopic 3D optical head-mounted display, and spatial sound to allow for augmented reality applications.
  • Other mixed reality devices/augmented reality systems, such as Google Glass® (Google), could be used; these are but two examples of the many such systems on the market.
  • The security systems 15 a - 15 c typically include a panel (not shown), such as, for an intrusion detection system, an intrusion detection panel wired or wirelessly connected to a variety of sensors deployed in a facility. Typically, such panels receive signals from one or more of these sensors indicating a current state or value, or that a particular condition being monitored has changed or become unsecure.
  • security systems include a surveillance system that includes plural cameras that are wired or wirelessly connected to a local computing system for display of video on monitors, and which feed video to servers 14 for various types of processing.
  • Sensor types may include motion detectors, proximity sensors (used, e.g., to determine whether a door has been opened), and r.f. hotspots used to detect the presence of various devices such as cell phones, e.g., smart phones, and so forth.
  • the integrated platform 10 includes data collection systems that are coupled to wireless sensor networks and wireless devices, with remote server-based monitoring via servers 14 and report generation.
  • wireless sensor networks generally use a combination of wired and wireless links between computing devices, with wireless links usually used for the lowest level connections (e.g., end-node device to hub/gateway 16 ).
  • The edge (wirelessly-connected) tier of the network comprises resource-constrained devices 20 with specific functions. These devices 20 may have a small-to-moderate amount of processing power and memory, and may be battery powered, thus requiring that they conserve energy by spending much of their time in sleep mode.
  • a typical model is one where the edge devices 20 generally form a single wireless network in which each end-node communicates directly with its parent node (e.g., 18 ) in a hub-and-spoke-style architecture.
  • the parent node may be, e.g., an access point on a gateway or a sub-coordinator which is, in turn, connected to the access point or another sub-coordinator.
  • the distributed network 11 is logically divided into a set of tiers or hierarchical levels 12 a - 12 c .
  • the mixed reality devices 13 a - 13 n are shown in communication with the top one or two tiers or hierarchical levels 12 a - 12 c .
  • The lower level tier 12 c is illustrated divided into different facilities 19 a - 19 c for ease in explaining details of the applications discussed below.
  • The facilities 19 a - 19 c are each associated with one of the security systems 15 a - 15 c .
  • The security systems can be independent, meaning that there are no connections (as shown) among fully functional nodes of different facilities, or dependent, meaning that there are connections (not shown) among fully functional nodes of different facilities.
  • Servers and/or virtual servers 14 run a "cloud computing" paradigm and are networked together using well-established networking technology, such as Internet protocols, or can be on private networks that use none or part of the Internet.
  • Applications that run on those servers 14 communicate using various protocols, such as (for Web Internet networks) XML/SOAP, RESTful web services, and other application layer technologies such as HTTP and ATOM.
  • the distributed network 11 has direct links between devices (nodes) as shown and discussed below.
  • Servers 14 execute analytics (analysis programs of various sorts) that are managed in concert with a session manager system 80 ( FIG. 4 ).
  • the servers 14 can access a database 23 .
  • The second logically divided tier or hierarchical level 12 b involves gateways 16 located at central, convenient places inside individual buildings and structures, e.g., facilities 19 a - 19 c . These gateways 16 communicate with servers 14 in the upper tier whether the servers are stand-alone dedicated servers and/or cloud based servers running cloud applications using web programming techniques.
  • the middle tier gateways 16 are also shown with both local area network 17 a (e.g., Ethernet or 802.11) and cellular network interfaces 17 b .
  • Each gateway is equipped with an access point (fully functional node or "F" node) that is physically attached to that gateway and provides a wireless connection point to other nodes in the wireless network.
  • the links (illustrated by lines not numbered) shown in FIG. 1 represent direct (single-hop MAC layer) connections between devices.
  • A formal networking layer (that functions in each of the three tiers shown in FIG. 1 ) uses a series of these direct links together with routing devices to send messages (fragmented or non-fragmented) from one device to another over the network.
  • the distributed network topology also includes a lower tier (edge layer) 12 c set of devices that involve fully-functional sensor nodes 18 (e.g., sensor nodes that include wireless devices, e.g., transceivers or at least transmitters, which in FIG. 1 are marked in with an “F”) as well as constrained wireless sensor nodes or sensor end-nodes 20 (marked in the FIG. 1 with “C”).
  • wired sensors can be included in aspects of the distributed network 11 .
  • the distributed network 11 implements a state machine approach to an application layer that runs on the lower tier devices 18 and 20 .
  • States in the state machine are comprised of sets of functions that execute in coordination, and these functions can be individually deleted or substituted or added to in order to alter the states in the state machine of a particular lower tier device.
  • the state function based application layer uses an edge device operating system that allows for loading and execution of individual functions (after the booting of the device) without rebooting the device (so-called “dynamic programming”).
  • edge devices could use other operating systems provided such systems allow for loading and execution of individual functions (after the booting of the device) preferably without rebooting of the edge devices.
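  • As an illustrative sketch only (in Python, and not representing the actual RTOS image that runs on the edge devices 18 , 20 ), a state-function based application layer of this kind might manage a state machine whose states are sets of individually loadable functions, so that single functions can be added, substituted, or deleted at runtime without rebooting the device:

```python
# Illustrative sketch only: a state-function-based application layer in which
# each state is a named set of functions that can be added, replaced, or
# removed at runtime ("dynamic programming") without restarting the device.

class StateMachineAppLayer:
    def __init__(self):
        # state name -> {function name -> callable}
        self.states = {}
        self.current_state = None

    def load_function(self, state, name, fn):
        """Add or substitute a single function within a state."""
        self.states.setdefault(state, {})[name] = fn

    def delete_function(self, state, name):
        """Remove a single function from a state."""
        self.states.get(state, {}).pop(name, None)

    def enter_state(self, state):
        self.current_state = state

    def run_once(self, context):
        """Execute the current state's functions in coordination."""
        for fn in self.states.get(self.current_state, {}).values():
            fn(context)

# Hypothetical usage: an "armed" state whose report function is later replaced.
app = StateMachineAppLayer()
app.load_function("armed", "sample", lambda ctx: ctx.update(motion=False))
app.load_function("armed", "report", lambda ctx: print("report", ctx))
app.enter_state("armed")
app.run_once({})
# Substitute the report function at runtime, without a reboot:
app.load_function("armed", "report", lambda ctx: print("report v2", ctx))
app.run_once({})
```
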
  • A constrained device 20 that is part of the security/intrusion/alarm/surveillance systems (either integrated examples of such systems or standalone examples) is shown.
  • A constrained device 20 , as used herein, is a device having substantially less persistent and volatile memory than other computing devices, sensors, or systems in a particular networked detection/sensor/alarm system.
  • Constrained device 20 includes a processor device 21 a , e.g., a CPU and/or other type of controller device, that executes under an operating system, generally with 8-bit or 16-bit logic rather than the 32- and 64-bit logic used by high-end computers and microprocessors.
  • The constrained device 20 has a relatively small flash/persistent store 21 b and volatile memory 21 c in comparison with the other computing devices on the network.
  • the persistent store 21 b is about a megabyte of storage or less and volatile memory 21 c is about several kilobytes of RAM memory or less.
  • the constrained device 20 has a network interface card 21 d that interfaces the constrained device 20 to the network 11 .
  • Typically, a wireless interface card is used, but in some instances a wired interface could be used. The wireless interface card typically uses a transceiver chip driven by a wireless network protocol stack (e.g., 802.15.4/6LoWPAN).
  • the constrained device 20 also includes a sensor 22 and a sensor interface 22 a that interfaces to the processor 21 a .
  • Sensor 22 can be any type of sensor device. Typical types of sensors include temperature, simple motion, 1-, 2- or 3-axis acceleration force, humidity, pressure, selective chemical, sound/piezo-electric transduction, and/or numerous others, implemented singly or in combination to detect complex events.
  • A constrained device 20 typically follows the above constraints on flash/persistent storage and RAM memory, i.e., about a megabyte or less of flash/persistent storage and less than 10-20 kilobytes of RAM/volatile memory, but can have more depending on configuration and, in some instances, the operating system. These constrained devices 20 are configured in this manner generally due to cost/physical configuration considerations. These types of constrained devices 20 generally have a static software image (i.e., the logic programmed into the constrained device is always the same).
  • Constrained devices 20 execute a real-time operating system that supports dynamic programming.
  • the real-time operating system (“RTOS”) executes and otherwise manages a dynamic set of user-defined independent executable functions or tasks that are either built into a loaded image (software and RTOS that executes on the constrained device) or that are downloaded during normal operation of the constrained device 20 or a combination of the two, with the former (built into the image) using as subroutines instances of the latter (downloaded during operation).
  • Certain of the applications set forth below will cause systems to access these constrained devices 20 to upload data and otherwise control the devices 20 according to needs of the applications.
  • A facility can be of any type but is typically, e.g., a commercial or industrial facility, with interior areas (buildings) and exterior areas that are subject to surveillance and other types of monitoring.
  • The buildings can be of any configuration, from wide open spaces such as a warehouse to compartmentalized facilities such as labs/offices.
  • the retail establishment includes the plural sensors 22 ( FIG. 1 ).
  • A portion of the sensors 22 are r.f. hot spots or the like, through which customers are provided with free Wi-Fi or other Internet access service.
  • the plural sensors including the hotspots are part of a security system, e.g., an alarm system such as fire or intrusion detection or surveillance systems. In other implementations, the plural sensors including the hotspots are part of another type of integrated system.
  • A user, e.g., a customer, in exchange for obtaining free Wi-Fi service, voluntarily conveys selective personal information, e.g., name, address, cell phone number, cell ID, etc. In other implementations less information is conveyed, such as being limited to the cell ID or cell phone number.
  • Sensors 22 that are hot spots or the like capture at least this cell ID or cell phone number information.
  • As a customer enters 42 the retail establishment, because the customer has a cell phone that broadcasts signals, the r.f. sensors 22 in the store detect these broadcast signals.
  • These signals are sent for processing 44 such as by the servers 14 , where the cell phone numbers and/or cell phone IDs are extracted from each signal.
  • The extracted cell phone numbers and/or cell phone IDs, together with any other information, e.g., video from cameras in surveillance systems, are processed by the servers 14 to determine the identity of the person carrying the cell phone.
  • Determining identity is accomplished by, e.g., a lookup operation, where the servers 14 use the extracted cell phone number and/or cell phone ID as an index into a table (Table 1, below) that stores profile ID's to retrieve a profile that likely corresponds to the holder of the cell phone.
  • The servers 14 can alternatively form queries that search databases for entries having that cell phone number or cell phone ID to look up the owner of the cell phone. From that information, the servers can access the owner's profile.
  • the databases 23 can be in-house (i.e., private databases) and/or third party databases that store user information along with cell phone number and/or cell phone ID. These databases 23 can store relatively little information or a significant amount of information in such profiles, depending on how freely the person associated with the profile shares personal information with such systems including a store's web portal.
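  • A minimal sketch of the lookup operation described above, assuming a simple in-memory table keyed by cell phone number or cell ID (the table layout, field names, and example profile contents are illustrative and not taken from the patent):

```python
# Illustrative lookup: the extracted cell phone number / cell ID is used as an
# index into a table of profile IDs, and the profile is then retrieved.
# Table layout and field names are assumptions for this sketch.

PROFILE_INDEX = {                     # cell number or cell ID -> profile ID
    "+1-555-0100": "P-1001",
    "35-209900-176148-1": "P-1002",   # example IMEI-style cell ID
}

PROFILES = {                          # profile ID -> profile record
    "P-1001": {"name": "J. Doe", "classification": "frequent high value"},
    "P-1002": {"name": "A. Smith", "classification": "infrequent low value"},
}

def lookup_profile(cell_number=None, cell_id=None):
    """Return the profile that likely corresponds to the cell phone holder."""
    for key in (cell_number, cell_id):
        if key and key in PROFILE_INDEX:
            return PROFILES.get(PROFILE_INDEX[key])
    return None  # unknown device: no profile on record

print(lookup_profile(cell_number="+1-555-0100"))
```
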
  • The store has a web portal from which portal data is collected, and the portal data are associated with that particular person.
  • the servers 14 access the web portal and pull 46 data from the web portal regarding recent activity of the customer corresponding to that profile.
  • the servers 14 pull this data based on matching server-determined identity (based on identifying the user in the store by the r.f. signals from the cell phone and/or facial recognition) to user-supplied identity on the portal (based on information collected from the user of the portal during sessions on the portal). From recent activity on the portal, the servers 14 can determine 48 the customer's potential interests in purchases of particular retail items.
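  • One way the servers 14 might derive potential purchase interests from recent portal activity is sketched below; the event format, time window, and scoring are assumptions made for illustration:

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical portal event log entries: (timestamp, profile_id, item_viewed).
PORTAL_EVENTS = [
    (datetime.now() - timedelta(days=2), "P-1001", "running shoes"),
    (datetime.now() - timedelta(days=1), "P-1001", "running shoes"),
    (datetime.now() - timedelta(days=1), "P-1001", "fitness tracker"),
    (datetime.now() - timedelta(days=40), "P-1001", "winter coat"),
]

def potential_interests(profile_id, days=30, top_n=3):
    """Rank items the customer viewed on the portal within the last `days`."""
    cutoff = datetime.now() - timedelta(days=days)
    counts = Counter(item for ts, pid, item in PORTAL_EVENTS
                     if pid == profile_id and ts >= cutoff)
    return [item for item, _ in counts.most_common(top_n)]

print(potential_interests("P-1001"))   # -> ['running shoes', 'fitness tracker']
```
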
  • Store surveillance systems send 50 a video data to the servers 14 .
  • the servers 14 executing facial recognition algorithms on this video data identify 50 b person(s) in the video data.
  • The servers 14 (and/or optionally a local server system) continually receive 50 c video data from various cameras associated with the surveillance system, and the servers 14 track 50 the person(s).
  • The servers 14 , executing 52 a a mapping algorithm, send data to the mixed reality device 13 a carried by the sales associate.
  • the data from the mapping algorithm shows a customer of interest's current location in the store, via a map.
  • Servers 14 push 52 b data to the mixed reality device for use by the sales associate with the mixed reality device 13 a to inform the sales associate how to advise the customer.
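  • The tracking-and-push step might look like the following sketch, in which the servers keep the customer's most recent camera sighting and forward a simple map payload to the sales associate's mixed reality device 13 a ; the camera-to-aisle mapping and the push transport are placeholders invented for illustration:

```python
import json
import time

# Hypothetical mapping from surveillance camera ID to store location.
CAMERA_LOCATIONS = {"cam-03": {"aisle": 7, "zone": "sporting goods"}}

last_seen = {}   # profile ID -> most recent sighting

def record_sighting(profile_id, camera_id):
    """Called whenever facial recognition matches a profile in a camera feed."""
    last_seen[profile_id] = {"time": time.time(),
                             "location": CAMERA_LOCATIONS.get(camera_id)}

def push_to_mixed_reality_device(device, profile_id, interests):
    """Build the payload the servers would push to the associate's device."""
    payload = {"customer": profile_id,
               "location": last_seen.get(profile_id, {}).get("location"),
               "suggested_topics": interests}
    device.send(json.dumps(payload))   # transport is an assumption (e.g., a WebSocket)

class FakeDevice:                      # stand-in for the mixed reality device 13a
    def send(self, msg):
        print("pushed to device:", msg)

record_sighting("P-1001", "cam-03")
push_to_mixed_reality_device(FakeDevice(), "P-1001", ["running shoes"])
```
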
  • a sales associate has a mixed reality device 13 a and cameras on the mixed reality device 13 a capture 62 video data and send 64 the captured video data to the server systems 14 .
  • The server systems 14 process 66 the video data, and from those frames the server systems 14 , executing facial recognition processing, extract features of faces. These features can be used to identify the persons belonging to the faces captured in those frames. From these extracted features, faces in the video are associated with profiles of persons, and such profile(s) are retrieved 68 .
  • the mixed reality device sends 70 a video data to the server systems 14 and these servers 14 identify the person by executing 70 b facial recognition algorithms on the video data.
  • the servers 14 continually receive 70 c video data from mixed reality device 13 a and thus can track 70 d the person for the mixed reality device 13 a , execute mapping algorithms that send 72 a data to the mixed reality device carried by the sales associate, which shows the customer's current location via a map.
  • the servers 14 push 72 b data to the mixed reality device 13 a for use by the sales associate with the mixed reality device.
  • the user of the mixed reality device 13 a can focus on one customer appearing in the field of view by gesturing to point to that person.
  • the servers 14 can perform this “focusing” automatically. That is, the user can be informed by the servers 14 processing profile records (as discussed above) to determine customer classification (discussed below).
  • the servers 14 would produce indications that can be rendered in the mixed reality device 13 a to point out person(s) according to classification.
  • The processed video data involves extracting facial features that are used as an index into a database that stores profiles of persons.
  • The database can hold relevant information that the store needs about individuals, including facial features used by facial recognition and the classification of the customer.
  • Classification captures the nature of a person, e.g., as a customer, such as a frequent high value customer or an infrequent low value customer, as well as other classifications such as the person being a habitual shoplifter, etc.
  • Each of the statuses or classifications that can be ascribed to persons, e.g., a customer, a frequent high value customer, an infrequent low value customer, as well as other classifications such as the person being a habitual shoplifter, etc., is empirically determined by individual store managements. It is not reasonable therefore to ascribe any fixed ranges or values to such classifications. However, some criteria are commonly used. For instance, it is common for businesses that have analytical processing capabilities to characterize persons according to a ranking based on sales (or profit) per time period (visit, monthly, etc.).
  • A high value customer can be one where attributed sales (or profit) place that person in a specific segment of a store's customers, e.g., the top 5% or top 25%, whereas a low value customer can be one in the bottom, e.g., 5% or 25%.
  • Other groupings are possible.
  • Customers can also be segmented according to frequency of visits; a high frequency customer can be one where attributed visits place that person in a specific segment of a store's customers, e.g., the top 5% or top 25%, whereas a low frequency customer can be one in the bottom, e.g., 5% or 25%.
  • Other groupings are possible.
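  • The percentile-style segmentation described above could be computed as in the following sketch; the top/bottom 5% and 25% thresholds mirror the examples in the text, while the data layout is assumed:

```python
# Classify customers by where their attributed sales place them among all
# customers of the store (top 5% / top 25% vs. bottom 25% / bottom 5%).

def classify_by_sales(sales_by_customer):
    """sales_by_customer: dict of customer ID -> sales (or profit) per period."""
    ranked = sorted(sales_by_customer, key=sales_by_customer.get, reverse=True)
    n = len(ranked)
    labels = {}
    for rank, customer in enumerate(ranked):        # rank 0 = highest sales
        pct = (rank + 1) / n
        if pct <= 0.05:
            labels[customer] = "high value (top 5%)"
        elif pct <= 0.25:
            labels[customer] = "high value (top 25%)"
        elif pct > 0.95:
            labels[customer] = "low value (bottom 5%)"
        elif pct > 0.75:
            labels[customer] = "low value (bottom 25%)"
        else:
            labels[customer] = "mid value"
    return labels

sales = {"P-1001": 4200.0, "P-1002": 35.0, "P-1003": 600.0, "P-1004": 90.0}
print(classify_by_sales(sales))
```
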
  • the servers 14 determine the classification and send the classification to the mixed reality device 13 a .
  • the store associate's mixed reality device 13 a is updated with the relevant information as determined by the servers 14 .
  • the display of the mixed reality device 13 a is configured to render an indicium according to the determined classification or status in an image of the person, which indicium is disposed adjacent the location of the person in the image as rendered on the display of the mixed reality device 13 a .
  • Various types of indicia can be used, including the mixed reality device forming an outline in a color about or adjacent the person's image in the display and locking that outline to the person's image as the sales associate and the customer move about the store.
  • The servers 14 , by tracking the customer's online activity, determine the products/services of interest to that customer and can focus analytically determined, personalized service on that customer while the customer is in the store.
  • the servers 14 and sales associate can assist the customer to find such products/services, and provide additional assistance.
  • the servers 14 produce data that can assist store personnel with customer interactions. These data are displayed by the display portion of the mixed reality device 13 a .
  • The servers 14 analyze data collected on the customer from various sources, and especially the web portal, whether the customer is in the store or outside of the premises. When the user is connected to the portal, whether in the store or not, the system retrieves information from the portal regarding items that they have been searching. That information is displayed on the mixed reality device 13 a when the sales associate views the customers in the store. For example, when a customer enters the store, the sales associate knows, from data processed by the servers 14 , the products of interest to the customer. In addition, from data processed by the servers 14 , the servers forward to the mixed reality device 13 a additional data that can assist the sales associate in locating the product for the customer, educating the customer with other information regarding the product, and/or suggesting alternative products of interest to the customer.
  • The servers 14 can determine whether the person is one who has been at the store's portal and had activity on the portal, from which the servers 14 , by analyzing data collected from that activity, determine potential interests in purchases of particular retail items. In some implementations, based on this analysis, the servers can determine whether the person is a likely purchaser or a likely "window shopper." This determination can be used to prioritize contacts with the customer. This process could be cloud-based, executing analytical programs on the servers 14 to process the collected data.
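  • A toy heuristic for the likely purchaser versus "window shopper" determination is sketched below; the signals, weights, and threshold are invented for illustration and would, in practice, come from the store's own analytics:

```python
def likely_purchaser(portal_activity):
    """portal_activity: dict of counts of hypothetical portal signals for one visitor."""
    score = (2.0 * portal_activity.get("cart_adds", 0)
             + 1.0 * portal_activity.get("price_checks", 0)
             + 0.2 * portal_activity.get("page_views", 0)
             - 0.5 * portal_activity.get("abandoned_carts", 0))
    return "likely purchaser" if score >= 3.0 else "likely window shopper"

print(likely_purchaser({"cart_adds": 1, "price_checks": 2, "page_views": 5}))
```
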
  • For the second type of person, the system provides the mixed reality device 13 a with an indication of shoplifters or other criminals when they enter the store. This can be done by detecting the presence of their cell phone in the store, or through the combined use of facial recognition technology as described above.
  • The servers 14 would have access to a database that stores indications of a user as a known threat, for example, facial recognition features stored based on previous encounters, information from commercial databases that track criminals, etc., indicating that the particular person is legally excluded from shopping at that store.
  • the person's image would be captured by the surveillance system and their image would be flagged to the mixed reality device 13 a.
  • The retail application provides a mixed reality device display driven by customer tracking software.
  • When the customer tracking software detects customer actions that suggest frustration (such as frustration at not being able to find a particular product), this can be identified to a sales associate in the mixed reality device 13 a . When the sales associate is notified, the mixed reality device 13 a can guide the sales associate to the customer in order to provide assistance.
  • the system 80 shown includes databases (generally 23 ) containing data on retail items, including item name, SKU number, retail price, wholesale price, location in the store (aisle no., shelf no., slot no., planogram reference no., etc.).
  • the system also shows other databases which include store layout information.
  • the system includes a mobile AR/VR (augmented reality/virtual reality) device, an AR/VR session management system, and finally a wireless (e.g., WiFi) network with wireless access points.
  • The organization of the databases in FIG. 5 is given as an example and is somewhat simplified relative to the design and implementation of actual enterprise-scale retail databases encountered in the commercial world. That is, no attempt is made in the figure to show how the databases are fragmented and deployed for data redundancy, scalability, fast data access, and so forth. Also, the segregation of various types of data into separate databases is simplified in FIG. 5 , and it should be recognized that other database architectures can be imagined which are compatible with and included in the current invention as additional embodiments.
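  • For illustration, the retail item records described above (item name, SKU number, prices, and in-store location) could be modeled as follows; the types, example values, and helper function are assumptions of this sketch, not the schema of any actual store database:

```python
from dataclasses import dataclass

@dataclass
class RetailItem:
    # Fields follow the database contents listed above; types are assumed.
    item_name: str
    sku: str
    retail_price: float
    wholesale_price: float
    aisle_no: int
    shelf_no: int
    slot_no: int
    planogram_ref: str

ITEM_DB = {
    "036000291452": RetailItem("Trail running shoe", "036000291452",
                               89.99, 41.50, 7, 3, 12, "PG-2016-07"),
}

def locate_item(sku):
    """Return the in-store location (aisle, shelf, slot) for a SKU, if known."""
    item = ITEM_DB.get(sku)
    return None if item is None else (item.aisle_no, item.shelf_no, item.slot_no)

print(locate_item("036000291452"))   # -> (7, 3, 12)
```
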
  • the mixed reality device 13 a allows the user to see the real environment with data or “artificial images” imposed on the view of the real environment.
  • The Microsoft HoloLens® and Google Glass® are examples of commercial devices that allow this mixing of "real" and "virtual" realities, referred to herein as mixed reality devices. It is also necessary that the device interact with an outside network and the web (e.g., using a WiFi connection) and also allow for input from the user (e.g., using hand gestures and/or voice commands).
  • the location of the user and associated mixed reality device 13 a inside the retail store is determined and tracked as the user moves around the store with the mixed reality device 13 a . This may be accomplished through a number of techniques including wireless triangulation of the device, various “internal GPS” technologies (BLE, RFID, NFC, etc.) or dead-reckoning based accelerometer data integration.
  • For the purposes of discussion, it is only necessary to note that the physical location of either the mixed reality device 13 a or some other device on the person of the user may be estimated to within a few feet of the user's actual location in the retail store using technologies well known to those skilled in the art.
  • other technology components such as cameras, beacons, and other access points may be required, and these are omitted from FIG. 5 for the sake of simplicity.
  • the tracked device makes its location (and by inference the location of the user and the mixed reality device 13 a ) known by sending location data over the in-store wireless network to the AR/VR session manager.
  • the location of the user and mixed reality device 13 a may be determined without any location determination functionality on the mixed reality device 13 a , and without any second device (i.e., smart phone) if some other outside system (e.g., a video surveillance system with image analytics capabilities able to determine location) is available and is used to track the user's location during the AR/VR session.
  • the user may also specify where in the store they are by some other technique such as selecting a location on a map of the store.
  • The mixed reality device 13 a may determine its own location by capturing the image or images of items in its surroundings which have been previously mapped to the current location. Using such a location-to-image map, the mixed reality device 13 a can determine its own location.
  • the “image” in such a case might be an actual image recorded in some convenient file format, or it might be an index or set of indices derived from the image in a manner which makes them unique to that image (i.e., an image index or hash).
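  • The location-to-image map could be keyed by such an image index or hash, as sketched below; a real system would likely use a perceptual hash or visual features rather than the exact byte hash used here for brevity:

```python
import hashlib

def image_index(image_bytes):
    """Stand-in for a perceptual hash / feature index derived from an image."""
    return hashlib.sha256(image_bytes).hexdigest()[:16]

# Built offline when the store is mapped: image index -> store location.
LOCATION_MAP = {
    image_index(b"endcap-display-photo"): {"aisle": 4, "x": 12.5, "y": 30.0},
}

def locate_from_camera(current_frame_bytes):
    """Estimate the device's location from what its camera currently sees."""
    return LOCATION_MAP.get(image_index(current_frame_bytes))

print(locate_from_camera(b"endcap-display-photo"))   # matched: known location
print(locate_from_camera(b"unmapped-corridor"))      # None: no match on record
```
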
  • an AR/VR session manager 90 interacts with the mixed reality device 13 a over the Internet using a “session portal” 92 , e.g., a web service application programming interface (API), or in another embodiment, a dedicated socket with SMTP or other transfer protocol.
  • The session portal 92 is bi-directional, meaning that each of the mixed reality devices 13 a - 13 c can send data to the session manager 90 and receive data from the session manager 90 .
  • the mixed reality devices (MRS) 13 a - 13 c send updates on their states to the session manager 90 .
  • The states of the mixed reality devices 13 a - 13 c are represented virtually or "mirrored" in a device state representation 94 inside the session manager 90 .
  • Input from the mixed reality devices (MRS) 13 a - 13 c to the session manager 90 is used in analytic programs executed on the servers 14 .
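  • A compact sketch of the session portal / device-state mirroring arrangement: devices post state updates, and the session manager keeps a mirrored representation (cf. device state representation 94 ) that analytics can read. Class and method names are invented for illustration; the patent does not prescribe a particular API:

```python
class SessionManager:
    """Mirrors the state of each mixed reality device and logs the session."""

    def __init__(self):
        self.device_state = {}     # device ID -> last reported state ("mirror")
        self.session_log = []      # chronological log of session events / notes

    def handle_update(self, device_id, state):
        """Called by the session portal when a device reports its state."""
        self.device_state[device_id] = state
        self.session_log.append((device_id, state))

    def outbound(self, device_id):
        """Data the session manager would send back down to the device."""
        mode = self.device_state.get(device_id, {}).get("mode", "idle")
        return {"mode_ack": mode, "overlays": []}   # overlays filled by analytics

mgr = SessionManager()
mgr.handle_update("MRS-13a", {"mode": "customer_assist", "location": "aisle 7"})
print(mgr.outbound("MRS-13a"))
```
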
  • cameras in the facility can be sending video feeds to the servers that send relevant data to the mixed reality devices (MRS) 13 a - 13 c
  • cameras on the mixed reality device 13 a - 13 c also may send video.
  • This video is analyzed by the input analyzer 96 using various techniques to inform the analytics manager 98 , which provides inputs to analytic programs (not shown) executing on the servers 14 .
  • the analytics manager 98 uses a current mode and inputs presented to it, in order to decide what to present (virtually) to the user on the device viewer and what to request of the analytics executing on the server 14 .
  • the session mode manager 90 monitors the mode selected by the user (as mirrored in the device state representation) and informs the analytics manager of the selection. Session logs and notes (not referenced) can also be stored.
  • the session may be logged by the input analyzer 96 , including any notes or annotations provided by at least some users of the mixed reality devices 13 a - 13 c , e.g., verbal or text sent from the mixed reality devices 13 a - 13 c or otherwise.
  • This locale log/record in the session manager 90 may be backed up in external database 23 or other databases (not shown) for long-term storage, reporting, and further analysis.
  • This local session and long-term storage may also include a full record or “recording” of part or all of the session, rather than just the user notes.
  • The mixed reality devices 13 a - 13 c can be controlled via a switch on the device, a voice command, and/or a hand gesture that can be used to awaken the device (i.e., load operating system components and prepare for input) when the device senses motion, or can be used to request inputs to the device from the servers 14 .
  • the mixed reality devices 13 a - 13 c may require input of a user id and password to enable further operation and interaction with the user and servers 14 .
  • Web service clients 102 provide connections to various ones of the databases 23 .
  • the sensor network illustrated in FIG. 1 is an example of a network that collects and analyzes data from various sensor devices.
  • Other configurations of servers and gateways can be used.
  • the session manager system 90 can be implemented in the servers 14 or in local or detached server systems.
  • Servers can be any of a variety of computing devices capable of receiving information, such as a server, a distributed computing system 10 , a rack-mounted server, and so forth. A server may be a single server or a group of servers that are at a same location or at different locations. Servers can receive information from a client device/user device via interfaces. Interfaces can be any type of interface capable of receiving information over a network, such as an Ethernet interface, a wireless networking interface, a fiber-optic networking interface, a modem, and so forth. A server also includes a processor and memory, and a bus system (including, for example, an information bus and a motherboard) can be used to establish and to control information communication between the components of the server.
  • Processor may include one or more microprocessors.
  • processor may include any appropriate processor and/or logic that is capable of receiving and storing information, and of communicating over a network (not shown).
  • Memory can include a hard drive and a random access memory storage device, such as a dynamic random access memory, computer readable hardware storage devices and media, and other types of non-transitory storage devices.
  • Embodiments can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations thereof.
  • Computer programs can be implemented in a high-level procedural or object oriented programming language, or in assembly or machine language if desired; and in any case, the language can be a compiled or interpreted language.
  • Suitable processors include, by way of example, both general and special purpose microprocessors.
  • a processor will receive instructions and information from a read-only memory and/or a random access memory.
  • a computer will include one or more mass storage devices for storing information files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks.
  • Storage devices suitable for tangibly embodying computer program instructions and information include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM disks. Any of the foregoing can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Finance (AREA)
  • Development Economics (AREA)
  • Accounting & Taxation (AREA)
  • General Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Economics (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Game Theory and Decision Science (AREA)
  • Software Systems (AREA)
  • Acoustics & Sound (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Tourism & Hospitality (AREA)
  • Human Resources & Organizations (AREA)
  • Primary Health Care (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Optics & Photonics (AREA)
  • Oral & Maxillofacial Surgery (AREA)

Abstract

Disclosed are techniques that use mixed reality and/or augmented reality and virtual reality technologies for analysis of retail processes and activity in retail stores. The disclosed techniques use computer implemented techniques that obtain information from various electronic systems/devices in the physical world, which devices are exemplified by security systems, and merge that information into a virtual world of policies and analytics that involve retail systems that generate analytical information regarding customers and their preferences and needs.

Description

    CLAIM OF PRIORITY
  • This application claims priority under 35 U.S.C. §119(e) to provisional U.S. Patent Application 62/361,053, filed on Jul. 12, 2016, entitled "Holographic Technology Implemented Security and Retail Solutions," the entire contents of which are incorporated herein by reference, and to provisional U.S. Patent Application 62/361,669, filed on Jul. 13, 2016, entitled "Holographic Technology Implemented Security and Retail Solutions," the entire contents of which are incorporated herein by reference.
  • HOLOGRAPHIC TECHNOLOGY IMPLEMENTED RETAIL SOLUTIONS
  • BACKGROUND
  • This description relates to the operation of sensor networks such as those used with security, intrusion and alarm systems installed at commercial facilities.
  • It is common for businesses to have systems for detecting conditions at their facilities and signaling the conditions to a monitoring station or to authorized users of the security system. For example, such buildings employ systems in the areas of fire detection, smoke detection, intrusion detection, access control, video surveillance, etc. Many different types of security sensors are deployed in commercial buildings. Types of sensors typically include motion detectors, cameras, and proximity sensors (used to determine whether a door or window has been opened). Such sensors constantly collect data that is used to determine whether an alarm should be triggered, and they continue to collect data after an alarm is triggered.
  • Retail establishments often rely on simple physical walk-throughs by users with smart-phone and/or tablet based presentations, together with conventional retail analytics applications and verbal descriptions, as the tools for investigating trends and potential explanations of observations suggested by data analytics.
  • Augmented reality, virtual reality and mixed reality technologies are known. Generally, virtual reality refers to technologies that replicate an environment with a simulation of a user being immersed in the replicated environment. Augmented reality generally refers to technologies that present a view of a real-world environment augmented with computer generated data. Mixed reality, a relatively new term, generally involves technologies that merge real-world and virtual-world environments, in which real and virtual objects coexist and interact.
  • SUMMARY
  • According to an aspect, a system includes r.f. detectors that detect presence of r.f. signals, a mixed reality system including a processor device, memory in communication with the processor device, and a head mounted display device including a stereoscopic 3D display that renders one or more images of persons within a field of view of the mixed reality system device, with the mixed reality system configured to send the images to one or more server computer systems. The system also includes the one or more server computer systems configured to process information from r.f. signals sent from r.f. devices in possession of at least some of the persons in the field of view, analyze the processed information and the one or more images to identify at least one of the persons in the field of view, access at least one profile corresponding to the at least one identified person, and classify the one or more persons in the one or more images according to a status.
  • Aspects also include computer program products and methods.
  • Disclosed are techniques that use mixed reality and/or augmented reality and virtual reality technologies to improve the analysis of retail processes and activity in retail stores. The disclosed techniques use computer implemented techniques that obtain information from various electronic systems/devices in the physical world, which devices are exemplified by security systems, and merge that information into a virtual world of policies and analytics that involve retail systems that generate analytical information regarding customers and their preferences and needs. This improves upon simple physical walk-throughs blended with smart-phone and tablet based presentations, conventional retail analytics apps, and verbal descriptions.
  • The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the invention are apparent from the description and drawings, and from the claims.
  • DESCRIPTION OF DRAWINGS
  • FIG. 1 is a schematic diagram of an exemplary networked security system.
  • FIG. 2 is a block diagram of a generally conventional constrained device that is typically used in a security system application.
  • FIG. 3 is a flow chart depicting application processing using mixed reality devices adapted for retail solutions.
  • FIG. 4 is a flow chart depicting an alternative application processing using mixed reality devices adapted for retail solutions.
  • FIG. 5 is a block diagram that shows a mixed reality session manager.
  • FIG. 6 is a block diagram that shows components of a typical AI based cognitive assistant or agent.
  • DETAILED DESCRIPTION
  • As shown in FIG. 1, described herein are examples of an integrated platform 10 that integrates via a distributed network 11, servers executing various analytical applications to detect trends, loss prevention, etc., with mixed reality devices 13 a-13 c and sensors such as from installed security/intrusion/alarm/surveillance systems 15 a-15 c (typically including sensors 20, functional nodes 18 and typically including a panel not shown).
  • Example mixed reality devices 13 a-13 c are those in which the mixed reality devices 13 a-13 c incorporate a live, real world presentation of elements of the physical real-world with virtual elements, so that to a user these elements are perceived to exist together in a common environment. Examples of such mixed reality devices 13 a-13 c include the HoloLens® (Microsoft), a smart-glasses, cordless, Windows 10® (Microsoft) computer headset that includes various sensors, a high-definition stereoscopic 3D optical head-mounted display, and spatial sound to allow for augmented reality applications. Other mixed reality devices/augmented reality systems such as Google Glass® (Google) could be used. There are many such systems on the market of which these are two examples.
  • The security systems 15 a-15 c typically include a panel (not shown), such as for an intrusion detection system, an intrusion detection panel wired or wirelessly connected to a variety of sensors deployed in a facility. Typically, such panels receive signals from one or more of these sensors to indicate a current state or value or that a particular condition being monitored has changed or become unsecure.
  • Examples of security systems include a surveillance system that includes plural cameras that are wired or wirelessly connected to a local computing system for display of video on monitors, and which feed video to servers 14 for various types of processing. In addition to cameras, other sensor types may include motion detectors, proximity sensors (used, e.g., to determine whether a door has been opened), and r.f. hotspots that detect the presence of various devices such as cellphones, e.g., smart phones, and so forth.
  • The integrated platform 10 includes data collection systems that are coupled to wireless sensor networks and wireless devices, with remote server-based monitoring via servers 14 and report generation. As described in more detail below, wireless sensor networks generally use a combination of wired and wireless links between computing devices, with wireless links usually used for the lowest level connections (e.g., end-node device to hub/gateway 16). In an example network, the edge (wirelessly-connected) tier of the network is comprised of resource-constrained devices 20 with specific functions. These devices 20 may have a small-to-moderate amount of processing power and memory, and may be battery powered, thus requiring that they conserve energy by spending much of their time in sleep mode. A typical model is one where the edge devices 20 generally form a single wireless network in which each end-node communicates directly with its parent node (e.g., 18) in a hub-and-spoke-style architecture. The parent node may be, e.g., an access point on a gateway or a sub-coordinator which is, in turn, connected to the access point or another sub-coordinator.
  • In FIG. 1, the distributed network 11 is logically divided into a set of tiers or hierarchical levels 12 a-12 c. The mixed reality devices 13 a-13 n are shown in communication with the top one or two tiers or hierarchical levels 12 a-12 c. In FIG. 1, the lower level tier 12 c is illustrated divided into different facilities 19 a-19 c for ease in explaining details of the applications that will be discussed below. The facilities 19 a-19 c are each associated with one of the security systems 15 a-15 c. The security systems can be independent, meaning that there are no connections (as shown) among fully functional nodes of different facilities, or dependent, meaning that there are connections (not shown) among fully functional nodes of different facilities.
  • In the upper tier or hierarchical level 12 a of the network are disposed servers and/or virtual servers 14 running a "cloud computing" paradigm that are networked together using well-established networking technology such as Internet protocols, or which can be private networks that use none or part of the Internet. Applications that run on those servers 14 communicate using various protocols such as (for Web/Internet networks) XML/SOAP, RESTful web services, and other application layer technologies such as HTTP and ATOM. The distributed network 11 has direct links between devices (nodes) as shown and discussed below. Servers 14 execute analytics (analysis programs of various sorts) that are managed in concert with a session manager system 80 (FIG. 4). The servers 14 can access a database 23.
  • The second logically divided tier or hierarchical level 12 b, referred to here as a middle tier, involves gateways 16 located at central, convenient places inside individual buildings and structures, e.g., 13 a-13 c. These gateways 16 communicate with servers 14 in the upper tier whether the servers are stand-alone dedicated servers and/or cloud based servers running cloud applications using web programming techniques. The middle tier gateways 16 are also shown with both local area network 17 a (e.g., Ethernet or 802.11) and cellular network interfaces 17 b. Each gateway is equipped with an access point (fully functional node or "F" node) that is physically attached to that gateway and that provides a wireless connection point to other nodes in the wireless network. The links (illustrated by lines not numbered) shown in FIG. 1 represent direct (single-hop MAC layer) connections between devices. A formal networking layer (that functions in each of the three tiers shown in FIG. 1) uses a series of these direct links together with routing devices to send messages (fragmented or non-fragmented) from one device to another over the network.
  • The distributed network topology also includes a lower tier (edge layer) 12 c set of devices that involve fully-functional sensor nodes 18 (e.g., sensor nodes that include wireless devices, e.g., transceivers or at least transmitters, which in FIG. 1 are marked in with an “F”) as well as constrained wireless sensor nodes or sensor end-nodes 20 (marked in the FIG. 1 with “C”). In some embodiments wired sensors (not shown) can be included in aspects of the distributed network 11.
  • The distributed network 11 implements a state machine approach to an application layer that runs on the lower tier devices 18 and 20. States in the state machine are comprised of sets of functions that execute in coordination, and these functions can be individually deleted or substituted or added to in order to alter the states in the state machine of a particular lower tier device. The state function based application layer uses an edge device operating system that allows for loading and execution of individual functions (after the booting of the device) without rebooting the device (so-called “dynamic programming”). In other implementations, edge devices could use other operating systems provided such systems allow for loading and execution of individual functions (after the booting of the device) preferably without rebooting of the edge devices.
  • Referring to FIG. 2, a generic constrained computing device 20 that is part of the security/intrusion/alarm/surveillance systems (either integrated examples of such systems or standalone examples) is shown. A constrained device 20 as used herein is a device having substantially less persistent and volatile memory than other computing devices, sensors, or systems in a particular networked detection/sensor/alarm system. Constrained device 20 includes a processor device 21 a, e.g., a CPU and/or other type of controller device, that executes under an operating system, generally with 8-bit or 16-bit logic rather than the 32- and 64-bit logic used by high-end computers and microprocessors. The constrained device 20 has a relatively small flash/persistent store 21 b and volatile memory 21 c in comparison with the other computing devices on the network. Generally the persistent store 21 b is about a megabyte of storage or less and volatile memory 21 c is about several kilobytes of RAM memory or less.
  • The constrained device 20 has a network interface card 21 d that interfaces the constrained device 20 to the network 11. Typically a wireless interface card is used, but in some instances a wired interface could be used. Alternatively, a transceiver chip driven by a wireless network protocol stack (e.g., 802.15.4/6LoWPAN) can be used as the (wireless) network interface. These components are coupled together via a bus structure. The constrained device 20 also includes a sensor 22 and a sensor interface 22 a that interfaces to the processor 21 a. Sensor 22 can be any type of sensor device. Typical types of sensors include temperature, simple motion, 1-, 2- or 3-axis acceleration force, humidity, pressure, selective chemical, sound/piezo-electric transduction, and/or numerous others, implemented singly or in combination to detect complex events.
  • The disclosed implementations of a constrained device 20 can follow the current constraints on flash/persistent storage memory and RAM memory, e.g., less than 10-20 kilobytes of RAM/volatile memory, but can have more depending on configuration and in some instances the operating system. These constrained devices 20 are configured in this manner generally due to cost/physical configuration considerations. These types of constrained devices 20 generally have a static software image (i.e., the logic programmed into the constrained device is always the same).
  • Constrained devices 20 execute a real-time operating system that can use dynamic programming and support. The real-time operating system (“RTOS”) executes and otherwise manages a dynamic set of user-defined independent executable functions or tasks that are either built into a loaded image (software and RTOS that executes on the constrained device) or that are downloaded during normal operation of the constrained device 20 or a combination of the two, with the former (built into the image) using as subroutines instances of the latter (downloaded during operation). Certain of the applications set forth below will cause systems to access these constrained devices 20 to upload data and otherwise control the devices 20 according to needs of the applications.
  • In the examples below, a facility can be of any type but is typically, e.g., a commercial or industrial facility, with interior areas (buildings) and exterior areas that are subject to surveillance and other types of monitoring. The buildings can be of any configuration, from wide open spaces such as a warehouse to compartmentalized facilities such as labs/offices.
  • Referring to FIG. 3, a real-time process 40 for determining shopping activity integrated with the mixed reality devices 13 a-13 c is shown. In the discussion that follows, mixed reality device 13 a is referred to as an illustrative example. The retail establishment includes the plural sensors 22 (FIG. 1). In one implementation, a portion of the sensors 22 are r.f. hot spots or the like, through which customers are provided with free Wi-Fi or other Internet access service. In some implementations, the plural sensors including the hotspots are part of a security system, e.g., an alarm system such as fire or intrusion detection or surveillance systems. In other implementations, the plural sensors including the hotspots are part of another type of integrated system.
  • In some implementations, a user, e.g., a customer, in exchange for obtaining free Wi-Fi service voluntarily conveys selective personal information, e.g., name, address, cell phone no., cell ID, etc. In other implementations less information is conveyed, such as being limited to cell ID or cell phone number. In any event, sensors 22 that are hot spots or the like capture at least this cell ID or cell phone number information. As a customer enters 42 the retail establishment, because the customer has a cell phone that broadcasts signals, the r.f. sensors 22 in the store detect these broadcast signals. These signals are sent for processing 44, such as by the servers 14, where the cell phone numbers and/or cell phone IDs are extracted from each signal. The extracted cell phone numbers and/or cell phone IDs together with any other information, e.g., video from cameras in surveillance systems, are processed by the servers 14 to determine the identity of the person carrying the cell phone.
  • Determining identity is accomplished by, e.g., a lookup operation, where the servers 14 use the extracted cell phone number and/or cell phone ID as an index into a table (Table 1, below) that stores profile ID's to retrieve a profile that likely corresponds to the holder of the cell phone.
  • TABLE 1
    Cell phone ID        Profile ID
    Value of number/ID   Location of profile
    *                    *
    *                    *
    *                    *
  • The servers 14 can alternatively form queries that search databases for entries having that cell phone number or cell phone ID to lookup the owner of the cell phone. From that information, the servers can access the owner's profile. The databases 23 can be in-house (i.e., private databases) and/or third party databases that store user information along with cell phone number and/or cell phone ID. These databases 23 can store relatively little information or a significant amount of information in such profiles, depending on how freely the person associated with the profile shares personal information with such systems including a store's web portal.
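  • By way of illustration only, the following Python sketch shows one way such a lookup might be organized. The dictionaries standing in for Table 1 and the databases 23, and the function name lookup_profile, are assumptions made for this example and are not part of the disclosed system.

      from typing import Optional

      def lookup_profile(cell_id: str,
                         table1: dict,      # Table 1: cell phone number/ID -> location of profile
                         profiles: dict,    # databases 23: profile location -> profile record
                         by_cell_id: dict   # fallback index keyed directly by cell phone number/ID
                         ) -> Optional[dict]:
          """Return the profile that likely corresponds to the holder of the cell phone."""
          location = table1.get(cell_id)
          if location is not None:
              return profiles.get(location)
          # Alternatively, search the databases for an entry having that cell
          # phone number or cell phone ID, as described above.
          return by_cell_id.get(cell_id)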
  • As mentioned, in some implementations, the store has a web portal. Each time the customer logs into the portal, portal data is collected. The portal data are associated with that particular person. The servers 14 access the web portal and pull 46 data from the web portal regarding recent activity of the customer corresponding to that profile. The servers 14 pull this data based on matching server-determined identity (based on identifying the user in the store by the r.f. signals from the cell phone and/or facial recognition) to user-supplied identity on the portal (based on information collected from the user of the portal during sessions on the portal). From recent activity on the portal, the servers 14 can determine 48 the customer's potential interests in purchases of particular retail items.
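  • As a non-limiting sketch of how recent portal activity might be turned into a short list of likely purchase interests, the following Python fragment ranks the items a matched customer viewed most often. The session record fields (profile_id, items_viewed) and the function name interests_from_portal are illustrative assumptions, not the portal's actual data model.

      from collections import Counter

      def interests_from_portal(profile_id: str, portal_sessions: list, top_n: int = 5) -> list:
          """Rank the retail items this customer viewed or searched most often on the portal."""
          viewed = Counter()
          for session in portal_sessions:
              if session.get("profile_id") != profile_id:
                  continue                      # keep only activity matched to this identity
              viewed.update(session.get("items_viewed", []))
          return [item for item, _ in viewed.most_common(top_n)]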
  • These profiles will include various information such as that commonly collected by entities from visitors to websites/portals (as well as other information sources), and which is commonly used with various types of analytical processing to market products/services to customers. The details of such analytical processing are well-known and not needed for an understanding of the processes discussed herein.
  • As the customer enters the store, in addition to broadcasting its "r.f. identity," i.e., the cell phone no. and/or cell phone ID, the customer is also identified, e.g., by facial recognition or other techniques. Store surveillance systems send 50 a video data to the servers 14. The servers 14, executing facial recognition algorithms on this video data, identify 50 b person(s) in the video data. The servers 14 (and/or optionally a local server system) continually receive 50 c video data from various cameras associated with the surveillance system, and the servers 14 track 50 the person(s). The servers 14 execute 52 a a mapping algorithm that sends data to the mixed reality device 13 a carried by the sales associate. The data from the mapping algorithm shows a customer of interest's current location in the store, via a map. Servers 14 push 52 b data to the mixed reality device for use by the sales associate with the mixed reality device 13 a to inform the sales associate how to advise the customer.
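  • A minimal sketch of the data pushed at steps 52 a/52 b is given below; the payload fields and the function name build_advise_payload are assumptions for illustration, not a definition of the actual message format used by the servers 14.

      import json

      def build_advise_payload(customer_id: str, location: tuple, interests: list, floor_plan: str) -> str:
          """Assemble the map and advisory data the servers push to the sales associate's device 13a."""
          return json.dumps({
              "customer_id": customer_id,
              "floor_plan": floor_plan,         # which store map to render
              "current_location": location,     # (x, y) position from the tracking step 50
              "talking_points": interests,      # interests determined at step 48
          })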
  • Referring now to FIG. 4, an alternative real-time process 60 for determining shopping activity integrated with the mixed reality devices 13 a-13 c is shown. In this processing 60, a sales associate has a mixed reality device 13 a, and cameras on the mixed reality device 13 a capture 62 video data and send 64 the captured video data to the server systems 14. In some of the frames of this video data, faces of individuals in the field of view of the mixed reality device 13 a are presented. The server systems 14 process 66 the video data and, from those frames, executing facial recognition processing, extract features of those faces. These features can be used to identify the persons belonging to the faces captured in those frames. From these extracted features, faces in the video are associated with profiles of persons, and such profile(s) are retrieved 68.
  • As also shown, in some implementations, the mixed reality device sends 70 a video data to the server systems 14 and these servers 14 identify the person by executing 70 b facial recognition algorithms on the video data. The servers 14 continually receive 70 c video data from the mixed reality device 13 a and thus can track 70 d the person for the mixed reality device 13 a, and execute mapping algorithms that send 72 a data to the mixed reality device carried by the sales associate, which shows the customer's current location via a map. The servers 14 push 72 b data to the mixed reality device 13 a for use by the sales associate with the mixed reality device.
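  • The description does not specify a particular facial recognition algorithm. Purely for illustration, the sketch below assumes faces are reduced to fixed-length feature vectors and matched to stored profiles by nearest-neighbor distance, as in the retrieval step 68; the gallery structure and the threshold value are hypothetical.

      import math
      from typing import Optional

      def match_profile(face_features: list, gallery: dict, threshold: float = 0.6) -> Optional[str]:
          """Return the profile ID whose stored feature vector is closest to the extracted features."""
          best_id, best_dist = None, float("inf")
          for profile_id, stored in gallery.items():
              dist = math.dist(face_features, stored)   # Euclidean distance between feature vectors
              if dist < best_dist:
                  best_id, best_dist = profile_id, dist
          return best_id if best_dist <= threshold else None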
  • In the above example, the user of the mixed reality device 13 a can of course focus on one customer appearing in the field of view by gesturing to point to that person. However, in some instances, the servers 14 can perform this "focusing" automatically. That is, the user can be informed by the servers 14, which process profile records (as discussed above) to determine customer classification (discussed below). The servers 14 would produce indications that can be rendered in the mixed reality device 13 a to point out person(s) according to classification.
  • For instance, as discussed, the processed video data involves extracting facial features that are used as an index into a database that stores profiles of persons. Such a database can hold relevant information that the store needs about individuals, including facial features that are recognized by facial recognition and the classification of the customer. Classification includes the nature of a person, e.g., as a customer, e.g., frequent high value customer or infrequent low value customer, as well as other classifications such as the person being a habitual shoplifter, etc.
  • Each of the statuses or classifications that can be ascribed to persons, e.g., a customer, a frequent high value customer, an infrequent low value customer, as well as other classifications such as the person being a habitual shoplifter, etc., is empirically determined by individual store managements. It is not reasonable, therefore, to ascribe any fixed ranges or values to such customers. However, some criteria are used. For instance, it is common for businesses that have analytical processing capabilities to characterize persons according to a ranking based on sales (or profit) per time period (visit, monthly, etc.). Accordingly, a high value customer can be one where attributed sales (or profit) places that person in a specific segment of a store's customers, e.g., top 5% or top 25%, whereas a low value customer can be one that is in the bottom, e.g., 5% or bottom 25%. Other groupings are possible. There could also be a middle grouping (or several middle groupings) of customers between high and low value.
  • In addition, besides segmenting customers according to value, customers can also be segmented according to frequency of visits. A high frequency customer can be one whose attributed visits place that person in a specific segment of a store's customers, e.g., top 5% or top 25%, whereas a low frequency customer can be one that is in the bottom, e.g., 5% or bottom 25%. Other groupings are possible.
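  • A hedged sketch of such percentile-based segmentation follows. The cutoffs (top/bottom 25% by default) and the function name classify_customers are examples only, since the description notes that the actual ranges are set empirically by individual store managements.

      def classify_customers(sales_by_customer: dict, high_cut: float = 0.75, low_cut: float = 0.25) -> dict:
          """Segment customers into high/middle/low value by rank on sales (or profit) per time period."""
          if not sales_by_customer:
              return {}
          ranked = sorted(sales_by_customer, key=sales_by_customer.get)   # ascending by sales
          n = len(ranked)
          labels = {}
          for rank, customer in enumerate(ranked):
              quantile = (rank + 1) / n
              if quantile > high_cut:
                  labels[customer] = "high value"      # e.g., top 25% (use high_cut=0.95 for top 5%)
              elif quantile <= low_cut:
                  labels[customer] = "low value"       # e.g., bottom 25%
              else:
                  labels[customer] = "middle"
          return labels

    Applying the same routine to visit counts instead of sales yields the frequency-of-visit segmentation described above.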
  • The servers 14 determine the classification and send the classification to the mixed reality device 13 a. When the person comes into the field of view of the mixed reality device 13 a, the store associate's mixed reality device 13 a is updated with the relevant information as determined by the servers 14. The display of the mixed reality device 13 a is configured to render an indicium according to the determined classification or status in an image of the person, which indicium is disposed adjacent the location of the person in the image as rendered on the display of the mixed reality device 13 a. Various types of indicia can be used, including the mixed reality device forming an outline in a color about or adjacent the person's image in the display and locking that outline to the person's image as the sales associate and the customer move about the store.
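  • One hypothetical mapping from a determined classification to the indicium rendered adjacent the person's image is sketched below; the colors, labels, and the INDICIUM_BY_STATUS name are assumptions chosen for illustration and are not prescribed by the description.

      INDICIUM_BY_STATUS = {
          "frequent high value customer":  {"outline_color": "green", "label": "high value"},
          "infrequent low value customer": {"outline_color": "blue",  "label": "low value"},
          "habitual shoplifter":           {"outline_color": "red",   "label": "alert security"},
      }

      def indicium_for(status: str) -> dict:
          """Return display instructions for the mixed reality device 13a; unknown statuses get no outline."""
          return INDICIUM_BY_STATUS.get(status, {"outline_color": None, "label": ""})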
  • Therefore, if the person is a customer that signed up for an e-service offered by the store, the servers 14, by tracking that customer's online activity, determine the products/services of interest to that customer and can focus analytically determined personalized service on that customer while the customer is in the store. The servers 14 and sales associate can assist the customer to find such products/services, and provide additional assistance.
  • Executing various types of analytical programs, the servers 14 produce data that can assist store personnel with customer interactions. These data are displayed by the display portion of the mixed reality device 13 a. The servers 14 analyze data collected on the customer from various sources, and especially the web portal, whether the customer is in the store or outside of the premises. When the user is connected to the portal, whether in the store or not, the system retrieves information from the portal regarding items that they have been searching. That information is displayed on the mixed reality device 13 a when the sales associate views the customers in the store. For example, when a customer enters the store, the sales associate knows, from data processed by the servers 14, the products of interest to the customer. In addition, from data processed by the servers 14, the servers forward to the mixed reality device 13 a additional data that can assist the sales associate in locating the product for the customer and educating the customer with other information regarding the product and/or suggesting alternative products of interest to the customer.
  • From recent activity on the portal, the servers 14 can determine whether the person is one that has been at the store's portal and had activity on the portal, from which the servers 14, by analyzing data collected from that activity, determine potential interests in purchases of particular retail items. In some implementations, based on this analysis, the servers can determine whether the person is a likely purchaser or a likely "window shopper." This determination can be used to prioritize contacts with the customer. This process could be cloud-based, executing analytical programs on the servers 14 to process the collected data.
  • A second type of person is handled by providing the mixed reality device 13 a with an indication of shoplifters or other criminals when they enter the store. This can be done by detecting the presence of their cell phone in the store, or through the combined use of facial recognition technology as described above. The servers 14 would have access to a database that stores indications of a user as a known threat, for example, facial recognition features stored based on previous encounters, information from commercial databases that track criminals, etc., indicating that the particular person is a person legally excluded from shopping at that store.
  • Moreover, for a person shopping in the store, i.e., a customer, if that “customer” commits an action that triggers an alarm, e.g., smashing a glass case holding jewelry, etc., the person's image would be captured by the surveillance system and their image would be flagged to the mixed reality device 13 a.
  • The retail application provides a mixed reality device display from customer tracking software. When the customer tracking software detects customer actions that suggest frustration (such as frustration at not being able to find a particular product), this can be identified to a sales associate in the mixed reality device 13 a. Therefore, when the sales associate is notified, the mixed reality device 13 a can guide the sales associate to the customer in order to provide assistance.
  • Described below is a specific implementation of a retail solution using mixed reality devices 13 a-13 c discussed above.
  • Referring to FIG. 5, the system 80 shown includes databases (generally 23) containing data on retail items, including item name, SKU number, retail price, wholesale price, location in the store (aisle no., shelf no., slot no., planogram reference no., etc.). The system also shows other databases which include store layout information. In addition, the system includes a mobile AR/VR (augmented reality/virtual reality) device, an AR/VR session management system, and finally a wireless (e.g., WiFi) network with wireless access points.
  • The organization of the databases in FIG. 5 is given as an example and is somewhat simplified relative to the design and implementation of actual enterprise-scale retail databases encountered in the commercial world. That is, no attempt is made in the figure to show how the databases are fragmented and deployed for data redundancy, scalability, fast data access, and so forth. Also, the segregation of various types of data into separate databases is simplified in FIG. 5, and it should be recognized that other database architectures can be imagined which are compatible with and included in the current invention as additional embodiments.
  • The mixed reality device 13 a allows the user to see the real environment with data or "artificial images" imposed on the view of the real environment. The Microsoft HoloLens® and Google Glass® are examples of commercial devices that allow this mixing of "real" and "virtual" realities, referred to herein as mixed reality devices. It is also necessary that the device interact with an outside network and the web (e.g., using a WiFi connection) and also allow for input from the user (e.g., using hand gestures and/or voice commands).
  • The location of the user and associated mixed reality device 13 a inside the retail store is determined and tracked as the user moves around the store with the mixed reality device 13 a. This may be accomplished through a number of techniques including wireless triangulation of the device, various “internal GPS” technologies (BLE, RFID, NFC, etc.) or dead-reckoning based accelerometer data integration. For the purposes of discussion it is only necessary to note that the physical location of either the mixed reality device 13 a or some other device on the person of the user may be estimated to within a few feet of the user's actual location in the retail store using technologies well known to those skilled in the art. Depending on the technology used to track the location of the mixed reality device 13 a or the user, other technology components such as cameras, beacons, and other access points may be required, and these are omitted from FIG. 5 for the sake of simplicity.
  • In the case where the actual device being tracked is not the mixed reality device 13 a but rather some other device (such as a smart phone in the pocket of the user), the tracked device makes its location (and by inference the location of the user and the mixed reality device 13 a) known by sending location data over the in-store wireless network to the AR/VR session manager. It should also be noted that the location of the user and mixed reality device 13 a may be determined without any location determination functionality on the mixed reality device 13 a, and without any second device (i.e., smart phone), if some other outside system (e.g., a video surveillance system with image analytics capabilities able to determine location) is available and is used to track the user's location during the AR/VR session. The user may also specify where in the store they are by some other technique, such as selecting a location on a map of the store. In another embodiment, the mixed reality device 13 a may determine its own location by capturing the image or images of items in its surroundings which have been previously mapped to the current location. Using such a location-to-image map, the mixed reality device 13 a can determine its own location. The "image" in such a case might be an actual image recorded in some convenient file format, or it might be an index or set of indices derived from the image in a manner which makes them unique to that image (i.e., an image index or hash).
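  • A minimal sketch of such a location-to-image map follows. It assumes, purely for illustration, that the "image index or hash" is an exact SHA-256 digest of the captured image; a deployed system would more likely use a perceptual or feature-based index tolerant of viewpoint changes.

      import hashlib
      from typing import Optional, Tuple

      def location_from_image(image_bytes: bytes, location_by_image_hash: dict) -> Optional[Tuple[float, float]]:
          """Estimate the device's in-store location from a previously mapped image of its surroundings."""
          key = hashlib.sha256(image_bytes).hexdigest()    # image index/hash unique to that image
          return location_by_image_hash.get(key)           # (x, y) in feet, or None if the view is unmapped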
  • Referring now to FIG. 6, an AR/VR session manager 90 is shown. The manager interacts with the mixed reality device 13 a over the Internet using a "session portal" 92, e.g., a web service application programming interface (API), or in another embodiment, a dedicated socket with SMTP or other transfer protocol. The session portal 92 is bi-directional, meaning that each of the mixed reality devices 13 a-13 c can send data to the session manager 90 and receive data from the session manager 90. The mixed reality devices (MRS) 13 a-13 c send updates on their states to the session manager 90. The states of the mixed reality devices 13 a-13 c are represented virtually or "mirrored" in a device state representation 94 inside the session manager 90.
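  • The following sketch suggests one way the device state representation 94 might mirror state updates arriving over the session portal 92; the class name and the state fields are assumptions made for illustration, not the disclosed design.

      class DeviceStateRepresentation:
          """Server-side mirror (94) of one mixed reality device's reported state."""
          def __init__(self, device_id: str):
              self.device_id = device_id
              self.state = {}                  # e.g., mode, pose/field of view, battery, active user

          def apply_update(self, update: dict) -> None:
              # Updates arrive over the bi-directional session portal 92; only the
              # fields the device reports are overwritten, the rest are retained.
              self.state.update(update)

      # Example: mirroring an update sent by device 13a over the session portal.
      mirror = DeviceStateRepresentation("13a")
      mirror.apply_update({"mode": "retail-assist", "location": (12.5, 40.0)})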
  • Input from the mixed reality devices (MRS) 13 a-13 c to the session manager 90 is used in analytic programs executed on the servers 14. For example, while cameras in the facility can be sending video feeds to the servers that send relevant data to the mixed reality devices (MRS) 13 a-13 c, cameras on the mixed reality devices 13 a-13 c may also send video. This video is analyzed by the input analyzer 96 using various techniques and informs the analytics manager 98, which provides inputs to analytic programs (not shown) executing on the servers 14. The analytics manager 98 uses the current mode and the inputs presented to it in order to decide what to present (virtually) to the user on the device viewer and what to request of the analytics executing on the servers 14.
  • Information presented is produced by the analytics manager using data received from the various analytical programs that execute various analytics both conventional as well as to be developed. The session mode manager 90 monitors the mode selected by the user (as mirrored in the device state representation) and informs the analytics manager of the selection. Session logs and notes (not referenced) can also be stored.
  • In some embodiments, the session may be logged by the input analyzer 96, including any notes or annotations provided by at least some users of the mixed reality devices 13 a-13 c, e.g., verbal or text sent from the mixed reality devices 13 a-13 c or otherwise. This local log/record in the session manager 90 may be backed up in the external database 23 or other databases (not shown) for long-term storage, reporting, and further analysis. This local session and long-term storage may also include a full record or "recording" of part or all of the session, rather than just the user notes. The mixed reality devices 13 a-13 c can be controlled via a switch on the device, a voice command, and/or a hand gesture that can be used to awaken the device (i.e., load operating system components and prepare for input) when the device senses motion, or can be used to request inputs to the device from the servers 14. The mixed reality devices 13 a-13 c may require input of a user id and password to enable further operation and interaction with the user and servers 14.
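  • A simple sketch of such a local session log, suitable for backup to database 23, is given below; the SessionLog class and its fields are illustrative assumptions rather than the disclosed design.

      import json
      import time

      class SessionLog:
          """Local session log kept by the input analyzer 96, periodically backed up for long-term storage."""
          def __init__(self, session_id: str):
              self.session_id = session_id
              self.entries = []

          def add_note(self, device_id: str, note: str) -> None:
              self.entries.append({"t": time.time(), "device": device_id, "note": note})

          def export(self) -> str:
              # Serialized form suitable for long-term storage, reporting, and further analysis.
              return json.dumps({"session": self.session_id, "entries": self.entries})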
  • Web service clients 102 provide connections to various ones of the databases 23.
  • The sensor network illustrated in FIG. 1, is an example of a network that collects and analyzes data from various sensor devices. Other configurations of servers and gateways can be used. In addition, the session manager system 90 can be implemented in the servers 14 or in local or detached server systems.
  • Servers can be any of a variety of computing devices capable of receiving information, such as a server, a distributed computing system 10, a rack-mounted server, and so forth. A server may be a single server or a group of servers that are at a same location or at different locations. Servers can receive information from client/user devices via interfaces. Interfaces can be any type of interface capable of receiving information over a network, such as an Ethernet interface, a wireless networking interface, a fiber-optic networking interface, a modem, and so forth. A server also includes a processor and memory, and a bus system including, for example, an information bus and a motherboard, which can be used to establish and to control information communication between the components of the server.
  • A processor may include one or more microprocessors. Generally, the processor may include any appropriate processor and/or logic that is capable of receiving and storing information, and of communicating over a network (not shown). Memory can include a hard drive and a random access memory storage device, such as a dynamic random access memory, computer readable hardware storage devices and media, and other types of non-transitory storage devices.
  • Embodiments can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations thereof. Computer programs can be implemented in a high-level procedural or object oriented programming language, or in assembly or machine language if desired; and in any case, the language can be a compiled or interpreted language. Suitable processors include, by way of example, both general and special purpose microprocessors. Generally, a processor will receive instructions and information from a read-only memory and/or a random access memory. Generally, a computer will include one or more mass storage devices for storing information files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks. Storage devices suitable for tangibly embodying computer program instructions and information include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM disks. Any of the foregoing can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
  • Other embodiments are within the scope and spirit of the description and claims. For example, due to the nature of software, functions described above can be implemented using software, hardware, firmware, hardwiring, or combinations of any of these. Features implementing functions may also be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations. Other embodiments are within the scope of the following claims.

Claims (16)

What is claimed is:
1. A system comprises:
r.f. detectors that detect presence of r.f. signals;
a mixed reality system comprising:
a processor device;
a memory in communication with the processor device;
a head mounted display device including a stereoscopic 3D display that renders one or more images of persons within a field of view of the mixed reality system device;
with the mixed reality system configured to send the images to one or more server computer systems;
the one or more server computer systems configured to:
process information from r.f. signals sent from r.f. devices in possession of at least some of the persons in the field of view;
analyze the processed information and the one or more images to identify at least one of the persons in the field of view;
access at least one profile corresponding to the at least one identified person; and
classify the one or more persons in the one or more images according to a status.
2. The system of claim 1 wherein the status is a message associated with the one or more images of the one or more persons.
3. The system of claim 1 wherein the servers access databases containing data on retail items, including item name, SKU number, retail price, wholesale price, location in the store and store layout information during analyzing of the processed information.
4. The system of claim 1 wherein the statuses are ascribed to persons as customers that are one of a frequent high value customer, infrequent low value customer, a habitual shoplifter.
5. The system of claim 4 wherein a criterion used to determine status results from analytically processing to characterize persons according to a ranking based on sales or profit per time period relative to other customers.
6. The system of claim 4 wherein the servers determine the classification and send the classification to the mixed reality device.
7. The system of claim 4 wherein the head mounted display device of the mixed reality device is configured to render an indicium adjacent an image of the person with the determined status.
8. The system of claim 4 wherein the servers are configured to track online browsing activity of the one or more persons,
determine products and/or services of interest to the one or more persons; and
analytically determine personalized service to direct to the one or more persons while the one or more persons are in the store.
9. The system of claim 1 wherein the statuses are ascribed to persons as customers that are one of a frequent high value customer, infrequent low value customer, a habitual shoplifter.
10. The system of claim 1 wherein the servers are further configured to:
track location of the one or more persons and the mixed reality device; and
store session data including notes or annotations provided by a user of the mixed reality devices.
11. A system comprising:
one or more server computer systems comprising:
one or more processor devices;
memory in communication with the one or more processor devices, with the one or more server computer systems configured to:
receive from a mixed reality device images of persons within a field of view of the mixed reality device;
receive and process information from r.f. signals sent from r.f. devices in possession of at least some of the persons in the field of view;
analyze the processed information and the one or more images to identify at least one of the persons in the field of view;
access at least one profile corresponding to the at least one identified person; and
classify the one or more persons in the one or more images according to a status.
12. The system of claim 11 wherein the status is a message associated with the one or more images of the one or more persons.
13. The system of claim 11 wherein the statuses are ascribed to persons as customers that are one of a frequent high value customer, infrequent low value customer, a habitual shoplifter and wherein a criterion used to determine status results from analytically processing to characterize persons according to a ranking based on sales or profit per time period relative to other customers.
14. The system of claim 11 wherein the servers send the classification to the mixed reality device that renders an indicium adjacent an image of the person with the determined status.
15. The system of claim 11 wherein the servers are configured to track online browsing activity of the one or more persons,
determine products and/or services of interest to the one or more persons; and
analytically determine personalized service to direct to the one or more persons while the one or more persons are in the store.
16. The system of claim 11 wherein the servers are further configured to:
track location of the one or more persons and the mixed reality device; and
execute a mapping algorithm that sends data to the mixed reality device with the mapping algorithm showing the one or more persons' current location in the store, via a map; and the servers push data to the mixed reality device to inform a sales associate how to advise the customer.
US15/381,262 2016-07-12 2016-12-16 Holographic Technology Implemented Retail Solutions Abandoned US20180018681A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/381,262 US20180018681A1 (en) 2016-07-12 2016-12-16 Holographic Technology Implemented Retail Solutions

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201662361053P 2016-07-12 2016-07-12
US201662361669P 2016-07-13 2016-07-13
US15/381,262 US20180018681A1 (en) 2016-07-12 2016-12-16 Holographic Technology Implemented Retail Solutions

Publications (1)

Publication Number Publication Date
US20180018681A1 true US20180018681A1 (en) 2018-01-18

Family

ID=60940733

Family Applications (7)

Application Number Title Priority Date Filing Date
US15/379,647 Active 2037-05-29 US10650593B2 (en) 2016-07-12 2016-12-15 Holographic technology implemented security solution
US15/379,657 Active US10769854B2 (en) 2016-07-12 2016-12-15 Holographic technology implemented security solution
US15/381,396 Active US10614627B2 (en) 2016-07-12 2016-12-16 Holographic technology implemented security solution
US15/381,262 Abandoned US20180018681A1 (en) 2016-07-12 2016-12-16 Holographic Technology Implemented Retail Solutions
US15/381,588 Active US10147238B2 (en) 2016-07-12 2016-12-16 Holographic technology implemented retail solutions
US15/381,555 Abandoned US20180018708A1 (en) 2016-07-12 2016-12-16 Holographic Technology Implemented Retail Solutions
US16/200,341 Active US10521968B2 (en) 2016-07-12 2018-11-26 Systems and methods for mixed reality with cognitive agents

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US15/379,647 Active 2037-05-29 US10650593B2 (en) 2016-07-12 2016-12-15 Holographic technology implemented security solution
US15/379,657 Active US10769854B2 (en) 2016-07-12 2016-12-15 Holographic technology implemented security solution
US15/381,396 Active US10614627B2 (en) 2016-07-12 2016-12-16 Holographic technology implemented security solution

Family Applications After (3)

Application Number Title Priority Date Filing Date
US15/381,588 Active US10147238B2 (en) 2016-07-12 2016-12-16 Holographic technology implemented retail solutions
US15/381,555 Abandoned US20180018708A1 (en) 2016-07-12 2016-12-16 Holographic Technology Implemented Retail Solutions
US16/200,341 Active US10521968B2 (en) 2016-07-12 2018-11-26 Systems and methods for mixed reality with cognitive agents

Country Status (1)

Country Link
US (7) US10650593B2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109658573A (en) * 2018-12-24 2019-04-19 上海爱观视觉科技有限公司 A kind of intelligent door lock system
CN109903438A (en) * 2019-02-21 2019-06-18 安徽鸿延传感信息有限公司 A kind of phonetic warning system of enterprise security risk management and control inspection
US10360572B2 (en) * 2016-03-07 2019-07-23 Ricoh Company, Ltd. Image processing system, method and computer program product for evaluating level of interest based on direction of human action
CN110417648A (en) * 2018-04-30 2019-11-05 奇邑科技股份有限公司 Multiple gateway communication means and its wireless gateway system
US10521968B2 (en) 2016-07-12 2019-12-31 Tyco Fire & Security Gmbh Systems and methods for mixed reality with cognitive agents
US20210331648A1 (en) * 2020-04-23 2021-10-28 Toyota Motor Engineering & Manufacturing North America, Inc. Tracking and video information for detecting vehicle break-in

Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA3055626A1 (en) * 2017-03-15 2018-09-20 Financial & Risk Organisation Limited Systems and methods for detecting and locating unsecured sensors in a network
US20180293596A1 (en) * 2017-04-10 2018-10-11 International Business Machines Corporation Shelf image recognition analytics
US11495118B2 (en) * 2017-06-27 2022-11-08 Oneevent Technologies, Inc. Augmented reality of a building
US10366599B1 (en) * 2017-09-15 2019-07-30 Global Tel*Link Corporation Communication devices for guards of controlled environments
US10403123B2 (en) * 2017-10-31 2019-09-03 Global Tel*Link Corporation Augmented reality system for guards of controlled environment residents
US10775776B2 (en) * 2017-11-17 2020-09-15 Accenture Global Solutions Limited Digital manufacturing with product tracking and social media analysis of the product
KR102619621B1 (en) * 2018-02-07 2023-12-29 삼성전자주식회사 Electronic device and method for communicating with chatbot
US10846532B2 (en) * 2018-02-27 2020-11-24 Motorola Solutions, Inc. Method and apparatus for identifying individuals using an augmented-reality application
US10613505B2 (en) * 2018-03-29 2020-04-07 Saudi Arabian Oil Company Intelligent distributed industrial facility safety system
CN108650245A (en) * 2018-04-24 2018-10-12 上海奥孛睿斯科技有限公司 Internet of things system based on augmented reality and operation method
US11417064B2 (en) * 2018-07-10 2022-08-16 Motorola Solutions, Inc. Method, apparatus and system for mapping an incident type to data displayed at an incident scene
US11850514B2 (en) 2018-09-07 2023-12-26 Vulcan Inc. Physical games enhanced by augmented reality
CN109144014B (en) * 2018-10-10 2021-06-25 北京交通大学 System and method for detecting operation condition of industrial equipment
US11670080B2 (en) * 2018-11-26 2023-06-06 Vulcan, Inc. Techniques for enhancing awareness of personnel
US10616419B1 (en) 2018-12-12 2020-04-07 Mitel Networks Corporation Devices, systems and methods for communications that include social media clients
WO2020131049A1 (en) * 2018-12-19 2020-06-25 Hewlett-Packard Development Company, L.P. Security detection analytics
US11030814B1 (en) * 2019-01-15 2021-06-08 Facebook, Inc. Data sterilization for post-capture editing of artificial reality effects
WO2020163530A1 (en) 2019-02-08 2020-08-13 Vulcan Inc. Devices to assist ecosystem development and preservation
CN109920203A (en) * 2019-02-12 2019-06-21 合肥极光科技股份有限公司 A kind of campus security intelligent monitor system based on technology of Internet of things
US11912382B2 (en) 2019-03-22 2024-02-27 Vulcan Inc. Underwater positioning system
WO2020219643A1 (en) * 2019-04-23 2020-10-29 Apple Inc. Training a model with human-intuitive inputs
CN110177252A (en) * 2019-05-13 2019-08-27 安徽银点电子科技有限公司 A kind of monitoring system of entering based on electronic equipment
CN111327925A (en) * 2019-06-04 2020-06-23 杭州海康威视系统技术有限公司 Data processing method and device, electronic equipment and machine-readable storage medium
FR3103955A1 (en) * 2019-11-29 2021-06-04 Orange Device and method for environmental analysis, and device and voice assistance method implementing them
US11769066B2 (en) 2021-11-17 2023-09-26 Johnson Controls Tyco IP Holdings LLP Building data platform with digital twin triggers and actions
CN111105218A (en) * 2020-01-07 2020-05-05 国网福建省电力有限公司 Power distribution network operation monitoring method based on holographic image technology
US11854046B2 (en) 2020-02-14 2023-12-26 Walmart Apollo, Llc Systems and methods for presenting augmented reality promotion indicators
CN111476171B (en) * 2020-04-09 2021-03-26 腾讯科技(深圳)有限公司 Distributed object recognition system and method and edge computing equipment
CN111541876A (en) * 2020-05-18 2020-08-14 上海未高科技有限公司 Method for realizing high-altitude cloud anti-AR technology
US11126405B1 (en) * 2020-06-19 2021-09-21 Accenture Global Solutions Limited Utilizing augmented reality and artificial intelligence to automatically generate code for a robot
CN112489390B (en) * 2020-07-13 2022-05-10 北京宏远汇通网络科技有限公司 Security node collaborative alarm method based on intelligent security
US20220122096A1 (en) * 2020-10-15 2022-04-21 International Business Machines Corporation Product performance estimation in a virtual reality environment
US11956324B2 (en) * 2021-01-07 2024-04-09 Stmicroelectronics S.R.L. Sensor device, system and method
US11893551B2 (en) 2021-04-15 2024-02-06 Bank Of America Corporation Information security system and method for augmented reality check generation
US11934966B2 (en) 2021-11-17 2024-03-19 Johnson Controls Tyco IP Holdings LLP Building data platform with digital twin inferences
US20230230109A1 (en) * 2022-01-19 2023-07-20 Martin A. Alpert Trend prediction
US20230245152A1 (en) * 2022-02-03 2023-08-03 Capital One Services, Llc Local trend and influencer identification using machine learning predictive models
US11870852B1 (en) * 2023-03-31 2024-01-09 Meta Platforms Technologies, Llc Systems and methods for local data transmission

Family Cites Families (67)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5357148A (en) 1992-12-29 1994-10-18 Sgs-Thomson Microelectronics, Inc. Device for biasing an RF device operating in quasi-linear modes with voltage compensation
US20030025599A1 (en) 2001-05-11 2003-02-06 Monroe David A. Method and apparatus for collecting, sending, archiving and retrieving motion video and still images and notification of detected events
US7023913B1 (en) 2000-06-14 2006-04-04 Monroe David A Digital security multimedia sensor
US20040075738A1 (en) 1999-05-12 2004-04-22 Sean Burke Spherical surveillance system architecture
US8520068B2 (en) 1999-07-20 2013-08-27 Comcast Cable Communications, Llc Video security system
WO2001064481A2 (en) 2000-03-02 2001-09-07 Donnelly Corporation Video mirror systems incorporating an accessory module
US20110058036A1 (en) 2000-11-17 2011-03-10 E-Watch, Inc. Bandwidth management and control
CA2327847C (en) 2000-12-07 2010-02-23 Phasys Limited System for transmitting and verifying alarm signals
US7253732B2 (en) 2001-09-10 2007-08-07 Osann Jr Robert Home intrusion confrontation avoidance system
US6970083B2 (en) 2001-10-09 2005-11-29 Objectvideo, Inc. Video tripwire
US20030158771A1 (en) 2002-01-16 2003-08-21 Ncr Corporation Retention modeling methodology for airlines
US7321386B2 (en) 2002-08-01 2008-01-22 Siemens Corporate Research, Inc. Robust stereo-driven video-based surveillance
US20050010649A1 (en) 2003-06-30 2005-01-13 Ray Payne Integrated security suite architecture and system software/hardware
US7801833B2 (en) 2003-12-22 2010-09-21 Endicott Interconnect Technologies, Inc. Item identification control method
US7249064B1 (en) 2004-01-16 2007-07-24 Carmen Billy W Method for consumer referral of products to retailers
US8965460B1 (en) 2004-01-30 2015-02-24 Ip Holdings, Inc. Image and augmented reality based networks using mobile devices and intelligent electronic glasses
US8963713B2 (en) * 2005-03-16 2015-02-24 Icontrol Networks, Inc. Integrated security network with security alarm signaling system
US20060136575A1 (en) 2004-05-11 2006-06-22 Ray Payne Integrated security suite architecture and system software/hardware
US20060179463A1 (en) 2005-02-07 2006-08-10 Chisholm Alpin C Remote surveillance
US20070011105A1 (en) 2005-05-03 2007-01-11 Greg Benson Trusted decision support system and method
WO2007038622A2 (en) 2005-09-28 2007-04-05 The Government Of The United States Of America, As Represented By The Secretary Of The Navy Open-loop controller
US8009100B2 (en) 2006-06-27 2011-08-30 Telefonaktiebolaget L M Ericsson (Publ) Radio frequency emitter detection and location method and system
US20080071559A1 (en) 2006-09-19 2008-03-20 Juha Arrasvuori Augmented reality assisted shopping
US20080189169A1 (en) 2007-02-01 2008-08-07 Enliven Marketing Technologies Corporation System and method for implementing advertising in an online social network
US8405196B2 (en) 2007-03-05 2013-03-26 DigitalOptics Corporation Europe Limited Chips having rear contacts connected by through vias to front contacts
WO2009012289A1 (en) * 2007-07-16 2009-01-22 Cernium Corporation Apparatus and methods for video alarm verification
US8180396B2 (en) 2007-10-18 2012-05-15 Yahoo! Inc. User augmented reality for camera-enabled mobile devices
EP2166493A1 (en) 2008-09-12 2010-03-24 BRITISH TELECOMMUNICATIONS public limited company Control of supply networks and verification of items
US8937658B2 (en) 2009-10-15 2015-01-20 At&T Intellectual Property I, L.P. Methods, systems, and products for security services
US8310365B2 (en) 2010-01-08 2012-11-13 Utc Fire & Security Americas Corporation, Inc. Control system, security system, and method of monitoring a location
US20120242698A1 (en) 2010-02-28 2012-09-27 Osterhout Group, Inc. See-through near-eye display glasses with a multi-segment processor-controlled optical layer
US20110213664A1 (en) 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US20120249797A1 (en) 2010-02-28 2012-10-04 Osterhout Group, Inc. Head-worn adaptive display
WO2011112941A1 (en) 2010-03-12 2011-09-15 Tagwhat, Inc. Purchase and delivery of goods and services, and payment gateway in an augmented reality-enabled distribution network
US20110254680A1 (en) 2010-04-16 2011-10-20 Infrasafe, Inc. Security monitoring system
KR101016556B1 (en) 2010-05-06 2011-02-24 전성일 Method, server and computer-readable recording medium for accessing information on person using augmented reality
US10456209B2 (en) 2010-10-13 2019-10-29 Gholam A. Peyman Remote laser treatment system with dynamic imaging
US20140002236A1 (en) 2010-12-02 2014-01-02 Viscount Security Systems Inc. Door Lock, System and Method for Remotely Controlled Access
KR101292463B1 (en) 2011-01-27 2013-07-31 주식회사 팬택 Augmented reality system and method that share augmented reality service to remote
US8898091B2 (en) 2011-05-11 2014-11-25 Ari M. Frank Computing situation-dependent affective response baseline levels utilizing a database storing affective responses
US8223088B1 (en) 2011-06-09 2012-07-17 Google Inc. Multimode input field for a head-mounted display
TWI461721B (en) 2012-03-16 2014-11-21 Quadlink Technology Inc Object detection device and method thereof
US9148341B2 (en) 2012-03-26 2015-09-29 Jds Uniphase Corporation Upgrading a programmable logic gate array in an in-service pluggable transceiver
US20140081858A1 (en) 2012-09-14 2014-03-20 Diebold Self-Service Systems Division Of Diebold, Incorporated Banking system controlled responsive to data read from data bearing records
US9607025B2 (en) 2012-09-24 2017-03-28 Andrew L. DiRienzo Multi-component profiling systems and methods
US10977701B2 (en) 2012-12-04 2021-04-13 Crutchfield Corporation Techniques for providing retail customers a seamless, individualized discovery and shopping experience between online and brick and mortar retail locations
US10110805B2 (en) 2012-12-06 2018-10-23 Sandisk Technologies Llc Head mountable camera system
US9852381B2 (en) 2012-12-20 2017-12-26 Nokia Technologies Oy Method and apparatus for providing behavioral pattern generation for mixed reality objects
US9721373B2 (en) 2013-03-14 2017-08-01 University Of Southern California Generating instructions for nonverbal movements of a virtual character
US10243786B2 (en) 2013-05-20 2019-03-26 Citrix Systems, Inc. Proximity and context aware mobile workspaces in enterprise systems
US9630098B2 (en) * 2013-06-09 2017-04-25 Sony Interactive Entertainment Inc. Head mounted display
US20150020086A1 (en) 2013-07-11 2015-01-15 Samsung Electronics Co., Ltd. Systems and methods for obtaining user feedback to media content
WO2015088057A1 (en) 2013-12-10 2015-06-18 엘지전자 주식회사 3d camera module
US9384656B2 (en) 2014-03-10 2016-07-05 Tyco Fire & Security Gmbh False alarm avoidance in security systems filtering low in network
US9715613B2 (en) 2014-05-02 2017-07-25 The Boeing Company Systems and methods for use in authenticating an object
US20150317418A1 (en) 2014-05-02 2015-11-05 Honeywell International Inc. Providing three-dimensional monitoring of a facility
US9600069B2 (en) 2014-05-09 2017-03-21 Google Inc. Systems and methods for discerning eye signals and continuous biometric identification
US20160070343A1 (en) 2014-09-09 2016-03-10 Beijing Lenovo Software Ltd. Information processing method and electronic device
US9945928B2 (en) 2014-10-30 2018-04-17 Bastille Networks, Inc. Computational signal processing architectures for electromagnetic signature analysis
IL236752B (en) 2015-01-15 2019-10-31 Eran Jedwab An integrative security system and method
US10300361B2 (en) 2015-01-23 2019-05-28 Playsight Interactive Ltd. Ball game training
CN107211195B (en) 2015-02-12 2020-04-24 日商可乐普拉股份有限公司 Apparatus and system for viewing and listening to content using head mounted display
KR102348812B1 (en) * 2015-03-09 2022-01-07 삼성전자주식회사 User information processing method and electronic device supporting the same
AU2016228525B2 (en) 2015-03-12 2021-01-21 Alarm.Com Incorporated Virtual enhancement of security monitoring
US10650593B2 (en) 2016-07-12 2020-05-12 Tyco Fire & Security Gmbh Holographic technology implemented security solution
US10540550B2 (en) 2017-03-20 2020-01-21 Mastercard International Incorporated Augmented reality systems and methods for service providers
US11270510B2 (en) 2017-04-04 2022-03-08 David Peter Warhol System and method for creating an augmented reality interactive environment in theatrical structure

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10360572B2 (en) * 2016-03-07 2019-07-23 Ricoh Company, Ltd. Image processing system, method and computer program product for evaluating level of interest based on direction of human action
US10521968B2 (en) 2016-07-12 2019-12-31 Tyco Fire & Security Gmbh Systems and methods for mixed reality with cognitive agents
US10614627B2 (en) * 2016-07-12 2020-04-07 Tyco Fire & Security Gmbh Holographic technology implemented security solution
US10650593B2 (en) 2016-07-12 2020-05-12 Tyco Fire & Security Gmbh Holographic technology implemented security solution
US10769854B2 (en) 2016-07-12 2020-09-08 Tyco Fire & Security Gmbh Holographic technology implemented security solution
CN110417648A (en) * 2018-04-30 2019-11-05 奇邑科技股份有限公司 Multiple gateway communication means and its wireless gateway system
CN109658573A (en) * 2018-12-24 2019-04-19 上海爱观视觉科技有限公司 Intelligent door lock system
CN109903438A (en) * 2019-02-21 2019-06-18 安徽鸿延传感信息有限公司 Voice warning system for enterprise security risk management and control inspection
US20210331648A1 (en) * 2020-04-23 2021-10-28 Toyota Motor Engineering & Manufacturing North America, Inc. Tracking and video information for detecting vehicle break-in
US11945404B2 (en) * 2020-04-23 2024-04-02 Toyota Motor Engineering & Manufacturing North America, Inc. Tracking and video information for detecting vehicle break-in

Also Published As

Publication number Publication date
US20180018824A1 (en) 2018-01-18
US10650593B2 (en) 2020-05-12
US20190139315A1 (en) 2019-05-09
US10769854B2 (en) 2020-09-08
US20180018823A1 (en) 2018-01-18
US20180018708A1 (en) 2018-01-18
US10614627B2 (en) 2020-04-07
US20180018867A1 (en) 2018-01-18
US20180018861A1 (en) 2018-01-18
US10147238B2 (en) 2018-12-04
US10521968B2 (en) 2019-12-31

Similar Documents

Publication Publication Date Title
US20180018681A1 (en) Holographic Technology Implemented Retail Solutions
US20200211347A1 (en) Automatic detection of zones of interest in a video
US10706446B2 (en) Method, system, and computer-readable medium for using facial recognition to analyze in-store activity of a user
JP6688317B2 (en) Utilization of IoT (Internet of Things) to enhance interaction between user and physical object
AU2018333873B2 (en) System and method for classifying passive human-device interactions through ongoing device context awareness
JP5958723B2 (en) System and method for queue management
CN110710190B (en) Method, terminal, electronic device and computer-readable storage medium for generating user portrait
US10080129B2 (en) Method and apparatus for integrated tracking of visitors
US20160357762A1 (en) Smart View Selection In A Cloud Video Service
US20170034483A1 (en) Smart shift selection in a cloud video service
CN107851243A (en) Inferring physical meeting location
US9648116B2 (en) System and method for monitoring mobile device activity
US20240236630A9 (en) Low energy network
US20170132648A1 (en) Anonymous reporting of multiple venue location data
Mondal Application of IOT in Library
US20190362406A1 (en) Executing digital actions in a retail environment
Devare Analysis and design of IoT based physical location monitoring system
JP2005332127A (en) Status prediction system by personal profile and method, program and recording medium
US11113746B1 (en) Method, medium, and system for automated product identification
US20150302439A1 (en) System and method for monitoring mobile device activity
Solti et al. Privacy in location-sensing technologies
Adegoke INTERNET OF THINGS (IOT)
Mitra et al. A heuristic approach for mobile phone based information management system in wireless sensor network

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: TYCO FIRE & SECURITY GMBH, SWITZERLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RASBAND, PAUL B.;REEL/FRAME:044513/0568

Effective date: 20171006

AS Assignment

Owner name: TYCO FIRE & SECURITY GMBH, SWITZERLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LOCKE, ROBERT B.;REEL/FRAME:044581/0741

Effective date: 20180104

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION