US20140223463A1 - System and method for user monitoring and intent determination - Google Patents
- Publication number
- US20140223463A1 (application Ser. No. 14/178,859)
- Authority
- US
- United States
- Prior art keywords
- home entertainment
- event
- recited
- created
- event record
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44204—Monitoring of content usage, e.g. the number of times a movie has been viewed, copied or the amount which has been watched
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42203—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] sound input device, e.g. microphone
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/433—Content storage operation, e.g. storage operation in response to a pause request, caching operations
- H04N21/4333—Processing operations in response to a pause request
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/441—Acquiring end-user identification, e.g. using personal code sent by the remote control or by inserting a card
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44222—Analytics of user selections, e.g. selection of programs or purchase activity
- H04N21/44224—Monitoring of user activity on external systems, e.g. Internet browsing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/466—Learning process for intelligent management, e.g. learning user preferences for recommending movies
- H04N21/4667—Processing of monitored end-user data, e.g. trend analysis based on the log file of viewer selections
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/485—End-user interface for client configuration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/485—End-user interface for client configuration
- H04N21/4852—End-user interface for client configuration for modifying audio parameters, e.g. switching between mono and stereo
Definitions
- Home entertainment systems comprised of plural appliances and/or the controlling devices used to issue commands to such appliances may be provisioned with devices for detecting user presence and/or user interaction via methods such as gesture, spoken voice, facial recognition, spatial analysis, etc., as known in the art.
- personal communication devices such as smart phones, tablet computers, etc., may provide additional means for identification of user presence via detection of such personal communication devices on a local wireless network such as a WiFi network, a Bluetooth network, etc.
- a central routing appliance such as an AV receiver, set top box, smart TV, etc.
- sensing interfaces such as an image sensing interface (e.g., an interface associated with a camera), a sound sensing interface (e.g., an interface associated with microphone), and/or an interface for sensing the presence of an RF device such as a smart phone may be used to fully or partially automate a system response to common events which may occur during a TV viewing session, such as a user or users leaving or entering the viewing area, a user answering a telephone call, the detection of a doorbell ringing or a baby monitor alarm, etc.
- data derived from such sensing interfaces may be utilized to enhance the responsiveness of one or more system components, for example by sensing when a user is reaching for a physical remote control unit or preparing a component to issue a voice or gesture command.
- user presence data derived from such sensing interfaces may be used by a central routing appliance in conjunction with media stream information in order to capture and report user viewing habits and/or preferences.
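As an illustration of how user presence data and media stream information might be combined, the following Python sketch builds a viewing-habit record of the kind that could be reported to a database server; all names (ViewingRecord, close_record) and fields are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class ViewingRecord:
    """Hypothetical record combining stream info with user presence."""
    media_source: str                 # e.g. "STB", "DVD", "streaming"
    channel_or_title: str
    start_time: float                 # epoch seconds
    users_present: set = field(default_factory=set)
    duration: float = 0.0

def close_record(record: ViewingRecord, end_time: float) -> ViewingRecord:
    """Finalize a record when the stream changes or viewing ends."""
    record.duration = end_time - record.start_time
    return record

rec = ViewingRecord("STB", "channel 7", start_time=1000.0,
                    users_present={"user_a", "user_b"})
rec = close_record(rec, end_time=4600.0)   # rec.duration == 3600.0
```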
- FIG. 1 illustrates an exemplary system in which the teachings of the subject invention may be utilized
- FIG. 2 illustrates a further exemplary system in which the teachings of the subject invention may be utilized
- FIG. 3 illustrates, in block diagram form, an exemplary hardware architecture for an appliance which may be a component part of the systems illustrated in FIGS. 1 and 2 ;
- FIG. 4 illustrates, in block diagram form, an exemplary software architecture for the illustrative appliance of FIG. 3 ;
- FIG. 5 illustrates, in flowchart form, the operation of an exemplary event processing module of the software architecture illustrated in FIG. 4 .
- an exemplary home entertainment system in which the methods of the subject invention may be applied may comprise an AV receiver 100 which serves as a hub for directing a selected video and/or audio media stream from a source appliance such as, for example, a satellite or cable system set top box and/or DVR device (“STB”) 104 , a DVD player 106 , a CD player 108 , or a game console 110 to a destination appliance, such as a TV set 102 , where the selected video and/or audio media stream is to be rendered.
- connections 130 , 132 between appliances 102 through 110 and AV receiver 100 may generally comprise connections for carrying HDMI-compliant digital signals, although it will be appreciated that other interface standards such as component video, PCM audio, etc., may be substituted where necessitated by the limitations of a particular appliance.
- additional AV content streams may also be available from a streaming service 118 such as for example Netflix, Vudu, YouTube, NBC online, etc., via a wide area network such as the Internet 116 , to which end AV receiver 100 may be provisioned with a connection 112 to an Internet gateway device such as for example router 114 .
- connection between AV receiver 100 and Internet gateway device 114 may be wired as illustrated, or may be wireless, e.g., a WiFi local area network, as appropriate.
- an exemplary home entertainment system may also be provisioned with one or more sensing interfaces, such as interfaces associated with microphone 120 and camera 122 , suitable for the capture of audible and/or visible events within the home entertainment system environment.
- Users 124 , 126 of the illustrative entertainment system may select the media stream currently being viewed by any convenient method, such as through use of a remote control as known in the art, a voice command, a gesture, etc.
- data regarding these selections, including without limitation media source, channel, track, title, etc., together with viewing duration, user presence, etc., may be accumulated and reported to a database server 128 for the aggregation and analysis of user viewing habits and preferences as discussed in greater detail below.
- a “smart” TV device 200 may incorporate both content rendering and source stream selection functionality.
- local appliances 104 through 110 may be connected directly to multiple input ports of TV 200 via, for example, HDMI connections 130 .
- TV 200 may also support a connection 112 to a wide area network such as the Internet 116 over which streaming AV content and other data may be received.
- the means for user command input to TV 200 and appliances 104 through 110 may take the form of a controlling device 204 , for example a conventional remote control or a smart phone app in communication with the appliances via any convenient infrared (IR), radio frequency (RF), hardwired, point-to-point, or networked protocol, as necessary to cause the respective target appliances to perform the desired operational functions.
- user input may also comprise spoken and/or gestured commands in place of or supplemental to controlling device signals, which sounds and gestures may be received by microphone 120 and camera 122 , processed and decoded by one of the appliances, for example TV 200 , and where necessary relayed to other target appliances via, for example, HDMI CEC commands, IR or RF signals, etc., as described for example in co-pending U.S. patent application Ser. No. 13/657,176 “System and Method for Optimized Appliance Control,” of common ownership and incorporated by reference herein in its entirety.
- programming for an exemplary central routing appliance, such as smart TV appliance 200 , may be stored within the memory 302 (hereafter the "TV programming") for execution by TV engine and media processor(s) 300 .
- An exemplary architecture for such TV programming is presented in FIG. 4 .
- the exemplary TV programming may include, as required for a particular application, an underlying operating system 402 , such as for example LINUX, which may support a set of software modules implementing the various functionalities of the smart TV device.
- Such software modules may include a hardware abstraction layer 404 to provide a device independent interface between the various application software modules and the hardware dependent software modules such as video output driver 406 , audio output driver 408 , HDMI interface 410 , analog input/output ADC/DAC 412 , Ethernet and/or WiFi interface 414 , Bluetooth interface 416 ; USB interface 418 , remote control interface 420 , and camera and microphone drivers 422 and 423 .
- Exemplary application modules which reside above abstraction layer 404 may include, as required for a particular embodiment, transport and session layer protocol and interface management 428 ; AV output management 424 ; input/output processing and routing 440 ; a miscellaneous services module 426 to support closed captioning, display configuration, OSD, etc.; remote control command decoder 430 ; and device resource management 442 .
- the exemplary TV programming may include audio and visual event detector modules 432 , 434 , for example voice and/or image recognition engines or the like; user event processing 436 ; and a user statistics gathering and reporting module 438 .
- smart TV appliance 200 may for example receive an incoming AV media stream from one of the input ports 306 , 308 to be processed, buffered, separated into audio and video components, and routed to outputs 318 , 320 for rendering on TV display screen 322 and loudspeaker(s) 324 ; may receive commands from remote control interface 316 which are decoded and acted upon, for example to select an input media stream, adjust audio volume, etc.; may manage a connection to the Internet through Ethernet or WiFi interface 310 to enable browsing for content, download of software updates, video telephony utilizing inputs from camera 122 and microphone 120 ; etc.
- the exemplary TV programming may receive and process input signals from controlling device 204 , camera 122 and/or microphone 120 in order to detect user presence, identify individual users, and/or receive user command input, as will be described hereafter.
- the source of audio input signals may comprise a microphone 120 and associated interface 314 provisioned as part of a smart TV appliance 200
- audio input signals may be captured by any other appliance in the system and forwarded to appliance 200 for processing, or may originate from a microphone provisioned in a controlling device such as remote control or smartphone 204 , the output of which microphone may, by way of example, be digitized and/or processed by controlling device 204 and wirelessly forwarded to smart TV appliance 200 via remote control interface 326 , WiFi interface 310 , Bluetooth interface 328 , or any other means as appropriate for a particular implementation.
- the user event processing module 436 of the TV programming of TV appliance 200 may act as illustrated in the flowchart of FIG. 5 upon occurrence of a user-related event.
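The top-level dispatch of FIG. 5 — classifying each reported event and routing it to the appropriate handling branch — can be sketched as follows; the event encoding and handler labels are assumptions made for illustration.

```python
def process_event(event: dict) -> str:
    """Route a reported event to a handling branch, per the FIG. 5 flow.
    Return values are illustrative handler labels, not patent terminology."""
    etype = event.get("type")
    if etype == "remote_command":    # step 502: remote control command
        return "execute_command"
    if etype == "image_change":      # image change reported by module 432
        return "analyze_image"
    if etype == "sound_event":       # sound event reported by module 434
        return "analyze_sound"
    if etype == "wireless_device":   # device joining/dropping a LAN or PAN
        return "update_presence"
    return "other"                   # any other embodiment-specific event
```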
- at step 502 it may be determined if the event constitutes receipt of a remote control command, as may be reported by remote control command decoder 430 .
- remote control commands may be received via any or all of RC interface 420 (e.g., infrared or RF4CE signals or the like), Ethernet/WiFi interface 414 , or Bluetooth interface 416 , depending on the particular embodiment and the particular controlling device currently in use.
- the requested functional operation may be executed.
- such operations may include adjustment of output audio volume to be performed by AV management module 424 , selection of a new media input stream to be performed by I/O and routing module 440 , etc.
- received remote control commands may also comprise requests to direct the functional operation of other connected appliances, for example control of DVD player 106 or STB 104 via CEC commands to be issued by interface management module 428 over HDMI connections 130 ; or alternatively via a connected IR blaster or a LAN as described for example in co-pending U.S. patent application Ser. No.
- at step 542 it may next be determined if the command function just performed comprised a change in the media stream being rendered by TV 200 , for example selection of a new input port 306 , 308 ; a change to the selected broadcast channel or DVR playback of STB 104 ; a change to an internet media source, etc. If so, at step 544 data regarding this event may be conveyed to user statistics module 438 for logging and ultimate reporting to database server 128 .
- the data logged regarding the new content stream may comprise some or all of the command parameters themselves, e.g., an STB channel number and timestamp; metadata items obtainable from the content source device or channel such as a DVD title, streaming video URL, etc.; a sample of the audio or video content itself for analysis as is known in the art and described, for example, in U.S. Pat. Nos. 7,986,913, 7,627,477 or 7,346,512; or any other data which may be suitable for identification purposes.
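A content-event log entry assembled from the data sources listed above (command parameters, source metadata, or a content sample for fingerprint-style identification) might look like the following sketch; all field names are illustrative.

```python
import time

def build_content_event(channel=None, metadata=None, sample=None):
    """Assemble a hypothetical content-event record from whichever
    identification data is available for the new stream."""
    entry = {"timestamp": time.time(), "class": "content"}
    if channel is not None:
        entry["channel"] = channel    # e.g. an STB channel number
    if metadata is not None:
        entry["metadata"] = metadata  # e.g. DVD title, streaming video URL
    if sample is not None:
        entry["sample"] = sample      # AV excerpt for content matching
    return entry

event = build_content_event(channel=7, metadata={"title": "example"})
```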
- at step 546 it may next be determined if the executed function comprised a reportable event, such as for example powering TV 200 (or any other connected device) on or off, issuing a fast forward command during DVR playback, etc. If so, this event may also be reported to user statistics module 438 for logging, after which processing of the remote control command is complete.
- reportable appliance events such as powering an attached device on or off may also be separately initiated via for example direct communication to an appliance from its own remote control. Accordingly, though not illustrated, where such events are detectable, for example via HDMI status, absence or presence of video or audio signals, etc., these events may also be reported and logged.
- the event processing may next determine if the reported event constitutes an image change event reported by visual event detection module 432 in response to analysis of image data received from camera 122 via camera driver 422 .
- image processing may utilize for example the techniques described in U.S. Pat. Nos. 5,534,917, 6,829,384, 8,274,535, WIPO (PCT) patent application publication WO2010/057683A1, or the like, and may for example periodically monitor an image comprising the field of view of camera 122 in order to initiate image analysis in response to detection of any variation in image data which exceeds a certain threshold value.
- the event processing 436 may be adapted to issue a “pause” command to the source of the current media stream.
- Other actions may include without limitation activating the recording function of a DVR, logging off a Web site, etc., as appropriate for a particular embodiment and configuration.
- the event processing may cause display of a request for confirmation on TV screen 322 , e.g. “Would you like to pause this show? (Y/N).” If confirmed by the user at step 522 , which confirmation may take the form of a gesture, spoken command, remote control input, etc., or, in those embodiments where the default is to take action, a timeout, at step 528 the indicated action may be executed.
- the performance accuracy of audio and/or visual event detection modules 432 , 434 may be improved by indicating a range of possible responses (e.g., “yes” or “no” in this instance) to these modules in advance, thereby limiting the number of sound or gesture templates which need to be matched.
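The accuracy optimization described above — telling the recognition module the limited set of valid responses in advance, so fewer templates need matching — can be sketched as follows; the class and vocabulary are hypothetical.

```python
class SpeechRecognizer:
    """Toy recognizer whose matchable vocabulary can be narrowed in
    advance of a prompt, per the optimization described in the text."""
    def __init__(self, templates):
        self.templates = set(templates)   # full vocabulary
        self.active = set(templates)      # currently matchable subset

    def expect(self, responses):
        """Limit matching to responses valid for the pending prompt."""
        self.active = self.templates & set(responses)

    def match(self, utterance):
        return utterance if utterance in self.active else None

recognizer = SpeechRecognizer({"yes", "no", "pause", "mute", "channel up"})
recognizer.expect({"yes", "no"})   # a Y/N confirmation prompt is pending
# recognizer.match("pause") is None; recognizer.match("yes") == "yes"
```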
- the sound level of TV audio output 320 may be temporarily lowered to reduce background noise.
- data regarding the change in users, including user identity where this is determinable, for example via use of techniques such as those described in U.S. Pat. Nos. 7,551,756, 7,702,599, or the like, may be conveyed to statistic gathering module 438 for logging, after which processing is complete.
- the event processor may next determine if the reported event comprises the arrival of a new or additional user in the TV viewing environment and if so take appropriate action.
- some embodiments may be adapted to allow a viewer to invoke a “private viewing” status which may cause the current content to be automatically muted, paused, switched, etc. in the event an additional user enters the viewing environment.
- at step 526 it may be determined if such a status currently exists. If so, at step 528 the appropriate action may be executed, after which data regarding the user arrival, including user identity where this is determinable, may be conveyed to statistic gathering module 438 for logging, and event processing is complete.
- re-entry of a user into a viewing environment may trigger an offer to resume playback of previously paused content; or in a multi-room, multi-device household in which appliances are networked together and equipped with viewer recognition, the entry of a user into one viewing environment may cause the event processor in that environment to query other event processors and/or statistic modules elsewhere in the household to determine if that user has recently departed another environment, and if so, offer to resume playback of a content stream which was previously paused in that other environment.
- if the detected image change event is not a user arrival or departure, at steps 532 , 534 and 536 it may next be determined if the reported event comprises a detectable gesture, a detectable gesture in this context comprising a user action or motion which either by pre-programming or a learning process has been identified to visual event detection module 432 as having significance as user input. If the reported event is determined to be associated with an operational command function, for example "pause", "mute", "channel up", etc., processing may continue at step 540 to execute the command as described previously. If the reported event is determined to be a preparatory gesture, appropriate anticipatory action may be taken at step 538 .
- a preparatory gesture may comprise without limitation any preliminary motion or gesture by a user which may be interpreted as a possible indication of that user's intent to perform an action, for example standing up, reaching for or setting down a remote control device, beckoning an out of sight person to enter the room, picking up a phone, etc.
- Anticipatory actions may include for example pre-conditioning visual and/or audio detection modules 432 , 434 to favor certain subsets of templates for matching purposes; modifying an on-screen menu from a format optimized for gesture control to one optimized for navigation via a keyboarded device such as a remote control, or vice-versa; reducing or muting audio volume; signaling a remote control device to exit a quiescent state, to turn on or shut off its backlight, or the like; etc.
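A minimal sketch of mapping preparatory gestures to anticipatory actions, assuming a simple lookup table; every gesture and action name here is an illustrative stand-in for the examples listed above.

```python
# Hypothetical preparatory-gesture -> anticipatory-action table.
ANTICIPATORY_ACTIONS = {
    "reach_for_remote": ["wake_remote", "menu_to_keyed_navigation"],
    "set_down_remote":  ["menu_to_gesture_control"],
    "stand_up":         ["precondition_detectors_for_departure"],
    "pick_up_phone":    ["reduce_volume"],
}

def on_preparatory_gesture(gesture: str) -> list:
    """Return the anticipatory actions for a recognized preparatory
    gesture, or an empty list if the gesture has no assigned actions."""
    return ANTICIPATORY_ACTIONS.get(gesture, [])
```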
- the event processing may next determine if the reported event constitutes a sound recognition event reported by audio event detection module 434 .
- Speech or sound recognition by module 434 may utilize for example the techniques described in U.S. Pat. No. 7,603,276, WIPO (PCT) patent application publication WO2002/054382A1, or the like.
- at step 512 it may be determined if the decoded sound constitutes a voice command issued by a user. If so, processing may continue at step 540 to execute the desired command as described previously. If not a voice command, at step 514 it may be determined if the reported event constitutes a trigger sound.
- a trigger sound may be an instance of a non-vocal audio signal received via microphone 120 which either by pre-programming or via a learning process has been assigned a command or an anticipatory action.
- trigger sounds may include a phone or doorbell ringing, a baby monitor, smoke alarm, microwave chime, etc. If the reported sound event is a trigger sound, the appropriate action, such as muting the television, etc., may be taken at step 516 .
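The sound-event branch (steps 512 through 516) — first checking for a voice command, then for a learned trigger sound with an assigned action — might be sketched as follows; the vocabularies and actions are made-up examples.

```python
# Illustrative vocabularies; in practice these would be pre-programmed
# or learned, per the text.
VOICE_COMMANDS = {"pause", "mute", "channel up"}
TRIGGER_SOUNDS = {"doorbell": "pause", "phone_ring": "mute",
                  "baby_monitor": "mute", "smoke_alarm": "pause"}

def handle_sound_event(decoded: str) -> str:
    """Dispatch a decoded sound per steps 512-516 of FIG. 5."""
    if decoded in VOICE_COMMANDS:       # step 512: voice command
        return f"execute:{decoded}"
    if decoded in TRIGGER_SOUNDS:       # step 514: trigger sound
        return TRIGGER_SOUNDS[decoded]  # step 516: take assigned action
    return "ignore"
```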
- the system can also be programmed to recognize various sounds and/or spoken words/phrases as being indicative of a preparatory event, whereupon one or more of the system components will be readied via an anticipatory action as described above.
- a spoken phrase such as "let's see what else is on" may be recognized as a preparatory event, whereupon an anticipatory action may be executed to place a remote control device into a state wherein the remote control device is readied to receive input in anticipation of its use. Similarly, a spoken phrase such as "come here," the sound of a doorbell, or the like may be recognized as a preparatory event, whereupon an anticipatory action may be executed to ready the system to look for an anticipated event, e.g., a specific gesture such as a user standing up, leaving the viewing area, etc., whereupon the appropriate response action to the sensed event that was anticipated, e.g., pausing the media, may be performed.
- the system may execute, as needed, further actions, such as a restorative action, to place the system into a desired state, e.g., to return one or more components of the home entertainment system to a state where the component(s) are no longer looking for the occurrence of the anticipated event.
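The preparatory-event sequence described above (a preparatory event arms the system to look for an anticipated event, whose occurrence triggers a response action followed by a restorative action) can be modeled as a small state machine; the state names, event labels, and the pause response are illustrative assumptions.

```python
class AnticipationStateMachine:
    """Toy model of the preparatory / anticipated / restorative cycle."""
    def __init__(self):
        self.state = "idle"
        self.expected = None

    def preparatory(self, expected_event):
        """A preparatory event arms the system to look for a specific
        anticipated event (e.g. "come here" -> watch for standing up)."""
        self.state, self.expected = "armed", expected_event

    def observe(self, event):
        """If the anticipated event occurs while armed, perform the
        response action and restore the idle (not-looking) state."""
        if self.state == "armed" and event == self.expected:
            self.state, self.expected = "idle", None  # restorative action
            return "pause_media"                      # response action
        return None

machine = AnticipationStateMachine()
machine.preparatory("user_stands_up")   # e.g. after hearing "come here"
```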
- the event processing may next determine if the reported event comprises a wireless device such as for example a smart phone, tablet computer, game controller, etc., joining into or dropping from a LAN or PAN associated with smart TV 200 or other appliance in the system. If such devices have been previously registered with the TV programming, such activity may be used to infer the presence or absence of particular users. Such information may then be processed in a similar manner to user image detection (e.g., processed as a user being added or departing) continuing at step 518 as previously described.
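Inferring user presence from registered wireless devices joining or leaving the local network might be sketched as follows; the device registry, identifiers, and return values are invented for illustration.

```python
# Hypothetical registry of previously registered personal devices.
REGISTERED = {"aa:bb:cc:01": "alice", "aa:bb:cc:02": "bob"}
present = set()   # users currently inferred to be in the environment

def on_network_event(device_id: str, joined: bool):
    """Update inferred presence when a device joins or drops from the
    LAN/PAN; unregistered devices are ignored."""
    user = REGISTERED.get(device_id)
    if user is None:
        return None
    (present.add if joined else present.discard)(user)
    return ("arrival" if joined else "departure", user)
```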
- the event processor may process any other event report activity as consistent with a particular embodiment.
- users may be provisioned with personal remote control devices which embed user identification data in their command transmissions, such as described in co-pending U.S. patent application Ser. No. 13/225,635 “Controlling Devices Used to Provide an Adaptive User Interface,” of common ownership and incorporated by reference herein in its entirety, in which case user presence reporting events may be generated by remote control interface 420 .
- technologies such as infrared body heat sensing, as proposed in the art for use in automatic unattended power-off applications, may be further adapted for the purposes described herein.
- Additional sources of activity events may also include data received from other household equipment such as security, lighting, or HVAC control systems equipped with occupancy sensors; entryway cameras; driveway sensors, etc., where appropriate.
- Statistic gathering module 438 may be adapted to report the data conveyed to it during the event processing steps described above to a centralized service, e.g., hosted on Internet connected server device 128 , for aggregation and analysis of user viewing habits and preferences. Depending on the particular embodiment such reporting may be performed on an event-by-event basis, or alternatively the data may be accumulated and reported at predetermined time intervals or only upon receipt of a request from the server device.
- data reported to statistic gathering module 438 may be formatted into several different event record classes and types for uploading to server device 128 .
- Exemplary record classes may comprise user events, e.g., as may be reported at step 530 of FIG. 5 ; appliance events, e.g., as may be reported at step 548 of FIG. 5 ; and content events, e.g., as may be reported at step 544 of FIG. 5 .
- user events e.g., as may be reported at step 530 of FIG. 5
- appliance events e.g., as may be reported at step 548 of FIG. 5
- content events e.g., as may be reported at step 544 of FIG. 5 .
- different or additional event classes may be appropriate in alternate embodiments and accordingly the above classifications are presented by way of example only and without limitation.
- user event record types may include addition (i.e., arrival) of user to the viewing area and deletion (i.e., departure) of a user from the viewing area.
- each of these record types may include timestamp data and a user ID.
- the timestamp illustrated is suitable for use in applications where the server device is already aware of the geographical location of the reporting system, e.g., as a result of an initial setup procedure, by URL decoding, etc.
- the timestamp field may include additional data such as a time zone, zip code, etc., where required.
- User identity may be any item of data which serves to uniquely identify individual users to facilitate viewing habit and preference analysis at server 128 .
- user ID data may comprise identities explicitly assigned during a setup/configuration process; random numbers assigned by the system as each distinct user is initially detected; a hash value generated by a facial or voice recognition algorithm; a MAC address or serial number assigned to a smart phone or tablet computer; etc. as appropriate for a particular embodiment.
- appliance event records may comprise record types indicative of events reported to, functional commands issued to, and/or operations performed by various controlled appliances, e.g., as reported at step 548 of FIG. 5 .
- Such events may include without limitation appliance power on/off commands, playback control commands, etc., as necessary for a complete understanding of user viewing habits and preferences.
- fast forward commands may also be logged in order to monitor commercial skipping activity.
- each appliance event record type may also include a timestamp field as described above and an appliance type/ID field comprising an appliance type indicator (e.g., STB/DVR, DVD, Internet stream, etc.) together with a unique ID or subtype value which may be assigned by the event monitoring and/or statistics gathering module in order to distinguish between multiple appliances of the same type, e.g., a household with multiple DVD players, or with both cable and satellite STBs.
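An appliance event record of the kind described above might be sketched as follows. The field names, the registry scheme, and the key strings are hypothetical; the application does not fix any particular encoding:

```python
from dataclasses import dataclass
import time

@dataclass(frozen=True)
class ApplianceEventRecord:
    timestamp: float        # seconds since epoch; a time zone or zip code field could be appended
    appliance_type: str     # e.g. "STB/DVR", "DVD", "Internet stream"
    appliance_id: int       # distinguishes multiple appliances of the same type
    event: str              # e.g. "power_on", "power_off", "fast_forward"

class ApplianceRegistry:
    """Hands out a unique ID per physical appliance, as the event monitoring
    or statistics gathering module might, so that e.g. two DVD players in one
    household remain distinguishable."""
    def __init__(self):
        self._ids = {}

    def id_for(self, appliance_key: str) -> int:
        # Insert-if-missing: a known appliance keeps its ID, a new one gets the next.
        return self._ids.setdefault(appliance_key, len(self._ids) + 1)

registry = ApplianceRegistry()
rec = ApplianceEventRecord(time.time(), "DVD", registry.id_for("dvd-livingroom"), "fast_forward")
```

Logging fast forward commands this way supports the commercial-skipping analysis mentioned above.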
- content event record types may include without limitation a channel or track change information record type; a title information record type containing, for example, a show title retrieved from program guide data, DVD or video-on-demand title information, etc.; a metadata record type containing metadata values obtained from a DVD or CD, streaming video service, or the like; a content sample record type containing a sample clip of audio and/or video content for comparison against a content identification database; or in alternate embodiments any other data which may be utilized in determining the identity of a particular content stream.
- Each record type may comprise timestamp and source appliance fields as described above, together with a field containing identity data, which may comprise numeric, text, or binary data as necessary.
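Assuming concrete field names (the text prescribes none), a content event record combining the fields above might be sketched as:

```python
from dataclasses import dataclass
import time

@dataclass(frozen=True)
class ContentEventRecord:
    """Sketch of a content event record: timestamp and source appliance as
    described above, plus a record type and an identity payload which may be
    numeric, text, or binary as necessary."""
    timestamp: float
    source_appliance: str   # hypothetical label, e.g. "STB/DVR-1"
    record_type: str        # e.g. "channel_change", "title", "metadata", "sample"
    identity: object        # channel number, show title, metadata values, or a content clip

channel_evt = ContentEventRecord(time.time(), "STB/DVR-1", "channel_change", 203)
title_evt = ContentEventRecord(time.time(), "DVD-1", "title", "Example Movie")
```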
- additional record and/or field types may be utilized in other embodiments, as necessary to enable reliable identification of media content streams.
- Tables 1 through 3 are presented herein using a tabular format for ease of reference; in practice, these may be implemented in various forms using any convenient data representation, for example a structured database, XML file, cloud-based service, etc., as appropriate for a particular embodiment.
- while the statistics gathering and recording functionality of the illustrative embodiment is implemented as part of the programming of an exemplary appliance 200, i.e., in software module 438, this functionality may alternatively be provisioned at a different location, for example in one of the other appliances forming part of an entertainment system, in a local PC, at a remote server or cable system headend, etc., or at any other convenient location to which the particular appliance programming may be capable of reporting user events.
- image analysis and/or speech recognition may be performed by a smart TV device, a locally connected personal computer, a home security system, etc., or even “in the cloud”, i.e., by an Internet based service, with the results reported to a user statistic gathering module and/or an event processing module resident in a connected AV receiver or STB.
Abstract
Sensing interfaces associated with a home entertainment system are used to automate a system response to events which occur in a viewing area associated with the home entertainment system. Data derived from such sensing interfaces may also be used to enhance the response readiness of one or more system components. Still further, user presence data derived from such sensing interfaces may be used to capture and report user viewing habits and/or preferences.
Description
- This application is a divisional of U.S. application Ser. No. 13/758,307, filed on Feb. 4, 2013, the disclosure of which is incorporated herein by reference in its entirety.
- Home entertainment systems comprised of plural appliances and/or the controlling devices used to issue commands to such appliances may be provisioned with devices for detecting user presence and/or user interaction via methods such as gesture, spoken voice, facial recognition, spatial analysis, etc., as known in the art. Furthermore, growing use of personal communication devices such as smart phones, tablet computers, etc., may provide additional means for identification of user presence via detection of such personal communication devices on a local wireless network such as a WiFi network, a Bluetooth network, etc. While multiple media sources and multiple media rendering devices may be coupled in many of these home entertainment systems through a central routing appliance such as an AV receiver, set top box, smart TV, etc., no systems or methods currently exist for using user presence and/or user interaction detection alone or in conjunction with a central routing appliance to provide enhanced home entertainment system functionalities.
- This invention relates generally to home entertainment systems and control methods therefor and, in particular, to enhanced functionalities for such home entertainment systems which are enabled by the availability of additional user-related input methods for such systems. For example, in one aspect of the invention sensing interfaces such as an image sensing interface (e.g., an interface associated with a camera), a sound sensing interface (e.g., an interface associated with a microphone), and/or an interface for sensing the presence of an RF device such as a smart phone may be used to fully or partially automate a system response to common events which may occur during a TV viewing session, such as a user or users leaving or entering the viewing area, a user answering a telephone call, the detection of a doorbell ringing or a baby monitor alarm, etc. In another aspect of the invention, data derived from such sensing interfaces may be utilized to enhance the responsiveness of one or more system components, for example by sensing when a user is reaching for a physical remote control unit or readying a component to receive a voice or gesture command. In a yet further aspect of the invention, user presence data derived from such sensing interfaces may be used by a central routing appliance in conjunction with media stream information in order to capture and report user viewing habits and/or preferences.
- A better understanding of the objects, advantages, features, properties and relationships of the invention will be obtained from the following detailed description and accompanying drawings which set forth illustrative embodiments and which are indicative of the various ways in which the principles of the invention may be employed.
- For a better understanding of the various aspects of the invention, reference may be had to preferred embodiments shown in the attached drawings in which:
- FIG. 1 illustrates an exemplary system in which the teachings of the subject invention may be utilized;
- FIG. 2 illustrates a further exemplary system in which the teachings of the subject invention may be utilized;
- FIG. 3 illustrates, in block diagram form, an exemplary hardware architecture for an appliance which may be a component part of the systems illustrated in FIGS. 1 and 2;
- FIG. 4 illustrates, in block diagram form, an exemplary software architecture for the illustrative appliance of FIG. 3; and
- FIG. 5 illustrates, in flowchart form, the operation of an exemplary event processing module of the software architecture illustrated in FIG. 4.
- With reference to
FIG. 1, an exemplary home entertainment system in which the methods of the subject invention may be applied may comprise an AV receiver 100 which serves as a hub for directing a selected video and/or audio media stream from a source appliance such as, for example, a satellite or cable system set top box and/or DVR device ("STB") 104, a DVD player 106, a CD player 108, or a game console 110 to a destination appliance, such as a TV set 102, where the selected video and/or audio media stream is to be rendered. In one preferred embodiment, the connections between appliances 102 through 110 and AV receiver 100 may generally comprise connections for carrying HDMI-compliant digital signals, although it will be appreciated that other interface standards such as component video, PCM audio, etc., may be substituted where necessitated by the limitations of a particular appliance. In some embodiments, additional AV content streams may also be available from a streaming service 118 such as for example Netflix, Vudu, YouTube, NBC online, etc., via a wide area network such as the Internet 116, to which end AV receiver 100 may be provisioned with a connection 112 to an Internet gateway device such as for example router 114. As will be appreciated, the connection between AV receiver 100 and Internet gateway device 114 may be wired as illustrated, or may be wireless, e.g., a WiFi local area network, as appropriate. To support audio and/or video phone calling, conferencing, etc. and in accordance with certain teachings of the subject invention, an exemplary home entertainment system may also be provisioned with one or more sensing interfaces, such as interfaces associated with microphone 120 and camera 122, suitable for the capture of audible and/or visible events within the home entertainment system environment. User data may be reported to a database server 128 for the aggregation and analysis of user viewing habits and preferences as discussed in greater detail below.
- Turning now to
FIG. 2, in a second illustrative embodiment, a "smart" TV device 200 may incorporate both content rendering and source stream selection functionality. In this configuration, local appliances 104 through 110 may be connected directly to multiple input ports of TV 200 via, for example, HDMI connections 130. As is characteristic of the smart TV genre, TV 200 may also support a connection 112 to a wide area network such as the Internet 116 over which streaming AV content and other data may be received. The means for user command input to TV 200 and appliances 104 through 110 may take the form of a controlling device 204, for example a conventional remote control or a smart phone app in communication with the appliances via any convenient infrared (IR), radio frequency (RF), hardwired, point-to-point, or networked protocol, as necessary to cause the respective target appliances to perform the desired operational functions. Additionally, in certain embodiments user input may also comprise spoken and/or gestured commands in place of or supplemental to controlling device signals, which sounds and gestures may be received by microphone 120 and camera 122, processed and decoded by one of the appliances, for example TV 200, and where necessary relayed to other target appliances via, for example, HDMI CEC commands, IR or RF signals, etc., as described for example in co-pending U.S. patent application Ser. No. 13/657,176 "System and Method for Optimized Appliance Control," of common ownership and incorporated by reference herein in its entirety.
- For brevity, the discussions which follow will generally be with reference to the exemplary equipment configuration of
FIG. 2, it being understood, however, that in other embodiments, for example that illustrated in FIG. 1, the steps of the methods presented herein may be performed, mutatis mutandis, by various appliances or combinations of appliances as appropriate for a particular equipment configuration.
- As illustrated in
FIG. 3, an exemplary central routing appliance, such as smart TV appliance 200, may include, as needed for a particular application, rendering capabilities, e.g., TV engine and media processor 300 (it being appreciated that this may comprise one or more than one physical processor depending on the particular embodiment); memory 302 which may comprise any type of readable or read/write media such as RAM, ROM, FLASH, EEPROM, hard disk, optical disk, etc., or a combination thereof; a USB interface 304; digital AV input ports and interface 306, for example DVI or HDMI; analog AV input ports and interface 308, for example composite or component video with associated analog audio; an Ethernet and/or WiFi interface 310; a Bluetooth interface 328; a digital camera interface 312 with associated camera 122 which may be externally connected or built into the cabinet of TV 200; a microphone interface 314 with associated microphone 120 which may be externally connected or built into the cabinet of TV 200; a remote control interface 316 for receiving user-initiated operational commands via IR or RF signals 326; a display output 318 connected to TV screen 322; and an audio output 320 connected to internal or external loudspeakers 324.
- To cause the
smart TV appliance 200 to perform an action, appropriate programming instructions may be stored within the memory 302 (hereafter the "TV programming") for execution by TV engine and media processor(s) 300. An exemplary architecture for such TV programming is presented in FIG. 4. As illustrated, the exemplary TV programming may include, as required for a particular application, an underlying operating system 402, such as for example LINUX, which may support a set of software modules implementing the various functionalities of the smart TV device. Such software modules may include a hardware abstraction layer 404 to provide a device independent interface between the various application software modules and the hardware dependent software modules such as video output driver 406, audio output driver 408, HDMI interface 410, analog input/output ADC/DAC 412, Ethernet and/or WiFi interface 414, Bluetooth interface 416, USB interface 418, remote control interface 420, and camera and microphone drivers. The software modules supported by abstraction layer 404 may include, as required for a particular embodiment, transport and session layer protocol and interface management 428; AV output management 424; input/output processing and routing 440; a miscellaneous services module 426 to support closed captioning, display configuration, OSD, etc.; remote control command decoder 430; and device resource management 442. Additionally, in keeping with the teachings of this invention, the exemplary TV programming may include audio and visual event detector modules 434 and 432; user event processing 436; and a user statistics gathering and reporting module 438.
- Under the control of such TV programming,
smart TV appliance 200 may for example receive an incoming AV media stream from one of the input ports for rendering on TV display screen 322 and loudspeaker(s) 324; may receive commands from remote control interface 316 which are decoded and acted upon, for example to select an input media stream, adjust audio volume, etc.; and may manage a connection to the Internet through Ethernet or WiFi interface 310 to enable browsing for content, download of software updates, video telephony utilizing inputs from camera 122 and microphone 120; etc. Additionally, in accordance with the teachings herein, the exemplary TV programming may receive and process input signals from controlling device 204, camera 122 and/or microphone 120 in order to detect user presence, identify individual users, and/or receive user command input, as will be described hereafter. As will be appreciated, while in the illustrative embodiment the source of audio input signals may comprise a microphone 120 and associated interface 314 provisioned as part of a smart TV appliance 200, in alternative embodiments audio input signals may be captured by any other appliance in the system and forwarded to appliance 200 for processing, or may originate from a microphone provisioned in a controlling device such as remote control or smartphone 204, the output of which microphone may, by way of example, be digitized and/or processed by controlling device 204 and wirelessly forwarded to smart TV appliance 200 via remote control interface 316, WiFi interface 310, Bluetooth interface 328, or any other means as appropriate for a particular implementation.
- In an exemplary embodiment the user
event processing module 436 of the TV programming of TV appliance 200 (hereafter "event processing") may act as illustrated in the flowchart of FIG. 5 upon occurrence of a user-related event. First, at step 502 it may be determined if the event constitutes receipt of a remote control command, as may be reported by remote control command decoder 430. As will be appreciated, remote control commands may be received via any or all of RC interface 420 (e.g., infrared or RF4CE signals or the like), Ethernet/WiFi interface 414, or Bluetooth interface 416, depending on the particular embodiment and the particular controlling device currently in use. If it is determined that the event constitutes receipt of a remote control command, at step 540 the requested functional operation may be executed. By way of example without limitation, such operations may include adjustment of output audio volume to be performed by AV management module 424, selection of a new media input stream to be performed by I/O and routing module 440, etc. In some embodiments, received remote control commands may also comprise requests to direct the functional operation of other connected appliances, for example control of DVD player 106 or STB 104 via CEC commands to be issued by interface management module 428 over HDMI connections 130; or alternatively via a connected IR blaster or a LAN as described for example in co-pending U.S. patent application Ser. No. 13/657,176 "System and Method for Optimized Appliance Control," of common ownership and incorporated by reference herein in its entirety. Upon completion of the requested command function, at step 542 it may next be determined if the command function just performed comprised a change in the media stream being rendered by TV 200, for example selection of a new input port; a channel change at STB 104; a change to an internet media source, etc.
If so, at step 544 data regarding this event may be conveyed to user statistics module 438 for logging and ultimate reporting to database server 128. As will be appreciated, the data logged regarding the new content stream may comprise some or all of the command parameters themselves, e.g., an STB channel number and timestamp; metadata items obtainable from the content source device or channel such as a DVD title, streaming video URL, etc.; a sample of the audio or video content itself for analysis as is known in the art and described, for example, in U.S. Pat. Nos. 7,986,913, 7,627,477 or 7,346,512; or any other data which may be suitable for identification purposes. If it is determined that the executed function did not comprise any change to the current media stream, at step 546 it may next be determined if the executed function comprised a reportable event, such as for example powering TV 200 (or any other connected device) on or off, issuing a fast forward command during DVR playback, etc. If so, this event may also be reported to user statistics module 438 for logging, after which processing of the remote control command is complete. In this regard, it will be appreciated that reportable appliance events such as powering an attached device on or off may also be separately initiated via for example direct communication to an appliance from its own remote control. Accordingly, though not illustrated, where such events are detectable, for example via HDMI status, absence or presence of video or audio signals, etc., these events may also be reported and logged.
- If the reported event is not a remote control command, at
step 504 the event processing may next determine if the reported event constitutes an image change event reported by visual event detection module 432 in response to analysis of image data received from camera 122 via camera driver 422. Such image processing may utilize for example the techniques described in U.S. Pat. Nos. 5,534,917, 6,829,384, 8,274,535, WIPO (PCT) patent application publication WO2010/057683A1, or the like, and may for example periodically monitor an image comprising the field of view of camera 122 in order to initiate image analysis in response to detection of any variation in image data which exceeds a certain threshold value. If it is determined that the event is a report of a detected image change, at step 518 it is next determined if the event comprises the departure or imminent departure of a user from the viewing environment of TV 200. If so, various actions may be taken by event processing 436 as appropriate. By way of example, if the departing user is the sole viewer (or, in some embodiments, the primary user, i.e., the user who initiated the current viewing session) the event processing may be adapted to issue a "pause" command to the source of the current media stream. Other actions may include without limitation activating the recording function of a DVR, logging off a Web site, etc., as appropriate for a particular embodiment and configuration. If such an action is to be taken, at step 520 in the illustrative embodiment the event processing may cause display of a request for confirmation on TV screen 322, e.g., "Would you like to pause this show? (Y/N)." If confirmed by the user at step 522, which confirmation may take the form of a gesture, spoken command, remote control input, etc., or, in those embodiments where the default is to take action, a timeout, at step 528 the indicated action may be executed.
In this context it will be appreciated that in embodiments where voice or gesture responses are expected, the performance accuracy of audio and/or visual event detection modules 434 and 432 may be improved if the volume of TV audio output 320 is temporarily lowered to reduce background noise. Thereafter, at step 530 data regarding the change in user, including user identity where this is determinable, for example via use of techniques such as described in U.S. Pat. Nos. 7,551,756, 7,702,599, or the like, may be conveyed to statistic gathering module 438 for logging, after which processing is complete.
- If the detected image change event is not a user departure, at
step 524 the event processor may next determine if the reported event comprises the arrival of a new or additional user in the TV viewing environment and if so take appropriate action. By way of example, some embodiments may be adapted to allow a viewer to invoke a "private viewing" status which may cause the current content to be automatically muted, paused, switched, etc. in the event an additional user enters the viewing environment. In such embodiments, at step 526 it may be determined if such a status currently exists. If so, at step 528 the appropriate action may be executed, after which data regarding the user arrival, including user identity where this is determinable, may be conveyed to statistic gathering module 438 for logging, and event processing is complete. By way of further example without limitation, re-entry of a user into a viewing environment may trigger an offer to resume playback of previously paused content; or in a multi-room, multi-device household in which appliances are networked together and equipped with viewer recognition, the entry of a user into one viewing environment may cause the event processor in that environment to query other event processors and/or statistic modules elsewhere in the household to determine if that user has recently departed another environment, and if so, offer to resume playback of a content stream which was previously paused in that other environment.
- If the detected image change event is not a user arrival or departure, at
the subsequent steps it may be determined if the reported event comprises a gesture identified by visual event detection module 432 as having significance as user input. If the reported event is determined to be associated with an operational command function, for example "pause", "mute", "channel up", etc., processing may continue at step 540 to execute the command as described previously. If the reported event is determined to be a preparatory gesture, appropriate anticipatory action may be taken at step 538. In this context, a preparatory gesture may comprise without limitation any preliminary motion or gesture by a user which may be interpreted as a possible indication of that user's intent to perform an action, for example standing up, reaching for or setting down a remote control device, beckoning an out of sight person to enter the room, picking up a phone, etc. Anticipatory actions may include for example pre-conditioning visual and/or audio detection modules 432 and 434 to look for an anticipated follow-on event.
- If the reported event is not an image change event, at
step 506 the event processing may next determine if the reported event constitutes a sound recognition event reported by audio event detection module 434. Speech or sound recognition by module 434 may utilize for example the techniques described in U.S. Pat. No. 7,603,276, WIPO (PCT) patent application publication WO2002/054382A1, or the like. If the event is a sound recognition event, at step 512 it may be determined if the decoded sound constitutes a voice command issued by a user. If so, processing may continue at step 540 to execute the desired command as described previously. If not a voice command, at step 514 it may be determined if the reported event constitutes a trigger sound. In this context a trigger sound may be an instance of a non-vocal audio signal received via microphone 120 which either by pre-programming or via a learning process has been assigned a command or an anticipatory action. By way of example without limitation, trigger sounds may include a phone or doorbell ringing, a baby monitor, smoke alarm, microwave chime, etc. If the reported sound event is a trigger sound, the appropriate action, such as muting the television, etc., may be taken at step 516. The system can also be programmed to recognize various sounds and/or spoken words/phrases as being indicative of a preparatory event whereupon one or more of the system components will be readied via an anticipatory action as described above.
By way of example, a spoken phrase such as "let's see what else is on" may be recognized as a preparatory event whereupon an anticipatory action may be executed to place a remote control device into a state wherein the remote control device is readied to receive input in anticipation of its use. Similarly, a spoken phrase such as "come here," the sound of a door bell, or the like may be recognized as a preparatory event whereupon an anticipatory action may be executed to ready the system to look for an anticipated event, e.g., a specific gesture, such as a user standing up, leaving the viewing area, etc., whereupon the appropriate response action to the sensed event that was anticipated, e.g., pausing the media, may be performed.
- If the reported event is not a sound recognition event, at
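The anticipate-then-restore behavior just described might be realized as a small state holder with an expiry, sketched below. The timeout value, class name, and event labels are hypothetical:

```python
import time

class AnticipationState:
    """Sketch: a component armed to look for an anticipated event reverts to
    its idle state (restorative action) if the event does not occur within
    `timeout` seconds of being armed."""
    def __init__(self, timeout: float = 10.0):
        self.timeout = timeout
        self.expected = None
        self.armed_at = None

    def arm(self, expected_event: str, now: float = None):
        # Ready the component to look for the anticipated event.
        self.expected = expected_event
        self.armed_at = time.time() if now is None else now

    def observe(self, event: str, now: float = None) -> bool:
        """Return True if `event` is the anticipated one and arrived in time;
        disarm on a match, and disarm (restore) on expiry."""
        now = time.time() if now is None else now
        if self.expected is None:
            return False
        if now - self.armed_at > self.timeout:  # restorative action: stop looking
            self.expected = None
            return False
        if event == self.expected:
            self.expected = None                # anticipated event consumed
            return True
        return False

state = AnticipationState(timeout=10.0)
state.arm("user_stands_up", now=0.0)
hit = state.observe("user_stands_up", now=3.0)
```

Passing `now` explicitly is only for deterministic illustration; a deployed component would rely on the wall clock.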
step 508 the event processing may next determine if the reported event comprises a wireless device such as for example a smart phone, tablet computer, game controller, etc., joining into or dropping from a LAN or PAN associated withsmart TV 200 or other appliance in the system. If such devices have been previously registered with the TV programming, such activity may be used to infer the presence or absence of particular users. Such information may then be processed in a similar manner to user image detection (e.g., processed as a user being added or departing) continuing atstep 518 as previously described. - Finally, at
step 510 the event processor may process any other event report activity as consistent with a particular embodiment. For example, in some embodiments users may be provisioned with personal remote control devices which embed user identification data in their command transmissions, such as described in co-pending U.S. patent application Ser. No. 13/225,635 "Controlling Devices Used to Provide an Adaptive User Interface," of common ownership and incorporated by reference herein in its entirety, in which case user presence reporting events may be generated by remote control interface 420. In other embodiments, technologies such as infrared body heat sensing, as proposed in the art for use in automatic unattended power-off applications, may be further adapted for the purposes described herein. Additional sources of activity events may also include data received from other household equipment such as security, lighting, or HVAC control systems equipped with occupancy sensors; entryway cameras; driveway sensors; etc., where appropriate.
-
Statistic gathering module 438 may be adapted to report the data conveyed to it during the event processing steps described above to a centralized service, e.g., hosted on Internet-connected server device 128, for aggregation and analysis of user viewing habits and preferences. Depending on the particular embodiment, such reporting may be performed on an event-by-event basis, or alternatively the data may be accumulated and reported at predetermined time intervals or only upon receipt of a request from the server device. - By way of example, in an illustrative embodiment data reported to
statistic gathering module 438 may be formatted into several different event record classes and types for uploading to server device 128. Exemplary record classes may comprise user events, e.g., as may be reported at step 530 of FIG. 5; appliance events, e.g., as may be reported at step 548 of FIG. 5; and content events, e.g., as may be reported at step 544 of FIG. 5. As will be appreciated, different or additional event classes may be appropriate in alternate embodiments and accordingly the above classifications are presented by way of example only and without limitation. - With reference to Table 1 below, user event record types may include addition (i.e., arrival) of a user to the viewing area and deletion (i.e., departure) of a user from the viewing area. As illustrated, each of these record types may include timestamp data and a user ID. The timestamp illustrated is suitable for use in applications where the server device is already aware of the geographical location of the reporting system, e.g., as a result of an initial setup procedure, by URL decoding, etc. In embodiments where this is not the case, the timestamp field may include additional data such as a time zone, zip code, etc., where required. User identity may be any item of data which serves to uniquely identify individual users to facilitate viewing habit and preference analysis at
server 128. By way of example, user ID data may comprise identities explicitly assigned during a setup/configuration process; random numbers assigned by the system as each distinct user is initially detected; a hash value generated by a facial or voice recognition algorithm; a MAC address or serial number assigned to a smart phone or tablet computer; etc., as appropriate for a particular embodiment. -
TABLE 1: User event record

Event type | Time stamp | User ID |
---|---|---|
User:add | yyyy:mm:dd:hh:mm:ss | xxxxxxxx |
User:delete | yyyy:mm:dd:hh:mm:ss | xxxxxxxx |
- Referring now to Table 2, appliance event records may comprise record types indicative of events reported to, functional commands issued to, and/or operations performed by various controlled appliances, e.g., as reported at
step 548 ofFIG. 5 . Such events may include without limitation appliance power on/off commands, playback control commands, etc., as necessary for a complete understanding of user viewing habits and preferences. By way of example, in addition to capturing power status for a DVR appliance, fast forward commands may also be logged in order to monitor commercial skipping activity. As illustrated, each appliance event record type may also include a timestamp field as described above and an appliance type/ID field comprising an appliance type indicator (e.g. STB/DVR, DVD, Internet stream, etc.) together with a unique ID or subtype value which may be assigned by the event monitoring and/or statistics gathering module in order to distinguish between multiple appliances of the same type, e.g., a household with multiple DVD players, or with both cable and satellite STBs. -
TABLE 2: Appliance event record

Event type | Time stamp | Appliance type/ID |
---|---|---|
Appliance:on | yyyy:mm:dd:hh:mm:ss | tt:xxxxx |
Appliance:off | yyyy:mm:dd:hh:mm:ss | tt:xxxxx |
Appliance:play | yyyy:mm:dd:hh:mm:ss | tt:xxxxx |
Appliance:stop | yyyy:mm:dd:hh:mm:ss | tt:xxxxx |
Appliance:pause | yyyy:mm:dd:hh:mm:ss | tt:xxxxx |
Appliance:ffwd | yyyy:mm:dd:hh:mm:ss | tt:xxxxx |
Etc. | | |
- Referring now to Table 3, content event record types may include, without limitation: a channel or track change information record type; a title information record type containing, for example, a show title retrieved from program guide data, DVD or video-on-demand title information, etc.; a metadata record type containing metadata values obtained from a DVD or CD, streaming video service, or the like; a content sample record type containing a sample clip of audio and/or video content for comparison against a content identification database; or, in alternate embodiments, any other data which may be utilized in determining the identity of a particular content stream. Each record type may comprise timestamp and source appliance fields as described above, together with a field containing identity data, which may comprise numeric, text, or binary data as necessary. As will be appreciated, additional record and/or field types may be utilized in other embodiments, as necessary to enable reliable identification of media content streams.
-
TABLE 3: Content event record

Event type | Time stamp | Source appliance | Identity data |
---|---|---|---|
Content:chan/track | yyyy:mm:dd:hh:mm:ss | tt:xxxx | xxxxx |
Content:title | yyyy:mm:dd:hh:mm:ss | tt:xxxx | {text} |
Content:metadata | yyyy:mm:dd:hh:mm:ss | tt:xxxx | {text} |
Content:sample | yyyy:mm:dd:hh:mm:ss | tt:xxxx | {binary data} |
Etc. | | | |
- It will be appreciated that while the exemplary data structures of Tables 1 through 3 are presented herein using a tabular format for ease of reference, in practice these may be implemented in various forms using any convenient data representation, for example a structured database, XML file, cloud-based service, etc., as appropriate for a particular embodiment. Furthermore, it will also be appreciated that while the statistics gathering and recording functionality of the illustrative embodiment is implemented as part of the programming of an
exemplary appliance 200, i.e., in software module 438, in other embodiments this functionality may be provisioned at a different location, for example in one of the other appliances forming part of an entertainment system, in a local PC, at a remote server or cable system headend, etc., or at any other convenient location to which the particular appliance programming may be capable of reporting user events. - While various concepts have been described in detail, it will be appreciated by those skilled in the art that various modifications and alternatives to those concepts could be developed in light of the overall teachings of the disclosure. For example, in alternate embodiments the steps of the methods described above may be advantageously performed in various other appliances as appropriate, e.g., an AV receiver or a cable/satellite STB. Further, in an interconnected system such as illustrated in
FIG. 1 or 2, especially where such interconnection is digital, it will be appreciated that the various steps of the methods may be performed by different appliances. For example, and without limitation, image analysis and/or speech recognition may be performed by a smart TV device, a locally connected personal computer, a home security system, etc., or even “in the cloud,” i.e., by an Internet-based service, with the results reported to a user statistic gathering module and/or an event processing module resident in a connected AV receiver or STB. - Further, while described in the context of functional modules and illustrated using block diagram format, it is to be understood that, unless otherwise stated to the contrary, one or more of the described functions and/or features may be integrated in a single physical device and/or a software module, or one or more functions and/or features may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion of the actual implementation of each module is not necessary for an enabling understanding of the invention. Rather, the actual implementation of such modules would be well within the routine skill of an engineer, given the disclosure herein of the attributes, functionality, and inter-relationship of the various functional modules in the system. Therefore, a person skilled in the art, applying ordinary skill, will be able to practice the invention set forth in the claims without undue experimentation. It will be additionally appreciated that the particular concepts disclosed are meant to be illustrative only and not limiting as to the scope of the invention, which is to be given the full breadth of the appended claims and any equivalents thereof.
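As a purely illustrative sketch of one such convenient data representation, the record layouts of Tables 1 through 3 above might be expressed as simple data classes serialized to JSON for upload to the server device. All class names, field names, and sample values here are assumptions made for illustration; they are not part of the disclosure.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime

def timestamp(dt):
    """Render a datetime in the yyyy:mm:dd:hh:mm:ss form shown in the tables."""
    return dt.strftime("%Y:%m:%d:%H:%M:%S")

@dataclass
class UserEvent:                 # Table 1
    event_type: str              # "User:add" or "User:delete"
    time_stamp: str
    user_id: str                 # e.g., assigned ID, recognition hash, or MAC

@dataclass
class ApplianceEvent:            # Table 2
    event_type: str              # "Appliance:on", "Appliance:play", ...
    time_stamp: str
    appliance_type_id: str       # type indicator plus unique ID, e.g. "01:00042"

@dataclass
class ContentEvent:              # Table 3
    event_type: str              # "Content:title", "Content:metadata", ...
    time_stamp: str
    source_appliance: str
    identity_data: str           # channel/track, title text, metadata, sample ref

# Records accumulated for batched upload to the server device.
batch = [
    UserEvent("User:add", timestamp(datetime(2014, 2, 12, 20, 15, 0)), "u-0001"),
    ApplianceEvent("Appliance:on", timestamp(datetime(2014, 2, 12, 20, 15, 5)),
                   "01:00042"),
    ContentEvent("Content:title", timestamp(datetime(2014, 2, 12, 20, 15, 9)),
                 "01:00042", "Example Show Title"),
]
payload = json.dumps([asdict(r) for r in batch])  # body of a hypothetical upload
```

JSON is used here only as one example; as noted above, a structured database, XML file, or cloud-based service would serve equally well.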
- All patents cited within this document are hereby incorporated by reference in their entirety.
Claims (15)
1. A method for collecting information indicative of usage of at least one component device in a home entertainment system comprised of a plurality of component devices, comprising:
receiving event data via at least one of an image sensing interface, a sound sensing interface, and a networked device sensing interface;
determining from the received event data at least one of a number of and identity of viewers in a viewing area associated with the home entertainment system; and
causing an event record to be created in which is stored data indicative of a time at which the event record is created, a state of at least one of the plurality of component devices in the home entertainment system at the time, and at least one of the number of and identity of viewers determined to be in the viewing area associated with the home entertainment system at the time.
2. The method as recited in claim 1 , comprising causing the event record to be created in response to a command being issued to change the state of the at least one of the plurality of component devices in the home entertainment system.
3. The method as recited in claim 1 , comprising causing the event record to be created in response to a request issued from a system server in communication with the home entertainment system.
4. The method as recited in claim 1 , comprising causing the event record to be created in response to a change in at least one of the number of and identity of viewers determined to be in the viewing area associated with the home entertainment system.
5. The method as recited in claim 1 , comprising causing the event record to be created at predetermined time intervals.
6. The method as recited in claim 1 , wherein the data indicative of the state of the at least one of the plurality of component devices in the home entertainment system comprises data indicative of a media content being delivered by the at least one of the plurality of component devices.
7. The method as recited in claim 6 , wherein the data indicative of the media content being delivered by the at least one of the plurality of component devices comprises a sampling of the media content.
8. The method as recited in claim 1 , comprising causing a command action to be executed whereupon at least one of the plurality of component devices is caused to perform an operational function when at least one of a number of and identity of viewers determined to be in a viewing area associated with the home entertainment system has changed.
9. A home entertainment system, comprising:
a home entertainment device; and
at least one of an image sensing device and a sound sensing device;
wherein the at least one of the image sensing device and the sound sensing device is adapted to generate event data, the home entertainment device comprises a processor and a memory, and the memory includes instructions which, when executed by the processor, cause the home entertainment device to determine from event data received from the at least one of the image sensing device and the sound sensing device at least one of a number of and identity of viewers in a viewing area associated with the home entertainment device and to create an event record in which is stored data indicative of a time at which the event record is created, a state of the home entertainment device at the time at which the event record is created, and at least one of the number of and identity of viewers determined to be in the viewing area associated with the home entertainment device at the time the event record is created.
10. The system as recited in claim 9 , wherein the instructions cause the event record to be created in response to a command being issued to change the state of the home entertainment device.
11. The system as recited in claim 9 , wherein the instructions cause the event record to be created in response to a request issued from a system server in communication with the home entertainment device.
12. The system as recited in claim 9 , wherein the instructions cause the event record to be created in response to a change in at least one of the number of and identity of viewers determined to be in the viewing area associated with the home entertainment device.
13. The system as recited in claim 9, wherein the instructions cause the event record to be created at predetermined time intervals.
14. The system as recited in claim 9 , wherein the data indicative of the state of the home entertainment device comprises data indicative of a media content being delivered to the home entertainment device.
15. The system as recited in claim 9 , wherein the instructions cause a command action to be executed whereupon the home entertainment device is caused to perform an operational function when at least one of a number of and identity of viewers determined to be in a viewing area associated with the home entertainment system has changed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/178,859 US20140223463A1 (en) | 2013-02-04 | 2014-02-12 | System and method for user monitoring and intent determination |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/758,307 US9706252B2 (en) | 2013-02-04 | 2013-02-04 | System and method for user monitoring and intent determination |
US14/178,859 US20140223463A1 (en) | 2013-02-04 | 2014-02-12 | System and method for user monitoring and intent determination |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/758,307 Division US9706252B2 (en) | 2013-02-04 | 2013-02-04 | System and method for user monitoring and intent determination |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140223463A1 (en) | 2014-08-07 |
Family
ID=51260466
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/758,307 Active 2034-06-25 US9706252B2 (en) | 2013-02-04 | 2013-02-04 | System and method for user monitoring and intent determination |
US14/178,859 Abandoned US20140223463A1 (en) | 2013-02-04 | 2014-02-12 | System and method for user monitoring and intent determination |
US14/461,928 Active US10820047B2 (en) | 2013-02-04 | 2014-08-18 | System and method for user monitoring and intent determination |
US17/019,501 Active US11477524B2 (en) | 2013-02-04 | 2020-09-14 | System and method for user monitoring and intent determination |
US17/939,469 Active US11949947B2 (en) | 2013-02-04 | 2022-09-07 | System and method for user monitoring and intent determination |
US18/588,171 Pending US20240196052A1 (en) | 2013-02-04 | 2024-02-27 | System and method for user monitoring and intent determination |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/758,307 Active 2034-06-25 US9706252B2 (en) | 2013-02-04 | 2013-02-04 | System and method for user monitoring and intent determination |
Family Applications After (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/461,928 Active US10820047B2 (en) | 2013-02-04 | 2014-08-18 | System and method for user monitoring and intent determination |
US17/019,501 Active US11477524B2 (en) | 2013-02-04 | 2020-09-14 | System and method for user monitoring and intent determination |
US17/939,469 Active US11949947B2 (en) | 2013-02-04 | 2022-09-07 | System and method for user monitoring and intent determination |
US18/588,171 Pending US20240196052A1 (en) | 2013-02-04 | 2024-02-27 | System and method for user monitoring and intent determination |
Country Status (5)
Country | Link |
---|---|
US (6) | US9706252B2 (en) |
EP (1) | EP2951662A4 (en) |
CN (1) | CN105122177B (en) |
BR (1) | BR112015017883A2 (en) |
WO (1) | WO2014120438A1 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140380362A1 (en) * | 2013-06-20 | 2014-12-25 | Microsoft Corporation | Paired Devices |
US20150215564A1 (en) * | 2014-01-30 | 2015-07-30 | Echostar Uk Holdings Limited | Methods and apparatus for creation of a reference time index for audio/video programming |
US20150373419A1 (en) * | 2014-06-20 | 2015-12-24 | Ray Enterprises Inc. | Content driven interface |
US20160080806A1 (en) * | 2013-04-26 | 2016-03-17 | Sharp Corporation | Device state checking system, device state checking method, server device, communication terminal device, and computer program |
US20160112649A1 (en) * | 2014-10-15 | 2016-04-21 | Benjamin Nowak | Controlling capture of content using one or more client electronic devices |
US9615122B2 (en) | 2014-01-30 | 2017-04-04 | Echostar Technologies L.L.C. | Methods and apparatus to synchronize second screen content with audio/video programming using closed captioning data |
WO2017064576A1 (en) * | 2015-10-14 | 2017-04-20 | Sony Corporation | Auto pause when disturbed |
EP3257258A4 (en) * | 2015-02-10 | 2018-01-10 | Universal Electronics, Inc. | System and method for aggregating and analyzing the status of a system |
US20180018965A1 (en) * | 2016-07-12 | 2018-01-18 | Bose Corporation | Combining Gesture and Voice User Interfaces |
US9892632B1 (en) * | 2016-04-18 | 2018-02-13 | Google Llc | Configuring universal remote control device for appliances based on correlation of received infrared signals and detected appliance events |
WO2018125032A1 (en) * | 2016-12-27 | 2018-07-05 | Rovi Guides, Inc. | Systems and methods for dynamically adjusting media output based on presence detection of individuals |
US10200733B1 (en) * | 2018-03-09 | 2019-02-05 | Krishna Adusumilli | Automatic input selection |
US10771518B2 (en) | 2014-10-15 | 2020-09-08 | Benjamin Nowak | Systems and methods for multiple device control and content curation |
US11973813B2 (en) | 2014-10-15 | 2024-04-30 | Benjamin Nowak | Systems and methods for multiple device control and content curation |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10863234B2 (en) | 2009-03-03 | 2020-12-08 | Mobilitie, Llc | System and method for secure appliance operation |
US9706252B2 (en) * | 2013-02-04 | 2017-07-11 | Universal Electronics Inc. | System and method for user monitoring and intent determination |
US9891600B2 (en) * | 2013-03-11 | 2018-02-13 | Honeywell International Inc. | Upgradable home awareness system |
US9472205B2 (en) * | 2013-05-06 | 2016-10-18 | Honeywell International Inc. | Device voice recognition systems and methods |
CN104349208A (en) * | 2013-08-09 | 2015-02-11 | 中兴通讯股份有限公司 | Message processing method, message processing device, gateway, set-top box and network television system |
US9696701B2 (en) * | 2013-12-07 | 2017-07-04 | Svv Technology Innovations, Inc. | Radio frequency occupancy sensing load control |
WO2017020954A1 (en) * | 2015-08-06 | 2017-02-09 | Arcelik Anonim Sirketi | Multi-point motion sensing and user monitoring system for an image display device |
WO2018140420A1 (en) | 2017-01-24 | 2018-08-02 | Honeywell International, Inc. | Voice control of an integrated room automation system |
US10121494B1 (en) * | 2017-03-30 | 2018-11-06 | Amazon Technologies, Inc. | User presence detection |
US10984329B2 (en) | 2017-06-14 | 2021-04-20 | Ademco Inc. | Voice activated virtual assistant with a fused response |
US20190332848A1 (en) | 2018-04-27 | 2019-10-31 | Honeywell International Inc. | Facial enrollment and recognition system |
US20190390866A1 (en) | 2018-06-22 | 2019-12-26 | Honeywell International Inc. | Building management system with natural language interface |
US10897647B1 (en) * | 2018-07-25 | 2021-01-19 | Imdb.Com, Inc. | Ascertaining audience reactions for a media item |
US10798572B2 (en) * | 2018-10-25 | 2020-10-06 | Ioxt, Llc | System and method for secure appliance operation |
US11418502B2 (en) * | 2018-11-20 | 2022-08-16 | International Business Machines Corporation | Input entry based on user identity validation |
CN113031455A (en) * | 2019-12-25 | 2021-06-25 | 中国移动通信集团终端有限公司 | Control method, device, equipment and computer storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6233734B1 (en) * | 1995-01-05 | 2001-05-15 | Videoguide, Inc. | System and method for controlling the broadcast and recording of television programs and for distributing information to be displayed on a television screen |
US20070143777A1 * | 2004-02-19 | 2007-06-21 | Landmark Digital Services Llc | Method and apparatus for identification of broadcast source |
US20130219417A1 (en) * | 2012-02-16 | 2013-08-22 | Comcast Cable Communications, Llc | Automated Personalization |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4931865A (en) | 1988-08-24 | 1990-06-05 | Sebastiano Scarampi | Apparatus and methods for monitoring television viewers |
US5235414A (en) | 1990-05-21 | 1993-08-10 | Control Data Corporation | Non-obtrusive programming monitor |
US6130726A (en) | 1997-03-24 | 2000-10-10 | Evolve Products, Inc. | Program guide on a remote control display |
US7185355B1 (en) | 1998-03-04 | 2007-02-27 | United Video Properties, Inc. | Program guide system with preference profiles |
US20050210101A1 (en) * | 1999-03-04 | 2005-09-22 | Universal Electronics Inc. | System and method for providing content, management, and interactivity for client devices |
US6990453B2 (en) | 2000-07-31 | 2006-01-24 | Landmark Digital Services Llc | System and methods for recognizing sound and music signals in high noise and distortion |
KR100971697B1 (en) * | 2000-10-11 | 2010-07-22 | 유나이티드 비디오 프로퍼티즈, 인크. | Systems and methods for providing storage of data on servers in an on-demand media delivery system |
US20020174426A1 (en) | 2001-05-15 | 2002-11-21 | Koninklijke Philips Electronics N.V | Method and apparatus for activating a media player based on user behavior |
US7536704B2 (en) | 2001-10-05 | 2009-05-19 | Opentv, Inc. | Method and apparatus automatic pause and resume of playback for a popup on interactive TV |
CN101098453B (en) | 2002-04-22 | 2013-03-27 | 尼尔逊媒介研究股份有限公司 | Methods and apparatus to collect audience information associated with a media presentation |
AU2003230993A1 (en) | 2002-04-25 | 2003-11-10 | Shazam Entertainment, Ltd. | Robust and invariant audio pattern matching |
US20040003392A1 (en) * | 2002-06-26 | 2004-01-01 | Koninklijke Philips Electronics N.V. | Method and apparatus for finding and updating user group preferences in an entertainment system |
US20050015816A1 (en) * | 2002-10-29 | 2005-01-20 | Actv, Inc | System and method of providing triggered event commands via digital program insertion splicing |
US7420956B2 (en) * | 2004-04-16 | 2008-09-02 | Broadcom Corporation | Distributed storage and aggregation of multimedia information via a broadband access gateway |
US9584868B2 (en) * | 2004-07-30 | 2017-02-28 | Broadband Itv, Inc. | Dynamic adjustment of electronic program guide displays based on viewer preferences for minimizing navigation in VOD program selection |
JP2008523684A (en) | 2004-12-07 | 2008-07-03 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Intelligent port button |
US20070271518A1 (en) | 2006-05-16 | 2007-11-22 | Bellsouth Intellectual Property Corporation | Methods, Apparatus and Computer Program Products for Audience-Adaptive Control of Content Presentation Based on Sensed Audience Attentiveness |
US20080056222A1 (en) * | 2006-08-29 | 2008-03-06 | Nigel Waites | Wireless router connection indicator |
JP4898581B2 (en) * | 2007-07-12 | 2012-03-14 | 株式会社日立製作所 | User interface method, display device, and user interface system |
US8539357B2 (en) | 2007-11-21 | 2013-09-17 | Qualcomm Incorporated | Media preferences |
US8875188B2 (en) * | 2008-02-05 | 2014-10-28 | Stratosaudio, Inc. | Systems, methods, and devices for scanning broadcasts |
WO2009156986A1 (en) * | 2008-06-26 | 2009-12-30 | Honeybee Tv Israel Ltd. | Methods and systems for managing viewing at client terminals |
US7796190B2 (en) | 2008-08-15 | 2010-09-14 | At&T Labs, Inc. | System and method for adaptive content rendition |
WO2011071461A1 (en) | 2009-12-10 | 2011-06-16 | Echostar Ukraine, L.L.C. | System and method for selecting audio/video content for presentation to a user in response to monitored user activity |
GB2476668B (en) * | 2009-12-31 | 2015-06-17 | Sony Europe Ltd | Audiovisual multi-room Support |
US20110289529A1 (en) * | 2010-05-18 | 2011-11-24 | Ropvi Technologies Corporation | user interface for content browsing and selection in a television portal of a content system |
US8341669B2 (en) | 2010-05-26 | 2012-12-25 | United Video Properties, Inc. | Systems and methods for controlling an electronic device |
US8949871B2 (en) * | 2010-09-08 | 2015-02-03 | Opentv, Inc. | Smart media selection based on viewer user presence |
KR101979176B1 (en) * | 2011-08-03 | 2019-05-15 | 인텐트 아이큐, 엘엘씨 | Targeted television advertising based on profiles linked to multiple online devices |
US20130061258A1 (en) | 2011-09-02 | 2013-03-07 | Sony Corporation | Personalized television viewing mode adjustments responsive to facial recognition |
US9706252B2 (en) * | 2013-02-04 | 2017-07-11 | Universal Electronics Inc. | System and method for user monitoring and intent determination |
-
2013
- 2013-02-04 US US13/758,307 patent/US9706252B2/en active Active
-
2014
- 2014-01-15 BR BR112015017883A patent/BR112015017883A2/en not_active Application Discontinuation
- 2014-01-15 CN CN201480007203.1A patent/CN105122177B/en active Active
- 2014-01-15 WO PCT/US2014/011573 patent/WO2014120438A1/en active Application Filing
- 2014-01-15 EP EP14746376.4A patent/EP2951662A4/en not_active Ceased
- 2014-02-12 US US14/178,859 patent/US20140223463A1/en not_active Abandoned
- 2014-08-18 US US14/461,928 patent/US10820047B2/en active Active
-
2020
- 2020-09-14 US US17/019,501 patent/US11477524B2/en active Active
-
2022
- 2022-09-07 US US17/939,469 patent/US11949947B2/en active Active
-
2024
- 2024-02-27 US US18/588,171 patent/US20240196052A1/en active Pending
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160080806A1 (en) * | 2013-04-26 | 2016-03-17 | Sharp Corporation | Device state checking system, device state checking method, server device, communication terminal device, and computer program |
US9681189B2 (en) * | 2013-06-20 | 2017-06-13 | Microsoft Technology Licensing, Llc | Paired devices |
US20140380362A1 (en) * | 2013-06-20 | 2014-12-25 | Microsoft Corporation | Paired Devices |
US20150215564A1 (en) * | 2014-01-30 | 2015-07-30 | Echostar Uk Holdings Limited | Methods and apparatus for creation of a reference time index for audio/video programming |
US9942599B2 (en) | 2014-01-30 | 2018-04-10 | Echostar Technologies Llc | Methods and apparatus to synchronize second screen content with audio/video programming using closed captioning data |
US9615122B2 (en) | 2014-01-30 | 2017-04-04 | Echostar Technologies L.L.C. | Methods and apparatus to synchronize second screen content with audio/video programming using closed captioning data |
US20150373419A1 (en) * | 2014-06-20 | 2015-12-24 | Ray Enterprises Inc. | Content driven interface |
US11973813B2 (en) | 2014-10-15 | 2024-04-30 | Benjamin Nowak | Systems and methods for multiple device control and content curation |
US10771518B2 (en) | 2014-10-15 | 2020-09-08 | Benjamin Nowak | Systems and methods for multiple device control and content curation |
US20160112649A1 (en) * | 2014-10-15 | 2016-04-21 | Benjamin Nowak | Controlling capture of content using one or more client electronic devices |
US20220044705A1 (en) * | 2014-10-15 | 2022-02-10 | Benjamin Nowak | Controlling capture of content using one or more client electronic devices |
US11165840B2 (en) | 2014-10-15 | 2021-11-02 | Benjamin Nowak | Systems and methods for multiple device control and content curation |
US11158345B2 (en) * | 2014-10-15 | 2021-10-26 | Benjamin Nowak | Controlling capture of content using one or more client electronic devices |
EP3257258A4 (en) * | 2015-02-10 | 2018-01-10 | Universal Electronics, Inc. | System and method for aggregating and analyzing the status of a system |
WO2017064576A1 (en) * | 2015-10-14 | 2017-04-20 | Sony Corporation | Auto pause when disturbed |
US9892632B1 (en) * | 2016-04-18 | 2018-02-13 | Google Llc | Configuring universal remote control device for appliances based on correlation of received infrared signals and detected appliance events |
US10176710B1 (en) * | 2016-04-18 | 2019-01-08 | Google Llc | Configuring universal remote control device for appliances based on correlation of received infrared signals and detected appliance events |
US20180018965A1 (en) * | 2016-07-12 | 2018-01-18 | Bose Corporation | Combining Gesture and Voice User Interfaces |
US11044525B2 (en) | 2016-12-27 | 2021-06-22 | Rovi Guides, Inc. | Systems and methods for dynamically adjusting media output based on presence detection of individuals |
WO2018125032A1 (en) * | 2016-12-27 | 2018-07-05 | Rovi Guides, Inc. | Systems and methods for dynamically adjusting media output based on presence detection of individuals |
US11785294B2 (en) | 2016-12-27 | 2023-10-10 | Rovi Guides, Inc. | Systems and methods for dynamically adjusting media output based on presence detection of individuals |
US10582246B2 (en) | 2018-03-09 | 2020-03-03 | Krishna Adusumilli | Automatic input selection |
US10200733B1 (en) * | 2018-03-09 | 2019-02-05 | Krishna Adusumilli | Automatic input selection |
Also Published As
Publication number | Publication date |
---|---|
US20230007341A1 (en) | 2023-01-05 |
US20140223465A1 (en) | 2014-08-07 |
US9706252B2 (en) | 2017-07-11 |
WO2014120438A1 (en) | 2014-08-07 |
EP2951662A4 (en) | 2016-05-25 |
CN105122177A (en) | 2015-12-02 |
US20140366050A1 (en) | 2014-12-11 |
US20240196052A1 (en) | 2024-06-13 |
US20200413133A1 (en) | 2020-12-31 |
BR112015017883A2 (en) | 2017-07-11 |
CN105122177B (en) | 2018-03-20 |
US11477524B2 (en) | 2022-10-18 |
US11949947B2 (en) | 2024-04-02 |
US10820047B2 (en) | 2020-10-27 |
EP2951662A1 (en) | 2015-12-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11949947B2 (en) | System and method for user monitoring and intent determination | |
US9137570B2 (en) | System and method for user monitoring and intent determination | |
US11671662B2 (en) | Methods and systems for controlling media display in a smart media display environment | |
US9560407B2 (en) | Systems and methods for managing pairing of remote control devices with a plurality of media content processing devices | |
US10531152B2 (en) | Tracking and responding to distracting events | |
US9215507B2 (en) | Volume customization | |
US11979639B2 (en) | First-screen navigation with channel surfing, backdrop reviewing and content peeking | |
US11816968B2 (en) | Automatic presence simulator for security systems | |
US10028023B2 (en) | Methods and systems for automatic media output based on user proximity | |
US20120072944A1 (en) | Method and apparatus for providing seamless viewing | |
KR20140117387A (en) | Alternate view video playback on a second screen | |
CN111279707A (en) | System and method for navigating internet appliances using a media guidance application | |
JP5321137B2 (en) | Communication device, related device identification method, and network system | |
WO2014209674A1 (en) | System and method for user monitoring and intent determination | |
US20140258464A1 (en) | System and method for electronic device control | |
US20190278439A1 (en) | Live interactive event indication based on notification profile for display device | |
BR112015032568B1 (en) | METHOD FOR CONTROLLING AT LEAST ONE COMPONENT DEVICE IN A HOME ENTERTAINMENT SYSTEM |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: UNIVERSAL ELECTRONICS INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HATAMBEIKI, ARSHAM;ARLING, PAUL D.;SIGNING DATES FROM 20130129 TO 20130201;REEL/FRAME:032205/0486 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |