US20170070775A1 - Methods and systems for coordinating home automation activity - Google Patents
- Publication number
- US20170070775A1 (application Ser. No. 14/844,939)
- Authority
- US
- United States
- Prior art keywords
- event
- television
- perspective
- graphical
- event data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/43615—Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D23/00—Control of temperature
- G05D23/19—Control of temperature characterised by the use of electric means
- G05D23/1927—Control of temperature characterised by the use of electric means using a plurality of sensors
- G05D23/193—Control of temperature characterised by the use of electric means using a plurality of sensors sensing the temperaure in different places in thermal relationship with one or more spaces
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/19684—Portable terminal, e.g. mobile phone, used for viewing video remotely
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/14—Central alarm receiver or annunciator arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2805—Home Audio Video Interoperability [HAVI] networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2823—Reporting information sensed by appliance or service execution status of appliance services in a home automation network
- H04L12/2825—Reporting to a device located outside the home and the home network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/2187—Live feed
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4131—Peripherals receiving signals from specially adapted client devices home appliance, e.g. lighting, air conditioning system, metering devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/414—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
- H04N21/4147—PVR [Personal Video Recorder]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42202—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/426—Internal components of the client ; Characteristics thereof
- H04N21/42607—Internal components of the client ; Characteristics thereof for processing the incoming bitstream
- H04N21/4263—Internal components of the client ; Characteristics thereof for processing the incoming bitstream involving specific tuning arrangements, e.g. two tuners
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/433—Content storage operation, e.g. storage operation in response to a pause request, caching operations
- H04N21/4334—Recording operations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4405—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving video stream decryption
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6106—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
- H04N21/6143—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via a satellite
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6156—Network physical structure; Signal processing specially adapted to the upstream path of the transmission network
- H04N21/6193—Network physical structure; Signal processing specially adapted to the upstream path of the transmission network involving transmission via a satellite
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19665—Details related to the storage of video surveillance data
- G08B13/19669—Event triggers storage or change of storage policy
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19665—Details related to the storage of video surveillance data
- G08B13/19671—Addition of non-video data, i.e. metadata, to video stream
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/19682—Graphic User Interface [GUI] presenting system data to the user, e.g. information on a screen helping a user interacting with an alarm system
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L2012/2847—Home automation networks characterised by the type of home appliance used
- H04L2012/2849—Audio/video appliances
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L2012/2847—Home automation networks characterised by the type of home appliance used
- H04L2012/285—Generic home appliances, e.g. refrigerators
Definitions
- The present technology relates to systems and methods for incorporating and displaying content. More specifically, the present technology relates to suspending alerts for a home automation system.
- Home automation systems provide a plethora of valuable benefits. From monitoring ongoing activities to securing the home, these systems can be configured to monitor many activities. However, with all the monitoring comes updates and alerts. While providing useful information regarding ongoing operations in the home, the updates can become disruptive throughout the day. Although there may be ways of diminishing the disruption during certain activities, the continued alerts may become a nuisance to a user.
- Systems and methods for coordinating home automation activity may include accessing event data from a storage database communicatively coupled with an electronic device. Each item of event data may include a time and date at which the event data was collected.
- The methods may include generating a graphical, perspective depiction identifying the number of events occurring during discrete time intervals for a period of time. The depiction may include the discrete time intervals in perspective view.
- The methods may also include outputting the graphical, perspective depiction from the electronic device for display on a display device.
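The core bookkeeping described above — items of event data carrying a collection time and date, tallied into discrete time intervals — can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the record fields (`camera`, `collected_at`) and the function name are hypothetical.

```python
from collections import Counter
from datetime import datetime

# Hypothetical event records: each item of event data includes the time
# and date at which it was collected, as the method describes.
events = [
    {"camera": "front_door", "collected_at": datetime(2015, 9, 3, 8, 15)},
    {"camera": "garage",     "collected_at": datetime(2015, 9, 3, 8, 45)},
    {"camera": "front_door", "collected_at": datetime(2015, 9, 3, 17, 30)},
]

def count_events_per_interval(events, interval_hours=1):
    """Count events falling within each discrete time interval of the day."""
    counts = Counter()
    for event in events:
        # Bucket the collection hour into a discrete interval start hour.
        bucket = event["collected_at"].hour // interval_hours * interval_hours
        counts[bucket] += 1
    return counts

counts = count_events_per_interval(events)
# Two events fall in the 08:00 interval and one in the 17:00 interval.
```

The per-interval counts would then feed the graphical, perspective depiction, with each interval rendered along the perspective time axis.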
- The graphical, perspective depiction may include individual cameras of a home automation system within the perspective view in embodiments, and each item of event data may further include a video recording.
- An event icon for each item of event data may be positioned along the perspective view of the graphical, perspective depiction according to the time at which the item of event data associated with the event icon was collected.
- The event icons may correspond to, and visually represent, a source of the event data.
- The source may include a triggering event for camera recording, such as a motion detection, a sound detection, an instruction to manually record, or a rule-initiated recording.
- A rule-initiated recording may be triggered by an alarm or a connected device.
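The correspondence between triggering sources and visually distinct event icons could be sketched as a simple lookup. The icon identifiers and function name below are hypothetical, chosen only to illustrate the mapping the description contemplates.

```python
# Hypothetical mapping of triggering events to the icons that visually
# represent the source of each item of event data.
EVENT_ICONS = {
    "motion": "icon_motion",   # motion detection
    "sound":  "icon_sound",    # sound detection
    "manual": "icon_manual",   # instruction to manually record
    "rule":   "icon_rule",     # rule-initiated, e.g. alarm or connected device
}

def icon_for(event):
    """Return the icon corresponding to an event's triggering source."""
    return EVENT_ICONS.get(event["trigger"], "icon_unknown")
```

Each icon would then be positioned along the perspective view according to the time at which its item of event data was collected.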
- The graphical depiction of the present technology may include a detection field associated with the discrete time interval for the nearest time displayed in the perspective view.
- The detection field may also display a graphical connection between event icons positioned within the detection field and a view from the video recording associated with the individual camera corresponding to the item of event data.
- The graphical connection may include a graphical boundary surrounding the graphical, perspective depiction and a connector between the graphical boundary and a view from the video recording associated with the individual camera corresponding to the item of event data.
- The graphical depiction may include individual cameras listed horizontally against the perspective view as abscissae to the time intervals in perspective view.
- The view from the video recording associated with the individual camera corresponding to the item of event data may be positioned within a space corresponding to the horizontal listing of the individual camera associated with the item of event data.
- The electronic device may be configured to receive instructions from a remote control having ordinate and abscissa directionality controls.
- An ordinate directionality control instruction may adjust the graphical depiction along the perspective view of the discrete time intervals.
- An abscissa directionality control instruction may adjust a view from an individual camera.
- The remote control may include, for example, a television receiver remote control having at least four keys for providing input. At least two keys may provide ordinate-based instructions, and at least two keys may provide abscissa-based instructions.
- The remote control may alternatively include a mobile device having a touch screen.
- A vertical swipe along the touch screen may provide adjustments along the perspective view, and a horizontal swipe along the touch screen may provide abscissa-based instructions.
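The two-axis navigation described above — ordinate input scrolling the perspective time axis, abscissa input switching among camera views — can be sketched as a small input dispatcher. The key names and state representation are assumptions for illustration, not the patent's actual control scheme.

```python
def handle_input(state, key):
    """Adjust the depiction state for a four-key remote or swipe input.

    `state` is a hypothetical (time_index, camera_index) pair: the
    position along the perspective time axis and the selected camera.
    """
    time_index, camera_index = state
    if key in ("up", "swipe_up"):
        # Ordinate instruction: move along the perspective view of
        # the discrete time intervals.
        time_index += 1
    elif key in ("down", "swipe_down"):
        time_index -= 1
    elif key in ("left", "swipe_left"):
        # Abscissa instruction: adjust the view from an individual camera.
        camera_index -= 1
    elif key in ("right", "swipe_right"):
        camera_index += 1
    return (time_index, camera_index)
```

The same dispatcher serves both input devices: the four remote keys and the four swipe directions map onto the same ordinate/abscissa actions.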
- The storage database may include at least one of a local database or a network-accessible storage database.
- The electronic device may be or include a television receiver in embodiments.
- The present technology also includes home automation system hubs.
- The hubs may include a first input component configured to receive multimedia data, and a second input component configured to receive user input.
- The hubs may also include at least one output component communicatively coupled with at least one display device.
- The hubs may have one or more processors, as well as memory communicatively coupled with and readable by the one or more processors and having stored therein processor-readable instructions. When executed, the instructions may cause the one or more processors to access event data from a storage database communicatively coupled with an electronic device. Each item of event data may include a time and date at which the event data was collected.
- The processors may be further caused to generate a graphical, perspective depiction identifying the number of events occurring during discrete time intervals for a date range.
- The depiction may include the discrete time intervals in perspective view.
- The processors may also be caused to output the graphical, perspective depiction from the electronic device for display on a display device.
- The multimedia data may include satellite broadcast television.
- The home automation system hub may further include a network input component configured to receive video data from at least one communicatively coupled camera.
- The graphical, perspective depiction may include individual cameras of a home automation system within the perspective view, and each item of event data may further include a video recording.
- The technology may provide numerous benefits over conventional techniques. For example, the technology may allow a user to quickly identify a time of day during which a crisis may have occurred. Additionally, the technology may allow a user to quickly identify events worth reviewing on a daily or other time basis.
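The benefit of quickly identifying time intervals of increased activity can be illustrated with a trivial peak-finding sketch over per-interval event counts. The function name and count structure are hypothetical, shown only to make the review workflow concrete.

```python
def busiest_interval(counts):
    """Return the interval start hour with the most recorded events.

    `counts` is a hypothetical mapping from interval start hour to the
    number of events collected during that discrete time interval.
    """
    return max(counts, key=counts.get)

# Example per-interval counts for one day: the 17:00 interval saw the
# most activity and would be surfaced first for user review.
daily_counts = {8: 2, 12: 1, 17: 5}
peak = busiest_interval(daily_counts)
```

A user scanning the perspective depiction would see the same information visually: the tallest interval marks the time of day most worth reviewing.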
- FIG. 1 shows a simplified media service system that may be used in accordance with embodiments of the present technology.
- FIG. 2 illustrates an exemplary electronic device that may be used in accordance with embodiments of the present technology.
- FIG. 3 illustrates an exemplary home automation system setup in accordance with embodiments of the present technology.
- FIG. 4 shows a simplified flow diagram of a method of coordinating home automation activity according to embodiments of the present technology.
- FIG. 5A illustrates an exemplary overlay of an event table identifying events occurring during discrete time intervals according to embodiments of the present technology.
- FIG. 5B illustrates an exemplary event table identifying events occurring during discrete time intervals according to embodiments of the present technology.
- FIG. 5C illustrates an exemplary event table identifying events occurring during discrete time intervals according to embodiments of the present technology.
- FIG. 6 shows another simplified flow diagram of a method of coordinating home automation activity according to embodiments of the present technology.
- FIGS. 7A-7C illustrate an exemplary alert received by a user according to embodiments of the present technology.
- FIG. 8 shows another simplified flow diagram of a method of coordinating home automation activity according to embodiments of the present technology.
- FIG. 9 illustrates another exemplary overlay of an event table identifying events occurring during discrete time intervals according to embodiments of the present technology.
- FIG. 10 illustrates another exemplary overlay of an event table identifying events occurring during discrete time intervals according to embodiments of the present technology.
- FIG. 11 shows a simplified computer system that may be utilized to perform one or more of the operations discussed.
- a television receiver may serve as a host for a home automation system.
- the home automation system may be able to conveniently present home automation information to a user via a connected display device, such as a television or other connected devices, such as a tablet computer, mobile phone, monitor, or laptop computer.
- the amount of home automation activity recorded throughout a discrete time interval such as a day may include an amount of data a user cannot readily review in a reasonable amount of time.
- the present technology allows a user to quickly identify time intervals of increased activity for review and consideration.
- FIG. 1 illustrates an embodiment of a satellite television distribution system 100 . While a home automation system may be incorporated with various types of television receivers, various embodiments may be part of a satellite-based television distribution system. Cable, IP-based, wireless, and broadcast focused systems are also possible. Satellite television distribution system 100 may include: television service provider system 110 , satellite transmitter equipment 120 , satellites 130 , satellite dish 140 , television receiver 150 , home automation service server 112 , and display device 160 . The display device 160 can be controlled by a user 153 using a remote control device 155 that can send wired or wireless signals 157 to communicate with the STB 150 and/or display device 160 . Alternate embodiments of satellite television distribution system 100 may include fewer or greater numbers of components.
- While only one satellite dish 140 , television receiver 150 , and display device 160 (collectively referred to as “user equipment”) are illustrated, it should be understood that multiple (e.g., tens, thousands, millions of) instances and types of user equipment may receive data and television signals from television service provider system 110 via satellites 130 .
- Television service provider system 110 and satellite transmitter equipment 120 may be operated by a television service provider.
- a television service provider may distribute television channels, on-demand programming, programming information, and/or other content/services to users.
- Television service provider system 110 may receive feeds of one or more television channels and content from various sources. Such television channels may include multiple television channels that contain at least some of the same content (e.g., network affiliates).
- feeds of the television channels may be relayed to user equipment via multiple television distribution satellites. Each satellite may relay multiple transponder streams.
- Satellite transmitter equipment 120 may be used to transmit a feed of one or more television channels from television service provider system 110 to one or more satellites 130 .
- While a single television service provider system 110 and satellite transmitter equipment 120 are illustrated as part of satellite television distribution system 100 , it should be understood that multiple instances of transmitter equipment may be used, possibly scattered geographically, to communicate with satellites 130 . Such multiple instances of satellite transmitting equipment may communicate with the same or with different satellites. Different television channels may be transmitted to satellites 130 from different instances of transmitting equipment. For instance, a different satellite dish of satellite transmitter equipment 120 may be used for communication with satellites in different orbital slots.
- Satellites 130 may be configured to receive signals, such as streams of television channels, from one or more satellite uplinks such as satellite transmitter equipment 120 . Satellites 130 may relay received signals from satellite transmitter equipment 120 (and/or other satellite transmitter equipment) to multiple instances of user equipment via transponder streams. Different frequencies may be used for uplink signals 170 than for downlink signals 180 . Satellites 130 may be in geosynchronous orbit. Each of the transponder streams transmitted by satellites 130 may contain multiple television channels transmitted as packetized data. For example, a single transponder stream may be a serial digital packet stream containing multiple television channels. Therefore, packets for multiple television channels may be interspersed. Further, information used by television receiver 150 for home automation functions may also be relayed to a television receiver via one or more transponder streams.
- Multiple satellites 130 may be used to relay television channels from television service provider system 110 to satellite dish 140 .
- Different television channels may be carried using different satellites.
- Different television channels may also be carried using different transponders of the same satellite; thus, such television channels may be transmitted at different frequencies and/or different frequency ranges.
- a first and second television channel may be relayed via a first transponder of satellite 130 a .
- a third, fourth, and fifth television channel may be relayed via a different satellite or a different transponder of the same satellite relaying the transponder stream at a different frequency.
- a transponder stream transmitted by a particular transponder of a particular satellite may include a finite number of television channels, such as seven. Accordingly, if many television channels are to be made available for viewing and recording, multiple transponder streams may be necessary to transmit all of the television channels to the instances of user equipment.
- Satellite dish 140 may be a piece of user equipment that is used to receive transponder streams from one or more satellites, such as satellites 130 . Satellite dish 140 may be provided to a subscriber for use on a subscription basis to receive television channels provided by the television service provider system 110 , satellite transmitter equipment 120 , and/or satellites 130 . Satellite dish 140 , which may include one or more low noise blocks (LNBs), may be configured to receive transponder streams from multiple satellites and/or multiple transponders of the same satellite. Satellite dish 140 may be configured to receive television channels via transponder streams on multiple frequencies. Based on the characteristics of television receiver 150 and/or satellite dish 140 , it may only be possible to capture transponder streams from a limited number of transponders concurrently.
- a tuner of television receiver 150 may only be able to tune to a single transponder stream from a transponder of a single satellite at a given time. The tuner can then be re-tuned to another transponder of the same or a different satellite.
- a television receiver 150 having multiple tuners may allow for multiple transponder streams to be received at the same time.
- a television receiver may be configured to decode signals received from satellites 130 via satellite dish 140 for output and presentation via a display device, such as display device 160 .
- a television receiver may be incorporated as part of a television or may be part of a separate device, commonly referred to as a set-top box (STB).
- Television receiver 150 may decode signals received via satellite dish 140 and provide an output to display device 160 .
- On-demand content, such as PPV content, may be stored to a computer-readable storage medium.
- FIG. 2 provides additional detail of various embodiments of a television receiver.
- a television receiver is defined to include set-top boxes (STBs) as well as circuitry having similar functionality that may be incorporated into another device.
- While FIG. 1 illustrates an embodiment of television receiver 150 as separate from display device 160 , it should be understood that, in other embodiments, similar functions may be performed by a television receiver integrated with display device 160 .
- Television receiver 150 may include home automation engine 211 , as detailed in relation to FIG. 2 .
- Display device 160 may be used to present video and/or audio decoded and output by television receiver 150 .
- Television receiver 150 may also output a display of one or more interfaces to display device 160 , such as an electronic programming guide (EPG).
- display device 160 is a television.
- Display device 160 may also be a monitor, computer, or some other device configured to display video and, possibly, play audio.
- Uplink signal 170 a represents a signal between satellite transmitter equipment 120 and satellite 130 a .
- Uplink signal 170 b represents a signal between satellite transmitter equipment 120 and satellite 130 b .
- Each of uplink signals 170 may contain streams of one or more different television channels.
- uplink signal 170 a may contain a first group of television channels, while uplink signal 170 b contains a second group of television channels.
- Each of these television channels may be scrambled such that unauthorized persons are prevented from accessing the television channels.
- Downlink signal 180 a represents a signal between satellite 130 a and satellite dish 140 .
- Downlink signal 180 b represents a signal between satellite 130 b and satellite dish 140 .
- Each of downlink signals 180 may contain one or more different television channels, which may be at least partially scrambled.
- a downlink signal may be in the form of a transponder stream.
- a single transponder stream may be tuned to at a given time by a tuner of a television receiver.
- downlink signal 180 a may be a first transponder stream containing a first group of television channels
- downlink signal 180 b may be a second transponder stream containing a different group of television channels.
- a transponder stream can be used to transmit on-demand content to television receivers, including PPV content, which may be stored locally by the television receiver until output for presentation.
- FIG. 1 illustrates downlink signal 180 a and downlink signal 180 b , being received by satellite dish 140 and distributed to television receiver 150 .
- for a first group of television channels, satellite dish 140 may receive downlink signal 180 a , and for a second group of channels, downlink signal 180 b may be received.
- Television receiver 150 may decode the received transponder streams. As such, depending on which television channels are desired to be presented or stored, various transponder streams from various satellites may be received, descrambled, and decoded by television receiver 150 .
- Network 190 , which may include the Internet, may allow for bidirectional communication between television receiver 150 and television service provider system 110 , such as for home automation related services provided by home automation service server 112 . Although illustrated as part of the television service provider system, the home automation service server 112 may be provided by a third party in embodiments. In addition to, or as an alternative to, network 190 , a telephone (e.g., landline) or cellular connection may be used to enable communication between television receiver 150 and television service provider system 110 .
- FIG. 2 illustrates an embodiment of a television receiver 200 , which may represent television receiver 150 of FIG. 1 .
- Television receiver 200 may be configured to function as a host for a home automation system either alone or in conjunction with a communication device.
- Television receiver 200 may be in the form of a separate device configured to be connected with a display device, such as a television.
- Embodiments of television receiver 200 can include set top boxes (STBs).
- a television receiver may be incorporated as part of another device, such as a television, other form of display device, video game console, computer, mobile phone or tablet, or the like.
- a television may have an integrated television receiver, which does not involve an external STB being coupled with the television.
- Television receiver 200 may be incorporated as part of a television, such as display device 160 of FIG. 1 .
- Television receiver 200 may include: processors 210 , which may include control processor 210 a , tuning management processor 210 b , and possibly additional processors, tuners 215 , network interface 220 , non-transitory computer-readable storage medium 225 , electronic programming guide (EPG) database 230 , television interface 235 , digital video recorder (DVR) database 245 , which may include provider-managed television programming storage and/or user-defined television programming, on-demand programming database 227 , home automation settings database 247 , home automation script database 248 , remote control interface 250 , security device 260 , and/or descrambling engine 265 .
- In other embodiments of television receiver 200 , fewer or greater numbers of components may be present. It should be understood that the various components of television receiver 200 may be implemented using hardware, firmware, software, and/or some combination thereof. Functionality of components may be combined; for example, functions of descrambling engine 265 may be performed by tuning management processor 210 b . Further, functionality of components may be spread among additional components.
- Processors 210 may include one or more specialized and/or general-purpose processors configured to perform processes such as tuning to a particular channel, accessing and displaying EPG information from EPG database 230 , and/or receiving and processing input from a user. It should be understood that the functions performed by various modules of FIG. 2 may be performed using one or more processors. As such, for example, functions of descrambling engine 265 may be performed by control processor 210 a.
- Control processor 210 a may communicate with tuning management processor 210 b .
- Control processor 210 a may control the recording of television channels based on timers stored in DVR database 245 .
- Control processor 210 a may also provide commands to tuning management processor 210 b when recording of a television channel is to cease.
- control processor 210 a may provide commands to tuning management processor 210 b that indicate television channels to be output to decoder module 233 for output to a display device.
- Control processor 210 a may also communicate with network interface 220 and remote control interface 250 .
- Control processor 210 a may handle incoming data from network interface 220 and remote control interface 250 . Additionally, control processor 210 a may be configured to output data via network interface 220 .
- Control processor 210 a may include home automation engine 211 .
- Home automation engine 211 may permit television receiver 200 and control processor 210 a to provide home automation functionality.
- Home automation engine 211 may have a JSON (JavaScript Object Notation) command interpreter or some other form of command interpreter that is configured to communicate with wireless devices via network interface 220 and a message server, possibly via a message server client.
- Such a command interpreter of home automation engine 211 may also communicate via a local area network with devices without using the Internet.
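The command interpreter described above can be pictured as a small dispatch table keyed by command name. The following Python sketch is illustrative only; the command names, message fields, and handler-registration scheme are assumptions, not the interface actually used by home automation engine 211.

```python
import json

# Illustrative sketch of a JSON command interpreter such as home automation
# engine 211 might use. Command names and message fields are assumptions.
HANDLERS = {}

def handler(name):
    """Register a function as the handler for a JSON command name."""
    def register(fn):
        HANDLERS[name] = fn
        return fn
    return register

@handler("set_light")
def set_light(params):
    # A hypothetical handler: it would forward the request to a light controller.
    return f"light {params['id']} -> {params['level']}%"

def interpret(message: str):
    """Parse a JSON command message and dispatch it to its handler."""
    cmd = json.loads(message)
    fn = HANDLERS.get(cmd["command"])
    if fn is None:
        raise ValueError(f"unknown command: {cmd['command']}")
    return fn(cmd.get("params", {}))
```

A message server client receiving messages over the local area network or the Internet would feed each message string to `interpret`.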
- Home automation engine 211 may contain multiple controllers specific to different protocols; for instance, a ZigBee® controller, a Z-Wave® controller, and/or an IP camera controller (e.g., wireless LAN, 802.11) may be present.
- Home automation engine 211 may contain a media server configured to serve streaming audio and/or video to remote devices on a local area network or the Internet.
- The television receiver may be able to serve such devices with recorded content, live content, and/or content recorded using one or more home automation devices, such as cameras.
- Tuners 215 may include one or more tuners used to tune to transponders that include broadcasts of one or more television channels. Such tuners may be used also to receive for storage on-demand content and/or addressable television commercials. In some embodiments, two, three, or more than three tuners may be present, such as four, six, or eight tuners. Each tuner contained in tuners 215 may be capable of receiving and processing a single transponder stream from a satellite transponder or from a cable network at a given time. As such, a single tuner may tune to a single transponder stream at a given time.
- If tuners 215 include multiple tuners, one tuner may be used to tune to a television channel on a first transponder stream for display using a television, while another tuner may be used to tune to a television channel on a second transponder for recording and viewing at some other time. If multiple television channels transmitted on the same transponder stream are desired, a single tuner of tuners 215 may be used to receive the signal containing the multiple television channels for presentation and/or recording. Tuners 215 may receive commands from tuning management processor 210 b . Such commands may indicate to which frequencies tuners 215 are to be tuned.
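The tuner-sharing rule above (channels carried on the same transponder stream can share one tuner) reduces to grouping the requested channels by transponder. A minimal sketch, where the channel-to-transponder mapping and function names are assumptions for illustration:

```python
def assign_tuners(requested_channels, channel_to_transponder, num_tuners):
    """Group requested channels by transponder; channels carried on the
    same transponder stream share a single tuner."""
    transponders = []
    for channel in requested_channels:
        t = channel_to_transponder[channel]
        if t not in transponders:
            transponders.append(t)
    if len(transponders) > num_tuners:
        raise RuntimeError("not enough tuners for the requested channels")
    # Map each needed transponder stream to a tuner index.
    return {t: i for i, t in enumerate(transponders)}
```

With two channels on one transponder and a third on another, two tuners suffice even though three channels are requested.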
- Network interface 220 may be used to communicate via an alternate communication channel with a television service provider, if such communication channel is available.
- a communication channel may be via satellite, which may be unidirectional to television receiver 200
- the alternate communication channel which may be bidirectional, may be via a network, such as the Internet.
- Data may be transmitted from television receiver 200 to a television service provider system and from the television service provider system to television receiver 200 .
- Information may be transmitted and/or received via network interface 220 . For instance, instructions from a television service provider may also be received via network interface 220 , if connected with the Internet.
- besides the primary communication channel being satellite, a cable network, an IP-based network, or a broadcast network may be used.
- Network interface 220 may permit wireless communication with one or more types of networks, including using home automation network protocols and wireless network protocols. Also, wired networks may be connected to and communicated with via network interface 220 .
- Device interface 221 may represent a USB port or some other form of communication port that permits communication with a communication device as will be explained further below.
- Storage medium 225 may represent one or more non-transitory computer-readable storage mediums.
- Storage medium 225 may include memory and/or a hard drive.
- Storage medium 225 may be used to store information received from one or more satellites and/or information received via network interface 220 .
- Storage medium 225 may store information related to on-demand programming database 227 , EPG database 230 , DVR database 245 , home automation settings database 247 , and/or home automation script database 248 .
- Recorded television programs may be stored using storage medium 225 as part of DVR database 245 .
- Storage medium 225 may be partitioned or otherwise divided, such as into folders, such that predefined amounts of storage medium 225 are devoted to storage of television programs recorded due to user-defined timers and stored television programs recorded due to provider-defined timers.
- Home automation settings database 247 may allow configuration settings of home automation devices and user preferences to be stored.
- Home automation settings database 247 may store data related to various devices that have been set up to communicate with television receiver 200 .
- home automation settings database 247 may be configured to store information on which types of events should be indicated to users, to which users, in what order, and what communication methods should be used. For instance, an event such as an open garage door may be indicated only to certain wireless devices (e.g., a cellular phone associated with a parent, not a child), and notification may be by a third-party notification server, email, text message, and/or phone call.
- a second notification method may only be used if a first fails. For instance, if a notification cannot be sent to the user via a third-party notification server, an email may be sent.
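The ordered-fallback behavior (try a second method only if the first fails) amounts to walking a preference list until one delivery succeeds. A minimal sketch, with the method names assumed:

```python
def notify(methods, send):
    """Try each notification method in preference order; return the first
    method that succeeds, or None if all fail.

    `send(method)` attempts delivery and returns True on success.
    """
    for method in methods:
        if send(method):
            return method
    return None
```

For example, `notify(["push_server", "email"], send)` would send an email only if delivery via the third-party push server failed.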
- Home automation settings database 247 may store information that allows for the configuration and control of individual home automation devices which may operate using Z-Wave®- and ZigBee®-specific protocols. To do so, home automation engine 211 may create a proxy for each device that allows for settings for the device to be passed through a UI (e.g., presented on a television) to allow for settings to be solicited for and collected via a user interface presented by television receiver or overlay device. The received settings may then be handled by the proxy specific to the protocol, allowing for the settings to be passed on to the appropriate device. Such an arrangement may allow for settings to be collected and received via a UI of the television receiver or overlay device and passed to the appropriate home automation device and/or used for managing the appropriate home automation device.
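The per-device proxy arrangement can be sketched as one adapter class per protocol, each translating the generic settings collected via the UI into a protocol-specific message. The class names and message fields below are assumptions, not the actual Z-Wave or ZigBee wire formats:

```python
class DeviceProxy:
    """Base proxy: accepts generic settings collected via the UI."""
    def apply(self, settings):
        raise NotImplementedError

class ZWaveProxy(DeviceProxy):
    def __init__(self, node_id):
        self.node_id = node_id
    def apply(self, settings):
        # Translate generic settings into an assumed Z-Wave-style message.
        return {"protocol": "zwave", "node": self.node_id, **settings}

class ZigBeeProxy(DeviceProxy):
    def __init__(self, addr):
        self.addr = addr
    def apply(self, settings):
        # Translate generic settings into an assumed ZigBee-style message.
        return {"protocol": "zigbee", "addr": self.addr, **settings}
```

The UI layer only ever deals with generic settings dictionaries; each proxy hides its protocol's addressing and framing.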
- a piece of exercise equipment that is enabled to interface with the home automation engine 211 , such as via device interface 221 , may be configured at the television receiver in addition to on the piece of exercise equipment itself.
- a mobile device or application residing on a mobile device and utilized with exercise equipment may be configured in such a fashion as well for displaying received fitness information on a coupled display device.
- Home automation script database 248 may store scripts that detail how home automation devices are to function based on various events occurring. For instance, if stored content starts being played back by television receiver 200 , lights in the vicinity of display device 160 may be dimmed and shades may be lowered by communicatively coupled and controlled shade controller. As another example, when a user shuts programming off late in the evening, there may be an assumption the user is going to bed. Therefore, the user may configure television receiver 200 to lock all doors via a lock controller, shut the garage door via garage controller, lower a heat setting of thermostat, shut off all lights via a light controller, and determine if any windows or doors are open via window sensors and door sensors, and, if so, alert the user. Such scripts or programs may be predefined by the home automation/television service provider and/or may be defined by a user.
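Such event-driven scripts reduce to a table mapping an event to an ordered list of device actions, as in the playback and bedtime examples above. A sketch with assumed event and action names:

```python
# Assumed event and action names, for illustration only; an actual script
# database would be user- and/or provider-defined.
SCRIPTS = {
    "playback_started": ["dim_lights", "lower_shades"],
    "programming_off_late": [
        "lock_doors", "close_garage", "lower_heat",
        "lights_off", "check_windows_and_doors",
    ],
}

def run_script(event, execute):
    """Run each action configured for the event, in order."""
    for action in SCRIPTS.get(event, []):
        execute(action)
```

Here `execute` stands in for the dispatch to the relevant controller (light controller, lock controller, and so on).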
- home automation script database 248 may allow for various music profiles to be implemented. For instance, based on home automation settings within a structure, appropriate music may be played. For instance, when a piece of exercise equipment is connected or is used, energizing music may be played. Conversely, based on the music being played, settings of home automation devices may be determined. If television programming, such as a movie, is output for playback by television receiver 150 , a particular home automation script may be used to adjust home automation settings, e.g., lower lights, raise temperature, and lock doors.
- EPG database 230 may store information related to television channels and the timing of programs appearing on such television channels.
- EPG database 230 may be stored using storage medium 225 , which may be a hard drive or solid-state drive. Information from EPG database 230 may be used to inform users of what television channels or programs are popular and/or provide recommendations to the user. Information from EPG database 230 may provide the user with a visual interface displayed by a television that allows a user to browse and select television channels and/or television programs for viewing and/or recording. Information used to populate EPG database 230 may be received via network interface 220 , via satellite, or some other communication link with a television service provider, e.g., a cable network. Updates to EPG database 230 may be received periodically. EPG database 230 may serve as an interface for a user to control DVR functions of television receiver 200 , and/or to enable viewing and/or recording of multiple television channels simultaneously. EPG database 230 may also contain information about on-demand content or any other form of accessible content.
- Decoder module 233 may serve to convert encoded video and audio into a format suitable for output to a display device. For instance, decoder module 233 may receive MPEG video and audio from storage medium 225 or descrambling engine 265 to be output to a television. MPEG video and audio from storage medium 225 may have been recorded to DVR database 245 as part of a previously-recorded television program. Decoder module 233 may convert the MPEG video into a format appropriate to be displayed by a television or other form of display device, and the MPEG audio into a format appropriate to be output from speakers, respectively. Decoder module 233 may have the ability to convert a finite number of television channel streams received from storage medium 225 or descrambling engine 265 , simultaneously. For instance, decoders within decoder module 233 may be able to only decode a single television channel at a time. Decoder module 233 may have various numbers of decoders.
- Television interface 235 may serve to output a signal to a television or another form of display device in a proper format for display of video and playback of audio. As such, television interface 235 may output one or more television channels, stored television programming from storage medium 225 , e.g., television programs from DVR database 245 , television programs from on-demand programming database 227 and/or information from EPG database 230 , to a television for presentation. Television interface 235 may also serve to output a CVM.
- Digital Video Recorder (DVR) functionality may permit a television channel to be recorded for a period of time.
- DVR functionality of television receiver 200 may be managed by control processor 210 a .
- Control processor 210 a may coordinate the television channel, start time, and stop time of when recording of a television channel is to occur.
- DVR database 245 may store information related to the recording of television channels.
- DVR database 245 may store timers that are used by control processor 210 a to determine when a television channel should be tuned to and its programs recorded to DVR database 245 of storage medium 225 . In some embodiments, a limited amount of storage medium 225 may be devoted to DVR database 245 .
- Timers may be set by the television service provider and/or one or more users of television receiver 200 .
- DVR database 245 may also be used to store recordings of service provider-defined television channels. For each day, an array of files may be created. For example, based on provider-defined timers, a file may be created for each recorded television channel for a day. For example, if four television channels are recorded from 6-10 PM on a given day, four files may be created; one for each television channel. Within each file, one or more television programs may be present.
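The one-file-per-channel-per-day layout described above can be sketched as a simple naming scheme; the file-naming convention below is an assumption for illustration.

```python
from datetime import date

def recording_files(channels, day):
    """One file per recorded television channel per day, per the
    provider-defined timers (naming convention assumed)."""
    return [f"{day.isoformat()}_ch{ch}.ts" for ch in channels]
```

Recording four channels from 6-10 PM on one day would thus yield four files, each possibly holding several television programs.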
- the service provider may define the television channels, the dates, and the time periods for which the television channels are recorded for the provider-defined timers.
- the provider-defined timers may be transmitted to television receiver 200 via the television provider's network. For example, in a satellite-based television service provider system, data necessary to create the provider-defined timers at television receiver 150 may be received via satellite.
- On-demand programming database 227 may store additional television programming.
- On-demand programming database 227 may include television programming that was not recorded to storage medium 225 via a timer, either user- or provider-defined. Rather, on-demand programming may be programming provided to the television receiver directly for storage by the television receiver and for later presentation to one or more users. On-demand programming may not be user-selected. As such, the television programming stored to on-demand programming database 227 may be the same for each television receiver of a television service provider.
- On-demand programming database 227 may include pay-per-view (PPV) programming that a user must pay and/or use an amount of credits to view. For instance, on-demand programming database 227 may include movies that are not available for purchase or rental yet.
- television channels received via satellite or cable may contain at least some scrambled data. Packets of audio and video may be scrambled to prevent unauthorized users, e.g., nonsubscribers, from receiving television programming without paying the television service provider.
- the transponder stream may be a series of data packets corresponding to multiple television channels. Each data packet may contain a packet identifier (PID), which can be determined to be associated with a particular television channel.
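Selecting one channel's packets out of the multiplexed transponder stream is then a PID-filtering step. A minimal sketch, with the PID-to-channel mapping and packet representation assumed:

```python
def packets_for_channel(packets, pid_to_channel, channel):
    """Keep only the data packets whose PID belongs to the given channel."""
    wanted = {pid for pid, ch in pid_to_channel.items() if ch == channel}
    return [p for p in packets if p["pid"] in wanted]
```

Because packets for multiple television channels may be interspersed in one stream, the receiver filters on PID to reassemble a single channel.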
- Entitlement control messages (ECMs) may be associated with another PID and may be encrypted; television receiver 200 may use decryption engine 261 of security device 260 to decrypt ECMs. Decryption of an ECM may only be possible if the user has authorization to access the particular television channel associated with the ECM. When an ECM is determined to correspond to a television channel being stored and/or displayed, the ECM may be provided to security device 260 for decryption.
- security device 260 may decrypt the ECM to obtain some number of control words. In some embodiments, from each ECM received by security device 260 , two control words are obtained. In some embodiments, when security device 260 receives an ECM, it compares the ECM to the previously received ECM. If the two ECMs match, the second ECM is not decrypted because the same control words would be obtained. In other embodiments, each ECM received by security device 260 is decrypted; however, if a second ECM matches a first ECM, the outputted control words will match; thus, effectively, the second ECM does not affect the control words output by security device 260 . Security device 260 may be permanently part of television receiver 200 or may be configured to be inserted and removed from television receiver 200 , such as a smart card, cable card, or the like.
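The skip-identical-ECM optimization described above is effectively a one-entry cache: decrypt only when the incoming ECM differs from the previous one. A sketch, with the interface assumed:

```python
class EcmCache:
    """Cache the last ECM so an identical repeat is not decrypted again."""
    def __init__(self, decrypt):
        self._decrypt = decrypt  # e.g., a call into security device 260
        self._last_ecm = None
        self._last_words = None

    def control_words(self, ecm):
        if ecm == self._last_ecm:
            return self._last_words  # identical ECM: same control words
        self._last_ecm = ecm
        self._last_words = self._decrypt(ecm)
        return self._last_words
```

Either way the patent allows (skip decryption, or decrypt and get matching words), the control words output for a repeated ECM are unchanged; caching simply avoids the redundant decryption work.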
- Tuning management processor 210 b may be in communication with tuners 215 and control processor 210 a . Tuning management processor 210 b may be configured to receive commands from control processor 210 a . Such commands may indicate when to start/stop receiving and/or recording of a television channel and/or when to start/stop causing a television channel to be output to a television. Tuning management processor 210 b may control tuners 215 . Tuning management processor 210 b may provide commands to tuners 215 that instruct the tuners which satellite, transponder, and/or frequency to tune to. From tuners 215 , tuning management processor 210 b may receive transponder streams of packetized data.
- Descrambling engine 265 may use the control words output by security device 260 in order to descramble video and/or audio corresponding to television channels for storage and/or presentation.
- Video and/or audio data contained in the transponder data stream received by tuners 215 may be scrambled.
- Video and/or audio data may be descrambled by descrambling engine 265 using a particular control word. Which control word output by security device 260 is to be used for successful descrambling may be indicated by a scramble control identifier present within the data packet containing the scrambled video or audio.
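The control-word selection can be sketched minimally. The two-bit scrambling-control field in each transport packet indicates whether the packet is clear or scrambled with the even or odd control word; the bit values below follow common DVB convention and are an assumption here:

```python
def select_control_word(scramble_control: int, even_cw: bytes, odd_cw: bytes):
    """Pick the control word indicated by a packet's 2-bit
    scrambling-control field; None means the packet is in the clear."""
    if scramble_control == 0b00:
        return None
    if scramble_control == 0b10:
        return even_cw
    if scramble_control == 0b11:
        return odd_cw
    raise ValueError("reserved scrambling-control value")
```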
- Descrambled video and/or audio may be output by descrambling engine 265 to storage medium 225 for storage in DVR database 245, and/or to decoder module 233 for output to a television or other presentation equipment via television interface 235.
- the television receiver 200 may be configured to periodically reboot in order to install software updates downloaded over the network 190 or satellites 130. Such reboots may occur, for example, during the night, when users are likely asleep and not watching television. If the system utilizes a single processing module to provide both television receiving and home automation functionality, then the home security functions may be temporarily deactivated during the reboot. In order to increase the security of the system, the television receiver 200 may be configured to reboot at random times during the night in order to allow for installation of updates; an intruder is thus less likely to guess when the system is rebooting.
- the television receiver 200 may include multiple processing modules for providing different functionality, such as television receiving functionality and home automation, such that an update to one module does not necessitate reboot of the whole system. In other embodiments, multiple processing modules may be made available as a primary and a backup during any installation or update procedures.
- television receiver 200 of FIG. 2 has been reduced to a block diagram; commonly known parts, such as a power supply, have been omitted. Further, some routing between the various modules of television receiver 200 has been illustrated. Such illustrations are for exemplary purposes only. The state of two modules not being directly or indirectly connected does not indicate the modules cannot communicate. Rather, connections between modules of the television receiver 200 are intended only to indicate possible common data routing. It should be understood that the modules of television receiver 200 may be combined into a fewer number of modules or divided into a greater number of modules. Further, the components of television receiver 200 may be part of another device, such as built into a television. Television receiver 200 may include one or more instances of various computerized components, such as disclosed in relation to computer system 1100 of FIG. 11 .
- while the television receiver 200 has been illustrated as a satellite-based television receiver, it is to be appreciated that the techniques below may be implemented in other types of television receiving devices, such as cable receivers, terrestrial receivers, IPTV receivers, or the like.
- the television receiver 200 may be configured as a hybrid receiving device, capable of receiving content from disparate communication networks, such as satellite and terrestrial television broadcasts.
- the tuners may be in the form of network interfaces capable of receiving content from designated network locations.
- the home automation functions of television receiver 200 may be performed by an overlay device. If such an overlay device is used, television programming functions may still be provided by a television receiver that is not used to provide home automation functions.
- FIG. 3 shows an embodiment of a system for home monitoring and control that includes a television receiver 350 .
- the system 300 may include a television receiver that is directly or indirectly coupled to one or more display devices 360 such as a television or a monitor.
- the television receiver may be communicatively coupled to other display and notification devices 361 such as stereo systems, speakers, lights, mobile phones, tablets, and the like.
- the television receiver may be configured to receive readings from one or more sensors 342 , 348 , or sensor systems 346 and may be configured to provide signals for controlling one or more control units 343 , 347 or control systems 346 .
- the television receiver may include a monitoring and control module 340, 341 and may be directly or indirectly connected or coupled to one or more sensors and/or control units. Sensors and control units may be wired or wirelessly coupled with the television receiver. The sensors and control units may be coupled and connected in serial, parallel, star, hierarchical, and/or similar topologies and may communicate with the television receiver via one or more serial, bus, or wireless protocols and technologies, which may include, for example, Wi-Fi, CAN bus, Bluetooth, I2C, ZigBee, Z-Wave, and/or the like.
- the system may include one or more monitoring and control modules 340 , 341 that are external to the television receiver 350 .
- the television receiver may interface to sensors and control units via one or more of the monitoring and control modules.
- the external monitoring and control modules 340 , 341 may be wired or wirelessly coupled with the television receiver.
- the monitoring and control modules may connect to the television receiver via a communication port such as a USB port, serial port, and/or the like, or may connect to the television receiver via a wireless communication protocol such as Wi-Fi, Bluetooth, Z-Wave, ZigBee, and the like.
- the external monitoring and control modules may be a separate device that may be positioned near the television receiver or may be in a different location, remote from the television receiver.
- the monitoring and control modules 340 , 341 may provide protocol, communication, and interface support for each sensor and/or control unit of the system.
- the monitoring and control module may receive and transmit readings and provide a low level interface for controlling and/or monitoring the sensors and/or control units.
- the readings processed by the monitoring and control modules 340 , 341 may be used by the other elements of the television receiver.
- the readings from the monitoring and control modules may be logged and analyzed by the data processing and storage 322 module.
- the data processing and storage 322 module may analyze the received data and generate control signals, schedules, and/or sequences for controlling the control units. Additionally, the data processing and storage module 322 may utilize input data to generate additional outputs.
- the module 322 may receive, from a sensor 342, information from a communicatively coupled piece of equipment.
- the sensor may be a part of or attached to the equipment in various embodiments.
- the equipment may provide information regarding movements, alarms, or notifications associated with the home, and the data processing module 322 may use this data to generate relative distance information to be output to and displayed by display device 360 .
- the monitoring and control modules 340 , 341 may be configured to receive and/or send digital signals and commands to the sensors and control units.
- the monitoring and control modules may be configured to receive and/or send analog signals and commands to the sensors and control units.
- Sensors and control units may be wired or wirelessly coupled to the monitoring and control modules 340 , 341 or directly or indirectly coupled with the receiver 350 itself.
- the sensors and control units may be coupled and connected in serial, parallel, star, hierarchical, and/or similar topologies and may communicate with the monitoring and control modules via one or more serial, bus, or wireless protocols and technologies.
- the sensors may include any number of temperature, humidity, sound, proximity, field, electromagnetic, magnetic sensors, cameras, infrared detectors, motion sensors, pressure sensors, smoke sensors, fire sensors, water sensors, and/or the like.
- the sensors may also be part of or attached to other pieces of equipment, such as exercise equipment, doors or windows, or home appliances, or may be applications or other sensors as part of mobile devices.
- the monitoring and control modules 340 , 341 may be coupled with one or more control units.
- the control units may include any number of switches, solenoids, solid state devices and/or the like for making noise, turning on/off electronics, heating and cooling elements, controlling appliances, HVAC systems, lights, and/or the like.
- a control unit may be a device that plugs into an electrical outlet of a home. Other devices, such as an appliance, may be plugged into the device. The device may be controlled remotely to enable or disable electricity to flow to the appliance.
- a control unit may also be part of an appliance, heating or cooling system, and/or other electric or electronic devices.
- the control units of other systems may be controlled via a communication or control interface of the system.
- the water heater temperature setting may be configurable and/or controlled via a communication interface of the water heater or home furnace. Additionally, received telephone calls may be answered or pushed to voicemail in embodiments.
- the controllers may include a remote control designed for association with the television receiver.
- the receiver remote control device may be communicatively coupled with the television receiver, such as through interface 250 , or one or more of the monitoring and control modules for providing control or instruction for operation of the various devices of the system.
- the remote control may be utilized to provide instructions to the receiver for providing various functions with the automation system, including suspending alert notifications during an event. For example, a user may determine prior to or during an event that he wishes to suspend one or more types of notifications until the event has ended, and may so instruct the system with the controller.
- Sensors may be part of other devices and/or systems.
- sensors may be part of a mobile device such as a phone.
- the telemetry readings of the sensors may be accessed through a wireless communication interface such as a Bluetooth connection from the phone.
- temperature sensors may be part of a heating and ventilation system of a home.
- the readings of the sensors may be accessed via a communication interface of the heating and ventilation system.
- Sensors and/or control units may be combined into assemblies or units with multiple sensing capabilities and/or control capabilities.
- a single module may include, for example, a temperature sensor and a humidity sensor.
- Another module may include a light sensor and power or control unit and so on.
- the sensors and control units may be configurable or adjustable. In some cases the sensors and control units may be configurable or adjustable for specific applications. The sensors and control units may be adjustable by mechanical or manual means. In some cases the sensors and control units may be electronically adjustable from commands or instructions sent to the sensors or control units.
- the focal length of a camera may be configurable in some embodiments and may depend on the application of the camera. In some embodiments the focal length may be manually set or adjusted by moving or rotating a lens. In some embodiments the focal length may be adjusted via commands that cause an actuator to move one or more lenses to change the focal length. In other embodiments, the sensitivity, response, position, spectrum, and/or the like of the sensors may be adjustable.
- readings from the sensors may be collected, stored, and/or analyzed in the television receiver 350 .
- analysis of the sensors and control of the control units may be determined by configuration data 324 stored in the television receiver 350 .
- the configuration data may define how the sensor data is collected, how often, what periods of time, what accuracy is required, and other characteristics.
- the configuration data may specify specific sensor and/or control unit settings for a monitoring and/or control application.
- the configuration data may define how the sensor readings are processed and/or analyzed.
- sensor analysis may include collecting sensor readings and performing time based analysis to determine trends, such as temperature fluctuations in a typical day or energy usage. Such trending information may be developed by the receiver into charts or graphs for display to the user.
- sensor analysis may include monitoring sensor readings to determine if a threshold value of one or more sensors has been reached.
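Both kinds of analysis can be sketched briefly. The reading format, (hour, value) pairs and named latest readings, and the sensor names are assumptions for illustration:

```python
from collections import defaultdict

def hourly_trend(readings):
    """Time-based analysis: average reading per hour of day, suitable
    for charting temperature fluctuations over a typical day."""
    buckets = defaultdict(list)
    for hour, value in readings:
        buckets[hour].append(value)
    return {h: sum(v) / len(v) for h, v in sorted(buckets.items())}

def exceeded_thresholds(latest, thresholds):
    """Threshold analysis: names of sensors whose latest reading
    meets or exceeds the configured threshold value."""
    return [name for name, value in latest.items()
            if name in thresholds and value >= thresholds[name]]
```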
- the function of the system may be determined by loading and/or identifying configuration data for an application.
- the system 300 may be configured for more than one monitoring or control operation by selecting or loading the appropriate configuration data.
- the same sensors and/or control units may be used for multiple applications depending on the configuration data used to process and analyze sensor readings and/or activate the control units. Multiple monitoring and/or control applications may be active simultaneously or in a time multiplexed manner using the same or similar set of sensors and/or control units.
- the system 300 may be configured for both exercise monitoring and temperature monitoring applications using the same set of sensors.
- both monitoring applications may be active simultaneously or in a time multiplexed manner depending on which configuration data is loaded.
- the same sensors such as proximity sensors, or cameras may be used.
- the system may be configured for space temperature monitoring.
- the system may only monitor a specific subset of the sensors for activity.
- sensor activity may not need to be saved or recorded.
- the sensor readings may be monitored for specific thresholds which may indicate a threshold temperature for adjusting the space temperature.
- the two different monitoring examples may be selected based on the active configuration data. When one configuration data is active, data from the sensors may be saved and analyzed. When the second configuration data is active, the system may monitor sensor readings for specific thresholds.
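The two modes can be selected by the active configuration data; a minimal sketch, in which the mode names are assumptions:

```python
def process_reading(config, value, log):
    """Dispatch a sensor reading according to the active configuration:
    "record" mode saves every reading for later analysis, while
    "threshold" mode only reports whether a set point was reached."""
    if config["mode"] == "record":
        log.append(value)
        return None
    if config["mode"] == "threshold":
        return value >= config["threshold"]
    raise ValueError("unknown configuration mode")
```

Swapping the configuration data thus changes the system's behavior without changing the sensors themselves.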
- multiple or alternative sensors may be used as well.
- results, status, analysis, and configuration data details for each application may be communicated to a user.
- auditory, visual, and tactile communication methods may be used.
- a display device such as a television may be used for display and audio purposes.
- the display device may show information related to the monitoring and control application. Statistics, status, configuration data, and other elements may be shown. Users may also save particular configuration data for devices, such as notification suspensions while the user is using the coupled display.
- a user may log in or be recognized by the system upon activation and the system may make adjustments based on predetermined or recorded configuration data. For example, a user may have instructed that when he is recognized by the system, either automatically or with provided login information, a notification suspension profile personal to the user be enacted.
- That profile may include that the user would like to continue to receive alarms, such as smoke, fire, or hazard alarms, but that received telephone call information is suspended.
- the user may access the profile and select to begin, may be recognized by the system, or a combination of the two: for example, the user may be recognized by the system so that television operations are performed or input by a remote control, while the user himself selects a particular activity to perform with the system.
- the space temperature may be monitored or adjusted as well.
- generated heat may raise the space temperature above a threshold such that the home automation engine 211 additionally begins operation or adjustment of the HVAC system to cool the space.
- configuration data for the user may include reducing the space temperature to a particular degree based on a preference of the user.
- the home automation system may automatically begin adjusting the space temperature as well in anticipation of heat generation or user preferences.
- the system may include additional notification and display devices 361 capable of notifying the user, showing the status, configuration data, and/or the like.
- the additional notification and display devices may be devices that are directly or indirectly connected with the television receiver.
- computers, mobile devices, phones, tablets, and the like may receive information, notifications, control signals, etc., from the television receiver.
- Data related to the monitoring and control applications and activity may be transmitted to remote devices and displayed to a user.
- Such display devices may be used for presenting to the user interfaces that may be used to further configure or change configuration data for each application.
- An interface may include one or more options, selection tools, and/or navigation tools for modifying the configuration data, which in turn may change monitoring and/or control activity of an application. Modification to a configuration may be used to adjust general parameters of a monitoring application to specific constraints or characteristics of a home, a user's schedule, control units, and/or the like.
- Display interfaces may be used to select and/or download new configurations for monitoring and/or control applications.
- a catalog of pre-defined configuration data definitions for monitoring and control applications may be available to a user.
- a user may select, load, and/or install the applications on the television receiver by making a selection using in part the display device. For example, a user may load a profile based on notification suspension preferences as discussed above.
- configuration data may be a separate executable application, code, package, and/or the like.
- the configuration data may be a set of parameters that define computations, schedules, or options for other processor executable code or instructions.
- Configuration data may be metadata, text data, a binary file, and/or the like.
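A parameter-style configuration of the kind described might look like the following; the field names are illustrative assumptions, not a defined schema:

```python
import json

# Hypothetical text-format configuration data for one application.
config_text = """
{
  "application": "space_temperature_monitoring",
  "sample_interval_s": 60,
  "sensors": ["temp_living_room", "temp_bedroom"],
  "threshold_c": 26.5,
  "save_readings": false
}
"""

# The receiver would parse such parameters to drive its monitoring code.
config = json.loads(config_text)
```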
- notification and display devices may be configured to receive periodic, scheduled, or continuous updates for one or more monitoring and control applications.
- the notifications may be configured to generate pop-up screens, notification banners, sounds, and/or other visual, auditory, and/or tactile alerts.
- some notifications may be configured to cause a pop-up or banner to appear over the programming or content being displayed, such as when a proximity monitor has been triggered in the home. Such an alert may be presented in a centrally located box or in a position apart from other displayed information to make it more recognizable. Additionally, the program being watched may be paused automatically while such an alert is being presented, and may not resume until an input or acceptance is received from the user.
- Some notifications may be configured to cause the television to turn on if it is powered off or in stand-by mode and display relevant information for a user. In this way, users can be warned of activity occurring elsewhere in the system.
- the television receiver may also be configured to receive broadcast or other input 362 .
- Such input may include television channels or other information previously described that is used in conjunction with the monitoring system to produce customizable outputs. For example, a user may wish to watch a particular television channel while also receiving video information of activities occurring on the property.
- the television receiver may receive both the exterior camera information and television channel information to develop a modified output for display.
- the display may include a split screen in some way, a banner, an overlay, etc.
- FIG. 4 illustrates an embodiment of a method 400 for coordinating home automation activity.
- Method 400 may be performed using any of the systems or components previously described.
- Method 400 may allow for an electronic device to receive and present event information over discrete time intervals for user review and consideration.
- Each step of method 400 may be performed at or by a single electronic device, such as an STB, television receiver, computer, or mobile device, for example, or by multiple devices communicating with one another.
- such electronic device or devices may be a hub for the home automation system in embodiments of the present technology.
- Means for performing each step of method 400 include an electronic device and/or the various components of an electronic device or distribution system, such as those detailed in relation to FIGS. 1 and 2 .
- Method 400 may be performed using a computerized device, such as a device incorporating some or all of the components of computer system 1100 of FIG. 11 .
- the method may include accessing event data at operation 410 .
- the event data may be accessed from a storage database communicatively coupled with an electronic device that accesses the event data.
- the event data may include a host of information that in embodiments may include a time and date at which the event data was collected.
- the methods may also include generating an event table identifying the number of events occurring during discrete time intervals for a date range or period of time at operation 420 .
- the event table may provide a graphical depiction of the number of events that occurred for each discrete time interval in embodiments.
- the methods may also include outputting the event table from the electronic device for display on a display device at operation 430 .
- the present methods may allow a user to review an overview of events that have been recorded or generated throughout time intervals of minutes, hours, days, months, years, etc. Such methods may be utilized with a home automation system for reviewing security and event information, for example.
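Operations 410 through 430 can be sketched as a small pipeline: access the stored events, bin them into discrete intervals, and return the per-interval counts for display. The event record layout, an ISO-8601 `time` field, is an assumption for illustration:

```python
from collections import Counter
from datetime import datetime

def generate_event_table(event_data):
    """Sketch of operation 420: count events per one-hour interval.
    Returns a mapping of hour-of-day to number of events, ready to be
    rendered as a chart (operation 430)."""
    counts = Counter()
    for event in event_data:
        stamp = datetime.fromisoformat(event["time"])
        counts[stamp.hour] += 1
    return dict(sorted(counts.items()))
```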
- any one or more of a plurality of connected devices as previously described may record or output information or data throughout a day. Sensors may provide information that a temperature has passed a set point, which may cause a cooling system to be engaged for cooling the space. If a smoke detector recognizes smoke within a space, it may enable an alarm or other sprinkler system for reacting to a potential fire. Many other examples available with a home automation system will be readily understood to be encompassed by the present technology.
- the data associated with the event may be collected and stored in the storage database. In embodiments, the data may include some or all information or data related to the event.
- a security system may be a part of the home automation system, and may include cameras and other sensors and devices related to home security.
- the cameras may include or be coupled with motion detectors, sound detectors, thermal detectors, etc., that may instruct the one or more cameras when to record.
- a camera may be configured to record constantly, at particular times based on pre-set instructions, or when a sensor has been tripped or triggered. Any one of these scenarios may be considered an event in relation to the present technology.
- a camera may be coupled with a motion sensor, which when triggered causes the camera to begin recording.
- the camera may continue recording until the sensor ceases to note motion, or for a pre-determined amount of time, which can be for a few seconds or more, about 30 seconds or more, about 1 minute or more, about 5 minutes or more, about 10 minutes or more, or for any period of time for which the camera may be set to record.
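The stop condition, recording until motion ceases for a hold period, can be sketched with timestamps in seconds; the 30-second hold below is just one of the durations mentioned:

```python
def recording_intervals(motion_times, hold_s=30):
    """Merge motion timestamps into recording intervals: recording
    starts at the first motion and continues until hold_s seconds
    pass with no further motion."""
    intervals = []
    start = end = None
    for t in sorted(motion_times):
        if start is None:
            start, end = t, t + hold_s
        elif t <= end:
            end = t + hold_s      # motion continues; extend recording
        else:
            intervals.append((start, end))
            start, end = t, t + hold_s
    if start is not None:
        intervals.append((start, end))
    return intervals
```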
- a similar scenario can be triggered by a sound detector, for example.
- a manual recording may be initiated at any time by a user of the system, which may record for any period of time the user wishes, or for pre-set periods of time as instructed by a user during system setup, for example.
- the video recording may be included with the collected event data for the event, and may be stored in the storage database.
- the event data for a particular event may then include a time, date, video recording, triggering event explanation, sensor or camera identification, event duration, and any other information related to the event that occurs.
- Such information may be stored for every event that occurs during a day, or may be set by a user to include only particular types of data for particular events.
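Taken together, an item of event data might be modeled as follows; the field names are assumptions based on the items listed above, not a defined schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EventRecord:
    """Illustrative event-data record for the storage database."""
    time: str                 # time the event occurred
    date: str                 # date the event occurred
    source_id: str            # sensor or camera identification
    trigger: str              # e.g. "motion", "sound", "manual", "rule"
    duration_s: float         # event duration in seconds
    video_path: Optional[str] = None   # recording, if one was made
```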
- security events such as motion-activated camera activity may include all relevant data for the event, while the engagement of an HVAC system may include only the time at which the system was started and stopped in embodiments.
- the storage database may include a local storage database on the electronic device in embodiments, and may also include a cloud-based storage or network-accessible storage external to the electronic device.
- the service provider of the system may include an amount of external storage for use by individual users, that may include a subscription-based amount of storage, video quality, period for individual event storage, etc.
- event data, which may include video recordings, may fully utilize the available storage on a regular basis.
- the storage space may be re-used in any number of ways including discarding the oldest events to provide space for newer events, for example.
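The discard-the-oldest policy can be sketched with a fixed-capacity queue:

```python
from collections import deque

class EventStore:
    """Fixed-capacity event store that discards the oldest events to
    make room for newer ones (one of the reuse policies noted above)."""
    def __init__(self, capacity: int):
        self._events = deque(maxlen=capacity)

    def add(self, event):
        self._events.append(event)  # oldest entry is evicted when full

    def events(self):
        return list(self._events)
```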
- a user may wish to review event data on a regular basis in order to ensure security or other events have been recognized by the user.
- a user may also simply wish to review the events during a period of time he or she was away from the home, business, or space at which the system is set.
- the electronic device includes a television receiver, and may be a set-top box including a satellite or cable box. A user may be watching television, for example, and may wish to alternatively or simultaneously review event data for the day.
- a user may provide an instruction to the electronic device to present events for review.
- the electronic device may access the storage database to retrieve all event data for a time period, which may default to the past 24 hours, or current day, or previous day, for example, and may be set to retrieve any amount of data or for any particular time period.
- the electronic device may retrieve the data from a storage database that may include local storage and remote storage, such as cloud storage, for example.
- certain information may be stored locally including event details, while other data, such as video recordings, may be stored externally, such as in cloud storage. Additionally, the event details may be stored both locally and in cloud storage for use in multiple ways.
- the electronic device may analyze the data or parse the data for presentation. For example, the device may develop or generate an event table that includes information for each item of event data over the time period, or the item of event data itself.
- the table may include the actual time and date information for each event, and may include a video recording associated with the event, or may include one or more selectable links to the video recording.
- the table may further parse and organize the data into discrete time intervals across the period of time, such as in hour or some number of minute increments, for example.
- the electronic device may then utilize this parsed information to produce a graphical depiction of each item of the event data across the discrete time intervals.
- An exemplary graphical depiction according to embodiments of the methods is illustrated in FIG. 5 .
- FIG. 5A shows an exemplary overlay 500 of event data for a home automation system according to the present technology.
- Overlay 500 may include a number of elements, including an event table 505 including a graphical depiction 510 of event data.
- Event table 505 may include some or all of the event data occurring during the period of time.
- the table may include all alarms, video recordings, and system notifications that have occurred throughout the day.
- the event table may organize the events based on any number of event particulars including the type of event, the sensor or home automation feature receiving the event, the start and/or end time of the event, the date, the duration of the event, as well as aspects of the event related to the event data available.
- the event table may also include information regarding whether the recording has been viewed, or an amount of how much of the recording has been viewed.
- the event type may include whether the event is a particular sensor event or, for a home automation feature such as a security camera, whether the event is a manual recording, a timer-based recording, a motion-initiated recording, a sound-initiated recording, or a rule-initiated recording.
- a rule-initiated recording may be a recording triggered by an alarm or a connected device, for example.
- a user may have a camera set up to record an entryway (in the home or outside) when the doorbell is activated, or if a smoke alarm somewhere in the house initiates, a camera for that zone may begin recording. Any number of rules may be initiated for these or other home features and are understood to be encompassed by the present technology.
- the discrete time intervals may include a series of hours throughout the day.
- the series may include every hour of the day, or may include a subset of hours in the day such as only those hours during which an event took place, or only those hours for which the user is not home and has selected for review.
- the graphical depiction may be or include a bar chart having a bar for each discrete time interval.
- the bar height and/or color may be used to indicate the number of items of event data occurring for each discrete time interval. For example, as the number of events during a discrete time interval increases, the bar height may increase commensurately.
- preset colors may be utilized to indicate the severity in the number of events during a time interval to provide a user with a visual cue to review the particular time interval for any major events. For example, as the number of events during a discrete time interval increases, the color of the corresponding bar may change. In an embodiment, one or no events during a discrete time interval may have a bar color that is a more neutral color, such as blue or green for example. For more events, such as 2 or 3, for example, the bar color for that time interval may change to yellow. As the event number increases to 4 or 5, the bar color for that time interval may change to orange, and as the number increases over 5 the bar color may change to red.
- the numbers listed are only examples, and the actual colors used may be set based on an average number of events that occur in an individual situation, or on any number preference of a user. For example, in a home in which many people are constantly coming and going, the user may set up the system colors to trigger at double, triple, or more of the listed numbers for each color, or at any other numbers of the user's choosing. Similarly, any particular color combination may be used or chosen by a user of the system.
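The example cut-offs above can be captured in a small mapping function; the limits and colors are the user-adjustable defaults described, not fixed values:

```python
def bar_color(event_count, palette=None):
    """Map an interval's event count to a severity color: 0-1 events
    green, 2-3 yellow, 4-5 orange, more than 5 red (all adjustable)."""
    palette = palette or [(1, "green"), (3, "yellow"), (5, "orange")]
    for limit, color in palette:
        if event_count <= limit:
            return color
    return "red"
```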
- Graphical depiction 510 may take a variety of forms in embodiments that may be used in alternative arrangements or in combined forms. As illustrated in FIG. 5B, the graphical depiction 510 may take the form of a line graph indicating the number of events occurring during discrete time intervals. As would be understood, the line graph may also be combined with the previously described bar chart in embodiments. The line itself may take any number of forms, such as the angular view illustrated in the figure, a curved view having rounded corners, or any other line that may convey the number of events occurring during a specific time interval.
- the graphical depiction 510 may take the form of a chart or spreadsheet indicating the number of events occurring during discrete time intervals. As illustrated, such a format may allow additional information to be displayed including the discrete cameras, for example, as well as any other sensors or zones that may be identified for a space.
- the events may be differentiated in any number of ways, including by shading within a specific color scheme, as well as with alternate colors based on the number of events. For example, the same or a relatively similar scheme may be used as identified above, including yellow, orange, red, etc., to identify time intervals during which events occur. In another example, shading may be utilized with one or more colors to indicate the number of events.
- a user may identify that multiple events were detected by Camera 4 during those times.
- Any other presentation schemes or variations on the provided examples as would be understood to identify the number of events occurring during discrete time intervals are similarly encompassed by the present technology, and may be utilized alone or in combination in order to present the information of most use to a user of the technology.
- a user may scroll to an individual bar of the bar chart and access the number of events for that time interval.
- the event table may list all video recordings and/or events for a day, and then as a user scrolls over individual bars, the event table may adjust to show only the events of that time interval.
- the methods may also include receiving an instruction selecting one or more of the discrete time intervals.
- the user may provide those instructions with a remote device or mobile device, which can include a remote associated with the electronic device, such as a television or set-top box remote, for example.
- the methods may also include having the electronic device, in response to the instruction, further output for display a tabularized description of each item of event data occurring during the selected discrete time interval.
- the tabularized description may be shown in the event table location, and include any of the data previously described.
- the user may also scroll down or over individual events within the event table, and select particular events for viewing, such as viewing additional particulars of the event or watching associated video. For example, for a sensor based event, by selecting the event in the event table, the user may learn the particulars of the event not already listed, such as a temperature relating to a temperature sensor based event. For events including a video recording, scrolling over the event may provide an image or screen shot from the camera in a viewing window 515 .
- the corresponding methods may include receiving an instruction to select one of the time intervals, which may include an actual selection, such as clicking on an interval or bar, or a scrolling operation over a listing.
- the electronic device may output for display at least an image associated with each video recording for each item of event data occurring during the selected discrete time interval. Additionally, for a user scrolling over individual events within the event table listings, the electronic device may automatically output for display at least an image associated with the video recording, or may automatically begin streaming or playing the event in the viewing window.
- the viewing window, the event table, or an additional window for display within the overlay may provide selectable filters allowing a user to filter event data.
- the filters may include filters for particular discrete time intervals; filters for particular sensors, cameras, or household devices; and filters for particular triggers such as motion-initiated events, etc.
- viewing window 515 includes a number of selections available for filtering event data.
- the first selection includes all recordings, and the event table and graphical depiction include all event data for the selected time period.
- the user may also scroll to select individual cameras as illustrated, as well as particular triggering events for each camera. As illustrated, the individual cameras provide additional filters for both motion-initiated events and sound-initiated events.
- Additional filters may be included, such as manual filters or rule-based filters, for example. Any number of filter mechanisms may be included for parsing event data, and are to be understood as being encompassed by the present technology; they may be used in combination to further filter data per a user preference.
- the user may provide an instruction to the electronic device to apply the selected or highlighted filter to the event table.
- a user may utilize a remote control for maneuvering a selection box or highlight over aspects of the overlay that may allow further interaction or selection by the user.
- the electronic device may apply at least one filter to the event table.
- the application of the filter may filter not only the event data displayed in the event table, but may also adjust the bar chart height and/or color to include only the items of event data included by the at least one filter.
- event data from other cameras and/or sensors may be removed from both the event table and the graphical depiction such that the graphical depiction includes only data for the particular camera selected. Accordingly, if the number of events in the discrete time intervals reduces, the color and/or height may change based on the total number of events occurring during that time interval as previously described. This may provide an additional indication to a user of what intervals to review for an individual camera that a user may deem to be important.
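The filter behavior described above, in which event data from other cameras is removed from both the event table and the graphical depiction so that bar heights and colors reflect only the remaining events, can be sketched as below. The event-record layout and function names are assumptions for illustration only.

```python
# Count events per discrete time interval, optionally restricted to one
# camera. Re-running the count after a filter is applied is what lets the
# bar heights (and any color mapping) shrink to match the filtered table.
from collections import Counter

events = [  # (hour interval, source camera) -- hypothetical records
    (13, "camera1"), (13, "camera3"), (13, "camera3"),
    (14, "camera2"), (14, "camera3"),
]

def bar_chart(event_list, camera=None):
    """Return a Counter of events per time interval, optionally filtered."""
    if camera is not None:
        event_list = [e for e in event_list if e[1] == camera]
    return Counter(hour for hour, _ in event_list)

assert bar_chart(events)[13] == 3                    # all cameras
assert bar_chart(events, camera="camera3")[13] == 2  # filtered view shrinks
```

The filtered counts could then be passed back through the same color mapping used for the unfiltered chart, so a filtered interval may drop from orange to green as the text describes.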
- the electronic device may be a home automation system hub, which may also be included with a television receiver, such as a satellite or cable set-top box.
- Such a hub may include input components to receive multimedia data including broadcast programming as well as user input, such as from a remote control.
- the electronic device may then be coupled with a display device, such as a television or screen.
- the electronic device may also be able to receive video data from communicatively coupled cameras over a network input, such as with cameras that include cloud-based storage of recorded information.
- Overlay 500 may take a variety of forms, which may be further adjusted by a viewer.
- the overlay may be a complete or partial overlay, and may be presented as a split-screen, picture-in-picture view, or punch-through as illustrated. Any number of user-adjustable configurations may be used to allow a user to view and review home automation information while simultaneously watching broadcast or downloaded material.
- the method may include accessing event data from a storage database communicatively coupled with an electronic device at operation 610 .
- the database may be local or remote, and each item of event data may include a time and date at which the event data was collected as previously described.
- the method may also include monitoring the event data to determine that the number of items of event data for a discrete time interval surpasses a pre-defined threshold at operation 620 .
- the method may still further include transmitting an alert to a mobile device indicating that the pre-defined threshold has been surpassed at operation 630 .
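Operations 610 through 630 can be sketched as a small monitoring loop: access the stored event data, tally events per discrete time interval, and transmit an alert whenever a count surpasses the pre-defined threshold. The storage access and alert transport are stubbed out here, and all names are hypothetical.

```python
# Minimal sketch of operations 610-630: access event data, monitor the
# per-interval count against a threshold, and transmit an alert.
from collections import Counter

def monitor_events(event_times, threshold, send_alert):
    """event_times: iterable of (interval, ...) records from the database."""
    counts = Counter(interval for interval, *_ in event_times)   # op 610/620
    for interval, count in sorted(counts.items()):
        if count > threshold:                                    # op 620
            send_alert(f"{count} events during interval {interval}")  # op 630

alerts = []
monitor_events([(9, "motion"), (9, "sound"), (9, "motion"), (10, "motion")],
               threshold=2, send_alert=alerts.append)
assert alerts == ["3 events during interval 9"]
```

Whether this loop runs on the electronic device or in the database management system, as the text discusses below, only changes where the threshold values live and which side issues the alert.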
- the event data which may include video recordings, may be stored locally at the electronic device, or remotely on a network-accessible storage device, including cloud storage maintained by a provider system.
- the monitoring may be performed by the electronic device itself, such as by a processor of the electronic device, or may be monitored by the database, such as by the database management system, of the storage device. If the event data is monitored by the electronic device, the database may provide updates to the electronic device whenever new event information is received. The electronic device may then update an event table and compare the data values against threshold values for providing alerts. If the database management system or database is performing the monitoring, it may receive threshold values from the electronic device, such as user-defined preferences, and then provide an announcement or alert to the electronic device when the threshold values are surpassed.
- the threshold values may take several forms including both a discrete number as well as a pattern.
- the threshold values may include rules such as: if the number of events in a discrete time interval surpasses 5, or any other number, or extends into the orange or red color as previously discussed, an alert is transmitted to the user. Rules may also be used if an amount of movement above a threshold is detected, or a sound above a threshold. Once a system is enabled, a user may adjust these settings based on typical occurrences around the property. Moreover, a rule may be enacted so that if a pattern of alerts is received, an alert is transmitted to the user.
- An exemplary pattern may include a series of alerts from consecutive sensors or motion-activated cameras around a property.
- an alert may be sent even if the number of discrete events is otherwise below a defined threshold for sending alerts.
- An order for the sensor triggers may include any pattern or sequence that a user has set, or the system has been set, to recognize as possible negative activity, such as criminal casing. Once a threshold or trigger has been satisfied, the electronic device may transmit an alert to the mobile device.
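The pattern-based rule described above, which alerts when sensors around the property trigger in a recognized order even though per-interval counts stay below the numeric threshold, can be sketched with a simple in-order (subsequence) check. The trigger-log and pattern formats are assumptions for illustration.

```python
# Check whether a user-defined sequence of sensor triggers occurs in
# order within the trigger log, e.g. someone circling the house.

def matches_pattern(triggers, pattern):
    """True if the sensors in `pattern` fire in order within `triggers`."""
    it = iter(sensor for sensor, _time in triggers)
    # Each `in` test advances the iterator, enforcing the ordering.
    return all(step in it for step in pattern)

log = [("front_cam", "12:01"), ("side_cam", "12:03"),
       ("back_cam", "12:05"), ("side_cam", "12:07")]

# A hypothetical "casing" pattern: front, then side, then back of the house.
assert matches_pattern(log, ["front_cam", "side_cam", "back_cam"])
assert not matches_pattern(log, ["back_cam", "front_cam"])
```

A production rule would likely also bound the pattern to a time window, but the ordering check is the core of the sequence recognition described.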
- the method 600 may optionally include generating an event table as previously described at operation 640 .
- the event table may be routinely updated to include each event recorded by the home automation or security system, and may receive updates from an external storage, or may be interconnected with the security features to receive automatic recognition of event data.
- the event table may be similar to that previously described, and may identify the number of events occurring during discrete time intervals for a date range or period of time.
- the event table may include a graphical depiction of the number of events that occurred for each discrete time interval. Additionally, the graphical depiction may include a bar chart having a bar for each discrete time interval, and the bar height and/or color may be commensurate with the number of items of event data occurring for each discrete time interval.
- the alert may take any number of forms, an example of which is illustrated in FIGS. 7A-7C .
- the electronic device may transmit an alert 705 to a mobile device 700 , such as a mobile phone, to a user.
- the electronic device may utilize various communication paths for communicating with various users' mobile devices. These communication paths may include one or more of: a push notification server system, an SMS server system, an email server system, a telephone service provider network, social media, or a general network.
- the push notification server system may be a system that causes a mobile device to display a message such that the message must be actively dismissed by the user before the user can otherwise interact with the mobile device. As such, a push notification has a high likelihood of being viewed, since the user is required to dismiss it before performing any other functions, home automation related or not, with the mobile device.
- Alert 705 may include a variety of information, such as a notification that activity has surpassed a defined threshold, or that suspicious activity has been identified. Based on the notification or alert, the alert may include one or more links accessible by the user.
- a user may access one or more cameras or sensor devices while away from the hub and/or property under automation or surveillance.
- Because the network hub, such as a set-top box, includes a network connection, a user may access the device across the network in order to access video streams or information from the electronic device.
- the alert may include a link to a camera 715 associated with at least one of the items of event data, up to all cameras of the system, as illustrated in FIG. 7C .
- a user may view live footage of the camera as well as recorded event data in embodiments. The camera provided may be based on that camera being one associated with the event data. Additionally, the provided view may allow a user to access alternate cameras to view surrounding areas of the property.
- the alert may additionally or alternatively include a link 710 to access the generated event table for review of the overall system by the user as illustrated in FIG. 7B .
- the user may review the aspects of the event table as previously described.
- the event table as accessed on the mobile device may also include links selectable by a filter or otherwise accessible on a mobile platform to live-stream a camera of the home automation system. In this way, a user may be able to determine the best view of an event that may be unfolding, and access appropriate cameras or sensors.
- the available information and tools to the user may also include allowing a user to record from any camera manually by selecting the camera and enabling it.
- the user may interact with other aspects of the home automation system, such as to lock/unlock doors, turn on/off equipment that is connected with the system, or access speakers to talk remotely with burglars, kids, or pets, for example.
- the alert may also include a graphical, perspective depiction as illustrated in FIGS. 9 and 10 that may be accessed and manipulated by the user as well.
- the method 800 may include accessing event data from a storage database communicatively coupled with an electronic device at operation 810 .
- the storage database may be local or remote including a network-accessible database, or some combination of both.
- each item of event data may include a time and date at which the event data was collected, along with any additional information as previously discussed including video recordings.
- the method may also include generating a graphical, perspective depiction at operation 820 that identifies the number of events occurring during discrete time intervals for a date range or period of time. In embodiments, the depiction may include the discrete time intervals in perspective view.
- the method may further include outputting the graphical, perspective depiction from the electronic device for display on a display device at operation 830 .
- the electronic device may be or include a television receiver, and may in embodiments be a satellite or cable set-top box, which may act as a hub for the home automation system.
- the hub may include an input for receiving multimedia data, which may include broadcast programming including cable or satellite broadcast television, on-demand programming, or internet-based data.
- the hub may also include an input for receiving user input such as with a remote control or mobile device.
- the hub may include one or more input components configured to receive video data from at least one communicatively coupled camera.
- the input may be an actual input for a camera of the system, or the camera may provide the video data or recordings to an additional database for storage that is accessible by the hub, such as over a network.
- the graphical, perspective depiction may allow one or more cameras of the system to be identified and event data associated with the camera to be displayed in a useful way for the user to quickly and easily identify events occurring over a period of time.
- An exemplary graphical, perspective depiction is illustrated at FIG. 9 .
- the graphical, perspective depiction 900 includes an interface for displaying home automation equipment, such as sensors, or cameras as illustrated, as well as a timeline of events.
- Discrete time intervals 905 are displayed in perspective view in the depiction, and individual cameras, or other sensors, of a home automation system may be shown within the graphical, perspective view as well.
- each item of data may also include a video recording that may be accessed or streamed by a user from the graphical, perspective depiction.
- Event icons 915 may be utilized for each item of event data, and may be positioned along the perspective view of the graphical, perspective depiction according to the time at which the item of event data associated with the event icon was collected.
- the event icons may correspond to, and visually represent, a source of the event data.
- the icon associated with the event type may include, for example, a speaker depiction for a sound-initiated event or recording, a person depicted in motion for a motion-initiated event or recording, a symbol such as an “M” for a manual recording, etc.
- a user may choose icons for each event type, or load images to be used for each event type icon, such as family pictures, words, etc.
- the source for each item of event data may be a specific triggering event for camera recording, such as motion detection, sound detection, an instruction to manually record, and a rule-initiated recording as previously described.
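The icon scheme above can be sketched as a simple lookup from the triggering source to an icon, combined with positioning along the perspective timeline by timestamp. The icon names follow the examples in the text (a speaker for sound, an "M" for manual, etc.); the record layout and pixel scale are illustrative assumptions.

```python
# Map each item of event data to an icon (by triggering source) and a
# position along the perspective timeline (by collection time).

ICONS = {
    "sound": "speaker",   # sound-initiated recording
    "motion": "person",   # motion-initiated recording
    "manual": "M",        # manual recording instruction
    "rule": "bell",       # rule-initiated (e.g. alarm-triggered), assumed symbol
}

def place_icon(event, origin_hour, px_per_hour=40):
    """Return (icon, offset) for an event along the perspective timeline."""
    icon = ICONS.get(event["source"], "?")
    offset = (event["hour"] - origin_hour) * px_per_hour
    return icon, offset

assert place_icon({"source": "motion", "hour": 15}, origin_hour=13) == ("person", 80)
```

The text also allows user-chosen icons or loaded images per event type, which would simply replace the values in the lookup table.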
- a rule-initiated recording may be triggered by an alarm, or a connected device as described elsewhere.
- the individual cameras 910 are shown horizontally against the perspective view of discrete time intervals; that is, the individual cameras are shown as abscissae to the time intervals in perspective view.
- Although four cameras are shown in the illustration, it is to be understood that any number of cameras or sensors may be included in the graphical depiction depending on user setup, system components, etc.
- a user may also customize these features in embodiments to include pages of cameras, for example, in order to show a less crowded view when many cameras or sensors are included in the depiction. With such a depiction, more detail may be shown than in the bar chart previously described.
- a user may quickly identify not only the number of events that have occurred, but also the type and location, such as by knowing the location of each camera, in order to quickly glean the events of a time period or day.
- Graphical, perspective depiction 900 may be configured as an overlay, picture-in-picture, split-screen of some sort, or punch-through as illustrated to include programming 930 while a user is reviewing event activity from the time period. Additionally, in embodiments the graphical depiction may include a detection field 920 associated with the discrete time interval associated with the nearest time displayed in the perspective view. The exemplary detection field is illustrated as a box around the discrete time interval across each camera. Event icons that are contained within the detection field may be differentiated from other event icons of the graphical depiction, so as to further enable a user to determine what is being reviewed. For example, the event icon within the detection field may be colored, shaded, etc. in order to provide an additional visual differentiation for the user. As illustrated in the Figure, a user may be currently reviewing activity at 1 PM, for example, where a motion-initiated recording was received at camera 3.
- the detection field 920 may also display a graphical connection between event icons positioned within the detection field 920 and a view from the video recording associated with the individual camera corresponding to the item of event data.
- the graphical connection may include a graphical boundary surrounding the graphical, perspective depiction and a connector 922 between the graphical boundary and a view from the video recording 925 associated with the individual camera corresponding to the item of event data.
- multiple camera views 925 may be tiled along the side of the graphical depiction so a user can view each event.
- the camera view 925 may include any number of views for a corresponding event icon. For example, when an event icon is displayed within the detection field, an image of the recording may be displayed, or in embodiments, the video may automatically begin streaming.
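The detection field behavior described above can be sketched as partitioning event records by whether they fall within the currently viewed interval, so the interface can shade the icons inside the field and surface their camera views. Record field names here are hypothetical.

```python
# Partition events into those inside the detection field (the currently
# viewed discrete time interval) and all others, so the UI can
# differentiate the contained icons and display their camera views.

def in_detection_field(events, current_interval):
    """Split events into (highlighted, others) for the current interval."""
    inside = [e for e in events if e["interval"] == current_interval]
    outside = [e for e in events if e["interval"] != current_interval]
    return inside, outside

events = [{"interval": 13, "camera": 3, "source": "motion"},
          {"interval": 14, "camera": 1, "source": "sound"}]
inside, outside = in_detection_field(events, current_interval=13)
assert [e["camera"] for e in inside] == [3]   # camera 3's icon is highlighted
```

Each record in `inside` would then be connected, via the graphical boundary and connector 922, to its view 925 from the corresponding camera.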
- the electronic device or hub may also be configured to receive instructions from a remote control having ordinate and abscissa directionality controls, including horizontal and vertical buttons or inputs, such as a television receiver remote control.
- the remote control may have at least four keys for providing input, of which at least two keys provide ordinate or vertical-based instructions, and at least two keys provide abscissa or horizontal-based instructions.
- the graphical depiction may adjust along the perspective view of the discrete time intervals. For example, by selecting up, the view may scroll to the next discrete time interval, while the current time interval is removed from the depiction.
- a new time interval may be added to the bottom of the depiction while the other discrete time intervals are shifted further along the perspective.
- the vertical inputs may simply move the detection field along the perspective discrete time intervals, and in embodiments additional vertical inputs, such as page up or down commands, may actually change the time intervals displayed along the perspective view.
- Horizontal or abscissa-based instructions may adjust the view from one individual camera to the next, or may shift the number of cameras displayed, such as from cameras 1-4, to cameras 5-8, for example. Additionally, horizontal instructions may adjust the focus or view of a camera or may rotate or move the camera itself.
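The vertical and horizontal remote-control behavior described above can be sketched as a small state model: vertical keys step along the perspective time intervals, and horizontal keys page between groups of cameras (for example, cameras 1-4 to cameras 5-8). This is a simplified, hypothetical model of the depicted interface; camera focus and rotation controls are omitted.

```python
# Navigation state for the graphical, perspective depiction: vertical
# keys move through time intervals, horizontal keys page through cameras.

class PerspectiveView:
    def __init__(self, hours, cameras, page_size=4):
        self.hours, self.cameras = hours, cameras
        self.page_size = page_size
        self.time_index, self.cam_page = 0, 0

    def key(self, direction):
        if direction == "up":        # advance to the next time interval
            self.time_index = min(self.time_index + 1, len(self.hours) - 1)
        elif direction == "down":    # step back toward the nearest interval
            self.time_index = max(self.time_index - 1, 0)
        elif direction == "right":   # next page of cameras, e.g. 1-4 -> 5-8
            if (self.cam_page + 1) * self.page_size < len(self.cameras):
                self.cam_page += 1
        elif direction == "left":
            self.cam_page = max(self.cam_page - 1, 0)

    def visible_cameras(self):
        start = self.cam_page * self.page_size
        return self.cameras[start:start + self.page_size]

view = PerspectiveView(hours=list(range(9, 18)),
                       cameras=[f"cam{i}" for i in range(1, 9)])
view.key("up"); view.key("right")
assert view.hours[view.time_index] == 10
assert view.visible_cameras() == ["cam5", "cam6", "cam7", "cam8"]
```

On a touch-based mobile device, the same state model would be driven by vertical and horizontal swipes rather than key presses, as the text notes below.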
- the graphical, perspective depiction may be linked with an alert similarly to the event table earlier described.
- a perspective depiction may also be easily manipulated and controlled on a mobile device, which may allow touch control and operation.
- a mobile device including a mobile phone may have a touch screen or allow touch-sensitivity-based controls, and thus a swipe vertically across the touch screen may provide adjustments along the perspective view as previously described, and a horizontal swipe may provide adjustments along the camera views, or may adjust the position or focus of an individual camera.
- a user may also be able to select individual cameras such as by selecting the horizontally listed camera itself to view events during the time period for that camera.
- An additional embodiment of a graphical depiction 1000 is illustrated in FIG. 10 , where the view from the video recording 1025 associated with the individual camera corresponding to the item of event data is positioned within that camera space.
- a view of the video recording such as an image or a streaming view of the video recording may be presented when an event icon is located at the currently viewed discrete time interval along the perspective view.
- As illustrated, a motion detection is displayed at the currently viewed time of 1 PM for camera 3; accordingly, an image from the video recording, or the video recording itself, may be displayed in the space corresponding to the horizontal listing of individual camera 3, which is the camera associated with the item of event data. This may provide an additional means for reviewing multiple event icons for a discrete time interval without question of which video is from which camera.
- FIG. 11 illustrates an embodiment of a computer system 1100 .
- a computer system 1100 as illustrated in FIG. 11 may be incorporated into devices such as an STB, a first electronic device, DVR, television, media system, personal computer, and the like. Moreover, some or all of the components of the computer system 1100 may also be incorporated into a portable electronic device, mobile phone, or other device as described herein.
- FIG. 11 provides a schematic illustration of one embodiment of a computer system 1100 that can perform some or all of the steps of the methods provided by various embodiments. It should be noted that FIG. 11 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. FIG. 11 , therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
- the computer system 1100 is shown comprising hardware elements that can be electrically coupled via a bus 1105 , or may otherwise be in communication, as appropriate.
- the hardware elements may include one or more processors 1110 , including without limitation one or more general-purpose processors and/or one or more special-purpose processors such as digital signal processing chips, graphics acceleration processors, and/or the like; one or more input devices 1115 , which can include without limitation a mouse, a keyboard, a camera, and/or the like; and one or more output devices 1120 , which can include without limitation a display device, a printer, and/or the like.
- the computer system 1100 may further include and/or be in communication with one or more non-transitory storage devices 1125 , which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”), and/or a read-only memory (“ROM”), which can be programmable, flash-updateable, and/or the like.
- Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
- the computer system 1100 might also include a communications subsystem 1130 , which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities, etc., and/or the like.
- the communications subsystem 1130 may include one or more input and/or output communication interfaces to permit data to be exchanged with a network such as the network described below to name one example, other computer systems, television, and/or any other devices described herein.
- a portable electronic device or similar device may communicate image and/or other information via the communications subsystem 1130 .
- the computer system 1100 may further comprise a working memory 1135 , which can include a RAM or ROM device, as described above.
- the computer system 1100 also can include software elements, shown as being currently located within the working memory 1135 , including an operating system 1140 , device drivers, executable libraries, and/or other code, such as one or more application programs 1145 , which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein.
- code and/or instructions can be used to configure and/or adapt a general purpose computer or other device to perform one or more operations in accordance with the described methods.
- a set of these instructions and/or code might be stored on a non-transitory computer-readable storage medium, such as the storage device(s) 1125 described above.
- the storage medium might be incorporated within a computer system, such as computer system 1100 .
- the storage medium might be separate from a computer system (e.g., a removable medium, such as a compact disc), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon.
- These instructions might take the form of executable code, which is executable by the computer system 1100 , and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 1100 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code.
- some embodiments may employ a computer system such as the computer system 1100 to perform methods in accordance with various embodiments of the technology. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 1100 in response to processor 1110 executing one or more sequences of one or more instructions, which might be incorporated into the operating system 1140 and/or other code, such as an application program 1145 , contained in the working memory 1135 . Such instructions may be read into the working memory 1135 from another computer-readable medium, such as one or more of the storage device(s) 1125 . Merely by way of example, execution of the sequences of instructions contained in the working memory 1135 might cause the processor(s) 1110 to perform one or more procedures of the methods described herein. Additionally or alternatively, portions of the methods described herein may be executed through specialized hardware.
- "machine-readable medium" and "computer-readable medium," as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion.
- various computer-readable media might be involved in providing instructions/code to processor(s) 1110 for execution and/or might be used to store and/or carry such instructions/code.
- a computer-readable medium is a physical and/or tangible storage medium.
- Such a medium may take the form of a non-volatile media or volatile media.
- Non-volatile media include, for example, optical and/or magnetic disks, such as the storage device(s) 1125 .
- Volatile media include, without limitation, dynamic memory, such as the working memory 1135 .
- Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punchcards, papertape, any other physical medium with patterns of holes, a RAM, a PROM, EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
- Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 1110 for execution.
- The instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer.
- A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 1100.
- The communications subsystem 1130 and/or components thereof generally will receive signals, and the bus 1105 then might carry the signals and/or the data, instructions, etc. carried by the signals to the working memory 1135, from which the processor(s) 1110 retrieves and executes the instructions.
- The instructions received by the working memory 1135 may optionally be stored on a non-transitory storage device 1125 either before or after execution by the processor(s) 1110.
- Configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure.
- Examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
Abstract
Description
- The present application relates to application Ser. No. ______ (Attorney Docket Number 94567-942623) filed on Sep. 3, 2015, and having the same title, which is herein incorporated by reference in its entirety for all purposes.
- The present technology relates to systems and methods for incorporating and displaying content. More specifically, the present technology relates to suspending alerts for a home automation system.
- Home automation systems provide a plethora of valuable benefits. From monitoring ongoing activities to securing the home, these systems can be configured to monitor many activities. However, with all the monitoring comes updates and alerts. While providing useful information regarding ongoing operations in the home, the updates can become disruptive throughout the day. Although there may be ways of diminishing the disruption during certain activities, the continued stream of alerts may become a nuisance to a user.
- Thus, there is a need for improved methods and systems for suspending alerts during ongoing events in the home or elsewhere. These and other needs are addressed by the present technology.
- Systems and methods for coordinating home automation activity may include accessing event data from a storage database communicatively coupled with an electronic device. Each item of event data may include a time and date at which the event data was collected. The methods may include generating a graphical, perspective depiction identifying the number of events occurring during discrete time intervals for a period of time. The depiction may include the discrete time intervals in perspective view. The methods may also include outputting the graphical, perspective depiction from the electronic device for display on a display device.
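The interval counting described above can be made concrete with a minimal sketch. The record layout here is an assumption (plain dicts with a hypothetical `collected_at` timestamp field); the actual storage schema is not specified by the text.

```python
from collections import Counter
from datetime import datetime, timedelta

def count_events_per_interval(events, interval_minutes=60):
    """Bin timestamped event records into discrete time intervals.

    `events` is assumed to be an iterable of dicts, each carrying the
    time and date at which the event data was collected.
    """
    counts = Counter()
    for event in events:
        ts = event["collected_at"]
        # Truncate the timestamp to the start of its interval.
        bucket = ts.replace(minute=0, second=0, microsecond=0)
        bucket += timedelta(minutes=(ts.minute // interval_minutes) * interval_minutes)
        counts[bucket] += 1
    return counts

events = [
    {"collected_at": datetime(2015, 9, 3, 8, 15)},
    {"collected_at": datetime(2015, 9, 3, 8, 45)},
    {"collected_at": datetime(2015, 9, 3, 14, 5)},
]
per_hour = count_events_per_interval(events)
```

The resulting per-interval counts are what the graphical, perspective depiction would visualize along its time axis.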
- The graphical, perspective depiction may include individual cameras of a home automation system within the perspective view in embodiments, and each item of event data may further include a video recording. An event icon for each item of event data may be positioned along the perspective view of the graphical, perspective depiction according to the time at which the item of event data associated with the event icon was collected. The event icons may correspond to, and visually represent, a source of the event data. In embodiments, the source may include a triggering event for camera recording consisting of a motion detection, a sound detection, an instruction to manually record, and a rule-initiated recording. For example, a rule-initiated recording may be triggered by an alarm or a connected device.
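The four trigger sources named above can be modeled as a small enumeration mapped to distinct event icons. This is only an illustrative sketch; the icon identifiers are hypothetical, not drawn from the text.

```python
from enum import Enum

class TriggerSource(Enum):
    """The four recording triggers enumerated in the text."""
    MOTION_DETECTION = "motion"
    SOUND_DETECTION = "sound"
    MANUAL_RECORD = "manual"
    RULE_INITIATED = "rule"

# Hypothetical icon identifiers; each source gets a visually distinct icon.
EVENT_ICONS = {
    TriggerSource.MOTION_DETECTION: "icon_motion",
    TriggerSource.SOUND_DETECTION: "icon_sound",
    TriggerSource.MANUAL_RECORD: "icon_manual",
    TriggerSource.RULE_INITIATED: "icon_rule",
}

def icon_for(source_value):
    """Look up the event icon that visually represents a trigger source."""
    return EVENT_ICONS[TriggerSource(source_value)]
```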
- The graphical depiction of the present technology may include a detection field associated with the discrete time interval associated with the nearest time displayed in the perspective view. The detection field may also display a graphical connection between event icons positioned within the detection field and a view from the video recording associated with the individual camera corresponding to the item of event data. The graphical connection may include a graphical boundary surrounding the graphical, perspective depiction and a connector between the graphical boundary and a view from the video recording associated with the individual camera corresponding to the item of event data. Also, the graphical depiction may include individual cameras listed horizontally against the perspective view as abscissae to the time intervals in perspective view. In embodiments, the view from the video recording associated with the individual camera corresponding to the item of event data may be positioned within a space corresponding to the horizontal listing of the individual camera associated with the item of event data.
- In embodiments, the electronic device may be configured to receive instructions from a remote control having ordinate and abscissa directionality controls. For example, an ordinate directionality control instruction may adjust the graphical depiction along the perspective view of the discrete time intervals, and an abscissa directionality control instruction may adjust a view from an individual camera. The remote control may include, for example, a television receiver remote control having at least four keys for providing input. At least two keys may provide ordinate-based instructions, and at least two keys may provide abscissa-based instructions. In embodiments, the remote control may include a mobile device having a touch screen. A swipe vertically along the touch screen may provide adjustments along the perspective view, and a horizontal swipe along the touch screen may provide abscissa-based instructions. Also in embodiments, the storage database may include at least one of a local database, or a network-accessible storage database. Also, the electronic device may be or include a television receiver in embodiments.
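One way to read the ordinate/abscissa split described above is as a small key-dispatch table: vertical input scrolls along the perspective time axis, horizontal input moves between camera views. The key names and state fields below are hypothetical, chosen only to illustrate the two axes of control.

```python
# Hypothetical key codes; a real receiver remote would define its own.
ORDINATE_KEYS = {"UP", "DOWN"}        # scroll along the perspective time axis
ABSCISSA_KEYS = {"LEFT", "RIGHT"}     # move between individual camera views

def handle_remote_key(key, state):
    """Adjust a simple view state dict in response to a remote key press."""
    if key in ORDINATE_KEYS:
        step = -1 if key == "UP" else 1
        state["interval_index"] += step     # earlier/later time intervals
    elif key in ABSCISSA_KEYS:
        step = -1 if key == "LEFT" else 1
        state["camera_index"] += step       # previous/next camera
    return state

state = {"interval_index": 0, "camera_index": 0}
handle_remote_key("DOWN", state)
handle_remote_key("RIGHT", state)
```

A touch-screen remote would feed the same dispatch, translating vertical swipes to ordinate input and horizontal swipes to abscissa input.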
- The present technology also includes home automation system hubs. The hubs may include a first input component configured to receive multimedia data, and a second input component configured to receive user input. The hubs may also include at least one output component communicatively coupled with at least one display device. The hubs may have one or more processors, as well as memory communicatively coupled with and readable by the one or more processors and having stored therein processor-readable instructions. When executed by the one or more processors, the one or more processors may be caused to access event data from a storage database communicatively coupled with an electronic device. Each item of event data may include a time and date at which the event data was collected. The processors may be further caused to generate a graphical, perspective depiction identifying the number of events occurring during discrete time intervals for a date range. The depiction may include the discrete time intervals in perspective view. The processors may also be caused to output the graphical, perspective depiction from the electronic device for display on a display device.
- In embodiments, the multimedia data may include satellite broadcast television. The home automation system hub may also further include a network input component configured to receive video data from at least one communicatively coupled camera. Also, the graphical, perspective depiction may include individual cameras of a home automation system within the perspective view, and each item of event data may further include a video recording.
- Such technology may provide numerous benefits over conventional techniques. For example, the technology may allow a user to quickly identify a time of day during which a crisis may have occurred. Additionally, the technology may allow a user to quickly identify events worth review on a daily or other time basis. These and other embodiments, along with many of their advantages and features, are described in more detail in conjunction with the below description and attached figures.
- A further understanding of the nature and advantages of the disclosed embodiments may be realized by reference to the remaining portions of the specification and the drawings.
-
FIG. 1 shows a simplified media service system that may be used in accordance with embodiments of the present technology. -
FIG. 2 illustrates an exemplary electronic device that may be used in accordance with embodiments of the present technology. -
FIG. 3 illustrates an exemplary home automation system setup in accordance with embodiments of the present technology. -
FIG. 4 shows a simplified flow diagram of a method of coordinating home automation activity according to embodiments of the present technology. -
FIG. 5A illustrates an exemplary overlay of an event table identifying events occurring during discrete time intervals according to embodiments of the present technology. -
FIG. 5B illustrates an exemplary event table identifying events occurring during discrete time intervals according to embodiments of the present technology. -
FIG. 5C illustrates an exemplary event table identifying events occurring during discrete time intervals according to embodiments of the present technology. -
FIG. 6 shows another simplified flow diagram of a method of coordinating home automation activity according to embodiments of the present technology. -
FIGS. 7A-7C illustrate an exemplary alert received by a user according to embodiments of the present technology. -
FIG. 8 shows another simplified flow diagram of a method of coordinating home automation activity according to embodiments of the present technology. -
FIG. 9 illustrates another exemplary overlay of an event table identifying events occurring during discrete time intervals according to embodiments of the present technology. -
FIG. 10 illustrates another exemplary overlay of an event table identifying events occurring during discrete time intervals according to embodiments of the present technology. -
FIG. 11 shows a simplified computer system that may be utilized to perform one or more of the operations discussed. - In the appended figures, similar components and/or features may have the same numerical reference label. Further, various components of the same type may be distinguished by following the reference label by a letter that distinguishes among the similar components and/or features. If only the first numerical reference label is used in the specification, the description is applicable to any one of the similar components and/or features having the same first numerical reference label irrespective of the letter suffix.
- A television receiver may serve as a host for a home automation system. By using a television receiver to host a home automation system, various advantages may be realized. For instance, the home automation system may be able to conveniently present home automation information to a user via a connected display device, such as a television, or other connected devices, such as a tablet computer, mobile phone, monitor, or laptop computer. However, the home automation activity recorded throughout a discrete time interval, such as a day, may amount to more data than a user can readily review in a reasonable amount of time. As will be explained below, the present technology allows a user to quickly identify time intervals of increased activity for review and consideration. After describing media service systems and electronic devices in
FIGS. 1 and 2 that may be utilized in the present technology, methods and systems will be explained with the remaining figures. -
FIG. 1 illustrates an embodiment of a satellite television distribution system 100. While a home automation system may be incorporated with various types of television receivers, various embodiments may be part of a satellite-based television distribution system. Cable, IP-based, wireless, and broadcast-focused systems are also possible. Satellite television distribution system 100 may include: television service provider system 110, satellite transmitter equipment 120, satellites 130, satellite dish 140, television receiver 150, home automation service server 112, and display device 160. The display device 160 can be controlled by a user 153 using a remote control device 155 that can send wired or wireless signals 157 to communicate with the STB 150 and/or display device 160. Alternate embodiments of satellite television distribution system 100 may include fewer or greater numbers of components. While only one satellite dish 140, television receiver 150, and display device 160 (collectively referred to as “user equipment”) are illustrated, it should be understood that multiple (e.g., tens, thousands, millions of) instances and types of user equipment may receive data and television signals from television service provider system 110 via satellites 130. - Television
service provider system 110 and satellite transmitter equipment 120 may be operated by a television service provider. A television service provider may distribute television channels, on-demand programming, programming information, and/or other content/services to users. Television service provider system 110 may receive feeds of one or more television channels and content from various sources. Such television channels may include multiple television channels that contain at least some of the same content (e.g., network affiliates). To distribute television channels for presentation to users, feeds of the television channels may be relayed to user equipment via multiple television distribution satellites. Each satellite may relay multiple transponder streams. Satellite transmitter equipment 120 may be used to transmit a feed of one or more television channels from television service provider system 110 to one or more satellites 130. While a single television service provider system 110 and satellite transmitter equipment 120 are illustrated as part of satellite television distribution system 100, it should be understood that multiple instances of transmitter equipment may be used, possibly scattered geographically, to communicate with satellites 130. Such multiple instances of satellite transmitting equipment may communicate with the same or with different satellites. Different television channels may be transmitted to satellites 130 from different instances of transmitting equipment. For instance, a different satellite dish of satellite transmitter equipment 120 may be used for communication with satellites in different orbital slots. - Satellites 130 may be configured to receive signals, such as streams of television channels, from one or more satellite uplinks such as satellite transmitter equipment 120.
Satellites 130 may relay received signals from satellite transmitter equipment 120 (and/or other satellite transmitter equipment) to multiple instances of user equipment via transponder streams. Different frequencies may be used for uplink signals 170 than for downlink signals 180. Satellites 130 may be in geosynchronous orbit. Each of the transponder streams transmitted by satellites 130 may contain multiple television channels transmitted as packetized data. For example, a single transponder stream may be a serial digital packet stream containing multiple television channels. Therefore, packets for multiple television channels may be interspersed. Further, information used by
television receiver 150 for home automation functions may also be relayed to a television receiver via one or more transponder streams. - Multiple satellites 130 may be used to relay television channels from television
service provider system 110 to satellite dish 140. Different television channels may be carried using different satellites. Different television channels may also be carried using different transponders of the same satellite; thus, such television channels may be transmitted at different frequencies and/or different frequency ranges. As an example, a first and second television channel may be relayed via a first transponder of satellite 130 a. A third, fourth, and fifth television channel may be relayed via a different satellite or a different transponder of the same satellite relaying the transponder stream at a different frequency. A transponder stream transmitted by a particular transponder of a particular satellite may include a finite number of television channels, such as seven. Accordingly, if many television channels are to be made available for viewing and recording, multiple transponder streams may be necessary to transmit all of the television channels to the instances of user equipment. -
Satellite dish 140 may be a piece of user equipment that is used to receive transponder streams from one or more satellites, such as satellites 130. Satellite dish 140 may be provided to a subscriber for use on a subscription basis to receive television channels provided by the television service provider system 110, satellite transmitter equipment 120, and/or satellites 130. Satellite dish 140, which may include one or more low noise blocks (LNBs), may be configured to receive transponder streams from multiple satellites and/or multiple transponders of the same satellite. Satellite dish 140 may be configured to receive television channels via transponder streams on multiple frequencies. Based on the characteristics of television receiver 150 and/or satellite dish 140, it may only be possible to capture transponder streams from a limited number of transponders concurrently. For example, a tuner of television receiver 150 may only be able to tune to a single transponder stream from a transponder of a single satellite at a given time. The tuner can then be re-tuned to another transponder of the same or a different satellite. A television receiver 150 having multiple tuners may allow for multiple transponder streams to be received at the same time. - In communication with
satellite dish 140 may be one or more television receivers. Television receivers may be configured to decode signals received from satellites 130 via satellite dish 140 for output and presentation via a display device, such as display device 160. A television receiver may be incorporated as part of a television or may be part of a separate device, commonly referred to as a set-top box (STB). Television receiver 150 may decode signals received via satellite dish 140 and provide an output to display device 160. On-demand content, such as PPV content, may be stored to a computer-readable storage medium. FIG. 2 provides additional detail of various embodiments of a television receiver. A television receiver is defined to include set-top boxes (STBs), and also circuitry having similar functionality that may be incorporated with another device. For instance, circuitry similar to that of a television receiver may be incorporated as part of a television. As such, while FIG. 1 illustrates an embodiment of television receiver 150 as separate from display device 160, it should be understood that, in other embodiments, similar functions may be performed by a television receiver integrated with display device 160. Television receiver 150 may include home automation engine 211, as detailed in relation to FIG. 2. -
Display device 160 may be used to present video and/or audio decoded and output by television receiver 150. Television receiver 150 may also output a display of one or more interfaces to display device 160, such as an electronic programming guide (EPG). In many embodiments, display device 160 is a television. Display device 160 may also be a monitor, computer, or some other device configured to display video and, possibly, play audio. - Uplink signal 170 a represents a signal between satellite transmitter equipment 120 and
satellite 130 a. Uplink signal 170 b represents a signal between satellite transmitter equipment 120 and satellite 130 b. Each of uplink signals 170 may contain streams of one or more different television channels. For example, uplink signal 170 a may contain a first group of television channels, while uplink signal 170 b contains a second group of television channels. Each of these television channels may be scrambled such that unauthorized persons are prevented from accessing the television channels. -
Downlink signal 180 a represents a signal between satellite 130 a and satellite dish 140. Downlink signal 180 b represents a signal between satellite 130 b and satellite dish 140. Each of downlink signals 180 may contain one or more different television channels, which may be at least partially scrambled. A downlink signal may be in the form of a transponder stream. A single transponder stream may be tuned to at a given time by a tuner of a television receiver. For example, downlink signal 180 a may be a first transponder stream containing a first group of television channels, while downlink signal 180 b may be a second transponder stream containing a different group of television channels. In addition to or instead of containing television channels, a transponder stream can be used to transmit on-demand content to television receivers, including PPV content, which may be stored locally by the television receiver until output for presentation. -
FIG. 1 illustrates downlink signal 180 a and downlink signal 180 b being received by satellite dish 140 and distributed to television receiver 150. For a first group of television channels, satellite dish 140 may receive downlink signal 180 a, and for a second group of channels, downlink signal 180 b may be received. Television receiver 150 may decode the received transponder streams. As such, depending on which television channels are desired to be presented or stored, various transponder streams from various satellites may be received, descrambled, and decoded by television receiver 150. -
Network 190, which may include the Internet, may allow for bidirectional communication between television receiver 150 and television service provider system 110, such as for home automation related services provided by home automation service server 112. Although illustrated as part of the television service provider system, the home automation service server 112 may be provided by a third party in embodiments. In addition to or instead of network 190, a telephone (e.g., landline) or cellular connection may be used to enable communication between television receiver 150 and television service provider system 110. -
FIG. 2 illustrates an embodiment of a television receiver 200, which may represent television receiver 150 of FIG. 1. Television receiver 200 may be configured to function as a host for a home automation system either alone or in conjunction with a communication device. Television receiver 200 may be in the form of a separate device configured to be connected with a display device, such as a television. Embodiments of television receiver 200 can include set-top boxes (STBs). In addition to being in the form of an STB, a television receiver may be incorporated as part of another device, such as a television, other form of display device, video game console, computer, mobile phone or tablet, or the like. For example, a television may have an integrated television receiver, which does not involve an external STB being coupled with the television. -
Television receiver 200 may be incorporated as part of a television, such as display device 160 of FIG. 1. Television receiver 200 may include: processors 210, which may include control processor 210 a, tuning management processor 210 b, and possibly additional processors; tuners 215; network interface 220; non-transitory computer-readable storage medium 225; electronic programming guide (EPG) database 230; television interface 235; digital video recorder (DVR) database 245, which may include provider-managed television programming storage and/or user-defined television programming; on-demand programming database 227; home automation settings database 247; home automation script database 248; remote control interface 250; security device 260; and/or descrambling engine 265. In other embodiments of television receiver 200, fewer or greater numbers of components may be present. It should be understood that the various components of television receiver 200 may be implemented using hardware, firmware, software, and/or some combination thereof. Functionality of components may be combined; for example, functions of descrambling engine 265 may be performed by tuning management processor 210 b. Further, functionality of components may be spread among additional components. - Processors 210 may include one or more specialized and/or general-purpose processors configured to perform processes such as tuning to a particular channel, accessing and displaying EPG information from
EPG database 230, and/or receiving and processing input from a user. It should be understood that the functions performed by various modules of FIG. 2 may be performed using one or more processors. As such, for example, functions of descrambling engine 265 may be performed by control processor 210 a. -
Control processor 210 a may communicate with tuning management processor 210 b. Control processor 210 a may control the recording of television channels based on timers stored in DVR database 245. Control processor 210 a may also provide commands to tuning management processor 210 b when recording of a television channel is to cease. In addition to providing commands relating to the recording of television channels, control processor 210 a may provide commands to tuning management processor 210 b that indicate television channels to be output to decoder module 233 for output to a display device. Control processor 210 a may also communicate with network interface 220 and remote control interface 250. Control processor 210 a may handle incoming data from network interface 220 and remote control interface 250. Additionally, control processor 210 a may be configured to output data via network interface 220. -
Control processor 210 a may include home automation engine 211. Home automation engine 211 may permit the television receiver and control processor 210 a to provide home automation functionality. Home automation engine 211 may have a JSON (JavaScript Object Notation) command interpreter or some other form of command interpreter that is configured to communicate with wireless devices via network interface 220 and a message server, possibly via a message server client. Such a command interpreter of home automation engine 211 may also communicate via a local area network with devices without using the Internet. Home automation engine 211 may contain multiple controllers specific to different protocols; for instance, a ZigBee® controller, a Z-Wave® controller, and/or an IP camera (wireless LAN, 802.11) controller may be present. Home automation engine 211 may contain a media server configured to serve streaming audio and/or video to remote devices on a local area network or the Internet. The television receiver may be able to serve such devices with recorded content, live content, and/or content recorded using one or more home automation devices, such as cameras. -
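A JSON command interpreter of the kind described might, at its simplest, decode a message and dispatch it to a protocol-specific controller. This is a minimal sketch under assumed names: the `set_light` command and its handler are hypothetical stand-ins, not an actual device protocol.

```python
import json

# Hypothetical handler standing in for a protocol-specific controller
# (e.g., a ZigBee, Z-Wave, or IP camera controller).
def set_light(params):
    return f"light {params['id']} set to {params['level']}"

HANDLERS = {"set_light": set_light}

def interpret(raw_message):
    """Decode a JSON command and dispatch it to the matching controller."""
    message = json.loads(raw_message)
    handler = HANDLERS[message["command"]]
    return handler(message.get("params", {}))

result = interpret('{"command": "set_light", "params": {"id": 3, "level": 40}}')
```

The same dispatch shape works whether messages arrive from a message server client or directly over the local area network.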
Tuners 215 may include one or more tuners used to tune to transponders that include broadcasts of one or more television channels. Such tuners may also be used to receive on-demand content and/or addressable television commercials for storage. In some embodiments, two, three, or more than three tuners may be present, such as four, six, or eight tuners. Each tuner contained in tuners 215 may be capable of receiving and processing a single transponder stream from a satellite transponder or from a cable network at a given time. As such, a single tuner may tune to a single transponder stream at a given time. If tuners 215 include multiple tuners, one tuner may be used to tune to a television channel on a first transponder stream for display using a television, while another tuner may be used to tune to a television channel on a second transponder for recording and viewing at some other time. If multiple television channels transmitted on the same transponder stream are desired, a single tuner of tuners 215 may be used to receive the signal containing the multiple television channels for presentation and/or recording. Tuners 215 may receive commands from tuning management processor 210 b. Such commands may indicate to tuners 215 which frequencies are to be tuned. -
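The tuner/transponder relationship above can be sketched as a small allocation routine: channels carried on the same transponder stream share a tuner, and distinct transponders beyond the tuner count cannot be received concurrently. The channel and transponder names are made up for illustration.

```python
def assign_tuners(requested_channels, channel_to_transponder, tuner_count):
    """Map requested channels onto tuners, one tuner per transponder stream."""
    assignments = {}          # transponder -> tuner index
    untunable = []
    for channel in requested_channels:
        transponder = channel_to_transponder[channel]
        if transponder in assignments:
            continue          # an already-tuned transponder carries this channel
        if len(assignments) < tuner_count:
            assignments[transponder] = len(assignments)
        else:
            untunable.append(channel)   # no free tuner for a new transponder
    return assignments, untunable

# ch1 and ch2 share transponder T1, so two tuners cover three channels;
# ch4 would need a third tuner.
mapping = {"ch1": "T1", "ch2": "T1", "ch3": "T2", "ch4": "T3"}
assignments, untunable = assign_tuners(
    ["ch1", "ch2", "ch3", "ch4"], mapping, tuner_count=2
)
```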
Network interface 220 may be used to communicate via an alternate communication channel with a television service provider, if such communication channel is available. A communication channel may be via satellite, which may be unidirectional to television receiver 200, and the alternate communication channel, which may be bidirectional, may be via a network, such as the Internet. Data may be transmitted from television receiver 200 to a television service provider system and from the television service provider system to television receiver 200. Information may be transmitted and/or received via network interface 220. For instance, instructions from a television service provider may also be received via network interface 220, if connected with the Internet. Besides satellite, the primary communication channel may be a cable network, an IP-based network, or a broadcast network. Network interface 220 may permit wireless communication with one or more types of networks, including using home automation network protocols and wireless network protocols. Also, wired networks may be connected to and communicated with via network interface 220. Device interface 221 may represent a USB port or some other form of communication port that permits communication with a communication device as will be explained further below. -
Storage medium 225 may represent one or more non-transitory computer-readable storage media. Storage medium 225 may include memory and/or a hard drive. Storage medium 225 may be used to store information received from one or more satellites and/or information received via network interface 220. Storage medium 225 may store information related to on-demand programming database 227, EPG database 230, DVR database 245, home automation settings database 247, and/or home automation script database 248. Recorded television programs may be stored using storage medium 225 as part of DVR database 245. Storage medium 225 may be partitioned or otherwise divided, such as into folders, such that predefined amounts of storage medium 225 are devoted to storage of television programs recorded due to user-defined timers and stored television programs recorded due to provider-defined timers. - Home
automation settings database 247 may allow configuration settings of home automation devices and user preferences to be stored. Home automation settings database 247 may store data related to various devices that have been set up to communicate with television receiver 200. For instance, home automation settings database 247 may be configured to store information on which types of events should be indicated to users, to which users, in what order, and by what communication methods. For instance, an event such as an open garage door may be indicated only to certain wireless devices, e.g., a cellular phone associated with a parent rather than a child, and notification may be by a third-party notification server, email, text message, and/or phone call. In some embodiments, a second notification method may only be used if a first fails. For instance, if a notification cannot be sent to the user via a third-party notification server, an email may be sent. - Home
automation settings database 247 may store information that allows for the configuration and control of individual home automation devices, which may operate using protocols such as Z-Wave and ZigBee. To do so, home automation engine 211 may create a proxy for each device that allows settings for the device to be solicited and collected via a user interface, e.g., presented on a television by the television receiver or overlay device. The received settings may then be handled by the proxy specific to the device's protocol, allowing the settings to be passed on to the appropriate device. Such an arrangement may allow settings to be collected and received via a UI of the television receiver or overlay device and passed to the appropriate home automation device and/or used for managing the appropriate home automation device. For example, a piece of exercise equipment that is enabled to interface with the home automation engine 211, such as via device interface 221, may be configured at the electronic device 211 in addition to on the piece of exercise equipment itself. Additionally, a mobile device, or an application residing on a mobile device and utilized with exercise equipment, may be configured in such a fashion as well for displaying received fitness information on a coupled display device. - Home
automation script database 248 may store scripts that detail how home automation devices are to function based on various events occurring. For instance, if stored content starts being played back by television receiver 200, lights in the vicinity of display device 160 may be dimmed and shades may be lowered by a communicatively coupled and controlled shade controller. As another example, when a user shuts programming off late in the evening, there may be an assumption that the user is going to bed. Therefore, the user may configure television receiver 200 to lock all doors via a lock controller, shut the garage door via a garage controller, lower a heat setting of a thermostat, shut off all lights via a light controller, and determine if any windows or doors are open via window sensors and door sensors and, if so, alert the user. Such scripts or programs may be predefined by the home automation/television service provider and/or may be defined by a user. - In some embodiments, home
automation script database 248 may allow for various music profiles to be implemented. For instance, based on home automation settings within a structure, appropriate music may be played. For example, when a piece of exercise equipment is connected or is used, energizing music may be played. Conversely, based on the music being played, settings of home automation devices may be determined. If television programming, such as a movie, is output for playback by television receiver 150, a particular home automation script may be used to adjust home automation settings, e.g., lower lights, raise temperature, and lock doors. -
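The event-to-actions scripts described above (playback dimming the lights, a late-evening shutoff running a bedtime routine) can be sketched as a simple lookup table. This is a minimal illustration, not the patent's implementation; the event names and device tuples are assumptions.

```python
# Hypothetical script table: each event maps to an ordered list of
# (device, command, argument) actions, mirroring the examples above.
SCRIPTS = {
    "playback_started": [("lights", "dim", 20), ("shades", "lower", None)],
    "programming_off_late": [
        ("locks", "lock_all", None),
        ("garage", "close", None),
        ("thermostat", "set_heat", 62),
        ("lights", "off", None),
        ("openings", "check", None),  # alert the user if a door/window is open
    ],
}

def run_script(event):
    """Return the actions to dispatch for an event; unknown events do nothing."""
    return SCRIPTS.get(event, [])
```

A user-defined script would simply add another entry to the table, which matches the text's point that scripts may be provider-predefined or user-defined.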
EPG database 230 may store information related to television channels and the timing of programs appearing on such television channels. EPG database 230 may be stored using storage medium 225, which may be a hard drive or solid-state drive. Information from EPG database 230 may be used to inform users of what television channels or programs are popular and/or provide recommendations to the user. Information from EPG database 230 may provide the user with a visual interface displayed by a television that allows a user to browse and select television channels and/or television programs for viewing and/or recording. Information used to populate EPG database 230 may be received via network interface 220, via satellite, or via some other communication link with a television service provider, e.g., a cable network. Updates to EPG database 230 may be received periodically. EPG database 230 may serve as an interface for a user to control DVR functions of television receiver 200, and/or to enable viewing and/or recording of multiple television channels simultaneously. EPG database 230 may also contain information about on-demand content or any other form of accessible content. -
Decoder module 233 may serve to convert encoded video and audio into a format suitable for output to a display device. For instance, decoder module 233 may receive MPEG video and audio from storage medium 225 or descrambling engine 265 to be output to a television. MPEG video and audio from storage medium 225 may have been recorded to DVR database 245 as part of a previously-recorded television program. Decoder module 233 may convert the MPEG video into a format appropriate to be displayed by a television or other form of display device and the MPEG audio into a format appropriate to be output from speakers. Decoder module 233 may have the ability to convert a finite number of television channel streams received from storage medium 225 or descrambling engine 265 simultaneously. For instance, decoders within decoder module 233 may be able to only decode a single television channel at a time. Decoder module 233 may have various numbers of decoders. -
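The finite-decoder constraint described above can be sketched as a small allocator: each decoder handles one channel stream at a time, so a new stream is refused until a decoder frees up. The class and method names are illustrative assumptions, not part of the disclosed design.

```python
# Hypothetical model of a decoder module with a fixed pool of decoders,
# each able to decode a single television channel at a time.
class DecoderModule:
    def __init__(self, num_decoders=2):
        self.free = num_decoders
        self.active = set()

    def start(self, channel):
        """Try to claim a decoder for a channel stream; False if none free."""
        if self.free == 0 or channel in self.active:
            return False
        self.free -= 1
        self.active.add(channel)
        return True

    def stop(self, channel):
        """Release the decoder assigned to a channel."""
        if channel in self.active:
            self.active.remove(channel)
            self.free += 1
```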
Television interface 235 may serve to output a signal to a television or another form of display device in a proper format for display of video and playback of audio. As such, television interface 235 may output one or more television channels, stored television programming from storage medium 225, e.g., television programs from DVR database 245, television programs from on-demand programming database 227, and/or information from EPG database 230, to a television for presentation. Television interface 235 may also serve to output a CVM. - Digital Video Recorder (DVR) functionality may permit a television channel to be recorded for a period of time. DVR functionality of
television receiver 200 may be managed by control processor 210 a. Control processor 210 a may coordinate the television channel, start time, and stop time of when recording of a television channel is to occur. DVR database 245 may store information related to the recording of television channels. DVR database 245 may store timers that are used by control processor 210 a to determine when a television channel should be tuned to and its programs recorded to DVR database 245 of storage medium 225. In some embodiments, a limited amount of storage medium 225 may be devoted to DVR database 245. Timers may be set by the television service provider and/or one or more users of television receiver 200. -
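The timer check described above can be sketched as comparing the current time against stored timers to decide which channels should be tuned and recorded. The timer fields and the minutes-after-midnight encoding are assumptions for illustration.

```python
# Hypothetical DVR timers: channel plus start/stop in minutes after midnight.
TIMERS = [
    {"channel": 101, "start": 18 * 60, "stop": 19 * 60},
    {"channel": 205, "start": 20 * 60, "stop": 21 * 60},
]

def channels_to_record(now_minutes, timers=TIMERS):
    """Return channels whose timers cover the current time."""
    return [t["channel"] for t in timers
            if t["start"] <= now_minutes < t["stop"]]
```

A control processor polling this function once per minute would know when to start and stop each recording.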
DVR database 245 may also be used to store recordings of service provider-defined television channels. For each day, an array of files may be created. For example, based on provider-defined timers, a file may be created for each recorded television channel for a day. For example, if four television channels are recorded from 6-10 PM on a given day, four files may be created, one for each television channel. Within each file, one or more television programs may be present. The service provider may define the television channels, the dates, and the time periods for which the television channels are recorded for the provider-defined timers. The provider-defined timers may be transmitted to television receiver 200 via the television provider's network. For example, in a satellite-based television service provider system, data necessary to create the provider-defined timers at television receiver 150 may be received via satellite. - On-
demand programming database 227 may store additional television programming. On-demand programming database 227 may include television programming that was not recorded to storage medium 225 via a timer, either user- or provider-defined. Rather, on-demand programming may be programming provided to the television receiver directly for storage by the television receiver and for later presentation to one or more users. On-demand programming may not be user-selected. As such, the television programming stored to on-demand programming database 227 may be the same for each television receiver of a television service provider. On-demand programming database 227 may include pay-per-view (PPV) programming that a user must pay and/or use an amount of credits to view. For instance, on-demand programming database 227 may include movies that are not available for purchase or rental yet. - Referring back to
tuners 215, television channels received via satellite or cable may contain at least some scrambled data. Packets of audio and video may be scrambled to prevent unauthorized users, e.g., non-subscribers, from receiving television programming without paying the television service provider. When a tuner of tuners 215 is receiving data from a particular transponder of a satellite, the transponder stream may be a series of data packets corresponding to multiple television channels. Each data packet may contain a packet identifier (PID), which can be determined to be associated with a particular television channel. Particular data packets, referred to as entitlement control messages (ECMs), may be periodically transmitted. ECMs may be associated with another PID and may be encrypted; television receiver 200 may use decryption engine 261 of security device 260 to decrypt ECMs. Decryption of an ECM may only be possible if the user has authorization to access the particular television channel associated with the ECM. When an ECM is determined to correspond to a television channel being stored and/or displayed, the ECM may be provided to security device 260 for decryption. - When
security device 260 receives an encrypted ECM, security device 260 may decrypt the ECM to obtain some number of control words. In some embodiments, from each ECM received by security device 260, two control words are obtained. In some embodiments, when security device 260 receives an ECM, it compares the ECM to the previously received ECM. If the two ECMs match, the second ECM is not decrypted because the same control words would be obtained. In other embodiments, each ECM received by security device 260 is decrypted; however, if a second ECM matches a first ECM, the outputted control words will match; thus, effectively, the second ECM does not affect the control words output by security device 260. Security device 260 may be permanently part of television receiver 200 or may be configured to be inserted and removed from television receiver 200, such as a smart card, cable card, or the like. -
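The ECM-deduplication behavior described above can be sketched as follows: an ECM identical to the one just processed is not decrypted again, since it would yield the same control words. The decrypt step here is a byte-slicing stand-in, not a real cipher, and the class is an illustrative assumption.

```python
# Hypothetical security device that skips decryption of a repeated ECM.
class SecurityDevice:
    def __init__(self):
        self.last_ecm = None
        self.last_words = None
        self.decrypt_count = 0  # tracks how many real decryptions occurred

    def _decrypt(self, ecm):
        self.decrypt_count += 1
        # Stand-in for real decryption: derive two "control words".
        return (ecm[:4], ecm[4:8])

    def process_ecm(self, ecm):
        """Return the two control words, decrypting only if the ECM changed."""
        if ecm != self.last_ecm:
            self.last_ecm = ecm
            self.last_words = self._decrypt(ecm)
        return self.last_words
```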
Tuning management processor 210 b may be in communication with tuners 215 and control processor 210 a. Tuning management processor 210 b may be configured to receive commands from control processor 210 a. Such commands may indicate when to start/stop receiving and/or recording of a television channel and/or when to start/stop causing a television channel to be output to a television. Tuning management processor 210 b may control tuners 215. Tuning management processor 210 b may provide commands to tuners 215 that instruct the tuners which satellite, transponder, and/or frequency to tune to. From tuners 215, tuning management processor 210 b may receive transponder streams of packetized data. - Descrambling
engine 265 may use the control words output by security device 260 in order to descramble video and/or audio corresponding to television channels for storage and/or presentation. Video and/or audio data contained in the transponder data stream received by tuners 215 may be scrambled. Video and/or audio data may be descrambled by descrambling engine 265 using a particular control word. Which control word output by security device 260 is to be used for successful descrambling may be indicated by a scramble control identifier present within the data packet containing the scrambled video or audio. Descrambled video and/or audio may be output by descrambling engine 265 to storage medium 225 for storage in DVR database 245, and/or to decoder module 233 for output to a television or other presentation equipment via television interface 235. - In some embodiments, the
television receiver 200 may be configured to periodically reboot in order to install software updates downloaded over the network 190 or satellites 130. Such reboots may occur, for example, during the night when the users are likely asleep and not watching television. If the system utilizes a single processing module to provide television receiving and home automation functionality, then the security functions may be temporarily deactivated during such a reboot. In order to increase the security of the system, the television receiver 200 may be configured to reboot at random times during the night in order to allow for installation of updates. Thus, an intruder is less likely to guess the time when the system is rebooting. In some embodiments, the television receiver 200 may include multiple processing modules for providing different functionality, such as television receiving functionality and home automation, such that an update to one module does not necessitate a reboot of the whole system. In other embodiments, multiple processing modules may be made available as a primary and a backup during any installation or update procedures. - For simplicity,
television receiver 200 of FIG. 2 has been reduced to a block diagram; commonly known parts, such as a power supply, have been omitted. Further, some routing between the various modules of television receiver 200 has been illustrated. Such illustrations are for exemplary purposes only. That two modules are not shown as directly or indirectly connected does not indicate that the modules cannot communicate. Rather, connections between modules of the television receiver 200 are intended only to indicate possible common data routing. It should be understood that the modules of television receiver 200 may be combined into a fewer number of modules or divided into a greater number of modules. Further, the components of television receiver 200 may be part of another device, such as built into a television. Television receiver 200 may include one or more instances of various computerized components, such as disclosed in relation to computer system 1100 of FIG. 11. - While the
television receiver 200 has been illustrated as a satellite-based television receiver, it is to be appreciated that techniques below may be implemented in other types of television receiving devices, such as cable receivers, terrestrial receivers, IPTV receivers, or the like. In some embodiments, the television receiver 200 may be configured as a hybrid receiving device, capable of receiving content from disparate communication networks, such as satellite and terrestrial television broadcasts. In some embodiments, the tuners may be in the form of network interfaces capable of receiving content from designated network locations. The home automation functions of television receiver 200 may be performed by an overlay device. If such an overlay device is used, television programming functions may still be provided by a television receiver that is not used to provide home automation functions. -
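The randomized overnight reboot described earlier, which makes the brief window of deactivated security functions hard for an intruder to predict, can be sketched as picking an update time inside a nighttime window. The 1 AM-5 AM window and the function name are assumptions for illustration.

```python
import random

def pick_reboot_minute(rng=random):
    """Return a randomized reboot time as minutes after midnight (1 AM-5 AM)."""
    # Randomizing within the window means the reboot moment differs each
    # night, so the security downtime cannot be anticipated.
    return rng.randrange(60, 300)
```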
FIG. 3 shows an embodiment of a system for home monitoring and control that includes a television receiver 350. The system 300 may include a television receiver that is directly or indirectly coupled to one or more display devices 360 such as a television or a monitor. The television receiver may be communicatively coupled to other display and notification devices 361 such as stereo systems, speakers, lights, mobile phones, tablets, and the like. The television receiver may be configured to receive readings from one or more sensors or sensor systems 346 and may be configured to provide signals for controlling one or more control units or control systems 346. - In embodiments the television receiver may include a monitoring and
control module - The system may include one or more monitoring and
control modules external to television receiver 350. The television receiver may interface to sensors and control units via one or more of the monitoring and control modules. - In some embodiments, the monitoring and control modules may connect to the television receiver via a communication port such as a USB port, serial port, and/or the like, or may connect to the television receiver via a wireless communication protocol such as Wi-Fi, Bluetooth, Z-Wave, ZigBee, and the like. The external monitoring and control modules may be separate devices that may be positioned near the television receiver or may be in a different location, remote from the television receiver.
- In embodiments, the monitoring and
control modules may collect readings from the sensors and control units and provide them to the data processing and storage 322 module. The data processing and storage 322 module may analyze the received data and generate control signals, schedules, and/or sequences for controlling the control units. Additionally, the data processing and storage module 322 may utilize input data to generate additional outputs. For example, the module 322 may receive from a sensor 342 information from a communicatively coupled piece of equipment. The sensor may be a part of or attached to the equipment in various embodiments. The equipment may provide information regarding movements, alarms, or notifications associated with the home, and the data processing module 322 may use this data to generate relative distance information to be output to and displayed by display device 360. - Sensors and control units may be wired or wirelessly coupled to the monitoring and
control modules, or to receiver 350 itself. The sensors and control units may be coupled and connected in serial, parallel, star, hierarchical, and/or the like topologies and may communicate with the monitoring and control modules via one or more serial, bus, or wireless protocols and technologies. The sensors may include any number of temperature, humidity, sound, proximity, field, electromagnetic, or magnetic sensors, cameras, infrared detectors, motion sensors, pressure sensors, smoke sensors, fire sensors, water sensors, and/or the like. The sensors may also be part of or attached to other pieces of equipment, such as exercise equipment, doors or windows, or home appliances, or may be applications or other sensors as part of mobile devices. - The monitoring and
control modules - The controllers, e.g.,
controller 343, may include a remote control designed for association with the television receiver. For example, the receiver remote control device may be communicatively coupled with the television receiver, such as through interface 250, or with one or more of the monitoring and control modules, for providing control or instruction for operation of the various devices of the system. The control may be utilized to provide instructions to the receiver for providing various functions of the automation system, including suspending alert notifications during an event. For example, a user may determine prior to or during an event that he wishes to suspend one or more types of notifications until the event has ended, and may so instruct the system with the controller. - Sensors may be part of other devices and/or systems. For example, sensors may be part of a mobile device such as a phone. The telemetry readings of the sensors may be accessed through a wireless communication interface such as a Bluetooth connection from the phone. As another example, temperature sensors may be part of a heating and ventilation system of a home. The readings of the sensors may be accessed via a communication interface of the heating and ventilation system. Sensors and/or control units may be combined into assemblies or units with multiple sensing capabilities and/or control capabilities. A single module may include, for example, a temperature sensor and a humidity sensor. Another module may include a light sensor and a power or control unit, and so on.
- In embodiments, the sensors and control units may be configurable or adjustable. In some cases the sensors and control units may be configurable or adjustable for specific applications. The sensors and control units may be adjustable by mechanical or manual means. In some cases the sensors and control units may be electronically adjustable via commands or instructions sent to the sensors or control units. For example, the focal length of a camera may be configurable in some embodiments. The focal length of a camera may be dependent on the application of the camera. In some embodiments the focal length may be manually set or adjusted by moving or rotating a lens. In some embodiments the focal length may be adjusted via commands that cause an actuator to move one or more lenses to change the focal length. In other embodiments, the sensitivity, response, position, spectrum, and/or the like of the sensors may be adjustable.
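The electronically adjustable sensor described above can be sketched as a command interface that moves a parameter, here a camera's focal length, within its mechanical limits. The class, command name, and limits are illustrative assumptions.

```python
# Hypothetical camera whose focal length is adjusted via commands,
# clamped to the range the lens actuator can physically reach.
class AdjustableCamera:
    def __init__(self, focal_mm=4.0, limits=(2.8, 12.0)):
        self.focal_mm = focal_mm
        self.limits = limits

    def command(self, name, value):
        """Apply an adjustment command and return the resulting setting."""
        if name == "set_focal_length":
            lo, hi = self.limits
            self.focal_mm = min(max(value, lo), hi)
        return self.focal_mm
```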
- During operation of the
system 300, readings from the sensors may be collected, stored, and/or analyzed in the television receiver 350. In embodiments, analysis of the sensors and control of the control units may be determined by configuration data 324 stored in the television receiver 350. The configuration data may define how the sensor data is collected, how often, over what periods of time, what accuracy is required, and other characteristics. The configuration data may specify specific sensor and/or control unit settings for a monitoring and/or control application. The configuration data may define how the sensor readings are processed and/or analyzed. For example, for some applications, sensor analysis may include collecting sensor readings and performing time-based analysis to determine trends, such as temperature fluctuations in a typical day or energy usage. Such trending information may be developed by the receiver into charts or graphs for display to the user. For other applications, sensor analysis may include monitoring sensor readings to determine if a threshold value of one or more sensors has been reached. - The function of the system may be determined by loading and/or identifying configuration data for an application. In embodiments, the
system 300 may be configured for more than one monitoring or control operation by selecting or loading the appropriate configuration data. In some embodiments the same sensors and/or control units may be used for multiple applications depending on the configuration data used to process and analyze sensor readings and/or activate the control units. Multiple monitoring and/or control applications may be active simultaneously or in a time-multiplexed manner using the same or similar set of sensors and/or control units. - For example, the
system 300 may be configured for both exercise monitoring and temperature monitoring applications using the same set of sensors. In embodiments, both monitoring applications may be active simultaneously or in a time-multiplexed manner depending on which configuration data is loaded. In both monitoring applications the same sensors, such as proximity sensors or cameras, may be used. Using the same sensors, the system may be configured for space temperature monitoring. For temperature monitoring, the system may only monitor a specific subset of the sensors for activity. For temperature monitoring, sensor activity may not need to be saved or recorded. The sensor readings may be monitored for specific thresholds which may indicate a threshold temperature for adjusting the space temperature. In this example, the two different monitoring applications may be selected based on the active configuration data. When one configuration data is active, data from the sensors may be saved and analyzed. When the second configuration data is active, the system may monitor sensor readings for specific thresholds. Of course, multiple or alternative sensors may be used as well. - In embodiments the results, status, analysis, and configuration data details for each application may be communicated to a user. In embodiments auditory, visual, and tactile communication methods may be used. In some cases a display device such as a television may be used for display and audio purposes. The display device may show information related to the monitoring and control application. Statistics, status, configuration data, and other elements may be shown. Users may also save particular configuration data for devices, such as notification suspensions while the user is using the coupled display. A user may log in or be recognized by the system upon activation and the system may make adjustments based on predetermined or recorded configuration data.
For example, a user may have instructed that when he is recognized by the system, either automatically or with provided login information, a notification suspension profile personal to the user be enacted. That profile may provide that the user would like to continue to receive alarms, such as smoke, fire, or hazard alarms, but that received telephone call information is suspended. The user may access the profile and select to begin, the user may be recognized by the system, or a combination may be used, such as the user being recognized by the system so that television operations are performed or are input by a remote control, while the user himself selects a particular activity to perform with the system.
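The per-user suspension profile just described can be sketched as a filter: alarm-type notifications always pass, while notification types the user suspended are dropped. The profile structure and field names are assumptions for illustration.

```python
# Hypothetical per-user notification suspension profiles: alarms such as
# smoke or fire always get through, while suspended types are dropped.
PROFILES = {
    "user_a": {
        "always_allow": {"smoke", "fire", "hazard"},
        "suspend": {"phone_call"},
    },
}

def should_notify(user, kind):
    """Decide whether a notification of `kind` reaches `user`."""
    profile = PROFILES.get(user)
    if profile is None:
        return True                      # no profile loaded: pass everything
    if kind in profile["always_allow"]:
        return True                      # alarms are never suspended
    return kind not in profile["suspend"]
```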
- Any number of additional adjustments or operations may be performed as well, as would be understood to be encompassed by the present technology. For example, the space temperature may be monitored or adjusted as well. In one situation, after the user has been exercising for a period of time, generated heat may raise the space temperature above a threshold such that the
home automation engine 211 additionally begins operation or adjustment of the HVAC system to cool the space. Additionally, configuration data for the user may include reducing the space temperature to a particular degree based on a preference of the user. Thus, when the user loads a profile or begins exercising, the home automation system may automatically begin adjusting the space temperature as well in anticipation of heat generation or user preferences. - In embodiments the system may include additional notification and
display devices 361 capable of notifying the user, showing the status, configuration data, and/or the like. The additional notification and display devices may be devices that are directly or indirectly connected with the television receiver. In some embodiments computers, mobile devices, phones, tablets, and the like may receive information, notifications, control signals, etc., from the television receiver. Data related to the monitoring and control applications and activity may be transmitted to remote devices and displayed to a user. Such display devices may be used for presenting to the user interfaces that may be used to further configure or change configuration data for each application. An interface may include one or more options, selection tools, or navigation tools for modifying the configuration data, which in turn may change monitoring and/or control activity of an application. Modification to a configuration may be used to adjust general parameters of a monitoring application to specific constraints or characteristics of a home, a user's schedule, control units, and/or the like. - Display interfaces may be used to select and/or download new configurations for monitoring and/or control applications. A catalog of pre-defined configuration data definitions for monitoring and control applications may be available to a user. A user may select, load, and/or install the applications on the television receiver by making a selection using in part the display device. For example, a user may load a profile based on notification suspension preferences as discussed above. In embodiments, configuration data may be a separate executable application, code, package, and/or the like. In some cases, the configuration data may be a set of parameters that define computations, schedules, or options for other processor-executable code or instructions. Configuration data may be metadata, text data, a binary file, and/or the like.
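As an illustration of configuration data as a set of parameters, the earlier temperature example can be sketched: under one configuration the same readings are logged for trend analysis, under another they are only checked against a threshold. The mode and field names are assumptions, not the disclosed format.

```python
# Hypothetical configuration-driven processing: the active configuration
# data decides whether readings are logged or threshold-checked.
def process_readings(readings, config):
    if config["mode"] == "trend":
        # Keep everything for later charting of daily fluctuations.
        return {"log": list(readings)}
    if config["mode"] == "threshold":
        # Only report whether any reading crossed the set point.
        return {"exceeded": any(r > config["setpoint"] for r in readings)}
    raise ValueError("unknown mode")

trend_cfg = {"mode": "trend"}
temp_cfg = {"mode": "threshold", "setpoint": 75}
```

Loading a different configuration file thus changes the monitoring behavior without changing the sensors themselves.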
- In embodiments notification and display devices may be configured to receive periodic, scheduled, or continuous updates for one or more monitoring and control applications. The notifications may be configured to generate pop-up screens, notification banners, sounds, and/or other visual, auditory, and/or tactile alerts. In the case where the display device is a television, some notifications may be configured to cause a pop-up or banner to appear over the programming or content being displayed, such as when a proximity monitor has been triggered in the home. Such an alert may be presented in a centrally located box or in a position different from the fitness information to make it more recognizable. Additionally, the program being watched can be paused automatically while such an alert is being presented, and may not be resumed unless an input or acceptance is received from the user. Some notifications may be configured to cause the television to turn on if it is powered off or in stand-by mode and display relevant information for a user. In this way, users can be warned of activity occurring elsewhere in the system.
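The pause-until-acknowledged alert behavior above can be sketched as a small state machine: a triggered alert pauses playback and shows a banner, and playback resumes only after user input. The class and field names are illustrative assumptions.

```python
# Hypothetical television state for alert handling: an alert pauses the
# program and shows a banner until the user acknowledges it.
class AlertingTelevision:
    def __init__(self):
        self.playing = True
        self.banner = None

    def alert(self, message):
        self.playing = False     # pause the program automatically
        self.banner = message    # show a centrally located banner

    def acknowledge(self):
        self.banner = None
        self.playing = True      # resume only after user input
```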
- The television receiver may also be configured to receive broadcast or
other input 362. Such input may include television channels or other information previously described that is used in conjunction with the monitoring system to produce customizable outputs. For example, a user may wish to watch a particular television channel while also receiving video information of activities occurring on the property. The television receiver may receive both the exterior camera information and television channel information to develop a modified output for display. The display may include a split screen in some way, a banner, an overlay, etc. - The systems and devices previously described may be used in performing various methods. The methods may also be represented by programming stored in memory of a computing device.
FIG. 4 illustrates an embodiment of a method 400 for coordinating home automation activity. Method 400 may be performed using any of the systems or components previously described. Method 400 may allow for an electronic device to receive and present event information over discrete time intervals for user review and consideration. Each step of method 400 may be performed at or by a single electronic device, such as an STB, television receiver, computer, or mobile device, for example, or by multiple devices communicating with one another. For example, such electronic device or devices may be a hub for the home automation system in embodiments of the present technology. Means for performing each step of method 400 include an electronic device and/or the various components of an electronic device or distribution system, such as those detailed in relation to FIGS. 1 and 2. Method 400 may be performed using a computerized device, such as a device incorporating some or all of the components of computer system 1100 of FIG. 11. - The method may include accessing event data at
operation 410. The event data may be accessed from a storage database communicatively coupled with an electronic device that accesses the event data. The event data may include a host of information that in embodiments may include a time and date at which the event data was collected. The methods may also include generating an event table identifying the number of events occurring during discrete time intervals for a date range or period of time at operation 420. The event table may provide a graphical depiction of the number of events that occurred for each discrete time interval in embodiments. The methods may also include outputting the event table from the electronic device for display on a display device at operation 430. - The present methods may allow a user to review an overview of events that have been recorded or generated throughout time intervals of minutes, hours, days, months, years, etc. Such methods may be utilized with a home automation system for reviewing security and event information, for example. In embodiments, any one or more of a plurality of connected devices as previously described may record or output information or data throughout a day. Sensors may provide information that a temperature has passed a set point, which may cause a cooling system to be engaged for cooling the space. If a smoke detector recognizes smoke within a space, it may enable an alarm or other sprinkler system for reacting to a potential fire. Many other examples available with a home automation system will be readily understood to be encompassed by the present technology. In embodiments, each time an event occurs related to the home automation system, the data associated with the event may be collected and stored in the storage database. In embodiments, the data may include some or all information or data related to the event.
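The event-table generation of operation 420 can be sketched as bucketing stored events into discrete time intervals and counting them, which yields the data behind the graphical depiction. Representing timestamps as (hour, minute) tuples and the interval choice are simplifying assumptions.

```python
from collections import Counter

def build_event_table(events, interval_hours=1):
    """Map each interval's starting hour to the number of events in it.

    `events` is a list of (hour, minute) timestamps for one day.
    """
    counts = Counter((hour // interval_hours) * interval_hours
                     for hour, _minute in events)
    return dict(counts)

# Two events between 9 and 10 AM and one at 2 PM produce two buckets.
table = build_event_table([(9, 15), (9, 40), (14, 5)], interval_hours=1)
```

A display step (operation 430) would then render these counts, e.g., as bars over the hours of the day.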
- In one embodiment, a security system may be a part of the home automation system, and may include cameras and other sensors and devices related to home security. The cameras may include motion detectors, sound detectors, thermal detectors, etc., that may be coupled with the one or more cameras to instruct them when to record. A camera may be configured to record constantly, at particular times based on pre-set instructions, or when a sensor has been tripped or triggered. Any one of these scenarios may be considered an event in relation to the present technology. For example, a camera may be coupled with a motion sensor, which when triggered causes the camera to begin recording. The camera may continue recording until the sensor ceases to note motion, or for a pre-determined amount of time, which can be for a few seconds or more, about 30 seconds or more, about 1 minute or more, about 5 minutes or more, about 10 minutes or more, or for any period of time for which the camera may be set to record. A similar scenario can be triggered by a sound detector, for example. In still other embodiments, a manual recording may be initiated at any time by a user of the system, which may record for any period of time the user wishes, or for pre-set periods of time as instructed by a user during system setup, for example.
- When a camera begins recording, the video recording may be included with the collected event data for the event, and may be stored in the storage database. The event data for a particular event may then include a time, date, video recording, triggering event explanation, sensor or camera identification, event duration, and any other information related to the event that occurs. Such information may be stored for every event that occurs during a day, or may be set by a user to include only particular types of data for particular events. For example, security events such as motion-activated camera activity may include all relevant data for the event, while the engagement of an HVAC system may include only the time at which the system was started and stopped in embodiments.
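One way to model a single item of event data as just described (time and date, triggering-event explanation, sensor or camera identification, duration, optional video recording) is a small record type. The field names below are illustrative assumptions rather than terms from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EventRecord:
    """One item of event data: time and date of collection, the
    triggering-event explanation, the sensor or camera identification,
    the event duration, and an optional reference to a stored video
    recording. All field names are illustrative assumptions."""
    timestamp: str                   # ISO-8601 time and date of collection
    trigger: str                     # e.g. "motion", "sound", "manual", "rule"
    source_id: str                   # sensor or camera identification
    duration_seconds: float = 0.0    # event duration
    video_ref: Optional[str] = None  # storage key of the recording, if any
```

A sensor-only event (such as an HVAC start/stop) would simply leave `video_ref` unset, matching the user-configurable per-event data described above.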
- While a user is away, such as at work, running errands, or on vacation, this event data may be collected for every event which may occur during the day or time period. Over the course of a workday or time period, these events may increase to a point of reaching or exceeding an amount of storage space in the storage database allocated for such activity. The storage database may include a local storage database on the electronic device in embodiments, and may also include a cloud-based storage or network-accessible storage external to the electronic device. For example, the service provider of the system may include an amount of external storage for use by individual users, which may include a subscription-based amount of storage, video quality, period for individual event storage, etc. Based on the subscription or amount of storage generally available to an individual user, event data, which may include video recordings, may fully utilize the available storage on a regular basis. As the storage space fills, it may be re-used in any number of ways including discarding the oldest events to provide space for newer events, for example. Accordingly, a user may wish to review event data on a regular basis in order to ensure security or other events have been recognized by the user. Regardless of storage requirements, a user may also simply wish to review the events during a period of time he or she was away from the home, business, or space at which the system is set.
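The "discard the oldest events to provide space for newer events" re-use policy mentioned above can be sketched with a bounded buffer. The class name and capacity value are editorial assumptions for illustration only:

```python
from collections import deque

class EventStore:
    """Fixed-capacity event store that silently discards the oldest
    events as new ones arrive, one possible way of re-using storage
    space once the allocated amount fills."""

    def __init__(self, capacity):
        # deque with maxlen drops entries from the opposite end on append
        self._events = deque(maxlen=capacity)

    def record(self, event):
        self._events.append(event)

    def events(self):
        return list(self._events)
```

A subscription tier as described would simply determine `capacity` (or an equivalent byte budget for video recordings).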
- As stated, based on the number of events that occur during a day or time period, the amount of review for a user may be quite large. The present technology may present the user with an overview of all events that have occurred during the time period in a useful way. For example, after a user has returned from work at the end of the day, he may access the electronic device to review event data. In an embodiment, the electronic device includes a television receiver, and may be a set-top box including a satellite or cable box. A user may be watching television, for example, and may wish to alternatively or simultaneously review event data for the day.
- As explained with regard to an embodiment in which the electronic device or hub includes a television receiver, such as for example with a set-top box, a user may provide an instruction to the electronic device to present events for review. The electronic device may access the storage database to retrieve all event data for a time period, which may default to the past 24 hours, or current day, or previous day, for example, and may be set to retrieve any amount of data or for any particular time period. The electronic device may retrieve the data from a storage database that may include local storage and remote storage, such as cloud storage, for example. In embodiments, certain information may be stored locally including event details, while other data, such as video recordings, may be stored externally, such as in cloud storage. Additionally, the event details may be stored both locally and in cloud storage for use in multiple ways.
- Once retrieved, the electronic device may analyze the data or parse the data for presentation. For example, the device may develop or generate an event table that includes information for each item of event data over the time period, or the item of event data itself. For example, the table may include the actual time and date information for each event, and may include a video recording associated with the event, or may include one or more selectable links to the video recording. The table may further parse and organize the data into discrete time intervals across the period of time, such as in hour-long increments or increments of some number of minutes, for example. The electronic device may then utilize this parsed information to produce a graphical depiction of each item of the event data across the discrete time intervals. An exemplary graphical depiction according to embodiments of the methods is illustrated in
FIG. 5. -
FIG. 5A shows an exemplary overlay 500 of event data for a home automation system according to the present technology. Overlay 500 may include a number of elements, including an event table 505 including a graphical depiction 510 of event data. Event table 505 may include some or all of the event data occurring during the period of time. For example, the table may include all alarms, video recordings, and system notifications that have occurred throughout the day. The event table may organize the events based on any number of event particulars including the type of event, the sensor or home automation feature receiving the event, the start and/or end time of the event, the date, the duration of the event, as well as aspects of the event related to the event data available. For example, for video recordings, the event table may also include information regarding whether the recording has been viewed, or an amount of how much of the recording has been viewed. The event type may include whether the event is a particular sensor event, or for a home automation feature such as a security camera, whether the event is a manual recording, a timer-based recording, a motion-initiated recording, a sound-initiated recording, or a rule-initiated recording. A rule-initiated recording may be a recording triggered by an alarm or a connected device, for example. In an embodiment, a user may have a camera set up to record an entryway (in the home or outside) when the doorbell is activated, or if a smoke alarm somewhere in the house initiates, a camera for that zone may begin recording. Any number of rules may be initiated for these or other home features and are understood to be encompassed by the present technology. - Presented within or with the event table may be a
graphical depiction 510 of the number of events that occurred for each discrete time interval of the period of time. For example, if the time period displayed is a particular day, the discrete time intervals may include a series of hours throughout the day. The series may include every hour of the day, or may include a subset of hours in the day such as only those hours during which an event took place, or only those hours for which the user is not home and has selected for review. As illustrated in the Figure, in an embodiment the graphical depiction may be or include a bar chart having a bar for each discrete time interval. In embodiments, the bar height and/or color may be used to indicate the number of items of event data occurring for each discrete time interval. For example, as the number of events during a discrete time interval increases, the bar height may increase commensurately. - Additionally, preset colors may be utilized to indicate the severity in the number of events during a time interval to provide a user with a visual cue to review the particular time interval for any major events. For example, as the number of events during a discrete time interval increases, the color of the corresponding bar may change. In an embodiment, one or no events during a discrete time interval may have a bar color that is a more neutral color, such as blue or green for example. For more events, such as 2 or 3, for example, the bar color for that time interval may change to yellow. As the event number increases to 4 or 5, the bar color for that time interval may change to orange, and as the number increases over 5 the bar color may change to red. It is to be understood that the numbers listed are only examples, and that the actual numbers used may be set based on an average number of events that occur in an individual situation, or any number preference of a user.
For example, in a home in which there are many people constantly coming and going, the user may set the system thresholds to double, triple, or more of the listed numbers for each color, or any other number of the user's choosing. Similarly, any particular color combination may be used or chosen by a user of the system.
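The severity coloring just described (one or no events neutral, 2 or 3 yellow, 4 or 5 orange, more than 5 red, with user-scalable thresholds) might be sketched as follows; the function name and the `scale` parameter are illustrative assumptions:

```python
def bar_color(event_count, scale=1):
    """Map an interval's event count to a severity color using the
    example thresholds from the text: <=1 neutral (green), 2-3 yellow,
    4-5 orange, >5 red.  `scale` lets a busy household double or triple
    the listed numbers for each color, as described."""
    if event_count <= 1 * scale:
        return "green"
    if event_count <= 3 * scale:
        return "yellow"
    if event_count <= 5 * scale:
        return "orange"
    return "red"
```

For instance, a household that doubles the thresholds (`scale=2`) keeps six events in a single interval at yellow rather than red.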
-
Graphical depiction 510 may take a variety of forms in embodiments that may be used in alternative arrangements or in combined forms. As illustrated in FIG. 5B, the graphical depiction 510 may take the form of a line graph indicating the number of events occurring during discrete time intervals. As would be understood, the line graph may also be combined with the previously described bar chart in embodiments. The line itself may take any number of forms, such as the angular view illustrated in the figure, a curved view having rounded corners, or any other line that may convey the number of events that occur during a specific time interval. - In another embodiment as illustrated in
FIG. 5C, the graphical depiction 510 may take the form of a chart or spreadsheet indicating the number of events occurring during discrete time intervals. As illustrated, such a format may allow additional information to be displayed including the discrete cameras, for example, as well as any other sensors or zones that may be identified for a space. Within the chart the events may be differentiated in any number of ways including by shading within a specific color scheme as well as with alternate colors based on the number of events. For example, the same or a relatively similar scheme may be used as identified above for colors including yellow, orange, red, etc., to identify time intervals during which events occur. In another example, shading may be utilized with one or more colors to indicate the number of events. As illustrated, for example, based on the darker shading at 8 AM and 9 AM, a user may identify that multiple events were detected by Camera 4 during those times. Any other presentation schemes or variations on the provided examples as would be understood to identify the number of events occurring during discrete time intervals are similarly encompassed by the present technology, and may be utilized alone or in combination in order to present the information of most use to a user of the technology. - In embodiments, a user may scroll to an individual bar of the bar chart and access the number of events for that time interval. For example, the event table may list all video recordings and/or events for a day, and then as a user scrolls over individual bars, the event table may adjust to show only the events of that time interval. Thus, in the corresponding methods, the methods may also include receiving an instruction selecting one or more of the discrete time intervals.
The user may provide those instructions with a remote device or mobile device, which can include a remote associated with the electronic device, such as a television or set-top box remote, for example. The methods may also include having the electronic device, in response to the instruction, further output for display a tabularized description of each item of event data occurring during the selected discrete time interval. The tabularized description may be shown in the event table location, and include any of the data previously described.
- In embodiments the user may also scroll down or over individual events within the event table, and select particular events for viewing, such as viewing additional particulars of the event or watching associated video. For example, for a sensor based event, by selecting the event in the event table, the user may learn the particulars of the event not already listed, such as a temperature relating to a temperature sensor based event. For events including a video recording, scrolling over the event may provide an image or screen shot from the camera in a
viewing window 515. Again, the corresponding methods may include receiving an instruction to select one of the time intervals, which may include an actual selection, such as clicking on an interval or bar, or a scrolling operation over a listing. In response to the instruction, the electronic device may output for display at least an image associated with each video recording for each item of event data occurring during the selected discrete time interval. Additionally, for a user scrolling over individual events within the event table listings, the electronic device may automatically output for display at least an image associated with the video recording, or may automatically begin streaming or playing the event in the viewing window. - The viewing window, the event table, or an additional window for display within the overlay may provide selectable filters allowing a user to filter event data. The filters may include filters for particular discrete time intervals, filters for particular sensors, cameras, or household devices, cameras for particular triggers such as motion-initiated events, etc. For example, as illustrated in
FIG. 5, viewing window 515 includes a number of selections available for filtering event data. The first selection includes all recordings, and the event table and graphical depiction include all event data for the selected time period. The user may also scroll to select individual cameras as illustrated, as well as particular triggering events for each camera. As illustrated, the individual cameras provide additional filters for both motion-initiating events and sound-initiating events. Additional filters may be included such as manual filters, or rule-based filters, for example, but any number of filter mechanisms may be included for parsing event data, and are to be understood as being encompassed by the present technology, which may be used in combination to further filter data per a user preference. - By scrolling over the filters, the user may provide an instruction to the electronic device to apply the selected or highlighted filter to the event table. Again, for example, a user may utilize a remote control for maneuvering a selection box or highlight over aspects of the overlay that may allow further interaction or selection by the user. In response, the electronic device may apply at least one filter to the event table. The application of the filter may filter not only the event data displayed in the event table, but may also adjust the bar chart height and/or color to include only the items of event data included by the at least one filter. For example, if a user selects to filter the event data to include only data from a particular camera, such as a front-door camera for example, event data from other cameras and/or sensors may be removed from both the event table and the graphical depiction such that the graphical depiction includes only data for the particular camera selected.
Accordingly, if the number of events in the discrete time intervals is reduced, the color and/or height may change based on the total number of events occurring during that time interval as previously described. This may provide an additional indication to a user of which intervals to review for an individual camera that a user may deem to be important.
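The filter application described above, which narrows the event table and would let the graphical depiction be re-derived from the remaining events, can be sketched as follows. The event keys (`camera`, `trigger`) are an assumed example layout:

```python
def apply_filters(events, camera=None, trigger=None):
    """Return only the events matching the selected filters, e.g. a
    particular camera (such as a front-door camera) and/or a trigger
    type (motion-initiated, sound-initiated, manual, rule-based).
    Filters left as None are not applied, so they combine freely."""
    result = []
    for event in events:
        if camera is not None and event.get("camera") != camera:
            continue
        if trigger is not None and event.get("trigger") != trigger:
            continue
        result.append(event)
    return result
```

Re-bucketing the filtered list into discrete time intervals would then adjust the bar heights and colors to reflect only the remaining items, as the text describes.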
- As previously described, the electronic device may be a home automation system hub, which may also be included with a television receiver, such as a satellite or cable set-top box. Such a hub may include input components to receive multimedia data including broadcast programming as well as user input, such as from a remote control. The electronic device may then be coupled with a display device, such as a television or screen. The electronic device may also be able to receive video data from communicatively coupled cameras over a network input, such as with cameras that include cloud-based storage of recorded information.
Overlay 500 may take a variety of forms, which may be further adjusted by a viewer. For example, the overlay may be a complete or partial overlay, and may be presented as a split-screen, picture-in-picture view, or punch-through as illustrated. Any number of user-adjustable configurations may be used to allow a user to view and review home automation information as well as watch broadcast or downloaded material simultaneously. - Turning to
FIG. 6 is illustrated a method 600 of coordinating home automation activity according to the present technology. The method may include accessing event data from a storage database communicatively coupled with an electronic device at operation 610. The database may be local or remote, and each item of event data may include a time and date at which the event data was collected as previously described. The method may also include monitoring the event data to determine that the number of items of event data for a discrete time interval surpasses a pre-defined threshold at operation 620. The method may still further include transmitting an alert to a mobile device indicating that the pre-defined threshold has been surpassed at operation 630. - The event data, which may include video recordings, may be stored locally at the electronic device, or remotely on a network-accessible storage device, including cloud storage maintained by a provider system. The monitoring may be performed by the electronic device itself, such as by a processor of the electronic device, or may be performed by the database, such as by the database management system of the storage device. If the event data is monitored by the electronic device, the database may provide updates to the electronic device whenever new event information is received. The electronic device may then update an event table and compare the data values against threshold values for providing alerts. If the database management system or database is performing the monitoring, it may receive threshold values from the electronic device, such as user-defined preferences, and then provide an announcement or alert to the electronic device when the threshold values are surpassed.
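The monitoring step of operation 620 reduces to comparing per-interval event counts against the pre-defined threshold. A minimal sketch, assuming the event table is represented as a mapping from interval start to event count:

```python
def check_thresholds(event_table, threshold):
    """Return the interval start times whose event counts surpass the
    pre-defined threshold, so that an alert (operation 630) can be
    transmitted for each.  `event_table` is assumed to map interval
    start times to event counts; the layout is illustrative only."""
    return sorted(t for t, count in event_table.items() if count > threshold)
```

Either the electronic device or the database management system, whichever performs the monitoring, could run such a comparison each time new event information arrives.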
- The threshold values may take several forms including both a discrete number as well as a pattern. In embodiments, the threshold values may include rules such as if the number of events in a discrete time interval surpasses 5, or any other number, or extends into the orange or red color as previously discussed, an alert is transmitted to the user. Rules may also be used if an amount of movement above a threshold is detected, or a sound above a threshold. Once a system is enabled, a user may adjust these settings based on typical occurrences around the property. Moreover, a rule may be enacted that if a pattern of alerts is received, an alert is transmitted to the user. An exemplary pattern may include a series of alerts from consecutive sensors or motion-activated cameras around a property. If cameras are positioned at each corner of a property, and are triggered in order within a particular time interval, such as a few seconds, or less than a minute, or about a few minutes depending on the property size, etc., an alert may be sent even if the number of discrete events is otherwise below a defined threshold for sending alerts. An order for the sensor triggers may include any pattern or sequence by which a user has set, or the system has been set, to recognize possible negative activity, such as criminal casing. Once a threshold or trigger has been satisfied, the electronic device may transmit an alert to the mobile device.
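The "cameras triggered in order within a particular time interval" pattern rule described above can be sketched as a subsequence check with a time window. The trigger tuple layout is an assumed format, and this minimal version does not retry overlapping candidate sequences:

```python
def matches_pattern(triggers, expected_order, window_seconds=60):
    """Check whether sensors fired in a configured order within a time
    window, as in the example of cameras at each corner of a property
    being triggered in sequence (possible criminal casing).

    `triggers` is a list of (sensor_id, epoch_seconds) tuples in
    arrival order; both the tuple layout and the default window are
    illustrative assumptions."""
    idx = 0       # position within the expected sequence
    start = None  # time the first expected sensor fired
    for sensor, ts in triggers:
        if sensor == expected_order[idx]:
            if idx == 0:
                start = ts
            elif ts - start > window_seconds:
                return False  # sequence too slow to count as a pattern
            idx += 1
            if idx == len(expected_order):
                return True   # full sequence seen within the window
    return False
```

A match would warrant transmitting an alert even when the raw event count for the interval sits below the ordinary threshold, as the text notes.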
- The
method 600 may optionally include generating an event table as previously described at operation 640. The event table may be routinely updated to include each event recorded by the home automation or security system, and may receive updates from an external storage, or may be interconnected with the security features to receive automatic recognition of event data. The event table may be similar to that previously described, and may identify the number of events occurring during discrete time intervals for a date range or period of time. The event table may include a graphical depiction of the number of events that occurred for each discrete time interval. Additionally, the graphical depiction may include a bar chart having a bar for each discrete time interval, and the bar height and/or color may be commensurate with the number of items of event data occurring for each discrete time interval. - The alert may take any number of forms, an example of which is illustrated in
FIGS. 7A-7C. As illustrated in the figure, the electronic device may transmit an alert 705 to a mobile device 700, such as a mobile phone, to a user. The electronic device may utilize various communication paths for communicating with various users' mobile devices. These communication paths may include one or more of: a push notification server system, an SMS server system, an email server system, a telephone service provider network, social media, or a general network. The push notification server system may be a system that causes a mobile device to display a message such that the message must be actively dismissed by the user prior to otherwise interacting with the mobile device. As such, a push notification has a high likelihood of being viewed by a user since the user is required to dismiss the push notification before performing any other functions, home automation related or not, with the mobile device. -
Alert 705 may include a variety of information, such as a notification that activity has surpassed a defined threshold, or that suspicious activity has been identified. Based on the notification or alert, the alert may include one or more links accessible by the user. By utilizing a cloud storage system for video recordings and streaming, a user may access one or more cameras or sensor devices while away from the hub and/or property under automation or surveillance. Additionally, if the network hub, such as a set-top box, includes a network connection, a user may access his device across the network in order to access video streams or information from the electronic device. In this way, the alert may include a link to a camera 715 associated with at least one of the items of event data, up to all cameras of the system, as illustrated in FIG. 7C. A user may view live footage of the camera as well as recorded event data in embodiments. The camera provided may be based on that camera being one associated with the event data. Additionally, the provided view may allow a user to access alternate cameras to view surrounding areas of the property. - The alert may additionally or alternatively include a
link 710 to access the generated event table for review of the overall system by the user as illustrated in FIG. 7B. Once accessed, the user may review the aspects of the event table as previously described. The event table as accessed on the mobile device may also include links selectable by a filter or otherwise accessible on a mobile platform to live-stream a camera of the home automation system. In this way, a user may be able to determine the best view of an event that may be unfolding, and access appropriate cameras or sensors. The information and tools available to the user may also allow the user to record from any camera manually by selecting the camera and enabling it. Similarly, the user may interact with other aspects of the home automation system, such as to lock/unlock doors, turn on/off equipment that is connected with the system, or access speakers to talk remotely with burglars, kids, or pets, for example. As will be explained below, the alert may also include a graphical, perspective depiction as illustrated in FIGS. 9 and 10 that may be accessed and manipulated by the user as well. - Turning to
FIG. 8 is shown another method 800 of coordinating home automation activity. Similarly to the above-described methods, the method 800 may include accessing event data from a storage database communicatively coupled with an electronic device at operation 810. Again, the storage database may be local or remote including a network-accessible database, or some combination of both. Additionally, each item of event data may include a time and date at which the event data was collected, along with any additional information as previously discussed including video recordings. The method may also include generating a graphical, perspective depiction at operation 820 that identifies the number of events occurring during discrete time intervals for a date range or period of time. In embodiments, the depiction may include the discrete time intervals in perspective view. The method may further include outputting the graphical, perspective depiction from the electronic device for display on a display device at operation 830. - The electronic device may be or include a television receiver, and may in embodiments be a satellite or cable set-top box, which may act as a hub for the home automation system. The hub may include an input for receiving multimedia data, which may include broadcast programming including cable or satellite broadcast television, on-demand programming, or internet-based data. The hub may also include an input for receiving user input such as with a remote control or mobile device. In embodiments, the hub may include one or more input components configured to receive video data from at least one communicatively coupled camera. The input may be an actual input for a camera of the system, or the camera may provide the video data or recordings to an additional database for storage that is accessible by the hub, such as over a network.
- The graphical, perspective depiction may allow one or more cameras of the system to be identified and event data associated with the camera to be displayed in a useful way for the user to quickly and easily identify events occurring over a period of time. An exemplary graphical, perspective depiction is illustrated at
FIG. 9. As illustrated, the graphical, perspective depiction 900 includes an interface for displaying home automation equipment, such as sensors, or cameras as illustrated, as well as a timeline of events. Discrete time intervals 905 are displayed in perspective view in the depiction, and individual cameras, or other sensors, of a home automation system may be shown within the graphical, perspective view as well. In embodiments, each item of data may also include a video recording that may be accessed or streamed by a user from the graphical, perspective depiction. -
Event icons 915 may be utilized for each item of event data, and may be positioned along the perspective view of the graphical, perspective depiction according to the time at which the item of event data associated with the event icon was collected. In embodiments the event icons may correspond to, and visually represent, a source of the event data. As merely an example, understanding that any different symbol may be used, the icon associated with the event type may include, for example, a speaker depiction for a sound-initiated event or recording, a person depicted in motion for a motion-initiated event or recording, a symbol such as an “M” for a manual recording, etc. Additionally, in embodiments a user may choose icons for each event type, or load images to be used for each event type icon, such as family pictures, words, etc. The source for each item of event data may be a specific triggering event for camera recording, such as motion detection, sound detection, an instruction to manually record, and a rule-initiated recording as previously described. For example, a rule-initiated recording may be triggered by an alarm, or a connected device as described elsewhere. - As illustrated in the exemplary graphical depiction, the
individual cameras 910 are shown horizontally against the perspective view of discrete time intervals, as abscissae to the time intervals in perspective view. Although four cameras are shown in the illustration, it is to be understood that any number of cameras or sensors may be included in the graphical depiction depending on user setup, system components, etc. A user may also customize these features in embodiments to include pages of cameras, for example, in order to show a less crowded view when many cameras or sensors are included in the depiction. With such a depiction, more detail may be shown than in the bar chart previously described. For example, by looking along the perspective view, and considering the individual icons, a user may quickly identify not only the number of events that have occurred, but also the type and location, such as by knowing the location of each camera, in order to quickly glean the events of a time period or day. - Graphical,
perspective depiction 900 may be configured as an overlay, picture-in-picture, split-screen of some sort, or punch-through as illustrated to include programming 930 while a user is reviewing event activity from the time period. Additionally, in embodiments the graphical depiction may include a detection field 920 corresponding to the discrete time interval associated with the nearest time displayed in the perspective view. The exemplary detection field is illustrated as a box around the discrete time interval across each camera. Event icons that are contained within the detection field may be differentiated from other event icons of the graphical depiction, so as to further enable a user to determine what is being reviewed. For example, the event icon within the detection field may be colored, shaded, etc. in order to provide an additional visual differentiation for the user. As illustrated in the Figure, a user may be currently reviewing activity at 1 PM, for example, where a motion-initiated recording was received at camera 3. - The
detection field 920 may also display a graphical connection between event icons positioned within the detection field 920 and a view from the video recording associated with the individual camera corresponding to the item of event data. For example, the graphical connection may include a graphical boundary surrounding the graphical, perspective depiction and a connector 922 between the graphical boundary and a view from the video recording 925 associated with the individual camera corresponding to the item of event data. When multiple event icons are located within the detection field 920, multiple camera views 925 may be tiled along the side of the graphical depiction so a user can view each event. The camera view 925 may include any number of views for a corresponding event icon. For example, when an event icon is displayed within the detection field, an image of the recording may be displayed, or in embodiments, the video may automatically begin streaming. - The electronic device or hub may also be configured to receive instructions from a remote control having ordinate and abscissa directionality controls, including horizontal and vertical buttons or inputs, such as a television receiver remote control. The remote control may have at least four keys for providing input, and at least two keys provide ordinate or vertical-based instructions, and at least two keys provide abscissa or horizontal-based instructions. By receiving an ordinate or vertical directionality control instruction, the graphical depiction may adjust along the perspective view of the discrete time intervals. For example, by selecting up, the view may scroll to the next discrete time interval, while the current time interval is removed from the depiction. Similarly, by selecting down, a new time interval may be added to the bottom of the depiction while the other discrete time intervals are shifted further along the perspective.
Additionally, the vertical inputs may simply move the detection field along the perspective discrete time intervals, and in embodiments additional vertical inputs, such as page-up or page-down commands, may actually change the time intervals displayed along the perspective view. Horizontal or abscissa-based instructions may adjust the view from one individual camera to the next, or may shift the number of cameras displayed, such as from cameras 1-4 to cameras 5-8, for example. Additionally, horizontal instructions may adjust the focus or view of a camera, or may rotate or move the camera itself.
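- By way of a non-limiting illustration, the directional navigation described above might be sketched as follows (the class, method, and parameter names here are illustrative assumptions, not part of the disclosure):

```python
class PerspectiveNavigator:
    """Tracks which discrete time intervals and cameras are visible in the
    perspective depiction, and adjusts them from directional key input."""

    def __init__(self, num_intervals, num_cameras,
                 intervals_shown=4, cameras_shown=4):
        self.num_intervals = num_intervals      # total discrete time intervals
        self.num_cameras = num_cameras          # total installed cameras
        self.intervals_shown = intervals_shown  # intervals along the perspective
        self.cameras_shown = cameras_shown      # cameras listed horizontally
        self.first_interval = 0                 # nearest interval in the view
        self.first_camera = 0                   # leftmost camera in the view

    def handle_key(self, key):
        """Map remote-control ordinate/abscissa keys to view adjustments."""
        if key == "up":       # scroll to the next discrete time interval
            self.first_interval = min(self.first_interval + 1,
                                      self.num_intervals - self.intervals_shown)
        elif key == "down":   # add an earlier interval at the bottom
            self.first_interval = max(self.first_interval - 1, 0)
        elif key == "right":  # shift to the next page of cameras (e.g. 1-4 to 5-8)
            self.first_camera = min(self.first_camera + self.cameras_shown,
                                    self.num_cameras - self.cameras_shown)
        elif key == "left":   # shift back to the previous page of cameras
            self.first_camera = max(self.first_camera - self.cameras_shown, 0)

    def visible(self):
        """Return the currently displayed interval and camera index ranges."""
        return (range(self.first_interval,
                      self.first_interval + self.intervals_shown),
                range(self.first_camera,
                      self.first_camera + self.cameras_shown))
```

In this sketch, ordinate (vertical) keys scroll the window of discrete time intervals, while abscissa (horizontal) keys page the set of displayed cameras, as described above.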
- In the alert system previously described, the graphical, perspective depiction may be linked with an alert, similarly to the event table described earlier. Such a perspective depiction may also be easily manipulated and controlled on a mobile device, which may allow touch-based control and operation. For example, a mobile device, including a mobile phone, may have a touch screen or allow touch-sensitivity-based controls; thus a vertical swipe across the touch screen may provide adjustments along the perspective view as previously described, and a horizontal swipe may provide adjustments along the camera views, or may adjust the position or focus of an individual camera. A user may also be able to select individual cameras, such as by selecting the horizontally listed camera itself, to view events during the time period for that camera.
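- The mapping of touch gestures onto the same navigation actions as the directional keys described above might be sketched as follows (the function name and pixel threshold are illustrative assumptions):

```python
def gesture_to_command(dx, dy, threshold=30):
    """Translate a touch swipe (pixel deltas) into a navigation command.

    A predominantly vertical swipe adjusts along the perspective view;
    a predominantly horizontal swipe adjusts across the camera views.
    """
    if abs(dx) < threshold and abs(dy) < threshold:
        return None                           # too small; treat as a tap/selection
    if abs(dy) >= abs(dx):
        return "up" if dy < 0 else "down"     # screen y grows downward
    return "right" if dx < 0 else "left"      # swiping left reveals cameras to the right
```

A swipe below the threshold would instead be treated as a selection, such as choosing an individual camera from the horizontal listing.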
- An additional embodiment of a graphical depiction 1000 is illustrated in FIG. 10, where the view from the video recording 1025 associated with the individual camera corresponding to the item of event data is positioned within that camera's space. For example, where individual cameras 1010 are listed horizontally, a view of the video recording, such as an image or a streaming view of the video recording, may be presented when an event icon is located at the currently viewed discrete time interval along the perspective view. As illustrated, a motion detection is displayed at the currently viewed time of 1 PM for camera 3; accordingly, an image from the video recording, or the video recording itself, may be displayed in the space corresponding to the horizontal listing of the individual camera 3, which is the camera associated with the item of event data. This may provide an additional means for reviewing multiple event icons for a discrete time interval without question of which video is from which camera, for example. -
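- The in-place presentation of FIG. 10 can be sketched as selecting, for each horizontally listed camera, whether its slot shows a recording view at the currently reviewed discrete time interval (the event-tuple layout and names below are illustrative assumptions):

```python
def views_for_interval(events, current_interval, cameras):
    """Return, per camera slot, the recording view to show at the current interval.

    `events` is an iterable of (camera_id, interval, recording) tuples;
    cameras without an event at the interval keep an empty slot (None).
    """
    slots = {camera_id: None for camera_id in cameras}
    for camera_id, interval, recording in events:
        if interval == current_interval and camera_id in slots:
            slots[camera_id] = recording  # an image or streaming view of the recording
    return slots
```

Because each recording is placed in the slot of its own camera, a reviewer is never left guessing which video came from which camera.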
FIG. 11 illustrates an embodiment of a computer system 1100. A computer system 1100 as illustrated in FIG. 11 may be incorporated into devices such as an STB, a first electronic device, a DVR, a television, a media system, a personal computer, and the like. Moreover, some or all of the components of the computer system 1100 may also be incorporated into a portable electronic device, mobile phone, or other device as described herein. FIG. 11 provides a schematic illustration of one embodiment of a computer system 1100 that can perform some or all of the steps of the methods provided by various embodiments. It should be noted that FIG. 11 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. FIG. 11, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner. - The
computer system 1100 is shown comprising hardware elements that can be electrically coupled via a bus 1105, or may otherwise be in communication, as appropriate. The hardware elements may include one or more processors 1110, including without limitation one or more general-purpose processors and/or one or more special-purpose processors such as digital signal processing chips, graphics acceleration processors, and/or the like; one or more input devices 1115, which can include without limitation a mouse, a keyboard, a camera, and/or the like; and one or more output devices 1120, which can include without limitation a display device, a printer, and/or the like. - The
computer system 1100 may further include and/or be in communication with one or more non-transitory storage devices 1125, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”), and/or a read-only memory (“ROM”), which can be programmable, flash-updateable, and/or the like. Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like. - The
computer system 1100 might also include a communications subsystem 1130, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities, etc., and/or the like. The communications subsystem 1130 may include one or more input and/or output communication interfaces to permit data to be exchanged with a network, such as the network described below to name one example, other computer systems, a television, and/or any other devices described herein. Depending on the desired functionality and/or other implementation concerns, a portable electronic device or similar device may communicate image and/or other information via the communications subsystem 1130. In other embodiments, a portable electronic device, e.g. the first electronic device, may be incorporated into the computer system 1100, e.g., an electronic device or STB, as an input device 1115. In many embodiments, the computer system 1100 will further comprise a working memory 1135, which can include a RAM or ROM device, as described above. - The
computer system 1100 also can include software elements, shown as being currently located within the working memory 1135, including an operating system 1140, device drivers, executable libraries, and/or other code, such as one or more application programs 1145, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the methods discussed above, such as those described in relation to FIG. 4, 6, or 8, might be implemented as code and/or instructions executable by a computer and/or a processor within a computer; in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general-purpose computer or other device to perform one or more operations in accordance with the described methods. - A set of these instructions and/or code might be stored on a non-transitory computer-readable storage medium, such as the storage device(s) 1125 described above. In some cases, the storage medium might be incorporated within a computer system, such as
computer system 1100. In other embodiments, the storage medium might be separate from a computer system, e.g., a removable medium, such as a compact disc, and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general-purpose computer with the instructions/code stored thereon. These instructions might take the form of executable code, which is executable by the computer system 1100, and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 1100, e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc., then takes the form of executable code. - It will be apparent to those skilled in the art that substantial variations may be made in accordance with specific requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices, such as network input/output devices, may be employed.
- As mentioned above, in one aspect, some embodiments may employ a computer system such as the
computer system 1100 to perform methods in accordance with various embodiments of the technology. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 1100 in response to processor 1110 executing one or more sequences of one or more instructions, which might be incorporated into the operating system 1140 and/or other code, such as an application program 1145, contained in the working memory 1135. Such instructions may be read into the working memory 1135 from another computer-readable medium, such as one or more of the storage device(s) 1125. Merely by way of example, execution of the sequences of instructions contained in the working memory 1135 might cause the processor(s) 1110 to perform one or more procedures of the methods described herein. Additionally or alternatively, portions of the methods described herein may be executed through specialized hardware. - The terms “machine-readable medium” and “computer-readable medium,” as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion. In an embodiment implemented using the
computer system 1100, various computer-readable media might be involved in providing instructions/code to processor(s) 1110 for execution and/or might be used to store and/or carry such instructions/code. In many implementations, a computer-readable medium is a physical and/or tangible storage medium. Such a medium may take the form of non-volatile media or volatile media. Non-volatile media include, for example, optical and/or magnetic disks, such as the storage device(s) 1125. Volatile media include, without limitation, dynamic memory, such as the working memory 1135. - Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
- Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 1110 for execution. Merely by way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the
computer system 1100. - The
communications subsystem 1130 and/or components thereof generally will receive signals, and the bus 1105 then might carry the signals and/or the data, instructions, etc. carried by the signals to the working memory 1135, from which the processor(s) 1110 retrieves and executes the instructions. The instructions received by the working memory 1135 may optionally be stored on a non-transitory storage device 1125 either before or after execution by the processor(s) 1110. - The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
- Specific details are given in the description to provide a thorough understanding of exemplary configurations including implementations. However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations will provide those skilled in the art with an enabling description for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
- Also, configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
- Having described several example configurations, various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the disclosure. For example, the above elements may be components of a larger system, wherein other rules may take precedence over or otherwise modify the application of the technology. Also, a number of steps may be undertaken before, during, or after the above elements are considered. Accordingly, the above description does not bind the scope of the claims.
- As used herein and in the appended claims, the singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise. Thus, for example, reference to “a user” includes a plurality of such users, and reference to “the processor” includes reference to one or more processors and equivalents thereof known to those skilled in the art, and so forth.
- Also, the words “comprise”, “comprising”, “contains”, “containing”, “include”, “including”, and “includes”, when used in this specification and in the following claims, are intended to specify the presence of stated features, integers, components, or steps, but they do not preclude the presence or addition of one or more other features, integers, components, steps, acts, or groups.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/844,862 US9704537B2 (en) | 2015-09-03 | 2015-09-03 | Methods and systems for coordinating home automation activity |
US14/844,939 US20170070775A1 (en) | 2015-09-03 | 2015-09-03 | Methods and systems for coordinating home automation activity |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/844,939 US20170070775A1 (en) | 2015-09-03 | 2015-09-03 | Methods and systems for coordinating home automation activity |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170070775A1 true US20170070775A1 (en) | 2017-03-09 |
Family
ID=58190806
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/844,862 Active US9704537B2 (en) | 2015-09-03 | 2015-09-03 | Methods and systems for coordinating home automation activity |
US14/844,939 Abandoned US20170070775A1 (en) | 2015-09-03 | 2015-09-03 | Methods and systems for coordinating home automation activity |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/844,862 Active US9704537B2 (en) | 2015-09-03 | 2015-09-03 | Methods and systems for coordinating home automation activity |
Country Status (1)
Country | Link |
---|---|
US (2) | US9704537B2 (en) |
Cited By (80)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180096157A1 (en) * | 2016-10-05 | 2018-04-05 | Microsoft Technology Licensing, Llc | Detection of compromised devices via user states |
WO2018175912A1 (en) * | 2017-03-24 | 2018-09-27 | Johnson Controls Technology Company | Building management system with dynamic channel communication |
EP3382665A1 (en) * | 2017-03-29 | 2018-10-03 | Honeywell International Inc. | Method for recreating time-based events using a building monitoring system |
US20180357871A1 (en) * | 2017-06-07 | 2018-12-13 | Amazon Technologies, Inc. | Informative Image Data Generation Using Audio/Video Recording and Communication Devices |
US20190137128A1 (en) * | 2017-11-06 | 2019-05-09 | International Business Machines Corporation | Adjusting settings of environmental devices connected via a network to an automation hub |
WO2019099367A1 (en) * | 2017-11-15 | 2019-05-23 | Johnson Controls Technology Company | Building energy management system |
US20190182534A1 (en) * | 2017-12-13 | 2019-06-13 | Google Llc | Tactile launching of an asymmetric visual communication session |
US20190188980A1 (en) * | 2017-12-15 | 2019-06-20 | Google Llc | External video clip distribution with metadata from a smart-home environment |
US20190342622A1 (en) * | 2018-05-07 | 2019-11-07 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
WO2020168154A1 (en) * | 2019-02-14 | 2020-08-20 | Savant Systems, Llc | Multi-role devices for automation environments |
US10755543B1 (en) * | 2019-07-08 | 2020-08-25 | Chekt Llc | Bridge device supporting alarm format |
US10779085B1 (en) | 2019-05-31 | 2020-09-15 | Apple Inc. | User interfaces for managing controllable external devices |
US10809682B2 (en) | 2017-11-15 | 2020-10-20 | Johnson Controls Technology Company | Building management system with optimized processing of building system data |
US10831163B2 (en) | 2012-08-27 | 2020-11-10 | Johnson Controls Technology Company | Syntax translation from first syntax to second syntax based on string analysis |
US10854194B2 (en) | 2017-02-10 | 2020-12-01 | Johnson Controls Technology Company | Building system with digital twin based data ingestion and processing |
US10962945B2 (en) | 2017-09-27 | 2021-03-30 | Johnson Controls Technology Company | Building management system with integration of data into smart entities |
WO2021087111A1 (en) * | 2019-11-01 | 2021-05-06 | Invue Security Products Inc. | Display control for televisions |
US11016998B2 (en) | 2017-02-10 | 2021-05-25 | Johnson Controls Technology Company | Building management smart entity creation and maintenance using time series data |
US11048397B2 (en) * | 2015-06-14 | 2021-06-29 | Google Llc | Methods and systems for presenting alert event indicators |
US11079913B1 (en) | 2020-05-11 | 2021-08-03 | Apple Inc. | User interface for status indicators |
US11105528B2 (en) | 2017-11-15 | 2021-08-31 | Johnson Controls Tyco IP Holdings LLP | Building management system with automatic synchronization of point read frequency |
US11120012B2 (en) | 2017-09-27 | 2021-09-14 | Johnson Controls Tyco IP Holdings LLP | Web services platform with integration and interface of smart entities with enterprise applications |
US11275348B2 (en) | 2017-02-10 | 2022-03-15 | Johnson Controls Technology Company | Building system with digital twin based agent processing |
US11281169B2 (en) | 2017-11-15 | 2022-03-22 | Johnson Controls Tyco IP Holdings LLP | Building management system with point virtualization for online meters |
US11280509B2 (en) | 2017-07-17 | 2022-03-22 | Johnson Controls Technology Company | Systems and methods for agent based building simulation for optimal control |
US11307538B2 (en) | 2017-02-10 | 2022-04-19 | Johnson Controls Technology Company | Web services platform with cloud-based feedback control |
US11314788B2 (en) | 2017-09-27 | 2022-04-26 | Johnson Controls Tyco IP Holdings LLP | Smart entity management for building management systems |
US11363071B2 (en) | 2019-05-31 | 2022-06-14 | Apple Inc. | User interfaces for managing a local network |
US11360447B2 (en) | 2017-02-10 | 2022-06-14 | Johnson Controls Technology Company | Building smart entity system with agent based communication and control |
US11495103B2 (en) * | 2017-01-23 | 2022-11-08 | Hanwha Techwin Co., Ltd. | Monitoring apparatus and system |
US20220376944A1 (en) | 2019-12-31 | 2022-11-24 | Johnson Controls Tyco IP Holdings LLP | Building data platform with graph based capabilities |
US11589010B2 (en) | 2020-06-03 | 2023-02-21 | Apple Inc. | Camera and visitor user interfaces |
US11657614B2 (en) | 2020-06-03 | 2023-05-23 | Apple Inc. | Camera and visitor user interfaces |
US11699903B2 (en) | 2017-06-07 | 2023-07-11 | Johnson Controls Tyco IP Holdings LLP | Building energy optimization system with economic load demand response (ELDR) optimization and ELDR user interfaces |
US11704311B2 (en) | 2021-11-24 | 2023-07-18 | Johnson Controls Tyco IP Holdings LLP | Building data platform with a distributed digital twin |
US11709965B2 (en) | 2017-09-27 | 2023-07-25 | Johnson Controls Technology Company | Building system with smart entity personal identifying information (PII) masking |
US11714930B2 (en) | 2021-11-29 | 2023-08-01 | Johnson Controls Tyco IP Holdings LLP | Building data platform with digital twin based inferences and predictions for a graphical building model |
US11727738B2 (en) | 2017-11-22 | 2023-08-15 | Johnson Controls Tyco IP Holdings LLP | Building campus with integrated smart environment |
US11726632B2 (en) | 2017-07-27 | 2023-08-15 | Johnson Controls Technology Company | Building management system with global rule library and crowdsourcing framework |
US11733663B2 (en) | 2017-07-21 | 2023-08-22 | Johnson Controls Tyco IP Holdings LLP | Building management system with dynamic work order generation with adaptive diagnostic task details |
US11735021B2 (en) | 2017-09-27 | 2023-08-22 | Johnson Controls Tyco IP Holdings LLP | Building risk analysis system with risk decay |
US11741165B2 (en) | 2020-09-30 | 2023-08-29 | Johnson Controls Tyco IP Holdings LLP | Building management system with semantic model integration |
US11755604B2 (en) | 2017-02-10 | 2023-09-12 | Johnson Controls Technology Company | Building management system with declarative views of timeseries data |
US11764991B2 (en) | 2017-02-10 | 2023-09-19 | Johnson Controls Technology Company | Building management system with identity management |
US11763266B2 (en) | 2019-01-18 | 2023-09-19 | Johnson Controls Tyco IP Holdings LLP | Smart parking lot system |
US11762343B2 (en) | 2019-01-28 | 2023-09-19 | Johnson Controls Tyco IP Holdings LLP | Building management system with hybrid edge-cloud processing |
US11761653B2 (en) | 2017-05-10 | 2023-09-19 | Johnson Controls Tyco IP Holdings LLP | Building management system with a distributed blockchain database |
US11768004B2 (en) | 2016-03-31 | 2023-09-26 | Johnson Controls Tyco IP Holdings LLP | HVAC device registration in a distributed building management system |
US11769066B2 (en) | 2021-11-17 | 2023-09-26 | Johnson Controls Tyco IP Holdings LLP | Building data platform with digital twin triggers and actions |
US11770020B2 (en) | 2016-01-22 | 2023-09-26 | Johnson Controls Technology Company | Building system with timeseries synchronization |
US11774920B2 (en) | 2016-05-04 | 2023-10-03 | Johnson Controls Technology Company | Building system with user presentation composition based on building context |
US11774922B2 (en) | 2017-06-15 | 2023-10-03 | Johnson Controls Technology Company | Building management system with artificial intelligence for unified agent based control of building subsystems |
US11785277B2 (en) | 2020-09-05 | 2023-10-10 | Apple Inc. | User interfaces for managing audio for media items |
US11792039B2 (en) | 2017-02-10 | 2023-10-17 | Johnson Controls Technology Company | Building management system with space graphs including software components |
US11796974B2 (en) | 2021-11-16 | 2023-10-24 | Johnson Controls Tyco IP Holdings LLP | Building data platform with schema extensibility for properties and tags of a digital twin |
US11874809B2 (en) | 2020-06-08 | 2024-01-16 | Johnson Controls Tyco IP Holdings LLP | Building system with naming schema encoding entity type and entity relationships |
US11874635B2 (en) | 2015-10-21 | 2024-01-16 | Johnson Controls Technology Company | Building automation system with integrated building information model |
US11880677B2 (en) | 2020-04-06 | 2024-01-23 | Johnson Controls Tyco IP Holdings LLP | Building system with digital network twin |
US11892180B2 (en) | 2017-01-06 | 2024-02-06 | Johnson Controls Tyco IP Holdings LLP | HVAC system with automated device pairing |
US11894944B2 (en) | 2019-12-31 | 2024-02-06 | Johnson Controls Tyco IP Holdings LLP | Building data platform with an enrichment loop |
US11902375B2 (en) | 2020-10-30 | 2024-02-13 | Johnson Controls Tyco IP Holdings LLP | Systems and methods of configuring a building management system |
US11899723B2 (en) | 2021-06-22 | 2024-02-13 | Johnson Controls Tyco IP Holdings LLP | Building data platform with context based twin function processing |
US11900287B2 (en) | 2017-05-25 | 2024-02-13 | Johnson Controls Tyco IP Holdings LLP | Model predictive maintenance system with budgetary constraints |
US11921481B2 (en) | 2021-03-17 | 2024-03-05 | Johnson Controls Tyco IP Holdings LLP | Systems and methods for determining equipment energy waste |
US11927925B2 (en) | 2018-11-19 | 2024-03-12 | Johnson Controls Tyco IP Holdings LLP | Building system with a time correlated reliability data stream |
US11934966B2 (en) | 2021-11-17 | 2024-03-19 | Johnson Controls Tyco IP Holdings LLP | Building data platform with digital twin inferences |
US11941238B2 (en) | 2018-10-30 | 2024-03-26 | Johnson Controls Technology Company | Systems and methods for entity visualization and management with an entity node editor |
US11947785B2 (en) | 2016-01-22 | 2024-04-02 | Johnson Controls Technology Company | Building system with a building graph |
US11954713B2 (en) | 2018-03-13 | 2024-04-09 | Johnson Controls Tyco IP Holdings LLP | Variable refrigerant flow system with electricity consumption apportionment |
US11954154B2 (en) | 2020-09-30 | 2024-04-09 | Johnson Controls Tyco IP Holdings LLP | Building management system with semantic model integration |
US11954478B2 (en) | 2017-04-21 | 2024-04-09 | Tyco Fire & Security Gmbh | Building management system with cloud management of gateway configurations |
US12013823B2 (en) | 2022-09-08 | 2024-06-18 | Tyco Fire & Security Gmbh | Gateway system that maps points into a graph schema |
US12013673B2 (en) | 2021-11-29 | 2024-06-18 | Tyco Fire & Security Gmbh | Building control system using reinforcement learning |
US12021650B2 (en) | 2019-12-31 | 2024-06-25 | Tyco Fire & Security Gmbh | Building data platform with event subscriptions |
EP4394728A1 (en) * | 2020-06-03 | 2024-07-03 | Apple Inc. | Camera and visitor user interfaces |
US12055908B2 (en) | 2017-02-10 | 2024-08-06 | Johnson Controls Technology Company | Building management system with nested stream generation |
US12061633B2 (en) | 2022-09-08 | 2024-08-13 | Tyco Fire & Security Gmbh | Building system that maps points into a graph schema |
US12061453B2 (en) | 2020-12-18 | 2024-08-13 | Tyco Fire & Security Gmbh | Building management system performance index |
US12100280B2 (en) | 2020-02-04 | 2024-09-24 | Tyco Fire & Security Gmbh | Systems and methods for software defined fire detection and risk assessment |
US12099334B2 (en) | 2019-12-31 | 2024-09-24 | Tyco Fire & Security Gmbh | Systems and methods for presenting multiple BIM files in a single interface |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10965484B2 (en) * | 2018-12-21 | 2021-03-30 | Opendoor Labs Inc. | Fleet of home electronic systems |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030115602A1 (en) * | 1995-06-07 | 2003-06-19 | Knee Robert Alan | Electronic television program guide schedule system and method with data feed access |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6690411B2 (en) * | 1999-07-20 | 2004-02-10 | @Security Broadband Corp. | Security system |
US20060001537A1 (en) * | 2003-11-20 | 2006-01-05 | Blake Wilbert L | System and method for remote access to security event information |
US8799952B2 (en) * | 2007-04-24 | 2014-08-05 | Google Inc. | Virtual channels |
US20110010624A1 (en) * | 2009-07-10 | 2011-01-13 | Vanslette Paul J | Synchronizing audio-visual data with event data |
US8438175B2 (en) * | 2010-03-17 | 2013-05-07 | Lighthaus Logic Inc. | Systems, methods and articles for video analysis reporting |
-
2015
- 2015-09-03 US US14/844,862 patent/US9704537B2/en active Active
- 2015-09-03 US US14/844,939 patent/US20170070775A1/en not_active Abandoned
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030115602A1 (en) * | 1995-06-07 | 2003-06-19 | Knee Robert Alan | Electronic television program guide schedule system and method with data feed access |
Cited By (146)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10831163B2 (en) | 2012-08-27 | 2020-11-10 | Johnson Controls Technology Company | Syntax translation from first syntax to second syntax based on string analysis |
US11754982B2 (en) | 2012-08-27 | 2023-09-12 | Johnson Controls Tyco IP Holdings LLP | Syntax translation from first syntax to second syntax based on string analysis |
US10859984B2 (en) | 2012-08-27 | 2020-12-08 | Johnson Controls Technology Company | Systems and methods for classifying data in building automation systems |
US11599259B2 (en) | 2015-06-14 | 2023-03-07 | Google Llc | Methods and systems for presenting alert event indicators |
US11048397B2 (en) * | 2015-06-14 | 2021-06-29 | Google Llc | Methods and systems for presenting alert event indicators |
US12105484B2 (en) | 2015-10-21 | 2024-10-01 | Johnson Controls Technology Company | Building automation system with integrated building information model |
US11899413B2 (en) | 2015-10-21 | 2024-02-13 | Johnson Controls Technology Company | Building automation system with integrated building information model |
US11874635B2 (en) | 2015-10-21 | 2024-01-16 | Johnson Controls Technology Company | Building automation system with integrated building information model |
US11947785B2 (en) | 2016-01-22 | 2024-04-02 | Johnson Controls Technology Company | Building system with a building graph |
US11770020B2 (en) | 2016-01-22 | 2023-09-26 | Johnson Controls Technology Company | Building system with timeseries synchronization |
US11894676B2 (en) | 2016-01-22 | 2024-02-06 | Johnson Controls Technology Company | Building energy management system with energy analytics |
US11768004B2 (en) | 2016-03-31 | 2023-09-26 | Johnson Controls Tyco IP Holdings LLP | HVAC device registration in a distributed building management system |
US11927924B2 (en) | 2016-05-04 | 2024-03-12 | Johnson Controls Technology Company | Building system with user presentation composition based on building context |
US11774920B2 (en) | 2016-05-04 | 2023-10-03 | Johnson Controls Technology Company | Building system with user presentation composition based on building context |
US10534925B2 (en) * | 2016-10-05 | 2020-01-14 | Microsoft Technology Licensing, Llc | Detection of compromised devices via user states |
US20180096157A1 (en) * | 2016-10-05 | 2018-04-05 | Microsoft Technology Licensing, Llc | Detection of compromised devices via user states |
US11892180B2 (en) | 2017-01-06 | 2024-02-06 | Johnson Controls Tyco IP Holdings LLP | HVAC system with automated device pairing |
US11495103B2 (en) * | 2017-01-23 | 2022-11-08 | Hanwha Techwin Co., Ltd. | Monitoring apparatus and system |
US11024292B2 (en) | 2017-02-10 | 2021-06-01 | Johnson Controls Technology Company | Building system with entity graph storing events |
US12019437B2 (en) | 2017-02-10 | 2024-06-25 | Johnson Controls Technology Company | Web services platform with cloud-based feedback control |
US11275348B2 (en) | 2017-02-10 | 2022-03-15 | Johnson Controls Technology Company | Building system with digital twin based agent processing |
US10854194B2 (en) | 2017-02-10 | 2020-12-01 | Johnson Controls Technology Company | Building system with digital twin based data ingestion and processing |
US11994833B2 (en) | 2017-02-10 | 2024-05-28 | Johnson Controls Technology Company | Building smart entity system with agent based data ingestion and entity creation using time series data |
US11809461B2 (en) | 2017-02-10 | 2023-11-07 | Johnson Controls Technology Company | Building system with an entity graph storing software logic |
US11016998B2 (en) | 2017-02-10 | 2021-05-25 | Johnson Controls Technology Company | Building management smart entity creation and maintenance using time series data |
US11762886B2 (en) | 2017-02-10 | 2023-09-19 | Johnson Controls Technology Company | Building system with entity graph commands |
US11307538B2 (en) | 2017-02-10 | 2022-04-19 | Johnson Controls Technology Company | Web services platform with cloud-based feedback control |
US11764991B2 (en) | 2017-02-10 | 2023-09-19 | Johnson Controls Technology Company | Building management system with identity management |
US11792039B2 (en) | 2017-02-10 | 2023-10-17 | Johnson Controls Technology Company | Building management system with space graphs including software components |
US11778030B2 (en) | 2017-02-10 | 2023-10-03 | Johnson Controls Technology Company | Building smart entity system with agent based communication and control |
US12055908B2 (en) | 2017-02-10 | 2024-08-06 | Johnson Controls Technology Company | Building management system with nested stream generation |
US11151983B2 (en) | 2017-02-10 | 2021-10-19 | Johnson Controls Technology Company | Building system with an entity graph storing software logic |
US11158306B2 (en) | 2017-02-10 | 2021-10-26 | Johnson Controls Technology Company | Building system with entity graph commands |
US11774930B2 (en) | 2017-02-10 | 2023-10-03 | Johnson Controls Technology Company | Building system with digital twin based agent processing |
US11360447B2 (en) | 2017-02-10 | 2022-06-14 | Johnson Controls Technology Company | Building smart entity system with agent based communication and control |
US11755604B2 (en) | 2017-02-10 | 2023-09-12 | Johnson Controls Technology Company | Building management system with declarative views of timeseries data |
US11762362B2 (en) | 2017-03-24 | 2023-09-19 | Johnson Controls Tyco IP Holdings LLP | Building management system with dynamic channel communication |
US11442424B2 (en) | 2017-03-24 | 2022-09-13 | Johnson Controls Tyco IP Holdings LLP | Building management system with dynamic channel communication |
WO2018175912A1 (en) * | 2017-03-24 | 2018-09-27 | Johnson Controls Technology Company | Building management system with dynamic channel communication |
US11042144B2 (en) | 2017-03-24 | 2021-06-22 | Johnson Controls Technology Company | Building management system with dynamic channel communication |
EP3382665A1 (en) * | 2017-03-29 | 2018-10-03 | Honeywell International Inc. | Method for recreating time-based events using a building monitoring system |
US11954478B2 (en) | 2017-04-21 | 2024-04-09 | Tyco Fire & Security Gmbh | Building management system with cloud management of gateway configurations |
US11761653B2 (en) | 2017-05-10 | 2023-09-19 | Johnson Controls Tyco IP Holdings LLP | Building management system with a distributed blockchain database |
US11900287B2 (en) | 2017-05-25 | 2024-02-13 | Johnson Controls Tyco IP Holdings LLP | Model predictive maintenance system with budgetary constraints |
US10769914B2 (en) * | 2017-06-07 | 2020-09-08 | Amazon Technologies, Inc. | Informative image data generation using audio/video recording and communication devices |
US11699903B2 (en) | 2017-06-07 | 2023-07-11 | Johnson Controls Tyco IP Holdings LLP | Building energy optimization system with economic load demand response (ELDR) optimization and ELDR user interfaces |
US20180357871A1 (en) * | 2017-06-07 | 2018-12-13 | Amazon Technologies, Inc. | Informative Image Data Generation Using Audio/Video Recording and Communication Devices |
US12061446B2 (en) | 2017-06-15 | 2024-08-13 | Johnson Controls Technology Company | Building management system with artificial intelligence for unified agent based control of building subsystems |
US11774922B2 (en) | 2017-06-15 | 2023-10-03 | Johnson Controls Technology Company | Building management system with artificial intelligence for unified agent based control of building subsystems |
US11920810B2 (en) | 2017-07-17 | 2024-03-05 | Johnson Controls Technology Company | Systems and methods for agent based building simulation for optimal control |
US11280509B2 (en) | 2017-07-17 | 2022-03-22 | Johnson Controls Technology Company | Systems and methods for agent based building simulation for optimal control |
US11733663B2 (en) | 2017-07-21 | 2023-08-22 | Johnson Controls Tyco IP Holdings LLP | Building management system with dynamic work order generation with adaptive diagnostic task details |
US11726632B2 (en) | 2017-07-27 | 2023-08-15 | Johnson Controls Technology Company | Building management system with global rule library and crowdsourcing framework |
US11762353B2 (en) | 2017-09-27 | 2023-09-19 | Johnson Controls Technology Company | Building system with a digital twin based on information technology (IT) data and operational technology (OT) data |
US11741812B2 (en) | 2017-09-27 | 2023-08-29 | Johnson Controls Tyco IP Holdings LLP | Building risk analysis system with dynamic modification of asset-threat weights |
US20220138183A1 (en) | 2017-09-27 | 2022-05-05 | Johnson Controls Tyco IP Holdings LLP | Web services platform with integration and interface of smart entities with enterprise applications |
US11709965B2 (en) | 2017-09-27 | 2023-07-25 | Johnson Controls Technology Company | Building system with smart entity personal identifying information (PII) masking |
US11762356B2 (en) | 2017-09-27 | 2023-09-19 | Johnson Controls Technology Company | Building management system with integration of data into smart entities |
US12013842B2 (en) | 2017-09-27 | 2024-06-18 | Johnson Controls Tyco IP Holdings LLP | Web services platform with integration and interface of smart entities with enterprise applications |
US10962945B2 (en) | 2017-09-27 | 2021-03-30 | Johnson Controls Technology Company | Building management system with integration of data into smart entities |
US11768826B2 (en) | 2017-09-27 | 2023-09-26 | Johnson Controls Tyco IP Holdings LLP | Web services for creation and maintenance of smart entities for connected devices |
US11735021B2 (en) | 2017-09-27 | 2023-08-22 | Johnson Controls Tyco IP Holdings LLP | Building risk analysis system with risk decay |
US12056999B2 (en) | 2017-09-27 | 2024-08-06 | Tyco Fire & Security Gmbh | Building risk analysis system with natural language processing for threat ingestion |
US11120012B2 (en) | 2017-09-27 | 2021-09-14 | Johnson Controls Tyco IP Holdings LLP | Web services platform with integration and interface of smart entities with enterprise applications |
US11314726B2 (en) | 2017-09-27 | 2022-04-26 | Johnson Controls Tyco IP Holdings LLP | Web services for smart entity management for sensor systems |
US11314788B2 (en) | 2017-09-27 | 2022-04-26 | Johnson Controls Tyco IP Holdings LLP | Smart entity management for building management systems |
US11449022B2 (en) | 2017-09-27 | 2022-09-20 | Johnson Controls Technology Company | Building management system with integration of data into smart entities |
US20190137128A1 (en) * | 2017-11-06 | 2019-05-09 | International Business Machines Corporation | Adjusting settings of environmental devices connected via a network to an automation hub |
US11262088B2 (en) * | 2017-11-06 | 2022-03-01 | International Business Machines Corporation | Adjusting settings of environmental devices connected via a network to an automation hub |
US11782407B2 (en) | 2017-11-15 | 2023-10-10 | Johnson Controls Tyco IP Holdings LLP | Building management system with optimized processing of building system data |
US11762351B2 (en) | 2017-11-15 | 2023-09-19 | Johnson Controls Tyco IP Holdings LLP | Building management system with point virtualization for online meters |
US11105528B2 (en) | 2017-11-15 | 2021-08-31 | Johnson Controls Tyco IP Holdings LLP | Building management system with automatic synchronization of point read frequency |
US11467549B2 (en) | 2017-11-15 | 2022-10-11 | Johnson Controls Tyco IP Holdings LLP | Building management system with optimized processing of building system data |
WO2019099367A1 (en) * | 2017-11-15 | 2019-05-23 | Johnson Controls Technology Company | Building energy management system |
US11281169B2 (en) | 2017-11-15 | 2022-03-22 | Johnson Controls Tyco IP Holdings LLP | Building management system with point virtualization for online meters |
US10809682B2 (en) | 2017-11-15 | 2020-10-20 | Johnson Controls Technology Company | Building management system with optimized processing of building system data |
US11727738B2 (en) | 2017-11-22 | 2023-08-15 | Johnson Controls Tyco IP Holdings LLP | Building campus with integrated smart environment |
US11259076B2 (en) * | 2017-12-13 | 2022-02-22 | Google Llc | Tactile launching of an asymmetric visual communication session |
US20190182534A1 (en) * | 2017-12-13 | 2019-06-13 | Google Llc | Tactile launching of an asymmetric visual communication session |
US20190188980A1 (en) * | 2017-12-15 | 2019-06-20 | Google Llc | External video clip distribution with metadata from a smart-home environment |
US10621838B2 (en) * | 2017-12-15 | 2020-04-14 | Google Llc | External video clip distribution with metadata from a smart-home environment |
US11954713B2 (en) | 2018-03-13 | 2024-04-09 | Johnson Controls Tyco IP Holdings LLP | Variable refrigerant flow system with electricity consumption apportionment |
US10904628B2 (en) | 2018-05-07 | 2021-01-26 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
US20190342622A1 (en) * | 2018-05-07 | 2019-11-07 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
US10820058B2 (en) * | 2018-05-07 | 2020-10-27 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
US12096085B2 (en) | 2018-05-07 | 2024-09-17 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
US11941238B2 (en) | 2018-10-30 | 2024-03-26 | Johnson Controls Technology Company | Systems and methods for entity visualization and management with an entity node editor |
US11927925B2 (en) | 2018-11-19 | 2024-03-12 | Johnson Controls Tyco IP Holdings LLP | Building system with a time correlated reliability data stream |
US11769117B2 (en) | 2019-01-18 | 2023-09-26 | Johnson Controls Tyco IP Holdings LLP | Building automation system with fault analysis and component procurement |
US11763266B2 (en) | 2019-01-18 | 2023-09-19 | Johnson Controls Tyco IP Holdings LLP | Smart parking lot system |
US11775938B2 (en) | 2019-01-18 | 2023-10-03 | Johnson Controls Tyco IP Holdings LLP | Lobby management system |
US11762343B2 (en) | 2019-01-28 | 2023-09-19 | Johnson Controls Tyco IP Holdings LLP | Building management system with hybrid edge-cloud processing |
US11863343B2 (en) | 2019-02-14 | 2024-01-02 | Savant Systems, Inc. | Multi-role devices for automation environments |
WO2020168154A1 (en) * | 2019-02-14 | 2020-08-20 | Savant Systems, Llc | Multi-role devices for automation environments |
JP7554757B2 (en) | 2019-02-14 | 2024-09-20 | サバント システムズ インコーポレイテッド | Multifunctional device for automated control environments |
CN113796059A (en) * | 2019-02-14 | 2021-12-14 | 萨万特系统公司 | Multi-role device for automated environments |
US11785387B2 (en) | 2019-05-31 | 2023-10-10 | Apple Inc. | User interfaces for managing controllable external devices |
US11824898B2 (en) | 2019-05-31 | 2023-11-21 | Apple Inc. | User interfaces for managing a local network |
US11363071B2 (en) | 2019-05-31 | 2022-06-14 | Apple Inc. | User interfaces for managing a local network |
US10779085B1 (en) | 2019-05-31 | 2020-09-15 | Apple Inc. | User interfaces for managing controllable external devices |
US12114142B2 (en) | 2019-05-31 | 2024-10-08 | Apple Inc. | User interfaces for managing controllable external devices |
US10904029B2 (en) | 2019-05-31 | 2021-01-26 | Apple Inc. | User interfaces for managing controllable external devices |
US10755543B1 (en) * | 2019-07-08 | 2020-08-25 | Chekt Llc | Bridge device supporting alarm format |
WO2021087111A1 (en) * | 2019-11-01 | 2021-05-06 | Invue Security Products Inc. | Display control for televisions |
US11991019B2 (en) | 2019-12-31 | 2024-05-21 | Johnson Controls Tyco IP Holdings LLP | Building data platform with event queries |
US11777757B2 (en) | 2019-12-31 | 2023-10-03 | Johnson Controls Tyco IP Holdings LLP | Building data platform with event based graph queries |
US12099334B2 (en) | 2019-12-31 | 2024-09-24 | Tyco Fire & Security Gmbh | Systems and methods for presenting multiple BIM files in a single interface |
US11894944B2 (en) | 2019-12-31 | 2024-02-06 | Johnson Controls Tyco IP Holdings LLP | Building data platform with an enrichment loop |
US11968059B2 (en) | 2019-12-31 | 2024-04-23 | Johnson Controls Tyco IP Holdings LLP | Building data platform with graph based capabilities |
US11777759B2 (en) | 2019-12-31 | 2023-10-03 | Johnson Controls Tyco IP Holdings LLP | Building data platform with graph based permissions |
US12063126B2 (en) | 2019-12-31 | 2024-08-13 | Tyco Fire & Security Gmbh | Building data graph including application programming interface calls |
US11824680B2 (en) | 2019-12-31 | 2023-11-21 | Johnson Controls Tyco IP Holdings LLP | Building data platform with a tenant entitlement model |
US20220376944A1 (en) | 2019-12-31 | 2022-11-24 | Johnson Controls Tyco IP Holdings LLP | Building data platform with graph based capabilities |
US11770269B2 (en) | 2019-12-31 | 2023-09-26 | Johnson Controls Tyco IP Holdings LLP | Building data platform with event enrichment with contextual information |
US12040911B2 (en) | 2019-12-31 | 2024-07-16 | Tyco Fire & Security Gmbh | Building data platform with a graph change feed |
US12021650B2 (en) | 2019-12-31 | 2024-06-25 | Tyco Fire & Security Gmbh | Building data platform with event subscriptions |
US11777756B2 (en) | 2019-12-31 | 2023-10-03 | Johnson Controls Tyco IP Holdings LLP | Building data platform with graph based communication actions |
US11991018B2 (en) | 2019-12-31 | 2024-05-21 | Tyco Fire & Security Gmbh | Building data platform with edge based event enrichment |
US11777758B2 (en) | 2019-12-31 | 2023-10-03 | Johnson Controls Tyco IP Holdings LLP | Building data platform with external twin synchronization |
US12100280B2 (en) | 2020-02-04 | 2024-09-24 | Tyco Fire & Security Gmbh | Systems and methods for software defined fire detection and risk assessment |
US11880677B2 (en) | 2020-04-06 | 2024-01-23 | Johnson Controls Tyco IP Holdings LLP | Building system with digital network twin |
US11079913B1 (en) | 2020-05-11 | 2021-08-03 | Apple Inc. | User interface for status indicators |
US11513667B2 (en) | 2020-05-11 | 2022-11-29 | Apple Inc. | User interface for audio message |
US11589010B2 (en) | 2020-06-03 | 2023-02-21 | Apple Inc. | Camera and visitor user interfaces |
US11657614B2 (en) | 2020-06-03 | 2023-05-23 | Apple Inc. | Camera and visitor user interfaces |
EP4394728A1 (en) * | 2020-06-03 | 2024-07-03 | Apple Inc. | Camera and visitor user interfaces |
US11937021B2 (en) | 2020-06-03 | 2024-03-19 | Apple Inc. | Camera and visitor user interfaces |
US11874809B2 (en) | 2020-06-08 | 2024-01-16 | Johnson Controls Tyco IP Holdings LLP | Building system with naming schema encoding entity type and entity relationships |
US11785277B2 (en) | 2020-09-05 | 2023-10-10 | Apple Inc. | User interfaces for managing audio for media items |
US11954154B2 (en) | 2020-09-30 | 2024-04-09 | Johnson Controls Tyco IP Holdings LLP | Building management system with semantic model integration |
US11741165B2 (en) | 2020-09-30 | 2023-08-29 | Johnson Controls Tyco IP Holdings LLP | Building management system with semantic model integration |
US11902375B2 (en) | 2020-10-30 | 2024-02-13 | Johnson Controls Tyco IP Holdings LLP | Systems and methods of configuring a building management system |
US12063274B2 (en) | 2020-10-30 | 2024-08-13 | Tyco Fire & Security Gmbh | Self-configuring building management system |
US12058212B2 (en) | 2020-10-30 | 2024-08-06 | Tyco Fire & Security Gmbh | Building management system with auto-configuration using existing points |
US12061453B2 (en) | 2020-12-18 | 2024-08-13 | Tyco Fire & Security Gmbh | Building management system performance index |
US11921481B2 (en) | 2021-03-17 | 2024-03-05 | Johnson Controls Tyco IP Holdings LLP | Systems and methods for determining equipment energy waste |
US11899723B2 (en) | 2021-06-22 | 2024-02-13 | Johnson Controls Tyco IP Holdings LLP | Building data platform with context based twin function processing |
US11796974B2 (en) | 2021-11-16 | 2023-10-24 | Johnson Controls Tyco IP Holdings LLP | Building data platform with schema extensibility for properties and tags of a digital twin |
US12055907B2 (en) | 2021-11-16 | 2024-08-06 | Tyco Fire & Security Gmbh | Building data platform with schema extensibility for properties and tags of a digital twin |
US11934966B2 (en) | 2021-11-17 | 2024-03-19 | Johnson Controls Tyco IP Holdings LLP | Building data platform with digital twin inferences |
US11769066B2 (en) | 2021-11-17 | 2023-09-26 | Johnson Controls Tyco IP Holdings LLP | Building data platform with digital twin triggers and actions |
US11704311B2 (en) | 2021-11-24 | 2023-07-18 | Johnson Controls Tyco IP Holdings LLP | Building data platform with a distributed digital twin |
US12013673B2 (en) | 2021-11-29 | 2024-06-18 | Tyco Fire & Security Gmbh | Building control system using reinforcement learning |
US11714930B2 (en) | 2021-11-29 | 2023-08-01 | Johnson Controls Tyco IP Holdings LLP | Building data platform with digital twin based inferences and predictions for a graphical building model |
US12061633B2 (en) | 2022-09-08 | 2024-08-13 | Tyco Fire & Security Gmbh | Building system that maps points into a graph schema |
US12013823B2 (en) | 2022-09-08 | 2024-06-18 | Tyco Fire & Security Gmbh | Gateway system that maps points into a graph schema |
Also Published As
Publication number | Publication date |
---|---|
US9704537B2 (en) | 2017-07-11 |
US20170069353A1 (en) | 2017-03-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9704537B2 (en) | Methods and systems for coordinating home automation activity | |
US9977587B2 (en) | Fitness overlay and incorporation for home automation system | |
US9967614B2 (en) | Alert suspension for home automation system | |
US9628286B1 (en) | Television receiver and home automation system and methods to associate data with nearby people | |
US10073428B2 (en) | Methods and systems for control of home automation activity based on user characteristics | |
US10060644B2 (en) | Methods and systems for control of home automation activity based on user preferences | |
US10027503B2 (en) | Integrated door locking and state detection systems and methods | |
US20160203700A1 (en) | Methods and systems to make changes in home automation based on user states | |
US11659225B2 (en) | Systems and methods for targeted television commercials based on viewer presence | |
US20170064412A1 (en) | Device-based event detection and notification surfacing | |
US9632746B2 (en) | Automatic muting | |
US9798309B2 (en) | Home automation control based on individual profiling using audio sensor data | |
US9495860B2 (en) | False alarm identification | |
US20160191912A1 (en) | Home occupancy simulation mode selection and implementation | |
US20170146964A1 (en) | HVAC Health Monitoring Systems and Methods | |
US20160127765A1 (en) | Pausing playback of media content based on user presence | |
CA2822753A1 (en) | Methods and apparatuses to facilitate preselection of programming preferences | |
EP3053344B1 (en) | Intelligent recording of favorite video content using a video services receiver | |
US9934659B2 (en) | Outdoor messaging display for home automation/security systems | |
US9832530B2 (en) | Reduce blue light at set-top box to assist with sleep |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ECHOSTAR TECHNOLOGIES L.L.C., COLORADO Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAXIER, KAREN;NGUYEN, TONY;REEL/FRAME:036491/0033 Effective date: 20150903 |
AS | Assignment |
Owner name: ECHOSTAR TECHNOLOGIES INTERNATIONAL CORPORATION, COLORADO Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ECHOSTAR TECHNOLOGIES L.L.C.;REEL/FRAME:041735/0861 Effective date: 20170214 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |