US20190208390A1 - Methods And Apparatus For Exploiting Interfaces Smart Environment Device Application Program Interfaces
- Publication number
- US20190208390A1 (application No. US16/293,358)
- Authority
- US
- United States
- Prior art keywords
- smart
- home
- lighting device
- home lighting
- devices
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H04W4/80—Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
- F24F11/30—Control or safety arrangements for purposes related to the operation of the system, e.g. for safety or monitoring
- F24F11/46—Improving electric energy efficiency or saving
- F24F11/48—Control or safety arrangements prior to normal operation, e.g. pre-heating or pre-cooling
- F24F11/523—Indication arrangements, e.g. displays for displaying temperature data
- F24F11/58—Remote control using Internet communication
- F24F11/61—Control or safety arrangements characterised by user interfaces or communication using timers
- F24F11/62—Control or safety arrangements characterised by the type of control or by internal processing, e.g. using fuzzy logic, adaptive control or estimation of values
- F24F11/65—Electronic processing for selecting an operating mode
- F24F2120/00—Control inputs relating to users or occupants
- G05B13/04—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion, electric, involving the use of models or simulators
- G05B15/02—Systems controlled by a computer, electric
- G05B17/02—Systems involving the use of models or simulators of said systems, electric
- G05B2219/2642—Domotique, domestic, home control, automation, smart house
- G05D23/1904—Control of temperature characterised by the use of electric means, using a variable reference value variable in time
- G05D23/1917—Control of temperature characterised by the use of electric means, using digital means
- G06F9/54—Interprogram communication
- G06F9/541—Interprogram communication via adapters, e.g. between incompatible applications
- G06F9/546—Message passing systems or structures, e.g. queues
- G08B17/10—Fire alarms; actuation by presence of smoke or gases, e.g. automatic alarm devices for analysing flowing fluid materials by the use of optical means
- H04L12/2803—Home automation networks
- H04L12/2816—Controlling appliance services of a home automation network by calling their functionalities
- H04L12/2818—Controlling appliance services from a device located outside both the home and the home network
- H04L12/282—Controlling appliance services based on user interaction within the home
- H04L12/2823—Reporting information sensed by appliance or service execution status of appliance services in a home automation network
- H04L12/2829—Reporting to a device within the home network, where reception of the reported information automatically triggers execution of a home appliance functionality, involving user profiles according to which the execution is triggered
- H04L2012/2841—Home automation networks characterised by the type of medium used: wireless
- H04L2012/285—Home automation networks characterised by the type of home appliance used: generic home appliances, e.g. refrigerators
- H04L29/06047—
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/1097—Protocols in which an application is distributed across nodes in the network for distributed storage of data in networks, e.g. transport arrangements for network file system [NFS], storage area networks [SAN] or network attached storage [NAS]
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
- H04L67/22—
- H04L67/2866—Architectures; Arrangements
- H04L67/30—Profiles
- H04L67/42—
- H04L67/535—Tracking the activity of the user
- H04M1/72415—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories for remote control of appliances
- H04M1/72533—
- H04W4/021—Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
- H04W4/60—Subscription-based services using application servers or record carriers, e.g. SIM application toolkits
- H05B37/0227—
- H05B37/0272—
- H05B47/105—Controlling the light source in response to determined parameters
- H05B47/19—Controlling the light source by remote control via wireless transmission
- H05B47/196—Controlling the light source by remote control characterised by user interface arrangements
- H05B47/1965—Controlling the light source by remote control using handheld communication devices
- Y02B20/40—Control techniques providing energy savings, e.g. smart controller or presence detection
Definitions
- This disclosure relates to controlling access to electronic devices via application programming interface (API) restrictions.
- a person may interact with smart thermostats, lighting systems, alarm systems, entertainment systems, and a variety of other electronic devices.
- a person may communicate a command using an application program running on another electronic device.
- a person may control the temperature setting on a smart thermostat using an application program running on a smartphone.
- the application program may communicate with a secure online service that interacts with that thermostat.
- the manufacturer of the electronic device may develop the application programs to control the electronic device. Opening access to the electronic devices to third-party developers, however, may improve the experience of some people with the devices, but only if third-party application programs do not cause the electronic devices to behave in an undesirable manner. Accordingly, while it may be desirable to open access to the electronic devices to third-party developers, it may also be desirable to place restrictions on that access to reduce the risk that third-party access negatively impacts the operation of the electronic devices and, thus, the user experience associated with those devices.
- applications may access different installations of smart home devices (e.g., via an application programming interface (API)).
- the third party applications may communicate not directly with a smart home device, but rather through a device service.
- the device service may provide a corresponding update signal to the target smart home device based on one or more factors such as operation status parameters of the device.
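- As a concrete illustration of this mediation, the following Python sketch shows a hypothetical device service that accepts writes from third-party clients and forwards an update signal only when the target device's status permits it; the class names, status fields, and rate limit are illustrative assumptions, not part of the disclosed system.

```python
# Hypothetical sketch of a device service that mediates third-party API writes.
# Names (DeviceService, SmartDevice, etc.) are illustrative, not from the patent.
import time

class SmartDevice:
    def __init__(self, device_id):
        self.device_id = device_id
        self.status = {"online": True, "battery_low": False}
        self.fields = {"target_temperature_f": 70.0}

    def apply_update(self, field, value):
        self.fields[field] = value

class DeviceService:
    """Sits between third-party clients and devices; clients never talk to devices directly."""
    WRITE_INTERVAL_S = 60  # assumed rate limit: one write per client/device/field per minute

    def __init__(self):
        self.devices = {}
        self.last_write = {}  # (client_id, device_id, field) -> timestamp

    def register(self, device):
        self.devices[device.device_id] = device

    def handle_write(self, client_id, device_id, field, value):
        device = self.devices.get(device_id)
        if device is None or not device.status["online"]:
            return "rejected: device unavailable"
        if device.status["battery_low"]:
            return "rejected: conserving battery"
        key = (client_id, device_id, field)
        now = time.time()
        if now - self.last_write.get(key, 0) < self.WRITE_INTERVAL_S:
            return "rejected: rate limited"
        self.last_write[key] = now
        device.apply_update(field, value)  # corresponding update signal to the device
        return "accepted"

service = DeviceService()
service.register(SmartDevice("thermostat-1"))
print(service.handle_write("third-party-app", "thermostat-1", "target_temperature_f", 72.0))
```

- In this sketch, rejecting or deferring writes at the service layer is what keeps a misbehaving third-party application from degrading the device itself.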
- FIG. 1 is a block diagram of a smart home device, in accordance with an embodiment
- FIG. 2 is a block diagram of a connected smart home environment that includes a number of smart home devices, in accordance with an embodiment
- FIG. 3 is a block diagram illustrating a manner of controlling and/or accessing the smart home environment using services over the internet, in accordance with an embodiment
- FIG. 4 is a block diagram of processing paradigms that may be used to control devices of the smart home environment, in accordance with an embodiment
- FIG. 5 is a block diagram of a system that provides access to smart home devices, in accordance with an embodiment
- FIG. 6 is a flow diagram illustrating a method for transitioning temperatures based upon an estimated time of arrival, in accordance with an embodiment
- FIG. 7 is a block diagram illustrating window creation for the method of FIG. 6, in accordance with an embodiment
- FIG. 8 is a flow diagram illustrating a method for controlling devices using geo-fencing, in accordance with an embodiment
- FIG. 9 is a block diagram illustrating a set of geo-fence boundaries, in accordance with an embodiment
- FIG. 10 is a block diagram illustrating a geo-fencing application on a handheld electronic device, in accordance with an embodiment
- FIG. 11 is a block diagram illustrating an application running from an in-dash interface, in accordance with an embodiment
- FIG. 12 is a schematic illustration of a conditional rule where a thermostat, a smoke/carbon monoxide detector, or both are outputs, in accordance with an embodiment
- FIG. 13 is a schematic illustration of a conditional rule where data from a thermostat, a smoke/carbon monoxide detector, or both are conditions, in accordance with an embodiment
- FIG. 14 is a block diagram of a system that integrates household appliances with a thermostat, smoke/carbon monoxide detector, or both, in accordance with an embodiment
- FIG. 15 is a block diagram of a system that integrates a booking service with a thermostat, smoke/carbon monoxide detector, an alarm system, or combination thereof, in accordance with an embodiment
- FIG. 16 is a block diagram of a system that integrates a garage door opener with a thermostat, smoke/carbon monoxide detector, or both, in accordance with an embodiment.
- Embodiments of the present disclosure relate to an electronic device, such as a thermostat or a hazard detector (e.g., smoke detector), that may be disposed in a building (e.g., home or office) such that the electronic device may detect the presence of a human being in the building and distinguish between the presence of the human being and a pet.
- the electronic device may employ a sensor, such as a passive infrared (PIR) sensor, to detect the presence of a human being.
- each PIR sensor may be inherently sensitive to different levels of noise. By accounting for the different sensitivity levels of each PIR sensor, the electronic device may improve its detection of human beings and better distinguish between the presence of human beings and pets.
- the electronic device may include a low-power processor that may store the sensor measurements acquired by the PIR sensor during a time period when the building, or the portion of the building being monitored by the electronic device, is not expected to have a human being present.
- the low-power processor may send the stored sensor measurements to a high-power processor of the electronic device.
- the high-power processor may then calculate a threshold or adjust the previous threshold for determining a presence of a human based on the stored sensor measurements that correspond to the time period when a human being is likely not present in the building.
- the high-power processor may then send the newly calculated or the adjusted threshold to the low-power processor.
- the low-power processor may then use the newly calculated or adjusted threshold to detect the presence of a human. Since the new threshold is calculated from the sensor measurements of the PIR sensor in that particular electronic device, it may compensate for the inherent sensitivity characteristics of that PIR sensor. As a result, the electronic device may detect the presence of a human being more effectively and efficiently.
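- A minimal Python sketch of the calibration flow described above is shown below; the specific statistic (mean plus a multiple of the standard deviation of the unoccupied-period readings) is an assumption used for illustration only.

```python
# Illustrative sketch of the described calibration flow; the statistic used
# (mean + k * standard deviation of unoccupied-period noise) is an assumption.
import statistics

def compute_threshold(baseline_samples, k=4.0):
    """Runs on the high-power processor: derive a detection threshold from
    PIR readings stored while no human is expected to be present."""
    mean = statistics.fmean(baseline_samples)
    stdev = statistics.pstdev(baseline_samples)
    return mean + k * stdev

def human_detected(sample, threshold):
    """Runs on the low-power processor: compare a live PIR reading to the threshold."""
    return sample > threshold

# Example: noisy-but-empty-room baseline followed by a live reading.
baseline = [0.11, 0.09, 0.12, 0.10, 0.13, 0.08, 0.11]
threshold = compute_threshold(baseline)
print(threshold, human_detected(0.45, threshold))
```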
- FIG. 1 illustrates an example of a general device 10 that may be disposed within a building environment.
- the device 10 may include one or more sensors 12 , a user-interface component 14 , a power supply 16 (e.g., including a power connection and/or battery), a network interface 18 , a high-power processor 20 , a low-power processor 22 , a passive infrared (PIR) sensor 24 , a light source 26 , and the like.
- the sensors 12 may detect various properties such as acceleration, temperature, humidity, water, supplied power, proximity, external motion, device motion, sound signals, ultrasound signals, light signals, fire, smoke, carbon monoxide, global-positioning-satellite (GPS) signals, radio-frequency (RF), other electromagnetic signals or fields, or the like.
- the sensors 12 may include temperature sensor(s), humidity sensor(s), hazard-related sensor(s) or other environmental sensor(s), accelerometer(s), microphone(s), optical sensors up to and including camera(s) (e.g., charged coupled-device or video cameras), active or passive radiation sensors, GPS receiver(s) or radiofrequency identification detector(s). While FIG. 1 illustrates an embodiment with a single sensor, many embodiments may include multiple sensors.
- the device 10 may include one or more primary sensors and one or more secondary sensors.
- the primary sensor(s) may sense data central to the core operation of the device (e.g., sensing a temperature in a thermostat or sensing smoke in a smoke detector), while the secondary sensor(s) may sense other types of data (e.g., motion, light or sound), which can be used for energy-efficiency objectives or smart-operation objectives.
- One or more user-interface components 14 in the device 10 may receive input from the user and/or present information to the user. The received input may be used to determine a setting.
- the user-interface components may include a mechanical or virtual component that responds to the user's motion. For example, the user can mechanically move a sliding component (e.g., along a vertical or horizontal track) or rotate a rotatable ring (e.g., along a circular track), or the user's motion along a touchpad may be detected.
- Such motions may correspond to a setting adjustment, which can be determined based on an absolute position of a user-interface component 14 or based on a displacement of a user-interface component 14 (e.g., adjusting a set point temperature by 1 degree F. for every 10° rotation of a rotatable-ring component).
- Physically and virtually movable user-interface components can allow a user to set a setting along a portion of an apparent continuum.
- the user may not be confined to choose between two discrete options (e.g., as would be the case if up and down buttons were used) but can quickly and intuitively define a setting along a range of possible setting values.
- a magnitude of a movement of a user-interface component may be associated with a magnitude of a setting adjustment, such that a user may dramatically alter a setting with a large movement or finely tune a setting with a small movement.
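- The proportional mapping described above can be illustrated with a few lines of Python; the 1 °F-per-10°-of-rotation ratio is simply the example ratio given in the text.

```python
# Minimal sketch of the proportional mapping described above, assuming the
# 1 °F-per-10°-of-rotation ratio given as an example in the text.
DEGREES_F_PER_10_DEG_ROTATION = 1.0

def adjust_setpoint(current_setpoint_f, ring_rotation_deg):
    """Large movements make large adjustments; small movements fine-tune."""
    return current_setpoint_f + (ring_rotation_deg / 10.0) * DEGREES_F_PER_10_DEG_ROTATION

print(adjust_setpoint(70.0, 35.0))   # 73.5 °F after a 35° clockwise turn
print(adjust_setpoint(70.0, -5.0))   # 69.5 °F after a small counter-clockwise nudge
```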
- the user-interface components 14 may also include one or more buttons (e.g., up and down buttons), a keypad, a number pad, a switch, a microphone, and/or a camera (e.g., to detect gestures).
- the user-interface component 14 may include a click-and-rotate annular ring component that may enable the user to interact with the component by rotating the ring (e.g., to adjust a setting) and/or by clicking the ring inwards (e.g., to select an adjusted setting or to select an option).
- the user-interface component 14 may include a camera that may detect gestures (e.g., to indicate that a power or alarm state of a device is to be changed).
- the device 10 may have one primary input component, which may be used to set a plurality of types of settings.
- the user-interface components 14 may also be configured to present information to a user via, e.g., a visual display (e.g., a thin-film-transistor display or organic light-emitting-diode display) and/or an audio speaker.
- the power-supply component 16 may include a power connection and/or a local battery.
- the power connection may connect the device 10 to a power source such as a line voltage source.
- an AC power source can be used to repeatedly charge a (e.g., rechargeable) local battery, such that the battery may be used later to supply power to the device 10 when the AC power source is not available.
- the network interface 18 may include a component that enables the device 10 to communicate with other devices. As such, the network interface 18 may enable the device 10 to communicate with other devices 10 via a wired or wireless network.
- the network interface 18 may include a wireless card or some other transceiver connection to facilitate this communication.
- the high-power processor 20 and the low-power processor 22 may support one or more of a variety of different device functionalities. As such, the high-power processor 20 and the low-power processor 22 may each include one or more processors configured and programmed to carry out and/or cause to be carried out one or more of the functionalities described herein. In one embodiment, the high-power processor 20 and the low-power processor 22 may include general-purpose processors carrying out computer code stored in local memory (e.g., flash memory, hard drive, random access memory), special-purpose processors or application-specific integrated circuits, combinations thereof, and/or using other types of hardware/firmware/software processing platforms. In certain embodiments, the high-power processor 20 may execute computationally intensive operations such as operating the user-interface component 14 and the like. The low-power processor 22 , on the other hand, may manage less complex processes such as detecting a hazard or temperature from the sensor 12 . In one embodiment, the low-power processor may wake or initialize the high-power processor for computationally intensive processes.
- the high-power processor 20 and the low-power processor 22 may detect when a location (e.g., a house or room) is occupied (i.e., includes a presence of a human), up to and including whether it is occupied by a specific person or is occupied by a specific number of people (e.g., relative to one or more thresholds). In one embodiment, this detection can occur, e.g., by analyzing microphone signals, detecting user movements (e.g., in front of a device), detecting openings and closings of doors or garage doors, detecting wireless signals, detecting an internet protocol (IP) address of a received signal, detecting operation of one or more devices within a time window, or the like.
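- As an illustration of combining several such cues within a time window, the short Python sketch below scores recent events against assumed weights; the cue names, weights, and window length are hypothetical.

```python
# Hedged sketch of combining several occupancy cues inside a time window,
# as the paragraph above describes; the scoring weights are invented.
import time

OCCUPANCY_WINDOW_S = 10 * 60
CUE_WEIGHTS = {"motion": 0.6, "microphone": 0.3, "door_opened": 0.5,
               "known_device_on_wifi": 0.8}

def occupancy_score(events, now=None):
    """events: list of (timestamp, cue_name); a score >= 1.0 is treated as occupied."""
    now = time.time() if now is None else now
    recent = [cue for ts, cue in events if now - ts <= OCCUPANCY_WINDOW_S]
    return sum(CUE_WEIGHTS.get(cue, 0.0) for cue in set(recent))

now = time.time()
events = [(now - 120, "motion"), (now - 30, "known_device_on_wifi")]
print(occupancy_score(events, now) >= 1.0)   # True: home looks occupied
```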
- the high-power processor 20 and the low-power processor 22 may include image recognition technology to identify particular occupants or objects.
- the high-power processor 20 and the low-power processor 22 may detect the presence of a human using the PIR sensor 24 .
- the PIR sensor 24 may be a passive infrared sensor that measures infrared (IR) light radiating from objects in its field of view. As such, the PIR sensor 24 may detect the infrared radiation emitted from an object.
- the high-power processor 20 may predict desirable settings and/or implement those settings. For example, based on the presence detection, the high-power processor 20 may adjust device settings to, e.g., conserve power when nobody is home or in a particular room or to accord with user preferences (e.g., general at-home preferences or user-specific preferences). As another example, based on the detection of a particular person, animal or object (e.g., a child, pet or lost object), the high-power processor 20 may initiate an audio or visual indicator of where the person, animal or object is or may initiate an alarm or security feature if an unrecognized person is detected under certain conditions (e.g., at night or when lights are off).
- devices may interact with each other such that events detected by a first device influence actions of a second device.
- a first device can detect that a user has entered into a garage (e.g., by detecting motion in the garage, detecting a change in light in the garage or detecting opening of the garage door).
- the first device can transmit this information to a second device via the network interface 18 , such that the second device can, e.g., adjust a home temperature setting, a light setting, a music setting, and/or a security-alarm setting.
- a first device can detect a user approaching a front door (e.g., by detecting motion or sudden light pattern changes).
- the first device may, e.g., cause a general audio or visual signal to be presented (e.g., such as sounding of a doorbell) or cause a location-specific audio or visual signal to be presented (e.g., to announce the visitor's presence within a room that a user is occupying).
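- The following Python sketch illustrates this kind of device-to-device interaction with a simple publish/subscribe bus; the bus abstraction and event names are illustrative stand-ins for whatever transport the network interface 18 actually uses.

```python
# Illustrative event-passing sketch: a detection on one device triggers
# actions on others over the network interface. All names are hypothetical.
class SmartHomeBus:
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, event_name, handler):
        self.subscribers.setdefault(event_name, []).append(handler)

    def publish(self, event_name, **details):
        for handler in self.subscribers.get(event_name, []):
            handler(**details)

bus = SmartHomeBus()

# Second device: reacts to the garage-entry event detected by the first device.
bus.subscribe("garage_entry", lambda **d: print("thermostat: resume home temperature"))
bus.subscribe("garage_entry", lambda **d: print("lights: turn on entry hallway"))
bus.subscribe("front_door_approach", lambda **d: print(f"announce visitor in {d['room']}"))

# First device: detects motion / light change in the garage and publishes it.
bus.publish("garage_entry")
bus.publish("front_door_approach", room="living room")
```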
- the device 10 may include a light source 26 that may illuminate when a living being, such as a human, is detected as approaching.
- the light source 26 may include any type of light source such as one or more light-emitting diodes or the like.
- the light source 26 may be communicatively coupled to the high-power processor 20 and the low-power processor 22 , which may provide a signal to cause the light source 26 to illuminate.
- FIG. 2 illustrates an example of a smart-home environment 30 within which one or more of the devices 10 of FIG. 1 , methods, systems, services, and/or computer program products described further herein can be applicable.
- the depicted smart-home environment 30 includes a structure 32 , which can include, e.g., a house, office building, garage, or mobile home.
- devices can also be integrated into a smart-home environment 30 that does not include an entire structure 32 , such as an apartment, condominium, or office space.
- the smart home environment can control and/or be coupled to devices outside of the actual structure 32 . Indeed, several devices in the smart home environment need not physically be within the structure 32 at all. For example, a device controlling a pool heater or irrigation system can be located outside of the structure 32 .
- the depicted structure 32 includes a plurality of rooms 38 , separated at least partly from each other via walls 40 .
- the walls 40 can include interior walls or exterior walls.
- Each room can further include a floor 42 and a ceiling 44 .
- Devices can be mounted on, integrated with and/or supported by a wall 40 , floor 42 or ceiling 44 .
- the smart-home environment 30 of FIG. 2 includes a plurality of devices 10 , including intelligent, multi-sensing, network-connected devices, that can integrate seamlessly with each other and/or with a central server or a cloud-computing system to provide any of a variety of useful smart-home objectives.
- the smart-home environment 30 may include one or more intelligent, multi-sensing, network-connected thermostats 46 (hereinafter referred to as “smart thermostats 46 ”), one or more intelligent, network-connected, multi-sensing hazard detection units 50 (hereinafter referred to as “smart hazard detectors 50 ”), and one or more intelligent, multi-sensing, network-connected entryway interface devices 52 (hereinafter referred to as “smart doorbells 52 ”).
- the smart thermostat 46 may include a Nest® Learning Thermostat—1st Generation T100577 or Nest® Learning Thermostat—2nd Generation T200577 by Nest Labs, Inc., among others.
- the smart thermostat 46 detects ambient climate characteristics (e.g., temperature and/or humidity) and controls a HVAC system 48 accordingly.
- the smart hazard detector 50 may detect the presence of a hazardous substance or a substance indicative of a hazardous substance (e.g., smoke, fire, or carbon monoxide).
- the smart hazard detector 50 may include a Nest® Protect that may include sensors 12 such as smoke sensors, carbon monoxide sensors, and the like. As such, the hazard detector 50 may determine when smoke, fire, or carbon monoxide may be present within the building.
- the smart doorbell 52 may detect a person's approach to or departure from a location (e.g., an outer door), control doorbell functionality, announce a person's approach or departure via audio or visual means, or control settings on a security system (e.g., to activate or deactivate the security system when occupants go and come).
- the smart doorbell 52 may interact with other devices 10 based on whether someone has approached or entered the smart-home environment 30 .
- the smart-home environment 30 further includes one or more intelligent, multi-sensing, network-connected wall switches 54 (hereinafter referred to as “smart wall switches 54 ”), along with one or more intelligent, multi-sensing, network-connected wall plug interfaces 56 (hereinafter referred to as “smart wall plugs 56 ”).
- the smart wall switches 54 may detect ambient lighting conditions, detect room-occupancy states, and control a power and/or dim state of one or more lights. In some instances, smart wall switches 54 may also control a power state or speed of a fan, such as a ceiling fan.
- the smart wall plugs 56 may detect occupancy of a room or enclosure and control supply of power to one or more wall plugs (e.g., such that power is not supplied to the plug if nobody is at home).
- the smart-home environment 30 may further include a plurality of intelligent, multi-sensing, network-connected appliances 58 (hereinafter referred to as “smart appliances 58”), such as refrigerators, stoves and/or ovens, televisions, washers, dryers, lights, stereos, intercom systems, garage-door openers, floor fans, ceiling fans, wall air conditioners, pool heaters, irrigation systems, security systems, and so forth.
- the network-connected appliances 58 are made compatible with the smart-home environment by cooperating with the respective manufacturers of the appliances.
- the appliances can be space heaters, window AC units, motorized duct vents, etc.
- an appliance When plugged in, an appliance can announce itself to the smart-home network, such as by indicating what type of appliance it is, and it can automatically integrate with the controls of the smart-home. Such communication by the appliance to the smart home can be facilitated by any wired or wireless communication protocols known by those having ordinary skill in the art.
- the smart home also can include a variety of non-communicating legacy appliances 68 , such as old conventional washer/dryers, refrigerators, and the like which can be controlled, albeit coarsely (ON/OFF), by virtue of the smart wall plugs 56 .
- the smart-home environment 30 can further include a variety of partially communicating legacy appliances 70 , such as infrared (“IR”) controlled wall air conditioners or other IR-controlled devices, which can be controlled by IR signals provided by the smart hazard detectors 50 or the smart wall switches 54 .
- the smart thermostats 46 , the smart hazard detectors 50 , the smart doorbells 52 , the smart wall switches 54 , the smart wall plugs 56 , and other devices of the smart-home environment 30 are modular and can be incorporated into older and new houses.
- the devices 10 are designed around a modular platform consisting of two basic components: a head unit and a back plate, which is also referred to as a docking station. Multiple configurations of the docking station are provided so as to be compatible with any home, such as older and newer homes. However, all of the docking stations include a standard head-connection arrangement, such that any head unit can be removably attached to any docking station.
- the docking stations are interfaces that serve as physical connections to the structure and the voltage wiring of the homes, and the interchangeable head units contain all of the sensors 12 , processors 28 , user interfaces 14 , the power supply 16 , the network interface 18 , and other functional components of the devices described above.
- the head unit can ask the user (by 2D LCD display, 2D/3D holographic projection, voice interaction, etc.) a few simple questions such as, “Where am I” and the user can indicate “living room”, “kitchen” and so forth.
- the smart-home environment 30 may also include communication with devices outside of the physical home but within a proximate geographical range of the home.
- the smart-home environment 30 may include a pool heater monitor 34 that communicates a current pool temperature to other devices within the smart-home environment 30 or receives commands for controlling the pool temperature.
- the smart-home environment 30 may include an irrigation monitor 36 that communicates information regarding irrigation systems within the smart-home environment 30 and/or receives control information for controlling such irrigation systems.
- an algorithm is provided for considering the geographic location of the smart-home environment 30, such as based on the zip code or geographic coordinates of the home. The geographic information is then used to obtain data helpful for determining optimal times for watering; such data may include sun location information, temperature, dewpoint, soil type of the land on which the home is located, etc.
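- A hedged Python sketch of such a watering rule is shown below; the thresholds, soil-retention factors, and decision logic are assumptions meant only to illustrate how location-derived data could feed the decision.

```python
# Sketch of the kind of rule the text describes for picking watering times from
# location-derived data; the thresholds and soil factors are assumptions.
SOIL_RETENTION = {"sand": 0.6, "loam": 1.0, "clay": 1.3}  # relative water retention

def should_water(hour_of_day, temperature_f, dewpoint_f, hours_since_rain, soil_type):
    """Favor cool, low-evaporation hours and skip soil still holding recent rain."""
    retention = SOIL_RETENTION.get(soil_type, 1.0)
    if hours_since_rain < 24 * retention:
        return False                                          # soil still holding recent rain
    low_evaporation = hour_of_day < 8 or hour_of_day >= 20    # avoid midday sun
    dry_air = (temperature_f - dewpoint_f) > 15               # large spread = dry air
    return low_evaporation and (dry_air or temperature_f > 85)

print(should_water(hour_of_day=6, temperature_f=78, dewpoint_f=55,
                   hours_since_rain=48, soil_type="loam"))    # True
```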
- one or more of the smart-home devices of FIG. 2 can further allow a user to interact with the device even if the user is not proximate to the device.
- a user can communicate with a device using a computer (e.g., a desktop computer, laptop computer, or tablet) or other portable electronic device (e.g., a smartphone) 66 .
- a web page or app can be configured to receive communications from the user and control the device based on the communications and/or to present information about the device's operation to the user.
- the user can view a current setpoint temperature for a device and adjust it using a computer.
- the user can be in the structure during this remote communication or outside the structure.
- users can control the smart thermostat and other smart devices in the smart-home environment 30 using a network-connected computer or portable electronic device 66 .
- in some examples, some or all of the occupants (e.g., individuals who live in the home) can register their devices 66 with the smart-home environment 30.
- Such registration can be made at a central server to authenticate the occupant and/or the device as being associated with the home and to give permission to the occupant to use the device to control the smart devices in the home.
- An occupant can use their registered device 66 to remotely control the smart devices of the home, such as when the occupant is at work or on vacation.
- the occupant may also use their registered device to control the smart devices when the occupant is actually located inside the home, such as when the occupant is sitting on a couch inside the home. It should be appreciated that instead of or in addition to registering devices 66 , the smart-home environment 30 makes inferences about which individuals live in the home and are therefore occupants and which devices 66 are associated with those individuals. As such, the smart-home environment “learns” who is an occupant and permits the devices 66 associated with those individuals to control the smart devices of the home.
- the smart-home environment may receive communication from an unregistered mobile device of an individual inside of the home, where said individual is not recognized as an occupant of the home. Further, for example, a smart-home environment may receive communication from a mobile device of an individual who is known to be or who is registered as a guest.
- a guest-layer of controls can be provided to guests of the smart-home environment 30 .
- the guest-layer of controls gives guests access to basic controls (e.g., a judiciously selected subset of features of the smart devices), such as temperature adjustments, but it locks out other functionalities.
- the guest layer of controls can be thought of as a “safe sandbox” in which guests have limited controls, but they do not have access to more advanced controls that could fundamentally alter, undermine, damage, or otherwise impair the occupant-desired operation of the smart devices. For example, the guest layer of controls will not permit the guest to adjust the heat-pump lockout temperature.
- a use case example of this is when a guest is in a smart home, the guest could walk up to the thermostat and turn the dial manually, but the guest may not want to walk around the house “hunting” the thermostat, especially at night while the home is dark and others are sleeping. Further, the guest may not want to go through the hassle of downloading the necessary application to their device for remotely controlling the thermostat. In fact, the guest may not have the home owner's login credentials, etc., and therefore cannot remotely control the thermostat via such an application. Accordingly, according to embodiments of the invention, the guest can open a mobile browser on their mobile device, type a keyword, such as “NEST” into the URL field and tap “Go” or “Search”, etc.
- the device presents the guest with a user interface which allows the guest to move the target temperature between a limited range, such as 65 and 80 degrees Fahrenheit.
- a user interface provides a guest layer of controls that are limited to basic functions. The guest cannot change the target humidity, modes, or view energy history.
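- The guest layer can be pictured as a permission filter in front of the device's settings, as in the Python sketch below; the allowed fields and the 65-80 °F clamp mirror the example above, while everything else is illustrative.

```python
# Hedged sketch of a guest-layer filter: guests may nudge temperature inside a
# limited band but other controls are locked out. Limits mirror the 65-80 °F
# example above; the field names and structure are illustrative.
GUEST_ALLOWED_FIELDS = {"target_temperature_f"}
GUEST_TEMP_RANGE_F = (65.0, 80.0)

def apply_control(role, field, value, device_state):
    if role == "guest":
        if field not in GUEST_ALLOWED_FIELDS:
            raise PermissionError(f"guests cannot change '{field}'")
        low, high = GUEST_TEMP_RANGE_F
        value = max(low, min(high, value))   # clamp into the guest-safe band
    device_state[field] = value
    return device_state

state = {"target_temperature_f": 70.0, "heat_pump_lockout_f": 35.0}
print(apply_control("guest", "target_temperature_f", 90.0, state))  # clamped to 80.0
try:
    apply_control("guest", "heat_pump_lockout_f", 20.0, state)
except PermissionError as e:
    print(e)
```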
- a local webserver is provided that is accessible in the local area network (LAN). It does not require a password, because physical presence inside the home is established reliably enough by the guest's presence on the LAN.
- the home owner is asked if they want to enable a Local Web App (LWA) on the smart device.
- Business owners will likely say no; home owners will likely say yes.
- the smart device broadcasts to the LAN that the above referenced keyword, such as “NEST”, is now a host alias for its local web server.
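- By way of a minimal illustrative sketch only, a password-less guest web interface of this kind could be served from the smart device roughly as follows; the port, route, and clamping logic are assumptions, and advertising the keyword host alias (e.g., via mDNS) is not shown:

```python
# Hypothetical sketch of a password-less "guest layer" local web app served on the LAN.
# The 65-80 degrees Fahrenheit clamp follows the description above; the port, route,
# and in-memory state are illustrative assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

GUEST_MIN_F, GUEST_MAX_F = 65, 80          # limited range exposed to guests
state = {"target_f": 72}                   # stand-in for the thermostat's real state

class GuestHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        url = urlparse(self.path)
        if url.path == "/set":
            requested = float(parse_qs(url.query).get("target_f", ["72"])[0])
            # Clamp to the guest-safe range; advanced settings are simply not exposed.
            state["target_f"] = max(GUEST_MIN_F, min(GUEST_MAX_F, requested))
        body = f"<html><body>Target: {state['target_f']} F (guest controls)</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    # Physical presence on the LAN stands in for authentication, so no password is required.
    # Advertising the keyword (e.g., "NEST") as a host alias would be handled separately.
    HTTPServer(("0.0.0.0", 8080), GuestHandler).serve_forever()
```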
- a guest layer of controls may also be provided to users by means other than a device 66 .
- the smart device such as the smart thermostat, may be equipped with walkup-identification technology (e.g., face recognition, RFID, ultrasonic sensors) that “fingerprints” or creates a “signature” for the occupants of the home.
- the walkup-identification technology can be the same as or similar to the fingerprinting and signature creating techniques described in other sections of this application.
- upon recognizing that the individual at the device is a guest rather than a registered occupant, the smart device provides the guest with the guest layer of controls, rather than full controls.
- the smart thermostat 46 and other smart devices “learn” by observing occupant behavior. For example, the smart thermostat learns occupants' preferred temperature set-points for mornings and evenings, and it learns when the occupants are asleep or awake, as well as when the occupants are typically away or at home, for example. According to embodiments, when a guest controls the smart devices, such as the smart thermostat, the smart devices do not “learn” from the guest. This prevents the guest's adjustments and controls from affecting the learned preferences of the occupants.
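- A minimal sketch of how guest adjustments could be applied without being fed into the learning history is shown below; the class names and data structures are illustrative assumptions rather than the actual implementation:

```python
# Hypothetical sketch: setpoint changes are applied immediately, but only changes made by
# recognized occupants are added to the thermostat's schedule-learning history.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class LearningThermostat:
    target_f: float = 70.0
    history: List[Tuple[str, float]] = field(default_factory=list)  # training data

    def set_target(self, user_class: str, target_f: float) -> None:
        self.target_f = target_f                      # guests can still adjust temperature
        if user_class == "occupant":                  # but only occupants influence learning
            self.history.append((user_class, target_f))

t = LearningThermostat()
t.set_target("occupant", 68.0)
t.set_target("guest", 75.0)
print(t.target_f, t.history)   # 75.0 is applied, but only the occupant's change is learned
```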
- a smart television remote control recognizes occupants by thumbprint, visual identification, RFID, etc., and it recognizes a user as a guest or as someone belonging to a particular class having limited control and access (e.g., child).
- Upon recognizing the user as a guest or someone belonging to a limited class, the smart remote control only permits that user to view a subset of channels and to make limited adjustments to the settings of the television and other devices. For example, a guest cannot adjust the digital video recorder (DVR) settings, and a child is limited to viewing child-appropriate programming.
- sinks, bathtubs, and showers can be controlled by smart spigots that recognize users as guests or as children and therefore prevent water from exceeding a designated temperature that is considered safe.
- each of the devices 34 , 36 , 46 , 50 , 52 , 54 , 56 , and 58 is capable of data communications and information sharing with any other of the smart devices, as well as to any central server or cloud-computing system or any other device that is network-connected anywhere in the world.
- the required data communications can be carried out using any of a variety of custom or standard wireless protocols (Wi-Fi, ZigBee, 6LoWPAN, etc.) and/or any of a variety of custom or standard wired protocols (CAT6 Ethernet, HomePlug, etc.).
- all or some of the smart devices can serve as wireless or wired repeaters.
- a first one of the smart devices can communicate with a second one of the smart devices via a wireless router 60 .
- the smart devices can further communicate with each other via a connection to a network, such as the Internet 62 .
- the smart devices can communicate with a central server or a cloud-computing system 64 .
- the central server or cloud-computing system 64 can be associated with a manufacturer, support entity, or service provider associated with the device.
- a user may be able to contact customer support using a device itself rather than needing to use other communication means such as a telephone or Internet-connected computer.
- software updates can be automatically sent from the central server or cloud-computing system 64 to devices (e.g., when available, when purchased, or at routine intervals).
- the smart devices combine to create a mesh network of spokesman and low-power nodes in the smart-home environment 30 , where some of the smart devices are “spokesman” nodes and others are “low-powered” nodes. Some of the smart devices in the smart-home environment 30 are battery powered, while others have a regular and reliable power source, such as by connecting to wiring (e.g., to 120V line voltage wires) behind the walls 40 of the smart-home environment.
- the smart devices that have a regular and reliable power source are referred to as “spokesman” nodes.
- nodes are equipped with the capability of using any wireless protocol or manner to facilitate bidirectional communication with any of a variety of other devices in the smart-home environment 30 as well as with the central server or cloud-computing system 64 .
- the devices that are battery powered are referred to as “low-power” nodes.
- These nodes tend to be smaller than spokesman nodes and can only communicate using wireless protocols that require very little power, such as ZigBee, 6LoWPAN, etc. Further, some, but not all, low-power nodes are incapable of bidirectional communication. These low-power nodes send messages, but they are unable to “listen”. Thus, other devices in the smart-home environment 30 , such as the spokesman nodes, cannot send information to these low-power nodes.
- the smart devices serve as low-power and spokesman nodes to create a mesh network in the smart-home environment 30 .
- Individual low-power nodes in the smart-home environment regularly send out messages regarding what they are sensing, and the other low-powered nodes in the smart-home environment—in addition to sending out their own messages—repeat the messages, thereby causing the messages to travel from node to node (i.e., device to device) throughout the smart-home environment 30 .
- the spokesman nodes in the smart-home environment 30 are able to “drop down” to low-powered communication protocols to receive these messages, translate the messages to other communication protocols, and send the translated messages to other spokesman nodes and/or the central server or cloud-computing system 64 .
- the low-powered nodes using low-power communication protocols are able to send messages across the entire smart-home environment 30 as well as over the Internet 62 to the central server or cloud-computing system 64 .
- the mesh network enables the central server or cloud-computing system 64 to regularly receive data from all of the smart devices in the home, make inferences based on the data, and send commands back to one of the smart devices to accomplish some of the smart-home objectives described herein.
- the spokesman nodes and some of the low-powered nodes are capable of “listening”. Accordingly, users, other devices, and the central server or cloud-computing system 64 can communicate controls to the low-powered nodes.
- a user can use the portable electronic device (e.g., a smartphone) 66 to send commands over the Internet 62 to the central server or cloud-computing system 64 , which then relays the commands to the spokesman nodes in the smart-home environment 30 .
- the spokesman nodes drop down to a low-power protocol to communicate the commands to the low-power nodes throughout the smart-home environment, as well as to other spokesman nodes that did not receive the commands directly from the central server or cloud-computing system 64 .
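- The following is a simplified, assumed sketch of a spokesman node relaying a command received from the central server down to a low-power protocol; the message format and the send_zigbee() stand-in are illustrative only:

```python
# Hypothetical relay sketch: a spokesman node receives a command from the central server
# over IP and "drops down" to a low-power protocol to forward it across the mesh.
# send_zigbee() and the JSON message format are illustrative placeholders.
import json

def send_zigbee(node_id: str, payload: bytes) -> None:
    print(f"[zigbee] -> {node_id}: {payload!r}")      # stand-in for a real radio driver

def on_command_from_central_server(raw: str, low_power_nodes: list[str]) -> None:
    command = json.loads(raw)                          # e.g. {"device": "nightlight-2", "on": true}
    payload = json.dumps(command).encode()
    for node_id in low_power_nodes:                    # forward to nodes that can "listen"
        send_zigbee(node_id, payload)

on_command_from_central_server('{"device": "nightlight-2", "on": true}',
                               ["nightlight-2", "hazard-1"])
```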
- An example of a low-power node is a smart night light 65 .
- the smart night light 65 houses an occupancy sensor, such as an ultrasonic or passive IR sensor, and an ambient light sensor, such as a photoresistor or a single-pixel sensor that measures light in the room.
- the smart night light 65 is configured to activate the light source when its ambient light sensor detects that the room is dark and when its occupancy sensor detects that someone is in the room. In other embodiments, the smart night light 65 is simply configured to activate the light source when its ambient light sensor detects that the room is dark.
- the smart night light 65 includes a low-power wireless communication chip (e.g., ZigBee chip) that regularly sends out messages regarding the occupancy of the room and the amount of light in the room, including instantaneous messages coincident with the occupancy sensor detecting the presence of a person in the room.
- these messages may be sent wirelessly, using the mesh network, from node to node (i.e., smart device to smart device) within the smart-home environment 30 as well as over the Internet 62 to the central server or cloud-computing system 64 .
- low-powered nodes include battery-operated versions of the smart hazard detectors 50 .
- These smart hazard detectors 50 are often located in an area without access to constant and reliable power and, as discussed in detail below, may include any number and type of sensors, such as smoke/fire/heat sensors, carbon monoxide/dioxide sensors, occupancy/motion sensors, ambient light sensors, temperature sensors, humidity sensors, and the like.
- smart hazard detectors 50 can send messages that correspond to each of the respective sensors to the other devices and the central server or cloud-computing system 64 , such as by using the mesh network as described above.
- Examples of spokesman nodes include smart thermostats 46 , smart doorbells 52 , smart wall switches 54 , and smart wall plugs 56 . These devices 46 , 52 , 54 , and 56 are often located near and connected to a reliable power source, and therefore can include more power-consuming components, such as one or more communication chips capable of bidirectional communication in any variety of protocols.
- these low-powered and spokesman nodes can function as “tripwires” for an alarm system in the smart-home environment. For example, in the event a perpetrator circumvents detection by alarm sensors located at windows, doors, and other entry points of the smart-home environment 30 , the alarm could be triggered upon receiving an occupancy, motion, heat, sound, etc. message from one or more of the low-powered and spokesman nodes in the mesh network.
- the central server or cloud-computing system 64 or some other device could trigger an alarm, provided the alarm is armed at the time of detection.
- the alarm system could be enhanced by various low-powered and spokesman nodes located throughout the smart-home environment 30 .
- a user could enhance the security of the smart-home environment 30 by buying and installing extra smart nightlights 65 .
- in some scenarios, such as during a network communication jamming attack, the devices 10 may be incapable of communicating with each other. Therefore, as discussed in detail below, the present techniques provide jamming attack detection and notification solutions to such a problem.
- the mesh network can be used to automatically turn on and off lights as a person transitions from room to room.
- the low-powered and spokesman nodes detect the person's movement through the smart-home environment and communicate corresponding messages through the mesh network.
- the central server or cloud-computing system 64 or some other device activates and deactivates the smart wall switches 54 to automatically provide light as the person moves from room to room in the smart-home environment 30 .
- users may provide pre-configuration information that indicates which smart wall plugs 56 provide power to lamps and other light sources, such as the smart night light 65 .
- this mapping of light sources to wall plugs 56 can be done automatically (e.g., each smart wall plug 56 detects when a light source is plugged into it and sends a corresponding message to the central server or cloud-computing system 64 ). Using this mapping information in combination with messages that indicate which rooms are occupied, the central server or cloud-computing system 64 or some other device activates and deactivates the smart wall plugs 56 that provide power to lamps and other light sources so as to track the person's movement and provide light as the person moves from room to room.
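- A simplified sketch of this follow-me lighting behavior is given below; the plug-to-room mapping, device names, and actuation path are illustrative assumptions:

```python
# Hypothetical sketch of "follow-me" lighting: a mapping of smart wall plugs to rooms,
# combined with occupancy messages, drives activation/deactivation of lamp power.
plug_to_room = {"plug-kitchen": "kitchen", "plug-hall": "hall", "plug-bedroom": "bedroom"}

def switch_plug(plug_id: str, on: bool) -> None:
    print(f"{plug_id} -> {'ON' if on else 'OFF'}")     # stand-in for the real actuation path

def on_occupancy_message(occupied_rooms: set[str]) -> None:
    for plug_id, room in plug_to_room.items():
        switch_plug(plug_id, room in occupied_rooms)

on_occupancy_message({"kitchen"})                       # person enters the kitchen
on_occupancy_message({"hall"})                          # ...and moves to the hall
```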
- the mesh network of low-powered and spokesman nodes can be used to provide exit lighting in the event of an emergency.
- users provide pre-configuration information that indicates exit routes in the smart-home environment 30 . For example, for each room in the house, the user provides a map of the best exit route.
- the central server or cloud-computing system 64 or some other device could automatically determine the routes using uploaded maps, diagrams, architectural drawings of the smart-home house, as well as using a map generated based on positional information obtained from the nodes of the mesh network (e.g., positional information from the devices is used to construct a map of the house).
- the central server or cloud-computing system 64 or some other device uses occupancy information obtained from the low-powered and spokesman nodes to determine which rooms are occupied and then turns on lights (e.g., nightlights 65 , wall switches 54 , wall plugs 56 that power lamps, etc.) along the exit routes from the occupied rooms so as to provide emergency exit lighting.
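- A minimal sketch of the exit-lighting logic is shown below; the route data and device names are illustrative assumptions:

```python
# Hypothetical sketch of emergency exit lighting: for each occupied room, the lights
# along its pre-configured exit route are turned on.
exit_routes = {
    "bedroom": ["light-bedroom", "light-hall", "light-entry"],
    "kitchen": ["light-kitchen", "light-entry"],
}

def turn_on(light_id: str) -> None:
    print(f"{light_id}: ON")

def on_emergency(occupied_rooms: set[str]) -> None:
    lights = set()
    for room in occupied_rooms:
        lights.update(exit_routes.get(room, []))        # union of all relevant routes
    for light_id in sorted(lights):
        turn_on(light_id)

on_emergency({"bedroom", "kitchen"})
```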
- service robots 69 each configured to carry out, in an autonomous manner, any of a variety of household tasks.
- the service robots 69 can be respectively configured to perform floor sweeping, floor washing, etc. in a manner similar to that of known commercially available devices such as the ROOMBATM and SCOOBATM products sold by iRobot, Inc. of Bedford, Mass.
- Tasks such as floor sweeping and floor washing can be considered as “away” or “while-away” tasks for purposes of the instant description, as it is generally more desirable for these tasks to be performed when the occupants are not present.
- one or more of the service robots 69 are configured to perform tasks such as playing music for an occupant, serving as a localized thermostat for an occupant, serving as a localized air monitor/purifier for an occupant, serving as a localized baby monitor, serving as a localized hazard detector for an occupant, and so forth, it being generally more desirable for such tasks to be carried out in the immediate presence of the human occupant.
- tasks can be considered as “human-facing” or “human-centric” tasks.
- a particular one of the service robots 69 can be considered to be facilitating what can be called a “personal comfort-area network” for the occupant, with the objective being to keep the occupant's immediate space at a comfortable temperature wherever that occupant may be located in the home.
- the localized-thermostat service robot 69 is configured to move itself into the immediate presence (e.g., within five feet) of a particular occupant who has settled into a particular location in the home (e.g. in the dining room to eat their breakfast and read the news).
- the localized-thermostat service robot 69 includes a temperature sensor, a processor, and wireless communication components configured such that control communications with the HVAC system, either directly or through a wall-mounted wirelessly communicating thermostat coupled to the HVAC system, are maintained and such that the temperature in the immediate vicinity of the occupant is maintained at their desired level. If the occupant then moves and settles into another location (e.g. to the living room couch to watch television), the localized-thermostat service robot 69 proceeds to move and park itself next to the couch and keep that particular immediate space at a comfortable temperature.
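- A highly simplified sketch of the control decision the localized-thermostat service robot 69 might make each cycle is shown below; the setpoint, deadband, and interfaces are assumptions:

```python
# Hypothetical control-loop sketch: the robot parks near the occupant, reads its on-board
# temperature sensor, and decides what to request of the HVAC system so the occupant's
# immediate space stays at the desired level. Values and interfaces are assumed.
DESIRED_F = 72.0
DEADBAND_F = 1.0

def comfort_step(local_temp_f: float) -> str:
    """Decide what to ask of the HVAC system given the temperature near the occupant."""
    if local_temp_f < DESIRED_F - DEADBAND_F:
        return "heat"
    if local_temp_f > DESIRED_F + DEADBAND_F:
        return "cool"
    return "idle"

# The robot would call this each cycle and forward the result to the HVAC system,
# directly or through the wall-mounted thermostat.
assert comfort_step(69.0) == "heat"
assert comfort_step(75.0) == "cool"
assert comfort_step(72.3) == "idle"
```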
- the localized-thermostat service robot 69 can identify and locate the occupant whose personal-area space is to be kept at a comfortable temperature, for example by RFID sensing (e.g., the person having an RFID bracelet, RFID necklace, or RFID key fob).
- When serving as a localized air monitor/purifier for an occupant, a particular service robot 69 can be considered to be facilitating what can be called a “personal health-area network” for the occupant, with the objective being to keep the air quality in the occupant's immediate space at healthy levels.
- other health-related functions can be provided, such as monitoring the temperature or heart rate of the occupant (e.g., using remote sensors, near-field communication with on-person monitors, etc.).
- When serving as a localized hazard detector for an occupant, a particular service robot 69 can be considered to be facilitating what can be called a “personal safety-area network” for the occupant, with the objective being to ensure there is no excessive carbon monoxide, smoke, fire, etc., in the immediate space of the occupant.
- Methods analogous to those described above for personal comfort-area networks in terms of occupant identifying and tracking are likewise applicable for personal health-area network and personal safety-area network embodiments.
- the above-referenced facilitation of personal comfort-area networks, personal health-area networks, personal safety-area networks, and/or other such human-facing functionalities of the service robots 69 are further enhanced by logical integration with other smart sensors in the home according to rules-based inferencing techniques or artificial intelligence techniques for achieving better performance of those human-facing functionalities and/or for achieving those goals in energy-conserving or other resource-conserving ways.
- the air monitor/purifier service robot 69 can be configured to detect whether a household pet is moving toward the currently settled location of the occupant (e.g., using on-board sensors and/or by data communications with other smart-home sensors along with rules-based inferencing/artificial intelligence techniques), and if so, the air purifying rate is immediately increased in preparation for the arrival of more airborne pet dander.
- the hazard detector service robot 69 can be advised by other smart-home sensors that the temperature and humidity levels are rising in the kitchen, which is nearby to the occupant's current dining room location, and responsive to this advisory the hazard detector service robot 69 will temporarily raise a hazard detection threshold, such as a smoke detection threshold, under an inference that any small increases in ambient smoke levels will most likely be due to cooking activity and not due to a genuinely hazardous condition.
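- The threshold-raising inference described above can be sketched as follows; the numeric thresholds are illustrative assumptions only:

```python
# Hypothetical sketch: the hazard detector service robot raises its smoke threshold while
# other smart-home sensors advise that cooking-like conditions exist nearby, then restores it.
BASE_SMOKE_THRESHOLD = 0.10       # illustrative obscuration level
COOKING_SMOKE_THRESHOLD = 0.25

def smoke_threshold(kitchen_temp_rising: bool, kitchen_humidity_rising: bool) -> float:
    # Infer that small smoke increases are likely due to cooking, not a genuine hazard.
    if kitchen_temp_rising and kitchen_humidity_rising:
        return COOKING_SMOKE_THRESHOLD
    return BASE_SMOKE_THRESHOLD

def is_hazard(smoke_level: float, threshold: float) -> bool:
    return smoke_level >= threshold

t = smoke_threshold(kitchen_temp_rising=True, kitchen_humidity_rising=True)
print(is_hazard(0.15, t))   # False while the cooking advisory is in effect
```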
- each service robot 69 includes wireless communication components that facilitate data communications with one or more of the other wirelessly communicating smart-home sensors of FIG. 2 and/or with one or more other service robots 69 (e.g., using Wi-Fi, Zigbee, Z-Wave, 6LoWPAN, etc.), and one or more of the smart-home devices 10 can be in communication with a remote server over the Internet.
- each service robot 69 can be configured to communicate directly with a remote server by virtue of cellular telephone communications, satellite communications, 3G/4G network data communications, or other direct communication method.
- Also provided are systems and methods relating to the integration of the service robot(s) 69 with home security sensors and related functionalities of the smart home system.
- the embodiments are particularly applicable and advantageous when applied for those service robots 69 that perform “away” functionalities or that otherwise are desirable to be active when the home is unoccupied (hereinafter “away-service robots”).
- Included in the embodiments are methods and systems for ensuring that home security systems, intrusion detection systems, and/or occupancy-sensitive environmental control systems (for example, occupancy-sensitive automated setback thermostats that enter into a lower-energy-using condition when the home is unoccupied) are not erroneously triggered by the away-service robots.
- Provided according to some embodiments is a home automation and security system (e.g., as shown in FIG. 2 ) that is remotely monitored by a monitoring service by virtue of automated systems (e.g., cloud-based servers or other central servers, hereinafter “central server”) that are in data communications with one or more network-connected elements of the home automation and security system.
- the away-service robots are configured to be in operative data communication with the central server, and are configured such that they remain in a non-away-service state (e.g., a dormant state at their docking station) unless permission is granted from the central server (e.g., by virtue of an “away-service-OK” message from the central server) to commence their away-service activities.
- An away-state determination made by the system, which can be arrived at (i) exclusively by local on-premises smart device(s) based on occupancy sensor data, (ii) exclusively by the central server based on received occupancy sensor data and/or based on received proximity-related information such as GPS coordinates from user smartphones or automobiles, or (iii) by any combination of (i) and (ii), can then trigger the granting of away-service permission to the away-service robots by the central server.
- the central server can readily filter signals from the occupancy sensing devices to distinguish between the away-service robot activity versus any unexpected intrusion activity, thereby avoiding a false intrusion alarm condition while also ensuring that the home is secure.
- the central server may provide filtering data (such as an expected occupancy-sensing profile triggered by the away-service robots) to the occupancy sensing nodes or associated processing nodes of the smart home, such that the filtering is performed at the local level.
- the central server may temporarily disable the occupancy sensing equipment for the duration of the away-service robot activity.
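- A minimal sketch of the permission-and-filtering flow described above is given below; the identifiers and the room-based filtering rule are illustrative assumptions:

```python
# Hypothetical sketch of the away-service permission flow: the central server grants an
# "away-service-OK" only when the home is in the away state, and while a robot is active,
# occupancy events matching the robot's expected activity are filtered out instead of
# raising an intrusion alarm. All names are illustrative.
active_away_service_robots: set[str] = set()

def grant_away_service(robot_id: str, home_is_away: bool) -> str:
    if home_is_away:
        active_away_service_robots.add(robot_id)
        return "away-service-OK"
    return "away-service-denied"

def handle_occupancy_event(source_room: str, robot_rooms: set[str]) -> str:
    # If a robot is active and the event comes from a room it is expected to occupy,
    # treat it as robot activity rather than an unexpected intrusion.
    if active_away_service_robots and source_room in robot_rooms:
        return "filtered: away-service robot activity"
    return "alarm: unexpected intrusion"

print(grant_away_service("vacuum-1", home_is_away=True))
print(handle_occupancy_event("living room", robot_rooms={"living room"}))
print(handle_occupancy_event("bedroom", robot_rooms={"living room"}))
```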
- functionality similar to that of the central server in the above example can be performed by an on-site computing device such as a dedicated server computer, a “master” home automation console or panel, or as an adjunct function of one or more of the smart-home devices of FIG. 2 .
- the home security systems and/or occupancy-sensitive environmental controls that would be triggered by the motion, noise, vibrations, or other disturbances of the away-service robot activity are referenced simply as “activity sensing systems,” and when so triggered will yield a “disturbance-detected” outcome representative of the false trigger (for example, an alarm message to a security service, or an “arrival” determination for an automated setback thermostat that causes the home to be heated or cooled to a more comfortable “occupied” setpoint temperature).
- the away-service robots are configured to emit a standard ultrasonic sound throughout the course of their away-service activity, the activity sensing systems are configured to detect that standard ultrasonic sound, and the activity sensing systems are further configured such that no disturbance-detected outcome will occur for as long as that standard ultrasonic sound is detected.
- the away-service robots are configured to emit a standard notification signal throughout the course of their away-service activity, the activity sensing systems are configured to detect that standard notification signal, and the activity sensing systems are further configured such that no disturbance-detected outcome will occur for as long as that standard notification signal is detected.
- the standard notification signal comprises one or more of: an optical notifying signal; an audible notifying signal; an infrared notifying signal; an infrasonic notifying signal; a wirelessly transmitted data notification signal (e.g., an IP broadcast, multicast, or unicast notification signal, or a notification message sent in a TCP/IP two-way communication session).
- the notification signals sent by the away-service robots to the activity sensing systems are authenticated and encrypted such that the notifications cannot be learned and replicated by a potential burglar.
- Any of a variety of known encryption/authentication schemes can be used to ensure such data security including, but not limited to, methods involving third party data security services or certificate authorities.
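- As one of many possible schemes consistent with the above, a notification signal could be authenticated with a shared-key HMAC and a freshness check, sketched below; the key handling and message fields are assumptions:

```python
# Hypothetical sketch of authenticating an away-service robot's notification signal with a
# shared-key HMAC plus a timestamp to resist replay. This is one possible scheme among many;
# the description above leaves the exact method open (e.g., certificate authorities).
import hmac, hashlib, json, time

SHARED_KEY = b"provisioned-out-of-band"   # illustrative; real keys would be provisioned securely

def make_notification(robot_id: str) -> dict:
    msg = {"robot": robot_id, "event": "away-service-active", "ts": int(time.time())}
    mac = hmac.new(SHARED_KEY, json.dumps(msg, sort_keys=True).encode(), hashlib.sha256)
    return {"msg": msg, "mac": mac.hexdigest()}

def verify_notification(note: dict, max_age_s: int = 30) -> bool:
    expected = hmac.new(SHARED_KEY, json.dumps(note["msg"], sort_keys=True).encode(),
                        hashlib.sha256).hexdigest()
    fresh = abs(time.time() - note["msg"]["ts"]) <= max_age_s
    return hmac.compare_digest(expected, note["mac"]) and fresh

note = make_notification("vacuum-1")
print(verify_notification(note))   # True: the activity sensing system suppresses the disturbance
```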
- a permission request-response model can be used, wherein any particular away-service robot requests permission from each activity sensing system in the home when it is ready to perform its away-service tasks, and does not initiate such activity until receiving a “yes” or “permission granted” message from each activity sensing system (or from a single activity sensing system serving as a “spokesman” for all of the activity sensing systems).
- One advantage of the described embodiments that do not require a central event orchestrator is that there can (optionally) be more of an arms-length relationship between the supplier(s) of the home security/environmental control equipment, on the one hand, and the supplier(s) of the away-service robot(s), on the other hand, as it is only required that there is the described standard one-way notification protocol or the described standard two-way request/permission protocol to be agreed upon by the respective suppliers.
- the activity sensing systems are configured to detect sounds, vibrations, RF emissions, or other detectable environmental signals or “signatures” that are intrinsically associated with the away-service activity of each away-service robot, and are further configured such that no disturbance-detected outcome will occur for as long as that particular detectable signal or environmental “signature” is detected.
- a particular kind of vacuum-cleaning away-service robot may emit a specific sound or RF signature.
- the away-service environmental signatures for each of a plurality of known away-service robots are stored in the memory of the activity sensing systems based on empirically collected data, the environmental signatures being supplied with the activity sensing systems and periodically updated by a remote update server.
- the activity sensing systems can be placed into a “training mode” for the particular home in which they are installed, wherein they “listen” and “learn” the particular environmental signatures of the away-service robots for that home during that training session, and thereafter will suppress disturbance-detected outcomes for intervals in which those environmental signatures are heard.
- the activity sensing system is configured to automatically learn the environmental signatures for the away-service robots by virtue of automatically performing correlations over time between detected environmental signatures and detected occupancy activity.
- an intelligent automated nonoccupancy-triggered setback thermostat such as the Nest Learning Thermostat can be configured to constantly monitor for audible and RF activity as well as to perform infrared-based occupancy detection.
- the environmental signature of the away-service robot will remain relatively constant from event to event, and in view of the fact that the away-service events will likely either (a) themselves be triggered by some sort of nonoccupancy condition as measured by the away-service robots themselves, or (b) occur at regular times of day, there will be patterns in the collected data by which the events themselves will become apparent and for which the environmental signatures can be readily learned.
- because the environmental signatures of the away-service robots are automatically learned without requiring user interaction, it is preferable that a certain number of false triggers be tolerable over the course of the learning process.
- this automatic-learning embodiment is more preferable for application in occupancy-sensitive environmental control equipment (such as an automated setback thermostat) than in home security systems, for the reason that a few false occupancy determinations may cause a few instances of unnecessary heating or cooling, but will not otherwise have any serious consequences, whereas false home security alarms may have more serious consequences.
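- A toy sketch of the automatic-learning idea is shown below, in which signatures that recur while the home is measured as unoccupied (and not otherwise) are learned as away-service robot signatures; the counting rule and thresholds are illustrative assumptions:

```python
# Hypothetical sketch: signatures repeatedly observed during unoccupied intervals, and never
# during occupied intervals, are learned as away-service robot signatures to be suppressed.
from collections import Counter

unoccupied_hits: Counter = Counter()
occupied_hits: Counter = Counter()

def observe(signature: str, home_unoccupied: bool) -> None:
    (unoccupied_hits if home_unoccupied else occupied_hits)[signature] += 1

def learned_robot_signatures(min_events: int = 5) -> set[str]:
    return {sig for sig, n in unoccupied_hits.items()
            if n >= min_events and occupied_hits[sig] == 0}

for _ in range(6):
    observe("vacuum-acoustic-profile", home_unoccupied=True)   # regular daily cleaning runs
observe("footsteps", home_unoccupied=False)

print(learned_robot_signatures())   # {'vacuum-acoustic-profile'}
```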
- technologies including the sensors of the smart devices located in the mesh network of the smart-home environment in combination with rules-based inference engines or artificial intelligence provided at the central server or cloud-computing system 64 are used to provide a personal “smart alarm clock” for individual occupants of the home.
- user-occupants can communicate with the central server or cloud-computing system 64 via their mobile devices 66 to access an interface for the smart alarm clock.
- occupants can turn on their “smart alarm clock” and input a wake time for the next day and/or for additional days.
- the occupant may have the option of setting a specific wake time for each day of the week, as well as the option of setting some or all of the inputted wake times to “repeat”.
- Artificial intelligence will be used to consider the occupant's response to these alarms when they go off and make inferences about the user's preferred sleep patterns over time.
- the smart device in the smart-home environment 30 that happens to be closest to the occupant when the occupant falls asleep will be the device that transmits messages regarding when the occupant stopped moving, from which the central server or cloud-computing system 64 will make inferences about where and when the occupant prefers to sleep.
- This closest smart device will also be the device that sounds the alarm to wake the occupant.
- the “smart alarm clock” will follow the occupant throughout the house, by tracking the individual occupants based on their “unique signature”, which is determined based on data obtained from sensors located in the smart devices.
- the sensors include ultrasonic sensors, passive IR sensors, and the like.
- the unique signature is based on a combination of walking gait, patterns of movement, voice, height, size, etc. It should be appreciated that facial recognition may also be used.
- the wake times associated with the “smart alarm clock” are used by the smart thermostat 46 to control the HVAC in an efficient manner so as to pre-heat or cool the house to the occupant's desired “sleeping” and “awake” temperature settings.
- the preferred settings can be learned over time, such as by observing which temperature the occupant sets the thermostat to before going to sleep and which temperature the occupant sets the thermostat to upon waking up.
- a device is positioned proximate to the occupant's bed, such as on an adjacent nightstand, and collects data as the occupant sleeps using noise sensors, motion sensors (e.g., ultrasonic, IR, and optical), etc.
- Data may be obtained by the other smart devices in the room as well.
- Such data may include the occupant's breathing patterns, heart rate, movement, etc. Inferences are made based on this data in combination with data that indicates when the occupant actually wakes up. For example, if—on a regular basis—the occupant's heart rate, breathing, and moving all increase by 5% to 10%, twenty to thirty minutes before the occupant wakes up each morning, then predictions can be made regarding when the occupant is going to wake.
- These predictions can be used to provide other smart-home objectives, such as adjusting the smart thermostat 46 so as to pre-heat or cool the home to the occupant's desired setting before the occupant wakes up. Further, these predictions can be used to set the “smart alarm clock” for the occupant, to turn on lights, etc.
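- A minimal sketch of the wake-prediction heuristic described above is given below; the baseline values and the 5% rise threshold are illustrative assumptions:

```python
# Hypothetical sketch of the wake-prediction heuristic: when heart rate, breathing rate,
# and movement all rise by roughly 5-10% over the occupant's sleeping baseline, the system
# predicts waking within about twenty to thirty minutes.
def likely_waking_soon(baseline: dict, current: dict, min_rise: float = 0.05) -> bool:
    return all(current[k] >= baseline[k] * (1 + min_rise)
               for k in ("heart_rate", "breathing_rate", "movement"))

baseline = {"heart_rate": 52.0, "breathing_rate": 12.0, "movement": 1.0}
current = {"heart_rate": 56.0, "breathing_rate": 13.0, "movement": 1.1}
if likely_waking_soon(baseline, current):
    # e.g., ask the smart thermostat 46 to pre-heat/cool, or arm the "smart alarm clock"
    print("predicted wake in ~20-30 minutes")
```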
- technologies including the sensors of the smart devices located throughout the smart-home environment in combination with rules-based inference engines or artificial intelligence provided at the central server or cloud-computing system 64 are used to detect or monitor the progress of Alzheimer's Disease.
- the unique signatures of the occupants are used to track the individual occupants' movement throughout the smart-home environment 30 .
- This data can be aggregated and analyzed to identify patterns indicative of Alzheimer's.
- individuals with Alzheimer's have distinctive patterns of migration in their homes. For example, a person will walk to the kitchen and stand there for a while, then to the living room and stand there for a while, and then back to the kitchen. This pattern will take about thirty minutes, and then the person will repeat the pattern.
- the remote servers or cloud computing architectures 64 analyze the person's migration data collected by the mesh network of the smart-home environment to identify such patterns.
- FIG. 3 illustrates an embodiment of an extensible devices and services platform 80 that can be concentrated at a single server or distributed among several different computing entities without limitation with respect to the smart-home environment 30 .
- the extensible devices and services platform 80 may include a processing engine 86 , which may include engines configured to receive data from the devices of smart-home environments (e.g., via the Internet or a hubbed network), to index the data, to analyze the data, and/or to generate statistics based on the analysis or as part of the analysis.
- the analyzed data can be stored as derived home data 88 .
- Results of the analysis or statistics can thereafter be transmitted back to the device that provided home data used to derive the results, to other devices, to a server providing a web page to a user of the device, or to other non-device entities.
- use statistics, use statistics relative to use of other devices, use patterns, and/or statistics summarizing sensor readings can be generated by the processing engine 86 and transmitted.
- the results or statistics can be provided via the Internet 62 .
- the processing engine 86 can be configured and programmed to derive a variety of useful information from the home data 82 .
- a single server can include one or more engines.
- the derived data can be highly beneficial at a variety of different granularities for a variety of useful purposes, ranging from explicit programmed control of the devices on a per-home, per-neighborhood, or per-region basis (for example, demand-response programs for electrical utilities), to the generation of inferential abstractions that can assist on a per-home basis (for example, an inference can be drawn that the homeowner has left for vacation and so security detection equipment can be put on heightened sensitivity), to the generation of statistics and associated inferential abstractions that can be used for government or charitable purposes.
- processing engine 86 can generate statistics about device usage across a population of devices and send the statistics to device users, service providers or other entities (e.g., that have requested or may have provided monetary compensation for the statistics).
- the home data 82 , the derived home data 88 , and/or other data can be used to create “automated neighborhood safety networks.” For example, in the event the central server or cloud-computing architecture 64 receives data indicating that a particular home has been broken into, is experiencing a fire, or some other type of emergency event, an alarm is sent to other smart homes in the “neighborhood.” In some instances, the central server or cloud-computing architecture 64 automatically identifies smart homes within a radius of the home experiencing the emergency and sends an alarm to the identified homes.
- the other homes in the “neighborhood” do not have to sign up for or register to be a part of a safety network, but instead are notified of an emergency based on their proximity to the location of the emergency.
- this can be an opt-in service and that, in addition to or instead of the central server or cloud-computing architecture 64 selecting which homes to send alerts to, individuals can subscribe to participate in such networks and individuals can specify which homes they want to receive alerts from. This can include, for example, the homes of family members who live in different cities, such that individuals can receive alerts when their loved ones in other locations are experiencing an emergency.
- sound, vibration, and/or motion sensing components of the smart devices are used to detect sound, vibration, and/or motion created by running water. Based on the detected sound, vibration, and/or motion, the central server or cloud-computing architecture 64 makes inferences about water usage in the home and provides related services. For example, the central server or cloud-computing architecture 64 can run programs/algorithms that recognize what water sounds like and when it is running in the home.
- to enable the central server or cloud-computing architecture 64 to map the various water sources of the home, upon detecting running water, the central server or cloud-computing architecture 64 sends a message to an occupant's mobile device asking if water is currently running or if water has been recently run in the home and, if so, which room and which water-consumption appliance (e.g., sink, shower, toilet, etc.) was the source of the water. This enables the central server or cloud-computing architecture 64 to determine the “signature” or “fingerprint” of each water source in the home. This is sometimes referred to herein as “audio fingerprinting water usage.”
- the central server or cloud-computing architecture 64 creates a signature for the toilet in the master bathroom, and whenever that toilet is flushed, the central server or cloud-computing architecture 64 will know that the water usage at that time is associated with that toilet. Thus, the central server or cloud-computing architecture 64 can track the water usage of that toilet as well as of each water-consumption appliance in the home. This information can be correlated to water bills or smart water meters so as to provide users with a breakdown of their water usage.
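- A toy sketch of matching detected water sounds against per-fixture signatures is shown below; the acoustic feature vectors and nearest-signature rule are illustrative assumptions, not the disclosed algorithm:

```python
# Hypothetical sketch of "audio fingerprinting water usage": a detected acoustic feature
# vector is matched against stored per-fixture signatures (built from occupant confirmations),
# and usage events are tallied per fixture.
import math
from collections import Counter

signatures = {                       # learned from occupant-confirmed events
    "master-bath toilet": (0.80, 0.10, 0.30),
    "kitchen sink": (0.20, 0.70, 0.40),
}
usage_events: Counter = Counter()

def classify(features: tuple[float, float, float]) -> str:
    return min(signatures, key=lambda name: math.dist(signatures[name], features))

def on_water_sound(features: tuple[float, float, float]) -> None:
    usage_events[classify(features)] += 1

on_water_sound((0.78, 0.12, 0.28))
print(usage_events)                  # Counter({'master-bath toilet': 1})
```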
- sound, vibration, and/or motion sensing components of the smart devices are used to detect sound, vibration, and/or motion created by mice and other rodents as well as by termites, cockroaches, and other insects (collectively referred to as “pests”).
- Based on the detected sound, vibration, and/or motion, the central server or cloud-computing architecture 64 makes inferences about pest-detection in the home and provides related services.
- the central server or cloud-computing architecture 64 can run programs/algorithms that recognize what certain pests sound like, how they move, and/or the vibration they create, individually and/or collectively.
- the central server or cloud-computing architecture 64 can determine the “signatures” of particular types of pests.
- when the central server or cloud-computing architecture 64 detects sounds that may be associated with pests, it notifies the occupants of such sounds and suggests hiring a pest control company. If it is confirmed that pests are indeed present, the occupants can input to the central server or cloud-computing architecture 64 a confirmation that its detection was correct, along with details regarding the identified pests, such as name, type, description, location, quantity, etc. This enables the central server or cloud-computing architecture 64 to “tune” itself for better detection and create “signatures” or “fingerprints” for specific types of pests.
- the central server or cloud-computing architecture 64 can use the tuning as well as the signatures and fingerprints to detect pests in other homes, such as nearby homes that may be experiencing problems with the same pests. Further, for example, in the event that two or more homes in a “neighborhood” are experiencing problems with the same or similar types of pests, the central server or cloud-computing architecture 64 can make inferences that nearby homes may also have such problems or may be susceptible to having such problems, and it can send warning messages to those homes to help facilitate early detection and prevention.
- the devices and services platform 80 exposes a range of application programming interfaces (APIs) 90 to third parties, such as charities 94 , governmental entities 96 (e.g., the Food and Drug Administration or the Environmental Protection Agency), academic institutions 98 (e.g., university researchers), businesses 100 (e.g., providing device warranties or service to related equipment, targeting advertisements based on home data), utility companies 102 , and other third parties.
- the APIs 90 are coupled to and permit third party systems to communicate with the central server or the cloud-computing system 64 , including the services 84 , the processing engine 86 , the home data 82 , and the derived home data 88 .
- the APIs 90 allow applications executed by the third parties to initiate specific data processing tasks that are executed by the central server or the cloud-computing system 64 , as well as to receive dynamic updates to the home data 82 and the derived home data 88 .
- third parties can develop programs and/or applications, such as web or mobile apps that integrate with the central server or the cloud-computing system 64 to provide services and information to users.
- Such programs and applications may be, for example, designed to help users reduce energy consumption, to preemptively service faulty equipment, to prepare for high service demands, to track past service performance, etc., or to perform any of a variety of beneficial functions or tasks now known or hereinafter developed.
- third party applications can make inferences from the home data 82 and the derived home data 88 ; such inferences may include when occupants are home, when they are sleeping, when they are cooking, when they are in the den watching television, and when they shower.
- the answers to these questions may help third parties benefit consumers by providing them with interesting information, products, and services, as well as with targeted advertisements.
- a shipping company creates an application that makes inferences regarding when people are at home.
- the application uses the inferences to schedule deliveries for times when people will most likely be at home.
- the application can also build delivery routes around these scheduled times. This reduces the number of instances where the shipping company has to make multiple attempts to deliver packages, and it reduces the number of times consumers have to pick up their packages from the shipping company.
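- A minimal sketch of a third-party application reading derived home data through such an API is shown below; the endpoint URL, token, and JSON fields are placeholders and do not describe any actual product API:

```python
# Hypothetical sketch of a third party using the APIs 90: an application reads derived home
# data (e.g., occupancy inferences) over a REST-style interface. Everything here, including
# the base URL, token, and field names, is an illustrative assumption.
import json
import urllib.request

API_BASE = "https://api.example-devices-platform.test"   # placeholder URL
TOKEN = "third-party-oauth-token"                          # placeholder credential

def get_derived_home_data(structure_id: str) -> dict:
    req = urllib.request.Request(
        f"{API_BASE}/structures/{structure_id}/derived",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# e.g., a shipping application checking whether occupants are likely home before
# scheduling a delivery window:
# data = get_derived_home_data("structure-123")
# if data.get("occupancy") == "home": ...
```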
- FIG. 4 describes an abstracted functional view 110 of the extensible devices and services platform 80 of FIG. 3 , with particular reference to the processing engine 86 as well as devices, such as those of the smart-home environment 30 of FIG. 2 .
- While devices situated in smart-home environments will have an endless variety of different individual capabilities and limitations, they can all be thought of as sharing common characteristics in that each of them is a data consumer 112 (DC), a data source 114 (DS), a services consumer 116 (SC), and a services source 118 (SS).
- the extensible devices and services platform 80 can also be configured to harness the large amount of data that is flowing out of these devices.
- the extensible devices and services platform 80 can be directed to “repurposing” that data in a variety of automated, extensible, flexible, and/or scalable ways to achieve a variety of useful objectives. These objectives may be predefined or adaptively identified based on, e.g., usage patterns, device efficiency, and/or user input (e.g., requesting specific functionality).
- FIG. 4 shows processing engine 86 as including a number of paradigms 120 .
- Processing engine 86 can include a managed services paradigm 120 a that monitors and manages primary or secondary device functions.
- the device functions can include ensuring proper operation of a device given user inputs, estimating that (e.g., and responding to an instance in which) an intruder is or is attempting to be in a dwelling, detecting a failure of equipment coupled to the device (e.g., a light bulb having burned out), implementing or otherwise responding to energy demand response events, or alerting a user of a current or predicted future event or characteristic.
- Processing engine 86 can further include an advertising/communication paradigm 120 b that estimates characteristics (e.g., demographic information), desires and/or products of interest of a user based on device usage. Services, promotions, products or upgrades can then be offered or automatically provided to the user. Processing engine 86 can further include a social paradigm 120 c that uses information from a social network, provides information to a social network (for example, based on device usage), and/or processes data associated with user and/or device interactions with the social network platform. For example, a user's status as reported to their trusted contacts on the social network could be updated to indicate when they are home based on light detection, security system inactivation or device usage detectors. As another example, a user may be able to share device-usage statistics with other users. In yet another example, a user may share HVAC settings that result in low power bills and other users may download the HVAC settings to their smart thermostat 46 to reduce their power bills.
- the processing engine 86 can include a challenges/rules/compliance/rewards paradigm 120 d that informs a user of challenges, competitions, rules, compliance regulations and/or rewards and/or that uses operation data to determine whether a challenge has been met, a rule or regulation has been complied with and/or a reward has been earned.
- the challenges, rules or regulations can relate to efforts to conserve energy, to live safely (e.g., reducing exposure to toxins or carcinogens), to conserve money and/or equipment life, to improve health, etc.
- one challenge may involve participants turning down their thermostat by one degree for one week. Those that successfully complete the challenge are rewarded, such as by coupons, virtual currency, status, etc.
- Regarding compliance, an example involves a rental-property owner making a rule that no renters are permitted to access certain of the owner's rooms. The devices in the room having occupancy sensors could send updates to the owner when the room is accessed.
- the processing engine 86 can integrate or otherwise utilize extrinsic information 122 from extrinsic sources to improve the functioning of one or more processing paradigms.
- Extrinsic information 122 can be used to interpret data received from a device, to determine a characteristic of the environment near the device (e.g., outside a structure that the device is enclosed in), to determine services or products available to the user, to identify a social network or social-network information, to determine contact information of entities (e.g., public-service entities such as an emergency-response team, the police or a hospital) near the device, etc., to identify statistical or environmental conditions, trends or other information associated with a home or neighborhood, and so forth.
- each bedroom of the smart-home environment 30 can be provided with a smart wall switch 54 , a smart wall plug 56 , and/or smart hazard detectors 50 , all or some of which include an occupancy sensor, wherein the occupancy sensor is also capable of inferring (e.g., by virtue of motion detection, facial recognition, audible sound patterns, etc.) whether the occupant is asleep or awake.
- in the event of a fire, the remote security/monitoring service or fire department is advised of how many occupants there are in each bedroom, and whether those occupants are still asleep (or immobile) or whether they have properly evacuated the bedroom. While this is, of course, a very advantageous capability accommodated by the described extensible devices and services platform 80 , there can be substantially more “profound” examples that can truly illustrate the potential of a larger “intelligence” that can be made available. By way of perhaps a more “profound” example, the same bedroom occupancy data that is being used for fire safety can also be “repurposed” by the processing engine 86 in the context of a social paradigm of neighborhood child development and education.
- the same bedroom occupancy and motion data discussed in the “ordinary” example can be collected and made available (properly anonymized) for processing in which the sleep patterns of schoolchildren in a particular ZIP code can be identified and tracked.
- Localized variations in the sleeping patterns of the schoolchildren may be identified and correlated, for example, to different nutrition programs in local schools.
- the described extensible devices and services platform 80 may enable communicating emergency information between smart-home environments 30 that are linked and/or to the proper authorities. For example, when a burglar breaks into a smart-home environment 30 , a home security system may trip and sound an alarm and/or send emergency notifications to the neighbors, the police, the security company, and the like. However, in instances where the break in is preceded by a jamming attack on the wireless network, the notifications may not be sent out if their transmission is dependent upon the wireless network. Thus, another means to communicate with external parties may be desired. As such, the techniques disclosed herein solve this problem by detecting the jamming attack and sending emergency notifications via side channels that are not dependent upon the wireless network.
- While programs, applications, and/or application services may be used to communicate requests or commands to the smart home devices 10 , in some embodiments these may not be sent directly to the smart home devices 10 .
- the following figures illustrate smart device communication and/or control via an application accessing an API.
- FIG. 5 illustrates a system 140 where an API may be used to access and/or control one or more smart devices.
- a person may desire to access a number of smart home devices 10 , such as a first smart home device 10 A and second smart home devices 10 B.
- the first smart home device 10 A is an example of a smart thermostat, such as the Nest® Learning Thermostat by Nest Labs, Inc. (a company of Google, Inc.)
- the second smart home devices 10 B are examples of smart hazard detectors, such as the Nest® Protect by Nest Labs, Inc.
- Two application programs are shown accessing the smart home devices 10 A and/or 10 B through the device service 84 .
- While FIG. 5 illustrates accessing the smart home devices 10 A and/or 10 B using two separate application programs, it should be appreciated that any suitable number of application programs may be used to access the smart home devices 10 A and/or 10 B.
- a first application 142 sends a first device request message 144 targeted to a smart home device 10 (e.g., the smart home device 10 A) into cloud service(s) 145 and, more specifically, to a first application service 146 .
- a second application 148 may be used to issue a second device request message 150 targeted to a smart home device 10 (e.g., the smart home device 10 A) to a second application service 152 also among the cloud service(s) 145 .
- the first application 142 is a navigation application that sends estimated-time-of-arrival (ETA) information in the device request messages 144 .
- the first application 142 may be used to cause the smart home devices 10 A and/or 10 B to be prepared when a person arrives home.
- the first application 142 may send occasional device request messages 144 indicating the ETA to the first application service 146 , which may forward this information to the device service 84 (e.g., via an API, as discussed above).
- the device service 84 may hold the device request messages 144 from the first application 142 until an appropriate time.
- the second application 148 may be a third party home-automation application that may be running on a portable electronic device, such as a personal mobile device.
- the second application 148 may generate device request messages 150 , such as commands to control or request information from the smart home devices 10 A and/or 10 B.
- the second application service 152 may interface with the device service 84 by way of an API, as mentioned above.
- While the first application service 146 , the second application service 152 , and the device service 84 are illustrated in FIG. 5 as cloud service(s) 145 , it may be appreciated that some or all of these services may run on electronic devices that are not remote cloud-computer systems accessible by way of the Internet. Indeed, in some examples, the device service 84 may not be on a network that is remote from the smart home devices 10 A and/or 10 B, but rather may be running on an electronic device in the same local area network as the smart home devices 10 A and/or 10 B. For example, the device service 84 may, additionally or alternatively, run on a local server computer and/or a local wireless router on the same local area network as the smart home devices 10 A and/or 10 B. Moreover, some applications may communicate directly with the device service 84 (e.g., via the API) without first communicating with an application service such as the first application service 146 or the second application service 152 .
- the device service 84 may not merely forward these messages to the smart home devices 10 A and/or 10 B that the device request messages are targeted to. Rather, the device service 84 may serve as the point of contact that application programs may use to access the smart home devices 10 A and/or 10 B. The device service 84 then may communicate information and/or commands provided by the applications to the smart home devices 10 A and/or 10 B, enabling coordination between the applications and the devices 10 A and/or 10 B.
- the smart home devices 10 A and/or 10 B may occasionally transmit device operation status parameters 156 or other data based on the device operation status parameters 156 through the device service 84 and the proper application service (e.g., first application service 146 and/or second application service 152 ) to the proper applications (e.g., first application 142 and/or second application 148 ).
- the device operation status parameters 156 may represent any suitable characteristics of the operation status of the smart home devices 10 A and/or 10 B that may affect the proper functioning of the smart home devices 10 A and/or 10 B.
- the device operation status parameters 156 may include, for example: a battery level 159 indicative of an amount of charge remaining in a battery of the smart home device; a charging rate 160 indicative of a current rate that the battery of the smart home device is charging; a current device age 161 indicative of a period of use since initial install, a period of use since manufacture, a period of use since original sale, etc.; a planned lifespan 162 indicative of an expected useful operational duration of the smart home device; an amount of recent wireless use 163 (selected within a timespan recent enough to substantially affect an internal temperature of the smart home device 10 ); a direct measurement of an internal device temperature 164 ; and/or device operation status parameters for connected devices 165 .
- the operational status parameters for connected devices 165 may represent any suitable operational parameters describing the smart home devices 10 (e.g., smart home device 10 A) through which the device service 84 may connect to a target smart home device 10 (e.g., one of the smart home devices 10 B). For example, regarding the operational status parameters for connected devices 165 , if the target smart home device 10 is the last smart home device 10 B reached through three smart home devices 10 in three communication “hops”, the device operation status parameters 156 associated with these three intervening smart home devices 10 may be included.
- the various specific device operation status parameters 156 shown in FIG. 5 are provided by way of example. As such, the device operation status parameters 156 shown in FIG. 5 should not be understood to be exhaustive, but merely representative of possible operational parameters that may be considered for API-accessing applications. For example, additional device operation status parameters may include the current state of the device (e.g., sleeping, awake, Wi-Fi active/inactive, executing a demand-response algorithm, executing a time-to-temperature algorithm, etc.).
- the applications may use the device operation status parameters 156 or data to affect subsequent interactions (e.g., via messages 144 or 150 ) that are transmitted to the smart home devices 10 A and/or 10 B.
- the device operation status parameters 156 may correspond only to a target smart home device 10 (e.g., the smart home device 10 A), or may correspond to other smart home devices 10 that are in the vicinity of the target smart home device 10 (e.g., the smart home device 10 A and the smart home devices 10 B).
- the device operation status parameters 156 may correspond substantially only to the smart home device 10 A.
- the device operation status parameters 156 may contain operational parameter information about both the smart home device 10 A and the smart home device 10 B.
- the second application 148 may include voice actions.
- a user input to the second application 148 may be an audible cue to “Set [brand(e.g. ‘nest’)
- the second application 148 may convert this into messages that ultimately become commands to transition the desired temperature of the thermostat 10 A.
- an audible cue might be to “Turn on the heat.”
- the commands provided to the thermostat 10 A would set the thermostat one degree Celsius above the current ambient temperature. If the thermostat 10 A is in range mode, both the low and high points are raised one degree Celsius.
- an audible cue might be to “Turn on the [air conditioning/cooling].”
- the commands provided to the thermostat 10 A would set the thermostat one degree Celsius lower than the current ambient temperature. If the thermostat 10 A is in range mode, both the low and high points are lowered one degree Celsius.
- an audible cue might be to “set [brand (e.g., ‘nest’)
- the commands provided to the thermostat 10 A would change the mode of the thermostat 10 A to “AWAY.”
- the audible cue is “set [brand (e.g., ‘nest’)
- a message 144 is provided from a vehicle-based application 142 .
- the message 144 may indicate an estimated time of arrival (“ETA”) to a location (e.g., “home”) where the devices 10 A and/or 10 B are located.
- this ETA may be provided by the second application 148 running on a user device (e.g., a smart phone running the Google Now application).
- the device service 84 (or any other processor-based component of the system 140 ) may determine controls for the smart devices 10 A and/or 10 B.
- the device 10 A may be aware of a time period needed for an air conditioning system to adjust the temperature of an environment where the device 10 A is located. Operation of the device 10 A may be altered based upon the provided ETA information.
- the ETA information may be used to automatically take the device 10 A out of an “AWAY” mode (e.g., set to a “HOME” mode) when the ETA reaches a particular threshold.
- the device 10 A may be taken out of the “AWAY” mode when the ETA is, for example, less than 1 hour, less than thirty minutes, etc.
- a comparison of the ETA information and an expected temperature transition time may be used to automatically begin temperature adjustment, such that the home is at a desired temperature at the ETA of the vehicle. Accordingly, the transition state of the temperature adjustment may be completed prior to the vehicle operator entering the environment controlled by the device 10 A.
- FIG. 6 illustrates a flow diagram of a process 166 for adjusting temperature in this manner.
- the process 166 begins by obtaining an estimated time of arrival (“ETA”) (block 168 ).
- block 168 may be triggered by setting a map application destination (e.g. an in-car navigation system and/or Google Map Application) to “home.”
- the ETA may be provided by an application communicating directly and/or indirectly with the smart device(s).
- a transition time to obtain a desired temperature from a current ambient temperature is calculated (block 170 ).
- the transition time is compared with the ETA (block 172 ).
- a determination is made as to whether or not the transition time is greater than or equal to the ETA (decision block 174 ).
- a time window may be defined based upon the transition time. For example, additional time (e.g., 0.5 hours, 1.5 hours, etc.) may be added to a transition time to ensure a desired temperature is reached prior to the vehicle's ETA. This will be described in more detail with regards to FIG. 7 .
- if the transition time is not yet greater than or equal to the ETA, the process continues to poll for new ETAs from the application (or counts down until the transition time is greater than or equal to the ETA).
- once the transition time is greater than or equal to the ETA (or the associated time window), the smart device (e.g., the thermostat 10 A) may begin the temperature adjustment (e.g., cooling or heating) so that the desired temperature is reached by the time of arrival.
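- As a rough illustration of the comparison in the flow diagram of FIG. 6, the sketch below assumes the ETA and the temperature transition time are both expressed in minutes; the constant heating/cooling rate and the helper names are assumptions for the example, not how a production thermostat would estimate its transition time.

```python
def estimate_transition_minutes(current_temp_c: float, desired_temp_c: float,
                                degrees_per_hour: float = 2.0) -> float:
    """Crude transition-time estimate (block 170); assumes a constant
    heating/cooling rate, which a real thermostat would learn instead."""
    return abs(desired_temp_c - current_temp_c) / degrees_per_hour * 60.0


def should_begin_transition(eta_minutes: float, transition_minutes: float,
                            buffer_minutes: float = 0.0) -> bool:
    """Decision block 174: begin adjusting once the (buffered) transition
    time meets or exceeds the ETA, so the home reaches the desired
    temperature by the vehicle's arrival."""
    return (transition_minutes + buffer_minutes) >= eta_minutes


# Example: 45-minute ETA, and the home must warm from 16 C to 21 C.
transition = estimate_transition_minutes(16.0, 21.0)                    # 150 minutes
print(should_begin_transition(45.0, transition, buffer_minutes=30.0))   # True: start heating now
```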
- FIG. 7 illustrates a window creation operation for the ETA-based temperature adjustment.
- a graphical user interface 180 (e.g., a slider) may allow a user to choose how much margin to build into the temperature transition. A relatively large window 182 (here, the transition time plus a 1.5 hour buffer) favors comfort by ensuring the desired temperature is reached well before arrival, while a relatively small window 186 (here, the transition time plus 0.1 hours) favors maximum efficiency 188 (e.g., ensures that less energy is used).
- a vehicular application (e.g., first application 142 of FIG. 5) may also provide location information to the smart devices (e.g., smart devices 10 A and/or 10 B of FIG. 5). This information may be used to control the smart devices (e.g., via geo-fencing).
- FIGS. 8-11 relate to such embodiments.
- FIG. 8 is a process 190 for controlling smart devices via data obtained from a vehicular application.
- FIG. 9 illustrates an example of geo-fencing boundaries 200 .
- FIG. 10 relates to a location-based application on a smart phone (e.g., Google Now) and
- FIG. 11 relates to a location-based application within a vehicle.
- the process 190 begins with obtaining a location of a vehicle (or other structure providing location information) (block 192 ). As mentioned above, this may be done by providing, for example, global-positioning-system (GPS) coordinates from the vehicular application to the smart devices (e.g., via one or more APIs).
- geo-fence locations are determined (block 194). As illustrated in FIG. 9, one or more geo-fencing boundaries 200 may define locations (e.g., perimeters). Any number of boundaries of any shape or size may be used to create geo-fences. Operation of the smart devices (e.g., 10 A and/or 10 B) may be altered when the vehicle is located within and/or transitions into one of the boundaries 200 (block 196).
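- A minimal sketch of the boundary check of blocks 194 and 196 follows, assuming circular geo-fence boundaries described by a center point and radius; the coordinates and the one-kilometer radius are hypothetical.

```python
import math


def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two latitude/longitude points, in kilometers."""
    earth_radius_km = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * earth_radius_km * math.asin(math.sqrt(a))


def inside_geofence(vehicle: dict, fence: dict) -> bool:
    """Blocks 194/196: is the vehicle inside a circular boundary 200?"""
    return haversine_km(vehicle["lat"], vehicle["lon"],
                        fence["lat"], fence["lon"]) <= fence["radius_km"]


home_zone = {"lat": 37.3861, "lon": -122.0839, "radius_km": 1.0}   # hypothetical boundary 200 A
vehicle = {"lat": 37.4000, "lon": -122.0839}                       # roughly 1.5 km north of center

if not inside_geofence(vehicle, home_zone):
    print('prompt the user: set the thermostat 10 A to "AWAY"?')   # e.g., prompt 214
```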
- the vehicular application may automatically prompt the user to set the thermostat to an “AWAY” mode.
- the location 210 has moved 212 to the location 210 ′ (e.g., from the home zone 200 A to outside the home zone 200 A).
- the prompt 214 may be provided.
- the prompt 214 is provided on a handheld device 216 (e.g., a tablet computer, a programmable remote control, and/or a cellular telephone).
- FIG. 11 provides an illustration of a vehicular application embodiment.
- a prompt 270 may be provided in a graphical user interface of the vehicle, here an in-dash graphical user interface 272 .
- the vehicular application or other application may provide an automatic prompt suggesting to set one or more of the smart devices (e.g., thermostat 10 A) to “HOME” mode (e.g., not “AWAY”). For example, if the location were indicated as being within boundary 200 A or a transition into boundary 200 A was detected (e.g., by transition from location 210 ′ to location 210 ), the application may automatically prompt to set one or more of the smart devices to “HOME.”
- a vehicular application may allow manual configuration adjustments for smart devices.
- the vehicular applications may allow a user to manually set “HOME” and/or “AWAY” mode of a thermostat without having to physically access a separate application (e.g. a smart phone or tablet computer application).
- the user would not have to engage a graphical user interface of a smart phone or tablet, but could access configuration adjustments directly from the vehicular application (e.g. via the in-dash graphical user interface 272 ).
- other configuration adjustments may be possible.
- a temperature adjustment graphical user interface 274 may enable changes to the desired temperature of the thermostat 10 A.
- one or more messages may be sent from the vehicular application to the smart devices, which may be interpreted by a processor to control the smart devices. Accordingly, when user inputs (e.g., temperature adjustments or mode change adjustments) are made at the vehicular application, one or more control messages may be provided via the API(s). These messages are interpreted and cause the relevant control of the smart devices.
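- The following sketch illustrates the general shape such a control message might take; the JSON field names and the device identifier are assumptions for illustration and do not reflect the actual API schema, which would also require authentication and transport details omitted here.

```python
import json
from typing import Optional


def build_thermostat_command(device_id: str, mode: Optional[str] = None,
                             target_temperature_c: Optional[float] = None) -> str:
    """Build an illustrative control message for the device service 84.

    The message structure and field names are assumptions for this sketch;
    a real deployment would follow the documented API schema and send the
    payload over an authenticated channel."""
    payload = {"device_id": device_id}
    if mode is not None:
        payload["hvac_mode"] = mode                     # e.g., "HOME" or "AWAY"
    if target_temperature_c is not None:
        payload["target_temperature_c"] = target_temperature_c
    return json.dumps(payload)


# A temperature adjustment made from the in-dash interface 274:
print(build_thermostat_command("thermostat-10A", target_temperature_c=21.5))
```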
- energy consumption data may be provided from the vehicular application to the smart devices (or a cloud service 145 associated with the smart devices).
- gasoline and/or electrical power usage 276 may be provided to cloud services 145 .
- the cloud services 145 may provide an optimal vehicle charging schedule based on utility cost information known to the cloud services 145 . For example, in some situations, utility companies may provide cheaper energy at off-peak times.
- the cloud services 145 may provide a recharging schedule based upon these off-peak energy times.
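- One way such a recharging schedule could be derived is to greedily fill known off-peak windows until the required charge time is met, as in the sketch below; the window times and the required charging duration are hypothetical.

```python
from datetime import datetime, timedelta


def plan_charging(required_hours: float, off_peak_windows):
    """Greedily fill the cheapest (off-peak) windows until the vehicle would
    be fully charged; windows are (start, end) pairs, assumed sorted and
    non-overlapping."""
    remaining = timedelta(hours=required_hours)
    schedule = []
    for start, end in off_peak_windows:
        if remaining <= timedelta(0):
            break
        slot = min(end - start, remaining)
        schedule.append((start, start + slot))
        remaining -= slot
    return schedule


# Hypothetical off-peak window published by the utility: 11 pm to 6 am.
windows = [(datetime(2014, 6, 23, 23, 0), datetime(2014, 6, 24, 6, 0))]
for start, end in plan_charging(required_hours=4.5, off_peak_windows=windows):
    print(f"charge from {start:%H:%M} to {end:%H:%M}")   # charge from 23:00 to 03:30
```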
- the vehicular energy consumption data may allow integration with energy conservation games (e.g., Nest Leaf) available for other smart devices (e.g., the thermostat 10 A). Accordingly, energy usage reports may provide not only energy usage for smart devices within the home, but also energy consumption of vehicles related to that home.
- device operation status 156 and/or other data may be provided from smart devices to applications (e.g., the vehicular application (first application 142)). Indeed, operational status of these smart devices (e.g., smoke and/or carbon monoxide detectors (smart devices 10 B)) may be provided to the vehicular application.
- a status GUI 278 provides an indication of the current operating status of a smoke detector and/or carbon monoxide detector.
- an alarm system status, ambient temperature, or any other operational and/or sensor data may be provided for display within a vehicle.
- conditional rules may be generated based upon information received and/or sent to the API(s).
- conditional rule generation may occur from a website, such as a site that enables plugging in of conditions and outputs from a variety of different sources.
- dedicated machine-readable code having conditional rule generation instructions may be stored on a tangible, non-transitory, machine-readable medium and executed by a machine.
- conditional rules may be created where the smart devices 10 A and/or 10 B are affected as an output of the rule.
- FIG. 12 illustrates an example of a conditional rule 300 where the output 302 is access and/or control of one or more features of the smart devices 10 A and/or 10 B.
- an output 302 for a thermostat 10 A may be changing a mode (e.g., “HOME” or “AWAY”) for the thermostat, changing a desired temperature level of the thermostat, setting a fan to on or off, changing a fan speed, changing a temperature adjustment system (e.g., setting heat to cool or vice versa), etc.
- Example outputs 302 relating to a smoke detector and/or carbon monoxide detector may be activating/deactivating alarms, activating/deactivating audio, activating/deactivating lighting, activating/deactivating motion sensors, etc.
- the conditions 304 used to control the outputs 302 need not be sourced from the smart devices accessed and/or controlled by the outputs 302 .
- the conditional rules 300 may be based upon conditions sourced from an external data source 306 (e.g., external to the smart devices 10 A and/or 10 B).
- FIG. 12 illustrates a conditional rule 300 where the condition(s) 304 are sourced from an external data source 306.
- the external data source 306 may include a weather service, social media site (e.g., check-in announcement), electronic-calendar (e.g., Google calendar), geo-fencing application, utility company rate schedule, an electronic device (e.g., an alarm clock), etc.
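- A minimal sketch of how a condition service might represent and evaluate such rules follows; the rule structure, the weather-service field name, and the output action are illustrative assumptions rather than an actual rule-engine implementation.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict, List


@dataclass
class ConditionalRule:
    """A rule 300: a condition 304 evaluated against data from some source,
    and an output 302 applied to a target such as the thermostat 10 A."""
    condition: Callable[[Dict[str, Any]], bool]
    output: Callable[[], None]


def evaluate(rules: List[ConditionalRule], source_data: Dict[str, Any]) -> None:
    """Run every output whose condition holds for the supplied data."""
    for rule in rules:
        if rule.condition(source_data):
            rule.output()


# Hypothetical example: a weather-service forecast (an external data source 306)
# drives a thermostat setpoint change (an output 302).
rules = [
    ConditionalRule(
        condition=lambda data: data.get("forecast_high_c", 0) >= 35,
        output=lambda: print("lower the thermostat 10 A setpoint for a hot day"),
    ),
]
evaluate(rules, {"forecast_high_c": 38})
```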
- conditional rules may be based upon information sourced from the smart devices 10 A (e.g., thermostat) and/or 10 B (e.g., smoke and/or carbon monoxide detector).
- the source for the condition 304 may be the smart devices 10 A and/or 10 B
- the outputs 302 may be external to the smart devices 10 A and/or 10 B.
- FIG. 13 illustrates a conditional rule 310 where the output 302 is an external output 312 and the inputs 304 are sourced from data provided by the thermostat 10 A and/or smoke and/or carbon monoxide detector 10 B.
- in some embodiments, both the conditions 304 and the outputs 302 may relate to the smart devices 10 A and/or 10 B.
- Example conditions 304 that may be sourced from the thermostat 10 A may include: any device operation status 156 of the thermostat, a mode (e.g., “HOME” and/or “AWAY”) of the thermostat, an ambient temperature of the thermostat, an amount of periodic temperature change, etc.
- Example conditions 304 that may be sourced from the smoke and/or carbon monoxide detector 10 B may include: an operating status 156 of the device, an active smoke alarm, an active carbon monoxide alarm, a low device battery level, etc.
- many different conditional rules (e.g., 300 and 310) may be constructed from such conditions and outputs.
- for example, data from an activity monitor, such as an electronic wristband that tracks vital statistics, may be used to provide a condition for a conditional rule.
- when the activity monitor indicates that the user is asleep, a conditional output may set the desired temperature to a desired sleep temperature.
- when the activity monitor indicates that the user is awake, the output may set the desired thermostat temperature to an awake temperature.
- a conditional output may correspond to smart lighting.
- the lighting may be turned off when the thermostat 10 A enters an “AWAY” mode. This helps to ensure that energy is not wasted while no one is in the home. Further, when the thermostat 10 A enters “HOME” mode, the lighting may be re-activated (perhaps in the same configuration as when it was turned off, or a new configuration, such as lighting only the front foyer where access to the home typically occurs).
- lighting colors may change based upon conditions from the devices 10 A and/or 10 B. For example, it has been shown that the color red may provide visibility benefits when smoke and/or gaseous conditions are present. Accordingly, color-changing lights may be transitioned to red when an alarm from the smoke/carbon monoxide detector 10 B is active.
- additional notifications may be provided via conditional rules.
- a rule may trigger a text message, email, voice call, etc. to family, friends, neighbors, home-owners, etc. when a smoke alarm and/or a carbon monoxide alarm is triggered.
- a conditional rule may mute or lower decibel levels of one or more devices if an alarm of the detector 10 B is active. In some instances, this may be done in conjunction with a programmable remote control.
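- The alarm-driven outputs described above could, for example, be expressed declaratively, as in the hypothetical rule definition below; the keys and action names are assumptions for illustration, not an actual rule-service schema.

```python
# Hypothetical declarative form of the alarm-driven outputs described above;
# the keys and action names are illustrative, not an actual rule-service schema.
smoke_alarm_rules = [
    {
        "condition": {"device": "detector-10B", "field": "smoke_alarm", "equals": "active"},
        "outputs": [
            {"action": "notify", "channel": "sms", "recipients": ["homeowner", "neighbor"]},
            {"action": "set_light_color", "target": "color-changing-lights", "color": "red"},
            {"action": "mute_audio", "target": "entertainment-devices"},
        ],
    },
]

for output in smoke_alarm_rules[0]["outputs"]:
    print(output["action"])
```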
- a weather service may provide conditions 304 for a conditional rule. For example, when the weather service reports an extremely hot and/or humid day, the desired temperature of the thermostat 10 A may be adjusted as a conditional rule output. Thus, the thermostat 10 A may become highly customizable for a user's desired preferences.
- Outputs 302 related to mode changes in the thermostat 10 A may be implemented by conditions sourced from social media data. For example, a “check-in” on Google Hangouts may suggest that a homeowner is not home and that an “AWAY” mode should be set. Accordingly, a rule may be generated to set the mode of the thermostat 10 A to “AWAY” if there is a check-in outside of the home.
- the geo-fencing applications may also be used as conditions for the conditional rules. For example, an output altering the mode of the thermostat 10 A to “AWAY” may be triggered when exiting the boundary 200 A. The thermostat 10 A mode may be altered to “HOME” when entering the boundary 200 A.
- other smart devices within the home may trigger outputs of the smart devices 10 A and/or 10 B.
- the thermostat 10 A mode may be set to “HOME.”
- particular keywords or contextual identifiers may be used as conditions 304 that trigger an output 302 .
- the thermostat may be controlled to go into “AWAY” mode.
- the “AWAY” mode output may be triggered at the thermostat 10 A.
- when the thermostat 10 A transitions to “HOME,” audio playback may be triggered. Further, when the thermostat 10 A transitions to “AWAY,” music playback may be halted. Additionally, activating music playback on a device within the home may automatically trigger a command to enable “HOME” mode on the thermostat 10 A.
- each of the thermostats 10 A and/or detectors 10 B may be accessed by a unique identifier. Accordingly, a condition 304 and/or output 302 may be specifically tied to a particular one or more of the thermostats 10 A and/or detectors 10 B.
- the API(s) may enable other automation systems to interact with the smart devices 10 A and/or 10 B.
- a Control4® system may use the API(s) to increase/decrease temperatures of the thermostat 10 A, receive alarm states or other device operation status 156 from the thermostat 10 A and/or detector 10 B, set modes of operation (e.g., “heat,” “cool,” “HOME,” and/or “AWAY”) on the thermostat 10 A, etc.
- in some embodiments, household appliances (e.g., a washing machine 372 and/or a dryer 374) may be integrated with the smart devices 10 A and/or 10 B. FIG. 14 illustrates such a system 370.
- the washing machine 372 may include a system to maintain unattended laundry.
- a fan may periodically pull moisture from the drum of the washing machine 372 and also periodically tumble the unattended laundry.
- the dryer 374 may include an unattended laundry system that intermittently tumbles unattended laundry after a dryer cycle.
- these unattended laundry systems are activated manually via an onboard interface of the washing machine 372 .
- this system may be activated automatically, using occupancy status discerned from the smart devices 10 A and/or 10 B.
- the thermostat 10 A is set to “AWAY” when the thermostat detects an indication that no one is in the temperature-controlled environment. Further, when the detectors 10 B are equipped with occupancy sensors, similar household occupancy status may be defined. The status from the detectors 10 B may be provided to the thermostat 10 A, which in turn may automatically be set to “AWAY.” Further, thermostat 10 A users may manually set the thermostat to “AWAY,” upon leaving the house.
- the away status may be provided to a service (e.g., a service of the washer 372, dryer 374, the cloud service 145, a condition service 376 (e.g., a website that provides graphical conditional rule generation), etc.), which may use the status as a condition for activating the unattended laundry systems.
- the service may provide a washer 372 and/or dryer 374 command to activate the respective unattended laundry system.
- the laundry will remain fresh and/or wrinkle free, despite the operator leaving the laundry unattended and not manually activating the unattended laundry systems.
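- A minimal sketch of this occupancy-driven activation follows, assuming the service receives the home's away status and that the appliances expose a command for their unattended laundry systems; the appliance interface shown is hypothetical.

```python
class UnattendedLaundryCapable:
    """Stand-in for a washer 372 or dryer 374 that exposes an
    unattended-laundry command; the method name is hypothetical."""

    def __init__(self, name: str):
        self.name = name

    def activate_unattended_laundry(self) -> None:
        print(f"{self.name}: tumbling and venting unattended laundry")


def on_home_occupancy_status(status: str, appliances) -> None:
    """Hypothetical handler in the service of FIG. 14: when the home is
    reported as AWAY, activate each appliance's unattended-laundry system."""
    if status == "AWAY":
        for appliance in appliances:
            appliance.activate_unattended_laundry()


on_home_occupancy_status("AWAY", [UnattendedLaundryCapable("washer 372"),
                                  UnattendedLaundryCapable("dryer 374")])
```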
- some dryers 374 may be equipped with an economy boost option that may place the dryer in a more time-consuming but less energy-consuming state.
- the service may provide a command for the dryer 374 to enter the economy boost option.
- Rush Hour Rewards by Nest provides incentives to consumers to use less energy during peak usage times. Users enrolled in Rush Hour Rewards receive periodic peak energy usage events defining a peak usage time when energy consumption should be avoided to obtain a reward from the cloud services 145.
- the washer 372 and/or dryer 374 receives the peak event signal from the cloud services 145 and calculates the peak start time and duration. The peak start time is adjusted by a default cycle length for the washer 372 and/or dryer 374 to ensure that a consumer does not inadvertently start a cycle just before the event is to begin. For example, if a washing machine 372 and/or dryer 374 cycle is typically 30 minutes, the peak start time is adjusted by 30 minutes to ensure that the washer 372 and/or the dryer 374 is not active during the peak event.
- a Rush Hour peak event may begin at 2:00 pm and last for 4 hours. With a default cycle time of 30 minutes, the washer 372 adjusts the peak event start to 1:30 pm and ends the event at 6:00 pm (4 hour and 30 minute duration). These adjustments to the Rush Hour peak event help to ensure that the washer 372 is not in operation during the peak event.
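- The start-time adjustment described in this example can be expressed as simple date arithmetic, as in the sketch below; the event times mirror the example above, while the function itself is an illustrative assumption.

```python
from datetime import datetime, timedelta


def adjust_peak_event(start: datetime, duration: timedelta,
                      default_cycle: timedelta):
    """Shift the peak-event start earlier by one default cycle length so a
    cycle started just before the event cannot run into it."""
    return start - default_cycle, start + duration


adjusted_start, adjusted_end = adjust_peak_event(
    start=datetime(2014, 6, 23, 14, 0),       # 2:00 pm event start
    duration=timedelta(hours=4),              # 4-hour Rush Hour peak event
    default_cycle=timedelta(minutes=30),      # typical washer/dryer cycle length
)
print(adjusted_start.strftime("%I:%M %p"), "to", adjusted_end.strftime("%I:%M %p"))
# 01:30 PM to 06:00 PM, a 4 hour and 30 minute adjusted event
```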
- the service may send a command to the washer 372 and/or dryer 374 to enter a Smart Delay.
- the washer 372 and/or dryer 374 will inform the consumer that a peak event is in process and that a more energy friendly time to run the cycle is approaching.
- the consumer may provide an input to allow the washer 372 and/or dryer 374 to automatically start when the event is complete, or the consumer may override the Smart Delay and start the cycle immediately.
- the service sends a command for the washer 372 and/or dryer 374 to enter a deep power reduction mode. Accordingly, if the washer 372 and/or dryer 374 is in operation prior to receiving the peak event, the washer 372 and/or dryer 374 will reduce power usage for a brief period of time. Further, the dryer will also enter economy boost for the remainder of the cycle. If not running a cycle, the washer 372 and/or dryer 374 will enter Smart Delay. When the Rush Hour peak event has concluded, the washer 372 and/or dryer 374 return to normal operation.
- Energy usage of the washer 372 and/or dryer 374 may be accumulated by the cloud services 145 .
- Nest may accumulate the energy usage of lighting, external automation systems, etc. to include this information in energy utilization reports.
- the energy consumption may be incorporated in energy conservation information and/or games, such as Nest Leaf.
- the detectors 10 B may be used as conditions for controlling the washer 372, dryer 374, and/or a stove-top/oven 378.
- for example, when the detectors 10 B detect smoke and/or gas, the washer 372, dryer 374, and/or stove-top/oven 378 may be disabled.
- for instance, gas access to a burner on the stove-top/oven 378 may be disabled.
- in some embodiments, booking service conditions may be used to control smart devices (e.g., thermostat 10 A and/or detectors 10 B).
- FIG. 15 illustrates such a system 400 .
- a booking service 402 such as a hotel or Bed and Breakfast website may enable reservations for one or more particular rooms.
- the booking service 402 includes a listing 404 of available Bed and Breakfast locations for a particular location.
- the listing 404 includes an indicator 408 for smart locations that may be personalized for a user's particular desires.
- an availability calendar 408 is provided.
- additional prompts 410 may be provided.
- an alarm prompt 412 may enable a user to input an alarm code that is easy for the user to remember.
- An environment prompt 414 may enable the user to input particular environmental settings such as a desired arrival temperature, etc.
- the alarm and/or environmental settings may be pre-populated or obtained from the user's home 418 (or other location) settings. For example, if the user maintains a 78 degree temperature when awake and occupying the house and a 73 degree temperature when sleeping and occupying the house, these temperature settings may automatically be sent and implemented at the user's booked room 420 .
- the cloud services 145 may provide the settings input at the prompts 410 and/or the settings obtained from the house 418 to the smart devices in the booked room 420.
- when the room 420 is booked for particular time periods, the user's settings may be automatically implemented via the cloud service 145 during those time periods.
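- A minimal sketch of how the cloud services 145 might choose which settings to apply to the booked room 420 follows; the booking structure, dates, and setting names are hypothetical.

```python
from datetime import date


def settings_for_room(today: date, booking: dict, guest_settings: dict,
                      default_settings: dict) -> dict:
    """Hypothetical helper for the cloud services 145: during the booked
    period, apply the guest's preferred settings to the room 420;
    otherwise fall back to the lessor's defaults (e.g., "AWAY")."""
    if booking["check_in"] <= today < booking["check_out"]:
        return guest_settings
    return default_settings


booking = {"check_in": date(2014, 7, 4), "check_out": date(2014, 7, 6)}
guest = {"mode": "HOME", "awake_temp_f": 78, "sleep_temp_f": 73}
defaults = {"mode": "AWAY"}
print(settings_for_room(date(2014, 7, 5), booking, guest, defaults))  # guest settings apply
```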
- smart device notifications such as active alarms of the detector 10 B may be provided to the user (e.g., the user's smart phone, etc.) during the booked time period.
- the user's home may be controlled by placing the user's home in “AWAY” mode during the booked time period and the user may be notified when their home devices detect occupancy while they are expected to be away (e.g., notify the user that their home thermostat transitioned to “HOME” while they are away).
- This functionality may also benefit the lessor by providing energy conservation.
- the booking service 402 is aware of times when there is no occupancy in the room 420 . Accordingly, the availability calendar 408 may be used to set the thermostats 10 A to “AWAY” during periods where there is no occupancy.
- a garage door opener may be used as either a condition for a thermostat 10 A output and/or a thermostat 10 A condition may be used for a garage door opener output.
- FIG. 16 provides a system 440 that integrates a garage door opener 442 with smart devices 10 A and/or 10 B.
- the garage door opener 442 status may indicate that someone is arriving and/or leaving the house 444 .
- a prompt 446 may be provided on a user's device 448 (e.g., smart phone) prompting to change the mode of the thermostat 10 A (e.g., from “HOME” to “AWAY” or vice versa).
- conditions of the thermostat 10 A may be used to trigger closure of the door 450 .
- for example, a conditional rule might trigger closure of the door 450 when the thermostat 10 A has been set to “AWAY” for 30 minutes or longer.
- accordingly, if the door 450 is inadvertently left open when the occupants leave, the door 450 may be closed automatically, adding household security.
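- A minimal sketch of the 30-minute rule described above follows; the function and parameter names are assumptions for illustration.

```python
from datetime import datetime, timedelta
from typing import Optional


def should_close_garage(door_open: bool, away_since: Optional[datetime],
                        now: datetime,
                        threshold: timedelta = timedelta(minutes=30)) -> bool:
    """Close the door 450 only if it is open and the thermostat 10 A has
    reported "AWAY" for at least the threshold (30 minutes in the example)."""
    return door_open and away_since is not None and (now - away_since) >= threshold


now = datetime(2014, 6, 23, 18, 0)
print(should_close_garage(True, now - timedelta(minutes=45), now))   # True: close the door
print(should_close_garage(True, now - timedelta(minutes=10), now))   # False: wait
```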
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Automation & Control Theory (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- General Physics & Mathematics (AREA)
- Combustion & Propulsion (AREA)
- Mechanical Engineering (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Fuzzy Systems (AREA)
- Medical Informatics (AREA)
- Health & Medical Sciences (AREA)
- Computer Hardware Design (AREA)
- Business, Economics & Management (AREA)
- Analytical Chemistry (AREA)
- Emergency Management (AREA)
- Computer Security & Cryptography (AREA)
- General Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Computation (AREA)
- Selective Calling Equipment (AREA)
- Telephonic Communication Services (AREA)
- Circuit Arrangement For Electric Light Sources In General (AREA)
Abstract
Description
- This application is a Continuation Application of, and claims priority to, U.S. application Ser. No. 14/577,635, entitled “Methods And Apparatus For Exploiting Interfaces Smart Environment Device Application Program Interfaces”, filed Dec. 19, 2014 which in turn claims priority to U.S. Provisional Patent Application No. 62/016,052, entitled “Methods and Apparatus for Exploiting Application Programming Interfaces to Smart Home Environment Electronic Components”, filed Jun. 23, 2014, which is herein incorporated by reference.
- This disclosure relates to controlling access to electronic devices via application programming interface (API) restrictions.
- This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present disclosure, which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
- People interact with a number of different electronic devices on a daily basis. In a home setting, for example, a person may interact with smart thermostats, lighting systems, alarm systems, entertainment systems, and a variety of other electronic devices. To interact with some of these electronic devices, a person may communicate a command using an application program running on another electronic device. For instance, a person may control the temperature setting on a smart thermostat using an application program running on a smartphone. The application program may communicate with a secure online service that interacts with that thermostat.
- To preserve the user experience associated with an electronic device, the manufacturer of the electronic device may develop the application programs to control the electronic device. Opening access to the electronic devices to third party developers, however, may potentially improve the experience of some people with the devices—but only if third party application programs do not cause the electronic devices to behave in an undesirable manner. Accordingly, while it may be desirable to open access to the electronic devices to third party developers, it may also be desirable to place restrictions on that access so as to reduce the risk that the third party access may negatively impact the operation of the electronic devices and thus the user experience associated with those devices.
- A summary of certain embodiments disclosed herein is set forth below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.
- According to embodiments of this disclosure, applications may access different installations of smart home devices (e.g., via an application programming interface (API)). Namely, the third party applications may communicate not directly with a smart home device, but rather through a device service. The device service may provide a corresponding update signal to the target smart home device based on one or more factors such as operation status parameters of the device.
- Various refinements of the features noted above may exist in relation to various aspects of the present disclosure. Further features may also be incorporated in these various aspects as well. These refinements and additional features may exist individually or in any combination. For instance, various features discussed below in relation to one or more of the illustrated embodiments may be incorporated into any of the above-described aspects of the present disclosure alone or in any combination. The brief summary presented above is intended only to familiarize the reader with certain aspects and contexts of embodiments of the present disclosure without limitation to the claimed subject matter.
- Various aspects of this disclosure may be better understood upon reading the following detailed description and upon reference to the drawings in which:
-
FIG. 1 is a block diagram of a smart home device, in accordance with an embodiment; -
FIG. 2 is a block diagram of a connected smart home environment that includes a number of smart home devices, in accordance with an embodiment; -
FIG. 3 is a block diagram illustrating a manner of controlling and/or accessing the smart home environment using services over the internet, in accordance with an embodiment; -
FIG. 4 is a block diagram of processing paradigms that may be used to control devices of the smart home environment, in accordance with an embodiment; -
FIG. 5 is a block diagram of a system that provides access to smart home devices, in accordance with an embodiment; -
FIG. 6 is a flow diagram illustrating a method for transitioning temperatures based upon an estimated time of arrival, in accordance with an embodiment; -
FIG. 7 is block diagram illustrating window creation for the method ofFIG. 6 , in accordance with an embodiment; -
FIG. 8 is a flow diagram illustrating a method for controlling devices using geo-fencing, in accordance with an embodiment; -
FIG. 9 is a block diagram illustrating a set of geo-fence boundaries, in accordance with an embodiment; -
FIG. 10 is a block diagram illustrating a geo-fencing application on a handheld electronic device, in accordance with an embodiment; -
FIG. 11 is a block diagram illustrating an application running from an in-dash interface, in accordance with an embodiment; -
FIG. 12 is a schematic illustration of a conditional rule where a thermostat, a smoke/carbon monoxide detector, or both are outputs, in accordance with an embodiment; -
FIG. 13 is a schematic illustration of a conditional rule where data from a thermostat, a smoke/carbon monoxide detector, or both are conditions, in accordance with an embodiment; -
FIG. 14 is a block diagram of a system that integrates household appliances with a thermostat, smoke/carbon monoxide detector, or both, in accordance with an embodiment; -
FIG. 15 is a block diagram of a system that integrates a booking service with a thermostat, smoke/carbon monoxide detector, an alarm system, or combination thereof, in accordance with an embodiment; and -
FIG. 16 is a block diagram of a system that integrates a garage door opener with a thermostat, smoke/carbon monoxide detector, or both, in accordance with an embodiment. - One or more specific embodiments will be described below. In an effort to provide a concise description of these embodiments, not all features of an actual implementation are described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
- Embodiments of the present disclosure relate to an electronic device, such as a thermostat or a hazard detector (e.g., smoke detector), that may be disposed in a building (e.g., home or office) such that the electronic device may detect the presence of a human being in the building and distinguish between the presence of the human being and a pet. Generally, the electronic device may employ a sensor, such as a passive infrared (PIR) sensor, to detect the presence of a human being. However, each PIR sensor may be inherently sensitive to different levels of noise. By accounting for the different sensitivity levels of each PIR sensor, the electronic device may improve its detection of human beings and better distinguish between the presence of human beings and pets.
- Keeping this in mind, the electronic device may include a low-power processor that may store the sensor measurements acquired by the PIR sensor during a time period when the building or portion of the building being monitored by the electronic device is not expected to have a human being present. In one embodiment, after storing the sensor measurements over some period of time, the low-power processor may send the stored sensor measurements to a high-power processor of the electronic device. The high-power processor may then calculate a threshold or adjust the previous threshold for determining a presence of a human based on the stored sensor measurements that correspond to the time period when a human being is likely not present in the building. The high-power processor may then send the newly calculated or the adjusted threshold to the low-power processor. The low-power processor may then use the newly calculated or the adjusted threshold to detect the presence of a human. Since the new threshold is calculated based on the respective sensor measurements for the respective PIR sensor of a respective electronic device, the new threshold may compensate for the inherent sensitivity characteristics of the respective PIR sensor. As a result, the electronic device may detect the presence of a human being more effectively and efficiently.
- Smart Device in Smart Home Environment
- By way of introduction,
FIG. 1 illustrates an example of a general device 10 that may be disposed within a building environment. In one embodiment, the device 10 may include one or more sensors 12, a user-interface component 14, a power supply 16 (e.g., including a power connection and/or battery), a network interface 18, a high-power processor 20, a low-power processor 22, a passive infrared (PIR) sensor 24, a light source 26, and the like. - The
sensors 12, in certain embodiments, may detect various properties such as acceleration, temperature, humidity, water, supplied power, proximity, external motion, device motion, sound signals, ultrasound signals, light signals, fire, smoke, carbon monoxide, global-positioning-satellite (GPS) signals, radio-frequency (RF), other electromagnetic signals or fields, or the like. As such, thesensors 12 may include temperature sensor(s), humidity sensor(s), hazard-related sensor(s) or other environmental sensor(s), accelerometer(s), microphone(s), optical sensors up to and including camera(s) (e.g., charged coupled-device or video cameras), active or passive radiation sensors, GPS receiver(s) or radiofrequency identification detector(s). WhileFIG. 1 illustrates an embodiment with a single sensor, many embodiments may include multiple sensors. In some instances, thedevice 10 may include one or more primary sensors and one or more secondary sensors. Here, the primary sensor(s) may sense data central to the core operation of the device (e.g., sensing a temperature in a thermostat or sensing smoke in a smoke detector), while the secondary sensor(s) may sense other types of data (e.g., motion, light or sound), which can be used for energy-efficiency objectives or smart-operation obj ectives. - One or more user-
interface components 14 in thedevice 10 may receive input from the user and/or present information to the user. The received input may be used to determine a setting. In certain embodiments, the user-interface components may include a mechanical or virtual component that responds to the user's motion. For example, the user can mechanically move a sliding component (e.g., along a vertical or horizontal track) or rotate a rotatable ring (e.g., along a circular track), or the user's motion along a touchpad may be detected. Such motions may correspond to a setting adjustment, which can be determined based on an absolute position of a user-interface component 14 or based on a displacement of a user-interface components 14 (e.g., adjusting a set point temperature by 1 degree F. for every 10° rotation of a rotatable-ring component). Physically and virtually movable user-interface components can allow a user to set a setting along a portion of an apparent continuum. Thus, the user may not be confined to choose between two discrete options (e.g., as would be the case if up and down buttons were used) but can quickly and intuitively define a setting along a range of possible setting values. For example, a magnitude of a movement of a user-interface component may be associated with a magnitude of a setting adjustment, such that a user may dramatically alter a setting with a large movement or finely tune a setting with a small movement. - The user-
interface components 14 may also include one or more buttons (e.g., up and down buttons), a keypad, a number pad, a switch, a microphone, and/or a camera (e.g., to detect gestures). In one embodiment, the user-interface component 14 may include a click-and-rotate annular ring component that may enable the user to interact with the component by rotating the ring (e.g., to adjust a setting) and/or by clicking the ring inwards (e.g., to select an adjusted setting or to select an option). In another embodiment, the user-interface component 14 may include a camera that may detect gestures (e.g., to indicate that a power or alarm state of a device is to be changed). In some instances, thedevice 10 may have one primary input component, which may be used to set a plurality of types of settings. The user-interface components 14 may also be configured to present information to a user via, e.g., a visual display (e.g., a thin-film-transistor display or organic light-emitting-diode display) and/or an audio speaker. - The power-
supply component 16 may include a power connection and/or a local battery. For example, the power connection may connect thedevice 10 to a power source such as a line voltage source. In some instances, an AC power source can be used to repeatedly charge a (e.g., rechargeable) local battery, such that the battery may be used later to supply power to thedevice 10 when the AC power source is not available. - The
network interface 18 may include a component that enables thedevice 10 to communicate between devices. As such, thenetwork interface 18 may enable thedevice 10 to communicate withother devices 10 via a wired or wireless network. Thenetwork interface 18 may include a wireless card or some other transceiver connection to facilitate this communication. - The high-
power processor 20 and the low-power processor 22 may support one or more of a variety of different device functionalities. As such, the high-power processor 20 and the low-power processor 22 may each include one or more processors configured and programmed to carry out and/or cause to be carried out one or more of the functionalities described herein. In one embodiment, the high-power processor 20 and the low-power processor 22 may include general-purpose processors carrying out computer code stored in local memory (e.g., flash memory, hard drive, random access memory), special-purpose processors or application-specific integrated circuits, combinations thereof, and/or using other types of hardware/firmware/software processing platforms. In certain embodiments, the high-power processor 20 may execute computationally intensive operations such as operating the user-interface component 14 and the like. The low-power processor 22, on the other hand, may manage less complex processes such as detecting a hazard or temperature from thesensor 12. In one embodiment, the low-power processor may wake or initialize the high-power processor for computationally intensive processes. - By way of example, the high-
power processor 20 and the low-power processor 22 may detect when a location (e.g., a house or room) is occupied (i.e., includes a presence of a human), up to and including whether it is occupied by a specific person or is occupied by a specific number of people (e.g., relative to one or more thresholds). In one embodiment, this detection can occur, e.g., by analyzing microphone signals, detecting user movements (e.g., in front of a device), detecting openings and closings of doors or garage doors, detecting wireless signals, detecting an internet protocol (IP) address of a received signal, detecting operation of one or more devices within a time window, or the like. Moreover, the high-power processor 20 and the low-power processor 22 may include image recognition technology to identify particular occupants or objects. - In certain embodiments, the high-
power processor 20 and the low-power processor 22 may detect the presence of a human using the PIR sensor 24. The PIR sensor 24 may be a passive infrared sensor that may measure infrared (IR) light radiating from objects in its field of view. As such, the PIR sensor 24 may detect the infrared radiation emitted from an object. - In some instances, the high-
power processor 20 may predict desirable settings and/or implement those settings. For example, based on the presence detection, the high-power processor 20 may adjust device settings to, e.g., conserve power when nobody is home or in a particular room or to accord with user preferences (e.g., general at-home preferences or user-specific preferences). As another example, based on the detection of a particular person, animal or object (e.g., a child, pet or lost object), the high-power processor 20 may initiate an audio or visual indicator of where the person, animal or object is or may initiate an alarm or security feature if an unrecognized person is detected under certain conditions (e.g., at night or when lights are off). - In some instances, devices may interact with each other such that events detected by a first device influences actions of a second device. For example, a first device can detect that a user has entered into a garage (e.g., by detecting motion in the garage, detecting a change in light in the garage or detecting opening of the garage door). The first device can transmit this information to a second device via the
network interface 18, such that the second device can, e.g., adjust a home temperature setting, a light setting, a music setting, and/or a security-alarm setting. As another example, a first device can detect a user approaching a front door (e.g., by detecting motion or sudden light pattern changes). The first device may, e.g., cause a general audio or visual signal to be presented (e.g., such as sounding of a doorbell) or cause a location-specific audio or visual signal to be presented (e.g., to announce the visitor's presence within a room that a user is occupying). - In addition to detecting various types of events, the
device 10 may include a light source 26 that may illuminate when a living being, such as a human, is detected as approaching. The light source 26 may include any type of light source such as one or more light-emitting diodes or the like. The light source 26 may be communicatively coupled to the high-power processor 20 and the low-power processor 22, which may provide a signal to cause the light source 26 to illuminate. - Keeping the foregoing in mind,
FIG. 2 illustrates an example of a smart-home environment 30 within which one or more of thedevices 10 ofFIG. 1 , methods, systems, services, and/or computer program products described further herein can be applicable. The depicted smart-home environment 30 includes astructure 32, which can include, e.g., a house, office building, garage, or mobile home. It will be appreciated that devices can also be integrated into a smart-home environment 30 that does not include anentire structure 32, such as an apartment, condominium, or office space. Further, the smart home environment can control and/or be coupled to devices outside of theactual structure 32. Indeed, several devices in the smart home environment need not physically be within thestructure 32 at all. For example, a device controlling a pool heater or irrigation system can be located outside of thestructure 32. - The depicted
structure 32 includes a plurality ofrooms 38, separated at least partly from each other viawalls 40. Thewalls 40 can include interior walls or exterior walls. Each room can further include afloor 42 and aceiling 44. Devices can be mounted on, integrated with and/or supported by awall 40,floor 42 orceiling 44. - In some embodiments, the smart-
home environment 30 ofFIG. 2 includes a plurality ofdevices 10, including intelligent, multi-sensing, network-connected devices, that can integrate seamlessly with each other and/or with a central server or a cloud-computing system to provide any of a variety of useful smart-home objectives. The smart-home environment 30 may include one or more intelligent, multi-sensing, network-connected thermostats 46 (hereinafter referred to as “smart thermostats 46”), one or more intelligent, network-connected, multi-sensing hazard detection units 50 (hereinafter referred to as “smart hazard detectors 50”), and one or more intelligent, multi-sensing, network-connected entryway interface devices 52 (hereinafter referred to as “smart doorbells 52”). According to embodiments, thesmart thermostat 46 may include a Nest® Learning Thermostat—1st Generation T100577 or Nest® Learning Thermostat—2nd Generation T200577 by Nest Labs, Inc., among others. Thesmart thermostat 46 detects ambient climate characteristics (e.g., temperature and/or humidity) and controls aHVAC system 48 accordingly. - The
smart hazard detector 50 may detect the presence of a hazardous substance or a substance indicative of a hazardous substance (e.g., smoke, fire, or carbon monoxide). Thesmart hazard detector 50 may include a Nest® Protect that may includesensors 12 such as smoke sensors, carbon monoxide sensors, and the like. As such, thehazard detector 50 may determine when smoke, fire, or carbon monoxide may be present within the building. - The
smart doorbell 52 may detect a person's approach to or departure from a location (e.g., an outer door), control doorbell functionality, announce a person's approach or departure via audio or visual means, or control settings on a security system (e.g., to activate or deactivate the security system when occupants go and come). Thesmart doorbell 52 may interact withother devices 10 based on whether someone has approached or entered the smart-home environment 30. - In some embodiments, the smart-
home environment 30 further includes one or more intelligent, multi-sensing, network-connected wall switches 54 (hereinafter referred to as “smart wall switches 54”), along with one or more intelligent, multi-sensing, network-connected wall plug interfaces 56 (hereinafter referred to as “smart wall plugs 56”). The smart wall switches 54 may detect ambient lighting conditions, detect room-occupancy states, and control a power and/or dim state of one or more lights. In some instances, smart wall switches 54 may also control a power state or speed of a fan, such as a ceiling fan. The smart wall plugs 56 may detect occupancy of a room or enclosure and control supply of power to one or more wall plugs (e.g., such that power is not supplied to the plug if nobody is at home). - Still further, in some embodiments, the
device 10 within the smart-home environment 30 may further includes a plurality of intelligent, multi-sensing, network-connected appliances 58 (hereinafter referred to as “smart appliances 58”), such as refrigerators, stoves and/or ovens, televisions, washers, dryers, lights, stereos, intercom systems, garage-door openers, floor fans, ceiling fans, wall air conditioners, pool heaters, irrigation systems, security systems, and so forth. According to embodiments, the network-connectedappliances 58 are made compatible with the smart-home environment by cooperating with the respective manufacturers of the appliances. For example, the appliances can be space heaters, window AC units, motorized duct vents, etc. When plugged in, an appliance can announce itself to the smart-home network, such as by indicating what type of appliance it is, and it can automatically integrate with the controls of the smart-home. Such communication by the appliance to the smart home can be facilitated by any wired or wireless communication protocols known by those having ordinary skill in the art. The smart home also can include a variety ofnon-communicating legacy appliances 68, such as old conventional washer/dryers, refrigerators, and the like which can be controlled, albeit coarsely (ON/OFF), by virtue of the smart wall plugs 56. The smart-home environment 30 can further include a variety of partially communicatinglegacy appliances 70, such as infrared (“IR”) controlled wall air conditioners or other IR-controlled devices, which can be controlled by IR signals provided by thesmart hazard detectors 50 or the smart wall switches 54. - According to embodiments, the
smart thermostats 46, thesmart hazard detectors 50, thesmart doorbells 52, the smart wall switches 54, the smart wall plugs 56, and other devices of the smart-home environment 30 are modular and can be incorporated into older and new houses. For example, thedevices 10 are designed around a modular platform consisting of two basic components: a head unit and a back plate, which is also referred to as a docking station. Multiple configurations of the docking station are provided so as to be compatible with any home, such as older and newer homes. However, all of the docking stations include a standard head-connection arrangement, such that any head unit can be removably attached to any docking station. Thus, in some embodiments, the docking stations are interfaces that serve as physical connections to the structure and the voltage wiring of the homes, and the interchangeable head units contain all of thesensors 12, processors 28,user interfaces 14, thepower supply 16, thenetwork interface 18, and other functional components of the devices described above. - Many different commercial and functional possibilities for provisioning, maintenance, and upgrade are possible. For example, after years of using any particular head unit, a user will be able to buy a new version of the head unit and simply plug it into the old docking station. There are also many different versions for the head units, such as low-cost versions with few features, and then a progression of increasingly-capable versions, up to and including extremely fancy head units with a large number of features. Thus, it should be appreciated that the various versions of the head units can all be interchangeable, with any of them working when placed into any docking station. This can advantageously encourage sharing and re-deployment of old head units—for example, when an important high-capability head unit, such as a hazard detector, is replaced by a new version of the head unit, then the old head unit can be re-deployed to a back room or basement, etc. According to embodiments, when first plugged into a docking station, the head unit can ask the user (by 2D LCD display, 2D/3D holographic projection, voice interaction, etc.) a few simple questions such as, “Where am I” and the user can indicate “living room”, “kitchen” and so forth.
- The smart-
home environment 30 may also include communication with devices outside of the physical home but within a proximate geographical range of the home. For example, the smart-home environment 30 may include a pool heater monitor 34 that communicates a current pool temperature to other devices within the smart-home environment 30 or receives commands for controlling the pool temperature. Similarly, the smart-home environment 30 may include anirrigation monitor 36 that communicates information regarding irrigation systems within the smart-home environment 30 and/or receives control information for controlling such irrigation systems. According to embodiments, an algorithm is provided for considering the geographic location of the smart-home environment 30, such as based on the zip code or geographic coordinates of the home. The geographic information is then used to obtain data helpful for determining optimal times for watering, such data may include sun location information, temperature, dewpoint, soil type of the land on which the home is located, etc. - By virtue of network connectivity, one or more of the smart-home devices of
FIG. 2 can further allow a user to interact with the device even if the user is not proximate to the device. For example, a user can communicate with a device using a computer (e.g., a desktop computer, laptop computer, or tablet) or other portable electronic device (e.g., a smartphone) 66. A web page or app can be configured to receive communications from the user and control the device based on the communications and/or to present information about the device's operation to the user. For example, the user can view a current setpoint temperature for a device and adjust it using a computer. The user can be in the structure during this remote communication or outside the structure. - As discussed, users can control the smart thermostat and other smart devices in the smart-
home environment 30 using a network-connected computer or portable electronic device 66. In some examples, some or all of the occupants (e.g., individuals who live in the home) can register their device 66 with the smart-home environment 30. Such registration can be made at a central server to authenticate the occupant and/or the device as being associated with the home and to give permission to the occupant to use the device to control the smart devices in the home. An occupant can use their registered device 66 to remotely control the smart devices of the home, such as when the occupant is at work or on vacation. The occupant may also use their registered device to control the smart devices when the occupant is actually located inside the home, such as when the occupant is sitting on a couch inside the home. It should be appreciated that instead of or in addition to registering devices 66, the smart-home environment 30 makes inferences about which individuals live in the home and are therefore occupants and which devices 66 are associated with those individuals. As such, the smart-home environment "learns" who is an occupant and permits the devices 66 associated with those individuals to control the smart devices of the home. - In some instances, guests desire to control the smart devices. For example, the smart-home environment may receive communication from an unregistered mobile device of an individual inside of the home, where said individual is not recognized as an occupant of the home. Further, for example, a smart-home environment may receive communication from a mobile device of an individual who is known to be or who is registered as a guest.
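- As a concrete, purely illustrative sketch of the registration and permission model described above, the following Python fragment shows a central server granting control rights to registered occupant devices and withholding them from unknown (guest) devices. The class and method names are assumptions used only for illustration; no particular server implementation is implied by the disclosure.

```python
# Hypothetical sketch: occupant-device registration at a central server.
class CentralServer:
    def __init__(self):
        self.registered = {}              # device_id -> registration record

    def register_device(self, device_id, occupant, home_id):
        # A real system would authenticate the occupant and the home at this step.
        self.registered[device_id] = {"occupant": occupant, "home": home_id}

    def may_control(self, device_id, home_id):
        """Return True only if the device is registered to an occupant of this home."""
        entry = self.registered.get(device_id)
        return bool(entry and entry["home"] == home_id)

server = CentralServer()
server.register_device("phone-66", occupant="Alice", home_id="home-30")
print(server.may_control("phone-66", "home-30"))   # True: registered occupant device
print(server.may_control("phone-99", "home-30"))   # False: unregistered (guest) device
```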
- According to embodiments, a guest-layer of controls can be provided to guests of the smart-
home environment 30. The guest layer of controls gives guests access to basic controls (e.g., a judiciously selected subset of features of the smart devices), such as temperature adjustments, but it locks out other functionalities. The guest layer of controls can be thought of as a "safe sandbox" in which guests have limited controls, but they do not have access to more advanced controls that could fundamentally alter, undermine, damage, or otherwise impair the occupant-desired operation of the smart devices. For example, the guest layer of controls will not permit the guest to adjust the heat-pump lockout temperature. - A use case example of this is when a guest is in a smart home: the guest could walk up to the thermostat and turn the dial manually, but the guest may not want to walk around the house "hunting" for the thermostat, especially at night while the home is dark and others are sleeping. Further, the guest may not want to go through the hassle of downloading the necessary application to their device for remotely controlling the thermostat. In fact, the guest may not have the home owner's login credentials, etc., and therefore cannot remotely control the thermostat via such an application. Accordingly, in embodiments of the invention, the guest can open a mobile browser on their mobile device, type a keyword, such as "NEST", into the URL field and tap "Go" or "Search", etc. In response, the device presents the guest with a user interface which allows the guest to adjust the target temperature within a limited range, such as between 65 and 80 degrees Fahrenheit. As discussed, the user interface provides a guest layer of controls that are limited to basic functions. The guest cannot change the target humidity or modes, nor view the energy history.
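- The "safe sandbox" behavior described above can be sketched as follows. This Python fragment is a minimal illustration under assumed names and limits (the 65-80 degree range comes from the example above; the blocked-setting names are hypothetical): guest commands outside the sandbox are ignored, and temperature requests are clamped to the permitted range.

```python
# Hypothetical sketch of a "guest layer" of thermostat controls.
GUEST_MIN_F, GUEST_MAX_F = 65, 80          # limited range offered to guests
GUEST_BLOCKED = {"target_humidity", "mode", "heat_pump_lockout", "energy_history"}

def apply_guest_command(setting, value, current_state):
    """Apply a guest command only if it falls inside the guest 'safe sandbox'."""
    if setting in GUEST_BLOCKED:
        return current_state                          # advanced controls are locked out
    if setting == "target_temperature_f":
        clamped = max(GUEST_MIN_F, min(GUEST_MAX_F, value))
        return {**current_state, "target_temperature_f": clamped}
    return current_state                              # unknown settings are left untouched

state = {"target_temperature_f": 72, "mode": "heat"}
state = apply_guest_command("target_temperature_f", 90, state)   # clamped to 80
state = apply_guest_command("mode", "off", state)                # blocked for guests
print(state)   # {'target_temperature_f': 80, 'mode': 'heat'}
```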
- According to embodiments, to enable guests to access the user interface that provides the guest layer of controls, a local webserver is provided that is accessible on the local area network (LAN). It does not require a password, because physical presence inside the home is established reliably enough by the guest's presence on the LAN. In some embodiments, during installation of the smart device, such as the smart thermostat, the home owner is asked if they want to enable a Local Web App (LWA) on the smart device. Business owners will likely say no; home owners will likely say yes. When the LWA option is selected, the smart device broadcasts to the LAN that the above-referenced keyword, such as "NEST", is now a host alias for its local web server. Thus, no matter whose home a guest goes to, that same keyword (e.g., "NEST") is always the URL the guest uses to access the LWA, provided the smart device is purchased from the same manufacturer. Further, according to embodiments, if there is more than one smart device on the LAN, the second and subsequent smart devices do not offer to set up another LWA. Instead, they register themselves as target candidates with the master LWA. In this case, the LWA user would be asked which smart device they want to change the temperature on before getting the simplified user interface for the particular smart device they choose.
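- A minimal sketch of such a Local Web App is shown below, assuming a Python standard-library HTTP server. Advertising the keyword host alias on the LAN (for example, via mDNS/DNS-SD) and the actual temperature-setting endpoint are assumed and not shown; the device list and port are hypothetical.

```python
# Hypothetical sketch of a Local Web App (LWA) serving a guest page on the LAN.
# The "NEST"-style host alias advertisement (e.g., via mDNS) is assumed, not shown.
from http.server import BaseHTTPRequestHandler, HTTPServer

TARGETS = ["Living Room Thermostat", "Bedroom Thermostat"]   # registered with the master LWA

class GuestHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # No password: presence on the LAN stands in for physical presence in the home.
        body = "<h1>Pick a thermostat</h1>" + "".join(
            f"<p><a href='/set?device={i}'>{name}</a></p>" for i, name in enumerate(TARGETS)
        )
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    # Port 8080 is used here so the sketch can run without elevated privileges.
    HTTPServer(("0.0.0.0", 8080), GuestHandler).serve_forever()
```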
- According to embodiments, a guest layer of controls may also be provided to users by means other than a
device 66. For example, the smart device, such as the smart thermostat, may be equipped with walkup-identification technology (e.g., face recognition, RFID, ultrasonic sensors) that “fingerprints” or creates a “signature” for the occupants of the home. The walkup-identification technology can be the same as or similar to the fingerprinting and signature creating techniques described in other sections of this application. In operation, when a person who does not live in the home or is otherwise not registered with the smart home or whose fingerprint or signature is not recognized by the smart home “walks up” to a smart device, the smart device provides the guest with the guest layer of controls, rather than full controls. - As described below, the
smart thermostat 46 and other smart devices "learn" by observing occupant behavior. For example, the smart thermostat learns occupants' preferred temperature set-points for mornings and evenings, when the occupants are asleep or awake, and when the occupants are typically away or at home. According to embodiments, when a guest controls the smart devices, such as the smart thermostat, the smart devices do not "learn" from the guest. This prevents the guest's adjustments and controls from affecting the learned preferences of the occupants. - According to some embodiments, a smart television remote control is provided. The smart remote control recognizes occupants by thumbprint, visual identification, RFID, etc., and it recognizes a user as a guest or as someone belonging to a particular class having limited control and access (e.g., child). Upon recognizing the user as a guest or someone belonging to a limited class, the smart remote control only permits that user to view a subset of channels and to make limited adjustments to the settings of the television and other devices. For example, a guest cannot adjust the digital video recorder (DVR) settings, and a child is limited to viewing child-appropriate programming.
- According to some embodiments, similar controls are provided for other instruments, utilities, and devices in the house. For example, sinks, bathtubs, and showers can be controlled by smart spigots that recognize users as guests or as children and therefore prevent water from exceeding a designated temperature that is considered safe.
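- The class-based restrictions described in the preceding paragraphs can be sketched as follows. This Python fragment is purely illustrative: the control layers, thresholds, and signature strings are assumptions used to show how a walkup-identification result might select an occupant, guest, or child layer that limits channels, DVR access, or the maximum water temperature of a smart spigot.

```python
# Hypothetical sketch: selecting a control layer based on walkup identification.
CONTROL_LAYERS = {
    "occupant": {"channels": "all",   "dvr": True,  "max_water_f": 120},
    "guest":    {"channels": "basic", "dvr": False, "max_water_f": 110},
    "child":    {"channels": "kids",  "dvr": False, "max_water_f": 100},
}

def classify_user(signature, known_occupants):
    """Map a sensed fingerprint/signature to a user class (default: guest)."""
    if signature in known_occupants:
        return "occupant"
    if signature.startswith("child:"):        # e.g., a registered child profile
        return "child"
    return "guest"

def controls_for(signature, known_occupants):
    return CONTROL_LAYERS[classify_user(signature, known_occupants)]

occupants = {"rfid:alice", "face:bob"}
print(controls_for("rfid:alice", occupants))    # full controls for a recognized occupant
print(controls_for("face:visitor", occupants))  # guest layer: basic channels, capped water temp
```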
- In some embodiments, in addition to containing processing and sensing capabilities, each of the devices described above is capable of data communications and information sharing with any other of the smart devices, as well as with any central server or cloud-computing system or other network-connected device. - According to embodiments, all or some of the smart devices can serve as wireless or wired repeaters. For example, a first one of the smart devices can communicate with a second one of the smart devices via a
wireless router 60. The smart devices can further communicate with each other via a connection to a network, such as the Internet 62. Through the Internet 62, the smart devices can communicate with a central server or a cloud-computing system 64. The central server or cloud-computing system 64 can be associated with a manufacturer, support entity, or service provider associated with the device. For one embodiment, a user may be able to contact customer support using a device itself rather than needing to use other communication means such as a telephone or Internet-connected computer. Further, software updates can be automatically sent from the central server or cloud-computing system 64 to devices (e.g., when available, when purchased, or at routine intervals). - According to embodiments, the smart devices combine to create a mesh network of spokesman and low-power nodes in the smart-
home environment 30, where some of the smart devices are "spokesman" nodes and others are "low-powered" nodes. Some of the smart devices in the smart-home environment 30 are battery powered, while others have a regular and reliable power source, such as by connecting to wiring (e.g., to 120V line voltage wires) behind the walls 40 of the smart-home environment. The smart devices that have a regular and reliable power source are referred to as "spokesman" nodes. These nodes are equipped with the capability of using any wireless protocol or manner to facilitate bidirectional communication with any of a variety of other devices in the smart-home environment 30 as well as with the central server or cloud-computing system 64. On the other hand, the devices that are battery powered are referred to as "low-power" nodes. These nodes tend to be smaller than spokesman nodes and can only communicate using wireless protocols that require very little power, such as Zigbee, 6LoWPAN, etc. Further, some, but not all, low-power nodes are incapable of bidirectional communication. These low-power nodes send messages, but they are unable to "listen". Thus, other devices in the smart-home environment 30, such as the spokesman nodes, cannot send information to these low-power nodes.
home environment 30. Individual low-power nodes in the smart-home environment regularly send out messages regarding what they are sensing, and the other low-powered nodes in the smart-home environment—in addition to sending out their own messages—repeat the messages, thereby causing the messages to travel from node to node (i.e., device to device) throughout the smart-home environment 30. The spokesman nodes in the smart-home environment 30 are able to "drop down" to low-powered communication protocols to receive these messages, translate the messages to other communication protocols, and send the translated messages to other spokesman nodes and/or the central server or cloud-computing system 64. Thus, the low-powered nodes using low-power communication protocols are able to send messages across the entire smart-home environment 30 as well as over the Internet 62 to the central server or cloud-computing system 64. According to embodiments, the mesh network enables the central server or cloud-computing system 64 to regularly receive data from all of the smart devices in the home, make inferences based on the data, and send commands back to one of the smart devices to accomplish some of the smart-home objectives described herein. - As described, the spokesman nodes and some of the low-powered nodes are capable of "listening". Accordingly, users, other devices, and the central server or cloud-
computing system 64 can communicate controls to the low-powered nodes. For example, a user can use the portable electronic device (e.g., a smartphone) 66 to send commands over the Internet 62 to the central server or cloud-computing system 64, which then relays the commands to the spokesman nodes in the smart-home environment 30. The spokesman nodes drop down to a low-power protocol to communicate the commands to the low-power nodes throughout the smart-home environment, as well as to other spokesman nodes that did not receive the commands directly from the central server or cloud-computing system 64.
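- By way of a non-limiting illustration of this relaying behavior, the following Python sketch shows a spokesman node forwarding a server command onto a low-power mesh and translating a mesh message back up to the central server. The class and method names and the stub radio/cloud objects are hypothetical; real Zigbee/6LoWPAN and Internet protocol stacks are not shown.

```python
# Hypothetical sketch of a spokesman node relaying traffic between the central
# server (IP network) and low-power nodes (e.g., a Zigbee/6LoWPAN mesh).
class SpokesmanNode:
    def __init__(self, low_power_radio, cloud_link):
        self.radio = low_power_radio      # sends/receives on the low-power mesh
        self.cloud = cloud_link           # talks to the central server over the Internet

    def on_command_from_server(self, command):
        """Relay a command downward by 'dropping down' to the low-power protocol."""
        self.radio.broadcast(command)     # reaches low-power nodes and other spokesman nodes

    def on_mesh_message(self, message):
        """Translate a low-power mesh message and forward it to the central server."""
        self.cloud.send({"source": message["node"], "reading": message["payload"]})

class FakeRadio:
    def broadcast(self, frame): print("mesh <-", frame)

class FakeCloud:
    def send(self, report): print("cloud <-", report)

node = SpokesmanNode(FakeRadio(), FakeCloud())
node.on_command_from_server({"target": "night-light-65", "action": "light_on"})
node.on_mesh_message({"node": "night-light-65", "payload": {"occupancy": True}})
```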
- An example of a low-power node is a smart night light 65. In addition to housing a light source, the smart night light 65 houses an occupancy sensor, such as an ultrasonic or passive IR sensor, and an ambient light sensor, such as a photoresistor or a single-pixel sensor that measures light in the room. In some embodiments, the smart night light 65 is configured to activate the light source when its ambient light sensor detects that the room is dark and when its occupancy sensor detects that someone is in the room. In other embodiments, the smart night light 65 is simply configured to activate the light source when its ambient light sensor detects that the room is dark. Further, according to embodiments, the smart night light 65 includes a low-power wireless communication chip (e.g., ZigBee chip) that regularly sends out messages regarding the occupancy of the room and the amount of light in the room, including instantaneous messages coincident with the occupancy sensor detecting the presence of a person in the room. As mentioned above, these messages may be sent wirelessly, using the mesh network, from node to node (i.e., smart device to smart device) within the smart-home environment 30 as well as over the Internet 62 to the central server or cloud-computing system 64. - Other examples of low-powered nodes include battery-operated versions of the
smart hazard detectors 50. These smart hazard detectors 50 are often located in an area without access to constant and reliable power and, as discussed in detail below, may include any number and type of sensors, such as smoke/fire/heat sensors, carbon monoxide/dioxide sensors, occupancy/motion sensors, ambient light sensors, temperature sensors, humidity sensors, and the like. Furthermore, smart hazard detectors 50 can send messages that correspond to each of the respective sensors to the other devices and the central server or cloud-computing system 64, such as by using the mesh network as described above. - Examples of spokesman nodes include
smart thermostats 46, smart doorbells 52, smart wall switches 54, and smart wall plugs 56. These devices are typically located near, and connected to, a reliable source of power and can therefore act as spokesman nodes capable of the bidirectional communication described above. - In some embodiments, these low-powered and spokesman nodes (e.g.,
the devices described above) can serve as part of an alarm system in the smart-home environment 30, in that the alarm could be triggered upon receiving an occupancy, motion, heat, sound, etc. message from one or more of the low-powered and spokesman nodes in the mesh network. For example, upon receiving a message from a smart night light 65 indicating the presence of a person, the central server or cloud-computing system 64 or some other device could trigger an alarm, provided the alarm is armed at the time of detection. Thus, the alarm system could be enhanced by various low-powered and spokesman nodes located throughout the smart-home environment 30. In this example, a user could enhance the security of the smart-home environment 30 by buying and installing extra smart night lights 65. However, in a scenario where a perpetrator uses a radio transceiver to jam the wireless network, the devices 10 may be incapable of communicating with each other. Therefore, as discussed in detail below, the present techniques provide network communication jamming attack detection and notification solutions to such a problem.
computing system 64 or some other device activates and deactivates the smart wall switches 54 to automatically provide light as the person moves from room to room in the smart-home environment 30. Further, users may provide pre-configuration information that indicates which smart wall plugs 56 provide power to lamps and other light sources, such as the smart night light 65. Alternatively, this mapping of light sources to wall plugs 56 can be done automatically (e.g., a smart wall plug 56 detects when a light source is plugged into it and sends a corresponding message to the central server or cloud-computing system 64). Using this mapping information in combination with messages that indicate which rooms are occupied, the central server or cloud-computing system 64 or some other device activates and deactivates the smart wall plugs 56 that provide power to lamps and other light sources so as to track the person's movement and provide light as the person moves from room to room. - In some embodiments, the mesh network of low-powered and spokesman nodes can be used to provide exit lighting in the event of an emergency. In some instances, to facilitate this, users provide pre-configuration information that indicates exit routes in the smart-
home environment 30. For example, for each room in the house, the user provides a map of the best exit route. It should be appreciated that instead of a user providing this information, the central server or cloud-computing system 64 or some other device could automatically determine the routes using uploaded maps, diagrams, or architectural drawings of the house, as well as a map generated based on positional information obtained from the nodes of the mesh network (e.g., positional information from the devices is used to construct a map of the house). In operation, when an alarm is activated (e.g., when one or more of the smart hazard detectors 50 detects smoke and activates an alarm), the central server or cloud-computing system 64 or some other device uses occupancy information obtained from the low-powered and spokesman nodes to determine which rooms are occupied and then turns on lights (e.g., night lights 65, wall switches 54, wall plugs 56 that power lamps, etc.) along the exit routes from the occupied rooms so as to provide emergency exit lighting. - Further included and illustrated in the smart-
home environment 30 of FIG. 2 are service robots 69 each configured to carry out, in an autonomous manner, any of a variety of household tasks. For some embodiments, the service robots 69 can be respectively configured to perform floor sweeping, floor washing, etc. in a manner similar to that of known commercially available devices such as the ROOMBA™ and SCOOBA™ products sold by iRobot, Inc. of Bedford, Mass. Tasks such as floor sweeping and floor washing can be considered as "away" or "while-away" tasks for purposes of the instant description, as it is generally more desirable for these tasks to be performed when the occupants are not present. For other embodiments, one or more of the service robots 69 are configured to perform tasks such as playing music for an occupant, serving as a localized thermostat for an occupant, serving as a localized air monitor/purifier for an occupant, serving as a localized baby monitor, serving as a localized hazard detector for an occupant, and so forth, it being generally more desirable for such tasks to be carried out in the immediate presence of the human occupant. For purposes of the instant description, such tasks can be considered as "human-facing" or "human-centric" tasks. - When serving as a localized thermostat for an occupant, a particular one of the
service robots 69 can be considered to be facilitating what can be called a “personal comfort-area network” for the occupant, with the objective being to keep the occupant's immediate space at a comfortable temperature wherever that occupant may be located in the home. This can be contrasted with conventional wall-mounted room thermostats, which have the more attenuated objective of keeping a statically-defined structural space at a comfortable temperature. According to one embodiment, the localized-thermostat service robot 69 is configured to move itself into the immediate presence (e.g., within five feet) of a particular occupant who has settled into a particular location in the home (e.g. in the dining room to eat their breakfast and read the news). The localized-thermostat service robot 69 includes a temperature sensor, a processor, and wireless communication components configured such that control communications with the HVAC system, either directly or through a wall-mounted wirelessly communicating thermostat coupled to the HVAC system, are maintained and such that the temperature in the immediate vicinity of the occupant is maintained at their desired level. If the occupant then moves and settles into another location (e.g. to the living room couch to watch television), the localized-thermostat service robot 69 proceeds to move and park itself next to the couch and keep that particular immediate space at a comfortable temperature. - Technologies by which the localized-thermostat service robot 69 (and/or the larger smart-home system of
FIG. 2 ) can identify and locate the occupant whose personal-area space is to be kept at a comfortable temperature can include, but are not limited to, RFID sensing (e.g., person having an RFID bracelet, RFID necklace, or RFID key fob), synthetic vision techniques (e.g., video cameras and face recognition processors), audio techniques (e.g., voice, sound pattern, vibration pattern recognition), ultrasound sensing/imaging techniques, and infrared or near-field communication (NFC) techniques (e.g., person wearing an infrared or NFC-capable smartphone), along with rules-based inference engines or artificial intelligence techniques that draw useful conclusions from the sensed information (e.g., if there is only a single occupant present in the home, then that is the person whose immediate space should be kept at a comfortable temperature, and the selection of the desired comfortable temperature should correspond to that occupant's particular stored profile). - When serving as a localized air monitor/purifier for an occupant, a
particular service robot 69 can be considered to be facilitating what can be called a "personal health-area network" for the occupant, with the objective being to keep the air quality in the occupant's immediate space at healthy levels. Alternatively or in conjunction therewith, other health-related functions can be provided, such as monitoring the temperature or heart rate of the occupant (e.g., using finely remote sensors, near-field communication with on-person monitors, etc.). When serving as a localized hazard detector for an occupant, a particular service robot 69 can be considered to be facilitating what can be called a "personal safety-area network" for the occupant, with the objective being to ensure there is no excessive carbon monoxide, smoke, fire, etc., in the immediate space of the occupant. Methods analogous to those described above for personal comfort-area networks in terms of occupant identifying and tracking are likewise applicable for personal health-area network and personal safety-area network embodiments.
service robots 69 is further enhanced by logical integration with other smart sensors in the home according to rules-based inferencing techniques or artificial intelligence techniques for achieving better performance of those human-facing functionalities and/or for achieving those goals in energy-conserving or other resource-conserving ways. Thus, for one embodiment relating to personal health-area networks, the air monitor/purifier service robot 69 can be configured to detect whether a household pet is moving toward the currently settled location of the occupant (e.g., using on-board sensors and/or by data communications with other smart-home sensors along with rules-based inferencing/artificial intelligence techniques), and if so, the air purifying rate is immediately increased in preparation for the arrival of more airborne pet dander. For another embodiment relating to personal safety-area networks, the hazard-detector service robot 69 can be advised by other smart-home sensors that the temperature and humidity levels are rising in the kitchen, which is near the occupant's current dining room location, and responsive to this advisory the hazard-detector service robot 69 will temporarily raise a hazard detection threshold, such as a smoke detection threshold, under an inference that any small increases in ambient smoke levels will most likely be due to cooking activity and not due to a genuinely hazardous condition.
distinct service robots 69 having respective dedicated ones of such functionalities, by a single service robot 69 having an integration of two or more different ones of such functionalities, and/or any combinations thereof (including the ability for a single service robot 69 to have both "away" and "human-facing" functionalities) without departing from the scope of the present teachings. Electrical power can be provided by virtue of rechargeable batteries or other rechargeable methods, such as an out-of-the-way docking station to which the service robots 69 will automatically dock and recharge their batteries (if needed) during periods of inactivity. Preferably, each service robot 69 includes wireless communication components that facilitate data communications with one or more of the other wirelessly communicating smart-home sensors of FIG. 2 and/or with one or more other service robots 69 (e.g., using Wi-Fi, Zigbee, Z-Wave, 6LoWPAN, etc.), and one or more of the smart-home devices 10 can be in communication with a remote server over the Internet. Alternatively or in conjunction therewith, each service robot 69 can be configured to communicate directly with a remote server by virtue of cellular telephone communications, satellite communications, 3G/4G network data communications, or other direct communication method.
service robots 69 that perform “away” functionalities or that otherwise are desirable to be active when the home is unoccupied (hereinafter “away-service robots”). Included in the embodiments are methods and systems for ensuring that home security systems, intrusion detection systems, and/or occupancy-sensitive environmental control systems (for example, occupancy-sensitive automated setback thermostats that enter into a lower-energy-using condition when the home is unoccupied) are not erroneously triggered by the away-service robots. - Provided according to one embodiment is a home automation and security system (e.g., as shown in
FIG. 2 ) that is remotely monitored by a monitoring service by virtue of automated systems (e.g., cloud-based servers or other central servers, hereinafter “central server”) that are in data communications with one or more network-connected elements of the home automation and security system. The away-service robots are configured to be in operative data communication with the central server, and are configured such that they remain in a non-away-service state (e.g., a dormant state at their docking station) unless permission is granted from the central server (e.g., by virtue of an “away-service-OK” message from the central server) to commence their away-service activities. An away-state determination made by the system, which can be arrived at (i) exclusively by local on-premises smart device(s) based on occupancy sensor data, (ii) exclusively by the central server based on received occupancy sensor data and/or based on received proximity-related information such as GPS coordinates from user smartphones or automobiles, or (iii) any combination of (i) and (ii) can then trigger the granting of away-service permission to the away-service robots by the central server. During the course of the away-service robot activity, during which the away-service robots may continuously detect and send their in-home location coordinates to the central server, the central server can readily filter signals from the occupancy sensing devices to distinguish between the away-service robot activity versus any unexpected intrusion activity, thereby avoiding a false intrusion alarm condition while also ensuring that the home is secure. Alternatively or in conjunction therewith, the central server may provide filtering data (such as an expected occupancy-sensing profile triggered by the away-service robots) to the occupancy sensing nodes or associated processing nodes of the smart home, such that the filtering is performed at the local level. Although somewhat less secure, it would also be within the scope of the present teachings for the central server to temporarily disable the occupancy sensing equipment for the duration of the away-service robot activity. - According to another embodiment, functionality similar to that of the central server in the above example can be performed by an on-site computing device such as a dedicated server computer, a “master” home automation console or panel, or as an adjunct function of one or more of the smart-home devices of
FIG. 2 . In such an embodiment, there would be no dependency on a remote service provider to provide the “away-service-OK” permission to the away-service robots and the false-alarm-avoidance filtering service or filter information for the sensed intrusion detection signals. - According to other embodiments, there are provided methods and systems for implementing away-service robot functionality while avoiding false home security alarms and false occupancy-sensitive environmental controls without the requirement of a single overall event orchestrator. For purposes of the simplicity in the present disclosure, the home security systems and/or occupancy-sensitive environmental controls that would be triggered by the motion, noise, vibrations, or other disturbances of the away-service robot activity are referenced simply as “activity sensing systems,” and when so triggered will yield a “disturbance-detected” outcome representative of the false trigger (for example, an alarm message to a security service, or an “arrival” determination for an automated setback thermostat that causes the home to be heated or cooled to a more comfortable “occupied” setpoint temperature). According to one embodiment, the away-service robots are configured to emit a standard ultrasonic sound throughout the course of their away-service activity, the activity sensing systems are configured to detect that standard ultrasonic sound, and the activity sensing systems are further configured such that no disturbance-detected outcome will occur for as long as that standard ultrasonic sound is detected. For other embodiments, the away-service robots are configured to emit a standard notification signal throughout the course of their away-service activity, the activity sensing systems are configured to detect that standard notification signal, and the activity sensing systems are further configured such that no disturbance-detected outcome will occur for as long as that standard notification signal is detected, wherein the standard notification signal comprises one or more of: an optical notifying signal; an audible notifying signal; an infrared notifying signal; an infrasonic notifying signal; a wirelessly transmitted data notification signal (e.g., an IP broadcast, multicast, or unicast notification signal, or a notification message sent in an TCP/IP two-way communication session).
- According to some embodiments, the notification signals sent by the away-service robots to the activity sensing systems are authenticated and encrypted such that the notifications cannot be learned and replicated by a potential burglar. Any of a variety of known encryption/authentication schemes can be used to ensure such data security including, but not limited to, methods involving third party data security services or certificate authorities. For some embodiments, a permission request-response model can be used, wherein any particular away-service robot requests permission from each activity sensing system in the home when it is ready to perform its away-service tasks, and does not initiate such activity until receiving a “yes” or “permission granted” message from each activity sensing system (or from a single activity sensing system serving as a “spokesman” for all of the activity sensing systems). One advantage of the described embodiments that do not require a central event orchestrator is that there can (optionally) be more of an arms-length relationship between the supplier(s) of the home security/environmental control equipment, on the one hand, and the supplier(s) of the away-service robot(s), on the other hand, as it is only required that there is the described standard one-way notification protocol or the described standard two-way request/permission protocol to be agreed upon by the respective suppliers.
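- A minimal sketch of the two-way request/permission model described above follows. The stub classes, message strings, and the unanimity rule are illustrative assumptions; a real deployment would add the authentication and encryption noted above.

```python
# Hypothetical sketch: an away-service robot asks every activity sensing system
# (or a single "spokesman" system) for permission before starting its tasks.
class SensingSystemStub:
    def __init__(self, name, grants):
        self.name, self.grants = name, grants

    def request_permission(self, robot_id):
        # A real implementation would authenticate the request (e.g., signed messages).
        return "permission granted" if self.grants else "denied"

def may_start_away_service(robot_id, sensing_systems):
    """Start only if every activity sensing system answers 'permission granted'."""
    return all(s.request_permission(robot_id) == "permission granted" for s in sensing_systems)

systems = [SensingSystemStub("security-panel", True), SensingSystemStub("setback-thermostat", True)]
print(may_start_away_service("vacuum-robot-1", systems))   # True: unanimous grant
systems.append(SensingSystemStub("entry-sensor", False))
print(may_start_away_service("vacuum-robot-1", systems))   # False: at least one denial
```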
- According to still other embodiments, the activity sensing systems are configured to detect sounds, vibrations, RF emissions, or other detectable environmental signals or “signatures” that are intrinsically associated with the away-service activity of each away-service robot, and are further configured such that no disturbance-detected outcome will occur for as long as that particular detectable signal or environmental “signature” is detected. By way of example, a particular kind of vacuum-cleaning away-service robot may emit a specific sound or RF signature. For one embodiment, the away-service environmental signatures for each of a plurality of known away-service robots are stored in the memory of the activity sensing systems based on empirically collected data, the environmental signatures being supplied with the activity sensing systems and periodically updated by a remote update server. For another embodiment, the activity sensing systems can be placed into a “training mode” for the particular home in which they are installed, wherein they “listen” and “learn” the particular environmental signatures of the away-service robots for that home during that training session, and thereafter will suppress disturbance-detected outcomes for intervals in which those environmental signatures are heard.
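- The "training mode" variant above lends itself to a brief sketch: record each robot's environmental signature once, then suppress disturbance outcomes whenever a sufficiently similar signature is heard. The feature vectors, the cosine-similarity measure, and the threshold below are assumptions chosen only to illustrate the idea.

```python
# Hypothetical sketch of training-mode signature learning and later matching.
def similarity(a, b):
    """Crude cosine similarity between two equal-length feature vectors (0..1)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return dot / norm if norm else 0.0

class SignatureStore:
    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.signatures = {}                      # robot name -> learned feature vector

    def train(self, robot_name, feature_vector):
        self.signatures[robot_name] = feature_vector

    def matches_known_robot(self, observed):
        return any(similarity(observed, sig) >= self.threshold for sig in self.signatures.values())

store = SignatureStore()
store.train("vacuum-robot", [0.9, 0.1, 0.4, 0.0])            # learned during the training session
print(store.matches_known_robot([0.88, 0.12, 0.39, 0.02]))   # True: suppress disturbance outcome
print(store.matches_known_robot([0.1, 0.9, 0.0, 0.5]))       # False: treat as a real disturbance
```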
- For still another embodiment, which is particularly useful when the activity sensing system is associated with occupancy-sensitive environmental control equipment rather than a home security system, the activity sensing system is configured to automatically learn the environmental signatures for the away-service robots by virtue of automatically performing correlations over time between detected environmental signatures and detected occupancy activity. By way of example, for one embodiment an intelligent automated nonoccupancy-triggered setback thermostat such as the Nest Learning Thermostat can be configured to constantly monitor for audible and RF activity as well as to perform infrared-based occupancy detection. In particular view of the fact that the environmental signature of the away-service robot will remain relatively constant from event to event, and in view of the fact that the away-service events will likely either (a) themselves be triggered by some sort of nonoccupancy condition as measured by the away-service robots themselves, or (b) occur at regular times of day, there will be patterns in the collected data by which the events themselves will become apparent and for which the environmental signatures can be readily learned. Generally speaking, for this automatic-learning embodiment in which the environmental signatures of the away-service robots are automatically learned without requiring user interaction, it is more preferable that a certain number of false triggers be tolerable over the course of the learning process. Accordingly, this automatic-learning embodiment is more preferable for application in occupancy-sensitive environmental control equipment (such as an automated setback thermostat) rather than home security systems for the reason that a few false occupancy determinations may cause a few instances of unnecessary heating or cooling, but will not otherwise have any serious consequences, whereas false home security alarms may have more serious consequences.
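- The automatic-learning variant can likewise be sketched, under the simplifying assumption that a signature which repeatedly coincides with a nonoccupancy determination is eventually attributed to an away-service robot. The event count and data structures below are hypothetical; they merely illustrate the correlation-over-time idea, including its tolerance for a few early false triggers.

```python
# Hypothetical sketch of the automatic-learning variant: signatures that repeatedly
# coincide with nonoccupancy are eventually treated as away-service robot activity.
from collections import defaultdict

class AutoLearner:
    def __init__(self, min_events=3):
        self.min_events = min_events
        self.unoccupied_counts = defaultdict(int)   # signature id -> times seen while unoccupied

    def observe(self, signature_id, occupancy_detected):
        """Record one co-observation of an environmental signature and occupancy state."""
        if not occupancy_detected:
            self.unoccupied_counts[signature_id] += 1

    def is_learned_robot(self, signature_id):
        # A few false triggers are tolerated while the count accumulates, which is why this
        # approach suits setback thermostats better than security alarms.
        return self.unoccupied_counts[signature_id] >= self.min_events

learner = AutoLearner()
for _ in range(3):
    learner.observe("rf-sig-A", occupancy_detected=False)   # regular daytime cleaning runs
print(learner.is_learned_robot("rf-sig-A"))   # True: suppress future false "arrival" events
print(learner.is_learned_robot("rf-sig-B"))   # False: still unlearned
```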
- According to embodiments, technologies including the sensors of the smart devices located in the mesh network of the smart-home environment in combination with rules-based inference engines or artificial intelligence provided at the central server or cloud-
computing system 64 are used to provide a personal "smart alarm clock" for individual occupants of the home. For example, user-occupants can communicate with the central server or cloud-computing system 64 via their mobile devices 66 to access an interface for the smart alarm clock. There, occupants can turn on their "smart alarm clock" and input a wake time for the next day and/or for additional days. In some embodiments, the occupant may have the option of setting a specific wake time for each day of the week, as well as the option of setting some or all of the inputted wake times to "repeat". Artificial intelligence will be used to consider the occupant's response to these alarms when they go off and make inferences about the user's preferred sleep patterns over time. - According to embodiments, the smart device in the smart-
home environment 30 that happens to be closest to the occupant when the occupant falls asleep will be the device that transmits messages regarding when the occupant stopped moving, from which the central server or cloud-computing system 64 will make inferences about where and when the occupant prefers to sleep. This closest smart device will also be the device that sounds the alarm to wake the occupant. In this manner, the "smart alarm clock" will follow the occupant throughout the house, by tracking the individual occupants based on their "unique signature", which is determined based on data obtained from sensors located in the smart devices. For example, the sensors include ultrasonic sensors, passive IR sensors, and the like. The unique signature is based on a combination of walking gait, patterns of movement, voice, height, size, etc. It should be appreciated that facial recognition may also be used. - According to an embodiment, the wake times associated with the "smart alarm clock" are used by the
smart thermostat 46 to control the HVAC in an efficient manner so as to pre-heat or cool the house to the occupant's desired “sleeping” and “awake” temperature settings. The preferred settings can be learned over time, such as by observing which temperature the occupant sets the thermostat to before going to sleep and which temperature the occupant sets the thermostat to upon waking up. - According to an embodiment, a device is positioned proximate to the occupant's bed, such as on an adjacent nightstand, and collects data as the occupant sleeps using noise sensors, motion sensors (e.g., ultrasonic, IR, and optical), etc. Data may be obtained by the other smart devices in the room as well. Such data may include the occupant's breathing patterns, heart rate, movement, etc. Inferences are made based on this data in combination with data that indicates when the occupant actually wakes up. For example, if—on a regular basis—the occupant's heart rate, breathing, and moving all increase by 5% to 10%, twenty to thirty minutes before the occupant wakes up each morning, then predictions can be made regarding when the occupant is going to wake. Other devices in the home can use these predictions to provide other smart-home objectives, such as adjusting the
smart thermostat 46 so as to pre-heat or cool the home to the occupant's desired setting before the occupant wakes up. Further, these predictions can be used to set the “smart alarm clock” for the occupant, to turn on lights, etc. - According to embodiments, technologies including the sensors of the smart devices located throughout the smart-home environment in combination with rules-based inference engines or artificial intelligence provided at the central server or cloud-
computing system 64 are used to detect or monitor the progress of Alzheimer's Disease. For example, the unique signatures of the occupants are used to track the individual occupants' movement throughout the smart-home environment 30. This data can be aggregated and analyzed to identify patterns indicative of Alzheimer's. Oftentimes, individuals with Alzheimer's have distinctive patterns of migration in their homes. For example, a person will walk to the kitchen and stand there for a while, then to the living room and stand there for a while, and then back to the kitchen. This pattern will take about thirty minutes, and then the person will repeat the pattern. According to embodiments, the remote servers or cloud-computing architectures 64 analyze the person's migration data collected by the mesh network of the smart-home environment to identify such patterns. - In addition,
FIG. 3 illustrates an embodiment of an extensible devices and services platform 80 that can be concentrated at a single server or distributed among several different computing entities without limitation with respect to the smart-home environment 30. The extensible devices and services platform 80 may include a processing engine 86, which may include engines that receive data from devices of smart-home environments (e.g., via the Internet or a hubbed network), index the data, analyze the data, and/or generate statistics based on the analysis or as part of the analysis. The analyzed data can be stored as derived home data 88. - Results of the analysis or statistics can thereafter be transmitted back to the device that provided home data used to derive the results, to other devices, to a server providing a web page to a user of the device, or to other non-device entities. For example, use statistics, use statistics relative to use of other devices, use patterns, and/or statistics summarizing sensor readings can be generated by the
processing engine 86 and transmitted. The results or statistics can be provided via the Internet 62. In this manner, the processing engine 86 can be configured and programmed to derive a variety of useful information from the home data 82. A single server can include one or more engines. - The derived data can be highly beneficial at a variety of different granularities for a variety of useful purposes, ranging from explicit programmed control of the devices on a per-home, per-neighborhood, or per-region basis (for example, demand-response programs for electrical utilities), to the generation of inferential abstractions that can assist on a per-home basis (for example, an inference can be drawn that the homeowner has left for vacation and so security detection equipment can be put on heightened sensitivity), to the generation of statistics and associated inferential abstractions that can be used for government or charitable purposes. For example, processing
engine 86 can generate statistics about device usage across a population of devices and send the statistics to device users, service providers or other entities (e.g., that have requested or may have provided monetary compensation for the statistics). - According to some embodiments, the
home data 82, the derived home data 88, and/or other data can be used to create "automated neighborhood safety networks." For example, in the event the central server or cloud-computing architecture 64 receives data indicating that a particular home has been broken into, is experiencing a fire, or is experiencing some other type of emergency event, an alarm is sent to other smart homes in the "neighborhood." In some instances, the central server or cloud-computing architecture 64 automatically identifies smart homes within a radius of the home experiencing the emergency and sends an alarm to the identified homes. In such instances, the other homes in the "neighborhood" do not have to sign up for or register to be a part of a safety network, but instead are notified of an emergency based on their proximity to the location of the emergency. This creates robust and evolving neighborhood security watch networks, such that if one person's home is getting broken into, an alarm can be sent to nearby homes, such as by audio announcements via the smart devices located in those homes. It should be appreciated that this can be an opt-in service and that, in addition to or instead of the central server or cloud-computing architecture 64 selecting which homes to send alerts to, individuals can subscribe to participate in such networks and individuals can specify which homes they want to receive alerts from. This can include, for example, the homes of family members who live in different cities, such that individuals can receive alerts when their loved ones in other locations are experiencing an emergency. - According to some embodiments, sound, vibration, and/or motion sensing components of the smart devices are used to detect sound, vibration, and/or motion created by running water. Based on the detected sound, vibration, and/or motion, the central server or cloud-
computing architecture 64 makes inferences about water usage in the home and provides related services. For example, the central server or cloud-computing architecture 64 can run programs/algorithms that recognize what water sounds like and when it is running in the home. According to one embodiment, to map the various water sources of the home, upon detecting running water, the central server or cloud-computing architecture 64 sends a message to an occupant's mobile device asking if water is currently running or if water has been recently run in the home and, if so, which room and which water-consumption appliance (e.g., sink, shower, toilet, etc.) was the source of the water. This enables the central server or cloud-computing architecture 64 to determine the "signature" or "fingerprint" of each water source in the home. This is sometimes referred to herein as "audio fingerprinting water usage." - In one illustrative example, the central server or cloud-
computing architecture 64 creates a signature for the toilet in the master bathroom, and whenever that toilet is flushed, the central server or cloud-computing architecture 64 will know that the water usage at that time is associated with that toilet. Thus, the central server or cloud-computing architecture 64 can track the water usage of that toilet as well as each water-consumption appliance in the home. This information can be correlated to water bills or smart water meters so as to provide users with a breakdown of their water usage. - According to some embodiments, sound, vibration, and/or motion sensing components of the smart devices are used to detect sound, vibration, and/or motion created by mice and other rodents as well as by termites, cockroaches, and other insects (collectively referred to as "pests"). Based on the detected sound, vibration, and/or motion, the central server or cloud-
computing architecture 64 makes inferences about pest-detection in the home and provides related services. For example, the central server or cloud-computing architecture 64 can run programs/algorithms that recognize what certain pests sound like, how they move, and/or the vibration they create, individually and/or collectively. According to one embodiment, the central server or cloud-computing architecture 64 can determine the “signatures” of particular types of pests. - For example, in the event the central server or cloud-
computing architecture 64 detects sounds that may be associated with pests, it notifies the occupants of such sounds and suggests hiring a pest control company. If it is confirmed that pests are indeed present, the occupants provide input to the central server or cloud-computing architecture 64 confirming that its detection was correct, along with details regarding the identified pests, such as name, type, description, location, quantity, etc. This enables the central server or cloud-computing architecture 64 to "tune" itself for better detection and create "signatures" or "fingerprints" for specific types of pests. For example, the central server or cloud-computing architecture 64 can use the tuning as well as the signatures and fingerprints to detect pests in other homes, such as nearby homes that may be experiencing problems with the same pests. Further, for example, in the event that two or more homes in a "neighborhood" are experiencing problems with the same or similar types of pests, the central server or cloud-computing architecture 64 can make inferences that nearby homes may also have such problems or may be susceptible to having such problems, and it can send warning messages to those homes to help facilitate early detection and prevention. - In some embodiments, to encourage innovation and research and to increase products and services available to users, the devices and
services platform 80 exposes a range of application programming interfaces (APIs) 90 to third parties, such as charities 94, governmental entities 96 (e.g., the Food and Drug Administration or the Environmental Protection Agency), academic institutions 98 (e.g., university researchers), businesses 100 (e.g., providing device warranties or service to related equipment, targeting advertisements based on home data), utility companies 102, and other third parties. The APIs 90 are coupled to and permit third party systems to communicate with the central server or the cloud-computing system 64, including the services 84, the processing engine 86, the home data 82, and the derived home data 88. For example, the APIs 90 allow applications executed by the third parties to initiate specific data processing tasks that are executed by the central server or the cloud-computing system 64, as well as to receive dynamic updates to the home data 82 and the derived home data 88. - For example, third parties can develop programs and/or applications, such as web or mobile apps, that integrate with the central server or the cloud-
computing system 64 to provide services and information to users. Such programs and applications may be, for example, designed to help users reduce energy consumption, to preemptively service faulty equipment, to prepare for high service demands, to track past service performance, etc., or to perform any of a variety of beneficial functions or tasks now known or hereinafter developed. - According to some embodiments, third party applications make inferences from the
home data 82 and the derived home data 88; such inferences may include when occupants are home, when they are sleeping, when they are cooking, when they are in the den watching television, and when they shower. The answers to these questions may help third parties benefit consumers by providing them with interesting information, products, and services as well as by providing them with targeted advertisements. - In one example, a shipping company creates an application that makes inferences regarding when people are at home. The application uses the inferences to schedule deliveries for times when people will most likely be at home. The application can also build delivery routes around these scheduled times. This reduces the number of instances where the shipping company has to make multiple attempts to deliver packages, and it reduces the number of times consumers have to pick up their packages from the shipping company.
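- A third-party application of this kind can be sketched as follows. The fetch function, probabilities, and endpoint are stand-ins: no actual API schema of the platform is implied, and the values are illustrative only. The sketch simply picks the candidate delivery hour with the highest inferred at-home probability.

```python
# Hypothetical sketch of a third-party delivery scheduler built on home/away
# inferences exposed through the platform APIs 90.
def fetch_home_probability(home_id, hour):
    """Stand-in for an API call returning P(someone is home) for a given hour."""
    sample = {9: 0.2, 12: 0.3, 18: 0.8, 20: 0.9}      # illustrative values only
    return sample.get(hour, 0.5)

def best_delivery_window(home_id, candidate_hours, threshold=0.7):
    """Pick the candidate hour with the highest at-home probability above a threshold."""
    scored = [(fetch_home_probability(home_id, h), h) for h in candidate_hours]
    scored.sort(reverse=True)
    prob, hour = scored[0]
    return hour if prob >= threshold else None        # None -> fall back to a pickup point

print(best_delivery_window("home-30", [9, 12, 18, 20]))   # 20: occupants most likely home
```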
- To further illustrate,
FIG. 4 describes an abstracted functional view 110 of the extensible devices and services platform 80 of FIG. 3, with particular reference to the processing engine 86 as well as devices, such as those of the smart-home environment 30 of FIG. 2. Even though devices situated in smart-home environments will have an endless variety of different individual capabilities and limitations, they can all be thought of as sharing common characteristics in that each of them is a data consumer 112 (DC), a data source 114 (DS), a services consumer 116 (SC), and a services source 118 (SS). Advantageously, in addition to providing the essential control information needed for the devices to achieve their local and immediate objectives, the extensible devices and services platform 80 can also be configured to harness the large amount of data that is flowing out of these devices. In addition to enhancing or optimizing the actual operation of the devices themselves with respect to their immediate functions, the extensible devices and services platform 80 can be directed to "repurposing" that data in a variety of automated, extensible, flexible, and/or scalable ways to achieve a variety of useful objectives. These objectives may be predefined or adaptively identified based on, e.g., usage patterns, device efficiency, and/or user input (e.g., requesting specific functionality). - For example,
FIG. 4 shows processing engine 86 as including a number of paradigms 120. Processing engine 86 can include a managed services paradigm 120 a that monitors and manages primary or secondary device functions. The device functions can include ensuring proper operation of a device given user inputs, estimating that (e.g., and responding to an instance in which) an intruder is or is attempting to be in a dwelling, detecting a failure of equipment coupled to the device (e.g., a light bulb having burned out), implementing or otherwise responding to energy demand response events, or alerting a user of a current or predicted future event or characteristic. Processing engine 86 can further include an advertising/communication paradigm 120 b that estimates characteristics (e.g., demographic information), desires and/or products of interest of a user based on device usage. Services, promotions, products or upgrades can then be offered or automatically provided to the user. Processing engine 86 can further include a social paradigm 120 c that uses information from a social network, provides information to a social network (for example, based on device usage), and/or processes data associated with user and/or device interactions with the social network platform. For example, a user's status as reported to their trusted contacts on the social network could be updated to indicate when they are home based on light detection, security system inactivation or device usage detectors. As another example, a user may be able to share device-usage statistics with other users. In yet another example, a user may share HVAC settings that result in low power bills and other users may download the HVAC settings to their smart thermostat 46 to reduce their power bills. - The
processing engine 86 can include a challenges/rules/compliance/rewards paradigm 120 d that informs a user of challenges, competitions, rules, compliance regulations and/or rewards and/or that uses operation data to determine whether a challenge has been met, a rule or regulation has been complied with and/or a reward has been earned. The challenges, rules or regulations can relate to efforts to conserve energy, to live safely (e.g., reducing exposure to toxins or carcinogens), to conserve money and/or equipment life, to improve health, etc. For example, one challenge may involve participants turning down their thermostat by one degree for one week. Those that successfully complete the challenge are rewarded, such as by coupons, virtual currency, status, etc. Regarding compliance, an example involves a rental-property owner making a rule that no renters are permitted to access certain owner's rooms. The devices in the room having occupancy sensors could send updates to the owner when the room is accessed. - The
processing engine 86 can integrate or otherwise utilize extrinsic information 122 from extrinsic sources to improve the functioning of one or more processing paradigms. Extrinsic information 122 can be used to interpret data received from a device, to determine a characteristic of the environment near the device (e.g., outside a structure that the device is enclosed in), to determine services or products available to the user, to identify a social network or social-network information, to determine contact information of entities (e.g., public-service entities such as an emergency-response team, the police or a hospital) near the device, etc., to identify statistical or environmental conditions, trends or other information associated with a home or neighborhood, and so forth. - An extraordinary range and variety of benefits can be brought about by, and fit within the scope of, the described extensible devices and
services platform 80, ranging from the ordinary to the profound. Thus, in one "ordinary" example, each bedroom of the smart-home environment 30 can be provided with a smart wall switch 54, a smart wall plug 56, and/or smart hazard detectors 50, all or some of which include an occupancy sensor, wherein the occupancy sensor is also capable of inferring (e.g., by virtue of motion detection, facial recognition, audible sound patterns, etc.) whether the occupant is asleep or awake. If a serious fire event is sensed, the remote security/monitoring service or fire department is advised of how many occupants there are in each bedroom, and whether those occupants are still asleep (or immobile) or whether they have properly evacuated the bedroom. While this is, of course, a very advantageous capability accommodated by the described extensible devices and services platform 80, there can be substantially more "profound" examples that can truly illustrate the potential of a larger "intelligence" that can be made available. By way of perhaps a more "profound" example, the same bedroom occupancy data that is being used for fire safety can also be "repurposed" by the processing engine 86 in the context of a social paradigm of neighborhood child development and education. Thus, for example, the same bedroom occupancy and motion data discussed in the "ordinary" example can be collected and made available (properly anonymized) for processing in which the sleep patterns of schoolchildren in a particular ZIP code can be identified and tracked. Localized variations in the sleeping patterns of the schoolchildren may be identified and correlated, for example, to different nutrition programs in local schools. - As previously discussed, the described extensible devices and
services platform 80 may enable communicating emergency information between smart-home environments 30 that are linked and/or to the proper authorities. For example, when a burglar breaks into a smart-home environment 30, a home security system may trip and sound an alarm and/or send emergency notifications to the neighbors, the police, the security company, and the like. However, in instances where the break-in is preceded by a jamming attack on the wireless network, the notifications may not be sent out if their transmission is dependent upon the wireless network. Thus, another means to communicate with external parties may be desired. As such, the techniques disclosed herein solve this problem by detecting the jamming attack and sending emergency notifications via side channels that are not dependent upon the wireless network. - Although programs, applications, and/or application services may be used to communicate requests or commands to the
smart home devices 10, in some embodiments these may not be sent directly to the smart home devices 10. The following figures illustrate smart device communication and/or control via an application accessing an API. - For example,
FIG. 5 illustrates a system 140 where an API may be used to access and/or control one or more smart devices. In the illustrated example, a person may desire to access a number of smart home devices 10, such as a first smart home device 10A and second smart home devices 10B. In the example of FIG. 5, the first smart home device 10A is an example of a smart thermostat, such as the Nest® Learning Thermostat by Nest Labs, Inc. (a company of Google, Inc.), and the second smart home devices 10B are examples of smart hazard detectors, such as the Nest® Protect by Nest Labs, Inc. Two application programs are shown accessing the smart home devices 10A and/or 10B through the device service 84. Although FIG. 5 illustrates accessing the smart home devices 10A and/or 10B using two separate application programs, it should be appreciated that any suitable number of application programs may be used to access the smart home devices 10A and/or 10B. - In the example of
FIG. 5, a first application 142 sends a first device request message 144 targeted to a smart home device 10 (e.g., the smart home device 10A) into cloud service(s) 145 and, more specifically, to a first application service 146. A second application 148 may be used to issue a second device request message 150 targeted to a smart home device 10 (e.g., the smart home device 10A) to a second application service 152 also among the cloud service(s) 145. In the example shown, the first application 142 is a navigation application that sends estimated-time-of-arrival (ETA) information in the device request messages 144. By sending a number of ETA messages as the device request messages 144, the first application 142 may be used to cause the smart home devices 10A and/or 10B to be prepared when a person arrives home. Thus, as an example, the first application 142 may send occasional device request messages 144 indicating the ETA to the first application service 146, which may forward this information to the device service 84 (e.g., via an API, as discussed above). The device service 84 may hold the device request messages 144 from the first application 142 until an appropriate time. In the illustrated example, the second application 148 may be a third-party home-automation application that may be running on a portable electronic device, such as a personal mobile device. The second application 148 may generate device request messages 150, such as commands to control or request information from the smart home devices 10A and/or 10B. The second application service 152 may interface with the device service 84 by way of an API, as mentioned above.
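- As an illustrative aid only, the following minimal sketch shows what a device request message 144 carrying ETA information might look like. The disclosure does not specify a wire format, so the field names, JSON structure, and device identifier below are assumptions, not part of the described API.

```python
import json
from datetime import datetime, timedelta

def build_eta_request(device_id: str, minutes_to_arrival: int) -> str:
    """Build a hypothetical device request message carrying ETA information."""
    message = {
        "target_device": device_id,  # e.g., the smart thermostat 10A
        "type": "eta_update",
        # ETA expressed as an absolute UTC timestamp (an assumption)
        "eta": (datetime.utcnow() + timedelta(minutes=minutes_to_arrival)).isoformat() + "Z",
    }
    return json.dumps(message)

# The navigation application would hand this payload to its application service,
# which forwards it to the device service over the API.
payload = build_eta_request("thermostat-10A", minutes_to_arrival=45)
print(payload)
```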
- Although the first application service 146, the second application service 152, and the device service 84 are illustrated in FIG. 5 as cloud service(s) 145, it may be appreciated that some or all of these services may run on electronic devices that are not remote cloud-computer systems accessible by way of the Internet. Indeed, in some examples, the device service 84 may not be on a network that is remote from the smart home devices 10A and/or 10B, but rather may be running on an electronic device in the same local area network as the smart home devices 10A and/or 10B. For example, the device service 84 may, additionally or alternatively, run on a local server computer and/or a local wireless router on the same local area network as the smart home devices 10A and/or 10B. Moreover, some applications may communicate directly with the device service 84 (e.g., via the API) without first communicating with an application service such as the first application service 146 or the second application service 152. - Regardless of the number of applications that may issue device request messages (e.g., 144 or 150) to the
device service 84, the device service 84 may not merely forward these messages to the smart home devices 10A and/or 10B that the device request messages are targeted to. Rather, the device service 84 may serve as the point of contact that application programs may use to access the smart home devices 10A and/or 10B. The device service 84 then may communicate information and/or commands provided by the applications to the smart home devices 10A and/or 10B, enabling coordination between the applications and the devices 10A and/or 10B. - In some embodiments, to enable additional functionalities in the applications (e.g.,
first application 142 and/or second application 148), the smart home devices 10A and/or 10B may occasionally transmit device operation status parameters 156 or other data based on the device operation status parameters 156 through the device service 84 and the proper application service (e.g., first application service 146 and/or second application service 152) to the proper applications (e.g., first application 142 and/or second application 148). - The device
operation status parameters 156 may represent any suitable characteristics of the operation status of the smart home devices 10A and/or 10B that may affect the proper functioning of the smart home devices 10A and/or 10B. Thus, the device operation status parameters 156 may include, for example: a battery level 159 indicative of an amount of charge remaining in a battery of the smart home device; a charging rate 160 indicative of a current rate that the battery of the smart home device is charging; a current device age 161 indicative of a period of use since initial install, a period of use since manufacture, a period of use since original sale, etc.; a planned lifespan 162 indicative of an expected useful operational duration of the smart home device; an amount of recent wireless use 163 (selected within a timespan recent enough to substantially affect an internal temperature of the smart home device 10); a direct measurement of an internal device temperature 164; and/or device operation status parameters for connected devices 165. The operational status parameters for connected devices 165 may represent any suitable operational parameters that may describe the smart home devices 10 (e.g., smart home device 10A) that the device service 84 may use to connect to a target smart home device 10 (e.g., one of the smart home devices 10B). For example, regarding the operational status parameters for connected devices 165, if the target smart home device 10 is the last smart home device 10B, reached through three smart home devices 10 in three communication "hops," the device operation status parameters 156 associated with these three intervening smart home devices 10 may be included. - The various specific device
operation status parameters 156 shown in FIG. 5 are provided by way of example. As such, the device operation status parameters 156 shown in FIG. 5 should not be understood to be exhaustive, but merely representative of possible operational parameters that may be considered for API-accessing applications. For example, additional device operation status parameters may include the current state of the device (e.g., sleeping, awake, Wi-Fi active/inactive, executing a demand-response algorithm, executing a time-to-temperature algorithm, etc.).
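- For illustration, the device operation status parameters 156 listed above could be grouped into a single structure such as the sketch below; the class name, types, and units are assumptions rather than part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DeviceOperationStatus:
    """Illustrative grouping of the device operation status parameters 156."""
    battery_level: float            # remaining charge, e.g., 0.0-1.0 (159)
    charging_rate: float            # current charging rate (160)
    device_age_days: int            # period of use since install/manufacture/sale (161)
    planned_lifespan_days: int      # expected useful operational duration (162)
    recent_wireless_use_s: float    # recent wireless activity affecting internal temperature (163)
    internal_temperature_c: float   # directly measured internal device temperature (164)
    # Status of intervening devices used to reach the target device in "hops" (165)
    connected_devices: List["DeviceOperationStatus"] = field(default_factory=list)

status = DeviceOperationStatus(
    battery_level=0.82, charging_rate=0.0, device_age_days=400,
    planned_lifespan_days=1825, recent_wireless_use_s=12.5,
    internal_temperature_c=31.0,
)
```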
- The applications may use the device operation status parameters 156 or data to affect subsequent interactions (e.g., via messages 144 or 150) that are transmitted to the smart home devices 10A and/or 10B. The device operation status parameters 156 may correspond only to a target smart home device 10 (e.g., the smart home device 10A), or may correspond to other smart home devices 10 that are in the vicinity of the target smart home device 10 (e.g., the smart home device 10A and the smart home devices 10B). In one example, when the target smart home device 10 for the device request messages 144 and/or 150 is the smart home device 10A, the device operation status parameters 156 may correspond substantially only to the smart home device 10A. In another example, when the target smart home device 10 is one of the smart home devices 10B, which is accessible by way of the smart home device 10A, the device operation status parameters 156 may contain operational parameter information about both the smart home device 10A and the smart home device 10B. - The
second application 148 may include voice actions. For example, a user input to the second application 148 may be an audible cue to "Set [brand(e.g. 'nest')|thermostat|temperature] to [nn] degrees." The second application 148 may convert this into messages that ultimately become commands to adjust the desired temperature of the thermostat 10A. - Further, an audible cue might be to "Turn on the heat." In such a scenario, the commands provided to the
thermostat 10A would set the thermostat one degree Celsius above the current ambient temperature. If the thermostat 10A is in range mode, both the low and high points are raised one degree Celsius. - Additionally, an audible cue might be to "Turn on the [air conditioning|cooling|a.c.]." In such a scenario, the commands provided to the
thermostat 10A would set the thermostat one degree Celsius below the current ambient temperature. If the thermostat 10A is in range mode, both the low and high points are lowered one degree Celsius. - In some embodiments, an audible cue might be to "set [brand(e.g. 'nest')|thermostat] to away." In such a scenario, the commands provided to the
thermostat 10A would change the mode of the thermostat 10A to "AWAY." When the audible cue is "set [brand(e.g. 'nest')|thermostat] to home," the commands provided to the thermostat 10A would change the mode of the thermostat 10A to "HOME."
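- A minimal sketch of how such audible cues might be mapped to thermostat commands follows. The parsing rules, command dictionary keys, and the restriction to the brand name "nest" are simplifications and assumptions, not the disclosed implementation.

```python
import re

def command_for_cue(cue: str, ambient_c: float, low_c: float, high_c: float,
                    range_mode: bool) -> dict:
    """Map an audible cue to a hypothetical thermostat command."""
    cue = cue.lower()
    m = re.search(r"set (?:nest|thermostat|temperature) to (\d+) degrees", cue)
    if m:
        return {"command": "set_target", "target_c": float(m.group(1))}
    if "turn on the heat" in cue:
        # One degree Celsius above ambient; in range mode raise both set points.
        if range_mode:
            return {"command": "set_range", "low_c": low_c + 1, "high_c": high_c + 1}
        return {"command": "set_target", "target_c": ambient_c + 1}
    if any(k in cue for k in ("air conditioning", "cooling", "a.c.")):
        # One degree Celsius below ambient; in range mode lower both set points.
        if range_mode:
            return {"command": "set_range", "low_c": low_c - 1, "high_c": high_c - 1}
        return {"command": "set_target", "target_c": ambient_c - 1}
    if "to away" in cue:
        return {"command": "set_mode", "mode": "AWAY"}
    if "to home" in cue:
        return {"command": "set_mode", "mode": "HOME"}
    return {"command": "noop"}

print(command_for_cue("Set thermostat to 22 degrees", 20.0, 18.0, 24.0, False))
```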
- As mentioned above, in FIG. 5, a message 144 is provided from a vehicle-based application 142. The message 144 may indicate an estimated time of arrival ("ETA") to a location (e.g., "home") where the devices 10A and/or 10B are located. In some embodiments, this ETA may instead be provided by the second application 148 running on a user device (e.g., a smart phone running the Google Now application). Based upon the ETA, the device service 84 (or any other processor-based component of the system 140) may determine controls for the smart devices 10A and/or 10B. For example, in some embodiments, the device 10A (e.g., a thermostat) may be aware of a time period needed for an air conditioning system to adjust the temperature of an environment where the device 10A is located. Operation of the device 10A may be altered based upon the provided ETA information. For example, in some embodiments, the ETA information may be used to automatically take the device 10A out of an "AWAY" mode (e.g., set to a "HOME" mode) when the ETA reaches a particular threshold. For example, the device 10A may be taken out of the "AWAY" mode when the ETA is less than one hour, less than thirty minutes, etc. - In some embodiments, a comparison of the ETA information and an expected temperature transition time (e.g., an amount of time to adjust an environment's temperature from a current temperature to a desired temperature) may be used to automatically begin temperature adjustment, such that the home is at a desired temperature at the ETA of the vehicle. Accordingly, the transition state of the temperature adjustment may be completed prior to the vehicle operator entering the environment controlled by the
device 10A. FIG. 6 illustrates a flow diagram of a process 166 for adjusting temperature in this manner. - The
process 166 begins by obtaining an estimated time of arrival ("ETA") (block 168). In one embodiment, block 168 may be triggered by setting a map application destination (e.g., an in-car navigation system and/or the Google Maps application) to "home." As mentioned above, the ETA may be provided by an application communicating directly and/or indirectly with the smart device(s). Further, a transition time to obtain a desired temperature from a current ambient temperature is calculated (block 170). The transition time is compared with the ETA (block 172). Next, a determination is made as to whether or not the transition time is greater than or equal to the ETA (decision block 174). In some embodiments, a time window may be defined based upon the transition time. For example, additional time (e.g., 0.5 hours, 1.5 hours, etc.) may be added to a transition time to ensure a desired temperature is reached prior to the vehicle's ETA. This will be described in more detail with regard to FIG. 7. - If the transition time is less than the ETA, the process continues to poll for new ETAs from the application (or counts down until the transition time is greater than or equal to the ETA). When the transition time is equal to or greater than the ETA, the smart device (e.g., the
thermostat 10A) may be controlled to begin the temperature adjustment (e.g., cooling) (block 176). Thus, by the time the vehicle arrives at the climate-controlled destination, the transition to the desired temperature may be complete.
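- The control flow of process 166 can be sketched as a simple polling loop, assuming hypothetical helper callables for obtaining the latest ETA, computing the transition time, and starting the adjustment; none of these names comes from the disclosure.

```python
import time

def eta_preconditioning_loop(get_eta_minutes, get_transition_minutes,
                             start_adjustment, buffer_minutes=6, poll_s=60):
    """Sketch of process 166: begin temperature adjustment when the transition
    time (plus an optional buffer) meets or exceeds the latest ETA.
    The three callables are assumptions, not part of the disclosed API."""
    while True:
        eta = get_eta_minutes()                 # block 168: latest ETA from the application
        transition = get_transition_minutes()   # block 170: time to reach desired temperature
        if transition + buffer_minutes >= eta:  # blocks 172/174: compare against the ETA
            start_adjustment()                  # block 176: command the thermostat to begin
            return
        time.sleep(poll_s)                      # keep polling for new ETAs
```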
- FIG. 7 illustrates a window creation operation for the ETA-based temperature adjustment. As illustrated in FIG. 7, there may be tradeoffs associated with beginning temperature adjustment prior to arrival. For example, some users may prefer a guarantee that the desired temperature is reached prior to arrival. To do this, a relatively large window may be created that starts the temperature adjustment early. Alternatively, other users may wish to factor in energy savings, which may be achieved by using a relatively small window. Thus, to provide flexibility, a graphical user interface 180 (e.g., a slider) may enable a user to select between these competing tradeoffs. As illustrated, a relatively large window 182 (here, Transition Time + a 1.5 hour buffer) is created to help ensure maximum comfort 184 (e.g., ensure that the desired temperature is reached prior to arrival). In contrast, a relatively small window 186 (here, Transition Time + 0.1 hours) is created to help ensure maximum efficiency 188 (e.g., ensure that less energy is used).
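- A minimal sketch of how the slider position might be mapped to the buffer added to the transition time is shown below. The 0.1 hour and 1.5 hour endpoints come from the example above; the linear interpolation between them is an assumption.

```python
def buffer_hours(slider_position: float) -> float:
    """Map a comfort/efficiency slider position to a pre-arrival buffer.

    slider_position: 0.0 = maximum efficiency, 1.0 = maximum comfort.
    """
    slider_position = min(max(slider_position, 0.0), 1.0)  # clamp to [0, 1]
    return 0.1 + slider_position * (1.5 - 0.1)

print(buffer_hours(1.0))  # 1.5 hour buffer (maximum comfort)
print(buffer_hours(0.0))  # 0.1 hour buffer (maximum efficiency)
```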
- In some embodiments, a vehicular application (e.g., first application 142 of FIG. 5) or other application may provide a location of the vehicle or other device to the smart devices (e.g., smart devices 10A and/or 10B of FIG. 5). This information may be used to control the smart devices (e.g., via geo-fencing). FIGS. 8-11 relate to such embodiments. FIG. 8 illustrates a process 190 for controlling smart devices via data obtained from a vehicular application. FIG. 9 illustrates an example of geo-fencing boundaries 200. FIG. 10 relates to a location-based application on a smart phone (e.g., Google Now) and FIG. 11 relates to a location-based application within a vehicle. These figures will be discussed together. - The
process 190 begins with obtaining a location of a vehicle (or other structure providing location information) (block 192). As mentioned above, this may be done by providing, for example, global-positioning-system (GPS) coordinates from the vehicular application to the smart devices (e.g., via one or more APIs). Next, geo-fence locations are determined (block 194). As illustrated in FIG. 10, one or more geo-fencing boundaries 200 may define locations (e.g., perimeters). Any number of boundaries of any shape or size may be used to create geo-fences. Operation of the smart devices (e.g., 10A and/or 10B) may be altered when the vehicle is located within and/or transitions into one of the boundaries 200 (block 196). - For example, when leaving the
home boundary 200A, the vehicular application may automatically prompt the user to set the thermostat to an "AWAY" mode. As illustrated in FIG. 10, the location 210 has moved 212 to the location 210′ (e.g., from the home zone 200A to outside the home zone 200A). Based upon the location 210′ and/or the transition outside of the boundary 200A, the prompt 214 may be provided. In the illustrated embodiment, the prompt 214 is provided on a handheld device 216 (e.g., a tablet computer, a programmable remote control, and/or a cellular telephone).
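- The boundary check behind such prompts can be sketched as follows, assuming for simplicity a circular geo-fence and projected coordinates in meters (the text allows boundaries of any shape or size, so this is only one possible realization); the helper names are illustrative.

```python
from math import hypot

def inside(boundary_center, radius_m, location):
    """Return True when a (x, y) location lies within a circular geo-fence."""
    dx = location[0] - boundary_center[0]
    dy = location[1] - boundary_center[1]
    return hypot(dx, dy) <= radius_m

def on_location_update(prev_loc, new_loc, home_center, radius_m, prompt):
    """Prompt a mode change when the vehicle crosses the home boundary 200A."""
    was_home = inside(home_center, radius_m, prev_loc)
    is_home = inside(home_center, radius_m, new_loc)
    if was_home and not is_home:
        prompt("Set thermostat to AWAY?")   # leaving the home zone 200A
    elif not was_home and is_home:
        prompt("Set thermostat to HOME?")   # returning to the home zone 200A

on_location_update((0, 0), (500, 500), home_center=(0, 0), radius_m=300, prompt=print)
```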
- FIG. 11 provides an illustration of a vehicular application embodiment. As illustrated, in the vehicular application embodiment, a prompt 270 may be provided in a graphical user interface of the vehicle, here an in-dash graphical user interface 272. - In addition to the "AWAY" mode prompt, the vehicular application or other application may provide an automatic prompt suggesting to set one or more of the smart devices (e.g.,
thermostat 10A) to "HOME" mode (e.g., not "AWAY"). For example, if the location were indicated as being within boundary 200A or a transition into boundary 200A was detected (e.g., by transition from location 210′ to location 210), the application may automatically prompt to set one or more of the smart devices to "HOME." - In some embodiments, a vehicular application (e.g., an application running on the graphical user interface 272) may allow manual configuration adjustments for smart devices. For example, the vehicular application may allow a user to manually set "HOME" and/or "AWAY" mode of a thermostat without having to physically access a separate application (e.g., a smart phone or tablet computer application). In other words, the user would not have to engage a graphical user interface of a smart phone or tablet, but could access configuration adjustments directly from the vehicular application (e.g., via the in-dash graphical user interface 272). Additionally, other configuration adjustments may be possible. For example, a temperature adjustment
graphical user interface 274 may enable changes to the desired temperature of the thermostat 10A. - As mentioned above, one or more messages may be sent from the vehicular application to the smart devices, which may be interpreted by a processor to control the smart devices. Accordingly, when user inputs (e.g., temperature adjustments or mode change adjustments) are made at the vehicular application, one or more control messages may be provided via the API(s). These messages are interpreted and cause the relevant control of the smart devices.
- In some embodiments, energy consumption data may be provided from the vehicular application to the smart devices (or a
cloud service 145 associated with the smart devices). For example, gasoline and/or electrical power usage 276 may be provided to cloud services 145. When electrical power usage 276 is provided, the cloud services 145 may provide an optimal vehicle charging schedule based on utility cost information known to the cloud services 145. For example, in some situations, utility companies may provide cheaper energy at off-peak times. When the cloud services 145 determine that a future recharge of the vehicle may be needed, the cloud services 145 may provide a recharging schedule based upon these off-peak energy times. - Further, when the energy usage data is provided to the
cloud services 145, additional services may be provided. For example, the vehicular energy consumption data may allow integration with energy conservation games (e.g., Nest Leaf) available for other smart devices (e.g., the thermostat 10A). Accordingly, energy usage reports may provide not only energy usage for smart devices within the home, but also energy consumption of vehicles related to that home. - As mentioned above,
device operation status 156 and/or other data may be provided from smart devices to applications (e.g., the vehicular application (first application 142)). Indeed, operational status of these smart devices (e.g., smoke and/or carbon monoxide detectors (e.g., smart devices 10B)) may be provided to the vehicular application. For example, in the embodiment of FIG. 11, a status GUI 278 provides an indication of the current operating status of a smoke detector and/or carbon monoxide detector. In other examples, an alarm system status, ambient temperature, or any other operational and/or sensor data may be provided for display within a vehicle. - In some embodiments, the API(s) may be used to enable condition-based access and/or control. For example, conditional rules may be generated based upon information received and/or sent to the API(s). In one example, conditional rule generation may occur from a website, such as a site that enables plugging in of conditions and outputs from a variety of different sources. In some examples, dedicated machine-readable code having conditional rule generation instructions may be stored on a tangible, non-transitory, machine-readable medium and executed by a machine.
- In some embodiments, conditional rules may be created where the
smart devices 10A and/or 10B are affected as an output of the rule. FIG. 12 illustrates an example of a conditional rule 300 where the output 302 is access and/or control of one or more features of the smart devices 10A and/or 10B. For example, an output 302 for a thermostat 10A may be changing a mode (e.g., "HOME" or "AWAY") for the thermostat, changing a desired temperature level of the thermostat, setting a fan to on or off, changing a fan speed, changing a temperature adjustment system (e.g., setting heat to cool or vice versa), etc. Example outputs 302 relating to a smoke detector and/or carbon monoxide detector (e.g., device 10B) may be activating/deactivating alarms, activating/deactivating audio, activating/deactivating lighting, activating/deactivating motion sensors, etc. - The conditions 304 used to control the
outputs 302 need not be sourced from the smart devices accessed and/or controlled by the outputs 302. In some embodiments, the conditional rules 300 may be based upon conditions sourced from an external data source 306 (e.g., external to the smart devices 10A and/or 10B). For example, FIG. 12 illustrates a conditional rule 300 where the condition(s) 304 is sourced from an external source 306. For example, the external data source 306 may include a weather service, social media site (e.g., check-in announcement), electronic calendar (e.g., Google calendar), geo-fencing application, utility company rate schedule, an electronic device (e.g., an alarm clock), etc. - In some embodiments, conditional rules may be based upon information sourced from the
smart devices 10A (e.g., thermostat) and/or 10B (e.g., smoke and/or carbon monoxide detector). Further, though the source for the condition 304 may be the smart devices 10A and/or 10B, the outputs 302 may be external to the smart devices 10A and/or 10B. For example, FIG. 13 illustrates a conditional rule 310 where the output 302 is an external output 312 and the inputs 304 are sourced from data provided by the thermostat 10A and/or smoke and/or carbon monoxide detector 10B. In some embodiments, both the inputs 304 and the outputs 302 relate to the smart devices 10A and/or 10B. - Example conditions 304 that may be sourced from the
thermostat 10A may include: any device operation status 156 of the thermostat, a mode (e.g., "HOME" and/or "AWAY") of the thermostat, an ambient temperature of the thermostat, an amount of periodic temperature change, etc. Example conditions 304 that may be sourced from the smoke and/or carbon monoxide detector 10B may include: an operating status 156 of the device, an active smoke alarm, an active carbon monoxide alarm, a low device battery level, etc.
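- As an illustration of the condition/output structure of conditional rules 300 and 310, the following sketch pairs a condition evaluated against data from some source (a smart device or an external data source 306) with an output action. The class and field names are illustrative, not taken from the disclosure.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict

@dataclass
class ConditionalRule:
    """Minimal sketch of a conditional rule: condition 304 plus output 302."""
    condition: Callable[[Dict[str, Any]], bool]
    output: Callable[[], None]

    def evaluate(self, source_data: Dict[str, Any]) -> None:
        if self.condition(source_data):
            self.output()

# Example: an active smoke alarm on detector 10B (a condition 304) drives an
# external output 312, such as switching color-changing lights to red.
rule = ConditionalRule(
    condition=lambda data: data.get("smoke_alarm_active", False),
    output=lambda: print("set lights to red"),
)
rule.evaluate({"smoke_alarm_active": True})
```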
- Having now discussed basic conditional rules (e.g., 300 and 310) using the thermostat 10A and/or smoke/carbon monoxide detector 10B, the following are examples of conditional rules that may be useful for implementation within a smart home. In one embodiment, data from an activity monitor, such as an electronic wristband that tracks vital statistics, may be used to provide a condition for a conditional rule. For example, when the activity monitor detects that an activity level suggests that the user is sleeping, a conditional output may set the desired temperature to a desired sleep temperature. When the activity level suggests that the user is awake, the output may set the desired thermostat temperature to an awake temperature. - In certain embodiments, a conditional output may correspond to smart lighting. For example, the lighting may be turned off when the
thermostat 10A enters an "AWAY" mode. This helps to ensure that energy is not wasted while no one is in the home. Further, when the thermostat 10A enters "HOME" mode, the lighting may be re-activated (perhaps in the same configuration as when it was turned off, or a new configuration, such as lighting only the front foyer where access to the home typically occurs). Additionally, lighting colors may change based upon conditions from the devices 10A and/or 10B. For example, it has been shown that the color red may provide visibility benefits when smoke and/or gaseous conditions are present. Accordingly, color-changing lights may be transitioned to red when an alarm from the smoke/carbon monoxide detector 10B is active. - In some embodiments, additional notifications may be provided via conditional rules. For example, a rule may trigger a text message, email, voice call, etc. to family, friends, neighbors, homeowners, etc. when a smoke alarm and/or a carbon monoxide alarm is triggered. Further, when a television, receiver, etc. is operating at a high decibel level, it may be beneficial to mute or lower the decibel volume to ensure that active alarms are heard. Accordingly, a conditional rule may mute or lower decibel levels of one or more devices if an alarm of the
detector 10B is active. In some instances, this may be done in conjunction with a programmable remote control. - As mentioned above, a weather service may provide conditions 304 for a conditional rule. For example, when the weather service reports an extremely hot and/or humid day, the desired temperature of the
thermostat 10A may be adjusted as a conditional rule output. Thus, the thermostat 10A may become highly customizable to a user's preferences. -
Outputs 302 related to mode changes in the thermostat 10A may be implemented by conditions sourced from social media data. For example, a "check-in" on Google Hangouts may suggest that a homeowner is not home and that an "AWAY" mode should be set. Accordingly, a rule may be generated to set the mode of the thermostat 10A to "AWAY" if there is a check-in outside of the home. - The geo-fencing applications (discussed in
FIGS. 8 and 9) may also be used as conditions for the conditional rules. For example, an output altering the mode of the thermostat 10A to "AWAY" may be triggered when exiting the boundary 200A. The thermostat 10A mode may be altered to "HOME" when entering the boundary 200A. - In some examples, other smart devices within the home may trigger outputs of the
smart devices 10A and/or 10B. For example, when motion-sensing smart light bulbs and/or other motion-sensing devices detect movement within the home, the thermostat 10A mode may be set to "HOME." - In one embodiment, particular keywords or contextual identifiers may be used as conditions 304 that trigger an
output 302. For example, when a Google Calendar appointment suggests that a climate-controlled environment will be unoccupied, the thermostat may be controlled to go into "AWAY" mode. For example, when a calendar entry includes the keywords "Out of Office," "OOO," "Vacation," etc., the "AWAY" mode output may be triggered at the thermostat 10A.
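- A minimal sketch of such a keyword check is shown below. The keyword list comes from the example above; the simple case-insensitive substring match is an assumption (a real rule engine might use stricter matching).

```python
AWAY_KEYWORDS = ("out of office", "ooo", "vacation")  # keywords from the example above

def calendar_entry_suggests_away(entry_title: str) -> bool:
    """Return True when a calendar entry suggests the home will be unoccupied."""
    title = entry_title.lower()
    return any(keyword in title for keyword in AWAY_KEYWORDS)

# A rule engine could use this as the condition 304 that triggers the "AWAY"
# output at the thermostat 10A.
print(calendar_entry_suggests_away("OOO - family vacation"))  # True
```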
- In some embodiments, when the thermostat 10A transitions to "HOME," audio playback may be triggered. Further, when the thermostat 10A transitions to "AWAY," music playback may be halted. Additionally, activating music playback on a device within the home may automatically trigger a command to enable "HOME" mode on the thermostat 10A. - When
multiple thermostats 10A and/or detectors 10B exist within a home, each of the thermostats 10A and/or detectors 10B may be accessed by a unique identifier. Accordingly, a condition 304 and/or output 302 may be specifically tied to a particular one or many of the thermostats 10A and/or detectors 10B. - The API(s) may enable other automation systems to interact with the
smart devices 10A and/or 10B. For example, a Control4® system may use the API(s) to increase/decrease temperatures of the thermostat 10A, may receive alarm states or other device operation status 156 from the thermostat 10A and/or detector 10B, set modes of operation (e.g., "heat," "cool," "HOME," and/or "AWAY") on the thermostat 10A, etc. - It may be beneficial to link conditions and outputs between washers, dryers, ovens, etc. and
thermostats 10A/detectors 10B. FIG. 14 illustrates such a system 370. - Certain appliances may include features that are beneficial in situations where there is delayed user involvement. For example, the
washing machine 372 may include a system to maintain unattended laundry. When laundry is left unattended in the washing machine 372, a fan may periodically pull moisture from the drum of the washing machine 372 and also periodically tumble the unattended laundry. Similarly, the dryer 374 may include an unattended laundry system that intermittently tumbles unattended laundry after a dryer cycle. - Typically, these unattended laundry systems are activated manually via an onboard interface of the
washing machine 372. However, under certain scenarios, this system may be activated automatically, using occupancy status discerned from the smart devices 10A and/or 10B. - For example, the
thermostat 10A is set to "AWAY" when the thermostat detects an indication that no one is in the temperature-controlled environment. Further, when the detectors 10B are equipped with occupancy sensors, similar household occupancy status may be defined. The status from the detectors 10B may be provided to the thermostat 10A, which in turn may automatically be set to "AWAY." Further, thermostat 10A users may manually set the thermostat to "AWAY" upon leaving the house. - In any of these cases, when an indication that no occupants are present is discerned, the away status may be provided to a service (e.g., service of the
washer 372, dryer 374, cloud service 145, condition service 376 (e.g., a website that provides graphical conditional rule generation), etc.), which may use the status as a condition for activating the unattended laundry systems. - When the
washing machine 372 and/or the dryer 374 are running a cycle and the respective unattended laundry systems are not enabled, the service may provide a washer 372 and/or dryer 374 command to activate the respective unattended laundry system. Thus, the laundry will remain fresh and/or wrinkle-free, despite the operator leaving the laundry unattended and not manually activating the unattended laundry systems. - Further, some
dryers 374 may be equipped with an economy boost option that may place the dryer in a more time-consuming but less energy-consuming state. When no occupancy is indicated or detected (e.g., by the thermostat 10A entering an "AWAY" mode), the service may provide a command for the dryer 374 to enter the economy boost option. - As mentioned above, certain utility providers offer lower energy rates during off-peak hours of operation. Rush Hour Rewards, by Nest, provides incentives to consumers to use less energy during peak usage times. Users enrolled in Rush Hour Rewards receive periodic peak energy usage events defining a peak usage time when energy consumption should be avoided to obtain a reward from the cloud services 145. When the Rush Hour Rewards event occurs, the
washer 372 and/or dryer 374 receives the peak event signal from the cloud services 145 and calculates the peak start time and duration. The peak start time is adjusted by a default cycle length for the washer 372 and/or dryer 374 to ensure that a consumer does not inadvertently start a cycle just before the event is to begin. For example, if a washing machine 372 and/or dryer 374 cycle is typically 30 minutes, the peak start time is adjusted by 30 minutes to ensure that the washer 372 and/or the dryer 374 is not active during the peak event. - In one example, a Rush Hour peak event may begin at 2:00 pm and last for 4 hours. With a default cycle time of 30 minutes, the
washer 372 adjusts the peak event start to 1:30 pm and ends the event at 6:00 pm (a 4 hour and 30 minute duration). These adjustments to the Rush Hour peak event help to ensure that the washer 372 is not in operation during the peak event.
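- The adjustment arithmetic from this example can be sketched as follows; the function and parameter names are illustrative, and only the start of the window is shifted earlier while the original end time is preserved.

```python
from datetime import datetime, timedelta

def adjust_peak_event(start: datetime, duration: timedelta,
                      default_cycle: timedelta) -> tuple:
    """Shift the peak-event window earlier by one default cycle length so a
    cycle started just before the event cannot run into it."""
    adjusted_start = start - default_cycle
    adjusted_end = start + duration            # original end time is unchanged
    return adjusted_start, adjusted_end

start, end = adjust_peak_event(
    datetime(2014, 6, 23, 14, 0),              # 2:00 pm peak event start
    timedelta(hours=4),                        # 4-hour event
    timedelta(minutes=30),                     # 30-minute default washer cycle
)
print(start.time(), end.time())                # 13:30:00 18:00:00
```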
- Once a new peak event start time and duration is calculated, the service may send a command to the washer 372 and/or dryer 374 to enter a Smart Delay. When in Smart Delay, the washer 372 and/or dryer 374 will inform the consumer that a peak event is in process and that a more energy-friendly time to run the cycle is approaching. The consumer may provide an input to allow the washer 372 and/or dryer 374 to automatically start when the event is complete, or the consumer may override the Smart Delay and start the cycle immediately. - When the
washer 372 and/or dryer 374 receive the peak event signal 30 minutes or less from the start of the event, the service sends a command for the washer 372 and/or dryer 374 to enter a deep power reduction mode. Accordingly, if the washer 372 and/or dryer 374 is in operation prior to receiving the peak event, the washer 372 and/or dryer 374 will reduce power usage for a brief period of time. Further, the dryer will also enter economy boost for the remainder of the cycle. If not running a cycle, the washer 372 and/or dryer 374 will enter Smart Delay. When the Rush Hour peak event has concluded, the washer 372 and/or dryer 374 return to normal operation. - To further encourage energy efficiency, energy usage of the
washer 372 and/or dryer 374, along with any of the other devices and/or services described herein, may be accumulated by the cloud services 145. For example, Nest may accumulate the energy usage of lighting, external automation systems, etc. to include this information in energy utilization reports. Further, the energy consumption may be incorporated in energy conservation information and/or games, such as Nest Leaf. - In some embodiments, the
detectors 10B may be used as conditions for controlling the washer 372, dryer 374, and/or a stove-top/oven 378. For example, when the detectors 10B detect smoke and/or gas, the washer 372, dryer 374, and/or a stove-top/oven 378 may be disabled. For example, gas access may be disabled for a burner on the stove-top/oven 378. - In some embodiments, booking service conditions may be used to control smart devices (e.g.,
thermostat 10A and/or detectors 10B). FIG. 15 illustrates such a system 400. A booking service 402, such as a hotel or Bed and Breakfast website, may enable reservations for one or more particular rooms. For example, the booking service 402 includes a listing 404 of available Bed and Breakfast locations for a particular location. Further, the listing 404 includes an indicator 408 for smart locations that may be personalized for a user's particular desires. - When a
location 406 is selected, additional details about the location 406 are provided. In the current embodiment, an availability calendar 408 is provided. Further, because the selected location is a personalized location, additional prompts 410 may be provided. For example, an alarm prompt 412 may enable a user to input an alarm code that is easy for the user to remember. An environment prompt 414 may enable the user to input particular environmental settings, such as a desired arrival temperature, etc. In some embodiments, the alarm and/or environmental settings (or any other customizable settings) may be pre-populated or obtained from the user's home 418 (or other location) settings. For example, if the user maintains a 78 degree temperature when awake and occupying the house and a 73 degree temperature when sleeping and occupying the house, these temperature settings may automatically be sent and implemented at the user's booked room 420. - For example, based upon the dates selected by the user, the
cloud services 145 may provide the settings input at the prompts 410 and/or the settings obtained from the house 418. Thus, if the user booked a rental from December 1 through December 10, the user's settings may be automatically implemented via the cloud service 145 during those time periods. Additionally, smart device notifications, such as active alarms of the detector 10B, may be provided to the user (e.g., the user's smart phone, etc.) during the booked time period. Further, the user's home may be controlled by placing the user's home in "AWAY" mode during the booked time period, and the user may be notified when their home devices detect occupancy while they are expected to be away (e.g., notify the user that their home thermostat transitioned to "HOME" while they are away). - This functionality may also benefit the lessor by providing energy conservation. For example, the
booking service 402 is aware of times when there is no occupancy in the room 420. Accordingly, the availability calendar 408 may be used to set the thermostats 10A to "AWAY" during periods where there is no occupancy. - In some embodiments, a garage door opener may be used as either a condition for a
thermostat 10A output and/or a thermostat 10A condition may be used for a garage door opener output. FIG. 16 provides a system 440 that integrates a garage door opener 442 with smart devices 10A and/or 10B. - In the provided embodiment, the
garage door opener 442 status may indicate that someone is arriving at and/or leaving the house 444. Accordingly, a prompt 446 may be provided on a user's device 448 (e.g., smart phone) prompting the user to change the mode of the thermostat 10A (e.g., from "HOME" to "AWAY" or vice versa). - Further, in cases where a user inadvertently leaves the
garage door 450 open, conditions of the thermostat 10A may be used to trigger closure of the door 450. For example, a conditional rule might trigger closure of the door 450 on the thermostat being "AWAY" for 30 minutes or longer. Thus, the door 450 may be closed, adding household security.
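- A minimal sketch of such a rule is shown below; the helper names, the callable used to close the door, and the use of UTC timestamps are assumptions rather than the disclosed implementation.

```python
from datetime import datetime, timedelta

def maybe_close_garage(mode: str, away_since: datetime, door_open: bool,
                       close_door, now: datetime = None) -> bool:
    """Close the garage door 450 once the thermostat 10A has been in "AWAY"
    mode for 30 minutes or longer; close_door is an assumed callable."""
    now = now or datetime.utcnow()
    if door_open and mode == "AWAY" and now - away_since >= timedelta(minutes=30):
        close_door()
        return True
    return False

maybe_close_garage("AWAY", datetime.utcnow() - timedelta(minutes=45), True,
                   close_door=lambda: print("closing garage door 450"))
```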
- The specific embodiments described above have been shown by way of example, and it should be understood that these embodiments may be susceptible to various modifications and alternative forms. It should be further understood that the claims are not intended to be limited to the particular forms disclosed, but rather to cover all modifications, equivalents, and alternatives falling within the spirit and scope of this disclosure.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/293,358 US20190208390A1 (en) | 2014-06-23 | 2019-03-05 | Methods And Apparatus For Exploiting Interfaces Smart Environment Device Application Program Interfaces |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462016052P | 2014-06-23 | 2014-06-23 | |
US14/577,635 US20150372832A1 (en) | 2014-06-23 | 2014-12-19 | Methods and apparatus for exploiting interfaces smart environment device application program interfaces |
US16/293,358 US20190208390A1 (en) | 2014-06-23 | 2019-03-05 | Methods And Apparatus For Exploiting Interfaces Smart Environment Device Application Program Interfaces |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/577,635 Continuation US20150372832A1 (en) | 2014-06-23 | 2014-12-19 | Methods and apparatus for exploiting interfaces smart environment device application program interfaces |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190208390A1 true US20190208390A1 (en) | 2019-07-04 |
Family
ID=54869564
Family Applications (14)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/531,805 Abandoned US20150370272A1 (en) | 2014-06-23 | 2014-11-03 | Intelligent configuration of a smart environment based on arrival time |
US14/577,635 Abandoned US20150372832A1 (en) | 2014-06-23 | 2014-12-19 | Methods and apparatus for exploiting interfaces smart environment device application program interfaces |
US14/722,034 Abandoned US20150372834A1 (en) | 2014-06-23 | 2015-05-26 | Methods and apparatus for using smart environment devices via application program interfaces |
US14/722,012 Active US9854386B2 (en) | 2014-06-23 | 2015-05-26 | Methods and apparatus for using smart environment devices via application program interfaces |
US14/722,023 Active US9838830B2 (en) | 2014-06-23 | 2015-05-26 | Methods and apparatus for using smart environment devices via application program interfaces |
US14/722,003 Active US9491571B2 (en) | 2014-06-23 | 2015-05-26 | Methods and apparatus for using smart environment devices via application program interfaces |
US14/722,026 Active US9456297B2 (en) | 2014-06-23 | 2015-05-26 | Methods and apparatus for using smart environment devices via application program interfaces |
US14/722,032 Active US9668085B2 (en) | 2014-06-23 | 2015-05-26 | Methods and apparatus for using smart environment devices via application program interfaces |
US15/158,268 Active 2035-07-30 US10075828B2 (en) | 2014-06-23 | 2016-05-18 | Methods and apparatus for using smart environment devices via application program interfaces |
US15/380,767 Active 2035-07-03 US10638292B2 (en) | 2014-06-23 | 2016-12-15 | Methods and apparatus for using smart environment devices via application program interfaces |
US16/051,375 Active US10440545B2 (en) | 2014-06-23 | 2018-07-31 | Methods and apparatus for using smart environment devices via application program interfaces |
US16/166,046 Abandoned US20190058985A1 (en) | 2014-06-23 | 2018-10-19 | Methods and Apparatus for Using Smart Environment Devices Via Application Program Interfaces |
US16/293,358 Abandoned US20190208390A1 (en) | 2014-06-23 | 2019-03-05 | Methods And Apparatus For Exploiting Interfaces Smart Environment Device Application Program Interfaces |
US16/565,124 Active US10764735B2 (en) | 2014-06-23 | 2019-09-09 | Methods and apparatus for using smart environment devices via application program interfaces |
Family Applications Before (12)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/531,805 Abandoned US20150370272A1 (en) | 2014-06-23 | 2014-11-03 | Intelligent configuration of a smart environment based on arrival time |
US14/577,635 Abandoned US20150372832A1 (en) | 2014-06-23 | 2014-12-19 | Methods and apparatus for exploiting interfaces smart environment device application program interfaces |
US14/722,034 Abandoned US20150372834A1 (en) | 2014-06-23 | 2015-05-26 | Methods and apparatus for using smart environment devices via application program interfaces |
US14/722,012 Active US9854386B2 (en) | 2014-06-23 | 2015-05-26 | Methods and apparatus for using smart environment devices via application program interfaces |
US14/722,023 Active US9838830B2 (en) | 2014-06-23 | 2015-05-26 | Methods and apparatus for using smart environment devices via application program interfaces |
US14/722,003 Active US9491571B2 (en) | 2014-06-23 | 2015-05-26 | Methods and apparatus for using smart environment devices via application program interfaces |
US14/722,026 Active US9456297B2 (en) | 2014-06-23 | 2015-05-26 | Methods and apparatus for using smart environment devices via application program interfaces |
US14/722,032 Active US9668085B2 (en) | 2014-06-23 | 2015-05-26 | Methods and apparatus for using smart environment devices via application program interfaces |
US15/158,268 Active 2035-07-30 US10075828B2 (en) | 2014-06-23 | 2016-05-18 | Methods and apparatus for using smart environment devices via application program interfaces |
US15/380,767 Active 2035-07-03 US10638292B2 (en) | 2014-06-23 | 2016-12-15 | Methods and apparatus for using smart environment devices via application program interfaces |
US16/051,375 Active US10440545B2 (en) | 2014-06-23 | 2018-07-31 | Methods and apparatus for using smart environment devices via application program interfaces |
US16/166,046 Abandoned US20190058985A1 (en) | 2014-06-23 | 2018-10-19 | Methods and Apparatus for Using Smart Environment Devices Via Application Program Interfaces |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/565,124 Active US10764735B2 (en) | 2014-06-23 | 2019-09-09 | Methods and apparatus for using smart environment devices via application program interfaces |
Country Status (1)
Country | Link |
---|---|
US (14) | US20150370272A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10440545B2 (en) | 2014-06-23 | 2019-10-08 | Google Llc | Methods and apparatus for using smart environment devices via application program interfaces |
CN111522615A (en) * | 2020-04-23 | 2020-08-11 | 平安国际智慧城市科技股份有限公司 | Method, device and equipment for updating command line interface and storage medium |
WO2021163270A1 (en) * | 2020-02-12 | 2021-08-19 | Appareo Systems, Llc | Aircraft lighting system and method |
Families Citing this family (558)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6658091B1 (en) | 2002-02-01 | 2003-12-02 | @Security Broadband Corp. | LIfestyle multimedia security system |
US8635350B2 (en) | 2006-06-12 | 2014-01-21 | Icontrol Networks, Inc. | IP device discovery systems and methods |
US9191228B2 (en) | 2005-03-16 | 2015-11-17 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US11368429B2 (en) | 2004-03-16 | 2022-06-21 | Icontrol Networks, Inc. | Premises management configuration and control |
US11159484B2 (en) | 2004-03-16 | 2021-10-26 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US10313303B2 (en) | 2007-06-12 | 2019-06-04 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US10522026B2 (en) | 2008-08-11 | 2019-12-31 | Icontrol Networks, Inc. | Automation system user interface with three-dimensional display |
US11277465B2 (en) | 2004-03-16 | 2022-03-15 | Icontrol Networks, Inc. | Generating risk profile using data of home monitoring and security system |
US11190578B2 (en) | 2008-08-11 | 2021-11-30 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US10339791B2 (en) | 2007-06-12 | 2019-07-02 | Icontrol Networks, Inc. | Security network integrated with premise security system |
US20090077623A1 (en) | 2005-03-16 | 2009-03-19 | Marc Baum | Security Network Integrating Security System and Network Devices |
US8988221B2 (en) | 2005-03-16 | 2015-03-24 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11343380B2 (en) | 2004-03-16 | 2022-05-24 | Icontrol Networks, Inc. | Premises system automation |
US10721087B2 (en) | 2005-03-16 | 2020-07-21 | Icontrol Networks, Inc. | Method for networked touchscreen with integrated interfaces |
US10237237B2 (en) | 2007-06-12 | 2019-03-19 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US9141276B2 (en) | 2005-03-16 | 2015-09-22 | Icontrol Networks, Inc. | Integrated interface for mobile device |
US10348575B2 (en) | 2013-06-27 | 2019-07-09 | Icontrol Networks, Inc. | Control system user interface |
US10142392B2 (en) | 2007-01-24 | 2018-11-27 | Icontrol Networks, Inc. | Methods and systems for improved system performance |
US7711796B2 (en) | 2006-06-12 | 2010-05-04 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US11582065B2 (en) | 2007-06-12 | 2023-02-14 | Icontrol Networks, Inc. | Systems and methods for device communication |
US11916870B2 (en) | 2004-03-16 | 2024-02-27 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US11811845B2 (en) | 2004-03-16 | 2023-11-07 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11489812B2 (en) | 2004-03-16 | 2022-11-01 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US10156959B2 (en) | 2005-03-16 | 2018-12-18 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US12063220B2 (en) | 2004-03-16 | 2024-08-13 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11113950B2 (en) | 2005-03-16 | 2021-09-07 | Icontrol Networks, Inc. | Gateway integrated with premises security system |
US11244545B2 (en) | 2004-03-16 | 2022-02-08 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US9609003B1 (en) | 2007-06-12 | 2017-03-28 | Icontrol Networks, Inc. | Generating risk profile using data of home monitoring and security system |
US11201755B2 (en) | 2004-03-16 | 2021-12-14 | Icontrol Networks, Inc. | Premises system management using status signal |
US11677577B2 (en) | 2004-03-16 | 2023-06-13 | Icontrol Networks, Inc. | Premises system management using status signal |
US20170118037A1 (en) * | 2008-08-11 | 2017-04-27 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US8963713B2 (en) | 2005-03-16 | 2015-02-24 | Icontrol Networks, Inc. | Integrated security network with security alarm signaling system |
US9531593B2 (en) | 2007-06-12 | 2016-12-27 | Icontrol Networks, Inc. | Takeover processes in security network integrated with premise security system |
US10444964B2 (en) | 2007-06-12 | 2019-10-15 | Icontrol Networks, Inc. | Control system user interface |
GB2428821B (en) | 2004-03-16 | 2008-06-04 | Icontrol Networks Inc | Premises management system |
US11316958B2 (en) | 2008-08-11 | 2022-04-26 | Icontrol Networks, Inc. | Virtual device systems and methods |
US10375253B2 (en) | 2008-08-25 | 2019-08-06 | Icontrol Networks, Inc. | Security system with networked touchscreen and gateway |
US10200504B2 (en) | 2007-06-12 | 2019-02-05 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US10382452B1 (en) | 2007-06-12 | 2019-08-13 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US9729342B2 (en) | 2010-12-20 | 2017-08-08 | Icontrol Networks, Inc. | Defining and implementing sensor triggered response rules |
US11496568B2 (en) | 2005-03-16 | 2022-11-08 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US20110128378A1 (en) | 2005-03-16 | 2011-06-02 | Reza Raji | Modular Electronic Display Platform |
US20170180198A1 (en) | 2008-08-11 | 2017-06-22 | Marc Baum | Forming a security network including integrated security system components |
US9306809B2 (en) | 2007-06-12 | 2016-04-05 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US10999254B2 (en) | 2005-03-16 | 2021-05-04 | Icontrol Networks, Inc. | System for data routing in networks |
US20120324566A1 (en) | 2005-03-16 | 2012-12-20 | Marc Baum | Takeover Processes In Security Network Integrated With Premise Security System |
US11700142B2 (en) | 2005-03-16 | 2023-07-11 | Icontrol Networks, Inc. | Security network integrating security system and network devices |
US11615697B2 (en) | 2005-03-16 | 2023-03-28 | Icontrol Networks, Inc. | Premise management systems and methods |
US10079839B1 (en) | 2007-06-12 | 2018-09-18 | Icontrol Networks, Inc. | Activation of gateway device |
US12063221B2 (en) | 2006-06-12 | 2024-08-13 | Icontrol Networks, Inc. | Activation of gateway device |
US11706279B2 (en) | 2007-01-24 | 2023-07-18 | Icontrol Networks, Inc. | Methods and systems for data communication |
US7633385B2 (en) | 2007-02-28 | 2009-12-15 | Ucontrol, Inc. | Method and system for communicating with and controlling an alarm system from a remote server |
US8451986B2 (en) | 2007-04-23 | 2013-05-28 | Icontrol Networks, Inc. | Method and system for automatically providing alternate network access for telecommunications |
US10498830B2 (en) | 2007-06-12 | 2019-12-03 | Icontrol Networks, Inc. | Wi-Fi-to-serial encapsulation in systems |
US10666523B2 (en) | 2007-06-12 | 2020-05-26 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10051078B2 (en) | 2007-06-12 | 2018-08-14 | Icontrol Networks, Inc. | WiFi-to-serial encapsulation in systems |
US11089122B2 (en) | 2007-06-12 | 2021-08-10 | Icontrol Networks, Inc. | Controlling data routing among networks |
US11423756B2 (en) | 2007-06-12 | 2022-08-23 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10423309B2 (en) | 2007-06-12 | 2019-09-24 | Icontrol Networks, Inc. | Device integration framework |
US10389736B2 (en) | 2007-06-12 | 2019-08-20 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11316753B2 (en) | 2007-06-12 | 2022-04-26 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11646907B2 (en) | 2007-06-12 | 2023-05-09 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US12003387B2 (en) | 2012-06-27 | 2024-06-04 | Comcast Cable Communications, Llc | Control system user interface |
US11218878B2 (en) | 2007-06-12 | 2022-01-04 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10616075B2 (en) | 2007-06-12 | 2020-04-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11601810B2 (en) | 2007-06-12 | 2023-03-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11237714B2 (en) | 2007-06-12 | 2022-02-01 | Control Networks, Inc. | Control system user interface |
US10523689B2 (en) | 2007-06-12 | 2019-12-31 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11212192B2 (en) | 2007-06-12 | 2021-12-28 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10223903B2 (en) | 2010-09-28 | 2019-03-05 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11831462B2 (en) | 2007-08-24 | 2023-11-28 | Icontrol Networks, Inc. | Controlling data routing in premises management systems |
US11916928B2 (en) | 2008-01-24 | 2024-02-27 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US20170185278A1 (en) | 2008-08-11 | 2017-06-29 | Icontrol Networks, Inc. | Automation system user interface |
US11729255B2 (en) | 2008-08-11 | 2023-08-15 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11792036B2 (en) | 2008-08-11 | 2023-10-17 | Icontrol Networks, Inc. | Mobile premises automation platform |
US11758026B2 (en) | 2008-08-11 | 2023-09-12 | Icontrol Networks, Inc. | Virtual device systems and methods |
US11258625B2 (en) | 2008-08-11 | 2022-02-22 | Icontrol Networks, Inc. | Mobile premises automation platform |
US20100114768A1 (en) | 2008-10-31 | 2010-05-06 | Wachovia Corporation | Payment vehicle with on and off function |
US10867298B1 (en) | 2008-10-31 | 2020-12-15 | Wells Fargo Bank, N.A. | Payment vehicle with on and off function |
US8049613B2 (en) * | 2008-11-26 | 2011-11-01 | Comcast Cable Holdings, Llc | Building security system |
US8638211B2 (en) | 2009-04-30 | 2014-01-28 | Icontrol Networks, Inc. | Configurable controller and interface for home SMA, phone and multimedia |
US8918779B2 (en) * | 2009-08-27 | 2014-12-23 | Microsoft Corporation | Logical migration of applications and data |
US8471707B2 (en) * | 2009-09-25 | 2013-06-25 | Intel Corporation | Methods and arrangements for smart sensors |
CN102985915B (en) | 2010-05-10 | 2016-05-11 | Icontrol Networks, Inc. | Control system user interface |
US8836467B1 (en) | 2010-09-28 | 2014-09-16 | Icontrol Networks, Inc. | Method, system and apparatus for automated reporting of account and sensor zone information to a central station |
US11750414B2 (en) | 2010-12-16 | 2023-09-05 | Icontrol Networks, Inc. | Bidirectional security sensor communication for a premises security system |
US9147337B2 (en) | 2010-12-17 | 2015-09-29 | Icontrol Networks, Inc. | Method and system for logging security event data |
US10713726B1 (en) | 2013-01-13 | 2020-07-14 | United Services Automobile Association (Usaa) | Determining insurance policy modifications using informatic sensor data |
US10976713B2 (en) | 2013-03-15 | 2021-04-13 | Hayward Industries, Inc. | Modular pool/spa control system |
US8769031B1 (en) * | 2013-04-15 | 2014-07-01 | Upfront Media Group, Inc. | System and method for implementing a subscription-based social media platform |
JP5538592B1 (en) * | 2013-05-17 | 2014-07-02 | 三菱電機株式会社 | Energy management controller, energy management system, energy management method, and program |
US10708404B2 (en) | 2014-09-01 | 2020-07-07 | Skybell Technologies Ip, Llc | Doorbell communication and electrical systems |
US11764990B2 (en) | 2013-07-26 | 2023-09-19 | Skybell Technologies Ip, Llc | Doorbell communications systems and methods |
US10672238B2 (en) | 2015-06-23 | 2020-06-02 | SkyBell Technologies, Inc. | Doorbell communities |
US11909549B2 (en) | 2013-07-26 | 2024-02-20 | Skybell Technologies Ip, Llc | Doorbell communication systems and methods |
US20170263067A1 (en) | 2014-08-27 | 2017-09-14 | SkyBell Technologies, Inc. | Smart lock systems and methods |
US11889009B2 (en) | 2013-07-26 | 2024-01-30 | Skybell Technologies Ip, Llc | Doorbell communication and electrical systems |
US11651665B2 (en) | 2013-07-26 | 2023-05-16 | Skybell Technologies Ip, Llc | Doorbell communities |
US20180343141A1 (en) | 2015-09-22 | 2018-11-29 | SkyBell Technologies, Inc. | Doorbell communication systems and methods |
US11349879B1 (en) * | 2013-07-28 | 2022-05-31 | Secureauth Corporation | System and method for multi-transaction policy orchestration with first and second level derived policies for authentication and authorization |
US9710858B1 (en) | 2013-08-16 | 2017-07-18 | United Services Automobile Association (Usaa) | Insurance policy alterations using informatic sensor data |
WO2015071202A1 (en) * | 2013-11-12 | 2015-05-21 | Sma Solar Technology Ag | Method for the communication of system control units with a plurality of energy generating systems via a gateway, and correspondingly configured and programmed data server |
US10552911B1 (en) | 2014-01-10 | 2020-02-04 | United Services Automobile Association (Usaa) | Determining status of building modifications using informatics sensor data |
US11087404B1 (en) | 2014-01-10 | 2021-08-10 | United Services Automobile Association (Usaa) | Electronic sensor management |
US11416941B1 (en) | 2014-01-10 | 2022-08-16 | United Services Automobile Association (Usaa) | Electronic sensor management |
US12100050B1 (en) | 2014-01-10 | 2024-09-24 | United Services Automobile Association (Usaa) | Electronic sensor management |
US11700400B2 (en) | 2014-02-05 | 2023-07-11 | Enseo, Llc | Geolocationing system and method for use of same |
US11683534B2 (en) | 2014-02-05 | 2023-06-20 | Enseo, Llc | Geolocationing system and method for use of same |
US11700399B2 (en) | 2014-02-05 | 2023-07-11 | Enseo, Llc | Geolocationing system and method for use of same |
US11381850B2 (en) | 2014-02-05 | 2022-07-05 | Enseo, Llc | Thermostat and system and method for use of same |
US11700401B2 (en) | 2014-02-05 | 2023-07-11 | Enseo, Llc | Geolocationing system and method for use of same |
US11825132B2 (en) | 2014-02-05 | 2023-11-21 | Enseo, Llc | Thermostat, system and method for providing awareness in a hospitality environment |
US11553214B2 (en) | 2014-02-05 | 2023-01-10 | Enseo, Llc | Thermostat and system and method for use of same |
US11849155B2 (en) | 2014-02-05 | 2023-12-19 | Enseo, Llc | Thermostat, system and method for providing awareness in a hospitality environment |
US11856241B2 (en) | 2014-02-05 | 2023-12-26 | Enseo, Llc | Thermostat, system and method for providing awareness in a hospitality environment |
US11825133B2 (en) | 2014-02-05 | 2023-11-21 | Enseo, Llc | Thermostat, system and method for providing awareness in a hospitality environment |
US20180108230A1 (en) * | 2014-02-08 | 2018-04-19 | Switchmate Home Llc | Monitoring of legacy controls in smart home system |
US11847666B1 (en) | 2014-02-24 | 2023-12-19 | United Services Automobile Association (Usaa) | Determining status of building modifications using informatics sensor data |
US11146637B2 (en) | 2014-03-03 | 2021-10-12 | Icontrol Networks, Inc. | Media content management |
US11405463B2 (en) | 2014-03-03 | 2022-08-02 | Icontrol Networks, Inc. | Media content management |
US10614525B1 (en) | 2014-03-05 | 2020-04-07 | United Services Automobile Association (Usaa) | Utilizing credit and informatic data for insurance underwriting purposes |
US9716861B1 (en) | 2014-03-07 | 2017-07-25 | Steelcase Inc. | Method and system for facilitating collaboration sessions |
US10664772B1 (en) | 2014-03-07 | 2020-05-26 | Steelcase Inc. | Method and system for facilitating collaboration sessions |
US9766079B1 (en) | 2014-10-03 | 2017-09-19 | Steelcase Inc. | Method and system for locating resources and communicating within an enterprise |
US9380682B2 (en) | 2014-06-05 | 2016-06-28 | Steelcase Inc. | Environment optimization for space based on presence and activities |
US9955318B1 (en) | 2014-06-05 | 2018-04-24 | Steelcase Inc. | Space guidance and management system and method |
US10433646B1 (en) | 2014-06-06 | 2019-10-08 | Steelcase Inc. | Microclimate control systems and methods |
US11744376B2 (en) | 2014-06-06 | 2023-09-05 | Steelcase Inc. | Microclimate control systems and methods |
EP2958010A1 (en) * | 2014-06-20 | 2015-12-23 | Thomson Licensing | Apparatus and method for controlling the apparatus by a user |
US20170085843A1 (en) * | 2015-09-22 | 2017-03-23 | SkyBell Technologies, Inc. | Doorbell communication systems and methods |
US11184589B2 (en) | 2014-06-23 | 2021-11-23 | Skybell Technologies Ip, Llc | Doorbell communication systems and methods |
US9788039B2 (en) | 2014-06-23 | 2017-10-10 | Google Inc. | Camera system API for third-party integrations |
US9607507B1 (en) * | 2014-06-25 | 2017-03-28 | Amazon Technologies, Inc. | User activity-based actions |
US9860242B2 (en) * | 2014-08-11 | 2018-01-02 | Vivint, Inc. | One-time access to an automation system |
CN105363298B (en) * | 2014-08-15 | 2017-11-03 | 台达电子工业股份有限公司 | Air regenerating device with filter screen dirt detection function and detection method thereof |
US10119714B2 (en) * | 2014-09-10 | 2018-11-06 | Cielo WiGle Inc. | System and method for remotely controlling IR-enabled appliances via networked device |
DE102014113040A1 (en) * | 2014-09-10 | 2016-03-10 | Miele & Cie. Kg | Method for operating a household appliance system |
US20160077530A1 (en) * | 2014-09-12 | 2016-03-17 | Michael T. Moran | Smart valve for controlling a plumbing fixture |
US10991049B1 (en) | 2014-09-23 | 2021-04-27 | United Services Automobile Association (Usaa) | Systems and methods for acquiring insurance related informatics |
US10028025B2 (en) * | 2014-09-29 | 2018-07-17 | Time Warner Cable Enterprises Llc | Apparatus and methods for enabling presence-based and use-based services |
US9852388B1 (en) | 2014-10-03 | 2017-12-26 | Steelcase, Inc. | Method and system for locating resources and communicating within an enterprise |
KR102366961B1 (en) * | 2014-10-07 | 2022-02-24 | 삼성전자 주식회사 | Method and apparatus for managing heating, ventilation and air conditioning |
US10290447B2 (en) | 2014-10-15 | 2019-05-14 | Umbrela Smart Inc. | Wall-mounted smart switches and outlets for use in building wiring for load control, home automation, and/or security purposes |
US10619874B2 (en) * | 2014-10-23 | 2020-04-14 | Trane International Inc. | Apparatuses, methods and systems for configuring electronically programmable HVAC system |
US9982906B2 (en) * | 2014-10-23 | 2018-05-29 | Vivint, Inc. | Real-time temperature management |
US20160131382A1 (en) * | 2014-11-12 | 2016-05-12 | Howard Rosen | Method and apparatus of networked thermostats providing for reduced peak power demand |
EP3026645A1 (en) * | 2014-11-26 | 2016-06-01 | Thomson Licensing | Apparatus and method for activity monitoring |
TWI554747B (en) * | 2014-12-04 | 2016-10-21 | 台達電子工業股份有限公司 | Human detection system and human detection method |
US10609475B2 (en) | 2014-12-05 | 2020-03-31 | Stages Llc | Active noise control and customized audio system |
US10605474B2 (en) * | 2015-07-30 | 2020-03-31 | Encycle Corporation | Smart thermostat orchestration |
US10091015B2 (en) * | 2014-12-16 | 2018-10-02 | Microsoft Technology Licensing, Llc | 3D mapping of internet of things devices |
CN105763514B (en) * | 2014-12-17 | 2019-11-29 | 华为技术有限公司 | Method, apparatus and system for processing authorization |
US9985796B2 (en) * | 2014-12-19 | 2018-05-29 | Smartlabs, Inc. | Smart sensor adaptive configuration systems and methods using cloud data |
US11489690B2 (en) | 2014-12-19 | 2022-11-01 | Smartlabs, Inc. | System communication utilizing path between neighboring networks |
AU2014415265A1 (en) * | 2014-12-23 | 2017-07-27 | Fluidra, S.A. | Controlling a water installation device |
FR3031209A1 (en) * | 2014-12-24 | 2016-07-01 | Orange | Management of electronic entities for the creation of a news wire |
US9526155B2 (en) * | 2014-12-30 | 2016-12-20 | Google Inc. | Systems and methods of controlling light sources according to location |
WO2016107658A1 (en) * | 2014-12-31 | 2016-07-07 | Fluidra, S.A. | Controlling water installation devices |
KR102338899B1 (en) * | 2015-01-02 | 2021-12-13 | 삼성전자주식회사 | Method and device for controlling home device |
US9870696B2 (en) * | 2015-01-05 | 2018-01-16 | Ford Global Technologies, Llc | Smart device vehicle integration |
US10028220B2 (en) | 2015-01-27 | 2018-07-17 | Locix, Inc. | Systems and methods for providing wireless asymmetric network architectures of wireless devices with power management features |
US9680646B2 (en) * | 2015-02-05 | 2017-06-13 | Apple Inc. | Relay service for communication between controllers and accessories |
EP3256913A1 (en) * | 2015-02-11 | 2017-12-20 | NEC Europe Ltd. | A method for operating a thermal system and a thermal system |
US9900174B2 (en) | 2015-03-06 | 2018-02-20 | Honeywell International Inc. | Multi-user geofencing for building automation |
US10742938B2 (en) | 2015-03-07 | 2020-08-11 | Skybell Technologies Ip, Llc | Garage door communication systems and methods |
US9984686B1 (en) * | 2015-03-17 | 2018-05-29 | Amazon Technologies, Inc. | Mapping device capabilities to a predefined set |
US9967391B2 (en) | 2015-03-25 | 2018-05-08 | Honeywell International Inc. | Geo-fencing in a building automation system |
US11429975B1 (en) | 2015-03-27 | 2022-08-30 | Wells Fargo Bank, N.A. | Token management system |
US11575537B2 (en) | 2015-03-27 | 2023-02-07 | Skybell Technologies Ip, Llc | Doorbell communication systems and methods |
MX2017012713A (en) | 2015-04-03 | 2017-12-11 | Lucis Tech Holdings Limited | Environmental control system. |
US9619985B2 (en) * | 2015-04-08 | 2017-04-11 | Vivint, Inc. | Home automation communication system |
US11381686B2 (en) | 2015-04-13 | 2022-07-05 | Skybell Technologies Ip, Llc | Power outlet cameras |
US10802459B2 (en) | 2015-04-27 | 2020-10-13 | Ademco Inc. | Geo-fencing with advanced intelligent recovery |
US10802469B2 (en) | 2015-04-27 | 2020-10-13 | Ademco Inc. | Geo-fencing with diagnostic feature |
CN104932455B (en) * | 2015-04-27 | 2018-04-13 | 小米科技有限责任公司 | Grouping method and grouping apparatus for smart devices in a smart home system |
WO2016179321A1 (en) | 2015-05-04 | 2016-11-10 | Johnson Controls Technology Company | User control device with housing containing angled circuit boards |
CN107771265A (en) | 2015-05-04 | 2018-03-06 | Johnson Controls Technology Company | Touch thermostat installation using transparent screen technology |
US10677484B2 (en) | 2015-05-04 | 2020-06-09 | Johnson Controls Technology Company | User control device and multi-function home control system |
US11641452B2 (en) | 2015-05-08 | 2023-05-02 | Skybell Technologies Ip, Llc | Doorbell communication systems and methods |
US20160342311A1 (en) * | 2015-05-21 | 2016-11-24 | 1804282 Ontario Limited Dba Gymnext | Timer display and methods of communication between the timer display and a mobile device |
US10489863B1 (en) | 2015-05-27 | 2019-11-26 | United Services Automobile Association (Usaa) | Roof inspection systems and methods |
US9866545B2 (en) * | 2015-06-02 | 2018-01-09 | ALTR Solutions, Inc. | Credential-free user login to remotely executed applications |
US10733371B1 (en) | 2015-06-02 | 2020-08-04 | Steelcase Inc. | Template based content preparation system for use with a plurality of space types |
US9817957B1 (en) * | 2015-06-04 | 2017-11-14 | EMC IP Holding Company LLC | Access management based on active environment comprising dynamically reconfigurable sets of smart objects |
US20180047269A1 (en) | 2015-06-23 | 2018-02-15 | SkyBell Technologies, Inc. | Doorbell communities |
US10655951B1 (en) | 2015-06-25 | 2020-05-19 | Amazon Technologies, Inc. | Determining relative positions of user devices |
KR102427836B1 (en) * | 2015-06-26 | 2022-08-02 | 삼성전자주식회사 | Cleaning robot, information providing system and method for providing information |
US10365620B1 (en) | 2015-06-30 | 2019-07-30 | Amazon Technologies, Inc. | Interoperability of secondary-device hubs |
KR20170004054A (en) * | 2015-07-01 | 2017-01-11 | 한국전자통신연구원 | Apparatus and method for providing interactive communication service using a sensor network |
EP3320457B1 (en) | 2015-07-10 | 2021-04-07 | Whether Or Knot LLC | System and method for electronic data distribution |
CN105093948A (en) * | 2015-07-13 | 2015-11-25 | 小米科技有限责任公司 | Intelligent device control method, terminal, and server |
US9876852B2 (en) * | 2015-07-23 | 2018-01-23 | Microsoft Technology Licensing, Llc | Coordinating actions across platforms |
US9781686B2 (en) * | 2015-07-23 | 2017-10-03 | Google Inc. | Reducing wireless communication to conserve energy and increase security |
US10706702B2 (en) | 2015-07-30 | 2020-07-07 | Skybell Technologies Ip, Llc | Doorbell package detection systems and methods |
US11170364B1 (en) | 2015-07-31 | 2021-11-09 | Wells Fargo Bank, N.A. | Connected payment card systems and methods |
US11172273B2 (en) | 2015-08-10 | 2021-11-09 | Delta Energy & Communications, Inc. | Transformer monitor, communications and data collection device |
US10055869B2 (en) | 2015-08-11 | 2018-08-21 | Delta Energy & Communications, Inc. | Enhanced reality system for visualizing, evaluating, diagnosing, optimizing and servicing smart grids and incorporated components |
RU2710506C2 (en) * | 2015-08-18 | 2019-12-26 | Ford Global Technologies, LLC | Method (embodiments) and system for tracking an object from a vehicle |
US10425414B1 (en) * | 2015-08-31 | 2019-09-24 | United Services Automobile Association (Usaa) | Security platform |
US10055966B2 (en) | 2015-09-03 | 2018-08-21 | Delta Energy & Communications, Inc. | System and method for determination and remediation of energy diversion in a smart grid network |
US20170074536A1 (en) * | 2015-09-11 | 2017-03-16 | Johnson Controls Technology Company | Thermostat with near field communication features |
US10760809B2 (en) | 2015-09-11 | 2020-09-01 | Johnson Controls Technology Company | Thermostat with mode settings for multiple zones |
SG10201507834SA (en) * | 2015-09-21 | 2017-04-27 | Yokogawa Electric Corp | Mobile based on collaborative and interactive operations with smart mobile devices |
US9643619B2 (en) * | 2015-09-21 | 2017-05-09 | Honda Motor Co., Ltd. | System and method for applying vehicle settings in a vehicle |
US10448453B2 (en) * | 2015-09-25 | 2019-10-15 | Intel Corporation | Virtual sensor system |
CN113093917A (en) * | 2015-09-28 | 2021-07-09 | 微软技术许可有限责任公司 | Unified virtual reality platform |
US10992625B2 (en) | 2015-09-28 | 2021-04-27 | Microsoft Technology Licensing, Llc | Unified messaging platform |
WO2017059210A1 (en) * | 2015-09-30 | 2017-04-06 | Cooper Technologies Company | Electrical devices with camera sensors |
US11436911B2 (en) | 2015-09-30 | 2022-09-06 | Johnson Controls Tyco IP Holdings LLP | Sensor based system and method for premises safety and operational profiling based on drift analysis |
MX2018004053A (en) * | 2015-10-02 | 2018-12-17 | Delta Energy & Communications Inc | Supplemental and alternative digital data delivery and receipt mesh network realized through the placement of enhanced transformer mounted monitoring devices. |
US10179568B2 (en) * | 2015-10-13 | 2019-01-15 | Faraday & Future Inc. | Seamless vehicle access system |
WO2017070648A1 (en) | 2015-10-22 | 2017-04-27 | Delta Energy & Communications, Inc. | Augmentation, expansion and self-healing of a geographically distributed mesh network using unmanned aerial vehicle technology |
WO2017070646A1 (en) | 2015-10-22 | 2017-04-27 | Delta Energy & Communications, Inc. | Data transfer facilitation across a distributed mesh network using light and optical based technology |
CN106612253B (en) * | 2015-10-23 | 2019-10-22 | 中国科学院声学研究所 | Linkage control power management device and method |
US10222276B2 (en) * | 2015-10-28 | 2019-03-05 | Sk Planet Co., Ltd. | System and method for controlling temperature of user |
US11277893B2 (en) | 2015-10-28 | 2022-03-15 | Johnson Controls Technology Company | Thermostat with area light system and occupancy sensor |
US10655881B2 (en) | 2015-10-28 | 2020-05-19 | Johnson Controls Technology Company | Thermostat with halo light system and emergency directions |
US10180673B2 (en) | 2015-10-28 | 2019-01-15 | Johnson Controls Technology Company | Multi-function thermostat with emergency direction features |
US10230706B2 (en) * | 2015-10-28 | 2019-03-12 | Lenovo Enterprise Solutions (Singapore) Pte. Ltd. | Using personal RF signature for enhanced authentication metric |
US10546472B2 (en) | 2015-10-28 | 2020-01-28 | Johnson Controls Technology Company | Thermostat with direction handoff features |
US10057110B2 (en) | 2015-11-06 | 2018-08-21 | Honeywell International Inc. | Site management system with dynamic site threat level based on geo-location data |
US10795900B1 (en) | 2015-11-11 | 2020-10-06 | Twitter, Inc. | Real time analyses using common features |
US9628951B1 (en) | 2015-11-11 | 2017-04-18 | Honeywell International Inc. | Methods and systems for performing geofencing with reduced power consumption |
US10516965B2 (en) | 2015-11-11 | 2019-12-24 | Ademco Inc. | HVAC control using geofencing |
US10673646B1 (en) * | 2018-12-09 | 2020-06-02 | Olibra Llc | System, device, and method of multi-path wireless communication |
US10168682B1 (en) | 2015-11-20 | 2019-01-01 | Wellhead Power Solutions, Llc | System and method for managing load-modifying demand response of energy consumption |
US10318266B2 (en) | 2015-11-25 | 2019-06-11 | Johnson Controls Technology Company | Modular multi-function thermostat |
FR3044848B1 (en) * | 2015-12-03 | 2019-08-23 | Overkiz | Method for configuring, controlling or supervising a home automation installation |
US10275775B2 (en) | 2015-12-10 | 2019-04-30 | Microsoft Technology Licensing, Llc | Context generation for routing on-demand services |
US10223174B2 (en) * | 2015-12-10 | 2019-03-05 | Microsoft Technology Licensing, Llc | Tenant engagement signal acquisition and exposure |
US9686406B1 (en) | 2015-12-10 | 2017-06-20 | Microsoft Technology Licensing, Llc | Issue detection for routing assistance requests |
DE102015225792B3 (en) * | 2015-12-17 | 2017-04-13 | Volkswagen Aktiengesellschaft | A method and system for secure communication between a mobile device coupled to a smartphone and a server |
CA3010340C (en) | 2015-12-31 | 2021-06-15 | Delta Faucet Company | Water sensor |
US9985947B1 (en) * | 2015-12-31 | 2018-05-29 | Quirklogic, Inc. | Method and system for communication of devices using dynamic routes encoded in security tokens and a dynamic optical label |
US11030902B2 (en) | 2016-01-05 | 2021-06-08 | Locix, Inc. | Systems and methods for using radio frequency signals and sensors to monitor environments |
US10504364B2 (en) | 2016-01-05 | 2019-12-10 | Locix, Inc. | Systems and methods for using radio frequency signals and sensors to monitor environments |
US10156852B2 (en) * | 2016-01-05 | 2018-12-18 | Locix, Inc. | Systems and methods for using radio frequency signals and sensors to monitor environments |
US10127749B2 (en) * | 2016-01-11 | 2018-11-13 | Ford Global Technologies, Llc | System and method for profile indication on a key fob |
US9955296B2 (en) * | 2016-01-13 | 2018-04-24 | Edwin Mcauley Electronics Ltd. | Wireless controlled thermostat with reduced polling communications during predicted periods of low activity to save power |
TWI584232B (en) * | 2016-01-19 | 2017-05-21 | 台達電子工業股份有限公司 | Area abnormality detecting system and area abnormality detecting method |
CN105527952A (en) * | 2016-01-20 | 2016-04-27 | 宁波六脉神剑软件科技有限公司 | Smart home control device and electric appliance matching system and method |
US11720085B2 (en) | 2016-01-22 | 2023-08-08 | Hayward Industries, Inc. | Systems and methods for providing network connectivity and remote monitoring, optimization, and control of pool/spa equipment |
US10382424B2 (en) * | 2016-01-26 | 2019-08-13 | Redhat, Inc. | Secret store for OAuth offline tokens |
FR3047374B1 (en) | 2016-01-28 | 2018-07-27 | Overkiz | Method for configuring, controlling or supervising a home automation installation |
US20170219243A1 (en) * | 2016-02-02 | 2017-08-03 | T.A. Industries, Inc. | Hvac register grille with sensor-activated light |
US9848061B1 (en) | 2016-10-28 | 2017-12-19 | Vignet Incorporated | System and method for rules engine that dynamically adapts application behavior |
US10605472B2 (en) | 2016-02-19 | 2020-03-31 | Ademco Inc. | Multiple adaptive geo-fences for a building |
US10567184B2 (en) | 2016-02-19 | 2020-02-18 | Vertigo Media, Inc. | System and method for group stream broadcasting with stateless queuing feature |
WO2017147476A1 (en) | 2016-02-24 | 2017-08-31 | Delta Energy & Communications, Inc. | Distributed 802.11s mesh network using transformer module hardware for the capture and transmission of data |
CN109475242B (en) * | 2016-02-24 | 2022-03-04 | 斯马特斯纳格Ip有限公司 | Sleeping bag for babies and children |
US9868391B1 (en) * | 2016-02-26 | 2018-01-16 | Waymo Llc | Scenario based audible warnings for autonomous vehicles |
JP6503148B1 (en) | 2016-02-26 | 2019-04-17 | Amazon Technologies, Inc. | Cross-referencing of applications related to sharing of video images from audio/video recording and communication devices |
US10841542B2 (en) | 2016-02-26 | 2020-11-17 | A9.Com, Inc. | Locating a person of interest using shared video footage from audio/video recording and communication devices |
US10397528B2 (en) * | 2016-02-26 | 2019-08-27 | Amazon Technologies, Inc. | Providing status information for secondary devices with video footage from audio/video recording and communication devices |
US10748414B2 (en) | 2016-02-26 | 2020-08-18 | A9.Com, Inc. | Augmenting and sharing data from audio/video recording and communication devices |
US11393108B1 (en) | 2016-02-26 | 2022-07-19 | Amazon Technologies, Inc. | Neighborhood alert mode for triggering multi-device recording, multi-camera locating, and multi-camera event stitching for audio/video recording and communication devices |
JP6597418B2 (en) * | 2016-03-09 | 2019-10-30 | 富士通株式会社 | Radio beacon device and method for controlling radio beacon device |
JP2017162273A (en) * | 2016-03-10 | 2017-09-14 | オムロン株式会社 | Sensor network inter-device cooperation apparatus, inter-device cooperation method, inter-device cooperation program, and recording medium storing program |
US9924021B2 (en) * | 2016-03-11 | 2018-03-20 | Distech Controls Inc. | Environment controllers capable of controlling a plurality of smart light fixtures |
US10593177B2 (en) * | 2016-03-16 | 2020-03-17 | Sensormatic Electronics, LLC | Method and apparatus for tiered analytics in a multi-sensor environment |
CN107229965B (en) * | 2016-03-25 | 2021-10-22 | 陕西微阅信息技术有限公司 | Anthropomorphic system of intelligent robot and method for simulating forgetting effect |
US9772613B1 (en) * | 2016-03-28 | 2017-09-26 | David Webster | System and method for setting moods and experiences through use of a mobile device |
US10326609B1 (en) * | 2016-03-28 | 2019-06-18 | Sanjay Patel | System and method for automatic association coordinator module pre-configuration |
US9781602B1 (en) * | 2016-03-31 | 2017-10-03 | Ca, Inc. | Geographically based access management for internet of things device data |
US20170284690A1 (en) * | 2016-04-01 | 2017-10-05 | Softarex Technologies, Inc. | Mobile environment monitoring system |
EP3439451B1 (en) * | 2016-04-04 | 2021-09-29 | Freight Farms, Inc. | Modular farm control and monitoring system |
US11240215B2 (en) * | 2016-04-11 | 2022-02-01 | Avaya Inc. | Temporary control of components using location based grants |
US20170295058A1 (en) * | 2016-04-12 | 2017-10-12 | Johnson Controls Technology Company | Devices and methods for network integration of an hvac device |
US10047971B2 (en) * | 2016-04-15 | 2018-08-14 | Ametros Solutions LLC | Home automation system |
US10586023B2 (en) | 2016-04-21 | 2020-03-10 | Time Warner Cable Enterprises Llc | Methods and apparatus for secondary content management and fraud prevention |
US11153310B2 (en) * | 2016-04-21 | 2021-10-19 | Signify Holding B.V. | Systems and methods for registering and localizing building servers for cloud-based monitoring and control of physical environments |
US10552914B2 (en) | 2016-05-05 | 2020-02-04 | Sensormatic Electronics, LLC | Method and apparatus for evaluating risk based on sensor monitoring |
DE112016006863T5 (en) * | 2016-05-16 | 2019-02-14 | Mitsubishi Electric Corporation | Management device and air conditioning |
JP6580520B2 (en) * | 2016-05-26 | 2019-09-25 | トヨタホーム株式会社 | Equipment control system |
US10911255B2 (en) * | 2016-05-31 | 2021-02-02 | Honeywell International Inc. | Devices, methods, and systems for hands free facility status alerts |
CN107450899B (en) * | 2016-06-01 | 2022-04-26 | 深圳市信锐网科技术有限公司 | Method and device for generating terminal control script |
US9921726B1 (en) | 2016-06-03 | 2018-03-20 | Steelcase Inc. | Smart workstation method and system |
US9846577B1 (en) | 2016-06-03 | 2017-12-19 | Afero, Inc. | Integrated development tool with preview functionality for an internet of things (IoT) system |
US9841968B1 (en) | 2016-06-03 | 2017-12-12 | Afero, Inc. | Integrated development tool with preview functionality for an internet of things (IoT) system |
WO2017210120A1 (en) * | 2016-06-03 | 2017-12-07 | Afero, Inc. | Integrated development tool with preview functionality for an internet of things (iot) system |
WO2017213808A1 (en) | 2016-06-11 | 2017-12-14 | Enlighted, Inc. | Associating information with an asset or a physical space |
US10558228B1 (en) | 2016-06-17 | 2020-02-11 | United Services Automobile Association (Usaa) | Flow monitoring device and system |
US10453149B1 (en) * | 2016-06-23 | 2019-10-22 | State Farm Mutual Automobile Insurance Company | Systems and methods for analyzing property telematics data to update risk-based coverage of a property |
US10021648B2 (en) * | 2016-06-29 | 2018-07-10 | Verizon Patent And Licensing Inc. | Wireless device transfer to a power saving mode |
US11935020B1 (en) | 2016-07-01 | 2024-03-19 | Wells Fargo Bank, N.A. | Control tower for prospective transactions |
US11157641B2 (en) * | 2016-07-01 | 2021-10-26 | Microsoft Technology Licensing, Llc | Short-circuit data access |
US11386223B1 (en) | 2016-07-01 | 2022-07-12 | Wells Fargo Bank, N.A. | Access control tower |
US11886611B1 (en) | 2016-07-01 | 2024-01-30 | Wells Fargo Bank, N.A. | Control tower for virtual rewards currency |
EP3481671A4 (en) * | 2016-07-06 | 2020-08-12 | Ford Global Technologies, LLC | Information sharing and user experience enhancement by context-aware vehicles |
US10455350B2 (en) | 2016-07-10 | 2019-10-22 | ZaiNar, Inc. | Method and system for radiolocation asset tracking via a mesh network |
US10210356B2 (en) * | 2016-07-21 | 2019-02-19 | Nippon Sysits Co. Ltd. | Multi signal diffusion integrated system and method |
US10941951B2 (en) | 2016-07-27 | 2021-03-09 | Johnson Controls Technology Company | Systems and methods for temperature and humidity control |
US10834586B2 (en) * | 2016-07-29 | 2020-11-10 | Amzetta Technologies, Llc | System and method for controlling heterogeneous internet of things (IoT) devices using single application |
US10248189B2 (en) * | 2016-07-29 | 2019-04-02 | Lenovo (Singapore) Pte. Ltd. | Presentation of virtual reality object based on one or more conditions |
US10140593B2 (en) * | 2016-07-29 | 2018-11-27 | International Business Machines Corporation | System, method and recording medium for doorbell control based on doorbell data and calendar data |
US10547469B2 (en) * | 2016-07-29 | 2020-01-28 | International Business Machines Corporation | System, method, and recording medium for adjusting ambience of a room |
US10957446B2 (en) | 2016-08-08 | 2021-03-23 | Johnson & Johnson Surgical Vision, Inc. | System and method for providing a genericized medical device architecture |
CA3033155A1 (en) * | 2016-08-08 | 2018-02-15 | Johnson & Johnson Surgical Vision, Inc. | System and method for providing a genericized medical device architecture |
US10652633B2 (en) | 2016-08-15 | 2020-05-12 | Delta Energy & Communications, Inc. | Integrated solutions of Internet of Things and smart grid network pertaining to communication, data and asset serialization, and data modeling algorithms |
CN106357485B (en) * | 2016-08-16 | 2019-11-15 | 北京小米移动软件有限公司 | Method and device for marking a device |
US11259165B2 (en) | 2016-08-26 | 2022-02-22 | Intrinsic Value, Llc | Systems, devices, and methods for emergency responses and safety |
US11055670B1 (en) * | 2016-08-26 | 2021-07-06 | United Services Automobile Association (Usaa) | Systems and methods for generating a travel smartlist |
EP3504693B1 (en) | 2016-08-26 | 2021-07-28 | Intrinsic Value, LLC | Systems, devices, and methods for emergency responses and safety |
US10506413B2 (en) | 2017-08-28 | 2019-12-10 | Intrinsic Value, Llc | Systems, devices, and methods for emergency responses and safety |
WO2018049244A1 (en) | 2016-09-09 | 2018-03-15 | Carrier Corporation | System and method for operating a hvac system by determining occupied state of a structure via ip address |
US10650621B1 (en) | 2016-09-13 | 2020-05-12 | Iocurrents, Inc. | Interfacing with a vehicular controller area network |
US10484201B2 (en) | 2016-09-28 | 2019-11-19 | Samsung Electronics Co., Ltd. | Distributed platform for robust execution of smart home applications |
CN107395910B (en) * | 2016-09-29 | 2020-09-01 | 维沃移动通信有限公司 | Incoming call notification method and mobile terminal |
DE102016219134B4 (en) | 2016-10-04 | 2024-05-16 | Volkswagen Aktiengesellschaft | Method for accessing an external electronic device |
US11171905B1 (en) * | 2016-10-17 | 2021-11-09 | Open Invention Network Llc | Request and delivery of additional data |
AT519289B1 (en) * | 2016-10-17 | 2018-08-15 | Wolfinger Gerd | Security device for burglary prevention |
US9910433B1 (en) * | 2016-10-17 | 2018-03-06 | General Electric Company | System for remotely operating a vehicle system |
US10520210B2 (en) | 2016-10-31 | 2019-12-31 | Johnson Controls Technology Company | Building automation systems for online, offline, and hybrid licensing of distributed edge devices |
CN106371343A (en) * | 2016-11-09 | 2017-02-01 | 北京奇虎科技有限公司 | Household electrical appliance-based control method and electronic device |
US10539711B2 (en) * | 2016-11-10 | 2020-01-21 | Z Image, Llc | Laser beam detector including a light source for use in a laser attraction |
US10194276B1 (en) * | 2017-09-24 | 2019-01-29 | WiSilica Inc. | Integrating building automation with location awareness utilizing wireless mesh technology |
US10452145B2 (en) * | 2016-11-14 | 2019-10-22 | Immersion Corporation | Systems and methods for haptically-enhanced smart home architectures |
US11272481B2 (en) * | 2016-11-14 | 2022-03-08 | Google Llc | Distributed resource model |
US10915095B2 (en) * | 2016-11-16 | 2021-02-09 | Brentt Blakkan | Systems and methods for dynamic groups in control systems |
US10945080B2 (en) | 2016-11-18 | 2021-03-09 | Stages Llc | Audio analysis and processing system |
US11662748B2 (en) | 2016-11-22 | 2023-05-30 | Wint Wi Ltd | Appliance based tariff |
CN108089450B (en) * | 2016-11-23 | 2021-04-27 | 阿里巴巴集团控股有限公司 | Intelligent building control method, device and system |
US10756919B1 (en) * | 2016-11-28 | 2020-08-25 | Alarm.Com Incorporated | Connected automation controls using robotic devices |
GB2571043A (en) * | 2016-11-29 | 2019-08-14 | Walmart Apollo Llc | Virtual representation of activity within an environment |
US10264213B1 (en) | 2016-12-15 | 2019-04-16 | Steelcase Inc. | Content amplification system and method |
US20190207946A1 (en) * | 2016-12-20 | 2019-07-04 | Google Inc. | Conditional provision of access by interactive assistant modules |
CN108241300B (en) | 2016-12-26 | 2023-05-02 | 开利公司 | Device control for predetermined spatial regions |
US9922478B1 (en) * | 2016-12-28 | 2018-03-20 | Nortek Security & Control Llc | Pressing device for garage door controller |
FR3061399B1 (en) | 2016-12-28 | 2023-04-21 | Overkiz | Method for configuring access, control and remote supervision of at least one home automation device belonging to a home automation installation |
FR3061390B1 (en) | 2016-12-28 | 2022-12-16 | Overkiz | Method for configuring, controlling or supervising a home automation installation |
FR3061400A1 (en) * | 2016-12-28 | 2018-06-29 | Overkiz | Method for configuring access, control and remote supervision of at least one home automation device belonging to a home automation installation |
US10360746B1 (en) * | 2016-12-30 | 2019-07-23 | Alarm.Com Incorporated | Controlled indoor access using smart indoor door knobs |
US10365932B2 (en) * | 2017-01-23 | 2019-07-30 | Essential Products, Inc. | Dynamic application customization for automated environments |
US9747083B1 (en) | 2017-01-23 | 2017-08-29 | Essential Products, Inc. | Home device application programming interface |
US10163284B2 (en) | 2017-02-03 | 2018-12-25 | Gto Access Systems, Llc | Method and system for controlling a movable barrier |
US10812605B2 (en) | 2017-02-10 | 2020-10-20 | General Electric Company | Message queue-based systems and methods for establishing data communications with industrial machines in multiple locations |
US11066813B2 (en) * | 2017-02-15 | 2021-07-20 | Saya Life, Inc. | Water management, metering, leak detection, water analytics and remote shutoff system |
US11530531B2 (en) | 2017-02-15 | 2022-12-20 | Saya Life, Inc. | Water management, metering, leak detection, water analytics and remote shutoff system |
US10356096B2 (en) * | 2017-02-17 | 2019-07-16 | At&T Intellectual Property I, L.P. | Authentication using credentials submitted via a user premises device |
EP3583473A1 (en) * | 2017-02-20 | 2019-12-25 | Lutron Technology Company LLC | Integrating and controlling multiple load control systems |
CN106933111B (en) * | 2017-02-28 | 2020-10-30 | 北京小米移动软件有限公司 | Method and device for controlling equipment |
CN108512881A (en) * | 2017-02-28 | 2018-09-07 | 中兴通讯股份有限公司 | Smart home system |
CN106600890A (en) * | 2017-02-28 | 2017-04-26 | 上海帆煜自动化科技有限公司 | Internet-of-Things-based intelligent household security and protection system |
WO2018161851A1 (en) * | 2017-03-10 | 2018-09-13 | 腾讯科技(深圳)有限公司 | Device control method, storage medium, and computer device |
CN108168034B (en) * | 2017-03-17 | 2020-02-21 | 青岛海尔空调器有限总公司 | Air conditioner control method |
CN106842974A (en) * | 2017-03-22 | 2017-06-13 | 深圳市实益达智能技术有限公司 | Method for realizing distance-based remote automatic control of equipment |
US10458669B2 (en) | 2017-03-29 | 2019-10-29 | Johnson Controls Technology Company | Thermostat with interactive installation features |
US10624086B2 (en) * | 2017-03-31 | 2020-04-14 | A9.Com, Inc. | Wireless security network and communication methods |
US10393881B2 (en) * | 2017-04-07 | 2019-08-27 | General Motors Llc | Obtaining vehicle positions based on positional trigger events |
CA174062S (en) * | 2017-04-11 | 2019-01-02 | Peak Innovations Inc | Display screen with graphical user interface |
WO2018191510A1 (en) | 2017-04-14 | 2018-10-18 | Johnson Controls Technology Company | Multi-function thermostat with air quality display |
WO2018191688A2 (en) | 2017-04-14 | 2018-10-18 | Johnson Controls Technology Company | Thermostat with exhaust fan control for air quality and humidity control |
US10317102B2 (en) | 2017-04-18 | 2019-06-11 | Ademco Inc. | Geofencing for thermostatic control |
KR102391683B1 (en) * | 2017-04-24 | 2022-04-28 | 엘지전자 주식회사 | An audio device and method for controlling the same |
US11556936B1 (en) | 2017-04-25 | 2023-01-17 | Wells Fargo Bank, N.A. | System and method for card control |
CN107139676B (en) * | 2017-04-26 | 2020-01-10 | 北京小米移动软件有限公司 | Vehicle heat dissipation method and device |
US10673272B2 (en) * | 2017-05-02 | 2020-06-02 | SAW Capital Partners LLC | Energy management system |
US11385609B2 (en) | 2017-05-02 | 2022-07-12 | SAW Capital Partners LLC | Smart electricity monitor and energy management system including same |
US10127227B1 (en) | 2017-05-15 | 2018-11-13 | Google Llc | Providing access to user-controlled resources by automated assistants |
US11436417B2 (en) | 2017-05-15 | 2022-09-06 | Google Llc | Providing access to user-controlled resources by automated assistants |
CN107181802B (en) * | 2017-05-22 | 2020-09-25 | 北京百度网讯科技有限公司 | Intelligent hardware control method and device, server and storage medium |
US10962942B2 (en) | 2017-06-04 | 2021-03-30 | Apple Inc. | Presence triggered notifications and actions |
US20180357870A1 (en) * | 2017-06-07 | 2018-12-13 | Amazon Technologies, Inc. | Behavior-aware security systems and associated methods |
US10983753B2 (en) | 2017-06-09 | 2021-04-20 | International Business Machines Corporation | Cognitive and interactive sensor based smart home solution |
US10096228B1 (en) | 2017-06-14 | 2018-10-09 | At&T Intellectual Property I, L.P. | Smart mobility assistance device |
CN107395467B (en) * | 2017-06-21 | 2021-08-17 | 北京小米移动软件有限公司 | Intelligent home initialization method and device |
US10923104B2 (en) * | 2017-06-30 | 2021-02-16 | Ademco Inc. | Systems and methods for customizing and providing automated voice prompts for text displayed on a security system keypad |
US10449440B2 (en) | 2017-06-30 | 2019-10-22 | Electronic Arts Inc. | Interactive voice-controlled companion application for a video game |
US20190014026A1 (en) * | 2017-07-05 | 2019-01-10 | Ford Global Technologies, Llc | Method and apparatus for ignition state monitoring |
US11062388B1 (en) * | 2017-07-06 | 2021-07-13 | Wells Fargo Bank, N.A. | Data control tower |
US10823443B2 (en) | 2017-07-20 | 2020-11-03 | Carrier Corporation | Self-adaptive smart setback control system |
DE102017006927A1 (en) | 2017-07-20 | 2019-01-24 | Daimler Ag | Communication network |
US11567726B2 (en) * | 2017-07-21 | 2023-01-31 | Google Llc | Methods, systems, and media for providing information relating to detected events |
CN107274623A (en) * | 2017-08-01 | 2017-10-20 | 中消云(北京)物联网科技研究院有限公司 | Internet of Things fire-fighting home cloud detection system |
US10552607B2 (en) * | 2017-08-03 | 2020-02-04 | NexiTech, Inc. | Moving target defenses for data storage devices |
US11134146B2 (en) * | 2017-08-14 | 2021-09-28 | Carrier Corporation | User preference utilization in remote applications |
WO2019033317A1 (en) * | 2017-08-16 | 2019-02-21 | 深圳市启惠智能科技有限公司 | Device control method and system |
GB2565593B (en) * | 2017-08-18 | 2021-03-17 | Centrica Hive Ltd | Automated control method and apparatus |
ES2827790T3 (en) | 2017-08-21 | 2021-05-24 | Carrier Corp | Fire and security system including an addressable loop and automatic firmware upgrade |
DE102017008051A1 (en) * | 2017-08-27 | 2019-02-28 | Tobias Rückert | Method for deactivating control channels and communication system for communication of a user with groups of target devices |
US10355864B2 (en) * | 2017-08-29 | 2019-07-16 | Citrix Systems, Inc. | Policy based authentication |
CA3006893C (en) | 2017-09-07 | 2023-01-10 | The Toronto-Dominion Bank | Digital identity network interface system |
FR3070938B1 (en) * | 2017-09-14 | 2019-08-23 | Psa Automobiles Sa | System for managing an Ethernet network on optical fiber of a vehicle |
US10621317B1 (en) * | 2017-09-14 | 2020-04-14 | Electronic Arts Inc. | Audio-based device authentication system |
US10887125B2 (en) | 2017-09-15 | 2021-01-05 | Kohler Co. | Bathroom speaker |
US11093554B2 (en) | 2017-09-15 | 2021-08-17 | Kohler Co. | Feedback for water consuming appliance |
US11314214B2 (en) | 2017-09-15 | 2022-04-26 | Kohler Co. | Geographic analysis of water conditions |
US11099540B2 (en) * | 2017-09-15 | 2021-08-24 | Kohler Co. | User identity in household appliances |
US10448762B2 (en) | 2017-09-15 | 2019-10-22 | Kohler Co. | Mirror |
US10909825B2 (en) | 2017-09-18 | 2021-02-02 | Skybell Technologies Ip, Llc | Outdoor security systems and methods |
CN107450342A (en) * | 2017-09-20 | 2017-12-08 | 深圳市晟达机械设计有限公司 | Smart home intelligent security and protection system |
US10452695B2 (en) * | 2017-09-22 | 2019-10-22 | Oracle International Corporation | Context-based virtual assistant implementation |
US11064168B1 (en) * | 2017-09-29 | 2021-07-13 | Objectvideo Labs, Llc | Video monitoring by peep hole device |
US11374918B2 (en) * | 2017-09-29 | 2022-06-28 | Interdigital Ce Patent Holdings | Smart gateway enabled low cost smart building solution |
CN108737360B (en) * | 2017-09-29 | 2021-05-14 | 北京猎户星空科技有限公司 | Robot control method, robot control device, robot apparatus, and storage medium |
US11539520B2 (en) | 2017-10-04 | 2022-12-27 | Delphian Systems, LLC | Emergency lockdown in a local network of interconnected devices |
US10467879B2 (en) | 2017-10-19 | 2019-11-05 | Google Llc | Thoughtful elderly monitoring in a smart home environment |
US10620798B2 (en) * | 2017-10-21 | 2020-04-14 | Mordechai Teicher | Autonomously cooperating smart devices |
US10742442B2 (en) * | 2017-10-21 | 2020-08-11 | Mordechai Teicher | Cluster of smart devices operable in hub-based and hub-less modes |
US11736292B2 (en) * | 2017-10-23 | 2023-08-22 | Huawei Technologies Co., Ltd. | Access token management method, terminal, and server |
AU2018356126B2 (en) * | 2017-10-25 | 2021-07-29 | Lg Electronics Inc. | Artificial intelligence moving robot which learns obstacles, and control method therefor |
US10921763B1 (en) * | 2017-10-25 | 2021-02-16 | Alarm.Com Incorporated | Baby monitoring using a home monitoring system |
US10567515B1 (en) | 2017-10-26 | 2020-02-18 | Amazon Technologies, Inc. | Speech processing performed with respect to first and second user profiles in a dialog session |
US10715604B1 (en) * | 2017-10-26 | 2020-07-14 | Amazon Technologies, Inc. | Remote system processing based on a previously identified user |
JP7155508B2 (en) * | 2017-10-26 | 2022-10-19 | 富士フイルムビジネスイノベーション株式会社 | Equipment, management system and program |
US10369927B2 (en) * | 2017-11-05 | 2019-08-06 | Tamika Crawford | Sol: a system for child safety alert |
US10897374B2 (en) | 2017-11-06 | 2021-01-19 | Computime Ltd. | Scalable smart environment for controlling a plurality of controlled apparatuses using a connection hub to route a processed subset of control data received from a cloud computing resource to terminal units |
US11045047B2 (en) | 2017-11-10 | 2021-06-29 | Ron's Enterprises, Inc. | Variable capacity oven |
CN108111374B (en) * | 2017-11-16 | 2019-09-20 | 百度在线网络技术(北京)有限公司 | Method, apparatus, device and computer storage medium for synchronizing device lists |
US20190156589A1 (en) | 2017-11-22 | 2019-05-23 | Bank Of America Corporation | System for communicable integration of an automobile system and a toll system |
US20190163217A1 (en) * | 2017-11-27 | 2019-05-30 | Steven Dushane | Thermostat system |
US10303045B1 (en) | 2017-12-20 | 2019-05-28 | Micron Technology, Inc. | Control of display device for autonomous vehicle |
US10877999B2 (en) | 2017-12-21 | 2020-12-29 | Micron Technology, Inc. | Programmatically identifying a personality of an autonomous vehicle |
US10706085B2 (en) | 2018-01-03 | 2020-07-07 | Oracle International Corporation | Method and system for exposing virtual assistant services across multiple platforms |
KR102385263B1 (en) * | 2018-01-04 | 2022-04-12 | 삼성전자주식회사 | Mobile home robot and controlling method of the mobile home robot |
WO2019140232A1 (en) | 2018-01-12 | 2019-07-18 | Insight Energy Ventures, Llc | Systems and methods of capturing usage data from an amr device |
WO2019143735A1 (en) * | 2018-01-16 | 2019-07-25 | Walter Viveiros | System and method for customized sleep environment management |
CN108243259B (en) * | 2018-02-08 | 2021-04-16 | 北京车和家信息技术有限公司 | Method, device and system for transmitting Internet of vehicles data |
US10955162B2 (en) * | 2018-03-07 | 2021-03-23 | Johnson Controls Technology Company | Portable thermostat systems and methods |
US11131474B2 (en) | 2018-03-09 | 2021-09-28 | Johnson Controls Tyco IP Holdings LLP | Thermostat with user interface features |
US10894545B2 (en) * | 2018-03-14 | 2021-01-19 | Micron Technology, Inc. | Configuration of a vehicle based on collected user data |
US11148658B2 (en) | 2018-03-21 | 2021-10-19 | Micron Technology, Inc. | Personalization of a vehicle based on user settings |
US11284646B2 (en) * | 2018-03-22 | 2022-03-29 | Altria Client Services Llc | Augmented reality and/or virtual reality based e-vaping device vapor simulation systems and methods |
US10935268B2 (en) * | 2018-04-03 | 2021-03-02 | Carrier Corporation | HVAC system controller and method for obtaining environmental data for HVAC system |
US10691423B2 (en) | 2018-04-04 | 2020-06-23 | Johnson Controls Technology Company | Testing systems and methods for performing HVAC zone airflow adjustments |
EP3776103A1 (en) * | 2018-04-09 | 2021-02-17 | Carrier Corporation | Mining and deploying profiles in smart buildings |
US11847241B1 (en) * | 2018-04-20 | 2023-12-19 | Amazon Technologies, Inc. | Management of service permissions |
JP7160092B2 (en) * | 2018-05-01 | 2022-10-25 | ソニーグループ株式会社 | Information processing device, information processing method, program, and autonomous action robot control system |
US10901806B2 (en) | 2018-05-01 | 2021-01-26 | International Business Machines Corporation | Internet of things resource optimization |
US10961774B2 (en) | 2018-05-07 | 2021-03-30 | Johnson Controls Technology Company | Systems and methods for window setting adjustment |
US11694287B2 (en) * | 2018-05-18 | 2023-07-04 | Edst, Llc | Intelligent property management system |
US12045647B2 (en) * | 2018-05-22 | 2024-07-23 | Positec Power Tools (Suzhou) Co., Ltd. | Automatic working system, forwarding device, and task execution device |
CN110529987B (en) * | 2018-05-24 | 2023-05-23 | Carrier Corporation | Biometric air conditioner control system |
CN108768803A (en) * | 2018-05-28 | 2018-11-06 | 北京小米移动软件有限公司 | Method, apparatus and storage medium for adding a smart device |
US10972454B2 (en) * | 2018-05-29 | 2021-04-06 | Apple Inc. | Conversation merging for electronic devices |
US20190377374A1 (en) * | 2018-06-07 | 2019-12-12 | Rcs Technology, Llc | Thermostat relay device |
US10921008B1 (en) * | 2018-06-11 | 2021-02-16 | Braeburn Systems Llc | Indoor comfort control system and method with multi-party access |
US10921807B2 (en) * | 2018-06-18 | 2021-02-16 | Toyota Research Institute, Inc. | Automatic re-energization of vehicles |
US10880114B2 (en) * | 2018-06-27 | 2020-12-29 | Paypal, Inc. | Merchant or third party controlled environmental adjustment devices |
CN112236655B (en) | 2018-06-29 | 2022-05-24 | 金伯利-克拉克环球有限公司 | Toilet use determination system |
US20200014552A1 (en) * | 2018-07-05 | 2020-01-09 | Google Llc | Dynamic Inclusion and Exclusion of Smart-Home Devices |
CN109243547B (en) * | 2018-07-09 | 2021-07-06 | 河海大学 | Quantitative evaluation method for demand response potential of air conditioner load group |
US11481509B1 (en) | 2018-07-10 | 2022-10-25 | United Services Automobile Association (Usaa) | Device management and security through a distributed ledger system |
US11252155B2 (en) * | 2018-07-26 | 2022-02-15 | Comcast Cable Communications, Llc | Systems and methods for on-network device identification |
US10663963B2 (en) * | 2018-08-03 | 2020-05-26 | Here Global B.V. | Method and apparatus for visualizing future events for passengers of autonomous vehicles |
EP3682345B1 (en) | 2018-08-07 | 2021-11-24 | Google LLC | Assembling and evaluating automated assistant responses for privacy concerns |
CN109283858A (en) * | 2018-08-14 | 2019-01-29 | 北京云迹科技有限公司 | Apparatus control method and device based on interface |
WO2020044394A1 (en) * | 2018-08-27 | 2020-03-05 | 三菱電機株式会社 | Control system, air conditioner, and control method |
US11327453B2 (en) * | 2018-09-01 | 2022-05-10 | Honeywell International Inc. | Status indicator for a building controller |
CN109525966B (en) * | 2018-09-07 | 2022-05-06 | 北京小米移动软件有限公司 | Intelligent device query method and device and storage medium |
CN109005093B (en) * | 2018-09-13 | 2020-10-02 | 海南深远腾博科技有限公司 | Intelligent household appliance energy-saving method and system based on remote wireless control |
US11174022B2 (en) * | 2018-09-17 | 2021-11-16 | International Business Machines Corporation | Smart device for personalized temperature control |
US11870862B2 (en) * | 2018-09-17 | 2024-01-09 | Amazon Technologies, Inc. | State prediction of devices |
CN109150898B (en) * | 2018-09-18 | 2021-09-24 | 厦门安胜网络科技有限公司 | Method and apparatus for processing information |
US10552125B1 (en) * | 2018-09-18 | 2020-02-04 | Inductive Automation, LLC | Messaging between components in graphical user interfaces for industrial control systems |
CA3113283A1 (en) * | 2018-09-20 | 2020-03-26 | Zen Ecosystems IP Pty Ltd | Method, system and apparatus for controlling sensing devices of an HVAC system |
CN112789590B (en) * | 2018-10-02 | 2024-09-17 | 松下电器(美国)知识产权公司 | Information providing method, control method of acoustic device, and information processing apparatus |
CN112771557A (en) * | 2018-10-02 | 2021-05-07 | 松下电器(美国)知识产权公司 | Information providing method |
US11316709B2 (en) | 2018-10-08 | 2022-04-26 | Google Llc | Multi-source smart-home device control |
US10985936B2 (en) * | 2018-10-08 | 2021-04-20 | Google Llc | Customized interface based on vocal input |
EP3637203A1 (en) * | 2018-10-10 | 2020-04-15 | Vestel Elektronik Sanayi ve Ticaret A.S. | Building automation system and method of its operation |
US11519621B2 (en) | 2018-10-12 | 2022-12-06 | University Of The Witwatersrand, Johannesburg | Systems, methods, and an apparatus for controlling a sleep environment and waking a sleeping person |
CN109245973A (en) * | 2018-10-16 | 2019-01-18 | 广州益牛科技有限公司 | Smart home system based on blockchain |
US11556120B2 (en) * | 2018-10-29 | 2023-01-17 | Honeywell International Inc. | Systems and methods for monitoring performance of a building management system via log streams |
JP7308453B2 (en) * | 2018-11-08 | 2023-07-14 | パナソニックIpマネジメント株式会社 | Setting communication device, communication system, information terminal control method and program |
KR102448387B1 (en) | 2018-12-03 | 2022-09-28 | Google LLC | Efficient control and/or connection of smart devices |
US10469987B1 (en) * | 2018-12-10 | 2019-11-05 | Honda Motor Co., Ltd. | System and method for providing device subjective vehicle passive functions |
US11221941B2 (en) * | 2018-12-17 | 2022-01-11 | Jpmorgan Chase Bank, N.A. | Systems and methods for universal system-to-system communication management and analysis |
US20220047143A1 (en) * | 2018-12-18 | 2022-02-17 | Trinamix Gmbh | Autonomous household appliance |
US11107390B2 (en) | 2018-12-21 | 2021-08-31 | Johnson Controls Technology Company | Display device with halo |
US11713895B2 (en) | 2019-01-14 | 2023-08-01 | Research Products Corporation | Multi-zone environmental control system |
US11163434B2 (en) | 2019-01-24 | 2021-11-02 | Ademco Inc. | Systems and methods for using augmenting reality to control a connected home system |
JP6777174B2 (en) | 2019-01-31 | 2020-10-28 | 株式会社富士通ゼネラル | Server equipment, adapters and air conditioning systems |
CN111596551A (en) * | 2019-02-20 | 2020-08-28 | 青岛海尔洗衣机有限公司 | Control method of intelligent socket of home system, intelligent socket and home system |
CN111614524A (en) * | 2019-02-26 | 2020-09-01 | 华为技术有限公司 | Multi-intelligent-device linkage control method, device and system |
US10869005B2 (en) | 2019-02-28 | 2020-12-15 | Arlo Technologies, Inc. | Electronic doorbell system with reduced latency |
US11534919B2 (en) * | 2019-03-06 | 2022-12-27 | Ademco Inc. | Security sentinel robot |
CN109887251A (en) * | 2019-03-11 | 2019-06-14 | 泉州市科立信智能科技有限公司 | Electrical fire alarm pushing method |
CN109799727B (en) * | 2019-03-20 | 2024-04-05 | 北京理工大学 | Intelligent household system for remotely controlling curtain and window |
US11268727B2 (en) * | 2019-03-27 | 2022-03-08 | Johnson Controls Technology Company | Selective zone air condition setpoint mode interface systems and methods |
US11102004B2 (en) * | 2019-04-29 | 2021-08-24 | Google Llc | Systems and methods for distributed verification of online identity |
CN111856856B (en) * | 2019-04-29 | 2022-03-08 | 中强光电股份有限公司 | Projection device and heat dissipation control method |
WO2020243681A1 (en) * | 2019-05-31 | 2020-12-03 | Goodman Manufacturing Company, L.P. | Hvac authentication system and method |
US11768978B1 (en) * | 2019-06-12 | 2023-09-26 | United Services Automobile Association (Usaa) | Systems and methods for contextual occupancy simulation |
CN110300447B (en) * | 2019-06-28 | 2021-07-16 | 联想(北京)有限公司 | Control method and device |
CN110377075B (en) * | 2019-07-19 | 2020-11-24 | 重庆工商职业学院 | Indoor intelligent temperature control system |
KR20190099165A (en) * | 2019-08-06 | 2019-08-26 | 엘지전자 주식회사 | Apparatus and method for virtual home service |
US11074790B2 (en) | 2019-08-24 | 2021-07-27 | Skybell Technologies Ip, Llc | Doorbell communication systems and methods |
US11677634B1 (en) * | 2019-09-04 | 2023-06-13 | Amazon Technologies, Inc. | Selecting and deploying models based on sensor availability |
US12101349B2 (en) | 2019-09-16 | 2024-09-24 | The Toronto-Dominion Bank | Systems and methods for detecting changes in data access pattern of third-party applications |
US11275842B2 (en) | 2019-09-20 | 2022-03-15 | The Toronto-Dominion Bank | Systems and methods for evaluating security of third-party applications |
US11436336B2 (en) | 2019-09-23 | 2022-09-06 | The Toronto-Dominion Bank | Systems and methods for evaluating data access signature of third-party applications |
CN110673559A (en) * | 2019-09-29 | 2020-01-10 | 引力(深圳)智能机器人有限公司 | Robot scheduling management system |
GB2587423B (en) * | 2019-09-30 | 2022-03-09 | Centrica Plc | Integration of smart home and vehicle systems |
CN110798462B (en) * | 2019-10-25 | 2021-11-02 | 青岛海信智慧家居系统股份有限公司 | Smart home system and equipment access method |
US11356438B2 (en) * | 2019-11-05 | 2022-06-07 | Microsoft Technology Licensing, Llc | Access management system with a secret isolation manager |
US11631291B2 (en) * | 2019-11-08 | 2023-04-18 | Latch Systems, Inc. | Smart building integration and device hub |
US11385605B2 (en) * | 2019-11-12 | 2022-07-12 | Johnson Controls Tyco IP Holdings LLP | Building control system with features for operating under intermittent connectivity to a cloud computation system |
CN111049711B (en) * | 2019-11-28 | 2022-01-11 | 苏宁智能终端有限公司 | Device control right sharing method and device, computer device and storage medium |
US10848567B1 (en) * | 2019-11-29 | 2020-11-24 | Cygnus, LLC | Remote support for IoT devices |
US11201954B2 (en) * | 2019-11-30 | 2021-12-14 | Verizon Patent And Licensing Inc. | Systems and methods for binary message transformation using custom descriptors |
US20210176319A1 (en) * | 2019-12-06 | 2021-06-10 | Zurn Industries, Llc | Water management system and user interface |
CN111026189B (en) * | 2019-12-10 | 2021-02-26 | 山东科技大学 | Smoking temperature rising device and method, smoking temperature rising detection method, control device and system |
US11115819B2 (en) * | 2019-12-30 | 2021-09-07 | Itron, Inc. | Local authentication of communications device |
US11283901B2 (en) | 2019-12-30 | 2022-03-22 | Sony Corporation | Neural network model based configuration of settings |
CN111025932B (en) * | 2020-01-02 | 2021-01-01 | 重庆特斯联智慧科技股份有限公司 | Multi-house scene sharing method and system based on edge calculation |
CN111277565B (en) * | 2020-01-08 | 2022-04-12 | 北京小米松果电子有限公司 | Information processing method and device, and storage medium |
US11156378B2 (en) * | 2020-01-22 | 2021-10-26 | Johnson Controls Tyco IP Holdings LLP | Personal health monitoring using smart home devices |
CN111308944A (en) * | 2020-04-02 | 2020-06-19 | 深圳创维-Rgb电子有限公司 | Equipment control method and device |
US12118178B1 (en) | 2020-04-08 | 2024-10-15 | Steelcase Inc. | Wayfinding services method and apparatus |
US11656842B2 (en) * | 2020-05-14 | 2023-05-23 | T-Mobile Innovations Llc | Presentation layer for portable electronic assistant |
USD951780S1 (en) | 2020-06-15 | 2022-05-17 | Honeywell International Inc. | Building controller |
US20230194114A1 (en) * | 2020-06-29 | 2023-06-22 | Mitsubishi Electric Corporation | Air conditioner, control device, air conditioning system, and air conditioning method |
US11642977B2 (en) * | 2020-07-09 | 2023-05-09 | Weave Grid, Inc. | Optimized charging of electric vehicles over distribution grid |
CN114024948B (en) | 2020-07-17 | 2024-05-28 | 群光电能科技股份有限公司 | Intelligent Building Integrated Management System |
CN111884812B (en) * | 2020-07-24 | 2022-07-29 | 四川阵风科技有限公司 | Binding method and system of hardware equipment |
US11108865B1 (en) | 2020-07-27 | 2021-08-31 | Zurn Industries, Llc | Battery powered end point device for IoT applications |
US11984739B1 (en) | 2020-07-31 | 2024-05-14 | Steelcase Inc. | Remote power systems, apparatus and methods |
US11636870B2 (en) | 2020-08-20 | 2023-04-25 | Denso International America, Inc. | Smoking cessation systems and methods |
US11828210B2 (en) | 2020-08-20 | 2023-11-28 | Denso International America, Inc. | Diagnostic systems and methods of vehicles using olfaction |
US11881093B2 (en) | 2020-08-20 | 2024-01-23 | Denso International America, Inc. | Systems and methods for identifying smoking in vehicles |
US11813926B2 (en) | 2020-08-20 | 2023-11-14 | Denso International America, Inc. | Binding agent and olfaction sensor |
US12017506B2 (en) | 2020-08-20 | 2024-06-25 | Denso International America, Inc. | Passenger cabin air control systems and methods |
US11760169B2 (en) | 2020-08-20 | 2023-09-19 | Denso International America, Inc. | Particulate control systems and methods for olfaction sensors |
US11932080B2 (en) | 2020-08-20 | 2024-03-19 | Denso International America, Inc. | Diagnostic and recirculation control systems and methods |
US11760170B2 (en) | 2020-08-20 | 2023-09-19 | Denso International America, Inc. | Olfaction sensor preservation systems and methods |
CN111948954B (en) * | 2020-09-08 | 2021-07-30 | 兰州工业学院 | Intelligent home system based on internet control |
CN114531909A (en) * | 2020-09-17 | 2022-05-24 | 松下知识产权经营株式会社 | Information processing apparatus and information processing method |
US11556670B2 (en) | 2020-09-24 | 2023-01-17 | AO Kaspersky Lab | System and method of granting access to data of a user |
US11176825B1 (en) * | 2020-11-17 | 2021-11-16 | Ford Global Technologies, Llc | Systems and methods for vehicle backup warning notification |
CN112614302B (en) * | 2020-12-03 | 2022-10-04 | 杭州海康微影传感科技有限公司 | Fire detection method, device and system and electronic equipment |
US11575751B2 (en) * | 2020-12-14 | 2023-02-07 | International Business Machines Corporation | Dynamic creation of sensor area networks based on geofenced IoT devices |
US11546338B1 (en) | 2021-01-05 | 2023-01-03 | Wells Fargo Bank, N.A. | Digital account controls portal and protocols for federated and non-federated systems and devices |
US20220221178A1 (en) * | 2021-01-12 | 2022-07-14 | Lennox Industries Inc. | Heating, ventilation, and air conditioning system control using adaptive occupancy scheduling |
EP4044552A1 (en) * | 2021-02-15 | 2022-08-17 | Inter IKEA Systems B.V. | System and method for authorizing access to smart devices in a local environment |
US11477285B2 (en) | 2021-03-09 | 2022-10-18 | International Business Machines Corporation | Contextual device command resolution |
CN113055942B (en) * | 2021-03-10 | 2022-04-05 | 重庆邮电大学 | Method for data aggregation in 6tisch network |
US11853100B2 (en) * | 2021-04-12 | 2023-12-26 | EMC IP Holding Company LLC | Automated delivery of cloud native application updates using one or more user-connection gateways |
US11875664B2 (en) | 2021-06-04 | 2024-01-16 | Smart Cellular Labs, Llc | Integrated smoke alarm communications system |
US11785012B2 (en) | 2021-06-07 | 2023-10-10 | Bank Of America Corporation | Data processing for internet of things (IoT) devices based on recorded user behavior |
US11831688B2 (en) * | 2021-06-18 | 2023-11-28 | Capital One Services, Llc | Systems and methods for network security |
US12055909B2 (en) * | 2021-07-02 | 2024-08-06 | Whirlpool Corporation | Night cycle algorithm for a laundry appliance to minimize operational noise |
US20230071307A1 (en) * | 2021-09-03 | 2023-03-09 | Salesforce.Com, Inc. | Managing database quotas with a scalable technique |
US11595324B1 (en) * | 2021-10-01 | 2023-02-28 | Bank Of America Corporation | System for automated cross-network monitoring of computing hardware and software resources |
US20230107818A1 (en) * | 2021-10-04 | 2023-04-06 | Sendal, Inc. | Home automation platform |
DE102021127184A1 (en) | 2021-10-20 | 2023-04-20 | Bayerische Motoren Werke Aktiengesellschaft | Method for operating a vehicle system for a motor vehicle and a vehicle system |
DE102021212231A1 (en) | 2021-10-29 | 2023-05-04 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for operating a vehicle with an interior monitoring device |
US11555734B1 (en) | 2022-02-18 | 2023-01-17 | Zurn Industries, Llc | Smart and cloud connected detection mechanism and real-time internet of things (IoT) system management |
US11907229B2 (en) * | 2022-03-31 | 2024-02-20 | Gm Cruise Holdings Llc | System and method for platform-independent access bindings |
WO2023209087A1 (en) * | 2022-04-28 | 2023-11-02 | Inter Ikea Systems B.V. | System and method for authorizing access to smart devices in a local environment |
US11936491B2 (en) * | 2022-04-29 | 2024-03-19 | Haier Us Appliance Solutions, Inc. | Methods of coordinating engagement with a laundry appliance |
US12079616B2 (en) | 2022-06-01 | 2024-09-03 | Toyota Motor Engineering & Manufacturing North America, Inc. | Real-time modifications for vehicles |
US20230418457A1 (en) * | 2022-06-24 | 2023-12-28 | Trent FARRER | Central control hub for entertainment system |
WO2024005708A1 (en) * | 2022-06-28 | 2024-01-04 | Grabtaxi Holdings Pte. Ltd. | Service request estimated time generation system and method |
US11747788B1 (en) | 2022-08-31 | 2023-09-05 | Enconnex LLC | Rack-mount computing equipment with presence sensor |
DE102023103213A1 (en) | 2023-02-09 | 2024-08-14 | Alfred Kärcher SE & Co. KG | Method for logging in a functional device as well as functional device and communication device and uses for this and hereof |
US20240302063A1 (en) * | 2023-03-06 | 2024-09-12 | HG Home Guardian Inc. | Centralized home automation system and method of use thereof |
WO2024188431A1 (en) * | 2023-03-10 | 2024-09-19 | Bdr Thermea Group B.V. | Dynamic control of an energy management system |
Family Cites Families (98)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6094600A (en) | 1996-02-06 | 2000-07-25 | Fisher-Rosemount Systems, Inc. | System and method for managing a transaction database of records of changes to field device configurations |
US20020113555A1 (en) * | 1997-08-26 | 2002-08-22 | Color Kinetics, Inc. | Lighting entertainment system |
US6179213B1 (en) * | 1999-02-09 | 2001-01-30 | Energy Rest, Inc. | Universal accessory for timing and cycling heat, ventilation and air conditioning energy consumption and distribution systems |
US20030041107A1 (en) | 1999-07-22 | 2003-02-27 | Douglas O. Blattner | Method and apparatus for community network communication |
EP1075108A1 (en) | 1999-07-23 | 2001-02-07 | BRITISH TELECOMMUNICATIONS public limited company | Cryptographic data distribution |
US6993658B1 (en) | 2000-03-06 | 2006-01-31 | April System Design Ab | Use of personal communication devices for user authentication |
US20010034754A1 (en) | 2000-03-17 | 2001-10-25 | Elwahab Amgad Mazen | Device, system and method for providing web browser access and control of devices on customer premise gateways |
US20020023258A1 (en) | 2000-06-27 | 2002-02-21 | Elwahab Amgad Mazen | System and method for managing telecommunications devices |
US20030208527A1 (en) * | 2001-07-20 | 2003-11-06 | Lino Lglesais | Method for smart device network application infrastructure (SDNA) |
US20030177012A1 (en) | 2002-03-13 | 2003-09-18 | Brett Drennan | Voice activated thermostat |
US20040019900A1 (en) * | 2002-07-23 | 2004-01-29 | Philip Knightbridge | Integration platform for interactive communications and management of video on demand services |
US8220018B2 (en) | 2002-09-19 | 2012-07-10 | Tvworks, Llc | System and method for preferred placement programming of iTV content |
US20040139120A1 (en) | 2002-11-08 | 2004-07-15 | Matt Clark | Feature-based solutions provisioning of data services |
US20040137891A1 (en) * | 2002-11-08 | 2004-07-15 | Matt Clark | Application packaging and branding in a feature/service/solution client-service delivery environment |
US20040138961A1 (en) * | 2002-11-08 | 2004-07-15 | Matt Clark | Service-vendor request processing for data service processing |
US20040139119A1 (en) * | 2002-11-08 | 2004-07-15 | Matt Clark | Feature/concept based local data service request formulation for client-server data services |
US20040142683A1 (en) | 2002-11-08 | 2004-07-22 | Matt Clark | Programming interface layer of a service provider for data service delivery |
US7600234B2 (en) * | 2002-12-10 | 2009-10-06 | Fisher-Rosemount Systems, Inc. | Method for launching applications |
US20070043478A1 (en) * | 2003-07-28 | 2007-02-22 | Ehlers Gregory A | System and method of controlling an HVAC system |
FR2861245B1 (en) | 2003-10-16 | 2006-05-05 | Canon Europa Nv | VIDEO SURVEILLANCE METHOD, DEVICE, SYSTEM AND CORRESPONDING COMPUTER PROGRAM |
US20050125083A1 (en) | 2003-11-10 | 2005-06-09 | Kiko Frederick J. | Automation apparatus and methods |
US7225054B2 (en) * | 2003-12-02 | 2007-05-29 | Honeywell International Inc. | Controller with programmable service event display mode |
CN1627765B (en) | 2003-12-10 | 2010-09-01 | 松下电器产业株式会社 | Portable information terminal device |
WO2005109907A2 (en) * | 2004-04-30 | 2005-11-17 | Vulcan Inc. | Maintaining a graphical user interface state that is based on a selected time |
US7509330B2 (en) | 2004-09-03 | 2009-03-24 | Crossroads Systems, Inc. | Application-layer monitoring of communication between one or more database clients and one or more database servers |
US7752598B2 (en) | 2005-05-13 | 2010-07-06 | International Business Machines Corporation | Generating executable objects implementing methods for an information model |
WO2007052285A2 (en) | 2005-07-22 | 2007-05-10 | Yogesh Chunilal Rathod | Universal knowledge management and desktop search system |
KR100737521B1 (en) | 2005-11-24 | 2007-07-10 | 한국전자통신연구원 | Method and system for collecting and restoring application states |
WO2007127933A2 (en) | 2006-04-27 | 2007-11-08 | Traq Wireless, Inc. | Provisioning a user device for multiple services |
US9208679B2 (en) | 2006-09-05 | 2015-12-08 | Universal Electronics Inc. | System and method for configuring the remote control functionality of a portable device |
US20140337879A1 (en) * | 2006-09-05 | 2014-11-13 | Universal Electronics Inc. | System and method for configuring the remote control functionality of a portable device |
US8812629B2 (en) | 2008-04-18 | 2014-08-19 | Universal Electronics Inc. | System and method for configuring the remote control functionality of a portable device |
US8001474B2 (en) | 2006-09-25 | 2011-08-16 | Embarq Holdings Company, Llc | System and method for creating and distributing asynchronous bi-directional channel based multimedia content |
US7747293B2 (en) | 2006-10-17 | 2010-06-29 | Marvell World Trade Ltd. | Display control for cellular phone |
WO2008085205A2 (en) * | 2006-12-29 | 2008-07-17 | Prodea Systems, Inc. | System and method for providing network support services and premises gateway support infrastructure |
US8181206B2 (en) | 2007-02-28 | 2012-05-15 | Time Warner Cable Inc. | Personal content server apparatus and methods |
US7904209B2 (en) | 2007-03-01 | 2011-03-08 | Syracuse University | Open web services-based indoor climate control system |
US7996204B2 (en) * | 2007-04-23 | 2011-08-09 | Microsoft Corporation | Simulation using resource models |
US8413204B2 (en) | 2008-03-31 | 2013-04-02 | At&T Intellectual Property I, Lp | System and method of interacting with home automation systems via a set-top box device |
US8751612B2 (en) * | 2008-11-21 | 2014-06-10 | Microsoft Corporation | Creating cross-technology configuration settings |
US8676942B2 (en) * | 2008-11-21 | 2014-03-18 | Microsoft Corporation | Common configuration application programming interface |
US8615570B2 (en) | 2008-11-21 | 2013-12-24 | Microsoft Corporation | Unified storage for configuring multiple networking technologies |
US20100127854A1 (en) * | 2008-11-21 | 2010-05-27 | Richard Eric Helvick | Method and system for controlling home appliances based on estimated time of arrival |
US8683046B2 (en) | 2008-11-21 | 2014-03-25 | Microsoft Corporation | Unified interface for configuring multiple networking technologies |
US8924707B2 (en) | 2009-04-28 | 2014-12-30 | Hewlett-Packard Development Company, L.P. | Communicating confidential information between an application and a database |
WO2010135372A1 (en) * | 2009-05-18 | 2010-11-25 | Alarm.Com Incorporated | Remote device control and energy monitoring |
KR101768186B1 (en) * | 2009-07-20 | 2017-09-05 | 삼성전자주식회사 | Energy management system and method |
US20110046805A1 (en) * | 2009-08-18 | 2011-02-24 | Honeywell International Inc. | Context-aware smart home energy manager |
US9838255B2 (en) * | 2009-08-21 | 2017-12-05 | Samsung Electronics Co., Ltd. | Mobile demand response energy management system with proximity control |
US8838260B2 (en) | 2009-10-07 | 2014-09-16 | Sony Corporation | Animal-machine audio interaction system |
WO2011063187A2 (en) | 2009-11-19 | 2011-05-26 | Atellis, Inc. | Apparatus, method and computer readable medium for simulation integration |
US10133485B2 (en) * | 2009-11-30 | 2018-11-20 | Red Hat, Inc. | Integrating storage resources from storage area network in machine provisioning platform |
US8825819B2 (en) | 2009-11-30 | 2014-09-02 | Red Hat, Inc. | Mounting specified storage resources from storage area network in machine provisioning platform |
GB201005320D0 (en) * | 2010-03-30 | 2010-05-12 | Telepure Ltd | Improvements in controllers, particularly controllers for use in heating, ventilation and air conditioning systems |
KR101784264B1 (en) | 2010-04-28 | 2017-10-11 | 삼성전자주식회사 | Handover method and apparatus in mobile communication system |
US8556188B2 (en) | 2010-05-26 | 2013-10-15 | Ecofactor, Inc. | System and method for using a mobile electronic device to optimize an energy management system |
US8874129B2 (en) * | 2010-06-10 | 2014-10-28 | Qualcomm Incorporated | Pre-fetching information based on gesture and/or location |
US20110314163A1 (en) * | 2010-06-16 | 2011-12-22 | Mmb Research Inc. | Wireless communication network for smart appliances |
US20120016524A1 (en) * | 2010-07-16 | 2012-01-19 | General Electric Company | Thermal time constraints for demand response applications |
US9256230B2 (en) | 2010-11-19 | 2016-02-09 | Google Inc. | HVAC schedule establishment in an intelligent, network-connected thermostat |
US20120232969A1 (en) * | 2010-12-31 | 2012-09-13 | Nest Labs, Inc. | Systems and methods for updating climate control algorithms |
US9342082B2 (en) * | 2010-12-31 | 2016-05-17 | Google Inc. | Methods for encouraging energy-efficient behaviors based on a network connected thermostat-centric energy efficiency platform |
US20120172027A1 (en) * | 2011-01-03 | 2012-07-05 | Mani Partheesh | Use of geofences for location-based activation and control of services |
US8798804B2 (en) * | 2011-01-06 | 2014-08-05 | General Electric Company | Added features of HEM/HEG using GPS technology |
CA2742894A1 (en) * | 2011-05-31 | 2012-11-30 | Ecobee Inc. | Hvac controller with predictive set-point control |
US8718826B2 (en) | 2011-06-01 | 2014-05-06 | Emerson Electric Co. | System for remote control of a condition at a site |
US9407492B2 (en) * | 2011-08-24 | 2016-08-02 | Location Labs, Inc. | System and method for enabling control of mobile device functional components |
US8622314B2 (en) | 2011-10-21 | 2014-01-07 | Nest Labs, Inc. | Smart-home device that self-qualifies for away-state functionality |
US9529993B2 (en) * | 2012-03-02 | 2016-12-27 | International Business Machines Corporation | Policy-driven approach to managing privileged/shared identity in an enterprise |
WO2014172327A1 (en) | 2013-04-15 | 2014-10-23 | Flextronics Ap, Llc | Synchronization between vehicle and user device calendar |
US10088853B2 (en) | 2012-05-02 | 2018-10-02 | Honeywell International Inc. | Devices and methods for interacting with an HVAC controller |
US8755039B2 (en) * | 2012-05-03 | 2014-06-17 | Abl Ip Holding Llc | Lighting devices with sensors for detecting one or more external conditions and networked system using such devices |
US8843935B2 (en) * | 2012-05-03 | 2014-09-23 | Vmware, Inc. | Automatically changing a pre-selected datastore associated with a requested host for a virtual machine deployment based on resource availability during deployment of the virtual machine |
US8866583B2 (en) * | 2012-06-12 | 2014-10-21 | Jeffrey Ordaz | Garage door system and method |
US9467500B2 (en) * | 2012-08-09 | 2016-10-11 | Rockwell Automation Technologies, Inc. | Remote industrial monitoring using a cloud infrastructure |
US20140047368A1 (en) * | 2012-08-13 | 2014-02-13 | Magnet Systems Inc. | Application development tool |
US20140082702A1 (en) | 2012-09-19 | 2014-03-20 | Spark Devices | Systems and methods for controlling and communicating with connected devices |
JP6103393B2 (en) | 2012-09-28 | 2017-03-29 | パナソニックIpマネジメント株式会社 | Function update method and function update system |
US8594850B1 (en) | 2012-09-30 | 2013-11-26 | Nest Labs, Inc. | Updating control software on a network-connected HVAC controller |
JP6075609B2 (en) * | 2012-10-04 | 2017-02-08 | 日本電気株式会社 | Information processing system, information processing apparatus, information processing method, information processing program, portable communication terminal, control method thereof, and control program thereof |
EP2738672B1 (en) | 2012-11-30 | 2016-09-14 | Accenture Global Services Limited | Communications network, computer architecture, computer-implemented method and computer program product for development and management of femtocell-based applications |
US9541912B1 (en) | 2012-12-13 | 2017-01-10 | Google Inc. | Synchronization of appliances to a schedule of a user |
EP2946307A4 (en) * | 2013-01-15 | 2016-08-24 | Muzzley | Appliance control system and method |
US9801035B2 (en) | 2013-01-21 | 2017-10-24 | Location Labs, Inc. | System and method to identify devices in a shared mobile operating plan |
US9154303B1 (en) * | 2013-03-14 | 2015-10-06 | Microstrategy Incorporated | Third-party authorization of user credentials |
US10078341B2 (en) * | 2013-04-11 | 2018-09-18 | Honeywell International Inc | System and method with GEO location triggering automatic action |
US9305086B2 (en) | 2013-05-24 | 2016-04-05 | Worldrelay, Inc. | Numeric channel tuner and directory server for media and services |
US8862096B1 (en) | 2013-05-28 | 2014-10-14 | Gainspan Corporation | Provisioning of multiple wireless devices by an access point |
US9710248B2 (en) * | 2013-05-29 | 2017-07-18 | Microsoft Technology Licensing, Llc | Application install and layout syncing |
US9191771B2 (en) * | 2013-05-31 | 2015-11-17 | Gainspan Corporation | Convenient use of push button mode of WPS (Wi-Fi protected setup) for provisioning wireless devices |
US20150098455A1 (en) | 2013-10-09 | 2015-04-09 | Darren William Fritsch | WiFi Enabled Wide Area Automation System |
DE102013226607A1 (en) | 2013-12-19 | 2015-06-25 | Bayerische Motoren Werke Aktiengesellschaft | Body structure in knot construction |
US20150222601A1 (en) * | 2014-02-05 | 2015-08-06 | Branto Inc. | Systems for Securing Control and Data Transfer of Smart Camera |
US10637682B2 (en) * | 2014-02-11 | 2020-04-28 | Oracle International Corporation | Smart home learning system including user behavior |
US10063625B2 (en) * | 2014-05-15 | 2018-08-28 | Universal Electronics Inc. | System and method for appliance detection and app configuration |
US20150370272A1 (en) | 2014-06-23 | 2015-12-24 | Google Inc. | Intelligent configuration of a smart environment based on arrival time |
US9788039B2 (en) | 2014-06-23 | 2017-10-10 | Google Inc. | Camera system API for third-party integrations |
US9760501B2 (en) * | 2014-11-05 | 2017-09-12 | Google Inc. | In-field smart device updates |
2014
- 2014-11-03 US US14/531,805 patent/US20150370272A1/en not_active Abandoned
- 2014-12-19 US US14/577,635 patent/US20150372832A1/en not_active Abandoned
2015
- 2015-05-26 US US14/722,034 patent/US20150372834A1/en not_active Abandoned
- 2015-05-26 US US14/722,012 patent/US9854386B2/en active Active
- 2015-05-26 US US14/722,023 patent/US9838830B2/en active Active
- 2015-05-26 US US14/722,003 patent/US9491571B2/en active Active
- 2015-05-26 US US14/722,026 patent/US9456297B2/en active Active
- 2015-05-26 US US14/722,032 patent/US9668085B2/en active Active
2016
- 2016-05-18 US US15/158,268 patent/US10075828B2/en active Active
- 2016-12-15 US US15/380,767 patent/US10638292B2/en active Active
2018
- 2018-07-31 US US16/051,375 patent/US10440545B2/en active Active
- 2018-10-19 US US16/166,046 patent/US20190058985A1/en not_active Abandoned
2019
- 2019-03-05 US US16/293,358 patent/US20190208390A1/en not_active Abandoned
- 2019-09-09 US US16/565,124 patent/US10764735B2/en active Active
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10440545B2 (en) | 2014-06-23 | 2019-10-08 | Google Llc | Methods and apparatus for using smart environment devices via application program interfaces |
US10638292B2 (en) | 2014-06-23 | 2020-04-28 | Google Llc | Methods and apparatus for using smart environment devices via application program interfaces |
US10764735B2 (en) | 2014-06-23 | 2020-09-01 | Google Llc | Methods and apparatus for using smart environment devices via application program interfaces |
WO2021163270A1 (en) * | 2020-02-12 | 2021-08-19 | Appareo Systems, Llc | Aircraft lighting system and method |
CN111522615A (en) * | 2020-04-23 | 2020-08-11 | 平安国际智慧城市科技股份有限公司 | Method, device and equipment for updating command line interface and storage medium |
Also Published As
Publication number | Publication date |
---|---|
US20150370621A1 (en) | 2015-12-24 |
US20150372999A1 (en) | 2015-12-24 |
US20150370615A1 (en) | 2015-12-24 |
US20150373149A1 (en) | 2015-12-24 |
US20190058985A1 (en) | 2019-02-21 |
US9491571B2 (en) | 2016-11-08 |
US9854386B2 (en) | 2017-12-26 |
US10440545B2 (en) | 2019-10-08 |
US20160261425A1 (en) | 2016-09-08 |
US20150372832A1 (en) | 2015-12-24 |
US20170192402A1 (en) | 2017-07-06 |
US20150372834A1 (en) | 2015-12-24 |
US20180376313A1 (en) | 2018-12-27 |
US10638292B2 (en) | 2020-04-28 |
US10764735B2 (en) | 2020-09-01 |
US20200045522A1 (en) | 2020-02-06 |
US9668085B2 (en) | 2017-05-30 |
US10075828B2 (en) | 2018-09-11 |
US9838830B2 (en) | 2017-12-05 |
US20150372833A1 (en) | 2015-12-24 |
US20150370272A1 (en) | 2015-12-24 |
US9456297B2 (en) | 2016-09-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20190208390A1 (en) | Methods And Apparatus For Exploiting Interfaces Smart Environment Device Application Program Interfaces | |
US11627133B2 (en) | Selectively restricting communications from third party applications/devices to electronic devices | |
US11709101B2 (en) | Home monitoring and control system | |
US10375150B2 (en) | Crowd-based device trust establishment in a connected environment | |
US10302499B2 (en) | Adaptive threshold manipulation for movement detecting sensors | |
US9869484B2 (en) | Predictively controlling an environmental control system | |
US9933177B2 (en) | Enhanced automated environmental control system scheduling using a preference function | |
US9660948B2 (en) | Rule-based rate limiting | |
US9772116B2 (en) | Enhanced automated control scheduling | |
US20160201933A1 (en) | Predictively controlling an environmental control system | |
WO2016073312A1 (en) | Enhanced automated control scheduling |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: GOOGLE INC, CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: KORTZ, SAMUEL W.; HU, GREGORY J.; SURYA, AMANDA; AND OTHERS; SIGNING DATES FROM 20141119 TO 20141204; REEL/FRAME: 048509/0900. Owner name: GOOGLE LLC, CALIFORNIA; Free format text: CHANGE OF NAME; ASSIGNOR: GOOGLE INC; REEL/FRAME: 048512/0654; Effective date: 20170929 |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |