US20180176031A1 - Home automation via voice control - Google Patents
Home automation via voice control
- Publication number
- US20180176031A1 (application US 15/847,014)
- Authority
- US
- United States
- Prior art keywords
- home automation
- automation system
- rule
- spoken command
- rules
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G06N99/005—
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
Definitions
- an apparatus for establishing operation rules in a home automation system includes a processor, a memory in electronic communication with the processor, and instructions stored in the memory which are executable by the processor to receive a spoken command having a plurality of rule setting terms, establish at least one operation rule for the home automation system based on the spoken command, and store the at least one operation rule for later use by the home automation system.
- the instructions may be executable by the processor to initiate a rules mode for the home automation system prior to receiving the spoken command.
- the instructions may be executable by the processor to terminate the rules mode after establishing the at least one operation rule.
- Initiating the rules mode may include receiving a touch input at a control panel of the home automation system.
- Initiating the rules mode may include receiving a spoken trigger word or an audible sound.
- the instructions may be executable by the processor to deliver a request for clarification of the spoken command.
- the request for clarification may include an audible message.
- the request for clarification may include a displayed message visible on a control panel of the home automation system.
- the instructions may be executable by the processor to generate a message that includes a restatement of the spoken command and a request for confirmation of the correctness of the restatement of the spoken command.
- the instructions may be executable by the processor to generate a message to a user of the home automation system that includes a restatement of the at least one operation rule and requests confirmation of the accuracy of the at least one operation rule.
- the computer-program product includes a non-transitory computer-readable medium storing instructions executable by a processor to initiate a rule setting mode for the home automation system, receive a spoken command having a plurality of rule setting terms, generate at least one operation rule for the home automation system based on the spoken command, and generate a message that includes a restatement of the at least one operation rule.
- At least some of the plurality of rule setting terms may be defined during installation of the home automation system at a property. At least some of the plurality of rule setting terms may be generic to a plurality of different home automation systems associated with a plurality of different properties.
- the plurality of rule setting terms may include at least one static term that is generic to a plurality of different home automation systems, and at least one dynamic term that is uniquely defined for the home automation system.
- the instructions may be executable by the processor to store the at least one operation rule for later use by the home automation system, wherein storing the at least one operation rule includes storing in a local database of the home automation system.
- the instructions may be executable by the processor to generate a message requesting confirmation of the spoken command.
- the instructions may be executable by the processor to receive confirmation of the spoken command.
- a further embodiment of the present disclosure relates to a computer-implemented method for establishing operation rules in a home automation system.
- the method includes initiating a rules mode for the home automation system, receiving a spoken command having a plurality of rule setting terms, establishing at least one operation rule for the home automation system using the spoken command, storing the at least one operation rule, and operating at least one function of the home automation system based on the at least one stored operation rule.
- the plurality of rule setting terms may include at least one static term that is pre-programmed into the home automation system prior to installation.
- the plurality of rule setting terms may include at least one dynamic term that is uniquely programmed into the home automation system no sooner than installation at a property.
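The claimed method can be sketched as a short sequence: initiate a rules mode, receive a spoken command, establish a rule, store it, and terminate the mode. The class and method names below are illustrative assumptions; the patent does not specify an implementation.

```python
# Hypothetical sketch of the claimed method. All names are illustrative,
# not part of the disclosure.

class HomeAutomationRules:
    def __init__(self):
        self.rules = []          # stored operation rules (local database stand-in)
        self.rules_mode = False  # whether the system is accepting rule commands

    def initiate_rules_mode(self):
        # Could be triggered by a touch input, trigger word, or audible sound.
        self.rules_mode = True

    def receive_spoken_command(self, command_terms):
        # Establish an operation rule only while the rules mode is active.
        if not self.rules_mode:
            raise RuntimeError("rules mode not active")
        rule = {"terms": command_terms}
        self.rules.append(rule)   # store the rule for later use by the system
        self.rules_mode = False   # terminate the rules mode after establishing the rule
        return rule

system = HomeAutomationRules()
system.initiate_rules_mode()
rule = system.receive_spoken_command(["unlock", "back door", "7:00 a.m."])
```

The stored rule is not executed in real time; it waits in `system.rules` until its conditions are later met.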
- FIG. 1 is a block diagram of an environment in which the present systems and methods may be implemented;
- FIG. 2 is a block diagram of another environment in which the present systems and methods may be implemented;
- FIG. 3 is a block diagram of another environment in which the present systems and methods may be implemented.
- FIG. 4 is a block diagram of another environment in which the present systems and methods may be implemented.
- FIG. 5 is a block diagram of a voice control module for use in at least one of the environments shown in FIGS. 1-4 ;
- FIG. 6 is a block diagram of another voice control module for use in at least one of the environments shown in FIGS. 1-4 ;
- FIG. 7 is a flow diagram illustrating a method for establishing operation rules in a home automation system;
- FIG. 8 is a flow diagram illustrating a method for establishing operation rules in a home automation system;
- FIG. 9 is a flow diagram illustrating a method for establishing operation rules in a home automation system.
- FIG. 10 is a block diagram of a computer system suitable for implementing the present systems and methods of FIGS. 1-9 .
- the systems and methods described herein relate to home automation and home security, and related security systems and automation for use in commercial and business settings. More specifically, the systems and methods described herein relate to operating features of a home automation system using voice control.
- the phrase “home automation system” may refer to a system that includes automation features alone, security features alone, a combination of automation and security features, or a combination of automation, security and other features. While the phrase “home automation system” is used throughout to describe a system or components of a system or environment in which aspects of the present disclosure are described, such an automation system and its related features (whether automation and/or security features) may be generally applicable to other properties such as businesses and commercial properties as well as systems that are used in indoor and outdoor settings.
- Home automation systems are typically programmed via an on-site control panel that is located at the property that is being monitored/controlled by the system.
- the control panel usually includes a touch screen with a number of categories and menus to select from for programming various functions of the system.
- the user-selected options create rules by which the system operates/functions.
- the system may be programmed by selecting among the options presented on the control panel screen to arm or disarm the system at a certain time of the day or day of the week based on the programmed rules. Once the “rules” are established, the system operates automatically when the time of day or day of week condition is met.
- Home automation systems are initially programmed for a particular property by a professional installer who is familiar with the control panel and how to program and/or modify functionality of the system.
- the property owner is usually less familiar with the control panel and how to modify functionality of the system after it is installed.
- the voice control may particularly relate to establishing or changing rules by which the automation system operates, rather than prompting functionality of the automation system.
- Real-time control of at least some functionality of an automation system (e.g., speaking a command to arm the system or unlock a door) is relatively well known in this technical field.
- the present disclosure is intended to alleviate challenges for a user in interfacing with the home automation system (e.g., control panel) to establish or modify the rules by which the home automation system operates.
- the voice commands given in accordance with the present disclosure typically do not result in real-time change in functionality of the system, but rather are used at a later time to control functionality of the system.
- a user may first activate a “rules” mode for the home automation system and then speak a command such as “unlock back door every day at 7:00 a.m.” or “turn up basement thermostat to 75 degrees when Jason arrives home after noon.”
- the “rules” mode may be activated by speaking a trigger word or phrase (e.g., “rules mode”), selecting a rules button or category on the control panel (e.g., on a display screen of the control panel), or entering a key word at the control panel.
- the command may include static and/or dynamic vocabulary that is interpreted in order for the system to understand the meaning of the command.
- Static vocabulary may include those terms that are consistent for at least a certain type of home automation system (e.g., unlock, every, day, 7:00, a.m., turn up, after, noon, 75 degrees, etc.).
- the system may include a library of static vocabulary and definitions associated with each term.
- Dynamic vocabulary may include those terms that are specific to a particular property and system setup. For example, the various access points of a building may be named according to user preferences at the time of system setup (e.g., back patio door, basement thermostat, etc.).
- the dynamic terms may be, for example, stored separately, defined at the time of set up, and/or modified during the lifetime of the system.
- the dynamic vocabulary may also include the names or identifiers of persons who are authorized to access the property (e.g., Jason, mom, etc.).
- the persons may be identified by the system in a variety of ways, including cell phones and key fobs carried by the persons, codes entered by the persons when entering or leaving the property, voice or face recognition, etc.
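The split between static and dynamic vocabulary described above can be sketched as two term sets checked against a spoken command. The term sets and the simple substring/token matching below are illustrative assumptions.

```python
# Hypothetical vocabulary lookup. STATIC_TERMS are generic to a system type;
# DYNAMIC_TERMS are defined for one particular installation.

STATIC_TERMS = {"unlock", "every", "day", "at", "7:00", "a.m."}
DYNAMIC_TERMS = {"back patio door", "jason"}  # property-specific names

def classify(command):
    text = command.lower()
    # Single-word static terms matched token-by-token.
    found_static = sorted(t for t in STATIC_TERMS if t in text.split())
    # Multi-word dynamic names matched as phrases.
    found_dynamic = sorted(t for t in DYNAMIC_TERMS if t in text)
    return found_static, found_dynamic

static, dynamic = classify("Unlock back patio door every day at 7:00 a.m.")
```

A real system would also track which words matched neither set, since those drive the clarification prompts discussed next in the disclosure.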
- the system may be configured to prompt the user for clarification of the spoken command if the command includes one or more unknown terms or is unclear in some way.
- the prompt may be an audible message (e.g., a spoken message such as, “you mentioned back door, is this the south back door or the east back door?”) or displayed (e.g., on a screen of a control panel or on a mobile handheld device).
- the system may also repeat the rule back to the user and request confirmation of its correctness before storing and/or using the rule to control functionality of the system.
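The clarification behavior above can be sketched as an ambiguity check against the installation's named access points. The access-point names and prompt wording are illustrative assumptions modeled on the example in the text.

```python
# Hypothetical clarification step: if a spoken term matches more than one
# named access point, ask the user which one was meant.

ACCESS_POINTS = ["south back door", "east back door", "front door"]

def clarify(term):
    matches = [p for p in ACCESS_POINTS if term in p]
    if len(matches) > 1:
        options = " or ".join(matches)
        return f"You mentioned {term}, is this the {options}?"
    return None  # unambiguous; no clarification needed

prompt = clarify("back door")
```

The returned prompt could be spoken through a speaker or shown on the control panel display, per the embodiments described here.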
- the rule setting mode for the home automation system may be operated through a control panel of the home automation system. Additionally, or alternatively, the rule setting mode may be operated via a remote computing device, such as a mobile handheld device, laptop computer, desktop computer, mobile tablet computing device, or the like. Any function related to setting rules of the home automation system, such as the voice control features discussed herein for setting rules of the home automation system, may be performed using such a remote computing device.
- FIG. 1 is a block diagram of one example of an environment 100 in which the present systems and methods may be implemented.
- the systems and methods described herein may be performed, at least in part, using a device 105 , which includes a user interface 110 , a voice control module 115 , and a processor 120 .
- Device 105 may include or be part of a control panel of a home automation system.
- Device 105 may be a permanently installed device (e.g., a control panel mounted to a wall of a building) or it may be a mobile device such as, for example, a desktop computer, a laptop computer, a tablet computer, or mobile handheld device such as a smartphone.
- Device 105 may operate at least in part to establish rules of operation for a home automation system.
- rules are typically established for a home automation system using a somewhat complicated and oftentimes non-user friendly system of drop down menus, questions, and the like that a user must navigate via, for example, a display on a control panel.
- a potential advantage related to the voice control features of the present disclosure for setting rules for a home automation system is the reduced complexity of creating rules and being able to avoid at least some of the confusing, complex menus and interfaces that are otherwise required to establish such rules.
- User interface 110 may provide a physical interface between device 105 and a user.
- User interface 110 may include, for example, a touch screen with a plurality of user activated menus and screen selectable features.
- User interface 110 may include one or more physical buttons, switches, or the like.
- User interface 110 may include at least one feature that provides a user the ability to enter into or exit a rule setting mode for the home automation system.
- User interface 110 may include other features or functionality that permit the user to select certain rules and/or responses to questions as part of setting rules related to a home automation system.
- Voice control module 115 may operate at least in part in response to audible sounds and to establish one or more rules for the home automation system in response to the audible sounds.
- the audible sounds may be voice commands generated by one or more users. Additionally, or alternatively, the audible sounds may include tones, notes, beeps, or other sounds that are generated by a device and/or by the user.
- Voice control module 115 may provide an interactive experience with a user to establish a rule for a home automation system. For example, voice control module 115 may receive a voice command from a user which places the home automation system in a rule setting mode. Once in the rule setting mode, voice control module 115 may prompt the user for information related to setting one or more rules. Voice control module 115 may receive audible responses and/or inputs via, for example, user interface 110 , from the user to create a rule. Voice control module 115 may respond to the user with audible and/or displayed responses for the purpose of, for example, confirming instructions from the user related to the rule being established, requesting clarification of the inputs from the user related to setting a rule, or confirming that the rule has been established.
- voice control module 115 may eliminate, at least in part, the need for a user to provide inputs via, for example, user interface 110 or other physical inputs from the user in order to establish a rule for the home automation system.
- voice control module 115 may be operable to establish at least one rule of the home automation system using only audible input from the user.
- Processor 120 may cooperate with user interface 110 and voice control module 115 to perform any of the computing and/or logic functions related to, for example, inputting information from the user, establishing rules, storing rules, and later providing at least some automated function of the home automation system based on the rules.
- the rules established by voice control module 115 are stored and used for future operation of one or more features of the home automation system.
- Rules may be stored in any of a number of locations including, for example, a memory or database of device 105 . In other examples, the rules may be stored at other locations such as, for example, in a cloud storage or in a backend server available via, for example, a network. Once saved, the rules may provide for automated functionality of home automation system based on certain conditions.
- a rule established via voice control module 115 may be that the “front door is locked every time Bill arms the system.”
- the home automation system may be able to identify the difference between Bill and other users of the system based on, for example, a log-in/security code, voice recognition, a pre-registered mobile device that is identifiable via, for example, Bluetooth technology, or the like.
- Bill may arm the system in any of a number of ways including, for example, manual arming the system via user interface 110 or other feature or device 105 , arming the system via a remote computing device (e.g., a mobile handheld device), or using a voice command of “arm system” spoken from anywhere in the property and picked up by any number of microphones or other features of the home automation system.
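The "front door is locked every time Bill arms the system" example can be sketched as a stored rule evaluated when later events arrive. The event and rule representations below are illustrative assumptions.

```python
# Hypothetical rule evaluation at a later time: a stored rule fires when a
# matching (user, action) event occurs, producing a device command.

rules = [
    {"when": {"user": "bill", "action": "arm"},
     "then": ("front door", "lock")},
]

def on_event(user, action):
    commands = []
    for rule in rules:
        cond = rule["when"]
        if cond["user"] == user and cond["action"] == action:
            commands.append(rule["then"])  # e.g., lock the front door
    return commands

result = on_event("bill", "arm")
```

Note the deferred execution: the rule was established once via a spoken command and only produces an effect when the arming event later occurs, matching the disclosure's distinction from real-time voice control.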
- the spoken commands received from the user at voice control module 115 may include a variety of different terms. Some of the terms may be classified as “static” terms that are generic to all of at least a certain type of home automation systems. The generic terms may include such words as, for example, lock, unlock, arm, disarm, on, off, open, close, daytime, night, 1:00, noon, I, me, arrive, depart, temperature, raise, lower, and the like.
- dynamic terms may be classified as “dynamic” terms that are unique to a specific property or particular home automation system installation.
- the names of each of the people living in the home or others who may be authorized to operate the home automation system may be stored in a dynamic term database.
- Other dynamic terms may include, for example, particular names assigned to each of the barriers, appliances, systems, or areas of the home (e.g., front door, south garage door, kitchen light, patio window, south HVAC zone, front sprinkler zone, and the like).
- the dynamic terms may be established at the time of, for example, installing the home automation system, as prompted at any time via, for example, voice control module 115 , or based on a questionnaire that is answerable, for example, when updating or upgrading the home automation system.
- the static and dynamic terms may be stored in different locations.
- the static terms may be pre-programmed in the home automation system.
- the static terms may be permanent and unable to be altered.
- the dynamic terms may be updated, added to, or changed as desired.
- Voice control module 115 may search the static and dynamic term databases in response to receiving a voice command from a user to determine whether the voice command provides enough specificity to generate a rule. If there is uncertainty and/or a lack of clarity related to any of the static or dynamic terms identified in the spoken command, voice control module 115 may prompt the user to, for example, restate the voice command, restate a specific term, provide clarification of a given term, or confirm that the spoken command (as repeated back to the user via voice control module 115 ) accurately reflects the user's intended instructions and/or rule.
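The database search described above can be sketched as a specificity check: every word of the command must resolve to a known static or dynamic term, otherwise the user is prompted to restate. The databases and message text are illustrative assumptions.

```python
# Hypothetical specificity check against static and dynamic term databases.

STATIC_DB = {"unlock", "every", "day", "at"}
DYNAMIC_DB = {"back", "patio", "door", "7:00", "a.m."}

def check_command(words):
    unknown = [w for w in words if w not in STATIC_DB | DYNAMIC_DB]
    if unknown:
        # Prompt modeled on the display messages described later in the text.
        return "Please restate previous voice command: unknown term(s) " + ", ".join(unknown)
    return "OK"

status = check_command(["unlock", "back", "patio", "door", "every", "day"])
```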
- FIG. 2 shows another environment 200 that may include the components in environment 100 described above, and may further include a device 105 - a having a user interface 110 , a voice control module 115 , a processor 120 , a speaker 205 , a microphone 210 , and a term database 215 .
- the speaker 205 and microphone 210 may facilitate audible communication between the user and the home automation system (e.g., voice control module 115 ).
- Speaker 205 may convey audible responses from the home automation system to the user.
- Microphone 210 may operate to receive audible signals from the user (e.g., voice commands).
- Speaker 205 and microphone 210 may be integrated into device 105 - a .
- speaker 205 and/or microphone 210 may be provided as separate components from device 105 - a .
- speaker 205 and microphone 210 may provide multiple functions, some of which may be unrelated to establishing rules for the home automation system.
- speaker 205 may be used to convey alarm signals, messages to the user related to the status of one or more barriers at the home, convey audible messages related to, for example, current weather conditions or the status of an HVAC system, or prompt the user for further instructions related to operation of the home automation system.
- Microphone 210 may operate to receive audible signals from a user including, for example, instructions related to operation of security features and/or automation features for real-time control of the home automation system.
- Term database 215 may include at least one storage location for storing one or more static and/or dynamic terms.
- Term database 215 may be part of or include memory capability for device 105 - a .
- device 105 - a may include a storage device (e.g., hard disk drive or solid state drive) that includes a portion thereof dedicated to storing terms (e.g., the term database 215 ).
- term database 215 may be a separate and distinct storage device and/or storage location, and may include storage at a remote location.
- Term database 215 may be provided as a component of home automation system that is separate from device 105 - a .
- Term database 215 may be accessible via, for example, a network.
- Term database 215 may include separate storage devices for static terms versus dynamic terms.
- FIG. 3 shows another environment 300 that may include the components of environments 100 , 200 described above, and may further include a backend server 305 , a remote device 310 , and a network 315 . Additionally, a device 105 - b may include, in addition to the components of device 105 - a shown in FIG. 2 , a display 320 .
- Backend server 305 may provide at least some backend support, functionality, and/or storage for a home automation system that includes device 105 - a .
- at least some of the static and/or dynamic terms associated with term database 215 may be stored at the backend server 305 .
- Backend server 305 may provide support services such as, for example, communicating with emergency personnel in response to an alarm condition identified by device 105 - b .
- at least some of the rules established via operation of voice control module 115 may be stored in backend server 305 .
- Remote device 310 may provide operation of at least some features of the home automation system from a remote location.
- remote device 310 may facilitate communication between a user of the home automation system and voice control module 115 .
- Remote device 310 may include, for example, a speaker and/or microphone to provide voice commands used to establish rules via voice control module 115 .
- Voice control module 115 may generate audible responses and/or displayed responses that are communicated to the user via remote device 310 .
- remote device 310 may be operated by the user to initiate a rule setting mode via voice control module 115 . Once the rule setting mode is initiated, the user may provide one or more voice commands via remote device 310 that are used to establish one or more rules of the home automation system.
- Voice control module 115 may provide responses to the spoken commands, which are received at the remote device 310 and communicated to the user. Once the rules are established via voice control module 115 , the rules may be stored and used to operate the home automation system at a later time.
- Network 315 may provide communication between device 105 - b , backend server 305 , and remote device 310 , or other components of environment 300 .
- Network 315 may include local area networks (LAN), wide area networks (WAN), virtual private networks (VPN), wireless networks (using 802.11, for example), and/or cellular networks (using 3G and/or LTE, for example), etc.
- network 315 may include the internet.
- Display 320 may be provided as a separate component from device 105 - b .
- Display 320 and other features or components of device 105 - b may be provided separate from device 105 - b and communicate with each other via, for example, network 315 .
- Voice control module 115 may communicate information via display 320 to a user during, for example, a rule setting mode of the home automation system.
- Display 320 may be part of user interface 110 .
- Display 320 may include, for example, a touch screen. Display 320 may display colors, patterns, and the like, as well as text and other symbols.
- display 320 displays a text message such as “please restate previous voice command,” “I heard you say to lock the door every time Bill arms the system, is that correct,” or “you said back door, do you mean back patio door or back bedroom door.”
- voice control module 115 may provide a text message via display 320 as well as an audible message via speaker 205 .
- the text and audible messages may be selected depending on, for example, considerations such as whether the user setting the rules is in close proximity to device 105 - b , whether the user is able to read, etc.
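The channel selection described above (text via display 320, audio via speaker 205, or both) can be sketched as a small decision function. The proximity flag and the rule that audio is always available are illustrative assumptions.

```python
# Hypothetical response-channel selection for rule-setting messages.

def choose_channels(user_near_panel, user_can_read=True):
    channels = []
    if user_near_panel and user_can_read:
        channels.append("display")  # show text message on the control panel
    channels.append("speaker")      # audible message as a universal fallback
    return channels

channels = choose_channels(user_near_panel=True)
```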
- FIG. 4 shows another environment 400 that may include the components of any of the environments 100 , 200 , 300 described above, and may further include an application 405 , a mobile computing device 410 , a sensor 415 , a home automation controller 420 , and a remote device 310 - a .
- Remote device 310 - a includes a user interface 425 , a remote voice control module 430 , a processor 435 , a speaker 440 , a microphone 445 , and a display 450 .
- Environment 400 may also include a display 320 - a that is separate from device 105 - a .
- Device 105 - a , which may include the same or similar components as shown in FIG. 2 , may communicate with the other components and features of environment 400 via network 315 .
- Application 405 may allow a user to control either directly or via device 105 - a or a remote device 310 - a , an aspect of the monitored property, including security, energy management, locking and unlocking doors, checking the status of a door, locating a user or item, controlling lighting, thermostat, or cameras, and retrieving notifications regarding a current status or anomaly associated with a home, office, place of business, and the like.
- application 405 may enable device 105 - a and/or remote device 310 - a to interface with voice control module 115 , processor 120 , or the like and provide the user interface 110 to display automation, security, and/or energy management content on device 105 - a and/or remote device 310 - a .
- application 405 via user interface 110 , or other features of environment 400 may allow users to control aspects of their home, office and/or other type of property. Further, application 405 may be installed on device 105 - a , remote device 310 - a , or other components of environment 400 . Application 405 may operate in response to at least one rule established via voice control module 115 .
- Display 320 - a may include, for example, a digital display as part of, for example, a control panel of environment 400 (e.g., a home automation system).
- Display 320 - a may be provided via devices such as, for example, a desktop computer or a mobile computing device (e.g., remote device 310 - a ).
- the user interface 110 may be integrated into display 320 - a .
- Such a user interface 110 may include a plurality of menus, screens, microphones, speakers, cameras, and other capability that permit interaction between the user and the home automation system or other components of environment 400 .
- user interface 110 with display 320 - a may be integrated into device 105 - a as discussed above related to environment 300 .
- Mobile computing device 410 may include remote device 310 - a or be part of remote device 310 - a .
- Mobile computing device 410 may be used to interact with voice control module 115 from a local or remote location.
- mobile computing device 410 may provide initiation of a rule setting mode, may communicate a voice command associated with establishing a rule via voice control module 115 , and the like from any location relative to device 105 - a.
- Sensor 415 may include, for example, a camera sensor, an audio sensor, a forced entry sensor, a shock sensor, a proximity sensor, a boundary sensor, an appliance sensor, a light fixture sensor, a temperature sensor, a light beam sensor, a three-dimensional (3D) sensor, a motion sensor, a smoke sensor, a glass break sensor, a door sensor, a video sensor, a carbon monoxide sensor, an accelerometer, a global positioning system (GPS) sensor, a Wi-Fi positioning sensor, a capacitance sensor, a radio frequency sensor, a near-field sensor, a heartbeat sensor, a breathing sensor, an oxygen sensor, a carbon dioxide sensor, a brain wave sensor, a voice sensor, a touch sensor, and the like.
- The device 105-a, application 405, display 320-a, remote device 310-a, home automation controller 420, or other components of environment 400 and/or a home automation system may include or have integrated therein one or more of the sensors 415.
- sensor 415 is depicted as a separate component from device 105 - a , remote device 310 - a , and other components of environment 400 . In some embodiments, sensor 415 may be connected directly to any one of these components or other components of environment 400 . Additionally, or alternatively, sensor 415 may be integrated into a home appliance or fixture such as a light fixture.
- the information provided by sensor 415 may be used to initiate a rule setting mode of a home automation system, confirm user feedback associated with questions or other feedback provided via voice control module 115 , confirm identification of one or more users attempting to establish or change rules of the home automation system, and the like.
- sensor 415 may operate in response to the rule established via voice control module 115 (e.g., rules established by voice commands, stored, and then used by the home automation system at a later time to permit one or more feature and/or functionality of the home automation system).
- Home automation controller 420 may include or be part of processor 120 and/or processor 435 .
- Home automation controller 420 may operate to control any of the features or functionality of a home automation system and/or component of environment 400 .
- Home automation controller 420 may operate at least in part in response to rules established via voice control module 115.
- Remote device 310 - a may include at least some of the features and/or functionality of device 105 - a , and may provide such functionality from a location that is remote from device 105 - a and/or a property being controlled and/or monitored by a home automation system.
- the user interface 425 may include any of the features and/or functionality of user interface 110 .
- Remote voice control module 430 may include at least some of the features and/or functionality of voice control module 115 described above.
- Processor 435, speaker 440, microphone 445, and display 450 may include at least some of the same features and/or functionality of the processor 120, speaker 205, microphone 210, and display 320-a, respectively, described above with reference to device 105.
- remote voice control module 430 may facilitate initiation of a rule setting mode for the home automation system via remote device 310 - a .
- Remote voice control module 430 may operate to receive voice commands and other input related to establishing one or more rules, and may provide communication and/or responses to the user as part of establishing the rules.
- Speaker 440 and microphone 445 may operate to communicate between the user and the home automation system. Speaker 440 and microphone 445 may also provide communication between a user operating remote device 310 - a and the voice control module 115 of device 105 - a (e.g., via network 315 ).
- display 450 may display messages to the user operating remote device 310 - a in response to operation of voice control module 115 .
- remote device 310 - a may provide a user with the ability to generate and/or change rules for operation of the home automation system while the user is positioned at a location remote from device 105 - a (which may be any location remote from a property being monitored by the home automation system).
- Remote device 310-a may provide communication with voice control module 115, which operates on device 105-a locally at the property being monitored by the home automation system, while the user is positioned at a location remote from the property.
- FIG. 5 shows a voice control module 115-a that may be one example of the voice control module 115 shown and described with reference to FIGS. 1-4.
- Voice control module 115 - a may include a rules mode module 505 , a rule generation module 510 , and a rule storage module 515 .
- voice control module 115 - a may include other modules that perform the same or different functionality for the voice control module 115 .
- Rules mode module 505 may operate to place the home automation system in a rule setting mode.
- The rule setting mode may be initiated by, for example, the user speaking a voice command that includes a particular trigger word/phrase such as, for example, "start rules mode," "set new rule," or "change a rule."
- the rule setting mode is initiated by the user providing manual input such as, for example, touching an active area of a touch screen, pressing a button, operating a switch, etc. (e.g., via user interface 110 or user interface 425 ).
- Rules mode module 505 may provide feedback to the user to confirm that the home automation system is in a rule setting mode.
- Rules mode module 505 may provide an indicator such as, for example, an audible message that states “rule setting mode active,” a text message displayed to the user that reads “rules mode,” a symbol, a change in color of a light indicator, etc.
- rules mode module 505 may prompt the user for instructions as a way to confirm that the home automation system is in a rule setting mode.
- the rules mode module 505 may provide an audible or text message stating “provide rules instructions” or “enter rule.”
- Rule generation module 510 may receive the rules instructions from the user and generate at least one rule for the home automation system in response to the received instructions. Rule generation module 510 may operate to provide communication back to the user in response to instructions received from the user. The responsive communication may be in any format. For example, the user may provide audible instructions related to setting a rule (e.g., “lock the front door every time Bill arms the system”), and rule generation module 510 provides a confirmation of the instructions to the user via audible, text, or other means of communication with a message such as “you stated, lock the door every time Bill arms the system, is this correct?” Rule generation module 510 may also confirm that a rule has been set by communicating a message to the user such as, for example, “rule set” or “created new rule.”
- Rule storage module 515 may store the rule for later use by the home automation system. Rule storage module 515 may provide instructions for saving the rule in any of a number of locations such as, for example, the memory of a storage panel (e.g., device 105 ), or at a remote location such as, for example, backend server 305 . Rule storage module 515 may also operate to retrieve the rule and/or permit the user to modify and/or replace a rule at a later time.
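The flow across modules 505 (rules mode), 510 (rule generation), and 515 (rule storage) can be sketched as a minimal pipeline. This is an illustrative sketch only, not the patented implementation; the class name `RulesSession`, the method names, and the confirmation wording are assumptions for illustration.

```python
# Hypothetical sketch of the rule-setting pipeline described for modules
# 505 (rules mode), 510 (rule generation), and 515 (rule storage).
# All names and message strings are illustrative assumptions.

TRIGGER_PHRASES = {"start rules mode", "set new rule", "change a rule"}

class RulesSession:
    def __init__(self):
        self.mode_active = False
        self.stored_rules = []       # stands in for panel memory or a backend server

    def maybe_enter_rules_mode(self, utterance: str) -> str:
        """Rules mode module 505: enter the mode on a spoken trigger phrase."""
        if utterance.lower().strip() in TRIGGER_PHRASES:
            self.mode_active = True
            return "rule setting mode active"   # audible/text confirmation to the user
        return ""

    def generate_rule(self, command: str) -> dict:
        """Rule generation module 510: turn a received command into a rule record."""
        if not self.mode_active:
            raise RuntimeError("not in rule setting mode")
        return {"text": command}

    def store_rule(self, rule: dict) -> str:
        """Rule storage module 515: save the rule for later use by the system."""
        self.stored_rules.append(rule)
        self.mode_active = False     # terminate the rules mode after storing
        return "rule set"

session = RulesSession()
print(session.maybe_enter_rules_mode("set new rule"))   # -> rule setting mode active
rule = session.generate_rule("lock the front door every time Bill arms the system")
print(session.store_rule(rule))                         # -> rule set
```

Note that the stored rule is not executed here; as described above, it is retrieved and applied by the home automation system at a later time.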
- FIG. 6 is a block diagram illustrating another voice control module 115 - b .
- Voice control module 115 - b may be one example of the voice control module 115 shown in any of FIGS. 1-5 .
- Voice control module 115 - b may include the rules mode module 505 , rule generation module 510 , and rule storage module 515 of voice control module 115 - a described above with reference to FIG. 5 .
- Voice control module 115 - b may additionally include at least one of a command receipt module 605 , a term identification module 610 , a command confirmation module 615 , and a rule confirmation module 620 .
- the modules 605 , 610 , 615 , 620 may perform at least some of the features and/or functionality of the rules mode module 505 , rule generation module 510 , and rule storage module 515 described above with reference to voice control module 115 - a , and/or the voice control module 115 described with reference to FIGS. 1-4 .
- Command receipt module 605 may prompt a user for instructions relating to setting a rule for the home automation system.
- Command receipt module 605 may present the prompt in the form of, for example, an audible request (e.g., “state the rule”), a text message, a flashing light sequence, etc.
- Command receipt module 605 may operate to inform the user that the spoken command or other instruction from the user associated with setting a rule has been received.
- Command receipt module 605 may store, analyze, and/or evaluate the command or at least certain aspects of the command.
- Term identification module 610 may operate to identify terms in the command received from the user. The term identification module 610 may identify certain terms as static terms and others as dynamic terms. In some examples, the command is provided at least in part by a menu selection or a written command. Regardless of the format of the command provided by the user, term identification module 610 may help identify key terms that are relevant for setting one or more rules. In at least one example, term identification module 610 may identify a trigger term used to initiate the rule setting mode, and may cooperate with rules mode module 505 in initiating the rule setting mode. In other examples, term identification module 610 is operable to identify a different trigger word used to terminate a rule setting mode, to delete an existing rule, and/or to correct previously stated spoken commands.
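One way term identification module 610 might separate static terms from dynamic terms is sketched below. The vocabulary split, term lists, and function name are hypothetical assumptions, not taken from the patent.

```python
# Hypothetical sketch of term identification (module 610): classify terms
# in a command as static (generic across systems) or dynamic (uniquely
# defined for this installation). Term lists are illustrative only.

STATIC_TERMS = {"lock", "unlock", "arm", "disarm", "every", "time", "when"}
DYNAMIC_TERMS = {"front door", "bill", "garage"}   # assumed install-time names

def identify_terms(command: str):
    """Return (static_terms, dynamic_terms) found in the command."""
    lowered = command.lower()
    static = [t for t in STATIC_TERMS if t in lowered.split()]
    dynamic = [t for t in DYNAMIC_TERMS if t in lowered]   # may span words
    return static, dynamic

static, dynamic = identify_terms(
    "lock the front door every time Bill arms the system")
```

A rule generator could then map the static terms to system actions and the dynamic terms to the specific devices and users they name.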
- Command confirmation module 615 may operate to provide confirmation back to the user that the user's command (whether spoken or in another format) has been received.
- Command confirmation module 615 may provide a responsive message to the user to confirm that the user's command is understood correctly.
- command confirmation module 615 may provide a message to the user restating what the voice control module 115 - b understood the user's command to say and/or mean (e.g., “you stated that the front door should be locked every time Bill arms the system. Is this correct?”).
- Rule confirmation module 620 may provide a response to the user that a rule has been generated and/or stored. For example, rule confirmation module 620 may provide a response to the user (e.g., spoken, text or otherwise) indicating that the rule has been recognized, saved, etc. Rule confirmation module 620 may provide an audible message to the user such as, for example, “the system will lock the door every time Bill arms the system,” or “your requested rule has been set and saved.” Voice control module 115 may cooperate with other devices such as, for example, remote device 310 to provide communication with the user.
- any of the functions described with reference to the modules of voice control module 115 - b may function at least in part in cooperation with remote device 310 or other features of the environments 100 , 200 , 300 , 400 described above.
- any of the features and functionality of the voice control modules 115 (e.g., described with reference to FIGS. 5-6) may be included in the remote voice control module 430 described with reference to FIG. 4.
- FIG. 7 is a flow diagram illustrating one embodiment of a method 700 for establishing operation rules in a home automation system.
- the method 700 may be implemented with any of the voice control modules 115 shown in FIGS. 1-6 .
- method 700 may be performed generally by device 105 or remote device 310 shown in FIGS. 1, 2, 3 , and/or 4 , or even more generally by the environments 100 , 200 , 300 , 400 shown in FIGS. 1-4 .
- method 700 includes receiving a spoken command having a plurality of rule setting terms.
- Block 710 includes establishing at least one operation rule for the home automation system based on the spoken command.
- the method 700 includes storing the at least one operation rule for later use by the home automation system.
- Method 700 may also include initiating a rules mode for the home automation system prior to receiving a spoken command.
- Method 700 may include terminating the rules mode after establishing the at least one operation rule.
- Initiating the rules mode may include receiving a touch input at a control panel of the home automation system.
- Initiating the rules mode may include receiving a spoken trigger word or an audible sound.
- Method 700 may include delivering a request for clarification of a spoken command.
- the request for clarification may include an audible message.
- the request for clarification may include a displayed message visible on a control panel of the home automation system.
- Method 700 may include generating a message that includes a restatement of the spoken command and a request for confirmation of the correctness of the restatement of the spoken command.
- Method 700 may include generating a message to a user of the home automation system that includes a restatement of the at least one operation rule and requesting confirmation of the accuracy of the at least one operation rule.
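The restatement-and-confirmation and clarification steps of method 700 can be sketched as simple message generators. The function names and message wording below are assumptions for illustration, not the patented implementation.

```python
# Hypothetical sketch of method 700's feedback steps: restate the spoken
# command and ask for confirmation, or request clarification. Wording is
# an assumption; the message may be audible or displayed on a control panel.

def confirmation_message(spoken_command: str) -> str:
    """Restatement of the spoken command plus a request for confirmation."""
    return f'You stated, "{spoken_command}". Is this correct?'

def clarification_request(unclear_part: str) -> str:
    """Request for clarification of a spoken command."""
    return f"I did not understand the {unclear_part}. Please repeat the rule."

msg = confirmation_message("lock the front door every time Bill arms the system")
# msg would then be spoken via a speaker and/or displayed on the control panel
```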
- FIG. 8 is a flow diagram illustrating one embodiment of a method 800 for establishing operation rules in a home automation system.
- the method 800 may be implemented by any one of the voice control modules 115 described with reference to FIGS. 1-6 .
- method 800 may be performed generally by device 105 or the remote device 310 described with reference to FIGS. 1-4 , or even more generally by the environments 100 , 200 , 300 , 400 shown in FIGS. 1-4 .
- method 800 includes initiating a rule setting mode for the home automation system.
- Block 810 includes receiving a spoken command having a plurality of rule setting terms.
- Block 815 includes generating at least one operation rule for the home automation system based on the spoken command.
- method 800 includes generating a message that includes a restatement of the at least one operation rule.
- Method 800 may also include defining at least some of the plurality of rule setting terms at the time of installing the home automation system at a property. At least some of the plurality of rule setting terms may be generic to a plurality of different home automation systems associated with a plurality of different properties. The plurality of rule setting terms may include at least one static term that is generic to a plurality of different home automation systems, and at least one dynamic term that is uniquely defined for the home automation system. Method 800 may include storing the at least one operation rule for later use by the home automation system, wherein storing the at least one operation rule includes storing in a local database of the home automation system. The method 800 may include generating a message requesting confirmation of the spoken command. Method 800 may include receiving confirmation of the spoken command.
- FIG. 9 is a flow diagram illustrating one embodiment of a method 900 for establishing operation rules in a home automation system.
- the method 900 may be implemented by the voice control module 115 described with reference to FIGS. 1-6 .
- method 900 may be performed generally by the device 105 or remote device 310 shown with reference to FIGS. 1-4 , or even more generally by the environments 100 , 200 , 300 , 400 shown in FIGS. 1-4 .
- method 900 includes initiating a rules mode for the home automation system.
- Block 910 includes receiving a spoken command having a plurality of rule setting terms.
- method 900 includes establishing at least one operation rule for the home automation system using the spoken command.
- Block 920 includes storing the at least one operation rule.
- Block 925 includes operating at least one function of the home automation system based on the at least one stored operation rule.
- the plurality of rule setting terms may include at least one static term that is preprogrammed into the home automation system prior to installation.
- the plurality of rule setting terms may include at least one dynamic term that is uniquely programmed into the home automation system no sooner than installation of the home automation system at a property.
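The split between static terms (preprogrammed before installation) and dynamic terms (programmed no sooner than installation) might be represented as a two-part vocabulary. The class, names, and contents below are hypothetical illustrations, not the patented data model.

```python
# Hypothetical two-part vocabulary: static terms are preprogrammed at the
# factory; dynamic terms are registered no sooner than installation at a
# property (e.g., device names and user names). All contents are assumed.

FACTORY_STATIC_TERMS = frozenset(
    {"lock", "unlock", "arm", "disarm", "turn on", "turn off"})

class Vocabulary:
    def __init__(self):
        self.static_terms = set(FACTORY_STATIC_TERMS)  # fixed before installation
        self.dynamic_terms = set()                     # empty until installation

    def register_dynamic_term(self, term: str):
        """Called at or after installation, e.g. naming 'front door' or 'Bill'."""
        self.dynamic_terms.add(term.lower())

    def knows(self, term: str) -> bool:
        t = term.lower()
        return t in self.static_terms or t in self.dynamic_terms

vocab = Vocabulary()
vocab.register_dynamic_term("front door")
vocab.register_dynamic_term("Bill")
```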
- FIG. 10 depicts a block diagram of a controller 1000 suitable for implementing the present systems and methods.
- the controller 1000 may be an example of the device 105 or remote device 310 illustrated in FIGS. 1, 2, 3 and/or 4 .
- controller 1000 includes a bus 1005 which interconnects major subsystems of controller 1000, such as a central processor 1010, a system memory 1015 (typically RAM, but which may also include ROM, flash RAM, or the like), an input/output controller 1020, an external audio device, such as a speaker system 1025 via an audio output interface 1030, an external device, such as a display screen 1035 via display adapter 1040, an input device 1045 (e.g., remote control device interfaced with an input controller 1050), multiple USB devices 1065 (interfaced with a USB controller 1070), and a storage interface 1080. Also included are at least one sensor 1055 connected to bus 1005 through a sensor controller 1060 and a network interface 1085 (coupled directly to bus 1005).
- Bus 1005 allows data communication between central processor 1010 (e.g., processor 120 ) and system memory 1015 , which may include read-only memory (ROM) or flash memory (neither shown), and random access memory (RAM) (not shown), as previously noted.
- Voice control module 115-c, which is one example of voice control module 115 shown in FIGS. 1-6, may be stored in system memory 1015. Any of the modules disclosed with reference to FIGS. 1-6 may be stored in system memory 1015.
- the RAM is generally the main memory into which the operating system and application programs are loaded.
- the ROM or flash memory can contain, among other code, the Basic Input-Output system (BIOS) which controls basic hardware operation such as the interaction with peripheral components or devices.
- Applications (e.g., application 405) resident with controller 1000 are generally stored on and accessed via a non-transitory computer readable medium, such as a hard disk drive (e.g., fixed disk 1075) or other storage medium. Additionally, applications can be in the form of electronic signals modulated in accordance with the application and data communication technology when accessed via network interface 1085.
- Storage interface 1080 can connect to a standard computer readable medium for storage and/or retrieval of information, such as a fixed disk drive 1075 .
- Fixed disk drive 1075 may be a part of controller 1000 or may be separate and accessed through other interface systems.
- Network interface 1085 may provide a direct connection to a remote server via a direct network link to the Internet via a POP (point of presence).
- Network interface 1085 may provide such connection using wireless techniques, including digital cellular telephone connection, Cellular Digital Packet Data (CDPD) connection, digital satellite data connection, or the like.
- one or more sensors (e.g., motion sensor, smoke sensor, glass break sensor, door sensor, window sensor, carbon monoxide sensor, and the like) connect to controller 1000 wirelessly via network interface 1085.
- The operating system provided on controller 1000 may be iOS®, ANDROID®, MS-DOS®, MS-WINDOWS®, OS/2®, UNIX®, LINUX®, or another known operating system.
- a signal can be directly transmitted from a first block to a second block, or a signal can be modified (e.g., amplified, attenuated, delayed, latched, buffered, inverted, filtered, or otherwise modified) between the blocks.
- a signal input at a second block can be conceptualized as a second signal derived from a first signal output from a first block due to physical limitations of the circuitry involved (e.g., there will inevitably be some attenuation and delay). Therefore, as used herein, a second signal derived from a first signal includes the first signal or any modifications to the first signal, whether due to circuit limitations or due to passage through other circuit elements which do not change the informational and/or final functional aspect of the first signal.
- the terms “a” or “an,” as used in the specification and claims, are to be construed as meaning “at least one of.”
- the words “including” and “having,” as used in the specification and claims are interchangeable with and have the same meaning as the word “comprising.”
- the term “based on” as used in the specification and the claims is to be construed as meaning “based at least upon.”
Description
- The present application is a continuation of U.S. patent application Ser. No. 14/272,181, filed May 7, 2014, titled “HOME AUTOMATION VIA VOICE CONTROL,” and assigned to the assignee hereof, the disclosure of which is expressly incorporated herein in its entirety by this reference.
- Advancements in media delivery systems and media-related technologies continue to increase at a rapid pace. Increasing demand for media has influenced the advances made to media-related technologies. Computer systems have increasingly become an integral part of the media-related technologies. Computer systems may be used to carry out several media-related functions. The wide-spread access to media has been accelerated by the increased use of computer networks, including the Internet and cloud networking.
- Many homes and businesses use one or more computer networks to generate, deliver, and receive data and information between the various computers connected to computer networks. Users of computer technologies continue to demand increased access to information and an increase in the efficiency of these technologies. Improving the efficiency of computer technologies is desirable to those who use and rely on computers.
- With the wide-spread use of computers and mobile devices has come an increased presence of home automation and security products. Advancements in mobile devices allow users to monitor and/or control an aspect of a home or business. As home automation and security products expand to encompass other systems and functionality in the home, challenges exist in setting rules for various automated functions of the home automation and security system.
- Methods and systems are described for setting operation rules that are used to control at least some functions of a home automation system. According to at least one embodiment, an apparatus for establishing operation rules in a home automation system includes a processor, a memory in electronic communication with the processor, and instructions stored in the memory which are executable by the processor to receive a spoken command having a plurality of rule setting terms, establish at least one operation rule for the home automation system based on the spoken command, and store the at least one operation rule for later use by the home automation system.
- In one example, the instructions may be executable by the processor to initiate a rules mode for the home automation system prior to receiving the spoken command. The instructions may be executable by the processor to terminate the rules mode after establishing the at least one operation rule. Initiating the rules mode may include receiving a touch input at a control panel of the home automation system. Initiating the rules mode may include receiving a spoken trigger word or an audible sound. The instructions may be executable by the processor to deliver a request for clarification of the spoken command. The request for clarification may include an audible message. The request for clarification may include a displayed message visible on a control panel of the home automation system. The instructions may be executable by the processor to generate a message that includes a restatement of the spoken command and a request for confirmation of the correctness of the restatement of the spoken command. The instructions may be executable by the processor to generate a message to a user of the home automation system that includes a restatement of the at least one operation rule and requests confirmation of the accuracy of the at least one operation rule.
- Another embodiment of the present disclosure relates to a computer-program product for establishing operation rules in a home automation system. The computer-program product includes a non-transitory computer-readable medium storing instructions executable by a processor to initiate a rule setting mode for the home automation system, receive a spoken command having a plurality of rule setting terms, generate at least one operation rule for the home automation system based on the spoken command, and generate a message that includes a restatement of the at least one operation rule.
- In one example, at least some of the plurality of rule setting terms may be defined during installation of the home automation system at a property. At least some of the plurality of rule setting terms may be generic to a plurality of different home automation systems associated with a plurality of different properties. The plurality of rule setting terms may include at least one static term that is generic to a plurality of different home automation systems, and at least one dynamic term that is uniquely defined for the home automation system. The instructions may be executable by the processor to store the at least one operation rule for later use by the home automation system, wherein storing the at least one operation rule includes storing in a local database of the home automation system. The instructions may be executable by the processor to generate a message requesting confirmation of the spoken command. The instructions may be executable by the processor to receive confirmation of the spoken command.
- A further embodiment of the present disclosure relates to a computer-implemented method for establishing operation rules in a home automation system. The method includes initiating a rules mode for the home automation system, receiving a spoken command having a plurality of rule setting terms, establishing at least one operation rule for the home automation system using the spoken command, storing the at least one operation rule, and operating at least one function of the home automation system based on the at least one stored operation rule.
- In one example, the plurality of rule setting terms may include at least one static term that is pre-programmed into the home automation system prior to installation. The plurality of rule setting terms may include at least one dynamic term that is uniquely programmed into the home automation system no sooner than installation at a property.
- The foregoing has outlined rather broadly the features and technical advantages of examples according to the disclosure in order that the detailed description that follows may be better understood. Additional features and advantages will be described hereinafter. The conception and specific examples disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. Such equivalent constructions do not depart from the spirit and scope of the appended claims. Features which are believed to be characteristic of the concepts disclosed herein, both as to their organization and method of operation, together with associated advantages will be better understood from the following description when considered in connection with the accompanying figures. Each of the figures is provided for the purpose of illustration and description only, and not as a definition of the limits of the claims.
- A further understanding of the nature and advantages of the embodiments may be realized by reference to the following drawings. In the appended figures, similar components or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
- FIG. 1 is a block diagram of an environment in which the present systems and methods may be implemented;
- FIG. 2 is a block diagram of another environment in which the present systems and methods may be implemented;
- FIG. 3 is a block diagram of another environment in which the present systems and methods may be implemented;
- FIG. 4 is a block diagram of another environment in which the present systems and methods may be implemented;
- FIG. 5 is a block diagram of a voice control module for use in at least one of the environments shown in FIGS. 1-4;
- FIG. 6 is a block diagram of another voice control module for use in at least one of the environments shown in FIGS. 1-4;
- FIG. 7 is a flow diagram illustrating a method for establishing operation rules in a home automation system;
- FIG. 8 is a flow diagram illustrating a method for establishing operation rules in a home automation system;
- FIG. 9 is a flow diagram illustrating a method for establishing operation rules in a home automation system; and
- FIG. 10 is a block diagram of a computer system suitable for implementing the present systems and methods of FIGS. 1-9.
- While the embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the instant disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.
- The systems and methods described herein relate to home automation and home security, and related security systems and automation for use in commercial and business settings. More specifically, the systems and methods described herein relate to operating features of a home automation system using voice control. As used herein, the phrase “home automation system” may refer to a system that includes automation features alone, security features alone, a combination of automation and security features, or a combination of automation, security and other features. While the phrase “home automation system” is used throughout to describe a system or components of a system or environment in which aspects of the present disclosure are described, such an automation system and its related features (whether automation and/or security features) may be generally applicable to other properties such as businesses and commercial properties as well as systems that are used in indoor and outdoor settings.
- Home automation systems are typically programmed via an on-site control panel that is located at the property that is being monitored/controlled by the system. The control panel usually includes a touch screen with a number of categories and menus to select from for programming various functions of the system. The user-selected options create rules by which the system operates/functions. For example, the system may be programmed by selecting among the options presented on the control panel screen to arm or disarm the system at a certain time of the day or day of the week based on the programmed rules. Once the “rules” are established, the system operates automatically when the time of day or day of week condition is met.
- Home automation systems are initially programmed for a particular property by a professional installer who is familiar with the control panel and how to program and/or modify functionality of the system. The property owner is usually less familiar with the control panel and how to modify functionality of the system after it is installed. Opportunities exist for improving the user interface with a home automation system, and particularly the process by which the operation rules of the system are established and/or modified.
- One aspect of the present disclosure relates to using voice control with a home automation system. The voice control may particularly relate to establishing or changing rules by which the automation system operates, rather than prompting functionality of the automation system. Real-time control of at least some functionality of an automation system (speaking a command to arm the system or unlock a door) is relatively well-known in this technical field. The present disclosure is intended to alleviate challenges for a user in interfacing with the home automation system (e.g., control panel) to establish or modify the rules by which the home automation system operates. The voice commands given in accordance with the present disclosure typically do not result in real-time change in functionality of the system, but rather are used at a later time to control functionality of the system.
- In one example, a user may first activate a “rules” mode for the home automation system and then speak a command such as “unlock back door every day at 7:00 a.m.” or “turn up basement thermostat to 75 degrees when Jason arrives home after noon.” The “rules” mode may be activated by speaking a trigger word or phrase (e.g., “rules mode”), selecting a rules button or category on the control panel (e.g., on a display screen of the control panel), or entering a key word at the control panel.
- The command may include static and/or dynamic vocabulary that is interpreted in order for the system to understand the meaning of the command. Static vocabulary may include those terms that are consistent for at least a certain type of home automation system (e.g., unlock, every, day, 7:00, a.m., turn up, after, noon, 75 degrees, etc.). The system may include a library of static vocabulary and definitions associated with each term. Dynamic vocabulary may include those terms that are specific to a particular property and system setup. For example, the various access points of a building may be named according to user preferences at the time of system setup (e.g., back patio door, basement thermostat, etc.). The dynamic terms may be, for example, stored separately, defined at the time of setup, and/or modified during the lifetime of the system. The dynamic vocabulary may also include the names or identifiers of persons who are authorized to access the property (e.g., Jason, mom, etc.). The persons may be identified by the system in a variety of ways, including cell phones and fobs carried by the persons, codes entered by the persons when entering or leaving the property, voice or face recognition, etc.
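The separation of a spoken command into static and dynamic vocabulary described above might be sketched as follows. This is an illustrative assumption, not the disclosed implementation; the term lists and the `classify_terms` helper are hypothetical names chosen for this example:

```python
# Hypothetical sketch: split a spoken command into static terms (common to
# a system type), dynamic terms (named for this property), and unknowns.

# Static terms: assumed consistent across installations of this system type.
STATIC_TERMS = {"unlock", "lock", "every", "day", "at", "after",
                "turn", "up", "to", "when", "arrives", "home", "noon"}

# Dynamic terms: assumed to be named at setup for this specific property.
DYNAMIC_TERMS = {"back door", "basement thermostat", "jason"}

def classify_terms(command):
    """Tag dynamic multi-word phrases, then classify remaining tokens."""
    text = command.lower()
    found_dynamic = [t for t in DYNAMIC_TERMS if t in text]
    # Remove matched dynamic phrases before tokenizing the remainder.
    for phrase in found_dynamic:
        text = text.replace(phrase, " ")
    tokens = text.split()
    found_static = [t for t in tokens if t in STATIC_TERMS]
    unknown = [t for t in tokens if t not in STATIC_TERMS]
    return found_static, found_dynamic, unknown

static, dynamic, unknown = classify_terms(
    "unlock back door every day at 7:00 a.m.")
```

Terms left in `unknown` (here, the time expression) would be handed to further parsing or to the clarification prompts described below in the disclosure.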
- The system may be configured to prompt the user for clarification of the spoken command if the command includes one or more unknown terms or is unclear in some way. The prompt may be an audible message (e.g., a spoken message such as, “you mentioned back door, is this the south back door or the east back door?”) or displayed (e.g., on a screen of a control panel or on a mobile handheld device). The system may also repeat the rule back to the user and request confirmation of its correctness before storing and/or using the rule to control functionality of the system.
- As mentioned above, the rule setting mode for the home automation system may be operated through a control panel of the home automation system. Additionally, or alternatively, the rule setting mode may be operated via a remote computing device, such as a mobile handheld device, laptop computer, desktop computer, mobile tablet computing device, or the like. Any function related to setting rules of the home automation system, such as the voice control features discussed herein for setting rules of the home automation system, may be performed using such a remote computing device.
FIG. 1 is a block diagram of one example of an environment 100 in which the present systems and methods may be implemented. In some embodiments, the systems and methods described herein may be performed, at least in part, using a device 105, which includes a user interface 110, a voice control module 115, and a processor 120.
Environment 100 may include or be a part of a home automation system. Device 105 may include or be part of a control panel of a home automation system. Device 105 may be a permanently installed device (e.g., a control panel mounted to a wall of a building) or it may be a mobile device such as, for example, a desktop computer, a laptop computer, a tablet computer, or a mobile handheld device such as a smartphone. Device 105 may operate at least in part to establish rules of operation for a home automation system. As mentioned above, rules are typically established for a home automation system using a somewhat complicated and oftentimes non-user-friendly system of drop-down menus, questions, and the like that a user must navigate via, for example, a display on a control panel. A potential advantage of the voice control features of the present disclosure for setting rules for a home automation system is the reduced complexity of creating rules and being able to avoid at least some of the confusing, complex menus and interfaces that are otherwise required to establish such rules.
User interface 110 may provide a physical interface between device 105 and a user. User interface 110 may include, for example, a touch screen with a plurality of user-activated menus and screen-selectable features. User interface 110 may include one or more physical buttons, switches, or the like. User interface 110 may include at least one feature that provides a user the ability to enter into or exit a rule setting mode for the home automation system. User interface 110 may include other features or functionality that permit the user to select certain rules and/or responses to questions as part of setting rules related to a home automation system.
Voice control module 115 may operate at least in part in response to audible sounds and to establish one or more rules for the home automation system in response to the audible sounds. The audible sounds may be voice commands generated by one or more users. Additionally, or alternatively, the audible sounds may include tones, notes, beeps, or other sounds that are generated by a device and/or by the user. -
Voice control module 115 may provide an interactive experience with a user to establish a rule for a home automation system. For example, voice control module 115 may receive a voice command from a user which places the home automation system in a rule setting mode. Once in the rule setting mode, voice control module 115 may prompt the user for information related to setting one or more rules. Voice control module 115 may receive audible responses and/or inputs via, for example, user interface 110, from the user to create a rule. Voice control module 115 may respond to the user with audible and/or displayed responses for the purpose of, for example, confirming instructions from the user related to the rule being established, requesting clarification of the inputs from the user related to setting a rule, or confirming that the rule has been established. The functionality provided by voice control module 115 may eliminate, at least in part, the need for a user to provide inputs via, for example, user interface 110 or other physical inputs from the user in order to establish a rule for the home automation system. In at least some embodiments, voice control module 115 may be operable to establish at least one rule of the home automation system using only audible input from the user.
Processor 120 may cooperate with user interface 110 and voice control module 115 to perform any of the computing and/or logic functions related to, for example, inputting information from the user, establishing rules, storing rules, and later providing at least some automated function of the home automation system based on the rules.
Typically, the rules established by voice control module 115 are stored and used for future operation of one or more features of the home automation system. Rules may be stored in any of a number of locations including, for example, a memory or database of device 105. In other examples, the rules may be stored at other locations such as, for example, in cloud storage or in a backend server available via, for example, a network. Once saved, the rules may provide for automated functionality of the home automation system based on certain conditions. For example, a rule established via voice control module 115 may be that the "front door is locked every time Bill arms the system." The home automation system may be able to identify the difference between Bill and other users of the system based on, for example, a log-in/security code, voice recognition, a pre-registered mobile device that is identifiable via, for example, Bluetooth technology, or the like. Bill may arm the system in any of a number of ways including, for example, manually arming the system via user interface 110 or another feature of device 105, arming the system via a remote computing device (e.g., a mobile handheld device), or using a voice command of "arm system" spoken from anywhere in the property and picked up by any number of microphones or other features of the home automation system.
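The store-then-evaluate behavior described above can be sketched in code. This is a simplified illustration under assumed names (the `Rule` class, the event string `"arm_system"`, and the `handle_event` helper are not part of the disclosure):

```python
# Hypothetical sketch: a stored rule pairs a trigger condition with an
# action and is evaluated later, when system events occur, rather than
# causing a real-time change when the rule is spoken.

class Rule:
    def __init__(self, trigger_event, trigger_user, action):
        self.trigger_event = trigger_event   # e.g. "arm_system"
        self.trigger_user = trigger_user     # e.g. "bill"
        self.action = action                 # e.g. "lock front door"

    def matches(self, event, user):
        return event == self.trigger_event and user == self.trigger_user

# Rule created earlier by voice: "lock the front door every time Bill
# arms the system" (example from the text above).
stored_rules = [Rule("arm_system", "bill", "lock front door")]

def handle_event(event, user):
    """Return the actions fired by stored rules for this event."""
    return [r.action for r in stored_rules if r.matches(event, user)]
```

For example, `handle_event("arm_system", "bill")` would fire the stored action, while the same event from another identified user would fire nothing.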
voice control module 115 may include a variety of different terms. Some of the terms may be classified as “static” terms that are generic to all of at least a certain type of home automation systems. The generic terms may include such words as, for example, lock, unlock, arm, disarm, on, off, open, close, daytime, night, 1:00, noon, I, me, arrive, depart, temperature, raise, lower, and the like. - Other types of terms may be classified as “dynamic” terms that are unique to a specific property or particular home automation system installation. The names of each of the people living in the home or others who may be authorized to operate the home automation system may be stored in a dynamic term database. Other dynamic terms may include, for example, particular names assigned to each of the barriers, appliances, systems, or areas of the home (e.g., front door, south garage door, kitchen light, patio window, south HVAC zone, front sprinkler zone, and the like). The dynamic terms may be established at the time of, for example, installing the home automation system, as prompted at any time via, for example,
voice control module 115, or based on a questionnaire that is answerable, for example, when updating or upgrading the home automation system. The static and dynamic terms may be stored in different locations. The static terms may be pre-programmed in the home automation system. In at least some examples, the static terms may be permanent and unable to be altered. In contrast, the dynamic terms may be updated, added to, or changed as desired. -
Voice control module 115 may search the static and dynamic term databases in response to receiving a voice command from a user to determine whether the voice command provides enough specificity to generate a rule. If there is uncertainty and/or a lack of clarity related to any of the static or dynamic terms identified in the spoken command, voice control module 115 may prompt the user to, for example, restate the voice command, restate a specific term, provide clarification of a given term, or confirm that the spoken command (as repeated back to the user via voice control module 115) accurately reflects the user's intended instructions and/or rule.
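A minimal sketch of the clarification behavior just described, assuming a hypothetical list of named access points (`ACCESS_POINTS`) and a `resolve_term` helper that are illustrative only:

```python
# Hypothetical sketch: if a spoken term matches exactly one named access
# point, use it; if it matches several, prompt the user to disambiguate;
# if it matches none, ask the user to restate the command.

ACCESS_POINTS = ["south back door", "east back door", "front door"]

def resolve_term(term):
    """Return the unique access point, or a clarification prompt."""
    matches = [ap for ap in ACCESS_POINTS if term in ap]
    if len(matches) == 1:
        return matches[0]
    if len(matches) > 1:
        return ("You mentioned '%s', did you mean: %s?"
                % (term, " or ".join(matches)))
    return "Unknown term '%s', please restate the command." % term
```

With these assumed names, "back door" matches two entries and produces a prompt of the kind quoted earlier ("is this the south back door or the east back door?"), while "front door" resolves without any prompt.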
FIG. 2 shows another environment 200 that may include the components in environment 100 described above, and may further include a device 105-a having a user interface 110, a voice control module 115, a processor 120, a speaker 205, a microphone 210, and a term database 215. The speaker 205 and microphone 210 may facilitate audible communication between the user and the home automation system (e.g., voice control module 115). Speaker 205 may convey audible responses from the home automation system to the user. Microphone 210 may operate to receive audible signals from the user (e.g., voice commands). Speaker 205 and microphone 210 may be integrated into device 105-a. In other examples, speaker 205 and/or microphone 210 may be provided as separate components from device 105-a. Furthermore, speaker 205 and microphone 210 may provide multiple functions, some of which may be unrelated to establishing rules for the home automation system. For example, speaker 205 may be used to convey alarm signals, convey messages to the user related to the status of one or more barriers at the home, convey audible messages related to, for example, current weather conditions or the status of an HVAC system, or prompt the user for further instructions related to operation of the home automation system. Microphone 210 may operate to receive audible signals from a user including, for example, instructions related to operation of security features and/or automation features for real-time control of the home automation system.
Term database 215 may include at least one storage location for storing one or more static and/or dynamic terms. Term database 215 may be part of or include memory capability for device 105-a. In one example, device 105-a may include a storage device (e.g., hard disk drive or solid state drive) that includes a portion thereof dedicated to storing terms (e.g., the term database 215). In other examples, term database 215 may be a separate and distinct storage device and/or storage location, and may include storage at a remote location. Term database 215 may be provided as a component of the home automation system that is separate from device 105-a. Term database 215 may be accessible via, for example, a network. Term database 215 may include separate storage devices for static terms versus dynamic terms.
FIG. 3 shows another environment 300 that may include the components of environments 100 and/or 200, and may further include a backend server 305, a remote device 310, and a network 315. Additionally, a device 105-b may include, in addition to the components of device 105-a shown in FIG. 2, a display 320.
Backend server 305 may provide at least some backend support, functionality, and/or storage for a home automation system that includes device 105-b. In one example, at least some of the static and/or dynamic terms associated with term database 215 may be stored at the backend server 305. Backend server 305 may provide support services such as, for example, communicating with emergency personnel in response to an alarm condition identified by device 105-b. In other examples, at least some of the rules established via operation of voice control module 115 may be stored in backend server 305.
Remote device 310 may provide operation of at least some features of the home automation system from a remote location. For example, remote device 310 may facilitate communication between a user of the home automation system and voice control module 115. Remote device 310 may include, for example, a speaker and/or microphone to provide voice commands used to establish rules via voice control module 115. Voice control module 115 may generate audible responses and/or displayed responses that are communicated to the user via remote device 310. For example, remote device 310 may be operated by the user to initiate a rule setting mode via voice control module 115. Once the rule setting mode is initiated, the user may provide one or more voice commands via remote device 310 that are used to establish one or more rules of the home automation system. Voice control module 115 may provide a response to the spoken commands, wherein the responses received at the remote device 310 are communicated to the user in some way. Once the rules are established via voice control module 115, the rules may be stored and used to operate the home automation system at a later time.
Network 315 may provide communication between device 105-b, backend server 305, and remote device 310, or other components of environment 300. Network 315 may include local area networks (LAN), wide area networks (WAN), virtual private networks (VPN), wireless networks (using 802.11, for example), and/or cellular networks (using 3G and/or LTE, for example), etc. In some embodiments, network 315 may include the internet.
Environment 300 shows display 320 as a component of device 105-b. In other examples, display 320 may be provided as a separate component from device 105-b. Display 320 and other features or components of device 105-b may be provided separate from device 105-b and communicate with each other via, for example, network 315. Voice control module 115 may communicate information via display 320 to a user during, for example, a rule setting mode of the home automation system. Display 320 may be part of user interface 110. Display 320 may include, for example, a touch screen. Display 320 may display colors, patterns, and the like, as well as text and other symbols. In one example, display 320 displays a text message such as "please restate previous voice command," "I heard you say to lock the door every time Bill arms the system, is that correct?" or "you said back door, do you mean back patio door or back bedroom door?" In at least some examples, voice control module 115 may provide a text message via display 320 as well as an audible message via speaker 205. The text and audible messages may be selected depending on, for example, considerations such as whether the user setting the rules is in close proximity to device 105-b, whether the user is able to read, etc.
FIG. 4 shows another environment 400 that may include the components of any of the environments 100, 200, and/or 300, and may further include an application 405, a mobile computing device 410, a sensor 415, a home automation controller 420, and a remote device 310-a. Remote device 310-a includes a user interface 425, a remote voice control module 430, a processor 435, a speaker 440, a microphone 445, and a display 450. Environment 400 may also include a display 320-a that is separate from device 105-a. Device 105-a, which may include the same or similar components as shown in FIG. 2, may communicate with the other components and features of environment 400 via network 315.
Application 405 may allow a user to control, either directly or via device 105-a or a remote device 310-a, an aspect of the monitored property, including security, energy management, locking and unlocking doors, checking the status of a door, locating a user or item, controlling lighting, thermostat, or cameras, and retrieving notifications regarding a current status or anomaly associated with a home, office, place of business, and the like. In some configurations, application 405 may enable device 105-a and/or remote device 310-a to interface with voice control module 115, processor 120, or the like and provide the user interface 110 to display automation, security, and/or energy management content on device 105-a and/or remote device 310-a. Thus, application 405, via user interface 110 or other features of environment 400, may allow users to control aspects of their home, office, and/or other type of property. Further, application 405 may be installed on device 105-a, remote device 310-a, or other components of environment 400. Application 405 may operate in response to at least one rule established via voice control module 115.
Display 320-a may include, for example, a digital display as part of, for example, a control panel of environment 400 (e.g., a home automation system). Display 320-a may be provided via devices such as, for example, a desktop computer or a mobile computing device (e.g., remote device 310-a). The user interface 110 may be integrated into display 320-a. Such a user interface 110 may include a plurality of menus, screens, microphones, speakers, cameras, and other capability that permit interaction between the user and the home automation system or other components of environment 400. Additionally, or alternatively, user interface 110 with display 320-a may be integrated into device 105-a as discussed above related to environment 300.
Mobile computing device 410 may include remote device 310-a or be part of remote device 310-a. Mobile computing device 410 may be used to interact with voice control module 115 from a local or remote location. For example, mobile computing device 410 may provide initiation of a rule setting mode, may communicate a voice command associated with establishing a rule via voice control module 115, and the like, from any location relative to device 105-a.
Sensor 415 may include, for example, a camera sensor, an audio sensor, a forced entry sensor, a shock sensor, a proximity sensor, a boundary sensor, an appliance sensor, a light fixture sensor, a temperature sensor, a light beam sensor, a three-dimensional (3D) sensor, a motion sensor, a smoke sensor, a glass break sensor, a door sensor, a video sensor, a carbon monoxide sensor, an accelerometer, a global positioning system (GPS) sensor, a Wi-Fi positioning sensor, a capacitance sensor, a radio frequency sensor, a near-field sensor, a heartbeat sensor, a breathing sensor, an oxygen sensor, a carbon dioxide sensor, a brain wave sensor, a voice sensor, a touch sensor, and the like. The device 105-a, application 405, display 320-a, remote device 310-a, home automation controller 420, or other components of environment 400 and/or a home automation system may include or have integrated therein one or more of the sensors 415. Although sensor 415 is depicted as a separate component from device 105-a, remote device 310-a, and other components of environment 400, in some embodiments sensor 415 may be connected directly to any one of these components or other components of environment 400. Additionally, or alternatively, sensor 415 may be integrated into a home appliance or fixture such as a light fixture. The information provided by sensor 415 may be used to initiate a rule setting mode of a home automation system, confirm user feedback associated with questions or other feedback provided via voice control module 115, confirm identification of one or more users attempting to establish or change rules of the home automation system, and the like. In other examples, sensor 415 may operate in response to a rule established via voice control module 115 (e.g., rules established by voice commands, stored, and then used by the home automation system at a later time to permit one or more features and/or functions of the home automation system).
Home automation controller 420 may include or be part of processor 120 and/or processor 435. Home automation controller 420 may operate to control any of the features or functionality of a home automation system and/or component of environment 400. Home automation controller 420 may operate at least in part in response to rules established via voice control module 115.
Remote device 310-a may include at least some of the features and/or functionality of device 105-a, and may provide such functionality from a location that is remote from device 105-a and/or a property being controlled and/or monitored by a home automation system. The user interface 425 may include any of the features and/or functionality of user interface 110. Remote voice control module 430 may include at least some of the features and/or functionality of voice control module 115 described above. Likewise, processor 435, speaker 440, microphone 445, and display 450 may include at least some of the same features and/or functionality of the processor 120, speaker 205, microphone 210, and display 320, respectively, described above with reference to device 105. For example, remote voice control module 430 may facilitate initiation of a rule setting mode for the home automation system via remote device 310-a. Remote voice control module 430 may operate to receive voice commands and other input related to establishing one or more rules, and may provide communication and/or responses to the user as part of establishing the rules. Speaker 440 and microphone 445 may operate to communicate between the user and the home automation system. Speaker 440 and microphone 445 may also provide communication between a user operating remote device 310-a and the voice control module 115 of device 105-a (e.g., via network 315). Furthermore, display 450 may display messages to the user operating remote device 310-a in response to operation of voice control module 115.
In at least some respects, remote device 310-a may provide a user with the ability to generate and/or change rules for operation of the home automation system while the user is positioned at a location remote from device 105-a (which may be any location remote from a property being monitored by the home automation system). In other examples, remote device 310-a may provide communication between the user and voice control module 115, which operates on device 105-a locally at the property being monitored by the home automation system, while the user is positioned at a location remote from the property.
FIG. 5 shows a voice control module 115-a that may be one example of the voice control modules 115 shown and described with reference to FIGS. 1-4. Voice control module 115-a may include a rules mode module 505, a rule generation module 510, and a rule storage module 515. In other embodiments, voice control module 115-a may include other modules that perform the same or different functionality for the voice control module 115.
Rules mode module 505 may operate to place the home automation system in a rule setting mode. The rule setting mode may be initiated by, for example, the user speaking a voice command that includes a particular trigger word/phrase such as, for example, "start rules mode," "set new rule," or "change a rule." In another example, the rule setting mode is initiated by the user providing manual input such as, for example, touching an active area of a touch screen, pressing a button, operating a switch, etc. (e.g., via user interface 110 or user interface 425).
Rules mode module 505 may provide feedback to the user to confirm that the home automation system is in a rule setting mode. Rules mode module 505 may provide an indicator such as, for example, an audible message that states "rule setting mode active," a text message displayed to the user that reads "rules mode," a symbol, a change in color of a light indicator, etc. In another example, rules mode module 505 may prompt the user for instructions as a way to confirm that the home automation system is in a rule setting mode. For example, the rules mode module 505 may provide an audible or text message stating "provide rules instructions" or "enter rule."
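The trigger-and-confirm behavior of the rules mode described above can be sketched as follows. The trigger phrases are taken from the examples in the text; the `RulesMode` class name and return values are assumptions for illustration:

```python
# Hypothetical sketch: enter the rule setting mode on a spoken trigger
# phrase and return the confirmation message described above.

TRIGGER_PHRASES = ("start rules mode", "set new rule", "change a rule")

class RulesMode:
    def __init__(self):
        self.active = False

    def hear(self, utterance):
        """Activate on a trigger phrase; return the confirmation message,
        or None when the utterance is not a mode trigger."""
        if utterance.lower().strip() in TRIGGER_PHRASES:
            self.active = True
            return "rule setting mode active"
        return None

mode = RulesMode()
ack = mode.hear("Start rules mode")
```

Only after the mode is active would subsequent utterances be treated as rule-setting instructions rather than real-time commands.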
Rule generation module 510 may receive the rules instructions from the user and generate at least one rule for the home automation system in response to the received instructions. Rule generation module 510 may operate to provide communication back to the user in response to instructions received from the user. The responsive communication may be in any format. For example, the user may provide audible instructions related to setting a rule (e.g., "lock the front door every time Bill arms the system"), and rule generation module 510 provides a confirmation of the instructions to the user via audible, text, or other means of communication with a message such as "you stated, lock the door every time Bill arms the system, is this correct?" Rule generation module 510 may also confirm that a rule has been set by communicating a message to the user such as, for example, "rule set" or "created new rule."
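The restate-and-confirm exchange described above might look like the following sketch. The message formats and the `confirm_message`/`generate_rule` helpers are illustrative assumptions, not the disclosed implementation:

```python
# Hypothetical sketch: restate the user's instruction back to them, and
# only create the rule once the user confirms the restatement.

def confirm_message(instruction):
    """Build the restatement message sent back to the user."""
    return "you stated, %s, is this correct?" % instruction

def generate_rule(instruction, user_reply):
    """Create the rule only if the user confirms the restatement."""
    if user_reply.strip().lower() in ("yes", "correct", "that's right"):
        return {"rule": instruction, "status": "rule set"}
    return None  # e.g., re-prompt the user instead
```

A negative reply would typically lead back to a prompt such as "please restate previous voice command," as described elsewhere in the disclosure.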
Rule storage module 515 may store the rule for later use by the home automation system. Rule storage module 515 may provide instructions for saving the rule in any of a number of locations such as, for example, the memory of a control panel (e.g., device 105), or at a remote location such as, for example, backend server 305. Rule storage module 515 may also operate to retrieve the rule and/or permit the user to modify and/or replace a rule at a later time.
FIG. 6 is a block diagram illustrating another voice control module 115-b. Voice control module 115-b may be one example of the voice control module 115 shown in any of FIGS. 1-5. Voice control module 115-b may include the rules mode module 505, rule generation module 510, and rule storage module 515 of voice control module 115-a described above with reference to FIG. 5. Voice control module 115-b may additionally include at least one of a command receipt module 605, a term identification module 610, a command confirmation module 615, and a rule confirmation module 620. The modules 605, 610, 615, 620 may operate in cooperation with the rules mode module 505, rule generation module 510, and rule storage module 515 described above with reference to voice control module 115-a, and/or the voice control module 115 described with reference to FIGS. 1-4.
Command receipt module 605 may prompt a user for instructions relating to setting a rule for the home automation system. Command receipt module 605 may present the prompt in the form of, for example, an audible request (e.g., "state the rule"), a text message, a flashing light sequence, etc. Command receipt module 605 may operate to inform the user that the spoken command or other instruction from the user associated with setting a rule has been received. Command receipt module 605 may store, analyze, and/or evaluate the command or at least certain aspects of the command.
Term identification module 610 may operate to identify terms in the command received from the user. The term identification module 610 may identify certain of the terms as static terms and others of the terms as dynamic terms. In some examples, the command is provided at least in part by a menu selection or a written command. Regardless of the format of the command provided by the user, term identification module 610 may help identify key terms that are relevant for setting one or more rules. In at least one example, term identification module 610 may identify a trigger term used to initiate the rule setting mode, and may cooperate with rules mode module 505 related to initiating the rule setting mode. In other examples, term identification module 610 is operable to identify a different trigger word used to terminate a rule setting mode, to delete an existing rule, and/or to correct previously stated spoken commands.
Command confirmation module 615 may operate to provide confirmation back to the user that the user's command (whether spoken or in another format) has been received. Command confirmation module 615 may provide a responsive message to the user to confirm that the user's command is understood correctly. For example, command confirmation module 615 may provide a message to the user restating what the voice control module 115-b understood the user's command to say and/or mean (e.g., "you stated that the front door should be locked every time Bill arms the system. Is this correct?").
Rule confirmation module 620 may provide a response to the user that a rule has been generated and/or stored. For example, rule confirmation module 620 may provide a response to the user (e.g., spoken, text or otherwise) indicating that the rule has been recognized, saved, etc. Rule confirmation module 620 may provide an audible message to the user such as, for example, “the system will lock the door every time Bill arms the system,” or “your requested rule has been set and saved.” Voice control module 115 may cooperate with other devices such as, for example, remote device 310 to provide communication with the user. For example, any of the functions described with reference to the modules of voice control module 115-b may function at least in part in cooperation with remote device 310 or other features of the environments described with reference to FIGS. 1-4. Any of the modules described with reference to FIGS. 5-6 may be included in the remote voice control module 430 described with reference to FIG. 4. -
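The two confirmation responses described above can be sketched as simple message generators. The wording follows the examples quoted in the text; the function names are assumptions, not identifiers from the disclosure.

```python
def confirm_command(understood_command):
    # Command confirmation: restate the understood command and ask the
    # user whether it was understood correctly.
    return f"You stated that {understood_command}. Is this correct?"


def confirm_rule(rule_summary):
    # Rule confirmation: report that the requested rule has been set
    # and saved, restating the rule back to the user.
    return f"The system will {rule_summary}. Your requested rule has been set and saved."
```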
FIG. 7 is a flow diagram illustrating one embodiment of a method 700 for establishing operation rules in a home automation system. In some configurations, the method 700 may be implemented with any of the voice control modules 115 shown in FIGS. 1-6. In some examples, method 700 may be performed generally by device 105 or remote device 310 shown in FIGS. 1, 2, 3, and/or 4, or even more generally by the environments described with reference to FIGS. 1-4. - At
block 705, method 700 includes receiving a spoken command having a plurality of rule setting terms. Block 710 includes establishing at least one operation rule for the home automation system based on the spoken command. At block 715, the method 700 includes storing the at least one operation rule for later use by the automation and security system. -
Method 700 may also include initiating a rules mode for the home automation system prior to receiving a spoken command. Method 700 may include terminating the rules mode after establishing the at least one operation rule. Initiating the rules mode may include receiving a touch input at a control panel of the home automation system. Initiating the rules mode may include receiving a spoken trigger word or an audible sound. Method 700 may include delivering a request for clarification of a spoken command. The request for clarification may include an audible message. The request for clarification may include a displayed message visible on a control panel of the home automation system. Method 700 may include generating a message that includes a restatement of the spoken command and a request for confirmation of the correctness of the restatement of the spoken command. Method 700 may include generating a message to a user of the home automation system that includes a restatement of the at least one operation rule and requesting confirmation of the accuracy of the at least one operation rule. -
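The receive/establish/store sequence of method 700, including the request for clarification, can be sketched as below. This is a minimal sketch under the assumption that rules are phrased as "&lt;action&gt; when &lt;trigger&gt;"; the `Rule` and `RuleStore` names are illustrative, not part of the disclosure.

```python
from dataclasses import dataclass, field


@dataclass
class Rule:
    trigger: str
    action: str


@dataclass
class RuleStore:
    rules: list = field(default_factory=list)

    def establish_and_store(self, spoken_command):
        # Derive a rule from an "<action> when <trigger>" phrasing
        # (assumed grammar for this sketch).
        if " when " not in spoken_command:
            # Request for clarification (audible or displayed message).
            return "Please restate the rule as '<action> when <trigger>'."
        action, trigger = spoken_command.split(" when ", 1)
        rule = Rule(trigger=trigger.strip(), action=action.strip())
        # Store the rule for later use by the system.
        self.rules.append(rule)
        return f"The system will {rule.action} when {rule.trigger}. Rule saved."
```

A command such as "lock the front door when Bill arms the system" produces a stored rule and a spoken restatement; a command the sketch cannot parse produces a clarification request instead.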
FIG. 8 is a flow diagram illustrating one embodiment of a method 800 for establishing operation rules in a home automation system. In some configurations, the method 800 may be implemented by any one of the voice control modules 115 described with reference to FIGS. 1-6. In other examples, method 800 may be performed generally by device 105 or the remote device 310 described with reference to FIGS. 1-4, or even more generally by the environments described with reference to FIGS. 1-4. - At
block 805, method 800 includes initiating a rule setting mode for the home automation system. Block 810 includes receiving a spoken command having a plurality of rule setting terms. Block 815 includes generating at least one operation rule for the home automation system based on the spoken command. At block 820, method 800 includes generating a message that includes a restatement of the at least one operation rule. -
Method 800 may also include defining at least some of the plurality of rule setting terms at the time of installing the home automation system at a property. At least some of the plurality of rule setting terms may be generic to a plurality of different home automation systems associated with a plurality of different properties. The plurality of rule setting terms may include at least one static term that is generic to a plurality of different home automation systems, and at least one dynamic term that is uniquely defined for the home automation system. Method 800 may include storing the at least one operation rule for later use by the home automation system, wherein storing the at least one operation rule includes storing in a local database of the home automation system. The method 800 may include generating a message requesting confirmation of the spoken command. Method 800 may include receiving confirmation of the spoken command. -
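The static/dynamic vocabulary split described above can be sketched as a shared generic vocabulary plus per-property terms defined at installation. The class and vocabulary contents are assumptions for illustration only.

```python
# Static terms: shipped with every system, generic across properties
# (illustrative set, not vocabulary defined by the disclosure).
STATIC_VOCABULARY = {"lock", "unlock", "arm", "disarm", "turn", "on", "off"}


class Installation:
    """One home automation system installed at one property."""

    def __init__(self):
        self.dynamic_vocabulary = set()

    def define_dynamic_terms(self, *terms):
        # Dynamic terms are programmed no sooner than installation at the
        # property (e.g. occupant names, room names, device labels).
        self.dynamic_vocabulary.update(t.lower() for t in terms)

    def vocabulary(self):
        # The full recognizable vocabulary for this installation.
        return STATIC_VOCABULARY | self.dynamic_vocabulary
```

Two different properties would share `STATIC_VOCABULARY` but carry different dynamic vocabularies, which is why a rule mentioning "Bill" is meaningful only on the installation where that term was defined.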
FIG. 9 is a flow diagram illustrating one embodiment of a method 900 for establishing operation rules in a home automation system. In some configurations, the method 900 may be implemented by the voice control module 115 described with reference to FIGS. 1-6. In other examples, method 900 may be performed generally by the device 105 or remote device 310 shown with reference to FIGS. 1-4, or even more generally by the environments described with reference to FIGS. 1-4. - At
block 905, method 900 includes initiating a rules mode for the home automation system. Block 910 includes receiving a spoken command having a plurality of rule setting terms. At block 915, method 900 includes establishing at least one operation rule for the home automation system using the spoken commands. Block 920 includes storing the at least one operation rule. Block 925 includes operating at least one function of the home automation system based on the at least one stored operation rule. - The plurality of rule setting terms may include at least one static term that is preprogrammed into the home automation system prior to installation. The plurality of rule setting terms may include at least one dynamic term that is uniquely programmed into the home automation system no sooner than installation of the home automation system at a property.
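The final step of method 900, operating a system function based on a stored rule, can be sketched as a small event-driven rule engine. The event names, the `RuleEngine` class, and the door-lock action are assumptions for illustration, not elements of the disclosure.

```python
class RuleEngine:
    """Run stored actions when their trigger event is reported."""

    def __init__(self):
        self.rules = []   # list of (trigger_event, action callable)
        self.log = []     # record of actions performed

    def add_rule(self, trigger_event, action):
        # Store the operation rule for later use by the system.
        self.rules.append((trigger_event, action))

    def on_event(self, event):
        # Operate every stored function whose trigger matches the event.
        for trigger, action in self.rules:
            if trigger == event:
                self.log.append(action())


# Hypothetical usage mirroring the running example in the text:
engine = RuleEngine()
engine.add_rule("bill_armed_system", lambda: "front door locked")
engine.on_event("bill_armed_system")
```

After the event `"bill_armed_system"` is reported, the engine's log records that the door-lock action ran; unrelated events leave the log unchanged.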
-
FIG. 10 depicts a block diagram of a controller 1000 suitable for implementing the present systems and methods. The controller 1000 may be an example of the device 105 or remote device 310 illustrated in FIGS. 1, 2, 3 and/or 4. In one configuration, controller 1000 includes a bus 1005 which interconnects major subsystems of controller 1000, such as a central processor 1010, a system memory 1015 (typically RAM, but which may also include ROM, flash RAM, or the like), an input/output controller 1020, an external audio device, such as a speaker system 1025 via an audio output interface 1030, an external device, such as a display screen 1035 via display adapter 1040, an input device 1045 (e.g., remote control device interfaced with an input controller 1050), multiple USB devices 1065 (interfaced with a USB controller 1070), and a storage interface 1080. Also included are at least one sensor 1055 connected to bus 1005 through a sensor controller 1060 and a network interface 1085 (coupled directly to bus 1005). -
Bus 1005 allows data communication between central processor 1010 (e.g., processor 120) and system memory 1015, which may include read-only memory (ROM) or flash memory (neither shown), and random access memory (RAM) (not shown), as previously noted. Voice control module 115-c, which is one example of voice control module 115 shown in FIGS. 1-6, may be stored in system memory 1015. Any of the modules disclosed with reference to FIGS. 1-6 may be stored in system memory 1015. The RAM is generally the main memory into which the operating system and application programs are loaded. The ROM or flash memory can contain, among other code, the Basic Input-Output System (BIOS) which controls basic hardware operation such as the interaction with peripheral components or devices. Applications (e.g., application 405) resident with controller 1000 are generally stored on and accessed via a non-transitory computer readable medium, such as a hard disk drive (e.g., fixed disk 1075) or other storage medium. Additionally, applications can be in the form of electronic signals modulated in accordance with the application and data communication technology when accessed via network interface 1085. -
Storage interface 1080, as with the other storage interfaces of controller 1000, can connect to a standard computer readable medium for storage and/or retrieval of information, such as a fixed disk drive 1075. Fixed disk drive 1075 may be a part of controller 1000 or may be separate and accessed through other interface systems. Network interface 1085 may provide a direct connection to a remote server via a direct network link to the Internet via a POP (point of presence). Network interface 1085 may provide such connection using wireless techniques, including digital cellular telephone connection, Cellular Digital Packet Data (CDPD) connection, digital satellite data connection, or the like. In some embodiments, one or more sensors (e.g., motion sensor, smoke sensor, glass break sensor, door sensor, window sensor, carbon monoxide sensor, and the like) connect to controller 1000 wirelessly via network interface 1085. - Many other devices or subsystems (not shown) may be connected in a similar manner (e.g., entertainment system, computing device, remote cameras, wireless key fob, wall mounted user interface device, cell radio module, battery, alarm siren, door lock, lighting system, thermostat, home appliance monitor, utility equipment monitor, and so on). Conversely, all of the devices shown in
FIG. 10 need not be present to practice the present systems and methods. The devices and subsystems can be interconnected in different ways from that shown in FIG. 10. Aspects of some operations of a system such as that shown in FIG. 10 are readily known in the art and are not discussed in detail in this application. Code to implement the present disclosure can be stored in a non-transitory computer-readable medium such as one or more of system memory 1015 or fixed disk 1075. The operating system provided on controller 1000 may be iOS®, ANDROID®, MS-DOS®, MS-WINDOWS®, OS/2®, UNIX®, LINUX®, or another known operating system. - Moreover, regarding the signals described herein, those skilled in the art will recognize that a signal can be directly transmitted from a first block to a second block, or a signal can be modified (e.g., amplified, attenuated, delayed, latched, buffered, inverted, filtered, or otherwise modified) between the blocks. Although the signals of the above described embodiment are characterized as transmitted from one block to the next, other embodiments of the present systems and methods may include modified signals in place of such directly transmitted signals as long as the informational and/or functional aspect of the signal is transmitted between blocks. To some extent, a signal input at a second block can be conceptualized as a second signal derived from a first signal output from a first block due to physical limitations of the circuitry involved (e.g., there will inevitably be some attenuation and delay). Therefore, as used herein, a second signal derived from a first signal includes the first signal or any modifications to the first signal, whether due to circuit limitations or due to passage through other circuit elements which do not change the informational and/or final functional aspect of the first signal.
- While the foregoing disclosure sets forth various embodiments using specific block diagrams, flowcharts, and examples, each block diagram component, flowchart step, operation, and/or component described and/or illustrated herein may be implemented, individually and/or collectively, using a wide range of hardware, software, or firmware (or any combination thereof) configurations. In addition, any disclosure of components contained within other components should be considered exemplary in nature since many other architectures can be implemented to achieve the same functionality.
- The process parameters and sequence of steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
- Furthermore, while various embodiments have been described and/or illustrated herein in the context of fully functional computing systems, one or more of these exemplary embodiments may be distributed as a program product in a variety of forms, regardless of the particular type of computer-readable media used to actually carry out the distribution. The embodiments disclosed herein may also be implemented using software modules that perform certain tasks. These software modules may include script, batch, or other executable files that may be stored on a computer-readable storage medium or in a computing system. In some embodiments, these software modules may configure a computing system to perform one or more of the exemplary embodiments disclosed herein.
- The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the present systems and methods and their practical applications, to thereby enable others skilled in the art to best utilize the present systems and methods and various embodiments with various modifications as may be suited to the particular use contemplated.
- Unless otherwise noted, the terms “a” or “an,” as used in the specification and claims, are to be construed as meaning “at least one of.” In addition, for ease of use, the words “including” and “having,” as used in the specification and claims, are interchangeable with and have the same meaning as the word “comprising.” In addition, the term “based on” as used in the specification and the claims is to be construed as meaning “based at least upon.”
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/847,014 US10554432B2 (en) | 2014-05-07 | 2017-12-19 | Home automation via voice control |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/272,181 US9860076B2 (en) | 2014-05-07 | 2014-05-07 | Home automation via voice control |
US15/847,014 US10554432B2 (en) | 2014-05-07 | 2017-12-19 | Home automation via voice control |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/272,181 Continuation US9860076B2 (en) | 2014-05-07 | 2014-05-07 | Home automation via voice control |
Publications (2)
Publication Number | Publication Date |
---|---|
US20180176031A1 true US20180176031A1 (en) | 2018-06-21 |
US10554432B2 US10554432B2 (en) | 2020-02-04 |
Family
ID=54368132
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/272,181 Active 2036-03-29 US9860076B2 (en) | 2014-05-07 | 2014-05-07 | Home automation via voice control |
US15/847,014 Active 2034-08-21 US10554432B2 (en) | 2014-05-07 | 2017-12-19 | Home automation via voice control |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/272,181 Active 2036-03-29 US9860076B2 (en) | 2014-05-07 | 2014-05-07 | Home automation via voice control |
Country Status (4)
Country | Link |
---|---|
US (2) | US9860076B2 (en) |
EP (1) | EP3140727A4 (en) |
CA (1) | CA2946910A1 (en) |
WO (1) | WO2015171878A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10901380B2 (en) * | 2019-06-04 | 2021-01-26 | Lg Electronics Inc. | Apparatus and method for controlling operation of home appliance, home appliance and method for operating of home appliance |
Families Citing this family (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10481860B2 (en) * | 2009-02-02 | 2019-11-19 | Gregory Walker Johnson | Solar tablet verbal |
US20130201316A1 (en) | 2012-01-09 | 2013-08-08 | May Patents Ltd. | System and method for server based control |
US9860076B2 (en) * | 2014-05-07 | 2018-01-02 | Vivint, Inc. | Home automation via voice control |
GB2532244B (en) * | 2014-11-13 | 2021-03-24 | Honeywell Technologies Sarl | A commissioning tool and a method of commissioning a building management system |
US9985796B2 (en) | 2014-12-19 | 2018-05-29 | Smartlabs, Inc. | Smart sensor adaptive configuration systems and methods using cloud data |
US11489690B2 (en) | 2014-12-19 | 2022-11-01 | Smartlabs, Inc. | System communication utilizing path between neighboring networks |
US20160182246A1 (en) * | 2014-12-19 | 2016-06-23 | Smartlabs, Inc. | Smart home device adaptive configuration systems and methods using network data |
WO2016104824A1 (en) * | 2014-12-23 | 2016-06-30 | 엘지전자 주식회사 | Portable device and control method therefor |
CN104808499B (en) * | 2015-03-09 | 2019-01-15 | 联想(北京)有限公司 | A kind of method and control device based on linkage rule control smart home device |
CN104749962B (en) * | 2015-03-09 | 2017-06-27 | 联想(北京)有限公司 | The method and control device of a kind of control smart home based on linkage rule |
US9984686B1 (en) | 2015-03-17 | 2018-05-29 | Amazon Technologies, Inc. | Mapping device capabilities to a predefined set |
US10655951B1 (en) | 2015-06-25 | 2020-05-19 | Amazon Technologies, Inc. | Determining relative positions of user devices |
US10365620B1 (en) | 2015-06-30 | 2019-07-30 | Amazon Technologies, Inc. | Interoperability of secondary-device hubs |
US10896671B1 (en) * | 2015-08-21 | 2021-01-19 | Soundhound, Inc. | User-defined extensions of the command input recognized by a virtual assistant |
US10089070B1 (en) * | 2015-09-09 | 2018-10-02 | Cisco Technology, Inc. | Voice activated network interface |
US10018977B2 (en) * | 2015-10-05 | 2018-07-10 | Savant Systems, Llc | History-based key phrase suggestions for voice control of a home automation system |
US9679453B2 (en) * | 2015-10-20 | 2017-06-13 | Vivint, Inc. | System and methods for correlating sound events to security and/or automation system operations |
US9769420B1 (en) * | 2016-03-18 | 2017-09-19 | Thomas Lawrence Moses | Portable wireless remote monitoring and control systems |
US10163437B1 (en) * | 2016-06-02 | 2018-12-25 | Amazon Technologies, Inc. | Training models using voice tags |
TWI586051B (en) * | 2016-10-26 | 2017-06-01 | 勝德國際研發股份有限公司 | Socket device with hanging type |
US10832564B2 (en) | 2017-05-01 | 2020-11-10 | Johnson Controls Technology Company | Building security system with event data analysis for generating false alarm rules for false alarm reduction |
EP3622784B1 (en) * | 2017-05-08 | 2020-11-11 | Signify Holding B.V. | Voice control |
JP6962158B2 (en) * | 2017-12-01 | 2021-11-05 | ヤマハ株式会社 | Equipment control system, equipment control method, and program |
US11237796B2 (en) * | 2018-05-07 | 2022-02-01 | Google Llc | Methods, systems, and apparatus for providing composite graphical assistant interfaces for controlling connected devices |
US10916121B2 (en) * | 2018-05-21 | 2021-02-09 | Johnson Controls Technology Company | Virtual maintenance manager |
EP3803854B1 (en) | 2018-05-25 | 2022-11-23 | Carrier Corporation | Controlling home automation devices through security panel using voice assistant |
US11385635B2 (en) * | 2018-07-02 | 2022-07-12 | Disney Enterprises, Inc. | Autonomous drone play and directional alignment |
US11402812B1 (en) | 2019-03-22 | 2022-08-02 | The Chamberlain Group Llc | Apparatus and method for controlling a device |
KR20200126509A (en) * | 2019-04-30 | 2020-11-09 | 삼성전자주식회사 | Home appliance and method for controlling thereof |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020052746A1 (en) * | 1996-12-31 | 2002-05-02 | News Datacom Limited Corporation | Voice activated communication system and program guide |
US20040010410A1 (en) * | 2002-07-11 | 2004-01-15 | Samsung Electronics Co., Ltd. | System and method for processing voice command |
US20120025348A1 (en) * | 2010-07-27 | 2012-02-02 | Stmicroelectronics (Grenoble) Sas | Semiconductor device comprising a passive component of capacitors and process for fabrication |
US20130183944A1 (en) * | 2012-01-12 | 2013-07-18 | Sensory, Incorporated | Information Access and Device Control Using Mobile Phones and Audio in the Home Environment |
US8654936B1 (en) * | 2004-02-24 | 2014-02-18 | At&T Intellectual Property I, L.P. | Home control, monitoring and communication system using remote voice commands |
US20140156281A1 (en) * | 2012-12-03 | 2014-06-05 | Qualcomm Incorporated | Voice-controlled configuration of an automation system |
US20140309789A1 (en) * | 2013-04-15 | 2014-10-16 | Flextronics Ap, Llc | Vehicle Location-Based Home Automation Triggers |
US20140330560A1 (en) * | 2013-05-06 | 2014-11-06 | Honeywell International Inc. | User authentication of voice controlled devices |
US20150162007A1 (en) * | 2013-12-06 | 2015-06-11 | Vivint, Inc. | Voice control using multi-media rooms |
US20150162006A1 (en) * | 2013-12-11 | 2015-06-11 | Echostar Technologies L.L.C. | Voice-recognition home automation system for speaker-dependent commands |
US20150348548A1 (en) * | 2014-05-30 | 2015-12-03 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US20160314782A1 (en) * | 2015-04-21 | 2016-10-27 | Google Inc. | Customizing speech-recognition dictionaries in a smart-home environment |
US9860076B2 (en) * | 2014-05-07 | 2018-01-02 | Vivint, Inc. | Home automation via voice control |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0911808B1 (en) * | 1997-10-23 | 2002-05-08 | Sony International (Europe) GmbH | Speech interface in a home network environment |
WO1999034339A2 (en) * | 1997-12-29 | 1999-07-08 | Ameritech Corporation | System and method for home automation and security |
US6249575B1 (en) | 1998-12-11 | 2001-06-19 | Securelogix Corporation | Telephony security system |
EP1057302A1 (en) | 1998-12-29 | 2000-12-06 | Koninklijke Philips Electronics N.V. | Home control system with distributed network devices |
US6408272B1 (en) | 1999-04-12 | 2002-06-18 | General Magic, Inc. | Distributed voice user interface |
WO2002087152A1 (en) | 2001-04-18 | 2002-10-31 | Caveo Technology, Llc | Universal, customizable security system for computers and other devices |
KR100434545B1 (en) * | 2002-03-15 | 2004-06-05 | 삼성전자주식회사 | Method and apparatus for controlling devices connected with home network |
US6792323B2 (en) * | 2002-06-27 | 2004-09-14 | Openpeak Inc. | Method, system, and computer program product for managing controlled residential or non-residential environments |
US7594112B2 (en) | 2003-10-10 | 2009-09-22 | Bea Systems, Inc. | Delegated administration for a distributed security system |
US20110062888A1 (en) | 2004-12-01 | 2011-03-17 | Bondy Montgomery C | Energy saving extra-low voltage dimmer and security lighting system wherein fixture control is local to the illuminated area |
US20060181401A1 (en) | 2005-01-31 | 2006-08-17 | Honeywell International, Inc | Vacation mode security system and method |
US9614964B2 (en) | 2005-08-19 | 2017-04-04 | Nextstep, Inc. | Consumer electronic registration, control and support concierge device and method |
US8042048B2 (en) | 2005-11-17 | 2011-10-18 | Att Knowledge Ventures, L.P. | System and method for home automation |
US7552467B2 (en) | 2006-04-24 | 2009-06-23 | Jeffrey Dean Lindsay | Security systems for protecting an asset |
US7696873B2 (en) | 2006-09-12 | 2010-04-13 | Tyco Safety Products Canada Ltd. | Method and apparatus for automatically disarming a security system |
US8538757B2 (en) * | 2007-05-17 | 2013-09-17 | Redstart Systems, Inc. | System and method of a list commands utility for a speech recognition command system |
WO2008144638A2 (en) * | 2007-05-17 | 2008-11-27 | Redstart Systems Inc. | Systems and methods of a structured grammar for a speech recognition command system |
US8165886B1 (en) | 2007-10-04 | 2012-04-24 | Great Northern Research LLC | Speech interface system and method for control and interaction with applications on a computing system |
US8526909B2 (en) | 2008-03-27 | 2013-09-03 | At&T Mobility Ii Llc | Systems and methods for activating a security system upon receipt of emergency alert messages |
US20100127854A1 (en) | 2008-11-21 | 2010-05-27 | Richard Eric Helvick | Method and system for controlling home appliances based on estimated time of arrival |
US9311917B2 (en) * | 2009-01-21 | 2016-04-12 | International Business Machines Corporation | Machine, system and method for user-guided teaching of deictic references and referent objects of deictic references to a conversational command and control system |
US8301270B2 (en) * | 2009-01-27 | 2012-10-30 | Eldon Technology Ltd. | Systems and methods for facilitating home automation |
FR2956757B1 (en) * | 2010-02-25 | 2012-09-21 | Somfy Sas | ASSIGNING SCENARIOS TO CONTROL BUTTONS. |
KR101828273B1 (en) * | 2011-01-04 | 2018-02-14 | 삼성전자주식회사 | Apparatus and method for voice command recognition based on combination of dialog models |
FR2972062B1 (en) * | 2011-02-28 | 2013-04-12 | Somfy Sas | CONTROL DEVICE COMPRISING AN INTERFACE CAPABLE OF PROVIDING THE NEXT CONTROL ORDER TO BE TRANSMITTED TO DOMOTIC EQUIPMENT |
US9368107B2 (en) * | 2011-04-20 | 2016-06-14 | Nuance Communications, Inc. | Permitting automated speech command discovery via manual event to command mapping |
KR20130027665A (en) * | 2011-09-08 | 2013-03-18 | 삼성전자주식회사 | Device and method for controlling home network service in wireless terminal |
US8340975B1 (en) | 2011-10-04 | 2012-12-25 | Theodore Alfred Rosenberger | Interactive speech recognition device and system for hands-free building control |
KR20130133629A (en) * | 2012-05-29 | 2013-12-09 | 삼성전자주식회사 | Method and apparatus for executing voice command in electronic device |
US10145579B2 (en) * | 2013-05-01 | 2018-12-04 | Honeywell International Inc. | Devices and methods for interacting with a control system that is connected to a network |
-
2014
- 2014-05-07 US US14/272,181 patent/US9860076B2/en active Active
-
2015
- 2015-05-07 WO PCT/US2015/029663 patent/WO2015171878A1/en active Application Filing
- 2015-05-07 CA CA2946910A patent/CA2946910A1/en not_active Abandoned
- 2015-05-07 EP EP15789155.7A patent/EP3140727A4/en not_active Withdrawn
-
2017
- 2017-12-19 US US15/847,014 patent/US10554432B2/en active Active
Patent Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020052746A1 (en) * | 1996-12-31 | 2002-05-02 | News Datacom Limited Corporation | Voice activated communication system and program guide |
US6654721B2 (en) * | 1996-12-31 | 2003-11-25 | News Datacom Limited | Voice activated communication system and program guide |
US20040010410A1 (en) * | 2002-07-11 | 2004-01-15 | Samsung Electronics Co., Ltd. | System and method for processing voice command |
US8654936B1 (en) * | 2004-02-24 | 2014-02-18 | At&T Intellectual Property I, L.P. | Home control, monitoring and communication system using remote voice commands |
US20120025348A1 (en) * | 2010-07-27 | 2012-02-02 | Stmicroelectronics (Grenoble) Sas | Semiconductor device comprising a passive component of capacitors and process for fabrication |
US20130183944A1 (en) * | 2012-01-12 | 2013-07-18 | Sensory, Incorporated | Information Access and Device Control Using Mobile Phones and Audio in the Home Environment |
US20140156281A1 (en) * | 2012-12-03 | 2014-06-05 | Qualcomm Incorporated | Voice-controlled configuration of an automation system |
US20140309789A1 (en) * | 2013-04-15 | 2014-10-16 | Flextronics Ap, Llc | Vehicle Location-Based Home Automation Triggers |
US20140330560A1 (en) * | 2013-05-06 | 2014-11-06 | Honeywell International Inc. | User authentication of voice controlled devices |
US20150162007A1 (en) * | 2013-12-06 | 2015-06-11 | Vivint, Inc. | Voice control using multi-media rooms |
US20150162006A1 (en) * | 2013-12-11 | 2015-06-11 | Echostar Technologies L.L.C. | Voice-recognition home automation system for speaker-dependent commands |
US9860076B2 (en) * | 2014-05-07 | 2018-01-02 | Vivint, Inc. | Home automation via voice control |
US20150348548A1 (en) * | 2014-05-30 | 2015-12-03 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US9715875B2 (en) * | 2014-05-30 | 2017-07-25 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US20180012596A1 (en) * | 2014-05-30 | 2018-01-11 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US20160314782A1 (en) * | 2015-04-21 | 2016-10-27 | Google Inc. | Customizing speech-recognition dictionaries in a smart-home environment |
US10079012B2 (en) * | 2015-04-21 | 2018-09-18 | Google Llc | Customizing speech-recognition dictionaries in a smart-home environment |
Also Published As
Publication number | Publication date |
---|---|
CA2946910A1 (en) | 2015-11-12 |
WO2015171878A1 (en) | 2015-11-12 |
US9860076B2 (en) | 2018-01-02 |
EP3140727A4 (en) | 2017-10-04 |
US10554432B2 (en) | 2020-02-04 |
US20150324706A1 (en) | 2015-11-12 |
EP3140727A1 (en) | 2017-03-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10554432B2 (en) | Home automation via voice control | |
US10768784B2 (en) | Systems and methods for rules-based automations and notifications | |
US11635737B1 (en) | Determining occupancy with user provided information | |
US10362441B1 (en) | Communications based on geo location information | |
US20190005751A1 (en) | Mobile device based authentication | |
US10432419B1 (en) | Voice control using multi-media rooms | |
US10455271B1 (en) | Voice control component installation | |
US20150088282A1 (en) | Touch-less swipe control | |
US20150160935A1 (en) | Managing device configuration information | |
US11029655B2 (en) | Progressive profiling in an automation system | |
US9686092B2 (en) | Remote talk down to panel, camera and speaker | |
US20160065910A1 (en) | Video gateway device | |
US11361652B1 (en) | Voice annunciated reminders and alerts |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: VIVINT, INC., UTAH Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WARREN, JEREMY B.;REEL/FRAME:044437/0087 Effective date: 20140429 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: BANK OF AMERICA N.A., NORTH CAROLINA Free format text: SUPPL. NO. 2 SECURITY AGREEMENT;ASSIGNOR:VIVINT, INC.;REEL/FRAME:047024/0048 Effective date: 20180906 Owner name: BANK OF AMERICA, N.A., NORTH CAROLINA Free format text: SECURITY AGREEMENT;ASSIGNOR:VIVINT, INC.;REEL/FRAME:047029/0304 Effective date: 20180906 |
|
AS | Assignment |
Owner name: WILMINGTON TRUST, NATIONAL ASSOCIATION, DELAWARE Free format text: SECURITY AGREEMENT;ASSIGNOR:VIVINT, INC.;REEL/FRAME:049283/0566 Effective date: 20190510 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: VIVINT, INC., UTAH Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:056832/0824 Effective date: 20210709 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |