WO2016187964A1 - Method and apparatus for intelligently controlling a controlled device - Google Patents

Method and apparatus for intelligently controlling a controlled device (智能控制受控设备的方法和装置)

Info

Publication number
WO2016187964A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
controlled device
scene
recognition result
information
Prior art date
Application number
PCT/CN2015/086878
Other languages
English (en)
French (fr)
Inventor
王伟杰
Original Assignee
深圳创维-Rgb电子有限公司
Priority date
Filing date
Publication date
Application filed by 深圳创维-Rgb电子有限公司
Priority to US15/319,383 (published as US20170139470A1)
Priority to AU2015396131A (published as AU2015396131A1)
Publication of WO2016187964A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/04Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0861Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/12Applying verification of the received information
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/25Pc structure of the system
    • G05B2219/25314Modular structure, modules
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification

Definitions

  • the present invention relates to the field of intelligent control technologies, and in particular, to a method and apparatus for intelligently controlling a controlled device.
  • the commonly used control method is to install, on a mobile terminal, application software for controlling the controlled device; the user inputs a control command through the application software, the mobile terminal sends the control command to the control center, the control center converts the instruction into control commands of various code values, and the converted control command is sent to the designated controlled device through a transmitting module to implement control of the controlled device.
  • this method, however, requires the user to carry the mobile terminal and to input the corresponding instruction on it, which is cumbersome to operate.
  • a relatively intelligent method at present is to control the controlled device by means of voice recognition: the user speaks a command phrase to a voice receiving terminal, the terminal performs voice recognition on the phrase, converts it into a control command and sends it to the control center, and the control center controls the controlled device according to the control command.
  • the voice control mode, however, is greatly affected by factors such as ambient noise and regional differences in user pronunciation, so there is a large error and the accuracy of the control commands is low.
  • the main object of the present invention is to provide a method and apparatus for intelligently controlling a controlled device that make intelligent control convenient and control commands highly accurate.
  • the invention provides a method for intelligently controlling a controlled device, comprising the steps of:
  • the current scene is monitored, and when the user arrives in the current scene, the scene information of the current scene is obtained, and the current scene is identified according to the scene information, and the scene recognition result is obtained;
  • the step of controlling the controlled device that matches the scene recognition result and the action recognition result according to the scene recognition result and the action recognition result comprises:
  • before the step of sending the control command to the matched controlled device, the method further includes: acquiring the current state of the matched controlled device; determining whether the current state of the matched controlled device is already consistent with the state corresponding to the control command; and, when the current state is already consistent with the state corresponding to the control command, not sending the control command.
  • the step of transmitting the control command to the matched controlled device is performed when the current state does not coincide with the corresponding state of the control command.
  • before the step of collecting the action information of the user, the method further includes: acquiring identity information of the user and determining whether the user is a legitimate user; when the user is a legitimate user, performing the step of collecting the action information of the user;
  • when the user is an illegal user, the alarm system is activated.
  • the step of obtaining the identity information of the user and determining whether the user is a legitimate user includes: extracting a portrait feature of the user from the scene information, identifying the portrait feature, and determining whether it is consistent with the pre-stored portrait feature of a legitimate user.
  • the scene information is a scene photo or video captured by a camera, and the action information of the user is an action video of the user captured by the camera.
  • the invention also provides an apparatus for intelligently controlling a controlled device, comprising:
  • the scene analysis module is configured to monitor the current scene, obtain the scene information of the current scene, and identify the current scene according to the scene information, and obtain the scene recognition result;
  • the action analysis module is configured to collect action information of the user, identify the action information, and obtain a motion recognition result;
  • a control module, configured to control, according to the scene recognition result and the motion recognition result, a controlled device that matches the scene recognition result and the motion recognition result.
  • the control module is further configured to search for a matching control instruction and a matched controlled device according to the scene recognition result and the motion recognition result, and to send the control instruction to the matched controlled device.
  • the device for intelligently controlling the controlled device further includes a state analysis module, configured to acquire the current state of the matched controlled device and determine whether the current state of the matched controlled device is already consistent with the state corresponding to the control command;
  • the control module is further configured to: when the current state is already consistent with the state corresponding to the control command, not send the control command; when the current state is not consistent with the state corresponding to the control command, send the control command to the matched controlled device.
  • the device for intelligently controlling the controlled device further includes an identity recognition module and an alarm module;
  • the identity recognition module is configured to acquire identity information of the user, and determine whether the user is a legitimate user;
  • the action analysis module is further configured to: when the user is a legitimate user, collect action information of the user;
  • the alarm module is configured to activate an alarm system when the user is an illegal user.
  • the identity recognition module is further configured to: extract a portrait feature of the user from the scene information, identify the portrait feature, and determine whether it is consistent with the pre-stored portrait feature of a legitimate user.
  • the scene information is a scene photo or video captured by a camera, and the action information of the user is an action video of the user captured by the camera.
  • the invention controls the corresponding controlled device by identifying the current scene and the user's action, realizing intelligent control of the controlled device; the user does not need to perform special actions or gestures, nor carry a mobile terminal for sending instructions, and control of the controlled device is more intelligent and more convenient.
  • FIG. 1 is a flow chart of a first embodiment of a method for intelligently controlling a controlled device according to the present invention
  • FIG. 2 is a flow chart of a second embodiment of a method for intelligently controlling a controlled device according to the present invention
  • FIG. 3 is a flowchart of a third embodiment of a method for intelligently controlling a controlled device according to the present invention.
  • FIG. 4 is a flowchart of a fourth embodiment of a method for intelligently controlling a controlled device according to the present invention.
  • FIG. 5 is a flowchart of a fifth embodiment of a method for intelligently controlling a controlled device according to the present invention.
  • FIG. 6 is a schematic block diagram of a first embodiment of an apparatus for intelligently controlling a controlled device according to the present invention.
  • FIG. 7 is a schematic block diagram of a second embodiment of an apparatus for intelligently controlling a controlled device according to the present invention.
  • FIG. 8 is a schematic block diagram of a third embodiment of an apparatus for intelligently controlling a controlled device according to the present invention.
  • FIG. 1 is a flowchart of a first embodiment of a method for intelligently controlling a controlled device according to the present invention.
  • the method for intelligently controlling a controlled device mentioned in this embodiment includes the following steps:
  • Step S10: the current scene is monitored, and when the user arrives in the current scene, the scene information of the current scene is acquired, the current scene is identified according to the scene information, and the scene recognition result is obtained;
  • the embodiment is mainly used in an intelligent control system, and can be applied in an environment such as home, office, security, etc., and intelligently controls devices such as living appliances, office equipment, and security systems.
  • the collection, analysis, recognition, and matching of data, as well as the sending of commands, can all be completed by the intelligent control system.
  • taking a home environment as an example, an image or video capture device (i.e., a camera) is installed in each scene in the home, such as the doorway, living room, dining room, and bedroom, and each scene is monitored by its camera.
  • when the user enters a scene, for example walks into the living room, and the camera installed in the living room detects a person in the current scene, the camera acquires scene information of the current scene, such as taking a photo of the current scene or recording a video of the current scene.
  • the camera transmits the obtained scene information to the intelligent control system, which performs recognition on the scene information, including image processing and visual recognition, obtains the scene recognition result, and recognizes that the current scene is the living room.
  • the identified scene can also be a more detailed scene; for example, if the user walks into the living room and up to the sofa, the scene recognition result can be “living room + sofa”.
  • Step S20: collecting action information of the user, identifying the action information, and obtaining a motion recognition result;
  • while acquiring the scene information of the current scene, the camera also tracks and records a video of the user's motion as the collected user motion information.
  • the camera transmits the user motion information to the intelligent control system.
  • the intelligent control system uses image processing, visual recognition, pattern recognition, and other techniques to identify the user's motion information and obtain the user's specific action; for example, if the user sits down on the living room sofa, the “sit down” motion recognition result is obtained.
  • Step S30: controlling, according to the scene recognition result and the motion recognition result, the controlled device that matches the scene recognition result and the motion recognition result.
  • the intelligent control system associates, in advance, the information of the current scene with the information of the controlled devices installed or placed in that scene, that is, which controlled devices are placed or installed in the current scene, so that after the current scene is identified, the corresponding controlled devices can be found and intelligently controlled.
  • for example, the living room is equipped with a chandelier, television set A, a cabinet air conditioner, etc., and the bedroom is equipped with a desk lamp, television set B, a wall-mounted air conditioner, and the like.
  • after obtaining the scene recognition result and the motion recognition result, the intelligent control system can directly send the recognition results to the matched controlled device, and the controlled device searches for or generates a corresponding control instruction according to the recognition results and executes it, or responds to the received recognition results, for example by returning its current state information; the intelligent control system may also generate a corresponding control instruction after obtaining the scene recognition result and the motion recognition result and send the control instruction to the controlled device, and the controlled device performs the corresponding operation according to the control instruction.
  • in this embodiment, by identifying the current scene and the user's action and controlling the corresponding controlled device, intelligent control of the controlled device is achieved; the user does not need to perform special actions or gestures, nor carry a mobile terminal for sending instructions, and control of the controlled device is more intelligent and more convenient.
  • FIG. 2 is a flowchart of a second embodiment of a method for intelligently controlling a controlled device according to the present invention. This embodiment includes all the steps in the embodiment shown in FIG. 1, wherein step S30 includes:
  • Step S31: searching for a matching control instruction and a matched controlled device according to the scene recognition result and the action recognition result;
  • in this embodiment, the scene recognition results, the motion recognition results, and the control instructions are stored correspondingly in an instruction-set database in advance, and the intelligent control system can query the matching instructions and controlled devices from the preset instruction-set database according to the scene recognition result and the motion recognition result obtained through recognition.
  • the same action has different meanings in different scenes and corresponds to different matching instructions, which makes the generated control commands more accurate.
  • for example, the recognition result of “living room + sitting down” is associated with the instruction “turn on the living room chandelier”, and the recognition result of “dining room + sitting down” is associated with the instruction “turn on the dining room light”.
  • in addition, a pair consisting of a scene recognition result and a motion recognition result may correspond to one or more control commands; for example, the “living room + sitting down” recognition result is associated with the instructions “turn on the chandelier”, “turn on television A”, and “turn on the cabinet air conditioner”.
  • Step S32: sending the control command to the matched controlled device.
  • after finding the matching control commands and controlled devices, the intelligent control system sends each control command to the corresponding controlled device and controls the controlled device to perform the corresponding operation.
  • for example, the “turn on the chandelier” command is sent to the living room chandelier to make it light up; the “turn on television A” command is sent to television set A to turn it on; and the “turn on the cabinet air conditioner” command is sent to the cabinet air conditioner to start it.
  • the control commands can be sent via Wi-Fi, infrared, Bluetooth, Z-Wave, ZigBee, and the like.
  • in this embodiment, by identifying the current scene and the user's action, matching control commands and controlled devices are found and used to control the controlled devices to perform the corresponding operations, realizing intelligent control of the controlled device with more accurate control commands.
  • FIG. 3 is a flowchart of a third embodiment of a method for intelligently controlling a controlled device according to the present invention. This embodiment includes all the steps in the embodiment shown in FIG. 2, and before step S32, the method further includes:
  • Step S41: obtaining the current state of the matched controlled device;
  • Step S42: determining whether the current state of the matched controlled device is already consistent with the state corresponding to the control command; if yes, step S43 is performed; if not, step S32 is performed;
  • Step S43: the control command is not sent.
  • in this embodiment, in order to avoid sending invalid commands to the controlled device and to improve the efficiency of intelligent control, the intelligent control system also determines the current operating state of the controlled device before sending a control command to it.
  • the intelligent control system can obtain the current state of the controlled device via Wi-Fi, infrared, Bluetooth, Z-Wave, ZigBee, and the like, or the controlled device actively sends its current state to the intelligent control system through the above transmission means after each execution of an operation instruction.
  • the intelligent control system can build a state record table according to the state arrival time to facilitate queries, or record only the most recently sent state information, which helps save information storage space.
  • when the current state of the controlled device is already consistent with the state corresponding to the control command, for example when the current state of television set A is the off state and the control command is “turn off television A”, the state corresponding to the control command is the same as the current state of television set A; if the intelligent control system still sends the “turn off television A” command to television set A, the television will not respond to it. Therefore, to streamline the processing flow, the intelligent control system does not need to send the control command, which improves the efficiency of intelligent control and avoids wasting resources.
  • FIG. 4 is a flowchart of a fourth embodiment of a method for intelligently controlling a controlled device according to the present invention. This embodiment includes all the steps in the embodiment shown in FIG. 1. Before step S20, the method further includes:
  • Step S51: obtaining the identity information of the user and determining whether the user is a legitimate user; if yes, step S20 is performed; if not, step S52 is performed;
  • Step S52: the alarm system is activated.
  • before the user's action is recognized, the identity of the user also needs to be identified to determine whether the current user is a legitimate user. If the current user is not a preset legitimate user, it may be that an illegal user has broken into the current scene. For example, in an office environment, when a legitimate user enters the office and sits down at the seat, the intelligent control system controls the office computer to start up; but if an illegal user breaks in, then in order to prevent the illegal user from stealing data from the computer, the intelligent control system identifies the user's identity and, on determining that the current user is an illegal user, activates the alarm system, which helps improve the security of the intelligent control.
  • the identification method may use a user password or fingerprint information entered before entering the environment.
  • for example, a password or fingerprint entry device is set at the door of the house or at the door of the office.
  • a legitimate user will input the correct password or fingerprint information, whereas the fingerprint or password information entered by an illegal user is incorrect, or no password or fingerprint is entered at all.
  • the password or fingerprint entry device sends the received information to the intelligent control system for identification, ensuring the security of the intelligent control and the environment.
  • FIG. 5 is a flowchart of a fifth embodiment of a method for intelligently controlling a controlled device according to the present invention. This embodiment includes all the steps in the embodiment shown in FIG. 4, wherein step S51 includes:
  • Step S511: extracting a portrait feature of the user from the scene information;
  • Step S512: identifying the portrait feature of the user and determining whether it is consistent with the pre-stored portrait feature of a legitimate user; if yes, step S513 is performed; if not, step S514 is performed;
  • Step S513: it is determined that the user is a legitimate user, and step S20 is performed;
  • Step S514: it is determined that the user is an illegal user, and step S52 is performed.
  • This embodiment adopts an image recognition method to identify a user identity.
  • after the user enters the scene, the scene information acquired by the camera includes the user's portrait.
  • the intelligent control system can extract the user's portrait features from the scene information, which may include the user's facial features, body shape, iris features, and the like, and may also include the user's walking posture, sitting posture, habitual movements, and the like; the intelligent control system identifies the user's identity through one or more of these features.
  • when the acquired features are consistent with the pre-stored features of a legitimate user, the identity verification is valid and the corresponding intelligent control operation is performed.
  • when the features are inconsistent, the identity verification is invalid and the alarm system is activated, which helps improve the security of the intelligent control.
  • FIG. 6 is a schematic block diagram of a first embodiment of an apparatus for intelligently controlling a controlled device according to the present invention.
  • the device for intelligently controlling the controlled device mentioned in this embodiment includes:
  • the scene analysis module 10 is configured to monitor the current scene, obtain the scene information of the current scene when the user arrives in the current scene, and identify the current scene according to the scene information, and obtain the scene recognition result;
  • the action analysis module 20 is configured to collect action information of the user, identify motion information, and obtain a motion recognition result;
  • the control module 30 is configured to control the controlled device that matches the scene recognition result and the motion recognition result according to the scene recognition result and the motion recognition result.
  • the device for intelligently controlling the controlled device in the embodiment is mainly used in an intelligent control system, and can be applied in an environment of home, office, security, etc., and intelligently controls devices such as living appliances, office equipment, and security systems.
  • the collection, analysis, recognition, and matching of data, as well as the sending of commands, can all be performed by the apparatus for intelligently controlling the controlled device in the intelligent control system.
  • taking a home environment as an example, an image or video capture device (i.e., a camera) is installed in each scene in the home, such as the doorway, living room, dining room, and bedroom, and each scene is monitored by its camera.
  • when the user enters a scene, for example walks into the living room, and the camera installed in the living room detects a person in the current scene, the camera acquires scene information of the current scene, such as taking a photo of the current scene or recording a video of the current scene.
  • the camera transmits the obtained scene information to the intelligent control system, which performs recognition on the scene information, including image processing and visual recognition, obtains the scene recognition result, and recognizes that the current scene is the living room.
  • the identified scene can also be a more detailed scene; for example, if the user walks into the living room and up to the sofa, the scene recognition result can be “living room + sofa”.
  • while acquiring the scene information of the current scene, the camera also tracks and records a video of the user's motion as the collected user motion information.
  • the camera transmits the user motion information to the intelligent control system.
  • the intelligent control system uses image processing, visual recognition, pattern recognition, and other techniques to identify the user's motion information and obtain the user's specific action; for example, if the user sits down on the living room sofa, the “sit down” motion recognition result is obtained.
  • the intelligent control system associates, in advance, the information of the current scene with the information of the controlled devices installed or placed in that scene, that is, which controlled devices are placed or installed in the current scene, so that after the current scene is identified, the corresponding controlled devices can be found and intelligently controlled.
  • for example, the living room is equipped with a chandelier, television set A, a cabinet air conditioner, etc., and the bedroom is equipped with a desk lamp, television set B, a wall-mounted air conditioner, and the like.
  • after obtaining the scene recognition result and the motion recognition result, the intelligent control system can directly send the recognition results to the matched controlled device, and the controlled device searches for or generates a corresponding control instruction according to the recognition results and executes it, or responds to the received recognition results, for example by returning its current state information; the intelligent control system may also generate a corresponding control instruction after obtaining the scene recognition result and the motion recognition result and send the control instruction to the controlled device, and the controlled device performs the corresponding operation according to the control instruction.
  • in this embodiment, by identifying the current scene and the user's action and controlling the corresponding controlled device, intelligent control of the controlled device is achieved; the user does not need to perform special actions or gestures, nor carry a mobile terminal for sending instructions, and control of the controlled device is more intelligent and more convenient.
  • the control module 30 is further configured to: search for a matching control instruction and a matched controlled device according to the scene recognition result and the motion recognition result, and send the control instruction to the matched controlled device.
  • in this embodiment, the scene recognition results, the motion recognition results, and the control instructions are stored correspondingly in an instruction-set database in advance, and the intelligent control system can query the matching instructions and controlled devices from the preset instruction-set database according to the scene recognition result and the motion recognition result obtained through recognition.
  • the same action has different meanings in different scenes and corresponds to different matching instructions, which makes the generated control commands more accurate.
  • for example, the recognition result of “living room + sitting down” is associated with the instruction “turn on the living room chandelier”, and the recognition result of “dining room + sitting down” is associated with the instruction “turn on the dining room light”.
  • in addition, a pair consisting of a scene recognition result and a motion recognition result may correspond to one or more control commands; for example, the “living room + sitting down” recognition result is associated with the instructions “turn on the chandelier”, “turn on television A”, and “turn on the cabinet air conditioner”.
  • after finding the matching control commands and controlled devices, the intelligent control system sends each control command to the corresponding controlled device and controls the controlled device to perform the corresponding operation.
  • for example, the “turn on the chandelier” command is sent to the living room chandelier to make it light up; the “turn on television A” command is sent to television set A to turn it on; and the “turn on the cabinet air conditioner” command is sent to the cabinet air conditioner to start it.
  • the control commands can be sent via Wi-Fi, infrared, Bluetooth, Z-Wave, ZigBee, and the like.
  • in this embodiment, by identifying the current scene and the user's action, matching control commands and controlled devices are found and used to control the controlled devices to perform the corresponding operations, realizing intelligent control of the controlled device with more accurate control commands.
  • FIG. 7 is a schematic block diagram of a second embodiment of an apparatus for intelligently controlling a controlled device according to the present invention.
  • This embodiment includes all of the modules in the embodiment shown in FIG. 6, and a state analysis module 40 is also added.
  • the state analysis module 40 is configured to obtain a current state of the matched controlled device, and determine whether the current state of the matched controlled device is consistent with the corresponding state of the control instruction.
  • the control module 30 is further configured to: when the current state is already consistent with the state corresponding to the control command, not send the control command; when the current state is inconsistent with the state corresponding to the control command, send the control command to the matched controlled device.
  • in this embodiment, in order to avoid sending invalid commands to the controlled device and to improve the efficiency of intelligent control, the intelligent control system also determines the current operating state of the controlled device before sending a control command to it.
  • the intelligent control system can obtain the current state of the controlled device via Wi-Fi, infrared, Bluetooth, Z-Wave, ZigBee, and the like, or the controlled device actively sends its current state to the intelligent control system through the above transmission means after each execution of an operation instruction.
  • the intelligent control system can build a state record table according to the state arrival time to facilitate queries, or record only the most recently sent state information, which helps save information storage space.
  • when the current state of the controlled device is already consistent with the state corresponding to the control command, for example when the current state of television set A is the off state and the control command is “turn off television A”, the state corresponding to the control command is the same as the current state of television set A; if the intelligent control system still sends the “turn off television A” command to television set A, the television will not respond to it. Therefore, to streamline the processing flow, the intelligent control system does not need to send the control command, which improves the efficiency of intelligent control and avoids wasting resources.
  • FIG. 8 is a schematic block diagram of a third embodiment of an apparatus for intelligently controlling a controlled device according to the present invention.
  • This embodiment includes all of the modules in the embodiment shown in FIG. 6, and an identity recognition module 50 and an alarm module 60 are also added.
  • the identity recognition module 50 is configured to obtain identity information of the user, and determine whether the user is a legitimate user.
  • the action analysis module 20 is further configured to collect action information of the user when the user is a legitimate user;
  • the alarm module 60 is configured to activate the alarm system when the user is an illegal user.
  • before the user's action is recognized, the identity of the user also needs to be identified to determine whether the current user is a legitimate user. If the current user is not a preset legitimate user, it may be that an illegal user has broken into the current scene. For example, in an office environment, when a legitimate user enters the office and sits down at the seat, the intelligent control system controls the office computer to start up; but if an illegal user breaks in, then in order to prevent the illegal user from stealing data from the computer, the intelligent control system identifies the user's identity and, on determining that the current user is an illegal user, activates the alarm system, which helps improve the security of the intelligent control.
  • the identification method may use a user password or fingerprint information entered before entering the environment.
  • for example, a password or fingerprint entry device is set at the door of the house or at the door of the office.
  • a legitimate user will input the correct password or fingerprint information, whereas the fingerprint or password information entered by an illegal user is incorrect, or no password or fingerprint is entered at all.
  • the password or fingerprint entry device sends the received information to the intelligent control system for identification, ensuring the security of the intelligent control and the environment.
  • the identity recognition module 50 is further configured to: extract the user's portrait feature from the scene information, identify the portrait feature, and determine whether it is consistent with the pre-stored portrait feature of a legitimate user; when they are consistent, determine that the user is a legitimate user; when they are inconsistent, determine that the user is an illegal user.
  • This embodiment adopts an image recognition method to identify a user identity.
  • after the user enters the scene, the scene information acquired by the camera includes the user's portrait.
  • the intelligent control system can extract the user's portrait features from the scene information, which may include the user's facial features, body shape, iris features, and the like, and may also include the user's walking posture, sitting posture, habitual movements, and the like; the intelligent control system identifies the user's identity through one or more of these features.
  • when the acquired features are consistent with the pre-stored features of a legitimate user, the identity verification is valid and the corresponding intelligent control operation is performed.
  • when the features are inconsistent, the identity verification is invalid and the alarm system is activated, which helps improve the security of the intelligent control.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Automation & Control Theory (AREA)
  • Selective Calling Equipment (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for intelligently controlling a controlled device, comprising the steps of: monitoring the current scene and, when a user arrives in the current scene, obtaining scene information of the current scene and identifying the current scene according to the scene information to obtain a scene recognition result (S10); collecting action information of the user and identifying the action information to obtain an action recognition result (S20); and controlling, according to the scene recognition result and the action recognition result, a controlled device that matches the scene recognition result and the action recognition result (S30). Also provided is an apparatus for intelligently controlling a controlled device. By identifying the current scene and the user's action and controlling the corresponding controlled device, intelligent control of the controlled device is achieved; the user does not need to perform special actions or gestures, nor carry a mobile terminal for sending instructions, so control of the controlled device is more intelligent and more convenient.

Description

Method and apparatus for intelligently controlling a controlled device
Technical Field
The present invention relates to the field of intelligent control technologies, and in particular to a method and apparatus for intelligently controlling a controlled device.
Background
With the development of the Internet of Things, the ways of controlling controlled devices through intelligent control systems have become increasingly varied. A commonly used control method is to install, on a mobile terminal, application software for controlling the controlled device; the user inputs a control instruction through the application software, the mobile terminal sends the control instruction to a control center, the control center converts the instruction into control instructions of various code values, and the converted control instructions are sent to the designated controlled device through a transmitting module, thereby controlling the controlled device. However, this method requires the user to carry the mobile terminal and to input the corresponding instruction on it, which is cumbersome to operate. In addition, a relatively more intelligent method at present is to control the controlled device through voice recognition: the user speaks a command phrase to a voice receiving terminal, the terminal performs voice recognition on the phrase, converts it into a control instruction and sends it to the control center, and the control center controls the controlled device according to the control instruction. However, voice control is strongly affected by factors such as ambient noise and regional differences in user pronunciation, so there is considerable error and the accuracy of the control instructions is low.
Summary of the Invention
The main object of the present invention is to provide a method and apparatus for intelligently controlling a controlled device that make intelligent control convenient to operate and control instructions highly accurate.
The present invention provides a method for intelligently controlling a controlled device, comprising the steps of:
monitoring the current scene and, when a user arrives in the current scene, obtaining scene information of the current scene and identifying the current scene according to the scene information to obtain a scene recognition result;
collecting action information of the user and identifying the action information to obtain an action recognition result;
controlling, according to the scene recognition result and the action recognition result, a controlled device that matches the scene recognition result and the action recognition result.
Preferably, the step of controlling, according to the scene recognition result and the action recognition result, a controlled device that matches the scene recognition result and the action recognition result comprises:
searching for a matching control instruction and a matched controlled device according to the scene recognition result and the action recognition result;
sending the control instruction to the matched controlled device.
Preferably, before the step of sending the control instruction to the matched controlled device, the method further comprises:
obtaining the current state of the matched controlled device;
determining whether the current state of the matched controlled device is already consistent with the state corresponding to the control instruction;
when the current state is already consistent with the state corresponding to the control instruction, not sending the control instruction;
when the current state is not consistent with the state corresponding to the control instruction, performing the step of sending the control instruction to the matched controlled device.
Preferably, before the step of collecting the action information of the user, the method further comprises:
obtaining identity information of the user and determining whether the user is a legitimate user;
when the user is a legitimate user, performing the step of collecting the action information of the user;
when the user is an illegal user, activating an alarm system.
Preferably, the step of obtaining identity information of the user and determining whether the user is a legitimate user comprises:
extracting a portrait feature of the user from the scene information;
identifying the portrait feature of the user and determining whether it is consistent with the pre-stored portrait feature of a legitimate user;
when they are consistent, determining that the user is a legitimate user;
when they are inconsistent, determining that the user is an illegal user.
Preferably, the scene information is a scene photo or video captured by a camera, and the action information of the user is an action video of the user captured by the camera.
The present invention further provides an apparatus for intelligently controlling a controlled device, comprising:
a scene analysis module, configured to monitor the current scene and, when a user arrives in the current scene, obtain scene information of the current scene and identify the current scene according to the scene information to obtain a scene recognition result;
an action analysis module, configured to collect action information of the user and identify the action information to obtain an action recognition result;
a control module, configured to control, according to the scene recognition result and the action recognition result, a controlled device that matches the scene recognition result and the action recognition result.
Preferably, the control module is further configured to:
search for a matching control instruction and a matched controlled device according to the scene recognition result and the action recognition result;
send the control instruction to the matched controlled device.
Preferably, the apparatus for intelligently controlling a controlled device further comprises a state analysis module, configured to obtain the current state of the matched controlled device and determine whether the current state of the matched controlled device is already consistent with the state corresponding to the control instruction;
the control module is further configured to: when the current state is already consistent with the state corresponding to the control instruction, not send the control instruction; when the current state is not consistent with the state corresponding to the control instruction, send the control instruction to the matched controlled device.
Preferably, the apparatus for intelligently controlling a controlled device further comprises an identity recognition module and an alarm module;
the identity recognition module is configured to obtain identity information of the user and determine whether the user is a legitimate user;
the action analysis module is further configured to collect the action information of the user when the user is a legitimate user;
the alarm module is configured to activate an alarm system when the user is an illegal user.
Preferably, the identity recognition module is further configured to:
extract a portrait feature of the user from the scene information;
identify the portrait feature of the user and determine whether it is consistent with the pre-stored portrait feature of a legitimate user;
when they are consistent, determine that the user is a legitimate user;
when they are inconsistent, determine that the user is an illegal user.
Preferably, the scene information is a scene photo or video captured by a camera, and the action information of the user is an action video of the user captured by the camera.
By identifying the current scene and the user's action and controlling the corresponding controlled device, the present invention achieves intelligent control of the controlled device; the user does not need to perform special actions or gestures, nor carry a mobile terminal for sending instructions, so control of the controlled device is more intelligent and more convenient.
Brief Description of the Drawings
FIG. 1 is a flowchart of a first embodiment of the method for intelligently controlling a controlled device according to the present invention;
FIG. 2 is a flowchart of a second embodiment of the method for intelligently controlling a controlled device according to the present invention;
FIG. 3 is a flowchart of a third embodiment of the method for intelligently controlling a controlled device according to the present invention;
FIG. 4 is a flowchart of a fourth embodiment of the method for intelligently controlling a controlled device according to the present invention;
FIG. 5 is a flowchart of a fifth embodiment of the method for intelligently controlling a controlled device according to the present invention;
FIG. 6 is a schematic block diagram of a first embodiment of the apparatus for intelligently controlling a controlled device according to the present invention;
FIG. 7 is a schematic block diagram of a second embodiment of the apparatus for intelligently controlling a controlled device according to the present invention;
FIG. 8 is a schematic block diagram of a third embodiment of the apparatus for intelligently controlling a controlled device according to the present invention.
The realization of the objects, functional features, and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Detailed Description of the Embodiments
It should be understood that the specific embodiments described herein are only intended to explain the present invention and are not intended to limit it.
As shown in FIG. 1, FIG. 1 is a flowchart of a first embodiment of the method for intelligently controlling a controlled device according to the present invention. The method for intelligently controlling a controlled device in this embodiment comprises the steps of:
Step S10: monitoring the current scene and, when a user arrives in the current scene, obtaining scene information of the current scene and identifying the current scene according to the scene information to obtain a scene recognition result;
This embodiment is mainly used in an intelligent control system and can be applied in environments such as the home, the office, and security, to intelligently control devices such as household appliances, office equipment, and security systems. The collection, analysis, recognition, and matching of data, as well as the sending of instructions, can all be completed by the intelligent control system. Taking a home environment as an example, image or video capture devices (i.e., cameras) are installed in the various scenes in the home, such as the doorway, living room, dining room, and bedroom, and each scene is monitored by its camera. When the user enters a scene, for example walks into the living room, and the camera installed in the living room detects a person in the current scene, the camera obtains scene information of the current scene, for example by taking a photo or recording a video of the current scene. The camera transmits the obtained scene information to the intelligent control system, which performs recognition on the scene information, including image processing and visual recognition, obtains the scene recognition result, and recognizes that the current scene is the living room. The identified scene can also be more detailed; for example, if the user walks into the living room and up to the sofa, the scene recognition result can be “living room + sofa”.
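The description leaves the recognition algorithm open (“image processing and visual recognition”). A minimal Python sketch of the scene-recognition step follows, assuming each room has been summarized offline by a small reference feature vector and that the captured frame is reduced to the same kind of vector; the toy vectors, the cosine-similarity matching, and the function names are illustrative assumptions rather than the patented method.

```python
# Toy sketch of scene recognition (step S10): match the captured frame's
# feature vector against per-scene reference vectors and return the best fit.
from math import sqrt

# Hypothetical reference features for known scenes, e.g. averaged over
# previously captured photos of each room.
SCENE_TEMPLATES = {
    "living room": [0.9, 0.1, 0.3],
    "dining room": [0.2, 0.8, 0.4],
    "bedroom":     [0.1, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recognize_scene(frame_features):
    """Return the scene label whose template best matches the captured frame."""
    return max(SCENE_TEMPLATES, key=lambda s: cosine(frame_features, SCENE_TEMPLATES[s]))

if __name__ == "__main__":
    captured = [0.85, 0.15, 0.35]        # features of the photo taken on user arrival
    print(recognize_scene(captured))     # -> "living room"
```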
Step S20: collecting action information of the user and identifying the action information to obtain an action recognition result;
While obtaining the scene information of the current scene, the camera also tracks and records a video of the user's actions as the collected user action information. The camera transmits the user action information to the intelligent control system, which uses image processing, visual recognition, pattern recognition, and other techniques to identify the user's action information and obtain the user's specific action; for example, if the user sits down on the living room sofa, the “sit down” action recognition result is obtained.
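A toy sketch of the action-recognition step, assuming the tracked video has already been reduced to a per-frame estimate of the user's torso height; this simplification and the threshold are assumptions, since the description only names image processing, visual recognition, and pattern recognition in general:

```python
# Toy sketch of action recognition (step S20): label a tracked height
# sequence as "sit down" when the height drops sharply and stays low.
def recognize_action(torso_heights, drop_ratio=0.6):
    """Return 'sit down', 'stand up' or 'unknown' for a sequence of heights (metres)."""
    if len(torso_heights) < 2:
        return "unknown"
    start, end = torso_heights[0], torso_heights[-1]
    if end < start * drop_ratio:
        return "sit down"
    if start < end * drop_ratio:
        return "stand up"
    return "unknown"

if __name__ == "__main__":
    print(recognize_action([1.70, 1.65, 1.30, 0.95, 0.92]))  # -> "sit down"
```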
Step S30: controlling, according to the scene recognition result and the action recognition result, a controlled device that matches the scene recognition result and the action recognition result.
The intelligent control system stores in advance an association between the information of the current scene and the information of the controlled devices installed or placed in that scene, that is, which controlled devices are placed or installed in the current scene, so that after the current scene is identified, the corresponding controlled devices can be found for intelligent control. For example, the living room is equipped with a chandelier, television set A, and a cabinet air conditioner, and the bedroom is equipped with a desk lamp, television set B, and a wall-mounted air conditioner. After obtaining the scene recognition result and the action recognition result, the intelligent control system may directly send the recognition results to the matched controlled device, and the controlled device searches for or generates the corresponding control instruction according to the recognition results and executes it, or responds to the received recognition results, for example by returning its current state information; alternatively, the intelligent control system may generate the corresponding control instruction after obtaining the scene recognition result and the action recognition result and send the control instruction to the controlled device, which performs the corresponding operation according to the control instruction.
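A minimal sketch of the scene-to-device association stored in advance, so that a scene recognition result can be resolved to the controlled devices installed in that scene; the device names follow the example above, while the dictionary layout and function name are assumptions:

```python
# Sketch of the pre-stored association between scenes and the controlled
# devices installed or placed in them.
DEVICES_BY_SCENE = {
    "living room": ["living room chandelier", "television A", "cabinet air conditioner"],
    "bedroom":     ["desk lamp", "television B", "wall-mounted air conditioner"],
}

def devices_in(scene):
    """Return the controlled devices registered for the recognized scene."""
    return DEVICES_BY_SCENE.get(scene, [])

if __name__ == "__main__":
    print(devices_in("living room"))
```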
In this embodiment, by identifying the current scene and the user's action and controlling the corresponding controlled device, intelligent control of the controlled device is achieved; the user does not need to perform special actions or gestures, nor carry a mobile terminal for sending instructions, so control of the controlled device is more intelligent and more convenient.
As shown in FIG. 2, FIG. 2 is a flowchart of a second embodiment of the method for intelligently controlling a controlled device according to the present invention. This embodiment includes all the steps of the embodiment shown in FIG. 1, wherein step S30 comprises:
Step S31: searching for a matching control instruction and a matched controlled device according to the scene recognition result and the action recognition result;
In this embodiment, scene recognition results, action recognition results, and control instructions are stored correspondingly in an instruction-set database in advance. According to the scene recognition result and the action recognition result obtained through recognition, the intelligent control system can query the matching instructions and controlled devices from the preset instruction-set database. The same action has different meanings in different scenes and corresponds to different matching instructions, which makes the generated control instructions more accurate. For example, the recognition result “living room + sitting down” is associated with the instruction “turn on the living room chandelier”, and the recognition result “dining room + sitting down” is associated with the instruction “turn on the dining room light”. In addition, a pair consisting of a scene recognition result and an action recognition result may correspond to one or more control instructions; for example, the recognition result “living room + sitting down” is associated with the instructions “turn on the chandelier”, “turn on television A”, and “turn on the cabinet air conditioner”.
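A sketch of the preset instruction-set database described above, assuming a simple in-memory mapping from a (scene, action) pair to one or more (device, command) pairs; the patent does not specify a storage format, so the data structure is an assumption while the table contents mirror the examples in this paragraph:

```python
# Sketch of the instruction-set lookup (step S31): the same action maps to
# different instructions in different scenes, and one pair may map to several
# (device, command) entries.
INSTRUCTION_SET = {
    ("living room", "sit down"): [
        ("living room chandelier", "turn on"),
        ("television A", "turn on"),
        ("cabinet air conditioner", "turn on"),
    ],
    ("dining room", "sit down"): [
        ("dining room light", "turn on"),
    ],
}

def match(scene_result, action_result):
    """Return the matching (device, command) pairs for the recognition results."""
    return INSTRUCTION_SET.get((scene_result, action_result), [])

if __name__ == "__main__":
    for device, command in match("living room", "sit down"):
        print(f"{command} -> {device}")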
Step S32: sending the control instruction to the matched controlled device.
After finding the matching control instructions and controlled devices, the intelligent control system sends each control instruction to the corresponding controlled device and controls the controlled device to perform the corresponding operation. For example, the instruction “turn on the chandelier” is sent to the living room chandelier to make it light up; the instruction “turn on television A” is sent to television set A to turn it on; and the instruction “turn on the cabinet air conditioner” is sent to the cabinet air conditioner to start it. The control instructions can be sent via Wi-Fi, infrared, Bluetooth, Z-Wave, ZigBee, and the like.
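A sketch of how the found instructions could be dispatched to each matched device over that device's own transport (Wi-Fi, infrared, Bluetooth, Z-Wave, ZigBee, and so on); the transport classes here only print and stand in for real radio or network stacks, and the device-to-transport mapping is an assumption:

```python
# Sketch of command dispatch (step S32): each device is reached through
# whichever transport it supports.
class PrintTransport:
    """Stand-in transport that just prints; a real one would wrap a radio or network stack."""
    def __init__(self, name):
        self.name = name
    def send(self, device, command):
        print(f"[{self.name}] {command} -> {device}")

TRANSPORT_FOR_DEVICE = {
    "living room chandelier": PrintTransport("ZigBee"),
    "television A": PrintTransport("infrared"),
    "cabinet air conditioner": PrintTransport("Wi-Fi"),
}

def dispatch(matches):
    """Send every (device, command) pair through the device's own transport."""
    for device, command in matches:
        TRANSPORT_FOR_DEVICE[device].send(device, command)

if __name__ == "__main__":
    dispatch([("television A", "turn on"), ("living room chandelier", "turn on")])
```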
In this embodiment, by identifying the current scene and the user's action, matching control instructions and controlled devices are found, and the found control instructions are used to control the controlled devices to perform the corresponding operations, achieving intelligent control of the controlled devices with more accurate control instructions.
As shown in FIG. 3, FIG. 3 is a flowchart of a third embodiment of the method for intelligently controlling a controlled device according to the present invention. This embodiment includes all the steps of the embodiment shown in FIG. 2, and before step S32 further comprises:
Step S41: obtaining the current state of the matched controlled device;
Step S42: determining whether the current state of the matched controlled device is already consistent with the state corresponding to the control instruction; if yes, performing step S43; if no, performing step S32;
Step S43: not sending the control instruction.
In this embodiment, in order to avoid sending invalid instructions to the controlled device and to improve the efficiency of intelligent control, the intelligent control system also determines the current operating state of the controlled device before sending a control instruction to it. The intelligent control system can obtain the current state of the controlled device via Wi-Fi, infrared, Bluetooth, Z-Wave, ZigBee, and the like, or the controlled device can actively send its current state to the intelligent control system through the above transmission means after each execution of an operation instruction. The intelligent control system can build a state record table by state arrival time to facilitate queries, or record only the most recently sent state information, which helps save information storage space. When the current state of the controlled device is already consistent with the state corresponding to the control instruction, for example when the current state of television set A is off and the control instruction is “turn off television A”, the state corresponding to the control instruction is the same as the current state of television set A; if the intelligent control system still sends the “turn off television A” instruction to television set A, the television will not respond to it. Therefore, to streamline the processing flow of the intelligent control system, the system does not need to send the control instruction, which improves the efficiency of intelligent control and avoids wasting resources.
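A minimal sketch of this state check: an instruction whose target state matches the device's last reported state is skipped. Keeping only the most recently reported state in an in-memory dictionary is one of the two bookkeeping options mentioned above; the function names are assumptions:

```python
# Sketch of the state check (steps S41-S43): skip commands that would be no-ops.
LAST_REPORTED_STATE = {"television A": "off"}   # updated by devices after each operation

def should_send(device, target_state):
    """Return False when the device already reports the state the command would set."""
    return LAST_REPORTED_STATE.get(device) != target_state

if __name__ == "__main__":
    print(should_send("television A", "off"))   # False -> do not resend "turn off"
    print(should_send("television A", "on"))    # True  -> send "turn on"
```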
As shown in FIG. 4, FIG. 4 is a flowchart of a fourth embodiment of the method for intelligently controlling a controlled device according to the present invention. This embodiment includes all the steps of the embodiment shown in FIG. 1, and before step S20 further comprises:
Step S51: obtaining identity information of the user and determining whether the user is a legitimate user; if yes, performing step S20; if no, performing step S52;
Step S52: activating the alarm system.
In this embodiment, before the user's action is recognized, the user's identity also needs to be identified to determine whether the current user is a legitimate user. If the current user is not a preset legitimate user, an illegal user may have broken into the current scene. For example, in an office environment, when a legitimate user enters the office and sits down at the seat, the intelligent control system controls the office computer to start up; but if an illegal user breaks in, then in order to prevent the illegal user from stealing data from the computer, the intelligent control system identifies the user's identity and, on determining that the current user is an illegal user, activates the alarm system, which helps improve the security of the intelligent control. The identification method may use a user password or fingerprint information entered before entering the environment. For example, before entering the home or the office, a password or fingerprint entry device is installed at the front door of the home or the office; a legitimate user will enter the correct password or fingerprint information, whereas the fingerprint or password information entered by an illegal user will be incorrect, or no password or fingerprint will be entered at all. The password or fingerprint entry device sends the received information to the intelligent control system for identification, ensuring the security of the intelligent control and the environment.
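A sketch of this identity gate, assuming the entry device hands the intelligent control system a password (or a fingerprint digest) that is compared against pre-enrolled credentials; the hashing choice and credential store are illustrative assumptions:

```python
# Sketch of the identity check (steps S51/S52): verify the credential captured
# at the entrance before action recognition runs, otherwise raise the alarm.
import hashlib

LEGITIMATE_CREDENTIALS = {hashlib.sha256(b"1234").hexdigest()}   # pre-enrolled users

def is_legitimate(entered_secret: bytes) -> bool:
    """Compare the hash of the entered password/fingerprint data with enrolled ones."""
    return hashlib.sha256(entered_secret).hexdigest() in LEGITIMATE_CREDENTIALS

def on_user_arrival(entered_secret: bytes):
    if is_legitimate(entered_secret):
        print("identity valid: proceed to collect action information (step S20)")
    else:
        print("identity invalid: activate alarm system (step S52)")

if __name__ == "__main__":
    on_user_arrival(b"1234")
    on_user_arrival(b"0000")
```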
As shown in FIG. 5, FIG. 5 is a flowchart of a fifth embodiment of the method for intelligently controlling a controlled device according to the present invention. This embodiment includes all the steps of the embodiment shown in FIG. 4, wherein step S51 comprises:
Step S511: extracting a portrait feature of the user from the scene information;
Step S512: identifying the portrait feature of the user and determining whether it is consistent with the pre-stored portrait feature of a legitimate user; if yes, performing step S513; if no, performing step S514;
Step S513: determining that the user is a legitimate user, and performing step S20;
Step S514: determining that the user is an illegal user, and performing step S52.
This embodiment uses image recognition to identify the user's identity. After the user enters the scene, the scene information obtained by the camera includes the user's portrait. The intelligent control system can extract the user's portrait features from the scene information, which may include the user's facial features, body shape, iris features, and so on, and may also include the user's walking posture, sitting posture, habitual movements, and the like. The intelligent control system identifies the user through one or more of these features: when the obtained features are consistent with the pre-stored features of a legitimate user, the identity verification is valid and the corresponding intelligent control operation is performed; when the features are inconsistent, the identity verification is invalid and the alarm system is activated, which helps improve the security of the intelligent control.
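A toy sketch of the portrait-feature comparison, assuming the extracted features are numeric vectors matched against the pre-stored features of legitimate users by Euclidean distance with a fixed threshold; the metric and threshold are assumptions, since the description does not specify them:

```python
# Sketch of portrait-based identity verification (steps S511-S514).
from math import dist

ENROLLED_FEATURES = {           # hypothetical pre-stored features of legitimate users
    "alice": [0.12, 0.80, 0.33, 0.45],
    "bob":   [0.70, 0.22, 0.10, 0.90],
}
THRESHOLD = 0.15

def verify(portrait_feature):
    """Return the matching user name, or None if no enrolled feature is close enough."""
    name, best = min(ENROLLED_FEATURES.items(), key=lambda kv: dist(portrait_feature, kv[1]))
    return name if dist(portrait_feature, best) <= THRESHOLD else None

if __name__ == "__main__":
    print(verify([0.11, 0.79, 0.35, 0.44]))   # -> "alice": proceed with control
    print(verify([0.50, 0.50, 0.50, 0.50]))   # -> None: activate the alarm
```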
As shown in FIG. 6, FIG. 6 is a schematic block diagram of a first embodiment of the apparatus for intelligently controlling a controlled device according to the present invention. The apparatus for intelligently controlling a controlled device in this embodiment comprises:
a scene analysis module 10, configured to monitor the current scene and, when a user arrives in the current scene, obtain scene information of the current scene and identify the current scene according to the scene information to obtain a scene recognition result;
an action analysis module 20, configured to collect action information of the user and identify the action information to obtain an action recognition result;
a control module 30, configured to control, according to the scene recognition result and the action recognition result, a controlled device that matches the scene recognition result and the action recognition result.
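A structural sketch of how these three modules could be wired together is given below; the class and method names are assumptions, since the patent defines the modules' responsibilities rather than a concrete API.

```python
# Structural sketch of the FIG. 6 modules and their data flow.
class SceneAnalysisModule:
    def recognize(self, scene_info):
        return "living room"            # placeholder for image-based scene recognition

class ActionAnalysisModule:
    def recognize(self, action_info):
        return "sit down"               # placeholder for action recognition

class ControlModule:
    def __init__(self, instruction_set):
        self.instruction_set = instruction_set
    def control(self, scene_result, action_result):
        for device, command in self.instruction_set.get((scene_result, action_result), []):
            print(f"send '{command}' to {device}")

if __name__ == "__main__":
    control = ControlModule({("living room", "sit down"): [("living room chandelier", "turn on")]})
    control.control(SceneAnalysisModule().recognize(None), ActionAnalysisModule().recognize(None))
```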
The apparatus for intelligently controlling a controlled device in this embodiment is mainly used in an intelligent control system and can be applied in environments such as the home, the office, and security, to intelligently control devices such as household appliances, office equipment, and security systems. The collection, analysis, recognition, and matching of data, as well as the sending of instructions, can all be completed by the apparatus for intelligently controlling a controlled device in the intelligent control system. Taking a home environment as an example, image or video capture devices (i.e., cameras) are installed in the various scenes in the home, such as the doorway, living room, dining room, and bedroom, and each scene is monitored by its camera. When the user enters a scene, for example walks into the living room, and the camera installed in the living room detects a person in the current scene, the camera obtains scene information of the current scene, for example by taking a photo or recording a video of the current scene. The camera transmits the obtained scene information to the intelligent control system, which performs recognition on the scene information, including image processing and visual recognition, obtains the scene recognition result, and recognizes that the current scene is the living room. The identified scene can also be more detailed; for example, if the user walks into the living room and up to the sofa, the scene recognition result can be “living room + sofa”.
While obtaining the scene information of the current scene, the camera also tracks and records a video of the user's actions as the collected user action information. The camera transmits the user action information to the intelligent control system, which uses image processing, visual recognition, pattern recognition, and other techniques to identify the user's action information and obtain the user's specific action; for example, if the user sits down on the living room sofa, the “sit down” action recognition result is obtained.
The intelligent control system stores in advance an association between the information of the current scene and the information of the controlled devices installed or placed in that scene, that is, which controlled devices are placed or installed in the current scene, so that after the current scene is identified, the corresponding controlled devices can be found for intelligent control. For example, the living room is equipped with a chandelier, television set A, and a cabinet air conditioner, and the bedroom is equipped with a desk lamp, television set B, and a wall-mounted air conditioner. After obtaining the scene recognition result and the action recognition result, the intelligent control system may directly send the recognition results to the matched controlled device, and the controlled device searches for or generates the corresponding control instruction according to the recognition results and executes it, or responds to the received recognition results, for example by returning its current state information; alternatively, the intelligent control system may generate the corresponding control instruction after obtaining the scene recognition result and the action recognition result and send the control instruction to the controlled device, which performs the corresponding operation according to the control instruction.
In this embodiment, by identifying the current scene and the user's action and controlling the corresponding controlled device, intelligent control of the controlled device is achieved; the user does not need to perform special actions or gestures, nor carry a mobile terminal for sending instructions, so control of the controlled device is more intelligent and more convenient.
Further, the control module 30 is further configured to:
search for a matching control instruction and a matched controlled device according to the scene recognition result and the action recognition result;
send the control instruction to the matched controlled device.
In this embodiment, scene recognition results, action recognition results, and control instructions are stored correspondingly in an instruction-set database in advance. According to the scene recognition result and the action recognition result obtained through recognition, the intelligent control system can query the matching instructions and controlled devices from the preset instruction-set database. The same action has different meanings in different scenes and corresponds to different matching instructions, which makes the generated control instructions more accurate. For example, the recognition result “living room + sitting down” is associated with the instruction “turn on the living room chandelier”, and the recognition result “dining room + sitting down” is associated with the instruction “turn on the dining room light”. In addition, a pair consisting of a scene recognition result and an action recognition result may correspond to one or more control instructions; for example, the recognition result “living room + sitting down” is associated with the instructions “turn on the chandelier”, “turn on television A”, and “turn on the cabinet air conditioner”.
After finding the matching control instructions and controlled devices, the intelligent control system sends each control instruction to the corresponding controlled device and controls the controlled device to perform the corresponding operation. For example, the instruction “turn on the chandelier” is sent to the living room chandelier to make it light up; the instruction “turn on television A” is sent to television set A to turn it on; and the instruction “turn on the cabinet air conditioner” is sent to the cabinet air conditioner to start it. The control instructions can be sent via Wi-Fi, infrared, Bluetooth, Z-Wave, ZigBee, and the like.
In this embodiment, by identifying the current scene and the user's action, matching control instructions and controlled devices are found, and the found control instructions are used to control the controlled devices to perform the corresponding operations, achieving intelligent control of the controlled devices with more accurate control instructions.
As shown in FIG. 7, FIG. 7 is a schematic block diagram of a second embodiment of the apparatus for intelligently controlling a controlled device according to the present invention. This embodiment includes all the modules of the embodiment shown in FIG. 6, and further adds a state analysis module 40.
The state analysis module 40 is configured to obtain the current state of the matched controlled device and determine whether the current state of the matched controlled device is already consistent with the state corresponding to the control instruction;
the control module 30 is further configured to: when the current state is already consistent with the state corresponding to the control instruction, not send the control instruction; when the current state is not consistent with the state corresponding to the control instruction, send the control instruction to the matched controlled device.
In this embodiment, in order to avoid sending invalid instructions to the controlled device and to improve the efficiency of intelligent control, the intelligent control system also determines the current operating state of the controlled device before sending a control instruction to it. The intelligent control system can obtain the current state of the controlled device via Wi-Fi, infrared, Bluetooth, Z-Wave, ZigBee, and the like, or the controlled device can actively send its current state to the intelligent control system through the above transmission means after each execution of an operation instruction. The intelligent control system can build a state record table by state arrival time to facilitate queries, or record only the most recently sent state information, which helps save information storage space. When the current state of the controlled device is already consistent with the state corresponding to the control instruction, for example when the current state of television set A is off and the control instruction is “turn off television A”, the state corresponding to the control instruction is the same as the current state of television set A; if the intelligent control system still sends the “turn off television A” instruction to television set A, the television will not respond to it. Therefore, to streamline the processing flow of the intelligent control system, the system does not need to send the control instruction, which improves the efficiency of intelligent control and avoids wasting resources.
As shown in FIG. 8, FIG. 8 is a schematic block diagram of a third embodiment of the apparatus for intelligently controlling a controlled device according to the present invention. This embodiment includes all the modules of the embodiment shown in FIG. 6, and further adds an identity recognition module 50 and an alarm module 60.
The identity recognition module 50 is configured to obtain identity information of the user and determine whether the user is a legitimate user;
the action analysis module 20 is further configured to collect the action information of the user when the user is a legitimate user;
the alarm module 60 is configured to activate the alarm system when the user is an illegal user.
In this embodiment, before the user's action is recognized, the user's identity also needs to be identified to determine whether the current user is a legitimate user. If the current user is not a preset legitimate user, an illegal user may have broken into the current scene. For example, in an office environment, when a legitimate user enters the office and sits down at the seat, the intelligent control system controls the office computer to start up; but if an illegal user breaks in, then in order to prevent the illegal user from stealing data from the computer, the intelligent control system identifies the user's identity and, on determining that the current user is an illegal user, activates the alarm system, which helps improve the security of the intelligent control. The identification method may use a user password or fingerprint information entered before entering the environment. For example, before entering the home or the office, a password or fingerprint entry device is installed at the front door of the home or the office; a legitimate user will enter the correct password or fingerprint information, whereas the fingerprint or password information entered by an illegal user will be incorrect, or no password or fingerprint will be entered at all. The password or fingerprint entry device sends the received information to the intelligent control system for identification, ensuring the security of the intelligent control and the environment.
Further, the identity recognition module 50 is further configured to:
extract the user's portrait feature from the scene information;
identify the user's portrait feature and determine whether it is consistent with the pre-stored portrait feature of a legitimate user;
when they are consistent, determine that the user is a legitimate user;
when they are inconsistent, determine that the user is an illegal user.
This embodiment uses image recognition to identify the user's identity. After the user enters the scene, the scene information obtained by the camera includes the user's portrait. The intelligent control system can extract the user's portrait features from the scene information, which may include the user's facial features, body shape, iris features, and so on, and may also include the user's walking posture, sitting posture, habitual movements, and the like. The intelligent control system identifies the user through one or more of these features: when the obtained features are consistent with the pre-stored features of a legitimate user, the identity verification is valid and the corresponding intelligent control operation is performed; when the features are inconsistent, the identity verification is invalid and the alarm system is activated, which helps improve the security of the intelligent control.
The above are only preferred embodiments of the present invention and are not intended to limit the scope of the patent; any equivalent structural or process transformation made using the contents of the description and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (18)

  1. A method for intelligently controlling a controlled device, characterized by comprising the steps of:
    monitoring the current scene and, when a user arrives in the current scene, obtaining scene information of the current scene and identifying the current scene according to the scene information to obtain a scene recognition result;
    collecting action information of the user and identifying the action information to obtain an action recognition result;
    controlling, according to the scene recognition result and the action recognition result, a controlled device that matches the scene recognition result and the action recognition result.
  2. The method for intelligently controlling a controlled device according to claim 1, characterized in that the step of controlling, according to the scene recognition result and the action recognition result, a controlled device that matches the scene recognition result and the action recognition result comprises:
    searching for a matching control instruction and a matched controlled device according to the scene recognition result and the action recognition result;
    sending the control instruction to the matched controlled device.
  3. The method for intelligently controlling a controlled device according to claim 2, characterized in that, before the step of sending the control instruction to the matched controlled device, the method further comprises:
    obtaining the current state of the matched controlled device;
    determining whether the current state of the matched controlled device is already consistent with the state corresponding to the control instruction;
    when the current state is already consistent with the state corresponding to the control instruction, not sending the control instruction;
    when the current state is not consistent with the state corresponding to the control instruction, performing the step of sending the control instruction to the matched controlled device.
  4. The method for intelligently controlling a controlled device according to claim 3, characterized in that, before the step of collecting the action information of the user, the method further comprises:
    obtaining identity information of the user and determining whether the user is a legitimate user;
    when the user is a legitimate user, performing the step of collecting the action information of the user;
    when the user is an illegal user, activating an alarm system.
  5. The method for intelligently controlling a controlled device according to claim 4, characterized in that the step of obtaining identity information of the user and determining whether the user is a legitimate user comprises:
    extracting a portrait feature of the user from the scene information;
    identifying the portrait feature of the user and determining whether it is consistent with the pre-stored portrait feature of a legitimate user;
    when they are consistent, determining that the user is a legitimate user;
    when they are inconsistent, determining that the user is an illegal user.
  6. The method for intelligently controlling a controlled device according to claim 2, characterized in that, before the step of collecting the action information of the user, the method further comprises:
    obtaining identity information of the user and determining whether the user is a legitimate user;
    when the user is a legitimate user, performing the step of collecting the action information of the user;
    when the user is an illegal user, activating an alarm system.
  7. The method for intelligently controlling a controlled device according to claim 6, characterized in that the step of obtaining identity information of the user and determining whether the user is a legitimate user comprises:
    extracting a portrait feature of the user from the scene information;
    identifying the portrait feature of the user and determining whether it is consistent with the pre-stored portrait feature of a legitimate user;
    when they are consistent, determining that the user is a legitimate user;
    when they are inconsistent, determining that the user is an illegal user.
  8. The method for intelligently controlling a controlled device according to claim 1, characterized in that, before the step of collecting the action information of the user, the method further comprises:
    obtaining identity information of the user and determining whether the user is a legitimate user;
    when the user is a legitimate user, performing the step of collecting the action information of the user;
    when the user is an illegal user, activating an alarm system.
  9. The method for intelligently controlling a controlled device according to claim 8, characterized in that the step of obtaining identity information of the user and determining whether the user is a legitimate user comprises:
    extracting a portrait feature of the user from the scene information;
    identifying the portrait feature of the user and determining whether it is consistent with the pre-stored portrait feature of a legitimate user;
    when they are consistent, determining that the user is a legitimate user;
    when they are inconsistent, determining that the user is an illegal user.
  10. An apparatus for intelligently controlling a controlled device, characterized by comprising:
    a scene analysis module, configured to monitor the current scene and, when a user arrives in the current scene, obtain scene information of the current scene and identify the current scene according to the scene information to obtain a scene recognition result;
    an action analysis module, configured to collect action information of the user and identify the action information to obtain an action recognition result;
    a control module, configured to control, according to the scene recognition result and the action recognition result, a controlled device that matches the scene recognition result and the action recognition result.
  11. The apparatus for intelligently controlling a controlled device according to claim 10, characterized in that the control module is further configured to:
    search for a matching control instruction and a matched controlled device according to the scene recognition result and the action recognition result;
    send the control instruction to the matched controlled device.
  12. The apparatus for intelligently controlling a controlled device according to claim 11, characterized by further comprising a state analysis module, configured to obtain the current state of the matched controlled device and determine whether the current state of the matched controlled device is already consistent with the state corresponding to the control instruction;
    the control module is further configured to: when the current state is already consistent with the state corresponding to the control instruction, not send the control instruction; when the current state is not consistent with the state corresponding to the control instruction, send the control instruction to the matched controlled device.
  13. The apparatus for intelligently controlling a controlled device according to claim 12, characterized by further comprising an identity recognition module and an alarm module;
    the identity recognition module is configured to obtain identity information of the user and determine whether the user is a legitimate user;
    the action analysis module is further configured to collect the action information of the user when the user is a legitimate user;
    the alarm module is configured to activate an alarm system when the user is an illegal user.
  14. The apparatus for intelligently controlling a controlled device according to claim 13, characterized in that the identity recognition module is further configured to:
    extract a portrait feature of the user from the scene information;
    identify the portrait feature of the user and determine whether it is consistent with the pre-stored portrait feature of a legitimate user;
    when they are consistent, determine that the user is a legitimate user;
    when they are inconsistent, determine that the user is an illegal user.
  15. The apparatus for intelligently controlling a controlled device according to claim 11, characterized by further comprising an identity recognition module and an alarm module;
    the identity recognition module is configured to obtain identity information of the user and determine whether the user is a legitimate user;
    the action analysis module is further configured to collect the action information of the user when the user is a legitimate user;
    the alarm module is configured to activate an alarm system when the user is an illegal user.
  16. The apparatus for intelligently controlling a controlled device according to claim 15, characterized in that the identity recognition module is further configured to:
    extract a portrait feature of the user from the scene information;
    identify the portrait feature of the user and determine whether it is consistent with the pre-stored portrait feature of a legitimate user;
    when they are consistent, determine that the user is a legitimate user;
    when they are inconsistent, determine that the user is an illegal user.
  17. The apparatus for intelligently controlling a controlled device according to claim 10, characterized by further comprising an identity recognition module and an alarm module;
    the identity recognition module is configured to obtain identity information of the user and determine whether the user is a legitimate user;
    the action analysis module is further configured to collect the action information of the user when the user is a legitimate user;
    the alarm module is configured to activate an alarm system when the user is an illegal user.
  18. The apparatus for intelligently controlling a controlled device according to claim 17, characterized in that the identity recognition module is further configured to:
    extract a portrait feature of the user from the scene information;
    identify the portrait feature of the user and determine whether it is consistent with the pre-stored portrait feature of a legitimate user;
    when they are consistent, determine that the user is a legitimate user;
    when they are inconsistent, determine that the user is an illegal user.
PCT/CN2015/086878 2015-05-26 2015-08-13 Method and apparatus for intelligently controlling a controlled device WO2016187964A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/319,383 US20170139470A1 (en) 2015-05-26 2015-08-13 Method for intelligently controlling controlled equipment and device
AU2015396131A AU2015396131A1 (en) 2015-05-26 2015-08-13 Method and apparatus for intelligently controlling controlled device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510275680.2 2015-05-26
CN201510275680.2A CN105045140B (zh) 2015-05-26 2015-05-26 智能控制受控设备的方法和装置

Publications (1)

Publication Number Publication Date
WO2016187964A1 true WO2016187964A1 (zh) 2016-12-01

Family

ID=54451759

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/086878 WO2016187964A1 (zh) 2015-05-26 2015-08-13 智能控制受控设备的方法和装置

Country Status (4)

Country Link
US (1) US20170139470A1 (zh)
CN (1) CN105045140B (zh)
AU (1) AU2015396131A1 (zh)
WO (1) WO2016187964A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108205836A (zh) * 2017-12-21 2018-06-26 广东汇泰龙科技有限公司 一种基于云锁的红外人体感应的智能家居联动方法及系统

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201636784A (zh) * 2015-04-01 2016-10-16 全智慧科技股份有限公司 環境導向的智慧控制裝置、智慧控制系統及智慧控制方法
CN105629762B (zh) * 2016-03-21 2019-07-19 美的集团股份有限公司 智能家居的控制装置及方法
CN105872685A (zh) * 2016-03-24 2016-08-17 深圳市国华识别科技开发有限公司 智能终端控制方法和系统、智能终端
CN106338926A (zh) * 2016-11-01 2017-01-18 成都铅笔科技有限公司 基于人体感应的智慧家居控制系统
CN106773815A (zh) * 2016-11-30 2017-05-31 广州微至科技有限公司 数字智能控制方法、装置和中控系统
CN106951071B (zh) * 2017-03-01 2020-09-01 海尔优家智能科技(北京)有限公司 一种基于动作捕捉的设备控制方法和装置
CN107255928A (zh) * 2017-06-05 2017-10-17 珠海格力电器股份有限公司 一种设备控制方法、装置及家电设备
CN107330450A (zh) * 2017-06-15 2017-11-07 珠海格力电器股份有限公司 一种终端设备控制方法和装置
CN107688319A (zh) * 2017-09-08 2018-02-13 合肥永烨信息科技有限公司 一种室内安防系统及其方法
CN108009414B (zh) * 2017-12-28 2020-04-07 大道网络(上海)股份有限公司 一种基于生物识别的多用户智能控制台系统和控制方法
CN108335130A (zh) * 2018-01-11 2018-07-27 口碑(上海)信息技术有限公司 出入场所检测方法及装置
CN108337253A (zh) * 2018-01-29 2018-07-27 苏州南尔材料科技有限公司 一种基于计算机的智能家电控制方法
CN108614509A (zh) * 2018-05-03 2018-10-02 珠海格力电器股份有限公司 一种联动控制方法、装置、存储介质、终端及安防设备
WO2019213855A1 (zh) * 2018-05-09 2019-11-14 Fang Chao 设备控制方法和系统
CN108710308A (zh) * 2018-05-29 2018-10-26 广东汇泰龙科技有限公司 一种基于智能云锁的特定姿势触发场景联动的方法及系统
CN109164713B (zh) * 2018-10-23 2020-08-04 珠海格力电器股份有限公司 一种智能家居控制方法及装置
CN109298646B (zh) * 2018-11-09 2020-05-05 珠海格力电器股份有限公司 一种场景控制方法、装置、存储介质及电器
CN111737669A (zh) * 2019-03-22 2020-10-02 青岛海信智慧家居系统股份有限公司 显示装置的控制方法、装置、电子设备及存储介质
CN111766786B (zh) * 2019-04-02 2023-05-02 青岛海信智慧生活科技股份有限公司 一种智能控制方法及控制器
CN110163105B (zh) * 2019-04-19 2021-11-26 瑞芯微电子股份有限公司 基于人工智能的中控芯片及方法
CN111310009A (zh) * 2020-01-16 2020-06-19 珠海格力电器股份有限公司 用户分类方法、装置、存储介质、计算机设备
CN112769895B (zh) * 2020-12-18 2023-10-13 杭州涂鸦信息技术有限公司 一种群组或场景的控制方法及相关装置
CN113329121B (zh) * 2021-05-28 2022-11-08 维沃软件技术有限公司 操作执行方法、操作执行装置、电子设备和可读存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101504542A (zh) * 2008-02-04 2009-08-12 冯光道 家电信息化管理系统及方法
US20120131640A1 (en) * 2010-11-19 2012-05-24 Samsung Electronics Co., Ltd. Enabling presence information access and authorization for home network telephony
CN102707797A (zh) * 2011-03-02 2012-10-03 微软公司 通过自然用户界面控制多媒体系统中的电子设备
CN102932212A (zh) * 2012-10-12 2013-02-13 华南理工大学 一种基于多通道交互方式的智能家居控制系统
CN103257627A (zh) * 2013-02-05 2013-08-21 西安交通大学 一种基于计算机视觉的物联网控制系统及方法
CN103295028A (zh) * 2013-05-21 2013-09-11 深圳Tcl新技术有限公司 手势操作控制方法、装置及智能显示终端

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005059711A1 (de) * 2005-12-12 2007-06-14 Basf Ag Formkörper enthaltend ein mikroporöses Material und mindestens ein siliciumhaltiges Bindemittel, Verfahren zu seiner Herstellung und seine Verwendung als Katalysator, insbesondere in einem Verfahren zur kontinuierlichen Synthese von Methylaminen
US9258326B2 (en) * 2008-04-02 2016-02-09 Yougetitback Limited API for auxiliary interface
KR20110003146A (ko) * 2009-07-03 2011-01-11 한국전자통신연구원 제스쳐 인식 장치, 이를 구비한 로봇 시스템 및 이를 이용한 제스쳐 인식 방법
CN101673094A (zh) * 2009-09-23 2010-03-17 曾昭兴 一种家电控制装置和控制方法
KR101235432B1 (ko) * 2011-07-11 2013-02-22 김석중 3차원 모델링된 전자기기의 가상터치를 이용한 원격 조작 장치 및 방법
CN104102181B (zh) * 2013-04-10 2017-04-19 海尔集团公司 智能家居控制方法、装置及系统
CN103472796B (zh) * 2013-09-11 2014-10-22 厦门狄耐克电子科技有限公司 一种基于手势识别的智能家居系统
CN104483851B (zh) * 2014-10-30 2017-03-15 深圳创维-Rgb电子有限公司 一种情景感知控制装置、系统及方法
CN104597883A (zh) * 2015-01-15 2015-05-06 小米科技有限责任公司 智能室内家居监控方法及装置

Also Published As

Publication number Publication date
CN105045140B (zh) 2019-01-01
US20170139470A1 (en) 2017-05-18
AU2015396131A1 (en) 2017-01-12
CN105045140A (zh) 2015-11-11

Similar Documents

Publication Publication Date Title
WO2016187964A1 (zh) 智能控制受控设备的方法和装置
WO2019051887A1 (zh) 家电的控制方法、装置和计算机可读存储介质
WO2020246844A1 (en) Device control method, conflict processing method, corresponding apparatus and electronic device
WO2019051902A1 (zh) 终端控制方法、空调器及计算机可读存储介质
WO2016065745A1 (zh) 一种情景感知控制装置、系统及方法
EP3411634A1 (en) Data learning server and method for generating and using learning model thereof
WO2019080406A1 (zh) 电视机语音交互方法、语音交互控制装置及存储介质
WO2019033904A1 (zh) 登录验证方法、系统及计算机可读存储介质
WO2016060486A1 (en) User terminal apparatus and iris recognition method thereof
WO2017177524A1 (zh) 音视频同步播放的方法及装置
WO2017099314A1 (ko) 사용자 정보를 제공하는 전자 장치 및 방법
WO2019062113A1 (zh) 家电设备的控制方法、装置、家电设备及可读存储介质
WO2017041337A1 (zh) 空调器的控制方法、终端及系统
WO2016058258A1 (zh) 终端远程控制方法和系统
WO2018023926A1 (zh) 电视与移动终端的互动方法及系统
WO2019045521A1 (ko) 전자 장치 및 그 제어 방법
WO2015018185A1 (zh) 实现分布式遥控的方法、装置及其电视端和移动终端
WO2015170832A1 (ko) 디스플레이 장치 및 그의 화상 통화 수행 방법
WO2019085543A1 (zh) 电视机系统及电视机控制方法
WO2019051904A1 (zh) 终端报警方法、装置及计算机可读存储介质
WO2015137666A1 (ko) 오브젝트 인식 장치 및 그 제어 방법
WO2019164281A1 (en) Electronic device and control method thereof
WO2019051897A1 (zh) 终端运行参数调整方法、装置及计算机可读存储介质
WO2019062112A1 (zh) 空调器控制方法、装置、空调器及计算机可读存储介质
WO2018053964A1 (zh) 基于安卓系统的语音输入标点符号的方法及装置

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 15319383

Country of ref document: US

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15893045

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015396131

Country of ref document: AU

Date of ref document: 20150813

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15893045

Country of ref document: EP

Kind code of ref document: A1