CN111007806A - Smart home control method and device - Google Patents
- Publication number
- CN111007806A (application CN201811168754.2A)
- Authority
- CN
- China
- Prior art keywords
- user
- action information
- control
- control instruction
- controlling
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
- G05B19/4183—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by data acquisition, e.g. workpiece identification
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
- G05B19/41845—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by system universality, reconfigurability, modularity
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/26—Pc applications
- G05B2219/2642—Domotique, domestic, home control, automation, smart house
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The invention discloses a smart home control method and device. The method comprises the following steps: acquiring action information of a user and a control instruction corresponding to the action information; inputting the user's action information and the control instruction into a recognition model, and determining the correspondence between the user's action information and the control instruction through the recognition model, wherein the recognition model is obtained by machine learning training on multiple groups of training data, and each group of training data comprises: the user's action information, the control instruction, and the correspondence between them; and controlling the smart home according to the correspondence. The invention solves the technical problem that existing action-based smart home control methods have a poor effect.
Description
Technical Field
The invention relates to the technical field of smart home control, and in particular to a smart home control method and device.
Background
There are many recognition methods in a smart home environment. Mature approaches include voice control and remote control, and there are also action-based control methods, which collect a user's action, extract its features, and derive a control instruction from the extracted action features, thereby controlling the smart home. In the prior art, action-based control drives a single device in the smart home through actions, for example controlling an air conditioner. Once applied to multiple devices of the smart home, control instructions easily become confused. Moreover, because there are many smart home devices, there are correspondingly many control instructions, and with many users the sheer number of commands clearly burdens the users. Each user's living habits and action characteristics differ, and the same action performed by different users is easily mistaken for different control instructions, so action-based smart home control cannot achieve a satisfactory effect.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the present invention provide a smart home control method and device to at least solve the technical problem that existing action-based smart home control methods have a poor effect.
According to an aspect of an embodiment of the present invention, a smart home control method is provided, comprising: acquiring action information of a user and a control instruction corresponding to the action information; inputting the user's action information and the control instruction into a recognition model, and determining the correspondence between the user's action information and the control instruction through the recognition model, wherein the recognition model is obtained by machine learning training on multiple groups of training data, and each group of training data comprises: the user's action information, the control instruction, and the correspondence between them; and controlling the smart home according to the correspondence.
Optionally, acquiring the action information of the user comprises: collecting an action image of the user; and determining the action information from the action image.
Optionally, obtaining the control instruction corresponding to the user's action information comprises: determining the corresponding control instruction according to the action information.
Optionally, controlling the smart home according to the correspondence comprises: controlling a controlled device in the smart home; and controlling the smart home to work by controlling a plurality of controlled devices simultaneously.
Optionally, controlling the controlled device in the smart home comprises: acquiring action information input to the controlled device, wherein the controlled device is any one of a plurality of controlled devices; determining, according to the action information and the correspondence, a control instruction adapted to the controlled device; and controlling the controlled device according to the control instruction.
Optionally, the action information comprises motion parameters of each joint and each part of the human body.
According to another aspect of the embodiments of the present invention, a smart home control device is also provided, comprising: an acquisition module, configured to acquire action information of a user and a control instruction corresponding to the action information; a recognition module, configured to input the user's action information and the control instruction into a recognition model and determine the correspondence between them through the recognition model, wherein the recognition model is obtained by machine learning training on multiple groups of training data, and each group of training data comprises: the user's action information, the control instruction, and the correspondence between them; and a control module, configured to control the smart home according to the correspondence.
Optionally, the control module comprises: an acquisition unit, configured to acquire action information input to a controlled device, wherein the controlled device is any one of a plurality of controlled devices; a determining unit, configured to determine, according to the action information and the correspondence, a control instruction adapted to the controlled device; and a control unit, configured to control the controlled device according to the control instruction.
According to another aspect of the embodiments of the present invention, a storage medium is also provided, the storage medium storing program instructions, wherein when the program instructions run, the device on which the storage medium is located is controlled to perform any one of the methods above.
According to another aspect of the embodiments of the present invention, a processor is also provided, the processor being configured to run a program, wherein the program, when running, performs any one of the methods above.
In the embodiments of the invention, action information of a user and a control instruction corresponding to the action information are acquired; the user's action information and the control instruction are input into a recognition model, and the correspondence between them is determined through the recognition model, wherein the recognition model is obtained by machine learning training on multiple groups of training data, each group comprising the user's action information, the control instruction, and the correspondence between them; and the smart home is controlled according to the correspondence. By recognizing the correspondence between the user's action information and the corresponding control instruction through the recognition model, the accuracy with which the smart home recognizes the user's actions is effectively improved, thereby achieving the technical effect of effective smart home control and solving the technical problem that existing action-based smart home control methods have a poor effect.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a flowchart of a control method of smart home according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a control device of a smart home according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present invention, an embodiment of a control method for smart home is provided, it should be noted that the steps shown in the flowchart of the drawings may be executed in a computer system such as a set of computer executable instructions, and although a logical order is shown in the flowchart, in some cases, the steps shown or described may be executed in an order different from that here.
Fig. 1 is a flowchart of a control method of smart home according to an embodiment of the present invention, and as shown in fig. 1, the method includes the following steps:
Step S102: acquiring action information of a user and a control instruction corresponding to the action information;
Step S104: inputting the user's action information and the control instruction into a recognition model, and determining the correspondence between the user's action information and the control instruction through the recognition model, wherein the recognition model is obtained by machine learning training on multiple groups of training data, and each group of training data comprises: the user's action information, the control instruction, and the correspondence between them;
Step S106: controlling the smart home according to the correspondence.
Through the above steps, action information of a user and a control instruction corresponding to the action information can be acquired; the user's action information and the control instruction are input into a recognition model, and the correspondence between them is determined through the recognition model, wherein the recognition model is obtained by machine learning training on multiple groups of training data, each group comprising the user's action information, the control instruction, and the correspondence between them; and the smart home is controlled according to the correspondence. By recognizing the correspondence between the user's action information and the corresponding control instruction through the recognition model, the accuracy with which the smart home recognizes the user's actions is effectively improved, thereby achieving the technical effect of effective smart home control and solving the technical problem that existing action-based smart home control methods have a poor effect.
Before the user's action information and the corresponding control instruction are acquired, the correspondence between them must be established; that is, the user's action information is made capable of controlling a device in the smart home to perform corresponding work, for example turning the device on or off or adjusting it accordingly.
The recognition model is built with a learning algorithm as a model of the correspondence between action information and control instructions, and is trained on the user's action information and the corresponding control instructions until the model converges. For each controlled device, the action information applicable to that device and its corresponding control instruction are input into the recognition model expressing the correspondence, and the correspondence matching that device is determined. According to this correspondence, a controlled device can determine the control instruction applicable to it from input action information. The user can thus confirm the correspondence for different devices and control different smart home devices through action instructions.
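The training-and-lookup process described above can be sketched as follows. This is a minimal, hypothetical stand-in for the recognition model: a nearest-neighbour lookup over memorised labelled examples replaces real machine-learning training, and the feature vectors and instruction names are invented for illustration; the patent does not specify a particular algorithm.

```python
import math

def train(samples):
    """'Train' the stand-in model by memorising labelled
    (feature_vector, control_instruction) pairs."""
    return list(samples)

def recognise(model, features):
    """Return the control instruction of the nearest training sample
    (Euclidean distance over the feature vector)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    best = min(model, key=lambda pair: dist(pair[0], features))
    return best[1]

# Hypothetical training data: action feature vectors -> instructions.
model = train([
    ((0.9, 0.1), "power_on"),
    ((0.1, 0.9), "power_off"),
])
print(recognise(model, (0.8, 0.2)))  # closest to the "power_on" sample
```

A production system would instead fit a classifier until convergence, as the description says, but the lookup interface (features in, instruction out) is the same.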
Optionally, acquiring the action information of the user comprises: collecting an action image of the user; and determining the action information from the action image.
To acquire the user's action information, an action image of the user is collected first, and corresponding action features are extracted from it to obtain the action information. The action image is collected by an image acquisition device in the smart home environment, such as a surveillance camera, a video recorder, or a mobile phone camera, which can capture an action video or action pictures of the user. When a video of the user's action is obtained, picture frames are cut out of it and the user's action features are extracted from those frames; alternatively, an action picture of the user is obtained directly and the action features are extracted from the picture. The action features in the action image are then analyzed to obtain the user's action information.
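As a rough illustration of extracting an action feature from captured frames, the sketch below computes the mean absolute pixel difference between two consecutive frames. Both the 2-D grey-value frame representation and this "motion energy" feature are simplifying assumptions for the sketch, not the feature extraction method of the patent.

```python
def motion_feature(prev_frame, frame):
    """Crude action feature: mean absolute grey-value difference
    between two consecutive frames (higher = more movement)."""
    diffs = [abs(a - b)
             for row_a, row_b in zip(prev_frame, frame)
             for a, b in zip(row_a, row_b)]
    return sum(diffs) / len(diffs)

still = [[10, 10], [10, 10]]   # frame with no movement
moved = [[10, 60], [10, 10]]   # one region changed between frames
print(motion_feature(still, moved))  # 12.5
```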
Optionally, obtaining the control instruction corresponding to the user's action information comprises: determining the corresponding control instruction according to the action information.
The control instruction corresponding to the action information is determined from the user's action information. The user's action information and the control instructions may be in a one-to-one or a one-to-many correspondence; specifically, one piece of action information may correspond to one or more control instructions, and the user's action may be intermittent or continuous. In a one-to-one correspondence, one piece of action information corresponds to only one control instruction, and that instruction can make one or more devices perform the same operation, for example turning on or off simultaneously. In a one-to-many correspondence, one piece of action information corresponds to multiple control instructions, and different instructions can be executed on different devices; for example, when the user makes an 'OK' gesture with a hand, the air conditioner keeps the current temperature unchanged and the sweeping robot starts to clean the room.
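The one-to-one and one-to-many correspondences can be represented as a simple lookup table from a recognized gesture to (device, instruction) pairs. The gesture names, device names, and instruction strings below are hypothetical examples, not values defined by the patent.

```python
# Hypothetical gesture-to-instruction table. "wave" is one-to-one in the
# patent's sense (one instruction, possibly broadcast to many devices);
# "ok" is one-to-many (different instructions for different devices).
GESTURE_MAP = {
    "wave": [("tv", "power_on"), ("light", "power_on")],
    "ok":   [("air_conditioner", "hold_temperature"),
             ("vacuum_robot", "start_cleaning")],
}

def instructions_for(gesture):
    """Look up the (device, instruction) pairs for a recognized gesture."""
    return GESTURE_MAP.get(gesture, [])

print(instructions_for("ok"))
```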
Furthermore, the user can preset the correspondence between action information and control instructions according to factors such as living habits, and respective control permissions can also be set for users of different identities.
Optionally, controlling the smart home according to the correspondence comprises: controlling a controlled device in the smart home; and controlling the smart home to work by controlling a plurality of controlled devices simultaneously.
After the correspondence between the user's action information and the control instructions is established, the smart home is controlled according to it. Specifically, when the action information and the control instruction are in a one-to-one relationship, the instruction executes the same command on one or more controlled devices in the smart home; for example, if the instruction corresponding to an action is a power-on command, the one or more devices controlled by that command are powered on simultaneously when the user performs the action. When the action information and the control instructions are in a one-to-many relationship, different commands are executed on different devices in the smart home; for example, the same action may correspond to a turn-on command for the television but a turn-off or dimming command for the lighting.
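Dispatching the mapped instructions to the devices they target can be sketched as below. The `Device` class and its single-string state model are invented purely for illustration of the control step, under the assumption that each correspondence entry names its target device.

```python
class Device:
    """Toy controlled device that records the last instruction applied."""
    def __init__(self, name):
        self.name = name
        self.state = "off"

    def apply(self, instruction):
        self.state = instruction

def dispatch(devices, pairs):
    """Send each (device_name, instruction) pair of the correspondence
    to the matching device; entries for unknown devices are ignored."""
    by_name = {d.name: d for d in devices}
    for name, instruction in pairs:
        if name in by_name:
            by_name[name].apply(instruction)

tv, light = Device("tv"), Device("light")
dispatch([tv, light], [("tv", "on"), ("light", "dim")])
print(tv.state, light.state)  # on dim
```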
Optionally, controlling the controlled device in the smart home comprises: acquiring action information input to the controlled device, wherein the controlled device is any one of a plurality of controlled devices; determining, according to the action information and the correspondence, a control instruction adapted to the controlled device; and controlling the controlled device according to the control instruction.
When a controlled device in the smart home is controlled, one or more controlled devices acquire the user's action information, determine from the action information and the correspondence whether the mapped control instruction is intended for that device, and are then controlled according to the result of that determination.
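The device-side check described above — each device inspecting the correspondence and acting only on entries aimed at it — might look like the following sketch; the device and instruction names are again hypothetical.

```python
def instruction_for_device(device_name, correspondence):
    """Return this device's instruction from the correspondence list,
    or None if the gesture does not target this device."""
    for name, instruction in correspondence:
        if name == device_name:
            return instruction
    return None

pairs = [("air_conditioner", "hold_temperature"),
         ("vacuum_robot", "start_cleaning")]
print(instruction_for_device("vacuum_robot", pairs))  # start_cleaning
print(instruction_for_device("tv", pairs))            # None
```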
Optionally, the action information comprises motion parameters of each joint and each part of the human body.
The action information, namely the motion parameters of each joint and each part of the human body, can be acquired by a motion capture device. The human body consists of various parts, roughly the head, hands, legs, and so on, and these can be further subdivided; for example, the head includes the eyes and ears. Joints are the connections between the bones of the human body, and may also be regarded as connections between body parts. A motion parameter is the degree of change of a joint or part, for example the extension of the arms, the bending of the fingers, or a change in body posture. The changes of each joint and each part of the human body can thus be expressed with specific parameters to represent the user's action information.
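One plausible encoding of such per-joint motion parameters is a mapping from joint name to angle, with an action detected as the set of joints whose angle changed. The joint names, angle values, and threshold below are assumptions made for the sketch, not parameters given by the patent.

```python
def make_pose(**angles):
    """Record joint angles (in degrees) for one captured pose."""
    return dict(angles)

def changed_joints(prev, curr, threshold=5.0):
    """Joints whose angle changed by more than `threshold` degrees
    between two poses -- a crude per-joint motion parameter."""
    return sorted(j for j in curr
                  if abs(curr[j] - prev.get(j, 0.0)) > threshold)

rest = make_pose(elbow=170.0, index_finger=180.0)
okay = make_pose(elbow=170.0, index_finger=60.0)  # finger curls for "OK"
print(changed_joints(rest, okay))  # ['index_finger']
```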
Fig. 2 is a schematic structural diagram of a smart home control device according to an embodiment of the present invention. As shown in Fig. 2, the smart home control device comprises an acquisition module 22, a recognition module 24, and a control module 26, which are described in detail below.
The acquisition module 22 is configured to acquire action information of a user and a control instruction corresponding to the action information. The recognition module 24 is connected to the acquisition module 22 and is configured to input the user's action information and the control instruction into a recognition model and determine the correspondence between them through the recognition model, wherein the recognition model is obtained by machine learning training on multiple groups of training data, and each group of training data comprises: the user's action information, the control instruction, and the correspondence between them. The control module 26 is connected to the recognition module 24 and is configured to control the smart home according to the correspondence.
Through the above modules, the smart home control device can acquire a user's action information and the corresponding control instruction, determine their correspondence through the recognition model, and control the smart home accordingly. By recognizing the correspondence between the user's action information and the corresponding control instruction through the recognition model, the accuracy with which the smart home recognizes the user's actions is effectively improved, thereby achieving the technical effect of effective smart home control and solving the technical problem that existing action-based smart home control methods have a poor effect.
Before the user's action information and the corresponding control instruction are acquired, the correspondence between them must be established; that is, the user's action information is made capable of controlling a device in the smart home to perform corresponding work, for example turning the device on or off or adjusting it accordingly.
The recognition model is built with a learning algorithm as a model of the correspondence between action information and control instructions, and is trained on the user's action information and the corresponding control instructions until the model converges. For each controlled device, the action information applicable to that device and its corresponding control instruction are input into the recognition model expressing the correspondence, and the correspondence matching that device is determined. According to this correspondence, a controlled device can determine the control instruction applicable to it from input action information. The user can thus confirm the correspondence for different devices and control different smart home devices through action instructions.
Optionally, the control module 26 comprises: an acquisition unit, configured to acquire action information input to a controlled device, wherein the controlled device is any one of a plurality of controlled devices; a determining unit, configured to determine, according to the action information and the correspondence, a control instruction adapted to the controlled device; and a control unit, configured to control the controlled device according to the control instruction.
When a controlled device in the smart home is controlled, one or more controlled devices acquire the user's action information, determine from the action information and the correspondence whether the mapped control instruction is intended for that device, and are then controlled according to the result of that determination.
According to another aspect of the embodiments of the present invention, a storage medium is also provided, the storage medium storing program instructions, wherein when the program instructions run, the device on which the storage medium is located is controlled to perform any one of the methods above.
According to another aspect of the embodiments of the present invention, a processor is also provided, the processor being configured to run a program, wherein the program, when running, performs any one of the methods above.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that those skilled in the art can make various modifications and refinements without departing from the principle of the present invention, and such modifications and refinements should also be regarded as falling within the protection scope of the present invention.
Claims (10)
1. A smart home control method, characterized by comprising the following steps:
acquiring action information of a user and a control instruction corresponding to the action information;
inputting the action information of the user and the control instruction into a recognition model, and determining the corresponding relation between the action information of the user and the control instruction through the recognition model, wherein the recognition model is obtained through machine learning training using multiple groups of training data, and each group of the multiple groups of training data comprises: the action information of the user, the control instruction, and the corresponding relation between the action information of the user and the control instruction;
and controlling the smart home according to the corresponding relation.
2. The method of claim 1, wherein obtaining user action information comprises:
collecting an action image of a user;
and determining the action information according to the action image.
3. The method of claim 2, wherein obtaining the control command corresponding to the user's action information comprises:
and determining a corresponding control instruction according to the action information.
4. The method according to claim 1, wherein controlling the smart home according to the correspondence comprises:
controlling a user device in the smart home;
and controlling the smart home to operate by simultaneously controlling a plurality of user devices.
5. The method of claim 4, wherein controlling the user device in the smart home comprises:
acquiring action information input to the user device, wherein the user device is any one of a plurality of user devices;
determining, according to the action information and the corresponding relation, a control instruction adapted to the user device;
and controlling the user device according to the control instruction.
6. The method according to any one of claims 1 to 5, wherein the action information comprises motion parameters of respective joints and parts of the human body.
7. A smart home control apparatus, characterized by comprising:
the acquisition module is used for acquiring action information of a user and a control instruction corresponding to the action information;
the recognition module is used for inputting the action information of the user and the control instruction into a recognition model, and determining the corresponding relation between the action information of the user and the control instruction through the recognition model, wherein the recognition model is obtained through machine learning training using multiple groups of training data, and each group of the multiple groups of training data comprises: the action information of the user, the control instruction, and the corresponding relation between the action information of the user and the control instruction;
and the control module is used for controlling the smart home according to the corresponding relation.
8. The apparatus of claim 7, wherein the control module comprises:
an acquisition unit, configured to acquire action information input to a user device, wherein the user device is any one of a plurality of user devices;
a determining unit, configured to determine, according to the action information and the corresponding relation, a control instruction adapted to the user device;
and a control unit, configured to control the user device according to the control instruction.
9. A storage medium storing program instructions, wherein the program instructions, when executed, control an apparatus in which the storage medium is located to perform the method of any one of claims 1 to 6.
10. A processor, characterized in that the processor is configured to run a program, wherein the program when running performs the method of any of claims 1 to 6.
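The flow of claims 1 to 5 (learn an action-to-instruction correspondence from training pairs, then drive one or more devices from a recognized action) can be sketched as follows. This is an illustrative sketch only, not the patented implementation: it stands in for the trained recognition model with a trivial nearest-neighbour lookup over action vectors, and all names used here (`RecognitionModel`, `control_smart_home`, the device names and instruction strings) are assumptions invented for the example.

```python
# Minimal sketch of the claimed flow. The "recognition model" here is a
# nearest-neighbour lookup standing in for a trained machine-learning model.
from dataclasses import dataclass
import math


@dataclass
class Sample:
    action: tuple      # motion parameters of joints/parts (cf. claim 6)
    instruction: str   # control instruction paired with this action


class RecognitionModel:
    """Learns the action -> instruction correspondence from training pairs."""

    def __init__(self):
        self.samples = []

    def train(self, pairs):
        # Each training item is (action_information, control_instruction),
        # mirroring the "multiple groups of training data" of claim 1.
        self.samples = [Sample(a, c) for a, c in pairs]

    def recognize(self, action):
        # Return the instruction of the closest stored action vector.
        best = min(self.samples, key=lambda s: math.dist(s.action, action))
        return best.instruction


def control_smart_home(model, action, devices):
    """Map an observed action to an instruction and apply it to each device
    (cf. claims 4-5: controlling a plurality of user devices)."""
    instruction = model.recognize(action)
    return {name: f"{name}:{instruction}" for name in devices}


model = RecognitionModel()
model.train([((1.0, 0.0), "lights_on"), ((0.0, 1.0), "lights_off")])
result = control_smart_home(model, (0.9, 0.1), ["lamp", "fan"])
```

A real deployment would replace the nearest-neighbour lookup with the trained classifier the claims describe, but the control path (recognize once, dispatch the adapted instruction to each device) stays the same.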
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811168754.2A CN111007806B (en) | 2018-10-08 | 2018-10-08 | Smart home control method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811168754.2A CN111007806B (en) | 2018-10-08 | 2018-10-08 | Smart home control method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111007806A true CN111007806A (en) | 2020-04-14 |
CN111007806B CN111007806B (en) | 2022-04-08 |
Family
ID=70111488
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811168754.2A Active CN111007806B (en) | 2018-10-08 | 2018-10-08 | Smart home control method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111007806B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112162492A (en) * | 2020-11-03 | 2021-01-01 | 珠海格力电器股份有限公司 | Control method and device of household equipment, edge computing gateway and storage medium |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103167340A (en) * | 2013-04-03 | 2013-06-19 | 青岛歌尔声学科技有限公司 | Smart television gesture recognition system and method |
CN103593680A (en) * | 2013-11-19 | 2014-02-19 | 南京大学 | Dynamic hand gesture recognition method based on self incremental learning of hidden Markov model |
CN104992171A (en) * | 2015-08-04 | 2015-10-21 | 易视腾科技有限公司 | Method and system for gesture recognition and man-machine interaction based on 2D video sequence |
CN105045122A (en) * | 2015-06-24 | 2015-11-11 | 张子兴 | Intelligent household natural interaction system based on audios and videos |
CN105354551A (en) * | 2015-11-03 | 2016-02-24 | 北京英梅吉科技有限公司 | Gesture recognition method based on monocular camera |
CN106054650A (en) * | 2016-07-18 | 2016-10-26 | 汕头大学 | Novel intelligent household system and multi-gesture control method thereof |
CN106295479A (en) * | 2015-06-05 | 2017-01-04 | 上海戏剧学院 | Based on body-sensing technology action recognition editing system |
CN107702273A (en) * | 2017-09-20 | 2018-02-16 | 珠海格力电器股份有限公司 | Air conditioning control method and device |
CN107726540A (en) * | 2017-09-27 | 2018-02-23 | 珠海格力电器股份有限公司 | Air conditioning control method and device |
CN107991893A (en) * | 2017-11-14 | 2018-05-04 | 美的集团股份有限公司 | Realize method, gesture identification module, main control module and the home appliance of communication |
CN108052199A (en) * | 2017-10-30 | 2018-05-18 | 珠海格力电器股份有限公司 | Control method, device and the smoke exhaust ventilator of smoke exhaust ventilator |
CN108105136A (en) * | 2017-11-03 | 2018-06-01 | 珠海格力电器股份有限公司 | Control method, device and the fan of fan |
2018-10-08: CN201811168754.2A filed (CN); patent CN111007806B, status Active
Also Published As
Publication number | Publication date |
---|---|
CN111007806B (en) | 2022-04-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103295028B (en) | gesture operation control method, device and intelligent display terminal | |
CN108181819B (en) | Linkage control method, device and system for household electrical appliance and household electrical appliance | |
CN106094540B (en) | Electrical equipment control method, device and system | |
US11061385B2 (en) | Method, apparatus and system for controlling device | |
CN110308660B (en) | Intelligent equipment control method and device | |
CN107560062B (en) | Air conditioner control device and method and air conditioner | |
CN107255928A (en) | A kind of apparatus control method, device and home appliance | |
CN105042789B (en) | The control method and system of a kind of intelligent air condition | |
CN110426962A (en) | A kind of control method and system of smart home device | |
CN107682236A (en) | Smart home interactive system and method based on computer picture recognition | |
CN108375911B (en) | Equipment control method and device, storage medium and equipment | |
CN111007806B (en) | Smart home control method and device | |
CN107742520B (en) | Voice control method, device and system | |
CN109357366A (en) | Adjustment control method, device, storage medium and air-conditioning system | |
CN108572555A (en) | Kitchen ventilator and family's intarconnected cotrol method based on kitchen ventilator | |
CN110880994A (en) | Control method and control equipment of household appliance | |
CN105955040A (en) | Intelligent household system according to real-time video picture visual control and control method thereof | |
CN108415572B (en) | Module control method and device applied to mobile terminal and storage medium | |
CN104062912A (en) | Intelligent home control method, apparatus and system | |
CN115481284A (en) | Cosmetic method and device based on cosmetic box, storage medium and electronic device | |
CN106951071B (en) | Equipment control method and device based on motion capture | |
CN114488829A (en) | Method and device for controlling household appliance and server | |
CN110736223A (en) | Air conditioner control method and device | |
CN110908566A (en) | Information processing method and device | |
CN205334503U (en) | Robot with face identification function |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||