CN110825217B - Household appliance control method and device - Google Patents
- Publication number
- CN110825217B (application CN201810918051.0A)
- Authority
- CN
- China
- Prior art keywords
- controlling
- control
- instruction
- image
- household appliance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2816—Controlling appliance services of a home automation network by calling their functionalities
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The invention discloses a household appliance control method and device. The method comprises the following steps: acquiring an image of a target for controlling the household appliance; identifying a control instruction for controlling the household appliance in the image by using an instruction recognition model, wherein the instruction recognition model is obtained through machine learning training on a plurality of groups of data, and each group of data comprises an image and the control instruction for controlling the household appliance in that image; and controlling the household appliance according to the identified control instruction. The invention solves the technical problem of low control accuracy when controlling household appliances in the related art.
Description
Technical Field
The invention relates to the field of household appliance control, in particular to a household appliance control method and device.
Background
Smart home appliances are commonly controlled by a remote controller or by an application program on an intelligent terminal; voice control also exists in the prior art. However, the traditional control modes usually require a control device (for example, the above-mentioned remote controller or intelligent terminal), so that it is difficult for a user to control the smart home without such a device, and the control accuracy is low. Although voice control allows the smart home to be controlled without a control device, voice control is generally implemented through fixed voice commands; when the user does not know which utterances form valid commands, the smart home is difficult to control and the control accuracy remains low.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiments of the invention provide a household appliance control method and device, which at least solve the technical problem of low accuracy when controlling household appliances in the related art.
According to an aspect of an embodiment of the present invention, there is provided a home appliance control method including: acquiring an image of a target for controlling the household appliance; and identifying a control instruction for controlling the household appliance in the image by using an instruction identification model, wherein the instruction identification model is obtained by using a plurality of groups of data through machine learning training, and each group of data in the plurality of groups of data comprises: the image and a control instruction in the image for controlling the household appliances; and controlling the household appliance according to the identified control instruction.
Optionally, before the instruction recognition model is used to recognize the control instruction for controlling the home appliance in the image, the method further includes: the instruction recognition model is obtained by adopting the following modes: determining a basic model algorithm of the instruction recognition model; and setting a plurality of convolution layers and a screening layer in the basic model algorithm, wherein the convolution layers are used for acquiring a plurality of identification results for identifying the image, and the screening layer is used for screening the plurality of identification results to obtain a screening result.
Optionally, acquiring the instruction recognition model further includes: and an oscillation suppression layer is further arranged in the basic model algorithm, wherein the oscillation suppression layer is used for suppressing oscillation in a regression process by adopting a logarithmic suppression method, and the regression process is a process of obtaining the identified control instruction by carrying out regression on the screening result.
Optionally, acquiring the image of the target for controlling the household appliance includes: acquiring a plurality of images of the target for controlling the household appliance; determining features in the plurality of images; and determining an image containing a feature whose feature confidence is higher than a predetermined value as the image used to control the household appliance, wherein the feature confidence indicates how obviously the feature expresses control of the household appliance.
Optionally, controlling the household appliance according to the identified control instruction includes: outputting the identified control instruction; receiving a voice instruction for controlling the household appliance; and controlling the household appliance according to the identified control instruction when the identified control instruction has a higher priority than the voice instruction.
According to another aspect of an embodiment of the present invention, there is provided a home appliance control apparatus including: the first acquisition module is used for acquiring an image of a target for controlling the household appliance; the identification module is used for identifying a control instruction for controlling the household appliance in the image by using an instruction identification model, wherein the instruction identification model is obtained by using a plurality of groups of data through machine learning training, and each group of data in the plurality of groups of data comprises: the image and a control instruction in the image for controlling the household appliances; and the control module is used for controlling the household appliances according to the identified control instructions.
Optionally, the apparatus further comprises: the second acquisition module is used for acquiring the instruction identification model in the following way, and the second acquisition module comprises: a first determining unit configured to determine a basic model algorithm of the instruction recognition model; the first setting unit is used for setting a plurality of convolution layers and a screening layer in the basic model algorithm, wherein the convolution layers are used for acquiring a plurality of identification results for identifying the image, and the screening layer is used for screening the plurality of identification results to obtain screening results.
Optionally, the second obtaining module further includes: and the second setting unit is used for further setting an oscillation suppression layer in the basic model algorithm, wherein the oscillation suppression layer is used for suppressing oscillation in a regression process by adopting a logarithmic suppression method, and the regression process is a process of obtaining the identified control instruction by carrying out regression on the screening result.
Optionally, the first obtaining module includes: an acquisition unit for acquiring a plurality of images of the target for controlling the household appliance; a second determining unit configured to determine features in the plurality of images; and a third determining unit configured to determine an image containing a feature whose feature confidence is higher than a predetermined value as the image used to control the household appliance, wherein the feature confidence indicates how obviously the feature expresses control of the household appliance.
According to another aspect of an embodiment of the present invention, there is provided a storage medium, wherein the storage medium includes a stored program, and wherein, when the program runs, the device on which the storage medium is located is controlled to execute any one of the above household appliance control methods.
In the embodiment of the invention, an image of a target for controlling the household appliance is acquired; a control instruction for controlling the household appliance in the image is identified by using an instruction recognition model, wherein the instruction recognition model is obtained through machine learning training on a plurality of groups of data, and each group of data comprises an image and the control instruction for controlling the household appliance in that image; and the household appliance is controlled according to the identified control instruction. In this way, the household appliance is controlled accurately, which improves control accuracy and solves the technical problem of low control accuracy when controlling household appliances in the related art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
fig. 1 is a flowchart of a home appliance control method according to an embodiment of the present invention;
fig. 2 is a schematic view of a home appliance control device according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present invention, there is provided a method embodiment of a home appliance control method, it should be noted that the steps shown in the flowchart of the drawings may be performed in a computer system such as a set of computer executable instructions, and that although a logical order is shown in the flowchart, in some cases, the steps shown or described may be performed in an order different from that herein.
Fig. 1 is a flowchart of a home appliance control method according to an embodiment of the present invention, as shown in fig. 1, the method comprising the steps of:
step S102, obtaining an image of a target for controlling household appliances;
step S104, a control instruction for controlling the household appliances in the image is identified by using an instruction identification model, wherein the instruction identification model is obtained by using a plurality of groups of data through machine learning training, and each group of data in the plurality of groups of data comprises: the image and a control instruction in the image for controlling the household appliances;
step S106, controlling the household appliance according to the identified control instruction.
Through the above steps, an image of a target for controlling the household appliance is acquired; a control instruction for controlling the household appliance in the image is identified by using an instruction recognition model, wherein the instruction recognition model is obtained through machine learning training on a plurality of groups of data, and each group of data comprises an image and the control instruction for controlling the household appliance in that image; and the household appliance is controlled according to the identified control instruction. In this way, the household appliance is controlled accurately, which improves control accuracy and solves the technical problem of low control accuracy when controlling household appliances in the related art.
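By way of orientation only, steps S102 to S106 can be read as the minimal control loop sketched below; the function names `capture_image`, `recognize_instruction` and `send_to_appliance` are placeholders assumed for the illustration and are not defined by the patent.

```python
from typing import Callable, List

def control_appliance(capture_image: Callable[[], object],
                      recognize_instruction: Callable[[object], List[str]],
                      send_to_appliance: Callable[[str], None]) -> None:
    """Minimal control loop corresponding to steps S102-S106."""
    image = capture_image()                      # S102: acquire an image of the control target
    instructions = recognize_instruction(image)  # S104: recognize the control instruction(s) with the trained model
    for instruction in instructions:             # S106: control the household appliance accordingly
        send_to_appliance(instruction)
```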
The image may be a photograph taken by a camera, a sensed image detected by a sensor, or an infrared image captured by an infrared camera. The image may include an image feature for controlling the household appliance, such as a gesture or an action.
The instruction recognition model is a model that can be trained by machine learning, such as a convolutional neural network recognition model, and is obtained through machine learning training on multiple sets of data. For example, the instruction recognition model is trained on multiple sets of data until the model converges, at which point it has learned the mapping between input data and output data. Each set of data includes an image and the control instruction for controlling the household appliance contained in that image, for example an image of a start gesture paired with the control instruction to start the household appliance that the gesture corresponds to.
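As a rough illustration of the training data just described, each sample pairs an image with the control instruction it depicts, and training continues until the model converges. The file names, instruction labels and the `fit_one_epoch` method below are assumptions made for the sketch, not details given by the patent.

```python
# Each training sample pairs an image of a control gesture/action with the
# control instruction it represents, e.g. an image of a start gesture
# labelled with the "turn_on" instruction.
training_data = [
    {"image": "gesture_start_001.jpg", "instruction": "turn_on"},
    {"image": "gesture_stop_014.jpg",  "instruction": "turn_off"},
    {"image": "gesture_up_007.jpg",    "instruction": "raise_temperature"},
]

def train_until_convergence(model, data, max_epochs=100, tolerance=1e-4):
    """Train on the (image, instruction) pairs until the loss stops improving."""
    previous_loss = float("inf")
    for _ in range(max_epochs):
        loss = model.fit_one_epoch(data)           # one pass over the training pairs
        if abs(previous_loss - loss) < tolerance:  # change in loss is negligible
            break                                  # the model has converged
        previous_loss = loss
    return model
```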
One or more control instructions for controlling the household appliance may be present in the image. When the recognized instructions are all the same instruction, any one of them is executed; when the instructions belong to different actions, each instruction is executed; and when contradictory instructions exist among them, priorities may be set in advance so that the instruction with the higher priority is executed and the instruction that contradicts it is discarded.
For example, the image may include two instructions for controlling the household appliance, namely an on gesture and an on action; both are instructions to turn on the household appliance, so either one may be selected and executed. The image may instead include a start gesture and a temperature-raising action, corresponding to the control instructions of starting the household appliance and raising its operating temperature; these two instructions belong to different actions and do not conflict, so both are executed. The image may also include an on gesture and an off action, which are two control instructions with opposite effects; a priority may therefore be preset, for example setting the gesture to take precedence over the action, in which case the instruction to turn on the household appliance is executed and the instruction to turn off the household appliance is discarded.
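The three cases above (identical instructions, instructions for different actions, contradictory instructions) can be resolved by a rule of the kind sketched below; the priority table that ranks gesture-derived instructions above action-derived ones and the instruction names are illustrative assumptions.

```python
# Assumed priority: gesture-derived instructions outrank action-derived ones.
SOURCE_PRIORITY = {"gesture": 2, "action": 1}

# Pairs of instructions considered contradictory (e.g. turn_on vs. turn_off).
CONTRADICTIONS = {frozenset({"turn_on", "turn_off"})}

def resolve(recognized):
    """recognized: list of (instruction, source) tuples obtained from one image."""
    # Identical instructions: keep a single copy so it is executed only once.
    unique = list({instr: (instr, src) for instr, src in recognized}.values())

    kept = []
    for instr, src in unique:
        conflict = next(((i, s) for i, s in kept
                         if frozenset({i, instr}) in CONTRADICTIONS), None)
        if conflict is None:
            kept.append((instr, src))    # different actions: execute both
        elif SOURCE_PRIORITY[src] > SOURCE_PRIORITY[conflict[1]]:
            kept.remove(conflict)        # contradictory: keep the higher-priority
            kept.append((instr, src))    # instruction and discard the opposing one
    return [instr for instr, _ in kept]

print(resolve([("turn_on", "gesture"), ("turn_off", "action")]))  # -> ['turn_on']
```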
Optionally, before the instruction recognition model is used to recognize the control instruction for controlling the home appliance in the image, the method further includes: the instruction recognition model is obtained by adopting the following modes: determining a basic model algorithm of the instruction recognition model; and setting a plurality of convolution layers and a screening layer in the basic model algorithm, wherein the convolution layers are used for acquiring a plurality of identification results for identifying the image, and the screening layer is used for screening the plurality of identification results to obtain a screening result.
The instruction recognition model is a convolutional neural network recognition model. On top of the basic model algorithm, a plurality of convolution layers are arranged to recognize the input and to output a plurality of recognition results together with their probabilities. A screening layer is arranged among the convolution layers to screen the output windows of the convolution layer results: windows with higher confidence, that is, more reliable windows, are retained, and windows with lower confidence are discarded, thereby improving the accuracy of the results output by the convolution layers.
The convolution layers are used to obtain a plurality of recognition results for the image, and the screening layer is used to screen these recognition results to obtain the screening result. Screening the recognition results of the convolution layers in this way improves the recognition accuracy of the model.
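A minimal sketch of such confidence-based screening is given below, assuming each convolution branch yields candidate prediction windows as (x, y, w, h, confidence) tuples; the threshold value and the tuple layout are assumptions for the example, since the patent does not fix a concrete screening rule.

```python
import numpy as np

def screen_windows(candidate_windows, confidence_threshold=0.5):
    """Keep only the prediction windows whose confidence is high enough.

    candidate_windows: iterable of (x, y, w, h, confidence) rows gathered
    from the outputs of the several convolution layers.
    """
    windows = np.asarray(candidate_windows, dtype=float)
    keep = windows[:, 4] >= confidence_threshold   # discard low-confidence windows
    return windows[keep]

# Example: of three candidate windows, only the confident ones survive screening.
candidates = [(10, 12, 40, 40, 0.91),
              (11, 14, 38, 42, 0.35),
              (60, 80, 30, 30, 0.72)]
print(screen_windows(candidates))   # rows with confidence 0.91 and 0.72 remain
```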
Optionally, acquiring the instruction recognition model further includes: and an oscillation suppression layer is further arranged in the basic model algorithm, wherein the oscillation suppression layer is used for suppressing oscillation in a regression process by adopting a logarithmic suppression method, and the regression process is a process of obtaining the identified control instruction by carrying out regression on the screening result.
In the recognition model, when the basic model algorithm generates the prediction window output by the model, a regression operation needs to be performed on the target position, and the prediction window can only be determined after the regression converges. The regression methods in the related art may cause large oscillation of the data, so that the regression converges slowly, which affects the recognition speed of the recognition model.
In this embodiment, an oscillation suppression layer is arranged in the basic model algorithm, and a logarithmic suppression method is used to suppress data oscillation during regression, thereby increasing the operation speed of the recognition model.
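The patent does not detail the logarithmic suppression method; one common way a logarithm damps oscillation in window regression, shown as an assumed sketch below, is to regress the logarithm of a size ratio rather than the raw size, so that large deviations produce only modest regression targets and convergence settles faster.

```python
import numpy as np

def log_size_target(predicted_size, anchor_size):
    """Logarithmically compressed regression target for a window dimension.

    Regressing t = log(predicted / anchor) instead of the raw size keeps large
    deviations from producing large, oscillating updates during convergence.
    """
    return np.log(predicted_size / anchor_size)

def decode_size(t, anchor_size):
    """Invert the logarithmic target back to an absolute window size."""
    return anchor_size * np.exp(t)

# A raw size of 4x the anchor becomes a modest target of log(4) ~ 1.39,
# so successive regression steps stay small and the fit settles faster.
print(log_size_target(128.0, 32.0))   # ~1.386
print(decode_size(1.386, 32.0))       # ~127.9
```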
Optionally, acquiring the image of the target for controlling the household appliance includes: acquiring a plurality of images of the target for controlling the household appliance; determining features in the plurality of images; and determining an image containing a feature whose feature confidence is higher than a predetermined value as the image used to control the household appliance, wherein the feature confidence indicates how obviously the feature expresses control of the household appliance.
When more kinds of image features can be used to control the household appliance, control becomes more flexible and the user experience improves. However, more kinds of image features also make image recognition more complicated, and when control instructions corresponding to several image features are present at the same time, the priorities of these instructions may need to be ordered. In this embodiment, confidence is used as the criterion for screening the image features, and thereby for screening the control instructions that correspond to them.
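A sketch of this confidence-based image selection is given below, assuming a feature detector that reports one feature and its confidence per image; the 0.8 threshold and the `detect_feature` callable are illustrative assumptions.

```python
def select_control_images(images, detect_feature, confidence_threshold=0.8):
    """Keep the images whose control feature is clear enough to act on.

    detect_feature(image) is assumed to return (feature, feature_confidence),
    where the confidence measures how obviously the feature expresses an
    intent to control the household appliance.
    """
    selected = []
    for image in images:
        feature, confidence = detect_feature(image)
        if confidence > confidence_threshold:      # feature is obvious enough
            selected.append((image, feature))
    return selected
```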
In statistics, confidence relates to the confidence interval of a probability sample, which is an interval estimate of a population parameter. The confidence interval indicates the probability with which the true value of the parameter falls within a range around the measured value; this probability is the confidence level.
Optionally, controlling the household appliance according to the identified control instruction includes: outputting the identified control instruction; receiving a voice instruction for controlling the household appliance; and controlling the household appliance according to the identified control instruction when the identified control instruction has a higher priority than the voice instruction.
In the above embodiment, when the household appliance is controlled according to the control instruction output by the recognition model, the user's voice may also be taken into account. Specifically, voice control can be used to assist the recognized control instruction: after a voice instruction for controlling the household appliance is received, the priorities of the voice instruction and the recognized control instruction are compared. If the recognized control instruction has the higher priority, the household appliance is controlled according to the recognized control instruction; if the voice instruction has the higher priority, the household appliance is controlled according to the voice instruction.
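The arbitration between the recognized control instruction and the voice instruction might look like the sketch below; the numeric priority values are assumptions made for the example, since the patent only requires that the two priorities be comparable.

```python
# Assumed priorities; the patent only requires that they be comparable.
INSTRUCTION_PRIORITY = {"image": 2, "voice": 1}

def arbitrate(image_instruction, voice_instruction):
    """Pick the instruction to execute when both an image-recognized and a
    voice instruction are available for the same household appliance."""
    if image_instruction is None:
        return voice_instruction
    if voice_instruction is None:
        return image_instruction
    if INSTRUCTION_PRIORITY["image"] >= INSTRUCTION_PRIORITY["voice"]:
        return image_instruction     # the recognized instruction outranks the voice one
    return voice_instruction

print(arbitrate("turn_on", "turn_off"))   # -> 'turn_on' under the assumed priorities
```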
Fig. 2 is a schematic diagram of a home appliance control device according to an embodiment of the present invention, and as shown in fig. 2, the home appliance control device 20 includes: a first acquisition module 22, an identification module 24 and a control module 26. The home appliance control device 20 will be described in detail below.
A first acquiring module 22, configured to acquire an image of a target controlling the home appliance; the identifying module 24 is connected to the first obtaining module 22, and is configured to identify a control instruction for controlling the home appliance in the image by using an instruction identifying model, where the instruction identifying model is obtained by using multiple sets of data through machine learning training, and each set of data in the multiple sets of data includes: the image and a control instruction in the image for controlling the household appliances; the control module 26 is connected to the identification module 24, and is configured to control the home appliance according to the identified control command.
Optionally, the home appliance control device 20 further includes: the second acquisition module is used for acquiring the instruction identification model in the following way, and the second acquisition module comprises: a first determining unit configured to determine a basic model algorithm of the instruction recognition model; the first setting unit is used for setting a plurality of convolution layers and a screening layer in the basic model algorithm, wherein the convolution layers are used for acquiring a plurality of identification results for identifying the image, and the screening layer is used for screening the plurality of identification results to obtain screening results.
Optionally, the second obtaining module further includes: and the second setting unit is used for further setting an oscillation suppression layer in the basic model algorithm, wherein the oscillation suppression layer is used for suppressing oscillation in a regression process by adopting a logarithmic suppression method, and the regression process is a process of obtaining the identified control instruction by carrying out regression on the screening result.
Optionally, the first obtaining module 22 includes: an acquisition unit for acquiring a plurality of images of the target for controlling the household appliance; a second determining unit configured to determine features in the plurality of images; and a third determining unit configured to determine an image containing a feature whose feature confidence is higher than a predetermined value as the image used to control the household appliance, wherein the feature confidence indicates how obviously the feature expresses control of the household appliance.
According to another aspect of an embodiment of the present invention, there is provided a storage medium, wherein the storage medium includes a stored program, and wherein, when the program runs, the device on which the storage medium is located is controlled to execute any one of the above household appliance control methods.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present invention, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology content may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of units may be a logic function division, and there may be another division manner in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods of the various embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing program code.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make modifications and adaptations without departing from the principles of the present invention, and such modifications and adaptations are also intended to fall within the scope of the present invention.
Claims (3)
1. A home appliance control method, comprising:
acquiring an image of a target for controlling the household appliance;
identifying a control instruction for controlling the household appliance in the image by using an instruction recognition model, wherein the instruction recognition model is obtained through machine learning training on a plurality of groups of data, and each group of data in the plurality of groups of data comprises: the image and a control instruction in the image for controlling the household appliance; one or more control instructions for controlling the household appliance may be present in the image; when the plurality of control instructions are the same control instruction, any one of them is executed; when the control instructions belong to control instructions of different actions, each of them is executed; and when contradictory control instructions exist among the plurality of control instructions, the control instruction with the highest priority is executed and the control instruction contradicting the control instruction with the highest priority is discarded;
controlling the household appliance according to the identified control instruction;
before the control instruction for controlling the household appliance in the image is identified by using the instruction identification model, the method further comprises the following steps: the instruction recognition model is obtained by adopting the following modes: determining a basic model algorithm of the instruction recognition model; setting a plurality of convolution layers and a screening layer in a basic model algorithm, wherein the convolution layers are used for acquiring a plurality of identification results for identifying images, and the screening layer is used for screening the plurality of identification results to obtain screening results;
acquiring the instruction recognition model further comprises: the basic model algorithm is also provided with an oscillation suppression layer, wherein the oscillation suppression layer is used for suppressing oscillation in a regression process by adopting a logarithmic suppression method, and the regression process is a process of regressing a screening result to obtain an identified control instruction;
acquiring the image of the target for controlling the household appliance comprises: acquiring a plurality of images of the target for controlling the household appliance; determining features in the plurality of images; and determining an image containing a feature whose feature confidence is higher than a predetermined value as the image for controlling the household appliance, wherein the feature confidence indicates how obviously the feature expresses control of the household appliance;
controlling the household appliance according to the identified control instruction comprises: outputting the identified control instruction; receiving a voice instruction for controlling the household appliance; and controlling the household appliance according to the identified control instruction when the identified control instruction has a higher priority than the voice instruction.
2. A home appliance control device, comprising:
the first acquisition module is used for acquiring an image of a target for controlling the household appliance;
the identification module is used for identifying a control instruction for controlling the household appliance in the image by using an instruction recognition model, wherein the instruction recognition model is obtained through machine learning training on a plurality of groups of data, and each group of data in the plurality of groups of data comprises: the image and a control instruction in the image for controlling the household appliance; one or more control instructions for controlling the household appliance may be present in the image; when the plurality of control instructions are the same control instruction, any one of them is executed; when the control instructions belong to control instructions of different actions, each of them is executed; and when contradictory control instructions exist among the plurality of control instructions, the control instruction with the highest priority is executed and the control instruction contradicting the control instruction with the highest priority is discarded;
the control module is used for controlling the household appliances according to the identified control instructions;
the apparatus further comprises: the second acquisition module is used for acquiring the instruction identification model in the following way, and the second acquisition module comprises: a first determining unit configured to determine a basic model algorithm of the instruction recognition model; the first setting unit is used for setting a plurality of convolution layers and a screening layer in the basic model algorithm, wherein the convolution layers are used for acquiring a plurality of identification results for identifying the image, and the screening layer is used for screening the plurality of identification results to obtain screening results;
the second acquisition module further includes: the second setting unit is used for setting an oscillation suppression layer in the basic model algorithm, wherein the oscillation suppression layer is used for suppressing oscillation in a regression process by adopting a logarithmic suppression method, and the regression process is a process of regressing a screening result to obtain an identified control instruction;
the first acquisition module includes: an acquisition unit for acquiring a plurality of images of the target for controlling the household appliance; a second determining unit configured to determine features in the plurality of images; and a third determining unit configured to determine an image containing a feature whose feature confidence is higher than a predetermined value as the image for controlling the household appliance, wherein the feature confidence indicates how obviously the feature expresses control of the household appliance;
controlling the household appliance according to the identified control instruction comprises: outputting the identified control instruction; receiving a voice instruction for controlling the household appliance; and controlling the household appliance according to the identified control instruction when the identified control instruction has a higher priority than the voice instruction.
3. A storage medium comprising a stored program, wherein the program, when run, controls a device in which the storage medium resides to perform the home appliance control method of claim 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810918051.0A CN110825217B (en) | 2018-08-13 | 2018-08-13 | Household appliance control method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110825217A CN110825217A (en) | 2020-02-21 |
CN110825217B true CN110825217B (en) | 2023-07-11 |
Family
ID=69547116
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810918051.0A Active CN110825217B (en) | 2018-08-13 | 2018-08-13 | Household appliance control method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110825217B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016045579A1 (en) * | 2014-09-22 | 2016-03-31 | 努比亚技术有限公司 | Application interaction control method and apparatus, and terminal |
CN106249889A (en) * | 2016-07-28 | 2016-12-21 | 北京奇虎科技有限公司 | Infrared gesture recognition system and control method thereof and wearable device, identification equipment |
CN107576022A (en) * | 2017-09-12 | 2018-01-12 | 广东美的制冷设备有限公司 | Control method, air conditioner and the storage medium of air conditioner |
CN108052199A (en) * | 2017-10-30 | 2018-05-18 | 珠海格力电器股份有限公司 | Control method and device of range hood and range hood |
CN108305623A (en) * | 2018-01-15 | 2018-07-20 | 珠海格力电器股份有限公司 | electric appliance control method and device |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6937742B2 (en) * | 2001-09-28 | 2005-08-30 | Bellsouth Intellectual Property Corporation | Gesture activated home appliance |
US8428368B2 (en) * | 2009-07-31 | 2013-04-23 | Echostar Technologies L.L.C. | Systems and methods for hand gesture control of an electronic device |
CN102831404B (en) * | 2012-08-15 | 2016-01-13 | 深圳先进技术研究院 | Gesture detecting method and system |
CN103631210B (en) * | 2012-08-28 | 2017-02-01 | 海尔集团公司 | Human-computer interaction method and system in smart home |
US9524142B2 (en) * | 2014-03-25 | 2016-12-20 | Honeywell International Inc. | System and method for providing, gesture control of audio information |
CN106328143A (en) * | 2015-06-23 | 2017-01-11 | 中兴通讯股份有限公司 | Voice control method and device and mobile terminal |
CN105353634B (en) * | 2015-11-30 | 2018-05-08 | 北京地平线机器人技术研发有限公司 | Utilize the home appliance and method of gesture identification control operation |
FR3049078B1 (en) * | 2016-03-21 | 2019-11-29 | Valeo Vision | VOICE AND / OR GESTUAL RECOGNITION CONTROL DEVICE AND METHOD FOR INTERIOR LIGHTING OF A VEHICLE |
CN106440192B (en) * | 2016-09-19 | 2019-04-09 | 珠海格力电器股份有限公司 | Household appliance control method, device and system and intelligent air conditioner |
CN108229515A (en) * | 2016-12-29 | 2018-06-29 | 北京市商汤科技开发有限公司 | Object classification method and device, the electronic equipment of high spectrum image |
CN107229904B (en) * | 2017-04-24 | 2020-11-24 | 东北大学 | Target detection and identification method based on deep learning |
CN107942700A (en) * | 2017-12-15 | 2018-04-20 | 广东工业大学 | A kind of appliance control system, method and computer-readable recording medium |
- 2018-08-13: CN application CN201810918051.0A filed; patent CN110825217B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN110825217A (en) | 2020-02-21 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |