CN112350908B - Control method and device of intelligent household equipment - Google Patents

Control method and device of intelligent household equipment

Info

Publication number
CN112350908B
CN112350908B (application CN202011249355.6A)
Authority
CN
China
Prior art keywords
control
intention
intelligent household
household equipment
type
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011249355.6A
Other languages
Chinese (zh)
Other versions
CN112350908A (en)
Inventor
黄姿荣
戴林
李禹慧
贾巨涛
吴伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai, Zhuhai Lianyun Technology Co Ltd filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN202011249355.6A priority Critical patent/CN112350908B/en
Publication of CN112350908A publication Critical patent/CN112350908A/en
Application granted granted Critical
Publication of CN112350908B publication Critical patent/CN112350908B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803 Home automation networks
    • H04L12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • H04L12/282 Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/08 Speech classification or search
    • G10L15/18 Speech classification or search using natural language modelling
    • G10L15/1822 Parsing for meaning understanding
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/26 Speech to text systems
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/26 Pc applications
    • G05B2219/2642 Domotique, domestic, home control, automation, smart house

Abstract

The invention provides a control method and device for smart home devices and belongs to the technical field of smart homes. The method includes: acquiring control audio for the smart home device; parsing the control audio to determine a control intention for the smart home device; and determining a control mode for the smart home device based on the type of the control intention, where the control mode includes voice interaction control and touch interaction control. The application helps the voice interaction control and the touch interaction control of the smart home device adapt to each other, thereby improving the user's experience of controlling the smart home device.

Description

Control method and device of intelligent household equipment
Technical Field
The application belongs to the technical field of smart homes, and in particular relates to a control method and device for smart home devices, a computer device, and a computer-readable storage medium.
Background
Voice interaction control is currently developing rapidly in smart homes. It offers spatial freedom, is intuitive, provides an emotional user experience, and lowers the user's learning cost. However, the accuracy of current voice interaction control cannot yet meet users' control requirements; in many usage scenarios voice interaction control is not suitable, and it cannot output multiple pieces of content at the same time, so touch interaction control is still needed in most smart home control scenarios. For these reasons, the visual and the auditory need to complement each other, that is, voice interaction and touch interaction should adapt to each other. However, in the related art, voice interaction control and touch interaction control of smart home devices are independent of each other, and voice interaction control lacks system visibility, which degrades the user's experience of controlling the smart home device.
Disclosure of Invention
To overcome the problems in the related art at least to some extent, the application provides a control method and device for smart home devices, a computer device, and a computer-readable storage medium, which help the voice interaction control and the touch interaction control of the smart home device adapt to each other, thereby improving the user's experience of controlling the smart home device.
To achieve this purpose, the application adopts the following technical solutions:
In a first aspect,
the application provides a control method for a smart home device, comprising the following steps:
acquiring control audio for the smart home device;
parsing the control audio to determine a control intention for the smart home device;
determining a control mode for the smart home device based on the type of the control intention, wherein the control mode includes voice interaction control and touch interaction control.
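Purely as an illustration of how the three steps above fit together, a minimal sketch follows; the helper callables (acquire_control_audio, parse_intent, select_control_mode) are hypothetical stand-ins and not part of the claimed method.

```python
# Minimal end-to-end sketch of the three-step method; the helpers for audio
# capture, intent parsing, and mode selection are hypothetical stand-ins.
def control_smart_home_device(device, acquire_control_audio, parse_intent,
                              select_control_mode):
    audio = acquire_control_audio(device)    # step 1: acquire control audio
    intent = parse_intent(audio)             # step 2: determine the control intention
    mode = select_control_mode(intent.type)  # step 3: pick voice or touch control
    return mode
```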
Further, determining the control mode for the smart home device based on the type of the control intention includes:
if the type of the control intention is a multi-parameter task intention, performing touch interaction control on the smart home device, wherein the multi-parameter task intention indicates that the smart home device needs to satisfy multiple parameters when executing a user instruction;
if the type of the control intention is an ambiguous-parameter intention, performing touch interaction control on the smart home device, wherein the ambiguous-parameter intention indicates that the user instruction contains parameters that do not exist on the smart home device;
and if the type of the control intention is an intention with a small amount of output information, performing voice interaction control on the smart home device, wherein such an intention indicates that the smart home device does not need to feed back information, or that the amount of information to be fed back is below a predetermined threshold, when executing the user instruction.
Further, after determining to perform touch interaction control on the smart home device based on the type of the control intention, the method further includes:
during touch interaction control of the smart home device, if the dwell time of a parameter setting interface of the smart home device exceeds a preset time, controlling the smart home device to switch to voice interaction control.
Further, parsing the control audio and determining the control intention for the smart home device includes:
performing speech recognition on the control audio to obtain a control text;
performing word segmentation on the control text to obtain a segmented text;
and determining the type of the control intention corresponding to the segmented text using a pre-trained classification model, wherein the pre-trained classification model indicates the mapping between segmented texts and control-intention types.
In a second aspect,
the application provides a control device for a smart home device, comprising:
an acquiring unit, configured to acquire control audio for the smart home device;
a first determining unit, configured to parse the control audio and determine a control intention for the smart home device;
and a second determining unit, configured to determine a control mode for the smart home device based on the type of the control intention, wherein the control mode includes voice interaction control and touch interaction control.
Further, the second determining unit includes:
a first determining module, configured to perform touch interaction control on the smart home device if the type of the control intention is a multi-parameter task intention, wherein the multi-parameter task intention indicates that the smart home device needs to satisfy multiple parameters when executing a user instruction;
a second determining module, configured to perform touch interaction control on the smart home device if the type of the control intention is an ambiguous-parameter intention, wherein the ambiguous-parameter intention indicates that the user instruction contains parameters that do not exist on the smart home device;
and a third determining module, configured to perform voice interaction control on the smart home device if the type of the control intention is an intention with a small amount of output information, wherein such an intention indicates that the smart home device does not need to feed back information, or that the amount of information to be fed back is below a predetermined threshold, when executing the user instruction.
further, the apparatus further comprises:
and the switching unit is used for controlling the intelligent household equipment to be switched to voice interaction control if the retention time of a parameter setting interface of the intelligent household equipment exceeds preset time in the process of performing touch interaction control on the intelligent household equipment after determining to perform touch interaction control on the intelligent household equipment based on the type of the control intention.
Further, the first determining unit includes:
a recognition module, configured to perform speech recognition on the control audio to obtain a control text;
a word segmentation module, configured to segment the control text to obtain a segmented text;
and a fourth determining module, configured to determine the type of the control intention corresponding to the segmented text using a pre-trained classification model, wherein the pre-trained classification model indicates the mapping between segmented texts and control-intention types.
In a third aspect,
the application provides a computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the control method for a smart home device according to the first aspect.
In a fourth aspect,
an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the control method for a smart home device according to the first aspect.
The application adopts the above technical solutions and has at least the following beneficial effects:
control audio for the smart home device is acquired; the control audio is parsed to determine a control intention for the smart home device; and a control mode for the smart home device is determined based on the type of the control intention, the control mode including voice interaction control and touch interaction control. Compared with the related art, in which the voice interaction control and the touch interaction control of smart home devices are independent of each other and lack system visibility, this allows the voice interaction control and the touch interaction control of the smart home device to adapt to each other, thereby improving the user's experience of controlling the smart home device.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be derived from them by those skilled in the art without creative effort.
Fig. 1 is a flowchart illustrating a control method of a smart home device according to an exemplary embodiment;
fig. 2 is a schematic diagram illustrating a control method of smart home devices according to a preferred embodiment;
fig. 3 is a schematic diagram of a control device of a smart home device according to an exemplary embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail below. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the examples given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 is a flowchart illustrating a control method for a smart home device according to an exemplary embodiment. As shown in Fig. 1, the method includes the following steps:
Step S101: acquiring control audio for the smart home device.
In one embodiment, after the control audio for the smart home device is input, the smart home device uploads the control audio to a cloud server. After receiving the control audio, the cloud server parses it to obtain a control intention, and then determines the control mode for the smart home device according to the type of that intention. An illustrative sketch of this upload step follows.
Step S102: parsing the control audio to determine a control intention for the smart home device.
In one embodiment, parsing the control audio and determining the control intention for the smart home device may include:
performing speech recognition on the control audio to obtain a control text;
performing word segmentation on the control text to obtain a segmented text;
and determining the type of the control intention corresponding to the segmented text using a pre-trained classification model, wherein the pre-trained classification model indicates the mapping between segmented texts and control-intention types.
Identifying the control intention corresponding to the control audio with a pre-trained classification model makes it possible to determine the intention quickly and accurately; a sketch of such a pipeline is shown below.
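The following sketch illustrates one such pipeline under stated assumptions: jieba stands in for the word-segmentation step and a scikit-learn text classifier for the pre-trained classification model; the patent names ASR and a classification model but no particular libraries, and speech_to_text is a placeholder for any ASR service.

```python
# A minimal sketch of the parsing pipeline, assuming jieba for Chinese word
# segmentation and a scikit-learn text classifier as the classification model.
import jieba
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def segment(control_text: str) -> str:
    """Word-segment the control text into space-separated tokens."""
    return " ".join(jieba.lcut(control_text))

# Toy training data standing in for the real labelled corpus of segmented texts.
train_texts = [segment(t) for t in ["空调调到26度定时两小时制冷模式",  # multi-parameter task
                                    "帮我弄得舒服一点",              # ambiguous parameters
                                    "打开灯"]]                       # little output information
train_labels = ["multi_parameter_task", "ambiguous_parameter", "low_output_info"]

# token_pattern keeps single-character Chinese tokens such as "灯".
classifier = make_pipeline(TfidfVectorizer(token_pattern=r"(?u)\b\w+\b"),
                           LogisticRegression(max_iter=1000))
classifier.fit(train_texts, train_labels)

def determine_intent_type(control_audio: bytes, speech_to_text) -> str:
    control_text = speech_to_text(control_audio)   # speech recognition -> control text
    segmented = segment(control_text)              # word segmentation -> segmented text
    return classifier.predict([segmented])[0]      # classification -> intent type
```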
Step S103: determining a control mode for the smart home device based on the type of the control intention, wherein the control mode includes voice interaction control and touch interaction control.
In one embodiment, determining the control mode for the smart home device based on the type of the control intention includes:
if the type of the control intention is a multi-parameter task intention, performing touch interaction control on the smart home device, wherein the multi-parameter task intention indicates that the smart home device needs to satisfy multiple parameters when executing a user instruction;
if the type of the control intention is an ambiguous-parameter intention, performing touch interaction control on the smart home device, wherein the ambiguous-parameter intention indicates that the user instruction contains parameters that do not exist on the smart home device;
and if the type of the control intention is an intention with a small amount of output information, performing voice interaction control on the smart home device, wherein such an intention indicates that the smart home device does not need to feed back information, or that the amount of information to be fed back is below a predetermined threshold, when executing the user instruction. The predetermined threshold may be set or adjusted according to actual requirements and is not specifically limited here.
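A compact sketch of this dispatch rule is shown below; the enum names are illustrative labels for the three intention types and two control modes described above, not terms defined by the patent.

```python
from enum import Enum, auto

class IntentType(Enum):
    MULTI_PARAMETER_TASK = auto()   # instruction needs several parameters satisfied
    AMBIGUOUS_PARAMETER = auto()    # instruction carries parameters the device lacks
    LOW_OUTPUT_INFO = auto()        # little or no feedback needs to be returned

class ControlMode(Enum):
    TOUCH_INTERACTION = auto()
    VOICE_INTERACTION = auto()

def select_control_mode(intent_type: IntentType) -> ControlMode:
    """Map the classified intention type to a control mode, per the rules above."""
    if intent_type in (IntentType.MULTI_PARAMETER_TASK, IntentType.AMBIGUOUS_PARAMETER):
        return ControlMode.TOUCH_INTERACTION
    return ControlMode.VOICE_INTERACTION
```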
In one embodiment, after determining to perform touch interaction control on the smart home device based on the type of the control intention, the method further includes:
during touch interaction control of the smart home device, if the dwell time of a parameter setting interface of the smart home device exceeds a preset time, controlling the smart home device to switch to voice interaction control. The preset time may be set or adjusted according to actual requirements and is not specifically limited here. A device-side sketch of this switch is given below.
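The sketch below illustrates one way a device-side timer could drive this switch, assuming a switch_to_voice callback and the 2-minute dwell time mentioned in the preferred embodiment; none of these names come from the patent itself.

```python
import threading

DWELL_TIMEOUT_S = 120  # the preferred embodiment uses 2 minutes; adjustable

class ParameterScreen:
    """Tracks how long the touch parameter-setting interface has sat idle."""

    def __init__(self, switch_to_voice):
        self._switch_to_voice = switch_to_voice  # callback: hand control back to voice
        self._timer = None

    def on_opened_or_touched(self):
        # Restart the dwell timer whenever the screen opens or the user interacts.
        if self._timer:
            self._timer.cancel()
        self._timer = threading.Timer(DWELL_TIMEOUT_S, self._switch_to_voice)
        self._timer.daemon = True
        self._timer.start()

    def on_closed(self):
        if self._timer:
            self._timer.cancel()
```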
In summary, control audio for the smart home device is acquired; the control audio is parsed to determine a control intention for the smart home device; and a control mode for the smart home device is determined based on the type of the control intention, the control mode including voice interaction control and touch interaction control. Compared with the related art, in which the voice interaction control and the touch interaction control of smart home devices are independent of each other and lack system visibility, this allows the voice interaction control and the touch interaction control of the smart home device to adapt to each other, thereby improving the user's experience of controlling the smart home device.
The embodiments of the present application are described and illustrated below by means of preferred embodiments.
Fig. 2 is a schematic diagram illustrating a control method for smart home devices according to a preferred embodiment. As shown in Fig. 2,
this embodiment can be applied to smart home devices with screen interaction, and the control mode of the smart home device is decided by a cloud server. The cloud server classifies the control intention of the input control audio into one of three types: a multi-parameter task intention, an ambiguous-parameter intention, and an intention with a small amount of output information. A multi-parameter task intention means that the user instruction can only be executed when multiple (more than three) parameters are satisfied, such as a temperature parameter, a time parameter, and a mode parameter; an ambiguous-parameter intention means that the user instruction contains no parameters, or only relatively abstract ones; an intention with a small amount of output information means that little feedback is required from the smart home device, as with directly controllable instructions such as turning on a light or turning off an air conditioner. The smart home device acquires the control audio input by the user and uploads it to the cloud server; ASR converts the control audio into text, and NLU segments the text and parses the intention. During parsing, slot filling (with three slot states) is performed on the segmented text, a three-way classification is carried out, and the control intention is analyzed synchronously. When a multi-parameter task intention or an ambiguous-parameter intention is recognized, an instruction is returned to the smart home device, which calls up its interface for interactive parameter filling or for displaying the feedback content, instead of synthesizing a large amount of content for voice broadcast. When an intention with a small amount of output information is recognized, it is directly converted into a control protocol and delivered to the smart home device for execution. The above is the state in which the touch interaction control adapts to the voice interaction control. A sketch of this cloud-side flow is given below.
In the process in which the voice interaction control adapts to the touch interaction control: while the user is performing touch interaction control, the smart home device displays the parameter setting interface; when the dwell time of that interface exceeds a preset time, for example 2 minutes, the device actively reports the display state of the current interface and the current state of the smart home device to the cloud server over Wi-Fi. The cloud server evaluates this information and determines that the current task should be completed through voice interaction control.
This embodiment solves the problem that voice interaction control and touch interaction control are independent of each other, and at the same time addresses the lack of system visibility in voice interaction control. For different user scenarios, voice interaction control and touch interaction control adapt to each other and are coupled in two dimensions, achieving embedded interaction, higher combined output efficiency, and a better user experience.
This embodiment also provides a control device for a smart home device, which is used to implement the above embodiments and preferred implementations; what has already been described is not repeated. As used below, the terms "module", "unit", "subunit", and the like may be a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the following embodiments is preferably implemented in software, an implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
Fig. 3 is a schematic diagram illustrating a control apparatus of a smart home device according to an exemplary embodiment, where as shown in fig. 3, the apparatus includes:
the acquiring unit 31 is configured to acquire a control audio for the smart home device;
the first determining unit 32 is configured to analyze the control audio and determine a control intention of the smart home device;
a second determining unit 33, configured to determine a control manner for the smart home device based on the type of the control intention, where the control manner includes voice interaction control and touch interaction control.
Further, the second determining unit 33 includes:
a first determining module, configured to perform touch interaction control on the smart home device if the type of the control intention is a multi-parameter task intention, wherein the multi-parameter task intention indicates that the smart home device needs to satisfy multiple parameters when executing a user instruction;
a second determining module, configured to perform touch interaction control on the smart home device if the type of the control intention is an ambiguous-parameter intention, wherein the ambiguous-parameter intention indicates that the user instruction contains parameters that do not exist on the smart home device;
and a third determining module, configured to perform voice interaction control on the smart home device if the type of the control intention is an intention with a small amount of output information, wherein such an intention indicates that the smart home device does not need to feed back information, or that the amount of information to be fed back is below a predetermined threshold, when executing the user instruction.
Further, the apparatus further comprises:
a switching unit, configured to, after it is determined based on the type of the control intention that touch interaction control is to be performed on the smart home device, control the smart home device to switch to voice interaction control if the dwell time of a parameter setting interface of the smart home device exceeds a preset time during the touch interaction control.
Further, the first determining unit 32 includes:
a recognition module, configured to perform speech recognition on the control audio to obtain a control text;
a word segmentation module, configured to segment the control text to obtain a segmented text;
and a fourth determining module, configured to determine the type of the control intention corresponding to the segmented text using a pre-trained classification model, wherein the pre-trained classification model indicates the mapping between segmented texts and control-intention types.
The above modules may be functional modules or program modules, and may be implemented by software or hardware. For a module implemented by hardware, the modules may be located in the same processor; or the modules can be respectively positioned in different processors in any combination.
The embodiment of the application further provides a computer device by which the above control method for a smart home device can be implemented. The computer device includes a memory, a processor, and a computer program stored on the memory and executable on the processor; when the processor executes the computer program, the control method for a smart home device of any of the above embodiments is implemented.
The embodiment of the application further provides a computer-readable storage medium having computer program instructions stored thereon; when executed by a processor, the computer program instructions implement the control method for a smart home device of any of the above embodiments.
It is understood that the same or similar parts of the above embodiments may refer to one another; for content not described in detail in one embodiment, reference may be made to the same or similar parts of other embodiments.
It should be noted that, in the description of the present application, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In addition, in the description of the present application, the meaning of "plurality" means at least two unless otherwise specified.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or intervening elements may also be present; when an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present, and further, as used herein, connected may include wirelessly connected; the term "and/or" is used to include any and all combinations of one or more of the associated listed items.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application also includes implementations in which functions are executed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (8)

1. A control method for a smart home device, characterized by comprising the following steps:
acquiring control audio for the smart home device;
parsing the control audio to determine the type of a control intention for the smart home device;
if the type of the control intention is a multi-parameter task intention, performing touch interaction control on the smart home device, including: displaying a plurality of parameters to be filled in on a parameter setting interface, wherein if the dwell time of the parameter setting interface of the smart home device exceeds a preset time, the state of the parameter setting interface is uploaded to a server and the smart home device is controlled to switch back to voice interaction control;
wherein the multi-parameter task intention indicates that the smart home device needs to satisfy multiple parameters when executing a user instruction.
2. The method of claim 1, further comprising:
if the type of the control intention is an ambiguous-parameter intention, performing touch interaction control on the smart home device, wherein the ambiguous-parameter intention indicates that the user instruction contains parameters that do not exist on the smart home device;
and if the type of the control intention is an intention with a small amount of output information, performing voice interaction control on the smart home device, wherein such an intention indicates that the smart home device does not need to feed back information, or that the amount of information to be fed back is below a predetermined threshold, when executing the user instruction.
3. The method according to any one of claims 1 to 2, wherein parsing the control audio and determining the control intention for the smart home device comprises:
performing speech recognition on the control audio to obtain a control text;
performing word segmentation on the control text to obtain a segmented text;
and determining the type of the control intention corresponding to the segmented text using a pre-trained classification model, wherein the pre-trained classification model indicates the mapping between segmented texts and control-intention types.
4. A control device for a smart home device, characterized by comprising:
an acquiring unit, configured to acquire control audio for the smart home device;
a first determining unit, configured to parse the control audio and determine the type of a control intention for the smart home device;
a second determining unit comprising a first determining module, configured to perform touch interaction control on the smart home device if the type of the control intention is a multi-parameter task intention, including: displaying a plurality of parameters to be filled in on a parameter setting interface, wherein if the dwell time of the parameter setting interface of the smart home device exceeds a preset time, the state of the parameter setting interface is uploaded to a server and the smart home device is controlled to switch back to voice interaction control; wherein the multi-parameter task intention indicates that the smart home device needs to satisfy multiple parameters when executing a user instruction.
5. The control device of claim 4, wherein the second determining unit further comprises:
a second determining module, configured to perform touch interaction control on the smart home device if the type of the control intention is an ambiguous-parameter intention, wherein the ambiguous-parameter intention indicates that the user instruction contains parameters that do not exist on the smart home device;
and a third determining module, configured to perform voice interaction control on the smart home device if the type of the control intention is an intention with a small amount of output information, wherein such an intention indicates that the smart home device does not need to feed back information, or that the amount of information to be fed back is below a predetermined threshold, when executing the user instruction.
6. The control device according to any one of claims 4 to 5, wherein the first determining unit comprises:
a recognition module, configured to perform speech recognition on the control audio to obtain a control text;
a word segmentation module, configured to segment the control text to obtain a segmented text;
and a fourth determining module, configured to determine the type of the control intention corresponding to the segmented text using a pre-trained classification model, wherein the pre-trained classification model indicates the mapping between segmented texts and control-intention types.
7. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 3 when executing the computer program.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 3.
CN202011249355.6A 2020-11-10 2020-11-10 Control method and device of intelligent household equipment Active CN112350908B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011249355.6A CN112350908B (en) 2020-11-10 2020-11-10 Control method and device of intelligent household equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011249355.6A CN112350908B (en) 2020-11-10 2020-11-10 Control method and device of intelligent household equipment

Publications (2)

Publication Number Publication Date
CN112350908A CN112350908A (en) 2021-02-09
CN112350908B true CN112350908B (en) 2021-11-23

Family

ID=74362465

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011249355.6A Active CN112350908B (en) 2020-11-10 2020-11-10 Control method and device of intelligent household equipment

Country Status (1)

Country Link
CN (1) CN112350908B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112581959B (en) * 2020-12-15 2023-05-09 四川虹美智能科技有限公司 Intelligent equipment control method, system and voice server

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102013254A (en) * 2010-11-17 2011-04-13 广东中大讯通信息有限公司 Man-machine interactive system and method for digital television voice recognition
CN102779398A (en) * 2012-07-09 2012-11-14 深圳市同洲电子股份有限公司 Control method, device and system for smart appliance
CN104138665A (en) * 2014-05-21 2014-11-12 腾讯科技(深圳)有限公司 Doll control method and doll
CN108600796A (en) * 2018-03-09 2018-09-28 百度在线网络技术(北京)有限公司 Control mode switch method, equipment and the computer-readable medium of smart television
CN110334110A (en) * 2019-05-28 2019-10-15 平安科技(深圳)有限公司 Natural language classification method, device, computer equipment and storage medium
CN111681653A (en) * 2020-04-28 2020-09-18 平安科技(深圳)有限公司 Call control method, device, computer equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002247183A (en) * 2001-02-16 2002-08-30 Denso Corp Mobile phone and program
CN107122404A (en) * 2017-03-22 2017-09-01 北京晓数聚传媒科技有限公司 A kind of user view data extracting method and device
CN110099173B (en) * 2019-04-30 2021-11-02 努比亚技术有限公司 Touch experience mode switching method, terminal and storage medium
CN110618613A (en) * 2019-09-03 2019-12-27 珠海格力电器股份有限公司 Linkage control method and device for intelligent equipment
CN111443803B (en) * 2020-03-26 2023-10-03 捷开通讯(深圳)有限公司 Mode switching method and device, storage medium and mobile terminal


Also Published As

Publication number Publication date
CN112350908A (en) 2021-02-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant