CN110738044B - Control intention recognition method and device, electronic equipment and storage medium - Google Patents

Control intention recognition method and device, electronic equipment and storage medium

Info

Publication number
CN110738044B
CN110738044B (application CN201910989897.8A)
Authority
CN
China
Prior art keywords
controlled
equipment
attribute
information
mapping information
Prior art date
Legal status
Active
Application number
CN201910989897.8A
Other languages
Chinese (zh)
Other versions
CN110738044A (en)
Inventor
牛迪
段耀峰
Current Assignee
Hangzhou Tuya Information Technology Co Ltd
Original Assignee
Hangzhou Tuya Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Tuya Information Technology Co Ltd filed Critical Hangzhou Tuya Information Technology Co Ltd
Priority to CN201910989897.8A priority Critical patent/CN110738044B/en
Publication of CN110738044A publication Critical patent/CN110738044A/en
Application granted granted Critical
Publication of CN110738044B publication Critical patent/CN110738044B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33Querying
    • G06F16/332Query formulation
    • G06F16/3329Natural language query formulation or dialogue systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33Querying
    • G06F16/3331Query processing
    • G06F16/334Query execution
    • G06F16/3344Query execution using natural language analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/36Creation of semantic tools, e.g. ontology or thesauri
    • G06F16/367Ontology
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Human Computer Interaction (AREA)
  • Selective Calling Equipment (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a control intention recognition method and apparatus, an electronic device, and a storage medium. The method includes: acquiring a voice instruction; identifying semantic information in the voice instruction, the semantic information including a controlled device and a controlled attribute value for the controlled device; obtaining, based on a configured device ontology library, an attribute category matching the controlled attribute value in mapping information of the controlled device, the device ontology library including mapping information of a plurality of devices; and determining a control intention for controlling the controlled device based on the obtained attribute category and the controlled attribute value. Embodiments of the disclosure can accurately determine the control intention for a device.

Description

Control intention recognition method and device, electronic equipment and storage medium
Technical Field
The disclosure relates to the technical field of electronic device control, and in particular to a control intention recognition method and apparatus, an electronic device, and a storage medium.
Background
At present, intention recognition in human-machine dialogue systems usually requires manual annotation of the corpus, with utterances hard-bound to intentions by hand. This way of processing intent can still meet user needs in general domains, but it lacks flexibility in the home Internet of Things domain.
This is because the intent changes as the properties of the device change. For example, in the home Internet of Things, the phrase "adjust the heater to 20 degrees" has two possible intents: setting the angle of the heater, or setting its temperature. Which one is meant depends entirely on which adjustable properties the heater has, such as temperature or angle, and heaters from different manufacturers and of different models provide different adjustable properties. Hard-binding intents to the corpus in advance cannot solve these problems, because the annotated data cannot anticipate which attributes a connected home device will have, nor which of its attribute values may again be expressed in units of "degrees".
Disclosure of Invention
The disclosure provides a technical solution for control intention recognition.
According to an aspect of the present disclosure, there is provided a control intention recognition method including:
acquiring a voice instruction;
identifying semantic information in the voice instruction, wherein the semantic information includes a controlled device and a controlled attribute value for the controlled device;
obtaining, based on a configured device ontology library, an attribute category matching the controlled attribute value in mapping information of the controlled device, wherein the device ontology library includes mapping information of a plurality of devices;
and determining a control intention for controlling the controlled device based on the obtained attribute category and the controlled attribute value.
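Viewed as data flow, the four steps above can be sketched in a few lines of Python (a hypothetical illustration only; the disclosure does not prescribe an implementation, and the device names, attribute categories, and value ranges below are invented for the example):

```python
# Hypothetical device ontology library: device name -> mapping information,
# i.e. attribute category -> configurable attribute values.
DEVICE_ONTOLOGY = {
    "air conditioner": {
        "temperature": set(range(0, 36)),        # 0-35 degrees
        "mode": {"cooling", "dehumidification"},
    },
}

def recognize_intent(controlled_device, controlled_value):
    """Return (device, attribute category, value) as the control intent."""
    mapping = DEVICE_ONTOLOGY.get(controlled_device)
    if mapping is None:
        return None
    for category, values in mapping.items():
        if controlled_value in values:           # value falls in this range
            return (controlled_device, category, controlled_value)
    return None
```

For the instruction "set the air conditioner temperature to 20 degrees", `recognize_intent("air conditioner", 20)` resolves the value 20 to the `temperature` category without any hard-coded corpus binding.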
In some possible embodiments, the obtaining, based on the configured device ontology library, an attribute category matching the controlled attribute value in mapping information of the controlled device includes:
querying, based on the identified controlled device, mapping information matching the controlled device in the device ontology library;
querying the controlled attribute value within the attribute ranges of the mapping information matching the controlled device;
and obtaining the attribute category matching the controlled attribute value based on the queried controlled attribute value.
In some possible implementations, the querying, based on the identified controlled device, mapping information matching the controlled device in the device ontology library includes:
requesting a graph data structure of the controlled device based on the identified controlled device;
and determining the mapping information of the controlled device based on the obtained graph data structure.
In some possible embodiments, the method further comprises a step of configuring the device ontology library, including:
receiving device information and mapping information of at least one electronic device, wherein the mapping information includes attribute categories that can be controlled on the electronic device and attribute values corresponding to the attribute categories;
forming a graph data structure of the electronic device based on the device information and mapping information of the electronic device, wherein the graph data structure includes the device information and the mapping information;
and storing the graph data structure of the electronic device to form the device ontology library.
In some possible embodiments, the identifying semantic information in the voice instruction includes:
performing natural language processing on the voice instruction to determine the semantic information in the voice instruction.
In some possible embodiments, the method further comprises:
in response to obtaining at least two attribute categories matching the controlled attribute value from the device ontology library, outputting prompt information;
and receiving selection information, and determining the attribute category matching the controlled attribute value based on the selection information.
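This disambiguation branch can be sketched as follows (hypothetical; the `choose` callback stands in for outputting the prompt information and receiving the selection information, and the heater mapping reuses the ambiguous "20 degrees" example from the background):

```python
def match_categories(mapping, value):
    """Collect every attribute category whose range contains the value."""
    return [cat for cat, values in mapping.items() if value in values]

def resolve_category(mapping, value, choose):
    """If two or more categories match, prompt for a selection via `choose`."""
    candidates = match_categories(mapping, value)
    if len(candidates) > 1:
        return choose(candidates)   # output prompt, receive selection info
    return candidates[0] if candidates else None

# Illustrative heater mapping: "20 degrees" matches two categories.
heater = {"temperature": range(16, 31), "angle": range(0, 91)}
```

With this mapping, the value 20 matches both `temperature` and `angle`, so the user is prompted; a value such as 85 matches only `angle` and is resolved directly.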
In some possible embodiments, the method further comprises:
and controlling the controlled equipment according to the control intention.
According to a second aspect of the present disclosure, there is provided a control intention recognition apparatus including:
an acquisition module configured to acquire a voice instruction;
a recognition module configured to identify semantic information in the voice instruction, wherein the semantic information includes a controlled device and a controlled attribute value;
a matching module configured to obtain, based on a device ontology library, an attribute category matching the controlled attribute value in mapping information of the controlled device, wherein the device ontology library includes mapping information of a plurality of devices;
and a determining module configured to determine a control intention for the controlled device based on the obtained attribute category and the controlled attribute value.
According to a third aspect of the present disclosure, there is provided an electronic device comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to invoke the instructions stored in the memory to perform the method of any of the first aspects.
According to a fourth aspect of the present disclosure, there is provided a computer-readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the method of any one of the first aspects.
In the embodiments of the disclosure, each electronic device (such as various home devices) or server in the Internet of Things can determine, according to a received voice instruction, the attribute category matching the controlled attribute value in the voice instruction, thereby determining the intent to apply the controlled attribute value within that attribute category. By configuring the device ontology library, the embodiments of the disclosure can uniformly manage the mapping information of a plurality of electronic devices, so that the attribute category of each electronic device matching the controlled attribute value can be conveniently obtained, and the intent can thus be accurately determined according to the received voice instruction.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the technical aspects of the disclosure.
FIG. 1 illustrates a flow chart of a control intent recognition method in accordance with an embodiment of the present disclosure;
FIG. 2 illustrates a flow chart for configuring a device ontology library in accordance with an embodiment of the present disclosure;
FIG. 3 shows a schematic diagram of a graph data structure, according to an embodiment of the present disclosure;
FIG. 4 shows a flowchart of step S30, according to an embodiment of the present disclosure;
FIG. 5 illustrates a schematic diagram of determining attribute categories in accordance with an embodiment of the present disclosure;
FIG. 6 illustrates a block diagram of a control intent recognition device in accordance with an embodiment of the present disclosure;
fig. 7 illustrates a block diagram of an electronic device 800, according to an embodiment of the disclosure;
fig. 8 illustrates a block diagram of another electronic device 1900 in accordance with an embodiment of the disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the disclosure will be described in detail below with reference to the drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may represent: A exists alone, A and B exist together, or B exists alone. In addition, the term "at least one" herein means any one of a plurality, or any combination of at least two of a plurality; for example, including at least one of A, B, and C may mean including any one or more elements selected from the group consisting of A, B, and C.
Furthermore, numerous specific details are set forth in the following detailed description in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements, and circuits well known to those skilled in the art have not been described in detail in order not to obscure the present disclosure.
The embodiments of the disclosure provide a control intention recognition method whose execution subject can be any electronic device or server; for example, the method can be executed by a terminal device or a home device, such as a smart speaker, an air conditioner, a refrigerator, or a fan. Alternatively, the method can be executed by a server to determine the intent for, and control, the electronic devices in the Internet of Things. In some possible implementations, the control intention recognition method may be implemented by a processor invoking computer-readable instructions stored in a memory.
Fig. 1 shows a flowchart of a control intention recognition method according to an embodiment of the present disclosure. As shown in fig. 1, the method includes:
s10: acquiring a voice instruction;
In some possible implementations, the embodiments of the present disclosure may be applied in the field of the Internet of Things, which may include a plurality of electronic devices capable of performing control operations according to received voice information. Alternatively, a server can receive the voice instruction, determine the controlled intent of an electronic device, and control the corresponding electronic device. The electronic device or the server may have an internal voice receiving device, or may be connected to an external voice receiving device, so as to receive transmitted voice information. The following description takes the electronic device as the execution subject; embodiments with the server as the execution subject are analogous and will not be repeated.
In some possible embodiments, the user may send the voice instruction directly to the electronic device, or may send it through another device capable of outputting voice information, for example by operating a voice output device, and the electronic device receives the transmitted voice instruction. The voice instruction is an instruction for controlling one or more electronic devices in the Internet of Things to execute an operation. For example, the voice instruction may be an instruction for starting up, shutting down, or adjusting an operation parameter of the electronic device, which is not particularly limited in the present disclosure.
In some possible embodiments, each electronic device in the Internet of Things may have a first state, in which it can be controlled by voice, and a second state, in which voice control is disabled; this can, on the one hand, reduce unnecessary interactions between the user and the electronic device and, on the other hand, save power. Switching between the states can be achieved by inputting control instructions into the electronic device. When the electronic device is in the first state, the voice receiving device for receiving voice instructions is turned on; when the electronic device is in the second state, the voice receiving device is turned off and voice instructions are no longer received.
S20: identifying semantic information in the voice instruction, wherein the semantic information comprises controlled equipment and controlled attribute values aiming at the controlled equipment;
In some possible implementations, when a voice instruction is received, the semantic information in the voice instruction may be identified, for example by natural language processing. In one example, the voice instruction may first be converted into corresponding text information, and the text information is then logically analyzed and processed by natural language processing to obtain the semantic information. Alternatively, the voice instruction can be analyzed directly to obtain the semantic information. The embodiments of the disclosure can process the voice instruction through a convolutional neural network, which can be trained to accurately extract the semantic information in voice instructions. The manner in which the semantic information is obtained is not particularly limited by the present disclosure.
In some possible embodiments, the voice instruction may be voice information sent by the user, or voice information output by another device. For example, the voice instruction may be "set the air conditioner temperature to 20 degrees". The semantic information is obtained by analyzing the semantics of the voice instruction and may include a controlled device and a controlled attribute value: the controlled device is the device to be controlled (the air conditioner in this example), and the controlled attribute value is the value of the attribute to be controlled on that device (20 degrees in this example). The foregoing is merely exemplary, and different semantic information may be obtained for different voice instructions.
S30: obtaining, based on a configured device ontology library, an attribute category matching the controlled attribute value in mapping information of the controlled device, wherein the device ontology library includes mapping information of a plurality of devices;
In some possible implementations, the device ontology library may be a pre-configured database including the mapping information of a plurality of electronic devices. Each electronic device may store the device ontology library itself, or may be communicatively connected to the device ontology library (database) for data interaction. The device ontology library may store the device information and mapping information of each electronic device in the Internet of Things, where the device information may include the name of the electronic device or other information capable of determining its identity, such as a device number, which is not specifically limited in this disclosure. The mapping information of a device includes the attribute categories that can be configured on the electronic device and the attribute range (attribute values) that can be configured for each attribute category. For example, the electronic device may be an air conditioner; the device information may include the device name "air conditioner", or an identifier of the air conditioner (device A); the mapping information may include attribute categories such as temperature, intensity, and mode, where the temperature category may be configured with an attribute range of 0 to 35 degrees, the intensity category with the three values high, medium, and low, and the mode category with values such as cooling and dehumidification. The foregoing is merely exemplary of the attribute categories and attribute ranges of an electronic device, which are not specifically limited by the present disclosure.
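The air-conditioner example above can be written out as a single ontology record (a sketch; the identifier `device_a` and the exact value sets are illustrative, not prescribed by the disclosure):

```python
# Device information plus mapping information for "device A" (illustrative).
device_a = {
    "id": "device_a",
    "type": "air conditioner",
    "mapping": {
        "temperature": list(range(0, 36)),        # attribute range 0-35
        "intensity": ["high", "medium", "low"],
        "mode": ["cooling", "dehumidification"],
    },
}
```

The device information (`id`, `type`) identifies the device; the `mapping` entry pairs each attribute category with its configurable attribute range.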
In some possible embodiments, when the semantic information in the voice instruction is obtained, the electronic device may request, from the device ontology library, the attribute category matching the controlled attribute value in the semantic information. Each electronic device receiving the voice instruction can generate request information according to the obtained semantic information, and the request information can be used to obtain the attribute category matching the controlled attribute value from the device ontology library. The corresponding attribute category may then be obtained from the device ontology library.
S40: determining, based on the obtained attribute category and the controlled attribute value, a control intention for controlling the controlled device.
In some possible embodiments, when the attribute category corresponding to the controlled attribute value is obtained, the electronic device may determine the control intention of setting the attribute value of that attribute category of the controlled device to the controlled attribute value. Further, the controlled device may set the attribute value of the determined attribute category to the controlled attribute value, thereby completing the control of the controlled device.
Based on the above configuration, each electronic device in the Internet of Things according to the embodiments of the disclosure may determine, according to a received voice instruction, the attribute category matching the controlled attribute value in the voice instruction, thereby determining the intent to apply the controlled attribute value within that attribute category. By configuring the device ontology library, the embodiments of the disclosure can uniformly manage the mapping information of a plurality of electronic devices, so that the attribute category of each electronic device matching the controlled attribute value can be conveniently obtained, and the intent can thus be accurately determined according to the received voice instruction.
Embodiments of the present disclosure are described in detail below with reference to the accompanying drawings. First, an embodiment of the present disclosure may configure a device ontology library, which includes the device information and mapping information of each electronic device in the Internet of Things. Fig. 2 shows a flowchart of configuring a device ontology library according to an embodiment of the present disclosure, wherein configuring the device ontology library includes:
S101: receiving device information and mapping information of at least one electronic device, wherein the mapping information includes attribute categories that can be controlled on the electronic device and attribute values corresponding to the attribute categories;
In some possible embodiments, the device information of each electronic device may be obtained. As described in the foregoing embodiments, the device information may include the name of the electronic device, or any identifier capable of uniquely identifying it, and may also include information such as the model, manufacturer, and product category of the electronic device, which is not particularly limited in this disclosure. In addition, the mapping relationship of each electronic device may be obtained, including the attribute categories that can be controlled and the attribute values (ranges) of each attribute category. That is, the embodiments of the disclosure can integrate knowledge about the electronic devices and analyze which functions, which attributes, and which corresponding attribute values are supported by each category and model of home device.
S102: forming a graph data structure of the electronic device based on the device information and mapping information of the electronic device, wherein the graph data structure includes the device information and the mapping information;
In some possible embodiments, when the device information and mapping information of the electronic device are obtained, a graph data structure relating them may be formed. The graph data structure can represent the device information and the mapping information, as well as the correspondence between each attribute value range and each attribute category in the mapping information. Fig. 3 shows a schematic diagram of a graph data structure according to an embodiment of the present disclosure.
As shown in fig. 3, the graph data structures of device A and device B each include the respective device information and mapping information. For example, "device A" and "device B" may represent the identifiers of the two electronic devices, the type information may represent their types (air conditioner and fan), and the attribute categories of the two electronic devices, together with the attribute values corresponding to each attribute category, are also shown. The device information and mapping information of each electronic device can be clearly determined through the graph data structure.
S103: storing the graph data structure of the electronic device to form the device ontology library.
In some possible implementations, when the graph data structures of the electronic devices are obtained, they may be stored to form the device ontology library.
Different graph data structures can be formed for different electronic devices, and storing the graph data structures of the electronic devices in the device ontology library makes it convenient to determine which attribute category of the device a voice instruction is intended to act on.
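Steps S101 to S103 can be sketched as follows (hypothetical; the graph data structure is simplified to a flat list of (subject, relation, object) edges rather than a real graph database):

```python
ontology_library = {}   # device id -> graph data structure (edge list)

def build_graph(device_id, device_type, mapping):
    """S102: form a graph linking the device to its categories and values."""
    edges = [(device_id, "type", device_type)]
    for category, values in mapping.items():
        edges.append((device_id, "has_attribute", category))
        for value in values:
            edges.append((category, "has_value", value))
    return edges

def register_device(device_id, device_type, mapping):
    """S101 + S103: receive device/mapping info and store its graph."""
    ontology_library[device_id] = build_graph(device_id, device_type, mapping)
```

Registering a fan as "device B" with a speed attribute, for example, yields edges such as `("device_b", "type", "fan")` and `("speed", "has_value", "high")`.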
Next, the control intention determining process of the embodiments of the present disclosure will be described based on the configured device ontology library. First, the embodiments of the disclosure can obtain a voice instruction and analyze it to obtain semantic information. The semantic information can be obtained through a convolutional neural network, which can receive the voice instruction, convert it into text information, and perform feature processing on the text information to obtain the semantic information, extracting the controlled device and the controlled attribute value from the voice instruction.
In the embodiments of the present disclosure, when the semantic information is obtained, it may first be determined whether the control intent can be derived directly from the semantic information. For example, when the controlled attribute value in the semantic information is a first-type attribute value, the controlled attribute value may directly be determined as the control intent for the controlled device; the first-type controlled attribute values may include at least one of on, off, and sleep, or other attribute values from which the intent can be determined directly, which is not specifically limited in the present disclosure. In addition, when the controlled attribute value in the semantic information is a second-type attribute value, the attribute category matching the controlled attribute value may be queried from the device ontology library. Second-type attribute values are those whose intent cannot be determined directly from the semantics; for example, they may include values expressed in terms of degree, strength, or intensity. Again, the above examples are not specific limitations of the present disclosure. With this configuration, not all controlled attribute values need to be resolved to their attribute categories through the device ontology library, which saves time and improves efficiency.
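The first-type/second-type shortcut described above can be sketched like this (the membership set is an illustrative example taken from the text, not an exhaustive specification):

```python
# "First type": values whose intent is unambiguous on their own, so no
# ontology lookup is needed. Everything else is treated as "second type".
FIRST_TYPE_VALUES = {"on", "off", "sleep"}

def needs_ontology_lookup(controlled_value):
    """Return True when the value's attribute category must be queried."""
    return controlled_value not in FIRST_TYPE_VALUES
```

"off" is resolved immediately, while a value such as "20 degrees" is routed through the device ontology library.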
Fig. 4 shows a flowchart of step S30 according to an embodiment of the present disclosure, wherein the obtaining, based on the configured device ontology library, an attribute category matching the controlled attribute value in mapping information of the controlled device includes:
S31: querying, based on the identified controlled device, mapping information matching the controlled device in the device ontology library;
The embodiments of the disclosure can query the device ontology library for the graph data structure matching the identified controlled device. In one example, the controlled device in the semantic information may include the name (type) of the controlled device, or its identifier. The embodiments of the disclosure can generate request information according to the device information of the controlled device, so that the graph data structure matching the name or identifier of the controlled device can be looked up in the device ontology library. Once the graph data structure is determined, the mapping information matching the controlled device can be found in it, i.e., the attribute categories of the controlled device and the attribute values corresponding to each attribute category can be determined.
In one example, Fig. 5 illustrates a schematic diagram of determining attribute categories according to an embodiment of the present disclosure. The voice instruction is "set air conditioner to powerful", so the semantic information may include "air conditioner" and "powerful". The device ontology library may then be queried for a graph data structure whose type is "air conditioner", for example the graph data structure of device A in Fig. 5.
S32: querying for the controlled attribute value within the attribute ranges of the mapping information matching the controlled device;
In some possible embodiments, once the graph data structure and its corresponding mapping information are determined, the attribute range matching the controlled attribute value may be looked up in the mapping information. This may mean querying for an attribute value identical to the controlled attribute value, or for the attribute range to which the controlled attribute value corresponds; attribute ranges may differ from attribute to attribute.
Continuing the above example, the mapping of the graph data structure of device A may be queried for the attribute value "powerful".
S33: obtaining the attribute category matching the controlled attribute value based on the queried controlled attribute value.
In some possible embodiments, when an attribute value corresponding to the controlled attribute value is found in the graph data structure, the attribute category to which that value belongs can be determined, and thus the attribute category matching the controlled attribute value is obtained.
Continuing the above example, it may further be determined that the attribute category corresponding to "powerful" is "mode".
Through the above embodiment, the attribute category corresponding to the controlled attribute value of the controlled device can be determined conveniently, and the control intent can thus be determined conveniently and accurately.
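Steps S31-S33 above can be sketched end to end as a single lookup; the dictionary-based ontology shape and the function name are assumptions for illustration:

```python
# Hypothetical end-to-end lookup for steps S31-S33. `ontology` maps a device to
# its mapping information: {device: {attribute category: [attribute values]}}.

def match_attribute_category(ontology: dict, device: str, value: str):
    # S31: query the mapping information matching the controlled device.
    mapping = ontology.get(device)
    if mapping is None:
        return None  # no graph data structure found for this device
    # S32: query for the controlled attribute value within each attribute range.
    # S33: the category whose value range contains the value is the match.
    for category, values in mapping.items():
        if value in values:
            return category
    return None  # no matching attribute category: caller may output prompt information
```

For the running example, "powerful" resolves to the "mode" category of the air conditioner.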
Once the attribute category of the controlled attribute value is determined, the control intent for the controlled device can be obtained from the controlled device in the semantic information together with the determined attribute category and the controlled attribute value.
In some possible embodiments, since the device ontology library may include graph data structures for a plurality of electronic devices, matching the controlled attribute value of the controlled device may yield more than one matching attribute category. In that case, prompt information may be output, either as a display output or as a voice output; the present disclosure is not particularly limited in this respect. The prompt information may include a request to select the desired attribute category from the plurality of matching attribute categories. On receiving the prompt, the user may input selection information, which may include the finally determined attribute category matching the controlled attribute value. That is, the electronic device may determine the attribute category matching the controlled attribute value based on the received selection information, and then perform the control operation for the controlled attribute value under that attribute category.
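The disambiguation flow above, prompting for a selection when several attribute categories match, might be sketched as follows, with the display or voice prompt abstracted as a callable; all names are illustrative assumptions:

```python
# Hypothetical category resolution with user disambiguation.

def resolve_category(mapping: dict, value: str, ask_user=None):
    """Return the attribute category matching `value`, prompting when ambiguous.

    `ask_user` stands in for the display/voice prompt: it receives the candidate
    categories and returns the user's selection information.
    """
    candidates = [cat for cat, values in mapping.items() if value in values]
    if not candidates:
        return None          # no match: caller outputs "cannot match" prompt information
    if len(candidates) == 1:
        return candidates[0]  # unambiguous match, no prompt needed
    # Multiple attribute categories match: output prompt information and
    # determine the category from the received selection information.
    return ask_user(candidates)
```

For example, if "powerful" appears in both a "mode" range and a "fan speed" range, the user's selection decides which control operation is performed.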
In some possible embodiments, when no attribute category matching the controlled attribute value is found, prompt information may also be output. The prompt information may indicate that no attribute category can be matched, or may prompt the user to re-enter the voice information, thereby improving the user experience.
In summary, in the embodiment of the present disclosure, each electronic device in the Internet of Things (such as various types of home devices) or the server may determine, according to the received voice instruction, the attribute category matching the controlled attribute value in that instruction, thereby determining the intent expressed by the controlled attribute value within that attribute category. Because the configured device ontology library manages the mapping information of a plurality of electronic devices uniformly, the attribute category of each electronic device matching a controlled attribute value can be obtained conveniently, and the intent can then be determined accurately from the received semantic information.
It will be appreciated by those skilled in the art that, in the above methods of the specific embodiments, the written order of the steps does not imply a strict order of execution; the actual order of execution should be determined by the functions of the steps and their possible inherent logic.
It will be appreciated that the above-mentioned method embodiments of the present disclosure may be combined with each other to form combined embodiments without departing from the principle and logic, which are not described in detail in the present disclosure for brevity.
In addition, the present disclosure further provides a control intention recognition apparatus, an electronic device, a computer-readable storage medium, and a program, all of which can be used to implement any of the control intention recognition methods provided by the present disclosure; for the corresponding technical solutions and descriptions, refer to the corresponding passages in the method section, which are not repeated here.
Fig. 6 shows a block diagram of a control intention recognition apparatus according to an embodiment of the present disclosure. As shown in Fig. 6, the control intention recognition apparatus includes:
an acquisition module 10 for acquiring a voice instruction;
a recognition module 20, configured to recognize semantic information in the voice instruction, where the semantic information includes a controlled device and a controlled attribute value;
a matching module 30, configured to obtain, based on a device ontology library, an attribute category matching the controlled attribute value in the mapping information of the controlled device, where the device ontology library includes the mapping information of a plurality of devices;
a determining module 40, configured to determine a control intent for controlling the controlled device based on the obtained attribute category and the controlled attribute value.
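A minimal sketch wiring the four modules of Fig. 6 into one pipeline, with the speech-recognition stack abstracted as a callable; the class, ontology shape, and function names are illustrative assumptions:

```python
# Hypothetical pipeline over the four modules: acquisition, recognition,
# matching, and determination.

class ControlIntentRecognizer:
    def __init__(self, ontology: dict, recognize_semantics):
        self.ontology = ontology                        # device ontology library
        self.recognize_semantics = recognize_semantics  # stands in for the ASR/NLP stack

    def run(self, voice_instruction: bytes) -> dict:
        # Acquisition module 10 + recognition module 20: voice instruction ->
        # semantic information (controlled device, controlled attribute value).
        device, value = self.recognize_semantics(voice_instruction)
        # Matching module 30: attribute category matching the controlled attribute value.
        category = next(
            (cat for cat, vals in self.ontology.get(device, {}).items() if value in vals),
            None,
        )
        # Determining module 40: control intent from device, category, and value.
        return {"device": device, "category": category, "value": value}
```

The resulting intent dictionary could then drive the actual control operation on the controlled device.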
In some embodiments, functions or modules included in an apparatus provided by the embodiments of the present disclosure may be used to perform a method described in the foregoing method embodiments, and specific implementations thereof may refer to descriptions of the foregoing method embodiments, which are not repeated herein for brevity.
The disclosed embodiments also provide a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the above-described method. The computer readable storage medium may be a non-volatile computer readable storage medium.
The embodiment of the present disclosure also provides an electronic device, comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to perform the method described above.
The electronic device may be provided as a terminal, server or other form of device.
Fig. 7 illustrates a block diagram of an electronic device 800, according to an embodiment of the disclosure. For example, electronic device 800 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 7, an electronic device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the electronic device 800 is in an operational mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 814 includes one or more sensors for providing status assessment of various aspects of the electronic device 800. For example, the sensor assembly 814 may detect an on/off state of the electronic device 800, a relative positioning of the components, such as a display and keypad of the electronic device 800, the sensor assembly 814 may also detect a change in position of the electronic device 800 or a component of the electronic device 800, the presence or absence of a user's contact with the electronic device 800, an orientation or acceleration/deceleration of the electronic device 800, and a change in temperature of the electronic device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 804 including computer program instructions executable by processor 820 of electronic device 800 to perform the above-described methods.
Fig. 8 illustrates a block diagram of another electronic device 1900 in accordance with an embodiment of the disclosure. For example, electronic device 1900 may be provided as a server. Referring to fig. 8, electronic device 1900 includes a processing component 1922 that further includes one or more processors and memory resources represented by memory 1932 for storing instructions, such as application programs, that can be executed by processing component 1922. The application programs stored in memory 1932 may include one or more modules each corresponding to a set of instructions. Further, processing component 1922 is configured to execute instructions to perform the methods described above.
The electronic device 1900 may also include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 1932, including computer program instructions executable by processing component 1922 of electronic device 1900 to perform the methods described above.
The present disclosure may be a system, method, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include the following: portable computer disks, hard disks, Random Access Memory (RAM), Read-Only Memory (ROM), erasable programmable read-only memory (EPROM or flash memory), Static Random Access Memory (SRAM), portable compact disk read-only memory (CD-ROM), Digital Versatile Disks (DVD), memory sticks, floppy disks, mechanically encoded devices such as punch cards or raised structures in a groove having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media, as used herein, are not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (e.g., optical pulses through fiber optic cables), or electrical signals transmitted through wires.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to a respective computing/processing device or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmissions, wireless transmissions, routers, firewalls, switches, gateway computers and/or edge servers. The network interface card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
Computer program instructions for performing the operations of the present disclosure can be assembly instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present disclosure are implemented by personalizing electronic circuitry, such as programmable logic circuitry, Field Programmable Gate Arrays (FPGAs), or Programmable Logic Arrays (PLAs), with state information of the computer readable program instructions, the electronic circuitry then executing the computer readable program instructions.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of the embodiments of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the technical improvement of the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (8)

1. A control intention recognition method, characterized by comprising:
acquiring a voice instruction;
identifying semantic information in the voice instruction, wherein the semantic information comprises a controlled device and a controlled attribute value for the controlled device;
obtaining, based on a configured device ontology library, an attribute category matching the controlled attribute value in mapping information of the controlled device, wherein the device ontology library comprises the mapping information of a plurality of devices; comprising:
querying, based on the identified controlled device, the device ontology library for mapping information matching the controlled device, comprising:
requesting to obtain a graph data structure of the controlled device based on the identified controlled device; and
determining the mapping information of the controlled device based on the obtained graph data structure;
querying for the controlled attribute value within an attribute range of the mapping information matching the controlled device; and
obtaining the attribute category matching the controlled attribute value based on the queried controlled attribute value; and
determining a control intention for controlling the controlled device based on the obtained attribute category and the controlled attribute value.
2. The method of claim 1, further comprising a step of configuring the device ontology library, comprising:
receiving device information and mapping information of at least one electronic device, wherein the mapping information comprises attribute categories controllable on the electronic device and attribute values corresponding to the attribute categories;
forming a graph data structure of the electronic device based on the device information and mapping information of the electronic device, wherein the graph data structure comprises the device information and the mapping information; and
storing the graph data structure of the electronic device to form the device ontology library.
3. The method of claim 1, wherein the identifying semantic information in the voice instruction comprises:
performing natural language processing on the voice instruction, and determining intention information in the voice instruction.
4. The method according to claim 1, wherein the method further comprises:
outputting prompt information in response to obtaining at least two attribute categories matching the controlled attribute value from the device ontology library; and
receiving selection information, and determining the attribute category matching the controlled attribute value based on the selection information.
5. The method according to claim 1, wherein the method further comprises:
controlling the controlled device according to the control intention.
6. A control intention recognition apparatus, characterized by comprising:
an acquisition module, configured to acquire a voice instruction;
a recognition module, configured to recognize semantic information in the voice instruction, wherein the semantic information comprises a controlled device and a controlled attribute value;
a matching module, configured to obtain, based on a device ontology library, an attribute category matching the controlled attribute value in mapping information of the controlled device, wherein the device ontology library comprises the mapping information of a plurality of devices; comprising:
querying, based on the identified controlled device, the device ontology library for mapping information matching the controlled device, comprising:
requesting to obtain a graph data structure of the controlled device based on the identified controlled device; and
determining the mapping information of the controlled device based on the obtained graph data structure;
querying for the controlled attribute value within an attribute range of the mapping information matching the controlled device; and
obtaining the attribute category matching the controlled attribute value based on the queried controlled attribute value; and
a determining module, configured to determine a control intention for controlling the controlled device based on the obtained attribute category and the controlled attribute value.
7. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to invoke the instructions stored in the memory to perform the method of any of claims 1-5.
8. A computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the method of any of claims 1-5.
CN201910989897.8A 2019-10-17 2019-10-17 Control intention recognition method and device, electronic equipment and storage medium Active CN110738044B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910989897.8A CN110738044B (en) 2019-10-17 2019-10-17 Control intention recognition method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN110738044A (en) 2020-01-31
CN110738044B (en) 2023-09-22

Family

ID=69269197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910989897.8A Active CN110738044B (en) 2019-10-17 2019-10-17 Control intention recognition method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110738044B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113535987B (en) * 2021-09-13 2022-01-21 杭州涂鸦信息技术有限公司 Linkage rule matching method and related device

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7921156B1 (en) * 2010-08-05 2011-04-05 Solariat, Inc. Methods and apparatus for inserting content into conversations in on-line and digital environments
WO2015144065A1 (en) * 2014-03-26 2015-10-01 华为技术有限公司 Semantic recognition-based help processing method and device
CN107370649A (en) * 2017-08-31 2017-11-21 广东美的制冷设备有限公司 Household electric appliance control method, system, control terminal and storage medium
CN107492374A (en) * 2017-10-11 2017-12-19 深圳市汉普电子技术开发有限公司 A kind of sound control method, smart machine and storage medium
CN107515944A (en) * 2017-08-31 2017-12-26 广东美的制冷设备有限公司 Exchange method, user terminal and storage medium based on artificial intelligence
CN107688614A (en) * 2017-08-04 2018-02-13 平安科技(深圳)有限公司 It is intended to acquisition methods, electronic installation and computer-readable recording medium
CN108376543A (en) * 2018-02-11 2018-08-07 深圳创维-Rgb电子有限公司 A kind of control method of electrical equipment, device, equipment and storage medium
WO2019007245A1 (en) * 2017-07-04 2019-01-10 阿里巴巴集团控股有限公司 Processing method, control method and recognition method, and apparatus and electronic device therefor
CN109379261A (en) * 2018-11-30 2019-02-22 北京小米智能科技有限公司 Control method, device, system, equipment and the storage medium of smart machine
WO2019168235A1 (en) * 2018-03-02 2019-09-06 주식회사 머니브레인 Method and interactive ai agent system for providing intent determination on basis of analysis of same type of multiple pieces of entity information, and computer-readable recording medium
CN110246496A (en) * 2019-07-01 2019-09-17 珠海格力电器股份有限公司 Audio recognition method, system, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109841210B (en) * 2017-11-27 2024-02-20 西安中兴新软件有限责任公司 Intelligent control implementation method and device and computer readable storage medium

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7921156B1 (en) * 2010-08-05 2011-04-05 Solariat, Inc. Methods and apparatus for inserting content into conversations in on-line and digital environments
WO2015144065A1 (en) * 2014-03-26 2015-10-01 华为技术有限公司 Semantic recognition-based help processing method and device
WO2019007245A1 (en) * 2017-07-04 2019-01-10 阿里巴巴集团控股有限公司 Processing method, control method and recognition method, and apparatus and electronic device therefor
CN107688614A (en) * 2017-08-04 2018-02-13 平安科技(深圳)有限公司 Intent acquisition method, electronic device, and computer-readable storage medium
CN107370649A (en) * 2017-08-31 2017-11-21 广东美的制冷设备有限公司 Household appliance control method, system, control terminal, and storage medium
CN107515944A (en) * 2017-08-31 2017-12-26 广东美的制冷设备有限公司 Artificial-intelligence-based interaction method, user terminal, and storage medium
CN107492374A (en) * 2017-10-11 2017-12-19 深圳市汉普电子技术开发有限公司 Voice control method, smart device, and storage medium
CN108376543A (en) * 2018-02-11 2018-08-07 深圳创维-Rgb电子有限公司 Electrical appliance control method, apparatus, device, and storage medium
WO2019168235A1 (en) * 2018-03-02 2019-09-06 주식회사 머니브레인 Method and interactive ai agent system for providing intent determination on basis of analysis of same type of multiple pieces of entity information, and computer-readable recording medium
CN109379261A (en) * 2018-11-30 2019-02-22 北京小米智能科技有限公司 Smart device control method, apparatus, system, device, and storage medium
CN110246496A (en) * 2019-07-01 2019-09-17 珠海格力电器股份有限公司 Speech recognition method, system, computer device, and storage medium

Also Published As

Publication number Publication date
CN110738044A (en) 2020-01-31

Similar Documents

Publication Publication Date Title
CN111508483B (en) Equipment control method and device
EP3163569B1 (en) Method and device for controlling a smart device by voice, computer program and recording medium
US10175671B2 (en) Method and apparatus for controlling intelligent device
WO2017045309A1 (en) Device control method and apparatus, and terminal device
WO2017113842A1 (en) Intelligent device control method and apparatus
CN104038536B (en) Plug-in communication method and device
CN107767864B (en) Method and device for sharing information based on voice and mobile terminal
CN104407592A (en) Method and device for regulating running state of smart home device
KR101735755B1 (en) Method and apparatus for prompting device connection
CN104636453A (en) Illegal user data identification method and device
CN104363205A (en) Application login method and device
CN108803892B (en) Method and device for calling third party application program in input method
EP3015949A1 (en) Method and device for displaying information
CN105549572A (en) Control method and device for intelligent home equipment and equipment
CN111061452A (en) Voice control method and device of user interface
CN105468775A (en) Method and device for electronic explanation
CN111031124A (en) Home equipment networking deployment method and device, electronic equipment and storage medium
CN108766427B (en) Voice control method and device
CN110738044B (en) Control intention recognition method and device, electronic equipment and storage medium
CN109992754B (en) Document processing method and device
CN111338971B (en) Application testing method and device, electronic equipment and storage medium
CN104951522A (en) Searching method and device
US20170180148A1 (en) Method, device and system for intelligent household appliance to access multiple servers
CN112130839A (en) Method for constructing database, method for voice programming and related device
CN111667827B (en) Voice control method and device for application program and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant