CN110768877B - Voice control instruction processing method and device, electronic equipment and readable storage medium - Google Patents

Voice control instruction processing method and device, electronic equipment and readable storage medium

Info

Publication number
CN110768877B
Authority
CN
China
Prior art keywords
voice control
control instruction
service
operations
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910927375.5A
Other languages
Chinese (zh)
Other versions
CN110768877A (en)
Inventor
戚耀文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Original Assignee
Baidu Online Network Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Baidu Online Network Technology Beijing Co Ltd filed Critical Baidu Online Network Technology Beijing Co Ltd
Priority to CN201910927375.5A priority Critical patent/CN110768877B/en
Publication of CN110768877A publication Critical patent/CN110768877A/en
Application granted granted Critical
Publication of CN110768877B publication Critical patent/CN110768877B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H04L 12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • H04L 12/282 Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G10L 15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G10L 15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L 2015/223 Execution procedure of a spoken command

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The application discloses a processing method and device of a voice control instruction, an electronic device and a readable storage medium, and relates to artificial intelligence technology. According to the embodiments of the application, a voice control instruction of a user is acquired based on a user voice interface, and then, according to the voice control instruction, at least two associated operations of the smart home service associated with the voice control instruction are obtained, so that the at least two associated operations can be controlled and executed to realize a plurality of smart home services.

Description

Voice control instruction processing method and device, electronic equipment and readable storage medium
Technical Field
The present disclosure relates to computer technologies, and in particular, to an artificial intelligence technology, and more particularly, to a method and an apparatus for processing a voice control command, an electronic device, and a readable storage medium.
Background
The smart home is an embodiment of the Internet of Things in the home, arising under the influence of the Internet of Things. A smart home connects various home devices in a designated area such as a household (for example, audio and video equipment, lighting systems, curtain control, air-conditioning control, security systems, digital cinema systems, networked home appliances, and three-meter reading systems) through Internet of Things technology, and provides a variety of functions and means such as home appliance control, lighting control, curtain control, remote control by telephone, indoor and outdoor remote control, burglar alarm, environment monitoring, heating and ventilation control, infrared forwarding, and programmable timing control. Home devices in a smart home may be referred to as smart home devices.
Generally, if a plurality of smart home services need to be used, the user has to control each smart home device separately to perform its service. This way of controlling smart home services is therefore inefficient.
Disclosure of Invention
Aspects of the present application provide a method and an apparatus for processing a voice control command, an electronic device, and a readable storage medium, so as to improve control efficiency of smart home services.
In one aspect of the present application, a method for processing a voice control instruction is provided, including:
acquiring a voice control instruction of a user based on a user voice interface;
according to the voice control instruction, obtaining at least two associated operations of the intelligent home service associated with the voice control instruction;
and controlling to execute the at least two associated operations so as to realize a plurality of intelligent home services.
The above-described aspects and any possible implementations further provide an implementation in which the associating operation includes at least one of an instruction operation and a service operation.
The above aspect and any possible implementation further provide an implementation in which, before obtaining, according to the voice control instruction, the at least two associated operations of the intelligent home service associated with the voice control instruction, the method further includes:
setting, in response to an input operation of the user based on an associated operation setting interface, the voice control instruction and at least two instruction operations associated with the voice control instruction; and/or
setting, in response to a drag operation of the user based on an associated operation setting interface, the voice control instruction and at least two service operations associated with the voice control instruction.
The above aspect and any possible implementation further provide an implementation in which, before the setting, in response to a drag operation of the user based on an associated operation setting interface, of the voice control instruction and the at least two service operations associated with the voice control instruction, the method further includes:
setting, in response to a user operation of the user based on a service operation setting interface, a service parameter of each of the at least two service operations associated with the voice control instruction.
The above aspect and any possible implementation further provide an implementation in which the controlling to execute the at least two associated operations includes:
controlling to execute the at least two associated operations simultaneously to realize a plurality of intelligent home services; or
controlling to execute the at least two associated operations in sequence to realize a plurality of intelligent home services.
In another aspect of the present application, there is provided a processing apparatus for voice control instruction, including:
the acquisition unit is used for acquiring a voice control instruction of a user based on a user voice interface;
the association unit is used for acquiring at least two association operations of the intelligent home service associated with the voice control instruction according to the voice control instruction;
a control unit for controlling the execution of the at least two associated operations.
The above-described aspects and any possible implementations further provide an implementation in which the associating operation includes at least one of an instruction operation and a service operation.
The above aspect and any possible implementation further provide an implementation in which the association unit is further configured to:
set, in response to an input operation of the user based on an associated operation setting interface, the voice control instruction and at least two instruction operations associated with the voice control instruction; and/or
set, in response to a drag operation of the user based on an associated operation setting interface, the voice control instruction and at least two service operations associated with the voice control instruction.
The above aspect and any possible implementation further provide an implementation in which the association unit is further configured to:
set, in response to a user operation of the user based on a service operation setting interface, a service parameter of each of the at least two service operations associated with the voice control instruction.
The above aspect and any possible implementation further provide an implementation in which the control unit is specifically configured to:
control execution of the at least two associated operations simultaneously to realize a plurality of intelligent home services; or
control execution of the at least two associated operations in sequence to realize a plurality of intelligent home services.
In another aspect of the present invention, an electronic device is provided, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of the aspects and any possible implementation described above.
In another aspect of the invention, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of the above described aspects and any possible implementation.
According to the technical solutions described above, a voice control instruction of a user is acquired based on a user voice interface, and then, according to the voice control instruction, at least two associated operations of the intelligent home service associated with the voice control instruction are obtained, so that the at least two associated operations can be controlled and executed to realize a plurality of intelligent home services. There is no need to control each smart home device separately: a single voice control instruction from the user triggers and executes a series of associated operations, realizing a plurality of smart home services and improving the control efficiency of smart home services.
In addition, with the technical solution provided by the application, the user can customize the voice control instruction and the at least two associated operations of the intelligent home service associated with each voice control instruction, which can effectively improve the personalization of the smart home service.
In addition, by adopting the technical scheme provided by the application, in the implementation of the instruction operation, one voice control instruction of the user can trigger and complete the instruction operation of a plurality of preset intelligent home services, so that the method is very convenient, and the usability of the intelligent home equipment is greatly improved.
In addition, by adopting the technical scheme provided by the application, in the implementation of the service operation, one voice control instruction of the user can trigger and complete the service operation of a plurality of preset intelligent home services, so that the method is very convenient, and the service efficiency of the intelligent home services is greatly improved.
In addition, by adopting the technical scheme provided by the application, the user can conveniently set the voice control instruction and the associated operation of the associated intelligent home service based on the visual associated operation setting interface and by adopting input operation or dragging operation, the operation is simple, and therefore the service efficiency of the intelligent home service is improved.
In addition, by adopting the technical scheme provided by the application, the user experience can be effectively improved.
Further effects of the above aspects or possible implementations will be described below in connection with specific embodiments.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without inventive effort. The drawings are intended only to illustrate the present application and are not to be construed as limiting it. Wherein:
fig. 1A is a schematic flowchart illustrating a processing method of a voice control command according to an embodiment of the present application;
fig. 1B and fig. 1C are interface schematic diagrams of an association operation setting interface in the embodiment corresponding to fig. 1A;
FIGS. 1D-1G are schematic interface diagrams of a service operation setting interface in the embodiment corresponding to FIG. 1A;
fig. 2 is a schematic structural diagram of a processing apparatus for voice control commands according to another embodiment of the present application;
fig. 3 is a schematic diagram of an electronic device for implementing a method for processing a voice control instruction according to an embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding; these details are to be regarded as exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the present application. Likewise, descriptions of well-known functions and constructions are omitted from the following description for clarity and conciseness.
It is to be understood that the described embodiments are only some, and not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without creative effort shall fall within the protection scope of the present application.
It should be noted that the terminal involved in the embodiments of the present application may include, but is not limited to, a mobile phone, a Personal Digital Assistant (PDA), a wireless handheld device, a Tablet Computer (Tablet Computer), a Personal Computer (PC), an MP3 player, an MP4 player, a wearable device (e.g., smart glasses, smart watches, smart bracelets, etc.), a smart home device (e.g., smart sound box device, smart television, smart air conditioner, etc.), and the like.
In addition, the term "and/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean that A exists alone, that A and B exist simultaneously, or that B exists alone. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship.
Fig. 1A is a flowchart illustrating a processing method of a voice control instruction according to an embodiment of the present application. As shown in fig. 1A, the method includes:
101. Acquire a voice control instruction of a user based on a user voice interface.
102. Obtain, according to the voice control instruction, at least two associated operations of the intelligent home service associated with the voice control instruction.
103. Control execution of the at least two associated operations to realize a plurality of intelligent home services.
It should be noted that part or all of the execution subjects of 101 to 103 may be an application located at the local terminal, or may also be a functional unit such as a plug-in or Software Development Kit (SDK) set in the application located at the local terminal, or may also be a processing engine located in a server on the network side, or may also be a distributed system located on the network side, for example, a processing engine or a distributed system in a smart home service platform on the network side, which is not particularly limited in this embodiment.
It is to be understood that the application may be a native app (native app) installed on the terminal, or may also be a web page program (webApp) of a browser on the terminal, which is not limited in this embodiment.
Thus, by acquiring a voice control instruction of a user based on a user voice interface and then obtaining, according to the voice control instruction, at least two associated operations of the intelligent home service associated with the voice control instruction, the at least two associated operations can be controlled and executed to realize a plurality of intelligent home services. Each smart home device does not need to be controlled separately: a single voice control instruction from the user triggers and executes a series of associated operations, realizing a plurality of smart home services and improving the control efficiency of smart home services.
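By way of illustration only, the flow of steps 101-103 can be sketched as a small Python program. The names used here (VoiceControlProcessor, AssociatedOperation, associate, handle) are hypothetical and do not appear in the disclosure, which does not prescribe any particular implementation language or API.

```python
# Illustrative sketch only; all names are hypothetical and not taken from the disclosure.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class AssociatedOperation:
    """One associated operation: an instruction operation or a service operation."""
    kind: str                   # "instruction" or "service"
    device: str                 # e.g. "light", "speaker", "tv"
    action: Callable[[], None]  # what the smart home device should do


class VoiceControlProcessor:
    def __init__(self) -> None:
        # voice control instruction -> at least two associated operations
        self._associations: Dict[str, List[AssociatedOperation]] = {}

    def associate(self, instruction: str, operations: List[AssociatedOperation]) -> None:
        """Set up, in advance, the operations associated with a voice control instruction."""
        if len(operations) < 2:
            raise ValueError("at least two associated operations are required")
        self._associations[instruction] = operations

    def handle(self, instruction: str) -> None:
        """Steps 101-103: acquire the instruction, look up its associations, execute them."""
        for op in self._associations.get(instruction, []):
            op.action()


# One spoken instruction triggers several smart home services.
processor = VoiceControlProcessor()
processor.associate("good night", [
    AssociatedOperation("instruction", "light", lambda: print("turn off the lights")),
    AssociatedOperation("service", "speaker", lambda: print("play sleep music")),
])
processor.handle("good night")
```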
The so-called user voice interface (VUI) is used to complete tasks through voice control and is a main interaction mode of the Internet of Things and smart home systems. VUI implementations are based on natural language processing (NLP) technology, sensor technology, and the cross-processing and judgment of different types of data.
In the present application, the triggering manner for triggering the input of the voice control instruction may include, but is not limited to, the following types:
voice triggering;
tactile triggering;
action triggering; and
device self-triggering. Wherein,
so-called voice triggering: the user speaks a phrase that prompts the device to begin processing speech, for example a wake phrase such as "Ok Google".
So-called tactile triggering: a button (physical or virtual) press or a switch control, such as a microphone icon.
So-called action triggering: waving an arm in front of a sensor, and the like.
So-called device self-triggering: an event or a predetermined setting triggers the device, such as the occurrence of an abnormal condition or a task reminder prompting the user for confirmation.
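For illustration, the four trigger types listed above could be modelled as a simple enumeration. This is a sketch under the assumption of a Python implementation; the names are not taken from the disclosure.

```python
# Illustrative sketch only; the disclosure does not prescribe this representation.
from enum import Enum, auto


class TriggerType(Enum):
    VOICE = auto()        # wake phrase spoken by the user, e.g. "Ok Google"
    TACTILE = auto()      # physical or virtual button press, switch control
    ACTION = auto()       # gesture, e.g. waving an arm in front of a sensor
    DEVICE_SELF = auto()  # event or predetermined setting on the device itself


def start_capture(trigger: TriggerType) -> str:
    """Whatever the trigger type, it starts the capture of a voice control instruction."""
    return f"listening (triggered by {trigger.name.lower()})"


print(start_capture(TriggerType.VOICE))
```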
After the voice control instruction of the user is acquired based on the user voice interface, step 102 may be executed immediately, or step 102 may be executed at a scheduled time, for example after half an hour or at a specific time such as 3 p.m., which is not particularly limited in this embodiment.
Optionally, in a possible implementation manner of this embodiment, in 102, the obtained at least two associated operations of the smart home service associated with the voice control instruction may include, but are not limited to, at least one of an instruction operation and a service operation, which is not particularly limited in this embodiment.
An instruction operation refers to a voice instruction for controlling a smart home device, for example, playing the theme song of a martial-arts drama, adjusting the brightness of a light to 30, or switching the television to CCTV-3. It can be understood as a voice keyword (query), with which the operations to be executed by the smart home device can be looked up.
In a specific implementation process, before 102, the voice control instruction and at least two instruction operations associated with the voice control instruction may be further set in response to an input operation of the user based on an associated operation setting interface, as shown in fig. 1B.
In this implementation, the user can freely add or delete the voice control instruction and the instruction operations associated with it, so as to achieve a truly customized setting.
The service operation refers to a capability service that the smart home device needs to execute, for example, a music capability service, a humidifier capability service, an air conditioning capability service, a news capability service, a game capability service, and the like, which can be understood as a voice content and can be directly used for triggering the smart home device to execute a corresponding operation.
In another specific implementation process, before 102, the voice control instruction may be set in response to an input operation of the user based on an associated operation setting interface, and at least two service operations associated with the voice control instruction may be set in response to a drag operation of the user based on the associated operation setting interface, as shown in fig. 1C.
In this implementation, the user can freely add or delete only the voice control instruction. For the service operations associated with the voice control instruction, the user cannot add arbitrary service operations and can only selectively add or delete service operations from the options provided by the terminal, so a fully customized setting cannot be achieved.
In the implementation process, before setting at least two service operations associated with the voice control instruction in response to a drag operation of the user on the basis of the associated operation setting interface, a service parameter of each service operation in the at least two service operations associated with the voice control instruction may be further set in response to a user operation of the user on the basis of the service operation setting interface, as shown in fig. 1D to 1G.
For each service operation, a detailed service operation setting interface is provided, in which the specific service parameters of that service operation are set.
For example, as shown in fig. 1D, the user may set service parameters, such as music selection and volume, for performing the service operation of the music capability service.
Alternatively, for another example, as shown in fig. 1E, the user may set service parameters, such as temperature selection and power selection, for performing the service operation of the humidifier capability service.
Alternatively, for another example, as shown in fig. 1F, the user may set service parameters, such as news selection, speech rate, volume, and tone selection, for performing the service operation of the news capability service.
Alternatively, for another example, as shown in fig. 1G, the user may set service parameters, such as game selection, for performing the service operation of the game capability service.
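A minimal sketch of how the per-service parameters configured in figs. 1D-1G might be represented as data is given below. The field names and example values are hypothetical and serve only to show that each service operation carries its own pre-set parameters.

```python
# Illustrative sketch only; field names and values are hypothetical.
service_operations = [
    {"service": "music",      "params": {"track": "light rain sounds", "volume": 30}},
    {"service": "humidifier", "params": {"temperature": 26, "power": "low"}},
    {"service": "news",       "params": {"channel": "morning brief", "speech_rate": 1.0,
                                         "volume": 40, "tone": "female"}},
    {"service": "game",       "params": {"game": "trivia quiz"}},
]

# A voice control instruction is then associated with two or more of these entries,
# each carrying the service parameters set on its service operation setting interface.
good_morning = {
    "instruction": "good morning",
    "operations": [service_operations[0], service_operations[2]],
}
print(good_morning)
```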
In yet another specific implementation process, before 102, the technical solutions of the two foregoing specific implementation processes may both be performed. For a detailed description, reference may be made to the relevant contents of the two foregoing specific implementation processes.
Optionally, in a possible implementation manner of this embodiment, in 103, it may be specifically controlled to execute the at least two association operations simultaneously, so that multiple smart home services may be implemented simultaneously.
Optionally, in a possible implementation manner of this embodiment, an execution sequence of a plurality of service operations associated with the voice control instruction may be further set.
In a specific implementation process, the execution sequence of the plurality of service operations associated with the voice control instruction may be specifically set according to a sequence of the association operations set by a user.
In another specific implementation process, the execution sequence of the plurality of service operations associated with the voice control instruction may be specifically set according to a default sequence.
Further, the execution sequence of the plurality of service operations associated with the voice control instruction may be further adjusted in response to a user operation of the user based on the associated operation setting interface, for example, a selection operation, a drag operation, and the like.
Optionally, in a possible implementation manner of this embodiment, in 103, the at least two association operations may be specifically controlled to be executed in sequence, so that a plurality of smart home services can be sequentially implemented.
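The two execution modes described above (simultaneous and sequential) can be sketched as follows. The use of one thread per operation for the simultaneous case is an assumption made for illustration, not something specified by the disclosure.

```python
# Illustrative sketch only: sequential vs. simultaneous execution of associated operations.
import threading
from typing import Callable, List


def run_in_sequence(operations: List[Callable[[], None]]) -> None:
    """Execute the associated operations one after another."""
    for op in operations:
        op()


def run_simultaneously(operations: List[Callable[[], None]]) -> None:
    """Execute the associated operations at the same time (here, one thread each)."""
    threads = [threading.Thread(target=op) for op in operations]
    for t in threads:
        t.start()
    for t in threads:
        t.join()


operations = [lambda: print("dim the lights"), lambda: print("start the humidifier")]
run_in_sequence(operations)
run_simultaneously(operations)
```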
According to the technical solution of the present application, the interaction of the user with a terminal such as a smart home device is voice interaction based on a user voice interface (VUI). The user does not need to remember redundant information, and the control and execution of a plurality of complex tasks can easily be realized through the design of linked associated operations, so that the control efficiency is greatly improved.
In this embodiment, a voice control instruction of a user is acquired based on a user voice interface, and then, according to the voice control instruction, at least two associated operations of the smart home service associated with the voice control instruction are obtained, so that the at least two associated operations can be controlled and executed to realize a plurality of smart home services. There is no need to control each smart home device separately: a single voice control instruction from the user triggers and executes a series of associated operations, realizing a plurality of smart home services and improving the control efficiency of smart home services.
In addition, with the technical solution provided by the application, the user can customize the voice control instruction and the at least two associated operations of the smart home service associated with each voice control instruction, which can effectively improve the personalization of the smart home service.
In addition, by adopting the technical scheme provided by the application, in the implementation of the instruction operation, one voice control instruction of the user can trigger and complete the instruction operation of a plurality of preset intelligent home services, so that the method is very convenient, and the usability of the intelligent home equipment is greatly improved.
In addition, by adopting the technical scheme provided by the application, in the implementation of the service operation, one voice control instruction of the user can trigger and complete the service operation of a plurality of preset intelligent home services, so that the method is very convenient, and the service efficiency of the intelligent home services is greatly improved.
In addition, by adopting the technical scheme provided by the application, the user can conveniently set the voice control instruction and the associated operation of the associated intelligent home service based on the visual associated operation setting interface and by adopting input operation or dragging operation, the operation is simple, and therefore the service efficiency of the intelligent home service is improved.
In addition, by adopting the technical scheme provided by the application, the user experience can be effectively improved.
It should be noted that, for simplicity of description, the above method embodiments are described as a series of actions; however, those skilled in the art will appreciate that the present application is not limited by the described order of actions, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are preferred embodiments, and that the actions and modules involved are not necessarily required by the present application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
Fig. 2 is a schematic structural diagram of a processing apparatus for voice control instructions according to another embodiment of the present application. As shown in fig. 2, the processing apparatus 200 for voice control instructions of this embodiment may include an acquisition unit 201, an association unit 202, and a control unit 203. The acquisition unit 201 is configured to acquire a voice control instruction of a user based on a user voice interface; the association unit 202 is configured to obtain, according to the voice control instruction, at least two associated operations of the smart home service associated with the voice control instruction; and the control unit 203 is configured to control execution of the at least two associated operations.
It should be noted that, part or all of the execution main body of the processing apparatus of the voice control instruction provided in this embodiment may be an application located in the local terminal, or may also be a functional unit such as a plug-in or Software Development Kit (SDK) set in the application located in the local terminal, or may also be a processing engine located in a server on the network side, or may also be a distributed system located on the network side, for example, a processing engine or a distributed system in a test platform on the network side, and this embodiment is not particularly limited in this respect.
It is to be understood that the application may be a native app (native app) installed on the terminal, or may also be a web page app (webApp) of a browser on the terminal, which is not limited in this embodiment.
Optionally, in a possible implementation manner of this embodiment, the at least two association operations of the smart home service associated with the voice control instruction obtained by the associating unit 202 may include, but are not limited to, at least one of an instruction operation and a service operation, and this is not particularly limited in this embodiment.
An instruction operation refers to a voice instruction for controlling a smart home device, for example, playing the theme song of a martial-arts drama, adjusting the brightness of a light to 30, or switching the television to CCTV-3. It can be understood as a voice keyword (query), with which the operations to be executed by the smart home device can be looked up.
In a specific implementation process, the associating unit 202 may be further configured to set the voice control instruction and at least two instruction operations associated with the voice control instruction in response to an input operation of the user based on an associated operation setting interface.
The service operation refers to a capability service that the smart home device needs to execute, for example, a music capability service, a humidifier capability service, an air conditioning capability service, a news capability service, a game capability service, and the like, which can be understood as a voice content and can be directly used for triggering the smart home device to execute a corresponding operation.
In another specific implementation process, the association unit 202 may be further configured to set the voice control instruction in response to an input operation of the user based on an associated operation setting interface, and to set, in response to a drag operation of the user based on the associated operation setting interface, at least two service operations associated with the voice control instruction.
Further, the associating unit 202 may be further configured to set a service parameter of each service operation of the at least two service operations associated with the voice control instruction in response to a user operation of the user based on a service operation setting interface.
Optionally, in a possible implementation manner of this embodiment, the control unit 203 may be specifically configured to control execution of the at least two associated operations simultaneously so as to realize a plurality of smart home services, or to control execution of the at least two associated operations in sequence so as to realize a plurality of smart home services.
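For illustration, the three units of the processing apparatus 200 can be sketched as cooperating objects. This is only a sketch under the assumption of a Python implementation; the interfaces shown are hypothetical and not part of the disclosure.

```python
# Illustrative sketch only; unit interfaces are hypothetical.
from typing import Callable, Dict, List

Operation = Callable[[], None]


class AcquisitionUnit:
    def acquire(self, vui_input: str) -> str:
        """Return the voice control instruction extracted from the user voice interface input."""
        return vui_input.strip().lower()


class AssociationUnit:
    def __init__(self, associations: Dict[str, List[Operation]]) -> None:
        self._associations = associations

    def lookup(self, instruction: str) -> List[Operation]:
        """Obtain the associated operations (at least two) for the instruction."""
        return self._associations.get(instruction, [])


class ControlUnit:
    def execute(self, operations: List[Operation]) -> None:
        """Control execution of the associated operations (sequentially in this sketch)."""
        for op in operations:
            op()


# Wiring the units together as in fig. 2.
association_unit = AssociationUnit({
    "movie night": [lambda: print("dim the lights"), lambda: print("turn on the tv")],
})
instruction = AcquisitionUnit().acquire("Movie night")
ControlUnit().execute(association_unit.lookup(instruction))
```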
It should be noted that the method in the embodiment corresponding to fig. 1A may be implemented by the processing device of the voice control instruction provided in this embodiment. For a detailed description, reference may be made to relevant contents in the embodiment corresponding to fig. 1A, and details are not described here.
According to the technical solution of the present application, the interaction of the user with a terminal such as a smart home device is voice interaction based on a user voice interface (VUI). The user does not need to remember redundant information, and the control and execution of a plurality of complex tasks can easily be realized through the design of linked associated operations, so that the control efficiency is greatly improved.
In this embodiment, the acquisition unit acquires a voice control instruction of a user based on a user voice interface, and the association unit then obtains, according to the voice control instruction, at least two associated operations of the smart home service associated with the voice control instruction, so that the control unit can control and execute the at least two associated operations to realize a plurality of smart home services. There is no need to control each smart home device separately: a single voice control instruction from the user triggers and executes a series of associated operations, realizing a plurality of smart home services and improving the control efficiency of smart home services.
In addition, with the technical solution provided by the application, the user can customize the voice control instruction and the at least two associated operations of the smart home service associated with each voice control instruction, which can effectively improve the personalization of the smart home service.
In addition, by adopting the technical scheme provided by the application, in the implementation of the instruction operation, one voice control instruction of the user can trigger and complete the instruction operation of a plurality of preset intelligent home services, so that the method is very convenient, and the usability of the intelligent home equipment is greatly improved.
In addition, by adopting the technical scheme provided by the application, in the implementation of the service operation, one voice control instruction of the user can trigger and complete the service operation of a plurality of preset intelligent home services, so that the method is very convenient, and the service efficiency of the intelligent home services is greatly improved.
In addition, by adopting the technical scheme provided by the application, the user can conveniently set the voice control instruction and the associated operation of the associated intelligent home service based on the visual associated operation setting interface and by adopting input operation or dragging operation, the operation is simple, and therefore the service efficiency of the intelligent home service is improved.
In addition, by adopting the technical scheme provided by the application, the user experience can be effectively improved.
The present application also provides an electronic device and a non-transitory computer readable storage medium having computer instructions stored thereon, according to embodiments of the present application.
Fig. 3 is a schematic view of an electronic device for implementing a method for processing a voice control instruction according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 3, the electronic device includes: one or more processors 301, a memory 302, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information for a graphical user interface (GUI) on an external input/output device, such as a display device coupled to an interface. In other embodiments, multiple processors and/or multiple buses may be used, as desired, along with multiple memories. Also, multiple electronic devices may be connected, with each device providing part of the necessary operations (for example, as a server array, a group of blade servers, or a multi-processor system). In fig. 3, one processor 301 is taken as an example.
Memory 302 is a non-transitory computer readable storage medium as provided herein. The memory stores instructions executable by at least one processor, so that the at least one processor executes the processing method of the voice control instructions provided by the application. The non-transitory computer-readable storage medium of the present application stores computer instructions for causing a computer to execute the processing method of a voice control instruction provided by the present application.
The memory 302, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and units, such as the program instructions/units corresponding to the processing method of voice control instructions in the embodiments of the present application (for example, the acquisition unit 201, the association unit 202, and the control unit 203 shown in fig. 2). The processor 301 executes the non-transitory software programs, instructions, and units stored in the memory 302 to perform the various functional applications and data processing of the server, that is, to implement the processing method of voice control instructions in the above method embodiments.
The memory 302 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data and the like created according to use of an electronic device that implements the processing method of the voice control instruction provided by the embodiment of the present application. Further, the memory 302 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 302 may optionally include a memory remotely located from the processor 301, and these remote memories may be connected via a network to an electronic device implementing the processing method of voice control instructions provided by the embodiments of the present application. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the processing method of the voice control instruction may further include: an input device 303 and an output device 304. The processor 301, the memory 302, the input device 303 and the output device 304 may be connected by a bus or other means, and the bus connection is taken as an example in fig. 3.
The input device 303 may receive input numeric or character information and generate key signal inputs related to user settings and function control of an electronic device implementing the processing method of voice control instructions provided by the embodiments of the present application, such as an input device of a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, or the like. The output devices 304 may include a display device, auxiliary lighting devices (e.g., LEDs), and haptic feedback devices (e.g., vibrating motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, an Application Specific Integrated Circuit (ASIC), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user may provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical solution of the embodiments of the present application, a smart home device is deployed in a designated area such as a household and can serve a plurality of users (these users may be referred to as group members), so each group member served by the smart home device can perform his or her own query operation for communication interception through the smart home device. The operation is simple and flexible, and the efficiency and flexibility of obtaining communication interception information are improved.
In addition, with the technical solution provided by the application, the smart home device acquires a communication interception setting instruction of a user and then, according to the communication interception setting instruction, instructs the communication interception system to perform the communication interception setting of the user, so that the communication interception system can perform the user's communication interception setting according to the instruction of the smart home device. Since the smart home device is deployed in a designated area such as a household, it can serve a plurality of users (these users may be referred to as group members), so each group member it serves can perform his or her own communication interception setting operation through the smart home device. The operation is simple and flexible, and the efficiency and flexibility of obtaining communication interception information are improved.
In addition, with the technical solution provided by the application, the communication interception information of the user, obtained by intercepting the user's communication, is sent by the communication interception system to the smart speaker device corresponding to the user, and the smart speaker device provides the user with queries about the user's communication interception. The operation is simple and flexible, and therefore the efficiency and flexibility of obtaining communication interception information are improved.
In addition, with the technical solution provided by the application, the provided communication interception information of the user includes at least one of caller information of the communication intercepted by the communication interception system and intention information of the communication intercepted by the communication interception system, which can effectively improve the richness and effectiveness of the communication interception information.
In addition, with the technical solution provided by the application, the queried communication interception information of the user is output to the user directly through the smart home device, or is output to the user through the terminal used by the user, so that the communication interception information of the user can be output flexibly, and the efficiency and flexibility of obtaining communication interception information can be further improved.
In addition, by adopting the technical scheme provided by the application, the user experience can be effectively improved.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, and the present invention is not limited thereto as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (12)

1. A method for processing a voice control command is characterized by comprising the following steps:
acquiring a voice control instruction of a user based on a user voice interface;
under the condition that a user sets the voice control instruction and at least two instruction operations associated with the voice control instruction based on an input operation on an associated operation setting interface, obtaining at least two associated operations of the intelligent home service associated with the voice control instruction according to the voice control instruction, wherein the instruction operations are voice instructions for controlling intelligent home equipment;
and controlling to execute the at least two associated operations to realize a plurality of intelligent home services.
2. The method of claim 1, wherein the association operation comprises at least one of an instruction operation and a service operation.
3. The method according to claim 1, wherein before obtaining at least two association operations of the smart home service associated with the voice control instruction according to the voice control instruction, the method further comprises:
responding to the input operation of the user based on an associated operation setting interface, and setting the voice control instruction and at least two instruction operations associated with the voice control instruction; and/or
responding to the input operation of the user based on the associated operation setting interface, and setting the voice control instruction; and responding to the drag operation of the user based on the associated operation setting interface, and setting at least two service operations associated with the voice control instruction.
4. The method according to claim 3, wherein before the setting of the at least two service operations associated with the voice control instruction in response to the user's drag operation based on the associated operation setting interface, further comprising:
and responding to the user operation of the user based on a service operation setting interface, and setting the service parameters of each service operation in at least two service operations associated with the voice control instruction.
5. The method according to any one of claims 1-4, wherein the controlling to execute the at least two associated operations includes:
controlling to execute the at least two associated operations simultaneously to realize a plurality of intelligent home services; or
controlling to execute the at least two associated operations in sequence to realize a plurality of intelligent home services.
6. An apparatus for processing voice control commands, comprising:
the acquisition unit is used for acquiring a voice control instruction of a user based on a user voice interface;
the association unit is used for acquiring at least two association operations of the intelligent home service associated with the voice control instruction according to the voice control instruction under the condition that the voice control instruction and at least two instruction operations associated with the voice control instruction are set by a user based on input operation of an association operation setting interface, wherein the instruction operations are voice instructions for controlling the intelligent home equipment;
a control unit for controlling the execution of the at least two associated operations.
7. The apparatus of claim 6, wherein the association operation comprises at least one of an instruction operation and a service operation.
8. The apparatus of claim 6, wherein the association unit is further configured to:
respond to the input operation of the user based on an associated operation setting interface, and set the voice control instruction and at least two instruction operations associated with the voice control instruction; and/or
respond to the input operation of the user based on an associated operation setting interface, and set the voice control instruction; and respond to the drag operation of the user based on the associated operation setting interface, and set at least two service operations associated with the voice control instruction.
9. The apparatus of claim 8, wherein the association unit is further configured to:
respond to the user operation of the user based on a service operation setting interface, and set the service parameters of each service operation in the at least two service operations associated with the voice control instruction.
10. The apparatus according to any one of claims 6-9, wherein the control unit is specifically configured to:
control execution of the at least two associated operations simultaneously to realize a plurality of intelligent home services; or
control execution of the at least two associated operations in sequence to realize a plurality of intelligent home services.
11. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-5.
12. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-5.
CN201910927375.5A 2019-09-27 2019-09-27 Voice control instruction processing method and device, electronic equipment and readable storage medium Active CN110768877B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910927375.5A CN110768877B (en) 2019-09-27 2019-09-27 Voice control instruction processing method and device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910927375.5A CN110768877B (en) 2019-09-27 2019-09-27 Voice control instruction processing method and device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN110768877A CN110768877A (en) 2020-02-07
CN110768877B true CN110768877B (en) 2022-05-27

Family

ID=69330673

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910927375.5A Active CN110768877B (en) 2019-09-27 2019-09-27 Voice control instruction processing method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN110768877B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113377322A (en) * 2020-03-09 2021-09-10 阿里巴巴集团控股有限公司 Page direct processing method and device and electronic equipment
CN111638882B (en) * 2020-05-29 2022-03-11 杭州鸿雁电器有限公司 Method and device for generating operation interface, storage medium and processor
CN112073471B (en) * 2020-08-17 2023-07-21 青岛海尔科技有限公司 Control method and device of equipment, storage medium and electronic device
CN114579819A (en) * 2020-11-30 2022-06-03 华为技术有限公司 Information sorting method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104426750A (en) * 2013-09-11 2015-03-18 腾讯科技(深圳)有限公司 Method, equipment and system for instant messaging
CN105700505A (en) * 2016-03-22 2016-06-22 珠海格力电器股份有限公司 Control method and apparatus for intelligent household device
CN108683574A (en) * 2018-04-13 2018-10-19 青岛海信智慧家居系统股份有限公司 A kind of apparatus control method, server and intelligent domestic system
CN109782612A (en) * 2018-12-04 2019-05-21 安徽精英智能科技有限公司 A kind of household voice interactive intelligence terminal, service system and implementation method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106250474B (en) * 2016-07-29 2020-06-23 Tcl科技集团股份有限公司 Voice control processing method and system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104426750A (en) * 2013-09-11 2015-03-18 腾讯科技(深圳)有限公司 Method, equipment and system for instant messaging
CN105700505A (en) * 2016-03-22 2016-06-22 珠海格力电器股份有限公司 Control method and apparatus for intelligent household device
CN108683574A (en) * 2018-04-13 2018-10-19 青岛海信智慧家居系统股份有限公司 A kind of apparatus control method, server and intelligent domestic system
CN109782612A (en) * 2018-12-04 2019-05-21 安徽精英智能科技有限公司 A kind of household voice interactive intelligence terminal, service system and implementation method

Also Published As

Publication number Publication date
CN110768877A (en) 2020-02-07

Similar Documents

Publication Publication Date Title
CN110768877B (en) Voice control instruction processing method and device, electronic equipment and readable storage medium
CN108920084A (en) Visual field control method and device in a kind of game
CN112669831B (en) Voice recognition control method and device, electronic equipment and readable storage medium
CN110557699B (en) Intelligent sound box interaction method, device, equipment and storage medium
CN112530419B (en) Speech recognition control method, device, electronic equipment and readable storage medium
WO2013182089A1 (en) Object suspension realizing method and device
CN110620844B (en) Program starting method, device, equipment and storage medium
US11544088B1 (en) System and method for providing a customized graphical user interface based on user inputs
WO2016070338A1 (en) Method, apparatus and device for displaying message
CN103713813A (en) Method for controlling functions of intelligent handheld equipment
CN110601933A (en) Control method, device and equipment of Internet of things equipment and storage medium
CN110764857B (en) Virtual keyboard display effect configuration method, device, equipment and storage medium
CN110933227A (en) Assistance method, device, equipment and medium for intelligent terminal
US20210098012A1 (en) Voice Skill Recommendation Method, Apparatus, Device and Storage Medium
US11120683B2 (en) Remote control for interacting with smart home IoT devices and web services
CN111160318B (en) Electronic equipment control method and device
CN105491482B (en) Phonetic transmission method and speech sound transmitting device
CN112148954B (en) Method and device for processing article information, electronic equipment and storage medium
US20210210070A1 (en) Skill service updating method, electronic device and readable storage medium
CN110609671B (en) Sound signal enhancement method, device, electronic equipment and storage medium
CN112466300B (en) Interaction method, electronic device, intelligent device and readable storage medium
CN114363866A (en) Method and device for configuring scene mode of Bluetooth mesh network equipment and electronic device
CN113126756A (en) Application interaction method and device
CN110896427A (en) Communication interception processing method and device, electronic equipment and readable storage medium
US20150268841A1 (en) Method of Changing a User Interface to be a Dedicated SkypeTM Interface and Computer Program Product Thereof and Handheld Electronic Device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant