WO2022158824A1 - Method and device for controlling electronic apparatus - Google Patents

Method and device for controlling electronic apparatus

Info

Publication number
WO2022158824A1
Authority
WO
WIPO (PCT)
Prior art keywords
control command
electronic apparatus
acquisition model
voice signal
user voice
Prior art date
Application number
PCT/KR2022/000919
Other languages
French (fr)
Inventor
Jin Guo
Wuzhi LUO
Xiaorong Liang
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Publication of WO2022158824A1 publication Critical patent/WO2022158824A1/en

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24F AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00 Control or safety arrangements
    • F24F11/50 Control or safety arrangements characterised by user interfaces or communication
    • F24F11/52 Indication arrangements, e.g. displays
    • F24F11/526 Indication arrangements, e.g. displays giving audible indications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/30 Semantic analysis
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24F AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00 Control or safety arrangements
    • F24F11/50 Control or safety arrangements characterised by user interfaces or communication
    • F24F11/56 Remote control
    • F24F11/58 Remote control using Internet communication
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24F AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00 Control or safety arrangements
    • F24F11/88 Electrical aspects, e.g. circuits
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24F AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00 Control or safety arrangements
    • F24F11/89 Arrangement or mounting of control or safety devices
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24F AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F2120/00 Control inputs relating to users or occupants
    • F24F2120/20 Feedback from users
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/20 Natural language analysis
    • G06F40/279 Recognition of textual entities

Definitions

  • The present disclosure relates to a control method and a control device of an electronic apparatus and, in particular, to a method and device for controlling an electronic apparatus using a user voice signal together with environmental information and an operating status of the electronic apparatus.
  • With the increasing intelligence of electronic terminals and the development of information network technology, people use voice interaction applications in more and more scenarios, for example, using a smart phone, a smart speaker or an intelligent electronic apparatus itself as an entrance, and interacting by voice and text with applications such as electrical appliances/weather/stocks/music/traffic conditions/alarms/reminders.
  • Each voice interaction application is also called a semantic skill.
  • However, voice interaction logic is variable and complex.
  • a well-designed voice system training process may improve a user's experience of interacting with the system and increase the user's trust in system services and functions.
  • For example, a manufacturer of smart phones and smart speakers that support voice training systems configures one or more voice inputs according to user settings and gives one or more corresponding responses (for example, voice dialogue/text dialogue/device control instructions/screen recording teachings, etc.).
  • However, under the existing voice and text training systems, users can only set each model one by one, which leads to cumbersome and repetitive operations.
  • the purpose of embodiments of the present disclosure is to provide a method and device for controlling an electronic apparatus.
  • a method for controlling an electronic apparatus including: acquiring a user voice signal; acquiring environmental information and an operating status of the electronic apparatus; determining a control command corresponding to the user voice signal based on the acquired environmental information and the operating status of the electronic apparatus; and controlling operation of the electronic apparatus by using the control command.
  • Before the determining of a control command corresponding to the user voice signal, the method further includes: recognizing the user voice signal through natural language understanding processing to determine an intention indicated by the user voice signal.
  • Before the determining of a control command corresponding to the user voice signal, the method further includes: determining whether the user voice signal matches a preset habit template in the electronic apparatus; and in response to the user voice signal matching the preset habit template, determining an intention indicated by the user voice signal from the preset habit template.
  • the step of determining a control command corresponding to the user voice signal based on the acquired environmental information and the operating status of the electronic apparatus includes: inputting the determined intention indicated by the user voice signal, the environmental information and the operating status of the electronic apparatus into a control command acquisition model to obtain the control command corresponding to the user voice signal.
  • The control command acquisition model is a model based on a convolutional neural network.
  • The convolutional neural network is a residual neural network.
  • The control command acquisition model is obtained by training on sample data or historical data.
  • The control command acquisition model is trained by: inputting a user intention in the sample data or the historical data, the environmental information and the operating status of the electronic apparatus into the control command acquisition model; and adjusting parameters of the control command acquisition model based on an output of the control command acquisition model and expected decision data.
  • the step of adjusting parameters of the control command acquisition model based on an output of the control command acquisition model and expected decision data includes: in response to the output of the control command acquisition model being consistent with the expected decision data, feeding back the output of the control command acquisition model to the control command acquisition model in a form of a positive example; and in response to the output of the control command acquisition model being inconsistent with the expected decision data, feeding back the output of the control command acquisition model to the control command acquisition model in a form of a negative example, to adjust the parameters of the control command acquisition model.
  • the electronic apparatus includes an air conditioner.
  • The environmental information of the electronic apparatus includes an indoor temperature and/or an outdoor temperature.
  • the operating status of the electronic apparatus includes a switch status, a cooling status, and/or a heating status.
  • a device for controlling an electronic apparatus including: a first acquisition module, configured to acquire a user voice signal; a second acquisition module, configured to acquire environmental information and an operating status of the electronic apparatus; a command determination module, configured to determine a control command corresponding to the user voice signal based on the acquired environmental information and the operating status of the electronic apparatus; and a controlling module, configured to control operation of the electronic apparatus by using the control command.
  • the device further includes: a recognition module, configured to recognize the user voice signal through natural language understanding processing to determine an intention indicated by the user voice signal, before the command determination module determines a control command corresponding to the user voice signal.
  • the device further includes: a recognition module, configured to determine whether the user voice signal matches a preset habit template in the electronic apparatus, before the command determination module determines a control command corresponding to the user voice signal, and in response to the user voice signal matching the preset habit template, determine an intention indicated by the user voice signal from the preset habit template.
  • the command determination module is configured to input the determined intention indicated by the user voice signal, the environmental information and the operating status of the electronic apparatus into a control command acquisition model to obtain the control command corresponding to the user voice signal.
  • The control command acquisition model is a model based on a convolutional neural network.
  • The convolutional neural network is a residual neural network.
  • The control command acquisition model is obtained by training on sample data or historical data.
  • The control command acquisition model is trained by: inputting a user intention in the sample data or the historical data, the environmental information and the operating status of the electronic apparatus into the control command acquisition model; and adjusting parameters of the control command acquisition model based on an output of the control command acquisition model and expected decision data.
  • the operation of adjusting parameters of the control command acquisition model based on an output of the control command acquisition model and expected decision data includes: in response to the output of the control command acquisition model being consistent with the expected decision data, feeding back the output of the control command acquisition model to the control command acquisition model in a form of a positive example; and in response to the output of the control command acquisition model being inconsistent with the expected decision data, feeding back the output of the control command acquisition model to the control command acquisition model in a form of a negative example, to adjust the parameters of the control command acquisition model.
  • the electronic apparatus includes an air conditioner.
  • The environmental information of the electronic apparatus includes an indoor temperature and/or an outdoor temperature.
  • the operating status of the electronic apparatus includes a switch status, a cooling status, and/or a heating status.
  • A computer readable storage medium storing computer programs is provided.
  • The computer programs, when executed by a processor, implement the method for controlling an electronic apparatus.
  • A computing apparatus is provided, including: a processor; and a memory storing a computer program, wherein the computer program, when executed by the processor, implements the method for controlling an electronic apparatus.
  • The accuracy of generating control commands based on voice signals can be improved, unnecessary operations for generating control commands can be reduced, and the efficiency of voice control can be improved.
  • The method and device for controlling an electronic apparatus can accurately generate a control signal for the electronic apparatus based on a user voice signal, environmental information and an operating status of the electronic apparatus, using only a single control command acquisition model. There is no need to configure different control command acquisition models for different environmental information and operating statuses of the electronic apparatus, which avoids complicated operations by users/manufacturers to set up various models and at the same time improves the service efficiency of the electronic apparatus.
  • Fig. 1 is a flowchart showing a method for controlling an electronic apparatus according to an embodiment of the present disclosure.
  • Fig. 2 is a block diagram showing a device for controlling an electronic apparatus according to an embodiment of the present disclosure.
  • Fig. 3 is a block diagram showing a computing apparatus according to an embodiment of the present disclosure.
  • Although the terms first, second and third may be used herein to describe various members, components, regions, layers or parts, these members, components, regions, layers or parts should not be restricted by these terms. Rather, these terms are only used to distinguish one member, component, region, layer or part from another. Therefore, without departing from the teaching of the examples, a first member, component, region, layer or part in the examples described herein may also be referred to as a second member, component, region, layer or part.
  • the following describes application scenarios of the method and device for controlling an electronic apparatus according to the present disclosure.
  • the method and device for controlling an electronic apparatus according to the present disclosure may be applied to an air conditioner, but is not limited thereto.
  • the method and device for controlling an electronic apparatus according to the present disclosure may be applied to a display, a speaker, and the like.
  • A plurality of sets of habits may be generated through artificial intelligence training on a set of habits, thereby saving the user from manually editing a plurality of sets of habits and improving service efficiency.
  • A temperature of a (living room) air conditioner may be raised (heating), or the air conditioner may be turned on (heating).
  • the (living room) air conditioner itself may raise its temperature or may be turned on based on the voice command acquisition model.
  • an intelligent central control may control the (living room) air conditioner to raise the temperature or control the air conditioner to start based on the voice command acquisition model.
  • However, such a command acquisition model is not suitable for summer: it can only control heating based on the user voice signal and cannot learn. When the season changes, the command acquisition model cannot switch between "heating" and "cooling".
  • To solve this, a new command acquisition model may be designed. Based on the new command acquisition model, in response to the user voice signal "the living room is cold", and based on environmental information (including an indoor temperature and an outdoor temperature) and an operating status (a switch status/a cooling status/a heating status) of the air conditioner, the temperature of the (living room) air conditioner may be raised (heating), the air conditioner may be turned on (heating), the set temperature of the air conditioner may be raised (cooling), or the air conditioner may be turned off (cooling).
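The season-dependent decision just described can be sketched as a small rule table. This is only an illustration: the function name, the 26°C "summer" threshold and the exact command strings below are assumptions, and the disclosure replaces such fixed rules with a trained command acquisition model.

```python
def decide(intent, indoor_temp, outdoor_temp, power_on, mode):
    """Hypothetical rule-based stand-in for the command acquisition model.

    Maps the intention "warming up" (e.g. "the living room is cold") plus the
    environment and operating status to a command. The 26 C outdoor threshold
    separating summer from winter is an illustrative assumption.
    """
    if intent != "warming up":
        return "no operation"
    if outdoor_temp >= 26:                      # summer: the room is over-cooled
        if power_on and mode == "cooling":
            return "raise the set temperature"  # weaken the cooling
        return "turn off the air conditioner"
    # winter: actually heat the room
    if power_on and mode == "heating":
        return "raise the set temperature"
    return "turn on the air conditioner (heating)"
```

The same utterance thus yields different commands depending on season and operating status, which is the behavior the new command acquisition model is meant to learn rather than hard-code.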
  • Fig. 1 is a flowchart showing a method for controlling an electronic apparatus according to an embodiment of the present disclosure.
  • the method for controlling an electronic apparatus according to an embodiment of the present disclosure may run on the electronic apparatus itself, or may run on an intelligent central control controlling the electronic apparatus, and the present disclosure does not impose any limitation in this regard.
  • a user voice signal may be acquired.
  • the user voice signal may be acquired through a microphone set in the electronic apparatus or the intelligent central control.
  • environmental information and an operating status of the electronic apparatus may be acquired.
  • the electronic apparatus may be an air conditioner, but is not limited thereto.
  • the electronic apparatus may be a display, a speaker, or the like.
  • the environmental information of the electronic apparatus includes an indoor temperature and/or an outdoor temperature, and the operating status of the electronic apparatus includes a switch status, a cooling status, and/or a heating status.
  • An execution order of step S110 and step S120 is not limited.
  • step S110 may be performed before step S120, may be performed after step S120, or may be performed simultaneously with step S120.
  • a control command corresponding to the user voice signal may be determined based on the acquired environmental information and the operating status of the electronic apparatus.
  • The method for controlling an electronic apparatus may include a step as follows: recognizing the user voice signal through natural language understanding (NLU) processing to determine an intention indicated by the user voice signal. For example, when the user voice signal is "it is very hot", it may be determined through NLU that an intention indicated by the user voice signal is "cooling down", and the intention is related to control of the electronic apparatus. However, when the user voice signal is "how is the weather today", it may be determined through NLU that an intention indicated by the user voice signal is "asking for the weather", and the intention has nothing to do with the control of the electronic apparatus.
  • The method for controlling an electronic apparatus may include steps as follows: determining whether the user voice signal matches a preset habit template in the electronic apparatus; and in response to the user voice signal matching the preset habit template, determining an intention indicated by the user voice signal from the preset habit template. For example, a user may pre-set a habit template "Aoligei" to be associated with an intention "cooling down". When the user voice signal is "Aoligei", it may be determined that the user voice signal matches the preset habit template in the electronic apparatus, and the intention "cooling down" indicated by the user voice signal may be determined from the preset habit template. However, when the user voice signal is "Fighting", it may be determined that the user voice signal does not match the preset habit template in the electronic apparatus, so the intention indicated by the user voice signal cannot be acquired.
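The habit-template lookup can be sketched as a plain dictionary match. The store layout, function name and None fallback are illustrative assumptions, not prescribed by the disclosure; only the "Aoligei" → "cooling down" pairing comes from the example above.

```python
# Hypothetical habit-template store: each user-defined phrase is mapped to a
# preset intention (the "Aoligei" entry follows the disclosure's example).
HABIT_TEMPLATES = {"Aoligei": "cooling down"}

def match_habit_template(voice_text):
    """Return the preset intention when the utterance matches a habit
    template, otherwise None (the signal can then fall through to NLU)."""
    return HABIT_TEMPLATES.get(voice_text)
```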
  • After determining the intention indicated by the user voice signal, in step S130, the determined intention, the environmental information and the operating status of the electronic apparatus may be input into a control command acquisition model to obtain the control command corresponding to the user voice signal. In step S140, the obtained control command may be used to control operation of the electronic apparatus.
  • For example, the control command "turn off the air conditioner" or "set the air conditioner temperature to 26 degrees (°C)" corresponding to the user voice signal may be obtained.
  • For example, when the indoor temperature is 30 degrees (°C), the outdoor temperature is 1 degree (°C), and the operating status of the electronic apparatus is "heating", the control command "turn off the air conditioner" or "set the air conditioner temperature to 22 degrees (°C)" corresponding to the user voice signal may be obtained.
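Putting the intention step and step S130 together, the inference path might look like the following sketch. The toy NLU table and the stub model are purely illustrative stand-ins for the real NLU processing and the trained CNN/ResNet model.

```python
def nlu(text):
    """Toy NLU stand-in: map a phrase to an intention (illustrative only)."""
    return {"it is very hot": "cooling down"}.get(text)

class StubCommandModel:
    """Stand-in for the control command acquisition model (a real
    implementation would be a trained CNN/ResNet classifier)."""
    def predict(self, intent, indoor_c, outdoor_c, power_on, mode):
        if intent == "cooling down" and mode == "heating":
            return "turn off the air conditioner"
        return "set the air conditioner temperature to 26 degrees"

def handle_utterance(voice_text, indoor_c, outdoor_c, power_on, mode, model):
    intent = nlu(voice_text)              # determine the intention before S130
    if intent is None:
        return None                       # intention unrelated to apparatus control
    # step S130: intention + environment + operating status -> control command
    return model.predict(intent, indoor_c, outdoor_c, power_on, mode)
```

An utterance whose intention has nothing to do with the apparatus (e.g. asking about the weather) short-circuits before the model is consulted, matching the NLU example above.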
  • The control command acquisition model may be a model based on a convolutional neural network (CNN), and the convolutional neural network may be a residual neural network (ResNet), but is not limited thereto.
  • the following describes a training process of the control command acquisition model.
  • The control command acquisition model is obtained by training on sample data or historical data. To this end, first, historical data including a user intention, the environmental information and the operating status of the electronic apparatus may be collected, or sample data including a user intention, the environmental information and the operating status of the electronic apparatus may be set. Then, the user intention, the environmental information and the operating status of the electronic apparatus in the sample data or the historical data are input into the control command acquisition model. Finally, parameters of the control command acquisition model are adjusted based on an output of the control command acquisition model and expected decision data. Optionally, the expected decision data, as an expected output of the control command acquisition model, may be preset.
  • When the output of the control command acquisition model is consistent with the expected decision data, the output may be fed back to the control command acquisition model in the form of a positive example to reinforce the currently set parameters of the control command acquisition model.
  • When the output of the control command acquisition model is inconsistent with the expected decision data, the output may be fed back to the control command acquisition model in the form of a negative example to adjust the parameters of the control command acquisition model.
  • Through such training, optimal parameters of the control command acquisition model may finally be determined.
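The positive/negative feedback loop described above can be sketched as follows. A toy nearest-prototype "model" stands in for the CNN/ResNet; only the feedback rule, reinforce on a match with the expected decision data and correct on a mismatch, is the point of the sketch, and all names and the learning rate are assumptions.

```python
class ToyCommandModel:
    """Nearest-prototype stand-in for the CNN: it keeps one feature vector
    per command and predicts the command whose prototype is closest."""
    def __init__(self):
        self.prototypes = {}                    # command -> feature vector

    def predict(self, features):
        if not self.prototypes:
            return None
        return min(self.prototypes, key=lambda c: sum(
            (a - b) ** 2 for a, b in zip(features, self.prototypes[c])))

    def feed_back(self, features, command, positive, lr=0.3):
        proto = self.prototypes.setdefault(command, list(features))
        step = lr if positive else 1.0          # correct harder on negative examples
        for i, f in enumerate(features):
            proto[i] += step * (f - proto[i])   # pull the prototype toward the sample

def train_step(model, features, expected_command):
    """One training sample: compare the output with the expected decision data."""
    output = model.predict(features)
    if output == expected_command:
        model.feed_back(features, output, positive=True)             # positive example
    else:
        model.feed_back(features, expected_command, positive=False)  # negative example
```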
  • The above training process may be performed by a manufacturer of the electronic apparatus in a manufacturing/testing phase of the electronic apparatus, or may be performed by a user of the electronic apparatus while using the electronic apparatus.
  • For example, a sample intention or a preset intention (such as "warming up"), the operating status of the air conditioner (such as "cooling"), the indoor temperature (such as 22°C) and the outdoor temperature (such as 22°C) may be input into the control command acquisition model.
  • The control command acquisition model may classify the operating status and a set temperature of the air conditioner through ResNet based on the input sample intention or preset intention, the operating status of the air conditioner, the indoor temperature and the outdoor temperature, and output the operating status and the set temperature of the air conditioner.
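The model input just described can be sketched as a flat feature vector. Only the encoding step is shown here; the intention and mode vocabularies and the temperature scaling are assumptions, and in the disclosure the encoded vector would feed a ResNet that classifies the target operating status and set temperature.

```python
INTENTS = ["warming up", "cooling down"]   # illustrative intention vocabulary
MODES = ["off", "cooling", "heating"]      # illustrative operating statuses

def encode(intent, mode, indoor_c, outdoor_c):
    """One-hot the categorical inputs and scale the temperatures to [0, 1].

    Hypothetical input encoding for the control command acquisition model;
    the 40 C normalization constant is an assumption.
    """
    one_hot = lambda item, vocab: [1.0 if item == v else 0.0 for v in vocab]
    return (one_hot(intent, INTENTS)
            + one_hot(mode, MODES)
            + [indoor_c / 40.0, outdoor_c / 40.0])
```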
  • Fig. 2 is a block diagram showing a device for controlling an electronic apparatus according to an embodiment of the present disclosure.
  • the device for controlling an electronic apparatus according to an embodiment of the present disclosure may be set on the electronic apparatus, or may be set on an intelligent central control controlling the electronic apparatus, and the present disclosure does not impose any limitation in this regard.
  • a device 200 for controlling an electronic apparatus may include a first acquisition module 210, a second acquisition module 220, a command determination module 230 and a controlling module 240.
  • the first acquisition module 210 may be configured to acquire a user voice signal.
  • the second acquisition module 220 may be configured to acquire environmental information and an operating status of the electronic apparatus.
  • the command determination module 230 may be configured to determine a control command corresponding to the user voice signal based on the acquired environmental information and the operating status of the electronic apparatus.
  • the controlling module 240 may be configured to control operation of the electronic apparatus by using the control command.
  • the electronic apparatus may be an air conditioner, but is not limited thereto.
  • the electronic apparatus may be a display, a speaker, or the like.
  • the environmental information of the electronic apparatus includes an indoor temperature and/or an outdoor temperature, and the operating status of the electronic apparatus includes a switch status, a cooling status, and/or a heating status.
  • the environmental information of the electronic apparatus may be acquired through various sensors set in the electronic apparatus or the intelligent central control, and the operating status of the electronic apparatus may be acquired through a controller/processor of the electronic apparatus.
  • the device 200 for controlling an electronic apparatus may further include a recognition module (not shown).
  • the recognition module may be configured to recognize the user voice signal through natural language understanding processing to determine an intention indicated by the user voice signal, before the command determination module determines a control command corresponding to the user voice signal, or determine whether the user voice signal matches a preset habit template in the electronic apparatus, before the command determination module determines a control command corresponding to the user voice signal, and in response to the user voice signal matching the preset habit template, determine an intention indicated by the user voice signal from the preset habit template.
  • the command determination module 230 may be configured to input the determined intention indicated by the user voice signal, the environmental information and the operating status of the electronic apparatus into a control command acquisition model to obtain the control command corresponding to the user voice signal.
  • The control command acquisition model may be a model based on a convolutional neural network (CNN), and the convolutional neural network may be a residual neural network (ResNet), but is not limited thereto.
  • the control command acquisition model may be obtained through training sample data or historical data. A user intention in the sample data or the historical data, the environmental information and the operating status of the electronic apparatus may be input into the control command acquisition model, then parameters of the control command acquisition model may be adjusted based on an output of the control command acquisition model and expected decision data.
  • When the output of the control command acquisition model is consistent with the expected decision data, the output may be fed back to the control command acquisition model in the form of a positive example; when the output is inconsistent with the expected decision data, the output may be fed back to the control command acquisition model in the form of a negative example to adjust the parameters of the control command acquisition model.
  • Fig. 3 is a block diagram showing a computing apparatus according to an embodiment of the present disclosure.
  • a computing apparatus 300 may include a processor 310 and a memory 320.
  • the processor 310 may include (but is not limited to) a central processing unit (CPU), a digital signal processor (DSP), a microcomputer, a field programmable gate array (FPGA), a system on chip (SoC), a microprocessor, an application specific integrated circuit (ASIC), etc.
  • the memory 320 stores computer programs to be executed by the processor 310.
  • the memory 320 includes a high-speed random access memory and/or a non-volatile computer readable storage medium.
  • the computing apparatus 300 may communicate with the electronic apparatus or an intelligent central control controlling the electronic apparatus in a wired/wireless communication.
  • the method for controlling an electronic apparatus may be written as computer programs and stored on the computer readable storage medium.
  • When the computer programs are executed by the processor, the method for controlling an electronic apparatus as described above may be implemented.
  • Examples of the computer readable storage medium include: read only memory (ROM), programmable read only memory (PROM), electrically erasable programmable read only memory (EEPROM), random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), flash memory, non-volatile memory, CD-ROM, CD-R, CD+R, CD-RW, CD+RW, DVD-ROM, DVD-R, DVD+R, DVD-RW, DVD+RW, DVD-RAM, BD-ROM, BD-R, BD-R LTH, BD-RE, Blu-ray or CD storage, hard disk drive (HDD), solid state drive (SSD), card storage (such as multimedia cards, secure digital (SD) cards, or extreme digital (XD) cards), magnetic tape, floppy disk, magneto-optical data storage devices, and any other apparatuses.
  • Such other apparatuses are configured to store the computer programs and any associated data, data files and data structures in a non-transitory manner and provide the computer programs and any associated data, data files and data structures to the processor or computer, enabling the processor or computer to execute the computer programs.
  • The computer programs and any associated data, data files and data structures may be distributed on networked computer systems, so that the computer programs and any associated data, data files and data structures are stored, accessed, and executed in a distributed manner through one or more processors or computers.
  • The method and device for controlling an electronic apparatus can accurately generate a control signal for the electronic apparatus based on a user voice signal, environmental information and an operating status of the electronic apparatus, using only a single control command acquisition model. There is no need to configure different control command acquisition models for different environmental information and operating statuses of the electronic apparatus, which avoids complicated operations by users/manufacturers to set up various models and at the same time improves the service efficiency of the electronic apparatus.

Abstract

A method and device for controlling an electronic apparatus are disclosed. The method includes: acquiring a user voice signal; acquiring environmental information and an operating status of the electronic apparatus; determining a control command corresponding to the user voice signal based on the acquired environmental information and the operating status of the electronic apparatus; and controlling operation of the electronic apparatus by using the control command.

Description

METHOD AND DEVICE FOR CONTROLLING ELECTRONIC APPARATUS
The present disclosure relates to a control method and a control device of an electronic apparatus, in particular, to a method and device for controlling an electronic apparatus using a user voice signal and environmental information and an operating status of the electronic apparatus.
With the increasing intelligence of electronic terminals and the development of information network technology, people use voice interaction applications in more and more scenarios. For example, using a smart phone, a smart speaker, or an intelligent electronic apparatus itself as an entry point, users interact through voice and text with applications such as electrical appliance control, weather, stocks, music, traffic conditions, alarms, and reminders. Here, each voice interaction application is also called a semantic skill.
However, voice interaction logic is variable and complex. A well-designed voice system training process may improve a user's experience of interacting with the system and increase the user's trust in system services and functions. For example, a manufacturer of smart phones and smart speakers that support voice training systems may register one or more voice inputs according to user settings and associate them with one or more corresponding responses (for example, voice dialogues, text dialogues, device control instructions, screen recording tutorials, etc.). However, under existing voice and text training systems, users can only set each model one by one, which leads to cumbersome and repetitive operations.
The purpose of embodiments of the present disclosure is to provide a method and device for controlling an electronic apparatus.
According to an aspect of the embodiments of the present disclosure, a method for controlling an electronic apparatus is provided, the method including: acquiring a user voice signal; acquiring environmental information and an operating status of the electronic apparatus; determining a control command corresponding to the user voice signal based on the acquired environmental information and the operating status of the electronic apparatus; and controlling operation of the electronic apparatus by using the control command.
Optionally, before the determining a control command corresponding to the user voice signal, the method further includes: recognizing the user voice signal through natural language understanding processing to determine an intention indicated by the user voice signal.
Optionally, before the determining a control command corresponding to the user voice signal, the method further includes: determining whether the user voice signal matches a preset habit template in the electronic apparatus; and in response to the user voice signal matching the preset habit template, determining an intention indicated by the user voice signal from the preset habit template.
Optionally, the step of determining a control command corresponding to the user voice signal based on the acquired environmental information and the operating status of the electronic apparatus, includes: inputting the determined intention indicated by the user voice signal, the environmental information and the operating status of the electronic apparatus into a control command acquisition model to obtain the control command corresponding to the user voice signal.
Optionally, the control command acquisition model is a model based on a convolutional neural network.
Optionally, the convolutional neural network is a residual neural network.
Optionally, the control command acquisition model is obtained through training sample data or historical data.
Optionally, the control command acquisition model is trained by: inputting a user intention in the sample data or the historical data, the environmental information and the operating status of the electronic apparatus into the control command acquisition model; and adjusting parameters of the control command acquisition model based on an output of the control command acquisition model and expected decision data.
Optionally, the step of adjusting parameters of the control command acquisition model based on an output of the control command acquisition model and expected decision data, includes: in response to the output of the control command acquisition model being consistent with the expected decision data, feeding back the output of the control command acquisition model to the control command acquisition model in a form of a positive example; and in response to the output of the control command acquisition model being inconsistent with the expected decision data, feeding back the output of the control command acquisition model to the control command acquisition model in a form of a negative example, to adjust the parameters of the control command acquisition model.
Optionally, the electronic apparatus includes an air conditioner.
Optionally, the environmental information of the electronic apparatus includes an indoor temperature and/or an outdoor temperature, and/or the operating status of the electronic apparatus includes a switch status, a cooling status, and/or a heating status.
According to another aspect of the embodiments of the present disclosure, a device for controlling an electronic apparatus is provided, the device including: a first acquisition module, configured to acquire a user voice signal; a second acquisition module, configured to acquire environmental information and an operating status of the electronic apparatus; a command determination module, configured to determine a control command corresponding to the user voice signal based on the acquired environmental information and the operating status of the electronic apparatus; and a controlling module, configured to control operation of the electronic apparatus by using the control command.
Optionally, the device further includes: a recognition module, configured to recognize the user voice signal through natural language understanding processing to determine an intention indicated by the user voice signal, before the command determination module determines a control command corresponding to the user voice signal.
Optionally, the device further includes: a recognition module, configured to determine whether the user voice signal matches a preset habit template in the electronic apparatus, before the command determination module determines a control command corresponding to the user voice signal, and in response to the user voice signal matching the preset habit template, determine an intention indicated by the user voice signal from the preset habit template.
Optionally, the command determination module is configured to input the determined intention indicated by the user voice signal, the environmental information and the operating status of the electronic apparatus into a control command acquisition model to obtain the control command corresponding to the user voice signal.
Optionally, the control command acquisition model is a model based on a convolutional neural network.
Optionally, the convolutional neural network is a residual neural network.
Optionally, the control command acquisition model is obtained through training sample data or historical data.
Optionally, the control command acquisition model is trained by: inputting a user intention in the sample data or the historical data, the environmental information and the operating status of the electronic apparatus into the control command acquisition model; and adjusting parameters of the control command acquisition model based on an output of the control command acquisition model and expected decision data.
Optionally, the operation of adjusting parameters of the control command acquisition model based on an output of the control command acquisition model and expected decision data, includes: in response to the output of the control command acquisition model being consistent with the expected decision data, feeding back the output of the control command acquisition model to the control command acquisition model in a form of a positive example; and in response to the output of the control command acquisition model being inconsistent with the expected decision data, feeding back the output of the control command acquisition model to the control command acquisition model in a form of a negative example, to adjust the parameters of the control command acquisition model.
Optionally, the electronic apparatus includes an air conditioner.
Optionally, the environmental information of the electronic apparatus includes an indoor temperature and/or an outdoor temperature, and/or the operating status of the electronic apparatus includes a switch status, a cooling status, and/or a heating status.
According to another aspect of the embodiments of the present disclosure, a computer readable storage medium storing computer programs is provided, the computer programs, when executed by a processor, implement the method for controlling an electronic apparatus.
According to another aspect of the present disclosure, a computing apparatus is provided, the computing apparatus including: a processor; and a memory, storing a computer program, the computer program, when executed by the processor, implements the method for controlling an electronic apparatus.
By using the method and device, the accuracy of generating control commands based on voice signals can be improved, unnecessary operations for generating control commands can be reduced, and the efficiency of voice control can be improved.
The method and device for controlling an electronic apparatus according to the embodiments of the present disclosure, can accurately generate a control signal of the electronic apparatus based on a user voice signal and environmental information and an operating status of the electronic apparatus, and simply based on a control command acquisition model. There is no need to configure different control command acquisition models for different environmental information and operating status of the electronic apparatus, thereby avoiding complicated operations of users/manufacturers to set various models, and at the same time improving service efficiency of the electronic apparatus.
The above and other objectives and features of embodiments of the present disclosure will become clearer from the following description in conjunction with the accompanying drawings exemplarily showing the embodiments, in which:
Fig. 1 is a flowchart showing a method for controlling an electronic apparatus according to an embodiment of the present disclosure;
Fig. 2 is a block diagram showing a device for controlling an electronic apparatus according to an embodiment of the present disclosure; and
Fig. 3 is a block diagram showing a computing apparatus according to an embodiment of the present disclosure.
The following specific implementations are provided to help readers gain a comprehensive understanding of the methods, devices, and/or systems described herein. However, after understanding the disclosure of the present application, various changes, modifications, and equivalents of the methods, devices, and/or systems described herein will be clear. For example, the orders of operations described herein are only examples and are not limited to those set forth herein; except for operations that must occur in a specific order, the order may be changed, as will be clear after understanding the disclosure of the present application. In addition, for clarity and conciseness, descriptions of features known in the art may be omitted.
The features described herein may be implemented in different forms and should not be construed as being limited to the examples described herein. On the contrary, the examples described herein have been provided to show only some of the many feasible methods for implementing the methods, devices, and/or systems described herein, which will be clear after understanding the disclosure of the present application.
As used herein, the term "and/or" includes any one of the associated listed items and any combination of any two or more.
Although terms such as "first", "second" and "third" may be used herein to describe various members, components, regions, layers or parts, these members, components, regions, layers or parts should not be restricted by these terms. On the contrary, these terms are only used to distinguish one member, component, region, layer or part from another member, component, region, layer or part. Therefore, without departing from the teaching of the examples, a first member, a first component, a first region, a first layer, or a first part in the examples described herein may also be referred to as a second member, a second component, a second region, a second layer or a second part.
In the specification, when an element (such as a layer, region, or substrate) is described as being "on", "connected to", or "coupled to" another element, the element may be directly "on" the other element, directly "connected to" or "coupled to" another element, or there may be one or more other elements intervening there between. In contrast, when an element is described as being "directly on", "directly connected to" or "directly coupled to" another element, there may be no intervening elements.
The terms used herein are only used to describe various examples, and are not intended to limit the present disclosure. Unless the context clearly indicates otherwise, the singular form is intended to include the plural form. The terms "including", "comprising" and "having" indicate the existence of the features, numbers, operations, members, elements, and/or combinations thereof, but do not exclude the existence or addition of one or more other features, numbers, operations, members, elements and/or combinations thereof.
Unless defined otherwise, all terms used herein (including technical terms and scientific terms) have the same meaning as commonly understood by those skilled in the art to which the present disclosure belongs after understanding the present disclosure. Unless clearly defined as such herein, terms (such as those defined in the general dictionaries) should be interpreted as having a meaning consistent with their contextual meaning in the related field and in the present disclosure, and should not be interpreted as having ideal or excessive formal meaning.
In addition, in the description of the examples, when it is considered that a detailed description of a well-known related structure or function will cause a vague interpretation of the present disclosure, such detailed description will be omitted.
The following describes application scenarios of the method and device for controlling an electronic apparatus according to the present disclosure. The method and device for controlling an electronic apparatus according to the present disclosure may be applied to an air conditioner, but is not limited thereto. For example, the method and device for controlling an electronic apparatus according to the present disclosure may be applied to a display, a speaker, and the like.
In real life, voice control often involves a plurality of sets of training voice texts and corresponding instructions. In particular, when an intelligent electronic apparatus serves the same user, that user's voice habits are often regular. Given this regularity, a plurality of sets of habits may be generated through artificial intelligence training on a single set of habits, thereby sparing the user from manually editing a plurality of sets of habits and improving service efficiency.
According to the prior art, in winter, in response to a user voice signal "the living room is cold", based on a voice command acquisition model, a temperature of a (living room) air conditioner may be raised (heating), or the air conditioner may be turned on (heating). For example, the (living room) air conditioner itself may raise its temperature or may be turned on based on the voice command acquisition model. In addition, an intelligent central control may control the (living room) air conditioner to raise the temperature or to start based on the voice command acquisition model. However, such a command acquisition model is not suitable for summer: it can only control heating based on the user voice signal and cannot learn. When the season changes, the command acquisition model cannot switch between "heating" and "cooling".
According to the present disclosure, a new command acquisition model may be designed. Based on the new command acquisition model, in response to the user voice signal "the living room is cold", based on environmental information (including an indoor temperature and an outdoor temperature) and an operating status (a switch status/a cooling status/a heating status) of an environment of the air conditioner, the temperature of the (living room) air conditioner may be raised (heating) or the air conditioner may be turned on (heating), or the temperature of the air conditioner may be raised (cooling) or the air conditioner may be turned off (cooling).
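As an illustrative sketch, the disambiguation described above can be pictured as a small rule table keyed on the intention, the operating status, and the indoor/outdoor temperatures. The function and rules below are hypothetical stand-ins for the trained model of the disclosure, which learns this mapping rather than hard-coding it:

```python
def resolve_command(intention, status, indoor_c, outdoor_c):
    """Map a recognized intention plus context to an air-conditioner command.

    Hypothetical rule table for illustration only; the disclosure uses a
    trained control command acquisition model rather than fixed rules.
    """
    if intention == "warming up":
        if status == "cooling":
            # Summer case: the user is cold because cooling is too aggressive,
            # so raise the set temperature (or stop cooling entirely).
            return "raise set temperature" if indoor_c < outdoor_c else "turn off"
        return "turn on heating" if status == "off" else "raise set temperature"
    if intention == "cooling down":
        if status == "heating":
            # Winter case: the user is hot because heating is too aggressive.
            return "turn off" if indoor_c > outdoor_c else "lower set temperature"
        return "turn on cooling" if status == "off" else "lower set temperature"
    return "no operation"
```

With the same utterance "the living room is cold", the context steers the outcome: in winter (status "off" or "heating") the sketch heats, while in summer (status "cooling", indoor cooler than outdoor) it merely raises the set temperature.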
Hereinafter, the method and device for controlling an electronic apparatus according to the embodiments of the present disclosure will be described with reference to Fig. 1 and Fig. 2.
Fig. 1 is a flowchart showing a method for controlling an electronic apparatus according to an embodiment of the present disclosure. The method for controlling an electronic apparatus according to an embodiment of the present disclosure may run on the electronic apparatus itself, or may run on an intelligent central control controlling the electronic apparatus, and the present disclosure does not impose any limitation in this regard.
Referring to Fig. 1, in step S110, a user voice signal may be acquired. For example, the user voice signal may be acquired through a microphone set in the electronic apparatus or the intelligent central control. In step S120, environmental information and an operating status of the electronic apparatus may be acquired. As described above, the electronic apparatus may be an air conditioner, but is not limited thereto. For example, the electronic apparatus may be a display, a speaker, or the like. The environmental information of the electronic apparatus includes an indoor temperature and/or an outdoor temperature, and the operating status of the electronic apparatus includes a switch status, a cooling status, and/or a heating status. For example, the environmental information of the electronic apparatus may be acquired through various sensors set in the electronic apparatus or the intelligent central control, and the operating status of the electronic apparatus may be acquired through a controller/processor of the electronic apparatus. The order of step S110 and step S120 is not limited. For example, step S110 may be performed before step S120, may be performed after step S120, or may be performed simultaneously with step S120.
Next, in step S130, a control command corresponding to the user voice signal may be determined based on the acquired environmental information and the operating status of the electronic apparatus.
In order to determine the control command corresponding to the user voice signal, before step S120, the method for controlling an electronic apparatus according to an embodiment of the present disclosure may include a step as follows: recognizing the user voice signal through natural language understanding processing (NLU) to determine an intention indicated by the user voice signal. For example, when the user voice signal is "it is very hot", it may be determined through NLU that an intention indicated by the user voice signal is "cooling down", and the intention is related to control of the electronic apparatus. However, when the user voice signal is "how is the weather today", it may be determined through NLU that an intention indicated by the user voice signal is "asking for the weather", and the intention has nothing to do with the control of the electronic apparatus.
In another aspect, in order to determine the control command corresponding to the user voice signal, before step S120, the method for controlling an electronic apparatus according to an embodiment of the present disclosure may include steps as follows: determining whether the user voice signal matches a preset habit template in the electronic apparatus; and in response to the user voice signal matching the preset habit template, determining an intention indicated by the user voice signal from the preset habit template. For example, a user may pre-set a habit template "Aoligei" to be associated with an intention "cooling down". When the user voice signal is "Aoligei", it may be determined that the user voice signal matches the preset habit template in the electronic apparatus, and the intention "cooling down" indicated by the user voice signal may be determined from the preset habit template. However, when the user voice signal is "Fighting", it may be determined that the user voice signal does not match the preset habit template in the electronic apparatus, so that the intention indicated by the user voice signal cannot be acquired.
After determining the intention indicated by the user voice signal, in step S130, the determined intention indicated by the user voice signal, the environmental information and the operating status of the electronic apparatus may be input into a control command acquisition model to obtain the control command corresponding to the user voice signal. In step S140, the obtained control command may be used to control operation of the electronic apparatus.
For example, suppose the user voice signal is "the living room is cold", the intention indicated by the voice signal is determined to be "warming up", the indoor temperature is 22 degrees (℃), the outdoor temperature is 32 degrees (℃), and the operating status of the electronic apparatus is "cooling". Through the control command acquisition model, the control command "turn off the air conditioner" or "set the air conditioner temperature to 26 degrees (℃)" corresponding to the user voice signal may be obtained. As another example, based on the user voice signal "too hot", the determined intention "cooling down", an indoor temperature of 30 degrees (℃), an outdoor temperature of 1 degree (℃), and the operating status "heating", the control command "turn off the air conditioner" or "set the air conditioner temperature to 22 degrees (℃)" corresponding to the user voice signal may be obtained through the control command acquisition model.
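Steps S110 through S140 can be sketched as one control pass. All interfaces below (the sensor and apparatus stubs, and the `predict` callable standing in for the control command acquisition model) are hypothetical:

```python
class FakeSensors:
    """Stub for the environmental sensors read in step S120."""
    def read(self):
        return {"indoor_c": 22, "outdoor_c": 32}

class FakeAirConditioner:
    """Stub for the controlled apparatus; records the last command applied."""
    def __init__(self):
        self.status = "cooling"
        self.last_command = None
    def operating_status(self):
        return self.status
    def execute(self, command):
        self.last_command = command

def control_step(intention, sensors, apparatus, predict):
    """One pass of steps S110-S140 (intention already extracted from voice)."""
    if intention is None:
        return None                               # utterance unrelated to control
    env = sensors.read()                          # S120: environmental information
    status = apparatus.operating_status()         # S120: operating status
    command = predict(intention, env, status)     # S130: control command model
    apparatus.execute(command)                    # S140: apply the command
    return command
```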
According to an embodiment of the present disclosure, the control command acquisition model may be a model based on a convolutional neural network (CNN), and the convolutional neural network may be a residual neural network (Resnet), but is not limited thereto. Those skilled in the art may configure the CNN according to actual needs, and detailed description thereof will be omitted.
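The defining feature of a residual network is that each block adds its input back onto the transformed output (output = F(x) + x), which eases training of deep networks. A stdlib-only sketch of that one idea (not the disclosed architecture):

```python
def residual_block(x, transform):
    """Core idea of a ResNet-style layer: output = F(x) + x, elementwise.

    `transform` stands in for the block's learned function F; here it is
    any callable mapping a list of floats to a list of the same length.
    """
    return [fx + xi for fx, xi in zip(transform(x), x)]
```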
The following describes a training process of the control command acquisition model.
The control command acquisition model is obtained by training on sample data or historical data. To this end, first, historical data including a user intention, the environmental information, and the operating status of the electronic apparatus may be collected, or sample data including a user intention, the environmental information, and the operating status of the electronic apparatus may be set. Then, the user intention, the environmental information, and the operating status of the electronic apparatus in the sample data or the historical data are input into the control command acquisition model. Finally, parameters of the control command acquisition model are adjusted based on an output of the control command acquisition model and expected decision data. Optionally, the expected decision data, as an expected output of the control command acquisition model, may be preset. When the output of the control command acquisition model is consistent with the expected decision data, the output may be fed back to the control command acquisition model in the form of a positive example to reinforce the currently set parameters of the control command acquisition model. On the other hand, when the output of the control command acquisition model is inconsistent with the expected decision data, the output may be fed back to the control command acquisition model in the form of a negative example to adjust the parameters of the control command acquisition model. By repeating the above training process, optimal parameters of the control command acquisition model may be finally determined. Optionally, the above training process may be performed by a manufacturer of the electronic apparatus in a manufacturing/testing phase of the electronic apparatus, or may be performed by a user of the electronic apparatus while using the electronic apparatus.
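The feedback scheme above can be sketched with a toy stand-in for the model. The lookup-table "model" below is hypothetical and deliberately trivial (the disclosure uses a CNN/ResNet); it only illustrates the loop of comparing the output against expected decision data and adjusting on mismatches:

```python
class TableModel:
    """Toy stand-in for the control command acquisition model: a lookup
    table nudged toward expected decisions. Not an actual CNN/ResNet."""
    def __init__(self):
        self.table = {}
    def predict(self, intention, status):
        return self.table.get((intention, status), "no operation")
    def adjust(self, intention, status, expected):
        # Negative-example update: move the output toward the expected decision.
        self.table[(intention, status)] = expected

def train(model, samples, epochs=3):
    """Repeat the compare-and-feed-back cycle over the sample data."""
    for _ in range(epochs):
        for intention, status, expected in samples:
            if model.predict(intention, status) != expected:
                model.adjust(intention, status, expected)  # negative example
            # A matching output would be fed back as a positive example,
            # reinforcing the currently set parameters.
```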
For example, a sample intention or a preset intention (such as "warming up") and the operating status of the air conditioner (such as "cooling") may be input into the control command acquisition model, and the indoor temperature (such as 22℃) and the outdoor temperature (such as 22℃) may be input into the control command acquisition model. The control command acquisition model may classify the operating status and a set temperature of the air conditioner through Resnet based on the input sample intention or preset intention, the operating status of the air conditioner, the indoor temperature and the outdoor temperature, and output the operating status and the set temperature of the air conditioner. When the operating status and the set temperature of the air conditioner (for example, "cooling" and "26℃") output by the control command acquisition model are consistent with the expected decision data, the operating status and the set temperature of the air conditioner output by the control command acquisition model are fed back to the control command acquisition model in the form of a positive example to strengthen the parameters of the control command acquisition model currently set. However, when the operating status and the set temperature of the air conditioner (for example, "cooling" and "24℃") output by the control command acquisition model are inconsistent with the expected decision data, the operating status and the set temperature of the air conditioner output by the control command acquisition model are fed back to the control command acquisition model in the form of a negative example to adjust the parameters of the control command acquisition model.
Using the above method for controlling an electronic apparatus according to an embodiment of the present disclosure, there is no need to configure different control command acquisition models for different environmental information and operating status of the electronic apparatus, avoiding complicated operations of users/manufacturers to set various models, and at the same time improving service efficiency of the electronic apparatus.
Fig. 2 is a block diagram showing a device for controlling an electronic apparatus according to an embodiment of the present disclosure. The device for controlling an electronic apparatus according to an embodiment of the present disclosure may be set on the electronic apparatus, or may be set on an intelligent central control controlling the electronic apparatus, and the present disclosure does not impose any limitation in this regard.
Referring to Fig. 2, a device 200 for controlling an electronic apparatus according to an embodiment of the present disclosure may include a first acquisition module 210, a second acquisition module 220, a command determination module 230 and a controlling module 240. The first acquisition module 210 may be configured to acquire a user voice signal. The second acquisition module 220 may be configured to acquire environmental information and an operating status of the electronic apparatus. The command determination module 230 may be configured to determine a control command corresponding to the user voice signal based on the acquired environmental information and the operating status of the electronic apparatus. The controlling module 240 may be configured to control operation of the electronic apparatus by using the control command. As described above, the electronic apparatus may be an air conditioner, but is not limited thereto. For example, the electronic apparatus may be a display, a speaker, or the like. The environmental information of the electronic apparatus includes an indoor temperature and/or an outdoor temperature, and the operating status of the electronic apparatus includes a switch status, a cooling status, and/or a heating status. For example, the environmental information of the electronic apparatus may be acquired through various sensors set in the electronic apparatus or the intelligent central control, and the operating status of the electronic apparatus may be acquired through a controller/processor of the electronic apparatus.
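The module layout of device 200 can be sketched as a class composing the four modules; the constructor arguments stand in for modules 210 through 240 and all interfaces here are hypothetical:

```python
class ControlDevice:
    """Sketch of device 200 (Fig. 2); each argument stands in for a module."""
    def __init__(self, acquire_voice, acquire_context, determine_command, control):
        self.acquire_voice = acquire_voice          # first acquisition module 210
        self.acquire_context = acquire_context      # second acquisition module 220
        self.determine_command = determine_command  # command determination module 230
        self.control = control                      # controlling module 240

    def run_once(self):
        voice = self.acquire_voice()
        env, status = self.acquire_context()
        command = self.determine_command(voice, env, status)
        self.control(command)
        return command
```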
The device 200 for controlling an electronic apparatus according to an embodiment of the present disclosure may further include a recognition module (not shown). The recognition module may be configured to recognize the user voice signal through natural language understanding processing to determine an intention indicated by the user voice signal, before the command determination module determines a control command corresponding to the user voice signal, or determine whether the user voice signal matches a preset habit template in the electronic apparatus, before the command determination module determines a control command corresponding to the user voice signal, and in response to the user voice signal matching the preset habit template, determine an intention indicated by the user voice signal from the preset habit template. Then, the command determination module 230 may be configured to input the determined intention indicated by the user voice signal, the environmental information and the operating status of the electronic apparatus into a control command acquisition model to obtain the control command corresponding to the user voice signal.
According to an embodiment of the present disclosure, the control command acquisition model may be a model based on a convolutional neural network (CNN), and the convolutional neural network may be a residual neural network (Resnet), but is not limited thereto. The control command acquisition model may be obtained through training sample data or historical data. A user intention in the sample data or the historical data, the environmental information and the operating status of the electronic apparatus may be input into the control command acquisition model, then parameters of the control command acquisition model may be adjusted based on an output of the control command acquisition model and expected decision data. For example, when the output of the control command acquisition model is consistent with the expected decision data, the output of the control command acquisition model may be fed back to the control command acquisition model in a form of a positive example, and when the output of the control command acquisition model is inconsistent with the expected decision data, the output of the control command acquisition model may be fed back to the control command acquisition model in a form of a negative example to adjust the parameters of the control command acquisition model.
Fig. 3 is a block diagram showing a computing apparatus according to an embodiment of the present disclosure.
Referring to Fig. 3, a computing apparatus 300 according to an embodiment of the present disclosure may include a processor 310 and a memory 320. The processor 310 may include (but is not limited to) a central processing unit (CPU), a digital signal processor (DSP), a microcomputer, a field programmable gate array (FPGA), a system on chip (SoC), a microprocessor, an application specific integrated circuit (ASIC), etc. The memory 320 stores computer programs to be executed by the processor 310. The memory 320 includes a high-speed random access memory and/or a non-volatile computer readable storage medium. When the processor 310 executes the computer programs stored in the memory 320, the method for controlling an electronic apparatus as described above may be implemented. Optionally, the computing apparatus 300 may communicate with the electronic apparatus, or with an intelligent central control controlling the electronic apparatus, via wired or wireless communication.
The method for controlling an electronic apparatus according to an embodiment of the present disclosure may be written as computer programs and stored on a computer readable storage medium. When the computer programs are executed by a processor, the method for controlling an electronic apparatus as described above may be implemented. Examples of the computer readable storage medium include: read only memory (ROM), programmable read only memory (PROM), electrically erasable programmable read only memory (EEPROM), random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), flash memory, non-volatile memory, CD-ROM, CD-R, CD+R, CD-RW, CD+RW, DVD-ROM, DVD-R, DVD+R, DVD-RW, DVD+RW, DVD-RAM, BD-ROM, BD-R, BD-R LTH, BD-RE, Blu-ray or optical disc storage, hard disk drive (HDD), solid state drive (SSD), card storage (such as multimedia cards, secure digital (SD) cards, or extreme digital (XD) cards), magnetic tape, floppy disk, magneto-optical data storage apparatus, optical data storage apparatus, hard disk, solid state disk, and any other apparatus configured to store the computer programs and any associated data, data files and data structures in a non-transitory manner and to provide the computer programs and any associated data, data files and data structures to a processor or computer, enabling the processor or computer to execute the computer programs. In one example, the computer programs and any associated data, data files and data structures are distributed over a networked computer system, so that they are stored, accessed, and executed in a distributed manner by one or more processors or computers.
The method and device for controlling an electronic apparatus according to the embodiments of the present disclosure can accurately generate a control command for the electronic apparatus based on a user voice signal together with environmental information and an operating status of the electronic apparatus, using a single control command acquisition model. There is no need to configure different control command acquisition models for different environmental information and operating statuses of the electronic apparatus, thereby avoiding complicated operations by users or manufacturers to set up various models, while also improving the service efficiency of the electronic apparatus.
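The overall flow summarized above (acquire a voice signal, acquire environmental information and operating status, determine a control command via the acquisition model, and control the apparatus) can be sketched as follows. All function names, the feature layout, and the decision logic here are illustrative assumptions standing in for the disclosure's NLU processing, sensor queries, and trained model, not part of the disclosure itself:

```python
# Hypothetical end-to-end sketch of the disclosed control method.
# Each stand-in function mirrors one step of claim 1.

def recognize_intent(voice_signal: str) -> str:
    # Stand-in for natural language understanding / habit-template matching.
    return "too_hot" if "hot" in voice_signal else "too_cold"


def acquire_status() -> dict:
    # Stand-in for querying sensors and the apparatus (here, an air
    # conditioner): environmental information plus operating status.
    return {"indoor_temp": 30, "outdoor_temp": 33, "mode": "on"}


def control_command_acquisition_model(intent: str, status: dict) -> tuple:
    # Stand-in for the trained model: maps the user intention plus the
    # environmental information and operating status to a control command.
    if intent == "too_hot" and status["indoor_temp"] > 26:
        return ("set_mode", "cooling")
    return ("set_mode", "heating")


def control_electronic_apparatus(voice_signal: str) -> tuple:
    intent = recognize_intent(voice_signal)    # step 1: user intention
    status = acquire_status()                  # step 2: environment + status
    command = control_command_acquisition_model(intent, status)  # step 3
    return command                             # step 4: command to apply


print(control_electronic_apparatus("it is too hot in here"))
```

Because the model consumes the environmental information and operating status as inputs, a single model covers all operating conditions, which is the efficiency point made in the paragraph above.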
Although some embodiments of the present disclosure have been shown and described, those skilled in the art should understand that these embodiments may be modified without departing from the principle and spirit of the present disclosure whose scope is defined by the claims and their equivalents.

Claims (15)

  1. A method for controlling an electronic apparatus, the method comprising:
    acquiring a user voice signal;
    acquiring environmental information and an operating status of the electronic apparatus;
    determining a control command corresponding to the user voice signal based on the acquired environmental information and the operating status of the electronic apparatus; and
    controlling operation of the electronic apparatus by using the control command.
  2. The method according to claim 1, wherein, before the determining a control command corresponding to the user voice signal, the method further comprises: recognizing the user voice signal through natural language understanding processing to determine an intention indicated by the user voice signal.
  3. The method according to claim 1, wherein, before the determining a control command corresponding to the user voice signal, the method further comprises:
    determining whether the user voice signal matches a preset habit template in the electronic apparatus; and
    in response to the user voice signal matching the preset habit template, determining an intention indicated by the user voice signal from the preset habit template.
  4. The method according to claim 2, wherein, the step of determining a control command corresponding to the user voice signal based on the acquired environmental information and the operating status of the electronic apparatus, comprises:
    inputting the determined intention indicated by the user voice signal, the environmental information and the operating status of the electronic apparatus into a control command acquisition model to obtain the control command corresponding to the user voice signal.
  5. The method according to claim 4, wherein the control command acquisition model is obtained through training sample data or historical data.
  6. The method according to claim 5, wherein the control command acquisition model is trained by:
    inputting a user intention in the sample data or the historical data, the environmental information and the operating status of the electronic apparatus into the control command acquisition model; and
    adjusting parameters of the control command acquisition model based on an output of the control command acquisition model and expected decision data.
  7. The method according to claim 6, wherein, the step of adjusting parameters of the control command acquisition model based on an output of the control command acquisition model and expected decision data, comprises:
    in response to the output of the control command acquisition model being consistent with the expected decision data, feeding back the output of the control command acquisition model to the control command acquisition model in a form of a positive example; and
    in response to the output of the control command acquisition model being inconsistent with the expected decision data, feeding back the output of the control command acquisition model to the control command acquisition model in a form of a negative example, to adjust the parameters of the control command acquisition model.
  8. The method according to claim 1, wherein the electronic apparatus comprises an air conditioner, and
    wherein, the environmental information of the electronic apparatus comprises an indoor temperature and/or an outdoor temperature, and/or
    the operating status of the electronic apparatus comprises a switch status, a cooling status, and/or a heating status.
  9. A device for controlling an electronic apparatus, the device comprising:
    a first acquisition module, configured to acquire a user voice signal;
    a second acquisition module, configured to acquire environmental information and an operating status of the electronic apparatus;
    a command determination module, configured to determine a control command corresponding to the user voice signal based on the acquired environmental information and the operating status of the electronic apparatus; and
    a controlling module, configured to control operation of the electronic apparatus by using the control command.
  10. The device according to claim 9, wherein the device further comprises:
    a recognition module, configured to recognize the user voice signal through natural language understanding processing to determine an intention indicated by the user voice signal, before the command determination module determines a control command corresponding to the user voice signal.
  11. The device according to claim 9, wherein the device further comprises:
    a recognition module, configured to determine whether the user voice signal matches a preset habit template in the electronic apparatus, before the command determination module determines a control command corresponding to the user voice signal, and in response to the user voice signal matching the preset habit template, determine an intention indicated by the user voice signal from the preset habit template.
  12. The device according to claim 10, wherein the command determination module is configured to input the determined intention indicated by the user voice signal, the environmental information and the operating status of the electronic apparatus into a control command acquisition model to obtain the control command corresponding to the user voice signal.
  13. The device according to claim 12, wherein the control command acquisition model is obtained through training sample data or historical data.
  14. The device according to claim 13, wherein the control command acquisition model is trained by:
    inputting a user intention in the sample data or the historical data, the environmental information and the operating status of the electronic apparatus into the control command acquisition model; and
    adjusting parameters of the control command acquisition model based on an output of the control command acquisition model and expected decision data.
  15. The device according to claim 14, wherein, the operation of adjusting parameters of the control command acquisition model based on an output of the control command acquisition model and expected decision data, comprises:
    in response to the output of the control command acquisition model being consistent with the expected decision data, feeding back the output of the control command acquisition model to the control command acquisition model in a form of a positive example; and
    in response to the output of the control command acquisition model being inconsistent with the expected decision data, feeding back the output of the control command acquisition model to the control command acquisition model in a form of a negative example, to adjust the parameters of the control command acquisition model.
PCT/KR2022/000919 2021-01-21 2022-01-18 Method and device for controlling electronic apparatus WO2022158824A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110079778.6A CN112856727A (en) 2021-01-21 2021-01-21 Method and apparatus for controlling electronic device
CN202110079778.6 2021-01-21

Publications (1)

Publication Number Publication Date
WO2022158824A1 true WO2022158824A1 (en) 2022-07-28

Family

ID=76008526

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/000919 WO2022158824A1 (en) 2021-01-21 2022-01-18 Method and device for controlling electronic apparatus

Country Status (2)

Country Link
CN (1) CN112856727A (en)
WO (1) WO2022158824A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170286049A1 (en) * 2014-08-27 2017-10-05 Samsung Electronics Co., Ltd. Apparatus and method for recognizing voice commands
KR20190121639A (en) * 2018-04-18 2019-10-28 엘지전자 주식회사 Air conditioning apparatus and air conditioning system having the same
US20200106896A1 (en) * 2018-10-02 2020-04-02 Sharp Kabushiki Kaisha System and processing apparatus
JP2020061094A (en) * 2018-10-12 2020-04-16 オンキヨー株式会社 Voice input device and voice input program
JP2020085953A (en) * 2018-11-16 2020-06-04 トヨタ自動車株式会社 Voice recognition support device and voice recognition support program

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102393418B1 (en) * 2017-03-30 2022-05-03 삼성전자주식회사 Data learning server and method for generating and using thereof
CN107449109B (en) * 2017-07-20 2020-05-22 广东美的制冷设备有限公司 Air conditioner voice control method and device, air conditioner and readable storage medium
KR102443052B1 (en) * 2018-04-13 2022-09-14 삼성전자주식회사 Air conditioner and method for controlling air conditioner
KR102458336B1 (en) * 2018-05-18 2022-10-25 삼성전자주식회사 Air conditioner and Method for controlling the air conditioner thereof
JP7372040B2 (en) * 2019-03-22 2023-10-31 三菱重工サーマルシステムズ株式会社 Air conditioning control system and control method

Also Published As

Publication number Publication date
CN112856727A (en) 2021-05-28

Similar Documents

Publication Publication Date Title
US20200227049A1 (en) Method, apparatus and device for waking up voice interaction device, and storage medium
CN105204357B (en) The contextual model method of adjustment and device of intelligent home device
US11954576B2 (en) Method for implementing and developing network model and related product
US20160336024A1 (en) Electronic device and method for controlling the same
US20170013342A1 (en) Wireless audio output devices
JP6811755B2 (en) Voice wake-up method by reading, equipment, equipment and computer-readable media, programs
CN111341296B (en) Voice control response test method, tester and storage medium
CN109903773B (en) Audio processing method, device and storage medium
US11249645B2 (en) Application management method, storage medium, and electronic apparatus
TW201719333A (en) A voice controlling system and method
CN108682414A (en) Sound control method, voice system, equipment and storage medium
KR102331660B1 (en) Methods and apparatuses for controlling voice of electronic devices, computer device and storage media
CN111462741B (en) Voice data processing method, device and storage medium
CN106375594A (en) Method and device for adjusting equipment, and electronic equipment
CN106471493A (en) Method and apparatus for managing data
WO2022158824A1 (en) Method and device for controlling electronic apparatus
CN116566760B (en) Smart home equipment control method and device, storage medium and electronic equipment
CN111933137B (en) Voice wake-up test method and device, computer readable medium and electronic equipment
CN107977443B (en) Intelligent teaching method and system based on voice analysis
WO2021174814A1 (en) Answer verification method and apparatus for crowdsourcing task, computer device, and storage medium
US11620996B2 (en) Electronic apparatus, and method of controlling to execute function according to voice command thereof
KR102416818B1 (en) Methods and apparatuses for controlling voice of electronic devices, computer device and storage media
CN107426425A (en) Application control method, device, computer installation and readable storage medium storing program for executing
CN212752277U (en) Intelligent communication equipment inspection, detection and training terminal and system
US20210357806A1 (en) Machine learning model training method and machine learning model training device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22742800

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22742800

Country of ref document: EP

Kind code of ref document: A1