WO2022080659A1 - Electronic device and control method therefor - Google Patents

Electronic device and control method therefor

Info

Publication number
WO2022080659A1
Authority
WO
WIPO (PCT)
Prior art keywords
function
log data
log
voice command
sub
Prior art date
Application number
PCT/KR2021/011659
Other languages
English (en)
Korean (ko)
Inventor
이동섭
형지원
김효묵
양도준
조근석
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2022080659A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3438Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F16/63Querying
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/26Speech to text systems

Definitions

  • The present disclosure relates to an electronic device and a control method thereof, and more particularly, to an electronic device for guiding a voice command to a user and a control method thereof.
  • A virtual assistant refers to a software agent that processes tasks requested by users and provides specialized services to users.
  • As voice recognition technology develops and is combined with virtual assistants, electronic devices that perform a task or service desired by a user according to the user's voice command have become increasingly common.
  • The present disclosure has been made to solve the above problems, and an object of the present disclosure is to provide an electronic device for guiding a voice command usable in the electronic device and a control method thereof.
  • Accordingly, the utilization of voice commands pre-stored in the electronic device can be increased, and the user can use the electronic device more conveniently by using the voice commands.
  • A control method of an electronic device according to the present disclosure includes: extracting, using a data model for extracting sub-functions usable as a voice command, log data related to a set function from log data including a plurality of functions performed by the electronic device and sub-function information for each of the plurality of functions; searching, using the extracted log data, for a voice command capable of performing both a function and a sub-function included in the extracted log data; changing the searched voice command based on the sub-function; and guiding the changed voice command.
  • An electronic device according to the present disclosure includes: a memory that stores log data including a plurality of functions and sub-function information for each of the plurality of functions, and a data model for extracting sub-functions usable as a voice command; and a processor configured to extract log data related to a set function from the log data using the data model, search, using the extracted log data, for a voice command that can perform both a function and a sub-function included in the extracted log data, change the searched voice command based on the sub-function, and guide the changed voice command.
  • FIG. 1 is a view for explaining a system according to an embodiment of the present disclosure
  • FIG. 2 is a block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure
  • FIG. 3 is a flowchart illustrating an operation of an electronic device according to an embodiment of the present disclosure
  • FIG. 4 is a block diagram for explaining the configuration of a server according to an embodiment of the present disclosure.
  • FIG. 5 is a view for explaining a log analysis system included in a server or an electronic device according to an embodiment of the present disclosure
  • FIG. 6A is a diagram for explaining a tag-based log deletion module included in the log analysis system
  • FIG. 6B is a diagram for explaining a tag-based log deletion module included in the log analysis system
  • FIG. 6C is a diagram for explaining a message-based log deletion module included in the log analysis system
  • FIG. 6D is a diagram for explaining a message-based log deletion module included in the log analysis system
  • FIG. 7 is a flowchart for explaining the operation of a server according to an embodiment of the present disclosure.
  • FIG. 8A is a diagram for explaining an electronic device that has received a user input for selecting a function according to an embodiment of the present disclosure
  • FIG. 8B is a diagram for describing an electronic device that has received a user input for searching for a sub-function included in a function according to an embodiment of the present disclosure
  • FIG. 8C is a diagram for explaining an electronic device receiving a user input for selecting a sub-function according to an embodiment of the present disclosure
  • FIG. 8D is a diagram for explaining an electronic device that executes a function and a sub-function according to a user input according to an embodiment of the present disclosure
  • FIG. 8E is a diagram for explaining an electronic device displaying a UI for guiding a voice command for a function and a sub-function executed according to an embodiment of the present disclosure
  • FIG. 9A is a diagram for explaining an electronic device that receives a user input for selecting a UI according to an embodiment of the present disclosure
  • FIG. 9B is a diagram for explaining an electronic device that outputs a voice command according to an embodiment of the present disclosure.
  • FIG. 9C is a diagram for explaining an electronic device executing an application for receiving a voice command according to an embodiment of the present disclosure
  • FIG. 10 is a diagram for explaining a method of controlling an electronic device according to an embodiment of the present disclosure.
  • Expressions such as "have," "may have," "includes," or "may include" indicate the presence of a corresponding characteristic (e.g., a numerical value, function, operation, or component such as a part) and do not exclude the presence of additional features.
  • Expressions such as "A or B," "at least one of A or/and B," or "one or more of A or/and B" may include all possible combinations of the items listed together.
  • For example, "A or B," "at least one of A and B," or "at least one of A or B" may refer to (1) a case including at least one A, (2) a case including at least one B, or (3) a case including both at least one A and at least one B.
  • A first component may be referred to as a second component, and similarly, the second component may also be renamed as the first component.
  • As used herein, terms such as "module," "unit," and "part" refer to a component that performs at least one function or operation, and such a component may be implemented in hardware or software, or as a combination of hardware and software.
  • A plurality of "modules," "units," "parts," etc. may be integrated into at least one module or chip and implemented as at least one processor, except when each needs to be implemented in individual specific hardware.
  • When a component (e.g., a first component) is described as being "coupled with/to (operatively or communicatively)" or "connected to" another component (e.g., a second component), the component may be directly connected to the other component or may be connected through yet another component (e.g., a third component).
  • The expression "a device configured to" may mean that the device is "capable of" operating together with other devices or parts.
  • For example, "a processor configured (or set) to perform A, B, and C" may refer to a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a generic-purpose processor (e.g., a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.
  • The term "user" may refer to a person who uses an electronic device or to a device (e.g., an artificial intelligence electronic device) that uses the electronic device.
  • FIG. 1 is a diagram for explaining a system according to an embodiment of the present disclosure.
  • a system according to the present disclosure may include an electronic device 100 and a server 200 .
  • Although the electronic device 100 is illustrated as a smartphone in FIG. 1, the electronic device may include, for example, at least one of a tablet personal computer (PC), a mobile phone, a video phone, a personal computer (PC), a portable multimedia player (PMP), an MP3 player, a mobile medical device, a camera, or a wearable device.
  • The wearable device may include at least one of an accessory type (e.g., a watch, a ring, a bracelet, an anklet, a necklace, glasses, contact lenses, or a head-mounted device (HMD)), a fabric- or clothing-integrated type (e.g., an electronic garment), a body-attachable type (e.g., a skin pad), or a bio-implantable type (e.g., an implantable circuit).
  • the electronic device 100 may be a home appliance.
  • Home appliances include, for example, at least one of a television, a digital video disk (DVD) player, an audio system, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air purifier, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™, PlayStation™), an electronic dictionary, an electronic key, a camcorder, or an electronic picture frame.
  • the electronic device 100 may perform an AI assistant (virtual assistant) function.
  • the electronic device 100 may receive a user command for requesting a specific operation service, and may provide an operation or service requested by the user to the user.
  • the electronic device 100 may receive a user command for requesting a specific operation or service through an input interface such as a touch screen or a button.
  • the electronic device 100 may receive a user command in the form of a user's voice through a microphone. That is, the electronic device 100 may receive a voice command from the user and perform a specific operation or service corresponding to the voice command.
  • The electronic device 100 may provide the user with information about a voice command recognizable by the electronic device 100. Specifically, when the electronic device 100 receives a user command through an input interface such as a touch screen and there is a voice command corresponding to the received user command, the electronic device 100 may provide the user with information about the voice command corresponding to the user command.
  • For example, when the electronic device 100 turns on the Bluetooth function and receives, through the touch screen, a user command to connect to a speaker (not shown), and a voice command for the Bluetooth connection exists, the electronic device 100 may provide a voice command such as 'Connect with the speaker via Bluetooth'.
  • the electronic device 100 may display a UI including a voice command corresponding to a user command input through the input interface or output an utterance corresponding to the voice command to provide information about the voice command to the user.
  • information about a recognizable voice command may be pre-stored in the electronic device 100 .
  • the electronic device 100 may receive information about a voice command recognizable by the electronic device 100 from the server 200 .
  • the server 200 may be connected to the electronic device 100 to receive various information from the electronic device 100 or transmit various information to the electronic device 100 . Specifically, the server 200 may analyze log data for a voice command recognizable by the electronic device 100 and transmit the analysis result to the electronic device 100 . In addition, the server 200 may receive log data generated when the function of the electronic device 100 is executed from the electronic device 100 in order to analyze the log data.
  • the server 200 may include a log analysis system including at least one artificial intelligence model.
  • the log analysis system may be a system trained to execute a voice command for a function and a sub-function of the electronic device 100 and to analyze log data for the executed voice command.
  • The server 200 may generate, using the log analysis system, a data model including log data for functions of the electronic device 100 for which a voice command exists, and may transmit the generated data model to the electronic device 100.
  • The electronic device 100 may identify, using the data model received from the server 200, whether a voice command for a function executed in the electronic device 100 exists, and when it is identified that the voice command exists, may guide the user with the voice command for the corresponding function.
  • the server 200 includes a log analysis system and transmits the data model generated by the learned log analysis system to the electronic device 100 .
  • the electronic device 100 may include a log analysis system.
  • the electronic device 100 may learn the log analysis system included in the electronic device 100 , analyze log data for a function in which a voice command exists, and store a data model including the analysis result.
  • FIG. 2 is a block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
  • the memory 110 is a configuration for storing an operating system (OS) for controlling overall operations of the components of the electronic device 100 and at least one instruction or data related to the components of the electronic device 100 .
  • An instruction means one action statement that can be directly executed by the processor 120 in a programming language, and is a minimum unit for program execution or operation.
  • the processor 120 may perform operations according to various embodiments to be described later by executing at least one instruction stored in the memory 110 .
  • the memory 110 is a component for storing various programs and data necessary for the operation of the electronic device 100 .
  • the memory 110 may be implemented as a non-volatile memory, a volatile memory, a flash-memory, a hard disk drive (HDD), or a solid state drive (SSD).
  • the memory 110 is accessed by the processor 120 , and reading/writing/modification/deletion/update of data by the processor 120 may be performed.
  • In the present disclosure, the term "memory" refers to the memory 110, a ROM (not shown) in the processor 120, a RAM (not shown), or a memory card (not shown) mounted in the electronic device 100 (e.g., a micro SD card or a memory stick).
  • the memory 110 may store a plurality of functions of the electronic device 100 and log data 111 including sub-function information for each of the plurality of functions.
  • a function means an operation executable in the electronic device 100
  • For example, a function may correspond to an operation of a task (e.g., network connection, volume control, screen control, etc.) or an application (e.g., a camera application, a calendar application, etc.) executed on the system.
  • one function may include a plurality of sub-functions.
  • a sub-function refers to an operation that is included in one function and can be executed together with the function.
  • the 'image capture' function of the camera application may include a plurality of sub-functions such as 'take an image with 100x zoom', 'take an image in slow motion', and 'take an image with the flash on'.
  • the log data 111 is data including information about an event that occurs while the electronic device 100 is operating.
  • the log data 111 may include information about log data generated while a function or a sub-function of the electronic device 100 is executed.
  • That is, log data related to a system event or another function generated while the function and sub-function are executed may be stored in the memory 110.
  • the memory 110 may store a data model 112 for extracting sub-functions usable as a voice command.
  • the data model 112 may include information on sub-functions available through voice commands.
  • the data model 112 may include information about a function available through a voice command.
  • information about a function or sub-function available as a voice command included in the data model 112 may include log data.
  • The data model 112 is generated by the log analysis system 500 of FIG. 5 and may include log data related to a function or sub-function for which a voice command exists.
  • The data model 112 may be transmitted from the server 200. Specifically, when the data model 112 is generated through the log analysis system 500 of FIG. 5 stored in the server 200, the server 200 may transmit the generated data model 112 to the electronic device 100.
  • the log analysis system 500 of FIG. 5 may be stored in the memory 110 .
  • the data model 112 may be generated by the log analysis system 500 stored in the memory 110
  • the generated data model 112 may be stored in the memory 110 .
  • the log filtering module 113 may be stored in the memory 110 .
  • the log filtering module 113 may extract log data related to a set function from among a plurality of log data included in the log data 111 .
  • The set function may include a function input based on a user input received through an input interface (e.g., a touch screen, a button, etc.) rather than a voice command input through a microphone (not shown).
  • Alternatively, the set function may include a function set based on the user's usage pattern of the electronic device. For example, among the user commands input through an input interface rather than by voice command, a function input for the first time or a function input more than a preset number of times may correspond to a set function.
  • the log filtering module 113 may firstly extract log data related to a function set from the log data 111 , and may secondarily extract log data related to a sub-function from the firstly extracted log data.
  • the log filtering module 113 may filter log data not related to the set function among log data using the data model 112 to primarily extract log data related to the set function.
  • In addition, the log filtering module 113 may use a log similarity analysis module 114 to compare the similarity between the log information included in the primarily extracted log data and the log information included in the data model 112, and may thereby secondarily extract log data including the set function and sub-function. At this time, the log similarity analysis module 114 may convert each of the log information included in the data model 112 and the log information included in the log data 111 into vector data, and may confirm the similarity by comparing at least one of a direction, a distance, and a structure between the vector data.
  • The processor 120 loads the log filtering module 113 to extract log data related to the set function and sub-function; the operation of the log filtering module 113 will be described in more detail with reference to FIG. 3.
  • the processor 120 may be electrically connected to the memory 110 to control overall operations and functions of the electronic device 100 .
  • the processor 120 may control hardware or software components connected to the processor 120 by driving an operating system or an application program, and may perform various data processing and operations.
  • the processor 120 may load and process commands or data received from at least one of other components into the volatile memory, and store various data in the non-volatile memory.
  • The processor 120 may be implemented as a general-purpose processor (e.g., a CPU (Central Processing Unit) or an application processor).
  • The processor 120 may be implemented as a digital signal processor (DSP), a microprocessor, or a time controller (TCON) that processes a digital signal, but is not limited thereto, and may include one or more of a central processing unit (CPU), a micro controller unit (MCU), a micro processing unit (MPU), a controller, an application processor (AP), a graphics-processing unit (GPU), a communication processor (CP), and an ARM (Advanced RISC Machine) processor, or may be defined as the corresponding term.
  • In addition, the processor 120 may be implemented as a system on chip (SoC) in which a processing algorithm is embedded, as large scale integration (LSI), or in the form of a field programmable gate array (FPGA).
  • the processor 120 may load the log filtering module 113 and the log similarity analysis module 114 from the non-volatile memory to the volatile memory.
  • Non-volatile memory is memory that can retain stored information even when power is lost (for example, flash memory, programmable read-only memory (PROM), magnetoresistive random-access memory (MRAM), and resistive RAM (RRAM)).
  • the volatile memory refers to a memory (eg, dynamic random-access memory (DRAM) and static RAM (SRAM)) that requires continuous power supply in order to maintain stored information.
  • loading refers to an operation of loading and storing data stored in the non-volatile memory into the volatile memory so that the processor 120 can access it.
  • the volatile memory may be included in the processor 120 or implemented as a component separate from the processor 120 according to an embodiment.
  • The processor 120 may load the log filtering module 113 stored in the memory 110 and extract log data related to the set function and sub-function from the log data 111 using the log filtering module 113.
  • Specifically, the processor 120 may extract log data related to the set function and sub-function from among the plurality of log data included in the log data 111 using the data model 112.
  • the data model 112 is for extracting sub-functions usable by voice commands, and may include log data of functions and sub-functions usable by voice commands.
  • the processor 120 may firstly extract log data related to a set function from the log data 111 , and may secondarily extract log data related to a sub-function from the firstly extracted log data.
  • the processor 120 may search for a voice command related to the set function and sub-function based on the extracted log data. Specifically, the processor 120 may search for a voice command capable of performing both a function and a sub function included in the extracted log data by using the extracted log data.
  • the searched voice command may include a variable parameter, and the content of the voice command may vary according to a value input to the variable parameter.
  • For example, the voice command may include a variable parameter (e.g., 'OO seconds', 'OO device'), as in 'take an image in units of OO seconds' or 'connect Bluetooth with OO', and the voice command may be changed depending on the value input to the variable parameter.
  • the processor 120 may change the searched voice command based on the set sub-function. For example, when the set sub-function includes an element corresponding to the value of the variable parameter, the processor 120 may change the voice command by inputting the corresponding element into the variable parameter of the voice command.
  • For example, when the set function and sub-function are 'take an image at 2 second intervals' and the searched voice command is 'take an image at OO second intervals', the processor 120 may change the voice command to 'Take an image at 2 second intervals' based on the set sub-function.
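  • As a rough illustration of the parameter substitution described above, the sketch below fills a searched voice command template with the value taken from the executed sub-function. The '{interval}' placeholder and the helper name are assumptions for illustration only; the disclosure itself only describes variable parameters such as 'OO seconds'.

```python
# Minimal sketch: fill a searched voice command template with the value of the
# executed sub-function. The "{interval}" placeholder and helper names are
# illustrative; the disclosure only describes variable parameters such as 'OO seconds'.

def change_voice_command(template: str, parameters: dict) -> str:
    """Replace variable parameters in a searched voice command template."""
    return template.format(**parameters)

# The set sub-function is 'take an image at 2 second intervals'.
searched_command = "Take an image at {interval} second intervals"
changed_command = change_voice_command(searched_command, {"interval": 2})
print(changed_command)  # Take an image at 2 second intervals
```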
  • the processor 120 may guide the changed voice command. Specifically, the processor 120 may guide the changed voice command by controlling the display to display a UI including text corresponding to the changed voice command. Alternatively, the processor 120 may guide the voice command by controlling a speaker (not shown) to output the changed voice command. This will be described in more detail with reference to FIGS. 8A to 8E .
  • FIG. 3 is a flowchart illustrating an operation of an electronic device according to an embodiment of the present disclosure.
  • the processor 120 may primarily extract log data related to a set function from the log data 111 including a plurality of functions and sub-function information for each of the plurality of functions ( S310 ).
  • Specifically, the processor 120 may check the log data area related to the set function in the log data 111 using the data model 112, and may primarily extract the log data related to the set function from the log data 111 using the checked log data area.
  • the data model 112 may include log information indicating the start of execution of the function for each function and log information indicating the completion of the execution of the function.
  • the processor 120 may check log data indicating the start of execution of the function set in the log data 111 and log data indicating the completion of execution, based on log information on the function included in the data model 112 .
  • For example, the processor 120 may check the log data indicating the start and the completion of execution of the camera function in the log data 111, based on log information corresponding to the start and end of shooting of the camera application included in the data model 112.
  • Alternatively, the processor 120 may obtain only the log data indicating the completion of execution of the function using the data model 112, set the obtained log data as the end of the log data area related to the set function, and check the log data area related to the set function using preset log data (e.g., log data at a specific point in time) as its start.
  • the processor 120 may set the next log data of the last checked log data area or log data generated at a specific date or time as the starting point of the log data area related to the set function.
  • the processor 120 may first extract log data related to the set function by deleting log data not related to the set function from the checked log data area.
  • Here, log data not related to the set function refers to log data related to the system of the electronic device 100 and log data related to a function different from the set function (e.g., a function of another application or another domain).
  • the processor 120 may identify log data related to the set function and log data not related to the set function based on the data model 112 including log information for a plurality of functions.
  • the processor 120 may first extract log data related to the set function by deleting the log data area not related to the set function from the log data area.
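  • The sketch below is a hedged illustration of this primary extraction: it assumes the log data is a list of text lines and that the data model supplies, for the set function, a start marker, a completion marker, and prefixes of unrelated log sources. All names and log contents are hypothetical.

```python
# Hedged sketch of the primary extraction: the data model is assumed to provide,
# per set function, a log line marking execution start and one marking execution
# completion; the area between them is kept and lines from unrelated sources
# (system logs, other domains) are dropped. All names and log contents are illustrative.

def primary_extract(log_lines, start_marker, end_marker, unrelated_prefixes):
    """Return the log lines inside the [start, end] area that relate to the set function."""
    try:
        start = next(i for i, line in enumerate(log_lines) if start_marker in line)
        end = next(i for i, line in enumerate(log_lines) if end_marker in line and i >= start)
    except StopIteration:
        return []  # the set function does not appear in this log data
    area = log_lines[start:end + 1]
    return [line for line in area
            if not any(line.startswith(prefix) for prefix in unrelated_prefixes)]

logs = [
    "SystemUI: boot complete",
    "Camera: shooting started",
    "Camera: super slow motion set",
    "Calendar: event synced",
    "Camera: shooting completed",
]
print(primary_extract(logs, "shooting started", "shooting completed",
                      ["SystemUI", "Calendar"]))
# ['Camera: shooting started', 'Camera: super slow motion set', 'Camera: shooting completed']
```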
  • The processor 120 may identify a usage pattern of the user who uses the electronic device 100 based on the log data 111. For example, based on the log data 111, the processor 120 may check usage patterns such as the functions and sub-functions most used by the user, the functions and sub-functions not used by the user, and the functions and sub-functions used by the user in each time period.
  • the processor 120 may first extract log data related to a function corresponding to the usage pattern identified in the log data 111 using the data model 112 .
  • the processor 120 may secondarily extract log data related to a sub-function set based on the firstly extracted log data.
  • the processor 120 may determine a similarity between log information included in the first extracted log data and log information included in the data model 112 ( S320 ).
  • Specifically, the processor 120 may use the log similarity analysis module 114 to check the similarity between the log information on the sub-function included in the data model 112 and the log information included in the primarily extracted log data.
  • The processor 120 may convert each of the log information of the primarily extracted log data and the log information of the data model 112 into vector data, and may check the similarity by comparing at least one of a direction, a distance, and a structure between the converted vector data.
  • the processor 120 may check the degree of similarity based on an angle between the vector data for the data model 112 and the vector data for the first extracted log data. Alternatively, the processor 120 may check the similarity based on the distance between the vector data for the data model 112 and the vector data for the first extracted log data. Alternatively, the processor 120 combines the angle and distance between the vector data for the data model 112 and the vector data for the first extracted log data, and the average or deviation value of the intersection of the two vector data to obtain a degree of similarity in structure can be checked.
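  • A minimal sketch of such a similarity check is given below. It assumes a simple bag-of-words vectorization of each log line, and compares direction with cosine similarity and distance with the Euclidean metric; the disclosure only states that log information is converted into vector data and compared by direction, distance, or structure, so the vectorization choice is an assumption.

```python
# Sketch of the similarity check: each piece of log information is turned into a
# vector and the vectors are compared by direction (cosine similarity) and by
# distance (Euclidean). The bag-of-words vectorization is an assumption; the
# disclosure only states that log information is converted into vector data.
import math
from collections import Counter

def vectorize(line, vocabulary):
    counts = Counter(line.lower().split())
    return [counts[token] for token in vocabulary]

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def euclidean_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

model_log = "CameraTag: super slow motion shooting started"       # from the data model
extracted_log = "CameraTag: super slow motion shooting completed"  # primarily extracted
vocab = sorted(set(model_log.lower().split()) | set(extracted_log.lower().split()))

v1, v2 = vectorize(model_log, vocab), vectorize(extracted_log, vocab)
print(cosine_similarity(v1, v2), euclidean_distance(v1, v2))
```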
  • The processor 120 may check whether the primarily extracted log data contains log information whose similarity with the log information for the sub-function included in the data model 112 is equal to or greater than a preset value (S330).
  • When log information with a similarity equal to or greater than the preset value exists (S330-Y), the processor 120 may secondarily extract, based on the confirmed similarity, log data including log information whose similarity satisfies a preset condition from among the primarily extracted log data (S340). For example, the processor 120 may secondarily extract the log data including the log information with the highest similarity among the primarily extracted log data. In this case, the extracted log data may include information on both the function and the sub-function.
  • the processor 120 may search for a voice command based on the extracted log data (S350).
  • Specifically, the processor 120 may check the function and sub-function corresponding to the secondarily extracted log data using the data model 112, and may search for a voice command corresponding to each of the checked function and sub-function.
  • a voice command corresponding to each function and sub-function may be pre-stored in the memory 110 .
  • the processor 120 may search for a voice command corresponding to each function and sub-function included in the log data extracted secondarily based on the voice command stored in the memory 110 .
  • the processor 120 may change the voice command based on the sub-function (S360).
  • the processor 120 may change the voice command by inputting a specific value to the variable parameter.
  • a specific value input to the variable parameter may be determined based on a set sub-function.
  • a specific value input to the variable parameter may be determined based on a sub function executed according to a user input input through the input interface or a sub function set by the user's usage pattern. For example, when the user executes a function of 'taking an image every 3 seconds' through the touch screen, '3 seconds' may be a value input to the variable parameter.
  • the processor 120 may guide the changed voice command (S370).
  • the processor 120 may control the speaker (not shown) to output a guide message including the changed voice command.
  • the processor 120 may control the display to display a user interface (UI) including text corresponding to the changed voice command.
  • the processor 120 may execute an application for recognizing a new voice command.
  • the processor 120 may execute the virtual assistant application and control the display to display a screen of the virtual assistant application and text corresponding to the changed voice command. A detailed description in this regard will be described later with reference to FIGS. 9A to 9E .
  • FIG. 4 is a block diagram illustrating a configuration of a server according to an embodiment of the present disclosure.
  • the server 200 includes a memory 210 and a processor 220 .
  • the contents overlapping those of the memory 110 and the processor 120 of the electronic device 100 of FIG. 2 will be omitted.
  • the memory 210 may store the log analysis system 500 of FIG. 5 .
  • The log analysis system 500 is a system for analyzing log data for a function or sub-function included in the electronic device 100.
  • Each module included in the log analysis system 500 or the log analysis system 500 of FIG. 5 may be implemented as an artificial intelligence model.
  • the log analysis system 500 may analyze log data for each function through learning of the log data.
  • The log analysis system 500 may include a voice command database 510, a speech execution module 520, a log collection module 530, and a log analysis module (Log Analyzer Module) 540.
  • the voice command database 510 may pre-store voice commands usable in the electronic device 100 .
  • the voice command database 510 may store a set of voice commands for functions. In this case, since one function includes a plurality of sub-functions, a voice command for each of the plurality of sub-functions may be stored in the voice command database 510 .
  • A plurality of voice commands for each function or sub-function of the electronic device 100 may be stored in the voice command database 510, and information about a representative voice command among the plurality of voice commands for each function or sub-function may also be stored.
  • the voice command included in the voice command database 510 may be updated as a voice command usable in the electronic device 100 is added or deleted.
  • the voice command database 510 may store a voice command usable in another electronic device (not shown) in addition to the electronic device 100 .
  • the speech execution module 520 may identify a representative voice command for each function or sub-function in the voice command database 510 and check which function or sub-function is executed when the identified representative voice command is input.
  • Specifically, the speech execution module 520 may identify the representative voice command based on the information on the representative voice command. Alternatively, when information on the representative voice command is not stored in the voice command database 510, the speech execution module 520 may identify, as the representative voice command, a voice command used a preset number of times or more for a function or sub-function among the stored voice commands.
  • the log collection module 530 may input the identified representative voice command to collect log data generated when the representative voice command is executed.
  • the collected log data may be a set of log data combined with one tag and one message.
  • The server 200 may receive log data from a separate external device (not shown) when training the log analysis system 500. Also, the server 200 may receive, from the electronic device 100, log data generated while the electronic device 100 operates.
  • A tag included in log data means text corresponding to one function. That is, the tag may be a unique text for each domain or application of the electronic device 100. Also, one function may include at least one tag. For example, functions related to the camera application may include n different tags Tag 1, Tag 2, ..., Tag n.
  • Messages contained in log data represent text defined by the developer when a function or sub-function is defined. That is, the message is text arbitrarily input to indicate a function or sub-function.
  • Tags and messages included in log data may be separated by a specific symbol (e.g., ':').
  • For example, it may be predefined that, in log data, the tag is recorded before the ':' symbol and the message is recorded after it.
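  • Under that assumed convention, splitting a collected log line into a (tag, message) pair can be sketched as follows; the log line content is illustrative.

```python
# Splitting a collected log line into a (tag, message) pair, assuming the
# convention that the tag precedes ':' and the message follows it.

def parse_log_line(line: str):
    tag, _, message = line.partition(":")
    return tag.strip(), message.strip()

print(parse_log_line("CameraTag: super slow motion shooting started"))
# ('CameraTag', 'super slow motion shooting started')
```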
  • the log analysis module 540 may include a Tag-Based Log Remove Module 541 and a Message-Based Log Remover Module 542 .
  • the tag-based log deletion module 541 may delete log data not related to a function based on tag information included in the log data from the collected log data.
  • Specifically, the tag-based log deletion module 541 may identify tag information recorded in a preset ratio (e.g., 90%) or more of the total collected log data. In addition, the tag-based log deletion module 541 may delete log data that does not include the identified tag information from the collected log data.
  • That is, the tag-based log deletion module 541 may keep only log data including tag information that appears at or above the preset ratio among the collected log data, and delete the remaining log data.
  • For example, assume that one function includes n sub-functions S 1, S 2, ..., S n, and that one sub-function may include a plurality of log data. The log data of sub-function S 1 may include tags Tag 1, Tag 3, Tag 4, ..., and the log data of sub-function S 2 may include tags Tag 1, Tag 4, Tag 6, ....
  • the tag-based log deletion module 541 may identify tag information included in log data equal to or greater than a preset ratio among a plurality of log data, and delete the remaining log data except for log data including the identified tag information.
  • For example, when it is identified that the tag Tag a is included in a predetermined ratio or more of the plurality of log data for the n sub-functions, the tag-based log deletion module 541 may identify Tag a as a function-related tag and delete log data including tags other than Tag a.
  • Here, the tag included in log data at or above the preset ratio is described as the single tag Tag a, but it is not necessarily limited to one tag, and there may be a plurality of tags included in log data at or above the preset ratio among all log data.
  • In addition, the tag-based log deletion module 541 may check whether the identified tag information is combined with messages different from each other for at least a preset ratio (e.g., 50%) of the n sub-functions, in order to determine whether it can indicate all sub-functions. That is, the tag-based log deletion module 541 may check whether the combination of the identified tag information and the messages can represent a preset ratio (e.g., 50%) or more of all sub-functions.
  • Otherwise, the tag-based log deletion module 541 considers the identified tag information insufficient to represent the entire set of sub-functions, and may therefore delete the log data including that tag information from the collected log data.
  • For example, as shown in FIG. 6A, the tag-based log deletion module 541 may check whether the number of different messages included in the log data containing the identified tag Tag a is greater than or equal to a preset ratio of the total n sub-functions.
  • When the number of messages Msg 1, Msg 2, ..., Msg k-1, Msg k combined with Tag a is greater than or equal to the predetermined ratio of the total n sub-functions, the tag-based log deletion module 541 may not delete the log data including Tag a and Msg 1, Msg 2, ..., Msg k-1, Msg k.
  • Conversely, when the number of messages Msg 1, Msg 2, ..., Msg k-1, Msg k combined with Tag a is less than the predetermined ratio of the total n sub-functions, the tag-based log deletion module 541 may delete the log data including Tag a and Msg 1, Msg 2, ..., Msg k-1, Msg k.
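  • A hedged sketch of the tag-based deletion logic described above is shown below. It assumes the collected log data is available as (tag, message, sub-function) tuples; the 90% and 50% ratios follow the examples given in the disclosure, and all identifiers are illustrative.

```python
# Hedged sketch of the tag-based log deletion step. Data layout is assumed:
# each collected log entry is a (tag, message, sub_function) tuple.
from collections import defaultdict

def tag_based_delete(logs, tag_ratio=0.9, coverage_ratio=0.5):
    """logs: list of (tag, message, sub_function) tuples for one function."""
    total = len(logs)
    sub_functions = {sub for _, _, sub in logs}

    # 1) keep only tags recorded in at least tag_ratio of all collected log data
    tag_counts = defaultdict(int)
    for tag, _, _ in logs:
        tag_counts[tag] += 1
    frequent_tags = {t for t, c in tag_counts.items() if c / total >= tag_ratio}

    # 2) a kept tag must, through its distinct messages, cover at least
    #    coverage_ratio of all sub-functions; otherwise its log data is dropped too
    covered = defaultdict(set)
    for tag, _, sub in logs:
        if tag in frequent_tags:
            covered[tag].add(sub)
    kept_tags = {t for t in frequent_tags
                 if len(covered[t]) / len(sub_functions) >= coverage_ratio}

    return [entry for entry in logs if entry[0] in kept_tags]

logs = [
    ("Tag_a", "Msg_1", "S_1"), ("Tag_a", "Msg_2", "S_2"),
    ("Tag_a", "Msg_3", "S_3"), ("Tag_x", "Msg_9", "S_1"),
]
print(tag_based_delete(logs, tag_ratio=0.7, coverage_ratio=0.5))
# only the Tag_a entries survive
```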
  • The message-based log deletion module 542 may delete log data not related to a function by using the log data remaining after the tag-based log deletion module 541.
  • Specifically, the message-based log deletion module 542 may execute, a preset number of times (e.g., 10 times), the sub-function corresponding to the log data remaining after the tag-based log deletion module 541 is applied.
  • For example, the message-based log deletion module 542 may execute the sub-function S 1 corresponding to the log data in which the tag Tag a and the message Msg 1 are combined a preset number of times.
  • The message-based log deletion module 542 may then check whether the same result, that is, the log data in which the tag Tag a and the message Msg 1 are combined, is repeatedly output as many times as the function is executed. When the log data combining tag Tag a and message Msg 1 is repeatedly output for the preset number of executions, the message-based log deletion module 542 may not delete that log data.
  • Otherwise, the message-based log deletion module 542 may delete the log data including the message Msg 1 (i.e., the log data including the tag Tag a and the message Msg 1).
  • When the log data in which tag Tag a and message Msg 1 are combined is deleted, the tag-based log deletion module 541 may check again, as described above, whether there are different messages (e.g., Msg 2, Msg 3, ..., Msg z) in more than 50% of the log data.
  • Otherwise, the tag-based log deletion module 541 may delete the log data including the tag Tag a.
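  • The repeated-execution check of the message-based log deletion module can be sketched as below; `execute_sub_function` is a hypothetical stand-in that would run the sub-function once and return the (tag, message) pairs it produced.

```python
# Sketch of the repeated-execution check: a sub-function is executed a preset
# number of times and only (tag, message) pairs output in every run are kept.
import random

def message_based_delete(execute_sub_function, runs=10):
    """Keep only (tag, message) pairs output in every one of `runs` executions."""
    surviving = None
    for _ in range(runs):
        produced = set(execute_sub_function())
        surviving = produced if surviving is None else surviving & produced
    return surviving or set()

# Illustrative stand-in: (Tag_a, Msg_1) is produced on every run, while an
# unrelated system log appears only intermittently and is therefore dropped.
def fake_execution():
    pairs = [("Tag_a", "Msg_1")]
    if random.random() < 0.5:
        pairs.append(("SystemTag", "gc paused"))
    return pairs

print(message_based_delete(fake_execution, runs=10))  # most likely {('Tag_a', 'Msg_1')}
```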
  • the message-based log deletion module 542 may calculate an impact for each message of a tag included in log data.
  • the impact is a numerical value indicating the degree to which log data combined with a tag and a message match one sub-function one-to-one. The greater the number of messages that one-to-one matches one sub-function among a plurality of messages combined with a tag, the higher the impact of each message of the tag may be indicated.
  • For example, when the log data including the messages Msg 1, Msg 2, ..., Msg n combined with the tag Tag a match the sub-functions S 1, S 2, ..., S n one-to-one, the impact of each message of the tag Tag a may be 100%.
  • As another example, assume that log data including the messages Msg 1, Msg 2, ..., Msg n combined with the tag Tag a and log data including the messages Msg 1, Msg 2, ..., Msg n combined with the tag Tag b exist.
  • Suppose that tag Tag a and message Msg 1 match sub-function S a1, tag Tag a and message Msg 2 match sub-function S a2, and tag Tag a and message Msg 3 match sub-function S a3, each one-to-one, whereas tag Tag b and message Msg 1 match both sub-function S b1 and sub-function S b2, and tag Tag b and message Msg 2 also match both sub-function S b1 and sub-function S b2 (case 2 in FIG. 6D).
  • The message-based log deletion module 542 may calculate the impact for each message of each tag, and when the impact for the messages (Msg 1, Msg 2, ..., Msg n) of a tag is less than or equal to a preset ratio (e.g., 20%), may delete the log data including that tag and Msg 1, Msg 2, ..., Msg n.
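  • The impact computation can be sketched as below, taking the impact of a tag as the fraction of its messages whose (tag, message) combination matches exactly one sub-function; the 20% threshold and the data layout are assumptions based on the example above.

```python
# Sketch of the per-tag impact computation: the impact of a tag is taken here as
# the fraction of its messages whose (tag, message) combination matches exactly
# one sub-function. The 20% threshold and the data layout are illustrative.
from collections import defaultdict

def impact_based_delete(logs, threshold=0.2):
    """logs: iterable of (tag, message, sub_function) tuples."""
    subs_per_pair = defaultdict(set)
    messages_per_tag = defaultdict(set)
    for tag, message, sub in logs:
        subs_per_pair[(tag, message)].add(sub)
        messages_per_tag[tag].add(message)

    kept_tags = set()
    for tag, messages in messages_per_tag.items():
        one_to_one = sum(1 for m in messages if len(subs_per_pair[(tag, m)]) == 1)
        impact = one_to_one / len(messages)
        if impact > threshold:        # tags at or below the threshold are deleted
            kept_tags.add(tag)
    return [entry for entry in logs if entry[0] in kept_tags]

# Tag_a's messages each match a single sub-function (impact 100%), while every
# Tag_b message maps to two sub-functions (impact 0%), so Tag_b's logs are deleted.
logs = [
    ("Tag_a", "Msg_1", "S_a1"), ("Tag_a", "Msg_2", "S_a2"),
    ("Tag_b", "Msg_1", "S_b1"), ("Tag_b", "Msg_1", "S_b2"),
]
print(impact_based_delete(logs))
```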
  • As described above, the log analysis module 540 may use the tag-based log deletion module 541 and the message-based log deletion module 542 to delete log data that is not related to the function from the log data collected through the log collection module 530.
  • The processor 220 may analyze the log data according to the function or sub-function of the electronic device 100 for which a voice command exists, using the log analysis system 500 stored in the memory 210. The server 200 may then generate a data model based on the analysis result.
  • FIG. 7 illustrates a process in which the server 200 analyzes log data according to a function or sub-function of the electronic device 100 using the log analysis system 500 of FIG. 5 and generates a data model.
  • the processor 220 may execute a representative voice command for a function or a sub-function (S710).
  • Specifically, the processor 220 may load the speech execution module 520 to identify a representative voice command for each function or sub-function in the voice command database 510 and execute the identified representative voice command. At this time, information about the representative voice command may be stored in the voice command database 510. According to an example, when information about the representative voice command is not stored in the voice command database 510, the processor 220 may use the speech execution module 520 to identify, as the representative voice command, a voice command used more than a preset number of times among the voice commands stored in the voice command database 510.
  • the processor 220 may collect log data generated by executing the representative voice command (S720). Specifically, the processor 220 may load the log collection module 530 to collect log data generated when a representative voice command is executed. As described above, the log data at this time may be a combination of a tag corresponding to a function and a message arbitrarily written by a developer.
  • the log data generated by executing the representative voice command may include a plurality of log data.
  • the processor 220 may load the log analysis module 540 and analyze the collected log data to obtain log data information indicating a function or a sub-function ( S730 ).
  • Specifically, the processor 220 may use the tag-based log deletion module 541 and the message-based log deletion module 542 to delete, from the collected log data, log data whose tags and messages are too ambiguous to indicate the function corresponding to the representative voice command, and may thereby obtain log data including log information corresponding to the specific function.
  • The processor 220 may use the tag-based log deletion module 541 to identify tags included in a predetermined ratio or more of the log data and delete log data that does not include the identified tags. Then, the processor 220 may use the tag-based log deletion module 541 to check whether the identified tag is combined with a predetermined number of different messages (e.g., a predetermined ratio or more of the total number of sub-functions), and when the identified tag is combined with fewer messages than the preset number, may delete the log data including the identified tag.
  • the processor 220 may delete log data not related to a function by using the message-based log deletion module 542 .
  • Specifically, the processor 220 may delete log data using the tag-based log deletion module 541, execute the sub-function corresponding to the remaining log data a preset number of times, and delete log data not related to the function by checking whether the log data combining the same tag and message is repeatedly output as many times as the sub-function is executed.
  • In addition, the processor 220 may calculate an impact for each message of a tag using the message-based log deletion module 542, and delete log data based on the calculated impact.
  • That is, the processor 220 may identify log data that is not related to the function from the log data collected in step S720 based on the tag-based log deletion module 541 and the message-based log deletion module 542, and by deleting the identified log data, may obtain, as a result, log data that accurately matches the function and the sub-function.
  • the processor 220 may generate the data model 112 including the obtained log data ( S740 ). In this case, the processor 220 may generate the data model 112 including log data related to the function or sub-function in which the representative voice command exists.
  • the processor 220 may transmit the generated data model 112 to the electronic device 100 (S750).
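  • Tying the steps S710 to S750 together, a hedged end-to-end sketch of the server-side flow might look as follows; every interface (execute_command, collect_logs, analyze_logs, send_to_device) is an assumed placeholder, not an API defined by the disclosure.

```python
# Hedged sketch of the server-side flow of FIG. 7. All callables are stand-ins.

def build_data_model(voice_command_db, execute_command, collect_logs,
                     analyze_logs, send_to_device):
    """voice_command_db maps each function/sub-function to its representative command."""
    data_model = {}
    for function, representative_command in voice_command_db.items():
        execute_command(representative_command)   # S710: execute representative command
        collected = collect_logs()                 # S720: collect generated log data
        log_info = analyze_logs(collected)         # S730: tag/message-based analysis
        data_model[function] = {
            "voice_command": representative_command,
            "log_data": log_info,
        }
    send_to_device(data_model)                     # S740-S750: generate and transmit model
    return data_model

# Trivial usage with stand-in callables:
model = build_data_model(
    {"super slow motion shooting": "shoot in super slow motion"},
    execute_command=lambda cmd: None,
    collect_logs=lambda: [("Tag_a", "Msg_1")],
    analyze_logs=lambda logs: logs,
    send_to_device=lambda m: None,
)
print(model)
```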
  • The electronic device 100 that receives the data model 112 from the server 200 may extract log data related to the set function and sub-function based on the log data included in the data model 112, and may guide a voice command searched on the basis of the extracted log data.
  • FIG. 8 is a diagram for describing an electronic device for guiding a voice command for a set function and sub-function according to an embodiment of the present disclosure.
  • the processor 120 may control the display 130 to display icons corresponding to a plurality of applications or domains.
  • each icon corresponding to the application or domain may correspond to a function of the electronic device 100 , and each function may include at least one sub-function.
  • the processor 120 may receive a user input for executing a function and a sub function included in the electronic device 100 .
  • For example, the processor 120 may receive a user input for selecting one icon 810 from among a plurality of icons displayed on the display 130 in order to execute one function (e.g., a camera application).
  • the processor 120 may control the display 130 to display a screen on which a function is executed based on a user input.
  • the processor 120 may control the display 130 to display a screen on which an input UI for photographing is displayed by executing the camera application.
  • In addition, the processor 120 may control the display 130 to display a screen including an icon related to a sub-function included in the function. For example, when a menu 820 related to the display of sub-functions is selected, the processor 120 may control the display 130 to display a screen including a plurality of icons related to the sub-functions, as shown in FIG. 8C.
  • When a user input for selecting one icon 830 from among the plurality of icons related to sub-functions is received, the processor 120 may set, based on the received user input, the sub-function to be executed.
  • the processor 120 may execute the set sub-function.
  • For example, the processor 120 may set 'super slow motion shooting' as the sub-function, and when a user input for executing the sub-function is received, may capture an image with 'super slow motion shooting'.
  • the processor 120 may generate log data for a series of processes of FIGS. 8A to 8D .
  • For example, the processor 120 may generate log data related to the selection of the icon corresponding to the camera application, the selection of the menu 820, the selection of the icon 830 corresponding to the sub-function, the reception of the user input for executing the sub-function, and the execution of the sub-function.
  • The processor 120 may then search for a voice command for performing the function and the sub-function by using the data model 112.
  • the processor 120 may change the voice command based on the set sub-function.
  • the processor 120 may guide a voice command.
  • the processor 120 may control the display 130 to display a user interface (UI) 840 including text corresponding to the searched voice command or the changed voice command.
  • the processor 120 may check a user's usage pattern and set a function and a sub-function corresponding to the confirmed usage pattern, thereby guiding a voice command related to the set function and sub-function.
  • Specifically, the processor 120 may check the user's usage pattern for the sub-functions included in the selected icon, based on the log data 111 stored in the memory 110.
  • The processor 120 may identify a sub-function that satisfies a specific condition based on the user's usage pattern information. For example, based on the usage pattern information confirmed through the log data 111, the processor 120 may identify a sub-function that satisfies a condition such as a sub-function frequently used by the user but never used with a voice command, or a sub-function not used by the user at all.
  • The processor 120 may search for a voice command for the identified sub-function and control the display 130 to display the UI 840 including text corresponding to the searched voice command, as shown in FIG. 8E.
  • the processor 120 may set one of the plurality of sub-functions by using the user's usage pattern information included in the log data 111 .
  • That is, the processor 120 may extract log data related to the set function and sub-function from the log data 111 using the data model 112, and may provide the user with a voice command that is searched based on the extracted log data and corresponds to the set sub-function.
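  • A rough sketch of selecting a sub-function from the usage pattern is given below: it counts how often each sub-function appears in the log data by input type and surfaces sub-functions used often through the touch interface but never through a voice command. The record layout and the threshold are assumptions for illustration.

```python
# Sketch of picking a sub-function to guide from the usage pattern. The record
# layout (sub_function, input_type) and the threshold are illustrative.
from collections import Counter

def sub_functions_to_guide(log_records, min_touch_uses=5):
    """log_records: iterable of (sub_function, input_type) with input_type in {'touch', 'voice'}."""
    touch_uses = Counter(s for s, kind in log_records if kind == "touch")
    voice_uses = Counter(s for s, kind in log_records if kind == "voice")
    return [sub for sub, count in touch_uses.items()
            if count >= min_touch_uses and voice_uses[sub] == 0]

records = [("super slow motion shooting", "touch")] * 6 + [("flash on", "voice")]
print(sub_functions_to_guide(records))  # ['super slow motion shooting']
```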
  • FIG. 9 is a diagram for describing an electronic device for guiding a voice command according to an embodiment of the present disclosure.
  • the processor 120 may control the display 130 to display the UI 840 including text corresponding to the voice command.
  • the processor 120 may receive a user input for selecting the UI 840 displayed on the display 130 .
  • the processor 120 may output a voice message including the voice command to guide the user with the voice command.
  • For example, when the processor 120 receives a user input for selecting the UI 840 including the text "shoot in super slow motion" corresponding to the voice command, the processor 120 may control the speaker (not shown) to output a voice message guiding the voice command, such as "Hi Bixby, shoot in super slow motion".
  • the processor 120 may guide the user to the voice command by executing an application that recognizes the voice command.
  • the processor 120 may execute the virtual assistant application and control the display 130 to display a screen on which the virtual assistant is executed.
  • In addition, the processor 120 may guide the user with the voice command by displaying text corresponding to the voice command.
  • The processor 120 may then execute the function or sub-function corresponding to the voice command.
  • FIG. 10 is a diagram for explaining a method of controlling an electronic device according to an embodiment of the present disclosure.
  • Log data including a plurality of functions performed by the electronic device 100 and information on a plurality of functions may be generated.
  • Log data related to a set function may be extracted from the generated log data using a data model for extracting a sub-function that can be used with a voice command (S1010).
  • the data model may include log information on functions and sub-functions available as voice commands.
  • log data related to a function may be primarily extracted from log data, and log data related to an available sub function may be secondarily extracted using the data model.
  • the log data indicating the start of execution of the set function and the log data indicating the completion of execution of the set function may be checked in the log data to identify the log data area related to the function, and log data related to the set function may be primarily extracted from the log data using the identified log data area.
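A possible implementation of this primary extraction is sketched below; the marker event names `function_started` and `function_completed` are assumptions standing in for whatever the log data actually records as the start and completion of the set function.

```python
from typing import Dict, List

def primary_extract(log_data: List[Dict], function: str) -> List[Dict]:
    """Locate the entries marking the start and the completion of the set
    function and keep the log data area between them (inclusive)."""
    start = end = None
    for index, entry in enumerate(log_data):
        if entry.get("function") != function:
            continue
        if entry.get("event") == "function_started" and start is None:
            start = index
        elif entry.get("event") == "function_completed":
            end = index
    if start is None or end is None or end < start:
        return []
    return log_data[start:end + 1]
```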
  • a user's usage pattern may be checked based on the log data, and log data related to a function corresponding to the checked usage pattern may be primarily extracted.
  • a degree of similarity between each piece of log information included in the primarily extracted log data and the log information included in the data model may be checked.
  • each piece of log information of the extracted log data and of the data model may be converted into vector data, and the similarity may be confirmed by comparing at least one of the direction, distance, and structure between the vector data.
  • among the primarily extracted log data, log data including log information whose similarity satisfies a preset condition may be secondarily extracted.
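The similarity check and secondary extraction could look roughly as follows; vectorising each entry as a bag of word tokens and comparing only the cosine ("direction") is one simple choice among the direction/distance/structure comparisons mentioned above, and the 0.8 threshold is an arbitrary placeholder for the "preset condition".

```python
import math
from collections import Counter
from typing import Dict, List

def to_vector(log_info: str) -> Counter:
    # Very rough vectorisation of one piece of log information
    # (bag of word tokens); the disclosure leaves the encoding open.
    return Counter(log_info.lower().split())

def cosine(u: Counter, v: Counter) -> float:
    # 'Direction' comparison; distance or structure could be used instead.
    dot = sum(u[token] * v[token] for token in u)
    norm_u = math.sqrt(sum(c * c for c in u.values()))
    norm_v = math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def secondary_extract(primary: List[Dict], model_log_infos: List[str],
                      threshold: float = 0.8) -> List[Dict]:
    """Keep the primarily extracted entries whose log information is similar
    enough to some log information held in the data model."""
    model_vectors = [to_vector(info) for info in model_log_infos]
    kept = []
    for entry in primary:
        text = " ".join(str(value) for value in entry.values())
        if any(cosine(to_vector(text), mv) >= threshold
               for mv in model_vectors):
            kept.append(entry)
    return kept
```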
  • a voice command for performing together the function and the sub-function included in the extracted log data may be searched for using the extracted log data (S1020).
  • the searched voice command may be changed based on the sub-function (S1030). Specifically, when the searched voice command includes a variable parameter, the searched voice command may be changed based on the set sub-function.
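The change step for a command with a variable parameter could be as simple as a template substitution; the `{sub_function}` placeholder notation below is an assumption, since the disclosure does not say how the variable parameter is encoded.

```python
def change_voice_command(searched_command: str, sub_function: str) -> str:
    """If the searched command contains a variable parameter (written here
    as a '{sub_function}' placeholder), substitute the sub-function that
    was actually set; otherwise return the command unchanged."""
    if "{sub_function}" in searched_command:
        return searched_command.format(
            sub_function=sub_function.replace("_", " "))
    return searched_command

# change_voice_command("Shoot in {sub_function}", "super_slow_motion")
# -> "Shoot in super slow motion"
```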
  • the changed voice command may be guided (S1040).
  • a UI including text corresponding to the changed voice command may be displayed.
  • an application for recognizing a changed voice command may be executed.
  • a voice message including a changed voice command may be output.
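Tying the sketches above together, one hypothetical end-to-end reading of the control method (extract, search, change, guide) is shown below; it reuses the illustrative helpers `primary_extract`, `secondary_extract`, `extract_and_search`, `change_voice_command`, and `guide_voice_command` defined earlier and is not the claimed implementation.

```python
from typing import Dict, List

def control_method(log_data: List[Dict], function: str, sub_function: str,
                   model_log_infos: List[str]) -> None:
    """End-to-end sketch of the flow of FIG. 10 using the helpers above."""
    primary = primary_extract(log_data, function)                     # extract (S1010)
    extracted = secondary_extract(primary, model_log_infos)           # extract (S1010)
    searched = extract_and_search(extracted, function, sub_function)  # search (S1020)
    if searched is None:
        return                                                        # nothing to guide
    changed = change_voice_command(searched, sub_function)            # change (S1030)
    guide_voice_command(changed)                                      # guide (S1040)
```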
  • the various operations described above as being performed through at least one of the electronic device 100 or the server 200 may be performed through one or more electronic devices, in the form of a method of controlling an electronic device or a method of controlling or operating a system including the electronic device.
  • the embodiments described in the present disclosure may be implemented using at least one of ASICs (application specific integrated circuits), DSPs (digital signal processors), DSPDs (digital signal processing devices), PLDs (programmable logic devices), FPGAs (field programmable gate arrays), processors, controllers, micro-controllers, microprocessors, or electrical units for performing other functions.
  • embodiments described herein may be implemented by the processor itself. In a software implementation, embodiments such as the procedures and functions described in this specification may be implemented as separate software modules. Each of the software modules described above may perform one or more of the functions and operations described herein.
  • computer instructions for performing a processing operation in a user device or an administrator device may be stored in a non-transitory computer-readable medium.
  • when the computer instructions stored in the non-transitory computer-readable medium are executed by the processor of a specific device, the specific device performs the processing operations of the user device and/or the administrator device according to the various embodiments described above.
  • the non-transitory readable medium refers to a medium that stores data semi-permanently, rather than a medium that stores data for a short moment, such as a register, cache, memory, and the like, and can be read by a device.
  • examples of the non-transitory readable medium include a CD, a DVD, a hard disk, a Blu-ray disc, a USB, a memory card, a ROM, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Acoustics & Sound (AREA)
  • General Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Debugging And Monitoring (AREA)

Abstract

Disclosed is a method of controlling an electronic device. The control method according to the present disclosure comprises the steps of: extracting, using a data model for extracting a sub-function available through a voice command, log data related to a set function from log data including a plurality of functions executed by the electronic device and sub-function information on each of the plurality of functions; searching, using the extracted log data, for a voice command for executing together a function and a sub-function included in the extracted log data; changing the searched voice command on the basis of the sub-function; and guiding the changed voice command.
PCT/KR2021/011659 2020-10-12 2021-08-31 Dispositif électronique et procédé de commande associé WO2022080659A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2020-0131456 2020-10-12
KR1020200131456A KR20220048374A (ko) 2020-10-12 2020-10-12 전자 장치 및 이의 제어 방법

Publications (1)

Publication Number Publication Date
WO2022080659A1 true WO2022080659A1 (fr) 2022-04-21

Family

ID=81208352

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/011659 WO2022080659A1 (fr) 2020-10-12 2021-08-31 Dispositif électronique et procédé de commande associé

Country Status (2)

Country Link
KR (1) KR20220048374A (fr)
WO (1) WO2022080659A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102496126B1 (ko) 2022-06-09 2023-02-06 노현승 골프 어드레스 정렬을 위한 장치
WO2024034785A1 (fr) * 2022-08-10 2024-02-15 주식회사 지어소프트 Procédé et dispositif permettant de fournir un service de marché en ligne à reconnaissance vocale à l'aide de la télévision par ip (iptv)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060018888A (ko) * 2003-06-12 2006-03-02 모토로라 인코포레이티드 캐시 특징을 갖는 분배형 음성 인식 시스템 및 방법
KR20090045878A (ko) * 2007-11-02 2009-05-08 (주)인피니티 텔레콤 모바일 에이전트 기능을 가진 이동통신 단말기 및 이동통신단말기를 이용한 모바일 에이전트 구동 방법
KR20130078518A (ko) * 2011-12-30 2013-07-10 삼성전자주식회사 전자 장치 및 그의 제어 방법
KR20160127614A (ko) * 2015-04-27 2016-11-04 엘지전자 주식회사 이동 단말기 및 그것의 제어방법
US20190138539A1 (en) * 2012-06-21 2019-05-09 Google, Llc Dynamic language model

Also Published As

Publication number Publication date
KR20220048374A (ko) 2022-04-19

Similar Documents

Publication Publication Date Title
WO2020045927A1 (fr) Dispositif électronique et procédé de génération de raccourci de commande rapide
WO2022080659A1 (fr) Dispositif électronique et procédé de commande associé
WO2017078361A1 (fr) Dispositif électronique et procédé de reconnaissance vocale
WO2021025350A1 (fr) Dispositif électronique gérant une pluralité d'agents intelligents et son procédé de fonctionnement
WO2015030474A1 (fr) Dispositif électronique et procédé de reconnaissance vocale
WO2014035195A2 (fr) Appareil d'interface utilisateur dans un terminal utilisateur et procédé permettant le fonctionnement de celui-ci
WO2020091183A1 (fr) Dispositif électronique de partage de commande vocale spécifique à l'utilisateur et son procédé de commande
WO2014035199A1 (fr) Appareil d'interface utilisateur dans un terminal d'utilisateur et son procédé de support
WO2020167006A1 (fr) Procédé de fourniture de service de reconnaissance vocale et dispositif électronique associé
WO2019235793A1 (fr) Dispositif électronique et procédé permettant de fournir des informations relatives à une application au moyen d'une unité d'entrée
WO2021049877A1 (fr) Appareil électronique pour sélectionner un assistant ia et son procédé de fourniture de réponse
EP3818720A1 (fr) Appareil électronique et son procédé de commande
WO2019168315A1 (fr) Procédé de rendu graphique de zone de confiance et dispositif d'affichage utilisant celui-ci
WO2018056642A2 (fr) Dispositif électronique et son procédé de gestion d'applications
WO2019112181A1 (fr) Dispositif électronique pour exécuter une application au moyen d'informations de phonème comprises dans des données audio, et son procédé de fonctionnement
WO2020032655A1 (fr) Procédé d'exécution d'une fonction basée sur la voix et dispositif électronique le prenant en charge
WO2020032564A1 (fr) Dispositif électronique et procédé permettant de fournir un ou plusieurs articles en réponse à la voix d'un utilisateur
EP3087752A1 (fr) Appareil de terminal utilisateur, appareil électronique, système et procédé de commande associé
WO2017206867A1 (fr) Procédé et appareil d'arrêt de capteurs, support d'informations, et dispositif électronique
WO2015178716A1 (fr) Procédé et dispositif de recherche
WO2019190062A1 (fr) Dispositif électronique destiné au traitement d'une entrée vocale utilisateur
WO2015142031A1 (fr) Appareil de terminal utilisateur, appareil électronique, système et procédé de commande associé
WO2020076086A1 (fr) Système de traitement d'énoncé d'utilisateur et son procédé de fonctionnement
WO2020071858A1 (fr) Appareil électronique et procédé de fourniture de service d'assistant correspondant
WO2020075960A1 (fr) Dispositif électronique, dispositif électronique externe et procédé de commande de dispositif électronique externe à l'aide d'un dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21880307

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21880307

Country of ref document: EP

Kind code of ref document: A1