WO2021107371A1 - Electronic device and control method therefor - Google Patents

Electronic device and control method therefor

Info

Publication number
WO2021107371A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
state
search
processor
user
Prior art date
Application number
PCT/KR2020/013056
Other languages
English (en)
Korean (ko)
Inventor
유지원
Original Assignee
삼성전자(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자(주)
Publication of WO2021107371A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/903 Querying
    • G06F 16/9032 Query formulation
    • G06F 16/90335 Query processing
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/30 Monitoring
    • G06F 11/3055 Monitoring arrangements for monitoring the status of the computing system or of the computing system component, e.g. monitoring if the computing system is on, off, available, not available
    • G06F 11/3065 Monitoring arrangements determined by the means or processing involved in reporting the monitored data
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning

Definitions

  • the present invention relates to an electronic device that provides a solution according to the device state of the electronic device, and a control method therefor.
  • the state of the electronic device is identified according to state values indicated by a plurality of state items of the electronic device, a search phrase may be generated using a plurality of search keywords obtained based on the identified state of the electronic device, and a search result obtained based on the generated search phrase may be provided.
  • the processor may generate the search phrase based on a model trained to determine the arrangement order of the plurality of search keywords.
  • the model may be trained to identify the state of the electronic device according to state values indicated by the plurality of state items.
  • the model may be trained to obtain the plurality of search keywords based on the identified state of the electronic device.
  • the model may be trained such that the search keyword includes similar keywords.
  • the processor may classify the plurality of state items by category, and identify the state of the electronic device based on information indicating a plurality of states of the electronic device for each state value indicated by the classified state items.
  • the processor may identify the state of the electronic device based on information corresponding to a current state value among information prepared for each of a plurality of state values of the electronic device.
  • the processor may generate the search phrase by additionally using the user's usage history.
  • the use history may include at least one of a connection history of an external device, an application use history, and a search history.
  • the processor may adjust a weight of a corresponding state among a plurality of states of the electronic device based on the user's usage history, and identify the state of the electronic device corresponding to the state value based on the adjusted weight.
  • the processor may generate the search phrase by additionally using a setting history of the electronic device.
  • the processor may provide a search result selected based on a predefined priority among the plurality of obtained search results.
  • the priority may include the number of times another user searches for the search result.
  • the processor may obtain the search result based on a search phrase selected by a user input among a plurality of the search phrases.
  • the processor may perform a subsequent operation based on a search result selected by a user input among a plurality of search results.
  • a control method of an electronic device comprising: identifying a state of the electronic device according to state values indicated by a plurality of state items of the electronic device; generating a search phrase using a plurality of search keywords obtained based on the identified state of the electronic device; and providing a search result obtained based on the generated search phrase.
  • the generating of the search phrase may include generating the search phrase based on a model trained to align an arrangement order of the plurality of search keywords.
  • the identifying of the state of the electronic device may include classifying the plurality of state items into categories, and identifying the state of the electronic device based on information indicating a plurality of states of the electronic device for each state value indicated by the classified state items.
  • the generating of the search phrase may include generating the search phrase by additionally using a user's usage history.
  • the control method of the electronic device includes: identifying the state of the electronic device according to state values indicated by a plurality of state items of the electronic device; generating a search phrase using a plurality of search keywords obtained based on the identified state of the electronic device; and providing a search result obtained based on the generated search phrase.
  • the present invention can efficiently solve a user's requirements that occur while the user uses the electronic device, thereby increasing the user's convenience in using the electronic device.
  • FIG. 1 is a block diagram illustrating the configuration of an electronic device according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an operation flowchart of an electronic device according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a status item of an electronic device according to an embodiment of the present invention.
  • FIG. 4 is a diagram illustrating a status item of an electronic device according to an embodiment of the present invention.
  • FIG. 5 is a diagram illustrating a user's electronic device usage history according to an embodiment of the present invention.
  • FIG. 6 is a diagram illustrating an operation of an electronic device according to an embodiment of the present invention.
  • FIG. 7 is a diagram illustrating an operation of an electronic device according to an embodiment of the present invention.
  • FIG. 8 is a diagram illustrating an operation of an electronic device according to an embodiment of the present invention.
  • FIG. 9 is a diagram illustrating an operation of an electronic device according to an embodiment of the present invention.
  • a 'module' or 'unit' performs at least one function or operation, may be implemented as hardware, software, or a combination of hardware and software, and may be integrated into and implemented as at least one module.
  • 'at least one' of a plurality of elements refers to all of the plurality of elements, as well as to each element or any combination thereof excluding the rest.
  • FIG. 1 is a block diagram illustrating the configuration of an electronic device according to an embodiment of the present invention.
  • the electronic device 100 may be implemented as a display device capable of displaying an image.
  • the electronic device 100 may include a TV, a computer, a smart phone, a tablet, a portable media player, a wearable device, a video wall, an electronic picture frame, and the like.
  • the electronic device 100 may be implemented as various types of devices, such as an image processing device such as a set-top box without a display, household appliances such as a Bluetooth speaker, a refrigerator, a washing machine, and an information processing device such as a computer body.
  • when the electronic device 100 does not include a display, components for displaying an image, such as the display unit, may not be included.
  • the electronic device 100 may output an image signal or the like to an external display device such as a TV through the interface unit 110 .
  • the electronic device 100 may include an interface unit 110 .
  • the interface unit 110 may include a wired interface unit 111 .
  • the wired interface unit 111 includes a connector or port to which an antenna capable of receiving a broadcast signal according to a broadcasting standard such as terrestrial/satellite broadcasting can be connected, or to which a cable capable of receiving a broadcast signal according to the cable broadcasting standard can be connected.
  • the electronic device 100 may have a built-in antenna capable of receiving a broadcast signal.
  • the wired interface unit 111 may include connectors or ports according to video and/or audio transmission standards, such as an HDMI port, DisplayPort, DVI port, Thunderbolt, composite video, component video, super video, SCART, and the like.
  • the wired interface unit 111 may include a connector or port according to a universal data transmission standard such as a USB port.
  • the wired interface unit 111 may include a connector or a port to which an optical cable can be connected according to an optical transmission standard.
  • the wired interface unit 111 is connected to an external microphone or an external audio device having a microphone, and may include a connector or a port capable of receiving or inputting an audio signal from the audio device.
  • the wired interface unit 111 is connected to an audio device such as a headset, earphone, or external speaker, and may include a connector or port capable of transmitting or outputting an audio signal to the audio device.
  • the wired interface unit 111 may include a connector or port according to a network transmission standard such as Ethernet.
  • the wired interface unit 111 may be implemented as a LAN card connected to a router or a gateway by wire.
  • the wired interface unit 111 may be connected by wire, through the connector or port, to an external device such as a set-top box, an optical media playback device, an external display device, a speaker, or a server in a 1:1 or 1:N (N is a natural number) manner, thereby receiving a video/audio signal from the corresponding external device or transmitting a video/audio signal to the corresponding external device.
  • the wired interface unit 111 may include a connector or port for separately transmitting video/audio signals.
  • the wired interface unit 111 may be embedded in the electronic device 100, or may be implemented in the form of a dongle or module detachably attached to a connector of the electronic device 100.
  • the interface unit 110 may include a wireless interface unit 112 .
  • the wireless interface unit 112 may be implemented in various ways corresponding to the implementation form of the electronic device 100 .
  • the wireless interface unit 112 may use wireless communication methods such as RF (radio frequency), Zigbee, Bluetooth, Wi-Fi, UWB (Ultra WideBand), and NFC (Near Field Communication).
  • the wireless interface unit 112 may be implemented as a wireless communication module that performs wireless communication with an AP according to a Wi-Fi method, or a wireless communication module that performs one-to-one direct wireless communication such as Bluetooth.
  • the wireless interface unit 112 may transmit and receive data packets to and from the server by wirelessly communicating with the server on the network.
  • the wireless interface unit 112 may include an IR transmitter and/or an IR receiver capable of transmitting and/or receiving an IR (Infrared) signal according to an infrared communication standard.
  • the wireless interface unit 112 may receive or input a remote control signal from a remote control or other external device through an IR transmitter and/or an IR receiver, or transmit or output a remote control signal to another external device.
  • the electronic device 100 may transmit/receive a remote control signal to and from the remote control or other external device through the wireless interface unit 112 of another method such as Wi-Fi or Bluetooth.
  • the electronic device 100 may further include a tuner for tuning the received broadcast signal for each channel.
  • the electronic device 100 may include a display unit 120 .
  • the display unit 120 includes a display panel capable of displaying an image on the screen.
  • the display panel is provided with a light-receiving structure such as a liquid crystal type or a self-luminous structure such as an OLED type.
  • the display unit 120 may further include additional components according to the structure of the display panel. For example, if the display panel is a liquid crystal type, the display unit 120 includes a liquid crystal display panel, a backlight unit for supplying light, and a panel driving substrate for driving the liquid crystal of the liquid crystal display panel.
  • the electronic device 100 may include a user input unit 130 .
  • the user input unit 130 includes various types of input interface related circuits provided to perform user input.
  • the user input unit 130 may take various forms depending on the type of the electronic device 100, for example, a mechanical or electronic button unit of the electronic device 100, a remote controller separate from the electronic device 100, a touch pad, a touch screen installed on the display unit 120, and the like.
  • the electronic device 100 may include a storage unit 140 .
  • the storage unit 140 stores digitized data.
  • the storage unit 140 includes storage with non-volatile properties that can preserve data regardless of whether power is provided, and memory with volatile properties into which data to be processed by the processor 170 is loaded and which cannot preserve data when power is not provided.
  • storage includes flash memory, a hard disk drive (HDD), a solid-state drive (SSD), read-only memory (ROM), and the like, and memory includes a buffer, random access memory (RAM), and the like.
  • the electronic device 100 may include a microphone 150 .
  • the microphone 150 collects sounds of the external environment including the user's voice.
  • the microphone 150 transmits the collected sound signal to the processor 170 .
  • the electronic device 100 may include a microphone 150 for collecting user voices or may receive a voice signal from an external device such as a remote controller having a microphone or a smart phone through the interface unit 110 .
  • a remote controller application may be installed in an external device to control the electronic device 100 or to perform functions such as voice recognition. An external device with such an application installed may receive a user voice, and may exchange data with and control the electronic device 100 using Wi-Fi/BT, infrared, or the like.
  • a plurality of interface units 110 implementing such communication methods may exist in the electronic device 100.
  • the electronic device 100 may include a speaker 160 .
  • the speaker 160 outputs audio data processed by the processor 170 as sound.
  • the speaker 160 may include a unit speaker provided to correspond to audio data of one audio channel, and may include a plurality of unit speakers to respectively correspond to audio data of a plurality of audio channels.
  • the speaker 160 may output the search result when the search result is composed of audio data, for example because the electronic device 100 does not have a display, but the search result is not limited to video data.
  • the speaker 160 may be provided separately from the electronic device 100 . In this case, the electronic device 100 may transmit audio data to the speaker 160 through the interface unit 110 .
  • the electronic device 100 may include a processor 170 .
  • the processor 170 includes one or more hardware processors implemented with a CPU, a chipset, a buffer, a circuit, etc. mounted on a printed circuit board, and may be implemented as a system on chip (SOC) depending on a design method.
  • the processor 170 includes modules corresponding to various processes such as a demultiplexer, a decoder, a scaler, an audio digital signal processor (DSP), and an amplifier.
  • some or all of these modules may be implemented as SOC.
  • a module related to image processing such as a demultiplexer, decoder, and scaler may be implemented as an image processing SOC
  • an audio DSP may be implemented as a chipset separate from the SOC.
  • the processor 170 may convert the voice signal into voice data.
  • the voice data may be text data obtained through a speech-to-text (STT) process for converting a voice signal into text data.
  • the processor 170 identifies a command indicated by the voice data, and performs an operation according to the identified command.
  • the voice data processing process and the command identification and execution process may all be executed in the electronic device 100 .
  • at least a part of the process may be performed by at least one server communicatively connected to the electronic device 100 through a network.
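  • To make the voice-processing flow above concrete, the following is a minimal Python sketch, not part of the original disclosure, of how a voice signal might be converted to text and mapped to a command either locally or via a server; the names `speech_to_text`, `identify_command`, and `COMMAND_TABLE`, and the simple keyword matching, are illustrative assumptions only.

```python
from typing import Optional

# Illustrative sketch only: a toy STT-and-command pipeline assumed for explanation,
# not the actual implementation of the electronic device 100.

COMMAND_TABLE = {
    "volume up": "increase_volume",
    "volume down": "decrease_volume",
    "open manual": "open_user_manual",
}

def speech_to_text(voice_signal: bytes) -> str:
    """Placeholder STT step; a real device would call an STT engine or a server."""
    # Here we pretend the signal is already UTF-8 text, purely for demonstration.
    return voice_signal.decode("utf-8").lower().strip()

def identify_command(voice_data: str) -> Optional[str]:
    """Match recognized text against known commands (rule-based example)."""
    for phrase, command in COMMAND_TABLE.items():
        if phrase in voice_data:
            return command
    return None

if __name__ == "__main__":
    text = speech_to_text(b"Please volume up a little")
    print(identify_command(text))  # -> "increase_volume"
```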
  • the processor 170 may call at least one command among commands of software stored in a storage medium readable by a machine such as the electronic device 100 and execute it. This enables a device such as the electronic device 100 to be operated to perform at least one function according to the called at least one command.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not include a signal (e.g., an electromagnetic wave); this term does not distinguish between cases in which data is stored semi-permanently in a storage medium and cases in which data is stored temporarily.
  • the processor 170 identifies the state of the electronic device according to the state values indicated by the plurality of state items, and may perform at least a portion of the data analysis, processing, and result information generation for generating a search phrase based on the identified state using a rule-based algorithm or an artificial intelligence algorithm such as machine learning, a neural network, or a deep learning algorithm.
  • the processor 170 may perform the functions of the learning unit and the recognition unit together.
  • the learner may perform a function of generating a learned neural network
  • the recognizer may perform a function of recognizing (or inferring, predicting, estimating, or judging) data using the learned neural network.
  • the learning unit may generate or update the neural network.
  • the learning unit may acquire learning data to generate a neural network.
  • the learning unit may acquire the learning data from the storage unit 140 or the outside.
  • the learning data may be data used for learning of the neural network, and the neural network may be trained by using the data obtained by performing the above-described operation as learning data.
  • the learning unit may perform a preprocessing operation on the acquired training data before training the neural network using the training data, or may select data to be used for learning from among a plurality of training data. For example, the learning unit may process the learning data into a preset format, filter it, or add/remove noise to process the learning data into a form suitable for learning.
  • the learner may generate a neural network set to perform the above-described operation by using the preprocessed learning data.
  • the learned neural network may be composed of a plurality of neural networks (or layers). Nodes of the plurality of neural networks have weights, and the plurality of neural networks may be connected to each other so that an output value of one neural network is used as an input value of another neural network.
  • examples of neural networks include models such as a Convolutional Neural Network (CNN), Deep Neural Network (DNN), Recurrent Neural Network (RNN), Restricted Boltzmann Machine (RBM), Deep Belief Network (DBN), Bidirectional Recurrent Deep Neural Network (BRDNN), and Deep Q-Networks.
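  • As a purely illustrative aid, not part of the patent text, the following NumPy sketch shows the idea described above: nodes with weights arranged in layers, with the output of one network used as the input of the next. The layer sizes and random weights are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_layer(x, weights, bias):
    """One weighted layer followed by a ReLU non-linearity."""
    return np.maximum(weights @ x + bias, 0.0)

# Two small "networks" chained together: the output of net_a feeds net_b.
net_a = {"w": rng.normal(size=(8, 4)), "b": np.zeros(8)}
net_b = {"w": rng.normal(size=(3, 8)), "b": np.zeros(3)}

state_values = np.array([1.0, 0.0, 0.5, 12.0])       # e.g. encoded device state items
hidden = dense_layer(state_values, net_a["w"], net_a["b"])
output = dense_layer(hidden, net_b["w"], net_b["b"])  # e.g. scores over candidate states
print(output.shape)  # (3,)
```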
  • the recognizer may acquire target data to perform the above-described operation.
  • the target data may be obtained from the storage 140 or from the outside.
  • the target data may be data to be recognized by the neural network.
  • the recognizer may perform preprocessing on the acquired target data before applying the target data to the learned neural network, or select data to be used for recognition from among a plurality of target data.
  • the recognition unit may process the target data into a preset format, filter, or add/remove noise to process the target data into a form suitable for recognition.
  • the recognizer may obtain an output value output from the neural network by applying the preprocessed target data to the neural network.
  • the recognition unit may obtain a probability value or a reliability value together with the output value.
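  • The recognizer's flow (preprocess the target data, apply it to the learned model, and obtain an output together with a probability or reliability value) could look roughly like the hedged sketch below; the min-max normalization and softmax choices are assumptions for illustration, not the disclosed method.

```python
import numpy as np

def preprocess(target_data: np.ndarray) -> np.ndarray:
    """Example preprocessing: cast to float and scale into a preset range."""
    data = target_data.astype(np.float64)
    return (data - data.min()) / (np.ptp(data) + 1e-9)

def recognize(model_weights: np.ndarray, target_data: np.ndarray):
    """Apply preprocessed data to a (toy) linear model; return label and reliability."""
    x = preprocess(target_data)
    scores = model_weights @ x
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                     # softmax as an example reliability measure
    label = int(np.argmax(probs))
    return label, float(probs[label])

weights = np.array([[0.2, -0.1, 0.5], [0.7, 0.3, -0.4]])  # hypothetical trained weights
print(recognize(weights, np.array([3, 1, 2])))
```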
  • the control method of the electronic device 100 may be provided by being included in a computer program product.
  • the computer program product may include instructions for software executed by the processor 170 , as described above.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., CD-ROM), or may be distributed directly or online (e.g., downloaded or uploaded) via an application store (e.g., Play Store™) or between two user devices (e.g., smartphones). In the case of online distribution, at least a part of the computer program product may be temporarily stored or temporarily created in a machine-readable storage medium such as a memory of a manufacturer's server, an application store's server, or a relay server.
  • the electronic device 100 has a plurality of functions, and a plurality of status items exist according to each function.
  • the processor 170 identifies the state of the electronic device according to the state values indicated by the plurality of state items of the electronic device 100 ( S210 ).
  • state items include computing context items such as network connection status, communication bandwidth, printer, display, and workstation; user context items including user information, location, and nearby people; physical status items such as screen brightness and sound volume; and temporal status items related to time (time context).
  • the state item may be identified and collected by the processor 170 through various sensing devices and applications and used to provide various application services, or may be used to define the state of the electronic device in combination with other state information.
  • the state value is a value that continuously changes according to the state of the electronic device 100 and means an actual value of the electronic device 100 for each state item. That is, the processor 170 may identify, from among the plurality of states of the electronic device, the state corresponding to the current state values of the electronic device 100 through each state value of the plurality of state items identified by the electronic device 100.
  • the processor 170 may store information on the state of the electronic device 100 in the storage unit 140 or receive information about the state of the electronic device from the server, but is not limited thereto.
  • the processor 170 generates a search phrase using a plurality of search keywords obtained based on the identified state of the electronic device (S220). Since the data based on the identified state of the electronic device includes keywords, the processor 170 may generate a search phrase by using them according to a predefined criterion. The processor 170 provides a result obtained based on the generated search phrase (S230). The processor 170 searches through the device guide stored in the server or the storage unit 140 using the generated search phrase, and provides the obtained result.
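  • For readers tracing the flowchart of FIG. 2, here is a minimal Python sketch of the three steps S210 to S230 (identify the state, generate a search phrase, provide a result); the state table, the keyword joining, and the `DEVICE_GUIDE` lookup are hypothetical stand-ins, not the actual implementation.

```python
from typing import Optional

# Hypothetical stand-ins for the S210-S230 flow of FIG. 2 (illustrative only).

KNOWN_STATES = {
    # state values (as a tuple of item readings) -> identified device state
    ("sound_output=tv", "optical=plugged"): "sound bar connected but not selected",
    ("sound_output=soundbar", "optical=unplugged"): "sound bar cable disconnected",
}

DEVICE_GUIDE = {
    "sound bar optical no sound": "Check the TV sound output setting and the optical cable.",
}

def identify_state(state_values: tuple) -> Optional[str]:        # S210
    return KNOWN_STATES.get(state_values)

def generate_search_phrase(keywords: list) -> str:               # S220
    return " ".join(keywords)

def provide_result(phrase: str) -> str:                          # S230
    return DEVICE_GUIDE.get(phrase, "No matching guide entry found.")

state = identify_state(("sound_output=tv", "optical=plugged"))
if state:
    phrase = generate_search_phrase(["sound bar", "optical", "no sound"])
    print(provide_result(phrase))
```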
  • in this way, the electronic device 100 generates a search phrase according to the state of the device based on its state values and provides a search result, so that requirements arising while the user uses the electronic device can be resolved efficiently, thereby increasing the convenience of using the electronic device.
  • FIG. 3 is a diagram illustrating a hierarchical structure in which state items of an electronic device are classified according to an embodiment of the present invention.
  • the status item of the electronic device according to an embodiment of the present invention may be determined based on data such as a user manual of the electronic device, but is not limited thereto.
  • the processor 170 may generate different hierarchical structures as data such as user manuals are set by various languages, and may update the corresponding hierarchical structure when the language setting in the electronic device 100 is changed.
  • status items of the electronic device are classified hierarchically based on information including a user manual.
  • the problem category may be broadly classified into a screen, a sound, a network, and the like.
  • the contents within the angle brackets (<, >) correspond to the type that classifies the status items, such as the problem category, and 'screen', 'sound', and 'network' correspond to keywords for generating search phrases. More details will be described later with reference to FIG. 4.
  • the 'no sound' item 321 may be further classified by the device from which no sound is produced, that is, the TV itself or a device connected to the TV (connection devices 331 and 332).
  • sound may not be output from the TV itself or from a device connected to the TV, for example, a sound bar.
  • the TV setting exists as a solution (solution category, 341), and furthermore, the TV volume and TV sound mode exist as the reasons for such a state (solution reason, 351, 352).
  • the items 361 and 362 classified thereafter are end items of the state item having a hierarchical structure, and it is possible to compare whether the end item matches the current state of the electronic device based on the current state value of the TV.
  • returning to the sound bar item among the connected devices (332), the sound bar may be classified by the connection type with which it is connected to the TV (connection type 371, 372, 373).
  • the sound bar can be connected to the TV through optical, HDMI, or Bluetooth formats.
  • when the optical type is reclassified, TV setting, sound bar setting, cable, and HW exist as solution categories (342, 343, 344), and furthermore, TV volume and TV sound mode exist as the reasons for such a state (solution reason, 351, 352).
  • the item 363 classified thereafter is a terminal item, and it is possible to compare whether the terminal item matches the current state of the electronic device based on the current state value of the TV.
  • the status items shown in FIG. 3 are not limited thereto, and status items may be added or deleted in some cases. This is also true of the contents disclosed in each state item.
  • the illustrated hierarchical structure of state items is classified according to the functions and usage environments of one device, and the processor 170 may define the states of a plurality of electronic devices by referring to such a hierarchical structure. Accordingly, when generating a search phrase based on the state of the electronic device, the processor 170 may obtain keywords for logically and efficiently generating the search phrase by utilizing this structure.
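  • A nested dictionary is one natural way to hold a hierarchy like that of FIG. 3 (problem category, device, connection type, solution category, solution reason, terminal items); the sketch below is an assumed data layout for illustration, not the structure actually claimed, and the terminal item strings are invented examples.

```python
# Assumed data layout for a FIG. 3-style hierarchy (illustrative only).
STATE_HIERARCHY = {
    "sound": {                                   # problem category
        "no sound": {                            # symptom item (321)
            "TV": {                              # connection device (331)
                "TV setting": {                  # solution category (341)
                    "TV volume": ["volume is 0", "muted"],        # terminal items
                    "TV sound mode": ["output set to external"],
                },
            },
            "sound bar": {                       # connection device (332)
                "optical": {                     # connection type (371)
                    "sound bar setting": ["input source not optical"],
                    "cable": ["optical cable unplugged"],
                },
            },
        },
    },
}

def path_to(hierarchy, target, path=()):
    """Return the path of ancestor state items leading to a target item."""
    for key, child in hierarchy.items():
        if key == target:
            return path + (key,)
        if isinstance(child, dict):
            found = path_to(child, target, path + (key,))
            if found:
                return found
    return None

print(path_to(STATE_HIERARCHY, "optical"))
# ('sound', 'no sound', 'sound bar', 'optical')
```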
  • FIG. 4 is a diagram illustrating a status item of an electronic device according to an embodiment of the present invention.
  • FIG. 5 is a diagram illustrating a user's history of using an electronic device according to an embodiment of the present invention.
  • referring to FIG. 4, it is possible to extract usable state items according to a situation by using the states of the plurality of electronic devices defined as described above with reference to FIG. 3.
  • the processor 170 may extract state items related thereto. For example, based on the TV volume, TV sound mode, and sound bar setting, it is possible to extract related state items up to the highest state item along the hierarchical structure of FIG. 4. That is, in the case of TV volume and TV sound mode, TV setting, TV, no sound, and sound can be extracted, and in the case of sound bar setting, sound bar setting, optical, sound bar, no sound, and sound can be extracted.
  • the processor 170 may create a dictionary in which state items composed of types and keywords are combined as described above with reference to FIG. 3. This is so that the processor 170 can use the types and keywords when it later generates a search phrase based on the identified state of the electronic device. In this case, the processor 170 may also add, in advance, keywords similar to the keyword extracted from the state item in order to efficiently generate a search phrase. For example, in the case of 'TV setting', there are 'TV connection', 'TV setting method', 'connection method', and the like.
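  • Such a type/keyword dictionary with pre-added similar keywords might be represented as below; the 'TV setting' synonyms are taken from the description, while the other synonym entries are assumptions added only to make the sketch self-contained.

```python
# Illustrative type -> keyword dictionary, with similar keywords added in advance.
KEYWORD_DICTIONARY = {
    "device": {"sound bar": ["soundbar", "audio bar"]},           # assumed synonyms
    "connection type": {"optical": ["optical cable", "SPDIF"]},   # assumed synonyms
    "symptom": {"no sound": ["sound not coming out", "mute"]},
    "solution category": {"TV setting": ["TV connection", "TV setting method",
                                         "connection method"]},   # from the description
}

def expand(type_name: str, keyword: str) -> list:
    """Return the keyword together with its pre-registered similar keywords."""
    return [keyword] + KEYWORD_DICTIONARY.get(type_name, {}).get(keyword, [])

print(expand("solution category", "TV setting"))
```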
  • generating the search phrase may be performed by the processor 170 using a predefined algorithm or using a model learned based on machine learning.
  • the processor 170 may generate a search phrase based on a keyword and a type extracted according to the situation in which no sound is produced from the sound bar in FIG. 4, and on a dictionary generated based thereon. For example, the processor 170 may use "<device> <connection type> <symptom>" among the types extracted in FIG. 4 to generate the search phrase.
  • the processor 170 may determine the arrangement order of the plurality of keywords.
  • to determine the arrangement order of the plurality of keywords, the processor 170 may use a model in which the utterance tendency of a general or specific user has been learned.
  • the model may include a plurality of sub-models classified by category according to the linguistic characteristics of the keyword.
  • the model may include a plurality of sub-models classified by category according to the state characteristics of the device.
  • the model may include a plurality of sub-models classified according to user characteristics.
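  • Putting the pieces together, a search phrase can be formed by filling a type template such as "<device> <connection type> <symptom>" and then ordering the keywords; the bigram scoring below is a hypothetical stand-in for the learned utterance-tendency model and is an assumption, not the trained model itself.

```python
from itertools import permutations

# Hypothetical stand-in for a learned utterance-tendency model:
# a higher score means a more natural keyword order.
PREFERRED_BIGRAMS = {("sound bar", "optical"): 2.0, ("optical", "no sound"): 2.0,
                     ("optical", "sound bar"): 1.0, ("no sound", "optical"): 0.5}

def order_score(keywords: tuple) -> float:
    return sum(PREFERRED_BIGRAMS.get(pair, 0.0) for pair in zip(keywords, keywords[1:]))

def fill_template(template: list, extracted: dict) -> list:
    """Fill a type template such as ['device', 'connection type', 'symptom']."""
    return [extracted[t] for t in template if t in extracted]

extracted = {"device": "sound bar", "connection type": "optical", "symptom": "no sound"}
keywords = fill_template(["device", "connection type", "symptom"], extracted)

best = max(permutations(keywords), key=order_score)
print(" ".join(best))   # -> "sound bar optical no sound"
```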
  • FIG. 6 is a diagram illustrating an operation of an electronic device according to an embodiment of the present invention.
  • the processor 170 generates search phrases under the assumption of a state of one electronic device in which sound is output from the TV itself and no sound is output from the sound bar while the TV is used.
  • from the viewpoint of the processor 170, it is unknown whether the user intentionally set the TV sound mode to the TV because the user wants to hear the sound from the TV, or whether the user wants to listen through the sound bar but sound is being output to the TV.
  • as shown in FIG. 6, it is possible to identify the state of the electronic device according to the current situation based on the user's usage history and the setting history of the electronic device, in addition to the actual state values of the TV.
  • the user's usage history includes at least one of a connection history of an external device, an application use history, and a search history, but is not limited thereto.
  • the processor 170 may obtain the user's usage history or the setting history of the electronic device from the storage 140 or receive it from the server, but is not limited thereto.
  • as the actual state values according to the TV state items, the TV volume is 12, only HDMI[4] and Optical are plugged in among the connectors, and a set-top box and a sound bar are connected to the respective connectors.
  • the device for outputting sound is set to the TV, so information related to the sound bar is currently unknown.
  • according to the connection history of the external device, the sound bar was connected and then disconnected on August 7, 2019, and accordingly, the TV sound mode was changed from optical to TV by the system. There is also a history of opening and closing the user manual application on the next day, August 8, 2019.
  • it can also be seen that the user searched for 'sound bar' in the user manual application and selected 'how to reconnect the sound bar', and searched for 'output sound from the sound bar', 'weather today', and the like on the portal site.
  • based on the TV status items and the user's connection history, the processor 170 can know that the TV sound mode was changed by the system when the TV and the sound bar were disconnected, and the user's intention can be inferred based on the history of the user searching for 'sound bar' in the user manual application in a similar time period to when the state of the electronic device was changed, and on the history of searching for 'sound bar', 'how to connect the sound bar', and the like on the server.
  • the processor 170 can accurately extract the state of the electronic device needed by the user, from among the states of the electronic device defined in the hierarchical structure shown in FIG. 4, based on the state items of the electronic device and the user's intention.
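  • The weight-adjustment idea described above (boost the weights of candidate device states that the usage and setting history points to, then select the best-matching state) could be sketched as follows; the candidate states, history entries, and weighting factors are illustrative assumptions, not the disclosed algorithm.

```python
# Illustrative weight adjustment over candidate device states (assumed values).
candidate_states = {
    "user intentionally set sound output to TV": 0.5,
    "sound bar disconnected, system switched output to TV": 0.5,
}

usage_history = [
    "searched 'sound bar' in user manual application",
    "searched 'how to connect the sound bar' on portal site",
    "sound bar connection removed on 2019-08-07",
]

def adjust_weights(states: dict, history: list) -> dict:
    adjusted = dict(states)
    for event in history:
        if "sound bar" in event:   # history suggests the user wants to use the sound bar
            adjusted["sound bar disconnected, system switched output to TV"] += 0.2
    return adjusted

weights = adjust_weights(candidate_states, usage_history)
print(max(weights, key=weights.get))  # the state with the highest adjusted weight
```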
  • FIG. 7 shows a state in which a search phrase generated by the processor 170 based on the identified state of the electronic device is displayed on the display.
  • the processor 170 may generate a search phrase based on the user's usage history, the setting history of the electronic device, etc. in addition to the keyword extracted from the state value of the electronic device.
  • at least one search phrase may be generated; for example, a plurality of search phrases such as "Sound bar optical setting method", "Sound bar optical connection method", "Optical sound bar no sound", and "Optical sound bar not connected" can be created.
  • the processor 170 may receive a user input from a remote control or the like through the interface unit 110 and perform a subsequent operation based on the search phrase selected according to the user input.
  • the user input may be made through a microphone or the like as well as a remote control button or a touch screen, but is not limited thereto.
  • while an embodiment of the present invention is implemented as an electronic device having a display, when the electronic device does not have a display, such as a Bluetooth speaker, the search phrase may be displayed on another display device connected thereto, or may be output in various ways, such as by voice.
  • FIG. 8 is a diagram illustrating an operation of an electronic device according to an embodiment of the present invention.
  • when a search phrase is selected from a plurality of search phrases according to a user input, or when only one search phrase is generated, the processor 170 performs a subsequent operation using the selected or generated search phrase.
  • the subsequent operation includes performing a search through a user manual application or the web using the generated search phrase.
  • each search result using all of the generated search phrases may be aggregated and displayed.
  • the processor 170 may display the search results obtained by searching through the application and the web in order of priority based on a predefined priority.
  • when the search results are displayed in order of a predefined priority, more accurate search results can be provided than by simply listing the search results or displaying the search result screen of a specific web portal site as it is.
  • for example, the processor 170 may prioritize the documents in the order d, a, b, c, e, and by displaying content, including links to the documents, in that sequence, users can use more accurate search results. As another embodiment, the processor 170 may determine the priority by identifying, through the server, the number of times other users have used the search result.
  • the processor 170 may assign priority to the documents a, b, c, d, and e found as a result of the search using the search phrase, in order of the highest number of views.
  • priorities may be assigned in various ways, and the present invention is not limited thereto.
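  • Ranking retrieved documents by a predefined priority such as other users' view counts reduces to a simple sort; the sketch below assumes a view-count table fetched from the server, which is an illustrative assumption, and reproduces the example order d, a, b, c, e mentioned above.

```python
# Illustrative priority ranking of search results by other users' view counts.
search_results = ["a", "b", "c", "d", "e"]                       # documents found for the phrase
view_counts = {"a": 120, "b": 95, "c": 40, "d": 300, "e": 10}    # assumed server statistics

def rank_by_priority(results: list, counts: dict) -> list:
    """Sort documents so that the most-viewed (highest-priority) come first."""
    return sorted(results, key=lambda doc: counts.get(doc, 0), reverse=True)

print(rank_by_priority(search_results, view_counts))   # ['d', 'a', 'b', 'c', 'e']
```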
  • the processor 170 may provide a plurality of results determined to be highly relevant, including the corresponding result, in a pop-up form on the display or may guide the result by voice.
  • the priority of the document provided most often as a search result for the electronic device status items, the user's usage history, and other users in a similar situation is increased, and by the server periodically learning this, documents with high priority are preferentially provided when search results are later provided to other users in a similar situation.
  • after providing the search result to the user, the processor 170 may provide a user interface that asks the user's intention, such as whether the provided search result is suitable for the user's situation or is a useful search result.
  • follow-up operations may be performed accordingly. For example, according to one embodiment, after the user interface 'Were the search results helpful?' is provided, if the user selects 'Yes', the screen returns to the home screen, and based on the information obtained through the search result, the user may reset the use environment or perform another operation; if the user selects 'No', the search phrase may be regenerated or the process of re-identifying the state of the electronic device may be performed.
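  • The 'Were the search results helpful?' follow-up branch can be summarized in a few lines; the handler name and return strings below are hypothetical and only illustrate the yes/no branching described above.

```python
# Hypothetical yes/no handling of the "Were the search results helpful?" prompt.
def handle_feedback(answer: str) -> str:
    if answer.lower() == "yes":
        # Return to the home screen; the user may then reset the use environment.
        return "return_to_home_screen"
    # Otherwise regenerate the search phrase or re-identify the device state.
    return "regenerate_search_phrase_or_reidentify_state"

print(handle_feedback("No"))
```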

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Quality & Reliability (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to one embodiment of the present invention, an electronic device can identify the state of the electronic device according to state values indicated by a plurality of state items of the electronic device, generate a search phrase using a plurality of search keywords acquired on the basis of the identified state of the electronic device, and provide a search result acquired on the basis of the generated search phrase.
PCT/KR2020/013056 2019-11-27 2020-09-25 Electronic device and control method therefor WO2021107371A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190153892A KR20210065308A (ko) 2019-11-27 2019-11-27 Electronic device and control method therefor
KR10-2019-0153892 2019-11-27

Publications (1)

Publication Number Publication Date
WO2021107371A1 true WO2021107371A1 (fr) 2021-06-03

Family

ID=76128801

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/013056 WO2021107371A1 (fr) 2019-11-27 2020-09-25 Electronic device and control method therefor

Country Status (2)

Country Link
KR (1) KR20210065308A (fr)
WO (1) WO2021107371A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010041982A1 (en) * 2000-05-11 2001-11-15 Matsushita Electric Works, Ltd. Voice control system for operating home electrical appliances
US20030009265A1 (en) * 2001-06-01 2003-01-09 Richard Edwin Community energy consumption management
JP2014153990A (ja) * 2013-02-12 2014-08-25 Sony Corp 情報処理装置、情報処理方法およびプログラム
KR101913633B1 (ko) * 2011-10-26 2018-11-01 삼성전자 주식회사 전자 기기 제어 방법 및 이를 구비한 장치
KR20190114321A (ko) * 2018-03-29 2019-10-10 삼성전자주식회사 전자 장치 및 그 제어 방법

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010041982A1 (en) * 2000-05-11 2001-11-15 Matsushita Electric Works, Ltd. Voice control system for operating home electrical appliances
US20030009265A1 (en) * 2001-06-01 2003-01-09 Richard Edwin Community energy consumption management
KR101913633B1 (ko) * 2011-10-26 2018-11-01 삼성전자 주식회사 전자 기기 제어 방법 및 이를 구비한 장치
JP2014153990A (ja) * 2013-02-12 2014-08-25 Sony Corp 情報処理装置、情報処理方法およびプログラム
KR20190114321A (ko) * 2018-03-29 2019-10-10 삼성전자주식회사 전자 장치 및 그 제어 방법

Also Published As

Publication number Publication date
KR20210065308A (ko) 2021-06-04

Similar Documents

Publication Publication Date Title
WO2016035933A1 (fr) Dispositif d'affichage et son procédé de fonctionnement
EP3721361A1 (fr) Procédé d'apprentissage de vocabulaire personnalisé inter-domaines et dispositif électronique associé
WO2014035061A1 (fr) Dispositif d'affichage et procédé de recherche vocale
WO2019135523A1 (fr) Dispositif électronique, son procédé de commande et produit de programme informatique
EP2941895A1 (fr) Appareil d'affichage et procédé de commande d'un appareil d'affichage dans un système de reconnaissance vocale
EP2941896A1 (fr) Appareil électronique commandé par la voix d'un utilisateur et procédé pour le commander
WO2020054980A1 (fr) Procédé et dispositif d'adaptation de modèle de locuteur basée sur des phonèmes
WO2022154270A1 (fr) Procédé de génération de vidéo de temps forts et dispositif électronique associé
WO2019103347A1 (fr) Dispositif électronique et son procédé de commande
EP3916723B1 (fr) Dispositifs pour la provision des résultats de recherche en réponse à des énoncés d'utilisateur
WO2014142422A1 (fr) Procédé permettant de traiter un dialogue d'après une expression d'instruction de traitement, et appareil associé
WO2018169276A1 (fr) Procédé pour le traitement d'informations de langue et dispositif électronique associé
WO2021091145A1 (fr) Appareil électronique et procédé associé
US11587571B2 (en) Electronic apparatus and control method thereof
WO2021020825A1 (fr) Dispositif électronique, procédé de commande associé et support d'enregistrement
WO2021107371A1 (fr) Electronic device and control method therefor
WO2020130350A1 (fr) Appareil d'affichage et son procédé de commande
WO2021091063A1 (fr) Dispositif électronique et procédé de commande associé
WO2021167230A1 (fr) Dispositif électronique et son procédé de commande
US11942089B2 (en) Electronic apparatus for recognizing voice and method of controlling the same
WO2021112391A1 (fr) Dispositif électronique et son procédé de commande
WO2021256760A1 (fr) Dispositif électronique mobile et son procédé de commande
WO2021049802A1 (fr) Dispositif électronique et son procédé de commande
WO2021241938A1 (fr) Dispositif électronique et son procédé de commande
WO2021141332A1 (fr) Dispositif électronique et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20891474

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20891474

Country of ref document: EP

Kind code of ref document: A1