WO2023106591A1 - Electronic device and method for providing content on the basis of a user's emotion - Google Patents

Electronic device and method for providing content on the basis of a user's emotion

Info

Publication number
WO2023106591A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
electronic device
processor
emotion
value
Prior art date
Application number
PCT/KR2022/015225
Other languages
English (en)
Korean (ko)
Inventor
김정희
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2023106591A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 Indexing scheme relating to G06F3/01
    • G06F 2203/011 Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Definitions

  • The descriptions below relate to an electronic device and a method for providing content based on a user's emotion.
  • Emotion recognition technology, which identifies a person's emotion through at least one of the person's voice, facial expression, and bio-signals, is being discussed. Research and development on emotion recognition technology using machine learning and artificial intelligence is in progress. An electronic device may be set to recognize a user's emotion and to operate based on the user's emotion.
  • the electronic device may recommend at least one piece of content to the user based on user history.
  • the electronic device may recommend at least one piece of content to the user based on the user's content viewing record.
  • An operation of an electronic device for recommending at least one piece of content identified to the user based on the user's emotion as well as the user's content viewing record is required.
  • An electronic device may include a communication circuit and at least one processor operatively coupled to the communication circuit.
  • the at least one processor may be configured to receive, using the communication circuit, first information including user data obtained from a user from a second external electronic device distinct from a first external electronic device while first content is reproduced in the first external electronic device.
  • the at least one processor may be configured to identify second information about a type of emotion of the user corresponding to the user data according to a specified period based on the received first information.
  • the at least one processor may be configured to acquire one or more second contents in response to identifying that the value of the user's emotion identified based on the second information is maintained within a specified range.
  • the at least one processor may be configured to, in response to acquiring the one or more second contents, request the first external electronic device to display a visual object for guiding reproduction of one of the one or more acquired second contents.
  • a method of an electronic device may include an operation of receiving, using a communication circuit, first information including user data obtained from a user from a second external electronic device distinct from a first external electronic device while first content is reproduced in the first external electronic device. The method may include an operation of identifying, based on the received first information, second information about the type of the user's emotion corresponding to the user data according to a specified period. The method may include an operation of acquiring one or more second contents in response to identifying that the value of the user's emotion identified based on the second information is maintained within a specified range. The method may include an operation of requesting, in response to acquiring the one or more second contents, that the first external electronic device display a visual object for guiding reproduction of one of the one or more acquired second contents.
  • a non-transitory computer readable storage medium may store one or more programs.
  • the one or more programs may include instructions that cause the electronic device to receive, using the communication circuit, first information comprising user data obtained from a user from a second external electronic device distinct from a first external electronic device while first content is reproduced in the first external electronic device.
  • the one or more programs may include instructions that cause the electronic device to identify, based on the received first information, second information about the type of the user's emotion corresponding to the user data according to a specified period.
  • the one or more programs may include instructions that cause the electronic device to obtain one or more second contents in response to identifying that the value of the user's emotion identified based on the second information is maintained within a specified range.
  • the one or more programs may include instructions that cause the electronic device to request, in response to acquiring the one or more second contents, that the first external electronic device display a visual object for guiding reproduction of one of the one or more acquired second contents.
  • the electronic device may identify a user's emotion type based on user data (eg, biosignal data, face data, voice data, or input data). Based on the user's emotion type, the electronic device may identify at least one content for changing or maintaining the user's emotion type. The electronic device may request the external electronic device to display a visual object for reproducing one of at least one user's content.
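  • As an illustration only (the disclosure contains no program code), the end-to-end flow summarized above can be sketched in Python as follows; every function name and the toy logic are hypothetical stand-ins, not the disclosed implementation:

```python
import random

def receive_user_data() -> dict:
    """Stand-in for first information received over the communication
    circuit from the second external electronic device."""
    return {"bio": random.random(), "face": random.random()}

def identify_emotion_type(user_data: dict) -> str:
    """Stand-in for identifying second information (the emotion type)
    from the user data once per designated period."""
    return random.choice(["pleasure", "annoyance", "boredom", "comfort"])

def emotion_value(emotion_type: str) -> int:
    """Illustrative signed mapping: positive types -> +1, negative -> -1."""
    return 1 if emotion_type in ("pleasure", "comfort") else -1

history: list[int] = []
for _ in range(10):  # one iteration per designated period
    history.append(emotion_value(identify_emotion_type(receive_user_data())))
    # If the value stays within a specified range (here: negative) for
    # three consecutive periods, obtain one or more second contents and
    # ask the first external device to display a visual object guiding
    # reproduction of one of them.
    if len(history) >= 3 and all(v < 0 for v in history[-3:]):
        second_contents = ["content-A", "content-B"]  # placeholder lookup
        print("request visual object for:", second_contents[0])
```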
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments.
  • FIG. 2 illustrates an environment including an electronic device, a first external electronic device, and a second external electronic device according to various embodiments of the present disclosure.
  • FIG. 3 is a simplified block diagram of an electronic device according to various embodiments.
  • FIG. 4 is a flowchart illustrating an operation of an electronic device according to various embodiments.
  • FIG. 5 is a flowchart illustrating an operation of an electronic device according to various embodiments.
  • FIG. 6 illustrates an example of a user's emotion type according to various embodiments.
  • FIGS. 7A to 7D are diagrams for explaining an operation of an electronic device according to various embodiments.
  • FIGS. 8A to 8C illustrate examples of operations of an electronic device according to various embodiments.
  • FIGS. 9A and 9B illustrate examples of visual objects displayed in a first external electronic device according to various embodiments.
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments.
  • an electronic device 101 may communicate with an electronic device 102 through a first network 198 (eg, a short-range wireless communication network), or may communicate with at least one of the electronic device 104 or the server 108 through a second network 199 (eg, a long-distance wireless communication network). According to one embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • in some embodiments, at least one of these components (eg, the connection terminal 178) may be omitted, or one or more other components may be added.
  • in some embodiments, some of these components (eg, the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into a single component (eg, the display module 160).
  • the processor 120 may, for example, execute software (eg, the program 140) to control at least one other component (eg, a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 120 may store instructions or data received from another component (eg, the sensor module 176 or the communication module 190) in the volatile memory 132, process the instructions or data stored in the volatile memory 132, and store resultant data in the non-volatile memory 134.
  • the processor 120 may include a main processor 121 (eg, a central processing unit or an application processor) or an auxiliary processor 123 (eg, a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor).
  • the auxiliary processor 123 may be implemented separately from, or as part of, the main processor 121.
  • the auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (eg, the display module 160, the sensor module 176, or the communication module 190), in place of the main processor 121 while the main processor 121 is in an inactive (eg, sleep) state, or together with the main processor 121 while the main processor 121 is in an active (eg, application execution) state.
  • according to one embodiment, the auxiliary processor 123 (eg, an image signal processor or a communication processor) may be implemented as part of another component (eg, the camera module 180 or the communication module 190) that is functionally related to it.
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • An artificial intelligence model may be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself in which the artificial intelligence model is executed, or through a separate server (eg, the server 108).
  • the learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • the artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more thereof, but is not limited to the foregoing examples.
  • the artificial intelligence model may include, in addition or alternatively, software structures in addition to hardware structures.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176) of the electronic device 101 .
  • the data may include, for example, input data or output data for software (eg, program 140) and commands related thereto.
  • the memory 130 may include volatile memory 132 or non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used by a component (eg, the processor 120) of the electronic device 101 from the outside of the electronic device 101 (eg, a user).
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output sound signals to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • a receiver may be used to receive an incoming call. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display module 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
  • the display module 160 may include a touch sensor set to detect a touch or a pressure sensor set to measure the intensity of force generated by the touch.
  • the audio module 170 may convert sound into an electrical signal or vice versa. According to one embodiment, the audio module 170 may obtain sound through the input module 150, or may output sound through the sound output module 155 or an external electronic device (eg, the electronic device 102 (eg, a speaker or headphones)) connected directly or wirelessly to the electronic device 101.
  • the sensor module 176 may detect an operating state (eg, power or temperature) of the electronic device 101 or an external environmental state (eg, a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a bio sensor, a temperature sensor, a humidity sensor, or a light sensor.
  • the interface 177 may support one or more designated protocols that may be used to directly or wirelessly connect the electronic device 101 to an external electronic device (eg, the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 178 may include a connector through which the electronic device 101 may be physically connected to an external electronic device (eg, the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert electrical signals into mechanical stimuli (eg, vibration or motion) or electrical stimuli that a user may perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to one embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as at least part of a power management integrated circuit (PMIC), for example.
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • the communication module 190 may support establishment of a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (eg, the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • the communication module 190 may include a wireless communication module 192 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (eg, a local area network (LAN) communication module or a power line communication module).
  • a corresponding communication module among these communication modules may communicate with an external electronic device through the first network 198 (eg, a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (eg, a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (eg, a LAN or a WAN)).
  • the wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (eg, an International Mobile Subscriber Identifier (IMSI)) stored in the subscriber identification module 196.
  • the wireless communication module 192 may support a 5G network after a 4G network, and a next-generation communication technology, eg, new radio (NR) access technology.
  • the NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • the wireless communication module 192 may support a high frequency band (eg, mmWave band) to achieve a high data rate, for example.
  • the wireless communication module 192 may support various technologies for securing performance in a high frequency band, such as beamforming, massive multiple-input and multiple-output (massive MIMO), full dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large scale antenna.
  • the wireless communication module 192 may support various requirements defined for the electronic device 101, an external electronic device (eg, the electronic device 104), or a network system (eg, the second network 199).
  • the wireless communication module 192 may support a peak data rate for eMBB realization (eg, 20 Gbps or more), loss coverage for mMTC realization (eg, 164 dB or less), or U-plane latency for URLLC realization (eg, 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less).
  • the antenna module 197 may transmit or receive signals or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (eg, PCB).
  • the antenna module 197 may include a plurality of antennas (eg, an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • other components (eg, a radio frequency integrated circuit (RFIC)) may be additionally formed as part of the antenna module 197, in addition to the radiator.
  • the antenna module 197 may form a mmWave antenna module.
  • the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (eg, a lower surface) of the printed circuit board and capable of supporting a designated high frequency band (eg, the mmWave band), and a plurality of antennas (eg, array antennas) disposed on or adjacent to a second surface (eg, a top surface or a side surface) of the printed circuit board and capable of transmitting or receiving signals in the designated high frequency band.
  • at least some of the above-described components may be connected to each other through a communication method between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or part of operations executed in the electronic device 101 may be executed in one or more external electronic devices among the external electronic devices 102 , 104 , or 108 .
  • when the electronic device 101 needs to perform a certain function or service automatically, or in response to a request from a user or another device, the electronic device 101 may request one or more external electronic devices to perform the function or at least part of the service, instead of executing the function or service by itself.
  • One or more external electronic devices receiving the request may execute at least a part of the requested function or service or an additional function or service related to the request, and deliver the execution result to the electronic device 101 .
  • the electronic device 101 may provide the result, as it is or after additional processing, as at least part of a response to the request.
  • to this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an internet of things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks. According to one embodiment, the external electronic device 104 or server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to intelligent services (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • while first content is reproduced in a first external electronic device, the electronic device may receive, from a second external electronic device distinct from the first external electronic device, first information including user data obtained from the user.
  • the electronic device may identify second information about the user's emotion type.
  • the electronic device may obtain one or more pieces of second content in response to identifying that the value of the user's emotion identified based on the second information is maintained within a specified range.
  • the electronic device may request the first external electronic device to display a visual object for guiding reproduction of one of the one or more second contents.
  • the operation of the electronic device (or the processor of the electronic device) for the above-described embodiment is described below.
  • the electronic device described below may correspond to the server 108 of FIG. 1 .
  • FIG. 2 illustrates an environment including an electronic device, a first external electronic device, and a second external electronic device according to various embodiments of the present disclosure.
  • an environment 200 may include an electronic device 210 , a first external electronic device 201 , and a second external electronic device 202 .
  • Electronic device 210 may correspond to server 108 of FIG. 1 .
  • the first external electronic device 201 and the second external electronic device 202 may each correspond to the electronic device 101 of FIG. 1.
  • the electronic device 210 may establish a connection with the first external electronic device 201 .
  • the electronic device 210 may store various contents.
  • to reproduce at least one of the various contents, the electronic device 210 may transmit a signal including at least one content (eg, the first content) to the first external electronic device 201.
  • the first external electronic device 201 may include a display.
  • the first external electronic device 201 may be used to reproduce at least one piece of content received from the electronic device 210 .
  • the first external electronic device 201 may identify a user input for reproducing the first content.
  • the first external electronic device 201 may receive a user input for reproducing the first content from the user.
  • the first external electronic device 201 may request information (eg, first content) for reproducing the first content from the electronic device 210 based on a user input.
  • the first external electronic device 201 may receive information (eg, first content) for reproducing the first content from the electronic device 210 .
  • the first external electronic device 201 may reproduce the first content received from the electronic device 210 based on the information for reproducing the first content.
  • the electronic device 210 may establish a connection with the second external electronic device 202 .
  • the electronic device 210 may obtain user data related to the first external electronic device 201 from the second external electronic device 202 .
  • the second external electronic device 202 may transmit the acquired user data to the electronic device 210 .
  • the user data may include at least one of the user's bio-signal data, the user's face data, the user's voice data, and the user's input data.
  • the second external electronic device 202 may establish a connection with the first external electronic device 201.
  • the second external electronic device 202 may establish a connection with the first external electronic device 201 through various communication techniques.
  • the second external electronic device 202 may establish a connection with the first external electronic device 201 through Bluetooth or a wireless local area network (WLAN).
  • the second external electronic device 202 may transmit the acquired user data to the first external electronic device 201 .
  • the first external electronic device 201 may transmit user data received from the second external electronic device 202 to the electronic device 210 .
  • the second external electronic device 202 may include a plurality of external devices.
  • Each of a plurality of external devices may be used to identify user data.
  • for example, a first device (eg, a device including a biosensor) may identify the user's biosignal data (eg, heart rate or blood pressure). A second device (eg, a device including a camera) may identify the user's face data. A third device (eg, a device including a microphone) may identify the user's voice data. A fourth device (eg, a device for receiving user input) may identify the user's input data.
  • the second external electronic device 202 may be connected to at least one external device.
  • the user data obtained from the second external electronic device 202 may include user data obtained from at least one external device connected to the second external electronic device 202 .
  • the second external electronic device 202 may receive user data (eg, biosignal data, voice data, facial data, or user input data) obtained from at least one external device.
  • the second external electronic device 202 may obtain user data by receiving user data from at least one external device.
  • the first external electronic device 201 may perform at least some or all of the functions of the second external electronic device 202 .
  • the first external electronic device 201 may obtain user data.
  • the first external electronic device 201 may transmit the acquired user data to the electronic device 210 .
  • the first external electronic device 201 may perform at least some or all of the functions of the electronic device 210.
  • the first external electronic device 201 may store a plurality of contents.
  • the first external electronic device 201 may reproduce at least one of a plurality of stored contents.
  • the first external electronic device 201 may identify at least one of a plurality of contents based on user data.
  • FIG. 3 is a simplified block diagram of an electronic device according to various embodiments.
  • the electronic device 210 may correspond to the server 108 of FIG. 1 .
  • the electronic device 210 may include a processor 310, a communication circuit 320, and/or a memory 330. In other words, the electronic device 210 may include at least one of the processor 310, the communication circuit 320, and the memory 330; depending on the embodiment, some of these components may be omitted.
  • the processor 310 may correspond to the processor 120 of FIG. 1 .
  • the processor 310 may be operatively or operably coupled with or connected with the communication circuitry 320 and the memory 330 .
  • the processor 310 may include at least one processor.
  • the processor 310 may include a hardware component for processing data based on one or more instructions.
  • Hardware components for processing data may include, for example, an Arithmetic and Logic Unit (ALU), a Field Programmable Gate Array (FPGA), and/or a Central Processing Unit (CPU).
  • the processor 310 may be configured to cause the electronic device 210 to perform operations when executing the instructions stored in the memory 330.
  • the electronic device 210 may include a communication circuit 320 .
  • the communication circuit 320 may correspond to the communication module 190 of FIG. 1 .
  • communication circuitry 320 may be used for various radio access technologies (RATs).
  • the communication circuit 320 may be used to communicate with the first external electronic device 201 and/or the second external electronic device 202 .
  • the processor 310 may transmit information for reproducing content in the first external electronic device 201 through the communication circuit 320 .
  • the processor 310 may receive user data from the second external electronic device 202 through the communication circuit 320 .
  • the electronic device 210 may include a memory 330 .
  • memory 330 may correspond to memory 130 of FIG. 1 .
  • the memory 330 may store a plurality of contents.
  • the processor 310 may transmit at least some of the plurality of contents stored in the memory 330 to the first external electronic device 201 .
  • the memory 330 may store information about a type of content reproduced by the first external electronic device 201 and a user's emotion for the reproduced content.
  • the memory 330 may store information about the type of user's emotion acquired according to a specified period while content is being reproduced.
  • FIG. 4 is a flowchart illustrating an operation of an electronic device according to various embodiments. This method may be executed by the electronic device 210 shown in FIG. 2 or FIG. 3, or by the processor 310 of the electronic device 210.
  • the processor 310 may receive first information including user data. For example, while the first content is reproduced in the first external electronic device 201, the processor 310 may receive, using the communication circuit 320, first information including user data obtained from the user from the second external electronic device 202, which is distinct from the first external electronic device 201.
  • the processor 310 may identify the first external electronic device 201 including a display that reproduces the first content.
  • the first external electronic device 201 may include a television, a monitor, or a smart phone.
  • the first external electronic device 201 may include a display.
  • the first external electronic device 201 may reproduce or display content through a display.
  • the first external electronic device 201 may identify a user input for reproducing the first content.
  • the first external electronic device 201 may receive a user input for reproducing the first content.
  • the first external electronic device 201 may request information for reproducing the first content from the electronic device 210 based on a user input for reproducing the first content.
  • the processor 310 of the electronic device 210 may transmit the first content or information for reproducing the first content to the first external electronic device 201 .
  • the processor 310 may identify a second external electronic device 202 that obtains user data related to the first external electronic device 201 .
  • the second external electronic device 202 may obtain user data.
  • the second external electronic device 202 may include a smart phone, a smart watch, a camera, a microphone, or a remote control.
  • the user data may include at least one of the user's bio-signal data, the user's face data, the user's voice data, and the user's input data.
  • the user data may further include information about the user's content viewing record, the user's preference, or the user's feedback.
  • the user of the first external electronic device 201 may be the same as the user of the second external electronic device 202 .
  • the first external electronic device 201 and the second external electronic device 202 may be used by the same user. While the first external electronic device 201 is reproducing the first content, the second external electronic device 202 may identify (or obtain) user data about the user.
  • user data may be obtained (or identified) through the second external electronic device 202 based on a user input for reproducing the first content received from the first external electronic device 201 .
  • the processor 310 may request acquisition of user data from the second external electronic device 202 based on a user input for reproducing the first content.
  • the second external electronic device 202 may display a notification indicating whether to consent to the acquisition of user data.
  • the second external electronic device 202 may receive a user input representing consent to acquisition of user data.
  • the second external electronic device 202 may acquire user data based on a user input indicating consent to acquisition of user data.
  • the second external electronic device 202 may not acquire user data based on a user input indicating refusal (or suspension) of acquisition of user data.
  • the second external electronic device 202 may receive a user input indicating suspension of acquisition of user data.
  • the second external electronic device 202 may stop the operation of obtaining user data.
  • the processor 310 may receive first information including user data from the second external electronic device 202 .
  • the processor 310 may receive, in response to identifying the second external electronic device 202, first information including user data obtained from the user from the second external electronic device 202.
  • the processor 310 may identify the first information by receiving the first information including user data from the second external electronic device 202 .
  • the processor 310 may receive first information including user data from the first external electronic device 201 .
  • user data obtained through the second external electronic device 202 may be transmitted to the first external electronic device 201 .
  • the processor 310 may identify first information by receiving first information including user data from the first external electronic device 201 .
  • the processor 310 may identify second information about the user's emotion type. For example, the processor 310 may identify second information about the type of user's emotion corresponding to the user data according to a designated period based on the received first information.
  • the user's emotion type may include first to fourth types.
  • the first type may be a type related to pleasure.
  • the second type may be a type related to annoyance.
  • the third type may be a type related to boredom.
  • the fourth type may be a type related to comfort.
  • the processor 310 may identify the user's emotion type based on the first information.
  • the processor 310 may identify the user's emotion type corresponding to the user data based on the first information.
  • the processor 310 may identify a user's emotion type corresponding to user data based on data learned through a machine learning model.
  • in a state in which the data included in the first information is set as an input value of the machine learning model (eg, an input vector including elements mapped to nodes of an input layer of the machine learning model), the processor 310 may identify the user's emotion type by obtaining an output value of the machine learning model (eg, an output vector including elements mapped to nodes of an output layer of the machine learning model).
  • learning of the machine learning model may include adjusting weights among a plurality of nodes included in a neural network (eg, a feedforward neural network, a convolutional neural network (CNN), a recurrent neural network (RNN), and/or a long short-term memory (LSTM) model) based on supervised learning and/or unsupervised learning.
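  • Purely as an illustration of the input-vector/output-vector pattern described above (the disclosure does not specify a model architecture), the sketch below feeds a feature vector to a toy two-layer network and reads the emotion type off the largest output-layer activation; the layer sizes and random placeholder weights are assumptions, and real weights would come from the training just described:

```python
import numpy as np

rng = np.random.default_rng(0)
TYPES = ["pleasure", "annoyance", "boredom", "comfort"]

W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)  # input layer -> hidden layer
W2, b2 = rng.normal(size=(16, 4)), np.zeros(4)   # hidden layer -> output layer

def classify(features: np.ndarray) -> str:
    """Map an input vector (elements fed to input-layer nodes) to the
    emotion type whose output-layer node activates most strongly."""
    h = np.tanh(features @ W1 + b1)
    logits = h @ W2 + b2
    return TYPES[int(np.argmax(logits))]

# eg an 8-element vector of feature points extracted from user data
print(classify(rng.normal(size=8)))
```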
  • the processor 310 may identify the user's emotion type as one of a plurality of emotion types based on the first information.
  • the processor 310 may classify the plurality of emotion types into four types.
  • the processor 310 may identify the identified user's emotion type as one of four types.
  • the processor 310 may identify the user's emotion type as one of 28 emotion types according to Russell's circumplex model.
  • the processor 310 may classify (or reclassify or cluster) the 28 emotion types into 4 types.
  • the second information about the type of user's emotion may include at least one field.
  • Information for indicating the type of user's emotion may be included in each of the at least one field.
  • information indicating one of the first to fourth types may be included in the first field.
  • Information indicating one of the first to fourth types may be included in the second field.
  • Information indicating one of the first to fourth types may be included in the third field.
  • Information indicating one of the first to fourth types may be included in the fourth field.
  • the designated period may be identified based on the reproduction time of the first content.
  • the processor 310 may identify a user input for reproducing the first content received from the first external electronic device.
  • the processor 310 may identify a play time of the first content based on the identified user input.
  • the processor 310 may identify a designated period based on the reproduction time of the first content.
  • the processor 310 may set the designated period to be longer as the reproduction time of the first content is longer.
  • the processor 310 may set the designated period to be shorter as the reproduction time of the first content is shorter.
  • the designated period may be set to be the same regardless of the reproduction time of the content.
  • a designated period may be set so that the second information is identified the same number of times for each content.
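  • As a hypothetical example of the last option (a period chosen so that the second information is identified the same number of times for every content), the period can simply divide the reproduction time by a fixed sample count; the function name and the count of 12 are assumptions:

```python
def designated_period(reproduction_time_min: float, samples: int = 12) -> float:
    """Longer contents get a longer period, shorter contents a shorter one,
    and every content is sampled `samples` times."""
    return reproduction_time_min / samples

print(designated_period(120))  # 10.0-minute period for a two-hour content
print(designated_period(30))   # 2.5-minute period for a 30-minute content
```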
  • the processor 310 may obtain one or more second contents.
  • the processor 310 may obtain one or more second contents in response to identifying that the value of the user's emotion identified based on the second information is maintained within a specified range.
  • the processor 310 may identify a value for the user's emotion based on the second information. For example, the processor 310 may identify a value for the user's emotion based on the second information about the type of the user's emotion. The processor 310 may identify a value for the user's emotion based on the type of the user's emotion. For example, the processor 310 may convert (or change) a user's emotion type into a user's emotion value. For example, the processor 310 may classify the user's emotion type into a positive emotion type and a negative emotion type. The processor 310 may identify the positive emotion type as a positive number. The processor 310 may identify the negative emotion type as a negative number.
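  • A minimal sketch of this conversion, assuming (consistently with FIG. 6 below) that pleasure and comfort are the positive types and annoyance and boredom the negative ones; the magnitudes of +1 and -1 are illustrative, since the text only fixes the signs:

```python
POSITIVE = {"pleasure", "comfort"}
NEGATIVE = {"annoyance", "boredom"}

def emotion_value(emotion_type: str) -> int:
    """Convert a user's emotion type into a signed emotion value."""
    if emotion_type in POSITIVE:
        return 1
    if emotion_type in NEGATIVE:
        return -1
    return 0  # eg a type identified as neutral

print(emotion_value("boredom"))  # -1
```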
  • the processor 310 may identify that the value of the user's emotion is maintained within a specified range.
  • the processor 310 may identify that the value of the user's emotion is maintained below a specified value (eg, 0).
  • the processor 310 may identify that the user's emotion is maintained as a negative emotion based on identifying that the user's emotion value is maintained below a specified value.
  • the processor 310 may obtain one or more second contents for changing the user's emotion value to outside the specified range, in response to identifying that the user's emotion value is maintained below the specified value.
  • the processor 310 may identify that the value of the user's emotion is maintained at a specified value (eg, 0) or higher.
  • the processor 310 may identify that the user's emotion is maintained as a positive emotion based on identifying that the user's emotion value is maintained above a specified value.
  • the processor 310 may obtain one or more second contents for maintaining the user's emotion value within a specified range in response to identifying that the user's emotion value is maintained above the specified value.
  • the processor 310 may identify that the value of the user's emotion is maintained within a specified range for a time interval identified based on the specified period. For example, the processor 310 may identify that the value for the user's emotion is maintained for a time interval corresponding to three cycles.
  • the one or more second contents may include contents for maintaining or changing a user's emotion.
  • one or more second contents may be set as one or more contents for changing the user's emotion from negative emotion to positive emotion.
  • one or more second contents may be set as one or more contents for maintaining the user's emotion as a positive emotion.
  • the processor 310 may identify one or more second contents based on the user's viewing record and the third information about the type of the user's emotion with respect to the user's viewing record.
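  • The maintenance check and the resulting selection goal might be sketched as follows, reusing the signed values from the sketch above and assuming the three-cycle window of the example; `maintained` and `selection_goal` are hypothetical names:

```python
def maintained(values: list[int], cycles: int = 3) -> bool:
    """True if the last `cycles` emotion values all stayed on one side of
    the specified value (here 0), ie within one specified range."""
    window = values[-cycles:]
    return len(window) == cycles and (
        all(v < 0 for v in window) or all(v >= 0 for v in window))

def selection_goal(values: list[int]) -> str:
    """Negative run -> obtain contents that change the emotion to positive;
    positive run -> obtain contents that maintain the positive emotion."""
    return "change" if values[-1] < 0 else "maintain"

history = [-1, -1, -1]
if maintained(history):
    print(selection_goal(history))  # "change"
```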
  • the processor 310 may request display of a visual object for guiding reproduction of one of the one or more second contents. For example, in response to acquiring the one or more second contents, the processor 310 may request the first external electronic device 201 to display a visual object for guiding reproduction of one of the one or more acquired second contents.
  • a visual object for guiding reproduction of one of one or more second contents may be displayed on the display of the first external electronic device 201 while overlapping with the reproduced first content.
  • the processor 310 may identify a user input for stopping reproduction of the first content received from the first external electronic device 201 .
  • the processor 310 may store in the memory 330 second information obtained according to a designated period while the first content is reproduced based on a user input.
  • the processor 310 may store the second information together with the first content.
  • the electronic device 210 may include at least some or all of the configurations of the first external electronic device 201 and the second external electronic device 202 .
  • the processor 310 may perform operations 410 to 440 without the first external electronic device 201 and the second external electronic device 202.
  • FIG. 5 is a flowchart illustrating an operation of an electronic device according to various embodiments. This method may be executed by the electronic device 210 shown in FIG. 2 or FIG. 3, or by the processor 310 of the electronic device 210.
  • operations 510 and 520 may be related to operation 430 of FIG. 4 .
  • the processor 310 may identify a first value for the first field, a second value for the second field, a third value for the third field, and a fourth value for the fourth field.
  • the second information about the user's emotion type in operation 420 of FIG. 4 may include a first field, a second field, a third field, and a fourth field.
  • the first field may be identified based on biosignal data.
  • a second field may be identified based on facial data.
  • a third field may be identified based on voice data.
  • the fourth field may be identified based on user input data (eg, text input data).
  • each of the first to fourth fields may include information indicating a type of user's emotion.
  • the first field may include information indicating one of the first to fourth types, which are types for the user's emotion.
  • the second field may include information indicating one of the first to fourth types, which are types for the user's emotion.
  • the third field may include information indicating one of the first to fourth types, which are the user's emotion types.
  • the fourth field may include information indicating one of the first to fourth types, which are types for the user's emotion.
  • the first field may include information indicating the type of emotion of the user identified based on the biosignal data.
  • the second field may include information indicating the type of emotion of the user identified based on the facial data.
  • the third field may include information indicating the type of emotion of the user identified based on the voice data.
  • the fourth field may include information indicating the type of emotion of the user identified based on the input data.
  • the processor 310 may identify a first value for the first field, a second value for the second field, a third value for the third field, and a fourth value for the fourth field. For example, the processor 310 may identify the first to fourth values based on at least one weight set according to the first to fourth fields.
  • weights may be set for each of the first to fourth fields.
  • a first weight may be set in the first field.
  • a second weight may be set in the second field.
  • a third weight may be set in the third field.
  • a fourth weight may be set in the fourth field.
  • the processor 310 may set a larger weight for a field serving as a major criterion for identifying the type of the user's emotion.
  • the first weight may be set to be the largest.
  • the second weight and the third weight may be equally set.
  • the fourth weight may be set to the smallest.
  • the processor 310 may identify a value for the user's emotion. According to an embodiment, the processor 310 may identify a value for the user's emotion based on the first to fourth values. For example, the processor 310 may identify the sum of the first value to the fourth value as a value for the user's emotion. For another example, the processor 310 may identify an average of the first to fourth values as a value for the user's emotion.
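  • A short sketch of this combination, with illustrative weights that follow the ordering stated above (the biosignal weight largest, the face and voice weights equal, the input weight smallest); the exact numbers are assumptions:

```python
# field -> weight; the values follow the stated ordering, not disclosed numbers
WEIGHTS = {"bio": 0.4, "face": 0.25, "voice": 0.25, "input": 0.1}

def combined_emotion_value(field_values: dict[str, float]) -> float:
    """Weighted sum of the first to fourth field values; an average of the
    weighted values could be used instead, as noted above."""
    return sum(WEIGHTS[k] * v for k, v in field_values.items())

print(combined_emotion_value({"bio": 1, "face": 1, "voice": -1, "input": 0}))
# 0.4 + 0.25 - 0.25 + 0.0 = 0.4 -> positive value for the user's emotion
```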
  • operations 510 and 520 are described with reference to FIGS. 6 to 7D.
  • FIG. 6 illustrates an example of a user's emotion type according to various embodiments.
  • the processor 310 may classify the user's emotion type into first to fourth types.
  • the user's emotion type may include first to fourth types.
  • the processor 310 may identify the user's emotion type based on the biosignal data.
  • the user's emotion type may be set in various ways. For example, one of six emotion types may be set as the user's emotion type. For another example, one of 28 emotion types may be set as the user's emotion type.
  • the processor 310 may set the user's emotion as one of 28 emotion types. For example, the processor 310 may set the user's emotion to one of 28 emotion types based on the biosignal data. As another example, the processor 310 may set the user's emotion to one of 28 emotion types based on facial data. As another example, the processor 310 may set the user's emotion to one of 28 emotion types based on voice data.
  • the processor 310 may set the user's emotion to one of 28 emotion types based on user data received from the second external electronic device 202 .
  • the processor 310 may identify (or extract) at least one feature point by processing user data.
  • the processor 310 may identify at least one feature point by identifying (or analyzing) a time domain and a frequency domain of user data.
  • the processor 310 may identify the user's emotion type based on at least one characteristic point.
  • the processor 310 may identify the user's emotion type by setting at least one feature point as an input value of the machine learning model.
  • the processor 310 may identify the type of the user's emotion based on the identified user's emotion.
  • the user's emotion type may be displayed within the coordinate space 600 .
  • the x-axis of the coordinate space 600 represents valence.
  • the y-axis of the coordinate space represents the degree of arousal.
  • the user's emotion type may be included in one of a first quadrant 610 representing the first type, a second quadrant 620 representing the second type, a third quadrant 630 representing the third type, and a fourth quadrant 640 representing the fourth type.
  • the first quadrant 610 may represent a type of enjoyment.
  • when the emotional value is positive (ie, the right side) and the arousal level is an excited state (ie, the upper side), the user's emotion type may be included in the first quadrant 610.
  • emotion types of surprise and happiness may be included in the first quadrant 610 .
  • Emotional types of surprise and happiness can be identified as types related to pleasure.
  • the second quadrant 620 may represent a type of annoyance.
  • when the emotional value is negative (ie, the left side) and the arousal level is an excited state (ie, the upper side), the user's emotion type may be included in the second quadrant 620.
  • emotion types of anger and fear may be included in the second quadrant 620 .
  • the emotional types of anger and fear can be identified as types related to annoyance.
  • the third quadrant 630 may represent a type of boredom.
  • when the emotional value is negative (ie, the left side) and the arousal level is a non-excited state (ie, the lower side), the user's emotion type may be included in the third quadrant 630.
  • emotion types of disgust and sadness may be included in the third quadrant 630 .
  • the emotion types of disgust and sadness can be identified as types related to boredom.
  • the fourth quadrant 640 may represent a type of comfort.
  • when the emotional value is positive (ie, the right side) and the arousal level is a non-excited state (ie, the lower side), the user's emotion type may be included in the fourth quadrant 640.
  • the emotion types of calmness and sleepiness may be included in the fourth quadrant 640.
  • Emotional types of calm and sleepiness can be identified as types related to comfort.
  • the user's emotion type may be identified (or classified) as neutral.
  • the coordinate space 600 shows an example in which 8 emotion types are classified into 4 types, but is not limited thereto.
  • the processor 310 may classify 28 emotion types as one of 4 types.
  • the processor 310 may identify the user's emotion type as one of 28 emotion types.
  • the processor 310 may identify the identified user's emotion type as one of four types.
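  • The quadrant classification of FIG. 6 can be sketched as a sign test on the valence (x) and arousal (y) coordinates; placing one of the 28 circumplex emotions at concrete coordinates, as in the example calls, is an assumption for illustration:

```python
def quadrant_type(valence: float, arousal: float) -> str:
    """Classify an emotion by the signs of its valence/arousal coordinates."""
    if valence > 0 and arousal > 0:
        return "pleasure"   # first quadrant 610
    if valence < 0 and arousal > 0:
        return "annoyance"  # second quadrant 620
    if valence < 0 and arousal < 0:
        return "boredom"    # third quadrant 630
    if valence > 0 and arousal < 0:
        return "comfort"    # fourth quadrant 640
    return "neutral"        # on an axis

print(quadrant_type(0.7, 0.5))    # a happiness-like emotion -> "pleasure"
print(quadrant_type(-0.6, -0.3))  # a sadness-like emotion -> "boredom"
```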
  • FIGS. 7A to 7D are diagrams for explaining an operation of an electronic device according to various embodiments.
  • the processor 310 may store information about the user's emotion type according to a designated period (eg, 10 minutes).
  • the fields shown in FIG. 7A may be examples of information about the type of user's emotion identified during one period.
  • the processor 310 may identify the user's emotion type based on biosignal data, face data, voice data, and input data, respectively.
  • the processor 310 may identify biosignal data received from the second external electronic device 202 and identified during one period.
  • the processor 310 may identify the user's emotion type based on the biosignal data identified during one period.
  • the processor 310 may store information indicating one of the first to fourth types in the field 701 .
  • the processor 310 may identify a user's emotion type as a pleasure type based on biosignal data identified during one period.
  • the processor 310 may store information for representing the type of pleasure in the field 701 .
  • the processor 310 may identify facial data received from the second external electronic device 202 and identified for one period.
  • the processor 310 may identify a user's emotion type based on facial data identified during one period.
  • the processor 310 may store information indicating one of the first to fourth types in the field 702 .
  • the processor 310 may identify a user's emotion type as a comfort type based on facial data identified during one period.
  • the processor 310 may store information for representing the type of comfort in the field 702 .
  • the processor 310 may identify voice data received from the second external electronic device 202 and identified for one period.
  • the processor 310 may identify the user's emotion type based on the voice data identified during one period.
  • the processor 310 may store information indicating one of the first to fourth types in the field 703 .
  • the processor 310 may identify a user's emotion type as a pleasure type based on voice data identified during one period.
  • the processor 310 may store information for representing the type of pleasure in the field 703 .
  • the processor 310 may identify input data (eg, text input data) received from the second external electronic device 202 and identified during one period. The processor 310 may identify that no input data has been received during one period. The processor 310 may set the user's emotion type as a default. The processor 310 may store information for indicating a default in the field 704 .
  • the processor 310 may store a field related to a user ID, a content ID, a genre, a genre detail, and/or a measurement time together with fields 701 to 704 . According to an embodiment, the processor 310 may identify and store information including the fields 701 to 704 at each designated period.
  • table 790 can be used to identify values for fields 701 through 704 .
  • values for the fields 701 to 704 may be set to a value for one of the pleasure type 711, the annoyance type 712, the boredom type 713, and the comfort type 714.
  • the processor 310 may identify, based on the at least one weight set according to the fields 701 to 704, a first value for the field 701, a second value for the field 702, a third value for the field 703, and a fourth value for the field 704.
  • the weight set in the field 701 may be set to an A value.
  • the weight set in the field 702 may be set to a value of B.
  • the weight set in the field 703 may be set to a B value.
  • the weight set in the field 704 may be set to a value of C.
  • information for representing the type 711 of pleasure may be stored in the field 701 .
  • a value for the field 701 may be set to the value A for the type 711 related to pleasure.
  • the value of A may be set to 3.
  • information for representing the type 714 of comfort may be stored in the field 702 .
  • the value for field 702 can be set to the value B for type 714 regarding comfort.
  • the B value may be set to 2.
  • information for representing the type 711 of pleasure may be stored in the field 703 .
  • a value for field 703 may be set to a value of B for type 711 related to pleasure.
  • the B value may be set to 2.
  • a default may be stored in field 704.
  • the value for field 704 may be set to the value C for default.
  • the C value may be set to 1.
  • the processor 310 may identify a value for the pleasure type 711 and a value for the comfort type 714, respectively.
  • the processor 310 may identify a value for a type 711 related to pleasure.
  • the processor 310 may identify the sum of the value for the field 701 and the value for the field 703 as the value for the type 711 related to pleasure.
  • Processor 310 can identify a value for type 714 related to comfort.
  • the processor 310 can identify the sum of the value for field 702 and the value for field 704 as a value for type 714 related to comfort.
  • the value of A may be set to 3.
  • the B value can be set to 2.
  • C value can be set to 1.
  • a value for the pleasure-related type 711 may be set to 5.
  • a value for the type 712 related to annoyance may be set to zero.
  • a value for the boredom type 713 may be set to 0.
  • the value for the comfort type 714 may be set to 3.
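  • The table-790 computation can be sketched as follows, assuming the example weights A=3 (field 701, biosignal), B=2 (fields 702 and 703, face and voice), and C=1 (field 704, input). Routing a default field's value into the comfort sum is an assumption made only to reproduce the worked example above (pleasure = 3 + 2 = 5, comfort = 2 + 1 = 3), since the general rule for the default case is not spelled out.

```python
# Per-field weights: A=3 (biosignal/701), B=2 (face/702, voice/703), C=1 (input/704).
FIELD_WEIGHTS = {"biosignal": 3, "face": 2, "voice": 2, "input": 1}

def type_scores(fields: dict[str, str], default_type: str = "comfort") -> dict[str, int]:
    """Sum each field's weighted value into the emotion type stored in that field."""
    scores = {"pleasure": 0, "annoyance": 0, "boredom": 0, "comfort": 0}
    for field, emotion in fields.items():
        # A field holding "default" (no data in this period) is routed to
        # `default_type` here; this routing is an assumption (see lead-in).
        target = default_type if emotion == "default" else emotion
        scores[target] += FIELD_WEIGHTS[field]
    return scores

# Reproducing the worked example: pleasure = 3 + 2 = 5, comfort = 2 + 1 = 3.
example = {"biosignal": "pleasure", "face": "comfort",
           "voice": "pleasure", "input": "default"}
print(type_scores(example))
# {'pleasure': 5, 'annoyance': 0, 'boredom': 0, 'comfort': 3}
```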
  • the processor 310 may identify values for the first to fourth types based on user data (eg, biosignal data, facial data, voice data, and input data) identified during one period.
  • the processor 310 may identify values for the first to fourth types through an operation for identifying the table 790 of FIG. 7B .
  • the processor 310 may identify and store a value for the type 711 related to pleasure set as the first type.
  • the processor 310 may identify a value for the type 712 related to annoyance set as the second type.
  • the processor 310 may identify a value for the type 713 related to boredom set as the third type.
  • the processor 310 may identify a value for the comfort type 714 set as the fourth type.
  • the processor 310 may calculate and store values for the pleasure type 711, the annoyance type 712, the boredom type 713, and the comfort type 714.
  • the processor 310 may set a value for the type 711 related to pleasure to 5.
  • the processor 310 may set a value for the annoyance-related type 712 to zero.
  • the processor 310 may set a value for the boredom-related type 713 to zero.
  • the processor 310 may set the value for the comfort type 714 to 3.
  • the processor 310 may store a field related to a user ID, a content ID, a genre, a genre detail, and/or a measurement time together with the value for the first type, the value for the second type, the value for the third type, and the value for the fourth type.
  • the processor 310 may identify and store a value of the first type, a value of the second type, a value of the third type, and a value of the fourth type at each designated period.
  • the processor 310 may identify a value of the first type, a value of the second type, a value of the third type, and a value of the fourth type in the first period.
  • the processor 310 may identify a value of the first type, a value of the second type, a value of the third type, and a value of the fourth type in a second period following the first period.
  • the processor 310 may identify a value of the first type, a value of the second type, a value of the third type, and a value of the fourth type at each specified period until reproduction of the content is completed.
  • the processor 310 may identify and store the sum of the values of the identified first type in all periods.
  • the processor 310 may identify and store the sum of values of the identified second type in all periods.
  • the processor 310 may identify and store the sum of values of the identified third type in all periods.
  • the processor 310 may identify and store the sum of the values of the identified fourth type in all periods.
  • based on the sum of values for the first type, the sum of values for the second type, the sum of values for the third type, and the sum of values for the fourth type, the processor 310 may identify the type of the user's emotion for the reproduced content. For example, the processor 310 may identify that the type of the user's emotion for the reproduced content is the first type, based on identifying that the sum of values for the first type is the largest.
  • in the first period (eg, t1), the processor 310 may identify a value for the pleasure type 711, a value for the annoyance type 712, a value for the boredom type 713, and a value for the comfort type 714.
  • the processor 310 may identify a value for the pleasure type 711, a value for the annoyance type 712, a value for the boredom type 713, and a value for the comfort type 714 in each period until the twelfth period (eg, t12) when reproduction of the first content is completed.
  • the processor 310 may identify that reproduction of the first content is completed (or terminated or stopped).
  • the processor 310 may store the sum of the values for the pleasure type 711, the sum of the values for the annoyance type 712, the sum of the values for the boredom type 713, and the sum of the values for the comfort type 714 identified in the first to twelfth periods.
  • the processor 310 may identify the user's emotion type for the first content based on the sum of values for the pleasure type 711, the sum of values for the annoyance type 712, the sum of values for the boredom type 713, and the sum of values for the comfort type 714.
  • the processor 310 may identify the user's emotion type for the first content as the comfort type 714, based on identifying that the sum of the values for the comfort type 714 is the largest.
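  • The per-period accumulation can be expressed as a short sketch: collect the four type values for every period of the playback (eg, t1 to t12) and pick the type with the largest accumulated sum; the tie-breaking behavior is an assumption, as the document does not address ties.

```python
from collections import Counter

def emotion_for_content(period_scores: list[dict[str, int]]) -> str:
    """Sum per-period type values over the whole playback and return the
    type whose accumulated sum is the largest."""
    totals: Counter = Counter()
    for scores in period_scores:
        totals.update(scores)
    # Ties are broken arbitrarily here (an assumption).
    return max(totals, key=totals.get)

# Three illustrative periods in which comfort accumulates the largest sum.
periods = [
    {"pleasure": 2, "annoyance": 0, "boredom": 0, "comfort": 5},
    {"pleasure": 0, "annoyance": 1, "boredom": 0, "comfort": 4},
    {"pleasure": 3, "annoyance": 0, "boredom": 0, "comfort": 3},
]
print(emotion_for_content(periods))  # comfort
```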
  • the processor 310 may classify the first to fourth types into a positive emotion type and a negative emotion type.
  • the processor 310 may identify the first type and the fourth type as positive emotion types.
  • the processor 310 may identify the second type and the third type as negative emotion types.
  • the processor 310 may identify a type having a positive valence as a positive emotion type.
  • the processor 310 may identify a type having a negative valence as a negative emotion type.
  • the processor 310 may set the value of the positive emotion type to a positive number.
  • the processor 310 may set the value of the negative emotion type to a negative number.
  • the processor 310 may identify the sum of the first type value and the fourth type value as a positive emotion type value.
  • the processor 310 may identify the sum of the second type value and the third type value as a negative emotion type value.
  • the processor 310 may identify the sum of the positive emotion type value and the negative emotion type value as the user's emotion value.
  • the processor 310 may identify whether the user's emotion for the content being played is positive or negative, based on the value of the user's emotion.
  • the processor 310 may identify that the user's emotion for the content being played is positive based on the positive value of the user's emotion. For example, the processor 310 may identify that the user's emotion for the content being reproduced is negative, based on the negative value of the user's emotion.
  • the processor 310 may identify a pleasure type 711 and a comfort type 714 as positive emotion types.
  • the processor 310 may identify the annoyance type 712 and the boredom type 713 as negative emotion types.
  • the processor 310 may identify the sum of the value for the pleasure type 711 and the value for the comfort type 714 as the value of the positive emotion type.
  • the processor 310 may identify the sum of the value for the annoyance type 712 and the value for the boredom type 713 as the value of the negative emotion type.
  • the processor 310 may identify the sum of the value of the positive emotion type set as a positive number and the value of the negative emotion type set as a negative number.
  • the processor 310 may identify whether the user's emotion is positive or negative during one period (eg, the first period), based on the sum of the positive emotion type value and the negative emotion type value.
  • the processor 310 may identify that the user's emotion identified at each designated period is repeatedly positive.
  • the processor 310 may identify one or more contents to maintain the user's positive emotion. For example, in response to identifying that the value of the user's emotion is maintained above a specified value (eg, 0), the processor 310 may identify one or more contents for keeping the value of the user's emotion within a specified range (eg, 0 or more).
  • the processor 310 may identify that the user's emotion identified at each specified period is repeatedly negative.
  • the processor 310 may identify one or more contents for changing the user's negative emotion. For example, in response to identifying that the value for the user's emotion remains below a specified value (eg, 0), the processor 310 may identify one or more contents for changing the value of the user's emotion to outside a specified range (eg, less than 0), as in the sketch below.
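  • A minimal sketch of the signed emotion value and the repetition check described above, assuming the threshold 0 and the repeat count 3 from the examples; positive types (pleasure, comfort) contribute positive numbers and negative types (annoyance, boredom) contribute negative numbers.

```python
def emotion_value(scores: dict[str, int]) -> int:
    """Signed emotion value for one period: positive emotion types count as
    positive numbers, negative emotion types as negative numbers."""
    positive = scores["pleasure"] + scores["comfort"]
    negative = -(scores["annoyance"] + scores["boredom"])
    return positive + negative

def should_change_content(values: list[int], repeats: int = 3) -> bool:
    """True when the last `repeats` periods all stayed below 0, i.e. the
    user's emotion was repeatedly negative."""
    return len(values) >= repeats and all(v < 0 for v in values[-repeats:])

history = [emotion_value(s) for s in (
    {"pleasure": 0, "annoyance": 3, "boredom": 2, "comfort": 0},
    {"pleasure": 1, "annoyance": 4, "boredom": 0, "comfort": 0},
    {"pleasure": 1, "annoyance": 2, "boredom": 3, "comfort": 0},
)]
print(history, should_change_content(history))  # [-5, -3, -4] True
```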
  • FIGS. 8A to 8C illustrate examples of operations of an electronic device according to various embodiments.
  • the processor 310 may identify one or more contents based on identifying that the user's emotion value is maintained within a specified range. For example, the processor 310 may identify one or more contents based on identifying that the value of the user's emotion identified at each designated period is repeated within a designated range and a designated number of times.
  • based on identifying that the value of the user's emotion identified at each specified period is maintained as a negative number, the processor 310 may identify one or more contents for changing the value of the user's emotion to outside the range of less than zero. For example, the processor 310 may identify that the value for the user's emotion identified in each designated period is repeated as a negative number a specified number of times (eg, 3 times). The processor 310 may identify one or more second contents for changing the value of the user's emotion to a positive number.
  • the processor 310 may identify one or more second contents for changing the user's emotion type from the annoyance type 820 to the comfort type 840.
  • the processor 310 may identify information about the type of the user's emotion stored together with the third content. With respect to the third content, the processor 310 may identify that the user's emotion type is the comfort type 840. For example, the processor 310 may identify, as the third content, the most recently stored content for which the user's emotion type is the comfort type 840.
  • the processor 310 may identify one or more second contents of a genre similar to the third content.
  • the processor 310 may request the first external electronic device 201 to display a visual object for guiding reproduction of one of one or more second contents.
  • the processor 310 may identify one or more second contents for changing the user's emotion type from the boredom type 830 to the pleasure type 810.
  • the processor 310 may identify information about the type of the user's emotion stored together with the third content. With respect to the third content, the processor 310 may identify that the user's emotion type is the pleasure type 810. For example, the processor 310 may identify, as the third content, the most recently stored content for which the user's emotion type is the pleasure type 810.
  • the processor 310 may identify one or more second contents of a genre similar to the third content.
  • the processor 310 may request the first external electronic device 201 to display a visual object for guiding reproduction of one of one or more second contents.
  • based on identifying that the value for the user's emotion identified at each specified period is maintained as a positive number, the processor 310 may identify one or more second contents for maintaining the value of the user's emotion within a range of 0 or more. For example, the processor 310 may identify that the value for the user's emotion is identified as a positive number a specified number of times (eg, 3 times) in a row. The processor 310 may identify one or more second contents for maintaining the value of the user's emotion as a positive number.
  • the processor 310 may identify one or more second contents for maintaining the user's emotion type as the pleasure type 810 .
  • the processor 310 may identify one or more second contents of a genre similar to the first content.
  • the processor 310 may request the first external electronic device 201 to display a visual object for guiding reproduction of one of one or more second contents.
  • the processor 310 may maintain (or reinforce) the user's pleasant emotional state by displaying, through the first external electronic device 201, a visual object for guiding reproduction of one of the one or more second contents of a genre similar to the first content.
  • the processor 310 may identify one or more second contents for changing the user's emotion type from the pleasure type 810 to the comfort type 840 .
  • the processor 310 may identify information about the type of the user's emotion stored together with the third content. With respect to the third content, the processor 310 may identify that the user's emotion type is the comfort type 840. For example, the processor 310 may identify, as the third content, the most recently stored content for which the user's emotion type is the comfort type 840.
  • the processor 310 may identify one or more pieces of second content of a similar genre to the third content.
  • the processor 310 may request the first external electronic device 201 to display a visual object for guiding reproduction of one of one or more second contents.
  • the processor 310 may change the user's pleasant emotional state into a comfortable emotional state by displaying, through the first external electronic device 201, a visual object for guiding reproduction of one of the one or more second contents of a genre similar to the third content.
  • the processor 310 may identify one or more second contents for maintaining the user's emotion type as the comfort type 840 .
  • the processor 310 may identify one or more second contents of a genre similar to the first content.
  • the processor 310 may request the first external electronic device 201 to display a visual object for guiding reproduction of one of one or more second contents.
  • the processor 310 may maintain (or reinforce) the user's comfortable emotional state by displaying, through the first external electronic device 201, a visual object for guiding reproduction of one of the one or more second contents of a genre similar to the first content.
  • the processor 310 may identify one or more second contents for changing the user's emotion type from the comfort type 840 to the pleasure type 810 .
  • the processor 310 may identify information about the type of the user's emotion stored together with the third content. With respect to the third content, the processor 310 may identify that the user's emotion type is the pleasure type 810. For example, the processor 310 may identify, as the third content, the most recently stored content for which the user's emotion type is the pleasure type 810.
  • the processor 310 may identify one or more pieces of second content of a similar genre to the third content.
  • the processor 310 may request the first external electronic device 201 to display a visual object for guiding reproduction of one of one or more second contents.
  • the processor 310 may change the user's comfortable emotional state into a pleasant emotional state by displaying, through the first external electronic device 201, a visual object for guiding reproduction of one of the one or more second contents of a genre similar to the third content. One way to express this selection step in code is sketched below.
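  • To move the user toward a target type, the sketch below takes the most recently stored content whose recorded emotion type equals the target (the "third content") and picks unwatched candidates of a similar genre. The record fields, the genre-equality notion of "similar", and the candidate limit are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Record:
    content_id: str
    genre: str
    emotion_type: str  # emotion type stored together with this content
    stored_at: float   # when the record was stored

def pick_second_contents(history: list[Record], catalog: dict[str, str],
                         target_type: str, limit: int = 3) -> list[str]:
    """Return candidate second contents; `catalog` maps content_id -> genre."""
    anchors = [r for r in history if r.emotion_type == target_type]
    if not anchors:
        return []
    third = max(anchors, key=lambda r: r.stored_at)  # most recently stored match
    seen = {r.content_id for r in history}
    return [cid for cid, genre in catalog.items()
            if genre == third.genre and cid not in seen][:limit]

history = [Record("c1", "comedy", "comfort", 1.0),
           Record("c2", "news", "annoyance", 2.0)]
catalog = {"c3": "comedy", "c4": "thriller", "c5": "comedy"}
print(pick_second_contents(history, catalog, "comfort"))  # ['c3', 'c5']
```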
  • FIGS. 9A and 9B illustrate examples of visual objects displayed in a first external electronic device according to various embodiments.
  • the first external electronic device 201 may display a screen 901 through a display.
  • the first external electronic device 201 may receive a user input for reproducing the first content.
  • the first external electronic device 201 may request, from the electronic device 210, information for reproducing the first content (eg, the first content 902).
  • the first external electronic device 201 may receive information for reproducing the first content 902 from the electronic device 210 .
  • the first external electronic device 201 may display, on the display, a screen 901 for reproducing the first content 902, based on the information for reproducing the first content 902 received from the electronic device 210.
  • while the first content 902 is being reproduced, the processor 310 of the electronic device 210 may identify information about the type of the user's emotion at each designated period, based on user data (eg, user data obtained from the second external electronic device 202).
  • the processor 310 may identify one or more second contents based on identifying that the value of the user's emotion identified based on the information about the user's emotion type is maintained within a specified range.
  • the processor 310 may request the first external electronic device 201 to display a visual object for guiding reproduction of one of one or more second contents.
  • the first external electronic device 201 may receive, from the electronic device 210, a request for displaying a visual object for guiding reproduction of one of the one or more second contents. Based on the request, the first external electronic device 201 may display, superimposed on the first content 902, a visual object for guiding reproduction of one of the one or more second contents.
  • the visual object for guiding reproduction of one of the one or more second contents may include a first visual object 903, a second visual object 904, and a third visual object 905.
  • the first external electronic device 201 may display the first visual object 903 , the second visual object 904 , and the third visual object 905 overlapping the first content 902 .
  • the first external electronic device 201 may display the first visual object 903 , the second visual object 904 , and the third visual object 905 as floating overlays.
  • the first external electronic device 201 may receive a user input for one of the first visual object 903 , the second visual object 904 , and the third visual object 905 .
  • the first external electronic device 201 may request, from the electronic device 210, information for reproducing the second content corresponding to one of the first visual object 903, the second visual object 904, and the third visual object 905.
  • the first external electronic device 201 may receive information for reproducing the second content from the electronic device 210 and reproduce the second content based on the received information.
  • the processor 310 may display a visual object for guiding reproduction of one of one or more second contents after the reproduction of the first content is terminated (or stopped).
  • one or more second contents may be identified based on information about the user's emotion type with respect to the first contents.
  • the processor 310 of the electronic device 210 may identify the user's emotion type for the first content reproduced in the first external electronic device 201 as a negative emotion type.
  • the processor 310 may identify one or more second contents for changing the user's emotion type from a negative emotion type to a positive emotion type.
  • the processor 310 may transmit information on one or more second contents to the first external electronic device 201 after reproduction of the first content is finished.
  • the information on the one or more second contents may include a visual object for guiding reproduction of one of the one or more second contents.
  • the first external electronic device 201 may display a screen 901 including the first visual object 911 to the sixth visual object 916 based on information on one or more second contents.
  • the first visual object 911 to the sixth visual object 916 may be displayed to reproduce one of one or more second contents.
  • the first external electronic device 201 may receive a user input for one of the first visual object 911 to the sixth visual object 916 .
  • the first external electronic device 201 may request, from the electronic device 210, information for reproducing the second content corresponding to one of the first visual object 911 to the sixth visual object 916.
  • the first external electronic device 201 may receive information for reproducing the second content from the electronic device 210 and reproduce the second content based on the received information.
  • the processor 310 of the electronic device 210 may identify one or more second contents based on the user's content viewing information.
  • the user's content viewing information may include information about the user's emotion type for each of a plurality of contents.
  • the processor 310 may identify, among a plurality of contents, at least one content for which the user's emotion type is identified as the first type.
  • the processor 310 may request the first external electronic device 201 to display a visual object for guiding reproduction of one of at least one piece of content identified as the first type.
  • the first external electronic device 201 may display, through the display, a screen 901 including a visual object (eg, the first visual object 911 to the sixth visual object 916) for guiding reproduction of one of the at least one content identified as the first type.
  • the electronic device 210 may include all or part of the first external electronic device 201 and the second external electronic device 202 .
  • the electronic device 210 may perform all or part of the functions of the first external electronic device 201 and the second external electronic device 202 .
  • the processor 310 of the electronic device 210 may identify first information including user data acquired from the user while the first content is being reproduced.
  • the processor 310 may identify second information about the type of the user's emotion corresponding to the user data according to a designated period based on the first information.
  • the processor 310 may obtain one or more second contents in response to identifying that the value of the user's emotion identified based on the second information is maintained within a specified range.
  • the processor 310 may display a visual object for guiding reproduction of one of the one or more acquired second contents.
  • An electronic device (eg, the electronic device 210 of FIG. 2) according to various embodiments includes a communication circuit and at least one processor operatively coupled to the communication circuit. The at least one processor may be configured to: receive, using the communication circuit, from a second external electronic device distinct from a first external electronic device, first information including user data obtained from the user while first content is reproduced in the first external electronic device; identify, based on the received first information, second information about the type of the user's emotion corresponding to the user data according to a designated period; obtain one or more second contents in response to identifying that a value for the user's emotion identified based on the second information is maintained within a specified range; and, in response to obtaining the one or more second contents, request, from the first external electronic device, display of a visual object for guiding reproduction of one of the obtained one or more second contents.
  • the user data may include at least one of the user's biosignal data, the user's face data, the user's voice data, and the user's input data.
  • the second information about the user's emotion type may include a first field identified based on the biosignal data, a second field identified based on the face data, a third field identified based on the voice data, and a fourth field identified based on the input data.
  • the at least one processor may determine a first value for the first field, a second value for the second field, a third value for the third field, and a value for the fourth field. It may be set to identify a fourth value, and to identify a value for the user's emotion based on the first to fourth values.
  • the at least one processor may be configured to identify the first value to the fourth value based on at least one weight set according to the first to fourth fields.
  • the at least one processor may be configured to obtain the one or more second contents for changing the value of the user's emotion to outside the specified range, in response to identifying that the value of the user's emotion is maintained below a specified value.
  • the at least one processor may be configured to obtain the one or more second contents for maintaining the value of the user's emotion within the specified range, in response to identifying that the value of the user's emotion is maintained above a specified value.
  • the user's emotion type may be set to one of the first to fourth types.
  • the at least one processor may be further configured to identify a user input for stopping reproduction of the first content, received from the first external electronic device, and to store, in the memory, based on the user input, the second information obtained according to the designated period while the first content was being reproduced.
  • a visual object for guiding reproduction of one of the one or more acquired second contents may be displayed on the display of the first external electronic device while overlapping the reproduced first content.
  • the at least one processor may be configured to identify the user's emotion type corresponding to the user data according to the specified period, based on the identified first information, and to identify, based on the identified emotion type, the second information about the type of the user's emotion corresponding to the user data.
  • the at least one processor may be further configured to identify a user input for reproducing the first content, received from the first external electronic device, to identify a reproduction time of the first content based on the user input, and to identify the designated period based on the reproduction time of the first content.
  • the at least one processor may be set to identify that the value of the user's emotion is maintained within the specified range for a time period identified based on the specified period.
  • the at least one processor is configured to identify the one or more second contents based on the user's viewing record and third information about the type of the user's emotion with respect to the user's viewing record. can be set.
  • the user data obtained from the second external electronic device may include user-related data obtained from at least one external device connected to the second external electronic device.
  • A method of an electronic device according to various embodiments includes: an operation of receiving, using a communication circuit, from a second external electronic device distinct from a first external electronic device, first information including user data obtained from a user while first content is reproduced in the first external electronic device; an operation of identifying, based on the received first information, second information about the type of the user's emotion corresponding to the user data according to a designated period; an operation of obtaining one or more second contents in response to identifying that a value for the user's emotion identified based on the second information is maintained within a specified range; and, in response to obtaining the one or more second contents, an operation of requesting, from the first external electronic device, display of a visual object for guiding reproduction of one of the obtained one or more second contents.
  • the user data may include at least one of the user's biosignal data, the user's face data, the user's voice data, and the user's input data.
  • the second information about the user's emotion type may include a first field identified based on the biosignal data, a second field identified based on the face data, a third field identified based on the voice data, and a fourth field identified based on the input data.
  • the operation of obtaining the one or more second contents may include: an operation of identifying a first value for the first field, a second value for the second field, a third value for the third field, and a fourth value for the fourth field; an operation of identifying a value for the user's emotion based on the first to fourth values; and an operation of obtaining the one or more second contents in response to identifying that the value for the user's emotion is maintained within the specified range.
  • A non-transitory computer-readable storage medium according to various embodiments may store one or more programs including instructions that, when executed by a processor of an electronic device having a communication circuit, cause the electronic device to: receive, using the communication circuit, from a second external electronic device distinct from a first external electronic device, first information including user data obtained from the user while first content is reproduced in the first external electronic device; identify, based on the received first information, second information about the type of the user's emotion corresponding to the user data according to a specified period; obtain one or more second contents in response to identifying that a value for the user's emotion identified based on the second information is maintained within a specified range; and, in response to obtaining the one or more second contents, request, from the first external electronic device, display of a visual object for guiding reproduction of one of the obtained one or more second contents.
  • a processor may consist of one or a plurality of processors.
  • the one or more processors may be a general-purpose processor such as a CPU, an AP, or a digital signal processor (DSP), a graphics-only processor such as a GPU or a vision processing unit (VPU), or an artificial intelligence-only processor such as an NPU.
  • One or more processors control input data to be processed according to predefined operating rules or artificial intelligence models stored in a memory.
  • the processors dedicated to artificial intelligence may be designed with a hardware structure specialized for processing a specific artificial intelligence model.
  • a predefined operation rule or an artificial intelligence model is characterized in that it is created through learning.
  • being created through learning means that a basic artificial intelligence model is trained with a plurality of pieces of training data by a learning algorithm, so that a predefined operation rule or artificial intelligence model set to perform a desired characteristic (or purpose) is created.
  • Such learning may be performed in the device itself in which artificial intelligence according to the present disclosure is performed, or through a separate server and/or system.
  • Examples of learning algorithms include supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but are not limited to the above examples.
  • An artificial intelligence model may be composed of a plurality of neural network layers.
  • Each of the plurality of neural network layers has a plurality of weight values, and a neural network operation is performed through an operation between an operation result of a previous layer and a plurality of weight values.
  • a plurality of weights possessed by a plurality of neural network layers may be optimized by a learning result of an artificial intelligence model. For example, a plurality of weights may be updated so that a loss value or a cost value obtained from an artificial intelligence model is reduced or minimized during a learning process.
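  • The weight-update statement above can be illustrated with a toy, plain-numpy gradient-descent step on a single linear layer; this is only a sketch of how weights are updated so a loss value decreases, not the model used in the embodiments.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))          # a batch of inputs
y = rng.normal(size=(8, 1))          # target values
w = rng.normal(size=(4, 1))          # one layer's plurality of weight values

learning_rate = 0.1
for _ in range(100):
    pred = x @ w                           # operation between input and weights
    loss = np.mean((pred - y) ** 2)        # loss (cost) value of the model
    grad = 2 * x.T @ (pred - y) / len(x)   # gradient of the loss w.r.t. w
    w -= learning_rate * grad              # update weights so the loss is reduced

print(f"final loss: {loss:.4f}")  # much smaller than the initial loss
```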
  • the artificial neural network may include a deep neural network (DNN), for example, a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), or deep Q-networks, but is not limited to the above examples.
  • in the method for recognizing the user's voice and interpreting the intention in order to provide content, a voice signal, which is an analog signal, may be received through a microphone of the second external electronic device, and the voice part may be converted into computer-readable text using an automatic speech recognition (ASR) model.
  • the user's utterance intention may be obtained by interpreting the converted text using a natural language understanding (NLU) model.
  • the ASR model or NLU model may be an artificial intelligence model.
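  • The speech pipeline above (analog voice signal → ASR text → NLU intent) can be sketched with placeholder models; the class names, method signatures, and the intent schema are assumptions, and no specific ASR/NLU library is implied.

```python
class AsrModel:
    """Placeholder automatic speech recognition (ASR) model."""
    def transcribe(self, audio_samples: bytes) -> str:
        # A real model converts the received voice signal into readable text.
        return "play something relaxing"

class NluModel:
    """Placeholder natural language understanding (NLU) model."""
    def intent(self, text: str) -> dict:
        # A real model interprets the text to obtain the utterance intention.
        if "relaxing" in text:
            return {"intent": "recommend_content", "target_emotion": "comfort"}
        return {"intent": "unknown"}

def handle_utterance(audio_samples: bytes) -> dict:
    text = AsrModel().transcribe(audio_samples)  # speech -> text
    return NluModel().intent(text)               # text -> utterance intention

print(handle_utterance(b"\x00\x01"))
# {'intent': 'recommend_content', 'target_emotion': 'comfort'}
```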
  • the artificial intelligence model can be processed by an artificial intelligence processor designed with a hardware structure specialized for the processing of artificial intelligence models. AI models can be created through learning.
  • An artificial intelligence model may be composed of a plurality of neural network layers. Each of the plurality of neural network layers has a plurality of weight values, and a neural network operation is performed through an operation between an operation result of a previous layer and a plurality of weight values.
  • Linguistic understanding is a technology that recognizes and applies/processes human language/text, and includes natural language processing, machine translation, dialog systems, question answering, and speech recognition/synthesis.
  • an artificial intelligence model may be used to recommend/execute at least one content using user data (eg, biosignal data, voice data, facial data, and user input data).
  • At least one processor may perform a preprocessing process on the data to convert it into a form suitable for use as an input of an artificial intelligence model.
  • AI models can be created through learning. Here, being created through learning means that a basic artificial intelligence model is trained with a plurality of pieces of training data by a learning algorithm, so that a predefined operation rule or artificial intelligence model set to perform a desired characteristic (or purpose) is created.
  • An artificial intelligence model may be composed of a plurality of neural network layers.
  • Each of the plurality of neural network layers has a plurality of weight values, and a neural network operation is performed through an operation between an operation result of a previous layer and a plurality of weight values.
  • Inference/prediction is a technology that judges information and logically infers and predicts from it, and includes knowledge-based reasoning, optimization prediction, preference-based planning, and recommendation.
  • Electronic devices may be devices of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
  • terms such as "first", "second", "1st", or "2nd" may be used simply to distinguish a component from other corresponding components, and do not limit the components in other respects (eg, importance or order).
  • when a (eg, first) component is said to be "coupled" or "connected" to another (eg, second) component, with or without the terms "functionally" or "communicatively", it means that the component may be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • the term "module" may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logical block, component, or circuit.
  • a module may be an integrally constructed component or a minimal unit of components or a portion thereof that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (eg, a program) including one or more instructions stored in a storage medium (eg, internal memory 136 or external memory 138) readable by a machine (eg, the electronic device 101). For example, a processor (eg, the processor 120) of the device (eg, the electronic device 101) may invoke and execute at least one of the one or more instructions stored in the storage medium.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • the storage medium is a tangible device and does not include a signal (eg, an electromagnetic wave); this term does not distinguish between a case where data is stored semi-permanently in the storage medium and a case where data is stored temporarily.
  • the method according to various embodiments disclosed in this document may be included and provided in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • Computer program products may be distributed in the form of a machine-readable storage medium (eg, compact disc read only memory (CD-ROM)), or distributed online (eg, downloaded or uploaded) through an application store (eg, Play Store) or directly between two user devices (eg, smart phones).
  • In the case of online distribution, at least part of the computer program product may be temporarily stored or temporarily created in a device-readable storage medium such as a memory of a manufacturer's server, an application store server, or a relay server.
  • each component (eg, module or program) of the above-described components may include a single entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more components or operations among the aforementioned corresponding components may be omitted, or one or more other components or operations may be added.
  • a plurality of components (eg, modules or programs) may be integrated into a single component. In such a case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to how they were performed by the corresponding component among the plurality of components prior to the integration.
  • the operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Biomedical Technology (AREA)
  • Dermatology (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to an embodiment, at least one processor of an electronic device is configured to: receive, from a second external electronic device, first information including user data obtained from a user while first content is reproduced in a first external electronic device; identify second information about a type of the user's emotion corresponding to the user data according to a specified period; obtain one or more second contents in response to identifying that a value for the user's emotion identified on the basis of the second information is maintained within a specified range; and request, from the first external electronic device, display of a visual object for guiding reproduction of one of the obtained one or more second contents. Meanwhile, the processor is configured to carry out a method of providing content based on the emotion of the user of the electronic device by means of an artificial intelligence model.
PCT/KR2022/015225 2021-12-08 2022-10-07 Dispositif électronique et procédé de fourniture de contenu sur la base de l'émotion d'utilisateur WO2023106591A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2021-0175246 2021-12-08
KR1020210175246A KR20230086526A (ko) 2021-12-08 2021-12-08 사용자의 감정에 기반하여 콘텐트를 제공하기 위한 전자 장치 및 방법

Publications (1)

Publication Number Publication Date
WO2023106591A1 true WO2023106591A1 (fr) 2023-06-15

Family

ID=86730595

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/015225 WO2023106591A1 (fr) 2021-12-08 2022-10-07 Dispositif électronique et procédé de fourniture de contenu sur la base de l'émotion d'utilisateur

Country Status (2)

Country Link
KR (1) KR20230086526A (fr)
WO (1) WO2023106591A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060104194A (ko) * 2005-03-29 2006-10-09 엘지전자 주식회사 감정 상태에 따른 프로그램 추천 기능을 갖는 영상표시기기및 그 제어방법
KR20120071202A (ko) * 2010-12-22 2012-07-02 전자부품연구원 감성형 콘텐츠 커뮤니티 서비스 시스템
KR20140094336A (ko) * 2013-01-22 2014-07-30 삼성전자주식회사 사용자 감정 추출이 가능한 전자기기 및 전자기기의 사용자 감정 추출방법
KR20200065755A (ko) * 2018-11-30 2020-06-09 오당찬 몰입도 운용 방법 및 이를 지원하는 전자 장치
KR20210133945A (ko) * 2019-10-23 2021-11-08 주식회사 라라랩스 컨텐츠 추천 시스템 및 이를 이용한 컨텐츠 추천 방법

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060104194A (ko) * 2005-03-29 2006-10-09 엘지전자 주식회사 감정 상태에 따른 프로그램 추천 기능을 갖는 영상표시기기및 그 제어방법
KR20120071202A (ko) * 2010-12-22 2012-07-02 전자부품연구원 감성형 콘텐츠 커뮤니티 서비스 시스템
KR20140094336A (ko) * 2013-01-22 2014-07-30 삼성전자주식회사 사용자 감정 추출이 가능한 전자기기 및 전자기기의 사용자 감정 추출방법
KR20200065755A (ko) * 2018-11-30 2020-06-09 오당찬 몰입도 운용 방법 및 이를 지원하는 전자 장치
KR20210133945A (ko) * 2019-10-23 2021-11-08 주식회사 라라랩스 컨텐츠 추천 시스템 및 이를 이용한 컨텐츠 추천 방법

Also Published As

Publication number Publication date
KR20230086526A (ko) 2023-06-15

Similar Documents

Publication Publication Date Title
WO2022055068A1 (fr) Dispositif électronique pour identifier une commande contenue dans de la voix et son procédé de fonctionnement
WO2022019538A1 (fr) Modèle de langage et dispositif électronique le comprenant
WO2022010157A1 (fr) Procédé permettant de fournir un écran dans un service de secrétaire virtuel à intelligence artificielle, et dispositif de terminal d'utilisateur et serveur pour le prendre en charge
WO2022080634A1 (fr) Procédé pour entraîner un réseau neuronal artificiel et dispositif électronique le prenant en charge
WO2023106591A1 (fr) Dispositif électronique et procédé de fourniture de contenu sur la base de l'émotion d'utilisateur
WO2022177166A1 (fr) Procédé de commande de fréquence de rafraîchissement, et dispositif électronique prenant en charge celui-ci
WO2022163963A1 (fr) Dispositif électronique et procédé de réalisation d'instruction de raccourci de dispositif électronique
WO2021096281A1 (fr) Procédé de traitement d'entrée vocale et dispositif électronique prenant en charge celui-ci
WO2022245174A1 (fr) Dispositif électronique et procédé d'appel vidéo basé sur un service de réaction
WO2022131805A1 (fr) Procédé de fourniture de réponse à une entrée vocale et dispositif électronique pour le prendre en charge
WO2024080745A1 (fr) Procédé d'analyse de la parole d'un utilisateur sur la base d'une mémoire cache de parole, et dispositif électronique prenant en charge celui-ci
WO2023090747A1 (fr) Dispositif électronique d'acquisition d'image à un instant prévu par l'utilisateur et son procédé de commande
WO2024111843A1 (fr) Dispositif électronique et procédé de représentation d'un objet visuel dans un environnement virtuel
WO2024043670A1 (fr) Procédé d'analyse de la parole d'un utilisateur, et dispositif électronique prenant celui-ci en charge
WO2024117508A1 (fr) Dispositif électronique et procédé de fourniture d'espace virtuel
WO2024075982A1 (fr) Dispositif électronique et son procédé de fonctionnement
WO2022092539A1 (fr) Dispositif électronique de gestion de modèle utilisateur et son procédé de fonctionnement
WO2023048379A1 (fr) Serveur et dispositif électronique pour traiter un énoncé d'utilisateur, et son procédé de fonctionnement
WO2024085461A1 (fr) Dispositif électronique et procédé destiné à fournir un service de traduction
WO2023008798A1 (fr) Dispositif électronique de gestion de réponses inappropriées et son procédé de fonctionnement
WO2023149782A1 (fr) Dispositif électronique et procédé de fourniture d'une fonction haptique
WO2023128208A1 (fr) Dispositif électronique pouvant être monté sur la tête d'un utilisateur, et procédé pour fournir une fonction à l'aide d'informations biométriques dans le même dispositif électronique
WO2023177079A1 (fr) Serveur et dispositif électronique permettant de traiter une parole d'utilisateur sur la base d'un vecteur synthétique, et procédé de fonctionnement associé
WO2022177165A1 (fr) Dispositif électronique et procédé permettant d'analyser un résultat de reconnaissance vocale
WO2023287053A1 (fr) Procédé de création d'un avatar et dispositif électronique prenant en charge ledit procédé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22904425

Country of ref document: EP

Kind code of ref document: A1