WO2016072595A1 - Electronic device and operating method thereof

Electronic device and operating method thereof

Info

Publication number
WO2016072595A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
emotion
emotional
user
expression
Prior art date
Application number
PCT/KR2015/008471
Other languages
English (en)
Korean (ko)
Inventor
얀차오
리지쑤엔
후앙잉
슝준준
왕치앙
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201410638408.1A (published as CN105615902A)
Application filed by Samsung Electronics Co., Ltd.
Priority to US15/524,419 (published as US20180285641A1)
Priority to EP15857415.2A (published as EP3217254A4)
Publication of WO2016072595A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present application relates to the field of computer technology, more particularly to the field of human-computer interaction technology, and still more particularly to an electronic device for monitoring emotions and a method of operating the same.
  • an electronic device includes a memory, a display unit, and a control unit configured to obtain emotion expression information representing an emotion of a user, obtain emotion stimulation source information indicating a cause of the emotion, associate the emotion expression information with the emotion stimulation source information, store the associated emotion expression information and emotion stimulation source information in the memory, and display the emotion expression information and the emotion stimulation source information on the display unit.
  • An electronic device for monitoring emotions and a method of operating the same may be provided.
  • FIG. 1 is a schematic diagram of recording emotional information.
  • FIG. 2 is a flowchart illustrating a method for monitoring emotions according to an embodiment of the present application
  • FIG. 3 is a schematic diagram of an application scene of the embodiment shown in FIG. 2;
  • FIG. 4 is a flowchart illustrating a method for monitoring emotions according to another embodiment of the present application.
  • FIG. 5 is a flowchart illustrating a method for monitoring emotions according to another embodiment of the present application.
  • FIG. 6 is a schematic diagram of an application scene of the embodiment shown in FIG. 5;
  • FIG. 7A is a schematic diagram of an application scene of an embodiment
  • FIG. 7B is another schematic diagram of an application scene of an embodiment
  • FIG. 8 is a flowchart illustrating a method for monitoring emotions according to another embodiment of the present application.
  • FIG. 9 is a schematic diagram of an application scene of the embodiment shown in FIG. 8.
  • FIG. 10 is a schematic diagram illustrating an apparatus for monitoring emotions according to an embodiment of the present application.
  • FIG. 11 is a schematic diagram illustrating an apparatus for monitoring emotions according to another embodiment of the present application.
  • FIG. 12 is a block diagram showing an apparatus for monitoring emotions according to another embodiment of the present application.
  • FIG. 13 is a block diagram showing an apparatus for monitoring emotions according to another embodiment of the present application.
  • FIG. 14 is a block diagram illustrating a configuration of an electronic device according to some embodiments.
  • FIG. 15 is a block diagram illustrating still another configuration of the electronic device of FIG. 14.
  • an electronic device includes a memory, a display unit, and a control unit configured to obtain emotion expression information representing an emotion of a user, obtain emotion stimulation source information indicating a cause of the emotion, associate the emotion expression information with the emotion stimulation source information, store the associated emotion expression information and emotion stimulation source information in the memory, and display the emotion expression information and the emotion stimulation source information on the display unit.
  • the controller may display content indicating the emotion of the user over time on the display unit.
  • the electronic device may further include a user input unit configured to receive a user input of selecting a specific time from the content, and the controller may display the emotion expression information and the emotion stimulation source information related to the selected time on the display unit.
  • the controller may acquire the emotional stimulus source information based on at least one of multimedia information, environment information, and physiological signal information of the user, wherein the multimedia information may include at least one of text information, image information, audio information, and video information.
  • the controller may acquire the emotion expression information based on at least one of the multimedia information and the physiological signal information.
  • the electronic device may further include a communication unit configured to communicate with at least one of a server and a second electronic device used by a second user, and the controller may obtain, based on the multimedia information, at least one of second emotion expression information representing an emotion of the second user associated with the user and second emotion stimulation source information indicating a cause of the emotion of the second user, and may control the communication unit to share at least one of the second emotion expression information and the second emotion stimulation source information with the second user.
  • the control unit may receive, through the communication unit, at least one of third emotion expression information representing the emotion of the user and third emotion stimulation source information corresponding to the third emotion expression information, shared by the second user, and may combine the emotion expression information and the emotion stimulation source information with at least one of the third emotion expression information and the third emotion stimulation source information.
  • after storing the associated emotion expression information and the emotion stimulation source information in the memory, the control unit may monitor a change in emotion type and a change in emotion intensity of the user's emotion, and may update the emotion expression information and the emotion stimulation source information when the emotion type does not change but the change in the emotion intensity is greater than a threshold value.
  • the controller may normalize the emotion intensity.
  • the controller may obtain emotion management recommendation information based on the emotion expression information and the emotion stimulation source information, and display the emotion management recommendation information on the display unit.
  • a method of operating an electronic device may include obtaining emotion expression information indicating a user's emotion, obtaining emotion stimulation source information indicating a cause of the emotion, associating the emotion expression information with the emotion stimulation source information, storing the associated emotion expression information and emotion stimulation source information, and displaying the emotion expression information and the emotion stimulation source information.
  • the method may further comprise displaying content indicative of the user's emotion over time.
  • the method may further include receiving a user input for selecting a specific time in the content, and displaying the emotion expression information and the emotion stimulation source information may include displaying the emotion expression information and the emotion stimulation source information associated with the selected time.
  • the acquiring of the emotional stimulus source information may include acquiring the emotional stimulus source information based on at least one of multimedia information, environment information, and physiological signal information of the user, wherein the multimedia information may include at least one of text information, image information, audio information, and video information.
  • Acquiring the emotional expression information may include acquiring the emotional expression information based on at least one of the multimedia information and the physiological signal information.
  • the method may further include obtaining, based on the multimedia information, at least one of second emotion expression information indicating an emotion of a second user associated with the user and second emotion stimulation source information indicating a cause of the emotion of the second user, and sharing at least one of the second emotion expression information and the second emotion stimulation source information with the second user.
  • the method may further include receiving at least one of third emotion expression information representing the emotion of the user and third emotion stimulation source information corresponding to the third emotion expression information, shared by the second user, and combining the emotion expression information and the emotion stimulation source information with at least one of the third emotion expression information and the third emotion stimulation source information.
  • after storing the associated emotion expression information and the emotion stimulation source information, the method may further include monitoring a change in emotion type and a change in emotion intensity of the user's emotion, and updating the emotion expression information and the emotion stimulation source information when the emotion type does not change but the change in the emotion intensity is greater than a threshold value.
  • the method may further include obtaining emotion management recommendation information based on the emotion expression information and the emotion stimulation source information and displaying the emotion management recommendation information.
  • FIG. 1 is a schematic diagram of recording emotion information.
  • different emotion images may be distinguished according to emotion types, and a simple emotion tag such as "happy", "surprised", "sad", or "angry" may be attached to each emotion type.
  • the user can only obtain a simple emotion tag and cannot effectively manage or analyze emotion changes when recalling emotions in the future.
  • FIG. 2 shows a flowchart 200 illustrating a method for monitoring emotions according to one embodiment.
  • the method may be used in various terminals such as smart phones, smart TVs, tablet computers, portable laptop computers, desktop computers, wearable devices, and the like.
  • the method for monitoring emotions includes the following steps.
  • in step 201, emotional expression information of the current emotion of the user is obtained.
  • the emotional expression information is external presentation information that can reflect the characteristics of the current emotion of the user, and specifically may include information such as the expression on the user's face, the user's body movements, the pronunciation and intonation of the user's voice, and text input by the user or speech that expresses the user's emotions.
  • the terminal may include various information collecting devices such as a camera and a microphone, and may actively collect the emotion expression information of the user's current emotion, or may extract emotion expression information from the user's input information such as text and speech.
  • in step 202, emotional stimulus source information of the current emotion is obtained.
  • the emotional stimulus source information may be the causes that caused the user's current emotion.
  • Causes that provoke the user's emotions can be various possible causes, including people, objects, events, sentences, actions, and combinations thereof.
  • all the information currently obtained by the terminal may be analyzed, and the emotional stimulus source information may be extracted.
  • in step 203, the emotional expression information is associated with the emotional stimulus source information, and the associated emotional expression information and emotional stimulus source information are stored.
  • when associating the emotional expression information with the emotional stimulus source information and storing them, the emotional expression information and the emotional stimulus source information obtained at a given time point or within a preset time interval may be associated according to the time relationship of the information acquisition, and the associated emotional expression information and emotional stimulus source information may be stored in the storage of the terminal itself or in external devices such as cloud storage and a server.
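The association and storage described in step 203 can be pictured with a minimal sketch like the following; the `EmotionRecord` fields and the `EmotionStore` class are illustrative names introduced here, not structures defined by the patent.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class EmotionRecord:
    # Emotion expression information (e.g. a smiley-face crop, the word "Happy").
    expression_info: str
    # Emotion stimulus source information (e.g. "dinner with my family").
    stimulus_source_info: str
    # Time of acquisition, used to associate the two pieces of information.
    timestamp: datetime = field(default_factory=datetime.now)

class EmotionStore:
    """Keeps associated emotion records; a real device could equally write them to cloud storage or a server."""
    def __init__(self) -> None:
        self._records: List[EmotionRecord] = []

    def associate_and_store(self, expression_info: str, stimulus_source_info: str) -> EmotionRecord:
        record = EmotionRecord(expression_info, stimulus_source_info)
        self._records.append(record)
        return record

# Example corresponding to the scene in FIG. 3:
store = EmotionStore()
store.associate_and_store("Happy / smiley face", "I have a dinner with my family together")
```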
  • FIG. 3 shows an application scene.
  • the user posts a text message 302 indicating "Today I have a dinner with my family together. Happy!" together with a picture.
  • the terminal 301 can obtain the word "Happy", representing a self-emotional expression, from the message 302, and can also obtain the user's smiley face from the picture.
  • the smiley face and the word "Happy" are emotional expression information of the user's current emotion.
  • the terminal may also obtain event information "I have a dinner with my family together" from the text message 302, and may further determine that the event is emotional stimulus source information of the current emotion by semantic analysis.
  • the terminal may associate the emotion expression information “Happy” and the smiley face in the picture with the event information “I have a dinner with my family together” and store the associated emotion expression information and the event information.
  • the user can see that the reason why he was so happy at that time was that the user had dinner with the family.
  • Biological characteristic information includes, but is not limited to, facial feature information, voiceprint information, iris information, and physiological signal feature information.
  • Stored emotional expression information and emotional stimulus source information may be displayed on the user device.
  • the above-described embodiment of the present application provides a method for monitoring emotions that obtains the emotion expression information of the current emotion and the emotion stimulation source information of the current emotion, associates the emotion expression information with the emotion stimulation source information, and stores the associated emotion expression information and emotion stimulation source information; by recording the stimulus source information together with the user's emotion expressions, it extends the scope of monitoring emotions and ensures traceability of the emotion expressions.
  • FIG. 4 shows a flowchart 400 illustrating a method for monitoring emotions according to another embodiment.
  • Flowchart 400 includes the following steps.
  • in step 401, the emotion expression information of the current emotion of the user is determined according to the currently collected multimedia information and/or physiological signal information, and the multimedia information includes at least one of text information, image information, audio information, and video information.
  • the terminal may use a multimedia sensor to collect multimedia information, and / or use a physiological signal sensor to collect physiological signal information of the user.
  • the terminal may use an image sensor to collect image information, use an audio sensor to collect audio information, use both of these sensors to collect video information, or use a text acquiring technique to acquire a text message input by the user, such as a short message edited by the user or a text description attached to a photo.
  • the collected physiological signal information includes, but is not limited to, heart rate, electrocardiogram, skin conductance, body temperature and blood pressure.
  • the terminal may perform data analysis on at least one of the collected text information, image information, audio information, video information, and physiological signal information to determine the emotion expression information of the current emotion of the user. For example, if a textual depiction characterizing an emotion such as sadness or happiness is obtained from currently collected text information, the text information may be emotional expression information; if the smiley face of the user is extracted from currently collected image information, the image information may be emotional expression information; if the user's crying voice is extracted from currently collected audio information, the audio information may be emotional expression information; if a signal characterizing hypertension is extracted from currently collected physiological signal information, the physiological signal information may be emotional expression information; if the painful expression and/or painful moan of the user is extracted from currently collected video information, the video information may be emotional expression information; and if emotional expression information is simultaneously collected in more than one file format, all of the collected items are used together as emotional expression information.
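The modality-by-modality screening described above might be sketched as follows; the per-modality keyword checks stand in for real detectors (facial expression recognition, speech analysis, physiological signal classification), and all names are illustrative.

```python
from typing import Dict, List

def select_expression_info(collected: Dict[str, str]) -> List[str]:
    """Return the modalities whose content appears to carry an emotional expression.

    `collected` maps a modality name ("text", "image", "audio", "video",
    "physiological") to a coarse description of what was detected in it.
    Real detectors would replace the keyword checks below.
    """
    emotional_markers = {
        "text": ("sad", "happy", "angry"),
        "image": ("smiley face", "crying face"),
        "audio": ("crying voice", "laughter"),
        "video": ("painful expression", "painful moan"),
        "physiological": ("hypertension", "elevated heart rate"),
    }
    selected = []
    for modality, content in collected.items():
        if any(marker in content for marker in emotional_markers.get(modality, ())):
            selected.append(modality)
    # If several modalities qualify at once, all of them are kept as
    # emotional expression information, as described above.
    return selected

print(select_expression_info({"text": "so happy today", "image": "smiley face detected"}))
```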
  • in step 402, emotional stimulus source information of the current emotion is obtained.
  • the emotional stimulus source information may be the causes that caused the user's current emotion.
  • the causes can be people, objects, events, sentences, actions, and the like.
  • all the information currently obtained by the terminal may be analyzed, and the emotional stimulus source information may be extracted.
  • in step 403, if a change in the current emotion is detected, the emotional expression information is associated with the emotional stimulus source information, and the associated emotional expression information and emotional stimulus source information are stored.
  • the emotional state of the user may continue to be monitored at some sampling frequency based on the emotional expression information obtained in step 401. If a change in the user's emotion is detected, the emotional expression information and the emotional stimulus source information obtained in step 402 are associated and stored. That is, the terminal may continuously collect the emotion expression information and determine the current emotion of the user based on it. When the user's emotion remains in a stable state, the terminal does not need to store the collected emotional expression information and emotional stimulus source information; only when the current emotion changes considerably does the terminal need to associate the emotion expression information with the emotion stimulation source information and store the associated information.
  • the change in the current emotion includes a change in the emotion type or a change in the intensity of the current emotion.
  • the emotion type may represent a classification of emotions according to human emotion cognition, and in particular includes calmness, joy, sadness, surprise, anger, and fear.
  • the emotion intensity change may indicate the saturation degree of a particular type of emotion in terms of expressing each emotion. Since emotion can include two attributes: emotion type and emotion intensity, changes in the current emotion can also include two aspects of change. One is a change in emotion type, such as a change from calm to joy, and the other is a change in emotion intensity, such as a change from joy to very joy.
  • the emotion type and the emotion intensity may be obtained from the emotion expression information, since different emotion types and emotion intensities correspond to different emotion characteristic parameters, which may include expression parameters and speech parameters (for example, the facial feature "rise of the mouth corner" corresponds to the emotion type "joy", and the greater the magnitude of the rise, the higher the emotion intensity).
  • Emotional feature parameters may be extracted from the collected emotional expression information of the current emotion, and the emotion type and the emotion intensity of the current emotion may be further determined according to the emotion feature parameters.
  • a change in the emotion intensity is considered to have occurred when the magnitude of the emotion intensity change reaches a preset threshold.
  • different intensity values that are directly proportional to the saturation of the emotion may be assigned to different emotion intensities.
  • the change in the emotion intensity may indicate a change in the emotion intensity value whose magnitude reaches a preset threshold.
  • the emotional expression information and the emotional stimulus source information may be stored only when the variation in the intensity of emotion reaches a certain degree for the same emotion type. Otherwise, related emotional information does not need to be stored.
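The storage trigger of step 403 (store only on a type change, or on an intensity change whose magnitude reaches a preset threshold) could look roughly like this; numeric intensities in [0, 1] and the 0.3 default threshold are assumptions made for the sketch.

```python
from typing import Optional, Tuple

EmotionState = Tuple[str, float]  # (emotion type, emotion intensity in [0, 1])

def should_store(previous: Optional[EmotionState],
                 current: EmotionState,
                 intensity_threshold: float = 0.3) -> bool:
    """Store only when the emotion changes considerably.

    A change is either a different emotion type, or the same type with an
    intensity change whose magnitude reaches the preset threshold.
    """
    if previous is None:
        return True
    prev_type, prev_intensity = previous
    cur_type, cur_intensity = current
    if cur_type != prev_type:
        return True
    return abs(cur_intensity - prev_intensity) >= intensity_threshold

# Sampled states: calm -> joy triggers storage; joy 0.6 -> 0.7 does not.
print(should_store(("calm", 0.2), ("joy", 0.6)))   # True
print(should_store(("joy", 0.6), ("joy", 0.7)))    # False
```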
  • conversion may be performed on the collected emotional expression information, and the converted emotional expression information and emotional stimulus source information may be associated and stored.
  • the terminal may perform informatization extraction on the directly collected emotion expression information, so that information elements that may represent the characteristics of the current emotion may be obtained, and the expression type conversion may be further performed on the information elements.
  • the smiley face of the user may be extracted from the group picture as an information element representing the emotion characteristics of the user by facial feature recognition technology, and the expression shape conversion is performed on the smiley face.
  • the actual smiley face may be converted into an animated smiley face, audio information including laughter sounds, or text information including the word "happy".
  • the converted emotional expression information and emotional stimulus source information are then associated and stored. As such, if the emotional expression information needs to be stored, all the emotional expression information can be first converted into the same type of information, and the same type of information is stored for subsequent information management. Alternatively, the collected audio information can be converted into a text message and stored, reducing the space required for information storage.
  • the emotional expression information includes at least one of text information, image information, audio information, video information, and physiological signal information.
  • the method for monitoring emotions of the embodiment shown in FIG. 4 describes an example implementation of how to obtain emotional expression information and of when the emotional expression information and the emotional stimulus source information are stored. Specifically, the emotion expression information of the current emotion of the user is first determined according to at least one of the currently collected image information, audio information, video information, and physiological signal information; the emotion stimulation source information of the current emotion is obtained; and if a change in the current emotion is detected, the emotional expression information and the emotional stimulus source information are associated and stored. The embodiment associates emotional expression information with emotional stimulus source information and further determines the trigger for storing the associated information, so that the stored emotion relationship information effectively characterizes the state of emotional change and the efficiency of monitoring emotions is improved.
  • FIG. 5 shows a flowchart 500 illustrating a method for monitoring emotions in accordance with another embodiment of the present application.
  • the method for monitoring emotions includes the following steps.
  • in step 501, emotional expression information of the current emotion of the user is obtained.
  • in step 502, the emotional stimulus source information of the current emotion is determined according to at least one of the currently obtained multimedia information, environmental information, and physiological signal information.
  • the terminal may acquire at least one of multimedia information, environmental information, and physiological signal information through various signal sensors, perform data analysis on the collected information, and determine the emotion stimulation source information of the current emotion according to the result of the data analysis.
  • determining the emotional stimulation source information according to the currently obtained multimedia information includes determining the emotional stimulation source information according to the currently output and / or collected multimedia information.
  • the multimedia information includes at least one of text information, image information, audio information, and video information.
  • the multimedia information output and / or collected by the terminal is obtained. Data analysis is performed on the multimedia information, and emotional stimulus source information causing an emotional change can be determined according to the results of the data analysis.
  • determining the emotional stimulus source information according to the currently obtained multimedia information may include selecting, as the emotional stimulation source information, at least one of the currently output and/or collected audio information, attribute information derived from the audio information, an audio segment in a preset time interval of the audio information, and semantic information derived from the audio information.
  • the audio information currently output and / or collected by the terminal is first obtained.
  • Data analysis is then performed on the audio information to obtain the attribute information derived from the audio information, the audio segment in the predetermined time interval of the audio information, the semantic information derived from the audio information, the event information derived from the audio information, and the personal information derived from the audio information. Thereafter, at least one of the audio information, the attribute information derived from the audio information, the audio segment in the predetermined time interval of the audio information, and the semantic information derived from the audio information is selected as the emotional stimulation source information.
  • the preset time interval may be an interval before and / or after the current time point.
  • when the terminal outputs (i.e., plays) a song, the currently played song can be obtained by detecting the operating state of the terminal, and at least one of the following three items can be obtained: a) attribute information of the song, such as title, artist/performer, and album, obtained by performing data analysis on the song; b) semantic information, such as the song lyrics and the central content of the lyrics, obtained by semantic analysis; and c) an audio segment in a preset time interval, obtained by splitting the song.
  • during a call, the terminal may actively collect the audio of the call and perform data analysis on it, obtaining at least one of the following three items: a) attribute information such as the callers, the talk time, and the calling frequency between the callers; b) semantic information obtained by semantic analysis of the call content; and c) an audio segment in a preset time interval obtained by dividing the call audio. To determine the stimulus source information, stimulus source analysis is performed on the information related to the audio obtained in the two examples above.
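For the song example above, collecting candidate stimulus source information might be sketched as follows; the metadata dictionary, the timestamped lyric lines, and the window size are assumptions standing in for the player state and the semantic analysis.

```python
from typing import Dict, List, Tuple

def audio_stimulus_candidates(song: Dict[str, str],
                              lyrics: List[Tuple[float, str]],
                              now_s: float,
                              window_s: float = 15.0) -> List[str]:
    """Collect candidate emotional stimulus source information from a playing song.

    `song` holds attribute information (title, artist, album); `lyrics` is a
    list of (timestamp in seconds, line) pairs standing in for semantic
    information; `now_s` is the current playback position.
    """
    candidates = []
    # a) attribute information derived from the audio information
    candidates.append(f"song: {song['title']} by {song['artist']} ({song['album']})")
    # b) semantic information: lyric lines near the current playback position
    for ts, line in lyrics:
        if abs(ts - now_s) <= window_s:
            candidates.append(f"lyric: {line}")
    # c) an audio segment in a preset time interval around the current position
    candidates.append(f"segment: [{max(0.0, now_s - window_s):.0f}s, {now_s + window_s:.0f}s]")
    return candidates

print(audio_stimulus_candidates(
    {"title": "Song X", "artist": "Artist Y", "album": "Album Z"},
    [(40.0, "parted under the moonlight")], now_s=45.0))
```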
  • determining the emotional stimulus source information according to the currently output and/or collected multimedia information may include selecting, as the emotional stimulation source information, at least one of the currently output and/or collected video information, attribute information derived from the video information, a video segment in a predetermined time interval of the video information, semantic information derived from the video information, event information derived from the video information, personal information derived from the video information, screenshot information derived from the video information, and audio information derived from the video information.
  • video information currently output and / or collected by the terminal is first obtained.
  • Data analysis is then performed on the video information to obtain attribute information, semantic information, event information, personal information, and video segments in preset time intervals of the video information.
  • at least one emotional stimulus source information is determined from the information by stimulus source analysis.
  • the preset time interval may be an interval before and / or after the current time point.
  • when the terminal outputs (i.e., plays) a movie, the currently played movie can be obtained by detecting the operating state of the terminal, and at least one of the following four items can be obtained: a) attribute information of the movie, such as title, starring actors, director, and production company, obtained by performing data analysis on the film; b) semantic information, such as the movie dialogue and a summary of the dialogue, obtained by semantic analysis; c) screenshots taken of the movie, which are analyzed to obtain event information and personal information; and d) video segments within a preset time interval, obtained by dividing the movie.
  • during a video call, the terminal may actively collect the call video and perform data analysis on it, obtaining at least one of the following four items: a) attribute information such as the callers, the talk time, and the calling frequency between the callers; b) semantic information obtained by semantic analysis of the video call content; c) screenshots taken of the video, which are analyzed to obtain event information and personal information; and d) video segments within a preset time interval obtained by dividing the call video. To determine the stimulus source information, stimulus source analysis is performed on the information related to the video obtained in the two examples above.
  • determining the emotional stimulus source information according to the currently output and/or collected multimedia information may include selecting, as the emotional stimulus source information, at least one of the currently output and/or collected image information, event information derived from the image information, and personal information derived from the image information.
  • the image information currently output and / or collected by the terminal is first obtained.
  • Data analysis is then performed on the image information to obtain the event information and the personal information included in the image information.
  • at least one emotional stimulus source information is determined from the information by stimulus source analysis.
  • when the terminal outputs (i.e., displays) a picture or collects a picture with the camera, the picture currently being browsed may be obtained by detecting the operating state of the terminal, and analysis may be performed on the picture to obtain event information or personal information.
  • personal information such as a person's name, gender, and age may be determined from the person's facial features and attire.
  • the event information included in the picture is determined by performing analysis on the body motions or facial expressions of the persons, the picture background, and the like in the picture.
  • stimulus source analysis is performed on the information related to the photograph obtained in the two examples above.
  • FIG. 6 is a schematic diagram of an application scene of an embodiment.
  • when the user uses the terminal 601 to browse the photo 602, the terminal detects that the user's emotion suddenly becomes fear, and analyzes the photo 602 using an image analysis technique.
  • through the analysis, the terminal may acquire the personal information in the picture 602, that is, Snow White (first personal information) and the old witch (second personal information), identified according to the facial features and clothes of the persons.
  • the first event information of the image (that is, the old witch is offering an apple to Snow White) may also be obtained.
  • it can further be determined that the old witch is the queen and that the apple is poisonous, so that the second event information (that is, the queen wants to give Snow White a poisoned apple) can be obtained.
  • accordingly, the emotional stimulus source information that causes fear in the user is that the queen wants to give Snow White a poisoned apple.
  • determining the emotional stimulus source information according to the currently output and/or collected multimedia information may include selecting, as the emotional stimulus source information, at least one of the currently output and/or collected text information and semantic information derived from the text information.
  • the text information currently output and / or collected by the terminal is first obtained.
  • Data analysis is then performed on the text information to obtain semantic information derived from the text information.
  • at least one of the text information and the semantic information is determined as the emotional stimulus source information.
  • when the terminal outputs (i.e., displays) text information, the currently browsed text image may be obtained by detecting the operating state of the terminal, and the text information included in the text image may be obtained by a text recognition technique.
  • the input words may be directly collected to be used as collected text information.
  • semantic analysis may be performed on the obtained text information to obtain semantic information.
  • stimulus source analysis may be performed on the acquired text information and / or semantic information.
  • for example, if the text message input by the user and collected by the terminal is "A lonely stranger in a strange land I am cast, I miss my family all the more on every festive day", this text message can be used directly as the emotional stimulus source information that makes the user feel sad.
  • semantic analysis may be performed on the textual information to obtain semantic information "being far away from my hometown, I can't reunite with families", which may be used as emotional stimulus source information.
  • the physiological signal information includes the user's current physiological signal information or health information obtained according to the physiological signal information.
  • Health information may be obtained by analyzing physiological signal information, and may reflect the user's health status. For example, if the user's collected body temperature signal is 38°C, the temperature signal may be analyzed to find that it is higher than the normal human body temperature of 36°C, thus inferring that the user has a fever; in this case, the health information may be an image or text representing "fever".
  • when physiological signal information is used as emotional stimulation source information, it should be pointed out that the acquired emotional expression information will no longer include the physiological signal information; instead, for example, acquired image information (e.g., a picture of the user's face flushed by the fever) will be used as the emotional expression information.
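The fever example above amounts to a simple threshold on the physiological signal; a minimal sketch, with an assumed margin above normal body temperature:

```python
def health_info_from_body_temperature(temp_c: float, normal_c: float = 36.0) -> str:
    """Derive a coarse health label from a body temperature reading.

    Following the example above, a reading noticeably above the normal
    body temperature is interpreted as a fever; such a label can then be
    used as emotional stimulus source information.
    """
    if temp_c >= normal_c + 1.5:
        return "fever"
    return "normal"

print(health_info_from_body_temperature(38.0))  # -> "fever"
```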
  • the environmental information includes at least one of time information, location information, weather information, temperature information, and humidity information.
  • the environmental information may be used as emotional stimulus source information.
  • the environmental information may be information related to the current environment that may be obtained while collecting the emotional expression information.
  • Information related to the current environment may include time information, geographic information and climate information, such as current weather, temperature and humidity, associated with the obtained emotional expression information.
  • the environmental information may be characterized by text information, image information, audio information and video information. For example, a picture depicting a blue sky can be used as emotional stimulus source information because the picture includes weather information "sunny.”
  • two or three of the aforementioned multimedia information, environmental information, and physiological signal information can be used together as emotional stimulus source information if they jointly cause a change of emotion.
  • the user may select one of the above information to be emotional stimulus source information.
  • for example, the terminal determines that the current emotion of the user is sadness according to the currently collected emotional expression information; by detecting the operating state of the terminal, it determines that the currently played movie is entitled "Butterfly Lovers"; it determines from a weather forecast website that the current weather is light rain; and it determines that the user's current body temperature is higher than normal body temperature according to the temperature signal obtained by the physiological signal sensor. All three pieces of information can be used as emotional stimulus source information that makes the user feel sad.
  • the user may receive all three pieces of information and select one or more pieces of information as emotional stimulus source information. For example, if the user likes a rainy day, the user will not select weather information indicating drizzle as the emotional stimulus source information causing grief.
  • the multimedia information, the environmental information, and the physiological signal information may simply be listed side by side as the emotional stimulus source information.
  • alternatively, the information can be logically analyzed and the analysis results can be used as emotional stimulus source information. For example, if the obtained time information is the user's birthday and video information is collected indicating that the user is celebrating his birthday with his friends, then text or voice related to "my friends celebrate my birthday" may be used as the emotional stimulus source information according to a logical analysis of the time information and the video information.
  • in step 503, emotional expression information and emotional stimulus source information are associated and stored.
  • emotional expression type and emotional intensity are associated with emotional stimulus source information, and the associated emotional expression type, emotional strength and emotional stimulus source information are stored.
  • the emotion type and emotion intensity obtained by the emotional expression information may be associated with the emotional stimulus source information and may be stored. As such, the user may obtain a particular emotion type and emotion intensity, or classify all of the collected emotions according to the emotion types.
  • the emotional intensity may be normalized.
  • different users express their own feelings in different ways. For example, an extrovert does not mind showing his feelings: he will laugh when he is happy and cry when he is sad. In contrast, an introvert is rather calm: he will only smile when he feels happy and quietly weep when he feels sad.
  • the emotion intensity can therefore be normalized. Specifically, for emotions of the same emotion type, the strongest emotion intensity expressed by the user is set to the maximum value MAX, the calmest emotion intensity expressed by the user is set to the minimum value MIN, and any other emotion intensity X may be normalized based on MAX and MIN.
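One natural reading of this normalization is the linear mapping (X - MIN) / (MAX - MIN); the patent does not give an explicit formula, so the following sketch is an assumption.

```python
def normalize_intensity(x: float, user_min: float, user_max: float) -> float:
    """Normalize a raw emotion intensity X to [0, 1] for one user and emotion type.

    MAX is the strongest intensity the user has expressed for this emotion type,
    MIN the calmest; an extrovert's loud laugh and an introvert's faint smile
    can then both map near 1.0 for "joy".
    """
    if user_max == user_min:
        return 0.0
    return (x - user_min) / (user_max - user_min)

print(normalize_intensity(0.4, user_min=0.1, user_max=0.5))  # -> 0.75
```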
  • when the emotion type of the current emotion remains unchanged within a preset sampling time interval and the emotion intensity of the current emotion is greater than the stored emotion intensity, the stored emotion expression information and the stored emotion stimulation source information are replaced by the emotion expression information and the emotional stimulus source information of the current emotion, respectively. Specifically, after the emotional expression information has been associated with the emotional stimulus source information and stored, if the emotion type of the current emotion remains unchanged within a preset sampling interval (e.g., 10 minutes) while the emotion intensity fluctuates within a certain range, only the emotional expression information and the associated emotional stimulus source information corresponding to the maximum emotion intensity are stored, which not only sufficiently reflects the emotional state of the user but also reduces the required storage space.
  • for example, when a change in emotion is detected, the emotional expression information and the emotional stimulus source information obtained at the time of the change are associated and stored, and the value of the emotion intensity at that time, for example 60%, may be stored as well.
  • if the value of the emotion intensity then changes to 70% at 7 minutes, the previously stored information is replaced with the emotional expression information and the emotional stimulus source information obtained at 7 minutes.
  • if instead the value of the emotion intensity increases to 95% at 7 minutes, that is, the magnitude of the change is 35%, greater than a preset threshold of, for example, 30%, the emotional expression information and the emotional stimulus source information obtained at 7 minutes are associated and stored as new information, while the previously stored information is still preserved.
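The replace-or-append behavior of this example could be sketched as follows; the in-memory record list, the 30% default threshold, and the sample record contents are assumptions made for illustration.

```python
from typing import List, Tuple

Record = Tuple[str, str, float]  # (expression info, stimulus source info, intensity in %)

def update_records(records: List[Record], new: Record, same_type: bool,
                   threshold_pct: float = 30.0) -> None:
    """Replace or append a record following the 60% / 70% / 95% example above."""
    if not records or not same_type:
        records.append(new)
        return
    last_intensity = records[-1][2]
    change = abs(new[2] - last_intensity)
    if change >= threshold_pct:
        # Large change within the same emotion type: keep the old record and add a new one.
        records.append(new)
    elif new[2] > last_intensity:
        # Same type, modest change, higher intensity: replace the stored record.
        records[-1] = new
    # Otherwise the stored record already reflects the stronger emotion; keep it.

# Scenario (a): 60% -> 70%, change below the threshold, so the stored record is replaced.
records_a: List[Record] = [("smiley face", "birthday message", 60.0)]
update_records(records_a, ("broad smile", "birthday message", 70.0), same_type=True)
print(records_a)   # one record, now at 70%

# Scenario (b): 60% -> 95%, a 35% change exceeds the threshold, so a new record is added.
records_b: List[Record] = [("smiley face", "birthday message", 60.0)]
update_records(records_b, ("laughing out loud", "birthday party", 95.0), same_type=True)
print(records_b)   # two records
```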
  • the stored emotional expression information and associated emotional stimulus source information are shared and displayed.
  • the user may share the stored emotional expression information and associated emotional stimulus source information in a wired or wireless manner; for example, the user may post the information on a social platform or send it to any contactable person. Alternatively, the user can display the information on the user's own terminal.
  • the user can log in, under the identity of a registered user, to the system in which the information is stored through various terminals on different platforms, and browse the stored emotional expression information and emotional stimulus source information.
  • the user can bind the user's biological information and the user's ID, so that the terminal can automatically identify the current user and log in to the associated system.
  • data analysis is performed on the stored emotional expression information and associated emotional stimulus source information, and the analysis results are shared or displayed.
  • data analysis may be performed on stored emotional expression information and associated emotional stimulus source information, and may be performed primarily with a focus on causes and variations in emotion.
  • related information, such as the stored emotional expression information and associated emotional stimulus source information, may be formed into a graph or a slide file, and the graph or slide file can then be shared or displayed.
  • data analysis is performed on the emotional expression information and associated emotional stimulus source information, and emotion management recommendation information is output in accordance with the data analysis results.
  • the emotion management recommendation information may be output according to the data analysis result. For example, if the data analysis reveals that the user's feelings were angry during a time interval and that the causes of the anger were minor, the emotion management recommendation information can remind the user to control his mood and irritable emotions, and can provide some tips to help the user feel better.
  • FIG. 7A is a schematic diagram of another application scene of an embodiment.
  • a diagram 702 is shown on the terminal 701, in which the histogram represents the degree of change of the user's joy, each column of the histogram represents a record of joy, and the height of each column indicates the intensity of the user's joy.
  • if the user wants to know more about a particular record of happy emotion, the user can click on the star pattern 703 at the top of the histogram.
  • as shown in FIG. 7B, the terminal will then show specific information about that emotion, including the emotional expression information of the user's joyful emotion, that is, the user's smiley face 704, and the emotional stimulus source information that caused the user's happy emotion, that is, the SMS 705. The user can then not only see his joyful feelings, but also recall that the reason he was happy was that he had received a message informing him that he was promoted.
  • step 501 and step 503 in the above flowchart are the same as step 201 and step 203 in the above-described embodiment, respectively, and will not be described in detail.
  • the method for monitoring emotions in the embodiment of FIG. 5 shows in more detail an example implementation of obtaining emotional stimulus source information.
  • emotional expression information of the user's current emotion is first obtained.
  • the emotion stimulus source information of the current emotion is determined according to at least one of the currently collected multimedia information, environmental information, and physiological signal information.
  • the emotional expression information and the emotional stimulus source information are associated and stored.
  • the emotional stimulus source information is determined according to the obtained multimedia information, the environmental information and the physiological signal information, and the emotional stimulus source information obtained in such a manner includes almost all possible stimulus source information causing the emotional changes.
  • the stimulus sources thus obtained are more comprehensive, further improving the accuracy of monitoring emotion.
  • FIG. 8 shows a flowchart 800 illustrating a method for monitoring emotion in accordance with another embodiment.
  • the method for monitoring emotions includes the following steps.
  • in step 801, when it is detected that the preset emotion monitoring program is executed, emotion expression information and emotion stimulation source information of the user are obtained, and emotion expression information and/or emotion stimulation source information of an associated user are obtained.
  • the emotion monitoring program may be a program designed specifically for monitoring emotions, or a general program, such as a short message program, that is set in the terminal so that the terminal collects emotion relationship information when the general program is executed. Therefore, when a preset emotion monitoring program is executed, the terminal may actively acquire the emotion expression information of the user's current emotion, together with the emotion expression information and/or emotion stimulation source information of an associated user.
  • it should be pointed out that the biological information of the associated user may be recorded in advance, so that the emotional expression information and/or emotional stimulation source information of the associated user can be identified, according to the biological information, from the various information currently collected.
  • for example, a selfie taken by user A includes user A's smiley face and, in the background, user B slipping with a scared expression, where user B is a user associated with user A.
  • in this case, the terminal may acquire not only user A's emotion expression information, that is, the smiley face, but also user B's emotion expression information, that is, the scared expression, and user B's emotion stimulation source information, that is, the slipping.
  • that is, while obtaining the user's emotion expression information and emotion stimulation source information when the preset emotion monitoring program is detected to be executed, the terminal may also obtain the emotion expression information and/or emotion stimulation source information of the associated user.
  • for example, the smiley face of the associated user B may be the emotional stimulus source information of user A, while the same smiley face is the emotional expression information of the associated user B.
  • in step 802, emotional expression information and emotional stimulus source information of the current emotion are associated and stored.
  • in step 803, emotional expression information and/or emotional stimulation source information of the associated user is shared with the associated user.
  • when the terminal acquires the emotion expression information of the associated user in step 801, it may record the information and share it with the associated user in a wired or wireless manner. Before sharing, the terminal may also output a prompting message to let the current user select whether to share the emotional expression information with the associated user.
  • if emotional stimulus source information corresponding to the emotional expression information of the associated user is obtained, the emotional expression information and the corresponding emotional stimulation source information of the associated user will be shared with the associated user.
  • emotional stimulus source information corresponding to the emotional expression information may not be obtained correctly. Under such circumstances, only emotional expression information will be shared. Otherwise, emotional expression information will be shared with the corresponding emotional stimulus source information.
  • FIG. 9 is a schematic diagram of an application scene of an embodiment.
  • the terminal 901 displays the group picture 902 that has just been taken.
  • by face recognition technology, it can be determined that not only the current user A's smiley face 903 but also the smiley face 904 of user A's friend, that is, the associated user B, is present.
  • terminal 901 may output a prompting message 905 that prompts User A to determine whether to share the picture with User B.
  • in step 804, emotional expression information and emotional stimulation source information of an associated user are received.
  • the terminal may also receive emotional expression information and emotional stimulus source information that are associated with the current user and shared by the associated user. Prior to reception, the terminal may also output a prompting message that prompts the current user to select whether to receive the emotion expression information and the emotion stimulus source information.
  • in step 805, the emotional expression information and the emotional stimulus source information of the associated user are combined with the stored emotional expression information and emotional stimulus source information.
  • after receiving the emotion expression information and the emotion stimulation source information shared by the associated user, the terminal may combine the received information with the stored information according to the time sequence and/or location information in which each piece of information was generated.
  • the terminal collects the current user A's laughter and song X's melody through the acoustic sensor, and the associated user B's dancing behavior and smiley face through the image sensor.
  • the user A's emotional expression information is a laugh and the user A's emotional stimulus source information is the user B's dancing behavior, and the user B's emotional expression information is a smiley face.
  • the emotional stimulus source information of user B may be determined to be a melody of song X.
  • the emotional expression information and emotional stimulus information of the current user and associated user may be mixed together. Indeed, different emotional relationship information can be distinguished according to different application scenes.
  • the received information may be combined with the information stored in the terminal.
  • a time at which received emotional relationship information is generated can be obtained, and the received emotional relationship information is stored as information related to the corresponding time.
  • for example, the emotional expression information of user B stored for the day includes the emotional expression information X generated at 9 o'clock and the emotional expression information Y generated at 11 o'clock; if emotional expression information Z generated at another time, for example 10 o'clock, is received, the information Z may be stored as the emotional expression information of user B for that time.
  • if other emotional expression information M of user B generated at 9 o'clock is received, the received information M and the previously stored information X are used together as the emotional expression information generated at 9 o'clock; alternatively, the common emotional expression features of information M and information X may be extracted to form the emotional expression information for 9 o'clock.
  • location information indicating where the received emotional relationship information was generated may be obtained, the emotional relationship information being combined with other stored information generated at the same location.
  • the stored emotional stimulus source information of user B is the emotional stimulus source information N generated at position 1.
  • if the received emotional stimulus source information P was generated at a location for which no information of user B is stored, the information P may be stored as user B's emotional stimulus source information for that location.
  • if emotional stimulus source information Q generated at position 1 is received, both the received information Q and the previously stored information N are used as the emotional stimulus source information generated at position 1.
  • alternatively, time information and location information regarding when and where the received emotional relationship information was generated may be obtained simultaneously, and the emotional relationship information generated at the same location is combined in time order.
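  • as a minimal illustration of the combining step described above, the following Python sketch merges received emotion relationship records with stored records by generation time and location; the record structure and the feature-extraction helper are assumptions made for illustration, not part of this disclosure.

```python
from collections import defaultdict

def combine_records(stored, received, extract_common_features=None):
    """Merge received emotion relationship records into the stored ones.

    Each record is assumed to be a dict with 'time', 'location' and 'data'
    keys (a hypothetical structure used only for illustration). Records that
    share the same time and location are merged; otherwise the received
    record is stored as new information.
    """
    index = defaultdict(list)
    for rec in stored:
        index[(rec["time"], rec["location"])].append(rec)

    for rec in received:
        key = (rec["time"], rec["location"])
        if index[key] and extract_common_features:
            # Keep only the common emotional expression features, as one of
            # the combining options described above.
            merged = extract_common_features(index[key] + [rec])
            index[key] = [merged]
        else:
            # No stored record for this time/location, or no feature
            # extraction supplied: keep both pieces of information.
            index[key].append(rec)

    return [rec for recs in index.values() for rec in recs]
```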
  • the method for monitoring the emotions of the embodiment describes in more detail an example implementation of obtaining emotional expression information and emotional stimulation source information of an associated user.
  • the terminal may collect the emotion relationship information of the associated user while collecting the emotion relationship information of the current user, and may share it with the associated user.
  • the terminal may also receive emotion relationship information transmitted from the associated user.
  • Embodiments broaden the source from which emotional relationship information can be obtained, further improving the range of monitoring emotions.
  • an apparatus for monitoring emotions of an embodiment includes: expression information acquisition module 1010, stimulus information acquisition module 1020, and association and storage module 1030.
  • the expression information obtaining module 1010 is configured to obtain emotion expression information of the current emotion of the user;
  • the stimulus information obtaining module 1020 is configured to obtain emotional stimulus source information of the current emotion
  • the associating and storing module 1030 is configured to associate the emotional expression information obtained by the expression information obtaining module 1010 with the emotional stimulus source information obtained by the stimulus information obtaining module 1020, and to store the associated emotional expression information and emotional stimulus source information.
  • the foregoing embodiment of the present application provides an apparatus for monitoring emotions.
  • the emotion expression information of the current emotion of the user is obtained by the expression information obtaining module
  • the emotion stimulation source information of the current emotion is obtained by the stimulus information obtaining module
  • and the emotion expression information and the emotion stimulus source information are associated with each other and stored by the association and storage module.
  • the device can record both the emotional expression of the user and the stimulus source information that caused the emotional expression, thus broadening the scope of monitoring emotions and ensuring traceability of the emotional expression.
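  • purely as an illustrative sketch (the class and method names below are assumptions, not the disclosed implementation), the cooperation of the three modules could be organized as follows:

```python
class EmotionMonitor:
    """Illustrative composition of the three modules described above."""

    def __init__(self, expression_module, stimulus_module, storage):
        self.expression_module = expression_module  # obtains emotion expression info
        self.stimulus_module = stimulus_module       # obtains emotion stimulus source info
        self.storage = storage                       # associates and stores both

    def monitor_once(self):
        expression = self.expression_module.obtain()
        stimulus = self.stimulus_module.obtain()
        # Associate the two pieces of information and store them together,
        # so the cause of the emotional expression remains traceable.
        self.storage.save({"expression": expression, "stimulus": stimulus})
```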
  • the expression information obtaining module 1010 further includes an expression information obtaining submodule for determining the emotion expression information of the current emotion of the user according to the currently collected multimedia information and/or physiological signal information.
  • the multimedia information includes at least one of text information, image information, audio information, and video information.
  • the apparatus for monitoring the aforementioned emotions further includes a change detection module, not shown in the figures, for detecting a change in the current emotion.
  • the associating and storing module 1030 associates the emotional expression information with the emotional stimulus source information and stores the associated emotional expression information and the emotional stimulus source information.
  • the change in current emotion includes a change in emotion type or a change in emotion intensity.
  • the emotion intensity change means that the magnitude of the change in emotion intensity reaches a preset threshold.
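  • a minimal sketch of such a trigger, assuming numeric emotion intensities and string emotion types (the names and the threshold value are illustrative only):

```python
def emotion_changed(prev_type, prev_intensity, cur_type, cur_intensity,
                    intensity_threshold=0.2):
    """Return True when the current emotion should be associated and stored.

    The trigger fires on a change of emotion type, or on an emotion
    intensity change whose magnitude reaches a preset threshold.
    """
    if cur_type != prev_type:
        return True
    return abs(cur_intensity - prev_intensity) >= intensity_threshold
```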
  • the apparatus for monitoring the aforementioned emotions further includes an expression information conversion module, not shown in the figures, for performing information conversion of the emotion expression information.
  • the association and storage module 1030 described above includes an association and storage submodule for associating the converted emotional expression information with emotional stimulation source information and for storing the associated emotional expression information and emotional stimulation source information.
  • the emotional expression information includes at least one of text information, image information, audio information, video information, and physiological signal information.
  • the apparatus for monitoring emotions of the embodiment shown in FIG. 11 further determines the trigger for associating emotional expression information with emotional stimulus source information and for storing the associated information, which allows the stored emotional relationship information to effectively characterize the state of emotional change and improves the efficiency of monitoring emotions.
  • the aforementioned stimulus source information acquisition module 1020 may generate the emotion stimulus source information of the current emotion according to at least one of the currently obtained multimedia information, environmental information, and physiological signal information.
  • the stimulus source information acquisition submodule 1021 is configured to determine the emotional stimulus source information according to the currently output and / or collected multimedia information.
  • the multimedia information includes at least one of text information, image information, audio information, and video information.
  • the stimulus source information acquisition submodule 1021 includes an audio information unit, not shown in the figures, for selecting, as the emotional stimulus source information, at least one of: the currently output and/or collected audio information, attribute information derived from the audio information, an audio segment of the audio information within a predetermined time interval, and semantic information derived from the audio information.
  • the stimulus source information acquisition submodule 1021 includes a video information unit, not shown in the figures, for selecting, as the emotional stimulus source information, at least one of: the currently output and/or collected video information, attribute information derived from the video information, a video segment of the video information within a preset time interval, semantic information derived from the video information, event information derived from the video information, personal information derived from the video information, screenshot information derived from the video information, and audio information derived from the video information.
  • the stimulus source information acquisition submodule 1021 includes an image information unit, not shown in the figures, for selecting, as the emotional stimulus source information, at least one of: the currently output and/or collected image information, event information derived from the image information, and personal information derived from the image information.
  • the stimulus source information acquisition submodule 1021 includes a text information unit, not shown in the figures, for selecting, as the emotional stimulus source information, at least one of the currently output and/or collected text information and semantic information derived from the text information.
  • the physiological signal information includes health information derived from the user's current physiological signal information or current physiological signal information.
  • the environmental information includes at least one of time information, location information, weather information, temperature information, and humidity information.
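  • the per-modality units described above can be pictured as a simple dispatch over whatever inputs are currently available; the function and parameter names below are assumptions made for illustration, not disclosed APIs:

```python
def determine_stimulus_source(multimedia=None, environment=None, physiology=None):
    """Collect candidate emotional stimulus source information from the
    currently available inputs (all arguments are optional dicts)."""
    candidates = []
    if multimedia:
        for kind in ("audio", "video", "image", "text"):
            if kind in multimedia:
                # Each unit would extract attributes, segments, semantic,
                # event or personal information from its own media type.
                candidates.append((kind, multimedia[kind]))
    if environment:
        # Time, location, weather, temperature or humidity information.
        candidates.append(("environment", environment))
    if physiology:
        # Current physiological signals or health information derived from them.
        candidates.append(("physiology", physiology))
    return candidates
```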
  • the apparatus for monitoring the aforementioned emotions further includes an emotion storage module, not shown in the figures, for associating the emotion type and emotion intensity of the current emotion with the emotion expression information, and for storing the associated emotion type, emotion intensity, and emotion stimulus source information.
  • the apparatus for monitoring the aforementioned emotions further includes a normalization module for normalizing the emotion intensity.
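  • one simple way to normalize emotion intensity is a min-max scaling into [0, 1]; the raw range below is an assumption for illustration:

```python
def normalize_intensity(raw, raw_min=0.0, raw_max=100.0):
    """Min-max normalize a raw emotion intensity into the range [0, 1]."""
    if raw_max == raw_min:
        return 0.0
    clipped = min(max(raw, raw_min), raw_max)
    return (clipped - raw_min) / (raw_max - raw_min)
```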
  • the apparatus for monitoring the aforementioned emotions further includes an emotion update module, not shown in the figures, for, when the emotion type is maintained unchanged within a preset sampling time interval and the emotion intensity of the current emotion is greater than the stored emotion intensity, replacing the stored emotion expression information and the stored emotion stimulus source information with the emotion expression information of the current emotion and the emotion stimulus source information of the current emotion, respectively.
  • the apparatus for monitoring the aforementioned emotions further comprises a program detection module, not shown in the figures, for detecting the execution of the emotion monitoring program.
  • the apparatus for monitoring the aforementioned emotions includes: a display and sharing module, not shown in the figures, for sharing or displaying stored emotional expression information and associated emotional stimulus source information;
  • a data analysis module not shown in the figures, for performing data analysis on the stored emotional expression information and associated emotional stimulus source information, and for sharing or displaying the data analysis results;
  • and an emotion recommendation module, not shown in the figures, for performing data analysis on the stored emotion expression information and the associated emotion stimulus source information and for outputting emotion management recommendation information according to the data analysis result.
  • the apparatus for monitoring the emotions of the embodiment determines the emotional stimulus source information according to the obtained multimedia information, environmental information, and physiological signal information; the emotional stimulus source information obtained in this manner covers almost all possible stimulus sources that can cause emotional changes, so the stimulus sources thus obtained are more comprehensive, further improving the accuracy of monitoring emotions.
  • the apparatus for monitoring the aforementioned emotions may further include an associated emotion acquisition module 1040 for obtaining emotion expression information and/or emotion stimulus source information of an associated user, and an associated emotion sharing module 1050 for sharing the emotion expression information and/or emotion stimulus source information of the associated user with the associated user.
  • the apparatus for monitoring the above-described emotions further includes an emotion receiving module 1060 for receiving emotion expression information and emotion stimulus source information shared by an associated user, and an emotion combining module 1070 for combining the received emotion expression information and emotion stimulus source information with the stored emotion expression information and the stored emotion stimulus source information.
  • the apparatus for monitoring the emotions of the embodiment may collect the emotional relationship information of the associated user while collecting the emotional relationship information of the current user, and may share it with the associated user.
  • the device may receive emotional relationship information sent from an associated user. Embodiments broaden the source from which emotional relationship information can be obtained, further improving the range of monitoring emotions.
  • Modules described in the foregoing embodiments may be implemented in software or hardware, or may be disposed in a processor.
  • the processor includes a representation information acquisition module, a stimulus source information acquisition module, and an association and storage module.
  • the name of each module does not limit the module itself.
  • the expression information obtaining module may also be described as "a module used to obtain emotional expression information of a user's current emotion".
  • the electronic apparatus 1000 may be an embodiment of the apparatus for monitoring the emotions described in the above drawings, and all of the above descriptions may be applied.
  • the electronic device 1000 may be various electronic devices such as a smart phone, a smart TV, a tablet computer, a portable laptop computer, a desktop computer, a wearable device, and the like.
  • the electronic apparatus 1000 includes a memory 1700, a display unit 1210, and a controller 1300.
  • the controller 1300 may control overall operations of the electronic device 1000 and may process various data required for the operation of the electronic device 1000.
  • the control of the operation of the apparatus for monitoring emotion described above with reference to the drawings, or data processing necessary for the operation may all be performed by the controller 1300. Therefore, the above description may be applied to the electronic device 1000 of FIG. 14 even if not specifically mentioned.
  • the controller 1300 may include a CPU, a microprocessor, a GPU, and the like.
  • the controller 1300 may include modules shown in the above-described drawings.
  • the controller 1300 may obtain emotion expression information indicating the emotion of the user.
  • the controller 1300 may obtain emotional stimulus source information indicating a cause of the emotion.
  • the controller 1300 may associate the emotion expression information with the emotion stimulation source information, and store the associated emotion expression information and the emotion stimulation source information in the memory 1700.
  • the controller 1300 may display the emotion expression information and the emotion stimulation source information on the display unit 1210.
  • the controller 1300 may associate the time information regarding the time at which the emotion expression information and the emotion stimulation source information are obtained with the emotion expression information and the emotion stimulation source information and store the same in the memory 1700.
  • the controller 1300 may display the content indicating the emotion of the user over time on the display 1210.
  • the content may take various forms, such as text, charts, slides, or graphs, to indicate emotions over time. An example of content representing the emotion of the user over time is described with reference to FIG. 7A.
  • the controller 1300 may display the emotion expression information and the emotion stimulus source information related to the selected time on the display 1210.
  • the electronic apparatus 1000 may further include a user input unit configured to receive a user input for selecting a specific time from the content.
  • when a specific time is selected from the content, the screen of the display unit 1210 may be switched as shown in FIG. 7B. That is, the controller 1300 may display the emotion expression information and the emotion stimulus source information related to the selected time on the display 1210.
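  • a small sketch of how time-stamped emotion records could be stored and looked up when the user selects a time on the displayed content; the data layout and class name are assumptions for illustration only:

```python
import bisect
from datetime import datetime

class EmotionTimeline:
    """Stores emotion records keyed by acquisition time and returns the
    record closest to a selected time, e.g. for display on a chart."""

    def __init__(self):
        self._times = []    # sorted acquisition times
        self._records = []  # (expression_info, stimulus_source_info) tuples

    def add(self, when: datetime, expression_info, stimulus_source_info):
        i = bisect.bisect(self._times, when)
        self._times.insert(i, when)
        self._records.insert(i, (expression_info, stimulus_source_info))

    def at(self, selected: datetime):
        if not self._times:
            return None
        i = bisect.bisect_left(self._times, selected)
        # Clamp to the last stored record when the selection is past the end.
        if i == len(self._times):
            i -= 1
        return self._records[i]
```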
  • the controller 1300 may obtain emotional stimulation source information based on at least one of multimedia information, environment information, and physiological signal information of the user.
  • the multimedia information may include at least one of text information, image information, audio information, and video information.
  • the controller 1300 may acquire the emotion expression information based on at least one of the multimedia information and the physiological signal information.
  • the controller 1300 may store the emotion expression information and the emotion stimulus source information in the memory 1700, and then monitor changes in the emotion type and emotion intensity of the user's emotion. If, as a result of the monitoring, the emotion type does not change and the change in emotion intensity is greater than a first threshold value, the controller 1300 may update the emotion expression information and the emotion stimulus source information. The controller 1300 may replace the emotion expression information and the emotion stimulus source information previously stored in the memory 1700 with the updated emotion expression information and emotion stimulus source information. If the change in emotion intensity is greater than a second threshold value that is greater than the first threshold value, the controller 1300 may preserve the emotion expression information and the emotion stimulus source information previously stored in the memory 1700 and newly store the updated emotion expression information and emotion stimulus source information in the memory 1700.
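  • a minimal sketch of the two-threshold update rule just described; the threshold values and record structure are assumptions made for illustration:

```python
def update_stored_emotion(store, current, first_threshold=0.2, second_threshold=0.5):
    """Apply the update rule: with an unchanged emotion type, an intensity
    change over the first threshold replaces the stored record, while a
    change over the second (larger) threshold preserves the old record and
    appends the current one as a new entry."""
    if not store:
        store.append(current)
        return
    last = store[-1]
    if current["type"] != last["type"]:
        return
    delta = abs(current["intensity"] - last["intensity"])
    if delta > second_threshold:
        store.append(current)   # preserve the old record, newly store the update
    elif delta > first_threshold:
        store[-1] = current     # replace the previously stored record
```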
  • the controller 1300 may normalize the emotion intensity.
  • the controller 1300 may obtain emotion management recommendation information based on the emotion expression information and the emotion stimulation source information.
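  • as one possible (hypothetical) data-analysis step behind such a recommendation, the sketch below counts which stimulus sources most often accompany negative emotions and surfaces them; the emotion type labels and record keys are assumptions, not part of this disclosure:

```python
from collections import Counter

NEGATIVE_TYPES = {"sad", "angry", "anxious"}   # assumed emotion type labels

def recommend(records, top_n=3):
    """Return the stimulus sources most frequently associated with negative
    emotions, as simple emotion management recommendation information."""
    counts = Counter(
        rec["stimulus_source"] for rec in records
        if rec["type"] in NEGATIVE_TYPES
    )
    return [
        f"'{source}' was linked to negative emotions {n} time(s); "
        f"consider reducing exposure to it."
        for source, n in counts.most_common(top_n)
    ]
```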
  • FIG. 15 is a block diagram illustrating still another configuration of the electronic apparatus 1000 of FIG. 14.
  • the configuration of the electronic device 1000 illustrated in FIG. 15 may be applied to all the devices for monitoring the aforementioned emotions.
  • the electronic apparatus 1000 may include a user input unit 1100, an output unit 1200, a controller 1300, and a communication unit 1500. However, not all of the components illustrated in FIG. 15 are essential components of the electronic apparatus 1000.
  • the electronic device 1000 may be implemented by more components than those illustrated in FIG. 15, or the electronic device 1000 may be implemented by fewer components than those illustrated in FIG. 15.
  • the electronic apparatus 1000 may further include the sensing unit 1400, the A/V input unit 1600, and the memory 1700 in addition to the user input unit 1100, the output unit 1200, the controller 1300, and the communicator 1500.
  • the controller 1300 may obtain emotion expression information indicating the emotion of the user.
  • the controller 1300 may obtain emotional stimulus source information indicating a cause of the emotion.
  • the controller 1300 may associate the emotion expression information with the emotion stimulation source information, and store the associated emotion expression information and the emotion stimulation source information in the memory 1700.
  • the controller 1300 may obtain emotional stimulation source information based on at least one of multimedia information, environment information, and physiological signal information of the user.
  • the multimedia information may include at least one of text information, image information, audio information, and video information.
  • the controller 1300 may acquire the emotion expression information based on at least one of the multimedia information and the physiological signal information.
  • the controller 1300 may obtain at least one of the multimedia information, the environment information, and the physiological signal information of the user through the user input unit 1100, the output unit 1200, the sensing unit 1400, the communicator 1500, and the A/V input unit 1600.
  • the multimedia information may be received through the user input unit 1100 or the communication unit 1500 or may be output to the output unit 1200.
  • the multimedia information may be obtained through the A / V input unit 1600.
  • the environmental information or the physiological signal information of the user may be obtained by the sensing unit 1400.
  • the user input unit 1100 refers to a means by which a user inputs data for controlling the electronic apparatus 1000.
  • for example, the user input unit 1100 may include a key pad, a dome switch, a touch pad (contact capacitive type, pressure resistive type, infrared sensing type, surface ultrasonic conduction type, integral tension measurement type, piezo effect type, etc.), a jog wheel, a jog switch, and the like, but is not limited thereto.
  • the user input unit 1100 may receive multimedia information input by a user.
  • the controller 1300 may obtain emotion expression information and emotion stimulation source information about the user's emotion based on the multimedia information received through the user input unit 1100.
  • the user input unit 1100 may receive a command for instructing to perform emotion monitoring.
  • the user input unit 1100 may receive a command for instructing an emotion monitoring program to be executed by the user.
  • the electronic apparatus 1000 may perform an operation of monitoring the emotion.
  • the output unit 1200 may output audio information, video information, or a vibration signal, and may include a display unit 1210, a sound output unit 1220, and a vibration motor 1230.
  • the controller 1300 may obtain emotion expression information and emotion stimulation source information about the user's emotion based on the audio information or the video information output by the output unit 1200.
  • the output unit 1200 may output emotion expression information and emotion stimulation source information. In addition, the output unit 1200 may output emotion management recommendation information.
  • the display unit 1210 displays and outputs information processed by the electronic apparatus 1000. Meanwhile, when the display unit 1210 and the touch pad form a layer structure and are configured as a touch screen, the display unit 1210 may be used as an input device in addition to the output device.
  • the display unit 1210 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, a three-dimensional (3D) display, and an electrophoretic display.
  • the electronic apparatus 1000 may include two or more display units 1210 according to the implementation form of the electronic apparatus 1000. In this case, the two or more display units 1210 may be disposed to face each other using a hinge.
  • the controller 1300 may display the emotion expression information and the emotion stimulation source information on the display 1210.
  • the controller 1300 may display the content indicating the emotion of the user over time on the display 1210.
  • the user input unit 1100 may receive a user input for selecting a specific time from the content.
  • the controller 1300 may display the emotion expression information and the emotion stimulus source information related to the selected time on the display 1210.
  • the sound output unit 1220 outputs audio information received from the communication unit 1500 or stored in the memory 1700.
  • the sound output unit 1220 outputs a sound signal related to a function (for example, a call signal reception sound, a message reception sound, and a notification sound) performed by the electronic device 1000.
  • the sound output unit 1220 may include a speaker, a buzzer, and the like.
  • the vibration motor 1230 may output a vibration signal.
  • the vibration motor 1230 may output a vibration signal corresponding to the output of audio information or video information (eg, call signal reception sound, message reception sound, etc.).
  • the vibration motor 1230 may output a vibration signal when a touch is input to the touch screen.
  • the controller 1300 may typically control overall operations of the electronic device 1000.
  • the controller 1300 may control the electronic device 1000 to perform an operation of the electronic device (for example, 100 of FIG. 1) of the aforementioned drawings.
  • the controller 1300 may execute programs stored in the memory 1700 to control the user input unit 1100, the output unit 1200, the sensing unit 1400, the communicator 1500, and the A/V input unit 1600 as a whole.
  • the sensing unit 1400 may detect a state of the electronic device 1000, a state around the electronic device 1000, or a state of a user, and transmit the detected information to the controller 1300.
  • the sensing unit 1400 may include at least one of a geomagnetic sensor 1410, an acceleration sensor 1420, a temperature/humidity sensor 1430, an infrared sensor 1440, a gyroscope sensor 1450, a position sensor (e.g., GPS) 1460, a barometric pressure sensor 1470, a proximity sensor 1480, and an RGB sensor (illuminance sensor) 1490, but is not limited thereto. Since the functions of the respective sensors can be intuitively inferred by those skilled in the art from their names, detailed descriptions thereof are omitted.
  • the sensing unit 1400 may obtain environment information used to acquire emotion expression information or emotion stimulation source information.
  • the environment information includes at least one of time information, location information, weather information, temperature information, and humidity information.
  • the sensing unit 1400 may further include a physiological signal sensor that acquires physiological signal information of the user.
  • the physiological signal sensor can acquire, for example, heart rate, electrocardiogram, skin conductance, body temperature and blood pressure.
  • the communicator 1500 may include one or more components for communicating with a cloud, a server, and / or another electronic device.
  • the second user associated with the user may be a user of another electronic device.
  • the communicator 1500 may include a short range communicator 1510, a mobile communicator 1520, and a broadcast receiver 1530.
  • the short-range wireless communication unit 1510 may include a Bluetooth communication unit, a Bluetooth Low Energy (BLE) communication unit, a near field communication unit, a WLAN (Wi-Fi) communication unit, a Zigbee communication unit, an Infrared Data Association (IrDA) communication unit, a Wi-Fi Direct (WFD) communication unit, an ultra wideband (UWB) communication unit, an Ant+ communication unit, and the like, but is not limited thereto.
  • the mobile communication unit 1520 transmits and receives a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call call signal, or a text / multimedia message.
  • the broadcast receiving unit 1530 receives a broadcast signal and / or broadcast related information from the outside through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel. According to an embodiment of the present disclosure, the electronic device 1000 may not include the broadcast receiving unit 1530.
  • the controller 1300 may control the communicator 1500 to share emotion relationship information, which is at least one of emotion expression information and emotion stimulation source information, with a second user associated with the user.
  • the controller 1300 may transmit emotion relation information to at least one of a cloud or a server that a second user can access through the communication unit 1500, and a second electronic device used by the second user.
  • the controller 1300 may receive the emotion relationship information shared by the second user through the communication unit 1500.
  • the electronic apparatus 1000 may share the emotion relation information with the second user by the communicator 1500.
  • the controller 1300 may obtain at least one of second emotion expression information indicating emotion of a second user associated with the user and second emotion stimulation source information indicating a cause of the emotion of the second user based on the multimedia information. .
  • the controller 1300 may control the communicator 1500 to share at least one of the second emotion expression information and the second emotion stimulation source information with the second user.
  • the controller 1300 may receive, via the communicator 1500, at least one of third emotion expression information representing the emotion of the user and third emotion stimulus source information corresponding to the third emotion expression information, shared by the second user.
  • the controller 1300 may combine the emotion expression information and the emotion stimulus source information with at least one of the third emotion expression information and the third emotion stimulus source information.
  • the A / V input unit 1600 is for inputting an audio signal or a video signal, and may include a camera 1610 and a microphone 1620.
  • the camera 1610 may obtain an image frame such as a still image or a moving image through an image sensor in a video call mode or a photographing mode.
  • the image captured by the image sensor may be processed by the controller 1300 or a separate image processor (not shown).
  • the image frame processed by the camera 1610 may be stored in the memory 1700 or transmitted to the outside through the communication unit 1500. Two or more cameras 1610 may be provided according to the configuration aspect of the terminal.
  • the microphone 1620 receives an external sound signal and processes the external sound signal into electrical voice data.
  • the microphone 1620 may receive an acoustic signal from an external electronic device or a speaker.
  • the microphone 1620 may use various noise removing algorithms for removing noise generated in the process of receiving an external sound signal.
  • the camera 1610 may capture a user and acquire image information.
  • the camera 1610 and the microphone 1620 may capture a user to acquire video information.
  • the microphone 1620 may acquire audio information about the voice of the user.
  • the A / V input unit 1600 may obtain multimedia information that may include at least one of image information, video information, and audio information.
  • the controller 1300 may obtain emotion expression information or emotion stimulus source information based on the multimedia information obtained through the A / V input unit 1600.
  • the memory 1700 may store a program for processing and controlling the controller 1300, and may store data input to or output from the electronic device 1000.
  • the memory 1700 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • Programs stored in the memory 1700 may be classified into a plurality of modules according to their functions.
  • for example, the programs stored in the memory 1700 may be classified into a UI module 1710, a touch screen module 1720, a notification module 1730, and the like.
  • the UI module 1710 may provide a specialized UI, GUI, and the like that interoperate with the electronic device 1000 for each application.
  • the touch screen module 1720 may detect a touch gesture on a user's touch screen and transmit information about the touch gesture to the controller 1300.
  • the touch screen module 1720 according to some embodiments may recognize and analyze a touch code.
  • the touch screen module 1720 may be configured as separate hardware including a controller.
  • Various sensors may be provided inside or near the touch screen to detect a touch or proximity touch of the touch screen.
  • An example of a sensor for sensing a touch of a touch screen is a tactile sensor.
  • the tactile sensor refers to a sensor that senses contact with a specific object to a degree equal to or greater than what a person can feel.
  • the tactile sensor may sense various information such as the roughness of the contact surface, the rigidity of the contact object, the temperature of the contact point, and the like.
  • an example of a sensor for sensing a touch of a touch screen is a proximity sensor.
  • the proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays.
  • Examples of the proximity sensor include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
  • the user's touch gesture may include tap, touch and hold, double tap, drag, pan, flick, drag and drop, and swipe.
  • the notification module 1730 may generate a signal for notifying occurrence of an event of the electronic device 1000. Examples of events occurring in the electronic apparatus 1000 may include call signal reception, message reception, key signal input, and schedule notification.
  • the notification module 1730 may output a notification signal in the form of a video signal through the display unit 1210, in the form of an audio signal through the sound output unit 1220, or in the form of a vibration signal through the vibration motor 1230.
  • the present application further provides a computer readable storage medium, which may be a computer readable storage medium included in the apparatus of the previous embodiments, or may be a separate computer readable storage medium not installed in the terminal.
  • the computer readable storage medium stores one or more programs executed by one or more processors for implementing the method for monitoring the emotions of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An electronic device is disclosed, comprising: a memory; a display unit; and a controller configured to obtain emotion expression information indicating a user's emotion, obtain emotion stimulus source information indicating the cause of the emotion, associate the emotion expression information with the emotion stimulus source information, store the associated emotion expression information and emotion stimulus source information in the memory, and display the emotion expression information and the emotion stimulus source information on the display unit.
PCT/KR2015/008471 2014-11-06 2015-08-13 Dispositif électronique et son procédé de fonctionnement WO2016072595A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/524,419 US20180285641A1 (en) 2014-11-06 2015-08-13 Electronic device and operation method thereof
EP15857415.2A EP3217254A4 (fr) 2014-11-06 2015-08-13 Dispositif électronique et son procédé de fonctionnement

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201410638408.1 2014-11-06
CN201410638408.1A CN105615902A (zh) 2014-11-06 2014-11-06 情绪监控方法和装置
KR1020150113854A KR102327203B1 (ko) 2014-11-06 2015-08-12 전자 장치 및 그 동작 방법
KR10-2015-0113854 2015-08-12

Publications (1)

Publication Number Publication Date
WO2016072595A1 true WO2016072595A1 (fr) 2016-05-12

Family

ID=55909299

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/008471 WO2016072595A1 (fr) 2014-11-06 2015-08-13 Dispositif électronique et son procédé de fonctionnement

Country Status (1)

Country Link
WO (1) WO2016072595A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070162283A1 (en) * 1999-08-31 2007-07-12 Accenture Llp: Detecting emotions using voice signal analysis
US20100003969A1 (en) * 2008-04-07 2010-01-07 Shin-Ichi Isobe Emotion recognition message system, mobile communication terminal therefor and message storage server therefor
US20140016860A1 (en) * 2010-06-07 2014-01-16 Affectiva, Inc. Facial analysis to detect asymmetric expressions
WO2013048225A2 (fr) * 2011-09-29 2013-04-04 Hur Min Procédé pour transmettre des données d'affichage d'émotion et système correspondant
US20140234815A1 (en) * 2013-02-18 2014-08-21 Electronics And Telecommunications Research Institute Apparatus and method for emotion interaction based on biological signals

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3217254A4 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106507280A (zh) * 2016-10-28 2017-03-15 宇龙计算机通信科技(深圳)有限公司 一种情绪监测方法及相关设备

Similar Documents

Publication Publication Date Title
KR102327203B1 (ko) 전자 장치 및 그 동작 방법
WO2020080773A1 (fr) Système et procédé de fourniture de contenu sur la base d'un graphe de connaissances
WO2020032608A1 (fr) Procédé et appareil de rétroaction de dispositif électronique, permettant de confirmer l'intention d'un utilisateur
WO2017043857A1 (fr) Procédé de fourniture d'application, et dispositif électronique associé
WO2018182351A1 (fr) Procédé destiné à fournir des informations et dispositif électronique prenant en charge ledit procédé
WO2016126007A1 (fr) Procédé et dispositif de recherche d'image
WO2020067710A1 (fr) Procédé et système de fourniture d'interface interactive
WO2017082519A1 (fr) Dispositif de terminal utilisateur pour recommander un message de réponse et procédé associé
WO2016122158A1 (fr) Procédé de traitement d'images et dispositif électronique pour le prendre en charge
WO2018080023A1 (fr) Dispositif électronique et son procédé de commande de fonctionnement
WO2016089047A1 (fr) Procédé et dispositif de distribution de contenu
WO2019083275A1 (fr) Appareil électronique de recherche d'image associée et procédé de commande associé
WO2019203488A1 (fr) Dispositif électronique et procédé de commande de dispositif électronique associé
WO2018128403A1 (fr) Dispositif et procédé de traitement de contenu
WO2020159288A1 (fr) Dispositif électronique et son procédé de commande
WO2019125082A1 (fr) Dispositif et procédé de recommandation d'informations de contact
WO2015199288A1 (fr) Terminal du type lunettes, et procédé de commande de ce terminal
WO2020036467A1 (fr) Serveur destiné à fournir un message de réponse sur la base d'une entrée vocale d'un utilisateur et procédé de fonctionnement associé
WO2019235793A1 (fr) Dispositif électronique et procédé permettant de fournir des informations relatives à une application au moyen d'une unité d'entrée
WO2019199030A1 (fr) Système de traitement d'un énoncé d'utilisateur et son procédé de commande
EP3652925A1 (fr) Dispositif et procédé de recommandation d'informations de contact
EP3529774A1 (fr) Dispositif et procédé de traitement de contenu
WO2018048130A1 (fr) Procédé de lecture de contenu et dispositif électronique prenant en charge ce procédé
WO2018124584A1 (fr) Dispositif de sécurité personnelle et son procédé de fonctionnement
WO2019112181A1 (fr) Dispositif électronique pour exécuter une application au moyen d'informations de phonème comprises dans des données audio, et son procédé de fonctionnement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15857415

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15524419

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2015857415

Country of ref document: EP