WO2020175969A1 - Appareil de reconnaissance d'émotion et procédé de reconnaissance d'émotion - Google Patents

Appareil de reconnaissance d'émotion et procédé de reconnaissance d'émotion (Emotion recognition apparatus and emotion recognition method)

Info

Publication number
WO2020175969A1
Authority
WO
WIPO (PCT)
Prior art keywords
emotion
unit
recognition
recognition unit
party
Prior art date
Application number
PCT/KR2020/002928
Other languages
English (en)
Korean (ko)
Inventor
류숙희
이재익
이찬수
Original Assignee
주식회사 하가
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 하가 filed Critical 주식회사 하가
Priority claimed from KR1020200025111A external-priority patent/KR102351008B1/ko
Publication of WO2020175969A1 publication Critical patent/WO2020175969A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features

Definitions

  • the present invention relates to a device and a method for recognizing the emotions of the other party facing the user.
  • Existing devices that assist with vision impairment do not include technology to support communication skills. High-end products cost around 3,500,000 won, and the actual supply rate is less than 5%.
  • Companies such as Visionics, Viisage, and Miros are leading the market. Active research is in progress not only on the core technologies of face recognition, such as face detection, lighting-robust recognition, pose recognition, and expression recognition, but also on aging recognition, large-scale DB construction, and 3D facial shape restoration.
  • the present invention is intended to provide a recognition device and recognition method capable of accurately recognizing the emotion of the other party facing the user.
  • the recognition device of the present invention includes a photographing unit for photographing the other party facing the user; an emotion recognition unit for recognizing the external emotion of the other party through analysis of the image of the other party photographed by the photographing unit; and a display unit for expressing the external emotion by voice or vibration.
  • the emotion recognition unit analyzes the image of the other party to recognize at least one of the other party's expression and gesture.
  • the emotion recognition unit may recognize the external emotion by using the expression or the gesture.
  • the recognition method of the present invention includes the steps of: photographing the surrounding area; storing the photographed image; analyzing the photographed image and extracting the other party conversing with the user; when the other party is extracted, recognizing and correcting the external emotion of the other party; and displaying the corrected external emotion.
  • the recognition device and recognition method of the present invention analyze the other party photographed by the camera.
  • External emotions such as facial expressions and gestures may show a different intensity, or a different type, from the real emotion depending on the surrounding environment.
  • the recognition device and recognition method of the present invention recognize the external emotion of the other party and can correct it by using surrounding objects or text, so that,
  • taking the surrounding environment into consideration, the emotion of the other party can be analyzed and recognized.
  • the present invention can accurately recognize the emotion of the other party.
  • the emotion of the other party recognized through complex emotion recognition technology can be provided to the user.
  • a user with low vision can accurately recognize the other party's emotions, and can thus communicate smoothly with the other party.
  • This invention uses complex emotion recognition technology to improve the communication skills of people with low vision.
  • the present invention can aid vision by allowing people with low vision to identify characters, objects, and expressions that are difficult for them to distinguish.
  • the present invention enables users with low vision to understand the surrounding situation, be aware of risks, and go about everyday activities such as shopping. This invention can guide users to communicate their emotions normally with the other party.
  • Fig. 1 is a schematic diagram showing the recognition device of the present invention.
  • FIG. 2 is a schematic diagram showing an image photographed by a photographing unit.
  • Fig. 3 is a flow chart showing the recognition method of the present invention.
  • FIG. 4 is a diagram showing a computing device according to an embodiment of the present invention.
  • the term 'and/or' includes a combination of a plurality of listed items or any one of a plurality of listed items.
  • Emotion recognition technology aims to improve the quality of human life by measuring and analyzing human emotions and applying them to product development or environmental design. It belongs to the field of technology that scientifically measures and analyzes complex emotions, such as the comfort or discomfort produced by external physical stimulation as experienced by individuals, and applies them in an engineering manner to change a product or environment.
  • when emotion recognition technology accurately recognizes the emotions of users, it can provide emotion-based services in the entertainment, education, and medical fields, respond immediately to users during service use, and improve the quality of service by checking users' responses and providing feedback.
  • the conventional emotion recognition technology as described above only improves the accuracy of the measured value itself for the biological response, and does not reflect individual differences in emotion expression or the surrounding environment. That is, in conventional emotion recognition technology, a standardized emotion recognition rule base that can be applied statistically to all people has been established, and this standardized rule base has been applied uniformly to all individuals.
  • the recognition device and recognition method of the present invention can be used to accurately grasp the emotion of the other party faced by the user.
  • Fig. 1 is a schematic diagram showing the recognition device of the present invention.
  • Fig. 2 is a schematic diagram showing an image captured by the photographing unit 110.
  • the recognition device shown in FIG. 1 includes a photographing unit 110, an emotion recognition unit 130, and a display unit 150.
  • the photographing unit 110 may photograph the other party conversing with the user. For example,
  • the photographing unit 110 can photograph the other party's face, arms, legs, and body.
  • the other party's face photographed by the photographing unit 110 can be used to analyze and recognize the other party's expression.
  • the other party's arms, legs, and body photographed by the photographing unit 110 can be used to analyze and recognize the other party's gestures.
  • the photographing unit 110 can include various types of cameras that generate video images.
  • the emotion recognition unit 130 may recognize the external emotion of the other party through analysis of the image of the other party photographed by the photographing unit 110.
  • the other party's image may refer to an image that includes the other party among the images captured by the photographing unit 110.
  • the emotion recognition unit 130 may extract the other party included in the other party image.
  • the emotion recognition unit 130 can apply various expression recognition technologies and various gesture recognition technologies to the extracted information of the other party in the image, and can grasp or recognize at least one of the other party's expressions and gestures.
  • the emotion recognition unit 130 recognizes the emotion of the other party from the identified expression or gesture using an emotion recognition algorithm.
  • the emotion recognition algorithm can be generated based on deep learning. Using the external appearance of the other party, such as expressions or gestures, the external emotion corresponding to the other party's outward expression of emotion can be recognized.
  • Such appearances may be difficult for low-vision users to recognize.
  • the external emotion recognized by the emotion recognition unit 130 may be provided to the user by the display unit 150.
  • the display unit 150 can display the external emotion of the other party by voice or vibration.
  • the user can sense the voice or vibration of the display unit 150 by holding or wearing the display unit 150.
  • the photographing unit 110 and the emotion recognition unit 130 may also be formed to be possessed or worn by the user.
  • the display unit 150 may include an audio output means, such as a speaker, that announces the other party's emotion by voice, for example 'the other party is in a joyful state'.
  • the display unit 150 can include vibration means for generating a vibration signal according to a routine preset with the user.
  • the vibration means can indicate, for example, that the emotion of the other party is in a joyful state through vibrations tailored to the preset routine.
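To make the vibration display concrete, here is a minimal sketch of how preset vibration routines might encode emotion type and intensity. The emotion names, pulse patterns, and the `vibrate` callback are illustrative assumptions, not specified by the patent.

```python
import time
from typing import Callable

# Hypothetical preset routines: each emotion maps to a pulse pattern
# (pulse durations in seconds); intensity scales the number of repeats.
VIBRATION_ROUTINES = {
    "joy":      [0.1, 0.1, 0.1],   # three short pulses
    "sadness":  [0.5],             # one long pulse
    "surprise": [0.1, 0.3],
    "anger":    [0.3, 0.3],
    "fear":     [0.1, 0.1, 0.5],
}

def display_emotion(emotion: str, intensity: int,
                    vibrate: Callable[[float], None]) -> None:
    """Play the preset routine for `emotion`, repeated `intensity` times."""
    pattern = VIBRATION_ROUTINES.get(emotion, [0.2])
    for _ in range(max(1, intensity)):
        for pulse in pattern:
            vibrate(pulse)       # drive the vibration motor for `pulse` seconds
            time.sleep(0.1)      # gap between pulses
        time.sleep(0.4)          # gap between repeats
```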
  • the user can receive the external emotion through the display unit 150 while hearing the voice of the other party directly.
  • the user can accurately grasp the other party's feelings through the detection of the external emotion.
  • the user can perform emotional learning by matching the external emotion provided by the display unit 150 to the voice of the other party being heard. Through this, the user can learn which emotion corresponds to which voice of the other party.
  • the external emotion of the other party expressed through expressions or gestures may vary depending on the surrounding environment. For example, in a library where quietness is required, the expressions and gestures with which the other party expresses a given emotion may differ from those expressed in a noisy market. For this reason, it may be difficult to accurately recognize the other party's emotions from the external emotion alone.
  • the recognition device of the present invention may be provided with a collection unit 120 for collecting the voice of the other party.
  • the emotion recognition unit 130 can analyze the voice by using a voice emotion recognition model such as deep learning or a voice characteristic analysis algorithm, and recognize the sound emotion of the other party.
  • the sound emotion may include an emotion that can be recognized or perceived through the voice.
  • the emotion recognition unit 130 may correct the external emotion by using the recognized sound emotion.
  • the display unit 150 may display the corrected appearance emotion.
  • the display unit 150 may display the external emotion corrected using the sound emotion.
  • Correction of the external emotion can mean correction of the intensity of the same kind of emotion.
  • a plurality of pieces of display information may be provided on the display unit 150 by dividing the intensity of emotion for the same type of external emotion. For example, even for the emotion of joy, it is advantageous that the intensity is included, such as 'a little joy', 'normal joy', and 'very joyful'.
  • the emotion recognition unit 130 can recognize the type of external emotion through the analysis of the image of the other party.
  • the emotion recognition unit 130 can determine the strength or weakness of the recognized type of external emotion according to the strength of the voice.
  • the display unit 150 can display specific display information indicating the type of external emotion and the result of the intensity determination.
  • the emotion recognition unit 130 may correct the external emotion perceived through analysis of the image of the other party if the external emotion and the sound emotion differ from each other and the intensity of the sound emotion satisfies a set value. For example, while the external emotion is perceived as 'joy', the sound emotion may be recognized as 'sadness'. In this case, if the intensity of the sound emotion, for example the strength, decibels, or volume of the voice, is above the set value, the emotion recognition unit 130 can modify the external emotion from 'joy' to 'sadness'.
  • the display unit 150 can express the external emotion modified by the emotion recognition unit 130.
  • the emotion recognition unit 130 may provide the determined external emotion to the display unit 150 as it is when the external emotion and the sound emotion are the same, or when the intensity of the sound emotion does not satisfy the set value. For example, if the intensity of the sound emotion is less than the set value, the emotion recognition unit 130 may ignore the sound emotion and provide the already recognized external emotion 'joy' to the display unit 150 as it is.
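The correction rule described in the two preceding paragraphs can be summarized in a short sketch. The `Emotion` structure, the numeric `SET_VALUE`, and the intensity scale are illustrative assumptions; only the decision logic (override on disagreement plus sufficient voice intensity, otherwise keep the image-based result) comes from the text.

```python
from dataclasses import dataclass

@dataclass
class Emotion:
    kind: str       # e.g. "joy", "sadness"
    intensity: int  # e.g. a decibel-derived level for the sound emotion

SET_VALUE = 3  # illustrative threshold for sound-emotion intensity

def correct_with_sound(external: Emotion, sound: Emotion) -> Emotion:
    """Override the image-based external emotion with the voice-based one
    only when they disagree AND the voice emotion is strong enough."""
    if sound.kind != external.kind and sound.intensity >= SET_VALUE:
        return Emotion(kind=sound.kind, intensity=sound.intensity)
    # Same kind, or the sound emotion is too weak: keep the external emotion.
    return external

# Example from the text: the image says 'joy', a loud voice says 'sadness'.
print(correct_with_sound(Emotion("joy", 2), Emotion("sadness", 4)).kind)  # sadness
print(correct_with_sound(Emotion("joy", 2), Emotion("sadness", 1)).kind)  # joy
```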
  • An object recognition unit 170 can be provided to guide users with low vision.
  • the photographing unit 110 can photograph surrounding objects.
  • the object recognition unit 170 can recognize the surrounding objects photographed by the photographing unit 110.
  • the display unit 150 can display object information corresponding to the recognition result of the object by voice or vibration. Even if the user does not visually recognize a surrounding object, with the help of the recognition device of the present invention, surrounding objects can be recognized audibly or tactilely.
  • the emotion recognition unit 130 may identify persons other than the other party among the objects.
  • the emotion recognition unit 130 analyzes the number of persons corresponding to the objects, and if the number of persons satisfies a set number, it can correct the intensity of the external emotion upward.
  • the other party may refrain from expressing external emotions when other people are around. For example, even in a very happy situation, the other party may express joy calmly in consideration of the people around them.
  • the emotion recognition unit 130 can determine the number of surrounding persons other than the other party. If the identified number of persons satisfies the set number, the intensity of the currently recognized external emotion can be corrected upward. For example, through analysis of the other party's facial expression, the other party's emotion may be recognized as 'normal joy'. If there are two surrounding persons, corresponding to the set number, the emotion
  • recognition unit 130 may raise the intensity of the emotion by a set step. By the correction, the external emotion of the other party is corrected to 'very joyful' and may be displayed through the display unit 150.
  • surrounding persons can react to the other party's emotional expression.
  • a person who stops near the other party may be an acquaintance of the other party.
  • the acquaintance of the other party can react to the other party's emotional expression.
  • the other party, being aware of this fact, may express emotions calmly compared to a general situation.
  • the other party's externally expressed emotion may thus be in a state where the intensity of the original emotion is weakened. Therefore, the other party's actual feelings can be stronger than those expressed externally.
  • when it is determined that a surrounding person has stopped moving near the other party,
  • the emotion recognition unit 130 can correct the intensity of the external emotion upward.
  • the emotion recognition unit 130 can provide the display unit with the external emotion whose intensity has been corrected upward instead of the original external emotion.
  • a character recognition unit 180 may be provided.
  • the emotion recognition unit 130 may analyze the contents of the characters included in the image photographed by the photographing unit 110.
  • the emotion recognition unit 130 may correct the external emotion according to the analysis result of the contents defined by the characters.
  • the display unit 150 can display the external emotion corrected according to the analysis result of the contents.
  • the recognition device of the present invention analyzes characters, and through this analysis it can correct the external emotion to an intensity higher than the intensity of the emotion expressed by the other party.
  • the emotion recognition unit 130 can analyze various characters included in the image and determine the surrounding environment the other party is facing.
  • the environment the other party is facing may include content information that affects the other party's emotions, such as facility information corresponding to the other party's current location, and text on books, letters, text messages, menu boards, product labels, and screens.
  • even if the external emotion of the other party is recognized as 'a little joy', the emotion recognition unit 130 can correct the intensity upward and modify the external emotion to 'very joyful'.
  • the display unit 150 can then display the external emotion of the other party as 'very joyful'. The user can accurately grasp the actual feeling of the other party and respond accordingly.
  • In the recognition apparatus, an object recognition unit 170, a character recognition unit 180, and a selection unit 190 may be provided.
  • the object recognition unit 170 may recognize surrounding objects photographed by the photographing unit 110.
  • the character recognition unit 180 can recognize the characters photographed by the photographing unit 110.
  • the selection unit 190 can select which of the external emotion information, the recognition results of objects, and the recognition results of characters is to be displayed through the display unit 150, according to the user's choice.
  • the user can use the object recognition unit 170 for the purpose of recognizing surrounding objects, or the character recognition unit 180 for the purpose of recognizing surrounding characters.
  • The display unit 150 may display the item selected by the selection unit 190.
  • while the emotion recognition unit 130 recognizes the external emotion,
  • at least one of the object recognition unit 170 and the character recognition unit 180 can recognize objects or characters while operating together with the emotion recognition unit 130.
  • the external emotion can then be corrected using the resulting object information or content information.
  • the storage unit 140 may be provided as a means for obtaining the surrounding environment in advance, so that the user can maintain an attitude of focusing only on the other party.
  • a photographed image captured by the photographing unit 110 may be stored for a set time.
  • the object recognition unit 170 or the character recognition unit 180 may be provided with the photographed image stored during the set time before the current point.
  • the object recognition unit 170 may recognize surrounding objects included in the photographed image.
  • the character recognition unit 180 may recognize characters included in the photographed image.
  • the emotion recognition unit 130 may receive object information corresponding to the object recognition result from the object recognition unit 170, or may receive content information corresponding to the character recognition result from the character recognition unit 180.
  • the emotion recognition unit 130 may correct the recognized external emotion using the object information or content information.
  • the display unit 150 may display the appearance emotion corrected by the emotion recognition unit 130.
  • an analysis can be performed on the captured image photographed before the user faced the other party and stored in the storage unit 140.
  • when a surrounding object or character is recognized in it, the emotion recognition unit 130 may correct the external emotion of the other party being faced by using the corresponding surrounding object or character.
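One way to realize such a storage unit is a time-windowed frame buffer: frames older than a set time are discarded, and the object and character recognition units read the retained frames once the conversation begins. This is a sketch under assumed data structures; the patent does not prescribe an implementation.

```python
import time
from collections import deque

class FrameStore:
    """Keep captured frames for a set time window so that surrounding
    objects/characters seen *before* the conversation started can be
    analyzed later; the window length is an illustrative assumption."""

    def __init__(self, window_seconds: float = 10.0):
        self.window = window_seconds
        self.frames: deque = deque()  # (timestamp, frame) pairs

    def add(self, frame) -> None:
        now = time.time()
        self.frames.append((now, frame))
        # Drop frames older than the window.
        while self.frames and now - self.frames[0][0] > self.window:
            self.frames.popleft()

    def recent(self):
        """Frames captured within the set time before the current point,
        e.g. as input for the object/character recognition units."""
        return [frame for _, frame in self.frames]
```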
  • the recognition device of the present invention may be manufactured in the form of glasses or a wearable device.
  • the recognition device of the present invention may be worn by the user.
  • the recognition device of the present invention may be manufactured in the form of a mobile device to follow the user or be worn by the user.
  • the photographing unit 110, mounted on the glasses or the like, may be formed to photograph the area in front of the user.
  • according to the present embodiment, even when a user who begins to face the other party keeps looking only at the other party, the external emotion of the other party can be accurately corrected by using the pre-stored surrounding environment. As a result, the actual emotion of the other party can be followed. A user who accurately knows the other party's actual feelings can respond correctly to them.
  • Fig. 3 is a flow chart showing the recognition method of the present invention.
  • the recognition method of FIG. 3 can be performed by the recognition device of FIG. 1.
  • the photographing unit 110 can photograph the surrounding area desired by the user (510).
  • The storage unit 140 may store the photographed image (520).
  • the emotion recognition unit 130 may analyze the photographed image and extract the other party who communicates with the user (530).
  • the emotion recognition unit 130 may recognize and correct the external emotion of the other party (540).
  • the display unit 150 may display the external emotion corrected by the emotion recognition unit 130 (550).
  • the step (540) of recognizing and correcting the external emotion can be subdivided as follows.
  • the selection unit 190 may select any one of emotion recognition, object recognition, and character recognition according to the user's selection (541).
  • the object recognition unit 170 can recognize surrounding objects photographed by the photographing unit 110 (543). At this time, the recognition of the surrounding objects can be performed in real time. The display unit 150 may display the information of the surrounding object recognized by the object recognition unit 170 by voice or vibration.
  • the character recognition unit 180 can recognize the characters photographed by the photographing unit 110 (544). At this time, the character recognition can be performed in real time.
  • the display unit 150 may display content information of a character recognized by the character recognition unit 180 by voice or vibration.
  • the emotion recognition unit 130 may recognize a nearby object or recognize text through analysis of the photographed image previously stored in the storage unit 140.
  • unlike the real-time recognition described above, the recognition of objects or characters performed during emotion recognition has the characteristic that it targets the photographed image previously stored in the storage unit 140.
  • the emotion recognition unit 130 can correct the external emotion by using the object information corresponding to the object recognition result or the content information corresponding to the character recognition result.
  • the emotion recognition unit 130 may determine the level of quietness, indicating how quiet an environment is required, through the analysis of object information or content information, and correct the intensity of the external emotion in proportion to the level of quietness.
  • the quietness of the location can be determined through the text included in the photographed image.
  • a quietness level per number of surrounding people included in a photographed image may also be set. If the number of neighboring people is 1, quietness level 1 can be set; if the number is 2, quietness level 2, and so on.
  • for example, if the other party's current emotion is joy at level 2 and the quietness level is 1, the emotion recognition unit 130 can correct the external emotion to joy at level 3, raising the intensity of the current emotion by one level. If the quietness level is 2, the emotion recognition unit 130 can correct the external emotion to joy at level 4, raising the intensity of the current emotion by two levels.
  • if the current location is a library, a quietness level of 5 can be given; if the current location is a market, a quietness level of 1 can be given. If the quietness level is 5, the emotion recognition unit 130 can correct the external emotion to joy at level 7, raising the current intensity by five levels. If the other party's current emotion is joy at level 2 and the quietness level is 1, the emotion recognition unit 130 can correct the external emotion to joy at level 3, raising the intensity by one level.
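The quietness-based correction sketched below combines the two rules described above: a quietness level derived from recognized text (library, market, etc.) or, failing that, from the number of surrounding people, added to the current emotion intensity. The keyword table and levels are illustrative assumptions drawn from the examples.

```python
# Illustrative mapping from recognized signage text to a quietness level;
# the keywords and level values are assumptions based on the examples above.
QUIETNESS_BY_KEYWORD = {"library": 5, "hospital": 4, "office": 3, "market": 1}

def quietness_level(recognized_texts: list[str], num_people: int) -> int:
    """Derive quietness from recognized text, falling back to the
    per-person rule (1 person -> level 1, 2 people -> level 2, ...)."""
    for text in recognized_texts:
        for keyword, level in QUIETNESS_BY_KEYWORD.items():
            if keyword in text.lower():
                return level
    return num_people

def correct_for_quietness(kind: str, intensity: int,
                          quietness: int) -> tuple[str, int]:
    """Raise the emotion intensity in proportion to the quietness level."""
    return kind, intensity + quietness

# Examples from the text: joy level 2 in a library (quietness 5) -> level 7;
# joy level 2 in a market (quietness 1) -> level 3.
print(correct_for_quietness("joy", 2, quietness_level(["Central Library"], 0)))  # ('joy', 7)
print(correct_for_quietness("joy", 2, quietness_level(["Fish Market"], 0)))      # ('joy', 3)
```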
  • the present invention can perform recognition of five basic emotions (e.g., joy, surprise, sadness, anger, fear).
  • the present invention can perform recognition of 17 complex emotions, including intensity levels, according to a three-dimensional concept of positivity-negativity, arousal-relaxation, and dominance-submission.
  • the present invention has a function of presenting a communication language. Specifically, the present invention can recognize and express up to 75 emotions (e.g., energetic, cool, disappointed, fearful, very surprised, despairing, embarrassed, anxious, annoyed, angry, irritated, wanting to get out, fun, and shy).
  • the present invention uses deep learning artificial intelligence techniques.
  • the present invention can recognize, by using a high-dimensional combination method over complex expressions, a variety of complex emotions that were previously difficult to recognize.
  • the present invention can raise the recognition rate, which decreases significantly when the number of objects increases, by first performing environmental awareness and reducing the candidate objects to those that can be found within the identified environment.
  • the present invention can input the motion flow of a sequence of images into a dictionary learning method, and design a recognition technology through pre-learning of latent motions for each expression.
  • the present invention can use a variety of conventional methods for extracting expression features, such as:
  • PCA (Principal Component Analysis)
  • ICA (Independent Component Analysis)
  • ASM (Active Shape Model)
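As a reference point, the first two of these feature extraction methods are available off the shelf, for example in scikit-learn. A brief sketch on stand-in data follows (an assumption for illustration; real input would be aligned face images from an expression DB):

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Toy stand-in for a set of aligned face images, flattened to vectors
# (200 samples of 32x32 grayscale); real data would come from a face DB.
rng = np.random.default_rng(0)
faces = rng.random((200, 32 * 32))

# PCA ("eigenfaces"): keep the components explaining the most variance.
pca = PCA(n_components=50).fit(faces)
pca_features = pca.transform(faces)          # shape (200, 50)

# ICA: statistically independent components of the same data.
ica = FastICA(n_components=50, random_state=0, max_iter=500).fit(faces)
ica_features = ica.transform(faces)          # shape (200, 50)

print(pca_features.shape, ica_features.shape)
```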
  • the present invention can use an object recognition algorithm. Specifically, an object recognition and voice guidance algorithm that can detect about 100 objects in indoor and outdoor environments can be used.
  • the present invention can use a character recognition algorithm. Specifically, an algorithm capable of recognizing characters in 102 languages including Korean, Chinese, Japanese, and English can be used.
  • the present invention can use a basic emotion real-time recognition algorithm. Specifically, an algorithm capable of real-time recognition of five basic emotions (e.g., joy, surprise, sadness, anger, and fear) can be used.
  • the present invention can use an algorithm for recognizing 17 complex expressions and emotions.
  • AU (action unit) detection and an LSTM-based complex emotion recognition function can be used.
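A minimal sketch of the AU-plus-LSTM idea: per-frame action-unit activations form a sequence that an LSTM maps to one of the complex emotion classes. The sequence length, layer sizes, and the use of Keras are assumptions for illustration; the patent names only AU detection and LSTM-based recognition.

```python
import numpy as np
import tensorflow as tf

SEQ_LEN, NUM_AUS, NUM_EMOTIONS = 30, 17, 17  # illustrative sizes

# Input: per-frame action-unit (AU) activations over a 30-frame window;
# output: one of 17 complex emotion classes, as described above.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN, NUM_AUS)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(NUM_EMOTIONS, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Dummy batch standing in for AU sequences from a face tracker.
au_sequences = np.random.rand(8, SEQ_LEN, NUM_AUS).astype("float32")
probs = model.predict(au_sequences)
print(probs.shape)  # (8, 17): per-sequence complex-emotion probabilities
```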
  • the present invention can perform data collection and database construction, data learning and model creation, and object recognition using YOLO.
  • the present invention can perform data collection and database construction, data learning and model generation, and character recognition using tesseract-ocr.
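The patent names YOLO and tesseract-ocr but no particular bindings; the sketch below uses the `ultralytics` package for YOLO and `pytesseract` for tesseract as one plausible pairing. Both library choices, the weights file, and the image path are assumptions.

```python
import pytesseract            # Python wrapper around the tesseract-ocr engine
from PIL import Image
from ultralytics import YOLO  # one common YOLO implementation (assumed; the
                              # patent names YOLO but not a specific library)

image = Image.open("scene.jpg")  # hypothetical captured frame

# Object recognition: detect surrounding objects in the frame.
detector = YOLO("yolov8n.pt")    # pretrained weights file (assumption)
for result in detector(image):
    for box in result.boxes:
        print(result.names[int(box.cls)], float(box.conf))

# Character recognition: tesseract supports many languages; Korean+English here.
text = pytesseract.image_to_string(image, lang="kor+eng")
print(text)
```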
  • the present invention can implement an integrated system of object recognition, text recognition, emotion recognition, and communication language speech.
  • the present invention can use a method of improving the recognition rate of complex expressions through feature extraction and AU detection based on landmark points, LSTM-based expression recognition, recognition of 23 expressions based on AU detection and LSTM features, and optimization of expression DB-based expression recognition.
  • the present invention can use emotional characteristic modeling based on complex expression recognition.
  • the present invention can provide numerical indicators for emotional evaluation based on recognition of 23 expressions.
  • the emotion recognition device or method of the present invention can measure emotion as follows.
  • the emotion recognition unit 130 may judge, recognize, or classify the emotion of the other party by using the expression extracted from the image of the other party.
  • the expression may be the primary basis for judging the emotion. However, an expression commonly disappears within a few seconds of being expressed. Also, in the case of Asians, who tend not to make big expressions, it can be difficult to apply the expression to emotion judgment.
  • 75 emotions can be measured or classified.
  • emotions can be evaluated by taking into account the movement of the facial muscles, which may vary from person to person depending on muscle mass, habits, and so on.
  • the emotion recognition unit 130 may judge, recognize, or classify the emotion of the other party by using the eyes extracted from the image of the other party.
  • the eyes can be defined as a combination of the size of the eyes, the movement of the eyes, the direction of the gaze, the movement speed of the eyes, the movement of the muscles around the eyes and the eyelids, the degree of opening and closing of the eyelids, the gaze time, the gap between the brows, and the wrinkles.
  • through the eyes, higher-order complex emotions arising from mood, hobbies, emotional atmosphere, and the development of emotion can be grasped; therefore, the eyes can be used to measure emotions such as emotional stability and instability, passion and indifference, shyness, guilt, disgust, curiosity, and empathy.
  • the emotion recognition unit 130 may judge, recognize, or classify the emotion of the other party by using the movement of the neck extracted from the image of the other party.
  • Neck movement can include the direction of the head, the angle between the head and the neck, the angle between the head and the back, the speed of the movement of the neck, the movement of the shoulders, and the degree and angle of shoulder bending. Using neck movement, feelings such as depression and lethargy can be accurately measured.
  • the emotion recognition unit 130 can use the posture, gestures, and behavior extracted from the image of the other party to judge, recognize, or classify the emotions of the other party.
  • the emotion recognition unit 130 can comprehensively judge the movements of the neck, arms, back, and legs, and measure physical openness, aggression, physical pain, excitement (elevation), and favor.
  • the emotion recognition unit 130 may judge, recognize, or classify the emotion of the other party by using the voice of the other party.
  • the pitch of the voice, the length of the voice, the tremor of the voice, the intonation of the voice, the loudness of the voice, the length of pauses between phrases, and the intensity of the voice can be measured and classified to classify the other party's emotions.
  • the emotion recognition unit 130 may judge the other party's emotions by using the vocabulary analyzed through the voice of the other party.
  • the emotion recognition unit 130 can judge, recognize, or classify the other party's emotions by using specific vocabulary related to emotion among the vocabulary expressed in the voice of the other party.
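A sketch of how the voice cues named above (pitch, length, loudness) and emotion-related vocabulary might be measured, here using `librosa` for the audio side. The keyword lists and the assumption that a transcript is available separately (e.g. from a speech recognizer) are illustrative.

```python
import librosa
import numpy as np

# Illustrative emotion keywords; the actual vocabulary list is not given.
EMOTION_WORDS = {"joy": {"great", "glad", "happy"}, "sadness": {"sorry", "sad"}}

def voice_features(path: str) -> dict:
    """Measure simple voice cues: pitch, loudness, and duration."""
    y, sr = librosa.load(path, sr=None)
    f0, _, _ = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)  # pitch track
    rms = librosa.feature.rms(y=y)[0]                          # loudness
    return {
        "mean_pitch_hz": float(np.nanmean(f0)),
        "mean_loudness": float(rms.mean()),
        "duration_s": len(y) / sr,
    }

def vocabulary_emotions(transcript: str) -> set:
    """Flag emotion-related words in a (separately obtained) transcript."""
    words = set(transcript.lower().split())
    return {emo for emo, vocab in EMOTION_WORDS.items() if words & vocab}
```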
  • a plurality of elements capable of grasping the emotions of the other party may be input through various input means such as a photographing unit and a collection unit.
  • the emotion recognition unit 130 may receive elements (emotion analysis information) obtained through multiple channels and process them as follows.
  • the elements can include expressions, eyes, movements of the neck, gestures, voice, vocabulary, etc.
  • the emotion recognition unit can grasp the other party's gestures and expressions by analyzing the other party's image.
  • the emotion recognition unit can grasp at least one of the voice of the other party and the vocabulary spoken by the other party through the collection unit.
  • The emotion recognition unit 130 receives the data obtained first in time among the plurality of elements.
  • the emotion recognition unit 130 may select the specific elements used to recognize the emotion of the other party according to whether each element satisfies a set intensity or whether each element lasts for a set time.
  • the emotion recognition unit 130 can analyze the intensity of each element. For example, assume the expression is at intensity level 1, the eyes at level 3, the neck movement at level 2, the gesture at level 5, the voice at level 2, and the vocabulary at level 0. When the set intensity is level 3, the emotion recognition unit 130 can recognize the emotion of the other party by using only the eyes and the gesture, whose intensity is 3 or higher.
  • the emotion recognition unit 130 can analyze the duration of each element. For example, assume the expression lasts 3 seconds, the eyes 4 seconds, the neck movement 1 second, the gesture 2 seconds, the voice 3 seconds, and the vocabulary 2 seconds. When the set time is 3 seconds, the emotion recognition unit 130 can recognize, determine, or classify the other party's emotions using only the expression, eyes, and voice, whose duration is 3 seconds or more.
  • the emotion recognition unit 130 may give priority for classifying emotions in the order of gesture, voice, expression, vocabulary, and neck movement. Alternatively, the emotion recognition unit 130 may give priority in the order of gesture, voice, expression, vocabulary, and eyes. According to this embodiment, when various complex emotion elements are identified, a priority for accurately grasping the emotion of the other party can be assigned to each complex emotion element.
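Putting the last three paragraphs together: elements are kept if they pass either the intensity or the duration threshold, and the survivors are ordered by the preset priority. The thresholds, the dataclass, and the choice of the second priority order (ending with eyes) are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Element:
    name: str        # "expression", "eyes", "neck", "gesture", "voice", "vocabulary"
    intensity: int   # analyzed strength level
    duration: float  # seconds the cue lasted

SET_INTENSITY, SET_TIME = 3, 3.0  # thresholds from the examples above
PRIORITY = ["gesture", "voice", "expression", "vocabulary", "eyes"]

def select_elements(elements: list[Element]) -> list[Element]:
    """Keep elements that are strong enough or last long enough, then order
    the survivors by the preset priority for emotion classification."""
    kept = [e for e in elements
            if e.intensity >= SET_INTENSITY or e.duration >= SET_TIME]
    rank = {name: i for i, name in enumerate(PRIORITY)}
    return sorted(kept, key=lambda e: rank.get(e.name, len(PRIORITY)))

# Example mirroring the text: eyes and gesture pass the intensity bar;
# expression, eyes, and voice pass the duration bar.
elements = [
    Element("expression", 1, 3.0), Element("eyes", 3, 4.0),
    Element("neck", 2, 1.0), Element("gesture", 5, 2.0),
    Element("voice", 2, 3.0), Element("vocabulary", 0, 2.0),
]
print([e.name for e in select_elements(elements)])
# ['gesture', 'voice', 'expression', 'eyes']
```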
  • the computing device TN100 of FIG. 4 may be a device (e.g., a recognition device) described in this specification.
  • the computing device TN100 may include at least one processor (TN110), a transmission/reception device (TN120), and a memory (TN130).
  • the computing device (TN100) may further include a storage device (TN140), an input interface device (TN150), and an output interface device (TN160).
  • The components included in the computing device (TN100) can be connected by a bus (TN170) and communicate with each other.
  • the processor (TN110) can execute a program command stored in at least one of the memory (TN130) and the storage device (TN140).
  • the processor (TN110) may mean a central processing unit (CPU), a graphics processing unit (GPU), or a dedicated processor on which the methods according to the embodiments of the present invention are performed.
  • the processor (TN110) can be configured to implement the procedures, functions, and methods described in connection with the embodiments of the present invention.
  • the processor (TN110) can control each component of the computing device (TN100).
  • Each of the memory (TN130) and the storage device (TN140) can store various information related to the operation of the processor (TN110).
  • the memory (TN130) and the storage device (TN140) can each be composed of at least one of a volatile storage medium and a nonvolatile storage medium.
  • the memory (TN130) may consist of at least one of read only memory (ROM) and random access memory (RAM).
  • the transmitting and receiving device (TN120) can transmit or receive a wired signal or a wireless signal.
  • the transmitting and receiving device (TN120) is connected to the network and can perform communication.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention concerns a recognition apparatus. The recognition apparatus may comprise: a photographing unit for capturing an image of an interlocutor facing a user; an emotion recognition unit for recognizing an external emotion of the interlocutor by analyzing the interlocutor image captured by the photographing unit; and a display unit for displaying the external emotion by voice or vibration.
PCT/KR2020/002928 2019-02-28 2020-02-28 Appareil de reconnaissance d'émotion et procédé de reconnaissance d'émotion WO2020175969A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2019-0023632 2019-02-28
KR20190023632 2019-02-28
KR1020200025111A KR102351008B1 (ko) 2019-02-28 2020-02-28 감정 인식 장치 및 감정 인식 방법
KR10-2020-0025111 2020-02-28

Publications (1)

Publication Number Publication Date
WO2020175969A1 true WO2020175969A1 (fr) 2020-09-03

Family

ID=72239771

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/002928 WO2020175969A1 (fr) 2019-02-28 2020-02-28 Appareil de reconnaissance d'émotion et procédé de reconnaissance d'émotion

Country Status (1)

Country Link
WO (1) WO2020175969A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113808623A (zh) * 2021-09-18 2021-12-17 武汉轻工大学 一种可以供盲人使用的情绪识别眼镜
CN117496333A (zh) * 2023-12-29 2024-02-02 深医信息技术(深圳)有限公司 一种医疗设备数据采集方法及医疗设备数据采集系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010134937A (ja) * 2008-12-08 2010-06-17 Korea Electronics Telecommun 状況認知装置およびこれを用いた状況認知方法
KR20100100380A (ko) * 2009-03-06 2010-09-15 중앙대학교 산학협력단 상황 정보와 감정 인식을 사용한 유비쿼터스 환경의 최적 서비스 추론 방법 및 시스템
KR20130009123A (ko) * 2011-07-14 2013-01-23 삼성전자주식회사 사용자의 감정 인식 장치 및 방법
KR20150081824A (ko) * 2014-01-07 2015-07-15 한국전자통신연구원 사용자 감정 기반 디지털 멀티미디어 제어장치 및 그 방법
KR20180111467A (ko) * 2017-03-31 2018-10-11 삼성전자주식회사 사용자 감정 판단을 위한 전자 장치 및 이의 제어 방법


Similar Documents

Publication Publication Date Title
KR102351008B1 (ko) 감정 인식 장치 및 감정 인식 방법
US20210081056A1 (en) Vpa with integrated object recognition and facial expression recognition
Vinola et al. A survey on human emotion recognition approaches, databases and applications
US9031293B2 (en) Multi-modal sensor based emotion recognition and emotional interface
CN108886532B (zh) 用于操作个人代理的装置和方法
US9501743B2 (en) Method and apparatus for tailoring the output of an intelligent automated assistant to a user
CN110349667B (zh) 结合调查问卷及多模态范式行为数据分析的孤独症评估系统
US9734730B2 (en) Multi-modal modeling of temporal interaction sequences
JP2004310034A (ja) 対話エージェントシステム
US20140212854A1 (en) Multi-modal modeling of temporal interaction sequences
Prado et al. Visuo-auditory multimodal emotional structure to improve human-robot-interaction
JP2005237561A (ja) 情報処理装置及び方法
JP6633250B2 (ja) 対話ロボットおよび対話システム、並びに対話プログラム
WO2020215590A1 (fr) Dispositif de prise de vues intelligent et son procédé de génération de scène basée sur une reconnaissance biométrique
KR20200092207A (ko) 전자 장치 및 이를 이용한 감정 정보에 대응하는 그래픽 오브젝트를 제공하는 방법
WO2020175969A1 (fr) Appareil de reconnaissance d'émotion et procédé de reconnaissance d'émotion
Sosa-Jiménez et al. A prototype for Mexican sign language recognition and synthesis in support of a primary care physician
Zhang et al. A survey on mobile affective computing
JP2017182261A (ja) 情報処理装置、情報処理方法、およびプログラム
JP7111042B2 (ja) 情報処理装置、提示システム、及び情報処理プログラム
Mansouri Benssassi et al. Wearable assistive technologies for autism: opportunities and challenges
US20240335952A1 (en) Communication robot, communication robot control method, and program
JP2000194252A (ja) 理想行動支援装置及びその方法及びそのシステム並びに記憶媒体
Haq Audio visual expressed emotion classification
JP4355823B2 (ja) 表情等の情報処理装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20762976

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20762976

Country of ref document: EP

Kind code of ref document: A1