WO2022024355A1 - Emotion analysis system - Google Patents

Emotion analysis system

Info

Publication number
WO2022024355A1
Authority
WO
WIPO (PCT)
Prior art keywords
biological reaction
emotion
subject
moving image
unit
Prior art date
Application number
PCT/JP2020/029468
Other languages
French (fr)
Japanese (ja)
Inventor
神谷 渉三 (Shozo Kamiya)
Original Assignee
I’mbesideyou Inc. (株式会社I’mbesideyou)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2020-07-31
Filing date: 2020-07-31
Publication date: 2022-02-03
Application filed by I’mbesideyou Inc. (株式会社I’mbesideyou)
Priority to PCT/JP2020/029468 (WO2022024355A1)
Priority to JP2022539467A (JPWO2022025025A1)
Priority to PCT/JP2021/027638 (WO2022025025A1)
Publication of WO2022024355A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G06Q10/10: Office automation; Time management

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Human Resources & Organizations (AREA)
  • Operations Research (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Data Mining & Analysis (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention comprises: a moving image acquisition unit 11 that acquires a moving image obtained by photographing a subject; a biological reaction analysis unit 12 that analyzes a change in a biological reaction of the subject on the basis of the moving image acquired by the moving image acquisition unit 11; and an emotion evaluation unit 13 that evaluates a degree of emotion of the subject according to an evaluation standard normalized (leveled) among a plurality of subjects, on the basis of the change in the biological reaction analyzed by the biological reaction analysis unit 12. By analyzing the change in the subject's biological reaction from the moving image and calculating, from the analyzed change, an absolute emotional response value based on the evaluation standard normalized among the plurality of subjects, the system makes it possible to evaluate the true degree of the subject's emotion and to compare degrees of emotion objectively between different subjects.

Description

Emotion analysis system

The present invention relates to an emotion analysis system.

Conventionally, a technique is known for analyzing the emotions that others feel in response to a speaker's remarks (see, for example, Patent Document 1). A technique is also known that compares a subject's normal (expressionless) face with the subject's current facial expression to determine the degree of the subject's emotion (see, for example, Patent Documents 2 to 4).

Patent Document 1: JP 2019-58625 A. Patent Document 2: JP 2011-154665 A. Patent Document 3: JP 2012-8949 A. Patent Document 4: JP 2013-300 A.

The techniques described in Patent Documents 2 to 4 evaluate the degree of emotion only by how much the current facial expression has changed from the normal expression. The resulting evaluation therefore cannot be used to objectively compare degrees of emotion between different subjects.

For example, suppose subject A's current level of emotion X is judged to be 3 from the difference from A's normal expression, and subject B's current level of emotion X is likewise judged to be 3. However, level 3 for subject A, who produces emotion X readily, does not mean the same thing (the same true degree of emotion) as level 3 for subject B, who produces emotion X only rarely. Because the techniques of Patent Documents 2 to 4 cannot evaluate the degree of emotion in this true sense, they cannot objectively compare degrees of emotion between different subjects.

An object of the present invention is to make it possible to evaluate the true degree of a subject's emotion, so that degrees of emotion can be compared objectively between different subjects.

To solve the above problem, the emotion analysis system of the present invention analyzes changes in a subject's biological reactions on the basis of a moving image obtained by photographing the subject, and, on the basis of the analyzed changes, evaluates the degree of the subject's emotion according to an evaluation standard leveled among a plurality of subjects.

With the present invention configured as described above, the true degree of a subject's emotion can be evaluated, and degrees of emotion can be compared objectively between different subjects.
FIG. 1 is a block diagram showing an example of the functional configuration of the emotion analysis system according to the present embodiment.
The emotion analysis system of the present embodiment is a system that, in an environment where an online session is held by a plurality of people, analyzes changes in a subject's (an online session participant's) biological reactions on the basis of a moving image obtained by photographing the subject, and evaluates the degree of the subject's emotion according to an evaluation standard leveled among a plurality of subjects on the basis of the analyzed changes.
An online session is, for example, an online conference, an online class, or an online chat: terminals installed at multiple locations connect to a server over a communication network such as the Internet, and moving images are exchanged between the terminals through that server.
The moving images handled in an online session include the face image and voice of the user of each terminal. They also include images such as materials shared and viewed by multiple users. On each terminal's screen it is possible to switch between the face image and the material image to display only one of them, or to divide the display area and show both at the same time. It is also possible to display one participant's image full-screen, or to display the images of some or all of the users in small divided panes.
An embodiment of the present invention is described below with reference to the drawings. FIG. 1 is a block diagram showing an example of the functional configuration of the emotion analysis system according to the present embodiment. As shown in FIG. 1, the emotion analysis system of the present embodiment includes, as its functional configuration, a moving image acquisition unit 11, a biological reaction analysis unit 12, and an emotion evaluation unit 13.
Each of the functional blocks 11 to 13 can be implemented in hardware, in a DSP (Digital Signal Processor), or in software provided in, for example, a server device. When implemented in software, each functional block is realized by a program, stored on a recording medium such as RAM, ROM, a hard disk, or semiconductor memory, operating on a computer equipped with a CPU, RAM, ROM, and the like.
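As one purely illustrative sketch of this functional configuration, the three units might be wired together in software as follows; all class names, method names, and data shapes here are assumptions for illustration, not part of the publication.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical containers and wiring for units 11-13; the publication
# specifies only the functional split, not any concrete implementation.

@dataclass
class MovingImage:
    frames: List[object]  # frame images separated from the stream
    audio: object         # audio track separated from the stream

class MovingImageAcquisitionUnit:  # unit 11
    """Collects the moving image from every terminal in the session,
    including streams that are not currently displayed on any screen."""
    def acquire(self, terminal_streams: List[MovingImage]) -> List[MovingImage]:
        return list(terminal_streams)

class BiologicalReactionAnalysisUnit:  # unit 12
    """Quantifies changes in biological reactions as a single index value."""
    def analyze(self, moving_image: MovingImage) -> float:
        # A real implementation would analyze frames (expression, gaze,
        # pulse, face movement) and audio (speech content, voice quality).
        return 0.0  # placeholder index value

class EmotionEvaluationUnit:  # unit 13
    """Scales the index value with a per-subject weight so that degrees
    of emotion are comparable (leveled) across subjects."""
    def evaluate(self, index_value: float, subject_weight: float) -> float:
        return index_value * subject_weight
```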
The moving image acquisition unit 11 acquires from each terminal the moving image obtained by photographing the multiple users with the camera provided in each terminal during the online session. It does not matter whether a given moving image is set to be displayed on the terminals' screens: the moving image acquisition unit 11 acquires the moving images from every terminal, including those currently displayed and those hidden.
The biological reaction analysis unit 12 analyzes changes in the biological reactions of each of the multiple users on the basis of the moving images acquired by the moving image acquisition unit 11. In the present embodiment, the biological reaction analysis unit 12 separates each acquired moving image into a set of images (a collection of frame images) and a voice, and analyzes changes in biological reactions from each.
For example, by analyzing a user's face image in the frame images separated from the acquired moving image, the biological reaction analysis unit 12 analyzes changes in biological reactions relating to at least one of facial expression, line of sight, pulse, and facial movement. By analyzing the voice separated from the acquired moving image, it analyzes changes in biological reactions relating to at least one of the user's speech content and voice quality.
When a person's emotions change, the change appears as changes in biological reactions such as facial expression, line of sight, pulse, facial movement, speech content, and voice quality. In the present embodiment, changes in the user's emotions are analyzed by analyzing changes in the user's biological reactions. The emotion analyzed in this embodiment is, as one example, the degree of comfort or discomfort. The biological reaction analysis unit 12 quantifies the changes in biological reactions according to a predetermined standard, thereby calculating a biological reaction index value that reflects the content of those changes.
Changes in facial expression are analyzed, for example, as follows. For each frame image, the face region is located within the frame, and the facial expression of the located face is classified into one of several types according to an image analysis model trained in advance by machine learning. Based on the classification results, the unit analyzes whether a positive or negative expression change occurs between consecutive frame images and how large that change is, and outputs a facial expression change index value according to the analysis result.
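A minimal sketch of this expression step follows, assuming a face detector and a pretrained expression classifier are available; both components, and the expression classes and polarity scores, are assumptions, since the publication names neither the model nor the classes.

```python
import numpy as np

# Assumed components, not specified by the publication:
#   detect_face(frame)        -> cropped face region
#   classify_expression(face) -> probability vector ordered like EXPRESSIONS
EXPRESSIONS = ["happy", "surprised", "neutral", "sad", "angry"]  # illustrative
POLARITY = {"happy": 1.0, "surprised": 0.5, "neutral": 0.0,
            "sad": -0.5, "angry": -1.0}

def expression_change_index(frames, detect_face, classify_expression):
    """Score each frame's expression polarity, then measure the change
    between consecutive frames: the sign marks a positive or negative
    expression change, the magnitude marks how large it is."""
    scores = []
    for frame in frames:
        probs = classify_expression(detect_face(frame))
        scores.append(sum(p * POLARITY[c] for c, p in zip(EXPRESSIONS, probs)))
    deltas = np.diff(scores)  # frame-to-frame expression change
    return float(deltas.sum())
```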
Changes in the line of sight are analyzed, for example, as follows. For each frame image, the eye regions are located within the frame and the direction of both eyes is analyzed to determine where the user is looking: for example, at the displayed speaker's face, at the displayed shared material, or off-screen. Whether the eye movements are large or small, and frequent or infrequent, may also be analyzed. Changes in the line of sight are also related to the user's degree of concentration. The biological reaction analysis unit 12 outputs a line-of-sight change index value according to the analysis result.
Changes in pulse are analyzed, for example, as follows. For each frame image, the face region is located within the frame. Then, using a trained image analysis model that captures the numerical color information of the face (the G channel of RGB), changes in the green component of the facial surface are analyzed. Arranging the results along the time axis forms a waveform representing the change in color information, and the pulse is identified from this waveform. When a person is nervous the pulse quickens, and when the person feels calm the pulse slows. The biological reaction analysis unit 12 outputs a pulse change index value according to the analysis result.
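The waveform step can be sketched with plain spectral analysis, as below. Treating the dominant frequency of the mean green value as the pulse is a common remote-PPG heuristic and an assumption here; the publication says only that a trained model tracks the G-channel change and that the pulse is identified from the resulting waveform.

```python
import numpy as np

def pulse_from_face_rois(face_rois, fps):
    """Estimate the pulse from the mean green value of the face region in
    each frame. Assumes several seconds of frames and RGB channel order."""
    g = np.array([roi[..., 1].mean() for roi in face_rois])  # G of RGB
    g = g - g.mean()                                 # remove the DC offset
    spectrum = np.abs(np.fft.rfft(g))
    freqs = np.fft.rfftfreq(len(g), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)           # ~42-180 beats/minute
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0                            # pulse in beats/minute
```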
Changes in facial movement are analyzed, for example, as follows. For each frame image, the face region is located within the frame and the orientation of the face is analyzed to determine where the user is looking: for example, at the displayed speaker's face, at the displayed shared material, or off-screen. Whether the face movements are large or small, and frequent or infrequent, may also be analyzed. Facial movement and eye movement may also be analyzed together; for example, whether the user is looking straight at the displayed speaker's face, looking up at it or down at it, or viewing it from an angle. The biological reaction analysis unit 12 outputs a face orientation change index value according to the analysis result.
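Since the line-of-sight and face-orientation analyses share the goal of deciding where the user is looking, one sketch can cover both. The upstream angle estimator, the angle thresholds, and the assumption that shared material sits lower on the screen are all illustrative only.

```python
import numpy as np

# Assumed upstream estimator (not specified in the publication):
#   head_or_gaze_angles(frame) -> (yaw_deg, pitch_deg)

def looking_target(yaw_deg, pitch_deg, screen_half_angle=20.0):
    """Classify the viewing target from head or gaze angles."""
    if abs(yaw_deg) > screen_half_angle or abs(pitch_deg) > screen_half_angle:
        return "off_screen"
    if pitch_deg < -10.0:
        return "shared_material"  # assumes material is shown lower on screen
    return "speaker_face"

def movement_stats(angle_series, big_move_deg=5.0):
    """How large and how frequent the movements are, per the description."""
    deltas = np.abs(np.diff(np.asarray(angle_series)))
    return float(deltas.mean()), int((deltas > big_move_deg).sum())
```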
Speech content is analyzed, for example, as follows. The biological reaction analysis unit 12 converts the voice over a specified time (for example, about 30 to 150 seconds) into a character string by a known speech recognition process, and applies morphological analysis to the string to remove words, such as particles and articles, that are unnecessary for representing the conversation. The remaining words are vectorized, and the unit analyzes whether a positive or negative emotional change is occurring and how large it is, then outputs a speech content index value according to the analysis result.
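A sketch of this speech-content step follows. The speech recognizer, the morphological tokenizer, and the polarity lexicon are all assumed components, and simple lexicon scoring stands in for the word-vectorization step the text describes.

```python
# Assumed components (not named by the publication):
#   transcribe(audio) -> text, via a known speech recognition process
#   tokenize(text)    -> [(word, part_of_speech), ...], morphological analysis
SKIP_POS = {"particle", "article", "interjection"}  # non-content words

def speech_content_index(audio, transcribe, tokenize, polarity_lexicon):
    """Transcribe a 30-150 s window, drop non-content words, and score the
    rest: the sign marks a positive or negative emotional change, the
    magnitude marks how large it is."""
    words = [w for w, pos in tokenize(transcribe(audio)) if pos not in SKIP_POS]
    scores = [polarity_lexicon.get(w, 0.0) for w in words]
    return sum(scores) / max(len(scores), 1)
```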
Voice quality is analyzed, for example, as follows. The biological reaction analysis unit 12 identifies the acoustic features of the voice over a specified time (for example, about 30 to 150 seconds) by a known speech analysis process. Based on those acoustic features, it analyzes whether a positive or negative change in voice quality is occurring and how large it is, then outputs a voice quality change index value according to the analysis result.
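One way to sketch the voice-quality step is shown below. The publication says only that "acoustic features" are used, so the choice of per-window energy and zero-crossing rate, and the drift-from-baseline scoring, are assumptions.

```python
import numpy as np

def voice_quality_change_index(samples, sr, window_s=1.0):
    """Track loudness (RMS energy) and a rough brightness proxy
    (zero-crossing rate) per window over a 30-150 s clip, and report the
    signed drift from the clip's opening windows."""
    hop = int(sr * window_s)
    feats = []
    for i in range(0, len(samples) - hop + 1, hop):
        w = samples[i:i + hop]
        rms = np.sqrt(np.mean(w ** 2))
        zcr = np.mean(np.abs(np.diff(np.sign(w))) > 0)
        feats.append((rms, zcr))
    feats = np.asarray(feats)
    baseline = feats[: max(len(feats) // 10, 1)].mean(axis=0)
    return float((feats - baseline).mean())  # >0 louder/brighter, <0 flatter
```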
The biological reaction analysis unit 12 calculates the biological reaction index value using at least one of the facial expression change index value, line-of-sight change index value, pulse change index value, face orientation change index value, speech content index value, and voice quality change index value calculated as described above. For example, the biological reaction index value is calculated as a weighted combination of these six index values.
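The weighted combination can be as simple as the sketch below; the publication discloses that the index values are weighted and combined but not the weights themselves, so the numbers here are placeholders.

```python
# Placeholder weights; the publication does not disclose actual values.
WEIGHTS = {
    "expression": 0.30, "gaze": 0.15, "pulse": 0.15,
    "face_direction": 0.10, "speech_content": 0.20, "voice_quality": 0.10,
}

def biological_reaction_index(channel_indices: dict) -> float:
    """Weighted combination of the per-channel change index values."""
    return sum(w * channel_indices.get(name, 0.0)
               for name, w in WEIGHTS.items())
```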
The emotion evaluation unit 13 evaluates the degree of the subject's emotion according to an evaluation standard leveled among a plurality of subjects, on the basis of the changes in biological reactions analyzed for the subject by the biological reaction analysis unit 12. For example, the emotion evaluation unit 13 calculates an absolute emotional response value based on that leveled evaluation standard from the analyzed change in biological reactions (the biological reaction index value).
The absolute emotional response value calculated by the emotion evaluation unit 13 is, for example, the biological reaction index value calculated by the biological reaction analysis unit 12, adjusted according to how readily the subject produces the same emotion. For example, the emotion evaluation unit 13 calculates the absolute emotional response value by multiplying the biological reaction index value by a weight value that depends on how frequently the subject produces the same emotion.
For example, even when the biological reaction index value calculated for subject A equals the one calculated for subject B, if the two subjects differ in how readily they produce the same emotion (how frequently it occurs), the absolute emotional response values calculated by the emotion evaluation unit 13 will differ between them. As one example, the emotion evaluation unit 13 calculates the absolute emotional response value according to a function in which the weight value becomes smaller the more readily the subject produces the same emotion, and larger the less readily the subject produces it.
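The publication requires only that the weight decrease monotonically as the subject produces the same emotion more readily; the specific functional form below is one assumed example, with the worked numbers showing how the same index value diverges for the two subjects.

```python
def emotion_weight(frequency: float) -> float:
    """Monotone-decreasing weight over frequency in [0, 1]: frequent
    emotions get a small weight, rare ones a large weight. The 1/(1+f)
    form is an assumption; any monotone-decreasing function would fit."""
    return 1.0 / (1.0 + frequency)

def absolute_emotional_response(index_value: float, frequency: float) -> float:
    return index_value * emotion_weight(frequency)

# Same index value 3.0 for two subjects:
#   subject A, produces emotion X often  (f = 0.8) -> 3.0 * 0.556 = 1.67
#   subject B, produces emotion X rarely (f = 0.1) -> 3.0 * 0.909 = 2.73
```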
Using the absolute emotional response value calculated in this way makes it possible to evaluate the true degree of a subject's emotion, and to compare degrees of emotion objectively between different subjects.
Alternatively, the emotion evaluation unit 13 may evaluate a degree of emotion that is based on the magnitude of the difference between the current biological reaction and the subject's normal biological reaction, adjusted according to how readily the subject produces the same emotion. For example, the emotion evaluation unit 13 calculates the absolute emotional response value by adjusting the biological reaction index value calculated by the biological reaction analysis unit 12 according to both the magnitude of the difference between the current and normal biological reactions and how readily the subject produces the same emotion. The absolute emotional response value calculated in this way represents a degree of emotion based on the magnitude of the difference between the current and normal biological reactions, adjusted according to how readily or how rarely the subject produces the same emotion.
The above embodiment uses the frequency with which the same emotion occurs as the measure of how readily it is produced, but the invention is not limited to this. For example, the subject's disposition or personality may be used instead of, or in addition to, that frequency.
The above embodiment is in every respect merely one concrete example of carrying out the present invention, and the technical scope of the present invention must not be interpreted restrictively because of it. The present invention can be carried out in various forms without departing from its gist or principal features.
 11 Moving image acquisition unit
 12 Biological reaction analysis unit
 13 Emotion evaluation unit

Claims (6)

  1. An emotion analysis system comprising:
     a moving image acquisition unit that acquires a moving image obtained by photographing a subject;
     a biological reaction analysis unit that analyzes a change in a biological reaction of the subject on the basis of the moving image acquired by the moving image acquisition unit; and
     an emotion evaluation unit that evaluates a degree of emotion of the subject according to an evaluation standard leveled among a plurality of subjects, on the basis of the change in the biological reaction analyzed for the subject by the biological reaction analysis unit.
  2. The emotion analysis system according to claim 1, wherein the emotion evaluation unit evaluates a degree of emotion that is based on the magnitude of the difference between the current biological reaction and the normal biological reaction, and that is adjusted according to how readily the subject produces the same emotion.
  3. The emotion analysis system according to claim 1, wherein the biological reaction analysis unit calculates a biological reaction index value by quantifying the change in the biological reaction according to a predetermined standard, and the emotion evaluation unit calculates an absolute emotional response value, which is the biological reaction index value adjusted according to how readily the subject produces the same emotion.
  4. The emotion analysis system according to claim 2, wherein the biological reaction analysis unit calculates a biological reaction index value by quantifying the change in the biological reaction according to a predetermined standard, and the emotion evaluation unit calculates an absolute emotional response value by adjusting the biological reaction index value according to the magnitude of the difference between the current biological reaction and the normal biological reaction and according to how readily the subject produces the same emotion.
  5. The emotion analysis system according to any one of claims 1 to 4, wherein the biological reaction analysis unit analyzes a change in a biological reaction relating to at least one of facial expression, line of sight, pulse, and facial movement by analyzing a face image in the moving image acquired by the moving image acquisition unit.
  6. The emotion analysis system according to any one of claims 1 to 5, wherein the biological reaction analysis unit analyzes a change in a biological reaction relating to at least one of speech content and voice quality by analyzing a voice in the moving image acquired by the moving image acquisition unit.
PCT/JP2020/029468 2020-07-31 2020-07-31 Emotion analysis system WO2022024355A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2020/029468 WO2022024355A1 (en) 2020-07-31 2020-07-31 Emotion analysis system
JP2022539467A JPWO2022025025A1 (en) 2020-07-31 2021-07-27
PCT/JP2021/027638 WO2022025025A1 (en) 2020-07-31 2021-07-27 Emotion analysis system and emotion analysis device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/029468 WO2022024355A1 (en) 2020-07-31 2020-07-31 Emotion analysis system

Publications (1)

Publication Number Publication Date
WO2022024355A1 2022-02-03

Family

ID=80035318

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2020/029468 WO2022024355A1 (en) 2020-07-31 2020-07-31 Emotion analysis system
PCT/JP2021/027638 WO2022025025A1 (en) 2020-07-31 2021-07-27 Emotion analysis system and emotion analysis device

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/027638 WO2022025025A1 (en) 2020-07-31 2021-07-27 Emotion analysis system and emotion analysis device

Country Status (2)

Country Link
JP (1) JPWO2022025025A1 (en)
WO (2) WO2022024355A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001083984A (en) * 1999-09-09 2001-03-30 Alpine Electronics Inc Interface device
JP2011039934A (en) * 2009-08-17 2011-02-24 Tokai Univ Emotion estimation system and learning system using the same
JP2015141428A (en) * 2014-01-27 2015-08-03 株式会社日立システムズ Server device, feeling notification system, feeling notification method, and program

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130065846A (en) * 2011-12-02 2013-06-20 삼성전자주식회사 Apparatus and method for sharing users' emotion
JP5947237B2 (en) * 2013-03-22 2016-07-06 日本電信電話株式会社 Emotion estimation device, emotion estimation method, and program
US9646198B2 (en) * 2014-08-08 2017-05-09 International Business Machines Corporation Sentiment analysis in a video conference


Also Published As

Publication number Publication date
WO2022025025A1 (en) 2022-02-03
JPWO2022025025A1 (en) 2022-02-03

Similar Documents

Publication Publication Date Title
WO2022024354A1 (en) Reaction analysis system
JP2012059107A (en) Emotion estimation device, emotion estimation method and program
WO2022024194A1 (en) Emotion analysis system
JP7197957B2 (en) Reaction analysis system and reaction analysis device
WO2022024355A1 (en) Emotion analysis system
WO2022064622A1 (en) Emotion analysis system
WO2022064621A1 (en) Video meeting evaluation system and video meeting evaluation server
WO2022024356A1 (en) Organization attribute analysis system
WO2022064620A1 (en) Video meeting evaluation system and video meeting evaluation server
JP7465040B1 (en) Communication visualization system
WO2022064617A1 (en) Video meeting evaluation system and video meeting evaluation server
WO2022201272A1 (en) Video analysis program
WO2022064619A1 (en) Video meeting evaluation system and video meeting evaluation server
JP7100938B1 (en) Video analysis program
WO2022064618A1 (en) Video meeting evaluation system and video meeting evaluation server
WO2022254497A1 (en) Video analysis system
JP7197955B1 (en) Video meeting evaluation terminal
JP7152817B1 (en) Video analysis program
WO2022074785A1 (en) Video meeting evaluation terminal, video meeting evaluation system, and video meeting evaluation program
WO2024142291A1 (en) Communication visualizing system
WO2022230136A1 (en) Video analysis system
JP7121439B1 (en) Video image analysis system
WO2022113248A1 (en) Video meeting evaluation terminal and video meeting evaluation method
JP7121436B1 (en) Video analysis program
JP7121433B1 (en) Video analysis program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20946896

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20946896

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP