WO2022024354A1 - Reaction analysis system - Google Patents

Reaction analysis system

Info

Publication number
WO2022024354A1
Authority
WO
WIPO (PCT)
Prior art keywords
biological reaction
unit
moving image
change
analysis
Prior art date
Application number
PCT/JP2020/029467
Other languages
French (fr)
Japanese (ja)
Inventor
渉三 神谷
Original Assignee
株式会社I’mbesideyou
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社I’mbesideyou filed Critical 株式会社I’mbesideyou
Priority to PCT/JP2020/029467 priority Critical patent/WO2022024354A1/en
Priority to JP2022539571A priority patent/JP7242114B2/en
Priority to PCT/JP2021/028142 priority patent/WO2022025200A1/en
Publication of WO2022024354A1 publication Critical patent/WO2022024354A1/en
Priority to JP2023031017A priority patent/JP2023075197A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management

Definitions

  • The present invention relates to a reaction analysis system.
  • Conventionally, a technique is known for analyzing the emotions that others feel in response to a speaker's remarks (see, for example, Patent Document 1). A technique is also known for analyzing changes in a subject's facial expression over a long period in time series and estimating the emotions held during that period (see, for example, Patent Document 2). Further, a technique for identifying the element that most strongly influenced an emotional change is known (see, for example, Patent Documents 3 to 5). There is also a known technique that compares a subject's usual facial expression with the current facial expression and issues an alert when the expression is gloomy (see, for example, Patent Document 6).
  • An object of the present invention is to analyze, for a participant in an environment where online sessions are held by a plurality of people, a peculiar reaction that differs from the participant's previous reactions, and to make the analysis result available for use.
  • To achieve this, the reaction analysis system of the present invention analyzes, for each participant, a peculiar reaction that differs from previous reactions, based on moving images obtained by photographing the participant in each of a plurality of online sessions held by a plurality of people.
  • The reaction analysis system of the present embodiment is a system that analyzes, for each participant, a peculiar reaction that differs from previous reactions, based on moving images obtained by photographing the participants in each of a plurality of online sessions held by a plurality of people.
  • In the present embodiment, the biological reactions of the participants are analyzed.
  • The emotions of the participants may also be analyzed through the analysis of their biological reactions.
  • An online session is, for example, an online conference, an online class, or an online chat, in which terminals installed at multiple locations are connected to a server via a communication network such as the Internet and moving images can be exchanged between the terminals through the server. The online sessions targeted in the present embodiment are preferably held multiple times with the same, or almost the same, participants.
  • The moving images handled in an online session include the face images and voices of the users of the terminals.
  • The moving images also include images such as materials shared and viewed by a plurality of users. On the screen of each terminal, it is possible to switch between the face image and the material image so that only one of them is displayed, or to split the display area so that both are displayed at the same time. It is also possible to display the image of one of the participants on the full screen, or to display the images of some or all of the users on small split screens.
  • Any one or more of the users who participate in the online session through a terminal can be designated as a person to be analyzed. For example, the leader, facilitator, or administrator of the online session designates a user as the analysis target.
  • Organizers of online sessions include, for example, instructors of online classes, chairs and facilitators of online conferences, and coaches of sessions held for coaching purposes.
  • The organizer of an online session is usually one of the users who participate in it, but may also be another person who does not participate.
  • All participants may also be treated as analysis targets without designating a specific person to be analyzed.
  • FIG. 1 is a block diagram showing a functional configuration example of the reaction analysis system according to the present embodiment.
  • The reaction analysis system of the present embodiment includes, as functional components, a moving image acquisition unit 11, a biological reaction analysis unit 12, a peculiarity determination unit 13, a related event identification unit 14, a clustering unit 15, and an analysis result notification unit 16.
  • Each of the functional blocks 11 to 16 can be implemented by any of hardware, a DSP (Digital Signal Processor), or software provided in, for example, a server device.
  • When implemented by software, each of the functional blocks 11 to 16 actually comprises a computer CPU, RAM, ROM, and the like, and is realized by running a program stored in a recording medium such as RAM, ROM, a hard disk, or semiconductor memory.
  • The moving image acquisition unit 11 acquires, for each of a plurality of online sessions, the moving images obtained by photographing the participants (the multiple users) with the camera of each terminal during the online session. It does not matter whether a moving image acquired from a terminal is set to be displayed on the terminal screens. That is, the moving image acquisition unit 11 acquires moving images from each terminal regardless of whether they are currently displayed or hidden.
  • The biological reaction analysis unit 12 analyzes changes in the biological reactions of the designated analysis target person based on the moving images acquired by the moving image acquisition unit 11. Changes in biological reactions may also be analyzed for each of a plurality of persons, including persons other than the analysis target. In the present embodiment, the biological reaction analysis unit 12 separates each moving image acquired by the moving image acquisition unit 11 into a set of images (a collection of frame images) and audio, and analyzes changes in biological reactions from each.
  • For example, the biological reaction analysis unit 12 analyzes the user's face image using the frame images separated from the moving image, thereby analyzing changes in biological reactions relating to at least one of facial expression, line of sight, pulse, and facial movement. The biological reaction analysis unit 12 also analyzes the audio separated from the moving image, thereby analyzing changes in biological reactions relating to at least one of the user's speech content and voice quality.
  • The biological reaction analysis unit 12 calculates a biological reaction index value that reflects the content of the change in the biological reaction by quantifying the change according to a predetermined standard.
  • Changes in facial expression are analyzed, for example, as follows. For each frame image, the face region is identified within the frame, and the identified facial expression is classified into one of a plurality of types according to an image analysis model trained in advance by machine learning. Then, based on the classification results, it is analyzed whether a positive or negative facial expression change occurs between consecutive frame images and how large the change is, and a facial expression change index value corresponding to the analysis result is output.
  • Changes in the line of sight are analyzed, for example, as follows. For each frame image, the eye region is identified within the frame, and the orientation of both eyes is analyzed to determine where the user is looking, for example whether the user is looking at the face of the speaker being displayed, at the shared material being displayed, or outside the screen. It may also be analyzed whether the movement of the line of sight is large or small and whether it is frequent or infrequent. Changes in the line of sight are also related to the user's degree of concentration.
  • The biological reaction analysis unit 12 outputs a line-of-sight change index value corresponding to the analysis result.
  • Changes in pulse are analyzed, for example, as follows. For each frame image, the face region is identified within the frame. Then, using a trained image analysis model that captures the numerical values of facial color information (the G channel of RGB), changes in the G color of the face surface are analyzed. By arranging the results along the time axis, a waveform representing the change in color information is formed, and the pulse is identified from this waveform. When a person is nervous the pulse becomes faster, and when the person feels calm the pulse becomes slower.
  • The biological reaction analysis unit 12 outputs a pulse change index value corresponding to the analysis result.
  • Changes in facial movement are analyzed, for example, as follows. For each frame image, the face region is identified within the frame, and the orientation of the face is analyzed to determine where the user is looking, for example whether the user is looking at the face of the speaker being displayed, at the shared material being displayed, or outside the screen. It may also be analyzed whether the movement of the face is large or small and whether it is frequent or infrequent. The movement of the face and the movement of the line of sight may also be analyzed together; for example, it may be analyzed whether the user is looking straight at the displayed speaker's face, looking up at it, looking down at it, or looking at it from an angle.
  • The biological reaction analysis unit 12 outputs a face orientation change index value corresponding to the analysis result.
  • The content of speech is analyzed, for example, as follows. The biological reaction analysis unit 12 converts the audio of a specified period (for example, about 30 to 150 seconds) into a character string by performing known speech recognition processing, and applies morphological analysis to the character string to remove words, such as particles and articles, that are unnecessary for representing the conversation. The remaining words are then vectorized, and it is analyzed whether a positive or negative emotional change is occurring and how large the emotional change is, and a speech content index value corresponding to the analysis result is output.
  • Voice quality is analyzed, for example, as follows. The biological reaction analysis unit 12 identifies the acoustic features of the audio of a specified period (for example, about 30 to 150 seconds) by performing known voice analysis processing. Then, based on the acoustic features, it is analyzed whether a positive or negative voice quality change is occurring and how large the change is, and a voice quality change index value corresponding to the analysis result is output.
  • The biological reaction analysis unit 12 calculates the biological reaction index value using at least one of the facial expression change index value, line-of-sight change index value, pulse change index value, face orientation change index value, speech content index value, and voice quality change index value calculated as described above.
  • For example, the biological reaction index value is calculated by a weighted combination of the facial expression change index value, line-of-sight change index value, pulse change index value, face orientation change index value, speech content index value, and voice quality change index value.
  • The peculiarity determination unit 13 determines whether the change in the biological reaction analyzed for the analysis target person in one online session is peculiar compared with the changes in the biological reaction analyzed for that person in online sessions earlier in time than the one online session.
  • In the present embodiment, the peculiarity determination unit 13 makes this determination based on the biological reaction index values calculated for the analysis target person by the biological reaction analysis unit 12.
  • For example, the peculiarity determination unit 13 calculates the variance of the biological reaction index values calculated by the biological reaction analysis unit 12 for the analysis target person for each of a plurality of past online sessions, and compares the biological reaction index value calculated for the current online session with that variance to determine whether the change in the biological reaction analyzed this time is peculiar compared with before.
  • The related event identification unit 14 identifies an event occurring with respect to at least one of the analysis target person, other persons, and the environment when a change in the biological reaction determined to be peculiar by the peculiarity determination unit 13 occurs. For example, the related event identification unit 14 identifies, from the moving images, the behavior of the analysis target person, the behavior of other persons, and the environment at the time the peculiar change in the biological reaction occurred.
  • The environment includes, for example, the shared material displayed on the screen and objects visible in the background of the analysis target person. By identifying such events, it becomes possible to capture events that may have influenced the change in the analysis target person's reaction.
  • The clustering unit 15 clusters the patterns of change in the biological reaction based on the content of the changes determined to be peculiar by the peculiarity determination unit 13 (for example, one or a combination of the line of sight, pulse, facial movement, speech content, and voice quality) and the magnitude of the change from before. This makes it possible to analyze and present, as patterns, behaviors that may occur in the future for the analysis target person. For example, when the degree of smiling gradually decreases and negative biological reactions continue a predetermined number of times, it is possible to recognize that the analysis target person may withdraw from the online session, and an alert to that effect may be issued.
  • The analysis result notification unit 16 notifies the person who designated the analysis target (the organizer of the online session) of at least one of the changes in the biological reaction determined to be peculiar by the peculiarity determination unit 13, the events identified by the related event identification unit 14, and the change patterns clustered by the clustering unit 15.
  • The organizer of the online session may also be notified of the analysis target person's future behavior patterns predicted from the clustered change patterns, of predetermined alerts based on those patterns, and the like.
  • Depending on which change pattern the designated analysis target person is clustered into, the organizer of the online session can grasp that person's behavioral tendencies and predict behaviors and states that may occur in the future, and can then take appropriate measures for that person.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Human Resources & Organizations (AREA)
  • Operations Research (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Data Mining & Analysis (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The present invention comprises: a moving image acquisition unit 11 that acquires, for each of a plurality of online sessions, a moving image obtained by photographing a person to be analyzed during the online session; a biological reaction analysis unit 12 that analyzes changes in the biological reaction of the person to be analyzed on the basis of the moving images acquired by the moving image acquisition unit 11; and a peculiarity determination unit 13 that determines whether a change in the biological reaction analyzed for the person to be analyzed in one online session is peculiar in comparison with changes in the biological reaction analyzed for that person in online sessions earlier in time than the one online session. A peculiar reaction of the person to be analyzed that differs from previous reactions is thereby analyzed on the basis of the moving images obtained by photographing the person in each of the plurality of online sessions in an environment in which the online sessions are held by a plurality of people.

Description

Reaction analysis system
The present invention relates to a reaction analysis system.
Conventionally, a technique is known for analyzing the emotions that others feel in response to a speaker's remarks (see, for example, Patent Document 1). A technique is also known for analyzing changes in a subject's facial expression over a long period in time series and estimating the emotions held during that period (see, for example, Patent Document 2). Further, a technique for identifying the element that most strongly influenced an emotional change is known (see, for example, Patent Documents 3 to 5). There is also a known technique that compares a subject's usual facial expression with the current facial expression and issues an alert when the expression is gloomy (see, for example, Patent Document 6).
Japanese Unexamined Patent Publication No. 2019-58625
Japanese Unexamined Patent Publication No. 2016-149063
Japanese Unexamined Patent Publication No. 2020-86559
Japanese Unexamined Patent Publication No. 2000-76421
Japanese Unexamined Patent Publication No. 2017-201499
Japanese Unexamined Patent Publication No. 2018-112831
An object of the present invention is to analyze, for a participant in an environment where online sessions are held by a plurality of people, a peculiar reaction that differs from the participant's previous reactions, and to make the analysis result available for use.
In order to solve the above problem, the reaction analysis system of the present invention analyzes, for each participant, a peculiar reaction that differs from previous reactions, based on moving images obtained by photographing the participant in each of a plurality of online sessions in an environment where the online sessions are held by a plurality of people.
According to the present invention configured as described above, in an environment where online sessions are held by a plurality of people, a peculiar reaction of a participant that differs from previous reactions can be analyzed and the analysis result can be utilized.
FIG. 1 is a block diagram showing an example of the functional configuration of the reaction analysis system according to the present embodiment.
The reaction analysis system of the present embodiment is a system that analyzes, for each participant, a peculiar reaction that differs from the participant's previous reactions, based on moving images obtained by photographing the participants in each of a plurality of online sessions in an environment where the online sessions are held by a plurality of people. In the present embodiment, the biological reactions of the participants are analyzed. The emotions of the participants (feelings that arise in response to one's own or another person's words and actions, such as pleasure or displeasure and their degree) may also be analyzed through the analysis of the biological reactions.
An online session is, for example, an online conference, an online class, or an online chat, in which terminals installed at multiple locations are connected to a server via a communication network such as the Internet and moving images can be exchanged between the terminals through the server. The online sessions targeted in the present embodiment are preferably held multiple times with the same, or almost the same, participants.
The moving images handled in an online session include the face images and voices of the users of the terminals. The moving images also include images such as materials shared and viewed by a plurality of users. On the screen of each terminal, it is possible to switch between the face image and the material image so that only one of them is displayed, or to split the display area so that both are displayed at the same time. It is also possible to display the image of one of the participants on the full screen, or to display the images of some or all of the users on small split screens.
Any one or more of the users who participate in the online session through a terminal can be designated as a person to be analyzed. For example, the leader, facilitator, or administrator of the online session (hereinafter collectively referred to as the organizer) designates a user as the analysis target. Organizers of online sessions include, for example, instructors of online classes, chairs and facilitators of online conferences, and coaches of sessions held for coaching purposes. The organizer of an online session is usually one of the users who participate in it, but may also be another person who does not participate. All participants may also be treated as analysis targets without designating a specific person.
Hereinafter, an embodiment of the present invention will be described with reference to the drawings. FIG. 1 is a block diagram showing an example of the functional configuration of the reaction analysis system according to the present embodiment. As shown in FIG. 1, the reaction analysis system of the present embodiment includes, as functional components, a moving image acquisition unit 11, a biological reaction analysis unit 12, a peculiarity determination unit 13, a related event identification unit 14, a clustering unit 15, and an analysis result notification unit 16.
Each of the functional blocks 11 to 16 can be implemented by any of hardware, a DSP (Digital Signal Processor), or software provided in, for example, a server device. When implemented by software, each of the functional blocks 11 to 16 actually comprises a computer CPU, RAM, ROM, and the like, and is realized by running a program stored in a recording medium such as RAM, ROM, a hard disk, or semiconductor memory.
The moving image acquisition unit 11 acquires, for each of a plurality of online sessions, the moving images obtained by photographing the participants (the multiple users) with the camera of each terminal during the online session. It does not matter whether a moving image acquired from a terminal is set to be displayed on the terminal screens; that is, the moving image acquisition unit 11 acquires moving images from each terminal regardless of whether they are currently displayed or hidden.
The biological reaction analysis unit 12 analyzes changes in the biological reactions of the designated analysis target person based on the moving images acquired by the moving image acquisition unit 11. Changes in biological reactions may also be analyzed for each of a plurality of persons, including persons other than the analysis target. In the present embodiment, the biological reaction analysis unit 12 separates each moving image acquired by the moving image acquisition unit 11 into a set of images (a collection of frame images) and audio, and analyzes changes in biological reactions from each.
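As an illustration only, not part of the patent text, the separation of a recorded session into a frame-image set and an audio track could look like the following minimal sketch. It assumes OpenCV and an ffmpeg binary are available; the output filename and sampling rate are arbitrary placeholders.

```python
import subprocess
import cv2  # OpenCV for frame extraction

def split_video(path: str, wav_out: str = "session_audio.wav"):
    """Split a recorded session into (frames, fps) and a mono WAV file."""
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)          # one BGR ndarray per frame
    cap.release()

    # Extract the audio track with ffmpeg (16 kHz mono WAV).
    subprocess.run(
        ["ffmpeg", "-y", "-i", path, "-vn", "-ac", "1", "-ar", "16000", wav_out],
        check=True,
    )
    return frames, fps, wav_out
```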
For example, the biological reaction analysis unit 12 analyzes the user's face image using the frame images separated from the moving image, thereby analyzing changes in biological reactions relating to at least one of facial expression, line of sight, pulse, and facial movement. The biological reaction analysis unit 12 also analyzes the audio separated from the moving image, thereby analyzing changes in biological reactions relating to at least one of the user's speech content and voice quality.
When a person's emotions change, the change appears as changes in biological reactions such as facial expression, line of sight, pulse, facial movement, speech content, and voice quality. In the present embodiment, changes in biological reactions caused by changes in the user's emotions are analyzed. Changes in the user's emotions may also be analyzed by analyzing the changes in biological reactions; the emotion analyzed in that case is, for example, the degree of pleasure or displeasure. In the present embodiment, the biological reaction analysis unit 12 calculates a biological reaction index value that reflects the content of the change in the biological reaction by quantifying the change according to a predetermined standard.
Changes in facial expression are analyzed, for example, as follows. For each frame image, the face region is identified within the frame, and the identified facial expression is classified into one of a plurality of types according to an image analysis model trained in advance by machine learning. Then, based on the classification results, it is analyzed whether a positive or negative facial expression change occurs between consecutive frame images and how large the change is, and a facial expression change index value corresponding to the analysis result is output.
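A minimal sketch of this facial-expression step, purely for illustration: detect_face and classify_expression stand in for the face detector and the machine-learned expression classifier mentioned above (their names, labels, and outputs are assumptions, not anything defined in the patent). The index here is simply the mean frame-to-frame change in a positive-minus-negative valence score.

```python
import numpy as np

POSITIVE = {"happy", "surprised"}                      # assumed label sets
NEGATIVE = {"angry", "sad", "disgusted", "fearful"}

def expression_change_index(frames, detect_face, classify_expression):
    """Quantify positive/negative expression change across consecutive frames."""
    valences = []
    for frame in frames:
        face = detect_face(frame)                      # hypothetical detector
        if face is None:
            continue
        probs = classify_expression(face)              # dict: label -> probability
        pos = sum(p for lbl, p in probs.items() if lbl in POSITIVE)
        neg = sum(p for lbl, p in probs.items() if lbl in NEGATIVE)
        valences.append(pos - neg)                     # valence in [-1, 1]
    if len(valences) < 2:
        return 0.0
    deltas = np.diff(valences)                         # change between consecutive frames
    # Signed mean keeps direction (positive vs. negative change) as well as size.
    return float(np.mean(deltas))
```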
Changes in the line of sight are analyzed, for example, as follows. For each frame image, the eye region is identified within the frame, and the orientation of both eyes is analyzed to determine where the user is looking, for example whether the user is looking at the face of the speaker being displayed, at the shared material being displayed, or outside the screen. It may also be analyzed whether the movement of the line of sight is large or small and whether it is frequent or infrequent. Changes in the line of sight are also related to the user's degree of concentration. The biological reaction analysis unit 12 outputs a line-of-sight change index value corresponding to the analysis result.
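One possible sketch of the line-of-sight step, again illustrative only: it buckets an estimated gaze point into screen regions and counts how often the gaze target switches. estimate_gaze_point is a hypothetical per-frame gaze estimator returning normalized screen coordinates, and the two-region screen layout is an assumption.

```python
def gaze_change_index(frames, estimate_gaze_point,
                      speaker_box=(0.0, 0.0, 0.5, 1.0),
                      material_box=(0.5, 0.0, 1.0, 1.0)):
    """Classify where the user is looking and how often the gaze target switches."""
    def region(pt):
        if pt is None:
            return "off_screen"
        x, y = pt
        for name, (x0, y0, x1, y1) in (("speaker", speaker_box),
                                       ("material", material_box)):
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return "off_screen"

    targets = [region(estimate_gaze_point(f)) for f in frames]
    switches = sum(1 for a, b in zip(targets, targets[1:]) if a != b)
    # Index: fraction of frame transitions where the gaze target switched.
    return switches / max(len(targets) - 1, 1), targets
```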
Changes in pulse are analyzed, for example, as follows. For each frame image, the face region is identified within the frame. Then, using a trained image analysis model that captures the numerical values of facial color information (the G channel of RGB), changes in the G color of the face surface are analyzed. By arranging the results along the time axis, a waveform representing the change in color information is formed, and the pulse is identified from this waveform. When a person is nervous the pulse becomes faster, and when the person feels calm the pulse becomes slower. The biological reaction analysis unit 12 outputs a pulse change index value corresponding to the analysis result.
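This pulse step corresponds to a simple remote-photoplethysmography pipeline. The sketch below is one way it could be done and is not taken from the patent: it averages the G channel over a face region per frame, band-pass filters the resulting waveform to the typical heart-rate band, and counts peaks. SciPy and a hypothetical detect_face_box function are assumed.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def estimate_pulse_bpm(frames, fps, detect_face_box):
    """Rough pulse estimate from the green-channel waveform of the face region."""
    g_means = []
    for frame in frames:
        box = detect_face_box(frame)              # hypothetical: (x, y, w, h) or None
        if box is None:
            continue
        x, y, w, h = box
        roi = frame[y:y + h, x:x + w]
        g_means.append(roi[:, :, 1].mean())       # OpenCV frames are BGR; index 1 = G
    if len(g_means) < fps * 5:                    # need a few seconds of signal
        return None
    signal = np.asarray(g_means) - np.mean(g_means)
    # Keep 0.7-3.0 Hz (roughly 42-180 beats per minute).
    b, a = butter(3, [0.7, 3.0], btype="bandpass", fs=fps)
    filtered = filtfilt(b, a, signal)
    peaks, _ = find_peaks(filtered, distance=fps / 3.0)
    duration_s = len(filtered) / fps
    return 60.0 * len(peaks) / duration_s
```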
Changes in facial movement are analyzed, for example, as follows. For each frame image, the face region is identified within the frame, and the orientation of the face is analyzed to determine where the user is looking, for example whether the user is looking at the face of the speaker being displayed, at the shared material being displayed, or outside the screen. It may also be analyzed whether the movement of the face is large or small and whether it is frequent or infrequent. The movement of the face and the movement of the line of sight may also be analyzed together; for example, it may be analyzed whether the user is looking straight at the displayed speaker's face, looking up at it, looking down at it, or looking at it from an angle. The biological reaction analysis unit 12 outputs a face orientation change index value corresponding to the analysis result.
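One common way to realize this face-orientation step (illustrative, not prescribed by the patent) is to estimate head pose from a few facial landmarks with a generic 3D face model and OpenCV's solvePnP; detect_landmarks is a hypothetical function returning the six 2D points used here, and the Euler-angle extraction is a rough approximation.

```python
import cv2
import numpy as np

# Generic 3D reference points (nose tip, chin, eye corners, mouth corners), in mm.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0), (0.0, -330.0, -65.0),
    (-225.0, 170.0, -135.0), (225.0, 170.0, -135.0),
    (-150.0, -150.0, -125.0), (150.0, -150.0, -125.0),
], dtype=np.float64)

def head_pose(frame, detect_landmarks):
    """Return approximate (yaw, pitch, roll) in degrees, or None if no face is found."""
    pts_2d = detect_landmarks(frame)              # hypothetical: 6x2 float array
    if pts_2d is None:
        return None
    h, w = frame.shape[:2]
    cam = np.array([[w, 0, w / 2], [0, w, h / 2], [0, 0, 1]], dtype=np.float64)
    ok, rvec, _ = cv2.solvePnP(MODEL_POINTS, np.asarray(pts_2d, dtype=np.float64),
                               cam, np.zeros(4))
    if not ok:
        return None
    rot, _ = cv2.Rodrigues(rvec)
    # Simple Euler-angle extraction; adequate for coarse "straight/up/down/sideways" buckets.
    yaw = np.degrees(np.arctan2(rot[0, 2], rot[2, 2]))
    pitch = np.degrees(np.arcsin(-rot[1, 2]))
    roll = np.degrees(np.arctan2(rot[1, 0], rot[1, 1]))
    return yaw, pitch, roll
```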
The content of speech is analyzed, for example, as follows. The biological reaction analysis unit 12 converts the audio of a specified period (for example, about 30 to 150 seconds) into a character string by performing known speech recognition processing, and applies morphological analysis to the character string to remove words, such as particles and articles, that are unnecessary for representing the conversation. The remaining words are then vectorized, and it is analyzed whether a positive or negative emotional change is occurring and how large the emotional change is, and a speech content index value corresponding to the analysis result is output.
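For the speech-content step, the sketch below substitutes concrete placeholders for the components named above: transcribe stands in for the known speech-recognition processing, a trivial stopword filter stands in for morphological analysis, and a tiny hand-made sentiment lexicon stands in for the word-vector-based emotion analysis. All of these are assumptions for illustration only.

```python
STOPWORDS = {"the", "a", "an", "of", "to", "and", "is", "it"}   # stands in for particles/articles
LEXICON = {"great": 1.0, "good": 0.5, "fine": 0.3,              # assumed toy sentiment lexicon
           "bad": -0.5, "terrible": -1.0, "confusing": -0.7}

def speech_content_index(audio_path, transcribe):
    """Score a 30-150 s speech window as a positive/negative emotional change."""
    text = transcribe(audio_path)                 # hypothetical ASR call returning a string
    words = [w.lower().strip(".,!?") for w in text.split()]
    content_words = [w for w in words if w and w not in STOPWORDS]
    if not content_words:
        return 0.0
    scores = [LEXICON.get(w, 0.0) for w in content_words]
    # Mean score in [-1, 1]: sign gives positive/negative, magnitude gives size.
    return sum(scores) / len(content_words)
```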
Voice quality is analyzed, for example, as follows. The biological reaction analysis unit 12 identifies the acoustic features of the audio of a specified period (for example, about 30 to 150 seconds) by performing known voice analysis processing. Then, based on the acoustic features, it is analyzed whether a positive or negative voice quality change is occurring and how large the change is, and a voice quality change index value corresponding to the analysis result is output.
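The voice-quality step can be sketched with two basic acoustic features computed directly with NumPy, RMS energy and zero-crossing rate, compared against a per-speaker baseline. Which features and directions of change to use is an assumption here, since the patent only refers to known voice analysis processing.

```python
import numpy as np

def voice_quality_features(samples: np.ndarray, frame_len: int = 1024):
    """Per-frame RMS energy and zero-crossing rate for a mono audio window."""
    n = len(samples) // frame_len * frame_len
    frames = samples[:n].reshape(-1, frame_len).astype(np.float64)
    rms = np.sqrt((frames ** 2).mean(axis=1))
    zcr = (np.abs(np.diff(np.sign(frames), axis=1)) > 0).mean(axis=1)
    return rms, zcr

def voice_quality_change_index(samples, baseline_rms, baseline_zcr):
    """Signed deviation of the current window from the speaker's baseline."""
    rms, zcr = voice_quality_features(samples)
    # Heuristic: more energetic speech scores positive, flatter or noisier speech negative.
    return float((rms.mean() - baseline_rms) / (baseline_rms + 1e-9)
                 - (zcr.mean() - baseline_zcr) / (baseline_zcr + 1e-9))
```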
The biological reaction analysis unit 12 calculates the biological reaction index value using at least one of the facial expression change index value, line-of-sight change index value, pulse change index value, face orientation change index value, speech content index value, and voice quality change index value calculated as described above. For example, the biological reaction index value is calculated by a weighted combination of these index values.
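The weighted combination can be as simple as the following sketch; the weights shown are arbitrary example values, not values disclosed in the patent.

```python
# Example weights only; in practice these would be tuned or learned.
WEIGHTS = {
    "expression": 0.3, "gaze": 0.15, "pulse": 0.15,
    "face_orientation": 0.1, "speech_content": 0.2, "voice_quality": 0.1,
}

def biological_reaction_index(index_values: dict) -> float:
    """Weighted combination of whichever per-channel index values are available."""
    used = {k: v for k, v in index_values.items() if k in WEIGHTS}
    if not used:
        return 0.0
    total_w = sum(WEIGHTS[k] for k in used)
    return sum(WEIGHTS[k] * v for k, v in used.items()) / total_w

# e.g. biological_reaction_index({"expression": -0.4, "pulse": 0.8, "gaze": 0.1})
```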
The peculiarity determination unit 13 determines whether the change in the biological reaction analyzed for the analysis target person in one online session is peculiar compared with the changes in the biological reaction analyzed for that person in online sessions earlier in time than the one online session. In the present embodiment, the peculiarity determination unit 13 makes this determination based on the biological reaction index values calculated for the analysis target person by the biological reaction analysis unit 12.
For example, the peculiarity determination unit 13 calculates the variance of the biological reaction index values calculated by the biological reaction analysis unit 12 for the analysis target person for each of the plurality of past online sessions, and compares the biological reaction index value calculated for the current online session with that variance to determine whether the change in the biological reaction analyzed this time is peculiar compared with before.
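One straightforward reading of this variance comparison is a z-score test against the participant's own history of per-session index values, as in the sketch below; the threshold of 2 standard deviations is an arbitrary example, not a value from the patent.

```python
import statistics

def is_peculiar(past_session_indices, current_index, threshold=2.0):
    """Flag the current session if its index deviates strongly from past sessions."""
    if len(past_session_indices) < 2:
        return False                                 # not enough history to judge
    mean = statistics.fmean(past_session_indices)
    stdev = statistics.pstdev(past_session_indices)  # square root of the population variance
    if stdev == 0:
        return current_index != mean
    z = abs(current_index - mean) / stdev
    return z > threshold

# e.g. is_peculiar([0.12, 0.08, 0.15, 0.10], current_index=0.55)  -> True
```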
The related event identification unit 14 identifies an event occurring with respect to at least one of the analysis target person, other persons, and the environment when a change in the biological reaction determined to be peculiar by the peculiarity determination unit 13 occurs. For example, the related event identification unit 14 identifies, from the moving images, the behavior of the analysis target person, the behavior of other persons, and the environment at the time the peculiar change in the biological reaction occurred. The environment includes, for example, the shared material displayed on the screen and objects visible in the background of the analysis target person. By identifying such events, it becomes possible to capture events that may have influenced the change in the analysis target person's reaction.
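As a purely illustrative sketch of the related-event step, the code below looks up what was happening (for example who was speaking or which shared material was on screen) at the timestamps where a peculiar change was detected. The event-timeline data structure is an assumption, since the patent does not specify how such events are recorded.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Event:
    start_s: float      # event start time within the session, in seconds
    end_s: float        # event end time
    kind: str           # e.g. "speaker", "shared_material", "background_change"
    label: str          # e.g. speaker name or slide identifier

def related_events(peculiar_times_s: List[float],
                   timeline: List[Event]) -> List[Optional[Event]]:
    """For each peculiar timestamp, return the event active at that moment (if any)."""
    results = []
    for t in peculiar_times_s:
        active = [e for e in timeline if e.start_s <= t <= e.end_s]
        results.append(active[0] if active else None)
    return results
```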
The clustering unit 15 clusters the patterns of change in the biological reaction based on the content of the changes determined to be peculiar by the peculiarity determination unit 13 (for example, one or a combination of the line of sight, pulse, facial movement, speech content, and voice quality) and the magnitude of the change from before. This makes it possible to analyze and present, as patterns, behaviors that may occur in the future for the analysis target person. For example, when the degree of smiling gradually decreases and negative biological reactions continue a predetermined number of times, it is possible to recognize that the analysis target person may withdraw from the online session, and an alert to that effect may be issued.
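The clustering step could, for example, be realized with k-means over per-occurrence feature vectors that encode which channels changed and by how much; scikit-learn is assumed to be available, and the number of clusters is an arbitrary example rather than anything stated in the patent.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_change_patterns(change_vectors, n_clusters=4, seed=0):
    """Cluster peculiar-change occurrences into recurring change patterns.

    change_vectors: array of shape (n_occurrences, n_channels), where each column
    is the magnitude of change for one channel (gaze, pulse, facial movement,
    speech content, voice quality) relative to the participant's past behavior.
    """
    X = np.asarray(change_vectors, dtype=float)
    model = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10)
    labels = model.fit_predict(X)
    return labels, model.cluster_centers_
```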
The analysis result notification unit 16 notifies the person who designated the analysis target (the organizer of the online session) of at least one of the changes in the biological reaction determined to be peculiar by the peculiarity determination unit 13, the events identified by the related event identification unit 14, and the change patterns clustered by the clustering unit 15. The organizer of the online session may also be notified of the analysis target person's future behavior patterns predicted from the clustered change patterns, of predetermined alerts based on those patterns, and the like.
This allows the organizer of the online session to know what kind of events influence what kind of changes in the biological reaction of the designated analysis target person, and to take appropriate measures for that person accordingly. In addition, depending on which change pattern the designated analysis target person is clustered into, the organizer can grasp that person's behavioral tendencies, predict behaviors and states that may occur in the future, and take appropriate measures for that person.
The above embodiment is merely one concrete example of carrying out the present invention, and the technical scope of the present invention should not be interpreted restrictively because of it. That is, the present invention can be implemented in various forms without departing from its gist or main features.
11 Moving image acquisition unit
12 Biological reaction analysis unit
13 Peculiarity determination unit
14 Related event identification unit
15 Clustering unit
16 Analysis result notification unit

Claims (9)

  1.  A reaction analysis system that analyzes, for a participant, a peculiar reaction that differs from the participant's previous reactions, based on moving images obtained by photographing the participant in each of a plurality of online sessions in an environment where the online sessions are held by a plurality of people.
  2.  The reaction analysis system according to claim 1, further comprising:
      a moving image acquisition unit that acquires, for each of the plurality of online sessions, a moving image obtained by photographing the participant during the online session;
      a biological reaction analysis unit that analyzes changes in the biological reaction of the participant based on the moving image acquired by the moving image acquisition unit; and
      a peculiarity determination unit that determines whether a change in the biological reaction analyzed for the participant in one online session is peculiar compared with changes in the biological reaction analyzed for the participant in an online session earlier in time than the one online session.
  3.  The reaction analysis system according to claim 2, wherein the biological reaction analysis unit analyzes a change in the biological reaction relating to at least one of facial expression, line of sight, pulse, and facial movement by analyzing a face image in the moving image acquired by the moving image acquisition unit.
  4.  The reaction analysis system according to claim 2 or 3, wherein the biological reaction analysis unit analyzes a change in the biological reaction relating to at least one of speech content and voice quality by analyzing audio in the moving image acquired by the moving image acquisition unit.
  5.  The reaction analysis system according to any one of claims 2 to 4, wherein
      the biological reaction analysis unit calculates a biological reaction index value by quantifying the change in the biological reaction according to a predetermined standard, and
      the peculiarity determination unit determines, based on the biological reaction index value calculated for the participant by the biological reaction analysis unit, whether the change in the biological reaction analyzed for the participant in the one online session is peculiar compared with the changes in the biological reaction analyzed for the participant in the online session earlier in time than the one online session.
  6.  The reaction analysis system according to any one of claims 2 to 5, further comprising a related event identification unit that identifies an event occurring with respect to at least one of the person to be analyzed, a person other than the person to be analyzed, and the environment when a change in the biological reaction determined to be peculiar by the peculiarity determination unit occurs.
  7.  The reaction analysis system according to any one of claims 2 to 6, further comprising a clustering unit that clusters change patterns of the biological reaction based on the content of the changes in the biological reaction determined to be peculiar by the peculiarity determination unit and the magnitude of the change from before.
  8.  The emotion analysis system according to claim 6, further comprising an analysis result notification unit that notifies the organizer of the online session of at least one of the change in the biological reaction determined to be peculiar by the peculiarity determination unit and the event identified by the related event identification unit.
  9.  The emotion analysis system according to claim 7, further comprising an analysis result notification unit that notifies the organizer of the online session of at least one of the change in the biological reaction determined to be peculiar by the peculiarity determination unit and the change pattern clustered by the clustering unit.
PCT/JP2020/029467 2020-07-31 2020-07-31 Reaction analysis system WO2022024354A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/JP2020/029467 WO2022024354A1 (en) 2020-07-31 2020-07-31 Reaction analysis system
JP2022539571A JP7242114B2 (en) 2020-07-31 2021-07-29 Reaction analysis system and reaction analysis device
PCT/JP2021/028142 WO2022025200A1 (en) 2020-07-31 2021-07-29 Reaction analysis system and reaction analysis device
JP2023031017A JP2023075197A (en) 2020-07-31 2023-03-01 Reaction analysis system and reaction analysis device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/029467 WO2022024354A1 (en) 2020-07-31 2020-07-31 Reaction analysis system

Publications (1)

Publication Number Publication Date
WO2022024354A1

Family

ID=80035314

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2020/029467 WO2022024354A1 (en) 2020-07-31 2020-07-31 Reaction analysis system
PCT/JP2021/028142 WO2022025200A1 (en) 2020-07-31 2021-07-29 Reaction analysis system and reaction analysis device

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/028142 WO2022025200A1 (en) 2020-07-31 2021-07-29 Reaction analysis system and reaction analysis device

Country Status (2)

Country Link
JP (2) JP7242114B2 (en)
WO (2) WO2022024354A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115381411A (en) * 2022-09-19 2022-11-25 慧医谷中医药科技(天津)股份有限公司 Pulse condition analysis information processing method and system

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023183280A (en) * 2022-06-15 2023-12-27 京セラ株式会社 Electronic equipment, control method of electronic equipment, and control program of electronic equipment
JP2023183278A (en) * 2022-06-15 2023-12-27 京セラ株式会社 Electronic equipment, control method of electronic equipment, and control program of electronic equipment

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010191689A (en) * 2009-02-18 2010-09-02 Nec Corp Apparatus for supporting measure against phenomenon, terminal, and system, method and program for supporting measure against phenomenon

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9113035B2 (en) * 2013-03-05 2015-08-18 International Business Machines Corporation Guiding a desired outcome for an electronically hosted conference
JP2016046705A (en) * 2014-08-25 2016-04-04 コニカミノルタ株式会社 Conference record editing apparatus, method and program for the same, conference record reproduction apparatus, and conference system
JP2020048610A (en) * 2018-09-21 2020-04-02 富士ゼロックス株式会社 State evaluation system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010191689A (en) * 2009-02-18 2010-09-02 Nec Corp Apparatus for supporting measure against phenomenon, terminal, and system, method and program for supporting measure against phenomenon

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ANDO, JUN: "Visualizing Employee Health", NIHON KEIZAI SHIMBUN, vol. 13, 20 April 2020 (2020-04-20) *
ANONYMOUS: "AI Sakura discovers mental disorders early! -It is now possible to support the mental health of employees who are newly teleworking- ", TIFANA, 3 June 2020 (2020-06-03), XP055903252, Retrieved from the Internet <URL:https://www.tifana.ai/news/20200603> [retrieved on 20220321] *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115381411A (en) * 2022-09-19 2022-11-25 慧医谷中医药科技(天津)股份有限公司 Pulse condition analysis information processing method and system

Also Published As

Publication number Publication date
JP7242114B2 (en) 2023-03-20
WO2022025200A1 (en) 2022-02-03
JP2023075197A (en) 2023-05-30
JPWO2022025200A1 (en) 2022-02-03

Similar Documents

Publication Publication Date Title
WO2022024354A1 (en) Reaction analysis system
WO2022024194A1 (en) Emotion analysis system
WO2022024353A1 (en) Reaction analysis system
WO2022064621A1 (en) Video meeting evaluation system and video meeting evaluation server
WO2022064622A1 (en) Emotion analysis system
WO2022024355A1 (en) Emotion analysis system
WO2022064620A1 (en) Video meeting evaluation system and video meeting evaluation server
WO2022064617A1 (en) Video meeting evaluation system and video meeting evaluation server
WO2022064618A1 (en) Video meeting evaluation system and video meeting evaluation server
WO2022064619A1 (en) Video meeting evaluation system and video meeting evaluation server
JP7465040B1 (en) Communication visualization system
WO2022024356A1 (en) Organization attribute analysis system
WO2022074785A1 (en) Video meeting evaluation terminal, video meeting evaluation system, and video meeting evaluation program
JP7388768B2 (en) Video analysis program
JP7100938B1 (en) Video analysis program
JP7121436B1 (en) Video analysis program
WO2022254497A1 (en) Video analysis system
JP7121433B1 (en) Video analysis program
WO2024142291A1 (en) Communication visualizing system
WO2022137502A1 (en) Video meeting evaluation terminal, video meeting evaluation system, and video meeting evaluation program
US11935329B2 (en) Video analysis program
JP7445331B2 (en) Video meeting evaluation terminal and video meeting evaluation method
WO2022145044A1 (en) Reaction notification system
WO2022145038A1 (en) Video meeting evaluation terminal, video meeting evaluation system and video meeting evaluation program
WO2022145042A1 (en) Video meeting evaluation terminal, video meeting evaluation system, and video meeting evaluation program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20947476

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20947476

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP