WO2021157530A1 - Device for providing emotion information of a dialogue user - Google Patents
Device for providing emotion information of a dialogue user
- Publication number
- WO2021157530A1 (PCT/JP2021/003558)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- information
- dialogue
- unit
- viewpoint
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/147—Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
Definitions
- The present invention relates to a device for providing emotion information about a dialogue user in a dialogue between users who are remote from each other.
- Patent Document 1 discloses a technique of analyzing a user's line-of-sight direction from an image captured by an imaging unit provided near the display unit of a video conferencing device, enlarging the screen area of interest to the user, and distributing the image to the user.
- However, Patent Document 1 does not disclose a technique for improving communication by conveying the emotions of a remote dialogue user.
- Accordingly, an object of the present invention is to improve communication between remote dialogue users.
- The apparatus has an input receiving unit that receives the first user's viewpoint information on the first user's input/output terminal, an analysis unit that analyzes the viewpoint information, and an emotion information generation unit that generates emotion information based on the analyzed viewpoint information.
- FIG. 1 is a block configuration diagram showing a remote dialogue system according to the first embodiment of the present invention.
- This system 1 includes a server terminal 100 that stores and analyzes viewpoint information and generates emotion information, and dialogue devices 200A and 200B that are used for dialogue between users and incorporate an imaging unit such as a camera to acquire each user's viewpoint information.
- For convenience of explanation, a single server terminal and two dialogue devices are described, but the system may be configured with a plurality of server terminals and with one or more than two dialogue devices.
- The server terminal 100 and the dialogue devices 200A and 200B are connected to one another via the network NW.
- The network NW is composed of the Internet, an intranet, a wireless LAN (Local Area Network), a WAN (Wide Area Network), and the like.
- The server terminal 100 may be a general-purpose computer such as a workstation or a personal computer, or may be logically realized by cloud computing.
- The dialogue device 200 may be configured as, for example, a video conferencing device, an information processing device such as a personal computer or tablet terminal, a smartphone, a mobile phone, a PDA, or the like. Alternatively, the dialogue device may be configured by connecting a personal computer or smartphone to a liquid crystal display device by short-range wireless communication or the like, so that images of the user and the other dialogue user are displayed on the liquid crystal display device while necessary operations are performed via the personal computer or smartphone.
- FIG. 2 is a functional block configuration diagram of the server terminal 100 of FIG. 1.
- The server terminal 100 includes a communication unit 110, a storage unit 120, and a control unit 130.
- The communication unit 110 is a communication interface for communicating with the dialogue device 200 via the network NW, and communication is performed according to a communication protocol such as TCP/IP (Transmission Control Protocol/Internet Protocol).
- The storage unit 120 stores programs for executing various control processes, each function in the control unit 130, and the remote dialogue application, as well as input data and the like, and is composed of a RAM (Random Access Memory), a ROM (Read Only Memory), and the like. Further, the storage unit 120 has a user data storage unit 121 that stores various data related to users, and an analysis data storage unit 122 that stores analysis data obtained by analyzing viewpoint information from users and the emotion information generated based on the analysis results. A database (not shown) storing various data may be constructed outside the storage unit 120 or the server terminal 100.
- The control unit 130 controls the entire operation of the server terminal 100 by executing the programs stored in the storage unit 120, and is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like.
- The control unit 130 has an input receiving unit 131 that receives information such as viewpoint information from each device, an analysis unit 132 that analyzes the viewpoint information, and an emotion information generation unit 133 that generates emotion information based on the analysis result of the viewpoint information.
- The input receiving unit 131, the analysis unit 132, and the emotion information generation unit 133 are invoked by programs stored in the storage unit 120 and executed by the server terminal 100, which is a computer (electronic computer).
- The input receiving unit 131 can receive the user's viewpoint information acquired by the dialogue device 200 and, in the case of a video call, can also receive voice information, image information, and the like from the user.
- The received viewpoint information can be stored in the user data storage unit 121 and/or the analysis data storage unit 122 of the storage unit 120.
- The analysis unit 132 analyzes the received viewpoint information and can store the analyzed viewpoint information in the user data storage unit 121 and/or the analysis data storage unit 122.
- The emotion information generation unit 133 can generate emotion information based on the analyzed viewpoint information and store the emotion information in the user data storage unit 121 and/or the analysis data storage unit 122.
- The control unit 130 may also have an emotion information notification control unit (not shown), so that emotion information can be notified via a notification unit provided in the dialogue device 200; for example, the notification unit may vibrate a smartphone terminal.
- In this case, a control signal for activating vibration based on the emotion of one dialogue user can be generated and transmitted to the dialogue device of the other user.
- The control unit 130 may also have a screen generation unit (not shown) that generates screen information to be displayed via the user interface of the dialogue device 200.
- For example, using image and text data (not shown) stored in the storage unit 120 as materials, the screen generation unit arranges various images and texts in predetermined areas of the user interface based on predetermined layout rules (for example, it can generate a user interface, such as a dashboard, that visualizes advertising effectiveness for an advertiser).
- Processing related to screen generation can also be executed by the GPU (Graphics Processing Unit).
- Further, the screen generation unit can generate visualized screen information in which emotion information is identified by colors, characters, or the like.
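As a rough illustration of such color identification, the Python sketch below maps a five-grade emotion scale (the scale used later in FIG. 8) to display colors and renders a small HTML badge. The grade values, hex colors, and function name are illustrative assumptions, not anything specified in the disclosure.

```python
# Hypothetical sketch: identifying emotion information by color.
# The five-grade scale follows FIG. 8; the colors are assumed, not specified.
EMOTION_COLORS = {
    1: ("Very Negative", "#2c7bb6"),
    2: ("Negative", "#abd9e9"),
    3: ("Neutral", "#ffffbf"),
    4: ("Positive", "#fdae61"),
    5: ("Very Positive", "#d7191c"),
}

def render_emotion_badge(grade: int) -> str:
    """Return an HTML fragment that identifies an emotion grade by color and text."""
    label, color = EMOTION_COLORS[grade]
    return f'<span style="background:{color}">{label}</span>'

print(render_emotion_badge(5))  # <span style="background:#d7191c">Very Positive</span>
```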
- In addition, the control unit 130 can execute various processes included in the remote dialogue application for realizing remote video dialogue between a plurality of users.
- FIG. 3 is a functional block configuration diagram showing the dialogue device 200 of FIG. 1.
- The dialogue device 200 includes a communication unit 210, a display operation unit 220, a storage unit 230, a control unit 240, an imaging unit 250, and a notification unit 260.
- The communication unit 210 is a communication interface for communicating with the server terminal 100 and another dialogue device 200 via the network NW, and communication is performed according to a communication protocol such as TCP/IP.
- The display operation unit 220 is a user interface used by the user to input instructions and to display text, images, and the like according to input data from the control unit 240. When the dialogue device 200 is configured as a personal computer, it is composed of a display, a keyboard, and a mouse; when the dialogue device 200 is configured as a smartphone or tablet terminal, it is composed of a touch panel or the like.
- The display operation unit 220 is activated by a control program stored in the storage unit 230 and executed by the dialogue device 200, which is a computer (electronic computer).
- The storage unit 230 stores programs for executing various control processes and each function in the control unit 240, input data, and the like, and is composed of a RAM, a ROM, and the like. In addition, the storage unit 230 temporarily stores the content of communication with the server terminal 100.
- The control unit 240 controls the entire operation of the dialogue device 200 by executing programs stored in the storage unit 230 (including programs included in the remote dialogue application), and is composed of a CPU, a GPU, and the like.
- The dialogue device 200 may have a built-in imaging unit 250, such as a camera that images the user's eyeball with infrared rays and tracks the viewpoint position on the user's liquid crystal display screen, and, when configured as a smartphone or the like, may have a notification unit 260 for notifying the user of emotion information, such as a vibration motor that generates vibration.
- FIG. 4 is a diagram illustrating an imaging unit in another example of the dialogue device 200.
- The dialogue device 200 shown in FIG. 4 includes a liquid crystal display device 210, in which a through hole 230 is provided in the central portion of the liquid crystal display unit 220 and a CCD camera 240 is fitted into the through hole 230.
- The dialogue device 200 of this example further includes a smartphone (not shown) that is connected to the liquid crystal display device 210 by short-range wireless communication or by wire, and the smartphone executes various processes included in the remote dialogue application, such as video call and screen sharing.
- A screen generated from image information transmitted from the other user's dialogue device 200A via the server terminal 100 and the network NW can be displayed on the liquid crystal display unit 220 of the liquid crystal display device 210.
- The CCD camera 240 can image the eyeball of the user who uses the dialogue device 200 with infrared rays and track the user's viewpoint position on the liquid crystal display unit.
- Because the imaging unit (CCD camera) is located at the center of the screen, the user who has a dialogue using the liquid crystal display device can converse in a natural manner with the other dialogue user displayed on the liquid crystal display unit.
- Preferably, the display is arranged such that the position of the other user's face (more preferably, the position of the eyes) is aligned with the area where the imaging unit is located.
- For example, the face imaged by the camera provided in the other user's dialogue device can be tracked continuously so that the face is always displayed at the center of the screen.
- FIG. 5 is a diagram showing an example of user data stored in the server terminal 100.
- The user data 1000 stores various data related to the user.
- In FIG. 5, for convenience of explanation, an example of one user (identified by the user ID "10001") is shown, but information related to a plurality of users can be stored.
- The various data related to the user can include, for example, basic information about the user (for example, attribute information such as name, address, age, gender, and occupation), viewpoint information (for example, viewpoint position information on the liquid crystal display screen of the user identified by the user ID "10001", analyzed based on captured images), and emotion information (for example, emotion information of the user identified by the user ID "10001", generated based on the viewpoint position information).
- FIG. 6 is a diagram showing an example of analysis data stored in the server terminal 100.
- The analysis data can include, for example, viewpoint position information on the liquid crystal display screen of each user, analyzed based on captured images, and emotion information of each user, generated based on the viewpoint position information.
- FIG. 7 is a diagram showing an example of emotion information stored in the server terminal 100.
- In FIG. 7, with the coordinates of the central portion of the liquid crystal display unit (liquid crystal display screen) defined as (0, 0) in the x-axis and y-axis directions, the user's viewpoint position is tracked from the top of the table downward, together with the corresponding emotion information. For example, on a liquid crystal display screen in which the image of the dialogue user interacting with a certain user is displayed at the center of the screen, when that user places the viewpoint at the viewpoint position (0, 0), that is, at the center of the screen, it can be presumed that the user is Very Positive (highly interested) toward communication with the dialogue user.
- In generating emotion information, a rule can be set in advance so that emotion information corresponds to ranges of coordinates centered on the coordinates of the central portion. Alternatively, emotion information can be output from input viewpoint information by machine learning, using combinations of one user's past viewpoint information and emotion information and/or combinations of a plurality of users' past viewpoint information and emotion information as a learning model. When generating the learning model, feedback of emotion information from users can also be obtained through additional information such as surveys and voice information. When voice information is used, for example, the user's emotions can be detected from the voice information, or natural language analysis can be performed on the voice information to detect emotion information from the conversation content, which can then be evaluated as the output corresponding to the input information (viewpoint information).
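A minimal sketch of the coordinate-range rule described above might look as follows, assuming the screen center is (0, 0) and that interest falls off with distance from the center. The pixel thresholds and the five-grade labels are assumptions for illustration; the patent leaves the concrete ranges to the implementer.

```python
import math

# Assumed rule: the farther the viewpoint from the screen center (0, 0),
# the lower the estimated interest. Thresholds are illustrative only.
THRESHOLDS = [
    (50, 5, "Very Positive"),
    (150, 4, "Positive"),
    (300, 3, "Neutral"),
    (500, 2, "Negative"),
]

def emotion_from_viewpoint(x: float, y: float) -> tuple:
    """Map a single viewpoint sample to a five-grade emotion estimate."""
    distance = math.hypot(x, y)  # distance from the screen center
    for max_distance, grade, label in THRESHOLDS:
        if distance <= max_distance:
            return grade, label
    return 1, "Very Negative"

print(emotion_from_viewpoint(0, 0))      # (5, 'Very Positive'): gaze at center
print(emotion_from_viewpoint(400, 200))  # (2, 'Negative'): gaze near the edge
```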
- FIG. 8 is a diagram showing emotion information in chronological order.
- In FIG. 8, the vertical axis shows the user's emotion in five grades (1: Very Negative, 2: Negative, 3: Neutral, 4: Positive, 5: Very Positive), and the horizontal axis shows the time axis.
- In this way, emotion information can be derived based on the user's viewpoint information and expressed as a time series.
- In FIG. 8, it is visualized that the user shows high interest in the communication at the beginning of the dialogue, becomes less interested in the middle, and then gradually shows increasing interest again.
- The transition of the visualized emotion information is generated as screen information by the screen generation unit of the server terminal 100, transmitted to the dialogue device 200, and displayed, so that the user can communicate while referring to the transition of the dialogue user's emotion information.
- FIG. 9 is a diagram showing another example of emotion information stored in the server terminal 100.
- By counting the number of times and/or accumulating the time that the viewpoint is placed at each coordinate, the user can measure what kind of feelings the dialogue user has toward the communication as a whole (including its progress). For example, from the information shown in FIG. 9, the user can understand that the viewpoint position was most frequently focused on the coordinates (0, 0), that is, the center of the screen, throughout the communication, and can therefore see that the dialogue user felt Very Positive (very highly interested) about the communication.
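As a sketch of this aggregate evaluation (function names and data are assumptions), the overall feeling can be taken as the grade at which the viewpoint dwelt most often, in the spirit of FIG. 9:

```python
from collections import Counter

GRADE_LABELS = {1: "Very Negative", 2: "Negative", 3: "Neutral",
                4: "Positive", 5: "Very Positive"}

def overall_emotion(grades: list) -> str:
    """Evaluate a whole session from per-sample emotion grades (most frequent wins)."""
    most_common_grade, _count = Counter(grades).most_common(1)[0]
    return GRADE_LABELS[most_common_grade]

# Per-sample grades produced by a per-coordinate rule; gazes mostly at center.
print(overall_emotion([5, 5, 4, 2, 5, 5, 3]))  # Very Positive
```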
- FIG. 10 is a flowchart showing a method of generating emotion information according to the first embodiment of the present invention.
- First, the user accesses the server terminal 100 using the web browser or application of each dialogue device and, when using the service for the first time, registers the above-described basic user information and the like; thereafter, the service can be used by logging in after receiving predetermined authentication, such as entering an ID and password.
- After logging in, a predetermined user interface is provided via a website, an application, or the like, the video call service becomes available, and the process proceeds to step S101 shown in FIG. 10.
- First, in step S101, the input receiving unit 131 of the control unit 130 of the server terminal 100 receives viewpoint information from the dialogue device 200A via the communication unit 110.
- As the viewpoint information, for example, information on the viewpoint position can be acquired by imaging the user with the CCD camera 240 provided in the liquid crystal display unit 220 of the dialogue device shown in FIG. 4.
- Here, the image of the dialogue user is displayed at the center of the liquid crystal display unit 220 (the position where the CCD camera 240 is provided).
- The information related to the viewpoint position can be transmitted from the dialogue device 200A to the server terminal 100, or the captured image information can be transmitted instead.
- In the latter case, the analysis unit 132 of the control unit 130 of the server terminal 100 can calculate the viewpoint position based on the received image.
- Next, in step S102, the analysis unit 132 of the control unit 130 of the server terminal 100 analyzes the viewpoint information. The analysis unit 132 stores the user's viewpoint position on the liquid crystal display unit (screen) as viewpoint information in the user data storage unit 121 and/or the analysis data storage unit 122, linked to the specific user, continuously or at predetermined time intervals each time viewpoint information is acquired. In addition, the analysis unit 132 can track and store the user's viewpoint information in chronological order. Further, based on the viewpoint information, the analysis unit 132 can count the number of times the user's viewpoint position is placed at a predetermined coordinate, or can measure how long the viewpoint position stays at a predetermined coordinate each time and calculate the accumulated time. Further, as described above, the analysis unit 132 can also calculate the viewpoint position based on an image including the dialogue user received from the dialogue device 200A.
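A minimal sketch of this counting and accumulation, assuming viewpoint samples arrive as (timestamp, x, y) tuples and that coordinates are quantized to a grid so that nearby gazes fall into the same cell (the grid size and data layout are assumptions):

```python
from collections import defaultdict

GRID = 100  # assumed cell size in pixels

def accumulate_viewpoints(samples):
    """Count visits and accumulate dwell time per grid cell.

    samples: list of (timestamp_seconds, x, y), ordered by time.
    """
    counts = defaultdict(int)
    dwell = defaultdict(float)
    for (t0, x, y), (t1, _x, _y) in zip(samples, samples[1:]):
        cell = (round(x / GRID), round(y / GRID))
        counts[cell] += 1
        dwell[cell] += t1 - t0  # a sample is held until the next one arrives
    return counts, dwell

samples = [(0.0, 3, -2), (0.5, 8, 1), (1.0, 420, 180), (1.5, 5, 0)]
counts, dwell = accumulate_viewpoints(samples)
print(counts[(0, 0)], dwell[(0, 0)])  # 2 visits, 1.0 s near the center
```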
- Next, in step S103, the emotion information generation unit 133 of the control unit 130 of the server terminal 100 generates emotion information based on the analyzed viewpoint information.
- As described above, the emotion information generation unit 133 can generate emotion information based on a predetermined rule that associates ranges of the user's viewpoint position, relative to coordinates centered on the center of the liquid crystal display unit, with emotion information. For example, when the user's viewpoint position is at the coordinates (0, 0), that is, at the center of the screen, emotion information indicating that the user is Very Positive (shows high interest) toward communication with the dialogue user can be generated.
- Alternatively, emotion information can be generated from input viewpoint information by machine learning, using a learning model composed of users' viewpoint information and emotion information.
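The patent does not name a specific algorithm. As one hedged sketch, a nearest-neighbour lookup over past (viewpoint, emotion) pairs can stand in for the learning model; the training pairs below are invented illustrative data:

```python
import math

# Assumed past combinations of viewpoint position and emotion grade.
TRAINING = [((0, 0), 5), ((12, -8), 5), ((120, 40), 4),
            ((260, -150), 3), ((430, 210), 2), ((640, 360), 1)]

def predict_emotion(x: float, y: float) -> int:
    """Predict a five-grade emotion from a viewpoint via its nearest past sample."""
    nearest_point, grade = min(
        TRAINING, key=lambda pair: math.hypot(x - pair[0][0], y - pair[0][1]))
    return grade

print(predict_emotion(5, 5))      # 5: close to the center samples
print(predict_emotion(500, 300))  # 2: closest to the (430, 210) sample
```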
- In addition, as shown in FIG. 8, information that visualizes the transition of emotion information in time series can be generated, and, as shown in FIG. 9, information that evaluates the user's emotions throughout the communication can be generated from the number of times and/or the accumulated time of the coordinates where the user's viewpoint was placed.
- Next, in step S104, the generated emotion information can be transmitted to the dialogue device 200B as visualization information and displayed on the display unit of the dialogue device 200B; it can be identified and displayed by an icon or the like based on the degree of the emotion information (the five-grade evaluation described above); or, to convey the emotion information to the user of the dialogue device 200B in a sensory manner, a control signal for driving a notification unit such as the vibration motor of the dialogue device 200B can be generated and transmitted.
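As a sketch of this sensory transmission path, the payload below encodes a five-grade emotion as a vibration command. The field names, intensity mapping, and JSON transport are all assumptions; the disclosure only says a drive signal for a notification unit such as a vibration motor is generated and transmitted.

```python
import json

def make_vibration_signal(grade: int) -> bytes:
    """Encode a five-grade emotion as a control-signal payload for a vibration motor."""
    signal = {
        "type": "vibrate",
        "intensity": grade / 5.0,    # stronger emotion, stronger vibration (assumed)
        "duration_ms": 100 * grade,  # 100-500 ms (assumed)
    }
    return json.dumps(signal).encode("utf-8")

# The server terminal 100 would transmit this payload to dialogue device 200B,
# whose notification unit 260 drives its vibration motor accordingly.
print(make_vibration_signal(4))
```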
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Multimedia (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- User Interface Of Digital Computer (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Description
FIG. 1 is a block configuration diagram showing a remote dialogue system according to the first embodiment of the present invention. This system 1 includes a server terminal 100 for storing and analyzing viewpoint information and generating emotion information, and dialogue devices 200A and 200B that are used for dialogue between users and incorporate an imaging unit such as a camera to acquire the user's viewpoint information. For convenience of explanation, a single server terminal and two dialogue devices are described, but the system may be configured with a plurality of server terminals and with one or more than two dialogue devices.
The flow of the emotion information generation process executed by the system 1 of the present embodiment will be described with reference to FIG. 10. FIG. 10 is a flowchart showing a method of generating emotion information according to the first embodiment of the present invention.
Claims (5)
- An apparatus for supporting a video dialogue between a first user and a second user using a first user's dialogue device and a second user's dialogue device located remotely from each other, the apparatus comprising: an input receiving unit that receives viewpoint information of the first user on the first user's dialogue device; an analysis unit that analyzes the viewpoint information; and an emotion information generation unit that generates emotion information based on the analyzed viewpoint information.
- The apparatus according to claim 1, further comprising an emotion information transmission unit that transmits the emotion information to the second user's dialogue device.
- The apparatus according to claim 1, further comprising an emotion notification control unit that converts the emotion information into control information for controlling an emotion notification unit of the second user's dialogue device.
- The apparatus according to claim 1, wherein the emotion information generation unit generates the emotion information based on a position of a viewpoint on the dialogue device, included in the viewpoint information.
- The apparatus according to claim 1, wherein the emotion information generation unit generates the emotion information based on a number of times or a duration of the viewpoint position on the dialogue device, included in the viewpoint information.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2212377.2A GB2607800B (en) | 2020-02-03 | 2021-02-01 | Dialogue user emotion information providing device |
US17/794,153 US20230074113A1 (en) | 2020-02-03 | 2021-02-01 | Dialogue user emotion information providing device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-016175 | 2020-02-03 | ||
JP2020016175A JP7316664B2 (ja) | 2020-02-03 | 2020-02-03 | 対話ユーザの感情情報の提供装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021157530A1 (ja) | 2021-08-12 |
Family
ID=77200248
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/003558 WO2021157530A1 (ja) | 2020-02-03 | 2021-02-01 | 対話ユーザの感情情報の提供装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230074113A1 (ja) |
JP (1) | JP7316664B2 (ja) |
GB (1) | GB2607800B (ja) |
WO (1) | WO2021157530A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2023135939A1 (ja) * | 2022-01-17 | 2023-07-20 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012105196A1 (ja) * | 2011-02-04 | 2012-08-09 | パナソニック株式会社 | 関心度推定装置および関心度推定方法 |
JP2019030557A (ja) * | 2017-08-09 | 2019-02-28 | 沖電気工業株式会社 | 提示装置、提示方法、感情推定サーバ、感情推定方法及び感情推定システム |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8643691B2 (en) * | 2008-05-12 | 2014-02-04 | Microsoft Corporation | Gaze accurate video conferencing |
CN102204225B (zh) * | 2008-09-05 | 2013-12-11 | Sk电信有限公司 | 传送振动信息的移动通信终端及其方法 |
US20100060713A1 (en) * | 2008-09-10 | 2010-03-11 | Eastman Kodak Company | System and Method for Enhancing Noverbal Aspects of Communication |
KR101596975B1 (ko) | 2008-12-16 | 2016-02-23 | 파나소닉 인텔렉츄얼 프로퍼티 코포레이션 오브 아메리카 | 정보 표시 장치 및 정보 표시 방법 |
US8384760B2 (en) * | 2009-10-29 | 2013-02-26 | Hewlett-Packard Development Company, L.P. | Systems for establishing eye contact through a display |
WO2012095917A1 (ja) * | 2011-01-13 | 2012-07-19 | 株式会社ニコン | 電子機器および電子機器の制御プログラム |
US9531998B1 (en) * | 2015-07-02 | 2016-12-27 | Krush Technologies, Llc | Facial gesture recognition and video analysis tool |
WO2016112531A1 (en) * | 2015-01-16 | 2016-07-21 | Hewlett-Packard Development Company, L.P. | User gaze detection |
US10242379B2 (en) * | 2015-01-30 | 2019-03-26 | Adobe Inc. | Tracking visual gaze information for controlling content display |
US10289908B2 (en) * | 2015-10-21 | 2019-05-14 | Nokia Technologies Oy | Method, apparatus, and computer program product for tracking eye gaze and eye movement |
JP6055535B1 (ja) | 2015-12-04 | 2016-12-27 | 株式会社ガイア・システム・ソリューション | 集中度処理システム |
US10963914B2 (en) * | 2016-06-13 | 2021-03-30 | International Business Machines Corporation | System, method, and recording medium for advertisement remarketing |
US10255885B2 (en) * | 2016-09-07 | 2019-04-09 | Cisco Technology, Inc. | Participant selection bias for a video conferencing display layout based on gaze tracking |
KR20180027917A (ko) | 2016-09-07 | 2018-03-15 | 삼성전자주식회사 | 디스플레이장치 및 그 제어방법 |
US10382722B1 (en) * | 2017-09-11 | 2019-08-13 | Michael H. Peters | Enhanced video conference management |
KR101968723B1 (ko) * | 2017-10-18 | 2019-04-12 | 네이버 주식회사 | 카메라 이펙트를 제공하는 방법 및 시스템 |
US11017239B2 (en) * | 2018-02-12 | 2021-05-25 | Positive Iq, Llc | Emotive recognition and feedback system |
KR20200107023A (ko) * | 2019-03-05 | 2020-09-16 | 현대자동차주식회사 | 차량 안전주행 지원 장치 및 방법 |
US11410331B2 (en) * | 2019-10-03 | 2022-08-09 | Facebook Technologies, Llc | Systems and methods for video communication using a virtual camera |
- 2020
- 2020-02-03 JP JP2020016175A patent/JP7316664B2/ja active Active
- 2021
- 2021-02-01 US US17/794,153 patent/US20230074113A1/en not_active Abandoned
- 2021-02-01 GB GB2212377.2A patent/GB2607800B/en active Active
- 2021-02-01 WO PCT/JP2021/003558 patent/WO2021157530A1/ja active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012105196A1 (ja) * | 2011-02-04 | 2012-08-09 | パナソニック株式会社 | 関心度推定装置および関心度推定方法 |
JP2019030557A (ja) * | 2017-08-09 | 2019-02-28 | 沖電気工業株式会社 | 提示装置、提示方法、感情推定サーバ、感情推定方法及び感情推定システム |
Also Published As
Publication number | Publication date |
---|---|
GB2607800B (en) | 2024-05-22 |
GB2607800A (en) | 2022-12-14 |
GB202212377D0 (en) | 2022-10-12 |
JP2021125734A (ja) | 2021-08-30 |
US20230074113A1 (en) | 2023-03-09 |
JP7316664B2 (ja) | 2023-07-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6992870B2 (ja) | 情報処理システム、制御方法、およびプログラム | |
US20190332400A1 (en) | System and method for cross-platform sharing of virtual assistants | |
TWI681298B (zh) | 觸碰式通訊系統和方法 | |
JP2018137723A (ja) | 遠隔会議の参加者の資質のフィードバックを提供するための方法およびシステム、コンピューティングデバイス、プログラム | |
US20160078449A1 (en) | Two-Way Interactive Support | |
KR101921926B1 (ko) | 컨텐츠 제공 시스템 및 동작 방법 | |
US20090044112A1 (en) | Animated Digital Assistant | |
KR102415719B1 (ko) | 메타버스에서 가상 상담 환경을 위한 아바타의 상태 정보를 표시하는 메타버스 서버 및 이의 실행 방법 | |
WO2022089192A1 (zh) | 一种互动处理方法、装置、电子设备和存储介质 | |
JP6978141B1 (ja) | ブロックオブジェクトの設計図を生成する方法 | |
US10082928B2 (en) | Providing content to a user based on amount of user contribution | |
WO2013018731A1 (ja) | カウンセリングシステム、カウンセリング装置、及びクライアント端末 | |
WO2013094065A1 (ja) | 判定装置、判定方法及び判定プログラム | |
CN113923515A (zh) | 视频制作方法、装置、电子设备及存储介质 | |
WO2021157530A1 (ja) | 対話ユーザの感情情報の提供装置 | |
JP6787973B2 (ja) | 質問に対する回答を支援するためのシステム、方法、及びプログラム | |
KR20130122300A (ko) | 통화 중 감정 분석 서비스 제공 방법 및 장치 | |
US20170277412A1 (en) | Method for use of virtual reality in a contact center environment | |
KR20130015472A (ko) | 디스플레이장치, 그 제어방법 및 서버 | |
Huang et al. | Computer-Supported Collaboration: Theory and Practice | |
KR20010089005A (ko) | 캐릭터를 이용한 인터넷 포탈 서비스 시스템 | |
CN113157241A (zh) | 交互设备、交互装置及交互系统 | |
JP2021135426A (ja) | オンライン会話支援方法 | |
US20220076671A1 (en) | Information processing terminal, information processing apparatus, and information processing method | |
CN115131547A (zh) | Vr/ar设备截取图像的方法、装置及系统 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21750280; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 202212377; Country of ref document: GB; Kind code of ref document: A; Free format text: PCT FILING DATE = 20210201 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21750280; Country of ref document: EP; Kind code of ref document: A1 |