WO2023199256A3 - Affective response modulation in embodied agents - Google Patents
Affective response modulation in embodied agents
- Publication number
- WO2023199256A3 (application PCT/IB2023/053784)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- response
- empathetic
- emotions
- verbal
- affective response
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
- G10L25/51—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
- G10L25/63—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for estimating an emotional state
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Processing Or Creating Images (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
An affective response system for autonomous agents creates a configurable mapping from emotions perceived in stimuli to appropriate response emotions, as part of an empathetic, autonomously animated system. The capacity to configure the empathetic response allows the easy and intuitive creation of different response styles, conveying desired personality traits and tailoring the emotional performance to specific use cases. The parameters that describe the empathetic response can be modified dynamically to simulate mood swings or changes in state of mind. The input emotions can come from any emotion classification system, including but not limited to an NLP system or a facial emotion analysis system. The outputs can be used to drive verbal or non-verbal behaviors.
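The configurable emotion-to-emotion mapping described in the abstract could be sketched as follows. This is a minimal illustration, not the patent's actual method: the class name, the discrete emotion set, the linear mapping matrix, and the additive mood bias are all assumptions introduced here for clarity.

```python
import numpy as np

# Hypothetical discrete emotion basis; the patent does not specify one.
EMOTIONS = ["joy", "sadness", "anger", "fear", "surprise"]

class AffectiveResponseMapper:
    """Maps perceived emotion intensities to response emotion intensities
    via a configurable matrix, plus a mood bias that can be changed at
    run time to simulate mood swings or changes in state of mind."""

    def __init__(self, mapping=None, mood_bias=None):
        n = len(EMOTIONS)
        # Default mapping: mirror the perceived emotion (pure mimicry).
        self.mapping = np.eye(n) if mapping is None else np.asarray(mapping, float)
        # Additive bias representing the agent's current mood.
        self.mood_bias = np.zeros(n) if mood_bias is None else np.asarray(mood_bias, float)

    def set_mood(self, **bias):
        """Dynamically adjust the mood bias, e.g. set_mood(joy=0.2)."""
        for name, value in bias.items():
            self.mood_bias[EMOTIONS.index(name)] = value

    def respond(self, perceived):
        """Map perceived emotion intensities (dict) to response intensities.
        The input could come from any classifier (NLP, facial analysis)."""
        x = np.array([perceived.get(e, 0.0) for e in EMOTIONS])
        y = np.clip(self.mapping @ x + self.mood_bias, 0.0, 1.0)
        return dict(zip(EMOTIONS, y))

# A "compassionate" personality style: respond to perceived sadness with
# attenuated sadness (empathetic concern) rather than full mirroring.
compassionate = np.eye(len(EMOTIONS))
compassionate[EMOTIONS.index("sadness"), EMOTIONS.index("sadness")] = 0.4
mapper = AffectiveResponseMapper(mapping=compassionate)
response = mapper.respond({"sadness": 1.0})
```

Different personality styles then reduce to different mapping matrices, and mood changes to run-time edits of the bias vector, which matches the abstract's claim that the response parameters can be modified dynamically.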
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NZ78727222 | 2022-04-13 | ||
NZ787272 | 2022-04-13 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2023199256A2 WO2023199256A2 (en) | 2023-10-19 |
WO2023199256A3 true WO2023199256A3 (en) | 2023-12-07 |
Family
ID=88329121
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2023/053784 WO2023199256A2 (en) | 2022-04-13 | 2023-04-13 | Affective response modulation in embodied agents |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2023199256A2 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20130091364A (en) * | 2011-12-26 | 2013-08-19 | 한국생산기술연구원 | Apparatus and method for generating emotion of robot |
JP5729692B2 (en) * | 2011-02-28 | 2015-06-03 | 国立大学法人信州大学 | Robot equipment |
US20190321985A1 (en) * | 2018-04-18 | 2019-10-24 | Korea Institute Of Industrial Technology | Method for learning and embodying human facial expression by robot |
US20200090392A1 (en) * | 2018-09-19 | 2020-03-19 | XRSpace CO., LTD. | Method of Facial Expression Generation with Data Fusion |
US20200349752A1 (en) * | 2013-08-02 | 2020-11-05 | Soul Machines Limited | System for neurobehaviorual animation |
-
2023
- 2023-04-13 WO PCT/IB2023/053784 patent/WO2023199256A2/en unknown
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5729692B2 (en) * | 2011-02-28 | 2015-06-03 | 国立大学法人信州大学 | Robot equipment |
KR20130091364A (en) * | 2011-12-26 | 2013-08-19 | 한국생산기술연구원 | Apparatus and method for generating emotion of robot |
US20200349752A1 (en) * | 2013-08-02 | 2020-11-05 | Soul Machines Limited | System for neurobehaviorual animation |
US20190321985A1 (en) * | 2018-04-18 | 2019-10-24 | Korea Institute Of Industrial Technology | Method for learning and embodying human facial expression by robot |
US20200090392A1 (en) * | 2018-09-19 | 2020-03-19 | XRSpace CO., LTD. | Method of Facial Expression Generation with Data Fusion |
Also Published As
Publication number | Publication date |
---|---|
WO2023199256A2 (en) | 2023-10-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Hoffman et al. | Design and evaluation of a peripheral robotic conversation companion | |
WO2018093770A3 (en) | Generating communicative behaviors for anthropomorphic virtual agents based on user's affect | |
Hegel et al. | Towards a typology of meaningful signals and cues in social robotics | |
Forster | Distraction and mind-wandering under load | |
Santos et al. | From motions to emotions: Can the fundamental emotions be expressed in a robot swarm? | |
Wrede et al. | Appropriate feedback in asymmetric interactions | |
Xu et al. | Effects of a robotic storyteller's moody gestures on storytelling perception | |
Hammer et al. | Investigating politeness strategies and their persuasiveness for a robotic elderly assistant | |
Feldmaier et al. | Evaluation of a RGB-LED-based emotion display for affective agents | |
Lazzeri et al. | Towards a believable social robot | |
Woo et al. | Facial and gestural expression generation for robot partners | |
Basori et al. | Intelligent avatar on E-learning using facial expression and haptic | |
Hosseini et al. | Both “look and feel” matter: Essential factors for robotic companionship | |
Lehmann et al. | Physiologically inspired blinking behavior for a humanoid robot | |
Johal et al. | Towards companion robots behaving with style | |
WO2023199256A3 (en) | Affective response modulation in embodied agents | |
Lee et al. | Developing social robots with empathetic non-verbal cues using large language models | |
DiPaola et al. | A multi-layer artificial intelligence and sensing based affective conversational embodied agent | |
Giambattista et al. | Expression of emotions by a service robot: a pilot study | |
Loth et al. | Understanding social signals: how do we recognize the intentions of others? | |
De Beir et al. | Enhancing nao expression of emotions using pluggable eyebrows | |
Ayanoğlu et al. | Human-Robot Interaction | |
Hernez-Broome | Social intelligence: the new science of human relationships | |
Straßmann et al. | Alexa feels blue and so do i? conversational agents displaying emotions via light modalities | |
Tseng et al. | Designing the personalized nostalgic emotion value of a product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23787917 Country of ref document: EP Kind code of ref document: A2 |