JPWO2022180770A5 - - Google Patents
- Publication number
- JPWO2022180770A5 (application JP2023501941A)
- Authority
- JP
- Japan
- Prior art keywords
- user
- expression
- character
- present
- control unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- detection method — claims: 9
- information processing — claims: 5
- processing method — claims: 2
- sensation — claims: 2
- emotion — claims: 1
- emotional effect — claims: 1
- method — claims: 1
Claims (10)
A program for causing a computer to function as a control unit that controls a device to present a sensation based on hearing, wherein the control unit:
presents the auditory sensation in accordance with a user setting or a detection result from a sensor, and controls the device to present, as a human voice, a character expression representing a predetermined character based on the auditory sensation;
controls the device to present a guidance expression associated with a direction using the predetermined character;
determines a degree of matching between the user's behavior and the guidance expression based on the sensor's detection result of the user's behavior; and
controls the device to change a characteristic of the guidance expression in accordance with the determined degree of matching and to present it auditorily,
the device presenting to the user a character expression of a human voice presented by the user and the device, such that the user builds empathy with the device.
An information processing device comprising a control unit that controls a device to present a sensation based on hearing, wherein the control unit:
presents the auditory sensation in accordance with a user setting or a detection result from a sensor, and controls the device to present, as a human voice, a character expression representing a predetermined character based on the auditory sensation;
controls the device to present a guidance expression associated with a direction using the predetermined character;
determines a degree of matching between the user's behavior and the guidance expression based on the sensor's detection result of the user's behavior; and
controls the device to change a characteristic of the guidance expression in accordance with the determined degree of matching and to present it auditorily,
the device presenting to the user a character expression of a human voice presented by the user and the device, such that the user builds empathy with the device.
An information processing method in an information processing device comprising a control unit that controls a device to present a sensation based on hearing, the method comprising, by the control unit:
presenting the auditory sensation in accordance with a user setting or a detection result from a sensor, and controlling the device to present, as a human voice, a character expression representing a predetermined character based on the auditory sensation;
controlling the device to present a guidance expression associated with a direction using the predetermined character;
determining a degree of matching between the user's behavior and the guidance expression based on the sensor's detection result of the user's behavior; and
controlling the device to change a characteristic of the guidance expression in accordance with the determined degree of matching and to present it auditorily,
the device presenting to the user a character expression of a human voice presented by the user and the device, such that the user builds empathy with the device.
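The claimed processing flow can be illustrated with a minimal sketch: present a direction-associated guidance expression as a character voice, compare the user's sensed behavior against it, and adjust a characteristic of the guidance according to the degree of matching. All names, sensor fields, and characteristic choices below (volume, tempo) are hypothetical; the claims do not specify an implementation.

```python
from dataclasses import dataclass

@dataclass
class GuidanceExpression:
    """Guidance expression associated with a direction (hypothetical model)."""
    direction: str   # e.g. "left", "right"
    volume: float    # characteristics that may be changed per the claims
    tempo: float

def detect_user_direction(sensor_reading: dict) -> str:
    # Stand-in for the sensor-based detection of the user's behavior.
    return sensor_reading.get("heading", "unknown")

def degree_of_matching(user_direction: str, guidance: GuidanceExpression) -> float:
    # Claim step: determine the degree of matching between the user's
    # behavior and the guidance expression (binary here for simplicity).
    return 1.0 if user_direction == guidance.direction else 0.0

def update_guidance(guidance: GuidanceExpression, match: float) -> GuidanceExpression:
    # Claim step: change a characteristic of the guidance expression
    # according to the determined degree of matching.
    if match < 0.5:  # user diverged: make the auditory cue more salient
        guidance.volume = min(1.0, guidance.volume + 0.2)
        guidance.tempo = min(2.0, guidance.tempo + 0.1)
    return guidance

def present_as_character_voice(guidance: GuidanceExpression) -> str:
    # Claim step: present the guidance as a human-voice character expression.
    return (f"[character voice] Please head {guidance.direction} "
            f"(volume={guidance.volume:.1f}, tempo={guidance.tempo:.1f})")

# One iteration of the control loop:
guidance = GuidanceExpression(direction="left", volume=0.5, tempo=1.0)
sensor_reading = {"heading": "right"}  # user walked the wrong way
match = degree_of_matching(detect_user_direction(sensor_reading), guidance)
guidance = update_guidance(guidance, match)
print(present_as_character_voice(guidance))
```

In this sketch a mismatch raises the cue's salience; a real implementation could equally soften or re-phrase the character's voice, since the claims only require that some characteristic change with the degree of matching.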
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/007279 WO2022180770A1 (en) | 2021-02-26 | 2021-02-26 | Program, information processing device, and information processing method |
Publications (2)
Publication Number | Publication Date |
---|---|
JPWO2022180770A1 (en) | 2022-09-01 |
JPWO2022180770A5 (en) | 2024-02-27 |
Family
ID=83048938
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2023501941A (Pending) | JPWO2022180770A1 (en) | 2021-02-26 | 2021-02-26 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2022180770A1 (en) |
WO (1) | WO2022180770A1 (en) |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10240434A (en) * | 1997-02-27 | 1998-09-11 | Matsushita Electric Ind Co Ltd | Command menu selecting method |
JPH11259446A (en) * | 1998-03-12 | 1999-09-24 | Aqueous Reserch:Kk | Agent device |
JP5477740B2 (en) * | 2009-11-02 | 2014-04-23 | 独立行政法人情報通信研究機構 | Multisensory interaction system |
JP5714855B2 (en) * | 2010-09-16 | 2015-05-07 | オリンパス株式会社 | Image generation system, program, and information storage medium |
EP3136055A4 (en) * | 2014-04-21 | 2018-04-11 | Sony Corporation | Communication system, control method, and storage medium |
JP2017181449A (en) * | 2016-03-31 | 2017-10-05 | カシオ計算機株式会社 | Electronic device, route search method, and program |
JP2018100936A (en) * | 2016-12-21 | 2018-06-28 | トヨタ自動車株式会社 | On-vehicle device and route information presentation system |
JP2019082904A (en) * | 2017-10-31 | 2019-05-30 | ソニー株式会社 | Information processor, information processing method and program |
- 2021-02-26: JP application JP2023501941A (published as JPWO2022180770A1) — active, Pending
- 2021-02-26: WO application PCT/JP2021/007279 (published as WO2022180770A1) — active, Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP2019096349A5 (en) | ||
TWI713000B (en) | Online learning assistance method, system, equipment and computer readable recording medium | |
US20160364002A1 (en) | Systems and methods for determining emotions based on user gestures | |
Mead et al. | Perceptual models of human-robot proxemics | |
JP2023085255A5 (en) | ||
US10303436B2 (en) | Assistive apparatus having accelerometer-based accessibility | |
DE60317338D1 (en) | CONTROL SYSTEM FOR A LOAD HANDLING DEVICE | |
Hyrskykari | Utilizing eye movements: Overcoming inaccuracy while tracking the focus of attention during reading | |
KR20160089184A (en) | Apparatus and method for recognizing speech | |
CN107081774B (en) | Robot shakes hands control method and system | |
GB2590473A (en) | Method and apparatus for dynamic human-computer interaction | |
JPWO2022180770A5 (en) | ||
JP2020202575A5 (en) | ||
CN112219234B (en) | Method, medium and system for identifying physiological stress of user of virtual reality environment | |
JP2022118239A5 (en) | ||
JP4947439B2 (en) | Voice guidance device, voice guidance method, voice guidance program | |
JP2000099306A5 (en) | ||
Nolin et al. | Activation cues and force scaling methods for virtual fixtures | |
KR20200134974A (en) | Apparatus and method for controlling image based on user recognition | |
CN111027358A (en) | Dictation and reading method based on writing progress and electronic equipment | |
KR101019655B1 (en) | Apparatus and method having a function of guiding user's controlling behavior | |
US12019993B2 (en) | Systems and methods for short- and long-term dialog management between a robot computing device/digital companion and a user | |
KR101938231B1 (en) | Apparatus and method for estimation of user personality based on accumulated short-term personality character | |
JP2000352992A5 (en) | Voice recognition device and navigation device | |
Zhou et al. | Proposal and validation of an index for the operator’s haptic sensitivity in a master-slave system |