WO2006070495A1 - Dispositif d'interface - Google Patents

Dispositif d'interface

Info

Publication number
WO2006070495A1
WO2006070495A1
Authority
WO
WIPO (PCT)
Prior art keywords
main body
interface device
user
pressure
unit
Prior art date
Application number
PCT/JP2005/009412
Other languages
English (en)
Japanese (ja)
Inventor
Masahiko Inami
Naoya Koizumi
Noriyoshi Shimizu
Maki Sugimoto
Hideaki Nii
Original Assignee
Campus Create Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Campus Create Co., Ltd.
Publication of WO2006070495A1

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves

Definitions

  • The present invention relates to an apparatus for providing an interface between a machine such as a computer and an operator.
  • As described in Non-Patent Document 1, space-sharing telecommunication offers person-to-person encounters beyond physical space, and can become a new form of entertainment that creates a sense of personal contact.
  • The concept of an RUI (Robotic User Interface), which uses a robot as an interface, has recently been proposed (Non-Patent Document 2).
  • The conventional RUI is of the so-called holding type, in which the user handles the robot with both hands.
  • The RUI shown in Non-Patent Document 2 (trade name: "RobotPHONE") is a system in which users communicate with one another by remotely synchronizing the shapes of robots.
  • The technology of Non-Patent Document 2 mainly uses a robot as an interface to a computer by synchronizing the robot with a computer-generated CG model (a so-called "avatar").
  • Non-Patent Document 1 Takashi Matsuo; Psychology of Communication, Nakasha Publishing, 1999
  • Non-Patent Document 2: Dairoku Sekiguchi, Masahiko Inami, Susumu Tachi; "Object-Oriented Telexistence Using a Robotic User Interface: Proposal and Experimental Implementation of a Shape-Sharing System", Interactive System and Software VIII, Japan Society for Software Science and Technology, Kindai Kagaku Sha, pp. 51-56, 2000
  • The present invention has been made in view of the above-described situation, and an object thereof is to provide an interface device that enables self-projection onto a main body worn on a user's hand and that presents sensations through the main body.
  • the interface device of the present invention includes a main body and a presentation unit.
  • the main body is configured to cover the user's hand. Further, the main body is configured to be capable of self-projection.
  • the presenting unit is configured to present a sense to the user at the position of the main body corresponding to the position of the user's body.
  • By making the main body itself anthropomorphic, the main body may be configured to be capable of self-projection.
  • the interface device may further include a display unit.
  • This display unit is configured to display an avatar that moves in synchronization with the main body.
  • the main body is configured to be capable of self-projection when the display unit displays the avatar.
  • a communication system of the present invention includes a plurality of interface devices having any of the above-described configurations and a network that enables communication between these interface devices.
  • the interface device further includes a detection unit that detects a sense applied by the operator to the main body.
  • a sense detected by one interface device is transmitted to another interface device via the network and presented by the presentation unit of the other interface device.
  • The communication system of the present embodiment includes a first interface device 1, a second interface device 2, a server 3, and a network 4 that enables mutual communication between them (see FIG. 1).
  • This system may further include additional (that is, third to nth) interface devices (that is, users), but the following description assumes two interface devices.
  • The network 4 in the present embodiment may be, for example, the Internet or a network using another protocol.
  • Since the first interface device 1 and the second interface device 2 are configured in substantially the same manner, the configuration of the first interface device 1 will be described below.
  • The first interface device 1 includes a main body 11, a pressure sense presentation unit 12, a force sense presentation unit 13, a pressure sense detection unit 14, a position detection unit 15, a CPU 16, a display unit 17, and a dynamic simulator 18.
  • In FIG. 1, only the main body 11 and the display unit 17 are shown; the other elements will be described later with reference to FIG. 11.
  • The pressure sense presentation unit 12 and the force sense presentation unit 13 correspond to the presentation unit in the present invention, but a presentation unit that presents other senses can also be used.
  • a detection unit that detects a sense other than pressure sense may be provided.
  • The pressure sense detection unit 14 corresponds to the detection unit in the present invention.
  • the main body 11 of the first interface device 1 is configured to cover the user A's hand (right hand in the illustrated example) 5 (see FIGS. 2 and 3).
  • The main body 11 has a shape that makes the main body 11 itself anthropomorphic; in the illustrated example, it has a shape imitating an animal. Thereby, the main body is configured to be capable of self-projection.
  • being capable of self-projection means “it is possible to project and recognize the user's body on the object”. Therefore, the shape of the main body 11 can be a shape imitating various animals or humans.
  • The thumb and the little finger of the user A's hand 5 are received in the right hand and the left hand of the main body 11, respectively.
  • The remaining three fingers of the hand 5 are received in the head of the main body 11.
  • The main body 11 need only allow self-projection of a part of the user's body; it is not necessary that the entire body be self-projected.
  • The pressure sense presentation unit 12 and the pressure sense detection unit 14 are disposed so as to be stacked over almost the entire inner surface of the main body 11.
  • the pressure sense presentation unit 12 and the pressure sense detection unit 14 may be further arranged so as to cover the index finger, the middle finger, and the ring finger.
  • the pressure sensation presentation unit 12 and the pressure sensation detection unit 14 may be further arranged so as to individually cover the thumb or little finger.
  • An example of the detailed structure of the pressure sense presentation unit 12 and the pressure sense detection unit 14 is shown in FIG. 6.
  • the pressure sense presentation unit 12 includes a coil 121, a magnet 122, and a drive circuit 123.
  • The drive circuit 123 is connected to the CPU 16, and the coil 121 is energized in response to a command from the CPU 16.
  • Thereby, the magnet 122 is moved up and down in FIG. 6.
  • the pressure sensation presentation unit 12 can present pressure sensation to the hand 5 of the user A.
  • Alternatively, the pressure sensation presentation unit 12 may be configured using a piezoelectric element.
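The patent describes the presentation actuator only at the hardware level (coil 121, magnet 122, drive circuit 123 energized on a command from the CPU 16) and gives no control code. The following is a minimal illustrative sketch, not taken from the source, of how a controller might convert a target pressure level into a drive command for such a moving-magnet actuator; the class and function names, the 0-to-1 pressure scale, and the current interface are all assumptions made for illustration.

```python
class CoilDriver:
    """Hypothetical stand-in for the drive circuit 123 (assumed current-command interface)."""

    def __init__(self, max_current_a: float = 0.5):
        self.max_current_a = max_current_a

    def set_current(self, amps: float) -> None:
        # In real hardware this would write to a DAC/PWM peripheral.
        print(f"coil current set to {amps:.3f} A")


def present_pressure(driver: CoilDriver, level: float) -> None:
    """Map a normalized pressure level (0.0-1.0) to a coil current.

    Energizing the coil moves the magnet 122, which presses on the user's
    skin through the inner surface of the main body.
    """
    level = max(0.0, min(1.0, level))          # clamp to the valid range
    driver.set_current(level * driver.max_current_a)


if __name__ == "__main__":
    drv = CoilDriver()
    present_pressure(drv, 0.7)   # present a moderately strong pressure sensation
```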
  • The pressure sense detection unit 14 includes a pressure-sensitive resin 141, an electrode 142, and an amplifier circuit 143.
  • the output of the pressure sensitive resin 141 is input to the CPU 16 via the electrode 142 and the amplifier circuit 143.
  • This makes it possible to acquire the sensation (pressure sensation) produced when the user A's other hand (the left hand in the illustrated example) strokes or touches the main body 11.
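The detection path (pressure-sensitive resin 141, electrode 142, amplifier circuit 143, CPU 16) is likewise described only as hardware. The sketch below is an assumption rather than the patent's implementation: it shows one plausible way to turn a raw amplified sensor reading into a normalized pressure value. The 10-bit ADC range and the calibration constants are invented for the example.

```python
ADC_MAX = 1023          # assumed 10-bit analog-to-digital converter
BASELINE_COUNTS = 40    # assumed no-touch reading after the amplifier circuit

def read_pressure(adc_counts: int) -> float:
    """Convert a raw ADC reading from the pressure-sensitive resin
    into a normalized pressure value in the range 0.0-1.0."""
    span = ADC_MAX - BASELINE_COUNTS
    value = (adc_counts - BASELINE_COUNTS) / span
    return max(0.0, min(1.0, value))

# Example: a touch on the main body produces a mid-range reading.
print(read_pressure(550))   # roughly 0.52
```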
  • The position detection unit 15 includes a potentiometer 151 (see FIGS. 7 and 8), a 3D motion sensor 152 (see FIG. 9), and a small geared motor 153 with a built-in potentiometer.
  • the potentiometer 151 can detect the movement angle of the movable body 151a that changes according to the movement of the finger.
  • the 3D motion sensor 152 can detect the position and inclination angle of the hand 5 (that is, the main body 11) in the space. Since such a sensor is well known, a detailed description of its structure is omitted. In this specification, when simply referred to as “position”, in principle, it is used as a concept including an angle or a posture.
  • Geared motors 153 are arranged on the thumb side and the little finger side of the hand 5, respectively. Since these geared motors 153 have substantially the same structure, only the thumb side will be described.
  • The geared motor 153 drives the movable piece 153a attached to the thumb and can also detect the drive amount. Such geared motors are well known and will not be described in detail.
  • The geared motor 153 also serves as the force sense presentation unit 13.
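The position detection unit 15 thus combines joint-angle sensing (potentiometer 151), whole-hand pose sensing (3D motion sensor 152), and the geared motors 153, which both measure and drive the thumb-side and little-finger-side movable pieces. The patent does not specify a data model for these readings; the sketch below is an assumed representation with invented field names, showing how the combined pose of the main body might be gathered, together with a trivially simple proportional control law for the geared motor acting as the force sense presentation unit.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BodyPose:
    """Assumed snapshot of the main body 11 as seen by the position detection unit 15."""
    position_mm: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])   # from 3D motion sensor 152
    orientation_deg: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])
    finger_angle_deg: float = 0.0        # from potentiometer 151
    thumb_piece_deg: float = 0.0         # from the thumb-side geared motor 153
    little_piece_deg: float = 0.0        # from the little-finger-side geared motor 153

def drive_force_feedback(target_thumb_deg: float, current: BodyPose, gain: float = 0.1) -> float:
    """Assumed proportional command for the thumb-side geared motor 153,
    which doubles as the force sense presentation unit 13."""
    error = target_thumb_deg - current.thumb_piece_deg
    return gain * error   # motor command (arbitrary units)

pose = BodyPose(thumb_piece_deg=20.0)
print(drive_force_feedback(35.0, pose))   # 1.5
```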
  • the display unit 17 includes a CG generator 171, application software 172, and a monitor 173. These operations will also be described later.
  • a dynamic simulator 18 can be provided.
  • the dynamic simulator 18 is designed to perform the following operations.
  • The server 3 acquires data from each interface device via the network 4 and sends control commands to each interface device.
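Neither the transport protocol nor the message layout exchanged between the interface devices and the server 3 is specified in the disclosure. The sketch below is purely an assumed example of how the detected state (pressure values plus the pose of the main body) could be serialized for exchange over the network 4; JSON and every field name are illustrative choices, not part of the patent.

```python
import json
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class SensedState:
    device_id: str
    pressures: List[float]        # normalized readings from the pressure sense detection unit 14
    position_mm: List[float]      # pose of the main body 11 from the position detection unit 15
    orientation_deg: List[float]

def encode(state: SensedState) -> bytes:
    return json.dumps(asdict(state)).encode("utf-8")

def decode(payload: bytes) -> SensedState:
    return SensedState(**json.loads(payload.decode("utf-8")))

msg = SensedState("interface-1", [0.2, 0.8], [10.0, 0.0, 35.0], [0.0, 15.0, 0.0])
assert decode(encode(msg)) == msg   # round-trips losslessly
```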
  • The operation of the system according to the present embodiment will be described with reference mainly to FIG. 12.
  • The first interface device 1 will be mainly described. Since "first" and "second" are merely relative designations, the corresponding description centered on the second interface device 2 is omitted.
  • the main body 11 of the first interface device 1 is moved by the hand 5 of the operator A.
  • the operator A touches the outer surface of the main body 11.
  • the pressure sense detection unit 14 and the position detection unit 15 detect the pressure sense and position of each part of the main body 11.
  • the user can project his / her body (or the partner's body) onto the main body 11 and touch or move the main body 11. Therefore, the pressure sense and position of each part of the main body 11 are information on the main body 11 itself and also information on the body projected on the main body 11.
  • Each detected control data is sent to the CPU 16.
  • the CPU 16 sends these data to the dynamic simulator 18. Therefore, according to the present embodiment, it is possible to input a sense of pressure or the like given to the main body 11 by the operator A to the computer.
  • Step 12-3 The dynamic simulator 18 compares the read control data with the initial data, and corrects the data. The result is sent to the CG generator 171 of the display unit 17.
  • the CG generator 171 uses the corrected data to obtain data for generating an avatar with CG. That is, data conversion is performed.
  • the CG generator 171 generates an avatar as CG and sends the data to the application software 172.
  • the application software 172 calculates the status of the avatar therein.
  • the status of the avatar obtained as a result is sent to the monitor 173 and displayed. That is, the avatar 6 (see FIG. 10) that moves in synchronization with the main body 11 can be displayed.
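The preceding steps describe a pipeline: sensor readings are compared against initial data by the dynamic simulator 18, converted by the CG generator 171 into avatar data, evaluated by the application software 172, and displayed on the monitor 173. The patent gives no equations or code for these steps; the fragment below is an assumed, highly simplified rendering of that flow in which the "correction" is a plain offset against the initial pose and the avatar state is simply the corrected pose.

```python
from typing import Dict, List

def correct_against_initial(raw: Dict[str, List[float]],
                            initial: Dict[str, List[float]]) -> Dict[str, List[float]]:
    """Step 12-3 (assumed form): express each reading relative to the initial data."""
    return {k: [r - i for r, i in zip(raw[k], initial[k])] for k in raw}

def to_avatar_data(corrected: Dict[str, List[float]]) -> Dict[str, List[float]]:
    """Step 12-4 (assumed form): convert corrected sensor data into avatar joint data."""
    # The mapping here is the identity; a real system would remap sensor
    # channels onto the avatar's joints.
    return dict(corrected)

raw = {"head": [5.0, 0.0, 2.0], "right_hand": [1.0, 1.0, 0.0]}
initial = {"head": [0.0, 0.0, 0.0], "right_hand": [0.0, 0.0, 0.0]}
avatar = to_avatar_data(correct_against_initial(raw, initial))
print(avatar)   # data that would go to the application software and the monitor
```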
  • The first interface device 1 transmits the avatar data to the second interface device 2 via the I/F and the network 4.
  • The second interface device 2 receives the avatar data sent in step 12-7 via its I/F.
  • The CG generator of the second interface device 2 generates an avatar from the avatar data and passes it to the application software of the second interface device 2.
  • The application software of the second interface device 2 calculates the state of the avatar within the application, and displays the avatar on the monitor of the second interface device 2 based on the calculation result.
  • Based on the state of the avatar calculated by the application software, the dynamic simulator of the second interface device 2 generates control data for the pressure sense presentation unit and the force sense presentation unit of the second interface device 2. The dynamic simulator then sends this control data to those units, so that the sensations (pressure sense and force sense) are presented to the user B of the second interface device 2.
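On the receiving side, the avatar data drive both the display and the haptic output. The sketch below is an assumed illustration of the last step only: deriving control values for the pressure sense presentation unit and the force sense presentation unit from the received avatar state. The proportional mapping and all names are invented for the example and are not taken from the disclosure.

```python
from typing import Dict, List, Tuple

def haptic_control_from_avatar(avatar: Dict[str, List[float]],
                               pressure_gain: float = 0.05) -> Dict[str, Tuple[float, float]]:
    """For each avatar part, produce (pressure_level, force_command) in 0.0-1.0.

    Assumed rule: the larger the displacement of a part, the stronger the
    sensation presented at the corresponding part of the main body.
    """
    controls = {}
    for part, displacement in avatar.items():
        magnitude = sum(abs(v) for v in displacement)
        level = min(1.0, pressure_gain * magnitude)
        controls[part] = (level, level)   # reuse the same level for the force sense unit
    return controls

avatar_state = {"head": [5.0, 0.0, 2.0], "right_hand": [1.0, 1.0, 0.0]}
print(haptic_control_from_avatar(avatar_state))
# e.g. {'head': (0.35, 0.35), 'right_hand': (0.1, 0.1)}
```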
  • In this way, a sensation such as the pressure sensation given to the main body 11 of the first interface device 1 can be sent to the second interface device 2 via the network 4.
  • In the second interface device 2, it is possible to present the sensation to another operator B (for example, an operator at a remote location) via the pressure sensation presentation unit of its main body.
  • The operator B can perceive the applied stimulus or the position of each part of the main body by self-projection onto the main body.
  • That is, stimulus or position information on the main body can be perceived as information about the body of the user (or of the other party) in accordance with the correspondence relationship described above.
  • the second interface device 2 transmits control data in the second interface device 2 to the first interface device 1 via the I / F and the network 4.
  • the first interface device 1 performs the operations in steps 12-3 to 12-6 described above.
  • Using the data obtained in step 12-6, the first interface device 1 can present pressure and force sensations to the user A via the pressure sense presentation unit 12 and the force sense presentation unit 13 provided in the first interface device 1.
  • In other words, the user A can perceive, by self-projection, the stimulus applied to the main body 11 of the first interface device 1 and the position of each part of the main body 11.
  • the relationship between each part of the main body 11 and the body is as described above.
  • a large number of users can communicate with each other and share a virtual space via a main body or an avatar.
  • space-sharing telecommunications can be performed by presenting physical sensations through the main body.
  • The main body 11 has a shape with a high degree of self-projectability.
  • Since the avatar 6 displayed on the display unit 17 is also given a shape capable of self-projection, the stimulus or position at each part of the main body 11 can be recognized as the stimulus or position at the corresponding part of the user's body. Also, based on this correspondence, it is possible to input position or stimulus information about the body to the computer using the main body 11.
  • In the above-described embodiment, communication is performed between a plurality of interface devices.
  • However, the interface device can also be used alone.
  • In this case, the interface device described above can be used for input to a computer or for output from a computer. Also in this case, input or output self-projected onto the main body is possible.
  • An example of a processing procedure when the first interface device of the present embodiment is used as a computer interface, without assuming a communication system, will be schematically described with reference to FIG. 13.
  • the position of the main body 11 is moved by the operator's hand 5.
  • the operator touches the surface of the main body 11.
  • The pressure sense detection unit 14 and the position detection unit 15 of the main body 11 detect the pressure and the position of each part of the main body 11.
  • The user can touch or move the main body 11 while projecting his or her own body (or the body of the avatar displayed on the display unit 17) onto it. Therefore, the pressure sense and position of each part of the main body 11 are information on the main body 11 itself and also information on the body projected onto the main body 11. The detected control data are sent to the computer.
  • the computer receives these control data (sensor values).
  • the computer compares the received control data with the initial data, and calculates the avatar shape.
  • the computer calculates the presence or absence of interference between the avatar and the VR environment.
  • If there is interference in step 13-5, the reaction force is calculated.
  • If there is no interference in step 13-5, the control values for the pressure sensation presentation unit 12 and the force sense presentation unit 13 are calculated. The same calculation is performed after the reaction force calculation in step 13-6.
  • the computer sends the control value obtained in step 13-7 to the main body 11.
  • the main body 11 receives the sent control value, and controls the pressure sense presentation unit 12 and the force sense presentation unit 13 in accordance with the control value.
  • The computer and the main body 11 repeat the above operations, or end the operation, according to the presence or absence of an end signal.
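These steps describe a closed loop between the main body 11 and the computer: read the sensors, update the avatar, test for interference with the VR environment, compute a reaction force if needed, and send control values back to the presentation units. The patent does not give the interference test or the force law; the loop below is an assumed toy version using a single plane as the virtual obstacle and a linear spring model for the reaction force, with all constants invented for illustration.

```python
def avatar_height(sensor_z_mm: float, initial_z_mm: float) -> float:
    """Steps 13-2/13-3 (assumed): avatar height relative to the initial data."""
    return sensor_z_mm - initial_z_mm

def reaction_force(height_mm: float, floor_mm: float = 0.0, stiffness: float = 0.02) -> float:
    """Steps 13-5/13-6 (assumed): spring-like reaction force when the avatar
    penetrates the virtual floor; zero when there is no interference."""
    penetration = floor_mm - height_mm
    return stiffness * penetration if penetration > 0 else 0.0

def control_value(force: float) -> float:
    """Step 13-7 (assumed): clamp the force into a normalized control value."""
    return max(0.0, min(1.0, force))

for z in (12.0, 3.0, -4.0):                 # three successive sensor readings
    f = reaction_force(avatar_height(z, initial_z_mm=5.0))
    print(control_value(f))                  # value sent to the presentation units in step 13-8
```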
  • In the above description, each interface device is configured to both present and detect sensations, but it may be configured to only present or only detect them. It should be noted that the apparatus and method of the present invention are not limited to the above-described embodiments, and various modifications can be made without departing from the scope of the present invention.
  • Each component described above need not exist as independent hardware as long as it exists as a functional block.
  • As an implementation method, either hardware or computer software may be used.
  • one functional element in the present invention may be realized by a set of a plurality of functional elements, and a plurality of functional elements in the present invention may be realized by one functional element.
  • the functional elements may be arranged at physically separated positions.
  • the functional elements may be connected by a network.
  • FIG. 1 is an explanatory diagram for explaining the overall configuration of a communication system using an interface device according to an embodiment of the present invention.
  • FIG. 2 is an explanatory view showing a state where the first interface device is attached to the hand.
  • FIG. 3 is a rear view of FIG. 2.
  • FIG. 4 is a schematic explanatory diagram for explaining a cross-sectional structure of the main body 11.
  • FIG. 5 is an explanatory view in a state where a part of the main body 11 is peeled away in order to explain the structure of the main body 11.
  • FIG. 6 is a schematic cross-sectional view for explaining the structures of a pressure sense presentation unit and a pressure sense detection unit.
  • FIG. 7 is an explanatory diagram for explaining the structure of a position detection unit.
  • FIG. 8 is an explanatory diagram for explaining the structure of a position detection unit.
  • FIG. 9 is an explanatory diagram for explaining the structures of a position detection unit and force presentation units.
  • FIG. 10 is an explanatory diagram showing a state in which an avatar is displayed by CG.
  • FIG. 11 is a block diagram showing functional elements in the first interface device.
  • FIG. 12 is a flowchart for explaining a procedure for transmitting a sense of pressure or the like using the communication system.
  • FIG. 13 is a flowchart for explaining a procedure in the case of performing an interface with a computer using the interface device of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The problem to be solved by this invention is to provide a device for performing an interface through a main body by enabling self-projection onto the main body worn on a user's hand. In the proposed solution, the main body (11) is formed so as to cover the hand of the user (A). The main body (11) is given an anthropomorphic shape so that self-projection onto it can be performed. A presentation element is provided to present a sensation to the user (A) at the position of the main body corresponding to the position of the body of the user (A). A display element can also display an avatar (6) moving in synchronization with the main body (11). The main body (11) can thus be made capable of self-projection by displaying the avatar (6).
PCT/JP2005/009412 2004-12-28 2005-05-24 Dispositif d'interface WO2006070495A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004-379106 2004-12-28
JP2004379106A JP2006185252A (ja) 2004-12-28 2004-12-28 インタフェース装置

Publications (1)

Publication Number Publication Date
WO2006070495A1 true WO2006070495A1 (fr) 2006-07-06

Family

ID=36614622

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/009412 WO2006070495A1 (fr) 2004-12-28 2005-05-24 Dispositif d'interface

Country Status (2)

Country Link
JP (1) JP2006185252A (fr)
WO (1) WO2006070495A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5201150B2 (ja) * 2007-11-27 2013-06-05 日本電気株式会社 触力覚通信端末装置

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2132650A4 (fr) * 2007-03-01 2010-10-27 Sony Comp Entertainment Us Système et procédé pour communiquer avec un monde virtuel
US8362882B2 (en) 2008-12-10 2013-01-29 Immersion Corporation Method and apparatus for providing Haptic feedback from Haptic textile

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000099240A (ja) * 1998-09-18 2000-04-07 Sony Corp 力覚提示装置
JP2003305670A (ja) * 2002-02-13 2003-10-28 Center For Advanced Science & Technology Incubation Ltd ロボットフォン

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1996021994A1 (fr) * 1995-01-11 1996-07-18 Shaw Christopher D Systeme d'interface tactile
JP4063131B2 (ja) * 2003-04-15 2008-03-19 セイコーエプソン株式会社 画像処理装置及び触覚・力覚提示方法

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000099240A (ja) * 1998-09-18 2000-04-07 Sony Corp 力覚提示装置
JP2003305670A (ja) * 2002-02-13 2003-10-28 Center For Advanced Science & Technology Incubation Ltd ロボットフォン

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KOIZUMI N. ET AL.: "Hand Puppet-gata Robotic User Interface no Kenkyu", ENTERTAINMENT COMPUTING 2004 RONBUNSHU, 20 August 2004 (2004-08-20), pages 106, XP002999590 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5201150B2 (ja) * 2007-11-27 2013-06-05 日本電気株式会社 触力覚通信端末装置

Also Published As

Publication number Publication date
JP2006185252A (ja) 2006-07-13

Similar Documents

Publication Publication Date Title
US10970936B2 (en) Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment
US10564730B2 (en) Non-collocated haptic cues in immersive environments
EP1523725B1 (fr) Dispositif manuel interactif avec un ordinateur
CN103257783B (zh) 用于在移动设备上共享反馈的交互模型
CN104107539B (zh) 具有触感使能触发器的游戏设备
KR100906576B1 (ko) 핸드 햅틱 인터페이스 장치 및 방법
US20210132697A1 (en) Haptic feedback system having two independent actuators
Hummel et al. A lightweight electrotactile feedback device for grasp improvement in immersive virtual environments
US20190294249A1 (en) Systems and methods for haptic feedback in a virtual reality system
KR20200110502A (ko) 햅틱 컨트롤러 및 이를 이용한 햅틱 피드백 제공 시스템 및 방법
US20230142242A1 (en) Device for Intuitive Dexterous Touch and Feel Interaction in Virtual Worlds
Pabon et al. A data-glove with vibro-tactile stimulators for virtual social interaction and rehabilitation
WO2006070495A1 (fr) Dispositif d'interface
WO2024028980A1 (fr) Dispositif d'aide au toucher à distance et procédé d'aide au toucher à distance
CN117251058B (zh) 一种多信息体感交互系统的控制方法
JP7248271B2 (ja) 情報処理装置、ロボットハンド制御システム、及びロボットハンド制御プログラム
JP5342354B2 (ja) 力覚提示装置及び力覚提示プログラム
WO2024030641A1 (fr) Avatar de robot
JP2016197376A (ja) 力覚制御装置及び力覚提示装置
CN117572965A (zh) 一种用于虚拟现实系统的多信息体感交互手套系统
KR20230101498A (ko) 가상현실에 기반하여 감정을 출력하는 방법 및 그 전자장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 05743624

Country of ref document: EP

Kind code of ref document: A1

WWW Wipo information: withdrawn in national office

Ref document number: 5743624

Country of ref document: EP