WO2006096776A2 - Systems and methods for teleportation in a virtual environment - Google Patents

Systems and methods for teleportation in a virtual environment

Info

Publication number
WO2006096776A2
WO2006096776A2 (PCT/US2006/008264)
Authority
WO
WIPO (PCT)
Prior art keywords
user
virtual environment
teleportation
directional input
create
Prior art date
Application number
PCT/US2006/008264
Other languages
English (en)
Other versions
WO2006096776A3 (fr)
Inventor
Leonidas Deligiannidis
Original Assignee
The University Of Georgia Research Foundation, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The University Of Georgia Research Foundation, Inc.
Priority to US11/816,968 priority Critical patent/US20080153591A1/en
Publication of WO2006096776A2 publication Critical patent/WO2006096776A2/fr
Publication of WO2006096776A3 publication Critical patent/WO2006096776A3/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • the present disclosure is generally related to virtual technology and, more particularly, is related to systems and methods for providing user interaction in a virtual environment.
  • Immersive Virtual Environments (IVEs)
  • sequences of gestures to make virtual environmental changes such as a direction or
  • Embodiments of the present disclosure provide a system and method for teleportation in a virtual environment. Briefly described, one embodiment of the system comprises a head mounted display configured to present an immersive virtual environment; a teleportation device configured to provide navigation in the virtual environment; at least one feedback device configured to provide the user information related to movement of the teleportation device within the virtual environment; a plurality of input devices configured to produce a plurality of input signals responsive to user inputs; and a computing device configured to receive the plurality of input signals and control the feedback device(s).
  • Embodiments of the present disclosure can also be viewed as methods for
  • FIG. 1 is a schematic diagram of an embodiment of a system for teleportation in a virtual environment.
  • FIG. 2 is a schematic diagram of an alternative embodiment of a system for teleportation in a virtual environment.
  • FIG. 3 is a schematic diagram illustrating a top view of an embodiment of a system for teleportation in a virtual environment.
  • FIG. 4 is a schematic diagram illustrating a top view of an embodiment of a teleportation device showing exemplary inputs to a directional input component.
  • FIG. 5 is a schematic diagram illustrating a partial front view of a system for teleportation in a virtual environment.
  • FIG. 6 is a schematic diagram illustrating a side view of an embodiment of a
  • teleportation device showing exemplary inputs to a directional input component.
  • FIG. 7 is a schematic diagram illustrating a side view of an alternative embodiment of a teleportation device.
  • FIG. 8 is a functional block diagram illustrating an embodiment of a control
  • FIG. 9 is a block diagram illustrating an embodiment of an architecture for
  • FIG. 10 is a block diagram illustrating an embodiment of a method for
  • FIG. 1 is a schematic diagram of an embodiment of a system 100 for teleportation in an immersive virtual environment.
  • An immersive virtual environment includes multiple sources of feedback for a user to create the sensation that the user is fully immersed in the virtual environment.
  • system 100 includes a teleportation device 104 that provides for general purpose
  • the navigation activities can include, for example
  • a user 108 can rotate himself/herself and the teleportation device 104
  • the system 100 also includes a computing device 102, which can include a
  • the computing device 102 is configured to provide data to a head mounted display 114.
  • the head mounted display 114 is configured to communicate video and audio signals to a user 108 using one or more
  • the computing device 102 is also configured
  • FIG. 1 includes user position sensors 112 at the user's head and hands.
  • the user position sensors 112 can also be used to provide orientation data to the computing device 102.
  • the computing device 102 is also configured to receive position and
  • the computing device can render the virtual environment based on the position and orientation of the teleportation device 104.
  • the teleportation device 104 includes a base 118 configured to optionally support all or a portion of the user 108.
  • the base 118 is attached to a directional input
  • the moveable coupling 120 of this embodiment includes one or more springs configured in modes of compression,
  • the teleportation device 104 also includes a vibratory feedback device 106
  • the vibratory feedback device 106 is used to deliver sound and/or vibration to the user 108 to simulate
  • the vibratory feedback device 106 may be configured to operate at a low frequency and output level when the teleportation device 104 is moving through the virtual environment at a slow speed. Accordingly, the output level and frequency might be increased as the speed of the teleportation device 104 is increased.
  • In some embodiments, the vibratory feedback device 106 can be configured as a subwoofer speaker, for example. Alternatively, or in addition to the subwoofer, the vibratory feedback device 106 can be implemented as vibrotactile devices mounted at a variety of points on the teleportation device 104.
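As a rough sketch, the speed-dependent behavior described above can be expressed as a linear mapping from teleportation speed to vibration frequency and output level. The function name, speed scale, and frequency/level ranges below are illustrative assumptions, not values from the disclosure:

```python
def vibration_params(speed, max_speed=10.0,
                     freq_range=(20.0, 80.0), level_range=(0.2, 1.0)):
    """Map teleportation speed to a vibration frequency (Hz) and an
    output level (0..1), scaling both linearly with speed and
    clamping at max_speed. All ranges here are illustrative."""
    ratio = max(0.0, min(speed / max_speed, 1.0))
    freq = freq_range[0] + ratio * (freq_range[1] - freq_range[0])
    level = level_range[0] + ratio * (level_range[1] - level_range[0])
    return freq, level
```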
  • FIG. 2 is a schematic diagram of an alternative embodiment of a system 122 for teleportation in a virtual environment. In addition to the components of the system 100 described above in reference to FIG. 1, the system 122 also includes a position interface 124, configured to communicate with the position sensors 112, 116. Communication between the position interface
  • the position interface 124 also includes a 3-D tracker. The 3-D tracker reports the position and orientation of each of the position sensors 112, 116 to the computing device 102.
  • the system 122 also includes one or more fans 128 for generating a wind simulation.
  • the fan or fans 128 can be controlled by the computing device 102
  • the output device controller 130 can
  • the system 122 can also optionally include a status interface system 125
  • the status interface system 125 can be implemented to replace or supplement either or both of the position interface 124 and the output
  • the status interface system 125 includes the
  • the status interface system 125 may be implemented in separate units, or as a single unit (e.g., with two cards in it, one corresponding to the switching action function of a relay controller and the other having functionality to detect button presses and releases).
  • the status interface system 125 when implemented as a single unit, may
  • ADC analog-to-digital conversion
  • DAC digital-to-analog conversion
  • FIG. 3 is a schematic diagram illustrating a
  • the system 138 includes a teleportation device 104 having a base 118 and a directional input component 110, also referred to as a steering wheel or handle bar.
  • the teleportation device 104 includes a vibratory feedback device 106 and one or
  • the user interface devices can include
  • Switches and buttons among others.
  • Alternative embodiments may include user
  • the user interface devices can include an UP button 140 and a DOWN button 142 for causing the teleportation device 104 to move up or down within the virtual environment.
  • the UP and DOWN functions could be combined into
  • User interface devices can also be implemented as a STOP 144 button configured to cause
  • a FLY/DRIVE switch 150 is also included.
  • the FLY/DRIVE switch 150 can be toggled between a
  • an INC button 154 and a DEC button 156 configured to cause the teleportation device to increase speed or decrease speed, respectively.
  • like the UP and DOWN functions, the INC and DEC functions can alternatively be combined into a multiple position switch such as a toggle switch.
  • embodiments can include throttle and/or handbrake structures that can generate, for example, analog signals to increase or decrease the speed, respectively.
  • the analog signals from a throttle and/or a handbrake may be processed using, for example, analog-to-digital conversion (ADC).
  • a throttle and/or handbrake can also be configured to generate digital signals.
  • devices providing a
  • quadrature pulse output in conjunction with a counter can be used for increasing and
  • LIGHTS button 148 for adjusting the lighting levels in the virtual environment.
  • Some embodiments may feature a simple on and off control for the lighting.
  • Other embodiments may include
  • the teleportation device 104 can also include a DEBUG button 146 configured
  • a user may experience a situation where he or she cannot move in
  • a user can activate the DEBUG
  • the teleportation device 104 can also include a JUMP button 152
  • the system 138 also includes an example arrangement of fans 128.
  • the fans 128 used independently or in selective combination can be used to simulate wind that
  • the corresponding fan 128 would be activated to simulate wind commensurate with that motion. Also, when the
  • the teleportation device 104 can detect which direction the teleportation device 104 is facing and operate one or more fans 128 corresponding to movement in the new direction.
  • FIG. 4 is a schematic diagram
  • the teleportation device 104 includes a base 118 moveably coupled to a directional input component 110.
  • when the user rotates the directional input component 110 counter-clockwise, the teleportation device 104 will turn to the left in the virtual environment. To cause an upward movement of the teleportation device 104 in the virtual environment, the directional input component 110 is pulled or tilted towards the user. Similarly, to cause a downward movement, the directional input component 110 is pushed or tilted away from the user.
  • embodiments may use a directional input component 110 mounted to a telescopic shaft where the up and down motions are accomplished by manipulating the directional input component in a substantially vertical up and down motion.
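The steering behavior described for FIG. 4 can be sketched as a small mapping from handlebar input to motion commands. The sign conventions, function name, and the FLY/DRIVE handling below are assumptions made for illustration:

```python
def steering_to_motion(twist, tilt, fly_mode=True):
    """Convert handlebar twist (radians, + = counter-clockwise) and
    tilt (radians, + = toward the user) into yaw-rate and climb-rate
    commands. Illustrative mapping: CCW twist turns left; pulling the
    bar toward the user ascends, pushing it away descends. Vertical
    motion is ignored when the FLY/DRIVE switch is in DRIVE mode."""
    yaw_rate = twist                            # + -> turn left
    climb_rate = tilt if fly_mode else 0.0      # + -> ascend
    return yaw_rate, climb_rate
```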
  • FIG. 5 is a schematic diagram
  • An arrangement of multiple fans of an embodiment includes an over the head fan 210 for simulating, for example, upward movement in the virtual environment. Similarly, the arrangement includes a right side fan 212 and a left side fan 214 for simulating
  • a left ground fan 218 and a right ground fan 216 can be used to simulate left and right downward movement, respectively.
  • front of face fan 220 can be used to simulate forward motion.
  • Each of the fans can be any type of fan.
  • fans can be used alone or in combination to create
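The fan arrangement of FIG. 5 suggests a simple selection rule: activate the fan(s) that correspond to the components of the movement direction. The coordinate convention (x = right, y = up, z = forward), fan names, and activation threshold below are illustrative assumptions:

```python
def fans_for_motion(dx, dy, dz, threshold=0.1):
    """Return the set of fans to run for a movement direction given
    in the user's frame. Fan names loosely follow FIG. 5; the vector
    convention and threshold are assumptions, not from the patent."""
    fans = set()
    if dz > threshold:
        fans.add("front_of_face")              # forward motion
    if dy > threshold:
        fans.add("overhead")                   # upward motion
    if dx > threshold:
        fans.add("right_side")                 # rightward motion
    elif dx < -threshold:
        fans.add("left_side")                  # leftward motion
    if dy < -threshold:                        # downward motion
        fans.update({"left_ground", "right_ground"})
    return fans
```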
  • FIG. 6 is a schematic diagram illustrating a side view of an embodiment of a teleportation device showing exemplary inputs to a directional input component.
  • the teleportation device 104 includes a base 118
  • the teleportation device 104 to move down, the user 108 pushes or tilts the directional input component 110 away from himself/herself. Similarly, to direct the teleportation
  • the user 108 pulls or tilts the directional input device 110
  • Alternative embodiments can feature a telescopic arrangement such that the directional input device is moved substantially vertically up
  • FIG. 7 is a schematic diagram illustrating a side view of an alternative embodiment of a teleportation device.
  • the teleportation device 104 includes a base 118 attached to a directional input component 110 through a moveable coupling 120.
  • the moveable coupling 120 of this embodiment is a spring. Additional springs 230 are included to provide force feedback through additional
  • FIG. 8 is a functional block diagram illustrating an embodiment of a control arrangement for a teleportation system as
  • the computer 160 (herein, computer or host computer) communicates with a 3-D tracker 162, a fan/relay controller 176, an eye tracking controller 182, and the status interface 166. Note that in some embodiments, fewer or
  • the 3-D tracker 162 provides the position and orientation of the user's head, hands, the teleportation device, etc. to perform the following functionality:
  • the teleportation system comprises a physics component 196 to simulate gravity, so that the user stays on the ground rather than floating in mid-air
  • the output of the physics component 196 is fed to a vibrator controller 198
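A minimal sketch of the gravity behavior attributed to the physics component 196 might look like the following; the constants, the FLY/DRIVE flag, and the ground-plane clamp are assumptions for illustration, not details from the disclosure:

```python
def apply_gravity(y, vy, dt, g=9.8, ground=0.0, fly_mode=False):
    """One integration step of a toy gravity model: in DRIVE mode the
    user is pulled down and clamped at the ground plane; in FLY mode
    gravity is suspended so free vertical movement is possible."""
    if fly_mode:
        return y, vy
    vy -= g * dt          # accelerate downward
    y += vy * dt          # integrate position
    if y <= ground:       # landed: clamp to the ground plane
        y, vy = ground, 0.0
    return y, vy
```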
  • the graphics generator 190 may retrieve environment data from an environment storage 192.
  • the head mounted display 202 is used and comprises
  • the headphones can be used to hear things or
  • teleportation system can simulate circumstances such as when a user collides with another object by activating one or more vibration units 200 to provide tactile
  • the host computer 160 controls the wind generator units 180 (on/off) and their speed (how much air they
  • the wind generator units 180 can be driven through a fan speed controller 178
  • the host computer 160 also drives a sound generator 172 that simulates the noise generated by the teleportation device and can also serve as a secondary vibration mechanism.
  • the sound generator 172 can be used to drive sound output units 174 using data in a sound data storing unit 170.
  • the status interface 166 can use a switch polling facility 168 to detect button presses
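Switch polling of the kind attributed to the switch polling facility 168 is commonly implemented by diffing successive input words. The bitmask representation and names below are illustrative assumptions:

```python
def poll_switches(read_bits, previous):
    """One pass of a switch-polling loop: compare the current digital
    input word against the previous one and report presses and
    releases per button. `read_bits` is a hypothetical callable that
    returns the current switch states packed into an int bitmask."""
    current = read_bits()
    changed = current ^ previous
    pressed = changed & current       # bits that went 0 -> 1
    released = changed & previous     # bits that went 1 -> 0
    return current, pressed, released
```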
  • the teleportation system may also
  • the eye tracking controller 182 can communicate with a speech recognizer 164 that recognizes commands that a user verbally issues.
  • a head mounted display 202 comprises a camera that tracks the user's eye.
  • the eye tracking controller 182 determines the coordinates of the eye and further determines what the user is observing in the virtual environment. Such a feature may be useful in games. For example, as a missile from the enemy is coming at the user,
  • the user can look at the missile and press a button located on the teleportation device
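The gaze hit test performed by the eye tracking controller 182 could be approximated by choosing the object whose direction from the eye makes the smallest angle with the gaze ray. This is a simplified stand-in: the names, the unit-length assumption for the gaze direction, and the angular threshold are all assumptions:

```python
import math

def gazed_object(eye_pos, gaze_dir, objects, max_angle=0.1):
    """Return the name of the object nearest the gaze ray, or None.
    `gaze_dir` is assumed to be a unit vector; `objects` maps names
    to 3-D positions. Angle threshold is in radians (illustrative)."""
    best, best_angle = None, max_angle
    for name, pos in objects.items():
        v = [p - e for p, e in zip(pos, eye_pos)]
        norm = math.sqrt(sum(c * c for c in v))
        if norm == 0:
            continue  # object at the eye itself: skip
        cos_a = sum(a * b for a, b in zip(gaze_dir, v)) / norm
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        if angle < best_angle:
            best, best_angle = name, angle
    return best
```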
  • FIG. 9 is a block diagram illustrating an embodiment of an architecture for
  • the control computer generally includes a
  • the local interface 244 may be, for example, one or more buses or other wired or wireless connections.
  • the local interface 244 may have additional elements such as
  • the local interface 244 may include address, control, and/or
  • the processor 240 is a hardware device for executing software, particularly that which is stored in memory.
  • the processor 240 may be any custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the processing device, a
  • semiconductor-based microprocessor in the form of a microchip or chip set
  • the memory 242 may include any one or combination of volatile memory
  • random access memory (RAM)
  • the memory 242 may incorporate electronic, magnetic, optical, and/or other types of storage media. Note that the memory 242 may
  • the software in memory 242 may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing
  • the software in the memory 242 includes control software 246 for providing one or more of the functionality shown in FIG. 8 according to an embodiment.
  • memory 242 may also comprise a suitable operating system (O/S) 248.
  • the operating system 248 essentially controls the execution of other computer programs, such as the control software, and provides scheduling, input-output control, file and data management, memory management, and communication control and related services.
  • the control software 246 is a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed.
  • the control software 246 can be implemented, in one embodiment, as a distributed
  • the modules can be accessed by one or more applications or programs or components thereof.
  • control software 246 can be implemented as a single module with all of the
  • control software 246 is a
  • control software 246 can be written with (a) an object oriented programming language, which has classes of data and methods, or (b) a procedure programming language, which has
  • the I/O devices 250 may include input devices such as, for example, a
  • the I/O devices 250 may also include output devices such as, for example, a printer, display, audio
  • the I/O devices 250 may further include devices that communicate both inputs and outputs such as, for instance, a modulator/demodulator (modem for accessing another device, system, or network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router,
  • a modulator/demodulator (modem) for accessing another device, system, or network
  • radio frequency (RF)
  • the processor 240 is configured to
  • control software 246 and the operating system 248, in whole or in part, but typically the latter, are read by the processor 240, perhaps buffered within the processor 240, and then executed.
  • control software 246 can be stored on any computer-readable medium for use by or in connection with any computer-related
  • a computer-readable medium is an
  • control software 246 can be embodied in any computer-readable
  • control software 246 is implemented in hardware, or as a combination of software and hardware, the functionality of the control software 246 can be implemented with any or a
  • application specific integrated circuit (ASIC)
  • programmable gate array (PGA)
  • field-programmable gate array (FPGA)
  • FIG. 10 is a block diagram illustrating an embodiment of a method 300 for providing teleportation in a virtual environment.
  • the method 300 includes the step of delivering a video signal to the user in block 310.
  • the video signal may be delivered using, for example, one or more displays
  • the method 300 also includes the step of delivering an audio signal to a user in block 320.
  • the audio signal can be delivered through, for example, headphones or speakers.
  • the method 300 also includes the step of receiving position inputs relating to
  • the three-dimensional position and orientation of the hands and head of the user can serve to ensure that the user's position and video signal correspond to the virtual
  • the computer controlling the virtual environment can also receive three-dimensional position and orientation data for the teleportation device.
  • a user is provided vibratory feedback in block 340.
  • a user can experience the sounds and vibrations corresponding to
  • the air is directed at varying rates and from different directions to
  • Air can be delivered using one or more wind generation devices including, for example, fans or blowers.
  • Each wind generation device can be driven independently or in combination at one or more preset speeds or at any speed over a range of speeds.
  • Controlling the wind generation units can be accomplished using relays, electronic speed controllers, electronic motor drives, or any combination thereof.
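Putting the steps of method 300 together, one pass (for example, per rendered frame) might be orchestrated as follows. All five collaborators are hypothetical callables standing in for the subsystems described above:

```python
def teleportation_frame(sensors, renderer, audio, vibrator, fans):
    """One illustrative pass through the steps of FIG. 10: read
    position inputs, deliver the video and audio signals, then drive
    the vibratory feedback and the wind simulation."""
    pose = sensors()                         # block 330: head/hand/device pose
    renderer(pose)                           # block 310: deliver video signal
    audio(pose)                              # block 320: deliver audio signal
    vibrator(pose["speed"])                  # block 340: vibratory feedback
    fans(pose["direction"], pose["speed"])   # wind simulation
    return pose
```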

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Systems and methods for teleportation in a virtual environment are disclosed. One embodiment of such a system can be implemented with a head mounted display configured to present an immersive virtual environment, and a teleportation device configured to provide navigation in the virtual environment, with at least one feedback device configured to provide the user information related to movement of the teleportation device within the virtual environment. The system also comprises a plurality of input devices configured to produce a plurality of input signals responsive to user inputs; and a computing device configured to receive the plurality of input signals and control the feedback device(s).
PCT/US2006/008264 2005-03-07 2006-03-07 Systems and methods for teleportation in a virtual environment WO2006096776A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/816,968 US20080153591A1 (en) 2005-03-07 2006-03-07 Teleportation Systems and Methods in a Virtual Environment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US65928305P 2005-03-07 2005-03-07
US60/659,283 2005-03-07

Publications (2)

Publication Number Publication Date
WO2006096776A2 true WO2006096776A2 (fr) 2006-09-14
WO2006096776A3 WO2006096776A3 (fr) 2007-08-16

Family

ID=36954010

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/008264 WO2006096776A2 (fr) Systems and methods for teleportation in a virtual environment

Country Status (2)

Country Link
US (1) US20080153591A1 (fr)
WO (1) WO2006096776A2 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE202016103302U1 (de) 2016-06-22 2016-07-11 Stefan Zimmermann Für eine Virtual-Reality-Brille bestimmte Führung einer elektrischen Leitung
CN110335511A (zh) * 2019-05-30 2019-10-15 桂林蓝港科技有限公司 一种学生端虚拟现实头戴式显示设备控制系统及方法

Families Citing this family (16)

Publication number Priority date Publication date Assignee Title
EP2038026A1 (fr) * 2006-06-19 2009-03-25 AMBX UK Limited Améliorateur de jeux
US9256347B2 (en) * 2009-09-29 2016-02-09 International Business Machines Corporation Routing a teleportation request based on compatibility with user contexts
US9254438B2 (en) 2009-09-29 2016-02-09 International Business Machines Corporation Apparatus and method to transition between a media presentation and a virtual environment
KR101926477B1 (ko) * 2011-07-18 2018-12-11 삼성전자 주식회사 콘텐츠 재생 방법 및 장치
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
US9393490B2 (en) * 2014-04-14 2016-07-19 International Business Machines Corporation Simulation based on audio signals
US10628186B2 (en) * 2014-09-08 2020-04-21 Wirepath Home Systems, Llc Method for electronic device virtualization and management
DE102014013961A1 (de) 2014-09-19 2016-03-24 Audi Ag Virtual-Reality-Brille, System mit einer Virtual-Reality-Brille und Verfahren zum Betreiben einer Virtual-Reality-Brille
US10768704B2 (en) * 2015-03-17 2020-09-08 Whirlwind VR, Inc. System and method for modulating a peripheral device based on an unscripted feed using computer vision
US10466790B2 (en) * 2015-03-17 2019-11-05 Whirlwind VR, Inc. System and method for processing an audio and video input in a point of view program for haptic delivery
US10825350B2 (en) * 2017-03-28 2020-11-03 Wichita State University Virtual reality driver training and assessment system
US10777008B2 (en) 2017-08-31 2020-09-15 Disney Enterprises, Inc. Drones generating various air flow effects around a virtual reality or augmented reality user
US10898798B2 (en) * 2017-12-26 2021-01-26 Disney Enterprises, Inc. Directed wind effect for AR/VR experience
KR20190122546A (ko) * 2018-04-20 2019-10-30 한국과학기술원 가상현실 또는 증강현실에서의 운동감각을 구현하는 웨어러블 장치 및 그 제어방법
US20220154964A1 (en) * 2020-11-16 2022-05-19 Mumarba LLC Artificial Breeze System

Citations (1)

Publication number Priority date Publication date Assignee Title
US6184847B1 (en) * 1998-09-22 2001-02-06 Vega Vista, Inc. Intuitive control of portable data displays

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
US7098891B1 (en) * 1992-09-18 2006-08-29 Pryor Timothy R Method for providing human input to a computer
US6591250B1 (en) * 1998-02-23 2003-07-08 Genetic Anomalies, Inc. System and method for managing virtual property
US6972734B1 (en) * 1999-06-11 2005-12-06 Canon Kabushiki Kaisha Mixed reality apparatus and mixed reality presentation method
US20010041328A1 (en) * 2000-05-11 2001-11-15 Fisher Samuel Heyward Foreign language immersion simulation process and apparatus
US6952716B1 (en) * 2000-07-12 2005-10-04 Treehouse Solutions, Inc. Method and system for presenting data over a network based on network user choices and collecting real-time data related to said choices
US6884170B2 (en) * 2001-09-27 2005-04-26 Igt Method and apparatus for graphically portraying gaming environment and information regarding components thereof
JP2004329463A (ja) * 2003-05-06 2004-11-25 Nintendo Co Ltd ゲーム装置および仮想カメラの制御プログラム
US7828657B2 (en) * 2003-05-20 2010-11-09 Turbine, Inc. System and method for enhancing the experience of participant in a massively multiplayer game
US7584082B2 (en) * 2003-08-07 2009-09-01 The Mathworks, Inc. Synchronization and data review system
CN1950133A (zh) * 2004-05-10 2007-04-18 世嘉股份有限公司 电子游戏装置、其数据处理方法和程序、及记录介质

Also Published As

Publication number Publication date
WO2006096776A3 (fr) 2007-08-16
US20080153591A1 (en) 2008-06-26

Similar Documents

Publication Publication Date Title
US20080153591A1 (en) Teleportation Systems and Methods in a Virtual Environment
US7918732B2 (en) Manifold compatibility electronic omni axis human interface
US6864877B2 (en) Directional tactile feedback for haptic feedback interface devices
US6147674A (en) Method and apparatus for designing force sensations in force feedback computer applications
US10322336B2 (en) Haptic braille output for a game controller
JP4441179B2 (ja) 玩具用触覚リモートコントロール装置
US5803738A (en) Apparatus for robotic force simulation
US7209117B2 (en) Method and apparatus for streaming force values to a force feedback device
JP2020030845A (ja) 没入型環境における非コロケートな触覚キュー
US20090325699A1 (en) Interfacing with virtual reality
Rahman et al. Motion-path based in car gesture control of the multimedia devices
WO2005050427A1 (fr) Systeme et procede d'affichage d'informations de detection de force tactile
US20130308243A1 (en) Magnetically Movable Objects Over a Display of an Electronic Device
JP2003199974A6 (ja) 触覚フィードバックインターフェースデバイス用の方向接触フィードバック
CN104423595A (zh) 执行触觉转换的系统和方法
JP2010061667A (ja) ホストコンピュータを利用して力フィードバックインタフェースを制御する方法および装置
WO2021240601A1 (fr) Système de sensation corporelle d'espace virtuel
CN201223711Y (zh) 具传感功能的互动式健身器材
US20170348594A1 (en) Device, System, and Method for Motion Feedback Controller
WO2019083751A1 (fr) Sortie en braille haptique pour un dispositif de commande de jeu
Borst et al. Touchpad-driven haptic communication using a palm-sized vibrotactile array with an open-hardware controller design
CN210845261U (zh) 一种多功能沉浸式vr运动平台装置
KR20180105285A (ko) 햅틱 체감 장치 및 시스템
TWI479364B (zh) 具三維磁力觸控反饋之行動裝置及三維磁力觸控反饋裝置
JP2023148749A (ja) 入出力装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 11816968

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 06737435

Country of ref document: EP

Kind code of ref document: A2