WO2021011208A1 - User input system for mitigating rotational disparity in translation through virtual environments - Google Patents

User input system for mitigating rotational disparity in translation through virtual environments

Info

Publication number
WO2021011208A1
WO2021011208A1 (PCT/US2020/040800, US2020040800W)
Authority
WO
WIPO (PCT)
Prior art keywords
user
virtual reality
support device
input device
input
Prior art date
Application number
PCT/US2020/040800
Other languages
English (en)
Inventor
Christopher Purvis
Gabriel King
Original Assignee
Dreamscape Immersive, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dreamscape Immersive, Inc. filed Critical Dreamscape Immersive, Inc.
Publication of WO2021011208A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality

Definitions

  • the present invention is directed to an input system and method for displaying a field of view within a virtual environment to a user via a headset worn by the user.
  • the input system includes a user-controlled user-support device for supporting a user such that at least a head of the user is trackable by a tracking system of a virtual reality simulator.
  • the user-support device has an axis of rotation generally perpendicular to a floor.
  • the input system also includes a user-controlled input device having an axis that defines a forward direction of translation of the user through a virtual environment displayed by the virtual reality simulator to the user via a headset worn by the user.
  • the user-support device supports the user in a standing position. In other embodiments, the user-support device supports the user in a seated position.
  • the input device is connected to the user-support device, the input device has an appearance of an input, a segment or portion, or an accessory control mechanism, and a control mechanism of the accessory defines the axis that defines the forward direction of translation of the user through the virtual environment displayed by the virtual reality simulator to the user via the headset worn by the user.
  • FIG. 9 is an exemplary input system, according to one aspect.
  • FIG. 10 is an exemplary input system, according to one aspect.
  • FIG. 11 is an illustration of an example computer-readable medium or computer-readable device including processor-executable instructions configured to embody one or more of the provisions set forth herein, according to one aspect.
  • the user can rotate the user-support device about a 360° angle of rotation 20.
  • a position and orientation of the user-support device is tracked by a tracking system (not shown). This can be accomplished using sensors mounted to the user-support device or via tracking elements that are worn by or associated with the user. In other embodiments, cameras are used to determine the rotation of the user supported on the user-support device.
  • the tracking system of the input system may track an orientation of the input device, an orientation of the user’s head, an orientation of the user’s body, etc. Because physical rotation is possible, any rotational disparity that may occur between the physical position, physical orientation, and virtual orientation can be mitigated, thereby reducing motion sickness for many users.
  • the virtual reality simulator (which may include a processor 1216 and a memory 1218) may generate a determination, based on the tracked features, as to whether to render a change in speed or velocity, a change in the field of view, or a change in orientation for the user with respect to the user’s virtual reality experience. For example, if the user’s head is turned to the left and the user reacts to a stimulus within the virtual reality environment by providing an input to the input device, that input may be disregarded by the virtual reality simulator based on a determination that the input is incidental.
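One way such an incidental-input determination could work is a simple heading comparison between the tracked head and the input device. The function name, the yaw convention, and the 45° threshold below are illustrative assumptions, not taken from the patent.

```python
def should_apply_input(head_yaw_deg: float, device_yaw_deg: float,
                       threshold_deg: float = 45.0) -> bool:
    """Treat an input as deliberate only if the user's head is roughly
    aligned with the input device; otherwise flag it as incidental."""
    # Smallest angular difference between the two headings, folded into [0, 180]
    diff = abs((head_yaw_deg - device_yaw_deg + 180.0) % 360.0 - 180.0)
    return diff <= threshold_deg
```

With this sketch, an input made while the head is turned 90° away from the device would be disregarded, while a small glance to the side would not.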
  • a user-controlled input device is attached to the user-support device.
  • the input device is attached to the user-support device on the support post 40.
  • a frontal section 75 of the input device is connected to a rear section of the input device at a section 85, which has a rotational axis 90.
  • the input device is connected to the user-support device, according to this aspect.
  • the frontal section 75 of the input device is selectively positionable by the user between an elevated position 100 and a lowered position 110.
  • the input device is connected to the user-support device and may include a handle.
  • the frontal section 75 of the input device can act as a control mechanism and define the axis coinciding with the forward direction of translation of the user through the virtual environment displayed by the virtual reality simulator to the user via the headset worn by the user.
  • the frontal section 75 of the input device can act as an input device that controls direction, speed, etc.
  • the input device may be an unattached control for user input, such as a wheel, reins, a separate input device, tracked device, etc.
  • the speed of the user translating through the virtual environment decreases.
  • the speed of the user translating through the virtual environment increases.
  • the user is able to control both a speed and a direction of the user through the virtual environment.
  • a speed of the user in the forward direction of translation through the virtual environment displayed by the virtual reality simulator to the user via the headset worn by the user can be defined by an orientation of the input device (e.g., based on whether the frontal section 75 of the input device is in the elevated position 100 or the lowered position 110, although other positions may be possible, according to other aspects).
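A minimal sketch of this orientation-to-speed mapping could interpolate linearly between the lowered and elevated positions. The angle range, the maximum speed, and the assumption that the elevated position yields the higher speed are all illustrative; the patent does not specify them.

```python
def speed_from_elevation(angle_deg: float,
                         lowered_deg: float = -30.0,
                         elevated_deg: float = 30.0,
                         max_speed: float = 3.0) -> float:
    """Map the frontal section's elevation angle to a forward speed:
    lowered position -> 0, elevated position -> max_speed."""
    # Clamp to the mechanical travel, then interpolate linearly
    angle = max(lowered_deg, min(elevated_deg, angle_deg))
    return max_speed * (angle - lowered_deg) / (elevated_deg - lowered_deg)
```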
  • the frontal section 75 of the input device can further receive additional inputs from the user, such as pressure inputs or steering inputs (e.g., left or right movements or pressures), which may be utilized by the virtual reality simulator to make adjustments to the user’s virtual reality experience.
  • the orientation of the input device is tracked by the tracking system of the virtual reality simulator or other sensors of the input system.
  • the input device comprises a tracking module 70, which is mounted on the frontal section of the input device.
  • the tracking module is trackable by the tracking system.
  • An axis defined by the longitudinal extension of the front section of the input device defines a forward direction 80 in the virtual environment.
  • the user is able to rotate the user-support device manually (e.g., by using the user's legs to press against the floor and generate rotation).
  • sensors mounted in the foot pegs sense pressure applied by the user, which causes motors to rotate the user-support device on which the user is seated.
  • the object of the invention is a system and method of gathering user input for free-roaming translation in virtual environments that is associated with a reduced incidence of motion sickness for affected individuals.
  • the invention comprises a user-support device that allows the user to rotate physically (such as a freely rotating seat where the axis of rotation is generally perpendicular to a level floor) while the user's head (and optionally the user's body, hands, feet, or other body parts) are being monitored by a tracking system. Tracking of the user's head and other body parts can be accomplished via any tracking methodology known in the art of virtual reality simulations (inside-out or outside-in tracking paradigms both apply).
  • a commercially-available performance capture system consisting of an array of cameras which monitor a collection of identifiable objects or points can be utilized to provide the tracking data.
  • the user’s head, at minimum, can be tracked sufficiently to provide the simulation computer with position and orientation data at a frequency of at least 60 Hz.
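The 60 Hz requirement can be checked against the stream of tracking timestamps. This helper is an illustrative sketch, not part of the patent.

```python
def meets_min_rate(timestamps_s, min_hz: float = 60.0) -> bool:
    """Return True if every gap between consecutive tracking samples
    (timestamps in seconds) is short enough to sustain min_hz."""
    max_gap = 1.0 / min_hz
    return all(b - a <= max_gap + 1e-9
               for a, b in zip(timestamps_s, timestamps_s[1:]))
```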
  • the system comprises a user-controlled input device (a joystick, yoke, or other similar device) which is manipulated by the user to control speed and direction of the user’s virtual reality experience.
  • Input from this device can be obtained via the tracking system if it is a tracked object, or via direct connection to the simulation system.
  • the input device can also be tracked in a manner that provides similar data to the simulation system as is collected for the user’s head (e.g., position and orientation).
  • the input device can define a forward direction through the virtual environment. It is along the forward direction defined by the input device that the user’s virtual translation path will follow.
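Translation along the device-defined forward direction can be sketched as follows. The yaw convention (0° pointing along +y) and the function name are illustrative assumptions.

```python
import math

def step_position(x: float, y: float, device_yaw_deg: float,
                  speed: float, dt: float):
    """Advance the user's virtual coordinates one frame along the forward
    direction defined by the input device's yaw (0 deg = +y)."""
    yaw = math.radians(device_yaw_deg)
    return (x + speed * dt * math.sin(yaw),
            y + speed * dt * math.cos(yaw))
```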
  • the input device is affixed to the rotating mechanism.
  • the input device is separate, and is held by the user while they are seated or standing on the rotating mechanism.
  • the method of operation is that the user can physically rotate their body (e.g., via the rotating mechanism) when they desire to rotate their representation in the virtual environment for the purpose of changing direction or heading.
  • the method of rotation can be user-powered (e.g., with the user’s feet) or motorized (e.g., following the motion of the input device, such as a control mechanism, steering wheel, input device, tracked device, etc.); both afford the same results.
  • one or more motors can rotate the user-support device based on changes made by the user to an orientation of the input device or via other inputs to the input device.
  • the rotational orientation of the user in the real world is maintained within the virtual environment at all times, with the input device determining which way is forward (e.g., for the translation of the user’s virtual coordinates) by its real-world orientation.
  • when the user is translated through the virtual environment, they are always translated in the same direction as the input device points, based on its forward reference.
  • the user simply swivels their body (via rotation of the mechanism along with the input device) in the direction they desire to orient.
  • the tracking system monitors this rotation and reorients the virtual point of view and path to match. Because the user’s head is tracked separately, the user is able to look around freely while maintaining forward progress along the vector provided by the input device. Because the user’s rotational orientation within the virtual space is maintained along with their real-world rotation, any changes in virtual rotation are accompanied by true physical rotation. This greatly reduces or eliminates the disparity between what the user sees and what their inner ear detects, so induced motion sickness is greatly reduced or eliminated.
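The decoupling described above (a 1:1 mapping of physical rotation into the virtual environment, with gaze and travel direction tracked independently) might be summarized as below. The function and field names are illustrative assumptions, not from the patent.

```python
def render_state(physical_body_yaw: float,
                 physical_head_yaw: float,
                 device_yaw: float) -> dict:
    """Map real-world rotations directly into the virtual environment so
    every virtual turn corresponds to a true physical turn, while free
    look (head) and travel direction (input device) stay independent."""
    return {
        "virtual_body_yaw": physical_body_yaw,  # 1:1, no synthetic rotation
        "virtual_view_yaw": physical_head_yaw,  # look around freely
        "travel_yaw": device_yaw,               # forward translation vector
    }
```

Because `virtual_body_yaw` never changes without `physical_body_yaw` changing, the vestibular system always experiences the rotation the eyes see.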
  • FIGS. 4-10 illustrate different views of an exemplary embodiment of the input system of the present disclosure.
  • the input system or virtual reality system may be similar to the input system of FIG. 1, with some modifications.
  • the system may include a seat back and armrests and be in the form of a chair.
  • the chair of FIGS. 4-10 may spin on a post.
  • the user can rotate the user-support device (e.g., the virtual reality chair) while seated on the seat of the chair on a rotational axis, which may coincide with the post.
  • the user can rotate the user-support device about a 360° angle of rotation or portion thereof.
  • the input system can include a slip ring that enables electronics or other components of the system to rotate infinitely.
  • the input system of FIGS. 4-10 may include a user-support device (e.g., chair) for supporting a user, the user-support device having an axis of rotation generally perpendicular to the floor.
  • the input system may also include a tracking system tracking an orientation of a user or a gaze of the user supported by the user-support device.
  • the tracking system 510 may be able to ascertain where the user is looking and the position of the user in the real-world environment, and thus, be able to render corresponding views within the virtual reality environment.
  • in one or more embodiments, the user-support device (e.g., chair) may provide height and width adjustment.
  • the input device 612 may receive user inputs indicative of whether the user is trying to move within the virtual reality environment by providing real-world user inputs.
  • the input device 612 and/or the seat may include an accessory arm which may be extendable and accommodate different types of attachments (handles, joysticks, steering wheels etc.).
  • the input device 612 may be communicatively coupled with the virtual reality simulator and/or controller 610.
  • the pressure sensors on the control mechanism, the foot pegs, or the seat of the user-support device may be utilized to provide the input system with information regarding how the user is reacting or commanding the input system to react.
  • the virtual reality simulator may render the virtual reality environment for the user based on the tracking information tracked by the tracking system 510 and may provide the rendering independent of the forward direction of translation of the user through the virtual environment defined by the axis of the tracked input device 612. Stated another way, the virtual reality simulator may render the virtual reality environment so that a user is able to look over his or her shoulder, to the left, or to the right, up, or down, even when the virtual reality experience is moving along a ‘track’ or moving in the forward direction of translation of the user through the virtual environment, which may be defined by the axis of the tracked input device 612.
  • the input device 612 can include a motion capture sensor which captures a user input indicative of the user’s body position, or the user’s body position itself.
  • the motion capture sensor may be embodied as an image capture sensor, linear arrays, radio tracking sensor, one or more accelerometers, or other type of sensor.
  • the user’s position may be assumed to be on the central axis of the chair, unless indicated otherwise by the motion capture sensor.
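That default-position rule can be sketched simply; the function and parameter names are illustrative assumptions.

```python
def user_position(chair_axis_xy, lean_offset_xy=None):
    """Assume the user sits on the chair's central axis unless the motion
    capture sensor reports a lean offset (same units as the axis position)."""
    if lean_offset_xy is None:
        return chair_axis_xy
    return (chair_axis_xy[0] + lean_offset_xy[0],
            chair_axis_xy[1] + lean_offset_xy[1])
```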
  • the virtual reality simulator may then render the virtual reality environment for the user based on the user input indicative of the user’s body position.
  • the body position may be associated with a lean on the user-support device, for example.
  • the input system may include one or more supplemental virtual reality effect elements.
  • the input system may include a vibration transducer 616 producing a vibration for the user-support device based on a stimulus within the virtual reality environment.
  • the input system may include a fan 618 configured to provide a real-world stimulus, such as a breeze, based on a stimulus within the virtual reality environment.
  • the input system may include a scent-releasing module 710 configured to release scented aromas based on a stimulus within the virtual reality environment.
  • the input system may further include a variable resistance device which may be a passive device for applying a constant force to the user. For example, when a user pulls back on the control mechanism, the variable resistance device provides tension.
  • the user’s feet, which may be in constant contact with the ground, may control rotation of the VR chair.
  • the virtual reality simulator and/or associated computer may be mounted to the VR chair.
  • a room may be equipped with multiple input systems or VR chairs, each independently controlled in a shared VR environment.
  • the VR chair may not necessarily tilt or rotate about a vertical axis.
  • the input device 612 may be a control mechanism, a pressure sensor on the control mechanism, a foot peg, a pressure sensor on the foot peg, or a pressure sensor on the user-support device 500.
  • the control mechanism of the input device 612 may have an axis that defines a forward direction of translation with respect to a virtual reality environment which the user is experiencing.
  • the virtual reality simulator may include a headset 614 rendering the virtual reality environment for the user based on a user input received from the input device 612 such that rotation of the user on the user-support device 500 about the axis of rotation generally perpendicular to the floor is associated with a corresponding rendering within the virtual reality environment and so the user’s real-world orientation and virtual orientation coincide.
  • the virtual reality simulator may render the virtual reality environment for the user based on the tracking information tracked by the tracking system 510 and may be independent of the forward direction of translation of the user through the virtual environment defined by the axis of the tracked input device 612.
  • the processor-executable computer instructions 1104 may be configured to perform a method 1102, such as the method 300 of FIG. 3.
  • the processor-executable computer instructions 1104 may be configured to implement a system, such as the system 10 of FIGS. 1-2.
  • Many such computer-readable media may be devised by those of ordinary skill in the art that are configured to operate in accordance with the techniques presented herein.
  • FIG. 12 and the following discussion provide a description of a suitable computing environment to implement aspects of one or more of the provisions set forth herein.
  • the operating environment of FIG. 12 is merely one example of a suitable operating environment and is not intended to suggest any limitation as to the scope of use or functionality of the operating environment.
  • Example computing devices include, but are not limited to, personal computers, server computers, hand-held or laptop devices, mobile devices, such as mobile phones, Personal Digital Assistants (PDAs), media players, and the like, multiprocessor systems, consumer electronics, mini computers, mainframe computers, distributed computing environments that include any of the above systems or devices, etc.
  • FIG. 12 illustrates a system 1200 including a computing device 1212 configured to implement one aspect provided herein.
  • the computing device 1212 includes at least one processor 1216 and memory 1218.
  • memory 1218 may be volatile, such as RAM, non-volatile, such as ROM, flash memory, etc., or a combination of the two. This configuration is illustrated in FIG. 12 by dashed line 1214.
  • the computing device 1212 includes additional features or functionality.
  • the computing device 1212 may include additional storage such as removable storage or non-removable storage, including, but not limited to, magnetic storage, optical storage, etc.
  • additional storage is illustrated in FIG. 12 by storage 1220.
  • computer readable instructions to implement one aspect provided herein are in storage 1220.
  • Storage 1220 may store other computer readable instructions to implement an operating system, an application program, etc.
  • Computer readable instructions may be loaded in memory 1218 for execution by processor 1216, for example.
  • Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions or other data.
  • Memory 1218 and storage 1220 are examples of computer storage media.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVDs) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information and which may be accessed by the computing device 1212. Any such computer storage media is part of the computing device 1212.
  • “first”, “second”, or the like are not intended to imply a temporal aspect, a spatial aspect, an ordering, etc. Rather, such terms are merely used as identifiers, names, etc. for features, elements, items, etc.
  • a first channel and a second channel generally correspond to channel A and channel B or two different or two identical channels or the same channel.
  • “comprising”, “comprises”, “including”, “includes”, or the like generally means comprising or including, but not limited to.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention concerns an input system that includes a user-controlled input device having an axis that defines a forward direction of translation of the user through a virtual environment displayed by the virtual reality simulator to the user via a headset worn by the user. Rotation of the user supported on the user-support device about the axis of rotation, and the corresponding rotation of the input device relative to that axis of rotation, changes the forward direction of translation of the user through the virtual environment displayed by the virtual reality simulator to the user via the headset worn by the user, but a field of view within the virtual environment displayed by the virtual reality simulator to the user via the headset worn by the user is determined by a tracked position of the user's head and is independent of the forward direction of translation of the user through the virtual environment.
PCT/US2020/040800 2019-07-16 2020-07-03 User input system for mitigating rotational disparity in translation through virtual environments WO2021011208A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962874788P 2019-07-16 2019-07-16
US62/874,788 2019-07-16

Publications (1)

Publication Number Publication Date
WO2021011208A1 (fr) 2021-01-21

Family

ID=74210883

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/040800 WO2021011208A1 (fr) 2019-07-16 2020-07-03 User input system for mitigating rotational disparity in translation through virtual environments

Country Status (1)

Country Link
WO (1) WO2021011208A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150030999A1 (en) * 2012-04-12 2015-01-29 Motion Device Inc. Motion simulator
US20180304160A1 (en) * 2017-04-25 2018-10-25 Universal City Studios Llc Seated motion simulation amusement park attraction
US20180369702A1 (en) * 2017-06-22 2018-12-27 Jntvr Llc Synchronized motion simulation for virtual reality
US20190163274A1 (en) * 2015-03-17 2019-05-30 Whirlwind VR, Inc. System and Method for Modulating a Peripheral Device Based on an Unscripted Feed Using Computer Vision


Similar Documents

Publication Publication Date Title
US11406872B2 (en) Force feedback arm for an interactive exercise machine
JP6676213B2 Detecting a range of motion of a user for a virtual reality user interface
US20180053349A1 (en) Running exercise equipment with associated virtual reality interaction method and non-volatile storage media
JP6548821B2 Method for optimizing content placement on a head-mounted display screen
US10241566B2 (en) Sensory feedback systems and methods for guiding users in virtual reality environments
CN109388142B Method and system for virtual reality walking control based on inertial sensors
EP3316735B1 Input device for a motion-controlled seat
US10223064B2 (en) Method for providing virtual space, program and apparatus therefor
CN111630476B Virtual reality exercise device
KR20110031925A Dynamic selection of sensitivity of tilt functionality
US20190121425A1 (en) Chair-type virtual reality controller
KR101492372B1 Virtual reality simulation apparatus including a motion control unit
Kim et al. Walking-in-place for omnidirectional VR locomotion using a single RGB camera
US20200060610A1 (en) Walking sense presentation device and presentation method
WO2021011208A1 (fr) Système d'entrée d'utilisateur pour atténuer une disparité de rotation en translation à travers des environnements virtuels
WO2021154298A1 Quick-release positionable arm for an interactive exercise machine
KR102108962B1 Interaction device and method using walking-in-place for omnidirectional virtual reality navigation
US11107364B2 (en) Method to enhance first-person-view experience
Lee et al. MIP-VR: an omnidirectional navigation and jumping method for VR shooting game using IMU
TWI620100B Virtual reality moving-image display method, virtual reality device, and recording medium
WO2019111027A1 Method for creating a virtual or augmented reality and system for creating a virtual or augmented reality
KR102148735B1 Immersive virtual reality system
KR20190065806A Walking-tracking system and method in a VR environment using a roller-based treadmill system
US20230092395A1 (en) Physical object integration with extended reality environments
Wu Design and Evaluation of Dynamic Field-of-View Restriction Techniques to Mitigate Cybersickness in Virtual Reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20839784

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20839784

Country of ref document: EP

Kind code of ref document: A1