WO2022117517A1 - Visual reference for virtual reality (vr) locomotion - Google Patents

Visual reference for virtual reality (VR) locomotion

Info

Publication number
WO2022117517A1
Authority
WO
WIPO (PCT)
Prior art keywords
scene
visual reference
user
rendering
locomotion
Application number
PCT/EP2021/083420
Other languages
French (fr)
Inventor
Tommy Falk
Alvin Jude Hari Haran
Original Assignee
Telefonaktiebolaget Lm Ericsson (Publ)
Application filed by Telefonaktiebolaget Lm Ericsson (Publ) filed Critical Telefonaktiebolaget Lm Ericsson (Publ)
Publication of WO2022117517A1 publication Critical patent/WO2022117517A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • VR system 100 supports a VR locomotion feature (i.e., a feature in which the user is moving within the VR scene from a first point in the VR scene towards another point in the VR scene without the user having to move from one point to another point in the user’s physical space).
  • VR system 100 renders inertia in the VR scene during locomotion using, for example, the technique disclosed in Gugenheimer et al., “GyroVR: Simulating Inertia in Virtual Reality using Head Worn Flywheels,” ACM, UIST 2016. That is, during locomotion, VR system 100 may simulate kinesthetic forces by, for example, attaching flywheels to the user’s head.
  • the embodiments of this disclosure mitigate motion sickness by introducing into the VR scene a visual reference that moves with the user during locomotion.
  • the visual reference helps the user to understand that the movement caused by locomotion is decoupled from their physical movement and, in some embodiments, indicates the current direction of the movement using, for example, a direction indicator.
  • the direction indicator is in the form of an arrow pointing in the direction of the locomotion.
  • the direction indicator is an object that is placed in the direction of the locomotion relative to the user’s position.
  • the direction indicator shows only the horizontal direction and omits the vertical component.
  • the direction of locomotion is indicated to the user by rotating the whole visual reference so that it is aligned with the direction of the locomotion.
  • the visual reference has a pronounced frontal part that would be used to indicate the direction of locomotion.
  • the rotation of the visual reference could omit the vertical component of the direction so that it is always kept horizontal.
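By way of illustration only (the disclosure does not specify an implementation, and the helper name is hypothetical), dropping the vertical component and computing the rotation of the visual reference can be sketched as follows, assuming the y axis is “up”:

```python
import math

def horizontal_yaw(direction):
    """Project a 3D locomotion direction (x, y, z), with y as the 'up'
    axis, onto the horizontal plane and return the yaw angle in radians
    used to rotate the visual reference. Returns None when the direction
    is (nearly) vertical, in which case the previous yaw can be kept."""
    x, _, z = direction
    horizontal_length = math.hypot(x, z)
    if horizontal_length < 1e-6:
        return None  # purely vertical motion: no horizontal direction
    # atan2(x, z) yields 0 for "forward" along +z and pi/2 for +x
    return math.atan2(x, z)
```

Because only yaw is returned, the visual reference is always kept horizontal regardless of whether the locomotion has a vertical component.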
  • the visual reference is only shown during locomotion so that the visual reference does not disturb the view for the user when locomotion is not occurring.
  • the visual reference is faded in when locomotion begins and is faded out when the locomotion is stopped.
  • this fade-in/fade-out may be correlated (e.g., proportional) to the amount of the simulated inertia.
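As a minimal sketch of the proportional fade-in/fade-out (the function name and the `max_alpha` value are illustrative assumptions, not taken from the disclosure):

```python
def reference_alpha(inertia, max_inertia, max_alpha=0.6):
    """Opacity of the visual reference during fade-in/fade-out, made
    proportional to the amount of simulated inertia. max_alpha caps the
    opacity so the reference stays semi-transparent; 0.6 is an
    illustrative choice."""
    if max_inertia <= 0:
        return 0.0
    fraction = max(0.0, min(1.0, inertia / max_inertia))
    return fraction * max_alpha
```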
  • the appearance of the visual reference is semi-transparent in order to be less intrusive and make it apparent that it is an abstract object that is added to VR scene for the purpose of helping the user not experience motion sickness.
  • the visual reference may have different shapes, colors, and other visual qualities.
  • the visual reference provides the user with the sense of standing on an object that moves with the user within the scene and is only fully visible when locomotion is activated.
  • the visual reference may also have a physical impact on movement, and may limit movements when present, such as physically constraining the user within the VR scene. This could be done through a haptic suit that is worn by the user, such as the Teslasuit(TM) haptic suit available from VR Electronics Ltd.
  • Other haptic interaction devices based on ultrasound (see, e.g., haptic products from Ultraleap, www.ultraleap.com/haptics/) could also be used.
  • the VR rendering device renders the visual reference in a particular form based on the type of haptic feedback available to the user. For example, in some embodiments, as a result of determining that the user is wearing a body suit that can cause the user to feel a force being applied to the user’s body, the VR rendering device may render the visual reference in the form of a wall. As another example, as a result of determining that the user has an ultrasound-based haptic feedback mechanism, the VR rendering device may render the visual reference as a “force field” or a bubble. The VR rendering device may determine the type of haptic feedback that is available to the user at the start of the VR session, as it is unlikely that the available haptic feedback will change mid-way through the session.
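The haptics-dependent choice of form amounts to a simple dispatch, sketched here with hypothetical type labels (the mapping follows the examples in the text; the labels themselves are assumptions):

```python
def reference_form(available_haptics):
    """Pick the form of the visual reference from the haptic feedback
    available to the user: a haptic body suit suggests a wall,
    ultrasound-based haptics suggest a 'force field' or bubble, and
    otherwise a plain platform is used. Intended to be evaluated once
    at the start of the VR session."""
    if "body_suit" in available_haptics:
        return "wall"
    if "ultrasound" in available_haptics:
        return "bubble"
    return "platform"
```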
  • In order to effectively reduce motion sickness, the visual reference must cover at least a certain amount of the user’s view. At one extreme, the visual reference would completely block the view of the scene, in which case the problem with motion sickness would be completely gone but the task of navigating the scene would be complicated. At the other extreme, the visual reference is so small or transparent that it is barely seen by the user, in which case it would have no effect at all. In between these extreme cases, there is a whole scale of solutions that cover different amounts of the user’s view. In one embodiment, the visual reference covers the whole view for the user, but it is made semi-transparent so the user can see the VR scene through it. This would resemble, for example, a glass cage around the user. In another embodiment, the visual reference is only a platform (e.g., a magic carpet) that the user stands on.
  • FIGs. 2A-2D illustrate various visual references that can be used to mitigate motion sickness.
  • FIG. 2A illustrates a visual reference 201 according to an embodiment
  • visual reference 201 consists of a platform 211 and direction indicator 213.
  • direction indicator 213 is in the form of an arrow that is pointing in the direction of the locomotion from the user’s perspective.
  • FIG. 2B illustrates a visual reference 202 according to an embodiment
  • visual reference 202 consists of a platform 211, direction indicator 213, and barrier 212.
  • Barrier 212 is in the form of a guard rail.
  • guard rails 212 encircle the user.
  • direction indicator 213 is an object that is placed in the direction of the locomotion relative to the user’s position.
  • FIG. 2C illustrates a visual reference 203 according to an embodiment
  • visual reference 203 consists of a platform 211, direction indicator 213, and barrier 214, where the barrier 214 is in the form of a short wall that extends upwardly from platform 211, but that does not significantly impede the user’s view of the virtual scene. Because barrier 214 does not significantly impede the user’s view of the virtual scene, barrier 214 may be opaque.
  • FIG. 2D illustrates a visual reference 204 according to an embodiment
  • visual reference 204 consists of a platform 211, direction indicator 213, and barrier 215, where the barrier 215 is in the form of a semi-transparent housing and the user is given the impression that the user is housed within the housing.
  • FIG. 3 is a flowchart illustrating a process 300, according to an embodiment, for providing a visual reference to a user in a virtual reality (VR) scene.
  • Process 300 may be performed by VR rendering device 124 and may begin with step s302.
  • Step s302 comprises detecting an activation of a locomotion feature in the VR scene.
  • Step s304 comprises rendering the visual reference in the VR scene in response to detecting the activation of the locomotion feature in the VR scene, wherein the visual reference moves with the user as the user moves within the VR scene.
  • rendering the visual reference in the VR scene comprises providing haptic feedback to the user.
  • the visual reference is in the form of a wall, and in such an embodiment, when the user virtually touches the wall, the VR rendering device activates a haptic device worn by the user so that the user will experience haptic feedback (e.g., force, pressure, vibration, etc.).
  • the visual reference is not included in the VR scene prior to the activation of the locomotion feature, and the step of rendering the visual reference in the VR scene in response to detecting the activation of the locomotion feature comprises or consists of adding the visual reference to the VR scene.
  • the visual reference is included in the VR scene prior to the activation of the locomotion feature and, prior to the activation, has a first level of transparency; the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function then comprises or consists of reducing the transparency level of the visual reference from the first level of transparency to a second level of transparency, such that the visual reference is less transparent at the second level than at the first level.
  • process 300 further includes steps s306 and s308. Step s306 comprises detecting a deactivation of the locomotion feature in the VR scene; and step s308 comprises removing the visual reference from the VR scene in response to detecting the deactivation of the locomotion feature in the VR scene.
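Steps s302-s308 of process 300 can be reduced to a small state transition, sketched below as a non-authoritative illustration (the function name and action strings are hypothetical; scene handling is deliberately abstracted away):

```python
def locomotion_step(reference_visible, locomotion_active):
    """One iteration of process 300. Given whether the visual reference
    is currently shown and whether the locomotion feature is active,
    return the new visibility together with the action to perform:
    'add' (render the reference), 'remove', or None when nothing
    changes."""
    if locomotion_active and not reference_visible:
        return True, "add"      # s302: activation detected -> s304: render
    if not locomotion_active and reference_visible:
        return False, "remove"  # s306: deactivation detected -> s308: remove
    return reference_visible, None
```

Calling this once per frame (or on each locomotion state change) keeps the reference present exactly while the locomotion feature is active, matching the flowchart's add/remove behavior.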
  • the visual reference comprises an image of a platform.
  • the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the platform so that the user has the impression that the user is standing on the platform or that the platform is underneath the user.
  • the visual reference further comprises an image of a barrier (e.g., one or more walls or a railing).
  • the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the barrier so that the user has the impression that at least a portion of the barrier is in front of the user.
  • the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the barrier so that the user has the impression that the barrier surrounds the user on at least three sides (e.g., the barrier is a semi-transparent housing in which the user has the impression of being housed).
  • the barrier comprises or consists of a front wall
  • the step of rendering the front wall comprises rendering the wall such that the wall extends upwardly from the platform.
  • the front wall may be semi-transparent.
  • FIG. 4 is a block diagram of VR rendering device 124, according to some embodiments.
  • VR rendering device 124 may comprise: processing circuitry (PC) 402, which may include one or more processors (P) 455 (e.g., one or more general-purpose microprocessors and/or one or more other processors, such as application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), and the like), which processors may be co-located in a single housing or in a single data center or may be geographically distributed (i.e., VR rendering device 124 may be a distributed computing apparatus); and at least one network interface 448 (e.g., a physical interface or air interface) comprising a transmitter (Tx) 445 and a receiver (Rx) 447 for enabling VR rendering device 124 to transmit data to and receive data from other nodes connected to a network 110 (e.g., an Internet Protocol (IP) network) to which network interface 448 is connected (physically or wirelessly).
  • CPP 441 includes a computer readable medium (CRM) 442 storing a computer program (CP) 443 comprising computer readable instructions (CRI) 444.
  • CRM 442 may be a non-transitory computer readable medium, such as, magnetic media (e.g., a hard disk), optical media, memory devices (e.g., random access memory, flash memory), and the like.
  • the CRI 444 of computer program 443 is configured such that when executed by PC 402, the CRI causes VR rendering device 124 to perform steps described herein (e.g., steps described herein with reference to the flow charts).
  • VR rendering device 124 may be configured to perform steps described herein without the need for code. That is, for example, PC 402 may consist merely of one or more ASICs. Hence, the features of the embodiments described herein may be implemented in hardware and/or software.
  • a method (300) for providing a visual reference to a user in a virtual reality (VR) scene comprising: detecting (s302) an activation of a locomotion feature in the VR scene; in response to detecting the activation of the locomotion feature in the VR scene, rendering (s304) the visual reference in the VR scene, wherein the visual reference moves with the user as the user moves within the VR scene.
  • A5. The method of any one of embodiments A1-A4, wherein the visual reference comprises an image of a platform.
  • A6 The method of embodiment A5, wherein the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the platform so that the user has the impression that the user is standing on the platform or that the platform is underneath the user.
  • A7 The method of embodiment A5 or A6, wherein the visual reference further comprises an image of a barrier (e.g., one or more walls or a railing).
  • step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the barrier so that the user has the impression that at least a portion of the barrier is in front of the user.
  • step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the barrier so that the user has the impression that the barrier surrounds the user on at least three sides.
  • A10 The method of embodiment A7 or A8, wherein the barrier comprises or consists of a front wall, and the step of rendering the front wall comprises rendering the wall such that the wall extends upwardly from the platform.
  • A11 The method of embodiment A10, wherein the front wall is semi-transparent.
  • A12 The method of any one of embodiments A1-A11, wherein the form of the visual reference is dependent on the type(s) of haptic feedback available to the user.
  • A14 The method of any one of embodiments A1-A13, wherein the visual reference indicates the direction of the user’s movement within the VR scene.
  • A15 The method of any one of embodiments A1-A14, wherein the visual reference indicates the direction of the user’s movement within the VR scene by comprising a direction indicator.
  • a computer program (443) comprising instructions (444) which when executed by processing circuitry (402) of a virtual reality (VR) rendering device (124) causes the VR rendering device (124) to perform the method of any one of the above embodiments.
  • a virtual reality (VR) rendering device (124), the VR rendering device being adapted to perform the method of any one of embodiments A1-A15.
  • a virtual reality (VR) rendering device (124), the VR rendering device comprising: processing circuitry (402); and a memory (442), the memory containing instructions (444) executable by the processing circuitry, whereby the VR rendering device is operative to perform the method of any one of embodiments A1-A15.
  • this visual reference decreases problems with motion sickness because it provides a visual cue of a stable platform that is moving with the user, like a virtual aerial work platform of a fire truck.
  • the virtual aerial work platform provides a sense of standing on a firm ground with safety guard rails.
  • the visual reference may indicate the direction of the user’s movement within the VR scene. This helps the user’s brain to understand that the movement is decoupled from the user’s own physical movements, thereby reducing the likelihood that the user will experience motion sickness.


Abstract

A virtual reality (VR) rendering device (124). The VR rendering device is configured to detect an activation of a locomotion feature in a VR scene and, in response to detecting the activation of the locomotion feature in the VR scene, render a visual reference in the VR scene, wherein the visual reference moves with the user as the user moves within the VR scene.

Description

VISUAL REFERENCE FOR VIRTUAL REALITY (VR) LOCOMOTION
TECHNICAL FIELD
[001] This disclosure relates to systems and methods for creating virtual reality (VR) environments.
BACKGROUND
[002] Virtual reality (VR) (which is also referred to as extended reality (XR)) uses computing technology to create simulated environments (a.k.a., VR environments or VR scenes). In many VR environments, a user is immersed inside an entirely simulated experience, while other VR environments augment or extend a user’s reality by, for example, displaying virtual objects so that they overlay what the user sees in reality (for instance, a shopper wearing virtual reality glasses while shopping in a supermarket might see nutritional information for an object as they place the object in their shopping cart).
[003] Locomotion in VR systems is a feature that makes it possible for the user to experience the sense of movement within the VR scene (e.g., the user can move from one location in the virtual scene to another location in the virtual scene) without physically moving from one physical location to another physical location. Locomotion can be implemented in different ways, including: 1) teleportation, 2) treadmills, and 3) smooth artificial locomotion.
[004] With teleportation, the user may select a spot in the virtual scene and then the user is instantly transported to that point in the virtual scene when, for example, clicking a button.
Implementing locomotion using a treadmill typically involves a platform-like device that tracks physical movements of the user and converts those physical movements into virtual movements so that the user within the virtual scene moves in a way that corresponds to the tracked physical movements. With smooth artificial locomotion, the user can move smoothly within the virtual scene using, for example, a touchpad, joystick, or other input device.
[005] Motion sickness (e.g., the feelings of dizziness, lightheadedness, or nausea) during VR locomotion is a well-known problem and is often referred to as “simulator sickness.” The phenomenon is caused by a so-called vestibular mismatch, which is a consequence of a mismatch between the movement the user senses and the lack of actual physical movement of the user.
[006] The problem with motion sickness during VR locomotion varies greatly from person to person. Also, the different implementations of VR locomotion tend to trigger the motion sickness to different degrees. Teleportation is usually seen as the method that will trigger the least amount of motion sickness, but it may cause the user to get slightly disoriented when triggering teleportation several times in a row (e.g., jumping from spot to spot). Smooth artificial locomotion provides a more natural movement and has been the most common implementation of locomotion within non-VR games for decades. However, in VR this method has been shown to cause problems with motion sickness for many people, and for some of these people the degree of motion sickness is so great that they cannot use this feature at all.
[007] Different solutions have been proposed to decrease the level of motion sickness when using smooth artificial locomotion, such as putting the user on or inside a virtual vehicle (see, e.g., Medeiros et al., “Magic Carpet: Interaction Fidelity for Flying in VR,” IEEE Transactions on Visualization and Computer Graphics, vol. 26, no. 9, September 2020), masking the peripheral vision during motion, or rendering a visual virtual nose (see, e.g., US Patent Publication No. 2016030039).
SUMMARY
[008] Certain challenges presently exist. For instance, the idea of putting the user inside a virtual vehicle to alleviate the user’s motion sickness may be problematic because a virtual vehicle may not fit within or be appropriate for the virtual scene, and, therefore, it might disturb the experience of the user. As another example, the idea of adding a virtual object (such as a magic flying carpet) to the VR scene for the purpose of reducing the user’s motion sickness is not always desirable because the virtual object may disturb the view for the user when the virtual object is no longer needed. In short, there are no solutions currently available that can provide a visual reference for the user during locomotion in VR scenes without negatively impacting the user’s experience in the VR scene.
[009] Accordingly, in one aspect there is provided a method for providing a visual reference to a user in a virtual reality (VR) scene. The method includes detecting an activation of a locomotion feature in the VR scene. The method also includes rendering the visual reference in the VR scene in response to detecting the activation of the locomotion feature in the VR scene, wherein the visual reference moves with the user as the user moves within the VR scene. In some embodiments, the method further includes detecting a deactivation of the locomotion feature in the VR scene and removing the visual reference from the VR scene in response to detecting the deactivation of the locomotion feature in the VR scene.
[0010] In another aspect there is provided a computer program comprising instructions which when executed by processing circuitry of a virtual reality (VR) rendering device causes the VR rendering device to perform the method of any one of the above embodiments disclosed herein. In another aspect there is provided a carrier containing the computer program, wherein the carrier is one of an electronic signal, an optical signal, a radio signal, and a computer readable storage medium.
[0011] In another aspect there is provided a VR rendering device, where the VR rendering device is adapted to perform the method of any embodiments disclosed herein. In some embodiments, VR rendering device includes processing circuitry; and a memory containing instructions executable by the processing circuitry, whereby the VR rendering device is operative to perform the methods disclosed herein.
[0012] The embodiments disclosed herein are advantageous in that they decrease problems with motion sickness during locomotion in a VR scene, and, moreover, the embodiments do not negatively impact the user’s experience because the embodiments do not disturb the view of the user when the locomotion feature is not being used. Additionally, the visual reference represents an abstract object that can be used in many, if not all, VR scenes.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] The accompanying drawings, which are incorporated herein and form part of the specification, illustrate various embodiments.
[0014] FIG. 1A illustrates a VR system according to an embodiment.
[0015] FIG. 1B illustrates a VR headset according to an embodiment.
[0016] FIG. 2A illustrates a visual reference according to an embodiment.
[0017] FIG. 2B illustrates a visual reference according to an embodiment.
[0018] FIG. 2C illustrates a visual reference according to an embodiment.
[0019] FIG. 2D illustrates a visual reference according to an embodiment.
[0020] FIG. 3 is a flowchart illustrating a process according to an embodiment.
[0021] FIG. 4 illustrates a VR rendering device according to an embodiment.
DETAILED DESCRIPTION
[0022] FIG. 1A illustrates a VR system 100 according to some embodiments. As shown in FIG. 1A, VR system 100 comprises a VR headset 120 (e.g., VR goggles, VR glasses, VR head mounted display (HMD), etc.) that is configured to be worn by a user and that is operable to display to the user a VR scene (e.g., a VR scene in which the user is virtually immersed), speakers 134 and 135 for producing sound for the user, and an input device 150 for receiving input from the user (in this example the input device 150 is in the form of a joystick).
[0023] As shown in FIG. 1B, VR headset 120 may comprise an orientation sensing unit 121, a position sensing unit 122, and a VR rendering device 124. Orientation sensing unit 121 is configured to detect a change in the orientation of the user and to provide information regarding the detected change to VR rendering device 124. In some embodiments, VR rendering device 124 determines the absolute orientation (in relation to some coordinate system) given the change in orientation detected by orientation sensing unit 121. In some embodiments, orientation sensing unit 121 may comprise one or more accelerometers and/or one or more gyroscopes.
[0024] In addition to receiving input from sensing units 121 and 122, VR rendering device 124 may also receive input from input device 150 and may also obtain VR scene configuration information. Based on these inputs and the VR scene configuration, VR rendering device 124 renders a VR scene in real-time for the user. That is, in real-time, VR rendering device 124 produces VR content, including, for example, video data that is provided to a display driver 126 so that display driver 126 will display on a display screen 127 images included in the VR scene, and audio data that is provided to speaker driver 128 so that speaker driver 128 will play audio for the user using speakers 134 and 135. While VR rendering device 124 is shown as being within VR headset 120 in this embodiment, in other embodiments VR rendering device 124 (or one or more components thereof) is located remotely from VR headset 120, in which case VR headset 120 and VR rendering device 124 have communication means (transmitter, receiver) for enabling VR rendering device 124 to transmit VR content to VR headset 120.
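As a purely illustrative sketch (not taken from the patent), the way a rendering device such as VR rendering device 124 might accumulate the orientation *changes* reported by an orientation sensing unit into an absolute orientation can be shown as follows; all class and method names here are hypothetical:

```python
# Illustrative sketch: integrating relative orientation changes from an
# orientation sensing unit into an absolute heading, as VR rendering
# device 124 is described as doing. All names are hypothetical.

class OrientationTracker:
    """Accumulates relative orientation deltas into an absolute yaw angle."""

    def __init__(self, initial_yaw_deg=0.0):
        self.yaw_deg = initial_yaw_deg

    def on_sensor_delta(self, delta_yaw_deg):
        # The sensing unit reports a *change* in orientation; the renderer
        # derives the absolute orientation by accumulating those changes.
        self.yaw_deg = (self.yaw_deg + delta_yaw_deg) % 360.0
        return self.yaw_deg


tracker = OrientationTracker()
tracker.on_sensor_delta(90.0)
tracker.on_sensor_delta(-30.0)
print(tracker.yaw_deg)  # 60.0
```

In practice a full 3-DoF orientation would be tracked (e.g., as a quaternion); a single yaw angle is used here only to keep the sketch short.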
[0025] In this example, VR system 100 supports a VR locomotion feature (i.e., a feature in which the user moves within the VR scene from a first point in the VR scene towards another point in the VR scene without the user having to move from one point to another point in the user’s physical space). In some embodiments, VR system 100 renders inertia in the VR scene during locomotion using, for example, the technique disclosed in Gugenheimer et al., “GyroVR: Simulating Inertia in Virtual Reality using Head Worn Flywheels,” ACM, UIST 2016. That is, during locomotion, VR system 100 may simulate kinesthetic forces by, for example, attaching flywheels to the user’s head. As noted above, when a user uses the locomotion feature of the VR system 100, there is a good chance the user will experience motion sickness. The embodiments of this disclosure mitigate motion sickness by introducing into the VR scene a visual reference that moves with the user during locomotion. The visual reference helps the user to understand that the movement caused by locomotion is decoupled from their physical movement and, in some embodiments, indicates the current direction of the movement using, for example, a direction indicator. In some embodiments, the direction indicator is in the form of an arrow pointing in the direction of the locomotion. In other embodiments, the direction indicator is an object that is placed in the direction of the locomotion relative to the user’s position. In some embodiments, the direction indicator shows only the horizontal direction and omits the vertical component. In some embodiments, the direction of locomotion is indicated to the user by rotating the whole visual reference so that it is aligned with the direction of the locomotion. In some cases, the visual reference has a pronounced frontal part that is used to indicate the direction of locomotion. The rotation of the visual reference may omit the vertical component of the direction so that the visual reference is always kept horizontal.
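One way to realize the horizontal-only direction described above is to project the 3D locomotion direction onto the horizontal plane before orienting the indicator or the visual reference. The following is a minimal sketch under the assumption of a y-up coordinate system; the function name is illustrative, not from the patent:

```python
import math

# Hypothetical sketch: project a 3D locomotion direction onto the horizontal
# plane so the direction indicator (or the rotation of the whole visual
# reference) omits the vertical component and stays level.

def horizontal_yaw(direction_xyz):
    """Return the yaw angle in degrees of the horizontal component of a
    locomotion direction vector (x, y, z), assuming y is the vertical axis
    and +z is 'straight ahead'."""
    x, y, z = direction_xyz
    # The vertical (y) component is simply dropped, so flying upward at an
    # angle still yields a level heading for the visual reference.
    return math.degrees(math.atan2(x, z)) % 360.0


# Moving forward-and-up still yields a purely horizontal heading:
print(horizontal_yaw((0.0, 1.0, 1.0)))  # 0.0 (straight ahead)
print(horizontal_yaw((1.0, 0.5, 0.0)))  # 90.0 (to the right)
```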
[0026] In some embodiments, the visual reference is only shown during locomotion so that the visual reference does not disturb the view for the user when locomotion is not occurring. For example, the visual reference is faded in when locomotion begins and is faded out when the locomotion is stopped. In embodiments in which inertia is simulated, this fade-in/fade-out may be correlated (e.g., proportional) to the amount of the simulated inertia. In some embodiments, the appearance of the visual reference is semi-transparent in order to be less intrusive and to make it apparent that it is an abstract object added to the VR scene for the purpose of helping the user avoid motion sickness.
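The fade behavior described above can be sketched as a simple opacity function: zero when locomotion is inactive, and otherwise proportional to the simulated inertia, capped below full opacity so the reference stays semi-transparent. The function name, parameters, and default values are illustrative assumptions, not specified by the patent:

```python
# Hypothetical sketch: fade the visual reference in/out with the locomotion
# state, with opacity proportional to the simulated inertia as suggested
# above. All names and values are illustrative.

def reference_opacity(locomotion_active, inertia, max_inertia, max_opacity=0.6):
    """Opacity in [0, max_opacity]; 0.0 means fully faded out.

    max_opacity < 1.0 keeps the reference semi-transparent so the user can
    still see the VR scene through it.
    """
    if not locomotion_active or max_inertia <= 0:
        return 0.0
    # Clamp the inertia fraction to [0, 1] before scaling.
    fraction = min(max(inertia / max_inertia, 0.0), 1.0)
    return max_opacity * fraction


print(reference_opacity(False, 5.0, 10.0))  # 0.0 -- hidden when idle
print(reference_opacity(True, 5.0, 10.0))   # 0.3 -- half-strength fade
print(reference_opacity(True, 20.0, 10.0))  # 0.6 -- clamped at the maximum
```

A real implementation would interpolate this value over several frames to produce a smooth fade rather than an instantaneous change.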
[0027] The visual reference may have different shapes, colors, and other visual qualities. In one embodiment, the visual reference provides the user with the sense of standing on an object that moves with the user within the scene and is only fully visible when locomotion is activated. The visual reference may also have a physical impact towards movement, and may limit movements when present, such as physically constraining the user within the VR scene. This could be done through a haptic suit that is worn by the user, such as the Teslasuit(TM) haptic suit available from VR Electronics Ltd. Other haptic interaction devices based on ultrasound (see, e.g., haptic products from Ultraleap (www.ultraleap.com/haptics/)) could also be used. The nature of the visual reference may be tied to the kind of haptic interaction. The presence of these haptic interactions may correspond to the presence of the visual reference, and the intensity of the haptics may correspond to the transparency of the visual reference. In some embodiments, the VR rendering device renders the visual reference in a particular form based on the type of haptic feedback available to the user. For example, in some embodiments, as a result of determining that the user is wearing a body suit that can cause the user to feel a force being applied on the user’s body, the VR rendering device may render the visual reference in the form of a wall. As another example, as a result of determining that the user has an ultrasound-based haptic feedback mechanism, the VR rendering device may render the visual reference as a “force field” or a bubble. The VR rendering device may determine the type of haptic feedback that is available to the user at the start of the VR session, as it is unlikely that the type of haptic feedback available to the user will change mid-way through the session.
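The mapping from haptic capability to the form of the visual reference can be sketched as a lookup performed once at session start. The enumeration of haptic types and the fallback form below are assumptions for illustration; only the body-suit-to-wall and ultrasound-to-force-field pairings come from the examples above:

```python
# Hypothetical sketch: choose the form of the visual reference from the type
# of haptic feedback detected at session start (body suit -> wall,
# ultrasound -> force field/bubble, per the examples above). The string
# enumeration and the "platform" fallback are assumptions.

HAPTIC_TO_FORM = {
    "body_suit": "wall",
    "ultrasound": "force_field",
}

def choose_reference_form(haptic_type):
    # Fall back to a plain platform when no matching haptic device is found.
    return HAPTIC_TO_FORM.get(haptic_type, "platform")


# Determined once at the start of the VR session, since the available haptic
# hardware is unlikely to change mid-session:
print(choose_reference_form("body_suit"))  # wall
print(choose_reference_form(None))         # platform
```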
[0028] In order to effectively reduce motion sickness, the visual reference must cover at least a certain amount of the user’s view. At one extreme, the visual reference would completely block the view of the scene, in which case the problem with motion sickness would be completely gone but the task of navigating the scene would be complicated. At the other extreme, the visual reference is so small or transparent that it is barely visible to the user, in which case it would have no effect at all. Between these extremes, there is a whole scale of solutions that cover different amounts of the user’s view. In one embodiment, the visual reference covers the whole view for the user, but it is made semi-transparent so the user can see the VR scene through it. This would resemble, for example, a glass cage around the user. In another embodiment, the visual reference is only a platform (e.g., a magic carpet) that the user stands on.
[0029] FIGS. 2A-2D illustrate various visual references that can be used to mitigate motion sickness.
[0030] FIG. 2A illustrates a visual reference 201 according to an embodiment; visual reference 201 consists of a platform 211 and direction indicator 213. In this example direction indicator 213 is in the form of an arrow that is pointing in the direction of the locomotion from the user’s perspective.
[0031] FIG. 2B illustrates a visual reference 202 according to an embodiment; visual reference 202 consists of a platform 211, direction indicator 213, and barrier 212. Barrier 212 is in the form of a guard rail. In this embodiment, guard rail 212 encircles the user. In this example, direction indicator 213 is an object that is placed in the direction of the locomotion relative to the user’s position.
[0032] FIG. 2C illustrates a visual reference 203 according to an embodiment; visual reference 203 consists of a platform 211, direction indicator 213, and barrier 214, where the barrier 214 is in the form of a short wall that extends upwardly from platform 211, but that does not significantly impede the user’s view of the virtual scene. Because barrier 214 does not significantly impede the user’s view of the virtual scene, barrier 214 may be opaque.
[0033] FIG. 2D illustrates a visual reference 204 according to an embodiment; visual reference 204 consists of a platform 211, direction indicator 213, and barrier 215, where the barrier 215 is in the form of a semi-transparent housing and the user is given the impression that the user is housed within the housing.
[0034] FIG. 3 is a flowchart illustrating a process 300, according to an embodiment, for providing a visual reference to a user in a virtual reality (VR) scene. Process 300 may be performed by VR rendering device 124 and may begin with step s302.
[0035] Step s302 comprises detecting an activation of a locomotion feature in the VR scene. Step s304 comprises rendering the visual reference in the VR scene in response to detecting the activation of the locomotion feature in the VR scene, wherein the visual reference moves with the user as the user moves within the VR scene. In some embodiments, rendering the visual reference in the VR scene comprises providing haptic feedback to the user. For instance, in some embodiments the visual reference is in the form of a wall, and in such an embodiment, when the user virtually touches the wall, the VR rendering device activates a haptic device worn by the user so that the user will experience haptic feedback (e.g., force, pressure, vibration, etc.).
[0036] In some embodiments, the visual reference is not included in the VR scene prior to the activation of the locomotion feature, and the step of rendering the visual reference in the VR scene in response to detecting the activation of the locomotion feature comprises or consists of adding the visual reference to the VR scene.
[0037] In some embodiments, the visual reference is included in the VR scene prior to the activation of the locomotion feature, prior to the activation of the locomotion feature the visual reference has a first level of transparency, and the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises or consists of reducing the transparency level of the visual reference within the VR scene from the first level of transparency to a second level of transparency such that the visual reference is less transparent at the second level of transparency than at the first level of transparency.

[0038] In some embodiments, process 300 further includes steps s306 and s308. Step s306 comprises detecting a deactivation of the locomotion feature in the VR scene; and step s308 comprises removing the visual reference from the VR scene in response to detecting the deactivation of the locomotion feature in the VR scene.
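Steps s302-s308 of process 300 can be sketched as a small state machine that adds the visual reference to the scene on activation of the locomotion feature and removes it again on deactivation. The scene is modeled here as a plain set of object names, purely for illustration; no such representation is prescribed by the patent:

```python
# Minimal sketch of process 300: render the visual reference on activation
# of the locomotion feature (s302/s304) and remove it on deactivation
# (s306/s308). The scene representation is illustrative only.

class LocomotionReferenceController:
    def __init__(self):
        self.scene_objects = set()
        self.locomotion_active = False

    def on_locomotion_event(self, active):
        if active and not self.locomotion_active:
            # s302: activation detected -> s304: render the visual reference.
            self.scene_objects.add("visual_reference")
        elif not active and self.locomotion_active:
            # s306: deactivation detected -> s308: remove the visual reference.
            self.scene_objects.discard("visual_reference")
        self.locomotion_active = active


ctrl = LocomotionReferenceController()
ctrl.on_locomotion_event(True)
print("visual_reference" in ctrl.scene_objects)   # True
ctrl.on_locomotion_event(False)
print("visual_reference" in ctrl.scene_objects)   # False
```

In the variant of paragraph [0037], the add/discard calls would instead lower and raise the transparency of a reference object that is always present in the scene.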
[0039] In some embodiments, the visual reference comprises an image of a platform. In such an embodiment, the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the platform so that the user has the impression that the user is standing on the platform or that the platform is underneath the user. In some embodiments, the visual reference further comprises an image of a barrier (e.g., one or more walls or a railing). In some embodiments, the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the barrier so that the user has the impression that at least a portion of the barrier is in front of the user. In some embodiments, the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the barrier so that the user has the impression that the barrier surrounds the user on at least three sides (e.g., the barrier is a semi-transparent housing in which the user has the impression of being housed). In some embodiments, the barrier comprises or consists of a front wall, and the step of rendering the front wall comprises rendering the wall such that the wall extends upwardly from the platform. In such an embodiment, the front wall may be semi-transparent.
[0040] FIG. 4 is a block diagram of VR rendering device 124, according to some embodiments. As shown in FIG. 4, VR rendering device 124 may comprise: processing circuitry (PC) 402, which may include one or more processors (P) 455 (e.g., one or more general purpose microprocessors and/or one or more other processors, such as an application specific integrated circuit (ASIC), field-programmable gate arrays (FPGAs), and the like), which processors may be co-located in a single housing or in a single data center or may be geographically distributed (i.e., VR rendering device 124 may be a distributed computing apparatus); at least one network interface 448 (e.g., a physical interface or air interface) comprising a transmitter (Tx) 445 and a receiver (Rx) 447 for enabling VR rendering device 124 to transmit data to and receive data from other nodes connected to a network 110 (e.g., an Internet Protocol (IP) network) to which network interface 448 is connected (physically or wirelessly) (e.g., network interface 448 may be coupled to an antenna arrangement comprising one or more antennas for enabling VR rendering device 124 to wirelessly transmit/receive data); and a local storage unit (a.k.a., “data storage system”) 408, which may include one or more non-volatile storage devices and/or one or more volatile storage devices. In embodiments where PC 402 includes a programmable processor, a computer program product (CPP) 441 may be provided. CPP 441 includes a computer readable medium (CRM) 442 storing a computer program (CP) 443 comprising computer readable instructions (CRI) 444. CRM 442 may be a non-transitory computer readable medium, such as, magnetic media (e.g., a hard disk), optical media, memory devices (e.g., random access memory, flash memory), and the like. 
In some embodiments, the CRI 444 of computer program 443 is configured such that when executed by PC 402, the CRI causes VR rendering device 124 to perform steps described herein (e.g., steps described herein with reference to the flow charts). In other embodiments, VR rendering device 124 may be configured to perform steps described herein without the need for code. That is, for example, PC 402 may consist merely of one or more ASICs. Hence, the features of the embodiments described herein may be implemented in hardware and/or software.
[0041] Summary of Various Embodiments
[0042] A1. A method (300) for providing a visual reference to a user in a virtual reality (VR) scene, the method comprising: detecting (s302) an activation of a locomotion feature in the VR scene; in response to detecting the activation of the locomotion feature in the VR scene, rendering (s304) the visual reference in the VR scene, wherein the visual reference moves with the user as the user moves within the VR scene.

[0043] A2. The method of embodiment A1, wherein the visual reference is not included in the VR scene prior to the activation of the locomotion feature, and the step of rendering the visual reference in the VR scene in response to detecting the activation of the locomotion feature comprises or consists of adding the visual reference to the VR scene.

[0044] A3. The method of embodiment A1, wherein the visual reference is included in the VR scene prior to the activation of the locomotion feature, prior to the activation of the locomotion feature the visual reference has a first level of transparency, and the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises or consists of reducing the transparency level of the visual reference within the VR scene from the first level of transparency to a second level of transparency such that the visual reference is less transparent at the second level of transparency than at the first level of transparency.
[0045] A4. The method of any one of embodiments A1-A3, further comprising: detecting (s306) a deactivation of the locomotion feature in the VR scene; and in response to detecting the deactivation of the locomotion feature in the VR scene, removing (s308) the visual reference from the VR scene.
[0046] A5. The method of any one of embodiments A1-A4, wherein the visual reference comprises an image of a platform.
[0047] A6. The method of embodiment A5, wherein the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the platform so that the user has the impression that the user is standing on the platform or that the platform is underneath the user.

A7. The method of embodiment A5 or A6, wherein the visual reference further comprises an image of a barrier (e.g., one or more walls or a railing).
[0048] A8. The method of embodiment A7, wherein the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the barrier so that the user has the impression that at least a portion of the barrier is in front of the user.
[0049] A9. The method of embodiment A8, wherein the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the barrier so that the user has the impression that the barrier surrounds the user on at least three sides.
[0050] A10. The method of embodiment A7 or A8, wherein the barrier comprises or consists of a front wall, and the step of rendering the front wall comprises rendering the wall such that the wall extends upwardly from the platform.

[0051] A11. The method of embodiment A10, wherein the front wall is semi-transparent.
[0052] A12. The method of any one of embodiments A1-A11, wherein the form of the visual reference is dependent on the type(s) of haptic feedback available to the user.

[0053] A13. The method of embodiment A12, further comprising determining a type of haptic feedback available to the user, wherein the step of rendering the visual reference in the VR scene comprises rendering the visual reference in a particular form based on the determined type of haptic feedback.

[0054] A14. The method of any one of embodiments A1-A13, wherein the visual reference indicates the direction of the user’s movement within the VR scene.

[0055] A15. The method of any one of embodiments A1-A14, wherein the visual reference indicates the direction of the user’s movement within the VR scene by comprising a direction indicator.
[0056] B1. A computer program (443) comprising instructions (444) which, when executed by processing circuitry (402) of a virtual reality (VR) rendering device (124), causes the VR rendering device (124) to perform the method of any one of the above embodiments.

[0057] B2. A carrier containing the computer program of embodiment B1, wherein the carrier is one of an electronic signal, an optical signal, a radio signal, and a computer readable storage medium (442).

[0058] C1. A virtual reality (VR) rendering device (124), the VR rendering device being adapted to perform the method of any one of embodiments A1-A15.

[0059] C2. A virtual reality (VR) rendering device (124), the VR rendering device comprising: processing circuitry (402); and a memory (442), the memory containing instructions (444) executable by the processing circuitry, whereby the VR rendering device is operative to perform the method of any one of embodiments A1-A15.
[0060] Conclusion
[0061] As noted above, there are no solutions currently available that provide a visual reference for the user during locomotion in VR scenes without negatively impacting the user’s experience in the VR scene. This disclosure overcomes this problem by introducing a virtual object that works as a visual reference for the user when the user uses the locomotion functionality in the virtual scene and is not present (or is faded) when the locomotion functionality is not being used.
[0062] In one embodiment, this visual reference decreases problems with motion sickness because it provides a visual cue of a stable platform that is moving with the user, like a virtual aerial work platform of a fire truck. The virtual aerial work platform provides a sense of standing on firm ground with safety guard rails. In addition, the visual reference may indicate the direction of the user’s movement within the VR scene. This helps the user’s brain to understand that the movement is decoupled from the user’s own physical movements, thereby reducing the likelihood that the user will experience motion sickness.
[0063] While various embodiments are described herein, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of this disclosure should not be limited by any of the above-described exemplary embodiments. Moreover, any combination of the above-described elements in all possible variations thereof is encompassed by the disclosure unless otherwise indicated herein or otherwise clearly contradicted by context.
[0064] Additionally, while the processes described above and illustrated in the drawings are shown as a sequence of steps, this was done solely for the sake of illustration. Accordingly, it is contemplated that some steps may be added, some steps may be omitted, the order of the steps may be re-arranged, and some steps may be performed in parallel.

Claims

1. A method (300) for providing a visual reference (201, 202, 203, 204) to a user in a virtual reality, VR, scene, the method comprising: detecting (s302) an activation of a locomotion feature in the VR scene; in response to detecting the activation of the locomotion feature in the VR scene, rendering (s304) the visual reference in the VR scene, wherein the visual reference moves with the user as the user moves within the VR scene.
2. The method of claim 1, wherein the visual reference is not included in the VR scene prior to the activation of the locomotion feature, and the step of rendering the visual reference in the VR scene in response to detecting the activation of the locomotion feature comprises or consists of adding the visual reference to the VR scene.
3. The method of claim 1, wherein the visual reference is included in the VR scene prior to the activation of the locomotion feature, prior to the activation of the locomotion feature the visual reference has a first level of transparency, and the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises or consists of reducing the transparency level of the visual reference within the VR scene from the first level of transparency to a second level of transparency such that the visual reference is less transparent at the second level of transparency than at the first level of transparency.
4. The method of any one of claims 1-3, further comprising: detecting (s306) a deactivation of the locomotion feature in the VR scene; and in response to detecting the deactivation of the locomotion feature in the VR scene, removing (s308) the visual reference from the VR scene.
5. The method of any one of claims 1-4, wherein the visual reference comprises an image of a platform.
6. The method of claim 5, wherein the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the platform so that the user has the impression that the user is standing on the platform or that the platform is underneath the user.
7. The method of claim 5 or 6, wherein the visual reference further comprises an image of a barrier.
8. The method of claim 7, wherein the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the barrier so that the user has the impression that at least a portion of the barrier is in front of the user.
9. The method of claim 8, wherein the step of rendering the visual reference in the VR scene in response to the activation of the locomotion function comprises rendering the image of the barrier so that the user has the impression that the barrier surrounds the user on at least three sides.
10. The method of claim 7 or 8, wherein the barrier comprises or consists of a front wall, and the step of rendering the front wall comprises rendering the wall such that the wall extends upwardly from the platform.
11. The method of claim 10, wherein the front wall is semi-transparent.
12. The method of any one of claims 1-11, wherein the form of the visual reference is dependent on the type(s) of haptic feedback available to the user.
13. The method of claim 12, further comprising determining a type of haptic feedback available to the user, wherein the step of rendering the visual reference in the VR scene comprises rendering the visual reference in a particular form based on the determined type of haptic feedback.
14. The method of any one of claims 1-13, wherein the visual reference indicates the direction of the user’s movement within the VR scene.
15. The method of any one of claims 1-14, wherein the visual reference indicates the direction of the user’s movement within the VR scene by comprising a direction indicator.
16. A computer program (443) comprising instructions (444) which when executed by processing circuitry (402) of a virtual reality (VR) rendering device (124) causes the VR rendering device (124) to perform the method of any one of claims 1-15.
17. A carrier containing the computer program of claim 16, wherein the carrier is one of an electronic signal, an optical signal, a radio signal, and a computer readable storage medium (442).
18. A virtual reality, VR, rendering device (124), the VR rendering device being configured to: detect (s302) an activation of a locomotion feature in a VR scene; and, in response to detecting the activation of the locomotion feature in the VR scene, render (s304) a visual reference (201, 202, 203, 204) in the VR scene, wherein the visual reference moves with a user as the user moves within the VR scene.
19. The VR rendering device (124) of claim 18, wherein the VR rendering device is further configured to perform the method of any one of claims 2-15.
20. A virtual reality, VR, rendering device (124), the VR rendering device comprising: processing circuitry (402); and a memory (442), the memory containing instructions (444) executable by the processing circuitry, whereby the VR rendering device is operative to perform the method of any one of claims 1-15.
PCT/EP2021/083420 2020-12-01 2021-11-29 Visual reference for virtual reality (vr) locomotion WO2022117517A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063119850P 2020-12-01 2020-12-01
US63/119,850 2020-12-01

Publications (1)

Publication Number Publication Date
WO2022117517A1 true WO2022117517A1 (en) 2022-06-09

Family

ID=78829397

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/083420 WO2022117517A1 (en) 2020-12-01 2021-11-29 Visual reference for virtual reality (vr) locomotion

Country Status (1)

Country Link
WO (1) WO2022117517A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160030039A1 (en) 2014-08-04 2016-02-04 Jeffrey F. Seavey Surgical instrument for implanting fixation device
US20180224930A1 (en) * 2015-08-04 2018-08-09 Board Of Regents Of The Nevada System Of Higher Education, On Behalf Of The University Of Nevada, Immersive virtual reality locomotion using head-mounted motion sensors


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
HAN PING-HSUAN ET AL: "A Compelling Virtual Tour of the Dunhuang Cave With an Immersive Head-Mounted Display", IEEE COMPUTER GRAPHICS AND APPLICATIONS, IEEE SERVICE CENTER, NEW YORK, NY, US, vol. 40, no. 1, 30 August 2019 (2019-08-30), pages 40 - 55, XP011765767, ISSN: 0272-1716, [retrieved on 20200107], DOI: 10.1109/MCG.2019.2936753 *
MEDEIROS DANIEL ET AL: "Magic Carpet: Interaction Fidelity for Flying in VR", IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, IEEE, USA, vol. 26, no. 9, 15 March 2019 (2019-03-15), pages 2793 - 2804, XP011801000, ISSN: 1077-2626, [retrieved on 20200729], DOI: 10.1109/TVCG.2019.2905200 *
MEDEIROS: "Magic Carpet: Interaction Fidelity for Flying in VR", IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, vol. 26, 9 September 2020 (2020-09-09)
PAUSCH RANDY ET AL: "Disney's Aladdin first steps toward storytelling in virtual reality", COMPUTER GRAPHICS AND INTERACTIVE TECHNIQUES, ACM, 2 PENN PLAZA, SUITE 701 NEW YORK NY 10121-0701 USA, 1 August 1996 (1996-08-01), pages 193 - 203, XP058610541, ISBN: 978-0-89791-344-7, DOI: 10.1145/237170.237257 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21823514

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21823514

Country of ref document: EP

Kind code of ref document: A1