WO2018234318A1 - Reducing simulation sickness in virtual reality applications - Google Patents

Reducing simulation sickness in virtual reality applications

Info

Publication number: WO2018234318A1
Authority: WO (WIPO, PCT)
Prior art keywords: user, ghost, application, location, analyze
Application number: PCT/EP2018/066285
Other languages: French (fr)
Inventors: Andreu BARTOLÍ GARCIA, Miquel FARRERONS FALGUERAS
Original Assignee: Soccer Science, S.L.
Application filed by Soccer Science, S.L.
Publication of WO2018234318A1

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/50: Controlling the output signals based on the game progress

Definitions

  • In some examples of the method of FIG.4, the destination mark may be erased together with the ghost.
  • In other examples of the method of FIG.4, the ghost may walk or otherwise move instead of running, according to the context of the VR application, and the destination may be selected with any of the interaction methods mentioned above other than gaze detection.
  • FIG.5 shows a flowchart of the steps involved in teleporting a ghost element of the user's avatar in a Virtual Reality world for enhanced realism and for reducing simulation sickness. The method may start with identifying a location in the virtual world 500 where the user may be looking. This virtual world may already be calculated and rendered by the control unit 250 on the HMD 200. It may be programmed using any available programming language, like VRML or the like.
  • The location stared at by the user may be on the virtual ground or floor where the user wants to move his avatar.
  • By means of example, this location may be configured in the method to correspond to a square of dimensions 1 m × 1 m, or a circle of diameter 1 m, or any other similarly defined location, and be tied to a coordinate point (x, y, z) in the virtual world.
  • The size and shape of the destination location may be parameterized and changed before or after rendering the VR world.
  • The method may check if this point (and location) is occupied by any other object or avatar 510. If the location is already occupied, the method may loop back to the previous step 500 and may wait until the user looks at an unoccupied location.
  • The method may display a destination mark at the selected location 530 to help the user understand his selection and indicate the destination of movement, so as to prepare him and minimize the effect of simulation sickness.
  • A ghost of the user's avatar may then be rendered and shown to "jump out" of the user's avatar at its current position 540.
  • A scaling parameter "s" may be set to scale the ghost to a smaller size than the original user's avatar, so that the ghost is easily distinguishable from it. For example, the scaling parameter may be set to 0.8, rendering the ghost at 80% of the avatar's size.
  • Alternatively, a variable scaling parameter may be used, depending on the distance "d" traversed from the starting position, so that the ghost is rendered progressively smaller as it runs away from the user's avatar (an illustrative form is sketched after this list).
  • The speed of motion "v" may be calculated in block 545 or retrieved from a storage location (memory, permanent storage medium, etc.) or storage construct (e.g. database, variable, vector, table, etc.). The purpose of block 545 is to enhance realism in the ghost's motion and the overall VR experience, so that the user's feeling of disconnection between the real and virtual worlds is minimized and consequently simulation sickness is reduced.
  • Speed may simply be read as a constant value applicable to all situations, or be computed taking into account any number of parameters, like terrain, weather, energy levels associated with the user's avatar, fight-or-flight situations, psychological factors, etc.
  • The method used to calculate speed from these parameters is beyond the scope of this invention and will be apparent to any person of ordinary skill in the related art with a knowledge of basic physics.
  • The ghost may then be displayed to run, in block 550, at the calculated or retrieved speed from its original position towards the selected location, indicated with the destination mark, which it reaches after time "t". This action may be done in smooth motion, avoiding any jerkiness or other effects.
  • The physical user of the VR system may experience it as a stationary viewer watching the ghost running away from him, while at the same time the user may look towards any location in the virtual world.
  • The method may also display other avatars (and/or objects, or ideally the entire VR world), in block 554, that may be in the vicinity (both stationary and taking actions) while the ghost is running towards its destination. These avatars need not belong to the user.
  • The ghost may keep moving, running or walking, until it reaches the destination mark, in block 560. Running or walking may depend on the context of the application and the interaction method. Once there, the destination mark may be erased. Immediately after the ghost has reached its destination, the user's avatar may be instantly teleported to the destination location, in block 570.
  • The ghost may then be erased from the VR world, in block 580, and the method ends.
  • In some examples, the destination mark may be erased together with the ghost.
  • FIG.6 shows a flowchart of the steps involved in interrupting a teleporting operation.
  • The method may start with identifying a location in the virtual world 600 where the user may be looking.
  • This virtual world may already be calculated and rendered by the control unit 250 on the HMD 200. It may be programmed using any available programming language, like VRML or the like.
  • The location stared at by the user may be on the virtual ground or floor where the user wants to move his avatar.
  • By means of example, this location may be configured in the method to correspond to a square of dimensions 1 m × 1 m, or a circle of diameter 1 m, or any other similarly defined location, and be tied to a coordinate point (x, y, z) in the virtual world.
  • The size and shape of the destination location may be parameterized and changed before or after rendering the VR world.
  • The method may check if this point (and location) is occupied by any other object or avatar 610. If the location is already occupied, the method may loop back to the previous step 600 and may wait until the user looks at an unoccupied location. Then, it may check if the user has made a teleportation request, e.g. by pressing a trigger or button at the UI device 240. If not, it may wait until a press event is detected or a new location is looked at by the user 625. Upon detecting a press event 620, the method may display a destination mark at the selected location 630 to help the user understand his selection and indicate the destination of movement, so as to prepare him and minimize the effect of simulation sickness.
  • A ghost of the user's avatar may then be rendered and shown to "jump out" of the user's avatar at its current position 640.
  • A scaling parameter "s" may be set to scale the ghost to a smaller size than the original user's avatar. This parameter may depend on the distance "d" traversed from the current position (e.g. as given by Equation 1), so that the ghost is rendered progressively smaller as it runs away from the user's avatar.
  • The speed of motion "v" may be calculated 645 or retrieved from a storage location (memory, permanent storage medium, etc.) or storage construct (e.g. database, variable, vector, table, etc.). The purpose of step 645 is to enhance realism in the motion and the overall VR experience, so that the user's feeling of disconnection between the real and virtual worlds may be minimized and consequently simulation sickness may be reduced.
  • Speed may simply be read as a constant value applicable to all situations, or be computed taking into account any number of parameters, like terrain, weather, energy levels associated with the user's avatar, fight-or-flight situations, psychological factors, etc.
  • The method used to calculate speed from these parameters is beyond the scope of this invention and will be apparent to any person of ordinary skill in the related art with a knowledge of basic physics. Having calculated or retrieved the ghost's speed and measured the distance "l" to be run, the method may calculate the time "t" needed for the ghost to reach its destination (e.g. as in Equation 2).
  • The ghost may then be displayed to run, in block 650, at the calculated or retrieved speed from its original position towards the selected location, indicated with the destination mark, which it may reach after time "t". This action may be done in smooth motion, avoiding any jerkiness or other effects.
  • The physical user of the VR system experiences it as a stationary viewer watching the ghost running away from him/her, while the user may look towards any location in the virtual world.
  • The method may also display, in block 654, other avatars, not belonging to the user (or objects, or ideally the entire VR world), that may be in the vicinity (both stationary and taking actions) while the ghost is running towards its destination.
  • The method may continuously check if the user has made any teleportation interrupt request by monitoring the UI module for a trigger/button release event, in block 656, while it renders the running ghost. If a release event is detected, the user's avatar may be immediately teleported to the current position of the running ghost, in block 658, the ghost and the destination mark may be erased, and the method may loop back to detecting the user's gaze location, in block 600. If no UI module trigger/button release event is detected, in block 656, the method may continue displaying the ghost running until it reaches the destination mark, in block 660. Once there, the destination mark may be erased. Immediately after the ghost has reached its destination, the user's avatar may be instantly teleported to the destination location, in block 670.
  • The ghost may then be erased from the VR world, in block 680, and the method may end.
  • In some examples, the destination mark may be erased together with the ghost.
  • The ghost may be moving instead of running, according to the context of the VR application, and the selection of destination may be done with any of the interaction methods mentioned above other than gaze detection.
  • FIG.7a illustrates a schematic representation of the VR world after the user's gaze has been identified.
  • Here the VR world is that of a soccer game, containing goalpost 710 and the location 715 looked at by the user, i.e. the destination where he wants to be teleported.
  • FIG.7b illustrates a schematic representation of the VR world after the user's avatar ghost is first displayed. It comprises the goalpost 720, the user's destination 725 and the user's avatar ghost 728, which has just been displayed "jumping out" of the user's avatar.
  • FIG.7c illustrates a schematic representation of the VR world while the user's avatar ghost is running towards its destination. It comprises the goalpost 730, the ghost's destination mark 735, the ghost 738, and the avatar of a first opponent player 739 chasing the ghost 738.
  • FIG.7d illustrates a schematic representation of the VR world when the user's avatar ghost has just reached its destination. It comprises the goalpost 740, the ghost's destination mark 745, the ghost 748, and the first opponent player 749 chasing the ghost 748.
  • FIG.7e illustrates a schematic representation of the VR world after the user's avatar has been teleported to its destination. It comprises the goalpost 750, the first opponent player 755, and a second opponent player 756. The ghost and its destination mark have been erased, since the user's avatar has already been teleported.
  • In other embodiments, the VR world may be any other type of video game (e.g. fighting, war, flying, driving, racing, sports game, sports training, etc.), or any other type of professional VR application (e.g. personal perception training, disease or phobia treatment, rehabilitation, architecture, interior design, manufacturing worker training, engineer training, pilot training, etc.).
  • The running/moving avatar may be replaced by an alternative type of avatar, etc.
  • The method may detect the destination location the user wants to select by any available method. The pool of available methods may comprise the gaze, eye-tracking and pointing methods mentioned above, as well as other methods involving sensors (e.g. ultrasonic, infrared, etc.) or combinations thereof.
  • In some examples, the calculation of all parameters is done at the control unit 150 and its processors.
  • In other examples, processing may be distributed to one or more servers connected to the control unit 150. In this case, the heavy processing is performed by the at least one server, and the control unit 150 acts as an intermediary between the servers and the HMD displaying the VR world to the user.
  • FIG.8 shows the overall framework of application of the present invention.
  • A VR world is created, in block 800, by a VR system using a description in a programming language (e.g. VRML) and 3D graphics.
  • The VR system may determine the user's location, viewpoint, movements and interactions, in block 810, by reading and processing sensory data.
  • The VR system may use the current invention to reduce motion sickness, in block 820, and calculate and render the 3D VR objects, in block 830, according to the methods disclosed herein.
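The bullets above reference a distance-dependent scaling parameter (Equation 1), which is not reproduced in this text. A minimal sketch, assuming a linear falloff clamped to a minimum size; the functional form and all parameter values here are hypothetical:

    def ghost_scale(d: float, s0: float = 0.8, s_min: float = 0.5, k: float = 0.02) -> float:
        """Scale factor for the ghost after it has traversed distance d (in metres).

        s0    -- initial scale relative to the avatar (e.g. 80% of its size)
        s_min -- smallest size the ghost may shrink to
        k     -- shrink rate per metre travelled
        """
        return max(s_min, s0 - k * d)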

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to methods and devices for reducing simulation sickness in VR systems and applications, and in particular to teleportation inside a virtual world while the user can see the entire VR world. Solutions are proposed making use of 3D graphics, where a ghost of the user's avatar is displayed moving at natural speed towards the user-selected teleportation destination while the user watches it from his original position. Realism and the minimization of simulation sickness are also enhanced by simultaneously displaying the entire VR world alongside the ghost, also at natural speed of motion, and by allowing the user to look towards any location in the VR world. The invention also supports user interrupts to the teleportation operation and can be used, among other VR applications, in VR soccer simulators.

Description

REDUCING SIMULATION SICKNESS IN VIRTUAL REALITY APPLICATIONS
This application claims the benefit of European Patent Application EP17382380.8 filed June 20, 2017.
Technical Field
The present disclosure relates to techniques and devices for reducing simulation sickness in Virtual Reality (VR) systems and applications, and in particular in VR soccer simulators.
Background
Computing, and in particular the use of graphics, haptics, User Interaction (UI) and other techniques, has been used since the 1960s and 1970s to create virtual environments and allow users to interact with them. Typical examples include flight simulators used in military and civilian applications to train pilots to fly new airplanes and to handle abnormal and crisis situations and events in virtual flights accurately mimicking real ones. As these technologies started to mature they were introduced in industry and gaming with the widespread use of desktop computers, and later wireless mobile devices, bringing them closer to an increasing number of professional and recreational users.
Virtual Reality (VR), a technology that sometimes uses mobile device screens and headsets that cover the entire Field Of View (FOV) of its user, has become very popular for immersing and interacting in virtual worlds. It typically uses 3-Dimensional (3D) graphics, sounds and UI to give a realistic feeling of being in a 3D world and interacting with it. The amount of realism depends on the quality of the graphics, the accompanying sounds, the mode of interaction, the use of feedback like vibrations produced by haptic devices, and the synchronization between the virtual and real stimuli and the sensed signals received by the user.
Despite the realism offered by modern VR systems, it is very frequent for users to experience some degree of dizziness or motion sickness, as the signals received by the various human systems involved in balance are contradictory or at least not 100% synchronized. These systems comprise the visual system (i.e. the eyes capturing a live 3D image of the surroundings), the vestibular system (i.e. the balance-related organs in the ear), and the proprioceptive system (i.e. bodily position). In most cases disorientation is caused by sideward movements and, in particular, movements perpendicular to the direction of the user's sight. Thus, if the user is moving in the direction in which he is looking, the related motion sickness effects can be reduced. However, this solution is limited, as one cannot apply it in every case. Furthermore, different individuals experience varying degrees of simulation sickness from the same stimuli.
Motion sickness, or more appropriately termed "simulation sickness" in VR applications, has drawn significant attention in the research community and industry, and various solutions have been proposed and incorporated in modern VR systems. Among them, techniques have been used to detect simulation sickness by analyzing biomedical signals from the user of the VR system. Once simulation sickness is detected, special sounds, vibrations, electrostimulation at the user's head and other techniques are applied. Although these may reduce dizziness, they necessitate the use of complex, purpose-built equipment and require significant real-time processing power, neither of which is always available or convenient, and both of which are usually cumbersome.
To avoid these limitations, other approaches have been proposed. Considering, for example, the situation where a virtual user, or rather his avatar, needs to be teleported (i.e. instantly transported) to a position in the virtual world, such as in a video game, the user (i.e. the gamer) feels a disconnection between the real environment as he senses it and the virtual environment as he sees it through the 3D graphics. Many such products include techniques like a "dark mode" while teleporting, where the user can still see his feet while moving but his surroundings are darkened so as not to draw his attention; consequently the technique can "blur" the related dizziness. Similarly, with or without darkening, "blue floating (i.e. stabilization) cubes" or other marks are used to show the direction of motion, usually accompanied by drum-like sounds as the avatar progresses in the virtual world. "Blink mode" is also used, where the motion is intentionally made jerky, as if the avatar instantly jumps along intermediate positions before it reaches its final destination, i.e. implementing teleporting in small steps. This is usually supplemented by drum-like sounds for every step made along the path to the final destination.
These techniques may bring a reduction in dizziness, but at the cost of reduced realism in the VR application, therefore depriving it of its main feature. A special case of VR is Augmented Reality (AR), which follows a similar approach. However, AR uses a live video image of the real environment upon which it overlays virtual 2-Dimensional or 3D objects and avatars with which the user can interact. It presents a potentially more realistic experience for its user but still has similar problems. The use of a live real image for the surroundings (where the user is located) removes a great amount of dizziness, as the disconnection between the real and virtual worlds is significantly reduced. However, the calculation of the 3D virtual graphic overlays may involve a perceivable delay and/or misalignment, which makes the perceived experience unreal and eventually introduces simulation sickness. Furthermore, it requires serious computational resources and is not always feasible, as the user may have, for instance, to physically run long distances to move inside the real world, since this is necessary for navigating within it; this comes in contrast with VR, where the user can navigate in the virtual world while standing at the same location and employing a UI device, his gaze, head movements, hand gestures and body and leg movements.
It is, therefore, obvious that a better solution is needed to combat simulation sickness. This solution needs to be easy to implement, versatile and not require special-purpose, cumbersome hardware; a solution that will preserve realism, allow easy interaction and not prevent the user from seeing what is happening in the virtual world (e.g. other avatars). This solution should be suitable for use in gaming and professional VR systems.
Summary of the invention
The current disclosure teaches a solution to the problem of dizziness or simulation sickness in VR systems and applications, and in particular to teleportation inside a virtual world.
It also teaches how to enhance the realism of the VR experience perceived by the user by taking into account time and speed considerations, as well as other events occurring in parallel in the virtual world, with the aim of reducing simulation sickness and increasing the user's awareness of events and developments in the virtual world during the sequence of the user's teleportation. That is, the user may be aware of what is happening around him/her and in the overall VR environment during his/her movement.
In a first aspect, a method is proposed of reducing simulation sickness in a virtual reality application or system. Said simulation sickness may be caused by contradicting and imperfectly synchronized signals received from the human visual, vestibular, and proprioceptive systems involved in balance. The method may comprise identifying a location in the virtual world where the user wants to teleport; identifying a user's request for teleportation to said location; displaying a visual mark at said location; displaying a ghost of the user's avatar at the user's current position; displaying said ghost moving towards the visual mark; and instantly teleporting the user to the location of the visual mark once the ghost has reached said mark. While displaying the moving ghost, virtual objects and avatars of virtual living beings are also displayed nearby or while interacting with the ghost; at the same time, the user may be looking towards any location in the virtual world.
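By way of illustration only (the disclosure itself specifies no code), the claimed sequence may be sketched as a small state machine; all names in the sketch are hypothetical:

    from enum import Enum, auto

    class TeleportPhase(Enum):
        IDLE = auto()           # waiting for an unoccupied destination and a user request
        MARK_SHOWN = auto()     # visual mark displayed at the chosen location
        GHOST_MOVING = auto()   # ghost has "jumped out" and moves towards the mark
        DONE = auto()           # user instantly teleported, ghost and mark erased

    def next_phase(phase: TeleportPhase, ghost_arrived: bool = False) -> TeleportPhase:
        """Advance the teleportation sequence by one step (simplified sketch)."""
        if phase is TeleportPhase.IDLE:
            return TeleportPhase.MARK_SHOWN     # location identified and teleport requested
        if phase is TeleportPhase.MARK_SHOWN:
            return TeleportPhase.GHOST_MOVING   # ghost displayed at the avatar's position
        if phase is TeleportPhase.GHOST_MOVING and ghost_arrived:
            return TeleportPhase.DONE           # teleport the user to the mark's location
        return phase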
The method teaches identifying the user's intended destination as indicated by his/her gaze (using eye tracking technology, or technology measuring the direction pointed by the centre line of his/her virtual reality display, or any other pointing method), and rendering in the virtual world a ghost of his/her avatar. The ghost exits the user's avatar and moves in natural motion towards the user-selected destination, which is indicated by a visual mark in the virtual environment. By employing a ghost of the avatar, the problem of reducing dizziness or simulation sickness in Virtual Reality systems and applications may be treated. In particular, a solution is proposed that is easy to implement, does not necessitate the use of purpose-built or expensive hardware, and requires no intensive processing or user training.
The proposed solution teaches a method for using a ghost element of a user's avatar for teleportation into the virtual world, where the ghost is displayed exiting the avatar; while the avatar stays fixed at its original position, the ghost runs towards a destination identified by the VR system from the user's gaze direction. Once the ghost reaches its destination it is erased from the VR world and the avatar is instantly teleported to the same destination.
In some examples, the method may include the calculation of the speed at which the ghost element runs, so as to increase realism and avoid dizziness resulting from jerky or very fast motion.
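Although the method used to compute the speed is left open, the relation between speed, distance and travel time follows from basic physics; a minimal sketch:

    def travel_time(distance_l: float, speed_v: float) -> float:
        """Time t needed for the ghost to cover distance l at speed v (t = l / v)."""
        if speed_v <= 0:
            raise ValueError("ghost speed must be positive")
        return distance_l / speed_v

    # For example, a ghost running 12 m at a natural 4 m/s reaches its mark after 3 s.
    assert travel_time(12.0, 4.0) == 3.0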
In some examples, the method may comprise rendering other avatars and events taking place at the same time in the VR world while the ghost is running towards its destination.
In some examples, the method may further comprise allowing the user to interrupt the teleportation operation prior to the ghost reaching its destination, and rendering the avatar in a ready state for another teleportation or other action. According to this method, once the VR system receives an interrupt request from the user, it stops the ghost at its current position in the virtual scene, prior to its destination, erases it, and instantly teleports the avatar to the same position.
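A minimal sketch of this interrupt behaviour, assuming positions are simple coordinate tuples; on an interrupt before arrival, the avatar simply takes over the ghost's current position:

    def handle_interrupt(avatar_pos, ghost_pos, ghost_arrived: bool):
        """On an interrupt request before the ghost arrives, teleport the avatar
        to the ghost's current position; the ghost and destination mark are then
        erased by the caller, leaving the system ready for another teleportation."""
        if not ghost_arrived:
            avatar_pos = ghost_pos   # instant teleport to the interrupted position
        return avatar_pos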
In a second aspect the taught method is modified to adapt the teleporting motion in such a way as to match natural speed according to the circumstances and to allow the user to view other avatars and their actions happening in parallel.
In a third aspect, the above method also includes a mechanism for interrupting the teleportation operation prior to its termination and rendering the VR application into a state ready for a subsequent teleportation or other operation.
In another aspect, a hardware processor is disclosed for implementing the teleportation method in the VR environment.
In yet another aspect, a computer program product is disclosed. The computer program product may comprise program instructions for causing a computing system to perform a method of teleporting a ghost character of an avatar inside a VR world according to some examples disclosed herein.
The computer program product may be embodied on a storage medium (for example, a CD-ROM, a DVD, a USB drive, on a computer memory or on a read-only memory) or carried on a carrier signal (for example, on an electrical or optical carrier signal).
The computer program may be in the form of source code, object code, a code intermediate source and object code such as in partially compiled form, or in any other form suitable for use in the implementation of the processes. The carrier may be any entity or device capable of carrying the computer program.
For example, the carrier may comprise a storage medium, such as a ROM, for example a CD ROM or a semiconductor ROM, or a magnetic recording medium, for example a hard disk. Further, the carrier may be a transmissible carrier such as an electrical or optical signal, which may be conveyed via electrical or optical cable or by radio or other means.
When the computer program is embodied in a signal that may be conveyed directly by a cable or other device or means, the carrier may be constituted by such cable or other device or means.
Alternatively, the carrier may be an integrated circuit in which the computer program is embedded, the integrated circuit being adapted for performing, or for use in the performance of, the relevant methods.
Brief description of the drawings
FIG.1 shows a Virtual Reality system;
FIG.2 shows a Hardware Architecture of a Processor used in a Virtual Reality System;
FIG.3 shows a Software Architecture used in a Virtual Reality System;
FIG.4 shows a flowchart of the steps involved in teleporting a ghost element of the user's avatar in a Virtual Reality world for reducing simulation sickness;
FIG.5 shows a flowchart of the steps involved in teleporting a ghost element of the user's avatar in a Virtual Reality world for enhanced realism and for reducing simulation sickness;
FIG.6 shows a flowchart of the steps involved in interrupting a teleporting operation;
FIG.7a illustrates a schematic representation of the VR world after the user's gaze has been identified;
FIG.7b illustrates a schematic representation of the VR world after the user's avatar ghost is first displayed;
FIG.7c illustrates a schematic representation of the VR world while the user's avatar ghost is running towards its destination;
FIG.7d illustrates a schematic representation of the VR world when the user's avatar ghost has just reached its destination;
FIG.7e illustrates a schematic representation of the VR world after the user's avatar has been teleported to its destination;
FIG.8 shows the overall framework of application of the present invention.
Detailed description
The word "exemplary" is used herein to mean "serving as an example, instance, or illustration". Any example described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. The acronym "OS" is intended to mean Operating System".
The acronym "CPU" is intended to mean "Central Processing Unit".
The acronym "Ul" is intended to mean "User Interaction".
The acronym "VR" is intended to mean "Virtual Reality".
The acronym "AR" is intended to mean "Augmented Reality".
The acronym "2D" is intended to mean "Two Dimensional".
The acronym "3D" is intended to mean "Three Dimensional".
The acronym "HMD" is intended to mean "Head-Mounted Display".
The acronym "VRML" is intended to mean "Virtual Reality Modeling Language".
As used herein and in the claims, the singular forms "a," and "the" include plural references unless the context clearly dictates otherwise. Thus, for example, reference to "a virtual machine" includes one or a plurality of such machines and equivalents thereof known to those skilled in computing.
As used herein and in the claims, the term "memory" has the meaning of any type of memory element that is implemented in any known technology and configuration unless explicitly stated otherwise.
FIG.1 shows a Virtual Reality system. The VR system 100 may be worn by a user 110. It may comprise a Head-Mounted Display (HMD) 120 attached to the user's head by a fixture, elastic band, strap, or other mounting apparatus 130. The HMD 120 may supply 3D graphics in the form of an immersive world which the user 110 can navigate and interact with, according to the particular VR application used. The HMD may feature a pair of earphones 140 used to provide sound related to the VR world and the user's interaction.
In an alternative exemplary embodiment, the earphones 140 may be replaced by a pair of speakers attached to or integrated in the HMD 120, or placed in the real surroundings.
In yet another exemplary embodiment, the HMD 120 may be replaced by any other display device providing immersiveness by ensuring that the displayed graphics scene extends well beyond the human field of view, ideally at least 180°. By means of example, and without limiting the scope of the invention, one may use a VR helmet, VR visor, VR binoculars, large curvilinear screens, large curvilinear projection screens and the like. The HMD 120 may be equipped with sensors (e.g. gyroscopes, accelerometers, position sensors, etc.), which may be used to accurately determine the direction the user is looking in the three coordinates. It may also comprise a pair of screens used to present 3D graphics in stereo. Alternatively, a single screen may be used as one unit or divided into two smaller screens.
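The disclosure does not prescribe how the looked-at point is derived from these sensors; one common approach, assumed here purely for illustration, is to intersect the gaze ray given by the HMD's yaw and pitch with the virtual ground plane:

    import math

    def gaze_ground_point(head_pos, yaw_deg: float, pitch_deg: float):
        """Intersect the HMD gaze ray with the ground plane y = 0.

        head_pos  -- (x, y, z) of the user's head in the virtual world, with y > 0
        yaw_deg   -- horizontal look direction; pitch_deg -- tilt (negative = down)
        Returns the (x, 0, z) point the user is staring at, or None when the
        user is not looking towards the ground.
        """
        yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
        dir_y = math.sin(pitch)
        if dir_y >= 0:                      # looking level or upwards: no ground hit
            return None
        dir_x = math.cos(pitch) * math.sin(yaw)
        dir_z = math.cos(pitch) * math.cos(yaw)
        t = -head_pos[1] / dir_y            # ray parameter at which y reaches 0
        return (head_pos[0] + t * dir_x, 0.0, head_pos[2] + t * dir_z)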
The user may interact with his environment by means of any available UI device 160. By means of example, a joystick, a gamepad, a keyboard, or other interaction devices like a Wii controller, an interaction glove, etc. may be used. In alternative exemplary embodiments, the UI device 160 may be omitted and the user may interact by hand gestures, bodily movements, etc. (e.g. simple gesture capture using at least one camera or ultrasonic sensors, etc.), or with a combination of voice commands or natural language with any of these UI methods. These modes of interaction may necessitate the use of cameras or other sensors. It is noted that these may be used if available, but they are not necessary to implement the present invention and are not part of the invention. The choice of UI method by no means limits the scope of the present invention, as any such method falls within the context of the invention.
The control of the operation of the VR system and the implementation of the invention is done by means of a control unit 150. This may be any type of portable computing device, like a smart-phone, a tablet, a portable PC, a minicomputer, a specialized computing device, or the like. The control unit 150 may be equipped with a battery and may be connected to the HMD 120 and UI device 160 via a cable, not shown in FIG.1. This cable may comprise wires carrying data and control signals, as well as wires carrying power to the HMD 120 and UI device 160.
In an alternative exemplary embodiment, the HMD 120 and UI device 160 may possess the necessary wireless communication capabilities to communicate with the control unit 150, and no cables are used. In this case, they may comprise short-range communication modules based on, for example, Bluetooth (e.g. BLE, Bluetooth Low Energy), NFC, Zigbee or Wi-Fi technology. In this exemplary embodiment the HMD 120 and UI device 160 may also be equipped with batteries or some other electrical energy provision means to ensure their uninterrupted operation.
The control unit 150 may possess enough processing capability to use data from the HMD 120 and UI device 160 and their sensors to adapt the VR world and render the 3D graphics accurately and fast enough to create realism and minimize simulation sickness. The VR world and 3D graphics may be retrieved from a storage location and may be calculated, processed and adapted by the control unit 150. The storage may be at the control unit 150 or remote. In yet another alternative exemplary embodiment, the control unit 150 may have very little processing power and may not be capable of creating and adapting the VR worlds. It may be more like a dumb terminal, relaying data and 3D graphics between the HMD 120, the UI device 160 and one or more local or remote servers (not shown in FIG.1) or some cloud infrastructure. All processing may be done by the one or more local or remote servers or the cloud infrastructure.
In any case, the control unit 150 may be implemented by electronic means, computing means or a combination of them, that is, said electronic/computing means may be used interchangeably so that a part of the described means may be electronic means and the other part may be computing means, or all described means may be electronic means or all described means may be computing means. Examples of a control unit 150 comprising only electronic means (that is, a purely electronic configuration) may be a programmable electronic device such as a CPLD (Complex Programmable Logic Device), an FPGA (Field Programmable Gate Array) or an ASIC (Application-Specific Integrated Circuit). An example of a control unit 150 comprising only computing means may be a computing system, which may comprise a memory and a processor, the memory being adapted to store a series of computer program instructions, and the processor being adapted to execute these instructions stored in the memory in order to generate the various events and actions for which the control unit has been programmed.
In addition, the control unit 150 may also have a hybrid configuration between computing and electronic means. In this case, the unit may comprise a memory and a processor to implement computationally part of its functionalities and certain electronic circuits to implement the remaining functionalities.
FIG.2 shows a Hardware Architecture of a Processor used in a Virtual Reality System. It may comprise an HMD 200 or other similar headset or display unit, a UI 220, and a control unit 250, all interconnected by a wired or wireless network.
The HMD 200 may comprise a display unit 205 for displaying the VR world to the user and motion and position sensors 210 for accurately determining the direction of gaze of the user, his position and motion.
The UI 220 may comprise an optional haptics module 225, an actuator module 230, and a motion controller 235, which may provide force feedback to the user, making the VR experience more natural and minimizing simulation sickness. It may also comprise a UI device 240 for user interaction with the system; this may be a joystick, a gamepad, or a game controller device (e.g. a Wii or button-based device, etc.).
The control unit 250 may comprise a CPU 255 for performing all processing and control of the operation of the control unit 250; an optional (but preferred) graphics accelerator 260 for calculating the 3D graphics and rendering the VR worlds, rotating, scaling and transposing them; an audio module 265 for handling sound; a communications unit 270 for wired and/or wireless communication with the HMD 200 and Ul 220, and/or wireless communication with local or remote servers and cloud infrastructure; a memory 275 (of any type or combination of different memory types); a storage unit 280 for storing software, graphics and other data; and a battery 285 for powering the control unit 250 and, in some exemplary embodiments, the HMD 200 and Ul 220 (when these are wired to the control unit 250).
FIG.3 shows a Software Architecture used in a Virtual Reality System. At the lowest layer there may be the Device-Specific Capabilities 390, that is, the device-specific commands for controlling the various device hardware components. Moving to higher layers, there may lie the OS 380, Virtual Machines 360-370 (such as a Java Virtual Machine), the Device/User Manager 350, the Application Manager 340 and, at the top layer, the Applications 310-330. These applications may access, manipulate and display data. In an alternative exemplary embodiment only a single Virtual Machine and one Application may be present.
FIG.4 shows a flowchart of the steps involved in teleporting a ghost element of the user's avatar in a Virtual Reality world for reducing simulation sickness. The method may start with identifying a location in the virtual world 400 where the user wants to teleport, e.g. by capturing his gaze with at least one camera mounted on the HMD and analyzing it.
This virtual world is already calculated and rendered on the HMD 200 by the control unit 250. It may be programmed using any available programming language, such as VRML.
The location stared at by the user may be on the virtual ground or floor where the user may want to move his avatar. By means of example, this location may be configured in the method to correspond to a square of dimensions 1 m × 1 m, or a circle of diameter 1 m, or any other similarly defined location, and be tied to a coordinate point (x, y, z) in the virtual world. The size and shape of the location may be parameterized and selected before or during the rendering of the VR world. The method may check if this point (and location) is occupied by any other object or avatar 410. If the location is already occupied, the method may loop back to the previous step 400 and may wait until the user looks at an unoccupied location. Then, it may check if the user has made a teleportation request, e.g. by pressing a trigger or button on the Ul device 240. If not, it may wait until a press event is detected or a new location is looked at by the user 425. Upon detecting a press event 420 (i.e. a teleportation request), the method may display a destination mark at the selected location 430 to help the user understand his selection and indicate the destination of movement, so as to prepare him and minimize the effect of simulation sickness.
A ghost of the user's avatar may then be rendered and shown to "jump out" of the user's avatar at its current position 440.
The ghost may then be displayed to run 450 from its original position towards the selected location, indicated with the destination mark. This action may be done in smooth motion, avoiding any jerkiness or other effects. The physical user of the VR system may experience it as a stationary viewer watching the ghost running away from him. The ghost may keep running until it reaches the destination mark 460. Once there, the destination mark may be erased. Immediately after the ghost has reached its destination, the user's avatar may be instantly teleported to the destination location 470. The ghost may then be erased from the VR world 480 and the method may end.
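By way of illustration only, this basic flow of FIG.4 may be sketched as follows in Python; the callbacks (get_gaze, is_occupied, trigger_pressed, draw) and the fixed step size are illustrative assumptions, not part of the disclosed system:

```python
import math

def ghost_teleport(get_gaze, is_occupied, trigger_pressed, avatar_pos, draw,
                   step=0.1):
    # Blocks 400-425: wait until the user gazes at an unoccupied location
    # and presses the trigger (the teleportation request).
    while True:
        dest = get_gaze()
        if not is_occupied(dest) and trigger_pressed():
            break
    draw("destination_mark", dest)                # block 430
    ghost = list(avatar_pos)                      # block 440: ghost "jumps out"
    while math.dist(ghost, dest) > step:          # blocks 450-460: smooth run
        d = math.dist(ghost, dest)
        ghost = [g + step * (t - g) / d for g, t in zip(ghost, dest)]
        draw("ghost", ghost)
    draw("erase_mark", None)
    avatar_pos[:] = dest                          # block 470: instant teleport
    draw("erase_ghost", None)                     # block 480
```

In a real engine each iteration of the inner loop would correspond to one rendered frame; the sketch is only meant to show the ordering of blocks 400-480.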
In an alternative exemplary embodiment, the destination mark may be erased together with the ghost. In other alternative exemplary embodiments the ghost may be moving or walking instead of running, according to the context of the VR application, and the selection of destination may be done with any of the interaction methods mentioned above other than gaze detection.

FIG.5 shows a flowchart of the steps involved in teleporting a ghost element of the user's avatar in a Virtual Reality world for enhanced realism and for reducing simulation sickness. The method may start with identifying a location in the virtual world 500 where the user may be looking. This virtual world may be already calculated and rendered by the control unit 250 on the HMD 200. It may be programmed using any available programming language, such as VRML. The location stared at by the user may be on the virtual ground or floor where the user wants to move his avatar. By means of example, this location may be configured in the method to correspond to a square of dimensions 1 m × 1 m, or a circle of diameter 1 m, or any other similarly defined location, and be tied to a coordinate point (x, y, z) in the virtual world. The size and shape of the destination location may be parameterized and changed before or after rendering the VR world. The method may check if this point (and location) is occupied by any other object or avatar 510. If the location is already occupied, the method may loop back to the previous step 500 and may wait until the user looks at an unoccupied location. Then, it may check if the user has made a teleportation request, e.g. by pressing a trigger or button on the Ul device 240. If not, it may wait until a press event is detected or a new location is looked at by the user 525. Upon detecting a press event 520, the method may display a destination mark at the selected location 530 to help the user understand his selection and indicate the destination of movement, so as to prepare him and minimize the effect of simulation sickness.
A ghost of the user's avatar may then be rendered and shown to "jump out" of the user's avatar at its current position 540. A scaling parameter "s" may be set to scale the ghost to a smaller size than the original user's avatar. This scaling may be useful to make the ghost easily distinguishable from the avatar. By means of example, the scaling parameter may be set to 0.8 for scaling the ghost to 80% of the original user's avatar size. In an alternative exemplary embodiment, a variable scaling parameter may be set. This parameter may depend on the traversed distance "d" from the current position, so that the ghost is rendered progressively smaller as it runs away from the user's avatar. By means of example, and without limiting the scope of the current invention, variable scaling may be computed by:

s = s0 * (c / d) (Equation 1)

where s is the current scaling value to be applied to the ghost, s0 is the initial scaling value applied when the ghost is first rendered at its initial position, c is a constant and d is the traversed distance from the original ghost position (i.e. the position of the user's avatar) towards its destination.
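A minimal sketch of Equation 1 follows, assuming (our addition, not stated above) that d is clamped below by some d_min so that the scale stays finite just after the ghost jumps out:

```python
def ghost_scale(s0, c, d, d_min=0.5):
    # Equation 1: s = s0 * (c / d). The clamp d_min is an illustrative
    # assumption so the scale does not diverge as d approaches zero.
    return s0 * (c / max(d, d_min))
```

With s0 = 0.8 and c = d_min, for example, the ghost starts at 80% of the avatar's size and shrinks progressively as it runs away.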
Before the ghost is rendered to run towards its destination, the speed of motion "v" may be calculated in block 545 or retrieved from a storage location (memory, permanent storage medium, etc.) or storage construct (e.g. database, variable, vector, table, etc.).
The purpose of block 545 is to enhance realism in the ghost's motion and the overall VR experience so that the user's feeling of disconnection between the real and virtual worlds is minimized and consequently simulation sickness is reduced.
Speed may simply be read as a constant value applicable to all situations, or be computed taking into account any number of parameters, like terrain, weather, energy levels associated with the user's avatar, fight-or-flight situations, psychological factors, etc. The method used to calculate speed from these parameters is beyond the scope of this invention and is obvious to any person of ordinary skill in the related art and in basic physics.
Having calculated or retrieved the ghost's speed and measured the distance "l" to be run, the method may calculate the time "t" needed for the ghost to reach its destination as follows:

t = l / v (Equation 2)
The ghost may then be displayed to run, in block 550, at the calculated or retrieved speed from its original position towards the selected location, indicated with the destination mark, which it reaches after time "t". This action may be done in smooth motion, avoiding any jerkiness or other effects. The physical user of the VR system may experience it as a stationary viewer watching the ghost running away from him, while at the same time the user may look towards any location in the virtual world.
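As a sketch only, this smooth run can be expressed as a per-frame linear interpolation whose duration follows Equation 2; the 90 Hz frame interval dt is an assumption for illustration, not part of the disclosure:

```python
import math

def run_ghost(start, dest, v, dt=1.0 / 90):
    # Equation 2: t = l / v gives the total run time; one position is
    # yielded per rendered frame, so the motion is a smooth, jerk-free
    # linear interpolation from start to dest.
    l = math.dist(start, dest)
    frames = max(1, round((l / v) / dt))
    for i in range(1, frames + 1):
        a = i / frames
        yield [s + a * (e - s) for s, e in zip(start, dest)]
```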
To enhance realism and reduce simulation sickness the method may also display other avatars (and/or objects, or ideally the entire VR world), in block 554, that may be in the vicinity (both stationary and taking actions) while the ghost is running towards its destination. These avatars may not belong to the user.
The ghost may keep moving, running or walking until it reaches the destination mark, in block 560. Running or walking may depend on the context of the application and the interaction method. Once there, the destination mark may be erased. Immediately after the ghost has reached its destination, the user's avatar may be instantly teleported to the destination location, in block 570.
The ghost may then be erased from the VR world, in block 580, and the method ends.
In an alternative exemplary embodiment, the destination mark may be erased together with the ghost.
FIG.6 shows a flowchart of the steps involved in interrupting a teleporting operation.
The method may start with identifying a location in the virtual world 600 where the user may be looking. This virtual world may already be calculated and rendered by the control unit 250 on the HMD 200. It may be programmed using any available programming language, such as VRML. The location stared at by the user may be on the virtual ground or floor where the user wants to move his avatar. By means of example, this location may be configured in the method to correspond to a square of dimensions 1 m × 1 m, or a circle of diameter 1 m, or any other similarly defined location, and be tied to a coordinate point (x, y, z) in the virtual world. The size and shape of the destination location may be parameterized and changed before or after rendering the VR world. The method may check if this point (and location) is occupied by any other object or avatar 610. If the location is already occupied, the method may loop back to the previous step 600 and may wait until the user looks at an unoccupied location. Then, it may check if the user has made a teleportation request, e.g. by pressing a trigger or button on the Ul device 240. If not, it may wait until a press event is detected or a new location is looked at by the user 625. Upon detecting a press event 620, the method may display a destination mark at the selected location 630 to help the user understand his selection and indicate the destination of movement, so as to prepare him and minimize the effect of simulation sickness.
A ghost of the user's avatar may then be rendered and shown to "jump out" of the user's avatar at its current position 640. A scaling parameter "s" may be set to scale the ghost to a smaller size than the original user's avatar. This parameter may depend on the traversed distance "d" from the current position (e.g. as given by Equation 1), so that the ghost is rendered progressively smaller as it runs away from the user's avatar. Before the ghost is rendered to run towards its destination, the speed of motion "v" may be calculated, in block 645, or retrieved from a storage location (memory, permanent storage medium, etc.) or storage construct (e.g. database, variable, vector, table, etc.). The purpose of block 645 is to enhance realism in the motion and the overall VR experience, so that the user's feeling of disconnection between the real and virtual worlds may be minimized and consequently simulation sickness may be reduced.
Speed may simply be read as a constant value applicable to all situations, or be computed taking into account any number of parameters, like terrain, weather, energy levels associated with the user's avatar, fight-or-flight situations, psychological factors, etc. The method used to calculate speed from these parameters is beyond the scope of this invention and is obvious to any person of ordinary skill in the related art and in basic physics. Having calculated or retrieved the ghost's speed and measured the distance "l" to be run, the method may calculate the time "t" needed for the ghost to reach its destination (e.g. as in Equation 2).
The ghost may then be displayed to run, in block 650, at the calculated or retrieved speed from its original position towards the selected location, indicated with the destination mark, which it may reach after time "t". This action may be done in smooth motion, avoiding any jerkiness or other effects. The physical user of the VR system experiences it as a stationary viewer watching the ghost running away from him/her, while the user may look towards any location in the virtual world.
To enhance realism and reduce simulation sickness the method may also display, in block 654, other avatars, not belonging to the user (or objects, or ideally the entire VR world), that may be in the vicinity (both stationary and taking actions) while the ghost is running towards its destination.
The method may continuously check if the user has made any teleportation interrupt request by monitoring the Ul module for a trigger/button release event, in block 656, while it renders the running ghost. If a release event is detected, the user's avatar may be immediately teleported to the current position of the running ghost, in block 658, the ghost and the destination mark may be erased, and the method may loop back to detecting the user's gaze location, in block 600. If no Ul module trigger/button release event is detected, in block 656, the method may continue displaying the ghost running until it reaches the destination mark, in block 660. Once there, the destination mark may be erased. Immediately after the ghost has reached its destination, the user's avatar may be instantly teleported to the destination location, in block 670.
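A sketch of this interrupt check (blocks 656-658) follows, reusing the run_ghost generator sketched earlier; trigger_released is a hypothetical callback polled once per rendered frame:

```python
def run_ghost_interruptible(start, dest, v, trigger_released, dt=1.0 / 90):
    # Block 656: poll for a release event while the ghost runs. On release,
    # block 658: return the ghost's current position so the avatar can be
    # teleported there immediately.
    pos = list(start)
    for pos in run_ghost(start, dest, v, dt):
        if trigger_released():
            return pos, True       # interrupted: teleport to the ghost
    return pos, False              # blocks 660-670: teleport to the mark
```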
The ghost may then be erased from the VR world, in block 680, and the method may end. In an alternative exemplary embodiment, the destination mark may be erased together with the ghost.
In other alternative exemplary embodiments the ghost may be moving instead of running, according to the context of the VR application, and the selection of destination may be done with any of the interaction methods mentioned above other than gaze detection.
FIG.7a illustrates a schematic representation of the VR world after the user's gaze has been identified. In this example, the VR world is that of a soccer game, containing goalpost 710 and the location 715 looked at by the user, i.e. the destination where he wants to be teleported.
FIG.7b illustrates a schematic representation of the VR world after the user's avatar ghost is first displayed. It comprises the goalpost 720, the user's destination 725 and the user's avatar ghost 728 which has just been displayed "jumping out" of the user's avatar.
FIG.7c illustrates a schematic representation of the VR world while the user's avatar ghost is running towards its destination. It comprises the goalpost 730, the ghost's destination mark 735, the ghost 738, and the avatar of a first opponent player 739 chasing the ghost 738.
FIG.7d illustrates a schematic representation of the VR world when the user's avatar ghost has just reached its destination. It comprises the goalpost 740, the ghost's destination mark 745, the ghost 748, and the first opponent player 749 chasing the ghost 748.
FIG.7e illustrates a schematic representation of the VR world after the user's avatar has been teleported to its destination. It comprises the goalpost 750, the first opponent player 755, and a second opponent player 756. The ghost and its destination mark have been erased since the user's avatar has already been teleported. In an alternative exemplary embodiment, the VR world may be any other type of video game (e.g. fighting, war, flying, driving, racing, sports game, sports training, etc.), or any other type of professional VR application (e.g. personal perception training, disease or phobia treatment, rehabilitation, architecture, interior design, manufacturing worker training, engineer training, pilot training, etc.). In these cases, the running/moving avatar may be replaced by an alternative type of avatar, etc.
In other alternative exemplary embodiments, the method may detect the destination location the user wants to select by any available method. By means of example, and by no means limiting the scope of the invention, the pool of available methods may comprise the following (a brief dispatch sketch is given after the list):
- Gaze detection - eye tracking using at least one camera in the HMD;
- Hand gesture detection using cameras in the area where the VR system is used;
- Ul device comprising touchpad, touch screen, joystick or other Ul method;
- Voice commands in structured or natural speech, captured with one or more microphones attached to any module of the VR system or located in the area where the VR system is used;
- Other methods involving sensors (e.g. ultrasonic, infrared, etc.) or combinations thereof.
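Purely as an illustration of how such a pool might be wired together — every handler name below is hypothetical, not part of the disclosed system:

```python
# Hypothetical dispatch over the selection methods listed above.
DESTINATION_DETECTORS = {
    "gaze":    lambda sensors: sensors["eye_tracker"].hit_point(),
    "gesture": lambda sensors: sensors["room_cameras"].pointed_location(),
    "ui":      lambda sensors: sensors["ui_device"].cursor_position(),
    "voice":   lambda sensors: sensors["microphones"].parsed_target(),
}

def detect_destination(sensors, method="gaze"):
    # Returns the (x, y, z) destination from whichever method is configured.
    return DESTINATION_DETECTORS[method](sensors)
```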
In the current exemplary embodiment, the calculation of all parameters (e.g. speed and duration of ghost movement), 3D graphics, motion, and interaction is done at the control unit 150 and its processors. In alternative exemplary embodiments, processing may be distributed to one or more servers connected to the control unit 150. In this case, the heavy processing may be performed by the at least one server, and the control unit 150 may act as an intermediary between the servers and the HMD displaying the VR world to the user.
FIG.8 shows the overall framework of application of the present invention. A VR world is created, in block 800, by a VR system using a description in a programming language (e.g. VRML) and 3D graphics. The VR system may determine the user's location, viewpoint, movements and interactions, in block 810, by reading and processing sensory data. The VR system may use the current invention to reduce simulation sickness, in block 820, and calculate and render the 3D VR objects, in block 830, according to the methods disclosed herein.
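This framework might be driven by a loop of roughly the following shape — a sketch under the assumption of a hypothetical engine object; none of these method names appear in the disclosure:

```python
def vr_main_loop(engine):
    world = engine.create_world("world.wrl")       # block 800, e.g. from VRML
    while engine.running():
        state = engine.read_sensors()              # block 810: location, gaze, input
        engine.apply_ghost_teleport(world, state)  # block 820: methods of FIGs. 4-6
        engine.render(world, state)                # block 830: calculate and draw 3D objects
```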
It is understood that the above elements not forming part of the invention may be implemented using any method, technique, hardware and software available or preferable to any person of ordinary skill in the related art.
The above example descriptions are simplified and do not include hardware and software elements that are used in the examples but are not part of the current invention, are not needed for the understanding of the examples, and are obvious to any person of ordinary skill in the related art. Furthermore, variations of the described method, system architecture, and software architecture are possible, where, for instance, method steps and hardware and software elements may be rearranged, omitted, or new ones added.
Although only a number of examples have been disclosed herein, other alternatives, modifications, uses and/or equivalents thereof are possible. Furthermore, all possible combinations of the described examples are also covered. Thus, the scope of the present disclosure should not be limited by particular examples, but should be determined only by a fair reading of the claims that follow. If reference signs related to drawings are placed in parentheses in a claim, they are solely for attempting to increase the intelligibility of the claim, and shall not be construed as limiting the scope of the claim. Further, although the examples described with reference to the drawings comprise computing apparatus/systems and processes performed in computing apparatus/systems, the invention also extends to computer programs, particularly computer programs on or in a carrier, adapted for putting the system into practice.

Claims

1. A method of reducing simulation sickness in a virtual reality application or system, said simulation sickness being caused by contradicting and imperfectly synchronized signals received from the human visual, vestibular, and proprioceptive systems involved in balance, the method comprising:
- identifying a user's request for teleportation to a selected location;
- displaying a visual mark at said location;
- displaying a ghost of the user's avatar at the user's current position;
- displaying said ghost moving towards the visual mark;
- instantly teleporting the user to the location of the visual mark once the ghost has reached said mark.
2. The method according to claim 1, further comprising applying a pre-selected speed value, or calculating and applying a speed value to the movement of the avatar, said speed value being substantially equal to the speed of natural motion in the context of said virtual reality application.
3. The method according to any of claims 1 and 2, further comprising displaying the entire VR world together with the moving ghost.
4. The method according to any of claims 1 to 3, further comprising:
- identifying a user's request to interrupt an ongoing teleportation operation;
- immediately teleporting the user to the current location of the moving ghost; and
- erasing the ghost and visual mark.
5. The method according to any of claims 1 to 4, where the virtual reality application may be a video game, a VR movie, an AR application, or a professional application, such applications comprising:
- fighting game;
- war game;
- flying simulator;
- driving simulator;
- racing game;
- sports game;
- sports training;
- personal perception training;
- disease or phobia treatment;
- rehabilitation;
- architecture application;
- interior design application;
- scene interaction application;
- manufacturing worker training application;
- engineer training application;
- pilot training application;
- criminal science and forensics application; and
- accident analysis and visualization application.
6. The method according to any of claims 1 to 5, where the means for identifying the location in the virtual world where the user wants to teleport, comprises at least one of:
- using the centre line from the VR display;
- using at least one camera to capture and analyze the user's gaze;
- using at least one camera to capture and analyze the user's hand gestures;
- using at least one camera to capture and analyze the user's bodily movements;
- using at least one microphone to capture and analyze the user's voice commands in the form of structured or natural speech;
- using signals from a Ul device comprising one of touchpad, touch screen, joystick or other Ul method; and
- using a combination thereof.
7. The method according to claim 4, where identifying the user's request to interrupt an ongoing teleportation operation, comprises at least one of:
- using the centre line from the VR display;
- using at least one camera to capture and analyze the user's gaze;
- using at least one camera to capture and analyze the user's hand gestures;
- using at least one camera to capture and analyze the user's bodily movements;
- using at least one microphone to capture and analyze the user's voice commands in the form of structured or natural speech;
- using signals from a Ul device comprising one of touchpad, touch screen, joystick or other Ul method; and
- using a combination thereof.
8. The method according to any of claims 1 to 7, further comprising verifying that said location in the virtual world where the user wants to teleport is not occupied by a virtual object or avatar.
9. A processor for reducing simulation sickness in a virtual reality application, said simulation sickness being caused by contradicting and imperfectly synchronized signals received from the human visual, vestibular, and proprioceptive systems involved in balance, said processor possessing adequate processing power to perform tasks in real time, and said processor comprising logic to:
- identify a user's request for teleportation to a selected location;
- display a visual mark at said location;
- display a ghost of the user's avatar at the user's current position;
- display said ghost moving towards the visual mark; and
- instantly teleport the user to the location of the visual mark once the ghost has reached said mark.
10. The processor according to claim 9, further comprising logic to apply a pre-selected speed value, or to calculate and apply a speed value to the movement of the avatar, said speed value being substantially equal to the speed of natural motion in the context of said virtual reality application.
11. The processor according to any of claims 9 or 10, further comprising logic to display the entire VR world together with the moving ghost.
12. The processor according to any of claims 9 to 11, further comprising logic to:
- identify a user's request to interrupt an ongoing teleportation operation;
- immediately teleport the user to the current location of the moving ghost; and
- erase the ghost and visual mark.
13. The processor according to any of claims 9 to 12, where the virtual reality application may be a video game, a VR movie, an AR application, or a professional application, such applications comprising:
- fighting game;
- war game;
- flying simulator;
- driving simulator;
- racing game;
- sports game;
- sports training;
- personal perception training;
- disease or phobia treatment;
- rehabilitation;
- architecture application;
- interior design application;
- scene interaction application;
- manufacturing worker training application;
- engineer training application;
- pilot training application;
- criminal science and forensics application; and
- accident analysis and visualization application.
14. The processor according to any of claims 9 to 13, where the logic to identify the location in the virtual world where the user wants to teleport comprises at least one of:
- the centre line of the VR display;
- at least one camera to capture and analyze the user's gaze;
- at least one camera to capture and analyze the user's hand gestures;
- at least one camera to capture and analyze the user's bodily movements;
- at least one microphone to capture and analyze the user's voice commands in the form of structured or natural speech;
- signals from a Ul device comprising one of touchpad, touch screen, joystick or other Ul method; and
- a combination thereof.
15. The processor according to claim 12, where the logic for identifying the user's request to interrupt an ongoing teleportation operation comprises at least one of:
- the centre line of the VR display;
- at least one camera to capture and analyze the user's gaze;
- at least one camera to capture and analyze the user's hand gestures;
- at least one camera to capture and analyze the user's bodily movements;
- at least one microphone to capture and analyze the user's voice commands in the form of structured or natural speech;
- signals from a Ul device comprising one of touchpad, touch screen, joystick or other Ul method; and
- a combination thereof.
16. The processor according to any of claims 9 to 15, where said processor is implemented in programmable logic.
17. A non-transitory computer program product that causes a processor to reduce simulation sickness in a virtual reality application, said simulation sickness being caused by contradicting and imperfectly synchronized signals received from the human visual, vestibular, and proprioceptive systems involved in balance, the non-transitory computer program product having instructions to:
- identify a location in the virtual world where the user wants to teleport;
- verify that said location is not occupied by a virtual object or avatar;
- identify a user's request for teleportation to said location;
- display a visual mark at said location;
- display a ghost of the user's avatar at the user's current position;
- display said ghost moving towards the visual mark;
- instantly teleport the user to the location of the visual mark once the ghost has reached said mark.
18. A computer program product comprising program instructions for causing a computing system to perform a method of reducing simulation sickness in a virtual reality application according to any of claims 1 to 8.
19. The computer program product according to claim 18, embodied on a storage medium.
20. The computer program product according to claim 18, carried on a carrier signal.
PCT/EP2018/066285 2017-06-20 2018-06-19 Reducing simulation sickness in virtual reality applications WO2018234318A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP17382380.8 2017-06-20
EP17382380 2017-06-20

Publications (1)

Publication Number Publication Date
WO2018234318A1 true WO2018234318A1 (en) 2018-12-27

Family

ID=59285127

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/066285 WO2018234318A1 (en) 2017-06-20 2018-06-19 Reducing simulation sickness in virtual reality applications

Country Status (1)

Country Link
WO (1) WO2018234318A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090089684A1 (en) * 2007-10-01 2009-04-02 Boss Gregory J Systems, methods, and media for temporal teleport in a virtual world environment
US20100309097A1 (en) * 2009-06-04 2010-12-09 Roni Raviv Head mounted 3d display
US20150091891A1 (en) * 2013-09-30 2015-04-02 Dumedia, Inc. System and method for non-holographic teleportation
WO2017096351A1 (en) * 2015-12-03 2017-06-08 Google Inc. Teleportation in an augmented and/or virtual reality environment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18732341

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18732341

Country of ref document: EP

Kind code of ref document: A1