GB2560156A - Virtual reality system and method - Google Patents
Virtual reality system and method
- Publication number
- GB2560156A (application GB1702584.2A / GB201702584A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- viewpoint
- virtual reality
- user
- reality navigation
- updated
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/211—Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/212—Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
- A63F13/5255—Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
- H04N13/117—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2016—Rotation, translation, scaling
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Signal Processing (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Graphics (AREA)
- Biophysics (AREA)
- Cardiology (AREA)
- General Health & Medical Sciences (AREA)
- Heart & Thoracic Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A virtual reality navigation method comprises the steps of: rendering a virtual environment at a first viewpoint, s210; obtaining a user command to rotate the viewpoint, s220; calculating an updated viewpoint that differs from the first viewpoint by a fixed angle greater than 2 degrees inclusive, s230; and rendering the virtual environment at the updated viewpoint, s240. Preferably the fixed angle is between 3 and 20 degrees inclusive, and more preferably between 5 and 10 degrees inclusive. Preferably the user command is received from a handheld controller, with a left-hand controller used to rotate left and a right-hand controller to rotate right. Preferably a different fixed angle is selected depending on whether the first viewpoint is currently stationary or in motion. Preferably the viewpoint is updated in response to detected movement of a user’s head. The intent of the application is to prevent the discomfort (nausea, sickness, etc.) that subtle motion can produce in a virtual environment, by mimicking saccadic eye movement.
Description
(71) Applicant(s):
Sony Interactive Entertainment Inc.
1-7-1 Konan, Minato-ku, Tokyo 108-8270, Japan

(72) Inventor(s): Elisheva Shapiro; Richard Lee

(74) Agent and/or Address for Service:
D Young & Co LLP
120 Holborn, London, EC1N 2DY, United Kingdom

(56) Documents Cited:
US 20170132845 A1; US 20160027213 A1; Cloudhead Games, 3 April 2014, "VR Comfort Mode Explained", available from: https://www.youtube.com/watch?v=GpOeMNSVtZA [Accessed 31/07/2017]

(58) Field of Search:
INT CL: A63F, G06F, G06T
Other: WPI, EPODOC, TXTE, Internet

(54) Title of the Invention: Virtual reality system and method
Abstract Title: Virtual Reality System and Method
[Drawing sheets: Figure 1 (schematic diagram of the entertainment system); Figure 2 (sheet 2/2, flow diagram of the virtual reality navigation method, showing steps s210, s220, s230 and s240)]
Intellectual Property Office
Application No. GB1702584.2
Date: 31 July 2017

The following terms are registered trade marks and should be read as such wherever they occur in this document:
HDMI (Page 2)
FreeBSD (Page 4)

Intellectual Property Office is an operating name of the Patent Office. www.gov.uk/ipo
VIRTUAL REALITY SYSTEM AND METHOD
The present invention relates to a virtual reality system and method.
Conventional videogames have striven for immersion and realism, to the extent that in so-called first person games (where the TV screen notionally represents the player’s in-game viewpoint), the in-game virtual camera is sometimes made to move up and down to mimic head motion whilst walking or running (so-called head-bobbing); similarly, when moving forwards or turning left or right in response to a keypress, some games introduce a short acceleration period to mimic inertia.
However, in virtual reality (VR) systems, rather than merely representing the user’s viewpoint, the output stereoscopic display actually is the user’s viewpoint, creating a much deeper sense of immersion. In this case, however, in-game effects such as head-bobbing and inertia conflict with the user’s own vestibular sense of motion, and this can cause nausea.
Accordingly, it is considered good practice to faithfully track a user’s head movements, and to avoid imposing additional and unintended movements on the user’s viewpoint.
However, there is still the problem that a user may wish to navigate a virtual world that may be unbounded, or at least larger than the corresponding real-world space occupied by the user, and to do so while travelling in any direction. Because of this, in many cases there will still be a disconnect between the physical movement of the user (which may be little or none if, for example, they are seated in a chair) and their virtual movement within the game.
There will still be a small proportion of users who experience discomfort in these circumstances, and it is desirable to mitigate this problem for them.
In a first aspect, a virtual reality navigation method is provided in accordance with claim 1.
In another aspect, a virtual reality navigation system is provided in accordance with claim 13.
Further respective aspects and features of the invention are defined in the appended claims.
Embodiments of the present invention will now be described by way of example with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of an entertainment system in accordance with embodiments of the present invention.
Figure 2 is a flow diagram of a virtual reality navigation method in accordance with embodiments of the present invention.
A virtual reality system and method are disclosed. In the following description, a number of specific details are presented in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to a person skilled in the art that these specific details need not be employed to practice the present invention. Conversely, specific details known to the person skilled in the art are omitted for the purposes of clarity where appropriate.
A suitable virtual reality system may be provided by a suitable PC, videogame console or other general purpose computer operating under suitable software instruction, such as the Sony ® PlayStation 4 ®, operating in conjunction with a suitable head mounted display (HMD), such as the PlayStation VR ® HMD.
Figure 1 schematically illustrates the overall system architecture of a Sony® PlayStation 4® entertainment device. A system unit 10 is provided, with various peripheral devices connectable to the system unit.
The system unit 10 comprises an accelerated processing unit (APU) 20 being a single chip that in turn comprises a central processing unit (CPU) 20A and a graphics processing unit (GPU) 20B. The APU 20 has access to a random access memory (RAM) unit 22.
The APU 20 communicates with a bus 40, optionally via an I/O bridge 24, which may be a discrete component or part of the APU 20.
Connected to the bus 40 are data storage components such as a hard disk drive 37, and a Blu-ray ® drive 36 operable to access data on compatible optical discs 36A. Additionally the RAM unit 22 may communicate with the bus 40.
Optionally also connected to the bus 40 is an auxiliary processor 38. The auxiliary processor 38 may be provided to run or support the operating system.
The system unit 10 communicates with peripheral devices as appropriate via an audio/visual input port 31, an Ethernet ® port 32, a Bluetooth ® wireless link 33, a Wi-Fi ® wireless link 34, or one or more universal serial bus (USB) ports 35. Audio and video may be output via an AV output 39, such as an HDMI port.
The peripheral devices may include a monoscopic or stereoscopic video camera 41 such as the PlayStation Eye ®; wand-style videogame controllers 42 such as the PlayStation Move ® and conventional handheld videogame controllers 43 such as the Dual Shock 4 ®; portable entertainment devices 44 such as the PlayStation Portable ® and PlayStation Vita ®; a keyboard 45 and/or a mouse 46; a media controller 47, for example in the form of a remote control; and a headset 48. Other peripheral devices may similarly be considered such as a printer, or a 3D printer (not shown).
The GPU 20B, optionally in conjunction with the CPU 20A, generates video images and audio for output via the AV output 39. Optionally the audio may be generated in conjunction with or instead by an audio processor (not shown).
The video and optionally the audio may be presented to a television 51. Where supported by the television, the video may be stereoscopic. The audio may be presented to a home cinema system 52 in one of a number of formats such as stereo, 5.1 surround sound or 7.1 surround sound. Video and audio may likewise be presented to a head mounted display unit 53 worn by a user 60.
Regarding peripherals, the system unit is typically provided with at least one hand-held controller 43 such as the DualShock 4 ®. This controller may be used to interact with user interfaces presented by the system unit that are associated with the operating system and/or a particular game or application being run by the system unit.
The user may also interact with the system unit using a video camera 41 such as the PlayStation Eye ®. This may provide monoscopic or stereoscopic video images to the system unit 10 via for example AV input 31. Where these images capture some or all of the user, the user may enact gestures, facial expressions or speech as appropriate to interact with the currently presented user interface.
Alternatively or in addition, a controller designed to assist with camera-based user interaction, such as the PlayStation Move ® 42, may be provided. This controller has a wand form factor and an illuminated region that facilitates detection of the controller within a captured video image. Illuminated regions may similarly be provided on other controllers 43, such as on the DualShock 4 ®. Both kinds of controller comprise motion sensors to detect transverse movement along three axes and rotational movement around three axes, and wireless communication means (such as Bluetooth®) to convey movement data to the system unit. Optionally such controllers can also receive control data from the system unit to enact functions such as a rumble effect, or to change the colour or brightness of the illuminated region, where these are supported by the controller.
Finally, the video and optionally audio may be conveyed to a head mounted display 53 such as the Sony PlayStation VR ® display. The head mounted display typically comprises two small display units respectively mounted in front of the user’s eyes, optionally in conjunction with suitable optics to enable the user to focus on the display units. Alternatively one or more display sources may be mounted to the side of the user’s head and operably coupled to a light guide to respectively present the or each displayed image to the user’s eyes. Alternatively, one or more display sources may be mounted above the user’s eyes and presented to the user via mirrors or half mirrors. In this latter case the display source may be a mobile phone or portable entertainment device 44, optionally displaying a split screen output with left and right portions of the screen displaying respective imagery for the left and right eyes of the user. The head mounted display may comprise integrated headphones, or provide connectivity to headphones. Similarly the head mounted display may comprise an integrated microphone or provide connectivity to a microphone.
In operation, the entertainment device defaults to an operating system such as a variant of FreeBSD 9.0. The operating system may run on the CPU 20A, the auxiliary processor 38, or a mixture of the two. The operating system provides the user with a graphical user interface such as the PlayStation Dynamic Menu. The menu allows the user to access operating system features and to select games and optionally other content.
Referring now also to Figure 2, in an embodiment of the present invention a virtual reality navigation method comprises:
In a first step s210, rendering a virtual environment at a first viewpoint. This viewpoint, as discussed previously, typically represents a first person view of the player’s in-game character, and has a physical position within the virtual world (i.e. an x, y and possibly z-axis co-ordinate) and also a viewpoint angle at that position within the virtual world (i.e. a rotation angle θ on the horizontal plane about the y-axis, and optionally a rotation angle φ on the vertical plane about the x-axis).
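Purely by way of illustration, such a viewpoint might be modelled as follows; this is a minimal Python sketch, all names are illustrative rather than taken from any actual implementation, and y is taken as the vertical axis as above:

```python
from dataclasses import dataclass

@dataclass
class Viewpoint:
    x: float            # position within the virtual world
    y: float            # height above the ground plane (y is the vertical axis)
    z: float
    yaw: float = 0.0    # rotation angle theta about the y-axis, in degrees
    pitch: float = 0.0  # optional rotation angle phi about a horizontal axis, in degrees
```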
During conventional gameplay, a user may move around the virtual environment using any appropriate control method, thereby changing their physical position within the virtual world. Similarly, they may change their viewpoint angle using any appropriate control method (e.g. mouse-look, key press or head movement).
Hence a second step s220 comprises obtaining a user command to rotate the viewpoint. In an embodiment of the present invention, the user command is generated by pressing an appropriate button on a handheld controller, although alternative actions may be considered, such as a gesture (e.g. a wrist flick left or right), or a voice input.
Also, purely for simplicity of explanation, it will be assumed that the viewpoint is only rotated in the horizontal plane about the y-axis, but it will be appreciated that the techniques discussed herein are also applicable to rotation in the vertical plane about the x-axis, or a rotation that is a product of rotation in both planes.
A third step s230 then comprises calculating an updated viewpoint that differs from the first viewpoint by a fixed angle greater than 2 degrees inclusive.
It will be appreciated that modern videogame consoles are capable of rendering a scene with a high degree of fidelity, and are capable of rotating a viewpoint by an angle of less than one degree (for example, by the equivalent of a one-pixel shift in viewpoint to the left or right at a notional focal point in-game, resulting in sub-pixel rotation for the majority of a scene) with every new video frame.
By contrast, in step s230, a deliberate decision is made to rotate the viewpoint by a comparatively large step (for example at least 2 degrees), thereby preventing a smooth and continuous rotation of the user’s point of view.
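A minimal sketch of step s230 under this scheme, reusing the illustrative Viewpoint type above, might look like the following; the 5 degree value is one of the example angles discussed below:

```python
SNAP_ANGLE = 5.0  # degrees; deliberately far larger than a per-frame rotation

def snap_rotate(view: Viewpoint, direction: int) -> Viewpoint:
    """Calculate the updated viewpoint of step s230 as a single discrete step.

    direction is +1 to rotate right or -1 to rotate left; no intermediate
    orientations are ever rendered between the old and new viewpoints."""
    new_yaw = (view.yaw + direction * SNAP_ANGLE) % 360.0
    return Viewpoint(view.x, view.y, view.z, new_yaw, view.pitch)
```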
The step angle may be chosen to be notionally similar to the angle of rotation of the human eye that occurs during saccadic repositioning when a person is reading; this is a subjectively instantaneous shift in viewing direction, often accompanied by a subconscious blinking action, during which the user is unaware of the physical rotation of the eye to the new viewing direction.
In a more specific embodiment of the present invention the value of the fixed angle may lie between 3 and 20 degrees. In a more specific embodiment of the present invention the value of the fixed angle may lie between 4 and 15 degrees. In a more specific embodiment of the present invention the value of the fixed angle may lie between 5 and 10 degrees. In a more specific embodiment of the present invention the value of the fixed angle is 5 degrees.
Advantageously, this step-wise, granular progression around an axis is not interpreted by the brain as true rotation; in real life, smooth rotation of a user’s head induces some centrifugal forces detectable by the vestibular system of the user, and the absence of these forces when experiencing smooth virtual rotation can cause some people to feel discomfort. However, the step-wise repositioning described above can be perceived by a user as a sequence of still views, and so significantly reduces this effect, making navigation of the virtual world more comfortable for those users susceptible to this issue.
Optionally only a single such step-wise rotation is implemented in response to a key-press or other input, regardless of duration. The user may then control the rate of stepping by controlling their rate of key pressing. Alternatively, successive step-wise rotations are automatically performed after a delay period, such as for example 0.5 seconds. The delay period provides time in which the user can perceive each new viewpoint as (momentarily) stationary. Typically a delay of 0.25 seconds or less is treated by the brain as continuous (if jerky) motion.
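One hypothetical way to realise this hold-to-repeat behaviour with the example 0.5 second delay, continuing the sketch above:

```python
REPEAT_DELAY = 0.5  # seconds; long enough for each view to read as stationary

class SnapRepeater:
    """Issues one step-wise rotation per press, repeating after a delay while held."""

    def __init__(self) -> None:
        self.armed = True      # ready to fire on the next press
        self.next_time = 0.0

    def update(self, held: bool, now: float,
               view: Viewpoint, direction: int) -> Viewpoint:
        if not held:
            self.armed = True  # re-arm so the next press fires immediately
            return view
        if self.armed or now >= self.next_time:
            self.armed = False
            self.next_time = now + REPEAT_DELAY
            return snap_rotate(view, direction)
        return view
```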
Then in a fourth step s240, the virtual environment is rendered at the updated viewpoint.
This results in a step, jump or snap to the new viewing direction without any intervening rotation, even if the videogame console or indeed the game would be capable of doing so.
Indeed, the above technique may be provided as an optional feature when navigating a virtual environment in an application, so that either smooth rotation or the above stepped rotation may be selected. This allows the minority of users who find smooth rotation uncomfortable to select an alternative in which rotation is not shown in the conventional sense, but instead a discontinuous sequence of new viewpoints is presented to them, giving a subjective impression that each is a different and stationary view.
When interacting in a virtual environment, it will be appreciated that it may be impractical for a user to turn their point of view by turning their head; firstly, this may cause sensors or distinguishing marks on their HMD or controller(s) to move out of sight of a camera or other sensors, and secondly it may cause any power lead or video lead of the HMD to become tangled around the user.
Consequently it is more common to use a controller to rotate and to move within the environment. Where a two-handed controller is used (such as the Sony ® DualShock 4 ®), then conventional buttons or joystick activations for rotating left and right may be used. Where one single-handed controller is used (such as the Sony ® PlayStation Move ®), then similarly conventional buttons for rotating left and right may be used. Meanwhile, where such a single-handed controller is used in each hand, then optionally a button on the left-hand controller may be used to rotate left, and a button on the right-hand controller may be used to rotate right. Alternatively, each controller may have left and right buttons, and the user may press a button on either device. Where conflicting buttons are pressed on the two controllers, then either the more recent button press can override the earlier press, or the earlier press can remain dominant until released.
Clearly, and as noted previously herein, it is typically also desirable to move the user around within the environment, and hence reposition their viewpoint. Such conventional walking, running, driving and so on may be termed translational movement, or simply travel.
In an embodiment of the present invention, where a two-handed controller or one or two single-handed controllers are used, then the or each controller is represented within the virtual world (for example as a hand), providing visual feedback to the user as to where the controller(s) are pointing with respect to their current viewpoint in the virtual world. The user may then move by pressing a key on one or either controller, to move in the direction pointed to by the representation of the controller in the virtual environment. If motion is constrained to a 2D plane (i.e. potentially including traversing a non-flat terrain, but excluding the ability to fly) then direction can be determined by considering the 3D vector of a controller as projected onto a 2D horizontal plane.
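The projection described in the last sentence might be sketched as follows (illustrative only; y is again taken as the vertical axis):

```python
import math

def travel_direction(pointing: tuple[float, float, float]) -> tuple[float, float]:
    """Project a controller's 3D pointing vector onto the 2D horizontal plane
    and normalise it, yielding the direction of travel."""
    vx, _, vz = pointing                 # discard the vertical component
    length = math.hypot(vx, vz)
    if length < 1e-6:
        return (0.0, 0.0)                # pointing straight up or down
    return (vx / length, vz / length)
```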
The user thus has clear visibility of and control over their direction of travel. This may therefore allow movement forwards, backwards (e.g. by pointing over one’s shoulder) and strafing left or right. Rotation is controlled separately, as discussed previously herein.
In one embodiment, the user moves at a smooth, fixed speed, transitioning instantly from stationary to moving when the motion button is pressed, and back to stationary again when the button is released. Hence there is no simulated acceleration.
Optionally an arrow or other additional direction indicator is provided during travel. This may be associated with the representation of a given controller (where the user has two) to avoid confusion for the user as to which controller is currently determining their direction of travel.
With two single-handed controllers, optionally travel may also be triggered by pressing the motion instigating button on both controllers; for example, this may trigger a faster motion to traverse the virtual environment more quickly. In this case the direction of movement may track either the first controller whose button was pressed (which as noted above may be visually indicated as the dominant controller for this function), or it may track the second controller on the basis that the user has elected to press a button on the second controller to modify movement behaviour and hence considers that controller the currently dominant one, or the direction may be an average of the two pointed directions of the controllers.
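The "average of the two pointed directions" option could, for instance, be realised as below (a sketch under the same assumptions as the projection above):

```python
def combined_direction(first: tuple[float, float],
                       second: tuple[float, float]) -> tuple[float, float]:
    """Average two horizontal unit directions for the two-button fast-travel case."""
    sx, sz = first[0] + second[0], first[1] + second[1]
    length = math.hypot(sx, sz)
    if length < 1e-6:
        return first                     # near-opposite directions cancel out
    return (sx / length, sz / length)
```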
However, again, some users may experience discomfort due to the sensory discrepancy between their eyes and vestibular system, which does not experience the normal forces generated when the user moves in real life.
Consequently, in an embodiment of the present invention the above smooth motion may be replaced by step-wise motion, i.e. by teleporting the user’s viewpoint in the indicated direction by a predetermined and fixed amount.
In an embodiment of the present invention the fixed amount may lie between 0.2 and 3 virtual metres within the virtual environment. In a more specific embodiment of the present invention the value of the fixed amount may lie between 0.3 and 2 virtual metres. In a more specific embodiment of the present invention the value of the fixed amount may lie between 0.4 and 1.5 virtual metres. In a more specific embodiment of the present invention the value of the fixed amount may lie between 0.5 and 1.0 virtual metres. In a more specific embodiment of the present invention the value of the fixed amount is 0.5 virtual metres. A virtual metre is the length that a metre appears to occupy within the virtual world.
Optionally, the fixed amount may be selected by asking the user to enter their height, and from this obtain a typical stride length within the virtual environment, for example from a look-up table. This may provide the most comfortable step-wise motion for a given user.
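The description does not give the contents of such a look-up table; the values below are purely hypothetical, but show the shape of the idea:

```python
# Hypothetical mapping from user height (metres) to stride length (virtual metres).
STRIDE_TABLE = [(1.5, 0.60), (1.6, 0.65), (1.7, 0.70), (1.8, 0.75), (1.9, 0.80)]

def stride_for_height(height_m: float) -> float:
    """Return the stride length tabulated for the nearest height."""
    return min(STRIDE_TABLE, key=lambda entry: abs(entry[0] - height_m))[1]
```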
In a similar manner to the step-wise rotation, such step-wise travel may only be implemented once per key press (or other equivalent command), or if the key is held down may repeat after a predetermined delay period, again for example 0.5 seconds.
Where it is possible to select different navigation speeds (for example using the movement button on both controllers to indicate running, as described previously, or due to collection of a power-up or use of a vehicle in-game), then this may be replicated by use of a larger repositioning step; for example between ×1.5 and ×10, depending on designer choice.
It will be appreciated that when the user presses a motion button on a single-handed controller, they may travel by the fixed amount in the indicated direction, but retain the same rotational position as before. Hence unless moving directly ahead, there is a sense of lateral drift, crab-wise motion or strafing.
Accordingly, in an embodiment of the present invention, optionally the step-wise repositioning of the user is accompanied by a rotation of the user’s view-point responsive to the direction indicated by the user via the controller. Hence for example the rotation may serve to re-centre the direction indicated by the controller used to activate the motion in the user’s viewpoint. Optionally in this case, this is subject to an angular cut-off such that if the user points more than a predetermined angle away from straight ahead (for example, more than 30, 45, 60, or 75 degrees, depending on designer preference), then the rotation does not occur. This allows deliberate strafing actions to still be signified by a large left or right pointing action with the or each controller.
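A sketch of this re-centring rule, using 45 degrees as the illustrative cut-off from the examples given:

```python
RECENTRE_CUTOFF = 45.0  # degrees; beyond this the move is treated as a strafe

def recentred_yaw(view_yaw: float, pointed_yaw: float) -> float:
    """Rotate the viewpoint towards the pointed direction after a step,
    unless the offset exceeds the cut-off (a deliberate strafe)."""
    offset = (pointed_yaw - view_yaw + 180.0) % 360.0 - 180.0  # signed, -180..180
    if abs(offset) > RECENTRE_CUTOFF:
        return view_yaw                  # strafing: keep the current facing
    return pointed_yaw                   # re-centre on the pointed direction
```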
As noted above, the user can control their rotation separately from translational motion through the environment. Hence, for example, the user may choose smooth translational motion but step-wise rotation, or vice versa, or both, or neither, depending on preference.
Optionally, when step-wise rotation is in use, a different fixed angle may be used depending on the apparent in-game speed of travel. Hence when the user is stationary, a first fixed value is used. Then if the user is walking, running, or driving, successively smaller fixed values are used in response to the command to rotate. This may feel subjectively more consistent with a user’s experience that it is more difficult to change direction the faster one is travelling, and hence turning at the same angle when running that one does when stationary would feel more extreme. Notionally an amount of rotation that might induce similar magnitude forces in the user’s vestibular system for a given speed may be approximated by this scheme.
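The description only requires that faster travel use successively smaller fixed angles; the particular values below are invented for illustration:

```python
# Hypothetical snap angles (degrees), decreasing with apparent speed of travel.
SNAP_ANGLE_BY_STATE = {
    "stationary": 10.0,
    "walking": 7.5,
    "running": 5.0,
    "driving": 2.5,
}

def snap_angle_for(state: str) -> float:
    return SNAP_ANGLE_BY_STATE.get(state, 5.0)  # fall back if the state is unknown
```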
A possible outcome of the above technique for travel is that if the user alternates between pressing the motion button on the right-hand controller and the motion button on the left-hand controller, they keep stepping forward, but in slightly different directions, as the player’s hands are very unlikely to be pointing in exactly the same direction. This gives a feeling of stepping slightly right and then slightly left, and hence a feeling of walking.
In this case the user may control their speed according to how rapidly they press the buttons, so that the more rapidly the buttons are pressed, the faster the player moves; in some games therefore, the comparative effort of walking and running may thus be replicated by button presses, if desired, rather than having a ‘running’ button or button combination.
The stepwise nature of the travel and rotation schemes can make the user’s actions appear less natural to onlooking players, and possibly therefore less social. Optionally this can be overcome by smoothing the animation of the user as seen by other players.
Optionally, to provide precision movement (for example to sidle up to someone, or to interact with a specific object), the user may press an override button or button combination to revert to smooth motion temporarily, or to reduce the fixed distance amount and/or rotation angle of step-wise motion by a predetermined factor (for example by a factor of 10), so that for example travel changes from increments of 50 cm to 5 cm, and rotation changes from increments of 5 degrees to 0.5 degrees. This may be a toggle between gross and fine/smooth motion, or a change that lasts for a short period of time (for example associated with a visible timer or indicator so the user knows when they are about to revert to larger step-wise motion).
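As a sketch, the gross/fine scaling might simply divide both increments by the example factor of 10:

```python
FINE_FACTOR = 10.0  # the example reduction factor from the description

def step_increments(distance_m: float, angle_deg: float,
                    fine_mode: bool) -> tuple[float, float]:
    """Scale both step-wise increments down in fine mode,
    e.g. 0.5 m -> 0.05 m of travel and 5 degrees -> 0.5 degrees of rotation."""
    if fine_mode:
        return distance_m / FINE_FACTOR, angle_deg / FINE_FACTOR
    return distance_m, angle_deg
```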
It will be appreciated that the above techniques have been described in relation to commands received from a controller, because it may be impractical to navigate a virtual environment with 1:1 motions in the real world. However detected movements of the user’s own head, whether intentional or unintentional, typically should be replicated in updated renders of the field of view. Hence movements of the user’s head away from a notional centre line are replicated in the viewpoint presented to the user, but when the user also commands the system to rotate left or right, the notional centreline is rotated by the fixed angle, causing a rotational shift but preserving the user’s relative viewpoint with respect to the updated notional centreline and continuing to update that viewpoint as the user’s head continues to move.
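The separation of the snapped notional centreline from tracked head movement can be sketched as follows; both functions are illustrative and reuse the SNAP_ANGLE constant from the earlier sketch:

```python
def rendered_yaw(centreline_yaw: float, tracked_head_yaw: float) -> float:
    """The rendered view is the notional centreline plus the user's tracked
    head offset, so real head movement is always replicated 1:1."""
    return (centreline_yaw + tracked_head_yaw) % 360.0

def snap_centreline(centreline_yaw: float, direction: int) -> float:
    """A rotate command shifts only the centreline by the fixed angle,
    preserving the user's relative viewpoint with respect to it."""
    return (centreline_yaw + direction * SNAP_ANGLE) % 360.0
```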
As noted previously herein, it will be appreciated that the above techniques and methods may be carried out on conventional hardware suitably adapted as applicable by software instruction, or by the inclusion or substitution of dedicated hardware.
Thus the required adaptation to existing parts of a conventional equivalent device may be implemented in the form of a computer program product comprising processor implementable instructions stored on a non-transitory machine-readable medium such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable to use in adapting the conventional equivalent device. Separately, such a computer program may be transmitted via data signals on a network such as an Ethernet, a wireless network, the Internet, or any combination of these or other networks.
Hence, referring again to Figure 1, in an embodiment of the present invention a virtual reality navigation system comprises an image processor (such as CPU 20A and/or GPU 20B) adapted (for example by suitable software instruction) to render a virtual environment at a first viewpoint; a user input (such as Bluetooth ®, WiFi ® or USB ports 33, 34, 35) adapted to obtain a user command to rotate the viewpoint (for example by pressing a button on a controller 42, 43); a viewpoint processor (such as CPU 20A and/or GPU 20B) adapted (for example by suitable software instruction) to calculate an updated viewpoint that differs from the first viewpoint by a fixed angle selected with a range between 2 and 20 degrees inclusive; and wherein the image processor is further adapted to render the virtual environment at the updated viewpoint.
It will be appreciated that the virtual reality navigation system can be adapted to implement the various methods and techniques described previously herein. Hence for example, the range may be one selected from the list consisting of between 3 and 15 degrees inclusive; between 4 and 10 degrees inclusive; and between 5 and 7 degrees inclusive, or may indeed simply be 5 degrees.
Similarly, in the virtual reality navigation system the viewpoint processor is adapted to reposition the viewpoint by a fixed amount ahead of the first viewpoint, and to rotate an amount to the left or right of the first viewpoint responsive to the fixed angle by which the updated viewpoint is changed.
As noted previously, the apparatus and methods described herein replace a subjective impression of smooth or jerky motion with a subjective impression of a series of stationary repositionings within the virtual environment, through which the user is still able to look around and/or traverse the environment. By removing the impression of motion for the user, it is possible to reduce instances of motion sickness.
Claims (15)
1. A virtual reality navigation method, comprising the steps of: rendering a virtual environment at a first viewpoint; obtaining a user command to rotate the viewpoint;
calculating an updated viewpoint that differs from the first viewpoint by a fixed angle greater than 2 degrees inclusive; and rendering the virtual environment at the updated viewpoint.
2. The virtual reality navigation method of claim 1, in which the fixed angle is in a range between 3 and 20 degrees inclusive.
3. The virtual reality navigation method of claim 2, in which the fixed angle is in a range between 4 and 15 degrees inclusive.
4. The virtual reality navigation method of claim 3, in which the fixed angle is in a range between 5 and 10 degrees inclusive.
5. The virtual reality navigation method of any preceding claim, in which the user command is received from a handheld controller.
6. The virtual reality navigation method of claim 5, in which a user command to rotate left is received from a handheld controller held in the left hand, and a user command to rotate right is received from a handheld controller held in the right hand.
7. The virtual reality navigation method of any preceding claim, in which the viewpoint is repositioned within the virtual environment by a fixed amount in response to a user command to move the viewpoint.
8. The virtual reality navigation method of claim 7, in which the viewpoint is repositioned by a fixed amount from the first viewpoint in a direction indicated by the user, and rotated an amount to the left or right of the first viewpoint responsive to a direction indicated by the user.
9. The virtual reality navigation method of any preceding claim, in which the viewpoint is updated once for each separate user command obtained.
10. The virtual reality navigation method of any preceding claim, in which a different fixed angle is selected depending on whether the first viewpoint is currently stationary or in motion.
11. The virtual reality navigation method of any preceding claim, in which the viewpoint is updated in correspondence with detected movements of the user’s head.
12. A computer readable medium having computer executable instructions adapted to cause a computer system to perform the method of any preceding claim.
13. A virtual reality navigation system, comprising:
an image processor adapted to render a virtual environment at a first viewpoint;
a user input adapted to obtain a user command to rotate the viewpoint;
a viewpoint processor adapted to calculate an updated viewpoint that differs from the first viewpoint by a fixed angle greater than 2 degrees inclusive; and
wherein the image processor is further adapted to render the virtual environment at the updated viewpoint.
14. A virtual reality navigation system according to claim 13, in which the fixed angle is in a range selected from the list consisting of:
i. between 3 and 20 degrees inclusive;
ii. between 4 and 15 degrees inclusive; and
iii. between 5 and 10 degrees inclusive.
15. A virtual reality navigation system according to claim 13 or claim 14, in which the viewpoint processor is adapted to reposition the viewpoint by a fixed amount ahead of the first viewpoint, and to rotate an amount to the left or right of the first viewpoint responsive to the fixed angle by which the updated viewpoint is changed.
Intellectual Property Office
Application No: GB1702584.2 Examiner: Mr Rhys Miles
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1702584.2A GB2560156A (en) | 2017-02-17 | 2017-02-17 | Virtual reality system and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1702584.2A GB2560156A (en) | 2017-02-17 | 2017-02-17 | Virtual reality system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201702584D0 GB201702584D0 (en) | 2017-04-05 |
GB2560156A true GB2560156A (en) | 2018-09-05 |
Family
ID=58486986
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1702584.2A Withdrawn GB2560156A (en) | 2017-02-17 | 2017-02-17 | Virtual reality system and method |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2560156A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020233536A1 (en) * | 2019-05-17 | 2020-11-26 | 华为技术有限公司 | Vr video quality evaluation method and device |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10460527B2 (en) * | 2017-06-30 | 2019-10-29 | Tobii Ab | Systems and methods for displaying images in a virtual world environment |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160027213A1 (en) * | 2014-07-25 | 2016-01-28 | Aaron Burns | Ground plane adjustment in a virtual reality environment |
US20170132845A1 (en) * | 2015-11-10 | 2017-05-11 | Dirty Sky Games, LLC | System and Method for Reducing Virtual Reality Simulation Sickness |
- 2017-02-17: GB GB1702584.2A patent/GB2560156A/en, not_active (Withdrawn)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160027213A1 (en) * | 2014-07-25 | 2016-01-28 | Aaron Burns | Ground plane adjustment in a virtual reality environment |
US20170132845A1 (en) * | 2015-11-10 | 2017-05-11 | Dirty Sky Games, LLC | System and Method for Reducing Virtual Reality Simulation Sickness |
Non-Patent Citations (1)
Title |
---|
Cloudhead Games, 3 April 2014, "VR Comfort Mode Explained" Available from: https://www.youtube.com/watch?v=Gp0eMNSVtZA [Accessed 31/07/2017] * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020233536A1 (en) * | 2019-05-17 | 2020-11-26 | 华为技术有限公司 | Vr video quality evaluation method and device |
Also Published As
Publication number | Publication date |
---|---|
GB201702584D0 (en) | 2017-04-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10463962B2 (en) | Spectator view perspectives in VR environments | |
US11962954B2 (en) | System and method for presenting virtual reality content to a user | |
US11662813B2 (en) | Spectating virtual (VR) environments associated with VR user interactivity | |
US11199705B2 (en) | Image rendering responsive to user actions in head mounted display | |
US9905052B2 (en) | System and method for controlling immersiveness of head-worn displays | |
CN106873767B (en) | Operation control method and device for virtual reality application | |
US10712900B2 (en) | VR comfort zones used to inform an In-VR GUI editor | |
WO2020190399A1 (en) | Methods and systems for spectating characters in follow-mode for virtual reality views | |
JP2022184958A (en) | animation production system | |
JP2023116432A (en) | animation production system | |
GB2560156A (en) | Virtual reality system and method | |
JP2022020686A (en) | Information processing method, program, and computer | |
CN115315684A (en) | Method for changing viewpoint in virtual space | |
JP2022153476A (en) | Animation creation system | |
JP2017069924A (en) | Image display device | |
JP7300569B2 (en) | Information processing device, information processing method and program | |
WO2023162668A1 (en) | Information processing device and floor height adjustment method | |
JP7390542B2 (en) | Animation production system | |
GB2529191A (en) | Display apparatus and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |