GB2524269A - Virtual reality - Google Patents

Virtual reality

Info

Publication number
GB2524269A
Authority
GB
United Kingdom
Prior art keywords
user
boundary
virtual reality
virtual
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1404850.8A
Other versions
GB201404850D0 (en)
GB2524269B (en)
Inventor
Jeremy David Ashforth
Simon Mark Benson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
Original Assignee
Sony Computer Entertainment Europe Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe Ltd filed Critical Sony Computer Entertainment Europe Ltd
Publication of GB201404850D0
Publication of GB2524269A
Application granted
Publication of GB2524269B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/017 Head mounted
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features
    • G02B 2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features
    • G02B 2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features
    • G02B 2027/0141 Head-up displays characterised by optical features characterised by the informative content of the display

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided is a virtual reality system in which images representing a virtual environment are displayed to a user by a head-mountable display (HMD). The system comprises a boundary detector configured to detect, with respect to the physical environment around the user, a boundary of an activity area. A virtual reality generator is configured to vary the user's interaction with the virtual reality environment according to the user's physical position with respect to the boundary of the activity area. The virtual reality generator may be configured to warn the user if the user approaches the boundary of the activity area. The user may be warned if the user is less than a threshold distance from the boundary and the nature of the warning may be changed in response to the user moving closer to the boundary than the threshold distance. The boundary detector may be configured to detect the location of one or more markers 410 in the physical environment, the markers indicating respective reference positions with respect to the boundary.

Description

VIRTUAL REALITY
This invention relates to virtual reality systems and methods.
In a head-mountable display (HMD) for use in a virtual reality system, an image or video display device is provided which may be worn on the head or as part of a helmet. Either one eye or both eyes are provided with small electronic display devices.
Some HMDs allow a displayed image to be superimposed on a real-world view. This type of HMD can be referred to as an optical see-through HMD and generally requires the display devices to be positioned somewhere other than directly in front of the user's eyes.
Some way of deflecting the displayed image so that the user may see it is then required. This might be through the use of a partially reflective mirror placed in front of the user's eyes so as to allow the user to see through the mirror but also to see a reflection of the output of the display devices. In another arrangement, disclosed in EP-A-1 731 943 and US-A-2010/0157433, a waveguide arrangement employing total internal reflection is used to convey a displayed image from a display device disposed to the side of the user's head so that the user may see the displayed image but still see a view of the real world through the waveguide. Once again, in either of these types of arrangement, a virtual image of the display is created (using known techniques) so that the user sees the virtual image at an appropriate size and distance to allow relaxed viewing. For example, even though the physical display device may be tiny (for example, 10 mm x 10 mm) and may be just a few millimetres from the user's eye, the virtual image may be arranged so as to be perceived by the user at a distance of (for example) 20 m from the user, having a perceived size of 5 m x 5 m.
Other HMDs, however, allow the user only to see the displayed images, which is to say that they obscure the real world environment surrounding the user. This type of HMD can position the actual display devices in front of the user's eyes, in association with appropriate lenses or other optical components which place a virtual displayed image at a suitable distance for the user to focus in a relaxed manner, for example at a similar virtual distance and perceived size as the optical see-through HMD described above. This type of device might be used for viewing movies or similar recorded content, or for viewing so-called virtual reality content representing a virtual space surrounding the user. It is of course however possible to display a real-world view on this type of HMD, for example by using a forward-facing camera to generate images for display on the display devices.
Although the original development of HMDs was perhaps driven by the military and professional applications of these devices, HMDs are becoming more popular for use by casual users in, for example, computer game or domestic computing applications.
Accordingly, a significant consideration for this type of HMD is user comfort, safety and enjoyment. The aim is that the user should be keen to use the HMD in a virtual reality arrangement and should feel safe and comfortable while doing so.
Various aspects and features of the present invention are defined in the appended claims and within the text of the accompanying description and include at least a virtual reality system and method.
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which: Figure 1 schematically illustrates an HMD worn by a user, the HMD being connected to a Sony® PlayStation 3® games console; Figure 2 schematically illustrates a playing area mapped out with respect to a user of an HMD; Figure 3 schematically illustrates a playing area marker; Figure 4a is a schematic flowchart illustrating a playing area definition process; Figure 4b is a schematic flowchart illustrating a player warning process; Figure 5 schematically illustrates a playing area with respect to the environment surrounding a user; Figure 6 schematically illustrates a user interface menu and a game object; Figure 7 is a schematic flowchart illustrating a user interface rendering process; Figure 8 is a schematic flowchart illustrating an object rendering process; Figure 9 schematically illustrates a user interface display; Figure 10 is a schematic flowchart illustrating a rendering process; Figure 11 schematically illustrates structural elements of a games machine; Figures 12A to 12C schematically illustrate a circular playing area; Figures 13A to 13C schematically illustrate a square playing area; Figures 14A to 14C schematically illustrate an ellipsoid playing area; and Figures 15A to 15C schematically illustrate an arbitrarily shaped playing area.
Referring now to Figure 1, a user 10 is wearing an HMD 20 on the user's head 30. The HMD 20 forms part of a system comprising the HMD and a games console to provide images for display by the HMD.
The HMD of Figure 1 completely (or at least substantially completely) obscures the user's view of the surrounding environment. All that the user can see is the pair of images displayed within the HMD.
The HMD has associated headphone audio transducers or earpieces 60 which fit into the user's left and right ears. The earpieces 60 replay an audio signal provided from an external source, which may be the same as the video signal source which provides the video signal for display to the user's eyes.
The combination of the fact that the user can see only what is displayed by the HMD and, subject to the limitations of the noise blocking or active cancellation properties of the earpieces and associated electronics, can hear only what is provided via the earpieces, means that this HMD may be considered as a so-called "full immersion" HMD. Note however that in some embodiments the HMD is not a full immersion HMD, and may provide at least some facility for the user to see and/or hear the user's surroundings. This could be by providing some degree of transparency or partial transparency in the display arrangements, and/or by projecting a view of the outside (captured using a camera, for example a camera mounted on the HMD) via the HMD's displays, and/or by allowing the transmission of ambient sound past the earpieces and/or by providing a microphone to generate an input sound signal (for transmission to the earpieces) dependent upon the ambient sound.
A front-facing camera 122 may capture images to the front of the HMD, in use.
The HMD is connected to a Sony® PlayStation 3® games console 300 as an example of a base device. Another example is a Sony® PlayStation 4® games console. The games console 300 is connected (optionally) to a main display screen (not shown) and to one or more cameras 310 (such as a stereoscopic camera, although one or more monoscopic cameras may be used). A cable 82, acting (in this example) as both power supply and signal cables, links the HMD 20 to the games console 300 and is, for example, plugged into a USB socket 320 on the console 300.
The user is also shown holding a hand-held controller 330 which may be, for example, a Sony® Move® controller which communicates wirelessly with the games console 300 to control (or to contribute to the control of) game operations relating to a currently executed game program.
The video displays in the HMD 20 are arranged to display images generated by the games console 300, and the earpieces 60 in the HMD 20 are arranged to reproduce audio signals generated by the games console 300. Note that if a USB type cable is used, these signals will be in digital form when they reach the HMD 20, such that the HMD 20 comprises a digital to analogue converter (DAC) to convert at least the audio signals back into an analogue form for reproduction.
Images from the camera 122 mounted on the HMD 20 are passed back to the games console 300 via the cable 82. Similarly, if motion or other sensors are provided at the HMD 20, signals from those sensors may be at least partially processed at the HMD 20 and/or may be at least partially processed at the games console 300. The use and processing of such signals will be described further below.
The USB connection from the games console 300 also (optionally) provides power to the HMD 20, according to the USB standard.
Optionally, at a position along the cable 82 there is a so-called "break-out box" acting as a base or intermediate device 350, to which the HMD 20 is connected by the cable 82 and which is connected to the base device by the cable 82. The break-out box has various functions in this regard. One function is to provide a location, near to the user, for some user controls relating to the operation of the HMD, such as (for example) one or more of a power control, a brightness control, an input source selector, a volume control and the like. Another function is to provide a local power supply for the HMD (if one is needed according to the embodiment being discussed). Another function is to provide a local cable anchoring point. In this last function, it is not envisaged that the break-out box 350 is fixed to the ground or to a piece of furniture; rather, instead of having a very long cable trailing from the games console 300, the break-out box provides a locally weighted point so that the cable 82 linking the HMD 20 to the break-out box will tend to move around the position of the break-out box. This can improve user safety and comfort by avoiding the use of very long trailing cables.
It will be appreciated that there is no technical requirement to use a cabled link (such as the cable 82) between the HMD and the base unit 300 or the break-out box 350. A wireless link could be used instead. Note however that the use of a wireless link would require a potentially heavy power supply to be carried by the user, for example as part of the HMD itself.
Embodiments of the present disclosure relate to techniques for defining or mapping out a playing area with respect to one or both of the user of an HMD and the user's surroundings.
Figure 2 schematically illustrates such a playing area mapped out with respect to a user of an HMD.
In Figure 2, the user's position is represented by a circle 400. Smaller circles 410 indicate points on a boundary representing the extremities of the user's desired game playing area.
There are various reasons why a user might wish to define a game playing area in this way. In the case of a user of an HMD, particularly a full immersion HMD, it may be desirable to define an area in which the game action will take place in terms of a physical area surrounding the user's position in which it is safe for the user, if necessary or appropriate to the game action, to move around. It would be potentially dangerous for a user wearing a full immersion HMD to walk into items of furniture, or to approach other hazards. Accordingly, by defining a game playing area in advance, it can be made less likely that the user will stray outside of such a deemed safe area.
The game playing area in Figure 2 is defined by points on its boundary. Only four points are shown, for simplicity of the drawing, but other points could be included so as to surround the user's position. Once a playing area has been defined, a game running on the games machine 300 is constrained so that one or both of the following takes place: (a) game action relevant to that user (that is to say, game action to which the user can or should directly contribute) takes place only within a virtual area which, when mapped to the real world, extends no further than the defined playing area; and (b) the user is warned if the user is about to leave the defined playing area, for example by an audible or visible warning, or by mixing the virtual reality environment of the game with video from the forward-facing camera 122 (so that the user is gently made aware of the user's physical surroundings) as the user approaches (for example within a threshold distance of) the edge of the defined playing area. This provides an example of a virtual reality generator configured to warn the user if the user approaches the boundary of the activity (playing) area. To provide the warning, the virtual reality generator may be configured to mix the images representing the virtual environment with images captured of the physical environment. The provision of the warning relates to an example of the process of Figure 4B described below. Further examples of warning techniques will be described below.
Although the user has defined spot positions or points along the boundary in Figure 2, a contiguous boundary can be derived by the games machine 300 by simply joining the positions of each pair of most closely adjacent points 410.
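By way of illustration only (the patent gives no code), the geometry implied here is straightforward: join the ordered marker points into a closed polyline and measure the user's distance to the nearest segment. A minimal Python sketch, assuming 2-D floor coordinates and marker points already sorted into boundary order:

    import math

    def distance_to_boundary(user_xy, boundary_points):
        """Distance from the user's position to the closed boundary formed
        by joining each pair of adjacent marker points 410."""
        ux, uy = user_xy
        best = float("inf")
        n = len(boundary_points)
        for i in range(n):
            ax, ay = boundary_points[i]
            bx, by = boundary_points[(i + 1) % n]  # wrap around to close the loop
            abx, aby = bx - ax, by - ay
            denom = abx * abx + aby * aby or 1e-12  # guard against coincident points
            # Project the user's position onto segment AB, clamped to the segment.
            t = max(0.0, min(1.0, ((ux - ax) * abx + (uy - ay) * aby) / denom))
            px, py = ax + t * abx, ay + t * aby  # nearest point on this segment
            best = min(best, math.hypot(ux - px, uy - py))
        return best

A warning step such as that of Figure 4b (below) could then compare this distance against a threshold.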
Various techniques may be used to define each of the points 410.
In an example system, before playing a game the user moves a marker such as the Move controller 330 to each of the points 410 along the boundary and, with the Move controller 330 at the appropriate place, signals to the games machine 300 to register that position as a boundary point 410. The signalling to the games machine 300 can be carried out by, for example, pressing a control button on the Move controller which transmits a wireless signal to the games machine 300. The games machine 300 ascertains the current position of the Move controller at the time that the button is pressed using the standard interface between a games machine and a Move controller, namely by means of the camera 310 detecting the location of an illuminated portion of the Move controller. This arrangement has the advantage that once the user has set up the game playing area, there is no physical indication of its presence to be cleared away after playing the game.
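The registration step itself might look like the following sketch; on_move_button_pressed and tracker.current_position() are hypothetical names standing in for the camera-based Move tracking interface described above, not a real API:

    boundary_points = []  # registered (x, y) boundary positions

    def on_move_button_pressed(tracker):
        """On a button press, record the controller's current
        camera-tracked position as one boundary point 410."""
        boundary_points.append(tracker.current_position())  # assumed accessor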
In another example system, the user can distribute markers around the environment surrounding the user's game playing position. An example marker is shown schematically in Figure 3, being a coloured cube shape, for example having a side length of 5 cm. The particular colour is not important except for the fact that the cubes are likely to be more easily detected if their colour is different to dominant colours in the user's surroundings. As a further measure to assist in the detection of the markers, a machine-recognisable pattern similar to an augmented reality marker may be placed on each face of the cube of Figure 3. This arrangement allows the detection of the playing area to be carried out by one or both of the camera 310 and the forward-facing camera 122 mounted on the HMD. By detecting the playing area, at least in part, by using the camera mounted on the HMD, a potentially more accurate determination of whether the user is about to leave the physical playing area can be obtained.
In another example system, the camera 310 and/or the camera 122 take a series of one or more still photographs of the environment surrounding the user. These are displayed to the user by the games machine 300. The user then indicates, for example by a drawing operation (for example using a computer mouse or a touch panel), the boundary of the user's playing area.
In another example system, a coloured or illuminated cable, rope or the like could be laid out by the user to indicate the extent of the desired playing area. For example, the cable 82 could be used in this way, for example by providing the cable in a distinctive colour and/or by providing illumination along the length of the cable, for example by providing LED lights every (say) 0.25 m.
Combinations of these approaches may of course be used. The examples are indicative of arrangements in which the boundary detector is configured to detect the location of one or more boundary markers in the physical environment, and/or the boundary detector is configured to detect user activation of a user control at one or more positions in the physical environment so as to define respective positions on the boundary.
Figure 4a is a schematic flowchart illustrating a playing area definition process. As discussed, this process can be carried out by the games machine 300 on the basis of signals from one or both of the camera 310 and the forward-facing camera 122 on the HMD. Of course, as an alternative, some or all of the processing could be carried out by the HMD itself, if the HMD has sufficient intrinsic processing power. This is true of the various processes discussed in the present description. However, in the case of an HMD connected to an external base unit such as the games machine 300, it can be advantageous to make use of the games machine 300 to carry out processor-intensive tasks, so as to save weight and power consumption at the HMD itself. In other embodiments, though, if the HMD were a stand-alone device such that it provided not only the user display functions but also the functions of a game machine, it will be appreciated that the various functions described in this specification could be carried out by the HMD.
Referring to Figure 4a, at a step 430, the games machine 300 detects one or more reference points such as the points 410 defining the boundary of the physical playing area. As discussed above, this could be the detection of the user activating the Move controller to define the points, or it could be an ongoing detection of markers such as the marker of Figure 3. At a step 440, the games machine 300 defines the playing area in response to the detection of the boundary points, both as a physical boundary and also as a corresponding boundary in the virtual world represented by the game in use. In these arrangements, the boundary can have a user-selectable shape.
Accordingly, an apparatus operating as described with reference to Figure 4a provides an example of a virtual reality system in which images representing a virtual environment are displayed to a user by a head-mountable display, the system comprising: a boundary detector configured to detect, with respect to the physical environment around the user, a boundary of an activity area; and a virtual reality generator configured to vary the user's interaction with the virtual reality environment according to the user's physical position with respect to the boundary of the activity area.
Although the one or more reference positions are positions on the boundary, in alternative examples the one or more reference positions are positions within the boundary, with respect to which positions the boundary is defined. As an example of such an alternative technique, at the step 430 the games machine 300 could detect just one reference point, indicative of a calibration position of the user, and at the step 440 the playing area is defined by the games machine as a predetermined or user-selectable area (having a predetermined or user-selectable shape) around the reference point. If the shape has an orientation (that is to say, it is not circularly symmetric) then in some examples the virtual reality generator is configured to set the orientation of the boundary, with respect to the one or more reference positions, according to an orientation of the head-mountable display at a calibration operation.
Examples of such an operation will be discussed below with reference to Figures 13A-14C.
As another example alternative, at the step 430 the games machine 300 could detect two or more player positions (in a similar manner to the detection of boundary points), but in this case the player positions are not directly on the boundary but instead indirectly define the boundary, in that the boundary is constrained by the games machine to the smallest area which passes no more than a predetermined or user-selectable distance (such as 1 m) from any of the reference points. This example provides for the boundary having a user-selectable shape.
Accordingly, the various possibilities discussed with reference to Figure 4a include the boundary detector being configured to detect the location of one or more markers in the physical environment, the one or more markers indicating respective reference positions with respect to the boundary, and/or the boundary detector being configured to detect user activation of a user control at one or more positions in the physical environment so as to define respective reference positions with respect to the boundary.
Figure 4b is a schematic flowchart illustrating a player warning process. At a step 442, the games machine and/or the HMD detects that the player is near (for example, within a threshold distance of) or at the boundary. At a step 444, the games machine and/or the HMD issues a warning to the user. Note that the nature or extent of the warning may change as the player gets closer to the boundary.
The location of the player in the physical world can be detected by various techniques.
For example, the HMD can be arranged to include motion sensors, so that the location of the HMD relative to a start point (for example, one of the reference points discussed above) can be ascertained by integrating the motion detections by the HMD motion sensors. The Move controller includes a motion tracking arrangement involving a collaborative detection using motion sensors and optical detection (by the camera 310) of a coloured shape forming part of the Move controller. In this way, the position of the Move controller can be taken as a proxy (or approximation for the present purposes) of the position of the user. (A similar mechanism is used if the Move controller is employed to define one or more reference points relating to the boundary). The camera 310 can be arranged to detect characteristic markings on the user and/or on the HMD and to detect a direction and range from those markings. The HMD's front facing or other camera 122 can be used (a) to detect motion of the HMD by a so-called optical flow mechanism, whereby changes in images of the surroundings, as captured by the camera 122, are detected so as to detect camera motion, and/or (b) to detect the location of the HMD relative to one or more markers of Figure 3, and/or (c) to capture an image of the surroundings at a reference point on the boundary, so that if the camera 122 subsequently captures another such image identical to that one, the HMD is at the boundary. Other position tracking techniques such as optical, radio frequency, acoustic (using a microphone array, for example) and other ranging and direction finding techniques may be used.
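As a sketch of the first of these techniques, dead reckoning from a known reference point might be implemented as below; a real system would fuse this with the optical detections described above, since purely integrated sensor estimates drift over time:

    def integrate_position(start_xy, velocity_samples, dt):
        """Dead-reckon a 2-D position from a known start point (for example
        a calibration reference) given per-axis velocity estimates taken
        from the HMD motion sensors at sampling interval dt."""
        x, y = start_xy
        for vx, vy in velocity_samples:
            x += vx * dt  # accumulate displacement along each axis
            y += vy * dt
        return x, y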
Figure 5 schematically illustrates a playing area with respect to the environment surrounding a user.
In particular, Figure 5 schematically illustrates a physical environment (a room) in which the user is engaged in playing a computer game. The current physical position of the user is a position at the front centre of the view of Figure 5, which is to say that the user is standing facing into the room as shown. Boundary markers 410 represent the extremities of a playing area defined with respect to the physical environment by the actions of the user as discussed above.
During operation of a virtual reality system using the HMD, such as a computer game, various objects will be displayed using the displays of the HMD for the user to see. As discussed earlier, the HMD is a full immersion HMD so, in use, the user cannot see the physical surroundings. However, for the sake of the following explanation, a superposition of 3-D virtual objects in a 3-D virtual world and the corresponding real physical positions is shown in Figure 6, which schematically illustrates a user interface menu and a game object. In Figure 6, the virtual reality generator is configured to render one or more virtual objects within the virtual environment.
The game object could be, for example, under the control of the games machine and the game engine relating to the currently executing game. As regards the user interface, the games machine could select a position to display it or alternatively the user could select such a position within the virtual environment, for example by using a position selection operation employing the Move controller as discussed above, or by using a special marker (such as a specially or differently coloured one of the markers of Figure 3).
When the user engages with the virtual reality game displayed to the user using the HMD, the user would normally interact with virtual objects in the virtual world. These objects can be rendered by the games machine 300 at appropriate positions relative to the user in the virtual world. An example object 450 is rendered close to the user.
If the user is able to interact with the object, an interaction marker 455 may be rendered by the virtual reality generator on, near, above or otherwise in association with the object. If the interaction marker is not present, this indicates that the user cannot interact with the object.
Similarly, the user may interact with user interface menus, warnings, messages and the like. An example user interface menu 460 is also displayed at a position within the virtual world.
Various considerations apply to the rendering of these objects in the virtual world.
With regard to the user interface, Figure 7 is a schematic flowchart illustrating a user interface rendering process. This process deals with a potential conflict in position (in the virtual world) between the virtual position at which the user interface is to be rendered and the virtual position at which a game object (such as a character or game asset) is to be rendered. At a step 470, a test is carried out to detect whether the render position of the user interface conflicts with the render position of a game object, both with respect to the virtual world. If so, then at a step 480 the user interface is rendered in preference to the game object. This provides an example of detecting whether the display position, in the virtual environment, of a user interface graphic conflicts with the display position, in the virtual environment, of a virtual object, and if so, rendering the user interface graphic in preference to the virtual object.
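A minimal sketch of the steps 470 and 480 follows, assuming an overlaps predicate and a renderer object; the patent leaves the exact conflict test and rendering interface open:

    def render_frame(ui_elements, game_objects, renderer, overlaps):
        """Where a user interface graphic and a game object would occupy
        conflicting virtual positions, draw the UI and suppress the object."""
        suppressed = {obj for obj in game_objects
                      if any(overlaps(ui, obj) for ui in ui_elements)}
        for obj in game_objects:
            if obj not in suppressed:
                renderer.draw(obj)
        for ui in ui_elements:
            renderer.draw(ui)  # user interface rendered in preference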
With regard to game objects, the present disclosure recognises that the user may be uncomfortable if objects are rendered unduly close to the user in the virtual world. To address this issue, Figure 8 is a schematic flowchart illustrating an object rendering process. A "personal space" is defined around the user, for example corresponding to a distance which the user will perceive in the virtual world to be about 1 m from the user. The present disclosure recognises that the user may be uncomfortable or unduly threatened if realistic-looking objects are rendered in this personal space. Accordingly, in Figure 8, if, at a step 490, an object is to be rendered closer than a threshold distance from the user, then at a step 500 the object is rendered as a schematic representation such as (in this example) a wire-frame representation rather than as a realistically rendered representation. Accordingly, for any individual object, the manner in which it is rendered varies according to whether or not it is within the threshold distance and accordingly within the personal space of the user. This provides an example of an arrangement in which the virtual reality generator is configured to detect a separation, in the virtual environment, between an object to be rendered and the user, and to vary the way in which the object is rendered according to the detected separation. In some examples the virtual reality generator is configured to render a more detailed representation of the object if the separation is greater than a threshold separation, and to render a less detailed representation if the separation is less than the threshold separation. The less detailed representation may be a wire-frame representation.
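The steps 490 and 500 reduce to a distance test, sketched below; the 1 m threshold and the renderer method names are illustrative assumptions:

    import math

    PERSONAL_SPACE_M = 1.0  # example personal-space threshold from the text

    def render_object(obj, user_pos, renderer):
        """Objects inside the user's personal space are drawn as a
        schematic wire-frame rather than a realistic representation."""
        if math.dist(obj.position, user_pos) < PERSONAL_SPACE_M:
            renderer.draw_wireframe(obj)  # less detailed representation
        else:
            renderer.draw_realistic(obj)  # more detailed representation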
Rather than necessarily tying the position of a user interface to a position within the virtual world, it is possible to associate a user interface with a position on the user's display, so that the user interface moves with the user's head movements. In other words, the user interface display remains at a constant position in the user's field of view rather than at a constant position within the virtual world. Figure 9 schematically illustrates such a user interface display in which left 510 and right 520 images are displayed to the user using the HMD, and a user interface 530 is displayed in the left and right images so as to appear at a 3-D position relative to the user but to move with the user's display so as to appear at a constant position within the user's field of view. This facility could be reserved for certain system or personal menus or information, for example a display of the user's current score or a master control menu such as a cross media bar or the like.
Figure 10 is a schematic flowchart illustrating a rendering process, which again addresses the issue of the user's personal space. The arrangement of Figure 10 provides a degree of privacy to the user, by selectively rendering or not rendering game objects to other users of a multiplayer game. In particular, at a step 540, the games machine 300 detects whether an object to be rendered is within the user's personal space, which is to say it is to be rendered wholly or partly within a threshold distance (DmR) of the position, in the virtual world, of the user.
If so, then at a step 550 the object is rendered so as to be visible to the user (within whose personal space it lies) but is not rendered to other users who may share a view of the virtual world.
This provides an example of a virtual reality generator configured so as to inhibit the rendering, to another user in a multi-user virtual environment, of an object within the threshold separation from the user. This arrangement provides a localised privacy region for each user such that objects within that region are not rendered so as to be visible to other users, even if those other users are positioned and oriented so as to see the first user. Note that the restriction on rendering items to other users does not apply to the first user's character or avatar but only to game objects within the user's personal space. The restriction can, in some embodiments, be applied only to objects which the user carries with the user rather than to objects which the user is moving past, so as to avoid stationary objects disappearing and re-materialising as the user walks past them in the virtual world.
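A sketch of this per-viewer visibility rule, including the avatar exemption and the optional carried-object refinement described above (the attribute names are assumptions):

    import math

    def visible_to(viewer, owner, obj, threshold):
        """Decide whether obj, associated with owner, should be
        rendered to viewer in a multi-user virtual environment."""
        if viewer is owner or obj.is_avatar:
            return True  # owners see their own objects; avatars stay visible
        in_personal_space = math.dist(obj.position, owner.position) <= threshold
        # Hide only objects the owner carries, so that stationary objects do
        # not vanish and re-materialise as the owner walks past them.
        return not (in_personal_space and obj.carried_by_owner)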
Figure 11 schematically illustrates parts of the internal structure of a computer games machine such as the computer games machine 300 (which, as discussed, is an example of a general-purpose data-processing machine). Figure 11 illustrates a central processing unit (CPU) 1100, a hard disk drive (HDD) 1110, a graphics processing unit (GPU) 1120, a random access memory (RAM) 1130, a read-only memory (ROM) 1140 and an interface 1150, all connected to one another by a bus structure 1160. The HDD 1110 and the ROM 1140 are examples of a machine-readable non-transitory storage medium. The interface 1150 can provide an interface to the camera 310, to other input devices, to a computer network such as the Internet, to a display device (not shown in Figure 11, but corresponding, for example, to the HMD 20) and so on. Operations of the apparatus shown in Figure 11 to perform one or more of the operations described in the present description are carried out by the CPU 1100 and the GPU 1120 under the control of appropriate computer software stored by the HDD 1110, the RAM 1130 and/or the ROM 1140. It will be appreciated that such computer software, and the storage media (including the non-transitory machine-readable storage media) by which such software is provided or stored, are considered as embodiments of the present disclosure.
Such a software-implemented method may include a virtual reality method comprising: displaying images representing a virtual environment to a user by a head-mountable display; detecting, with respect to the physical environment around the user, a boundary of an activity area; and varying the user's interaction with the virtual reality environment according to the user's physical position with respect to the boundary of the activity area.
As mentioned above in connection with Figure 4a, one way in which the active playing area can be defined is with respect to one or more reference points. While these can be points demarcating the periphery or boundary itself, and several examples of such an arrangement are discussed above, another way in which a playing area can be defined is with respect to one or more reference points which are not themselves on the boundary. Examples of such a technique will be discussed with reference to Figures 12A-12C, 13A-13C and 14A-14C. Figures 15A-15C below relate to an arbitrarily defined boundary, for example one defined using boundary markers as discussed above.
Figures 12A to 12C schematically illustrate a circular playing area 1220, defined as having a predetermined or user-selectable radius 1200 from a calibration point 1210 established by the user placing a marker or executing a calibration operation (for example by operating a control with respect to the Move controller while occupying the calibration point). Once established, the playing area 1220 remains in place with respect to the original calibration position until cancelled by the user, until the game has finished, or until the machine is switched off or reset.
After the establishment of the playing area 1220, the user may move around inside the playing area in the course of playing the game. A virtual zone 1230 is established around the current position of the user, for example representing a radius 1240, less than the radius 1200, around the current position of the user.
The step 442 of the flowchart of Figure 4B can, in this example, involve detecting whether the virtual zone 1230 meets or intersects the boundary of the playing area 1220.
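For the circular area this check reduces to arithmetic on distances, as in the following sketch; the strict versus non-strict comparison corresponds to the "intersects" and "meets" readings discussed next:

    import math

    def zone_meets_boundary(user_pos, centre, area_radius, zone_radius):
        """Step 442 for Figures 12A-12C: the virtual zone around the user
        reaches the circular boundary when the user's distance from the
        centre plus the zone radius is at least the playing-area radius."""
        return math.dist(user_pos, centre) + zone_radius >= area_radius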
In some examples, the virtual zone 1230 simply meeting the boundary of the playing area 1220 is sufficient to trigger the step 444 of Figure 4B. In other embodiments, an actual intersection between the virtual zone 1230 and the boundary of the playing area 1220 is required to trigger the step 444. Of course, the virtual zone 1230 is just a convenient way of visualising a test which assesses the distance between the player and the boundary. These conditions in fact correspond to the tests "is the distance between the player and the boundary less than or equal to a threshold distance?" and "is the distance between the player and the boundary less than a threshold distance?" respectively. In practice, because of the observational errors incurred in assessing the exact position of the player and the exact position of the boundary, the two variants of the test are unlikely to give different answers.
In Figure 12B, the position 1250 of the player is such that the virtual zone 1230 has intersected the boundary of the playing area 1220. A mild warning is provided to the user, for example using one of the warning techniques discussed above and/or a further technique discussed below. The mild warning is represented schematically by a light shading around the player position 1250. In some examples, the light shading indicates a light opacity (in the case that a fade to a solid colour is used as the warning mechanism), so that the image as displayed to the user in this state is mainly the game or virtual reality image, with a small component from the solid colour.
In Figure 12C, the position 1260 of the player is such that the virtual zone 1230 has significantly intersected the boundary of the playing area 1220. Indeed, the player position 1260 is practically at the boundary. A stronger warning is provided to the user, for example using one of the warning techniques discussed above and/or a further technique discussed below. The stronger warning is represented schematically by a denser shading around the player position 1260. In some examples, the denser shading indicates a greater opacity (in the case that a fade to a solid colour is used as the warning mechanism), so that the image as displayed to the user in this state is mainly the solid colour, with only a small component from the game or virtual reality image. It will be appreciated that the ratio between the two images (the game / virtual reality image and the solid colour) can vary according to the degree of overlap, that is, the separation of the player from the boundary. Similar techniques can be applied to other warning mechanisms such as a fade to a front-facing camera image.
So far the discussion has related to a two-dimensional playing area. It is however possible to define a three dimensional playing area using equivalent techniques. For example in the case of Figures 12A-12C, the playing area could be defined as a hemisphere centred around the ground-level projection of the calibration position 1210.
The warning technique of the step 444 can take various forms. Some possible techniques involve mixing the game images with an HMD head-mounted camera signal or with one or more predetermined images (such as predetermined colours) to provide the warning to the user. The mix ratio can be varied (from a small proportion of the camera signal to a larger proportion of the camera signal) in dependence upon the strength of the warning required, for example in a generally inverse relationship to the separation of the player's current position from the edge of the playing area. For example, the system can warn the user if the user is less than a threshold distance from the boundary, and can change the nature of the warning in response to the user moving closer to the boundary than the threshold distance.
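A sketch of such a proximity-driven mix, assuming the frames are floating-point image arrays (for example NumPy arrays) and a simple linear ramp; the patent does not fix the exact mixing law:

    def warning_mix(game_frame, camera_frame, distance_to_boundary, threshold):
        """Blend the game image towards the camera image (or a solid-colour
        frame): no mixing at or beyond the threshold distance from the
        boundary, full camera image at the boundary itself."""
        alpha = min(1.0, max(0.0, 1.0 - distance_to_boundary / threshold))
        return (1.0 - alpha) * game_frame + alpha * camera_frame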
Other possibilities for the step 444 include fading the HMD video display to a predetermined colour (for example, black, red, white or the like) with the fade becoming more pronounced (more fixed colour, less game video) as the user approaches the boundary.
Other possibilities include progressively distorting, decomposing, defocusing or applying another type of video processing or rendering effect to the video content generated by the game.
Once again, the severity of the effect could be linked to the separation from the boundary so that as the user gets closer to the inside edge of the boundary, the severity of the video processing or rendering effect, in terms of the amount of change imposed on the game video, is increased.
Figures 13A to 13C schematically illustrate a square playing area. Here, similar principles to those discussed in connection with Figures 12A-12C can be applied, and wherever possible the same reference numerals have been used.
A square playing area 1300 of side 1310 is established in response to the original calibration point 1210. Because the square is not rotationally symmetric, in order to orient the square shape at the calibration stage, a forward facing direction or orientation 1320 of the HMD can be detected by the games machine and used as a predetermined direction (in this example, a direction parallel to two sides of the square) to orient the square.
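A containment test for such an oriented square is sketched below, on the assumptions that the square is centred on the calibration point and that yaw is the HMD forward direction (in radians) captured at calibration:

    import math

    def inside_oriented_square(user_pos, centre, side, yaw):
        """Rotate the user's offset into the square's own frame and
        compare each local coordinate against half the side length."""
        dx, dy = user_pos[0] - centre[0], user_pos[1] - centre[1]
        c, s = math.cos(-yaw), math.sin(-yaw)
        lx, ly = dx * c - dy * s, dx * s + dy * c  # offset in square-local axes
        half = side / 2.0
        return abs(lx) <= half and abs(ly) <= half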
The remainder of Figures 13B and 13C follow the same operational considerations as those discussed with respect to Figures 12B-12C. When considering the separation of the user from the boundary of the playing area 1300, the shortest distance (for example) between the user's current position and the boundary (that is, the nearest point on the boundary) can be considered.
Figures 14A to 14C schematically illustrate an ellipsoid playing area. Here once again, similar principles to those discussed in connection with Figures 12A-12C can be applied, and wherever possible the same reference numerals have been used.
An elliptical playing area 1400 is defined. This could be defined with respect to one calibration position 1210, with an orientation (for example, the minor axis being parallel to a forward facing direction 1430 of the HMD at the time of calibration) set by the user. Or two reference points 1410, 1420 could be defined as foci of the ellipse. Once again, the closest point to the user is compared with the threshold distance to detect whether the user is too close to the edge of the playing area.
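With two reference points used as foci, the inside test follows directly from the defining property of an ellipse, as sketched below; major_axis_length (the sum-of-distances bound, 2a) would be a predetermined or user-selectable parameter:

    import math

    def inside_ellipse(user_pos, focus_a, focus_b, major_axis_length):
        """A point lies inside (or on) the ellipse defined by two foci when
        its summed distance to the foci is at most the major axis length."""
        return (math.dist(user_pos, focus_a)
                + math.dist(user_pos, focus_b)) <= major_axis_length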
Incidentally, a circular virtual zone 1230, which is a representation of a comparison of the user's separation from the boundary with a threshold distance, is used as an example in each of the sets of diagrams in Figures 12A-15C. However, an asymmetric virtual zone 1230, corresponding to a threshold distance which varies with direction, could be used.
Figures 15A to 15C schematically illustrate an arbitrarily shaped playing area 1500.
Here, the playing area could be set (as a predetermined but arbitrary-looking shape) with respect to a calibration position 1210. However, as discussed above with reference to Figure 4A, other ways of defining such an area include the use of boundary markers or marker actions, or other reference points (or a reference locus or curve) from which the boundary of the playing area 1500 is at least a threshold distance away.

Claims (25)

  1. A virtual reality system in which images representing a virtual environment are displayed to a user by a head-mountable display, the system comprising: a boundary detector configured to detect, with respect to the physical environment around the user, a boundary of an activity area; and a virtual reality generator configured to vary the user's interaction with the virtual reality environment according to the user's physical position with respect to the boundary of the activity area.
  2. A system according to claim 1, in which the virtual reality generator is configured to warn the user if the user approaches the boundary of the activity area.
  3. A system according to claim 2, in which the virtual reality generator is configured to mix the images representing the virtual environment with images captured of the physical environment to provide the warning to the user.
  4. A system according to claim 2, in which the virtual reality generator is configured to mix the images representing the virtual environment with one or more predetermined images to provide the warning to the user.
  5. A system according to any one of claims 2 to 4, in which the virtual reality generator is configured: to warn the user if the user is less than a threshold distance from the boundary; and to change the nature of the warning in response to the user moving closer to the boundary than the threshold distance.
  6. A system according to any one of the preceding claims, in which the boundary detector is configured to detect the location of one or more markers in the physical environment, the one or more markers indicating respective reference positions with respect to the boundary.
  7. A system according to any one of the preceding claims, in which the boundary detector is configured to detect user activation of a user control at one or more positions in the physical environment so as to define respective reference positions with respect to the boundary.
  8. A system according to claim 6 or claim 7, in which the one or more reference positions are positions on the boundary.
  9. A system according to claim 6 or claim 7, in which the one or more reference positions are positions within the boundary, with respect to which positions the boundary is defined.
  10. A system according to claim 9, in which the boundary has a user-selectable shape.
  11. A system according to claim 9 or claim 10, in which the virtual reality generator is configured to set the orientation of the boundary, with respect to the one or more reference positions, according to an orientation of the head-mountable display at a calibration operation.
  12. A system according to any one of the preceding claims, in which the virtual reality generator is configured to render one or more virtual objects within the virtual environment.
  13. A system according to claim 12, in which the virtual reality generator is configured to detect a separation, in the virtual environment, between an object to be rendered and the user, and to vary the way in which the object is rendered according to the detected separation.
  14. A system according to claim 13, in which the virtual reality generator is configured to render a more detailed representation of the object if the separation is greater than a threshold separation, and to render a less detailed representation if the separation is less than the threshold separation.
  15. A system according to claim 14, in which the less detailed representation is a wire-frame representation.
  16. A system according to claim 14 or claim 15, in which the virtual reality generator is configured so as to inhibit the rendering, to another user in a multi-user virtual environment, of an object within the threshold separation from the user.
  17. A system according to any one of claims 12 to 15, in which the virtual reality generator is configured to display a marker, associated with a rendered virtual object, to indicate whether a user may interact with that virtual object.
  18. A system according to any one of claims 12 to 15, in which the virtual reality generator is configured to display user interface graphics to the user at positions, within the virtual environment, selected by the user.
  19. A system according to claim 18, in which the virtual reality generator is configured to detect whether the display position, in the virtual environment, of a user interface graphic conflicts with the display position, in the virtual environment, of a virtual object, and if so, to render the user interface graphic in preference to the virtual object.
  20. A virtual reality system substantially as hereinbefore described with reference to the accompanying drawings.
  21. A head-mountable display comprising a virtual reality system according to any one of the preceding claims.
  22. A virtual reality method comprising: displaying images representing a virtual environment to a user by a head-mountable display; detecting, with respect to the physical environment around the user, a boundary of an activity area; and varying the user's interaction with the virtual reality environment according to the user's physical position with respect to the boundary of the activity area.
  23. A virtual reality method substantially as hereinbefore described with reference to the accompanying drawings.
  24. Computer software which, when executed by a computer, causes the computer to carry out the method of claim 22 or claim 23.
  25. A machine-readable non-transitory storage medium which stores computer software according to claim 24.
GB1404850.8A 2014-03-17 2014-03-18 Virtual reality Active GB2524269B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB1404732.8A GB201404732D0 (en) 2014-03-17 2014-03-17 Virtual Reality

Publications (3)

Publication Number Publication Date
GB201404850D0 GB201404850D0 (en) 2014-04-30
GB2524269A (en) 2015-09-23
GB2524269B GB2524269B (en) 2021-04-14

Family

ID: 50634898

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB1404732.8A Ceased GB201404732D0 (en) 2014-03-17 2014-03-17 Virtual Reality
GB1404850.8A Active GB2524269B (en) 2014-03-17 2014-03-18 Virtual reality

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB1404732.8A Ceased GB201404732D0 (en) 2014-03-17 2014-03-17 Virtual Reality

Country Status (1)

Country Link
GB (2) GB201404732D0 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2301216A (en) * 1995-05-25 1996-11-27 Philips Electronics Uk Ltd Display headset
GB2376397A (en) * 2001-06-04 2002-12-11 Hewlett Packard Co Virtual or augmented reality
EP1538512A2 (en) * 2003-12-04 2005-06-08 Canon Kabushiki Kaisha Mixed reality exhibiting method and apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9630105B2 (en) * 2013-09-30 2017-04-25 Sony Interactive Entertainment Inc. Camera based safety mechanisms for users of head mounted displays

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107533364B (en) * 2015-12-22 2020-06-02 奥迪股份公司 Method for operating a virtual reality system and virtual reality system
DE102015226580A1 (en) * 2015-12-22 2017-06-22 Audi Ag Method for operating a virtual reality system and virtual reality system
US10528125B2 (en) 2015-12-22 2020-01-07 Audi Ag Method for operating a virtual reality system, and virtual reality system
CN107533364A (en) * 2015-12-22 2018-01-02 奥迪股份公司 For running the method and virtual reality system of virtual reality system
US11350156B2 (en) 2016-03-11 2022-05-31 Sony Interactive Entertainment Europe Limited Method and apparatus for implementing video stream overlays
WO2017153775A1 (en) * 2016-03-11 2017-09-14 Sony Computer Entertainment Europe Limited Image processing method and apparatus
CN105739125A (en) * 2016-03-30 2016-07-06 惠州Tcl移动通信有限公司 VR (Virtual Reality) glasses and obstacle prompting method
CN107277736A (en) * 2016-03-31 2017-10-20 株式会社万代南梦宫娱乐 Simulation System, Sound Processing Method And Information Storage Medium
EP3457251A1 (en) * 2016-04-27 2019-03-20 Rovi Guides, Inc. Methods and systems for displaying additional content on a heads up display displaying a virtual reality environment
US11353949B2 (en) 2016-04-27 2022-06-07 Rovi Guides, Inc. Methods and systems for displaying additional content on a heads up display displaying a virtual reality environment
US10691199B2 (en) 2016-04-27 2020-06-23 Rovi Guides, Inc. Methods and systems for displaying additional content on a heads up display displaying a virtual reality environment
US10434653B2 (en) 2016-05-11 2019-10-08 Intel Corporation Movement mapping based control of telerobot
WO2017193297A1 (en) * 2016-05-11 2017-11-16 Intel Corporation Movement mapping based control of telerobot
EP3514663A1 (en) * 2016-05-17 2019-07-24 Google LLC Techniques to change location of objects in a virtual/augmented reality system
US10496156B2 (en) 2016-05-17 2019-12-03 Google Llc Techniques to change location of objects in a virtual/augmented reality system
CN105913715A (en) * 2016-06-23 2016-08-31 同济大学 VR sharable experimental system and method applicable to building environmental engineering study
US11019448B2 (en) * 2016-06-30 2021-05-25 Nokia Technologies Oy Providing audio signals in a virtual environment
WO2018002427A1 (en) * 2016-06-30 2018-01-04 Nokia Technologies Oy Providing audio signals in a virtual environment
EP3264801A1 (en) * 2016-06-30 2018-01-03 Nokia Technologies Oy Providing audio signals in a virtual environment
US20190335290A1 (en) * 2016-06-30 2019-10-31 Nokia Technologies Oy Providing audio signals in a virtual environment
US11030879B2 (en) 2016-11-22 2021-06-08 Sony Corporation Environment-aware monitoring systems, methods, and computer program products for immersive environments
WO2018096599A1 (en) * 2016-11-22 2018-05-31 Sony Mobile Communications Inc. Environment-aware monitoring systems, methods, and computer program products for immersive environments
CN106878944A (en) * 2017-01-22 2017-06-20 上海乐相科技有限公司 A kind of method and locating calibration device for calibrating locating base station coordinate system
CN106851575A (en) * 2017-01-22 2017-06-13 上海乐相科技有限公司 The method and locating calibration device of a kind of unified locating base station coordinate system
CN106878944B (en) * 2017-01-22 2020-04-24 上海乐相科技有限公司 Method for calibrating coordinate system of positioning base station and positioning calibration device
CN106781264A (en) * 2017-02-16 2017-05-31 京东方科技集团股份有限公司 Safe range positioner for terminal and the terminal with it
US10832477B2 (en) 2017-11-30 2020-11-10 International Business Machines Corporation Modifying virtual reality boundaries based on usage
US11488330B2 (en) 2018-06-01 2022-11-01 Hewlett-Packard Development Company, L.P. Boundary maps for virtual reality systems
CN108872937B (en) * 2018-06-27 2020-11-13 上海乐相科技有限公司 Method and device for calibrating and positioning base station coordinate system
CN108872937A (en) * 2018-06-27 2018-11-23 上海乐相科技有限公司 A kind of method and device for calibrating locating base station coordinate system
GB2587371A (en) * 2019-09-25 2021-03-31 Nokia Technologies Oy Presentation of premixed content in 6 degree of freedom scenes
WO2021183736A1 (en) * 2020-03-13 2021-09-16 Harmonix Music Systems, Inc. Techniques for virtual reality boundaries and related systems and methods
US11602691B2 (en) 2020-03-13 2023-03-14 Harmonix Music Systems, Inc. Techniques for virtual reality boundaries and related systems and methods

Also Published As

Publication number Publication date
GB201404850D0 (en) 2014-04-30
GB201404732D0 (en) 2014-04-30
GB2524269B (en) 2021-04-14

Similar Documents

Publication Publication Date Title
GB2524269A (en) Virtual reality
GB2556347B (en) Virtual Reality
JP6373920B2 (en) Simulation system and program
EP3008698B1 (en) Head-mountable apparatus and systems
CA2989939C (en) Technique for more efficiently displaying text in virtual image generation system
US20150054734A1 (en) Head-mountable apparatus and systems
JP2013506226A (en) System and method for interaction with a virtual environment
US11507201B2 (en) Virtual reality
KR20230117639A (en) Methods for adjusting and/or controlling immersion associated with user interfaces
GB2583535A (en) Data processing
WO2018115840A1 (en) Virtual reality content control
GB2566745A (en) Motion signal generation
WO2018234318A1 (en) Reducing simulation sickness in virtual reality applications
NZ738277B2 (en) Technique for more efficiently displaying text in virtual image generation system