WO2016135450A1 - A mixed reality system and method for displaying data therein - Google Patents


Info

Publication number
WO2016135450A1
Authority
WO
WIPO (PCT)
Application number
PCT/GB2016/050375
Other languages
French (fr)
Inventor
Christopher James WHITEFORD
Nicholas Giacomo Robert Colosimo
Julian David WRIGHT
Original Assignee
Bae Systems Plc
Priority claimed from EP15275046.9A external-priority patent/EP3062219A1/en
Priority claimed from GB201503113A external-priority patent/GB201503113D0/en
Application filed by Bae Systems Plc filed Critical Bae Systems Plc
Publication of WO2016135450A1 publication Critical patent/WO2016135450A1/en


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality

Definitions

  • Whilst the shared layer 302 can be seen by all users, it is not essential that all data thereon be viewable by all of them. Indeed, some users may not be authorised to view such data, and others may not be authorised to manipulate such data. Thus, data items assigned to the shared layer 302 may have respective security attributes associated therewith; these may be assigned by the creator of the data, or they may be assigned centrally, in accordance with, for example, the source from which the data is received. A first user, for example, may be authorised to view all data on the shared layer 302. In this case, all such data will be displayed on the shared layer within their virtual environment. In an exemplary embodiment of the invention, the user is able to selectively remove data they do not require in order to de-clutter their virtual environment, and reinstate such data as required.
  • Another user may be authorised to view all data on the shared layer 302, but may only be authorised to interact with selected data items; the security attributes associated with the displayed data items will reflect that user's permissions, thus only allowing them to interact with data items which they are authorised to manipulate (in which case, such interaction may take place within the shared layer or in their private layer, as required).
  • A user would only be permitted to move a data item into their private layer 300 if they have permission to interact with that data item.
  • Yet another user may not be authorised to view certain data items included on the shared layer, and these would be identified and eliminated from their view of the shared layer 302.
  • A security module may be provided in order to identify the user of each headset, and their levels of authorisation, and configure their view of the shared layer 302 accordingly.
  • In the arrangement illustrated, the two layers 300, 302 are "stacked" rearwardly from the foreground of the user's field of view, although the layers could be arranged in any convenient manner and the present invention is not intended to be in any way limited in this regard.
  • The shared layer 302 appears in the foreground of the user's virtual environment and the private layer 300 appears "behind" it.
  • Data items 304 which the user is authorised to view appear on the shared layer 302, which in the example shown appears as a two-dimensional screen, but the invention is again not necessarily intended to be limited in this regard.
  • The layers 300, 302 may be entirely three dimensional, such that when a layer is in the foreground, the data items thereon are displayed in three dimensions, with the hidden layer "behind" it having no effect on the visual representation of the front layer.
  • Data items 304 from the "front" layer can be moved into the other layer by, for example, a dragging and dropping action, which may, for example, be facilitated by hand gestures made by the user.
  • Alternatively, the user may "pull" the rear layer into the foreground by means of a predefined hand gesture, thus enabling them to view and/or manipulate data therein, and leaving the other layer sitting behind it until it is once again required for use in the foreground.
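The patent itself contains no code, but the stacked-layer arrangement just described, with a gesture pulling a rear layer to the foreground and drag-and-drop moving items between layers, can be sketched roughly as follows. All class and method names here are invented purely for illustration:

```python
class Layer:
    """A named, ordered container of data items; purely illustrative."""
    def __init__(self, name, items=None):
        self.name = name
        self.items = list(items or [])

class LayerStack:
    """A user's layers, stacked rearwardly; index 0 is the foreground."""
    def __init__(self, *layers):
        self.layers = list(layers)

    @property
    def foreground(self):
        return self.layers[0]

    def bring_to_front(self, name):
        """'Pull' a rear layer into the foreground (e.g. in response to a
        predefined hand gesture); the previous front layer drops behind it."""
        i = next(i for i, l in enumerate(self.layers) if l.name == name)
        self.layers.insert(0, self.layers.pop(i))

    def drag_item(self, item, dst_name):
        """Drag-and-drop an item from the foreground layer to another layer."""
        self.foreground.items.remove(item)
        next(l for l in self.layers if l.name == dst_name).items.append(item)
```

A shared layer in front of a private layer would then be modelled as `LayerStack(Layer("shared"), Layer("private"))`, with `bring_to_front("private")` interchanging the two.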

Abstract

A mixed reality system comprising a headset for placing over a user's eyes, in use, said headset including a screen, the system further comprising a processor configured to receive data from one or more sources and display said data (304) on said screen within a three-dimensional virtual environment, an image capture means arranged to capture images of a real world environment in the vicinity of the user and the processor being further configured to blend at least a portion of said real world environment into said three dimensional virtual environment to create a mixed reality environment, wherein said virtual environment includes at least two virtual, interactive data layers (300, 302) on which said data can be selectively displayed and within which said data can be manipulated by a user, each said layer (300, 302) having associated therewith a permission characteristic defining visibility thereof within said three dimensional environment displayed on the screen of another user of said system. A method of displaying data for viewing and/or manipulation by a plurality of users within a mixed reality system is also described.

Description

A MIXED REALITY SYSTEM AND METHOD FOR DISPLAYING DATA
THEREIN
This invention relates generally to a mixed reality system and a method for displaying data therein and, more particularly, to such a system and a method for displaying information therein to a plurality of users, and permitting selective viewing and manipulation thereof.
Virtual reality systems are known, comprising a headset which, when placed over a user's eyes, creates and displays a three dimensional virtual environment in which a user feels immersed and with which the user can interact in a manner dependent on the application. For example, the virtual environment created may comprise a game zone, within which a user can play a game.
More recently, virtual reality systems have been developed which enable "screens" of information, derived from multiple data sources, to be displayed within a three-dimensional virtual room, such that when a user places the headset over their eyes, they feel immersed within a virtual room having multiple data sources displayed simultaneously in three dimensions.
More recently, augmented and mixed reality systems have been developed, in which an image of a real world object can be captured, rendered and placed within a 3D virtual reality environment, such that it can be viewed and manipulated within that environment in the same way as virtual objects therein. Once again, it is envisaged that data from one or more external data sources can be visually represented and placed within the virtual environment such that multiple data sources are displayed simultaneously in three dimensions. This concept has a large number of potential uses within a working environment, as it allows many forms of information to be displayed, in novel and convenient ways, within an environment that can be tailored to the user.
An extension of such a system provides a plurality of headsets, each displaying a user environment, within which a plurality of respective users can share information in the same way as physical objects and information can be shared within the real world environment. However, problems may arise in the case where there are a number of different users of the system, some or all of whom have different authorisations and permissions to access and/or manipulate the information provided. When there are multiple people working in a close, collaborative environment, different people will be authorised for, and require, access to different information and different security levels. In addition, there are other situations envisaged where the shared information may not apply to all users of the system. For example, a user may not wish to see a specific piece of information, as it may not be relevant to them and its inclusion within their field of view simply serves to distract them from the information they require. Furthermore, a user may wish to interact with private data without any other user having access thereto in order to view and/or edit it.
It would, therefore, be desirable to provide a multi-user mixed reality system in which security and/or privacy can be accounted for, so as to enable a bespoke environment to be created for each user of the system, and aspects of the present invention seek to address at least some of these issues.
In accordance with a first aspect of the present invention, there is provided a mixed reality system comprising a headset for placing over a user's eyes, in use, said headset including a screen, the system further comprising a processor configured to receive data from one or more sources and display said data on said screen within a three-dimensional virtual environment, an image capture means arranged to capture images of a real world environment in the vicinity of the user and the processor being further configured to blend at least a portion of said real world environment into said three dimensional virtual environment to create a mixed reality environment, wherein said virtual environment includes at least two virtual, interactive data layers on which said data can be selectively displayed and within which said data can be manipulated by a user, each said layer having associated therewith a permission characteristic defining visibility thereof within said three dimensional environment displayed on the screen of another user of said system.
One of the data layers may be designated as a shared layer, such that said layer and data displayed thereon is accessible for display within the virtual environment displayed on the screen of any or all users of the system. One of the data layers may be designated as a private layer, such that data displayed thereon is displayed only within the virtual environment on the screen of a selected one or more users of the system.
According to an exemplary embodiment of the invention, the system may include a control module configured to enable a user to selectively manipulate the relative positions of the data layers within the virtual environment displayed on their screen. In this case, the data layers may be arranged such that a first data layer is displayed in the foreground of the virtual environment displayed on a user's screen and one or more other data layers are located within the virtual environment behind the first data layer, wherein said control module may be configured to selectively interchange the data layers displayed in the foreground.
The system may further comprise a selection module configured to enable a user to selectively transfer data from a first data layer displayed within the virtual environment on their screen to a second data layer displayed therein.
In one exemplary embodiment, each data layer defined in the system may have associated therewith a security attribute defining whether or not a respective data layer is displayed within the virtual environment on a user's screen. The system may further include a security module for selectively applying security attributes to items of data within a layer. Thus, the system may further comprise an identification module for identifying a user and predefined permission attributes associated therewith, and permitting display and/or manipulation of only those data layers and data having corresponding security attributes within the virtual environment on said user's screen.
Another aspect of the present invention extends to a method of displaying data for viewing and/or manipulation by a plurality of users within a mixed reality system comprising at least one headset including a screen, and a processor configured to receive data from one or more sources and display said data on said screen within a three-dimensional virtual environment, an image capture means for capturing images of a real world environment in the vicinity of the user and the processor being further configured to blend at least a portion of said real world environment into said three dimensional virtual environment to create a mixed reality environment, the method comprising building, within said virtual environment, at least two virtual, interactive data layers, each said layer having associated therewith a permission characteristic defining visibility thereof within said three dimensional environment displayed on the screen of another user of said system, selectively displaying said data on said data layers, and providing a control function configured to enable a user to selectively manipulate said data layers so as to change the relative location thereof within the virtual environment displayed on their screen, and selectively move data displayed on a first data layer to a second data layer.
These and other aspects of the present invention will be apparent from the following specific description in which embodiments of the present invention are described, by way of examples only, and with reference to the accompanying drawings, in which:
Figure 1 is a front perspective view of a headset for use in a system according to an exemplary embodiment of the present invention;
Figure 2 is a schematic block diagram of a system according to an exemplary embodiment of the present invention; and
Figure 3 is a schematic diagram illustrating the concept of displaying data in the form of layers within a virtual environment as envisaged in accordance with an exemplary embodiment of the present invention.
Referring to Figure 1 of the drawings, a system according to the present invention may comprise a headset comprising a visor 10 having a pair of arms 12 hingedly attached at opposing sides thereof in order to allow the visor to be secured onto a user's head, over their eyes, in use, by placing the curved ends of the arms 12 over and behind the user's ears, in a manner similar to conventional spectacles. It will be appreciated that, whilst the headset is illustrated herein in the form of a visor, it may alternatively comprise a helmet for placing over a user's head, or even a pair of contact lenses or the like, for placing within the user's eyes, and the present invention is not intended to be in any way limited in this regard. Also provided on the headset is a pair of image capture devices 14 for capturing images of the environment, such image capture devices being mounted roughly aligned with a user's eyes in use. The system of the present invention further comprises a processor, which is communicably connected in some way to a screen which is provided inside the visor 10. Such communicable connection may be a hard wired electrical connection, in which case the processor and associated circuitry will also be mounted on the headset. However, in an alternative exemplary embodiment, the processor may be configured to wirelessly communicate with the visor, for example, by means of Bluetooth or a similar wireless communication protocol, in which case the processor need not be mounted on the headset but can instead be located remotely from the headset, with the relative allowable distance between them being dictated and limited only by the wireless communication protocol being employed. For example, the processor could be mounted on or formed integrally with the user's clothing, or instead located remotely from the user, either as a stand-alone unit or as an integral part of a larger control unit, for example.
Referring to Figure 2 of the drawings, a system according to an exemplary embodiment of the invention comprises, generally, a headset 100, incorporating a screen 102, a processor 104, and a pair of external digital image capture devices 106 (only one shown). As stated previously, the user's headset 100 includes two image capture devices 14, which may be used to capture respective images of the real world environment in the vicinity of the user, and data representative thereof can be blended to produce a stereoscopic depth map which enables the processor 104 to determine depth within the captured images without any additional infrastructure being required. All or selected portions of the 3D images can be blended into the virtual environment being displayed on the screen 102.
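The patent does not specify how the stereoscopic depth map is computed, but a classical approach for a rectified camera pair is block matching followed by the pinhole disparity-to-depth relation. The following is an illustrative sketch only; the function names, patch size, and camera parameters are invented for the example:

```python
import numpy as np

def block_match_disparity(left, right, row, col, patch=5, max_disp=16):
    """Estimate the disparity at (row, col) of a rectified stereo pair by
    sum-of-absolute-differences block matching: compare a patch from the
    left image against horizontally shifted patches in the right image and
    keep the offset with the lowest matching cost."""
    half = patch // 2
    ref = left[row - half:row + half + 1, col - half:col + half + 1].astype(int)
    best_d, best_cost = 0, float("inf")
    for d in range(max_disp):
        c = col - d
        if c - half < 0:
            break
        cand = right[row - half:row + half + 1, c - half:c + half + 1].astype(int)
        cost = int(np.abs(ref - cand).sum())
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Standard pinhole relation for a rectified pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px
```

Repeating the disparity estimate over every pixel yields the depth map referred to above; real-time systems would use an optimised implementation rather than this per-pixel loop.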
The general concept of real time image blending for augmented and mixed reality is known, and several techniques have been proposed. The present invention is not intended to be in any way limited in this regard. However, for completeness, one exemplary method for image blending will be briefly described. Thus, once the image data for an item to be blended into the virtual environment has been generated, a threshold function may be applied in order to extract that image data from any background images. Its relative location and orientation may also be extracted and preserved by means of marker data. Next, the image and marker data is converted to a binary image, possibly by means of adaptive thresholding (although other methods are known). The marker data and binary image are then transformed into a set of coordinates which match the location within the virtual environment in which they will be blended. Such blending is usually performed using black and white image data. Thus, if necessary, colour data sampled from the source image may be backward warped, using homography, to each pixel in the resultant virtual scene. All of these computational steps require minimal processing capacity and time and can, therefore, be performed quickly and in real time. Thus, if the selected object is moving, for example, the user's own body, the corresponding image data within the virtual environment can be updated in real time.
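The blending steps just described (threshold extraction of the item, coordinate transformation, and backward colour warping via homography) can be sketched in simplified grayscale, nearest-neighbour form. This is an illustration of the general technique, not code from the patent:

```python
import numpy as np

def extract_foreground(image, threshold):
    """Fixed-threshold extraction of an item from its background (the text
    above mentions adaptive thresholding as one alternative)."""
    return (image > threshold).astype(np.uint8)

def backward_warp(src, H, out_shape):
    """Backward warping: for every pixel of the destination (virtual-scene)
    canvas, apply the homography H to find the matching source pixel and
    sample its value (nearest neighbour, for brevity)."""
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    dst = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)]).astype(float)
    s = H @ dst
    s /= s[2]  # dehomogenise
    sx, sy = np.round(s[0]).astype(int), np.round(s[1]).astype(int)
    out = np.zeros((h, w), dtype=src.dtype)
    ok = (0 <= sx) & (sx < src.shape[1]) & (0 <= sy) & (sy < src.shape[0])
    out.ravel()[ok] = src[sy[ok], sx[ok]]
    return out
```

Sampling the destination grid and mapping backwards (rather than pushing source pixels forwards) is what guarantees every pixel of the blended region receives exactly one colour value.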
The processor 104 is also configured to display multiple data items simultaneously within a three dimensional virtual environment. Such data items may be received from several different sources, via, for example, an Internet connection, a secure LAN, Bluetooth, or any other communications protocol, which may be wireless or otherwise.
Within a work environment, the various items of data may be required to be viewed and/or manipulated by a plurality of different users. Each user is provided with a headset 100 and, as such, their own virtual environment within which data can be displayed. However, and as stated above, in the case where there are a number of different users of the system, some or all of them may have different authorisations and permissions to access and/or manipulate the information provided and, as such, will be authorised for, and require, access to different information and different security levels. In addition, there are other situations envisaged where the shared information may not apply to all users of the system. For example, a user may not wish to see a specific piece of information, as it may not be relevant to them and its inclusion within their field of view simply serves to distract their attention from the information they require. Furthermore, a user may wish to interact with private data without any other user having access thereto in order to view and/or edit it.

Thus, in accordance with an exemplary embodiment of the present invention, each virtual environment is provided with a plurality of data layers. Referring to Figure 3 of the drawings, at least a private layer 300 and a shared layer 302 may be provided within each user's virtual environment. However, additional layers may be added by the user (or, indeed, provided by the system) to serve specific functions and, therefore, used to quickly adopt new working environments, as necessary, and the present invention is not necessarily intended to be limited in this regard.
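The layered arrangement described above might be modelled, purely for illustration, as follows. The class and attribute names are assumptions of this sketch and are not taken from the specification; it shows only that each user's environment holds its own set of layers, to which further layers can be added to serve specific functions.

```python
from dataclasses import dataclass, field

@dataclass
class DataItem:
    name: str
    security: str = "shared"       # per-item security attribute

@dataclass
class DataLayer:
    attribute: str                 # e.g. "private" or "shared"
    items: list = field(default_factory=list)

@dataclass
class VirtualEnvironment:
    # Each headset renders its own environment containing its own layers.
    user: str
    layers: dict = field(default_factory=dict)

env = VirtualEnvironment("user_a", layers={
    "private": DataLayer("private"),
    "shared": DataLayer("shared", items=[DataItem("briefing")]),
})
env.layers["workspace"] = DataLayer("workspace")   # user-added layer
```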
The data items 304 to be displayed will have security attributes associated therewith, which define which users are permitted to view and/or manipulate them. In addition, each layer has a security attribute associated therewith. Thus, in this exemplary embodiment of the invention, there may be a "private" attribute associated with the private layer 300 and a "shared" attribute associated with the shared data layer 302. Considering first the private layer 300, this is only visible to one user. Thus, a user can view and manipulate data therein without the other users being able to view their activity. The shared layer 302 appears in all of the users' virtual environments, and contains data which can be viewed and manipulated by all users of the system (subject to various security attributes, as will be discussed later). Thus, if a user wishes to manipulate a piece of data privately, they can do so by moving the data from the shared layer 302 into their private layer 300, and perform any action required. Equally, a user can create their own data item within the private layer 300. If they subsequently wish to share the data, thus manipulated/created, with the other users, it can be moved back into the shared layer 302, as required. In one exemplary embodiment of the invention, the shared layer may cache the last version of a data item that has been moved into a user's private layer for manipulation, such that the last version of that data item remains accessible on the shared layer to other authorised users.
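The move-to-private operation with caching, as described above, can be sketched as follows. This is a minimal illustration under assumed names (`move_to_private`, `move_to_shared`, the `cache` dictionary); the specification does not prescribe any particular data structure.

```python
import copy

def move_to_private(private, shared, item):
    # Move a data item from the shared layer into a user's private layer,
    # caching the last shared version so it remains accessible to other
    # authorised users while the item is privately manipulated.
    shared["items"].remove(item)
    shared["cache"][item["name"]] = copy.deepcopy(item)
    private.append(item)

def move_to_shared(private, shared, item):
    # Return the (possibly edited) item to the shared layer, discarding
    # the cached version it supersedes.
    private.remove(item)
    shared["cache"].pop(item["name"], None)
    shared["items"].append(item)

doc = {"name": "report", "body": "draft"}
shared = {"items": [doc], "cache": {}}
private = []
move_to_private(private, shared, doc)
doc["body"] = "edited"    # private edit; the cache still holds "draft"
move_to_shared(private, shared, doc)
```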
Although the shared layer 302 can be seen by all users, it is not essential that all data thereon is viewable by all users. Indeed, some users may not be authorised to view such data, and others may not be authorised to manipulate such data. Thus, data items assigned to the shared layer 302 may have respective security attributes associated therewith; these may be assigned by the creator of the data or they may be assigned centrally, in accordance with, for example, the source from which the data is received. A first user, for example, may be authorised to view all data on the shared layer 302. In this case, all such data will be displayed on the shared layer within their virtual environment. In an exemplary embodiment of the invention, the user is able to selectively remove data they do not require in order to de-clutter their virtual environment, and reinstate such data as required. Another user may be authorised to view all data on the shared layer 302, but may only be authorised to interact with selected data items, and the security attributes associated with the displayed data items will reflect the user's permissions, thus only allowing them to interact with data items for which they are authorised to do so (in which case, such interaction may take place within the shared layer or in their private layer, as required). Thus, it is envisaged in accordance with one exemplary embodiment of the invention, that a user would only be permitted to move a data item into their private layer 300 if they have permission to interact with that data item. Yet another user may not be authorised to view certain data items included on the shared layer, and these would be identified and eliminated from their view of the shared layer 302. A security module may be provided in order to identify the user of each headset, and their levels of authorisation, and configure their view of the shared layer 302 accordingly.
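The per-user filtering that such a security module might perform can be sketched as follows. The permission levels ("edit", "view", "none") and the `can_interact` tag are assumptions of this illustration, not terms from the specification.

```python
def shared_layer_view(shared_items, permissions):
    # Build one user's view of the shared layer: items the user is not
    # authorised to view are eliminated entirely; the remainder are
    # tagged with whether the user may interact with (and hence move) them.
    view = []
    for item in shared_items:
        level = permissions.get(item["name"], "none")
        if level == "none":
            continue    # identified and eliminated from this user's view
        view.append({**item, "can_interact": level == "edit"})
    return view

items = [{"name": "map"}, {"name": "orders"}, {"name": "logs"}]
perms = {"map": "edit", "orders": "view"}    # no authorisation for "logs"
view = shared_layer_view(items, perms)
```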
As illustrated in Figure 3 of the drawings, and in accordance with one exemplary embodiment of the invention, the two layers 300, 302 are "stacked" rearwardly from the foreground of the user's field of view, although the layers could be arranged in any convenient manner and the present invention is not intended to be in any way limited in this regard. In the example shown, the shared layer 302 appears in the foreground of the user's virtual environment and the private layer 300 appears "behind" it. Data items 304 which the user is authorised to view appear on the shared layer 302, which in the example shown appears as a two-dimensional screen, but the invention is again not necessarily intended to be limited in this regard. Indeed, it is envisaged that the layers 300, 302 may be entirely three dimensional, such that when a layer is in the foreground, the data items thereon are displayed in three dimensions, with the hidden layer "behind" it having no effect on the visual representation of the front layer. Data items 304 from the "front" layer can be moved into the other layer by, for example, a dragging and dropping action, which may, for example, be facilitated by hand gestures made by the user. Equally, the user may "pull" the rear layer into the foreground by means of a predefined hand gesture, thus enabling them to view and/or manipulate data therein, and leaving the other layer sitting behind it until it is once again required for use in the foreground. It will be apparent to a person skilled in the art, from the foregoing description, that modifications and variations can be made to the described embodiments without departing from the scope of the invention as claimed.
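The "pull to foreground" interaction described above reduces, in the abstract, to reordering the layer stack; the following sketch assumes a list ordered front-to-back and an illustrative function name.

```python
def pull_to_foreground(stack, attribute):
    # Bring the named layer to the front of the user's stacked view
    # (index 0 = foreground); the other layers sit behind it, in order,
    # until they are once again required.
    layer = next(l for l in stack if l["attribute"] == attribute)
    stack.remove(layer)
    stack.insert(0, layer)

stack = [{"attribute": "shared"}, {"attribute": "private"}]
pull_to_foreground(stack, "private")    # e.g. triggered by a hand gesture
```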

Claims

1. A mixed reality system comprising a headset for placing over a user's eyes, in use, said headset including a screen, the system further comprising a processor configured to receive data from one or more sources and display said data on said screen within a three-dimensional virtual environment, an image capture means arranged to capture images of a real world environment in the vicinity of the user and the processor being further configured to blend at least a portion of said real world environment into said three dimensional virtual environment to create a mixed reality environment, wherein said virtual environment includes at least two virtual, interactive data layers on which said data can be selectively displayed and within which said data can be manipulated by a user, each said layer having associated therewith a permission characteristic defining visibility thereof within said three dimensional environment displayed on the screen of another user of said system.
2. A system according to claim 1, wherein one of said data layers is designated as a shared layer, such that said layer and data displayed thereon is accessible for display within the virtual environment displayed on the screen of any or all users of said system.
3. A system according to claim 1 or claim 2, wherein one of said data layers is designated as a private layer, such that data displayed thereon is displayed only within the virtual environment on the screen of a selected one or more users of said system.
4. A system according to any of the preceding claims, including a control module configured to enable a user to selectively manipulate the relative positions of said data layers within the virtual environment displayed on their screen.
5. A system according to claim 4, wherein said data layers are arranged such that a first data layer is displayed in the foreground of the virtual environment displayed on a user's screen and one or more other data layers are located within the virtual environment behind the first data layer, wherein said control module is configured to selectively interchange the data layers displayed in said foreground.
6. A system according to any of the preceding claims, further comprising a selection module configured to enable a user to selectively transfer data from a first data layer displayed within the virtual environment on their screen to a second data layer displayed therein.
7. A system according to any of the preceding claims, wherein each data layer defined therein has associated therewith a security attribute defining whether or not a respective data layer is displayed within the virtual environment on a user's screen.
8. A system according to any of the preceding claims, including a security module for selectively applying security attributes to items of data within a layer.
9. A system according to claim 7 or claim 8, further comprising an identification module for identifying a user and predefined permission attributes associated therewith, and displaying only data layers and data having corresponding security attributes within the virtual environment on said user's screen.
10. A method of displaying data for viewing and/or manipulation by a plurality of users within a mixed reality system comprising at least one headset including a screen, and a processor configured to receive data from one or more sources and display said data on said screen within a three-dimensional virtual environment, an image capture means for capturing images of a real world environment in the vicinity of the user and the processor being further configured to blend at least a portion of said real world environment into said three dimensional virtual environment to create a mixed reality environment, the method comprising building, within said virtual environment, at least two virtual, interactive data layers, each said layer having associated therewith a permission characteristic defining visibility thereof within said three dimensional environment displayed on the screen of another user of said system, selectively displaying said data on said data layers, and providing a control function configured to enable a user to selectively manipulate said data layers so as to change the relative location thereof within the virtual environment displayed on their screen, and selectively move data displayed on a first data layer to a second data layer.
PCT/GB2016/050375 2015-02-25 2016-02-16 A mixed reality system and method for displaying data therein WO2016135450A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
EP15275046.9A EP3062219A1 (en) 2015-02-25 2015-02-25 A mixed reality system and method for displaying data therein
GB201503113A GB201503113D0 (en) 2015-02-25 2015-02-25 A mixed reality system and method for displaying data therein
GB1503113.1 2015-02-25
EP15275046.9 2015-02-25

Publications (1)

Publication Number Publication Date
WO2016135450A1 true WO2016135450A1 (en) 2016-09-01

Family

ID=55361897

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2016/050375 WO2016135450A1 (en) 2015-02-25 2016-02-16 A mixed reality system and method for displaying data therein

Country Status (1)

Country Link
WO (1) WO2016135450A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10482665B2 (en) 2016-12-16 2019-11-19 Microsoft Technology Licensing, Llc Synching and desyncing a shared view in a multiuser scenario
EP4216093A1 (en) * 2017-02-07 2023-07-26 InterDigital VC Holdings, Inc. System and method to prevent surveillance and preserve privacy in virtual reality

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013028813A1 (en) * 2011-08-23 2013-02-28 Microsoft Corporation Implicit sharing and privacy control through physical behaviors using sensor-rich devices
US20140285519A1 (en) * 2013-03-22 2014-09-25 Nokia Corporation Method and apparatus for providing local synchronization of information for augmented reality objects
US20140368537A1 (en) * 2013-06-18 2014-12-18 Tom G. Salter Shared and private holographic objects

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SZALAVARI Z ET AL: "STUDIERSTUBE: AN ENVIRONMENT FOR COLLABORATION IN AUGMENTED REALITY", VIRTUAL REALITY, VIRTUAL PRESS, WALTHAM CROSS, GB, vol. 3, no. 1, 1 January 1998 (1998-01-01), pages 37 - 48, XP008011892, ISSN: 1359-4338, DOI: 10.1007/BF01409796 *

Similar Documents

Publication Publication Date Title
US11790871B2 (en) Detection and display of mixed 2D/3D content
EP3062219A1 (en) A mixed reality system and method for displaying data therein
EP3117290B1 (en) Interactive information display
US10262465B2 (en) Interactive control station
KR102340665B1 (en) privacy screen
US11308686B1 (en) Captured image data in a computer-generated reality environment
EP3262505B1 (en) Interactive system control apparatus and method
CN111566596A (en) Real world portal for virtual reality display
US11900520B1 (en) Specifying effects for entering or exiting a computer-generated reality environment
WO2016079470A1 (en) Mixed reality information and entertainment system and method
US11861056B2 (en) Controlling representations of virtual objects in a computer-generated reality environment
US10984607B1 (en) Displaying 3D content shared from other devices
Cheng et al. Towards understanding diminished reality
US20210339143A1 (en) Method and device for attenuation of co-user interactions in simulated reality (sr) space
US20180218631A1 (en) Interactive vehicle control system
CN113906765A (en) Obfuscating location-specific data associated with a physical environment
WO2016135450A1 (en) A mixed reality system and method for displaying data therein
GB2536790A (en) A mixed reality system and method for displaying data therein
GB2525304B (en) Interactive information display
EP2919094A1 (en) Interactive information display
GB2535730A (en) Interactive system control apparatus and method
US20200356163A1 (en) Techniques related to configuring a display device
CN117768630A (en) Visual technology of 3D content
EP3062221A1 (en) Interactive system control apparatus and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16704935

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16704935

Country of ref document: EP

Kind code of ref document: A1