GB2536790A - A mixed reality system and method for displaying data therein - Google Patents
A mixed reality system and method for displaying data therein
- Publication number
- GB2536790A GB2536790A GB1602707.0A GB201602707A GB2536790A GB 2536790 A GB2536790 A GB 2536790A GB 201602707 A GB201602707 A GB 201602707A GB 2536790 A GB2536790 A GB 2536790A
- Authority
- GB
- United Kingdom
- Prior art keywords
- data
- user
- displayed
- layer
- screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/024—Multi-user, collaborative environment
Abstract
A mixed or augmented reality (AR) system with a head mounted display (HMD) 100, a processor and image capture means (e.g. camera or sensor) to create a mixed reality environment for display. The environment includes at least two virtual, interactive data layers 300, 302, on which data 304 from one or more sources can be selectively displayed and manipulated by a user. Each layer 300, 302 is also associated with a permission characteristic (or privacy setting) defining its visibility to another user within the AR environment. Also claimed is a method in which a plurality of users are further allowed to manipulate the data layers 300, 302 by changing their relative location within the virtual environment, or to move data 304 from a first layer to a second layer. The system may comprise a security module for applying security attributes to data items 304 and a user identification module to enable access to security restricted layers.
Description
Intellectual Property Office Application No. GB1602707.0 RTM Date: 21 July 2016 The following terms are registered trade marks and should be read as such wherever they occur in this document: Bluetooth. Intellectual Property Office is an operating name of the Patent Office. www.gov.uk/ipo
A MIXED REALITY SYSTEM AND METHOD FOR DISPLAYING DATA THEREIN
This invention relates generally to a mixed reality system and a method for displaying data therein and, more particularly, to such a system and a method for displaying information therein to a plurality of users, and permitting selective viewing and manipulation thereof.
Virtual reality systems are known, comprising a headset which, when placed over a user's eyes, creates and displays a three dimensional virtual environment in which a user feels immersed and with which the user can interact in a manner dependent on the application. For example, the virtual environment created may comprise a game zone, within which a user can play a game.
More recently, virtual reality systems have been developed which enable "screens" of information, derived from multiple data sources, to be displayed within a three-dimensional virtual room, such that when a user places the headset over their eyes, they feel immersed within a virtual room having multiple data sources displayed simultaneously in three dimensions.
More recently, augmented and mixed reality systems have been developed, in which an image of a real world object can be captured, rendered and placed within a 3D virtual reality environment, such that it can be viewed and manipulated within that environment in the same way as virtual objects therein. Once again, it is envisaged that data from one or more external data sources can be visually represented and placed within the virtual environment such that multiple data sources are displayed simultaneously in three dimensions. This concept has a large number of potential uses within a working environment, as it allows many forms of information to be displayed, in novel and convenient ways, within an environment that can be tailored to the user.
An extension of such a system provides a plurality of headsets, each displaying a user environment, within which a plurality of respective users can share information in the same way as physical objects and information can be shared within the real world environment.
However, problems may arise in the case where there are a number of different users of the system, some or all of whom have different authorisations and permissions to access and/or manipulate the information provided. When there are multiple people working in a close, collaborative environment, different people will be authorised for, and require, access to different information and different security levels. In addition, there are other situations envisaged where the shared information may not apply to all users of the system. For example, a user may not wish to see a specific piece of information, as it may not be relevant to them and its inclusion within their field of view simply acts to distract their attention from the information they require. Furthermore, a user may wish to interact with private data without any other user having access thereto in order to view and/or edit it. It would, therefore, be desirable to provide a multi-user mixed reality system in which security and/or privacy can be accounted for, so as to enable a bespoke environment to be created for each user of the system, and aspects of the present invention seek to address at least some of these issues.
In accordance with a first aspect of the present invention, there is provided a mixed reality system comprising a headset for placing over a user's eyes, in use, said headset including a screen, the system further comprising a processor configured to receive data from one or more sources and display said data on said screen within a three-dimensional virtual environment, an image capture means arranged to capture images of a real world environment in the vicinity of the user and the processor being further configured to blend at least a portion of said real world environment into said three dimensional virtual environment to create a mixed reality environment, wherein said virtual environment includes at least two virtual, interactive data layers on which said data can be selectively displayed and within which said data can be manipulated by a user, each said layer having associated therewith a permission characteristic defining visibility thereof within said three dimensional environment displayed on the screen of another user of said system.
One of the data layers may be designated as a shared layer, such that said layer and data displayed thereon is accessible for display within the virtual environment displayed on the screen of any or all users of the system.
One of the data layers may be designated as a private layer, such that data displayed thereon is displayed only within the virtual environment on the screen of a selected one or more users of the system.
According to an exemplary embodiment of the invention, the system may include a control module configured to enable a user to selectively manipulate the relative positions of the data layers within the virtual environment displayed on their screen. In this case, the data layers may be arranged such that a first data layer is displayed in the foreground of the virtual environment displayed on a user's screen and one or more other data layers are located within the virtual environment behind the first data layer, wherein said control module may be configured to selectively interchange the data layers displayed in the foreground.
The system may further comprise a selection module configured to enable a user to selectively transfer data from a first data layer displayed within the virtual environment on their screen to a second data layer displayed therein.
In one exemplary embodiment, each data layer defined in the system may have associated therewith a security attribute defining whether or not a respective data layer is displayed within the virtual environment on a user's screen. The system may further include a security module for selectively applying security attributes to items of data within a layer. Thus, the system may further comprise an identification module for identifying a user and predefined permission attributes associated therewith, and displaying and/or allowing manipulation of only data layers and data having corresponding security attributes within the virtual environment on said user's screen.
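By way of illustration only (this code forms no part of the patent disclosure), the layer, security-attribute and identification-module model described above might be sketched as follows; the class names, field names and the set-based permission representation are invented for the example:

```python
from dataclasses import dataclass, field
from enum import Enum

class Visibility(Enum):
    PRIVATE = "private"   # visible only to the owning user
    SHARED = "shared"     # visible, in principle, to all users

@dataclass
class DataItem:
    name: str
    viewers: set = field(default_factory=set)   # users authorised to view the item
    editors: set = field(default_factory=set)   # users authorised to manipulate it

@dataclass
class Layer:
    visibility: Visibility          # the layer's permission characteristic
    owner: str = ""                 # populated for private layers
    items: list = field(default_factory=list)

def visible_items(layer, user):
    """Identification-module role: return only the items of `layer`
    that `user` holds a corresponding security attribute for."""
    if layer.visibility is Visibility.PRIVATE and layer.owner != user:
        return []                   # another user's private layer is hidden entirely
    return [item for item in layer.items if user in item.viewers]
```

A shared layer built this way renders differently for each user, which is the bespoke-environment behaviour the description aims at.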
Another aspect of the present invention extends to a method of displaying data for viewing and/or manipulation by a plurality of users within a mixed reality system comprising at least one headset including a screen, and a processor configured to receive data from one or more sources and display said data on said screen within a three-dimensional virtual environment, an image capture means for capturing images of a real world environment in the vicinity of the user and the processor being further configured to blend at least a portion of said real world environment into said three dimensional virtual environment to create a mixed reality environment, the method comprising building, within said virtual environment, at least two virtual, interactive data layers, each said layer having associated therewith a permission characteristic defining visibility thereof within said three dimensional environment displayed on the screen of another user of said system, selectively displaying said data on said data layers, and providing a control function configured to enable a user to selectively manipulate said data layers so as to change the relative location thereof within the virtual environment displayed on their screen, and selectively move data displayed on a first data layer to a second data layer.
These and other aspects of the present invention will be apparent from the following specific description in which embodiments of the present invention are described, by way of example only, and with reference to the accompanying drawings, in which:
Figure 1 is a front perspective view of a headset for use in a system according to an exemplary embodiment of the present invention;
Figure 2 is a schematic block diagram of a system according to an exemplary embodiment of the present invention; and
Figure 3 is a schematic diagram illustrating the concept of displaying data in the form of layers within a virtual environment as envisaged in accordance with an exemplary embodiment of the present invention.
Referring to Figure 1 of the drawings, a system according to the present invention may comprise a headset comprising a visor 10 having a pair of arms 12 hingedly attached at opposing sides thereof in order to allow the visor to be secured onto a user's head, over their eyes, in use, by placing the curved ends of the arms 12 over and behind the user's ears, in a manner similar to conventional spectacles. It will be appreciated that, whilst the headset is illustrated herein in the form of a visor, it may alternatively comprise a helmet for placing over a user's head, or even a pair of contact lenses or the like, for placing within the user's eyes, and the present invention is not intended to be in any way limited in this regard. Also provided on the headset is a pair of image capture devices 14 for capturing images of the environment, such image capture devices being mounted roughly aligned with a user's eyes in use.
The system of the present invention further comprises a processor, which is communicably connected in some way to a screen which is provided inside the visor 10. Such communicable connection may be a hard wired electrical connection, in which case the processor and associated circuitry will also be mounted on the headset. However, in an alternative exemplary embodiment, the processor may be configured to wirelessly communicate with the visor, for example, by means of Bluetooth or a similar wireless communication protocol, in which case the processor need not be mounted on the headset but can instead be located remotely from the headset, with the relative allowable distance between them being dictated and limited only by the wireless communication protocol being employed. For example, the processor could be mounted on or formed integrally with the user's clothing, or instead located remotely from the user, either as a stand-alone unit or as an integral part of a larger control unit, for example.
Referring to Figure 2 of the drawings, a system according to an exemplary embodiment of the invention comprises, generally, a headset 100, incorporating a screen 102, a processor 104, and a pair of external digital image capture devices (only one shown) 106. As stated previously, the user's headset 100 includes two image capture devices 14, which may be used to capture respective images of the real world environment in the vicinity of the user, and data representative thereof can be blended to produce a stereoscopic depth map which enables the processor 104 to determine depth within the captured images without any additional infrastructure being required. All or selected portions of the 3D images can be blended into the virtual environment being displayed on the screen 102.
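As an aside, the depth recovery that such a stereoscopic depth map rests on is the standard triangulation relation depth = f·B/d; this sketch is background to the description, not a method the patent prescribes, and the parameter names are illustrative:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth of a scene point seen by both head-mounted cameras.

    depth = f * B / d, where f is the focal length in pixels, B the
    baseline between the two image capture devices in metres, and d
    the horizontal pixel disparity between the two captured images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px
```

Larger disparity means a nearer point, which is why two roughly eye-aligned cameras suffice to estimate depth without additional infrastructure.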
The general concept of real time image blending for augmented and mixed reality is known, and several techniques have been proposed. The present invention is not intended to be in any way limited in this regard. However, for completeness, one exemplary method for image blending will be briefly described. Thus, once the image data for an item to be blended into the virtual environment has been generated, a threshold function may be applied in order to extract that image data from any background images. Its relative location and orientation may also be extracted and preserved by means of marker data. Next, the image and marker data are converted to a binary image, possibly by means of adaptive thresholding (although other methods are known). The marker data and binary image are then transformed into a set of coordinates which match the location within the virtual environment in which they will be blended. Such blending is usually performed using black and white image data. Thus, if necessary, colour data sampled from the source image may be backward warped, using homography, to each pixel in the resultant virtual scene. All of these computational steps require minimal processing capacity and time and can, therefore, be performed quickly and in real time.
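The thresholding and coordinate-mapping steps described above might look roughly as follows. This is a deliberately simplified sketch: the patent leaves the implementation open, and a fixed threshold plus a translate-and-scale stand in here for the adaptive thresholding and full homography a real system would use:

```python
def to_binary(gray, threshold=128):
    """Extract the item from its background and binarise it.
    `gray` is a row-major list of greyscale pixel values (0-255);
    a real system would use adaptive thresholding instead of a
    single fixed threshold."""
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

def to_virtual_coords(marker_xy, origin_xy, scale=1.0):
    """Transform marker coordinates into the virtual environment's
    frame. A translate-and-scale stands in for the homography that
    would map real-world marker positions into the virtual scene."""
    x, y = marker_xy
    ox, oy = origin_xy
    return (ox + scale * x, oy + scale * y)
```

Because both steps are cheap per-pixel operations, they are consistent with the description's point that the pipeline can run in real time.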
Thus, if the selected object is moving, for example, the user's own body, the corresponding image data within the virtual environment can be updated in real time.
The processor 104 is also configured to display multiple data items simultaneously within a three dimensional virtual environment. Such data items may be received from several different sources, via, for example, an Internet connection, a secure LAN, Bluetooth, or any other communications protocol, which may be wireless or otherwise.
Within a work environment, the various items of data may be required to be viewed and/or manipulated by a plurality of different users. Each user is provided with a headset 100 and, as such, their own virtual environment within which data can be displayed. However, and as stated above, in the case where there are a number of different users of the system, some or all of them may have different authorisations and permissions to access and/or manipulate the information provided and, as such, will be authorised for, and require, access to different information and different security levels. In addition, there are other situations envisaged where the shared information may not apply to all users of the system. For example, a user may not wish to see a specific piece of information, as it may not be relevant to them and its inclusion within their field of view simply acts to distract their attention from the information they require.
Furthermore, a user may wish to interact with private data without any other user having access thereto in order to view and/or edit it.
Thus, in accordance with an exemplary embodiment of the present invention, each virtual environment is provided with a plurality of data layers.
Referring to Figure 3 of the drawings, at least a private layer 300 and a shared layer 302 may be provided within each user's virtual environment. However, additional layers may be added by the user (or, indeed, provided by the system) to serve specific functions and, therefore, used to quickly adopt new working environments, as necessary, and the present invention is not necessarily intended to be limited in this regard.
The data items 304 to be displayed will have security attributes associated therewith, which define which users are permitted to view and/or manipulate them. In addition, each layer has a security attribute associated therewith. Thus, in this exemplary embodiment of the invention, there may be a "private" attribute associated with the private layer 300 and a "shared" attribute associated with the shared data layer 302. Considering first the private layer 300, this is only visible to one user. Thus, a user can view and manipulate data therein without the other users being able to view their activity. The shared layer 302 appears in all of the users' virtual environments, and contains data which can be viewed and manipulated by all users of the system (subject to various security attributes, as will be discussed later). Thus, if a user wishes to manipulate a piece of data privately, they can do so by moving the data from the shared layer 302 into their private layer 300, and perform any action required. Equally, a user can create their own data item within the private layer 300. If they subsequently wish to share the data, thus manipulated/created, with the other users, it can be moved back into the shared layer 302, as required. In one exemplary embodiment of the invention, the shared layer may cache the last version of a data item that has been moved into a user's private layer for manipulation, such that the last version of that data item remains accessible on the shared layer to other authorised users.
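The move-to-private and move-back-to-shared behaviour, including the shared-layer cache of the last version, could be sketched as follows; the function names and the dict-based item representation are invented for the example, and this is one possible reading of the caching embodiment rather than the patent's specified mechanism:

```python
import copy

def move_to_private(name, shared, private, shared_cache):
    """Take an item off the shared layer for private manipulation,
    caching its last shared version so that other authorised users
    can still see that version on the shared layer."""
    item = next(i for i in shared if i["name"] == name)
    shared.remove(item)
    shared_cache[name] = copy.deepcopy(item)   # snapshot, isolated from private edits
    private.append(item)

def move_to_shared(name, private, shared, shared_cache):
    """Publish a (possibly edited) private item back to the shared
    layer, superseding any cached version."""
    item = next(i for i in private if i["name"] == name)
    private.remove(item)
    shared_cache.pop(name, None)
    shared.append(item)
```

The deep copy matters: edits made in the private layer must not leak into the version other users continue to see.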
Although the shared layer 302 can be seen by all users, it is not essential that all data thereon is viewable by all users. Indeed, some users may not be authorised to view such data, and others may not be authorised to manipulate such data. Thus, data items assigned to the shared layer 302 may have respective security attributes associated therewith; these may be assigned by the creator of the data or they may be assigned centrally, in accordance with, for example, the source from which the data is received. A first user, for example, may be authorised to view all data on the shared layer 302. In this case, all such data will be displayed on the shared layer within their virtual environment. In an exemplary embodiment of the invention, the user is able to selectively remove data they do not require in order to de-clutter their virtual environment, and reinstate such data as required. Another user may be authorised to view all data on the shared layer 302, but may only be authorised to interact with selected data items, and the security attributes associated with the displayed data items will reflect the user's permissions, thus only allowing them to interact with data items for which they are authorised to do so (in which case, such interaction may take place within the shared layer or in their private layer, as required). Thus, it is envisaged in accordance with one exemplary embodiment of the invention, that a user would only be permitted to move a data item into their private layer 300 if they have permission to interact with that data item. Yet another user may not be authorised to view certain data items included on the shared layer, and these would be identified and eliminated from their view of the shared layer 302. A security module may be provided in order to identify the user of each headset, and their levels of authorisation, and configure their view of the shared layer 302 accordingly.
As illustrated in Figure 3 of the drawings, and in accordance with one exemplary embodiment of the invention, the two layers 300, 302 are "stacked" rearwardly from the foreground of the user's field of view, although the layers could be arranged in any convenient manner and the present invention is not intended to be in any way limited in this regard. In the example shown, the shared layer 302 appears in the foreground of the user's virtual environment and the private layer 300 appears "behind" it. Data items 304 which the user is authorised to view appear on the shared layer 302, which in the example shown appears as a two-dimensional screen, but the invention is again not necessarily intended to be limited in this regard. Indeed, it is envisaged that the layers 300, 302 may be entirely three dimensional, such that when a layer is in the foreground, the data items thereon are displayed in three dimensions, with the hidden layer "behind" it having no effect on the visual representation of the front layer. Data items 304 from the "front" layer can be moved into the other layer by, for example, a dragging and dropping action, which may, for example, be facilitated by hand gestures made by the user. Equally, the user may "pull" the rear layer into the foreground by means of a predefined hand gesture, thus enabling them to view and/or manipulate data therein, and leaving the other layer sitting behind it until it is once again required for use in the foreground.
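The "pull to foreground" manipulation amounts to reordering a stack of layers, which could be modelled as follows; this sketch is illustrative only, not an implementation specified in the patent:

```python
def pull_to_foreground(stack, layer_name):
    """Bring the named layer to the front of the stack (index 0 is
    the foreground), e.g. in response to a predefined hand gesture.
    The previously foreground layer simply sits behind it."""
    idx = next(i for i, layer in enumerate(stack) if layer == layer_name)
    stack.insert(0, stack.pop(idx))
    return stack
```

Because the operation only permutes the stack, the hidden layers are preserved intact behind the foreground layer, ready to be pulled forward again.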
It will be apparent to a person skilled in the art, from the foregoing description, that modifications and variations can be made to the described embodiments without departing from the scope of the invention as claimed.
Claims (10)
- CLAIMS
- 1. A mixed reality system comprising a headset for placing over a user's eyes, in use, said headset including a screen, the system further comprising a processor configured to receive data from one or more sources and display said data on said screen within a three-dimensional virtual environment, an image capture means arranged to capture images of a real world environment in the vicinity of the user and the processor being further configured to blend at least a portion of said real world environment into said three dimensional virtual environment to create a mixed reality environment, wherein said virtual environment includes at least two virtual, interactive data layers on which said data can be selectively displayed and within which said data can be manipulated by a user, each said layer having associated therewith a permission characteristic defining visibility thereof within said three dimensional environment displayed on the screen of another user of said system.
- 2. A system according to claim 1, wherein one of said data layers is designated as a shared layer, such that said layer and data displayed thereon is accessible for display within the virtual environment displayed on the screen of any or all users of said system.
- 3. A system according to claim 1 or claim 2, wherein one of said data layers is designated as a private layer, such that data displayed thereon is displayed only within the virtual environment on the screen of a selected one or more users of said system.
- 4. A system according to any of the preceding claims, including a control module configured to enable a user to selectively manipulate the relative positions of said data layers within the virtual environment displayed on their screen.
- 5. A system according to claim 4, wherein said data layers are arranged such that a first data layer is displayed in the foreground of the virtual environment displayed on a user's screen and one or more other data layers are located within the virtual environment behind the first data layer, wherein said control module is configured to selectively interchange the data layers displayed in said foreground.
- 6. A system according to any of the preceding claims, further comprising a selection module configured to enable a user to selectively transfer data from a first data layer displayed within the virtual environment on their screen to a second data layer displayed therein.
- 7. A system according to any of the preceding claims, wherein each data layer defined therein has associated therewith a security attribute defining whether or not a respective data layer is displayed within the virtual environment on a user's screen.
- 8. A system according to any of the preceding claims, including a security module for selectively applying security attributes to items of data within a layer.
- 9. A system according to claim 7 or claim 8, further comprising an identification module for identifying a user and predefined permission attributes associated therewith, and displaying only data layers and data having corresponding security attributes within the virtual environment on said user's screen.
- 10. A method of displaying data for viewing and/or manipulation by a plurality of users within a mixed reality system comprising at least one headset including a screen, and a processor configured to receive data from one or more sources and display said data on said screen within a three-dimensional virtual environment, an image capture means for capturing images of a real world environment in the vicinity of the user and the processor being further configured to blend at least a portion of said real world environment into said three dimensional virtual environment to create a mixed reality environment, the method comprising building, within said virtual environment, at least two virtual, interactive data layers, each said layer having associated therewith a permission characteristic defining visibility thereof within said three dimensional environment displayed on the screen of another user of said system, selectively displaying said data on said data layers, and providing a control function configured to enable a user to selectively manipulate said data layers so as to change the relative location thereof within the virtual environment displayed on their screen, and selectively move data displayed on a first data layer to a second data layer.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB201503113A GB201503113D0 (en) | 2015-02-25 | 2015-02-25 | A mixed reality system adn method for displaying data therein |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201602707D0 GB201602707D0 (en) | 2016-03-30 |
GB2536790A true GB2536790A (en) | 2016-09-28 |
Family
ID=52822131
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB201503113A Ceased GB201503113D0 (en) | 2015-02-25 | 2015-02-25 | A mixed reality system adn method for displaying data therein |
GB1602707.0A Withdrawn GB2536790A (en) | 2015-02-25 | 2016-02-16 | A mixed reality system and method for displaying data therein |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB201503113A Ceased GB201503113D0 (en) | 2015-02-25 | 2015-02-25 | A mixed reality system adn method for displaying data therein |
Country Status (1)
Country | Link |
---|---|
GB (2) | GB201503113D0 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3392827A1 (en) * | 2017-04-17 | 2018-10-24 | INTEL Corporation | Collaborative multi-user virtual reality |
US11647161B1 | 2022-05-11 | 2023-05-09 | International Business Machines Corporation | Resolving visibility discrepancies of virtual objects in extended reality devices
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120210254A1 (en) * | 2011-02-10 | 2012-08-16 | Masaki Fukuchi | Information processing apparatus, information sharing method, program, and terminal device |
WO2013028813A1 (en) * | 2011-08-23 | 2013-02-28 | Microsoft Corporation | Implicit sharing and privacy control through physical behaviors using sensor-rich devices |
US20130293468A1 (en) * | 2012-05-04 | 2013-11-07 | Kathryn Stone Perez | Collaboration environment using see through displays |
US20140368537A1 (en) * | 2013-06-18 | 2014-12-18 | Tom G. Salter | Shared and private holographic objects |
-
2015
- 2015-02-25 GB GB201503113A patent/GB201503113D0/en not_active Ceased
-
2016
- 2016-02-16 GB GB1602707.0A patent/GB2536790A/en not_active Withdrawn
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120210254A1 (en) * | 2011-02-10 | 2012-08-16 | Masaki Fukuchi | Information processing apparatus, information sharing method, program, and terminal device |
WO2013028813A1 (en) * | 2011-08-23 | 2013-02-28 | Microsoft Corporation | Implicit sharing and privacy control through physical behaviors using sensor-rich devices |
US20130293468A1 (en) * | 2012-05-04 | 2013-11-07 | Kathryn Stone Perez | Collaboration environment using see through displays |
US20140368537A1 (en) * | 2013-06-18 | 2014-12-18 | Tom G. Salter | Shared and private holographic objects |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3392827A1 (en) * | 2017-04-17 | 2018-10-24 | INTEL Corporation | Collaborative multi-user virtual reality |
US10430147B2 (en) | 2017-04-17 | 2019-10-01 | Intel Corporation | Collaborative multi-user virtual reality |
US10908865B2 (en) | 2017-04-17 | 2021-02-02 | Intel Corporation | Collaborative multi-user virtual reality |
US11520555B2 (en) | 2017-04-17 | 2022-12-06 | Intel Corporation | Collaborative multi-user virtual reality |
US11647161B1 | 2022-05-11 | 2023-05-09 | International Business Machines Corporation | Resolving visibility discrepancies of virtual objects in extended reality devices
Also Published As
Publication number | Publication date |
---|---|
GB201602707D0 (en) | 2016-03-30 |
GB201503113D0 (en) | 2015-04-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11355086B2 (en) | Detection and display of mixed 2D/3D content | |
CN110954083B (en) | Positioning of mobile devices | |
EP3062219A1 (en) | A mixed reality system and method for displaying data therein | |
EP3117290B1 (en) | Interactive information display | |
KR102684612B1 (en) | Control virtual objects | |
US10262465B2 (en) | Interactive control station | |
US11308686B1 (en) | Captured image data in a computer-generated reality environment | |
US10096166B2 (en) | Apparatus and method for selectively displaying an operational environment | |
KR102340665B1 (en) | privacy screen | |
US10891800B1 (en) | Providing features of an electronic product in an augmented reality environment | |
US10296359B2 (en) | Interactive system control apparatus and method | |
US11712628B2 (en) | Method and device for attenuation of co-user interactions | |
CN111566596A (en) | Real world portal for virtual reality display | |
US20180218631A1 (en) | Interactive vehicle control system | |
US11900520B1 (en) | Specifying effects for entering or exiting a computer-generated reality environment | |
WO2016079470A1 (en) | Mixed reality information and entertainment system and method | |
US20190192967A1 (en) | Terminal device, system, program, and method | |
CN113906765B (en) | Method and apparatus for blurring location-specific data associated with a physical environment | |
GB2536790A (en) | A mixed reality system and method for displaying data therein | |
WO2016135450A1 (en) | A mixed reality system and method for displaying data therein | |
GB2525304B (en) | Interactive information display | |
EP2919094A1 (en) | Interactive information display | |
GB2535730A (en) | Interactive system control apparatus and method | |
CN117768630A (en) | Visual technology of 3D content | |
WO2024205852A1 (en) | Sound randomization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |