EP3938870A1 - Fixed holograms in mobile environments - Google Patents
- Publication number
- EP3938870A1 (application EP20708828.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- environment
- data
- positioning information
- hologram
- display positioning
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 230000007613 environmental effect Effects 0.000 claims abstract description 34
- 238000001914 filtration Methods 0.000 claims abstract description 24
- 230000033001 locomotion Effects 0.000 claims description 61
- 230000000007 visual effect Effects 0.000 claims description 11
- 238000005259 measurement Methods 0.000 claims description 8
- 238000000034 method Methods 0.000 abstract description 49
- 230000008569 process Effects 0.000 description 15
- 238000009877 rendering Methods 0.000 description 11
- 230000001133 acceleration Effects 0.000 description 10
- 238000012545 processing Methods 0.000 description 7
- 230000005540 biological transmission Effects 0.000 description 6
- 230000008901 benefit Effects 0.000 description 5
- 230000006870 function Effects 0.000 description 5
- 230000011218 segmentation Effects 0.000 description 5
- 230000006641 stabilisation Effects 0.000 description 5
- 238000011105 stabilization Methods 0.000 description 5
- 230000003068 static effect Effects 0.000 description 5
- 238000004891 communication Methods 0.000 description 4
- 238000012937 correction Methods 0.000 description 4
- 239000013589 supplement Substances 0.000 description 4
- 230000005670 electromagnetic radiation Effects 0.000 description 3
- 238000013507 mapping Methods 0.000 description 3
- 230000004044 response Effects 0.000 description 3
- 238000001514 detection method Methods 0.000 description 2
- 238000010801 machine learning Methods 0.000 description 2
- 238000001228 spectrum Methods 0.000 description 2
- 230000000087 stabilizing effect Effects 0.000 description 2
- 238000009825 accumulation Methods 0.000 description 1
- 238000003491 array Methods 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 230000003111 delayed effect Effects 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000005286 illumination Methods 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 238000002955 isolation Methods 0.000 description 1
- 230000005055 memory storage Effects 0.000 description 1
- 230000005012 migration Effects 0.000 description 1
- 238000013508 migration Methods 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 239000000047 product Substances 0.000 description 1
- 230000000717 retained effect Effects 0.000 description 1
- 238000005096 rolling process Methods 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/0005—Adaptation of holography to specific applications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/02—Details of features involved during the holographic process; Replication of holograms without interference recording
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0181—Adaptation to the pilot/driver
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Definitions
- MR systems/devices include virtual-reality (VR) and augmented-reality (AR) systems.
- VR systems create completely immersive experiences by restricting users’ views to only virtual images rendered in VR scenes/environments.
- AR systems create AR experiences by visually presenting virtual images that are placed in or that interact with the real world.
- VR and AR systems are described and referenced interchangeably via use of the phrase "MR system."
- The terms "virtual image" or "hologram" refer to any type of digital image rendered by an MR system.
- a head-mounted device typically provides the display used by the user to view and/or interact with holograms provided within an MR scene.
- An MR system's HMD typically includes a head tracking system having one or more head tracking camera(s) and inertial measurement units (IMUs). Using these cameras, the head tracking system determines the HMD's position and pose relative to its surrounding environment. Both the HMD's position and pose are relied upon by many existing MR systems when visually placing/rendering holograms into mixed-reality (MR) scenes. By continuously or frequently determining position and pose, an MR system can apply display corrections to its rendered holograms. These display corrections enable the MR system to render realistic holograms because they allow it to respond dynamically to the HMD's movements in real time and to update the holograms in response to those movements.
- Some MR systems additionally use an inertial measurement unit (IMU) to monitor the HMD's acceleration changes in order to determine the HMD's position and pose.
- IMU data is helpful in determining position and pose.
- IMU data should not be used in isolation for head tracking, pose/position estimation, or hologram rendering/placement. The reason is that an IMU derives position by integrating acceleration twice over time, so any errors in that calculation grow in proportion to the square of the elapsed sampling time. Consequently, in some cases, the position/pose determined by the IMU alone may be significantly skewed as a result of the errors being multiplied by the time squared.
- Even so, IMU data can still be highly valuable when determining position and pose.
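The quadratic error growth described above can be sketched with a few lines of code. This is an illustrative example, not from the patent: a small constant accelerometer bias, double-integrated, moves a stationary device's position estimate by roughly 0.5 · b · t².

```python
# Illustrative sketch: recovering position from an IMU requires integrating
# acceleration twice, so a constant accelerometer bias b produces a position
# error of roughly 0.5 * b * t**2.

def integrate_position(accel_samples, dt):
    """Double-integrate acceleration samples into a position estimate."""
    velocity, position = 0.0, 0.0
    for a in accel_samples:
        velocity += a * dt
        position += velocity * dt
    return position

dt = 0.01          # 100 Hz sampling (hypothetical rate)
bias = 0.05        # m/s^2 accelerometer bias (hypothetical value)
true_accel = 0.0   # the device is actually stationary

# After 10 seconds (1000 samples), the stationary device appears to have
# drifted by about 0.5 * 0.05 * 10**2 = 2.5 meters.
drift = integrate_position([true_accel + bias] * 1000, dt)
print(round(drift, 2))  # ~2.5
```

This is why the disclosure pairs IMU data with head tracking camera data rather than relying on the IMU alone.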
- some MR systems use a combination of information from their head tracking cameras and their IMUs, with the data from each combined using a Kalman filter, in order to accurately track and validate position and pose of the HMD over time. That is, IMU data can be used to augment or supplement the head tracking camera data, thereby resulting in a more reliable determination of the HMD’s position and pose. A more reliable determination of position and pose allows for improved hologram placement.
- Disclosed embodiments include systems, methods, and devices used for controlling and, in some instances, stabilizing the visual placement of holograms within MR scenes. This stabilization may occur even when certain position data, which is used for the visual placement of the holograms, is conflicted as a result of being collected while the rendering MR device is located within one or more moving environments.
- In some embodiments, a first environment (e.g., perhaps the cockpit of a plane or the inside of an elevator) is identified, and this first environment is moving relative to a second environment (e.g., perhaps the runway for the plane or a building that includes the elevator).
- Display positioning information is obtained for the first environment, and second display positioning information is obtained for the second environment; the combination of the two constitutes "environmental data."
- the first display positioning information includes conflicting data (e.g., as between the data from the head tracking cameras and the data from the IMU) as a result of being collected while the first environment moves relative to the second environment.
- a determination is also made that a hologram is to be visually placed within an MR scene. This placement is to occur at a fixed position relative to either one of the first or second environments based on the environmental data.
- At least some of the second display positioning information is selectively filtered out from the environmental data. Consequently, when the environmental data is used during a placement operation in which the hologram is visually placed at the fixed position relative to the first environment, at least some of the second display positioning information is excluded, isolated, or refrained from being considered during the placement operation, such that it is unlikely to cause a conflict with the first display positioning information.
- the first display positioning information is selectively filtered out from the environmental data. Consequently, when the environmental data is used for placing the hologram at the fixed position relative to the second environment, at least some of the first display positioning information is excluded, isolated, or refrained from being considered during the placement operation, such that it is unlikely to cause a conflict with the second display positioning information.
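The selective-filtering step described in the two bullets above can be sketched as follows. The record structure and environment labels here are illustrative assumptions, not the patent's actual data format:

```python
# Hypothetical sketch of selective filtering: environmental data is tagged by
# the environment it was derived from, and records attributed to the other
# environment are excluded before the hologram placement operation.

def filter_environmental_data(environmental_data, target_environment):
    """Keep only display positioning records attributed to the environment
    the hologram is fixed to; the rest are filtered out so they cannot
    conflict during placement."""
    return [rec for rec in environmental_data
            if rec["environment"] == target_environment]

environmental_data = [
    {"environment": "cockpit", "sensor": "ht_camera", "pose": (0.0, 0.0, 1.2)},
    {"environment": "runway",  "sensor": "imu",       "pose": (5.0, 0.0, 1.2)},
    {"environment": "cockpit", "sensor": "imu",       "pose": (0.1, 0.0, 1.2)},
]

# Placing a hologram fixed relative to the cockpit: runway-derived data is
# filtered out so it cannot conflict with the cockpit-derived data.
cockpit_data = filter_environmental_data(environmental_data, "cockpit")
print(len(cockpit_data))  # 2
```

The mirror-image case (placing relative to the runway) simply swaps which environment's records are retained.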
- FIG. 1 illustrates an example of a head-mounted system/device (HMD) that includes one or more inertial measurement unit(s) (IMU) and one or more head tracking camera(s).
- Figure 2 illustrates a flowchart of an example method for stabilizing the visual placement of a hologram within an MR scene by an MR device/system. This stabilization can occur even when certain position data, which is used for the visual placement, has conflicts as a result of being collected while the MR device was operating in a moving environment.
- Figure 3 illustrates an example scenario showing how one environment (e.g., a cockpit of a plane) can move relative to another environment (e.g., the earth).
- Figure 4 more fully illustrates the differences between the cockpit environment and the outside earthly environment (e.g., a runway).
- Figure 5 illustrates another example scenario in which one environment (e.g., an elevator) is moving relative to another environment (e.g., a building or other fixed location/structure on the earth).
- Figure 6 illustrates how an IMU can be used to monitor changes in position or pose by detecting acceleration changes.
- the IMU detects changes to the elevator’s acceleration.
- the resulting acceleration information can be used by an MR device to determine its location relative to its environment (e.g., the elevator). The information can also be used to help stabilize the placement/rendering of holograms.
- Figure 7 illustrates how a head tracking system, which includes head tracking cameras, can be used to monitor changes in an MR device’s position.
- the head tracking system can detect the position of the MR device relative to its environment (e.g., the elevator). This position information can be used by the MR system to stabilize the placement/rendering of holograms.
- Figure 8A illustrates how conflicts or discrepancies can occur between IMU data and head tracking data.
- Figure 8B illustrates that first display positioning information and second display positioning information are collectively referred to as environmental data and that there may be a full or partial discrepancy in the first display positioning information.
- Figure 9 illustrates an example of how display positioning information for one environment (e.g., a dynamic or moving environment such as the cockpit of a plane, where the plane is moving relative to the earth) can be filtered or excluded (e.g., from the head tracking cameras) so that its corresponding display positioning information is not relied on or considered when placing/rendering holograms.
- Figure 10 illustrates an example of a hologram placement operation and illustrates how some display positioning information has been filtered or excluded from (e.g., the head tracking camera) consideration.
- Figure 11 illustrates an example of how display positioning information for one environment (e.g., a static earthly environment outside the cockpit area of a plane that is moving relative to the earthly environment) can be filtered or excluded (e.g., the head tracking camera) so that its corresponding display positioning information is not relied on or considered when placing/rendering holograms.
- Figure 12 illustrates an example of a hologram placement operation and illustrates how some display positioning information has been filtered or excluded from consideration.
- Figure 13 illustrates a flowchart with acts corresponding to example methods for placing holograms to be rendered by an MR device when the MR device is located within multiple environments concurrently (e.g., three or more), one or more of which may be moving relative to the others.
- Figure 14 illustrates a plane positioned on an aircraft carrier, which is positioned in the ocean.
- The plane constitutes one environment, the aircraft carrier constitutes another environment, and the ocean constitutes yet another environment.
- the embodiments are able to stabilize the placement/display of holograms within an MR scene.
- Figure 15 illustrates an example of a computer system configured to include or perform at least a portion of the disclosed embodiments.
- Disclosed embodiments relate to mixed-reality (MR) systems, devices, and methods that can be used to selectively control display positioning information (i.e., sensor data used for display stability) that is obtained and used to facilitate the presentation and stabilization of holograms positioned/rendered within MR scenes.
- This stabilization may occur even when certain position data, which is used for the visual placement, has internal conflicts as a result of being collected while the MR device is concurrently positioned within a plurality of different environments that move relative to each other.
- a first and second environment are identified, where the first environment is moving relative to the second environment.
- Display positioning information is obtained for both of the environments (e.g., first and second display positioning information, collectively referred to as "environmental data"), but the first display positioning information has conflicting information as a result of being collected while the first environment moved.
- a determination is made that a hologram is to be visually placed within an MR scene at a fixed position relative to either one of the first or second environments.
- the embodiments selectively filter out at least some (and potentially all) of the second display positioning information from the environmental data. Consequently, at least some of the second display positioning information is excluded, isolated, or refrained from being considered during a hologram placement operation.
- the embodiments selectively filter out at least some (and potentially all) of the first display positioning information from the environmental data. Consequently, at least some of the first display positioning information is excluded, isolated, or refrained from being considered during the hologram placement operation.
- the disclosed embodiments are able to place any number of holograms in the MR scene. Furthermore, the disclosed embodiments can simultaneously perform both of the filtering operations discussed above. As such, holograms may be placed in the first environment and other holograms may be simultaneously placed in the second environment. There is no limit on the number of holograms that may be visually placed in the different environments.
- the disclosed embodiments improve the technical field in many ways. For instance, as described earlier, traditional MR systems face many problems when attempting to fixedly place a hologram in a moving environment. These problems arise because the data acquisition devices (e.g., IMUs, HT cameras, etc.) on the HMDs produce conflicting information. Specifically, it is often the case that the IMU data conflicts with the HT camera data. Because of these conflicts, the resulting hologram tends to drift or migrate away from its placement location even though it was supposed to remain at a fixed, locked, stationary, or tethered position.
- The concept of being "fixed" will be described in more detail later, but it generally refers to holograms being positioned at a locked or unmoving location relative to an environment, as opposed to holograms being positioned relative to the user's head or HMD. In other words, the holograms remain stationary relative to the environment even when the user's head or HMD moves.
- A real-world example of a "fixed" placement is that of a billboard next to a highway. Regardless of how a driver drives his/her car or moves his/her head, the billboard will remain fixedly disposed at the same location. The same concept can be applied to placing virtual images/holograms at fixed locations, as will be described in more detail later.
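The billboard analogy can be made concrete with a minimal sketch. This is an assumed, translation-only simplification (a real MR device also handles rotation): a world-locked hologram keeps one fixed world position, and only its coordinates in the HMD's view frame change as the head moves.

```python
# Minimal, translation-only sketch (illustrative math, not the patent's
# actual rendering pipeline): a world-locked hologram has one fixed world
# position; its view-frame coordinates change as the HMD moves.

def world_to_view(world_point, hmd_position):
    """Express a world-fixed point in the HMD's (translation-only) view frame."""
    return tuple(w - h for w, h in zip(world_point, hmd_position))

billboard = (10.0, 2.0, 0.0)  # fixed world position, like the billboard

view_a = world_to_view(billboard, (0.0, 0.0, 0.0))  # driver at the origin
view_b = world_to_view(billboard, (4.0, 0.0, 0.0))  # driver has moved forward

# The view-space coordinates differ, but the world position never changed.
print(view_a, view_b)  # (10.0, 2.0, 0.0) (6.0, 2.0, 0.0)
```

Drift occurs when conflicting sensor data makes the system misjudge `hmd_position`, so the hologram's apparent world position shifts even though it should be locked.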
- the disclosed principles can be practiced to eliminate undesired drifting or migrating conditions by selectively and intelligently filtering out some, or even all, of certain selected portions of the collected data, which includes IMU data and HT camera data. For instance, some of the IMU data and/or some of the head tracking data may be filtered.
- holograms can be locked to an expansive world-like environment (e.g., perhaps a runway underneath a plane or perhaps a building enveloping an elevator) while in other cases the holograms can be locked to more local or smaller environments (e.g., perhaps the cockpit of the plane on the runway or perhaps the inside of the elevator in the building).
- the disclosed embodiments eliminate the undesired drifting conditions described above. As such, the disclosed embodiments provide a more enjoyable user experience because the users will be presented with holograms that behave in a repeatedly predictable and stable manner. Accordingly, by improving the stability of the holograms, the users’ interactions with the MR device will be improved.
- The disclosed embodiments also operate to improve the processing of the MR device itself. Specifically, by reducing the occurrence of hologram drift, the disclosed embodiments end up performing fewer corrective actions to reposition a hologram at its correct location. Fewer corrective actions mean that the MR device will use less of its resources and power, thereby resulting in improved resource and power efficiency. Accordingly, the disclosed embodiments are highly beneficial and bring about substantial improvements to the field, as more fully described below.
- Figure 1 illustrates an example HMD 100 that is included as a part of an MR device (which will be illustrated later in connection with Figure 15).
- The descriptions "MR device" and "MR system" can be used interchangeably with one another.
- HMD 100 is itself considered as an MR device. Therefore, references to HMDs, MR devices, or MR systems generally relate to one another and may be used interchangeably.
- HMD 100 is able to stabilize the visual placement of any number of holograms (e.g., 1, 2, 3, 4, 5, 10, 20, 30, 40, 50, or more than 50 holograms) rendered by the display of HMD 100. This stabilization may occur even when certain position data, which is used for the visual placement, has conflicting information as a result of being collected while HMD 100 was operating in a moving environment.
- HMD 100 is shown as including an IMU 105.
- IMU 105 is a type of device that measures force, angular adjustments/rates, orientation, acceleration, velocity, gravitational forces, and sometimes even magnetic fields. To do so, IMU 105 may include any number of data acquisition devices, which include any number of accelerometers, gyroscopes, and even magnetometers.
- IMU 105 can be used to measure a roll rate 105A, a pitch rate 105B, and a yaw rate 105C. It will be appreciated, however, that IMU 105 can measure changes in any of the six degrees of freedom 110.
- Six degrees of freedom 110 refers to the ability of a body to move in three-dimensional space. As an example, suppose HMD 100 is operating in the cockpit of an airplane rolling along a runway. Here, the cockpit may be considered as a “first” environment and the runway may be considered as a“second” environment. Furthermore, the first environment is moving relative to the second environment. Regardless of whichever environment HMD 100 is operating within, the movements of one environment relative to another environment (as recorded or monitored by at least some of HMD 100’s data acquisition devices) can be detected or measured in any one or more of the six degrees of freedom 110.
- Six degrees of freedom 110 include surge 110A (e.g., forward/backward movement), heave 110B (e.g., up/down movement), sway 110C (e.g., left/right movement), pitch 110D (e.g., rotation about a transverse axis), roll 110E (e.g., rotation about a longitudinal axis), and yaw 110F (e.g., rotation about a normal axis).
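For illustration only (this structure is not from the patent), the six degrees of freedom named above can be represented as three translations plus three rotations, with IMU-measured deltas accumulated onto a running pose:

```python
# Illustrative representation of the six degrees of freedom 110: three
# translations (surge, heave, sway) and three rotations (pitch, roll, yaw).

from dataclasses import dataclass

@dataclass
class SixDof:
    surge: float = 0.0   # forward/backward translation
    heave: float = 0.0   # up/down translation
    sway: float = 0.0    # left/right translation
    pitch: float = 0.0   # rotation about the transverse axis
    roll: float = 0.0    # rotation about the longitudinal axis
    yaw: float = 0.0     # rotation about the normal axis

    def apply(self, delta: "SixDof") -> "SixDof":
        """Accumulate a relative movement, e.g. one measured by the IMU."""
        return SixDof(*(getattr(self, f) + getattr(delta, f)
                        for f in ("surge", "heave", "sway",
                                  "pitch", "roll", "yaw")))

pose = SixDof().apply(SixDof(surge=1.0, yaw=0.1))
print(pose.surge, pose.yaw)  # 1.0 0.1
```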
- IMU 105 can be used to measure changes in force and changes in movement, including any acceleration changes of HMD 100. This collected data can be used to help determine a position, pose, and/or perspective of HMD 100 relative to its environment. To improve the position and pose determinations, the data generated by IMU 105 can augment or supplement data collected by a head tracking (HT) system.
- Figure 1 also shows a first HT camera 115, with its corresponding field of view (FOV) 120 (i.e. the observable area of HT camera 115, or rather the observable angle through which HT camera 115 is able to capture electromagnetic radiation), and a second HT camera 125, with its corresponding FOV 130. While only two HT cameras are illustrated, it will be appreciated that any number of HT cameras may be used on HMD 100 (e.g., 1 camera, 2, 3, 4, 5, or more than 5 cameras). Furthermore, these cameras may be included as a part of a HT system 135 implemented on HMD 100.
- HT cameras 115 and 125 can be any type of HT camera.
- HT cameras 115 and 125 may be stereoscopic HT cameras in which a part of FOVs 120 and 130 overlap with one another to provide stereoscopic HT operations.
- HT cameras 115 and 125 are other types of HT cameras.
- HT cameras 115 and 125 are able to capture electromagnetic radiation in the visible light spectrum and generate visible light images.
- HT cameras 115 and 125 are able to capture electromagnetic radiation in the infrared (IR) spectrum and generate IR light images.
- HT cameras 115 and 125 include a combination of visible light sensors and IR light sensors.
- HT cameras 115 and 125 include or are associated with depth detection functionalities for detecting depth in the environment.
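The overlapping FOVs mentioned above are what make stereoscopic depth detection possible. As a sketch of the standard relationship for rectified stereo cameras (the parameter values below are hypothetical, not taken from the patent): depth Z = f · B / d, where f is the focal length in pixels, B the baseline between the cameras, and d the pixel disparity.

```python
# Standard stereo-depth relationship for rectified cameras; the numbers
# below are hypothetical example values, not HMD 100's actual parameters.

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a point from its disparity between two rectified cameras."""
    if disparity_px <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity_px

# A feature seen with 25 px disparity by 500 px-focal cameras 0.1 m apart:
print(stereo_depth(focal_px=500.0, baseline_m=0.1, disparity_px=25.0))  # 2.0
```

Note the inverse relationship: nearby points have large disparity, distant points small disparity, which is why stereo depth precision degrades with range.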
- HT system 135 constitutes an "inside-out" HT system because HT cameras 115 and 125 are mounted or physically disposed on HMD 100 and are looking away from HMD 100.
- An "inside-out" HT system tracks or interpolates the position of an HMD (e.g., HMD 100) by monitoring the HMD's position and pose relative to its surrounding environment. This interpolation is accomplished by causing the HT system to identify anchor or reference points in the surrounding environment. "Anchor points" are points or objects that are identified as being highly stable or that satisfy a stability threshold. Examples include a handrail, a doorframe, an instrument panel, or even an overhead light fixture.
- the HT system then derives the HMD’s position and pose based on perspectives identified within images generated by the HT system.
- the HT system captures images of the anchor points.
- the HT system can identify a perspective between the HT system’s cameras and the anchor points. This perspective enables the HT system to then derive its position and pose relative to the anchor points.
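A greatly simplified, translation-only sketch of the idea above (a real HT system solves the full 6-DoF pose, typically via a perspective-n-point method): given anchor points at known positions and where the cameras observe them in the next frame, the relative displacement can be estimated as the mean observed offset.

```python
# Translation-only illustration (assumed simplification, not the patent's
# algorithm): for a pure 2D translation between matched point sets, the
# mean offset is the least-squares estimate of the displacement.

def estimate_translation(anchor_points, observed_points):
    """Least-squares translation between matched 2D point sets."""
    n = len(anchor_points)
    dx = sum(o[0] - a[0] for a, o in zip(anchor_points, observed_points)) / n
    dy = sum(o[1] - a[1] for a, o in zip(anchor_points, observed_points)) / n
    return dx, dy

anchors = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]   # e.g. doorframe corners
observed = [(1.0, 0.5), (3.0, 0.5), (1.0, 2.5)]  # same corners, next frame

# Apparent shift of the anchors between frames (the HMD moved oppositely).
print(estimate_translation(anchors, observed))  # (1.0, 0.5)
```

Averaging over several anchor points makes the estimate robust to noise in any single detection, which is one reason stable anchor points matter.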
- a head tracking system uses both IMU data and Head Tracking cameras to compensate for movements of the user’s head position.
- The IMU acts very fast and is useful for frame-by-frame correction. Sometimes, however, it accumulates error. To prevent a large accumulation of error, the disclosed embodiments also use the head tracking cameras to provide an additional input and to ensure that the display is locked to the environment. In some cases, a Kalman filter is used to combine these two sensor inputs to provide better correction than could be achieved with a single sensor alone.
- HT system 135 can perform head tracking by locking onto objects or points identified as being anchor or reference objects. Each time HT cameras 115 and 125 capture an image of the surrounding environment, HT system 135 analyzes the images to detect the anchor points/objects. Analyzing the images enables HT system 135 to determine the perspective, position, or pose of the HMD 100 relative to specific anchor points/objects and relative to an entire surrounding environment.
- An "outside-in" tracking system uses cameras that are mounted in the environment and that are pointed toward the HMD. In this manner, inside-out head tracking systems are distinguished from outside-in head tracking systems.
- HMD 100 is able to use display positioning information generated by IMU 105 and display positioning information generated by HT system 135 in order to determine HMD 100’s position and pose. This position and pose information will then enable HMD 100 to accurately render a hologram within an MR scene provided by HMD 100. For instance, if a hologram is to be fixedly displayed on a wall of a room, then the position and pose of HMD 100 are used during the hologram’s placement operation to ensure that the hologram is rendered/placed at the proper wall location.
- the information from the HT cameras and the information from the IMU(s) can be combined using a Kalman filter to provide robust head tracking position and pose estimation and to perform hologram placement using the position and pose information.
- A "Kalman" filter is a type of combining algorithm in which multiple sensor inputs, collected over a defined time period using the IMU(s) and HT cameras, are combined to provide more accurate display positioning information than either sensor could achieve alone. This combination may occur even in the face of statistical noise and/or other inaccuracies. This combined data is what is used during hologram placement.
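A minimal one-dimensional sketch of this fusion follows. All noise parameters and values are hypothetical and the model is deliberately simplified (a real implementation tracks a multi-dimensional state): the IMU drives the fast predict step, and each HT-camera measurement corrects the accumulated error.

```python
# One-dimensional Kalman-filter sketch (illustrative assumptions only):
# fast IMU-based prediction, periodically corrected by a camera measurement.

def kalman_step(x, p, imu_velocity, dt, q, cam_measurement=None, r=None):
    """One predict step (IMU) and an optional update step (HT camera)."""
    # Predict: dead-reckon forward using IMU-derived velocity.
    x = x + imu_velocity * dt
    p = p + q                      # prediction uncertainty grows
    if cam_measurement is not None:
        k = p / (p + r)            # Kalman gain: weight measurement vs. model
        x = x + k * (cam_measurement - x)
        p = (1.0 - k) * p          # uncertainty shrinks after the correction
    return x, p

x, p = 0.0, 1.0
# Ten fast IMU-only frames: the estimate drifts with the biased velocity.
for _ in range(10):
    x, p = kalman_step(x, p, imu_velocity=0.1, dt=0.1, q=0.01)
# One camera measurement pulls the estimate back toward the true position 0.0
# and sharply reduces the uncertainty p.
x, p = kalman_step(x, p, imu_velocity=0.1, dt=0.1, q=0.01,
                   cam_measurement=0.0, r=0.05)
print(round(x, 3), round(p, 3))
```

Because the camera's measurement noise `r` is small relative to the grown prediction uncertainty `p`, the gain is close to 1 and the correction dominates, which mirrors the role the HT cameras play in bounding IMU drift.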
- the display positioning information provided by IMU 105 is in conflict with the display positioning information provided by HT cameras 115 and 125, such as when the HT system 135 is located within multiple environments that move relative to each other. Consequently, the combined data generated using the Kalman filter will have conflicts, and the resulting hologram will experience drift and be unstable within the MR scenes rendered by the corresponding MR system.
- the disclosed embodiments can be used to help improve hologram placement and stability by improving how IMU position data is selectively filtered and used in conjunction with HT position data.
- Figure 2 illustrates a flowchart associated with an example method 200 for improving how holograms are visually placed/rendered in an MR scene that is provided by an MR device.
- This MR device may be configured in the manner described in Figure 1. That is, HMD 100, which can be considered as an MR device, is able to perform the method acts described in reference to method 200.
- the method 200 includes an act 205 of identifying a first environment (e.g., perhaps a cockpit of an airplane moving on runway or perhaps an elevator moving in a building) that is moving relative to a second environment (e.g., perhaps the runway or perhaps the building).
- first environment is considered as a “moving” environment or as a “dynamic” environment.
- second environment is considered to be a “stationary” environment, a “non-moving” environment, or a “static” environment. That said, it will be appreciated that in some embodiments, the second environment is also a moving environment (e.g., consider a cart driving on an aircraft carrier located on the ocean).
- the terms “first” and “second” should be interpreted broadly to include any kind of environment, regardless of whether it is moving or stationary.
- Figure 3 illustrates one example scenario in which two separate environments are illustrated. Specifically, Figure 3 shows environment 300 in the form of a cockpit of an airplane while environment 305 is in the form of the earth. In this regard, environment 300 is inside of a moving unit (e.g., an airplane) and environment 305 is a real-world environment through which the moving unit is moving.
- environment 300 is representative of the first environment, and environment 305 is representative of the second environment. From Figure 3, it is apparent that environment 300 is moving relative to environment 305. That is, the cockpit of the airplane is moving through the air while the earth is relatively motionless. For the purposes of this disclosure, earth will be considered as a stationary body even though it is rotating and revolving through space at a fast rate. To clarify, earth should be considered as a stable, non-moving frame of reference relative to any of the other disclosed environments.
- Figure 4 provides additional clarity regarding how one environment can move relative to a second environment.
- Figure 4 shows an airplane 400, which is representative of the airplane shown in Figure 3 whose cockpit was labeled as environment 300.
- Figure 4 shows environment 405, which more clearly illustrates the cockpit of airplane 400.
- Figure 4 also shows environment 410, which is representative of the earth, and in particular a stationary runway on which airplane 400 is moving or running along. From this figure, it will be appreciated that environment 405 is moving relative to environment 410.
- Figure 5 illustrates yet another scenario in which multiple environments are moving relative to one another.
- Figure 5 shows a first environment 500, which is representative of the first environment referenced in method 200, and a second environment 505, which is representative of the second environment referenced in method 200.
- First environment 500 is shown as being an elevator, or rather the inside of the elevator.
- Second environment 505 can be the building surrounding the elevator or even the earth.
- first environment 500 is able to move relative to second environment 505.
- the elevator can move up and down in order to transport people to different floors in the building.
- additional environments include, but are not limited to, submarines in the ocean, oil rigs on the ocean, go-karts on a track, trucks on an aircraft carrier on the ocean, and so on. Accordingly, the disclosed environments should be interpreted broadly to include any type of environment capable of moving relative to another environment.
- FIG. 5 shows a user 510 located within first environment 500.
- User 510 is also shown as wearing an HMD 515.
- HMD 515 is representative of HMD 100 from Figure 1 and may perform any of the functions discussed earlier.
- HMD 515 may include one or more IMUs and HT cameras (in some cases, a Head Tracking System is the combination of HT cameras and at least one IMU).
- the IMUs are able to detect changes in force or movement by detecting force-based changes (e.g., acceleration) while the HT system is able to detect changes in movement by detecting differences between content included in multiple digital images.
- first display positioning information includes bifurcated data, or rather two different sets of information (i.e. first data and second data), both of which are generated for the first environment.
- the IMU data can be considered the first data while the HT camera data can be considered the second data.
- the combination of the first and second display positioning information constitutes “environmental data,” as used herein.
- the first display positioning information (which includes the “first data” and the “second data”) may be a combination of data collected from multiple different sources or data acquisition devices.
- the first data may include data generated by one or more IMUs (e.g., IMU 105 from Figure 1) while the second data may include data generated by HT cameras of an HT system (e.g., HT system 135 from Figure 1) disposed on an MR device.
- the first display positioning information may include data generated from an HMD’s IMU(s) and HT system.
- the second display positioning information may also be collected using the HMD’s same IMU(s) and HT system. Although the data may have been collected from common devices, the second display positioning information may be classified, tagged, or otherwise marked as being different from the first display positioning information. Additionally, in some cases, the second display positioning information may include data that has been inferred based on a determined state of the second environment. For instance, if the second environment is the earth, then the MR device can infer that the second environment is static, stationary, or non-moving such that the second environment is determined to have no changes in its movement. Therefore, in some embodiments, the display positioning information may be determined based on an understanding or knowledge of a particular environment’s state.
- the second display positioning information may be collected from data acquisition devices that are not disposed on the HMD.
- a separate IMU or a separate camera system may be used to generate or add to the second display positioning information.
- Figures 6 and 7 more fully clarify some aspects related to these features and to method act 210.
- Figure 6 shows an elevator 600 and two environments, namely, inside elevator 605 and outside elevator 610 (e.g., the building or the earth). Inside elevator 605 may be representative of the first environment from method act 205 while outside elevator 610 may be representative of the second environment.
- FIG. 6 shows that elevator 600 is able to move up and down, as shown by movement direction 615. As elevator 600 ascends and descends, bodies located inside elevator 605 will experience changes in force, including changes in velocity and acceleration.
- An HMD’s IMU (e.g., HMD IMU 620) is able to monitor or detect these changes in force.
- Figure 6 also shows an additional IMU 625 which will be discussed later. It should be noted that both HMD IMU 620 and IMU 625 can perform any of the operations mentioned in connection with IMU 105 from Figure 1.
- HMD IMU 620 is able to capture data describing the positional changes (e.g., changes in acceleration) of elevator 600 as well as positional changes of the HMD itself.
- This data is referred to as first data 630, which is representative of the first data mentioned earlier and which is included as a part of the first display positioning information.
- Figure 7 shows elevator 700, inside elevator 705, outside elevator 710, and movement direction 715, all of which are representative of their corresponding elements in Figure 6.
- Figure 7 shows that an HT system, which includes HT camera 720A and HT camera 720B, is being used to collect display positioning information. It will be appreciated that HT cameras 720A and 720B are representative of HT cameras 115 and 125 from Figure 1.
- the HT system is able to use its cameras to capture any number of images of inside elevator 705.
- the HT system can determine the position and pose of its corresponding HMD.
- Figure 7 shows three anchor points, namely, anchor point 725 (i.e. the elevator handrail on the left-hand side), anchor point 730 (i.e. the elevator handrail in the middle), and anchor point 735 (i.e. the elevator handrail on the right-hand side). While only three anchor points are labeled, it will be appreciated that the HT system can detect and use any number of anchor points.
- the HT system can use 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 100, 200, or more than 200 anchor points when performing head tracking to determine the position and pose of an HMD.
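To give a flavor of how tracked anchor points turn into display positioning information, here is a deliberately simplified sketch (hypothetical, two-dimensional, and translation-only; a real HT system solves a full six-degree-of-freedom pose from many such points) that infers camera motion from the mean displacement of anchor points between two frames:

```python
def estimate_camera_translation(prev_points, curr_points):
    """Crude translation estimate from tracked anchor points.

    prev_points / curr_points: matching (x, y) image coordinates of the
    same anchor points in two consecutive HT-camera frames.  When the
    camera translates one way, the image points shift the opposite way,
    so the camera motion is the negated mean point displacement.
    """
    n = len(prev_points)
    dx = sum(c[0] - p[0] for p, c in zip(prev_points, curr_points)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev_points, curr_points)) / n
    return (-dx, -dy)
```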
- the HT system can generate second data 740, which is representative of the second data mentioned earlier and which is included as a part of the first display positioning information.
- this information may be generated by any of the IMUs and/or HT cameras used to generate the first display positioning information.
- the second display positioning information may include at least some similar data as the first display positioning information.
- the second display positioning information may be generated by additional IMUs, such as IMU 625 from Figure 6, and/or any other type of data acquisition system.
- the first display positioning information includes conflicting data as a result of being collected while the first environment moves relative to the second environment.
- first data 800 is representative of first data 630 from Figure 6.
- First data 800 was collected using non-camera-based systems.
- first data 800 may include IMU data 800A (e.g., accelerometer data, gyroscope data, magnetometer data, etc.) and global positioning system (GPS) data 800B.
- the ellipsis 800C demonstrates that first data 800 may include other non-camera-based data (i.e. data collected using devices that are not considered cameras but rather that detect force-based changes).
- One additional example includes barometric or pressure data.
- first data 800 may be obtained from an IMU or even from a GPS. It will be appreciated that the first data 800 may include any kind of horizontal position coordinate data, vertical or elevation position coordinate data, and even rotational data.
- second data 805 which is representative of second data 740 from Figure 7, was collected using camera-based systems (e.g., HT cameras, depth cameras, etc.).
- the combination of first data 800 and second data 805 constitutes the first display positioning information 810, which is representative of the first display positioning information from act 210.
- Second data 805 may include HT data 805A and surface depth mesh 805B.
- surface depth mesh 805B is generated by any type of depth determining device. Such devices include, but are not limited to, depth cameras, active stereoscopic depth cameras, and even passive stereoscopic depth cameras. Therefore, in some implementations, the first display positioning information 810 includes a surface depth mesh of the first environment. In some embodiments, the first display positioning information 810 is selected based, at least in part, on the surface depth mesh of the first environment. The commonality between these different types of data included in second data 805 lies in the fact that this data was collected using an imaging camera device. The ellipsis 805C demonstrates how second data 805 may have been collected using other types of camera-based systems.
- an MR device is able to combine visual tracking of key anchor points detected by HT cameras in the form of second data 805 with IMU measurements in the form of first data 800.
- These disparate data types may be combined using the Kalman filter discussed earlier.
- first display positioning information 810 may include conflicting data as a result of the first environment moving relative to the second environment and as a result of the HT cameras being unable to detect all or some of the movement.
- Figure 8B again shows first display positioning information 810.
- First display positioning information 810 may be combined with second display positioning information 820 (e.g., of the second environment) to form environmental data 825. This data may be retained in a database or other storage medium.
- Figure 8B also shows how, in some circumstances, there may be what is called a full discrepancy 830 or a partial discrepancy 835 in the first display positioning information 810 (e.g., a discrepancy as between the first data 800 collected by the IMU and the second data 805 collected by the HT system).
- full discrepancy 830 occurs in cases where the HT cameras are entirely unable to capture images of the second environment such that the second environment is not used as a frame of reference for detecting the movement of the first environment.
- the elevator scenario described above is an example of a full discrepancy between the HT cameras and the IMU.
- Another example includes movement when the HMD is located in a submarine environment, where there are no windows.
- Partial discrepancy 835 occurs in cases where the HT cameras are able to capture images of at least a part of the second environment such that the second environment can operate as at least a partial frame of reference for detecting the movement of the first environment.
- An example of a partial discrepancy may occur when the elevator has windows, and the HT cameras can capture at least some movement.
- Another example includes detecting changes when the HMD is located in an airplane’s cockpit that has windows showing the outside and that are viewable by the HMD’s HT cameras. Further details on full discrepancy 830 and partial discrepancy 835 will be provided later.
- method 200 additionally includes an act 215 of determining that a hologram is to be visually placed within an MR scene at a fixed position relative to either one of the first environment or the second environment based on the environmental data.
- a “fixed” position does not necessarily mean 100% fidelity in being “fixed,” “stationary,” “tethered,” “locked,” or “un-moving” relative to a position within an environment. That is, a hologram positioned at a fixed location can have some slight changes, fluctuations, minor drift, or minor movements/migration. These permitted movements may be due to (i) animations of the hologram, (ii) movement traits of the hologram, (iii) rendering errors of the hologram or the environment(s), and/or (iv) other processing deviations.
- a hologram placed at a fixed position is referred to as a “world locked hologram,” though other descriptions may be used as well. Accordingly, world locked holograms are holograms that are locked at a particular position relative to the environment and/or user.
- the term “fixed” position (for a hologram corresponding to an environment) is a term of relativity that simply means the position of the hologram (relative to/corresponding to that environment) is relatively more fixed and/or has relatively less deviation in positioning relative to that environment than the deviation in positioning of the hologram relative to a corresponding set of one or more alternative environments. For example, it may be determined that a hologram has or should have a fixed position relative to a first environment but not to a second environment. This so-called “fixed” position or placement may occur even though the positioning of the hologram moves or will move some distance relative to the first environment.
- a hologram at a fixed position relative to a particular environment may have some degree or amount of movement, but that amount is designed to be less than a corresponding movement of the hologram relative to an alternative environment, which may be contemporaneously moving relative to the hologram (for which the hologram is determined not to have a fixed position).
- an acceptable threshold movement or deviation is associated with a hologram placed at a fixed position. Consequently, the hologram may be permitted to move or fluctuate up to the threshold movement while still being considered to be located or placed at the fixed position.
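The threshold idea above can be expressed in a few lines. In this hypothetical sketch (the function name and the 2 cm default are illustrative assumptions only), a hologram still counts as being placed at the fixed position so long as every rendered position stays within the permitted deviation of its anchor:

```python
import math

def is_at_fixed_position(rendered_positions, anchor, threshold=0.02):
    """True if every rendered (x, y, z) position of the hologram stays
    within `threshold` (an assumed 2 cm, in metres) of its anchor point,
    so that animation, drift, and rendering error remain within the
    acceptable movement budget."""
    return all(math.dist(p, anchor) <= threshold
               for p in rendered_positions)
```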
- a hologram can be rendered to look like a billboard in an environment.
- the billboard hologram is permitted to have some movement or fluctuation in accordance with the disclosure presented above.
- method 200 also includes two alternative method acts, namely acts 220A and 220B.
- while method acts 220A and 220B may be performed alternatively to one another, it will be appreciated that these two acts may both be performed, and in some cases, they may be performed concurrently or simultaneously with one another, such as when placing a first hologram in a fixed location relative to a first environment and a second hologram in a fixed location relative to a second environment, while the first and second environments move relative to each other.
- upon determining that the hologram is to be visually placed within the MR scene at the fixed position relative to the first environment, act 220A includes a process of selectively filtering out at least some (and potentially all) of the second display positioning information from the environmental data.
- in some cases, a portion of the first display positioning information (e.g., some or all of the first data and/or some or all of the second data) may also be excluded from consideration.
- upon determining that the hologram is to be visually placed within the MR scene at the fixed position relative to the second environment, act 220B includes a process of selectively filtering out at least some (and potentially all) of the first display positioning information from the environmental data.
- portions of the second display positioning information may also be selectively filtered out as well. Consequently, when the environmental data is used during the placement operation in which the hologram is visually placed at the fixed position relative to the second environment, at least some (and potentially all) of the first display positioning information is excluded from consideration during the placement operation. In some cases, portions of the second display positioning information may also be excluded from the consideration as well.
- each of the acts 220A and 220B for selectively filtering display positioning information may additionally include a preceding act of determining that the first and second environments are moving relative to each other prior to filtering the display positioning information.
- the selective filtering may be performed intermittently as the first and second environments correspondingly move (intermittently) relative to each other.
- the selective filtering may intermittently occur while the elevator is in motion (upon making the determination the environments are moving relative to each other), but not when the elevator is temporarily held in a stationary position at each floor the elevator stops at in the building.
- selective filtering may include a process of refraining to gather certain/filtered display positioning information (inferential filtering or prefiltering) and/or the process of extracting/ignoring the display positioning information to be filtered after it is gathered/received (explicit or delayed filtering).
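As a concrete sketch of the delayed-filtering variant (purely illustrative; the dictionary keys and function name are invented), the environmental data can be modeled as two keyed sets of display positioning information, with acts 220A/220B dropping the set for whichever environment the hologram is not being fixed to, and only while the environments actually move relative to each other:

```python
def select_positioning_info(environmental_data, fix_to, environments_moving):
    """Delayed (post-collection) selective filtering sketch.

    environmental_data : {'first': ..., 'second': ...} display
                         positioning information keyed by environment
    fix_to             : 'first' (act 220A) or 'second' (act 220B)
    environments_moving: whether the two environments currently move
                         relative to each other (e.g., elevator in motion)
    """
    if not environments_moving:
        return dict(environmental_data)   # no filtering while stationary
    kept = dict(environmental_data)
    # Filter out the positioning information of the *other* environment.
    kept.pop('second' if fix_to == 'first' else 'first', None)
    return kept
```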
- the disclosed methods may also include the act 225 of causing the MR device to visually place the hologram at the fixed position within the MR scene in accordance with the corresponding display positioning information that remains, corresponding to whichever fixed location is selected for the hologram placement (e.g., either in the first environment or the second environment).
- Figures 5, 6, and 7 represented a scenario involving a full discrepancy (e.g., full discrepancy 830 from Figure 8B) between the HMD IMU’s data and the HT camera’s data.
- This full discrepancy occurred as a result of the HT cameras not being able to capture images of the second environment. Consequently, the second environment could not be used as a frame of reference for the HT cameras in detecting the movements of the first environment.
- some of the disclosed embodiments utilize an additional IMU to capture additional IMU data.
- in addition to the HMD IMU 620 of Figure 6 (which is included as a part of an HMD), some embodiments additionally use IMU 625 in order to resolve the full discrepancy.
- the data generated by IMU 625 can be included or classified as a part of the second display positioning information for the second environment - even though it is detecting movements of the elevator 600 (i.e. the first environment). It will be appreciated that just because data is classified as belonging to either one of the first or second environments does not mean that the data is restricted to describing only one of the first or second environments.
- the second environment is the building or earth, which is considered a stationary body absent of movement. If the data from IMU 625 were not included in the second display positioning information, then that information may be null or empty. As such, the data from IMU 625 is classified as the second display positioning information to facilitate the HMD in resolving a full discrepancy scenario.
- IMU 625 is an independent IMU unit. In other implementations, IMU 625 is included as a part of at least one of (i) a smart phone, (ii) a different type of handheld device, or (iii) even an integral part or portion of the second environment. For instance, the IMU can be fixed to a wall, a control console, a fixture, or any other integrated portion of the second environment. Regardless of how it is configured, IMU 625 is located at a location removed from the HMD/MR device.
- IMU 625 may be located within elevator 600 but not on the HMD. As such, IMU 625 is able to independently monitor the force changes of elevator 600 without detecting force changes to the HMD (e.g., occurring from movements of the user’s head). As described earlier, in some cases, the previously described second display positioning information for the second environment can include both inferred data based on a determined state of the second environment (e.g., non-movement of the earth or building) as well as the data from IMU 625.
- IMU 625 is fixedly positioned to the moving environment (e.g., an elevator, a moving car, an airplane, etc.) while the HMD IMU 620 is fixedly positioned to an HMD located in the moving environment.
- Data from IMU 625 may be provided to the HMD via any communication channel, such as, but not limited to, Bluetooth, near field communications, wifi, or any other wireless protocol.
- the data generated by HMD IMU 620 is subtracted from the data generated by IMU 625 (or vice versa) to provide positional changes describing only movements of the user’s head with respect to the environmental frame of reference (e.g., in this case, the elevator).
- the embodiments are able to determine only the force-based changes of the HMD without factoring in the force-based changes of the elevator relative to the building.
- HMD IMU 620 detected changes in movement for both elevator 600 as well as the HMD. It is desirable to remove the elevator 600’s movement component and use only the HMD’s movement component.
- the elevator 600’s movement component can be considered a part of the second display positioning information. As such, it is desirable to filter out that information in accordance with act 220A and rely only on the first display positioning information (which includes the HT camera data and only the subtracted version of the data now describing only the force-based changes to the HMD).
- the filtering process described in act 220A may include the subtraction process described above.
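The subtraction step itself can be sketched in a couple of lines (axis layout and sample values below are invented for illustration): the environment-mounted IMU reading is subtracted per axis from the HMD IMU reading, leaving only the head’s own motion relative to the elevator’s frame of reference.

```python
def head_only_acceleration(hmd_imu_accel, env_imu_accel):
    """Remove the moving environment's component (e.g., an IMU 625 fixed
    to the elevator) from the HMD IMU's reading (e.g., HMD IMU 620),
    leaving only the acceleration of the user's head relative to the
    environmental frame of reference.  Each argument is an (x, y, z)
    acceleration tuple."""
    return tuple(h - e for h, e in zip(hmd_imu_accel, env_imu_accel))
```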
- the force-based changes to the HMD can then be used to augment or supplement the HT camera data to determine position and pose and to place a hologram. That is, because the user is completely enveloped within the first environment (and the second environment is not viewable), the HT cameras will be referenced only to the first environment and not to the second environment.
- the HT camera data is not influenced from the movements of the first environment relative to the second environment, unlike the IMU data which was influenced by the movements. Consequently, in this case, the disclosed embodiments can entirely filter out some or all of the second display positioning information obtained for the second environment (e.g., the force-based movements of elevator 600).
- the subtracted version of the data from HMD IMU 620 may not be needed because the HT cameras (in some cases) may be able to provide a complete or full depiction/rendition of the first environment.
- the HT camera frame rate and/or data rate may be adjusted to be fast enough so as to sufficiently capture and monitor the surrounding environment.
- the subtracted version of the data from the HMD IMU 620 can be (but is not required to be) entirely filtered and excluded from consideration when the holograms are being placed (such that only the HT camera data is considered). In other words, IMU data does not augment or supplement the HT camera data.
- the HT camera data and the IMU data may be provided with different weighting scales. For instance, depending on the situation, the HT camera data may be provided a relatively higher weight (such that its impact is higher or such that it is considered relatively more important) while the IMU data may be provided with a relatively lower weight, or vice versa. Determining what weight to provide the HT camera data and the IMU data can be based on a number of different factors. Some factors include, but are not limited to, the type of environments currently at play, the average stability of the HMD, the type of application currently executing on the HMD, the sensor sensitivity of the HT cameras (e.g., frame rate) and/or the IMU, and others as well. Accordingly, in some embodiments, weights may be assigned to the HT camera data and the IMU data.
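One simple way to realize such weighting (the weights shown are arbitrary placeholders; in practice they would be tuned from the factors listed above) is a normalized linear blend of the two pose estimates:

```python
def blend_pose(ht_pose, imu_pose, w_ht=0.8, w_imu=0.2):
    """Weighted per-component blend of an HT-camera pose estimate and an
    IMU pose estimate.  A higher w_ht makes the camera data dominate;
    a higher w_imu favors the inertial data.  Weights are normalized so
    they need not sum to one."""
    total = w_ht + w_imu
    return tuple((w_ht * h + w_imu * i) / total
                 for h, i in zip(ht_pose, imu_pose))
```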
- in situations involving a full discrepancy, the embodiments can perform method act 220A and can filter out at least some (and potentially all) of the second display positioning information, and potentially can also filter out some of the first display positioning information (e.g., the subtracted version of the data from the HMD’s IMU).
- Figures 9, 10, 11, and 12 represent a scenario involving a partial discrepancy (e.g., partial discrepancy 835 from Figure 8B).
- the content in these figures corresponds to the content previously shown in Figure 4. That is, Figures 9, 10, 11, and 12 illustrate a cockpit of an airplane moving on a runway.
- the first scenario arises when a hologram is to be placed at a fixed position relative to the second environment (e.g., a non-moving, stationary, or static environment, though the second environment may be moving) but a portion of the second environment is occluded.
- the first scenario causes act 220B of Figure 2 to be performed.
- the second scenario arises when a hologram is to be placed at a fixed position relative to the first environment (e.g., a moving environment), but again a portion of the second environment is occluded.
- the second scenario causes act 220A of Figure 2 to be performed.
- the cockpit illustration from Figure 4 will be used to describe both of these scenarios.
- the first scenario involves placing a hologram in environment 410 (i.e. the earth environment such that the hologram is a world-locked hologram) while the second scenario involves placing the hologram in environment 405 (i.e. the cockpit such that the hologram is a locally-locked hologram).
- the disclosed embodiments use full data from the HMD’s IMU(s) as well as some, but not all, of the data from the HT cameras.
- the full data from the HMD’s IMU(s) includes both movements of the HMD as well as movements of the first environment (e.g., the airplane).
- the HT cameras see a combination of points. Some points are referenced to the first environment (e.g., the cockpit) while other points are referenced to the second environment (e.g., the earth).
- the disclosed embodiments filter some of the HT camera data, where the filtered data corresponds to the first environment.
- this filtering process is achieved through the additional use of depth cameras disposed on the HMD. These depth cameras are configured to generate a 3D surface mesh of the HMD’s environment.
- a 3D surface mesh is a geometric representation or model made up of any number of discrete interconnected faces (e.g., triangles) and/or other interconnected vertices. The combination of these vertices describes the environment’s geometric contours, including the contours of any objects within that environment.
- the 3D surface mesh would include a part corresponding to environment 410 and another part corresponding to environment 405.
- the 3D surface mesh part corresponding to environment 410 would have depths that are relatively larger than depths included in the 3D surface mesh corresponding to environment 405. This is due to the fact that environment 410 is relatively more remote or further away from the HMD than environment 405. In other words, the cockpit is closer to the HMD than the runway.
- the filtering process may include filtering out at least some data included in the surface depth mesh of the first or second environments.
- the filtering process may also include segmenting or excluding the undesired portions (e.g., in this case the cockpit portions of environment 405). Segmenting ensures that any key anchor or reference points selected by the HT cameras are located only within the second environment (e.g., environment 410) as opposed to being within the first environment (e.g., environment 405). This segmentation process also ensures alignment between the IMU and the HT cameras because both units will be focused only on data associated with the second environment. In some cases, the segmentation process involves the use of machine learning to learn an algorithm that provides binary segmentation of the cockpit environment from the outside surrounding environment. As the cockpit is static for most situations, this region of the HT camera data can be ignored or filtered.
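The depth contrast between the cockpit and the runway suggests a very simple segmentation heuristic, sketched below with an invented threshold (a learned binary segmentation, as mentioned above, would replace this in practice): anchor points whose mesh depth falls under a few metres are treated as cockpit (first-environment) points and filtered out, leaving only far points referenced to the runway.

```python
def split_anchor_points_by_depth(anchor_points, cockpit_max_depth=5.0):
    """Partition HT anchor points using depths from the 3D surface mesh.

    anchor_points    : dicts with a 'depth' entry (metres from the HMD)
    cockpit_max_depth: assumed 5 m cut-off; nearer points are taken to
                       belong to the cockpit, farther ones to the runway
    """
    cockpit = [p for p in anchor_points if p['depth'] < cockpit_max_depth]
    runway = [p for p in anchor_points if p['depth'] >= cockpit_max_depth]
    return cockpit, runway
```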
- the data corresponding to the cockpit has been eliminated or filtered so that the cockpit data will not influence holograms placed in the runway environment.
- the HMD’s IMU generated data recording the movement of the HMD and the cockpit relative to the runway, and the HT cameras generated data for both the cockpit and the runway.
- the embodiments use the full data from the HMD’s IMU (which measures both the HMD’s movements as well as the cockpit’s movements) and combine that data with some, but not all, of the data captured by the HT cameras. That is, some of the HT camera data is filtered out.
- Figure 9 shows an example of this filtering and segmentation process. Specifically, Figure 9 shows an MR scene 900 that includes segmented environment 905 (i.e. the data that was filtered out) and environment 910.
- segmented environment 905 corresponds to the “first” environment that is moving relative to the “second” environment (i.e. environment 910).
- the disclosed embodiments utilize a filter 915 to identify data associated with the cockpit environment (e.g., the identification can be performed using the depth data or surface mesh described earlier). Then, the embodiments use filter 915 to remove that data from consideration, thereby resulting in segmented environment 905.
- Figure 9 shows segmented environment 905 with a cross-hatch pattern to visually illustrate how the data corresponding to segmented environment 905 is actually precluded from consideration.
- method act 220B is satisfied because at least some of the first display positioning information (e.g., the cockpit data) is filtered.
- Figure 10 is a follow-up figure. Similar to Figure 9, Figure 10 illustrates an MR scene 1000, a segmented environment 1005, and environment 1010. Here, the embodiments have performed a placement operation 1015 to visually place a hologram 1020 at a fixed position 1025 relative to the environment 1010. In this case, hologram 1020 is a virtual image reminding the pilot: “Don't Forget To Buckle Up!”
- Hologram 1030 has also been placed in environment 1010. Hologram 1030 is visually emphasizing the presence of another real-world object in environment 1010. Specifically, hologram 1030 is calling attention (e.g., via use of virtual arrows and a virtual bounding box) to the presence of another plane located near the pilot’s plane.
- holograms 1020 and 1030 may be any type of hologram. As examples only, holograms 1020 and 1030 may be used to visually emphasize real-world objects, they may be used to visually present purely virtual information to a user, and they may even be used to control operations of the HMD. The holograms 1020 and 1030 may have any color, format, or other type of visual presentation.
- holograms 1020 and 1030 can be characterized as “world-locked” holograms that are placed at fixed positions.
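By way of illustration, a world-locked hologram can be modeled by storing its position once in the second environment's frame (e.g., the runway) and re-expressing it in the HMD's frame on each render. The following sketch assumes a 4x4 homogeneous environment-to-HMD transform is available from head tracking; it is not the embodiments' actual implementation:

```python
import numpy as np

def world_locked_position(hologram_pos_env, env_to_hmd):
    """Re-express a hologram's fixed position (stored in the second
    environment's frame) in the HMD's frame using the current 4x4
    environment-to-HMD transform. Because the stored position never
    changes, the hologram stays world-locked no matter how the HMD or
    the cockpit moves."""
    p = np.append(hologram_pos_env, 1.0)  # homogeneous coordinates
    return (env_to_hmd @ p)[:3]

# Fixed point 10 m down the runway; the runway origin currently sits
# 2 m to the left in the HMD's frame (a made-up transform).
pose = np.eye(4)
pose[0, 3] = -2.0
print(world_locked_position(np.array([10.0, 0.0, 0.0]), pose))  # [8. 0. 0.]
```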
- Figures 11 and 12 focus on the second scenario discussed earlier in which a hologram is to be placed at a fixed position relative to the first environment (e.g., a moving environment), but portions of the second environment are occluded.
- the second scenario causes method act 220A to be performed.
- Figure 11 again shows an MR scene 1100, an environment 1105 (i.e. the local cockpit environment that is moving relative to the runway), and a segmented environment 1110 (which corresponds to the runway).
- the embodiments use a filter 1115 to filter out the runway data.
- Segmented environment 1110 symbolically illustrates that the data for the runway environment has been excluded from consideration. Segmented environment 1110 is shown using a cross-hatch pattern for the symbolism.
- Figure 12 is a follow-up figure showing the hologram placement operation. Specifically, Figure 12 shows MR scene 1200, environment 1205, segmented environment 1210, and a placement operation 1215. Placement operation 1215 caused a hologram 1220 to be placed at a fixed position 1225 relative to environment 1205 (i.e. inside the cockpit). As shown, hologram 1220 is a reminder hologram reminding the pilot: “Don't Forget To Buckle Up!” Therefore, in this situation, the embodiments are able to generate a locally-locked hologram that is placed in the local, moving environment.
- the HMD’s IMU data is subtracted from an IMU placed in the local environment (e.g., the cockpit), or vice versa. Performing this subtraction allows the embodiments to filter out data pertaining to the outside world and to focus only on the data relating to the movements of the HMD.
- the HT system segments and excludes from view the data relating to the outside world. This is shown as segmented environment 1210.
- the embodiments can ensure that the hologram will be placed at a locally-fixed location and will not be influenced by the data relating to the outside world, even when the cockpit is moving.
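The IMU-subtraction step can be illustrated with a minimal sketch. It assumes both IMUs report acceleration in a common frame (in practice an alignment step would be required) and is not the embodiments' actual code:

```python
import numpy as np

def relative_motion(hmd_imu_accel, cockpit_imu_accel):
    """Subtract the cockpit reference IMU's reading from the HMD's
    reading so that only the user's head motion relative to the cockpit
    remains; the outside world's influence on both readings cancels."""
    return np.asarray(hmd_imu_accel) - np.asarray(cockpit_imu_accel)

# Plane accelerating down the runway at 3 m/s^2; the pilot's head is
# still relative to the cockpit, so the relative motion should be zero.
hmd = [3.0, 0.0, 9.81]
cockpit = [3.0, 0.0, 9.81]
print(relative_motion(hmd, cockpit))  # [0. 0. 0.]
```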
- machine learning or any other type of segmentation process can be used for placing holograms at locally-fixed locations.
- this second scenario causes act 220A of Figure 2 to be performed because at least some of the second display positioning information (e.g., the HT camera data corresponding to the runway and/or the IMU data corresponding only to the force-based changes of the cockpit) is filtered from consideration.
- Figures 13 and 14 relate to such a scenario. Specifically, Figure 13 illustrates a flowchart of an example method 1300 for factoring in the movement data from a third environment.
- Method 1300 initially includes an act 1305 of identifying a third environment.
- the first environment and the second environment are moving relative to the third environment.
- Figure 14 illustrates such a scenario. Specifically, Figure 14 shows a plane moving on an aircraft carrier which itself is moving on the ocean. Environment 1400 (i.e. the plane) is representative of the previously described first environment, environment 1405 (i.e. the aircraft carrier) is representative of the previously described second environment, and environment 1410 (i.e. the ocean) is now the third environment.
- method 1300 includes an act 1310 of obtaining third display positioning information for the third environment and including the third display positioning information in the environmental data.
- the environmental data now includes first, second, and third display positioning information.
- the third display positioning information may be collected in any number of ways, similar to those mentioned earlier.
- the third display positioning information may be collected using IMUs, GPSs, cameras, depth cameras, and so on.
- Method 1300 then includes an act 1315 where, instead of visually placing the hologram at the fixed position relative to either one of the first environment or the second environment, the hologram is visually placed at the fixed position relative to the third environment.
- This placement involves selectively filtering out at least some of the first display positioning information and at least some of the second display positioning information from the environmental data. Consequently, when the environmental data is used during the placement operation in which the hologram is visually placed at the fixed position relative to the third environment, at least some of the first display positioning information and at least some of the second display positioning information are excluded from consideration during the placement operation.
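As a purely illustrative sketch of this selective filtering, the environmental data can be modeled as a mapping from each environment to its display positioning information, with every stream other than the target environment's excluded from consideration; the dictionary layout and names are assumptions, not the embodiments' actual data structures:

```python
def build_placement_data(environmental_data, target_env):
    """Pick which display-positioning streams feed the placement
    operation: data for every environment other than the target is
    filtered from consideration.

    environmental_data: dict mapping environment name -> positioning data.
    target_env: the environment the hologram is fixed relative to.
    """
    return {env: data for env, data in environmental_data.items()
            if env == target_env}

# First (plane), second (carrier), and third (ocean) environments.
data = {"plane": ["imu_a"], "carrier": ["imu_b"], "ocean": ["gps", "horizon"]}
print(build_placement_data(data, "ocean"))  # {'ocean': ['gps', 'horizon']}
```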
- the disclosed embodiments are able to improve how holograms are placed within an MR scene. These improvements are achieved, in some cases, by collecting movement data about multiple different environments, at least one of which is moving relative to another one. Then, the embodiments selectively filter out some (or potentially all) of the movement data, depending on which environment the hologram is to be placed within. In doing so, the embodiments are able to ensure that the hologram is positioned at a fixed location and is positioned in a highly stable manner.
- Figure 15 illustrates an example computer system 1500 that may include and/or be used to perform the operations described herein.
- this computer system 1500 may be in the form of the MR systems/devices that were described earlier.
- Computer system 1500 may take various different forms.
- computer system 1500 may be embodied as a tablet 1500A, a desktop 1500B, or an HMD 1500C (with a corresponding MR rendering engine and display device), such as those described throughout this disclosure.
- the ellipsis 1500D demonstrates that computer system 1500 may be embodied in any form.
- Computer system 1500 may also be a distributed system that includes one or more connected computing components/devices that are in communication with computer system 1500, a laptop computer, a mobile phone, a server, a data center, and/or any other computer system.
- the ellipsis 1500D also indicates that other system subcomponents may be included or attached with the computer system 1500, including, for example, sensors that are configured to detect sensor data such as user attributes (e.g., heart rate sensors), as well as sensors like cameras and other sensors that are configured to detect sensor data such as environmental conditions and location/positioning (e.g., clocks, pressure sensors, temperature sensors, gyroscopes, accelerometers and so forth), all of which sensor data may comprise different types of information used during application of the disclosed embodiments.
- Some of the embodiments are implemented as handheld devices or handheld depth cameras. Some embodiments are also operable in robotics, drones, ambient settings, and any type of mobile phone.
- computer system 1500 includes various different components.
- Figure 15 shows that computer system 1500 includes at least one processor 1505 (aka a“hardware processing unit”), input/output (“I/O”) 1510, HT system 1515 (which may include any number of HT cameras), IMU(s) 1520, a depth system 1525 (or surface reconstruction engine), and storage 1530.
- HT system 1515 may include any number of HT cameras. These cameras may be configured in the manner described earlier, and the HT system 1515 may perform any of the disclosed head tracking operations.
- Depth system 1525 is able to cause depth camera(s) to scan and generate a surface mesh, or spatial mapping, of an environment. Furthermore, depth system 1525 can control illuminator(s) used to provide additional illumination or texture for the MR scene. These illuminators may include visible light illuminators and/or IR light illuminators.
- Depth system 1525 may also include any number of time of flight cameras, active or passive stereoscopic cameras, and/or any other type of depth cameras. Using these cameras, depth system 1525 is able to capture images of an environment and generate a 3D representation of that environment in the form of a 3D surface mesh. Accordingly, depth system 1525 includes any hardware and/or software components necessary to generate a surface mesh/spatial mapping (which may include depth images/maps, 3D dot/point clouds, and/or 3D meshes). This surface mesh/spatial mapping may be used when segmenting and characterizing objects in the real-world environment, as described earlier.
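By way of illustration, one common first step toward such a surface mesh is back-projecting a depth image into a 3D point cloud with a pinhole camera model. The following sketch assumes known camera intrinsics and is not the embodiments' actual implementation:

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image into a 3D point cloud using a pinhole
    camera model (fx, fy: focal lengths; cx, cy: principal point).
    depth: (H, W) array of depths in meters, with 0 meaning no return."""
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (us - cx) * z / fx
    y = (vs - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading

depth = np.zeros((2, 2))
depth[0, 0] = 2.0  # a single valid pixel, 2 m away
cloud = depth_to_points(depth, fx=100.0, fy=100.0, cx=1.0, cy=1.0)
print(cloud.shape)  # (1, 3)
```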
- Storage 1530 is shown as including executable code/instructions 1530A.
- the executable code/instructions 1530A represent instructions that are executable by computer system 1500 to perform the disclosed operations, such as those described in the methods of Figures 2 and 13.
- Storage 1530 may be physical system memory, which may be volatile, non-volatile, or some combination of the two.
- the term “memory” may also be used herein to refer to non-volatile mass storage such as physical storage media. If computer system 1500 is distributed, the processing, memory, and/or storage capability may be distributed as well.
- the term “executable module,” “executable component,” or even “component” can refer to software objects, routines, or methods that may be executed on computer system 1500.
- the different components, modules, engines, and services described herein may be implemented as objects or processors that execute on computer system 1500 (e.g. as separate threads).
- the disclosed embodiments may comprise or utilize a special-purpose or general-purpose computer including computer hardware, such as, for example, one or more processors (such as processor 1505) and system memory (such as storage 1530), as discussed in greater detail below.
- Embodiments also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures.
- Such computer-readable media can be any available media that can be accessed by a general-purpose or special-purpose computer system.
- Computer-readable media that store computer-executable instructions in the form of data are physical computer storage media.
- Computer-readable media that carry computer-executable instructions are transmission media.
- the current embodiments can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
- Computer storage media are hardware storage devices, such as RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSD”) that are based on RAM, Flash memory, phase-change memory (“PCM”), or other types of memory, or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code means in the form of computer-executable instructions, data, or data structures and that can be accessed by a general-purpose or special-purpose computer.
- Computer system 1500 may also be connected (via a wired or wireless connection) to external sensors (e.g., one or more remote cameras, accelerometers, gyroscopes, acoustic sensors, magnetometers, etc.).
- computer system 1500 can communicate with a handheld device 1540 that includes an IMU 1540A.
- IMU 1540A includes an accelerometer 1540B and a gyroscope 1540C.
- computer system 1500 can communicate with a smart phone 1545 that includes an IMU 1545A.
- IMU 1545A includes an accelerometer 1545B and a gyroscope 1545C.
- Handheld device 1540 and/or smart phone 1545 may be example implementations of the additional IMU discussed in connection with Figure 6 (e.g., IMU 625).
- computer system 1500 may also be connected through one or more wired or wireless networks 1535 to remote/separate computer system(s) that are configured to perform any of the processing described with regard to computer system 1500.
- a user of computer system 1500 is able to perceive information (e.g., an MR scene/environment (including VR or AR)) through a display screen that is included with the I/O 1510 of computer system 1500 and that is visible to the user.
- the I/O 1510 and sensors with the I/O 1510 also include gesture detection devices, eye trackers, and/or other movement detecting components (e.g., cameras, gyroscopes, accelerometers, magnetometers, acoustic sensors, global positioning systems (“GPS”), etc.) that are able to detect positioning and movement of one or more real-world objects, such as a user's hand, a stylus, and/or any other object(s) that the user may interact with while being immersed in the mixed-reality environment.
- a graphics rendering engine may also be configured, with processor 1505, to render one or more virtual objects within an MR scene.
- the virtual objects accurately move in response to a movement of the user and/or in response to user input as the user interacts within the virtual scene.
- A“network,” like the network 1535 shown in Figure 15, is defined as one or more data links and/or data switches that enable the transport of electronic data between computer systems, modules, and/or other electronic devices.
- Computer system 1500 will include one or more communication channels that are used to communicate with the network 1535.
- Transmission media include a network that can be used to carry data or desired program code means in the form of computer-executable instructions or in the form of data structures. Further, these computer-executable instructions can be accessed by a general-purpose or special-purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa).
- program code means in the form of computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a network interface card or“NIC”) and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system.
- Computer-executable (or computer-interpretable) instructions comprise, for example, instructions that cause a general-purpose computer, special-purpose computer, or special-purpose processing device to perform a certain function or group of functions.
- the computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
- embodiments may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like.
- the embodiments may also be practiced in distributed system environments where local and remote computer systems that are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network each perform tasks (e.g. cloud computing, cloud services and the like).
- program modules may be located in both local and remote memory storage devices.
- the functionality described herein can be performed, at least in part, by one or more hardware logic components (e.g., the processor 1505).
- illustrative types of hardware logic components include Field-Programmable Gate Arrays (“FPGA”), Program-Specific or Application-Specific Integrated Circuits (“ASIC”), Program-Specific Standard Products (“ASSP”), System-On-A-Chip Systems (“SOC”), Complex Programmable Logic Devices (“CPLD”), Central Processing Units (“CPU”), and other types of programmable hardware.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/355,244 US10559135B1 (en) | 2019-03-15 | 2019-03-15 | Fixed holograms in mobile environments |
PCT/US2020/015519 WO2020190380A1 (en) | 2019-03-15 | 2020-01-29 | Fixed holograms in mobile environments |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3938870A1 true EP3938870A1 (en) | 2022-01-19 |
Family
ID=69410692
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20708828.7A Pending EP3938870A1 (en) | 2019-03-15 | 2020-01-29 | Fixed holograms in mobile environments |
Country Status (3)
Country | Link |
---|---|
US (1) | US10559135B1 (en) |
EP (1) | EP3938870A1 (en) |
WO (1) | WO2020190380A1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3639083A4 (en) * | 2017-06-14 | 2021-01-27 | Hewlett-Packard Development Company, L.P. | Display adjustments |
US11094128B2 (en) * | 2019-10-08 | 2021-08-17 | Panasonic Avionics Corporation | Utilizing virtual reality and hi-definition camera technology to allow passengers to experience flight path |
US11507400B2 (en) * | 2020-02-28 | 2022-11-22 | Wipro Limited | Method and system for providing real-time remote assistance to a user |
US11688080B2 (en) * | 2021-04-30 | 2023-06-27 | Microsoft Technology Licensing, Llc | Tracking in a moving platform |
CN113419632A (en) * | 2021-07-06 | 2021-09-21 | 广州市旗鱼软件科技有限公司 | Mixed reality simulation driving scene display method and system |
US11852825B1 (en) * | 2022-03-08 | 2023-12-26 | Meta Platforms Technologies, Llc | Selective notifications from eye measurements |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030210228A1 (en) * | 2000-02-25 | 2003-11-13 | Ebersole John Franklin | Augmented reality situational awareness system and method |
US10163264B2 (en) * | 2013-10-02 | 2018-12-25 | Atheer, Inc. | Method and apparatus for multiple mode interface |
EP2933707B1 (en) * | 2014-04-14 | 2017-12-06 | iOnRoad Technologies Ltd. | Head mounted display presentation adjustment |
US9626802B2 (en) * | 2014-05-01 | 2017-04-18 | Microsoft Technology Licensing, Llc | Determining coordinate frames in a dynamic environment |
FR3020882B1 (en) * | 2014-05-09 | 2017-12-08 | Thales Sa | OPTIMIZING THE TRACK OF AN AIRCRAFT |
US10529248B2 (en) * | 2014-06-19 | 2020-01-07 | Embraer S.A. | Aircraft pilot training system, method and apparatus for theory, practice and evaluation |
FR3046226B1 (en) * | 2015-12-29 | 2020-02-14 | Thales | DISPLAY OF METEOROLOGICAL DATA IN AN AIRCRAFT |
FR3063713B1 (en) * | 2017-03-09 | 2019-07-05 | Airbus Operations (S.A.S.) | DISPLAY SYSTEM AND METHOD FOR AN AIRCRAFT |
2019
- 2019-03-15: US application US16/355,244 filed (US10559135B1, status: Active)

2020
- 2020-01-29: EP application EP20708828.7A filed (EP3938870A1, status: Pending)
- 2020-01-29: PCT application PCT/US2020/015519 filed (WO2020190380A1, status: Application Filing)
Also Published As
Publication number | Publication date |
---|---|
WO2020190380A1 (en) | 2020-09-24 |
US10559135B1 (en) | 2020-02-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10559135B1 (en) | Fixed holograms in mobile environments | |
US11501527B2 (en) | Visual-inertial positional awareness for autonomous and non-autonomous tracking | |
US20210012520A1 (en) | Distance measuring method and device | |
US10354396B1 (en) | Visual-inertial positional awareness for autonomous and non-autonomous device | |
US11948369B2 (en) | Visual-inertial positional awareness for autonomous and non-autonomous mapping | |
US9459692B1 (en) | Virtual reality headset with relative motion head tracker | |
US20110301925A1 (en) | Optical State Estimation And Simulation Environment For Unmanned Aerial Vehicles | |
US20220051031A1 (en) | Moving object tracking method and apparatus | |
US10890600B2 (en) | Real-time visual-inertial motion tracking fault detection | |
US20180075614A1 (en) | Method of Depth Estimation Using a Camera and Inertial Sensor | |
Tomažič et al. | Fusion of visual odometry and inertial navigation system on a smartphone | |
CN108933902A (en) | Panoramic picture acquisition device builds drawing method and mobile robot | |
US20240031678A1 (en) | Pose tracking for rolling shutter camera | |
CN116952229A (en) | Unmanned aerial vehicle positioning method, device, system and storage medium | |
CN208638479U (en) | Panoramic picture acquisition device and mobile robot | |
CN116631307A (en) | Display method, intelligent wearable device, electronic device, device and storage medium | |
Praschl et al. | Enabling outdoor MR capabilities for head mounted displays: a case study | |
KR20240006669A (en) | Dynamic over-rendering with late-warping | |
Ready et al. | Inertially aided visual odometry for miniature air vehicles in gps-denied environments | |
KR20180060403A (en) | Control apparatus for drone based on image | |
US20230281834A1 (en) | Tracking in a moving platform | |
WO2019015261A1 (en) | Devices and methods for determining scene | |
WO2023130465A1 (en) | Aerial vehicle, image processing method and apparatus, and movable platform | |
CN115583243A (en) | Method for determining lane line information, vehicle control method, device and equipment | |
WO2023223262A1 (en) | Smooth object correction for augmented reality devices |
Legal Events
Code | Title | Description
---|---|---
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: UNKNOWN
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | ORIGINAL CODE: 0009012
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: REQUEST FOR EXAMINATION WAS MADE
17P | Request for examination filed | Effective date: 20210907
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
DAV | Request for validation of the european patent (deleted) |
DAX | Request for extension of the european patent (deleted) |
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: EXAMINATION IS IN PROGRESS
17Q | First examination report despatched | Effective date: 20230725
GRAP | Despatch of communication of intention to grant a patent | ORIGINAL CODE: EPIDOSNIGR1
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: GRANT OF PATENT IS INTENDED