WO2019112557A1 - Enhanced reality headsets - Google Patents

Enhanced reality headsets

Info

Publication number
WO2019112557A1
WO2019112557A1 (PCT/US2017/064616)
Authority
WO
WIPO (PCT)
Prior art keywords
enhanced reality
reality headset
user
viewport
view
Prior art date
Application number
PCT/US2017/064616
Other languages
French (fr)
Inventor
Ian N. Robinson
Hiroshi Horii
Rafael Ballagas
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P. filed Critical Hewlett-Packard Development Company, L.P.
Priority to PCT/US2017/064616 priority Critical patent/WO2019112557A1/en
Publication of WO2019112557A1 publication Critical patent/WO2019112557A1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0149 Head-up displays characterised by mechanical features
    • G02B2027/0154 Head-up displays characterised by mechanical features with movable elements
    • G02B2027/0156 Head-up displays characterised by mechanical features with movable elements with optionally usable elements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • The enhanced reality headset (400) may further include a number of gaze sensors (425).
  • The gaze sensors (425) may monitor the direction of a user's gaze during use of the enhanced reality headset (400).
  • When the gaze sensors (425) detect that the user's gaze is directed towards the viewport (410), a signal may be sent to a processor.
  • This processor may send a signal to, for example, a motor to cause the viewport (410) to slide out of the way of the user's gaze and allow the user to view the real-life environment outside of the enhanced reality headset (400).
  • Although Fig. 4 shows the use of the gaze sensors (425), the present specification contemplates any devices and input from any type of input device being used to open or move the viewport (410) out of the way of the viewer's gaze to allow the user to view outside of the enhanced reality headset (400).
  • Fig. 4 also shows a number of coupling points (430) to which a harness may be coupled.
  • The harness may prevent the enhanced reality headset (400) from sliding down the user's face by holding the enhanced reality headset (400) to the user's head.
  • In the example shown in Fig. 5, the viewport (505) is made of a switchable glass as described herein.
  • The switchable glass may remain tight to a user's face, thereby supporting the enhanced reality headset (500) on the user's face.
  • The gaze sensors (425) may be used to track the user's direction of gaze.
  • When the user's gaze is directed towards the viewport (505), the gaze sensors (425) and/or processor may send a signal to apply a voltage, light, and/or heat to the switchable glass.
  • When this occurs, the switchable glass is made transparent, allowing the user to view the real-life environment exterior to the enhanced reality headset (500).
  • In an example, the viewport (410) may be created by the enhanced reality headset (500) being moved as a whole away from the user's face.
  • In this example, an interface between the enhanced reality headset (500) and a head coupling device may include a hinge. The hinge may allow the enhanced reality headset (500) to be flipped up and away from the user's face, thereby creating the viewport (410).
  • In an example, any input from any device or any action described herein may be used to automatically cause the enhanced reality headset (500) to be flipped up in this manner. In an example, the degree to which the enhanced reality headset is flipped up may be limited (i.e., an angle of roughly 20 degrees may be sufficient). In an example, an automated mechanism may be used to flip the headset up over that range; a minimal sketch of such a mechanism follows this list.
  • This may allow a user to access input devices communicatively coupled to the enhanced reality headset (500) or to a computing device associated with the enhanced reality headset (500).
  • The enhanced reality headset (500) may be flipped back down, allowing the user to interact with the ER environment displayed on the display devices again.
  • In an example, the enhanced reality headset (500) may be flipped back down after a passage of time, thereby allowing a user to prepare before not being able to see the real-life environment again.
  • Fig. 6 is a side cut-away view of a user's head relative to an enhanced reality headset according to an example of the principles described herein.
  • In this example, the enhanced reality headset (600) provides an amount of space between the housing of the enhanced reality headset (600) and the user's face when the viewport (410) is in an open state that allows the user to view the real-life environment presented below the user. Because this is a location where a mouse, keyboard, or other types of input devices are located, a user may be able to visually access the input devices when the viewport (410) is open or has been activated as described herein.
  • The amount of viewable space is dictated by the angle (605) created between the user's face and any remaining portion of the housing of the enhanced reality headset (600).
  • A portion of the user's view is taken up by the visual output devices in front of the user's eyes, and that angle (610) may be more or less than the angle (605) between the user's face and the housing of the enhanced reality headset (600).
  • The enhanced reality headset (600) may further include a head coupling device (615) and a hinge (620) to allow the enhanced reality headset (600) to be flipped up and away from the user's face when the real-life environment exterior to the enhanced reality headset (600) is to be seen. In this example, the viewport (410) may be activated when this occurs, opening up more of the user's view of the outside world that would otherwise be obscured by the housing. Similarly, the viewport could close automatically when the headset is flipped back down.
  • The enhanced reality headset (600) flipping up may be included with the viewports (410) described herein or may replace the movement of the housing or the switchable glass.
  • The specification and figures describe an enhanced reality headset that includes a viewport.
  • The viewport may be placed in two states that selectively allow and prevent a user from viewing the real-life environment exterior to the enhanced reality headset.
  • The viewport may be opened as the user has directed in order to allow a user to interact with real-life objects such as the input devices presented herein. In some examples, the context and/or characteristics of the use of the enhanced reality headset may dictate if and when the viewport is activated, thereby allowing the user to view the real-life environment exterior to the enhanced reality headset.
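As a minimal illustrative sketch (not part of the patent) of the automated flip-up mechanism referenced in the list above, assuming a generic hinge actuator driver and the roughly 20-degree limit mentioned in the text:

```python
# Hypothetical sketch: sweep the headset about its hinge up to a fixed
# flip-up limit and back down. `set_hinge_angle` stands in for whatever
# actuator driver (e.g., a small servo) the headset actually uses.

FLIP_LIMIT_DEG = 20.0  # an angle of roughly 20 degrees may be sufficient

def flip_up(set_hinge_angle, step: float = 2.0) -> None:
    """Sweep the hinge from closed (0 degrees) to the flip-up limit."""
    angle = 0.0
    while angle < FLIP_LIMIT_DEG:
        angle = min(angle + step, FLIP_LIMIT_DEG)
        set_hinge_angle(angle)

def flip_down(set_hinge_angle, step: float = 2.0) -> None:
    """Return the headset to the closed position against the face."""
    angle = FLIP_LIMIT_DEG
    while angle > 0.0:
        angle = max(angle - step, 0.0)
        set_hinge_angle(angle)

# usage:
flip_up(lambda a: print(f"hinge at {a:.0f} degrees"))
```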

Abstract

An enhanced reality headset, in an example, may include at least one visual output device and a viewport that selectively allows a user to view a scene exterior to the enhanced reality headset. An enhanced reality system, in an example, may include a computing device comprising a processor, an input device, and an enhanced reality headset comprising a viewport that selectively allows a user to view a scene exterior to the enhanced reality headset.

Description

ENHANCED REALITY HEADSETS
BACKGROUND
[0001] Augmented reality, virtual reality, and mixed reality all involve users interacting with real and/or perceived aspects of an environment in order to manipulate and/or interact with that environment. Interaction by a user in the augmented reality, virtual reality, and/or mixed reality environments may be viewed by others via a display device communicatively coupled to an augmented reality, virtual reality, and/or mixed reality system.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] The accompanying drawings illustrate various examples of the principles described herein and are part of the specification. The illustrated examples are given merely for illustration, and do not limit the scope of the claims.
[0003] Fig. 1 is a block diagram of an enhanced reality headset according to an example of the principles described herein.
[0004] Fig. 2 is a block diagram of an enhanced reality system according to an example of the principles described herein.
[0005] Fig. 3 is a block diagram of an enhanced reality headset according to an example of the principles described herein.
[0006] Fig. 4 is a perspective view of an enhanced reality headset according to an example of the principles described herein.
[0007] Fig. 5 is a perspective view of an enhanced reality headset according to an example of the principles described herein.
[0008] Fig. 6 is a side cut-away view of a user's head relative to an enhanced reality headset according to an example of the principles described herein.
[0009] Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements. The figures are not necessarily to scale, and the size of some parts may be exaggerated to more clearly illustrate the example shown. Moreover, the drawings provide examples and/or implementations consistent with the description; however, the description is not limited to the examples and/or implementations provided in the drawings.
DETAILED DESCRIPTION
[0010] Virtual reality (VR), augmented reality (AR), and mixed reality (MR) systems and devices are used by a user to perceive a visual representation of a VR, AR, and/or MR environment. VR systems and devices implement virtual reality (VR) headsets to generate realistic images, sounds, and other human-discernible sensations that simulate a user's physical presence in a virtual environment presented at the headset. In some examples, the VR system and/or device presents to a user a representation of physical spaces and/or multi-projected environments. AR systems and/or devices may include those systems and devices that implement a live direct and/or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory input such as sound, video, graphics, and/or GPS data. MR systems and/or devices present environments to a user that include the merging of real and virtual worlds to produce new environments and visualizations where physical and digital objects co-exist and interact in real time. For simplicity in description only, virtual reality (VR), augmented reality (AR), and mixed reality (MR) systems and/or devices are referred to herein as enhanced reality (ER) systems and/or devices.
[0011] ER systems include headsets that provide to a user a dedicated view of an ER environment. The headset replicates and/or displays an environment simulating physical presence in places in the real world or imagined worlds and lets a user interact with objects in that environment or the environment itself. The headset may include at least one display device that is brought close to a user's eye within the headset. The headset may further include a housing. The housing may hold in place the display device described herein.
[0012] The housing may further be formed to wrap around a user's face, blocking the user's view of the real world exterior to the headset. In some examples, this may be done so as to prevent light from entering into the headset. In other examples, the housing wrapped around the user's head may create a certain distance between the user's eyes and the display device. In yet other examples, the housing may prevent the headset from slipping off of the user's face during use by conforming to the user's face while a strap secures the headset to the user's head.
[0013] As mentioned, however, the housing prevents the user from viewing any scenes within the real-world environment exterior to the headset. This prevents a user from being able to visually interact with objects as well as ascertain the position of real-world objects around the user.
[0014] The present specification describes an enhanced reality headset that includes at least one visual output device and a viewport that selectively allows a user to view a scene exterior to the enhanced reality headset.
[0015] The present specification also describes an enhanced reality system that includes a computing device comprising a processor, an input device, an enhanced reality headset, and a viewport that selectively allows a user to view a scene exterior to the enhanced reality headset.
[0016] The present specification further describes an enhanced reality headset that includes at least one visual output device and a viewport, wherein the viewport is selectively opened and closed to provide a user of the enhanced reality headset with a view exterior to the enhanced reality headset based on characteristics of use of the enhanced reality headset.
[0017] As used in the present specification and in the appended claims, the term "enhanced reality" is meant to be understood as virtual reality (VR), augmented reality (AR), mixed reality (MR), or combinations thereof. In an example, an enhanced reality environment may have any characteristics of a VR environment, an AR environment, and/or an MR environment.
[0018] Turning now to the figures, Fig. 1 is a block diagram of an enhanced reality headset (100) according to an example of the principles described herein. The enhanced reality headset (100) may include at least one visual output device (105) and a viewport (110). In an example, the enhanced reality headset (100) may further include a processor that receives, at least, image data and displays the image data on the visual output device (105). The processor may, in an example, further execute computer usable program code to, when output from a gaze sensor is received, open the viewport (110) of the enhanced reality headset (100).
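By way of illustration only (this sketch is not part of the patent), the signal path just described, in which a processor opens the viewport (110) when output from a gaze sensor is received, might be organized as follows; GazeSensor and ViewportLatch are hypothetical stand-ins for the gaze sensor and the latch or motor device:

```python
# Hypothetical sketch of the gaze-to-viewport signal path: poll the
# gaze sensor and release the viewport when a gaze event arrives.

import time

class GazeSensor:
    """Stand-in for the headset's gaze sensor driver."""
    def read(self):
        # Would return an event such as {"target": "viewport"} when the
        # user directs their gaze toward the viewport's interior surface.
        return None

class ViewportLatch:
    """Stand-in for the latch or motor that releases the viewport."""
    def open(self):
        print("viewport: latch released, viewport opening")

def run(sensor: GazeSensor, latch: ViewportLatch) -> None:
    while True:
        event = sensor.read()
        if event and event.get("target") == "viewport":
            latch.open()
        time.sleep(0.01)  # poll at roughly 100 Hz
```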
[0019] The visual output device (105) may be any type of visual output device (105) that presents to a user an enhanced reality (ER) environment. In an example, the enhanced reality headset (100) includes two visual output devices (105); one for each eye. In this example, the two visual output devices (105) are arranged to be placed 1 to 2 inches away from a user's eyes. In this manner, a three-dimensional (3D) stereoscopic image of the ER environment is presented to the user. Other types of visual output devices (105) may be used to assist the user in viewing the ER environment within the enhanced reality headset (100) and the present specification contemplates the use of those other types of visual output devices (105).
[0020] The enhanced reality headset (100) also includes a housing. The housing may include certain features that prevent the enhanced reality headset (100) from slipping down a user's face as well as maintaining the enhanced reality headset (100) on the user's head. For example, the housing may include coupling points where a strap or other head coupling device is coupled to the housing so that the enhanced reality headset (100) may be maintained on the face of a user. The housing may further serve to house certain electrical circuits, components, and/or processors associated with the functioning of the enhanced reality headset (100). The housing may further include surfaces that hold the visual output device(s) (105) at a constant distance from the user's eyes.
[0021] In an example, the housing may include shroud surfaces placed around the visual output device (105) and extending to the surface of the user's face when the user is wearing the enhanced reality headset (100). These shroud surfaces prevent the user from viewing the real-life environment exterior to the enhanced reality headset (100). As such, the user may be hindered from interacting with objects in the real-life environment due to being unable to see those objects. Some of these objects include input devices such as a keyboard and a mouse, among other user-interactive devices associated with the operation of an enhanced reality headset (100).
[0022] The enhanced reality headset (100) may include a viewport (110). The viewport (110) allows a user to view the real-life environment exterior to the housing of the enhanced reality headset (100). In an example, the viewport (110) selectively allows the user to see the real-life environment exterior to the enhanced reality headset (100). When the viewport (110) allows the user to view the real-life environment exterior to the enhanced reality headset (100), this may be referred to herein as activation of the viewport.
[0023] In an example, the viewport (110) serves as a portion of the housing of the enhanced reality headset (100) when shrouding the view of the user. In this example, the portion of the housing is moved away from obstructing the user's view of, or line-of-sight (LoS) to, the real-life environment outside of the enhanced reality headset (100). In this example, the viewport (110) may be coupled to the enhanced reality headset (100) via a hinge at one end allowing the viewport (110) to be moved away without being uncoupled from the enhanced reality headset (100). In an example, a latch may be provided that disengages with the viewport (110) such that the viewport (110) is allowed to move away from a user's LoS by either gravity or a spring installed with the hinge. In an example, the latch may be electrically activated based on input from a sensor.
[0024] In the example where the viewport (110) serves as a portion of the housing of the enhanced reality headset (100) when shrouding the view of the user, the viewport (110) may be slid out of the way using a compartment formed into the enhanced reality headset (100). In this example, movement of the viewport (110) into the compartment where the viewport (110) may be placed allows the user to view the real-life environment outside the enhanced reality headset (100). In this example, the viewport (110) may include a protrusion that extends out from the viewport (110) that the user may push against in order to move the viewport (110) out of the LoS of the user as described herein.
[0025] In the example where the viewport (110) serves as a portion of the housing of the enhanced reality headset (100) when shrouding the view of the user, the viewport (110) is removable from the enhanced reality headset (100). The removable viewport (110) may be selectively coupled to the enhanced reality headset (100) via any type of coupling device that allows a user to physically remove the viewport (110) during use of the enhanced reality headset (100).
[0026] In some examples, the viewport (110) may be made of a switchable glass that has light transmission properties that are altered when voltage, light, and/or heat is applied to the switchable glass. In this way, the physical structure of the viewport (110) remains stationary while the switchable glass selectively allows for the transmission and non-transmission of the light from outside the enhanced reality headset (100) through the viewport (110).
[0027] In an example where the viewport (110) serves as a portion of the housing of the enhanced reality headset (100), the housing may be formed into a number of louvers. These louvers may be moved to allow a user to view between the louvers when the viewport (110) is activated as described herein.
[0028] As described herein, the removal of the viewport (110) from the LoS of the user may be done by the user interacting physically with the viewport (110). In some examples, electrical and/or mechanical devices may be used to remove the viewport (110) from the LoS of the user. In the example where the viewport (110) is coupled to the enhanced reality headset (100) via, for example, a hinge, a latch may be mechanically activated to release the viewport (110), allowing the viewport (110) to pivot about the hinge.
[0029] In the example where the viewport (110) is slid out of the way within a compartment formed into the enhanced reality headset (100), a rail system with a motor may be used. In this example, the motor may move the viewport (110) out of the LoS of the user and into the compartment formed in the enhanced reality headset (100). The motor may mechanically move the viewport (110) out of the LoS after receiving a signal from, for example, a processor associated with the enhanced reality headset (100).
[0030] In the example where the viewport (110) is made of a switchable glass, a processor associated with the enhanced reality headset (100) may control if and when a voltage, light, and/or heat is applied to the switchable glass. In an example, the opacity and/or translucency of the switchable glass may be altered to any degree based on the level of voltage, light, and/or heat being applied.
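As a rough illustration (an assumption, not taken from the patent), the proportional control just described could map a desired transmittance to a drive level; the 0 to 5 V range and the linear glass response below are invented for the example:

```python
# Hypothetical sketch: map a desired switchable-glass transmittance
# (0.0 = fully opaque, 1.0 = fully clear) to a drive voltage, assuming
# a linear response over a 0-5 V range.

def drive_voltage_for_transmittance(target: float,
                                    v_min: float = 0.0,
                                    v_max: float = 5.0) -> float:
    target = max(0.0, min(1.0, target))  # clamp to the valid range
    return v_min + target * (v_max - v_min)

# e.g., half-transparent glass while the user glances at a keyboard:
print(drive_voltage_for_transmittance(0.5))  # -> 2.5
```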
[0031] In the example where the viewport (110) is made of a number of louvers, a processor associated with the enhanced reality headset (100) may control if and when the louvers are to be opened. The processor may receive certain input from the user and/or a number of input devices directing the louvers to be opened.
[0032] In some examples, the visual output device (105) may further include a gaze sensor. The gaze sensor may monitor the direction of a user's gaze. During operation of the enhanced reality headset (100), a user may relatively more often direct the user's gaze to the visual output device(s) (105). However, as described herein, a user may implement certain input devices such as a keyboard and a mouse in order to interact with objects represented within the ER environment. With the viewport (110) closed and preventing the user from seeing the input device, a user may be left to rely on the sense of touch in order to interface with those input devices. The gaze sensor, however, may track the movement of a user's eyes and determine if and when a user is attempting to view the real-life environment blocked from view by a "closed" viewport (110). When the gaze sensor detects a user has directed the user's gaze to the interior surface of the viewport (110), the viewport (110) may, via a processor, "open" to allow the user to see the real-world environment including the input devices. In the example where the viewport (110) forms a part of the housing of the enhanced reality headset (100) as described herein, the processor may send a signal to a latching device or motor device to move the viewport (110) out of the way as described herein. In the example where the viewport (110) is made of switchable glass, the processor may send a signal to have voltage, light, and/or heat applied to the switchable glass in order to make the switchable glass transparent. In any of these examples, when the gaze sensor detects that the user is no longer directing the user's gaze towards the viewport (110) or the real-life environment seen through the "open" viewport (110), the processor may "close" the viewport (110) upon receipt of such an indication from the gaze sensor.
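The open-on-gaze and close-on-look-away behavior just described can be pictured as a small state machine. The sketch below is illustrative only; the dwell thresholds and the shape of a gaze sample are assumptions:

```python
# Hypothetical sketch: open the viewport when gaze dwells on it, close
# it again once gaze has moved away for a grace period.

from dataclasses import dataclass

@dataclass
class GazeSample:
    on_viewport: bool   # sensor reports gaze at the viewport (or the
    timestamp: float    # real-life scene seen through it), in seconds

class ViewportController:
    OPEN_DWELL = 0.3    # sustained gaze needed before opening
    CLOSE_DWELL = 1.0   # grace period of looking away before closing

    def __init__(self):
        self.is_open = False
        self._since = None  # start time of the current trigger condition

    def update(self, s: GazeSample) -> bool:
        """Feed one gaze sample; returns the desired viewport state."""
        triggered = s.on_viewport if not self.is_open else not s.on_viewport
        if triggered:
            if self._since is None:
                self._since = s.timestamp
            dwell = self.OPEN_DWELL if not self.is_open else self.CLOSE_DWELL
            if s.timestamp - self._since >= dwell:
                self.is_open = not self.is_open
                self._since = None
        else:
            self._since = None
        return self.is_open
```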
[0033] In an example, the enhanced reality headset (100) may further include a voice recognition device. During operation of the enhanced reality headset (100), the user may present voice commands directed towards allowing the user to view out of the viewport (110) and at the real-life environment exterior to the enhanced reality headset (100). Similar to the gaze sensor described herein, specific commands from the user such as "open viewport" may be received at a microphone of the voice recognition device. This audio input may be received by a processor associated with the voice recognition device and/or enhanced reality headset (100), be analyzed, and specific commands may be executed via signals sent to the enhanced reality headset (100). These commands may include sliding the viewport (110) open, activating a latch holding the viewport (110) closed, or applying voltage, light, and/or heat to a switchable glass as described herein. In this manner, the viewport (110) may be "opened" and "closed" using voice recognition.
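A minimal, hypothetical dispatch from recognized phrases to viewport commands might look like the following; the speech recognizer itself is out of scope, and `send_signal` stands in for the wired or wireless link to the headset:

```python
# Hypothetical sketch: map recognized voice commands to viewport actions.

VOICE_COMMANDS = {
    "open viewport": "OPEN",
    "close viewport": "CLOSE",
}

def handle_transcript(transcript: str, send_signal) -> None:
    """Dispatch a recognized phrase to the headset's viewport control."""
    action = VOICE_COMMANDS.get(transcript.strip().lower())
    if action is not None:
        send_signal({"device": "viewport", "action": action})

# usage:
handle_transcript("Open viewport", print)
# -> {'device': 'viewport', 'action': 'OPEN'}
```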
[0034] In an example, the enhanced reality headset (100) may further include an input sensor that senses input from an input device. In this example, the enhanced reality headset (100) may be communicatively coupled to a processor of a computing device. The computing device may include a number of input devices including, but not limited to, a keyboard, a mouse, a joystick, a touch-sensitive mat, a trackball, a microphone, and a camera, among others. These devices may be activated during use of the enhanced reality headset (100) by a user placing a hand on the devices. When this occurs, the processor may receive input from any one of these devices and send a signal to the viewport (110) of the enhanced reality headset (100) to "open" as described herein. In this example, a user of the enhanced reality headset (100) may not see the actual input device but may know the general area where it is located. By simply activating a button or some other input method on the input device, the user may cause the viewport (110) to be activated, allowing the user to interact with the input device relatively more accurately. In the example where the input device is a microphone, the user may implement voice commands which may then be received by the processor, and signals may be sent, wirelessly or via wire, to the enhanced reality headset (100) as described. In the example where the input device is a webcam, the webcam may monitor parts of the user's body or the general area containing the input devices for specific hand and/or body gestures. These hand and/or body gestures may be interpreted by image processing applications executed by the processor of the computing device such that certain hand and/or body gestures indicate the activation or opening of the viewport (110). Again, a signal may be processed by the processor and sent, either wirelessly or via a wire, to the enhanced reality headset (100) in order to open and/or activate the viewport (110) and its associated devices.
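Illustratively (the event names are hypothetical, not a real API), the input-device trigger just described reduces to opening the viewport on the first activity from any coupled physical device:

```python
# Hypothetical sketch: open the viewport when the user touches any
# physical input device coupled to the computing device.

PHYSICAL_DEVICE_EVENTS = {
    "key_down",           # keyboard
    "mouse_move",         # mouse
    "joystick_axis",      # joystick
    "touch_mat_contact",  # touch-sensitive mat
    "trackball_move",     # trackball
}

def on_input_event(event_type: str, open_viewport) -> None:
    if event_type in PHYSICAL_DEVICE_EVENTS:
        open_viewport()

# usage:
on_input_event("key_down", lambda: print("viewport: open signal sent"))
```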
[0035] In an example, the enhanced reality headset (100) may further include an input sensor that senses input from a switch or touch panel on the head-mounted display, or from a motion sensor on the head-mounted display. In this example, the viewport (110) may be activated by a user touching the side of the headset (100) or by a user briefly shaking the user's head.
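One plausible way to detect the brief head shake, assumed here rather than specified by the patent, is to count alternating lateral acceleration spikes from the head-mounted motion sensor:

```python
# Hypothetical sketch: report a head shake once several alternating
# lateral acceleration spikes occur within a short window. The
# thresholds and the accelerometer interface are assumptions.

import time

def detect_shake(read_accel_x, threshold: float = 12.0,
                 spikes_needed: int = 3, window: float = 0.8) -> bool:
    """`read_accel_x` returns lateral acceleration in m/s^2."""
    spikes = 0
    last_sign = 0
    start = time.monotonic()
    while time.monotonic() - start < window:
        ax = read_accel_x()
        sign = 1 if ax > threshold else -1 if ax < -threshold else 0
        if sign != 0 and sign != last_sign:
            spikes += 1
            last_sign = sign
            if spikes >= spikes_needed:
                return True
        time.sleep(0.005)
    return False
```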
[0036] The viewport (110) may be formed on any surface of the housing of the enhanced reality headset (100). In an example, the viewport (110) is formed within the lower portion of the housing. This lower portion of the housing may shroud the area below the user's eyes and, when the viewport (110) is closed, contacts the user's cheeks. In the example where the viewport (110) is to be moved away from the user's face, the viewport (110) may slide away or flip down from the user's cheeks, opening up the viewport (110) as described in any example herein. In the example where the viewport (110) is made of switchable glass, the lower housing of the enhanced reality headset (100) may be formed out of the switchable glass such that the user, when the switchable glass is activated, may be allowed to see a desktop below the enhanced reality headset (100).
[0037] Fig. 2 is a block diagram of an enhanced reality system (200) according to an example of the principles described herein. The enhanced reality system (200) may include a computing device (205) communicatively coupled to at least one processor (210), at least one input device (215), and an enhanced reality headset (220) including a viewport (225). The enhanced reality system (200) may further include any devices used to communicatively couple any of the computing device (205), processor (210), input device (215), enhanced reality headset (220), and viewport (225) together.
[0038] The computing device (205) may be any type of computing device including desktop computers, laptop computers, personal digital assistants (PDAs), mobile devices, smartphones, gaming systems, and tablets, among other computing devices. In an example, the input device (215) may be communicatively coupled to the computing device (205) so that input from the input device (215) may be received by the processor (210) of the computing device (205) and that input may be used to allow a user to interact with objects represented to a user via the enhanced reality headset (220).
[0039] The enhanced reality headset (220) may also be communicatively coupled to the processor (210) of the computing device (205) in order to receive input from the processor (210) as well as provide data to the processor (210). During operation of the enhanced reality headset (220), a user is presented visually with an ER environment. The processor (210) may execute computer readable program code in order to present that ER environment. Additionally, the processor (210) may receive input from the enhanced reality headset (220) indicating changes to the ER environment displayed as well as the orientation of the enhanced reality headset (220) within the ER environment. The process of the processor (210) executing the computer readable program code to provide a visual representation of the ER environment and the process of the processor (210) receiving input from the enhanced reality headset (220) may be conducted on a real-time basis during operation of the enhanced reality headset (220).
[0040] The input device (215) may be any number of input devices that include any type of input device. As described herein, example input devices (215) may include a mouse, a keyboard, a joystick, a microphone, a camera, the gaze sensor described herein, or any other device that may be used to provide input to, at least, the processor (210) of the computing device (205). In each of these examples, however, the input from the input device (215) may be received via a hardware adapter at the computing device (205), received by the processor (210), interpreted, and used to affect the state of the viewport (225) as described herein.
[0041] The viewport (225) of the enhanced reality headset (220) may also be communicatively coupled to the processor (210) of the computing device (205). In an example, the viewport (225) may be a physical portion of the housing of the enhanced reality headset (220). In an example, the portion of the housing of the enhanced reality headset (220) may be a lower portion. In an example, the viewport (225) may include a latch and a hinge that allow a user to physically unlatch a side of the viewport (225), allowing the viewport (225) to be moved away from the user's view towards, for example, a desktop. Similarly, the viewport (225) may be slid out of the way by the user along a set of rails, allowing the viewport (225) to be moved away from the user's view towards, for example, a desktop.
[0042] In an example, the viewport (225) may be unlatched or slid based on input from an input device. In the example where the input device (215) is a microphone, a user may present oral commands that are received by the microphone. The input may be received and processed by the processor (210) and signals may be sent to the devices controlling the viewport (225) to "open" the viewport, allowing the user to view the real-life environment exterior to the enhanced reality headset (220). In the example where the input device (215) is a camera, a user may present visual commands that are received by the camera. The input may be received and processed by the processor (210) and signals may be sent to the devices controlling the viewport (225) to "open" the viewport, allowing the user to view the real-life environment exterior to the enhanced reality headset (220). In the example where the input device (215) is the gaze sensor described herein, a user may direct the user's gaze towards the viewport (225) and the movement may be detected by the gaze sensor. The gaze sensor may receive this input and send it to the processor (210) to be processed. Signals may be sent to the devices controlling the viewport (225) to "open" the viewport, allowing the user to view the real-life environment exterior to the enhanced reality headset (220). In the example where the input device (215) is a joystick, a mouse, and/or a keyboard, a user may activate any input methods in order to send any type of input to the processor (210). The input may be received and processed by the processor (210) and signals may be sent to the devices controlling the viewport (225) to "open" the viewport, allowing the user to view the real-life environment exterior to the enhanced reality headset (220).
[0043] In the examples where the viewport (225) is made of a switchable glass that is selectively made opaque and transparent, similar input from the input devices (215) may render the switchable glass transparent. In an example, any described input from the joystick, mouse, keyboard, camera, microphone, and/or gaze sensor may cause the switchable glass to become transparent. In any of these examples, the transparency of the switchable glass may be changed back to opaque after the passage of time and/or after input from the input devices (215) is no longer received, or is not received for a period of time. In the example where the input device (215) is a gaze sensor, the switchable glass of the viewport (225) may be made opaque when the user's gaze is no longer detected as being directed towards the viewport (225) and/or the environment exterior to the enhanced reality headset (220).
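The revert-to-opaque behavior might be sketched as follows; the five-second idle timeout is an assumption chosen for illustration:

```python
# Hypothetical sketch: the glass clears on qualifying input and falls
# back to opaque once input has been absent for an idle timeout.

import time

class SwitchableGlassState:
    def __init__(self, idle_timeout: float = 5.0):
        self.idle_timeout = idle_timeout
        self.transparent = False
        self._last_input = 0.0

    def on_input(self) -> None:
        """Any qualifying input (gaze, voice, keyboard...) clears the glass."""
        self.transparent = True
        self._last_input = time.monotonic()

    def tick(self) -> None:
        """Call periodically; re-opaques the glass after the timeout."""
        idle = time.monotonic() - self._last_input
        if self.transparent and idle > self.idle_timeout:
            self.transparent = False
```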
[0044] Fig. 3 is a block diagram of an enhanced reality headset (300) according to an example of the principles described herein. The enhanced reality headset (300) of Fig. 3 may include at least one visual output device (305) and a viewport (310) similar to that shown and described in connection with Fig. 1. The example shown in Fig. 3 may further include a viewport (310) that is selectively activated to provide a user of the enhanced reality headset with a view exterior to the enhanced reality headset based on characteristics of use of the enhanced reality headset identified by an analysis module (315). As described herein, a processor (Fig. 2, 210) of either a computing device (Fig. 2, 205) the enhanced reality headset (300) is coupled to, or a processor of the enhanced reality headset (300) itself, may execute computer readable program code in order to present a user with an ER environment. In this example, the processor (Fig. 2, 210) may execute computer usable program code associated with the analysis module (315). The analysis module (315) may be stored in a data storage device as a separate computer program product and, during use of the enhanced reality headset (300), may be used to detect the characteristics of use of the enhanced reality headset (300) by the user.
[0045] In some examples, the ER environment may elicit a response from a user in the form of typed input from an input device (Fig. 2, 215) such as a keyboard. In this example, the viewport (310) may be activated to allow a user to see the keyboard. The activation of the viewport (310) may be done when, for example, a user's gaze is sensed by a gaze sensor indicating that the user is looking at a part of the ER environment where input may be entered. In this example, as the user looks towards the input location within the ER environment, the gaze sensor may send a signal to the processor indicating that the user should be allowed to see the keyboard and that the viewport (310) should be activated to allow the user to do so. In an example, the viewport (310) may remain open for a period of time and as long as the gaze sensor senses that the user's gaze is being directed to the part of the ER environment where input may be entered or input is received from the keyboard. In this manner, the viewport (310) may be selectively activated based on characteristics of use of the enhanced reality headset during execution of the analysis module.
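A minimal, assumption-laden sketch of this analysis-module behavior follows; the three-second hold-open period and all identifiers are illustrative choices, not features of the described headset:

```python
# Illustrative sketch: the viewport stays activated while the gaze rests
# on the ER input location or keystrokes keep arriving, and deactivates
# once a hold-open period lapses. The 3-second value is an assumption.
class AnalysisModule:
    def __init__(self, hold_open_s: float = 3.0):
        self.hold_open_s = hold_open_s
        self._open_until = float("-inf")

    def update(self, now: float, gaze_in_input_region: bool,
               keystroke_received: bool) -> bool:
        """Return True while the viewport should remain activated."""
        if gaze_in_input_region or keystroke_received:
            self._open_until = now + self.hold_open_s
        return now < self._open_until

module = AnalysisModule()
assert module.update(0.0, gaze_in_input_region=True, keystroke_received=False)
assert module.update(2.0, False, True)        # typing keeps it open
assert not module.update(10.0, False, False)  # closes once the period lapses
```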
[0046] Other characteristics of use may also cause the viewport (310) to be activated. Some examples include the gaze of a user's eyes, an input period to input data into a computing system associated with the enhanced reality headset, when a user is prompted to input data within the ER environment (e.g., when initially logging on to the system), or when input is received from an input device by the enhanced reality headset. In an example, combinations of these characteristics of use may also cause the viewport (310) to be activated.

[0047] Fig. 4 is a perspective view of an enhanced reality headset (400) according to an example of the principles described herein. In the example shown in Fig. 4, the enhanced reality headset (400) may include a number of visual output devices (405) and a viewport (410) as described herein. In this example, the number of visual output devices (405) is two: one visual output device (405) for each eye. This creates a stereoscopic view of the ER environment presented to the user of the enhanced reality headset (400).
[0048] In this example, the viewport (410) is a sliding-type viewport (410). As described herein, the viewport (410) serves as a portion of the lower side of the housing (420) of the enhanced reality headset (400) when placed in a closed position. When the viewport (410) is placed in an open state, the user is allowed to view the real-life environment outside of the enhanced reality headset (400), and in particular a desktop in front of the user that includes input devices to be used by the user to interact with the ER environment presented on the visual output devices (405). The arrow (415) denotes the direction the sliding housing portion of the viewport (410) is to be moved.
[0049] In an example, and as shown in Fig. 4, the enhanced reality headset (400) may further include a number of gaze sensors (425). As described herein, the gaze sensors (425) may monitor the direction of a user's gaze during use of the enhanced reality headset (400). When a user's gaze is directed towards the closed viewport (410), a signal may be sent to a processor. This processor may send a signal to, for example, a motor to cause the viewport (410) to slide out of the way of the user's gaze and allow the user to view the real-life environment outside of the enhanced reality headset (400).
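One way such a gaze-to-motor decision might be debounced is sketched below, purely for illustration; the sampling-window approach and the 0.8 threshold are assumptions, not part of the described gaze sensors (425):

```python
# Illustrative helper: deciding when the gaze sensors should drive the
# slide motor. Requiring a sustained majority over a short window of
# samples debounces stray glances; the threshold is an assumption.
def gaze_opens_viewport(samples: list[bool], threshold: float = 0.8) -> bool:
    """samples: recent gaze readings, True = gaze on the closed viewport."""
    if not samples:
        return False
    return sum(samples) / len(samples) >= threshold

assert not gaze_opens_viewport([True, False, False, False, False])  # a glance
assert gaze_opens_viewport([True, True, True, True, True])          # sustained
```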
[0050] Although Fig. 4 shows the use of the gaze sensors (425), the present specification contemplates any devices and input from any type of input device being used to open or move the viewport (410) out of the way of the viewer's gaze to allow the user to view outside of the enhanced reality headset (400). Other examples have been presented herein in order to move the sliding viewport (410) shown here as well as a hinged viewport (410).

[0051] Fig. 4 also shows a number of coupling points (430) to which a harness may be coupled. The harness may prevent the enhanced reality headset (400) from sliding down the user's face by holding the enhanced reality headset (400) to the user's head.
[0052] Fig. 5 is a perspective view of an enhanced reality headset (500) according to an example of the principles described herein. In the example shown in Fig. 5, the enhanced reality headset (500) may include a number of visual output devices (405) and a viewport (505) as described herein. In this example, the number of visual output devices (405) is two: one visual output device (405) for each eye. This creates a stereoscopic view of the ER environment presented to the user of the enhanced reality headset (500).
[0053] In this example, the viewport (505) is made of a switchable glass as described herein. The switchable glass may remain tight to a user's face, thereby supporting the enhanced reality headset (500) on the user's face. As described herein, the gaze sensors (425) may be used to track the user's direction of gaze. When the user directs the user's gaze towards the viewport (505), the gaze sensors (425) and/or processor may send a signal to apply a voltage, light, and/or heat to the switchable glass. Upon application of the voltage, light, and/or heat, the switchable glass is made transparent, allowing the user to view the real-life environment exterior to the enhanced reality headset (500).
[0054] In any of the examples presented above, the viewport may be created by the enhanced reality headset (500) being moved as a whole away from the user's face. In this example, an interface between the enhanced reality headset (500) and a head coupling device may include a hinge. The hinge may allow the enhanced reality headset (500) to be flipped up and away from in front of the user's face, thereby creating the viewport. Again, any input from any device or any action described herein may be used to automatically cause the enhanced reality headset (500) to be flipped up in this manner. In an example, the degree to which the enhanced reality headset is flipped up may be limited (i.e., an angle of roughly 20 degrees may be sufficient). In an example, an automated mechanism may be used to flip the headset up over that range. This may allow a user to access input devices communicatively coupled to the enhanced reality headset (500) or a computing device associated with the enhanced reality headset (500). In an example, as the user activates or changes the state of any of the input devices, the enhanced reality headset (500) may be flipped back down, allowing the user to interact with the ER environment displayed on the display devices again. In an example, the enhanced reality headset (500) may be flipped back down after a passage of time, thereby allowing a user to prepare before not being able to see the real-life environment again.
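For illustration, the flip-up behavior described in this paragraph might be modeled as follows; the ten-second flip-down timeout and the interface are assumptions, while the 20-degree limit echoes the rough figure suggested above:

```python
# Illustrative sketch of the automated flip-up mechanism: the hinge is
# driven to a limited angle and returns to zero when an input device
# changes state or a timeout passes. Timeout value is an assumption.
class FlipUpHinge:
    MAX_ANGLE_DEG = 20.0               # rough limit suggested in the text

    def __init__(self, flip_down_after_s: float = 10.0):
        self.angle_deg = 0.0           # 0 = headset down over the eyes
        self.flip_down_after_s = flip_down_after_s
        self._raised_at = None

    def flip_up(self, now: float) -> None:
        self.angle_deg = self.MAX_ANGLE_DEG
        self._raised_at = now

    def on_input_state_change(self) -> None:
        self.angle_deg = 0.0           # user touched an input device
        self._raised_at = None

    def tick(self, now: float) -> None:
        # Flip back down after a passage of time.
        if self._raised_at is not None and now - self._raised_at > self.flip_down_after_s:
            self.on_input_state_change()

hinge = FlipUpHinge()
hinge.flip_up(now=0.0)
hinge.tick(now=15.0)
assert hinge.angle_deg == 0.0
```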
[0055] Fig. 6 is a side cut-away view of a user's head relative to an enhanced reality headset according to an example of the principles described herein. The enhanced reality headset (600) provides an amount of space between the housing of the enhanced reality headset (600) and the user's face when the viewport is in an open state, allowing the user to view the real-life environment presented below the user. Because this is a location where a mouse, keyboard, or other types of input devices are located, a user may be able to visually access the input devices when the viewport is open or has been activated as described herein. The amount of viewable space is dictated by the angle (605) created between the user's face and any remaining portion of the housing of the enhanced reality headset (600). A portion of the user's view is taken up by the visual output devices in front of the user's eyes, and that angle (610) may be more or less than the angle (605) between the user's face and the housing of the enhanced reality headset (600).
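As a rough worked example of how the angle (605) dictates the viewable space, the following simplification treats the open viewport as bounding a downward viewing angle; the geometry and the sample numbers are assumptions for intuition only:

```python
# Rough, assumption-laden geometry: if the open viewport lets the gaze
# drop opening_angle_deg below the housing edge, a desk eye_height_m
# below the eyes is visible over a strip of roughly
# eye_height * tan(angle). A simplification for intuition only.
import math

def visible_desk_extent_m(eye_height_m: float, opening_angle_deg: float) -> float:
    return eye_height_m * math.tan(math.radians(opening_angle_deg))

# e.g. eyes 0.4 m above the desk and a 30-degree opening angle (605):
print(f"{visible_desk_extent_m(0.4, 30.0):.2f} m")  # about 0.23 m of desk
```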
[0056] As described herein, the enhanced reality headset (600) may further include a head coupling device (615) and a hinge (620) to allow the enhanced reality headset (600) to be flipped up and away from the user's face when the real-life environment exterior to the enhanced reality headset (600) is to be seen. In this example, the viewport may be activated when this occurs, opening up more of the user's view of the outside world that would otherwise be obscured by the housing. Similarly, the viewport could close automatically when the headset is flipped back down. Thus, in this example, the enhanced reality headset (600) flipping up may be included with the viewports described herein or may replace the movement of the housing or the switchable glass.
[0057] The specification and figures describe an enhanced reality headset that includes a viewport. As described herein, the viewport may be placed in two states that selectively allow and prevent a user from viewing the real-life environment exterior to the enhanced reality headset. During use of the enhanced reality headset, the viewport may be opened as the user has directed in order to allow the user to interact with real-life objects such as the input devices presented herein. In some examples, the context and/or characteristics of the use of the enhanced reality headset may dictate if and when the viewport is activated, thereby allowing the user to view the real-life environment exterior to the enhanced reality headset.
[0058] The preceding description has been presented to illustrate and describe examples of the principles described. This description is not intended to be exhaustive or to limit these principles to any precise form disclosed. Many modifications and variations are possible in light of the above teaching.

Claims

WHAT IS CLAIMED IS:
1. An enhanced reality headset, comprising:
at least one visual output device; and
a viewport that selectively allows a user to view a scene exterior to the enhanced reality headset.
2. The enhanced reality headset of claim 1, wherein the viewport is formed on a bottom surface of the enhanced reality headset allowing a user to view below the visual output device.

3. The enhanced reality headset of claim 1, wherein the viewport is a hinged surface of the enhanced reality headset that is movable away from the enhanced reality headset.

4. The enhanced reality headset of claim 1, wherein the viewport is selectively transparent and opaque.

5. The enhanced reality headset of claim 4, wherein the viewport is selectively made to be transparent and opaque upon application of a voltage.

6. The enhanced reality headset of claim 1, further comprising a gaze sensor to sense the direction of the user's gaze and wherein selectively allowing the user to view a scene exterior to the enhanced reality headset is controlled by the gaze sensor detecting that the user's gaze has been directed towards the viewport.

7. The enhanced reality headset of claim 1, wherein selectively allowing the user to view a scene exterior to the enhanced reality headset is controlled via a voice command.

8. The enhanced reality headset of claim 1, wherein selectively allowing the user to view a scene exterior to the enhanced reality headset is controlled via a switch.

9. The enhanced reality headset of claim 1, wherein selectively allowing the user to view a scene exterior to the enhanced reality headset is controlled via a signal representing a sensed input from a keyboard, a mouse, or combinations thereof.
10. An enhanced reality system, comprising:
a computing device comprising a processor;
an input device; and
an enhanced reality headset comprising a viewport that selectively allows a user to view a scene exterior to the enhanced reality headset.
11. The enhanced reality system of claim 10, wherein the viewport comprises a portion of a housing of the enhanced reality headset that moves away to provide a view exterior to the enhanced reality headset.
12. The enhanced reality system of claim 10, wherein the viewport comprises a selectively opaque and translucent portion of a housing.
13. The enhanced reality system of claim 10, wherein the selectively allowing a user to view a scene exterior to the enhanced reality headset is accomplished using a gaze sensor to sense the user's gaze.
14. An enhanced reality headset, comprising:
at least one visual output device;
a viewport; and
an analysis module;
wherein the viewport is selectively activated to provide a user of the enhanced reality headset with a view exterior to the enhanced reality headset based on characteristics of use of the enhanced reality headset identified by the analysis module.
15. The enhanced reality headset of claim 14, wherein the characteristics of use comprise the gaze of a user’s eyes, an input period to input data into a computing system associated with the enhanced reality headset, when input is received from an input device by the enhanced reality headset, or combinations thereof.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2017/064616 WO2019112557A1 (en) 2017-12-05 2017-12-05 Enhanced reality headsets

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2017/064616 WO2019112557A1 (en) 2017-12-05 2017-12-05 Enhanced reality headsets

Publications (1)

Publication Number Publication Date
WO2019112557A1

Family

ID=66750284

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/064616 WO2019112557A1 (en) 2017-12-05 2017-12-05 Enhanced reality headsets

Country Status (1)

Country Link
WO (1) WO2019112557A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130293688A1 (en) * 2012-05-04 2013-11-07 Sony Computer Entertainment Europe Limited Head mountable display system
US20140364212A1 (en) * 2013-06-08 2014-12-11 Sony Computer Entertainment Inc. Systems and methods for transitioning between transparent mode and non-transparent mode in a head mounted dipslay
US20160116748A1 (en) * 2014-10-24 2016-04-28 Emagin Corporation Microdisplay based immersive headset
US20160196694A1 (en) * 2015-01-05 2016-07-07 Worcester Polytechnic Institute System and method for controlling immersiveness of head-worn displays



Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17933873; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 17933873; Country of ref document: EP; Kind code of ref document: A1)