GB2548151A - Head-mountable display - Google Patents

Head-mountable display

Info

Publication number
GB2548151A
Authority
GB
United Kingdom
Prior art keywords
display
user
hmd
eye
currently viewing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1604179.0A
Other versions
GB2548151B (en)
GB201604179D0 (en)
Inventor
John Hall Simon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
Original Assignee
Sony Computer Entertainment Europe Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe Ltd filed Critical Sony Computer Entertainment Europe Ltd
Priority to GB1604179.0A
Publication of GB201604179D0
Priority to PCT/GB2017/050663 (WO2017153778A1)
Publication of GB2548151A
Application granted
Publication of GB2548151B
Status: Active

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/383Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

A head-mountable display (HMD) comprising a display having one or more display regions operable in a normal or a low power mode, a detector 800 and a controller 810. The detector is configured to detect whether one or both of a user's eyes are currently viewing the display, for example by determining gaze direction or by detecting whether one or more of the user's eyes are shut. The controller controls a currently viewed display region to operate in a normal power mode, and a currently unviewed display region to operate in a low power mode. The normal and low power modes may correspond to higher and lower display region illumination levels respectively, and the illumination may be provided by display backlights. The controller is also configured to control rendering of an image portion for display (which may constitute a normal power mode) and display of stored image data (which may constitute a low power mode).

Description

HEAD-MOUNTABLE DISPLAY
BACKGROUND
Field of the Disclosure
This disclosure relates to head-mountable displays.
Description of the Prior Art
The “background” description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure. A head-mountable display (HMD) is one example of a head-mountable apparatus. In an HMD, an image or video display device is provided which may be worn on the head or as part of a helmet. Either one eye or both eyes are provided with small electronic display devices.
Although the original development of HMDs was perhaps driven by the military and professional applications of these devices, HMDs are becoming more popular for use by casual users in, for example, computer game or domestic computing applications.
In any device of this nature, power consumption is an important factor. It is a general aim to reduce power consumption of devices such as HMDs.
The foregoing paragraphs have been provided by way of general introduction, and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
SUMMARY
Various aspects and features of the present disclosure are defined in the appended claims and within the text of the accompanying description and include at least a video server, a head mountable display, a system, a method of operating a video server or a head-mountable apparatus as well as a computer program and a video signal.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
Figure 1 schematically illustrates an HMD to be worn by a user;
Figure 2 is a schematic plan view of an HMD;
Figures 3 and 4 schematically illustrate a user wearing an HMD connected to a Sony® PlayStation® games console;
Figure 5 schematically illustrates a change of view of a user of an HMD;
Figures 6a and 6b schematically illustrate HMDs with motion sensing;
Figure 7 schematically illustrates an HMD in communication with a server;
Figure 8 schematically illustrates an HMD system;
Figure 9 is a further schematic plan view of an HMD;
Figure 10 schematically illustrates a pair of eye tracking images;
Figure 11 schematically illustrates another technique for capturing eye tracking images;
Figure 12 schematically illustrates a backlit display element;
Figure 13 is a schematic front elevation of a backlight arrangement;
Figure 14 schematically illustrates a backlight control technique;
Figure 15 is a schematic flowchart for a backlight control technique;
Figure 16 is a schematic flowchart of a display control technique;
Figures 17 and 18 are schematic flowcharts of rendering control processes;
Figure 19 schematically illustrates a self-luminescent display;
Figure 20 schematically illustrates another type of display configuration for use in an HMD;
Figures 21 and 22 are schematic flowcharts of methods;
Figure 23 schematically illustrates a render pipeline and a display;
Figure 24 schematically illustrates an arrangement similar to that of Figure 23 but with a local memory; and
Figure 25 is a schematic flowchart of a method.
DESCRIPTION OF THE EMBODIMENTS
Referring now to Figure 1, an HMD 20 (as an example of a generic head-mountable apparatus) is wearable by a user. The HMD comprises a frame 40, in this example formed of a rear strap and an upper strap, and a display portion 50.
Note that the HMD of Figure 1 may comprise further features, to be described below in connection with other drawings, but which are not shown in Figure 1 for clarity of this initial explanation.
The HMD of Figure 1 completely (or at least substantially completely) obscures the user's view of the surrounding environment. All that the user can see is the pair of images displayed within the HMD, one image for each eye.
The HMD has associated headphone audio transducers or earpieces 60 which fit into the user's left and right ears. The earpieces 60 replay an audio signal provided from an external source, which may be the same as the video signal source which provides the video signal for display to the user's eyes.
The combination of the fact that the user can see only what is displayed by the HMD and, subject to the limitations of the noise blocking or active cancellation properties of the earpieces and associated electronics, can hear only what is provided via the earpieces, means that this HMD may be considered as a so-called “full immersion” HMD. Note however that in some embodiments the HMD is not a full immersion HMD, and may provide at least some facility for the user to see and/or hear the user’s surroundings. This could be by providing some degree of transparency or partial transparency in the display arrangements, and/or by projecting a view of the outside (captured using a camera, for example a camera mounted on the HMD) via the HMD’s displays, and/or by allowing the transmission of ambient sound past the earpieces and/or by providing a microphone to generate an input sound signal (for transmission to the earpieces) dependent upon the ambient sound. A front-facing camera 122 may capture images to the front of the HMD, in use. A Bluetooth® antenna 124 may provide communication facilities or may simply be arranged as a directional antenna to allow a detection of the direction of a nearby Bluetooth transmitter.
In operation, a video signal is provided for display by the HMD. This could be provided by an external video signal source 80 such as a video games machine or data processing apparatus (such as a personal computer), in which case the signals could be transmitted to the HMD by a wired or a wireless connection 82. Examples of suitable wireless connections include Bluetooth® connections. The external apparatus could communicate with a video server. Audio signals for the earpieces 60 can be carried by the same connection. Similarly, any control signals passed from the HMD to the video (audio) signal source may be carried by the same connection. Furthermore, a power supply 83 (including one or more batteries and/or being connectable to a mains power outlet) may be linked by a cable 84 to the HMD. Note that the power supply 83 and the video signal source 80 may be separate units or may be embodied as the same physical unit. There may be separate cables for power and video (and indeed for audio) signal supply, or these may be combined for carriage on a single cable (for example, using separate conductors, as in a USB cable, or in a similar way to a “power over Ethernet” arrangement in which data is carried as a balanced signal and power as direct current, over the same collection of physical wires). The video and/or audio signal may be carried by, for example, an optical fibre cable. In other embodiments, at least part of the functionality associated with generating image and/or audio signals for presentation to the user may be carried out by circuitry and/or processing forming part of the HMD itself. A power supply may be provided as part of the HMD itself.
Some embodiments of the disclosure are applicable to an HMD having at least one electrical and/or optical cable linking the HMD to another device, such as a power supply and/or a video (and/or audio) signal source. So, embodiments of the disclosure can include, for example: (a) an HMD having its own power supply (as part of the HMD arrangement) but a cabled connection to a video and/or audio signal source; (b) an HMD having a cabled connection to a power supply and to a video and/or audio signal source, embodied as a single physical cable or more than one physical cable; (c) an HMD having its own video and/or audio signal source (as part of the HMD arrangement) and a cabled connection to a power supply; or (d) an HMD having a wireless connection to a video and/or audio signal source and a cabled connection to a power supply.
Other examples may be entirely self-contained, for example being an HMD having its own power supply (such as a rechargeable or dry-cell power supply) and its own video source (or a wireless connection to an external video source). This provides an example of the HMD comprising a power supply which stores electrical energy to provide operating power to the display arrangement. In such examples, power consumption is a particularly relevant factor.
If one or more cables are used, the physical position at which the cable 82 and/or 84 enters or joins the HMD is not particularly important from a technical point of view. Aesthetically, and to avoid the cable(s) brushing the user’s face in operation, it would normally be the case that the cable(s) would enter or join the HMD at the side or back of the HMD (relative to the orientation of the user’s head when worn in normal operation). Accordingly, the position of the cables 82, 84 relative to the HMD in Figure 1 should be treated merely as a schematic representation.
Accordingly, the arrangement of Figure 1 provides an example of a head-mountable display system comprising a frame to be mounted onto an observer’s head, the frame defining one or two eye display positions which, in use, are positioned in front of a respective eye of the observer and a display element mounted with respect to each of the eye display positions, the display element providing a virtual image of a video display of a video signal from a video signal source to that eye of the observer.
Figure 1 shows just one example of an HMD. Other formats are possible: for example an HMD could use a frame more similar to that associated with conventional eyeglasses, namely a substantially horizontal leg extending back from the display portion to the top rear of the user's ear, possibly curling or diverting down behind the ear. In other (not full immersion) examples, the user's view of the external environment may not in fact be entirely obscured; the displayed images could be arranged so as to be superposed (from the user's point of view) over the external environment.
In the example of Figure 1, a separate respective display is provided for each of the user's eyes. A schematic plan view of how this is achieved is provided as Figure 2, which illustrates the positions 100 of the user's eyes and the relative position 110 of the user's nose. The display portion 50, in schematic form, comprises an exterior shield 120 to mask ambient light from the user's eyes and an internal shield 130 which prevents one eye from seeing the display intended for the other eye. The combination of the user's face, the exterior shield 120 and the interior shield 130 form two compartments 140, one for each eye. In each of the compartments there is provided a display element 150 and one or more optical elements 160. These can cooperate to display three dimensional or two dimensional content.
In some situations, an HMD may be used simply to view movies, or other video content or the like. If the video content is panoramic (which, for the purposes of this description, means that the video content extends beyond the displayable area of the HMD so that the viewer can, at any time, see only a portion but not all of the video content), or in other uses such as those associated with virtual reality (VR) or augmented reality (AR) systems, the user's viewpoint needs to track movements with respect to a real or virtual space in which the user is located. Arrangements to achieve this will be discussed with reference to Figures 5, 6a and 6b.
Figure 3 schematically illustrates a user wearing an HMD connected to a Sony® PlayStation 4® games console 300 as an example of a base device. The games console 300 is connected to a mains power supply 310 and (optionally) to a main display screen (not shown). A cable, acting as the cables 82, 84 discussed above (and so acting as both power supply and signal cables), links the HMD 20 to the games console 300 and is, for example, plugged into a USB socket 320 on the console 300. Note that in the present embodiments, a single physical cable is provided which fulfils the functions of the cables 82, 84. In Figure 3, the user is also shown holding a hand-held controller 330 which may be, for example, a Sony® Move® controller which communicates wirelessly with the games console 300 to control (or to contribute to the control of) operations relating to a currently executed program at the games console.
The video displays in the HMD 20 are arranged to display images provided via the games console 300, and the earpieces 60 in the HMD 20 are arranged to reproduce audio signals generated by the games console 300. The games console may be in communication with a video server. Note that if a USB type cable is used, these signals will be in digital form when they reach the HMD 20, such that the HMD 20 comprises a digital to analogue converter (DAC) to convert at least the audio signals back into an analogue form for reproduction.
Images from the camera 122 mounted on the HMD 20 are passed back to the games console 300 via the cable 82, 84. Similarly, if motion or other sensors are provided at the HMD 20, signals from those sensors may be at least partially processed at the HMD 20 and/or may be at least partially processed at the games console 300. The use and processing of such signals will be described further below.
The USB connection from the games console 300 also provides power to the HMD 20, according to the USB standard.
Figure 4 schematically illustrates a similar arrangement in which the games console is connected (by a wired or wireless link) to a so-called “break-out box” acting as a base or intermediate device 350, to which the HMD 20 is connected by a cabled link 82, 84. The break-out box has various functions in this regard. One function is to provide a location, near to the user, for some user controls relating to the operation of the HMD, such as (for example) one or more of a power control, a brightness control, an input source selector, a volume control and the like. Another function is to provide a local power supply for the HMD (if one is needed according to the embodiment being discussed). Another function is to provide a local cable anchoring point. In this last function, it is not envisaged that the break-out box 350 is fixed to the ground or to a piece of furniture; rather, instead of requiring a very long trailing cable from the games console 300, the break-out box provides a locally weighted point so that the cable 82, 84 linking the HMD 20 to the break-out box will tend to move around the position of the break-out box. This can improve user safety and comfort by avoiding the use of very long trailing cables.
It will be appreciated that the localisation of processing in the various techniques described in this application can be varied without changing the overall effect, given that an HMD may form part of a set or cohort of interconnected devices (that is to say, interconnected for the purposes of data or signal transfer, but not necessarily connected by a physical cable). So, processing which is described as taking place “at” one device, such as at the HMD, could be devolved to another device such as the games console (base device) or the break-out box. Processing tasks can be shared amongst devices. Source (for example, sensor) signals, on which the processing is to take place, could be distributed to another device, or the processing results from the processing of those source signals could be sent to another device, as required. So any references to processing taking place at a particular device should be understood in this context.
As mentioned above, in some uses of the HMD, such as those associated with panoramic video content viewing, virtual reality (VR) or augmented reality (AR) systems, the user's viewpoint needs to track movements with respect to a real or virtual space in which the user is located.
This tracking is carried out by detecting motion of the HMD and varying the apparent viewpoint of the displayed images so that the apparent viewpoint tracks the motion.
Figure 5 schematically illustrates the effect of a user head movement in a VR or AR system.
Referring to Figure 5, a virtual environment is represented by a (virtual) spherical or cylindrical or part-spherical shell 250 around a user. This provides an example of a virtual display screen (VDS). Because of the need to represent this arrangement on a two-dimensional paper drawing, the shell is represented by a part of a circle, at a distance from the user equivalent to the separation of the displayed virtual image from the user. A user is initially at a first position 260 and is directed towards a portion 270 of the virtual environment. It is this portion 270 which is represented in the images displayed on the display elements 150 of the user's HMD. It will be appreciated that the VDS subsists in three dimensional space (in a virtual sense) around the position in space of the HMD wearer, such that the HMD wearer sees a current portion of the VDS according to the HMD orientation.
Consider the situation in which the user then moves his head to a new position and/or orientation 280. In order to maintain the correct sense of the virtual reality or augmented reality display, the displayed portion of the virtual environment also moves so that, at the end of the movement, a new portion 290 of content is displayed by the HMD.
So, in this arrangement, the apparent viewpoint within the virtual environment moves with the head movement. If the head rotates to the right side, for example, as shown in Figure 5, the apparent viewpoint also moves to the right from the user's point of view. If the situation is considered from the aspect of a displayed object, such as a displayed object 300, this will effectively move in the opposite direction to the head movement. So, if the head movement is to the right, the apparent viewpoint moves to the right but an object such as the displayed object 300 which is stationary in the virtual environment will move towards the left of the displayed image and eventually will disappear off the left-hand side of the displayed image, for the simple reason that the displayed portion of the virtual environment has moved to the right whereas the displayed object 300 has not moved in the virtual environment. A detection of how the user moves his head while wearing the HMD, and therefore a detection of which is a current portion (such as 270, 290) of content to be displayed, can be carried out using one or more motion sensors.
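By way of illustration only (the helper names below are hypothetical; the patent does not prescribe any particular implementation), a minimal sketch of this behaviour for panoramic content crops the currently visible window out of a 360-degree frame according to the detected head yaw, so that an object which is stationary in the virtual environment drifts in the direction opposite to the head movement, as described above:

import numpy as np

def visible_portion(panorama: np.ndarray, yaw_deg: float,
                    fov_deg: float = 90.0) -> np.ndarray:
    """Return the horizontal slice of a 360-degree panorama centred on yaw_deg."""
    height, width, _ = panorama.shape
    centre = int((yaw_deg % 360.0) / 360.0 * width)
    half = int(fov_deg / 360.0 * width) // 2
    # Wrap around the seam of the panorama so any yaw angle is valid.
    cols = [(centre + offset) % width for offset in range(-half, half)]
    return panorama[:, cols]

# Turning the head to the right (increasing yaw) shifts the window right, so
# stationary scene content exits at the left of the displayed image.
frame = np.zeros((1080, 4096, 3), dtype=np.uint8)  # placeholder panoramic frame
view = visible_portion(frame, yaw_deg=30.0)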
Note that, using established audio processing techniques, a so-called 3D audio field can be created for the user by the earpieces 60. In this arrangement the direction in which the HMD wearer perceives sound to be coming from can be controlled by processing applied to the signals being emitted by the two earpieces 60. The panoramic video may have an associated sound field such that sounds are assigned to particular directions with respect to the underlying video content. As the user turns his head, the 3D sound field is varied (for example, by a processor at the HMD) so that the sounds remain aligned with the correct portion of the displayed video content whatever the viewer’s head direction. So, for example, if the viewer has his back to the main action in a piece of panoramic video content, the sound corresponding to that main action would be arranged, using the 3D audio field, to be perceived by the viewer to be coming from behind him.
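As a simple concrete example of this alignment (a hypothetical helper, not taken from the patent): if the sound field is defined by azimuths fixed to the content, the direction rendered for the listener is the content azimuth minus the current head yaw.

def relative_azimuth(source_azimuth_deg: float, head_yaw_deg: float) -> float:
    """Azimuth of a content-fixed sound source relative to the listener's head,
    wrapped to the range [-180, 180) degrees."""
    return (source_azimuth_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

# Main action at 0 degrees in the content, viewer facing away (yaw 180): the
# sound is rendered as coming from directly behind the viewer.
assert relative_azimuth(0.0, 180.0) == -180.0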
Figures 6a and 6b schematically illustrate HMDs with motion sensing. The two drawings are in a similar format to that shown in Figure 2. That is to say, the drawings are schematic plan views of an HMD, in which the display element 150 and optical elements 160 are represented by a simple box shape. Many features of Figure 2 are not shown, for clarity of the diagrams. Both drawings show examples of HMDs with a motion detector for detecting motion of the observer’s head.
In Figure 6a, a forward-facing camera 322 is provided on the front of the HMD. This may be the same camera as the camera 122 discussed above, or may be an additional camera. This does not necessarily provide images for display to the user (although it could do so, for example in an augmented reality arrangement). Instead, its primary purpose in the present embodiments is to allow motion sensing. A technique for using images captured by the camera 322 for motion sensing may include so-called optical flow detection, in which a motion is detected by detecting differences between successively captured images of the environment surrounding the camera 322. In these arrangements, the motion detector comprises a camera mounted so as to move with the frame; and an image comparator operable to compare successive images captured by the camera so as to detect inter-image motion.
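A hedged sketch of such optical-flow-based motion sensing follows, using OpenCV's Farneback dense flow as one established technique; the patent does not mandate any particular flow algorithm, and collapsing the flow field into a single mean head-motion estimate is a simplifying assumption.

import cv2
import numpy as np

def estimate_head_motion(prev_gray: np.ndarray, curr_gray: np.ndarray):
    """Estimate (dx, dy) head motion in pixels from two successive greyscale frames."""
    # Arguments after the two frames: output flow, pyramid scale, pyramid levels,
    # window size, iterations, polynomial neighbourhood size, sigma, flags.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Approximate global (camera) motion by the mean flow vector; the sign is
    # inverted because the scene appears to move opposite to the head.
    return -flow[..., 0].mean(), -flow[..., 1].mean()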
Figure 6b makes use of a hardware motion detector 332. This can be mounted anywhere within or on the HMD. Examples of suitable hardware motion detectors are piezoelectric accelerometers or optical fibre gyroscopes. It will of course be appreciated that both hardware motion detection and camera-based motion detection can be used in the same device, in which case one sensing arrangement could be used as a backup when the other one is unavailable, or one sensing arrangement (such as the camera) could provide data for changing the apparent viewpoint of the displayed images, whereas the other (such as an accelerometer) could provide data for image stabilisation.
Figure 6b also shows an example of an eye tracking camera 324 disposed in each of the compartments 140. This can be used (by established techniques) to detect a direction of gaze of the wearer with respect to the currently displayed image, and therefore to detect what the user is looking at within a currently displayed image. The camera 324 can also be used to provide a detection of whether one or both of the user’s eyes are shut.
Figures 6a and 6b therefore provide examples of a head mountable display (HMD) comprising: a display arrangement; an orientation detector configured to detect an orientation of the HMD; and a video processor configured to generate images for display by the display arrangement in dependence upon the detected current orientation. As discussed below, the video may be panoramic video content sent to the HMD by a video server.
As mentioned above, the HMD can receive video content such as streaming video content for display from a server. Figure 7 schematically illustrates an HMD system comprising an HMD 400 and a server 410 which are connected or associated together for data transfer by a data connection 420. For the purposes of the present discussion, the HMD 400 may be of the type described as the HMD 20 above. The HMD may be associated with a console or other processing unit such as the console 300 shown in Figures 3 and 4, and/or a device such as a break-out box 350 as shown in Figure 4. Such devices may act as the server or may provide at least a part of a communications path for data communications with the server. Alternatively, the HMD 400 may be a self-contained head-mountable unit capable of communicating with the server without the need for external apparatus. Or, just a battery or power supply may be provided externally to the HMD itself. All of these different options are encompassed within the generic illustration of the HMD 400 in Figure 7.
The server 410 may be located remotely from the HMD 400, for example at a data centre or even in another country to the HMD 400. The data connection 420 may be, for example, an internet connection and may include wired and/or wireless connections. The HMD 400 (encompassing the various options just described) will be referred to as an “HMD client”. In this regard, the HMD client is acting as a client of the server 410. Other HMD clients (not shown in Figure 7) may be associated with the server 410 at the same time as the HMD client 400. In this way, the server 410 can potentially provide video content to multiple HMD clients at the same time.
Figure 8 schematically illustrates an HMD system, employing a control technique according to eye orientation and/or eyes shut detection.
Referring to Figure 8, a detector 800 is operable to detect eye orientation and/or whether one or both eyes are shut. The detector 800 comprises one or two eye tracking cameras of the type described above, along with processing to carry out eye orientation and/or eyes shut detection (for example, according to a technique to be discussed below). The output of the detector 800 comprises a set of data defining any or all of: (a) the current absolute coordinates and/or position and/or orientation of one or each pupil (relative to a frame of reference defined relative to the HMD), (b) deviations of the pupil position(s) from a nominal or expected pupil position and/or orientation, (c) whether each eye is shut.
From the output of the detector 800, a controller 810 carries out various functions. In some examples, the controller 810 detects the image location, with respect to the images being displayed by the HMD, which the user is looking at. The controller 810 can make use of calibration data obtained during a calibration stage (for example, when the user is asked, or prompted by the use of particular image content, to look straight ahead with respect to the HMD), in order to provide a mapping between pupil position and image position being looked at. In other examples (as an alternative or in addition) the controller 810 detects that one or both of the user’s eyes are shut and generates control data indicative of this.
An image renderer 820 renders an image for display by the HMD, for example in dependence upon a generated image provided by an image generator 830. The rendering can be dependent upon the control data provided by the controller 810. The control data can also be provided to the display or a display driver in examples to be discussed below.
Figure 9 is a further schematic plan view of an HMD similar to the plan view of Figure 2 described above. Features already described in connection with that Figure will not be described again. A pair of eye-tracking cameras 500, 510 (acting as at least a part of the detector 800) are provided within the compartments corresponding to each eye so as to generate images of the wearer's eyes in use. Accordingly, the eye-tracking cameras 500, 510 are directed in a backwards direction relative to the orientation of the user's head, so that they look back at the user's eyes. Note that the cameras 500, 510 in Figure 9 can be disposed anywhere with respect to the compartments corresponding to each eye, as long as they do not obscure the user's view of the displayed images; they are just shown in the outer corners by way of one schematic example.
To illuminate the user's eye, the cameras 500, 510 can rely on illumination provided by the displayed images within the HMD or, if that is insufficient, on infrared or other illumination directed towards the user's eyes.
Figure 10 schematically illustrates a pair of eye tracking images as captured by the cameras 500, 510. In particular, Figure 10 shows an image 520 of the user's left eye and an image 530 of the user's right eye. In general, these can be captured as separate images but are shown alongside one another in Figure 10 for the purposes of this explanation. A significant feature to be derived from the captured images 520, 530 is the current location of the pupils of the wearer's eyes. Note that the cameras 500, 510 are mounted in a fixed relationship relative to the frame of the HMD, and that, in use, the HMD adopts a fixed relationship to the user's head. Accordingly, from the image position of the pupils within the captured images 520, 530, the position and/or orientation of each pupil relative to the HMD, and therefore relative to the display elements of the HMD, can be directly established. The point or region of the display which the user is looking at can therefore be detected by applying the detected position and/or orientation to the known geometry of the HMD.
The position of the user’s pupils can then be compared (by the detector 800) with a nominal or calibration position and/or orientation which can be established in a calibration phase of operation. This calibration could be carried out with the user’s knowledge (for example, by asking the user to look at different parts of the displayed image in turn, including a central image position) or without actively asking the user (for example, by placing useful or interesting information or image content at various image positions, for example including a central image position, at various stages in normal operation). A first stage in detecting the current pupil position is to detect the extent of the iris or coloured portion 540. Then, using known image processing techniques, a central region of the iris is scanned to establish the upper 542 and lower 544 bounds of the pupil (a dark area within the iris) to allow the vertical centre 546 of the pupil to be detected. As a crosscheck, the system establishes whether the vertical centre 546 is also the approximate vertical centre of the iris 540.
To establish the horizontal centre of the pupil, a horizontal scan of the captured image 520, 530 is carried out at the vertical position indicated by the vertical centre 546. This gives left 550 and right 552 boundaries of the pupil, from which a horizontal centre 554 of the pupil can be established.
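The scanning procedure just described can be summarised in the following simplified sketch, which assumes a single-channel eye image, a pre-detected iris extent, and a fixed darkness threshold for pupil pixels (the threshold value is an assumption, not taken from the patent):

import numpy as np

PUPIL_THRESHOLD = 40  # grey level below which a pixel is treated as pupil (assumed)

def pupil_centre(eye_image: np.ndarray, iris_left: int, iris_right: int):
    """Return (x, y) of the pupil centre, or None if no pupil is found."""
    column = (iris_left + iris_right) // 2
    dark_rows = np.where(eye_image[:, column] < PUPIL_THRESHOLD)[0]
    if dark_rows.size == 0:
        return None                                  # no pupil: eye possibly shut
    upper, lower = dark_rows[0], dark_rows[-1]       # bounds 542 and 544
    v_centre = (upper + lower) // 2                  # vertical centre 546
    dark_cols = np.where(eye_image[v_centre, :] < PUPIL_THRESHOLD)[0]
    if dark_cols.size == 0:
        return None
    left, right = dark_cols[0], dark_cols[-1]        # boundaries 550 and 552
    h_centre = (left + right) // 2                   # horizontal centre 554
    return h_centre, v_centre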
The horizontal and vertical centres of each pupil, as established by the techniques discussed above, provide a set of coordinates of the pupil position. This detection can be carried out frequently. In embodiments of the invention, the detection is carried out at least as frequently as the image display rate of the display elements of the HMD, so that a next image to be displayed by the HMD can be rendered using techniques to be described below according to the detected pupil positions found by a detection process carried out immediately before the display of that image. In other arrangements, the eye position may be detected less frequently than the image display frequency, but a most-recently detected eye position is used in the rendering of each image for display.
As discussed above, for a particular user, a nominal or central eye position may be detected. One way to detect this is for the HMD to display material (such as an instruction which the user has to read) at a central position in the HMD display, and for the eye position detection arrangement to detect the eye position at the time that the user is viewing that centrally displayed information. Similarly, calibration can be carried out to detect the user's eye position when the user is viewing information at various extreme positions (such as each corner) of the HMD display. This calibration can be carried out without the user necessarily knowing, simply by providing information for the user to read at different positions in the HMD display. From the calibration, a mapping between detected eye position and region of the HMD display which the user is viewing can be generated.
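As one possible realisation of such a mapping (an assumption; the patent leaves the fitting method open), a least-squares affine fit from calibration pairs of pupil coordinates and known display positions could be used:

import numpy as np

def fit_gaze_mapping(pupil_points: np.ndarray, display_points: np.ndarray):
    """Fit display_xy = [pupil_x, pupil_y, 1] @ coeffs from N >= 3 calibration pairs."""
    ones = np.ones((pupil_points.shape[0], 1))
    design = np.hstack([pupil_points, ones])          # N x 3
    coeffs, *_ = np.linalg.lstsq(design, display_points, rcond=None)
    return coeffs                                     # 3 x 2

def gaze_position(coeffs: np.ndarray, pupil_xy) -> np.ndarray:
    """Map a detected pupil position to the display position being viewed."""
    px, py = pupil_xy
    return np.array([px, py, 1.0]) @ coeffs

# Example: five calibration targets (centre plus the four corners).
pupils = np.array([[320.0, 240], [100, 80], [540, 80], [100, 400], [540, 400]])
targets = np.array([[960.0, 540], [0, 0], [1920, 0], [0, 1080], [1920, 1080]])
mapping = fit_gaze_mapping(pupils, targets)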
Further information can be derived from the images captured by the cameras 500, 510. In particular, a detection of whether the eyes are shut (or one eye shut while the other is open) can be obtained by the lack of a detection of an iris or pupil at or near to the expected position.
In the arrangement of Figure 9, there was some flexibility in the positioning of the cameras 500, 510 except that they could not obscure the user's view of the display elements of the HMD. A different arrangement is illustrated schematically in Figure 11. A user's eye 560 views a display element 570 through an optical system shown schematically by a lens 580 but which (as discussed above) may include other optical elements. An eye tracking camera 590 is disposed so as to capture images of the user's eye 560 via at least a part of the optical system 580. So, where the optical system 580 comprises one or more lenses or optical elements, the camera 590 is disposed so as to capture images of the user's eye through one or more of those lenses or optical elements. Similarly, if the optical system 580 comprises one or more reflectors, the camera 590 may be disposed so as to capture images of the user’s eye via one or more of the reflectors. In the example shown in Figure 11, the camera 590 is disposed alongside the display element 570, but in other embodiments the camera 590 could be positioned within a compound optical system 580.
In some embodiments, the position of only one eye and its associated pupil is tracked, so that only one of the eye tracking cameras is required. This is on the basis that, for most people, movements of one eye map directly to corresponding movements of the other eye. However, a feature of using two cameras is that the results for the pair of eyes can be checked against one another, and that individual detections can be obtained for each of the user's eyes.
Figure 12 schematically illustrates (in a schematic end elevation) a backlit display element, comprising the display element 1200 itself, which allows light from the backlight to pass or to be obscured according to the state of pixel data 1210, a backlight 1220 and a backlight driver 1230. For example, the arrangement of Figure 12 can provide the entirety of the display applicable to one eye of the HMD arrangements discussed above.
The backlight 1220 can be a single backlight covering the whole of the display element 1200, so that any variation in backlight illumination applies to the whole of the display element. Or the backlight can be arranged as multiple sections or regions such as regions 1300 as shown in a schematic front elevation of Figure 13. In examples, the regions 1300 are contiguous (which is to say, there are no gaps in the illumination within the body of the display element). The regions 1300 may be the same size and shape as one another, or could be different sizes and shapes. For example, a central region could be larger than peripheral regions. Under the control of the backlight driver 1230, the illumination provided by the backlight 1220 and/or by individual ones of the regions 1300 is controllable. In examples, the illumination is controllable between at least two levels: a normal operation mode and a lower power consumption mode. The illumination provided by a backlight region is lower in the lower power consumption mode than in the normal operation mode, and similarly the power consumption by that backlight region is lower in the lower power consumption mode than in the normal operation mode.
Accordingly the display element 1200 and backlight 1220 provide an example of a display arrangement formed as one or more display regions operable in a normal operation mode and in a lower power consumption mode.
In general terms, there can be multiple power levels, or a continuum of power levels. The “lower power consumption mode” can therefore be considered as a generic term for a mode of operation in which less than the “normal operation mode” illumination is provided and less than the “normal operation mode” power is consumed.
Note that the luminance of the displayed image, as seen from the user’s viewpoint, depends not only on the backlight illumination but also on the pixel data 1210. The pixel data defines the proportion of light from the backlight, and its colour, which the display element 1200 passes. So, pixel data representing darker pixels, or a darker (lower illumination) backlight, or both, will result in a lower luminance image at the user’s viewpoint.
In examples, the illumination provided by the backlight regions can be varied under the control of the backlight driver in dependence upon control data provided by the controller 810 of Figure 8. This can be carried out, for example, so that a display region (for example, that part of the display element 1200 illuminated by a respective backlight region 1300) which a user’s eye is detected to be currently viewing operates in the normal operation mode and a display region which a user’s eye is detected to be not currently viewing operates in the lower power consumption mode. In other words, the backlight illumination at a backlight region 1300 corresponding to a part of the display which the user is not looking at is reduced in comparison with any display regions which the user is detected to be looking at.
In the case of gaze detection, there would normally be some position on the display element 1200 which the user is detected to be looking at. Figure 14 provides an example, illustrating a backlight control technique, in which a user’s gaze is detected to be directed at a position 1400. The backlight region 1410 encompassing the position 1400 is caused (by the backlight driver 1230 operating under the control of the controller 810) to operate in the normal operation mode. Also, backlight regions 1420 within a threshold deviation (a lateral distance or an angular deviation) of the detected gaze position are also caused to operate in the normal operation mode. Other backlight regions 1430 outside the threshold deviation are caused to operate in the lower power consumption mode. This may imply that the backlight illumination at the regions 1430 is uniformly lower than the backlight illumination at the other regions (which is to say, the illumination at each of the regions 1430 is the same). Or in other examples, each of the regions 1430 could operate in a lower power consumption mode, but with regions (amongst those in the set of regions 1430) which are closer to the position 1400 having a higher illumination than regions (amongst the set of regions 1430) further from the position 1400. Note that all regions apart from the region 1410 could be included in the set 1430, with a reduction in illumination varying (for example, linearly varying) with distance (expressed in distance units or in numbers of regions) from the point 1400.
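A sketch of this region-dimming policy follows; the grid layout, power scale and falloff rate are assumptions chosen for illustration, not values taken from the patent:

import numpy as np

NORMAL_LEVEL = 1.0   # normal operation mode illumination (assumed scale 0..1)
MIN_LEVEL = 0.2      # floor for the lower power consumption mode (assumed)

def region_levels(grid_shape, gaze_region, threshold=1.0, falloff=0.25):
    """Return per-region backlight levels, indexed [row, col] over the grid."""
    rows, cols = np.indices(grid_shape)
    # Distance from each backlight region to the region containing the gaze
    # point, expressed in numbers of regions as suggested in the text.
    distance = np.hypot(rows - gaze_region[0], cols - gaze_region[1])
    # Full illumination within the threshold deviation; linear reduction beyond.
    levels = NORMAL_LEVEL - falloff * np.maximum(distance - threshold, 0.0)
    return np.clip(levels, MIN_LEVEL, NORMAL_LEVEL)

# Gaze at region (1, 2) of a 4 x 6 grid: that region and its neighbours within
# one region's distance stay at full illumination, like regions 1410 and 1420.
print(region_levels((4, 6), gaze_region=(1, 2)))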
Therefore, the detector 800 is an example of a detector configured to detect whether one or both of a user’s eyes are currently viewing a display region. For example, the detector can be configured to detect a current gaze direction of one or both of the user’s eyes and/or to detect whether one or both of the user’s eyes are currently shut. The controller 810 is an example of a controller configured to control a display region which a user’s eye is detected to be currently viewing to operate in the normal operation mode and to control a display region which a user’s eye is detected to be not currently viewing to operate in the lower power consumption mode. In examples, the controller is configured to detect, as a display region which the user is currently viewing, those display regions in a current detected gaze direction and within a threshold deviation from the current detected gaze direction. In examples, the controller is configured to detect, as a display region which the user is not currently viewing, those display regions configured to be viewed by an eye of the user which is detected to be currently shut.
In the case of a backlit or self-luminescent display, in examples the controller is configured to vary an illumination of the one or more display regions so that the illumination is greater in the normal operation mode than in the lower power consumption mode. For example, as shown in Figure 13, the display arrangement comprises one or more display backlights, in which the controller is configured to vary an illumination of the one or more display backlights.
This process is illustrated schematically by the flowchart of Figure 15. At a step 1500, the user’s gaze direction is detected. At a step 1510, the backlight illumination is adjusted so that at least a backlight region which the user is detected to be looking at is operated in the normal operation mode, and at least another backlight region is operated in a lower power consumption mode. A low pass filtering arrangement can be used by the controller 810 and/or the backlight driver 1230 so as to inhibit sudden large changes in backlight illumination by region, in an instance where the user’s gaze is detected to move rapidly from one region to another region.
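One way to realise such low pass filtering (an assumption; the patent calls only for filtering, not this specific form) is a first-order exponential smoother applied each frame to the per-region target levels:

import numpy as np

class BacklightSmoother:
    def __init__(self, grid_shape, alpha: float = 0.1):
        self.levels = np.full(grid_shape, 1.0)  # start in the normal operation mode
        self.alpha = alpha                      # smaller alpha gives slower changes

    def update(self, target_levels: np.ndarray) -> np.ndarray:
        """Move the commanded levels a fraction of the way towards the targets,
        inhibiting sudden large region-by-region changes on rapid gaze movement."""
        self.levels += self.alpha * (target_levels - self.levels)
        return self.levels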
In the case of an eyes-shut detection (which can be used in addition to or instead of the gaze detection discussed above), the whole display can be transitioned to a lower power consumption mode when the relevant one of the user’s eyes is detected to be shut. For example, in the case of an HMD having separate respective displays for the two eyes, an individual display can be moved to a lower power consumption mode in response to that eye being detected as being shut. Or the system can act only on a detection that both eyes are shut (that is, a blink), to dim (move temporarily to a lower power consumption mode) both displays.
Figure 16 is a schematic flowchart illustrating an example of such a technique. At a step 1600, the detector 800 detects that both of the user’s eyes are shut. At a step 1610 the controller 810 sends control data to instruct the backlight driver 1230 to reduce the illumination of the backlights (that is, all of the backlight regions, if there are multiple such regions) acting on the displays for the two eyes. This moves the displays to a lower power mode which could correspond to zero illumination or, in examples, to a reduced illumination level. At a step 1620 the controller 810 either detects (from the detector 800) that an eye has opened or detects a timer (for example, provided by the controller 810) reaching a threshold time (a timeout) and, at a step 1630, causes the normal operation mode illumination to be restored.
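The logic of steps 1600 to 1630 might be expressed as follows; the timeout and illumination values are assumptions chosen for illustration:

import time
from typing import Optional

BLINK_LEVEL = 0.0      # lower power mode during a blink (could be non-zero)
TIMEOUT_SECONDS = 5.0  # assumed timeout before illumination is restored anyway

def backlight_level(both_eyes_shut: bool, shut_since: Optional[float]) -> float:
    """Return the commanded illumination level for both displays."""
    if not both_eyes_shut:
        return 1.0      # step 1630: an eye has opened, restore normal mode
    if shut_since is not None and time.monotonic() - shut_since > TIMEOUT_SECONDS:
        return 1.0      # step 1620: the timeout has been reached
    return BLINK_LEVEL  # step 1610: reduce the illumination of both backlights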
As well as, or instead of, dimming the display or regions of the display, portions of or all of the rendered image could be inhibited from being rendered, in response to a detection that either (or both of) the user is not looking at a portion of the display or the user’s eye or eyes are shut. In these examples, the controller is configured to control rendering of an image portion for display by a display region which a user’s eye is detected to be currently viewing, and (as discussed below) to control display of stored data representing that image portion for other image regions.
Figure 17 is a schematic flowchart illustrating a rendering process (at a step 1710) initiated by the controller 810 instructing the image renderer 820 to do so, in response to a detection by the detector 800 at a step 1700 that both eyes are open.
Figure 18 is a schematic flowchart illustrating a rendering process applied to portions or regions of the image (at a step 1810) initiated by the controller 810 instructing the image renderer 820 to do so, in response to a detection by the detector at a step 1800 that the user is looking at that portion or region. As with the change in illumination discussed with respect to Figure 14, the image render process could be applied to portions of the image up to a threshold deviation away from the current gaze position. As discussed below, other portions of the displayed image could be provided by stored data.
By using these techniques, subjective user disturbance can be reduced and/or image artefacts can be masked by applying a render process to appropriate image portions.
Not all displays have backlights. Some types of display are themselves emissive, for example electroluminescent displays such as an OLED (organic light emitting diode) display. Figure 19 schematically illustrates such a display 1900 in end elevation, with a display driver 1910. The techniques discussed above can be used in connection with a self-emissive display of this type. The brightness and instantaneous power consumption depend on pixel data which in turn indicate how brightly each pixel should be displayed. The display driver 1910 is responsive to pixel data and also to control data from the controller 810 to vary the illumination of individual pixels or regions of pixels (for example following a similar pattern to the regions 1300 shown in Figure 13) so as to reduce the power consumption and illumination of pixels or regions of pixels which the user is not looking at (which are separated from the current gaze direction or which are active during a period when the user’s eye or eyes are shut).
Accordingly the display 1900 provides an example of a display arrangement formed as one or more display regions operable in a normal operation mode and in a lower power consumption mode.
Various alternatives are possible. For example, an alternative arrangement is shown in Figure 20. This arrangement may be used where it is desired that the user's view of the external environment is not entirely obscured. However, it is also applicable to HMDs in which the user's external view is wholly obscured. In the arrangement of Figure 20, a display element 1150 and optical elements 1200 cooperate to provide an image which is projected onto a mirror 1210, which deflects the image towards the user's eye position 1220. The user perceives a virtual image to be located at a position 1230 which is in front of the user and at a suitable distance from the user. A similar arrangement (not shown) may be used for the other eye.
In the case of an HMD in which the user's view of the external surroundings is entirely obscured, the mirror 1210 can be a substantially 100% reflective mirror. The arrangement of Figure 20 then has the advantage that the display element and optical elements can be located closer to the centre of gravity of the user's head and to the side of the user's eyes, which can produce a less bulky HMD for the user to wear. Alternatively, if the HMD is designed not to completely obscure the user's view of the external environment, the mirror 1210 can be made partially reflective so that the user sees the external environment, through the mirror 1210, with the virtual image superposed over the real external environment.
The arrangement of Figure 20 may be used with backlit or self-luminescent displays as the display elements 1150, and all of the techniques discussed above are applicable to the arrangement of Figure 20.
The detector 800 and controller 810 can cooperate to provide an example of a control system for a head mountable display (HMD) having a display arrangement formed as one or more display regions operable in a normal operation mode and in a lower power consumption mode.
Figure 21 is a schematic flowchart illustrating a method of operation of a head mountable display (HMD) having a display arrangement formed as one or more display regions operable in a normal operation mode and in a lower power consumption mode; the method comprising: detecting (at a step 2100) whether one or both of a user’s eyes are currently viewing a display region; controlling (at a step 2110) a display region which a user’s eye is detected to be currently viewing to operate in the normal operation mode; and controlling (at a step 2120) a display region which a user’s eye is detected to be not currently viewing to operate in the lower power consumption mode.
Figure 22 is a schematic flowchart illustrating a method of operation of a head mountable display (HMD) having a display arrangement formed as one or more display regions; the method comprising: detecting (at a step 2200) whether one or both of a user’s eyes are currently viewing a display region; and controlling (at a step 2210) rendering of an image portion for display by a display region which a user’s eye is detected to be currently viewing, and controlling display of stored data representing that image portion for other image regions.
Figure 23 schematically illustrates a render pipeline 2300 and a display 2310. The render pipeline is an example of a display controller or driver (or of the image renderer 820 described above) providing images for display by the display 2310. An issue which can arise is that the render pipeline provides images for display at the refresh rate of the display 2310, for example 60Hz or 120Hz, which can lead to unnecessarily high power consumption in respect of display areas which the user is not actually looking at.
Figure 24 schematically illustrates an arrangement similar to that of Figure 23 but with a local memory 2420 associated with a display 2410 and a render pipeline 2400. A detector/controller 2430 controls aspects of the operations of the other components.
The local memory is associated with the display and is configured to selectively store image data locally at the display, avoiding the need for the render pipeline to provide it at the refresh rate. The locally stored image data is selectively used in place of rendered image data. This can save power consumption at the render pipeline by selectively avoiding the need to render image data for at least some image regions.
Accordingly, the arrangement of Figure 24 is operable in at least two ways:
In a first mode of operation, the detector/controller 2430 detects that still images are being displayed, which is to say, an image or a portion of an image has not changed (or has changed by less than a threshold change amount) over a threshold number of successive images. In examples, this is detected with respect to a processing stage such as a game engine which precedes the render pipeline in the signal flow. In such instances, the detector/controller 2430 controls the image data for that image or image portion to be stored locally in the local memory, and the render pipeline is temporarily disabled from producing that image or image portion. The render pipeline operation is restarted when a change in that image or image portion is detected. This example arrangement can operate in conjunction with any of the embodiments discussed above, or with the other example embodiment to be discussed below.
In a second mode of operation, the detector/controller 2430 detects portions of the displayed images which the user is not looking at, using the techniques discussed above. For those image portions, the detector/controller 2430 controls the display to store and use locally stored still image data and controls the render pipeline not to render image information for display at those portions.
In either mode of operation, the local memory 2420 needs to store only the image data relating to image regions not currently being rendered, although in other examples the whole image could be stored in the local memory 2420 and just the relevant parts of it used by the display 2410.
Figure 25 is a schematic flowchart of a method relating to an example implementation of the second mode of operation just discussed. At a step 2500, the detector/controller detects portions of the displayed images which the user is not looking at, using the techniques discussed above. For those image portions, at a step 2510 the detector/controller controls the display to store and use locally stored still image data and controls the render pipeline not to render image information for display at those portions.
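Steps 2500 and 2510 might be expressed in code along the following lines, with gaze_region standing in for the detector output and a plain dictionary standing in for the local memory; all names here are assumptions made for the example.

```python
def second_mode_frame(gaze_region, all_regions, render, stored):
    """One display refresh under the second mode: 'gaze_region' is the
    region the detector reports as viewed (step 2500); 'stored' is a
    dict standing in for the display's local memory (step 2510)."""
    frame = {}
    for region_id in all_regions:
        if region_id == gaze_region:
            frame[region_id] = render(region_id)  # fresh pixels here only
            stored[region_id] = frame[region_id]  # refresh the local copy
        else:
            frame[region_id] = stored.get(region_id)  # no render call made
    return frame


# Example: the user looks at the centre region; left and right regions
# reuse whatever image data was last stored locally.
stored = {"left": "L0", "centre": "C0", "right": "R0"}
frame = second_mode_frame("centre", ["left", "centre", "right"],
                          render=lambda r: f"{r}-fresh", stored=stored)
assert frame == {"left": "L0", "centre": "centre-fresh", "right": "R0"}
```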
The use (at the step 2510) of locally stored information is another example of a lower power mode as discussed above. The selection between local storage and rendering can be carried out region-by-region as discussed above.
These arrangements can be applied alone or in conjunction with other embodiments described in this application.
It will be apparent that numerous other modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the disclosure may be practised otherwise than as specifically described herein.

Claims (19)

1. A head mountable display (HMD) comprising: a display arrangement formed as one or more display regions operable in a normal operation mode and in a lower power consumption mode; a detector configured to detect whether one or both of a user’s eyes are currently viewing a display region; and a controller configured to control a display region which a user’s eye is detected to be currently viewing to operate in the normal operation mode and to control a display region which a user’s eye is detected to be not currently viewing to operate in the lower power consumption mode.
2. An HMD according to claim 1, in which the controller is configured to vary an illumination of the one or more display regions so that the illumination is greater in the normal operation mode than in the lower power consumption mode.
3. An HMD according to claim 2, in which the display arrangement comprises one or more display backlights, in which the controller is configured to vary an illumination of the one or more display backlights.
4. An HMD according to any one of the preceding claims, in which the detector is configured to detect a current gaze direction of one or both of the user’s eyes.
5. An HMD according to claim 4, in which the controller is configured to detect, as a display region which the user is currently viewing, those display regions in a current detected gaze direction and within a threshold deviation from the current detected gaze direction.
6. An HMD according to any one of the preceding claims, in which the detector is configured to detect whether one or both of the user’s eyes are currently shut.
7. An HMD according to claim 6, in which the controller is configured to detect, as a display region which the user is not currently viewing, those display regions configured to be viewed by an eye of the user which is detected to be currently shut.
8. An HMD according to any one of the preceding claims, in which the controller is configured, in the normal operation mode, to control rendering of an image portion for display by a display region, and, in the lower power consumption mode, to control display of stored data representing that image portion for other image regions.
9. An HMD according to any one of the preceding claims, in which the HMD comprises a power supply which stores electrical energy to provide operating power to the display arrangement.
10. A control system for a head mountable display (HMD) having a display arrangement formed as one or more display regions operable in a normal operation mode and in a lower power consumption mode; the control system comprising: a detector configured to detect whether one or both of a user’s eyes are currently viewing a display region; and a controller configured to control a display region which a user’s eye is detected to be currently viewing to operate in the normal operation mode and to control a display region which a user’s eye is detected to be not currently viewing to operate in the lower power consumption mode.
11. A head mountable display (HMD) comprising: a display arrangement formed as one or more display regions; a detector configured to detect whether one or both of a user’s eyes are currently viewing a display region; and a controller configured to control rendering of an image portion for display by a display region which a user’s eye is detected to be currently viewing, and to control display of stored data representing that image portion for other image regions.
12. A control system for a head mountable display (HMD) having a display arrangement formed as one or more display regions; the control system comprising: a detector configured to detect whether one or both of a user’s eyes are currently viewing a display region; and a controller configured to control rendering of an image portion for display by a display region which a user’s eye is detected to be currently viewing, and to control display of stored data representing that image portion for other image regions.
13. A head mountable display substantially as hereinbefore described with reference to the accompanying drawings.
14. A control system for a head mountable display, the control system being substantially as hereinbefore described with reference to the accompanying drawings.
15. A method of operation of a head mountable display (HMD) having a display arrangement formed as one or more display regions operable in a normal operation mode and in a lower power consumption mode; the method comprising: detecting whether one or both of a user’s eyes are currently viewing a display region; controlling a display region which a user’s eye is detected to be currently viewing to operate in the normal operation mode; and controlling a display region which a user’s eye is detected to be not currently viewing to operate in the lower power consumption mode.
16. A method of operation of a head mountable display (HMD) having a display arrangement formed as one or more display regions; the method comprising: detecting whether one or both of a user’s eyes are currently viewing a display region; and controlling rendering of an image portion for display by a display region which a user’s eye is detected to be currently viewing, and controlling display of stored data representing that image portion for other image regions.
17. A method of operation of a head mountable display, the method being substantially as hereinbefore described with reference to the accompanying drawings.
18. Computer software which, when executed by a computer, causes the computer to carry out the method of any one of claims 15 to 17.
19. A non-transitory machine-readable storage medium which stores computer software according to claim 18.
GB1604179.0A 2016-03-11 2016-03-11 Head-mountable display Active GB2548151B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1604179.0A GB2548151B (en) 2016-03-11 2016-03-11 Head-mountable display
PCT/GB2017/050663 WO2017153778A1 (en) 2016-03-11 2017-03-10 Head-mountable display

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1604179.0A GB2548151B (en) 2016-03-11 2016-03-11 Head-mountable display

Publications (3)

Publication Number Publication Date
GB201604179D0 GB201604179D0 (en) 2016-04-27
GB2548151A true GB2548151A (en) 2017-09-13
GB2548151B GB2548151B (en) 2020-02-19

Family

ID=55952182

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1604179.0A Active GB2548151B (en) 2016-03-11 2016-03-11 Head-mountable display

Country Status (2)

Country Link
GB (1) GB2548151B (en)
WO (1) WO2017153778A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019246044A1 (en) * 2018-06-18 2019-12-26 Magic Leap, Inc. Head-mounted display systems with power saving functionality

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5635948A (en) * 1994-04-22 1997-06-03 Canon Kabushiki Kaisha Display apparatus provided with use-state detecting unit
US6124976A (en) * 1998-03-17 2000-09-26 Sony Corporation Voltage controlling method for head mounted display unit and head mounted display apparatus
US20100056274A1 (en) * 2008-08-28 2010-03-04 Nokia Corporation Visual cognition aware display and visual data transmission architecture
WO2012078207A1 (en) * 2010-12-08 2012-06-14 Sony Computer Entertainment Inc. Adaptive displays using gaze tracking
US20120242570A1 (en) * 2011-03-24 2012-09-27 Seiko Epson Corporation Device, head mounted display, control method of device and control method of head mounted display
EP2634617A1 (en) * 2012-02-29 2013-09-04 Recon Instruments Inc. Modular heads-up display systems
WO2014197226A1 (en) * 2013-06-07 2014-12-11 Sony Computer Entertainment Inc. Image rendering responsive to user actions in head mounted display
WO2016014608A1 (en) * 2014-07-25 2016-01-28 Microsoft Technology Licensing, Llc Eyelid movement as user input

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2013080444A1 (en) * 2011-11-29 2015-04-27 パナソニックIpマネジメント株式会社 Display control apparatus, display control method, and display control program

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2566013A (en) * 2017-08-24 2019-03-06 Displaylink Uk Ltd Compressing image data for transmission to a display
GB2566013B (en) * 2017-08-24 2022-12-07 Displaylink Uk Ltd Compressing image data for transmission to a display
GB2607455A (en) * 2017-08-24 2022-12-07 Displaylink Uk Ltd Compressing image data for transmission to a display
GB2607455B (en) * 2017-08-24 2023-02-15 Displaylink Uk Ltd Compressing image data for transmission to a display
GB2571300A (en) * 2018-02-23 2019-08-28 Sony Interactive Entertainment Inc Eye tracking method and apparatus
GB2571300B (en) * 2018-02-23 2020-05-27 Sony Interactive Entertainment Inc Eye tracking method and apparatus
US11557020B2 (en) 2018-02-23 2023-01-17 Sony Interactive Entertainment Inc. Eye tracking method and apparatus
WO2020047309A1 (en) * 2018-08-30 2020-03-05 Qualcomm Incorporated Load reduction in a visual rendering system

Also Published As

Publication number Publication date
WO2017153778A1 (en) 2017-09-14
GB2548151B (en) 2020-02-19
GB201604179D0 (en) 2016-04-27
