GB2577024A - Using headset movement for compression - Google Patents

Using headset movement for compression

Info

Publication number
GB2577024A
GB2577024A (application GB1709237.0A)
Authority
GB
United Kingdom
Prior art keywords
movement
image
speed
compression
display data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1709237.0A
Other versions
GB2577024B (en)
GB201709237D0 (en)
Inventor
Doidge Ian
Current Assignee
DisplayLink UK Ltd
Original Assignee
DisplayLink UK Ltd
Priority date
Filing date
Publication date
Application filed by DisplayLink UK Ltd filed Critical DisplayLink UK Ltd
Priority to GB2209162.3A (GB2606889B)
Priority to GB1709237.0A (GB2577024B)
Priority to GB2209157.3A (GB2606651B)
Publication of GB201709237D0
Priority to PCT/GB2018/051567 (WO2018224841A1)
Priority to US16/617,045 (US20200145687A1)
Publication of GB2577024A
Application granted
Publication of GB2577024B
Legal status: Active

Classifications

    • H04N19/164 Adaptive coding: feedback from the receiver or from the transmission channel
    • H04N19/115 Adaptive coding: selection of the code volume for a coding unit prior to coding
    • H04N19/51 Predictive coding: motion estimation or motion compensation
    • G02B27/0093 Optical systems with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B27/01 Head-up displays
    • G02B27/017 Head-up displays: head mounted
    • G02B27/022 Viewing apparatus
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G06F3/013 Eye tracking input arrangements
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • H04N13/00 Stereoscopic video systems; multi-view video systems
    • H04N13/344 Displays for viewing with the aid of head-mounted displays [HMD] with head-mounted left-right displays
    • H04N19/154 Measured or subjectively estimated visual quality after decoding, e.g. measurement of distortion
    • H04N19/167 Position within a video image, e.g. region of interest [ROI]
    • H04N21/2662 Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate based on the client capabilities
    • H04N21/4122 Peripherals receiving signals from specially adapted client devices: additional display device, e.g. video projector
    • H04N21/4126 The peripheral being portable, e.g. PDAs or mobile phones
    • H04N21/422 Input-only peripherals, e.g. global positioning system [GPS]
    • H04N21/816 Monomedia components involving special video data, e.g. 3D video
    • A63F2300/105 Game input arrangements using inertial sensors, e.g. accelerometers, gyroscopes
    • A63F2300/66 Game data processing for rendering three dimensional images
    • A63F2300/8082 Virtual reality
    • G02B2027/014 Head-up displays comprising information/image processing systems
    • G02B2027/0187 Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye

Abstract

A first method comprises receiving movement direction information from a wearable headset, for example a head mounted display HMD. The direction information has a trailing position and a leading position. If the direction of movement is on an arc 45, the display data which forms the trailing portion 46 is compressed relative to the leading portion data 47. The size of the trailing and leading portions may change on a frame to frame basis. A second method comprises receiving from a headset speed of movement information and if the speed is above a minimum threshold, compressing the display data by an amount based on the speed above the minimum threshold (figure 6). A third method comprises a HMD which determines if movement is both on an arc and above a minimum threshold, and when it is, sends the information to a host device.

Description

(54) Title of the Invention: Using headset movement for compression
Abstract Title: Transmitting and compression determination based on headset movement
(Representative figure: Figure 4C. Ten sheets of drawings, Figures 1 to 10, accompany the application.)
Using Headset Movement for Compression
Background
Virtual reality is becoming an increasingly popular display method, especially for computer gaming but also in other applications. This introduces new problems in the generation and display of image data, as virtual-reality devices must have extremely fast, high-resolution displays to create an illusion of reality. This means that a very large volume of data must be transmitted to the device from any connected host.
As virtual-reality display devices become more popular, it is also becoming desirable for them to be wirelessly connected to their hosts. This introduces considerable problems with the transmission of the large volume of display data required, as wireless connections commonly have very limited bandwidth. It is therefore desirable for as much compression to be applied to the display data as possible without affecting its quality, as reductions in quality are likely to be noticed by a user.
Finally, it is desirable for virtual-reality devices to be as lightweight as possible, since they are commonly mounted on the user’s head. This limits the number of internal devices such as complex decompression circuits and sensors that can be provided.
The invention aims to mitigate some of these problems.
Summary
Accordingly, in one aspect, the invention provides a method at a host device for compressing display data forming an image for display on one or more displays of a wearable headset, the method comprising:
receiving from the wearable headset, information regarding a direction of movement of the wearable headset, including the one or more displays, the direction being between a trailing position and a leading position;
if the direction of the movement is on an arc, compressing the display data forming a trailing portion of the image relative to the display data forming a leading portion of the image, when displayed on the one or more displays that are moving with the wearable headset, wherein the leading portion and the trailing portion may be of any size smaller than the whole image and may change in size on a frame to frame basis; and forwarding the display data forming the image from the host device to the wearable headset for display on the one or more displays thereof.
In one embodiment, the information further comprises a speed of the movement and compression of the display data forming at least the trailing portion of the image is performed if a speed of the movement is above a minimum threshold. Compression of the display data forming the whole of the image is preferably performed if a speed of the movement is above a minimum threshold. The compression of the display data forming a part of, or the whole of the image may be based on the speed of the movement above the minimum threshold. The compression of the display data forming a part of, or the whole of the image may be increased as the speed of the movement above the minimum threshold increases.
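The first aspect above can be sketched as host-side logic. This is a minimal illustration, not the claimed implementation: the 50/50 split, the JPEG-style quality scale and the threshold values are assumptions introduced here.

```python
from dataclasses import dataclass

@dataclass
class Movement:
    on_arc: bool      # True if the reported direction of movement is on an arc
    direction: str    # leading side of the display: "left" or "right"
    speed: float      # angular speed, e.g. degrees per second

def assign_region_quality(width, movement, min_speed=30.0):
    """Return (x_start, x_end, quality) column ranges for one frame.

    At or below the speed threshold the whole image gets uniform low-loss
    treatment; above it, the trailing half is compressed harder, and more
    so as the speed above the threshold increases.
    """
    if not (movement.on_arc and movement.speed > min_speed):
        return ((0, width, 90),)                      # uniform compression
    trailing_quality = max(20, 70 - int(movement.speed - min_speed))
    mid = width // 2
    if movement.direction == "right":                 # leading side on the right
        return ((0, mid, trailing_quality), (mid, width, 90))
    return ((0, mid, 90), (mid, width, trailing_quality))
```

Here a lower "quality" number stands for heavier, lossier compression of that column range, in the style of a JPEG quality knob; any codec exposing a per-region loss parameter could play this role.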
According to a second aspect, the invention provides a method at a host device for compressing display data forming an image for display on one or more displays of a wearable headset, the method comprising:
receiving from the wearable headset information regarding a speed of movement of the wearable headset, including the one or more displays;
if the speed of the movement is above a minimum threshold, compressing the display data by an amount based on the speed of the movement above the minimum threshold; and forwarding the compressed display data from the host device to the wearable headset for display thereon.
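The speed-to-compression mapping of the second aspect might look like the following sketch; the linear gain and the cap are illustrative assumptions, since the claims only require that the amount of compression is based on the speed above the minimum threshold.

```python
def compression_amount(speed, min_speed=30.0, gain=0.01, cap=0.8):
    """Fraction of display data to discard for the whole frame (sketch).

    Zero at or below the minimum threshold; grows linearly with the
    speed above the threshold; capped so the image stays usable.
    """
    if speed <= min_speed:
        return 0.0
    return min(cap, gain * (speed - min_speed))
```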
In an embodiment, the compression of the display data forming the image may be increased as the speed of the movement above the minimum threshold increases.
Preferably, the information further comprises a direction of movement of the wearable headset, including the one or more displays, the direction being between a trailing position and a leading position, the method further comprises, if the direction of the movement is on an arc, compressing the display data forming a trailing portion of the image relative to the display data forming a leading portion of the image, when displayed on the one or more displays that are moving with the wearable headset.
In a preferred embodiment, the display data forming a trailing portion of the image is compressed by a higher compression factor than the display data forming a leading portion of the image. Preferably, compression of the display data is increased in portions across the image in the direction from the leading portion to the trailing portion of the image. The trailing portion may, in some cases, increase in size compared to the leading portion of the image as the speed of the movement above the minimum threshold increases.
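The "increased in portions across the image" behaviour can be sketched as a per-column quality ramp; the endpoint quality values and the linear interpolation are assumptions made here for illustration.

```python
def quality_ramp(n_cols, leading_quality=90, trailing_quality=30,
                 direction="right"):
    """Per-column quality values rising from the trailing edge to the
    leading edge of the image (sketch; lower quality = more compression).

    Assumes n_cols >= 2.
    """
    span = leading_quality - trailing_quality
    ramp = [trailing_quality + span * i / (n_cols - 1) for i in range(n_cols)]
    # With the leading side on the right the ramp is already oriented;
    # otherwise mirror it so the left edge carries the leading quality.
    return ramp if direction == "right" else ramp[::-1]
```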
In an embodiment, the method comprises, at the host device:
determining, from the information, whether the movement is on an arc or linear;
determining, from the information, the speed of the movement of the wearable headset; and determining whether the speed of the movement is above the minimum threshold.
In another embodiment, the method comprises, at the wearable headset:
determining whether the movement of the wearable headset is on an arc or linear;
determining the speed of the movement of the wearable headset;
determining whether the speed of the movement is above the minimum threshold; and sending the information to the host device, if the movement is on an arc and the speed is above the minimum threshold.
According to a third aspect, the invention provides a method at a wearable headset for displaying display data forming an image on one or more displays, the method comprising:
sensing movement of the wearable headset indicative of movement of the one or more displays;
determining whether the movement is on an arc or linear;
determining the speed of the movement of the wearable headset;
determining whether the speed of the movement is above a minimum threshold;
sending information regarding the speed and direction of the movement to a host device, if the movement is on an arc and the speed is above the minimum threshold;
receiving from the host device, the display data forming the image; and displaying the image on one or more displays.
Preferably, sensing movement of the wearable headset comprises using a gyroscope and an accelerometer in the wearable headset.
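The headset-side gate of the third aspect might look like the following sketch, using the gyroscope yaw rate as a proxy for movement on an arc. This is a simplifying assumption introduced here; the accelerometer would additionally distinguish linear translation, which this sketch does not model.

```python
def report_if_significant(gyro_yaw_dps, min_speed=30.0):
    """Return the message to send to the host, or None.

    A non-zero yaw rate is treated as movement on an arc, with its sign
    giving the direction; nothing is sent at or below the speed threshold.
    """
    speed = abs(gyro_yaw_dps)
    if speed <= min_speed:
        return None
    return {"speed": speed,
            "direction": "right" if gyro_yaw_dps > 0 else "left"}
```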
In a further aspect, the invention may provide a host device and a wearable headset configured to perform the various appropriate steps of the method described above.
The wearable headset may be a virtual reality headset or an augmented reality set of glasses.
A system comprising a host device and a wearable headset connected to the host device may also be provided.
In another aspect, the invention provides a method of applying adaptive compression to display data according to sensor data indicating that a headset is in motion, the method comprising:
1. Detecting a movement of the headset
2. Analysing the movement to determine its direction and/or speed
3. Applying compression selectively to display data according to the results of the analysis
4. Transmitting the display data for display
5. Decompressing and displaying the display data
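The steps above can be sketched end to end with a toy quantiser standing in for a real codec. The split point, step sizes and greyscale row format are all illustrative assumptions; quantisation is used here because it needs no explicit decompression step.

```python
def quantise(pixels, step):
    """Toy lossy 'compression': coarser quantisation discards more detail."""
    return bytes((p // step) * step for p in pixels)

def process_row(row, leading_side, trail_step=32, lead_step=2):
    """Compress the trailing half of one greyscale pixel row harder than
    the leading half; the result is what would be transmitted and shown."""
    mid = len(row) // 2
    left, right = row[:mid], row[mid:]
    if leading_side == "right":
        return quantise(left, trail_step) + quantise(right, lead_step)
    return quantise(left, lead_step) + quantise(right, trail_step)
```

With a real codec the trailing half would instead be encoded with a lossier setting, and the headset would decompress both halves before display.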
The analysis may comprise determining the direction of the movement only and applying localised compression, in which the part of an image assumed to be moving out of the user’s vision based on this movement is compressed to a greater degree than the remainder of the image. Alternatively or additionally, the analysis may comprise determining the speed of the movement and applying staged compression, in which compression is applied to a greater extent across the whole frame as the speed of the movement increases.
The above-described methods are advantageous as they allow assumptions regarding user gaze to be used to improve compression without requiring additional sensors to be incorporated into a device. This provides compression benefits without increasing the expense and complexity of such devices.
Brief Description of the Drawings
Embodiments of the invention will now be more fully described, by way of example, with reference to the drawings, of which:
Figure 1 shows a basic overview of the system according to one embodiment of the invention;
Figure 2 shows a more detailed block diagram of a VR headset in use;
Figure 3 shows the application of localised compression;
Figure 4 shows a variation on localised compression;
Figure 5 shows a further variation on localised compression;
Figure 6 shows the application of staged compression;
Figure 7 shows the process of the application of localised compression;
Figure 8 shows the process of the application of staged compression;
Figure 9 shows the application of adaptive compression; and
Figure 10 shows the process of the application of adaptive compression.
Detailed Description of the Drawings
Figure 1 is a block diagram showing a basic overview of a display system arranged according to the invention. In this system, a host [11] is connected by connection [16] to a virtual-reality headset [12]. This connection [16] may be wired or wireless, and there may be multiple connection channels, or there may be a single bidirectional connection which is used for multiple purposes. For the purposes of this description, the connection [16] is assumed to be a general-purpose wireless connection, such as one using the Universal Serial Bus (USB) protocol, although other appropriate protocols could, of course, be used.
The host [11] incorporates, among other components, a processor [13] running an application which generates frames of display data using a graphics processing unit (GPU) on the host [11]. These frames are then transmitted to a compression engine [14], which carries out compression on the display data to reduce its volume prior to transmission. The transmission itself is carried out by an output engine [15], which controls the connection to the headset [12] and may include display and wireless driver software.
The headset [12] incorporates an input engine [17] for receiving the transmitted display data, which also controls the connection [16] to the host [11] as appropriate. The input engine [17] is connected to a decompression engine [18], which decompresses the received display data as appropriate. The decompression engine [18] is in turn connected to two display panels [19], one of which is presented to each of a user’s eyes when the headset [12] is in use. When the display data has been decompressed, it is transmitted to the display panels [19] for display, possibly via frame or flow buffers to account for any unevenness in the rate of decompression.
The headset [12] also incorporates sensors [110] which detect user interaction with the headset [12]. There may be a variety of position, temperature, heartbeat, angle, etc. sensors [110], but the important sensors [110] for the purposes of this example are sensors used to detect the movement of the headset [12]. In this example, these comprise an accelerometer for determining the speed at which the headset [12] is moving, and a gyroscope for detecting changes in its angle and position in space. However, other methods can be used, such as an external camera which determines headset movement by detecting the movement of points of interest in the surroundings, or a wireless module that may be able to derive movement from a beamforming signal. In any case, the sensors [110] are connected to the host [11] and transmit data back to it to control the operation of applications on the host [11].
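A heuristic for combining the two named sensors into an arc-versus-linear decision might look like this sketch; the threshold values and the magnitude-based rule are assumptions introduced here, not taken from the description.

```python
import math

def classify_movement(gyro_dps, accel_ms2, rot_eps=5.0, lin_eps=0.2):
    """Classify headset motion as 'arc', 'linear' or 'still'.

    If the gyroscope reports significant rotation, the movement is
    treated as being on an arc; otherwise significant acceleration
    alone suggests a linear translation.
    """
    rotation = math.sqrt(sum(g * g for g in gyro_dps))
    translation = math.sqrt(sum(a * a for a in accel_ms2))
    if rotation > rot_eps:
        return "arc"
    if translation > lin_eps:
        return "linear"
    return "still"
```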
Specifically, the sensors [110] provide information to the processor [13], since the output of the sensors [110] will affect the display data being generated. For example, when the user moves, the display data shown to him or her must change to match that movement to create the illusion of a virtual world. The sensor data, or a derivative of the sensed data, is also sent to the compression engine [14] according to embodiments of the invention.
Figure 2 shows a view of the headset [12] of Figure 1 when it is in use. For simplicity, the internal workings of the host [11] and the input [17] and decompression [18] engines on the headset [12] are not shown.
As previously mentioned, the headset [12] incorporates two display panels [19], each presented to one of the eyes [24] of a user [23] when in use. These two panels may in fact be a single display panel which shows two images, but this will not affect the operation of these embodiments of the invention.
In Figure 2, the accelerometer [21] and gyroscope [22] are separately shown. Since they are incorporated in the VR headset [12] mounted on the head of the user [23] when the system is in use, they will detect all movements of the head of the user [23]. Naturally, if the headset [12] is moved when not being worn, the sensors [21, 22] will also detect this movement, though optionally an additional sensor in the headset [12] could be used to switch various functions on and off depending on whether the headset [12] is being worn. This could be useful if, for example, the image data is also being transmitted to an external display for demonstration purposes.
In any case, the sensors [21, 22] transmit sensor data to the host [11] as previously described, and the host [11] transmits image data for display on the display panels [19] as previously described.
Figure 3 shows an example of the use of localised compression to reduce the amount of data that needs to be transmitted to the headset. When the headset [12] is not in motion, as in Figure 3A, the user [23] is assumed to be looking at the centre of the display panel [19], as shown by the direction of the arrow pointing from the eye [24] to the panel [19]. Naturally, the user may look at other locations on the display panel [19], but in the absence of eye-tracking this cannot be ascertained. Therefore, the data for all parts of the display panel [19] should be provided in a uniform fashion. This may mean that no compression is applied or that a lossless or low-loss compression algorithm is applied.
Figure 3B shows the case where the headset [12] is in motion, as where the user [23] turns his or her head. The curved arrow [31] shows the direction of motion: in an arc to the right. It is assumed that when the user [23] is turning his or her head to the right, he or she is looking to the right: this is shown in the Figure by the movement of the eye [24] and the arrow pointing from the eye [24] to the display panel [19] and indicating the direction of gaze. Since there is no eye-tracking, this cannot be guaranteed, but based on normal human behaviour - in which the user [23] would turn his or her head because he or she wished to view something to the right - it can be assumed.
Accordingly, localised relative compression is applied to the left part of the image shown on the display panel [19]. This is shown by the hatched area [32] on the panel shown in the Figure: since the user is looking to the right, the left-hand side of the image - i.e. the trailing side relative to the direction of the movement - will be in the user’s peripheral vision, where the human eye has low acuity. He or she will therefore not be aware of any loss of quality if that part of the image is compressed more than the right-hand side [33] at which the user is assumed to be actually looking.
This compression could be applied whenever there is movement, or only after the speed of movement is above a minimum threshold. The speed of the movement could also be used to determine the level of compression used, such that as the speed of the movement increases an increased level of compression is used on the same area [32].
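The speed-dependent behaviour described above can be sketched as a simple mapping from movement speed to a compression level for the trailing area. This is a minimal illustration only; the threshold, saturation speed, and number of levels are assumed values, not taken from the patent.

```python
def trailing_compression_level(speed, min_threshold=10.0, max_speed=200.0, levels=4):
    """Map an angular speed (e.g. degrees/second) to a compression level
    for the trailing area: 0 = none, higher = more aggressive.
    All numeric parameters are illustrative assumptions."""
    if speed < min_threshold:
        return 0  # below the minimum threshold: no extra compression
    # Scale linearly between the threshold and a saturation speed,
    # so faster movement selects a higher compression level.
    frac = min((speed - min_threshold) / (max_speed - min_threshold), 1.0)
    return 1 + int(frac * (levels - 1))
```

A real system might instead use a fixed table of thresholds, but the monotonic speed-to-level relationship is the same.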
This allows higher levels of compression to be applied to the trailing part of the frame as compared to the leading part of the frame. Thus, either compression is applied to the trailing area whereas it was not used before, or a compression algorithm that allows greater loss of data may be used.
Figure 4 shows a variation of localised relative compression. As described in Figure 3, when the headset [12] is not in motion, as in Figure 4A, a uniform level of compression (or no compression) is applied to the display data across the whole of the display panel [19]. Where the headset [12] is in motion, as in Figures 4B and 4C, compression is applied on the trailing side [42/46] of the display panel [19].
Figure 4B shows the case where the headset [12] is moving slowly, as indicated by the relatively short arrow indicating the movement [41]. As previously described in Figure 3, a smaller area [42] on the trailing side of the image is compressed to a higher level than the rest of the image [43], since the system assumes that the user [23] will be looking in the direction of movement, as indicated by the arrow [44] representing the user’s gaze. The application of compression to this smaller area [42] may be triggered by the speed of the movement exceeding a first threshold.
Figure 4C shows the case where the headset [12] is moving more quickly, as indicated by the longer arrow [45] representing the movement. In this case, the speed of the movement has exceeded a second threshold and therefore a larger area [46] on the trailing side of the image is compressed to a higher level than the rest of the image [47]. In this example, the same level of compression is used for the larger area [46] shown in Figure 4C as was used for the smaller area [42] shown in Figure 4B, but a different level of compression could be applied at higher speeds as well as the size of the compressed area [42/46] being increased.
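The two-threshold scheme of Figures 4B and 4C can be sketched as a function that returns the fraction of the image width, on the trailing side, to compress more heavily. The threshold values and fractions are illustrative assumptions.

```python
def trailing_area_fraction(speed, thresholds=(10.0, 60.0)):
    """Return the fraction of the image width (on the trailing side)
    to compress more heavily, based on which speed thresholds the
    movement has exceeded. Thresholds and fractions are illustrative."""
    if speed < thresholds[0]:
        return 0.0        # stationary or very slow: uniform compression (Fig. 4A)
    if speed < thresholds[1]:
        return 1.0 / 3.0  # first threshold exceeded: smaller trailing area (Fig. 4B)
    return 1.0 / 2.0      # second threshold exceeded: larger trailing area (Fig. 4C)
```

More thresholds simply add more entries to this mapping; the patent also allows the compression level, not just the area, to grow with speed.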
Figure 5 shows a further variation on localised relative compression, whereby different levels of compression are applied in discrete steps or even in a continuum across the image when the headset [12] is in motion.
Figure 5A shows a case where the headset [12] is moving relatively slowly, as indicated by the relatively short arrow [51] representing the movement. An area on the trailing portion [52] of the image is compressed when the speed of the motion is above a minimum threshold, which may simply mean that the headset [12] is moving at all. However, rather than the whole area of the trailing portion [52] being compressed to the same level as in the embodiments of Figures 3 and 4, the level of compression is increased in steps across the trailing portion [52] of the image from the leading portion [53] to the trailing edge, so that, for example, area 52c is compressed more than area 52b, which in turn is compressed more than area 52a.
This method may be used in a similar way to the localised relative compression described in Figure 3, whereby the trailing portion [52] is always the same size, or the trailing portion [52/56] may change in size as in the variation shown in Figure 4. This may mean changing the sizes of the differently-compressed areas [52a, b, c], or it may mean adding more gradations of compression as described below with reference to Figure 5B.
Figure 5B shows a case where the headset [12] is moving more quickly, as indicated by the longer arrow [55] representing the movement. A larger area of the trailing portion [56] is compressed compared to the compressed portion [52] shown in Figure 5A. As a result, there are more areas of differently-compressed data in the compressed portion [56]. As previously described, the level of compression increases in these areas across the image, so area 56a is least compressed, area 56b is more compressed than area 56a, and so forth through areas 56c, 56d, and 56e.
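The stepped gradient of Figures 5A and 5B can be sketched as a per-column assignment of compression levels, with the trailing (here, left-hand) columns split into equal bands of increasing level. The band layout and level numbering are illustrative assumptions.

```python
def stepped_levels(width, trailing_frac, steps):
    """Assign a compression level to each pixel column: 0 in the leading
    portion, then `steps` equal bands of increasing level towards the
    trailing (left) edge, as in areas 52a/52b/52c. Illustrative only."""
    trailing_cols = int(width * trailing_frac)
    levels = []
    for x in range(width):
        if x >= trailing_cols:
            levels.append(0)  # leading portion: base level of compression
        else:
            # Columns nearer the trailing edge fall into higher-level bands.
            band = (trailing_cols - 1 - x) * steps // trailing_cols
            levels.append(1 + band)
    return levels
```

For a faster movement, `trailing_frac` and `steps` would both grow, reproducing the extra gradations 56d and 56e of Figure 5B.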
Figure 6 shows staged compression. This relies on the fact that when a user [23] moves his or her head, he or she will not fully process visual detail while the movement continues, resulting in lower conscious acuity. The faster the movement, the less detail will be consciously visible: the user’s vision will become ‘blurred’. In this example, the methods of the invention are triggered by a linear movement, but the same effect could take place when the headset [12] moves in an arc as previously described.
In Figure 6A, the headset [12] is not moving, and a low level of compression (or no compression) is applied so that the detail of the image shown on the display panel [19] is unchanged.
In Figure 6B, the arrow [61] shows that the headset [12] is moving to the right. However, the relatively short length of the arrow [61] shows that the rate of movement is low. Nevertheless, since the image data is changing at a speed comparable to the movement, it will be apparent that there will be some loss of acuity. As a result, compression can be applied to the whole image such that some detail may be lost, as shown by the dotted hatching of the display panel [19] shown in the Figure.
In Figure 6C, the length of the arrow [62] shows that the headset [12] is moving to the right at a high speed. This means that a user wearing the headset [12] is moving his or her head quickly and will have low acuity, so a high level of compression, resulting in loss of information, may be applied to the whole image, as shown by the dark hatching of the display panel [19] shown in the Figure, without the user perceiving the loss of clarity.
Figure 6 shows three gradations of the level of compression, but there may be any plural number of gradations: the system may make a binary determination, applying staged compression if there is motion and not otherwise, or there may be many levels of compression, each reducing the volume of data and sacrificing detail to a greater extent than the last.
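The staged scheme, from the binary case up to many gradations, can be sketched as counting how many speed thresholds the movement exceeds; the whole image receives the resulting level regardless of gaze direction. The threshold values are illustrative assumptions.

```python
def staged_level(speed, thresholds=(30.0, 120.0)):
    """Whole-image compression level from movement speed alone
    (direction of gaze is immaterial in staged compression).
    One threshold gives the binary moving/not-moving case; more
    thresholds give more gradations. Values are illustrative."""
    return sum(1 for t in thresholds if speed >= t)
```

A smooth continuum, as the text also allows, would replace the count with a continuous function of `speed`.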
As indicated in the Figure by the eye [24] always gazing directly ahead - shown by the position of the pupil and the direction of the arrow connecting the eye [24] to the display panel [19] - the actual direction of the user’s [23] gaze is immaterial when this method is in use; the same level of compression is applied across the image.
Figure 7 shows the process in use in Figure 3, for localised relative compression. It will be described with reference to the example shown in Figure 3B. A similar process may be used for the versions of localised relative compression shown in Figures 4 and 5, and variations will be described where appropriate.
At Step S71 the gyroscope [22] detects a rotational movement of the headset [12] to the right. If the headset [12] is in use, this will indicate that the user [23] is turning his or her head and therefore is likely to be looking in the direction of movement, as described above with reference to Figure 3. Data indicating that the headset [12] is rotating to the right is transmitted to the host [11] at Step S72.
As described in Figure 1, the sensor data is received by both the processor [13] running the application and the compression engine [14] on the host [11]. The processor [13] analyses the sensor data and uses it in the generation of the next frames of display data, based on the movement. It transmits a finished frame to the compression engine [14].
The compression engine [14] receives the sensor data and analyses it at Step S73 to determine the direction of motion. Alternatively, this function may be carried out by a dedicated analysis engine, and analysis may take place before the generation of new display data, rather than both the application and the compression engine [14] performing their own analysis. Furthermore, the application could receive the sensor data and use it to generate instructions or derived data for the compression engine [14], containing information on the direction of movement and potentially predictions of future movements. In any case, the compression engine [14] determines the direction of movement.
In this example, the sensor data is produced by a gyroscope [22] which detects rotational movements. In other embodiments in which a less-specialised sensor or combination of sensors [110] is used, such as analysis of images from an external camera, the host [11] may also have to determine whether the movement is rotational - i.e. in an arc - or some other form of movement. This process might then only continue if the direction of movement is on an arc, i.e. corresponding to a user’s head turning.
In the embodiment shown in Figure 4, the speed of movement will also be required. This may be determined from the gyroscope [22] if the gyroscope provides speed as well as rotation data, or it may be determined from other sensors [110] such as an accelerometer [21], camera, etc. as previously mentioned. This information may also be determined by the compression engine [14] or by a dedicated analysis engine.
As previously mentioned, the speed of movement may be used in either of the embodiments shown in Figure 3 to determine whether compression should be used, and may be used in the embodiment shown in Figure 5 to determine the amount of the image to be compressed, as in Figure 4.
In some embodiments, the output from sensors [110] on the headset [12] may be analysed in a processor on the headset [12] to determine speed and direction of movement. This processed data is then transmitted to the host for use in generation and compression of display data.
At Step S74, localised relative compression is applied to compress the display data forming a trailing portion [32] of the image relative to the display data forming the leading portion [33] of the image. This may, for example, involve employing two compression algorithms such that in each row of pixels received from the application, the first third - comprising the left-hand side [32] of the image - is compressed with a lossy compression algorithm while the remaining two thirds - comprising the right-hand side [33] of the image - are compressed with a lossless compression algorithm. Alternatively, the compression engine [14] may receive the frame as tiles with location information, or split a received frame into tiles with location information, allowing it to determine the location of the tiles in the frame regardless of their order. This allows the left-hand tiles [32] of the image to be compressed with the lossy compression algorithm in parallel with the compression of the right-hand tiles [33] with the lossless compression algorithm.
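The row-splitting example in the text can be sketched as follows. Here `lossy_compress` is a deliberately crude stand-in (byte quantisation followed by deflate) for whichever lossy codec an implementation actually uses, and the one-third split for a rightward head turn is taken from the example; everything else is an assumption.

```python
import zlib

def lossy_compress(data, keep_bits=4):
    """Crude stand-in for a lossy codec: quantise each byte, then
    deflate. Illustrative only; not the patent's algorithm."""
    mask = (0xFF << (8 - keep_bits)) & 0xFF
    return zlib.compress(bytes(b & mask for b in data))

def compress_row(row, trailing_frac=1.0 / 3.0):
    """For a rightward head turn: the left (trailing) third of the
    pixel row is compressed lossily, the remaining two thirds
    losslessly, as in the example in the text."""
    split = int(len(row) * trailing_frac)
    return lossy_compress(row[:split]), zlib.compress(row[split:])
```

The tile-based alternative would apply the same two codecs per tile, selected by each tile's location, rather than per row segment.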
Naturally, the lossless compression algorithm used for the right-hand side [33] of the image may be replaced with a lossy compression algorithm that nonetheless causes less data loss than the compression algorithm used for the left-hand side [32] of the image or no compression could be used for the right-hand side [33] of the image.
Furthermore, the speed of the movement could also be considered such that compression is not applied unless the movement is above a predetermined speed threshold. This would take into account the possibility that the user [23] is turning his or her head while keeping his or her eyes [24] focussed on a fixed point; this is only possible at slow speeds.
In the variations shown in Figures 4 and 5, speed is taken into account when localised compression is applied, as previously described. This means that the speed is also determined at Step S73 and compared to one or more thresholds. Each threshold is associated with a proportion of the image to be compressed: where, in the example described above with reference to Figure 3, one-third of the image is compressed, this amount is replaced by the proportion corresponding to the speed of the movement. In this case, the leading portion [43/47/53/57] and trailing portion [42/46/52/56] may each be any size smaller than the whole image and may change in size on a frame-to-frame basis.
In any case, at Step S75 the compressed display data is sent to the output engine [15] and thence transmitted to the headset [12] to be received by the input engine [17]. Then, at Step S76, it is sent to the decompression engine [18], decompressed as appropriate, and displayed on the display panels [19].
Figure 8 shows the process applied in Figure 6, where staged compression is used. It will be described with reference to Figure 6B.
At Step S81, the accelerometer [21] detects a movement [61] of the headset [12], in this example a movement to the right. It transmits data indicating this movement to the host [11] at Step S82.
As previously described, this data is used both for the generation of new display data and - according to this embodiment of the invention - to control compression. As such, the movement data is analysed at Step S83 to determine the speed of the movement [61] in a similar way to the movement data described in Figure 6. For the purposes of staged compression, the direction is immaterial, though of course more detail will be required for the generation of the display data. The speed of the movement [61] is supplied to the compression engine [14].
In some embodiments, analysis to determine the speed of the movement may be carried out in a processor on the headset [12] and the determined speed transmitted to the host [11] for use in controlling compression.
In this example, the sensor data is produced by an accelerometer [21] which may not distinguish between straight and rotational movements. In other embodiments in which a less-specialised sensor or combination of sensors is used, such as analysis of images from an external camera, the host [11] may also have to determine whether the movement is rotational - i.e. in an arc - or some other form of movement. It may then amend the application of compression depending on the type of movement.
At Step S84, staged compression is applied. In Figure 6B, the movement [61] is relatively slow, so a low level of compression is applied. This may be a binary determination, such that if the speed of the movement [61] is above a minimum threshold lossy compression is applied, but otherwise no compression or lossless compression is applied. Alternatively, there may be multiple levels of compression as shown between Figure 6B and Figure 6C, such that fast movement triggers the application of a high level of compression and slower movement triggers a lower level of compression. Depending on the algorithm, this may be a smooth continuum or thresholds may be used for different levels of compression.
In any case, the compressed display data is transmitted to the headset [12] at Step S85 and then decompressed and displayed at Step S86.
Figure 9 shows adaptive compression, which is a hybrid of localised and staged compression, incorporating aspects of both to maximise the compression applied and minimise the volume of data transmitted.
Figure 9A shows the case where the headset [12] is not in motion and the user [23] is assumed to be looking at the centre of the image, as described in Figure 3A. In this case, a base level of compression is applied.
Figure 9B shows the case where the headset [12] is moving as the user [23] turns his or her head to the right. The relatively short length of the curved arrow [91] shows that the rate of movement is low, as previously described in Figure 6B. Unlike the staged compression described in Figure 6, however, the assumed direction of the user’s gaze [94] is taken into account: he or she is assumed to be looking in the direction of movement [91], as described in Figure 3B, and therefore localised compression is also applied to an area on the left of the image [92]. This results in an area [92] which is at a higher compression level than the rest of the image [93], shown by the darker hatching.
Figure 9C shows the case where the headset is moving at a faster rate, as shown by the length of the curved arrow [95]. As in Figure 9B, the user is now assumed to be looking in the direction of movement, so both types of compression are applied: staged compression is used on the entire image [97], as shown by the hatching, which is darker than that used in Figure 9B as a higher level of compression is used to reflect the faster movement. Meanwhile, localised compression is also used on the left-hand side of the image [96], as shown by the fact that the hatching in this area is darker still.
This combination is especially useful because the user is in fact more likely to be looking in the direction of movement where the movement is fast; small and slow rotations of the head may be carried out with the eye fixed on a point, but this is unlikely to occur with fast movements.
Figure 10 shows the process associated with the adaptive compression shown in Figure 9. It will be described with reference to Figure 9B.
At Step S101, the sensors [110] detect movement of the headset [12]. As previously described, the gyroscope [22] detects rotation and its direction while the accelerometer [21] detects the speed of the movement, and in this case both will be used for adaptive compression. The sensor data is transmitted to the host [11] and received by the compression engine [14] and application [13] as previously described at Step S102.
At Step S103, the compression engine [14] - or a connected analysis engine - analyses the received sensor data to determine the type, direction, and speed of the movement. It then applies adaptive compression at Step S104. As described in Figure 9, this means applying lossy compression across the trailing left-hand part of the frame [92] at a relatively high level and across the rest of the frame at a lower level [93].
As previously mentioned, different types of compression might be applied depending on the type of movement. For example, if at Step S103 the compression engine [14] or analysis engine determines that the movement is linear with no rotational component, the localised compression component of adaptive compression might be omitted and the process continue as described in Figure 8.
Thresholds could be used as appropriate. For example, there might be no compression or a low level of compression applied until the speed of the movement is above a minimum threshold, then only staged compression as described in Figure 6 could be used until the movement [91, 95] is above a second threshold, to take account of the fact that the user [23] may continue to look at a fixed point regardless of head movements if the movement is slow.
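The two-threshold adaptive behaviour described above can be sketched as a function returning a pair of compression levels: one for the whole image (the staged component) and one for the trailing side (the localised component). The threshold values and level numbers are illustrative assumptions.

```python
def adaptive_levels(speed, rotational, t1=10.0, t2=60.0):
    """Return (base_level, trailing_level): a staged level for the
    whole image plus extra localised compression on the trailing side
    once the movement is a sufficiently fast rotation. Thresholds and
    level values are illustrative assumptions."""
    if speed < t1:
        return 0, 0                   # below minimum threshold: uniform, low/no compression
    base = 1 if speed < t2 else 2     # staged component grows with speed
    if not rotational or speed < t2:
        return base, base             # staged compression only (user may be fixating)
    return base, base + 1             # fast rotation: extra compression on trailing side
```

This reflects the observation in the text that slow rotations may occur with the eyes fixed on a point, so the localised component is only added above the second threshold.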
In any case, at Step S105 the compressed data is sent to the output engine [15] for transmission to the headset [12], where it is decompressed and displayed on the display panels [19] as appropriate at Step S106.
Due to the format of the Figures, all examples have been described in terms of side-to-side movement in two dimensions. This does not limit the range of movement to which the methods of the invention may be applied; changes in the compression area and level as herein described could also be applied to a trailing edge in vertical movement, or part of each of two trailing edges in diagonal movement, as appropriate.
Although only one particular embodiment has been described in detail above, it will be appreciated that various changes, modifications and improvements can be made by a person skilled in the art without departing from the scope of the present invention as defined in the claims. For example, hardware aspects may be implemented as software where appropriate and vice versa. Furthermore, the variations on localised relative compression described above with reference to Figures 4 and 5 could also be used in a system such as that described with reference to Figures 9 and 10. Furthermore, it will be appreciated that the shape of the frame portions/regions need not be rectangular. Their shape can be arbitrary, and can for example be radial to accommodate the shape of human visual acuity or a straight line across the image or adapted around the nature of the compression algorithm (e.g. be tile based, if the compression algorithm operates on tiles into which an image may be divided).

Claims (20)

Claims
1. A method at a host device for compressing display data forming an image for display on one or more displays of a wearable headset, the method comprising:
receiving from the wearable headset, information regarding a direction of movement of the wearable headset, including the one or more displays, the direction being between a trailing position and a leading position;
if the direction of the movement is on an arc, compressing the display data forming a trailing portion of the image relative to the display data forming a leading portion of the image, when displayed on the one or more displays that are moving with the wearable headset, wherein the leading portion and the trailing portion may be of any size smaller than the whole image and may change in size on a frame to frame basis; and forwarding the display data forming the image from the host device to the wearable headset for display on the one or more displays thereof.
2. A method according to claim 1, wherein the information further comprises a speed of the movement and compression of the display data forming at least the trailing portion of the image is performed if a speed of the movement is above a minimum threshold.
3. A method according to claim 2, wherein compression of the display data forming the whole of the image is performed if a speed of the movement is above a minimum threshold.
4. A method according to either claim 2 or claim 3, wherein the compression of the display data forming a part of, or the whole of the image is based on the speed of the movement above the minimum threshold.
5. A method according to claim 4, wherein the compression of the display data forming a part of, or the whole of the image is increased as the speed of the movement above the minimum threshold increases.
6. A method at a host device for compressing display data forming an image for display on one or more displays of a wearable headset, the method comprising:
receiving from the wearable headset information regarding a speed of movement of the wearable headset, including the one or more displays;
if the speed of the movement is above a minimum threshold, compressing the display data by an amount based on the speed of the movement above the minimum threshold; and forwarding the compressed display data from the host device to the wearable headset for display thereon.
7. A method according to claim 6, wherein the compression of the display data forming the image is increased as the speed of the movement above the minimum threshold increases.
8. A method according to either claim 6 or claim 7, wherein the information further comprises a direction of movement of the wearable headset, including the one or more displays, the direction being between a trailing position and a leading position, the method further comprises, if the direction of the movement is on an arc, compressing the display data forming a trailing portion of the image relative to the display data forming a leading portion of the image, when displayed on the one or more displays that are moving with the wearable headset.
9. A method according to any one of claims 1 to 5 or 8, wherein the display data forming a trailing portion of the image is compressed by a higher compression factor than the display data forming a leading portion of the image.
10. A method according to claim 9, wherein compression of the display data is increased in portions across the image in the direction from the leading portion to the trailing portion of the image.
11. A method according to any one of claims 2 to 5 or 8 to 10, wherein the trailing portion increases in size compared to the leading portion of the image as the speed of the movement above the minimum threshold increases.
12. A method according to any one of claims 2 to 5 or 8 to 11, further comprising, at the host device:
determining, from the information, whether the movement is on an arc or linear;
determining, from the information, the speed of the movement of the wearable headset; and determining whether the speed of the movement is above the minimum threshold.
13. A method according to any one of claims 2 to 5 or 8 to 11, further comprising, at the wearable headset:
determining whether the movement of the wearable headset is on an arc or linear;
determining the speed of the movement of the wearable headset;
determining whether the speed of the movement is above the minimum threshold; and sending the information to the host device, if the movement is on an arc and the speed is above the minimum threshold.
14. A method at a wearable headset for displaying display data forming an image on one or more displays, the method comprising:
sensing movement of the wearable headset indicative of movement of the one or more displays;
determining whether the movement is on an arc or linear;
determining the speed of the movement of the wearable headset;
determining whether the speed of the movement is above a minimum threshold;
sending information regarding the speed and direction of the movement to a host device, if the movement is on an arc and the speed is above the minimum threshold;
receiving from the host device, the display data forming the image; and displaying the image on one or more displays.
15. A method according to claim 14, wherein the sensing movement of the wearable headset comprises using a gyroscope and an accelerometer in the wearable headset.
16. A host device configured to perform all the steps of a method according to any one of claims 1 to 13.
17. A wearable headset configured to perform all the steps of a method according to either claim 14 or claim 15.
18. A wearable headset according to claim 17, wherein the wearable headset is a virtual reality headset.
19. A wearable headset according to claim 17, wherein the wearable headset is an augmented reality set of glasses.
20. A system comprising a host device according to claim 16 and a wearable headset connected to the host device.
Intellectual Property Office, Application No: GB1709237.0, Examiner: Ms Lucy Stratton
GB1709237.0A 2017-06-09 2017-06-09 Using headset movement for compression Active GB2577024B (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
GB2209162.3A GB2606889B (en) 2017-06-09 2017-06-09 Using headset movement for compression
GB1709237.0A GB2577024B (en) 2017-06-09 2017-06-09 Using headset movement for compression
GB2209157.3A GB2606651B (en) 2017-06-09 2017-06-09 Using headset movement for compression
PCT/GB2018/051567 WO2018224841A1 (en) 2017-06-09 2018-06-08 Using headset movement for compression
US16/617,045 US20200145687A1 (en) 2017-06-09 2018-06-08 Using headset movement for compression

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1709237.0A GB2577024B (en) 2017-06-09 2017-06-09 Using headset movement for compression

Publications (3)

Publication Number Publication Date
GB201709237D0 GB201709237D0 (en) 2017-07-26
GB2577024A true GB2577024A (en) 2020-03-18
GB2577024B GB2577024B (en) 2022-08-03

Family

ID=59358174

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1709237.0A Active GB2577024B (en) 2017-06-09 2017-06-09 Using headset movement for compression

Country Status (3)

Country Link
US (1) US20200145687A1 (en)
GB (1) GB2577024B (en)
WO (1) WO2018224841A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10657674B2 (en) 2016-06-17 2020-05-19 Immersive Robotics Pty Ltd. Image compression method and apparatus
AU2018217434C1 (en) 2017-02-08 2023-04-27 Immersive Robotics Pty Ltd Displaying content to users in a multiplayer venue
US11153604B2 (en) 2017-11-21 2021-10-19 Immersive Robotics Pty Ltd Image compression for digital reality
CN111837384A (en) 2017-11-21 2020-10-27 因默希弗机器人私人有限公司 Frequency component selection for image compression
GB2575326B (en) 2018-07-06 2022-06-01 Displaylink Uk Ltd Method and apparatus for determining whether an eye of a user of a head mounted display is directed at a fixed point
KR20220135483A (en) * 2021-03-30 2022-10-07 삼성전자주식회사 A method and an apparatus for conversational services in a mobile communication system

Citations (9)

Publication number Priority date Publication date Assignee Title
US20160018652A1 (en) * 2014-01-24 2016-01-21 Osterhout Group, Inc. See-through computer display systems
US20160109709A1 (en) * 2014-01-21 2016-04-21 Osterhout Group, Inc. See-through computer display systems
US20160170488A1 (en) * 2014-12-12 2016-06-16 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and program
GB2536025A (en) * 2015-03-05 2016-09-07 Nokia Technologies Oy Video streaming method
US20160259166A1 (en) * 2014-01-21 2016-09-08 Osterhout Group, Inc. Compact optics with increased color gamut
US20160282626A1 (en) * 2015-03-27 2016-09-29 Osterhout Group, Inc. See-through computer display systems
US20160328884A1 (en) * 2014-11-27 2016-11-10 Magic Leap, Inc. Virtual/augmented reality system having dynamic region resolution
KR20160139318A (en) * 2015-05-27 2016-12-07 주식회사 넥슨코리아 System and method for virtual reality display
EP3136718A1 (en) * 2015-08-27 2017-03-01 HTC Corporation Method for synchronizing video and audio in virtual reality system

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US8184069B1 (en) * 2011-06-20 2012-05-22 Google Inc. Systems and methods for adaptive transmission of data
GB2523740B (en) * 2014-02-26 2020-10-14 Sony Interactive Entertainment Inc Image encoding and display

Patent Citations (9)

Publication number Priority date Publication date Assignee Title
US20160109709A1 (en) * 2014-01-21 2016-04-21 Osterhout Group, Inc. See-through computer display systems
US20160259166A1 (en) * 2014-01-21 2016-09-08 Osterhout Group, Inc. Compact optics with increased color gamut
US20160018652A1 (en) * 2014-01-24 2016-01-21 Osterhout Group, Inc. See-through computer display systems
US20160328884A1 (en) * 2014-11-27 2016-11-10 Magic Leap, Inc. Virtual/augmented reality system having dynamic region resolution
US20160170488A1 (en) * 2014-12-12 2016-06-16 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and program
GB2536025A (en) * 2015-03-05 2016-09-07 Nokia Technologies Oy Video streaming method
US20160282626A1 (en) * 2015-03-27 2016-09-29 Osterhout Group, Inc. See-through computer display systems
KR20160139318A (en) * 2015-05-27 2016-12-07 주식회사 넥슨코리아 System and method for virtual reality display
EP3136718A1 (en) * 2015-08-27 2017-03-01 HTC Corporation Method for synchronizing video and audio in virtual reality system

Also Published As

Publication number Publication date
WO2018224841A1 (en) 2018-12-13
GB2577024B (en) 2022-08-03
US20200145687A1 (en) 2020-05-07
GB201709237D0 (en) 2017-07-26

Similar Documents

Publication Publication Date Title
US20200145687A1 (en) Using headset movement for compression
US11727619B2 (en) Video pipeline
AU2017200163B2 (en) Perception based predictive tracking for head mounted displays
EP3824371B1 (en) Distributed foveated rendering based on user gaze
CN112150601B (en) Methods, computing systems, and non-transitory machine-readable media for foveal rendering
WO2017047178A1 (en) Information processing device, information processing method, and program
KR101920983B1 (en) Display of information on a head mounted display
US10742966B2 (en) Method, system and recording medium for adaptive interleaved image warping
CN111292236B (en) Method and computing system for reducing aliasing artifacts in foveal gaze rendering
US20200334790A1 (en) Image resolution processing method, system, and apparatus, storage medium, and device
JP2007265274A (en) Physiology adaptive display device
CN112445339A (en) Gaze and glance based graphical manipulation
CN106201284B (en) User interface synchronization system and method
KR101951406B1 (en) Head mounted display and operating method for reducing virtual reality sickness
GB2606889A (en) Using headset movement for compression
GB2606651A (en) Using headset movement for compression
CN106126148B (en) Display control method and electronic equipment
US20200066234A1 (en) VR Drawing Method, Device, and System
KR20180055637A (en) Electronic apparatus and method for controlling thereof
EP3217256B1 (en) Interactive display system and method
WO2019038520A1 (en) Compressing image data for transmission to a display of a wearable headset based on information on blinking of the eye