GB2557569A - Adjusting display data - Google Patents

Adjusting display data

Info

Publication number
GB2557569A
Authority
GB
United Kingdom
Prior art keywords
lens
display
data
viewer
characteristic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1615392.6A
Other versions
GB201615392D0 (en)
GB2557569B (en)
Inventor
Morse Douglas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DisplayLink UK Ltd
Original Assignee
DisplayLink UK Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DisplayLink UK Ltd
Priority to GB1615392.6A (patent GB2557569B)
Publication of GB201615392D0
Priority to PCT/GB2017/052529 (WO2018046892A1)
Publication of GB2557569A
Application granted
Publication of GB2557569B
Legal status: Active

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • G06T5/80
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/011Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0181Adaptation to the pilot/driver
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image

Abstract

A method for adjusting display data, for example, in a head up display, includes obtaining information regarding at least one characteristic of a lens located between a display 39 and a retina of a viewer's eye and determining whether an adjustment of display data to be displayed on the display 39 should be made based on the at least one characteristic 38 of the lens. The display data to be displayed is adjusted based on the at least one characteristic of the lens and forwarded for display on the display 39. A device and system implementing the method are also claimed. The lens characteristic may be an optical property of the lens which may include one or more of the optical distortion of the lens, the focal length, the diameter or the chromatic aberration. In another aspect, a method comprises determining at least one parameter of a lens, generating information based on the parameter, transmitting the information to a display generator, displaying the received adjusted data on a display. A remote device and system implementing the method are also claimed.

Description

(71) Applicant(s): DisplayLink (UK) Limited (Incorporated in the United Kingdom), 140 Cambridge Science Park, Milton Road, CAMBRIDGE, CB4 0GF, United Kingdom
(72) Inventor(s): Douglas Morse
(51) INT CL: G02B 27/01 (2006.01)
(56) Documents Cited: EP 2980628 A1; WO 2013/082387 A1; EP 2447758 A1
(58) Field of Search: INT CL G02B; Other: EPODOC, TXTA, WPI
(74) Agent and/or Address for Service: Mathys & Squire LLP, The Shard, 32 London Bridge Street, LONDON, SE1 9SG, United Kingdom
(54) Title of the Invention: Adjusting display data
Abstract Title: Adjusting Display Data Based on Intermediate Lens Properties
Representative drawing: Figure 3
Drawings (sheets 1/5 to 5/5): Figure 1; Figure 2; Figure 3; Figure 4; Figure 5
Intellectual Property Office
Application No. GB1615392.6
RTM Date: 24 February 2017
The following terms are registered trade marks and should be read as such wherever they occur in this document: Bluetooth
Intellectual Property Office is an operating name of the Patent Office www.gov.uk/ipo
Adjusting Display Data
Background
This invention relates to a method and apparatus for adjusting display data according to characteristics of a lens located between a display and a retina of a user’s eye.
Conventional virtual-reality headsets include goggles comprising two small displays, each with a lens between it and the user’s eye. This allows the user to view the display from a very short distance away, as otherwise the user would be unable to focus. However, the lens introduces distortions which must be accounted for when display data is generated. As a result, the application generating the display data must be aware of the nature of the lens.
Conventionally, a particular host system will have a specific compatible type of headset, and the specifications of the lenses in the headset will be hard-coded into the system and often into the compatible applications themselves. This leads to a lack of competition and excessive duplication of technology, as a user must buy the compatible headset for each system and potentially each application. Furthermore, it is impossible to account for vision problems experienced by the user which require, for example, glasses.
Finally, similar techniques cannot be used for display systems other than integrated headsets.
The system of the invention aims to solve, or at least mitigate, these problems.
Summary
Accordingly, in a first aspect, the invention provides a method for adjusting display data, the method comprising:
obtaining information regarding at least one characteristic of a lens located between a display and a retina of a viewer’s eye;
determining whether an adjustment of display data to be displayed on the display should be made based on the at least one characteristic of the lens;
adjusting the display data to be displayed on the display based on the at least one characteristic of the lens, if it is determined that an adjustment of display data to be displayed on the display should be made; and forwarding the adjusted display data for display on the display.
In a preferred embodiment, the information comprises either the lens identification data or the viewer identification data and the method further comprises determining from the lens identification data or viewer identification data the lens characteristic data, wherein the determining whether an adjustment of display data to be displayed on the display should be made is based on the lens characteristic data.
The at least one characteristic of the lens preferably includes at least one optical characteristic of the lens, which may include one or more of:
optical distortion of the lens; focal length of the lens; diameter of the lens; and chromatic aberration of the lens.
Obtaining the information preferably comprises receiving the information over a short-range wireless connection from the lens, which may comprise a wireless RF transmitter mounted to the lens.
In one embodiment, the method further comprises:
determining at least one parameter of a lens located between a display and a retina of a viewer’s eye; and generating the information based on the at least one parameter of the lens.
According to a second aspect, the invention provides a method for displaying display data to a viewer, the method comprising:
determining at least one parameter of a lens located between a display and a retina of a viewer’s eye;
generating information regarding at least one characteristic of the lens based on the at least one parameter of the lens;
transmitting the information to a display generator to enable the display generator to adjust display data based on the at least one characteristic of the lens;
receiving, from the display generator, adjusted display data; and displaying the adjusted display data on the display.
In one embodiment, determining at least one parameter of a lens may comprise reading a mechanical indicium on the lens. Alternatively, or additionally, determining at least one parameter of a lens may comprise determining an identification of the lens, and/or determining at least one parameter of a lens may comprise receiving the at least one parameter over a short-range wireless connection from the lens, which may comprise a wireless RF tag mounted to the lens.
Preferably, the information may further comprise Electronic Device ID, EDID, data, optical characteristics of the display and/or compression optimisation data such as foveal area equations.
In an embodiment, the lens and the display may be mounted in a remote device and the information may further include sensor and/or audio capabilities of the remote device, wherein the sensor and/or audio capabilities of the remote device may, preferably, include one or more of:
a gyroscope;
an accelerometer; a microphone; and speakers.
In a third aspect, the invention provides a host device configured to perform a method as described above.
In a fourth aspect, the invention provides a remote device configured to perform a method as described above.
In a fifth aspect, the invention provides a system comprising a host device as described above and a remote device as described above connected to the host device.
In an embodiment, the display is mounted on a frame configured to be worn on the viewer’s head. In some embodiments, the lens may be configured to be within, in contact with or close to the viewer’s eye, wherein the lens may be a contact lens or a lens implanted in the viewer’s eye. Alternatively, the lens may be mounted together with the display on the frame, such that the lens is located between the display and a retina of a viewer’s eye.
In one embodiment, the system may comprise a pair of the lenses and a pair of the displays, each of the displays mounted on the frame associated with a respective lens to be viewed by a respective one of the viewer’s eyes. In some cases, the frame may form part of the remote device. In some embodiments, the lens may be mounted together with the display in the remote device, such that the lens is located between the display and a retina of a viewer’s eye.
The system may comprise a pair of the lenses and a pair of the displays, each of the displays mounted in the remote device associated with a respective lens to be viewed by a respective one of the viewer’s eyes.
In some embodiments, the remote device may be configured to be worn on the head of the viewer, and may, in some cases, comprise a set of glasses, such as, for example, an augmented reality set of glasses or a headset, such as, for example, a virtual reality headset.
In other aspects, an embodiment of the invention may provide a system consisting of:
• a head-mounted display device comprising one or more display panels mounted in front of a user’s eyes and incorporating lenses in order to aid focussing, arranged to transmit configuration information to a connected host device; and
• a host device arranged to receive configuration information from the head-mounted display device and alter the running of an application in order to produce display data suited to the particular requirements of that head-mounted display device.
The head-mounted display device might be a virtual-reality headset, augmented-reality glasses, or any other such head-up display.
The configuration information sent may be limited to display information such as the nature of connected lenses and the displays integral to the head-mounted display device, but may also include other capabilities and configuration settings, such as sensor inputs available, multimedia outputs available, networking capability and signal strength, etc.
This will allow different display devices to be swapped between host systems and still operate correctly, which will mean that a user need only have one such display device. It could also allow lenses to be swapped if the display device could be properly configured to accept the new lenses, which would be useful where a user requires corrective lenses rather than only the conventional lens supplied in a headset.
Brief Description of the Drawings
Embodiments of the invention will now be more fully described, by way of example, with reference to the drawings, of which:
Figure 1 shows a conventional system;
Figure 2 shows an arrangement of a display panel, a lens, and a user’s eye;
Figure 3 shows a system arranged according to an embodiment of the invention;
Figure 4 shows an arrangement according to a second embodiment of the invention; and Figure 5 shows an arrangement according to a third embodiment of the invention.
Detailed Description of the Drawings
Figure 1 shows a conventional system arranged according to the current art, which comprises a host [11], connected to a display controller [13], which in turn is connected to an eyepiece [14] which contains two display panels [16]. The display controller [13] and eyepiece [14] are likely to be co-located, i.e. contained within a single casing such that they appear to be one device. As such, in some embodiments the display controller [13] may not appear to be a separate device, but there will nonetheless be some method of controlling the display panels [16]. In order to indicate this, the eyepiece [14] and the display controller [13] are shown within a single casing, together comprising a headset [12].
The display controller [13] may be a general-purpose programmable processor, which may have several cores, or it may be a collection of special-purpose engines, or it may be a combination of the two. It is likely to have memory available.
The headset [12] may be any device for displaying image data locally to the user, including a virtual-reality headset, augmented-reality glasses, or a head-up display. It is preferably positioned extremely close to the user and may be head-mounted. For the purposes of this description, the example used will be a virtual-reality headset incorporating speakers, a microphone, a gyroscope, and an accelerometer.
Likewise, while the host [11] could be any computing device capable of connecting to a suitable display device and running any application or applications which can send data for display on the display device, for the purposes of this description the example used will be a gaming console running a video game.
The host console [11] may be connected to the headset [12] via any wired or wireless connection, and there may also be other inputs, outputs, and controls such as mice, keyboards, joysticks, other displays, speakers, etc. which are not shown in the diagram.
Figure 2 shows a cross-section of part of the headset [12], showing a display panel [16] and a lens [21] in front of it, together with a user’s eye [22] when the system is in use. The display panel [16] is similar to the display panel shown in Figure 1, and the lens [21] is of a standard type. Conventionally, the specifications of the display panel [16] and the characteristics of the lens [21] are hard-coded into the headset [12] and are therefore known to the host [11] and the applications [15] running on the host [11], and they cannot be changed.
Display data is produced by the application [15] on the host [11] and transmitted to the headset [12], where it is received by the display controller [13]. It is processed for display on the display panels [16] comprising the eyepiece [14] by scaling, colour correction, decompression, decryption, etc. It is then sent to the display panels [16] for display. The user views it through the lenses [21], which allow focussing at the very close range required. However, they distort the image, most notably by introducing a ‘fish-eye’ effect whereby only the centre of the user’s field of vision is clear and the surroundings are warped, and the application [15] must allow for this distortion when generating the image by pre-distorting the image. In the case of the ‘fish-eye’ distortion, this could involve warping the edges of the images sent to the display panels [16] in order to counteract the warping caused by the lenses [21]. This would preferably be carried out as the last stage in generating the images prior to transmission.
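By way of illustration only (the patent does not prescribe a particular distortion model), such pre-distortion is commonly implemented as an inverse radial warp. The sketch below assumes a simple polynomial model with coefficients k1 and k2 taken from the transmitted lens characteristics, and uses nearest-neighbour sampling for brevity.

```python
import numpy as np

def predistort(image: np.ndarray, k1: float, k2: float = 0.0) -> np.ndarray:
    """Warp an image with the inverse of a simple radial distortion model,
    so that viewing it through the lens cancels the lens's own distortion."""
    h, w = image.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    norm = min(cx, cy)                        # shorter half-dimension -> r = 1
    u, v = (xs - cx) / norm, (ys - cy) / norm
    r2 = u * u + v * v
    scale = 1.0 + k1 * r2 + k2 * r2 * r2      # r' = r * (1 + k1*r^2 + k2*r^4)
    src_x = np.clip(u * scale * norm + cx, 0, w - 1).round().astype(np.intp)
    src_y = np.clip(v * scale * norm + cy, 0, h - 1).round().astype(np.intp)
    return image[src_y, src_x]                # nearest-neighbour resample
```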
A further example of distortion is chromatic aberration, as a lens [21] may introduce a colour shift to the image as observed by the viewer. For example, thick glass may introduce a slight blue or green tint. This must also be known and accounted for by the application [15] during pre-distortion, and in this case the application [15] may, for example, introduce a slight red tint to the images prior to transmission.
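A minimal sketch of such a colour-cast compensation, assuming per-channel gain values supplied with the lens characteristics (the gains shown are illustrative):

```python
import numpy as np

def compensate_tint(image: np.ndarray, channel_gain) -> np.ndarray:
    """Scale each colour channel to cancel a colour cast introduced by the
    lens, e.g. boosting red slightly to offset a blue or green tint."""
    out = image.astype(np.float64) * np.asarray(channel_gain)
    return np.clip(out, 0, 255).astype(np.uint8)

# Illustrative gains for a lens with a slight blue-green cast:
# corrected = compensate_tint(frame, (1.06, 0.98, 0.94))
```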
Since the details of the display panel [16] and associated lens [21] are hard-coded, the application’s [15] corrections must also be hard-coded, and this means that the application [15] can only be used with this headset [12] when it is fitted with these lenses [21]; otherwise the image seen by the user will still appear to be distorted, leading to a poor user experience.
Figure 3 shows a similar system to that shown in Figure 1, but arranged according to one embodiment of the invention.
The headset [32] shown in Figure 3 also consists of a display controller [33] and an eyepiece [34], incorporating two display panels [39]. These display panels [39] are likely to be similar to those shown in Figures 1 and 2, and like those display panels [16] will have associated lenses [21] to allow the user to view them clearly from a close range. As previously described, these lenses [21] will introduce distortions that must be accounted for during image generation.
In this system, however, the characteristics [38] of the headset [32], including characteristics of the lenses, are available to the display controller [33] and in some embodiments are changeable. They are shown in an area of memory on the display controller [33] in Figure 3.
As in Figure 1, the headset [32] is connected [35] to a host device [31], which in this example is a gaming console but may be any device which is capable of generating display data. The host [31] is running an application [37], which generates display data and transmits it to the display controller [33].
There is also a second connection [36] shown between the headset [32] and the host [31], although in practice the same connection [35] would most likely be used, or at least the two connections may be separate wires in a single physical cable or separate wireless ports in the same wireless connection device, for example. Once the display controller has obtained the descriptive data [38] about the lens, this connection [36] carries it to the host [31], where it is received and passed to the application [37] for use in generating display data. This will allow the host to determine whether the display data needs to be adjusted so that it is ideally suited for the characteristics of the headset [32] and, if so, to perform such adjustments prior to transmitting it for display. The display controller [33] then receives this adjusted display data and displays it on the display panels [39].
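A rough sketch of this host-side decide-and-adjust logic, building on the pre-distortion helpers above; the record layout and the "needs adjustment" test are assumptions for illustration, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class LensCharacteristics:
    k1: float = 0.0                          # radial distortion coefficients
    k2: float = 0.0
    channel_gain: tuple = (1.0, 1.0, 1.0)    # colour-cast compensation gains

def needs_adjustment(c: LensCharacteristics) -> bool:
    # An ideal lens (no distortion, neutral colour) needs no pre-distortion.
    return c.k1 != 0.0 or c.k2 != 0.0 or c.channel_gain != (1.0, 1.0, 1.0)

def prepare_frame(frame, c: LensCharacteristics):
    """Adjust a rendered frame for the reported lens characteristics [38]
    before forwarding it to the display controller [33]."""
    if needs_adjustment(c):
        frame = predistort(frame, c.k1, c.k2)           # sketch above
        frame = compensate_tint(frame, c.channel_gain)  # sketch above
    return frame
```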
This is distinct from the current art as previously described; in the current art the characteristics of the headset [32] are programmed into the application [37], not transmitted by the display controller [33], which may not have access to them. This means that the pre-distortion will always be carried out in the same way. Using the methods of the invention, different headsets [32] could be interchanged, provided they are capable of transmitting these characteristics [38].
The characteristics [38] shown in Figure 3 include Electronic Device ID (EDID) and optics information for the display panels [39] and lenses [21], which may include characteristics such as resolution of the display panels [39] and lens characteristics such as the focal length of the lenses [21]. Furthermore, the characteristics may include compression optimisation data such as foveal area equations. Conventionally, these are also pre-configured by the manufacturer and stored as parameters in the headset [32]. However, it would be beneficial for them to be changed by the user in order to take account of differing eye-tracking patterns, such as conditions that cause the user’s eyes to not necessarily point in the same direction. Such amended parameters could then be transmitted to the host along with the rest of the characteristics [38] so that the application [37] can use them in foveal pre-distortion and compression.
In this particular embodiment, the characteristics [38] of the headset may also include audio and sensor capabilities, since, as previously mentioned, the headset [32] in this example incorporates speakers, a microphone, a gyroscope and an accelerometer. This will allow further fine-tuning for the application [37] to suit the connected headset [32], for example production of a different sound quality, acceptance of verbal commands, or allowance for different sensor inputs. These are examples only, and other characteristics such as the physical size of the display panels [39], the diameter of the lenses [21] and the chromatic aberration introduced by the lenses [21] may also be stored and transmitted as appropriate.
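One possible shape for such a characteristics block [38], serialised for transmission over connection [36]; all field names and values here are illustrative assumptions, not a format defined by the patent:

```python
import json

# Hypothetical wire format for the headset characteristics block [38].
characteristics = {
    "edid": "<hex-encoded EDID bytes>",
    "panels": {"count": 2, "resolution": [1080, 1200], "refresh_hz": 90},
    "lens": {"id": "LENS-A1", "focal_length_mm": 40.0, "diameter_mm": 35.0,
             "k1": 0.22, "k2": 0.05,
             "channel_gain": [1.02, 1.00, 0.97]},
    # Foveal-area equation expressed as ellipse semi-axes about the gaze
    # point, for use in foveal pre-distortion and compression.
    "foveal_area": {"model": "ellipse", "semi_axes_deg": [9.0, 7.0]},
    "sensors": ["gyroscope", "accelerometer", "microphone"],
    "audio": ["speakers"],
}

payload = json.dumps(characteristics).encode()   # sent to the host [31]
```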
The characteristics [38] could be altered if the hardware of the headset [32] is changed. For example, the lenses [21] could be made replaceable so that if a user required assistive technology, specially-designed lenses could be substituted for the standard ones and the characteristics [38] of the headset [32] amended accordingly so that different information will be transmitted to the host [31] when appropriate. This would make the system even more useful, as it would introduce flexibility and improved customisation. This could in turn mean that the headset could use different parameters, for example a user identification such as a username/password pair or biometric data, to determine the characteristics of the lens. This would be useful if, for example, a particular user always used a particular pair of lenses, which may, for example, be contact lenses.
In a further embodiment, the characteristics [38] of the headset [32] could be changed due to a change in software, for example if an upgrade to the firmware of the display controller [33] allows it to perform some pre-distortion locally according to its own knowledge of the characteristics [38] of the headset [32]. For example, it may be able to perform pre-distortion to correct for chromatic aberration, and it might be useful to then change the information on chromatic aberration transmitted to the host [31] so that no redundant pre-distortion is carried out.
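A sketch of that masking step, under the assumed field names from the payload above: when the firmware corrects an effect locally, the controller reports a neutral value for it so the host performs no redundant correction.

```python
def characteristics_for_host(stored: dict, local_corrections: set) -> dict:
    """Report characteristics [38] to the host [31], hiding any effect the
    display controller's firmware already corrects locally."""
    reported = dict(stored)
    if "chromatic_aberration" in local_corrections:
        # The lens now looks colour-neutral from the host's point of view.
        reported["lens"] = dict(reported["lens"], channel_gain=[1.0, 1.0, 1.0])
    return reported
```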
The changes to the characteristics [38] could be made manually, so that the headset [32] has to be re-programmed with awareness of new hardware or software updates that have been introduced. Alternatively, they could be made automatically by, for example, an automatic configuration program run by inserting a disk or running an application on a connected device, or by an aspect of any new hardware, such as a small integrated circuit or a barcode incorporated into a lens [21] that automatically causes changes in the characteristics of the headset [32] when the lens is inserted. Because the characteristics [38], especially optics information, are particularly associated with different lenses [21], the display controller [33] could store the characteristics [38] as a list of collections of characteristics, each associated with a lens identifier, meaning that, for example, every time a new lens [21] is inserted the headset [32] identifies it via, for example, the aforementioned incorporated barcode and checks whether it has characteristics stored. If so, it is able to transmit them to the host [31] for use in adjusting the display data transmitted by the application [37]. If not, it could, for example, download them from the internet or fetch them from memory incorporated into the lens [21] and store them as well as transmitting them.
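That lookup-with-fallback behaviour might look roughly like this (a sketch; the local store and remote fetch mechanisms are assumptions):

```python
def characteristics_for_lens(lens_id: str, local_store: dict, fetch_remote=None):
    """Resolve a newly inserted lens (identified e.g. by its barcode) to a
    stored characteristics record, fetching and caching it if unknown."""
    record = local_store.get(lens_id)
    if record is None and fetch_remote is not None:
        # e.g. download from the internet, or read memory on the lens itself
        record = fetch_remote(lens_id)
        if record is not None:
            local_store[lens_id] = record   # store as well as transmitting
    return record                            # then transmitted to the host [31]
```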
The specific characteristics could also be determined by the host [31], such that the lens identification data is read by the headset [32] and transmitted to the host [31], which has a similar method of storing and determining the characteristics of different lenses [21] for the use of the application [37]. If, as previously mentioned, the lens [21] is identified via a user identification, this user identification could be transmitted to the host [31] and used in the same way.
A further example of a manual change in characteristics could be a change to the physical configuration of the headset [32], most likely through user manipulation. For example, the user could physically manipulate the eyepiece [34] to move the display panels [39] further apart to match his or her interpupillary distance. This distance could also be one of the characteristics [38] transmitted to the host [31] for use by the application [37] in, for example, generating stereoscopic data, so that when different images are being generated for the display panels [39] to show different angles on a virtual 3D object, the application [37] can account for the distance between the user’s eyes as he or she views the object.
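For instance, a stereoscopic renderer might derive its per-eye virtual camera offsets directly from the reported interpupillary distance (a minimal sketch; splitting the distance symmetrically about the viewing axis is an assumed convention):

```python
def stereo_eye_offsets(ipd_mm: float):
    """Horizontal virtual-camera offsets (metres) for the left and right
    views, derived from the interpupillary distance reported with [38]."""
    half = (ipd_mm / 1000.0) / 2.0
    return -half, +half

# e.g. a 64 mm interpupillary distance puts the cameras at -0.032 m and +0.032 m
left_x, right_x = stereo_eye_offsets(64.0)
```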
Naturally, in a more mobile headset such as a pair of augmented-reality glasses which are not designed to be connected to an external host, the computing device which acts as the host [31] may be integrated into the same casing as the display controller [33] as an internal display generator so that the whole can be configured to be worn on the user’s head. The system will, however, operate in the same way, with the display controller [33] receiving the lens and headset characteristics [38] and transmitting them to the display generator [31] so that it can determine if adjustments to the display data are required and make them, prior to transmitting the display data to the display controller [33] for display.
Finally, the lens [21] might not be part of the headset [32] as long as it is positioned between the user’s eye (or at least his or her retina) and the eyepiece [39]. It may be within, in contact with, or merely close to the user’s eye, and the above-described methods could still apply. Examples of such embodiments are shown in Figures 4 and 5.
Figure 4 shows a host device [41] connected to a display controller [46]. In this example, the host device [41] is a smartphone running an application [45] that generates display data and transmits it [48] to the display controller [46] in a set-top box [410]. This is integrated into a display device and therefore controls the integral display panel [49], but the display controller [46] and the panel [49] are separate devices that merely share a casing. The smartphone [41] may be connected to the set-top box [410] by a wired or wireless connection, and even over a network, including the Internet. In this example, the connection is through a USB cable [48]. As in Figure 3, there is a second connection [411] shown between the smartphone [41] and the set-top box [410] which carries the lens characteristics. As previously described, in practice this is likely to be part of the same connection [48], but it may be separate.
Figure 4 also shows a user’s eye [42] viewing the display device [49] when it is in use. The user is viewing the display device [49] through a contact lens [43] which will be on the surface of the eye [42], though similar methods could be used for a bionic lens which replaces the internal lens of the eyeball, glasses or goggles worn by the user, or other similar personal lenses.
The contact lens [43] shown in Figure 4 incorporates a transmitter [44]. In this example, this is an RFID tag which may be read by the display controller, but it may alternatively be a wireless transmitter that uses Bluetooth or a similar short-range wireless connection protocol. In any case, it transmits [47] information regarding characteristics of the lens [43] to the display controller [46] in the set-top box [410]. This information could comprise the actual characteristics of the lens [43], or a parameter such as a lens identifier from which the characteristics could be derived, or a user identifier from which a lens identifier or characteristics could be derived, as previously described. In any case, the set-top box [410] determines the characteristics of the lens [43] and then transmits [411] these to the application [45] running on the smartphone [41]. The application [45] is then able to determine whether any adjustment to the display data is appropriate and perform it as necessary. The adjusted display data is then transmitted [48] to the set-top box [410] so that it can forward it for display on the display device [49].
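The three payload possibilities described here (the characteristics themselves, a lens identifier, or a user identifier) suggest a resolution chain along these lines (a sketch with assumed data structures):

```python
def resolve_characteristics(payload: dict, lens_db: dict, user_to_lens: dict):
    """The tag [44] may carry the characteristics themselves, a lens
    identifier, or a user identifier; resolve whichever was received."""
    if "characteristics" in payload:
        return payload["characteristics"]                # use directly
    lens_id = payload.get("lens_id")
    if lens_id is None and "user_id" in payload:
        lens_id = user_to_lens.get(payload["user_id"])   # the user's usual lens
    return lens_db.get(lens_id) if lens_id else None
```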
Naturally, if other information such as the EDID or optical characteristics of the display [49] are available to the set-top box [410], these may be transmitted and optionally used alongside the lens characteristics.
In a similar embodiment, if the set-top box [410] is capable of adjusting display data itself, it might not transmit the lens characteristics to the smartphone [41] but instead, having obtained the characteristics via the transmitter [44] on the lens [43], it may receive unadjusted display data from the smartphone [41] over the USB connection [48] and determine for itself whether adjustment is required. It may then adjust the display data based on its knowledge of the characteristics of the lens if necessary and forward it for display on the display device [49].
Figure 5 shows a host device [51] which, as in Figure 4, is a smartphone. In this example, it is connected wirelessly via a short-range wireless connection [54] such as Bluetooth directly to a display device such as a television [55]. Among other components, the smartphone [51] includes a characteristic input engine [52] and a display generation engine [53]. The characteristic input engine [52] is configured to receive signals from appropriately-designed lenses which contain information regarding characteristics of the lenses in question. It is then able to transfer these to the display generation engine [53] for use in generating display data. As previously described, the display generation engine [53] may use these characteristics to determine if pre-distortion and other adjustments on the generated display data are necessary and perform these prior to transmitting it to the display device [55] for display. For example, if the display generation engine [53] determines from the lens characteristics that the lens [56] distorts images seen through it by vertically ‘squashing’ them, it may vertically ‘stretch’ the images it generates prior to transmission.
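A minimal sketch of that vertical pre-stretch (nearest-neighbour sampling; stretching symmetrically about the horizontal midline is an assumed convention):

```python
import numpy as np

def stretch_vertically(image: np.ndarray, factor: float) -> np.ndarray:
    """Pre-stretch an image vertically by `factor` so that a lens squashing
    it by the same amount shows correct proportions."""
    h = image.shape[0]
    # Output row y samples source row centre + (y - centre) / factor,
    # so factor > 1 stretches the image about its horizontal midline.
    ys = (np.arange(h) - (h - 1) / 2.0) / factor + (h - 1) / 2.0
    ys = np.clip(ys.round(), 0, h - 1).astype(np.intp)
    return image[ys]
```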
Since in this embodiment the smartphone [51] is receiving the lens characteristics, generating the display data and supplying it directly to the display device [55], it incorporates the components and functionality of the host device [31/41] and the display controller [33/410] of Figures 3 and 4. Roughly, these correspond to the display generation engine [53] and the characteristic input engine [52] respectively.
Like Figure 4, Figure 5 shows a user’s eye [42] viewing the display device [55] when it is in use. In this embodiment, the user is viewing the display device [55] through a bionic lens [56] which replaces the internal lens of the eyeball. As in Figure 4, there is a transmitter attached to the lens, but in this case the ‘transmitter’ [57] is not an active transmitter as in Figure 4 but a bar code printed on the lens [56]. It is read by a sensor incorporated into the characteristic input engine [52] on the smartphone [51], which thus receives information regarding characteristics of the lens [56]. This information could comprise the actual characteristics of the lens [56] to be used, or a lens identifier from which the characteristics could be derived, or a user identifier from which a lens identifier or characteristics could be derived, as previously described. The smartphone [51] is then able to determine whether any adjustment to the display data is appropriate and perform it as necessary prior to transmitting the display data to the display device [55].
These embodiments could further increase the benefits of the invention, as they could allow the user to wear his or her own assistive device or have fully-customised lenses for use with any appropriately-configured display device.
Although particular embodiments have been described in detail above, it will be appreciated that various changes, modifications and improvements can be made by a person skilled in the art without departing from the scope of the present invention as defined in the claims. For example, hardware aspects may be implemented as software where appropriate and vice versa, and modules which are described as separate may be combined into single modules and vice versa. Functionality of the modules may be embodied in one or more hardware processing device(s) e.g. processors and/or in one or more software modules, or in any appropriate combination of hardware devices and software modules. Furthermore, software instructions to implement the described methods may be provided on a computer readable medium.

Claims (33)

Claims
1. A method for adjusting display data, the method comprising:
obtaining information regarding at least one characteristic of a lens located between a display and a retina of a viewer’s eye;
determining whether an adjustment of display data to be displayed on the display should be made based on the at least one characteristic of the lens;
adjusting the display data to be displayed on the display based on the at least one characteristic of the lens, if it is determined that an adjustment of display data to be displayed on the display should be made; and forwarding the adjusted display data for display on the display.
2. A method according to claim 1, wherein the information includes one or more of: lens characteristic data including the at least one lens characteristic;
lens identification data from which the lens characteristic data can be determined; and viewer identification data from which the lens characteristic data can be determined.
3. A method according to claim 2, wherein the information comprises either the lens identification data or the viewer identification data and the method further comprises
determining from the lens identification data or viewer identification data the lens characteristic data, wherein the determining whether an adjustment of display data to be displayed on the display should be made is based on the lens characteristic data.
4. A method according to any preceding claim, wherein the at least one characteristic of the lens includes at least one optical characteristic of the lens.
5. A method according to claim 4, wherein the at least one optical characteristic of the lens includes one or more of:
optical distortion of the lens; focal length of the lens; diameter of the lens; and chromatic aberration of the lens.
6. A method according to any preceding claim, wherein obtaining the information comprises receiving the information over a short-range wireless connection from the lens.
7. A method according to claim 6, wherein the short-range wireless connection comprises a wireless RF transmitter mounted to the lens.
8. A method according to any one of claims 1 to 5, further comprising:
determining at least one parameter of a lens located between a display and a retina of a viewer’s eye; and
generating the information based on the at least one parameter of the lens.
9. A method for displaying display data to a viewer, the method comprising:
determining at least one parameter of a lens located between a display and a retina of a viewer’s eye;
generating information regarding at least one characteristic of the lens based on the at least one parameter of the lens;
transmitting the information to a display generator to enable the display generator to adjust display data based on the at least one characteristic of the lens;
receiving, from the display generator, adjusted display data; and displaying the adjusted display data on the display.
10. A method according to either claim 8 or claim 9, wherein determining at least one parameter of a lens comprises reading a mechanical indicium on the lens.
11. A method according to either claim 8 or claim 9, wherein determining at least one parameter of a lens comprises determining an identification of the lens.
12. A method according to either claim 8 or claim 9, wherein determining at least one parameter of a lens comprises receiving the at least one parameter over a short-range wireless connection from the lens.
13. A method according to claim 12, wherein the short-range wireless connection comprises a wireless RF tag mounted to the lens.
14. A method according to any one of claims 1 to 5 or 8 to 13, wherein the information further comprises Electronic Device ID, EDID, data, optical characteristics of the display and/or compression optimisation data such as foveal area equations.
15. A method according to any one of claims 1 to 5 or 8 to 14, wherein the lens and the display are mounted in a remote device and the information further includes sensor and/or audio capabilities of the remote device.
16. A method according to claim 15, wherein the sensor and/or audio capabilities of the remote device includes one or more of:
a gyroscope;
an accelerometer; a microphone; and
speakers.
17. A device for adjusting display data, the device configured to perform all the steps of a method according to any one of claims 1 to 7 or claims 14 to 16.
18. A device according to claim 17, wherein the device is configured to receive the information from a remote device and is configured to transmit the adjusted display data to the remote device.
19. A remote device configured to perform all the steps of a method according to any one of claims 9 to 13 or claims 14 to 16 when dependent on any one of claims 9 to 13.
20. A system comprising a device according to either claim 17 or claim 18 and the display.
21. A system comprising a device according to either claim 17 or claim 18, a remote device according to claim 19 and the display.
22. A system according to either claim 20 or claim 21, wherein the display is mounted on a frame configured to be worn on the viewer’s head.
23. A system according to any one of claims 20 to 22, wherein the lens is configured to be within, in contact with or close to the viewer’s eye.
24. A system according to claim 23, wherein the lens is a contact lens or a lens implanted in the viewer’s eye.
25. A system according to claim 22, wherein the lens is mounted together with the display on the frame, such that the lens is located between the display and a retina of a viewer’s eye.
26. A system according to claim 25, comprising a pair of the lenses and a pair of the displays, each of the displays mounted on the frame associated with a respective lens to be viewed by a respective one of the viewer’s eyes.
27. A system according to claim 22 when dependent on claim 21, wherein the frame forms part of the remote device.
28. A system according to claim 27, wherein the lens is mounted together with the display in the remote device, such that the lens is located between the display and a retina of a viewer’s eye.
29. A system according to claim 28, comprising a pair of the lenses and a pair of the displays, each of the displays mounted in the remote device associated with a respective lens to be viewed by a respective one of the viewer’s eyes.
30. A system according to any one of claims 27 to 29, wherein the remote device is configured to be worn on the head of the viewer.
31. A system according to any one of claims 27 to 30, wherein the remote device comprises a set of glasses or a headset.
32. A system according to claim 31, wherein the headset is a virtual reality headset.
33. A system according to claim 31, wherein the set of glasses is an augmented reality set of glasses.
Intellectual Property Office — Application No: GB1615392.6 — Examiner: Sophie Cartmell
GB1615392.6A 2016-09-09 2016-09-09 Adjusting display data Active GB2557569B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1615392.6A GB2557569B (en) 2016-09-09 2016-09-09 Adjusting display data
PCT/GB2017/052529 WO2018046892A1 (en) 2016-09-09 2017-08-30 Adjusting display data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1615392.6A GB2557569B (en) 2016-09-09 2016-09-09 Adjusting display data

Publications (3)

Publication Number Publication Date
GB201615392D0 (en) 2016-10-26
GB2557569A (en) 2018-06-27
GB2557569B GB2557569B (en) 2022-07-06

Family

ID=57234527

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1615392.6A Active GB2557569B (en) 2016-09-09 2016-09-09 Adjusting display data

Country Status (2)

Country Link
GB (1) GB2557569B (en)
WO (1) WO2018046892A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10657674B2 (en) 2016-06-17 2020-05-19 Immersive Robotics Pty Ltd. Image compression method and apparatus
CN110494193A (en) 2017-02-08 2019-11-22 因默希弗机器人私人有限公司 User into multiplayer place shows content
CN111837384A (en) 2017-11-21 2020-10-27 因默希弗机器人私人有限公司 Frequency component selection for image compression
EP3714602A4 (en) 2017-11-21 2021-07-28 Immersive Robotics Pty Ltd Image compression for digital reality
WO2022055741A1 (en) * 2020-09-08 2022-03-17 Daedalus Labs Llc Devices with near-field communications

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2447758A1 (en) * 2010-10-26 2012-05-02 BAE Systems PLC Display assembly, in particular a head-mounted display
WO2013082387A1 (en) * 2011-12-02 2013-06-06 Aguren Jerry G Wide field-of-view 3d stereo vision platform with dynamic control of immersive or heads-up display operation
EP2980628A1 (en) * 2014-07-31 2016-02-03 Samsung Electronics Co., Ltd Wearable glasses and a method of displaying image via the wearable glasses

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4693552B2 (en) * 2005-08-30 2011-06-01 キヤノン株式会社 Display device, control device, and control method
JP5341462B2 (en) * 2008-10-14 2013-11-13 キヤノン株式会社 Aberration correction method, image processing apparatus, and image processing system
EP2889873B1 (en) * 2012-08-27 2018-07-11 Sony Corporation Image display device and image display method, information communication terminal and information communication method, and image display system
US20140268012A1 (en) * 2013-03-12 2014-09-18 Christie Digital Systems Canada Inc. Immersive environment system having marked contact lenses coordinated with viewing stations
KR102201736B1 (en) * 2014-01-15 2021-01-12 엘지전자 주식회사 Detachable Head Mounted Display Device and and Controlling Method Thereof
US10228562B2 (en) * 2014-02-21 2019-03-12 Sony Interactive Entertainment Inc. Realtime lens aberration correction from eye tracking
WO2016017966A1 (en) * 2014-07-29 2016-02-04 Samsung Electronics Co., Ltd. Method of displaying image via head mounted display device and head mounted display device therefor

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2447758A1 (en) * 2010-10-26 2012-05-02 BAE Systems PLC Display assembly, in particular a head-mounted display
WO2013082387A1 (en) * 2011-12-02 2013-06-06 Aguren Jerry G Wide field-of-view 3d stereo vision platform with dynamic control of immersive or heads-up display operation
EP2980628A1 (en) * 2014-07-31 2016-02-03 Samsung Electronics Co., Ltd Wearable glasses and a method of displaying image via the wearable glasses

Also Published As

Publication number Publication date
GB201615392D0 (en) 2016-10-26
GB2557569B (en) 2022-07-06
WO2018046892A1 (en) 2018-03-15
