WO2018158347A1 - Improvements in or relating to virtual and augmented reality headsets


Info

Publication number
WO2018158347A1
Authority
WO
WIPO (PCT)
Prior art keywords
lens
virtual
adjustable
user
headset
Prior art date
Application number
PCT/EP2018/054983
Other languages
French (fr)
Inventor
Daniel Paul RHODES
Robert Edward Stevens
Original Assignee
Adlens Ltd
Priority date
Filing date
Publication date
Application filed by Adlens Ltd
Priority to TW107109989A (TW201937238A)
Publication of WO2018158347A1


Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/017 Head mounted
    • G02B 27/0172 Head mounted characterised by optical features
    • G02B 27/0101 Head-up displays characterised by optical features
    • G02B 2027/0132 Head-up displays characterised by optical features comprising binocular systems
    • G02B 2027/0134 Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
    • G02B 2027/0136 Head-up displays characterised by optical features comprising binocular systems with a single image source for both eyes
    • G02B 27/0149 Head-up displays characterised by mechanical features
    • G02B 2027/015 Head-up displays characterised by mechanical features involving arrangement aiming to get less bulky devices
    • G02B 2027/0154 Head-up displays characterised by mechanical features with movable elements
    • G02B 27/0179 Display position adjusting means not related to the information to be displayed
    • G02B 2027/0181 Adaptation to the pilot/driver

Definitions

  • the present invention relates to virtual and augmented reality headsets and has particular reference to virtual and augmented reality headsets of the kind in which two 2-dimensional images are displayed side-by-side within the headset to form a stereoscopic 3-dimensional image when viewed by a user through a pair of primary objective lenses that serve to magnify the images.
  • the images may be still images (e.g., photographs) or moving images (e.g., videos).
  • a virtual reality (VR) headset provides a virtual reality experience for the wearer.
  • VR headsets are widely used with computer games, but they are also used in other applications, including simulators, trainers and presentations.
  • An augmented reality (AR) headset offers a live direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data.
  • the present invention is concerned with virtual and augmented reality headsets in which a 3-D stereoscopic image is displayed within the headset, either alone or superposed on a view of the real-world environment.
  • Known virtual and augmented reality headsets generally comprise a housing that is worn on the face of the user over the eyes.
  • a stereoscopic 3-dimensional image is displayed within the headset comprising two side-by-side 2-D images which, when viewed by a user, form a virtual 3-D image.
  • a pair of primary lenses are provided within the headset for imaging the stereoscopic image.
  • an elasticated or adjustable headband is provided for attaching the housing to the user's head for hands-free use.
  • Some virtual and augmented reality headsets include one or more integrated displays for displaying the two images side-by-side that make up the 3-D image.
  • Other headsets, known as mobile virtual reality headsets, are designed to receive a mobile device comprising its own display such that the mobile device's display faces towards the user's eyes when fitted.
  • the mobile device, which may be a mobile phone, can be easily removed from the housing after use.
  • known virtual and augmented reality headsets provide for a limited degree of focusing of the image displayed on the integrated display(s) or display screen of the mobile device as the case may be.
  • the primary lenses are generally fixedly secured within the housing, but the mobile device is mounted on a part of the housing that is arranged to move forwards and backwards relative to the user, parallel to the optical axis of the primary lenses and perpendicular to the plane of the screen.
  • the degree of focusing that this provides is rather limited. Further, it is not possible to adjust the focus of each eye separately. No provision is made for adjusting for the user's cylinder or axis prescription.
  • the housing of a virtual or augmented reality headset includes
  • the removable mobile device may be fairly heavy and positioning it at a distance from the user's face gives rise to a considerable turning force that is applied to the user's face, which may be uncomfortable, notwithstanding the usual provision of foam padding or the like around a rear end of the headset where it contacts the user's face.
  • the accommodation reflex sends important sensory cues to the brain to help comprehend the image and discern the distance to the particular object being viewed.
  • some users of virtual or augmented reality headsets experience some feelings of nausea or disorientation, because they perceive a 3-dimensional image with various objects at different virtual distances, but the accommodation of their eyes remains unchanging.
  • a virtual or augmented reality headset comprising two lens groups for imaging two side-by-side 2-dimensional images displayed on a display within the headset to form a virtual stereographic 3-dimensional image, each lens group comprising a primary lens and at least one selectively adjustable lens having at least one variable optical characteristic.
  • each lens group may comprise a primary lens, a first adjustable lens for correcting a spherical refractory error in a user's distance vision and a second adjustable lens for correcting astigmatism and axis, the primary lens and first and second adjustable lenses within each lens group being in mutual optical alignment on a respective optical axis.
  • the first adjustable lens may therefore have a selectively variable focusing power (i.e. variable sphere).
  • the second adjustable lens may have a selectively variable cylinder power.
  • the second adjustable lens may also be selectively rotatable around its optical axis to adjust the axis of the astigmatism of the second adjustable lens.
  • the second adjustable lens of at least one lens group may also have selectively variable focusing power to supplement or modify the focusing power of the first adjustable lens.
  • the headset may be a virtual reality (VR) headset in which only the 3-dimensional image displayed on the display is visible.
  • the virtual reality headset may incorporate one or more integrated displays.
  • the virtual reality headset may be a mobile VR headset that is arranged to receive a mobile device having its own display, such that the mobile device is easily removable from the headset and, when fitted, is arranged such that its display faces the user's eyes.
  • the headset may be an augmented reality (AR) headset in which the 3-dimensional image is superimposed on an image of the real world viewed through the headset.
  • a variable optical characteristic of the at least one adjustable lens means variable optical power, including focusing power (sphere) or cylinder power (astigmatism), a variable axis, or a variable position of the lens in a direction transverse to the optical axis.
  • the at least one adjustable lens may be arranged for modifying the vision of a user of the headset.
  • the at least one adjustable lens may be configured for correcting the user's vision.
  • the adjustable lens may have variable sphere, cylinder and/or axis and may be selectively adjusted for correcting the user's prescription for sphere and/or astigmatism.
  • the at least one adjustable lens may alternatively or additionally be used for modifying the user's vision in accordance with an image displayed within the headset.
  • Modifying the user's vision as used herein therefore not only implies correcting the user's vision in accordance, for example, with the user's ophthalmic prescription for distance, pupillary distance, astigmatism and the like, but also manipulating the user's vision so that the user's eyes are forced to accommodate and/or verge as described in more detail below in response to how the user views an image on the display.
  • each lens group comprises a first adjustable lens and a second adjustable lens: the first adjustable lens is for correcting a mainly spherical component of the user's distance vision, and the second adjustable lens is for correcting mainly astigmatism.
  • the second adjustable lens within each lens group may also afford variable sphere.
  • the first adjustable lenses are independently adjustable, but in some embodiments, they may be linked for adjustment together, preferably simultaneous equal adjustment.
  • a second adjustable lens which provides for a small amount of sphere adjustment, may therefore allow for the correction of small differences in sphere between the user's two eyes.
  • the distance between the lens groups and the display within the headset may be fixed.
  • the positions of the lens groups within the headset may be fixed, but in other embodiments the lens groups may be movable sideways in a direction parallel to an interpupillary axis between the user's eyes.
  • the lens groups may be movable sideways in a direction parallel to the plane of the display.
  • each lens group comprises at least one cubic surface-type adjustable lens comprising two superposed lens elements having mutually cooperating cubic or higher order surfaces that are slidable relative to one another in a plane perpendicular to the optical axis of the lens for varying the optical power of the lens.
  • each lens group may comprise at least one Alvarez lens as disclosed, for instance, in US 3,305,294, the contents of which are incorporated herein by reference.
  • Other suitable cubic-type adjustable lenses with cubic and higher-order surfaces are disclosed by US 3,583,790, US 7,338,159, US 7,717,552, US 5,644,374 and WO 2013/030603, the contents of which are incorporated herein by reference.
  • the cubic surfaces of the lens elements may be shaped to provide variable sphere correction when they are displaced relative to one another along an x-axis of the adjustable lens and/or variable cylinder correction when they are displaced relative to one another along a y-axis of the adjustable lens, the x- and y-axes forming a 3-dimensional Cartesian coordinate system with a z-axis that is parallel to the optical axis of the adjustable lens.
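The way relative displacement of such cubic surfaces produces optical power can be illustrated with the standard Alvarez analysis. The following derivation is a sketch under the usual textbook assumptions (thin elements with sag profiles of the form A(xy² + x³/3)); it is offered for orientation only and is not a formula quoted from the patent:

\[ t_1(x,y) = A\left(xy^2 + \tfrac{x^3}{3}\right), \qquad t_2(x,y) = -A\left(xy^2 + \tfrac{x^3}{3}\right). \]

Displacing the two elements by +d and -d along the x-axis gives a combined thickness

\[ t_1(x-d,\,y) + t_2(x+d,\,y) = -2Ad\left(x^2 + y^2\right) - \tfrac{2}{3}Ad^3, \]

a paraboloidal term whose curvature, and hence spherical power, is proportional to the displacement d. Displacing the elements along the y-axis instead yields a combined thickness proportional to xy, i.e. a purely cylindrical (astigmatic) contribution, which is the basis of the sphere/cylinder split between x- and y-displacements described above.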
  • each lens element may have a first cubic or higher-order surface of the kind described above and a second opposite surface that is a regular surface of revolution.
  • the second surface of each lens element may be planar.
  • each lens group comprises a first cubic surface-type lens in which the lens elements are slidable relative to one another in a direction parallel to the x-axis for correcting the user's (mainly) spherical distance vision.
  • the x-axis of the first cubic surface-type lens may be oriented at an acute angle, suitably about 60-80°, relative to an interpupillary axis that extends between the lens groups.
  • the x-axis of the first cubic surface-type lens may be arranged generally parallel to an adjacent side of a user's nose.
  • the arrangements of the first cubic surface-type lenses of the two lens groups are mirror images of one another, so that while the x-axis of the first lens of one of the lens groups is oriented at an angle of 45-80° relative to the interpupillary axis, the x-axis of the first lens of the other lens group 100 is oriented at an angle of -45° to -80° relative to the interpupillary axis. This allows the lens groups to be positioned close to the user's face while still maintaining a large degree of relative movement between the two lens elements of the first cubic surface-type lens.
  • otherwise, the travel of the lens elements would be limited by the user's nose in one direction, by disturbance to the wearer's field of view in the other direction, or the first adjustable lens would have to be positioned forwardly of the nose.
  • the first adjustable lens of each lens group may comprise a variable focusing power liquid lens comprising a distensible membrane having an optical surface, wherein the focusing power of the liquid lens is related to the curvature of the membrane.
  • Suitable variable focusing power liquid lenses are disclosed by US 1269422, WO 99/061940 A1, US 2576581, US 3161718, US 3614215, WO2014/118546 A1, WO2014/199180 A1 and WO2017/055787 A2, the contents of which are incorporated herein by reference.
  • the optical power (focusing power) of the first adjustable lens may be variable in the range up to about ±15 dioptres, preferably -15 to +10 dioptres.
  • the focusing power of the first adjustable lens may be in the range -8 to +6 dioptres or -6 to +4 dioptres.
  • each lens group comprises a second cubic surface-type lens, the lens elements of the second lens being slidable relative to one another in a direction parallel to the y-axis of the second cubic surface-type lens for correcting astigmatism.
  • the cylinder power of the second cubic surface-type lens may be variable in a range of up to about +8 to -8 dioptres.
  • the cylinder power of the second cubic surface-type lens may be in the range +4 to -4 dioptres or +2 to -2 dioptres, where a sign change in cylinder is equivalent to the axis changing by 90°.
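The equivalence noted above between a sign change in cylinder and a 90° rotation of the axis is the familiar plus/minus-cylinder transposition. A minimal illustrative sketch follows; the function name and argument names are my own and are not taken from the patent:

    def transpose_cylinder(sphere: float, cylinder: float, axis_deg: float):
        """Convert a sphere/cylinder/axis prescription between plus- and
        minus-cylinder form: fold the cylinder into the sphere, flip its
        sign and rotate the axis by 90 degrees."""
        return sphere + cylinder, -cylinder, (axis_deg + 90) % 180


    # Example: -2.00 / +1.50 x 10 describes the same correction as -0.50 / -1.50 x 100
    print(transpose_cylinder(-2.00, +1.50, 10))  # (-0.5, -1.5, 100)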
  • each lens group may comprise a rotation-controlling actuator associated with the second cubic surface-type lens for adjusting the axis of the second adjustable lens.
  • Each second cubic surface-type lens may be arranged for rotation about the optical axis by up to 90°.
  • Each lens group may comprise a respective sliding-controlling actuator associated with the or each cubic surface-type lens for controlling the relative disposition of the lens elements for adjusting the optical power of the lens.
  • each of the lens groups may be mounted within the headset for translational movement in a direction parallel to the interpupillary axis that extends between the two lens groups.
  • the primary lens and at least one adjustable lens within each lens group are mounted for movement together as a group.
  • the lens groups may be mounted for translational movement in a direction parallel to the plane of the display.
  • a respective pupillary distance-controlling actuator may be associated with each of the lens groups for adjusting the interpupillary distance between the lens groups.
  • the positions of the primary lenses may be fixed, and only the one or more adjustable lenses may be movable as described.
  • the interpupillary distance between the lens groups may be adjustable in the range 50-76 mm. In some embodiments, the interpupillary distance between the lens groups may be adjustable in the range 52-72 mm or 54-70 mm.
  • the virtual or augmented reality headset further comprises at least one eye-tracker.
  • the user's line of sight may be calculated, from which the user's point of gaze on an image displayed within the headset may be determined.
  • Virtual distance data associated with image data used for creating the image within the headset may comprise virtual distance values for multiple points or regions within the image (encoded as parallax in the stereographic images), from which an object vertex distance (z-distance) may be determined.
  • the object vertex distance may be used for adjusting the at least one adjustable lens to modify the user's vision in a manner related to the object vertex distance for the point of gaze on the image.
  • the optical power of the first adjustable lens may be adjusted in inverse relation to the object vertex distance, so that the image is effectively viewed from a shorter virtual object distance.
  • the optical power of the first adjustable lens is greater when the user is looking at a distant point within the image having a greater object vertex distance than when he or she is looking at a nearer point within the image having a lower object vertex distance. In this way, the user's eyes are forced to accommodate in response to the virtual distance of the part of the image at which the user is looking.
  • the at least one adjustable lens within each group may be arranged to correct the user's vision in accordance with his or her prescription.
  • the adjustment of the focusing power of the first adjustable lens in relation to the virtual distance of the part of the image at which the user is looking may therefore comprise an adjustment from the user's prescription.
  • the at least one adjustable lens within each lens group may be set up to correct the user's distance vision, and the optical power of the first lens may be adjusted to a negative power relative to the user's distance vision when the user is looking at an object closer than infinity.
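A minimal sketch of one plausible mapping from virtual object distance to the sphere offset applied on top of the user's distance correction, consistent with the convention in the preceding paragraph (looking closer than infinity drives the first lens negative); the 1/d mapping, the clamp value and all names are illustrative assumptions rather than the patent's stated algorithm:

    def accommodation_offset(virtual_distance_m: float,
                             max_offset_dioptres: float = 4.0) -> float:
        """Offset (in dioptres) applied to the first adjustable lens relative
        to the user's distance correction, forcing the eye to accommodate."""
        if virtual_distance_m == float("inf"):
            return 0.0
        demand = 1.0 / virtual_distance_m           # accommodation demand in dioptres
        return -min(demand, max_offset_dioptres)    # clamp to the lens's available range


    print(accommodation_offset(float("inf")))  # 0.0  (distant point: prescription unchanged)
    print(accommodation_offset(0.5))           # -2.0 (0.5 m point: eye must accommodate ~2 D)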
  • software or firmware for controlling a virtual or augmented reality headset in accordance with a first aspect of the invention, which software or firmware comprises machine code that is executable by a computer comprising a processor and memory, for controlling the computer to receive in the memory ophthalmic prescription data for a user encoding at least the user's sphere correction for distance and to transmit control signals to the sliding-controlling actuators for adjusting the relative disposition of the lens elements of each of the first cubic surface-type lenses to adjust the focusing power of the first adjustable lenses in accordance with the user's prescription.
  • each of the first cubic surface-type lenses may be adjusted independently of the other, but as mentioned above, in some embodiments, the adjustment of the first adjustable lenses may be linked, so they are adjusted together.
  • the computer may be a separate computer that is connected to the headset of the invention, for example wirelessly or via a cable.
  • the computer may be a mobile device such, for example, as a mobile phone, that is received in the headset for using the mobile device's own display screen for displaying the image within the headset, as described above.
  • the computer may also comprise a storage device.
  • the software or firmware may be executable by the computer to obtain the user's prescription data from the storage device.
  • the software or firmware may be executable by the computer to prompt a user to input their prescription data.
  • the firmware of the invention may be stored in a suitable non-volatile memory within the headset.
  • the lens groups may be positioned further forwardly from the user's eyes than would be the case in a conventional pair of glasses for correcting the user's vision.
  • the adjustment of the first cubic surface-type lenses in accordance with the user's prescription may include an additional offset to account for the user's eye position or vertex distance when wearing the headset.
  • the size of the offset may be input with the prescription data or stored from a one-time calibration when the user first wears and adjusts the headset.
  • the vertex distance may be obtained automatically from an eye tracker device of the kind mentioned above that may be mounted within the headset.
  • the user's ophthalmic prescription data may further comprise one or more items of fitting data for a lens to the user such, for example, as interpupillary distance, cylinder, axis and lens vertex distance, age and add power.
  • the software may be executable by the processor for transmitting control signals to the pupillary distance-controlling actuators associated with the respective lens groups, the sliding-controlling actuators associated with the first and/or second adjustable lenses and/or the rotation-controlling actuators associated with the second adjustable lenses for adjusting the separation of the lens groups, the sphere power of the first lenses and/or the cylinder power and/or axis of the second adjustable lenses for correcting the user's vision in accordance with their prescription.
  • the software or firmware is executable by the computer for controlling the computer to receive in the memory the user's axis and cylinder corrections for astigmatism and transmitting control signals to the controllers to operate the rotation-controlling actuators and sliding-controlling actuators for the second cubic surface-type lenses to adjust the orientation and optical power of the second adjustable lenses to the user's prescription.
  • virtual or augmented reality software or firmware for operating a computer comprising a processor, memory and storage to control operation of a virtual or augmented reality headset in accordance with the first aspect of the invention, the software or firmware comprising executable code and data; the data comprising image data encoding a stereographic image for display within the headset, virtual distance data comprising virtual distance values for virtual distances from a user of multiple points or regions within the image and accommodation data comprising accommodation and/or vergence values corresponding to the virtual distances;
  • the executable code being executable by the processor for retrieving the image data from the storage and using the image data to generate a display signal for a display within the headset to display the image;
  • the virtual or augmented reality software or firmware of the third aspect of the invention when executed therefore controls operation of a virtual or augmented reality headset in accordance with the invention such that the user's vision is modified according to the virtual distance from the user to his or her point of gaze on the image displayed within the headset. In this way, as described above, the user's eyes may be forced to accommodate and/or verge when viewing parts of the image that have a shorter virtual distance than infinity.
  • the virtual distance of the user to his or her point of gaze on the image will be dependent upon the user's vertex distance.
  • the vertex distance may be obtained automatically from the eye tracker. Once measured, the vertex distance may be held in the memory of the computer during use of the software and/or stored in the storage for future use. In determining the virtual distance from the user to the user's point of gaze on the image, an adjustment may be made for the vertex distance.
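By way of illustration only, the image data, virtual distance data and accommodation data described in this aspect might be organised along the following lines; every class, field and function name here is an assumption made for the sketch, not terminology from the patent:

    from dataclasses import dataclass
    from typing import Dict, Tuple


    @dataclass
    class AccommodationEntry:
        accommodation_d: float   # accommodation value (dioptres) for a given virtual distance
        vergence_deg: float      # vergence value (degrees) for the same virtual distance


    @dataclass
    class StereoFrame:
        left_image: bytes                                # 2-D image shown to the left eye
        right_image: bytes                               # 2-D image shown to the right eye
        virtual_distances: Dict[Tuple[int, int], float]  # virtual distance (m) per image region
        accommodation: Dict[float, AccommodationEntry]   # accommodation/vergence keyed by distance


    def lookup(frame: StereoFrame, region: Tuple[int, int]) -> AccommodationEntry:
        """Return the accommodation/vergence entry for the region containing
        the user's point of gaze, using the nearest tabulated distance."""
        d = frame.virtual_distances[region]
        key = min(frame.accommodation, key=lambda k: abs(k - d))
        return frame.accommodation[key]

An adjustment for the user's vertex distance, as mentioned above, could be folded in before the lookup; the patent only states that such an adjustment may be made, without prescribing its form.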
  • the virtual reality software or firmware may be executed in conjunction with the set-up software or firmware described above.
  • the set-up software or firmware may be executed when the user puts on the headset to adjust the at least one adjustable lens in accordance with the user's prescription.
  • the first adjustable lens may be adjusted in accordance with the user's distance prescription.
  • the second adjustable lens, when present, may be adjusted in accordance with the user's cylinder and axis prescription.
  • the distance between the lens groups may be adjusted in accordance with the user's pupillary distance. During use of the headset, the distance between the lens groups may be reduced as described above when the user looks at parts of the stereoscopic 3-D image that appear to be nearer than infinity to cause the user's eyes to verge.
  • the focal power of the first adjustable lens may be reduced as described above when the user looks at parts of the stereoscopic 3-D image that appear to be nearer than infinity to cause the user's eyes to accommodate.
  • where the user's prescription data includes the user's age and/or presbyopia data, adjustment of the first adjustable lenses as described above may be disabled or limited for parts of the image at virtual distances for which the user's accommodation is insufficient owing to presbyopia.
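One way such a limit could be implemented, purely as an illustration, is to estimate the user's available amplitude of accommodation from age using Hofstetter's minimum-amplitude rule of thumb (15 - 0.25 x age, in dioptres) and cap the accommodation-forcing offset accordingly; the use of this particular formula is my assumption, not something stated in the patent:

    def max_accommodation_from_age(age_years: float) -> float:
        """Hofstetter's minimum amplitude of accommodation, in dioptres."""
        return max(15.0 - 0.25 * age_years, 0.0)


    def limited_offset(virtual_distance_m: float, age_years: float) -> float:
        """Accommodation-forcing sphere offset, limited by presbyopia."""
        demand = 0.0 if virtual_distance_m == float("inf") else 1.0 / virtual_distance_m
        usable = min(demand, max_accommodation_from_age(age_years))
        return -usable


    print(limited_offset(0.4, 30))  # -2.5  (well within a 30-year-old's amplitude)
    print(limited_offset(0.4, 60))  # -0.0  (no amplitude left; adjustment effectively disabled)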
  • the software or firmware may be executable by a mobile device having a display, e.g. a mobile phone, which can be fitted to the virtual or augmented reality headset for displaying the two side-by-side 2-dimensional images that form the stereographic 3-dimensional image.
  • the present invention therefore provides a virtual or augmented reality headset which comprises, in addition to the usual pair of primary lenses for imaging a stereoscopic image displayed within the headset, at least one adjustable lens for each eye.
  • the adjustable lens may be adjusted for modifying the user's vision, particularly for vision correction in order to avoid having to use glasses with the headset. This allows the headset to be made with a compact form factor with improved comfort as compared with prior virtual/augmented reality headsets.
  • the adjustable lens may also be controlled dynamically as the user views different parts of the stereoscopic image having different virtual distances from the user. In this way, the user is forced to accommodate when looking at different parts of the image, even though the distance between the user's eyes and the display remains fixed. This may help to overcome some of the feelings of nausea experienced by users of prior virtual/augmented reality headsets.
  • a second adjustable lens associated with each eye allows for adjustment of cylinder and/or axis for further improvement of the user's experience and image quality.
  • the two lens groups may be mounted for translation within the headset in a direction parallel to the plane of the image, i.e. in a direction parallel to an axis between the two lens groups, to allow the lens groups to be moved closer together or further apart.
  • the lens groups may therefore be set at a pupillary distance that matches the user's own pupillary distance.
  • the distance between the lens groups may be adjusted dynamically as the user looks at different parts of the image with different virtual distances from the user. Specifically, the lens groups may be moved closer together when the user looks at parts of the image with a shorter virtual distance from the user, e.g. for maintaining alignment between the lens groups' optical centres and the eyes as they verge when looking at near objects.
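A rough geometric sketch of why the optical centres might move inward for near objects: each eye rotates about a centre of rotation some distance behind the lens plane, so as the eyes converge the pupils shift nasally and the lens groups can follow. All numbers and names below are illustrative assumptions, not values from the patent:

    import math


    def optical_centre_separation(ipd_mm: float,
                                  virtual_distance_m: float,
                                  rotation_centre_mm: float = 27.0) -> float:
        """Approximate separation of the lens groups' optical centres so that
        each centre stays on its eye's line of sight for a point at the given
        virtual distance straight ahead of the user."""
        if virtual_distance_m == float("inf"):
            return ipd_mm
        # convergence angle of each eye towards the midline
        theta = math.atan2(ipd_mm / 2.0, virtual_distance_m * 1000.0)
        # each pupil shifts nasally by roughly r * sin(theta)
        return ipd_mm - 2.0 * rotation_centre_mm * math.sin(theta)


    print(round(optical_centre_separation(64.0, float("inf")), 1))  # 64.0
    print(round(optical_centre_separation(64.0, 0.5), 1))           # 60.6 (lens groups move inward)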
  • FIG. 1 shows a virtual reality headset worn by a user.
  • FIG. 2 is a perspective front view from above and a left side of the headset of FIG. 1
  • FIG. 3 is a perspective front view from the left side of the headset of FIGS. 1 and 2, showing an insert for a mobile phone or other handheld mobile device comprising a display screen.
  • FIG. 4 is another perspective front view from the left side of the headset shown in the previous figures, which shows a mobile phone in a partially docked position.
  • FIG. 5 is a schematic plan view of the virtual reality headset of FIGS. 1-4 showing a pair of lens groups, in accordance with the present invention.
  • FIG. 6 is a schematic side view, showing the headset of FIG. 5 worn on the face of the user.
  • FIG. 7 is a schematic plan view of a right-hand group of lenses for the headset of FIGS. 1-6.
  • FIG. 8 is a schematic rear view from above and a right side of the right-hand group of lenses of FIG. 7.
  • FIG. 9A is a schematic rear view from above and the right side of a first cubic-type lens forming part of the lens group shown in FIGS. 7 and 8.
  • FIG. 9B is a schematic rear view from above and the right side of a second cubic-type lens forming part of the lens group of FIGS. 7 and 8.
  • FIG. 10 is a plan view of the right-hand group of lenses of FIGS. 9 and 10, showing
  • FIG. 11 is an isometric rear view from above and the right side of the right-hand group of lenses of FIG. 10.
  • FIG. 12A is a rear view of the first cubic-type lens of the right-hand group of lenses of FIGS. 10 and 11.
  • FIG. 12B is a rear view of the second cubic-type lens of the right-hand group of lenses of FIGS. 10 and 11.
  • FIGS. 13A-13C show adjustment of the first cubic-type lens of FIG. 12A for modifying the user's accommodation.
  • FIGS. 14A-14C show adjustment of the second cubic-type lens of FIG. 12B for modifying the axis of the astigmatism correction.
  • FIGS. 15A-15C show adjustment of the second cubic-type lens of FIG. 12B for modifying the cylinder power.
  • FIG. 16 is a chart showing adjustments needed to the first and second cubic-type lenses of FIGS. 12A and 12B for correcting the user's vision and/or modifying their accommodation or pupillary distance (PD).
  • FIG. 17 is a flow diagram for software to adjust lenses within a headset according to the invention in accordance with a user's prescription.
  • FIG. 18 is a flow diagram for software for adjusting lenses within a headset according to the invention for modifying a user's vision according to a point or region of a stereoscopic 3D image being viewed within the headset.
  • FIG. 19 and FIG. 20 show schematically how the pupillary distance of the user varies according to the distance of an object being viewed.
  • FIG. 21 is a block diagram of the electronic components of the headset of FIGS. 1-15.
  • FIG. 1 shows a mobile virtual reality headset 10 according to one embodiment of the invention being worn by a user. While the present embodiment relates to a mobile virtual reality headset, which is designed to receive a mobile device incorporating a display, the present invention is equally applicable to virtual and augmented reality headsets with integrated displays.
  • the mobile virtual reality headset 10 of the present embodiment comprises a housing 12 having a front end 14, a rear end 15, a left-hand side 16, a right-hand side 17, a top 18 and a bottom 19.
  • the housing 12 is assembled from multiple parts, including a sidewall 20 that extends around the housing 12 and has a front end 22 and a rear end 23, a peripheral rear extension piece 30 and a detachable front cover 40.
  • An elasticated headband 39 is attached to the sidewall 20 at the right and left-hand sides 16, 17 of the housing 12 for securing the headset 10 on the head of a user, over the eyes, as shown in FIG. 1, for hands-free use.
  • an adjustable headband may be used.
  • the rear extension piece 30 is fitted in the rear end 23 of the sidewall 20 and, as is clear from FIGS. 1 and 2, extends around the rear end 23 of the sidewall 20, including a region (not shown) that is shaped to accommodate the user's nose.
  • the rear end 15 of the housing 12, including the rear end 23 of the sidewall 20 and the rear extension piece 30, are shaped to match the general contours of the user's face, while the rear extension piece 30 carries a strip of foam or other cushioning material 35 for comfort on the face of the user, particularly when the headband 39 is tightened around the user's head.
  • the front cover 40 is detachably secured to a moulded insert 50 that is fitted in the front end 22 of the sidewall 20.
  • the insert 50 has a front surface 51 that defines a shallow, forwards-facing recess 52 that is shaped and sized to receive a rectangular mobile device 1 having a display screen 2, of the kind shown in FIG. 4.
  • the front cover 40 may likewise have a recessed rear face, so that upon fitting the front cover 40 to the insert 50, the rear face of the front cover 40 and front face 51 of the insert 50 form a cavity for receiving the mobile device 1.
  • Forwardly protruding formations 53 are provided for engaging one or more cooperating formations formed on the front cover 40 for attaching the front cover 40 to the insert 50.
  • a dock connector 55 is mounted on a cylindrical block 56 which is hinged to the
  • the insert 50 is not movable relative to the housing 12.
  • the dock connector 55 is arranged for connecting the mobile device 1 to various on-board electronic components of the headset 10, as described below.
  • the headset 10 is equipped with a number of manually operable controls that are accessible by the user from outside the housing 12.
  • the housing 12 is provided with a trackpad 24 (see FIG.
  • the headset 10 further includes a proximity sensor 25 facing towards the user's face (see FIG. 5), an eye tracker 26, one or more motion tracking sensors 27, a charging port (also not shown) and a USB connector (also not shown).
  • the various component parts of the housing 12, including the sidewall 20, rear extension piece 30 and front cover 40 are formed from opaque materials to prevent ambient light from passing through these parts into the housing 12.
  • the rear extension piece 30 and foam strip 35 act to inhibit light from entering into the housing 12 through the rear end 15 by virtue of forming a close fit with the user's face.
  • the insert 50 is also formed from opaque material and, as best seen in FIG. 3, is shaped to define a pair of spaced apertures 60.
  • the apertures 60 are arranged to ensure that each of the user's eyes can only view a respective one of two side-by-side 2-D images that are shown on the display of the mobile device 1.
  • the insert 50 may be formed with an integral partition 62 between the two apertures 60, to block light from one of the images bleeding into the other aperture 60.
  • the insert 50 is formed with a generally cylindrical inner surface 64 having a diameter of about 50 mm that extends rearwardly from the front surface 51 through the insert 50.
  • a biconvex primary lens 70 is disposed behind each of the apertures 60 for imaging the side-by-side images shown on the display to form a 3-D stereoscopic image.
  • the primary lenses 70 have the same power as each other. The actual power of the lenses will depend on the distance between the display and lenses 70, but suitably the lenses 70 may have a power of about 35 dioptres.
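For orientation, a simple thin-lens estimate (the 35 dioptre figure is the only value taken from the preceding paragraph; the rest is arithmetic) relates that power to the lens-to-display spacing:

\[ P \approx 35\ \mathrm{D} \;\Rightarrow\; f = \frac{1}{P} \approx 0.0286\ \mathrm{m} \approx 28.6\ \mathrm{mm}, \]

so a display placed at or just inside the focal plane, roughly 29 mm from the primary lens, is imaged at or near optical infinity, which is one reason the housing can remain comparatively shallow.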
  • each of the primary lenses 70 is part of a respective lens group 100 that further comprises two adjustable lenses 200, 300 in accordance with the present invention, as described in more detail below.
  • Each lens group 100 is associated with a respective one of the user's eyes E for viewing a respective one of the side-by-side 2-D images that are displayed on the display 2 of the mobile device 1 to form a virtual 3-D stereoscopic image, as mentioned above.
  • the position of each lens group 100 between the front and rear ends 14, 15 of the housing 12 is fixed.
  • each lens group 100 is mounted within the housing 12 for translational movement on a horizontal axis H between the right and left hand sides 16, 17 of the housing 12 as described in more detail below.
  • the horizontal axis H is oriented substantially parallel to an interpupillary axis extending between the user's right and left eyes.
  • the lens groups 100 may conveniently be mounted to an interior surface of the sidewall 20, but in the present embodiment the lens groups 100 are mounted to a rear surface (not shown) of the insert 50, behind the apertures 60, as shown in FIGS. 5 and 6.
  • the adjustable lenses 200, 300 are disposed behind the corresponding primary lens 70.
  • the adjustable lenses 200, 300 may be positioned in front of their respective primary lenses 70.
  • Each lens group 100 therefore comprises two adjustable lenses 200, 300 for modifying the user's vision as described in more detail below.
  • by 'modifying' is meant not only that the adjustable lenses 200, 300 are capable of correcting the user's vision, but also that the user's view of the stereoscopic image may be controlled to compensate for the fact that the distance between the user's eyes and the display is constant.
  • a first adjustable lens 200 of each lens group 100 is provided for modifying a mainly spherical component of the user's vision, while a second adjustable lens 300 is provided for modifying mainly cylinder and axis components.
  • the first adjustable lens 200 is positioned behind the second adjustable lens 300, nearer the user's eyes E.
  • each of the first and second adjustable lenses 200, 300 of each lens group 100 is a cubic surface-type adjustable lens comprising two superposed lens elements 201, 202; 301, 302, as best shown in FIGS. 9A and 9B.
  • the lens elements 201, 202; 301, 302 of each lens 200, 300 have mutually cooperating cubic surfaces which are shaped such that, when the lens elements are shifted relative to one another in a plane perpendicular to the optical axis of the lens, the optical power of the lens varies.
  • other adjustable lenses such, for example, as fluidic lenses, may be employed for one or both of the adjustable lenses 200, 300, especially the first adjustable lenses 200 which are configured for correcting the user's spherical focusing power for distance vision as described in more detail below.
  • the lens elements 201, 202; 301, 302 of each cubic-type adjustable lens 200, 300 have polished opposite front and rear faces, and the optical thickness t of each lens element 201, 202; 301, 302 in a direction parallel to the optical axis z, as shown in FIG. 6, is defined by formula (I):
  • D is a constant representing the coefficient of a prism removed to minimise lens thickness and may be zero
  • E is a constant representing lens element thickness at the optical axis z
  • x and y represent coordinates on a rectangular coordinate system centred on the optical axis and lying in a plane perpendicular thereto
  • A is a constant representing the rate of lens power variation with relative lens element movement in the x direction, being positive for one of the lens elements 201, 202; 301, 302 and negative for the other lens element 202, 201; 302, 301.
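The expression for formula (I) itself is not reproduced in this text (it appears to have been a drawing or image in the original document). A standard Alvarez-type thickness profile consistent with the constants A, D and E defined above (cf. US 3,305,294) would be

\[ t(x, y) = A\left(x y^2 + \tfrac{1}{3}x^3\right) + D x + E, \]

with the sign of A opposite for the two lens elements of each pair; this should be read as a plausible reconstruction rather than as the patent's exact formula.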
  • One face of each lens element 201, 202; 301, 302 is planar, but in other embodiments of the invention, one face of each lens element may be a regular surface of revolution.
  • While the lens elements 202, 201; 302, 301 of the present embodiment are defined by formula (I) above, other cubic-type adjustable lenses with cubic and higher-order surfaces that are suitable for use in accordance with the present invention are disclosed by US 3,305,294, US 3,583,790, US 7,338,159, US 7,717,552, US 5,644,374 and WO 2013/030603, the contents of which are incorporated herein by reference.
  • Each of the lens groups 100 therefore comprises a primary lens 70 and first and second cubic-type adjustable lenses 200, 300 and is mounted within the housing 12 for lateral translational movement side-to-side on the above-mentioned horizontal axis H between the left and right hand sides 16, 17 of the housing 12, as indicated by the arrows A in FIG. 5, for adjusting the pupillary distance (PD) between the lens groups 100.
  • a separate actuator 503, 504 is associated with each lens group 100 for moving the lens groups 100 independently of one another under the control of respective control units 501, 502, as shown in FIG. 21.
  • a separate control unit 501, 502 is associated with each of the left-hand and right-hand lens groups 100.
  • the pupillary distance is adjustable in the range 50-76 mm, but different embodiments may be capable of different ranges of PD adjustment, for example 52-72 mm or 54-70 mm.
  • FIGS. 7 and 8 show the right-hand lens group 100 schematically in more detail.
  • the arrangement of the left-hand lens group 100 is similar to that of the right-hand lens group 100, but in mirror image.
  • the references to right- and left-hand throughout are anatomical, corresponding to the right and left eyes of the user.
  • the first and second adjustable lenses 200, 300 of the right- hand lens group 100 are mounted behind the corresponding primary lens 70 on the optical axis z.
  • the lens elements 201, 202 of the first cubic-type adjustable lens 200 are mounted for sliding movement relative to one another parallel to the x-axis of the lens 200 for adjusting the mainly spherical optical power of the lens 200.
  • the first adjustable lens 200 is mounted with the x-axis oriented at an acute angle of about 75° to the horizontal axis H, as best shown in FIG. 12A. In other embodiments, an angle in the range 60-80° may be used.
  • the lens groups 100 may be positioned within the housing 12 such that they are disposed adjacent the region of the rear extension piece 30 that is shaped to accommodate the user's nose.
  • At least the first cubic-type lens 200 may be positioned adjacent the user's nose when the headset is worn, and by orienting the first lens 200 at such an acute angle to the horizontal axis H, substantial relative movement of the lens elements 201, 202 can be accommodated without impinging on the user's nose or the part of the housing 12 that is shaped to accommodate the user's nose. It will be understood that since the lenses are concealed within the housing 12 of the headset 10, aesthetic considerations arising from the orientation of the first lens 200 at such an acute angle do not arise.
  • the first adjustable lens 200 is capable of a range of spherical powers of about -8 to +6 dioptres, but in some embodiments a wider or narrower range may be used, e.g. -15 to +10 dioptres or -6 to +4 dioptres.
  • the first adjustable lens 200 may be capable of a range of spherical powers of ±10 dioptres or up to ±15 dioptres.
  • any suitable mechanism may be provided for adjusting the relative disposition of the lens elements 201, 202 of the first adjustable lens 200.
  • each of the lens elements 201, 202 is fabricated with an internally threaded protrusion 203, 204 having mutually opposite screw threads.
  • the protrusions 203, 204 are mounted on a spindle 205 having spaced thread portions 207, 208, which also have mutually opposite screw threads configured to engage with the respective protrusions 203, 204.
  • Rotation of the spindle 205 on its axis therefore shifts the lens elements 201, 202 relative to one another in equal and opposite directions on the x-axis.
  • a mechanism of this kind is disclosed by US2008/0030678 Al, the contents of which are incorporated herein by reference.
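Because the two thread portions are of opposite hand, one turn of the spindle moves the two lens elements by one thread pitch each in opposite directions. A trivial sketch of that relationship, with the pitch value and names assumed for illustration:

    def element_displacements(spindle_turns: float, pitch_mm: float = 0.5):
        """Relative x-axis displacement of the two lens elements produced by
        rotating the spindle: equal magnitudes, opposite directions."""
        d = spindle_turns * pitch_mm
        return +d, -d   # one element moves +d, the other -d


    print(element_displacements(3))  # (1.5, -1.5): 3.0 mm of relative travel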
  • An actuator 506 as a prime mover is fitted at one end of the spindle 205 for rotating the spindle 205 under control of the control unit 502.
  • a corresponding actuator 505 is associated with the first adjustable lens 200 of the left-hand lens group 100.
  • the second adjustable lens 300 of each lens group 100 is mounted between the first adjustable lens 200 and the primary lens 70.
  • the second adjustable lens 300 is mounted for rotation about the optical axis z as shown in FIGS. 14A-C.
  • the lens elements 301, 302 of the second adjustable lens 300 are also mounted for sliding movement relative to one another in a direction parallel to the y-axis of the second lens 300, as illustrated in FIGS. 15A-C. From formula (I) above, it will be understood that shifting the lens elements 301, 302 of the second lens 300 relative to one another along the y-axis causes adjustment of the mainly cylinder optical power of the second lens 300.
  • Rotation of the second lens 300 about the z-axis allows adjustment of the axis of the lens 300.
  • the rotation and adjustment of the second lens 300 by relative movement of the lens elements 301, 302 parallel to the y-axis allows for mainly cylinder and axis correction of the user's vision.
  • the second adjustable lens 300 is capable of a range of cylinder powers as described above.
  • the second adjustable lens 300 is capable of rotation about the optical axis of 90° to provide the full range of axis prescriptions for different users.
  • the lens elements 301, 302 of the second adjustable lens 300 are incapable of shifting relative to one another parallel to the x-axis of the second lens 300.
  • some relative sliding of the lens elements 301, 302 of the second adjustable lens 300 may be permitted on the x-axis under the control of a suitable further actuation mechanism for allowing independent correction of the sphere components of the user's right and left eyes.
  • the second adjustable lens 300 may have a variable sphere in the range 0-2 dioptres or 0-1 dioptres in addition to its variable cylinder and axis components.
  • any suitable mechanism, indicated schematically by item 310 in FIG. 9B, may be provided for relative movement of the lens elements 301, 302.
  • each of the lens elements 301, 302 is fabricated with an internally threaded protrusion 303, 304 as shown in FIG. 12B and a spindle 305 equipped with two longitudinally spaced thread portions 307, 308 is engaged with the threaded protrusions 303, 304.
  • the protrusions 303, 304 and threaded portions 307, 308 are formed with opposite screw threads, so that rotation of the spindle 305 causes relative movement of the lens elements 301, 302 parallel to the y-axis, transverse to the z-axis.
  • An actuator 508 is mounted to one end of the spindle 305 for causing rotation of the spindle 305 under the control of the control unit 502.
  • the mirror image left-hand second adjustable lens 300 is similarly provided with a corresponding actuator 507 under control of the corresponding control unit 501.
  • any suitable mechanism may be provided for controlling rotation of the second adjustable lens 300 about the optical axis z.
  • one of the lens elements 301, 302 is fabricated with a gear profile 315 comprising a series of teeth 316, and the second lens 300 is mounted within the housing 12 such that the gear profile 315 engages a small pinion 320 as shown in FIGS. 11 and 12B.
  • the teeth 316 of the gear profile 315 are arranged along a curvilinear arc, and the pinion 320 is drivably connected to an actuator 510 for causing rotation of the second lens 300 about the z-axis under the control of the control unit 502.
  • a corresponding actuator 509 is provided for rotation of the left-hand second adjustable lens 300 under control of the control unit 501.
  • the first and second adjustable lenses 200, 300 of each lens group 100 are capable of modifying the user's vision by adjusting sphere (first adjustable lens 200) and cylinder and axis (second adjustable lens 300).
  • the two lens groups 100 may be moved closer together or further apart for modifying the user's pupillary distance.
  • Control of the actuators 503-510 for the left- and right-hand lens groups 100 occurs under the control of the control units 501, 502.
  • the control units 501, 502 are connected to a processor 520 mounted on a small PCB that is fixedly secured within the housing 12.
  • the processor 520 is also connected to a memory unit 521 and to the dock connector 55 on the insert 50 for communication with the mobile device 1, as well as to the trackpad 24 and other user-operable controls on the headset 10, the proximity sensor 25, eye tracker 26 and motion tracking sensor 27.
  • the first adjustable lens 200 of each lens group 100 may be adjusted by the actuators 505, 506 under the control of the control units 501, 502 to provide the nearest equivalent sphere (NES) for correcting the user's distance vision.
  • the second adjustable lens 300 of each lens group 100 may be adjusted using the actuators 507, 508, 509, 510 for correcting for cylinder and axis.
  • the pupillary distance between the lens groups 100 may be adjusted using the actuators 503, 504.
  • the first adjustable lenses 200 of the two lens groups 100 may be adjusted together to afford the same degree of sphere correction in both eyes, and one or both of the second adjustable lenses 300 may be capable of additional independent adjustment for sphere by providing for the lens elements 301, 302 to slide relative to each other parallel to the x-axis as well as the y-axis as described above.
  • the first adjustable lenses 200 may be operated, for example, to provide an average vision correction for sphere between the two eyes, and small final adjustments for differences in sphere as between the two eyes may be made using the second adjustable lenses 300.
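A small sketch of how a binocular prescription could be split into a shared sphere setting applied by the linked first adjustable lenses plus per-eye residuals taken up by the second adjustable lenses, as the preceding paragraph describes; the function is illustrative only:

    def split_sphere(right_sphere: float, left_sphere: float):
        """Split two monocular sphere corrections into a common value for the
        linked first lenses and small per-eye residuals for the second lenses."""
        common = (right_sphere + left_sphere) / 2.0
        return common, right_sphere - common, left_sphere - common


    # Example: R -3.25 D, L -2.75 D -> first lenses set to -3.00 D,
    # second lenses trimmed by -0.25 D (right) and +0.25 D (left)
    print(split_sphere(-3.25, -2.75))  # (-3.0, -0.25, 0.25)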
  • prescription set-up software in accordance with the present invention for adjusting the first and second adjustable lenses 200, 300 may be stored in a storage device 4 or memory unit 5 of the mobile device 1, the software being executable by a processor 6 of the mobile device 1 for transmitting instructions to the control units 501, 502 for operating the actuators 503-510 when the mobile device 1 is connected via its dock connector 3 to the dock connector 55 of the headset 10.
  • the prescription set-up software includes executable machine code for displaying a prompt for the user on the display 2 of the mobile device 1 to input his or her ophthalmic prescription, including at least the user's sphere correction for distance and optionally also including fitting data for a lens to the wearer, e.g. pupillary distance, cylinder, axis and lens vertex distance, age and/or add power.
  • the software also comprises code for allowing the user to store his or her prescription data in the storage device 4 of the mobile device 1 to avoid the user having to re-enter the data when he or she uses the headset 10 again.
  • the prescription data is stored on the mobile device 1 in association with a user account in a manner well known to those skilled in the art, so that different users may store their own prescription data on the mobile device 1.
  • FIG. 17 is a flowchart for setting up the headset 10 with a user's prescription in
  • Upon docking the mobile device 1 with the dock connector 55 on the headset 10, the electronic components of the headset 10 receive power 1001 from the mobile device 1, causing the headset to power up.
  • Upon executing the prescription set-up software in the processor 6 of the mobile device 1 (e.g. by opening an app on the mobile device), the electronic components of the headset 10 communicate 1002 with the mobile device 1 and the user's personal ophthalmic prescription data is retrieved 1004 from the storage device 4 and loaded into the memory 5. If the user's prescription data cannot be found in the storage device 4, the user is prompted to input them.
  • the set-up software is executed by the processor 6 1005 to transmit instructions to the on-board processor 520 of the headset 10 to instruct the control units 501, 502 to operate the actuators 503- 510 to adjust the positions of the lens groups 100 on the horizontal axis H and the settings of the first and second adjustable lenses 200, 300 of each lens group 100 in accordance with the user's prescription.
  • the lens groups 100 will usually be situated further away from the user's eyes E than a pair of glasses would be and, in consequence of this, some offset to the user's prescription is needed to adjust the first and second lenses 200, 300 correctly for the user.
  • the set-up software of the invention may therefore include code for prompting the user to input a measurement of the position of his or her eyes when entering prescription data and may calculate the necessary offset for the first and second lenses 200, 300 based on the user's prescription data and the position of his or her eyes.
  • the set-up software may include a short calibration routine which is executed when a new user uses the headset 10 for the first time to fine tune the adjustment of the first and second lenses 200, 300 for the spacing of the user's eyes from the lens groups 100.
  • the calibration data may then be stored in the storage device 4 with the user's prescription data as a profile, so the user does not have to re-calibrate the headset 10 each time he or she uses it.
  • the software may run on a separate computer (not shown) that is connected to the headset wirelessly or via a cable (e.g., USB).
  • the set-up software may prompt the user for login details 1002A and, after receiving correct login credentials, may operate a processor of the separate computer to retrieve the user's ophthalmic prescription data from a storage device that is integral with or connected to the separate computer and calculate the necessary adjustments to the lens groups 100 based on the user's prescription data.
  • the lens groups 100 may be set-up under software control in accordance with the user's vision.
  • the user does not need to wear glasses for correcting refractive errors when using the headset of the invention, and the headset does not need to include sufficient space to accommodate the user's glasses.
  • This may bring numerous advantages in terms of the quality of the image viewed by the user, but additionally may allow the headset to be designed with a smaller form factor as compared with previous virtual and augmented reality headsets, where a certain minimum distance from the user's face was required to accommodate glasses for users who needed them for vision correction.
  • because the headset does not protrude as far forwardly from the user's face, the turning force applied by the headset to the user's face owing to the weight of the headset may be reduced, making the headset more comfortable to wear.
  • the headset 10 of the present embodiment includes an eye tracker 26.
  • any suitable eye tracker 26 that is capable of measuring the user's point of gaze may be used, but in the present embodiment, a video-based eye tracker is used in which infrared/near-infrared non-collimated light is used to create corneal reflections, and a vector between the pupil centre and the corneal reflections is used to compute the point of regard on a surface or the gaze direction.
  • the eye tracker 26 may be mounted within the housing 12 at any convenient location with a clear line of sight to one of the user's eyes. In the present embodiment, a single eye tracker 26 is used for measuring the line of sight of the user's left eye. However, it will be understood that in other embodiments the right eye may be measured, or two eye trackers may be employed, one associated with each eye.
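For context, pupil-glint video eye trackers of the kind described typically map the vector from the corneal reflection to the pupil centre onto display coordinates through a per-user calibration. The linear least-squares mapping below is a generic illustration of that idea, not the method of the tracker actually used in the headset:

    import numpy as np


    def fit_gaze_mapping(pupil_glint_vectors: np.ndarray,
                         screen_points: np.ndarray) -> np.ndarray:
        """Fit a linear mapping (with offset) from pupil-glint vectors recorded
        during calibration to known on-screen target positions."""
        ones = np.ones((len(pupil_glint_vectors), 1))
        features = np.hstack([pupil_glint_vectors, ones])         # rows of [vx, vy, 1]
        coeffs, *_ = np.linalg.lstsq(features, screen_points, rcond=None)
        return coeffs                                             # shape (3, 2)


    def point_of_gaze(coeffs: np.ndarray, pupil_glint_vector) -> np.ndarray:
        """Estimate the point of regard on the display for a new measurement."""
        return np.append(pupil_glint_vector, 1.0) @ coeffs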
  • Virtual reality software of the kind known in the art for operating a mobile virtual reality headset that is fitted with a mobile device such, for example, as mobile device 1 in the present embodiment typically consists of executable code and data.
  • a mobile device generally comprises a processor, memory, storage and a display.
  • the software is stored in the storage, and a temporary copy of at least the executable code is retrieved and loaded into the memory when the software is run.
  • the data comprises image data encoding a 3-D stereographic image for display within the headset, and the code is executable by the processor for retrieving the image data from the storage and using the image data to generate a display signal for displaying the image on the display within the headset.
  • the image data may comprise information for displaying a static stereoscopic image on the display, but in some embodiments, may comprise information for displaying a moving stereoscopic image.
  • the image data may comprise a movie file or a 3-D game.
  • the image displayed on the display is updated dynamically in response to head movements of the user which are measured using the motion tracking sensor 27 (e.g. gyroscopes, accelerometers, structured light systems) in the headset or the mobile device to simulate the user looking around a 3-D scene.
  • the software of the present embodiment further includes virtual distance data as part of the image data.
  • the virtual distance data may be encoded as parallax data as between the two images of the stereoscopic image.
  • the virtual distance data comprises virtual distance values for virtual distances from the user's eyes E of multiple points or regions within the image to be displayed.
  • the data also includes accommodation data comprising accommodation and vergence values corresponding to the virtual distances.
  • the display signal generated by the processor 6 causes the display 2 to show two side-by- side 2-D images, which when imaged together through the respective lens groups 100 in the headset, form a virtual 3-D stereoscopic image in the manner known to those skilled in the art.
  • the user views the stereoscopic image in 3-D through the lens groups 100 and, upon moving his or her head to look around a virtual scene included in the image, the image is updated to change with the user's movements.
  • the present invention seeks to overcome this problem by adjusting the position and optical power of the lens groups 100 in response to the line of sight of the user, which is measured using the eye tracker 26, although as mentioned above, the present invention with the first and second adjustable lenses 200, 300 may be implemented for correcting for the user's prescription in a virtual or augmented reality headset, without these additional features.
  • the processor 520 in the headset 10 is arranged to receive a signal from the eye tracker 26 and transmit data encoding the user's line of sight to the mobile device 1 through the dock connector 55. As shown in the flowchart of FIG. 18, this step is indicated by reference numeral 2001.
  • the executable code includes an algorithm of the kind known in the art for calculating the user's point of gaze on the displayed image from the line of sight data 2002.
  • the virtual distance value for the part of the image corresponding to the user's point of gaze is then looked up in the virtual distance data stored in the memory 5. Using the retrieved virtual distance value, the corresponding accommodation and vergence values are obtained from the accommodation data.
  • the obtained accommodation and vergence values are used to transmit instructions to the processor 520 of the headset 10 for instructing the control units 501, 502 to send control signals to the actuators 503, 504, 505, 506 for controlling the positions of the right- and left-hand lens groups 100 on the horizontal axis H and the spherical power of the first adjustable lenses 200 according to the obtained accommodation and vergence values (see the sketch after this list).
  • the actuators 503, 504 are operated by the control units 501, 502 to adjust the pupillary distance between respective lens groups 100 to match the vergence value corresponding to the virtual distance value for the user's point of gaze in the displayed image.
  • the actuators 505, 506 are operated by the control units 501, 502 to adjust the first adjustable lenses 200 to match the spherical power of the first lenses 200 to the accommodation value corresponding to the virtual distance value for the user's point of gaze on the image.
  • the lens groups 100 of the headset of the present embodiment are adjusted to oblige the user to accommodate in response to his or her point of gaze.
  • when the user looks at a part of the image with a shorter virtual distance, the lens groups 100 are moved closer together using the actuators 503, 504, causing the user's eyes to verge.
  • when the user looks at a more distant part of the image, the lens groups 100 are moved apart.
  • the first adjustable lenses 200 are adjusted using the actuators 505, 506 to shift the lens elements 201, 202 of each first lens 200 in a direction parallel to the x-axis of the lens 200 for reducing the sphere of the lens 200, causing the user to accommodate more strongly to view the image in focus.
  • the actuators 505, 506 are operated to increase the optical power of the first adjustable lenses 200 in accordance with the user's distance vision.
  • the virtual reality software of the present invention may be integrated with the prescription setup software described above, so that dynamic adjustments to the first and second adjustable lenses 200, 300 are made relative to the user's normal prescription.
  • the lens groups 100 are adjusted to match the user's prescription for pupillary distance, sphere, cylinder and axis, etc.
  • the lens groups 100 are adjusted under control of the software to adjust the distance between the lens groups 100 and the degree of accommodation provided by the first adjustable lenses 200 (and optionally second adjustable lenses 300 as described above) to force the user's eyes to accommodate and verge in the same way as they would naturally when viewing a real 3-D scene.
  • the virtual reality headset 10 of the present embodiment is capable of providing a wide range of modifications to the user's vision.
  • the first adjustable lenses 200 may be selectively adjusted to correct the user's spherical prescription as indicated in box (a) of FIG. 16.
  • the second adjustable lenses 300 may be adjusted as well, to correct the user's cylinder and axis as shown in box (b). If the user requires no distance correction, only astigmatism and axis may be corrected as shown in box (e).
  • the lens groups 100 of the headset 10 can be adjusted to modify dynamically the user's vision according to the virtual distance of the part of the virtual 3-D stereoscopic image that the user is looking at within the headset.
  • the pupillary distance between the lens groups 100 can be adjusted by moving the lens groups 100 closer together as shown in box (j) of FIG. 16.
  • the pupillary distance between the lens groups 100 may also be adjusted in the same manner to fit the user.
  • the user's eyes can be forced to accommodate when viewing closer virtual objects by reducing the optical power of the first adjustable lenses 200 as shown in box (h).
  • a combination of pupillary distance and accommodation adjustments is indicated in box (i).
  • the first and second adjustable lenses 200, 300 may be adjusted accordingly.
  • the first and second adjustable lenses 200, 300 can be controlled for correcting the user's cylinder and axis and for making an adjustment corresponding to the virtual distance of the part of the image at which the user is looking.
  • the pupillary distance and cylinder/axis are adjusted.
  • the first and second adjustable lenses 200, 300 may be adjusted to correct for cylinder and axis, and the net sphere provided by the first adjustable lenses 200 may be calculated from the user's distance correction and the adjustment corresponding to the virtual distance of the part of the image at which the user is looking.
  • the pupillary distance may also be reduced as shown in box (d).
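By way of illustration only, the following sketch (in Python, with hypothetical names, units and sign conventions that are assumptions rather than part of the disclosure) shows one way in which the looked-up accommodation and vergence values referred to in the list above might be converted into per-eye targets for the pupillary distance actuators 503, 504 and the sphere actuators 505, 506:

from dataclasses import dataclass

@dataclass
class LensGroupTarget:
    pd_shift_mm: float       # lateral shift of this lens group along the horizontal axis H
    sphere_dioptres: float   # target spherical power of the first adjustable lens

def gaze_targets(vergence_pd_mm, accommodation_d, baseline_pd_mm, baseline_sphere_d):
    """Convert looked-up vergence and accommodation values into per-eye targets.

    vergence_pd_mm  -- lens-centre separation appropriate to the point of gaze
    accommodation_d -- accommodation demand (dioptres) at the point of gaze
    The baseline values are the user's own pupillary distance and distance sphere.
    """
    half_shift = (vergence_pd_mm - baseline_pd_mm) / 2.0   # each group moves half of the change
    sphere = baseline_sphere_d - accommodation_d           # reduce sphere to force accommodation
    # Opposite signs because the two groups move mirror-imaged along axis H.
    right = LensGroupTarget(pd_shift_mm=+half_shift, sphere_dioptres=sphere)
    left = LensGroupTarget(pd_shift_mm=-half_shift, sphere_dioptres=sphere)
    return right, left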

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

A virtual or augmented reality headset (10) comprising two lens groups (100) for imaging two side-by-side 2-dimensional images displayed on a display (1) within the headset to form a virtual stereographic 3-dimensional image; each lens group comprising a primary lens (70), a first adjustable lens (200) for correcting a spherical refractive error in a user's distance vision and a second adjustable lens (300) for correcting astigmatism, the primary lens and first and second adjustable lenses within each lens group being in mutual optical alignment on a respective optical axis (z). Suitably, the first adjustable lens comprises a cubic surface-type adjustable lens (e.g. an Alvarez lens) comprising two superposed lens elements having mutually cooperating cubic or higher order surfaces that are slidable relative to one another along an x-axis of the first adjustable lens, while the second adjustable lens is rotatable about the z-axis and comprises a cubic surface-type lens in which two similarly superposed lens elements are slidable relative to one another along a y-axis of the second adjustable lens.

Description

IMPROVEMENTS IN OR RELATING TO VIRTUAL AND AUGMENTED REALITY
HEADSETS
[0001] The present invention relates to virtual and augmented reality headsets and has particular reference to virtual and augmented reality headsets of the kind in which two 2-dimensional images are displayed side-by-side within the headset to form a stereoscopic 3-dimensional image when viewed by a user through a pair of primary objective lenses that serve to magnify the images. The images may be still images (e.g., photographs) or moving images (e.g., videos).
[0002] A virtual reality (VR) headset provides a virtual reality experience for the wearer. VR headsets are widely used with computer games, but they are also used in other applications, including simulators, trainers and presentations. An augmented reality (AR) headset offers a live direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data. The present invention is concerned with virtual and augmented reality headsets in which a 3-D stereoscopic image is displayed within the headset, either alone or superposed on a view of the real-world environment.
[0003] Known virtual and augmented reality headsets generally comprise a housing that is worn on the face of the user over the eyes. A stereoscopic 3-dimensional image is displayed within the headset comprising two side-by-side 2-D images which, when viewed by the user, form a virtual 3-D image. A pair of primary lenses are provided within the headset for imaging the stereoscopic image. Typically, an elasticated or adjustable headband is provided for attaching the housing to the user's head for hands-free use.
[0004] Some virtual and augmented reality headsets include one or more integrated displays for displaying the two images side-by-side that make up the 3-D image. Other headsets, known as mobile virtual reality headsets, are designed to receive a mobile device comprising its own display such that the mobile device's display faces towards the user's eyes when fitted. The mobile device, which may be a mobile phone, can be easily removed from the housing after use. [0005] Typically, known virtual and augmented reality headsets provide for a limited degree of focusing of the image displayed on the integrated display(s) or display screen of the mobile device as the case may be. In mobile virtual reality headsets, for example, the primary lenses are generally fixedly secured within the housing, but the mobile device is mounted on a part of the housing that is arranged to move forwards and backwards relative to the user, parallel to the optical axis of the primary lenses and perpendicular to the plane of the screen. However, the degree of focusing that this provides is rather limited. Further, it is not possible to adjust the focus of each eye separately. No provision is made for adjusting for the user's cylinder or axis prescription.
[0006] Generally, therefore, the housing of a virtual or augmented reality headset includes
sufficient room immediately in front of the user's eyes to accommodate the user's spectacles if he or she requires them for vision correction. This is disadvantageous, because it increases the size of the housing and imposes limits on the extent to which the display can be placed close to the user's face. Particularly in the case of a mobile VR headset, the removable mobile device may be fairly heavy and positioning it at a distance from the user's face gives rise to a considerable turning force that is applied to the user's face, which may be uncomfortable, notwithstanding the usual provision of foam padding or the like around a rear end of the headset where it contacts the user's face.
[0007] It would be desirable to produce an improved virtual or augmented reality headset with a smaller form factor, particularly such a headset in which the display, for example the display of a mobile device, is positioned closer to the user's eyes.
[0008] Another problem with virtual reality and augmented reality headsets is that even though a
3-dimensional image is displayed, the actual distance of the display from the user's eyes is constant. In real life, when a person views an object that is nearer than infinity, his or her eyes accommodate to bring the object into focus and verge, reducing the pupillary distance between the eyes. This combination of physiological responses, known as the
"accommodation reflex", sends important sensory clues to the brain to help comprehend the image and discern the distance to the particular object being viewed. In the absence of these physiological responses, some users of virtual or augmented reality headsets experience some feelings of nausea or disorientation, because they perceive a 3- dimensional image with various objects at different virtual distances, but the accommodation of their eyes remains unchanging.
[0009] It would therefore also be desirable to ameliorate such feelings of nausea and/or
disorientation when using a virtual or augmented reality headset.
[0010] In accordance with a first aspect of the present invention therefore there is provided a virtual or augmented reality headset comprising two lens groups for imaging two side-by-side 2-dimensional images displayed on a display within the headset to form a virtual stereographic 3-dimensional image, each lens group comprising a primary lens and at least one selectively adjustable lens having at least one variable optical characteristic. Suitably, each lens group may comprise a primary lens, a first adjustable lens for correcting a spherical refractive error in a user's distance vision and a second adjustable lens for correcting astigmatism and axis, the primary lens and first and second adjustable lenses within each lens group being in mutual optical alignment on a respective optical axis. The first adjustable lens may therefore have a selectively variable focusing power (i.e. variable sphere). The second adjustable lens may have a selectively variable cylinder power. The second adjustable lens may also be selectively rotatable around its optical axis to adjust the axis of the astigmatism of the second adjustable lens. In some embodiments, the second adjustable lens of at least one lens group may also have selectively variable focusing power to supplement or modify the focusing power of the first adjustable lens.
[0011] In some embodiments, the headset may be a virtual reality (VR) headset in which only the 3-dimensional image displayed on the display is visible. The virtual reality headset may incorporate one or more integrated displays. Alternatively, the virtual reality headset may be a mobile VR headset that is arranged to receive a mobile device having its own display, such that the mobile device is easily removable from the headset and, when fitted, is arranged such that its display faces the user's eyes. [0012] In another embodiment, the headset may be an augmented reality (AR) headset in which the 3-dimensional image is superimposed on an image of the real world viewed through the headset.
[0013] In its broadest sense, a variable optical characteristic of the at least one adjustable lens implies variable optical power, including focusing power (sphere) or cylinder power (astigmatism), variable axis or a variable position of the lens in a direction transverse the optical axis. In accordance with the present invention, the at least one adjustable lens may be arranged for modifying the vision of a user of the headset.
[0014] Preferably, the at least one adjustable lens may be configured for correcting the user's vision. For instance, the adjustable lens may have variable sphere, cylinder and/or axis and may be selectively adjusted for correcting the user's prescription for sphere and/or astigmatism. However, in accordance with some embodiments of the invention, the at least one adjustable lens may alternatively or additionally be used for modifying the user's vision in accordance with an image displayed within the headset. "Modifying the user's vision" as used herein therefore not only implies correcting the user's vision in accordance, for example, with the user's ophthalmic prescription for distance, pupillary distance, astigmatism and the like, but also manipulating the user's vision so that the user's eyes are forced to accommodate and/or verge as described in more detail below in response to how the user views an image on the display.
[0015] Advantageously, each lens group comprises a first adjustable lens for correcting a
spherical component of the user's distance vision and a second adjustable lens for correcting astigmatism (i.e. a cylinder component and axis component). The optical properties of the first adjustable lens may include one or more minor components in addition to sphere. The optical properties of the second adjustable lens may include one or more minor components in addition to cylinder. It will be understood, however, that in some embodiments of the invention, the first adjustable lens is for correcting a mainly spherical component of the user's distance vision. The second adjustable lens is for correcting mainly astigmatism. [0016] However, in a particular alternative embodiment of the invention, the second adjustable lens within each lens group may also afford variable sphere. Preferably, the first adjustable lenses are independently adjustable, but in some embodiments, they may be linked for adjustment together, preferably simultaneous equal adjustment. A second adjustable lens, which provides for a small amount of sphere adjustment, may therefore allow for the correction of small differences in sphere between the user's two eyes.
[0017] Suitably, the distance between the lens groups and the display within the headset may be fixed. In some embodiments, the positions of the lens groups within the headset may be fixed, but in other embodiments the lens groups may be movable sideways in a direction parallel to an interpupillary axis between the user's eyes. Thus, the lens groups may be movable sideways in a direction parallel to the plane of the display.
[0018] The adjustable lenses included in the headset of the present invention may be any suitable adjustable lens known to those skilled in the art. Suitably, each lens group comprises at least one cubic surface-type adjustable lens comprising two superposed lens elements having mutually cooperating cubic or higher order surfaces that are slidable relative to one other in a plane perpendicular to the optical axis of the lens for varying the optical power of the lens. For instance, each lens group may comprise at least one Alvarez lens as disclosed, for instance, in US 3,305,294, the contents of which are incorporated herein by reference. Other suitable cubic-type adjustable lenses with cubic and higher- order surfaces are disclosed by US 3,583,790, US 7,338,159, US 7,717,552,
US 5,644,374 and WO 2013/030603, the contents of all of which are also incorporated herein by reference.
[0019] Thus, the cubic surfaces of the lens elements may be shaped to provide variable sphere correction when they are displaced relative to one another along an x-axis of the adjustable lens and/or variable cylinder correction when they are displaced relative to one another along a y-axis of the adjustable lens, the x- and y-axes forming a 3-dimensional Cartesian coordinate system with a z-axis that is parallel to the optical axis of the adjustable lens. Suitably, each lens element may have a first cubic or higher-order surface of the kind described above and a second opposite surface that is a regular surface of revolution. In some embodiments, the second surface of each lens element may be planar.
[0020] Suitably, each lens group comprises a first cubic surface-type lens in which the lens elements are slidable relative to one another in a direction parallel to the x-axis for correcting the user's (mainly) spherical distance vision. Conveniently, in some embodiments, the x-axis of the first cubic surface-type lens may be oriented at an acute angle, suitably about 60-80°, relative to an interpupillary axis that extends between the lens groups. Thus, the x-axis of the first cubic surface-type lens may be arranged generally parallel to an adjacent side of a user's nose. It will be understood that the arrangements of the first cubic surface-type lenses of the two lens groups are mirror images of one another, so that while the x-axis of the first lens of one of the lens groups is oriented is an angle of 45-80° relative to the interpupillary axis, the x-axis of the first lens of the other lens group 100 is oriented at an angle of -45° to -80° relative to the interpupillary axis. This allows the lens groups to be positioned close to the user's face while still maintaining a large degree of relative movement between the two lens elements of the first cubic surface-type lens. It will be appreciated that if the x-axis were oriented substantially parallel to the interpupillary axis, the travel of the lens elements would be limited by the user's nose in one direction, by disturbance to the wearer's field of view in the other direction, or the first adjustable lens would have to be positioned forwardly of the nose.
[0021] Alternatively, in some embodiments, the first adjustable lens of each lens group may comprise a variable focusing power liquid lens comprising a distensible membrane having an optical surface, wherein the focusing power of the liquid lens is related to the curvature of the membrane. Suitable variable focusing power liquid lenses are disclosed by US 1269422, WO 99/061940 Al, US 2576581, US 3161718, US 3614215, WO2014/118546 Al, WO2014/199180 Al and WO2017/055787 A2, the contents of which are incorporated herein by reference.
[0022] In some embodiments, the optical power (focusing power) of the first adjustable lens may be variable in the range up to about ±15 dioptres, preferably -15 to +10 dioptres. Suitably, the focusing power of the first adjustable lens may be in the range -8 to +6 dioptres or -6 to +4 dioptres.
[0023] Advantageously, in some embodiments, each lens group comprises a second cubic
surface-type lens, which is mounted for rotation within the headset about an axis parallel to the z-axis of the lens, the lens elements of the second lens being slidable relative to one another in a direction parallel to the y-axis of the second cubic surface-type lens for correcting astigmatism.
[0024] In some embodiments, the cylinder power of the second cubic surface-type lens may be variable in the range up to about +8 to -8 dioptres. Suitably, the cylinder power of the second cubic surface-type lens may be in the range +4 to -4 dioptres or +2 to -2 dioptres, where a sign change in cylinder is equivalent to the axis changing by 90°.
[0025] Preferably, each lens group may comprise a rotation-controlling actuator associated with the second cubic surface-type lens for adjusting the axis of the second adjustable lens. Each second cubic surface-type lens may be arranged for rotation about the optical axis by up to 90°.
[0026] Each lens group may comprise a respective sliding-controlling actuator associated with the or each cubic surface-type lens for controlling the relative disposition of the lens elements for adjusting the optical power of the lens.
[0027] In a particular aspect of the invention, each of the lens groups may be mounted within the headset for translational movement in a direction parallel to the interpupillary axis that extends between the two lens groups. Suitably, the primary lens and at least one adjustable lens within each lens group are mounted for movement together as a group. Thus, the lens groups may be mounted for translational movement in a direction parallel to the plane of the display. A respective pupillary distance-controlling actuator may be associated with each of the lens groups for adjusting the interpupillary distance between the lens groups. In an alternative embodiment, the positions of the primary lenses may be fixed, and only the one or more adjustable lenses may be movable as described. [0028] Suitably the interpupillary distance between the lens groups may be adjustable in the range 50-76 mm. In some embodiments, the interpupillary distance between the lens groups may be adjustable in the range 52-72 mm or 54-70 mm.
[0029] In a particular aspect of the invention, the virtual or augmented reality headset further comprises at least one eye-tracker.
[0030] As described in more detail below, using an output signal from the eye-tracker, the user's line of sight may be calculated, from which the user's point of gaze on an image displayed within the headset may be determined. Virtual distance data associated with image data used for creating the image within the headset may comprise virtual distance values for multiple points or regions within the image (encoded as parallax in the stereographic images), from which an object vertex distance (z-distance) may be determined. The object vertex distance may be used for adjusting the at least one adjustable lens to modify the user's vision in a manner related to the object vertex distance for the point of gaze on the image. For instance, the optical power of the first adjustable lens may be adjusted in inverse relation to the object vertex distance, so that the image is effectively viewed from a shorter virtual object distance. As a result, the optical power of the first adjustable lens is greater when the user is looking at a distant point within the image having a greater object vertex distance than when he or she is looking at a nearer point within the image having a lower object vertex distance. In this way, the user's eyes are forced to accommodate in response to the virtual distance of the part of the image at which the user is looking.
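For orientation, the inverse relation described in the preceding paragraph can be expressed numerically. The short sketch below (Python, illustrative only; treating the accommodation demand as the reciprocal of the object vertex distance in metres is an assumption rather than a statement from the description):

def accommodation_demand(object_vertex_distance_m):
    """Approximate accommodation demand, in dioptres, for a virtual object.

    A point at optical infinity demands 0 D; a point at 0.5 m demands 2 D.
    """
    if object_vertex_distance_m <= 0:
        raise ValueError("object vertex distance must be positive")
    return 1.0 / object_vertex_distance_m

# Example: with a distance correction of -2.00 D, a point of gaze at a virtual
# 0.5 m would set the first adjustable lens to roughly -2.00 - 2.00 = -4.00 D.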
[0031] In some embodiments as described below, the at least one adjustable lens within each group may be arranged to correct the user's vision in accordance with his or her prescription. The adjustment of the focusing power of the first adjustable lens in relation to the virtual distance of the part of the image at which the user is looking may therefore comprise an adjustment from the user's prescription. In other words, the at least one adjustable lens within each lens group may be set up to correct the user's distance vision, and the optical power of the first lens may be adjusted to a negative power relative to the user's distance vision when the user is looking at an object closer than infinity. [0032] In accordance with a second aspect of the invention therefore there is provided software or firmware for controlling a virtual or augmented reality headset in accordance with a first aspect of the invention, which software or firmware comprises machine code that is executable by a computer comprising a processor and memory for controlling the computer to receive in the memory ophthalmic prescription data for a user encoding at least the user's sphere correction for distance and transmitting control signals to the sliding-controlling actuators for adjusting the relative disposition of the lens elements of each of the first cubic surface-type lenses to adjust the focusing power of the first adjustable lenses in accordance with the user's prescription.
[0033] Preferably, each of the first cubic surface-type lenses may be adjusted independently of the other, but as mentioned above, in some embodiments, the adjustment of the first adjustable lenses may be linked, so they are adjusted together.
[0034] The computer may be a separate computer that is connected to the headset of the
invention wirelessly or via a suitable cable. Alternatively, the computer may be a mobile device such, for example, as a mobile phone, that is received in the headset for using the mobile device's own display screen for displaying the image within the headset, as described above.
[0035] In addition to the processor and memory, the computer may also comprise a storage
device for permanent storage of data as known in the art. Suitably, the software or firmware may be executable by the computer to obtain the user's prescription data from the storage device. Alternatively, the software or firmware may be executable by the computer to prompt a user to input their prescription data.
[0036] Software in accordance with the invention may be stored in the computer's storage
device and copied into the memory when the software is run. Alternatively, in a virtual or augmented reality headset of the invention comprising its own on-board computer, the firmware of the invention may be stored in a suitable non- volatile memory within the headset.
[0037] It will be appreciated that in a headset according to the present invention, the lens groups may be positioned further forwardly from the user's eyes than would be the case in a conventional pair of glasses for correcting the user's vision. Accordingly, the adjustment of the first cubic surface-type lenses in accordance with the user's prescription may include an additional offset to account for the user's eye position or vertex distance when wearing the headset. The size of the offset may be input with the prescription data or stored from a one-time calibration when the user first wears and adjusts the headset. In some embodiments, the vertex distance may be obtained automatically from an eye tracker device of the kind mentioned above that may be mounted within the headset.
[0038] In some embodiments, the user's ophthalmic prescription data may further comprise one or more of fitting data for a lens to the user such, for example, as interpupillary distance, cylinder, axis and lens vertex distance, age and add power, and the software may be executable by the processor for transmitting control signals to the pupillary distance- controlling actuators associated with the respective lens groups, the sliding-controlling actuators associated with the first and/or second adjustable lenses and/or the rotation- controlling actuators associated with the second adjustable lenses for adjusting the separation of the lens groups, the sphere power of the first lenses and/or the cylinder power and/or axis of the second adjustable lenses for correcting the user's vision in accordance with their prescription.
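The prescription data enumerated in the preceding paragraph can be pictured as a simple per-user record that the set-up software maps onto the adjustable elements. The following sketch is illustrative only; the field names and the default vertex distance are assumptions rather than part of the disclosure:

from dataclasses import dataclass
from typing import Optional

@dataclass
class EyePrescription:
    sphere_d: float          # spherical correction for distance, in dioptres
    cylinder_d: float = 0.0  # cylinder power, in dioptres
    axis_deg: float = 0.0    # cylinder axis, in degrees

@dataclass
class UserProfile:
    right: EyePrescription
    left: EyePrescription
    pupillary_distance_mm: float
    vertex_distance_mm: float = 12.0   # assumed lens-to-eye offset; may be calibrated
    age_years: Optional[int] = None
    add_power_d: float = 0.0

def setup_adjustments(profile):
    """Map a stored profile onto the adjustable elements of the two lens groups."""
    def per_eye(rx):
        return {"first_lens_sphere_d": rx.sphere_d,
                "second_lens_cylinder_d": rx.cylinder_d,
                "second_lens_axis_deg": rx.axis_deg}
    return {"lens_group_separation_mm": profile.pupillary_distance_mm,
            "right": per_eye(profile.right),
            "left": per_eye(profile.left)}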
[0039] Preferably, the software or firmware is executable by the computer for controlling the computer to receive in the memory the user's axis and cylinder corrections for astigmatism and transmitting control signals to the controllers to operate the rotation- controlling actuators and sliding-controlling actuators for the second cubic surface-type lenses to adjust the orientation and optical power of the second adjustable lenses to the user's prescription.
[0040] In a third aspect of the present invention, there is provided virtual or augmented reality software or firmware for operating a computer comprising a processor, memory and storage to control operation of a virtual or augmented reality headset in accordance with the first aspect of the invention, the software or firmware comprising executable code and data; the data comprising image data encoding a stereographic image for display within the headset, virtual distance data comprising virtual distance values for virtual distances from a user of multiple points or regions within the image and accommodation data comprising accommodation and/or vergence values corresponding to the virtual distances;
the executable code being executable by the processor for retrieving the image data from the storage and using the image data to generate a display signal for a display within the headset to display the image;
transmitting the display signal to the display;
receiving line of sight data from an eye-tracker in the headset representing the user's line of sight;
calculating from the received line of sight data the user's point of gaze on the displayed image;
retrieving the virtual distance data from the storage and using the calculated point of gaze and virtual distance data to determine the virtual distance of the user from the user's point of gaze;
retrieving the accommodation data from the storage and looking up the accommodation and/or vergence values for the virtual distance corresponding to the user's point of gaze;
generating and transmitting a control signal to the pupillary distance-controlling actuators for adjusting the distance between the lens groups on the interpupillary axis in accordance with the determined vergence value and/or generating and transmitting control signals to the sliding-controlling actuators of the first cubic surface-type lenses for adjusting the optical power of the first lenses according to the determined accommodation value. [0041] The virtual or augmented reality software or firmware of the third aspect of the invention when executed therefore controls operation of a virtual or augmented reality headset in accordance with the invention such that the user's vision is modified according to the virtual distance from the user to his or her point of gaze on the image displayed within the headset. In this way, as described above, the user's eyes may be forced to accommodate and/or verge when viewing parts of the image that have a shorter virtual distance than infinity.
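The sequence of steps recited above can be read as a simple control loop. The sketch below is illustrative only (Python; the eye-tracker, display, look-up tables and actuator interfaces are hypothetical stand-ins for the components described, not an implementation from the specification):

import time

def run_gaze_driven_adjustment(display, eye_tracker, gaze_model,
                               image_data, virtual_distance_map,
                               accommodation_table, actuators, period_s=0.02):
    """Skeleton of the loop recited above: show the stereographic image, track
    the line of sight and adjust the lens groups for the current point of gaze."""
    display.show(image_data)                                   # generate and transmit the display signal
    while display.is_active():
        line_of_sight = eye_tracker.read()                     # line-of-sight data from the eye tracker
        gaze_point = gaze_model.point_of_gaze(line_of_sight)   # point of gaze on the displayed image
        virtual_distance = virtual_distance_map.lookup(gaze_point)
        accommodation_d, vergence_pd_mm = accommodation_table.lookup(virtual_distance)
        actuators.set_lens_group_separation(vergence_pd_mm)    # pupillary distance-controlling actuators
        actuators.set_first_lens_sphere(accommodation_d)       # sliding-controlling actuators, first lenses
        time.sleep(period_s)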
[0042] It will be appreciated that in part the virtual distance of the user to his or her point of gaze on the image will be dependent upon the user's vertex distance. As mentioned above, the vertex distance may be obtained automatically from the eye tracker. Once measured, the vertex distance may be held in the memory of the computer during use of the software and/or stored in the storage for future use. In determining the virtual distance from the user to the user's point of gaze on the image, an adjustment may be made for the vertex distance.
[0043] In some embodiments, the virtual reality software or firmware may be executed in
conjunction with, or may incorporate, prescription set-up software or firmware according to the second aspect of the present invention. Thus, the set-up software or firmware may be executed when the user puts on the headset to adjust the at least one adjustable lens in accordance with the user's prescription. In particular, the first adjustable lens may be adjusted in accordance with the user's distance prescription. The second adjustable lens, when present, may be adjusted in accordance with the user's cylinder and axis prescription. The distance between the lens groups may be adjusted in accordance with the user's pupillary distance. During use of the headset, the distance between the lens groups may be reduced as described above when the user looks at parts of the stereoscopic 3-D image that appear to be nearer than infinity to cause the user's eyes to verge. The focal power of the first adjustable lens may be reduced as described above when the user looks at parts of the stereoscopic 3-D image that appear to be nearer than infinity to cause the user's eyes to accommodate. Where the user's prescription data includes the user's age and/or presbyopia data, adjustment of the first adjustable lenses as described above may be disabled or limited for parts of the image at virtual distances for which the user's accommodation is insufficient owing to presbyopia.
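Combining the two aspects just described, the instantaneous sphere setting can be thought of as the user's distance prescription less the accommodation demanded by the current point of gaze, clamped where presbyopia data indicate limited accommodation. The sketch below is illustrative only; the age-based amplitude estimate is a common rule of thumb and an assumption, not part of the description:

from typing import Optional

def dynamic_sphere(baseline_sphere_d, accommodation_demand_d,
                   accommodation_amplitude_d: Optional[float] = None):
    """Sphere setting for a first adjustable lens at one instant.

    baseline_sphere_d         -- the user's distance prescription
    accommodation_demand_d    -- demand (dioptres) for the current point of gaze
    accommodation_amplitude_d -- accommodation the user can actually supply, if known
    """
    demand = accommodation_demand_d
    if accommodation_amplitude_d is not None:
        demand = min(demand, accommodation_amplitude_d)   # limit or disable for presbyopia
    return baseline_sphere_d - demand

def estimated_amplitude(age_years):
    """Crude age-based estimate of accommodation amplitude (assumed, Hofstetter-style)."""
    return max(0.0, 15.0 - 0.25 * age_years)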
[0044] In some embodiments, the software or firmware may be executable by a mobile device having a display, e.g. a mobile phone, which can be fitted to the virtual or augmented reality headset for displaying the two side-by-side 2-dimensional images that form the stereographic 3-dimensional image.
[0045] The present invention therefore provides a virtual or augmented reality headset which comprises, in addition to the usual pair of primary lenses for imaging a stereoscopic image displayed within the headset, at least one adjustable lens for each eye. The adjustable lens may be adjusted for modifying the user's vision, particularly for vision correction in order to avoid having to use glasses with the headset. This allows the headset to be made with a compact form factor with improved comfort as compared with prior virtual/augmented reality headsets. The adjustable lens may also be controlled dynamically as the user views different parts of the stereoscopic image having different virtual distances from the user. In this way, the user is forced to accommodate when looking at different parts of the image, even though the distance between the user's eyes and the display remains fixed. This may help to overcome some of the feelings of nausea experienced by users of prior virtual/augmented reality headsets.
[0046] In preferred embodiments, a second adjustable lens associated with each eye allows for adjustment of cylinder and/or axis for further improvement of the user's experience and image quality. The two lens groups may be mounted for translation within the headset in a direction parallel to the plane of the image, i.e. in a direction parallel to an axis between the two lens groups, to allow the lens groups to be moved closer together or further apart. The lens groups may therefore be set at a pupillary distance
corresponding to the user's pupillary distance and, in some embodiments, the distance between the lens groups may be adjusted dynamically as the user looks at different parts of the image with different virtual distances from the user. Specifically, the lens groups may be moved closer together when the user looks at parts of the image with a shorter virtual distance from the user, e.g. for maintaining alignment between the lens groups' optical centres and the eyes as they verge when looking at near objects.
[0047] Following is a description by way of example only with reference to the accompanying drawings of embodiments of the present invention.
[0048] In the drawings: [0049] FIG. 1 shows a virtual reality headset worn by a user.
[0050] FIG. 2 is a perspective front view from above and a left side of the headset of FIG. 1
[0051] FIG. 3 is a perspective front view from the left side of the headset of FIGS. 1 and 2, showing an insert for a mobile phone or other handheld mobile device comprising a display screen.
[0052] FIG. 4 is another perspective front view from the left side of the headset shown in the previous figures, which shows a mobile phone in a partially docked position.
[0053] FIG. 5 is a schematic plan view of the virtual reality headset of FIGS. 1-4 showing a pair of lens groups, in accordance with the present invention.
[0054] FIG. 6 is a schematic side view, showing the headset of FIG. 5 worn on the face of the user.
[0055] FIG. 7 is a schematic plan view of a right-hand group of lenses for the headset of
FIGS. 1-6.
[0056] FIG. 8 is a schematic rear view from above and a right side of the right-hand group of lenses of FIG. 7.
[0057] FIG. 9A is a schematic rear view from above and the right side of a first cubic-type lens forming part of the lens group shown in FIGS. 7 and 8.
[0058] FIG. 9B is a schematic rear view from above and the right side of a second cubic-type lens forming part of the lens group of FIGS. 7 and 8.
[0059] FIG. 10 is a plan view of the right-hand group of lenses of FIGS. 9 and 10, showing
electronically controlled actuators for adjusting the first and second cubic-type lenses.
[0060] FIG. 11 is an isometric rear view from above and the right side of the right-hand group of lenses of FIG. 10. [0061] FIG. 12A is a rear view of the first cubic-type lens of the right-hand group of lenses of FIGS. 10 and 11.
[0062] FIG. 12B is a rear view of the second cubic-type lens of the right-hand group of lenses of FIGS. 10 and 11.
[0063] FIGS. 13A-13C show adjustment of the first cubic-type lens of FIG. 12A for modifying the user's accommodation.
[0064] FIGS. 14A-14C show adjustment of the second cubic-type lens of FIG. 12B for
correcting the user's axis.
[0065] FIGS. 15A-15C show adjustment of the second cubic-type lens of FIG. 12B for
correcting the user's cylinder.
[0066] FIG. 16 is a chart showing adjustments needed to the first and second cubic-type lenses of FIGS. 12A and 12B for correcting the user's vision and/or modifying their accommodation or pupillary distance (PD).
[0067] FIG. 17 is a flow diagram for software to adjust lenses within a headset according to the invention in accordance with a user's prescription.
[0068] FIG. 18 is a flow diagram for software for adjusting lenses within a headset according to the invention for modifying a user's vision according to a point or region of a stereoscopic 3D image being viewed within the headset.
[0069] FIG. 19 and FIG. 20 show schematically how the pupillary distance of the user varies according to the distance of an object being viewed.
[0070] FIG. 21 is a block diagram of the electronic components of the headset of FIGS. 1-15.
[0071] FIG. 1 shows a mobile virtual reality headset 10 according to one embodiment of the invention being worn by a user. While the present embodiment relates to a mobile virtual reality headset, which is designed to receive a mobile device incorporating a display, the present invention is equally applicable to virtual and augmented reality headsets with integrated displays. [0072] The mobile virtual reality headset 10 of the present embodiment comprises a housing 12 having a front end 14, a rear end 15, a left-hand side 16, a right-hand side 17, a top 18 and a bottom 19. The housing 12 is assembled from multiple parts, including a sidewall 20 that extends around the housing 12 and has a front end 22 and a rear end 23, a peripheral rear extension piece 30 and a detachable front cover 40. An elasticated headband 39 is attached to the sidewall 20 at the right and left-hand sides 16, 17 of the housing 12 for securing the headset 10 on the head of a user, over the eyes, as shown in FIG. 1, for hands-free use. Alternatively, an adjustable headband may be used.
[0073] The rear extension piece 30 is fitted in the rear end 23 of the sidewall 20 and, as is clear from FIGS. 1 and 2, extends around the rear end 23 of the sidewall 20, including a region (not shown) that is shaped to accommodate the user's nose. The rear end 15 of the housing 12, including the rear end 23 of the sidewall 20 and the rear extension piece 30, are shaped to match the general contours of the user's face, while the rear extension piece 30 carries a strip of foam or other cushioning material 35 for comfort on the face of the user, particularly when the headband 39 is tightened around the user's head.
[0074] The front cover 40 is detachably secured to a moulded insert 50 that is fitted in the front end 22 of the sidewall 20. As best seen in FIG. 3, the insert 50 has a front surface 51 that defines a shallow, forwards-facing recess 52 that is shaped and sized to receive a rectangular mobile device 1 having a display screen 2, of the kind shown in FIG. 4. The front cover 40 may likewise have a recessed rear face, so that upon fitting the front cover 40 to the insert 50, the rear face of the front cover 40 and front face 51 of the insert 50 form a cavity for receiving the mobile device 1. Forwardly protruding formations 53 are provided for engaging one or more cooperating formations formed on the front cover 40 for attaching the front cover 40 to the insert 50.
[0075] A dock connector 55 is mounted on a cylindrical block 56 which is hinged to the
insert 50 at the right-hand side thereof, while a manually operable catch 58 is mounted at the left-hand side, for docking a corresponding dock connector 3 of the mobile device 1 to the headset 10, with its display screen 2 facing rearwards, and securing it to the insert 50, as indicated in FIG. 4. Unlike some known mobile virtual reality headsets, the insert 50 is not movable relative to the housing 12. The dock connector 55 is arranged for connecting the mobile device 1 to various on-board electronic components of the headset 10, as described below. As is known in the art, for example, the headset 10 is equipped with a number of manually operable controls that are accessible by the user from outside the housing 12. In particular, the housing 12 is provided with a trackpad 24 (see FIG. 1), a volume control (not shown for clarity) and a back button (also not shown). The headset 10 further includes a proximity sensor 25 facing towards the user's face (see FIG. 5), an eye tracker 26, one or more motion tracking sensors 27, a charging port (also not shown) and a USB connector (also not shown).
[0076] The various component parts of the housing 12, including the sidewall 20, rear extension piece 30 and front cover 40 are formed from opaque materials to prevent ambient light from passing through these parts into the housing 12. Similarly, the rear extension piece 30 and foam strip 35 act to inhibit light from entering into the housing 12 through the rear end 15 by virtue of forming a close fit with the user's face.
[0077] The insert 50 is also formed from opaque material and, as best seen in FIG. 3, is shaped to define a pair of spaced apertures 60. The apertures 60 are arranged to ensure that each of the user's eyes can only view a respective one of two side-by-side 2-D images that are shown on the display of the mobile device 1. Juxtaposed the front surface 51 of the insert 50, the insert 50 may be formed with an integral partition 62 between the two apertures 60, to block light from one of the images bleeding into the other aperture 60. Around each of the apertures 60, the insert 50 is formed with a generally cylindrical inner surface 64 having a diameter of about 50 mm that extends rearwardly from the front surface 51 through the insert 50.
[0078] A biconvex primary lens 70 is disposed behind each of the apertures 60 for imaging the side-by-side images shown on the display to form a 3-D stereoscopic image. The primary lenses 70 have the same power as each other. The actual power of the lenses will depend on the distance between the display and lenses 70, but suitably the lenses 70 may have a power of about 35 dioptres. [0079] As shown schematically in FIG. 5, each of the primary lenses 70 is part of a respective lens group 100 that further comprises two adjustable lenses 200, 300 in accordance with the present invention, as described in more detail below. Each lens group 100 is associated with a respective one of the user's eyes E for viewing a respective one of the side-by-side 2-D images that are displayed on the display 2 of the mobile device 1 to form a virtual 3-D stereoscopic image, as mentioned above. The position of each lens group 100 between the front and rear ends 14, 15 of the housing 12 is fixed. However, each lens group 100 is mounted within the housing 12 for translational movement on a horizontal axis H between the right and left hand sides 16, 17 of the housing 12 as described in more detail below. When the headset 10 is worn, the horizontal axis H is oriented substantially parallel to an interpupillary axis extending between the user's right and left eyes. The lens groups 100 may conveniently be mounted to an interior surface of the sidewall 20, but in the present embodiment the lens groups 100 are mounted to a rear surface (not shown) of the insert 50, behind the apertures 60, as shown in FIGS. 5 and 6. Within each lens group 100, the adjustable lenses 200, 300 are disposed behind the corresponding primary lens 70. However, in other embodiments, the adjustable lenses 200, 300 may be positioned in front of their respective primary lenses 70.
[0080] Each lens group 100 therefore comprises two adjustable lenses 200, 300 for modifying the user's vision as described in more detail below. By "modifying" is meant not only that the adjustable lenses 200, 300 are capable of correcting the user's vision, but also that the user's view of the stereoscopic image may be controlled to compensate for the fact that the distance between the user's eyes and the display is constant. A first adjustable lens 200 of each lens group 100 is provided for modifying a mainly spherical component of the user's vision, while a second adjustable lens 300 is provided for modifying mainly cylinder and axis components. In the present embodiment the first adjustable lens 200 is positioned behind the second adjustable lens 300, nearer the user's eyes E. However, in other embodiments, and more generally, the first and second adjustable lenses 200, 300 may be arranged in either order, one behind the other. [0081] In the present embodiment, each of the first and second adjustable lenses 200, 300 of each lens group 100 is a cubic surface-type adjustable lens comprising two superposed lens elements 201, 202; 301, 302, as best shown in FIGS. 9A and 9B. The lens elements 201, 202; 301, 302 of each lens 200, 300 have mutually cooperating cubic surfaces which are shaped such that, when the lens elements are shifted relative to one other in a plane perpendicular to the optical axis of the lens, the optical power of the lens varies.
[0082] In other embodiments of the invention, different types of adjustable lenses such, for example, as fluidic lenses, may be employed for one or both of the adjustable lenses 200, 300, especially the first adjustable lenses 200 which are configured for correcting the user's spherical focusing power for distance vision as described in more detail below.
[0083] In the present embodiment, the lens elements 201, 202; 301, 302 of each cubic-type adjustable lens 200, 300 have polished opposite front and rear faces, and the optical thickness t of each lens element 201, 202; 301, 302 in a direction parallel to the optical axis z, as shown in FIG. 6, is defined by formula (I):
t = A(xy² + x³/3) + Dx + E     (I)

wherein D is a constant representing the coefficient of a prism removed to minimise lens thickness and may be zero; E is a constant representing lens element thickness at the optical axis z; x and y represent coordinates on a rectangular coordinate system centred on the optical axis and lying in a plane perpendicular thereto; and A is a constant representing the rate of lens power variation with relative lens element movement in the x direction, being positive for one of the lens elements 201, 202; 301, 302 and negative for the other lens element 202, 201; 302, 301. One face of each lens element 201, 202; 301, 302 is planar, but in other embodiments of the invention, one face of each lens element may be a regular surface of revolution. [0084] Whilst the lens elements 202, 201; 302, 301 of the present embodiment are defined by formula (I) above, other cubic-type adjustable lenses with cubic and higher-order surfaces that are suitable for use in accordance with the present invention are disclosed by US 3,305,294, US 3,583,790, US 7,338,159, US 7,717,552, US 5,644,374 and WO 2013/030603, the contents of which are incorporated herein by reference.
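Assuming that formula (I) takes the classic Alvarez form t = A(xy² + x³/3) + Dx + E (an assumption based on the cited US 3,305,294 rather than an explicit statement in the text), the effect of sliding the two elements of a pair by equal and opposite amounts d along the x-axis can be seen by summing their thicknesses (the D and E terms are omitted for clarity):

\[
t_{\mathrm{pair}}(x,y) = A\Big[(x+d)y^{2} + \tfrac{(x+d)^{3}}{3}\Big] - A\Big[(x-d)y^{2} + \tfrac{(x-d)^{3}}{3}\Big] = 2Ad\,(x^{2} + y^{2}) + \tfrac{2}{3}Ad^{3}.
\]

The paraboloidal term 2Ad(x² + y²) is the thickness profile of a thin spherical lens, so the focusing power of the pair varies linearly with the relative displacement, while the final term merely adds a constant thickness. An analogous shift along the y-axis yields a term proportional to xy, i.e. a mainly cylindrical change, which, combined with rotation of the lens about the z-axis, provides variable cylinder and axis.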
[0085] Each of the lens groups 100 therefore comprises a primary lens 70 and first and second cubic-type adjustable lenses 200, 300 and is mounted within the housing 12 for lateral translational movement side-to-side on the above-mentioned horizontal axis H between the left and right hand sides 16, 17 of the housing 12, as indicated by the arrows indicated by A in FIG. 5, for adjusting the pupillary distance (PD) between the lens groups 100. A separate actuator 503, 504 is associated with each lens group 100 for moving the lens groups 100 independently of one another under the control of respective control units 501, 502, as shown in FIG. 21. A separate control unit 501, 502 is associated with each of the left-hand and right-hand lens groups 100. The pupillary distance is adjustable in the range 50-76 mm, but different embodiments may be capable of different ranges of PD adjustment, for example 52-72 mm or 54-70 mm.
[0086] FIGS. 7 and 8 show the right-hand lens group 100 schematically in more detail. The arrangement of the left-hand lens group 100 is similar to that of the right-hand lens group 100, but in mirror image. The references to right- and left-hand throughout are anatomical, corresponding to the right and left eyes of the user.
[0087] As shown in FIGS. 7 and 8, the first and second adjustable lenses 200, 300 of the right- hand lens group 100 are mounted behind the corresponding primary lens 70 on the optical axis z. The lens elements 201, 202 of the first cubic-type adjustable lens 200 are mounted for sliding movement relative to one another parallel to the x-axis of the lens 200 for adjusting the mainly spherical optical power of the lens 200.
[0088] The first adjustable lens 200 is mounted with the x-axis oriented at an acute angle Θ of about 75° to the horizontal axis H, as best shown in FIG. 12A. In other embodiments, an angle Θ in the range 60-80° may be used. The lens groups 100 may be positioned within the housing 12 such that they are disposed adjacent the region of the rear extension piece 30 that is shaped to accommodate the user's nose. At least the first cubic-type lens 200 may be positioned adjacent the user's nose when the headset is worn, and by orienting the first lens 200 at such an acute angle Θ to the horizontal axis H, substantial relative movement of the lens elements 201, 202 can be accommodated without impinging on the user's nose or the part of the housing 12 that is shaped to accommodate the user's nose. It will be understood that since the lenses are concealed within the housing 12 of the headset 10, aesthetic considerations arising from the orientation of the first lens 200 at such an acute angle do not arise.
[0089] From the foregoing, it will be understood that relative movement of the lens elements 201, 202 of the first adjustable lens 200 parallel to the x-axis causes adjustment of the mainly spherical optical power of the lens 200, as illustrated in FIGS. 13A-13C. In the present embodiment, the first adjustable lens 200 is capable of a range of spherical powers of about -8 to +6 dioptres, but in some embodiments a wider or narrower range may be used, e.g. -15 to +10 dioptres or -6 to +4 dioptres. Generally, the first adjustable lens 200 may be capable of a range of spherical powers of ±10 dioptres or up to ± 15 dioptres.
[0090] Any suitable mechanism, represented schematically by item 210 in FIGS. 7-9, may be provided for adjusting the relative disposition of the lens elements 201, 202 of the first adjustable lens 200. As shown in FIG. 12A, in the present embodiment, each of the lens elements 201, 202 is fabricated with an internally threaded protrusion 203, 204 having mutually opposite screw threads. The protrusions 203, 204 are mounted on a spindle 205 having spaced thread portions 207, 208, which also have mutually opposite screw threads configured to engage with the respective protrusions 203, 204. Rotation of the spindle 205 on its axis therefore shifts the lens elements 201, 202 relative to one another in equal and opposite directions on the x-axis. A mechanism of this kind is disclosed by US2008/0030678 A1, the contents of which are incorporated herein by reference. An actuator 506 as a prime mover is fitted at one end of the spindle 205 for rotating the spindle 205 under control of the control unit 502. A corresponding actuator 505 is associated with the first adjustable lens 200 of the left-hand lens group 100. [0091] As shown in FIGS. 7 and 8, the second adjustable lens 300 of each lens group 100 is mounted between the first adjustable lens 200 and the primary lens 70. Unlike the first adjustable lens 200, the second adjustable lens 300 is mounted for rotation about the optical axis z as shown in FIGS. 14A-C. The lens elements 301, 302 of the second adjustable lens 300 are also mounted for sliding movement relative to one another in a direction parallel to the y-axis of the second lens 300, as illustrated in FIGS. 15A-C. From formula (I) above, it will be understood that shifting the lens elements 301, 302 of the second lens 300 relative to one another along the y-axis causes adjustment of the mainly cylinder optical power of the second lens 300. Rotation of the second lens 300 about the z-axis allows adjustment of the axis of the lens 300. In combination, the rotation and adjustment of the second lens 300 by relative movement of the lens elements 301, 302 parallel to the y-axis allows for mainly cylinder and axis
modification of the user's vision, including the correction of astigmatism.
[0092] In the present embodiment, the second adjustable lens 300 is capable of a range of
cylinder powers of about +4 to -4 dioptres, but, as with the first adjustable lens 200, in some embodiments a wider or narrower range may be used, e.g. +8 to -8 dioptres or +2 to -2 dioptres. The second adjustable lens 300 is capable of rotation through 90° about the optical axis to provide the full range of axis prescriptions for different users.
[0093] In the present embodiment, the lens elements 301, 302 of the second adjustable lens 300 are incapable of shifting relative to one another parallel to the x-axis of the second lens 300. However, it is envisaged that in other embodiments, some relative sliding of the lens elements 301, 302 of the second adjustable lens 300 may be permitted on the x-axis under the control of a suitable further actuation mechanism for allowing independent correction of the sphere components of the user's right and left eyes. In such embodiments, the second adjustable lens 300 may have a variable sphere in the range 0-2 dioptres or 0-1 dioptres in addition to its variable cylinder and axis components.
[0094] As for the first adjustable lens 200, any suitable mechanism, indicated schematically by item 310 in FIG. 9B, may be provided for relative movement of the lens elements 301,
302 on the y-axis as described above. In the present embodiment, each of the lens elements 301, 302 is fabricated with an internally threaded protrusion 303, 304 as shown in FIG. 12B and a spindle 305 equipped with two longitudinally spaced thread portions 307, 308 is engaged with the threaded protrusions 303, 304. The protrusions 303, 304 and threaded portions 307, 308 are formed with opposite screw threads, so that rotation of the spindle 305 causes relative movement of the lens elements 301, 302 parallel to the y-axis, transverse to the z-axis. An actuator 508 is mounted to one end of the spindle 305 for causing rotation of the spindle 305 under the control of the control unit 502. The mirror-image left-hand second adjustable lens 300 is similarly provided with a corresponding actuator 507 under control of the corresponding control unit 501.
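The oppositely threaded spindles 205, 305 convert actuator rotation into equal-and-opposite travel of the two lens elements. A small sketch of that kinematic relationship, assuming a hypothetical thread pitch; the actual pitch and actuator interface are not specified in the patent:

```python
def spindle_turns_for_shift(shift_per_element_m: float, thread_pitch_m: float) -> float:
    """Spindle revolutions needed to move each lens element by the given amount.

    One revolution advances each internally threaded protrusion by one thread pitch;
    because the two thread portions are handed oppositely, the elements travel in
    equal and opposite directions along the sliding axis.
    """
    return shift_per_element_m / thread_pitch_m


# e.g. 0.2 mm of per-element travel with a hypothetical 0.5 mm pitch
turns = spindle_turns_for_shift(0.2e-3, 0.5e-3)   # -> 0.4 revolutions
```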
[0095] Likewise, any suitable mechanism may be provided for controlling rotation of the second adjustable lens 300 about the optical axis z. In the present embodiment, one of the lens elements 301, 302 is fabricated with a gear profile 315 comprising a series of teeth 316, and the second lens 300 is mounted within the housing 12 such that the gear profile 315 engages a small pinion 320 as shown in FIGS. 11 and 12B. The teeth 316 of the gear profile 315 are arranged along a curvilinear arc, and the pinion 320 is drivably connected to an actuator 510 for causing rotation of the second lens 300 about the z-axis under the control of the control unit 502. A corresponding actuator 509 is provided for rotation of the left-hand second adjustable lens 300 under control of the control unit 501.
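The axis adjustment amounts to a simple gear reduction: the pinion 320 must turn several times faster than the lens it drives. A minimal sketch, with tooth counts chosen purely for illustration (the patent does not specify them):

```python
def pinion_angle_for_axis(axis_deg: float, gear_teeth: int = 120, pinion_teeth: int = 12) -> float:
    """Pinion rotation (degrees) needed to rotate the second lens to a given axis.

    The gear profile 315 on the lens element and the pinion 320 mesh like a spur
    pair, so the rotation angles are related by the tooth-count ratio. The tooth
    counts here are hypothetical.
    """
    return axis_deg * gear_teeth / pinion_teeth


# Setting a 90 degree axis with this assumed ratio needs 900 degrees of pinion travel.
pinion_travel = pinion_angle_for_axis(90.0)
```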
[0096] As illustrated schematically in the chart of FIG. 16, in addition to the pair of primary lenses 70 for imaging two side-by-side 2-D images displayed on the display 2 of the device 1 to form a stereoscopic 3-D image, the first and second adjustable lenses 200, 300 of each lens group 100 are capable of modifying the user's vision by adjusting sphere (first adjustable lens 200) and cylinder and axis (second adjustable lens 300). The two lens groups 100 may be moved closer together or further apart for modifying the user's pupillary distance. The actuators 503-510 for the left- and right-hand lens groups 100 are operated under the control of the control units 501, 502. With reference to FIG. 21, the control units 501, 502 are connected to a processor 520 mounted on a small PCB that is fixedly secured within the housing 12. The processor 520 is also connected to a memory unit 521, to the dock connector 55 on the insert 50 for communication with the mobile device 1, to the trackpad 24 and other user-operable controls on the headset 10, and to the proximity sensor 25, eye tracker 26 and motion tracking sensor 27.
[0097] In particular, the first adjustable lens 200 of each lens group 100 may be adjusted by the actuators 505, 506 under the control of the control units 501, 502 to provide the nearest equivalent sphere (NES) for correcting the user's distance vision. For a user who also has astigmatism, the second adjustable lens 300 of each lens group 100 may be adjusted using the actuators 507, 508, 509, 510 for correcting for cylinder and axis. The pupillary distance between the lens groups 100 may be adjusted using the actuators 503, 504. In the present embodiment, each first adjustable lens 200 is adjusted
independently of the other to allow for different vision correction to each eye.
However, in some embodiments, the first adjustable lenses 200 of the two lens groups 100 may be adjusted together to afford the same degree of sphere correction in both eyes, and one or both of the second adjustable lenses 300 may be capable of additional independent adjustment for sphere by providing for the lens elements 301, 302 to slide relative to each other parallel to the x-axis as well as the y-axis as described above. In this way, the first adjustable lenses 200 may be operated, for example, to provide an average vision correction for sphere between the two eyes, and small final adjustments for differences in sphere as between the two eyes may be made using the second adjustable lenses 300.
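As one concrete illustration of how the control software might combine these degrees of freedom: the nearest/spherical equivalent calculation below is standard optometric practice, while the splitting strategy is only a sketch of the shared-plus-residual arrangement described above, not a formula given in the patent.

```python
def nearest_equivalent_sphere(sphere_d: float, cylinder_d: float) -> float:
    """Standard spherical-equivalent approximation: sphere + cylinder / 2 (dioptres)."""
    return sphere_d + cylinder_d / 2.0


def split_sphere(right_nes: float, left_nes: float):
    """Drive both first lenses 200 to the average NES and leave the residual
    per-eye difference to the (optionally sphere-capable) second lenses 300."""
    common = (right_nes + left_nes) / 2.0
    return common, right_nes - common, left_nes - common


# e.g. right eye -2.00 DS / -1.00 DC, left eye -1.50 DS / -0.50 DC:
r = nearest_equivalent_sphere(-2.00, -1.00)   # -2.50 D
l = nearest_equivalent_sphere(-1.50, -0.50)   # -1.75 D
common, r_residual, l_residual = split_sphere(r, l)   # -2.125, -0.375, +0.375 D
```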
[0098] Advantageously, prescription set-up software in accordance with the present invention for adjusting the first and second adjustable lenses 200, 300 may be stored in a storage device 4 or memory unit 5 of the mobile device 1, the software being executable by a processor 6 of the mobile device 1 for transmitting instructions to the control units 501, 502 for operating the actuators 503-510 when the mobile device 1 is connected via its dock connector 3 to the dock connector 55 of the headset 10. In the present
embodiment, the prescription set-up software includes executable machine code for displaying a prompt for the user on the display 2 of the mobile device 1 to input his or her ophthalmic prescription, including at least the user's sphere correction for distance and optionally also including fitting data for a lens to the wearer, e.g. pupillary distance, cylinder, axis and lens vertex distance, age and/or add power. The software also comprises code for allowing the user to store his or her prescription data in the storage device 4 of the mobile device 1 to avoid the user having to re-enter the data when he or she uses the headset 10 again. The prescription data is stored on the mobile device 1 in association with a user account in a manner well known to those skilled in the art, so that different users may store their own prescription data on the mobile device 1.
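A sketch of the kind of per-user record the set-up software might persist on the mobile device; the field names and structure are hypothetical, not taken from the patent:

```python
from dataclasses import dataclass, asdict
import json


@dataclass
class EyePrescription:
    sphere_d: float          # distance sphere, dioptres
    cylinder_d: float = 0.0  # cylinder, dioptres
    axis_deg: float = 0.0    # cylinder axis, degrees


@dataclass
class UserProfile:
    user_id: str
    right: EyePrescription
    left: EyePrescription
    pupillary_distance_mm: float
    vertex_distance_mm: float = 12.0  # spectacle vertex distance the prescription was written for
    add_power_d: float = 0.0

    def to_json(self) -> str:
        """Serialise the profile for storage against a user account."""
        return json.dumps(asdict(self))


profile = UserProfile("user-1", EyePrescription(-2.00, -1.00, 90.0),
                      EyePrescription(-1.50), pupillary_distance_mm=63.0)
```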
[0099] FIG. 17 is a flowchart for setting up the headset 10 with a user's prescription in
accordance with the present invention. Upon docking the mobile device 1 with the dock connector 55 on the headset 10, the electronic components of the headset 10 receive power (step 1001) from the mobile device 1, causing the headset to power up. Upon executing the prescription set-up software in the processor 6 of the mobile device 1 (e.g. by opening an app on the mobile device), the electronic components of the headset 10 communicate (step 1002) with the mobile device 1 and the user's personal ophthalmic prescription data is retrieved (step 1004) from the storage device 4 and loaded into the memory 5. If the user's prescription data cannot be found in the storage device 4, the user is prompted to input it. Using the user's prescription data, the set-up software is executed by the processor 6 (step 1005) to transmit instructions to the on-board processor 520 of the headset 10 to instruct the control units 501, 502 to operate the actuators 503-510 to adjust the positions of the lens groups 100 on the horizontal axis H and the settings of the first and second adjustable lenses 200, 300 of each lens group 100 in accordance with the user's prescription. It will be appreciated that the lens groups 100 will usually be situated further away from the user's eyes E than a pair of glasses would be and, in consequence of this, some offset to the user's prescription is needed to adjust the first and second lenses 200, 300 correctly for the user.
[0100] The set-up software of the invention may therefore include code for prompting the user to input a measurement of the position of his or her eyes when entering prescription data and may calculate the necessary offset for the first and second lenses 200, 300 based on the user's prescription data and the position of his or her eyes. Alternatively, the set-up software may include a short calibration routine which is executed when a new user uses the headset 10 for the first time to fine tune the adjustment of the first and second lenses 200, 300 for the spacing of the user's eyes from the lens groups 100. The calibration data may then be stored in the storage device 4 with the user's prescription data as a profile, so the user does not have to re-calibrate the headset 10 each time he or she uses it.
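One plausible form of the offset mentioned above is the standard vertex-distance compensation applied when a corrective lens is moved further from the eye; the patent does not state the formula, so the following is only a sketch under that assumption:

```python
def vertex_compensated_power(power_d: float, extra_distance_m: float) -> float:
    """Power required when a corrective lens is moved further from the eye.

    power_d          -- prescribed power at the original (spectacle) plane, dioptres
    extra_distance_m -- how much further from the eye the headset lens sits, metres

    Standard vertex compensation: F' = F / (1 + x * F), with x the increase in
    lens-to-eye distance.
    """
    return power_d / (1.0 + extra_distance_m * power_d)


# A -6.00 D prescription, with the lens group assumed to sit ~20 mm further out
# than the user's spectacles would: about -6.82 D is required at the lens group.
adjusted = vertex_compensated_power(-6.00, 0.020)
```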
[0101] In a variant of the invention, in which a headset in accordance with the invention includes one or more integrated displays instead of receiving a mobile device, the software may run on a separate computer (not shown) that is connected to the headset wirelessly or via a cable (e.g., USB). When executed, the set-up software may prompt the user for login details (step 1002A) and, after receiving correct login credentials, may operate a processor of the separate computer to retrieve the user's ophthalmic prescription data from a storage device that is integral with or connected to the separate computer and calculate the necessary adjustments to the lens groups 100 based on the user's prescription data.
[0102] In accordance with the present invention therefore, the lens groups 100 may be set up under software control in accordance with the user's vision. As a result, the user does not need to wear glasses for correcting refractive errors when using the headset of the invention, and the headset does not need to include sufficient space to accommodate the user's glasses. This may bring numerous advantages in terms of the quality of the image viewed by the user, but additionally may allow the headset to be designed with a smaller form factor as compared with previous virtual and augmented reality headsets, where a certain minimum distance from the user's face was required to accommodate glasses for users who needed them for vision correction. By having a headset that does not protrude as far forwardly from the user's face, the turning force applied by the headset to the user's face owing to the weight of the headset may be reduced, making the headset more comfortable to wear.
[0103] As mentioned above, the headset 10 of the present embodiment includes an eye
tracker 26, which allows the headset 10 of the invention to be operated in such a way, in accordance with a particular aspect of the invention, as to reduce or eliminate feelings of nausea or other discomfort that are sometimes associated with viewing 3-D stereoscopic images, as described below. However, other embodiments may omit the eye tracker and the associated additional features described below.

[0104] Any suitable eye tracker 26 that is capable of measuring the user's point of gaze may be used, but in the present embodiment, a video-based eye tracker is used in which infrared/near-infrared non-collimated light is used to create corneal reflections, and a vector between the pupil centre and the corneal reflections is used to compute the point of regard on a surface or the gaze direction. The eye tracker 26 may be mounted within the housing 12 at any convenient location with a clear line of sight to one of the user's eyes. In the present embodiment, a single eye tracker 26 is used for measuring the line of sight of the user's left eye. However, it will be understood that in other
embodiments, the right eye may be measured, or two eye trackers may be employed, one associated with each eye.
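A minimal sketch of the pupil-centre/corneal-reflection technique referred to above: the gaze is estimated from the vector between the pupil centre and the glint, mapped to display coordinates via a short per-user calibration. The mapping model and function names below are illustrative assumptions, not the patent's implementation:

```python
import numpy as np


def fit_gaze_mapping(pc_cr_vectors: np.ndarray, screen_points: np.ndarray) -> np.ndarray:
    """Fit an affine map from pupil-centre-to-corneal-reflection vectors (N x 2)
    to known on-screen calibration points (N x 2) by least squares."""
    design = np.hstack([pc_cr_vectors, np.ones((len(pc_cr_vectors), 1))])  # N x 3
    coeffs, *_ = np.linalg.lstsq(design, screen_points, rcond=None)        # 3 x 2
    return coeffs


def point_of_gaze(pc_cr_vector: np.ndarray, coeffs: np.ndarray) -> np.ndarray:
    """Map a single pupil-centre/glint vector to an estimated point of gaze."""
    return np.append(pc_cr_vector, 1.0) @ coeffs
```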
[0105] Virtual reality software of the kind known in the art for operating a mobile virtual reality headset that is fitted with a mobile device such, for example, as mobile device 1 in the present embodiment, typically consists of executable code and data. As mentioned above, a mobile device generally comprises a processor, memory, storage and a display. The software is stored in the storage, and a temporary copy of at least the executable code is retrieved and loaded into the memory when the software is run. The data comprises image data encoding a 3-D stereographic image for display within the headset, and the code is executable by the processor for retrieving the image data from the storage and using the image data to generate a display signal for displaying the image on the display within the headset. The image data may comprise information for displaying a static stereoscopic image on the display, but in some embodiments, may comprise information for displaying a moving stereoscopic image. For instance, the image data may comprise a movie file or a 3-D game. The image displayed on the display is updated dynamically in response to head movements of the user which are measured using the motion tracking sensor 27 (e.g. gyroscopes, accelerometers, structured light systems) in the headset or the mobile device to simulate the user looking around a 3-D scene. Virtual reality software of this kind is known in the art and need not be described in more detail herein.
[0106] In accordance with the present invention, the software of the present embodiment further includes virtual distance data as part of the image data. For instance, the virtual distance data may be encoded as parallax data as between the two images of the stereoscopic image. The virtual distance data comprises virtual distance values for virtual distances from the user's eyes E of multiple points or regions within the image to be displayed. The data also includes accommodation data comprising accommodation and vergence values corresponding to the virtual distances. Upon running the software, the virtual distance data and accommodation data are retrieved from the storage device 4 and loaded into the memory 5 for rapid access by the processor 6.
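The accommodation and vergence values associated with each virtual distance follow directly from geometry. The patent ships these values with the image data rather than specifying how they are computed; a sketch of one plausible way such a table might be derived, assuming a nominal interpupillary distance:

```python
import math


def accommodation_dioptres(virtual_distance_m: float) -> float:
    """Accommodation demand for an object at the given virtual distance."""
    return 1.0 / virtual_distance_m


def vergence_angle_deg(virtual_distance_m: float, ipd_m: float = 0.063) -> float:
    """Total convergence angle of the two visual axes for the same virtual distance."""
    return 2.0 * math.degrees(math.atan(ipd_m / (2.0 * virtual_distance_m)))


# Small lookup table for a few virtual distances (metres -> (dioptres, degrees))
table = {d: (accommodation_dioptres(d), vergence_angle_deg(d))
         for d in (0.5, 1.0, 2.0, 5.0, 10.0)}
```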
[0107] The display signal generated by the processor 6 causes the display 2 to show two side-by-side 2-D images, which, when imaged together through the respective lens groups 100 in the headset, form a virtual 3-D stereoscopic image in the manner known to those skilled in the art. The user views the stereoscopic image in 3-D through the lens groups 100 and, upon moving his or her head to look around a virtual scene included in the image, the image is updated to change with the user's movements.
[0108] It will be appreciated that in a typical virtual 3-D scene some parts of the image will be further from the user than other parts, even though the actual distance of the display from the user is constant and fixed. When viewing a real 3-D scene, a user's eyes adapt according to the distance of the part of the scene that is being looked at by the user, as illustrated in FIGS. 19 and 20. In particular, the user's pupillary distance decreases (the user's eyes verge) and accommodation increases when viewing near objects. When viewing a virtual 3-D scene, the user's eyes do not naturally adapt, which may give rise to feelings of nausea in some people (vergence-accommodation conflict). In some aspects, the present invention seeks to overcome this problem by adjusting the position and optical power of the lens groups 100 in response to the line of sight of the user, which is measured using the eye tracker 26, although as mentioned above, the present invention with the first and second adjustable lenses 200, 300 may be implemented for correcting for the user's prescription in a virtual or augmented reality headset, without these additional features.
[0109] With reference to FIG. 21, the processor 520 in the headset 10 is arranged to receive a signal from the eye tracker 26 and transmit data encoding the user's line of sight to the mobile device 1 through the dock connector 55. As shown in the flowchart of FIG. 18, this step is indicated by reference numeral 2001. The executable code includes an algorithm of the kind known in the art for calculating the user's point of gaze on the displayed image from the line of sight data 2002. The virtual distance value for the part of the image corresponding to the user's point of gaze is then looked up in the virtual distance data stored in the memory 5. Using the retrieved virtual distance value, the corresponding accommodation and vergence values are obtained from the
accommodation data in the memory 5. The processor then uses the obtained
accommodation and vergence values to transmit instructions to the processor 520 of the headset 10 for instructing the control units 501, 502 to send control signals to the actuators 503, 504, 505, 506 for controlling the positions of the right- and left-hand lens groups 100 on the horizontal axis H and the spherical power of the first adjustable lenses 200 according to the obtained accommodation and vergence values. In particular, the actuators 503, 504 are operated by the control units 501, 502 to adjust the pupillary distance between respective lens groups 100 to match the vergence value corresponding to the virtual distance value for the user's point of gaze in the displayed image. Similarly, the actuators 505, 506 are operated by the control units 501, 502 to adjust the first adjustable lenses 200 to match the spherical power of the first lenses 200 to the accommodation value corresponding to the virtual distance value for the user's point of gaze on the image. In this way, the lens groups 100 of the headset of the present embodiment are adjusted to oblige the user to accommodate in response to his or her point of gaze. When the user looks at near points within the virtual 3-D image, the lens groups 100 are moved closer together using the actuators 503, 504, causing the user's eyes to verge. When the user looks at points that are further away, the lens groups 100 are moved apart. When the user looks at near points, the first adjustable lenses 200 are adjusted using the actuators 505, 506 to shift the lens elements 201, 202 of each first lens 200 in a direction parallel to the x-axis of the lens 200 for reducing the sphere of the lens 200, causing the user to accommodate more strongly to view the image in focus. Similarly, when the user looks at points that are further away, the actuators 505, 506 are operated to increase the optical power of the first adjustable lenses 200 in accordance with the user's distance vision.

[0111] The virtual reality software of the present invention may be integrated with the prescription set-up software described above, so that dynamic adjustments to the first and second adjustable lenses 200, 300 are made relative to the user's normal prescription. Accordingly, when the user views a distant point within the 3-D virtual image on the display 2, the lens groups 100 are adjusted to match the user's prescription for pupillary distance, sphere, cylinder and axis, etc. When the user views a nearer point within the virtual image, as determined from the user's line of sight measured by the eye tracker 26, the lens groups 100 are adjusted under control of the software to adjust the distance between the lens groups 100 and the degree of accommodation provided by the first adjustable lenses 200 (and optionally second adjustable lenses 300 as described above) to force the user's eyes to accommodate and verge in the same way as they would naturally when viewing a real 3-D scene.
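Putting the steps of FIG. 18 together, the per-frame control flow amounts to: read the line of sight, find the point of gaze, look up the virtual distance, and drive the pupillary-distance and sphere actuators toward the corresponding vergence and accommodation values. A highly simplified sketch; every interface named below (the tracker, the lookup, the actuator commands, the eye-to-lens spacing) is a hypothetical stand-in, not the patent's implementation:

```python
def update_lens_groups(eye_tracker, virtual_distance_map, control_unit,
                       user_distance_sphere_d: float, ipd_m: float = 0.063) -> None:
    """One iteration of the gaze-driven adjustment of FIG. 18 (sketch only).

    eye_tracker.line_of_sight()            -> gaze direction (hypothetical API)
    virtual_distance_map.point_of_gaze(..) -> point on the displayed image (step 2002)
    virtual_distance_map.virtual_distance  -> virtual distance in metres for that point
    control_unit.set_*(...)                -> stand-ins for actuators 503-506
    """
    los = eye_tracker.line_of_sight()                      # step 2001
    pog = virtual_distance_map.point_of_gaze(los)          # step 2002
    d = virtual_distance_map.virtual_distance(pog)         # metres

    accommodation_d = 1.0 / d                              # dioptres the eye must supply
    # Reduce the sphere of the first lenses below the user's distance correction so
    # the eye has to accommodate by ~1/d dioptres to bring the image into focus.
    control_unit.set_sphere(user_distance_sphere_d - accommodation_d)
    # Narrow the lens-group spacing for near points so the eyes verge naturally.
    control_unit.set_lens_group_separation(simulated_pd(ipd_m, d))


def simulated_pd(ipd_m: float, virtual_distance_m: float,
                 eye_to_lens_m: float = 0.03) -> float:
    """Lens-group spacing that places each lens on the converged visual axis
    (simple similar-triangles estimate; eye_to_lens_m is an assumed value)."""
    return ipd_m * (1.0 - eye_to_lens_m / virtual_distance_m)
```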
[0112] With reference to FIG. 16, the virtual reality headset 10 of the present embodiment is capable of providing a wide range of modifications to the user's vision. As described above, the first adjustable lenses 200 may be selectively adjusted to correct the user's spherical prescription as indicated in box (a) of FIG. 16. The second adjustable lenses 300 may also be adjusted to correct the user's cylinder and axis as shown in box (b). If the user requires no distance correction, only the user's astigmatism (cylinder and axis) may be corrected as shown in box (e).
[0113] In addition, the lens groups 100 of the headset 10 can be adjusted to modify dynamically the user's vision according to the virtual distance of the part of the virtual 3-D stereoscopic image that the user is looking at within the headset. In particular, when the user's point of gaze is located at a part of the image at a relatively near virtual distance from the user, the pupillary distance between the lens groups 100 can be adjusted by moving the lens groups 100 closer together as shown in box (j) of FIG. 16. The pupillary distance between the lens groups 100 may also be adjusted in the same manner to fit the user. In addition, the user's eyes can be forced to accommodate when viewing closer virtual objects by reducing the optical power of the first adjustable lenses 200 as shown in box (h). A combination of pupillary distance and accommodation adjustments is indicated in box (i).

[0114] If the user requires vision correction in addition to forced accommodation when looking at nearer parts of the virtual image, the first and second adjustable lenses 200, 300 may be adjusted accordingly. As indicated in box (f), the first and second adjustable lenses 200, 300 can be controlled to correct the user's cylinder and axis and to apply an adjustment corresponding to the virtual distance of the part of the image at which the user is looking. In a variant, as shown in box (g), only the pupillary distance and cylinder/axis are adjusted. For a user who requires distance correction and has astigmatism, the first and second adjustable lenses 200, 300 may be adjusted to correct for cylinder and axis, and the net sphere provided by the first adjustable lenses 200 may be calculated from the user's distance correction and the adjustment corresponding to the virtual distance of the part of the image at which the user is looking. When looking at near objects, the pupillary distance may also be reduced as shown in box (d).
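One compact way of expressing the net-sphere combination described above (a sketch consistent with the description, not a formula given in the patent):

```python
def net_first_lens_sphere(distance_correction_d: float, virtual_distance_m: float) -> float:
    """Sphere to set on the first adjustable lens: the user's distance correction
    minus the accommodation demand of the virtual object being looked at."""
    return distance_correction_d - 1.0 / virtual_distance_m


# A -2.50 D myope looking at a virtual object 0.5 m away: the lens is set to -4.50 D,
# obliging the eye to accommodate by 2 D just as it would for a real object at 0.5 m.
setting = net_first_lens_sphere(-2.50, 0.5)
```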
[0115] The headset 10 and software of the present embodiment have been described with
reference to a virtual reality headset and virtual reality software, but it will be understood by those skilled in the art that the invention may be embodied in a similar fashion in an augmented reality headset in which a virtual 3-D image is superimposed on an image of the real world viewed through a transparent screen or opening in the headset.

Claims

1. A virtual or augmented reality headset comprising two lens groups for imaging two side-by-side 2-dimensional images displayed on a display within the headset to form a virtual stereographic 3-dimensional image; each lens group comprising a primary lens, a first adjustable lens for correcting a spherical refractive error in a user's distance vision and a second adjustable lens for correcting for cylinder and axis, the primary lens and first and second adjustable lenses within each lens group being in mutual optical alignment on a respective optical axis.
2. A virtual or augmented reality headset as claimed in claim 1, wherein the headset is a virtual reality (VR) headset in which only the 3-dimensional image displayed on the display is visible to the user.
3. A virtual reality headset as claimed in claim 2, wherein the headset incorporates one or more integrated displays.
4. A virtual reality headset as claimed in claim 2, wherein the headset is a mobile virtual reality headset that is arranged to receive a mobile device having its own display, such that the mobile device is removable from the headset and, when fitted, is arranged such that its display faces the user's eyes.
5. A virtual or augmented reality headset as claimed in claim 1, wherein the headset is an augmented reality headset in which the 3-dimensional image is superimposed on an image of the real world viewed through the headset.
6. A virtual or augmented reality headset as claimed in claim 1, wherein the second adjustable lens within at least one lens group also has variable sphere.
7. A virtual or augmented reality headset as claimed in claim 6, wherein the first adjustable lenses are linked for simultaneous equal adjustment, and the at least one second adjustable lens allows for the correction of small differences in sphere between the user's two eyes.
8. A virtual or augmented reality headset as claimed in any of claims 1-7, wherein the first adjustable lenses are independently adjustable.
9. A virtual or augmented reality headset as claimed in any preceding claim, wherein the distance between the lens groups and the display within the headset is fixed.
10. A virtual or augmented reality headset as claimed in any preceding claim, wherein at least one of the lens groups is movable sideways in a direction substantially parallel to an interpupillary axis between the user's eyes for adjusting the spacing between the lens groups according to the user's interpupillary distance.
11. A virtual or augmented reality headset as claimed in any of claims 1-10, wherein the first adjustable lens of each lens group comprises a first cubic surface-type adjustable lens comprising two superposed lens elements having mutually cooperating cubic or higher-order surfaces that are slidable relative to one another along an x-axis of the first adjustable lens, which is perpendicular to the optical axis, and shaped to provide variable sphere according to their relative positions along the x-axis for varying the focusing power of the first adjustable lens.
12. A virtual or augmented reality headset as claimed in claim 11, wherein the x-axis of the first adjustable lens is oriented at an acute angle, suitably about 45-80°, relative to an
interpupillary axis that extends between the lens groups.
13. A virtual or augmented reality headset as claimed in claim 11 or claim 12, wherein each lens group comprises a respective first sliding-controlling actuator associated with the first adjustable lens for controlling the relative disposition of the lens elements along the x-axis of the first adjustable lens for adjusting the spherical optical power of the first adjustable lens.
14. A virtual or augmented reality headset as claimed in any of claims 11-13, wherein each lens element of the first adjustable lens has a first cubic or higher-order surface of the kind described above and a second opposite surface that is a regular surface of revolution, preferably planar.
15. A virtual or augmented reality headset as claimed in any of claims 1-10, wherein the first adjustable lens of each lens group comprises a variable focusing power liquid lens comprising a distensible membrane having an optical surface, wherein the focusing power of the liquid lens is related to the curvature of the membrane.
16. A virtual or augmented reality headset as claimed in any of claims 11-15, wherein the optical power of the first adjustable lens is variable in the range up to about ±15D.
17. A virtual or augmented reality headset as claimed in any preceding claim, wherein the second adjustable lens of each lens group comprises a second cubic surface-type lens, which is mounted for rotation within the headset about the optical axis, the lens elements of the second lens being slidable relative to one another in a direction parallel to a y-axis of the second adjustable lens, which is perpendicular to the optical axis, and shaped to provide variable cylinder according to their relative positions along the y-axis for correcting astigmatism.
18. A virtual or augmented reality headset as claimed in claim 17, wherein the cubic surfaces of the lens elements of the second adjustable lens are shaped to provide variable sphere when they are displaced relative to one another along an x-axis of the second adjustable lens, the x- and y-axes of the second adjustable lens forming a 3-dimensional Cartesian coordinate system with a z-axis that is parallel to the optical axis of the adjustable lens.
19. A virtual or augmented reality headset as claimed in claim 17 or claim 18, wherein each lens group comprises a respective second sliding-controlling actuator associated with the or each cubic surface-type second adjustable lens for controlling the relative disposition of the lens elements along the y-axis of the second adjustable lens for adjusting the cylindrical power of the second adjustable lens.
20. A virtual or augmented reality headset as claimed in claim 19, wherein each lens group further comprises a rotation-controlling actuator associated with the second adjustable lens for adjusting the axis of the second adjustable lens.
21. A virtual or augmented reality headset as claimed in claim 20, wherein each lens group comprises a respective further sliding-controlling actuator associated with the second adjustable lens for controlling the relative disposition of the lens elements along the x-axis of the second adjustable lens for adjusting the spherical power of the second adjustable lens.
22. A virtual or augmented reality headset as claimed in any of claims 17-21, wherein each lens element of the second adjustable lens has a first cubic or higher-order surface of the kind described above and a second opposite surface that is a regular surface of revolution, preferably planar.
23. A virtual or augmented reality headset as claimed in any preceding claim, wherein each of the lens groups is mounted within the headset for translational movement in a direction parallel to the interpupillary axis that extends between the two lens groups.
24. A virtual or augmented reality headset as claimed in claim 23, comprising a respective pupillary distance-controlling actuator associated with each of the lens groups for adjusting the interpupillary distance between the lens groups.
25. A virtual or augmented reality headset as claimed in any preceding claim, wherein the virtual or augmented reality headset further comprises at least one eye-tracking device.
26. Software or firmware for controlling a virtual or augmented reality headset as claimed in any preceding claim, which software or firmware comprises machine code which, when executed by a computer comprising a processor and memory, controls the computer to receive in the memory ophthalmic prescription data for a user and to output a control signal to cause adjustment of the first and second adjustable lenses in accordance with the user's prescription.
27. Software or firmware for controlling a virtual or augmented reality headset as claimed in claim 13, which software or firmware comprises machine code which, when executed by a computer comprising a processor and memory, controls the computer to receive in the memory ophthalmic prescription data for a user encoding at least the user's sphere component for distance and to transmit control signals to the first sliding-controlling actuators for adjusting the relative disposition of the lens elements of each of the first adjustable lenses to adjust the focusing powers of the first adjustable lenses in accordance with the user's prescription.
28. Software or firmware for controlling a virtual or augmented reality headset as claimed in claim 20, wherein the software or firmware, when executed by a computer comprising a processor and memory, controls the computer to receive in the memory ophthalmic prescription data for a user encoding at least the user's cylinder and axis components and to transmit control signals to the second sliding-controlling actuators for adjusting the relative disposition of the lens elements of each of the second adjustable lenses and the rotation-controlling actuators for adjusting the angle of each of the second adjustable lenses within the headset respectively to adjust the cylinder power and axis of the second adjustable lenses in accordance with the user's prescription.
29. Software or firmware as claimed in claim 27 or claim 28, wherein the software or firmware, when executed by the computer, obtains the user's prescription data from a storage device.
30. Software or firmware as claimed in claim 27 or claim 28, wherein the software or firmware, when executed by the computer, prompts a user to input their prescription data.
31. Software or firmware as claimed in claim 27, wherein adjustment of the first adjustable lenses in accordance with the user's prescription for sphere includes an additional offset to account for the position of the user's eyes when wearing the headset.
32. Software or firmware as claimed in claim 31, wherein the size of the spherical offset is received in the memory with the prescription data or obtained from a calibration routine.
PCT/EP2018/054983 2017-03-01 2018-02-28 Improvements in or relating to virtual and augmented reality headsets WO2018158347A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW107109989A TW201937238A (en) 2017-03-01 2018-03-23 Improvements in or relating to virtual and augmented reality headsets

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1703352.3A GB201703352D0 (en) 2017-03-01 2017-03-01 Improvements in or relating to virtual and augmented reality headsets
GB1703352.3 2017-03-01

Publications (1)

Publication Number Publication Date
WO2018158347A1 true WO2018158347A1 (en) 2018-09-07

Family

ID=58543796

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/054983 WO2018158347A1 (en) 2017-03-01 2018-02-28 Improvements in or relating to virtual and augmented reality headsets

Country Status (3)

Country Link
GB (1) GB201703352D0 (en)
TW (1) TW201937238A (en)
WO (1) WO2018158347A1 (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10520729B1 (en) * 2017-04-25 2019-12-31 Facebook Technologies, Llc Light scattering element for providing optical cues for lens position adjustment
US10620432B1 (en) * 2017-04-25 2020-04-14 Facebook Technologies, Llc Devices and methods for lens position adjustment based on diffraction in a fresnel lens
CN111077676A (en) * 2019-12-10 2020-04-28 华为技术有限公司 Astigmatic correction lens, head-mounted display device, and astigmatic correction method
CN111416972A (en) * 2020-01-21 2020-07-14 同济大学 Three-dimensional imaging system and method based on axially adjustable cascade rotating mirror
CN112882235A (en) * 2021-01-20 2021-06-01 山东梦幻视界智能科技有限公司 Augmented reality (MR) display device
CN113680045A (en) * 2021-07-21 2021-11-23 温州大学 Virtual reality self-service game device
US11333803B2 (en) 2019-05-16 2022-05-17 Facebook Technologies, Llc Fluid lens with low energy membrane adjustment
WO2022106500A1 (en) * 2020-11-23 2022-05-27 Koninklijke Philips N.V. Artificial intelligence (ai)-based optimized solution for device localization in medical facility set-up
EP4083682A1 (en) * 2021-04-28 2022-11-02 Sony Interactive Entertainment Inc. Head-mountable display apparatus and methods
US11506825B1 (en) 2019-10-24 2022-11-22 Meta Platforms, Inc. Elastomer based flexures for fluid lenses
US11561415B1 (en) 2019-05-16 2023-01-24 Meta Platforms Technologies, Llc Moving guide actuation of fluid lenses
US11635637B1 (en) 2019-05-16 2023-04-25 Meta Platforms Technologies, Llc Fluid lens with low energy membrane adjustment
US11681146B2 (en) 2021-03-18 2023-06-20 Snap Inc. Augmented reality display for macular degeneration
US11703616B2 (en) 2019-11-05 2023-07-18 Meta Platforms Technologies, Llc Fluid lens with low gas content fluid
US11714256B2 (en) 2020-04-27 2023-08-01 Apple Inc. Electronic devices with optical module positioning systems
US11719960B1 (en) 2019-05-16 2023-08-08 Meta Platforms Technologies, Llc Gravity sag compensation in fluid-filled lenses
US11740391B1 (en) 2020-12-31 2023-08-29 Meta Platforms Technologies, Llc Fluid lens operational feedback using sensor signal
EP4055435A4 (en) * 2019-11-06 2023-12-27 Valve Corporation Optical system for head-mounted display device
US11867927B1 (en) 2019-05-16 2024-01-09 Meta Platforms Technologies, Llc Modified membranes for fluid lenses
US11908066B2 (en) 2021-03-24 2024-02-20 Sony Interactive Entertainment Inc. Image rendering method and apparatus
WO2024058507A1 (en) * 2022-09-15 2024-03-21 삼성전자주식회사 Electronic device which minimizes difference between real space and virtual space and method for manufacturing same

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI775392B (en) * 2021-04-20 2022-08-21 宏碁股份有限公司 Augmented reality glasses
CN115248500B (en) * 2021-04-25 2023-07-25 宏碁股份有限公司 Augmented reality glasses
TWI815353B (en) * 2022-03-16 2023-09-11 國立清華大學 Head-mounted device for reducing symptoms of cybersickness
TWI816492B (en) * 2022-03-22 2023-09-21 宏達國際電子股份有限公司 Head-mounted display device and eye tracking module

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090153796A1 (en) * 2005-09-02 2009-06-18 Arthur Rabner Multi-functional optometric-ophthalmic system for testing diagnosing, or treating, vision or eyes of a subject, and methodologies thereof
WO2015148442A1 (en) * 2014-03-25 2015-10-01 Eyenetra, Inc. Methods and apparatus for optical controller
WO2016115285A1 (en) * 2015-01-13 2016-07-21 Eyenetra, Inc. Variable lens system for refractive measurement
WO2016149416A1 (en) * 2015-03-16 2016-09-22 Magic Leap, Inc. Methods and systems for diagnosing and treating health ailments

Also Published As

Publication number Publication date
TW201937238A (en) 2019-09-16
GB201703352D0 (en) 2017-04-19

Similar Documents

Publication Publication Date Title
WO2018158347A1 (en) Improvements in or relating to virtual and augmented reality headsets
US10650605B2 (en) Apparatuses, methods and systems coupling visual accommodation and visual convergence to the same plane at any depth of an object of interest
US11132056B2 (en) Predictive eye tracking systems and methods for foveated rendering for electronic displays
US20200051320A1 (en) Methods, devices and systems for focus adjustment of displays
CN107870424B (en) Adjustable virtual reality device capable of adjusting display module
CN108600733B (en) Naked eye 3D display method based on human eye tracking
CN108886612B (en) Multi-depth flat panel display system with reduced switching between depth planes
KR101443322B1 (en) Method for optimizing and/or manufacturing eyeglass lenses
CN112136094A (en) Depth-based foveated rendering for display systems
CN111771179A (en) Display system and method for determining registration between a display and an eye of a user
WO2017099824A1 (en) Focus adjusting virtual reality headset
WO2019030467A1 (en) Head-mountable apparatus and methods
US20140375531A1 (en) Method of roviding to the user an image from the screen of the smartphome or tablet at a wide angle of view, and a method of providing to the user 3d sound in virtual reality
US10656707B1 (en) Wavefront sensing in a head mounted display
US8692870B2 (en) Adaptive adjustment of depth cues in a stereo telepresence system
CN111328380B (en) Head-mounted display and method for designing wide-focus lens used in head-mounted display
US10545344B2 (en) Stereoscopic display with reduced accommodation fatique
CN111373307B (en) Stereoscopic glasses, method for designing glasses lenses used in stereoscopic glasses, and method for observing stereoscopic image
WO2015034453A1 (en) Providing a wide angle view image
US20230084541A1 (en) Compact imaging optics using spatially located, free form optical components for distortion compensation and image clarity enhancement
KR101490778B1 (en) Calibration lens can be seen ultra short distance and device thereof
US11327313B2 (en) Method and system for rendering an image with a pupil enhanced accommodation of the eye
US11513359B1 (en) Lens
US20220350141A1 (en) Head-mountable display apparatus and methods
TW201809799A (en) Head-mounted display device and binocular vision image calibrating method of the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18711835

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18711835

Country of ref document: EP

Kind code of ref document: A1