WO2025021455A1 - Imaging system and method - Google Patents

Imaging system and method

Info

Publication number
WO2025021455A1
WO2025021455A1 (PCT/EP2024/068794)
Authority
WO
WIPO (PCT)
Prior art keywords
imaging system
combiner
light
retina
eye
Prior art date
Application number
PCT/EP2024/068794
Other languages
French (fr)
Inventor
Giacomo COLZI
Carlos MACIAS
Volker ZAGOLLA
Original Assignee
Ams-Osram Ag
Priority date
Filing date
Publication date
Application filed by Ams-Osram Ag
Publication of WO2025021455A1


Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G02B27/0172 - Head mounted characterised by optical features
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G02B2027/0178 - Eyeglass type

Definitions

  • the polarization-dependent reflector R54 and the dichroic reflective coating R62 of the optically powered reflector R58 define an optical cavity, wherein the retarder R56 is located in the optical cavity. Moreover, the polarization-dependent reflector R54 and the optically powered reflector R58 are arranged so that the polarization-dependent reflector R54 is located in an optical path between the VPH R40 and the optically powered reflector R58. The retarder R56 and the optically powered reflector R58 are separated by an airgap R64.
  • the eyepiece R14 comprising the combiner R50 further includes a circular polarizer R70 disposed on the dichroic reflective coating R62 of the optically powered reflector R58.
  • the optical combiner R50 effectively combines the ambient light R32 which is incident on the front side R51 of the optical combiner R50 with the collimated light which exits the rear side R52 of the optical combiner R50.
  • the circular polarizer R70 imparts a circular polarization to the ambient light R32, and the circularly polarized ambient light is incident on the first side R51 of the optical combiner R50 defined by the dichroic reflective coating R62 of the optically powered reflector R58.
  • the dichroic reflective coating R62 transmits, towards the retarder R56, the wavelengths of the circularly polarized ambient light which fall outside the one or more narrow spectral bands over which the dichroic reflective coating R62 is highly reflecting.
  • the retarder R56 converts the circularly polarized ambient light transmitted by the dichroic reflective coating R62 to linearly polarized ambient light having a linear polarization which is aligned with a polarization transmission axis of the polarization-dependent reflector R54 so that the polarization-dependent reflector R54 transmits the linearly polarized ambient light R32 towards an expanded eyebox.
  • Use of the circular polarizer R70 at least partially suppresses the reflection of ambient light from the polarization-dependent reflector R54, thereby at least partially suppressing the formation of any ghost images of the scene at the eyebox.
  • Figure 5 illustrates the reflection and collimation of linearly polarized image light and the replication of the image for the case of the linearly polarized principal ray R18c of the linearly polarized image light.
  • the linearly polarized image light, and therefore the linearly polarized principal ray R18c of the linearly polarized image light, has a first linear polarization which is aligned with a polarization transmission axis of the polarization-dependent reflector R54.
  • the VPH R40 spreads, for example fans out or separates, the linearly polarized principal ray R18c of image light into three different directions to form three different linearly polarized rays of spread image light which are incident on a second or rear side R52 of the optical combiner R50 defined by the polarization-dependent reflector R54.
  • the first linear polarization of each of the linearly polarized rays of spread image light is aligned with the polarization transmission axis of the polarization-dependent reflector R54 so that the polarization-dependent reflector R54 transmits each of the linearly polarized rays of spread image light towards the retarder R56.
  • the retarder R56 converts the polarization of each ray of spread image light from the first linear polarization to a first circular polarization.
  • Each ray of spread image light then propagates from the retarder R56 to the substrate R60 of the optically powered reflector R58, is transmitted through the substrate R60 and then reflected at the dichroic reflective coating R62 of the optically powered reflector R58 to form a corresponding ray of first reflected light having a second circular polarization which is opposite to the first circular polarization.
  • Each ray of first reflected light propagates back through the substrate R60 of the optically powered reflector R58 towards the retarder R56.
  • the retarder R56 converts the polarization of each ray of first reflected light from the second circular polarization to a second linear polarization which is orthogonal to the first linear polarization and to the polarization transmission axis of the polarization-dependent reflector R54. Accordingly, the polarization-dependent reflector R54 reflects each ray of first reflected light back towards the retarder R56 as a corresponding ray of second reflected light.
  • the retarder R56 converts the polarization of each ray of second reflected light from the second linear polarization to the second circular polarization.
  • Each ray of second reflected light then propagates from the retarder R56 to the substrate R60 of the optically powered reflector R58, is transmitted through the substrate R60 and then reflected at the dichroic reflective coating R62 of the optically powered reflector R58 to form a corresponding ray of third reflected light having the first circular polarization.
  • Each ray of third reflected light propagates back through the substrate R60 of the optically powered reflector R58 towards the retarder R56.
  • the retarder R56 converts the polarization of each ray of third reflected light from the first circular polarization to the first linear polarization which is parallel to the polarization transmission axis of the polarization-dependent reflector R54.
  • the polarization-dependent reflector R54 transmits each ray of third reflected light to form collimated light which travels back through the VPH R40 as collimated light R30 which defines the expanded eyebox (a Jones-calculus sketch of this polarization sequence is given after this list).
  • the projected image light (cf. principal ray R18c) is spread by the VPH R40 and then traverses the reflective pancake optical combiner R50 four times before exiting the reflective pancake optical combiner R50 on the same side of the optical combiner R50 as the VPH R40.
  • the reflective pancake optical combiner collimates the spread image light so as to form collimated light which is reflected by the reflective optical combiner R50 back through the VPH R40, without the VPH R40 spreading the collimated light, so as to form the collimated light R30 which propagates to the plane at the eye of the user and replicates the image in that plane, thereby expanding the eyebox.
  • the reflective pancake optical combiner R50 collimates the spread image light as the spread image light propagates along a folded optical path which is defined within the reflective pancake optical combiner R50 and which extends from the VPH R40 and back to the VPH R40.
  • use of the reflective pancake optical combiner R50 serves to reduce the physical thickness of the eyepiece R14, resulting in a more compact eyepiece R14 and a more compact optical system.
  • optical combiners other than the optical combiner 2, R50 used for the projection of the image light may also be used and operated in reverse direction as described above.
  • VPH: transmissive volume phase hologram
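The polarization sequence described in the bullets above can be checked with a short Jones-calculus sketch. It uses a deliberately simplified convention: a fixed transverse (x, y) basis is kept for both propagation directions, and the optically powered reflector and the polarization-dependent reflector are modelled as identity matrices on reflection, so only the linear states that decide transmission versus reflection at the polarization-dependent reflector are tracked (absolute handedness labels are not).

```python
# Simplified Jones-calculus sketch of the polarization bookkeeping described above.
import numpy as np

QWP45 = 0.5 * np.array([[1 + 1j, 1 - 1j],
                        [1 - 1j, 1 + 1j]])   # quarter-wave plate, fast axis at 45 degrees
X = np.array([1.0, 0.0], dtype=complex)      # first linear polarization (transmission axis of R54)

def aligned_with_x(state: np.ndarray) -> bool:
    """True if the state is, up to a phase, the linear polarization along x."""
    s = state / np.linalg.norm(state)
    return abs(abs(np.vdot(X, s)) - 1.0) < 1e-9

s = X                       # image light transmitted by the polarization-dependent reflector
s = QWP45 @ s               # -> circular, travels to the optically powered reflector
s = QWP45 @ s               # reflected and back through the retarder: now along y
print(aligned_with_x(s))    # False -> reflected by the polarization-dependent reflector
s = QWP45 @ s               # -> circular again, towards the optically powered reflector
s = QWP45 @ s               # reflected and back: restored to x
print(aligned_with_x(s))    # True -> transmitted towards the expanded eyebox
```

The two printed values reproduce the sequence above: after the first double pass through the retarder the light is reflected by the polarization-dependent reflector, and after the second double pass it is transmitted towards the expanded eyebox.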

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

An imaging system (1) comprising a combiner (2, R50) is specified, the combiner (2) being configured to display an image onto a retina (91) of a user's eye (9), wherein the imaging system (1) is configured to operate the combiner (2, R50) in reverse direction in order to image the retina (91). Further, a method is specified.

Description

IMAGING SYSTEM AND METHOD
The present application relates to an imaging system and to a method using imaging.
Imaging-based eye-tracking sensors require both direct illumination of the eye and direct imaging of the same area. State-of-the-art algorithms rely on the information of the pupil location and of the position of the specular reflection of the light source(s) from the cornea to assess the gaze vector.
Alternative solutions use an image of the retina to perform eye tracking. In some of these cases a subset of the retina image is acquired (defined by the position of the eye) and this subset image is compared to a full retinal image to determine the eye position. Access to the retinal image is usually difficult and potentially costly, since it requires dedicated optics (reflective, diffractive) within the field of view of the user.
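As an illustration of the subset-to-full-map comparison mentioned above, the following is a minimal sketch, not the algorithm of any cited document: a captured retinal patch is located inside a previously recorded full retinal map by normalized cross-correlation, and the pixel offset is converted into an angular gaze estimate. The calibration constant deg_per_px and the use of OpenCV template matching are assumptions for illustration only.

```python
# Minimal sketch: locate a captured retinal patch inside a stored full-retina
# map and convert the pixel offset of the best match into a gaze-angle estimate.
import cv2
import numpy as np

def estimate_gaze(reference_map: np.ndarray, patch: np.ndarray,
                  deg_per_px: float = 0.05):
    """reference_map, patch: uint8 or float32 grayscale images; deg_per_px: assumed calibration."""
    # Normalized cross-correlation of the patch against the reference map.
    score = cv2.matchTemplate(reference_map, patch, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, (x, y) = cv2.minMaxLoc(score)
    # Offset of the matched patch centre from the map centre, in pixels.
    dx = x + patch.shape[1] / 2 - reference_map.shape[1] / 2
    dy = y + patch.shape[0] / 2 - reference_map.shape[0] / 2
    # Convert the offset into an angular gaze estimate (assumed linear mapping).
    return best_score, (dx * deg_per_px, dy * deg_per_px)
```

A low best_score would indicate that the patch could not be matched reliably, for example during a blink.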
In addition, these optics need to feature an eyebox large enough to cover eye movement as well as some or all of the IPD (interpupillary distance) range in the population.
A second problem is to determine where a person is actually looking in 3D space. There are two components at play, the first being the relative position of the gaze vectors and their intersection point. The second is the focusing distance of the eye itself. Usually, the two are matched to provide minimal eye strain. In VR (virtual reality) applications, a problem arises from the mismatch between the location of the image (a virtual display a few centimeters out) and the relative position of the gaze vectors (vergence-accommodation mismatch). This can create additional discomfort in AR applications, as it makes it impossible to simultaneously visualize a virtual and a real object that should be located in close vicinity to each other.
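To make the two components concrete, the following minimal sketch computes the vergence distance implied by the two gaze vectors and compares it with the accommodation (focus) distance of the eye, expressing the mismatch in dioptres. The IPD, convergence angle and focus distance used are made-up example values, not values from this text.

```python
# Sketch: vergence distance from the two gaze vectors versus the accommodation
# distance of the eye, and their mismatch expressed in dioptres.
import math

def vergence_distance(ipd_m: float, convergence_angle_rad: float) -> float:
    # Distance at which two symmetric gaze vectors separated by the IPD intersect.
    return (ipd_m / 2.0) / math.tan(convergence_angle_rad / 2.0)

ipd = 0.063                      # assumed interpupillary distance in metres
angle = math.radians(3.0)        # assumed total convergence angle
d_vergence = vergence_distance(ipd, angle)        # about 1.2 m
d_accommodation = 2.0            # assumed focus distance of the virtual display in metres

# Vergence-accommodation mismatch in dioptres (1/m).
mismatch_dpt = abs(1.0 / d_vergence - 1.0 / d_accommodation)
print(f"vergence {d_vergence:.2f} m, mismatch {mismatch_dpt:.2f} dpt")
```

When both distances are equal the mismatch is zero; the VR situation described above corresponds to a non-zero value.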
Document EP 0979432 B1 describes an optical system combining image presentation and eye analysis, the entire disclosure content of which is hereby incorporated by reference. This prior art deals with an optical system for presenting an image to a user through an optical path that transmits the image elements to the user's eye, and it further includes a system for illuminating the fundus and a system for taking an image of the fundus of the illuminated eye for eye tracking purposes. The two systems partially share the same optical path, and the assembly can be integrated into a head-worn device. A generic schematic is shown in Fig. 1 of that document, and a more specific embodiment, a helmet for aircraft pilots, is outlined in its Fig. 2.
Document US 2017/0000343 A1 describes augmented and virtual reality display systems and methods for determining optical prescriptions by imaging the retina, the entire disclosure content of which is hereby incorporated by reference. This prior art deals with a wearable AR (augmented reality)/VR device that can be used as a diagnostic health system (incl. ophthalmic). It covers several diagnostic methods as embodiments, which rely on an image of the retina being acquired. Document US 2017/0000342 A1 relates to methods and systems for detecting health conditions by imaging portions of the eye, including the fundus, the entire disclosure content of which is hereby incorporated by reference. This prior art is roughly analogous to US 2017/0000343 A1.
An object to be solved is to facilitate retinal imaging. Further, a method is to be specified that provides information on an eye.
These objects are achieved, inter alia, by an imaging system and a method according to the independent claims. Further developments and expediencies are the subject of the dependent claims.
An imaging system is specified.
According to at least one embodiment of the imaging system (or imaging device), the imaging device comprises a combiner. For example, the combiner is used in reverse to capitalize on the eyebox expansion.
The term "eyebox" refers to a volume of space relative to an augmented reality (AR) display or virtual reality (VR) display in which the user has to position their eye to be able to correctly see the full projected image. For example, the display is embodied as an off-axis retinal scanning display (ORSD) that may be used to project an image onto the retina of the user.
In a previous patent application (PCT/EP2023/054758) the inventors disclosed a combiner that provides an extended eyebox for wearable computing devices (e.g., augmented reality glasses), the entire disclosure content of which is hereby incorporated by reference.
This combiner displays images onto the retina of the user.
For example, the combiner may use diffraction, refraction or reflection, or a combination thereof, to deflect the light towards the eye. For example, the combiner comprises a holographic element such as a volume phase hologram (VPH).
For example, the combiner comprises a reflector configured to deflect the image to be projected towards the user's eye. The reflector may for example comprise dielectric layers configured as a dichroic coating and/or as a Bragg reflector. According to at least one embodiment of the imaging system, the imaging system is configured to operate the combiner in reverse direction in order to image the retina.
Since this optical element (the combiner) works bidirectionally, one can obtain an image of the retina by adding a beam splitter coupled with a detector (for example a camera or photodiode plus an objective/lens). For example, the angular size of the retinal image is at least equivalent to the angular size of the projected image, but may be larger.
In at least one embodiment of the imaging system, the imaging system comprises a combiner configured to display an image onto a retina of a user's eye, wherein the imaging system is configured to operate the combiner in reverse direction in order to image the retina. Thus, light from the retina passing the combiner in reverse direction can be used to image the retina. Additionally, the eye (cornea, crystalline lens, fundus, retina) is part of the optical system according to at least one embodiment.
This means that changes in the eye due to images being displayed can have an effect on image parameters. This change of image parameters can potentially be used to determine the focus adaption of the eye from the acquired data, from which one can deduce a wide range of health-related parameters.
The added complexity of the invention is small. For example, it lies mostly in adding a detector such as a camera to the optical path and re-using an already existing optical element.
The core optical system used is described in PCT/EP2023/054758. It describes a multi-layered optical component that can be used to display an image onto the retina of a user, using either broadband or narrowband sources.
According to at least one embodiment, this component is used in reverse, essentially as an image-forming element. In particular, the combiner comprises a multi-layered optical component that can be used to display an image onto the retina of a user, using either broadband or narrowband sources.
Using the optical component and co-linear illumination and imaging paths (created, e.g., by a beam splitter), one can use the extended eyebox to get access to retinal images while keeping zero components within the field of view of the user. The latter may be achieved by adding the imaging system and an image sensor on the return path. The optical component described in PCT/EP2023/054758 already features the necessary elements to obtain an image.
Further, it is possible to re-use the illumination described in PCT/EP2023/054758 such that no additional illumination is necessary. However, a further illumination may be provided.
According to at least one embodiment of the imaging system, the imaging system comprises a light source configured to illuminate the combiner. The term "light" is not restricted to visible electromagnetic radiation. For example, the light source is configured to emit light in the near ultraviolet spectral range, in the visible spectral range and/or in the near infrared spectral range.
In this context, near ultraviolet light refers to the spectral range from 320 nm to 490 nm. Visible light refers to the spectral range from 420 nm to 780 nm. Near infrared light refers to the spectral range from 781 nm to 1.3 µm.
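The band definitions above can be collected in a small lookup table, for example when configuring emitters or filters. The sketch below simply encodes the ranges exactly as defined in this text (note that the near-ultraviolet and visible ranges, as defined here, overlap).

```python
# Spectral ranges exactly as defined in the text above (note the deliberate
# overlap between the near-UV and visible definitions).
BANDS_NM = {
    "near ultraviolet": (320.0, 490.0),
    "visible":          (420.0, 780.0),
    "near infrared":    (781.0, 1300.0),
}

def bands_for(wavelength_nm: float):
    """Return all named bands that contain the given wavelength."""
    return [name for name, (lo, hi) in BANDS_NM.items() if lo <= wavelength_nm <= hi]

print(bands_for(450.0))   # ['near ultraviolet', 'visible']
print(bands_for(850.0))   # ['near infrared']
```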
For example, the light source includes an emitter such as a laser diode or a light emitting diode emitting in the blue spectral range from 440 nm to 470 nm and/or in the green spectral range from 510 nm to 540 nm and/or in the red spectral range from 620 nm to 660 nm. Alternatively or in addition, an emitter may emit radiation in the near infrared spectral range.
For example, the light source is a projector or the light source is part of a projector.
According to at least one embodiment of the imaging system, a beam splitter is arranged in a beam path between the light source and the combiner. In a return path from the eye towards the light source, at least part of this light may be coupled out so that it does not reach the light source. The term "beam splitter" broadly covers any optical element that is suitable for splitting the radiation in the return path into two spatially separated paths.
According to at least one embodiment of the imaging system, the imaging system comprises a detector. The detector may comprise one or more photosensitive regions. For example, the detector comprises a photodiode, an array of photodiodes or a charge coupled device (CCD). Further, the detector may comprise or consist of a camera which may, for example, include a camera sensor and an imaging optics.
According to at least one embodiment of the imaging system, light in a return path from the retina via the combiner is deflected towards the detector, in particular by the beam splitter.
According to at least one embodiment of the imaging system, the light source is a projector, wherein the beam splitter is arranged in the beam path between the projector and the combiner.
According to at least one embodiment of the imaging system, the imaging system comprises a scanner. For example, the beam splitter is arranged in a beam path between the light source and the scanner. For example, the scanner is a micro-electro-mechanical system (MEMS) scanner. The scanner may be used to provide the spatial resolution of the image to be displayed. Thus, the light source itself does not have to provide a spatially resolved image. According to at least one embodiment of the imaging system, the imaging system comprises a further light source configured to illuminate the retina with a further light. The further light may be in the visible or in the invisible spectral range, for example in the near infrared spectral range. In particular, the further light illuminates the retina via the combiner.
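Because the spatial resolution comes from the scanner rather than from the light source or the detector, a retinal image can in principle be reconstructed by binning the detector samples according to the known scan position, in the spirit of a scanning-beam imager. The following is a minimal sketch under that assumption; the normalized scan coordinates and the image size are illustrative.

```python
# Sketch: reconstruct a retinal image from a non-imaging detector by binning
# each intensity sample at the pixel addressed by the known scan position.
import numpy as np

def reconstruct(scan_x, scan_y, samples, shape=(128, 128)):
    """scan_x, scan_y in [0, 1): normalized scan positions; samples: detector readings."""
    img = np.zeros(shape)
    counts = np.zeros(shape)
    px = (np.asarray(scan_x) * shape[1]).astype(int).clip(0, shape[1] - 1)
    py = (np.asarray(scan_y) * shape[0]).astype(int).clip(0, shape[0] - 1)
    np.add.at(img, (py, px), samples)      # accumulate samples per pixel
    np.add.at(counts, (py, px), 1.0)       # number of samples per pixel
    return img / np.maximum(counts, 1.0)   # average, avoiding division by zero
```

With a single photodiode as detector, samples would simply be the photodiode readings recorded synchronously with the scanner drive.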
According to at least one embodiment of the imaging system, the further light and the combiner are adapted to one another such that the combiner deflects at least part of the further light onto the retina. In other words, the combiner has a sufficiently high reflectivity in a wavelength range of the further light.
For example, the retina is illuminated with the further light such that the further light is not perceived by the eye. The further light may enhance the imaging quality of the retina.
According to at least one embodiment of the imaging system, the further light provides a flood illumination of the retina outside a projection path of the image. Thus, the combiner is in the beam path of the further light towards the retina, but the further light does not follow the beam path of the projected light used to display the image.
According to at least one embodiment of the imaging system, the further light includes wavelengths in the near infrared spectral range. This facilitates illumination of the retina of the eye without disturbing the user. In particular, the further light may be the only light that is used for imaging the retina. According to at least one embodiment of the imaging system, the imaging system comprises a filter configured to transmit at least part of the light of the further light source and to filter out at least part of the light of the light source. For example, the filter is part of the detector or arranged in a beam path between the detector and the combiner, in particular in a beam path between the detector and the beam splitter.
According to at least one embodiment of the imaging system, the light source is configured to provide a time-multiplexed flood illumination of the retina. For example, the flood illumination is performed between two subsequent images to be displayed. By means of the flood illumination, the retina can be illuminated with a higher homogeneity compared to an illumination configured to display an image.
Consequently, the imaging system may use the flood illumination to obtain images of the retina.
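One way to picture the time multiplexing described above is as a frame schedule in which short flood-illumination slots are interleaved between displayed image frames and the retina is captured during those slots. The slot spacing below (flood_every) is an arbitrary illustrative choice, not a value from this text.

```python
# Sketch: interleave short flood-illumination slots between displayed image
# frames, e.g. one flood slot after every N frames.
def frame_schedule(n_frames: int, flood_every: int = 4):
    """Yield ('image', idx) and ('flood', idx) events in display order."""
    for idx in range(n_frames):
        yield ("image", idx)
        if (idx + 1) % flood_every == 0:
            yield ("flood", idx)        # the retina would be captured during this slot

for event in frame_schedule(8):
    print(event)
```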
According to at least one embodiment of the imaging system, the combiner is configured to be arranged in the user's field of view, wherein the combiner is configured to transmit ambient light from the field of view. Thus, during operation the user may see the ambient light within the field of view superimposed with the projected image.
In particular, the combiner may be configured such that it is substantially transparent to the radiation from the ambient light so that the combiner does not significantly disturb the user's view of the scene within their field of view. According to at least one embodiment of the imaging system, the imaging system is configured to be used for eye tracking or eye-tracking systems for augmented reality devices, mixed reality devices or virtual reality devices. Thus, eye tracking may be obtained without any optical features within the user's field of view in addition to those already present in order to display the image onto the retina.
Further, a method of obtaining information on an eye is specified.
According to at least one embodiment of the method, a retina of the eye is imaged by operating a combiner in reverse direction, wherein the combiner is configured to display an image onto the retina of the eye.
According to at least one embodiment of the method, an eye movement is derived from the imaged eye. For example, eye tracking is performed by means of the method.
Alternatively or in addition, the imaged retina may be used for user authentication. Thus, the uniqueness of each retina structure can be relied upon as a highly reliable criterion for identification.
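A minimal sketch of how the imaged retina could be used as an authentication factor: a descriptor derived from the captured retinal image is compared against an enrolled template using a similarity threshold. The block-average descriptor and the threshold value are purely illustrative stand-ins; an actual system would use proper retinal (e.g. vascular) features.

```python
# Sketch: compare a captured retinal image against an enrolled template.
import numpy as np

def features(retina_img: np.ndarray, grid=(16, 16)) -> np.ndarray:
    """Very coarse illustrative descriptor: block-averaged, normalized image."""
    h, w = retina_img.shape
    gh, gw = grid
    blocks = retina_img[: h - h % gh, : w - w % gw].reshape(gh, h // gh, gw, w // gw)
    vec = blocks.mean(axis=(1, 3)).ravel()
    return (vec - vec.mean()) / (vec.std() + 1e-9)

def authenticate(captured: np.ndarray, enrolled: np.ndarray, threshold: float = 0.9) -> bool:
    # Cosine similarity between the two descriptors.
    a, b = features(captured), features(enrolled)
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b)) >= threshold
```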
According to at least one embodiment of the method, a focus adaption of the eye is deduced from the imaged eye. Thus, an impact of the projected image on the eye may be detected.
According to at least one embodiment of the method, at least one health-related parameter is deduced from the imaged eye. For example, the eye's capability of accommodation can be derived from the obtained images of the eye. Further, parameters relating to the transparency of the eye parts through which the radiation passes can be derived.
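One conceivable way to turn the retinal images into an accommodation estimate, sketched here under the assumption that retinal images are captured while a focus-adjusting element in the detection path is swept through known dioptre settings: the eye's current accommodation corresponds to the setting at which the captured retinal image is sharpest. Both the sweep and the sharpness metric are assumptions for illustration, not steps prescribed by this text.

```python
# Sketch: estimate the eye's accommodation by finding the focus setting (in
# dioptres) at which the captured retinal image is sharpest.
import numpy as np

def sharpness(img: np.ndarray) -> float:
    # Variance of a simple 4-neighbour Laplacian as a focus metric.
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

def estimate_accommodation(images_by_dioptre: dict) -> float:
    """images_by_dioptre: {focus setting in dioptres: captured retinal image}."""
    return max(images_by_dioptre, key=lambda d: sharpness(images_by_dioptre[d]))
```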
The method can be performed using an imaging system as described above. Thus, features described in connection with the imaging system may also apply to the method and vice versa.
In particular, the following effects may be obtained.
The imaging system may provide access to the retina image, which enables unique authentication, for instance.
The imaging system may provide information to perform eye tracking (vergence and accommodation). It may provide information on the health of the eye.
One of the difficulties is obtaining access to a retinal image while keeping imaging components out of the field of view of the user.
The optical component described in PCT/EP2023/054758 solves this issue if used in reverse, by providing both an extended eyebox and a high illumination efficiency. The optical component described provides a transparent, see-through optical component, using spectrally narrow-band and angularly selective features in order to render it virtually invisible to the user of augmented or mixed reality glasses.
The high efficiency of the optical component enables a high signal-to-noise ratio (SNR), normally not possible with other solutions such as SRG (surface relief grating) waveguide systems. The imaging system may be used for eye tracking and eye-tracking systems for AR/VR devices such as wearable headsets and head-up displays (HUDs), for instance.
Further features and expediencies will become apparent from the Figures. Features described above in connection with at least one embodiment of the method or the optical system and/or disclosed in one of the Figures may also be used for other embodiments or combined with other features described in connection with at least one embodiment of the method or the optical system unless they are contradictory.
In the figures:
Figure 1 shows an exemplary embodiment of an imaging system in schematic representation;
Figure 2 shows an exemplary embodiment of an imaging system in schematic representation;
Figure 3 shows an exemplary embodiment of an imaging system in schematic representation;
Figure 4 shows an exemplary embodiment of an imaging system in schematic representation;
Figure 5 shows an exemplary embodiment of a combiner in a schematic sectional view.
The imaging system 1 shown in Figure 1 comprises a combiner 2 configured to display an image onto a retina 91 of a user's eye 9. The imaging system is configured to operate the combiner 2 in reverse direction in order to image the retina 91.
The imaging system 1 further comprises a light source 3 configured to illuminate the combiner 2. For example, the light source 3 includes emitters emitting light in the red, green and blue spectral range to provide a full color image. The light source may comprise optical elements such as lenses and/or mirrors to shape and/or deflect the emitted light.
For example, the light source 3 is a projector 30 configured to produce a spatially resolved image which is to be displayed onto the retina 91.
A beam splitter 4 is arranged in a beam path between the light source 3 and the combiner 2. Further, the imaging system comprises a detector 5. In the exemplary embodiment shown, the detector 5 is a camera 50 comprising a camera optics 51 and a sensor 52.
Light in a return path from the retina 91 via the combiner 2 is deflected by the beam splitter 4 towards the detector 5. Figure 1 illustrates a projection path 81 of light from the light source 3 travelling towards the retina 91. Further, a detection path 83 is illustrated, the detection path extending from the retina 91 via the combiner 2 towards the detector 5. An overlapped path 84 illustrates the combined path of projection path 81 and detection path 83.
For imaging the retina, the detector 5 may use the light from the light source 3, for example the light used to display the image. The light source 3 may further be configured to provide a flood illumination of the retina 91 between subsequent images to be displayed. The flood illumination may be performed such that it is not perceived by the eye. For example, the intensity of the flood illumination may be so low that the flood illumination is not perceived by the eye 9. Alternatively, the light source 3 may include an emitter emitting radiation in the infrared spectral range, in particular in the near infrared spectral range, which is not perceived by the human eye.
As Figure 1 illustrates, the combiner 2 can be used both for the projection path 81 of the light source 3 towards the retina 91 and for the return path from the retina towards the detector 5. During operation of the imaging system 1, the combiner 2 is arranged in the user's field of view. The combiner 2 is substantially transparent to the ambient light so that the combiner 2 transmits the ambient light from the field of view. Thus, the user perceives both the ambient light and the image produced by the projector 30.
In this exemplary embodiment, a beam splitter 4 and a camera 50 (imaging system + image sensor) are added to the return path (between the VPH (volume phase hologram) and the projector) in order to image the retina 91.
In a method of obtaining information on an eye, a retina 91 of the eye is imaged by operating a combiner 2 in reverse direction, wherein the combiner 2 is configured to display an image onto the retina 91 of the eye 9. For example, the method can be used to extract eye-tracking parameters such as the location of the pupil of the eye from the imaged retina 91. Alternatively or in addition, a focus adaption of the eye can be deduced from the imaged eye.
Furthermore, one or more health-related parameters may be deduced from the imaged eye, for example the maximum accommodation of the eye. Further, health issues may be detected by imaging the eye 9, for example diabetes, high blood pressure or high cholesterol.
The exemplary embodiment shown in Figure 2 essentially corresponds to that described in connection with Figure 1.
Unlike in Figure 1, the light source 3 illustrated in Figure 2 itself does not provide a spatial resolution of the image to be displayed. The light from the light source 3 is collimated by a light source optics 32 and directed towards a scanner 31, for example a MEMS scanner. Further, an imaging optics 38 is provided between the scanner 31 and the combiner 2. The beam splitter 4 is arranged between the light source 3 and the scanner 31.
Thus, the deflection of the radiation by the scanner 31 affects both the radiation from the light source 3 towards the retina 91 and the radiation on the return path from the retina 91 to the detector 5.
In this exemplary embodiment, the detector 5 does not necessarily have to have a spatial resolution. For example, a photodiode 55 may be used. A detector optics 56 may be arranged between the beam splitter 4 and the detector 5, for example in order to focus the light from the retina 91 onto the detector 5. The figure further shows the projection path 81, the detection path 83 and the overlapped path 84, wherein the beam path of the overlapped path 84 between the scanner 31 and the retina 91 is shown for two different positions of the scanner 31.
Thus, in this exemplary embodiment, an imaging system is added before a potential MEMS scanner in the projection system. This assumes a MEMS scanning system in the projector.
The exemplary embodiment shown in Figure 3 substantially corresponds to that described in connection with Figure 1 .
The exemplary embodiment according to Figure 3 additionally comprises a further light source 35 configured to illuminate the retina with a further light . The further light may be in the visible spectral range or in the near infrared spectral range . The illumination of the retina 91 by the further light may be performed such that the eye 9 does not perceive the further light .
Using the further light , the imaging quality of the detector 5 , for example embodied as camera 50 , can be increased . In particular, the imaging system 1 may comprise a filter (not shown in Figure 3 ) provided in the detector 5 or in a beam path from the retina to the detector upstream of the detector 5 . The filter may completely, or at least in part , filter out the light of light source 3 so that only the illumination by the further light source 35 is used for imaging the retina 91 . In the exemplary embodiment shown in Figure 3 , a further beam splitter 45 is arranged between the detector 5 and the combiner 2 , in particular between the beam splitter 4 and the detector 5 . Thus , the return path from the retina 91 via combiner 2 extends through the beam splitter 4 and the further beam splitter 45 towards the detector 5 .
Lines 82 illustrate a further light path 82 of light from the further light source 35 towards the retina 91. Lines 84 illustrate an overlapped path of further light path 82 and detection path 83.
Lines 85 represent a combined path of projection path 81, further light path 82 and detection path 83.
Thus, the embodiment shown in Figure 3 suggests adding a dedicated light source to the illumination path and a filter in the camera. This illumination should lie within the capabilities of the optical stack but outside of the projection path. It is used to supply a flood illumination of the retina alone. The wavelengths of the projection system can then be filtered out to provide an improved image quality.
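The effect of such a filter may be illustrated schematically by summing the detected power only over the spectral band of the further light source 35 while rejecting the projection wavelengths of the light source 3; the band centres and the passband chosen below are illustrative assumptions and not values prescribed for the imaging system 1.

    def filtered_signal(band_powers, passband=(800.0, 900.0)):
        """Sum only the spectral contributions inside the filter passband,
        e.g. a near infrared band used by the further light source, while
        rejecting the visible projection wavelengths of the light source.

        band_powers: dict mapping centre wavelength in nm to detected power.
        """
        low, high = passband
        return sum(p for wl, p in band_powers.items() if low <= wl <= high)

    # Illustrative example: RGB projection wavelengths plus an NIR flood band.
    example = {450.0: 0.8, 520.0: 1.0, 638.0: 0.9, 850.0: 0.4}
    print(filtered_signal(example))  # only the 850 nm flood contribution remains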
The exemplary embodiment shown in Figure 4 substantially corresponds to the exemplary embodiment described in connection with Figure 3.
In this exemplary embodiment, the light from the retina is directed towards a sensor 52 of the detector 5, for example a CCD sensor. No camera optics is used in this case. A compensation of the projected image is performed so that a flood image without information of the projected image is obtained. The embodiment of Figure 4 thus illustrates a compensation of the projected image on the image sensor to obtain a flood image without information of the projected image.
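Since the projected image is known to the imaging system 1, its contribution to the sensor frame may, for example, be subtracted. The following sketch assumes a single scalar coupling factor between the projected frame and the sensor frame, which is an illustrative simplification.

    import numpy as np

    def compensate_projected_image(sensor_frame, projected_frame, coupling=None):
        """Remove the known projected image from the raw sensor frame so that
        only the flood-illuminated retina remains (single-gain approximation)."""
        if coupling is None:
            # Estimate the coupling gain by least squares if it is not known.
            denom = np.sum(projected_frame * projected_frame) + 1e-12
            coupling = np.sum(sensor_frame * projected_frame) / denom
        flood_image = sensor_frame - coupling * projected_frame
        # Clip negative residuals caused by noise or model mismatch.
        return np.clip(flood_image, 0.0, None)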
As an additional option that may apply to any of the previous exemplary embodiments, a time-multiplexed flood-illumination projection supporting retinal imaging may be provided from within the display engine. In this case, the projection system is used to provide a flood illumination every so often, which remains invisible to the user by being just bright enough to be used for illumination purposes.
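Such time multiplexing may, for instance, be expressed as a simple frame schedule in which every N-th frame is a short, dim flood frame used for retinal imaging while all other frames carry the displayed image; the frame period and duty cycle below are illustrative assumptions only.

    def frame_schedule(num_frames, flood_period=60, flood_duty=0.1):
        """Yield (frame_index, mode, exposure_fraction) tuples in which every
        flood_period-th frame is a brief flood-illumination frame for retinal
        imaging and all other frames carry the displayed image."""
        for i in range(num_frames):
            if i % flood_period == 0:
                # Short, dim flood frame: bright enough for imaging,
                # ideally not perceived by the user.
                yield i, "flood", flood_duty
            else:
                yield i, "display", 1.0

    for entry in frame_schedule(5, flood_period=2):
        print(entry)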
Figure 5 illustrates an exemplary embodiment of a combiner that can be used for any of the preceding exemplary embodiments of the imaging system. Figure 5 corresponds to Figure 4 of document PCT/EP2023/054758, except that the reference numerals have been provided with a preceding capital letter "R".
The optical combiner R50 has a first or front side R51 disposed towards a scene and a second or rear side R52 disposed towards a volume phase hologram R40. The VPH R40 is configured to selectively spread or fan out light incident on the VPH R40 according to an angle of incidence of the light incident on the VPH R40. Specifically, the VPH R40 is configured to spread or fan out the image light (cf. principal ray R18c in Figure 5) incident on the VPH R40 at higher angles of incidence but to transmit the ambient light from the scene without spreading or fanning out the ambient light from the scene. The optical combiner R50 further includes a polarization-dependent reflector R54, a retarder R56 which comprises, or which is configured to act as, a quarter-wave plate, and an optically powered reflector R58. The optically powered reflector R58 comprises a curved transparent body or substrate R60 having a dichroic reflective coating R62 disposed on a front convex surface thereof. The dichroic reflective coating R62 is configured to be highly reflecting in one or more narrow spectral bands, each narrow spectral band being arranged around a corresponding wavelength of the image light, but to transmit light at other wavelengths. For example, the dichroic reflective coating R62 may be configured to have a reflectance in each spectral band of 90% or greater, 95% or greater or 99% or greater. One of ordinary skill in the art will understand that this also means that the dichroic reflective coating R62 is configured to reflect ambient light R32 at wavelengths inside the one or more narrow spectral bands but to transmit ambient light R32 at wavelengths outside the one or more narrow spectral bands.
The polarization-dependent reflector R54 and the dichroic reflective coating R62 of the optically powered reflector R58 define an optical cavity, wherein the retarder R56 is located in the optical cavity. Moreover, the polarization-dependent reflector R54 and the optically powered reflector R58 are arranged so that the polarization-dependent reflector R54 is located in an optical path between the VPH R40 and the optically powered reflector R58. The retarder R56 and the optically powered reflector R58 are separated by an airgap R64.
The eyepiece R14 comprising the combiner R50 further includes a circular polarizer R70 disposed on the dichroic reflective coating R62 of the optically powered reflector R58. In use, the optical combiner R50 effectively combines the ambient light R32 which is incident on the front side R51 of the optical combiner R50 with the collimated light which exits the rear side R52 of the optical combiner R50.
Specifically, the circular polarizer R70 imparts a circular polarization to the ambient light R32, and the circularly polarized ambient light is incident on the first side R51 of the optical combiner R50 defined by the dichroic reflective coating R62 of the optically powered reflector R58. The dichroic reflective coating R62 transmits, towards the retarder R56, the wavelengths of the circularly polarized ambient light which fall outside the one or more narrow spectral bands over which the dichroic reflective coating R62 is highly reflecting. The retarder R56 converts the circularly polarized ambient light transmitted by the dichroic reflective coating R62 to linearly polarized ambient light having a linear polarization which is aligned with a polarization transmission axis of the polarization-dependent reflector R54 so that the polarization-dependent reflector R54 transmits the linearly polarized ambient light R32 towards an expanded eyebox. Use of the circular polarizer R70 at least partially suppresses the reflection of ambient light from the polarization-dependent reflector R54, thereby at least partially suppressing the formation of any ghost images of the scene at the eyebox.
Figure 5 illustrates the reflection and collimation of linearly polarized image light and the replication of the image for the case of the linearly polarized principal ray R18c of the linearly polarized image light. For the purposes of the following description, it is assumed that the linearly polarized image light, and therefore the linearly polarized principal ray R18c of the linearly polarized image light, has a first linear polarization which is aligned with a polarization transmission axis of the polarization-dependent reflector R54. The VPH R40 spreads, for example fans out or separates, the linearly polarized principal ray R18c of image light into three different directions to form three different linearly polarized rays of spread image light which are incident on a second or rear side R52 of the optical combiner R50 defined by the polarization-dependent reflector R54. The first linear polarization of each of the linearly polarized rays of spread image light is aligned with the polarization transmission axis of the polarization-dependent reflector R54 so that the polarization-dependent reflector R54 transmits each of the linearly polarized rays of spread image light towards the retarder R56. The retarder R56 converts the polarization of each ray of spread image light from the first linear polarization to a first circular polarization. Each ray of spread image light then propagates from the retarder R56 to the substrate R60 of the optically powered reflector R58, is transmitted through the substrate R60 and then reflected at the dichroic reflective coating R62 of the optically powered reflector R58 to form a corresponding ray of first reflected light having a second circular polarization which is opposite to the first circular polarization. Each ray of first reflected light propagates back through the substrate R60 of the optically powered reflector R58 towards the retarder R56. The retarder R56 converts the polarization of each ray of first reflected light from the second circular polarization to a second linear polarization which is orthogonal to the first linear polarization and to the polarization transmission axis of the polarization-dependent reflector R54. Accordingly, the polarization-dependent reflector R54 reflects each ray of first reflected light back towards the retarder R56 as a corresponding ray of second reflected light.
The retarder R56 converts the polarization of each ray of second reflected light from the second linear polarization to the second circular polarization. Each ray of second reflected light then propagates from the retarder R56 to the substrate R60 of the optically powered reflector R58, is transmitted through the substrate R60 and then reflected at the dichroic reflective coating R62 of the optically powered reflector R58 to form a corresponding ray of third reflected light having the first circular polarization.
Each ray of third reflected light propagates back through the substrate R60 of the optically powered reflector R58 towards the retarder R56. The retarder R56 converts the polarization of each ray of third reflected light from the first circular polarization to the first linear polarization which is parallel to the polarization transmission axis of the polarization-dependent reflector R54. Accordingly, the polarization-dependent reflector R54 transmits each ray of third reflected light to form collimated light which travels back through the VPH R40 as collimated light R30 which defines the expanded eyebox.
The projected image light (cf. principal ray R18c) is spread by the VPH R40 and then traverses the reflective pancake optical combiner R50 four times before exiting the reflective pancake optical combiner R50 on the same side of the reflective optical combiner R50 as the VPH R40. As a consequence of the optical power of the optically powered reflector R58, the reflective pancake optical combiner collimates the spread image light so as to form collimated light which is reflected by the reflective optical combiner R50 back through the VPH R40, without the VPH R40 spreading the collimated light, so as to form the collimated light R30 which propagates to the plane at the eye of the user and replicates the image in the plane at the eye of the user to thereby expand the eyebox. In effect, the reflective pancake optical combiner R50 collimates the spread image light as the spread image light propagates along a folded optical path which is defined within the reflective pancake optical combiner R50 and which extends from the VPH R40 and back to the VPH R40. As such, use of the reflective pancake optical combiner R50 serves to reduce the physical thickness of the eyepiece R14, resulting in a more compact eyepiece R14 and a more compact optical system.
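The polarization sequence described above may be checked in a simplified, idealized Jones-calculus model in which the retarder R56 is an ideal quarter-wave plate at 45 degrees, the reflections are treated as polarization-neutral in a fixed coordinate frame, and the transmission axis of the polarization-dependent reflector R54 is taken to be horizontal; these are idealizations made only for this illustration.

    import numpy as np

    # Ideal quarter-wave retarder at 45 degrees (fixed-frame Jones matrix).
    QWP_45 = 0.5 * np.array([[1 + 1j, 1 - 1j],
                             [1 - 1j, 1 + 1j]])
    # First linear polarization, aligned with the transmission axis of R54.
    H = np.array([1.0, 0.0], dtype=complex)

    # R54 -> R56 -> reflection at dichroic coating R62 -> R56:
    # a double pass through the quarter-wave retarder acts as a half-wave
    # retarder in this convention, rotating the polarization by 90 degrees.
    after_first_return = QWP_45 @ QWP_45 @ H
    print(np.round(after_first_return, 3))   # ~[0, 1]: orthogonal, so R54 reflects it

    # R54 reflection -> R56 -> R62 -> R56: another double pass restores the
    # first linear polarization, so R54 now transmits the light towards the VPH R40.
    after_second_return = QWP_45 @ QWP_45 @ after_first_return
    print(np.round(after_second_return, 3))  # ~[1, 0]: parallel to the transmission axis

This reproduces, under the stated idealizations, the four-pass behaviour in which the image light is reflected once and transmitted once by the polarization-dependent reflector R54.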
Using this optical combiner in reverse direction makes it possible to image the retina of the user's eye without having to place, within the user's field of view, any additional optical elements that might negatively affect the user. Rather, the only optical element within the user's field of view is the optical combiner 2, R50 used for the projection of the image light.
However, other types of optical combiners may also be used and be operated in reverse direction as described above.
Features shown or described in one of the embodiments may be combined with features of other embodiments unless they are contradictory.
This patent application claims the priority of German patent applications 10 2023 119 379.1 and 10 2023 132 220.6, the disclosure content of which is hereby incorporated by reference.
The invention described herein is not restricted by the description given with reference to the exemplary embodiments. Rather, the invention encompasses any novel feature and any combination of features, including in particular any combination of features in the claims, even if this feature or this combination is not itself explicitly indicated in the claims or exemplary embodiments.
List of reference signs
1 imaging system
2 combiner
3 light source
30 projector
31 scanner
32 light source optics
35 further light source
38 imaging optics
4 beam splitter
45 further beam splitter
5 detector
50 camera
51 camera optics
52 sensor
55 photodiode
56 detector optics
81 projection path
82 further light path
83 detection path
84 overlapped path (paths 82 and 83)
85 overlapped path (paths 81, 82 and 83)
9 eye
91 retina
R14 eyepiece
R18c principal ray of image light
R30 collimated light
R40 transmissive volume phase hologram (VPH)
R50 optical combiner
R51 first side of optical combiner
R52 second side of optical combiner
R54 polarization-dependent reflector
R56 retarder
R58 optically powered reflector
R60 transparent substrate of optically powered reflector
R62 dichroic reflective coating
R64 airgap
R70 circular polarizer

Claims
1. An imaging system (1) comprising a combiner (2, R50) configured to display an image onto a retina (91) of a user's eye (9), wherein the imaging system (1) is configured to operate the combiner (2, R50) in reverse direction in order to image the retina (91).
2. The imaging system according to claim 1, further comprising
- a light source (3) configured to illuminate the combiner (2, R50);
- a beam splitter (4) arranged in a beam path between the light source (3) and the combiner (2, R50), and
- a detector (5); wherein light in a return path from the retina (91) via the combiner (2, R50) is deflected by the beam splitter (4) towards the detector (5).
3. The imaging system according to claim 2, wherein the light source (3) is a projector (30) and the beam splitter (4) is arranged in the beam path between the projector (30) and the combiner (2, R50).
4. The imaging system according to claim 2, wherein the imaging system (1) comprises a scanner (31), wherein the beam splitter (4) is arranged in a beam path between the light source (3) and the scanner (31).
5. The imaging system according to any one of the preceding claims, wherein the imaging system (1) comprises a further light source (35) configured to illuminate the retina (91) with a further light.
6. The imaging system according to claim 5, wherein the further light and the combiner (2, R50) are adapted to one another such that the combiner (2, R50) deflects at least part of the further light onto the retina (91).
7. The imaging system according to claim 5 or 6, wherein the further light provides a flood illumination of the retina outside of a projection path (81) of the image.
8. The imaging system according to any one of claims 5 to 7, wherein the further light includes wavelengths in the near infrared spectral range.
9. The imaging system according to any one of claims 5 to 8, wherein the imaging system comprises a filter configured to transmit at least part of the light of the further light source and to filter out at least part of the light of the light source.
10. The imaging system according to any one of the preceding claims, wherein the light source (3) is configured to provide a time multiplexed flood illumination of the retina (91).
11. The imaging system according to any one of the preceding claims, wherein the combiner (2, R50) is configured to be arranged in the user's field of view, wherein the combiner is configured to transmit ambient light from the field of view.
12. The imaging system according to any one of the preceding claims, wherein the imaging system (1) is configured to be used for eye tracking or eye tracking systems for augmented, mixed or virtual reality devices.
13. A method of obtaining information on an eye (9), wherein a retina (91) of the eye (9) is imaged by operating a combiner (2, R50) in reverse direction, wherein the combiner (2, R50) is configured to display an image onto the retina (91) of the eye (9).
14. The method according to claim 13, wherein an eye movement is derived from the imaged eye.
15. The method according to claim 13 or 14, wherein a focus adaptation of the eye is deduced from the imaged eye.
16. The method according to any one of claims 13 to 15, wherein at least one health-related parameter is deduced from the imaged eye.
17. The method according to any one of claims 13 to 16, wherein an imaging system (1) according to any one of claims 1 to 12 is used.
PCT/EP2024/068794 2023-07-21 2024-07-04 Imaging system and method WO2025021455A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DE102023119379 2023-07-21
DE102023119379.1 2023-07-21
DE102023132220 2023-11-20
DE102023132220.6 2023-11-20

Publications (1)

Publication Number Publication Date
WO2025021455A1 true WO2025021455A1 (en) 2025-01-30

Family

ID=91898495

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/068794 WO2025021455A1 (en) 2023-07-21 2024-07-04 Imaging system and method

Country Status (1)

Country Link
WO (1) WO2025021455A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0979432B1 (en) 1997-04-29 2003-06-18 Thales Avionics S.A. Optical system combining image presentation and eye analysis
US20100188638A1 (en) * 2000-10-07 2010-07-29 Metaio Gmbh Information system and method for providing information using a holographic element
US20170000343A1 (en) 2015-03-16 2017-01-05 Magic Leap, Inc. Augmented and virtual reality display systems and methods for determining optical prescriptions by imaging retina
US20170000342A1 (en) 2015-03-16 2017-01-05 Magic Leap, Inc. Methods and systems for detecting health conditions by imaging portions of the eye, including the fundus
US20200081530A1 (en) * 2017-05-29 2020-03-12 Eyeway Vision Ltd. Method and system for registering between an external scene and a virtual image
US20210311303A1 (en) * 2017-08-14 2021-10-07 Huawei Technologies Co., Ltd. Eyeball tracking system and eyeball tracking method

Similar Documents

Publication Publication Date Title
CN108882845B (en) Eye tracker based on retinal imaging via light-guide optical elements
US11838496B2 (en) Method and system for tracking eye movement in conjunction with a light scanning projector
US10254547B2 (en) Method and apparatus for head worn display with multiple exit pupils
US11128847B2 (en) Information display device and information display method
US11630306B2 (en) Smartglasses, lens for smartglasses and method for generating an image on the retina
JP3338837B2 (en) Composite display
US20240004189A1 (en) Optical systems and methods for eye tracking based on eye imaging via collimating element and light-guide optical element
IL294102A (en) Optical systems and methods for tracking eyes based on directing light from the eye through an optical arrangement associated with a light-directing optical element
US20170261750A1 (en) Co-Aligned Retinal Imaging And Display System
WO2025021455A1 (en) Imaging system and method
US20230012288A1 (en) Eye information detection device and image display apparatus
EP4242725A1 (en) Display device
WO2025021638A1 (en) Optoelectronic arrangement and method for tracking the movement of an object being a human eye using an optoelectronic arrangement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24740376

Country of ref document: EP

Kind code of ref document: A1