WO2010034502A2 - Ophthalmoscopy using direct sensing of the flat aerial-image created by an aspheric lens - Google Patents

Ophthalmoscopy using direct sensing of the flat aerial-image created by an aspheric lens

Info

Publication number
WO2010034502A2
Authority
WO
WIPO (PCT)
Prior art keywords
image
lens
eye
sensor
patient
Prior art date
Application number
PCT/EP2009/006925
Other languages
French (fr)
Other versions
WO2010034502A3 (en)
Inventor
Kamran Shamraei Ghahfarokhi
Christos Bergeles
Jake J. Abbott
Bradley J. Nelson
Original Assignee
ETH Zurich (ETH Transfer)
Priority date
Filing date
Publication date
Application filed by ETH Zurich (ETH Transfer)
Publication of WO2010034502A2
Publication of WO2010034502A3


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14 Arrangements specially adapted for eye photography

Definitions

  • the present invention pertains generally to ophthalmoscopy. It presents a method and an apparatus for vitreoretinal imaging and localization of intraocular objects.
  • Retinal fundus imaging is a critical task in ophthalmoscopy.
  • In funduscopy, the spherical shape of the retina must be transformed to a flat shape on the camera sensor to achieve a sharp image.
  • the optical structure of the eye including the cornea, lens, vitreous and aqueous humor, and pupil makes wide-field-of-view imaging challenging.
  • the retinal tissue is highly sensitive to light intensity.
  • Funduscopy is typically performed by positioning a high-magnification lens in front of the eye. This lens generates an aerial image that is curved. The aerial image is then made planar as it is passed through a series of field lenses, and finally the flat image is captured by an image sensor. In addition to modifying the curvature of the aerial image, field lenses are used to fit the size of the aerial image to the image sensor. This method is discussed in "O. Pomerantzeff, R. H. Webb, and F. C. Delori, Image Formation in Fundus Cameras, Investigative Ophthalmology and Visual Science, 18(6):630-637, 1979". In that work, a transpupillary illumination method is used to illuminate the interior of the eye.
  • a wide-field-of-view imaging method is also proposed in "O. Pomerantzeff, Wide-angle Non-contact and Small-angle Contact Cameras, Investigative Ophthalmology and Visual Science, 19(8):973-979, 1980". This method also applies transpupillary illumination, including the separation of the apertures of ophthalmoscopic illumination and observation.
  • Pomerantzeff proposed a contact ophthalmoscopic setup with a wide field of view. Since the applied ophthalmoscopic lens generates a curved aerial image, the setup proposed should be accompanied by secondary optical viewing equipment (e.g. field lenses, a camera, an observer's eye).
  • Transpupillary illumination suffers from the reflections caused by several interfaces within the light path. In order to avoid reflections, typically a portion of the pupil is used for imaging and a portion for illumination, leading to a limited field of view. Transpupillary illumination also requires additional optical devices within the imaging path, resulting in a larger number of optical interfaces (e.g. reflecting surfaces), which attenuates the light reaching the image sensor.
  • Transscleral illumination is proposed by Pomerantzeff in U.S. Pat. No. 3,954,329. His method applies two or more optical fibers in contact with the sclera. Gil et al. in U.S. Pat. Pub. No. US2007/0159600A1 have also used a transscleral method, proposing the use of two optical fibers focusing on the pars plana region with the help of additional lenses. Their method utilizes a halogen lamp and transfers the light through optical fibers. This method requires near-IR and near-UV filters.
  • the optical properties of the human sclera (i.e. its transmission, reflection, and absorption coefficients) are relevant for transscleral illumination.
  • the safety radiance thresholds of the human eye are given in "IEEE International Committee on Electromagnetic Safety (SCC39), Standard for Safety Levels with Respect to Human Exposure to Radio Frequency Electromagnetic Fields, 3 kHz to 300 GHz, IEEE Std C95.1, pages 87-88, 2005".
  • Transscleral laser treatment is used clinically for glaucoma.
  • Optical properties of the sclera, conjunctiva, and ciliary body have been examined in "B. Nemati, H. G. Rylander, and A. J. Welch, Optical Properties of Conjunctiva, Sclera, and the Ciliary Body and Their Consequences for Transscleral Cyclophotocoagulation, Applied Optics, 35(19):3321-3327, 1996" in order to determine the optimal laser wavelength.
  • Nanjo has proposed a device in U.S. Pat. No. 5,668,621.
  • transpupillary illumination is applied.
  • This device is a gun-shaped fundus camera, wherein the imaging optical construction is embedded in the main body and the illumination structure is embedded in the handle.
  • Another possible design is proposed in "C. Gliss, J. Parel, J. T. Flynn, H. Pratisto, and P. Niederer, Toward a Miniaturized Fundus Camera, Journal of Biomedical Optics, 9(1): 126-131, Jan/Feb 2004".
  • the fundamental structure of macro ophthalmoscopes, i.e. generating an aerial image with an ophthalmoscopic lens and then imaging it with a set of field lenses, is miniaturized.
  • both transpupillary and transscleral illumination are also investigated.
  • the common methods of ophthalmoscopy are based on two steps: creation of the retina's aerial image and capture of that aerial image.
  • the scattering, reflection, and absorption due to each optical interface within the light path require high irradiance of the retina.
  • the imaging duration should be limited.
  • the field lenses and any illumination component take up space. Consequently, available fundus cameras cannot be easily miniaturized.
  • the object of this invention is to provide a method and an apparatus for intraocular imaging that is simpler than known devices and, in particular, uses an optical system with a reduced number of optical interfaces.
  • Another object of this invention is a miniature wide-angle fundus viewing camera that does not require pupil dilation, that is capable of handling images automatically, and that can be operated by personnel with minimal training.
  • Another object of this invention is a miniature ophthalmoscope that remains stationary with respect to the eye, and has the ability to image objects throughout the posterior of the eye and to localize these intraocular objects during medical interventions.
  • Another object of this invention is to provide a method for minimizing the eye irradiance intensity and therefore maximizing the allowable duration of imaging. Furthermore, uniformity of the illumination is an object of the invention. Additionally, the illumination device should not obstruct the imaging apparatus or negatively impact image formation.
  • An auxiliary object of this invention is an imaging method and apparatus that allows real-time electromagnetic therapy (e.g. laser) of the interior of the eye.
  • the present invention uses a single ophthalmoscopic lens or a lens system (compound lens), and does not require any additional imaging device, e.g. microscope.
  • This invention stems from the advent of new aspherical lenses capable of generating flat aerial images of the posterior of the eye.
  • This type of lens transforms the spherical shape of the retina to a flat plane. They can be categorized as contact and non-contact types. Contact lenses generate a wide field of view and typically have a smaller diameter, while non-contact lenses generate a smaller field of view and have a larger diameter. We utilize these aspherical lenses for several applications described as examples hereinafter.
  • the image sensor is directly placed on the aerial image position.
  • This structure provides the minimum distance between the image sensor and the pupil.
  • the apparatus structure facilitates miniaturization and has a large aperture, which provides a wide field of view for the apparatus. This wide field of view can be obtained with a single image sensor, provided its size is large enough.
  • if the image sensor is smaller than the aerial image, a rotational or translational mechanism moves it over the whole aerial image, while the sensor captures different portions of it.
  • the captured images can be post-processed and mosaicked to obtain the full aerial image.
  • Capturing sharp images requires a focusing mechanism. Using the natural focus of the crystalline lens of the eye is one possible approach, in which the patient is asked to focus on an image at a proper distance, resulting in the creation of the aerial image on the image sensor. Another possibility is a translational focusing mechanism: the image sensor moves parallel to the optical axis to maximize the image sharpness.
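The translational focusing approach can be sketched as a search over sensor positions for the one that maximizes an image-sharpness metric. The Laplacian-variance metric, the search routine, and the synthetic blur model below are illustrative assumptions, not part of the disclosed apparatus:

```python
import numpy as np

def laplacian_variance(img):
    """Sharpness metric: variance of a discrete 4-neighbour Laplacian."""
    lap = (-4.0 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return lap.var()

def autofocus(capture, positions):
    """Try each sensor position along the optical axis, keep the sharpest."""
    scores = [laplacian_variance(capture(z)) for z in positions]
    return positions[int(np.argmax(scores))]

def fake_capture(z, focus_z=12.0):
    """Stand-in for the camera: a checkerboard smoothed more the farther
    the sensor position z (mm) is from the assumed in-focus position."""
    img = (np.indices((64, 64)).sum(axis=0) % 2).astype(float)
    for _ in range(int(abs(z - focus_z) * 4)):
        img = 0.2 * (img
                     + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
                     + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return img

best = autofocus(fake_capture, [10.0, 11.0, 12.0, 13.0, 14.0])  # -> 12.0
```

In a real device `capture` would trigger the linear actuators and read out the sensor; a coarse-to-fine search would reduce the number of exposures.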
  • the described method and apparatus can also provide focused images of the locations in the posterior of the eye other than the retina. Focus information (or information on the location of the image plane) can then be used for the localization of objects in the posterior of the eye. As a result, it can be used as part of a system that controls intraocular devices, e.g. untethered microrobotic devices, with visual feedback.
  • a transscleral illumination apparatus based on light emitting diodes (LEDs) is used with the proposed imaging apparatus.
  • the LED-based light source is equipped with a lens that faces the sclera, and focuses the light on it.
  • the shape of the light source combined with the scattering effect of the sclera generates a uniform distribution of the irradiance on the retina.
  • the limited number of lens surfaces between the image sensor and the retina in addition to the uniformity of light distribution, provide safe long-term imaging.
  • the aerial image need not be formed directly above the aspherical lens.
  • the image can be redirected to a sensor that is at an angle with respect to the optical axis of the lens. This enables our device to be used for directed electromagnetic irradiation of the interior of the eye without affecting image formation.
  • a simple miniature hand-held retinal fundus camera apparatus is proposed, combining direct aerial image sensing and LED-based transscleral illumination.
  • the construction of the apparatus facilitates simple application instructions and handling by personnel with minimal training.
  • Fig. 1 shows the basic optical structure of direct aerial-image sensing for intraocular ophthalmoscopy.
  • Fig.2 shows an apparatus based on the natural focusing of the crystalline lens of the eye.
  • Fig.3 shows a contact imaging apparatus using the natural focusing ability of the patient's crystalline eye lens.
  • Fig.4 shows an imaging apparatus using a small rotating sensor to capture the full field of view of the retina.
  • Fig. 5a+b show an LED-based illumination method applied for imaging.
  • Fig.6 demonstrates the mechanism of the illumination loss found in common methods.
  • Fig.7 shows a schematic eye used to calculate the functional safe radiance graph for the illumination system.
  • Fig.8 is the graph of safe illumination power versus duration of imaging.
  • Fig. 9 demonstrates the use of a partially reflecting mirror to move the image sensor off the optical axis of the lens. This also enables electromagnetic radiation to be transmitted into the eye through the aspherical lens.
  • Fig.10 illustrates the method and apparatus used for localization of the intraocular devices.
  • Fig.11 a) shows the back view of a prototype hand-held fundus camera
  • b) shows the front view of a prototype hand-held fundus camera
  • c) shows the exploded view of a prototype hand-held fundus camera.
  • Fig. 12 shows an optical model of a mechatronic vitreoretinal ophthalmoscope, including the isofocus surfaces and isopixel curves.
  • the different isofocus surfaces correspond to values of the sensor position d_ls in mm, for uniform sensor steps of approx. 2.1 mm.
  • the isopixel curves correspond to pixel distances from the optical axis, for uniform steps of approx. 2.3 mm.
  • Fig. 13 shows an experimental imaging setup.
  • PP: principal planes of the thick condensing lens.
  • CPP: principal planes of the compound lens system.
  • Fig. 14 shows actual values and model fits for the depth-from-focus experiment.
  • Full calibration shows a mean absolute error of 190 μm.
  • Biometric calibration shows a mean absolute error of 224 μm.
  • Fig. 15 shows a localization experiment in a model eye.
  • Fig. 16 shows an exploded view of a camera according to the principle of Fig. 3, e.g. for radiotherapy of the retina.
  • the ophthalmoscope comprises an aspherical lens 1 capable of generating a flat aerial image 3 of the retina 4a on a plane A-A at a defined distance 5 above the aspherical lens 1. This distance 5 is defined as the focal length of the apparatus (when placed at the working distance with respect to the eye 4).
  • the aspherical lens 1 could essentially have the following optical parameters:
  • the aspherical lens 1 is placed at its working distance 6 from the eye pupil 4d such that the optical axis 8 and the center of the sensing device 2 are aligned.
  • the aerial image 3 of the retina 4a is captured by the sensing device 2 directly.
  • the sensing device can be a CCD sensor, a CMOS sensor, or any other image sensor.
  • a CMOS sensor is used as the imaging device to illustrate the concept.
  • the field of view 10 is measured from the pupil and corresponds to the angle subtended by the part of the retinal fundus captured in a single image.
  • the optical structure of the apparatus has an aperture of nearly the diameter of the aspherical lens 1 which, in addition to the short working distance 6, provides a wide field of view 10.
  • a typical property of this type of aspherical lens is that its use avoids dilation of the pupil 4d. Exploiting this type of ophthalmoscopic lens 1, our method results in a simple intraocular imaging device.
  • Fig. 1 also shows the image 3 of the retina 4a of an artificial eye taken by this method; this image is included to illustrate the concept.
  • the aspherical lens can be chosen from a family of aspherical lenses 1 proposed by Volk Co. in U.S. Patent No. 5,706,073. These lenses are specifically adapted to the optical parameters of the human eye and can be used for applications in connection with the human eye, e.g. funduscopy.
  • This lens family contains non-contact and contact types.
  • Non-contact lenses provide a comfortable imaging situation.
  • the exact working distance 6 depends on the position of the patient's eye 4. Increasing the working distance 6 can result in the reduction of the field of view 10.
  • non-contact lenses 1 are larger, and the generated field of view 10 is smaller than for the contact group.
  • Non-contact lenses typically provide a field of view 10 in a range of 60° to 100°.
  • contact lenses typically offer a field of view 10 of more than 100°.
  • contact lenses have a smaller diameter, lead to more compact devices, and no working distance 6 adjustment is required. They also increase the isolation of the apparatus from ambient incoming light. However, since they must be in contact with the patient's eye 4, sterilization is required, and the imaging conditions are not optimal for patient comfort.
  • the choice of aspherical lens 1 also depends on the required dimension of the features on the image sensor 2. This factor is defined by the magnification of the aspherical lens 1.
  • the pixel size on the image sensor 2 defines the depth of field and hence, the required resolution of the translational focusing mechanism.
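As a rough illustration of how pixel size constrains the focusing mechanism, the image-side depth of focus is commonly approximated as 2·c·N, with the circle of confusion c taken equal to one pixel and N the working f-number; the numeric values below are hypothetical, not taken from the patent:

```python
def depth_of_focus_um(pixel_um, f_number):
    """Image-side depth of focus ~ 2 * c * N, with the circle of confusion c
    set to one pixel; the translational focusing stage must resolve steps
    finer than this for a guaranteed-sharp capture."""
    return 2.0 * pixel_um * f_number

# Hypothetical sensor and lens: 5 um pixels behind an f/2 condensing lens.
dof_um = depth_of_focus_um(5.0, 2.0)  # -> 20.0 um of focusing tolerance
```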
  • the image sensor 2 can be replaced with a screen that can be any kind of image generator, e.g. an LCD, for applications other than ophthalmoscopy.
  • the image on the screen, e.g. the text of a book, 3D stereo images, a movie, or a virtual environment, is projected onto the retina 4a through the aspherical lens 1.
  • since the aerial image 3 is flat, maximum sharpness can be achieved by moving 7 the image sensor 2 parallel to the optical axis 8 as shown in Fig. 1, or by using the natural focusing of the crystalline lens 4h of the patient's eye 4 as shown in Fig. 2.
  • the former can be used for conscious/unconscious patients, whereas the latter can be used for imaging the retina of conscious patients, able to accommodate their crystalline lens 4h spontaneously.
  • Fig. 1 shows the movement 7 of the image sensor 2 in order to change the focal length 5 of the camera apparatus.
  • the change of the focal length 5 means that different surfaces within the eye appear sharp on the image sensor. These surfaces are called isofocus surfaces; they are shown in Figs. 10 and 12 and described in greater detail in connection with those figures.
  • the movement 7 along the optical axis 8 of the lens 1 and of the eye 4, respectively, can on one hand be used for bringing the retina into focus.
  • information on the position of the sensor relative to the lens can be used to localize objects within the eye:
  • the relative positions of sensor 2 and lens 1 are adapted such that the intraocular object appears sharp in the acquired image. Then, the position information is used to gain information on the distance of the object from the lens (e.g. its position along the optical axis).
  • the position of the image of the object within the acquired image can be used to gain information on the position of the object within the isofocus surface (e.g. position perpendicular to the optical axis). This is further described in connection with Figs. 10 and 12.
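A minimal sketch of this depth-from-focus idea, using a single thin lens as a stand-in for the real lens/eye optics; the focal length and distances below are assumed values, and the true mapping would come from calibration:

```python
def object_distance(f_mm, sensor_dist_mm):
    """Thin-lens depth from focus: from 1/f = 1/d_o + 1/d_s it follows that
    d_o = f * d_s / (d_s - f), i.e. the sensor position d_s at which an
    intraocular object appears sharp encodes its distance d_o from the lens."""
    return f_mm * sensor_dist_mm / (sensor_dist_mm - f_mm)

# With an assumed 8 mm focal length, an object that comes into focus when the
# sensor sits 12 mm behind the lens lies 24 mm in front of it.
d_o = object_distance(8.0, 12.0)  # -> 24.0
```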
  • Fig.2 shows the concept of using the natural focusing of the crystalline lens 4h to achieve the maximum sharpness of the images of the retina 4a.
  • An image projector 9 is positioned so as to generate a guiding image 11 on a plane B-B, either on the image sensor plane A-A or at an equivalent distance with respect to the aspherical lens 1.
  • the projector 9 guides the patient's crystalline lens 4h to focus on the image 11 so that the retinal image 3 is generated on the sensor 2.
  • the projector 9 may be placed directly in the image sensor plane A-A, as shown in Fig. 2, or may be placed such that its image plane B-B lies at an equivalent distance with respect to the aspherical lens 1.
  • Fig. 3 shows a fundus imaging apparatus, wherein a contact aspherical compound lens 15 (1b in combination with 1a) is used.
  • the same construction can be used with a non-contact aspherical lens.
  • the distance 5 of the image sensor 2 with respect to the aspherical lens 15 is fixed.
  • the focusing mechanism is as in Fig. 2:
  • a projector 9 projects an image 11 on a plane B-B at a specific distance 13 from the aspherical lens 15.
  • the projected image 11 is reflected by a partially reflecting mirror 12 toward the eye 4.
  • the patient is asked to focus on the generated image 11. Focusing of the eye's crystalline lens 4h on the plane B-B brings the image 3 of the retina into focus on the sensor plane A-A.
  • a schematic projection apparatus 9 is used to illustrate the concept.
  • the removal of the mechanical mechanisms, in addition to the application of the contact lens 15, results in further miniaturization and simplicity of construction.
  • the distance between the image sensor plane A-A and the lens is equal to the redirected distance 13 between the projection apparatus 9 and the aspherical lens 15.
  • the partially reflecting mirror 12 separates the focusing beams, projected for guidance of the patient's eye 4, from the imaging beams.
  • This apparatus can be used especially for inspections of the retinal fundus 4a of eyes 4 in which vision is not completely lost.
  • the projection apparatus 9 is proposed to illustrate the idea of guidance of the patient's eye 4.
  • the field of view 10 of our method depends on the size of the image sensor 2. Large image sensors that can capture the full field of view are available commercially but are relatively costly.
  • the first possibility is the application of low-magnification aspherical lenses 1, which generate a smaller aerial image 3.
  • the other possibility is using a mechanical mechanism to navigate a small image sensor 2a over the aerial image 3 on its plane A-A and capturing a sequence of images 16.
  • a rotational 17 or translational 18 mechanism is proposed to navigate the sensor 2a over the aerial image 3 in a plane perpendicular to the optical axis 8.
  • the captured sequence of images can later be mosaicked into the full aerial image 3.
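The mosaicking step can be sketched as pasting tiles at the offsets reported by the translation/rotation stage and averaging where they overlap; the tile geometry below is an illustrative assumption:

```python
import numpy as np

def mosaic(tiles, full_shape):
    """Paste tiles captured at known (row, col) offsets into one canvas,
    averaging where tiles overlap. In the apparatus the offsets would come
    from the encoders of the translational/rotational stage."""
    acc = np.zeros(full_shape)
    cnt = np.zeros(full_shape)
    for (r, c), img in tiles:
        h, w = img.shape
        acc[r:r + h, c:c + w] += img
        cnt[r:r + h, c:c + w] += 1
    cnt[cnt == 0] = 1  # leave uncovered pixels at zero instead of dividing by 0
    return acc / cnt

# Illustrative: a 4x4 "aerial image" reassembled from two overlapping tiles.
full = np.arange(16, dtype=float).reshape(4, 4)
tiles = [((0, 0), full[:, :3]), ((0, 1), full[:, 1:])]
rebuilt = mosaic(tiles, (4, 4))  # identical to `full`
```

A real implementation would refine the encoder offsets by image registration before blending.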
  • the device also utilizes a translational mechanism 7 for movement of the sensor 2 in the direction of the optical axis 8 and thus for image focusing.
  • Transpupillary illumination can be performed by equipping the apparatus with a mirror between the sensor 2 and the aspherical lens 1 and illuminating perpendicular to the imaging axis 8. This construction suffers from an increase in size and from the reflections caused by the several optical interfaces.
  • the invention therefore uses a transscleral illumination device, as shown in Fig. 5a+b.
  • an LED-based transscleral illumination method is used.
  • the illumination device uses a plurality of LEDs 26a, 26b.
  • the observer e.g. an intraocular imaging device such as the ophthalmoscope according to the invention or a human eye, is indicated with reference numeral 28.
  • LEDs 26a/26b generate a wide angle of illumination (view angle).
  • Lenses 27a, 27b are used to concentrate the view angle to the required area 4b on the sclera 4g:
  • two LEDs 26b with discrete lenses 27b are used that direct the light along optical axes 14 to regions 29b to the left and to the right of the iris 4e (see left part of Fig. 5a).
  • a continuous ring-shaped lens 27a is used that directs light from a plurality of LEDs 26a towards a ring-shaped region 29a around the iris 4e (see left part of Fig. 5b).
  • U.S. Pat. No. 3,954,329 proposed the pars plana regions as the irradiance area 29b on the sclera 4g, since their transmission coefficient for visible light is higher than that of the rest of the eye 4. Hence, the light can be focused on these regions 29b (Fig. 5a).
  • the illuminated area 4b can also be a ring 29a around the iris 4e (Fig. 5b).
  • the symmetric region 29a particularly enables even distribution of light over the retina 4a.
  • the scattering effect of the illuminated region 4b generates a nearly Lambertian light distribution in the interior of the eye 4 shown with shortly dashed lines.
  • LEDs 26a/26b should be chosen based on the light spectrum that they generate. LEDs 26a/26b normally emit in the visible range.
  • Fig. 5a+b show that the lenses 27a/27b are not in contact with the sclera 4g.
  • An LED-based light source results in high miniaturization and imposes no constraint on the imaging device shown in Fig. 1, since the illumination device shown in Fig. 5a+b and the imaging device shown in Fig. 1 are two separate setups that use different optical axes 8/14.
  • the device according to the invention, compared to common ophthalmoscopes, also requires a lower light intensity to be exerted on the eye 4.
  • Fig. 6 schematically illustrates the light loss of common methods, where the aerial image 33 generated by a standard lens and thus curved is not directly captured with a sensor 38 but imaged onto the sensor by means of one or more field lenses 39.
  • a human eye 4 is illuminated with a light intensity 32 generated by means of an illumination source 30, which for example comprises a LED 31.
  • the device shown in Fig. 5a+b can be used.
  • the light intensity 37 reaching the sensor 38 is only a small fraction of the generated light 32. Therefore, a higher irradiance load must be exerted on the sensitive tissue of the retina 4a in common methods. As a result, safe retinal fundus imaging requires a short time span.
  • the apparatus according to the invention introduces the minimum number of optical interfaces within the path of imaging light. Moreover, since the aerial image 3 is sensed directly, the light loss is further minimized. Hence, the light source intensity can be minimal and the imaging duration can be accordingly increased. This feature of the invention is of special application for long duration surgeries and minimally invasive diagnosis.
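The benefit of fewer optical interfaces can be illustrated with normal-incidence Fresnel losses: each air-glass interface with refractive index around 1.5 reflects about 4% of the light. The interface counts below are assumed for illustration, not taken from the patent:

```python
def fresnel_reflectance(n1, n2):
    """Normal-incidence Fresnel reflectance of one optical interface."""
    return ((n1 - n2) / (n1 + n2)) ** 2

def transmitted_fraction(n_interfaces, n1=1.0, n2=1.5):
    """Light surviving a chain of identical air-glass interfaces, ignoring
    absorption and scattering (an optimistic simplification)."""
    return (1.0 - fresnel_reflectance(n1, n2)) ** n_interfaces

# Illustrative counts: a single condensing lens has 2 air-glass interfaces;
# a field-lens relay with, say, 3 extra lenses adds 6 more.
direct = transmitted_fraction(2)  # ~0.92
relay = transmitted_fraction(8)   # ~0.72
```

Anti-reflection coatings reduce the per-interface loss, but the multiplicative penalty of a relay remains.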
  • Fig. 7 shows the schematic construction of the illumination source 53 with respect to the eye 4.
  • the sclera 4g acts as a Lambertian surface.
  • the radiance of the sclera 4g is given by:

    L = C_v · d²Φ / (dω · da · cos θ)

    where Φ is the radiant flux, dω is the solid angle made up by the differential surface dA on the retina 4a and the differential surface da on the sclera 4g, θ is the angle between the normal to the surface and the axis of the differential solid angle, and C_v is the transmission coefficient of the vitreous 4f.
  • the radiance of the illumination source 53 that reaches the sclera 4g is given by:

    L_s = Φ / (a · 2π(1 - cos θ_max))

    where a is the illuminated area 4b of the sclera and θ_max is the maximum view angle of the light source 53.
  • the eye irradiance also causes an increase in the temperature of the eye 4:

    ΔT = C_sa · Φ · t / (m · c)

    where C_sa is the absorption coefficient of the sclera 4g, ΔT is the increase in the temperature of the eye 4, t is the exposure duration, m is the mass of the eye 4, and c is the specific heat capacity of the eye.
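A lumped heat balance of the form ΔT = C_sa·Φ·t/(m·c) can be turned into a safe-duration estimate. The form of the balance is a reconstruction of the fragmentary text above, and every numeric constant below (flux, absorption coefficient, eye mass, specific heat, allowed rise) is an illustrative assumption, not a clinical value:

```python
def temperature_rise_k(flux_w, c_sa, t_s, mass_kg, c_j_per_kg_k):
    """Lumped heat balance deltaT = C_sa * flux * t / (m * c); assumes all
    absorbed power stays in the eye (no perfusion or convection), which
    overestimates the rise and is therefore conservative."""
    return c_sa * flux_w * t_s / (mass_kg * c_j_per_kg_k)

def max_safe_duration_s(flux_w, c_sa, mass_kg, c_j_per_kg_k, dt_limit_k):
    """Longest exposure keeping the temperature rise under dt_limit_k."""
    return dt_limit_k * mass_kg * c_j_per_kg_k / (c_sa * flux_w)

# Illustrative numbers only: 10 mW source, 30% scleral absorption, 7.5 g eye,
# water-like specific heat 4180 J/(kg*K), 1 K allowed rise.
t_max = max_safe_duration_s(0.010, 0.3, 0.0075, 4180.0, 1.0)
```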
  • Fig. 9 shows how the imaging method described above can be applied not only to create an image of the retina but also to project a specific external image onto the retina, for example to conduct a treatment by irradiating the retina 4a with a specific intensity distribution of electromagnetic waves.
  • the irradiation pattern is created by a light source 20 which is movable. If the light source 20 is a point light source, its image corresponds to an illumination spot on the retina 4a. The location of this spot on the retina 4a depends on the position of the light source 20. The clinician can observe the retina 4a and define the position of the electromagnetic radiation source 20 simultaneously.
  • the aspherical lens 1 and the partially reflecting mirror 12a create the image 3 of the retina 4a on the image sensor placed at the plane A-A.
  • the target area 18 on the retina 4a is defined by the clinician.
  • the transformed region 19 (the image of the target area) on the focal plane is the area over which the radiation source 20 should be navigated.
  • the navigation of radiation source 20 on the focal plane A-A corresponds to the navigation of the irradiance point over the required retinal region 18.
  • a partially reflecting mirror 12a separates the intraocular incoming light and the transpupillary outgoing electromagnetic beam 21.
  • the refractive index of the aspherical lens 1 for the incoming light wavelength and for the wavelength of the electromagnetic wave 21 should ideally be identical.
  • the clinician can have real-time observation through the image sensor 2 and can simultaneously irradiate the required point of the interior of the eye 4.
  • the retina's highly sensitive and small structures require accurate and precise surgeries.
  • the invention enables accurate intraocular localization.
  • the wide field of view, high auto-focus resolution, simple optical structure, and low illumination radiance of this method, in addition to the unoccupied periphery of the sclera 4g, are promising characteristics necessary for accurate long-term localization of intraocular devices 22 during surgeries.
  • Fig. 10 describes the localization concept. This localization requires internal and external calibration of the apparatus with respect to the applied asperical lens 1.
  • Different positions of the image sensor 2 with respect to the optical axis are indicated by planes a-a, b-b, ..., f-f. These positions correspond to isofocus surfaces a'-a', b'-b', ..., f'-f' within the eye: for example, all points on the isofocus surface b'-b' are imaged onto plane b-b.
  • the lateral position of an object is derived by analyzing the position of its image within the acquired image. All points within the eye that are imaged onto the same pixel n1, n2, n3 on the sensor define an isopixel curve n1', n2', ....
  • the position of the intraocular device 22 can be extracted based on the position of the planes of the image sensor a-a, b-b, etc. and the position of the image on the sensor.
  • Fig. 10 also shows two images 3, 25 taken on the planes a-a and c-c.
  • the image sensor 2 on the plane a-a is in-focus with the retina 4a on isofocus surface a'-a' (image 3), whereas on the plane c-c it is in focus with the intraocular device 22 (i.e. the needle tip) on isofocus surface c'-c' (image 25).
  • the sharpness of the image 25 is used for the localization of the object:
  • the feedback of apparatus focal length 23 and the position 24 of the image on the sensor (here pixel n2) defines the position of the intraocular device 22.
  • the corresponding intraocular curves for the sensor and pixel positions (isofocus surfaces and isopixel curves) are illustrated.
  • thereby, the three DOFs of the object's position are defined. Based on this method, accurate localization and visualization in the posterior of the eye are achievable.
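The localization lookup can be sketched as interpolation in hypothetical calibration tables, mapping the focusing sensor position to an isofocus depth and the pixel distance from the optical axis to a lateral offset. The real calibration (cf. Fig. 12) is a 2D family of surfaces, so this 1D version and all table values are deliberate simplifications:

```python
import numpy as np

def localize(d_sensor, pixel_radius, cal):
    """Axial depth from the isofocus calibration (sensor position -> depth)
    and lateral offset from the isopixel calibration (pixel radius ->
    offset), each by 1D linear interpolation over the calibration table."""
    z = np.interp(d_sensor, cal["sensor_mm"], cal["depth_mm"])
    r = np.interp(pixel_radius, cal["pixel"], cal["lateral_mm"])
    return z, r

# Hypothetical calibration tables (sensor steps of ~2.1 mm, as in Fig. 12).
cal = {
    "sensor_mm":  [10.0, 12.1, 14.2, 16.3],  # focusing positions d_s
    "depth_mm":   [22.0, 19.0, 16.0, 13.0],  # in-focus depth inside the eye
    "pixel":      [0.0, 100.0, 200.0],       # pixel distance from optical axis
    "lateral_mm": [0.0, 2.3, 4.6],           # lateral offset inside the eye
}
z, r = localize(13.15, 150.0, cal)
```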
  • Fig. 11 shows a sample of a miniaturized intraocular imaging apparatus based on the invention.
  • the device comprises an aspherical lens 1 and an image sensor 2, which are arranged in a common housing 56 and, within the housing 56, in an imaging apparatus cover 40.
  • the imaging apparatus cover 40 is coated with non-reflecting paint.
  • the aspherical lens 1 is positioned inside this cover 40.
  • the distance between the aspherical lens 1 and the eye pupil is fixed, e.g. by appropriate positioning elements (not shown), and equal to the working distance of the aspherical lens 1.
  • the aspherical lens 1 creates a flat aerial image directly on the image sensor 2.
  • the applied aspherical lens 1 provides a field of view of 103° measured from the pupil, nearly covering the full retinal fundus, and a magnification of 0.72.
  • in order to visualize objects in the full posterior of the eye, the apparatus should have a variable focal length of 10-20 mm.
  • the active area of the image sensor 2 is smaller than the aerial image.
  • the image sensor 2 moves over the aerial image plane generated by the aspherical lens 1, i.e. perpendicular to the optical axis of the lens 1. This movement is achieved by the rotational motor 48.
  • the position of the sensor 2 in the direction of the optical axis can be adjusted by two linear motors 41.
  • Fig.11a shows the back view of the intraocular imaging apparatus.
  • the cover 56 of the apparatus is cut to illustrate the apparatus's mechanisms and circuitry.
  • Two linear actuators, e.g. two motors 41, move the sensor 2 along the optical axis to focus on the desired intraocular object.
  • a rotational mechanism, actuated by a motor 48, moves the image sensor 2 over a plane perpendicular to the imaging axis.
  • Fig.11b shows the front view of the intraocular imaging apparatus.
  • the light source lens 27a faces the eye sclera.
  • a set of high-power LEDs 26a is mounted behind the lens 27a.
  • a heat sink 47 is assembled behind the LEDs 26a and holds the apparatus cover 56.
  • Fig.11c shows the exploded view of the intraocular imaging apparatus.
  • the image sensor 2 is connected through a cable to the frame grabber and control circuit 46.
  • the circuits 46 are mounted on a moving chassis 44.
  • the moving chassis 44 is translated by the motor holders 45.
  • the set of the light source lens 27a and imaging device cover 40 acts as the stationary chassis.
  • the stationary gear 43 acts as the holder of the aspherical lens 1 inside the stationary chassis.
  • Two motors 41 translate the imaging device.
  • a motor 48 rotates the imaging device, and therefore the image sensor 2, over a plane perpendicular to the imaging axis through the interaction of the rotary gear 49 and the stationary gear 43.
  • the imaging device and motors are assembled on a rotating holder 42.
  • the user can control the image sensor's 2 position through interface software.
  • the mechanical mechanisms could be replaced by any improved motion mechanism. Optimization of the mechanical mechanisms would result in further miniaturization and better performance.
  • the intraocular imaging apparatus can be placed on a glasses frame or a fixed base, or it can even be held manually by a technician.
  • Fig. 12 shows a simulation of the optical properties of a mechatronic vitreoretinal ophthalmoscope (MVO) based on indirect ophthalmoscopy, e.g. as shown in Fig. 11.
  • the MVO is a simple device that consists of two components: a condensing lens 1 that is kept at a constant position with respect to the eye, and a sensor 2 that captures the aerial image directly and moves with respect to the lens to focus on objects throughout the eye.
  • a lens as described above with respect to Fig. 1 is used as the condensing lens.
  • a 24×24 mm² CMOS sensor can be used to capture the full field-of-view.
  • the condensing lens causes a magnification of 0.72 and thus a structure of 100 µm on or near the retina will create an image of 72 µm. Even with no additional magnification, a CMOS sensor with a common sensing-element size of 6×6 µm² will be able to resolve small retinal structures sufficiently.
  • CMOS sensors have a shallow depth-of-focus, and as a result, they can be used effectively in depth-from-focus techniques.
  • To focus on objects throughout the eye, the sensor must have a travel of about 45 mm. However, since we are mainly interested in localizing objects in the posterior of the eye, a sensor travel of only about 10 mm is necessary.
  • An additional benefit of the MVO is that it minimizes the amount of illumination necessary for adequate image quality. Since the human eye, especially the retina, is a sensitive structure, any method that minimizes irradiance on it is preferable. Every lens surface attenuates the light passing through it; thus, every additional surface requires extra light to be exerted on the eye.
  • the MVO allows the minimum light intensity to be exerted on the patient's eye by minimizing the number of lens surfaces within the light path to two. This minimal requirement for retina irradiance, in combination with the wide field-of-view, adequate spatial resolution, and ability to obtain depth information from focus, makes the MVO promising for imaging and localizing intraocular devices.
  • the condensing lens projects the spherical surface of the retina on a flat aerial image.
  • moving the sensor will focus the image at different surfaces inside the eye; we call these surfaces isofocus surfaces.
  • locations inside the eye will correspond to pixels on a moving sensor in a way that differs from the perspective projection model; the locus of intraocular points that is imaged on the same pixel coordinates is called an isopixel curve.
  • Fig. 12 shows that the expected depth resolution is higher for regions far from the retina. Moreover, it shows that the formed image is inverted. From the slope of the isopixel curves, it is understood that the magnification of an intraocular object increases farther from the retina. As a result, we conclude that both depth and lateral resolution increase for positions closer to the intraocular lens. Furthermore, as the intraocular lens is approached, the isofocus surfaces transition from spherical surfaces to planes.
  • Paraxial models are first-order models of the optics of a system of lenses, which hold in regions near the optical axis. The accuracy of these methods decreases as the angles of the incoming rays with respect to the optical axis increase, and the breakdown point of the approximations depends on the lenses that compose the combined optical system.
  • Fig. 13 shows the aligned imaging system.
  • the compound system projects an object at distance s_o to an image on the sensor at distance s_i according to the Gaussian lens equation 1/s_o + 1/s_i = 1/f, where f is the focal length of the compound system and distances are measured from its principal planes (CPP in Fig. 13).
  • In the model eye it is possible to calibrate for the errors using the actual values; the result is shown in Fig. 14. In reality, limited access to the interior of the human eye makes full calibration impossible. However, the position of the retina can be known from biometric measurements. Assuming that we have knowledge of the true retinal position, we can calibrate our system and estimate the model errors from only one initial observation. In Fig. 14 the biometric model is fit to the actual values using just one point for calibration. Using this method, the accuracy will always be high near the retina, but may decrease for regions closer to the pupil.
  • Fig. 16 shows an exploded view of the main components of a camera device 100 which additionally comprises a projector 9 for projecting an image into the eye, e.g. as schematically shown in Fig. 9.
  • the camera device 100 comprises an aspherical lens 1 as well as a sensor 2.
  • the sensor is mounted on holder 106, which is coupled with base element 105.
  • the sensor 2 can be moved in the direction of the imaging axis by a motor 41.
  • the motor 41 is arranged in a housing 102 and coupled with the sensor 2 via the sensor base element 105.
  • the components of the camera device 100 are arranged in a common housing 56.
  • the housing 56 also houses a beam splitter 12a, e.g. a partially reflective mirror.
  • An illumination system as described above with respect to Fig. 10, comprising a plurality of LEDs 26a, a corresponding ring-shaped lens 27a, and a heat sink 47, is also present.
  • the projector 9 comprises an image generator with a light source, here a laser diode 117.
  • the light source is held by a laser diode holder 116. It can be moved by means of two actuators, i.e. tilted by motor 113 and rotated by motor 109, in a plane perpendicular to the imaging axis.
  • the movable and stationary parts of the projector further comprise a projector head 110, a coupling 111, a rotary chassis 112 of the projector, and a nut 115 of the linear motor 113. Adjustment of the depth of focus of the projector 9 is not possible with this configuration, but another actuator for adjusting the position of the light source along the optical axis could be added.
  • the camera 100 with integrated projector 9 can be used to project an image into the eye or to illuminate a certain spot on the retina, for example for radiotherapy of the retina, and for simultaneous observation thereof.
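Several of the bullets above rely on the mapping between sensor position and object depth (depth-from-focus). A minimal sketch with the Gaussian thin-lens equation follows; the focal length and object distance are illustrative assumptions, not values from the patent, and the condensing lens plus the patient's eye are treated as a single compound lens:

```python
def image_distance(s_o, f):
    """Gaussian thin-lens equation 1/s_o + 1/s_i = 1/f, solved for s_i.

    Distances (mm) are measured from the principal planes of the
    compound lens system (condensing lens + patient's eye).
    """
    if s_o <= f:
        raise ValueError("object inside front focal length: no real image")
    return 1.0 / (1.0 / f - 1.0 / s_o)

def object_distance(s_i, f):
    """Inverse mapping used for depth-from-focus: recover s_o from s_i."""
    return 1.0 / (1.0 / f - 1.0 / s_i)

# Illustrative numbers only (not taken from the patent):
f = 15.0    # assumed compound focal length, mm
s_o = 40.0  # assumed object distance, mm
s_i = image_distance(s_o, f)
# The round trip recovers the object distance:
assert abs(object_distance(s_i, f) - s_o) < 1e-9
```

Sweeping the sensor along the optical axis and recording the position of best focus inverts this relation, which is the basis for localizing an intraocular object along the imaging axis.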


Abstract

A method for intraocular imaging based on the direct acquisition of the flat aerial image generated by aspherical lenses is proposed. This approach leads to the removal of the field lenses, guaranteeing that the least number of optical elements will be used. Based on this idea, we design a miniaturized camera apparatus able to capture the patient's entire retina. Patient-based and machine-based methods to auto-focus on the retina are discussed, and different variations of the initial design are provided. We propose an extended system able to visualize objects in the full posterior of the eye, as well as a light-emitting-diode-based transscleral illumination method that is applicable for uniform illumination of the retina and long-term observation.

Description

OPHTHALMOSCOPY USING DIRECT SENSING OF THE FLAT AERIAL-IMAGE CREATED BY AN ASPHERIC LENS
FIELD OF INVENTION
The present invention pertains generally to ophthalmoscopy. It presents a method and an apparatus for vitreoretinal imaging and localization of intraocular objects.
STATE OF THE ART
Retinal fundus imaging is a critical task in ophthalmoscopy. In funduscopy, the spherical shape of the retina must be transformed to a flat shape on the camera sensor to achieve a sharp image. The optical structure of the eye, including the cornea, lens, vitreous and aqueous humor, and pupil, makes wide-field-of-view imaging challenging. Moreover, the retinal tissue is highly sensitive to light intensity.
Funduscopy is typically performed by positioning a high-magnification lens in front of the eye. This lens generates an aerial image that is curved. The aerial image is then made planar as it passes through a series of field lenses, and finally the flat image is captured by an image sensor. In addition to modifying the curvature of the aerial image, field lenses are used to fit the size of the aerial image to the image sensor. This method is discussed in "O. Pomerantzeff, R. H. Webb, and F. C. Delori, Image Formation in Fundus Cameras, Investigative Ophthalmology and Visual Science, 18(6):630-637, 1979". In that work, a transpupillary illumination method is used to illuminate the interior of the eye. A wide-field-of-view imaging method is also proposed in "O. Pomerantzeff, Wide-angle Non-contact and Small-angle Contact Cameras, Investigative Ophthalmology and Visual Science, 19(8):973-979, 1980". This method also applies transpupillary illumination, including the separation of the apertures of ophthalmoscopic illumination and observation. In U.S. Pat. No. 3,954,329, Pomerantzeff proposed a contact ophthalmoscopic setup with a wide field of view. Since the applied ophthalmoscopic lens generates a curved aerial image, the proposed setup must be accompanied by secondary optical viewing equipment (e.g. field lenses, a camera, an observer's eye). The necessity of a secondary imaging device limits the miniaturization of such a system. The curvature of the aerial image remained a relevant aberration of such systems until Volk Co. and A. Donald in U.S. Patent No. 5,706,073 proposed a family of aspherical lenses capable of generating flat images of the retina, with a trade-off between wide field of view and high magnification. This patent covers both contact and non-contact aspherical lenses for use with a slit lamp biomicroscope, other operating microscopes, and manual observation.
Proper illumination for fundus photography can be achieved non-invasively through the sclera or pupil. Transpupillary illumination suffers from the reflections caused by several interfaces within the light path. In order to avoid reflections, typically a portion of the pupil is used for imaging and a portion is used for illumination, leading to a limited field of view. Transpupillary illumination also requires additional optical devices within the imaging path, resulting in a larger number of optical interfaces (e.g. reflecting surfaces), which attenuate the light reaching the image sensor.
Transscleral illumination is proposed by Pomerantzeff in U.S. Pat. No. 3,954,329. His method applies two or more optical fibers in contact with the sclera. Gil et al. in U.S. Pat. Pub. No. US2007/0159600A1 have also used a transscleral method, proposing the use of two optical fibers focusing on the pars plana region with the help of additional lenses. Their method utilizes a halogen lamp and transfers the light through optical fibers. This method requires near-IR and near-UV filters.
Optical properties of the human sclera (i.e. transmission, reflection, and absorption coefficients) are determined in "A. Vogel, C. Dlugos, R. Nuker, and R. Birngruber, Optical Properties of Human Sclera, and Their Consequences for Transscleral Laser Applications, Lasers in Surgery and Medicine, (11):331-340, 1991". The safe radiance thresholds of the human eye are given in "IEEE International Committee on Electromagnetic Safety (SCC39), Standard for Safety Levels with Respect to Human Exposure to Radio Frequency Electromagnetic Fields, 3 kHz to 300 GHz, IEEE Standards, C95.1, pages 87-88, 2005". The proposed exposure limits for the human eye can be adopted for calculating the maximum light intensity for safe ophthalmoscopy. Transscleral laser treatment is used clinically for glaucoma. Optical properties of the sclera, conjunctiva, and ciliary body have been examined in "B. Nemati, H. G. Rylander, and A. J. Welch, Optical Properties of Conjunctiva, Sclera, and the Ciliary Body and Their Consequences for Transscleral Cyclophotocoagulation, Applied Optics, 35(19):3321-3327, 1996" in order to determine the optimal laser wavelength.
A few miniature hand-held retinal fundus cameras have been proposed to date. Nanjo has proposed a device in U.S. Pat. No. 5,668,621. In this device, transpupillary illumination is applied. The device is a gun-shaped fundus camera, wherein the imaging optics are embedded in the main body and the illumination structure is embedded in the handle. Another possible design is proposed in "C. Gliss, J. Parel, J. T. Flynn, H. Pratisto, and P. Niederer, Toward a Miniaturized Fundus Camera, Journal of Biomedical Optics, 9(1):126-131, Jan/Feb 2004". In the proposed device, the fundamental structure of macro ophthalmoscopes, i.e. generating an aerial image with an ophthalmoscopic lens and then imaging it through a set of field lenses, is miniaturized. Transpupillary and transscleral illumination are also investigated.
Generally, the common methods of ophthalmoscopy are based on two steps: creation of the retina's aerial image and capturing of the aerial image. In the latter, the scattering, reflection, and absorption due to each optical interface within the light path require high irradiance of the retina. Thus, the imaging duration should be limited. The field lenses and any illumination component take up space. Consequently, available fundus cameras cannot be easily miniaturized.
OBJECTS OF THE INVENTION
The object of this invention is to provide a method and an apparatus for intraocular imaging that is simpler than known devices and, in particular, uses an optical system with a reduced number of optical interfaces.
Another object of this invention is a miniature wide-angle fundus viewing camera that does not require pupil dilation, that is capable of handling images automatically, and that can be operated by personnel with minimal training.
Another object of this invention is a miniature ophthalmoscope that remains stationary with respect to the eye, and has the ability to image objects throughout the posterior of the eye and to localize these intraocular objects during medical interventions.
Another object of this invention is to provide a method for minimizing the eye irradiance intensity and therefore maximizing the allowable duration of imaging. Furthermore, uniformity of the illumination is an object of the invention. Additionally, the illumination device should not obstruct the imaging apparatus or negatively impact image formation.
An auxiliary object of this invention is an imaging method and apparatus that allows real-time electromagnetic therapy (e.g. laser) of the interior of the eye.
SUMMARY OF THE INVENTION
These and further objects are achieved by a method and apparatus as described in the independent claims. Beneficial embodiments are described in the dependent claims and the description, and are shown in the figures. The present invention uses a single ophthalmoscopic lens or a lens system (compound lens), and does not require any additional imaging device, e.g. microscope.
This invention stems from the advent of new aspherical lenses capable of generating flat aerial images of the posterior of the eye. This type of lens transforms the spherical shape of the retina to a flat plane. These lenses can be categorized as contact and non-contact types. Contact lenses generate a wide field of view and typically have a smaller diameter, while non-contact lenses generate a smaller field of view and have a larger diameter. We utilize these aspherical lenses for several applications described as examples hereinafter.
Applying this type of lens allows the removal of the field lenses. The image sensor is placed directly at the aerial image position. This structure provides the minimum distance between the image sensor and the pupil. Moreover, the apparatus structure facilitates miniaturization and has a large aperture, which provides a wide field of view for the apparatus. This wide field of view can be obtained with a single image sensor, provided its size is large enough. In the case of a small image sensor, a rotational or translational mechanism moves it over the whole aerial image, while the image sensor captures different portions of it. The captured images can be post-processed and mosaicked to obtain the full aerial image.
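The scan-and-mosaic step described above can be sketched as follows. The `mosaic` helper, the tile offsets, and the overwrite policy are illustrative assumptions; the patent only specifies that the captured sub-images are post-processed into the full aerial image:

```python
def mosaic(tiles, full_h, full_w):
    """Compose a full aerial image from tiles captured at known offsets.

    `tiles` is a list of (row_offset, col_offset, tile) entries, where
    `tile` is a 2-D list of pixel values and the offsets come from the
    encoders of the mechanism that moves the small sensor over the
    aerial image plane. Overlapping pixels are simply overwritten here;
    a real system would register and blend the overlap regions.
    """
    full = [[0] * full_w for _ in range(full_h)]
    for r0, c0, tile in tiles:
        for r, row in enumerate(tile):
            for c, v in enumerate(row):
                if 0 <= r0 + r < full_h and 0 <= c0 + c < full_w:
                    full[r0 + r][c0 + c] = v
    return full

# Two 2x2 tiles covering a 2x4 aerial image side by side:
left = [[1, 1], [1, 1]]
right = [[2, 2], [2, 2]]
img = mosaic([(0, 0, left), (0, 2, right)], 2, 4)
```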
Capturing sharp images requires focusing mechanisms. Using the natural focus of the crystalline lens of the eye is a possible approach, in which the patient is asked to focus on an image at a proper distance, resulting in the creation of the aerial image on the image sensor. Another possibility is using a translational mechanism for focusing: the image sensor moves parallel to the optical axis to maximize the image sharpness.
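The translational, sharpness-maximizing approach can be sketched in a few lines. The `capture_at(z)` interface is a stand-in for driving the focusing motor to position z and grabbing a frame (it is an assumption, not an API of the apparatus), and the mean-squared-Laplacian metric is one common choice of focus measure:

```python
def sharpness(image):
    """Focus metric: mean squared discrete Laplacian over the image.

    A sharp (in-focus) image has strong local intensity changes, so
    this metric peaks when the sensor sits in the flat aerial-image
    plane. `image` is a 2-D list of intensities.
    """
    h, w = len(image), len(image[0])
    total = 0.0
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            lap = (image[r - 1][c] + image[r + 1][c] + image[r][c - 1]
                   + image[r][c + 1] - 4 * image[r][c])
            total += lap * lap
    return total / ((h - 2) * (w - 2))

def autofocus(capture_at, positions):
    """Sweep the sensor along the optical axis; keep the sharpest stop."""
    return max(positions, key=lambda z: sharpness(capture_at(z)))
```

A coarse sweep followed by a finer sweep around the best position would reduce the number of frames needed in practice.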
The described method and apparatus can also provide focused images of locations in the posterior of the eye other than the retina. Focus information (or information on the location of the image plane) can then be used for the localization of objects in the posterior of the eye. As a result, it can be used as part of a system that controls intraocular devices, e.g. untethered microrobotic devices, with visual feedback.
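A hedged sketch of such focus-based localization follows. `depth_of` and `magnification_of` stand in for calibration curves a concrete system would fit (the patent describes isofocus surfaces and isopixel curves rather than these helper functions, which are assumptions for illustration):

```python
def localize(sensor_z, pixel_xy, depth_of, magnification_of):
    """Recover a 3-DOF intraocular position from focus feedback.

    `sensor_z` is the sensor position along the optical axis at which
    the intraocular device appears sharpest; `pixel_xy` is the image
    position of the device on the sensor. `depth_of` maps sensor
    position to intraocular depth, and `magnification_of` gives the
    lateral scale at that depth; both are assumed calibration curves.
    A real system would also account for the inversion of the image.
    """
    z = depth_of(sensor_z)       # axial DOF from the focus position
    m = magnification_of(z)      # lateral scale at that depth
    x = pixel_xy[0] / m          # lateral DOFs from the pixel position
    y = pixel_xy[1] / m
    return (x, y, z)
```

With a toy linear calibration (`depth_of = lambda s: 2.0 * s`, constant magnification 0.72), an object sharp at sensor position 5 and imaged at pixel offset (7.2, -3.6) localizes to roughly (10, -5, 10).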
A transscleral illumination apparatus based on light emitting diodes (LEDs) is used with the proposed imaging apparatus. The LED-based light source is equipped with a lens that faces the sclera, and focuses the light on it. The shape of the light source combined with the scattering effect of the sclera generates a uniform distribution of the irradiance on the retina. The limited number of lens surfaces between the image sensor and the retina, in addition to the uniformity of light distribution, provide safe long-term imaging.
The aerial image need not be formed directly above the aspherical lens. With a partially reflecting mirror or prism, the image can be redirected to a sensor that is at an angle with respect to the optical axis of the lens. This enables our device to be used for directed electromagnetic irradiation of the interior of the eye without affecting image formation.
A simple miniature hand-held retinal fundus camera apparatus is proposed, combining direct aerial image sensing and LED-based transscleral illumination. The construction of the apparatus facilitates simple application instructions and handling by personnel with minimal training.
DESCRIPTION OF THE FIGURES
Fig.1 shows the basic optical structure of direct aerial image sensing intraocular ophthalmoscopy.
Fig.2 shows an apparatus based on the natural focusing of the crystalline lens of the eye.
Fig.3 shows a contact imaging apparatus using the natural focusing ability of the patient's crystalline eye lens.
Fig.4 shows an imaging apparatus using a small rotating sensor to capture the full field of view of the retina.
Fig.5a+b show an LED-based illumination method applied for imaging.
Fig.6 demonstrates the mechanism of the illumination loss found in common methods.
Fig.7 shows a schematic eye used to calculate the functional safe radiance graph for the illumination system.
Fig.8 is the graph of safe illumination power versus duration of imaging.
Fig.9 demonstrates the use of a partially reflecting mirror to move the image sensor off the optical axis of the lens. This also enables electromagnetic radiation to be transmitted into the eye through the aspherical lens.
Fig.10 illustrates the method and apparatus used for localization of the intraocular devices.
Fig.11 a) shows the back view of a prototype hand-held fundus camera, b) shows the front view of a prototype hand-held fundus camera, c) shows the exploded view of a prototype hand-held fundus camera.
Fig. 12 shows an optical model of a mechatronic vitreoretinal ophthalmoscope, including the isofocus surfaces and isopixel curves. The different isofocus surfaces correspond to values of the sensor position in mm, for uniform sensor steps of approx. 2.1 mm. The isopixel curves correspond to pixel distances from the optical axis, for uniform steps of approx. 2.3 mm.
Fig. 13 shows an experimental imaging setup. PP: principal planes of the thick condensing lens, CPP: principal planes of the compound lens system.
Fig. 14 shows actual values and model fits for the depth-from-focus experiment. Full calibration shows a mean absolute error of 190 μm. Biometric calibration shows a mean absolute error of 224 μm.
Fig. 15 shows a localization experiment in a model eye.
Fig. 16 shows an exploded view of a camera according to the principle of Fig. 3, e.g. for radiotherapy of the retina.
DETAILED DESCRIPTION OF THE INVENTION
OPTICAL CONSTRUCTION
Fig.1 illustrates the concept of the invention. The ophthalmoscope comprises an aspherical lens 1 capable of generating a flat aerial image 3 of the retina 4a on a plane A-A at a defined distance 5 above the aspherical lens 1. This distance 5 is defined as the focal length of the apparatus (when placed at the working distance with respect to the eye 4).
The aspherical lens 1 could essentially have the following optical parameters:

Surface                       | Radius | Thickness | Refraction index | Asphericity | Aperture
Back side of aspherical lens  | 9.48   | 13        | 1.531            | -9.3255     | 13.5
Front side of aspherical lens | -11.65 | 5         | 1                | -1.0671     | 13.5

The aspherical lens 1 is placed at its working distance 6 from the eye pupil 4d such that the optical axis 8 and the center of the sensing device 2 are aligned. The aerial image 3 of the retina 4a is captured by the sensing device 2 directly. The sensing device can be a CCD sensor, a CMOS sensor, or any other image sensor. Hereinafter, we use a CMOS sensor as the imaging device to illustrate the concept.
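As a rough consistency check of the listed lens parameters, the paraxial lensmaker's equation for a thick lens can be evaluated. This ignores the asphericity values, assumes the dimensions are in millimetres and the radii follow the usual sign convention (positive back surface, negative front surface), and therefore gives only an order-of-magnitude estimate rather than the lens's specified power:

```python
def thick_lens_focal_length(r1, r2, n, d):
    """Paraxial lensmaker's equation for a thick lens in air:

        1/f = (n - 1) * [1/r1 - 1/r2 + (n - 1)*d / (n*r1*r2)]

    r1, r2: surface radii of curvature; n: refractive index;
    d: center thickness. Aspheric terms are deliberately ignored.
    """
    inv_f = (n - 1) * (1 / r1 - 1 / r2 + (n - 1) * d / (n * r1 * r2))
    return 1.0 / inv_f

# Values taken from the table above (assumed to be in mm):
f = thick_lens_focal_length(r1=9.48, r2=-11.65, n=1.531, d=13)
# f comes out on the order of 12-13 mm, a plausible magnitude for a
# high-power condensing lens, but not a substitute for ray tracing
# through the actual aspheric surfaces.
```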
The field of view 10 is measured from the pupil and corresponds to the angle subtended by the part of the retinal fundus captured in a single image. The optical structure of the apparatus has an aperture of nearly the diameter of the aspherical lens 1 which, in addition to the short working distance 6, provides a wide field of view 10. A typical property of this type of aspherical lens is that its use avoids dilation of the pupil 4d. Exploiting this type of ophthalmoscopic lens 1, our method results in a simple intraocular imaging device.
Fig. 1 also shows the image 3 of the retina 4a of an artificial eye taken by this method; this image is included to illustrate the concept.
The aspherical lens can be chosen from a family of aspherical lenses 1 proposed by Volk Co. in U.S. Patent No. 5,706,073. These lenses are specifically adapted to the optical parameters of the human eye and can be used for applications in connection with the human eye, e.g. funduscopy. This lens family contains non-contact and contact types. Non-contact lenses provide a comfortable imaging situation. The exact working distance 6 depends on the position of the patient's eye 4. Increasing the working distance 6 can result in a reduction of the field of view 10. Moreover, non-contact lenses 1 are larger, and the generated field of view 10 is smaller than for the contact group. Non-contact lenses typically provide a field of view 10 in a range of 60° to 100°, whereas contact lenses typically offer a field of view 10 of more than 100°. In addition, contact lenses have a smaller diameter, lead to more compact devices, and require no working distance 6 adjustment. They also increase the isolation of the apparatus from ambient incoming light. However, since they must be in contact with the patient's eye 4, sterilization is required, and the imaging conditions are not optimal for patient comfort.
The selection of the aspherical lens 1 also depends on the required dimension of the features on the image sensor 2. This factor is defined by the magnification of the aspherical lens 1. The pixel size on the image sensor 2 defines the depth of field and hence the required resolution of the translational focusing mechanism.
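The link between pixel size and the required resolution of the focusing mechanism can be sketched with simple blur-circle geometry. All numbers below are assumptions for illustration, not parameters of the apparatus:

```python
def focus_tolerance(pixel_size, image_distance, aperture_diameter):
    """Axial sensor displacement that keeps defocus blur within one pixel.

    Simple geometric model: displacing the sensor by dz from the aerial
    image plane spreads a point over a blur circle of diameter
    b = dz * D / s_i (D: aperture diameter, s_i: image distance), so the
    blur stays below the pixel size c as long as dz <= c * s_i / D.
    The focusing mechanism must therefore resolve steps of this order.
    """
    return pixel_size * image_distance / aperture_diameter

# Assumed values: 6 um pixels, 15 mm image distance, 25 mm aperture.
dz = focus_tolerance(pixel_size=0.006, image_distance=15.0,
                     aperture_diameter=25.0)  # result in mm
```

A smaller pixel or a larger aperture tightens this tolerance, which is the sense in which the pixel size "defines the required resolution of the translational focusing mechanism".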
The image sensor 2 can be replaced with a screen that can be any kind of image generator, e.g. an LCD, for applications other than ophthalmoscopy. The image of the screen, e.g. the text of a book, 3D stereo images, a movie, a virtual environment, is projected on the retina 4a through the asperical lens 1.
IMAGE FOCUSING
Since the aerial image 3 is flat, maximum sharpness can be achieved by moving 7 the image sensor 2 parallel to the optical axis 8 as shown in Fig.1, or by using the natural focusing of the crystalline lens 4h of the patient's eye 4 as shown in Fig.2. The former can be used for conscious or unconscious patients, whereas the latter can be used for imaging the retina of conscious patients able to accommodate their crystalline lens 4h spontaneously.
Fig. 1 shows the movement 7 of the image sensor 2 in order to change the focal length 5 of the camera apparatus. The change of the focal length 5 results in different surfaces within the eye appearing sharp on the image sensor. These surfaces are also called isofocus surfaces; they are shown in Figs. 10 and 12 and described in greater detail in connection with those figures. The movement 7 along the optical axis 8 of the lens 1 and of the eye 4, respectively, can on the one hand be used for bringing the retina into focus. On the other hand, information on the position of the sensor relative to the lens can be used to localize objects within the eye: The relative positions of sensor 2 and lens 1 are adapted such that the intraocular object appears sharp in the acquired image. Then, the position information is used to gain information on the distance of the object from the lens (e.g. position along the optical axis). Furthermore, the position of the image of the object within the acquired image can be used to gain information on the position of the object within the isofocus surface (e.g. position perpendicular to the optical axis). This is further described in connection with Figs. 10 and 12.
Fig.2 shows the concept of using the natural focusing of the crystalline lens 4h to achieve the maximum sharpness of the images of the retina 4a. An image projector 9 is positioned in order to generate a guiding image 11 on a plane B-B in the image sensor plane A-A or at an equivalent distance with respect to the aspherical lens 1. The projector 9 guides the patient's crystalline lens 4h to focus on the image 11 so that the retinal image 3 is generated on the sensor 2. The projector 9 may be placed directly in the image sensor plane A-A, as shown in Fig. 2, or may be placed such that its image plane B-B has the same distance from the aspherical lens 1 as the sensor plane A-A, e.g. as shown in Fig. 3.
Fig.3 shows a fundus imaging apparatus wherein a contact aspherical compound lens 15 (1b in combination with 1a) is used. The same construction can be used with a non-contact aspherical lens. The distance 5 of the image sensor 2 with respect to the aspherical lens 15 is fixed. The focusing mechanism is as in Fig. 2: A projector 9 projects an image 11 on a plane B-B at a specific distance 13 from the aspherical lens 15. The projected image 11 is reflected by a partially reflecting mirror 12 toward the eye 4. The patient is asked to focus on the generated image 11. Focusing of the eye's crystalline lens 4h on the plane B-B brings the image of the retina 3 into focus on the sensor plane A-A. A schematic projection apparatus 9 is used to illustrate the concept. In this case, the removal of the mechanical mechanisms, in addition to the application of the contact lens 15, results in higher miniaturization and simplicity of construction. Obviously, the distance from the image sensor plane A-A to the lens is equal to the redirected distance 13 from the projection apparatus 9 to the aspherical lens 15. The partially reflecting mirror 12 separates the focusing beams, projected for guidance of the patient's eye 4, from the imaging beams. This apparatus can be used especially for inspections of the retinal fundus 4a of an eye 4 whose vision is not completely lost. The projection apparatus 9 is proposed to illustrate the idea of guidance of the patient's eye 4.
CAPTURING THE ENTIRE FIELD OF VIEW
As shown in Fig.1, the field of view 10 of our method depends on the size of the image sensor 2. Large image sensors that can capture the full field of view are available commercially but are relatively costly.
Typical low-cost sensors are usually smaller. In the case of using a small sensor, and in order to capture the entire field of view 10, two possibilities are proposed. The first possibility is the application of low-magnification aspherical lenses 1, which generate a smaller aerial image 3. The other possibility, as shown in Fig. 4, is using a mechanical mechanism to navigate a small image sensor 2a over the aerial image 3 in its plane A-A and capturing a sequence of images 16. A rotational 17 or translational 18 mechanism is proposed to navigate the sensor 2a over the aerial image 3 in a plane perpendicular to the optical axis 8. The captured sequence of images can later be mosaicked into the full aerial image 3. The device also utilizes a translational mechanism 7 for movement of the sensor 2 in the direction of the optical axis 8 and thus for image focusing.
ILLUMINATION
Basically, in the apparatus according to the invention, the required space for the illumination devices is limited, since the sensor 2 is placed directly on the aerial image 3. A possible method for illuminating the interior of the eye 4 would be transpupillary. Transpupillary illumination can be performed by equipping the apparatus with a mirror between the sensor 2 and the aspherical lens 1 and illuminating perpendicular to the imaging axis 8. This construction suffers from an increase in size and from the reflections caused by the several optical interfaces.
Another possibility is using a transscleral illumination device as shown in Fig. 5a+b. With the aim of miniaturization, an LED based transscleral illumination method is used. The illumination device uses a plurality of LEDs 26a, 26b. The observer, e.g. an intraocular imaging device such as the ophthalmoscope according to the invention or a human eye, is indicated with reference numeral 28.
Typically, LEDs 26a/26b generate a wide angle of illumination (view angle). Lenses 27a, 27b are used to concentrate the view angle onto the required area 4b on the sclera 4g: In Fig. 5a, two LEDs 26b with discrete lenses 27b are used that direct the light along optical axes 14 to regions 29b to the left and to the right of the iris 4e (see left part of Fig. 5a). In Fig. 5b, a continuous ring-shaped lens 27a is used that directs light from a plurality of LEDs 26a towards a ring-shaped region 29a around the iris 4e (see left part of Fig. 5b). Pomerantzeff, in U.S. Patent No. 3,954,329, has proposed the pars plana regions for the irradiance area 29b on the sclera 4g, since their transmission coefficient for visible light is higher than that of the rest of the eye 4. Hence, the light can be focused on these regions 29b (Fig. 5a). The illuminated area 4b can also be a ring 29a around the iris 4e (Fig. 5b). The symmetric region 29a in particular enables an even distribution of light over the retina 4a. The scattering effect of the illuminated region 4b generates a nearly Lambertian light distribution in the interior of the eye 4, shown with short dashed lines. LEDs 26a/26b should be chosen based on the light spectrum that they generate. LEDs 26a/26b normally emit in the visible range. Based on eye safety characteristics, near-IR and near-UV should be avoided. The light intensity can be calculated based on the imaging span, the image sensor 2 brightness threshold, and the safety standards of the eye 4. Fig. 5a+b show that the lenses 27a/27b are not in contact with the sclera 4g. An LED-based light source results in high miniaturization and imposes no constraint on the imaging device shown in Fig. 1, since the illumination device shown in Fig. 5a+b and the imaging device shown in Fig. 1 are two separate setups that use different optical axes 8/14.
Moreover, as shown in Fig. 6, the device according to the invention requires less light to be exerted on the eye 4 than common ophthalmoscopes.
Fig. 6 schematically illustrates the light loss of common methods, where the aerial image 33 generated by a standard lens and thus curved is not directly captured with a sensor 38 but imaged onto the sensor by means of one or more field lenses 39. A human eye 4 is illuminated with a light intensity 32 generated by means of an illumination source 30, which for example comprises a LED 31. For example, the device shown in Fig. 5a+b can be used.
A fraction 36 of the light intensity emitted from the aerial image 33 escapes from the periphery 35 of the field lens 39. Furthermore, the intensity 34 of the light incoming to the field lens 39 is diminished by the reflecting and scattering effect of the field lenses 39. A brief calculation shows that if the transmission coefficient of every interface is X, the number of interfaces is n, and the fraction of light that enters the field lens is Y, then the light that reaches the sensor is Xⁿ · Y of the total light radiated toward the eye 4. Thus, the light intensity 37 reaching the sensor 38 is only a small fraction of the generated light 32. Therefore, a higher irradiance load must be exerted on the sensitive tissue of the retina 4a in common methods. As a result, safe retinal fundus imaging requires a short time span.
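The Xⁿ · Y relation above is easy to check numerically; the interface counts and transmission coefficients below are invented sample values, not measured ones:

```python
def fraction_reaching_sensor(X, n, Y):
    """Fraction of the radiated light that reaches the sensor when each of
    n optical interfaces transmits a fraction X and a fraction Y of the
    light enters the field lens at all (X**n * Y, as in the text)."""
    return (X ** n) * Y

# Conventional path: e.g. six interfaces at 96% transmission, 80% captured
# by the field lens (sample values).
conventional = fraction_reaching_sensor(0.96, 6, 0.80)
# Direct sensing: only the two surfaces of the condensing lens, no field lens.
direct = fraction_reaching_sensor(0.96, 2, 1.00)
assert direct > conventional  # less light is lost, so less must be radiated
```

Even with generous per-interface transmission, the conventional chain loses a substantially larger fraction, which is exactly why the direct-sensing arrangement allows lower source power.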
In contrast to previous methods, as shown in Fig. 1, the apparatus according to the invention introduces the minimum number of optical interfaces within the path of the imaging light. Moreover, since the aerial image 3 is sensed directly, the light loss is further minimized. Hence, the light source intensity can be minimal and the imaging duration can be increased accordingly. This feature of the invention is especially applicable to long-duration surgeries and minimally invasive diagnosis.
Hereinafter the relationship between the illumination source power and the observation duration is calculated. Fig. 7 shows the schematic construction of the illumination source 53 with respect to the eye 4.
We assume that the sclera 4g acts as a Lambertian surface. The radiance of the sclera 4g is given by:
L_s = d²Φ / (dΩ · da · cos θ)   (1)

where Φ is the radiant flux, dΩ is the solid angle subtended by the differential surface dA on the retina 4a as seen from da on the sclera 4g, and θ is the angle between the normal to the surface and the axis of the differential solid angle. For the solid angle dΩ inside the eye, we have:

dΩ = dA' / R² = (dA · cos θ') / R²   (2)

where dA' and R define the differential solid angle dΩ, dA is the differential area on the retina 4a subtended by the differential solid angle, and θ' is the angle between the retinal normal and the solid-angle axis. Hence, substitution results in:

L_s = (R² · d²Φ) / (dA · da · cos θ · cos θ')   (3)
Moreover, the irradiance of the retina 4a is given by:

E_r = L_s · Ω · C_vt   (4)

where C_vt is the transmission coefficient of the vitreous 4f and Ω is the solid angle subtended at the retinal point by the illuminated scleral area 4b. Furthermore, we have:

L_s = E_s · C_st   (5)

in which C_st is the transmission coefficient and E_s is the irradiance of the illuminated area 4b on the sclera. Therefore, for the radiant exposure of the retina 4a we have:

H_r = E_r · t   (6)

where t is the observation duration.
Moreover, the irradiance that the illumination source 53 produces on the sclera 4g is given by:

E_s = P_source / (Ω_max · a)   (7)
where a is the area of the illuminated sclera 4b and Ω_max is the maximum view angle of the light source 53. The eye irradiance also causes an increase in the temperature of the eye 4:

η = (P_source · C_sa · t) / (m · c)   (8)

where C_sa is the absorption coefficient of the sclera 4g, η defines the increase in the temperature of the eye 4, m is the mass of the eye 4, and c is its specific heat capacity.
Fig. 8 shows the relationship between the source power and the observation duration. This plot is for a special case in which C_sa = 0.25, m = 70 g, C_st = 0.25, C_vt = 0.9, H_threshold = 2.92 J·cm⁻² and Δt_threshold = 4 °C.
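Under the temperature-rise relation above, rearranged as t = η·m·c/(P_source·C_sa), the temperature-limited observation duration can be evaluated. The form of the relation, the source power, and the specific heat capacity c (taken as that of water) are assumptions of this sketch; only C_sa, m, and the 4 °C threshold come from the text:

```python
# Temperature-limited observation time from eta = P * C_sa * t / (m * c).
P_source = 0.05    # W, assumed optical power of the LED source
C_sa = 0.25        # absorption coefficient of the sclera (value from the text)
m = 70.0           # g, mass of the eye (value from the text)
c = 4.18           # J/(g*K), assumed: specific heat capacity of water
eta_thr = 4.0      # degC, allowed temperature increase (value from the text)

# Time until the allowed temperature increase is reached, in seconds.
t_limit = eta_thr * m * c / (P_source * C_sa)
assert t_limit > 0
```

At tens of milliwatts the thermal limit allows very long observation, so in practice the radiant-exposure threshold H_threshold, not heating, bounds the session.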
REDIRECTED AERIAL-IMAGE FOR ADDITIONAL EYE TREATMENTS
Fig. 9 shows how the imaging method described above can be applied not only to create an image of the retina but also to project a specific external image onto the retina, for example to conduct a treatment by irradiating the retina 4a with a specific intensity distribution of electromagnetic waves. The irradiation pattern is created by a movable light source 20. If the light source 20 is a point light source, its image corresponds to an illumination spot on the retina 4a. The location of this spot on the retina 4a depends on the position of the light source 20. The clinician can observe the retina 4a and define the position of the electromagnetic radiation source 20 simultaneously. The aspherical lens 1 and the partially reflecting mirror 12a create the image 3 of the retina 4a on the image sensor placed at the plane A-A. The target area 18 on the retina 4a is defined by the clinician. The transformed region 19 (the image of the target area) on the focal plane is the area over which the radiation source 20 should be navigated. Navigating the radiation source 20 on the focal plane A-A corresponds to navigating the irradiance point over the required retinal region 18. A partially reflecting mirror 12a separates the incoming intraocular light from the outgoing transpupillary electromagnetic beam 21. Ideally, the refractive index of the aspherical lens 1 should be essentially identical for the wavelength of the incoming light and the wavelength of the electromagnetic wave 21. The clinician can have real-time observation through the image sensor 2 and can simultaneously irradiate the required point of the interior of the eye 4.
LOCALIZATION
The retina's highly sensitive and small structures require accurate and precise surgeries. The invention enables accurate intraocular localization. The wide field of view, high auto-focus resolution, simple optical structure, and low illumination radiance of this method, in addition to the unoccupied periphery of the sclera 4g, are characteristics necessary for accurate long-term localization of intraocular devices 22 during surgeries. Fig. 10 illustrates the localization concept. This localization requires internal and external calibration of the apparatus with respect to the applied aspherical lens 1.
Different positions of the image sensor 2 with respect to the optical axis are indicated by planes a-a, b-b, ..., f-f. These positions correspond to isofocus surfaces a'-a', b'-b', ..., f'-f' within the eye: for example, all points on the isofocus surface b'-b' are imaged onto plane b-b. The lateral position of an object is derived by analyzing the position of its image within the acquired image. All points within the eye that are imaged onto the same pixel n1, n2, n3 on the sensor define an isopixel curve n1', n2', ...
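The mapping from an in-focus sensor-plane position to the depth of the corresponding isofocus surface can be sketched as interpolation over calibration data; the tiny calibration table below is invented for illustration, not taken from the patent:

```python
# Calibration data: for each in-focus sensor-plane position (mm along the
# optical axis), the depth of the corresponding isofocus surface inside the
# eye (mm). Values are hypothetical.
isofocus_table = {10.0: 24.0, 12.0: 22.5, 14.0: 20.0, 16.0: 16.0}

def depth_from_sensor_position(d, table=isofocus_table):
    """Linearly interpolate between calibrated isofocus entries."""
    xs = sorted(table)
    if d <= xs[0]:
        return table[xs[0]]
    for a, b in zip(xs, xs[1:]):
        if d <= b:
            t = (d - a) / (b - a)
            return table[a] + t * (table[b] - table[a])
    return table[xs[-1]]

# A sensor plane halfway between two calibrated planes maps to the
# interpolated isofocus depth.
assert depth_from_sensor_position(11.0) == 23.25
```

A full implementation would interpolate the isopixel curves the same way to recover the lateral coordinates.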
After calibration of the apparatus, the position of the intraocular device 22 can be extracted from the position of the image sensor planes a-a, b-b, etc. and the position of the image on the sensor. Fig. 10 also shows two images 3, 25 taken on the planes a-a and c-c. The image sensor 2 on the plane a-a is in focus with the retina 4a on isofocus surface a'-a' (image 3), whereas on the plane c-c it is in focus with the intraocular device 22 (i.e. the needle tip) on isofocus surface c'-c' (image 25). The sharpness of the image 25 is used for the localization of the object: the feedback of the apparatus focal length 23 and the position 24 of the image on the sensor (here pixel n2) define the position of the intraocular device 22. The corresponding intraocular curves of the sensor and pixel positions are illustrated. Based on the position of the pixel 24 (n2) on the image sensor 2 and the corresponding curve (n2') inside the eye, and on the focal length 23 and the curved surface b'-b' corresponding to the sensor plane, the three DOFs are defined. Based on this method, accurate localization and visualization in the posterior of the eye is achievable.
EXAMPLE 1: MINIATURIZED INTRAOCULAR IMAGING APPARATUS
Fig. 11 shows a sample of a miniaturized intraocular imaging apparatus based on the invention.
The device comprises an aspherical lens 1 and an image sensor 2, which are arranged in a common housing 56 and, within the housing 56, in an imaging apparatus cover 40.
The imaging apparatus cover 40 is coated with non-reflecting paint. The aspherical lens 1 is positioned inside this cover 40. The distance between the aspherical lens 1 and the eye pupil is fixed, e.g. by appropriate positioning elements (not shown), and equal to the working distance of the aspherical lens 1.
The aspherical lens 1 creates a flat aerial image directly on the image sensor 2. The applied aspherical lens 1 provides a field of view of 103° measured from the pupil, nearly covering the full retinal fundus, and a magnification of 0.72. In order to visualize objects in the full posterior of the eye, the apparatus should have a variable focal length of 10–20 mm.
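A quick numeric check of what the 0.72 magnification implies for resolvable feature size; the 6 μm pixel pitch is an assumed typical value, not a figure from the text:

```python
magnification = 0.72       # of the applied lens (value from the text)
feature_um = 100.0         # structure size on or near the retina, micrometres
image_um = feature_um * magnification
assert abs(image_um - 72.0) < 1e-9   # a 100 um feature maps to a 72 um image

pixel_pitch_um = 6.0       # assumed typical CMOS sensing-element pitch
pixels_spanned = image_um / pixel_pitch_um
assert pixels_spanned == 12.0        # the feature spans a dozen pixels: resolvable
```

So even without additional magnification, retinal structures of clinical interest cover enough pixels to be resolved directly on the sensor.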
In this example, the active area of the image sensor 2 is smaller than the aerial image. In order to acquire a complete image, the image sensor 2 moves over the aerial image plane generated by the aspherical lens 1, i.e. perpendicular to the optical axis of the lens 1. This movement is achieved by a rotational motor 48. The position of the sensor 2 in the direction of the optical axis can be adjusted by two linear motors 41.
Detailed descriptions of the parts will be presented hereinafter.
Fig. 11a shows the back view of the intraocular imaging apparatus. The cover 56 of the apparatus is cut away to illustrate the apparatus's mechanisms and circuitry. Two linear actuators, e.g. two motors 41, move the sensor 2 along the optical axis to focus on the desired intraocular object. A rotational mechanism, actuated by a motor 48, moves the image sensor 2 over a plane perpendicular to the imaging axis.
Fig. 11b shows the front view of the intraocular imaging apparatus. The light source lens 27a faces the sclera. A set of high-power LEDs 26a is mounted behind the lens 27a. A heat sink 47 is assembled behind the LEDs 26a and holds the apparatus cover 56.
Fig. 11c shows the exploded view of the intraocular imaging apparatus. The image sensor 2 is connected through a cable to the frame grabber and control circuit 46. The circuits 46 are mounted on a moving chassis 44. The moving chassis 44 is translated by the motor holders 45. The set of the light source lens 27a and the imaging device cover 40 acts as the stationary chassis. The stationary gear 43 acts as the holder of the aspherical lens 1 inside the stationary chassis. Two motors 41 translate the imaging device. A motor 48 rotates the imaging device, and therefore the image sensor 2, over a plane perpendicular to the imaging axis through the interaction of the rotary gear 49 and the stationary gear 43. The imaging device and motors are assembled on a rotating holder 42.
The user can control the position of the image sensor 2 through interface software. The mechanical mechanisms could be replaced by any improved motion mechanism. Obviously, optimizing the mechanical mechanisms would result in higher miniaturization and better performance.
The intraocular imaging apparatus can be placed on a glasses frame or a fixed base, or it can even be held manually by a technician.
50 is a mounting bracket for the sensor. 52 is a motor that tilts the sensor, passing through the nut 51. Motor 41, passing through bolt 42, enables the translation of the image sensor 2 for focusing purposes. Motor 48 rotates gear 49, which rotates gear 43, which rotates the image sensor to scan different parts of the aerial image.
Fig. 12 shows a simulation of the optical properties of a mechatronic vitreoretinal ophthalmoscope (MVO) based on indirect ophthalmoscopy, e.g. as shown in Fig. 11. The MVO is a simple device that consists of two components: a condensing lens 1 that is kept at a constant position with respect to the eye, and a sensor 2 that captures the aerial image directly and moves with respect to the lens to focus on objects throughout the eye. A lens as described above with respect to Fig. 1 is used as the condensing lens. A 24x24 mm2 CMOS sensor can be used to capture the full field-of-view. The condensing lens causes a magnification of 0.72 and thus, a structure of 100 μm on or near the retina will create an image of 72 μm. Even with no additional magnification, a CMOS sensor with a common sensing-element size of 6x6 μm2 will be able to resolve small retinal structures sufficiently. Dense CMOS sensors have a shallow depth-of-focus, and as a result, they can be used effectively in depth-from-focus techniques. To focus on objects throughout the eye, the sensor must have a travel of about 45 mm. However, since we are interested in localizing objects mainly in the posterior of the eye, a sensor travel of only about 10 mm is necessary.
An additional benefit of the MVO is that it minimizes the amount of illumination necessary for adequate image quality. Since the human eye, especially the retina, is a sensitive structure, any method that minimizes irradiance on it is preferable. Every lens surface attenuates the light passing through. Thus, for every additional surface we must exert an extra amount of light on the eye. The MVO allows the minimum light intensity to be exerted on the patient's eye by minimizing the number of lens surfaces within the light path to two. This minimal requirement for retina irradiance, in combination with the wide field-of-view, adequate spatial resolution, and ability to obtain depth information from focus, makes the MVO promising for imaging and localizing intraocular devices.
As previously stated, the condensing lens projects the spherical surface of the retina onto a flat aerial image. One expects that moving the sensor will focus the image at different surfaces inside the eye; we call these surfaces isofocus surfaces. Moreover, locations inside the eye will correspond to pixels on a moving sensor in a way that differs from the perspective projection model; the locus of intraocular points that is imaged on the same pixel coordinates is called an isopixel curve.
We estimate the isofocus surfaces and the isopixel curves with exact raytracing. Due to the rotational symmetry of the system, we examine the 2D case. For a grid of points inside a model eye (Navarro's eye), a fan of rays is traced to the sensor position. We position the sensor plane so that the spot size created by this ray fan is minimized (i.e. the image is in focus). The 2D coordinates on the sensor plane where the ray fan is focused specify the pixel coordinates on the image. With the calculated information we create the isofocus surfaces and isopixel curves. The results within the area of validity of the Navarro eye can be seen in Fig. 12. The position of an intraocular point can be estimated from the intersection of its isopixel curve with its isofocus surface determined from d_is.
Fig. 12 shows that the expected depth resolution is higher for regions far from the retina. Moreover, it shows that the formed image is inverted. From the slope of the isopixel curves, it is understood that the magnification of the intraocular object increases farther from the retina. As a result, we conclude that both spatial and lateral resolution increase for positions closer to the intraocular lens. Furthermore, as the intraocular lens is approached, the isofocus surfaces transition from spherical surfaces to planes.
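The "minimize the spot size" step of the raytracing can be illustrated with a simplified paraxial sketch: a ray fan from an on-axis point source through an ideal thin lens, with the candidate sensor plane swept along the axis. The real computation traces exact rays through the Navarro eye model; the distances and focal length here are arbitrary assumed values:

```python
import numpy as np

def spot_radius(z, s_o, f, fan_angles):
    """RMS spot radius at a plane z behind an ideal thin lens, for a
    paraxial ray fan launched from an on-axis point at distance s_o."""
    h = fan_angles * s_o            # ray height where each ray meets the lens
    u_out = fan_angles - h / f      # paraxial refraction: u' = u - h/f
    y = h + u_out * z               # ray height at the candidate sensor plane
    return float(np.sqrt(np.mean(y ** 2)))

s_o, f = 40.0, 12.0                 # assumed object distance and focal length, mm
fan = np.linspace(-0.05, 0.05, 11)  # launch angles of the ray fan, rad
zs = np.linspace(5.0, 45.0, 4001)   # candidate sensor positions, 0.01 mm steps
z_best = zs[np.argmin([spot_radius(z, s_o, f, fan) for z in zs])]
s_i = s_o * f / (s_o - f)           # thin-lens prediction for the focus plane
assert abs(z_best - s_i) < 0.01     # the sweep recovers the in-focus plane
```

In the paraxial case the sweep reproduces the thin-lens conjugate exactly; with real aspheric surfaces the minimum-spot plane is what defines each isofocus surface.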
The first and simplest method to consider for intraocular localization uses a paraxial model. Paraxial models are first-order models of the optics of a system of lenses, which hold in regions near the optical axis. The accuracy of these methods decreases as the angles of the incoming rays with respect to the optical axis increase, and the breakdown point of the approximations depends on the lenses that compose the combined optical system. We conducted a depth-from-focus experiment on an aligned imaging system (Fig. 13) that consists of the MVO and a model eye, in order to extract the precise relation between the intraocular object and the in-focus sensor position. A circular calibration pattern was moved in 1 mm steps in the model eye. The image sensor and object were translated using Sutter stages. The required focus scores were calculated. In order to extract a paraxial model of the condensing lens (which is symmetric around the optical axis), we fit first-order curves on its sides in an image. With thick-lens equations we can calculate the equivalent focal length and the principal planes for the condensing lens:
1/f_cl = (n_l − 1) · [1/R₁ − 1/R₂ + (n_l − 1) · d_cl / (n_l · R₁ · R₂)]   (1')

[V₁; H_c1] = −f_cl · (n_l − 1) · d_cl / (n_l · R₂)   (2')

[V₂; H_c2] = −f_cl · (n_l − 1) · d_cl / (n_l · R₁)   (3')
and for the compound imaging system:
1/f = 1/f_cl + 1/f_e − d_oe / (f_cl · f_e)   (4')

(5')   [H₁; H_c1] = f · d_oe / f_e
(6')   [H₂; H_c2] = −f · d_oe / f_cl
where d_oe = 31.0 mm, f_e = 30.00 mm, R₁ = 21.71 mm and R₂ = −11.08 mm (the radii of surfaces 1 and 2 of Fig. 13, respectively) were measured, d_cl = 6.22 mm was empirically chosen based on preliminary experiments, and n_l = 1.65 is an estimated refractive index of the condensing lens (see Fig. 13 for the explanation of each parameter). The operator [·;·] denotes the signed distance between two points.
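With the measured parameters, the thick-lens and two-lens combination relations can be evaluated numerically. The sign conventions and the standard combination formula 1/f = 1/f₁ + 1/f₂ − d/(f₁f₂) are assumptions of this sketch:

```python
# Equivalent focal length of the condensing lens (thick lensmaker's equation).
n_l, R1, R2, d_cl = 1.65, 21.71, -11.08, 6.22   # values from the text, mm
P_cl = (n_l - 1) * (1/R1 - 1/R2 + (n_l - 1) * d_cl / (n_l * R1 * R2))
f_cl = 1 / P_cl                                 # ~12.2 mm

# Compound system: condensing lens combined with the eye (focal length f_e),
# separated by d_oe, using the standard two-lens combination formula.
f_e, d_oe = 30.0, 31.0                          # values from the text, mm
f = 1 / (1/f_cl + 1/f_e - d_oe / (f_cl * f_e))  # ~32.7 mm

assert 12.0 < f_cl < 12.5   # condensing lens is a strong positive element
assert 32.0 < f < 33.5      # compound system is considerably weaker
```

These intermediate values feed directly into the object-image relations of the next paragraph.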
The compound system projects an object at distance S_o to an image on the sensor at distance S_i:

1/S_o + 1/S_i = 1/f   (7')

S_o = S_i · f / (S_i − f)   (8')

S_i = d_is + [H₂; H_c2]   (9')
where d_is is our control variable. Due to measurement and calculation uncertainties, there are accumulated errors that can be lumped and included as errors in the estimated image (e_si) and object (e_doe) positions.
In the model eye it is possible to calibrate for the errors using the actual values, and the result is shown in Fig. 14. In reality, limited access to the interior of the human eye makes full calibration impossible. However, from biometric measurements, the position of the retina can be known. Assuming that we have knowledge of the true retinal position, we can calibrate our system and estimate the errors e_si and e_doe from only one initial observation. In Fig. 14 the biometric model is fit to the actual values using just one point for calibration. Using this method, the accuracy will always be high near the retina, but may decrease for regions closer to the pupil.
In order to perform full 3D localization after estimating the intraocular object depth, we take advantage of the concept of the nodal points of a lens system: a ray reaching the first nodal point exits the system from the second nodal point at the same angle. The lens system can thus be eliminated and treated as a pinhole camera located at the first nodal point. The focal length of the pinhole camera is S_i calculated from (7'). As a result, we can estimate the 3D coordinates of an intraocular point from its pixel coordinates and the sensor position:

[x; y; z] = [(u − u_o) · k_x · S_o / S_i;  (v − v_o) · k_y · S_o / S_i;  S_o]   (10')

where S_o is given in (8'), (u; v) and (u_o; v_o) are the 2D pixel coordinates of the intraocular point and of the center of the image, respectively, and k_x, k_y are the dimensions of the pixel elements.
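The pinhole back-projection described above can be sketched as a round trip: project a known intraocular point to pixel coordinates, then recover it. The pixel pitch, image center, and distances below are illustrative assumed values:

```python
def backproject(u, v, u0, v0, kx, ky, S_i, S_o):
    """3D coordinates of an intraocular point from its pixel coordinates,
    treating the lens system as a pinhole of focal length S_i at the
    first nodal point; the depth S_o comes from the focus measurement."""
    x = (u - u0) * kx * S_o / S_i
    y = (v - v0) * ky * S_o / S_i
    return (x, y, S_o)

kx = ky = 0.006            # assumed pixel pitch, mm
S_i, S_o = 17.0, 40.0      # assumed image and object distances, mm
x, y = 1.2, -0.6           # a known intraocular point, mm (lateral coords)
u = x * S_i / (S_o * kx) + 320.0   # forward pinhole projection,
v = y * S_i / (S_o * ky) + 240.0   # assumed image center (320, 240)
rx, ry, rz = backproject(u, v, 320.0, 240.0, kx, ky, S_i, S_o)
assert abs(rx - x) < 1e-9 and abs(ry - y) < 1e-9 and rz == S_o
```

This sketch omits the image inversion noted earlier; in a real calibration the sign is absorbed into k_x, k_y.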
In order to estimate the area of validity of the paraxial approximation, we consider points on the retina of the model eye, and we perform a localization experiment for various angles with respect to the optical axis, and decreasing distances from the pupil. The results are shown in Fig. 15. Localization errors are minimal for angles smaller than approximately 10° from the optical axis.
Fig. 16 shows an exploded view of the main components of a camera device 100, which additionally comprises a projector 9 for projecting an image into the eye, e.g. as schematically shown in Fig. 9. The camera device 100 comprises an aspherical lens 1 as well as a sensor 2. The sensor is mounted on a holder 106, which is coupled with a base element 105. The sensor 2 can be moved in the direction of the imaging axis (perpendicular to the sensor plane) by means of a linear motor 41. The motor 41 is arranged in a housing 102 and coupled with the sensor 2 and the sensor base element 105, respectively, via elements 103 (nut of the bearing) and 104 (bearing core). The components of the camera device 100 are arranged in a common housing 56. The housing 56 also houses a beam splitter 12a, e.g. a partially reflective mirror. The housing 56 is inserted in a cylindrical bore in a projector chassis 118. An illumination system as described above with respect to Fig. 10, comprising a plurality of LEDs 26a and a corresponding ring-shaped lens 27a as well as a heat sink 47, is also present.
The projector 9 comprises an image generator with a light source, here a laser diode 117. The light source is held by a laser diode holder 116. It can be moved by means of two actuators, i.e. tilted by motor 113 and rotated by motor 109, in a plane perpendicular to the imaging axis. The movable and stationary parts of the projector further comprise: a projector head 110, a coupling 111, a rotary chassis 112 of the projector, and a nut 115 of the linear motor 113. Adjustment of the depth of focus of the projector 9 is not possible with this configuration, but another actuator for adjusting the position of the light source along the optical axis could be added.
The camera 100 with integrated projector 9 can be used to project an image into the eye or to illuminate a certain spot on the retina, for example for radiotherapy of the retina, and for simultaneous observation thereof.

Claims

1. A method for intraocular imaging comprising
generating an aerial image of a structure within the eye with at least one aspherical lens;
directly capturing the aerial image with an image sensor in such a way that no additional field lenses are needed to concentrate or further focus the aerial image.
2. The method according to Claim 1, wherein the image sensor is moved relative to the aspherical lens, preferably in the direction of the optical axis of the aspherical lens, in order to arrange the plane of the sensor in the plane of the aerial image for achieving better image sharpness.
3. The method according to Claim 1 or 2, wherein an image is displayed on the plane of the image sensor to guide the patient to focus on the image sensor, thus utilizing the natural focusing of the patient's eye to achieve a sharper intraocular image of the patient.
4. The method of Claim 3, wherein the image is created utilizing partially reflecting mirrors to guide the patient to focus at a distance equivalent to the location of the image sensor, thus utilizing the natural focusing of the patient's eye to achieve a sharper intraocular image of the patient.
5. The method according to one of the preceding claims, wherein the sensed aerial image is that of the retinal fundus.
6. The method according to one of the preceding claims, wherein the at least one aspherical lens is designed such that the aerial image of the retinal fundus is essentially flat.
7. The method according to one of the preceding claims, comprising:
moving the image sensor with a large range of travel to enable localization and/or visualization of intraocular objects throughout the posterior of the eye.
8. The method according to claim 7, comprising:
- moving the image sensor in the direction of the optical axis of the aspherical lens in order to acquire a sharp image of an intraocular object;
monitoring the position of the sensor;
deriving information on the position of the intraocular object along the optical axis from the information on the position of the sensor.
9. The method according to claim 8, further comprising:
deriving further information on the position of the intraocular object from the information on the position of the image of the object within the acquired image.
10. The method according to claim 8 or 9, further comprising inserting the object, in particular an untethered microrobotic device, into the eye.
11. The method according to claim 4, comprising moving the image sensor with a large range of travel to enable localization and/or visualization of intraocular objects throughout the posterior of the eye while utilizing the natural focusing of the patient's eye.
12. The method according to one of the preceding claims, wherein an image sensor that is smaller than the total aerial image created by the aspherical lens is used and is moved over the aerial image while capturing portions of it.
13. The method according to claim 12, wherein the portions captured by the image sensor are combined by means of a data processing unit that is connected to the image sensor.
14. A method for projecting a predetermined image into the eye, preferably on the retina, comprising
arranging at least one aspherical lens in front of the eye;
providing an essentially flat screen generating the predetermined image, in particular by emitting radiation,
- directly projecting the desired image into the eye by means of the aspherical lens.
15. The method as claimed in claim 14, wherein the at least one aspherical lens is designed such that the projected image of the screen corresponds to the shape of the retinal fundus.
16. An apparatus for intraocular imaging and/or treatment, comprising
- at least one aspherical lens; at least one image sensor mounted in such a way with respect to the at least one aspherical lens that it directly captures an aerial image generated by the at least one aspherical lens,
wherein no additional field lenses are present to concentrate or further focus the aerial image.
17. Apparatus according to claim 16, further comprising a focusing mechanism for arranging the plane of the sensor in the plane of the aerial image by adjusting the respective positions or by utilizing the natural focusing of the patient's eye.
18. Apparatus according to claim 17, wherein the sensor is movably coupled with the at least one aspherical lens in such a way that it is movable in the direction of the optical axis of the lens, in order to arrange the plane of the sensor in the plane of the aerial image, preferably by a corresponding actuator.
19. Apparatus according to claim 17, wherein the focusing mechanism comprises a display unit for displaying an image on the plane of the image sensor, or at a distance equivalent thereto, to guide the patient to focus on the image sensor or at an equivalent distance.
20. Apparatus according to one of claims 16-19, wherein the sensor is movably coupled with the at least one aspherical lens in such a way that it is movable in a direction perpendicular to the optical axis of the lens, in order to acquire a complete image by capturing portions of it at different positions, preferably by a corresponding actuator.
21. Apparatus according to one of claims 16-20, further comprising a housing accommodating the at least one aspherical lens and the image sensor, wherein the housing is adapted to be arranged at a predetermined distance from the eye such that the at least one aspherical lens is arranged at a predetermined distance and orientation with respect to the eye.
22. Apparatus according to claim 21, wherein the housing is designed such that, when in use, a first optical surface of the at least one lens is placed in direct contact with the cornea, or at a predetermined distance therefrom.
23. Apparatus according to one of claims 16-22, further comprising an illumination device for preferably transscleral illumination, which is preferably arranged such that it is neither in contact with the patient's eye nor obscures the optical path of any existing imaging system.
24. Apparatus according to one of claims 16-23, further comprising an essentially flat screen generating a predetermined image, in particular by emitting radiation, for projection of the image into the eye.
25. Apparatus according to one of claims 16-23, comprising an irradiation device that directly projects an image into the eye by means of the aspherical lens, preferably projecting the image onto the retina.
26. Apparatus according to one of the claims 16-25, further comprising partially reflecting mirrors to guide the patient to focus at a distance equivalent to the location of the image sensor, thus utilizing the natural focusing of the patient's eye to achieve a sharper intraocular image of the patient.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP08016979 2008-09-26
EP08016979.0 2008-09-26

Publications (2)

Publication Number Publication Date
WO2010034502A2 true WO2010034502A2 (en) 2010-04-01
WO2010034502A3 WO2010034502A3 (en) 2010-05-27


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017121464A (en) * 2015-08-05 2017-07-13 フェニックス テクノロジー グループ インコーポレイテッド Wide-field retinal imaging system
US9781412B2 (en) 2015-02-04 2017-10-03 Sony Corporation Calibration methods for thick lens model
WO2023102059A1 (en) * 2021-11-30 2023-06-08 Innovative Drive Corporation Intraocular cyclophotocoagulation device and methods of use

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3954329A (en) * 1972-09-25 1976-05-04 Retina Foundation Wide-angle opthalmoscope employing transillumination
US5661537A (en) * 1995-03-16 1997-08-26 Nikon Corporation Ophthalmic auxiliary instrument and ophthalmic system using the same instrument
WO2006119349A2 (en) * 2005-04-29 2006-11-09 Novadaq Technologies, Inc. Choroid and retinal imaging and treatment system
WO2007089629A2 (en) * 2006-01-26 2007-08-09 Volk Optical Inc. Improved diagnostic ophthalmic lens using extra-low dispersion (ed) material


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9781412B2 (en) 2015-02-04 2017-10-03 Sony Corporation Calibration methods for thick lens model
JP2017121464A (en) * 2015-08-05 2017-07-13 Phoenix Technology Group Incorporated Wide-field retinal imaging system
US10893803B2 (en) 2015-08-05 2021-01-19 Phoenix Technology Group Llc Wide-field retinal imaging system
JP7075178B2 (en) 2015-08-05 2022-05-25 フェニックス テクノロジー グループ インコーポレイテッド Wide-field retinal imager and how it works
WO2023102059A1 (en) * 2021-11-30 2023-06-08 Innovative Drive Corporation Intraocular cyclophotocoagulation device and methods of use

Also Published As

Publication number Publication date
WO2010034502A3 (en) 2010-05-27

Similar Documents

Publication Publication Date Title
JP5066094B2 (en) Optical medical treatment system and method using virtual image aiming device
JP2021098043A (en) Corneal topography measurement and alignment of corneal surgical procedures
EP2517617B1 (en) Digital eye camera
US7922327B2 (en) Apparatus and method for illuminating and viewing the anterior segment of an eye of a patient
CN109963535A (en) Integrated form ophthalmic surgical system
JP5928844B2 (en) Corneal confocal microscope (CCM)
KR20140001865A (en) Electronically controlled fixation light for ophthalmic imaging systems
JP2017519564A (en) Diagnostic and surgical laser apparatus using visible laser diodes
US11911103B2 (en) Personalized patient interface for ophthalmic devices
JP7343331B2 (en) Ophthalmological device, its control method, program, and recording medium
JP2024040337A (en) slit lamp microscope
WO2010034502A2 (en) Ophthalmoscopy using direct sensing of the flat aerial-image created by an aspheric lens
JP2021112610A (en) Image processing method and ophthalmologic image system
US11490810B2 (en) Reflectometry instrument and method for measuring macular pigment
JP2005287782A (en) Device and method of aligning optical system, and device and method of three-dimensional observation state measurement using the same
US20230337912A1 (en) System, device and method for portable, connected and intelligent eye imaging
EP3668370B1 (en) Miniaturized indirect ophthalmoscopy for wide-field fundus photography
JP7437931B2 (en) slit lamp microscope
KR20140112615A (en) Optical coherence tomography device having mire ring light source
CN106999039B (en) Lens system for eye examination
JP2020014761A (en) Ophthalmologic imaging apparatus
Yoder Jr Integration of a precision stereo microscope into an excimer-laser-beam delivery system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09778716

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09778716

Country of ref document: EP

Kind code of ref document: A2