EP3019070A1 - Method and apparatus for retinal fundus monitoring - Google Patents

Method and apparatus for retinal fundus monitoring (Verfahren und Vorrichtung zur Augenhintergrundüberwachung)

Info

Publication number
EP3019070A1
Authority
EP
European Patent Office
Prior art keywords
retinal
eye
light
imaging
wavelength
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP14822887.7A
Other languages
English (en)
French (fr)
Other versions
EP3019070A4 (de)
Inventor
David Alexander Kahn
Ian Powell
Alan Boate
Jeremy Lloyd Gribben
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Annidis Health Systems Corp
Original Assignee
Annidis Health Systems Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 13/940,668 (US8807751B2)
Application filed by Annidis Health Systems Corp filed Critical Annidis Health Systems Corp
Publication of EP3019070A1
Publication of EP3019070A4

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/12 Objective types for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B 3/14 Arrangements specially adapted for eye photography
    • A61B 3/15 Arrangements specially adapted for eye photography with means for aligning, spacing or blocking spurious reflection; with means for relaxing
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/145 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B 5/1455 Measuring characteristics of blood in vivo using optical sensors, e.g. spectral photometrical oximeters
    • A61B 5/14551 Measuring characteristics of blood in vivo using optical sensors for measuring blood gases
    • A61B 5/14555 Measuring characteristics of blood in vivo using optical sensors for measuring blood gases, specially adapted for the eye fundus
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7271 Specific aspects of physiological measurement analysis
    • A61B 5/7285 Specific aspects of physiological measurement analysis for synchronising or triggering a physiological measurement or image acquisition with a physiological event or waveform, e.g. an ECG signal
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/21 Polarisation-affecting properties
    • G01N 21/55 Specular reflectivity
    • G01N 2201/00 Features of devices classified in G01N21/00
    • G01N 2201/06 Illumination; Optics
    • G01N 2201/062 LED's

Definitions

  • the present invention relates generally to a method and apparatus for imaging of the retinal fundus. More particularly, the present invention relates to a method and apparatus for quantitative imaging of the retinal fundus.
  • the fundus of the eye, or retina, is a complex layered structure arranged in an approximately spherical shape at the back of the eyeball. It contains the light-sensing rods and cones that enable vision. It is nourished by oxygenated blood supplied through arterioles and removed through venules. The nerve impulses from the rods and cones are directed to the brain through the optic nerve, which exits the fundus at the location corresponding to the blind spot.
  • Direct visual observation of the retinal fundus can be accomplished using an ophthalmoscope, an instrument that has been around in various forms for over 150 years.
  • the ophthalmoscope employs a light source, means for coupling the light into the eye through the pupil, and means for collecting light reflected back from the fundus and presenting an image of the fundus to the observer.
  • the eye responds to continuous light by constricting the pupil size and so reducing the amount of light available to form an image. For this reason, the eye pupil may have to be chemically dilated using a mydriatic.
  • a fundus camera is similar to the ophthalmoscope but provides a permanent record of the fundus image in the form of a photograph. It also enables the use of a short, powerful flash of light to replace the continuous light required for the ophthalmoscope, sometimes avoiding the need for a mydriatic.
  • the fundus camera uses an electronic image sensor such as a charge-coupled device (CCD) or a scientific grade CMOS image sensor, and the image is stored electronically. It may be displayed on a monitor or printed out as a photograph.
  • the fundus image is dominated by the appearance of the optic nerve and the vascular structure of arterioles and venules. It is substantially of the color red, this coming from the blood, with some regions having an orange or yellow bias.
  • the ophthalmologist is able to use this visual image to aid in the diagnosis of the health of the eye. Thorough diagnosis requires the use of a battery of other oculometric instruments in addition to the fundus camera.
  • the present disclosure provides a method for assessing retinal health of an eye of a patient.
  • the method comprises: sequentially imaging the retinal fundus of the eye at a series of wavelengths, a time duration of the imaging of the retinal fundus at an individual wavelength being an individual wavelength imaging time duration, a sum of all individual wavelength imaging time durations being less than a constriction latency period of the eye, the imaging being effected with an imaging device comprising a plurality of pixels; determining a spectral reflectivity of the retina for each pixel within a field of view by comparing, for each pixel, the illuminating light energy with a reflected light energy on the basis of specular retinal reflectivity and diffuse retinal reflectivity data of the retinal fundus image; and assessing retinal health based on the spectral reflectivity of the retina.
  • the present disclosure provides a system for assessing retinal health of an eye of a patient.
  • the system comprises: an optical unit for sequentially imaging the retinal fundus of the eye at different wavelengths, the optical unit having an imaging sensor onto which the retinal fundus is imaged, the imaging sensor comprising pixels; a light source module having light sources, each light source corresponding to one of the different wavelengths, the light source module also having a light source selection means to select a light source with which to illuminate the retinal fundus at an illuminating light energy, the light source selection means to switch from one light source to another in a switching time, a time duration of the imaging of the retinal fundus at an individual wavelength being an individual wavelength imaging time duration, a sum of all individual wavelength imaging time durations and of each switching time required to switch from one light source to another being less than a constriction latency period of the eye; and a processor for determining spectral reflectivity of the retina for each pixel within a field of view by comparing, for each pixel, the illuminating light energy with a reflected light energy.
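As a rough sketch of the timing constraint stated above (the per-wavelength exposures plus the source-switching times must together fit inside the eye's constriction latency period), the following check uses illustrative numbers; the 200 ms latency value and all durations are assumptions, not figures from this disclosure.

```python
def within_constriction_latency(exposures_ms, switch_ms, latency_ms=200.0):
    """Return True if a multispectral capture sequence fits inside the
    pupil's constriction latency period.

    exposures_ms: per-wavelength imaging durations (ms)
    switch_ms: time to switch between consecutive light sources (ms)
    latency_ms: assumed constriction latency of the eye (illustrative)
    """
    n_switches = max(len(exposures_ms) - 1, 0)
    total = sum(exposures_ms) + n_switches * switch_ms
    return total < latency_ms

# Example: four wavelengths at 30 ms each, with 20 ms per source switch,
# total 120 + 60 = 180 ms, inside the assumed 200 ms latency
ok = within_constriction_latency([30, 30, 30, 30], 20)
```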
  • FIG. 1 is a schematic block diagram of the retinal fundus imaging system according to an embodiment.
  • FIGs. 2A and 2B are a side elevation and a top view, respectively, of the retinal fundus imaging system according to an embodiment.
  • Fig. 3 is a longitudinal cross-section of Fig. 2A.
  • Fig. 4A is a cross-section along B-B of Fig. 2A illustrating the optical head of the retinal fundus imaging system according to an embodiment.
  • Fig. 4B is a cross-section along C-C of Fig. 2A.
  • FIG. 5 is an illustration of the optical path during retinal illumination according to an embodiment.
  • Fig. 6 is an illustration of the optical path during power monitoring according to an embodiment.
  • Fig. 7 is an illustration of the optical path during retinal imaging according to an embodiment.
  • FIG. 8 is an illustration of the optical path during corneal imaging according to an embodiment.
  • Fig. 9 is an illustration of the optical path during fixation of targets according to an embodiment.
  • Fig. 10 is a flowchart showing the method of quantitative imaging of the retinal fundus according to an aspect of the present invention.
  • Fig. 11 shows a graph of optical power as a function of time in accordance with certain embodiments of the present disclosure.
  • Fig. 12 shows a flowchart of a method according to certain examples of the present disclosure.
  • Fig. 13 shows a light source selection module within context of a retinal illumination optical path, in accordance with an embodiment of the present disclosure.
  • Fig. 14 shows an expanded view of the light source selection module shown in Fig. 13.
  • Fig. 15 shows another embodiment of a light source selection module in accordance with the present disclosure.
  • Fig. 16 shows yet another embodiment of a light source selection module in accordance with the present disclosure.
  • FIG. 17 shows an embodiment of a system for assessing retinal health of an eye of a patient in accordance with the present disclosure.
  • FIG. 18 shows a block diagram of an embodiment of an imaging device positioning system in accordance with the present disclosure.
  • Fig. 19 shows an example of how an eye can be illuminated through the sclera by a pair of light source modules.
  • the color perception of the human eye is variable. No two people perceive the same color image in the same way, and in some cases, one may suffer from a form of colorblindness, commonly an inability to distinguish red from green. As there is only a very minor blue component in a retinal image, red-green color blindness effectively removes all color information.
  • the color perception of the human eye is also conditioned by the intensity and spectrum of the environmental lighting; the background illumination may come from daylight, some form of fluorescent lighting, or incandescent lighting.
  • any photograph or display is limited by the gamut of colors enclosed by the specific three primary colors employed. The process and manufacturing tolerances will result in a spread from one photograph or display to another, which will be compounded by aging effects and the impact of environmental influences such as temperature.
  • Visual observation of the fundus is essentially a rudimentary form of multispectral imaging where the three color channels correspond to those of the observing eye.
  • the spectral sampling locations and widths of the three visual color channels do not necessarily correspond to those that would be chosen optimally according to the reflection characteristics of the retina associated with specific retinal diseases or defects.
  • the limitations of the display and its perception are further compounded by the uncertainties associated with the generation of the image.
  • the illumination source energy will vary from camera to camera, from time to time, and with age. This will result in concomitant variations in apparent image brightness.
  • the sensitivity of the image sensor, be it film or electronic (CCD or CMOS image sensor), will vary from unit to unit. This will also result in concomitant variations in apparent image brightness.
  • the optical transmission efficiency is not always high, especially in the presence of cataracts. The efficiency will also vary across the spectrum. This will result in concomitant variations in apparent image brightness and color.
  • the amount of illumination that is reflected from the retina is strongly dependent on the size of the pupil. As the size of the pupil varies greatly from person to person and with environmental lighting conditions, this will result in concomitant variations in apparent image brightness.
  • the reflectivity of the retina is strongly dependent on the ethnicity of the person, as a consequence of the different concentrations of melanin. People of African ethnicity have higher melanin concentrations, resulting in low retinal reflectivity; this causes dark retinal images that are difficult to interpret.
  • Ophthalmologists need to carefully track the progression of the retinal health problems of their patients in order to prescribe the most appropriate course of treatment. For this purpose, they carry out examinations over time to establish longitudinal trends. However, because of the variations and uncertainties listed above, the utility of fundus cameras for longitudinal monitoring is severely limited.
  • the present invention provides a method and apparatus for quantitative imaging of the retinal fundus.
  • the method for retinal health assessment comprises imaging the retinal fundus of a patient's eye at different wavelengths within a spectral range and determining spectral reflectivity of the retina for each pixel within a field of view (FOV).
  • the retinal health is assessed based on the spectral reflectivity of the retina.
  • the retinal fundus is illuminated with an illuminating light energy and the spectral reflectivity is determined based on a comparison, on a pixel-by-pixel basis, of the illuminating light energy with a reflected light energy.
  • each and every point on the retinal fundus image, equal in size to the area of a pixel, can be individually monitored and analyzed for obtaining a retinal health assessment.
  • Information about retinal health previously unavailable can be obtained from the enhanced pixel-by-pixel evaluation of the image data.
  • the quantitative fundus surveillance instrument generates spectral reflectivity data based upon the capture and analysis of a sequence of substantially mono-spectral retinal images.
  • the electromagnetic spectrum within which these images may be captured can extend over the entire ocularly transparent spectral region that includes the visible spectrum and infrared spectrum (i.e., between 400 and 1400 nm).
  • the image data obtained with an embodiment of the instrument is calibrated in terms of diffuse retinal reflectivity and the specular retinal reflectivity.
  • the diffuse or scattered reflection from the retina is modeled by that of a Lambertian surface where the reflected light is directed over an entire hemisphere according to the cosine law of distribution.
  • the ratio of diffusely reflected light energy to incident light energy is governed by the surface reflectivity, a dimensionless quantity with a value that lies between zero and one.
  • the retinal reflectivity is a function of wavelength and other factors, and generally lies in the region between 0.001 and 0.02, the former being typical at the shortest (blue) wavelength and the latter occurring at infrared wavelengths for eyes with low melanin content.
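The Lambertian model described above implies that only a small, geometry-determined fraction of the diffusely reflected light re-exits through the pupil. The sketch below computes that fraction for an on-axis pupil under the usual small-solid-angle approximation; the pupil diameter and retina-to-pupil distance are illustrative assumptions, not figures from this disclosure.

```python
import math

def collected_fraction_lambertian(pupil_area_mm2, distance_mm):
    """For a Lambertian reflector, the radiance is uniform over the
    hemisphere (cosine-law intensity distribution), so the fraction of
    diffusely reflected light collected by an on-axis pupil of area A
    at distance s is approximately A / (pi * s^2)."""
    return pupil_area_mm2 / (math.pi * distance_mm ** 2)

# Illustrative geometry: 3 mm pupil diameter, 17 mm retina-to-pupil distance
pupil_area = math.pi * (3.0 / 2) ** 2          # mm^2
fraction = collected_fraction_lambertian(pupil_area, 17.0)
# fraction is under 1% of the diffuse return, before any reflectivity loss
```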
  • the reflection of light from the retina is not entirely of a diffuse character. A small portion of the incident light is reflected in a specular or mirror-like fashion.
  • the specular or mirror-like reflectivity indicates the flatness of the surface and tends to be relatively independent of wavelength. Unless it is factored into the calculations following measurement, it would introduce small errors into the estimation of the diffuse reflectivity.
  • the instrument includes a measurement means to distinguish between the two modes of reflection.
  • the diffuse spectral reflectivity is characteristic of the chemical composition of the organic tissue just behind the retinal surface. While the spectral reflectivity profile is indicative of certain health conditions applying to vision, it is also indicative of other health conditions such as diabetes.
  • the eye functions as a window to the blood and thus enables non-invasive blood analysis. It may therefore be appreciated that the instrument, according to an embodiment, includes capabilities beyond those that are strictly of interest to ophthalmologists.
  • the specular reflectivity is substantially independent of the chemical composition behind the retinal surface and is substantially constant over the spectrum. Measurement of the specularly reflected light can be used to estimate the spectral absorption within the ocular lens and the aqueous and vitreous humours (intra-ocular transmission losses), parameters that are also of medical interest and that would otherwise contribute uncertainty to the diffuse reflectivity measurement. The spectral absorption can also be factored into the calculations following measurement, to avoid the introduction of errors into the estimation of the diffuse reflectivity.
  • the measurement of retinal reflectivity is not the only non-invasive measurement operation that can be implemented by an embodiment of the instrument.
  • the retina has the property of auto-fluorescence whereby it absorbs light at one wavelength and simultaneously emits light at a longer wavelength.
  • the strength of the auto-fluorescence is governed by the presence and concentration of lipofuscin and drusen, both of which are indicative of ocular health conditions.
  • the instrument, according to an embodiment, has the capability to illuminate the retina at one wavelength while capturing the retinal image emitted at another, for example longer, wavelength.
  • the image data is calibrated in terms of the retinal auto-fluorescence factor, a dimensionless quantity analogous to reflectivity but generally having a much lower value.
  • the embodiments of the instrument are capable of several types of measurement, including mapping retinal spectral reflectivity, measuring interior specular absorption, and mapping retinal auto-fluorescence, employing retinal illumination having wavelengths anywhere between 400 and 1400 nm. In this way, it has greater value to the ophthalmologist who would otherwise have to invest in additional instruments, if available, and devote more time to the patients.
  • N ∝ η·A·r·τ²·T·U/(π·s²·M) Eqn. (1);
  • N is the number of photoelectrons received per pixel
  • η is the quantum conversion efficiency of the image sensor
  • A is the pupil area
  • r is the retinal diffuse reflectivity
  • τ is the one-way transmission through the eye
  • s is a dimensional parameter of the eye
  • T is the transmission through the image viewing optical path
  • U is a uniformity factor applying to the illumination field
  • M is the ratio between the illumination field solid angle and the pixel field solid angle.
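The scale of the per-pixel signal implied by Eqn. (1) can be illustrated numerically. The functional form below (launched photons attenuated by the quantum efficiency, the diffuse reflectivity, the double-pass eye transmission, the optics transmission, the uniformity factor, and the pupil collection factor A/(π·s²·M)) and all of the numeric values are illustrative assumptions, not figures from this disclosure.

```python
import math

H = 6.626e-34   # Planck constant, J*s
C = 3.0e8       # speed of light, m/s

def photoelectrons_per_pixel(E_pulse_J, wavelength_m, eta, A_m2, r, tau,
                             s_m, T, U, M):
    """Photoelectrons per pixel following the structure of Eqn. (1):
    launched photons, reflected diffusely (r), attenuated by the double
    pass through the eye (tau^2), collected by the pupil (A / (pi*s^2)),
    transmitted by the viewing optics (T), weighted by the illumination
    uniformity (U), and divided among pixels via the solid-angle ratio M."""
    photons_launched = E_pulse_J * wavelength_m / (H * C)
    return (photons_launched * eta * r * tau ** 2 * T * U
            * A_m2 / (math.pi * s_m ** 2 * M))

# Illustrative values: 75 uJ pulse at 617 nm, QE 0.5, 3 mm pupil,
# r = 0.01, tau = 0.9, s = 17 mm, T = 0.5, U = 0.9, M = 4e6 pixels
pupil_area = math.pi * (1.5e-3) ** 2
n = photoelectrons_per_pixel(75e-6, 617e-9, 0.5, pupil_area, 0.01, 0.9,
                             17e-3, 0.5, 0.9, 4e6)
# yields on the order of several hundred photoelectrons per pixel
```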
  • the instrument, in an embodiment, generates objective quantitative data for every pixel in addition to being able to generate images.
  • the numerical data for each pixel presents an approximation to the absolute retinal reflectivity (or fluorescence coefficient) at the sampling wavelength. This can be calculated because the instrument not only counts the photoelectrons received in each pixel, but also measures the total amount of energy launched and the diameter of the pupil.
  • the total area of the pupil can be measured and used to normalize the reflected light energy to determine the spectral reflectivity of the retina independent of the total area of the pupil.
  • the surface topology information of various reflective layers of the retina can be obtained and used for assessing the retinal health. This level of absolute measurement enables more information and greater reliability of assessment to be obtained.
  • the total launched energy is measured using an internal energy monitor, while the pupil diameter is measured when the image of the cornea is captured when alignment is complete.
  • an embodiment of the instrument can measure retinal oxygenation.
  • Retinal oxygenation is typically assessed by measuring the reflectivity at two or more carefully chosen wavelengths. These wavelengths have conventionally been located in the visible region, compatible with conventional instruments that are restricted to making measurements at one location, or to making measurements only of arterioles and venules.
  • oxygenation estimates for the full retinal area can be obtained. This is achieved using at least four models each addressing a different region, specifically the optic nerve, the fovea, arterioles/venules, and the remaining area - the preponderance.
  • an embodiment of the instrument uses Light Emitting Diodes (LED) with relatively broad Gaussian spectra.
  • the sources located near 505 nm, 617 nm and 850 nm are chosen. This is a unique permutation. The rationale is that the 505 nm measurement gives a result substantially independent of oxygenation or pigmentation, the 617 nm measurement gives a result strongly dependent on oxygenation but also influenced by pigmentation, while the 850 nm measurement gives a result strongly influenced by pigmentation and also influenced by oxygenation. Combining these enables one to substantially eliminate the pigmentation factor and determine the oxygenation level.
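The elimination argument above can be illustrated with a toy linear model in which each wavelength's measurement is a weighted sum of an oxygenation term and a pigmentation term, with the 505 nm channel serving as a baseline that is nearly independent of both. All coefficients below are invented for illustration and are not measured retinal absorption data.

```python
# Toy model: measurement m_i = a_i * oxygenation + b_i * pigmentation + baseline
# 617 nm: strongly oxygenation-dependent, some pigment sensitivity;
# 850 nm: strongly pigment-dependent, some oxygenation sensitivity;
# 505 nm: (per the rationale above) substantially independent of both,
# so it estimates the per-eye baseline. Coefficients are illustrative only.
COEFF = {
    617: (1.00, 0.30),
    850: (0.20, 1.00),
}

def recover(m505, m617, m850):
    """Solve the resulting 2x2 linear system for (oxygenation, pigmentation),
    using the 505 nm measurement as the common baseline."""
    a1, b1 = COEFF[617]
    a2, b2 = COEFF[850]
    r1, r2 = m617 - m505, m850 - m505
    det = a1 * b2 - a2 * b1
    oxy = (r1 * b2 - r2 * b1) / det
    pig = (a1 * r2 - a2 * r1) / det
    return oxy, pig

# Forward-simulate a ground truth, then recover it from the three channels
baseline, truth_oxy, truth_pig = 0.6, 0.95, 0.40
m505 = baseline
m617 = COEFF[617][0] * truth_oxy + COEFF[617][1] * truth_pig + baseline
m850 = COEFF[850][0] * truth_oxy + COEFF[850][1] * truth_pig + baseline
oxy, pig = recover(m505, m617, m850)
```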
  • the quality of the diffuse retinal image can be described in terms of three resolution terms, viz. the spatial resolution; spectral resolution; and, the reflectivity resolution.
  • the spatial resolution is determined by the combination of the pixel count of the image sensor, normally a CCD or a CMOS image sensor, the field-of-view (FOV) on the retina, and the limitations of the eye itself.
  • Good spatial resolution is desirable, as is a large FOV that includes both the central macular region and the optic nerve region. For example, if a FOV of 40 degrees is stipulated and imaging performance is close to being diffraction limited, the required pixel count is in the region of four million.
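The four-million-pixel figure above can be sanity-checked with a short calculation; the angular sampling interval of about 1.2 arcminutes per pixel used here is an assumption chosen to approximate the practical resolution limit of the eye, not a figure from this disclosure.

```python
def required_pixel_count(fov_deg, arcmin_per_pixel):
    """Square pixel grid needed to sample a given field of view at a
    given angular sampling interval."""
    pixels_per_axis = fov_deg * 60.0 / arcmin_per_pixel
    return pixels_per_axis ** 2

# A 40 degree FOV sampled at ~1.2 arcmin per pixel gives 2000 pixels per
# axis, i.e. about four million pixels in total
pixel_count = required_pixel_count(40.0, 1.2)
```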
  • Another source of loss of spatial resolution is the blur that results from microsaccadic activity of the eye while it is nominally fixated. To keep this within acceptable limits, the duration of the retinal illumination pulse must be kept low, typically to the order of a few tens of milliseconds.
  • the spectral resolution requirements are determined by the spectral reflectivity profile of the retina. There is no apparent line structure to the retinal reflection spectrum and the rate of variation with wavelength is low, with complete reversal cycles typically occupying spectral widths of the order of several tens of nanometers. For this reason the spectral resolution requirements are generally compatible with the use of substantially mono-spectral LED illumination sources and do not require the use of narrowband sources such as provided by lasers.
  • the reflectivity resolution or uncertainty is determined by the number of photons captured per pixel at the image sensor in association with the quantum conversion efficiency of the sensor. As may be expected, the more photons received per pixel, the better quality is the image. It is, therefore, desirable to use an efficient sensor and efficient optical systems both for launching the illumination pulse and for extracting the image reflection.
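Since photon arrivals obey Poisson statistics, the relationship stated above between photons per pixel and reflectivity uncertainty can be sketched as the shot-noise limit; the photoelectron count in the example is illustrative.

```python
import math

def reflectivity_relative_uncertainty(photoelectrons):
    """Shot-noise-limited relative uncertainty of a per-pixel reflectivity
    estimate: photon arrivals are Poisson-distributed, so the standard
    deviation of a count N is sqrt(N) and the relative error is 1/sqrt(N)."""
    return 1.0 / math.sqrt(photoelectrons)

# Example: 900 photoelectrons per pixel gives about 3.3% relative uncertainty
u = reflectivity_relative_uncertainty(900)
```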
  • the illumination pulse energy: in an embodiment, a portion of the pulse energy is diverted to a monitor sensor. The measurement from the monitor is applied to the calculation of reflectivity to compensate for any factor, such as ageing, that could cause the pulse energy to change.
  • the pupil size: the amount of energy collected from the retina is directly proportional to the pupil area. In an embodiment, the pupil image is captured and used to calculate the area. This in turn is factored into the reflectivity calculation.
  • the eye transmission: light that is collected from the retina passes through the eye twice, first on its way in and then on its way back out. Any absorption along the transmission path within the eyeball needs to be factored into the calculation of the reflectivity. As described above, the ability of the instrument, according to an embodiment, to discriminate between specular and diffuse reflections enables an estimate of eye transmission to be made that can be used to calculate the absorption correction.
  • the instrument optical path and sensor efficiencies can be determined by calibration during manufacture and are normally stable over time.
  • Non-uniformity in the illumination field: this can also be measured and calibrated during manufacture.
  • Reflectivity changes induced by the cardiac pulse: it is known that the reflectivity of the retina at some wavelengths varies with the instantaneous blood pressure and is, therefore, cyclic and synchronous with the cardiac pulse.
  • the cardiac pulse is monitored by a sensor and the result is used to synchronize the image capture with the cardiac pulse, thus removing any random variation that would occur if the image capture moment was at a random point of the cardiac cycle. Consequently, image capture events are typically spaced at intervals of one second. The image capture events would preferably be timed to avoid the times of peak blood pressure (the pulse).
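The cardiac-synchronized capture described above can be sketched as scheduling each exposure at a fixed phase of the beat-to-beat interval, chosen away from the systolic peak. The function and all timings below are hypothetical, not taken from this disclosure.

```python
def gated_capture_times(beat_times, phase=0.4, n=4):
    """Schedule image captures at a fixed phase of the cardiac cycle.

    beat_times: timestamps (s) of detected pulse peaks (systole)
    phase: fraction of the beat-to-beat interval at which to trigger,
           chosen away from 0.0/1.0 to avoid the systolic pressure peak
    n: number of captures wanted (one per beat, so roughly 1 s spacing)
    """
    captures = []
    for t0, t1 in zip(beat_times, beat_times[1:]):
        captures.append(t0 + phase * (t1 - t0))
        if len(captures) == n:
            break
    return captures

# With beats detected at 1 Hz, captures land mid-cycle, one second apart
times = gated_capture_times([0.0, 1.0, 2.0, 3.0, 4.0], phase=0.4, n=4)
```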
  • an embodiment of the instrument captures multiple images using only a narrowband of spectral radiation at a time.
  • the camera position is automatically optimized to provide the best resolution, from blue through to infrared over a substantially 2:1 ratio of wavelength. This enables the generation of high-resolution images anywhere within the overall instrument measurement spectral range.
  • the preferred means of illumination is the LED. LEDs can be pulsed for a short duration and are robust and reliable with consistent output that is repeatable.
  • the preferred type of LED is surface emitting as distinct from edge emitting. A typical source size of a type suitable for this application is of the order of 1 millimeter. As the drive current is likely of the order of an ampere, it is advisable to control the rate of the rising and falling drive current edges to prevent unwanted electromagnetic emissions at radio frequencies.
  • the light from the LED can be collected by a condenser lens and then relayed into the pupil through an optical path that will generally contain lenses, mirrors and beam splitters.
  • the ray bundle or optical mode-volume from the LED is limited using apertures such that, where it reaches the cornea, it has a prescribed area and convergence.
  • the ray bundle will have a diameter of 1 mm at the cornea and will launch in the region of 50 to 100 microjoules of energy with a single pulse.
  • the form of the illumination spot has to be determined.
  • the illumination spot has a defined beam diameter at the corneal surface, typically 1 mm, and a defined cone angle of convergence suitable for illuminating a sufficient portion of the retina. This could be 50 degrees.
  • the illumination beam is formed from the source LED by two circular apertures used in association with a series of lens elements. One aperture defines the spot diameter and the other aperture defines the cone angle.
  • the multispectral fundus mapper employs a multiplicity of LEDs each having a different optical spectrum. These are coupled into the illumination path sequentially in time.
  • One option is to employ a multiplicity of optical beam combiners that are spectrally discriminating.
  • This approach is complex where there are many sources and each beam splitter contributes loss. Every source requires a beam combiner matched to its spectrum, such that it passes the one spectrum while reflecting all the others.
  • a beam splitter matched to this requirement would be needed for each source.
  • Another technique is to mount the various illumination sources along a circular locus on a mount that can rotate, enabling the selected source to be placed in the correct position, one at a time.
  • This arrangement suffers from the fact that each source is electrically connected with wires and all these wires would be constantly flexing as the selected source changes, resulting in eventual fatigue and failure. Moreover, there would be limits as to the direction of motion, as an indefinite movement in one direction would cause the wires to twist together.
  • Some form of switching or multiplexing is needed to accommodate several LEDs that are to be activated in a sequence. Spatial multiplexing is inherently inefficient and therefore unsuitable. Passive wavelength multiplexing is more efficient but is difficult to arrange when operating with large mode volumes. It is also inflexible in that the multiplexing filters must be designed to be compatible with the specific LED wavelengths.
  • the instrument employs an active switching arrangement that consists of a rotating periscope.
  • the periscope is located in the collimated space following the first condenser lens.
  • the periscope is highly efficient and is suitable for all combinations of source wavelengths and can be operated by a stepper motor.
  • the LEDs are all deployed on a circular path.
  • An advantageous feature of the rotating periscope over potential alternative active arrangements is that there is no requirement for the source LEDs to move and consequently no constant stressing of wire harnesses that would result in fatigue and failure. There is also no limit to the sequence combinations that can be used.
  • the illumination sources are coupled one at a time to the common illumination path using a rotating periscopic arrangement.
  • the LEDs are, as described before, mounted along a circular locus and are stationary while the periscope is moved. This provides a highly efficient coupling and is totally achromatic, so it works well with any combination of source spectra. Moreover, there is total flexibility of movement direction and sequences.
  • Each LED source is mounted next to a dedicated condenser lens that collects a proportion of the total LED power and collimates it.
  • the collimated light passes through an aperture that defines the cone angle associated with the illuminated corneal spot.
  • the collimated beam then passes through the periscope.
  • a second lens focuses the light upon an aperture that defines the spot size.
  • one LED emitting in the infrared part of the spectrum is used for focusing purposes and cannot be seen by the patient.
  • This LED is associated with a cross shaped mask in the collimated space following the condenser lens. This cross shape is projected on to the retina and provides a high contrast image that eases the task of accurate focusing.
  • the selection of the LEDs by wavelength is driven by the measurement requirements. Commonly, one of the requirements is the simulation of a conventional fundus color image. This requires that the LED set include a blue, a red and a green LED. For the measurement of retinal oxygenation, the use of an infrared LED, a red LED (the same as above) and a cyan LED has been found suitable.
  • For stimulating auto fluorescence, a blue LED is required, possibly supplemented by an LED at a different wavelength. It is necessary to apply an optical filter to the light from an LED to be used for stimulating fluorescence. This filter substantially passes the LED light but effectively blocks the long-wavelength skirt of the spectrum, which can stretch out considerably albeit at a low level. This filter can also be deployed in the collimated space following the condenser lens.
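The wavelength sets described above can be summarised as a small lookup, sketched here in Python. The name-to-wavelength assignments follow the LED wavelengths listed later in this document (470, 505, 530, 617 and 850 nm); the mapping of colour names to those values and the dictionary structure are illustrative assumptions, not the instrument's software.

```python
# Sketch: measurement modes -> LED wavelength sets (nm), per the text above.
# Colour-name-to-wavelength assignments are assumed from the LED list given
# elsewhere in this document (470, 505, 530, 617, 850 nm).
LEDS_NM = {"blue": 470, "cyan": 505, "green": 530, "red": 617, "ir": 850}

MODE_SETS = {
    "color_fundus": ["blue", "green", "red"],   # simulated conventional color image
    "oxygenation": ["ir", "red", "cyan"],       # retinal oxygenation measurement
    "autofluorescence": ["blue"],               # excitation source (filtered)
}

def wavelengths_for(mode):
    """Return the LED wavelengths (nm) used by a measurement mode."""
    return [LEDS_NM[name] for name in MODE_SETS[mode]]
```

The rotating periscope described earlier would then visit these ports in sequence for the selected mode.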
  • the imaging system that relays the reflected light from the eye to the image sensor is an important part of the design of any retinal imaging system.
  • the relay system of an embodiment of the instrument includes places to deploy masks and filters, some on a dynamic basis.
  • the image relay design is substantially achromatic when used in conjunction with a standard human eye that has substantial chromatic aberration.
  • the position of the image sensor is controlled by a precision motorized drive that allows fine-tuning to automatically optimize the focus for each wavelength.
  • the same motorized drive is used to automatically set the focus to accommodate the prescription of the patient.
  • the final focus is achieved by manual control of the motorized drive, with the operator using a visual presentation of the retinal image in the infrared region.
  • a filter can be inserted for use with auto-fluorescence measurements. This filter blocks the reflected light at the exciting wavelength, e.g. blue, but passes efficiently the excited light in the longer wavelength spectral region.
  • a mask can be inserted to alter the distribution of the image between its specular and diffuse components.
  • a technical challenge in any fundus imager design is to accommodate the very large ratio between the illuminating source power and the power that is collected by the image sensor. This ratio is of the order of a million to one.
  • the threat is that the magnitude of any unwanted reflection could easily swamp and wash out the wanted imaging power from the retina.
  • the main sources of such unwanted reflections are a) the corneal reflection, b) reflections from any optical elements (lenses and windows) in the common optical path shared by the illuminating and reflected light, and c) any polymer intra-ocular lens (IOL) typically inserted after cataract surgery.
  • the common optical path includes just one lens doublet in addition to the corneal surface that reflects typically about 3% of the incoming power. As the retina reflects back through the pupil only about 0.1% of the incoming power, it is evident that the corneal reflection is typically 30 times greater unless measures are taken to avoid or remove it.
  • the lens doublet reflections are less as the doublet surfaces are given broadband antireflection coatings that limit the two reflections to less than 1% each; however, this is still much greater than the retinal reflection power.
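The power budget in the two bullets above can be checked with simple arithmetic, using the figures quoted in the text (about 3% corneal reflection, about 0.1% retinal return, under 1% per coated doublet surface):

```python
# Reflection budget using the figures quoted in the text above.
corneal_reflectance = 0.03   # ~3% of incoming power reflected by the cornea
retinal_return = 0.001       # ~0.1% of incoming power returned through the pupil
doublet_surface = 0.01       # <1% per broadband AR-coated doublet surface

# The corneal reflection is ~30x the wanted retinal signal...
print(corneal_reflectance / retinal_return)   # ~30
# ...and even one coated doublet surface reflects ~10x the retinal signal.
print(doublet_surface / retinal_return)       # ~10
```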
  • the usual technique employed is to spatially segregate the corneal area of illumination from the corneal area of collection.
  • the illumination is in the form of an annulus while the collection is taken from the circular area in the centre of the annulus.
  • the corneal reflection then is reflected outwards and avoids the central area wherein the collected power travels.
  • the converse arrangement may also be used, where the illumination is in the centre and the collection made through the annulus.
  • a difficulty with the annular illumination technique is that the annulus can easily approach the border of the pupil, which risks illuminating the edge of the iris, causing a large reflection. Therefore, the size of the annulus must leave a margin to avoid the large reflection resulting in a relatively small area for collection. Hence, to obtain sufficient light to obtain a good image, the eye pupil must be dilated (mydriated) or a large illumination level must be used; in either case the patient experience is negative.
  • a further difficulty with annular illumination is the loss of efficiency that generally results from having to transform the illumination source shape into the annular illumination shape.
  • Another version of the prior art separates the unwanted reflections using a polarization technique.
  • the incident light is polarized and the light specularly reflected from the cornea is similarly polarized.
  • the diffusely scattered light reflected from the retina is largely unpolarized. Therefore, the collected light path is equipped with a polarizer that blocks the light having the same polarization as the incident light, leaving only the light from the retina.
  • the eye also has some polarizing characteristics that degrade the value of this approach. As a matter of practicality, it is also difficult to obtain polarizers that operate well over a large spectral range.
  • the retinal illumination system and the retinal image collection system are integrated.
  • the two optical systems are usually combined with a beamsplitter angled at 45 degrees to one of the paths.
  • the beamsplitter is located at a plane in the optical system that is a conjugate image plane to the corneal surface.
  • the beamsplitter usually consists of a mirror with a small circular hole in the centre. The illumination path reflects off the mirror into the eye, while the retinal image path is directed through the hole in the centre.
  • the conventional form of arrangement results in the illumination beam at the corneal surface having an annular shape, with a diameter of about 3 mm.
  • the retinal image path passes through the hole in the centre of the annulus.
  • the size of the illumination beam at the pupil is small enough to fit within the pupil, but allows sufficient area within the central circle to enable sufficient power to be collected from the retina.
  • the retinal image is made up of reflections that occur at various locations and depths within the retina. Part of the image is contributed by diffuse reflections resulting from multiple scattering events within the tissue, while part is specularly reflected from discrete layer interfaces characterized by small changes in refractive index.
  • the diffuse reflections are generally evenly spread out in a polar distribution that is approximated by the Lambertian model.
  • the specular reflections, being mirror-like, are directed such that the angles of incidence to the reflection surface are equal to the angles of reflection, and the resulting direction is dependent upon both the incident angle and the gradient angle of the retinal surface.
  • While the retinal surface is spherical to a first approximation, it has a texture and detailed contour that is characteristic of the proximate structural elements such as the vasculature and any deformations that may be natural or characteristic of a pathology.
  • the incident angles of the rays illuminating the retinal surfaces are smoothly distributed over a range that is determined by the size of the illuminating annulus at the ocular lens just behind the cornea and the effective focal distance from the ocular lens to the retina. This translates into a range of about plus and minus four degrees.
  • the collection angle in the centre is about one to two degrees in diameter.
  • If the retinal surface is perfectly smooth and normal to the collection axis, no specularly reflected light from the surface will be collected. If, however, the retinal surface gradient is angled from the normal to the collection axis, some of the specularly reflected light may be collected, depending on the surface gradient angle. Because the incident light is distributed over a range of angles, the amount of collected light from each retinal pixel will be only weakly dependent on the retinal surface gradient.
  • In an embodiment, the illumination and collection paths are transposed. This results in the illumination at the cornea being in a central circular area with the retinal image being collected through the surrounding annular area. As a consequence, all the light reaching a retinal surface pixel arrives from substantially the same direction; it is not dispersed over a range of angles. All the light specularly reflected from a retinal surface pixel is directed in the same direction and is not dispersed over a range of angles. Depending on the direction of the reflected light, it will then either be substantially collected through the viewing annulus or it will substantially not be collected; a small change of incident angle can result in a large change in the amount of collected light. The amount of specular collected light from each retinal pixel will be strongly dependent on the retinal surface gradient.
  • the retinal image resulting from this arrangement will include a specular component that includes the gradient contours of the retina, features that are normally dispersed (smeared) and substantially not discernable with the conventional fundus camera design, thus providing a substantial advantage over conventional fundus cameras.
  • the light launched into the eye can be polarized and the light reflected from the retina can be analyzed polarimetrically such that the portion of the reflected light that is orthogonally polarized with respect to the illumination is collected.
  • This substantially blocks light that is specularly reflected - corresponding to light reflected from the surfaces as distinct from being backscattered just below the surfaces.
  • the incident light is linearly polarized and the reflected light passes through a similar polarizer at right angles. More generically, light with any polarization state can be used.
  • circularly polarized light can be used for illumination and light having circular polarization of the opposite sense can be collected.
  • the diffusivity is spectrally dependent and is indicative of the chemical content.
  • the specular component is also spectrally dependent, but not through the reflective action but instead through the intermediate absorption.
  • the reflection model of the retina is a complex aggregation of spectrally dependent reflective and absorptive layers.
  • the illumination is in the form of a small circular area in the centre and the collection is taken from the surrounding annulus. Illumination using a small circular area in the center enables an efficient match to the illumination source, in this case an LED, and provides maximal margin from reflection interference with the iris. Collection from the annulus provides a good level of collected power. Means for blocking the unwanted reflections and means for maintaining high image quality are employed when collecting from the off-axis annular field as described below.
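The pupil-area trade-off claimed above can be illustrated numerically, using the approximately 1 mm central illumination beam and 3.5 mm pupil diameter quoted later in this document; the calculation is an illustration, not part of the design disclosure.

```python
import math

# Pupil-area budget for centre-spot illumination with annular collection.
# Figures (~1 mm beam at the cornea, ~3.5 mm pupil) are quoted elsewhere
# in this document; the "collection annulus" is everything inside the
# pupil but outside the central illuminated circle.
pupil_d, beam_d = 3.5, 1.0                    # diameters in mm
pupil_area = math.pi * (pupil_d / 2) ** 2
beam_area = math.pi * (beam_d / 2) ** 2
annulus_area = pupil_area - beam_area

# About 92% of the pupil area remains available for collection.
print(round(annulus_area / pupil_area, 2))    # 0.92
```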
  • the separation or blocking of the unwanted light reflected from the central areas of the cornea and the nearby objective lens is achieved by the use of masks of a suitable size and location being placed in the optical collection path prior to the image sensor. Every surface generating an unwanted reflection requires a mask. These masks will block all the unwanted reflections but will block only a small proportion of the wanted reflections from the retina.
  • the basic principle used here is that the intermediate virtual image planes associated with the retina, the cornea and the objective lens surfaces, occur at different locations and that the real images of the unwanted reflections are small in relation to the area occupied by the wanted light at those locations.
  • a means to correctly align the optical head to the eye of the patient. The patient rests his/her chin on a chin-rest and presses his/her forehead against a forehead brace. These two measures stabilize the patient. Typically, a view of the cornea is then displayed to the operator, who can then control the lateral and vertical positions to centre the instrument.
  • a means to accurately set up the optical head at the correct working distance from the patient, typically in the region of 20 or 30 mm.
  • the illuminating light converges to a small spot that is coincident with the corneal surface.
  • the working distance for this is fixed, for example at 20 mm.
  • a live view of the cornea is presented to the operator, initially for lateral alignment purposes.
  • the cornea is illuminated by two LEDs emitting in the invisible infrared part of the spectrum, one on either side of the objective lens and having the two beams incident at about 45 degrees to the face. The operator sees a reflection of these two LED sources on the corneal surface.
  • At the correct working distance, the size of these reflected images is minimized to two small spots coincident with the corneal surface. At any other distance they appear as annuli where the diameters increase with the distance. If the wrong distance is set, the illuminating light will either be diverging (too far) or will not have converged enough (too near) with the result that the illumination will deploy on the iris rather than disappear through the pupil. Alternatively, the operator can choose to focus upon the patient's iris to determine the working distance.
  • focusing on the retina can be achieved by longitudinally adjusting the image sensor location.
  • the first approximation to the correct position is carried out automatically where the patient prescription in terms of short or long sightedness is known.
  • a fixation target or several targets upon which the patient must fixate his/her gaze during the measurement session is provided.
  • the light from the fixation target screen is adjustable as this affects the pupil size.
  • the fixation target screen is momentarily disabled to prevent light from it interfering with the retinal images.
  • the major unwanted reflections can be removed by deployment of suitable masks.
  • the residual unwanted reflections typically have a quasi-Gaussian spatial profile.
  • this profile is recorded during calibration and subtracted from the images captured during normal operation. The profile varies slightly with wavelength and with camera position and these factors can be taken into account before the subtraction operation.
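The calibrate-and-subtract step described above can be sketched as follows; storing one recorded profile per wavelength (with the camera-position dependence omitted) is an assumed organisation for illustration, not the instrument's actual software.

```python
import numpy as np

def remove_residual_reflection(frame, profiles, wavelength_nm):
    """Subtract the quasi-Gaussian reflection profile recorded during
    calibration for this wavelength from a captured frame, clipping any
    negative residue at zero."""
    profile = profiles[wavelength_nm]
    return np.clip(frame.astype(float) - profile, 0.0, None)
```

In practice the stored profile would be selected (or interpolated) for both the wavelength and the camera position before subtraction, as the text notes.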
  • Figure 1 shows a high level partitioning of the system elements. These comprise, for example, the optical unit/head 108, a personal computer 104, and power supplies 102.
  • the optical unit/head is provided with a touch screen display 110 that is used both for operator control and also for the display of images used for alignment and focusing.
  • Attached to the optical head is a small device that straps around a finger and is used to monitor the cardiac pulse 112.
  • At the base of the optical unit/head are manual controls 106 used to position the optical head in the three dimensions with respect to the eye under observation. Adjacent to the objective lens is the fixture that combines the chinrest and the forehead brace.
  • Figs. 2A and 2B are a side elevation and a top view, respectively, of an embodiment of the retinal fundus imaging system.
  • a camera 100 is shown in Figure 2A.
  • the camera 100 can be, for example, a CCD or CMOS image sensor.
  • Figure 3 is a longitudinal cross-section of Fig. 2A.
  • Figures 4A and 4B are cross-sections along lines B-B and C-C of Fig. 2A.
  • Figure 4A shows details of the optical head of an embodiment of the retinal fundus imaging system.
  • the optical head design comprises six integrated optical systems. These consist of the retinal illumination system, the corneal illumination system, the retinal illumination pulse energy monitor system, the retinal viewing system, the corneal viewing system and the fixation target screen system. These are described below.
  • LEDs are mounted on a circular locus on a printed circuit board adjacent to the lenses L15.
  • the LEDs emit at wavelengths of 470 nm, 505 nm, 530 nm, 617 nm and 850 nm.
  • Suitable LED devices are made by Philips Lumileds Inc. and Osram GmbH.
  • each LED is adjacent to a condenser lens L15 set at a distance that best collimates the light from the LED. Adjacent to the lens, where appropriate, an optical filter is used to modify the LED spectrum or a projection mask is used as an aid to focusing.
  • the collimated light from the lens L15 is then directed to the two periscope mirror reflectors R2 that displace the beam from the offset LED axis to the central axis.
  • the light exiting the periscope is then passed through the lens L16 that focuses it back to create a real image at a plane occupied by the aperture A3.
  • the image magnification from the LED to the real image is 3.33.
  • the aperture A3 defines the size and shape of the illuminating light that will eventually reach the cornea. It is substantially filled by the real image.
  • After passing through the aperture, the light is reflected from R3 to travel upwards in a vertical direction. It then passes through a beam splitter B2 with low loss. The beam splitter is not used for the illumination function. The light then passes through three relay lenses, the biconvex L11, the convex-concave L10 and the plano-convex L9. At the exit of L9 is an aperture that sets the illuminating field angle. The light then impinges upon the main beam splitter B1 where it is divided into two parts of approximately equal power. The reflected part then passes through the objective lens L1 and then converges to form the corneal spot of diameter about 1 mm.
  • the corneal illumination system is used for alignment purposes and also to enable the size of the pupil to be captured. It consists of two infrared LEDs that are powered continuously. Each LED emits at a wavelength of 850 nm and is contained in a standard 5 mm collimating package generating a beam divergence of 44 degrees. Each LED is mounted beside L1, one on each side, and each is angled such that the centre of the projected beam is coincident with the centre of the cornea. The corneal illumination is extinguished during retinal imaging operations.
  • the optical path from the LED to the main beam splitter B1 is the same as that described for the retinal illuminator.
  • the light destined for the energy monitor passes through the beam splitter and proceeds to the attenuating reflector F5. This absorbs about 95% of the incident power and reflects the remainder horizontally.
  • the reflected light then passes through a 10 dB attenuator F6 angled to the beam such as to direct any reflections to the side of the chamber where they are absorbed.
  • the attenuated light passing through F6 then passes through the biconvex lens L17 that focuses it to a smaller area that lies on the monitor photodiode surface. Any reflections from the photodiode surface have to pass through F6 and F5 where they are further attenuated; this arrangement prevents any significant reflections from the monitor arm from re-entering the retinal-viewing path.
  • the retinal viewing system is shown in Figure 7.
  • Light reflected from the retina exits the eye through the pupil and then is collected by the biconvex objective lens L1. It then passes through the main beam splitter B1.
  • the light is then relayed through the lens doublet L2 and a biconvex lens L3. At this point, the light is in a relatively large area, collimated mode. It then passes to the final lens group or camera objective group consisting of the plano-convex lens L4, two plano-convex lenses L5 and L6, and two further plano-convex lenses L7 and L8.
  • a mask M1 is inserted between L4 and L5. This blocks the reflection from the cornea.
  • a second mask M2 is inserted between L6 and L7. This blocks the reflection from the nearer surface of L1.
  • a third mask M3 is inserted between L7 and L8. This blocks the reflection from the outer surface of L1.
  • the camera is moveable on its axis and its position is controlled by a motor. This movement is used to compensate for the prescription of the patient, to optimize the focus as a function of wavelength, and to optimize the focus under the control of the operator who is viewing a live video representation of the fundus.
  • the nominal magnification ratio from retina to CCD or CMOS image sensor has a value of 1.25.
  • the corneal viewing system is shown in Figure 8. Note that the same camera is used both for corneal and retinal viewing. To switch from one mode to the other, the reflector R1 is moved; in one position, the retinal viewing path is unobscured while in the other position, the camera view is deflected into a vertical path containing L13 and L14.
  • the two viewing modes are arranged such that they are co-axial - that is when the optical head is aligned, the centre of the cornea and the centre of the retinal view appear at the same location of the CCD or CMOS image sensor.
  • the corneal viewing path begins with the biconvex lens L1.
  • Light is diverted at the main beam splitter B1 and travels down through the lenses L9, L10 and L11 to the second beam splitter B2.
  • a small proportion of the light, typically about 8%, reflects off B2 and passes through the lens L12 to the dichroic beam splitter B3.
  • the infrared light used for corneal viewing is almost wholly reflected up through the lenses L13 and L14 after which it reflects off R1 . From this point, it follows the same path as the retinal viewing system, passing through the camera objective group to the CCD or CMOS.
  • the nominal magnification ratio from cornea to CCD or CMOS has a value of 1.0.
  • the fixation target screen system is shown in Figure 9.
  • the viewing path is the same as that of the corneal viewing path described above, with the exception that at the dichroic beam splitter B3, the visible light from the targets screen display passes through.
  • the target screen, in an embodiment, consists of a white surface marked up with seven fixation target crosses, one in the centre and six evenly spaced around the periphery.
  • the surface of the target screen is front-lit by a white LED. Behind each cross is a red LED that is activated when that cross is to be used as the fixation target. This causes the cross to have a red backlight. The power from the white LED can be varied to control the pupil opening to some extent.
  • An alternative embodiment uses a dynamic target screen, such as that provided by an LCD or OLED display. This would place the operation of fixation target location wholly under the control of imaging software.
  • the following sequence of operations applies to the operation of the exemplary embodiment of the instrument.
  • the method for quantitative imaging of the retinal fundus is illustrated in Fig. 10.
  • the method for retinal health assessment comprises imaging the retinal fundus at different wavelengths within a spectral range and determining spectral reflectivity of the retina for each pixel within a field of view (FOV).
  • the retinal health is assessed based on the spectral reflectivity of the retina.
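A minimal sketch of the per-pixel reflectivity computation described above. Normalising each captured frame by the monitored illumination pulse energy and a calibrated reference is one plausible scheme and is an assumption here; the document states only that spectral reflectivity is determined for each pixel at each wavelength.

```python
import numpy as np

def spectral_reflectivity(frames, pulse_energy, reference):
    """frames: {wavelength_nm: 2-D image array}. pulse_energy and reference
    are per-wavelength scalars (monitored flash energy and a calibration
    factor - both assumed normalisations). Returns a per-pixel reflectivity
    map for each wavelength across the field of view."""
    return {wl: frames[wl] / (pulse_energy[wl] * reference[wl])
            for wl in frames}
```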
  • a patient is seated comfortably and places the forehead against the forehead brace and the chin on a chinrest of the instrument.
  • the cardiac pulse sensor is placed at a suitable position on the patient; for example, the cardiac sensor is wrapped around a finger.
  • the instrument is then put in the corneal viewing mode.
  • Reflector R1 is placed in position and the corneal illuminating LEDs are activated.
  • a fixation target is selected and illuminated and the patient is asked to gaze at the fixation target.
  • An operator adjusts the position of the optical head to centre the eye on the viewing axis and to set the correct working distance.
  • the camera captures a view of the cornea, which is used to estimate the pupil size.
  • the instrument is then switched into the retinal-viewing mode.
  • R1 is removed from the optical path and the corneal LEDs are extinguished.
  • the infrared LED for illuminating the retina for focusing is activated.
  • the operator optimizes the focus of the retina using the monitor. Once the focus is optimized, the retinal image capture sequence starts.
  • the pulsed infrared (IR) LED is coupled into the periscope port. About half a second after the heartbeat, the IR LED is pulsed for 4 milliseconds. During this time, the fixation illumination is extinguished.
  • the CCD or CMOS is actively storing photoelectrons during the image capture phase. At the end, the image charges are transferred into CCD storage, or CMOS image sensor storage, and serially transferred out of the chip. The images are digitized and the results placed in a temporary store. The image data is then transferred by a suitable connection to the computer and digitally stored.
  • the periscope rotates and brings the red port into view. Upon the next heartbeat, the red LED is pulsed. The same sequence as above is followed and is repeated for the other LEDs (green, cyan and blue).
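The heartbeat-gated sequence in the bullets above could be orchestrated as sketched below. The 0.5 s delay and 4 ms flash are taken from the text for the IR LED and assumed to carry over to the other wavelengths; the function is a planning sketch, not the instrument's control software.

```python
# Periscope port order for the capture sequence described above.
SEQUENCE = ("ir", "red", "green", "cyan", "blue")

def capture_plan(sequence=SEQUENCE, delay_s=0.5, flash_ms=4):
    """Yield one capture step per heartbeat: rotate the periscope to the
    LED's port, wait delay_s after the beat, then flash for flash_ms with
    the fixation illumination extinguished."""
    for led in sequence:
        yield {"led": led, "delay_s": delay_s,
               "flash_ms": flash_ms, "fixation_on": False}
```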
  • the appropriate exciting LED is coupled to the illumination path using the rotating periscope. Then a blocking filter F1 is inserted into the viewing path.
  • the CCD or CMOS image sensor can be set to the 2 x 2 binning mode to enhance the signal to noise ratio. Then the image can be captured as above.
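The 2 x 2 binning mentioned above sums each 2 x 2 block of pixels into one output pixel; a software sketch in NumPy for illustration (the instrument presumably bins on the sensor itself):

```python
import numpy as np

def bin2x2(image):
    """Sum each 2 x 2 pixel block: quadruples the signal per output pixel
    at the cost of halved linear resolution, improving SNR for
    shot-noise-limited images."""
    h, w = image.shape
    trimmed = image[:h - h % 2, :w - w % 2]   # drop any odd edge row/column
    return trimmed.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))
```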
  • the retinal image capture sequence described above is repeated with another mask temporarily inserted.
  • the computer performs multiple processing operations on the captured image data to prepare for presentation to the ophthalmologist who is typically using a remote PC connected to the instrument through Ethernet.
  • the ophthalmologist is able to view images and to extract quantitative and qualitative data relating to the images.
  • the instrument is capable of high-resolution digital multi-spectral retinal health assessment targeting research related to biochemical and structural retinal malfunction.
  • the embodiment integrates a number of flexible measurement capabilities into a bench top instrument, which facilitates advanced clinical research measurements for monitoring the metabolic and anatomical activity of the eye to detect, at the earliest stage, activity that could lead to the onset of blinding eye diseases such as macular degeneration, diabetic retinopathy, glaucoma, cataracts, etc.
  • the exemplary embodiment targets the measurement of transient and persistent metabolic dysfunction, through advanced measurements of spatially resolved retinal oxygen saturation and retinal auto fluorescence. It enables the investigation of biochemical processes, and enhances the detection of drusen and other markers of RPE dysfunction through auto fluorescence and spectrally resolved fundus imaging at different wavelengths within a spectral range that spans from the visible region (about 450 nm) into the near infrared (NIR) region (about 1000 nm). In addition full color 40 degrees high-resolution fundus images provide correlation to clinical fundus photography.
  • the embodiment can generate quantitative as distinct from qualitative data that can be used to more accurately gauge the health of the retina, particularly where such measurements are carried out at different time intervals and would allow trend analysis related to health degradation.
  • the quantitative data will represent the spectral reflectivity of the retina for each pixel within the field of view (FOV).
  • the exemplary embodiment provides integration of sophisticated and novel measurement capabilities, system control, and data analysis and management and data processing capabilities. The capabilities and features of the exemplary embodiment are described below.
  • Choroidal oxygenation is mapped across a 40 degree retinal field centered on the fovea with better than 30 μm lateral resolution.
  • a signal extraction method enables oxygenation mapping equivalent to full spectral measurement with a finite number of wavelengths, resulting in shorter measurement times while maintaining accuracy and resolution.
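One common way to realise oxygenation mapping with a small number of wavelengths is ratiometric oximetry against an isosbestic wavelength (one at which oxy- and deoxyhaemoglobin absorb equally). The sketch below is illustrative only: the wavelength pairing, the optical-density inputs and the linear calibration constants are all assumptions, as the document does not disclose its actual signal extraction method.

```python
import numpy as np

def oxygen_saturation_map(od_sensitive, od_isosbestic, a=1.0, b=0.0):
    """Per-pixel oxygen saturation estimate from the ratio of an
    oxygen-sensitive optical density to an isosbestic one, mapped through
    an assumed linear calibration a*ratio + b and clipped to [0, 1]."""
    ratio = od_sensitive / np.maximum(od_isosbestic, 1e-9)  # guard division
    return np.clip(a * ratio + b, 0.0, 1.0)
```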
  • the exemplary embodiment provides spectrally controlled stimulation and spectrally resolved detection of retinal auto fluorescence with up to 20 μm resolution across the 40 degree retinal field.
  • Long term RPE function disruption can be mapped through quantitative lipofuscin distribution and drusen density analysis across the 40 degree field of the auto fluorescence retinal image.
  • researchers and users can refine their auto fluorescence analysis through easy access of the spectrally resolved images of auto fluorescence.
  • the exemplary embodiment can automatically combine images taken at different illumination wavelengths to produce a high-resolution RGB-standard color fundus image.
  • An optimized GUI-based user interface on the high-performance computer platform provided with the exemplary embodiment allows for intuitive control over the functions of the instrument.
  • Data entry windows allow seamless integration of custom measurement parameters, such as setting of illumination intensities and saving commonly used experimental configurations.
  • Use of standard file format, such as DICOM standard ensures reliable data and subject information management across multiple platforms using different instrument configurations.
  • the software used in the exemplary embodiment provides secure and effective management of the acquired image data and subject or patient information. Spectral slicing, false color, automatic and manual balancing, zoom, pan, etc., are all controlled from the host computer and displayed on a high-resolution display, such as a LCD monitor.
  • the exemplary embodiment can be packaged as a robust tabletop instrument, designed for simple placement and positioning.
  • the remote AC power adaptor and computer/controller enable optimum experimental flexibility while the integrated sensing units maintain reproducibility over time.
  • the modular design allows for easy maintenance.
  • the level of integration in the exemplary embodiment provides a highly effective and flexible instrument for advanced investigation of retinal functions through fundus imaging and metabolic activity monitoring. The capability allows researchers to configure and control experiments with high quality and reproducible results.
  • the exemplary embodiment measures retinal health by monitoring metabolic activity through oxygenation and autofluorescence of accumulated retinal by-products.
  • the instrument is a spatially resolved oximeter with multiple narrowband illumination sources; an autofluorescence lipofuscin and drusen camera with multiple filters and multiple stimulation frequencies; and a high-resolution fundus camera, for example, 4 megapixels for each wavelength, with a working distance of about 20 mm in a room illumination of 10 lux, a pupil diameter of 3.5 ± 0.5 mm, and a beam diameter of about 1 mm at the cornea.
  • the angle of coverage (circular) is about 40 degrees, and the wavelength range for detection is 450-1000 nm.
  • the typical spectral resolution is 5 to 50 nm (FWHM) and the spatial resolution is about 30 µm for oximetry.
  • the acquisition time is typically in the region of 10 to 60 milliseconds per image, as determined by the illumination flash duration, and the acquisition timing is related to the cardiac pulse instant.
  • the spatial resolution for autofluorescence is about 20 µm, the dynamic range is about 40 dB, and the detection wavelength range is about 500 to 1000 nm in spectral bands.
  • the total number of spectral points is a minimum of 4 and the minimum detectable intensity change is of the order of 1% for each wavelength band.
  • the patient's apparent viewing range is focused on infinity with adjustment for presbyopia.
  • the spatial resolution on the retina for the full color fundus camera is about 20 µm.
  • the illumination levels conform to class 1 ANSI Z136 standard.
  • the instrument can be controlled with standard operating systems such as Windows® and the image data conform to standards such as DICOM, jpeg, tiff, bitmap, etc. Additional adjustments include vertical and lateral adjustment to center the dark pupil; two point-source reflections minimized to set the correct working distance of 20 mm; lighting adjusted to optimize pupil size; automatic coarse focus using the patient's prescription, accommodating a range of ±16 diopters; and automatic optimization for each wavelength.
  • the exemplary embodiment also provides a live image of the cornea with off-axis IR illumination; a sequence of illumination pulses each synchronized to the cardiac pulse; corneal image capture to automatically calculate pupil area; a live retinal view under IR illumination with manual fine focus; an illumination cone angle of about 43 to 47 degrees; and an image capture cone angle of about 41 to 45 degrees.
  • the cold-instrument warm-up time is typically less than 10 minutes and the standby warm-up time is typically less than 1 minute.
  • the typical total illumination energy is of the order of 50 µJ, with a per-pixel illumination energy of about 42 pJ.
  • the photon count on the retina is about 115 million per pixel while the photoelectron count at the CCD or CMOS image sensor is about 15000 per pixel.
  • a typical limiting value for sensor resolution is about 1.3 arcmin or about 6.5 µm.
  • the embodiments of the instrument described herein are capable of several types of measurement, including mapping retinal spectral reflectivity, measuring interior specular absorption, and mapping retinal auto-fluorescence and retinal oxygenation measurements.
  • the instrument has greater value to the ophthalmologist who would otherwise have to invest in additional instruments, if available, and devote more time to patient care.
  • each image corresponding to a spectral sample is captured with an exposure time typically of the order of tens of milliseconds, separated at discrete time intervals, for example, several seconds. This extends the total image capture session period and can also introduce the possibility that various unwanted changes occur between each capture period. For example, the patient may move his head or eye gaze orientation causing misalignment and the need for a retake. Further, the process described in relation to Figure 10 implies the need for an image registration process to follow. Furthermore, the pupil of the patient will tend to close (constrict) after exposure to flash illumination.
  • CMOS image sensors enable many high-quality images to be captured in a short period. These sensors can capture an image in as little as ten milliseconds. This speed capability can be exploited in the present disclosure by arranging a group of spectral illumination sources (light sources) in a rapid sequence, such that to the patient the appearance is that of a single flash. The total duration required to take the multiple images of the retinal fundus at the multiple wavelengths can be limited so that the pupil size does not significantly contract during the extended flash.
  • the time period can be referred to as the constriction latency period of the pupil, which sets an upper limit for the aggregated flash duration. This limit can be extended where an illumination flash has a spectrum in the infrared region that does not stimulate pupil contraction.
  • the ability to capture a multiplicity of spectral images in an almost simultaneous manner shortens the session time, largely removes the need for retakes, eliminates the need for image registration when creating composite images, and ensures a consistent pupil size over the set of images taken at the multiple wavelengths.
  • Figure 11, which is not to scale, shows a graph 200 of optical power as a function of time.
  • the graph 200 has five sections each representing an illumination wavelength (or wavelength range).
  • the eye (or the retinal fundus of the eye) is first illuminated by light 202 at the first wavelength λ1 for a pre-determined time period t1.
  • the eye is illuminated by light 204 at a second wavelength λ2 for a pre-determined time period t2.
  • After a time delay Δt 203 from the moment when the light at the second wavelength is turned off, the eye is illuminated by light 206 at a third wavelength λ3 for a pre-determined time period t3. After a time delay Δt 203 from the moment when the light at the third wavelength is turned off, the eye is illuminated by light 208 at a fourth wavelength λ4 for a pre-determined time period t4. Finally, in the present example, after a time delay Δt 203 from the moment when the light at the fourth wavelength is turned off, the eye is illuminated by light 210 at a fifth wavelength λ5 for a pre-determined time period t5. The time delay Δt 203 does not need to be the same between each pair of wavelengths.
  • the time delay Δt 203 is introduced to allow for the next source of light to be switched into position to illuminate the eye.
  • the power and duration of each flash can be different, but their product - that is the flash energy - can be substantially similar (e.g., within a 10% difference in flash energy, or, in some other cases within a 50% difference in flash energy). Thus the more powerful flashes are shorter.
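The equal-energy scheduling just described can be sketched numerically. This is a minimal illustration only: the channel powers, target flash energy, and switching delay below are assumed values, not figures taken from the disclosure.

```python
# Sketch: equal-energy flash scheduling across spectral channels.
# All numeric values and function names here are illustrative assumptions.

def flash_durations(powers_mw, energy_uj):
    """Per-channel flash duration (ms) so each flash delivers the same energy.

    energy (uJ) = power (mW) * duration (ms), so duration = energy / power:
    the more powerful flashes are correspondingly shorter.
    """
    return [energy_uj / p for p in powers_mw]

def total_sequence_ms(durations_ms, switch_delay_ms):
    """Aggregate flash time, including the inter-channel switching delay."""
    return sum(durations_ms) + switch_delay_ms * (len(durations_ms) - 1)

powers = [5.0, 2.5, 10.0, 5.0, 4.0]        # mW, one per wavelength (assumed)
durations = flash_durations(powers, 50.0)  # target 50 uJ per flash (assumed)
total = total_sequence_ms(durations, 5.0)  # 5 ms per source switch (assumed)
# 'total' must stay below the pupil's constriction latency period so that
# pupil size is consistent over the whole set of spectral images.
```

The aggregate duration computed this way is the quantity that must be compared against the pupil's constriction latency period discussed above.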
  • Figure 12 shows a flowchart of a method of acquiring sequential images of the eye at different wavelengths in accordance with certain examples of the present disclosure.
  • the method begins.
  • the eye is illuminated at an illumination wavelength and an image of the eye (of the retinal fundus) illuminated at the illumination wavelength is captured.
  • the image can be captured with any suitable imaging device such as, for example, a CMOS imaging device.
  • it is determined if images have been captured for all illumination wavelengths. If not, the illumination wavelength is changed by switching the light source and the method proceeds back to action 214.
  • the method ends at 220.
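The flowchart of Figure 12 reduces to a simple capture loop. The sketch below is schematic: `illuminate` and `capture` stand in for hardware control calls and are hypothetical names, not part of the disclosure.

```python
# Schematic form of the Figure 12 acquisition loop; the callables stand in
# for hardware control and are hypothetical.

def acquire_spectral_sequence(wavelengths_nm, illuminate, capture):
    """Illuminate the eye at each wavelength in turn and capture an image,
    repeating until images exist for all illumination wavelengths."""
    images = {}
    for wl in wavelengths_nm:
        illuminate(wl)           # switch the light source into position
        images[wl] = capture()   # e.g. a CMOS imaging device exposure
    return images

# Usage with stand-in callables:
frames = acquire_spectral_sequence(
    [505, 586, 617, 850],
    illuminate=lambda wl: None,
    capture=lambda: "frame",
)
```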
  • Rapid image sequencing, such as described in relation to the examples of Figures 11 and 12, implies the need for high-speed switching of the optical paths from the various spectral sources (illumination sources, e.g., LEDs).
  • the rotating periscope switch used in the original design may not have, in certain applications, sufficient speed (see periscope mirrors R2 at Figures 3 and 5).
  • a revised light source selection module or light source module can be used.
  • Figure 13 shows an embodiment of such a light source module 300 within context of the retinal illumination optical path.
  • the light source module 300 includes a series of light sources 302, for example, LED sources of different wavelengths, that can be selectively turned on and off by a controllable power supply (not shown).
  • the light emitted by each LED 302 transmits through a pair of lenses 304 before impinging on a fixed reflector (mirror) 306.
  • the fixed reflector 306 directs light from its respective LED source toward the reflecting surface of a rotatable mirror 308, which directs the light from the LED source toward lens L16.
  • the rotatable mirror 308 is rotatable about an axis 310, by the motor unit 312. The rotatable mirror 308 thus optically couples the selected light source to the retinal fundus of the eye.
  • Figure 14 shows an expanded view of the light source module 300.
  • the reflecting surface 314 of the rotatable mirror 308 is shown in Figure 14. By having to move only one mirror (rotatable mirror 308) rather than two mirrors (R2 in Figure 5), the load on the motor unit 312 can be reduced and faster switching between LEDs can be achieved.
  • Figure 15 shows a similar embodiment of a light source selection module with, for clarity purposes, only two LEDs.
  • the light source selection module 316 of Figure 15 has two LEDs 318, each of which produces light at a respective wavelength.
  • Each LED 318 has associated thereto a lens 320, which collimates light from its respective LED 318 and transmits that light to a respective fixed reflector 306 (static mirror).
  • the light from the LEDs 318 is reflected by the fixed mirrors 306 towards a rotatable mirror 308 (rotating mirror), which directs the light towards the lens L16.
  • the rotatable mirror 308 is rotatable about the rotation axis 310, by a motor unit (not shown).
  • the LEDs can be energized (switched on) only when needed. Specifically, they are switched on as soon as the mirror movement has finished and the optical path is set up for illumination of the eye by the LED, and switched off before the next mirror movement begins.
  • Figure 15 is a side view and displays only two of multiple channels (a channel is equivalent to an LED and its related optics that illuminate the eye).
  • the rotating mirror 308 rotates about the axis 310. Light from each LED is directed to a lens 320 that substantially collimates the beam into one of large area and small divergence. The beam is deflected through 90 degrees at the static mirror 306 and directed towards the rotating mirror 308. After deflection through another 90 degrees back to a vertical orientation, the beam is directed to a second lens (L16) that converges the beam.
  • the rotating mirror 308 turns through 180 degrees to move from one channel to the other. In the presence of additional channels, the rotating mirror 308 would turn through a lesser angle to couple into another source (not shown). For example, if the sources were spread 30 degrees apart, from a vertical perspective, then 12 channels would be accommodated.
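The relationship between angular spacing and channel count noted above can be expressed directly. The helper names below are illustrative, not from the disclosure.

```python
# Sketch of the rotating-mirror channel geometry: sources spread evenly
# around the rotation axis, one selected per mirror orientation.

def channel_count(spacing_deg):
    """Channels accommodated when sources are spaced spacing_deg apart."""
    return int(360 // spacing_deg)

def switch_angle_deg(from_ch, to_ch, spacing_deg):
    """Smallest rotation needed to move between two channels."""
    delta = (abs(to_ch - from_ch) * spacing_deg) % 360
    return min(delta, 360 - delta)

n = channel_count(30)           # 30 degree spacing gives 12 channels
a = switch_angle_deg(0, 1, 30)  # adjacent channels: a 30 degree turn
```

With only two channels spaced 180 degrees apart, as in Figure 15, the required turn is the full 180 degrees; adding channels reduces the worst-case rotation and hence the switching time.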
  • the rotating mirror 308 of Figure 14 or of Figure 15 can be replaced by a series of sliding plane mirrors of fixed orientation.
  • Figure 16 shows two such sliding mirrors 322.
  • Each LED 318 has associated thereto a sliding mirror 322.
  • Each sliding mirror 322 can be slid between an in-path position (in-optical-path position) and an out-of-path position (out-of-optical-path position) by a respective actuator such as, for example, a solenoid actuator 324 (or a spring-loaded solenoid actuator).
  • the solenoid actuators 324 are operationally connected to, and controlled by, an actuator controller (not shown). This arrangement ensures that any transition between channels (LEDs) takes the same time, an advantage over the rotating mirror design. It also substantially removes air resistance during motion, so easing the drive requirement.
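The selection logic of the sliding-mirror arrangement of Figure 16 can be sketched as follows. The class and method names are hypothetical; real actuator control would replace the boolean state with solenoid drive signals.

```python
# Sketch of sliding-mirror channel selection: exactly one mirror in the
# optical path at a time. Names are illustrative, not from the disclosure.

class SlidingMirrorBank:
    def __init__(self, n_channels):
        # True means the mirror for that channel is in the optical path.
        self.in_path = [False] * n_channels

    def select(self, channel):
        """Slide the chosen mirror in-path and all others out-of-path.

        Because each mirror makes the same short travel, every transition
        between channels takes the same time, unlike the rotating-mirror
        design, where the turn angle depends on the channel pair.
        """
        self.in_path = [i == channel for i in range(len(self.in_path))]

bank = SlidingMirrorBank(4)
bank.select(2)   # channel 2 now illuminates the eye
```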
  • Imaging of the anterior of the eye uses infrared light for illumination, which does not distract the patient.
  • Two infrared LEDs can be positioned horizontally on each side of the main objective lens and illuminate the eye anterior with the incident light having an angle of incidence of typically 45 degrees with respect to the viewing axis.
  • Figure 17 shows an alternative embodiment where the anterior position of the eye can be monitored with an image sensor module 400 simultaneously to the camera 100 viewing the retina of the eye being monitored.
  • the camera 100 views the retina at the same time the image sensor module 400 views the anterior of the eye and there is no delay between the images.
  • the image of the anterior portion of the eye is processed to generate a position vector using an eye tracking technique.
  • the position vector (its direction and magnitude) is indicative of the direction towards which the eye points. This position vector is then used to control the positioning motors to set the correct position automatically and continuously. Moreover, only when the correct position is obtained is the instrument directed to capture images.
  • This automatic alignment technique requires that the eye position be substantially continuously monitored.
  • An example of a positioning arrangement is shown at Figure 18.
  • the image sensor module 400 is directed at the front of the eye 402 where it can image the cornea, iris and pupil 404. These are all illuminated as before by light from light sources such as, for example, LEDs (not shown).
  • the signal from the image sensor module is sent to a processor 406.
  • the image is analyzed there using a limbus tracker technique that enables the position of the eye to be determined.
  • the limbus tracker technique locates and identifies the circles corresponding to the borders of the iris.
  • the limbus tracker technique then generates x and y (horizontal and vertical) co-ordinates, and these are compared to values corresponding to a pre-determined well-aligned position (target position).
  • Error signals, that is, the differences between the ideal and actual co-ordinates, are used within control loops that direct the horizontal and vertical motor drivers 408, which in turn drive the respective motors (positioners) 410 to achieve the required position (target position).
  • the control may also extend to the longitudinal spacing between the eye and the equipment and be arranged to set this for the best retinal focus. This would operate on a third motor (positioner), which is not shown.
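The error-driven alignment loop of Figure 18 can be sketched as a simple proportional controller. The gain, step count, and function names below are illustrative assumptions; a real implementation would obtain the measured coordinates from the limbus tracker and command the motor drivers.

```python
# Sketch of the Figure 18 alignment loop: the tracked eye position is
# driven toward the pre-determined target position. Gain is assumed.

def position_error(measured_xy, target_xy):
    """Error vector between the tracked position and the target position."""
    return (target_xy[0] - measured_xy[0], target_xy[1] - measured_xy[1])

def control_step(measured_xy, target_xy, gain=0.5):
    """One proportional motor-drive step toward the target position."""
    ex, ey = position_error(measured_xy, target_xy)
    return (measured_xy[0] + gain * ex, measured_xy[1] + gain * ey)

# Iterating the loop converges on the target; per the passage above, image
# capture is triggered only once the residual error is small enough.
pos = (10.0, -4.0)
for _ in range(20):
    pos = control_step(pos, (0.0, 0.0))
```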
  • the autofluorescence image capture feature used a blue LED.
  • An improved result can be obtained using a source of near infrared wavelength in the optical wavelength region of 600 nm as the autofluorescence at the longer wavelength avoids the internal scattering that degrades the image when using blue light.
  • the captured, autofluorescence image is formed by irradiation at longer wavelengths such as those greater than 650 nm, using a high-pass optical filter located at position A2 in Figure 3.
  • a further change in autofluorescence image capture can be in the size of the corneal illumination, which can be increased from having a diameter of about 1 mm to a much larger diameter such as, for example, a diameter of 3 mm. This enables the launched energy to be increased by an order of magnitude.
  • the large corneal reflection is blocked by the aforementioned high-pass filter positioned at A2 in Figure 3.
  • the size of the source LED can be increased from 1 mm to 3 mm in diameter, and the aperture A3 of Figure 3 that limits the area illuminated can be correspondingly enlarged.
  • the oxygenation image capture feature indicated the use of illumination wavelengths of 505 nm, 617 nm and 850 nm.
  • a different oxygenation image can be created from the ratio of two images of contiguous spectral bands.
  • a first image (a first spectral reflectivity image) can be captured using illumination extending from, for example, 570 to 586 nm;
  • a second image (a second spectral reflectivity image) can be captured using illumination band extending from, for example, 586 to 610 nm. Determining the ratio of the first image to the second image, i.e., taking the spectral reflectivity of the two images, provides an indication of the retinal oxygenation and can serve to assess the retinal health.
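The two-band ratio described above can be sketched as a pixelwise operation. Pure-Python nested lists stand in for real sensor frames, and the sample pixel values are illustrative, not measured data.

```python
# Sketch: pixelwise ratio of two contiguous-band spectral reflectivity
# images (570-586 nm over 586-610 nm), per the passage above.

def ratio_image(band1, band2, eps=1e-9):
    """Pixelwise ratio of two equally sized spectral reflectivity images.

    Per the disclosure, the ratio of the two contiguous bands provides an
    indication of retinal oxygenation. eps guards against division by zero.
    """
    return [[p1 / (p2 + eps) for p1, p2 in zip(r1, r2)]
            for r1, r2 in zip(band1, band2)]

img_570_586 = [[0.40, 0.42], [0.38, 0.41]]   # illustrative reflectivities
img_586_610 = [[0.20, 0.28], [0.19, 0.41]]
oxy_map = ratio_image(img_570_586, img_586_610)
```

In practice the two frames would come from consecutive flashes of the rapid spectral sequence, so no image registration step is needed before taking the ratio.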
  • While energy at these wavelengths may be directly generated by LEDs, it can also be indirectly generated by LEDs, using a phosphor intermediate layer to convert blue light to that of a broad spectrum. This is a technique used to generate "white" light from LEDs. The white light can then be restricted to the required spectral band using an external optical bandpass filter. Referring again to Figure 5, this filter can be located, for example, in the optical path, between L15 and the periscope mirror R2, where the light is substantially collimated.
  • U.S. Patent Application Publication No. 2013/0107211A1 describes a technique used to image the choroid of the eye.
  • the apparatus described in the present disclosure can be adapted to include the ability to capture choroidal images and so significantly increase its utility and avoid the need for a separate apparatus.
  • the illumination (irradiation) for choroidal imaging is not injected into the eye through the pupil, as it is for retinal imaging. It is instead injected into the eye through the sclera.
  • the image collection arrangement through the pupil remains unchanged and the apparatus of the present disclosure can be used.
  • Figure 19 shows an example of how an eye 500 can be illuminated through the sclera 502 by a pair of light source modules 504.
  • the design of the instrument as shown in the example of Figure 3 can be adapted to capture stereoscopic images. Such images are of particular value in studying the optic nerve bundle where it passes through the retina at the blind spot and proceeds towards the brain.
  • a stereoscopic image requires two images each associated with a different viewing angle. Conventionally, this requires moving the imaging platform with respect to the object to be captured. However, this is not required for the instrument design of Figure 3.
  • the retinal image is passed through an annular area of the cornea. Each part of the annulus is associated with a viewing angle.
  • the mask is located at position A1 in Figure 3. It has three lateral separated positions. In one position, it provides no blocking of the image and this is the position used for all non-stereoscopic image capture events. In another position, the right half of the annulus is blocked. In a third position, the left half of the annulus is blocked. With a typical pupil diameter of 4 mm corresponding to the outer annular diameter, the angular difference between the two views is about 13 degrees, corresponding to eventual viewing at an apparent range of about 28 cm.
  • the stereoscopic image is later viewed on a display where both angular images are simultaneously presented.
  • Each angular view has a different color and the viewer wears standard stereo spectacles that include color filters such that the left eye sees one image and the right eye sees the other image.
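The relation between the 13 degree angular difference and the roughly 28 cm apparent viewing range quoted above can be checked with simple trigonometry. The 65 mm interocular distance used here is an assumed typical value, not a figure from the disclosure.

```python
# Sketch: apparent viewing range implied by a stereo view-angle difference,
# assuming a typical 65 mm interocular distance (an assumption).
import math

def apparent_range_mm(view_angle_deg, interocular_mm=65.0):
    """Range at which a natural stereo pair would subtend the given angle."""
    return interocular_mm / math.tan(math.radians(view_angle_deg))

r = apparent_range_mm(13.0)  # roughly 280 mm, consistent with ~28 cm
```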
  • Embodiments of the invention can be represented as a software product stored in a machine-readable medium (also referred to as a computer-readable medium, a processor- readable medium, or a computer usable medium having a computer-readable program code embodied therein).
  • the machine-readable medium can be any suitable tangible medium, including magnetic, optical, or electrical storage medium including a diskette, compact disk read only memory (CD-ROM), memory device (volatile or non-volatile), or similar storage mechanism.
  • the machine-readable medium can contain various sets of instructions, code sequences, configuration information, or other data, which, when executed, cause a processor to perform steps in a method according to an embodiment of the invention.
  • Those of ordinary skill in the art will appreciate that other instructions and operations necessary to implement the described invention can also be stored on the machine-readable medium.
  • Software running from the machine-readable medium can interface with circuitry to perform the described tasks.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Chemical & Material Sciences (AREA)
  • Biochemistry (AREA)
  • Immunology (AREA)
  • General Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Optics & Photonics (AREA)
  • Eye Examination Apparatus (AREA)
EP14822887.7A 2013-07-12 2014-07-11 Verfahren und vorrichtung zur augenhintergrundüberwachung Withdrawn EP3019070A4 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/940,668 US8807751B2 (en) 2008-04-22 2013-07-12 Retinal fundus surveillance method and apparatus
PCT/CA2014/050663 WO2015003274A1 (en) 2013-07-12 2014-07-11 Retinal fundus surveillance method and apparatus

Publications (2)

Publication Number Publication Date
EP3019070A1 true EP3019070A1 (de) 2016-05-18
EP3019070A4 EP3019070A4 (de) 2017-03-22

Family

ID=52279269

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14822887.7A Withdrawn EP3019070A4 (de) 2013-07-12 2014-07-11 Verfahren und vorrichtung zur augenhintergrundüberwachung

Country Status (3)

Country Link
EP (1) EP3019070A4 (de)
HK (1) HK1223534A1 (de)
WO (1) WO2015003274A1 (de)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108324241B (zh) * 2018-01-22 2024-02-02 深圳盛达同泽科技有限公司 多光谱光源、眼底成像系统和成像方法
NL2026025B1 (en) * 2020-06-15 2022-02-17 Akkolens Int B V Apparatus and method to measure accommodative structure of the eye
WO2021256922A1 (en) * 2020-06-15 2021-12-23 Akkolens International B.V. Apparatus and method to size the accommodative structure of the eye
WO2022011420A1 (en) * 2020-07-14 2022-01-20 Centre For Eye Research Australia Limited Non-mydriatic hyperspectral ocular fundus camera
DE102020209379A1 (de) * 2020-07-24 2022-01-27 Carl Zeiss Meditec Ag Verfahren und Vorrichtung zur Einstellung und Kontrolle von Parametern des Beleuchtungsfeldes ophthalmologischer Geräte
FI130181B (fi) * 2021-05-28 2023-03-30 Teknologian Tutkimuskeskus Vtt Oy Laite ja menetelmä valonsäteiden yhdistämiseksi
WO2023049536A1 (en) 2021-09-21 2023-03-30 Halliburton Energy Services, Inc. Inflatable element system for downhole tools
WO2024075064A1 (en) * 2022-10-05 2024-04-11 Eyecare Spa Methods and apparatus for detection of optical diseases

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9813041D0 (en) * 1998-06-16 1998-08-12 Scient Generics Ltd Eye tracking technique
US6276798B1 (en) * 1998-09-29 2001-08-21 Applied Spectral Imaging, Ltd. Spectral bio-imaging of the eye
US7360895B2 (en) * 2000-07-14 2008-04-22 Visual Pathways, Inc. Simplified ocular fundus auto imager
JP4563828B2 (ja) * 2005-01-27 2010-10-13 株式会社トプコン 眼底検査装置
DE112006001217T5 (de) * 2005-05-13 2008-03-27 Customvis Plc, Balcatta Schnell reagierende Augennachführung
US20090137893A1 (en) * 2007-11-27 2009-05-28 University Of Washington Adding imaging capability to distal tips of medical tools, catheters, and conduits
WO2009129624A1 (en) * 2008-04-22 2009-10-29 Annidis Health Systems Corp. Retinal fundus surveillance method and apparatus
US8807751B2 (en) * 2008-04-22 2014-08-19 Annidis Health Systems Corp. Retinal fundus surveillance method and apparatus
JP2011243440A (ja) * 2010-05-19 2011-12-01 Olympus Corp 照明装置

Also Published As

Publication number Publication date
WO2015003274A1 (en) 2015-01-15
EP3019070A4 (de) 2017-03-22
HK1223534A1 (zh) 2017-08-04

Similar Documents

Publication Publication Date Title
US8807751B2 (en) Retinal fundus surveillance method and apparatus
CA2759646C (en) Retinal fundus surveillance method and apparatus
US10542885B2 (en) Retinal cellscope apparatus
WO2015003274A1 (en) Retinal fundus surveillance method and apparatus
US9480394B2 (en) Apparatus and method for imaging an eye
CN209611102U (zh) 适配器和包含适配器的眼底照相系统
JP5651119B2 (ja) 眼の画像化装置及び方法
US6685317B2 (en) Digital eye camera
US20060268231A1 (en) Illumination method and system for obtaining color images by transcleral ophthalmic illumination
US8142018B2 (en) Reflectance measurement of macular pigment using multispectral imaging
WO2003061460A2 (en) Device and method for optical imaging of retinal function
AU2004233640A1 (en) Measurement of distribution of macular pigment
Harings et al. Real-time video funduscopy with continuously moving fixation target
Bartczak et al. Spectrally tunable light source based on a digital micromirror device for retinal image contrast enhancement
EP3440990A1 (de) System zur abbildung eines augenfundus
Alterini Hyperspectral imaging system for the fast recording of the ocular fundus
Sprowl Reflex free fundus imaging of wide field of views through small pupils using DMD projector illumination
GB2570939A (en) Imaging device and method of imaging a subject's eye
WO2008133697A1 (en) Reflectance measurement of macular pigment using multispectral imaging
EP3440989A1 (de) System zur abbildung eines auges

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20160212

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
RIN1 Information on inventor provided before grant (corrected)

Inventor name: POWELL, IAN

Inventor name: GRIBBEN, JEREMY LLOYD

Inventor name: BOATE, ALAN

Inventor name: KAHN, DAVID ALEXANDER

A4 Supplementary search report drawn up and despatched

Effective date: 20170217

RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 3/13 20060101ALI20170213BHEP

Ipc: A61B 3/14 20060101ALI20170213BHEP

Ipc: A61B 3/00 20060101ALI20170213BHEP

Ipc: A61B 3/12 20060101AFI20170213BHEP

Ipc: A61B 3/15 20060101ALI20170213BHEP

REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 1223534

Country of ref document: HK

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20190201

REG Reference to a national code

Ref country code: HK

Ref legal event code: WD

Ref document number: 1223534

Country of ref document: HK