WO1996017545A1 - Electronic imaging system for retinal examination and treatment - Google Patents

Electronic imaging system for retinal examination and treatment

Info

Publication number
WO1996017545A1
Authority
WO
WIPO (PCT)
Prior art keywords: image, images, image acquisition, display, color
Application number
PCT/US1995/015996
Other languages
French (fr)
Other versions
WO1996017545A9 (en)
Inventor
Sven-Erik Bursell
Lloyd M. Aiello
William Kelley Gardner
Original Assignee
Joslin Diabetes Center, Inc.
Application filed by Joslin Diabetes Center, Inc. filed Critical Joslin Diabetes Center, Inc.
Priority to AU45127/96 (patent AU706720B2)
Priority to JP8517808A (patent JPH10510187A)
Publication of WO1996017545A1
Publication of WO1996017545A9

Classifications

    • A61B 3/145: Arrangements specially adapted for eye photography by video means
    • A61B 34/35: Surgical robots for telesurgery
    • G16H 10/60: ICT for patient-specific data, e.g. electronic patient records
    • G16H 30/20: ICT for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H 30/40: ICT for processing medical images, e.g. editing
    • G16H 40/67: ICT for remote operation of medical equipment or devices
    • G16H 80/00: ICT for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
    • H04N 13/15: Processing image signals for colour aspects of image signals
    • H04N 13/156: Mixing image signals
    • H04N 13/189: Recording image signals; Reproducing recorded image signals
    • H04N 13/194: Transmission of image signals
    • H04N 13/239: Image signal generators using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N 13/341: Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing

Definitions

  • the present invention relates to systems and methods for examining and treating the retina of an eye, and more particularly, to imaging systems and communication systems that provide stereoscopic images of the retina of an eye.
  • Diabetes is the leading cause of blindness in working age adults. It is a disease that, among its many symptoms, includes a progressive impairment of the peripheral vascular system. These changes in the vasculature of the retina cause progressive vision impairment and eventually complete loss of sight. The tragedy of diabetic retinopathy is that in the vast majority of cases, blindness is preventable by early diagnosis and treatment, but screening programs that could provide early detection are not widespread.
  • ETDRS (Early Treatment Diabetic Retinopathy Study)
  • an image collection or acquisition unit provides digitized high resolution true color images of the retina of the eye and a computer network receives and manages these images causing them to be displayed as visual images on a display/monitor workstation for diagnostic examination.
  • the images are stereo images.
  • the computer network is interfaced with a medical record database or means for entering medical history information, such that the retinal images and medical records are linked in a relational database and accessed together.
  • a telecommunications link interconnects the computer network to the image acquisition station where the patient is actually examined, with a bandwidth such that fundus images are made available through the computer without degradation for viewing on the examination displays remotely from the room, building, or city where the images are acquired from the patient.
  • the images may be taken and viewed stereoscopically for analysis by a trained ophthalmologist or by other trained personnel.
  • a graphic interface operating in conjunction with special processing modules identifies image features, or changes, of diagnostic significance and allows the viewer to enter textual data in the record.
  • the telecommunications system preferably includes a fiber optic link for the image data, and may connect to teleconferencing equipment at either end, allowing an optometrist or trained technician in one location to take and transmit stereo fundus images while consulting in real time with a specialist who analyzes those images at another location.
  • the image acquisition station includes a relatively simple apparatus in which a relatively unskilled operator may aim, adjust and actuate the camera to produce true color images, while the examination station includes operator control units for retrieving, displaying and manipulating the images acquired from the image acquisition station.
  • These controls may include image processing modules implemented in software for enhancing feature definition, selectively filtering colors from the image, or implementing block transforms or other imaging processes to accentuate the appearance of vessels or surrounding tissue appearing in the stereo image, or identifying changes since a previous imaging session.
  • the image examination stations may receive or retrieve images and display them in spatial order as standard fields, may project a series of standardized images such as the fundus grading slides available from the Fundus Photograph Reading Center of the University of Wisconsin (P.O. Box 5240, Madison, Wisconsin, U.S.A. 53705), and may project comparative sets of stereo images for clinical review, feature enumeration and data entry, or other analytic or recording purposes.
  • the image control or manipulation units may also include control units which operate in real time for sending direct image control signals to the image acquisition station to adjust the level of image resolution, the illumination color or intensity, or the field of view or image orientation.
  • the telecommunications link advantageously includes an audio channel, to be used for simple verbal consultation and instruction, such as instructing the person operating the imaging camera to aim the camera at a specific site or lesion or make other adjustments or observations.
  • the audio channel may also be used for remote consultations with the evaluating doctor.
  • the image acquisition unit includes high definition digital frame true color imaging equipment, such as a camera or pair of video cameras attached to an ophthalmic imaging device, or a multiplexed single camera aimed at the focal planes of a stereo fundus camera ophthalmic imaging assembly, to generate high definition video frame data, preferably at a sufficient speed to provide time resolution motion pictures or high resolution views of the retinal field.
  • the video image acquisition unit also includes an adjacent or proximal control unit which may include means for inserting alphanumeric information display legends within the video frames to provide information such as patient name, medical record number, date of image and precise time information.
  • the local control unit may also provide time information and synchronization with a dye injector or other stimulus so that the time marked on each frame reflects the time interval elapsed since the start of a fluorescein or dye angiography, or other procedure.
  • the stereo video frames do not just show the retinal tissue and vasculature, but rather, successive frames provide an evolving image which quantitatively shows both the circulatory capacity and the flow velocity in each vessel or region of the retina. This, together with the elapsed time indication, provides a quantifiable record of retinal circulation.
  • the image acquisition unit preferably includes an adjustment for color matching an image frame before the frame is recorded, allowing all frames of a set, whether recorded at the same or different sessions, to achieve optimal feature visibility and avoid heterochromic artifacts that might otherwise arise when comparing or processing multiple views.
  • the image frames are preferably coded and stored to identify their relative position on the fundus, and the image examination stations selectively display the acquired images as multi-image sets arrayed in accordance with their fundus position, or display images as one or more sets of stereo pairs, or as isomorphic enlargements of a region designated by the operator.
  • a fiber optic link interconnects at least a portion of the image acquisition or storage with the computer network, and true color digital images may be taken and displayed in real time to effectively place the specialist-ophthalmologist in immediate control of the diagnostic procedure.
  • treatment instruments may also be set up at the image acquisition station and continuously monitored or controlled from the examination/display room.
  • Such units may include laser surgery or coagulation units for performing retinal microsurgery.
  • the operator control units may include means for controlling the power level and aiming of such treatment instrument remotely from the examination room.
  • When operated in this mode, rather than aiming and firing the coagulation unit directly, the control system preferably includes an image analysis module which allows the ophthalmologist to select, on the stereo image video frames displayed in the examination room, the tissue sites which are to be treated at the remote station. Once the retinal site coordinates on the image frame are identified, this information is transmitted over the network to the image acquisition site.
  • the laser coagulation unit at the patient site preferably includes image analysis and target tracking software which allows the identified coordinates on the frame viewed by the ophthalmologist to be located in the current frame scan of the patient's eye, and control interface software to aim the laser unit accordingly.
  • intervening movement of the patient's eye, or destabilization of the instrument aiming system during the time elapsed in transmission and display, is corrected by correlating successive frames of the video image to identify the precise physical spot in the current frame at which treatment is to occur (an illustrative correlation sketch appears after this list).
  • Figure 1 illustrates a system constructed according to the present invention for retinal evaluation
  • Figures 2A and 2B illustrate standard fields and a thumbnail array of displayed images.
  • Detailed Description of the Illustrated Embodiment
  • Fig. 1 illustrates a system 10 constructed according to the present invention, for acquiring and examining images of the retina of an eye.
  • the system 10 includes a data collection unit 12, a network unit 14, a data storage unit 16, a central computer 18, remote work station units 20A and 20B, and a telecommunication link 22.
  • the network 14 interconnects the other elements of this system 10 so that each element is in communication with each other element.
  • the system 10 is adapted for communicating information and command signals between one image acquisition station and another image examining station.
  • the system provides an apparatus that allows an operator at a remote work station unit 20 to acquire images and examine the images of a patient located proximal to the data collection unit 12.
  • a basic embodiment of the system of the present invention may be assembled from readily available computing, image conditioning and processing components interfaced with available ophthalmoscopic instruments.
  • a suitable set of network and computer elements to implement the system is described in a system description below, and preferably includes special dedicated cards for graphic acceleration, e.g., image handling and transmission, although the basic image acquisition workstation may operate with relatively simple optical and computer hardware.
  • the data collection or image acquisition unit 12 illustrated in system 10 includes a set of hardware equipment for acquiring high resolution full color retinal images, including a computer processing system 30, a first camera 32 and a second camera 34, a network interface 36 and a set of optional stereo viewing goggles 38.
  • the two video cameras 32, 34 are directed at the focal planes of a stereo fundus camera imaging assembly (not shown), so that they generate views from two different angles, across a triangulation base, of a commonly viewed region of the patient's eye.
  • this allows photometric analysis of fundus tissue, which, in conjunction with special applications processing, is partially or completely automated and integrated with the relational database information of system 10.
  • a single camera may be directed to alternately image each of the two focal planes of the ophthalmic instrument, or, as is commonly done for photographic image recording, a single fundus camera may be manually shifted by the operator to successively take two different images of the retina which form a stereo pair; in that case, a single camera and no special switching are used.
  • the use of two separate video cameras with a stereo fundus camera is preferred so as not to reduce the total amount of light available for forming each image frame and to form uniform and well aligned stereo images at a single instant in time.
  • the video camera 32 or 34 may be a CCD array or other special two- dimensional sensing device, and need not be a complete "camera" in the consumer product sense.
  • in the presently preferred embodiment the camera may be a commercially available camera of suitable resolution which produces an output video signal in NTSC, PAL or SECAM format.
  • a direct digital camera may also be used, that is, one having a formatted digital output, although in general special drivers will be required to interface such a camera with computer-manageable video frame handling circuitry.
  • the data collection unit 12 is adapted for acquiring stereo images of the fundus of the patient's eye.
  • the camera elements 32 and 34 interface to the two stereo image planes of a Donaldson stereo fundus camera (not shown), and camera elements 32 and 34 are video camera elements of the type suitable for generating high resolution image signals representative of a visual image encoded in a video format such as the NTSC, S-video or other format; each camera element 32 and 34 connects to the computer processing unit 30.
  • the computer processing unit 30 includes a video interface board that is adapted to interface with the camera elements 32 and 34 to receive video signals therefrom along transmission paths such as impedance matched coaxial cables.
  • the video interface cards of the computer unit 30 contain special high speed circuitry for digitizing the video signals and converting them into high resolution, e.g., 1280 x 1024 pixel, video frames. This corresponds to an effective resolution of ten to twenty micrometers, with typical fundus camera objective optics.
  • the computer unit 30 also has a provision for processing the captured stereo image frames to generate or allow generation of three dimensional images that can be manipulated and viewed, as will be explained in greater detail hereinafter, by an operator at the data collection unit 12, or at the remote work stations 20A and 20B.
  • the processor system 30 preferably has a foot switch activation signal for initiating image acquisition, with the camera signals fed to an audio/video processing board such as the J300 board of Digital Equipment Corporation.
  • This circuit performs any necessary video scaling, filtering and color dithering of the video signals received in a standard format from the camera, and produces a normalized output in NTSC or PAL video, or in composite or S-video formats, that is suitable for digital processing.
  • The output of the J300 board is fed to a twenty-four plane double-buffered graphics accelerator circuit, which digitizes and formats the frames, and provides efficient internal handling and processing in real time for true color high resolution video frames.
  • the accelerator card allows left and right video frames to be alternately displayed on the display at twice the normal 70 Hz refresh rate, for stereo viewing.
  • Frame-to-frame spatial correlation then allows a three-dimensional stereo imaging and display program to build and manipulate three- dimensional views of the imaged field of varying appearance, and to form fractal maps of the displayed retinal image topography.
  • the interface element 36 is a separate element connected via a transmission path to the computer 30, and connects the computer unit 30 to the network 14 for transmitting information signals and command signals therebetween.
  • the interface element 36 can be a standard ethernet network card that is adapted for interfacing the computer unit 30 to a standard ethernet network such as the Novell network system, and preferably includes high bandwidth optical fiber data ports and adapters.
  • the display at the data collection unit 12 advantageously operates with a set of stereo goggles.
  • the data collection unit 12 is adapted for acquiring and viewing stereo images that represent the fundus of a patient's eye and presenting them in an alternating time-sequential fashion on a display that enhances the viewer's perception of depth and shading.
  • the computer monitor element 30A works in concert with the stereo goggles 38 to allow an operator to view three dimensional images of the patient's retina.
  • the monitor element 30A is a high resolution video monitor that operates at a frame rate of 140 Hz for displaying a sequence of retinal images taken by the cameras.
  • Each of the frames is captured and indexed, and recorded in temporary storage, such as local disc storage, and the sequence of frames is sent on the network where a massive memory device such as a 700 Gbyte storage unit stores the images as a patient image record.
  • This storage unit can store the records of hundreds or thousands of patients with their fundus images, and may be periodically backed up so that complete records are available for all current patients (a rough capacity estimate appears after this list).
  • the frame sequence, which may be stopped, if desired, to allow individual frames to be viewed, is viewed through the stereo goggles 38, which act as a time-switched pair of shutters over the operator's eyes.
  • stereo goggles 38 can be of the type manufactured by and commercially available from the StereoGraphics Corporation and used for image display and analysis, in fields such as stereographic survey image analysis.
  • Each viewing lens of the goggles is formed of an electroactive material which is opaque when energized, and when not energized is clear.
  • An IR receiver on the goggles synchronizes the actuation of the left and right lenses with the display of right and left views on the display, so that each eye is blanked during the time the other side's view is displayed on the monitor.
  • the illustrated network system 14 is adapted for carrying video frame information signals and other data or command signals between the image acquisition site and a remote image examination site.
  • the network element 14 is adapted for carrying high volume information signals representative of many digital stereoscopic images and the various computers or processors are also interfaced with a medical records management program to store the images in a relational database, and make the medical record information available at the examination stations.
  • the network includes an optical fiber cable system, and implements an ethernet network protocol.
  • a multichannel direct microwave communications link may be effective to connect the image acquisition unit to a central computer system located within a few miles crosstown, while intra-building optical fiber channels may interconnect medical records and the examination stations within a single building or complex.
  • Leased channel space on a fiber optic link may interconnect acquisition units 12 in distant cities or remote neighborhoods, and satellite up-and-down links may also be used effectively.
  • the invention may be practiced in various forms utilizing low frame rates, or at full resolution but with motion studies delayed somewhat to allow transmission and storage of a frame sequence before viewing.
  • the network 14 includes basic server software and circuitry for interfacing the central computer system 18 and data storage element 16 with a plurality of work station/display units 20a, 20b, etc., of which two are shown.
  • the data storage element 16 is preferably an optical storage element with associated server circuitry, commonly referred to as an optical jukebox, having a large capacity in the range of seven hundred gigabytes. Incoming image frames from the image acquisition site are placed in storage and also made available over the network directly to an examination unit 20, during the course of a procedure, as discussed further below.
  • image acquisition is carried out at a first site such as a mobile screening unit or small clinic, or in a medical center in a community which may be remote from the primary hospital housing the units 20a, 20b and computer 18, using equipment as described above in relation to image acquisition unit 12.
  • the context of the examination may involve a local optometrist or ophthalmologist responsible for the patient, who actually sets up or controls the fundus camera or other ophthalmic instrumentation in the acquisition room.
  • the network connection by satellite, leased line, fiber optic or other communications link provides the video frame and other audio and digital data to the network which, in general, with the other units described below is located at a central research center or ophthalmology department of a teaching hospital.
  • the network and server with a plurality of image examination stations thus form the hub for a number of image acquisition units 12.
  • Each examination unit 20 may be a relatively powerful work station, with memory for storing and immediately accessing 50-200 Mbytes of data and processing capability including a video accelerator for displaying high resolution true color images.
  • the unit 20 includes a high definition video display, preferably as noted above, a stereo imaging display with specialized viewing discrimination equipment such as infrared-synchronized stereo viewing goggles, and preferably also a digitizing medical records pad and/or keyboard entry system on which a specialist ophthalmologist or specially trained technician viewing the frame can write observations and conclusions of diagnostic relevance.
  • Such a hand-held pad is of the type commonly used by the physician to create standardized medical examination records, and for this application is configured to interface directly to the workstation and software which accommodates the image data in a relational database, for accessing and displaying the existing records in the central data storage, or for filling in relevant medical history and other data.
  • a suitable physician's data entry pad is sold by the Datamedic Corporation, which facilitates entry of medical data and its linking and correlation with data displayed on the screen at the time.
  • examination units 20a, 20b may be located at a central site, such as a research center or the ophthalmology department of a teaching hospital.
  • the sites 12 may be simply clinic examination rooms located in the same hospital, but also, due to the unique system architecture, may consist of plural widely separated clinics at different remote sites.
  • the work performed at examination units 20a, 20b may involve the meticulous review of images and tabulation of diagnostic details, such as counting the number of microaneurysms, leaks, scars or the like in particular retinal image fields, or deriving other quantitative measures, as described, for example, in the ETDRS Reports Nos. 10-13 at pages 786-834 of Volume 98, Ophthalmology (May 1991 Supplement). It may also involve other less quantitative or less time consuming clinical judgments, as well as various forms of image manipulation, color enhancement, and record annotation.
  • the precise form of imaging effected at acquisition sites 12 will vary depending on the nature of acquisition equipment available at each site 12 (e.g., ophthalmoscope, fundus camera, stereo fundus camera) and the nature of the image capture at that site.
  • a video camera with subsequent digital frame conversion will produce true color image frames with a 600-700 line field resolution, each capable of transmission as under a megabyte of data, while a high resolution direct digital camera such as a Kodak DCS 420 or DCS 460 with a 1500 x 1250 pixel frame will require 2-4 megabytes of information (a back-of-envelope size calculation appears after this list).
  • applicant has found it necessary to use true color images, and to transmit using only image compression protocols that are not lossy.
  • the availability of a high bandwidth communications channel such as a satellite or fiber optic link is critical to real time consultations, although if only a lesser bandwidth is available it is still possible to transmit images (albeit over several minutes) and undertake remote consultations using the system.
  • DEI-470 color video camera manufactured by Optronics Engineering, or that company's higher resolution DEI-750 3-chip RGB camera
  • the retina may be imaged in true color using light levels comparable to or less than those employed in the viewing system of a typical retinal fundus camera or ophthalmoscope, and with a resolution of 10-20 micrometers.
  • the camera 32 and image acquisition and control unit 12 are configured to produce "normalized" or color-matched image frames.
  • One suitable retinal imaging assembly is a fundus camera such as a Nikon NF-505 or Topcon TRC-50 fundus camera assembly which has its customary flash, film transport and supporting electronics removed, and replaced by an Optronics video imaging camera mounted to electronically convert the focused fundus image.
  • an image acquisition processor receives the electronic image data and attends to any necessary signal conversion, formatting, framing or other processing for display or transmission of the electronic images.
  • a full color LCD display is mounted in a gimbaled setting to the side of the camera where it may be directly viewed by the operator as the camera is aimed at the fundus.
  • the electronic imaging assembly may be a direct digital imaging camera, in which all or a portion of the image signal processing and framing circuitry is incorporated in the camera itself, or may be an analog (video) output imaging tube or array in which the output video signal is subsequently digitized by processor 12 and associated electronics.
  • the frames imaged by the electronic imaging assembly are also delivered to and displayed on the pivoting LCD color screen, thus allowing a direct and large-scale view of the digital color image being captured by the camera.
  • retinal images, due to the intrinsic color of background tissue, are of a generally orange-red hue.
  • the electronic imaging assembly attached to the fundus camera is intended to function with a relatively low level of continuous illumination rather than flash illumination, and its images may quickly become washed out as illumination increases, or may become a dark brownish purple as the illumination level decreases to a point near the imaging threshold.
  • These light response characteristics stem from the saturation properties of the image sensing CCD portion of the camera, which preferably has an imaging sensitivity below about 0.01-0.2 lux, as well as from the spectral responses of the three different color pixel sets or their filters. Applicant has found that this level of variability, even when the light source is initially set so that the observed tissue appears normal, impairs the diagnostic value of acquired images.
  • the adjustable illumination source may be, for example, a tungsten linear filament lamp of up to about ten or fifteen watts power.
  • the adjustable source has a spectral output with peaks in the yellow/red region of the spectrum, and the peaks shift from red toward shorter wavelengths as the lamp drive voltage is increased. Variation in this spectral response therefore allows the operator to vary the visual appearance of retinal tissue as the camera is focused on the fundus, and results in a generally greater variation in the color or hue of the displayed images being captured by the camera.
  • the image processor 12 operates in an adjustment mode by providing an image of the current camera image "S" to the LCD display, which is arranged so the operator may simultaneously view the LCD display while looking through the viewing port of the camera assembly.
  • the two images are visually compared by the operator as lamp drive voltage is adjusted, so that the displayed video image ranges through a range of brightness, contrast and relative hues, while the actual observed tissue may appear generally brighter and yellower.
  • the operator then adjusts the lamp until the two images (the LCD display and the direct visual appearance) match as closely as possible in color and intensity in the major (e.g., background tissue) regions of the image.
  • This color-matching operation can be performed with accuracy by the human eye, and results in the acquisition of images the digital representations of which are color-matched, despite the actual color balance adjustments of the monitor upon which the operator is observing the images and the camera's spectral response.
  • This color normalization requires that the color balance "transformation" introduced by the monitor in transforming electronic-to-light images be approximately the inverse of the color transformation introduced by the camera in transforming light-to-electronic image signals (this inverse relationship is sketched after this list).
  • within the color range encompassed by retinal images, this is dependably achieved by adjusting the camera and display color settings to be approximately inverse under standard light conditions.
  • the necessary recognized settings are generally different from the standard factory-defined default settings for both the video camera and the monitor, but result in a display that approximates the actual clinical appearance of tissue.
  • the operator then needs only to compare the color of the digitized retinal image on the monitor screen to that directly observed through the fundus camera eyepiece, and make minor adjustments to the intensity level of the fundus camera viewing light, in order for the color content of the digitized retinal image on the monitor screen to be the same as that observed directly through the fundus camera eyepiece.
  • the institution of this adjustment protocol for the correct color of the retinal image means that it is no longer necessary to rigorously calibrate both monitor and video camera for every patient.
  • each frame of the pair may be normalized as described above.
  • the image acquisition system can be operated by personnel with minimal training, primarily that associated with recognition of the different areas of the retina that are needed for diagnosis of the level of diabetic retinopathy (i.e., the "standard fields") and some practice in obtaining stereo pair images of the same retinal field, as well as basic clerical skills, e.g., familiarity with entering patient demographic data in the appropriate fields of a pen pad medical record entry system.
  • the operator need not be skilled in the art of taking retinal photographic images; because the acquisition is video-based, acquiring images of good quality is as simple as operating a commercial home-use video camera.
  • the invention, when practiced with the described Optronics cameras, allows accurate image capture at low light levels, so it is less invasive than the use of flash exposure, which is painful. As such, the capture of continuous illumination low light level images is expected to remove a major barrier to universal screening and to promote the practice of routine retinal monitoring.
  • the invention may be practiced with other image capture technology, such as the direct digital output DCS-420 Kodak single frame camera. Such a camera may provide higher resolution, although it would generally require an auxiliary flash source to produce its larger format images.
  • a principal application of the images will be the acquisition of high resolution true color baseline images, including stereo pair images, of the retina and retinal vasculature, and the acquisition of stereographic motion pictures of standardized ophthalmologic procedures such as fluorescein angiography, which will reveal the dynamic properties of circulation at the time the test is run.
  • True color images at the described twenty micron or better level of resolution will provide sufficient information for a trained ophthalmologist to evaluate the status of retinal pathology or incipient pathological conditions, to evaluate the efficacy of medications, and provide prognostic consultations.
  • the storage of video frame data and the video monitor 30a at the image acquisition site allow the referring physician or treating technician to show the patient the precise images involved and discuss the implications of what is seen on the screen.
  • the display monitors are preferably true color twenty-one inch or larger monitors, or where group consultations or teaching are to use the images, a stereo-capable projector with low-persistence green phosphor is used to allow projection of sequences of retinal images.
  • the overall image management program is preferably implemented by adding it to a medical records platform that links patient records to the image records, and allows both to be augmented or annotated at the examination stations.
  • the images are preferably acquired and marked as sets of seven standard fields as shown in Figure 2A and defined in the Early Treatment Diabetic Retinopathy Study (ETDRS).
  • the image display software provides a first display mode in which the seven fields are displayed as thumbnail stereo pair images, as shown in Figure 2B, with the same layout used for both left and right eyes.
  • the workstation image management software allows the operator to select the type of full-scale image display, so that by clicking on the left one of a thumbnail stereo pair, the actual full-scale image or chosen set of images is displayed. Display modes are provided for viewing individual fields, as stereograms, or pairs of the fields for comparison, as well as pairs such as present image/previous visit, filtered or color-enhanced images, and grid-matched magnified images to better exhibit particular tissue.
  • the image display sequences provide all the diagnostic features required by the ophthalmologist to make an assessment of the level of diabetic retinopathy.
  • the design preferably provides a screen without window decorations, maximizing available display area and image viewing without the distractions of display icons or pull-down menus.
  • the display functions incorporate true color stereo display, with as many as four separate retinal images displayed in stereo on the same screen. This provides the facility for meaningful comparative evaluations. For example, image fields from three prior visits can be displayed with a corresponding image field from the current visit to determine the rate of progression of retinopathy.
  • Another feature provides for the side-by-side display of ETDRS standard retinal images previously scanned into the system, to be compared with the patient retinal images so as to facilitate accurate determination of the level of retinopathy.
  • This provides a standard for assessment of level of retinopathy, namely a comparison with the ETDRS standard retinal images.
  • the image management software allows display of images in the traditional red-free format, which provides enhanced contrast of the retinal vasculature.
  • the software simply applies a digital filter to the already acquired retinal images in order to provide an image comparable to a red-free image. That is, the software sets the red and blue color components of the image equal to zero, to automatically display a green high contrast image of the retinal vasculature (a minimal sketch of this filter appears after this list).
  • the flexibility inherent in these digitized images for color manipulation provides enhanced diagnostic information from different layers of the retina. For example, blue light is primarily reflected from the front regions of the retina while red light is primarily reflected from deeper retina regions.
  • the processed images selectively provide pathological information from different depths of the retina.
  • these images may be achieved from a single video image, without using filtered sources or other special preparations at the image acquisition stage.
  • a high contrast green image is obtained.
  • the invention further contemplates a modified fluorescein angiograph procedure, wherein monochrome fluorescein angiograms are captured as moving pictures, i.e., as a sequence of successive stereo frames.
  • Images may also be acquired by the aforesaid Optronics camera fitted to an indirect ophthalmoscope. Focusing may be set up to respond to a foot switch or voice-activated control. The camera may be interfaced to one of the viewing eyepieces of the slit lamp. This instrumentation provides good quality anterior segment images, with lesser viewing light intensity than is applied by the slit lamp.
  • the network may carry data signals including control or image adjustment signals by which the ophthalmologist at the examination unit 20 directly controls the image acquisition occurring at the acquisition unit 12.
  • command signals such as zoom magnification, steering adjustments, and wavelength of field illumination may be selectively varied remotely to achieve the desired imaging effect (an illustrative command encoding is sketched after this list).
  • questionable tissue structures requiring greater magnification or a different perspective for their elucidation may be quickly resolved without ambiguity by varying such control parameters.
  • control signals may include time varying signals to initiate stimulation with certain wavelengths of light, to initiate imaging at certain times after stimulation or delivery of dye or drugs, or other such precisely controlled imaging protocols.
  • the digital data signals for these operations may be interfaced to the ophthalmic equipment in a relatively straightforward fashion, provided such equipment already has initiating switches or internal digital circuitry for controlling the particular parameters involved, or is capable of readily adapting electric controls to such control parameters as system focus, illumination and the like.
  • a UNIX operating system of Digital Equipment Corporation was employed, with a separate program, X11 R5, to control the graphics display.
  • the electronic medical record hardware required only a processor such as an Intel 486, with 8 MB of memory and a 300 MB disk drive, a keyboard, and a pen pad display/entry system.
  • the pen pad/display is installed such that it appears in the same plane as the image monitor for the image acquisition.
  • the necessary software on a windows based work group system includes standard programs for pen pad operation, the Clinictec NextGen medical record platform, and an Oracle driver for accessing the server in a relational database.
  • certain procedures such as laser photocoagulation of retinal vessels, for which the necessary ophthalmic instrumentation already involves highly integrated and computer controlled aiming, focusing and actuation circuitry, may be adapted to remote operation in conjunction with the stereo imaging work stations provided by the present invention.
  • the ophthalmologist at examination station 20 may not only view the stereo images and identify pathologic sites or processes, but may actively control the photocoagulation unit in real time as the procedure continues.
  • the imaging and ophthalmic treatment instrumentation in this case will generally include a steering and stabilization system which maintains both instruments in alignment and stabilized on the structures appearing in the field of view.
  • the system control further includes image identification and correlation software which allows the ophthalmologist at site 20 to identify particular positions in the retinal field of view, such as pinpointing particular vessels or tissue structures.
  • the image acquisition computer 30 includes image recognition software which enables it to identify patterns in the video frames and correlate the identified position with each image frame as it is acquired at the acquisition site 12.
  • the image recognition software may lock onto a pattern of retinal vessels.
  • the invention further contemplates that the images provided by acquisition unit 12 are processed for photogrammetric analysis of tissue features and blood flow characteristics. This is accomplished as follows.
  • An image acquired by unit 12 is sent to an examination unit, illustratively unit 20b, where it is displayed on the screen.
  • such an image may include a network of blood vessels having various diameters and lengths. These vessels include both arterial and venous capillaries constituting the blood supply and return network.
  • the features of these vessels, e.g., their spectral reflectance, may also be examined.
  • the workstation is equipped with a photogrammetric measurement program which enables the technician to place a cursor on an imaged vessel and, moving the cursor along the vessel while clicking, have the software automatically determine the width of the vessel and the subvessels to which it is connected, as well as the coordinates thereof (a minimal width calculation appears after this list).
  • the software for noting coordinates from the pixel positions and linking displayed features in a record, as well as submodules which determine vessel capacities and the like, are straightforward and readily built up from photogrammetric program techniques.
  • Work station protocols may also be implemented to automatically map the vasculature, or to compare two images taken at historically different times and identify or annotate the changes which have occurred, highlighting for the operator features such as vessel erosion, tissue which has changed color, or other differences.
  • a user graphical interface allows the specialist to type in diagnostic indications linked to the image, or to a particular feature appearing at a location in the image, so that the image or processed version of it becomes more useful.
  • the relative health of the vessel, its blood carrying capacity and the like may also be visually observed and noted.
  • This photogrammetric analysis allows a road map of the vasculature and its capacity to be compiled, together with annotations as to the extent of tissue health or disease apparent upon such inspection.
  • a very precise and well-annotated medical record may be readily compiled and may be compared to a previously taken view for detailed evidence of changes over a period of time, or may be compared, for example, to immediately preceding angiographic views in order to assess the actual degree of blood flow occurring therein.
  • the measurement entries at examination unit 20b become an annotated image record and are stored in the central library as part of the patient's record.
  • the computer processors include three-dimensional imaging software which operates on the pairs of image frames sent from the acquisition site 12 to compose detail images and enhanced side views, i.e., to perform computerized rendering with a perspective not formerly possible in ophthalmic imaging, which are derived from accentuating color differences in corresponding points of adjacent image frames to show the curvature, shading, color and other visible features of the microscopic structures appearing in the video frames.
  • the video software may combine points from different color planes to compose an enhanced perspective view of a retinal vessel where it attaches to or rises out of underlying tissue, and these features which provide strong diagnostic information and were previously visible only in blurry or vague sectional views, may be represented as processed video lateral or perspective side views of the relevant tissue or fascia in which their physical structure is clearly visible.
  • Various artifacts of specular or diffuse reflection may be eliminated based on color or on light intensity limits, and contrast may be readily adjusted by offset or regional image transformations to reveal new diagnostic indications.
  • These artificially generated images may be composed in real time as requested by the technician or ophthalmologist in the examination room, or derived later, and may also be annotated as described above and saved as critical parts of the record.
  • the system of the present invention allows the remote stereographic imaging of retinal tissue, the manipulation and formation of new images, and the immediate linking and annotation of diagnostic data into hybrid image/text records for storage in the medical records system, as well as remote control of imaging or treatment instruments.
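The frame-to-frame correlation mentioned above for re-locating a selected treatment site after intervening eye movement is not detailed in the patent; the following Python sketch shows one conventional way such a step could be implemented, using normalized cross-correlation of a small patch around the coordinate selected by the ophthalmologist. The function name, patch size and bounded brute-force search are illustrative assumptions, not the patented method.

```python
import numpy as np

def relocate_target(ref_frame: np.ndarray, cur_frame: np.ndarray,
                    ref_xy: tuple, patch: int = 32, search: int = 40) -> tuple:
    """Re-locate, in the current frame, the retinal coordinate selected on the
    reference frame, by normalized cross-correlation of a small patch around
    that coordinate.  Frames are 2-D grayscale arrays; coordinates are (x, y).
    A bounded brute-force search is used here purely for illustration."""
    x, y = ref_xy
    h = patch // 2
    template = ref_frame[y - h:y + h, x - h:x + h].astype(float)
    template -= template.mean()
    t_norm = np.linalg.norm(template)

    best_score, best_xy = -np.inf, ref_xy
    H, W = cur_frame.shape
    for cy in range(max(h, y - search), min(H - h, y + search)):
        for cx in range(max(h, x - search), min(W - h, x + search)):
            window = cur_frame[cy - h:cy + h, cx - h:cx + h].astype(float)
            window -= window.mean()
            denom = t_norm * np.linalg.norm(window)
            if denom == 0:
                continue
            score = float((template * window).sum()) / denom
            if score > best_score:
                best_score, best_xy = score, (cx, cy)
    return best_xy
```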
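The seven-hundred-gigabyte capacity quoted for the optical storage unit can be checked with simple arithmetic. The per-frame size and the frames-per-visit count below are assumptions for illustration; the patent states only the rough per-frame sizes and the seven standard fields per eye.

```python
# Rough capacity estimate for the ~700 GB optical "jukebox" described above.
STORAGE_GB = 700
MB_PER_FRAME = 1.0            # assumed: digitized true-color video frame, ~1 MB
FRAMES_PER_VISIT = 7 * 2 * 2  # 7 standard fields x 2 eyes x stereo pair = 28 frames

mb_per_visit = MB_PER_FRAME * FRAMES_PER_VISIT
visits = STORAGE_GB * 1000 / mb_per_visit
print(f"{mb_per_visit:.0f} MB per examination, "
      f"about {visits:,.0f} examinations in {STORAGE_GB} GB")
# -> 28 MB per examination, about 25,000 examinations in 700 GB,
#    consistent with records for "hundreds or thousands of patients".
```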
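The transmission sizes quoted for the two capture paths follow from the pixel counts. The sketch below is a back-of-envelope check; the 640 x 480 active-pixel figure assumed for a digitized 600-700 line video frame and the roughly 2:1 lossless compression ratio are assumptions, not numbers given in the patent.

```python
# Back-of-envelope frame sizes for the two capture paths described above.
BYTES_PER_PIXEL = 3  # 24-bit true color

def raw_frame_bytes(width: int, height: int) -> int:
    """Uncompressed size of one true-color frame in bytes."""
    return width * height * BYTES_PER_PIXEL

# Digitized video frame (roughly 640 x 480 active pixels assumed here):
video = raw_frame_bytes(640, 480)          # ~0.92 MB, i.e. under a megabyte
# Direct digital camera frame (1500 x 1250, per the Kodak DCS figures):
digital = raw_frame_bytes(1500, 1250)      # ~5.6 MB raw

print(f"video frame : {video / 1e6:.2f} MB raw")
print(f"DCS frame   : {digital / 1e6:.2f} MB raw "
      f"(~{digital / 2 / 1e6:.1f} MB with ~2:1 lossless compression)")
```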
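The color normalization condition, that the monitor's electronic-to-light transform approximately invert the camera's light-to-electronic transform, can be illustrated with a toy linear model. The 3 x 3 matrix values below are invented for illustration only; the patent describes the adjustment qualitatively, by visual matching, not as explicit matrices.

```python
import numpy as np

# Model the camera's light-to-signal color transform and the monitor's
# signal-to-light transform as 3 x 3 matrices acting on RGB vectors.
# These particular numbers are made up for illustration.
camera = np.array([[1.10, 0.05, 0.00],
                   [0.02, 0.95, 0.03],
                   [0.00, 0.04, 0.90]])

# The normalization condition: the monitor transform should be approximately
# the inverse of the camera transform over the orange-red range typical of
# retinal images.
monitor = np.linalg.inv(camera)

retinal_color = np.array([0.80, 0.35, 0.15])   # a typical orange-red sample
displayed = monitor @ (camera @ retinal_color)

# When the two transforms are inverses, the displayed color matches the scene.
assert np.allclose(displayed, retinal_color, atol=1e-6)
```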
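A minimal sketch of the digital red-free filter described above, assuming frames are held as H x W x 3 arrays in the usual R, G, B plane order; the function name and the synthetic test frame are illustrative.

```python
import numpy as np

def red_free(rgb_frame: np.ndarray) -> np.ndarray:
    """Return a red-free rendering of an already-acquired true-color frame by
    zeroing the red and blue planes, leaving the high-contrast green plane."""
    out = rgb_frame.copy()
    out[..., 0] = 0   # red plane
    out[..., 2] = 0   # blue plane
    return out

# Example with a synthetic 1280 x 1024 frame (H x W x RGB, 8 bits per plane):
frame = np.random.randint(0, 256, size=(1024, 1280, 3), dtype=np.uint8)
green_only = red_free(frame)
```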
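The patent does not define how the remote image-control signals are encoded for the network, so the sketch below simply assumes a small tagged message; the parameter names and the JSON encoding are illustrative assumptions only.

```python
import json

def make_control_message(parameter: str, value) -> bytes:
    """Encode one image-control command (e.g. zoom, illumination level,
    field steering) for transmission from the examination station to the
    acquisition station.  Wire format assumed for illustration only."""
    allowed = {"zoom", "illumination_intensity", "illumination_wavelength_nm",
               "field_steering_deg", "resolution"}
    if parameter not in allowed:
        raise ValueError(f"unknown control parameter: {parameter}")
    return json.dumps({"type": "image_control",
                       "parameter": parameter,
                       "value": value}).encode("utf-8")

msg = make_control_message("illumination_intensity", 0.65)
```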
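A minimal sketch of the pixel-to-micrometer conversion underlying the vessel width measurement described above, assuming the 10-20 micrometer effective pixel scale quoted earlier for the fundus optics; the 15 micrometer midpoint and the two-click edge-to-edge interface are assumptions.

```python
import math

MICRONS_PER_PIXEL = 15.0   # assumed midpoint of the 10-20 micrometer
                           # effective resolution cited for the fundus optics

def vessel_width_um(edge_a: tuple, edge_b: tuple) -> float:
    """Width of a vessel, in micrometers, from two cursor clicks placed on
    opposite edges of the vessel in the displayed image (pixel coordinates)."""
    dx = edge_b[0] - edge_a[0]
    dy = edge_b[1] - edge_a[1]
    return math.hypot(dx, dy) * MICRONS_PER_PIXEL

# e.g. clicks about 8.2 pixels apart correspond to a vessel roughly 123 um wide
print(round(vessel_width_um((412.0, 300.0), (417.5, 306.1)), 1))
```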

Abstract

An image acquisition unit provides true color high resolution digital images of the retina of the eye to a computer network (14, 16, 18) which interfaces with a central library and displays images at a remote work station (20A and 20B) for diagnostic examination. The computer network (14, 16, 18) is also interfaced with a unit for entering medical history information, and the diagnostic data, stereo images and medical records are linked in a relational database allowing all text and image records of the patient to reside on a work station/display (20A and 20B) for review or consultation. A telecommunications link (22) interconnects the computer network (14, 16, 18) and image examination stations with the image acquisition station where the patient is actually examined, so that stereo fundus images may be made available through the computer without degradation for viewing on the examination displays remotely from the room, building, or city where the images are acquired from the patient. By introducing digital true-color normalization at the image acquisition source (34, 32), applicant avoids the problem of spurious color artifacts arising when the images are viewed subsequently, or on other monitors.

Description

ELECTRONIC IMAGING SYSTEM FOR RETINAL EXAMINATION AND TREATMENT
Field of the Invention
The present invention relates to systems and methods for examining and treating the retina of an eye, and more particularly, to imaging systems and communication systems that provide stereoscopic images of the retina of an eye.
Background of the Invention
Diabetes is the leading cause of blindness in working age adults. It is a disease that, among its many symptoms, includes a progressive impairment of the peripheral vascular system. These changes in the vasculature of the retina cause progressive vision impairment and eventually complete loss of sight. The tragedy of diabetic retinopathy is that in the vast majority of cases, blindness is preventable by early diagnosis and treatment, but screening programs that could provide early detection are not widespread.
Promising techniques for early detection of diabetic retinopathy presently exist. Researchers have found that retinopathy is preceded by visibly detectable changes in blood flow through the retina. Diagnostic techniques now exist that grade and classify diabetic retinopathy, and together with a series of retinal images taken at different times, these provide a methodology for the early detection of degeneration. Various medical, surgical and dietary interventions may then prevent the disease from progressing to blindness.
In the United States, a 22-hospital collaborative clinical trial, the Early Treatment Diabetic Retinopathy Study (ETDRS) has shown that high risk cases can be identified early, and early treatment can substantially reduce the risk of severe visual loss. Comparative economic studies have estimated that if all diabetic persons received routine annual eye examinations with appropriate intervention, cost savings of hundreds of millions of dollars and avoidance of hundreds of thousands of person-years of blindness would be achieved. The health system performance measures of HEDIS recommend that health maintenance organizations require annual retinal examinations for all diabetic patients.
Despite the existing techniques for preventing diabetic blindness, only a small fraction of the afflicted population receives timely and proper care, and significant barriers separate most patients from state-of-the-art diabetes eye care. There are a limited number of ophthalmologists trained to evaluate retinopathy, and most are located in population centers. Many patients cannot afford the costs or the time for travel to a specialist. Additionally, cultural and language barriers often prevent elderly, rural and ethnic minority patients from seeking proper care. Moreover, because diabetes is a persistent disease and diabetic retinopathy is a degenerative disease, an afflicted patient requires lifelong disease management, including periodic examinations to monitor and record the condition of the retina, and sustained attention on the part of the patient to medical or behavioral guidelines. Such a sustained level of personal responsibility requires a high degree of motivation, and lifelong disease management can be a significant lifestyle burden. These factors increase the likelihood that the patient will, at least at some point, fail to receive proper disease management, often with catastrophic consequences.
Accordingly, it would be desirable to implement more widespread screening for retinal degeneration or pathology, and to positively address the financial, social and cultural barriers to implementation of such screening. It would also be desirable to improve the efficiency and quality of retinal evaluation.
Summary of the Invention
These and other desirable ends are achieved in a system according to the present invention wherein an image collection or acquisition unit provides digitized high resolution true color images of the retina of the eye and a computer network receives and manages these images causing them to be displayed as visual images on a display/monitor workstation for diagnostic examination. Preferably the images are stereo images. The computer network is interfaced with a medical record database or means for entering medical history information, such that the retinal images and medical records are linked in a relational database and accessed together. A telecommunications link interconnects the computer network to the image acquisition station where the patient is actually examined, with a bandwidth such that fundus images are made available through the computer without degradation for viewing on the examination displays remotely from the room, building, or city where the images are acquired from the patient. The images may be taken and viewed stereoscopically for analysis by a trained ophthalmologist or by other trained personnel. A graphic interface operating in conjunction with special processing modules identifies image features, or changes, of diagnostic significance and allows the viewer to enter textual data in the record. The telecommunications system preferably includes a fiber optic link for the image data, and may connect to teleconferencing equipment at either end, allowing an optometrist or trained technician in one location to take and transmit stereo fundus images while consulting in real time with a specialist who analyzes those images at another location.
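As a concrete illustration of the linkage between retinal images and medical records described above, the following sketch sets up a minimal relational schema. The patent specifies a relational database (with an Oracle-accessed server in the described embodiment) but no particular table layout; the tables, columns and SQLite usage here are assumptions for illustration only.

```python
import sqlite3

# A minimal illustrative schema linking patient records to retinal image
# records; the layout is assumed, not taken from the patent.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient (
    medical_record_no TEXT PRIMARY KEY,
    name              TEXT,
    history           TEXT
);
CREATE TABLE retinal_image (
    image_id          INTEGER PRIMARY KEY,
    medical_record_no TEXT REFERENCES patient(medical_record_no),
    eye               TEXT,      -- 'OD' or 'OS'
    standard_field    INTEGER,   -- 1..7 per the ETDRS protocol
    acquired_at       TEXT,
    left_frame_path   TEXT,      -- left member of the stereo pair
    right_frame_path  TEXT,
    annotations       TEXT
);
""")

# An examination station can then retrieve a patient's images together with
# the linked medical record in a single query:
rows = conn.execute("""
    SELECT p.name, i.eye, i.standard_field, i.left_frame_path, i.right_frame_path
    FROM patient p JOIN retinal_image i USING (medical_record_no)
    WHERE p.medical_record_no = ?
    ORDER BY i.eye, i.standard_field
""", ("MRN-0001",)).fetchall()
```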
In a preferred embodiment, the image acquisition station includes a relatively simple apparatus in which a relatively unskilled operator may aim, adjust and actuate the camera to produce true color images, while the examination station includes operator control units for retrieving, displaying and manipulating the images acquired from the image acquisition station. These controls may include image processing modules implemented in software for enhancing feature definition, selectively filtering colors from the image, implementing block transforms or other imaging processes to accentuate the appearance of vessels or surrounding tissue appearing in the stereo image, or identifying changes since a previous imaging session. The image examination stations may receive or retrieve images and display them in spatial order as standard fields, may project a series of standardized images such as the fundus grading slides available from the Fundus Photograph Reading Center of the University of Wisconsin (P.O. Box 5240, Madison, Wisconsin, U.S.A. 53705), and may project comparative sets of stereo images for clinical review, feature enumeration and data entry, or other analytic or recording purposes.
The image control or manipulation units may also include control units which operate in real time for sending direct image control signals to the image acquisition station to adjust the level of image resolution, the illumination color or intensity, or the field of view or image orientation. In addition, the telecommunications link advantageously includes an audio channel, to be used for simple verbal consultation and instruction, such as instructing the person operating the imaging camera to aim the camera at a specific site or lesion or make other adjustments or observations. The audio channel may also be used for remote consultations with the evaluating doctor. The image acquisition unit includes high definition digital frame true color imaging equipment, such as a camera or pair of video cameras attached to an ophthalmic imaging device, or a multiplexed single camera aimed at the focal planes of a stereo fundus camera ophthalmic imaging assembly, to generate high definition video frame data, preferably at a sufficient speed to provide time-resolved motion pictures or high resolution views of the retinal field. The video image acquisition unit also includes an adjacent or proximal control unit which may include means for inserting alphanumeric information display legends within the video frames to provide information such as patient name, medical record number, date of image and precise time information. In other embodiments, the local control unit may also provide time information and synchronization with a dye injector or other stimulus so that the time marked on each frame reflects the time interval elapsed since the start of a fluorescein or dye angiography, or other procedure. For such a time-evolution procedure, the stereo video frames do not just show the retinal tissue and vasculature; rather, successive frames provide an evolving image which quantitatively shows both the circulatory capacity and the flow velocity in each vessel or region of the retina. This, together with the elapsed time indication, provides a quantifiable record of retinal circulation. The image acquisition unit preferably includes an adjustment for color matching an image frame before the frame is recorded, allowing all frames of a set, whether recorded at the same or different sessions, to achieve optimal feature visibility and avoid heterochromic artifacts that might otherwise arise when comparing or processing multiple views. The image frames are preferably coded and stored to identify their relative position on the fundus, and the image examination stations selectively display the acquired images as multi-image sets arrayed in accordance with their fundus position, or display images as one or more sets of stereo pairs, or as isomorphic enlargements of a region designated by the operator.
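As a minimal sketch of the frame legend and elapsed-time synchronization just described, the fragment below assembles the text that could be overlaid on a frame during a dye angiography. It is illustrative only; the function name, field layout and format are assumptions, not details taken from the patent.

```python
from datetime import datetime, timedelta

def make_frame_legend(patient_name, record_number, frame_time, injection_start=None):
    """Build the alphanumeric legend inserted into a video frame.

    `injection_start` is the moment a fluorescein/dye injection began; when it
    is supplied, the legend carries the elapsed interval in addition to the
    wall-clock time.  Function and field names are illustrative assumptions.
    """
    legend = f"{patient_name}  MRN:{record_number}  {frame_time:%Y-%m-%d %H:%M:%S}"
    if injection_start is not None:
        elapsed = (frame_time - injection_start).total_seconds()
        legend += f"  T+{elapsed:06.1f}s"   # time since the start of the angiography
    return legend

# Example: a frame captured 12.5 s after the dye injection started.
start = datetime(1995, 12, 11, 10, 30, 0)
print(make_frame_legend("DOE, JANE", "123456", start + timedelta(seconds=12.5), start))
```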
In a preferred embodiment, a fiber optic link interconnects at least a portion of the image acquisition or storage with the computer network, and true color digital images may be taken and displayed in real time to effectively place the specialist-ophthalmologist in immediate control of the diagnostic procedure. In related embodiments, treatment instruments may also be set up at the image acquisition station and continuously monitored or controlled from the examination/display room. Such units may include laser surgery or coagulation units for performing retinal microsurgery. In this case, the operator control units may include means for controlling the power level and aiming of such treatment instrument remotely from the examination room. When operated in this mode, preferably rather than aiming and firing the coagulation unit directly, the control system includes an image analysis module which allows the ophthalmologist to select the tissue sites, on the stereo image video frames displayed in the examination room, which are to be treated at the remote station. Once the retinal site coordinates on the image frame are identified, this information is transmitted over the network to the image acquisition site. The laser coagulation unit at the patient site preferably includes image analysis and target tracking software which allows the identified coordinates on the frame viewed by the ophthalmologist to be located in the current frame scan of the patient's eye, and control interface software to aim the laser unit accordingly. Thus, intervening movement of the patient's eye or destabilization of the instrument aiming system during the time elapsed in transmission and display is corrected by correlating successive frames of the video image to identify the precise physical spot in the current frame at which treatment is to occur.
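A hedged sketch of what the transmitted site selection might carry is given below: the chosen frame coordinates are bundled with a small reference patch cut from that frame, so the acquisition-site tracking software has something to correlate against. The TreatmentRequest type and its fields are illustrative assumptions, not a protocol defined in the patent; a companion sketch of the re-localization step appears later, after the discussion of remote photocoagulation.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class TreatmentRequest:
    """Site selected by the ophthalmologist on a displayed frame, together with
    a small reference patch cut from that frame.  The type and its fields are
    illustrative assumptions, not a transmission format given in the patent."""
    frame_id: int        # identifier of the stored frame the selection was made on
    x: int               # column of the selected retinal site, in pixels
    y: int               # row of the selected retinal site, in pixels
    patch: np.ndarray    # grayscale neighborhood centered on (x, y)

def build_request(frame_id: int, frame_gray: np.ndarray, x: int, y: int,
                  half: int = 32) -> TreatmentRequest:
    """Cut a (2*half+1)-pixel square patch around the selected point
    (assumes the selection is not within `half` pixels of the frame border)."""
    patch = frame_gray[y - half:y + half + 1, x - half:x + half + 1].copy()
    return TreatmentRequest(frame_id, x, y, patch)
```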
Brief Description of the Drawings
These and other features of the invention will be understood from the description below taken together with illustrative figures, wherein:
Figure 1 illustrates a system constructed according to the present invention for retinal evaluation; and
Figures 2A and 2B illustrate standard fields and a thumbnail array of displayed images.

Detailed Description of the Illustrated Embodiment
Fig. 1 illustrates a system 10 constructed according to the present invention, for acquiring and examining images of the retina of an eye. As illustrated in Fig. 1, the system 10 includes a data collection unit 12, a network unit 14, a data storage unit 16, a central computer 18, remote work station units 20A and 20B, and a telecommunication link 22. As illustrated in Fig. 1, the network 14 interconnects the other elements of this system 10 so that each element is in communication with each other element. In this way, the system 10 is adapted for communicating information and command signals between one image acquisition station and another image examining station. As such, the system provides an apparatus that allows an operator at a remote work station unit 20 to acquire images and examine the images of a patient located proximal to the data collection unit 12.
In many respects, a basic embodiment of the system of the present invention may be assembled from readily available computing, image conditioning and processing components interfaced with available ophthalmoscopic instruments. A suitable set of network and computer elements to implement the system is described in a system description below, and preferably includes special dedicated cards for graphic acceleration, e.g., image handling and transmission, although the basic image acquisition workstation may operate with relatively simple optical and computer hardware.
Returning to Figure 1, the data collection or image acquisition unit 12 illustrated in system 10 includes a set of hardware equipment for acquiring high resolution full color retinal images, including a computer processing system 30, a first camera 32 and a second camera 34, a network interface 36 and a set of optional stereo viewing goggles 38. The two video cameras 32, 34 are directed at the focal planes of a stereo fundus camera imaging assembly (not shown), so that they generate views from two different angles, across a triangulation base, of a commonly viewed region of the patient's eye. As discussed further below, this allows photometric analysis of fundus tissue, which, in conjunction with special applications processing, is partially or completely automated and integrated with the relational database information of system 10. It will be understood that with suitable optical multiplexing arrangements a single camera, rather than two cameras, may be directed to alternately image each of the two focal planes of the ophthalmic instrument, or that, as is commonly done for photographic image recording, a single fundus camera may be manually shifted by the operator to successively take two different images of the retina which form a stereo pair. In that case, a single camera and no special switching are used. However, in general, the use of two separate video cameras with a stereo fundus camera is preferred so as not to reduce the total amount of light available for forming each image frame and to form uniform and well aligned stereo images at a single instant in time. It will be further understood that the video camera 32 or 34 may be a CCD array or other special two-dimensional sensing device, and need not be a complete "camera" in the consumer product sense. However, by way of example, the presently preferred embodiment may be a commercially available camera of suitable resolution which produces an output video signal in NTSC, PAL or SECAM format. A direct digital camera may also be used, that is, one having a formatted digital output, although in general special drivers will be required to interface such a camera with computer-manageable video frame handling circuitry.
In the illustrated embodiment, the data collection unit 12 is adapted for acquiring stereo images of the fundus of the patient's eye. In a representative embodiment, the camera elements 32 and 34 interface to the two stereo image planes of a Donaldson stereo fundus camera (not shown), and camera elements 32 and 34 are video camera elements of the type suitable for generating high resolution image signals representative of a visual image encoded in a video format such as the NTSC, S-video or other format; each camera element 32 and 34 connects to the computer processing unit 30. The computer processing unit 30 includes a video interface board that is adapted to interface with the camera elements 32 and 34 to receive video signals therefrom along transmission paths such as impedance matched coaxial cables. The video interface cards of the computer unit 30 contain special high speed circuitry for digitizing the video signals and converting them into high resolution, e.g., 1280 x 1024 pixel, video frames. This corresponds to an effective resolution of ten to twenty micrometers, with typical fundus camera objective optics. The computer unit 30 also has a provision for processing the captured stereo image frames to generate or allow generation of three-dimensional images that can be manipulated and viewed, as will be explained in greater detail hereinafter, by an operator at the data collection unit 12, or at the remote work stations 20A and 20B.
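The ten-to-twenty micrometer effective resolution quoted above can be checked with a short calculation. It assumes a 30-degree fundus field covering roughly 9 mm of retina, which is a common rule of thumb rather than a figure stated in the text.

```python
# Rough check of the ten-to-twenty micrometer figure quoted above.  The 9 mm
# retinal extent of a 30-degree fundus field is an assumed rule of thumb.
field_um = 9.0 * 1000          # assumed retinal extent of the imaged field, in micrometers
pixels_across = 1024           # short axis of the 1280 x 1024 digitized frame
sampling_um = field_um / pixels_across
print(f"{sampling_um:.1f} um per pixel")   # about 8.8 um per pixel; optical blur and
# camera line resolution push the effective resolution into the 10-20 um range
```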
At the image acquisition site, the processor system 30 preferably has a foot switch activation signal for initiating image acquisition, with the camera signals fed to an audio/video processing board such as the J300 board of Digital Equipment Corporation. This circuit performs any necessary video scaling, filtering and color dithering of the video signals received in a standard format from the camera, and produces a normalized output in NTSC or PAL video, or in composite or S-video formats, that is suitable for digital processing. The output of the J300 board is fed to a twenty-four plane double-buffered graphics accelerator circuit, which digitizes and formats the frames, and provides efficient internal handling and processing in real time for true color high resolution video frames. The accelerator card allows left and right video frames to be alternately displayed on the display at twice the normal 70 Hz refresh rate, for stereo viewing. Frame-to-frame spatial correlation then allows a three-dimensional stereo imaging and display program to build and manipulate three-dimensional views of the imaged field of varying appearance, and to form fractal maps of the displayed retinal image topography.
In the illustrated embodiment, the interface element 36 is a separate element connected via a transmission path to the computer 30, and connects the computer unit 30 to the network 14 for transmitting information signals and command signals therebetween. The interface element 36 can be a standard ethernet network card that is adapted for interfacing the computer unit 30 to a standard ethernet network such as the Novell network system, and preferably includes high bandwidth optical fiber data ports and adapters.
As further illustrated in Fig. 1, the display at the data collection unit 12 advantageously operates with a set of stereo goggles. In this embodiment of the present invention the data collection unit 12 is adapted for acquiring and viewing stereo images that represent the fundus of a patient's eye and presenting them in an alternating time-sequential fashion on a display that enhances the viewer's perception of depth and shading. The computer monitor element 30A works in concert with the stereo goggles 38 to allow an operator to view three-dimensional images of the patient's retina. In one embodiment of the present invention, the monitor element 30A is a high resolution video monitor that operates at a frame rate of 140 Hz for displaying a sequence of retinal images taken by the cameras. Each of the frames is captured and indexed, and recorded in temporary storage, such as local disc storage, and the sequence of frames is sent on the network where a massive memory device such as a 700 Gbyte storage unit stores the images as a patient image record. This storage unit can store the records of hundreds or thousands of patients with their fundus images, and may be periodically backed up so that complete records are available for all current patients.
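A back-of-envelope estimate of the archive capacity mentioned above is sketched below. Only the 700 Gbyte figure comes from the text; the per-frame size of roughly one megabyte and the 28-frame grading set per examination are assumptions drawn from figures quoted elsewhere in this description.

```python
# Rough capacity estimate for the 700 Gbyte image server.
bytes_per_frame = 1 * 1024**2      # assumed ~1 Mbyte per digitized video frame
frames_per_exam = 28               # assumed 28-frame grading set per examination
store_bytes = 700 * 1024**3        # the "700 Gbyte storage unit" from the text
print(store_bytes // (bytes_per_frame * frames_per_exam))   # 25,600 examinations
```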
At the monitor, the frame sequence, which may be stopped, if desired, to allow individual frames to be viewed, is viewed through the stereo goggles 38 which act as a time-switched pair of shutters over the operator's eyes. Such stereo goggles 38 can be of the type manufactured by and commercially available from the StereoGraphics Corporation and used for image display and analysis, in fields such as stereographic survey image analysis. Each viewing lens of the goggles is formed of an electroactive material which is opaque when energized, and when not energized is clear. An IR receiver on the goggles synchronizes the actuation of the left and right lenses with the display of right and left views on the display, so that each eye is blanked during the time the other side view is displayed on the monitor. Since the two stereo frames alternate at a high frame rate, the effect is to form a sequence of stereo video images at the normal 70 Hz refresh rate of the display which appears continuous to the viewer.

The illustrated network system 14 is adapted for carrying video frame information signals and other data or command signals between the image acquisition site and a remote image examination site. In particular, the network element 14 is adapted for carrying high volume information signals representative of many digital stereoscopic images, and the various computers or processors are also interfaced with a medical records management program to store the images in a relational database, and make the medical record information available at the examination stations. In a preferred embodiment of the invention, the network includes an optical fiber cable system, and implements an ethernet network protocol. However, the present invention can also be practiced with another suitable network system or configuration that has sufficient bandwidth to carry the information signals representative of the stereoscopic images. Thus, for example, a multichannel direct microwave communications link may be effective to connect the image acquisition unit to a central computer system located within a few miles crosstown, while intra-building optical fiber channels may interconnect medical records and the examination stations within a single building or complex. Leased channel space on a fiber optic link may interconnect acquisition units 12 in distant cities or remote neighborhoods, and satellite up-and-down links may also be used effectively. Furthermore, where sufficient channel space is not readily available, the invention may be practiced in various forms utilizing low frame rates, or at full resolution but with motion studies delayed somewhat to allow transmission and storage of a frame sequence before viewing.
Continuing now with the description of Figure 1, the network 14 includes basic server software and circuitry for interfacing the central computer system 18 and data storage element 16 with a plurality of work station/display units 20a, 20b, etc., of which two are shown. The data storage element 16 is preferably an optical storage element with associated server circuitry, commonly referred to as an optical jukebox, having a large capacity in the range of seven hundred gigabytes. Incoming image frames from the image acquisition site are placed in storage and also made available over the network directly to an examination unit 20, during the course of a procedure, as discussed further below.
In broad terms, image acquisition is carried out at a first site such as a mobile screening unit or small clinic, or in a medical center in a community which may be remote from the primary hospital housing the units 20a, 20b and computer 18, using equipment as described above in relation to image acquisition unit 12. The context of the examination may involve a local optometrist or ophthalmologist responsible for the patient, who actually sets up or controls the fundus camera or other ophthalmic instrumentation in the acquisition room. The network connection by satellite, leased line, fiber optic or other communications link provides the video frame and other audio and digital data to the network which, together with the other units described below, is generally located at a central research center or ophthalmology department of a teaching hospital. The network and server with a plurality of image examination stations thus form the hub for a number of image acquisition units 12.
Each examination unit 20 may be a relatively powerful work station, with memory for storing and immediately accessing 50-200 Mbytes of data and processing capability including a video accelerator for displaying high resolution true color images. The unit 20 includes a high definition video display, preferably, as noted above, a stereo imaging display with specialized viewing discrimination equipment such as infrared-synchronized stereo viewing goggles, and preferably also a digitizing medical records pad and/or keyboard entry system on which a specialist ophthalmologist or specially trained technician viewing the frame can write observations and conclusions of diagnostic relevance. Such a hand-held pad is of the type commonly used by the physician to create standardized medical examination records, and for this application is configured to interface directly to the workstation and to software which accommodates the image data in a relational database, for accessing and displaying the existing records in the central data storage, or for filling in relevant medical history and other data. A suitable physician's data entry pad is sold by the Datamedic Corporation, which facilitates entry of medical data and its linking and correlation with data displayed on the screen at the time.
In general, it is contemplated that the examination units 20a, 20b may be located at a "hub" which services a number of image acquisition sites 12. The sites 12 may be simply clinic examination rooms located in the same hospital, but also, due to the unique system architecture, may consist of plural widely separated clinics at different remote sites. The work performed at examination units 20a, 20b may involve the meticulous review of images and tabulation of diagnostic details, such as counting the number of microaneurysms, leaks, scars or the like in particular retinal image fields, or deriving other quantitative measures, as described, for example, in the ETDRS Reports Nos. 10-13 at pages 786-834 of Volume 98, Ophthalmology (May 1991 Supplement). It may also involve other less quantitative or less time consuming clinical judgments, as well as various forms of image manipulation, color enhancement, and record annotation. Much of this work can be performed by specially trained technicians, so the organization of an examination center "hub" in this manner will allow the limited number of trained ophthalmologists to examine and treat a vastly greater number of diabetic patients. Typically, twenty or more, and up to five hundred, work stations 20 may be located at the hub.
In general, the precise form of imaging effected at acquisition sites 12 will vary depending on the nature of the acquisition equipment available at each site 12 (e.g., ophthalmoscope, fundus camera, stereo fundus camera) and the nature of the image capture at that site. In particular, a video camera with subsequent digital frame conversion will produce true color image frames with a 600-700 line field resolution, each capable of transmission as under a megabyte of data, whereas a high resolution direct digital camera such as a Kodak DCS 420 or DCS 460 with a 1500 x 1250 pixel frame will require 2-4 megabytes of information. For diagnostic utility, applicant has found it necessary to use true color images, and to transmit using only image compression protocols that are not lossy.
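The frame sizes quoted above follow from simple arithmetic at three bytes per pixel for 24-bit true color, as sketched below; the 640 x 480 digitized-video figure is an assumption standing in for a 600-700 line video frame.

```python
# Raw (uncompressed) sizes of the two frame formats mentioned above,
# at three bytes per pixel for 24-bit true color.
video_frame = 640 * 480 * 3        # 921,600 bytes: under a megabyte, as quoted
digital_frame = 1500 * 1250 * 3    # 5,625,000 bytes raw
print(video_frame, digital_frame)
# Lossless (non-lossy) compression of roughly 2:1 brings the 1500 x 1250 frame
# into the quoted 2-4 megabyte range without sacrificing diagnostic detail.
```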
Thus, the availability of a high bandwidth communications channel such as a satellite or fiber optic link is critical to real time consultations, although if only a lesser bandwidth is available it is still possible to transmit images (albeit over several minutes) and undertake remote consultations using the system. By using the DEI-470 color video camera manufactured by Optronics Engineering, or that company's higher resolution DEI-750 3-chip RGB camera, the retina may be imaged in true color using light levels comparable to or less than those employed in the viewing system of a typical retinal fundus camera or ophthalmoscope, and with a resolution of 10-20 micrometers. These cameras operate with low illumination levels rather than flash and produce images in true color, i.e., visually indistinguishable from that seen through an ophthalmoscope, and can be operated to produce the large number of frames (28) required for diagnostic fundus grading without the drawbacks of existing (flash) fundus photography.
In a presently preferred embodiment for retinal imaging and evaluation, the camera 32 and image acquisition and control unit 12 are configured to produce "normalized" or "absolute" true color images, further enhancing their utility. This is achieved as follows.
One suitable retinal imaging assembly is a fundus camera such as a Nikon NF-505 or Topcon TRC-50 fundus camera assembly which has its customary flash, film transport and supporting electronics removed, and replaced by an Optronics video imaging camera mounted to electronically convert the focused fundus image. As in the system of FIGURE 1, an image acquisition processor receives the electronic image data and attends to any necessary signal conversion, formatting, framing or other processing for display or transmission of the electronic images. Furthermore, in this embodiment, a full color LCD display is mounted in a gimbaled setting to the side of the camera where it may be directly viewed by the operator as the camera is aimed at the fundus. As in the embodiment of FIGURE 1, the electronic imaging assembly may be a direct digital imaging camera, in which all or a portion of the image signal processing and framing circuitry is incorporated in the camera itself, or may be an analog (video) output imaging tube or array in which the output video signal is subsequently digitized by processor 12 and associated electronics. In any case, the frames imaged by the electronic imaging assembly are also delivered to and displayed on the pivoting LCD color screen, thus allowing a direct and large-scale view of the digital color image being captured by the camera. In general, it will be understood that retinal images, due to the intrinsic color of background tissue, are of a generally orange-red hue. Moreover, the electronic imaging assembly attached to the fundus camera is intended to function with a relatively low level of continuous illumination rather than flash illumination, and its images may quickly become washed out as illumination increases, or may become a dark brownish purple as the illumination level decreases to a point near the imaging threshold. These light response characteristics stem from the saturation properties of the image sensing CCD portion of the camera, which preferably has an imaging sensitivity below about 0.01-0.2 lux, as well as from the spectral responses of the three different color pixel sets or their filters. Applicant has found that this level of variability, even when the light source is initially set so that the observed tissue appears normal, impairs the diagnostic value of acquired images. This is especially true when the acquired image is to be compared to one or more baseline images acquired at different times, or is one of a set of adjacent or overlapping standardized fields such as used for Airlie House classification of retinopathy. For such images, unintended variations in color, contrast or density may seriously impair the visibility of details, or may prevent meaningful comparisons or determination of changes.
This shortcoming is addressed in accordance with a further aspect of the invention by providing an adjustable illumination source, and normalizing the captured image to the actual appearance of the observed tissue as the images are acquired.
The adjustable illumination source may be, for example, a tungsten linear filament lamp of up to about ten or fifteen watts power. The adjustable source has a spectral output with peaks in the yellow/red region of the spectrum, and the peaks shift from red toward shorter wavelengths as the lamp drive voltage is increased. Variation in this spectral response therefore allows the operator to vary the visual appearance of retinal tissue as the camera is focused on the fundus, and results in a generally greater variation in the color or hue of the displayed images being captured by the camera.
To normalize the captured digital images, the image processor 12 operates in an adjustment mode by providing an image of the current camera image "S" to the LCD display, which is arranged so the operator may simultaneously view the LCD display while looking through the viewing port of the camera assembly. The two images are visually compared by the operator as the lamp drive voltage is adjusted, so that the displayed video image ranges through a range of brightness, contrast and relative hues, while the actual observed tissue may appear generally brighter and yellower. The operator then adjusts the lamp until the two images, the LCD display and the direct visual appearance, match as closely as possible in color and intensity in the major (e.g., background tissue) regions of the image. Once the normalized color is adjusted in this manner, the operator clicks on the shutter control switch (e.g., a mouse) and the electronic image frame is stored. This color-matching operation can be performed with accuracy by the human eye, and results in the acquisition of images whose digital representations are color-matched, despite the actual color balance adjustments of the monitor upon which the operator is observing the images and the camera's spectral response. This color normalization requires that the color balance "transformation" introduced by the monitor in transforming electronic-to-light images be approximately the inverse of the color transformation introduced by the camera in transforming light-to-electronic image signals. While in general the different color transformations of the camera and the display over the entire visible spectrum may not be amenable to such a compensating inverse relationship, applicant has found that within the color range encompassed by retinal images, this is dependably achieved by adjusting the camera and display color settings to be approximately inverse under standard light conditions. The necessary settings are generally different from the standard factory-defined default settings for both the video camera and the monitor, but result in a display that approximates the actual clinical appearance of tissue. The operator then needs only to compare the color of the digitized retinal image on the monitor screen to that directly observed through the fundus camera eyepiece, and make minor adjustments to the intensity level of the fundus camera viewing light, in order for the color content of the digitized retinal image on the monitor screen to be the same as that observed directly through the fundus camera eyepiece. The institution of this adjustment protocol for the correct color of the retinal image means that it is no longer necessary to rigorously calibrate both monitor and video camera for every patient. In fact, applicant has found that this procedure for obtaining the correct color for the retinal image functions equally well regardless of the pigmentation of the fundus, i.e., regardless of whether the patient is Caucasian, Asian, African or Hispanic, or otherwise has a specially pigmented retinal background tissue.
By introducing this digital true-color normalization at the image acquisition source, applicant avoids the problem of spurious color artifacts arising when the images are viewed subsequently or on other monitors, e.g., remotely. When images are taken as stereo pairs, each frame of the pair may be normalized as described above.
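The "approximately inverse" relationship between the camera and monitor color transformations can be illustrated with a small numerical sketch, in which each transformation is modeled as a 3 x 3 matrix on linear RGB and the round-trip error is checked over a few retinal-hue colors. The matrices and color values below are invented placeholders, not measured device characteristics.

```python
import numpy as np

# Camera: light-to-signal transform.  Monitor: signal-to-light transform.
# Within the narrow retinal color gamut the two are set to be roughly inverse,
# so the displayed color approximates the directly observed tissue color.
camera = np.array([[0.95, 0.03, 0.02],
                   [0.04, 0.90, 0.06],
                   [0.01, 0.05, 0.94]])
monitor = np.linalg.inv(camera)                  # ideal inverse within this gamut

retinal_colors = np.array([[0.80, 0.35, 0.20],   # orange-red background tissue
                           [0.65, 0.25, 0.15],
                           [0.90, 0.50, 0.30]])
displayed = retinal_colors @ camera.T @ monitor.T
print(np.abs(displayed - retinal_colors).max())  # ~0: display matches the scene
```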
As described above, the image acquisition system can be operated by personnel with minimal training, primarily that associated with recognition of the different areas of the retina that are needed for diagnosis of the level of diabetic retinopathy (i.e., the "standard fields") and some practice in obtaining stereo pair images of the same retinal field, as well as basic clerical skills, e.g., familiarity with entering patient demographic data in the appropriate fields of a pen pad medical record entry system. Thus, the operator need not be skilled in the art of taking retinal photographic images. Because the acquisition is video-based, obtaining images of diagnostic quality is as simple as operating a commercial home-use video camera.
As noted above, the invention when practiced with the described Optronics cameras allows accurate image capture at low light levels, so it is less invasive than the use of flash exposure, which is painful. As such, the capture of continuous illumination low light level images is expected to remove a major barrier to universal screening and to promote the practice of routine retinal monitoring. However, in other embodiments the invention may be practiced with other image capture technology, such as the direct digital output DCS-420 Kodak single frame camera. Such a camera may provide higher resolution, although it would generally require an auxiliary flash source to produce its larger format images.
As noted above, in the field of retinal evaluation, a principal application of the images will be the acquisition of high resolution true color baseline images, including stereo pair images, of the retina and retinal vasculature, and the acquisition of stereographic motion pictures of standardized ophthalmologic procedures such as fluorescein angiography, which will reveal the dynamic properties of circulation at the time the test is run. True color images at the described twenty micron or better level of resolution will provide sufficient information for a trained ophthalmologist to evaluate the status of retinal pathology or incipient pathological conditions, to evaluate the efficacy of medications, and to provide prognostic consultations. While the video data is acquired and viewed at two different sites in real time, the described teleconferencing and network communications links between the two sites allow the ophthalmologist to discuss the patient's condition and details of the exact images as the procedure is being effected, and to adjust the imaging parameters or arrange for further views as appropriate. In addition, the storage of video frame data and the video monitor 30a at the image acquisition site allow the referring physician or treating technician to show the patient the precise images involved and discuss the implications of what is seen on the screen. Applicants have found that such direct sharing of medical information, enabling the patient to see the actual images and discuss the microscopic changes involved, strongly impresses the individual with the concrete reality of his medical condition, bolstering the importance or relevance of his own health and its interrelationship with lifestyle and behavioral activities. Thus, this manner of establishing a treatment relation with the patient is less alienating, and encourages the patient to make a commitment to continue proper treatment.
At the image examination station, the display monitors are preferably true color twenty-one inch or larger monitors, or, where group consultations or teaching are to use the images, a stereo-capable projector with low-persistence green phosphor is used to allow projection of sequences of retinal images. The overall image management program is preferably implemented by adding it to a medical records platform that links patient records to the image records, and allows both to be augmented or annotated at the examination stations. The images are preferably acquired and marked as sets of seven standard fields as shown in Figure 2A and defined in the Early Treatment Diabetic Retinopathy Study (ETDRS). The image display software provides a first display mode in which the seven fields are displayed as thumbnail stereo pair images, as shown in Figure 2B, with the same layout used for both left and right eyes. The workstation image management software allows the operator to select the type of full-scale image display so that by clicking on the left one of a thumbnail stereo image, the actual full-scale image or chosen set of images is displayed. Display modes are provided for viewing individual fields, as stereograms, or pairs of the fields for comparison, as well as pairs such as present image/previous visit, filtered or color-enhanced images, and grid-matched magnified images to better exhibit particular tissue.
In general, the image display sequences provide all the diagnostic features required by the ophthalmologist to make an assessment of the level of diabetic retinopathy. The design preferably provides a screen without window decorations, maximizing available display area and image viewing without the distractions of display icons or pull down menus. The display functions incorporate true color stereo display, with as many as four separate retinal images displayed in stereo on the same screen. This provides the facility for meaningful comparative evaluations. For example, image fields from three prior visits can be displayed with a corresponding image field from the current visit to determine the rate of progression of retinopathy. Another feature provides for the side-by-side display of ETDRS standard retinal images previously scanned into the system, to be compared with the patient retinal images so as to facilitate accurate determination of the level of retinopathy. This provides a standard for assessment of level of retinopathy, namely a comparison with the ETDRS standard retinal images.

In addition to the color images, the image management software allows display of images in the traditional red-free format, which provides an enhanced contrast of the retinal vasculature. In this case, the software simply applies a digital filter to the already acquired retinal images in order to provide an image comparable to a red-free image. That is, the software sets the red and blue color components of the image equal to zero, to automatically display a green high contrast image of the retinal vasculature. The flexibility inherent in these digitized images for color manipulation provides enhanced diagnostic information from different layers of the retina. For example, blue light is primarily reflected from the front regions of the retina while red light is primarily reflected from deeper retinal regions. Thus by appropriate color manipulation, the processed images selectively provide pathological information from different depths of the retina. Significantly, these images may be achieved from a single video image, without using filtered sources or other special preparations at the image acquisition stage. As noted above, by simple zeroing of the red and blue components of an image, a high contrast green image is obtained.

The invention further contemplates a modified fluorescein angiograph procedure, wherein monochrome fluorescein angiograms are captured as moving pictures, i.e., as a sequence of successive stereo frames. For such imaging applicant has modified the back of a Donaldson fundus camera, with two DEI-470 Optronics cameras coupled to the retinal focal planes. The cameras interface with a pair of J-300 sound and motion video image capture cards to acquire a fast image sequence. For this imaging, a fiber bundle is interfaced to the camera to supplement the viewing illumination. With this arrangement, it is possible to avoid reliance on flash exposure as required in the prior art, and to employ light levels comparable to viewing levels.
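Returning to the red-free and depth-selective displays described above, a minimal sketch of the color-plane manipulation is given below. The function names and the H x W x 3 array layout are assumptions; only the idea of zeroing or isolating color planes comes from the text.

```python
import numpy as np

def red_free(rgb_frame: np.ndarray) -> np.ndarray:
    """Digital 'red-free' view of an already acquired RGB retinal frame:
    the red and blue planes are set to zero so only the green channel,
    which carries most of the vascular contrast, remains."""
    out = rgb_frame.copy()
    out[..., 0] = 0   # red plane
    out[..., 2] = 0   # blue plane
    return out

def depth_selective(rgb_frame: np.ndarray, keep: int) -> np.ndarray:
    """Keep a single color plane (0 = red for deeper layers, 2 = blue for the
    inner retina) to approximate the depth-selective views mentioned above."""
    out = np.zeros_like(rgb_frame)
    out[..., keep] = rgb_frame[..., keep]
    return out
```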
Images may also be acquired by the aforesaid Optronics camera fitted to an indirect ophthalmoscope. Focusing may be set up to respond to a foot switch or voice-activated control. The camera may also be interfaced to one of the viewing eyepieces of a slit lamp. This instrumentation provides good quality anterior segment images, with lesser viewing light intensity than is applied by the slit lamp.
In addition to the communication of images and medical information between persons involved in the procedure, the network may carry data signals including control or image adjustment signals by which the ophthalmologist at the examination unit 20 directly controls the image acquisition occurring at the acquisition unit 12. In particular, such command signals as zoom magnification, steering adjustments, and wavelength of field illumination may be selectively varied remotely to achieve the desired imaging effect. Thus, questionable tissue structures requiring greater magnification or a different perspective for their elucidation may be quickly resolved without ambiguity by varying such control parameters.
Furthermore, by switching illumination wavelengths, views may be selectively taken to represent different layers of tissue, or to accentuate imaging of the vasculature and blood flow characteristics. In addition, where a specialized study such as fluorescence imaging is undertaken, the control signals may include time varying signals to initiate stimulation with certain wavelengths of light, to initiate imaging at certain times after stimulation or delivery of dye or drugs, or other such precisely controlled imaging protocols. The digital data signals for these operations may be interfaced to the ophthalmic equipment in a relatively straightforward fashion, provided such equipment already has initiating switches or internal digital circuitry for controlling the particular parameters involved, or is capable of readily adapting electric controls to such control parameters as system focus, illumination and the like.
For implementation of a prototype embodiment, a UNIX operating system of Digital Equipment Corporation was employed with the X11 R5 window system to control the graphics display. This allowed images to be displayed without window decorations, increasing the available image area on the screen and simplifying the projection of stereo images. The electronic medical record hardware required only a processor such as an Intel 486, with 8 MB of memory and a 300 MB disk drive, a keyboard, and a pen pad display/entry system. The pen pad/display is installed such that it appears in the same plane as the image monitor for the image acquisition. The necessary software on a Windows-based work group system includes standard programs for pen pad operation, the Clinictec NextGen medical record platform, and an Oracle driver for accessing the server in a relational database.
In a like manner, certain procedures, such as laser photocoagulation of retinal vessels, for which the necessary ophthalmic instrumentation already involves highly integrated and computer controlled aiming, focusing and actuation circuitry, may be adapted to remote operation in conjunction with the stereo imaging work stations provided by the present invention. In these instances, the ophthalmologist at examination station 20 may not only view the stereo images and identify pathologic sites or processes, but may actively control the photocoagulation unit in real time as the procedure continues. It will be understood that the imaging and ophthalmic treatment instrumentation in this case will generally include a steering and stabilization system which maintains both instruments in alignment and stabilized on the structures appearing in the field of view. However, in view of the small but non-negligible time delays still involved between image acquisition and initiation of diagnostic or treatment activity at the examination site 20, the invention contemplates that the system control further includes image identification and correlation software which allows the ophthalmologist at site 20 to identify particular positions in the retinal field of view, such as pinpointing particular vessels or tissue structures, and the image acquisition computer 30 includes image recognition software which enables it to identify patterns in the video frames and correlate the identified position with each image frame as it is acquired at the acquisition site 12. For example, the image recognition software may lock onto a pattern of retinal vessels. Thus, despite the presence of saccades and other abrupt eye movements of the small retinal field which may occur over relatively brief time intervals, the ophthalmic instrumentation is aimed at the identified site in the field of view and remote treatment is achieved.
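A sketch of the re-localization step, which consumes the kind of coordinate-plus-patch selection sketched earlier, is shown below. It scores candidate offsets by normalized cross-correlation; the patent describes locking onto a vessel pattern but does not specify this or any particular algorithm, so the approach and the parameter values are assumptions.

```python
import numpy as np

def relocate(patch: np.ndarray, frame: np.ndarray, x0: int, y0: int, search: int = 40):
    """Re-locate a previously selected retinal patch in the current frame.

    (x0, y0) is the patch's top-left corner in the frame on which the site was
    chosen; a window of +/- `search` pixels around that position is scanned and
    each offset is scored by normalized cross-correlation against the grayscale
    current frame.  A sketch of the tracking idea only, not the patent's method.
    """
    ph, pw = patch.shape
    p = patch - patch.mean()
    p_norm = np.sqrt((p * p).sum()) + 1e-9
    best_score, best_xy = -2.0, (x0, y0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + ph > frame.shape[0] or x + pw > frame.shape[1]:
                continue                       # candidate window runs off the frame
            w = frame[y:y + ph, x:x + pw] - frame[y:y + ph, x:x + pw].mean()
            score = (p * w).sum() / (p_norm * (np.sqrt((w * w).sum()) + 1e-9))
            if score > best_score:
                best_score, best_xy = score, (x, y)
    return best_xy, best_score   # new top-left corner and its correlation score
```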
In addition to the foregoing operation, the invention further contemplates that the images provided by acquisition unit 12 are processed for photogrammetric analysis of tissue features and blood flow characteristics. This is accomplished as follows. An image acquired by unit 12 is sent to an examination unit, illustratively unit 20b, where it is displayed on the screen. As indicated schematically in the figure, such an image may include a network of blood vessels having various diameters and lengths. These vessels include both arterial and venous capillaries constituting the blood supply and return network. The features of these vessels (e.g., spectral reflectance) clearly indicate which network is which, and the relative sizes of each vessel and their branch connections are also clearly visible. At the examination unit 20b, the workstation is equipped with a photogrammetric measurement program which enables the technician to place a cursor on an imaged vessel and, moving the cursor along the vessel while clicking, have the software automatically determine the width of the vessel and the subvessels to which it is connected, as well as the coordinates thereof.
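A minimal sketch of the cursor-click vessel measurements is given below; the 15 micrometers-per-pixel scale is an assumed mid-range value from the resolution figures quoted earlier, not a calibrated constant, and the function names are illustrative.

```python
import numpy as np

def vessel_width_um(edge_a, edge_b, um_per_pixel=15.0):
    """Vessel width from two operator clicks placed on opposite vessel edges.
    `edge_a` and `edge_b` are (x, y) pixel positions; `um_per_pixel` is an
    assumed, not calibrated, image scale."""
    a, b = np.asarray(edge_a, float), np.asarray(edge_b, float)
    return float(np.linalg.norm(a - b)) * um_per_pixel

def traced_length_um(points, um_per_pixel=15.0):
    """Length of a vessel segment traced by successive clicks along it."""
    pts = np.asarray(points, float)
    return float(np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()) * um_per_pixel

# Example: two edge clicks six pixels apart correspond to a ~90 um vessel.
print(vessel_width_um((120, 88), (126, 88)))
```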
The software for noting coordinates from the pixel positions and linking displayed features in a record, as well as the submodules which determine vessel capacities and the like, is straightforward and readily built up from photogrammetric program techniques. Work station protocols may also be implemented to automatically map the vasculature, or to compare two images taken at historically different times and identify or annotate the changes which have occurred, highlighting for the operator features such as vessel erosion, tissue which has changed color, or other differences. In addition, a user graphical interface allows the specialist to type in diagnostic indications linked to the image, or to a particular feature appearing at a location in the image, so that the image or a processed version of it becomes more useful.
With suitable training, the relative health of the vessel, its blood carrying capacity and the like may also be visually observed and noted. This photogrammetric analysis allows a road map of the vasculature and its capacity to be compiled, together with annotations as to the extent of tissue health or disease apparent upon such inspection. Thus, a very precise and well-annotated medical record may be readily compiled and may be compared to a previously taken view for detailed evidence of changes over a period of time, or may be compared, for example, to immediately preceding angiographic views in order to assess the actual degree of blood flow occurring therein. As with the ophthalmologist's note pad entries at examination unit 20a, the measurement entries at examination unit 20b become an annotated image record and are stored in the central library as part of the patient's record.
In addition to the foregoing, it is understood that for the stereographic video images the computer processors include three-dimensional imaging software which operates on the pairs of image frames sent from the acquisition site 12 to compose detail images and enhanced side views, i.e., to perform computerized rendering with a perspective not formerly possible in ophthalmic imaging. These views are derived by accentuating color differences in corresponding points of adjacent image frames to show the curvature, shading, color and other visible features of the microscopic structures appearing in the video frames. Thus, for example, while the fundus camera takes a series of views across a baseline which is limited by the width of the dilated pupil, and is essentially an almost vertical perspective view, the video software may combine points from different color planes to compose an enhanced perspective view of a retinal vessel where it attaches to or rises out of underlying tissue; these features, which provide strong diagnostic information and were previously visible only in blurry or vague sectional views, may be represented as processed video lateral or perspective side views of the relevant tissue or fascia in which their physical structure is clearly visible. Various artifacts of specular or diffuse reflection may be eliminated based on color or on light intensity limits, and contrast may be readily adjusted by offset or regional image transformations to reveal new diagnostic indications. These artificially generated images may be composed in real time as requested by the technician or ophthalmologist in the examination room, or derived later, and may also be annotated as described above and saved as critical parts of the record. Thus, the system of the present invention allows the remote stereographic imaging of retinal tissue, the manipulation and formation of new images, and the immediate linking and annotation of diagnostic data into hybrid image/text records for storage in the medical records system, as well as remote control of imaging or treatment instruments.
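For the perspective and elevation rendering described above, one conventional way to turn stereo disparity into depth is the relation z = f * B / d, sketched below. The patent does not state this formula, and the pupil-limited baseline and effective focal length used here are illustrative placeholders only.

```python
def depth_from_disparity(disparity_px: float, baseline_mm: float = 2.0,
                         focal_px: float = 5000.0) -> float:
    """Standard stereo relation z = f * B / d: the elevation of a retinal
    structure is the difference between two such depths.  The pupil-limited
    baseline and effective focal length are assumed placeholder values."""
    if disparity_px == 0:
        return float("inf")          # no measurable parallax
    return focal_px * baseline_mm / disparity_px

# A feature whose disparity shifts from 100 to 102 pixels has risen by about
# 2 mm toward the viewer under these assumed parameters.
print(depth_from_disparity(100.0) - depth_from_disparity(102.0))
```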
The various prototype embodiments discussed above (Donaldson camera, slit lamp, indirect ophthalmoscope and other instruments) illustrate a range of ophthalmic imaging instruments to which applicant's imaging invention and medical/image record management and recording system apply. Unlike a simple medical record system, the present invention changes the dynamics of patient access to care, and the efficiency of delivery of ophthalmic expertise, in a manner that solves an enormous current health care dilemma, namely, the obstacle to proper universal screening for diabetic retinopathy. A basic embodiment of the invention being thus disclosed and described, further variations and modifications will occur to those skilled in the art, and all such variations and modifications are encompassed within the scope of the invention as defined in the claims appended hereto.
What is claimed is:

Claims

1. A system for examining the retina of an eye, comprising
image acquisition means including an ophthalmic viewing instrument and an electronic imaging camera coupled to said instrument for generating true color digital image signal frames representative of a retina of an eye,
a computer network interfaced to said image acquisition means for sending and receiving said image signal frames, said computer network including storage means for storing said image frames, and
a display coupled to said computer network for displaying said image frames for examination.
2. A system according to claim 1 wherein said system includes an electronic image display at said viewing instrument, and means for matching said true color digital image frames to visual images viewed through said instrument.
3. A system according to claim 1 wherein said system organizes and displays plural high resolution retinal images in an array for comparison of said images.
4. A system according to claim 2 wherein said video system generates high resolution images.
5. A system according to claim 2 wherein said video camera means generates video image signals forming a sequence of time-successive images.
6. A system according to claim 1 wherein said image acquisition means includes means for generating frames of retinal image data separated by a base, and said display includes means for displaying stereo true color images formed of said frames.
7. A system according to claim 1 wherein said computer network includes
a monitor for displaying true color high resolution images,
a medical records platform for accessing and storing medical records, and a graphic interface for identifying diagnostic features appearing in the images displayed on the monitor and storing the images and identified features as medical records in a relational database.
8. A system according to claim 1 wherein said computer network includes means for organizing said true color digital signal frames in a relational database for accessing image and text data together as a medical record, and graphic manipulation means for processing color-enhanced images from the image data in a record.
9. A system according to claim 1 further comprising an image database memory coupled to said computer network and having storage for one or more data records which include an image field for storing one of said image frames and an identification field for storing diagnostic text identification that identifies features in the stored image frame of said data record.
10. A system according to claim 1 wherein said image acquisition means operates to image with continuous non-flash illumination at light levels below 5 lux.
11. A system according to claim 1 wherein the display includes control means for remotely controlling said image acquisition means to selectively generate said image signals.
12. A system according to claim 1 further comprising an ophthalmic treatment instrument coordinated with said image acquisition means for delivering treatment to said retina as it is imaged.
13. A system according to claim 12 wherein said network interconnects a controller at said display remote from said image acquisition means to control said ophthalmic treatment instrument while an operator views the display.
14. A system according to claim 1, wherein said computer network interfaces with said storage and display means to create image/text records such that a patient's complete medical record may be stored at a work station in digital form and accessed for display and evaluation.
15. A system according to claim 7, wherein said network is located at a hub location having a plurality of said monitors set up as workstations such that trained technicians view and identify features in images from image acquisition means located at a plurality of different sites to identify retinopathy, thereby providing image-based remote expert screening.
16. A system for retinal imaging comprising image acquisition means including an ophthalmic viewing instrument and an electronic imaging camera for capturing true color digital images through said instrument, a continuous light source which is adjustable to correct the color of the digital images, an image processor, said image processor being connected to a network, and means in said network for incorporating the true color digital images taken by said camera into medical records.
17. A system according to claim 16, further comprising an examination system connected to said network, the examination system comprising means for accessing said images and medical records, and means for displaying said images adjacent to one another for examination.
18. A system according to claim 16, further comprising digital color filter means for applying a digital color filter to said true color images to form high contrast images.
PCT/US1995/015996 1994-12-09 1995-12-11 Electronic imaging system for retinal examination and treatment WO1996017545A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU45127/96A AU706720B2 (en) 1994-12-09 1995-12-11 Electronic imaging system for retinal examination and treatment
JP8517808A JPH10510187A (en) 1994-12-09 1995-12-11 Electronic imaging device for retinal examination and treatment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US35348694A 1994-12-09 1994-12-09
US08/353,486 1994-12-09

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US08/870,939 Continuation-In-Part US5993001A (en) 1997-06-05 1997-06-05 Stereoscopic imaging system for retinal examination with remote examination unit

Publications (2)

Publication Number Publication Date
WO1996017545A1 true WO1996017545A1 (en) 1996-06-13
WO1996017545A9 WO1996017545A9 (en) 1996-08-22

Family

ID=23389330

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1995/015996 WO1996017545A1 (en) 1994-12-09 1995-12-11 Electronic imaging system for retinal examination and treatment

Country Status (4)

Country Link
JP (1) JPH10510187A (en)
AU (1) AU706720B2 (en)
CA (1) CA2207318A1 (en)
WO (1) WO1996017545A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3469872B2 (en) * 2000-12-27 2003-11-25 株式会社コーナン・メディカル Corneal endothelial cell analysis system
US7232220B2 (en) * 2001-03-01 2007-06-19 Richard Franz System for vision examination utilizing telemedicine
JP6869032B2 (en) * 2017-01-11 2021-05-12 株式会社トプコン Ophthalmic examination system
JP7042029B2 (en) * 2017-03-07 2022-03-25 株式会社トプコン Ophthalmic observation device and its operation method
JP6895278B2 (en) * 2017-03-07 2021-06-30 株式会社トプコン Ophthalmic observation device and its operation method
JP6895277B2 (en) * 2017-03-07 2021-06-30 株式会社トプコン Ophthalmic observation device and its operation method


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5049147A (en) * 1989-04-06 1991-09-17 Danon Nissim N Apparatus for computerized laser surgery
US5220360A (en) * 1990-10-24 1993-06-15 Ophthalmic Imaging Systems, Inc. Apparatus and method for topographical analysis of the retina

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7682150B2 (en) * 1996-01-02 2010-03-23 Jjl Technologies Llc Method for preparing a dental prosthesis based on electronically determined image and color/shade data and based on telephone communication
EP0857455A2 (en) * 1997-02-10 1998-08-12 Nidek Co., Ltd. Telecommunication system for examining an eye and an ophthalmic apparatus used for the system
EP0857455B1 (en) * 1997-02-10 2005-12-28 Nidek Co., Ltd. Telecommunication system for examining an eye and an ophthalmologic apparatus used for the system
WO1998051207A1 (en) * 1997-05-09 1998-11-19 Quigley Michael G Examining retinal vessels before and after ophthalmic intervention
JP2002516688A (en) * 1998-06-04 2002-06-11 レーザー・ディアグノスティック・テクノロジーズ・インコーポレイテッド Eye test equipment using polarized probe
US6588901B1 (en) 1998-06-08 2003-07-08 Yeda Research And Development Co., Ltd. Imaging and analyzing movement of individual erythrocytes in blood vessels
WO1999063882A1 (en) * 1998-06-08 1999-12-16 Amiram Grinvald Imaging and analyzing movement of individual erythrocytes in blood vessels
US6766041B2 (en) 1998-07-09 2004-07-20 Colorado State University Research Foundation Retinal vasculature image acquisition apparatus and method
WO2000002480A1 (en) * 1998-07-09 2000-01-20 Colorado State University Research Foundation Retinal vasculature image acquisition apparatus and method
FR2792441A1 (en) * 1999-04-14 2000-10-20 Iodp MEDICAL IMAGING SYSTEM
WO2000061009A1 (en) * 1999-04-14 2000-10-19 Iôdp (S.A.R.L.) Medical imaging system
US6980676B2 (en) 1999-04-14 2005-12-27 Iodp (S.A.R.L.) Medical imaging system
US7083281B2 (en) 2000-05-19 2006-08-01 The Lions Eye Institute Of Western Australia, Inc. Portable slit lamp
AU2001258043B2 (en) * 2000-05-19 2006-04-27 Lions Eye Institute Limited Portable slit lamp
WO2001089375A1 (en) * 2000-05-19 2001-11-29 The Lions Eye Institute Of Western Australia Incorporated Portable slit lamp
US7025459B2 (en) 2000-07-14 2006-04-11 Visual Pathways, Inc. Ocular fundus auto imager
US7360895B2 (en) 2000-07-14 2008-04-22 Visual Pathways, Inc. Simplified ocular fundus auto imager
US7894645B2 (en) 2000-08-10 2011-02-22 Ohio State University High-resolution digital image processing in the analysis of pathological materials
US6654553B2 (en) 2000-12-01 2003-11-25 Nidek Co., Ltd. Fundus camera
EP1210905A3 (en) * 2000-12-01 2002-11-13 Nidek Co., Ltd. Fundus camera
EP1210905A2 (en) * 2000-12-01 2002-06-05 Nidek Co., Ltd. Fundus camera
US6659610B2 (en) 2000-12-27 2003-12-09 Konan Medical, Inc. Corneal endothelium analysis service method and system
EP1703744A3 (en) * 2005-03-16 2010-02-24 Sony Corporation Apparatus and method for stereoscopic image recording/reproduction and display
US10874299B2 (en) 2007-02-16 2020-12-29 20/20 Vision Center, Llc System and method for enabling customers to obtain refraction specifications and purchase eyeglasses or contact lenses
US9541901B2 (en) 2008-07-10 2017-01-10 Real View Imaging Ltd. Viewer tracking in a projection system
US9594347B2 (en) 2008-07-10 2017-03-14 Real View Imaging Ltd. Man machine interface for a 3D display system
US10120335B2 (en) 2008-07-10 2018-11-06 Real View Imaging Ltd. Viewer tracking in a projection system
US10585395B2 (en) 2008-07-10 2020-03-10 Real View Imaging Ltd. Holographic image display system
US7856135B1 (en) 2009-12-02 2010-12-21 Aibili—Association for Innovation and Biomedical Research on Light and Image System for analyzing ocular fundus images
US10734114B2 (en) 2012-11-06 2020-08-04 20/20 Vision Center, Llc Systems and methods for enabling customers to obtain vision and eye health examinations
US10665345B2 (en) 2012-11-06 2020-05-26 20/20 Vision Center, Llc Systems and methods for enabling customers to obtain vision and eye health examinations
US10714217B2 (en) 2012-11-06 2020-07-14 20/20 Vision Center, Llc. Systems and methods for enabling customers to obtain vision and eye health examinations
US20160092721A1 (en) * 2013-05-19 2016-03-31 Commonwealth Scientific And Industrial Research Organization A system and method for remote medical diagnosis
US9898659B2 (en) * 2013-05-19 2018-02-20 Commonwealth Scientific And Industrial Research Organisation System and method for remote medical diagnosis
US10602928B2 (en) 2016-09-14 2020-03-31 DigitalOptometrics LLC Remote comprehensive eye examination system
US10827925B2 (en) 2016-09-14 2020-11-10 DigitalOptometrics LLC Remote comprehensive eye examination system
US10524658B2 (en) 2016-09-14 2020-01-07 DigitalOptometrics LLC Remote comprehensive eye examination system
US11259701B2 (en) 2016-09-14 2022-03-01 DigitalOptometrics LLC Remote comprehensive eye examination system
US11510569B2 (en) 2016-09-14 2022-11-29 DigitalOptometrics LLC Remote comprehensive eye examination system
US11759108B2 (en) 2016-09-14 2023-09-19 DigitalOptometrics LLC Remote comprehensive eye examination system
US11931103B2 (en) 2017-01-05 2024-03-19 Heidelberg Engineering Gmbh Method and device for carrying out the method in a vehicle
CN114259337A (en) * 2021-12-09 2022-04-01 杭州叁伟医疗科技有限公司 Brightness adjusting method, equipment and medium for hyperbaric oxygen eye therapeutic apparatus

Also Published As

Publication number Publication date
AU706720B2 (en) 1999-06-24
JPH10510187A (en) 1998-10-06
CA2207318A1 (en) 1996-06-13
AU4512796A (en) 1996-06-26

Similar Documents

Publication Publication Date Title
US5993001A (en) Stereoscopic imaging system for retinal examination with remote examination unit
AU706720B2 (en) Electronic imaging system for retinal examination and treatment
WO1996017545A9 (en) Electronic imaging system for retinal examination and treatment
US6705726B2 (en) Instrument for eye examination and method
US10101571B2 (en) Perfusion assessment multi-modality optical medical device
US7583827B2 (en) Assessment of lesions in an image
Freeman et al. Simultaneous indocyanine green and fluorescein angiography using a confocal scanning laser ophthalmoscope
US20030157464A1 (en) Instrument for eye examination and method
US8303115B2 (en) Method and system for retinal health management
EP0665686A2 (en) Visual information system
JP6972049B2 (en) Image processing method and image processing device using elastic mapping of vascular plexus structure
Tang et al. Telemedicine for eye care
US20230320584A1 (en) Image processing method, image processing program, image processing device, image display device, and image display method
Berger et al. Computer-vision-enabled augmented reality fundus biomicroscopy
Bergua et al. Tele-transmission of stereoscopic images of the optic nerve head in glaucoma via Internet
Lamminen et al. Fundus imaging and the telemedical management of diabetes
EP3954279A1 (en) Dermal image capture
Cuadros et al. Diabetic retinopathy screening practice guide
DE102019205318B4 (en) METHOD FOR QUANTITATIVE DETERMINATION OF DISTURBANCES OF THE FIELD OF THE EYE OF A SUBJECT
CN107661087A (en) Medical imaging apparatus and method for the imaging of photosensitive object such as biological tissue
Siddiqui et al. Revolutionizing Ophthalmology: The Empowering Role of Artificial Intelligence
Bettley Applications of low-cost eye tracking in telemedicine
KR20060022817A (en) The iridology photographing system using wellbeing remote controlled device and the mouse
Soliz et al. Improving the visualization of drusen in age-related macular degeneration through maximum entropy digitization and stereo viewing
Cuadros; Kanagasingam Yogesan; Sajeesh Kumar; Leonard Goldschmidt

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

COP Corrected version of pamphlet

Free format text: PAGES 1/3-3/3, DRAWINGS, REPLACED BY NEW PAGES BEARING THE SAME NUMBER; DUE TO LATE TRANSMITTAL BY THE RECEIVING OFFICE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
ENP Entry into the national phase

Ref document number: 2207318

Country of ref document: CA

Ref country code: CA

Ref document number: 2207318

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 08870939

Country of ref document: US

122 Ep: pct application non-entry in european phase