EP2420181B1 - Eyeground observation device

Info

Publication number
EP2420181B1
Authority
EP
European Patent Office
Prior art keywords
scanning
image
fundus
light
positional misalignment
Prior art date
Legal status
Active
Application number
EP10764219.1A
Other languages
German (de)
English (en)
French (fr)
Other versions
EP2420181A1 (en
EP2420181A4 (en
Inventor
Hiroshi Koizumi
Koki Harumoto
Tsutomu Kikawa
Takefumi Hayashi
Current Assignee
Topcon Corp
Original Assignee
Topcon Corp
Priority date
Filing date
Publication date
Application filed by Topcon Corp
Priority to EP14001193.3A (EP2752151B1)
Publication of EP2420181A1
Publication of EP2420181A4
Application granted
Publication of EP2420181B1

Classifications

    • A61B 3/102 - Objective types for examining the eyes, for optical coherence tomography [OCT]
    • A61B 3/113 - Objective types, for determining or recording eye movement
    • A61B 3/12 - Objective types, for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B 3/1225 - Objective types, for looking at the eye fundus using coherent radiation
    • A61B 3/14 - Arrangements specially adapted for eye photography
    • A61B 3/152 - Arrangements specially adapted for eye photography, with means for aligning
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/10101 - Optical tomography; Optical coherence tomography [OCT]
    • G06T 2207/20068 - Projection on vertical or horizontal image axis
    • G06T 2207/30041 - Eye; Retina; Ophthalmic

Definitions

  • the present invention relates to a fundus observation apparatus configured to form images of a fundus of an eye by using optical coherence tomography.
  • optical coherence tomography that forms images of the surface morphology and internal morphology of an object by using a light beam from a laser light source or the like has attracted attention.
  • optical coherence tomography is noninvasive to human bodies, and is therefore expected to be utilized in the medical field and biological field.
  • Japanese Unexamined Patent Application Publication No. Hei 11-325849 discloses a device to which optical coherence tomography is applied.
  • This device has such a configuration that: a measuring arm scans an object by a rotary deflection mirror (a Galvano mirror); a reference arm is provided with a reference mirror; and an interferometer is mounted at the outlet to analyze, by a spectrometer, the intensity of an interference light of light fluxes from the measurement arm and the reference arm.
  • the reference arm is configured to gradually change the light flux phase of the reference light by discontinuous values.
  • the device of Japanese Unexamined Patent Application Publication No. Hei 11-325849 uses a technique of so-called "Fourier Domain OCT (Optical Coherence Tomography)." That is to say, the device radiates a low coherence light beam to an object, superposes the reflected light and the reference light to generate an interference light, and acquires the spectral intensity distribution of the interference light to execute Fourier transform, thereby imaging the morphology in the depth direction (the z-direction) of the object.
  • the technique of this type is also called Spectral Domain.
  • the device described in Japanese Unexamined Patent Application Publication No. Hei 11-325849 is provided with a Galvano mirror that scans with a light beam (a signal light), and is thereby configured to form an image of a desired measurement target region of the object. Because this device is configured to scan with the light beam only in one direction (the x-direction) orthogonal to the z-direction, an image formed by this device is a two-dimensional tomographic image in the depth direction (the z-direction) along the scanning direction (the x-direction) of the light beam.
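The Fourier Domain reconstruction described in the items above can be illustrated numerically. The following Python/NumPy sketch is an editorial illustration, not part of the patent disclosure; the wavenumber grid, reflector depths, and reflectivities are arbitrary assumptions. It synthesizes a spectral interference signal for two toy reflectors and recovers their depth positions with a Fourier transform.

```python
import numpy as np

# Toy spectral-domain reconstruction; every parameter here is illustrative.
n_pixels = 2048                              # spectrometer line-sensor samples
k = np.linspace(7.7e6, 8.1e6, n_pixels)      # wavenumber grid [1/m]
depths = [120e-6, 250e-6]                    # two toy reflector depths [m]
reflectivities = [0.8, 0.4]

# Each reflector at depth z contributes a cosine fringe cos(2*k*z).
spectrum = np.ones(n_pixels)
for z, r in zip(depths, reflectivities):
    spectrum += r * np.cos(2 * k * z)

spectrum -= spectrum.mean()                  # suppress the DC (reference) term
a_scan = np.abs(np.fft.ifft(spectrum))       # Fourier transform -> depth profile
peaks = np.sort(np.argsort(a_scan[: n_pixels // 2])[-2:])
print("reflector peaks at depth bins:", peaks)
```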
  • Japanese Unexamined Patent Application Publication No. 2002-139421 discloses a technique of scanning with a signal light in the horizontal direction (x-direction) and the vertical direction (y-direction) to form a plurality of two-dimensional tomographic images in the horizontal direction, and acquiring and imaging three-dimensional tomographic information of a measured range based on the tomographic images.
  • As the three-dimensional imaging, for example, a method of arranging and displaying a plurality of tomographic images in the vertical direction (referred to as stack data or the like), and a method of executing a rendering process on a plurality of tomographic images to form a three-dimensional image are considered.
  • Japanese Unexamined Patent Application Publication No. 2007-24677 and Japanese Unexamined Patent Application Publication No. 2006-153838 disclose other types of OCT devices.
  • Japanese Unexamined Patent Application Publication No. 2007-24677 describes an OCT device that images the morphology of an object by scanning the object with light of various wavelengths, acquiring the spectral intensity distribution based on an interference light obtained by superposing the reflected lights of the light of the respective wavelengths on the reference light, and executing Fourier transform.
  • Such an OCT device is called a Swept Source type or the like.
  • the Swept Source type is a kind of the Fourier Domain type.
  • Japanese Unexamined Patent Application Publication No. 2006-153838 describes an OCT device that radiates a light having a predetermined beam diameter to an object and analyzes the components of an interference light obtained by superposing the reflected light and the reference light, thereby forming an image of the object in a cross-section orthogonal to the travelling direction of the light.
  • Such an OCT device is called a full-field type, en-face type or the like.
  • Japanese Unexamined Patent Application Publication No. 2008-73099 discloses a configuration in which the OCT is applied to the ophthalmologic field. According to this fundus observation apparatus, it is possible to obtain tomographic images and 3-dimensional images of a fundus. Before the OCT device was applied to the ophthalmologic field, a fundus observation apparatus such as a retinal camera had been used (for example, refer to Japanese Unexamined Patent Application Publication No. Hei 9-276232 ).
  • a fundus observation apparatus using OCT has the merit that tomographic images and 3-dimensional images of a fundus can be obtained, and is therefore expected to contribute to higher diagnostic accuracy and to early detection of lesions.
  • Document US2003/199769 discloses a fundus observation apparatus with the features of the preamble of claim 1.
  • This invention resolves the abovementioned problem, with the purpose of providing a fundus observation apparatus capable of capturing a highly accurate OCT image, even if the eye moves or blinks during scanning with a signal light.
  • a preferred embodiment is the fundus observation apparatus according to Claim 1, wherein said prescribed time interval is a substantially integral multiple of a scan time interval that is from the timing at which said signal light is irradiated to one of said plurality of scanning points to the timing at which said signal light is irradiated to the next scanning point; while said signal light is sequentially irradiated to said plurality of scanning points by said scanning part, said detection part detects the position of said fundus each time when the relevant integral number of scanning points are scanned; and said calculation part divides said plurality of 1-dimensional images into 1-dimensional image groups, each group comprising the relevant integral number of 1-dimensional images, specifies the position of each 1-dimensional image group based on the detection results of the position of said fundus when the relevant integral number of scanning points corresponding to each 1-dimensional image group are being scanned, and calculates said positional misalignment amount based on said specified position of each 1-dimensional image group.
  • said integral number is one; said 1-dimensional image group consists of one 1-dimensional image; and said calculation part specifies the position of the 1-dimensional image with regard to said plurality of 1-dimensional images based on the detection results of the position of said fundus when a scanning point corresponding to the 1-dimensional image is being scanned, and calculates said positional misalignment amount based on the specified plurality of positions.
  • said integral number is equal to or greater than two; said 1-dimensional image group consists of two or more 1-dimensional images; and said calculation part estimates, based on the detection results of the position of said fundus when two or more scanning points corresponding to one of said plurality of 1-dimensional image groups are being scanned and the detection results of the position of said fundus when two or more scanning points corresponding to the next 1-dimensional image group are being scanned, said positional misalignment amount of a 1-dimensional image included in said one of said plurality of 1-dimensional image groups and/or said next 1-dimensional image group.
  • said detection part includes an imaging part that forms a moving image by imaging said fundus at said prescribed time interval when the scanning with said signal light is executed by said scanning part, and an image region-specifying part that specifies an image region of a characteristic site of said fundus in each still image forming said moving image, and obtains the position of said image region in said each still image as the position of said fundus.
  • said calculation part includes a scanning point-specifying part that, when there is a still image in which said image region is not specified by said image region-specifying part, specifies a scanning point of a 1-dimensional image corresponding to the still image; said scanning part reirradiates said signal light to the specified scanning point; and said image forming part forms a new 1-dimensional image based on the detection results of interference light of said reirradiated signal light and said reference light.
  • said calculation part sequentially calculates said positional misalignment amount based on the position of said fundus that is sequentially detected at said prescribed time interval when scanning with said signal light is executed; and comprising a controlling part that corrects the irradiation position of said signal light to said fundus by controlling said scanning part based on said sequentially calculated positional misalignment amount.
  • said plurality of scanning points are arranged along a prescribed scanning line; said scanning part repeatedly scans along said prescribed scanning line with said signal light; said image forming part repeatedly forms said plurality of 1-dimensional images corresponding to said plurality of scanning points following the repetitive scanning; said calculation part repeatedly calculates said positional misalignment amount following the repetitive formations; further comprising: a determination part which determines whether or not each repeatedly calculated positional misalignment amount is included in a prescribed permissible range; and an image overlapping part that overlaps, for each 1-dimensional image corresponding to each scanning point, a set of said plurality of 1-dimensional images whose positional misalignment amount is determined to be included in said prescribed permissible range; and said image forming part forms a tomographic image along said prescribed scanning line by arranging a plurality of new 1-dimensional images formed as a result of said overlapping in accordance with the arrangement of said plurality of scanning points.
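As an editorial illustration of the determination and overlapping described in the item above (not the claimed implementation; the input shapes and the scalar misalignment measure are assumptions), the accepted repetitions of one scanning line can be averaged A-scan by A-scan:

```python
import numpy as np

def overlap_repeated_scans(repeats, misalignment_amounts, permissible_range):
    """repeats: (n_repeats, depth, n_scanning_points) tomograms of one line;
    misalignment_amounts: one scalar per repeat (assumed precomputed);
    permissible_range: acceptance threshold (assumed scalar)."""
    repeats = np.asarray(repeats, dtype=float)
    keep = np.abs(np.asarray(misalignment_amounts, dtype=float)) <= permissible_range
    if not keep.any():
        raise ValueError("no repetition inside the permissible range")
    # Overlapping: average the accepted repeats for each scanning point's
    # 1-dimensional image; the averaged columns form the new tomogram.
    return repeats[keep].mean(axis=0)
```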
  • said calculation part includes an image specifying part that specifies a 1-dimensional image with the calculated positional misalignment amount of greater than a prescribed value; said scanning part reirradiates said signal light towards a scanning point corresponding to each 1-dimensional image specified by said image specifying part; and said image forming part forms a new 1-dimensional image at the scanning point based on the detection results of interference light of said reirradiated signal light and said reference light.
  • said plurality of scanning points are arranged along a prescribed scanning line; said calculation part includes an image selecting part that, for each of said plurality of scanning points, selects the 1-dimensional image closest to the original position of the scanning point among said plurality of 1-dimensional images, based on the calculated positional misalignment amount; and said image forming part forms a tomographic image along said prescribed scanning line by arranging the selected 1-dimensional image in accordance with the arrangement of said plurality of scanning points.
  • said calculation part calculates the positional misalignment amount of said plurality of 1-dimensional images in the depth direction of said fundus, based on a separate 1-dimensional image group arranged in a separate scanning direction that is formed by said image forming part based on the detection results of interference light of signal light that is separately scanned by said scanning part and reference light.
  • said scanning part sequentially irradiates said signal light, as said separate scanning, to a prescribed number of scanning points along a scanning line crossing the arrangement direction of said plurality of scanning points; said image forming part forms said 1-dimensional image at each of said prescribed number of scanning points and forms a tomographic image corresponding to said scanning line based on said prescribed number of formed 1-dimensional images; and said calculation part specifies an image region of a characteristic layer of said fundus in said tomographic image, specifies the image region of said characteristic layer in a tomographic image formed by arranging said plurality of scanning points, calculates the depthwise displacement of said image region corresponding to said scanning line and said image region corresponding to said plurality of scanning points, and calculates the depthwise positional misalignment amount of said plurality of 1-dimensional images based on the calculated displacement.
  • said calculation part includes a second correction part that corrects the position of said plurality of 1-dimensional images in the depth direction, based on the calculated depthwise positional misalignment amount.
  • an optical system that splits low coherence light into signal light and reference light, generates interference light by superposing said signal light that has passed through the fundus of an eye and reference light that has passed through a reference optical path, and detects the generated interference light; a scanning part that two-dimensionally scans said fundus with said signal light; an image forming part that forms, based on the detection results of said interference light, a 3-dimensional image corresponding to the region of said fundus in which the two-dimensional scanning with said signal light is executed; an imaging part that forms a moving image of said fundus when the two-dimensional scanning with said signal light is executed; and a correction part that corrects the position of said 3-dimensional image in a fundus surface direction based on said moving image.
  • the positional misalignment amount of a plurality of 1-dimensional images in the fundus surface direction may be calculated based on temporal changes in the position of the fundus, detected at a prescribed time interval during scanning with a signal light.
  • the position in the fundus surface direction in a 3-dimensional image of a fundus may be corrected based on a moving image of the fundus and, because the depthwise position of the fundus of a 3-dimensional image may be corrected based on tomographic images of the fundus based on the detection results of interference light of the signal light that is separately scanned by a scanning part and reference light, even if the eye moves or blinks during scanning with the signal light, it is possible to capture a highly accurate 3-dimensional image (OCT image).
  • the fundus observation apparatus embodying the present invention forms tomographic images of a fundus using optical coherence tomography.
  • Optical coherence tomography of an arbitrary type involving scanning with a signal light, such as a Fourier Domain type, a swept source type, etc., is applicable to the fundus observation apparatus.
  • an image obtained by optical coherence tomography is sometimes referred to as an OCT image.
  • a measuring action for forming an OCT image is sometimes referred to as an OCT measurement.
  • a fundus observation apparatus 1 includes a retinal camera unit 2, an OCT unit 100, and an arithmetic and control unit 200.
  • the retinal camera unit 2 has almost the same optical system as a conventional retinal camera.
  • the OCT unit 100 is provided with an optical system for obtaining an OCT image of a fundus.
  • the arithmetic and control unit 200 is provided with a computer that executes various arithmetic processes, control processes, and so on.
  • the retinal camera unit 2 shown in Fig. 1 is provided with an optical system for forming a 2-dimensional image (fundus image) representing the surface morphology of the fundus Ef of an eye E.
  • Fundus images include observation images, photographed images, etc.
  • the observation image is, for example, a monochrome moving image formed at a prescribed frame rate using near-infrared light.
  • the photographed image is, for example, a color image captured by flashing visible light.
  • the retinal camera unit 2 may also be configured so as to be capable of capturing other types of images such as a fluorescein angiography image or an indocyanine green fluorescent image.
  • the retinal camera unit 2 is provided with a chin rest and a forehead rest for retaining the face of the subject, similar to a conventional retinal camera. Moreover, like a conventional retinal camera, the retinal camera unit 2 is provided with an illumination optical system 10 and an imaging optical system 30.
  • the illumination optical system 10 radiates an illumination light to the fundus Ef.
  • the imaging optical system 30 guides a fundus reflected light of the illumination light to imaging devices (CCD image sensors 35, 38).
  • the imaging optical system 30 guides a signal light LS coming from the OCT unit 100 to the fundus Ef, and guides the signal light propagated through the fundus Ef to the OCT unit 100.
  • An observation light source 11 of the illumination optical system 10 comprises, for example, a halogen lamp.
  • Light (observation illumination light) output from the observation light source 11 is reflected by a reflection mirror 12 with a curved reflection surface, and becomes near-infrared light after passing through a visible cut filter 14 via a condenser lens 13. Furthermore, the observation illumination light is once converged near an imaging light source 15, reflected by a mirror 16, and passes through relay lenses 17, 18, a diaphragm 19, and a relay lens 20. Then, the observation illumination light is reflected by the peripheral part (the region surrounding an aperture part) of an aperture mirror 21 and illuminates the fundus Ef via an object lens 22.
  • the fundus reflection light of the observation illumination light is refracted by the object lens 22, passes through the aperture part formed in the center region of the aperture mirror 21, passes through a dichroic mirror 55, travels through a focus lens 31, and is reflected by a dichroic mirror 32. Furthermore, the fundus reflection light passes through a half-mirror 40 and forms an image on the light receiving surface of the CCD image sensor 35 by a condenser lens 34 after being reflected by a dichroic mirror 33.
  • the CCD image sensor 35 detects, for example, the fundus reflection light at a prescribed frame rate.
  • An image (observation image) K based on the fundus reflection light detected by the CCD image sensor 35 is displayed on a display device 3.
  • the imaging light source 15 consists of, for example, a xenon lamp.
  • the light (imaging illumination light) output from the imaging light source 15 is irradiated to the fundus Ef via a route that is the same as the observation illumination light.
  • the fundus reflection light of the imaging illumination light is guided to the dichroic mirror 33 via the same route as that of the observation illumination light, passes through the dichroic mirror 33, and forms an image on the light receiving surface of the CCD image sensor 38 by a condenser lens 37 after being reflected by a mirror 36.
  • An image (photographed image) H based on the fundus reflection light detected by the CCD image sensor 38 is displayed on the display device 3. It should be noted that the display device 3 for displaying an observation image K and the display device 3 for displaying a photographed image H may be the same or different.
  • An LCD (Liquid Crystal Display) 39 displays a fixation target or a visual target for measuring eyesight.
  • the fixation target is a visual target for fixing the eye E, and is used when imaging a fundus or forming a tomographic image.
  • the visual target for measuring eyesight is a visual target used for measuring an eyesight value of the eye E, for example, such as Landolt rings. It should be noted that the visual target for measuring eyesight is sometimes simply referred to as a target.
  • Part of the light output from the LCD 39 is reflected by a half-mirror 40, reflected by the dichroic mirror 32, passes through the aperture part of the aperture mirror 21 via the focus lens 31 as well as a dichroic mirror 55, is refracted by the object lens 22 and projected to the fundus Ef.
  • As the fixation position of the eye E, there are a position for acquiring an image centered on the macula of the fundus Ef, a position for acquiring an image centered on the optic papilla, a position for acquiring an image centered on the fundus center between the macula and the optic papilla, and so on, as in conventional retinal cameras.
  • the retinal camera unit 2 is provided with an alignment optical system 50 and a focus optical system 60.
  • the alignment optical system 50 generates a target (alignment target) for matching the position (alignment) of the device optical system with respect to the eye E.
  • the focus optical system 60 generates a target (split target) for matching the focus with respect to the fundus Ef.
  • Light (alignment light) output from the LED (Light Emitting Diode) 51 of the alignment optical system 50 is reflected by the dichroic mirror 55 via diaphragms 52, 53, and a relay lens 54, passes through the aperture part of the aperture mirror 21, and is projected onto the cornea of the eye E by the object lens 22.
  • Part of cornea reflection light of the alignment light is transmitted through the dichroic mirror 55 via the object lens 22 and the aperture part, passes through the focus lens 31, is reflected by the dichroic mirror 32, transmitted through the half-mirror 40, reflected by the dichroic mirror 33, and projected onto the light receiving surface of the CCD image sensor 35 by the condenser lens 34.
  • a light receiving image (alignment target) by the CCD image sensor 35 is displayed on the display device 3 along with the observation image K.
  • a user conducts alignment by an operation that is the same as with conventional retinal cameras. It should be noted that alignment may be performed, by the arithmetic and control unit 200, as a result of analyzing the position of the alignment target and moving the optical system.
  • the reflection surface of a reflection rod 67 is provided in a slanted position on the light path of the illumination optical system 10.
  • Light (focus light) output from an LED 61 of the focus optical system 60 passes through a relay lens 62, is split into two light fluxes by a split target plate 63, passes through a two-hole diaphragm 64, is reflected by a mirror 65, and is reflected after an image is formed once on the reflection surface of the reflection rod 67 by a condenser lens 66.
  • the focus light is reflected at the aperture mirror 21 via the relay lens 20 and an image is formed on the fundus Ef by the object lens 22.
  • the fundus reflection light of the focus light passes through the same route as the cornea reflection light of the alignment light and is detected by the CCD image sensor 35.
  • a light receiving image (split target) by the CCD image sensor 35 is displayed on the display device 3 along with an observation image K.
  • the arithmetic and control unit 200 analyzes the position of the split target, and moves the focus lens 31 and the focus optical system 60 for focusing. It should be noted that focusing may be performed manually while visually recognizing the split target.
  • An optical path including a mirror 41, collimator lens 42, and Galvano mirrors 43, 44 is provided behind the dichroic mirror 32.
  • the optical path is connected to the OCT unit 100.
  • the Galvano mirror 44 performs scanning with a signal light LS from the OCT unit 100 in the x-direction.
  • the Galvano mirror 43 performs scanning with a signal light LS in the y-direction. Scanning may be performed with the signal light LS in an arbitrary direction in the xy-plane due to the two Galvano mirrors 43 and 44.
  • the OCT unit 100 shown in Fig. 2 is provided with an optical system for obtaining a tomographic image of the fundus Ef.
  • the optical system has a similar configuration to a conventional Fourier-Domain-type OCT device. That is to say, the optical system is configured to split a low coherence light into a reference light and a signal light, make the signal light propagated through a fundus and the reference light propagated through a reference optical path interfere with each other to generate an interference light, and detect the spectral components of this interference light. This detection result (detection signal) is transmitted to the arithmetic and control unit 200.
  • a light source unit 101 outputs a low coherence light L0.
  • the low coherence light L0 is, for example, light (invisible light) consisting of wavelengths that cannot be detected by human eyes. Furthermore, the low coherence light L0 is, for example, near-infrared light having a center wavelength of about 1050-1060 nm.
  • the light source unit 101 is configured to include a light output device such as an SLD (Super Luminescent Diode), an SOA (Semiconductor Optical Amplifier), or the like.
  • the low coherence light L0 output from the light source unit 101 is guided to a fiber coupler 103 by an optical fiber 102 and split into signal light LS and reference light LR.
  • the fiber coupler 103 acts both as a means of splitting light (splitter) and as a means of synthesizing light (coupler), but is conventionally referred to herein as a "fiber coupler."
  • the signal light LS is guided by the optical fiber 104 and becomes a parallel light flux by a collimator lens unit 105. Furthermore, the signal light LS is reflected by Galvano mirrors 44 and 43, converged by the collimator lens 42, reflected by the mirror 41, transmitted through a dichroic mirror 32, and irradiated to the fundus Ef after passing through a route that is the same as the light from the LCD 39.
  • the signal light LS is scattered and reflected at the fundus Ef.
  • the scattered light and the reflection light are sometimes collectively referred to as the fundus reflection light of the signal light LS.
  • the fundus reflection light of the signal light LS progresses along the same route in the reverse direction and is guided to the fiber coupler 103.
  • the reference light LR is guided by an optical fiber 106 and becomes a parallel light flux by a collimator lens unit 107. Furthermore, the reference light LR is reflected by mirrors 108, 109, 110, dimmed by an ND (Neutral Density) filter 111, and reflected by a mirror 112, with the image formed on a reflection surface of a reference mirror 114 by a collimator lens 113. The reference light LR reflected by the reference mirror 114 progresses along the same route in the reverse direction and is guided to the fiber coupler 103. It should be noted that an optical element (pair prism, etc.) for dispersion compensation and/or an optical element for polarization correction (wave plate, etc.) may also be provided for the optical path (reference optical path) of the reference light LR.
  • the fiber coupler 103 superposes the fundus reflection light of the signal light LS and the reference light LR reflected by the reference mirror 114.
  • Interference light LC thus generated is guided by an optical fiber 115 and output from an emitting end 116. Furthermore, the interference light LC is converted to a parallel light flux by a collimator lens 117, spectrally divided (spectrally decomposed) by a diffraction grating 118, converged by the convergence lens 57, and projected onto the light receiving surface of a CCD image sensor 120.
  • the CCD image sensor 120 is for example a line sensor, and detects the respective spectral components of the divided interference light LC and converts the components into electric charges.
  • the CCD image sensor 120 accumulates these electric charges and generates a detection signal. Furthermore, the CCD image sensor 120 transmits the detection signal to the arithmetic and control unit 200.
  • It should be noted that a CMOS (Complementary Metal Oxide Semiconductor) image sensor may be used in place of the CCD image sensor.
  • the arithmetic and control unit 200 analyzes the detection signals inputted from the CCD image sensor 120, and forms an OCT image of the fundus Ef.
  • An arithmetic process for this is like that of a conventional Fourier-Domain-type OCT device.
  • the arithmetic and control unit 200 controls each part of the retinal camera unit 2, the display device 3 and the OCT unit 100.
  • the arithmetic and control unit 200 executes: control of action of the observation light source 11, the imaging light source 15 and LEDs 51 and 61; control of action of the LCD 39; control of movement of the focus lens 31; control of movement of the reflection rod 67; control of movement of the focus optical system 60; control of action of the respective Galvano mirrors 43 and 44; and so on.
  • the arithmetic and control unit 200 executes: control of action of the light source unit 101; control of movement of the reference mirror 114 and the collimator lens 113; control of action of the CCD image sensor 120; and so on.
  • the arithmetic and control unit 200 includes a microprocessor, a RAM, a ROM, a hard disk drive, a communication interface, and so on, as in conventional computers.
  • the storage device such as the hard disk drive stores a computer program for controlling the fundus observation apparatus 1.
  • the arithmetic and control unit 200 may be provided with a circuit board dedicated for forming OCT images based on detection signals from the CCD image sensor 120.
  • the arithmetic and control unit 200 may be provided with operation devices (input devices) such as a keyboard and a mouse, and/or display devices such as an LCD.
  • the retinal camera unit 2, display device 3, OCT unit 100, and arithmetic and control unit 200 may be integrally configured (that is, within a single case), or configured as individual separate bodies.
  • the control system of the fundus observation apparatus 1 has a configuration centered on a controller 210 of the arithmetic and control unit 200.
  • the controller 210 includes, for example, the aforementioned microprocessor, RAM, ROM, hard disk drive, and communication interface.
  • a controller 210 is provided with a main controller 211 and storage 212.
  • the main controller 211 performs the aforementioned various kinds of control. Specifically, the main controller 211 controls a scan driver 70 as well as a focus driver 80 of the retinal camera unit 2, and further controls a reference driver 130 of the OCT unit 100.
  • the scan driver 70 is configured, for example, including a servo motor and independently changes the facing direction of the Galvano mirrors 43 and 44.
  • the scan driver 70, together with the Galvano mirrors 43 and 44, constitutes one example of the "scanning part" in the present embodiment.
  • the focus driver 80 is configured, for example, including a pulse motor and moves the focus lens 31 in the optical axis direction. Thereby, the focus position of light towards the fundus Ef is changed.
  • the reference driver 130 is configured, for example, including a pulse motor and integrally moves the collimator lens 113 as well as the reference mirror 114 along the travelling direction of the reference light LR.
  • the main controller 211 executes a process of writing data into the storage 212, and a process of reading out the data from the storage 212.
  • the storage 212 stores various kinds of data.
  • the data stored in the storage 212 is, for example, image data of OCT images, image data of fundus images, and eye information.
  • the eye information includes information on the eye, for example, information on a subject such as a patient ID and a name, information on identification of left eye or right eye, and so on.
  • An image forming part 220 forms image data of a tomographic image of the fundus Ef based on the detection signals from the CCD image sensor 120.
  • this process includes processes such as noise elimination (noise reduction), filtering, and FFT (Fast Fourier Transform).
  • the image forming part 220 includes, for example, the aforementioned circuit board and communication interface. It should be noted that "image data" and the "image" presented based on the image data may be identified with each other in this specification.
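For orientation, the noise elimination, filtering, and FFT steps named above might be sketched as follows in Python/NumPy; the background subtraction and the window function are assumptions of this illustration, not the device's documented processing.

```python
import numpy as np

def form_a_scan(detection_signal, background_spectrum):
    """One spectral detection signal -> one 1-dimensional depth profile."""
    s = np.asarray(detection_signal, dtype=float) - background_spectrum  # noise elimination
    s = s * np.hanning(s.size)            # filtering: window the spectrum
    profile = np.abs(np.fft.ifft(s))      # FFT: spectrum -> depth-resolved signal
    return profile[: s.size // 2]         # keep the single-sided depth range

def form_tomogram(detection_signals, background_spectrum):
    """Arrange the A-scans of successive scanning points into a tomogram."""
    return np.stack(
        [form_a_scan(s, background_spectrum) for s in detection_signals], axis=1)
```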
  • An image processor 230 executes various image processing and analysis on images formed by the image forming part 220.
  • the image processor 230 executes various correction processes such as luminance correction and dispersion correction of images.
  • the image processor 230 executes, for example, an interpolation process of interpolating pixels between tomographic images formed by the image forming part 220, thereby forming image data of a three-dimensional image of the fundus Ef.
  • Image data of a three-dimensional image refers to image data in which the positions of pixels are defined by three-dimensional coordinates.
  • the image data of a three-dimensional image is, for example, image data composed of three-dimensionally arranged voxels. This image data is referred to as volume data, voxel data, or the like.
  • the image processor 230 executes a rendering process (such as volume rendering and MIP (Maximum Intensity Projection)) on this volume data, and forms image data of a pseudo three-dimensional image taken from a specific view direction. On a display device such as the display 240, this pseudo three-dimensional image is displayed.
  • stack data of a plurality of tomographic images is image data obtained by three-dimensionally arranging a plurality of tomographic images obtained along a plurality of scanning lines, based on the positional relation of the scanning lines. That is to say, stack data is image data obtained by expressing a plurality of tomographic images defined by originally individual two-dimensional coordinate systems by a three-dimensional coordinate system (namely, embedding into a three-dimensional space).
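A compact editorial illustration of stack data and of one of the rendering processes mentioned above (MIP), under the assumption that all tomograms share the same pixel dimensions:

```python
import numpy as np

def build_stack_data(tomograms):
    """Embed individually 2-dimensional tomograms into one 3-dimensional
    coordinate system: axis 0 = scanning line (y), axis 1 = depth (z),
    axis 2 = position along the line (x)."""
    return np.stack([np.asarray(g, dtype=float) for g in tomograms], axis=0)

def max_intensity_projection(volume):
    """MIP rendering: keep the brightest voxel along the depth direction,
    yielding a pseudo en-face view of the fundus."""
    return volume.max(axis=1)
```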
  • the image processor 230 includes, for example, the aforementioned microprocessor, RAM, ROM, hard disk drive, circuit board, and so on.
  • the image processor 230 has an x-correction part 231, y-correction part 232, and z-correction part 233.
  • the x-correction part 231, y-correction part 232, and z-correction part 233 respectively perform positional corrections of a 3-dimensional image in the x-direction (horizontally), y-direction (vertically), and z-direction (depthwise).
  • the x-direction and the y-direction are directions along the surface of the fundus Ef (fundus surface direction).
  • the z-direction is a direction depthwise of the fundus Ef (fundus depth direction).
  • These correction parts 231 to 233 are one example of the "correction part" in the present embodiment. Hereinafter, processes executed by these correction parts 231 to 233 are explained.
  • the x-correction part 231 corrects the position in the x-direction of a plurality of tomographic images captured by the three-dimensional scanning described below, and thereby corrects the position in the x-direction of a 3-dimensional image based on these tomographic images.
  • In three-dimensional scanning, scanning is performed with the signal light LS along a plurality of scanning lines arranged in the y-direction. Each scanning line includes a plurality of scanning points linearly arranged in the x-direction.
  • the frame rate is set so as to capture still images (frames) corresponding to scanning along each scanning line. Consequently, it becomes possible to associate a still image with each scanning line (each tomographic image).
  • Fig. 4 represents the arrangement of tomographic images Gi when the fundus Ef is seen from the side of the fundus observation apparatus 1. It should be noted that, in a state in which eye movement or the like does not occur, the tomographic images Gi are arranged at equal intervals within a scanning region R, without misalignment in the x-direction (the direction along each scanning line Ri).
  • the observation image K of the fundus Ef is simultaneously captured and still images (frames) corresponding to each scanning line Ri (each tomographic image Gi) are obtained.
  • the x-correction part 231 analyzes the pixel values (luminance values) of each still image and specifies an image region of a characteristic site of the fundus Ef in the still image. For example, optic papilla, macula, blood vessels, branching parts of blood vessels, lesions, etc. are cited as characteristic sites.
  • the x-correction part 231 calculates the positional misalignment amount of the above image region in these still images. This process is, for example, to calculate the displacement of the above image region in the still images corresponding to each of the other tomographic images G2 to Gm with respect to the above image region in the still image corresponding to the first tomographic image G1 (standard still image).
  • the displacement calculated herein is the displacement in the x-direction and the displacement in the y-direction.
  • the x-correction part 231 corrects the relative position in the x-direction of the plurality of tomographic images Gi so as to cancel the calculated positional misalignment amount (displacement). Thereby, the position in the x-direction of a 3-dimensional image based on the plurality of tomographic images Gi is corrected.
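One plausible stand-in for the displacement calculation above is FFT-based cross-correlation of each still image against the standard still image. This is an editorial sketch; the patent instead localizes a characteristic site such as the optic papilla, macula, or a vessel branch.

```python
import numpy as np

def displacement_from_standard(standard_frame, frame):
    """Integer (dy, dx) shift of `frame` relative to the standard still image,
    estimated by cross-correlation computed via FFT."""
    f = np.asarray(standard_frame, dtype=float)
    g = np.asarray(frame, dtype=float)
    corr = np.fft.ifft2(np.fft.fft2(f - f.mean()) * np.conj(np.fft.fft2(g - g.mean())))
    dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    dy = dy - f.shape[0] if dy > f.shape[0] // 2 else dy  # wrap to signed shifts
    dx = dx - f.shape[1] if dx > f.shape[1] // 2 else dx
    # Cancelling the misalignment means shifting tomogram Gi by (-dx) in x
    # and treating (-dy) as the correction of its slice position in y.
    return dy, dx
```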
  • the x-correction part 231 deletes a part (end part region) of each tomographic image Gi that is included in the end part regions Ra and Rb of a scanning region R. Consequently, it becomes possible to obtain a 3-dimensional image of the center portion (image region) Rc of the scanning region R.
  • the y-correction part 232 corrects the relative position in the y-direction of the plurality of tomographic images Gi so as to cancel the above positional misalignment amount (displacement) calculated based on still images. Thereby, the position of a 3-dimensional image in the y-direction based on the plurality of tomographic images Gi is corrected. It should be noted that the calculation process of the positional misalignment amount may be conducted by the y-correction part 232.
  • the y-correction part 232 adjusts the intervals of the plurality of tomographic images Gi after the relative position is corrected as described above.
  • As this process, there is a process (supplementary process) of filling in (supplementing) portions where tomographic images are sparse, and a process (thinning process) of thinning out portions where tomographic images are dense.
  • the supplementary process is performed, for example, as below.
  • First, the y-correction part 232 calculates the intervals of the position-corrected tomographic images Gi and determines whether or not each calculated interval is equal to or greater than a prescribed value.
  • the prescribed value is set, for example, based on the size of the scanning region R and the number of scanning lines Ri. It should be noted that if the density of the tomographic images Gi is obtained instead, the determination is whether or not the density is equal to or less than a prescribed value.
  • the controller 210 controls the scan driver 70 and rescans with a signal light LS along the scanning lines located within a region sandwiched by two tomographic images with an interval that is equal to or greater than the prescribed value.
  • the image forming part 220 forms a new tomographic image based on the detection results of interference light of the rescanned signal light LS and reference light LR, and the image processor 230 forms a 3-dimensional image corresponding to the above region based on these new tomographic images.
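A small sketch of the supplementary decision above; the gap factor standing in for the "prescribed value" derived from the scanning-region size and line count is an assumption:

```python
import numpy as np

def gaps_to_rescan(y_corrected, region_height, n_scanning_lines, factor=1.5):
    """Return (y_low, y_high) regions whose corrected tomogram interval is at
    least `factor` times the nominal scan pitch; scanning lines inside these
    regions would be rescanned."""
    nominal_pitch = region_height / (n_scanning_lines - 1)
    y = np.sort(np.asarray(y_corrected, dtype=float))
    wide = np.where(np.diff(y) >= factor * nominal_pitch)[0]
    return [(y[i], y[i + 1]) for i in wide]
```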
  • the y-correction part 232 may perform the following process. First, the y-correction part 232 determines whether each positional misalignment amount calculated based on a plurality of still images is equal to or greater than a prescribed value or not.
  • the controller 210 controls the scan driver 70 and rescans with the signal light LS along a scanning line located at a region close to the scanning line of a tomographic image corresponding to a still image whose positional misalignment amount is determined to be equal to or greater than the prescribed value.
  • the image forming part 220 forms a new tomographic image along the rescanned scanning line, based on the detection results of interference light of the rescanned signal light LS and reference light LR.
  • the y-correction part 232 may also perform the following process. First, for each of a plurality of scanning lines, the y-correction part 232 selects a tomographic image closest to the original position of the scanning line among the plurality of tomographic images Gi, based on the calculated positional misalignment amount based on the plurality of still images.
  • the original position of a scanning line is represented by a coordinate value of a scanning line set within the scanning region R. This coordinate value (particularly a y-coordinate value) is easily obtained based on the size of the scanning region R and the number of scanning lines.
  • the y-correction part 232 selects the tomographic image located closest to this coordinate position.
  • the image processor 230 forms a 3-dimensional image based only on the selected tomographic images.
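In simplified form, the selection described above reduces to a nearest-position lookup; the inputs (original line coordinates and corrected tomogram positions) are assumed to be available:

```python
import numpy as np

def select_closest_tomograms(original_line_y, corrected_tomogram_y):
    """For each scanning line's original y-coordinate, return the index of the
    tomogram whose corrected position lies closest to it."""
    orig = np.asarray(original_line_y, dtype=float)[:, None]
    meas = np.asarray(corrected_tomogram_y, dtype=float)[None, :]
    return np.argmin(np.abs(orig - meas), axis=1)  # one tomogram index per line
```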
  • the y-correction part 232 may also perform a process such as follows. After the relative position of the plurality of tomographic images Gi is corrected, the y-correction part 232 calculates the interval of these tomographic images Gi.
  • the image processor 230 forms, based on the calculated intervals as well as these tomographic images Gi, a plurality of tomographic images arranged at equal intervals. In this process, the pixel value at positions arranged at equal intervals in the y-direction is calculated, for example, by performing a linear interpolation process based on the pixel values (luminance values) at scanning points arranged in the y-direction. A plurality of tomographic images arranged at equal intervals is obtained by forming images using a calculated pixel value. Furthermore, the image processor 230 forms a 3-dimensional image based on these tomographic images arranged at equal intervals.
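The equal-interval formation above can be sketched as per-pixel linear interpolation of luminance values along the y-direction (array shapes are assumptions of this illustration):

```python
import numpy as np

def resample_to_equal_intervals(stack, y_measured, n_out):
    """stack: (n_tomograms, depth, width) position-corrected tomograms located
    at the unequal y positions `y_measured`; returns n_out tomograms on an
    equally spaced y grid, interpolating each pixel's luminance linearly."""
    stack = np.asarray(stack, dtype=float)
    y_measured = np.asarray(y_measured, dtype=float)
    y_out = np.linspace(y_measured[0], y_measured[-1], n_out)
    n, depth, width = stack.shape
    flat = stack.reshape(n, depth * width)
    out = np.empty((n_out, depth * width))
    for col in range(depth * width):
        out[:, col] = np.interp(y_out, y_measured, flat[:, col])
    return out.reshape(n_out, depth, width)
```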
  • When there is a still image in which the image region of the characteristic site is not specified, the y-correction part 232 specifies the scanning line of the tomographic image corresponding to that still image.
  • the still images and the tomographic images are associated as described previously, and the tomographic images and the scanning lines are associated in a one-to-one correspondence; hence, this process can be carried out easily.
  • the controller 210 controls the scan driver 70 and rescans with a signal light LS along the specified scanning line. Then, the observation image K is also captured.
  • the image forming part 220 forms a new tomographic image along the specified scanning line, based on the detection results of interference light of the rescanned signal light LS and reference light LR.
  • the x-correction part 231 and the y-correction part 232 are capable of performing the above correction process, based on the new tomographic image and the new observation image K.
  • the image processor 230 is capable of forming a 3-dimensional image of a region corresponding to the specified scanning line, based on the new tomographic image.
  • the z-correction part 233 corrects a position in the z-direction of a 3-dimensional image (plurality of tomographic images Gi).
  • scanning is performed separately from three-dimensional scanning.
  • the separate scanning consists of scanning in the direction crossing the plurality of scanning lines Ri.
  • scanning with the signal light LS is performed along each of a prescribed number of scanning lines (scanning lines for correction) orthogonal to the plurality of scanning lines Ri.
  • the image forming part 220 forms a tomographic image (tomographic image for correction) corresponding to each scanning line for correction, based on the detection results of interference light LC obtained by the separate scanning.
  • the z-correction part 233 specifies an image region of a characteristic layer of the fundus Ef in a prescribed number of formed tomographic images for correction.
  • As the characteristic layer, it is desirable to select a site that can be easily specified in a tomographic image, for example, a site (organ) clearly depicted with high luminance.
  • the z-correction part 233 moves each tomographic image Gi in the fundus depth direction (z-direction) so as to match the depthwise position (z-coordinate value) of the image region in a tomographic image for correction and the depthwise position of the image region of the characteristic layer in each tomographic image Gi.
  • Thereby, the positional correction of a 3-dimensional image in the fundus depth direction becomes possible.
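A reduced sketch of this depthwise matching, using the brightest pixel of each A-line as a stand-in for the characteristic layer and assuming, for illustration, that column i of the tomographic image for correction crosses tomographic image Gi:

```python
import numpy as np

def depthwise_offsets(correction_tomogram, tomograms, crossing_columns):
    """Return, per tomogram Gi, the z shift that aligns its characteristic
    layer with the depth observed in the tomographic image for correction."""
    offsets = []
    for i, g in enumerate(tomograms):
        z_ref = int(np.argmax(correction_tomogram[:, i]))   # layer depth, correction scan
        z_gi = int(np.argmax(np.asarray(g)[:, crossing_columns[i]]))  # layer depth in Gi
        offsets.append(z_ref - z_gi)
    return offsets  # apply with e.g. np.roll(Gi, offset, axis=0)
```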
  • the image forming part 220 and the image processor 230 are an example of the "image forming part" of the present embodiment.
  • the display 240 is configured including a display device of the aforementioned arithmetic and control unit 200.
  • the operation part 250 is configured including an operation device of the aforementioned arithmetic and control unit 200.
  • the operation part 250 may also include various kinds of buttons or keys provided on the case of the fundus observation apparatus 1 or on its outside.
  • the display 240 may also include various display devices, such as a touch panel monitor, provided on the case of the retinal camera unit 2.
  • the display 240 and the operation part 250 do not need to be composed as separate devices.
  • a device in which the display function and the operation function are integrated can be used.
  • the scan aspect of the signal light LS by the fundus observation apparatus 1 is, for example, a horizontal scan, vertical scan, cruciform scan, radial scan, circular scan, concentric scan, and helical scan. These scan aspects are selectively used as necessary in consideration of an observation site of the fundus, an analysis target (the retinal thickness or the like), a time required to scan, the accuracy of a scan, and so on.
  • a horizontal scan is a scan with the signal light LS in the horizontal direction (x-direction).
  • the horizontal scan includes an aspect of scanning with the signal light LS along a plurality of scanning lines extending in the horizontal direction arranged in the vertical direction (y-direction). In this aspect, it is possible to set any interval between scanning lines. By setting the interval between adjacent scanning lines to be sufficiently narrow, it is possible to form the aforementioned three-dimensional image (three-dimensional scan).
  • a vertical scan is also performed in a similar manner.
  • a cruciform scan is a scan with the signal light LS along a cross-shape trajectory formed by two linear trajectories (line trajectories) orthogonal to each other.
  • a radial scan is a scan with the signal light LS along a radial trajectory formed by a plurality of line trajectories arranged at predetermined angles.
  • the cruciform scan is an example of the radial scan.
  • a circular scan is a scan with the signal light LS along a circular trajectory.
  • a concentric scan is a scan with the signal light LS along a plurality of circular trajectories arranged concentrically around a predetermined center position. The circular scan is regarded as a special example of the concentric scan.
  • a helical scan is a scan with the signal light LS along a helical trajectory while making the turning radius gradually smaller (or greater).
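For concreteness (purely illustrative geometry, not taken from the patent), several of the scan aspects listed above are easy to express as xy trajectories that could be handed to the two Galvano mirrors:

```python
import numpy as np

def radial_scan(n_lines, n_points, radius):
    """n_lines line trajectories through the center at equal angular steps;
    n_lines == 2 yields the cruciform scan."""
    t = np.linspace(-radius, radius, n_points)
    return [np.column_stack((t * np.cos(a), t * np.sin(a)))
            for a in np.linspace(0.0, np.pi, n_lines, endpoint=False)]

def helical_scan(n_points, max_radius, turns=8):
    """One helical trajectory whose turning radius gradually shrinks."""
    phi = np.linspace(0.0, 2 * np.pi * turns, n_points)
    r = np.linspace(max_radius, 0.0, n_points)
    return np.column_stack((r * np.cos(phi), r * np.sin(phi)))
```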
  • the Galvano mirrors 43 and 44 are capable of scanning with the signal light LS in the x-direction and the y-direction independently, and are therefore capable of scanning with the signal light LS along an arbitrary trajectory on the xy-plane.
  • a region on the fundus Ef subjected to scanning by the signal light LS as above is referred to as a scanning region as previously described.
  • a scanning region in three-dimensional scanning is a rectangular-shaped region in which a plurality of horizontal scans are arranged (refer to the scanning region R of Fig. 4 ).
  • a scanning region in a concentric circular scan is a disc-shaped region surrounded by the trajectories of a circular scan of a maximum diameter.
  • the scanning region in a radial scan is a disc-shaped (or polygonal-shaped) region linking end positions of scanning lines.
  • According to the fundus observation apparatus 1 described above, the position in the x-direction and the y-direction of the tomographic images Gi may be corrected based on the observation image K.
  • Moreover, with the fundus observation apparatus 1, with regard to a region where the tomographic images Gi (scanning lines Ri) are sparse, scanning may be performed again to complement a tomographic image.
  • new tomographic images Jk are obtained along a scanning line Rk in a sparse region Rd and a 3-dimensional image in the sparse region Rd may be formed based on these new tomographic images Jk.
  • tomographic images Gi may be thinned with regard to a portion where the tomographic images Gi are dense.
  • a plurality of tomographic images arranged at favorable intervals may be obtained and a favorable 3-dimensional image may be obtained via such a complement or thinning.
  • the position of a 3-dimensional image in the fundus depth direction may be corrected, based on tomographic images (tomographic images for correction) based on the detection results of interference light LC of signal light LS scanned separately from the three-dimensional scanning and reference light LR.
  • Intervals of the plurality of tomographic images Gi may be calculated after the relative position is corrected, and a plurality of tomographic images arranged at equal intervals may be formed based on the calculated intervals and the plurality of tomographic images Gi; thereby making it possible to form a 3-dimensional image based on these equal-interval tomographic images.
  • an image region of a characteristic site of the fundus Ef may be specified in each still image constituting the observation image K in order to calculate the positional misalignment amount of these image regions. If the positional misalignment amount is determined to be equal to or greater than a prescribed value, a new tomographic image may be formed by rescanning with the signal light LS along a scanning line located in a region close to the scanning line of the tomographic image corresponding to that still image; thereby, making it possible to form a 3-dimensional image corresponding to the above close region based on the new tomographic image.
  • in the fundus observation apparatus 1, with regard to each scanning line Ri, the tomographic image closest to the original position of the scanning line Ri is selected from among the plurality of tomographic images Gi, based on the above calculated positional misalignment amount; thereby, making it possible to form a 3-dimensional image based on the selected tomographic images.
  • in the fundus observation apparatus 1, when there exists a still image in which an image region of a characteristic site is not specified, the scanning line of the tomographic image corresponding to that still image is specified, and a new tomographic image is formed by rescanning with the signal light LS along the specified scanning line; thereby, making it possible to form a 3-dimensional image of a region corresponding to the scanning line based on the new tomographic image.
  • with the fundus observation apparatus 1 acting as described, even if the eye E moves or blinks during scanning with the signal light LS, a highly accurate 3-dimensional image may be obtained.
  • each scanning line consists of a plurality of scanning points.
  • the second embodiment describes a technology for obtaining the positional misalignment amount for one or more scanning point(s) as a unit.
  • the obtained positional misalignment amount may be used for correcting positional misalignment as in the first embodiment in addition to being used for other purposes.
  • the second embodiment describes an application to a technology for forming a highly precise image by superposing two or more images obtained by scanning the same site of a fundus.
  • the fundus observation apparatus in the present embodiment carries out measurements that are the same as in the first embodiment and forms, at each scanning point, a 1-dimensional image extending in the depth direction of the fundus.
  • This 1-dimensional image is hereinafter referred to as an A-scan image.
  • a tomographic image is formed by arranging a plurality of A-scan images according to the arrangement of a plurality of scanning points.
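Pictured as arrays, this arrangement is simply a column-wise stacking of depth profiles. A minimal sketch, assuming every A-scan image is a 1-dimensional array of the same length:

```python
import numpy as np

def build_tomographic_image(a_scans):
    """Arrange A-scan images (1-D depth profiles) side by side according to
    the arrangement of scanning points: result is (depth, scan position)."""
    return np.stack(a_scans, axis=1)

# e.g. 256 scanning points with 512 depth samples each
b_scan = build_tomographic_image([np.zeros(512) for _ in range(256)])
assert b_scan.shape == (512, 256)
```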
  • the fundus observation apparatus detects the position of a fundus at a prescribed time interval when scanning with signal light is executed, and calculates a positional misalignment amount of the plurality of A-scan images in the fundus surface direction (xy-direction) based on temporal changes in the detected position of the fundus.
  • Signal light LS is irradiated toward each scanning point Rij.
  • if the eye E moves during measurement, as shown in Fig. 7B, the actual irradiation position Tij of the signal light LS ends up being shifted from the original position of the scanning point Rij.
  • consequently, the position of the A-scan image that should depict the position of the fundus Ef corresponding to the scanning point Rij ends up being shifted (that is, an A-scan image depicting the position of the fundus Ef corresponding to the actual irradiation position Tij ends up being obtained).
  • the amount of positional misalignment (positional misalignment amount) of such an A-scan image is obtained.
  • the positional misalignment amount of each A-scan image may be obtained, or the positional misalignment amount of a continuous prescribed number of A-scan images may also be obtained all together.
  • the first embodiment is one example of the latter process, in which the positional misalignment amounts of the n A-scan images on each scanning line Ri are obtained all together.
  • the positional misalignment amount of the A-scan images is a vector quantity. That is, the positional misalignment amount includes information representing the displacement direction of the actual irradiation position Tij with respect to the scanning point Rij (misalignment direction information) and information representing the displacement amount (misalignment amount information).
  • the fundus observation apparatus has the following configuration in order to realize a process as above.
  • the fundus observation apparatus has a hardware configuration that is the same as that in the first embodiment. That is, the fundus observation apparatus is of the configuration shown in Fig. 1 and Fig. 2 .
  • these figures are appropriately used as a reference for the explanation.
  • the configuration of the control system of the fundus observation apparatus is described.
  • the control system of the fundus observation apparatus has parts in common with the first embodiment (ref. Fig.3 ).
  • One example of the configuration of the control system of the fundus observation apparatus is shown in Fig. 8 . It should be noted that among the component elements shown in Fig. 8 , the same symbols are given to those that are common in the first embodiment.
  • the image processor 230 has a characteristic site-specification part 261, calculation part 262, scanning point-specification part 265, and correction part 266.
  • This fundus observation apparatus forms an observation image K (moving image) of the fundus Ef using the observation light source 11, the CCD image sensor 35 and so on.
  • the observation image K is obtained by imaging the fundus Ef at a prescribed frame rate.
  • the reciprocal of the frame rate corresponds to the "prescribed time interval" in the present embodiment.
  • the fundus observation apparatus forms the observation image K by imaging the fundus Ef when the scanning with the signal light LS is executed.
  • the configuration for forming the observation image K (the illumination optical system 10 and the imaging optical system 30) is one example of the "imaging part" in the present embodiment.
  • the characteristic site-specification part 261 analyzes each still image constituting the observation image K and specifies an image region of a characteristic site of the fundus Ef. This process is described in the first embodiment.
  • the characteristic site-specification part 261 is one example of the "image region-specifying part" in the present embodiment.
  • the characteristic site-specification part 261 obtains the position of an image region of a characteristic site in each still image as a position of the fundus Ef. That is, a two-dimensional coordinate system is preliminarily defined for each still image and the characteristic site-specification part 261 recognizes a coordinate value of the image region in the two-dimensional coordinate system as the position of the fundus Ef.
  • as a coordinate value of the image region, for example, a coordinate value of a characteristic point (center point, center of gravity, etc.) within the image region may be used.
  • the two-dimensional coordinate system and the xy-coordinate system are associated such that mutual coordinate conversion is possible.
  • the xy-coordinate system itself may be used as the two-dimensional coordinate system.
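One way to realize this position measurement is to take the center of gravity of the specified image region in each frame and map it into the xy-coordinate system; the sketch below assumes the region is given as a boolean mask and that the frame-to-xy conversion is a known affine transform (both are illustrative assumptions):

```python
import numpy as np

def fundus_position(region_mask, frame_to_xy):
    """Position of the fundus Ef as the center of gravity of the
    characteristic-site image region, converted to xy coordinates.
    region_mask: 2-D boolean array; frame_to_xy: 2x3 affine matrix."""
    rows, cols = np.nonzero(region_mask)
    centroid = np.array([cols.mean(), rows.mean(), 1.0])  # (u, v, 1)
    return frame_to_xy @ centroid  # (x, y)
```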
  • the imaging part and the image region-specifying part constitute one example of the "detection part" in the present embodiment.
  • the calculation part 262 calculates the positional misalignment amount of a plurality of A-scan images in the fundus surface direction, based on temporal changes in the position of the fundus Ef obtained by the characteristic site-specification part 261.
  • the calculation part 262 constitutes one example of the "calculation part" along with the correction part 266 in the present embodiment.
  • the calculation part 262 has a position specification part 263 and a positional misalignment-calculation part 264.
  • the scan time interval means the time interval from the time when the signal light LS is irradiated to one scanning point Rij until the signal light LS is irradiated to the next scanning point Ri(j+1).
  • a time interval (scanning line switching time) from the time when the signal light LS is irradiated to a final scanning point Rin of one scanning line Ri until the signal light LS is irradiated to the first scanning point R(i+1)1 of the next scanning line R(i+1) may be the same as the above scan time interval or it may also be different. If different, the position detecting interval may be controlled along with the scanning line switching time interval. Furthermore, instead of controlling the position detecting interval, the scanning line switching time interval may be set to a value that is an integral multiple of the scan time interval.
  • a position detecting interval is set to an integral multiple (Q ≥ 1) of the scan time interval. That is, the fundus observation apparatus detects the position of the fundus Ef each time Q scanning points are scanned while irradiating the signal light LS sequentially to the plurality of scanning points Rij.
  • the fundus observation apparatus detects the position of the fundus Ef every Q number of scanning points. Such action is realized by synchronizing control of the accumulated time of the CCD image sensor 35 and the control of the scan driver 70.
  • the calculation part 262 divides a plurality of A-scan images that have been sequentially formed into groups of A-scan images, wherein each group includes Q number of A-scan images.
  • the "division" may actually be dividing the plurality of A-scan images for every Q number of the same (for example, each group of A-scan images are stored separately) or each group of A-scan images are made identifiable by adding an identification information, etc. In any case, it is sufficient as long as processes are executable with the individual group of A-scan images in the following processes.
  • the position specification part 263 specifies the position of each group of A-scan images, based on the detection result of the position of the fundus Ef obtained while the Q scanning points corresponding to that group are being scanned. This process is described in further detail. That is, as described previously, because a group of scanning points and the detection result of the position of the fundus Ef are associated with a group of A-scan images, the position specification part 263 refers to this association and specifies the detection result of the position of the fundus Ef corresponding to each group of A-scan images, which is regarded as the position of that group of A-scan images. This process corresponds to specifying the actual irradiation position Tij shown in Fig. 7B.
  • the positional misalignment-calculation part 264 calculates the positional misalignment amount based on the position of each group of A-scan images specified by the position specification part 263.
  • the positional misalignment amount may be accurately obtained for a certain A-scan image included in a group of A-scan images (that is, the A-scan image corresponding to the scanning point scanned at the moment when the position of the fundus Ef is detected), but for the other A-scan images, there will generally be some margin of error.
  • the positional misalignment-calculation part 264 stores the positional information of each scanning point Rij (scanning point positional information) according to a preliminarily set scanning mode.
  • the scanning point positional information is represented by, for example, a coordinate value that is defined by the aforementioned xy-coordinate system.
  • the scanning point positional information may also be represented by a coordinate value defined by a 2-dimensional coordinate system in which one of a plurality of scanning points Rij (for example, the first scanning point R11) is the origin.
  • together with an xy-coordinate value of one of the plurality of scanning points Rij (for example, the first scanning point R11), the interval between adjacent scanning points may also be stored.
  • the length of each scanning line may also be stored.
  • the form of the scanning point positional information may be arbitrary as long as the position of each scanning point is uniquely defined.
  • the positional misalignment-calculation part 264 first acquires a corresponding position (that is, an original position of each A-scan image) of each scanning point Rij with regard to each group of A-scan images from the scanning point positional information. Next, with regard to each group of A-scan images, the positional misalignment-calculation part 264 compares, for each scanning point Rij, the position of the acquired scanning point Rij and the actual irradiation position Tij specified by the position specification part 263. Thereby, the positional misalignment amount of the irradiation position Tij with respect to the position of the scanning point Rij is obtained.
  • the correction part 266 corrects the position of the A-scan image in the fundus surface direction, based on the positional misalignment amount calculated by the positional misalignment-calculation part 264.
  • the correction part 266 is one example of the "first correction part" in the present embodiment.
  • the positional misalignment amount in the fundus surface direction obtained by the positional misalignment-calculation part 264 corresponds to the positional misalignment amount of the irradiation position Tij with respect to the position of the scanning point Rij.
  • the correction part 266 corrects the position of the A-scan image so as to cancel the corresponding positional misalignment amount, that is, so as to move the actual irradiation position Tij to the original position of the scanning point Rij. Consequently, an actually obtained A-scan image may be arranged in the original position (the position of the scanning point Rij). This is the end of the description regarding the calculation process of the positional misalignment amount in the fundus surface direction.
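Taken together, the surface-direction calculation and correction amount to a vector subtraction followed by a compensating shift. A minimal sketch under that reading (positions as 2-D xy vectors; the names are illustrative):

```python
import numpy as np

def misalignment(scan_point_xy, irradiation_xy):
    """Positional misalignment amount of the actual irradiation position Tij
    with respect to the scanning point Rij: a vector carrying both the
    misalignment direction and the misalignment amount."""
    return np.asarray(irradiation_xy, dtype=float) - np.asarray(scan_point_xy, dtype=float)

def correct_position(a_scan_xy, delta):
    """Shift the A-scan image position so the misalignment is cancelled,
    i.e. move the actual irradiation position back onto the scanning point."""
    return np.asarray(a_scan_xy, dtype=float) - delta
```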
  • the calculation part 262 calculates the depthwise positional misalignment amount of the plurality of A-scan images, based on a group of 1-dimensional images (a separate group of A-scan images) formed from the detection results of interference light LC of the reference light LR and signal light LS scanned separately from the above scanning (the scanning for obtaining the plurality of A-scan images, referred to as the main scanning).
  • a separate group of A-scan images comprises a prescribed number of A-scan images arranged in the direction of the above separate scanning.
  • the direction of the separate scanning is different from that of the main scanning. That is, a scanning line linking the prescribed number of scanning points in the separate scanning is presumed to cross each scanning line in the main scanning.
  • prior to the calculation process of the positional misalignment amount in the fundus depth direction, the fundus observation apparatus forms the separate groups of A-scan images by executing the above separate scanning and, furthermore, forms a tomographic image (standard tomographic image) corresponding to the scanning line of the separate scanning, based on these separate groups of A-scan images.
  • the calculation part 262 specifies an image region of a characteristic layer of the fundus Ef in the standard tomographic image, and specifies an image region of the characteristic layer in the tomographic image that is obtained in the main scanning.
  • the calculation part 262 calculates the depthwise displacement between the image region specified from the standard tomographic image and the image region specified from the tomographic image of the main scanning. Furthermore, as in the first embodiment, the calculation part 262 (positional misalignment-calculation part 264) calculates the depthwise positional misalignment amount of the A-scan image obtained in the main scanning, based on the calculated displacement.
  • the correction part 266 corrects the depthwise position of the A-scan image obtained in the main scanning, based on the depthwise positional misalignment amount calculated by the calculation part 262. This process is executed by moving, in the depthwise direction, the depthwise position of the A-scan image obtained from the main scanning so as to cancel the positional misalignment amount.
  • the correction part 266 is one example of the "second correction part" in the present embodiment. This is the end of the description of the calculation process of the depthwise positional misalignment.
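The depthwise step can be pictured as estimating a z-shift of the characteristic layer between the standard tomographic image and the main-scan image, then translating each A-scan image to cancel it. A sketch under those assumptions (layer depths given in depth samples; edge handling simplified):

```python
import numpy as np

def depth_shift(layer_depth_standard, layer_depth_main):
    """Depthwise displacement (in samples) of the characteristic layer in the
    main-scan tomographic image relative to the standard tomographic image."""
    return layer_depth_main - layer_depth_standard

def correct_depth(a_scan, shift):
    """Move the A-scan image in the depth direction to cancel the shift.
    np.roll wraps around at the edges; zero-padding would avoid that."""
    return np.roll(np.asarray(a_scan), -int(round(shift)))
```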
  • the scanning point-specification part 265 acts when there exists a still image (a frame of the observation image K) for which the target image region is not specified by the characteristic site-specification part 261.
  • the scanning point-specification part 265 specifies, for each still image for which the image region is not specified, the scanning point of the A-scan image corresponding to that still image. This process may be easily executed based on the aforementioned association of a group of A-scan images, a group of scanning points, and the detection results of the position of the fundus Ef, in addition to the association of the detection results of the position of the fundus Ef with a still image (note that the detection of the position is executed based on a still image).
  • the scanning point-specification part 265 is one example of the "scanning point-specifying part" in the present embodiment.
  • the main controller 211 controls the scan driver 70 and arranges the Galvano mirrors 43 and 44 in a position corresponding to the specified scanning point. Furthermore, the main controller 211 illuminates the observation light source 11 to capture the observation image K, and outputs low coherence light L0 by controlling the light source unit 101. Consequently, the signal light LS is irradiated to the specified scanning point. It should be noted that when a plurality of specified scanning points are present, these scanning points are sequentially scanned with the signal light LS.
  • the image forming part 220 receives, from the CCD image sensor 120, the detection results of interference light LC comprising the signal light LS and reference light LR, and forms a new A-scan image corresponding to the scanning point.
  • the image processor 230 executes the previously described process with respect to the new A-scan image and the corresponding still image (frame of the observation image K). Furthermore, the image forming part 220 is capable of forming a tomographic image of the fundus Ef, based on the new A-scan image and the A-scan images already obtained for the other scanning points.
  • the position of the fundus Ef may be detected at a prescribed time interval while scanning with the signal light LS the plurality of scanning points Rij, and the positional misalignment amount of a plurality of A-scan images in the fundus surface direction may be calculated based on temporal changes in the detected position of the fundus Ef. Furthermore, according to the fundus observation apparatus, it is possible to correct the position of the plurality of A-scan images based on the calculated positional misalignment amount.
  • because the positions may be corrected for each group comprising a prescribed number of A-scan images, more precise corrections than in the first embodiment, in which the correction is executed with each tomographic image as a unit, are possible.
  • in the fundus observation apparatus, when an image region of a characteristic site for obtaining the positional misalignment amount of an A-scan image is not specified, a new A-scan image may be formed by specifying the scanning point corresponding to that A-scan image and remeasuring it. Consequently, even in the event of failing to obtain the positional misalignment amount, reobtaining may be conducted automatically; hence, making it possible to obtain highly accurate OCT images. Specifically, because only the scanning point corresponding to the A-scan image for which the positional misalignment amount failed to be obtained is remeasured, the examination time as well as the load on the patient may be reduced.
  • the positional misalignment amount of the plurality of A-scan images in the depthwise direction of the fundus Ef may be calculated, based on a separate group of A-scan images formed from the detection results of interference light LC of the separately scanned signal light LS and reference light LR.
  • the depthwise position of the plurality of A-scan images may be corrected based on the positional misalignment amount.
  • when the position detecting interval is equal to or greater than twice the scan time interval (Q ≥ 2), the positional misalignment amount is obtained with respect to each group of Q A-scan images.
  • here, the process of estimating a plurality (less than Q) of positional misalignment amounts with respect to a group of Q A-scan images is described.
  • the positional misalignment-calculation part 264 obtains the detection result of the position of the fundus Ef while the Q scanning points corresponding to a first group of A-scan images are being scanned, and the detection result of the position of the fundus Ef while the Q scanning points corresponding to the next, second group of A-scan images are being scanned.
  • the positional misalignment-calculation part 264 estimates the positional misalignment amount of each A-scan image included in the first group of A-scan images and/or the second group of A-scan images based on these two detection results.
  • a specific example of this estimating process is described with reference to Fig. 9 .
  • in this example, Q = 3.
  • a first group of scanning points U1 corresponding to the first group of A-scan images includes three scanning points Ri1 to Ri3, and a second group of scanning points U2 corresponding to the second group of A-scan images includes three scanning points Ri4 to Ri6.
  • the position of the fundus Ef at the time when the first scanning points Ri1 and Ri4 are being scanned is detected in each group of scanning points U1 and U2. That is, the image processor 230 obtains the positional misalignment amount corresponding to each group of scanning points U1 and U2, based on still images (frames of the observation image K) imaged when the first scanning points Ri1 and Ri4 in each group of scanning points U1 and U2 are being scanned. This process is described previously.
  • suppose the positional misalignment amount corresponding to the first group of scanning points U1 is (Δx1, Δy1) and the positional misalignment amount corresponding to the second group of scanning points U2 is (Δx2, Δy2).
  • the positional misalignment-calculation part 264 presumes that the positional misalignment amounts of the first scanning points Ri1 and Ri4 are (Δx1, Δy1) and (Δx2, Δy2), respectively.
  • the positional misalignment-calculation part 264 estimates the positional misalignment amount of each of the scanning points Ri2 and Ri3 sandwiched between the two scanning points Ri1 and Ri4 in the following way, based on the positional misalignment amounts (Δx1, Δy1) and (Δx2, Δy2).
  • the scanning points Ri2 and Ri3 subject to the estimation are located at the points that internally divide the line linking the scanning points Ri1 and Ri4 in the ratios 1:2 and 2:1, respectively.
  • the positional misalignment-calculation part 264 calculates ((Δx2-Δx1)/3, (Δy2-Δy1)/3), and the calculation result is regarded as the positional misalignment amount corresponding to the scanning point Ri2.
  • likewise, (2×(Δx2-Δx1)/3, 2×(Δy2-Δy1)/3) is calculated by the positional misalignment-calculation part 264, and this is regarded as the positional misalignment amount corresponding to the scanning point Ri3. In this way, the positional misalignment amounts corresponding to each of the four scanning points Ri1 to Ri4 are obtained.
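The arithmetic of this worked example is straightforward to reproduce; a minimal sketch following the formulas above for Q = 3, with (Δx1, Δy1) and (Δx2, Δy2) the detected amounts at Ri1 and Ri4:

```python
def estimate_between(d1, d2, q=3):
    """Estimate the positional misalignment amounts for the scanning points
    lying between two detections, following the internal-division formulas
    above: the k-th intermediate point receives k/q of the change (k = 1, 2)."""
    (dx1, dy1), (dx2, dy2) = d1, d2
    return [(k * (dx2 - dx1) / q, k * (dy2 - dy1) / q) for k in range(1, q)]

d_ri2, d_ri3 = estimate_between((1.0, 0.0), (4.0, 3.0))
# d_ri2 == (1.0, 1.0), d_ri3 == (2.0, 2.0)
```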
  • the positional misalignment amount corresponding to each scanning point Ri5 and Ri6 is obtained in the same way with reference to the positional misalignment amount corresponding to the next group of scanning points following the second group of scanning points U2.
  • the positional misalignment-calculation part 264 sequentially obtains a positional misalignment amount corresponding to each scanning point Rij.
  • in the above, the first scanning points Ri1 and Ri4 are regarded as standard and the positional misalignment amounts corresponding to the scanning points Ri2 and Ri3 sandwiched between them are estimated; but even if other scanning points are regarded as standard, the positional misalignment amount corresponding to each scanning point sandwiched between the two standard scanning points may likewise be estimated.
  • if the scanning points Ri2 and Ri5 in the middle are regarded as standard, it is possible to estimate the positional misalignment amounts corresponding to the scanning point Ri3 in the first group of scanning points U1 and the scanning point Ri4 in the second group of scanning points U2.
  • if the last scanning points Ri3 and Ri6 are regarded as standard, it is possible to estimate the positional misalignment amounts corresponding to the scanning points Ri4 and Ri5 in the second group of scanning points U2.
  • the correction part 266 may correct the position of each of a plurality of A-scan images, based on the obtained positional misalignment amount.
  • the positional misalignment amount for each A-scan image may be obtained while detecting the position of the fundus Ef for every Q number of scanning points. Therefore, even if there are restrictions on the detection interval of the position of the fundus Ef, the positional misalignment amount of each A-scan image may be obtained. Furthermore, there is an advantage in making it possible to set a short scan time interval.
  • in the above embodiment, the position of an already formed A-scan image is corrected, based on the positional misalignment amount of that A-scan image.
  • in this modification example, an embodiment for controlling scanning with the signal light LS in real time, based on the positional misalignment amount of an A-scan image, is described.
  • the image processor 230 sequentially calculates the positional misalignment amount based on the position of a fundus Ef that is sequentially detected at a prescribed time interval.
  • the detection of the position of the fundus Ef may be carried out in the same way as in the above embodiment.
  • the process of calculating each positional misalignment amount is also executed as in the above embodiment.
  • the main controller 211 controls the scan driver 70 based on the positional misalignment amount that is sequentially calculated and corrects the irradiation position of the signal light LS with respect to the fundus Ef.
  • the main controller 211 is one example of the "controlling part" in the present embodiment.
  • the correction process of the irradiation position of the signal light LS is described in further detail.
  • the position of the Galvano mirrors 43 and 44 (mirror position) with respect to each scanning point Rij is preliminarily set based on a scanning mode that is executed.
  • the main controller 211 controls the scan driver 70 and sequentially moves the Galvano mirrors 43 and 44 to each mirror position following the scanning sequence of the scanning point Rij.
  • if the eye E moves during scanning, the actual irradiation position is displaced from the scanning point Rij; in the above embodiment, the positional misalignment caused in this way is amended by correcting the position of an already obtained A-scan image.
  • in this modification example, by contrast, the irradiation position of the signal light LS is corrected in real time according to the calculated positional misalignment amount. That is, the main controller 211 controls the scan driver 70 so as to irradiate the signal light LS to a position displaced by the relevant positional misalignment amount from the original position of the next scanning point Rij.
  • the irradiation position of the signal light LS may follow the movement of the eye E (fundus Ef); thereby, making it possible to obtain highly accurate OCT images even if the eye E moves during scanning with the signal light LS.
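A rough sketch of such a real-time loop is shown below; `detect_misalignment` and `drive_galvos` are hypothetical stand-ins for the detection part and the scan driver 70, and the loop simply offsets each commanded position by the latest calculated amount:

```python
def tracking_scan(scan_points, detect_misalignment, drive_galvos):
    """Irradiate each scanning point at a position displaced by the most
    recently calculated positional misalignment amount, so the irradiation
    position follows movement of the eye E (fundus Ef)."""
    dx, dy = 0.0, 0.0
    for x, y in scan_points:
        delta = detect_misalignment()   # None between position detections
        if delta is not None:
            dx, dy = delta
        drive_galvos(x + dx, y + dy)    # displaced from the original point
```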
  • the calculation part 262 is provided with an image specification part 267.
  • the image specification part 267 compares, with a prescribed value, each positional misalignment amount calculated by the positional misalignment-calculation part 264. Further, the image specification part 267 specifies an A-scan image with a positional misalignment amount that is equal to or greater than the prescribed value.
  • the image specification part 267 is one example of the "image specifying part" of the present embodiment.
  • the main controller 211 controls the light source unit 101 and the scan driver 70 to reirradiate the signal light LS toward the scanning point corresponding to the specified A-scan image. If two or more A-scan images are specified, the main controller 211 sequentially reirradiates the signal light LS toward the corresponding scanning points.
  • based on the detection results of the interference light LC of the reirradiated signal light LS and reference light LR, the image forming part 220 forms a new A-scan image at the scanning point.
  • the image forming part 220 may form a tomographic image based on the new A-scan image and A-scan images corresponding to other scanning points.
  • in this modification example, the calculation part 262 is provided with an image selection part 268. Based on the positional misalignment amount calculated by the positional misalignment-calculation part 264, with regard to each scanning point Rij, the image selection part 268 selects the A-scan image closest to the original position of the scanning point Rij from among the plurality of A-scan images that have been obtained.
  • the original position of each scanning point Rij is preliminarily set.
  • the position of each A-scan image is obtained based on the position of a corresponding scanning point Rij and a calculated positional misalignment amount.
  • the image selection part 268 regards the position of the scanning point Rij displaced by the positional misalignment amount as the position of the A-scan image.
  • the image selection part 268 selects the A-scan image closest to the original position of the scanning point Rij. It should be noted that if the positional misalignment amount is small enough, the A-scan image corresponding to the scanning point Rij itself is selected, but if the positional misalignment amount is great, another A-scan image might be selected.
  • the image selection part 268 is one example of the "image selecting part" in the present embodiment.
  • the image forming part 220 forms a tomographic image by arranging the selected A-scan image with respect to each scanning point Rij according to the arrangement of a plurality of scanning points.
  • a tomographic image may be formed by selecting the A-scan image that is closest to each scanning point Rij; thereby, making it possible to obtain highly accurate OCT images without performing further scanning.
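A minimal sketch of this selection, assuming each candidate A-scan image carries the position obtained above (the scanning point displaced by its calculated misalignment):

```python
import numpy as np

def select_closest(scan_point_xy, candidates):
    """candidates: list of (a_scan, position_xy) pairs. Returns the A-scan
    image whose position is closest to the original scanning point Rij."""
    target = np.asarray(scan_point_xy, dtype=float)
    distances = [np.linalg.norm(np.asarray(pos, dtype=float) - target)
                 for _, pos in candidates]
    return candidates[int(np.argmin(distances))][0]
```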
  • the positional misalignment amount is used in a process of superposing a plurality of tomographic images based on a plurality of scans performed along the same scanning line. This superposing process is for the purpose of achieving high quality images.
  • in this modification example, the fundus observation apparatus repeatedly performs scanning with the signal light LS along a prescribed scanning line.
  • the image forming part 220 repeatedly forms a plurality of A-scan images corresponding to a plurality of scanning points on the scanning line according to the repetitive scanning. Consequently, a tomographic image from each scanning is obtained.
  • the positional misalignment-calculation part 264 repeatedly calculates the positional misalignment amount of A-scan images that are repeatedly formed.
  • the calculation part 262 is provided with a positional misalignment amount-determination part 269 and an image overlapping part 270.
  • the positional misalignment amount-determination part 269 determines whether or not each positional misalignment amount repeatedly calculated by the positional misalignment-calculation part 264 is included in a prescribed permissible range. As the permissible range, a range in which the positional misalignment amount is smaller than a prescribed value is preliminarily set.
  • the positional misalignment amount-determination part 269 is one example of the "determination part" in the present embodiment.
  • the image overlapping part 270 superposes a set of A-scan images whose positional misalignment amount is determined to be included within the prescribed permissible range. To do this, the image overlapping part 270 forms a set of A-scan images corresponding to each scanning point Rij and overlaps each set of A-scan images.
  • the image overlapping part 270 is one example of the "image overlapping part" in the present embodiment.
  • the image forming part 220 arranges a plurality of new A-scan images formed by this overlapping process according to the arrangement of a plurality of scanning points Rij. Consequently, a tomographic image along a prescribed scanning line is formed.
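The overlap operation is commonly realized as averaging, which suppresses noise; the document does not fix the exact operation, so the sketch below assumes a simple mean per scanning point:

```python
import numpy as np

def superpose_tomographic_image(a_scan_sets):
    """a_scan_sets: for each scanning point Rij, the set of A-scan images
    whose positional misalignment fell within the permissible range.
    Each set is averaged, and the averaged columns are arranged along the
    scanning line to form a tomographic image (depth x scan position)."""
    averaged = [np.mean(np.stack(s, axis=0), axis=0) for s in a_scan_sets]
    return np.stack(averaged, axis=1)
```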
  • the position of the fundus Ef is detected based on the observation image K.
  • as long as the detection part is capable of detecting the position of the fundus Ef at a prescribed time interval during scanning with the signal light LS, an arbitrary configuration is applicable.
  • for example, the configurations cited in the literature include a confocal tracking reflectometer, a dither scanner, and tracking galvanometers.
  • a tracking beam tracks the characteristic point of a fundus.
  • the confocal tracking reflectometer is used so that movement of the eye can be determined from the reflected light of the beam irradiated to the fundus.
  • the dither scanner is driven at a prescribed resonant frequency (8 kHz), with a 90-degree phase difference between the x and y scanners, in a manner such that the beam draws a circle.
  • the detection signal includes the signal of the above resonant frequency, and the phase is proportional to the distance between the beam and a target.
  • phase-sensitive detection using a lock-in amplifier generates an error signal, which is applied to a DSP feedback control loop.
  • the control loop provides instructions to the tracking galvanometers according to the processed error signal so that the image is locked onto the movement of the eye.
  • the position of the reference mirror 114 is changed so as to change an optical path length difference between the optical path of the signal light LS and the optical path of the reference light LR.
  • a method for changing the optical path length difference is not limited thereto.
  • the computer program used in the above embodiments can be stored in any kind of recording medium that can be read by a drive device of a computer.
  • as this recording medium, for example, an optical disk or magneto-optical disk (CD-ROM, DVD-RAM, DVD-ROM, MO, and so on), or a magnetic storage medium (a hard disk, a floppy disk (TM), ZIP, and so on) can be used.
  • moreover, a storage device such as a hard disk drive or a memory can also be used.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Eye Examination Apparatus (AREA)
EP10764219.1A 2009-04-15 2010-04-02 Eyeground observation device Active EP2420181B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP14001193.3A EP2752151B1 (en) 2009-04-15 2010-04-02 Fundus observation apparatus

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2009099447 2009-04-15
JP2009223312A JP5437755B2 (ja) 2009-04-15 2009-09-28 眼底観察装置
PCT/JP2010/002424 WO2010119632A1 (ja) 2009-04-15 2010-04-02 眼底観察装置

Related Child Applications (2)

Application Number Title Priority Date Filing Date
EP14001193.3A Division EP2752151B1 (en) 2009-04-15 2010-04-02 Fundus observation apparatus
EP14001193.3A Division-Into EP2752151B1 (en) 2009-04-15 2010-04-02 Fundus observation apparatus

Publications (3)

Publication Number Publication Date
EP2420181A1 EP2420181A1 (en) 2012-02-22
EP2420181A4 EP2420181A4 (en) 2013-08-07
EP2420181B1 true EP2420181B1 (en) 2018-03-07

Family

ID=42982307

Family Applications (2)

Application Number Title Priority Date Filing Date
EP10764219.1A Active EP2420181B1 (en) 2009-04-15 2010-04-02 Eyeground observation device
EP14001193.3A Active EP2752151B1 (en) 2009-04-15 2010-04-02 Fundus observation apparatus

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP14001193.3A Active EP2752151B1 (en) 2009-04-15 2010-04-02 Fundus observation apparatus

Country Status (4)

Country Link
US (1) US8573776B2 (ja)
EP (2) EP2420181B1 (ja)
JP (1) JP5437755B2 (ja)
WO (1) WO2010119632A1 (ja)

Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7365856B2 (en) 2005-01-21 2008-04-29 Carl Zeiss Meditec, Inc. Method of motion correction in optical coherence tomography imaging
US7805009B2 (en) 2005-04-06 2010-09-28 Carl Zeiss Meditec, Inc. Method and apparatus for measuring motion of a subject using a series of partial images from an imaging system
JP5317830B2 (ja) * 2009-05-22 2013-10-16 キヤノン株式会社 眼底観察装置
JP5628636B2 (ja) * 2010-11-09 2014-11-19 株式会社トプコン 眼底画像処理装置及び眼底観察装置
JP2012161382A (ja) * 2011-02-03 2012-08-30 Nidek Co Ltd 眼科装置
JP5721478B2 (ja) * 2011-03-10 2015-05-20 キヤノン株式会社 撮像装置及び撮像装置の制御方法
JP5917004B2 (ja) * 2011-03-10 2016-05-11 キヤノン株式会社 撮像装置及び撮像装置の制御方法
US9033510B2 (en) 2011-03-30 2015-05-19 Carl Zeiss Meditec, Inc. Systems and methods for efficiently obtaining measurements of the human eye using tracking
JP5220155B2 (ja) * 2011-03-31 2013-06-26 キヤノン株式会社 眼科装置および眼科装置の制御方法
JP5220156B2 (ja) * 2011-03-31 2013-06-26 キヤノン株式会社 医療装置および医療システム
JP2012223428A (ja) * 2011-04-21 2012-11-15 Topcon Corp 眼科装置
US8857988B2 (en) 2011-07-07 2014-10-14 Carl Zeiss Meditec, Inc. Data acquisition methods for reduced motion artifacts and applications in OCT angiography
US9101294B2 (en) 2012-01-19 2015-08-11 Carl Zeiss Meditec, Inc. Systems and methods for enhanced accuracy in OCT imaging of the cornea
EP2633802B1 (en) * 2012-02-29 2021-08-11 Nidek Co., Ltd. Method for taking a tomographic image of an eye
JP6217085B2 (ja) * 2013-01-23 2017-10-25 株式会社ニデック 眼科撮影装置
JP6460618B2 (ja) 2013-01-31 2019-01-30 キヤノン株式会社 光干渉断層撮像装置およびその制御方法
JP5793156B2 (ja) * 2013-03-01 2015-10-14 キヤノン株式会社 眼科装置及びその制御方法
JP6224908B2 (ja) 2013-04-17 2017-11-01 キヤノン株式会社 撮像装置
JP6402879B2 (ja) * 2013-08-06 2018-10-10 株式会社ニデック 眼科撮影装置
EP2865323B1 (en) * 2013-10-23 2022-02-16 Canon Kabushiki Kaisha Retinal movement tracking in optical coherence tomography
JP6480104B2 (ja) * 2014-03-11 2019-03-06 国立大学法人 筑波大学 光コヒーレンストモグラフィー装置及び光コヒーレンストモグラフィーによる変位測定方法
JP6528932B2 (ja) * 2014-12-26 2019-06-12 株式会社ニデック 走査型レーザー検眼鏡
CN104614834A (zh) * 2015-02-04 2015-05-13 深圳市华星光电技术有限公司 曝光机自动更换滤波片装置及曝光机
JP2016202453A (ja) * 2015-04-20 2016-12-08 株式会社トプコン 眼科手術用顕微鏡
JP2017153543A (ja) * 2016-02-29 2017-09-07 株式会社トプコン 眼科撮影装置
JP2017158836A (ja) * 2016-03-10 2017-09-14 キヤノン株式会社 眼科装置および撮像方法
US11452442B2 (en) * 2016-06-15 2022-09-27 Oregon Health & Science University Systems and methods for automated widefield optical coherence tomography angiography
JP6776076B2 (ja) * 2016-09-23 2020-10-28 株式会社トプコン Oct装置
JP6900651B2 (ja) * 2016-10-27 2021-07-07 株式会社ニデック Oct装置、およびoct制御プログラム
JP7013134B2 (ja) * 2017-03-09 2022-01-31 キヤノン株式会社 情報処理装置、情報処理方法及びプログラム
JP2019041841A (ja) 2017-08-30 2019-03-22 株式会社トプコン 眼科装置、及びその制御方法
EP4447781A1 (en) * 2021-12-15 2024-10-23 Carl Zeiss Meditec, Inc. System and method for assisting a subject with alignment to an ophthalmologic device

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09276232A (ja) 1996-04-12 1997-10-28 Nikon Corp 眼底カメラ
DE19814057B4 (de) 1998-03-30 2009-01-02 Carl Zeiss Meditec Ag Anordnung zur optischen Kohärenztomographie und Kohärenztopographie
JP2002139421A (ja) 2000-11-01 2002-05-17 Fuji Photo Film Co Ltd 光断層画像取得装置
US7113818B2 (en) * 2002-04-08 2006-09-26 Oti Ophthalmic Technologies Inc. Apparatus for high resolution imaging of moving organs
JP4597744B2 (ja) 2004-11-08 2010-12-15 株式会社トプコン 光画像計測装置及び光画像計測方法
JP4578994B2 (ja) * 2005-02-02 2010-11-10 株式会社ニデック 眼科撮影装置
JP4804820B2 (ja) 2005-07-15 2011-11-02 サンテック株式会社 光断層画像表示システム
JP4850495B2 (ja) * 2005-10-12 2012-01-11 株式会社トプコン 眼底観察装置及び眼底観察プログラム
JP4884777B2 (ja) * 2006-01-11 2012-02-29 株式会社トプコン 眼底観察装置
JP4869757B2 (ja) * 2006-03-24 2012-02-08 株式会社トプコン 眼底観察装置
JP4869756B2 (ja) * 2006-03-24 2012-02-08 株式会社トプコン 眼底観察装置
JP4864516B2 (ja) * 2006-04-07 2012-02-01 株式会社トプコン 眼科装置
JP4461259B2 (ja) * 2006-08-09 2010-05-12 国立大学法人 筑波大学 光断層画像の処理方法
JP5095167B2 (ja) 2006-09-19 2012-12-12 株式会社トプコン 眼底観察装置、眼底画像表示装置及び眼底観察プログラム
JP4996917B2 (ja) 2006-12-26 2012-08-08 株式会社トプコン 光画像計測装置及び光画像計測装置を制御するプログラム
JP4971864B2 (ja) 2007-04-18 2012-07-11 株式会社トプコン 光画像計測装置及びそれを制御するプログラム

Also Published As

Publication number Publication date
JP2010264225A (ja) 2010-11-25
EP2752151A1 (en) 2014-07-09
US20120033181A1 (en) 2012-02-09
EP2420181A1 (en) 2012-02-22
EP2420181A4 (en) 2013-08-07
EP2752151B1 (en) 2019-12-18
JP5437755B2 (ja) 2014-03-12
WO2010119632A1 (ja) 2010-10-21
US8573776B2 (en) 2013-11-05

Similar Documents

Publication Publication Date Title
EP2420181B1 (en) Eyeground observation device
EP2581035B1 (en) Fundus observation apparatus
US9615734B2 (en) Ophthalmologic apparatus
US8622547B2 (en) Fundus observation apparatus
JP5867719B2 (ja) 光画像計測装置
US9275283B2 (en) Fundus image processing apparatus and fundus observation apparatus
US9498116B2 (en) Ophthalmologic apparatus
US10561311B2 (en) Ophthalmic imaging apparatus and ophthalmic information processing apparatus
US8970846B2 (en) Optical image measurement apparatus
JP5706506B2 (ja) 眼科装置
US20170258326A1 (en) Ophthalmologic apparatus and imaging method
US10045691B2 (en) Ophthalmologic observation apparatus using optical coherence tomography
EP3459434A1 (en) Ophthalmologic apparatus and method of controlling the same
JP2022044838A (ja) 眼科装置及びデータ収集方法
JP2017046924A (ja) 光干渉断層計及びその制御方法
JP6021289B2 (ja) 血流情報生成装置、血流情報生成方法、及びプログラム
JP6527970B2 (ja) 眼科装置
EP4218545A1 (en) Ophthalmologic apparatus and method for controlling the same

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20111011

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20130705

RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 3/12 20060101AFI20130701BHEP

Ipc: A61B 3/10 20060101ALI20130701BHEP

Ipc: A61B 10/00 20060101ALI20130701BHEP

Ipc: A61B 3/14 20060101ALI20130701BHEP

Ipc: A61B 3/113 20060101ALI20130701BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20170922

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

Ref country code: AT

Ref legal event code: REF

Ref document number: 975634

Country of ref document: AT

Kind code of ref document: T

Effective date: 20180315

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602010049037

Country of ref document: DE

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20180307

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180607

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 975634

Country of ref document: AT

Kind code of ref document: T

Effective date: 20180307

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180607

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180608

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602010049037

Country of ref document: DE

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20180430

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180709

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180402

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

26N No opposition filed

Effective date: 20181210

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20180607

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180430

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180430

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180430

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180607

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180402

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180507

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180402

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180307

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20100402

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180307

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180707

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240227

Year of fee payment: 15